Building Information Modeling and Green Design
PART ONE
Recall from last year's product launch that AutoCAD 2010—which included new freeform mesh modeling tools, greatly improved PDF support, and the ability to create intelligent, parametric drawings—was referred to as a "watershed event" in AutoCAD's history, unmatched by any previous release. Autodesk continued its use of superlatives including "most exciting," "fantastic," and "best ever" to describe this year's release of AutoCAD. While the application does include some very useful enhancements that build nicely on the last release, it is ultimately the users who will determine whether these are indeed as ground-breaking as Autodesk makes them out to be. The enhancements fall under three main categories: improved conceptual design capabilities, increased productivity in document production, and better parametrics.
On the conceptual design front, AutoCAD 2011 includes a whole new set of advanced surface modeling tools in addition to the mesh modeling tools that had been introduced in AutoCAD 2010. The new tools, shown in Figure 1, enable users to easily create smooth surfaces and surface transitions, with automatic associativity that maintains relationships between all of the objects. In addition, the surfaces stay associated to their underlying geometry and automatically update when the geometry is changed, providing a fluid interface for 3D design. While the new surface modeling capability is undoubtedly most helpful for the manufacturing industry, as evidenced by the example shown in Figure 1, it can be extremely helpful in AEC for exploring organic building forms that can subsequently be exported as NURBS surfaces or solids to Revit for further development. And given that the majority of AEC users already have AutoCAD for their drafting needs, the new freeform modeling capabilities may reduce the need to use a different application such as Rhino or form.Z for conceptual design.
Also, because the surfaces created with the new tools stay associated with their defining 2D geometry, the ability to add various kinds of geometric constraints to drawing objects in relation to other objects—which was introduced in AutoCAD 2010—can also be used to control the geometry of the surfaces parametrically. For example, you could use a dimensional constraint to parametrically change the size of a circle, which in turn will automatically change any 3D surface object that has been defined from it. This is not the full parametric 3D modeling available in sophisticated mechanical CAD applications, but it is a useful extension of AutoCAD's 2D parametric capabilities to 3D design.
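To make the associativity concrete, here is a minimal toy sketch in plain Python (not AutoCAD's API; the class names are invented for illustration): the 3D surface keeps a live reference to its defining 2D profile, so editing the constrained radius automatically changes the surface derived from it.

```python
import math

class Circle:
    """2D profile whose radius is driven by a dimensional constraint."""
    def __init__(self, radius):
        self.radius = radius

class ExtrudedSurface:
    """3D surface that stays associated with its defining 2D profile."""
    def __init__(self, profile, height):
        self.profile = profile
        self.height = height

    @property
    def area(self):
        # Lateral area is re-derived from the live profile, so any
        # constraint-driven change to the circle propagates here.
        return 2 * math.pi * self.profile.radius * self.height

circle = Circle(radius=5.0)
shell = ExtrudedSurface(circle, height=12.0)
print(round(shell.area, 1))   # area with the radius at 5.0

circle.radius = 8.0           # "edit the dimensional constraint"
print(round(shell.area, 1))   # the associated surface updates automatically
```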
Another new feature in AutoCAD 2011 that is relevant to conceptual design is point cloud support. Users can now bring in a point cloud created with a laser scanning device and use that as the basis for creating a 3D model, similar to how a drawing can be created by using a raster image as a reference. Point clouds with up to 2 billion points are supported. However, there is no way to automatically convert that point cloud into a 3D model—you still have to create the model from scratch. In time, however, third party developers could use Autodesk’s powerful API to develop enhancements to the point cloud functionality and provide some automatic conversion capability.
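As a rough illustration of working from scan data (plain Python with NumPy; the file name and the plane-fitting step are assumptions for illustration, not AutoCAD functionality), the sketch below loads a point cloud and fits a best-fit plane of the kind a modeler might trace over when rebuilding a wall from a scan:

```python
import numpy as np

# Hypothetical scan: an N x 3 table of x, y, z coordinates in a text file.
points = np.loadtxt("scan_wall.xyz")

# Best-fit plane through the cloud via SVD: the singular vector with the
# smallest singular value of the centered points is the plane normal.
centroid = points.mean(axis=0)
_, _, vt = np.linalg.svd(points - centroid)
normal = vt[-1]

print("plane point :", centroid)
print("plane normal:", normal / np.linalg.norm(normal))
```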
The third main enhancement on the conceptual design front is the expansion of the materials library to include an enhanced set of materials that enables users to create rich visual representations of 3D models (see Figure 2). It includes over 1,000 predefined materials that can be dragged and dropped onto objects to apply them. The same library is also now included in all Autodesk applications, providing consistency and ensuring that material information is fully retained when the model is passed from one application to another. Users can customize the materials and save them to their own library. Libraries can be imported and exported as well as shared with other users.
On the 2D documentation front, one of the main improvements in AutoCAD 2011 is in hatching. The Hatch command can be accessed more easily through a contextual tab. A hatch's scale, rotation, and origin can now be directly edited using expanded object grip functionality. Additional options for hatches include transparency, background colors, and gradient fills, which enable users to add more colors and shading to drawings. In addition to hatches, transparency can also now be applied to entire layers as well as specific objects, providing users with new options for managing the appearance of drawings (see Figure 3). There are new "Hide Objects" and "Isolate Objects" tools to control the visibility of objects regardless of layer, so designers can focus on the objects themselves without having to think about what layer they belong to. Polyline editing has been improved with enhanced grips that can be used to add, remove, or stretch vertices, and to convert straight-line segments to arcs, enabling a more direct manipulation of these elements.
AutoCAD 2011 also introduces two new commands that can speed the process of creating or selecting objects based on the properties of existing objects: the "Add Selected" tool, which creates new objects based on the properties of an existing object; and the "Select Similar" tool, which quickly selects all objects of the same type and with the same properties as a chosen object. While these are not novel ideas and have already been implemented in several design applications, they should certainly help AutoCAD users do their work more quickly and efficiently.
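The behavior of "Select Similar" can be sketched in a few lines of plain Python (the dictionary-based object representation is invented for illustration and is not how AutoCAD stores entities):

```python
def select_similar(picked, drawing, keys=("type", "layer", "color")):
    """Return drawing objects that share the picked object's type and properties."""
    return [obj for obj in drawing
            if all(obj.get(k) == picked.get(k) for k in keys)]

drawing = [
    {"type": "circle", "layer": "A-WALL", "color": "red", "radius": 3},
    {"type": "circle", "layer": "A-WALL", "color": "red", "radius": 7},
    {"type": "line",   "layer": "A-WALL", "color": "red"},
]
picked = drawing[0]
print(len(select_similar(picked, drawing)))   # 2: both red circles on A-WALL
```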
Rounding off the set of improvements in AutoCAD 2011 is the ability for design constraints to be inferred in real time as the designer is drawing, rather than requiring all desired object relationships to be defined manually. This feature builds upon the constraint-based parametric drawing capability introduced last year and is a good step towards making the application smarter and easier to use.
Last but not least, AutoCAD 2011 is optimized to leverage Windows 7 functionality. It is compatible with all editions of Windows 7 as well as with Windows Vista and Windows XP operating systems. Also, it should be noted that there is no file format change that users have to worry about for AutoCAD 2011.
Natural Gas and Technology
(Image source: ChevronTexaco Corporation)
In recent years, demand for natural gas has grown substantially. However, as the natural gas industry in the United States becomes more mature, domestically available resources become harder to find and produce. As large, conventional natural gas deposits are extracted, the natural gas left in the ground is commonly found in less conventional deposits, which are harder to discover and produce than has historically been the case. However, the natural gas industry has been able to keep pace with demand and produce greater amounts of natural gas despite the increasingly unconventional and elusive nature of these deposits. The ability of the industry to increase production in this manner has been a direct result of technological innovations. Some of the major technological advancements that have been made recently are outlined below.
Advances in the Exploration and Production Sector
Technological innovation in the exploration and production sector has equipped the industry with the equipment and practices necessary to continually increase the production of natural gas to meet rising demand. These technologies serve to make the exploration and production of natural gas more efficient, safe, and environmentally friendly. Despite the fact that natural gas deposits are continually being found deeper in the ground, in remote, inhospitable areas that provide a challenging environment in which to produce natural gas, the exploration and production industry has not only kept up its production pace, but has in fact improved the general nature of its operations. Some highlights of technological development in the exploration and production sector are illustrated below.
Figure: Advanced 3-D Seismic Imaging (Source: NGSA)
Figure: Offshore Production - NASA of the Sea (Source: Anadarko Petroleum Corporation)
The above technological advancements provide only a snapshot of the increasingly sophisticated technology being developed and put into practice in the exploration and production of natural gas and oil. New technologies and applications are being developed constantly, and serve to improve the economics of producing natural gas, allow for the production of deposits formerly considered too unconventional or uneconomic to develop, and ensure that the supply of natural gas keeps up with steadily increasing demand. Sufficient domestic natural gas resources exist to help fuel the U.S. for a significant period of time, and technology is playing a huge role in providing low-cost, environmentally sound methods of extracting these resources.
Two other technologies that are revolutionizing the natural gas industry include the increased use of liquefied natural gas, and natural gas fuel cells. These technologies are discussed below.
Cooling natural gas to about -260°F at normal pressure results in the condensation of the gas into liquid form, known as Liquefied Natural Gas (LNG). LNG can be very useful, particularly for the transportation of natural gas, since LNG takes up about one six-hundredth the volume of gaseous natural gas. While LNG is reasonably costly to produce, advances in technology are reducing the costs associated with the liquefaction and regasification of LNG. Because it is easy to transport, LNG can make economical those stranded natural gas deposits for which the construction of pipelines is uneconomical.
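As a quick back-of-the-envelope illustration of that roughly 600-to-1 volume reduction (plain Python; the cargo volume is an assumed example, not a figure from this article):

```python
# Volume reduction from liquefying natural gas (roughly 600:1 by volume).
GAS_TO_LNG_RATIO = 600

gas_volume_m3 = 90_000_000          # assumed example: 90 million m3 of gas
lng_volume_m3 = gas_volume_m3 / GAS_TO_LNG_RATIO

print(f"{gas_volume_m3:,} m3 of gas -> {lng_volume_m3:,.0f} m3 of LNG")
# 90,000,000 m3 of gas -> 150,000 m3 of LNG, roughly one large tanker cargo
```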
Figure: LNG Delivery Facility with Tanker (Source: NGSA)
LNG, when vaporized to gaseous form, will only burn in concentrations of between 5 and 15 percent when mixed with air. In addition, LNG, or any vapor associated with LNG, will not explode in an unconfined environment. Thus, in the unlikely event of an LNG spill, the natural gas has little chance of igniting an explosion. Liquefaction also has the advantage of removing oxygen, carbon dioxide, sulfur, and water from the natural gas, resulting in LNG that is almost pure methane.
LNG is typically transported by specialized tanker with insulated walls, and is kept in liquid form by autorefrigeration, a process in which the LNG is kept at its boiling point, so that any heat additions are countered by the energy lost from LNG vapor that is vented out of storage and used to power the vessel.
The increased use of LNG is allowing for the production and marketing of natural gas deposits that were previously economically unrecoverable. Although it currently accounts for only about 1 percent of natural gas used in the United States, it is expected that LNG imports will provide a steady, dependable source of natural gas for U.S. consumption.
Fuel cells powered by natural gas are an extremely exciting and promising new technology for the clean and efficient generation of electricity. Fuel cells generate electricity through electrochemical reactions rather than through the combustion of fossil fuels. Essentially, a fuel cell works by passing streams of fuel (usually hydrogen) and oxidants over electrodes that are separated by an electrolyte. This produces a chemical reaction that generates electricity without requiring the combustion of fuel or the addition of heat, as is common in the traditional generation of electricity. When pure hydrogen is used as the fuel and pure oxygen is used as the oxidant, the reaction that takes place within a fuel cell produces only water, heat, and electricity. In practice, fuel cells result in very low emission of harmful pollutants and the generation of high-quality, reliable electricity. The use of natural gas powered fuel cells has a number of benefits.
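As a rough worked example of the underlying electrochemistry (plain Python; textbook values for hydrogen, not figures from this article), the theoretical cell voltage follows from the Gibbs free energy released by the reaction 2H2 + O2 -> 2H2O:

```python
# Theoretical (reversible) voltage of a hydrogen fuel cell from textbook values.
FARADAY = 96_485          # coulombs per mole of electrons
delta_g = 237_000         # J/mol: Gibbs free energy per mole of liquid water at 25 C
electrons_per_h2 = 2      # each hydrogen molecule supplies two electrons

voltage = delta_g / (electrons_per_h2 * FARADAY)
print(f"ideal cell voltage ~ {voltage:.2f} V")   # ~1.23 V; real cells run lower
```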
Figure: How a Fuel Cell Works (Source: DOE - Office of Fossil Energy)
The generation of electricity has traditionally been a very polluting, inefficient process. However, with new fuel cell technology, the future of electricity generation is expected to change dramatically in the next ten to twenty years. Research and development into fuel cell technology is ongoing, to ensure that the technology is refined to a level where it is cost effective for all varieties of electric generation requirements.
To learn more about fuel cell development, visit the Fuel Cells 2000 website.
Natural Gas Technology Resources
The natural gas industry is joined by government agencies and laboratories, private research and development firms, and environmental technology groups in coming up with new technologies that may improve the efficiency, cost-effectiveness, and environmental soundness of the natural gas industry. A number of resources provide further information on new technological developments in the oil and natural gas industry.
Common digital exchange format a solution to BIM problems
Re: Mindset change essential to successful BIM adoption (DCN, March 25)
I’m not sure who came up with the “#D” paradigm for BIM offshoots (“...three-dimensional design; 4D scheduling capabilities; 5D cost estimating; and emergent 6D lifecycle management”) but it rankles.
“4D” makes some sense, but the paradigm breaks down after that, IMHO.
Furthermore, it ignores several other significant attributes of BIM.
Notable among these are more efficient clash detection and energy modeling, not to mention the grail: the automation of precision fabrication processes.
What "D" are those? These techniques have been implemented, not in every BIM application or on every BIM project, but to an extent sufficient to demonstrate their feasibility.
Each holds potential for savings at least equivalent to those offered by "4D," "5D," and "6D" techniques.
The market is full of software aimed at the "Ds" of BIM. All the buzz around them is drawing attention away from the bigger problems BIM has: interoperability (and not just between one Autodesk product and another), and object modeling standards.
Major BIM software producers started working on solutions to these problems through the International Alliance for Interoperability, which developed a common digital exchange format, the Industry Foundation Classes (IFC).
But after a decade, progress has slowed to a crawl, even though IFC is essentially mature with respect to the basic components of buildings (columns, walls, floors, roofs), with some other objects (windows and doors) coming along close behind.
The whole effort, now under the aegis of buildingSMART Alliance, is nearly lost in the marketing cacophony.
If we want BIM to go forward, the IFC effort needs technical and financial support.
The IFC model exchange format is not the ultimate answer to interoperability; it too will surely pass with time. But it, or something like it, is a vital step along the way.
Nothing else like it approaches its level of development as far as I know. So it represents the best, if not the only, route to dealing with BIM’s biggest problems.
-Brian Lighthart
Your say
We need a BIM protocol that can be accessed at all levels
Re: Common digital exchange format a solution to BIM problems (DCN, April 30)
Brian (Lighthart) is right in his assessment of the barriers to successful deployment of a BIM solution from cradle to grave.
As a CanBIM colleague put it to me at the Insight BIM conference held last month in Toronto, if 'Bimmers' truly want to advocate for a common protocol of interoperability across all platforms, then the advocacy needs to shift from a proprietary file-sharing standard (i.e., .dwg) to something that can be interpreted, modified, and accessed at all levels.
How about a dot BIM protocol for IFC?
Imagine if the solution to the woes of implementing BIM platform solutions were for the dot BIM file extension to become the dot PDF of the modeling and building information world.
Autodesk, Bentley, etc. really need to take a hard look for the overall good of the industry, and I believe they are doing so now, or soon will be, as these challenges grow with mass adoption in North America.
Someday an enterprising C++ programmer will come along and create the ‘dot BIM’ solution Bimmers are really looking for.
Until then, Industry Foundation Classes deployed in accordance with ISO 16739 is your alternative.
I just gave away a very profitable business idea - is anyone out there listening?
-Derek Smith, Canada BIM Council
With green rapidly emerging as the new global mantra, it is hardly surprising to find AEC technology vendors jumping on the sustainable design bandwagon, particularly those developing BIM (building information modeling) solutions. One of the most significant aspects of BIM is its ability to capture the description of a building in a semantically intelligent format that can be analyzed to study different aspects of its performance, including those related to energy use. Thus, there is a natural correlation between BIM and green buildings; in fact, I would even go so far as to say that if there ever was a technology "in the right place, at the right time"—at least in AEC—that has to be BIM in the context of sustainable design.
Of the leading BIM vendors, Graphisoft has traditionally been considered the front-runner in supporting energy analysis with IFC support and strong links from ArchiCAD to tools such as Energy Plus, ArchiPHYSIK, Ecotect, and RIUSKA (see more about this on Graphisoft's website). For Autodesk, sustainable design is rapidly emerging as a key focus area, as was demonstrated by presentations at Autodesk University 2006 and by its recent partnership with Integrated Environmental Solutions (IES) to closely integrate IES' building performance analysis tools with Revit. Bentley, in turn, hosted a one-day "BIM for Green Buildings" Executive Summit last month in New York, which I had the opportunity to attend. The event was focused on exploring the evident synergy between the new BIM-enabled design methodologies and objectives in sustainable design through a series of "best practices" seminar sessions by firms who were, according to Bentley, doing BIM and green design well, followed by an interactive "think tank" discussion with audience participation. The highlights of the presentations and an analysis of the key discussion points that emerged are captured in this AECbytes "Building the Future" article.
The Summit featured five main sessions, the first of which was by Bill Barnard and Myron Bollman of the Troyer Group, a full-service AEC firm providing planning, design, and construction services for which sustainable design has been an important component of its work right from the start, long before the LEED (Leadership in Energy and Environmental Design) Green Building Rating System was even established. Now, the firm is a charter member of the United States Green Building Council (USGBC), and has multiple staff members from each discipline in the organization trained in LEED. Most of the firm's presentation at the Bentley Summit was focused on describing the various green features and LEED strategies it had incorporated on some of its key projects. The firm has been using Bentley's solutions for over 10 years, including site analysis with GEOPAK Site and Google Earth, architectural design with Bentley Architecture, structural design and analysis using the integration of RAM and STAAD.Pro with Bentley Structural, HVAC design with Bentley Mechanical and Trace, and conflict detection with Bentley's Interference Manager. It was not clear whether the firm was actually using BIM to further sustainable design in its practice, but the presenters did highlight what was needed for BIM and green design to come together: the ability to integrate necessary information such as materials, building loads, lights, occupants, climate, building codes, and so on into the BIM model so as to be able to carry out interactive analysis of different green design aspects, particularly at the preliminary design stage; linking manufacturers' product data into the BIM model to incorporate accurate material information for analysis; being able to use the BIM model to explore the budgetary implications of features such as a green roof or reduced water usage, in terms of both first cost and recurring costs; and linking the BIM model to LEED certification forms so that the certification process could be automated.
The bulk of the next session, by Rodger Poole of Gresham Smith & Partners, a large multi-disciplinary firm of architects, engineers, planners, and designers working in diverse practice areas, was devoted to describing the firm's implementation of BIM using Bentley solutions, the benefits that had been achieved, and the challenges encountered and how they were addressed. BIM was used in the firm for a wide range of tasks including program analysis, space analysis, material takeoff, automatic report generation, 4D scheduling, procurement, building commissioning, and various FM services. With regard to the topic of green buildings, the firm was a member of the USGBC and was seeing a growing interest in sustainable design. Poole went on to suggest some strategies for enabling sustainable design with BIM, such as building performance modeling, site modeling for more context-sensitive design, and the development of on-the-fly energy calculators. However, there was no indication of whether the firm was actively designing green buildings or whether BIM was being used to explore sustainable design strategies.
Volker Mueller of NBBJ then provided an overview of BIM and sustainability in his firm. NBBJ, a leading architecture and design firm with a global presence, is a long-term user of Bentley's BIM solutions and is a frequent winner of Bentley's annual BE Awards—it won two of the six awards in the Building vertical announced at last year's BE conference. Its BIM implementation was described in some detail in my article on the BIM Symposium at the University of Minnesota published last year, and will therefore not be repeated here. With respect to sustainable design, NBBJ appreciates the growing green movement and the tremendous responsibility it places on the AEC industry, given that buildings account for the largest amount of energy consumption in the US. NBBJ has a Sustainable Design Group within the firm comprising over 100 LEED professionals; it has a growing number of LEED certified projects and projects tracking to LEED; and it is also implementing sustainable design strategies in its own offices, with its Seattle office aiming for LEED Gold certification. With regard to tying BIM and sustainability together, these are currently in parallel but interrelated tracks at NBBJ: BIM models are used for solar studies as well as glare and heat gain studies; radiosity-based rendering of the models is used to study natural light penetration and determine how to get more light using shafts and skylights; and the use of Bentley's multi-disciplinary BIM suite allows better systems coordination. Many of the general benefits of BIM that NBBJ is realizing also have a green design pay-off: for example, the programmatic clarity achieved with BIM leads to a more economic and thus more energy-efficient design; and the improved prefabrication capability and gains in construction efficiency lead to reduced energy use. For detailed energy analysis, NBBJ partners with a consultant to produce CFD (computational fluid dynamics) diagrams, but most often, these are produced too late to make any significant changes to the design. What is critically needed is an easier and interactive link between the BIM models and analysis tools, so that the design can incorporate critical energy-related feedback from an early stage. Most of the current links between BIM and analysis tools rely on IFC import/export, which is not an optimal process, as NBBJ has found. (For more on IFCs, see the article, "The IFC Building Model: A Look Under the Hood.")
The last two sessions at the Summit also reiterated the point that the overall benefits of BIM contribute to a greener building. The first of these was by Robert Stevenson of Ghafari Associates, a full-service A/E firm that is well-known for its cutting-edge multidisciplinary BIM implementation, especially in the automotive and aviation sectors in projects such as the new General Motors Lansing Delta Township Assembly Plant and the Detroit Metropolitan Airport North Terminal Development. Ghafari's BIM approach has been described in detail in a dedicated article in AECbytes, published in November 2005. On the subject of green buildings, the main aspects of BIM implementation at Ghafari that contribute to greener design are conflict resolution at design time and just-in-time construction, which result in energy savings because of reduced scrap, reduced transportation, reduced site disturbances, and shorter construction time. Ghafari is also pursuing LEED certification on several projects by incorporating green elements and design features, but it wasn't clear if BIM was directly enabling or facilitating this.
The final session, by Michael Wick of General Motors, was useful in providing the much-needed owner's perspective on green design, which in this case was in the context of the General Motors Lansing Delta Township Assembly Plant project that was designed by Ghafari. This project was awarded LEED Gold certification, the first time an automotive project has achieved this distinction. It has also won the 2007 AIA Environmental Leadership Award. There is, no doubt, some irony to a green design award going to a facility for manufacturing cars, as Wick himself pointed out, but General Motors was keen to derive the many economic, environmental, community, and health and safety benefits of green design. It had a LEED accredited professional as part of the team and pursued LEED certification to establish environmental leadership, reduce long-term operating costs, and have healthier buildings for its employees. The facility will use 55% less energy compared to other plants, and its energy usage is estimated to be 30% below the ASHRAE standard, which is quite a remarkable achievement.
While most of the individual sessions were focused either on BIM or on green design, some interesting points on how the two come together did emerge in the Q/A session and discussion following the presentations. One was related to the use of the IFC, which Volker Mueller had briefly mentioned during his presentation and which he elaborated upon a little more during the discussion. While NBBJ does use the IFC to send data to consultants, the process is quite involved and time-consuming. Every exchange is case-specific and needs proper mapping to ensure that the application exporting the IFC file includes all the data that is needed by the receiving application. The exchange has to be tested before being used on an actual project. Thus, the use of the IFC to facilitate interoperability between applications is not that straightforward, which could account for the relatively slow adoption of IFC-based analysis tools in conjunction with BIM applications. A potential solution to this problem could be to have applications, both for building modeling and for analysis, that use the IFC as their native file format so that the entire rigmarole of import/export and case-specific mappings can be avoided. No such solutions are available yet, and it is not known if any are even in the works.
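As a small illustration of what checking an IFC export can involve, here is a sketch using the open-source IfcOpenShell library in Python (the file name is a placeholder; this is not a tool mentioned by the presenters) that opens an exported IFC file and reports whether the element types a downstream analysis tool would need are actually present:

```python
import ifcopenshell

# Open an exported IFC file (placeholder path) and count the element types
# that a receiving analysis application would expect to find.
model = ifcopenshell.open("exported_design.ifc")

for ifc_type in ("IfcWall", "IfcSlab", "IfcWindow", "IfcSpace"):
    elements = model.by_type(ifc_type)
    print(f"{ifc_type:10s}: {len(elements)} instances")
```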
Another critical point that was brought up was the possibility that the design model might not be the same as the models needed for analysis. Just as we have had the long-standing debate about design models versus construction models (most recently discussed in the article, "The AGC's BIM Initiatives and the Contractor's Guide to BIM"), we are now confronted with the same question with regard to different aspects of energy analysis. There is no doubt that different models are required for daylighting analysis, for thermal analysis, for a detailed DOE-2 simulation, for a CFD analysis, and so on, as different kinds of building information are needed for these different analyses. The question is whether the design model created by BIM applications can include all the information that would be required to automatically derive these different models for different kinds of energy analysis. If we want energy analysis to become an integral part of the design process, this capability is very important, so that users don't have to expend additional resources to create separate energy-related models. But does this then over-burden the design model and make it too cumbersome to work with? It is difficult to know the answer to this question until we actually have such BIM applications. We do have structural BIM applications that combine a physical model of a structure with an analytical model that can be sent to structural analysis tools, but we are still far from a multi-disciplinary BIM model that integrates not just spatial, structural, and MEP information but also includes all the data needed for the varied types of energy-related analyses mentioned earlier.
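One way to picture the "one design model, many analysis models" idea is as simple projections of a richer element record into the subsets each analysis needs. The toy sketch below (plain Python; the attribute names and values are invented and do not correspond to any real BIM schema) illustrates this:

```python
# One "design model" element carrying enough data for several analyses.
wall = {
    "id": "W-101", "area_m2": 24.0, "orientation": "south",
    "u_value": 0.35,             # needed for a thermal/energy simulation
    "glazing_fraction": 0.40,    # needed for a daylighting study
    "visible_transmittance": 0.65,
}

def thermal_view(elem):
    """Subset of the design model relevant to thermal analysis."""
    return {k: elem[k] for k in ("id", "area_m2", "u_value")}

def daylighting_view(elem):
    """Subset of the design model relevant to a daylighting study."""
    return {k: elem[k] for k in
            ("id", "area_m2", "orientation", "glazing_fraction", "visible_transmittance")}

print(thermal_view(wall))
print(daylighting_view(wall))
```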