Drafting the Future

Our architectural, civil, and mechanical drafting services can take your commercial, industrial, or residential sketches and preliminary drawings and create final, fully coordinated construction documents.



Thursday, November 11, 2010

Building Information Modeling and Green Design


When designing the U.S. Federal Courthouse in Seattle, NBBJ and its consultants used computer modeling to analyze air temperature distribution to determine the benefits of displacement ventilation in the courtroom lobby, halfway up the tower (indicated with a green stripe on the drawing).

In a perfect world, energy simulations and design tools would be so well integrated that each time an architect moved a wall, added a window, or changed a lighting specification, the building’s predicted energy performance would be updated and displayed instantly. With that sort of real-time feedback, designers would quickly become skilled at optimizing the energy performance of their designs, and new buildings would be rapidly approaching carbon neutrality. Along the way, other aspects of a building, such as how well it uses daylight, how procuring its material will affect the planet, and even how much it will cost to build, could be similarly tracked and optimized. And all of this would be done while sharing a design seamlessly across disciplines. That world has not yet arrived, and the path to it is strewn with obstacles. But in some settings it is becoming tantalizingly close, thanks to the convergence of data-rich, three-dimensional (3D) design tools, ever-faster computers, and accepted protocols for sharing digital information about buildings across platforms. In spite of the significant investment that designers and contractors have to make to adopt building information modeling (BIM), they are flocking to it because it can reduce errors, streamline costs, and improve the performance of a facility in dozens of ways, not least of which is green performance.

A Brief History of Digital Design

In the early 1980s, technologically savvy architecture firms were replacing their drafting tables and pencils with workstations running computer-aided design (CAD) software. By the end of that decade, firms that hadn’t made that transition were in trouble. Through the 1990s, two-dimensional CAD drawings gave way to tools that could create three-dimensional views of a design, and more advanced tools enabled architects to design directly in three dimensions using virtual models. “Working with a model of a building is actually very natural, because it’s what we architects carry around in our heads anyway,” said Mario Guttman, AIA, vice president and CAD director at HOK. Structural engineers working on complex buildings have been among the early adopters of 3D CAD tools, but architects and other engineers now commonly use these tools as well. Building information modeling (BIM) adds “dimensions” to those 3D CAD models by attaching information to elements in the virtual building. Early uses of BIM have advanced beyond collision detection to focus on specific functions, such as real-time cost estimating. Autodesk’s Revit, for example, is linked to cost data from RSMeans, so a project’s budget can be tracked as the design evolves. Sophisticated contractors are using tools such as Constructor from Vico Software (recently spun off from Graphisoft) to create cost estimates based on their own cost databases and also to model and optimize construction sequencing.
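The core idea of attaching information to model elements can be sketched in a few lines. Everything below is illustrative (the attribute names and values are invented, not any BIM tool's actual schema), but it shows why a cost estimate becomes a query against the model rather than a separately maintained document:

```python
# Illustrative sketch of the BIM idea: a model element carries data
# beyond its geometry. All attribute names and values are invented.
wall = {
    "geometry": {"length_ft": 20.0, "height_ft": 10.0},  # the 3D CAD part
    "assembly": "CMU with rigid insulation",             # construction type
    "r_value": 14.5,          # thermal data an energy model could use
    "cost_per_sf": 22.0,      # unit cost, as from an estimating database
}

# Because the data rides along with the model, a running cost estimate
# is just a query over the elements, not a separate take-off document.
area_sf = wall["geometry"]["length_ft"] * wall["geometry"]["height_ft"]
wall_cost = area_sf * wall["cost_per_sf"]
print(wall_cost)  # 20 * 10 * 22.0 = 4400.0
```

Change the wall's length and the cost figure updates with it; that dependency is what separates an information model from a drawing.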

What is Building Information Modeling?

“BIM is not just software but a methodology of practice,” said Huw Roberts, Bentley Systems’ global marketing director, suggesting that “an architect or engineer would decide to practice BIM and use a bunch of tools to do that.” Adam Rendek, of Anshen + Allen Architects in San Francisco, added, “We are taking advantage of the intelligence that is embedded in the model. That’s what makes BIM different from 3D CAD.” The move towards BIM is driven in part by large building owners, including the U.S. General Services Administration (GSA), which, as of 2007, accepts delivery of designs for major projects only as interoperable models. Owners like GSA have documented the wastefulness of the conventional paper-based building delivery process and are dictating a more integrated approach. A handful of BIM-related organizations and initiatives joined forces under the umbrella of the National Institute of Building Sciences (NIBS) buildingSMART Alliance and, in February 2007, released the first part of a national BIM standard for industry review. Autodesk, the 800-pound gorilla in the CAD software jungle, has incrementally added data-linking capabilities to its flagship Architectural Desktop software package. In 2002, the company made a major commitment to BIM with its acquisition of Revit, a database-driven design software package. Autodesk is now actively seeking to migrate its longtime CAD customers into the Revit product line. Currently there are 200,000 licensed Revit users worldwide—doubled from last year, according to Jay Bhatt, vice president for AEC at Autodesk. Other major players in this market include ArchiCAD from Graphisoft (a Hungarian company acquired in 2006 by the German firm Nemetschek), and the Microstation suite of software tools from Bentley Systems. Bhatt estimates that between 5% and 10% of CAD users worldwide use BIM software from one of these companies.

Streamlining Building-Performance Simulations

The Pearl River Tower designed by SOM for construction in Guangzhou, China, includes integrated wind turbines and photovoltaic panels to offset its energy use. Inset is an Ecotect model showing the amount of solar radiation on the tower’s various surfaces.

As the number of designers working in BIM grows, so does the opportunity for using those virtual models to do more than just estimate costs. Working in two-dimensional, basic CAD drawings, “you had to do all this heroic behavior to create an environmentally sensitive design,” noted Bhatt. With the advent of BIM, however, “technology is facilitating a much bigger movement around sustainability in the buildings space,” Bhatt added. Vincent Murray, business development manager in the Boston office of simulation software company IES, agrees: “BIM opens up building-performance modeling to the entire building construction community,” he said. Energy modelers use specialized software to create a virtual model of a building. They then subject that model to the building’s anticipated weather and usage patterns to predict its heating and cooling loads and energy use. Until now, setting up an energy model took many hours, even for a relatively simple building, so iterations through various design alternatives were slow and expensive. “Now, since the model is available as a given, representing the actual current state of the design, we can shorten this amount of time dramatically,” said Rendek.

Energy feedback during conceptual design

One of the ironies of energy modeling and other simulations used in the design process is that they tend to require a fairly complete model of the building, which means that by the time the modeling is done, the design is fully developed and only minor changes can be entertained. BIM mitigates this problem to some extent because the integrated 3D design model makes it relatively easy to make changes, even late in the process, by eliminating the need to coordinate changes across multiple drawings. But early-stage simulations from preliminary 3D and BIM models offer the greatest potential benefits. Green Building Studio and (soon) SketchUp are optimized for use in those early stages—specifics on each follow. Green Building Studio. Green Building Studio (GBS) is a pioneer in the field of easy, basic energy simulation from design models. As both a company and a Web-based service of the same name, GBS includes a protocol for translating information from CAD software into the industry-standard DOE-2 energy simulation engine. Because an energy model requires data that isn’t typically defined even in BIM files, much less conventional 3D CAD, GBS fills in the gaps with many default assumptions. “Most of the tools that are moving forward are still engineering tools,” said John Kennedy, president of the company, referring to their intended use for analyses of fully developed designs by trained engineers. He added, “The whole point of this tool is early-stage modeling.” Kennedy has created plug-ins for Autodesk’s Architectural Desktop and ArchiCAD that assist users in defining HVAC zones and validating the BIM model to increase the chances that the energy simulation will provide useful results. This capability is integrated into Revit, so no plug-in is required. The software generates a file in gbXML format (an information exchange protocol developed by GBS) that the software uploads to GBS’s server for analysis. 
Minutes later, the designer can download the results of the model. GBS allows users five free runs; more runs are available for a nominal fee. GBS recently introduced a “design advisor” service that automatically generates proposed modifications to the design and allows users to experiment with a small number of alternatives. GBS also makes its DOE-2 input file available for download, offering engineers a shortcut for running their own early-stage energy models. Development of the software was funded largely by the California Energy Commission and Pacific Gas & Electric, but for ongoing support GBS is looking to other sources, including manufacturers that appreciate the potential for highly targeted product placement. On that basis, PPG’s SolarBan 70 glazing is one of the design alternatives from which users can choose. GBS has also developed a tool for Owens Corning that identifies what a building would need to implement to qualify for the Energy Policy Act of 2005 tax credit (see EBN Vol. 14, No. 9).
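The gbXML hand-off described above amounts to serializing the model's spaces as XML and posting the file to the analysis service. The sketch below only gestures at that format: the element names loosely follow the shape of the gbXML schema, but a real export carries full geometry, constructions, and zone data.

```python
# Sketch: assembling a minimal gbXML-style payload of the kind a design
# tool might upload for analysis. Element names approximate the gbXML
# schema but are heavily simplified for illustration.
import xml.etree.ElementTree as ET

def build_gbxml(building_id, spaces):
    """Serialize building spaces as a simplified gbXML-like document."""
    root = ET.Element("gbXML", {"temperatureUnit": "F", "lengthUnit": "Feet"})
    campus = ET.SubElement(root, "Campus", {"id": "campus-1"})
    building = ET.SubElement(campus, "Building", {"id": building_id})
    for space_id, area_sf in spaces:
        space = ET.SubElement(building, "Space", {"id": space_id})
        ET.SubElement(space, "Area").text = str(area_sf)
    return ET.tostring(root, encoding="unicode")

xml_doc = build_gbxml("bldg-1", [("lobby", 1200.0), ("office-101", 450.0)])
print(xml_doc)
```

The resulting file is what travels to the server; the defaults GBS fills in (constructions, schedules, HVAC assumptions) are applied on the service side.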

The Georgia State Parks Headquarters in Stockbridge, Georgia, was designed to be environmentally responsible using building information modeling services provided by the independent BIM consultants RCMS Group of Atlanta.

SketchUp and EnergyPlus. While it is a far cry from the full-fledged BIM tools, Google SketchUp offers a 3D modeling interface and the ability to assign characteristics to objects in the design. “A lot of designers prefer SketchUp early on because it’s such a facile tool,” noted Chris Leary, AIA, of KlingStubbins. Most mainstream design tools now have at least some capability to import models from SketchUp and to export simplified models out to it. That capability will soon carry more significance for green projects because by June 2007 the U.S. Department of Energy (DOE) expects to release a SketchUp plug-in for the powerful EnergyPlus modeling engine. DOE intends for EnergyPlus, which was released in version 2.0 in April 2007, to supersede the venerable DOE-2. While EnergyPlus is widely regarded as a more powerful and flexible simulation engine, its use has been limited by its lack of a user-friendly front end. “I could imagine that SketchUp would be a pretty good interface for making an EnergyPlus model,” said Kevin Pratt, director of research at KieranTimberlake. The plug-in, which will be available for both the free and the full versions of SketchUp, will help users define HVAC zones and assign thermal characteristics to elements in their models. It will then export an EnergyPlus input file for a user to run separately—although, according to Drury Crawley, AIA, Technology Development Manager at the U.S. Department of Energy, future versions of the plug-in should be able to run the simulation entirely within SketchUp. Tools like SketchUp are especially useful for early design studies. “A smart team working on sustainable design will start looking at energy models before even designing the building,” noted Guttman, adding, “They may do a lot of analysis on preliminary, pre-architectural models.”

Analysis during design development

More detailed energy analysis during design development, or verification of a building’s performance from the construction documents, is the traditional purview of mechanical engineers who specialize in energy modeling. Simply by translating building geometry automatically from a design model, 3D CAD and BIM tools have the potential to dramatically reduce the amount of time and effort required to set up those energy models. As noted above, that translation can be done from Revit and ArchiCAD. A more generic approach, developed by the International Alliance for Interoperability, uses a data structure termed Industry Foundation Classes (IFC), although support of the IFC standard has been spotty and the IFC definitions don’t cover all building data exchange requirements. Finally, there are several efforts at direct bilateral connections between BIM tools and performance modeling platforms. The following sections describe how the major BIM software tools support this type of analysis. Revit MEP Links to IES. In February 2007, Autodesk and simulation developer IES Limited announced a collaboration linking their tools. This collaboration began bearing fruit in April, when an incremental version upgrade to the mechanical engineers’ Revit product (Revit MEP) gained the ability to calculate heating and cooling loads directly using an IES engine. IES’s Virtual Environment is an integrated performance modeling package that models energy use, daylighting, computational fluid dynamics (CFD), and other attributes based on a single shared model of the building. Beyond the load calculation tool that is now provided with Revit MEP, users can purchase the Virtual Environment Toolkit, which includes the ability to do more sophisticated analyses. IES also sells individual modules separately that step up the modeling potential even further. The primary modeling engines within IES are collectively called Apache (unrelated to the Web server software). 
“Apache is being continuously updated by a team of leading experts,” claimed Murray. KlingStubbins has been an early adopter of both Revit and performance modeling tools. “We’re software junkies—we buy everything,” admitted Leary. “We had IES sitting around, but no one could find the time to use it. Now that we’re not having to recreate the data, it’s getting used.” Leary has seen results from the integration of these tools. In one case, the results of an IES simulation led Leary’s team to narrow a building to allow better daylight penetration. KlingStubbins is now engaged in a firm-wide evaluation of the tools, with engineers in the Philadelphia office comparing the results from IES with those from other modeling tools, and a team in Washington, D.C., examining the CFD analysis. “Who ever thought an architecture and engineering firm would be doing its own CFD modeling?” Leary asked. The fact that IES is tied only to Revit MEP and not to Revit Architectural presents an obstacle in the path towards energy modeling that is fully integrated into the design process, especially since Revit MEP is not as mature as the other Revit tools and some engineers are hesitant to commit to it. Autodesk and IES don’t see the dependence on the MEP module as a limitation, however—they believe the shared model can enhance communication and collaboration across disciplines. “We would hope that the integrated model with Revit would become the catalyst for integrated design,” said Murray. Involving experts in the energy modeling process, even if it is largely automated, is also a good idea in terms of interpreting the results. “If you don’t understand what’s happening behind the scenes, you can get some really misleading data out of the software,” warned Pratt. ArchiCAD and Ecotect. Graphisoft is pursuing a path similar to Autodesk’s by establishing ties with another integrated performance modeling package, Ecotect. 
Ecotect is used extensively in academic settings and is popular in many firms for early design studies. Architects rave over its intuitive graphic interface. “The advantage of Ecotect is that you can have very visual models showing the results of different scenarios,” said Patrick Mays, AIA, vice president of Graphisoft North America. Ecotect was created by Andrew Marsh, Ph.D., who is originally from Australia but currently resides in the U.K. Marsh and a tiny staff handle all development and maintenance, so keeping up with the demand for features and fixes has challenged them, especially as demand for the tool has mushroomed. Ecotect remains a valuable player in the industry, however, largely because of its connections to open-source tools, such as Radiance for daylight modeling and EnergyPlus for energy, through which Ecotect users can perform more robust simulations that are beyond the scope of its internal code. Graphisoft has enhanced ArchiCAD’s gbXML plug-in from Green Building Studio to serve as a translator to Ecotect. “We have the capability to map zones and export data, so properties of walls, windows, doors, are all tracked,” noted Mays. Right now the export to Ecotect is one-way, but users will soon be able to move Ecotect models back into ArchiCAD, according to Mays: “In two months you will see documentation and process for how stuff will work back and forth,” he said.


This screen capture from Revit MEP shows the heating and cooling load calculator from IES Virtual Environment running within the Revit application.

If Revit and IES are becoming an industry standard for mainstream, production-oriented architects, ArchiCAD and Ecotect are the darlings of those who prefer an alternative approach, both for their catchy graphics and, for the technically inclined, for their ability to be customized and extended. “ArchiCAD handles things more openly, which offers an advantage in terms of interoperability with a variety of modeling tools,” said Anshen + Allen’s Rendek, adding that “Revit is more of a closed system. It works well with other tools that are written to work with that system.” Along similar lines, users cannot easily extend or modify the library of predefined building assemblies in IES. After comparing Revit and ArchiCAD for building-performance modeling, Rendek found “no clear answer as to which is better. Both are good, and both have advantages and disadvantages.” Bentley’s BIM Solutions. Rather than linking directly to any specific building-performance package, Bentley Systems instead touts its flexible data structure as an ideal solution because it allows users to store any type of data and migrate that data into third-party tools for specialized analyses. Bentley’s primary vehicle for these translations is the IFC framework, which is finally gaining widespread support, according to Roberts. Roberts said that a unique strength of Bentley’s software is its ability to exchange data back and forth with other tools, including an option to selectively re-import modifications to a model. “The majority of the analysis tools aren’t as smart as BIM,” noted Roberts, “but they have all the processes for dealing with the information they care about—other than the geometry—internal to themselves.” Rather than managing all that additional information in one BIM software package, Roberts suggested that a more effective solution would be to allow the tools to pass back and forth the parts of the model that they do share while allowing any of them to modify the design. 
While Bentley’s tools have the potential to be a strong, open platform from which to develop energy-efficient design solutions, the company does not appear to have progressed as far as its competitors in supporting or promoting those capabilities. This two-way sharing of data, for example, is already working in the area of structural analysis but not yet in the areas of building performance most closely associated with green design.

Reality Check

While the ability to go directly from a design model to an energy simulation is tantalizing, and the capabilities are improving, we still have a ways to go. “It’s not as simple as pushing a button and getting an energy number. Analysis requires a lot of simplifying assumptions, and understanding what is really important and what isn’t,” HOK’s Guttman told EBN. Perhaps the most fundamental challenge is that energy analysis requires a range of inputs, only a few of which are included in a typical building-information model. The physical layout—what software engineers call “building geometry”—is a basic element in all 3D CAD models. Information on how the various elements are constructed and on their thermal performance may be included in a BIM model. But an energy model also needs location information—which it uses to track sun angles and apply appropriate climate data—schedules of operation, and a mapping of HVAC zones. Typically, most of these additional elements don’t exist in an architectural BIM model, so they must be created either before or after the model moves to an energy simulation environment. Similarly, daylight modeling tools require information about the reflectivity of surfaces, and those that model airflow need to factor in friction coefficients. In this sense, although both conventional design and performance simulations are working from a virtual model of the same building, they need to know different things about that building, making their models quite different. As a result, it will never be possible to take a model that was built just as a visual representation and run a meaningful energy simulation: “I’m a little skeptical that you will actually be able to push a button and get a thermal model,” said Pratt.
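The gap Guttman describes can be made concrete with a toy completeness check. The field names below are illustrative rather than any tool's actual schema; the point is simply that a design model covers only part of what a simulation needs, and the rest must come from defaults or a specialist.

```python
# Sketch: the inputs an energy simulation needs versus what a typical
# architectural model supplies. Field names are invented for illustration.
REQUIRED_FOR_SIMULATION = {
    "geometry",       # walls, windows, roofs -- present in any 3D model
    "constructions",  # thermal properties of each assembly
    "location",       # drives sun angles and climate data
    "schedules",      # occupancy, lighting, and equipment operation
    "hvac_zones",     # mapping of spaces to mechanical zones
}

def missing_inputs(model: dict) -> set:
    """Return the simulation inputs the design model does not yet carry."""
    return REQUIRED_FOR_SIMULATION - set(model)

# A typical architectural model carries geometry and perhaps constructions:
architectural_model = {"geometry": ..., "constructions": ...}
print(sorted(missing_inputs(architectural_model)))
# -> ['hvac_zones', 'location', 'schedules']
```

Everything the check flags is exactly what tools like Green Building Studio fill with defaults, and what a specialist would instead tailor to the project.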

BIM tools can support some analyses internally, such as this daylighting study of a residence, extracted from an animation created by Jeff Owens of Owens Architects in Lawrenceburg, Kentucky, using Bentley’s Triforma software. The full animation is viewable online at www.owensarchitects.com.

To get even rudimentary simulation results, architects have to learn to create models with the necessary information, and for more sophisticated results, there will likely always be a need for specialists. “There is a lot of art to the science of energy modeling. You can’t just take an architectural model and run a thermal analysis on it,” noted Pratt, adding, “The real question is, do you understand what the results mean?” It’s not only the lack of necessary information that represents a problem; unnecessary information can slow even the most capable simulation engine to a crawl. “BIM gives you the ability to bring over much more detail than you would normally put into an energy simulation,” said Crawley, “but that also has a downside—you can bring over too much data and make the model overly complicated to run.” For example, he noted, including every closet in the model of a large building increases computer processing time without significantly affecting results. Companies, including Burt Hill and SOM, are addressing this issue by forming teams that are expert in energy modeling and other building-performance analyses. These groups are engaged early in the conceptual design process, working directly with the architectural design team. This integrated approach provides the design team with expertise in using analytical applications and ensures that building information models contain the appropriate level of information to perform the simulations that can support important decisions. “Our Energy Modeling Team is also engaged later in design to perform more detailed simulations, but it is the early involvement that is important to set the strategies for the building,” noted Mark Dietrick, AIA, chief information officer at Burt Hill.

The Materials Promise

While they may not be ideal for thermal simulations, BIM models are well suited to tracking the materials used in a design. If the model is set up properly, the tedious and error-prone task of measuring each surface and volume to estimate material quantities is eliminated. “The only way to take off quantities accurately is out of a model,” said Bhatt. Accurate take-offs reduce waste, which is beneficial in itself. But in addition to providing an accurate measure of how much concrete to order, for example, a model can also track specific attributes of materials. When constructing a BIM model, designers can select building elements from a library of generic assemblies, or they can create their own libraries. Most models already link to cost information for those assemblies. In theory, they could just as easily store information such as quantities of recycled content or even environmental impact scores from life-cycle assessments of those assemblies. Any information that is available for the individual assemblies can instantly be aggregated for the entire model. “We’d like to make a change and be able to understand, in real time, the carbon impact of the change in terms of embodied energy of the materials,” said Mara Baum of Anshen + Allen. The challenge in practical terms is getting accurate information—a problem that is not unique to BIM applications. “It is very hard to get life-cycle data on most products,” noted Pratt. BIM offers one potential solution: to develop channels by which that information is streamed directly from building product manufacturers into the model. Just as many manufacturers now provide CAD representations of their products so designers can drop them right into a design, in the future they will likely publish BIM-friendly models of those same products, incorporating data about their properties. 
“Our end goal is that the building-product manufacturer publishes all the data,” said Noah Cole of Autodesk, adding that some companies, such as Trane, have already started down that path. Other software companies are also on board; according to Roberts, “Bentley is working with McGraw-Hill’s Sweets to help the manufacturers figure out how to store that stuff.”
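The roll-up described above is, at bottom, a sum over assembly attributes keyed by take-off quantities. Here is a minimal sketch with invented assembly names and per-unit figures; as the article notes, the hard part in practice is obtaining trustworthy life-cycle data, not the arithmetic.

```python
# Sketch of aggregating a material attribute across a whole model.
# Assembly names and per-unit embodied-CO2 figures are hypothetical;
# real values would come from life-cycle assessment sources.
ASSEMBLY_LIBRARY = {
    "concrete_slab_m3": 300.0,  # kg CO2e per cubic meter (hypothetical)
    "steel_beam_kg": 1.8,       # kg CO2e per kilogram (hypothetical)
    "gypsum_board_m2": 7.0,     # kg CO2e per square meter (hypothetical)
}

def total_embodied_co2(takeoff):
    """Sum embodied carbon over the model's quantity take-off."""
    return sum(qty * ASSEMBLY_LIBRARY[name] for name, qty in takeoff.items())

# Quantities as they might be read straight out of the model:
takeoff = {"concrete_slab_m3": 50, "steel_beam_kg": 2000, "gypsum_board_m2": 400}
print(total_embodied_co2(takeoff))  # 50*300 + 2000*1.8 + 400*7 = 21400.0
```

Because the quantities come from the model, every design change re-prices the total automatically, which is what makes the real-time carbon feedback Baum describes plausible.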

Sweetwater Creek State Park Visitor Center near Atlanta, Georgia, is the first building in the southeast to earn LEED Platinum certification. Dan Gerding, AIA, managing principal of Gerding Collaborative, credits their implementation of building information modeling using ArchiCAD with aligning the client and design team around this ambitious goal.

But that proprietary-product-based solution “is very tricky in an architectural context because we try to use generic specifications,” noted Pratt. The fact that the design model is usually generic presents problems when it comes to getting parts from the manufacturer, agreed Roberts. “That’s been tried a few times but has never gained traction,” he noted, suggesting that those product libraries are more valuable when a building model is used during construction. Using information models to manage the construction process offers compelling advantages, many of which also have environmental implications. Software now available allows contractors to scan items for inventory management as they arrive on the construction site, and link them directly to their place in the building model. Eventually, electronic tags might allow contractors to track the location of each item and ensure that it is installed in the right place. Such tools could reduce errors and waste while streamlining the commissioning process.

Automated Documentation

The use of BIM raises a host of issues around liability and intellectual property and is forcing the industry to rethink the concept of contract documents. “We’re talking about new contracts, new relationships between architects and contractors and owners,” said Guttman. Currently, he said, the transition to construction is “usually done in a traditional contract arrangement—two-dimensional documents are the contract. But the model is shared in information meetings so everybody in the room is better informed.” Some building owners, including GSA, are demanding ownership of the virtual model, however—which concerns architects, who have traditionally retained copyright on their designs. Legal issues aside, the ability to share a virtual model through the construction process, and even as support for building operations, should improve actual building performance. “We can input information on the fly, as we are creating the model, that can be used directly for facility management,” said Rendek, and “that could bring a huge benefit to the client for little additional work.”

Implications for LEED

The ability of BIM tools to aggregate materials information and analyze other building information also has intriguing implications for the documentation requirements of rating systems such as the U.S. Green Building Council’s (USGBC’s) LEED. Noting that Adobe Systems’ Acrobat technology is the platform for LEED Online, “Anshen + Allen wants to work with them on streamlining the information uptake from the model into the LEED docs,” said Rendek. The Portable Document Format (PDF) created by Adobe provides portability and security for sharing BIM information, and Adobe is moving aggressively to enhance the ability to link data to individual elements in a 3D Acrobat file. While PDFs are valuable for sharing information among users before submitting it for LEED verification, in the future the actual submission won’t necessarily require a PDF file at all, noted Max Zahniser, USGBC’s certification manager for LEED for New Construction. “LEED Online was originally built on XML technology, so our templates are submitting XML packets into our database. We went that route so that we could eventually capitalize on the ability for other tools to submit those packets, without users having to go through LEED Online themselves.” Zahniser added that the next major enhancement to LEED Online, as it evolves to support a new underlying structure for LEED, will have more direct data-flow capability. That ability to deliver documentation seamlessly into LEED Online has obvious value for third-party LEED project-management tools, such as Johnson Controls’ Leedspeed, but in theory that information could come directly from the BIM software. Such an arrangement is not unlikely, given the partnership between USGBC and Autodesk that was announced at Greenbuild in November 2006 (see EBN Vol. 15, No. 12). While for now the dataflow into LEED Online still requires that a user log into the website, the information needs are already being streamlined. 
In particular, the latest release of IES Virtual Environment, which is closely tied to Revit MEP, has a built-in capability to perform LEED’s daylight calculation and report what percentage of the occupied space achieves the required 2% daylight factor. Users also have the option to report those results based on IES’s daylight simulation, and the results of either calculation could be used to demonstrate that a project meets the criteria for LEED’s daylighting credit. As capabilities of this type are expanded and the calculations verified, documentation coming directly from these analysis tools may increase the confidence of design teams that they are submitting documentation LEED will accept, and may even streamline USGBC’s verification process.
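The daylight-factor check itself is simple arithmetic once a simulation supplies interior illuminance values. A sketch follows; the sensor readings and sky illuminance are invented sample numbers, and actual compliance rules involve more detail than this.

```python
# Sketch of the 2% daylight-factor check. The daylight factor is the
# interior illuminance as a percentage of the simultaneous exterior
# horizontal illuminance under an overcast sky. Sample values are invented.
def daylight_factor(interior_lux, exterior_lux):
    """Daylight factor, in percent, at one sensor point."""
    return 100.0 * interior_lux / exterior_lux

def pct_area_meeting_df(interior_readings, exterior_lux, threshold=2.0):
    """Share of floor area (one reading per equal-area point) at or above
    the threshold daylight factor, in percent."""
    passing = sum(1 for lux in interior_readings
                  if daylight_factor(lux, exterior_lux) >= threshold)
    return 100.0 * passing / len(interior_readings)

readings = [450, 320, 180, 90, 60]  # lux at five equal-area sensor points
print(pct_area_meeting_df(readings, exterior_lux=10000))  # -> 40.0
```

The value a tool like Virtual Environment adds is not this arithmetic but the simulation that produces the interior readings in the first place, plus reporting in a form a rating system will accept.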

Transforming an Industry

How BIM Tools and Analysis Tools Interact


While the software can’t take all the credit, BIM tools are a key element of a broader trend in design towards integration of design disciplines and knowledge-based decision making. “For us BIM is the technology that supports integrated practice. Without the rich exchange of digital data, we’d be in bad shape,” said Volker Mueller, design technology manager at NBBJ. Leary of KlingStubbins points to the power of BIM and analysis tools for putting numbers on information that was previously more subjective. “When you’re in front of a certain kind of client, things that have numbers related to them are valued as decision-making points, whereas things that are qualitative are not,” he told EBN. Using Revit, Leary was able to quickly measure how many occupants would have direct views of a window: “When considering a shift to interior offices and external workstations, thanks to the Revit model we had a quantified way to drive the decision.” Real-time performance feedback during design can not only improve the building but also educate designers. “You see the changes and understand what’s going on. It’s not just spitting out numbers—the process of using an iterative tool is educational for us,” Leary reported. Similarly, Kennedy sees his Green Building Studio as a response to the “massive education problem” of getting architects up to speed quickly so they can begin designing buildings with the potential of becoming carbon-neutral. Given the speed at which technology changes, choosing modeling tools is like hitting a moving target. “It’s important to stay open and flexible rather than just following what the software vendors dictate,” suggested Rendek. The fact that only a small subset of designers is currently using BIM tools represents both a challenge and an opportunity. As designers move from conventional CAD to BIM they need training, and younger architects who never worked in older systems may have an advantage. 
With both green design and building information modeling on geometric growth curves, their marriage is mutually supportive. That’s a good thing, given the demands on the industry to learn quickly how to create buildings that make sense for our times.

Thursday, May 6, 2010

New AutoCAD

AutoCAD 2011

Recall from last year’s product launch that AutoCAD 2010—which included new freeform mesh modeling tools, greatly improved PDF support, and the ability to create intelligent, parametric drawings—was referred to as a “watershed event” in AutoCAD’s history, unmatched by any previous release. Autodesk continued its use of superlatives, including “most exciting,” “fantastic,” and “best ever,” to describe this year’s release of AutoCAD. While the application does include some very useful enhancements that build nicely upon the last release, it is ultimately the users who will determine whether these are indeed as groundbreaking as Autodesk makes them out to be. The enhancements fall under three main categories: improved conceptual design capabilities, increased productivity in document production, and better parametrics.

On the conceptual design front, AutoCAD 2011 includes a whole new set of advanced surface modeling tools in addition to the mesh modeling tools introduced in AutoCAD 2010. The new tools, shown in Figure 1, enable users to easily create smooth surfaces and surface transitions, with automatic associativity that maintains relationships between all of the objects. In addition, the surfaces stay associated with their underlying geometry and automatically update when the geometry is changed, providing a fluid interface for 3D design. While the new surface modeling capability is undoubtedly most helpful for the manufacturing industry, as evidenced by the example shown in Figure 1, it can be extremely helpful in AEC for exploring organic building forms that can subsequently be exported as NURBS surfaces or solids to Revit for further development. And given that the majority of AEC users already have AutoCAD for their drafting needs, the new freeform modeling capabilities may reduce the need to use a separate application such as Rhino or form.Z for conceptual design.

Figure 1. The new surface modeling tools in AutoCAD 2011 allow for easier creation of freeform surfaces. (Courtesy: Autodesk)

Also, because the surfaces created with the new tools stay associated with their defining 2D geometry, the ability to add various kinds of geometric constraints to drawing objects in relation to other objects—which was introduced in AutoCAD 2010—can also be used to control the geometry of the surfaces parametrically. For example, you could use a dimensional constraint to parametrically change the size of a circle, which in turn will automatically change any 3D surface object that has been defined from it. This is not the full parametric 3D modeling available in sophisticated mechanical CAD applications, but it is a useful extension of AutoCAD’s 2D parametric capabilities to 3D design.
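
The idea of a driving dimensional constraint propagating to dependent 3D geometry can be sketched generically. This is plain Python, not the AutoCAD API; the class names are invented purely for illustration of the associativity described above:

```python
import math

class Circle:
    """A 2D profile whose radius acts as the driving dimensional constraint."""
    def __init__(self, radius):
        self.radius = radius

class Extrusion:
    """A 3D solid that stays associated with its defining 2D profile."""
    def __init__(self, profile, height):
        self.profile = profile
        self.height = height

    @property
    def volume(self):
        # Recomputed on demand, so the solid always reflects the profile
        return math.pi * self.profile.radius ** 2 * self.height

profile = Circle(radius=2.0)
solid = Extrusion(profile, height=10.0)
v1 = solid.volume

profile.radius = 3.0   # edit the dimensional constraint...
v2 = solid.volume      # ...and the derived 3D object updates automatically

print(v1 < v2)  # True
```

The key design point mirrored here is that the 3D object holds a reference to its defining geometry rather than a copy, so a single parametric edit flows through every dependent object.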

Another new feature in AutoCAD 2011 that is relevant to conceptual design is point cloud support. Users can now bring in a point cloud created with a laser scanning device and use it as the basis for creating a 3D model, similar to how a drawing can be created by using a raster image as a reference. Point clouds with up to 2 billion points are supported. However, there is no way to automatically convert a point cloud into a 3D model—you still have to create the model from scratch. In time, however, third-party developers could use Autodesk’s API to extend the point cloud functionality and provide some automatic conversion capability.

The third main enhancement on the conceptual design front is the expansion of the materials library, which enables users to create rich visual representations of 3D models (see Figure 2). It includes over 1,000 predefined materials that can be dragged and dropped onto objects. The same library is now included in all Autodesk applications, providing consistency and ensuring that material information is fully retained when a model is passed from one application to another. Users can customize the materials and save them to their own library. Libraries can be imported and exported as well as shared with other users.

Figure 2. The expanded materials library in AutoCAD 2011 that is also now implemented in other Autodesk applications. (Courtesy: Autodesk)

On the 2D documentation front, one of the main improvements in AutoCAD 2011 is in hatching. The Hatch command can be accessed more easily through a contextual tab, and a hatch’s scale, rotation, and origin can now be edited directly using expanded object-grip functionality. Additional options for hatches include transparency, background colors, and gradient fills, which enable users to add more color and shading to drawings. Beyond hatches, transparency can now also be applied to entire layers as well as to specific objects, giving users new options for managing the appearance of drawings (see Figure 3). New “Hide Objects” and “Isolate Objects” tools control the visibility of objects regardless of layer, so designers can focus on the objects themselves without having to think about which layer they belong to. Polyline editing has been improved with enhanced grips that can be used to add, remove, or stretch vertices, and to convert straight-line segments to arcs, enabling more direct manipulation of these elements.

Figure 3. The new ability to apply transparency to layers in AutoCAD 2011 provides more control over drawing appearance. (Courtesy: Autodesk)

AutoCAD 2011 also introduces two new commands that can speed the process of creating or selecting objects based on the properties of existing objects: the “Add Selected” tool, which can be used to create new objects based on the properties of an existing object; and the “Select Similar” tool, which enables quick selection of objects that include the same type and properties in the selection set. While these are not novel ideas and have already been implemented in several design applications, they should certainly help AutoCAD users do their work more quickly and efficiently.

Rounding out the set of improvements in AutoCAD 2011 is the ability for design constraints to be inferred in real time as the designer draws, rather than requiring every object relationship to be defined manually. This feature builds upon the constraint-based parametric drawing capability introduced last year and is a good step towards making the application smarter and easier to use.

Last but not least, AutoCAD 2011 is optimized to leverage Windows 7 functionality. It is compatible with all editions of Windows 7 as well as with Windows Vista and Windows XP operating systems. Also, it should be noted that there is no file format change that users have to worry about for AutoCAD 2011.

Natural Gas and Technology
Source: ChevronTexaco Corporation
Over the past thirty years, the oil and natural gas industry has transformed into one of the most technologically advanced industries in the United States. Innovations have reshaped it into a technology leader across all of its segments. This section discusses the role of technology in the evolution of the natural gas industry, focusing on technologies in the exploration and production sector, as well as a few select innovations that have had a profound effect on the potential for natural gas.

In recent years, demand for natural gas has grown substantially. However, as the natural gas industry in the United States matures, domestically available resources become harder to find and produce. As large, conventional natural gas deposits are depleted, the natural gas left in the ground is commonly found in less conventional deposits, which are harder to discover and produce than has historically been the case. Nevertheless, the natural gas industry has been able to keep pace with demand and produce greater amounts of natural gas despite the increasingly unconventional and elusive nature of these deposits. The ability of the industry to increase production in this manner has been a direct result of technological innovation. Below is a brief list of some of the major technological advancements made in recent years:

Advances in the Exploration and Production Sector

Technological innovation in the exploration and production sector has equipped the industry with the equipment and practices necessary to continually increase the production of natural gas to meet rising demand. These technologies serve to make the exploration and production of natural gas more efficient, safe, and environmentally friendly. Despite the fact that natural gas deposits are continually being found deeper in the ground, in remote, inhospitable areas that provide a challenging environment in which to produce natural gas, the exploration and production industry has not only kept up its production pace, but in fact has improved the general nature of its operations. Some highlights of technological development in the exploration and production sector include:

  • 22,000 fewer wells are needed on an annual basis to develop the same amount of oil and gas reserves as were developed in 1985.
  • Had technology remained at 1985 levels, it would take two wells to produce the same amount of oil and natural gas that a single well produces today; put another way, one well today can produce twice as much as a 1985 well.
  • Drilling wastes have decreased by as much as 148 million barrels due to increased well productivity and fewer wells.
  • The drilling footprint of well pads has decreased by as much as 70 percent due to advanced drilling technology, which is extremely useful for drilling in sensitive areas.
  • By using modular drilling rigs and slimhole drilling, the size and weight of drilling rigs can be reduced by up to 75 percent over traditional drilling rigs, reducing their surface impact.
  • Had technology, and thus drilling footprints, remained at 1985 levels, today's drilling footprints would take up an additional 17,000 acres of land.
  • New exploration techniques and vibrational sources mean less reliance on explosives, reducing the impact of exploration on the environment.

Some of the major recent technological innovations in the exploration and production sector include:

  • Advanced 3-D Seismic Imaging
    Source: NGSA
    3-D and 4-D Seismic Imaging - The development of seismic imaging in three dimensions greatly changed the nature of natural gas exploration. This technology uses traditional seismic imaging techniques, combined with powerful computers and processors, to create a three-dimensional model of the subsurface layers. 4-D seismology expands on this, by adding time as a dimension, allowing exploration teams to observe how subsurface characteristics change over time. Exploration teams can now identify natural gas prospects more easily, place wells more effectively, reduce the number of dry holes drilled, reduce drilling costs, and cut exploration time. This leads to both economic and environmental benefits.

  • CO2-Sand Fracturing - Fracturing techniques have been used since the 1970s to help increase the flow rate of natural gas and oil from underground formations. CO2-sand fracturing involves using a mixture of sand proppants and liquid CO2 to fracture formations, creating and enlarging cracks through which oil and natural gas may flow more freely. The CO2 then vaporizes, leaving only sand in the formation, holding the newly enlarged cracks open. Because no other substances are used in this type of fracturing, there are no 'leftovers' from the fracturing process that must be removed. This means that, while this type of fracturing effectively opens the formation and allows for increased recovery of oil and natural gas, it does not damage the deposit, generates no below-ground wastes, and protects groundwater resources.

  • Coiled Tubing - Coiled tubing technologies replace the traditional rigid, jointed drill pipe with a long, flexible coiled pipe string. This greatly reduces the cost of drilling: it provides a smaller drilling footprint, requires less drilling mud, allows faster rig setup, and reduces the time normally needed to make drill-pipe connections. Coiled tubing can also be used in combination with slimhole drilling to provide very economical drilling conditions with less impact on the environment.

  • Measurement While Drilling - Measurement-While-Drilling (MWD) systems allow for the collection of data from the bottom of a well as it is being drilled. This gives engineers and drilling teams access to up-to-the-second information on the exact nature of the rock formations being encountered by the drill bit. This improves drilling efficiency and accuracy, allows better formation evaluation as the drill bit encounters the underground formation, and reduces the chance of formation damage and blowouts.

  • Slimhole Drilling - Slimhole drilling is exactly what it sounds like: drilling a slimmer hole in the ground to get to natural gas and oil deposits. To be considered slimhole drilling, at least 90 percent of a well must be drilled with a drill bit less than six inches in diameter (whereas conventional wells typically use drill bits as large as 12.25 inches in diameter). Slimhole drilling can significantly improve the efficiency of drilling operations and decrease their environmental impact. In fact, shorter drilling times and smaller drilling crews can translate into a 50 percent reduction in drilling costs, while reducing the drilling footprint by as much as 75 percent. Because of its low cost profile and reduced environmental impact, slimhole drilling provides a method of economically drilling exploratory wells in new areas, drilling deeper wells in existing fields, and extracting more natural gas and oil from undepleted fields.

  • Offshore Production - NASA of the Sea
    Source: Anadarko Petroleum Corporation
    Offshore Drilling Technology - The offshore oil and gas production sector is sometimes referred to as the 'NASA of the Sea', due to the monumental achievements in deepwater drilling that have been facilitated by state-of-the-art technology. Natural gas and oil deposits are being found at locations that are deeper and deeper underwater. Whereas offshore drilling operations used to be some of the most risky and dangerous undertakings, new technology, including improved offshore drilling rigs, dynamic positioning devices, and sophisticated navigation systems, is allowing safe, efficient offshore drilling in waters more than 10,000 feet deep.

The above technological advancements provide only a snapshot of the increasingly sophisticated technology being developed and put into practice in the exploration and production of natural gas and oil. New technologies and applications are being developed constantly, and serve to improve the economics of producing natural gas, allow for the production of deposits formerly considered too unconventional or uneconomic to develop, and ensure that the supply of natural gas keeps up with steadily increasing demand. Sufficient domestic natural gas resources exist to help fuel the U.S. for a significant period of time, and technology is playing a huge role in providing low-cost, environmentally sound methods of extracting these resources.

Two other technologies revolutionizing the natural gas industry are the increased use of liquefied natural gas and natural gas fuel cells. These technologies are discussed below.

Liquefied Natural Gas

Cooling natural gas to about -260°F at normal pressure condenses the gas into liquid form, known as Liquefied Natural Gas (LNG). LNG is particularly useful for the transportation of natural gas, since it takes up about one six-hundredth the volume of gaseous natural gas. While LNG is reasonably costly to produce, advances in technology are reducing the costs associated with its liquefaction and regasification. Because it is easy to transport, LNG can make economical those stranded natural gas deposits for which the construction of pipelines is uneconomical.
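
The roughly 600:1 volume reduction can be sanity-checked with the ideal gas law. This is a back-of-the-envelope sketch; the LNG density of 450 kg/m³ is an assumed typical value (published figures range from roughly 420 to 470 kg/m³):

```python
# Estimate the LNG-to-gas volume ratio for methane.
R = 8.314      # J/(mol*K), universal gas constant
M = 0.01604    # kg/mol, molar mass of methane
T = 273.15     # K, 0 degrees C
P = 101325.0   # Pa, 1 atm

# Ideal gas density: rho = P*M/(R*T), about 0.72 kg/m^3 for methane
gas_density = P * M / (R * T)

lng_density = 450.0  # kg/m^3, assumed typical value for LNG

# Volume ratio = density ratio at equal mass
ratio = lng_density / gas_density
print(round(ratio))  # on the order of 600
```

The computed ratio lands near the "one six-hundredth" figure quoted above, with the exact number depending on the LNG density and gas reference conditions assumed.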

LNG Delivery Facility with Tanker
Source: NGSA

LNG, when vaporized to gaseous form, will burn only in concentrations of between 5 and 15 percent when mixed with air. In addition, LNG, or any vapor associated with LNG, will not explode in an unconfined environment. Thus, in the unlikely event of an LNG spill, the natural gas has little chance of igniting or exploding. Liquefaction also has the advantage of removing oxygen, carbon dioxide, sulfur, and water from the natural gas, resulting in LNG that is almost pure methane.

LNG is typically transported by specialized tanker with insulated walls, and is kept in liquid form by autorefrigeration, a process in which the LNG is kept at its boiling point, so that any heat additions are countered by the energy lost from LNG vapor that is vented out of storage and used to power the vessel.

The increased use of LNG is allowing for the production and marketing of natural gas deposits that were previously economically unrecoverable. Although it currently accounts for only about 1 percent of natural gas used in the United States, LNG imports are expected to provide a steady, dependable source of natural gas for U.S. consumption.

Natural Gas Fuel Cells

Fuel cells powered by natural gas are an extremely exciting and promising new technology for the clean and efficient generation of electricity. Fuel cells generate electricity through electrochemical reactions rather than through the combustion of fossil fuels. Essentially, a fuel cell works by passing streams of fuel (usually hydrogen) and oxidant over electrodes that are separated by an electrolyte. This produces a chemical reaction that generates electricity without requiring the combustion of fuel or the addition of heat, as is common in the traditional generation of electricity. When pure hydrogen is used as the fuel and pure oxygen as the oxidant, the reaction that takes place within a fuel cell produces only water, heat, and electricity. In practice, fuel cells emit very low levels of harmful pollutants and generate high-quality, reliable electricity. The use of natural gas powered fuel cells has a number of benefits, including:

  • How a Fuel Cell Works
    Source: DOE - Office of Fossil Energy
    Clean Electricity - Fuel cells provide the cleanest method of producing electricity from fossil fuels. While a pure hydrogen, pure oxygen fuel cell produces only water, electricity, and heat, fuel cells in practice emit only trace amounts of sulfur compounds and very low levels of carbon dioxide. Moreover, the carbon dioxide produced by fuel cell use is concentrated and can be readily recaptured, rather than being emitted into the atmosphere.

  • Distributed Generation - Fuel cells can come in extremely compact sizes, allowing for their placement wherever electricity is needed. This includes residential, commercial, industrial, and even transportation settings.

  • Dependability - Fuel cells are completely enclosed units, with no moving parts or complicated machinery. This translates into a dependable source of electricity, capable of operating for thousands of hours. In addition, they are very quiet and safe sources of electricity. Fuel cells also do not produce power surges, meaning they can be used where a constant, dependable source of electricity is needed.

  • Efficiency - Fuel cells convert the energy stored within fossil fuels into electricity much more efficiently than traditional combustion-based generation. This means that less fuel is required to produce the same amount of electricity. The National Energy Technology Laboratory estimates that fuel cells used in combination with natural gas turbines can yield generation facilities that operate in the 1 to 20 megawatt range at 70 percent efficiency, which is much higher than the efficiencies that can be reached by traditional generation methods within that output range.
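
Since fuel consumption scales inversely with conversion efficiency, the payoff of the 70 percent figure is easy to quantify. A sketch; the 35 percent baseline for a conventional plant is an assumed illustrative value, not a figure from the text:

```python
def fuel_for_output(electric_mwh, efficiency):
    """Primary fuel energy (MWh thermal) needed to deliver electric_mwh."""
    return electric_mwh / efficiency

# Fuel needed to deliver 1 MWh of electricity under each scheme
conventional = fuel_for_output(1.0, 0.35)  # assumed conventional plant
hybrid = fuel_for_output(1.0, 0.70)        # fuel cell / turbine hybrid

savings = 1 - hybrid / conventional
print(f"fuel saved: {savings:.0%}")  # prints: fuel saved: 50%
```

Under these assumptions, doubling the conversion efficiency halves the fuel burned (and hence the emissions) per unit of electricity delivered.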

The generation of electricity has traditionally been a very polluting, inefficient process. However, with new fuel cell technology, the future of electricity generation is expected to change dramatically in the next ten to twenty years. Research and development into fuel cell technology is ongoing, to ensure that the technology is refined to a level where it is cost effective for all varieties of electric generation requirements.
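
The electrochemical reactions described earlier can be written out explicitly for one common chemistry, the proton-exchange-membrane (PEM) hydrogen fuel cell; other fuel cell types (e.g., solid oxide) use different electrolytes and ion carriers:

```latex
% Anode (fuel side): hydrogen is oxidized, releasing electrons
\mathrm{H_2 \rightarrow 2H^+ + 2e^-}
% Cathode (oxidant side): oxygen is reduced, consuming electrons
\mathrm{\tfrac{1}{2}O_2 + 2H^+ + 2e^- \rightarrow H_2O}
% Overall: only water (plus heat and electrical work) is produced
\mathrm{H_2 + \tfrac{1}{2}O_2 \rightarrow H_2O}
```

The electrons released at the anode travel through the external circuit, doing electrical work, which is why the overall reaction yields electricity directly rather than through combustion.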

To learn more about fuel cell development, visit the Fuel Cells 2000 website.

Natural Gas Technology Resources

The natural gas industry is joined by government agencies and laboratories, private research and development firms, and environmental technology groups in developing new technologies that may improve the efficiency, cost-effectiveness, and environmental soundness of the natural gas industry.

Common digital exchange format a solution to BIM problems

Re: Mindset change essential to successful BIM adoption (DCN, March 25)

I’m not sure who came up with the “#D” paradigm for BIM offshoots (“...three-dimensional design; 4D scheduling capabilities; 5D cost estimating; and emergent 6D lifecycle management”) but it rankles.

“4D” makes some sense, but the paradigm breaks down after that, IMHO.

Furthermore, it ignores several other significant attributes of BIM.

Notable among these are more efficient clash detection and energy modeling, not to mention the grail: the automation of precision fabrication processes.

What “D” are those? These techniques have been implemented, not in every BIM or every BIM project, but to an extent sufficient to demonstrate their feasibility.

Each holds potential for savings at least equivalent to those offered by “4D,” “5D,” and “6D” techniques.

The market is full of software aimed at the “Ds” of BIM. All the buzz around them is drawing attention away from BIM’s bigger problems: interoperability (and not just between one Autodesk product and another), and object modeling standards.

Major BIM software producers began working on solutions to these problems years ago, forming the International Alliance for Interoperability, which developed a common digital exchange format, the Industry Foundation Classes (IFC).

But after a decade, progress has slowed to a crawl, even though IFC is essentially mature with respect to the basic components of buildings (columns, walls, floors, roofs), with some other objects (windows and doors) coming along close behind.
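
For readers who have not seen one: an IFC file is a plain-text STEP (ISO 10303-21) physical file. A heavily abbreviated, illustrative fragment defining a wall might look like the following; the GUID, entity numbers, and cross-references here are invented placeholders, not a complete valid file:

```
ISO-10303-21;
HEADER;
FILE_DESCRIPTION(('ViewDefinition [CoordinationView]'),'2;1');
FILE_SCHEMA(('IFC2X3'));
ENDSEC;
DATA;
/* A wall entity: GlobalId, OwnerHistory, Name, Description, ObjectType,
   ObjectPlacement, Representation, Tag */
#100=IFCWALL('2O2Fr$t4X7Zf8NOew3FLOH',#2,'Basic Wall:Generic',$,$,#50,#60,$);
ENDSEC;
END-ISO-10303-21;
```

Because every vendor can read and write this neutral text form, a wall authored in one BIM package can, in principle, be reconstructed in another without going through a proprietary format.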

The whole effort, now under the aegis of buildingSMART Alliance, is nearly lost in the marketing cacophony.

If we want BIM to go forward, the IFC effort needs technical and financial support.

The IFC model exchange format is not the ultimate answer to interoperability; it, too, will surely pass with time. But it, or something like it, is a vital step along the way.

Nothing else like it approaches its level of development as far as I know. So it represents the best, if not the only, route to dealing with BIM’s biggest problems.

Your say

We need a BIM protocol that can be accessed at all levels

Re: Common digital exchange format a solution to BIM problems (DCN, April 30)

Brian (Lighthart) is right in his assessment of barriers that exist in successful deployment of a BIM solution cradle to grave.

As a CanBIM colleague put it to me at the Insight BIM conference held last month in Toronto, if ‘Bimmers’ truly want to advocate for a common protocol of interoperability across all platforms, then the advocacy needs to shift from a proprietary file-sharing standard (e.g., .dwg) to something that can be interpreted, modified, and accessed at all levels.

How about a dot BIM protocol for IFC?

Imagine if the solution to the woes of implementing BIM platform solutions were for the dot BIM file extension to become the dot PDF of the modeling and building information world.

Autodesk, Bentley, and the other vendors really need to take a hard look at this for the overall good of the industry, and I believe they are now, or soon will be, as these challenges grow with mass adoption in North America.

Someday an enterprising C++ programmer will come along and create the ‘dot BIM’ solution Bimmers are really looking for.

Until then, Industry Foundation Classes deployed in accordance with ISO 16739 is your alternative.

I just gave away a very profitable business idea - is anyone out there listening?

Bentley's "BIM for Green Buildings" Executive Summit

With green rapidly emerging as the new global mantra, it is hardly surprising to find AEC technology vendors jumping on the sustainable design bandwagon, particularly those developing BIM (building information modeling) solutions. One of the most significant aspects of BIM is its ability to capture the description of a building in a semantically intelligent format that can be analyzed to study different aspects of its performance, including those related to energy use. Thus, there is a natural correlation between BIM and green buildings; in fact, I would even go so far as to say that if there ever was a technology "in the right place, at the right time"—at least in AEC—that has to be BIM in the context of sustainable design.

Of the leading BIM vendors, Graphisoft has traditionally been considered the front-runner in supporting energy analysis with IFC support and strong links from ArchiCAD to tools such as Energy Plus, ArchiPHYSIK, Ecotect, and RIUSKA (see more about this on Graphisoft's website). For Autodesk, sustainable design is rapidly emerging as a key focus area, as was demonstrated by presentations at Autodesk University 2006 and by its recent partnership with Integrated Environmental Solutions (IES) to closely integrate IES' building performance analysis tools with Revit. Bentley, in turn, hosted a one-day "BIM for Green Buildings" Executive Summit last month in New York, which I had the opportunity to attend. The event was focused on exploring the evident synergy between the new BIM-enabled design methodologies and objectives in sustainable design through a series of "best practices" seminar sessions by firms who were, according to Bentley, doing BIM and green design well, followed by an interactive "think tank" discussion with audience participation. The highlights of the presentations and an analysis of the key discussion points that emerged are captured in this AECbytes "Building the Future" article.

Seminar Sessions

The Summit featured five main sessions, the first of which was by Bill Barnard and Myron Bollman of the Troyer Group, a full-service AEC firm providing planning, design, and construction services, for which sustainable design has been an important component of its work right from the start, long before the LEED (Leadership in Energy and Environmental Design) Green Building Rating System was even established. Now, the firm is a charter member of the United States Green Building Council (USGBC) and has multiple staff members from each discipline trained in LEED. Most of the firm’s presentation at the Bentley Summit was focused on describing the various green features and LEED strategies it had incorporated on some of its key projects. The firm has been using Bentley’s solutions for over 10 years, including site analysis with GEOPAK Site and Google Earth, architectural design with Bentley Architecture, structural design and analysis using the integration of RAM and STAAD.Pro with Bentley Structural, HVAC design with Bentley Mechanical and Trace, and conflict detection with Bentley’s Interference Manager.
It was not clear if the firm was actually using BIM to further sustainable design in its practice, but the presenters did highlight what was needed for BIM and green design to come together:

  • the ability to integrate information such as materials, building loads, lights, occupants, climate, and building codes into the BIM model, so that different green design aspects can be analyzed interactively, particularly at the preliminary design stage;
  • linking manufacturers’ product data into the BIM model to incorporate accurate material information for analysis;
  • being able to use the BIM model to explore the budgetary implications of features such as a green roof or reduced water usage, in terms of both first cost and recurring costs; and
  • linking the BIM model to LEED certification forms so that the certification process could be automated.

The bulk of the next session by Rodger Poole of Gresham Smith & Partners, a large multi-disciplinary firm of architects, engineers, planners and designers working in diverse practice areas, was devoted to describing the firm's implementation of BIM using Bentley solutions, the benefits that had been achieved, the challenges encountered and how they were addressed. BIM was used in the firm for a wide range of tasks including program analysis, space analysis, material takeoff, automatic report generation, 4D scheduling, procurement, building commissioning, and various FM services. With regard to the topic of green buildings, the firm was a member of the USGBC and was seeing a growing interest in sustainable design. Poole went on to suggest some strategies for enabling sustainable design with BIM such as building performance modeling, site modeling for more context-sensitive design, and the development of on-the-fly energy calculators. However, there was no indication if the firm was actively designing green buildings or if BIM was being used to explore sustainable design strategies.

Volker Mueller of NBBJ then provided an overview of BIM and sustainability in his firm. NBBJ, a leading architecture and design firm with a global presence, is a long-term user of Bentley’s BIM solutions and is a frequent winner of Bentley’s annual BE Awards—it won two of the six awards in the Building vertical announced at last year’s BE conference. Its BIM implementation was described in some detail in my article on the BIM Symposium at the University of Minnesota published last year, and will therefore not be repeated here. With respect to sustainable design, NBBJ appreciates the growing green movement and the tremendous responsibility it places on the AEC industry, given that buildings account for the largest share of energy consumption in the US. NBBJ has a Sustainable Design Group within the firm comprising over 100 LEED professionals; it has a growing number of LEED-certified projects and projects tracking to LEED; and it is also implementing sustainable design strategies in its own offices, with its Seattle office aiming for LEED Gold certification. With regard to tying BIM and sustainability together, these are currently parallel but interrelated tracks at NBBJ: BIM models are used for solar studies as well as glare and heat gain studies; radiosity-based rendering of the models is used to study natural light penetration and determine how to get more light using shafts and skylights; and the use of Bentley’s multi-disciplinary BIM suite allows better systems coordination. Many of the general benefits of BIM that NBBJ is realizing also have a green design pay-off: for example, the programmatic clarity achieved with BIM leads to a more economic and thus more energy-efficient design, and the improved prefabrication capability and gains in construction efficiency lead to reduced energy use.
For detailed energy analysis, NBBJ partners with a consultant to produce CFD (computational fluid dynamics) diagrams, but most often these are produced too late to make any significant changes to the design. What is critically needed is an easier, interactive link between the BIM models and analysis tools, so that the design can incorporate critical energy-related feedback from an early stage. Most of the current links between BIM and analysis tools rely on IFC import/export, which, as NBBJ has found, is not an optimal process. (For more on IFCs, see the article, "The IFC Building Model: A Look Under the Hood.")

The last two sessions at the Summit also reiterated the point that the overall benefits of BIM contribute to a greener building. The first of these was by Robert Stevenson of Ghafari Associates, a full-service A/E firm well known for its cutting-edge multidisciplinary BIM implementation, especially in the automotive and aviation sectors in projects such as the new General Motors Lansing Delta Township Assembly Plant and the Detroit Metropolitan Airport North Terminal Development. Ghafari's BIM approach has been described in detail in a dedicated article in AECbytes, published in November 2005. On the subject of green buildings, the main aspects of BIM implementation at Ghafari that contribute to greener design are conflict resolution at design time and just-in-time construction, which result in energy savings because of reduced scrap, reduced transportation, reduced site disturbances, and shorter construction time. Ghafari is also pursuing LEED certification on several projects by incorporating green elements and design features, but it wasn't clear whether BIM was directly enabling or facilitating this.

The final session, by Michael Wick of General Motors, was useful in providing the much-needed owner's perspective on green design, in this case in the context of the General Motors Lansing Delta Township Assembly Plant project that was designed by Ghafari. This project was awarded LEED Gold certification, making it the first automotive project to achieve this distinction. It has also won the 2007 AIA Environmental Leadership Award. There is, no doubt, some irony to a green design award going to a facility for manufacturing cars, as Wick himself pointed out, but General Motors was keen to derive the many economic, environmental, community, and health and safety benefits of green design. It had a LEED-accredited professional on the team and pursued LEED certification to establish environmental leadership, reduce long-term operating costs, and have healthier buildings for its employees. The facility will use 55% less energy than comparable plants, and its energy usage is estimated to be 30% below the ASHRAE standard, which is quite a remarkable achievement.

"Think Tank" Discussion

While most of the individual sessions focused either on BIM or on green design, some interesting points on how the two come together did emerge in the Q/A session and discussion following the presentations. One was related to the use of the IFC, which Volker Mueller had briefly mentioned during his presentation and elaborated upon a little more during the discussion. While NBBJ does use the IFC to send data to consultants, the process is quite involved and time-consuming. Every exchange is case-specific and needs proper mapping to ensure that the application exporting the IFC file includes all the data needed by the receiving application, and the exchange has to be tested before being used on an actual project. Thus, using the IFC to facilitate interoperability between applications is not straightforward, which could account for the relatively slow adoption of IFC-based analysis tools in conjunction with BIM applications. A potential solution to this problem would be applications, both for building modeling and for analysis, that use the IFC as their native file format, so that the entire rigmarole of import/export and case-specific mappings could be avoided. No such solutions are available yet, and it is not known if any are even in the works.
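The case-specific mapping and testing that Mueller describes can be pictured as a "pre-flight" check: before an IFC file is sent to an analysis tool, the sender verifies that every attribute the receiving application depends on is actually populated. The sketch below is purely illustrative of that idea; the entity and attribute names in `required_map` and the toy `model` are hypothetical placeholders, not a real IFC schema binding or any actual tool's requirements.

```python
# Illustrative sketch of a case-specific IFC exchange "pre-flight" check.
# The required_map and the model dict are hypothetical placeholders, not a
# real IFC schema binding or a real analysis application's requirements.

# What the (hypothetical) receiving analysis tool needs, per entity type:
required_map = {
    "IfcWall": ["GlobalId", "ThermalTransmittance"],
    "IfcWindow": ["GlobalId", "OverallHeight", "OverallWidth"],
}

def missing_data(model, required_map):
    """Return (entity_type, index, attribute) for every required value
    that is absent, so the mapping can be fixed before the exchange."""
    problems = []
    for entity_type, attrs in required_map.items():
        for i, entity in enumerate(model.get(entity_type, [])):
            for attr in attrs:
                if entity.get(attr) is None:
                    problems.append((entity_type, i, attr))
    return problems

# A toy "exported model": the second wall lacks the U-value the tool needs.
model = {
    "IfcWall": [
        {"GlobalId": "wall-001", "ThermalTransmittance": 0.35},
        {"GlobalId": "wall-002", "ThermalTransmittance": None},
    ],
    "IfcWindow": [
        {"GlobalId": "win-001", "OverallHeight": 1.2, "OverallWidth": 0.9},
    ],
}

print(missing_data(model, required_map))
# One gap found: the second wall's ThermalTransmittance
```

In practice this kind of check is what the per-project testing Mueller mentions amounts to, except that it is done manually against the exporting and importing applications rather than in code.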

Another critical point that was brought up was the possibility that the design model might not be the same as the models needed for analysis. Just as we have had the long-standing debate about design models versus construction models (most recently discussed in the article, "The AGC's BIM Initiatives and the Contractor's Guide to BIM"), we are now confronted with the same question with regard to different aspects of energy analysis. There is no doubt that different models are required for daylighting analysis, for thermal analysis, for a detailed DOE-2 simulation, for a CFD analysis, and so on, as different kinds of building information are needed for these different analyses. The question is whether the design model created by BIM applications can include all the information required to automatically derive these different models for different kinds of energy analysis. If we want energy analysis to become an integral part of the design process, this capability is very important, so that users don't have to expend additional resources to create separate energy-related models. But does this then over-burden the design model and make it too cumbersome to work with? It is difficult to know the answer to this question until we actually have such BIM applications. We do have structural BIM applications that combine a physical model of a structure with an analytical model that can be sent to structural analysis tools, but we are still far from a multi-disciplinary BIM model that integrates not just spatial, structural, and MEP information but also includes all the data needed for the varied types of energy-related analyses mentioned earlier.
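The idea of automatically deriving analysis-specific models from one design model can be sketched abstractly: each analysis consumes a different projection of the same underlying building data. The classes and the two derivation functions below are hypothetical illustrations of that concept only, not any vendor's actual schema or any real analysis tool's input format.

```python
# Hypothetical sketch: one design model, multiple derived analysis views.
from dataclasses import dataclass

@dataclass
class Surface:
    name: str
    area: float        # m^2
    u_value: float     # W/(m^2*K), thermal transmittance
    is_glazed: bool
    orientation: str   # "N", "S", "E", or "W"

# The single "design model" here is just a list of envelope surfaces.
design_model = [
    Surface("south wall", 40.0, 0.30, False, "S"),
    Surface("south window", 10.0, 1.80, True, "S"),
    Surface("north wall", 45.0, 0.30, False, "N"),
    Surface("roof", 80.0, 0.20, False, "N"),
]

def daylighting_view(model):
    """Projection for daylighting: glazed area grouped by orientation."""
    view = {}
    for s in model:
        if s.is_glazed:
            view[s.orientation] = view.get(s.orientation, 0.0) + s.area
    return view

def thermal_view(model):
    """Projection for thermal analysis: total envelope UA in W/K."""
    return sum(s.area * s.u_value for s in model)

print(daylighting_view(design_model))  # {'S': 10.0}
print(thermal_view(design_model))      # 12 + 18 + 13.5 + 16 = 59.5
```

The open question raised in the discussion is whether a production BIM model can carry enough attributes for all such projections at once (daylighting, thermal, DOE-2, CFD) without becoming too cumbersome to work with.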
