Drafting the Future

Our architectural, civil, and mechanical drafting services can take your commercial, industrial, or residential sketches and preliminary drawings and turn them into final, fully coordinated construction documents.

WELCOME TO OUR BLOG


Thursday, May 6, 2010

New AutoCAD


AutoCAD 2011

Recall from last year’s product launch that AutoCAD 2010—which included new freeform mesh modeling tools, greatly improved PDF support, and the ability to create intelligent, parametric drawings—was referred to as a “watershed event” in AutoCAD’s history, unmatched by any previous release. Autodesk continued its use of superlatives, including “most exciting,” “fantastic,” and “best ever,” to describe this year’s release of AutoCAD. While the application does include some very useful enhancements that build nicely upon the last release, it is ultimately the users who will determine whether these are indeed as groundbreaking as Autodesk makes them out to be. The enhancements fall under three main categories: improved conceptual design capabilities, increased productivity in document production, and better parametrics.

On the conceptual design front, AutoCAD 2011 includes a whole new set of advanced surface modeling tools in addition to the mesh modeling tools introduced in AutoCAD 2010. The new tools, shown in Figure 1, enable users to easily create smooth surfaces and surface transitions, with automatic associativity that maintains the relationships between objects. In addition, the surfaces stay associated with their underlying geometry and automatically update when that geometry changes, providing a fluid interface for 3D design. While the new surface modeling capability is undoubtedly most helpful to the manufacturing industry, as the example in Figure 1 suggests, it can be extremely helpful in AEC for exploring organic building forms that can subsequently be exported as NURBS surfaces or solids to Revit for further development. And given that the majority of AEC users already have AutoCAD for their drafting needs, the new freeform modeling capabilities may reduce the need for a separate conceptual design application such as Rhino or form.Z.


Figure 1. The new surface modeling tools in AutoCAD 2011 allow for easier creation of freeform surfaces. (Courtesy: Autodesk)

Also, because surfaces created with the new tools stay associated with their defining 2D geometry, the geometric constraints that can be applied to drawing objects in relation to other objects—introduced in AutoCAD 2010—can also be used to control the geometry of the surfaces parametrically. For example, you could use a dimensional constraint to change the size of a circle, which in turn automatically updates any 3D surface object defined from it. This is not the full parametric 3D modeling available in sophisticated mechanical CAD applications, but it is a useful extension of AutoCAD’s 2D parametric capabilities into 3D design.
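
To make the associativity idea concrete, here is a minimal sketch in plain Python (this is not AutoCAD’s actual API; the class names and update mechanism are illustrative assumptions only) of how a dimensional change to a 2D profile can propagate to dependent 3D geometry.

    # Conceptual sketch only: a 2D profile whose "dimensional constraint"
    # (the radius) drives a dependent 3D extrusion, mimicking AutoCAD 2011's
    # 2D-to-3D associativity. Not AutoCAD's API.
    import math

    class Circle:
        def __init__(self, radius):
            self.radius = radius
            self.dependents = []          # 3D objects derived from this profile

        def set_radius(self, radius):
            self.radius = radius
            for obj in self.dependents:   # associativity: push the change downstream
                obj.update()

    class Extrusion:
        def __init__(self, profile, height):
            self.profile = profile
            self.height = height
            profile.dependents.append(self)
            self.update()

        def update(self):
            # Recompute derived geometry from the current profile
            self.volume = math.pi * self.profile.radius ** 2 * self.height

    circle = Circle(radius=5.0)
    cylinder = Extrusion(circle, height=10.0)
    circle.set_radius(7.5)                # the "constraint" change propagates
    print(f"Updated volume: {cylinder.volume:.1f}")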

Another new feature in AutoCAD 2011 that is relevant to conceptual design is point cloud support. Users can now bring in a point cloud created with a laser scanning device and use it as the basis for creating a 3D model, much as a raster image can be used as a reference for creating a drawing. Point clouds with up to 2 billion points are supported. However, there is no way to automatically convert a point cloud into a 3D model—you still have to create the model from scratch. In time, though, third-party developers could use Autodesk’s API to extend the point cloud functionality and provide some automatic conversion capability.
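
As a rough illustration of one small step such a conversion tool might perform, the sketch below fits a plane to a patch of synthetic “scanned” points by least squares using NumPy. Real point-cloud reconstruction is far more involved, and nothing here reflects Autodesk’s actual API.

    # Fit a plane z = ax + by + c to noisy "scan" points by least squares.
    # Purely illustrative; real reconstruction handles segmentation,
    # outliers, and far more complex surfaces.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, 500)
    y = rng.uniform(0, 10, 500)
    # Synthetic points near the plane z = 0.5x - 0.2y + 3, plus sensor noise
    z = 0.5 * x - 0.2 * y + 3 + rng.normal(0, 0.05, 500)

    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
    print(f"Fitted plane: z = {a:.3f}x + {b:.3f}y + {c:.3f}")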

The third main enhancement on the conceptual design front is the expansion of the materials library, which enables users to create rich visual representations of 3D models (see Figure 2). It includes over 1,000 predefined materials that can be dragged and dropped onto objects. The same library is now included in all Autodesk applications, providing consistency and ensuring that material information is fully retained when a model is passed from one application to another. Users can customize the materials and save them to their own library. Libraries can be imported and exported as well as shared with other users.


Figure 2. The expanded materials library in AutoCAD 2011, which is now also implemented in other Autodesk applications. (Courtesy: Autodesk)

On the 2D documentation front, one of the main improvements in AutoCAD 2011 is in hatching. The Hatch command can be accessed more easily through a contextual tab, and a hatch’s scale, rotation, and origin can now be edited directly using expanded object-grip functionality. Additional options for hatches include transparency, background colors, and gradient fills, which enable users to add more color and shading to drawings. Beyond hatches, transparency can now also be applied to entire layers as well as to specific objects, giving users new options for managing the appearance of drawings (see Figure 3). New “Hide Objects” and “Isolate Objects” tools control the visibility of objects regardless of layer, so designers can focus on the objects themselves without having to think about what layer they belong to. Polyline editing has been improved with enhanced grips that can be used to add, remove, or stretch vertices and to convert straight-line segments to arcs, enabling more direct manipulation of these elements.


Figure 3. The new ability to apply transparency to layers in AutoCAD 2011 provides more control over drawing appearance. (Courtesy: Autodesk)

AutoCAD 2011 also introduces two new commands that can speed the process of creating or selecting objects based on the properties of existing objects: the “Add Selected” tool, which creates new objects based on the properties of an existing object, and the “Select Similar” tool, which quickly selects all objects of the same type with matching properties. While these are not novel ideas and have already been implemented in several design applications, they should certainly help AutoCAD users work more quickly and efficiently.
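
For the curious, here is a conceptual Python sketch of what a “Select Similar” operation amounts to: filtering the drawing for objects that share a reference object’s type and key properties. This is not AutoCAD’s implementation, and the particular property set compared here is an assumption for illustration.

    # Conceptual "Select Similar": return every object matching the reference
    # object's type and key properties (layer and color, in this sketch).
    from dataclasses import dataclass

    @dataclass
    class DrawingObject:
        kind: str     # e.g., "circle", "polyline"
        layer: str
        color: str

    def select_similar(reference, objects):
        key = (reference.kind, reference.layer, reference.color)
        return [o for o in objects if (o.kind, o.layer, o.color) == key]

    drawing = [
        DrawingObject("circle", "walls", "red"),
        DrawingObject("circle", "walls", "red"),
        DrawingObject("circle", "doors", "red"),
        DrawingObject("polyline", "walls", "red"),
    ]
    print(len(select_similar(drawing[0], drawing)))   # -> 2 matching circles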

Rounding off the set of improvements in AutoCAD 2011 is the ability for design constraints to be inferred in real time as the designer draws, rather than requiring every object relationship to be defined manually. This feature builds upon the constraint-based parametric drawing capability introduced last year and is a good step toward making the application smarter and easier to use.

Last but not least, AutoCAD 2011 is optimized to leverage Windows 7 functionality. It is compatible with all editions of Windows 7 as well as with the Windows Vista and Windows XP operating systems. Also, there is no file format change for users to worry about in AutoCAD 2011.

Natural Gas and Technology
Source: ChevronTexaco Corporation
Over the past thirty years, the oil and natural gas industry has transformed into one of the most technologically advanced industries in the United States. Innovation has reshaped the industry into a technology leader across all of its segments. This section discusses the role of technology in the evolution of the natural gas industry, focusing on technologies in the exploration and production sector, as well as a few select innovations that have had a profound effect on the potential for natural gas.

In recent years, demand for natural gas has grown substantially. However, as the natural gas industry in the United States matures, domestically available resources become harder to find and produce. As large, conventional natural gas deposits are extracted, the gas that remains is increasingly found in less conventional deposits, which are harder to discover and produce than has historically been the case. Nevertheless, the industry has kept pace with demand and produced greater amounts of natural gas despite the increasingly unconventional and elusive nature of these deposits. This ability to increase production has been a direct result of technological innovation; some of the major recent advancements are highlighted below.

Advances in the Exploration and Production Sector

Technological innovation in the exploration and production sector has given the industry the tools and practices necessary to continually increase the production of natural gas to meet rising demand. These technologies make the exploration and production of natural gas more efficient, safe, and environmentally friendly. Even though natural gas deposits are continually being found deeper in the ground and in remote, inhospitable areas that make production challenging, the exploration and production industry has not only kept up its production pace but has improved the overall nature of its operations. Some highlights of technological development in the exploration and production sector include:

  • 22,000 fewer wells are needed on an annual basis to develop the same amount of oil and gas reserves as were developed in 1985.
  • Had technology remained at 1985 levels, it would take two wells to produce the same amount of oil and natural gas that a single well produces today; in other words, one modern well is roughly twice as productive as a 1985 well (a worked example follows this list).
  • Drilling wastes have decreased by as much as 148 million barrels due to increased well productivity and fewer wells.
  • The drilling footprint of well pads has decreased by as much as 70 percent due to advanced drilling technology, which is extremely useful for drilling in sensitive areas.
  • By using modular drilling rigs and slimhole drilling, the size and weight of drilling rigs can be reduced by up to 75 percent over traditional drilling rigs, reducing their surface impact.
  • Had technology, and thus drilling footprints, remained at 1985 levels, today's drilling footprints would take up an additional 17,000 acres of land.
  • New exploration techniques and vibrational sources mean less reliance on explosives, reducing the impact of exploration on the environment.
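
As a quick worked example of the two-to-one productivity claim above, the arithmetic can be spelled out in a few lines of Python; the target output figure is arbitrary.

    # Wells needed to hit a production target at 1985 vs. modern productivity.
    # The 2x figure is the article's; the target is an arbitrary illustration.
    target_output = 100.0                  # arbitrary units of gas
    output_per_well_1985 = 1.0
    output_per_well_today = 2.0            # twice the 1985 rate

    wells_1985 = target_output / output_per_well_1985
    wells_today = target_output / output_per_well_today
    print(f"Wells needed: {wells_1985:.0f} at 1985 rates vs. {wells_today:.0f} today")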

Some of the major recent technological innovations in the exploration and production sector include:

  • 3-D and 4-D Seismic Imaging - The development of seismic imaging in three dimensions greatly changed the nature of natural gas exploration. This technology combines traditional seismic imaging techniques with powerful computers and processors to create a three-dimensional model of the subsurface layers. 4-D seismology expands on this by adding time as a dimension, allowing exploration teams to observe how subsurface characteristics change over time. Exploration teams can now identify natural gas prospects more easily, place wells more effectively, reduce the number of dry holes drilled, reduce drilling costs, and cut exploration time, yielding both economic and environmental benefits.
    [Image: Advanced 3-D seismic imaging. Source: NGSA]

  • CO2-Sand Fracturing - Fracturing techniques have been used since the 1970s to help increase the flow rate of natural gas and oil from underground formations. CO2-sand fracturing uses a mixture of sand proppants and liquid CO2 to fracture formations, creating and enlarging cracks through which oil and natural gas may flow more freely. The CO2 then vaporizes, leaving only the sand in the formation to hold the newly enlarged cracks open. Because no other substances are used in this type of fracturing, there are no 'leftovers' from the process that must be removed. This means that while this type of fracturing effectively opens the formation and allows for increased recovery of oil and natural gas, it does not damage the deposit, generates no below-ground wastes, and protects groundwater resources.

  • Coiled Tubing - Coiled tubing technologies replace the traditional rigid, jointed drill pipe with a long, flexible coiled pipe string. This greatly reduces the cost of drilling: it provides a smaller drilling footprint, requires less drilling mud, allows faster rig set-up, and reduces the time normally needed to make drill-pipe connections. Coiled tubing can also be combined with slimhole drilling for highly economical drilling with less environmental impact.

  • Measurement While Drilling - Measurement-while-drilling (MWD) systems allow data to be collected from the bottom of a well as it is being drilled, giving engineers and drilling teams up-to-the-second information on the exact nature of the rock formations the drill bit is encountering. This improves drilling efficiency and accuracy, allows better formation evaluation as the drill bit penetrates the formation, and reduces the chance of formation damage and blowouts.

  • Slimhole Drilling - Slimhole drilling is exactly what it sounds like: drilling a slimmer hole in the ground to reach natural gas and oil deposits. To be considered slimhole drilling, at least 90 percent of a well must be drilled with a drill bit less than six inches in diameter (whereas conventional wells typically use bits as large as 12.25 inches in diameter). Slimhole drilling can significantly improve the efficiency of drilling operations and decrease their environmental impact. In fact, shorter drilling times and smaller drilling crews can translate into a 50 percent reduction in drilling costs, while reducing the drilling footprint by as much as 75 percent. Because of its low cost profile and reduced environmental impact, slimhole drilling offers an economical way to drill exploratory wells in new areas, drill deeper wells in existing fields, and extract more natural gas and oil from undepleted fields.

  • Offshore Drilling Technology - The offshore oil and gas production sector is sometimes referred to as the 'NASA of the Sea' because of the monumental achievements in deepwater drilling made possible by state-of-the-art technology. Natural gas and oil deposits are being found at locations deeper and deeper underwater. Whereas offshore drilling operations used to be among the most risky and dangerous undertakings, new technologies, including improved offshore drilling rigs, dynamic positioning devices, and sophisticated navigation systems, now allow safe, efficient offshore drilling in waters more than 10,000 feet deep.
    [Image: Offshore production platform. Source: Anadarko Petroleum Corporation]

The above technological advancements provide only a snapshot of the increasingly sophisticated technology being developed and put into practice in the exploration and production of natural gas and oil. New technologies and applications are being developed constantly, and serve to improve the economics of producing natural gas, allow for the production of deposits formerly considered too unconventional or uneconomic to develop, and ensure that the supply of natural gas keeps up with steadily increasing demand. Sufficient domestic natural gas resources exist to help fuel the U.S. for a significant period of time, and technology is playing a huge role in providing low-cost, environmentally sound methods of extracting these resources.

Two other technologies that are revolutionizing the natural gas industry include the increased use of liquefied natural gas, and natural gas fuel cells. These technologies are discussed below.

Liquefied Natural Gas

Cooling natural gas to about -260°F at normal pressure condenses the gas into liquid form, known as liquefied natural gas (LNG). LNG is particularly useful for transporting natural gas, since it takes up about one six-hundredth the volume of gaseous natural gas. While LNG is reasonably costly to produce, advances in technology are reducing the costs associated with liquefaction and regasification. Because it is easy to transport, LNG can make economical those stranded natural gas deposits for which the construction of pipelines is uneconomical.
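
That 600:1 volume reduction is easy to put in concrete terms; the few lines of Python below work through it, with the tanker capacity chosen purely for illustration.

    # The 600:1 liquefaction ratio from the text, applied to a hypothetical
    # tanker load. The 150,000 m3 capacity is an illustrative assumption.
    liquefaction_ratio = 600
    tanker_lng_m3 = 150_000.0

    gas_delivered_m3 = tanker_lng_m3 * liquefaction_ratio
    print(f"{tanker_lng_m3:,.0f} m3 of LNG regasifies to ~{gas_delivered_m3:,.0f} m3 of gas")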

LNG Delivery Facility with Tanker
Source: NGSA

LNG, when vaporized to gaseous form, will burn only in concentrations of between 5 and 15 percent when mixed with air. In addition, LNG, or any vapor associated with LNG, will not explode in an unconfined environment. Thus, in the unlikely event of an LNG spill, the natural gas has little chance of igniting or exploding. Liquefaction also has the advantage of removing oxygen, carbon dioxide, sulfur, and water from the natural gas, resulting in LNG that is almost pure methane.
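
That narrow flammability window can be captured in a trivial check; the sketch below simply encodes the 5 to 15 percent range quoted above.

    # Encode the 5-15 percent flammability window for methane-air mixtures.
    def is_flammable(gas_concentration_pct):
        return 5.0 <= gas_concentration_pct <= 15.0

    for pct in (2.0, 5.0, 10.0, 15.0, 30.0):
        state = "flammable" if is_flammable(pct) else "not flammable"
        print(f"{pct:>4.1f}% methane in air: {state}")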

LNG is typically transported in specialized tankers with insulated walls and is kept in liquid form by autorefrigeration, a process in which the LNG is kept at its boiling point so that any heat input is offset by energy carried away as LNG vapor, which is vented from storage and used to power the vessel.

The increased use of LNG is allowing the production and marketing of natural gas deposits that were previously economically unrecoverable. Although LNG currently accounts for only about 1 percent of the natural gas used in the United States, LNG imports are expected to provide a steady, dependable source of natural gas for U.S. consumption.

Natural Gas Fuel Cells

Fuel cells powered by natural gas are an extremely exciting and promising new technology for the clean and efficient generation of electricity. Fuel cells generate electricity through electrochemical reactions rather than the combustion of fossil fuels. Essentially, a fuel cell works by passing streams of fuel (usually hydrogen) and oxidants over electrodes that are separated by an electrolyte. This produces a chemical reaction that generates electricity without requiring the combustion of fuel or the addition of heat, as is common in traditional electricity generation. When pure hydrogen is used as the fuel and pure oxygen as the oxidant, the reaction within a fuel cell produces only water, heat, and electricity. In practice, fuel cells emit very low levels of harmful pollutants while generating high-quality, reliable electricity. The use of natural gas powered fuel cells has a number of benefits, including:

  • Clean Electricity - Fuel cells provide the cleanest method of producing electricity from fossil fuels. While a pure hydrogen, pure oxygen fuel cell produces only water, electricity, and heat, fuel cells in practice emit only trace amounts of sulfur compounds and very low levels of carbon dioxide. Moreover, the carbon dioxide produced by fuel cell use is concentrated and can be readily recaptured rather than emitted into the atmosphere.
    [Image: How a fuel cell works. Source: DOE - Office of Fossil Energy]

  • Distributed Generation - Fuel cells can come in extremely compact sizes, allowing for their placement wherever electricity is needed. This includes residential, commercial, industrial, and even transportation settings.

  • Dependability - Fuel cells are completely enclosed units with no moving parts or complicated machinery. This translates into a dependable source of electricity capable of operating for thousands of hours. In addition, they are very quiet and safe sources of electricity. Fuel cells also do not produce electricity surges, meaning they can be used where a constant, dependable supply of electricity is needed.

  • Efficiency - Fuel cells convert the energy stored in fossil fuels into electricity much more efficiently than traditional combustion-based generation, so less fuel is required to produce the same amount of electricity. The National Energy Technology Laboratory estimates that fuel cell generation facilities used in combination with natural gas turbines can be built to operate in the 1 to 20 megawatt range at 70 percent efficiency, much higher than the efficiencies traditional generation methods can reach within that output range (a worked comparison follows below).
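
To see what that 70 percent figure means in practice, here is a short worked comparison; the 35 percent efficiency assumed for a conventional combustion plant in this size range is an illustrative assumption, not a figure from the text.

    # Fuel energy needed per unit of electricity at two efficiencies.
    # 70% is the NETL hybrid figure cited above; 35% is an assumed
    # conventional-plant efficiency for comparison only.
    output_mwh = 10.0
    fuel_cell_eff = 0.70
    conventional_eff = 0.35

    print(f"Fuel energy for {output_mwh} MWh of electricity:")
    print(f"  fuel cell hybrid: {output_mwh / fuel_cell_eff:.1f} MWh")
    print(f"  conventional:     {output_mwh / conventional_eff:.1f} MWh")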

The generation of electricity has traditionally been a very polluting, inefficient process. However, with new fuel cell technology, the future of electricity generation is expected to change dramatically in the next ten to twenty years. Research and development into fuel cell technology is ongoing, to ensure that the technology is refined to a level where it is cost effective for all varieties of electric generation requirements.

To learn more about fuel cell development, visit the Fuel Cells 2000 website.

Natural Gas Technology Resources

The natural gas industry is joined by government agencies and laboratories, private research and development firms, and environmental technology groups in developing new technologies that may improve the efficiency, cost-effectiveness, and environmental soundness of the natural gas industry.



Common digital exchange format a solution to BIM problems

Re: Mindset change essential to successful BIM adoption (DCN, March 25)

I’m not sure who came up with the “#D” paradigm for BIM offshoots (“...three-dimensional design; 4D scheduling capabilities; 5D cost estimating; and emergent 6D lifecycle management”) but it rankles.

“4D” makes some sense, but the paradigm breaks down after that, IMHO.

Furthermore, it ignores several other significant attributes of BIM.

Notable among these are more efficient clash detection and energy modeling, not to mention the grail: the automation of precision fabrication processes.

What “D” are those? These techniques have been implemented, not in every BIM application or on every BIM project, but to an extent sufficient to demonstrate their feasibility.

Each holds potential for savings at least equivalent to those offered by “4D,” “5D,” and “6D” techniques.

The market is full of software aimed at the “Ds” of BIM. All the buzz around them is drawing attention away from BIM’s bigger problems: interoperability (and not just between one Autodesk product and another) and object modeling standards.

Major BIM software producers began working on solutions to these problems through the International Alliance for Interoperability, which developed a common digital exchange format: the Industry Foundation Classes (IFC).

But after a decade, progress has slowed to a crawl, even though IFC is essentially mature with respect to the basic components of buildings (columns, walls, floors, roofs), with some other objects (windows and doors) following close behind.

The whole effort, now under the aegis of buildingSMART Alliance, is nearly lost in the marketing cacophony.

If we want BIM to go forward, the IFC effort needs sustained technical and financial support.

The IFC model exchange format is not the ultimate answer to interoperability; it too will surely pass with time. But it, or something like it, is a vital step toward whatever comes next.

Nothing else approaches its level of development, as far as I know. So it represents the best, if not the only, route to dealing with BIM’s biggest problems.
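
For readers who want a concrete feel for what IFC data looks like in practice, here is a minimal sketch using the open-source IfcOpenShell Python library; it assumes the library is installed and that a file named model.ifc is at hand, and it simply counts the basic building components mentioned above.

    # Open an IFC file and count the "mature" building components the letter
    # mentions. Assumes IfcOpenShell is installed and model.ifc exists.
    import ifcopenshell

    model = ifcopenshell.open("model.ifc")

    for ifc_type in ("IfcColumn", "IfcWall", "IfcSlab", "IfcRoof"):
        print(f"{ifc_type}: {len(model.by_type(ifc_type))}")

    # Each entity carries schema-defined attributes, e.g. GlobalId and Name
    for wall in model.by_type("IfcWall")[:5]:
        print(wall.GlobalId, wall.Name)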


Your say

We need a BIM protocol that can be accessed at all levels

Re: Common digital exchange format a solution to BIM problems (DCN, April 30)

Brian (Lighthart) is right in his assessment of the barriers to successful cradle-to-grave deployment of a BIM solution.

As a CanBIM colleague put it to me at the Insight BIM conference held last month in Toronto: if ‘Bimmers’ truly want to advocate for a common protocol of interoperability across all platforms, then the advocacy needs to shift from a proprietary file-sharing standard (i.e., .dwg) to something that can be interpreted, modified, and accessed at all levels.

How about a dot BIM protocol for IFC?

Imagine if the solution to the woes of implementing BIM platform solutions were for the dot-BIM file extension to become the dot-PDF of the building information modeling world.

Autodesk, Bentley, and the others really need to take a hard look, for the overall good of the industry, and I believe they are now, or soon will be, as these challenges grow with mass adoption in North America.

Someday an enterprising C++ programmer will come along and create the ‘dot BIM’ solution Bimmers are really looking for.

Until then, Industry Foundation Classes deployed in accordance with ISO 16739 is your alternative.

I just gave away a very profitable business idea - is anyone out there listening?


Bentley's "BIM for Green Buildings" Executive Summit

With green rapidly emerging as the new global mantra, it is hardly surprising to find AEC technology vendors jumping on the sustainable design bandwagon, particularly those developing BIM (building information modeling) solutions. One of the most significant aspects of BIM is its ability to capture the description of a building in a semantically intelligent format that can be analyzed to study different aspects of its performance, including those related to energy use. Thus, there is a natural correlation between BIM and green buildings; in fact, I would go so far as to say that if there ever was a technology "in the right place, at the right time"—at least in AEC—it has to be BIM in the context of sustainable design.

Of the leading BIM vendors, Graphisoft has traditionally been considered the front-runner in supporting energy analysis with IFC support and strong links from ArchiCAD to tools such as Energy Plus, ArchiPHYSIK, Ecotect, and RIUSKA (see more about this on Graphisoft's website). For Autodesk, sustainable design is rapidly emerging as a key focus area, as was demonstrated by presentations at Autodesk University 2006 and by its recent partnership with Integrated Environmental Solutions (IES) to closely integrate IES' building performance analysis tools with Revit. Bentley, in turn, hosted a one-day "BIM for Green Buildings" Executive Summit last month in New York, which I had the opportunity to attend. The event was focused on exploring the evident synergy between the new BIM-enabled design methodologies and objectives in sustainable design through a series of "best practices" seminar sessions by firms who were, according to Bentley, doing BIM and green design well, followed by an interactive "think tank" discussion with audience participation. The highlights of the presentations and an analysis of the key discussion points that emerged are captured in this AECbytes "Building the Future" article.

Seminar Sessions

The Summit featured five main sessions, the first of which was by Bill Barnard and Myron Bollman of the Troyer Group, a full-service AEC firm providing planning, design, and construction services, for whom sustainable design has been an important component of its work right from the start, long before the LEED (Leadership in Energy and Environmental Design) Green Building Rating System was even established. The firm is now a charter member of the United States Green Building Council (USGBC) and has multiple staff members from each discipline trained in LEED. Most of its presentation at the Bentley Summit was devoted to describing the various green features and LEED strategies it had incorporated on some of its key projects. The firm has been using Bentley's solutions for over 10 years, including site analysis with GEOPAK Site and Google Earth, architectural design with Bentley Architecture, structural design and analysis using the integration of RAM and STAAD.Pro with Bentley Structural, HVAC design with Bentley Mechanical and Trace, and conflict detection with Bentley's Interference Manager. It was not clear whether the firm was actually using BIM to further sustainable design in its practice, but the presenters did highlight what is needed for BIM and green design to come together: the ability to integrate information such as materials, building loads, lights, occupants, climate, and building codes into the BIM model so that different green design aspects can be analyzed interactively, particularly at the preliminary design stage; links from manufacturers' product data into the BIM model to incorporate accurate material information for analysis; the ability to use the BIM model to explore the budgetary implications of features such as a green roof or reduced water usage, both in first cost and in recurring costs; and links from the BIM model to LEED certification forms so that the certification process could be automated.

The bulk of the next session, by Rodger Poole of Gresham Smith & Partners, a large multi-disciplinary firm of architects, engineers, planners, and designers working in diverse practice areas, was devoted to describing the firm's implementation of BIM using Bentley solutions: the benefits achieved, the challenges encountered, and how they were addressed. BIM was used in the firm for a wide range of tasks, including program analysis, space analysis, material takeoff, automatic report generation, 4D scheduling, procurement, building commissioning, and various FM services. On the topic of green buildings, the firm is a member of the USGBC and is seeing a growing interest in sustainable design. Poole went on to suggest some strategies for enabling sustainable design with BIM, such as building performance modeling, site modeling for more context-sensitive design, and the development of on-the-fly energy calculators. However, there was no indication of whether the firm was actively designing green buildings or using BIM to explore sustainable design strategies.

Volker Mueller of NBBJ then provided an overview of BIM and sustainability in his firm. NBBJ, a leading architecture and design firm with a global presence, is a long-term user of Bentley's BIM solutions and a frequent winner of Bentley's annual BE Awards—it won two of the six awards in the Building vertical announced at last year's BE conference. Its BIM implementation was described in some detail in my article on the BIM Symposium at the University of Minnesota published last year, and will therefore not be repeated here. With respect to sustainable design, NBBJ appreciates the growing green movement and the tremendous responsibility it places on the AEC industry, given that buildings account for the largest share of energy consumption in the US. NBBJ has a Sustainable Design Group comprising over 100 LEED professionals; it has a growing number of LEED certified projects and projects tracking to LEED; and it is also implementing sustainable design strategies in its own offices, with its Seattle office aiming for LEED Gold certification. With regard to tying BIM and sustainability together, these currently run on parallel but interrelated tracks at NBBJ: BIM models are used for solar studies as well as glare and heat gain studies; radiosity-based rendering of the models is used to study natural light penetration and determine how to bring in more light using shafts and skylights; and the use of Bentley's multi-disciplinary BIM suite allows better systems coordination. Many of the general benefits of BIM that NBBJ is realizing also have a green design pay-off: for example, the programmatic clarity achieved with BIM leads to a more economical and thus more energy-efficient design, and the improved prefabrication capability and gains in construction efficiency lead to reduced energy use. For detailed energy analysis, NBBJ partners with a consultant to produce CFD (computational fluid dynamics) diagrams, but most often these are produced too late to make any significant changes to the design. What is critically needed is an easier, interactive link between BIM models and analysis tools, so that the design can incorporate critical energy-related feedback from an early stage. Most of the current links between BIM and analysis tools rely on IFC import/export, which, as NBBJ has found, is not an optimal process. (For more on IFCs, see the article, "The IFC Building Model: A Look Under the Hood.")

The last two sessions at the Summit also reiterated the point that the overall benefits of BIM contribute to a greener building. The first of these was by Robert Stevenson of Ghafari Associates, a full-service A/E firm that is well-known for its cutting-edge multidisciplinary BIM implementation, especially in the automotive and aviation sectors in projects such as the new General Motors Lansing Delta Township Assembly Plant and the Detroit Metropolitan Airport North Terminal Development. Ghafari's BIM approach has been described in detail in a dedicated article in AECbytes, published in November 2005. On the subject of green buildings, the main aspects of BIM implementation at Ghafari that contribute to greener design are conflict resolution at design time and just-in-time construction, which result in energy savings because of reduced scrap, reduced transportation, reduced site disturbances, and shorter construction time. Ghafari is also pursuing LEED certification on several projects by incorporating green elements and design features, but it wasn't clear if BIM was directly enabling or facilitating this.

The final session, by Michael Wick of General Motors, was useful in providing the much-needed owner's perspective on green design, in this case in the context of the General Motors Lansing Delta Township Assembly Plant project that was designed by Ghafari. The project was awarded LEED Gold certification, the first time an automotive project has achieved this distinction, and it also won the 2007 AIA Environmental Leadership Award. There is, no doubt, some irony in a green design award going to a facility for manufacturing cars, as Wick himself pointed out, but General Motors was keen to derive the many economic, environmental, community, and health and safety benefits of green design. It had a LEED accredited professional on the team and pursued LEED certification to establish environmental leadership, reduce long-term operating costs, and provide healthier buildings for its employees. The facility will use 55% less energy than comparable plants, and its energy usage is estimated to be 30% below the ASHRAE standard, which is quite a remarkable achievement.

"Think Tank" Discussion

While most of the individual sessions focused either on BIM or on green design, some interesting points on how the two come together did emerge in the Q/A session and discussion following the presentations. One was related to the use of the IFC, which Volker Mueller had briefly mentioned during his presentation and elaborated upon a little more during the discussion. While NBBJ does use the IFC to send data to consultants, the process is quite involved and time-consuming. Every exchange is case-specific and needs proper mapping to ensure that the application exporting the IFC file includes all the data needed by the receiving application, and the exchange has to be tested before being used on an actual project. Thus, using the IFC to facilitate interoperability between applications is not that straightforward, which could account for the relatively slow adoption of IFC-based analysis tools in conjunction with BIM applications. A potential solution would be applications, both for building modeling and for analysis, that use the IFC as their native file format, so that the entire rigmarole of import/export and case-specific mappings could be avoided. No such solutions are available yet, and it is not known if any are even in the works.
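
A sketch of the kind of pre-exchange test described above might look like the following; the required-entity list and file name are hypothetical, and the open-source IfcOpenShell library is used here simply as one way to inspect an IFC export.

    # Before sending an export to an analysis consultant, verify that the
    # IFC file actually contains the entity types the receiving tool needs.
    # Requirements list and file name are hypothetical.
    import ifcopenshell

    REQUIRED_FOR_ENERGY_ANALYSIS = ["IfcSpace", "IfcWall", "IfcWindow", "IfcSlab"]

    model = ifcopenshell.open("export_for_consultant.ifc")
    missing = [t for t in REQUIRED_FOR_ENERGY_ANALYSIS if not model.by_type(t)]

    if missing:
        print("Export incomplete; missing entity types:", ", ".join(missing))
    else:
        print("All required entity types present; exchange can proceed.")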

Another critical point that was brought up was the possibility that the design model might not be the same as the models needed for analysis. Just as we have had the long-standing debate about design models versus construction models (most recently discussed in the article, "The AGC's BIM Initiatives and the Contractor's Guide to BIM"), we are now confronted with the same question with regard to different aspects of energy analysis. There is no doubt that different models are required for daylighting analysis, thermal analysis, a detailed DOE-2 simulation, a CFD analysis, and so on, as different kinds of building information are needed for these different analyses. The question is whether the design model created by BIM applications can include all the information required to automatically derive these different models for the different kinds of energy analysis. If we want energy analysis to become an integral part of the design process, this capability is very important, so that users don't have to expend additional resources creating separate energy-related models. But does this then over-burden the design model and make it too cumbersome to work with? It is difficult to know the answer until we actually have such BIM applications. We do have structural BIM applications that combine a physical model of a structure with an analytical model that can be sent to structural analysis tools, but we are still far from a multi-disciplinary BIM model that integrates not just spatial, structural, and MEP information but also all the data needed for the varied types of energy-related analyses mentioned earlier.
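
As a thought experiment only, the sketch below shows the idea at its simplest: a single, entirely hypothetical design model from which leaner, analysis-specific views are derived. Whether a real BIM application could do this with full-fidelity energy data is exactly the open question raised above.

    # One rich (hypothetical) design model, filtered into the leaner views
    # that different analyses need. All field names are invented.
    design_model = {
        "spaces":  [{"name": "Office", "area_m2": 120, "glazing_ratio": 0.4}],
        "walls":   [{"id": "W1", "u_value": 0.35, "area_m2": 60}],
        "windows": [{"id": "G1", "visible_transmittance": 0.65, "area_m2": 24}],
        "columns": [{"id": "C1", "section": "HSS200"}],
    }

    def daylighting_view(model):
        # A daylighting tool needs spaces and glazing, not structure
        return {"spaces": model["spaces"], "windows": model["windows"]}

    def thermal_view(model):
        # A thermal tool needs envelope U-values and areas
        return {"walls": model["walls"], "windows": model["windows"]}

    print(daylighting_view(design_model))
    print(thermal_view(design_model))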



McGraw-Hill Construction Presents the 2010 Green BIM Conference in Boston, May 19

Saturday, May 1, 2010

Looking to Experience Success


We are like dogs. We respond better to success than we do to failure. Scads of platitudes have been written about learning from failure, and while it is possible to learn great lessons from life's clunkers, neuroscience now shows us that nothing succeeds like success. Have you ever had a golf game, presentation, or some other challenge where everything you did was golden? Then, the next day, you go at it again, you mess up once, and, bam! You can't seem to get in the groove. There is a brain-based reason for that, and the more aware of it you are, the more you can create success momentum in your business.

Earl Miller of MIT and Mark Histed of Harvard found that our neurons retain memory and become more finely tuned when we succeed, but not when success is absent. There is a difference between the absence of success and the presence of failure. For instance, when a mistake leads to a negative consequence, we tend to learn from it and veer in another direction. We don't necessarily learn what to do, but we learn what not to do. On the other hand, when there is an absence of success but no apparent mistake (you lose money in the stock market but had nothing tangible to do with the loss), nothing appears to change in the brain, and relatively little learning, if any, takes place.

Here's what goes on that makes success so...well, successful. When you're learning something new and you have a success, even a small one, your brain gets a little reward bump of the pleasure neurotransmitter dopamine. Dopamine is used to thicken the neural pathways needed to learn a new skill. Your brain is drawn to activities that give you those little pleasure bumps. You can actually become addicted to success. But the big news is that the more you succeed, the longer your brain retains the proper information to help you succeed again.

The implications of this seem enormous for you and your business. We now know that what you celebrate (typically success) gets repeated, and the more you celebrate it, the more of that behavior you get. We used to think this was part of the psycho-babble voodoo of niceness in business and that it really wasn't necessary. It is necessary, and now it's proven. Look for low-hanging fruit in your company, small successes that everyone can participate in, because it's a good way to get the snowball of success and high morale rolling when things are going badly.

Following are five things you can do to take advantage of this science in your organization and get the snowball effect of success breeding success.

  1. You're looking to experience success, not to learn from mistakes. It might sound like semantics, but it is a different framing that works to keep you going. When you discuss mistakes, make sure you don't just focus on the what-not-to-do part of it. If you want to head down the path of success, you have to understand what is correct and try the successful behavior until you have a positive outcome.
  2. Get your head positively in the game. When you make a mistake, don't let negativity rule you. As "new-agey" as it sounds, stay positive. Your attitude should be that you get another chance to make it right; focus on the next turn. Put in neuroscience terms: when you become mired in negativity, the stress hormone cortisol bathes your brain and blocks access to the parts of the brain that breed success. You become more and more frustrated and will continue to make mistakes. Take a break! When you can come at the challenge with a new and positive perspective, you are primed to try again.
  3. Nothing replaces practice. When you achieve success, mimic the very same behavior again relatively quickly after the last success. You'll build thicker neural pathways for the successful behaviors. In essence, practice and practice and practice the art of success. That's why you see golfers hit ball after ball on a driving range. Focus on the attempts you get right; ignore the wrong ones.
  4. Celebrate. If you're trying to teach someone a new skill, celebrate their successes and ignore their mistakes unless the failure is going to be harmful.
  5. Positive feedback is like cash in the bank. The old adage is true: "People have a tendency to become what we encourage them to be, not what we nag them to be." As a leader, catch people doing things right and make sure they know you've witnessed it. Give them on-the-spot feedback on it so that they get a big dopamine hit. Don't wait for company meetings or the next time you see them walking down the hall. One of your most important activities as a success chief will be to immediately dole out feedback when someone does something right. Get good and consistent at it.

There are few things as exciting as breeding success, especially when you know you can do it deliberately. If you want to be the titan of your industry and have a high hit rate at success, keep hitting. Notice when you and others get it right and do it again and again. We really are like dogs. Sit. Roll over. Good entrepreneur. Celebrate.


Scott Halford is an internationally known speaker and author of the bestselling book, Be a Shortcut: The Secret Fast Track to Business Success (Wiley and Sons, 2009). He can be reached at www.CompleteIntelligence.com.