Volume 24, Issue 6 (June 2001)

Six degrees of simulation



By Jerry Laiserin

In architecture, engineering, and construction (AEC), every building is its own prototype, a one-off, an original. This approach places high value on analytical tools that enable designers to simulate and visualize the behavior of a proposed structure under varying conditions. Prescriptive building codes provide static formulas that regulate passive aspects of building performance, such as structural "dead loads" or thermal insulation values. However, the descriptive, dynamic aspects of building behavior, from lighting and acoustics to seismic and wind loads to fire resistance and escape routes, require correspondingly dynamic mathematical simulations that are best understood through visualization.

The underlying computational techniques share several broad characteristics. Whether using radiosity (for lighting), finite element analysis (for earthquakes and vibration), or computational fluid dynamics (for smoke and flame spread), a typical analysis begins with dividing the building's surfaces, volumes, or components into small cells (squares or cubes). The analysis also assumes a uniform initial state across all divisions (for example, they are all dark, all at rest, or all at the same temperature).
When visualizing proposed lighting schemes for new buildings, architectural designers often prefer to render images in false color (right) rather than try to simulate reality (left) because of the limited contrast range of most computer monitors.




Purely numerical representations of physical change can be hard to comprehend. As change is introduced into the system (light, a tremor, or heat), the effect is calculated in each immediately affected cell. Over subsequent intervals, the analysis calculates the spreading impact on adjoining cells (in the form of reflected or diffracted light, transferred structural moment, or increased temperature). A complete analysis accounts not only for the effects of the initial cells on adjoining cells, but also for the feedback from those cells onto the ones that first affected them.
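
The cell-by-cell logic is easier to see in miniature. The short Python sketch below (a generic illustration, not drawn from any of the tools discussed in this article) divides a surface into a grid of cells at a uniform starting value, disturbs one cell, and then lets each cell repeatedly exchange a fraction of its value with its four neighbors, so that cells affected later also feed back on the cells that first affected them. The grid size, step count, and exchange rate are arbitrary assumptions.

    import numpy as np

    # Hypothetical parameters for a generic cell-based propagation sketch.
    GRID = 20        # the surface is divided into GRID x GRID cells
    STEPS = 50       # number of time intervals to simulate
    EXCHANGE = 0.2   # fraction of the difference exchanged with each neighbor

    # Uniform initial state: every cell starts at the same value.
    cells = np.zeros((GRID, GRID))

    # Introduce a change into the system: one heated (or lit, or shaken) cell.
    cells[GRID // 2, GRID // 2] = 100.0

    for step in range(STEPS):
        # Each cell is influenced by its four neighbors. Because the update is
        # based on the previous state of all cells, cells affected later also
        # feed back on the cells that first affected them.
        padded = np.pad(cells, 1, mode="edge")
        neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                     padded[1:-1, :-2] + padded[1:-1, 2:])
        cells = cells + EXCHANGE * (neighbors - 4 * cells)

    print("peak value after propagation:", round(float(cells.max()), 2))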

Because the results are so difficult to interpret from numerical answers alone, architects and engineers increasingly turn to visualizations. Whether in 2D or 3D, still images or animation, such visualizations play an increasingly important role in helping to predict, identify, and correct potential trouble spots in building performance prior to construction. Indeed, as new material technologies and economic demands drive the creation of buildings that are bigger, taller, and more complex than anything contemplated by the static approach of traditional building codes, designers are growing more dependent on simulation and visualization to help ensure that their designs are safe, comfortable, and enjoyable for future inhabitants.

Architecture has been described as the manipulation of forms in light. Long before CAD models were available, architecture students and practicing professionals alike studied lighting effects by squinting at small-scale cardboard models while moving their Luxo lamps in imaginary solar arcs across a scale-model sky. Developments in computer graphics at places such as Cornell and the University of Utah allowed ever more realistic lighting and shading algorithms, from Gouraud and Phong shading to raytracing and radiosity, to be applied to CAD geometry. Advanced lighting simulation work at the Lawrence Berkeley National Laboratory produced Radiance, a collection of Unix routines that yields, in the hands of a skilled operator with boundless computing resources, photometrically accurate simulations of any light source in any space.

Most architects use commercial products, such as Discreet's Lightscape, McNeel's Accurender, and Abvent's ArtLantis, that combine raytracing and radiosity with clever computational shortcuts and interactive interfaces to cut down on input/output complexity and processing time. However, as pointed out by lighting designer David Stone of New York-based Fisher Marantz Stone, the limited contrast ratio and color non-linearities of computer displays mean that while the output of such programs may be a realistic match for a photo displayed on the same screen, it is not necessarily an accurate representation of what the human eye would experience in the physical scene. As a result, many lighting designers prefer to work with false-color images, in which lighting intensity is represented by color variation rather than brightness.
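
The false-color idea itself is easy to demonstrate. The sketch below (a simplified illustration, not the method used by Lightscape, Accurender, or ArtLantis) builds a synthetic grid of illuminance values and displays it twice: once as brightness, which clips at the monitor's limited contrast range, and once with intensity mapped to a color scale, where the full range remains legible. The illuminance field and its numbers are invented for the example.

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic illuminance field in lux: a bright patch near a window wall
    # falling off across the room. Real values would come from a radiosity
    # or raytracing calculation.
    x, y = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
    illuminance = 2000.0 * np.exp(-4.0 * ((x - 0.2) ** 2 + (y - 0.5) ** 2)) + 50.0

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))

    # "Realistic" rendering: intensity shown as brightness, which clips badly
    # because the display cannot span the real-world contrast range.
    ax1.imshow(np.clip(illuminance / 500.0, 0.0, 1.0), cmap="gray")
    ax1.set_title("Brightness (clipped)")

    # False-color rendering: intensity mapped to hue, so the full range stays legible.
    im = ax2.imshow(illuminance, cmap="jet")
    ax2.set_title("False color (lux)")
    fig.colorbar(im, ax=ax2)

    plt.tight_layout()
    plt.show()
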
Animated visualizations of acoustical behavior show the propagation of sound over time, as initial sound radiates from the orchestra pit (left) and is reflected from the hall ceiling (middle) and balconies (right).




Although the majority of information about our immediate surroundings is visual, sound also ranks high in importance as an information channel, especially in a concert hall or auditorium (literally, a listening room). Acoustical simulation has come a long way since pioneer Wallace Sabine first fired a starter's pistol in an empty concert hall to judge the decay of sound, but much of the work still involves empirical testing. Unlike light, which is very fast and has correspondingly microscopic wavelengths, sound travels slowly (around 1100 feet per second) at comparatively long wavelengths (from a fraction of an inch to tens of feet). Because of its slow speed and long wavelengths, sound is easily influenced by temperature, humidity, and altitude, and by complex interactions with materials and surfaces that variously absorb, reflect, and diffuse it. The aural equivalent of visual glare can occur if sound waves in an enclosed space become concentrated at certain frequencies due to interaction with resonant modes of a room's dimensions, or if reflected sounds arrive either too close or too far apart in time relative to the initial direct sounds.
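
One consequence of these numbers can be checked on the back of an envelope. The sketch below (a rule-of-thumb calculation, not how the commercial acoustics packages work) lists the lowest axial resonant frequencies of a rectangular room, which fall where a room dimension holds a whole number of half wavelengths; the room dimensions are assumed for illustration.

    # Axial room modes of a rectangular room: f = n * c / (2 * L).
    # The room dimensions below are illustrative assumptions.
    SPEED_OF_SOUND = 1100.0  # feet per second, roughly, at room temperature

    room_dimensions_ft = {"length": 30.0, "width": 20.0, "height": 10.0}

    for name, dimension in room_dimensions_ft.items():
        modes = [round(n * SPEED_OF_SOUND / (2.0 * dimension), 1) for n in range(1, 4)]
        print(f"{name:7s} ({dimension:.0f} ft): first axial modes {modes} Hz")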

Such hard-to-predict sound effects can render speech unintelligible in a theater or classroom, or make music unlistenable in a concert hall; in extreme cases, music can become unplayable because musicians on stage cannot hear each other clearly. Much of the pioneering work in acoustical analysis was done at AT&T Bell Laboratories (now part of Lucent) and the former RCA Sarnoff Laboratories, both in New Jersey. But the most successful acoustical simulation software now on the market comes from Europe. Catt Acoustic, developed by Bengt-Inge Dahlenbäck in conjunction with Chalmers University of Technology in Gothenburg, Sweden, is preferred by engineers for purely acoustic design. Ease (Enhanced Acoustic Simulator for Engineers), developed by Wolfgang Ahnert in Germany and marketed globally by Renkus-Heinz, is preferred for designing electro-acoustic or sound reinforcement systems and for hybrid spaces in which the room's natural acoustics may be assisted by, for example, electronic reverberation.
A 3D acoustical animation shows how direct and reflected sound energy originating at a podium on an auditorium stage arrives at specific listening positions in the audience. (Greater London Authority council chamber by Foster & Partners, Architects.)




Lawrence Tedford, a senior consultant with Arup Acoustics in San Francisco, a subsidiary of London-based consultancy Ove Arup & Partners, explains that both Catt and Ease are commonly used by importing AutoCAD design drawings, applying sound absorption coefficients as data attributes of various surfaces and objects in the CAD model, and then computing an acoustical raytrace. Just as raytracing for lighting is viewer dependent, acoustical raytraces are listener dependent and must be recomputed for different locations within a hall. The resulting 2D and 3D diagrams represent static snapshots of sound behavior that are both frequency dependent and time dependent (the aural equivalent of differently colored lights reaching a viewer with varying intensity over time).
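
The role those absorption coefficients play can be suggested with a far cruder, listener-independent calculation than the raytraces Tedford describes: Sabine's classic statistical estimate of reverberation time, which sums each surface's area times its absorption coefficient. The surface areas, coefficients, and room volume below are assumptions chosen only to illustrate the arithmetic.

    # Sabine's statistical reverberation-time estimate: RT60 = 0.049 * V / A,
    # with the room volume V in cubic feet and the total absorption A equal to
    # the sum over surfaces of area times absorption coefficient (in sabins).
    # All areas, coefficients, and the volume are illustrative assumptions.
    surfaces = [
        ("plaster ceiling",   6000.0, 0.04),
        ("wood floor",        6000.0, 0.10),
        ("curtained walls",   4800.0, 0.55),
        ("upholstered seats", 3000.0, 0.80),
    ]

    volume_cu_ft = 180_000.0
    total_absorption = sum(area * alpha for _, area, alpha in surfaces)
    rt60_seconds = 0.049 * volume_cu_ft / total_absorption
    print(f"estimated reverberation time (RT60): {rt60_seconds:.2f} seconds")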

One way around the limitations of visualization for acoustical phenomena is auralization. This technique creates an audible acoustic rendering of a space by convolving a voice or musical instrument recorded anechoically, that is, in a special room with no sound reflections, with the space's computed impulse response, or acoustical signature. Because usable anechoic sound sources are rare, auralization techniques are not widely used among acoustical engineers. Instead, firms like Arup have begun to develop their own routines to simulate and visualize the propagation of sound over time. Developed by Brian Katz, an acoustics researcher at Arup's New York office, the company's animated visualizations were initially done in 2D and overlaid on CAD cross-sections of theaters and similar spaces.
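
In principle, the auralization step is a convolution: the dry anechoic recording is convolved with the room's simulated impulse response so the result sounds as though it had been performed in the modeled space. A minimal sketch follows, assuming mono WAV files at a common sample rate; the filenames are placeholders, not references to any real data set.

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import fftconvolve

    # Placeholder filenames: any anechoic recording and simulated impulse
    # response sharing a sample rate would do.
    rate_dry, dry = wavfile.read("anechoic_voice.wav")
    rate_ir, impulse_response = wavfile.read("room_impulse_response.wav")
    assert rate_dry == rate_ir, "signals must share a sample rate"

    # Work with mono signals; take the first channel if the files are stereo.
    dry = np.asarray(dry, dtype=np.float64)
    impulse_response = np.asarray(impulse_response, dtype=np.float64)
    if dry.ndim > 1:
        dry = dry[:, 0]
    if impulse_response.ndim > 1:
        impulse_response = impulse_response[:, 0]

    # Convolve the dry signal with the room's acoustical signature.
    auralized = fftconvolve(dry, impulse_response)

    # Normalize to avoid clipping, then write a 16-bit result.
    auralized *= 32767.0 / np.max(np.abs(auralized))
    wavfile.write("auralized_voice.wav", rate_dry, auralized.astype(np.int16))
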
Finite element analysis can predict the buckling of structural members under various seismic load conditions. (© Arup.)




More recently, Arup has experimented with 3D animated sound visualizations. These can be particularly useful when an architect wishes to fine-tune the design of a performance or meeting space, as is the case in Arup's current collaboration with British master architect Lord Foster on the design of the council chamber for the new Greater London Authority.

Simulations and visualizations of lighting and acoustics can have a major impact on the usefulness and comfort of buildings, assuming everything about the buildings goes as planned. As part of their legal responsibility for public health, safety, and welfare, architects and their engineering consultants also must consider a structure's behavior in the event that disaster strikes. The most common man-made disaster is fire; natural disasters include earthquakes and hurricanes.

As anyone who has ridden out a bout of high winds on an upper floor of a high-rise can attest, structures like New York's World Trade Center, Chicago's Sears Tower, or Kuala Lumpur's Petronas Towers sway in the wind, sometimes as much as several feet from side to side near the top. Rather than being a design defect, such freedom of movement is a carefully controlled response to the impact of winds. But flexing and vibration modes in floor plates, beams, and columns must be held within predictable limits to avoid the destruction of partitions, glazing, and other lighter-weight construction elements attached to the structural frame.
Simulated deflection can be visualized by color coding for vibration modes in slabs or other structural members stressed by wind or other dynamic loads. (© Arup.)




Just as with acoustics, one critical behavior that analysis of wind-induced vibrations seeks to avert is resonance, whereby a system's oscillations become self-reinforcing at some natural harmonic frequency. In acoustics, such harmonic frequencies occur where room dimensions are integer multiples of half a tone's wavelength. In structural dynamics, the calculation of harmonic frequencies is more complex, but the results of miscalculation can be disastrous. Think of the infamous newsreel footage of "Galloping Gertie," the Tacoma Narrows Bridge that self-destructed in high winds in 1940. Today, when designing such super bridges as the recently completed Oresund crossing between Denmark and Sweden, engineers at Arup rely on visualizations of vibration simulations to predict and control deflections in structural members.
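
The danger such vibration analyses guard against can be illustrated with the simplest possible structural abstraction, a single mass on a lightly damped spring (a textbook model, not Arup's finite element practice): as the frequency of a periodic wind load approaches the structure's natural frequency, the amplification of its motion grows dramatically. All numerical values below are illustrative assumptions.

    import math

    # Single-degree-of-freedom abstraction of a structure; values are illustrative.
    mass_kg = 2.0e6            # effective modal mass
    stiffness_n_per_m = 8.0e7  # effective lateral stiffness
    damping_ratio = 0.02       # lightly damped, as tall structures tend to be

    natural_freq_hz = math.sqrt(stiffness_n_per_m / mass_kg) / (2.0 * math.pi)
    print(f"natural frequency: {natural_freq_hz:.2f} Hz")

    # Steady-state dynamic amplification for a sinusoidal load at frequency f:
    # DAF = 1 / sqrt((1 - r^2)^2 + (2*zeta*r)^2), where r = f / f_natural.
    for forcing_freq_hz in (0.2, 0.5, natural_freq_hz, 2.0):
        r = forcing_freq_hz / natural_freq_hz
        daf = 1.0 / math.sqrt((1.0 - r * r) ** 2 + (2.0 * damping_ratio * r) ** 2)
        print(f"wind load at {forcing_freq_hz:.2f} Hz -> motion amplified x{daf:.1f}")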

A different and even more destructive kind of shaking occurs during earthquakes. Structural members and connections, designed to carry static building loads downward by gravity, are suddenly subjected to violent lateral motion and up-and-down movement as well, imposing irregular stresses on columns, beams, and trusses. General-purpose structural design tools, such as STRUDL (Structural Design Language), originated in work at Georgia Tech and MIT. Visualizations derived from finite element analysis help engineers, such as the Advanced Structural Design group at Arup, to design trusses and connections with an optimal trade-off between seismic resiliency and structural efficiency.
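
The basic idea of a seismic time-history analysis can be sketched with the same single-mass abstraction, this time driven by ground acceleration and stepped forward explicitly in time. Real analyses use recorded ground-motion histories and finite element models with thousands of members; the synthetic ground motion and structural values below are assumptions for illustration only.

    import math

    # Single-degree-of-freedom sketch of seismic response; all values illustrative.
    mass = 1.0e5        # kg
    stiffness = 4.0e6   # N/m
    damping_ratio = 0.05
    damping = 2.0 * damping_ratio * math.sqrt(stiffness * mass)

    dt = 0.005          # time step, seconds
    steps = int(20.0 / dt)

    def ground_accel(t):
        """Synthetic decaying sinusoid standing in for a recorded ground motion."""
        return 2.0 * math.sin(2.0 * math.pi * 1.5 * t) * math.exp(-0.15 * t)

    disp, vel, peak_drift = 0.0, 0.0, 0.0
    for i in range(steps):
        t = i * dt
        # Equation of motion relative to the ground: m*a + c*v + k*x = -m*a_ground.
        accel = (-mass * ground_accel(t) - damping * vel - stiffness * disp) / mass
        vel += accel * dt   # semi-implicit Euler time stepping
        disp += vel * dt
        peak_drift = max(peak_drift, abs(disp))

    print(f"peak displacement relative to the ground: {peak_drift * 1000:.1f} mm")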

Fire, smoke, and egress problems were among the first preventable disasters to be addressed by building codes. Infamous fires, such as the 1911 blaze that gutted the Triangle Shirtwaist Company sweatshop in New York City, killing nearly 150 workers in less than 15 minutes, often involved a convergence of unforeseen rapid flame spread, unpredictable smoke generation, and inadequate or obstructed means of escape. Modern building codes, applied to properly designed and maintained buildings, generally have alleviated the worst fire hazards.

However, advances in structural materials, enclosure systems, and environmental conditioning of large spaces make it feasible to create unprecedented building designs, such as international airport concourses, indoor sports arenas, multi-story hotel and shopping mall atria, even financial institution trading floors. Such spaces pose design and safety challenges that exceed the prescriptive dictates of traditional building codes. Much of the thinking in conventional codes is based on fire compartmentalization: wall and door construction capable of resisting fires of various intensities for varying lengths of time. Thus, architects may speak of 3/4-hour-rated doors in one-hour-rated partitions separating offices or hotel rooms from the adjoining exit corridor, and 3-hour doors in 4-hour partitions separating fire zones within a building or isolating stair towers. Yet, in the event of a fire in an airport concourse or shopping atrium, the rate of temperature rise and related flame and smoke behavior in the newer long-span, large-volume spaces cannot be accounted for or managed solely by following code-based fire-rating formulas.
Architectural engineers use custom simulations to visualize and analyze the spread of flames and smoke when designing large open interiors, such as airport concourses. (Kansai International Airport, Japan; ©Arup.)




Engineering groups such as Arup Fire merge several disciplines to address these problems. These include a thorough understanding of building design practices and materials, as well as the analysis of flows, specifically heat and smoke. Computational fluid dynamics, used in other contexts for aerodynamic analysis or for hydraulics, can be applied to the behavior of fire and smoke within building interiors. Just as the incremental effect of light, sound, or vibrational energy on any building surface or component can be analyzed and predicted along with reflected secondary or tertiary effects, so too can the impact of rising temperature on a building's materials and surfaces. At certain temperatures, these surfaces may themselves ignite or give off noxious smoke, according to their flame spread and smoke contribution ratings. Animated visualizations of such flame and smoke behavior, such as Arup Fire has developed for the Kansai airport in Japan, can assist designers in predicting and enhancing the fire resistance of new construction.
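
A drastically simplified stand-in for such an analysis (a toy cellular model, nothing like the CFD solvers Arup Fire uses) still conveys the coupling described above: heat diffuses from cell to cell, and any cell whose temperature passes its assumed ignition point becomes a new source of heat and smoke, according to assumed flame-spread and smoke-contribution values.

    import numpy as np

    # Toy 2D model of flame and smoke spread; every parameter is an assumption.
    GRID, STEPS = 40, 200
    DIFFUSION = 0.2        # fraction of temperature exchanged with neighbors
    IGNITION_TEMP = 300.0  # degrees C at which a cell's material ignites
    BURN_HEAT = 40.0       # heat a burning cell adds each step
    SMOKE_YIELD = 1.0      # smoke a burning cell contributes each step

    temperature = np.full((GRID, GRID), 20.0)
    smoke = np.zeros((GRID, GRID))
    burning = np.zeros((GRID, GRID), dtype=bool)

    # Initial fire source near one corner of the space.
    temperature[5, 5] = 800.0
    burning[5, 5] = True

    def diffuse(field):
        """Spread a field toward its four neighbors; edges act as solid walls."""
        padded = np.pad(field, 1, mode="edge")
        neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                     padded[1:-1, :-2] + padded[1:-1, 2:])
        return field + DIFFUSION * (neighbors - 4 * field)

    for _ in range(STEPS):
        temperature = diffuse(temperature)
        smoke = diffuse(smoke)
        burning |= temperature > IGNITION_TEMP  # newly ignited cells
        temperature[burning] += BURN_HEAT       # burning cells feed back heat...
        smoke[burning] += SMOKE_YIELD           # ...and contribute smoke

    print("cells ignited:", int(burning.sum()), "of", GRID * GRID)
    print("peak smoke concentration:", round(float(smoke.max()), 1))
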
Engineers simulate the behavior of human occupants (right foreground) during an evacuation of an airport. (Stansted Airport, UK; © Arup.)




A further complication of fire and smoke in these new large open interior spaces is the exit behavior of human occupants. In a hotel or office building, exit options are constrained by conventional doors, corridors, lobbies, and stairs. Similarly, in traditional assembly spaces, such as theaters or classrooms, the arrangement of seats and aisles effectively channels occupants toward the designated exits. However, in an open interior like a retail atrium, the distribution of people around the space is not easily predictable, nor is the optimal path of escape entirely obvious. Rather than simulate only smoke and fire, safety dictates that the behavior of the occupants be simulated as well. For an evacuation study at Stansted Airport in the UK, Arup Fire performed such a simulation, along with an animated visualization showing abstracted denizens fleeing for their virtual lives.
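
A bare-bones version of such an occupant simulation (an illustrative toy, not the model used at Stansted) scatters agents across an open floor grid and, at each time step, moves each one a cell toward its nearest exit unless the next cell is already occupied, in which case it waits. Grid size, exit locations, and occupant count are all assumptions.

    import random

    # Toy evacuation model on an open floor grid; all numbers are assumptions.
    WIDTH, HEIGHT = 40, 25
    EXITS = [(0, 12), (39, 5)]   # exit cells on the boundary
    NUM_OCCUPANTS = 120
    random.seed(1)

    occupants = {(random.randrange(WIDTH), random.randrange(HEIGHT))
                 for _ in range(NUM_OCCUPANTS)}

    def step_toward(pos, target):
        """Move one cell horizontally or vertically toward the target."""
        x, y = pos
        tx, ty = target
        if abs(tx - x) >= abs(ty - y):
            x += (tx > x) - (tx < x)
        else:
            y += (ty > y) - (ty < y)
        return (x, y)

    tick, evacuated = 0, 0
    while occupants and tick < 500:
        tick += 1
        moved = set()
        for pos in sorted(occupants):
            nearest_exit = min(EXITS,
                               key=lambda e: abs(e[0] - pos[0]) + abs(e[1] - pos[1]))
            nxt = step_toward(pos, nearest_exit)
            if nxt == nearest_exit:
                evacuated += 1               # occupant leaves through the exit
            elif nxt in moved or nxt in occupants:
                moved.add(pos)               # next cell occupied: wait this tick
            else:
                moved.add(nxt)
        occupants = moved

    print(f"{evacuated} occupants out in {tick} ticks; {len(occupants)} still inside")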

With the current state of the art, these six types of simulations and visualizations-light, sound, wind, earthquake, fire, and egress-must each be run separately by mutually incompatible software operating on separate instances of the CAD building model. In each case, even though the model represents the same building, it may be geometrically altered to suit the input requirements of a specific suite of analytic software. Moreover, data attributes attached to drawing or model features will also vary from model to model and from tool to tool. This process imposes a heavy burden of computational overhead on high-profile building projects, which can involve a dozen or more separate computer simulation models-all derived from the architect's underlying CAD model, yet each optimized for its own special flavor of CFD, FEA, or other simulation tool.
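
The duplicated bookkeeping this implies can be suggested with a small, entirely hypothetical data structure: a single wall object carries attributes that several different analyses care about, yet today each attribute must be re-entered in a differently structured model for each tool.

    from dataclasses import dataclass

    @dataclass
    class WallElement:
        """Hypothetical 'intelligent object' carrying attributes that several
        different simulations need, each of which today lives in a separate,
        tool-specific copy of the building model."""
        name: str
        area_sq_ft: float
        reflectance: float             # used by lighting (radiosity) analysis
        absorption_coefficient: float  # used by acoustical raytracing
        fire_rating_hours: float       # used by fire and egress studies
        stiffness_modifier: float      # used by structural FEA

    atrium_wall = WallElement(
        name="atrium north wall",
        area_sq_ft=2400.0,
        reflectance=0.65,
        absorption_coefficient=0.08,
        fire_rating_hours=2.0,
        stiffness_modifier=1.0,
    )

    # Today each analysis tool re-imports the geometry and re-attaches only the
    # attributes it understands; an interoperable object model would let every
    # tool read the same element directly.
    print(atrium_wall)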

As more architects and engineering consultants adopt 3D CAD systems that are based on intelligent objects and are compliant with emerging standards for software interoperability (such as the Industry Foundation Classes of the International Alliance for Interoperability), the ability to simulate and visualize all aspects of a building's behavior will become more accessible. And that will result in safer, more useful, and more comfortable buildings for us all.

Architect Jerry Laiserin, FAIA, provides strategic consulting services to architects and their technology providers. He can be reached at jerry@laiserin.com.






Abvent North America
ArtLantis
www.abvent.com

Bengt-Inge Dahlenbäck
Catt
www.catt.se

Discreet
Lightscape
www.discreet.com

Georgia Tech and MIT
STRUDL
www.gtstrudl.gatech.edu

Lawrence Berkeley National Laboratory
Radiance
www.lbl.gov

Lighting Analysts
AGI-32
www.lightinganalysts.com

Renkus-Heinz
Ease
www.renkus-heinz.com

Robert McNeel & Associates
Accurender
www.mcneel.com