High-Flying FX
Volume 28, Issue 9 (September 2005)

The underlying question this summer’s cinematic action-adventure love story poses is a big one: In the future, if unmanned airplanes fire the missiles, will we go to war more easily? As Stealth audiences quickly discover, that’s not the only problem an unmanned aircraft might cause.

Directed by Rob Cohen, who brought The Fast and the Furious and XXX to the screen, the Sony Pictures film puts audiences into the pilot’s seat of a Talon, a new hypersonic stealth aircraft. Flying alongside is EDI, an unmanned artificial intelligence-controlled Extreme Deep Invader. All’s well until lightning strikes EDI; the drone develops a mind of its own and threatens to ignite a nuclear Armageddon. Can three Navy test pilots, played by actors Josh Lucas, Jessica Biel, and Jamie Foxx, save the world?


“Rob [Cohen] had a few basic commandments,” says Joel Hynek, visual effects supervisor. “He wanted everything to be very clear, not like in Top Gun, which is a cool movie, but during the dogfight, no one knows where anyone is.”

A second command was to have the audience see the action from the pilot’s seat rather than watch it from a third-person point of view. “He embraced the first-person video gamer POV,” says Hynek. “He made it ‘gamer cool.’”

And third, Cohen wanted people in the audience to feel like they were flying. “He wanted it to be dynamic, different,” says Hynek.

The sum of these parts put much of the action into the hands of artists at Digital Domain, who surrounded live-action pilots with digital backgrounds and fashioned entirely CG shots. A crew of approximately 200 artists worked on 10 sequences (658 shots total) for the film. “These weren’t wire removals,” says Hynek. “They were toughies.”

To create the shots, the studio used Alias’s Maya for modeling and animation, Side Effects Software’s Houdini for effects, Pixar’s RenderMan for rendering, NewTek’s LightWave for shots inside the airplane engines, Adobe’s Photoshop for painting, and Digital Domain’s Nuke for compositing, Storm for simulating natural phenomena, and EnGen for creating digital terrain.

Hynek believes the studio raised the visual effects bar in five areas: allowing for an unself-conscious, freely moving camera; mimicking the aerodynamics of real flight; and creating CG clouds, CG terrain, and CG fire.

To help Cohen sell the idea for the film to Sony, Digital Domain created a 40-second sample shot. Engineers from Northrop Grumman helped design a plane for the test, then worked on aircraft for the movie.

“There were two planes: the Talon, which the hotshot pilots fly, and EDI, the invader,” says Hynek. “The engineers helped us flesh out concepts like where to put the weapons, and then production designer Michael Riva gave them a sexy Hollywood look.”

Once the project was green-lit, the studio modified X-Plane, a PC-based flight simulator, to help design camera moves. “We had two monitors, two joysticks, one flying airplane, and one flying camera plane,” says Hynek. “It was good for quickly working out different types of shots, but we ended up doing previz in Maya in a traditional keyframe manner.”

For this film, the previz not only helped the storytellers design the action, but the data sometimes helped create the action by driving a gimballed cockpit, cameras, and lights on greenscreen stages.

Special effects supervisor John Frazier and his crew built the 70-ton gimballed cockpit on a soundstage at Fox Studios in Sydney, Australia. The gimbal rotated 360 degrees, rolling multiple times with a person inside, moved straight up or straight down, pitched 180 degrees on either side, and yawed 60 degrees total, according to Kelly Port, digital effects supervisor. The crew photographed it with a Spydercam, a Technocrane, and handheld cameras. Sometimes Navy pilots “flew” this device; other times an actor rode inside while previz data drove it.


To help director Rob Cohen give audiences the sensation of flying, a crew of 200 artists at Digital Domain created all-CG planes and terrains for 658 shots in the film Stealth.

“In the case of the Spydercam, one computer drove the camera, the gimbal, and the lights,” explains Hynek. “The camera would fly around and come whizzing up really close to the actor. It gave one pause.” Cohen took full advantage of all the dynamics, according to Hynek, who provides an extreme example: The Talon crashes, and just when the plane hits the ground, the camera flies in as Josh Lucas hits his head on the front of the panel.

For some data-driven shots, the camera move was often modified or undercranked (filmed at a slower frame rate than normal, to speed up the action) because the previz didn’t consider velocity. “In one shot, the camera moves into the cockpit while the plane goes up and then dives down, so we ran the move backwards, and had a camera upside down and undercranked,” says Hynek. “We shot it in reverse.”
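
Undercranking comes down to simple frame-rate arithmetic: capture fewer frames per second than the 24-fps projection rate, and the action speeds up by the ratio of the two. A minimal illustration, with example rates rather than the production's actual settings:

```python
def apparent_speedup(capture_fps, playback_fps=24.0):
    """Undercranked action plays back faster by playback_fps / capture_fps."""
    return playback_fps / capture_fps

# Shooting at 12 fps and projecting at the standard 24 fps doubles the apparent speed.
assert apparent_speedup(12.0) == 2.0
```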

The crew encoded the gimbal and camera motion when it could, but not the movement of the often-used handheld camera. “In the old days, a few years ago, we would have shot the background plate first and then the foreground greenscreen to match the lighting and perspective of the plate,” Hynek says. “But Cohen didn’t want to be a slave to the plate, so we shot the foreground first. Then we created a background to match the lighting and camera perspective.”

To make this possible, every shot went through Digital Domain’s Track software, which established the relationship of the camera and the gimbal. “That gave us choices,” Hynek says. “We could have all the movement in the camera, or we could assume the camera is still and have the plane moving, or any combination.” They made that decision during the animation phase.
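
The choice Hynek describes, putting the recorded motion into the camera, into the plane, or splitting it between them, only works if the camera-to-cockpit relationship the lens actually saw is preserved while everything else is re-rooted. The sketch below illustrates that idea with per-frame 4x4 world matrices; the function name and data layout are hypothetical, not Digital Domain's Track software.

```python
import numpy as np

def recombine(cam_world, gimbal_world, move_plane):
    """Preserve the lens-to-cockpit relationship while deciding which one moves.

    cam_world, gimbal_world: 4x4 world matrices for one frame, as recovered by
    a tracking/matchmove step (hypothetical inputs).
    move_plane=False: the plane sits at the origin, the camera carries all motion.
    move_plane=True:  the camera sits at the origin, the plane carries all motion.
    Returns (plane matrix, camera matrix).
    """
    rel = np.linalg.inv(gimbal_world) @ cam_world     # camera pose in cockpit space
    if move_plane:
        return np.linalg.inv(rel), np.eye(4)
    return np.eye(4), rel
```

In practice, the “any combination” option amounts to splitting the recovered motion, for example keeping the slow banking on the plane and the handheld shake on the camera, while the relative pose stays untouched.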

Although the previz helped realize director Cohen’s intent for each shot, once the action moved into animation, things changed. “We’d redo the shots in animation,” says Hynek. The shots varied from all-CG, to close-ups of the live-action pilot with other planes visible outside the window, to shots of the live-action footage extended with CG into a complete plane. Animators incorporated cockpits tracked from the live-action plate into the animation; canopies and visors were added later.

Integrating the live-action cockpit with the digital plane was not always straightforward. Often, the animators had to tone down the movement captured on the greenscreen stage. “We had 20 shots where the gimbal was pitching so much that when we added the 70-foot-long plane, it looked like a bucking bronco in the sky,” says Port. “So we animated the plane and then projected the original photography onto the animated plane.”


Because the pilot’s visor and the airplane’s canopy would have reflected the equipment on the greenscreen stage, they were always digital, as is the plane flying nearby in this shot.

Reflections made the CG visors and canopies necessary. Visors worn on stage reflected that environment, not the film’s high-flying clouds, and those reflections were difficult to remove; the CG canopies had to reflect moving clouds and the sun. To add realism, the crew aged the canopies with dirt and minute scratches. And then, they added sun dogs, the little reflections that radiate in circles when a sun highlight hits the scratches.

Procedural animation helped move the flaps on the airplanes, but animators did the rest, setting the speed (500, 1000, even 5000 miles per hour) and keyframing the action in Maya, using aerial photography for reference. “I’ve been a pilot for 30 years. I was riding herd on each shot,” says Hynek.

One of the key instructions from Cohen was to place the planes in an environment where speed became palpable, but when the planes fly in a clear, blue sky, there’s no way to tell how fast they’re traveling. That meant the crew needed to fill the sky with clouds in the foreground, mid-ground, and background.

“Cohen’s direction to the animators was to take advantage of the 3D space they lived in,” says Port. “He wanted to get away from the idea that the action was on a 2D plane in space. So, we created a playground of clouds in which the action took place.” In addition, the crew created rapidly moving, less-detailed vapor that interacted with the fuselage and wings, and streamed into the intakes.

For this, the group used Digital Domain’s Academy Award-winning Storm software, creating a library of different types of clouds for all the environments in which the planes travel throughout the film. Houdini provides Storm’s interface; Digital Domain’s Voxel B, the rendering. “Storm simulates a true volume and basically generates 3D volumetric noise,” explains Port. The rendering is efficient because the simulation is stored in volumetric buffers and represented internally on cards that always face the camera. “We can put lights in there and have it backlit,” says Port, “and have full control of the 3D noise.”
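
Port's description, a true volume stored in buffers but drawn on cards that always face the camera, is essentially sliced volume rendering: orient each card toward the lens, then accumulate the density sampled on each slice from front to back. The sketch below shows those two ingredients in isolation; the names, slice handling, and extinction constant are placeholders, not Storm's internals.

```python
import numpy as np

def billboard_orientation(card_pos, cam_pos, up=np.array([0.0, 1.0, 0.0])):
    """Rotation matrix that keeps a volume slice (card) facing the camera."""
    z = cam_pos - card_pos
    z = z / np.linalg.norm(z)                      # card normal points at the camera
    x = np.cross(up, z)
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    return np.column_stack([x, y, z])

def accumulate_slices(slice_densities, step, extinction=1.5):
    """Front-to-back compositing of per-slice densities into a single opacity."""
    transmittance, alpha = 1.0, 0.0
    for d in slice_densities:                      # nearest slice first
        a = 1.0 - np.exp(-extinction * d * step)   # Beer-Lambert absorption per slice
        alpha += transmittance * a
        transmittance *= 1.0 - a
    return alpha
```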


Without visual cues, it would have been impossible to tell how fast the planes were flying. So Digital Domain used a combination of its volumetric Storm software and Voxel B renderer to create a playground of CG clouds for the digital airplanes to speed through.

Storm also helped the crew put a circle of flames in the sky. At one point, EDI, acting like a rebellious teenager, decides to keep the Talon test pilots from refueling at a dirigible refueling station 50,000 feet in the air. It blasts off one of the fuel hoses, and fuel spews out into a doughnut-shaped cloud that EDI then sets on fire.

The crew created both the fuel cloud and the fire with Storm. “We used the density field of the fuel cloud to drive the animation of the fire almost like a fuse,” says Port. The fire colors were based on time and density, and generated from a color lookup table. Because they used Storm for both types of imagery, the crew rendered the two simulations together rather than combining separate passes in compositing.
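
Port's description suggests two simple controls: the fuel cloud's density field decides where the flame front is allowed to advance, and a lookup keyed on age and density decides what color it renders. A toy version of that lookup follows; the colors, breakpoints, and threshold are invented for illustration, not the film's actual tables.

```python
import numpy as np

# Illustrative color ramp from ignition to burnout (not the production's values).
FIRE_RAMP = np.array([
    [1.00, 0.95, 0.80],   # just ignited, nearly white-hot
    [1.00, 0.60, 0.10],   # orange body of the flame
    [0.60, 0.10, 0.02],   # cooling red
    [0.05, 0.05, 0.05],   # soot and smoke
])

def fire_color(age, density, max_age=1.0):
    """Interpolate a flame color from normalized age, dimmed by local fuel density."""
    t = np.clip(age / max_age, 0.0, 1.0) * (len(FIRE_RAMP) - 1)
    i = int(np.floor(t))
    j = min(i + 1, len(FIRE_RAMP) - 1)
    frac = t - i
    color = (1.0 - frac) * FIRE_RAMP[i] + frac * FIRE_RAMP[j]
    return color * np.clip(density, 0.0, 1.0)      # thin fuel burns dimmer

def can_ignite(fuel_density, threshold=0.2):
    """The flame front only advances where there is enough fuel, like a burning fuse."""
    return fuel_density > threshold
```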


Since Digital Domain used its Storm software to create the blackish fuel cloud and the fire, the two simulations could be rendered together. Time and density determined the colors.

Proprietary software under development for two years at Digital Domain generated the ground beneath the planes and the atmosphere above. With the result, named EnGen, for Environment Generator, the crew could view the Earth from space and zoom down to a rock on the ground. On the ground, the terrain could include caves, rocks, roads, trees, snow, boulders, mountains, rivers, and so forth.

“It’s not just a simple height displacement,” Port points out. Shadows are soft when far away and hard when close, and the color of the atmosphere changes based on the sun and the viewing angle.
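
That last point, atmosphere color driven by the sun and the viewing angle, can be pictured as a blend between a horizon tint and a zenith tint plus a warm glow that grows as the view direction approaches the sun. The colors and exponent below are placeholders, not EnGen's atmosphere model.

```python
import numpy as np

HORIZON = np.array([0.75, 0.82, 0.90])   # pale blue near the horizon
ZENITH  = np.array([0.18, 0.32, 0.65])   # deeper blue overhead
SUN     = np.array([1.00, 0.85, 0.60])   # warm glow toward the sun

def sky_color(view_dir, sun_dir):
    """Tint the atmosphere by view elevation and angle to the sun (toy model)."""
    view = view_dir / np.linalg.norm(view_dir)
    sun = sun_dir / np.linalg.norm(sun_dir)
    elevation = np.clip(view[1], 0.0, 1.0)           # 0 at the horizon, 1 straight up
    base = (1.0 - elevation) * HORIZON + elevation * ZENITH
    glow = max(np.dot(view, sun), 0.0) ** 8          # brightens looking near the sun
    return np.clip(base + glow * SUN * 0.5, 0.0, 1.0)
```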


For the huge environments, a proprietary renderer automatically altered level of detail based on the airspeed, and divided each rendering job into chunks that were later assembled into final frames.

To create terrain for a location in Stealth, the crew often started with publicly available 3D topographic data and photographs of the location where the action takes place. Working in Maya, the team modeled a rough landscape in 3D, through which the director could fly a camera. Then, the group moved director-approved low-res meshes into EnGen.

Within the software, the crew added detail using nodes, sometimes as many as 1300 for a location. “We could get down to the behavior and the look of individual rocks and slopes,” says Port. “There’s even a node that makes nodes: Give it a curve, and it automatically creates a berm or maybe a road.” Artists could place rocks, create snow and vegetation using fractal noise patterns, or project a painting onto the terrain.
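
The fractal-noise scattering Port mentions is straightforward to picture: stack a few octaves of noise into a mask and use it, together with altitude and slope, to decide where snow or vegetation shows up. A minimal fBm sketch along those lines follows; the noise source, thresholds, and snow line are stand-ins, not EnGen's actual nodes.

```python
import numpy as np

def value_noise(x, y, seed=0):
    """Cheap hash-based lattice noise in roughly [0, 1] (placeholder for a real noise node)."""
    xi, yi = np.floor(x).astype(int), np.floor(y).astype(int)
    def hash2(i, j):
        h = np.sin(i * 127.1 + j * 311.7 + seed * 74.7) * 43758.5453
        return h - np.floor(h)
    tx, ty = x - xi, y - yi
    tx, ty = tx * tx * (3 - 2 * tx), ty * ty * (3 - 2 * ty)   # smoothstep the blend
    a, b = hash2(xi, yi), hash2(xi + 1, yi)
    c, d = hash2(xi, yi + 1), hash2(xi + 1, yi + 1)
    return (a * (1 - tx) + b * tx) * (1 - ty) + (c * (1 - tx) + d * tx) * ty

def fbm(x, y, octaves=5, lacunarity=2.0, gain=0.5):
    """Fractal (fBm) noise: sum octaves of noise at doubling frequency, halving amplitude."""
    total, norm, amp, freq = 0.0, 0.0, 1.0, 1.0
    for o in range(octaves):
        total += amp * value_noise(x * freq, y * freq, seed=o)
        norm += amp
        amp *= gain
        freq *= lacunarity
    return total / norm

def snow_mask(x, y, altitude, slope, snow_line=3000.0):
    """Snow where the terrain is high, not too steep, and the fractal mask allows it."""
    return (altitude > snow_line) & (slope < 0.6) & (fbm(x, y) > 0.45)
```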

A proprietary renderer wrangled the rendering job, changing the level of detail based on airspeed. The renderer dropped to half-resolution for terrain beneath planes moving so fast that the ground below was blurred. Also, because the terrains were so huge and contained so many elements, the renderer divided each job into chunks that it assembled into final frames, and generated individual atmosphere and terrain passes for the compositors.
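
The airspeed-driven level of detail amounts to a simple rule: once the ground is streaking past fast enough to blur, it no longer needs full resolution. A toy version of that decision, plus the tile split that let huge terrain frames be rendered in pieces, might look like this (the threshold and tile size are invented):

```python
def terrain_resolution(base_res, ground_speed_mph, blur_threshold=1500.0):
    """Drop terrain to half resolution once motion blur hides the detail anyway."""
    return base_res // 2 if ground_speed_mph > blur_threshold else base_res

def split_into_chunks(width, height, tile=512):
    """Divide a huge terrain render into tiles that are assembled into the final frame."""
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield (x, y, min(tile, width - x), min(tile, height - y))
```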


With its new EnGen terrain tools, Digital Domain altered the 3D models built from topographic data, added such elements as rocks, caves, and rivers at varying levels of detail, cast shadows, and colored the atmosphere.

For compositing, Digital Domain uses its Nuke software, and for this film, the studio pushed the program in new ways. “We actually used Nuke as a shader,” says Bryan Grill, compositing supervisor.

Grill explains that lighters created the underlying airplane look by generating 16 rendered passes with different lighting effects. “These passes were control images, and were used as shaders,” he says. “We called them ‘shamposites,’ for shader-composites.” The technical crew combined the 16 passes into five layers and passed them on to the compositors. Compositors manipulating these layers controlled the color, reflectivity, and other lighting effects.
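
What Grill calls a “shamposite” is, in effect, relighting in the composite: the renderer writes out control images (per-pixel diffuse, specular, reflection, occlusion, and so on), and the compositor re-weights and tints them instead of sending the plane back for a re-render. A stripped-down sketch of that recombination follows, with invented pass names and dials rather than the show's actual 16 passes and five layers.

```python
import numpy as np

def shamposite(passes, dials):
    """Rebuild a beauty image from rendered control passes.

    passes: dict of float image arrays (H x W x 3), e.g. 'diffuse', 'specular',
            'reflection', 'ambient_occlusion' (hypothetical pass names).
    dials:  per-pass gain and tint the compositor can tweak without re-rendering.
    """
    beauty = np.zeros_like(passes["diffuse"])
    for name, img in passes.items():
        if name == "ambient_occlusion":
            continue                       # handled after the additive passes
        gain = dials.get(name, {}).get("gain", 1.0)
        tint = np.array(dials.get(name, {}).get("tint", [1.0, 1.0, 1.0]))
        beauty += img * gain * tint        # additive lighting contribution
    if "ambient_occlusion" in passes:
        beauty *= passes["ambient_occlusion"]   # occlusion darkens what is already there
    return beauty
```

The payoff is the one Grill describes next: adjusting a gain or a tint is effectively instant, while re-rendering the plane takes hours.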

“It takes hours to re-render something,” says Grill. “But with these control images, we had the latitude to change the look.” This was important because most of the environments in the 658 shots were all-CG, and the average shot contained 50 elements: the visors, canopies, terrain, clouds, planes, engine effects, live-action elements, and so forth.

“We couldn’t wait until all the environments were done to render the planes,” notes Grill. By making the lighting pipeline interactive, the crew could begin working on the shots and then later correct the airplanes’ look as clouds filled the backgrounds. “We might have had to re-render one or two passes, but never the whole plane,” he says.

Compositors built the shots by starting with the terrain, giving the group its sun, sky, and ground. Each shot had as many as four different terrains depending on the airplanes’ altitudes. When the planes were high enough, the crew generated the terrain as a “pan and tile” background, more like a matte painting than a 3D model. When the camera moved closer to the ground, the terrain became fully 3D.

“We had Web pages set up so we knew what the altitude was for each shot,” Grill says. “We could press a button in Nuke, and the world would pop down where we wanted it to be.”

Next, the compositors layered in the clouds, using live-action shots of clouds for only around five percent; the rest were synthetic. Lastly, they inserted the airplanes, canopies, visors, and such flying effects as jet wash, heat exhaust, wing-tip vortices, and cloud vapor. Each effect arrived with 20 layers of controls, enabling the compositors to modify the look of the effects in much the same way that they changed the appearance of the plane.

The compositors’ challenge was blending the CG elements (background, sky, clouds, airplanes, effects) with the live-action elements into scenes that looked like they had been filmed rather than like a cinema-sized video game. To help everyone on the crew see what the elements would look like when projected in theaters, the studio created a viewer in Nuke that applied the color curves used in film.
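
The viewer Grill describes is essentially a display transform: every element is previewed through the same film-style response curve it will eventually be projected through, so artists judge the final look rather than raw renders. A simplified stand-in for that idea follows; the curve is a generic filmic tone map, not the production's actual print-film curves.

```python
import numpy as np

def film_preview(linear_rgb, exposure=1.0, white_point=4.0):
    """Preview scene-linear renders through a film-like response curve.

    A generic tone curve (extended Reinhard) stands in for the real film curves
    the studio would have loaded; it rolls highlights off toward white instead
    of clipping them, which is the look the compositors were matching.
    """
    x = np.maximum(linear_rgb * exposure, 0.0)
    mapped = x * (1.0 + x / (white_point ** 2)) / (1.0 + x)
    return np.clip(mapped, 0.0, 1.0) ** (1.0 / 2.2)   # then monitor gamma for display
```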

“Everyone was looking at what the output should be like through the process,” says Grill. “For us, this has been a long time coming.” Earlier, he explains, CG artists might have looked at a photograph, looked on-screen at a rendered element, and when they matched, handed the element to the compositing team. By popping the element into the Nuke viewer, the CG artist saw what the compositor would work with instead.

“We were creating pictures from scratch,” Grill says, “and when you’re producing every element of a picture, you have control over every part. But that’s when things start looking unreal. We had to constantly educate people about photographic images: what the sky would look like if we exposed for the plane, what the plane would look like if we exposed for the sky or the clouds. We had all this in play.”


To design three Talon stealth fighters and EDI, the Extreme Deep Invader, the artists called on engineers from Northrop Grumman, and then added a sexy Hollywood look to the planes.

In this film, the action is often the story, and the visual effects are often the cinematography. “It takes a real discipline to have all this power and not abuse it,” Grill says. And for the people who see this movie and have ridden in the pilot’s seat, surely that’s a discipline they will want all unmanned weapons to learn as well.



Barbara Robertson
is an award-winning journalist and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.