Volume 24, Issue 6 (June 2001)

War Effort

By Barbara Robertson

The movie has all the makings of a classic Hollywood epic: Set against the backdrop of America's entrance into World War II in 1941, Touchstone Pictures' Pearl Harbor is the story of two pilots and best friends (actors Ben Affleck and Josh Hartnett) and a Navy nurse (actress Kate Beckinsale) who loves them both. Scheduled to open on Memorial Day, the high-profile film was produced by Jerry Bruckheimer and directed by Michael Bay, who previously worked together on the sci-fi thriller Armageddon, and was written by Randall Wallace, who wrote Braveheart. Some critics describe the film as a cross between Bay's Armageddon and James Cameron's Titanic; others liken it to Steven Spielberg's WWII epic, Saving Private Ryan. Like Titanic, this love story puts a human face on devastating historical events and gives the plot the necessary creative tension. Like Saving Private Ryan, it places the audience in the midst of realistic battle scenes, although in Pearl Harbor, the war is fought with air strikes rather than ground battles. Pearl Harbor's story moves through three air battles: the Battle of Britain, which took place in the summer and fall of 1940; the Japanese attack on Pearl Harbor in December 1941; and Lt. Colonel James H. Doolittle's (actor Alec Baldwin) raid on Tokyo in the spring of 1942. The heart of this movie, though, is the sneak attack on Pearl Harbor.

On the morning of December 7, 1941, 353 Japanese airplanes bombed the military base in Hawaii (then a US territory) for one hour and 45 minutes. Approximately 100 US Navy ships were in the harbor that morning: battleships, destroyers, cruisers, and various support vessels. At the end of the massive attack, 2341 American military personnel and 54 civilians were dead, 29 Japanese planes were shot down, 12 Navy ships were sunk or beached, and 164 American aircraft were destroyed.
The bow of the capsizing Oklahoma was a 150-foot rotating set piece; the back of the ship, including the tower, is digital. The swimmers are real; some sailors onboard are real, some are digital. Other CG elements include smoke, splashes, water, and debris. (© 2001 Touchstone. Images courtesy Industrial Light & Magic.)

In the film, 290 planes fly from Japan to attack Pearl Harbor, and the attack continues for 40 minutes. To help simulate this attack and the other air battle scenes in the movie, director Bay worked with effects studio Industrial Light & Magic, which had helped create some of the battle scenes for Saving Private Ryan. For Pearl Harbor, ILM's effects team, led by supervisor Eric Brevig, concocted a mixture of practical and digital effects that were seamlessly blended into live-action plates to create the "invisible" effects.

At ILM one morning toward the end of production, Ben Snow, associate visual effects supervisor, brings up several shots on a "viewing station" to show the range of digital effects created for Pearl Harbor. In one, the screen is filled with digital ships; only a small area in the foreground remains from the original live-action plate shot in Hawaii. Sailors, many of whom are digital, can be seen moving around the decks and waving to people onshore. In another shot, digital sailors and stuntmen cling to the side of the capsized battleship Oklahoma. Depending on the shot, Snow explains, a ship might be a miniature, a real battleship, a 150-foot set created for the movie that was extended digitally, or all digital. Most of the airplanes are digital, but not all. In a shot from a Battle of Britain dogfight between a real plane and a digital plane, the real plane becomes digital in mid-flight. All the planes, real or not, fire CG bullets. Some of the explosions are real; more are created with a mixture of live-action shots of real fire (so-called practical elements) and CG smoke. The plane crashes are digital. Tracers hitting the water are CG. The water is both digital and real.

Some of the most spectacular effects that Snow displays take place during the Pearl Harbor attack: The camera follows the attacking Japanese planes as they fly low to the water, skirting between battleships. The air is dense with planes, all firing down on the ships. You see explosions everywhere; huge fiery smoke clouds fill the air. Sailors jump off the ships, are blown off the ships. Tracers ping the water between the ships. It seems terrifyingly real. But of course it isn't.
Animators set plane speed in real miles per hour and moved control points on a three-dimensional flight path in Maya to fly aircraft in scenes such as this one. (© 2001 Touchstone. Images courtesy Industrial Light & Magic.)

To create their part of the Pearl Harbor attack, the ILM crew started with 3D animatics provided by Bay that detailed his plan for the sequence. "Looking at those, we knew there would be elaborate camera moves, so we started by improving our links between CG and motion control, and then we looked at what we would have to do in CG," Snow says. The team identified four main areas of computer graphics research for the scenes in Pearl Harbor: lighting the digital planes, ships, and small complex objects in daylight environments; choreographing and animating the complex air battle scenes; creating realistic, large smoke simulations; and developing complicated crashing tools. "These are things we weren't doing so well before," says Snow. In addition, the team worked on new tools for creating digital people and putting them into scenes.

One of Bay's concerns going into the project was whether the team could make a believable CG plane, according to Snow. All told, the modelers created three types of Japanese planes to match the planes used in live-action shots: Zeroes (fighters), Kates (torpedo planes), and Vals (dive bombers). The live-action Zero was real, but the Kate and Val were American planes modified for the movie Tora! Tora! Tora!, according to Ron Woodall, lead CG viewpainter. The team also made two American planes, the P-40 and B-25; two German planes, the Messerschmitt ME-109 fighter and Heinkel HE-111 bomber; and the British fighter, the Spitfire. In addition, ILM modelers built two versions of eight different CG ships for Battleship Row in Pearl Harbor to show each ship before and after the attack. "The ships are huge," says Michael Bauer, CG supervisor. "In one battleship, there are 6000 geometry nodes."
The dock, sky, water, the ship on the left, and the small ship at the end of the dock were filmed in Pearl Harbor. The battleships, their moorings, and the sailors onboard are digital. (© 2001 Touchstone. Images courtesy Industrial Light & Magic.)

"The funny thing was that Michael Bay had strong concerns about the planes, but we were more worried about the CG ships," says Snow. In particular, they were worried about whether they could illuminate the ships properly. "The ships had complex surfaces with diffuse illumination," he explains.

In the past, effects teams simulated diffuse illumination, such as that typically created with radiosity, by using as many as 25 fill lights to produce light coming from all directions. But getting the right colors was a matter of trial and error, of someone deciding that a blue light might work in one place and a red light in another. For Pearl Harbor, the crew wanted a better method. Ultimately, the team devised a clever solution that simulates the ambient lighting of radiosity without radiosity's processing penalty, and that does not depend on the orientation of an object in the environment.
Animators could select guns in the CG planes and then fire tracers. In this scene, the planes in the sky, the exploding plane, and the tracer fire are CG. (© 2001 Touchstone. Images courtesy Industrial Light & Magic.)

Hayden Landis, a sequence supervisor who helped devise the new ambient lighting system, explains that for years ILM effects crews have placed chrome spheres on location and photographed them. By unwrapping the photographed image on the chrome spheres and wrapping the image onto virtual spheres, they created reflection maps of the environment that were used to light CG objects placed into the live-action scenes. "I thought there had to be a way to use the [chrome sphere] photographs to get fill lighting," he says. Landis ruled out techniques that calculate fill lighting by taking a series of samples of the environment. "You can sample the environment a million times with raytracing, but I wanted to come up with something that could be used in production, so I just blurred the map," he says. "We get 80 percent of the look in a fraction of the time."

Blurring the map in effect averages the light in the environment and thus simulates diffuse light. By shooting rays from normals on the surface of the object being lit (that is, from a position perpendicular to the surface) straight out to the blurred environment map, he could determine the color and direction of the light that would fall on that point of the object's surface. Landis points to a softly shaded airplane on his computer screen: "You see here, the bottom of the wing is gray because the tarmac is gray, and the top is a lighter blue because the sky is bright blue. We also get a subliminal bluish tinge and reddish warm bounce from the runway. Since we're using all the light from the actual environment we get all the proper hues and visual cues that make the object look like it belongs there."
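
Concretely, the technique needs surprisingly little machinery. Below is a minimal Python sketch of the idea, assuming a lat-long environment map and a toy sky-over-tarmac environment; the blur, resolution, and colors are all illustrative, not ILM's actual implementation:

```python
# Sketch of blurred-environment-map diffuse lighting: heavily blur a
# lat-long map once, then look it up along each surface normal instead
# of raytracing many samples per point.
import numpy as np

def blur_latlong(env, passes=50):
    """Crude repeated box blur: clamped at the poles, wrapped in azimuth."""
    out = env.astype(np.float64)
    for _ in range(passes):
        p = np.pad(out, ((1, 1), (0, 0), (0, 0)), mode="edge")
        out = (p[:-2] + p[1:-1] + p[2:]) / 3.0                          # vertical
        out = (np.roll(out, 1, 1) + out + np.roll(out, -1, 1)) / 3.0    # azimuth
    return out

def diffuse_lookup(blurred, normal):
    """Sample the blurred map in the direction of the surface normal."""
    x, y, z = normal / np.linalg.norm(normal)
    theta = np.arccos(np.clip(z, -1.0, 1.0))      # polar angle from +Z ("up")
    phi = np.arctan2(y, x) % (2.0 * np.pi)        # azimuth
    h, w, _ = blurred.shape
    row = min(int(theta / np.pi * h), h - 1)
    col = min(int(phi / (2.0 * np.pi) * w), w - 1)
    return blurred[row, col]                      # RGB fill-light color

# Toy environment: bright blue sky above, gray tarmac below.
env = np.zeros((64, 128, 3))
env[:32] = [0.4, 0.6, 1.0]   # sky
env[32:] = [0.3, 0.3, 0.3]   # ground
blurred = blur_latlong(env)
print(diffuse_lookup(blurred, np.array([0.0, 0.0, 1.0])))    # wing top: bluish
print(diffuse_lookup(blurred, np.array([0.0, 0.0, -1.0])))   # wing bottom: gray
```

The lookup returns bluish fill for an upward-facing normal (the top of the wing) and tarmac gray for a downward-facing one, matching the behavior Landis describes.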

The next problem to solve was shadows. "Since the surfaces aren't raytraced, [points on the surface] have no idea whether they're inside an engine block or out on a wing," Landis says. To help create shadows, the team used Mental Images' Mental Ray. Basically, by shooting rays out from the surface, the Mental Ray renderer could determine how much of the outside world every point on the surface could "see."
A tool implemented in Maya computed and animated the ailerons, flaps, and rudder based on how the animators "flew" the plane; the CG planes could be limited by what actual planes can do. (© 2001 Touchstone. Images courtesy Industrial Light & Magic.)

"If a point can see only 10 percent of the environment, it becomes only 10 percent as bright as everything else," Landis explains. "That produces nice soft shadows and a kind of radiosity look."

However, they noticed that these shadows weren't always accurate. "Some places were getting light from the sky even though the light would be blocked," Landis says. So in addition to having the rays determine how much light a point was getting, they had them record where most of the light originated. Using that information, the system would bend the normals to point toward the available light, which created appropriate occlusions. "It's a bit of a cheat, but you don't see anything wrong," says Landis.
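
Here is a hedged sketch of that occlusion pass, with a single sphere standing in as blocker geometry (ILM did this step in Mental Ray; the hemisphere sampling and bent-normal averaging below are illustrative):

```python
# Fire rays from a surface point, measure how much of the environment it
# can "see", and record where the unblocked light comes from (the bent
# normal). The blocker sphere and sampling scheme are invented.
import numpy as np

rng = np.random.default_rng(7)

def ray_blocked(origin, direction, center, radius):
    """Ray/sphere test: True if the ray hits the sphere in front of the point."""
    oc = origin - center
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return False
    t = -b - np.sqrt(disc)            # nearest intersection distance
    return t > 1e-6

def occlusion_and_bent_normal(point, normal, blockers, samples=512):
    visible = 0
    bent = np.zeros(3)
    for _ in range(samples):
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)
        if np.dot(d, normal) < 0.0:
            d = -d                    # keep the sample in the upper hemisphere
        if not any(ray_blocked(point, d, c, r) for c, r in blockers):
            visible += 1
            bent += d                 # accumulate the unoccluded directions
    occlusion = visible / samples     # sees 10% of the world -> 10% as bright
    bent = bent / np.linalg.norm(bent) if visible else normal
    return occlusion, bent

# A point on a wing with one blocker sphere overhead and to the side.
occ, bent = occlusion_and_bent_normal(
    np.zeros(3), np.array([0.0, 0.0, 1.0]),
    blockers=[(np.array([0.0, 1.0, 2.0]), 1.0)])
print(f"visibility {occ:.2f}, bent normal {bent.round(2)}")
```

The bent normal then replaces the geometric normal in the blurred-map lookup, so a partially shadowed point draws its fill light from the directions that are actually open.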

Finally, the team made texture maps out of all this shading information and "baked" these maps into the objects. Thus, rather than calculating the shading for each object on a frame-by-frame basis during rendering, they stored the objects and the pre-calculated textures on disk in Pixar's RenderMan RIB files and had RenderMan bring the "baked" objects into the scenes as needed. "Baking" requires lots of disk space, but saves enormous amounts of processing time. "With solid surface objects like the planes and ships that don't have a lot of animating parts, we do the calculations once and we've got all that great ambient lighting information right at our fingertips," Landis says. The result is subtle shading created from the live-action environment by using methods efficient enough to keep the production pipeline humming along. Already, the ambient environment lighting technique is being used for effects shots in other movies underway at ILM.
The ILM crew developed a lighting technique that simulated radiosity to shade the CG planes. Ambient, diffuse lights made of colors from the live-action plate help blend these CG planes into their environment. (© 2001 Touchstone. Images courtesy Industrial Light & Magic.)

Making the planes look real and fit well into the environment was only half the battle; the team also had to fly the planes and create believable crashes. "We had a fair amount of pre-production time on the show, so we had time to write the tools we needed to animate planes quickly with the highest level of control," says Scott Benza, lead animator. Benza worked with technical directors James Tooley and Craig Hammack to develop a system and an interface in Alias|Wavefront's Maya for the animators.

"We start animating by using an import planes button," Benza says, "and this brings up a dialog box that has every plane being used in the show. When a plane comes in, it's all set up and ready to animate."

Each plane that appeared on Benza's SGI workstation screen had its own flight path, a spline that he could scale and modify in 3D space. The plane always automatically pointed in the direction of the flight path. In addition to changing the flight path, Benza could change the plane's speed using actual miles per hour values. "We knew Eric [Brevig] would ask us how fast the planes were going in real world miles per hour," Benza says.
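
A toy version of that setup might look like the sketch below, with a polyline standing in for Maya's spline; scene units in feet and a 24 fps frame rate are assumptions, since the article doesn't specify ILM's conventions:

```python
# Fly a plane along a path at a real-world speed: miles per hour are
# converted to scene feet per frame, and the plane always points along
# the path, as the article describes.
import numpy as np

MPH_TO_FT_PER_FRAME = 5280.0 / 3600.0 / 24.0   # miles->feet, hours->frames

def sample_path(points, distance):
    """Return position and heading `distance` feet along the polyline."""
    for a, b in zip(points[:-1], points[1:]):
        seg = np.linalg.norm(b - a)
        if distance <= seg:
            direction = (b - a) / seg          # plane auto-aims along the path
            return a + (distance / seg) * (b - a), direction
        distance -= seg
    last = (points[-1] - points[-2]) / np.linalg.norm(points[-1] - points[-2])
    return points[-1], last

path = np.array([[0.0, 0.0, 50.0], [500.0, 0.0, 40.0], [1000.0, 200.0, 30.0]])
speed_mph = 240.0
for frame in range(0, 72, 24):                 # three seconds of flight
    pos, heading = sample_path(path, frame * speed_mph * MPH_TO_FT_PER_FRAME)
    print(frame, pos.round(1), heading.round(2))
```
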
To create smoke, a simulation engine inside a volume propagated pressure waves from side to side, a heat source caused the atmosphere in the volume to rise, and millions of particles emitted in the volume followed the flow (top left). Shapes were controll

As Benza animated the plane, its ailerons, flaps, and rudder could automatically adjust. In real life, these parts control a plane, but in this virtual world, the process works in the opposite way. "We animate the plane first, so the software computes what the control surfaces should be doing and offsets the animation ahead of time. It's a reverse cycle, but the end result is the same, ideally," Benza says.
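
As one plausible illustration of computing control surfaces after the fact, the sketch below derives a bank angle from the animated turn rate via the coordinated-turn relation tan(bank) = v * omega / g, then reads aileron deflection off the change in bank. The gains and limits are invented:

```python
# The plane is animated first; control-surface poses are then computed
# from the motion ("a reverse cycle").
import numpy as np

G = 32.2  # gravitational acceleration, ft/s^2

def control_surfaces(headings, speed_fps, fps=24.0, aileron_gain=0.5):
    """headings: per-frame yaw in radians. Returns per-frame (bank, aileron)."""
    yaw_rate = np.gradient(np.unwrap(headings)) * fps          # rad/s
    bank = np.arctan(speed_fps * yaw_rate / G)                 # coordinated turn
    roll_rate = np.gradient(bank) * fps                        # rad/s
    aileron = np.clip(aileron_gain * roll_rate, -0.35, 0.35)   # deflection, rad
    return bank, aileron

# A turn that tightens over two seconds at ~240 mph (352 ft/s).
frames = np.arange(48)
headings = 5e-5 * frames**2              # yaw accelerates: the plane banks in
bank, aileron = control_surfaces(headings, speed_fps=352.0)
print(np.degrees(bank[::12]).round(1))   # bank angle steepens through the turn
print(aileron[::12].round(3))            # ailerons deflect while bank changes
```
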

Each plane could have limitations based on what the actual planes could do. "Some of our most talented animators found this to be as challenging as creature animation," says Benza. "There's a lot of balance involved. How the plane reacts to the wind. How the wind pushes the plane around. A lot of times these planes don't fly perfectly straight and we had to fight giving them a spaceship look. The real planes weren't capable of being that precise."

While the Pearl Harbor attack required a lot of custom hand animation, for many of the shots, particularly those in which hundreds of Japanese planes are flying toward Hawaii, the animators could apply a level flight cycle: 300 frames of a plane flying straight through space while making subtle adjustments, the equivalent of the walk cycle typically used in creature animation. To help choreograph hundreds of planes, the animators used Maya's "pawn" mode, in which low-resolution representations of the planes could be moved around in scenes.

Bombs were dropped using Maya dynamics; the animators simply set a release frame and the bombs dropped automatically. For gunfire, the effects crew created a special tool using Maya's MEL scripts. "You bring in a plane, press a button, select the guns to activate, and the system gives you visible tracer fire," says Landis. When a bullet hits an object in a scene, the system determines what was hit, and that determination sets more effects in motion. "If a bullet hits the water, it triggers a splash; if it hits a ship, it triggers a spark; if it hits a plane, the plane starts emitting smoke," Landis says. The system stores all the hit points so they can be moved without re-running the simulation. "You have a file of hit points that trigger themselves at the right times and do the right things," Landis says. "You can pick them up and move them left or right."
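
Landis's description suggests an event-record design: each impact is computed once, saved with its frame, position, and surface type, and can be edited later. Here is a minimal sketch under that reading, with invented record fields and effect names:

```python
# Stored hit points: each record both triggers the right secondary
# effect and can be nudged later without re-running the gunfire sim.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class HitPoint:
    frame: int
    position: tuple      # (x, y, z) in scene units
    surface: str         # "water" | "ship" | "plane"

EFFECT_FOR_SURFACE = {
    "water": "splash",
    "ship": "spark",
    "plane": "smoke_trail",
}

def trigger(hit: HitPoint) -> str:
    """Fire the secondary effect this hit point calls for."""
    return f"frame {hit.frame}: {EFFECT_FOR_SURFACE[hit.surface]} at {hit.position}"

hits = [
    HitPoint(frame=101, position=(12.0, 0.0, 0.0), surface="water"),
    HitPoint(frame=103, position=(14.5, 6.0, 20.0), surface="ship"),
]
for hit in hits:
    print(trigger(hit))

# "Pick them up and move them": shift a stored hit two feet left,
# no re-simulation needed.
print(trigger(replace(hits[0], position=(10.0, 0.0, 0.0))))
```
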
ILM's rigid body simulation engine uses a clustering system so that many small pieces can exist as one large piece until the cluster is broken (right). This system helped simulate plane crashes, as in the shots shown below. (© 2001 Touchstone. Images courtesy Industrial Light & Magic.)

To create explosions resulting from the bombing and gunfire, the effects crew used both practical and digital elements. Digital smoke became especially important during the Pearl Harbor attack for two reasons, according to Snow: First, matching the movement of cameras used to film real explosions to the elaborate motion of the live-action helicopter camera used for the attack scenes, and then locking the 2D elements together, would have been difficult at best. Second, it's illegal in the US to create real smoke clouds big enough to reproduce the Pearl Harbor explosions.

Typically, CG smoke is created with particle systems using layers of fields and forces. Instead, ILM decided to use its in-house VOL Fluid system, a volumetric fluid dynamics system written by CG scientist John Anderson. Anderson, Ari Rapkin, and Vishwa Ranjan in ILM's software development department manipulated the fluid system so that it would produce natural looking smoke plumes. "Rather than applying a series of forces that operate in a purely Newtonian way to particles, which go straight until another force is applied to them, we used the real physics of fluid dynamics," says Raul Essig, technical director.

The simulation takes place inside a volume of virtual fluid, which is chopped into a grid. Inside the volume are a heat source and a particle emitter, and each point on the grid has a pressure and heat value associated with it. When the simulation engine runs, it propagates pressure waves, the heat rises and diffuses according to the laws of physics, and that drives the flow.
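
The sketch below is a drastically simplified 2D caricature of that structure: a grid carries heat, buoyancy drives an upward flow, and passive particles ride it. VOL Fluid solves real fluid dynamics in 3D with pressure propagation; the resolution, constants, and the jitter standing in for turbulence here are all invented:

```python
# Caricature of a buoyancy-driven smoke plume on a grid, with particles
# advected by the resulting flow.
import numpy as np

N = 64
heat = np.zeros((N, N))                  # heat per grid cell, heat[y, x]
vy = np.zeros((N, N))                    # vertical flow per cell
rng = np.random.default_rng(0)
particles = np.column_stack([rng.uniform(28.0, 36.0, 500),   # x: over the source
                             rng.uniform(0.0, 4.0, 500)])    # y: near the ground

def diffuse(field):
    """Average each cell with its neighbors (heat spreading out)."""
    return (field
            + np.roll(field, 1, 0) + np.roll(field, -1, 0)
            + np.roll(field, 1, 1) + np.roll(field, -1, 1)) / 5.0

for step in range(300):
    heat[0:3, 28:36] = 1.0                                # heat source at the base
    heat = diffuse(heat)
    heat = 0.5 * heat + 0.5 * np.roll(heat, 1, axis=0)    # hot air carried upward
    vy = 0.99 * vy + 0.02 * heat                          # buoyancy, with drag

    # Particles ride the flow; jitter is a crude stand-in for turbulence.
    xi = np.clip(particles[:, 0].astype(int), 0, N - 1)
    yi = np.clip(particles[:, 1].astype(int), 0, N - 1)
    particles[:, 1] += vy[yi, xi]
    particles[:, 0] += rng.normal(0.0, 0.05, len(particles))

print("mean plume height after 300 steps:", particles[:, 1].mean().round(1))
```
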

"We'd turn on the heat source and let it build up until it got hot enough to really start rising, let the pressure reach an equilibrium, and then turn on a particle emitter," Essig says. "As the particles rode along the flow of the fluid in the volume, the outer edges would slow down and start turning, and we would get rotating vortices that came up naturally from the rising heat moving through fluid." To help control the simulation, they used a slight momentum source to push a volume in a particular direction as if it were being blown by a wind, and they created valves that gave the mounting pressure an escape route.
Hand animation was used to create performances for digital sailors in this shot. At top left, the Oklahoma without digital sailors; at bottom left, with digital sailors; at right, the final shot. (© 2001 Touchstone. Images courtesy Industrial Light & Magic.)

With millions of particles being simulated for each frame, each smoke plume required huge amounts of processing power and disk space. One plume, for example, required 30GB of disk space, says Essig. Moreover, the simulation engine often had to run for 800 frames just to get everything established before the team could begin creating the plumes, and this could easily take two days. Thus, rather than creating smoke on a per-shot basis, the team created a library of plumes that could be inserted into shots, along with a library of RenderMan shaders.
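
The plume library amounts to caching: bake each expensive simulation to disk once, then pull it into any shot that needs it. A sketch of the pattern, with a hypothetical simulate_plume standing in for the two-day fluid run:

```python
# Bake-once, reuse-everywhere: cache simulation results on disk.
import os
import numpy as np

def simulate_plume(seed, frames=800):
    """Stand-in for the expensive fluid run: returns particle positions."""
    rng = np.random.default_rng(seed)
    return rng.normal(size=(frames, 1000, 3)).cumsum(axis=0) * 0.01

def load_or_bake(name, seed, library="plume_library"):
    os.makedirs(library, exist_ok=True)
    path = os.path.join(library, f"{name}.npy")
    if os.path.exists(path):
        return np.load(path)          # reuse the baked plume
    plume = simulate_plume(seed)
    np.save(path, plume)              # bake once, insert into many shots
    return plume

plume = load_or_bake("plume_01", seed=41)
print(plume.shape)                    # (frames, particles, xyz)
```
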

"Typically when you're writing a shader for particle systems, you spend a lot of time and effort adding complexity and detail to make up for what is lacking in the simulation," Essig says. "For this simulation, I realized that the less enhancement I did, the better the simulation looked. The shader is fairly complex, but the complexity ended up being used primarily to preserve edge detail."

To put fire into the smoke, Essig wrote a RenderMan shader that had the particles self-illuminate in a fiery color based on a mixture of density and age. "If the particles are young and dense, we can assume they're still glowing from the heat of the fire," he says. To further heighten the illusion, they mixed real fire elements into the simulated smoke.
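
A hedged sketch of that density-and-age rule as a tiny shading function; the color ramp and thresholds are invented, and the real version was a RenderMan shader:

```python
# Young, dense particles glow with a fiery color; old or thin ones shade
# as plain smoke.
def fire_color(density, age_frames, max_glow_age=30.0):
    """Blend from fiery emission to sooty gray by particle density and age."""
    glow = max(0.0, 1.0 - age_frames / max_glow_age) * min(1.0, density)
    fire = (1.0, 0.45, 0.1)           # still glowing from the heat of the fire
    smoke = (0.2, 0.2, 0.2)           # cooled-off soot
    return tuple(round(glow * f + (1.0 - glow) * s, 3)
                 for f, s in zip(fire, smoke))

print(fire_color(density=0.9, age_frames=5))    # mostly fire
print(fire_color(density=0.9, age_frames=60))   # pure smoke
```
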

Some of the most spectacular explosions happen when the ships are being bombed. Others occur when planes crash.

ILM already had a rigid body system, originally written by Anderson, Jim Hourihan, and Jeff Yost in the software development department, that was used to help the droids break apart in Star Wars Episode I: The Phantom Menace. To simulate Pearl Harbor's plane crashes, Hourihan rewrote the system so that it would run fast even though hundreds of rigid bodies, all connected to each other, would need to be "overconstrained." Hourihan explains: "The planes are composed of hundreds of little pieces that are all constrained together to make one plane while it's flying. These pieces break off while a plane is crashing, but while they're together, they have to be really together. Sometimes that means multiple constraints are in conflict." In one shot, even though a plane smacks into the ground at 200 miles per hour, it crumples but largely stays together.

Rather than try to simulate the plane's 200 pieces individually trying to stay together, Hourihan developed a clustering system. With this system, it became possible to convert lots of pieces, touching or not, into one big piece on the fly and then break it apart later. "The trick is to make sure you conserve the momentum so that when it breaks apart, it doesn't suddenly jerk," he says. "This was an interesting problem because it was a new thing to do with rigid simulation."
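
The crux is the momentum bookkeeping. The sketch below merges pieces into one body and splits them back out without a velocity jump; it tracks linear momentum in 1D only, whereas a real rigid-body cluster would also merge inertia tensors and conserve angular momentum:

```python
# Cluster many pieces into one body on the fly, then break it apart
# without a sudden jerk, by conserving momentum across the transition.
from dataclasses import dataclass

@dataclass
class Body:
    mass: float
    velocity: float     # 1D for brevity

def merge(pieces):
    """Combine pieces into one cluster, conserving linear momentum."""
    mass = sum(p.mass for p in pieces)
    momentum = sum(p.mass * p.velocity for p in pieces)
    return Body(mass, momentum / mass)

def split(cluster, masses):
    """Break the cluster apart; every piece keeps the cluster velocity."""
    return [Body(m, cluster.velocity) for m in masses]

pieces = [Body(2.0, 10.0), Body(1.0, 4.0)]
cluster = merge(pieces)                  # one big piece while flying
print(cluster.velocity)                  # 8.0: total momentum unchanged
for p in split(cluster, [2.0, 1.0]):     # impact: break it apart
    print(p.mass, p.velocity)            # no sudden jump in motion
```
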

In addition, by using flexible spring pins to hold joints together, Hourihan allowed the planes and ships to bend and flex, but not break, when they were hit. "Because there are springs between the two ends of a joint, you can pull a joint apart, and it will flex as it tries to come back together," he says. This causes a wing, for example, to stay connected to the fuselage as it bends and flexes. To make the wings break off, an animator simply pulls out the pins.
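
As a sketch of how such a joint behaves frame to frame, here is a damped spring acting on the separation between a joint's two ends, with a pinned flag an animator can clear to let the piece break free; all constants are illustrative:

```python
# A "spring pin" joint: the gap between the joint's two ends is pulled
# back toward zero each frame, so a wing flexes but stays attached
# until the pin is released.
def spring_pin_step(gap, gap_velocity, pinned, k=40.0, damping=4.0, dt=1.0 / 24.0):
    """Advance the joint separation one frame; returns (gap, gap_velocity)."""
    if not pinned:
        return gap, gap_velocity          # pin pulled: the piece breaks free
    accel = -k * gap - damping * gap_velocity
    gap_velocity += accel * dt
    return gap + gap_velocity * dt, gap_velocity

gap, vel = 0.5, 0.0                       # an impact yanks the joint apart
for frame in range(12):
    gap, vel = spring_pin_step(gap, vel, pinned=True)
print(round(gap, 3))                      # the spring is pulling it back together
```
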

The same rigid body dynamic system was also used for the ships. When the Arizona is torpedoed, for example, it lifts up out of the water and arches. "Scott [Benza] had the whole thing strung together with the spring pins, so that the body of the ship could flex," Hourihan says. "When it went far enough, he released the pins and the ship came apart."

Soon after the bombing started at Pearl Harbor, five of the eight battleships were sunk or sinking, three destroyers were wrecked, a minelayer and a target ship had capsized, and many other ships were badly damaged. In the movie, the crews on these ships are a mixture of real people and digital extras. "All the shots that have ships in them have digital sailors somewhere to make the shots more active," says Bauer. The sailors are not hero characters; they don't appear in close-ups. Even so, the team created several variations of sizes and uniforms. To place the sailors on the ships, the animators would position particles that would later be replaced with prebaked geometry that had texture materials already applied, a technique used in previous movies. For Pearl Harbor, though, the team created an interface for Maya that allowed animators to pick a character and view the performance assigned to it, or to look at all the characters with particular performances: all the waving guys, or all the milling guys, for example.

Each digital character in the library had a performance built in. For characters with cyclical performances, an animator would draw a path and place the character on the path. For characters with specific actions, animators would give the character a starting position and let him go. The character might run to the railing, look over, and then jump. "Once he's in the scene, he's just a blocky representation that the animators can drag around and place wherever they want him," Bauer says.

The performances were primarily created with motion capture techniques. "We ended up with a library of 65 separate performances, each one long enough to be sliced into three sections," Bauer says. For example, in one performance a character might repeatedly duck and cover for 30 seconds; in another, a character would repeatedly fall and crawl.
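
Here is a sketch of how such a library might be organized: clips sliced into sections, stand-ins placed on a deck and handed a section, with queries like "all the waving guys" falling out of the data. The clip names, counts, and deck dimensions are invented:

```python
# A crowd library: motion-captured clips are sliced into reusable
# sections, and each placed stand-in is assigned one section.
import random

CLIP_FRAMES = 720                        # ~30 seconds at 24 fps
SECTIONS = 3

def slice_clip(name, frames=CLIP_FRAMES, sections=SECTIONS):
    """Cut one captured performance into equal reusable sections."""
    step = frames // sections
    return [(name, start, start + step) for start in range(0, frames, step)]

library = []
for clip in ["wave", "duck_and_cover", "fall_and_crawl", "mill_about"]:
    library.extend(slice_clip(clip))     # 65 real clips -> ~195 sections

random.seed(1)
# "Rough population": drop a few hundred stand-ins on a deck and hand
# each one a performance section.
sailors = [{"position": (random.uniform(0.0, 600.0), random.uniform(0.0, 100.0)),
            "performance": random.choice(library)}
           for _ in range(300)]

# View "all the waving guys," as the Maya interface allowed.
waving = [s for s in sailors if s["performance"][0] == "wave"]
print(len(waving), waving[0]["position"])
```
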

"This interface wouldn't work for a battle scene with thousands of guys, but we were dealing with a few hundred guys here and there. We could get a rough population on a ship in a couple of hours," Bauer says.

In addition, some of the digital sailors were hand animated: the panicking guys swimming in the background in some shots, and some of the sailors sliding and tumbling off the capsizing Oklahoma.

Although digital, each of those sailors on the Oklahoma and on the other ships represents a real person who was at Pearl Harbor nearly 60 years ago. Of the sailors who died on the Arizona, 1012 remain entombed within the sunken battleship at Pearl Harbor.

On the wall leading into the Pearl Harbor production offices at ILM is a large bulletin board onto which the staff posted pictures and stories of relatives who were in World War II, and letters from soldiers and sailors, some of whom were at Pearl Harbor. Some of the stories are achingly sad. It's a good reminder that this time, what they are creating with all their digital magic is founded in reality, not make-believe.

Barbara Robertson is Senior Editor, West Coast for Computer Graphics World.