Volume 23, Issue 7 (July 2000)

Sea Change

By Barbara Robertson

It's hard to imagine anything more terrifying than finding yourself on a fishing boat being hammered by waves so violent they could toss your vessel like a toothpick. You plummet down walls of water, waves crash over the wheelhouse and threaten to break out the windows, and you know the chance of being rescued is not great. Few people have seen a storm this dangerous, and of those who have, few survived.

This month you don't have to imagine how that would feel. Thanks to Warner Bros.' The Perfect Storm, you can experience a sea captain's worst fear vicariously, from a movie theater seat. Scheduled to open over the long July 4 weekend, the movie could be this summer's big hit. Based on the best-selling non-fiction book by Sebastian Junger, the film is directed and co-produced by Wolfgang Petersen, director of Das Boot and Air Force One, and stars George Clooney and Mark Wahlberg. Visual effects were created at Industrial Light & Magic (San Rafael, CA), and by all accounts include some of the most incredible shots of wild seas ever put on film. Much of the water is digital; many of the shots were created entirely with computer graphics.

The storm happened in late October 1991 in the North Atlantic when a low-pressure system filled with cold air bumped into a hurricane filled with warm air. When a high-pressure system moved in and kept the cold front grinding against the hot air, it became the rare event meteorologists call a "perfect storm." On October 28, waves had climbed to 70 feet. The first 100-foot wave hit the next day.

Heading straight into the storm was the Andrea Gail, a 72-foot steel fishing boat motoring her way home to Gloucester, Massachusetts, with a hold full of swordfish and a broken ice machine. In his book, Junger provides a terrifying account of the four-day storm as experienced by the people who were in it on fishing boats, sailboats, Coast Guard cutters, and rescue helicopters, with a special focus on the six-man crew of the Andrea Gail.
ILM used a fluid flow simulation to shape and move the ocean, particle dynamics for the crest of the wave, and cloth simulation for the sail to create this entirely CG scene.

But how to create such a storm on film? Actors could be filmed on a full-size model of the fishing boat, and that boat could fit in a tank of water-but a 100-foot wave certainly couldn't. Even with a quarter-scale boat, the wave would have to be 25 feet tall. The answer was to create digital water that looked and moved realistically.

"This is all about the reality of water," says Stefan Fangmeier, visual effects supervisor. "Most people have seen boats going through 30- or 40-foot waves, and wind blowing water, and waves washing up on the shore. Water is so elemental, everybody knows what it looks like. For that reason, and because it's a real story, these things have to be very real."

Effects studios have created relatively calm digital seas for such films as Titanic and Waterworld. Deep Impact had a few shots with a huge wave. But no studio had tackled a project requiring so much digital water in so many forms. "I didn't want to do it," says Fangmeier. "I figured it was too hard." Ultimately, he accepted the challenge and led an ILM crew that grew to around 100 people, largely R&D engineers and technical directors (TDs). Creating the film's 340 effects shots took the crew more than 14 months. Around 250 of the shots-about 30 minutes of the film-include digital water. In some shots, the digital water matches and extends practical (real) water in live-action scenes-at Warner Bros., a 100-foot by 92-foot tank floated a full-sized model of the Andrea Gail. In other shots, everything is CG.

"It's definitely the most challenging project I've done," says Fangmeier, who also supervised visual effects for another natural phenomenon film, Twister. "For Twister, we shot a location in Oklahoma and then put the tornado in it. For this film, we're creating the whole environment, including everything in the air."
The cloudy sky, the lightning bolt, and the water directly beneath were created with computer graphics to extend this harbor scene and provide stormy weather.

The digital water was created with two different types of simulators. One used computational fluid dynamics in a custom system to create a continuous field for ocean waves and other volumes. The second type used rule-based particles in Alias|Wavefront's (Toronto) Maya to create such things as splashes, spray, and foam. Complex procedural shaders in PhotoRealistic RenderMan (Pixar; Pt. Richmond, CA) added texture to the water's surface and all the color and lighting.

"It involves a lot of complicated systems with lots of data all interacting," says Fangmeier. "It's incredibly tedious, and it's technically very complex." An average shot is around 150 frames or 6.25 seconds; the file size for an average shot is 10GB. "With all the calculations, simulations, and particles going into a shot, we can run only maybe two or three tests during the day, so it's really difficult to nail it down and get it to look right," says Fangmeier.

Toward the end of production, one morning in May, Fangmeier sits at the controls of an Avid system in a dark room, reviewing the nearly final shots that were rendered during the night. An assistant with a clipboard, penlight, and telephone quietly pages team after team of TDs, who stream in and out as Fangmeier reviews their shots. In one, he wants more fill light on a boat's hull, a higher wake, and a different shape for water spilling over the wheelhouse. In another, a pattern of foam on the waves looks too even. The contrast on the water during a lightning strike is too harsh. A splash is too lively and too high. Most of the changes will require new simulations or new parameters in procedural shaders-or both.

"When we looked at this project, we thought, OK, it will be very difficult, but eventually we'll get the technology figured out because it's one ocean. What happened is that for every camera angle the water looks different. We can't use the same parameters and maps for different camera angles," says Fangmeier. "And there is very little margin of error. As soon as you get it too bright or too dense or too fast it starts looking fake. So, every shot has been quite a challenge. Creating realistic water is very different from painting models of dinosaurs and then lighting them in every shot."
Actress Cherry Jones was filmed on the Warner Bros. stage, where a tank big enough to hold the sailboat Mistral provided the location; wind machines simulated the storm. The splashes in the foreground are real water; the background water is digital.

The crew started with the underlying ocean-John Anderson's world. A former professor, Anderson has been creating oceans with numerical simulations for close to 20 years and sailing them in offshore races. "I would have come to ILM to make this movie if I hadn't already been here," he says. "It's the right time. We've just gotten the simulation technology to the point where we can do 3D flows that look real."

Anderson explains that 10 years ago, scientists had only enough processing power to create fluid flow simulations with 80 to 100 points on a side-enough resolution to define a main flow, but not enough to define instabilities. About five years ago, high-resolution, multiple-scale 3D simulations began to emerge in the scientific world as machines became faster.

Why are multiple scales important? "In a low-resolution simulation, you get a smooth field and the fluid starts looking like a lava lamp," Anderson says. "But when you see a real wave breaking you see kinks forming on it that get faceted before they turn to foam. You need that turbulence to make a simulation look real." To have that level of detail, there must be enough resolution to have two scales in one simulation-the large scale is the wave flow and the secondary scale shows the turbulence in the wave. Because the small scale is dependent on the large, the two must be calculated together.

"Two or three years ago, we couldn't have done the movie this way," Anderson says. "The simulations would have taken weeks or months to run, not days." ILM assigned 435 processors to this show: 115 in SGI (Mountain View, CA) O2s, 320 in SGI Origin 2000s.

A simulation starts with a random wave field. Anderson then adds wind and runs the "sim" until he sees the waves he wants. "Physical simulation is sort of this Zen-like thing," he says. "You just sit there staring at water for 12 hours at a time and tweak at it." Eventually, he produced 20 different ocean types that he calls "digital stock footage." The oceans are categorized by wave height-from calm seas with short-spaced, 5- to 10-foot waves, to wild, stormy seas with long-spaced 100- to 150-foot waves.

To save disk space, Anderson's program calculated the simulations with 3D physics, but used 2D height fields to represent the oceans with rolling or small waves. Data-intensive, full 3D simulations were saved for particular cases such as big breaking waves and volumes of water sloshing on deck. "Once the waves are breaking, there's no single value for the height, so you can't represent them in 2D," he says.
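ILM's fluid solver is proprietary, but the height-field idea above can be sketched with a toy example. This Python sketch (all function names, parameters, and wave numbers are invented for illustration) represents an ocean as a single-valued height field built from a sum of traveling sinusoids — exactly the kind of surface that can store rolling seas in 2D, and exactly what fails once a wave breaks and one (x, y) position has more than one height:

```python
import math

def wave_height(x, y, t, components):
    """Sum-of-sinusoids height field: each component is a tuple
    (amplitude, wavelength, direction_radians, speed), all in
    feet and seconds. Returns one height per (x, y) - the 2D
    representation only works while waves aren't breaking."""
    h = 0.0
    for amp, wavelength, theta, speed in components:
        k = 2.0 * math.pi / wavelength          # wavenumber
        d = (math.cos(theta), math.sin(theta))  # travel direction
        phase = k * (d[0] * x + d[1] * y) - k * speed * t
        h += amp * math.sin(phase)
    return h

# A calm sea in the article's terms: short-spaced 5- to 10-foot waves
# (two components with invented amplitudes and directions).
calm = [(2.5, 40.0, 0.0, 10.0), (1.5, 25.0, 0.6, 8.0)]
print(wave_height(0.0, 0.0, 0.0, calm))  # → 0.0
```

Because each component is periodic in its wavelength, a patch built this way tiles seamlessly — one simple way to get the "periodic square" behavior the article describes, where waves exit one side and re-enter the other.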
The large waves are fluid simulations with a layer of smaller simulated waves on top for texture. The foam, mist, and splashes were created with particles. Various shaders, including fractal-based displacements, added ripples, color, lights, and other details.

To control the ocean's dramatic performance, various parameters in the simulation program allow waves to grow faster than in the real world, to break at specific frames, to push together, to become rogue waves, and so forth. TDs can also, later, sharpen the waves or make them lean in a particular direction. Although the simulation program is physically based, the simulations don't have to be physically accurate.

The 20 ocean simulations were output as 600- to 800-frame never-ending "periodic squares" in which the waves go out one side and then come back in the other. These simulations were first used to create 3D animatics in Maya.

To bring the simulation data into Maya in a useful form, technical director Masi Oka created an "ocean primitive," an amorphous shape that forms an ocean surface using simulation data imported into Maya via a special plug-in. With this, the TDs could move a camera around the ocean to look for areas they liked and then pick that data out of the simulation file for later use. "They were exactly like location scouts," says Doug Smythe, associate visual effects supervisor.

The next step was to put boats on the water. All told, the effects team created 12 models in Maya, including the Andrea Gail, the yacht Mistral, a Coast Guard cutter, a container ship, a tanker, a helicopter, and a plane.

Habib Zargarpour, associate visual effects supervisor, led the team that integrated the CG boats with the simulated water. Earlier, Zargarpour had used Maya to create an animation system for the pod race in The Phantom Menace. For The Perfect Storm, he created a boat simulation system using expressions in Maya that determine a vessel's movement based on its buoyancy and propulsion, the drag in the water, and the movement of the underlying ocean. Custom plug-ins allow the boat to sense the speed of the water underneath.
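The actual Maya expressions aren't published, but the buoyancy-and-drag idea can be illustrated with a toy heave model (one vertical axis only; the spring and drag constants are invented, not ILM's). A buoyancy "spring" pulls the hull toward the ocean surface while water drag damps the motion, stepped with semi-implicit Euler:

```python
def step_boat(z, vz, ocean_h, dt, buoyancy=4.0, drag=0.8):
    """One semi-implicit Euler step of a toy heave model (feet,
    seconds). The hull at height z is pulled toward the ocean
    surface ocean_h by buoyancy and slowed by water drag."""
    az = buoyancy * (ocean_h - z) - drag * vz  # accel per unit mass
    vz += az * dt
    z += vz * dt
    return z, vz

# Dropped 10 feet above a flat sea, the boat bobs and settles onto it.
z, vz = 10.0, 0.0
for _ in range(2000):          # 20 simulated seconds at dt = 0.01
    z, vz = step_boat(z, vz, 0.0, 0.01)
print(round(z, 3))             # settles near 0.0
```

Feeding a moving wave height in as `ocean_h` each step makes the hull ride the swell — the same reason the simple Coast Guard cutter model in the article launched off the top of an 80-foot wave and slammed down, just as the reference footage showed.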

Zargarpour demonstrates what it looks like using a virtual equivalent of a rudder to steer a simple model of a 210-foot Coast Guard cutter onto an 80-foot wave. The front of the boat comes out of the water as it rounds the top of the wave, and slams down. "When we saw that, we thought it wasn't right. And then we got reference footage of a 250-foot ship coming off a wave in the same way," he says. Animators could also alter the way the boat rides on the water by using parameters to change its buoyancy.
Actor George Clooney was filmed on a bluescreen stage. The splashes on the rigging are real; the background waves were created with computer graphics.

Once the team determined how to move the boat through the water, they began experimenting with ways to shoot the virtual scenes. "We thought it would be interesting for Wolfgang to see what it would be like if he were on another boat doing a real shoot with a hand-held camera," Zargarpour says. This would help him show the audience how big the waves are.

"When you're looking down at a 60-foot ocean, it looks pretty flat, but when you're sitting in a boat, you can see the wave towering over you," Fangmeier explains. The team created a camera as a rigid body in Maya, put it on a boat, placed a target on the boat they wanted to film, and had the virtual camera aim at the target. In the movie, as the storm gets stronger and the waves get bigger, the camera moves closer to the water.

The camera boat can be positioned anywhere relative to the target-beside it to ride the same waves, behind it to get a different view. As the target boat climbs up the face of one 50-foot wave, the camera boat might be sliding down the wave behind. Since the camera angle would be changing as the boats are riding the waves, the view would change from looking down at the target boat to looking up at its stern. It's a roller-coaster ride that changes the feeling from vicarious to visceral. "A lot of people get affected by it," Zargarpour says. "We became concerned about people getting seasick."
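The rig described above amounts to a look-at constraint: the camera rides one heaving boat and continually aims at a target fixed to another. A minimal sketch of the aiming math (the function and coordinate convention are illustrative, not ILM's code):

```python
import math

def look_at(camera_pos, target_pos):
    """Yaw and pitch (degrees) that aim a camera at a target,
    the way the virtual camera boat tracked the target boat.
    Coordinates are (x, y, z) with y up and z forward."""
    dx = target_pos[0] - camera_pos[0]
    dy = target_pos[1] - camera_pos[1]
    dz = target_pos[2] - camera_pos[2]
    yaw = math.degrees(math.atan2(dx, dz))      # turn left/right
    dist = math.hypot(dx, dz)                   # horizontal range
    pitch = math.degrees(math.atan2(dy, dist))  # tilt up/down
    return yaw, pitch

# Camera boat in a trough, target boat cresting 50 feet up, 100 feet ahead:
yaw, pitch = look_at((0, 0, 0), (0, 50, 100))  # pitch ≈ 26.6° upward
```

Evaluating this every frame as both boats ride their own waves is what produces the roller-coaster view the article describes — looking down at the target one moment, up at its stern the next.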

"This kind of continuous rolling would be very hard to create with storyboards unless you drew a lot of them," says Fangmeier. "So, these animatics were very important, even for shooting on set."

Once the director signed off on the animatics, the team began the hard work of creating the rest of the water. Starting with the ocean simulation data from the animatics, they began adding layers of water elements created with particle simulations.

First, Zargarpour analyzed reference footage of waves and boats in oceans. Among the two dozen or so different elements he isolated (and named) were crest foam on breaking waves, churn foam underneath waves, crest mist spraying off big waves, crest wisps off little waves, chopper wash, boat wakes, and so forth.

Particle systems were developed for each. When the particles had to interact with each other-a splash on the rail of a sloshing, flooded deck, for example-they were created with fluid dynamics. More often, the water particles were generated with rule-based particle dynamics in Maya using dozens of custom plug-ins written by TDs for this project. The plug-ins provided turbulence controllers, wind controllers, force fields, emitters, and other tools to create randomness with control. "When we were starting to develop these things, I thought it would take a couple of weeks," says Zargarpour. "Four months later we were still struggling with crest mist."
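As a rough illustration of rule-based particle dynamics of this kind (the emitter, wind rule, and every number below are invented for the sketch, not ILM's plug-ins), a spray element might launch particles upward with randomized velocities, then apply gravity and a wind rule at each step:

```python
import random

def simulate_spray(n, steps, dt=0.04, wind_speed=30.0,
                   gravity=-32.0, seed=1):
    """Rule-based spray sketch: n particles emitted at the origin
    with jittered upward velocities (feet/second), then advanced
    by gravity plus a drag rule nudging them toward wind speed."""
    rng = random.Random(seed)
    particles = [([0.0, 0.0],                                  # position (x, y)
                  [rng.uniform(-5, 5), rng.uniform(40, 60)])   # velocity
                 for _ in range(n)]
    for _ in range(steps):
        for pos, vel in particles:
            vel[0] += (wind_speed - vel[0]) * 0.1 * dt  # wind drag rule
            vel[1] += gravity * dt                      # gravity
            pos[0] += vel[0] * dt
            pos[1] += vel[1] * dt
    return particles

spray = simulate_spray(100, 25)  # one second of simulated spray
```

The "randomness with control" the article describes comes from exactly this split: the emission is random, but the rules (wind, gravity, drag, force fields) are parameters a TD can tune between takes.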

For example, because crest mist on a 60-foot wave drops with the wave, the TDs had to generate the splashes and mist with particles that fall along with the water. To direct the motion, the TDs created pseudo-aerodynamic force fields. "We ended up with a lot of specialists on the show because the elements were so difficult," Zargarpour says.

One team of 19 people worked only on splashes. "These are not push-a-button simulations," Smythe says. "It's not at all unusual for someone to do 20, or 30, or 40 takes." Splashes caused by boats, such as water breaking over the bow, were created with particles emitted at the intersection of the boat and the water; parameters in various plug-ins control such things as density and wind.

In one scene, the crew on the Andrea Gail is struggling to put plywood over the broken windows on the wheelhouse. An enormous wave crashes down on top of them. As they struggle to get back up, another wave drenches them. The actors were filmed on the Warner Bros. set and were splashed with real water, but the director wanted much more. "We had to match the practical splashes and get the water right on top of them," says Zargarpour. "It took a lot of iterations."
Final images are made of many layers. (Top) The Andrea Gail fully rendered, showing a cable-rigging simulation. (Second) The simulated, rendered ocean. (Third) A particle simulation of a side splash from water surrounding the boat. (Fourth) A simulation of

But the crest foam, arguably, required the most hand crafting. "Crest foam is the white fluffy stuff at the very top of a wave," says Smythe. "It's thick at the top; it makes thin trails going down the back of the wave; and a lot of it shoots off the front of the wave. It looks different from every camera angle-the shape, the way it picks up light, which parts become visible or invisible. We worked on it for a good seven or eight months."

To keep the data manageable as they worked on the shape of the foam, they would run simulations using only a few thousand particles. For final shots, each of those particles would drive another mini-simulation that would generate more particles. In a final shot, the crest foam alone on a wave could easily be made of 4 billion particles. Boat wakes were created similarly-with sprays of particles, each of which is amplified to spray hundreds more.
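The amplification trick can be sketched simply: shape the foam with a few coarse particles, then at render time let each one drive a mini-simulation that spawns a cloud of children. A toy version (the child count and jitter radius are invented for illustration):

```python
import random

def amplify(parents, children_per_parent=8, jitter=0.5, seed=0):
    """Each coarse 'parent' particle spawns a cloud of fine
    particles scattered within `jitter` of it on each axis -
    a stand-in for the per-parent mini-simulations used to
    flesh out crest foam and boat wakes for final renders."""
    rng = random.Random(seed)
    fine = []
    for px, py, pz in parents:
        for _ in range(children_per_parent):
            fine.append((px + rng.uniform(-jitter, jitter),
                         py + rng.uniform(-jitter, jitter),
                         pz + rng.uniform(-jitter, jitter)))
    return fine

coarse = [(0.0, 60.0, 0.0), (3.0, 59.0, 1.0)]
print(len(amplify(coarse)))  # → 16 (2 parents × 8 children)
```

The design win is that the expensive iteration — getting the foam's shape right — happens on thousands of particles, while the billions only ever exist on the way to the renderer.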

Each of these water simulations is rendered separately and put onto layers for final compositing. Shots often included 10 layers of particle simulations alone. In addition, to make the big seas look rough, the TDs would scale simulations of small waves down to 6-inch resolution and layer them on top of bigger waves, which were typically simulated at 5-foot resolution, and then periodically tile the two together. On top of that, procedural shaders added even more texture.

"We use an extremely complicated RenderMan surface and displacement shader that has around 300 parameters," says Smythe. Three scales of fractals are built into the displacement shader and with these, the TDs created 2-inch to 1-foot tall capillary waves, small ripples in the water, and moving patterns from wind gusts. The shader also controls the color of the water, the appearance of the particle simulations, and lighting.
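ILM's 300-parameter shader is far beyond a sketch, but the three-scale fractal pattern it uses for displacement is the classic octave scheme: each scale doubles the spatial frequency and halves the amplitude. A toy stand-in (the interleaved sinusoids below are a placeholder for real noise, and none of this is RenderMan code):

```python
import math

def fractal_displacement(x, y, scales=3, base_freq=0.2, base_amp=1.0):
    """Sum `scales` octaves of a cheap sinusoid 'noise': each
    octave doubles the frequency and halves the amplitude, the
    pattern behind the shader's three fractal scales that added
    capillary waves and small ripples on top of the simulation."""
    h, freq, amp = 0.0, base_freq, base_amp
    for i in range(scales):
        # deterministic pseudo-noise from phase-shifted sinusoids
        h += amp * math.sin(freq * x + 1.7 * i) * math.cos(freq * y - 0.9 * i)
        freq *= 2.0
        amp *= 0.5
    return h
```

Because the total is bounded by the sum of the octave amplitudes (here 1 + 0.5 + 0.25 = 1.75), a TD can dial detail up or down per scale without blowing out the overall displacement — one reason the octave pattern is so controllable.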

Early in the R&D phase, Zargarpour worked with Masi Oka to write a section of the shader that could reproduce the clumping nature of water; many months later, Joachim Arneson found the right combination of variables that would make blobbies stick together, dissipate, move with turbulence, and expand like water drops. "Everything is animated in the shader," Zargarpour says. "The water drops move and evolve."

The shader also adds reflections that simulate light bouncing on water. "We tried to set up everything with the correct physics for water," Smythe says. Also, because water is self-luminescent, they wrote shaders that scatter light inside water drops, and ones that would create specular highlights as a drop of water moved. "If you tried to do this with motion blur techniques, you'd get a smear," Zargarpour says.

In some of the night shots, they use lightning to illuminate the ocean in the distance and show the expanse. "We thought it would be to our advantage to have lightning flashes in lots of these scenes, but that turned out to be very difficult because, again, the ocean looks very different from lightning flash to flash," Fangmeier says.

As for the rest of the work? Many of the shots had CG elements in addition to the water that required sophisticated techniques. "When you're doing state-of-the-art stuff that's never been done before and you want to get it so real and relatively perfect, it's just a lot of very hard work," Fangmeier says. The crew removed rigs, did tricky bluescreen extractions of people in windstorms, and handled intricate compositing. But it was so difficult to create the rough seas that it made this hard work seem easy by comparison. Of the clouds, for example, which were created with volumetric renderings, Fangmeier says, "That's pretty easy stuff, relatively speaking, compared to the water."

"I think anything to do with creating rough seas is the hardest thing you could possibly do," says Zargarpour. "I've tried to imagine what would be harder, but I haven't come up with anything."

"The next one will be easier," he promises.

Barbara Robertson is West Coast senior editor for Computer Graphics World.