Visual effects artists devise a new method for creating turbulent water in The Guardian
Creating realistic digital water has always been a challenge for visual effects artists. First, VFX teams found a way to generate relatively calm CG water for films such as Waterworld and Titanic. Then they tackled more turbulent seas, making waves, literally, in The Perfect Storm, with a wide range of water effects culminating in a 100-foot wave that sank the small fishing boat Andrea Gail.
Since then, the process has become a little easier, thanks to the R&D efforts of Digital Domain, Industrial Light & Magic, and other studios while crafting the effects for various films, as well as the progress made by fluid-simulation and DCC software vendors in recent years. Despite these advances, generating realistic CG water, even today, is far from smooth sailing. Several months ago, ILM, with assistance from Stanford University, unleashed new methods for simulating water, creating a 200-foot wave that capsized an 1100-foot 3D cruise ship in the remake Poseidon (see "Size Matters," April 2006).
More recently, visual effects facility Flash Film Works spearheaded what has become another high watermark in terms of CG fluid technology, with stormy ocean surfaces, including boat wakes, whitecaps, bow sprays, and enormous breaking waves featured throughout The Guardian. In the film, directed by Andrew Davis, the rough sea is more than just an effect; it is the main antagonist in the live-action movie about a team of US Coast Guard rescue swimmers and their daring missions.
"Disney’s challenge to us was that we raise the bar for ocean FX even further than we had before," says William Mesa, founder of Flash Film Works and visual effects supervisor on the project. Mesa is used to getting his feet wet when it comes to digital water; his studio has been creating CG water for the past six years in a number of films, including Deep Blue Sea and Into the Blue. For The Guardian, Mesa was able to build on that experience while working with Davis, with whom he has teamed on all but one of the director’s films.
"From the beginning, Davis wanted us to create storms and water footage by whatever means it was going to take to make the imagery look realistic, like the footage we had from the Coast Guard," Mesa says. Hoping that The Guardian will do for Coast Guard rescue swimmers what Top Gun did for Navy fighter pilots, the US Coast Guard lent the production its support, providing thousands of hours of film footage taken during actual rescues. "From all that footage, we had a fairly good idea of what the Bering Sea, where a lot of the movie takes place, was like under these pretty horrific conditions," Mesa adds.
A Storm Is Brewing
According to Mesa, a significant part of the film takes place on or in the ocean, mainly during stormy conditions. And, almost all of that water in the movie was digitally manipulated or created from scratch. Pixel Magic took charge of exterior, wide boat shots in which real water interacted with a real boat. For those shots, the particle splashes were generated from footage of real elements. Real particles were also added by Furious FX for the scenes filmed from the inside of rescue helicopters, while Digital Dream did a number of shots looking out from inside the helicopters.
(From left to right) The scene in its precomp stage, the RealFlow fluid-simulation results, and the RealFlow water sim integrated into the water that was hand-manipulated with a surfacing technique.
In the majority of scenes, however, it was up to the artists at Flash Film Works to ensure that the real water elements integrated seamlessly with the CG water surfaces. "I didn’t want to go into a movie like this saying that we had to do all of this water using a CG methodology. I wanted to be able to composite many live elements together with CG elements and do a mix that was believable," says Mesa. "Even before we were doing the animatics, we were developing a way to mix elements together to make this water realistic."
Some of the "calmer" water scenes were filmed inside a water tank that could generate up to five-foot waves. The actors were filmed against a 150-foot bluescreen that wrapped around the back of the tank. The artists then extended the water to include a vast sea that would appear behind the actors in the film. "We also took that footage and stretched it to pull up some of the peak-point heights and dropped it down to the lower position," explains Mesa. "So when the actors were in the tank, we were able to make the water look higher and lower, generating larger swells than what were actually there."
Other scenes were shot within a dry environment, requiring all-CG water. This was especially true of the big, wide ocean shots.
In the past, Flash Film Works used procedural methods to accomplish most of its water work. "A lot of the way in which CG water is created requires an action/reaction method of how the water is manipulated, meaning if you want to create more wave heights, you can add more wind factor or lift the swell heights," says Mesa. "But the calculations in doing that and how the computer lays out the sea requires some procedure. It has some duplication of what it will be doing down the line. So, when you look at a computer-generated open ocean, it doesn’t feel organic in what it is doing out there. When you look at real footage, the water is very different all over the place. You might have a large wave on the right of the frame and moments later a large wave on the left of the frame."
Moreover, the director wanted to be able to control what was occurring in the shots; therefore, he did not want to use a procedural methodology for creating the water in The Guardian. And, as Mesa explains, to change the water and make it do all the specific things the director wanted at key moments in time would take a tremendous amount of work if the water were created procedurally. "Up in the Bering Sea there are a lot of cross-swells, with one current going one way and another going in a different direction. They collide into each other and peak-point, with the waves breaking over top the other. That would have been difficult to control procedurally," he says. "So, we faced a big challenge even before this film started—we needed a new way of generating the water aside from a procedural method. We needed a way to control the water at all times."
To hit certain story points, the director needed total control over the water, so the group opted against generating the water procedurally. To this end, Flash Film Works teamed with Mark Stasiuk, who used RealFlow particle technology to help create some of the film’s most turbulent water, as seen in this shot (bottom). Above shows the water and boat interaction, along with the RealFlow water.
Flash Film Works spent more than a year in R&D before coming up with a method of hand-manipulating the water surfacing using a combination of proprietary tools and commercial software. "We mimicked the real water from the Coast Guard footage using, in simplistic terms, multiple projection maps," explains Mesa. With the reference footage as an underlay, the group created surfaces and controllable bump maps of sorts that were more like morph targets that the artists could control. This was done inside NewTek’s LightWave software. As Mesa points out, on one surface alone the crew might have 50 different projection surfaces that were created either by CG methods or by using real-water surfacing (or element surfacing) that was projection-mapped onto the CG surface to create a particular water surface. On top of that, the group would create other projection map surfaces.
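In highly simplified form, the morph-target approach Mesa describes amounts to blending weighted displacement maps onto a base surface, with the weights keyframed by the artist. The sketch below is an illustrative assumption, not the studio's proprietary LightWave tooling; the function name and blending scheme are invented:

```python
import numpy as np

def blend_surfaces(base, targets, weights):
    """Blend morph-target displacement maps onto a base water surface.

    base    : (H, W) array of base surface heights
    targets : list of (H, W) displacement maps (projection surfaces)
    weights : per-target weights, keyframed over time by the artist
    """
    surface = base.copy()
    for disp, w in zip(targets, weights):
        surface += w * disp
    return surface

# Hypothetical example: a broad swell plus high-frequency chop.
base = np.zeros((4, 4))
swell = np.ones((4, 4))        # broad swell target
chop = np.random.rand(4, 4)    # high-frequency chop target
sea = blend_surfaces(base, [swell, chop], [0.5, 0.1])
```

Because each target carries its own animatable weight, an artist can make a wave peak beside the boat at an exact frame, rather than coaxing a procedural system toward that result.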
Next, the artists animated the digital surfaces atop the actual water surfaces so the CG would act and react identically to the water in the footage. "From that point, we were capable of controlling and animating those surfaces exactly the way we wanted to," Mesa says. "So if the director wanted a large wave to come up alongside the boat as the actor was leaning over and performing a particular task, we could control the water process so it would do all the exact things at the right time, especially in relation to the action and reaction of the boat."
Staying on Course
The boat depicted in several scenes was a mechanical rig that operated dry. It was not completely motion-controlled, but it was able to receive data and hit positions at specific points in time. "We could place the camera at a certain spot, and we knew that eight seconds later the boat would be at another position, so we could reposition the camera," Mesa says. This allowed the crew to lay out, in an almost crude way, large-scale camera moves, with the assurance that the boat would be in the proper positions at the proper times. The group then brought the encoded data back to the Flash Film Works facility, where it began creating the digital water to coincide with both those camera moves and how the boat was operating mechanically.
The next big step for the team was making the water react to the boat. To do this, the artists used the encoded data with the camera moves and applied it to a detailed CG model of the boat, built in LightWave. The model, however, was not visible, as it was placed underneath the footage of the mechanical boat. Next, the artists built a CG rig that was virtually identical to the mechanical boat’s rig so it could operate in the same manner. As a result, the crew could animate the CG boat to follow the exact movements of the mechanical vessel.
"It could be placed in our CG water so the CG water could react and be hit and deformed by the CG boat in the proper way," explains Mesa. "When you render out the water without the CG boat in it and put the footage of the [mechanical] rowboat back into the CG water, the rowboat looks like it is reacting to the CG water."
For the really rough water, Flash Film Works called on Mark Stasiuk, co-founder of Fusion CI Studios, who served as the fluid-simulation supervisor. Flash Film Works had been working with a current version of Next Limit Technologies’ RealFlow and was encountering some problems generating particles in fine enough detail. Stasiuk, meanwhile, had been experimenting with an unreleased 64-bit, multi-threaded version of the software that was capable of handling a much higher particle state. "At the same time, we were working with a 64-bit version of LightWave that could also work in a much higher particle state," Mesa notes. "As a result, we developed a new method of being able to transfer RealFlow data into LightWave with our own scripting system so we could work in super-high particle rates that hadn’t been achieved before."
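A data bridge of the kind Mesa describes can be sketched as a simple per-frame particle cache written on the RealFlow side and read back on the LightWave side. The binary layout and function names below are invented for illustration; the studio's actual transfer scripts are proprietary:

```python
import struct

def write_particle_cache(path, frames):
    """Write per-frame particle positions to a simple binary cache.

    frames: list of lists of (x, y, z) tuples, one list per frame.
    Layout: frame count, then per frame a particle count followed
    by packed 32-bit float triples.  (Hypothetical format.)
    """
    with open(path, "wb") as f:
        f.write(struct.pack("<I", len(frames)))
        for pts in frames:
            f.write(struct.pack("<I", len(pts)))
            for x, y, z in pts:
                f.write(struct.pack("<fff", x, y, z))

def read_particle_cache(path):
    """Read the cache back, e.g. from a LightWave-side import script."""
    with open(path, "rb") as f:
        (n_frames,) = struct.unpack("<I", f.read(4))
        frames = []
        for _ in range(n_frames):
            (n,) = struct.unpack("<I", f.read(4))
            frames.append([struct.unpack("<fff", f.read(12)) for _ in range(n)])
        return frames
```

A flat binary stream like this scales to very high particle counts because each particle costs a fixed 12 bytes and no parsing beyond `struct.unpack`.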
Stasiuk, who performed similar work on Poseidon, has collaborated with a number of other effects studios on CG fluid projects. When he came aboard The Guardian, he brought along a number of optimization procedures, algorithms, simulation methods, and rendering ideas he had developed over the past few years that he could extend and/or customize as needed for this project (see "Go with the Flow," pg. 34). Yet, the fact that many of the shots had to achieve particular story elements posed a challenge for Stasiuk, just as it did for Mesa. "This requirement made for beautiful, meaningful visuals, but it also meant that we had to make the fluid elements behave within an existing, ‘near-finaled’ 3D environment," Stasiuk says.
To this end, Stasiuk devised a collision method using RealFlow 4 and particle dynamics that would react to the water surfacing methodology that Flash Film Works had created for the other rough seas in the film. "When we wanted a lot more particle dynamics to happen at a specific time and place, we could manipulate the water to make different movements," Mesa points out. "This was one of the big advantages to having our own method for doing water, as opposed to using a procedural method." In these 21 extra-turbulent shots, the artists manipulated the CG water surfacing to achieve greater or quicker movements so the particle dynamics would respond faster and more forcefully, depending on what was needed.
"Even though the particle dynamics were used only on 21 shots, they were some of the most difficult ones in the film," notes Mesa.
Getting physical simulations to behave in a very specific way is never easy. In those shots, Stasiuk’s job was to create simulated water elements that would interact with 3D objects such as helicopters and boats. In many of them, a CG boat moves through a polygonal, deforming ocean. Neither element was dynamically driven or simulated, so there was no true fluid interaction. Thus, the RealFlow team had to create the elements to sell the physicality of the shots, "to make the boat look like it was actually touching that ocean," he says.
To accomplish this, Stasiuk created a variety of procedural Python scripts for specific natural phenomena, and deployed those as production tools that would, for instance, maintain stability, create water particles at the waterline of the boats, and generate breaking waves. He also crafted tools for generating additional passes such as spray and mist from the fluid elements. "This involved developing the methods and scripts, generating high-quality hero elements for a particular shot, and training artists to use the tools so they could generate elements in other shots," he says. "They could take 3D elements such as boats and ocean surfaces, and quickly simulate interactive effects like realistic whitecaps and boat wakes, providing them as rendered elements for the compositors."
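One of those tasks, seeding spray particles at a boat's waterline, can be sketched as follows. Everything here is an illustrative assumption standing in for the production's custom RealFlow Python scripts; the function, its parameters, and the jitter values are invented:

```python
import random

def emit_waterline_particles(hull_pts, ocean_height, band=0.05, rate=4):
    """Emit spray particles along a boat's waterline.

    hull_pts     : (x, y, z) points sampled on the hull surface
    ocean_height : callable (x, z) -> y giving the deforming ocean height
    band         : vertical tolerance defining "at the waterline"
    rate         : particles seeded per qualifying hull point

    Hull points within `band` of the ocean surface seed `rate`
    particles each, jittered slightly so the spray reads as organic.
    """
    particles = []
    for x, y, z in hull_pts:
        h = ocean_height(x, z)
        if abs(y - h) < band:
            for _ in range(rate):
                particles.append((
                    x + random.uniform(-0.02, 0.02),
                    h + random.uniform(0.0, 0.05),
                    z + random.uniform(-0.02, 0.02),
                ))
    return particles
```

Run per frame against the animated hull and ocean, a rule like this keeps the spray pinned to the moving intersection even though neither the boat nor the ocean is truly simulated.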
In this dramatic all-CG scene, 3D objects aboard a digital cargo ship spill into the sea as waves crash against the boat. In all, the VFX crew had to deal with particle dynamics on nearly 20 different locations on the ship model, a task that had to be done in pieces.
This technology allowed Mesa to order up complex, large-scale fluid elements consisting of millions of particles that precisely fit the director’s requirements. It also resulted in rich, photorealistic fluid behaviors. In one scene, a boat is sinking and the actors are being pulled off the back by rescuers. In addition to the mechanical motion, the CG motion, and the CG water reaction generated by Flash Film Works, the crew also needed water displacement that would shoot out of the boat as well as waves that would bang against the sides of the boat. "A lot of water would be displaced coming off the boat as it began to submerge," notes Mesa. And that was done using Stasiuk’s particle system.
In another exciting 18-second all-CG shot, a distressed cargo ship is tossed about during a storm as its load—cars and debris—is spilled into the sea. Again, the same water surfacing methodology from Flash Film Works, supplemented by Stasiuk’s particle technology, was used to achieve the enormous splashes that flood the front of the ship. "The particle dynamics reacted to the boat in much the same way as the CG water surfacing interacted with the CG boat," Mesa notes.
In those shots and others, the artists still had to tackle the dynamics work in segments, despite the ability to generate super-high particle rates. In the cargo boat scene, the shot called for a giant splash to slam across the front of the boat, water to pour off the vessel, and waves to hit against the side as the ship teetered in the water. As a result, there were 15 to 20 different places on the boat where the artists had to deal with the dynamics. "We had five different particle dynamics for just that part and another four or five running across the bottom of the water, and four or five more for the water pouring off the boat itself—in addition to what is going on behind the boat," explains Mesa. "Behind the boat were big waves that were hitting the cargo. There were so many dynamic renders for that one shot because we had to combine a number of different dynamics together to create a specific wave splash just to get the particle count fine enough."
What this meant, says Mesa, is that each of those 15 to 20 dynamic locations had five rendered versions. The fluid, mist, spray, and other passes were simulated over a day or so for most elements, then rendered as self-shadowing particles over a few hours inside LightWave through an optimized pipeline. According to Stasiuk, the biggest hurdle the group faced was rendering the large quantities of data. Typical hypervoxel renders in LightWave were not meant for such large particle numbers; thus, the process was taking too long. Instead, the team opted for sprite-type rendering for the largest particle counts.
In LightWave, though, that was not a straightforward process. "However, under the direction of Dan Novy and Jen Hachigian, we were able to use the Python scripting to develop a custom export of the data as LightWave partigon objects. This meant the fluid particles could be rendered as single polygons," says Stasiuk. "It was much faster than other alternatives, and it looked great."
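The partigon trick boils down to representing each fluid particle as one vertex plus a one-vertex polygon referencing it. The in-memory mesh below is a stand-in for the actual LWO export, which the production handled with its own scripts; the function name is invented:

```python
def particles_to_partigons(particles):
    """Represent fluid particles as single-point polygons ("partigons").

    Each particle becomes one vertex and a one-vertex polygon
    referencing it.  A renderer can shade millions of such polygons
    far faster than volumetric hypervoxels, at the cost of the
    soft, blobby look hypervoxels provide.
    """
    vertices = list(particles)           # one vertex per particle
    polygons = [(i,) for i in range(len(vertices))]
    return vertices, polygons
```

Exporting one such object per frame turns a heavyweight volumetric render into a fast geometry render, which matches Stasiuk's "much faster than other alternatives" description.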
To composite water elements and other imagery into the scenes, the group used Eyeon Software’s Digital Fusion, utilizing the Z-buffer space within the composite; with 3D camera moves in the shots, the layers also had to conform to those moves in the composite. For tracking other shots, the group used Apple’s Shake.
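The core of a Z-buffer merge is simple: at each pixel, the element closer to camera wins. This minimal numpy sketch illustrates the idea behind such a merge in a compositor; it is not Digital Fusion's implementation:

```python
import numpy as np

def z_composite(color_a, depth_a, color_b, depth_b):
    """Depth-based merge of two rendered elements.

    color_* : (H, W, 3) color buffers
    depth_* : (H, W) Z buffers (smaller = closer to camera)

    Per pixel, the nearer element's color wins, so CG water layers
    and other elements interleave correctly as the camera moves.
    """
    nearer = depth_a <= depth_b
    color = np.where(nearer[..., None], color_a, color_b)
    depth = np.minimum(depth_a, depth_b)   # merged Z for further comps
    return color, depth
```

Carrying the merged depth forward lets further layers, such as spray passes, be slotted in at the right distance without re-rendering what is already composited.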
In Deep Water
According to Mesa, the big difference between the water requirements in this film and those in other movies is the vast amount featured in The Guardian. Also, the viewpoint for this movie is often at water level. "In The Perfect Storm, there is a little of that, but not much," he says. "In this movie, there are many shots of the swimmers in these giant, stormy seas; the audience gets to witness these horrific conditions from the swimmers’ perspective."
Flash Film Works spent more than one year in R&D before devising a method of hand-manipulating the water surfaces to control the majority of the fluid in the film. The studio accomplished this task using proprietary and commercial tools.
And, like the water work in The Perfect Storm and Poseidon, the CG fluid simulation in this movie sets a new precedent for what can be done within the digital realm by opening different avenues for the creation of virtual water. "Anytime you are doing a procedural method, you need to think out the process—make calculations—and that utilizes computer power," says Mesa. With this method, all the computer has to do is render out the surfacing or positions the group tells it to create, which results in a huge leap forward in speed, time, and changeability. And after the render, if the artists want to change something, the shot can be quickly re-rendered without the computer having to recalculate the whole process again. Since the water was laid out in a morph-target fashion, the artists simply animated it as they would any other animation.
"So many people were worried that we had to render out all this CG water. And, I remember back when we did Deep Blue Sea; some shots took a week to render the water," says Mesa. "For this film, we rendered out the general surfacing in multiple layers, about five on average. We didn’t try to render all the CG water with the lights reflecting on it. Things like that were rendered out separately and composited together. If we didn’t like it, we could control it in the composite."
In the film, the Coast Guard rescue swimmers have to battle the most horrendous storms conjured up by Mother Nature. Similarly, the digital effects artists had to control the natural phenomenon that they themselves had created. In the end, both triumphed.
Karen Moltenbrey is the chief editor for Computer Graphics World.
Go With The Flow
Mark Stasiuk, co-founder of Fusion CI Studios, holds a PhD in fluid mechanics. He and his producer/director partner, Lauren Millar, have worked with software developer Next Limit Technologies on its R&D efforts for the past three years while using the company’s commercial RealFlow software in features and commercials. Most recently, Stasiuk assisted Flash Film Works in creating millions of particles that augmented the studio’s digital surfacing technique for generating turbulent water in The Guardian. In a Q&A with CGW chief editor Karen Moltenbrey, Stasiuk discusses the work he did for the film.
What was your task/role on the film?
I worked in-house with Flash Film Works as the fluid simulation supervisor. My main role was to supervise and teach a group of eight RealFlow artists to work with Dan Novy (the technical supervisor) on pipeline issues, to develop custom production tools, and to communicate with VFX supervisor William Mesa and various CG artists and compositors involved in shots requiring CG fluid effects.
Have you done similar work in the past?
I did similar work on Poseidon, working within CIS Hollywood, and have also consulted with several studios on CG fluid projects. For many of our clients, I have acted as an advisor for their workflow and pipeline, done troubleshooting, and provided fast turnaround custom scripts to accelerate their simulations or eliminate problems.
How did you apply that knowledge to The Guardian?
I came into this project with an accumulated library of optimization procedures, algorithms, simulation methods, and rendering ideas that I had already developed. I then extended many of those to a more advanced state, or customized them for this project’s particular needs. I was also familiar with the issues that new RealFlow users face, so was able to find ways to get junior artists productive faster.
Is your specialty solely in fluid sim?
Really it’s in general dynamics, including general particle FX, with fluid simulation being a special (and especially difficult) area of dynamics. We work on problems involving rigid bodies, soft bodies, dust, smoke, plasma, fire, explosions, etc., including interactions between these different things. In addition, we provide help with render pipelines related directly to the FX elements we create.
What can you provide that a VFX studio cannot do on its own?
Typically, small to midsize VFX studios can’t easily maintain a high level of expertise in these kinds of effects over the long term; it’s just not what they are focused on from day to day, so it becomes expensive and unwieldy. And, certainly, they have difficulty maintaining a highly specialized line of R&D. We bring along years of accumulated R&D, a rare ability to quickly perform robust, new R&D, plus strong, specialized support from Next Limit Technologies, with whom we have an exclusive collaborative relationship.
Exactly what does that R&D entail?
Our body of R&D includes everything from ways to make simulations run two or three times faster than they would otherwise, to stability methods, to custom force fields for achieving certain behaviors. In addition, we have the hardware and a group of artists who we’ve trained to get shots done fast. So we can quickly turn around complex effects with very little ramp-up time. Studios can struggle for months to get to the point where they are productive with complex fluid FX, whereas we can turn around useful iterations in a matter of weeks. It’s just because that’s what we’re focused on and experienced with.
Why was this task especially challenging?
The Guardian was challenging for two reasons. First, a significant amount of R&D had to be done during production, just because of the timeline and the evolving needs of the production. Second, many of the scene elements were non-dynamic and strongly art-directed, because the shots needed to achieve very particular story elements.
Why couldn’t you use out-of-the-box fluid technology?
For a minority of elements, we did just that. But for many of the elements, the needs were very particular and art-directed. Simulations generally do cool and realistic things, but sometimes that’s not what the director needs. We needed extra control. Plus, for a number of effects, there just aren’t pre-fab tools available in RealFlow—for example, there are none for generating realistic splashes around the intersection of any two polygonal objects. We also needed to keep the simulation times manageable and stable.
How did RealFlow enable you to accomplish your goals?
RealFlow is a relatively fast and stable dynamics solver with a lot of flexibility built into it now that the product incorporates scripting. It provided the foundation for us. The fact that the software provides a lot of potential for custom control via scripting within a well-developed UI makes it ideal for this kind of work. In addition, RealFlow’s new 64-bit version allowed us to access massive amounts of RAM and, therefore, run much-larger-scale simulations.
Did you use any other software or hardware?
We used LightWave 9, and made use of both 32-bit and 64-bit AMD Opteron systems. The 64-bit systems had up to 16GB of RAM to deal with the large number of particles and polygons in the simulations.
Which features/functions did you use, and for what end?
We used a combination of the built-in tools (fluid particles, force fields, fully coupled fluid-object interaction) plus the Python scripting capability to do custom tasks, like tailor-made force fields.
How long did you work on this project?
Fusion’s part in the project lasted about four months; the solution to the rendering issue was developed over just a few weeks early in the production work, which was early enough that it was really solved before we got into the peak of the simulation work.
Are there any other points about your work that are worth mentioning?
For me, a highlight of this work was that a relatively small group of talented VFX artists, without massive resources or months and months of pre-production R&D, could deliver advanced CG fluid effects. That speaks to the quality of the tools (RealFlow and LightWave), as well as to the skill of the artists involved and the support at critical moments from the software developers.
At any time did you have to dial back the effects?
One of our favorite custom simulation tools was a script I created to magnify forces to get bigger splashes—we dubbed it the "cowbell force." We’d often get requests for more cowbell. But by the end, in a lot of cases, we had dialed in too much cowbell. It was nice to hear, ‘OK, less cowbell, please’ from the supervisor.
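The cowbell idea, magnifying the forces acting on the fluid so a splash reads bigger on camera, might look something like this sketch. The real tool was a custom RealFlow script; this version, including the `lift` term and all names, is purely illustrative:

```python
def cowbell_force(forces, gain=2.0, lift=0.5):
    """Magnify per-particle forces to exaggerate a splash.

    forces : list of (fx, fy, fz) force vectors, one per particle
    gain   : overall multiplier; gain > 1 is "more cowbell"
    lift   : extra upward push so the splash reads taller on camera

    Dialing gain back toward 1.0 and lift toward 0.0 is
    "less cowbell."
    """
    return [(fx * gain, fy * gain + lift, fz * gain)
            for fx, fy, fz in forces]
```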
What’s next for you?
We’re focusing on providing fluid effects elements for features now, and working less in-house with other studios. And we’re currently involved in a few projects requiring large-scale, non-water-type fluid effects and continuing to work with Next Limit Technologies to ramp up to RF5 in the process. Watch out for some amazingly violent and yet highly ‘directable’ particle effects coming soon.