Fox/Blue Sky bring a flood of innovation to Ice Age: The Meltdown
In the 2002 prehistoric animated adventure Ice Age, a melancholic mammoth (Manny), a klutzy sloth (Sid), and a stoic saber-toothed tiger (Diego) form an uneasy bond to return a human child to his family. The story of the mismatched trio venturing across the frozen tundra warmed the hearts of filmgoers across the globe, making $382 million worldwide, earning an Oscar nomination for best animated feature film, and setting the stage for the climate-changing follow-up, Ice Age: The Meltdown. Making $70.5 million during its opening weekend, the CG sequel was another mammoth success for Fox’s Blue Sky Studios, eclipsing the original Ice Age’s weekend take by almost $25 million and becoming the year’s first box-office bonanza.
As the story opens, the animals are reveling in the virtual water park that global warming has made of their once-frozen habitat, frolicking on water slides and ignoring the apocalyptic warnings of Fast Tony, a con-artist turtle voiced by Jay Leno. But Manny, Sid, and Diego soon learn that Fast Tony’s dire forecast is about to come true. The towering glacial cliffs that loom over their home, holding back the melting ocean, are about to burst and flood the valley, drowning all the creatures in it. Their only hope is to journey to the other end of the valley and escape on a primitive ark.
Images TM & © 2006 Twentieth Century Fox.
From above and below, their trek is fraught with perils and predators. Bursting geysers, teetering rock formations, sharp-toothed amphibians attacking through cracks in the ice, and a flock of hungry-eyed vultures who break into a Busby Berkeley rendition of “Food, Glorious Food” are but a few of the obstacles in this slalom course.
Along the way, they meet two manic possums, Crash and Eddie, and their “sister” Ellie, a mammoth who believes she’s a possum. Still pining for kinship with a fellow mammoth, Manny finds hope in Ellie, and begins a fumbling courtship. Meanwhile, Sid’s desire for self-worth is kindled by a race of miniature sloths who worship him like a god, until, of course, they try to sacrifice him in a pit of lava. Sid also peels away Diego’s false bravado and forces him to confront his fear of swimming. And, finally, the beleaguered Scrat continues his quest for the ever-elusive acorn in wildly inventive comic interludes. The simplest one involves a botched pole-vaulting attempt using a pole that’s a bit too short to reach the far edge of a deep crevice. You can imagine what happens.
If the stark icescape of the first film reflected Blue Sky’s initial trepidation at embarking upon a feature film, then the verdant playgrounds, gushing floodwaters, ultra-realistic fur, and elastic character animation of the sequel are a sure sign that those fears have been washed away.
“We’re a lot more confident now,” says lead animator Dave Torres. “But it still wouldn’t be any fun if we weren’t constantly pushing ourselves technically and artistically. And on this film, we had huge hurdles to overcome, specifically in the form of complex water and fur simulation.” To clear those hurdles, Ice Age: The Meltdown would require new tools not only for water and fur simulation, but for simulating froth and splashes as well. Character animators also set a goal of pushing smear frames, squash and stretch, follow-through, and overlapping action to extremes, which in turn required new meshes and rigs for all the characters to handle the extremes of motion.
Because only a handful of the sequel’s 60-plus animators had worked on the original Ice Age, Blue Sky conducted seminars at the start of production to establish guidelines for the animation of each character, giving examples of expressions and showing what director Carlos Saldanha did or did not want. The animators also created internal Web pages geared to each character, and spent a day of team building at the Bronx Zoo, studying tigers and elephants.
Led by lead modeler Mike Defeo, artists discarded the NURBS meshes used for the first film and resurfaced all the characters using subdivision surfaces in Autodesk’s Maya. “Except for the eyes and teeth, all the characters were a single-surface sub-D mesh,” says Torres. “All the problems we had on the first Ice Age, with tears and seams around T-junctions and the convergence of [five or more] surfaces, were gone.” Aesthetically, the characters retained their original design, except for Diego, whose eyes are now more cat-like.
Thanks to a new volumetric fur tool called Fur Follow Through, the long manes and woolly coats of Manny and Ellie sport millions of hairs that respond to wind, inertia, and gravity.
Working primarily in Maya, riggers outfitted each of the main characters with more than 800 controls. These included forward kinematic/inverse kinematic (FK/IK) handles, Maya blend shapes, Maya Set Driven Keys, corrective blend shapes, and simple deformers such as lattices for flattening out a piece of geometry, creating impacts, or adding a hint of squash and stretch to the beaks of Fast Tony or the vultures. Instead of using Maya’s Sculpt Deformers for the dynamic animation of wobbling bellies, jostling fat, or bulging muscle, Blue Sky uses a proprietary tool called Follow Through. While not a fully dynamic solution, Follow Through is a joint-based system that uses the “gross” motion of a piece of geometry to calculate overlapping, follow-through, or other secondary motions that would be tedious for an animator to keyframe. Follow Through is most clearly visible on the more gratuitously cartoony characters such as Scrat, specifically on his cheeks, ears, belly, and the spline animation of his tail.
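The core idea of a joint-based follow-through pass can be sketched in a few lines of Python. This is purely an illustrative stand-in, not Blue Sky’s code; `drag` and `inertia` are hypothetical names for the kinds of controls described. A secondary value chases the animator’s keyframed “gross” motion while damped velocity carries it past the target, yielding overlap, overshoot, and settle without hand-keying:

```python
def follow_through(gross, drag=0.85, inertia=0.3):
    """Derive overlapping, settling secondary motion from keyframed
    "gross" motion (illustrative sketch; parameter names invented)."""
    out = []
    pos, vel = gross[0], 0.0
    for target in gross:
        vel += (target - pos) * inertia  # chase the animator's motion
        vel *= drag                      # damping so the motion settles
        pos += vel
        out.append(pos)
    return out
```

Run along a joint chain from root to tip, with slightly different parameters per joint, a pass like this would produce whip-like motion of the kind described for Scrat’s tail, cheeks, and belly.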
With multiple animators often working on the same character for any given shot, animators used another tool, called Pose Tool Box, to access a wide variety of recorded poses so they could seamlessly blend with one another’s animations. “Pose Tool Box lets us record hundreds of physical expressions, such as sad, angry, mad, happy, and so forth,” says Torres. “When an animator creates a new pose, we can store all or any part of it, from the face to various parts of the body. Then, we can use them as ‘hookups’ between shots. For example, when there’s a cut on action, on the last frame an animator can snap to a pose from the Pose Tool Box, so that the next animator knows where to begin his or her animation.”
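A pose library of this kind reduces to named snapshots of control values. The sketch below is only a guess at the general shape (the rig is flattened to a dict of control names to values; Blue Sky’s actual tool operates on Maya rigs):

```python
class PoseToolBox:
    """Store named poses and snap them back onto a rig.

    A rig here is just a dict of control -> value. The `controls`
    argument lets an animator store only part of a pose (the face,
    say), as the article describes.
    """
    def __init__(self):
        self.poses = {}

    def store(self, name, rig, controls=None):
        keys = controls if controls is not None else rig.keys()
        self.poses[name] = {k: rig[k] for k in keys}

    def snap(self, name, rig):
        # Overwrite only the controls the stored pose contains, so a
        # partial (e.g. face-only) pose leaves the body untouched.
        rig.update(self.poses[name])
```

On a cut on action, the outgoing animator would store the last-frame pose and the incoming animator would snap to it before keying forward.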
Cataloging poses became doubly important when character animators challenged one another to a smear-frame competition. Trying to outdo one another with squash and stretch, animators created wildly exaggerated poses that, without the Pose Tool Box, could have created a consistency nightmare for multiple animators working on the same character. To create these exaggerations, the animators manipulated special squash-and-stretch nodes placed by riggers at the ends of the joints. “We could extend the nodes, and it would simulate squash and stretch, preserving volume, thinning out the geometry with extension, or fattening it with compression. We would scale, rotate, and translate the joints, and essentially try to break the rig,” notes Torres. “We intend to push it even further on our next film, Horton Hears A Who!” Choosing a winner of the smear-frame contest, Torres points to a scene in which Crash and Eddie are logrolling down a hill. Sucked underneath, they’re flattened and stretched in true Chuck Jones fashion.
For the many still lakes, ponds, and puddles interspersed across the melting landscape, Blue Sky used Next Limit’s Real Flow and displacement shaders.
Animators also took smear frames to extremes on Sid. In one shot, in which the sloth exclaims prematurely, “We’re gonna live,” only to realize otherwise, and says, “We’re gonna die,” Torres says the process was pushed to the point where it didn’t look like Sid. “But it works for the mood,” he says. “So, while we tried to exaggerate as much as possible, we didn’t want to violate the look of the characters from the previous film or, worse, distort the true personality of the character.” To that end, animators did not exaggerate heavily on Diego, whose withdrawn nature demanded a subtler approach.
Ellie’s comic delusions about being a possum also demanded a highly nuanced performance, one that was broad enough to capture the humor of her delusion but restrained enough to show that those delusions stemmed not from stupidity, but from psychological need. “We didn’t want her to come across as dumb,” states Torres. “She lost her family when she was young and found a surrogate family in two possums. So we wanted to show she was brassy, smart, and caring, and that her confusion, much like Tarzan’s, comes from the way she was raised.” Similarly, because Manny was the “heart” of the film, animators abstained from giving him wide-eyed, cartoonish expressions that would undercut his emotional weight. To capture the romantic subtext in his interactions with Ellie, animators used eye darting, stuttering, stammering, and eye-contact avoidance.
Artists built all the sets and props prior to animation. While layout artists established most of the blocking, the animators did much of the camera work themselves, especially when their performances exceeded the scope of a shot. In fact, the animators had more freedom than ever before in staging their scenes. Only a few sequences were prevized, among them Whack-a-Mole and Balance. In the latter, all the characters find themselves on top of a rock that’s teetering precariously on a stack of other rocks in the middle of a giant canyon. “The choreography of the characters was extremely intricate and interconnected,” says Torres. “Previzing it was a huge collaboration among animators, previz, layout, and modeling.”
Possums Crash and Eddie were squashed and stretched to extremes using Blue Sky’s proprietary Follow Through tool.
Meanwhile, in Whack-a-Mole, Diego and Sid are twisted into knots while trying to snatch Crash and Eddie, who keep popping in and out of holes in the ground. “Previz artists put placeholders for where the holes would be, judging how far apart they had to be so that Sid and Diego could reach from one to the next without being too far from Crash and Eddie,” says Torres. Using this rough choreography, modelers built the ground plane with the holes in the correct positions, and layout artists created the appropriate camera movements, all of which was then sent to animation.
Thanks to a new, fully voxelized, volumetric fur tool, all the animals in Ice Age: The Meltdown sport a new coat of fur, each bearing millions of hairs that respond to wind, gravity, inertia, and turbulence. In the first film, Blue Sky employed image cards, each bearing an image of one to three hair strands. Though alpha maps and transparency gave them a sense of dimension, their motion, even in windstorms, was caused by a jiggling of the cards and appeared somewhat stiff. In contrast, the new fur system procedurally draws millions of B splines on each character. Each point on the spline carries information about its position in space, along with color, length, density, transparency, and other attributes. From these millions of hairs, a couple thousand are selected to be rig hairs, which are attached to the animator’s character rig in Maya and drive the animation of the other hairs.
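One plausible way a sparse set of rig hairs can drive millions of rendered hairs is inverse-distance weighting: each rendered hair borrows the motion of the rig hairs nearest its root. The article does not specify Blue Sky’s blending scheme, so the function below is purely illustrative, with positions reduced to one dimension for brevity:

```python
def drive_render_hair(root, rig):
    """Offset a rendered hair by inverse-distance-weighted rig-hair
    motion (hypothetical scheme; the article only says rig hairs
    "drive" the rest).

    root: 1D root position of the rendered hair
    rig:  list of (root_position, tip_offset) pairs for the animated
          rig hairs
    """
    eps = 1e-6  # avoid division by zero at a rig hair's own root
    weights = [1.0 / (abs(root - r) + eps) for r, _ in rig]
    total = sum(weights)
    return sum(w * off for w, (_, off) in zip(weights, rig)) / total
```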
To simulate motion dynamics on these “rig hairs,” technical director Adam Burr wrote a tool called Fur Follow Through, which adapts the Follow Through tool for fur animation. Using the tool, the animators could adjust the hair’s drag, inertia, cycle, and settle time, as well as assign force vectors for wind, gravity, turbulence, and other environmental influences. “You can see the breezes running through the hair now. And, when characters move forward, the hair is drawn backward,” explains Torres. In addition, Fur Follow Through can recognize when the fur is partially immersed in water, automatically selecting its Underwater Follow Through so that it appears to flow with the current. However, the fur is still not fully interactive. If, for example, a character puts its hand to its chest, the fur would penetrate the hand rather than compress under it.
When Sid’s “rig hairs” intersect with a water body, the hairs automatically assume their wet look, reflecting changes in density as they absorb water.
Indeed, because the animators could only see the rig hairs during animation, their biggest gripe during production was fur intersection. To resolve that problem, the team fed all the Maya animation into its Grinder system, which translated it into scripts for CGI Studio, its proprietary raytracer. When the rendering was complete, it was the job of the technical animators to scrutinize each frame for fur penetrations and then notify the character animators. “Often, they’d see the fur go right through the hand, and we’d have to go back and alter the animation to bring the hand out of the fur,” says Torres.
With millions of hairs to process and only four months to render the entire film, Blue Sky moved to a 64-bit architecture for Ice Age: The Meltdown, upgrading its renderfarm to 1000 2.4GHz processors, increasing its storage capacity to 40TB, and installing new Angstrom workstations running dual 2.4GHz processors and Nvidia-based graphics cards. Under the new system, the average render time per frame was 13 hours. “We used a primitive form of the new hair for the humans in the first Ice Age,” says effects lead Eric Mauer. “However, since CGI Studio is a raytracer, which means all the geometry has to be in the scene at render time, our RAM footprint for the voxel bodies was really prohibitive. Each scene had to be represented in less than 1GB. Through our new architecture and advancements to the fur-voxel rendering made by researcher Maurice Van Swaaij, our RAM footprint for Ice Age: The Meltdown was 6GB.” In addition, Van Swaaij made advancements to CGI Studio that enhanced the motion-blur effect on the fur.
For creating matted, bedraggled hair, the process was twofold. When artists procedurally modeled the fur, they also modeled its wet look, establishing the frequency with which the hairs would clump and the changes in density as they absorb water. Then, during animation, when the rig hairs intersected with a water body, the hair rig accessed the wet or dry fur description and morphed between the two.
“Intersections between water and fur are always a challenge,” says effects lead Kirk Garfield. “Because fur and water are both transparent bodies, you have to boolean one out of the other any time they’re touching. So, within our pipeline, we came up with templates to easily boolean out the fur that was in any other transparent bodies, such as bubbles.”
Blue Sky relied on four primary tools for water simulation: Next Limit’s Real Flow for creating the crashing, folding waves of the dam burst; a proprietary tool developed by researcher Simon Brown called Wave Synth for the intense, choppy waters that flood the valley after the burst in the third act; a customized rig employing Maya Particles for creating splashes; and a proprietary Froth tool developed by Rhett Caulier for saturating characters in a foamy spray.
Once the dam bursts and the main wave passes, the deluge of floodwaters flowing through the entire set, and through which the characters must swim, is the work of Wave Synth. Wave Synth falls into the class of spectrum-based techniques; its fundamental building block is the Gerstner wave, which resembles a sine wave and has long been used in oceanography. Using such variables as wave height and speed, along with simple equations describing how waves move in deep water, Wave Synth sums together many waves.
Artists remodeled all the characters, including the acorn-obsessed Scrat, using subdivision surfaces before rigging them with Maya IK controls. Then, with Blue Sky’s proprietary Follow Through tool, the team added secondary animation, and with the Fur Follow Through tool, animated the hair.
While the same technique has been used for such films as Titanic, Wave Synth differs by not compressing the calculation of thousands of waves using Fast Fourier transform, which tends to compromise the “peakiness” of the waves. Instead, Wave Synth builds a wave spectrum, choosing only the best waves to add together, rather than summing thousands of inferior ones. Simply stated: It’s quality over quantity. “You can add hundreds of waves, but if you choose bad ones, with the wrong wave heights, lengths, and speeds, it won’t look realistic,” says Brown. “So, the spectrum defines the type and amount of waves that will be used to achieve the most realistic simulation.”
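The Gerstner sum itself is compact. The sketch below (1D, deep water, with hand-picked amplitude/wavelength pairs standing in for Wave Synth’s curated spectrum) shows the two ingredients the article leans on: the deep-water dispersion relation ω = √(gk), and the horizontal displacement term that crowds surface points toward crests, producing the “peakiness” an FFT-based approach tends to smooth away:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def gerstner_height(x, t, waves):
    """Sum of 1D Gerstner waves at horizontal sample x, time t.

    waves: list of (amplitude, wavelength) pairs -- a hand-chosen
    "spectrum" in the quality-over-quantity spirit described above.
    Returns (x_displaced, height): Gerstner waves also shift points
    horizontally, which sharpens the crests.
    """
    xd, h = x, 0.0
    for amp, wavelength in waves:
        k = 2.0 * math.pi / wavelength  # wavenumber
        w = math.sqrt(G * k)            # deep-water dispersion relation
        phase = k * x - w * t
        xd -= amp * math.sin(phase)     # horizontal crowding at crests
        h += amp * math.cos(phase)      # vertical displacement
    return xd, h
```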
Wave Synth’s ability to regulate a wave’s level of detail with the proximity of the camera is one of its greatest advantages. It can produce complex choppy waves close to the camera, but as the waves recede into the horizon, it will only calculate what is necessary. However, since Wave Synth is tailored for fast-flowing waves that swell and crest violently but do not fold over on themselves, it was not used for the dam burst. For the calmer waters of the many ponds and still lakes visible at the opening of the film, the artists used simple displacement shaders, bump maps, and noise patterns. For any light disturbances of those waters, they used Real Flow.
Because the characters are constantly thrashing about in water-especially when Wave Synth was used during the deluge in the third act-the effects animators led the character animators by providing an animated NURBS patch showing the troughs and crests of the waterline, so they could choreograph their character animation with it in Maya. On the other hand, when the water simulation was done in Real Flow, effects animators received the character animation first, and then ran the simulation to match it.
Blue Sky’s CGI Studio raytraced the entire film in only four months, including the millions of hairs and complex water reflections.
For splashing and spraying water, the artists used a patch-based Splash rig employing Maya particles. “The rig would allow us to pose a NURBS patch representing the water surface, emit particles off it, then turn on dynamics and let gravity take over,” explains Garfield. “We could put splashes everywhere without breaking the budget. Sometimes, we’d use our own in-house mesher to fatten the droplets; in other cases, we would just render it as spray.” While effects artists could make the splash particles interact with the separately simulated water surface, they found that it was rarely necessary.
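The pose-then-release workflow Garfield describes boils down to ballistic particles: give each droplet an initial velocity off the posed surface, then integrate under gravity alone. A minimal stand-in (emission speeds, spreads, and the fixed seed are invented numbers, not production values):

```python
import random

def splash(emit_points, speed=3.0, g=9.81, dt=1/24, frames=24, seed=1):
    """Ballistic splash particles: emit upward off a posed water
    surface, then "turn on dynamics and let gravity take over".
    Returns per-frame lists of (x, y) particle positions."""
    rng = random.Random(seed)
    parts = []
    for (x, y) in emit_points:
        vx = rng.uniform(-0.5, 0.5)          # slight sideways spray
        vy = speed * rng.uniform(0.8, 1.2)   # varied upward launch
        parts.append([x, y, vx, vy])
    traj = []
    for _ in range(frames):
        for p in parts:
            p[3] -= g * dt   # gravity is the only force
            p[0] += p[2] * dt
            p[1] += p[3] * dt
        traj.append([(p[0], p[1]) for p in parts])
    return traj
```

The resulting points could then be meshed into fat droplets or rendered directly as spray, as the article notes.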
To envelop characters in froth and foam when they’re close to a large body of water, researcher Rhett Caulier developed a proprietary froth simulator that worked with Wave Synth. Using a point-cloud system to represent the water surface, the characters, and the environments, the tool calculated the intersections between the geometry, and then procedurally generated the froth particles that aerated off the water surface, clouding over the characters. Constrained to the water surface, the froth particles became part of the main visual cue indicating the direction and the speed of the current, and used a laminar flow-collision model to swirl around objects in their path. “It’s a collection of scripts and C++ code that runs entirely within our proprietary system. So, only after all the data from Maya was fed into our Grinder could we apply froth to it,” says Caulier.
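The point-cloud intersection step can be sketched as a brute-force proximity test between the water-surface and character clouds, seeding froth wherever the two come within a small radius. This is only the seeding stage; the real tool then advects the froth with the current and its laminar flow-collision model, which is omitted here:

```python
def seed_froth(water_pts, char_pts, radius=0.25):
    """Seed froth particles where a character's point cloud comes
    within `radius` of the water-surface point cloud (illustrative
    brute-force stand-in for the intersection test)."""
    r2 = radius * radius
    froth = []
    for wx, wy, wz in water_pts:
        for cx, cy, cz in char_pts:
            if (wx - cx)**2 + (wy - cy)**2 + (wz - cz)**2 <= r2:
                froth.append((wx, wy, wz))  # froth stays on the surface
                break
    return froth
```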
Having abandoned texture maps on Robots, Blue Sky continues to use a proprietary procedural method for texturing, which layers materials made of various noise functions (see “Mech Believe,” March 2005, pg. 22). To create the gauntlet of bursting geysers that the characters must cross, artists used CGI Studio’s Smog tool, which simulates smoke, steam, clouds, and other aerosols by defining an isosurface within which light is scattered and absorbed.
Finally, for Ellie’s dream-like flashback to the loss of her family, artists created the snow in the scene using Maya particles, instancing them with larger spheres or ice chunks when the snowfall thickens. To create the footprints in the snow, the effects team used Z-depth maps taken from an orthographic camera to produce a set of displacement maps in Apple’s Shake. The beautiful deep vistas of the tundra and the backdrops for the valley scenes were painted by former Disney artists whom Blue Sky Studios hired after Disney closed its Orlando studio. Lastly, artists generated the grass in the valley using the new fur tool, animating the millions of blades with Fur Follow Through.
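The footprint trick reduces to a per-texel minimum: recover foot heights from the top-down orthographic Z-depth map, then clamp the snow heightfield wherever a foot dips below it. A minimal sketch, assuming the depth map has already been converted to heights (the real pipeline produced the displacement maps in Shake):

```python
def stamp_footprints(snow, foot_height):
    """Carve footprints into a 2D snow heightfield.

    snow:        2D list of snow heights
    foot_height: per-texel height of the underside of the feet,
                 derived from an orthographic Z-depth map
                 (height = camera_height - depth). Where a foot
                 presses below the surface, snow is displaced down.
    """
    return [[min(s, f) for s, f in zip(srow, frow)]
            for srow, frow in zip(snow, foot_height)]
```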
For Blue Sky Studios, Ice Age: The Meltdown entered theaters leaving a flood of innovation in its wake. In fact, the studio is currently preparing sketches on Wave Synth, the froth simulator, and its splash rig for this year’s SIGGRAPH. Torres emphasizes the importance of making each film a learning experience, an opportunity to grow technically and artistically.
“Everything we do, we learn from,” Torres says. “With each film, we search for better ways of working and better tools, and we are constantly developing things to help our workflow, not only to make us faster, but to make our jobs easier.”
Martin McEachern, a contributing editor for Computer Graphics World, can be reached at firstname.lastname@example.org.
Computer Graphics World, April 2006