After Effects
Volume 27, Issue 5 (May 2004)


The tremendous critical and financial success of Shrek, which received the first Oscar for Best Animated Feature and earned more than $480 million in box-office revenues, made a sequel inevitable. But when the green ogre married beautiful Princess Fiona, the fractured fairy tale had its perfect ending. So, what could the plot masters for Shrek 2 do?

"We take up where the story left off," says Andrew Adamson, writer-director of PDI/DreamWorks's Shrek 2, which opens May 21. "We examine what happens 'happily ever after.'" For the newlywed ogres, the honeymoon ends when Shrek (Mike Myers) and Fiona (Cameron Diaz) receive an invitation to the Kingdom of Far, Far Away from Fiona's parents, King Harold (John Cleese) and Queen Lillian (Julie Andrews).
Shrek, Fiona, and Donkey's journey to the Kingdom of Far, Far Away sets this fractured fairy tale in a land ripe for parody, one that demanded new, art-directed tools for handling crowds of townspeople and new leading characters.




"It's what we all go through," says Adamson. "You marry the whole family. The parents expected Fiona to come back with her Prince Charming, but the Prince was too late, and now they have Shrek. The mistake a lot of people make is trying to find a fairy-tale conceit and a modern equivalent. We find real conceits and then find the fairy-tale equivalent."

One of those equivalents is the land of Far, Far Away itself, which looks like a fairy-tale version of Rodeo Drive and Hollywood. "A large part of what we do in Hollywood is manufacture contemporary fairy tales, so it seemed an appropriate place to target," says Adamson. "We found a tone last time that we could play with and expand this time. There's more satire and more parody. Because we can have more interrelations with characters, we can have a more complex story."

In Shrek, many of the scenes had, at most, Fiona, Shrek, and Donkey (Eddie Murphy). In Shrek 2, Fiona's parents, her Fairy Godmother (Jennifer Saunders), Prince Charming (Rupert Everett), and Puss In Boots (Antonio Banderas) all have leading roles as well. In addition, thousands of people live in Far, Far Away, including recognizable fairy-tale characters and parodies of the rich and famous.

New tools—faster hardware, primarily dual-processor Pentium 4 machines from Hewlett-Packard, and new in-house software—made it possible for the crew to cope with all of these characters and with additional complexity in the environment as well. "Every blade of grass, every leaf, every cloud in the sky is synthesized," says Andy Hendrickson, head of technology at PDI/DreamWorks. "It's all CG, all of the time. Right now we have 3500 processors working on the show. It's one of the biggest renderfarms I've managed."

This is the third 3D feature animation for PDI, and with this film, the R&D crew focused largely on ways to give artists more control—from more facile interfaces for the animation system to art-directed control of simulation systems. In addition, the technical team created new effects and lighting tools and put a new renderer—first used for the stereo 3D ride film Shrek 4D—to a major test. "We upped the ante from Shrek in tool-making as well as image-making," says Arnauld Lamorlette, head of effects, who notes the goal was to make art-directable tools. "The imagery is not photoreal. It has a sophisticated, stylized reality."
The rigging (at left) for Puss In Boots (final at right; flat shaded at center) was complicated by the requirement that he could act like a biped or a four-footed cat, wave a hat that has an attached feather, hitch up his belt, flick his tail, and take off his boots.




On the surface, the three returning stars—Shrek, Fiona, and Donkey—didn't change much. Beneath the surface, though, the characters were rigged with new animation controls. "The facial animation system was rewritten to work on higher-resolution models and to work faster," says Tim Cheung, supervising animator. "Before, we would work with only a rudimentary mask of the faces. Now, we see high-resolution faces move. In general, there is more serious acting in Shrek 2 and less action, so we had to have subtle facial animation."

In addition, the Character TD team applied the updated, muscle-based, award-winning facial animation system to the characters' entire bodies in a simplified form. "There was a big development effort to help the animators create more believable characters," says Lucia Modesto, co-character technical director supervisor. "The additional animation controls and increased model resolution gave the animators control over more subtle details."

Of all the major characters, Puss In Boots's animation rig was the most complex. "He's a very complicated character with a belt, tail, hat, whiskers, and boots," Modesto notes. "When he's wearing boots, his foot has to be a biped human-like foot," she says, "but when he's out of his boots, he has a cat foot with an extra joint." To solve that problem, the TDs created a rig that let animators toggle from a four-footed setup to a biped. They also added Puss's hat to the rig. "It was cumbersome for animators to parent it, so we made it part of the setup," she says.

To help add subtlety, the character TD department also rigged some muscles to move automatically. For example, Prince Charming's neck muscles tighten automatically when he clenches his jaw, and if he takes a drink, his Adam's apple bobs up and down.
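Mechanically, such automatic muscles amount to secondary channels driven by the controls animators already key. A toy version in that spirit (the control names and linear mappings are illustrative assumptions, not PDI's rig):

```python
# Animators key only the jaw-clench and swallow controls; the rig derives the
# secondary muscle values from them automatically.
def driven_muscles(jaw_clench, swallow):
    clamp = lambda x: min(1.0, max(0.0, x))
    return {
        "neck_tension": clamp(0.8 * jaw_clench),   # neck tightens as the jaw clenches
        "adams_apple_lift": clamp(swallow),        # Adam's apple bobs during a drink
    }
```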
Prince Charming and Fairy Godmother were but two of the stylized humans added to the cast for Shrek 2. New subsurface scattering and global illumination techniques developed at PDI helped soften their CG edges.




Five teams of animators worked on the film, with five to six animators on each team. In addition to creating performances for the main characters, the animators were charged with moving other objects—notably Shrek and Fiona's cart, and a medieval type of contraption used in an escape attempt. Also, animators working closely with the effects teams created cycles for crowd animation. To help make these performances fit into the stylized world, all the characters, whether primary or secondary, were given realistic skin, hair, and clothes.

Soon after the first Shrek film opened, Stanford professor Henrik Wann Jensen (now at UCSD) drove to PDI to explain his research on subsurface scattering. "During his talk, I got an idea for doing subsurface scattering in a new way," says senior effects developer Juan Buhler. That idea led to a SIGGRAPH paper co-authored with Jensen, as well as to the technique's use on leading characters in Shrek 2, and because it reduced the computational requirements for subsurface scattering, to its use on crowds.

Buhler describes the technique: "When light enters a translucent object, part of it reflects and part goes inside and bounces around. So, when we render pixels, there's a lot of computation because we have to consider the light falling on surrounding pixels. My idea was to separate this into two processes."

First, an object is covered with points, and then the illumination of each point is calculated individually. The result is a cloud of colored points in the shape of the object; the points have various degrees of brightness. To render pixels and create the final image, the illumination of the pre-calculated points is averaged within a radius, and a translucency value is assigned to the result. "Say, for example, the object is an ear with different thicknesses in different parts," Buhler explains. "Where it's thin, more of the points will be bright than where it's thicker."
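In outline, the two passes Buhler describes could be sketched as follows. This is a minimal illustration that assumes a point cloud has already been scattered over the surface; the single light, linear falloff, and all names are assumptions made for clarity, not PDI's code.

```python
import numpy as np

# Pass 1: pre-compute illumination at sample points scattered over the object,
# producing a cloud of colored points in the shape of the object.
def illuminate_points(points, normals, light_pos, light_color):
    to_light = light_pos - points                                  # (N, 3)
    dist = np.linalg.norm(to_light, axis=1, keepdims=True)
    to_light = to_light / np.maximum(dist, 1e-6)
    ndotl = np.clip(np.sum(normals * to_light, axis=1, keepdims=True), 0.0, 1.0)
    return ndotl * light_color / np.maximum(dist * dist, 1e-6)     # (N, 3) brightness

# Pass 2: at render time, average the pre-calculated points within a radius
# and scale the result by a translucency value, instead of re-tracing light
# through the volume for every pixel.
def scattered_radiance(shade_pos, points, brightness, radius, translucency):
    d = np.linalg.norm(points - shade_pos, axis=1)
    near = d < radius
    if not np.any(near):
        return np.zeros(3)
    weights = 1.0 - d[near] / radius                               # linear falloff
    weights = weights / weights.sum()
    return translucency * (weights[:, None] * brightness[near]).sum(axis=0)
```

In the ear example, shading points on the thin rim sit within the gather radius of brightly lit points on the far side, so they average out brighter than points buried in the thicker lobe.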
Shrek couldn't change much for the second film; however, he was given some body hair and more subtle facial expressions, thanks to new animation tools that allowed animators to work with higher-resolution models.




To create the final look, new texture shaders written by senior effects animator Jonathan Gibbs managed several layers of skin and texture maps, adding visual complexity with such elements as dirt, oil, freckles, moles, bumps, colors, and specularity. "We had half a dozen sets of texture maps for each part of each main character," says David Doepp, surfacing supervisor. "Each set had around 100 maps, and the characters had 70 to 75 parts."

With a cast that included humans as main characters, thousands of townspeople, plus a cat, a donkey, and a horse or two, the technology teams for Shrek 2 paid particular attention to the problem of creating and simulating hair and fur. Although the team relied on Alias's Maya to simulate hair with extreme motion, most of the hair was animated as part of the character setup. For this, the team created new setup tools. "We had more than 200 controls for setting up hair styles," says Gibbs. "We had to change the user interface."

Michael Day, lighting sequence supervisor, explains that the team called in a wig maker to help devise a simpler approach. "The hardest thing is to style hair and know what you'll get without running the simulation," he says. To help the artists see what the hair would look like, the crew first modeled curves in Maya to create guide hairs that would be used for the simulation. Those guide hairs were extruded as big tubes that represented the volume of hair, and the tubes were then exported into PDI's proprietary software.

Then, the team applied parameters such as curliness to the tubes and interpolated them, which produced more, smaller tubes. They could look at the results, set new parameters, and continue interpolating until the tubes became small enough to look like individual hairs: 92,644 of them, in fact, on Prince Charming's head. A "clumpiness" parameter controlled whether the hairs would try to stay in the tubes or not and, thus, how messy, smooth, or clumpy the hair would look.
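One way to picture the interpolation step: each guide tube spawns child hairs offset around it, with clumpiness deciding whether they stay inside the tube or wander out of it. The sketch below is only an illustration; the parameter names, offset frame, and falloff are assumptions, not PDI's interpolator.

```python
import numpy as np

def interpolate_hairs(guide_curve, tube_radius, count, clumpiness, seed=0):
    """Spawn `count` child hairs around one guide curve (an (N, 3) polyline).

    With clumpiness near 1.0 the children stay inside the tube; near 0.0 they
    drift out to roughly twice its radius, giving a messier look.
    """
    rng = np.random.default_rng(seed)
    children = []
    for _ in range(count):
        angle = rng.uniform(0.0, 2.0 * np.pi)
        offset_dir = np.array([np.cos(angle), 0.0, np.sin(angle)])  # simplified frame
        spread = tube_radius * rng.uniform(0.0, 1.0) * (2.0 - clumpiness)
        children.append(guide_curve + spread * offset_dir)
    return children
```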
It takes powerful magic to turn a donkey into something beautiful, not to mention dynamic hair simulation for his mane and tail.




The simulation engine moved the guide hairs from frame to frame based on these parameters and on the character's movement. But even for this, the team provided artistic control. "We set magnets that would pull the hair in particular directions and then ran the simulation on top," says Larry Cutler, co-character technical director supervisor. "The magnets would control the simulation, but the animators could control the magnets."
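Conceptually, each magnet adds a distance-weighted pull on the guide-hair points before the dynamics update, so animators steer the simulation by keying the magnets rather than the hair itself. A minimal sketch, with the names, linear falloff, and simplistic integration assumed for illustration:

```python
import numpy as np

def magnet_force(point, magnet_pos, strength, falloff_radius):
    """Pull one guide-hair point toward an animator-keyed magnet."""
    to_magnet = magnet_pos - point
    dist = np.linalg.norm(to_magnet)
    if dist > falloff_radius or dist < 1e-6:
        return np.zeros(3)
    weight = 1.0 - dist / falloff_radius          # linear falloff (assumption)
    return strength * weight * to_magnet / dist

def step_guide_hair(points, velocities, magnets, dt=1.0 / 24.0, damping=0.9):
    """One integration step for a guide hair: magnet forces are applied first,
    then a (very) simplified dynamics update runs on top of them.

    points, velocities: (N, 3) arrays for one guide hair.
    magnets: list of dicts with "pos", "strength", and "radius" keys.
    """
    for i, p in enumerate(points):
        force = sum((magnet_force(p, m["pos"], m["strength"], m["radius"])
                     for m in magnets), np.zeros(3))
        velocities[i] = damping * (velocities[i] + dt * force)
        points[i] = p + dt * velocities[i]
    return points, velocities
```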

Once the hair was simulated, it was interpolated as part of the rendering process with the guide hairs controlling the hierarchical levels of clumpiness according to parameters set earlier. Also in rendering, color ramps were applied and volume shadows were calculated.
Prince Charming's hairdressers at PDI applied red magnets to his golden locks that animators used to direct his hair simulation.




For fur and body hair, texture maps controlled direction, length, density, and color. "Sometimes we could paint texture maps once for the characters," says Doepp, "but when Donkey and Puss are in the rain, we had to change both the direction and the specularity of the fur."

Puss caused additional furry problems. "He wears a belt that slides and smooshes the fur," says Gibbs. "We had to track the belt over the surface, and then generate a texture map for each point that controlled how much and in which direction the fur would lie down along the moving belt line."
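The belt problem boils down to generating, per frame, a per-point map of how flattened the fur should be and in which direction it should lie. Something along these lines, with every name and the linear falloff being assumptions for illustration rather than PDI's shader:

```python
import numpy as np

def belt_fur_map(fur_roots, belt_points, belt_velocity, influence=0.5):
    """For each fur root, compute how much it is pressed flat and in which
    direction, based on the belt line's current position and motion.

    Returns (weights, directions): a weight of 1.0 right under the belt,
    fading to 0.0 at `influence` distance, and a unit direction that follows
    the belt's sliding motion.
    """
    slide_dir = belt_velocity / (np.linalg.norm(belt_velocity) + 1e-6)
    weights, directions = [], []
    for p in fur_roots:
        dist = min(np.linalg.norm(p - b) for b in belt_points)
        weights.append(max(0.0, 1.0 - dist / influence))
        directions.append(slide_dir)
    return np.array(weights), np.array(directions)
```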

Other improvements for the characters included better integration of both tight clothes (created with PDI's in-house system) and loose clothing (simulated with Maya Cloth), as well as better eyes. "On the first film, we had a line between the tight cloth and the simulated cloth, so the characters wore belts, but on this film the Queen and Fairy Godmother do not have to wear belts," says Ken Bielenberg, visual effects supervisor. Lamorlette explains that all the characters' eyes—even those of the thousands of people in the crowds—include caustics and also refractions when a character looks sideways.

A new "finaling" group fixed any problems with the characters before they were lit. For example: "If a character's contortion caused the simulated cloth to interpenetrate the model under the arm, we might sculpt out the armpit to make the simulation act OK," according to Bill Seneshen, senior effects animator. Also, the team would sometimes dynamically deform geometry at this stage to, perhaps, make a finger look like it was really pushing on something. "We tried to get rid of everything that looked CG," says Lamorlette.

Techniques such as these helped soften the hard edges of the synthetic world, but new lighting techniques had arguably even more impact. Rather than always lighting scenes using CG key lights, fill lights, and so forth, the team decided to incorporate global illumination in the studio's new integrated rendering and lighting tools. "We wanted to create a warmer, more believable look," says Lamorlette. "But we had to make the technique faster to use it on such a large scale."

Also, rather than using this physics-based lighting technique to mimic reality, the team needed to be able to create a stylized reality. "Tightly coupling the rendering and lighting tools gave us flexibility and optimization," says Hendrickson. "Having a lighting tool in your renderer is a big advantage you don't often see in the outside world."

Eric Tabellion, who helped create the new lighting tools, notes that PDI's implementation of global illumination, which he describes as similar to radiosity, includes radiance caching (to speed rendering) and a method for reducing the computational complexity. "We can replace a castle with a simple plane," says Philippe Denis, lead lighter. "This allows us to select how much complexity is involved."
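Radiance caching itself rests on a simple idea: indirect lighting is computed sparsely and reused for nearby shading points instead of being recomputed everywhere. A toy illustration of that idea only (the linear search, reuse rule, and names are assumptions, not PDI's renderer):

```python
import numpy as np

class IrradianceCache:
    """A toy radiance cache: indirect lighting is computed sparsely and
    reused for nearby shading points instead of being recomputed everywhere."""

    def __init__(self, max_reuse_dist):
        self.max_reuse_dist = max_reuse_dist
        self.positions, self.values = [], []

    def lookup(self, pos):
        pos = np.asarray(pos)
        for p, v in zip(self.positions, self.values):
            if np.linalg.norm(pos - p) < self.max_reuse_dist:
                return v                          # reuse a nearby cached sample
        return None

    def store(self, pos, value):
        self.positions.append(np.asarray(pos))
        self.values.append(value)

def indirect_light(pos, cache, expensive_gather):
    """Return cached indirect lighting if a sample is close enough; otherwise
    pay for a full gather (e.g., hemisphere sampling) and cache the result."""
    cached = cache.lookup(pos)
    if cached is not None:
        return cached
    value = expensive_gather(pos)
    cache.store(pos, value)
    return value
```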

Although the new lighting tool can create soft, diffuse lighting for an entire environment, the biggest impact in this film has been on individual characters. "We used to struggle to light characters under their chins without getting harsh shadows," says Bielenberg. "The more we can break down the harshness, the more we can move viewers into our world. At first, the lighters would use global illumination only sparingly, but now it's used in every shot, and always for the characters."

The impact of such tools as these extends beyond surface appearances; the new tools have a more fundamental role to play. "Shrek was a training ground for the story," says Adamson. "There were technical and aesthetic things we could not do then because of computational limitations and software that wasn't done. But now, the technology is challenging me and others to create even more complex stories."

Barbara Robertson is a contributing editor for Computer Graphics World and a freelance journalist specializing in computer graphics, visual effects, and animation. She can be reached at BarbaraRR@comcast.net.


Alias www.alias.com
Hewlett-Packard www.hp.com


Adding to the visual and technical complexity for Shrek 2 were organic settings with atmospherics such as rain. In one sequence, the characters are caught in a downpour. "We had geometry changing dynamically on characters' faces as water flowed on the surface," says senior effects animator David Allen. For rain, the team used a particle-driven simulation.

"The simulation particles used the surface information so that they could drip water off the face," says Allen. "We generated blobbies from the particles in different thicknesses and rendered them with refractions."

For a scene in which flaming rocks are flung through the air, the team used smoke spheres driven by a particle simulation. "The spheres were replaced by volumetric noise and rendered using self-shadowing technology," explains effects animator Matt Baer.

Other sequences used CG water simulations, for which the crew developed new techniques to generate surfaces and foam for the studio's award-winning fluid-simulation engine. "There's a magic potion that's spilled in a factory, and because it's magic, interesting things are happening with it," says Lamorlette. "We had to develop tools that could tweak and cheat and twist the laws of physics."
To help make Donkey's hair look wet, PDI's lighting and effects crew changed the specularity and made it slightly darker.




New tools also helped artists quickly create 3D plants and foliage. For example, working from a leaf style and color approved by the art director, modelers would create several versions that the effects team would render. "Then we generated sprites that we put in Maya Paint Effects," explains Lamorlette. "The artists painted in Maya, using these little sprites to visualize a 3D environment. When they finished, we exported the data from Maya into our system, and created and instanced it in 3D. This gave the artists control over several million polygons in an interactive way. There are flowers and vines all over this movie." —BR
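The payoff of the sprite workflow is that artists manipulate the lightweight painted placements, not the heavy leaf geometry; at export time each sprite transform is simply replaced by an instance of an approved leaf model. A toy sketch of that last step (names are assumptions, not PDI's exporter):

```python
import numpy as np

def instance_foliage(sprite_transforms, leaf_variants, seed=1):
    """Replace each lightweight painted sprite with a full 3D leaf model.

    sprite_transforms: 4x4 matrices exported from the paint tool.
    leaf_variants: the approved leaf models to choose from.
    Returns (variant_index, transform) pairs so the renderer can instance
    one mesh many times instead of duplicating its polygons.
    """
    rng = np.random.default_rng(seed)
    return [(int(rng.integers(len(leaf_variants))), m) for m in sprite_transforms]
```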


Many of the advances made for the star characters, such as special shaders for their eyes, subsurface scattering, and hair and cloth simulation, were also applied to secondary and crowd characters.

To create the thousands of secondary characters, the team started with five body types (two men, two women, a child), as well as several types of heads and a variety of outfits that they could mix and match fairly automatically. "The crowd system knows the rules," says Cutler. "A maid doesn't go to the ball; a skirt with a high waist is matched to a high bodice."
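Rules like these can be thought of as filters over the mix-and-match space of body types, heads, and outfits. A toy version in that spirit, where the rule set and field names are illustrative rather than the actual crowd system:

```python
import itertools

# Toy dressing rules in the spirit of "a maid doesn't go to the ball;
# a skirt with a high waist is matched to a high bodice".
RULES = [
    lambda c: not (c["role"] == "maid" and c["location"] == "ball"),
    lambda c: not (c["skirt"] == "high_waist" and c["bodice"] != "high"),
]

def valid_crowd_characters(roles, locations, skirts, bodices):
    """Enumerate mix-and-match combinations that pass every dressing rule."""
    for role, location, skirt, bodice in itertools.product(roles, locations,
                                                           skirts, bodices):
        candidate = {"role": role, "location": location,
                     "skirt": skirt, "bodice": bodice}
        if all(rule(candidate) for rule in RULES):
            yield candidate
```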

Some of the secondary characters even had speaking parts. "Shrek 2 was 10 times more sophisticated than Shrek," says Modesto, explaining: "The waiters at the dinner are secondary characters. For these, we could change face shapes without having to tweak underlying rigging that was the same for all the secondary characters."
You can't have a kingdom without subjects, which meant the crew at PDI needed to develop new tools for creating, animating, lighting, and rendering thousands of townspeople.




Secondary characters used in crowd simulations moved according to cycles developed by animators. "The system managed 1300 animation cycles," says effects animator Seth Lippman, "plus all the parts—heads, hats, and so forth— and the cloth and hair simulations." Given this complexity, the technical team devised methods of reducing the computational and storage load, and yet found ways to give artists control over the simulations. "We developed a system we call Dynamic Crowd Character," says effects artist Scott Peterson. "We bake out all the character types and store them on disk for every frame, so we came up with a geometry-based compression scheme that reduces the storage to one-third of what it was. Even so, we can override the pre-baked characters to have per-character variations created procedurally or with hand animation." —BR
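In spirit, a Dynamic Crowd Character is pre-baked geometry on disk plus an escape hatch for per-character overrides. In the sketch below, simple quantization stands in for PDI's unspecified geometry-based compression, and all names are invented purely to illustrate the structure:

```python
import numpy as np

class DynamicCrowdCharacter:
    """Toy version of a pre-baked crowd character: geometry is baked for every
    frame, stored in a compressed form on disk, and can still be overridden
    per character when a shot needs a custom performance."""

    def __init__(self, baked_frames, scale=1000.0):
        # Quantizing float positions to 16-bit integers stands in for the real
        # compression scheme, which is not described in detail in the article.
        self.scale = scale
        self.frames = [np.round(np.asarray(f) * scale).astype(np.int16)
                       for f in baked_frames]
        self.overrides = {}          # frame index -> full-precision geometry

    def override(self, frame, geometry):
        """Per-character variation: hand animation or a procedural tweak."""
        self.overrides[frame] = np.asarray(geometry, dtype=np.float32)

    def geometry(self, frame):
        if frame in self.overrides:
            return self.overrides[frame]
        return self.frames[frame].astype(np.float32) / self.scale
```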