Reef Madness
Volume 27, Issue 10 (October 2004)


Think The Godfather meets Tex Avery or Chuck Jones, switch the soundtrack to hip-hop and reggae, set the action in a painterly underwater world, turn the characters into sea creatures, and you have a taste of Shark Tale, the first 3D animated feature from DreamWorks Animation. Scheduled to open October 1, the film features the voice talents of Will Smith, Robert De Niro, Martin Scorsese, Renee Zellweger, and Angelina Jolie, along with an "in-fin-ite" number of fishy puns.

The lead fish is Oscar (Smith), a sometime hustler whose day job as a tongue cleaner at the whale wash couldn't be worse. Oscar gets his big break when a shark chasing after him dies in an accident. The little hustler takes credit for killing the shark and changes from shark bait to shark slayer. Things go swimmingly for the new big fish on the reef until the neighbors spot another shark. Luckily, it's Lenny (Jack Black), who has a secret of his own, and the unlikely pair become partners.

The partnerships that turned this story into a feature film were formed more than three years ago when DreamWorks Animation began its sea change from a traditional animation studio into a 3D studio, following in the wake of PDI/DreamWorks' award-winning box-office hit Shrek and other CGI films. "By the time [DreamWorks'] Spirit was finished, everyone wanted to get on the CG bandwagon, so we had a 100 percent changeover," says producer Janet Healy. "Also, it was a time when amazingly talented people were available in all areas of the CG industry. We hired 150 really good artists."

To give this new talent pool the tools they needed to create production designer Dan St. Pierre's stylish environments and help directors Bibo Bergeron, Vicky Jenson, and Rob Letterman tell the story, the production team sifted through an ocean of possibilities when deciding what pipeline to build for the film—and the studio.
Cartoon animation techniques were used by co-director Bibo Bergeron and a staff of former 2D animators to create comic performances for Shark Tale's characters, thanks to innovative Maya rigs that allowed "off model" deformations. Inset: Oscar

"We formed a brain trust," Healy says. "We had 14 key people and, depending on the discipline being discussed, special guests. We met every day for two months." At the end of the brainstorming sessions, they settled on off-the-shelf application software but with PDI's file and data formats for everything below the application level. "We wanted to build a pipeline for the film that would last," she adds, "and we realized that strategically we needed to leverage as much as we could from PDI to put one foot toward unification [between the two 3D studios]."

At the application level, DreamWorks chose Alias's Maya and PaintEffects, Pixar's RenderMan, and Mental Images' Mental Ray, all running on Hewlett-Packard machines. But it wasn't that simple. It became clear during the brainstorming sessions that to achieve the look St. Pierre had in mind, the entire film would need art-directed global illumination, and to create the animation Bergeron wanted, Maya would have to be rigged to allow cartoon animation.

The story takes place in an underwater New York City, complete with Times Square and an outer borough in a coral reef-styled setting. "Our world was not meant to be naturalistic," says Jenson. "It is an allegory. We didn't want it to look like a sunken human city; we wanted a reef. It's a complete fantasy, but it triggers familiar responses for us."

Kevin Rafferty, lead CG supervisor, who was part of the brainstorming team, helped streamline methods for creating the underwater city. "We needed to find ways to bring the city to life in the way the art director wanted without breaking the memory bank," he says. "Originally, we just wanted the feeling of a city built by fish, but as the film evolved, we needed a recognizable underwater Times Square with familiar icons such as the Flatiron Building, billboards, jumbotrons, and neon lights."

The recognizable buildings were "hero" buildings, created from scratch to resemble those in the real world. For other buildings, the team used a modular approach: Rafferty, who came to DreamWorks after working on live-action visual effects for 20 years at such studios as Industrial Light & Magic, led a team that designed and built a "Tool Box Town." He points to his computer screen where an asset browser scrolls through a catalog of building parts—building shells, windows, stoops, doorways, billboards, fire hydrants, and much more. Each piece carries two or three types of texture information so that one piece of geometry can sport various styles through changes in hue, bump maps, and displacement maps. "We took advantage of asset-instancing within Maya so that even though a window might be used 100 times in one building shell, it would be counted as only one window in memory," he says.
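In today's Maya, that memory-saving trick can be sketched in a few lines of Python through the maya.cmds module (the Shark Tale crew, working in 2004, would have scripted it in MEL, and the asset names here are hypothetical):

```python
import maya.cmds as cmds

def populate_shell(window_master, positions):
    """Instance one approved window across a building shell. Instances
    share the master's shape node, so 100 windows on a shell cost
    roughly one window's worth of geometry in memory."""
    placed = []
    for x, y, z in positions:
        inst = cmds.instance(window_master)[0]  # new transform, shared shape
        cmds.move(x, y, z, inst, absolute=True)
        placed.append(inst)
    return placed

# Tile a 10 x 10 grid of windows up a hypothetical brownstone shell.
grid = [(col * 2.0, row * 3.0, 0.0) for row in range(10) for col in range(10)]
populate_shell("window_brownstone", grid)
```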

Once the building pieces were approved by the art department and built in the modeling department, a city could be assembled. To further the allegory of a city built by fish, projection maps applied to several buildings added colorful veins of coral, graffiti, and other visual elements to the underwater fantasy.
The DreamWorks effects team used Maya particles and PDI's Flu simulator to create and animate foamy bubbles that interact with the whale and the surrounding water.

Further increasing the visual richness of the undersea environment were bubbles, coral, kelp, and other seaworthy items, most of which were generated with Maya particles or Maya PaintEffects. For example, for the whale wash where Oscar worked, a 23-person effects crew led by Mike Miller created foam, bubbles, steam, and goo. "We calculated that we had between 350,000 and 400,000 particles for one foamy whale shot," he says. To move the particles for that shot and others, the team used Flu, a proprietary 3D fluid simulator developed by Nick Foster at PDI/DreamWorks.
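Flu itself is proprietary, but the particle-advection step at the heart of a grid-based fluid simulator of that era can be sketched generically. The function below, with illustrative names and a nearest-cell lookup standing in for production-grade interpolation, pushes a cloud of foam particles through a sampled velocity field:

```python
import numpy as np

def advect_particles(positions, velocity_grid, cell_size, dt):
    """Move foam/bubble particles one timestep through a velocity field.

    positions:      (N, 3) array of particle positions in world units
    velocity_grid:  (X, Y, Z, 3) array of velocities at cell centers
    """
    # Nearest-cell lookup; production code would interpolate trilinearly.
    idx = np.clip((positions / cell_size).astype(int),
                  0, np.array(velocity_grid.shape[:3]) - 1)
    v = velocity_grid[idx[:, 0], idx[:, 1], idx[:, 2]]
    return positions + v * dt

# Roughly the particle count of the foamiest whale-wash shots.
pts = np.random.rand(400_000, 3) * 10.0
grid = np.zeros((32, 32, 32, 3))
grid[..., 1] = 1.0                      # a uniform upward current
pts = advect_particles(pts, grid, cell_size=10.0 / 32, dt=1.0 / 24)
```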

For vegetation, coral, the brushes in the whale wash, and other elements, the team used Maya PaintEffects. "We needed to 'fishify' the movie, so we used PaintEffects to paint vegetation onto surfaces," says Miller. "We used it almost like a fancy graphical user interface. We would paint the surfaces, see how it looks in Maya, move stuff around, and change colors and styles. We like the PaintEffects front end, but the back end wasn't usable for us."

Rather than rendering the PaintEffects elements in Maya, the effects team wrote a Maya plug-in that exported PaintEffects curve data as points into proprietary files. The points turned back into curves during rendering via a proprietary RenderMan DSO (dynamic shared object), carrying with them information about motion, size, color, and so forth.
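A rough sketch of that export path, assuming a simple, illustrative file layout (the actual PDI formats and DSO are proprietary). The RIB line written at the end uses RenderMan's standard DynamicLoad procedural hook, through which a DSO of this kind would emit Curves primitives at render time:

```python
import struct

def export_paintfx_curves(curves, data_path, rib_path, bounds):
    """curves: list of (points, width, rgb) tuples, where points is a
    list of (x, y, z); bounds: (xmin, xmax, ymin, ymax, zmin, zmax)."""
    with open(data_path, "wb") as f:
        f.write(struct.pack("<I", len(curves)))
        for points, width, (r, g, b) in curves:
            # Per-curve header: point count, width, color.
            f.write(struct.pack("<If3f", len(points), width, r, g, b))
            for p in points:
                f.write(struct.pack("<3f", *p))
    with open(rib_path, "w") as f:
        # RenderMan's standard dynamically loaded procedural: at render
        # time the DSO reads the data file and emits Curves primitives
        # inside this bounding box.
        f.write('Procedural "DynamicLoad" ["paintfx_curves.so" "%s"] '
                '[%g %g %g %g %g %g]\n' % ((data_path,) + tuple(bounds)))
```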

"We could also instance geometry to the curves using the RenderMan DSO," Miller says, "and with the help of shaders, we could use the normals exported from Maya to light them with shadows. We could even do procedural kills." For example, after generating a field of kelp, the team realized that some vegetation was in front of the camera as it moved. They pruned the branches without having to go back into Maya.

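The kelp pruning is a simple instance of such a kill: a predicate applied to the exported curves before the DSO emits them. A hypothetical sketch, reusing the (points, width, color) tuples from the export example above; the near-camera test is an illustrative rule, not the studio's actual one:

```python
def cull_near_camera(curves, camera_pos, min_dist):
    """Drop any exported curve whose root point sits within min_dist of
    the camera; the survivors are handed to the renderer as usual."""
    cx, cy, cz = camera_pos
    def far_enough(curve):
        x, y, z = curve[0][0]            # root point of (points, width, rgb)
        return ((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2) ** 0.5 >= min_dist
    return [c for c in curves if far_enough(c)]
```
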
Pipeline designer and developer Scott Cegielski, who wrote the Maya plug-in, the RenderMan DSO, the shaders, and other tools, devised a way to drive procedural motion in the DSO. "We could have a character grab a piece of PaintEffects stuff and move it out of the way," says Miller. "It's a really innovative tool. We're going to try to document and share our work at SIGGRAPH."
Clockwise, from top left: First, layout artists use low-res graphics to block out camera moves and character poses. Then, animators create the character's performance, and the layout department dresses the set. Next, effects artists set up the bubble and foam effects.

Because the environments were so complex, rather than pre-process and send the renderer all the geometry for buildings, characters, and other models in RIB (RenderMan Interface Bytestream) files, the Shark Tale team put into its RIBs only shading and lighting information plus a procedural primitive that points to models stored on disk in a proprietary format. As a result, the RIBs were one-tenth the size of a typical RIB, according to Doug Cooper, visual effects supervisor. In addition, models in the proprietary format were smaller than they would have been if converted to a RIB file format.

"We combined the advantages of procedural primitives, which people use for instancing, with RIB archiving, in which models are pre-baked into RIBs and loaded as needed," says Cooper. "It gave us the means to manage complexity."

The proprietary model format was created by a stitcher, which managed complex arrangements of NURBS patches. "We can have five-point unions and T-junctions that you can't do with typical stitchers, which means that our character models can be more flexible," says Cooper. In this system, the isoparms (curves on the surfaces) can look like a crazy-quilt jumble of streets in a medieval village rather than parallel lines on a grid. "I can't tell you the details of the stitching," Cooper says, "but I can tell you that we have to convert it into a set of surfaces that RenderMan understands."

That conversion happens on demand at render time. "Because we can load one model at a time, we don't have to stuff everything into the RIB file," says Cooper. "RenderMan loads in a RIB with the shading information but no models. When it gets to a model, the procedural primitive kicks in. So, if we have a scene with thousands of windows on buildings in a city, instead of feeding all those windows into RenderMan in one big stream of RIB files with models baked out of Maya, the window is loaded off disk when it's needed and turned into something RenderMan understands. When RenderMan is done rendering the window, it's discarded."
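PDI's procedural primitive is proprietary, but RenderMan's standard RunProgram procedural, a stdin/stdout cousin of the DynamicLoad mechanism, shows the same deferred-loading shape: the renderer sends a request line when a primitive's bounding box becomes visible, and a helper process answers with RIB, terminated by a 0377 byte. A minimal sketch, with a hypothetical model path standing in for the proprietary format:

```python
import sys

# RIB side (for reference):
#   Procedural "RunProgram" ["python loader.py" "flatiron"]
#              [xmin xmax ymin ymax zmin zmax]

def model_to_rib(name):
    # Stand-in for converting one model from the proprietary on-disk
    # format into RIB geometry; the path and format here are hypothetical.
    with open("/models/%s.rib" % name) as f:
        return f.read()

for request in sys.stdin:                 # each line: "<detail> <args>"
    detail, _, model_name = request.partition(" ")
    sys.stdout.write(model_to_rib(model_name.strip()))
    sys.stdout.write("\377")              # end-of-block marker for the renderer
    sys.stdout.flush()
```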
Modelers and animators needed to create an allegorical world in which characters could sit at tables and use their fins as hands—and then swim away.

The shading and lighting information that went into the little RIB file took an equally innovative path. "We were able to apply global illumination [in all 1375 scenes] throughout the film using commercial tools," Cooper says.

"We used Mental Ray for global illumination—specifically for exposure mapping [ambient occlusion] and bounce light," he says. "We had RenderMan use the pre-computed Mental Ray data when we rendered our final beauty passes." By separating all the components in this way, they could tune parameters in each to enhance the film's painterly style.

"Dan St. Pierre, our production designer, had soft area shadows and bounce lighting among his specific goals," says Mark Wendell, CG supervisor. To simulate the soft shadows created by ambient light, many visual effects studios have begun using a technique called ambient occlusion rather than emulating diffuse lighting with dozens, perhaps hundreds, of fill lights. Ambient occlusion calculates, for every point on a surface, how much light that point receives from a surrounding sky dome. The result looks like a perfectly exposed gray-scale photograph taken on an overcast day. To integrate CG elements with live-action photography, the ambient occlusion pass is composited with environment lighting.

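In textbook form (not necessarily the exact formulation DreamWorks used), the occlusion at a surface point p with normal n is the cosine-weighted fraction of the hemisphere Ω above the point from which the sky dome remains visible:

```latex
\mathrm{AO}(p) = \frac{1}{\pi} \int_{\Omega} V(p,\omega)\,(n \cdot \omega)\,\mathrm{d}\omega
```

where V(p, ω) is 1 when a ray from p in direction ω escapes to the sky dome and 0 when nearby geometry blocks it.
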
"We wanted more artistic painterly control," says Wendell. "We knew the production designers and art directors all came from a painting background, so we wanted to give the lighters a way to control a simulated ray-traced lighting environment." The system he developed and Mike McNeill implemented put the controls for ambient occlusion and bounce light into RenderMan lights. DreamWorks lighters could place CG lights as usual, but the lights they used were far from typical—soft shadows and bounce lights were built in. The lighters could paint anything with colored lights and soft shadows—even the kelp fields created with PaintEffects.

Here's how the lighters might use the system for a background: First, ambient occlusion (what DreamWorks calls "exposure") would be pre-computed with Mental Ray and stored as texture maps. Then, the lighters would place traditional key lights into the scene for directional light and shadow. Next, they would add the system's new "exposure lights"—fill lights that include the pre-computed ambient light.
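In production these controls lived inside custom RenderMan light shaders. Stripped to its arithmetic, the contribution of one such light at a shading point might look like the hypothetical sketch below, with the occlusion value read from the Mental Ray-baked map:

```python
def exposure_light(light_color, intensity, ao):
    """One "exposure light" at a shading point: an ordinary fill light
    scaled by the pre-computed ambient occlusion value (0 = buried in
    a crevice, 1 = fully open to the sky dome)."""
    return tuple(c * intensity * ao for c in light_color)

# The lighter can make this fill bluer or brighter at will; the soft
# contact shadows ride along in the baked `ao` term.
cool_fill = exposure_light((0.45, 0.5, 0.9), intensity=1.2, ao=0.35)
```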

"You can hand place these lights to your heart's content and really sculpt the space," says Wendell. As they would have done using fill lights, the Shark Tale lighters placed lights and created shadows, but the shadows in the nooks and crannies of the space were soft and looked natural because they used the pre-computed ambient occlusion. Further, each of the "exposure" lights controlled not only where the light fell, but also its color, intensity, and so forth, just as a conventional fill light would.

"When you put the exposure lights together with the key lights, you get directional shadows from the traditional key lights plus soft lighting from the ambient occlusion," says Wendell, "and yet the lighter can make a green light a little bluer without compromising the soft contact shadows. The exposure [ambient occlusion] was computed once with Mental Ray—in the render farm overnight—and then after that all the lighting with the exposure maps is straightforward RenderMan lighting using the maps, so the turn-around is fast." Last, a lighter might want to emphasize characters or an area in a scene with direct light that picks up colors from the environment as it bounces.
Clockwise, from top left: A scene lit with fill lights. Rendering the scene with ambient occlusion. Placing "exposure" lights that include the pre-computed ambient. Adding key lights.

"The hard part of using traditional ray-traced bounce lighting is that you do the direct lighting and then run the bounce simulation," Wendell says. "It takes a long time because it's a big ray-tracing calculation and if it's not what you want, you have to go back and change the direct lighting. We also pre-compute the bounce and get a given result. The difference is that we can selectively put the bounce back into the scene with a hand-placed light."
Clockwise, from top left: Calculating bounce light from the "exposure" lights and key lights. Hand-placing bounce light. The final image.

To do this, they baked the scenes lit with the key lights and the exposure lights into a texture map, computed the bounce light from that map, and stored it as another texture map. "Now, we can do the same thing with the pre-computed bounce as we did with the exposure lights," Wendell says. "And because we can put the bounce lighting back into the scene, not globally but with hand-placeable lights, we can alter that bounce result." The lighter could, for example, shower an element with a golden tint. "When you put it all together," he says, "you get soft contact shadows and an enhanced bounce light right where you want them."
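Reduced to the same kind of arithmetic as the exposure lights, a hand-placed bounce light might behave like this hypothetical sketch, where baked_bounce is the color read from the second bake:

```python
def bounce_light(baked_bounce, tint, gain):
    """A hand-placed bounce light: re-injects the pre-computed bounce
    color from the second bake, scaled by a lighter-chosen tint and gain."""
    return tuple(b * t * gain for b, t in zip(baked_bounce, tint))

# Shower an element with a golden tint, as described above.
golden = bounce_light(baked_bounce=(0.30, 0.25, 0.15),
                      tint=(1.0, 0.85, 0.45), gain=1.5)
```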

While the techniques invented by the DreamWorks Animation team solved particular problems for Shark Tale, the ability to create a traditional 2D style of cartoon animation with 3D tools, and to art direct the look of a scene using customizable lights that contain pre-baked global illumination and bounce light, will likely find its way into other productions.

"It's the best of both worlds," says Wendell. "Simulation and art direction."

Barbara Robertson is an award-winning journalist and a contributing editor for Computer Graphics World.


Usually, animators perform 3D characters using models with built-in shapes that, for example, might control the way a character smiles. But most of the animators working on Shark Tale were traditional 2D animators.

"We decided we did not want to do shape interpolation," says producer Janet Healy. "We wanted our animators to be able to go 'off model,' to have the same ability in computer graphics that they have when they draw. So, when Oscar flaps his fins, it's not just motion blur happening, he's got five arms. And if you looked frame by frame, you would be able to see 'Picasso in-betweens'—with his chin in one place and his forehead in another—just like you'd see if you stepped through a 2D animation."

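One way to fake such a multi-limbed smear, purely as an illustration using Maya's Python API (DreamWorks' actual plug-ins and rigs are custom and far more sophisticated), is to duplicate a fin at its positions on the previous few frames:

```python
import maya.cmds as cmds

def smear_fin(fin, copies=4):
    """For the current frame only, duplicate `fin` at its positions on
    the last few frames so the flap reads as multiple overlapping arms."""
    now = cmds.currentTime(query=True)
    for i in range(1, copies + 1):
        cmds.currentTime(now - i)
        pos = cmds.xform(fin, query=True, worldSpace=True, translation=True)
        cmds.currentTime(now)
        dup = cmds.duplicate(fin)[0]
        cmds.xform(dup, worldSpace=True, translation=pos)
```
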
The extreme stretching and deformation were accomplished with Maya plug-ins developed by DreamWorks' character riggers and with custom skinning and wrapping tools. "Bibo has a distinct squash-and-stretch style," says visual effects supervisor Doug Cooper of director Bergeron. "We had to pull character rigs in directions we've never pulled them before." —BR

Modelers sculpting Oscar and Angie could create complex arrangements of isoparms (surface curves) knowing a proprietary stitcher could handle them.