Mech Believe
Volume 28, Issue 3 (March 2005)

They don’t follow Isaac Asimov’s laws of robotics. They don’t have to. The bots in Fox/Blue Sky Studios’ film Robots live in an alternate world, a ball-and-socket, wheels-and-gears, mechanical universe where lampposts walk home in the morning and a character named Crank Casey is made from... you guessed it... a crankcase. There’s not a human in sight.

“We call the film Robots, but there’s nothing in it that’s science fiction or futuristic,” says Chris Wedge, codirector with Carlos Saldanha. “It’s a colorful, whimsical world of mechanical people, a comedy adventure.”

The film, which puts the voices of Ewan McGregor, Halle Berry, Mel Brooks, Greg Kinnear, Drew Carey, Amanda Bynes, and Robin Williams into animated characters, is on track to become even better loved than its predecessor, Ice Age, which was nominated for a Best Animated Feature Oscar in 2003 and scored big at the box office. In every other way, though, it couldn’t be more different.

“It’s very dense,” says Wedge. “There are lots of characters, lots of action, and in every shot, a lot to look at. With Ice Age, we didn’t know whether we could make a feature film, and I think some of our insecurities resulted in a style that looked very simple. We were more confident on Robots, so we bit off as much as we could get in our mouths.”

Production designer William Joyce, who created the famously successful 1998 TV series Rolie Polie Olie, inspired the complex, quirky, always-in-motion environment with characters to match.

“Bill Joyce would go out shopping for antiques near where he lives and bring in huge amounts of junk,” says visual development artist Daniel López Muñoz. “We looked at automobile parts, coffee pots, perfume bottles. I designed one building based on the gas tank of a Harley-Davidson and another that looked like a snail. We drew inspiration from many things, but we interpreted everything in a mechanical way, whether it was a building or a robot.”

Thus, you won’t see a tree or a bush or any furry little critters in Robots, but you might spot your grandmother’s oven or spare parts from your uncle’s Buick.
Robots Rodney Copperbottom (left) and Fender, a “Rusty” who lives in Robot City, were assembled from CG parts inspired by junkyard finds and antique shop rummaging by Blue Sky artists and production designer William Joyce.

“We used the whole gamut of industrial design from the past 100 years as reference,” says Wedge. “You’ll see a social hierarchy for the robots, from ones that look old and worn out to ones that look shiny and new. Our main character [Rodney, played by McGregor] was inspired by an Evinrude outboard motor that my grandfather had when I was a kid.”

The plot centers on Rodney, a clever robot who searches for his hero, the super inventor Bigweld, only to discover that he has been replaced by the evil Ratchet, one of the newer machines. “Rodney grew up without some of the material benefits the newer robots have,” says Wedge. “The newest machines are the corporate executives, the people climbing the social ladder, and they’re in the best shape. Rodney is born into being outmoded. He falls in with the Rusties, but he’s so inventive he becomes a hero.”

All told, a modeling crew of around 14 people built the intricate principal characters. “We have one character, called Madam Gasket, who was inspired by a furnace, and she has more geometry than an entire set and cast of characters for any given sequence in Ice Age,” says Michael DeFeo, modeling supervisor.

To help speed the modeling process, the team developed a data management system that let them rummage through bins and catalogs of parts. “When we finished a character, we would break out any parts that could be reused (nuts, bolts, pistons) and put them in our backlot,” DeFeo says. “When we started on the next character, we’d have some ready-made parts.”
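
To get a feel for the idea, here is a minimal Python sketch of such a parts “backlot”; the class and part names are invented for illustration, not Blue Sky’s actual system.

```python
# A sketch of a parts "backlot": finished parts are filed by category,
# then searched when the next character starts. All names here are
# hypothetical, not Blue Sky's actual data management system.
from collections import defaultdict

class Backlot:
    def __init__(self):
        self._bins = defaultdict(list)  # category -> list of part records

    def deposit(self, category, name, source_character):
        """File a finished part (a nut, bolt, piston) under its bin."""
        self._bins[category].append({"name": name, "from": source_character})

    def rummage(self, category):
        """Return every part previously filed under this category."""
        return list(self._bins[category])

backlot = Backlot()
backlot.deposit("piston", "arm_piston_L", source_character="Rodney")
backlot.deposit("bolt", "hex_bolt_12mm", source_character="Fender")
print(backlot.rummage("piston"))  # ready-made parts for the next character
```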

Another triumph for the ’bot builders was finding ways for animators to do cartoon-style squash and stretch with the hard metal parts. For low-end robots like Rodney, modeled after gas-driven machines with pistons, pivots, and axles, the modelers created multilayered cylinders. “Every arm or limb had a double-piston setup so the cylinders could telescope out to get arm stretches,” says DeFeo. For newer robots like Ratchet, sliding sheets of metal provided flexibility. All the models were fashioned in Alias’s Maya using subdivision surfaces.
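
The telescoping trick is easy to picture with a little arithmetic. Below is a hypothetical Python sketch that spreads an animator’s stretch across the sliding joints of nested cylinders, clamping each slide so the segments never separate; the numbers are invented, not the production rig.

```python
# Hypothetical arithmetic for a telescoping "double-piston" limb: a stretch
# is spread across the sliding joints of nested cylinders, and each slide
# is clamped so some overlap always remains and the metal never bends.
def telescope(seg_len, n_segments, target_length, min_overlap=0.15):
    """Return how far each sliding joint extends for a requested stretch.

    Fully retracted, the limb is one segment long; each of the
    n_segments - 1 joints can slide up to seg_len * (1 - min_overlap).
    """
    n_joints = n_segments - 1
    max_slide = seg_len * (1.0 - min_overlap)
    needed = max(0.0, target_length - seg_len)
    slide = min(needed / n_joints, max_slide)  # even, clamped extension
    return [slide] * n_joints

# A three-segment arm stretching from 1.0 unit to 2.2 units:
print(telescope(seg_len=1.0, n_segments=3, target_length=2.2))
# -> [0.6, 0.6]: each cylinder slides 0.6 units out of the one before it
```
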
Ratchet, a later model robot than Rodney, was built with sliding sheets of steel using subdivision surfaces that were converted to Bezier patches for raytracing.

“Usually carmakers use NURBS to create parts, and each part is made from a set of patches,” says research and development team member Maurice van Swaaij. That might be fine for a car, but it can be torturous for an animator, or for the technical director who has to glue seams back together. “With subdivision surfaces, you can have an arbitrary topology with handles and holes, and it’s all one surface,” he says. “Modelers have an easier time, and animators don’t have pieces with seams everywhere.”
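
Catmull-Clark refinement is the textbook scheme behind subdivision surfaces. The single step below, written for a closed mesh, is not Blue Sky’s code, but it shows why a mesh of any topology stays “all one surface” with no patch seams to glue.

```python
# One textbook Catmull-Clark subdivision step for a closed polygonal mesh.
import numpy as np

def catmull_clark(verts, faces):
    """One subdivision step. verts: (V, 3) floats; faces: lists of vertex
    indices describing a closed mesh."""
    verts = np.asarray(verts, float)

    # Face points: the centroid of each face.
    face_pts = np.array([verts[f].mean(axis=0) for f in faces])

    # Map each edge to the faces it borders.
    edge_faces = {}
    for fi, f in enumerate(faces):
        for a, b in zip(f, f[1:] + f[:1]):
            edge_faces.setdefault(frozenset((a, b)), []).append(fi)

    # Edge points: average of the edge's endpoints and its face points.
    edge_pts = {
        e: (verts[list(e)].sum(axis=0) + face_pts[fs].sum(axis=0)) / (2 + len(fs))
        for e, fs in edge_faces.items()
    }

    # Moved originals: (F + 2R + (n - 3)P) / n, where F averages adjacent
    # face points, R adjacent edge midpoints, and n is the valence.
    new_verts = np.empty_like(verts)
    for v in range(len(verts)):
        adj_f = [fi for fi, f in enumerate(faces) if v in f]
        adj_e = [e for e in edge_faces if v in e]
        n = len(adj_e)
        F = face_pts[adj_f].mean(axis=0)
        R = np.mean([verts[list(e)].mean(axis=0) for e in adj_e], axis=0)
        new_verts[v] = (F + 2 * R + (n - 3) * verts[v]) / n

    # Reassemble: every n-gon splits into n quads, whatever the topology.
    V, F_count = len(verts), len(faces)
    e_index = {e: V + F_count + i for i, e in enumerate(edge_pts)}
    points = np.vstack([new_verts, face_pts, list(edge_pts.values())])
    quads = []
    for fi, f in enumerate(faces):
        for i, v in enumerate(f):
            e_prev = frozenset((f[i - 1], v))
            e_next = frozenset((v, f[(i + 1) % len(f)]))
            quads.append([v, e_index[e_next], V + fi, e_index[e_prev]])
    return points, quads

# One step on a cube: 8 vertices / 6 quads become 26 vertices / 24 quads.
cube_v = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
cube_f = [[0, 1, 3, 2], [4, 6, 7, 5], [0, 4, 5, 1],
          [2, 3, 7, 6], [0, 2, 6, 4], [1, 5, 7, 3]]
points, quads = catmull_clark(cube_v, cube_f)
print(len(points), len(quads))  # 26 24
```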

To make Robot City look inhabited, the crew needed to populate it with hundreds of mechanical extras. So, in addition to building hero characters, modelers created parts for untold numbers of “Frankenbots.” For these, the modelers cranked out several basic robots and then disassembled them into parts.

“We had bipedal robots, uni-wheeled robots, and multi-wheeled robots,” DeFeo says. “We had dozens of heads, torsos, limbs, and legs. A designer could pick and choose a head, a torso, and an arm, then press a button and they would attach to a pre-existing rig.” The system would scale the parts proportionally.
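
A rough Python sketch of that mix-and-match button follows; all part names, slot sizes, and the proportional-scaling rule are invented for illustration.

```python
# A sketch of the mix-and-match idea: pick one part per slot, snap the
# parts onto a shared rig, and scale each so it fits the rig's proportions.
import random

PART_BINS = {
    "head":  [{"name": "dome_head",  "height": 0.8},
              {"name": "boxy_head",  "height": 1.1}],
    "torso": [{"name": "drum_torso", "height": 2.0},
              {"name": "cone_torso", "height": 1.6}],
    "arm":   [{"name": "piston_arm", "height": 1.4},
              {"name": "cable_arm",  "height": 1.7}],
}

RIG_SLOT_HEIGHTS = {"head": 1.0, "torso": 2.0, "arm": 1.5}  # rig's targets

def frankenbot():
    """Choose parts at random and compute the uniform scale that makes
    each one fit its slot on the pre-existing rig."""
    robot = {}
    for slot, target in RIG_SLOT_HEIGHTS.items():
        part = random.choice(PART_BINS[slot])
        robot[slot] = {"part": part["name"], "scale": target / part["height"]}
    return robot

print(frankenbot())  # e.g. {'head': {'part': 'boxy_head', 'scale': 0.909}, ...}
```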

“The volume of work was scary at first because we could see how many parts were needed,” DeFeo says. “But the backlot system was valuable and the shared rigs made the volume and diversity achievable.”

The modelers adopted a similar strategy for dealing with Robot City, the complex urban environment in which the robots live. Inspired by the inner workings of a watch, the city was created in layers, with the oldest, Industrial Revolution-style neighborhood at the bottom. “We would build core ground planes, and then our assembly department would build environments from parts we created,” DeFeo says. “For example, we’d model the floor, walls, and ceiling, and they’d add the set dressing.”

For Ice Age, Blue Sky built sets based on layouts, pragmatically constructing only what the camera could see. “We lost a lot of work because things changed later on,” says Robert Cardone, 3D layout supervisor. Thus, for Robots, the crew devised a new plan: They’d build the entire set and look for camera angles within it. However, they’d do so by cleverly creating modules that could be pieced together as needed like a play set. “It gave us 360 degrees in which to move the camera,” Cardone says. “Something was always there.”

In addition to camera moves, layout artists also roughed in lighting. “We show where light comes from as part of the thumbnail stage,” says Cardone. “That goes to the art department so they can follow the shadow shapes. Half of composition is about what the shadow shapes are doing.”

The layout artists also composed shots showing character choreography. Once the cameras were set, characters placed, and thumbnails approved, 3D artists tightened up the action by working in Maya. The result was a digital workbook that represented each scene that went to the animators. “We have a script that grabs a camera, the characters, and all the set pieces for a shot and gives it a file name,” says Cardone. “So when the animator opens that file, everything that’s needed is there.”
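
As a rough illustration, the script Cardone describes might look something like the Python sketch below, which bundles a shot’s camera, cast, and set pieces under a single shot-named file; the paths and data layout are hypothetical.

```python
# A sketch of the workbook script: gather one shot's camera, cast, and set
# pieces and write them under a single shot-named file, so the animator
# opens one file with everything in place. Paths and layout are invented.
import json
from pathlib import Path

def build_shot_file(shot, camera, characters, set_pieces, root="workbook"):
    """Bundle a shot's contents under the shot's name."""
    scene = {
        "camera": camera,                # approved layout camera
        "characters": list(characters),  # cast placed for this shot
        "set_pieces": list(set_pieces),  # modules the camera can see
    }
    path = Path(root) / f"{shot}.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(scene, indent=2))
    return path

print(build_shot_file("seq12_sh040", "cam_main",
                      ["Rodney", "Fender"], ["street_block_A", "lamppost_07"]))
```
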
Fender’s worn paint and rusty spots were not rendered from painted texture maps. Instead, the R&D team developed procedural methods for creating complex materials.

About 35 animators worked on the film. At first, the crew tried to animate the ’bots as if they were truly machines-with no deformations. “We thought it would be clever if we never had to bend the metal,” says Wedge. “But by the time we finished, we did squash and stretch and deformations to make the characters more appealing. I think the audience forgets they’re made of metal at some point.”

Also, to keep the machines alive, the animators soon realized the mechanical people had to keep moving. “A mechanical person is dead at the beginning so you have to create the illusion of life,” says David Torres, animation lead. “It’s not like a furry or fleshy creature. If the machine stopped moving, it died. So, to create weight and life, we had to move everything slightly.”

For example, with most animated characters, when an eyebrow moves, the forehead wrinkles, and the cheek also moves. Not so with the ’bots. Each part moved independently. Ironically, even though the characters were machines, most of the action was hand animated. “We didn’t have muscle systems, but we did have some mechanical setups,” says animation lead Galen Chu, noting the piston-like structures for extending limbs. “Also, we have a utility called ‘follow-through’ that we used, so that when a character turned its head, we got a bit of overlap for free.” And when Madam Gasket walked around, the steam, smoke, cinders, and ash that followed her were animated procedurally.
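
A follow-through utility of this kind can be approximated with a one-line lag filter: a follower value eases toward the leader’s curve each frame, so a snapped head turn arrives a few frames late on attached parts. A hypothetical sketch, not Blue Sky’s tool:

```python
# A lag filter as a stand-in for "follow-through": the follower closes a
# fraction of the gap to the leader each frame, giving overlap for free.
def follow_through(leader_curve, lag=0.35):
    """Return a delayed copy of a per-frame animation curve; lag is the
    fraction of the remaining gap closed each frame."""
    follower, value = [], leader_curve[0]
    for target in leader_curve:
        value += lag * (target - value)  # ease toward the leader
        follower.append(value)
    return follower

head_turn = [0] * 5 + [90] * 10  # the head snaps from 0 to 90 degrees
print([round(v, 1) for v in follow_through(head_turn)])
# the follower drifts in over several frames instead of snapping
```
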
Lighting an all-raytraced environment became an intricate dance between diffuse and specular illumination.

Similarly, the city had to keep ticking along as well. “Because it’s like the inside of a clock, everything is moving,” says Torres. “Pendulums, cars, trains, blimps. We set up our own procedures, so that we could store animation curves. For example, any time a gear pops into a scene, we would load the gear animation.”
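
Stored, reusable curves are simple to picture: author one gear cycle, then load it, optionally phase-shifted, for every gear that enters a scene. A toy Python version with invented names:

```python
# A toy version of stored animation curves: one gear-rotation cycle is
# authored once, and any gear entering a scene loads it, optionally
# phase-shifted so the gears don't tick in lockstep.
CURVE_LIBRARY = {
    # one full revolution keyed over 24 frames, 15 degrees per frame
    "gear_spin": [frame * 15.0 for frame in range(24)],
}

def load_curve(name, phase=0):
    """Fetch a stored curve, rotated by `phase` frames."""
    curve = CURVE_LIBRARY[name]
    return curve[phase:] + curve[:phase]

big_gear   = load_curve("gear_spin")
small_gear = load_curve("gear_spin", phase=6)  # a quarter turn out of step
print(big_gear[0], small_gear[0])              # 0.0 90.0
```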

Ultimately, the entire world comes to life when it’s lit and rendered, no small task in an environment as busy as Robot City. “We didn’t have to deal with subsurface scattering,” says Dave Esnault, lighting supervisor, “but we did have to light in a completely metallic world. We don’t use environment maps; we use the environment. Sometimes, we would get the lights perfect for a character, and then a highlight on a wall behind would reflect back onto it.” Thus, lighters sometimes rendered characters and backgrounds in separate passes.
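
Rendering in separate passes implies the standard “over” composite: the character pass carries an alpha channel and is layered back onto the background pass. A minimal premultiplied-alpha sketch with NumPy, offered as illustration rather than Blue Sky’s pipeline:

```python
# Compositing a premultiplied foreground pass over a background pass.
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """fg_rgb, bg_rgb: (H, W, 3) float images; fg_alpha: (H, W) coverage."""
    a = fg_alpha[..., None]
    return fg_rgb + (1.0 - a) * bg_rgb

# 2x2 toy passes: one 80%-covered character pixel over a gray wall
bg = np.full((2, 2, 3), 0.5)
fg = np.zeros((2, 2, 3)); fg[0, 0] = [0.4, 0.1, 0.1]  # premultiplied color
alpha = np.zeros((2, 2)); alpha[0, 0] = 0.8
print(over(fg, alpha, bg)[0, 0])  # character color mixed with 20% wall
```
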
Fender’s eyebrows were bendable windshield wipers, but most body parts for the ’bots were not deformable, so the characters had to be rigged with “movers” that rotated or translated individual body parts mechanically.

“We didn’t want to go into the making of Robots with any preconceived constraints,” says Carl Ludwig, director of R&D. “We wanted to show how far we could take the rendering, our forte.” But when the development group examined the design, it became clear that it was beyond what the budget would allow. “The world was extraordinarily complex, but it’s amazing what you can achieve when you put yourself into a challenging place.”

“The world appeals to me more than anything,” says Wedge. “Ice Age was fun, but the characters would go from one set piece and one idea to another. Robots feels more like a movie. We made a complete fantasy world, so we had to invent a lot of rules about how it works. Much of the time in the movie is spent discovering new aspects about the world.”

“It’s a little trippy,” Wedge adds. But it’s a ride worth taking.

Barbara Robertson is an award-winning journalist and a contributing editor for Computer Graphics World.

To pare down Robots’ complexity, the research team removed texture maps from surfaces. “We came up with procedurally based ways to create materials,” says Carl Ludwig, director of R&D, “and that cut memory usage way down. It also made it easier because we did not have to deal with UV spaces. The technique is akin to procedural textures, but we had a tremendous amount of local control. We could create grease stains.”

Rather than using painted texture maps, the crew created these materials with a node-based system developed in Maya. “It was a real paradigm shift,” says researcher Maurice van Swaaij. “The materials are created out of noise functions that are evaluated in space and layered together: a paint layer, another paint layer, a rust layer. You can hook little processing nodes together to make elaborate things.” One node might interpolate between two colors and then be hooked to a third node; a control element called a “spotty” might add bits of texture.
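
A toy version of the layering idea, using ordinary value noise in place of Blue Sky’s nodes: two paint tones blended by one noise field, a rust layer keyed from thresholded noise, and a “spotty”-style speckle control. Everything here is illustrative.

```python
# Layered procedural material built from noise evaluated in space.
import numpy as np

rng = np.random.default_rng(7)

def value_noise(shape, cells):
    """Smoothly interpolated random lattice, evaluated over an image grid."""
    lattice = rng.random((cells + 1, cells + 1))
    ys = np.linspace(0, cells, shape[0], endpoint=False)
    xs = np.linspace(0, cells, shape[1], endpoint=False)
    y0, x0 = ys.astype(int), xs.astype(int)
    ty = (ys - y0)[:, None]; tx = (xs - x0)[None, :]
    ty = ty * ty * (3 - 2 * ty); tx = tx * tx * (3 - 2 * tx)  # smoothstep
    a = lattice[np.ix_(y0, x0)];     b = lattice[np.ix_(y0, x0 + 1)]
    c = lattice[np.ix_(y0 + 1, x0)]; d = lattice[np.ix_(y0 + 1, x0 + 1)]
    return (a * (1 - tx) + b * tx) * (1 - ty) + (c * (1 - tx) + d * tx) * ty

def lerp(c0, c1, t):
    """Blend two colors by a per-pixel mask (one 'processing node')."""
    return c0 * (1 - t[..., None]) + c1 * t[..., None]

H = W = 256
paint  = lerp(np.array([0.7, 0.2, 0.2]), np.array([0.8, 0.3, 0.2]),
              value_noise((H, W), 4))              # two paint layers
rust   = value_noise((H, W), 16) > 0.75            # thresholded rust mask
spotty = value_noise((H, W), 64) > 0.9             # fine speckles
swatch = lerp(paint, np.array([0.35, 0.2, 0.1]), rust.astype(float))
swatch = lerp(swatch, np.array([0.1, 0.1, 0.1]), 0.5 * spotty.astype(float))
print(swatch.shape)  # (256, 256, 3): a procedural material test swatch
```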

Sometimes the team worked on simple geometric shapes; however, more often, they used actual models. When they finished, the system compiled the network of nodes and launched the renderer. For rendering, the subdivision surfaces were converted to Bezier patches. “Compiling was the essential step,” says van Swaaij. “We’ll not go back to mapping, even for organic characters.”

Blue Sky’s proprietary raytracing software, CGI Studio, has evolved since the studio was founded in 1987. Often, people think of raytracing as slower than other rendering methods. But R&D director Carl Ludwig boasts that Blue Sky has found ways to make it more efficient. “Our average render time was about six or seven hours per frame,” he says. The studio has some 500 processors in its renderfarm, a combination of Xeon and 48-bit addressable AMD chips.

“One thing we do is look at where the rays will do the most good,” Ludwig says. “We’ve built an almost expert system that knows where light comes from, what needs to be addressed, and where it will be rendered.”

Eugene Troubetzkoy, one of the founders of Blue Sky, is also one of the fathers of raytracing, says Ludwig. “He has become very efficient at calculating the intersection between a ray and a surface.”
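
That intersection test is the core operation of any raytracer. The textbook ray-sphere case, shown below purely for illustration (Blue Sky’s optimized routines are proprietary), reduces to solving a quadratic for the distance along the ray.

```python
# Textbook ray-sphere intersection: solve a quadratic for the hit distance.
import math

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance, or None for a miss.
    `direction` must be normalized; points are (x, y, z) tuples."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c              # a = 1 for a unit direction
    if disc < 0:
        return None                     # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0    # nearer of the two roots
    return t if t > 0 else None

print(ray_sphere((0, 0, -5), (0, 0, 1), (0, 0, 0), 1.0))  # 4.0
```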

“Because we use a raytracer, it’s easy to set up algorithms for diffuse reflection and those sorts of things,” Ludwig adds. “But when you want to do diffuse reflections from highly specular materials, you have to be careful because there’s so much energy. Sometimes we’d see a little dancing, so we’d have to increase the sampling rate. Also, we had a notion of how to treat specular energy along the ray, which was a rather important breakthrough for us. But I don’t want to give away too many secrets.”
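
The standard response to that kind of “dancing” is adaptive sampling: when a pixel’s samples disagree, trace more rays there. A generic sketch, not Blue Sky’s system; the shader stand-in and thresholds are invented.

```python
# Adaptive sampling: double the ray count wherever samples disagree.
import random, statistics

def shade_sample():
    """Stand-in for tracing one ray: mostly dim, with an occasional
    high-energy specular spike."""
    return 50.0 if random.random() < 0.02 else 0.2

def pixel_value(base=8, max_samples=256, tol=1.0):
    samples = [shade_sample() for _ in range(base)]
    while statistics.pstdev(samples) > tol and len(samples) < max_samples:
        samples.extend(shade_sample() for _ in range(len(samples)))  # double
    return sum(samples) / len(samples), len(samples)

random.seed(3)
value, n = pixel_value()
print(round(value, 3), n)  # noisy pixels automatically receive more rays
```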

He’ll reveal one secret, though: “The problem when you render something digitally is that the brightest you can get is white,” says Ludwig. “So to make spots on the metal look brighter than a white spot, the renderer supports a huge dynamic range and can represent values up to a million. We can then do a calculation on energy and bloom the white out into a flare. When you look at the flared thing, your eye almost winces even though it’s no brighter. It’s a psychophysical thing that your brain does.”
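
The bloom trick he describes can be sketched in a few lines: keep the frame in high dynamic range, spill any energy above display white into neighboring pixels, then clamp. The box blur and constants below are invented for illustration.

```python
# HDR bloom: excess energy above display white spills into a flare.
import numpy as np

def bloom(hdr, threshold=1.0, spread=5, strength=0.4):
    """hdr: (H, W) float image that may hold values far above 1.0."""
    excess = np.maximum(hdr - threshold, 0.0)  # energy beyond white
    kernel = np.ones(spread) / spread
    for axis in (0, 1):                        # cheap separable blur
        excess = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), axis, excess)
    return np.clip(hdr + strength * excess, 0.0, 1.0)

frame = np.zeros((9, 9)); frame[4, 4] = 1e6  # a searing specular hit
out = bloom(frame)
print(out[4, 4], out[4, 6])  # the hot spot and its halo flare to white
```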


In one scene, lead character Rodney accidentally tips over a domino and starts a chain reaction that builds into an ocean of little black blocks with white dots that carry one of the characters on a wave as if he’s surfing. The sequence is as over the top as the avalanche of acorns in Blue Sky’s Oscar-nominated short film “Gone Nutty.”

Because the sequence requires millions of dominoes, the animation had to be procedural. However, effects lead Robert Cavaleri wanted animators to control the action. So, he created a kind of blanket underneath the dominoes using a simple NURBS patch; animators choreographed the changing shape of the wave by manipulating this geometry using keyframe animation. Cavaleri then covered the blanket with millions of procedurally animated particles that were rendered as dominoes: little pieces of geometry with randomly generated white dots, all created within Blue Sky’s CGI Studio.
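
The two-level setup, a keyframed blanket underneath and procedural instances on top, can be sketched as follows, with a sine wave standing in for the animators’ NURBS patch; all numbers are invented.

```python
# A procedural layer riding a keyframed "blanket" surface: each sample
# becomes a domino instance, taking position and lean from the surface.
import math

def blanket_height(x, t):
    """Stand-in for the keyframed wave surface at position x, frame t."""
    return 2.0 * math.sin(0.5 * x - 0.1 * t)

def domino_instances(n, t, spacing=0.3):
    """Place n dominoes along the blanket, leaning with the local slope."""
    dominoes = []
    for i in range(n):
        x = i * spacing
        h = blanket_height(x, t)
        slope = (blanket_height(x + 1e-3, t) - h) / 1e-3  # finite difference
        dominoes.append({"pos": (x, h), "lean": math.atan(slope)})
    return dominoes

for d in domino_instances(3, t=12):
    print(d)  # millions of these would be emitted procedurally per frame
```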

“Procedural math and simulation can be hard to control, but by incorporating keyframe animation techniques, we were able to sell the story points, drawing attention to or away from characters and elements when needed,” says Cavaleri. “We also got approvals faster because we could get a buy-off on the wave animation and characters before we designed the look of individual dominoes on the waves.”


On his way home from a party, the character Fender breaks into song and dances into a fountain. “He’s in an area of the street that has oil fountains,” says effects lead Robert Cavaleri. “They’re lit beautifully; it’s like Las Vegas.”

To create the oil, the effects team first tried using particles, but the resulting motion didn’t look like flowing oil. “The Newtonian mechanics were not as pleasing as fluid dynamics,” he says. “So we used [Next Limit’s] RealFlow software, and the resulting motion looked right for the action. The artistry, though, came from the technical directors tweaking the parameters to make it feel like oil and not mud or water.”
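
One simple way to see why coupled fluid motion reads better than independent Newtonian particles is an XSPH-style velocity blend, in which each particle drifts toward its neighbors’ average velocity so the mass moves together instead of spraying apart. This generic sketch is not RealFlow’s solver.

```python
# XSPH-style velocity smoothing: neighbors share momentum, so particle
# blobs flow coherently rather than scattering ballistically.
import numpy as np

def step(pos, vel, dt=0.02, radius=0.5, eps=0.5, gravity=(0.0, -9.8)):
    """Advance 2D particles one step with neighbor-averaged velocities."""
    vel = vel + dt * np.asarray(gravity)                 # ballistic part
    dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=2)
    near = (dist < radius).astype(float)                 # neighbor mask
    avg = near @ vel / near.sum(axis=1, keepdims=True)   # local mean velocity
    vel = vel + eps * (avg - vel)                        # XSPH smoothing
    return pos + dt * vel, vel

rng = np.random.default_rng(1)
pos = rng.random((50, 2)); vel = rng.normal(0.0, 1.0, (50, 2))
for _ in range(10):
    pos, vel = step(pos, vel)
print(np.std(vel, axis=0))  # velocity spread shrinks: the blob flows as one
```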