The Game's Afoot
Volume 28, Issue 11 (November 2005)

The Game's Afoot

The question is, What if board games were real? Author and illustrator Chris Van Allsburg has turned that flight of fancy into two children’s storybooks, and the storybooks have become films that rely on effects rather than Van Allsburg’s drawings to realize the fantasy. For the 1995 movie Jumanji, Industrial Light & Magic brought a jungle of CG animals into young Judy and Peter’s lives and living room to scary effect, developing custom hair-simulation and facial-animation software to do so.

Now, Van Allsburg’s second story, Zathura, is being lifted off the page and brought to life on film: Directed by Jon Favreau of Elf fame, Sony Pictures Entertainment’s Zathura opens November 11. In Zathura, the shift in time and space happens not in a jungle but in outer space, where the brothers dodge meteor showers, a dysfunctional robot, lizard-like aliens called Zorgons, and the pull of the black hole named Zathura.

In keeping with the film’s style, which visual effects supervisor Pete Travers describes as having the look of “an old science-fiction movie,” the storybook adventure blends practical and digital effects. “The design had to look like it was born out of a board game created in the 1950s,” he explains. Sony Pictures Imageworks handled 200 digital effects shots, Café FX created a CG character, and Digital Dimension also lent a hand.


A children’s board game called Zathura sends players into outer space, thanks to digital effects created, in part, at Sony Pictures Imageworks.
Images © 2005 Columbia Pictures Industries, Inc.

As in most effects films these days, the digital sleight of hand was a mixture of raw R&D and tried-and-true methods from, in Imageworks’ case, Spider-Man 2, The Haunted Mansion, and The Polar Express (which was based on a Christmas story by Van Allsburg).

“Producers like to know we aren’t reinventing the wheel for every effect,” says David Seager, one of three CG supervisors on the film. “But we started from a blank slate for a handful of things.”

The effects start when the boys pop up the first game card, which reads, “Meteor Shower. Take Evasive Action.” As the card predicts, meteors start bombarding the house. Because this sequence sets the stage for the action to come (each card triggers a new, frightening result), the director wanted a high level of energy. “He wanted creative control over where the meteors would hit for the greatest visual impact,” Travers says.

Imageworks provided that control. The first step was matching the camera moves used to film actors pretending that meteors were flying through the room. The matchmovers used Alias’s Maya, which was also employed for modeling and animation throughout the show. “In addition, the matchmove department generated basic geometry for the environment so we had our bearings,” says Seager.

Next, based on the boys’ movements, layout artists began plotting the timing and placement of the meteors. “If an actor had changed direction abruptly, the director wanted it to seem like he had reacted to a meteor,” Seager explains, “but we didn’t want it to be too scary.”

After the director approved the choreography, effects artists began putting hundreds of meteors into the scenes in front of and behind furniture and other objects in the room. Although the meteors are pieces of fiery orange geometry, they move so fast through the scene that the audience sees only the smoke trails left behind.

The technique for creating the smoke trails evolved from technology called Strands, which created the misty atmosphere in Disney’s film The Haunted Mansion. Using tools that work within Side Effects Software’s Houdini, the effects artists started the task by identifying an impact point and the timing of the hit. “The system backtracked the speed and the angle the meteor had to travel, and generated a coarse particle trail,” says Seager, “and it generated sparks from the impact points.”
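Imageworks’ Strands-based setup is proprietary, but the backtracking step Seager describes can be sketched in a few lines of Python with NumPy. The function name and the assumption that the artist also picks an entry point (a window or a hole in the ceiling, say) are illustrative, not details of the production pipeline:

    import numpy as np

    def backtrack_meteor(entry_point, impact_point, t_enter, t_hit, fps=24.0, per_frame=4):
        """Given where and when a meteor enters the room and where and when it hits,
        recover the constant velocity it must have had and lay down a coarse particle
        trail along that path (impact sparks would be a separate emission)."""
        entry = np.asarray(entry_point, dtype=float)
        impact = np.asarray(impact_point, dtype=float)
        velocity = (impact - entry) / (t_hit - t_enter)          # scene units per second
        n = max(2, int((t_hit - t_enter) * fps * per_frame))     # coarse sampling of the path
        times = np.linspace(t_enter, t_hit, n)
        positions = entry + velocity * (times - t_enter)[:, None]
        return velocity, times, positions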


That particle trail became the starting point for a simulation that was modified with such parameters as buoyancy, noise, and turbulence to produce a refined smoky stream. “We didn’t want dense, thick smoke because the room would have been too dark,” Seager says. “So, we experimented until we got a gentle smoky trail.”
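As an illustration rather than the studio’s actual simulation, a coarse trail like the one above could be softened into a gentle smoky stream by drifting its points upward and jittering them over time; the parameter names echo the ones Seager mentions (buoyancy, noise, turbulence), but the values are placeholders:

    import numpy as np

    def refine_trail(positions, steps=48, dt=1.0 / 24.0,
                     buoyancy=0.2, noise_amp=0.05, turbulence=0.3, seed=7):
        """Advect coarse trail points into a soft smoky stream: a gentle upward
        drift (buoyancy) plus random perturbations (noise) that grow with age
        (turbulence). Returns one point set per simulated frame."""
        rng = np.random.default_rng(seed)
        pts = np.asarray(positions, dtype=float).copy()
        frames = [pts.copy()]
        for i in range(steps):
            age = (i + 1) * dt
            drift = np.array([0.0, buoyancy * dt, 0.0])
            jitter = rng.normal(0.0, noise_amp * (1.0 + turbulence * age), pts.shape) * dt
            pts = pts + drift + jitter
            frames.append(pts.copy())
        return frames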

Because the actors had been filmed in a room with holes cut into the floor for the meteors, the compositors, using Imageworks’ proprietary Bonsai software, painted over those holes and added others to match the postproduction choreography. To complete the illusion, they layered in sparks and placed flames on the rims of the holes.

In a second major digital effects sequence, Danny opens the front door and sees that the house is on the edge of Saturn’s rings. For this shot, everything outside the door is CG. A matte painting with nebulae and stars mapped onto a curved surface forms the background, and 3D asteroids ring the planet, which is itself a 3D object with fluid simulations creating gaseous storms on the surface.

CG supervisor Bob Winter’s team started with NASA photos of Saturn as references, but soon realized that re-creating reality wouldn’t be dramatic enough. “Saturn is very simple,” says Winter. “The surface has uniform rings with subtle hue shifts between them. It’s boring. We needed to add more life to it.”

Rather than aim the camera at Saturn itself, they decided it would be more dramatic to put the house on the edge of an asteroid belt and orient the camera straight down the belt. To create a feeling of depth, they populated the belt with asteroids in a variety of sizes.

Modelers working in Maya built a library of 25 rocky asteroids that were instanced into thousands. “We rendered them in layers,” Winter says, “to control the hue shifts.” By placing lights behind the rocks and using subsurface scattering, the artists made areas of the asteroids translucent, to create the effect of dirty ice crystals in the rocks.
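The instancing idea is easy to sketch. The Python below is not Imageworks’ code (the counts, radii, and distributions are stand-ins); it scatters copies of a small asteroid library around a belt, recording only a source index and a transform per instance so the renderer never duplicates heavy geometry:

    import math
    import random

    def scatter_asteroids(library_size=25, count=3000, inner_radius=400.0,
                          outer_radius=600.0, thickness=40.0, seed=11):
        """Return lightweight instance records for a ring-shaped asteroid belt."""
        random.seed(seed)
        instances = []
        for _ in range(count):
            angle = random.uniform(0.0, 2.0 * math.pi)
            radius = random.uniform(inner_radius, outer_radius)
            position = (radius * math.cos(angle),
                        random.uniform(-thickness, thickness),
                        radius * math.sin(angle))
            instances.append({
                "source": random.randrange(library_size),            # which of the 25 models
                "translate": position,
                "rotate": tuple(random.uniform(0.0, 360.0) for _ in range(3)),
                "scale": random.lognormvariate(0.0, 0.6),            # wide size variety for depth
            })
        return instances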


To heighten the drama, Imageworks modeled Saturn and then used fluid simulations to create gaseous storms on the surface. Next, they added an asteroid belt, rendering various-sized icy rocks in layers.

During another sequence, the boys’ house is pulled toward a make-believe planet, the fireball Tsouris3. “We needed to create a rock-encrusted planet that had a lot of volcanic activity,” says Winter. “[The director] wanted bursts of smoke and debris coming out of the volcanoes. And, to show the house moving close to this planet, we needed an immersive atmosphere.”

For this, Imageworks started with a procedurally generated reddish-orange base texture on the planet’s surface and then, in Adobe’s Photoshop, layered textures of mountain ranges on top. For the corona, they used volumetric rendering with various noise patterns, rendered out in multiple layers. At the end of the process, compositors tweaked the levels and colors to complete the immersion illusion.
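The article does not detail the shaders, but a procedurally generated reddish-orange base can be illustrated with a simple fractal-noise sketch in Python with NumPy; the function names and the color ramp are invented for the example:

    import numpy as np

    def fbm(size=512, octaves=5, seed=3):
        """Cheap fractal value noise: a weighted sum of bilinearly upsampled random grids."""
        rng = np.random.default_rng(seed)
        out = np.zeros((size, size))
        amplitude, total = 1.0, 0.0
        for octave in range(octaves):
            cells = 2 ** (octave + 2)
            grid = rng.random((cells + 1, cells + 1))
            xs = np.linspace(0.0, cells, size)
            x0 = np.floor(xs).astype(int).clip(0, cells - 1)
            fx = xs - x0
            rows = grid[x0] * (1.0 - fx)[:, None] + grid[x0 + 1] * fx[:, None]
            layer = rows[:, x0] * (1.0 - fx)[None, :] + rows[:, x0 + 1] * fx[None, :]
            out += amplitude * layer
            total += amplitude
            amplitude *= 0.5
        return out / total

    def lava_planet_base(size=512):
        """Map the noise into a reddish-orange base texture (RGB values in [0, 1])."""
        n = fbm(size)
        red = 0.55 + 0.45 * n
        green = 0.15 + 0.35 * n ** 2      # orange only where the noise is bright
        blue = np.full_like(n, 0.05)
        return np.dstack([red, green, blue])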

Meanwhile, inside the house, a robot appears when the boys draw a card with the message: “Your Robot is Defective.” Travers describes the robot as a “giant, lumbering character that destroys half the house trying to get the kids.”

The robot looks something like a windup toy from the ’50s, with a barrel chest, thin legs and arms, and two antennae. For an exploding door sequence, Imageworks created an entirely CG robot; however, for most scenes, because the director wanted to capture the robot “in camera,” the production crew filmed a stunt actor, clad in a black leotard and wearing a robot’s torso and head. Imageworks later painted out the actor’s legs and arms and, working in Maya, replaced them with skinny, robotic appendages.

Once the camera was matchmoved from the scanned live-action film, rotoscopers created the matching CG torso. Animators could see the live-action torso from the photographed image and, on top, a transparent replica in CG with digital arms and legs.

“Because the torso doesn’t deform, once the matchmovers lined up the CG torso, we had a dead-on lock for the performance,” says Winter. “The only tricky part was when the human actor ran and we got squash and stretch in the spine. So, when we replaced his legs with the CG legs, our animators had to compensate for that.” They did this by separating the upper and lower body and creating a piston motion between the two to absorb the difference.
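One way to picture that compensation: measure, frame by frame, how far the tracked chest and hips drift from the rigid robot’s fixed spine length, and let a telescoping joint absorb the difference. The Python below is an illustration of the idea, not Imageworks’ rig:

    import numpy as np

    def piston_offsets(chest_track, hip_track, rest_length):
        """Per-frame squash/stretch of the performer's spine relative to the rigid
        robot: positive values mean stretch, negative mean squash, and the piston
        joint between upper and lower body takes up exactly this amount."""
        chest = np.asarray(chest_track, dtype=float)
        hips = np.asarray(hip_track, dtype=float)
        lengths = np.linalg.norm(chest - hips, axis=1)
        return lengths - rest_length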


In most scenes, the robot is an actor in a suit, whose arms and legs were replaced by Imageworks with CG appendages. Occasionally, however, it’s an all-CG model.

To render the reflective metal of the robot, whether all CG or part CG, as it walked through the live-action scene, the team used Pixar’s RenderMan and reflection cards. These cards were 2D planes with images of the interior projected onto them; the images were provided by the production crew. “We built our own reflective house so that the robot would correctly reflect the interiors as it went from room to room,” says Winter. To determine which parts of the robot’s surface would actually reflect the surroundings, they used reflection occlusion passes: if the robot raised its arm, the surface beneath would not be rendered with reflections. “The most tedious part was setting up the surface so that it matched every little smudge of grease and every scratch in the practical robot,” says Winter.
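Reflection occlusion can be illustrated with a toy ray test: from a shaded point, follow the mirror direction and check whether any of the robot’s own geometry, crudely approximated here by spheres, blocks the path to the reflection card. This Python sketch is a conceptual stand-in, not the RenderMan setup used on the show:

    import numpy as np

    def reflect(view_dir, normal):
        """Mirror the incoming view direction about the surface normal."""
        v = np.asarray(view_dir, dtype=float)
        n = np.asarray(normal, dtype=float)
        return v - 2.0 * np.dot(v, n) * n

    def hits_sphere(origin, direction, center, radius):
        """True if a ray from origin along direction strikes the sphere in front of it."""
        oc = np.asarray(origin, dtype=float) - np.asarray(center, dtype=float)
        d = np.asarray(direction, dtype=float)
        a = np.dot(d, d)
        b = 2.0 * np.dot(oc, d)
        c = np.dot(oc, oc) - radius * radius
        disc = b * b - 4.0 * a * c
        if disc < 0.0:
            return False
        t = (-b - np.sqrt(disc)) / (2.0 * a)
        return t > 1e-4

    def reflection_occlusion(point, normal, view_dir, blockers):
        """1.0 if the mirror direction reaches the reflection card unblocked,
        0.0 if blocking geometry (say, a raised arm) is in the way."""
        r = reflect(view_dir, normal)
        for center, radius in blockers:
            if hits_sphere(point, r, center, radius):
                return 0.0
        return 1.0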

The reptilian Zorgons fly battleships that resemble old iron warships with rusted cannon doors and furnace-lit interiors. Sometimes the ships were practical: eight-foot models filmed on greenscreen stages. Sometimes the ships were CG. But even the practical ships needed CG touches: When a practical ship pulls up outside the boys’ window and fires a cannon into the house, effects artists added digital smoke and cannon blasts. “We also added practical fire to a fireplace where the boys were hiding, and digital fire to augment digital debris,” says CG supervisor Bob Peitzman.

With one exception, the Zorgons themselves were hunched-over stunt actors wearing suits created at Stan Winston Studio. The suits included a head and neck that protruded from the actor’s chest; a bluescreen hood covered the actor’s own head. Thus, for every shot with a Zorgon, Imageworks removed puppeteering rods and wires and the bluescreen head, and added spikes and plates from the front to the back.

“The matte painters supplied the spikes,” says Peitzman. “We matched the performer’s actions in varied, changing lighting conditions and angles without doing a full-CG replacement.” For this, compositors worked in Autodesk Media and Entertainment’s Discreet Flame software as well as Imageworks’ Bonsai.

In one shot of a lizard jumping up into the spaceship, however, the suit was too heavy for the stuntman, so Café FX created a CG Zorgon. “It took several months to create the shot because the Zorgon is such an intricate model,” says Vicki Gallowayweimer, visual effects producer for Café FX. “He has armor on his knees, shoulders, and elbows, and plates that move.”

Café FX built the model in NewTek’s LightWave, animated the character in Maya, and rendered it with Mental Images’ Mental Ray. For compositing, the team used Eyeon Software’s Digital Fusion to layer the character into Imageworks’ files and plates. “When you have only one shot that’s CG against live action in the entire movie, you have to be very deliberate about matching the way it looks, the way light hits it, the way it moves,” Gallowayweimer says.

During the climax of the film, the Zorgon aliens circle and shoot at the boys in their house, and then the aliens, and the boys, discover they’re all moving close to a black hole. Outside, a camera floats in space and looks at the unfolding scene; inside, the house is breaking apart.

Surrounding the house were as many as 60 Zorgon ships, a combination of miniatures and CG, that banked around the house and shot at it. The CG ships were built in Maya with subdivision surfaces and were lit using techniques honed on Spider-Man 2 for ambient occlusion, reflection occlusion, key lighting passes, and rim lights. The shots were all rendered through RenderMan.
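The article lists the passes but not the recipe for combining them. One conventional assembly, shown here purely as an illustration with generic pass names, attenuates the environment contribution by ambient occlusion and the reflection pass by reflection occlusion before adding the key and rim contributions:

    import numpy as np

    def combine_light_passes(key, rim, environment, reflection, ambient_occ, reflection_occ):
        """Assemble separately rendered lighting passes into one beauty image."""
        out = np.asarray(key, dtype=float).copy()
        out += np.asarray(environment, dtype=float) * np.asarray(ambient_occ, dtype=float)
        out += np.asarray(reflection, dtype=float) * np.asarray(reflection_occ, dtype=float)
        out += np.asarray(rim, dtype=float)
        return np.clip(out, 0.0, 1.0)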

The stars and other elements in the background were created with procedural techniques in Houdini that populated a virtual sphere with patterns matching those on a painted tarp. “Any direction you looked, you’d see stars and also the gas surrounding the stars so that the black of space is not just space,” Seager explains. The stars were dots, but sprites created the blue cloudy areas around the stars and the nebula.
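A minimal version of that virtual sphere can be sketched in Python: sample uniform directions for the star dots and a sparser set for the cloud sprites. The counts, radius, and brightness falloff below are placeholders, not values from the show:

    import numpy as np

    def sample_sphere(n, rng):
        """Uniformly distributed random directions on the unit sphere."""
        z = rng.uniform(-1.0, 1.0, n)
        theta = rng.uniform(0.0, 2.0 * np.pi, n)
        r = np.sqrt(1.0 - z * z)
        return np.column_stack([r * np.cos(theta), r * np.sin(theta), z])

    def build_star_dome(radius=10000.0, n_stars=20000, n_sprites=400, seed=5):
        """Populate a huge sphere around the set with point stars and, more
        sparsely, positions and scales for cloudy nebula sprites."""
        rng = np.random.default_rng(seed)
        stars = radius * sample_sphere(n_stars, rng)
        brightness = rng.random(n_stars) ** 4          # many dim stars, a few bright ones
        sprites = radius * sample_sphere(n_sprites, rng)
        sprite_scale = rng.uniform(200.0, 800.0, n_sprites)
        return {"stars": stars, "brightness": brightness,
                "sprites": sprites, "sprite_scale": sprite_scale}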

For the sprites, the team drew from a large library at Imageworks containing clouds and other elements, photographic and painted. “We had a system set up so that once the camera was matchmoved, we could flip a switch and run out passes for the stars,” Seager says. “For the most part, getting to Version One was quick. Then, we’d begin tweaking elements.”

Imageworks’ custom Splat renderer, which was developed for The Polar Express, made quick work of rendering the sprites. They used it to help render debris for the house destruction, as well. “We had two types of destruction,” Seager explains, “explosive destruction from Zorgons shooting the house and, later, the house getting ripped apart as it nears the black hole.”

For exterior shots of the house destruction, the production crew filmed a miniature model that pulled up chunks of dirt as it was ripped from the ground. Because they destroyed the house during filming, the crew shot the destruction sequentially. “We received a lot of before and after plates,” says Seager, explaining that they wanted multiple motion-control passes as well as matte passes because the house was rim lit; rather than trying to pull mattes from the rim-lit shots, the compositors used the greenscreen passes.


To put the fire into this fireplace where the children are hiding, Imageworks used practical fire. The house is destroyed throughout the film with the help of digital debris.

To manage the debris, Imageworks created a system in Maya named the Debris Hose. With the “hose,” they could specify a location or direction and shoot debris toward it. They created the debris by pulling predefined geometry from a library, instancing it, and attaching the geometry to particles. The particle simulation determined the motion for all but a few pieces of hero debris; animators keyframed hero elements. “In some cases, we had to custom-model interior bits, such as pieces of wall with wires sticking out,” says Seager.
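As a rough illustration of the “hose” idea, not the actual Maya tool, the Python below aims at a target point, spawns debris that each reference a piece of library geometry, gives them jittered directions inside a spread cone, and then advances them ballistically; hero pieces would simply be keyframed instead:

    import math
    import random

    GRAVITY = (0.0, -9.8, 0.0)
    DEBRIS_LIBRARY = ["plank", "brick", "shingle", "glass_shard", "wired_wall_chunk"]  # stand-in names

    def normalize(v):
        length = math.sqrt(sum(c * c for c in v)) or 1.0
        return tuple(c / length for c in v)

    def fire_debris_hose(origin, target, count=200, speed=(8.0, 14.0), spread=0.25, seed=2):
        """Spawn debris particles aimed from origin toward target."""
        random.seed(seed)
        aim = normalize(tuple(t - o for t, o in zip(target, origin)))
        debris = []
        for _ in range(count):
            direction = normalize(tuple(a + random.uniform(-spread, spread) for a in aim))
            speed_value = random.uniform(*speed)
            debris.append({
                "geo": random.choice(DEBRIS_LIBRARY),
                "position": origin,
                "velocity": tuple(speed_value * d for d in direction),
                "spin": tuple(random.uniform(-180.0, 180.0) for _ in range(3)),
            })
        return debris

    def step(debris, dt=1.0 / 24.0):
        """Advance every piece one frame under gravity."""
        for piece in debris:
            piece["velocity"] = tuple(v + g * dt for v, g in zip(piece["velocity"], GRAVITY))
            piece["position"] = tuple(p + v * dt for p, v in zip(piece["position"], piece["velocity"]))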

The black hole itself was a combination of practical and CG elements. To create its fiery rim, the production crew mounted a camera to a black disk. “From the camera, you see fire shooting out from a ring of black,” says Seager. The fire was created with practical explosions.

At Imageworks, the scanned film was slowed down and run backward so the fire looked like it moved from the outside in rather than the inside out. “We had done early tests using all CG for the black hole, but Joe Bauer, the VFX supe, wanted to try this,” Seager says. Thus, rather than “chasing down a gaseous flowing effect in CG,” as Seager puts it, they picked elements they liked from 20 minutes of fire footage-sometimes entire frames, sometimes a small piece of a frame.
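Expressed simply as a frame remap, and only as an illustration of the idea rather than the editorial workflow, playing the scanned fire element backward and slowed down looks like this:

    def reverse_and_slow(source_frames, slowdown=2.0):
        """Resample a clip so it plays backward at reduced speed:
        output frame i samples source frame (last - i / slowdown), held to whole frames."""
        last = len(source_frames) - 1
        out = []
        i = 0
        src = float(last)
        while src >= 0.0:
            out.append(source_frames[int(src)])
            i += 1
            src = last - i / slowdown
        return out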

Compositors meshed these elements in Autodesk’s Discreet Inferno to produce an image looking straight into the black hole. The 3D team then positioned that hole in the 3D environment, rendered it out, and handed it to compositors working in Bonsai to blend with the rest of the action.

One of the last pieces in the puzzle was the Zorgon ships’ thrusters, which spewed fiery elements. For this, practical fire shots were mapped onto 3D cards.

In addition to the major sequences, shots throughout the film were augmented and enhanced. When real flames would have been too hot for the stunt performers in the Zorgon suits, effects artists added digital fire. There were also lightning flashes, explosions, and even a CG goat (modeled and animated in Maya) that greets one of the brothers as he climbs onto a Zorgon ship.

“It was nice to have a full range of work on this show,” says Seager, “from tried-and-true traditional work like the robot to raw R&D like the Tsouris3 planet.”

The show also gave the Imageworks team a chance to try new methods of working. “Some people specialized, particularly those working on the spaceship and on the star fields,” says Peitzman, “but 80 percent of the time, people did both 3D and compositing. That’s become more the norm these days at Imageworks.”

Van Allsburg’s stories make pretense, a board game, real, albeit an impossible, exaggerated reality. Ten years after filmmakers brought his first board-game story to life with digital effects, that story premise still holds. Similarly, although the tools have changed and the artists have gained experience, the premise for making effects remains the same: Fit the effect into photoreality and make it believable, even when that means putting a house on the rings of Saturn.


Although the house is a miniature, the clumps of dirt are real, and the fire is made of photo elements, this final shot was composited digitally.


(Above and below) In keeping with the style of the film, practical effects were often composited into the live-action plates, as was required for these scenes.



Barbara Robertson is an award-winning journalist and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.