Four Play
Volume 28, Issue 8 (August 2005)

When a cosmic storm envelops an inventor, an astronaut, a pilot, a geneticist, and an industrialist, all five are transformed in unearthly ways. The inventor, Dr. Reed Richards, becomes the elastic Mr. Fantastic. The astronaut, Ben Grimm, grows into an orange, rock-like superhuman dubbed the Thing. Sue Storm, the geneticist, becomes the Invisible Woman, and her hotheaded brother Johnny, the pilot, turns into the Human Torch. Together, they’re the Fantastic Four. And the industrialist? He becomes Dr. Doom.

Directed by Tim Story, the Twentieth Century Fox film was created with the help of a dozen studios that crafted 885 visual effects shots. “The shots span over half the movie,” says Kurt Williams, the visual effects supervisor, whose staff acted as a central nervous system for the effects production. “We had a database and systems in our office that helped the vendors work quickly. It was a ‘one way in, one way out’ depot of information.”


1: Mr. Fantastic (actor Ioan Gruffudd) stretches thanks to the effects wizards at Soho VFX.

Because there were so many characters and vendors, Williams turned to Proof Inc. for the previz work, and then brought the firm back to do “post viz.” “They added the characters and backgrounds to footage so the editors could cut them quickly,” he says. “We had a short production time. We needed to give the editors one vision to cut with.”


2: Sue Storm (actor Jessica Alba) deflects a Doom ray, created at Giant Killer Robots, with an invisible shield from Stan Winston Digital.

Williams organized the production by character and major sequences, singling out the work done by Giant Killer Robots (San Francisco), Soho VFX (Toronto), Stan Winston Digital (Van Nuys, California), Meteor Studios (Montreal), and CIS Hollywood as the core vendors. Other studios that worked on the film include CobaltFX, Hydraulx, Pacific Title, Pixel Magic, Kleiser-Walczak, and CafeFX.


3: Actor Michael Chiklis is turned into The Thing with the help of a rubber suit.

Concept: “The cosmic storm sets off all their powers,” says Williams. “How it affects the characters was important to Tim [Story] and me.”

The CG environment (the space station, the storm, and the impact on the characters) was handled by CIS under the direction of Bryan Hirota and John “DJ” Desjardin. Building the space station was a straightforward process: The crew used Alias’s Maya for modeling and animation, Pixar Animation Studios’ RenderMan for rendering, and Apple’s Shake and Autodesk Media and Entertainment’s Inferno for compositing. Creating the storm and the transformation of the characters was not.

“They wanted something no one had seen. We ended up with a highly charged gaseous cloud full of electricity,” says Hirota. “We started by creating conceptual renderings in [Adobe’s] Photoshop.” For this, a combination of RenderMan and Steamboat Software’s Jig helped create the volumetric effects, while MEL scripts controlled the layers of particles that were exported to RenderMan and composited in Shake.

The transformations were handled with volumetric and particle effects in Maya, RenderMan, Jig, and Inferno, as well. “We ended up match-moving each of the characters in Maya,” Hirota says. “Then, we ran particle simulations down the hallways of the space station.”

Concept: “As Reed Richards [Mr. Fantastic] stretches, he’s regenerating himself,” says Williams. “He doesn’t completely lose mass. And, his suit had to regenerate itself as he stretched, too.”

The effects and R&D wizards at Soho VFX stretched their skills to create the elastic superhero, Mr. Fantastic. “We started envisioning how we’d build this character last summer,” says Soho VFX’s Berj Bannayan. “Every part of his body can stretch and change shape in some way. We had to build a rigging and modeling pipeline [in Maya] so that any time an animator wanted, [Mr. Fantastic] could stretch across the room.”

To give the animators controls that would let them turn an arm into a whip, reach under a door, or wrap around a pipe, Bannayan’s technical crew created a stretching rig without bones or joints. “It’s a system of NURBS curves and smooth primitives,” says Bannayan. “Sometimes, there might be hundreds inside the arm.” The NURBS curves pulled the CVs (control vertices) on the skin, a polygonal mesh; the amount of detail in the mesh depended on the shape and position of the limb.
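
The mechanics are easier to see in miniature. Below is a minimal Python sketch of the idea, assuming a single cubic Bézier curve stands in for Soho’s NURBS curves: each skin vertex is bound to a curve parameter plus an offset, so dragging the curve’s control points re-poses the skin. All names here are illustrative, not Soho’s pipeline code.

    def bezier(p0, p1, p2, p3, t):
        """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
        u = 1.0 - t
        return tuple(u**3*a + 3*u*u*t*b + 3*u*t*t*c + t**3*d
                     for a, b, c, d in zip(p0, p1, p2, p3))

    def deform_skin(ctrl_pts, bindings):
        """Move each bound vertex to its curve point plus its stored offset."""
        skin = []
        for t, (ox, oy, oz) in bindings:
            cx, cy, cz = bezier(*ctrl_pts, t)
            skin.append((cx + ox, cy + oy, cz + oz))
        return skin

    # Bind two skin vertices along a one-unit "arm," then stretch the arm
    # to three times its length by dragging the control points: the skin
    # follows with no joints or bones involved.
    bindings = [(0.25, (0.0, 0.1, 0.0)), (0.75, (0.0, -0.1, 0.0))]
    rest = [(0, 0, 0), (0.33, 0, 0), (0.66, 0, 0), (1, 0, 0)]
    stretched = [(0, 0, 0), (1, 0, 0), (2, 0.5, 0), (3, 1, 0)]
    print(deform_skin(rest, bindings))
    print(deform_skin(stretched, bindings))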

Animators used a series of controllers to manipulate the elastic limbs and could switch from an IK rig to the stretchy custom rig on the fly. “If you grabbed an end effector, it would switch to our stretchy rig,” explains Bannayan. In addition to controlling the shape of the surface, the curves also managed its profile, allowing the animators to squash, stretch, rotate, and twist the limbs.


Multiple layers of texture maps to simulate actor Gruffudd’s skin were sometimes tweaked on a frame-by-frame basis. The stretching effect was created with a special rigging and modeling pipeline built in Maya.

“It’s hard to describe because there is no analogy for what we did,” says Bannayan. “It was almost like a keyframe in an animation curve. With our geometry, you could introduce new shapes, twists, and bends that normal rigging techniques don’t allow.”

For fine details, a muscle system made of blend shapes and magnets moved the skin. “You can see muscles under the skin when it’s stretched,” says Bannayan, “even when it’s elongated.”

For texture maps, the crew used photographs of actor Ioan Gruffudd as the basis for many layers of shading. To match the fine details on his hands and face, they painted 16k texture maps and often had as many as 10 to 12 displacement maps in the shaders. Textures were sometimes changed on a frame-by-frame basis. When Reed stretches his hand under the door, the hand has 25 layers that were tweaked over 1000 frames.

“Every shot got special attention,” says Allan Magled, VFX supervisor. “Every shot required its own tracking, and they all had issues, some impossibly hard. We tried every tracking software program, even one we wrote, but we ended up hand tracking the shots in Maya. Some shots took weeks of tracking.”

For one scene, for example, the team had to replace the left side of Reed’s body, locking the geometry to the live-action plate frame by frame. Fortunately, Reed’s costume was tight and didn’t require cloth simulation; however, getting the texture to match was tricky. “The material was a cross between silk, velour, and tin foil,” says Magled. “And, we had to match it in the same frame and lighting.”

For hair, the studio used its own fur software; for rendering, Soho works with 3Delight, RenderMan-compliant software customized for the studio.

Concept: “Johnny heats from his core and the flames are so hot, they don’t attach to his skin,” says Williams. “We couldn’t make him look like a burn victim, but he had to be realistic.”

At Giant Killer Robots (GKR), a team of 90 effects artists caused actor Chris Evans to flame out on cue. Because the effect was added to the actor, Evans’s performances were tracked in great detail using the studio’s proprietary software Tracula, with an assist from 2d3’s Boujou. The techniques had been honed on such films as Blade, Son of the Mask, and The Matrix.

“We can integrate effects into live-action characters in unique ways, choosing what we want to use, virtual or live action,” says Peter Oberdorfer, VFX supervisor. “Rather than replacing humans with virtual guys, we try to maintain the real performance as much as possible. The tracking team was crucial, making sure everything lined up.”

To help ensure that the virtual tracking model was accurate, the crew worked from cyberscans of Evans. Motion-captured data from the actor and his stunt double delineated the rig’s range of motion. Animators could then manipulate the resulting virtual model of Johnny to exactly match the actor’s performances.


Left: Giant Killer Robots gave fiery Johnny Storm his torch by tracking fluid simulations, particle animations, and practical elements into live-action plates. Right: In this shot, everything is CG.

The Torch’s fiery effects were generated around the virtual model with the live performance mapped on top. “For close-ups, we’d use wind, buoyancy, heat, and other fields that would move a fluid simulation according to the movement the actor made,” says Oberdorfer. When Johnny flies through Manhattan all ablaze, though, they handed the acting torch to the synthetic actor.
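
A toy version of that idea in Python: treat the flames as particles and let fields for wind, buoyancy, and cooling push them around each frame. The field strengths, names, and time step here are illustrative assumptions, not GKR’s fluid setup.

    import random

    def step_flames(particles, dt=0.04, wind=(0.5, 0.0, 0.0)):
        """Advance flame particles one frame under simple force fields."""
        for p in particles:
            buoyancy = p["heat"] * 2.0               # hot gas rises
            gust = random.uniform(-0.3, 0.3)         # cheap turbulence
            vx, vy, vz = p["vel"]
            p["vel"] = (vx + (wind[0] + gust) * dt,
                        vy + buoyancy * dt,
                        vz + wind[2] * dt)
            x, y, z = p["pos"]
            vx, vy, vz = p["vel"]
            p["pos"] = (x + vx * dt, y + vy * dt, z + vz * dt)
            p["heat"] *= 0.98                        # flames cool as they climb
        return particles

    flames = [{"pos": (0.0, 1.5, 0.0), "vel": (0.0, 0.0, 0.0), "heat": 1.0}]
    for _ in range(24):                              # one second at 24 fps
        step_flames(flames)
    print(flames[0]["pos"])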

For the fire, GKR chose Maya’s fluid engine, and then hired Alias to extend it. In addition, the crew developed proprietary techniques to place simulation volumes in ways that would optimize render time, whether Johnny was moving, flying, or standing mostly still. Practical elements shot for specific scenes helped make the effect convincing.
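
One plausible reading of that optimization, sketched in Python: fit the simulation container to the character’s motion over the shot rather than simulating the whole set. The padding values and function name are assumptions for illustration.

    def fit_sim_volume(positions, pad=1.5, headroom=2.0):
        """Bound a fluid container around a character's path through a shot.

        positions: list of (x, y, z) character positions, one per frame.
        Extra headroom on top leaves room for flames to rise.
        """
        xs, ys, zs = zip(*positions)
        lo = (min(xs) - pad, min(ys) - pad, min(zs) - pad)
        hi = (max(xs) + pad, max(ys) + pad + headroom, max(zs) + pad)
        return lo, hi

    # A mostly stationary Johnny needs a far smaller grid than a flying one.
    print(fit_sim_volume([(0, 0, 0), (0.2, 0, 0.1)]))
    print(fit_sim_volume([(0, 0, 0), (40, 12, 5), (90, 30, 8)]))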

“We had a library of elements we could use as sprite animations with particle systems or as stand-alone elements,” says Oberdorfer. “We didn’t paint ourselves into a corner by relying exclusively on one technique.” Rendering was accomplished in Mental Images’ Mental Ray, using as many as seven layers of shading for the flames.

GKR also created a virtual Dr. Doom, using a cyberscan of actor Julian McMahon in full costume for modeling and Syflex for cloth simulations. For his lightning bolt-like “Doom” ray, the crew relied on a proprietary lighting tool that, using an L-system core, creates branching rays with particle-based plasma glows.
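
In Python, an L-system-style growth rule for such a ray might look like the sketch below: each segment jitters its direction and occasionally forks into a shorter side branch, and the resulting segments would then be dressed with particle-based glows. Branching odds and jitter amounts are invented for illustration, not the values in GKR’s tool.

    import random

    def grow_ray(pos, direction, depth, segments):
        """Recursively grow a branching, lightning-like ray."""
        if depth == 0:
            return
        d = tuple(c + random.uniform(-0.4, 0.4) for c in direction)  # wander
        end = tuple(p + c for p, c in zip(pos, d))
        segments.append((pos, end))
        grow_ray(end, d, depth - 1, segments)        # trunk continues
        if random.random() < 0.3:                    # sometimes fork
            grow_ray(end, d, depth // 2, segments)   # shorter side branch

    segments = []
    grow_ray((0.0, 0.0, 0.0), (1.0, -0.2, 0.0), depth=8, segments=segments)
    print(len(segments), "segments to dress with plasma glows")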

Concept: “During a giant wreck on the Brooklyn Bridge, the Fantastic Four have to utilize their powers for the first time in public,” says Williams. “Of course, the real Brooklyn Bridge was not available for filming.”

Instead, an exact scale replica of a 200-foot section of the bridge was built in a Vancouver parking lot. “We had to surround it with bluescreen,” says Williams. “We had scaffolding rigs with bluescreens and tractor trailers with additional greenscreen that we could move quickly. We also had Meteor build a fully textured CG model of the bridge, with CG water below, CG boats and aircraft, and a CG Manhattan and Brooklyn.”

The long sequence begins with the Thing (played by actor Michael Chiklis, wearing a rubber suit) jumping down to rescue a man attempting suicide. Traffic on the south side of the bridge grinds to a halt, and something kicks off an explosion. A fire truck swerves and pierces the outside of the bridge, causing the wheelman on the back to dangle dangerously above the water.

“It was a mix-and-match shot,” says Paul Nightingale, visual effects supervisor at Meteor. “We used special effects, CG to augment the sets and fire truck, and when it was impossible to have a real fire truck, full CG. We also added fire, the cities [at each end of the bridge], helicopters, and more. A number of shots were fully CG.”


A 200-foot replica of the Brooklyn Bridge was extended and sometimes replaced entirely by a CG model, created at Meteor, along with the fire truck and other vehicles, water, and the cities on both ends.

The bridge was built, animated, and lit in Maya using plans and photos of the real bridge, but twisted to match data from a Lidar laser scan of the on-set bridge. The CG bridge was rendered in RenderMan using level-of-detail techniques to reduce the geometric complexity in the distance. “There were an inordinate number of rivets,” Nightingale says. “We didn’t want to drive the modeling team crazy.”
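
The principle reduces to a distance test. A minimal Python sketch, with thresholds and model names made up for illustration:

    def pick_bridge_lod(distance_to_camera):
        """Swap in coarser bridge geometry as distance grows (illustrative)."""
        if distance_to_camera < 50:
            return "bridge_hi"    # every rivet modeled
        if distance_to_camera < 300:
            return "bridge_mid"   # rivets carried by displacement maps
        return "bridge_lo"        # rivets live in the textures only

    print(pick_bridge_lod(800))   # distant span: lightest model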

Procedural shaders added dirt and rust to painted textures. “We had a layer of nonspecific texture for the whole bridge and then another layer with more detail to age and match the photographic reference of the bridge and the set,” explains Nightingale.

A CG animatic created by the previz team helped the crew determine which parts of the background could be 2D, 2½D, or 3D depending on the amount of parallax seen through the camera. To create the far backgrounds, artists stitched together Photoshop paintings. The paintings were brought into Maya as 16-bit textures and projected onto a cyclorama, a ribbon that encircled the entire virtual set. “Once we had built that, we could put the camera in any position, and the system we created would automatically generate the sky and freeway traffic along the border of Manhattan,” says Nightingale. The sky was projected onto the inside of a dome. Details close to the camera were 3D unless they could be 2½D.
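
The 2D/2½D/3D call comes down to how much apparent shift an element shows as the camera moves. A hedged Python sketch, with thresholds invented for illustration rather than taken from Meteor’s animatic:

    def choose_representation(element_distance, camera_travel):
        """Pick 2D, 2.5D, or 3D from small-angle parallax (illustrative)."""
        parallax = camera_travel / element_distance  # apparent angular shift
        if parallax < 0.001:
            return "2D"      # paint it onto the cyclorama ribbon
        if parallax < 0.01:
            return "2.5D"    # flat card projected at the right depth
        return "3D"          # full geometry, such as a nearby skyscraper

    # Distant Manhattan barely shifts; foreground girders need geometry.
    print(choose_representation(5000.0, 2.0))   # 2D
    print(choose_representation(30.0, 2.0))     # 3D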

“We knew the camera would be based in the center of the bridge most of the time,” Nightingale says, “so everything could be built with that knowledge of where the action would take place. We could switch between the cyclorama with the texture map and 3D geometry of skyscrapers. If there was a dead spot when we spun the world around the camera, we’d add a building.”

For the water, the crew used Arete Entertainment’s Psunami to create the surface and reflections. To add depth, they rendered multiple layers of fractal patterns through RenderMan and then combined those passes with the Arete RenderWorld passes in compositing.
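
The layering itself follows the familiar fractal (fBm) recipe: sum several octaves of noise, each finer and fainter than the last. A small Python sketch, with a bilinear value-noise lattice standing in for the production patterns:

    import random

    random.seed(7)
    lattice = {(i, j): random.random() for i in range(65) for j in range(65)}

    def value_noise(x, y):
        """Bilinear interpolation over a random lattice (a noise stand-in)."""
        i, j = int(x) % 64, int(y) % 64
        fx, fy = x - int(x), y - int(y)
        a = lattice[(i, j)] * (1 - fx) + lattice[(i + 1, j)] * fx
        b = lattice[(i, j + 1)] * (1 - fx) + lattice[(i + 1, j + 1)] * fx
        return a * (1 - fy) + b * fy

    def fbm(x, y, octaves=4):
        """Sum octaves of noise; each layer is finer and fainter."""
        total, amp, freq = 0.0, 0.5, 1.0
        for _ in range(octaves):
            total += amp * value_noise(x * freq, y * freq)
            amp *= 0.5
            freq *= 2.0
        return total

    print(fbm(3.7, 12.2))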

The vehicles ranged from 3D models bought over the Internet and used in the distance, to custom models created at Meteor for replacing and augmenting vehicles in the accident scene. “Sometimes we had to replace everything from the Thing to Manhattan,” says Nightingale. That included some 30 vehicles plus the fire truck. For the fire, they used a mixture of pyro from the set, propane flames they shot separately to produce fire controlled by Sue Storm’s force field, and CG fire created with Next Limit’s RealFlow.

For rendering, the crew calculated ambient occlusion using RenderMan, sometimes on a frame-by-frame basis. “There were 85 million rivets, all casting little shadows,” says Nightingale. “But we got a degree of subtlety with ambient [lighting] that we couldn’t get any other way.”
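
In essence, ambient occlusion fires rays over the hemisphere above each surface point and darkens by the fraction that hit something, which is what gives each rivet its soft contact shadow. A Python sketch of the technique, where trace is a hypothetical ray-hit test standing in for the renderer’s:

    import random

    def sample_hemisphere(normal):
        """Pick a random direction on the hemisphere above the surface."""
        while True:
            v = [random.uniform(-1.0, 1.0) for _ in range(3)]
            if sum(c * c for c in v) <= 1.0:                   # inside sphere
                if sum(a * b for a, b in zip(v, normal)) < 0:  # flip if below
                    v = [-c for c in v]
                return v

    def ambient_occlusion(point, normal, trace, samples=64, max_dist=5.0):
        """Fraction of the hemisphere left open; 1 = open sky, 0 = buried."""
        blocked = sum(1 for _ in range(samples)
                      if trace(point, sample_hemisphere(normal), max_dist))
        return 1.0 - blocked / samples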

Concept: “Sue bends light,” says Williams. “She takes the background behind her and projects it onto her front. Sometimes she’s invisible, but the audience can still read her expressions. The job was to give Tim [Story] a tool set to dial Sue in and out as he wished.”

A crew of around 25 artists at Stan Winston Digital took on the job of creating that tool set and 80-some shots. “The biggest hurdle was trying to come up with something everyone liked,” says Randall Rosa, co-visual effects supervisor with Andre Bustanoby.

They did so with a combination of 3D and 2D techniques and tools: Avid’s XSI, Mental Ray, Autodesk’s Combustion, and Shake. Modelers created a virtual Sue from a cyberscan of actress Jessica Alba. Using incidence (Fresnel) passes to illuminate contours and edges and make them translucent, and refraction passes to bring in the background, the rendering team created the invisibility illusion.
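
A tiny sketch of that incidence term, using the common Schlick approximation to the Fresnel equations; Stan Winston Digital’s actual shader math is its own:

    def fresnel_rim(cos_theta, f0=0.04):
        """Schlick's Fresnel approximation.

        cos_theta: dot(view direction, surface normal);
        f0: reflectance when facing the camera head-on.
        """
        return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

    # Facing the camera, Sue is nearly transparent; at grazing edges the
    # term spikes, leaving a visible rim that outlines her contours.
    print(fresnel_rim(1.0))    # ~0.04, almost invisible
    print(fresnel_rim(0.05))   # ~0.78, strong edge glow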

“You could think of Sue like a wine glass that you hold in front of your eye,” explains Rosa. “The way the background looks through a wine glass is the basic refraction. The key characteristic for Sue was how much of her affected the background.”


CIS built the CG space station using Maya, RenderMan, Shake, and Inferno. For the cosmic storm that tears down the hallways, the crew used RenderMan and Jig for the volumetric effects.

Although the team created a Mental Ray shader that allowed the artists to affect the refractions, they decided to work in 2D instead.

“Jeff Wolverton, a lighting TD, knew how refractions work, so he wrote a plug-in for Shake that simulated the 3D effect,” says Bustanoby. “It allowed the client to art-direct the look.” Thus, a compositor working in Shake could slide the background so that, for example, a dark spot wouldn’t land on Alba’s cheek. “We rendered a normals pass, and a color rendition on the match-moved Sue mesh gave the compositor a visual lookup table,” explains Bustanoby. “The plug-in looked at that to get a sense of the direction of normals on the surface relative to the camera and bent the background through it.”
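
The trick generalizes beyond Shake. Below is a hedged NumPy sketch of the same idea, assuming a rendered normals pass and a character matte; the strength control and array layout are illustrative, not the plug-in Wolverton wrote:

    import numpy as np

    def refract_background(background, normals, matte, strength=25.0):
        """Bend a background plate through an 'invisible' figure in 2D.

        background: H x W x 3 plate; normals: H x W x 3 camera-space
        normals in [-1, 1]; matte: H x W character alpha in [0, 1].
        """
        h, w = matte.shape
        ys, xs = np.mgrid[0:h, 0:w]
        # Offset each pixel's background lookup along the normal's
        # screen-space direction, scaled by the character matte.
        sx = np.clip(xs + strength * normals[..., 0] * matte, 0, w - 1)
        sy = np.clip(ys - strength * normals[..., 1] * matte, 0, h - 1)
        return background[sy.astype(int), sx.astype(int)]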



4: The Human Torch (actor Chris Evans) acquires his fire from Giant Killer Robots.

Sue has the ability to throw her invisibility “cloak” outward, turning it into a force field. For this, the crew used a blend of geometry-based soft-body dynamics, fluid dynamics, and particle effects, all created within XSI. For match-moving, they relied on three software tools: Science.D.Visions’ 3DEqualizer, RealViz’s MatchMover, and The Pixel Farm’s PFTrack.

If she happens to be wearing street clothes and not her Fantastic costume, Sue’s clothes remain visible when she becomes invisible and hover in the air. For this effect, the crew created CG clothes and used Syflex cloth-simulation software working within XSI to make them float. “The volume of painting and roto work that went into this was incredible,” says Rosa. “Every part of her occluding the clothes had to be painted in 2D or created in 3D.”

Many of the shots from Stan Winston Digital and the other studios involved more than one vendor: Sue, for example, is hit by Dr. Doom’s plasma ray, created with Giant Killer Robots’ lightning bolt software. Williams’ office acted as a clearinghouse, controlling the ins and outs. “I hope this kind of collaboration becomes standard,” says Bustanoby. “Even in the throes of delivery, it was synchronized and smooth.”

Considering the complexity of the effects, the multiple characters in multiple shots created by multiple vendors, and the tight schedule and budget, someone might call Williams Mr. Fantastic, as well.



Barbara Robertson is an award-winning journalist and a contributing editor for Computer Graphics World.