Beauty Supply
Volume 32, Issue 10 (Oct. 2009)


Of the 800 visual effects shots in Walt Disney Pictures’ Surrogates, few are visible, although you might wonder how Bruce Willis managed to look so young. Directed by Jonathan Mostow, the sci-fi film stars Willis as an FBI agent named Greer who, like everyone else, interacts remotely with the rest of the world through a robotic surrogate. When someone is murdered, Greer steps out of his safe haven and discovers a conspiracy.


Humans in Surrogates control their perfect robots from the safety of their homes. Digital makeup helped the surrogates look so flawless.
Mark Stetson supervised the visual effects. “A big portion of the work went into polishing the surrogates and polishing the plates,” he says. “The whole show is pretty subtle. It’s a whodunit in a present-day parallel reality, so we hinted at the robotic understructure of the surrogates and sprinkled hints of the network that drives the surrogates. At the same time, we presented the reality of the world as if it was natural that everyone was beautiful and all the actors were flawless. We upped the level of perfection of all these already beautiful people in the movie.”

Five studios worked on the film: Synthespians, Sandbox F/X, Assemble, MPC Vancouver, and Industrial Light & Magic (ILM). In addition, an in-house team worked on compositing and polished some shots. “There was a lot of craft and handwork in these effects,” Stetson says, “a lot of hidden work.”

Synthespians: Surrogates, Skin Deep and Deeper
Synthespians had planned to do 60 shots to help sell the idea that the people in the film are robots, until Stetson asked this studio and others to do a digital makeup test. The test was to make Willis look younger when he plays his surrogate during the first half of the film. Synthespians passed the test, and then picked up another 200 shots that involved de-aging Willis and polishing some of the other actors. Approximately 35 people in the studio’s Los Angeles and Massachusetts facilities worked on the project.

“They realized that if Greer is going to have a robot version in the real world living his life, he’s not going to create a 60-year-old,” says Jeff Kleiser, Synthespians founder and visual effects supervisor. “The surrogates can look however the people want. So, we made Greer’s surrogate look like a younger guy, like Willis looked in [the TV show] Moonlighting. And, we made him look consistently young in all different types of lighting environments.”

The studio started with a cyberscan of Willis’s face to create a 3D version in Autodesk’s Maya that the modelers modified by lifting skin in his chin area, smoothing wrinkles around his eyes, and shortening his ear lobes. Yannix, a studio in Thailand, tracked the camera motion and the motion of Willis’s head in each frame so the artists at Synthespians could fit the CG head onto the actor’s body. To texture the model, Synthespians’ artists projected the filmed footage of Willis onto geometry that matched his performance. Artists working in Apple’s Shake manipulated the textures as needed and then cleaned up the wrinkles and other imperfections enough to make the robotic substitute look younger but not plastic.

“There was too much perspective change on his head as he was acting to use a 2D solution,” Kleiser says. “So we came up with this next-gen technique, which involved tracking the cyberscanned head to the performance. We were able to match the 3D head to his head.”
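The core of the technique Kleiser describes is camera projection: once the cyberscanned head is tracked to the performance, each frame’s camera solve maps the 3D vertices back into plate space so the filmed footage can be sampled as a texture. The sketch below illustrates that projection step only; the function name and the 3x4 camera-matrix layout are assumptions for illustration, not Synthespians’ actual pipeline.

```python
import numpy as np

def project_plate_to_uv(vertices, cam_matrix):
    """Project tracked 3D head vertices through a per-frame camera
    matrix to get plate-space coordinates for projection texturing.

    vertices:   (N, 3) array of points on the tracked head mesh
    cam_matrix: 3x4 projection matrix from the camera/head track
    """
    # Homogenize the points, push them through the camera...
    homo = np.hstack([vertices, np.ones((len(vertices), 1))])  # (N, 4)
    proj = homo @ cam_matrix.T                                 # (N, 3)
    # ...then the perspective divide gives 2D plate coordinates.
    return proj[:, :2] / proj[:, 2:3]
```

With coordinates like these per frame, the plate pixels can be pulled onto the geometry, manipulated, and re-rendered, which is why the approach survives the perspective changes that defeat a purely 2D fix.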

In addition to creating Willis’s surrogate, Synthespians also touched up some of the other actors, to clean up imperfections the robots wouldn’t have. But, they also peeled back the magic, figuratively and literally, to prove that the robots weren’t human. Their CG peekaboo endoskeletons appear in approximately 60 shots. In one scene, a beautician pulls back the skin on a surrogate’s face to reveal the mechanical infrastructure. In another, a landlady has a CG skull and neck made of digital pistons.

Stetson describes a third hint at the robotic substructure. “The human Greer finds his surrogate wife, Maggie, entertaining friends by passing around a ‘jacker’ device that sends an electronic signal,” he describes. “When he’s confronted by his wife, he smashes the skin off one of the surrogate party friends and you see the endoskull.”



(Top) Synthespians digitally polished surrogates into subtle perfection.
(Bottom) The studio inserted internal mechanics for robotic reveals.

Sandbox/Assemble: Robot Factory, OD Device, Battles

Nothing sells the idea of a culture dependent on robotic surrogates more than seeing a factory pumping out thousands of mechanical doppelgangers. Assemble, a small studio in Berkeley, California, worked with Sandbox, a small studio in Pittsfield, Massachusetts, to create the robot factory and other shots. All told, the two studios created the most shots—approximately 350. “We treated the two studios like a separate creative partner for the movie,” says Stetson.

One scene the studios worked on takes place on an army base. “We see thousands upon thousands of people in chairs operating surrogates elsewhere in the world,” Stetson says. “Sandbox and Assemble created that special satire, and they also did the big factory shot where you see Greer [Willis] and Peters [actor Radha Mitchell] being escorted through the manufacturing building with huge assembly lines.” Assemble built the robots, and for many shots, Sandbox handled the compositing.

“Our shop does straightforward 3D and 2D,” says John Nugent, VFX supervisor at Sandbox. “Assemble does specialty CG—the complex models with lots of moving parts. For the big, wide factory shot, Bruce and Radha look out into a greenscreen, so Assemble built the endless line of CG robotic machines picking up arms and legs, arc welders, and rows of surrogate bodies moving like suits on a dry cleaner’s rack.”

The biggest shots for Sandbox alone were the battle scenes, for which the studio built CG helicopters and planes, and added dust trails and explosions. “I was on set at a quarry south of Woburn, Massachusetts, for this,” Nugent says. “It’s supposed to represent an Afghanistan-like warscape. They had folks dressed in surrogate army gear and pyrotechnics going on. We did sky replacement, set extensions, and a lot of battle-scene enhancements—explosions, firebombs, blasts, gun hits, missile trails—all that fun, active stuff.” For the 3D work, the artists used Maya, and for 2D, Shake.
      
Sandbox also did R&D and look development on the “overload device,” also known as the OD, which kills surrogates. “We did that in tandem with MPC,” Nugent says. “Mark had both of us ping-ponging, simultaneously zeroing in on a look. In the film, when someone aims the OD like a blow-dryer at a surrogate and presses the trigger, it sends an energy field toward the surrogate.”

The Sandbox artists created the effect using a combination of 3D and 2D. The 3D artists created gray-shaded mattes that controlled particle animation, and then the 2D artists used filters to warp and distort the background and to add colors and other effects.
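The matte-driven half of that recipe can be sketched simply: a gray-scale matte (here, one rendered from the particle animation) scales a per-pixel offset that resamples the background plate, distorting it wherever the energy field passes. This is an illustrative stand-in under stated assumptions, not Sandbox’s actual Shake setup; the function name and parameters are invented for the example.

```python
import numpy as np

def matte_driven_warp(plate, matte, max_shift=8):
    """Displace plate pixels horizontally in proportion to a gray
    matte (0..1), a simple stand-in for an energy-field distortion.

    plate: (H, W) or (H, W, C) image array
    matte: (H, W) gray-scale matte controlling warp strength
    """
    h, w = plate.shape[:2]
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    shift = (matte * max_shift).astype(int)   # per-pixel offset in pixels
    src_x = np.clip(xs - shift, 0, w - 1)     # sample from shifted source
    return plate[ys, src_x]
```

In production the same idea is layered with color treatments and other filters on top, but the principle—mattes from 3D driving distortion in 2D—is what both Sandbox and MPC describe.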

“We also had Assemble help us with a shot in which the camera flies into an eyeball as it’s blasted by the OD device,” Nugent says. “They did the basic CG, and we did the compositing and 2D enhancement.”

MPC Vancouver: OD Device, Arm Stump, Dead Surrogates
MPC Vancouver used a similar method to create the OD’s energy field. The artists generated particle streams in Maya that they moved into Shake for compositing, driving displacement with mattes, 2D element passes, and 3D elements. “We start in Maya and then have a number of proprietary systems to take Maya to the next level,” says Doug Oddy, VFX supervisor.

They put those systems to work on a shot in which the OD device kills police officers. “We built digi-doubles to roto-animate their movements so we could launch the effects from the digi-doubles,” Oddy says. “We have a subsurface lighting effect under their skin. Their eyes explode, and we see the machines coming apart. We did a lot of shader work, and on top of that, we had a tremendous amount of compositing and re-lighting.”


VFX house Assemble built a computer-generated factory and manufactured the surrogate robots in the factory, as well.
The bulk of MPC’s work, though, centered on replacing Bruce Willis’s arm with a mechanical stump following a chase sequence during which the arm of Greer’s surrogate is severed at the bicep. Willis wore a bluescreen sock on his arm for the remaining shots starring his wounded surrogate, and MPC attached a burned, charred CG mechanical stump with leaking fluids to his shoulder in the filmed plates.

“The mechanical device mimics human kinetics, so we used the human body as a template and created a hydraulic-based system,” Oddy explains. “We thought of each muscle as a series of bladders that can cause a muscle to extend.”   

Because Willis starred in some shots and stunt doubles in others, MPC scanned both Willis and one double to create a hybrid Greer, and then roto-animated the mechanical stump into place.

“It was fairly straightforward,” Oddy says, “but roto-animating movements is always time-consuming. We used the same rig for the actor and the stuntman, and flexed it so we could reposition the subtle physical changes between the two.”

The fluids, however, required R&D. “We worked closely with Scanline to get the level of viscosity we needed from [its] Flowline software to have the fluids leak out of the arm the way we wanted. We wanted them sort of pulsing out, as if a pump is running,” Oddy says. “So, we built a rig within the system to get consistent results for all the shots.”

In addition to the CG arm, MPC put the digi-double’s whole body to work, as well. “By and large, Bruce [Willis] did the stunts, but Greer is kicked around pretty good,” Oddy says. “Our digi-double is a hybrid Greer that can leap across 60 containers and get impaled.”

MPC also created a digital replica of a helicopter that the studio fitted with digi-doubles for Greer and a police officer, and, as did all the studios, smoothed actors into surrogates with a little airbrushing.

“SSI [Synthespians] did the heavy lifting, but we did a little of this difficult work,” Oddy says. “We could rotate our digital Greer around and pull a piece from it at the angle we needed and render it in the right lighting conditions for the compositing team.” Because they had built Greer as the smooth surrogate, they could graft little pieces from that model into Willis’s face, if needed, for his surrogate version. But, it wasn’t easy.

“Any time we seem to get a handle on our collective ability to do something,” Oddy adds, “we up the ante. We can have an actor in the middle of a disco and have a caustic light cast across his face, and graft a little piece onto that face. Years ago, we might have restricted the actors and the camera moves. Now we do not. But, it’s always challenging and still time-consuming. The face can look so different from shot to shot.”

Brickyard: Digital House, Dead City
Ironically, among the 35 shots that Brickyard created is one that required constructing a digital version of a building three blocks from its office in Boston. “In the story, it’s the mansion owned by the leader of a company,” says Brian Drewes, visual effects producer at Brickyard. “They wanted the building to look imposing and more symmetrical than was shot on set, so we adjusted the façade with CG elements across three or four shots.”

The largest amount of Brickyard’s work, however, involved changing an overhead helicopter shot that traverses the city of Boston. “They didn’t shoot it with the intention that it would be the end shot,” Drewes says, “but they chose it in editorial.” As a result, Brickyard had to paint out moving cars and people, and composite in surrogates, cars, smoke, satellites, and general mayhem.

“They wanted to see devastation across the city,” Drewes says. “It’s a great shot and a very long shot, and it was a ton of work. We had to imagine what would happen if everything stopped working; if the surrogates driving cars and walking to work would be shut down.”

To create the shot, which started out at 2000 frames but ended up at 1000, Brickyard placed CG and 2D people and cars on a ground plane using Maya. “Then we added a lot of hand-painted details,” Drewes says. For a car that crashed into a building, for example, they might accordion the front by adding 2D on top of 3D, composite in some debris and tires that fell off, and sharpen the scene with glass shards. They also built CG proxies for the buildings to add satellite dishes, towers, and other mechanical details.


Synthespians developed new techniques to lop several years off actor Bruce Willis, a possibility they suspect other filmmakers will find interesting.
“We ended up modeling the entire cityscape,” Drewes says. “We tracked the shots in [Andersson Technologies’] SynthEyes, exporting that data into Maya and then feeding everything into [The Foundry’s] Nuke.”

As did the other studios, Brickyard also did a little surrogate polishing. “We cleaned up extras they wanted to look perfect,” Drewes says. “They tried to cast as many perfect people as they could, but we did some neck tuck-ups and wrinkle removals.”

For one actor, though, they did the reverse. “She wakes up at the end and you see her as a human, but she looked too perfect naturally, so we added some veins, discoloration, and age marks,” Drewes says.

Industrial Light & Magic: Big Crash
In addition to these studios, compositing supervisor Marshall Richard Krasser led a group of digital artists at ILM that layered together a car crash. In the film, Guy Cantor (played by James Cromwell) has hijacked the Peters surrogate, the character played by Radha Mitchell, and is operating the surrogate. During a chase sequence, he has that surrogate deliberately crash into the human Greer’s car, and then the surrogate gets away. “We filmed the chase at the Paramount backlot,” Stetson says, “and then shot stills in Boston to have backgrounds to fill out and replace the palm trees. ILM’s work was mostly 2.5D digi-matte work, plus they created a stunt double, removed rigs, and added CG props and windshields. They did a nice, clean job.”

That was the goal, in fact, for all the studios: a clean job with invisible effects. In general, Stetson calls this a medium-size show with an array of interesting tricks. “It’s not a giant epic spectacular,” he says. “It’s a murder mystery, and that’s the way Jonathan [Mostow] played it.”

Of all the tricks, the one that might have the most lasting impact is the work the studios did in making the surrogates look so beautiful. “This technique can give an actor a good 10 or 15 years more range,” Kleiser says. “It’s easy to make an actor look older with makeup; harder to make them look younger. I suspect a lot of people will take an interest in it.”