Extreme Effects - Inspiration
Issue: Volume 31, Issue 8 (Aug. 2008)

The clue to Warner Bros.’ sixth film in the Batman franchise, and director Chris Nolan’s second, is in the title: The Dark Knight. The Knight is DC Comics’ Batman (Christian Bale), a crime-fighting superhero who acquires his power from his armor and his weapons, not from something supernatural or alien. The “dark” is everywhere: the black comedy that emanates from the Joker’s (Heath Ledger) dark mind, and Batman himself riding his black-as-night motorbike, the cape on his famous costume billowing like inky smoke behind him.

Critics applauded Nolan for looking deep into Bruce Wayne’s troubled soul, as we followed the billionaire’s evolution from citizen to vigilante in Batman Begins, and for grounding that film’s superhero effects in a gritty physical reality. In The Dark Knight, Wayne confronts the consequences of becoming Batman, and the reality is grittier.

Double Negative’s Paul Franklin, who received a BAFTA nomination for supervising the studio’s photorealistic work on Batman Begins, laughs as he remembers discovering the mission for The Dark Knight. “When we started on this film, the production designer said it would be the realistic version of Batman Begins. He said they considered Batman Begins to be stylized and unrealistic, even though they attempted to ground it in observed reality.”

Four studios helped create The Dark Knight’s approximately 700 visual effects shots. “Double Negative, the lead vendor, did around 370 shots—the digital Batman work, all the Gotham City creations and extensions, and the digital Batpod (Batman’s motorbike),” says Nick Davis, overall visual effects supervisor. “Framestore CFC did all the Harvey Dent (Two-Face) effects and a sequence in Hong Kong. Cinesite predominantly did 2D cleanup work for removing rigs and wires from complicated live-action mechanical effects. And, BUF created Batman’s sonar vision.”

The shot list almost makes the work sound simple and straightforward. In fact, most shots were complex mixtures of live-action cinematography, greenscreen footage, miniature models, physical effects, and CG that, for the most part, had to be so photographically and physically realistic that no one could tell it was there.

But, that’s why we’re here.

Bruce Wayne’s World
Double Negative had built a Gotham City for Batman Begins, but Nolan’s increased demands on photorealism and the realization that the studio would be matching footage filmed with IMAX cameras sent the digital architects back to their computer screens. “We were able to reuse some models, like the digital Batmobile and the monorail that plays in the background, but everything had to be overhauled and refurbished to hold up to scrutiny,” Franklin says. Double Negative bases its pipeline on Autodesk’s Maya, Pixar’s RenderMan, and Apple’s Shake.

For example, other than Wayne Tower, the buildings in the prior film worked only in predetermined lighting conditions. For this film, no buildings had baked-in lighting. “We lit them the hard way, in 3D,” Franklin says. “But we had more flexibility.”

Framestore CFC, Double Negative, and BUF all worked on IMAX shots. Framestore CFC handled a Hong Kong sequence, and BUF added Sonar Vision to various shots (see “Sonar Vision,” this page). However, most of the IMAX work, which totaled approximately 40 minutes of the film, landed at Double Negative.

“We initially planned to work at 8k resolution,” Franklin says. “But, we concluded that 5.6k resolution was more realistic.” At that resolution, each frame required approximately 160MB. “Huge, but manageable,” he says.
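Those figures are easy to sanity-check. The back-of-envelope Python sketch below assumes a 5616×4096-pixel 5.6k scan stored uncompressed at 16 bits per channel in RGBA; the dimensions and channel count are assumptions, not production numbers, but they land in the same ballpark as the quoted per-frame size.

```python
# Back-of-envelope check of the ~160MB-per-frame figure quoted above.
# Assumptions (not from the article): a 5.6k IMAX-ratio scan of roughly
# 5616 x 4096 pixels, stored uncompressed at 16 bits per channel, RGBA.

width, height = 5616, 4096          # assumed 5.6k scan dimensions
channels = 4                        # assumed RGBA
bytes_per_channel = 2               # 16-bit (e.g., half-float EXR)

frame_bytes = width * height * channels * bytes_per_channel
print(f"{frame_bytes / 2**20:.0f} MiB per frame")          # ~176 MiB

# A 40-minute IMAX sequence at 24 fps:
frames = 40 * 60 * 24
print(f"{frame_bytes * frames / 2**40:.1f} TiB for 40 minutes")
```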

Double Negative’s hero sequences included a lengthy armored car chase during which Batman switches from the “Tumbler” (his Batmobile) to the Batpod, and a sequence inside the so-called Prewitt building.

Lower Wacker Drive
The chase sequence demonstrates the complex integration of live action and digital effects. It takes place on Lower Wacker Drive, where the Joker’s henchmen in a garbage truck smash the cars between them and an armored car. Using his Batmobile as a battering ram, Batman tries to stop them. The sequence was captured largely in-camera, on location and with miniatures created at New Deal Studios.

“In the shot, the garbage truck and all the cars flipping are real, the rocket-propelled grenade launcher is real, and the Batmobile is mostly real, all filmed in IMAX in Chicago,” Franklin says. “We took out rigs, extended the backgrounds and sets, added traffic, smoke, and dust, and sent a digital Batmobile flying past the Joker for logistical reasons. But they blew up a real Batmobile on set in Cardington, England. When they didn’t like the result, they totaled a second Batmobile to get three shots.”

The Joker speeds through the streets of Gotham City. For a bank heist in the opener, and during other chase sequences, Double Negative created digital backgrounds, a digital Batmobile, and a digital Batpod.
 
After the Batmobile crashes, we see Batman inside checking the computer displays. He flicks a switch, and we see a front tire spin. And, from this point forward, the Batmobile is digital. Batman flips another switch, panels move, and we see inside. The car lurches. Gun pods lock into place around the wheels. A final toggle blows all the panels off, the front wheels swing into place, and Batman accelerates away on his Batpod, cape billowing behind.

Working in Maya, James Guy built the model. Bruno Baron textured it. Dorian Knapp animated the broken Batmobile. Nicola Hoyle developed a secondary dynamic system for the shuddering and shaking. Dan Wood and Mike Nixon handled simulation, and Teena Roy and Hoyle, known in the studio as the “vixens of destruction,” moved the body panels. “It’s terrifically complicated, but they make it look easy,” Franklin says.

Although a stunt rider wearing a Christian Bale mask most often raced the practical Batpod, at speeds up to 100 miles per hour, Double Negative’s digital Batpod and digital doubles for Batman and the stunt rider rode into some shots.

LaSalle Street
The police pursue the Joker, who is now in a semitrailer driving down LaSalle Street, a continuous mile of skyscrapers. “It’s like a canyon,” Franklin says, “very dramatic.” A police helicopter flies down the street with a sniper hanging out the window. Two henchmen in buildings on either side of the street thread cables across to catch the helicopter.

“They flew a real helicopter down the street, but when we see it catch in the cables, it’s completely digital, as is the street, because they couldn’t shoot plates from the angles they needed,” Franklin says. “We could place our digital camera at 10 feet from a building, and it still looked good at IMAX resolution.”

The digital helicopter catches the wires, crashes, careens down the street, smashes into a window, spins wildly, sprays glass on the fully digital street, and catches fire.

Several proprietary tools helped the artists create the shots. Double Negative’s DNasset managed objects modeled in various levels of detail. DNshatter, which works with the studio’s rigid-body simulation tool, Dynamite, shattered the glass.
DNsquirt, a new fluid-simulation tool developed in collaboration with Stanford researchers, and Maya fluid dynamics—both rendered through DNb, a volumetric renderer—engulfed the helicopter with flames.
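DNshatter and Dynamite are proprietary, but the textbook approach to pre-fracturing a glass pane, and a reasonable guess at the underlying idea, is a Voronoi partition: scatter seed points over the surface and hand each seed’s region to the rigid-body solver as one shard. A minimal Python sketch, with all values invented for illustration:

```python
# Illustrative only: DNshatter is proprietary. A common way to pre-fracture
# a glass pane is a Voronoi partition -- scatter seed points and assign each
# mesh vertex to its nearest seed, so each seed's region becomes one shard.
import numpy as np

rng = np.random.default_rng(7)

# A flat pane sampled as a grid of points (stand-in for the window mesh).
xs, ys = np.meshgrid(np.linspace(0, 2, 80), np.linspace(0, 3, 120))
pane = np.column_stack([xs.ravel(), ys.ravel()])

# Seeds clustered near an impact point give small shards where the
# helicopter hits and larger ones toward the edges of the pane.
impact = np.array([1.0, 1.5])
seeds = impact + rng.normal(scale=0.6, size=(40, 2))

# Nearest-seed assignment = Voronoi cell membership per vertex.
d = np.linalg.norm(pane[:, None, :] - seeds[None, :, :], axis=2)
shard_id = d.argmin(axis=1)

shards = [pane[shard_id == i] for i in range(len(seeds))]
print(f"{sum(len(s) > 0 for s in shards)} shards ready for the rigid-body solver")
```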

To create the surface street, the crew used lidar scans and photography. “It looks flat,” Franklin says. “But it undulates all over the place. We photographed every inch of the road in excruciating detail, and lidar-scanned the entire surface. We could selectively replace the road in IMAX shots because the survey data gave us confidence in the tracking. It was easier than painting things out.” For tracking, Double Negative used 2d3’s Boujou, The Pixel Farm’s PFTrack, Science-D-Vision’s 3D-Equalizer, and an in-house photogrammetry program, DNphotofit, that also does camera solves using tracking points.
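The confidence Franklin mentions is usually expressed as reprojection error: push the surveyed lidar points through the solved camera and measure how far they land from the 2D tracks. A toy Python sketch with a simple pinhole camera; every number here is hypothetical, not survey data.

```python
# Sketch of validating a camera solve with lidar survey points: project
# the 3D points through the solved camera and compare with the 2D tracks.
import numpy as np

def project(points, R, t, f, cx, cy):
    """Pinhole projection of Nx3 world points: rotation R, translation t,
    focal length f in pixels, principal point (cx, cy)."""
    cam = points @ R.T + t                    # world -> camera space
    return f * cam[:, :2] / cam[:, 2:3] + np.array([cx, cy])

# Hypothetical solve for one 5.6k frame.
R = np.eye(3)
t = np.array([0.0, -1.5, 8.0])
f, cx, cy = 5000.0, 2808.0, 2048.0

lidar = np.array([[0.0, 0.0, 2.0], [1.0, 0.2, 4.0], [-2.0, 0.1, 6.0]])

# Stand-in for the 2D tracks: the true projections plus half a pixel of noise.
rng = np.random.default_rng(0)
tracks = project(lidar, R, t, f, cx, cy) + rng.normal(scale=0.5, size=(3, 2))

error = np.linalg.norm(project(lidar, R, t, f, cx, cy) - tracks, axis=1)
print(f"mean reprojection error: {error.mean():.2f} px")  # sub-pixel = trustworthy
```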

The shot ends with a real fuselage that the production crew set on fire and crashed into the armored car, and with Batman screaming after the Joker on his Batpod between exploding cars and practical pyro.

A stunt rider wearing a Christian Bale mask raced the practical Batpod, but Double Negative’s CGI Batpod and digital doubles starred in dangerous shots and those calling for the bike to do acrobatics.
 
“They drove a real Batpod through the practical pyro, but our digital Batpod did the extreme maneuvers, like 90-degree turns,” Franklin says. “The maximum deflection for the real bike is five degrees.” And, Batman couldn’t flatten the real bike, which he needed to do for the next shot: He slides between the wheels of a semitrailer that flips 180 degrees, tail over nose, and he comes out the other side.

Cables woven around lampposts stopped the semi and caused it to flip, uprooting the lampposts. The 18-wheeler is real, rigged with metal teeth on its nose that dig into the road surface, and it was flipped using a telegraph pole fired from a nitrogen launcher. Double Negative removed the nitrogen launcher, added the cable, removed the teeth, and replaced the road and the lampposts. “The prop lampposts didn’t have enough weight, so we rebuilt them and animated them with dynamics,” Franklin says.

The shot ends with Batman riding his digital Batpod up the side of a wall. “It’s like the scene with Gene Kelly running up the wall in Singin’ in the Rain,” Franklin says. “When the bike gets to the apex, the center flips 180 degrees, so Batman is still the right way up when he’s on the ground.”

Partying in the Prewitt
For a later sequence, Double Negative artists matched the location, the Trump Tower in Chicago, which was under construction, with a 40-story digital model that they blended into a nine-story set filmed at Warner Bros.’ studio in Cardington. “It was insanely complicated,” Franklin says. “The roof was covered in construction cranes. There were cement mixers, spools, shipping containers. You could see straight into the open floors because they hadn’t covered the sides with glass. We had to build the whole thing down to the ground, including the police cordon.”

During one shot, several men who try to corner Batman at the building’s edge become roped together. “Batman kicks one over the edge and the rope pulls the next guy over, and the whole chain tumbles out of the building, 30 stories up,” Franklin explains. “They shot it from a helicopter. The stunt guys had rigs to protect them and crash mats for when they slam back into the building.” So, Double Negative replaced the building and lit the digital model to match the movement of practical lights.

“The biggest challenge with the Prewitt building, though, was the huge amount of keying work for shots filmed on the set in Cardington,” Franklin says. “They had a 200-foot greenscreen outside the set that we replaced with views of Gotham City.”

They also added Gotham scenes out the window for shots during a supposed penthouse party filmed in a ground-floor lobby in Chicago. “The party is about 100 shots, and every shot required background replacement,” Franklin says. “The lighting conditions are constantly changing from magic hour, to deep magic hour, to full night.”

During the sequence, the Joker pushes Rachel Dawes (Maggie Gyllenhaal) out the window and, without hesitation, Batman dives after her.

“That was a CG shot,” Davis says. “Double Negative put Gotham outside the window, and created CG doubles for Maggie and Batman, but when they land at the bottom, it’s a live-action scene. So, it involved complicated greenscreen and character work.”

Similarly, Double Negative re-created an interior environment and replaced Batman and two hostages with digital doubles for a shot later in the sequence when Batman grabs the hostages and jumps over a stair rail to escape. And, the studio put digital doubles on the rooftop of the digital building for a confrontation between Batman and the Joker.

If that weren’t enough, the studio also created two ferries, filled them with panicked citizens, set them afloat on digital water, and gridlocked 30,000 people and 10,000 automobiles on Wacker Drive on their way to the ferries.

For the crowds, the studio wrote a simple crowd-placement tool that was easy to art-direct. “The artists could paint lines of people,” Franklin says. “They aren’t aware of other people, but they don’t stand on top of each other.” Animators created motion cycles for the crowd by rotoscoping extras in reference footage.
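Double Negative hasn’t published the tool, but the behavior Franklin describes maps naturally onto rejection sampling along a painted stroke. A minimal Python sketch; the function name and the spacing and jitter values are invented for illustration.

```python
# Minimal sketch of a "paint lines of people" placement tool: sample
# positions along an artist's stroke, jitter them off the line, and reject
# any position closer than a personal-space radius to someone already placed.
import numpy as np

rng = np.random.default_rng(42)

def place_crowd(stroke, count, spacing=0.6, jitter=0.5):
    """Scatter up to `count` people along a polyline stroke (Nx2),
    keeping everyone at least `spacing` metres apart."""
    seg = np.diff(stroke, axis=0)
    seg_len = np.linalg.norm(seg, axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg_len)])      # arc length
    targets = rng.uniform(0, s[-1], size=count * 4)      # oversample, then reject
    placed = []
    for u in targets:
        i = min(np.searchsorted(s, u, side="right") - 1, len(seg) - 1)
        p = stroke[i] + seg[i] * ((u - s[i]) / seg_len[i])
        p = p + rng.normal(scale=jitter, size=2)         # off-line jitter
        if all(np.linalg.norm(p - q) >= spacing for q in placed):
            placed.append(p)
        if len(placed) == count:
            break
    return np.array(placed)

stroke = np.array([[0.0, 0.0], [20.0, 2.0], [45.0, 0.0]])  # the painted line
people = place_crowd(stroke, count=200)
print(f"placed {len(people)} extras, at least 0.6 m apart")
```

Rejection sampling fits Franklin’s description: the stroke controls where density goes, while the spacing test guarantees nobody stands on top of anyone else.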

Hong Kong
Long before the ferry evacuation, Bruce travels to Hong Kong to pursue money launderers, an IMAX sequence for which Framestore CFC built Hong Kong’s tallest skyscraper, the 88-story Two ifc, its cousin, the 39-story One ifc, the environment surrounding the buildings, and a CG Batman. “Bruce Wayne gets into his Batman garb on top of the Two ifc,” says Davis. “So we filmed plates from the top and from helicopters to have digital Batman leap off the building.”

The shot starts with a stuntman on greenscreen. It transitions into a CG Batman who flies around digital Hong Kong, and then transitions again to the stuntman, who smashes through a window of the lower building. The stunt Batman sticks bombs onto other windows and blows out the side of a miniature building that Framestore CFC inserts into a CG building. Then, stunt Batman fires a little weather balloon that floats up to a real C-130 transport plane. Digital Batman grabs onto a skyhook and enters a digital cargo hold in the plane.

Some directors might have composited the Joker in front of CG fire, but Chris Nolan had Heath Ledger walk out of a real building that was blown up. CG previsualizations helped the demo experts plan the timing of the explosions during the continuous, in-camera shot.
 
“Our buildings had to match shots with real buildings cut directly with CG buildings, and in one shot, we have a bit of our CG building on top of the real building,” says Framestore CFC’s Tim Webber, “and we put a 2.5D city around the building. We were pretty well equipped to do the buildings, but we created new photogrammetry techniques for the highly detailed office interiors.” Framestore CFC also uses a Maya, RenderMan, and Shake-based pipeline.

To create the interiors, the studio shot high-resolution fish-eye stills inside numerous Hong Kong offices, several per room, using a digital camera mounted on a custom-built camera head. They stitched the stills into environment maps and triangulated the positions of objects inside to reconstruct the room from the photographed environments. “We didn’t model to the level of pencils and pens,” says Ben White, CG supervisor, “but we modeled computer monitors, chairs, lamps, furniture—anything with a clearly defined form.” Photographs projected on the models provided the realistic textures.
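The triangulation step White describes comes down to intersecting view rays cast from two or more camera positions toward the same object. A short Python sketch of the standard least-squares “midpoint” method; the camera positions and the monitor location are hypothetical stand-ins for the survey data.

```python
# Sketch of photogrammetric triangulation: given the same object seen from
# two camera positions, find the point closest to both view rays.
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Closest point to two rays (origin o, unit direction d)."""
    # Solve for ray parameters t1, t2 minimising |(o1+t1*d1)-(o2+t2*d2)|.
    A = np.column_stack([d1, -d2])
    t = np.linalg.lstsq(A, o2 - o1, rcond=None)[0]
    p1, p2 = o1 + t[0] * d1, o2 + t[1] * d2
    return (p1 + p2) / 2              # midpoint of closest approach

monitor = np.array([3.0, 1.2, 5.0])   # "ground truth" desk monitor

o1 = np.array([0.0, 1.5, 0.0])        # first camera-head position
o2 = np.array([2.0, 1.5, 0.5])        # second position across the room
d1 = (monitor - o1) / np.linalg.norm(monitor - o1)
d2 = (monitor - o2) / np.linalg.norm(monitor - o2)

print(triangulate(o1, d1, o2, d2))    # recovers ~[3.0, 1.2, 5.0]
```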

The studio created most of these digital sets, the background matte paintings, the tiled environments, and the highly detailed foreground buildings with modeled interiors at 5.6k resolution, with some renders at 8k. “We didn’t need to change the pipeline,” White says, “although it did make the network glow a bit.”

As for Batman, the studio paid particular attention to his cape as he glides through the city. “Our cloth sims had to be carefully worked out to get the right feel for the cape,” says Webber, “and we worked on new shaders to reproduce the way the light bounces off all the different black surfaces.”
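Framestore CFC’s cape solver isn’t public, but sims like it build on the textbook mass-spring approach: integrate the particles forward, then repeatedly relax distance constraints so the cloth holds its shape. A toy Python version, with a one-dimensional strip of particles standing in for the cape and all parameters invented:

```python
# Generic cloth step (Verlet integration + distance-constraint relaxation),
# the textbook scheme such solvers build on. Not the production code.
import numpy as np

n, rest = 20, 0.1                     # particles along the cape, rest length (m)
pos = np.column_stack([np.arange(n) * rest, np.zeros(n), np.zeros(n)])
prev = pos.copy()                     # starts horizontal, then falls
gravity = np.array([0.0, -9.8, 0.0])
dt = 1.0 / 24.0                       # one film frame per step

for frame in range(48):               # two seconds of motion
    pos, prev = 2 * pos - prev + gravity * dt**2, pos   # Verlet step
    for _ in range(10):               # relax the distance constraints
        for i in range(n - 1):
            d = pos[i + 1] - pos[i]
            length = np.linalg.norm(d)
            corr = 0.5 * (length - rest) / length * d
            pos[i] += corr
            pos[i + 1] -= corr
        pos[0] = [0.0, 0.0, 0.0]      # re-pin at Batman's shoulders

print("cape tip after 2 s:", pos[-1].round(3))
```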

Keeping his gliding style grounded in physical reality was equally important. “Everything needed an extra level of naturalness above and beyond photoreality,” says Webber. “It wasn’t enough to look real. It had to look really possible.”

While Bruce Wayne/Batman might be conflicted in the film, the director was not. Fortunately, an army of artists controlling state-of-the-art technology provided the superpower to bring his vision to light.

SONAR VISION
Batman can see through walls...when the machine is working
Consider that in Gotham City, everyone has a cell phone. Knowing this, we learn in The Dark Knight that billionaire Bruce Wayne and his high-tech friends built a supercomputer that translates all those signals into digital data that acts like 3D sonar. As a result, whenever someone uses a cell phone, Batman can see, on monitors in his Batcave and through special goggles, the scene surrounding the phone. BUF created the effect.

Inside the Batcave, hundreds of screens line the walls, and every screen tells a little story that BUF created, a moment from everyday life—people cooking, talking, riding to work in a car, sitting at a desk. “We see the city on these screens,” says Dominique Vidal, visual effects supervisor. “Batman also uses [the goggles] like a virtual camera to see through walls and floors.”

To create these “sonar vision” scenes, the studio modeled typical apartment rooms, a subway, stores, and various other environments. To add people, they shot the slice-of-life performances with the studio’s Video Motion Capture (VMC) system, which uses four cameras to capture multiple angles, and then rotoscoped the actors using the four angles to help animate the CG characters. For shots of Batman looking through his sonar-vision virtual camera while riding through Gotham City, BUF modeled a large part of the city by referencing matte paintings and models from Double Negative.

“Chris [Nolan] didn’t want X-ray vision,” says Alain Lalanne, visual effects producer. “He wanted to see all the surfaces of the objects. He hates a CG look.” 

And yet, the studio needed to create the scenes in CG to give themselves and Nolan the flexibility of a CG camera. “We knew we had to see people’s everyday lives, but we wanted to give Chris the freedom to tell us whether he wanted to see people cooking, being at the office, or walking down the street,” Lalanne says. “So, it was easier to create the actions with CG characters than produce thousands of minutes of 2D footage.”

In addition, to sell the idea that the images “pulse on” when the cell phone goes live, the footage sometimes needed to look transparent or opaque. And sometimes, the technology breaks down.

The erratic technology was Nolan’s idea. The audience discovers the notion of imperfect technology during a sequence toward the end of the film that takes place in the under-construction Prewitt building. Sonar vision makes it possible for Batman to explore the skyscraper without moving.

“He uses the gadget to fly his vision through the building to look at different places and clarify the action,” Vidal says. In doing so, he discovers which characters are good and which are bad. He can see the people on different floors and in different rooms, those being held hostage, the Joker, and the SWAT team. Sometimes we see through a wall to other rooms in the building, room after room after room, and it gives a vertigo effect. 

“During the process of building this effect, Chris [Nolan] and Nick [Davis] pushed and pushed us to make something photorealistic,” Lalanne says. “When we had achieved the photorealistic quality, Chris told us to destroy the image. He explained to us that Batman doesn’t have superpowers, only technology. And, technology is never perfect; the machine Batman was using was not working well. It was very smart of them. It’s easier to destroy an image with a lot of quality than to do the reverse.”

“But,” Lalanne continues, “we had to find a design that would explain to the audience that the machine was a prototype and not perfect. Chris explained the idea of the machine, but it was a thing that doesn’t exist. We had to imagine what this non-existing machine would let us see. That was quite interesting.”

The artists at BUF altered the images with flashes, noise patterns, shaking, artifacts, and so forth using shaders and 2D effects. And then they created what Vidal calls a choreography of texturing the image.
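BUF’s software is proprietary, but the kind of 2D “destruction” pass the article describes is straightforward to illustrate: noise, scanline dropout, shake, and the occasional flash layered over a clean frame. A Python sketch; every parameter below is invented for illustration.

```python
# Sketch of degrading a clean rendered frame the way the prototype sonar
# machine might: sensor noise, dropped scanlines, shake, periodic flashes.
import numpy as np

rng = np.random.default_rng(1)

def degrade(frame, t):
    """Degrade one float image (HxW, values 0..1) at frame t."""
    out = frame + rng.normal(scale=0.08, size=frame.shape)   # sensor noise
    rows = rng.random(frame.shape[0]) < 0.03
    out[rows] *= 0.2                                         # dropped scanlines
    shift = int(2 * np.sin(t * 3.1))                         # horizontal shake
    out = np.roll(out, shift, axis=1)
    if t % 24 == 0:                                          # flash once a second
        out = out + 0.6
    return np.clip(out, 0, 1)

clean = np.tile(np.linspace(0, 1, 64), (48, 1))              # stand-in render
for t in range(48):
    noisy = degrade(clean, t)
print("degraded", t + 1, "frames")
```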

“The [sonar-vision] shots are between the live-action shots,” says Lalanne. “So, we used the virtual camera to give a violent rhythm to the scene.” For example, they might add a flash at a particular point to staccato the rhythm.

 “It is not far from a stroboscopic effect, but not too aggressive,” Vidal says. “It took us a year to find the rhythm, the decay of each, how long you’d see the set, how long it would be revealed.”

To create the sonar-vision effects, BUF used proprietary software. “That’s the interesting thing with BUF,” Lalanne says. “We develop a specific software for every movie. For sonar vision, we created software, for this movie, for this effect, for this director.”  –Barbara Robertson
 

TWO-FACE
When Harvey Dent's face catches fire, Framestore CFC provides the flames and the afterburn in 120 shots

The effect is this: A fire splits actor Aaron Eckhart’s face right down the middle. One half is normal. The other half is not. The fire burns his skin crispy in some places, burns through to the bone in others.

“The last time Two-Face was in a film, prosthetic makeup created the burned side of the face,” says Nick Davis, visual effects supervisor. “This time, we wanted it to be a subtractive effect, as if half his face had gone. You see his teeth, exposed muscles, his eyeball.”

The trick, then, was to create the burned half, move it in concert with the normal half, and somehow have it all look realistic. But, not too realistic.

“Chris [Nolan] wanted us to come up with a design that was photorealistic and shocking, but not so revolting that the film would be unwatchable,” says Davis. “We have an emoting, performing human with a human face, although it doesn’t look very human.”

Tim Webber led a crew at Framestore CFC that created the effect. “There were two big challenges,” he says. “The first and biggest was tracking [the burned face] onto the actor’s performance. We had to do a full-performance capture during a normal film shoot to get such a precise performance that we could stick the digital face on without any sliding. The second was that all these different bits of burnt flesh had to look real.” The performance tracking also enabled the studio to stick photographed fire elements onto Harvey Dent’s face when he bursts into flame.

CG supervisor Ben White led the tracking effort. On set, the crew generally used three small HD “witness” cameras running at 48 frames per second to capture Eckhart’s performance. For a sequence that takes place inside a car, they taped lipstick cameras inside the vehicle and, in one shot, could use only the film camera.

Eckhart wore a partial prosthetic on the back of his head and markers on his face, approximately 25 primary retro-reflective markers and 50 makeup dots. To line up the markers consistently from one day to the next, makeup artists applied the dots using a cardboard mask cut with holes. Ring lights on the main camera and the HD cameras helped capture the markers in low light.

For tracking the captured dots, the crew used Movimento from RealViz, now part of Autodesk, and hand-tracked 2D points when necessary. “First, we established the position and orientation of the actor’s head in each shot in conjunction with the camera track, using filtering techniques to get rid of noise,” White explains.
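The article doesn’t name the pose solver, but the classic way to recover a rigid head position and orientation from tracked markers is the Kabsch (Procrustes) fit: find the rotation and translation that best map the neutral marker layout onto one frame’s tracked positions. A Python sketch with hypothetical marker data:

```python
# Rigid head-pose recovery from tracked markers via the Kabsch fit.
# Marker layout, rotation, and noise level below are all hypothetical.
import numpy as np

def rigid_fit(neutral, tracked):
    """Least-squares R, t with tracked ~= neutral @ R.T + t."""
    cn, ct = neutral.mean(axis=0), tracked.mean(axis=0)
    H = (neutral - cn).T @ (tracked - ct)       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, ct - R @ cn

# Hypothetical neutral layout of a few facial markers (metres).
neutral = np.array([[0.0, 0.0, 0.10], [0.04, 0.03, 0.09],
                    [-0.04, 0.03, 0.09], [0.0, -0.05, 0.11]])

# Fake one frame: turn the head 10 degrees, move it, add tracking noise.
a = np.radians(10)
R_true = np.array([[np.cos(a), 0, np.sin(a)],
                   [0, 1, 0],
                   [-np.sin(a), 0, np.cos(a)]])
tracked = neutral @ R_true.T + np.array([0.02, 0.01, 0.5])
tracked += np.random.default_rng(3).normal(scale=1e-4, size=tracked.shape)

R, t = rigid_fit(neutral, tracked)
print("recovered translation:", t.round(3))     # ~[0.02, 0.01, 0.5]
```

Filtering the per-frame fits over time, as White describes, then removes the residual jitter before the facial deformation is layered on top.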

Once the crew positioned Eckhart’s head in the scene, they placed the facial track onto his digital model and began applying his facial deformation, binding the geometry to the virtual tracking points and making sure the face deformed accurately.

“Even though we used the cardboard mask, the markers were off a few millimeters every day, which would deform his head in a way we didn’t want,” White says. As a result, the rigging and scripting team developed techniques to sample the position of the primary markers on the skin and then determine where to base deformers and muscle groups. And although the facial tracking was fastidious, sometimes it wasn’t enough.

“We’re used to seeing skin slide over bone in CG characters, and we captured that skin performance, but in Two-Face, we can see the bone,” White says. “We needed to know what’s going on underneath.” Thus, they used Eckhart’s teeth to determine how to position his jawbone and attach muscles to his skin and bone. When he tenses his jaw angrily and there’s no captured movement, animators and the facial-tracking team keyframed the performance.

Animators also keyframed the digital eyeball’s movement to match the actor’s eye. “The fake eye had to move exactly like the real one,” White says. “In the CG world, usually an eye has a center and it rotates perfectly. This eye is a ball of jelly in a socket pulled by muscles, and we can see it. We had keys on every single frame in every single shot.”

Look-development lead Rob Allman worked with a small team—a compositor, a texture artist, a modeler, and an animator—to devise a realistic look for the burned face. “We had to match the skin as if it had been burned, which meant all the good skin hadn’t been completely damaged,” he says. “So we had carbonized flesh, and then all the organic things—the exposed muscle tissue, the teeth, the eyes. And, the face is divided exactly down the center of his head, which is unnatural, but it’s a necessary part of his character, so we had to make that credible.”

For reference, the team had a prosthetic model from Warner Bros., 2D concept art created by artists at the studio, and a burned chicken. “We bought a chicken at the supermarket, burned it, and examined it under light,” Allman says. “Burned skin has a twinkly quality, like charcoal. It’s important to see how it moves and deforms the face.”

For subsurface scattering, the team implemented systems developed by Pixar that are part of the standard RenderMan tool set, embellishing them to create the charred look, and created other variations for the gums, teeth, and eyes, all in neutral environments.
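Point-based subsurface systems of that era approximate the diffusion-dipole profile of Jensen et al. (2001); the studio’s exact shaders aren’t public, but the profile itself is standard and easy to evaluate. A Python sketch with illustrative, non-production material values:

```python
# The classical dipole diffusion profile R_d(r) from Jensen et al. 2001,
# the model behind that era's subsurface tools. Material values invented.
import numpy as np

def dipole_Rd(r, sigma_s_prime, sigma_a, eta=1.3):
    """Diffuse reflectance at distance r from the light's entry point."""
    sigma_t_prime = sigma_s_prime + sigma_a       # reduced extinction
    alpha_prime = sigma_s_prime / sigma_t_prime   # reduced albedo
    sigma_tr = np.sqrt(3.0 * sigma_a * sigma_t_prime)
    Fdr = -1.440 / eta**2 + 0.710 / eta + 0.668 + 0.0636 * eta
    A = (1.0 + Fdr) / (1.0 - Fdr)
    z_r = 1.0 / sigma_t_prime                     # real source depth
    z_v = z_r * (1.0 + 4.0 * A / 3.0)             # mirrored virtual source
    d_r = np.sqrt(r**2 + z_r**2)
    d_v = np.sqrt(r**2 + z_v**2)
    return alpha_prime / (4.0 * np.pi) * (
        z_r * (sigma_tr * d_r + 1.0) * np.exp(-sigma_tr * d_r) / d_r**3
        + z_v * (sigma_tr * d_v + 1.0) * np.exp(-sigma_tr * d_v) / d_v**3)

r = np.linspace(0.01, 2.0, 5)                     # mm from the entry point
print(dipole_Rd(r, sigma_s_prime=1.2, sigma_a=0.05))  # falloff profile
```

Varying the scattering and absorption coefficients per region is one way such a system could shade healthy skin, charred flesh, gums, and eyes differently, as the Framestore team describes doing with its variations.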

“We didn’t have footage for quite a while, so we worked blind, looking at reference, writing shaders,” Allman says. “As soon as we got the footage, we introduced the colors and lighting from the plates. There’s only so much you can do in a neutral environment.”

During the development process, Allman worked back and forth with compositing. “A compositor would develop something in Shake that I rolled back into shader development,” Allman says. “It was a two-way process. If the 2D artists are involved, they can inform the 3D, translate the 3D, and it enriches the look.”  –Barbara Robertson
 

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.