Man of Steel Previsualization
Issue: Volume 36, Issue 6 (Sept/Oct 2013)

Pixel Liberation Front (PLF) helped immerse Man of Steel Director Zack Snyder in various scenes, where he could see the sets, characters, and vehicles at full scale. "He would do his camera work and communicate the look, pace, and feel of the sequences," says PLF's Previs Supervisor Kyle Robinson. And that was not always easy.

The Smallville battle sequence - shot on location in Plano, Illinois - was one of the most heavily previs'd sequences in the film. It's the stage for Superman's first seismic showdown with General Zod and two other Kryptonian exiles, as F-35s and military choppers circle the skies above. To choreograph the scene's complex interplay of live action, CG, editing, and camera work, PLF's previs team hunkered down at an LA soundstage with Stunt Coordinator and Second Unit Director Damon Caro, who worked out the entire routine with stunt performers playing Superman, Zod, and the other Kryptonians. Sometimes strapped into cable harnesses, all of them wore color-coded tracking suits with markers. PLF recorded the fight with two Sony video witness cameras, keyframed the digital doubles in Autodesk's Maya to match the recorded movements, and then placed them inside the virtual Smallville set.
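PLF's in-house tools aren't public, but the core of that step - blocking a digital double in Maya to match witness-camera reference - can be sketched with Maya's standard Python commands. Everything here is illustrative: the rig name, frame numbers, and pose values are assumptions, not PLF's actual data.

```python
# Illustrative sketch of rotoscope-style blocking in Autodesk Maya (Python).
# Witness-camera footage would typically be loaded as an image plane for
# reference; the rig name, frames, and pose values below are hypothetical.
import maya.cmds as cmds

def block_pose(root, frame, translation, rotation):
    """Key the digital double's root at a pose matched by eye to the footage."""
    cmds.currentTime(frame)
    cmds.xform(root, worldSpace=True, translation=translation, rotation=rotation)
    cmds.setKeyframe(root, attribute=['translate', 'rotate'], time=frame)

# Rough fight blocking: the Superman double lunges toward Zod over 12 frames.
block_pose('superman_root', 101, (0.0, 1.7, 0.0), (0.0, 90.0, 0.0))
block_pose('superman_root', 113, (4.5, 1.5, 0.2), (0.0, 75.0, 0.0))
```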

From there, PLF took the scene to Snyder's personal studio, where, using the company's Vicon-based virtual camera system, he could walk around the virtual Smallville street - about four to five blocks long - study the unfolding melee, experiment with camera angles, make changes to the set, staging, or beats, and work out the right pacing and coverage long before setting foot on location in Illinois.

"We worked with Zack to get the camera angles around the action he liked; we could fly a plane through the scene, and so forth. He'd figure out how he wanted to cover everything, where he wanted to place the fight on the street, so that when they got to the location, the crew knew how to dress the set and stage the fight," Robinson says.

At his studio, and ultimately on set, Snyder and PLF also worked closely with Caro and Visual Effects Supervisors Guillaume Rocheron and John Des Jardin. "Guillaume, the VFX supervisor at MPC, which handled the final effects on the sequence, did an awesome job," says Robinson. "He and Des Jardin worked closely together, studying and refining the previs to nail down the camera movements they needed to link the practical cinematography with what they called the 'envirocam' photography, which was crucial for creating seamless handoffs between the real Michael Shannon or Henry Cavill and their digital counterparts - when they suddenly sprang into flight, for example."

On set, Snyder, Rocheron, and Des Jardin staged the actors according to the previs and fitted them with Xsens MVN motion-capture suits for placing CG armor and facilitating the transitions from live action to CG.

"Once the plates were shot, they were given back to us so we could fill in our previs assets, adding planes and Superman flying, and working with Damen again to get exactly what was requested, before handing over [the postvis] to editorial so they could stitch it all together," says Robinson. To track the camera moves during the postvis process, PLF used The Pixel Farm's PFTrack, while relying on Adobe After Effects for compositing.

Proof Inc. Lends a Hand

While PLF tackled previs and postvis for Man of Steel, Proof Inc. handled on-set visualization (durvis) for two key sequences, including the launch of the spacecraft bearing the baby Kal-El from the exploding Krypton.

The facility also provided on-set visualization to fill in the background of the minimal practical set used for the Black Planet, ensuring that the camera moves framed certain CG landmarks. To generate the real-time composites, Proof Inc. used Lightcraft's Previzion system, a real-time camera tracking, CG rendering, and compositing tool. "We received the models from previs and prepped them in Maya for the Previzion render engine," says Previs Artist Anne Lee. "This involved making the geometry as lightweight as possible and applying real-time shaders for the materials." The Previzion system would then track the camera, map the data onto the virtual camera, and use it to render the CG component of the real-time composite. Once the shots were approved, Proof Inc. handed off the Maya files containing the virtual environment and the tracked camera to the main vendors, which began re-creating the composites with the finished effects.
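The prep Lee describes maps onto standard Maya calls: decimate the geometry, then swap in cheap materials. A minimal sketch, assuming a reduction percentage and shader choice that are purely illustrative:

```python
# Sketch of prepping previs geometry for a real-time engine in Maya (Python):
# reduce the polygon count, then assign a cheap, fast-to-evaluate material.
# The 60 percent reduction and lambert shader are illustrative assumptions,
# not Proof's actual settings.
import maya.cmds as cmds

def prep_for_realtime(meshes, reduce_percent=60):
    shader = cmds.shadingNode('lambert', asShader=True, name='realtime_mtl')
    sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
                   name='realtime_mtlSG')
    cmds.connectAttr(shader + '.outColor', sg + '.surfaceShader')
    for mesh in meshes:
        cmds.polyReduce(mesh, percentage=reduce_percent, constructionHistory=False)
        cmds.sets(mesh, edit=True, forceElement=sg)  # assign the light shader

prep_for_realtime(cmds.ls(type='mesh', long=True) or [])
```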

Like PLF's Robinson, Proof Inc.'s Ron Frankel recognizes the bottleneck in making real-time adjustments to a shot. "Real-time animation tools and real-time playback [using modern game engines, for example] have tremendous promise for pushing previs to the next level," he says, adding that advancements in real-time lighting technology would be a huge boon as well. "As it stands, there isn't much a DP can do with lighting tools that can only mimic real-world lighting. So, real-time lighting tools would be fantastic, especially if they could accurately simulate physical lighting. If that were the case, then we'd see more involvement from the DP."