Orchestrators of the Dream
Issue: Volume 36, Issue 6 (Sept/Oct 2013)

It was 1971, and a 24-year-old Steven Spielberg was so nervous about shooting his first feature, Duel, that he quickly hired a sketch artist to draw out every shot on a long strip of paper, which he then taped along the four walls of his tiny motel room so he’d know exactly where he was – and where he was going – during the hectic, 13-day shoot.

Indeed, directors have long tried to previsualize their films before looking through a lens or exposing a single frame. And while the films of Spielberg, Martin Scorsese, and Alfred Hitchcock owe much of their unmistakable polish to extensive storyboarding, what we’re seeing today is a level of visual choreography a step beyond even those masters’ work.

THE THIRD FLOOR conducted previs and postvis processes for Iron Man 3.

This year alone, millions of moviegoers watched, mesmerized, as Tony Stark flitted deftly across a jungle of cranes and gantries surrounding a sprawling shipping yard. Working in tandem with a fleet of Iron Man suits and dodging gunfire from rabid super-soldiers, he tried to rescue the president, strung up over a docked oil tanker. On a storm-whipped sea, viewers saw terrified fishermen stare slack-jawed as alien leviathans rose from the deep to slug it out with massive, man-operated robots. And far away, as Krypton crumbled all around, they watched Jor-El ride his winged H’Raka and spar with General Zod before launching his only son, Superman, to his new home on Earth. It’s all a whirlwind of action that unfolds with such balletic editing and such precise, orchestrated staging and effects that it’s hard to imagine how it was conceived through storyboards alone.

That’s because it wasn’t. It was conceived in previs.

Amid soaring audience expectations, shrinking schedules, and ever-growing artistic ambition, previs is expanding into a juggernaut of a department. No longer just the birthplace of a director’s vision, previs is, in the words of previs pioneer and Proof Inc. Founder Ron Frankel, slowly becoming “a nexus of interdepartmental communication.”

“Historically, previs came out of effects, so we’re very well integrated there, but we also interact well with the art department and with camera,” he says. Previs is enfolding other departments as well, including lighting, wardrobe, and sound, because, as Frankel says, “previs is frequently the best – and in some cases, the only – way for various departments to get together and discuss their common challenges.”

Aaron Weintraub, VFX supervisor at Toronto’s Mr. X, which previs’d Guillermo del Toro’s Pacific Rim, agrees. “Today, we’re seeing a convergence of the three phases of production. Because of virtual production technology, camera moves from previs can survive through the shoot and on into post.” The artwork, assets, and pipeline set up in previs, he adds, are leveraged throughout the shoot and into postproduction, eliminating uncertainty and reaping cost-savings everywhere. “As production elements are folded back in, previs becomes postvis, which becomes actual VFX and postproduction.”

So, what’s postvis? From its roots in the conceptual phase, previs is dividing into at least five distinct branches. The first, previs proper, is the conceptualizing of shots at the preproduction stage. The second, durvis, lets directors layer in previs “during” the actual shooting of live action, often through the use of augmented-reality setups. The third, postvis, is the compositing of previs elements into the live-action plates so the intended effects can be checked against the footage. The fourth, techvis, encompasses the mechanical design and testing of props and sets, usually to ensure they’re simpatico with elaborate camera and stunt work. And, finally, there’s stuntvis, in which previs artists collaborate on set with directors and mocapped stunt performers to plan and execute complicated action.

All five phases of the previs machine were hard at work in this summer’s biggest films. For insight into their creation and into the future of the art form, we turned to some of the masters of the previs game: Pixel Liberation Front, which handled Man of Steel; Proof Inc., which did on-set visualizations for the film; The Third Floor, which helmed both Iron Man 3 and Oblivion; Halon Entertainment, which tackled The Wolverine; Mr. X, which laid the groundwork for Pacific Rim; and previs specialist Cameron Sonerson, who helped Baz Luhrmann realize visionary set pieces for The Great Gatsby.

THE THIRD FLOOR previsualized the complex climactic Seaport Battle for Iron Man 3.

Iron Man 3

Known industrywide as a previs powerhouse, The Third Floor designed several action set pieces for Iron Man 3, among them the Mansion Attack sequence, during which military choppers strafe Tony Stark’s seaside Malibu estate, carved into the hillside. The studio also previs’d the Suit Connect Test, the Crash Landing in the Forest, the Air Rescue (in which Stark forms an aerial human chain to rescue freefalling passengers from a downed Air Force One), and, finally, the climactic Seaport Battle, which unfolds on a morass of gantries and scaffolding surrounding an impounded oil tanker.

“Having multiple story lines converge during a massive, multi-suit fight was challenging,” says Previs Supervisor Todd Constantine. “We worked closely with Visual Effects Supervisor Chris Townsend to ensure camera moves, lenses, and staging would remain realistic and plausible.” Those were things the initial storyboard animatics – which Director Shane Black loved – couldn’t answer. The previs team’s job, then, was to maintain the essence of those animatics while helping Black explore and plan what could actually be shot.

“In our initial pass, we would lock down the best lens and the ideal order for the shots,” says Constantine. “Since the sets for Tony Stark’s mansion and the garage [where he does the Suit Connect test] had already been designed, we choreographed the action within the scope of the set designs, indicating camera angles, movement of actors and props, and the layout of various effects that worked for those specifications. The seaport was an actual location, so previs was used mainly to determine the sets that needed to be constructed to accommodate the filming and the safety of the actors.”

Once Cinematographer John Toll had shot the live-action plates, the postvis process began; artists started tweaking the previs and layering it back into the plate to ensure it would gel with the completed effects.

The postvis crew collaborated with Visual Effects Supervisor Townsend, Editors Jeff Ford and Peter Elliot, and VFX Editors George McCarthy and Logan Briet. When a scene needed postvis, McCarthy and Briet would pass over a reference cut and the live-action plates, along with Black’s notes.

As part of the previs and postvis process, The Third Floor provided its digital assets – including camera information and character and set animation – to the main effects vendors, Scanline and Trixter.

Most of the high-intensity action sequences in Iron Man 3 adhere closely to their previs’d counterparts, primarily because the director, visual effects supervisor, and second unit director were so heavily invested in developing them, says Constantine. “When it comes to sequences that have been previs’d, the level of stress about the game plan is lifted. When the previs is exciting but maybe the location or set has been changed just before shooting, the filmmakers can use the previs as a type of guide as to what the sequence should feel like.”

Whether followed religiously or used as a guide, what’s most rewarding about previs, says Constantine, is witnessing the director’s vision at the embryonic stage, helping it come to life on the big screen, and still seeing the vestiges of one’s own creativity in the finished product.

But many times, Constantine continues, the previs team will just receive a handful of script pages and a quick meeting with the director to go over the key beats. “In all cases, when we sit down in the theater to watch the final result, we get to experience all the creative moments leading up to how the scenes were decided upon, how our artists contributed to the sequences, and how the director’s vision finally translated to the big screen.”

THE THIRD FLOOR helped block action and explore the look and function of environments, vehicles, and other elements in the world of Oblivion.

Oblivion

The Third Floor also previs’d the sci-fi film Oblivion, which sees Earth destroyed by aliens known as Scavs in 2077. Only Tom Cruise’s Jack Harper and his partner Victoria remain on Tower 49, repairing drones that defend ocean-borne power stations feeding energy to a human colony on Titan.

Four months before production, Previs Supervisor Nick Markel met with Director Joseph Kosinski to sketch out 25 sequences, including the opening battle, the drones’ attack on the Scav compound, and the final aerial battle. With a background in 3D software, Kosinski sat with Markel and, guided by the storyboards, figured out three things: the camera, specifically the lens choice, movement, and composition; the feel of the animation, including how Tom Cruise’s Bubbleship would fly; and the previs edit.

By the time Kosinski began sketching out his shots, many of the guidelines had already been drawn, particularly by the art department – which brings up an important point. “There’s a fine line between starting too early and starting too late, which varies considerably from show to show,” says Markel. “A number of things should be answered before previs starts in order for it to be effective, but it should also start early enough to produce useful material for planning and budgeting. Production Designer Darren Gilford had already finished conceiving, among other elements, the look of the Sky Tower and the Bubbleship, so we received the digital models from the art department and were able to move seamlessly right into helping Joe [Kosinski] figure out the action and sequence of events.”

Since Kosinski is a director who likes to shoot as much in-camera as possible, Markel also did techvis to figure out front-projection solutions to capture backgrounds in camera. Once Kosinski finished shooting the live action, however, Markel and his team went through an extensive postvis phase using Adobe After Effects to provide temporary visuals for editorial, while Pixomondo and Digital Domain, the main VFX vendors, created the final visual effects.

HALON USED PREVIS to test out ideas in The Wolverine, including camera placement.

The Wolverine

In August, Marvel’s resident clawed superhero, Wolverine, roared back onto the big screen in a new film that follows him as he travels to Japan, confronts a figure from his past, and wrestles with his demons. To help carve a fresh, cinematic approach to the man with the adamantium-laced skeleton, Director James Mangold enlisted the previs specialists at Halon Entertainment, led by veteran Previs Supervisor Clint Regan.

“James [Mangold] used previs to explore and test moods and ideas that he wanted for portions of the film,” says Regan. “In the end, we previs’d approximately six sequences, including the opening scenes and the finale, which went through several versions before James found what he was looking for. James liked to search for the right mood.”

They worked hand-in-glove with Production Designer François Audouy, who would sit with Regan and Mangold and build the sets on the fly to meet the needs of the scenes they were blocking. Halon’s artists built the bulk of the sets, characters, and animations in Autodesk’s Maya and MotionBuilder, using a library of rigged character models, sets, vehicles, and EFX stand-ins. “Once we had enough pulled together, we started to block the scenes in Maya. We then moved the characters around the set to find the most efficient and interesting places for the action to occur and the most interesting ways to shoot the important story points.”
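For readers curious how such a blocking pass is typically assembled, the sketch below shows the general pattern – referencing a rigged stand-in into a set scene, keying rough positions, and adding a previs camera – using Maya’s Python commands. The file path, namespace, and control names are hypothetical; this illustrates the kind of workflow Regan describes, not Halon’s actual tool set.

```python
# Rough previs blocking pass in Maya (Python). Paths and node names are hypothetical.
import maya.cmds as cmds

# Reference a rigged stand-in character into the current set scene.
cmds.file('/previs/library/characters/hero_rig.ma',
          reference=True, namespace='hero')

# Block the character's path through the set with a few rough keys.
root = 'hero:root_ctrl'  # hypothetical top-level control on the rig
for frame, (tx, ty, tz) in [(1, (0, 0, 0)), (48, (6, 0, 2)), (96, (14, 0, 9))]:
    cmds.currentTime(frame, edit=True)
    cmds.xform(root, worldSpace=True, translation=(tx, ty, tz))
    cmds.setKeyframe(root, attribute=['translateX', 'translateY', 'translateZ'])

# Add a simple previs camera and keep it aimed at the action for a first pass.
cam, cam_shape = cmds.camera(focalLength=35)
cmds.xform(cam, worldSpace=True, translation=(20, 4, 20))
cmds.aimConstraint(root, cam)
```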

Previs is driven by a constant need for efficiency, and Regan is encouraged by recent improvements to Maya’s viewport, specifically the addition of real-time depth of field, shadows, and normal maps, which, he says, were not available until previs on The Wolverine had wrapped.
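As a rough illustration of the viewport features in question, the snippet below switches a panel to Viewport 2.0 and enables its real-time effects. Exact attribute and flag names vary between Maya versions, and the camera name is hypothetical, so treat this as a sketch rather than a recipe.

```python
# Enable Viewport 2.0 real-time preview features in Maya (Python).
# Attribute and flag names may differ slightly between Maya versions.
import maya.cmds as cmds

panel = cmds.getPanel(withFocus=True)  # assumes a model panel has focus

# Switch the panel to Viewport 2.0 and display scene lights and shadows.
cmds.modelEditor(panel, edit=True, rendererName='vp2Renderer',
                 displayLights='all', shadows=True)

# Global Viewport 2.0 quality settings: anti-aliasing, ambient occlusion,
# motion blur, and camera-driven depth of field.
cmds.setAttr('hardwareRenderingGlobals.multiSampleEnable', 1)
cmds.setAttr('hardwareRenderingGlobals.ssaoEnable', 1)
cmds.setAttr('hardwareRenderingGlobals.motionBlurEnable', 1)
cmds.setAttr('hardwareRenderingGlobals.renderDepthOfField', 1)

# Depth of field is driven per camera; enable it on the shot camera.
cmds.setAttr('previsCamShape.depthOfField', 1)  # hypothetical camera name
```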

While the previs artist almost always plays midwife to the vision of the director or other department head, there are times when he or she is handed the directorial reins. That happened when the director gave Regan and VFX Supervisor Greg Steele the task of designing a standoff with a bear.

On The Wolverine, Regan’s second outing with Mangold after 2010’s Knight and Day, a relationship of trust had developed. “When that happens,” Regan says, “I’m able to make suggestions I feel will help the story. Ultimately, my job is to interpret the director’s vision, give him options, and help him reach his vision of the story he is trying to tell.”

MR. X PERFORMED a range of previs work, along with techvis, for Pacific Rim.

Pacific Rim

In another tale of alien invasion, del Toro’s robot epic Pacific Rim explored the fusion of man and machine with unsurpassed visual flair and awe-inspiring spectacle. Giant leviathans, known as Kaiju, rise from the Pacific Ocean, killing millions and consuming Earth’s resources. In response, mankind constructs massive robots, called Jaegers, to take them on. The Jaegers are manned by two pilots whose minds are linked in a neural nexus.

Known for his fantastical imagery and hyper-real, saturated color palette, del Toro enlisted visual effects house Mr. X. Among their many tasks, the artists had to design the initial builds of the Jaegers and Kaiju, and then craft a fully animated, ocean-set battle sequence that would establish the action, mood, and pacing of the Jaeger-Kaiju clashes. Afterward, they tackled the complicated techvis required throughout the shoot. This included virtually testing the mechanics of the digital and practical Jaegers, testing the gimbal systems of the practical cockpits, establishing the range of motions for the actuator systems the pilots drive with their arms and legs, designing the on-set hydraulics that make the boats and actors appear as if they’re tossed on the turbulent waves, and, finally, mapping out the stage, camera, and greenscreen layout for shots requiring complex rigged stunt work and camera moves.

For the first test sequence, says Previs Lead Craig Calvert, del Toro’s main goal was to nail the feel of the film. Lens choice, composition, and shot pacing were priorities. “We spent a lot of time working with the sense of scale so that these incredibly huge beings could move fast enough to be fierce yet slow enough to be believable. We were still at a stage where the actual size of the Jaegers and Kaiju was up for debate, and our tests helped inform those choices.”

Furthermore, since the designs of the Kaiju and Jaegers were still in the embryonic stage, previs enabled del Toro to flesh out their movement, morphology, and mechanics. To that end, the team also performed rigid-body simulations to test the real-world physics and dynamics involved in objects clashing at such a massive scale.
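That balance of “fast enough to be fierce yet slow enough to be believable” has a simple physical basis, which is essentially what the scale and rigid-body tests probe. As a back-of-the-envelope illustration (not from the article), the time of any gravity-driven motion scales with the square root of size:

```latex
% h: drop height, g: gravitational acceleration, s: uniform scale factor
t = \sqrt{\frac{2h}{g}}
\qquad\Longrightarrow\qquad
t' = \sqrt{\frac{2(s\,h)}{g}} = \sqrt{s}\;t
```

So a combatant on the order of 50 times human height should fall, swing, and topple roughly sqrt(50) ≈ 7 times slower than a person to read as massive rather than miniature – exactly the kind of timing the previs tests were tuning.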

While establishing the mood, framing, and staging of the battles, the previs also helped to inform the set design. “For example,” says Calvert, “the animation and camera work we designed for the fishermen on the [ocean] were later used to inform the technical specs of the practical rig for the boat.”

Del Toro required techvis for much of the set construction, specifically coordinating the props and greenscreens with the ambitious camera work. Borrowing plans from the art department, artists re-created the sets in Maya and then blocked in the shots from the storyboards.

“We explored the lenses and compositions described in the boards, identifying potential issues, such as a camera needing to be far higher than the stage could allow,” says Calvert. “Also, identifying the optimal number and placement of greenscreens without affecting anything artistically was a frequent problem we helped solve. Testing all of this in the computer ahead of time allowed us to feed back into the art department, suggest layout changes or, if no physical solution was present, show the storyboard team the constraints they were working with.”
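The arithmetic behind those checks is straightforward, and a small illustration (not Mr. X’s actual tools) shows the flavor: given a lens and camera gate, you can compute how far a camera must sit to hold a set piece in frame, and how wide a greenscreen must be to fill the background at a given distance.

```python
# Techvis-style lens and coverage arithmetic (illustrative only).
import math

def horizontal_fov(focal_length_mm, sensor_width_mm=24.89):
    """Horizontal field of view in radians (default: approximate Super 35 gate)."""
    return 2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm))

def distance_to_frame(subject_width_m, focal_length_mm, sensor_width_mm=24.89):
    """Camera distance needed to fit a subject of a given width in frame."""
    fov = horizontal_fov(focal_length_mm, sensor_width_mm)
    return (subject_width_m / 2.0) / math.tan(fov / 2.0)

def coverage_at_distance(distance_m, focal_length_mm, sensor_width_mm=24.89):
    """Frame width (e.g., required greenscreen width) at a given distance."""
    fov = horizontal_fov(focal_length_mm, sensor_width_mm)
    return 2.0 * distance_m * math.tan(fov / 2.0)

# Example: how far back must a 27 mm lens be to hold a 12 m-wide set piece,
# and how wide a greenscreen is needed 20 m behind that action?
print(distance_to_frame(12.0, 27.0))            # ~13 m
print(coverage_at_distance(20.0 + 13.0, 27.0))  # ~30 m of screen width
```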

Throughout the shoot, del Toro also employed a durvis system for virtually walking through scenes in the Shatterdome, a massive structure that houses the Jaegers. To help the director understand what would be added beyond the actual stage, Mr. X utilized a real-time, iPad-based “augmented-reality” setup. Standing on the stage, which included a small part of a decorated wall and the foot of one Jaeger, del Toro was able to use the iPad as a camera, seeing the set extension in real time and identifying where other CG objects were located.

“They could frame up shots then and there, point their cameras in the same direction as the iPad, and be confident they were looking at the cockpit of the towering robot second from the left,” explains Calvert.

When ILM was chosen as the main effects vendor, Mr. X handed off its meshes, Pixologic ZBrush sculpts, animation, and texture files so the effects team could hit the digital ground running. Like The Third Floor’s Constantine, Calvert finds that sitting down with the director and stewarding that vision at the seminal stage is one of the most rewarding aspects of the job. Nonetheless, “it’s always a great feeling when one of your ideas surprises a director and later makes it into a film. A great previs artist needs to be a jack-of-all-trades. [He or she] needs to be able to animate and understand how camera lens choices affect the shots” – and, as Pacific Rim proved, “the ability to understand architectural diagrams and solve technical problems on the fly is a huge asset.”

Man of Steel

While Jaegers were squaring off against the Kaiju, Zack Snyder’s reworked Superman saw the son of Jor-El return to Earth, meet Lois Lane, and tangle with General Zod in ever-escalating spectacles of mass destruction. One of the year’s most effects-intensive movies, Man of Steel launched into the box-office stratosphere on the strength of almost two years of previs work, helmed by the digital wizards at Pixel Liberation Front.

“We came on early in preproduction, when we were pulled in by Production Designer Alex McDowell,” says PLF’s Previs Supervisor Kyle Robinson. “Initially, we started off building environments based on some storyboards Zack had drawn himself. We were doing a lot of virtual location scouting, taking Zack into Alex McDowell’s set locations and letting him [use a virtual camera to] scout it for himself.”

PLF would animate the environments, vehicles, and characters, and then, using a Vicon virtual camera system coupled with Autodesk MotionBuilder, immerse Snyder completely in the realm of the scene – be it on Krypton or in Smallville. Snyder could see the sets, characters, and vehicles in full scale and get a 1:1 relationship with the CG elements. “Zack was always after getting the action, composition, and personality of the camera work correct,” says Robinson.

Indeed, the “personality” of the camera was critical in Man of Steel, especially during the combat sequences featuring the military, Superman, and General Zod’s minions, where Snyder aimed hard for a handheld, spontaneous feel, something redolent of a war correspondent dropped right into a battlefield. To that end, he favored a lot of whip pans and rapid zooming to catch snippets of action in a chaotic, documentary approach.

“He was looking to get a lot of ‘embedded journalist-like photography,’” says Robinson, “so there was a lot of handheld, quick pans and snap zooms, as if you were someone right there on the spot trying to focus on all the action.” Nowhere was this approach more evident than in the opening battle on Krypton, especially when Jor-El is flying around on his winged H’Raka.

Once the postvis was complete, PLF packaged up its Maya files and sent them to MPC, Double Negative, Weta, and the other vendors servicing the show. “Those files gave them the starting place where we stopped, so there was no need to reproduce shots or any loss in production time,” says Robinson.

The virtual location scouting system, including the recording and keyframing of stunt performers in color-toned suits using the Vicon system at Snyder’s studio, was a godsend throughout Man of Steel’s long preproduction process, allowing Snyder, McDowell, and the rest of the visual effects team to plan scenes with a speed and efficiency that greatly streamlined production. Robinson sees advancement in the virtual camera, in its functionality, portability, and versatility, as a cornerstone in the development of previs.

“Potentially, the system could double for the virtual camera environment as well as the mocap, so when we travel around and they want to do any scouting, or a director wants to see a character do a specific action, for example, we could set up the virtual camera and the scanning system, put a stunt person or actor in a suit, and switch it over to motion capture. We could then target that mocap data straight to the skeletons of our previs characters,” explains Robinson.

“The limitations of the system right now,” cautions Robinson, “come in the real-time tailoring of the shot. For example, we could do a cycle animation for a director to shoot involving two cars driving down the street. But if the director wants one to swerve, someone has to stop and do that animation and import it across. That’s the current bottleneck, although it is getting better, with some of the 3D mocap systems starting to make it easier to have real-time adjustments made on the fly.”
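For readers picturing what “targeting mocap data straight to the skeletons of our previs characters” means in practice, the sketch below shows one bare-bones approach – mapping incoming mocap joints onto a previs rig by name and constraining them in Maya. The joint names and namespaces are hypothetical, and this illustrates the general idea rather than PLF’s pipeline.

```python
# Minimal name-mapped mocap-to-previs attachment in Maya (Python).
# Joint names and namespaces are hypothetical placeholders.
import maya.cmds as cmds

# Map incoming mocap skeleton joints to the previs character's joints.
JOINT_MAP = {
    'mocap:Hips':       'previsChar:hips_jnt',
    'mocap:Spine':      'previsChar:spine_jnt',
    'mocap:LeftUpLeg':  'previsChar:l_thigh_jnt',
    'mocap:RightUpLeg': 'previsChar:r_thigh_jnt',
    'mocap:LeftArm':    'previsChar:l_upperArm_jnt',
    'mocap:RightArm':   'previsChar:r_upperArm_jnt',
    'mocap:Head':       'previsChar:head_jnt',
}

def attach_mocap(joint_map):
    """Drive previs joints from live or recorded mocap joints via constraints."""
    constraints = []
    for src, dst in joint_map.items():
        if not (cmds.objExists(src) and cmds.objExists(dst)):
            continue  # skip joints missing from either skeleton
        # Orient-constrain rotations; the hips also take translation.
        constraints.append(cmds.orientConstraint(src, dst, maintainOffset=True)[0])
        if dst.endswith('hips_jnt'):
            constraints.append(cmds.pointConstraint(src, dst, maintainOffset=True)[0])
    return constraints

attach_mocap(JOINT_MAP)
```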

Previs proved helpful for Director Baz Luhrmann on the set of The Great Gatsby.

The Great Gatsby

Previs is also finding a home in intimate period dramas, albeit ones amped up by Baz Luhrmann’s loud and flashy direction, such as The Great Gatsby. A visually audacious filmmaker, Luhrmann needed a guide, someone to help him navigate the film’s complex, musical set pieces. To that end, he turned to previs specialist Cameron Sonerson, whose credits include The Adventures of Tintin and next year’s Maleficent.

The film was shot on five stages at Fox Studios and at many locations in and around Sydney, and a single shot would often be an amalgam of numerous plates. It was a logistical nightmare: factoring in the partial sets that were built, fitting in the stereo camera rig, deciding what would be digital and what would be practical, and working out how to combine multiple locations into a single frame. “We had meetings with Baz where he’d discuss his vision for scenes. While we had storyboards, oftentimes we only had an excerpt from the script, and we’d design and pitch ideas to Baz,” says Sonerson.

One of the most complicated set pieces, the New York High Line sequence, saw the main character and his friend drive from Gatsby’s mansion to a speakeasy downtown, with the elevated railroad track of the High Line, which cuts through Manhattan’s Lower West Side, looming in the background. The drive went from West Egg, Long Island, through the Valley of Ashes, then past the T.J. Eckleburg sign to New York City.

The production crew also needed Sonerson to provide detailed techvis for layouts of actually achievable shots. With so many large-scale and elaborate sets – for Dan Cody’s yacht, the Buchanans’ mansion, and Gatsby’s party scenes, for example – space would often be limited, requiring careful plotting of camera placement, technocrane movement, speed timings, and so forth.

While Sonerson relied on Adobe’s Photoshop and his custom Maya tool set throughout previs, for postvis he turned to The Foundry’s Nuke for compositing. After the shoot, Sonerson’s team tracked the camera and postvis’d the plates over both the previs background and set extensions, delivering the rough composite to editorial to work out the final cut. Once that was approved, Sonerson handed off the previs environments to the final effects vendors, who then used them for their initial layouts and for crafting high-res models.
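For readers curious what that postvis compositing step looks like in practice, here is a minimal Nuke Python sketch of a plate-over-previs temp comp. The article doesn’t detail Sonerson’s actual setup, and the file paths and frame range below are placeholders.

```python
# Minimal postvis comp in Nuke (Python): previs render layered over the plate.
# File paths and frame range are placeholders.
import nuke

plate  = nuke.nodes.Read(file='/shots/gatsby_010/plates/gatsby_010_bg.####.exr')
previs = nuke.nodes.Read(file='/shots/gatsby_010/previs/gatsby_010_env.####.exr')

# Layer the previs set extension over the live-action plate.
merge = nuke.nodes.Merge2(operation='over', inputs=[plate, previs])

# Write a quick temp comp for editorial.
out = nuke.nodes.Write(file='/shots/gatsby_010/postvis/gatsby_010_postvis.####.exr',
                       inputs=[merge])
nuke.execute(out, 1001, 1096)  # render frames 1001-1096
```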

Prevising the Future

With each show, Sonerson is constantly inventing new tools for his previs kit. Mostly Maya-based, he says, the kit allows for the rapid setup of a pipeline and the output of useful data for on-set, location, and editorial use. For Gatsby, he added a multi-camera rig and an exporter tool to process and output an environment for on-set viewing on an iPad.

“What’s currently lacking in previs,” Sonerson adds, “is that bridge between creating the shots and the output of actually useful data describing how they’re to be achieved on set. The advent of tablets and GPS-based software made a big difference for Luhrmann, especially when coordinating the camera with a lot of bluescreen and background replacements. As VFX shots become more complex, tools that enable the crew to understand how to set up a shot to achieve final VFX will help not only on set, but will save a lot of time in postproduction, as well.”

Frankel echoes Sonerson’s desire to see better analytic tools in Maya for the output of more useful data for the on-set crew. “On a recent production for Universal, we worked on an incredibly complicated shot that had to be precisely engineered. The tolerances were so small that we were making pan-and-tilt adjustments of less than one-tenth of a degree. And we had to analyze camera speed and acceleration to make sure the rigs were capable of starting and stopping within the available space,” he says.

Having worked on numerous 3D films, Sonerson has already built custom digital camera rigs to emulate the stereo cameras used on set, along with data output tools for Maya. He says, “Per-frame data from the previs would then be exported from Maya for use by the crew for placing and moving the camera, staging actors and props, and determining focus, interocular distance, and convergence settings.”
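As an illustration of what such a per-frame export might look like, the sketch below steps through the timeline in Maya and writes camera and rig values to a CSV report for the crew. The node names and the stereo attributes are hypothetical and would depend on the rig; this is a sketch of the idea, not Sonerson’s tool set.

```python
# Sketch: dump per-frame camera and stereo-rig data from Maya to CSV
# for the on-set crew. Node and attribute names are hypothetical.
import csv
import maya.cmds as cmds

CAMERA = 'stereoRig:centerCamShape'   # hypothetical stereo rig center camera
RIG    = 'stereoRig:rig_ctrl'         # hypothetical rig control node

start = int(cmds.playbackOptions(query=True, minTime=True))
end   = int(cmds.playbackOptions(query=True, maxTime=True))

with open('/tmp/shot_010_camera_report.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['frame', 'tx', 'ty', 'tz', 'rx', 'ry', 'rz',
                     'focal_mm', 'focus_dist', 'interaxial', 'convergence'])
    for frame in range(start, end + 1):
        cmds.currentTime(frame, edit=True)
        cam_xform = cmds.listRelatives(CAMERA, parent=True)[0]
        t = cmds.xform(cam_xform, query=True, worldSpace=True, translation=True)
        r = cmds.xform(cam_xform, query=True, worldSpace=True, rotation=True)
        writer.writerow([frame] + t + r + [
            cmds.getAttr(CAMERA + '.focalLength'),
            cmds.getAttr(CAMERA + '.focusDistance'),
            cmds.getAttr(RIG + '.interaxialSeparation'),   # hypothetical attrs
            cmds.getAttr(RIG + '.zeroParallaxDistance'),
        ])
```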

Frankel would also like to see better analytic tools within Maya for testing stereo 3D. Right now, he says, they operate in a hunt-and-peck mode: animators can make adjustments quickly and evaluate how a shot is working, but it’s difficult to analyze and evaluate a stereo composition. It would be amazing, Frankel adds, to have access to image-analysis tools similar to what stereographers have on set.

Along with greater integration with the on-set production crew, PLF’s Robinson would also like to see a future where previs is partnering with, and feeding, the final vendor more closely, entwining their pipelines, techniques, and technology to help expedite their finished work. That aside, he’d also like to see a quicker mocap system, streamlining the process of using the virtual camera and the MVN suit so it can be quickly deployed for real-time exploration of a scene.

While the introduction of real-time depth of field in Maya’s viewport has been a huge help, the need for advancement in real-time technology is always pressing. Along with better real-time lighting, The Third Floor’s Ramirez would also like to see advancement in real-time shadows.

“It’s exciting to see advances like Viewport 2.0, and game engines capable of real-time shadows, depth of field, and so on,” adds Constantine, “but there are still limitations to those technologies in terms of speed, flexibility, ramp-up time, and unpredictability when used in a fast-paced production environment.”

Mr. X’s Calvert believes the tools are slowly improving, but compared with the type of imagery that can be obtained from today’s game engines, they’re still lagging. “Using the Crytek or Unreal engines is something we’re investigating for the future,” he says. “A much, much better viewport in Maya is also high on my wish list.”

The Third Floor’s Markel adds his voice to the choir: “The single greatest improvement right now would be a better viewport display in Maya. If we can keep the speed but improve quality, that’s a benefit. Adding reliable ways to get realistic lighting would be of great value to the DP, but it has to be done in the viewport, and without rendering.”

Another potential game-changer looming on the horizon is the type of augmented-reality setup used by del Toro and Luhrmann. “We’re seeing it gaining traction,” says Calvert. “Tools that easily allow a director to step within a virtual set and operate alongside digital characters will become more widely accepted.”

In the near future, Frankel envisions previs evolving into the “nexus of interdepartmental communication,” and Markel adds that the key to maximizing its potential in this new role is making previs a separate department with its own budget, into which all departments could contribute relevant information.

Martin McEachern is an award-winning writer and contributing editor for CGW.