Defying Gravity
August 16, 2011


Ant Farm and Zoic Studios create Inversion game trailer, setting the tone for the upcoming title release
Ant Farm recently engaged in a unique collaboration with Zoic Studios to create a game trailer for the upcoming title Inversion. Known for creativity at the forefront of game advertising, Ant Farm and Zoic produced a tone piece for the game that resonates as a movie trailer with a thrilling hint of what’s to come.

Based on a compelling original concept by Ant Farm, Zoic Studios’ Loni Peristere directed live-action vignettes filmed with the Canon 5D & 7D by accomplished DP Vincent Laforet to replicate the look of user-generated video. Zoic Studios’ 3D team then created photorealistic floating CG objects that were seamlessly integrated into the live-action scenes. These were then edited with Ant Farm’s animated transitions and graphics. The impact is eerie and dramatic, a world where gravity is unexpectedly flipped, and the events unfold as if captured by its unfortunate witnesses—just like the feature on which the concept is based.



The game will be released in 2012; in the meantime, the game trailer is keeping viewers on the edge of their seats.

According to Leslie Ekker, creative director of Zoic Studios’ commercial division, the crew utilized its extensive library of assets for this piece, leveraging the work it has done before and some new pipeline enhancements to allow for ultra-efficient animation and rendering of CG cars, trucks, people, and debris. The team used Autodesk Maya models and animation, and rendered in Chaos Group’s V-Ray for quick, photoreal looks; custom HDRs were captured on location.
 
“We were able to include much more animation than we had even hoped for because the pipeline we have been refining is so streamlined,” says Ekker.



As for the textures, Ekker explains, “We restored several models from our archives, and some needed V-Ray textures and shaders to run through the new pipeline. We created them using some of the car paint setups we had from other projects, and some custom setups as needed.  There wasn't a need for photographic textures because most of the animation was background, in a heavily affected look with lots of atmosphere, smoke, glare, and noise, often with extreme camera motion.”

The artists also used 3D particle systems to place and animate the myriad floating elements in several shots.  They created a particle dynamic system and used instancing to place hundreds of photographic elements in the scenes. This created a subtle and naturalistic mass floating effect.
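The instancing approach described above can be sketched outside any DCC package. The following is a minimal stand-in in plain Python with invented function names and parameters, not Zoic's actual setup: each instance gets a random position, a reference to one of a pool of photographic elements, and a slow upward drift velocity with slight variation, which is what produces that naturalistic mass-floating look.

```python
import random

def make_floating_instances(n, seed=0, area=100.0, drift=0.5):
    """Scatter n instanced elements through a volume. Each instance
    references one of 12 source elements (standing in for photographic
    cards) and carries a slow, slightly randomized upward velocity,
    mimicking debris floating under inverted gravity."""
    rng = random.Random(seed)
    instances = []
    for i in range(n):
        pos = [rng.uniform(-area, area),   # x
               rng.uniform(0.0, area),     # y (height)
               rng.uniform(-area, area)]   # z
        vel = [rng.gauss(0.0, 0.05),
               drift + rng.gauss(0.0, 0.1),  # mostly upward
               rng.gauss(0.0, 0.05)]
        instances.append({"id": i, "element": rng.randrange(12),
                          "pos": pos, "vel": vel})
    return instances

def step(instances, dt=1.0 / 24.0):
    """Advance the simple dynamic system by one frame at 24 fps."""
    for inst in instances:
        inst["pos"] = [p + v * dt for p, v in zip(inst["pos"], inst["vel"])]
    return instances
```

In a production pipeline the same idea runs inside the particle system, with the instancer swapping in full geometry or cards at render time rather than dictionaries.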



The crew composited in The Foundry’s Nuke, rendered in V-Ray, and used photographic and cinematographic elements shot internally, ranging from toy cars and figures to live humans floating on greenscreen.

All of the plates were shot on the Canon 7D in high-definition video. This format was chosen for its rapid production pace, efficient production costs, and suitability for the desired end result. The comps were all intended to look like they were shot on several smartphones, and so the image degradation process, a dual-platform treatment, had to be anticipated, Ekker says. This required several iterative cycles to home in on the precise settings needed to achieve the look with good control. Since the 7D, like many similar cameras, uses H.264 compression, whose artifacts make clean keying difficult, all greenscreens were shot on an HVX video camera instead.
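The degradation treatment itself can be illustrated with a toy version. This is a rough sketch of the idea only, with made-up parameter values, operating on an 8-bit grayscale frame: add sensor noise, then quantize to fewer levels as a crude stand-in for compression banding. A real treatment would also layer in motion blur, glare, and H.264-style blocking, and would be dialed in over the iterative cycles Ekker describes.

```python
import random

def degrade_frame(frame, noise=8, levels=32, seed=0):
    """Apply a crude phone-camera treatment to an 8-bit grayscale frame
    (a list of rows of pixel values): gaussian sensor noise, clamp to
    the legal range, then quantize down to `levels` brightness steps."""
    rng = random.Random(seed)
    quant = 256 // levels
    out = []
    for row in frame:
        new_row = []
        for p in row:
            p = p + rng.gauss(0, noise)       # sensor noise
            p = max(0.0, min(255.0, p))       # clamp to 8-bit range
            p = int(p // quant) * quant       # quantize (banding)
            new_row.append(p)
        out.append(new_row)
    return out
```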



The big challenge here, as it so often is in this new world, was budget, notes Ekker. “We wanted to create a visceral realistic experience with a grand feature-scale production feel.  The fact that it was to feel like it was shot with phone cameras didn't make that much easier! It still required hundreds of things floating up in the city in several shots, intimate close-up effects, big photoreal animations, and a design and cinematography that felt dynamic,” he adds.

To solve these challenges, the studio used “lots of practical and inventive tricks,” says Ekker. “We needed dozens of cars floating a few blocks from camera, and even closer.  We reused existing in-house assets, and also shot our own floating cars as miniatures on rods manipulated on greenscreen here in our offices!  We had interns, artists and friends sit or lie on swivel stools and shot them ‘floating and thrashing’ on greenscreen, too.”



In addition, the group bought a collection of action figures and posed them floating on greenscreen to form background victims.  Moreover, they used glare and camera motion to make elements hold up better, and retain a visceral real and artifact-affected look. “We created building damage by comp’ing wrecked building elements into the high-rises with corrected perspectives,” says Ekker. “We bought a supertanker model and textured it in basic fashion to float amongst the towers, because it didn't need anything more elaborate, due to lighting and artifacting. We made lemonade from a basket of lemons!”

So, what challenge did the studio face when using this type of live action style to advertise a game, which is all-CG?  Ekker explains: “There are moments of game style animation of floating buildings and debris, and they were created and cut in by the client. We stuck to a very real world, ‘this is really happening’ attitude for our animation. Cars and trucks, people, and debris were all made as real as possible with very little time to refine. We quickly achieved this realism with our great, solid pipeline and quality assets.”

The biggest technical challenge, though, and one that Ekker believes will doubtless go unnoticed, was the 3D tracking in the project.  The Canon 7D is a little better than the 5D, due to its smaller chip size, but the rolling shutter artifact inherent in the video frames simply plays havoc with optical flow tracking software, he points out.
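Why rolling shutter confounds a tracker can be shown with a one-line model. Because each scanline is read out at a slightly later time, a pan smears a rigid scene into a shear that varies with vertical position and with camera speed, so no single rigid camera solve fits the whole frame. The readout time and pan speed below are assumed illustrative numbers, not measured camera figures.

```python
def rolling_shutter_offset(row, rows=1080, readout_ms=30.0, pan_px_per_ms=0.5):
    """Horizontal skew (in pixels) a feature on a given scanline picks up
    during a steady pan. Each row is sampled later than the one above it,
    so lower rows are displaced further: a rigid scene appears sheared."""
    row_time = (row / rows) * readout_ms   # when this row is read out
    return row_time * pan_px_per_ms

# The top of the frame shows no offset while the bottom is pushed
# roughly 15 px sideways (with these assumed numbers), and the shear
# changes every frame as the pan speed changes.
```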



“The distortion it creates varies frame to frame, and when you factor in the crazy camera movement we wanted, getting a solid track is generally thought to be impossible. We shot some tests early in the pre-production schedule, and fed them into our tracking pipeline. The results were concerning,” Ekker says. “We shot more tests, evaluating how much movement we could tolerate, with both the 5D and then 7D, and found that there were ways, with care and hand-tweaking, to get usable tracks from the footage. This informed the shooting and the tracking processes together. We went into the shoot with confidence, and the selects were brought in to tracking with confidence and experience to allow for success.”

Ekker then adds: “Don't try this at home kids, but we did manage to get solid tracks from the craziest moving shots, and it shows in the work. This new way of shooting no longer gives our artists pause. We feel new confidence around the new paradigm of production that is increasingly necessary due to budget and schedule constraints we face so often today.”