Hero Effects
Volume 39, Issue 4 (Jul/Aug 2016)


Two superhero sequels, an alien re-invasion, and another adventure that boldly goes where no man has gone before are among the blockbuster movies lined up for summer. Technological advances make characters more real and more in-the-moment than ever before, environments more imaginative and breathtaking, and action beyond thrilling, as the X-Men and the Teenage Mutant Ninja Turtles battle new opponents, the international community faces off against a second alien invasion, and the Enterprise crew fends off an alien threat of its own.


Independence Day: Resurgence

It was 1996 when alien invaders first attacked Earth in hopes of eradicating the human race and taking over the planet. The film was Independence Day, and it grossed more than $306 million at the box office. Directed and written (with Dean Devlin) by Roland Emmerich, the film went on to win an Academy Award for its visual effects.

Fast-forward 20 years, and Emmerich, along with many original cast and crew, including VFX Supervisor Volker Engel and VFX Producer Marc Weigert, returns for Twentieth Century Fox’s Independence Day: Resurgence. In this long-awaited sequel, the aliens are back, in much stronger force than before, and trying yet again to take over the planet.

The film has just under 1,800 visual effects shots completed by numerous facilities, including Weta, Scanline, MPC, Image Engine, Digital Domain, Cinesite, Luxx Studios, Trixter, and Engel and Weigert’s own Uncharted Territory.

“In the first movie, we were restricted with showing the aliens,” Engel says. “It was back in the days with puppets and some creature-type work, where we could only show the upper torso. We used lots of fog to hide everything. In the new film, the aliens play a much bigger part. We show the alien versions from the first movie, but we also have a subspecies of soldier aliens and a much bigger version, which is the queen.”

All the alien work was completed by Image Engine, except for the queen, which was done at Weta.

There’s also more interaction with the actors and a chase between the queen and a school bus around Area 51 on a lake in broad daylight. “This movie has it all: set extensions, big disaster pieces, a moon base, a space battle, a dogfight, bluescreen shoots… everything,” says Engel.

The film has at least four times as many visual effects shots as the original, according to Engel. Also, since the first film, the technology used to complete the work has advanced considerably.

“Back in the day, 20 years ago, it was about 90 percent miniatures on the first movie, and probably less than 10 percent of that was computer-generated. Here, there’s not a single miniature in the whole movie,” Engel says.


As Weigert points out, all the vendors are using everything in the book for their work – mostly Autodesk’s Maya for animation, a lot of Side Effects’ Houdini for particle effects, and so on. Some vendors, including Uncharted Territory, are using Autodesk’s 3ds Max. For rendering, they use Chaos Group’s V-Ray, Solid Angle’s Arnold, and DNA Research’s 3Delight. Alas, software compatibility can become a challenge when splitting up the work.

“Most of the single parts are too big for any single vendor. But you also want to avoid having vendors doing double work. You don’t want the next vendor having to reinvent the wheel in terms of rendering and trying to figure out the lighting and the shading that’s involved,” says Weigert. “That’s something that’s difficult, especially if you’re dealing with animated characters. That’s why we kept all the regular aliens with one vendor, Image Engine, and then all the queen shots with one vendor, Weta.”

The entire opening moon sequence in the first 20 minutes of the film was completed by MPC, with a handful of shots by Trixter. Luxx in Germany completed the establishing shots of Washington, D.C.

“As the mother ship is approaching Earth, Scanline takes over, and all the destruction on Earth is completed by Scanline,” says Weigert. “The dogfight outside of the mother ship was completed by Digital Domain, and then our heroes are flying inside the mother ship and that’s all Uncharted Territory.”

The overall size and scale of the film – a signature of Emmerich’s – challenged the various teams, but none more so than the scale of the mother ship. “It’s 3,000 miles in diameter. When it hits the ground, it either creates a big tsunami or smashes through half of Washington, DC. It’s just of an enormous scale,” says Weigert.

“It was a little like the Death Star [from Star Wars]. You had to get very detailed. It had these hundreds of thousands of tiny little lights on it, but this ship is so alien that you don’t even have these lights,” Weigert continues.

“You have this kind of light pattern on it, and as you start getting closer and closer to it, you have to figure out how to show the surface scale, which is a whole new challenge.”

Previs (by Uncharted Territory and Method Studios) played a critical role. Emmerich relied heavily on an Ncam system to frame every bluescreen set-extension shot. On set, he was able to see the gigantic hangar of the Area 51 interior with thousands of aircraft inside.

Engel and Weigert had a rare opportunity to sit down with Emmerich one last time to make sure all the visual effects shots worked well with all the live action. “Wow, it comes together really nicely,” he says. – Linda Romanello

Star Trek Beyond

The third chapter in the current Star Trek franchise recently hit theaters, with a new face at the helm. JJ Abrams, who directed both 2009’s Star Trek and 2013’s Star Trek Into Darkness, took on the role of producer for the new release, Star Trek Beyond. Assuming the director role was Justin Lin of Fast and Furious fame, who applied his own unique style to the Paramount Pictures release, while respecting fans’ love of the Gene Roddenberry property.

The Enterprise crew of Captain Kirk (Chris Pine), Spock (Zachary Quinto), and Uhura (Zoe Saldana) all return for the latest sci-fi adventure, which features more than 1,400 visual effects shots that help take audiences to the furthest reaches of the galaxy.

Double Negative, with locations in Vancouver and London, served as the lead VFX house on the film, handling nearly 500 shots among its studios. Kelvin Optical and Atomic Fiction also contributed nearly 400 shots, and another 200-plus were created in-house by the production team of Bad Robot/Sneaky Shark/Perfect Storm Entertainment.


Dneg handled much of the heavy lifting when it came to the VFX, but elements were often shared among the different VFX houses. For Dneg, this included the Enterprise, the warp-speed effects, and the Yorktown base. Kelvin Optical designed the CG marauder soldiers, as well as worlds and environments, and Atomic Fiction created many of the elements that were then used by Dneg and Kelvin in their respective shots. Kelvin and Atomic also collaborated on the unfamiliar planet that the crew encounters.

Dneg’s Peter Chiang served as overall VFX supervisor, and Raymond Chen led Dneg’s Vancouver team.

“Justin comes from this gritty Fast and Furious-type photography, and that aspect had to translate to this Star Trek film,” says Chiang. “We did a lot towards dirtying things up, making sure it has that filmic, photographic look and going more for the photorealistic, blown-out, photochemical reaction.”

According to Chiang, he and Lin realized from the beginning that they would have to observe certain laws of the franchise or suffer the wrath of Trekkie hate mail. “Obviously, this is a Justin Lin film,” says Chiang. “We talked a lot about the aesthetic. I knew we were going to present things in a Justin Lin way – even down to the design of the Enterprise and what it goes through.”

The team made minor alterations to the Enterprise, created by ILM for Into Darkness, keeping the spirit of the craft but making sure it fit the narrative Lin was striving for. In the last film, the Enterprise gets damaged and is rebuilt, so there is a shot at the very end that shows a slightly retrofitted Enterprise, which was built only for that one shot. That is the ship Dneg started with, fleshing it out and making it into something usable for the whole take-down sequence.

The studio’s shot count – the Vancouver location was responsible for 274 shots, with London handling another 197 – doesn’t necessarily represent the true extent of Dneg’s work. “A lot of these shots are very large,” Chen explains, “especially in the space sequences. There are rotating and traveling cameras. To say it is one shot is underplaying a little bit of the complexity.”

Beyond the space sequences, there’s also a lot of on-the-ground action that relied heavily on VFX. “The crew is stranded on a planet,” says Chen, “which is partially set extensions [and] some full-CG environments. Also, a lot of stuff happens within the bridge or the interiors of the different spaceships. The environments themselves are quite massive. We had to create the environment of a space station – the Yorktown – which is essentially like building an entire city. It’s a massive environment – 1.3 trillion polygons! Almost everything in there is a build item – every building, every bench, every lake, and waterway. It’s a massive undertaking.”


Dneg runs a Linux pipeline and uses Maya for most of its modeling and animation. The studio relies on The Foundry’s Nuke for compositing. For this film, they employed Isotropix Clarisse for lighting.

When asked which shots are their favorites, Chiang and Chen had different replies.

“It was a fantastic opportunity to give a new version of warp speed,” says Chiang.

“Right from the outset, I presented the idea of folding space and gravitational lensing – that sort of stuff, the science of it. They did a stretch idea in Into Darkness, and they always do that streak, slit-scan type idea, from Star Trek: The Motion Picture all the way through Into Darkness. I really wanted to sell it more as the science of folding space. We end up with a unique shot and look of the way in which the warp bubble works.”

Chen points to the film’s environments. “Not to reveal too much, but there are a couple of worlds that we created that are unique. The Yorktown is supposed to be the pinnacle of Federation civilization. It’s a large city out in space that has multiple planes of gravity and is built for docking spaceships and housing different species of Federation civilizations. We spent a considerable amount of time building it and are proud of the final results.” – Marc Loftus

X-Men: Apocalypse

Marvel’s X-Men are back on the big screen this summer fighting Apocalypse, the first and most powerful mutant from the X-Men universe.

As the lead VFX house, MPC Film likewise marshaled its forces in Montreal, London, and Bangalore, India, to complete just under 1,000 shots. Digital Domain and Rising Sun also contributed to the effort, all under the eye of VFX Supervisor John Dykstra.

Priority number one, says James Rustad, CG supervisor at MPC, was building and destroying the city of Cairo for the final battle sequence. “It was the subject of the first meeting we had on the first day,” he adds. No live-action plates of the city were in hand at the time, but the previs (by The Third Floor) gave MPC a good concept of what needed to be done.

“Traditionally, we build an extra 50 meters around the plate, and the rest is matte painting,” Rustad explains. “But some aerials needed to be full-CG, so we decided to build everything in CG. That meant we could light things consistently and address creative notes more efficiently than making changes to a matte painting.”

To construct Cairo as it was in 1983, the setting for the story, MPC obtained satellite images of the city as well as older maps and reference photography. “We chose two or three zones and re-created them as accurately as possible,” Rustad says. Given the population density of the city, those zones encompassed 500 to 1,000 buildings.

“We also built four or five backgrounds that we could cut and paste around the area, with modifications and rotations. Everything looked very organic and natural – a purely imagined city wouldn’t have felt right,” adds Rustad.
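The article doesn’t describe MPC’s proprietary layout tools, but the cut-and-paste approach Rustad describes – reusing a handful of pre-built neighborhood blocks with rotations and slight modifications so repeats read as organic – can be sketched generically. This is a simplified, hypothetical example; all names and parameters are illustrative:

```python
import random

def layout_city(block_library, grid, seed=0):
    """Tile a city grid by cutting and pasting pre-built neighborhood
    blocks, rotating and slightly scaling each placement so the same
    few blocks don't read as obvious copies."""
    rng = random.Random(seed)  # seeded so the layout is reproducible
    cols, rows = grid
    placements = []
    for row in range(rows):
        for col in range(cols):
            placements.append({
                "cell": (col, row),
                "block": rng.choice(block_library),
                "rotation": rng.choice([0, 90, 180, 270]),  # degrees
                "scale": rng.uniform(0.95, 1.05),  # slight size variation
            })
    return placements

# Hypothetical library of re-created neighborhood blocks
city = layout_city(["block_a", "block_b", "block_c", "block_d"], grid=(20, 20))
```

A real layout tool would, of course, also respect roads, landmarks, and the matte-painted outskirts, but the core idea is the same: a small library of hand-built zones plus randomized placement.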

It took about eight months to build and destroy Cairo, adding roads, damaged cars, rubble, and debris to the cityscape. Modelers used Autodesk’s Maya and Pixologic’s ZBrush to create the library of buildings, which were textured with The Foundry’s Mari and Adobe’s Photoshop. Proprietary layout tools charted the neighborhoods. Digital matte paintings, based on location footage shot in Jordan, formed the outskirts of the metropolis. Side Effects’ Houdini was deployed to pre-shatter and destroy Cairo.

Digital Domain, meanwhile, worked on the opening sequence set in ancient Egypt as well as the Auschwitz environment and some one-off shots for the world destruction.

For the Egypt sequence, the studio did a large variety of work – some large-scale environments, slave crowds, and extensive matte painting. Before the large city environment could be lit within Maya, custom scripts were needed so the software could handle the heavy amounts of geometry.


The crowds were also challenging, requiring the group to instance a number of character variations with custom animation cycles across a large environment.

“We ended up rendering close to 300,000 agents, which we had to deliver in stereo,” says Carlos Cidrais, Digital Domain’s lead lighter on the film.
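Digital Domain’s crowd tools aren’t detailed in the article, but the general technique – instancing a handful of character variations and offsetting each agent into its animation cycle so the crowd doesn’t move in lockstep – can be sketched in a few lines. This is a hypothetical, heavily simplified example; the names and numbers are made up:

```python
import random

def place_agents(count, variations, cycle_lengths, area, seed=0):
    """Scatter crowd agents over a rectangular area, giving each a
    random character variation and a random offset (in frames) into
    that variation's animation cycle."""
    rng = random.Random(seed)  # seeded for repeatable renders
    width, height = area
    agents = []
    for _ in range(count):
        variation = rng.choice(variations)
        agents.append({
            "position": (rng.uniform(0, width), rng.uniform(0, height)),
            "variation": variation,
            # Desynchronize the walk/work cycles across the crowd
            "cycle_offset": rng.randrange(cycle_lengths[variation]),
        })
    return agents

agents = place_agents(
    count=1000,
    variations=["crowd_a", "crowd_b", "hero"],
    cycle_lengths={"crowd_a": 96, "crowd_b": 120, "hero": 72},
    area=(500.0, 500.0),
)
```

At the scale Cidrais describes, the instancing would happen at render time rather than as live scene geometry, but the per-agent data – variation, placement, cycle offset – is essentially what gets fed to the renderer.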

For the Auschwitz sequence, Digital Domain not only undertook a faithful re-creation of the concentration camp itself, but also tore it apart in a spectacular fashion. “We had to deal with large datasets being generated by the FX department, which we did using V-Ray proxies,” says Cidrais.


The scene when Angel transforms into Archangel was probably the most complex scene for MPC. “It took a good couple of months to work out how to do it,” says Rustad. “We had concept art from Fox, but nobody knew what it would look like when Apocalypse reconstructs Angel’s wings in metal. Our idea was to have a fractal growing and meshing process whereby the metal pushes the feathers out. But we had to work out which bits would be hand-animated, which would be simulations, which would be done in shading.”

The feathered wings feature about 100 primary feathers with special shaders reflecting a silvery light. There were only about 30 of the heavier metal feathers, so the loss of the extra feathers had to be accounted for. “We deformed and bent the feathers to the shape of the metal feathers so they lined up and we could wipe from feather to metal,” Rustad explains. “The extra feathers were pushed out and floated to the floor.”

A digital scan of Angel’s upper body was used to create a replica torso; animated formers represent his ribs sliding under the skin during the transformation – a “rather gruesome moment that has people squirming in their seats,” says Rustad. – Christine Bunish, Linda Romanello

TMNT: Out of the Shadows

Everybody’s favorite heroes on the half shell are back, and they’re facing multiple threats as they strive to safeguard the world. Shredder has returned and joined forces with mad scientist Baxter Stockman and henchmen Bebop (a warthog) and Rocksteady (a rhino). And Krang is leading an alien invasion in the skies over New York City. That’s a lot for any superhero to contend with.


Teenage Mutant Ninja Turtles: Out of the Shadows comes on the heels of the successful 2014 TMNT film. “We knew this picture would be double the size of the first movie,” notes Pablo Helman, VFX supervisor at Industrial Light & Magic, who assumed the same role on the original feature.

As the lead VFX house, ILM divided up 1,385 VFX shots among its San Francisco, Vancouver, London, and Singapore locations. ILM also supervised Quebec-based VFX house Hybride; additional VFX vendors included Ghost VFX in Denmark, Base FX in Beijing, Whiskytree in San Rafael, California, and Atomic Fiction in Oakland, California.

ILM upped the ante for this sequel with motion capture. “The performance-capture system we used on the first film had not been very friendly in production nor with the animators. We needed a system that would be volume-friendly, animator-friendly, and a lot simpler to use,” Helman notes.

In development at the time of the first picture, ILM’s proprietary Muse 2.0 is now a full-blown, high-resolution performance-capture system that places 138 markers on the actor’s face and body. It even tracks pupils to show eye movement. Using Muse for the TMNT sequel enabled ILM “to get an incredible amount of fidelity and a lot of nuances in the performances” of the four Turtles, Helman says.

Comedic performances are built from the best of a number of takes, so it was crucial for Muse to assist in that process.

In contrast to the Turtles, the non-anthropomorphic Bebop, Rocksteady, and Krang were basically keyframed. “The actors gave performances in ADR with facial performance capture recorded as reference,” says Helman. Autodesk’s Maya was used to enhance and integrate both the keyframed and motion-captured animation.

Audiences will notice that Out of the Shadows takes its title quite literally. “This film is about two stops brighter than the first movie,” says Helman. “We now see more of the characters. The design changed to soften some lines in the assets, and the animation changed to make the characters more appealing.”

The movie marks the first time that ILM was tasked with creating river rapids.

“We’ve done lots of oceans and waves but no rapids,” notes Helman. Live-action aerials and boat work were shot in Brazil. ILM then created the rapids, water splashes, mist, and their interaction with the characters.

Animators used the fluid sim tool in ILM’s proprietary Zeno package, matched the reddish clay color of the real rapids, and married the fluid sims with plates of a tank, half of a C-17 aircraft, the Turtles, and Bebop and Rocksteady. – Christine Bunish