|The question is: If the natural elements in a film must behave in an unnatural way, how can a director and visual effects supervisor keep the visual effects from becoming the star of the film?
“Night struck the word ‘magic’ out of our vocabulary,” answers Industrial Light & Magic’s visual effects supervisor Pablo Helman, referring to M. Night Shyamalan, who directed The Last Airbender.
The Paramount Pictures film, an adaptation of the animated television series Avatar: The Last Airbender, tells the story of Aang (Noah Ringer), who lives in a world where “benders” can control the elements fire, earth, air, and water. As the story begins, the Fire Nation has torn this world apart, and 12-year-old Aang, an avatar and the last known Airbender, must restore balance. Only an avatar can control all these elements, but Aang must learn how.
“Night [Shyamalan] came to ILM and talked to the crew about the spirituality behind why things moved,” Helman says. “Directors don’t often come and talk to the crew, but it was really important for him that they knew this wasn’t magic. It wasn’t a fantasy. It was a grounded story. If a soldier, a Firebender, moved fire from one place to another, it took energy to do that. It wasn’t a magic trick. I think that affected the work.”
That work included re-creating the fire, water, air, and earth bending from the animated series as realistic elements, turning cartoon animals into believable live-action creatures, and creating mythical environments. To do this, ILM created new technology for simulating fire and controlling water, helped Shyamalan direct the animated characters, and refined the studio’s system for creating digital doubles. ILM’s work began two years before the film’s release.
“Our task was to come up with images that no one has seen before,” Helman says, “which means we haven’t seen them, either. So, we didn’t have technology in place to do the film, but then, we never do. We know we can come up with the solution because we have the resources.”
All told, ILM created 500 shots for the film; the shots encompass more than an hour of screen time. “This movie more than any other was about screen time, because the shots are not short,” Helman says. “For this director, it’s about the camera moving around and connecting one character to another. We had one shot that was four and a half minutes long with earth and fire bending. We had to create all the elements. Take things out. Put things in. Matchmove. Particle work. Just imagine what it’s like to manage 7000 frames, and that’s only the plate.”
Because the studio didn’t know what it looked like to bend fire, air, earth, and water, how a six-legged creature would move, how a flying lemur could fold its wings, and so forth, the art department began working on concept art early to judge scale, color, temperature, mood, tone, composition, and distribution of shapes. “We produced more artwork than any show I’ve worked on,” Helman says.
First, the elements.
Quick and Beautiful 3D Fire
“The particle work in this film is extraordinary,” Helman says. “It took on a different meaning for us. Fire doesn’t just behave like fire; it behaves in a specific way with a story—a beginning, middle, and end.”
For reference, with Shyamalan’s edict in mind to keep the simulations grounded in reality, the crew looked for examples of people playing with fire. They found footage taken at the Burning Man festival. “We showed it to Night and he loved it,” Helman says.
Then, a team of fire starters launched a development project. “We brought Olivier Maury, an R&D engineer, onto the crew,” says Craig Hammack, associate visual effects supervisor. “He and I worked closely on the basics of the fire during the ramp-up time, and then once production started, he got all the other technical directors up to speed. He ended up running shots as well.”
To re-create the fire-bending effects from the animated series in a live-action setting, they had to accomplish two goals: create controllable, photorealistic fire, and do so in a manner that was easy enough to use that they could produce a large number of shots quickly.
“Traditionally, fire simulations are laborious and tedious,” Maury says. “And, the time it takes from launch to render is large.”
At left, ILM developed new technology to bend fire and, above, new techniques to control water, which ranged from these long tentacles to giant waves.
Hammack and Maury began by evaluating the interactive fire-simulation system Christopher Horvath had developed to create the fire controlled by Dumbledore in Harry Potter and the Half-Blood Prince. That system uses a series of 2D slices to produce quick renders of detailed fire simulations at very high resolution from camera view. But, the Airbender fire would need to travel great distances, often straight at the camera with large camera moves.
“The camera moves Night wanted with fire bending were difficult with the frustum-based solution,” Maury says. “Also, he wanted different types of fire with different textures.” The studio’s Physbam simulation system, on the other hand, could produce the necessary 3D fire simulations, but that system runs on CPUs and couldn’t provide the quick turnaround that Hammack and Maury wanted.
“Chris [Horvath] inspired us to use the GPU for simulation and rendering,” Maury says. “He taught us that it could be a reliable tool.”
As did Horvath, Maury used GPUs from Nvidia; for Airbender, the Quadro FX 5800 cards with 4GB of memory. By this time, CUDA 2.1 was available, and using that framework, Maury developed a 3D fluid simulator and volume renderer now known as Plume (see Viewpoint, pg. 10).
“Plume is a fairly traditional smoke solver with an artist-friendly combustion model in that it allows artists to quickly come up with fire and flame looks by controlling the rendering,” Maury explains. “Artists can iterate and show results as a final render, rather than show an intermediate visualization.”
And that, according to Hammack, was key. “We rarely saw anything but final renders out of the system, so we didn’t have to judge a geometric representation of the fire before we got to a rendering stage,” he says. “We realized that for simulations like fire, it’s about the iteration, about the speed with which you can turn around the takes.”
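As a rough illustration of the kind of step such a solver iterates on, here is a minimal smoke-style update in Python with NumPy: semi-Lagrangian advection of density and temperature, plus a buoyancy term standing in for combustion-driven lift. This is a toy 1D sketch under stated assumptions, not Plume’s CUDA implementation; every function and parameter name here is hypothetical.

```python
import numpy as np

def advect(field, vel, dt):
    """Semi-Lagrangian advection on a 1D grid (sketch).

    Trace each cell center backward along the velocity and sample
    the field there with linear interpolation.
    """
    n = field.shape[0]
    x = np.arange(n, dtype=float)
    src = np.clip(x - dt * vel, 0, n - 1)      # backtraced sample positions
    i0 = np.floor(src).astype(int)
    i1 = np.minimum(i0 + 1, n - 1)
    t = src - i0
    return (1 - t) * field[i0] + t * field[i1]

def step(density, temperature, vel, dt=0.5, buoyancy=0.1):
    """One solver step: advect quantities, then add buoyant lift
    where the gas is hot (a crude stand-in for combustion)."""
    density = advect(density, vel, dt)
    temperature = advect(temperature, vel, dt)
    vel = vel + buoyancy * temperature * dt    # hot gas accelerates
    return density, temperature, vel

# A spike of hot smoke carried along a uniform flow.
density = np.zeros(32); density[4] = 1.0
temperature = np.zeros(32); temperature[4] = 1.0
vel = np.ones(32)
for _ in range(10):
    density, temperature, vel = step(density, temperature, vel)
```

A production solver adds pressure projection, vorticity, and a combustion model on top of this skeleton, but the advect-then-force loop is the core that gets iterated per frame.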
To overcome the GPU’s memory limitations, they developed a system to chain grids together. “Because the fire travels in one direction, we can put the grid domains next to each other and have one feed into the next,” Hammack says, “so we were able to maintain resolution over a large space that way. We built the system from the beginning to optimize memory.”
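The chaining idea Hammack describes can be sketched as two fixed-size grids laid end to end along the travel direction, with whatever flows out of one domain injected into the next. This is an illustrative toy, not ILM’s implementation; the advection here is a simple integer shift so the hand-off is easy to follow.

```python
import numpy as np

def advect_out(field, speed):
    """Shift a 1D density by `speed` cells per step; return the
    shifted field plus whatever flowed past the right boundary."""
    out = field[-speed:].copy()          # cells that exit the domain
    shifted = np.zeros_like(field)
    shifted[speed:] = field[:-speed]
    return shifted, out

# Two fixed-size grids laid end to end along the travel direction.
grid_a = np.zeros(16); grid_a[0] = 1.0   # fire starts at the left edge
grid_b = np.zeros(16)

for _ in range(20):
    grid_a, out_a = advect_out(grid_a, 1)
    grid_b, _ = advect_out(grid_b, 1)
    grid_b[:1] += out_a                  # grid A's outflow feeds grid B
```

Because only two domain-sized buffers are ever resident, resolution is maintained over a space twice as large as either grid alone — the same trade the crew made to stay inside the GPU’s memory.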
At right, Aang (Noah Ringer), a young avatar and Airbender, learns how to manipulate water from a water-bending friend. Above, effects artists at ILM used sculpted geometry to create source and target objects for the studio’s Physbam fluid-simulation engine.
Dan Pearson, digital production supervisor, describes Plume’s rendering engine. “It’s a full-on ray-marching renderer with fire and smoke built in. It casts rays to do internal shadows. You associate a density ramp and color ramp with the values inside the simulation, and as you run down each camera ray, you accumulate density and color, and that gives you the final color.” For smoke, the system calculated self-shadows as it ran, using the light source from the fire and external sources.
“It’s a great tool,” Pearson says. “It ran 50 to 100 times faster than our software equivalent at the time. We could produce a rendered simulation in a couple hours rather than a couple days.” And, not only for fire. As Plume developed and the crew became facile with the tool, the artists began using it to bend air and dust, as well.
“With this technique, we began to use it for applications we had never considered using a full simulation to do before, even little dust hits,” Pearson says. “We can do something physically correct without waiting until tomorrow morning. We get a whirling motion, and things react from collisions nicely, and it generally looks more believable. At the least, it’s a great place to start.”
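Pearson’s ramp-and-accumulate description maps onto a short piece of code: raw simulation values are pushed through density and color ramps, and each step along a camera ray adds emitted color weighted by the remaining transmittance. This is a generic front-to-back volume-compositing sketch, not Plume’s renderer, and all names are hypothetical.

```python
import numpy as np

def ramp(values, xs, ys):
    """Map raw simulation values through a 1D lookup ramp."""
    return np.interp(values, xs, ys)

def march_ray(samples, density_ramp, color_ramp, step=0.1):
    """Accumulate color along one camera ray through a volume.

    `samples` are raw simulation values at points along the ray.
    Front-to-back compositing: each step adds color weighted by the
    remaining transmittance, which decays with accumulated density.
    """
    transmittance = 1.0
    color = 0.0
    for v in samples:
        d = ramp(v, *density_ramp) * step   # optical density this step
        c = ramp(v, *color_ramp)            # emitted color this step
        alpha = 1.0 - np.exp(-d)
        color += transmittance * alpha * c
        transmittance *= 1.0 - alpha
    return color, transmittance

# A ray passing through progressively hotter fire values.
color, T = march_ray(np.linspace(0.0, 1.0, 16),
                     density_ramp=([0, 1], [0, 4]),
                     color_ramp=([0, 1], [0.2, 1.0]))
```

Self-shadowing, as described for the smoke, amounts to marching secondary rays toward each light with the same density ramp and using the resulting transmittance to attenuate the light reaching each sample.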
ILM’s design prototyping team created, rigged, and rendered simple 3D models of Appa, then printed 2D artwork of the six-legged creature in various poses that the art department painted and combined with backgrounds.
Earth, Wind, and Fire
Aang bends earth in only two instances in the film, and for those, the studio used a combination of the fracturing technology developed for Indiana Jones and the Kingdom of the Crystal Skull (see “Keys to the Kingdom,” June 2008) and Plume. “We could fracture the earth geometry, and from that, drive particles that we fed into Plume to generate the dust,” Hammack says.
For air bending, the crew used Plume to generate fluid dynamics, and then an up-res’ing technique to create the look. “We saved out the grid data from the simulation, which had nice air currents and swirls, and fed a couple billion particles through it,” Pearson says. “Then, we ray-marched through the particles. Each particle added just a little density to the grid. It gave us a lovely, wispy look.”
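Pearson’s up-res’ing recipe — save the low-res velocity grid, stream many particles through it, then have each particle splat a little density into a render grid — can be sketched as follows. This is an assumed minimal 2D version with a hand-built rotational field, not the production pipeline.

```python
import numpy as np

rng = np.random.default_rng(7)

# Saved low-res simulation data: a swirling 2D velocity field on a grid.
n = 16
yy, xx = np.mgrid[0:n, 0:n].astype(float)
cx = cy = (n - 1) / 2.0
vel_x = -(yy - cy) * 0.05                 # simple rotational flow
vel_y = (xx - cx) * 0.05

# Many particles seeded in the grid, advected through the stored field.
pts = rng.uniform(4, 12, size=(5000, 2))  # (x, y) positions
for _ in range(10):
    ix = np.clip(pts[:, 0].astype(int), 0, n - 1)
    iy = np.clip(pts[:, 1].astype(int), 0, n - 1)
    pts[:, 0] += vel_x[iy, ix]            # nearest-cell velocity lookup
    pts[:, 1] += vel_y[iy, ix]

# Splat: each particle adds just a little density to a render grid.
density = np.zeros((n, n))
ix = np.clip(pts[:, 0].astype(int), 0, n - 1)
iy = np.clip(pts[:, 1].astype(int), 0, n - 1)
np.add.at(density, (iy, ix), 1.0 / len(pts))
```

Scaled up to billions of particles and a 3D grid, the splatted density field is then ray-marched exactly as the fire volumes were, which is what produced the wispy look.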
Air bending, it turned out, was more difficult conceptually than technically: The team considered contrail vapors before deciding to pull the look from the surrounding environment. Colored lights that matched the environment tinted the neutral white base color of the particles and hinted at the surrounding world.
“The most fun shots, though, were when we played with multiple elements together,” Hammack says. “We could have a gust of air disrupt the base fluid simulation that the fire is driven through. You get a sense that one affects the other because they do.”
For example, at one point in the film, Aang deflects fire shot at him by bending the earth; that is, generating a wall of dust that causes the fire to swirl around it, carrying dust particles with it. The bending happens during the four-and-a-half-minute shot.
“The fire system lets you introduce geometry as obstacles, so we fed the walls of dirt directly into Plume, and the fluid dynamics wrapped the fire around it,” Hammack says. “Then, since we were doing dust in Plume anyway, we could feed the dust into the fire flow.”
Before, big effects shots such as that, which required multiple simulations and long render times, took someone with deep technical knowledge to run. On this film, effects artists ran the shots. “This movie was great for me,” Hammack says. “We finally have a tool that is interactive enough to break the tradition of having the most technical person in the company run the big shots. I have to say that fire bending was a lot more fun than water.”
Water, Water, Water
The water simulation tasks ranged from a huge wave, to tentacles of water twice the size of a person, to a ball of water that rises from the ocean. Yet, as with the other elements, the director wanted the water bending anchored in reality.
“Our challenge with water was the multiple levels of scale,” Hammack says. “But, we decided to push forward with existing tools for water simulation and put our development resources into the other elements.”
Early in the film, as Aang learns how to bend water, the water needed to look uncontrolled, as if the young avatar struggled. “Night wanted people to feel that struggle through the motion of the water,” Hammack says. “Later, we show ultimate control. So telling the story through water was another challenge. The fluid engines want to do real physics, and when we fight against that, it’s harder for the engine. The level of instability we add is difficult; it takes much more computation. So, we relied heavily on our fluid-simulation expert Lee Uren. Ryan Hopkins led the team that produced the wave at the end. And, Dan Pearson set up the water-rendering pipeline.”
Because the wave was so big, the simulation team decided to build an underlying structure for the fluid simulation using sculpted geometry. To render it, they used a technique similar to that for the big waves in Evan Almighty, Poseidon, and Pirates of the Caribbean. “We have a complicated fluid simulation underneath that we turn into a mesh and put displacement shaders on top,” Pearson says. “On top of that, we render points for the splashes, some mist and other atmospherics, and use volume renders, as well, for the mist. It’s a complicated pipeline with lots of independent simulations that we have to chain together—fluid sims, to particle sims, to more particle sims.”
Because the large wave was so deep, the water could be opaque. The big bubble and the tentacles of water that Aang bends, however, are transparent. “We used geometry as source and target objects feeding geometry into the fluid simulation and telling it to try to reach the shape of the geometry,” Hammack says.
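The source-and-target idea Hammack describes — telling the simulation to try to reach the shape of a sculpted piece of geometry — can be illustrated with particles nudged each step toward the nearest point of a target shape. This is a crude stand-in for a target-attraction force inside a real fluid solver; all names here are hypothetical.

```python
import numpy as np

def shape_toward_target(points, target, strength=0.2, steps=30):
    """Nudge fluid particles toward the nearest sample of a target
    shape each step — a toy version of a target-attraction force."""
    pts = points.copy()
    for _ in range(steps):
        # For each particle, find the closest target sample.
        d = np.linalg.norm(pts[:, None, :] - target[None, :, :], axis=2)
        nearest = target[np.argmin(d, axis=1)]
        pts += strength * (nearest - pts)     # blend toward the shape
    return pts

rng = np.random.default_rng(0)
blob = rng.normal(0.0, 2.0, size=(200, 2))    # unshaped "water"
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
ring = np.column_stack([np.cos(theta), np.sin(theta)])  # target shape

shaped = shape_toward_target(blob, ring)      # blob pulled onto the ring
```

In a production solver the attraction competes with real fluid dynamics, which is exactly why the early “struggling” water could be left loose and the later, masterful bending pulled tight to the target geometry.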
Animators worked with geometric shapes to block out the choreography and timing for the water bending, using as reference videos they found on YouTube of astronauts popping water balloons in zero gravity. “It took a big leap of faith for Night to look at our gray-shaded tentacles or blobs and trust the result,” says Tim Harrington, animation supervisor. “But, he soon realized it was part of the choreography process.” Once Shyamalan approved the choreography, the animators sent the geometry to the fluid-simulation crew and on into Pearson’s water-rendering pipeline.
“For the giant tentacles, we had an underlying structure driven by a mesh, and we streamed particles on top,” Pearson says. “We had a technique where we could mesh the particles directly into a surface. We could combine these animated meshoids, as we called them, with simulations, and then generate a mesh with everything together. At that point, we had a surface with characteristics based on the radius and weight of each particle and how much influence each particle had.”
That surface is what they rendered. “When you store simulations on grids, you have a lot of data to carry around,” Pearson says. “So, we could do the process once, turn it into a mesh that we could store and re-render efficiently using [Mental Images’] Mental Ray and [Pixar’s] RenderMan because we were storing the interface, not the internal structure.”
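The particle-to-surface step Pearson describes typically works by accumulating a scalar field from weighted, radius-controlled particle kernels and extracting an isosurface of that field (for example, with marching cubes). The sketch below builds such a field in 2D and thresholds it into an inside/outside mask; it is an assumed generic technique, not ILM’s meshoid code.

```python
import numpy as np

def particles_to_field(points, radii, weights, n=32, extent=4.0):
    """Accumulate a scalar field from weighted particles; the
    isosurface of this field is what would be meshed (marching cubes
    in 3D). Radius and weight control each particle's influence."""
    axis = np.linspace(-extent, extent, n)
    xx, yy = np.meshgrid(axis, axis)
    field = np.zeros((n, n))
    for (px, py), r, w in zip(points, radii, weights):
        d2 = (xx - px) ** 2 + (yy - py) ** 2
        field += w * np.exp(-d2 / (2 * r * r))  # Gaussian blob per particle
    return field

pts = np.array([[-1.0, 0.0], [1.0, 0.0]])       # two nearby particles
field = particles_to_field(pts, radii=[1.0, 1.0], weights=[1.0, 1.0])
surface = field > 0.5                           # inside/outside mask
```

Because the blobs overlap, the region between the two particles is inside the surface — separate particles merge into one continuous sheet of water, which is the behavior that makes meshing the particles directly so attractive.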
Appa, Momo, and Dragons
Although the animators helped choreograph water-bending shots, their main task was creating about 200 shots with CG creatures: Appa, a six-legged flying bison-like animal that Aang rides; Momo, a little flying lemur; a hybrid between a Komodo dragon and rhinoceros; and a spirit dragon.
“Making the designs for these characters organic but close to the animated characters in the series was a very difficult dance,” Helman says. “We started with Appa. To work out a way for us to understand how a six-legged creature would walk, we used the outlines from the 2D artwork to create a creature and rig it. We wanted to be able to pose the creature in a specific way to create 2D artwork.”
The design prototyping team would then pose the 3D model, render it, and deliver it to the art department, where artists placed Appa against a background and painted on the plastic render. “He’s in the right scale, and the rig tells you what the limbs are doing,” Helman says. “That gives the director an idea of how the creature moves before he commits to camera moves, and without having to wait for an entire production cycle.”
To create the six-legged walk, the animators referenced elephants and polar bears. “We came up with a modified four-legged walk,” Harrington says, “where the pairs of front legs work together but are slightly offset. It’s a solution that’s believable to audiences.”
Appa’s face was a separate challenge. “He has a face that’s almost human,” Harrington says. “He looks like a giant bison, but if you look at cows and bison to try to figure out how to bring life to the eyes, you don’t see a lot going on in their heads. So, we looked at gorillas. They seem slightly more intelligent.”
For Momo, the advantage and the problem were the same. The team could look at real lemurs for reference, but lemurs don’t fly. “From our conversations with Night at the beginning of the project, we knew it was really important that the characters have a purpose,” Harrington says. “Night would say that Momo is like a street kid, always looking for an opportunity, maybe looking for food or a warm place to stay. If there is danger, he might bolt and then show up again when it’s safe.” Like Aang who befriends him, Momo is, apparently, the last of his kind.
For flying reference, the animators discovered giant fruit bats from the Philippines that were roughly the same size as Momo, and based Momo’s mechanics on those creatures. “Anytime we put wings on a creature, it becomes a logistical nightmare,” Harrington says. “And Momo has to walk on all fours like a lemur, so we didn’t know what to do with the wings at first, how to fold them up, where to put them. We ended up folding them in a line along his forearm so they looked appealing when closed and believable that he could open them and fly.”
In addition to Aang’s animals, the animators had to move the huge creatures that the Fire Nation warriors ride into battle. The creatures look like a hybrid of a rhinoceros and a Komodo dragon. “The thing about Komodo dragons is that they have a goofy way of walking,” Harrington says. “They lead with their elbows and drag their knuckles. It looked odd when we put that movement on a giant creature, so we made the creature move more like a bulldog.”
Behind Aang is a computer-generated environment, one of many created for the film by ILM’s digital matte-painting department.
In one shot, the creature scales an ice wall, digging into the wall with its claws, moving purposefully and steadily given the size of the beast. “You have to show the effort it takes for something that heavy to climb a wall,” Harrington says. “If you rush it, it looks fake.” Skin and muscle simulation applied after animation helped add believability, as well.
The fourth creature, a spirit dragon, is the least visible and the most complicated. “Night’s concept was to have an abstract coil of snake in a cave, and each time you see him, we reveal a little more,” Harrington says. “You see a close-up of his eye and notice a third eyelid. Then, the next time, you see his whole face. Finally, at the end, he leaves the cave and you get a glimpse of how giant and majestic he is. He opens his wings, and he’s amazing and beautiful. But, as soon as you take in his beauty and majesty, we obscure him in smoke, and you don’t see him anymore. He is a complicated creature utilized effectively.”
Digital Double
Although Noah Ringer, the young actor who plays Aang, is a skilled martial artist, many of the shots were too dangerous for the child actor. One shot in which he’s fighting several warriors all at once, for example, continues for 5000 frames. “We knew we would do some flying and gliding shots that the stuntperson couldn’t do, but as the show progressed, we took on more challenging digital-double stunt work,” Harrington says. Ringer’s stunt double was a woman about his size, so ILM replaced her head in the footage with Ringer’s head.
“We decided to do something a little different this time,” Helman says. “Before, we’d capture the actor in a rest pose and then build a library of shapes based on photographs. But what happens is that when you transition to shapes, it doesn’t always look like the actor again because we all have a specific way to go from smiling to not smiling, for example. So we decided to capture the in-betweens.”
To do that, they had Ringer sit in the middle of a stage surrounded by six monitors. The shot with the stunt double played on those monitors so he could mimic the action as he watched and moved around. While he acted, ILM captured his face—textures and performance—with six HD cameras plus one positioned in front of his face. The studio’s “clone cam” technique helped the team build a CG model from the footage and texture it. They also used the captured footage to move the model.
“Tim [Harrington] could look at the captured performances and pick the ones he wanted for the shot, and then we decoded the information,” Helman explains. “So instead of being driven by keyframe shapes, the actor drove the performance of the CG face.” Animators could edit the performance in much the same way they’d edit motion-capture data applied to a facial rig.
“Then, on top of that, we did a final technique to have a one-to-one match,” Harrington says. “The R&D department came up with a process called the Mardi solver.” While Ringer performed, he wore a tracking device that helped the team stabilize his head and isolate his facial performance. Then, using Mardi, they snapped that geometry to the animated character. “The gross movement has to come from the stuntwoman, so we needed to apply Ringer’s performance to a non-moving head and then put that head into the plate,” Harrington says. “The cool thing about the digital-double work is that we had a small enough number of shots that we could try out this innovative new technique.”
Building a (Non-magical) World
To create a world in which element bending might take place, ILM’s digital matte-painting department built air temples, carved buildings out of huge ice cliffs for the Northern Water Tribe, and created other digital locations. When the painters built the air temples, they added industrial-looking bridges to reach them. And, rather than placing an elaborate temple in a viewer’s face for other shots, they hid much of it behind trees in a park.
“The environments are difficult work because the project is geared toward getting things to look real,” Helman says. “So, my push was that if we’re doing geography, we should shoot something, even if we’re going to replace it. That way, the camera move would be correct, the physical timing would be correct. And that’s something Night appreciates.”
And that gave the digital matte painters a starting point for many shots. Then, they pored through Flickr images looking for vacation photos that showed environments with an Asian influence.
“It was all about capturing the vibe of a real place,” says digital matte painter Barry Williams. “With CG, it’s easy to get off track. I needed to know how it would look if I shot something with this environment at this time of day. Night wanted fantasy with a realistic edge.”
And, for that style, the director picked the perfect visual effects supervisor. “I think that’s what I like doing best, effects that are part of the story,” Helman says. “I think this film is fresh in that the effects are not the star. In other movies, you have effects with explosions and incredible camera moves that are not real, and if you take them out, the story is still what it is. In this film, if it weren’t for the effects, the story would be different. The effects had to be part of the narrative, especially with the director.”
And because of that, ILM once again pushed the state of the art of effects forward, particularly in fire simulation and digital-double work.
Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.