Trickery and Tweaks
Volume 32, Issue 8 (August 2009)



For the sixth Harry Potter film, visual effects studios practice new forms of CG magic.


CG fire created at Industrial Light & Magic whips around Dumbledore, who conjured up the flames to repel the digital Inferi crawling up the crystal island.
Images ©2009 Warner Bros. Entertainment, Inc.

It’s a dangerous world now for magicians and Muggles, as readers of J.K. Rowling’s enormously popular Harry Potter series of books know, and as movie audiences will discover. Warner Bros.’ sixth film in the franchise, Harry Potter and the Half-Blood Prince, puts the villainous Lord Voldemort back on the attack, with his acolyte Draco Malfoy helping to double the trouble. And, as always, the good wizards, Hogwarts Professor Dumbledore and his student Harry Potter, the chosen one, must stop them.

But Harry and his fellow students of magic have other problems to cope with. They’re teenagers now, as are the actors who portray them, and hormonal distractions play a major role in this film, sometimes with a magical twist. The kids are growing up. The plot is grittier and darker. And the result is one of the most critically acclaimed Potter films yet.

David Yates, who directed the previous film, Harry Potter and the Order of the Phoenix, returns to lead the crew of Half-Blood Prince and will continue on for the final two films in 2010 and 2011—both based on the seventh, and final, Potter book. So, too, do senior visual effects supervisor Tim Burke, who has now worked on five of the six films, and the actors who have starred as the students of witchcraft and wizardry from the beginning: Daniel Radcliffe as Harry Potter, Emma Watson as Hermione Granger, Bonnie Wright as Ginny Weasley, Rupert Grint as Ron Weasley, and Tom Felton as Draco Malfoy.

     
Because ILM created the digital fire using a combination of low-res 3D fluid simulations and high-res 2D simulations, TDs can produce quick results. From left to right, a data plane showing the underlying sim, particles colored according to temperature, and an initial composite.

As is typically the case with Potter films, several VFX studios located primarily in London conjured up most of the effects, with Industrial Light & Magic (ILM) doing some heavy lifting from the US. We talked with ILM, the Moving Picture Company (MPC), and Cinesite, which created some of the most interesting effects, about their work on the movie. In addition, CEG Media, Double Negative, and Rising Sun Pictures contributed digital effects.

ILM: Firestorm


Dumbledore and Harry are on a scouting mission to find one part of Voldemort’s soul, a so-called Horcrux, hidden in a crystal cave. ILM provided shots for Harry and Dumbledore’s entrance to the cave, and created the cave itself as well as the lake inside. After entering, the two wizards row across the lake to the island, a set piece, and find the Horcrux. And then, the action begins. Dumbledore collapses. Harry dips a shell into the lake to bring him some water. As soon as he touches the water, hundreds of CG Inferi, trapped souls that guard the Horcrux, splash up and pull him beneath the CG water. The camera follows Harry underwater.

To rescue Harry, Dumbledore, now recovered, conjures up fireballs that he slings into the lake. As jets of fire slash through the water, we see thousands of Inferi, arms and limbs wrapped around one another, on the sides of the island. They look like coral, and it’s no surprise to learn that ILM’s Aaron McBride, who created concept art for Davy Jones’ crew in Pirates of the Caribbean, helped design them. The fire scares the Inferi. They release Harry, and as the camera follows his swim to the surface, we see an orange glow on the water. The visual effects wizardry has only just begun.


The goal for the firestorm surrounding Dumbledore was highly detailed, directable fire that didn’t look like it was made of particles.

On the island, Dumbledore is directing a raging firestorm that swells to 100 feet high and 200 feet in diameter. Waves of fire swirl around him. And then, like Moses parting the Red Sea, Dumbledore says a magical word, and the churning wall of flames splits to create a safe passage across the lake for the two wizards.

Tim Alexander led a crew of approximately 60 people at ILM who worked for about a year on the sequence’s 160 shots to create the crystal cave, the CG water, the CG Inferi, and the CG fire. When they began, they did not have the ability to create such a fire, so Chris Horvath took on that problem in November 2007. This year, SIGGRAPH accepted the technical paper on the process that he and Willi Geiger submitted, titled “Directable, High-Resolution Simulation of Fire on the GPU.”

“I wanted to emulate the ability of really skilled compositors to paint with filmed elements,” Horvath says. “So the original intention was to come up with something that would allow us to shape the use of filmed elements with particle simulation.”

With this in mind, Horvath first tried to bend and stretch sprite elements along a particle path. It didn’t work. Horvath believed that might have been because the filmed elements he had to work with didn’t have the long trailing streams of fire he needed. So, because he didn’t have the photographed elements he wanted, he decided to use CG elements instead. And thus, the science project began.

“The goal was to not have anything particulate,” Horvath says, “no sprites and none of the artifacts that had been present in the past. That was the basis of our technique, which is essentially image-based simulation.”

The result is a two-stage process that technical directors used to create the firestorm. The process begins with a 3D particle simulation using Geiger’s FakeFlow system. “It does a coarse, quick, low-resolution enforcement of the Navier-Stokes non-divergence component and adds a little viscosity,” Horvath explains. “One of the things we learned is that a little of this goes a long way. We used tiny fluid grids—64 cubes at the most, and sometimes 32—but that gave us a base fluidity as a starting point.”
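Geiger’s FakeFlow system is proprietary, but the step Horvath describes—coarsely enforcing the Navier-Stokes non-divergence condition on a tiny grid—is, in its standard form, a pressure projection. Here is a minimal NumPy sketch of that idea; the grid size and iteration count are illustrative, and none of this is ILM’s code:

```python
import numpy as np

def project(u, v, w, iterations=20):
    """Coarsely enforce the divergence-free (non-divergence) condition
    on a cell-centered velocity field, in the spirit of the low-res
    base sim described above. A real solver would also add the small
    viscosity Horvath mentions."""
    # Divergence of the velocity field (unit cell spacing assumed).
    div = np.zeros_like(u)
    div[1:-1, 1:-1, 1:-1] = (
        (u[2:, 1:-1, 1:-1] - u[:-2, 1:-1, 1:-1]) +
        (v[1:-1, 2:, 1:-1] - v[1:-1, :-2, 1:-1]) +
        (w[1:-1, 1:-1, 2:] - w[1:-1, 1:-1, :-2])) * 0.5

    # Jacobi iterations for the pressure Poisson equation.
    p = np.zeros_like(u)
    for _ in range(iterations):
        p[1:-1, 1:-1, 1:-1] = (
            p[2:, 1:-1, 1:-1] + p[:-2, 1:-1, 1:-1] +
            p[1:-1, 2:, 1:-1] + p[1:-1, :-2, 1:-1] +
            p[1:-1, 1:-1, 2:] + p[1:-1, 1:-1, :-2] -
            div[1:-1, 1:-1, 1:-1]) / 6.0

    # Subtract the pressure gradient to remove the divergence.
    u[1:-1, 1:-1, 1:-1] -= (p[2:, 1:-1, 1:-1] - p[:-2, 1:-1, 1:-1]) * 0.5
    v[1:-1, 1:-1, 1:-1] -= (p[1:-1, 2:, 1:-1] - p[1:-1, :-2, 1:-1]) * 0.5
    w[1:-1, 1:-1, 1:-1] -= (p[1:-1, 1:-1, 2:] - p[1:-1, 1:-1, :-2]) * 0.5
    return u, v, w

# A 32^3 grid, matching the article's "tiny fluid grids."
n = 32
u, v, w = (np.random.randn(n, n, n) * 0.1 for _ in range(3))
u, v, w = project(u, v, w)
```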

The simulation results provided the flowing motion that Horvath needed, but not the detail. “If we had an infinitely large computer that could hold a 2kx2kx2k grid, we could theoretically perform a full-blown fluid sim, but it would take days and days and days,” Horvath says. “So we found a way to split the grid into slices and exploit parallelism in the GPU.”

This is where the magic happens. Since the only thing that matters in visual effects is what the camera sees, the second part of the process is oriented to the frustum—the camera’s field of view. Rather than running a 3D simulation, Horvath stacked, in effect, a series of rectangular, two-dimensional slices of a 3D grid. He spaced these simulation planes equally apart and faced them toward the camera, from the foreground to the background, to produce high resolution close to the camera and appropriate resolution farther back.
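In other words, the 2D simulation domains are rectangles stacked along the camera’s view direction, each just large enough to fill the frustum at its depth. A sketch of how such camera-facing slices might be laid out (parameters are illustrative, not ILM’s):

```python
import math

def slice_planes(fov_deg, aspect, near, far, count):
    """Place camera-facing simulation planes at equal spacing in depth,
    each sized to cover the camera's field of view at that distance.
    Because each plane's pixel grid covers less world space up close,
    resolution is naturally highest near the camera."""
    half_fov = math.radians(fov_deg) / 2.0
    planes = []
    for i in range(count):
        z = near + (far - near) * i / (count - 1)  # equal spacing in depth
        height = 2.0 * z * math.tan(half_fov)      # frustum height at z
        planes.append({"depth": z, "width": height * aspect, "height": height})
    return planes

# 128 slices, as most shots used.
for p in slice_planes(fov_deg=50, aspect=2.39, near=1.0, far=60.0, count=128)[:3]:
    print(p)
```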

Let It Snow


For a wintry sequence that demanded more than the fake flakes caught on camera, Cinesite drew on proprietary image-based tracking software called Motion Warper to dust the actors with snow. The studio started by photographing the fake snow to create tiny elements. Motion Warper then tracked the live-action image and applied the elements. “It does pixel analysis on the whole image,” says Ivor Middleton, CG supervisor. “It isn’t just a corner-pin application. We were able to snow up Hagrid, with his huge beard and furry cloak blowing in the wind, more or less straight out of the box with a little tweaking. It’s something we’ve also used to track scars and for face replacement.” –BR


“All the particles from the coarse [3D] simulation are drawn on each plane,” Horvath explains. “But a weighting function determines their size. If they’re close to the plane, they’re drawn strong. If they’re far, they fade off, and the size of that fall-off is important. The particles become the input to another Navier-Stokes fluid-flow sim, a variation of the same equations we ran earlier.”
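The splatting step Horvath describes amounts to drawing each coarse-sim particle into a slice image with a weight that falls off with distance from that slice. A toy version, assuming NumPy arrays of particles in normalized screen coordinates (a hypothetical helper, not the production code):

```python
import numpy as np

def splat_to_plane(px, py, pz, strength, plane_z, falloff, res=256):
    """Draw coarse-sim particles onto one camera-facing slice image.
    Particles near the plane are drawn strong; far ones fade out, and
    the width of that fall-off matters, per Horvath. px, py are in
    [0, 1) screen space; pz and plane_z share the same depth units."""
    img = np.zeros((res, res), dtype=np.float32)
    # Weighting function: 1 on the plane, fading linearly to 0 at `falloff`.
    w = np.clip(1.0 - np.abs(pz - plane_z) / falloff, 0.0, 1.0)
    # Point-splat into pixels; a real system would draw sized, soft blobs.
    ix = np.clip((px * res).astype(int), 0, res - 1)
    iy = np.clip((py * res).astype(int), 0, res - 1)
    np.add.at(img, (iy, ix), strength * w)
    return img
```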

Because this simulation is two-dimensional, though—the slice is only a plane—the simulation can run at 2k, even at 4k resolution, which adds a tremendous amount of detail. The result of the simulation is a set of images that contain density, temperature, texture, fuel, and velocity data. “These are combined by our renderer to produce the final image,” Horvath says. “They’re volume-rendered from near to far.” Although the simulations run separately for each plane, the underlying particle simulation provides continuity.
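Rendering “from near to far” is the standard front-to-back “over” compositing: each slice’s contribution is attenuated by the opacity accumulated from the slices in front of it. A per-pixel sketch of that operator:

```python
def composite_slices(slices):
    """Front-to-back 'over' accumulation for one pixel. `slices` is
    ordered near to far as (emission, opacity) pairs; each slice shows
    through only the transmittance left by the slices in front."""
    color = 0.0
    transmittance = 1.0
    for emission, opacity in slices:
        color += transmittance * emission * opacity
        transmittance *= (1.0 - opacity)
        if transmittance < 1e-4:  # early out once effectively opaque
            break
    return color
```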

Horvath created the images using a clever hack that treats texture maps like large input data arrays and a final display like a large output data array. “This was before CUDA was available,” he says, “so we used OpenGL, the GL shading language GLSL, and GPGPU, which tricks OpenGL into using textures as computation planes. And instead of using shaders to draw things, we used them to compute things. Because everything is stored in a texture, we can display everything immediately. You can have our simulation on screen in front of you and hit a key to switch between different data planes [the different sets of images].”
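The pattern Horvath describes—textures as input and output data arrays, shaders as compute kernels—is the classic GPGPU “ping-pong”: render into one texture while reading another, then swap. A NumPy stand-in that mimics the data flow (the real implementation was OpenGL/GLSL):

```python
import numpy as np

def shader_pass(tex):
    """Stand-in for a GLSL fragment shader: reads the input 'texture'
    (a 2D array) and computes a new one. Here, a toy diffusion step."""
    out = tex.copy()
    out[1:-1, 1:-1] = 0.25 * (tex[:-2, 1:-1] + tex[2:, 1:-1] +
                              tex[1:-1, :-2] + tex[1:-1, 2:])
    return out

# Ping-pong: each pass's output texture becomes the next pass's input,
# just as GPGPU swaps read/write textures between draw calls. Because
# the state lives in a 'texture', it can be displayed at any step.
tex = np.random.rand(512, 512).astype(np.float32)
for _ in range(100):
    tex = shader_pass(tex)
```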


Cinesite created magic candles for the great hall, and then for an explosive scene near the end of the film, blew them out. For that sequence, the studio digitally detonated a stained-glass window and shattered the dinnerware.

Horvath used EXR files to store the temperature and density data, and GTO files to store the simulation controls. For most of the shots, the TDs ended up using 128 slices. “There’s no reason to choose 128,” Horvath says. “There’s nothing special about that number. It’s just that programmers are geeks and like to use powers of two. Typically, the number of slices required is a feature of how much depth complexity we need.”

To simulate a fireball, for example, the technical directors used only eight slices, and the simulation took only 10 minutes. But, for the parting of the so-called Red Sea shot, during which Harry and Dumbledore row across the lake between walls of fire, the TDs needed to run six simulations that they layered from back to front.

“That’s the only shot in which the main body of the fire is multiple elements,” Horvath says. “That’s because we needed more than a thousand slices, and that was too big for the renderer to hold in memory. It was our nightmare shot.”

Fiery tunnels aside, for the most part, by using the GPU fire-simulation process, the crew could look at a fully rendered fire in about half a day. “It was great,” Alexander says. “We had never done photorealistic fire, although we’d tried many times in the past. The standard approach, rendering a heavy-duty 3D particle sim volumetrically, can take days on end. And, now we had super highly detailed photorealistic fire that we could change twice a day.”

Cinesite: Magical Explosions


All told, Cinesite created 474 shots, of which 204 had CG elements and the rest were cleanups. Andy Robinson was the compositing supervisor, and Ivor Middleton, the CG supervisor. To the studio, Hermione’s lovebirds in scene 63 were the most important.

“We’ve mostly done environments for Harry Potter, so having CG creatures to work on, as well, was nice,” Middleton says. “They’re little lovebirds that reflect Hermione’s mood.” At the beginning of the shot, as Harry comforts a distressed Hermione, the birds twitter around her head. But when Ron Weasley enters with another girl, the birds turn into angry little darts.

“They become aggressive and zoom toward Ron,” Robinson says. “When he exits, they hit the door and burst into an explosion of feathers.”

To create the feathers, the 3D artists made small planes, rather than using hairs. “We didn’t need to build individual barbs,” Middleton explains. “We could simulate the feathers with texture maps and an anisotropic shader.” Once the birds hit the door, the dynamics switch on. Lead technical director Holger Voss turned the birds into the explosion of feathers using a combination of Autodesk Maya’s rigid-body dynamics and nCloth, which handled individual feather dynamics.
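As a rough illustration of that card-based approach, here is a hypothetical Autodesk Maya Python snippet that builds feather “cards” as small polygon planes and assigns them an anisotropic shader; Cinesite’s actual rig and nCloth setup are not published, so names and values here are invented:

```python
# Run inside Autodesk Maya's Python environment.
import random
import maya.cmds as cmds

def make_feather_cards(count=200):
    """Build feathers as small textured planes rather than hair -- per
    Middleton, barbs come from texture maps and an anisotropic shader.
    (Hypothetical helper; sizes and scatter are illustrative.)"""
    shader = cmds.shadingNode("anisotropic", asShader=True, name="featherMat")
    sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
                   name="featherSG")
    cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader")
    for i in range(count):
        # A narrow plane with a few spans so nCloth can later bend it.
        card, _ = cmds.polyPlane(w=0.4, h=1.2, sx=1, sy=3,
                                 name="featherCard%03d" % i)
        cmds.move(random.uniform(-1, 1), random.uniform(0, 2),
                  random.uniform(-1, 1), card)
        cmds.sets(card, edit=True, forceElement=sg)  # assign the shader
```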

The second explosion that Cinesite created for the film involved inanimate objects—a stained-glass window in Hogwarts’ great dining hall. “They didn’t shoot anything for it,” Robinson says. “The shot was an afterthought.”

The shot takes place near the end of the film. The Death Eaters, who are invading Hogwarts, explode the stained glass window. Plates slide off the tables, and cups smash onto the floor. The magic candles move with the shock wave, and the flames go out. 

Although Cinesite had created the magic ceiling and magic candles for the great hall, as the studio has done in the past, for this scene the artists had to reproduce the set, as well. “We remodeled the entire room from two camera views,” Robinson says. “We rebuilt the tables, cups, plates…everything in 3D using digital stills for textures.”

After consulting with Jose Granell, Cinesite’s model unit supervisor, the team decided to create 100 different explosion events as the detonation moves down the great hall, to mimic how Granell would rig a series of practical explosions to create a dense effect.

“We staggered the explosions and previs’d it with Tim Burke to get the timing,” Middleton says. For the sequential explosions, Voss used Blast Code software to shatter the stained glass, animate the plates moving off the tables, smash the cups, and manage the interaction with the environment.
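Staggering 100 events so a detonation front appears to travel down the hall reduces to assigning each event a trigger time from its distance, plus a little jitter so the timing doesn’t read as mechanical. A sketch with illustrative numbers (not Cinesite’s values):

```python
import random

def stagger_explosions(event_positions, wave_speed, fps=24, jitter=0.08):
    """Assign each explosion event a trigger frame as the shock front
    travels down the hall, with random jitter to break up uniformity."""
    triggers = []
    for x in event_positions:                 # distance (m) from the window
        t = x / wave_speed                    # shock-front arrival time (s)
        t += random.uniform(-jitter, jitter)  # avoid mechanical timing
        triggers.append(max(0, round(t * fps)))
    return triggers

# 100 events spaced every half meter down a 50 m hall, wave at 20 m/s.
frames = stagger_explosions([i * 0.5 for i in range(100)], wave_speed=20.0)
```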

“The dynamics were rigid-body simulations with Blast Code,” Middleton says, “but with Holger [Voss’s] careful choreography.” Lastly, Robinson’s team composited dust and debris passes rendered with Pixar’s RenderMan into the explosion using Apple’s Shake.

MPC: Quidditch


Now that the contestants are teenagers, Yates and Burke wanted to give the Quidditch matches a little edge. “Even though it’s still kids on magical brooms, we tried to make the sport more of a sport,” says Nicolas Aithadi, visual effects supervisor at MPC. “We decided to create more credibility by giving the camera more of a television feel. So, we looked at everything having to do with aerial shows and Formula 1 races.” They created the broadcast-sports feeling and dynamism they wanted by using the virtual equivalent of 500mm lenses to flatten the perspective and by having the cameras catch up with the action.
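One simple way to get that catching-up feel is a chase camera that lags its target and closes the gap over time, rather than being locked to it. A minimal sketch, assuming an exponential follow with an illustrative lag value (not MPC’s actual rig):

```python
import math

def follow_camera(cam_pos, target_pos, dt, lag=0.35):
    """Broadcast-style chase camera: close the gap to the broom over
    roughly `lag` seconds, so the camera is always catching up."""
    alpha = 1.0 - math.exp(-dt / lag)  # frame-rate-independent blend
    return [c + alpha * (t - c) for c, t in zip(cam_pos, target_pos)]

# Per frame at 24 fps:
cam = [0.0, 10.0, -30.0]
broom = [5.0, 12.0, 0.0]
cam = follow_camera(cam, broom, dt=1.0 / 24.0)
```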


MPC created a digital stadium for the extreme Quidditch sport and used videogrammetry to put real faces on digital athletes that star in 90 percent of the shots.

During Half-Blood Prince, we see a tryout and a final match. For both, although the crew shot the actors on wires on a bluescreen stage, in 90 percent of the final shots, the competitors are CG. “We wanted something more controllable than what we could shoot on set,” Aithadi says, “and more extreme.”

Knowing the digital athletes would be close to camera, MPC decided to use videogrammetry to create the photoreal characters. “It’s like photogrammetry, but with moving images,” Aithadi says. To capture the moving image, they positioned four cameras around an actor wearing 80 tracking markers on his or her face and sitting in a chair. Two cameras were in front, high and low, and two were at 45-degree angles on each side.

“The idea was to use that data to animate the CG face, and at the same time, acquire the textures,” Aithadi says. “We’d have moving geometry and moving textures, and the texture would match the geometry exactly.”
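The geometric core of videogrammetry—recovering a marker’s 3D position per frame from its 2D image in several calibrated cameras—is textbook linear triangulation. A NumPy sketch of the method (MPC’s actual solver is proprietary):

```python
import numpy as np

def triangulate(projections, points_2d):
    """Recover one marker's 3D position from its 2D location in several
    calibrated cameras. Each P is a 3x4 projection matrix; stack the
    DLT constraints and solve the homogeneous system by SVD."""
    rows = []
    for P, (x, y) in zip(projections, points_2d):
        rows.append(x * P[2] - P[0])  # x * (row3 . X) = row1 . X
        rows.append(y * P[2] - P[1])  # y * (row3 . X) = row2 . X
    A = np.asarray(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                        # null-space vector
    return X[:3] / X[3]               # dehomogenize

# Four views of one marker -> one 3D point per frame; repeat over all
# 80 markers and all frames to get the moving geometry.
```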

Mystery Train


During a particular shot, created with help from The Moving Picture Company (MPC), the camera focuses on students Ginny, Luna, and Dean Thomas, who are riding in a train. The camera pulls out of the window, a dining car rolls past, and then, the camera goes inside the window of a third car to focus on Harry, Ron, and Hermione.

But there was no train—only the inside of one train car on set, configured and then reconfigured to simulate the three train cars, and filmed with a motion-control camera. MPC created the rest and put the jigsaw puzzle together.

“It was insane,” says Nicolas Aithadi, visual effects supervisor. “We had one person working on the shot for six months, creating everything that didn’t exist on set—the train walls, the track, everything outside the windows—dealing with all the motion-control elements, and keeping all the motion-control and CG elements in sync. At one point, I thought, ‘Enough already. We’ll do it all in 3D. I’ll pay for it.’ But, it would have been a long process in 3D as well, and the real characters help sell the shot. It has a nice feel to it, and the way it fits into the story is quite cool.” –BR


The first test took a week and a half and produced results promising enough that the crew decided to go ahead and capture all the Quidditch competitors performing a library of expressions. Before the crew started the capture, though, they had each actor do a minute-and-a-half warm-up.

“We had done the test with Rupert Grint, who plays Ron Weasley,” Aithadi explains. “And, we asked him to do an extreme movement. When we got the data back, we could see blood moving inside his face in a specific way. I had never thought of that before. So the warm-up helped get the blood flowing in the actors’ faces.”

After shooting each actor, the artists at MPC used camera projection to apply the moving images on a CG model of the actor’s head, and then stitched the edges and converted the images to UV maps. “It was very freaky to see the mouth and eyes moving on the UV map,” Aithadi says.


The digital stadium for the Quidditch tryouts is not yet draped with large pieces of cloth, so MPC needed to build every post and beam that you can see behind Harry Potter and Ginny Weasley.

Furtility, the hair system MPC developed for 10,000 BC (see “Making History,” March 2008), had to be upgraded to help the groomers work with hair that’s thinner and not as long as mammoth hair, and to create hair that moves but retains a part. “I think the shots with Ginny are so good because the hair helps sell the whole thing,” Aithadi says.

Apparating Sucks


The fastest way for a wizard to transport is by apparating: He or she thinks of a destination and magically teleports there. But, it’s dangerous, so wizards cannot receive a license to legally apparate until they turn 17, Harry’s age now. The question for the filmmakers was, what does apparating look like?

To answer that question, artists at The Moving Picture Company (MPC) created drawings to visualize VFX supervisor Tim Burke’s idea of something like chewing gum that causes Harry and Dumbledore’s bodies to pull and deform, and blend into each other.

“That was our biggest mistake,” says Nicolas Aithadi, MPC’s visual effects supervisor. “Tim loved our ideas. And [director] David Yates loved them. But we didn’t have a clue about how to do them. I’d explain the ideas to our modelers and animators, and they’d look at me with empty eyes. The shot is eight seconds long, and it’s the most complex we had to do, completely abstract from start to finish.”

In the final shot, Dumbledore and Harry are standing in a train station. Dumbledore says, “Take my hand, Harry,” and they apparate. The world spins around them and sucks them inside. They collide, their bodies distort and twist, and their clothing and skin blend together until you can’t tell one from the other. For Dumbledore, it’s magic as usual. For the novice Harry, it’s painful. Then, they pop out into a village.

The team spent the first two or three months tossing ideas like hot potatoes back and forth between the animation and modeling departments. Finally, they decided to start with models based on five images that represented the stages Harry and Dumbledore go through as they transport.

Meanwhile, the matchmove department rotoscoped Harry and Dumbledore—that is, actors Daniel Radcliffe and Michael Gambon—from footage shot at the train station and in the village. And, the character development team designed a rig that allowed animators to stretch low-res models of the characters in abnormal ways. In addition, one modeler created a train station in CG and then, without changing the topology, turned the train station into a village.

Hand modeling also helped transform the characters. “We didn’t have time to do R&D,” Aithadi says. “We made high-resolution characters for the rest poses, and then every five frames for eight seconds, a modeler remodeled the blendshapes with the same topology. When the bodies overlapped, we created blends as if they were the same object. We had 10 or 15 modelers die during the project,” he adds with a chuckle. Once modeled, animators used special rigging to drive the way the blendshapes worked.
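With shapes hand-modeled every five frames, the rig’s job at each in-between frame reduces to blending the two bracketing shapes. A simplified sketch of that interpolation, assuming a hypothetical `shapes` mapping of keyed frame numbers to same-topology vertex arrays (the production rig drove the blending in more elaborate ways):

```python
import numpy as np

def blend_pose(shapes, frame, step=5):
    """Blend between hand-modeled blendshapes keyed every `step` frames.
    `shapes` maps keyed frame -> (n_vertices, 3) array, all sharing the
    same topology, as the article describes. Linear blend shown."""
    lo = (frame // step) * step
    hi = min(lo + step, max(shapes))
    t = 0.0 if hi == lo else (frame - lo) / float(hi - lo)
    return (1.0 - t) * shapes[lo] + t * shapes[hi]
```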

To control the skin and cloth textures, the team rendered 15 versions of the animated characters, including one with everything made of skin, another with everything made of velvet, and a variety of lighting passes. But, they also ended up blending the textures in 2D.

“I know I should have taken this artwork and thought about it before loving it so much,” Aithadi says. “And I did love it. It is so beautiful. But, that’s why we like to work on Potter. The effect is very different—a bit of something new.” –BR
 

For cloth simulation, they relied on Syflex to move three different fabrics used in the team uniforms. More difficult than the kids’ capes, though, was the cloth covering the stadium. “It has 50-meter-tall pieces of cloth that are so big you need small details or you’ll kill the scale,” Aithadi notes. “But, the more detail you have, the more polygons you have. Oliver Winwood spent three months getting that simulation right, and then he had to add the wind, and the problem with wind is that you can quickly see a pattern when the mathematical waving is too uniform.”
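One common fix for that too-uniform waving is to layer several sine “gusts” with non-harmonic frequencies and offset phases, so the pattern never visibly repeats. A sketch of the idea with hypothetical values (the article does not describe MPC’s exact wind model):

```python
import math

def wind_strength(x, t, gusts=((0.9, 0.13, 0.0),
                               (0.5, 0.31, 1.7),
                               (0.25, 0.77, 4.2))):
    """Sum several (amplitude, frequency-Hz, phase) sine gusts whose
    frequencies are not simple multiples of each other, with a spatial
    offset so the wave front travels across the cloth."""
    return sum(a * math.sin(2 * math.pi * f * (t + 0.05 * x) + phase)
               for a, f, phase in gusts)
```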

The tryouts, however, happen in an uncloaked structure. “All you can see is the wooden structure,” Aithadi says, “the stairs inside the towers, everything. We had to build every beam. We ended up with 35,000 pieces of geometry.”

In addition to the digital doubles and the two stadiums—one bare and one dressed with cloth—MPC built the surrounding environment, a task that took the matte painters six months. They started with stills taken by the production crew in Scotland and a panorama that served as a background, but that background was five miles from the stadium. The matte painters needed to fill the space between. “We didn’t want flat depth,” Aithadi says. “We wanted haze between trees and to have that haze changing every 200 meters.”

To give the artists a sense of scale, the crew built a 360-degree environment based on the stills and placed spheres at various depths within that environment, starting with a sphere the size of a foreground tree. “Because of the spheres, we could define depth, which gave us the ability to create a 2D environment that looked 3D,” Aithadi says. “Using those spheres in depth, we created level curves, and then created geometry at the right depth that we camera-projected with photographs. The only problem was that the trees were flat, so compositors manually rotoscoped tree edges to place haze behind them.”
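Haze that “changes every 200 meters” can be modeled as exponential atmospheric scattering evaluated per depth band, blending each camera-projected layer toward the haze color by an amount that grows with distance. A sketch with an illustrative density value:

```python
import math

def haze_amount(depth_m, density=0.004):
    """Exponential haze as a function of depth, quantized into 200 m
    bands per the article's description. Returns a 0..1 blend factor."""
    band = round(depth_m / 200.0) * 200.0
    return 1.0 - math.exp(-density * band)

def apply_haze(color, haze_color, depth_m):
    """Blend one layer (e.g., a rotoscoped tree) toward the haze color
    by its band's haze amount; colors are RGB triples."""
    a = haze_amount(depth_m)
    return [c * (1 - a) + h * a for c, h in zip(color, haze_color)]
```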

Water Zombies


The lake inside the crystal cave where Harry and Dumbledore travel to retrieve a Horcrux churns with Inferi, corpses bewitched to do a dark wizard’s bidding. Industrial Light & Magic (ILM) created these naked, soulless creatures, which Voldemort gave the task of protecting the Horcrux. “They look like people but skinnier,” says visual effects supervisor Tim Alexander. “We have thousands in some shots.”

For shots with up to 100 Inferi, primarily those in which the creatures attack Harry and pull him into the water, animators keyframed the performances. To populate the entire underwater cave with thousands of Inferi all tangled together and writhing, the crew instanced motion cycles created by the animators onto cards driven by a particle simulation. “We rendered 800 frames of seven or eight cycles that were spread throughout the cave and randomly chosen at different times,” Alexander says. “When they’re cards underneath the water, we built them so we could relight them. We didn’t bake prelighting into them. We can treat them like sprites, but they have normals.”
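A sprite with normals can be relit at render time with a simple shading term instead of baked lighting. A NumPy sketch of the idea (not ILM’s renderer), assuming the card stores per-pixel albedo and normal images:

```python
import numpy as np

def relight_card(albedo, normals, light_dir, light_color=(1.0, 0.85, 0.6)):
    """Relight an instanced card after the fact: because the sprite
    stores per-pixel normals rather than prelighting, a Lambertian
    N.L term lets the fire's glow be applied at render time.
    albedo: (H, W, 3); normals: (H, W, 3) unit vectors."""
    l = np.asarray(light_dir, dtype=np.float32)
    l /= np.linalg.norm(l)
    ndotl = np.clip(normals @ l, 0.0, 1.0)          # (H, W)
    return albedo * ndotl[..., None] * np.asarray(light_color)
```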

For the water, ILM used the studio’s PhysBam software to create a 3D simulation for swells, splashes, and ripples from the Inferi. But for the shallow water, CG artist and developer Chris Horvath created a GPU-based solver using a surface plane. “We were doing 1k and 2k shallow-water sims in real time,” he says. “It’s the same Navier-Stokes equation used for 3D simulations, but we collapsed one of the dimensions in the assumption that the water is shallow.” In the 3D sim, the Inferi are geometry. In the 2D sim, which is image-based, they are elements.
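Collapsing the vertical dimension turns the water into a 2D height field, which is why a 1k or 2k grid can run interactively. A minimal NumPy sketch of the common height-field wave step (a standard simplification, not ILM’s exact solver):

```python
import numpy as np

def shallow_water_step(h, vel, dt=1.0 / 24.0, c=2.0, damping=0.998):
    """One step of a height-field water solver on a surface plane: the
    Laplacian of the height field drives vertical acceleration, giving
    propagating ripples at a fraction of a full 3D sim's cost."""
    lap = (np.roll(h, 1, 0) + np.roll(h, -1, 0) +
           np.roll(h, 1, 1) + np.roll(h, -1, 1) - 4.0 * h)
    vel = (vel + dt * c * c * lap) * damping
    return h + dt * vel, vel

# Drop a 'splash' into a 1k x 1k grid and advance a few frames.
h = np.zeros((1024, 1024), dtype=np.float32)
vel = np.zeros_like(h)
h[512, 512] = 1.0
for _ in range(10):
    h, vel = shallow_water_step(h, vel)
```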

Underwater, there is no simulation. The filmmakers created the illusion by filming Daniel Radcliffe (Harry) underwater, to have his hair and clothes float realistically. ILM animated the Inferi as if they were moving in water, and then composited Harry and the Inferi into swirling CG particulate matter lit by the glowing fireballs that Dumbledore shoots at the zombie-like creatures. –BR

Growing Up


When Warner Bros. released the first Harry Potter film eight years ago, the actors now playing teenagers were children. The visual effects that create their witchcraft and wizardry, however, have been state of the art in every Potter film, and the magical effects have helped drive the films’ huge successes. But only one, Harry Potter and the Prisoner of Azkaban, has received a visual effects Oscar nomination.

Now that the children have moved toward adulthood, though—both the actors and the students in the stories—the effects have grown edgier and more sophisticated, too. So, perhaps this Potter will break the spell. That is, unless the Muggles take Potter effects for granted.

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.