Issue: Volume 33 Issue 6: (June 2010)

Against The Grains

By: Barbara Robertson

When a film has the word “sand” in its title, you can bet that, somewhere, a pile of visual effects artists spent months pushing particles around computer screens. For Prince of Persia: The Sands of Time, a swashbuckler set in a mystical sixth century, the ones shaking digital sand out of their artistic shoes did so primarily in London’s Soho district.

Tom Wood, who had supervised visual effects at The Moving Picture Company (MPC) for two Harry Potter films and Kingdom of Heaven, was overall visual effects supervisor. He parsed the major work for Prince of Persia into four relatively equal portions for MPC, Double Negative (DNeg), Framestore, and Cinesite.

“Some shots had to be shared, but because it’s a chase movie, really—the hero and heroine move from one place to another for most of the film—we could separate the work geographically,” Wood says. Actor Jake Gyllenhaal plays the hero Dastan, a street urchin adopted by a king and raised as a prince. Dastan warily joins forces with the feisty princess Tamina, played by Gemma Arterton, whose family protects the Sands of Time.

In addition to sand effects, postproduction on the action-adventure-fantasy swirled around magical elements, the period setting, and the multiple locations. Double Negative handled the magic. Framestore created the vipers and, for the finale, filled a large trap room with sand and then emptied it. MPC built an enormous mythical city of Alamut and sent digital armies swarming over sand dunes. Cinesite surrounded live-action actors with two more digital cities and enhanced the landscapes between them. In addition, a fifth London studio, Nvizage, worked on previs, and its postproduction branch, Nvizible, helped with set extensions and weapon enhancements.

Mike Newell, known for Harry Potter and the Goblet of Fire, directed. Walt Disney Pictures and Jerry Bruckheimer Films, the team that launched the $2.68 billion Pirates of the Caribbean franchise, produced the film. While Pirates drew its story from a theme-park ride, Prince of Persia owes its theme and a built-in fan base to a 2003 video game of the same name, one in a series of Prince of Persia third-person action games created by Jordan Mechner and first introduced in 1989 (see Editor’s Note, pg. 2). Mechner helped write the script for the film.

The plot device driving the story is a “dagger of time.” Inside the crystal hilt of this dagger is a vessel with the magical and powerful Sands of Time, a gift from the gods that can rewind time and allow its possessor to rule the world. Dastan gets his hands on it. Everyone wants it. But he and Tamina must protect it even while being chased by a crafty sheik, a master knife thrower, and deadly Hassasins trained to kill with scary weapons.



Rewinding Time
When Dastan pushes the jewel on the dagger, time stops, and he has an out-of-body experience during which he watches time go backward. When all the sand has drained out or he takes his thumb off the button, time moves forward again, and he re-enters his body to change what happens next. Wood worked with a team at DNeg led by VFX supervisor Mike Ellis to design the effect, which takes place four times in the film.

“A key component that [director] Mike Newell wanted early on was that anyone pressing the button had to step out of body and look back at everything rewinding,” Ellis says. “The actor had to be in two places in the frame, going forward and going backward. That made it complicated to shoot. And, Mike didn’t want just footage playing backward. He wanted it to be an effect linked to the dagger, and he wanted the whole environment to feel different.”

To shoot the actors, Double Negative used an “event-capture” system that the studio had first experimented with for Quantum of Solace, a next-gen evolution of the “bullet-time” technique used for The Matrix. On set at Pinewood, the DNeg crew installed nine Arri Group Arriflex 435 film cameras equipped with identical lenses in a curve that roughly followed the path the actor would take, and then filmed the action at 48 frames per second (fps). Afterward, layout artists used proprietary software to derive a smooth camera path through the scene.

“We combine those cameras and create the entire shot as a CG version of itself,” Ellis explains, “the set, the background, the actors.” They separated the actors from the backgrounds, rotoscoping each actor for all nine cameras. Next, from those nine rotoscopes, they created a basic cookie-cutter form in 3D space, a mesh for each actor in the shot. In a second process, the crew shrink-wrapped that basic form into one with higher detail by comparing motion vectors between the different cameras, resulting in a more accurate mesh of the actor. To apply the photographic textures to the mesh, layout artists tracked the parts that needed to look photoreal in a shot. Then, they broke the CG character into particles, giving the particles a geometry target and applying the appropriate photographic textures to them.
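DNeg’s layout software is proprietary, but the first step, deriving a smooth virtual camera path through the nine fixed camera positions, can be sketched with a standard interpolating spline. The camera coordinates below are invented for illustration:

```python
def catmull_rom(points, samples_per_span=24):
    """Smooth interpolating path through 3D control points (for
    example, nine physical camera positions). Endpoints are padded
    so the curve passes through every control point."""
    pts = [points[0]] + list(points) + [points[-1]]
    path = []
    for i in range(1, len(pts) - 2):
        p0, p1, p2, p3 = pts[i - 1:i + 3]
        for s in range(samples_per_span):
            t = s / samples_per_span
            path.append(tuple(
                0.5 * (2 * b + (-a + c) * t
                       + (2 * a - 5 * b + 4 * c - d) * t * t
                       + (-a + 3 * b - 3 * c + d) * t ** 3)
                for a, b, c, d in zip(p0, p1, p2, p3)))
    path.append(tuple(pts[-2]))
    return path

# Nine hypothetical camera positions along a gentle on-set curve.
cams = [(x, 0.2 * x * x, 1.6) for x in range(9)]
path = catmull_rom(cams)
```

A Catmull-Rom spline is a reasonable stand-in here because, unlike a Bezier fit, it passes exactly through every control point, which matters when each control point is a real camera whose footage must line up with the virtual move.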

When Dastan presses the button on the dagger, magical, moving sand bursts from his body. The camera then looks through his eyes at himself moving backward in time. A slight sand trail follows. “Because we had the real-world photographic elements from the nine cameras and had all these angles around the actors, we could use the textures on the particles even when the characters are breaking up,” Ellis says. Moving the particles through the studio’s DNeg Squirt fluid-dynamics system added the magical motion.

Squirt also played a role in shots that DNeg created for the finale. During a big confrontation, the magical, light-emitting sands of time pour out of a 300-foot-high crystal. “We had to treat the sand as a character with a life of its own,” Ellis says. “It barrels into everything, breaks down the cavern, bounces off walls. It’s a destructive force.” To create the sequence, the artists worked with layers, starting with the sand crashing into the wall, which triggered rigid-body simulations that crumbled the wall. “The rocks that fell off the walls interplayed with the original sand elements, though, so we had to simulate the sand again,” Ellis says. “It was a circular thing.”



The Sand Trap
Getting to the crystal was half the fun for the Framestore artists, who built a huge, underground labyrinth and a trap room full of sand. When the surface of the trap room collapses, the actors slide down tons of digital sand.

“It was a huge undertaking,” Wood says. “As the surface collapses, the sand flows away, revealing architecture that collapses and falls. So, Framestore took a pragmatic approach. They spent a long time developing software approaches so they didn’t have to render billions of particles bouncing off one another. They didn’t make a room full of sand; they made it look like a room full of sand.”

The idea for the collapsing sand room originated at Framestore. “We pitched it to the studio a couple of weeks before we needed to film it,” says Ben Morris, visual effects supervisor. “We hardly had time to do storyboards. And, we thought it would take two to four weeks to film, but they gave us four days. It was the very last thing to shoot.”

At first, they had Jake Gyllenhaal slide down a sloping sandpit built on set. “Jake was fantastic, but the sand got into people’s eyes, and resetting the sand was laborious,” Morris says. Moreover, the sand didn’t move and flow like it would in a collapsing sand room. So instead, they had Gyllenhaal slide down fiberboard slopes. That meant all the sand would be digital.

“We tried to avoid creating one massive simulation that would take days and days to render,” Morris says. “And, we didn’t want a physics-based system that we couldn’t change. So we broke it down into smaller pieces, to control what was going on. In post-vis, we defined the surface shapes for the main body of sand and the architecture it revealed.”

Effects artists created planes that represented all the shapes in the shot and sent particles flowing down the surfaces to produce the main flows. Additional simulations layered on top sent particles surging when shapes collided. Shallow waves sliding on top of one another created a leading edge on angled surfaces, producing the illusion of sand flowing down a sinkhole. Rigid-body simulations set in motion to collapse the architecture triggered secondary animation that popped sand off colliding surfaces and poured sand from tilting shelves. And the sand simulations triggered dust layers.
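Framestore’s solver is proprietary, but the core trick, constraining particles to flow along plane surfaces rather than simulating grain-on-grain collisions, might be sketched like this (all constants are illustrative, and the friction term is a crude stand-in for real grain behavior):

```python
import math

GRAVITY = (0.0, -9.8, 0.0)

def slide_step(pos, vel, normal, dt=1 / 24, friction=0.4):
    """Advance one sand particle constrained to a plane with unit
    normal `normal`: gravity is projected into the plane so the
    particle flows downhill, with simple velocity damping standing
    in for grain friction."""
    g_dot_n = sum(g * n for g, n in zip(GRAVITY, normal))
    g_tan = tuple(g - g_dot_n * n for g, n in zip(GRAVITY, normal))
    vel = tuple((v + a * dt) * (1 - friction * dt)
                for v, a in zip(vel, g_tan))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

# A 30-degree slope; the particle drifts down the incline over time.
theta = math.radians(30)
normal = (math.sin(theta), math.cos(theta), 0.0)
p, v = (0.0, 1.0, 0.0), (0.0, 0.0, 0.0)
for _ in range(48):  # two seconds at 24 fps
    p, v = slide_step(p, v, normal)
```

Because each particle only needs its own position and the local surface normal, millions of grains can be advanced independently, which is what makes this approach so much cheaper than a true collision simulation.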

“We probably had 20 or 30 simulations,” Morris says. “We had the main flows, sand pouring off the edge of the sinkhole, the pile of rubble with interactive sand around the objects and spilling around the rocks, stuff pouring from the ceiling, and residual trickles of sand.”

The crew spent six months post-vis’ing the two-minute sequence to build the story, doing minimal simulations to get approvals on the shots before running the detailed sims. To add volumetric detail inside the falling sand, lead effects artist Alex Rothwell restructured a pre-baked, voxel-based occlusion calculation developed to create hair occlusion and deep shadows for Aslan, the lion in Narnia.

“We got to the point where we could run simulations overnight, and they were controllable,” Morris says. Using a particle cache format that Rothwell developed, each seed particle was instanced into a volume of randomly spaced secondary particles, producing 300 to 400 additional particles at render time in, for example, the shape of a squashed sphere. Multiple layers of these squashed spheres produced an opaque surface. And, compositors used Apple’s Shake and The Foundry’s Nuke to construct the final shots, sometimes working directly with particle caches in Nuke.
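Rothwell’s cache format is not public; as a rough sketch of the render-time instancing idea, each cached seed particle can be expanded into a few hundred randomly spaced secondaries filling a squashed sphere. The counts come from the article; the radius and squash parameters are invented:

```python
import random

def instance_seed(seed_pos, rng, count_range=(300, 400),
                  radius=0.05, squash=0.3):
    """Expand one cached seed particle into a cloud of randomly
    spaced secondary particles filling a squashed sphere, mimicking
    render-time instancing so only the seeds need simulating."""
    n = rng.randint(*count_range)
    cloud = []
    while len(cloud) < n:
        x, y, z = (rng.uniform(-1, 1) for _ in range(3))
        if x * x + y * y + z * z <= 1.0:  # rejection-sample a sphere
            cloud.append((seed_pos[0] + x * radius,
                          seed_pos[1] + y * radius * squash,  # squashed in y
                          seed_pos[2] + z * radius))
    return cloud

rng = random.Random(7)
cloud = instance_seed((0.0, 0.0, 0.0), rng)
```

The payoff is that only the seeds are simulated and cached; the secondaries are regenerated deterministically at render time, multiplying apparent particle counts by a few hundred at nearly zero simulation cost.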

“It was enormously ridiculous, in a way, to design a shot as we were shooting it, but I think the result is fantastic,” Wood says.



Massive City, Huge Armies
Much of the action leading to the finale takes place on soundstages at Pinewood in the UK and in sets on location in Morocco, with visual effects studios extending the practical sets and creating digital environments.

Twenty kilometers southwest of Marrakesh, near the humble, dusty village of Tamesloht, production designer Wolf Kroeger constructed the fictitious city of Alamut, a huge set with merchant stalls along narrow alleys lined with frescoed mud walls. A 50-foot-tall palace stands in the town square, and a central fountain gurgles nearby. At MPC, artists turned the set on location in Morocco and additional sets built on nine soundstages at Pinewood into the huge city.

“For the Morocco set, we extended the palace up to six stories and topped it with a golden dome,” Wood says, “built a Tower of Babel-style structure, and created huge environmental extensions. For an interior courtyard at the 007 stage in Pinewood, we added city streets, people, a variety of buildings running up to the palace, and the huge tower in the center. It’s a huge thing, all generated as a 3D build.”

To manage such a large environment, MPC improved the existing environmental pipeline, which the studio had used for films such as Kingdom of Heaven. “We’ve had a packaging pipeline in place for a while, but we did a lot of work just to lay out such a city,” says Matt Middleton, CG supervisor at MPC. “It has around 20,000 buildings and 180,000 props.”

To construct the city, a digital town planner would start with approximately 30 different types of buildings and polygonal planes—that is, patches—that defined sections of the city. Each patch could contain numerous types of structures and variations of each type. The studio’s proprietary layout tool randomized building variations and types, placed them on the correct contours, and even added arches between houses.
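MPC’s layout tool is proprietary; a minimal sketch of the idea, randomized but repeatable building picks per patch, might look like this (the type names and seeding scheme are hypothetical):

```python
import random

# Roughly 30 base building types, as in MPC's library (names invented).
BUILDING_TYPES = ["house_%02d" % i for i in range(30)]

def lay_out_patch(patch_id, slots, variations=4, seed=0):
    """Fill one city patch with randomized building picks. Seeding
    the generator per patch keeps the layout repeatable from render
    to render while still varying between patches."""
    rng = random.Random(patch_id * 1_000_003 + seed)
    return [(slot, "%s_v%d" % (rng.choice(BUILDING_TYPES),
                               rng.randrange(variations)))
            for slot in range(slots)]

patch = lay_out_patch(patch_id=12, slots=1000)  # "a thousand or so buildings"
```

Determinism is the important property: because a patch is regenerated identically on every run, the layout can stay virtual, with full geometry expanded only at render time.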

The artists used the same layout tool to add prop variations for the buildings by working from a collection of nearly 60 props—carts, pots, rooftop canopies, for example, and washing lines with laundry hanging from them. “We’d do a set number of cloth sims for a long frame range, say 3000 frames, but because we never had shots that long, we could offset the sims and never see repetition,” Middleton says. “Each patch might have a thousand or so buildings in it, but you don’t physically have the geometry in a scene. It’s a virtual layout. You can see representations in [Autodesk’s] Maya in an OpenGL preview.” Thus, lighting artists could preview the layout and adjust the mix of houses and buildings in real time.
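The offset trick Middleton describes, reusing one long cloth bake across many instances, can be illustrated with a simple frame-mapping function (the stride constant is invented):

```python
def cached_cloth_frame(shot_frame, instance_id, cache_length=3000):
    """Map a shot frame to a frame in a long pre-baked cloth cache.
    Each prop instance gets its own offset into the 3000-frame bake,
    so neighboring washing lines never move in sync even though they
    all share a single simulation."""
    offset = (instance_id * 157) % cache_length  # decorrelating stride
    return (shot_frame + offset) % cache_length
```

Since no shot approaches 3000 frames, and each instance starts reading the cache at a different point, no repetition is ever visible on screen, exactly the property the article describes.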

Alamut is the setting for a battle sequence, and during one of the shots with armies attacking the city, MPC created and animated 18,000 digital people, all of whom wore long clothing. The long clothes and the flag-bearing soldiers caused the studio to integrate a cloth system into its tried-and-true proprietary crowd-simulation system, Alice. “We haven’t always simulated cloth in the background, but the flags looked ridiculous otherwise,” Middleton says. Although the crew still used Syflex cloth simulations for close-up hero shots, the integrated system helped speed simulations for shots with large numbers of crowd-simulation agents.

During the city siege, motion-capture cycles applied to the agents helped the team create shots with panicking citizens, charging armies, and an elite force rappelling up and down the city walls. The biggest crowd shot, however, was of approximately 10,000 horseback and foot soldiers running over a sand dune. As the animals and humans run, they kick up sand and dust created with sprites attached to particles in Maya; the particles emit based on foot position.
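A hedged sketch of foot-driven emission: watch each agent’s foot height and trigger a dust sprite on the frame the foot plants. The threshold and the sample heights below are illustrative:

```python
def emit_dust(foot_heights, ground=0.0, threshold=0.02):
    """Given per-frame foot heights for one crowd agent, return the
    frames on which a dust sprite should be emitted: whenever the
    foot drops to the ground plane after being airborne."""
    events = []
    airborne = True
    for frame, h in enumerate(foot_heights):
        on_ground = h <= ground + threshold
        if on_ground and airborne:
            events.append(frame)  # foot plant: kick up dust here
        airborne = not on_ground
    return events

# A simple run cycle: the foot lifts and plants twice after the start.
heights = [0.0, 0.15, 0.3, 0.15, 0.0, 0.0, 0.2, 0.35, 0.1, 0.0]
```

Tracking the airborne state, rather than simply testing each frame against the ground, ensures one burst per footfall instead of a continuous stream while the foot rests on the sand.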

A second proprietary system, called Tickle, managed the Maya-to-RenderMan interface, provided preview nodes, and more. “We don’t use Maya lights,” Middleton says. “We use Tickle lights. You can set up your passes, that is, which shaders and lights are in various passes for rendering. You can also set up all the normal maps, depth images, and depth maps. And, we used it to a degree for shader building. We have a shader-writing team that builds the core features, and then the artists can plug those together.”

For compositing, the artists used both Shake and Nuke. “For a select number of shots, we had environment artists working in Nuke,” Middleton says. “Nuke wasn’t fully integrated into the pipeline yet. That’s happening a lot more now.”



Dirty Sets, Nasty Weapons
As Dastan and Tamina journey through sixth-century Persia on their mission to save the Sands of Time, they travel through two cities, which house several action sequences. 

Sets built on location in the Atlas Mountains gave the actors walls and rooftops to climb and jump from, and gave the postproduction team lighting reference, but these partially built cities had one problem. “They were too clean,” says Mark Kasmir, visual effects supervisor at Cinesite. “As our lighting and displacement map work progressed, we found we often had to replace the original sets with 3D because our level of aging and destruction looked better. Nine times out of ten when the director said he didn’t like something, he was pointing to something in camera, not CG. That made us feel good.”

To build the cities, the crew started with two sets of 30 generic building blocks created in Maya, some basic cubes for simple houses and others six stories tall, as well as several props. From these, they created cities with 1000 buildings: one a rich city with painted walls, and the other rougher and more organic-looking.

“We had a great level of detail in the displacements, textures, and staining, which the lighters can use at their discretion,” Kasmir says. “We also had a selection of windows, doors, and props to differentiate the buildings, and we’d intersect buildings to create new ones. If you push a small building into the side of a larger one, suddenly you have a new building with a portico.”

To populate the cities with people walking in the streets, cooking outdoors, and so forth, the crew used Massive’s software. Particles generated in Side Effects Software’s Houdini added smoke to the cooking fires.

To make the action sequences believable, Cinesite put weapons in the actors’ hands, and created wounds with digital prosthetics. The weapons ranged from swords, daggers, and arrows to whips. The artists filled hilts held by the actors with swords and daggers, and sent arrows flying on curves created in Maya and imported into Nuke. “We let the compositors animate the arrows,” Kasmir says. “We imported the geometry for an arrow and its corresponding UV map, and let the compositors play around with the trajectory. Because we had 3D geometry, we could see where the arrow was in 3D space in the comp.”
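The curve-driven arrows might be approximated, in spirit, by sampling a simple parametric arc that a compositor could retime or reshape per shot. All values below are invented:

```python
def arrow_path(start, end, frames, arc_height=0.5):
    """Sample an arced trajectory between two 3D points: linear
    interpolation plus a parabolic lift in y, the kind of editable
    curve a compositor could tweak shot by shot."""
    path = []
    for f in range(frames):
        t = f / (frames - 1)
        pos = [s + (e - s) * t for s, e in zip(start, end)]
        pos[1] += arc_height * 4 * t * (1 - t)  # parabola, peaks at t=0.5
        path.append(tuple(pos))
    return path

# One second of flight at 25 fps between two hypothetical points.
path = arrow_path((0, 1.5, 0), (20, 1.5, 0), frames=25)
```

Keeping the trajectory as a lightweight parametric curve, rather than a baked animation, is what let Cinesite hand the arrows to compositors: with the arrow geometry and its UV map imported into Nuke, the curve could be adjusted without a round trip to the 3D department.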

As for the whips, a stuntman practiced with real whips, but he replaced the whips with handles for filming. “That conjured up loads of animation problems,” Kasmir says. “He got more flamboyant when he used only the handles, and they filmed the sequence at 48 frames per second.” So, to animate the whips more effectively, they retimed the shots to 24 fps, animated the whips, and then retimed them again to the slow-motion 48 fps. One whip had spinning metal weights at the end, and the other had a claw-like end, so the riggers set up separate systems for the tips and the whips.
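The retiming round trip can be illustrated with a toy resampler; a real retime would interpolate rather than hold or drop frames, so this is only a sketch of the frame bookkeeping:

```python
def retime(frames, factor):
    """Resample a per-frame channel by a speed factor: factor=2
    doubles the length (slow motion via held frames), factor=0.5
    keeps every other frame (48 fps footage back to 24 fps speed)."""
    if factor >= 1:
        out = []
        for v in frames:
            out.extend([v] * int(factor))
        return out
    step = round(1 / factor)
    return frames[::step]

shot_48 = list(range(8))        # overcranked 48 fps footage
at_24 = retime(shot_48, 0.5)    # real-time speed, for animating the whips
back_48 = retime(at_24, 2)      # returned to 48 fps slow motion
```

Animating against the real-time version and only then restoring the slow motion is the point: the whip animation reads naturally at 24 fps, and the final retime reapplies the stylized 48 fps look.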

“We animated the whips on a per-shot basis,” Kasmir says. “We placed the tips and made them look dangerous, and then we did something interesting with the leather whips.”

Although the film has close to 1200 visual effects shots, most, like the weapon enhancements and city extensions, will blend into the rich sixth-century texture. Behind the scenes, though, the studios creating these effects devised unique systems to produce them in new, more efficient and effective ways. Framestore invented methods for creating a volumetric look without the expense of volumetric simulations. Double Negative pushed its state-of-the-art event-capture system further. MPC developed an efficient pipeline for building and rendering massive cities. And Cinesite convinced a director that real-world geometry, not computer graphics, looks too clean. We may have turned a corner.

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net .

 
