Dark and Stormy Knight: Part one of two-part special feature
Volume 28, Issue 6 (June 2005)

For the millions of Star Wars fans around the world, a ticket to Star Wars Episode III: Revenge of the Sith buys one last ride to a much-loved galaxy far, far away. It’s the final flight in a journey begun nearly 30 years ago, when George Lucas introduced the world to the Force, Jedi Knights, an Evil Empire, and the Dark Side. Episode III, the sixth film in the series but the saga’s linchpin, marks the turning point in the story: Anakin, the heroic young Jedi Knight, falls to the dark side and becomes the loathsome Darth Vader, setting up the three episodes that follow in the saga’s chronology. In reality, however, this film concludes the cinematic journey, closing one of the most financially successful film franchises in history with a stunning climax and rave reviews, particularly for the visual effects.

The five episodes thus far have made visual effects history, and Episode III is on track to do the same. To create effects for the first Star Wars film, now titled Episode IV: A New Hope, Lucas founded Industrial Light & Magic. And to realize Lucas’s vision of intergalactic war and peace, ILM forged technical breakthroughs in visual effects and computer graphics: from optical filmmaking to digital, from the motion-control cameras used for the first film to virtual cameras in synthetic environments for the last, from stunt doubles for leading actors to digital doubles, from puppets and actors in robot suits to armies of digital droids, clones, and furry Wookies, and, lastly, from human to digital stars, including Jar Jar, Yoda, and, in Episode III, the evil CG General Grievous, commander of the droid army.


It’s fitting that for this last tour of the Star Wars universe, the major technical breakthroughs helped build the galaxy rather than create its denizens. For Episodes I and II, released in 1999 and 2002, respectively, Lucas had asked ILM for more and better characters and digital doubles. That mandate spurred development in skin, hair, cloth, motion capture, rigid-body dynamics, and rendering crowds of digital characters. Thus, for Episode III’s digital cast, the toolmakers fine-tuned the character-animation technology to make the process more efficient: applying motion-capture cycles to multiple digital characters, streamlining rigging, repurposing hair simulations, changing small pieces of a cloth simulation, and so forth.

Meanwhile, for Episode III, Lucas turned his attention to the environments. The action takes place on and above several planets: the city-world of Coruscant, the volcanic Mustafar, the Wookies’ dreamy lake-and-forest home planet Kashyyyk, the sinkhole planet Utapau, exotic Felucia with its fungi, idyllic Alderaan, Padme’s Naboo, and the barren Tatooine. On those planets, and in the atmosphere above them, the actors (live and digital) craft the epic tale in wide shots of cities, mountains, lakes, forests, and deserts, and in interiors: corridors and pilot stations on battleships and planets, the Senate Chamber, conference rooms, and private apartments.


Paintings by digital matte artists at Industrial Light & Magic appeared in three-fourths of the 2151 visual effects shots. In addition, using a new camera mapping tool called Zenviro, technical directors and CG supervisors assembled complex environments.


Seventy-two physical sets were used for principal photography, as were miniatures of complex environments created in ILM’s model shop, from the Wookies’ giant trees to Utapau’s complicated sinkholes to Mustafar’s rivers of lava. But the sets were usually small pieces of large digital environments, and the miniatures often became elements in synthetic environments.

“Some shots require computer graphic [environments],” says John Knoll, who supervised 1700 of the 2151 visual effects shots. “For others, the smart thing is to build a model and shoot it. And then there is a big gray area where you could do it either way. Sometimes you make the choice on a gut feeling. Sometimes you choose CG for the control. Sometimes it’s price. For this film, George [Lucas] asked us to bias more of the shots in the gray area toward digital. That drove a few bits of technology forward.”

Fiery simulations provided the first opportunity for technical advancements: The film opens with the longest space battle of any episode. The battle takes place in the upper atmosphere of Coruscant, where Obi-Wan and Anakin fight General Grievous’s droid army to rescue Chancellor Palpatine. In the previous episodes, smallish fighters fired at one another during the space battles. This time, behemoth, mile-long ships shoot thunderbolts at each other, and because they’re fighting in Coruscant’s atmosphere, not outer space, they trigger explosions.

“The space battle lasts eight minutes,” says Willi Geiger, CG supervisor. “We have lasers, fire, explosions, digital pyro. We used everything, and the kitchen sink.” Literally, it turns out, the crew did add a kitchen sink, as digital shrapnel hurled into one battleship’s side.

Although many of the fireballs, the flaming debris, and the smoke trails during the battle used live-action elements, some were entirely digital. These used ILM’s proprietary smoke-simulation software and new pyro techniques. The simulator, developed in collaboration with Stanford University under the guidance of Ron Fedkiw, uses a fluid solver and particles that can be positioned and adjusted to sculpt the combustion. As these combustion particles burn, they create areas of local expansion. In addition, “spin” particles can create vortices in precise areas. A new tool developed for Episode III allowed the particle sequences to be retimed non-linearly, which meant the speed and timing of the pyrotechnics could be changed (that is, art directed) without running another simulation.
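ILM has not published the internals of that retiming tool, but the general idea can be sketched simply: run the simulation once, cache the particle state at every frame, and then resample those caches through an artist-drawn time curve. The Python sketch below is illustrative only; the function names and data layout are hypothetical, not ILM’s actual Zeno or fluid-engine interfaces.

```python
# Minimal sketch of non-linear retiming of a cached particle simulation.
# Hypothetical names and data layout; not ILM's actual tools.

def lerp(a, b, t):
    """Linear interpolation between two (x, y, z) tuples."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def retime(cache, time_curve, output_frames):
    """Resample a cached simulation through an artist-drawn time curve.

    cache         -- list indexed by sim frame; each entry is a list of
                     particle positions [(x, y, z), ...]
    time_curve    -- function mapping an output frame to a (possibly
                     fractional) simulation frame; this is where the
                     artist slows down or speeds up the pyro
    output_frames -- iterable of output frame numbers to produce
    """
    result = []
    last = len(cache) - 1
    for frame in output_frames:
        sim_t = max(0.0, min(float(last), time_curve(frame)))
        lo = int(sim_t)
        hi = min(lo + 1, last)
        blend = sim_t - lo
        # Interpolate every particle between the two bracketing caches.
        particles = [lerp(p0, p1, blend)
                     for p0, p1 in zip(cache[lo], cache[hi])]
        result.append(particles)
    return result

# Example: hold the first second of the explosion, then play it out fast.
# ease = lambda f: f * 0.25 if f < 24 else 6 + (f - 24) * 2.0
# retimed = retime(cached_frames, ease, range(48))
```

Because only cached positions are interpolated, an artist can hold, stretch, or accelerate the explosion without paying for another simulation run.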

For further tweaking, particles output from the simulation engine could be imported into Alias’s Maya, and models and simulations could be exported from Maya to the simulation engine. A new rendering system that used a custom RenderMan DSO handled volumetric rendering of the large particle sets.
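The details of ILM’s RenderMan DSO are proprietary, but volumetric rendering of a large particle set typically begins by accumulating the particles into a density volume that the renderer then marches through. The sketch below shows only that accumulation step, with hypothetical names and a deliberately naive grid.

```python
# Illustrative particle-to-voxel accumulation; not ILM's RenderMan DSO.

def splat_particles(particles, grid_res, bounds_min, bounds_max):
    """Accumulate particle density into a voxel grid.

    A volumetric renderer would then ray march through this density to
    shade the smoke or fire; this sketch shows only the accumulation.
    """
    nx, ny, nz = grid_res
    grid = [[[0.0] * nz for _ in range(ny)] for _ in range(nx)]
    size = [mx - mn for mn, mx in zip(bounds_min, bounds_max)]
    for p in particles:
        # Normalize the position into [0, 1) within the grid bounds.
        u = [(c - mn) / s for c, mn, s in zip(p, bounds_min, size)]
        if all(0.0 <= c < 1.0 for c in u):
            i, j, k = int(u[0] * nx), int(u[1] * ny), int(u[2] * nz)
            grid[i][j][k] += 1.0
    return grid
```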


ILM created the fiery explosions by combining fluid simulations, particle effects, and live-action elements, and rendered them with a custom volumetric rendering system that worked with RenderMan.

For the first digital fireballs in the sequence, the crew started with the fluid-simulation engine, retimed and adjusted the result in Maya, then gave it a cellular look with Voronoi patterns based on the particle positions and a noise function. For another fireball, the crew again started with an explosion from the simulation engine, imported the resulting particles into Maya, used those particles to trigger secondary events, exported the particles back into the fluid engine, ran the simulation again to include these events, and then combined and rendered all the simulations together. A similar process produced other effects, such as the smoke clouds during the battle.
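The shader ILM used has not been published, but a cellular Voronoi look driven by particle positions can be approximated with a classic nearest-neighbor distance pattern: the gap between the distances to the two closest particles is large inside a cell and shrinks to zero along cell borders. The sketch below, with hypothetical names and a stand-in noise function, illustrates the idea.

```python
# Illustrative Voronoi-from-particles pattern; not ILM's production shader.
import math
import random

def voronoi_cell_value(point, particles, noise):
    """Distance-to-nearest-particle pattern, perturbed by noise.

    Cell borders (points roughly equidistant from two particles) return
    values near zero, reading darker than cell centers, which gives a
    cracked, cellular look when mapped onto the fireball.
    """
    dists = sorted(math.dist(point, p) for p in particles)
    f1, f2 = dists[0], dists[1]
    edge = f2 - f1          # large inside a cell, ~0 on cell borders
    return edge + 0.2 * noise(point)

def cheap_noise(point, seed=0):
    """Stand-in value noise; a real shader would use gradient noise."""
    random.seed(hash((round(point[0], 1), round(point[1], 1),
                      round(point[2], 1), seed)))
    return random.random()

# shade = voronoi_cell_value((x, y, z), fireball_particles, cheap_noise)
```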

Beneath the battle’s fire and smoke lies the dense city-world of Coruscant. The establishing shot for the planet, the wide-view, flying-in-for-a-landing shot, is an intricate 3D matte painting, one of many used in the film. Digital matte supervisor Jonathan Harb estimates that three-fourths of the 2151 visual effects shots incorporated matte paintings or, in the case of establishing shots, were entirely matte paintings. “We’d revisit the places many times,” he says. “And, even though we had assets that could be re-used, George [Lucas] wanted particular moods, so the painting changed. The complexity of these places, especially Coruscant, was amazing.”

Generally, the painters would start with relatively simple 3D geometry-models that they would texture, light, and render. Then, they began working on the rendered result, adding light as well as details. For an establishing shot of Naboo, digital matte artist Yanick Dusseault started with underlying rendered geometry from earlier episodes, adding trees and other painted elements, including final lighting.


For some shots, rather than sending Coruscant’s digital environment to technical directors to finish, matte painters created the spaceships darting between the painted buildings.

Similarly, for an establishing shot of Coruscant, digital matte artist Yusei Uesugi incorporated buildings from both Episode I and Episode II. “What’s incredible about this shot,” Harb says, “is that it incorporates elements created and rendered in [Autodesk Media and Entertainment’s] 3ds Max, [SplutterFish’s] Brazil, [Alias’s] Maya, [Softimage’s] XSI, and [ILM’s] Zeno.” The shot was rendered using global illumination, and then Uesugi painted the rendering and projected it back onto the geometry.

“A shot like this might require 30 layers of projection, but compared to the time it would take to calculate global illumination for every frame, it’s worth it,” Harb says. “Plus, there’s always going to be a place like this area. See that wash of light on that building? It wasn’t in the rendering. Someone is always going to want to add something like that.”
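The mechanics behind that trade-off are straightforward camera projection: render (or paint over) one frame from a chosen camera, then reproject that image onto the geometry so that other frames and nearby camera angles reuse it instead of recalculating global illumination. The sketch below, using a simple pinhole model and hypothetical names rather than ILM’s tools, shows the reprojection step for a single layer; a production setup stacks many such layers and falls through to the next when a surface point is not covered.

```python
# Illustrative single-layer camera projection; hypothetical names only.
import numpy as np

def project_to_painting(world_point, world_to_camera, focal, width, height):
    """Return the pixel in the painted frame that covers this surface point.

    world_to_camera -- 4x4 matrix of the projection camera (the camera the
                       painting was made from), world space to camera space,
                       with the camera looking down -Z
    focal           -- focal length expressed in pixel units
    """
    p = world_to_camera @ np.append(np.asarray(world_point, float), 1.0)
    if p[2] >= 0.0:
        return None                      # point is behind the projection camera
    x = focal * p[0] / -p[2] + width * 0.5
    y = focal * p[1] / -p[2] + height * 0.5
    if 0 <= x < width and 0 <= y < height:
        return (x, y)
    return None                          # not covered; try the next layer
```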

In addition to adding painted details and lights, the matte artists would often add moving elements, such as spaceships navigating the corridors between Coruscant’s skyscrapers. That fits with a kind of one-person, one-shot philosophy that the matte department shared with ILM’s former “Rebel Mac group,” a philosophy now permeating the rest of the pipeline. New tools have given people working throughout the pipeline more flexibility.

In the past, for example, the matte-painting department would have created all the “digimattes,” whether for pilot ships or entire planets. This time, thanks to a new camera mapping tool named Zenviro, the 33 matte-painting artists concentrated on “establishing shots,” while CG supervisors and technical directors crafted the “zenvironments.” Alan Trombla led the Zenviro development effort in ILM’s R&D department based on earlier work by Dan Goldman, now at the University of Washington.

“Suddenly, camera mapping was available to the rest of the crew, and they were taking on shots that would have come to us,” says Harb. Although designed to handle huge environments, Zenviro was quickly adopted throughout the pipeline.

“People began using it for all kinds of things because it’s really easy to use,” says Knoll. “If they had a shot where the texture didn’t hold up in the foreground, a lot of times they’d fix it with Zenviro rather than sending a little thing like that to the matte-painting department.”


Kashyyyk, the home planet of the Wookies, was fashioned, as were other planets, from live-action photos, 3D models, digital still images, and matte paintings.

The CG supervisors and technical directors built big things, too: a pilot station, cities on the sides of sinkholes, corridors on spaceships, and large environments. CG supervisor Hilmar Koch, who oversaw work on synthetic environments, estimates that Zenviro settings were used for approximately 500 shots. “We used it for a lot of the heavy action shots,” he says.

Working within Zenviro, CG supervisors could build a 3D scene using multiple image planes and 3D models, all textured with photographic elements that were projected onto the 2D and 3D geometry and then touched up, when needed, in Adobe’s Photoshop. In other words, Zenviro artists could project high-resolution textures (2D images) onto 2D cards and onto 3D geometry within a 3D environment, paint on the textures, and move the camera around inside the resulting environment. The images could be live-action photography, motion-control shots of miniatures, paintings, 3D renderings, digital photographs, and so forth. The system managed multiple projections so that all surfaces seen by the camera had projected images and the correct amount of detail.

For example, if a cyclorama (a series of digital photographs or paintings stitched into a 360-degree loop) encircled a Zenviro scene, the camera saw the painting everywhere it turned. As the camera moved, it became obvious whether elements in the scene needed to be fully 3D, could remain photographic elements on 2D cards, or could be seen at all. Thus, Zenviro made it possible to build synthetic environments in which the camera follows fast action, without the need to model and render an entire 3D scene for every frame.
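One practical way to make that card-versus-geometry call is to estimate parallax: how far, in pixels, an element appears to shift against the distant background over the camera move. The sketch below is a rough illustration with hypothetical names, not a Zenviro feature.

```python
# Illustrative parallax estimate for deciding 2D card vs. 3D geometry.
import math

def max_parallax_pixels(camera_positions, element_center, focal_px):
    """Rough upper bound on how far an element appears to shift, in pixels,
    relative to an infinitely distant background, over a camera move.

    A sub-pixel result means a photographic element on a flat card will
    hold up; a large result means the element needs real 3D geometry.
    """
    depths = [math.dist(cam, element_center) for cam in camera_positions]
    # Baseline: the farthest apart any two camera positions get.
    baseline = max(math.dist(a, b)
                   for a in camera_positions for b in camera_positions)
    nearest = min(depths)
    return focal_px * baseline / nearest

# Usage: if max_parallax_pixels(path, tower_center, 2000) < 0.5, keep the
# tower on a 2D card; otherwise promote it to projected 3D geometry.
```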

“George [Lucas] pushed to do shot composition at the end of the pipeline,” says Koch, who managed the synthetic environments. “So we had to build synthetic worlds in which to fly the camera, and Zenviro allowed him to work with miniatures in the same way he worked with pure CG assets.”

Some of the largest miniatures were used to create the sinkholes on Utapau, the Wookies’ 1000-foot treehouses on Kashyyyk, and lava beds on Mustafar. For all three planets, matte paintings provided the first look.


Digital matte artists created the senate office building on Coruscant (above) by painting a rendering of a 3D model (top, left) that was textured with a basic “cement” shader (top, right).

On Utapau, aerial footage of a landscape not far from Lucas’s Skywalker Ranch provided the background onto which stills of the sinkhole models were projected. Level 10 inside a sinkhole, the complex environment where General Grievous fights Obi-Wan, was created with Zenviro.

For Kashyyyk, footage taken in China helped matte painters set the scene. The Wookies’ treehouses were 12-foot miniatures shot from various angles that were replicated and augmented digitally for final shots. Branches and leaves were created with a Maya simulation using probability offsets. The branches were grown inside spheres; when a branch hit the sphere’s border, it sprouted leaves. Here, again, Zenviro helped set the stage for a battle with backgrounds layered on cards.
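ILM has not detailed that Maya setup, but the growth rule described, branches wandering with random offsets and sprouting leaves where they reach a bounding sphere, can be sketched in a few lines. The names below are hypothetical.

```python
# Illustrative sphere-bounded branch growth; not ILM's Maya simulation.
import math
import random

def grow_branch(start, direction, sphere_center, sphere_radius,
                step=0.5, branch_prob=0.15, max_steps=200):
    """Grow one branch as a polyline inside a bounding sphere.

    Each step jitters the growth direction (the "probability offset");
    when the tip crosses the sphere, the branch stops and sprouts a leaf.
    Returns (points, leaves, child_seeds), where child_seeds are
    (position, direction) pairs for recursively grown side branches.
    """
    points, leaves, children = [tuple(start)], [], []
    pos, d = list(start), list(direction)
    for _ in range(max_steps):
        # Random offsets steer the branch, keeping growth irregular.
        d = [c + random.uniform(-0.3, 0.3) for c in d]
        norm = math.sqrt(sum(c * c for c in d)) or 1.0
        d = [c / norm for c in d]
        pos = [p + c * step for p, c in zip(pos, d)]
        points.append(tuple(pos))
        if math.dist(pos, sphere_center) >= sphere_radius:
            leaves.append(tuple(pos))            # hit the canopy: sprout a leaf
            break
        if random.random() < branch_prob:
            children.append((tuple(pos), tuple(d)))  # seed a side branch
    return points, leaves, children
```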

The battle, which pits the Wookies and Yoda against the droids, takes place near a lake. “The water is calm, which made it easy,” says Craig Hammack, CG supervisor, “but at the same time, it was hard because it was shallow and clear.” A complex RenderMan shader created the flat surface, with RenderMan’s raytracer handling reflections and refractions. Maya particles and practical elements helped create splashes for the digital droid tanks.
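ILM’s water shader is proprietary, but shallow, clear water is commonly shaded by blending a raytraced reflection and refraction with a Fresnel weight and attenuating the refracted color by depth. The sketch below uses Schlick’s approximation and hypothetical names; it is a generic approach, not the production shader.

```python
# Illustrative Fresnel blend for shallow, clear water; not ILM's shader.
import math

def water_color(reflection, refraction, cos_theta, depth,
                ior=1.33, attenuation=0.35):
    """Blend raytraced reflection and refraction colors for shallow water.

    cos_theta -- cosine of the angle between the view direction and the
                 surface normal (grazing views reflect more, per Fresnel)
    depth     -- water depth along the refracted ray; deeper water darkens
                 what is seen through the surface
    """
    # Schlick's approximation of the Fresnel reflectance.
    r0 = ((1.0 - ior) / (1.0 + ior)) ** 2
    fresnel = r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5
    absorb = math.exp(-attenuation * depth)
    seen_through = tuple(c * absorb for c in refraction)
    return tuple(fresnel * r + (1.0 - fresnel) * t
                 for r, t in zip(reflection, seen_through))
```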

The film’s climax, the fight between Anakin and Obi-Wan, takes place on Mustafar. “Anakin and Obi-Wan have become enemies,” explains Rob Coleman, animation director. “The mentor is hunting the apprentice. It’s the end of hope and friendship. It’s the bowels of hell.”

Roger Guyett supervised the sequence, which involved creating droids and character doubles. But the lava is the real star: 10 to 15 minutes of fiery eruptions, molten rivers, flaming waterfalls, and blazing geysers. The far background is a matte painting; the rocks and lava rivers are miniatures, with the molten material actually gallons of methylcel poured into four-foot-wide channels. Many of the volcanic eruptions are clips from footage taken when Mt. Etna erupted during principal photography. The rest is CG.

Because the lava is integral to the plot and takes part in the action, it had to be art-directed in ways fluid simulations wouldn’t allow. Instead, a crew led by Willi Geiger that included Philippe Rebours, Kevin Sprout, and John Helms developed techniques that used Maya particle effects to create 100 shots of splashing, falling, flowing, erupting fiery lava that matched the real Mt. Etna lava and interacted with the methylcel lava.

For massive lava events, giant eruptions in the middle of the fight, the crew hand-sculpted the flow using a combination of runtime expressions, locally controlled fields, and inter-particle forces. “Maya particles react individually to field forces, but real fluids are a continuous body,” explains Geiger. “So using a system created by Masi Oka, we added inter-particle forces to make each particle aware of the other. We also created a system that allows us to precisely control the influence of each field. Rather than using global fields like ‘gravity,’ we could cheat physics to have some gravity here, another amount there, none somewhere else.”
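As a rough illustration of those two ideas, locally confined force fields and inter-particle awareness, the sketch below advances a toy particle system one step. The names and falloff rules are hypothetical; this is not Masi Oka’s system or ILM’s field controls.

```python
# Illustrative local fields plus inter-particle cohesion; hypothetical names.
import math

def step_particles(particles, velocities, local_fields, dt=1.0 / 24.0,
                   cohesion=0.5, radius=2.0):
    """One integration step with local gravity fields and a simple
    inter-particle cohesion force.

    local_fields -- list of (center, falloff_radius, force_vector); each
                    field only influences particles inside its radius, so
                    gravity can differ from place to place or be absent.
    """
    for i, p in enumerate(particles):
        force = [0.0, 0.0, 0.0]
        # Locally controlled fields: weight each field by distance falloff.
        for center, r, f in local_fields:
            d = math.dist(p, center)
            if d < r:
                w = 1.0 - d / r
                force = [a + w * b for a, b in zip(force, f)]
        # Inter-particle force: pull toward nearby neighbors so the flow
        # behaves more like a continuous body than independent points.
        for j, q in enumerate(particles):
            if i == j:
                continue
            d = math.dist(p, q)
            if 1e-6 < d < radius:
                w = cohesion * (1.0 - d / radius) / d
                force = [a + w * (b - c) for a, b, c in zip(force, q, p)]
        velocities[i] = [v + f * dt for v, f in zip(velocities[i], force)]
        particles[i] = [x + v * dt for x, v in zip(p, velocities[i])]
```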


The fire below Anakin and Obi-Wan (shown inset against greenscreen) combined footage of Mt. Etna erupting, motion-control photography of a miniature, and Maya particle effects.

For other shots, the crew rendered the particle simulations as deforming or implicit surfaces that were sometimes textured with photographic elements and sometimes used to create displacement images that compositors used to warp the photographic plate, for example, to make flowing lava from the miniature splash when something hit it. The most intense and dramatic example of the crew’s work, though, is the lava fall: Niagara Falls made of liquid fire. To organize the particles into streams, the crew constrained a series of radial fields into narrow tubes and had the streams within each tube flow toward a point.
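A minimal sketch of that stream constraint, assuming a hypothetical tube defined by a start point, an end point, and a radius: a radial component pulls each particle back toward the tube’s axis while an axial component drives it toward the point where the streams converge. This is illustrative only, not ILM’s setup.

```python
# Illustrative "radial field in a tube" force for a falling stream.
import math

def stream_force(p, tube_start, tube_end, tube_radius, pull=4.0, drive=9.0):
    """Force keeping a particle inside a narrow falling stream.

    The radial component pulls the particle back toward the tube's axis
    (strongest near the tube wall); the axial component drives it along
    the tube toward its end point, where the streams converge.
    """
    ax = [e - s for s, e in zip(tube_start, tube_end)]
    length = math.sqrt(sum(c * c for c in ax)) or 1.0
    ax = [c / length for c in ax]                       # unit axis direction
    rel = [c - s for c, s in zip(p, tube_start)]
    t = sum(a * b for a, b in zip(rel, ax))             # position along axis
    closest = [s + a * t for s, a in zip(tube_start, ax)]
    radial = [c - q for c, q in zip(closest, p)]        # vector toward the axis
    dist = math.sqrt(sum(c * c for c in radial)) or 1e-6
    strength = pull * min(1.0, dist / tube_radius)      # stronger near the wall
    return [strength * r / dist + drive * a for r, a in zip(radial, ax)]
```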

The film doesn’t end in the bowels of hell; the scenes that follow prepare the way for the next three episodes in the story, the first films Lucas released. So, too, has the technology that Lucas’s effects studio created for this film and the others paved the way for new kinds of filmmaking. As directors turn ever more toward digital tools, the facile manipulation of virtual environments will become increasingly attractive. In this film, Zenviro created science-fiction worlds, but it could as easily replicate a real landscape, or create a mixture of real and fantasy, giving directors “new hope” for the future.



Barbara Robertson is an award-winning journalist and a contributing editor for Computer Graphics World.

Next month, in Part Two of our series, we look at the new cast of virtual characters and digital doubles in Star Wars Episode III.