A skeletal cyborg with bifurcating arms twirling lightsabers hunts down Jedi warriors. Hundreds of Wookiees, perched high in an enormous tree city, launch a chaotic beachfront assault on invading storm troopers. Obi-Wan Kenobi and Anakin Skywalker, amid the molten lava inferno of Mustafar, clash in a climactic duel. These are but a few of the pivotal sequences from Star Wars: Episode III-Revenge of the Sith
that were brought to life with digital performances crafted by the animators at Industrial Light & Magic, the digital effects facility formed by director George Lucas.
From the vast lagoons and jungles of the planet Kashyyyk to the gray sinkhole world of Utapau and the sleet-filled ruins of Mygeeto, Episode III’s myriad environments (see “Dark and Stormy Knight,” June 2005, pg. 10) are teeming with more than 385 computer-generated characters. This massive digital ensemble, the largest ever assembled for a Star Wars film, features returning characters such as Jar Jar Binks, the Separatist leaders San Hill, Watt Tambor, and Shu Mai, scores of clone soldiers and droid armies, and a slightly older Yoda, who appears in 173 shots. Of the 385-member CG cast, 185 are soft-body characters, including the digital debut of the Wookiees, who swing from vines through the canopies of their tree villages and fight en masse in the battle between clone and Separatist forces near a freshwater lagoon. Also appearing for the first time is the Boga, a giant reptilian steed ridden by Obi-Wan as he navigates the vast chasms of Utapau and pursues the star of the new digital cast: the dreaded General Grievous, who is hired by Darth Sidious to purge the galaxy of the remaining Jedi.
Flanked by a cadre of bodyguard droids carrying electrified pikes that are impervious to lightsabers, Grievous is an intricate amalgam of flesh, gears, rods, pistons, and tubes. The biomechanical terror appears in 84 shots and is the bane of Obi-Wan, attacking him in the corridors of a Separatist warship and later on Utapau. When Grievous’s mechanized body blooms into full, four-armed attack mode, the armor plates on his limbs pull back, his retractable, spike-like ribs unfold, his skeletal spine telescopes and rotates 360 degrees, and his arms split in two, spinning like fan blades as they wield deadly lightsabers. Although Grievous is primarily a mechanical creature, his body is sparsely interfaced with bits of organic tissue, including a fleshy sac nestled within his chest cavity and reptilian eyes embedded in soft tissue and covered in mucous membranes.
Even though the galaxy far, far away in Episode III is teeming with hundreds of digital characters, there is one technical terror that reigns supreme: the biomechanical General Grievous.
ILM’s modelers built Grievous’s rigid surfaces using Alias’s Maya and Power Animator, then turned to ILM’s proprietary organic modeling software, I-Sculpt, to fashion the fleshy bits fused to the machinery. While the model’s rigid surfaces were almost entirely polygonal, the finished version also incorporated several organic parts sculpted with NURBS. Composed of numerous interrelating parts, the character’s skeletal body displays a dizzying freedom of movement that entailed a number of rigging challenges.
“A huge effort was invested in how all of the mechanics of the creature, with so many degrees of freedom, would work,” says ILM creature developer Aaron Ferguson. “In addition, throughout his long development cycle, Grievous constantly evolved and changed. The animators would rig a part so it was capable of a certain range of motion, then it would have to perform new movement for which it was not designed or rigged.” To cope with the changes and rig Grievous for full rotational functionality, the animators created “cheats” for the problem poses or placed the troublesome parts on alternate layers, rendered them separately, and fixed the errors during compositing.
“Once Grievous was animation-ready, the team began crafting the walk cycles, which ultimately demanded changes to the model,” says Ferguson. “Even changes in his voice necessitated alterations in his walk cycles and overall character, such as his breathing, posture, and the cadence in his step, forcing us to rework some geometries, rest positions, or other poses.”
To expedite the keyframing in Maya, ILM’s artists implemented a range of expressions and Set Driven Keys for setting up complex correlations between the movements of Grievous’s many parts, such as the sliding, unfolding, and retracting armor plates. Once completed, the animations were baked, or converted to keyframes on animation curves, for use in ILM’s various in-house software.
ILM’s Zeno dynamics software, for example, handled the rigid-body dynamics as Grievous breaks apart during his melee with Obi-Wan. It also simulated the effects when Grievous’s guards get torn apart during battle and when their bodies are bombarded by objects that fall on, bounce off, and collide with their surfaces.
For realistic interaction between the actor and the CG creature, ILM initially used stand-ins for Grievous and his droids to choreograph the motion. Later, animations were selected that complemented those movements, and were then applied to the CG models.
In ILM’s Caricature (CARI), which is primarily used for facial animation, the artists performed a cloth simulation on Grievous’s flowing white cape. CARI’s cloth engine, which also simulated the belts and crude animal-skin garments of the digital Wookiees, employs two collision paradigms: collision bodies and collision volumes. Collision bodies calculate surface-to-surface collisions to avoid intersections with the adjacent geometries, while collision volumes are tessellated geometries that represent the character’s underlying body. Using these volumes, CARI could determine when the cloth geometry penetrated the body, then would exert a force to ease it gently back outside the volume, preventing it from suddenly popping out.
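The “ease it gently back outside” behavior can be illustrated with a toy version of the idea. Here the collision volume is a simple sphere rather than tessellated body geometry, and the stiffness constant is invented; the point is that a penetrating cloth point is nudged a fraction of the way out each step instead of being snapped to the surface:

```python
import math

def resolve_penetration(p, center, radius, stiffness=0.25):
    """If cloth point p is inside the spherical collision volume,
    move it a fraction of the remaining penetration outward each
    step (a gentle corrective force) rather than snapping it to
    the surface, which would read as a visible pop."""
    dx, dy, dz = (p[0]-center[0], p[1]-center[1], p[2]-center[2])
    dist = math.sqrt(dx*dx + dy*dy + dz*dz)
    if dist >= radius or dist == 0.0:
        return p                                # outside: no force
    penetration = radius - dist
    push = stiffness * penetration / dist       # ease out, don't snap
    return (p[0] + dx*push, p[1] + dy*push, p[2] + dz*push)

# A point 0.5 units deep inside a unit sphere drifts outward over steps:
pt = (0.5, 0.0, 0.0)
for _ in range(20):
    pt = resolve_penetration(pt, (0.0, 0.0, 0.0), 1.0)
print(round(pt[0], 3))  # 0.998 -- eased most of the way back out
```

Each iteration closes a quarter of the remaining gap, so the correction decays smoothly instead of arriving all at once, which is the visual difference the article describes.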
General Grievous’s primary opponent in the film is Obi-Wan, who tangles with Grievous and his droids in vicious lightsaber battles aboard the Invisible Hand, flagship of the Trade Federation, and on the planet Utapau. To choreograph the duels between actor Ewan McGregor (Obi-Wan) and the digital characters, ILM used visual markers arrayed across the set and affixed to stand-in performers for Grievous and his droids. This eased the later processes of rotoscoping, character tracking, and match-moving the various objects and camera motions.
However, because of the four-armed nature of the character and its superhuman range of movement, motion-capturing Grievous’s stand-in was difficult and rarely done. Instead, Lucas shot numerous plates of McGregor swinging his lightsaber, dodging, repelling, and parrying blows, and performing a host of other maneuvers, usually with the stand-in. This resulted in a library of footage from which the animators would select the actions that best complemented their digital choreography of Grievous.
In several scenes, such as when Obi-Wan attacks the Grievous guards (cutting off their heads, sending their body parts crashing to the floor, and scattering sparks and debris in all directions), blue-suited guard stand-ins would knock over on-set objects, move around, and establish the general space in which the digital characters would perform. Using these plates, ILM match-moved the shots in Zeno before compositing Grievous, the droids, and, sometimes, digital doubles for Anakin and Obi-Wan into the scenes.
General Grievous stalks Jedi, particularly Obi-Wan, until the two meet in a final showdown.
According to Ferguson, one of the greatest challenges facing the film’s character animators was to make the battles between the seemingly invincible Grievous and Obi-Wan appear believable. “One approach we took to endowing the battles with credibility was to have Grievous do flashy maneuvers, spinning his lightsabers and doing elaborate arm motions that were more of a display than an attack,” says Ferguson. “In addition, we tried to keep Obi-Wan constantly retreating.”
For the beachfront battle scene on Kashyyyk, hundreds of digital Wookiees were used to expand a small contingent of costumed, live actors into a massive, swarming Wookiee army. While a handful of “hero” shots prominently display the digital Wookiees, most of the close-ups are reserved for the costumed actors. However, for shots that called for the Wookiees to throw large objects, jump spectacularly, swing on vines, or perform in wide-angle views, Lucas turned to their digital counterparts. “The live actors just could not do all the motions with the full gear on,” says Ferguson.
At the start of production, creature developers built six Wookiee models in I-Sculpt. But when they realized that the majority of the Wookiees’ actions would be mocap-driven cycle animations for midground and background crowd characters, they decided to combine those six models into one “uber” Wookiee model. Using this all-purpose digital Wookiee, which sported all the clothes, headgear, armaments, and body armor of the original six characters, the animators simply toggled off the geometry they didn’t want rendered. To further individualize the crowd Wookiees, the crew swapped out maps for the different hair textures and developed shape libraries to change basic body proportions.
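The uber-model approach amounts to a simple idea: one asset carries every costume piece, and each crowd instance is just a recipe of visibility toggles, map swaps, and proportion tweaks. A hypothetical sketch (all part names, map names, and ranges here are invented for illustration):

```python
import random

# One "uber" asset carries every costume piece; each crowd instance
# toggles pieces off and swaps maps. All names are hypothetical.
UBER_PARTS = ["body", "helmet", "chest_armor", "bandolier", "bowcaster", "kilt"]
FUR_MAPS = ["fur_brown", "fur_black", "fur_gray", "fur_auburn"]

def make_crowd_wookiee(seed):
    """Build one deterministic crowd-character recipe from a seed."""
    rng = random.Random(seed)
    # The body is always rendered; each accessory is toggled at random.
    visible = {"body"} | {p for p in UBER_PARTS[1:] if rng.random() < 0.5}
    return {
        "visible_parts": sorted(visible),
        "fur_map": rng.choice(FUR_MAPS),        # texture-map swap
        "scale": round(rng.uniform(0.92, 1.08), 3),  # body-proportion tweak
    }

crowd = [make_crowd_wookiee(i) for i in range(3)]
for w in crowd:
    print(w["fur_map"], w["visible_parts"])
```

Seeding each instance makes the variation reproducible from shot to shot, which matters when the same background character must rerender identically.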
In contrast to the limited facial expressivity of Chewbacca’s mask from the original trilogy, the head masks for Episode III’s Wookiees are outfitted with animatronic mechanisms for remotely flexing various facial muscles. Nevertheless, the costumed Wookiees still exhibited limited, yet nuanced, facial movement, which ILM had to duplicate in the digital doubles. The library of Wookiee facial blend shapes, sculpted in CARI, included a range of fear expressions and basic phonemes, as well as blinking and scowling. In contrast to Maya, ILM’s CARI does not require the artists to create separate models for each new morph target, allowing them, instead, to model the new shapes directly onto the base model. CARI analyzes the new shape and calculates the difference between the base and target poses.
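Computing “the difference between the base and target poses” is per-vertex subtraction; each sculpted shape reduces to a set of offsets (deltas) from the base mesh, which can then be mixed with animated weights. A simplified sketch, using a made-up two-vertex “mesh” and a hypothetical shape name:

```python
def blend_deltas(base, targets, weights):
    """Apply weighted blend-shape deltas to a base mesh. Each target
    is stored only as its per-vertex offset from the base, so shapes
    can be mixed by simple weighted addition. Illustrative sketch."""
    result = [list(v) for v in base]
    for name, weight in weights.items():
        for i, delta in enumerate(targets[name]):
            for axis in range(3):
                result[i][axis] += weight * delta[axis]
    return [tuple(v) for v in result]

# A 'scowl' shape sculpted directly on the base becomes a delta:
base  = [(0.0, 1.0, 0.0), (0.0, -1.0, 0.0)]
scowl = [(0.0, 1.5, 0.0), (0.0, -1.0, 0.25)]   # sculpted target pose
delta = [tuple(t[a] - b[a] for a in range(3)) for t, b in zip(scowl, base)]

posed = blend_deltas(base, {"scowl": delta}, {"scowl": 0.5})
print(posed[0])  # (0.0, 1.25, 0.0) -- halfway into the scowl
```

Storing targets as deltas rather than full copies of the mesh is also what lets a tool accept shapes modeled straight onto the base model, as the article describes for CARI.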
A large group of CG Wookiees fight shoulder to shoulder with costumed actors in the battle scene below. The creatures were crafted from a “super” model that was then individualized with different body shapes and fur.
After the artists modeled the faces and applied the hair in Zeno, they would often lose the form of the digital faces, a common problem in fur and hair simulations. “The masks worn by the actors had the short hair gelled neatly into place around the face to accentuate the eyes and mouth, so you could easily read the facial expression,” says Ferguson. “The hair gets much longer on the back of the neck, along the chest, and down the sides of the body and the legs, and would blow around in the wind.” To control the body hair, ILM’s technical directors (TDs) expanded Zeno’s hair tool, enabling the artists to finesse details, from density, direction, and clumpiness to tufting, crimping, and curliness.
To animate the digital Wookiees, ILM motion-captured the film-plate actors. The group then combined the captured movements into longer cycles-running, jumping, charging, gesturing, shaking spears, shooting weapons, and falling down-and mapped them to the character skeletons in Maya.
An improved version of ILM’s spring deformer system, developed to make the Hulk’s muscles jiggle for the 2003 feature film (see “Body Building,” July 2003, pg. 24), was also used for Episode III’s characters. It was most visible on the Boga, Obi-Wan’s bipedal lizard steed, making the flesh on its legs jiggle and flap around. As the springs flexed, rippled, and billowed the Boga’s skin, they now had the ability to preserve the interior volume and hold it as the originating point for the flapping. “This means that you could loosen up the flesh, have it flap and bounce around, and because it never loses volume, it doesn’t look like an empty bag floating around,” Ferguson explains.
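At its core, a spring deformer layers a damped spring on top of the skeletal animation: each flesh point chases its rigged rest position, overshoots, and oscillates before settling, which reads as jiggle. This toy one-dimensional version (made-up constants, and without the volume-preservation layer ILM added) shows that behavior:

```python
def step_spring(pos, vel, target, stiffness, damping, dt):
    """One semi-implicit Euler step of a damped spring pulling a
    flesh point toward its rigged rest position. The spring lags,
    overshoots, and settles -- the follow-through seen as jiggle."""
    accel = stiffness * (target - pos) - damping * vel
    vel += accel * dt
    pos += vel * dt
    return pos, vel

# The rig snaps the rest position from 0 to 1; the flesh follows late,
# overshoots past 1, oscillates, and settles back onto the rig.
pos, vel = 0.0, 0.0
trace = []
for _ in range(400):                      # 4 seconds at dt = 0.01
    pos, vel = step_spring(pos, vel, 1.0,
                           stiffness=120.0, damping=4.0, dt=0.01)
    trace.append(pos)

print(max(trace) > 1.0)     # True: the overshoot is the jiggle
print(round(trace[-1], 2))  # 1.0: settled back on the rest position
```

The volume-preservation improvement the article describes would constrain these per-point springs so the flesh's interior volume stays constant while the surface bounces, avoiding the “empty bag” look.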
Since both McGregor and Hayden Christensen (Anakin) were unavailable for reshoots, Lucas turned to the NURBS models ILM had constructed of each actor at the start of Episode II-Attack of the Clones. In addition to updating the models to replicate the current look of the actors, the crew also overhauled their riggings to take advantage of new FK/IK blending tools in Maya and a variety of chaining mechanisms designed to interface with ILM’s mocap pipeline and give the animators finer control of the characters.
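FK/IK blending mixes two sources of joint angles: the animator’s keyed forward-kinematics pose and an inverse-kinematics solver’s pose that plants a hand or foot exactly on a target. A hypothetical planar two-bone sketch of the idea (not ILM’s or Maya’s actual rig code, and using a naive per-joint angle blend):

```python
import math

def ik_two_bone(target, l1, l2):
    """Analytic two-bone IK in the plane: return (shoulder, elbow)
    angles so the chain's tip lands on the target point."""
    x, y = target
    d = min(math.hypot(x, y), l1 + l2)          # clamp to max reach
    cos_elbow = (d*d - l1*l1 - l2*l2) / (2*l1*l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(y, x) - math.atan2(l2*math.sin(elbow),
                                             l1 + l2*math.cos(elbow))
    return shoulder, elbow

def fkik_blend(fk_angles, ik_angles, w):
    """Linear FK/IK blend per joint: w=0 is pure FK, w=1 pure IK.
    (Production rigs interpolate rotations more carefully.)"""
    return tuple((1.0 - w)*f + w*i for f, i in zip(fk_angles, ik_angles))

# IK pose reaching the point (1, 1) with two unit-length bones:
s, e = ik_two_bone((1.0, 1.0), 1.0, 1.0)
tip = (math.cos(s) + math.cos(s + e), math.sin(s) + math.sin(s + e))
print(round(tip[0], 6), round(tip[1], 6))  # 1.0 1.0

# Halfway between a straight-out FK pose and the planted IK pose:
half = fkik_blend((0.0, 0.0), (s, e), 0.5)
```

Ramping the blend weight over a few frames lets an animator hand a limb off between keyed performance and solver-driven contact without a pop, which is the practical payoff of such tools.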
Since ILM’s software was unable to resolve some of the Maya chaining mechanisms at the start of the production, the team was forced to rely heavily on caching geometry to transfer animations between Maya and the group’s in-house software. Throughout the course of production, however, ILM outfitted its in-house tools with equivalents to the Maya rigging tools, making the animation transfer easier and more efficient near the end of the project. In fact, Maya eventually became the team’s primary animation package, a move that, according to ILM chief technology officer Cliff Plumer, gave the digital characters greater range, allowing them to show emotion and interact with the live-action actors more convincingly than before.
ILM is also consolidating its hair, cloth, and rigid-body dynamics tools within Zeno (see “Acts of War,” pg. 12), where the dynamic effects can be simulated together rather than separately, as was the case for the Wookiees. Indeed, because the Wookiees’ hair simulation was performed in Zeno while their clothing and utility belts ran through CARI’s cloth engine, the hair and garments could not interact when they brushed against each other.
With ILM restructuring its pipeline around Zeno and Maya, one of the company’s primary goals is to stop relying on NURBS and instead fully embrace the more stable subdivision surfaces, especially for creature construction. “The models [for the Star Wars prequels] and their shape libraries had already been built with NURBS, so we continued to use them,” says Ferguson. “But in the future, we hope to make the change.” In addition, ILM’s flagship Zeno will contain new match-moving and photo-modeling tools.
ILM improved its spring deformation system, which the studio created for the Hulk’s muscles, to make the flesh on the Boga (a lizard-like creature) jiggle and bounce around more naturally in Episode III.
Episode III closed the book on the Star Wars saga, but perhaps not on the digital characters. Development has already begun on a new Star Wars television series, which Lucas says will focus on peripheral characters from the period between Episodes III and IV, and harness the power of the new Zeno-based pipeline. So, while fans may have seen the last of Anakin and Obi-Wan, new chapters may still be written on the further adventures of the Gungans, Dexter Jettster, Jabba the Hutt, or Boba Fett.
Like the other returning characters in the film, Yoda shows subtle changes in his appearance in Revenge of the Sith, most of which are designed to make him appear younger than he did in The Empire Strikes Back.
Merging the digital Yoda with the enfeebled puppet meant applying a new skin-simulation technique to add age lines to the folds under his eyes. This “skin relaxation” technique, developed originally for the 2003 film Hulk, is designed to make the skin appear to stretch, crease, and fold realistically.
“We tried using that engine on Yoda, which is a skin-over-bone system, and we got some good results, but we ultimately tuned them down a bit because we wanted to remain faithful to the ‘heavy skin’ look Yoda is known for,” says creature developer Aaron Ferguson. “It was an interesting result, and some might argue a more beautiful result, but it was a question of keeping on character.”
The skin simulation the group ultimately performed on Yoda was concentrated primarily around the eyes and is mostly visible in extreme close-ups. Also, the artists sculpted new blend shapes to make the character look a bit older than he was in Episode II.

The author, a contributing editor for Computer Graphics World, can be reached at firstname.lastname@example.org.