Making Halo 3 Shine
Volume 30, Issue 12 (Dec. 2007)

Last month, we kicked off this series on true next-generation titles by highlighting Midway’s Stranglehold. In this issue, we detail the cutting-edge graphics in Halo 3, and we will end this three-part feature with a look at the stunning visuals in Ubisoft’s Assassin’s Creed.

It seems like yesterday when Bungie’s Halo became the must-have title when Microsoft released its original Xbox console. Winning all kinds of game awards, Halo also topped all the video game best-seller charts for nearly two years. In November 2004, Bungie’s Halo 2 was released, igniting a firestorm among the gaming community as rabid fans waited overnight in long lines to be the first to continue the saga of Master Chief.

Halo 3, the final game in the Halo trilogy, puts players back into the green gunmetal armor of Master Chief as he returns to Earth to save mankind from destruction at the hands of the Covenant, the evil alien coalition. Meanwhile, the baddies are still trying to activate the deadly Halo array while under the influence of an even more dangerous enemy called the Flood, a sentient parasite that makes monstrous puppets of its victims.

Production on the game began in the wake of a flood of criticism over Halo 2’s ending, which many thought was too much of a cliffhanger (see “The Halo Effect,” January 2005). “It was on our minds, of course, but the events of Halo 3 were in a significant way decided before Halo 2 ever hit stores,” says writing lead Frank O’Connor. “We knew there was more story to tell, and we knew that we had shipped a cliff-hanger ending that would have to be resolved in a third part. We were careful to try and ensure a satisfactory conclusion.”

For the third installment, Bungie remained true to the protocol of “guns, grenades, and melee attack” that makes Halo feel like Halo, but also tried to evolve the gameplay to keep the combat fresh and interesting. “That was done in subtle ways with timing and balance, but also in less discreet ways, including new weapons, vehicles, and, of course, equipment,” says O’Connor.

In addition, in the four-player cooperative mode, rather than assuming the roles of identical Spartans, players can now take control of Master Chief, the Arbiter, N’Tho ’Sroam, and Usze ’Taham. “This decision was made to create real personalities for the co-op mode, so that you weren’t all simply using clones of the Chief, but without impacting the core fiction in any meaningful way. Multiplayer tends to take giant liberties with canon, but we wanted to make that less jarring in Campaign [mode],” says O’Connor.
The war between the humans and the Covenant continues in Halo 3. Like its predecessors in the series, the title incorporates current state-of-the-art graphics and AI.

To create the immersive environments, Bungie once again used Havok Physics to drive the behavior of objects and characters in the game. Bodies fly through the air using ragdoll physics, vehicles explode, showering debris onto their surroundings, and new additions such as the portable gravity lift and the “man cannon” propel players through the world. Havok is also used for special effects, including plants swaying in a jungle and Flood tentacles hanging from a ceiling.

While Halo 3’s graphics engine is tied very closely to the Xbox’s HDR model, O’Connor says the most important factor in delivering the game’s immersive sense of “place” is the unification of all the elements. “The HDR, the apparent and actual scale, the Mie and Rayleigh scattering (for simulating clouds, gases, solids, and liquids on light) in the atmosphere are all designed to immerse the player in our vision of the Halo worlds,” he says. “Each individual aspect of the game, from the PRT (precomputed radiance transfer) lighting to the types of shaders we use in explosions and other effects are built to match an evolution of our art style. Initially, we saw a lot of criticism based around the idea that we never shot for photorealism or to ape the art style of games like Gears. In the end, the choices we made were right for Halo.”

Character Modeling

Bungie overhauled all the character models from Halo 2, pushing their polygon counts from 3000 to around 6000. The Chief clocked in at approximately 6000 polygons, the highest density the team could afford to maintain within the expansive and chaotic nature of the war, which could have “an army” of characters on screen at any given time. More importantly, the game marked the first time Bungie was able to utilize the projection-mapping method of taking higher-resolution geometry and mapping it onto a lower-resolution poly character. “This allowed us to get fine details and more realistic features on faces and organic surfaces,” says 3D art lead Shi Kai (Shiek) Wang.
The Halo vehicles were modeled within 3ds Max and hand-painted in Photoshop.

The real-time reflection mapping now present on Master Chief’s armor proved a double-edged sword, adding detail at an enormous rendering cost. “First off, [it] was extremely expensive, so it would be used mainly in the cinematics. However, [to handle the in-game reflection mapping] the programmers figured out a way to project environment maps in each level, store it in the game, and have it blend dynamically on the Chief’s visor in-game and between different levels. We were able to break it down into different areas, so even when the Chief is in the same level but in different lighting situations (such as indoors versus outdoors), the reflection mapping would shift,” says Wang. “The Xbox 360 gave us a bigger bucket to fill our creative assets.”

While artists modeled and animated the characters in Autodesk’s Maya, each of them drew from a raft of packages—Autodesk’s 3ds Max and Mudbox, Pixologic’s ZBrush, and Softimage XSI, as well as Maya—to incorporate the projection mapping into the game. “The most significant modeling challenge was adding all those different systems and making it work with our proprietary engine tools.” Of all the characters, the marine proved to be the most difficult, Wang adds, because he was the first to undergo this procedure.

During the making of Halo 2, when the armored characters assumed extreme poses, modelers often ran into “joint rotation pinching” problems around T-junctions or some of the more difficult seams, such as the shoulder. To solve the problem for Halo 3, Bungie hired a rigging artist solely dedicated to solving these problems. “The problem is twofold: One is rigging, and the other is animation,” explains Wang. “This rigging artist ensured that the animators and other rigging artists were constantly talking about what changes were being made and what animations were being authored. Unfortunately, there weren’t any magic solutions except to make sure all the holes were filled.”

In order for the player to see the damage being inflicted upon the enemy, it was more important to model damage permutations into the foes rather than “friends.” Hence, the Covenant’s soldiers have damage effects modeled into their geometry. On the other hand, artists chose to “bloody up” the Marines using a simple texture swap. “That was because they would fight at your side, not against you, so a gradual change in destruction isn’t necessary,” Wang explains. “The only indicators you need to know for your Marines is whether they’re alive or dead, and with a texture swap, that worked perfectly. The Chief and ODSTs (akin to Special Forces) were not broken down because we didn’t want to make them look weak once they were killed. We all thought the feel for these main characters worked best when their bodies were intact.”

For each of the main characters, the shader system is like a thousand-layer cake, made primarily in Adobe Photoshop. Artists, however, sculpted normal maps and bump maps in 3D using ZBrush and Mudbox; these were then either imported into Photoshop for painting in the details of the height maps or touched up with Nvidia’s Normal Map filter for Photoshop.

The face, for example, comprises the following maps: base, detail, bump, bump detail, specular, occlusion parameter, subsurface, and transparency. The metallic shaders for Master Chief were a lot simpler, as the shader system culled unneeded functions depending on the needs of the material. Generally, the size of the maps was set at 768x768 for bump and normal maps, 512x512 for the base map, and 256x256 for the detail map.

“Once we were ripping maps from higher-res geometries, we would rip them at 2048s or 1024s and let Photoshop’s scaling do the job of sizing details down,” notes Wang. “This ensured that more detail would be captured at higher res first, then properly scaled down to fit the size our engine can handle.”
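Wang’s “rip high, scale down” workflow can be sketched in a few lines. This is an illustrative stand-in only: Bungie used Photoshop’s scaler, while the tiny map sizes and simple box filter below are assumptions made for clarity.

```python
# Hypothetical sketch of baking detail maps at high resolution, then
# box-filtering them down to the in-game budget. A 4x4 grid stands in
# for a 2048x2048 grayscale height map.

def downsample_2x(pixels):
    """Halve a square grayscale map by averaging each 2x2 block."""
    size = len(pixels)
    assert size % 2 == 0, "expected an even-sized map (e.g., 2048 or 1024)"
    out = []
    for y in range(0, size, 2):
        row = []
        for x in range(0, size, 2):
            block = (pixels[y][x] + pixels[y][x + 1] +
                     pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

def scale_to(pixels, target):
    """Repeatedly halve a map (e.g., 2048 -> 1024 -> 512) until it fits."""
    while len(pixels) > target:
        pixels = downsample_2x(pixels)
    return pixels

# A tiny 4x4 "height map" stands in for a high-resolution bake.
bake = [[float(x + y) for x in range(4)] for y in range(4)]
ingame = scale_to(bake, 2)  # scale down to the engine's budget
```

The point Wang makes survives even in this toy form: every output texel averages detail captured at the higher resolution, rather than sampling a map that was authored small to begin with.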

Character Animation

When rigging Master Chief’s sectional body armor to deform realistically with the more pliant undersuit, animator Nathan Walpole says the first objective was achieving dynamic posing. “We focus on hitting the silhouettes and the poses we desire; if two pieces of armor slide over each other or mildly intersect, but the pose and motion on the character are fantastic, we feel we did our job,” he says. “‘Passion before pixels’ is what makes our animation shine.”
The artists used real-time reflection mapping on Master Chief’s armor for the cinematics; in game, they blended projected environment maps onto his visor for a similar effect.

The entire Halo 3 cast is outfitted with standard FK/IK controls built in Maya. Bones form the foundation of the base skeleton, while helper joints aid in deforming the character and maximizing its flexibility. “We did not break the mold with regard to rigging. Instead, we focused on the core structure of the character and more robust custom tools within Maya to help our animators focus on the art of animation rather than fighting with the latest gimmicky spline IK solution or other rigging fads,” Walpole says.

Walpole continues: “Custom rigs for the Flood Pureform characters got a bit more complex, but nothing truly out of the FK/IK standard. What we did break ground on was the inclusion of cinematic and rigging tools within Maya to speed and solidify production.” The animator points out that Bungie’s tools are likely to become a discussion topic at the upcoming Game Developers Conference (GDC) in February, much like its cinematics pipeline and tools were at the fall ADAPT conference in Montreal.

Bungie also made significant improvements to Halo 2’s animation and blending systems, specifically to handle the wildly mutable Flood Pureforms and the more dynamic Brutes. “For the Flood transforms, we [implemented the capacity for] uniform scaling within our engine. Pulsating bits and boils would scale and writhe while the characters transformed,” describes Walpole. Using joint scaling and translation tricks, the engine could also simulate two characters fusing and dividing as the same transformable entity. “When we wanted to change the Flood Pureform creature into the Flood tank, we worked our magic within the engine to mask the swap of two separate models and skeletons at one shared pose,” he adds. “The pliable rigs and scaling in the engine aided this seemingly impossible task.”

In total, animators created more than 1000 animations in Maya for Master Chief and his allies, the Elites, not including cinematics or story-moment animations. The game boasts upwards of 10,000 animations for gameplay alone; all the animations were done by hand without the use of mocap or other similar tools.

Walpole says one of the greatest challenges of animating game characters is finding each one’s soul, or “motion hook.” One of the best motion hooks was found for the evil alien Brutes. “Having a character like the Brute show deliberate attitude when armored, then a different persona when unarmored, led to some creative animations and a well-rounded character,” says Walpole. The Brute’s “motion hook” ultimately ended up driving many gameplay decisions and trickled down to other difficult-to-solve characters, like the transforming Flood Pureforms.

Halo 3 also adds a new type of weapon to Master Chief’s arsenal: the heavy, two-handed plasma turret, which weighs heavily on the Chief’s movements. “These turrets broke all the rules of our traditional first-person animation system because the character was now being viewed closely in third person. All of a sudden, in third-person view, we could see small errors and idiosyncrasies in our animations,” says Walpole. As a result, the team had to revisit code and develop smoother interpolation methods between the animations to make Master Chief and his Elite counterpart look weighted when carrying the removable plasma turret. That sense of weight was achieved by hand-keying the animations against video reference. “Without reference, your animation is lost,” Walpole adds.

Cortana, Sergeant Major Johnson, and Commander Keyes all return with a facial animation system based on 50 bones for controlling more than 17 phonemes and eight emotive states, ranging from happy to sad, angry to pensive. “Multiple animations would play and blend over one another to create the final result,” says Walpole. “Managing 50 bones is not an easy or time-productive task for any animator.” The group incorporated the help of OC3, an automatic speech system for game content, to drive the speech portion of the animation for each language that is supported. Animators then layered the emotive states over that procedural speech animation to enhance the lines of the English dialog using a custom tool that allowed the animation team to work with 50 nodes with relative ease.
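The layering Walpole describes — procedural speech animation with emotive states blended on top — can be sketched as a per-bone blend. Reducing each bone to a single rotation value and the bone names and weights here are simplifying assumptions, not Bungie’s data.

```python
# Hedged sketch of layered facial animation: a procedural lip-sync pose
# with an emotive pose blended over it, bone by bone.

def blend_layers(speech_pose, emotive_pose, emotive_weight):
    """Layer an emotive pose over the speech pose, per bone."""
    result = {}
    for bone, speech_rot in speech_pose.items():
        emo_rot = emotive_pose.get(bone, 0.0)
        # Linear blend: 0.0 = pure lip-sync, 1.0 = pure emotion.
        result[bone] = (1.0 - emotive_weight) * speech_rot \
                       + emotive_weight * emo_rot
    return result

# Illustrative poses: the speech layer opens the jaw for a phoneme,
# while the "angry" layer pulls the brow and lip corner down.
speech = {"jaw": 20.0, "lip_corner_l": 5.0, "brow_l": 0.0}
angry = {"jaw": 25.0, "lip_corner_l": -10.0, "brow_l": -15.0}

face = blend_layers(speech, angry, 0.5)
```

A custom tool managing 50 such bones would expose only the layer weights to the animator, which is presumably why Walpole’s team could work with that many nodes “with relative ease.”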


Environments

Halo 3’s epic story unfolds across environments that are larger and more detailed than those of any of its predecessors. There are ocean-size waterfalls, a scorched African savannah, barren otherworldly plains of sand and rock, lush jungles intercut with rocky cliffs—all leading to the dark, glowing corridors of alien ships. As a result, the team faced the challenge of creating an entirely different visual language from one mission to the next.

“We chose to begin the adventure in a dense jungle on Earth to start players out in a familiar, intimate environment so we can both refamiliarize the player with the game and set the stage for the impending siege of Earth,” says environment art lead Christopher Barrett. “As we move through the campaign, we slowly open up the world to the player by showing the war-torn planet and what they are fighting for. Setting the first part of the game in familiar surroundings allows us to pull the rug out from under the player when we put the person on the completely alien, barren wasteland of the Ark.” Along with the music, the group used various color palettes and types of spaces to evoke entirely different moods that complemented the narrative through different parts of the game.
All the characters contain standard FK/IK controls. Helper joints within the bone structure give the characters more flexibility in their movements.

One of the greatest challenges was establishing an immense, epic scale for the environments, especially the Halo ring. “Texturing a structure the size of a small planet is no easy task. In the past, we might have used a matte painting of some of those immense elements, but in Halo 3, we actually built them,” says Barrett. “The first time I saw the model for the half-constructed Halo, I couldn’t believe it. You could fly the camera through the girders and structural beams, see giant sentinels hovering above the surface constructing the Halo. Sometimes I think our artists are insane,” he jokes.

Once again, Havok Physics powers the almost-universal destructibility of the environmental elements, including glass windows, fusion coils, pieces of electronics, and metal crates and tables. “We tried to put more dynamic objects in Halo 3 than we did in the previous Halo games, as they help bring the world alive and make it feel real,” says Barrett.

Physics and Effects

Havok’s continuous physics technology allowed Bungie to introduce the Mongoose human transport vehicle for the first time. “Because the Mongoose moves so fast relative to its size, it wasn’t possible to simulate before Havok 3 and continuous physics,” says Eamon McKenzie, physics engineer. “We tried to implement the Mongoose in Halo 2, but it wasn’t possible with Havok 2’s time-step locked physics update. Also, because continuous physics enables stable simulation of fast-moving, small objects, we were able to simulate our weapons as rigid bodies this time, instead of with point physics as we did in Halo 2.”

All the vehicles fall into one of three physics simulation categories: friction, antigravity, and flying. Interestingly, the Brute Chopper, the enemy’s transport vehicle, is a hybrid of all the vehicle physics primitives and uses a custom version of the friction-point physics for the front end and antigravity physics for the back. “It makes sense because Brute technology tends to be a hybrid of Covenant technology and gritty mechanical things,” adds McKenzie.

Using Havok Physics, artists rigged the abundant plant life just like ragdolls. McKenzie explains: “The plants are a set of rigid bodies connected by physically driven constraints. One thing that makes plants different from other objects is that they live in what we call the proxy world. They exist in a separate instance of Havok from everything else. The player has a parallel rigid body in the proxy world that is soft keyframed to the player’s real-world position. This parallel rigid body in the proxy world interacts with plants without any feedback; actions in the regular Havok world can affect things in the proxy world, but nothing that happens in the proxy world affects the main simulation.” This was important, he notes, because the group wanted to be able to place plants throughout the environment without worrying about impeding the motion of AI and the players.
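The one-way coupling McKenzie describes can be sketched with two tiny simulations: the main world drives a proxy body, the proxy body bends plants, and nothing flows back. Everything below is an illustrative toy, not Havok; the stiffness value and 1D positions are assumptions.

```python
# Minimal sketch of a "proxy world": foliage lives in its own simulation,
# a proxy body mirrors the player's real position, and the proxy world
# never writes back to the main one.

class MainWorld:
    def __init__(self):
        self.player_pos = 0.0

    def step(self, velocity):
        self.player_pos += velocity  # main simulation never reads the proxy

class ProxyWorld:
    def __init__(self):
        self.proxy_pos = 0.0
        self.plant_bend = 0.0  # 0 = upright, 1 = fully pushed aside

    def sync(self, player_pos, stiffness=0.5):
        # Soft-keyframe the proxy body toward the player's real position.
        self.proxy_pos += stiffness * (player_pos - self.proxy_pos)

    def step(self):
        # The proxy body pushes a plant rooted at x = 2.0; the feedback
        # stays entirely inside this world.
        if abs(self.proxy_pos - 2.0) < 1.0:
            self.plant_bend = 1.0 - abs(self.proxy_pos - 2.0)
        else:
            self.plant_bend *= 0.8  # spring back when the player leaves

main, proxy = MainWorld(), ProxyWorld()
for _ in range(4):
    main.step(velocity=0.75)     # player walks forward
    proxy.sync(main.player_pos)  # one-way coupling: main -> proxy only
    proxy.step()
```

Because the player’s motion is computed before the proxy world ever runs, a thicket of plants can never block or nudge the player or the AI, which is exactly the guarantee Bungie wanted.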

From scorching blasts of flamethrower fire and plasma bursts to fog, smoke, snow, and splashing water, the game is filled with hundreds of unique particle effects, all produced by a custom particle engine developed in-house. “Particle simulations are based on randomness, so our tools are designed to control and shape particle systems while retaining that element of chaos that makes them feel realistic,” says Steve Scott, effects art lead.

Scott describes the process. “The effects artists start by painting textures for particles in Photoshop. Those bitmaps are applied to 2D cards that are spawned in a particle system,” he says. “Working with the particle tool, we tweak parameters such as spawn rate, gravity, air friction, and scale to make the particles look and behave the way we want. Usually, we add more particle systems to flesh out the effect.”
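The parameters Scott names — spawn rate, gravity, air friction, and scale — map directly onto a simple particle update loop. This is a toy CPU version for illustration; the class name, the 1D particles, and all numbers are assumptions, not Bungie’s engine.

```python
# Sketch of a particle system driven by the tweakable parameters the
# artists work with; a seeded RNG keeps the chaos "shaped."
import random

class ParticleSystem:
    def __init__(self, spawn_rate, gravity, air_friction, scale, seed=7):
        self.spawn_rate = spawn_rate      # particles born per step
        self.gravity = gravity            # downward acceleration
        self.air_friction = air_friction  # velocity damping per step
        self.scale = scale                # card size multiplier
        self.rng = random.Random(seed)
        self.particles = []               # each: [height, velocity, size]

    def step(self):
        for _ in range(self.spawn_rate):
            vy = self.rng.uniform(1.0, 2.0)  # random upward kick
            self.particles.append([0.0, vy, self.scale])
        for p in self.particles:
            # Apply gravity, then damp the velocity by air friction.
            p[1] = (p[1] - self.gravity) * (1.0 - self.air_friction)
            p[0] += p[1]
        # Cull particles that fell back below the emitter.
        self.particles = [p for p in self.particles if p[0] >= 0.0]

smoke = ParticleSystem(spawn_rate=3, gravity=0.05, air_friction=0.1, scale=2.0)
for _ in range(5):
    smoke.step()
```

Layering several such systems with different textures and parameters, as Scott describes, is what “fleshes out” a single effect.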
Artists modeled “damage” permutations into the geometry of the enemy characters.

For complex effects, like the plasma grenade explosion, many different layers of particles must all be coordinated to look like a cohesive whole. “Of course, all those flashy particle effects are worthless if they don’t support the gameplay,” Scott notes, “so we solicit a lot of feedback from the designers to ensure that the effects work well in the context of the game. Iteration is a critical part of the process.”

According to Scott, after the particle effect is “right,” decals, lighting, and exposure controls are all linked in to complete the effect. The team can spend days crafting an effect that only lasts a fraction of a second, which is another reason why the new Saved Films feature, which allows players to rewatch their games, is valuable. “You can slow down time and see all of the details that we put into it,” he says.

When vehicles and soldiers thrash through mud and water, particles also simulate the splashing and spraying. The particles update according to simple water physics, and displacement maps are computed from them as they propagate through the water body, simulating the rippling surface.

“The repeating water ripples are generated off-line using a custom authoring tool. The artist can specify direction of flow, speed, shape of wave profile, amplitude, and a variety of other parameters that control the appearance of the water body,” says graphic art lead Hao Chen. The resulting normal and displacement maps are then applied to a water mesh that is tessellated in real time depending on the distance from the eye. The interaction of water and object is simulated using wave particles before the dynamic ripples are combined with the off-line generated ripples for final rendering.
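The wave-particle idea Chen describes can be sketched by splatting a small ripple profile into a displacement map at each particle’s expanding front. The cosine bump, the decay law, and the 1D grid below are illustrative assumptions, not Halo 3’s actual kernel.

```python
# Hedged sketch of wave particles: each splash spawns a particle whose
# ripple front propagates outward, and the displacement map is rebuilt
# each frame by accumulating a bump at every front.
import math

GRID = 16  # one row of a displacement map, for simplicity

def splat_ripples(particles, time):
    """Accumulate each wave particle's ripple into a 1D displacement row."""
    height = [0.0] * GRID
    for origin, speed, amplitude in particles:
        radius = speed * time               # the ripple front moves outward
        decay = amplitude / (1.0 + radius)  # ripples fade as they spread
        for x in range(GRID):
            d = abs(x - origin) - radius
            # A narrow cosine bump centered on the ripple front.
            if abs(d) < 1.0:
                height[x] += decay * 0.5 * (1.0 + math.cos(math.pi * d))
    return height

# Two impacts: a vehicle splash at x=4 and a smaller one at x=10.
waves = [(4.0, 1.0, 1.0), (10.0, 1.0, 0.4)]
row = splat_ripples(waves, time=2.0)
```

Combining a row like this with the off-line authored repeating ripples, as Chen describes, yields the final normal and displacement maps applied to the water mesh.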
Artificial Intelligence

The main thrust of the AI programming for the final Halo chapter focused on conveying a hierarchy among the Covenant forces, so that a high-ranking Brute is seen leading the ferocious Jackals and the mindless Grunts by arraying them in combat and instilling courage. “To do this, we introduced the concept of rank within the AI characters, so simply adding a high-ranking Brute such as a Chieftain or a Captain—each denoted by different armor permutations and actions—to an encounter would automatically create the desired behavior,” says AI engineer Max Dyckhoff.

To convey the concept of the hierarchy, Bungie employed three factors: positioning in combat, group behavior, and broken behavior. “When it comes to positioning, the Grunts, Jackals, and lower-level Brutes will position themselves between their leader and the player, defending him from enemy fire,” says Dyckhoff. “To show solidarity inside the Covenant squad, we introduced group behaviors, whereby the lead character will shout out an order for all his followers to do such things as throw a grenade at a certain location.”

Finally, when a Covenant leader is killed, programmers wanted to both reward the player and intensify the pack mentality. To do so, Bungie introduced “broken” behaviors that are triggered when the leader of a pack is killed: Grunts will flee from their targets or go insane, pulling out two plasma grenades and running kamikaze-style straight at the player; Jackals will huddle together behind their shields; and Brutes will become a lot more cautious and take cover.
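The rank-driven reactions described above amount to a per-species dispatch that fires when the leader dies. The class, the behavior names, and the dispatch table below are illustrative assumptions, not Bungie’s behavior-tree code.

```python
# Illustrative sketch of "broken" behaviors: when a squad's leader is
# killed, each follower switches to a reaction keyed by its species.

BROKEN_BEHAVIORS = {
    "grunt": "flee_or_kamikaze",       # scatter, or charge with live grenades
    "jackal": "huddle_behind_shields",
    "brute": "take_cover",             # fight on, but more cautiously
}

class Squad:
    def __init__(self, leader_rank, followers):
        self.leader_alive = True
        self.leader_rank = leader_rank  # e.g., "chieftain" or "captain"
        self.followers = followers      # list of species strings

    def on_leader_killed(self):
        self.leader_alive = False
        # Every follower adopts its species' broken behavior at once.
        return [BROKEN_BEHAVIORS[f] for f in self.followers]

pack = Squad("chieftain", ["grunt", "grunt", "jackal", "brute"])
reactions = pack.on_leader_killed()
```

Driving the reaction from data rather than per-encounter scripting is what lets a designer get the desired pack behavior “automatically,” as Dyckhoff puts it, just by placing a high-ranking Brute in an encounter.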

Aside from developing the enemy hierarchical behaviors, Bungie also wanted to give some context to the enemy encounters in Halo 3 by giving the enemies “activities” and “vignettes,” which would play out before the player comes upon them. Such activities included characters smoking or playing cards. Vignettes are rather more complicated, and are often used to tell small stories before encounters begin. Examples include scenarios like Brutes torturing Marines, a group of Marines introducing the player to a situation, and so on. The designers scripted as many vignettes as time would allow, as they really added flavor and realism to the world.

“Halo’s AI has been the topic of many papers at GDC, and other studios are beginning to use the ‘behavior tree’ system for their games, including Maxis, which is using it in the development of Spore,” says Dyckhoff.

Vehicle Modeling

Unlike the characters, the artists built and rigged Halo 3’s vehicles in 3ds Max, where modeling rigid objects is easier and more precise, according to Wang. “[Max] allowed us to be more flexible with the destruction states, if we wanted to change one of them,” he points out. “Surprisingly, none of our vehicles used projection mapping; they were all hand-painted [in Photoshop], which gave them a more earthy feel, dirtier and with less of an ‘assembly-line machine feel.’ It was a conscious decision even after we introduced the projection-mapping methods to our characters.” Artists modeled most vehicles in four levels of destruction: minor, medium, major, and destroyed.


Lighting

From the dazzling god rays pouring through the jungle canopy and scattering through the leaves, to the pools of shadow and artificial light from machinery onboard the spaceships, Halo 3’s lighting is generated via global illumination and photon mapping. Unlike traditional light mapping, the game stores the incoming lighting at every surface point, encoded as a spherical harmonics (SH) vector. Static geometry is rendered directly from the SH light map. Dynamic objects sample from the underlying SH light map, so that their lighting is consistent with the surrounding environment.
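Sampling lighting stored as a low-order SH vector can be sketched with the standard real spherical harmonics basis. The four-coefficient encoding and the probe values below are assumptions for illustration; the article does not specify Halo 3’s actual SH order or normalization.

```python
# Sketch of evaluating 4-coefficient (bands 0 and 1) SH lighting in a
# given surface-normal direction, as a dynamic object would when sampling
# the light map under its feet. Constants are the standard real SH ones.
import math

C0 = 0.5 * math.sqrt(1.0 / math.pi)  # band 0 normalization
C1 = 0.5 * math.sqrt(3.0 / math.pi)  # band 1 normalization

def eval_sh4(coeffs, nx, ny, nz):
    """Evaluate 4-coefficient SH lighting in a unit direction (nx, ny, nz)."""
    l0, l1m1, l10, l11 = coeffs
    return (l0 * C0 +
            l1m1 * C1 * ny +
            l10 * C1 * nz +
            l11 * C1 * nx)

# A probe lit mostly from above: an ambient term plus a +z band-1 term.
probe = (2.0, 0.0, 1.5, 0.0)

up = eval_sh4(probe, 0.0, 0.0, 1.0)     # surface facing the light
down = eval_sh4(probe, 0.0, 0.0, -1.0)  # surface facing away
```

Because the same probe answers queries in any direction, a dynamic character walking over the light map picks up directional lighting that matches the baked static geometry around it.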

The lighting system is fully integrated with the Halo material system, which is capable of rendering a variety of diffuse and glossy materials realistically under different lighting conditions. Shadows from the objects are computed using a combination of PRT (for blurry, low-frequency shadows) and shadow maps. The cut-scenes are rendered using exactly the same technology as the main gameplay.

“The Jungle scene (Sierra 117) is a good showcase of various elements of our graphics engine working in harmony, while the Tsavo Highway level is a good demonstration of outdoor lighting, where the dominant light sources are the sun and the sky,” says Chen. “The interior of the Storm level showcases the indoor lighting and the material system well, with light pouring in from the opening in the ceiling getting combined with artificial lights in the interior and a variety of metal and other surface materials.”

Halo 3’s sky serves both as a backdrop to a level and as the main HDR light source. Physically correct sky and sun models form the basis of the HDR sky. The atmosphere system uses Mie and Rayleigh scattering computations to account for everything from simple haze, to the blue color of the sky, to the aerial perspective of distant scenery. With HDR lighting and global illumination as the foundation, the material system allows the group to render a variety of realistic materials, from wet rock, to moss, to tree bark and leaves.
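The Rayleigh half of the sky model rests on a textbook phase function, which describes how much light scatters toward a given angle from the sun. This is standard optics, not Bungie’s shader code.

```python
# The normalized Rayleigh phase function: scattering peaks looking toward
# or directly away from the sun and dips at 90 degrees, which is part of
# why a clear sky's brightness varies with viewing angle.
import math

def rayleigh_phase(cos_theta):
    """Fraction of light scattered toward an angle theta from the sun."""
    return (3.0 / (16.0 * math.pi)) * (1.0 + cos_theta * cos_theta)

forward = rayleigh_phase(1.0)   # looking toward/away from the sun
sideways = rayleigh_phase(0.0)  # looking 90 degrees off the sun
```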

The atmosphere system, which handles the scattering of light through a medium, is used in subtle ways to add mood and depth. According to Chen, the artists also used a fair share of cheats, such as placed cards and billboards, to add visual interest and contribute to the overall visual sophistication of the game.

Fight to the Death

As the war rages on between developers for the coveted crown of the first true next-generation game, studios are exploiting all types of new digital tools and techniques. Aside from advanced AI, such an experience demands thousands of unique characters, entailing thousands of unique textures, meshes, animations, and behaviors, potentially creating an asset-management nightmare—which many developers are now encountering. Nevertheless, in the race to break the mold and exploit these new possibilities, all developers might do well to remember Bungie’s maxim: “passion before pixels.”

Martin McEachern is an award-winning writer and contributing editor for Computer Graphics World. He can be reached at