Volume 30, Issue 2 (February 2007)

Game Films

Studios partner with developers to raise the bar in film-to-game adaptations
By Martin McEachern

Special Gaming Section
THE GODFATHER
MEET THE ROBINSONS
OVER THE HEDGE
SUPERMAN RETURNS

There’s a new phenomenon emerging in the high-stakes world of big-budget filmmaking. If you could sneak your way onto the closely guarded set of Spider-Man 3 or the upcoming big-screen version of Iron Man, you’d be surprised to find the film crews working alongside a group of people whom you would never have seen a few years ago. They’re video game artists, taking photographs, recording the proportions and textures of sets, and working closely with production designers and the visual effects team to make sure their game upholds the same production values as the film. They’ve become an almost permanent fixture on the moviemaking scene. The reason is that in this era of cross-platform marketing and convergent technology, major film properties live twice—in the movie theater and in the interactive realm.
 
When studio executives green-light a potential blockbuster, they’re now launching an army on two fronts. As the film crew springs into action, so, too, do their counterparts on the game development team, poring over the script, studying the concept art, borrowing cyber scans of the actors, collaborating with the director and writers, and rehearsing the actors for the vocal performances that are now almost obligatorily lent to a game. With a $30 billion global market at stake and the power of new consoles making it easier for film and game crews to share digital assets, movie studios are forming close-knit partnerships with game developers to push the quality level of their film-based games, often embedding both teams on the same premises to facilitate collaboration.

Last spring, for example, shortly after Activision signed a multi-year agreement with DreamWorks Animation for the exclusive video game rights for DreamWorks’ upcoming feature films (including Bee Movie, Kung Fu Panda, and all future Shrek films), DreamWorks announced that Activision would be opening a studio facility on the DreamWorks Animation campus. "This is an unparalleled step in the convergence between Hollywood and video games," says Robert Kotick, chairman and CEO of Activision. "For the first time, we’ll be able to align our games’ production schedules with those of the movies, from the preproduction phase onward. This will allow us to fully leverage the movie assets and story lines, in addition to collaborating closely with the talented production teams at DreamWorks to develop story lines that expand the movie experience in new and compelling ways."

In the same spirit of collaboration and consolidation of assets and talent, ILM and LucasArts are now housed in the same building on George Lucas’s Presidio campus. "[The idea] came from George himself," says LucasArts president Jim Ward. "And it’s key to rebooting our game studio." To create its upcoming Indiana Jones 2007 and next-gen Star Wars titles, LucasArts will also be using its Zeno Game Editor, which is incorporated into ILM’s Zeno Development Environment, allowing both film and game artists access to the same tools. LucasArts can now use ILM’s high-end art creation software, while ILM can tap into LucasArts’ real-time technology, which underlies their previsualization tools.

Jacob Meakin, lead programmer at Edge of Reality, which developed the game title based on DreamWorks’ Over The Hedge, sees this marriage between real-time and cinematic effects tools spreading throughout the industry. "One of the most interesting developments is how some film studios are now using game studio software to prototype movies [through animatics]," he says. "This allows them to bring their ideas to life without having to wait a long time for rendering. That means the assets for both films and games will be more similar during production, which could definitely promote sharing."

Besides DreamWorks, other studios offering unprecedented support to game developers include Sony Pictures Entertainment, Walt Disney Studios, Paramount, and Twentieth Century Fox. More importantly, these games are now being developed under the auspices of heavyweight directors such as Peter Jackson and James Cameron, so that they don’t merely encapsulate a film, but expand deeply upon its universe. At the 2005 GDC show, lifelong gamer and renowned film director James Cameron announced his plan to create a "highly involved" massively multiplayer online game (MMOG) to enrich and extend the narrative of his upcoming feature film, the rumored sci-fi epic Battle Angel Alita.

"In my next film, we’re planning on simultaneously developing a major motion picture and, hopefully, a major game title that coexists in the same world and that shares the same characters," Cameron says. "Going into that world will actually inform those watching the film and vice versa." Later, he told BusinessWeek: "So much of literary sci-fi is about creating worlds that are rich and detailed and make sense at a social level. We’ll create a world for people and then later present a narrative in that world. You are exploring the interaction of technology and the human imagination, and you play it out in a highly competitive, fast-paced [game]."

Recently, Steven Spielberg signed a long-term deal to develop games for Electronic Arts, while Peter Jackson’s heavy involvement with Ubisoft’s video game adaptation of King Kong (see "Aping Film," March 2006) turned out to be just a precursor to his latest venture: his newly announced Wingnut Interactive, a partnership with Microsoft to produce interactive titles for the Xbox 360.

The first release will be a spin-off of the Halo film, which Jackson and partner Fran Walsh are producing. Jackson describes the new Halo game as "not quite a game, not quite a film." Jackson’s manager, Ken Kamins, offers this cryptic tease about the director’s plans: "The idea is to create a kind of interactive entertainment that’s not games as we know it."

This level of commitment to the art of video games—from film studios, leading effects houses, and highly acclaimed directors—should ensure that the next generation of film-to-game adaptations will not be the second-rate marketing vehicles that often stigmatized the genre in the past. Rather, they should succeed in fusing the emotion of films with the intense absorption of interactive entertainment like never before.

For an in-depth look at the challenges of developing film-based video games, the emerging parallel production pipelines, and the technological barriers still standing in the way of seamless asset sharing, we’ve turned to the developers of some high-profile, film-to-game adaptations: Electronic Arts’ The Godfather, Buena Vista Interactive’s Meet the Robinsons, EA’s Superman Returns, and Edge of Reality’s Over the Hedge.
 


THE GODFATHER


The operatic mafia drama about young Michael Corleone’s reluctant descent from a free-thinking, law-abiding war hero to the head of his crime family is making its much-hyped debut on the new Nintendo Wii and Sony PlayStation 3 this month, following the Xbox 360 version released last fall. Those games, along with those for all the other major game platforms, feature a star-studded voice cast including James Caan as Sonny Corleone, Robert Duvall as lawyer Tom Hagen, and the late Marlon Brando as the patriarchal Vito Corleone. While the returning cast members certainly helped to elevate the game to the pedigree of the film, re-creating 1940s New York, the nuances of the actors’ performances, and Gordon Willis’s shadowy cinematography—with its sepulchral palette of browns, golds, and blacks—required countless hours of studying the film, visiting the locations in New York, and poring over Paramount’s extensive photo archives. "We even digitized signature sequences so they could always be instantly accessible at each artist’s desktop," says EA producer Joel Wade. "We kept the first movie running on a loop in our central team area so people could stop and catch a few minutes [of it] at a time."

With the film’s complex, interweaving, multi-character story lines, following the movie plot linearly would have been impossible in the game. "We decided very early on what we didn’t want to do, which was present a linear, level-based movie game," says Wade. "It would have deprived us of some great game opportunities—like the ability to create your own mobster to bring up through the ranks. By embracing the world of The Godfather, and weaving your story through the fiction of the book and movie, we could have our cake and eat it, too."

This approach, Wade says, allows the player to experience countless hypothetical scenarios posed by the film. "What if, when Don Vito or Sonny ordered a hit, you could be his right-hand man to carry it out? What if you could be the man who delivered the horse’s head to Woltz? What if you could expand the Corleone empire block by block through negotiation and strong-arm tactics? We knew that if we could take those amazing game concepts, set them in the authentic streets of 1940s New York, and bring the characters we all know to life, we’d have an amazing package."

Far from merely receiving a textural face-lift, the game benefits broadly from the memory and speed of the new consoles, which allowed the team to make vast improvements throughout the graphics and gameplay. For example, EA was able to add more interactive objects to the interiors, increasing the fun factor in "persuading" reluctant merchants to pony up protection money by hurling bottles at them or destroying their wares. The team could also double the number of pedestrians and cars in the world, imbuing the streets with the bustling energy of the period. Moreover, artists could apply real-time reflection maps to the cars and enhance the quality and quantity of special effects, such as smoke and blood.

Bringing Film Legends To Life

Obviously, one of the greatest challenges in developing film-based games is re-creating the specific characteristics of each actor. Since The Godfather is more than 30 years old, creating these digital doubles was especially difficult. "We had to watch the film repeatedly to fill in each individual’s profile and details," says character lead Jenny Ryu. "We gathered as much existing reference as we could, such as screen shots from the film and production stills, to model the face mesh and paint the facial maps. Using [Adobe Systems’] Photoshop, we created various layers to composite with photo sources and hand-painted sources. For instance, Vito’s face map was composed of 5 percent photo sources and 95 percent hand painting, while Michael’s face map was a composite of 50 percent photo sources and 50 percent hand painting."


Artists used a combination of sources to create the lifelike characters in The Godfather. For instance, Vito’s face comprises 5 percent photo textures and 95 percent hand painting by artists using Photoshop.

Ryu and her team prepared the face maps at 1024x1024 pixels—the highest resolution accommodated by both current-gen and so-called next-gen (current) game platforms—then downsized each map to fit into the memory they allotted for each platform. "For instance, for major characters like Vito, Sonny, and Michael, we used 256x256 for the PS2 and Xbox, and 512x512 for the Xbox 360," she says. "For less-crucial characters like pedestrians, we traded quality for variety by using a 128x128 map."
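
For illustration only, here is a minimal sketch of how such a per-platform texture budget might be expressed in engine code. The platform names and tiers echo Ryu’s figures, but the function, thresholds, and structure are assumptions made for this example, not EA’s pipeline.

```cpp
// Minimal sketch (not EA's pipeline): choosing a target face-map resolution
// per platform and character tier, starting from 1024x1024 source art.
#include <cstdio>

enum class Platform { PS2, Xbox, Xbox360 };
enum class Tier { Major, Pedestrian };   // e.g., Vito/Sonny/Michael vs. crowds

int targetFaceMapSize(Platform platform, Tier tier) {
    if (tier == Tier::Pedestrian) return 128;        // trade quality for variety
    if (platform == Platform::Xbox360) return 512;   // next-gen budget
    return 256;                                      // PS2 / Xbox budget
}

int main() {
    int vito = targetFaceMapSize(Platform::Xbox360, Tier::Major);
    int ped  = targetFaceMapSize(Platform::PS2, Tier::Pedestrian);
    std::printf("Vito on Xbox 360: %dx%d\n", vito, vito);
    std::printf("Pedestrian on PS2: %dx%d\n", ped, ped);
}
```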

Modeled in Autodesk’s Maya, the in-game characters comprise approximately 3500 triangles, while the versions used in the cinematics contain nearly 5000 triangles. Modelers limited less-important characters to 2500 polygons. To accelerate the real-time rendering, each in-game character can appear in four levels of detail (LOD), from 3500 downward. To quickly generate generic characters, such as mobsters and pedestrians, EA used a morphing tool to vary a base mesh in Maya, then further differentiated them using texture swapping and tinting technology. Once modeled and textured, the characters shared a proprietary EA skeletal rig developed in Maya. Bone-based, the unified rig features FK, IK, and global IK controls, Set Driven Keys, and a proportioning rig.
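
The article doesn’t say how the engine chooses among those four levels of detail; a distance-based lookup is the common approach, and the sketch below assumes one, pairing invented distance thresholds with the stated 3500-triangle ceiling.

```cpp
// Illustrative sketch only: picking one of four character LODs by camera
// distance. The top triangle count echoes the article; the other counts and
// the distance thresholds are invented for the example.
#include <cstdio>

struct LodLevel { float maxDistance; int triangles; };

const LodLevel kCharacterLods[4] = {
    { 10.0f, 3500 },   // hero detail, close to camera
    { 25.0f, 2500 },
    { 60.0f, 1500 },
    { 1e9f,   800 },   // far background
};

const LodLevel& selectLod(float distanceToCamera) {
    for (const LodLevel& lod : kCharacterLods)
        if (distanceToCamera <= lod.maxDistance) return lod;
    return kCharacterLods[3];
}

int main() {
    std::printf("At 18 m the engine would draw ~%d triangles.\n",
                selectLod(18.0f).triangles);
}
```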

Creating facial performances for Al Pacino, Robert Duvall, James Caan, and Marlon Brando would be a daunting challenge to even the best animator. Fortunately, EA was able to do a facial-performance capture of James Caan and Robert Duvall during their voice-recording sessions for the game. In addition to the facial capture and video footage of the recording sessions, EA used every animation weapon at its disposal, including Set Driven Keys, hand-keyed animation, and automated animation for in-game and face-to-face characters.



EA did a facial-performance capture of James Caan and Robert Duvall during their voice-recording sessions for the game. When the data was unavailable, the artists animated the faces by hand.

"For the Xbox 360, we added new ‘living world’ motions, such as conversation variants and emotional interactions, to amplify the realism of the pedestrians and character interactions with their environments," says animation director Seth Swanson. For much of this "living world" and face-to-face dialog, EA used its proprietary facial animation tool, FACT (for Facial Action Coding Technology), which can procedurally combine 35 phoneme-based poses for speech and another 35 emotion-based poses for expression. For the facial animation in the Xbox 360 version, EA used a combination of both mocap data and its FACT system.

"When we didn’t have performance-capture data available, we hand-keyed the lip sync," says Swanson. "We also used audio and text inputs for our FACT tool to generate lip sync procedurally for less-critical characters that wouldn’t be viewed up close. With the volume of dialog in the game (over 15,000 unique lines), this was critical." EA is currently investigating the potential of Softimage’s Face Robot in streamlining its facial animation pipeline.

Set Design

As each classic sequence from the film has been stamped indelibly into the consciousness of moviegoers, so, too, have the sets in which they unfolded: the sprawling Corleone compound; the mansion where Woltz awakens to the horse’s head; the bar where Luca Brasi meets his demise; the hospital where Michael narrowly saves his father’s life; or Louis’ Italian-American Restaurant, where Michael shoots police captain McCluskey and Virgil Sollozzo at point-blank range. EA re-created them all. "We benefited from the multitude of shots in the movie that allowed us to re-create an interior that would be both accurate to the film and serve as a compelling gameplay space," says interior lead Isao Kogure.
Since the ground and building facades comprise more than 50 percent of the on-screen real estate, EA used extensive normal mapping to increase their detail without adding geometry. Windows, roads, grass, dirt, sidewalks, and other pavement-type textures received normal maps. In particular, players can see the effects of normal mapping in the many warehouse windows as the camera pans back and forth. In some instances, each windowpane has the varied reflectivity of irregular glass, which adds life to an otherwise static scene.


The in-game main characters, such as those in the scenes above, carry roughly 3500 triangles, about two-thirds of the nearly 5000 used in the cinematics. Both are modeled in Maya, though in the game, the characters are rendered in one of four levels of detail.

Art director Margaret Foley-Mauvais provided the team with a wealth of dated photographs for modeling the period cars and buildings. In addition, a small group of artists visited New York and shot more than 4000 photographs of all the neighborhoods to gain a more immediate sense of the city’s character. "Fortunately, many of the key landmarks, like the Brooklyn Bridge or Empire State Building, haven’t changed in appreciable ways," says Kogure. "The most challenging sets were the very large ones, like the warehouses, hotels, and compounds. That’s because it’s easier to fill a smaller space and make it appear visually ‘full.’ In larger spaces, a creative balance of lighting, texturing, object placement, and gameplay focused the player so that the environments felt as compelling as any other."

By far, the biggest modeling and texturing challenge was re-creating the Corleone compound, which required an almost forensic reconstruction of the set using movie stills and the poorly preserved sketches from the Paramount set department. "The only reference material we had for the Corleone compound were shots from the movie and a site plan that was produced by the set department when the movie was made. The original drawing was produced in pencil, and much of the image was faded and obscure. There was no relevant scale associated with the drawing, and the interior of the buildings weren’t outlined in the drawing," says Kogure. "Ultimately, we had to reverse-engineer a complete virtual layout using a combination of the drawings and movie stills."

Real-Time Cinematography


Capturing Gordon Willis’ Oscar-nominated cinematography, so crucial to evoking the mood of warm family loyalty chilled by constant bloodshed, was another major challenge. It demanded advanced lighting techniques normally reserved for high-end film work—techniques such as global illumination and ambient occlusion—to ensure that the characters and sets were bathed in bounce light from those rich blacks, browns, and golds.

"We’re already precomputing global illumination and ambient occlusion," says lighting technical artist Matt Christmann. "The great thing about a lot of these advanced techniques is that they can be rendered into vertex colors, light maps, or shader-layer maps as part of the lighting solution, and applied in-game at run time. The game engine combines all these layers together in the shader to make the lighting you see when playing the game."

For example, EA’s artists baked ambient occlusion into all the destructible objects in the game. These objects can be placed anywhere in the world; vertex and pixel shaders then combine the "radiosity-like" ambient occlusion with the dynamic game lights. "A lot of previous-gen games have a ‘Hanna-Barbera effect,’ meaning you can tell exactly what part of the scene is going to move. The Godfather on the Xbox 360 has interactive and destructible objects that mesh seamlessly with their environments, keeping the player busy playing the game instead of noticing visual artifacts," says Christmann.
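
The sketch below illustrates that combination in generic terms: a baked occlusion value stored per vertex scales the ambient term, while dynamic game lights are added on top each frame. It is a simplified stand-in, with invented values, not EA’s shader code.

```cpp
// Generic sketch of baked ambient occlusion combined with dynamic lights.
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// One dynamic light, simplified to a direction and an intensity.
struct Light { Vec3 direction; float intensity; };

float shadeVertex(float bakedOcclusion,      // 0 = fully occluded, 1 = open
                  float ambientIntensity,
                  Vec3 normal,
                  const Light* lights, int lightCount) {
    float result = ambientIntensity * bakedOcclusion;       // baked term
    for (int i = 0; i < lightCount; ++i) {                  // dynamic term
        float nDotL = std::max(0.0f, dot(normal, lights[i].direction));
        result += nDotL * lights[i].intensity;
    }
    return result;
}

int main() {
    Light muzzleFlash{ {0.0f, 1.0f, 0.0f}, 0.9f };
    float lit = shadeVertex(0.4f, 0.3f, {0.0f, 1.0f, 0.0f}, &muzzleFlash, 1);
    std::printf("vertex brightness: %.2f\n", lit);
}
```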

"We paid close attention to the cinematography and art direction of the film. This probably pushed us to go further in terms of contrast and darkness than we normally would have for a video game," says Foley-Mauvais. "It was a challenge keeping a balance between accurately representing the cinematography and lighting and making sure we created and lit scenes that were appropriate for gameplay."

Powering the game’s highly sculpted cinematography is EA’s real-time lighting engine. As lead lighter Larry Weiss explains, "The exterior environment is lit by first baking an ambient occlusion pass into the vertices to control where light can reach and how intense it will be. Then, using our real-time Time of Day engine, the key light (sun/moon) is added, along with dynamic shadows cast from buildings and other objects. We can control the shadow density and color to create the look of dawn, midday, dusk, and night. The Time of Day engine then interpolates between them."
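
Here is a hedged sketch of such a time-of-day system, assuming a handful of authored lighting keys that the engine interpolates as the clock advances. The key values and structure are invented for illustration.

```cpp
// Toy time-of-day interpolation between authored lighting keys.
#include <cstdio>

struct LightingKey {
    float hour;            // 0-24
    float sunIntensity;
    float shadowDensity;
};

const LightingKey kKeys[] = {
    {  6.0f, 0.4f, 0.3f },   // dawn
    { 12.0f, 1.0f, 0.8f },   // midday
    { 18.0f, 0.5f, 0.4f },   // dusk
    { 24.0f, 0.1f, 0.1f },   // night
};

LightingKey evaluate(float hour) {
    for (int i = 0; i < 3; ++i) {
        if (hour >= kKeys[i].hour && hour <= kKeys[i + 1].hour) {
            float t = (hour - kKeys[i].hour) / (kKeys[i + 1].hour - kKeys[i].hour);
            return { hour,
                     kKeys[i].sunIntensity + t * (kKeys[i + 1].sunIntensity - kKeys[i].sunIntensity),
                     kKeys[i].shadowDensity + t * (kKeys[i + 1].shadowDensity - kKeys[i].shadowDensity) };
        }
    }
    return kKeys[0];   // before dawn, fall back to the first key
}

int main() {
    LightingKey nineAm = evaluate(9.0f);
    std::printf("09:00 -> sun %.2f, shadow %.2f\n",
                nineAm.sunIntensity, nineAm.shadowDensity);
}
```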

Weiss and his team lit the interior venues with a combination of full-colored light maps applied to the floors and walls, and vertex lighting on the props and furniture. Using Maya, the artists set up lights to illuminate the characters and vehicles dynamically. During gameplay, the engine calculates which four lights to use at any given time. "It was a constant pain switching from a static-lit world on current-gen consoles to the real-time, dynamically lit world of the next-gen version, which changed with the time of day," adds Weiss.
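
How the engine decides which four lights win isn’t described in the article; a common heuristic is to rank lights by a falloff-weighted contribution and keep the strongest, as in this assumed sketch.

```cpp
// Sketch only: one plausible way to pick the four most relevant dynamic
// lights for a character. The selection heuristic and values are invented.
#include <algorithm>
#include <cstdio>
#include <vector>

struct SceneLight { const char* name; float distance; float intensity; };

std::vector<SceneLight> pickLights(std::vector<SceneLight> lights, size_t maxLights = 4) {
    // Rank by a simple falloff: intensity over squared distance.
    std::sort(lights.begin(), lights.end(), [](const SceneLight& a, const SceneLight& b) {
        return a.intensity / (a.distance * a.distance) >
               b.intensity / (b.distance * b.distance);
    });
    if (lights.size() > maxLights) lights.resize(maxLights);
    return lights;
}

int main() {
    std::vector<SceneLight> lights = {
        { "ceiling lamp", 2.0f, 0.8f }, { "street lamp",   9.0f, 1.0f },
        { "fireplace",    3.5f, 0.6f }, { "neon sign",     6.0f, 0.9f },
        { "car headlight", 12.0f, 1.2f },
    };
    for (const SceneLight& l : pickLights(lights))
        std::printf("use %s\n", l.name);
}
```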

Operatic Violence

Almost every scene in The Godfather pivots on bloodshed, which made fluid simulation and ballistic effects pivotal to the game. Visual effects lead Jeff Kuipers and his team used a combination of hand-painted textures, particles, and physics for blood effects. The direction and caliber of the bullet impact determined the trajectory and size of the blood splatter, which was then projected onto surfaces such as storefronts and automobiles. "Of course, we imposed limits on ourselves to make sure we respected the violence level of the films without ‘over-gorifying,’" says Kuipers. "At one point, we experimented with ‘meaty bits’ to enhance some of the executions, but quickly decided that it didn’t make sense for our game."
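
As a toy illustration of that rule (impact direction and caliber driving the splatter’s trajectory and size), consider the following sketch; the structures and scale factors are invented, not EA’s effects code.

```cpp
// Invented example: deriving a splatter decal from bullet direction and caliber.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

struct Splatter { Vec3 direction; float radius; };

Splatter makeSplatter(Vec3 bulletDir, float caliberMm) {
    // The splatter continues past the impact along the bullet's direction.
    float len = std::sqrt(bulletDir.x * bulletDir.x +
                          bulletDir.y * bulletDir.y +
                          bulletDir.z * bulletDir.z);
    Vec3 dir{ bulletDir.x / len, bulletDir.y / len, bulletDir.z / len };
    // Bigger rounds make bigger decals; the scale factor is arbitrary here.
    float radius = 0.02f * caliberMm;
    return { dir, radius };
}

int main() {
    Splatter tommyGun = makeSplatter({ 1.0f, 0.0f, 0.2f }, 11.4f);  // roughly .45 ACP
    std::printf("decal radius %.2f m along (%.2f, %.2f, %.2f)\n",
                tommyGun.radius, tommyGun.direction.x,
                tommyGun.direction.y, tommyGun.direction.z);
}
```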


While lighting the game scenes, the artists tried to adhere to the art direction established in the movie. This was done using the EA real-time lighting engine along with other lighting passes.

For fire, smoke, and smoldering guns, EA used its proprietary FX sequencing software, Alchemy, which combines animated shaders, 3D objects, particle animations, texture animations, dynamic lighting, physics, and even frame-buffer effects like bloom, blur, and distortion.

"Realistic fire and explosion effects were crucial to adding danger and excitement to missions, and making weapons like the Molotov cocktail or tommy gun so much fun," says Kuipers. "For the fire, we used a combination of Maya fluid simulations and proprietary shaders. The explosions pushed our effects system to its limit, with the addition of realistic fireballs, [heat] distortion, thousands of smoke and debris particles, and sub-effect systems like streaming firebrands that used physics to collide with nearby objects." Water effects, for fountains and gushing fire hydrants, employ a mixture of animated surface shaders (combining displacement, reflection, specular, and UV animation) and particle animation.

According to EA, the most indispensable capability of Alchemy is its ability to relay information about nearby physical effects to the characters and environments, so that they propagate dynamically throughout a scene. For instance, explosions will shatter nearby windows, break objects, knock down characters, or set cars ablaze. "This allows the player to trigger unexpected chain reactions of explosions and destruction during a firefight. Each weapon also has unique impacts for each surface type, so that shooting up a hotel lobby with bullets ripping through carpet, upholstery, wood paneling, and plate glass windows is a lot more fun and realistic," adds Kuipers.
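
The propagation Kuipers describes can be pictured as an event cascade: an explosion notifies nearby objects, and explosive ones detonate in turn. The sketch below is a toy version of that idea with invented object types and radii; it is not Alchemy itself.

```cpp
// Toy chain-reaction propagation: a blast destroys nearby objects, and
// explosive objects trigger secondary blasts of their own.
#include <cstdio>
#include <vector>

struct WorldObject {
    const char* name;
    float x, y;
    bool explosive;        // e.g., a car with a full tank
    bool destroyed = false;
};

void detonate(std::vector<WorldObject>& world, float x, float y, float radius) {
    for (WorldObject& obj : world) {
        if (obj.destroyed) continue;
        float dx = obj.x - x, dy = obj.y - y;
        if (dx * dx + dy * dy > radius * radius) continue;
        obj.destroyed = true;
        std::printf("%s is destroyed\n", obj.name);
        if (obj.explosive)                       // secondary blast propagates
            detonate(world, obj.x, obj.y, radius);
    }
}

int main() {
    std::vector<WorldObject> world = {
        { "storefront window", 1.0f, 0.0f, false },
        { "parked sedan",      3.0f, 1.0f, true  },
        { "fruit cart",        5.5f, 1.5f, false },
    };
    detonate(world, 0.0f, 0.0f, 4.0f);   // a Molotov lands at the origin
}
```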

"We’re very proud of all of our effects," adds development director Steve Coallier. "While working on The Godfather, we had already seen the game Black, developed by our Criterion Studio in the UK. We were so inspired by their ability to cause mayhem with so much of the environment that we decided to push for more virtual destruction in The Godfather 360 version."

While Havok was used for the physics, Coallier says that Alchemy created the most spectacular destruction effects in the game, such as the Molotovs, dynamite, and bombs that decimate an enemy compound. "Alchemy allows a visual effects artist to sequence stunning multi-part effects using particle effects, shaders, texture animation, dynamic lighting, 3D objects, and physics. It also lets a sound designer add the right audio effect to bring the complete experience together."
 


MEET THE ROBINSONS


Based on William Joyce’s classic A Day with Wilbur Robinson, Disney’s second 3D film tells the story of a brilliant 12-year-old orphan named Lewis. A tireless inventor, Lewis’ latest project is the Memory Scanner, which he hopes will help him retrieve memories of his mother. Before he can use it, however, his invention is stolen by the evil Bowler Hat Guy and his dastardly moll, Doris. Crestfallen, Lewis gives up hope of awakening his past. That’s when he meets a mysterious boy named Wilbur Robinson, who whisks him away in a time machine to spend a day with his eccentric family. In a future world of floating cities and flying cars, Lewis and his new friend track down Bowler Hat Guy, save the world, and discover the truth about Lewis’ past.
 
To capitalize on Disney’s expertise in traditional animation, this sophomore 3D effort, much like Chicken Little, invokes the broad animation style of Disney’s cartoons from the ’40s and ’50s, such as Goofy’s How to Play Baseball. This meant that the game animators at Avalanche Studios had to study Disney’s animations intensely, analyzing the construction and capabilities of the film’s character rigs and building their own to match. Avalanche’s animators also met with Disney’s animators to get a firsthand account of the personalities and behaviors of each character.


An early cooperative effort between Disney and Avalanche Studios enabled the game artists to maintain the movie characters’ look and feel within the interactive world. The film animators also provided a personal perspective into each character’s personality.

"We would do test animations, trying to capture the feeling and nuances, and then create our final game animations by incorporating what we had learned," says Jeff Bunker, art director for Avalanche.

Luckily, Avalanche was able to employ the character models used in the film, albeit after reducing their density to around 2500 polygons. "This gave us a nice start and made the modeling more of a conversion process than a creation process, which generally requires more art direction, critiques, and approvals," says Bunker. Since Disney Animation and Avalanche use Autodesk’s Maya for a base package, the conversion process was easy.

To handle the broad, cartoon-style animation, Avalanche avoided using IK rigs. Instead, the animators designed a base character skeleton consisting of bones and constraints that offered more latitude for squash and stretch. This rig allowed Avalanche’s animators to approximate the extremes of motion made possible by the "broken rig" setup used by Disney Animation (see "The Sky’s the Limit," November 2005). With a "broken rig," Disney’s animators can grab any part of a character’s body and push, pull, or twist it in any direction without worrying about gimbal lock, order of rotation, and other IK solver problems.

"Luckily, we already had been exploring very similar techniques for Tak and the Power of Juju game. This gave us a great head start when it came to translating their rigs into ours," says Bunker.

For facial animation and lip sync, Avalanche created a series of Set Driven Keys to move the facial bones. The entire cast of Meet the Robinsons lent vocal performances to the game, all of which were hand-keyed. Of course, animators also hand-keyed each character’s motion cycles, remaining true to the film’s spirit of extreme squash-and-stretch, overlapping, and secondary action. Wilbur, the main character, has about 200 animations.

As the team did with the character models, it was also able to incorporate Disney’s textures and sets into the game. "Of course, we had to down-res and down-sample them, and often make them tile in different directions to get more coverage out of them as well," says Bunker. For the Xbox 360 version of the game, the artists turned many of these textures into highly detailed normal maps, using them to accentuate older and more weathered surfaces.


For the game characters, the artists started with the actual movie models and then reduced their density. Both studios used Maya as their modeling tool, making the process easier.

Capturing the film’s aesthetic, which is characterized by large shapes of flat color and shadow, posed another challenge for Avalanche, especially for the lighters. "Our environments are typically lit by vertex colors generated from lights within Maya," explains Bunker. "We have dynamic, real-time lights for characters or other objects that move throughout a level. To achieve fire and flickering bulb-like effects, we use a combination of vertex colors and dynamic lighting. The big struggle is getting the world, characters, textures, and lighting to all come together at the end without breaking the harmony of the overall scene."

Throughout the game, Avalanche’s effects animation combines bone animation, Set Driven Keys, UV scrolling, particle effects, and lighting. Asked to choose the effects he’s most proud of, Bunker points to a complex scene in which an Egyptian temple crumbles to the ground with Wilbur trapped inside: "We pulled this off with carefully timed playback of Set Driven Key animations that were originally created with physics simulation. It turned out pretty cool."
 


OVER THE HEDGE


 
Activision’s Over the Hedge, based on DreamWorks’ feature film, is a good example of how game development can struggle when a film fails to coordinate its production schedule with that of the game. Given only two years from concept to completion, Edge of Reality had to begin with a script that was not only unfinished, but also constantly changing. This meant that the group couldn’t anchor the gameplay experience on the film’s plot points. Instead, the crew developed a post-movie story line in cooperation with DreamWorks.


The 3ds Max game characters in Over the Hedge exhibit the same cartoon-like squashing and stretching as the stars in the movie, which were crafted in Maya.

Like the film, the game follows a bunch of woodland animals suddenly displaced by suburban sprawl. Led by a fatherly turtle named Verne, their makeshift family consists of a hyperactive squirrel (Hammy), a porcupine couple (Penny and Lou), a skunk (Stella), a possum given to histrionic death scenes (Ozzie), and his much-embarrassed teenage daughter (Heather). Although Verne tries to keep them on the safe side of the giant hedge now cutting through their home, they are soon seduced across the booby-trapped lawns of suburbia by RJ, a con-artist raccoon in search of human goodies.

Midway through the film’s production, the game team began to receive digital models from DreamWorks, enabling the artists to model "right on top" of some of the character models used in the film. However, because DreamWorks pushed subsurface scattering, global illumination, and HVDs (for self-shadowing on fur and foliage) to their utmost limit in the film (see "Hedge Fun," May 2006), most of their assets were unusable. In addition, DreamWorks pioneered highly advanced rendering techniques for the feature throughout the film’s production, such as averaged normals for softening the lighting, which made it difficult for the game team to obtain screen shots early in production.

"We learned to work in such a way that we could adjust our game assets to best match the film assets as they developed from concepts to the final product," says lead artist Billy Sullivan.

Though Over the Hedge was primarily an Autodesk Maya-driven production, Edge of Reality developed the game in Autodesk’s 3ds Max. The main characters range from 2000 to 2200 polygons and feature the same cartoon-like squashing and stretching seen in the film, especially in the wildly elastic Hammy. "All of the rigs in the game are unique FK, IK, local-space setups," says lead animator Chad Hranchak. "Squash and stretch was heavily applied using translation of bones rather than scaling, which allowed us to avoid any export issues through the engine. We found no need for driving bone position through sliders or buttons."
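
A minimal sketch of the translation-based stretch Hranchak describes follows: the child joint slides along the bone’s axis by the stretch factor, so no scale values ever reach the exporter. The rig data here is invented.

```cpp
// Invented example: stretching a limb by translating the child bone along
// the bone's axis instead of scaling it, keeping exported joints scale-free.
#include <cstdio>

struct Joint { float offsetAlongParent; };   // child's rest offset from its parent

// Stretch factor > 1 lengthens the bone, < 1 squashes it.
float stretchedOffset(const Joint& child, float stretchFactor) {
    return child.offsetAlongParent * stretchFactor;   // move the child, don't scale
}

int main() {
    Joint forearm{ 0.30f };                             // 30 cm at rest
    std::printf("Hammy mid-sprint: forearm child sits at %.2f m\n",
                stretchedOffset(forearm, 1.6f));        // 60 percent longer this frame
}
```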

Hranchak finds such slider- and button-driven tools very limiting, often making motions, hand positions, and face shapes too uniform and linear. Instead, the group has developed proprietary tools that help with poses and selection while tying nothing to the rig.

The game also re-creates the "Sonic Hammy" effect from the film, whereby the speedy little squirrel takes off in a sudden blast of cartoon motion blur, bending grass and leaves in whatever direction he goes. Artists accomplished the effect by first hiding the model and then using both animated sprites and traditional particles to generate the blur and the plume of dust cast in his wake.

So much of each animal’s personality is expressed through its speed of movement that the team found it difficult to push the characters through the environments at the pace of the gameplay without losing the spirit of each one. "For example, we had to make Hammy run at the same speed as [the turtle] Verne but still have Hammy retain his hyper-fast feel. The same goes with jumps, attack distance and timing, knock-backs, and, well…everything," says Hranchak.


The game was ported to various platforms, including the PC (above), PS2 (right), and Xbox (below). For all the games, the main characters averaged 2000 polygons each.


The artists incorporated Havok physics into the title. This enabled the characters, such as those shown above, to crash golf carts convincingly and tear up the trimmed lawn in the process.

Shifting Point of View

To force the viewer to look through the eyes of the animals and feel the dizzying scale of suburbia, the film adheres religiously to a squirrel’s-eye point of view. Obviously, this cinematic approach could not be retained in the game’s cooperative gameplay mode. "We needed to move the camera up and away from the characters to allow both players to see the action and characters at all times," says Hranchak.

The suburban neighborhood, with its many leafy yards and busy streets, was the most texturally and geometrically complex environment to light and render in real time. "We needed a high level of detail almost everywhere," says Sullivan. "Our solution was to make each yard, house, and lot a separate level object that could be populated into a single 3ds Max file for each level. By doing this, we were able to [level of detail] each of them. We still needed occlusion planes and careful camera placement to keep the frame rate from dropping."

The team relied primarily on light maps for the harsh light of suburbia and the softer, more diffuse light of the outlying woodland, but resorted to vertex lighting and vertex painting for more challenging lighting scenarios. "By building shadows into our breakables, we could make those shadows disappear when the object was broken," says Sullivan. "We utilized real-time lighting on all the characters. It was difficult to get consistent lighting and shadows from lot to lot without leaving visible seams, so we ended up keeping the primary light source almost straight up to avoid casting shadows across the lots. Also, we changed the time of day with color instead of the position of the sun."
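
The breakable-shadow trick Sullivan mentions can be pictured as simply as this assumed sketch: the baked shadow is tied to the prop’s intact state and vanishes when the prop shatters. The object model is invented.

```cpp
// Invented example: a breakable prop carries its own baked shadow, and
// breaking the prop hides the shadow with it.
#include <cstdio>

struct Breakable {
    const char* name;
    bool intact = true;
    bool shadowVisible = true;   // baked shadow quad or vertex colors under it
};

void smash(Breakable& prop) {
    prop.intact = false;
    prop.shadowVisible = false;  // the baked shadow disappears with the object
    std::printf("%s shatters; its baked shadow is removed.\n", prop.name);
}

int main() {
    Breakable barbecue{ "barbecue stand" };
    smash(barbecue);
}
```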

While the game’s AI was programmed from scratch, the team incorporated Havok physics so the animals could wreak maximum destruction across the carefully manicured lawns, smashing up barbecue stands, golf carts, and everything else in their path. "The most exciting effects are in the mini-games, where players race remote-controlled cars," says lead programmer Jacob Meakin.

Over the Hedge was produced prior to the newly struck alliance between DreamWorks and Activision, and reflects many of the difficulties that have encumbered game development in the "pre-parallel development" era. Activision producer Steve Rosenthal is excited about the new partnership between DreamWorks and Activision, and its potential to achieve parallel production.

"The on-site studio facility will help speed up parallel production tremendously. Giving the game production teams on-site access to the film teams allows both Activision and DreamWorks to share information and assets with minimal delay," Rosenthal says. "The game developers will have instant access to the creative minds behind the film projects, and the film team will be able to see their characters brought to life in the game world. Ideally, all of the meetings, asset transfers, and game review sessions, and even some amount of level building and design, will take place at the new DreamWorks campus studio. The prospects are very exciting."
 


SUPERMAN RETURNS


Based in part on last summer’s blockbuster film from director Bryan Singer, EA’s Xbox 360 game also borrows extensively from Superman’s rich comic book heritage, offering an array of menaces to Metropolis, including the robot giant Metallo. As Metallo and other villains tear up the city, a health bar reflects the state of the city. It’s up to the Man of Steel, utilizing his full suite of superpowers—from flying to heat vision and freezing breath—to maintain the city’s health bar, or it’s game over.


As with Over the Hedge, production on the Superman Returns game started well in advance of the movie, making it more challenging for the game artists.

Because production on the game had to begin well before production on the film, EA was unable to depend on most of the movie’s assets and conceptual art. However, all the principal actors did voice work for the game. In addition, EA visited the sets in Australia to gather reference material for the cinematic set pieces, and had access to the cyber scans of the actors, from which they were able to model the main characters in Autodesk’s Maya using anywhere from 20,000 to 40,000 triangles.

Aiming for a more stylized, idealized look than the film, EA textured the characters using a hybrid of painted texture maps and normal maps derived from the scan data. The average resolution for each actor’s texture map was 2k, not taking into account the numerous additional mattes for specularity, ambient occlusion, normal maps, and so forth.

Aside from Metallo, EA’s Superman Returns features a large cast of fantastical characters, including winged creatures that resemble dragons, called Scoldfires, and demon-like characters, called Gnarls, that run on all fours. While these creatures required unique skeletons, underlying all the human characters is the same modifiable skeleton, which allowed the animation team to transfer animations. In addition to the standard bones and IK, the rigs were driven by a procedural node-based system utilizing Set Driven Keys, expressions, and dynamic secondary controls for tails, antennae, and numerous other secondary parts.

"Due to time constraints throughout the production, the rig had to be created generically and procedurally to allow the character rigger to get characters ready for the pipeline quickly. There were a few characters like the bots that were started with the production rig and then modified to fit their unique structure," says EA’s technical artist Kyle Wood. "The massive, open-world aspect of the game and the complexity of our city forced us to make the hard decision of reducing our character rigs and animation sizes to get them all to fit within the appropriate budgets. Nearly all action/open world titles have to deal with this constraint [in the next-gen era]."

One of the biggest obstacles still facing seamless film and game asset sharing is the inability to transfer complex rigging, such as sculpt deformers for bulging biceps or jostling flesh, to a game character. In film work, the surface geometry of a character is rendered. "In a game, after the character is animated in Maya, the geometry, textures, and everything else is stripped away until there’s nothing left but an animated skeleton. This joint data is then fed to the programming/engineering team, where the character model is re-bound to the joints in such a way that a game engine can read it. Data cannot (at this time) be spit out directly from Maya into a game engine," says Wood. Hence, the biggest challenge for EA’s riggers on Superman Returns was making sure that animators saw in Maya exactly what would be exported to the programmers.
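
The hand-off Wood describes might be pictured with structures like the following; the types, sizes, and names are invented and do not reflect EA’s actual export format or the Maya API.

```cpp
// Invented sketch of exporting an animated skeleton: per-frame joint
// transforms only, with geometry and textures stripped away, ready for the
// runtime to re-bind the character mesh to the joints.
#include <cstdio>
#include <vector>

struct JointPose { float rotation[4]; };          // quaternion; rotation only, per
                                                  // the "no scale/translate" rule
struct SkeletonFrame { std::vector<JointPose> joints; };

struct AnimationClip {
    const char* name;
    float framesPerSecond;
    std::vector<SkeletonFrame> frames;            // no meshes, no textures
};

int main() {
    AnimationClip clip{ "superman_hover", 30.0f, {} };
    clip.frames.resize(90);                       // three seconds of flight
    for (SkeletonFrame& f : clip.frames)
        f.joints.resize(70, JointPose{ {0, 0, 0, 1} });   // identity rotations

    std::printf("exporting '%s': %zu frames x %zu joints\n",
                clip.name, clip.frames.size(), clip.frames[0].joints.size());
}
```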

"All deformations had to come from joint rotation—no scaling or translation, no blendshapes, no lattice deformers, which means no bulging biceps or blendshape muscle masses," says Wood.

Similarly, all the characters share the same facial rig (comprising bones and procedural nodes) so that the animators could create and save poses to be used on one another’s characters. Cinematic animation lead Ken Keys created the initial phonemes as well as simple expressions that the animators could combine to form a wide variety of facial poses.

"Because the game does not limit Superman’s range of flight, we could not restrict the camera. We had to create a living, breathing city in its entirety. There are no facades held up by virtual 2x4s; it’s not a back lot," says EA’s Mathias Lorenz. To create this real-time, fully functioning concrete jungle, EA made sure Metropolis is teeming with normal maps. For example, they form the many impact-craters left in streets and buildings by Metallo’s rockets.

In fact, when Metallo and Superman clash, their battles produce an awesome symphony of destruction, all orchestrated by the Havok physics engine. "I can’t think of any other game title that incorporates physics-based gameplay and uses the environment as strongly as Superman Returns. The satisfaction of being able to pick up and throw a gas tanker and watch it ricochet off a flying Metallo, or collide with a massive explosion against him, is indescribable," says EA’s Anthony Marinello. "Artists who have been working on the game from the start still drop their jaw and call other people around to see when some of the more exciting destructive moments happen."


The EA team used reference material from the movie sets whenever possible and used digital scans of the actors to serve as a basis for modeling the game characters, which the artists created in Maya. Then they textured the models with painted maps and normal maps from the scans.

Many of these intense, destructive moments occur in WarWorld, a fully destructible arena featuring massive segmented towers that will topple and crumble into rubble, which Superman can pick up and throw at his enemies. Later in the game, a series of twisters threaten the city’s industrial sector, razing everything in their path. Flaming cars go hurtling through the air, water towers burst, trees and lampposts are uprooted, billboards and rooftop air conditioners are thrown across the island before exploding.

Another game mode allows the player to play as the evil Bizarro, with the goal to destroy the city. "Blasting Bizarro’s heat breath down a busy city street and watching as citizens flee in panic, 50 cars explode, throwing off doors, axles, tires, and other flaming bits, which, in turn, will then set nearby trees and bushes on fire, are things that will never, ever, get old," says Marinello.

Since the animation team did not have film footage of actor Brandon Routh, one of the greatest challenges was capturing Clark Kent’s characteristic humble demeanor for the game’s cinematics. This demeanor is largely the creation of actor Christopher Reeve, who chose to play mild-mannered Clark not as a goof, but as a guy pretending to be a goof and quietly enjoying the spectacle of others underestimating him.

"Animators are sometimes referred to as actors with pencils, or in our case, actors with Maya. Because we had very little footage from the Superman Returns movie, we used film reference of Christopher Reeve. Animators could often be seen in the hallways acting out a scene or a line of dialog, and working with other animators to improve the character’s performance," says Wood.
 


WORKING IN TANDEM


While each developer used very different tools and techniques to meet the unique challenges of its game, there is a consensus among all of them on what is needed to improve film-based games. Rather than approaching development in a linear fashion (from license to film to game), many agree that the path to success is a model with the license at the core and all of the products fanning out from it, overlapping, supporting, and playing off one another to produce a cohesive entertainment experience. The goal is to seamlessly continue and extend the enjoyment the audience has had with the license, film, or game.

Edge of Reality’s Rob Brown concurs: "Games that need to match the plot of a movie more precisely are going to need access to the major plot points early on, which means the movie producer and the game publisher are going to have to work more closely together." In addition, EA’s Dan Whiting (Superman Returns) stresses the need for the film actors to be engaged early on in the game’s production as well. "The sooner the game studio gets [the performance] data, the better. Acting and performance in animation can take quite some time; it’s no good to lay down a voice track a month before the game is scheduled to ship. Early involvement is even more important than frequent or large involvement."


Successfully creating game properties based on feature films, such as The Godfather (left) and Meet the Robinsons (second from left), requires a tremendous amount of cooperation between both groups at the planning stages.

When crafting film-based games—such as those from Over the Hedge (second from right) and Superman Returns (right)—developers need to take advantage of the interactive space and create properties that are more than a marketing vehicle for the movie.

Many developers agree that the involvement of the film actors is crucial to forging that all-important nexus between film and game. However, while digital asset sharing is becoming easier and easier, what the two worlds may not be able to share are their inherently incompatible artistic sensibilities. Actors and filmmakers want to make drama; game artists are usually focused on producing visceral, and often violent, experiences. Actor Matt Damon has been notoriously vocal in his disdain for the violence of the medium, and has steadfastly refused to participate in any games based on the Bourne film franchise, leaving a gaping hole in that game series.

"They offered me a huge sum of money, but I won’t be involved in any ‘body count’ games," Damon says. In the same vein, director Francis Ford Coppola had this unfavorable response to an early build of EA’s The Godfather: "They use the characters everyone knows, and then for the next hour they shoot and kill each other. I had absolutely nothing to do with the game, and I disapprove."

Ultimately, it’s the consumers who will determine what direction these games will take. If the filmmakers’ vision and the emotional connection to the actors are important to their gameplay experience, the industry may be forced to concede to their demands and change—or risk suffering the financial losses. Putting such pressure on developers to question and transcend the conventions of game-making can only lead to creative growth. And for an art form still in its infancy, that may be a good thing.


Martin McEachern is an award-winning writer and contributing editor for Computer Graphics World. He can be reached at martin@globility.com.