Volume 26, Issue 4 (April 2003)

Character Driven

The Getaway provides the adrenaline rush of an action movie and the immersion of a first-person shooter game

By Karen Moltenbrey

When Sony, Microsoft, and Nintendo unveiled their current generation of robust game consoles in late 2001, they were hoping to steal a piece of the high-end gaming business from the PC platform, which had dominated the market through its ever-increasing power. To be successful, the console makers needed partners—game developers with visions of creating innovative titles.

The more daring of these responded with products whose dazzling digital imagery rivaled, and often surpassed, the graphics in PC games. They even started taking aim at a new target: feature films—no easy feat considering that graphics in games, unlike those in the movies, are rendered in real time to enable the medium's interactivity.

To take this step, some big names in gaming began adopting certain technologies used to generate CG in films. Several months ago, gaming giant Electronic Arts (EA) scored a major-league hit with its Triple Play baseball title by using realistic-looking characters created from digital scans of each player's head and face. Then, game developer LucasArts and film production company Industrial Light & Magic eclipsed EA's achievement, as the two companies from the George Lucas empire shared their software and processes to create the game Star Wars Bounty Hunter (see "Combining Forces," February, pg. 16).

Now Sony Computer Entertainment Europe's (SCEE) Team Soho has leapfrogged those accomplishments with the first game to truly blur the line between prerendered computer-generated film imagery and real-time rendered 3D game imagery. What's more, the title not only breaks new ground with star-quality characters and compelling backgrounds, it also includes a fully scripted story befitting the big screen.

Three and a half years in the making, The Getaway incorporates the latest graphics tools and techniques—such as motion capture, digital scanning, and morphing—that were used to create the virtual actors in recent films such as Die Another Day, Star Trek: Nemesis, and The Lord of the Rings: The Two Towers. As a result, the characters neither look nor act like typical polygonal game models—they are incredibly lifelike, from their facial expressions to their body language. Moreover, the London environments in which the action takes place are authentic re-creations of the city's locales; even the game's weather and traffic patterns are based on real data.

To give The Getaway the look and the feel of a film rather than a game, the producers treated every aspect of the project like a movie—from the story development to the final cut. "We wanted to broaden the market for the PlayStation 2 by making a game that is adult-oriented in terms of an emotive story line, similar to what you have in a good film," says Gavin Moore, Team Soho's director of animation (see "Escape Artist," pg. 48). "We wanted it to appeal to movie audiences and gamers, so we reduced the playing time from 40 hours to 20-plus because adults don't have that much time to devote to a game. But that meant we had to provide extra value elsewhere to justify the product's $40 price."

Sony's Team Soho achieved unprecedented realism for The Getaway, which contains lifelike characters and authentic environments.




Team Soho tackled that challenge first by having scriptwriter Katie Ellwood craft a compelling story that drives the game. She then filled the narrative with complex characters, including those that audiences love to hate. "Movies are very good at setting up situations that make you despise the characters," says Moore. "And that's what we wanted to do as well."

To that end, The Getaway tries to evoke a strong reaction from players at the outset, immersing them in the drama through a realistic CG cut-scene, or video clip. The story focuses on Mark Hammond, an ex-gangster and convicted bank robber who is trying to walk the straight and narrow, and Charlie Jolson, a notorious crime boss who, after ruling London's underworld for 30 years, is now struggling to maintain his illegal enterprises as rival gangs encroach on his territories. To enlist the aid of Hammond (his former associate), Jolson kidnaps Hammond's wife and son and orders him to eliminate the competition. But the plan goes awry, and Hammond's wife is killed, making the ex-con even more determined to get his son back and exact revenge on Jolson. The player assumes the role of Hammond and has 12 hours—which unfold in real time—to carry out the mobster's orders. (Later, the player can replay a second 12-hour iteration of the story as the vigilante cop Frank Carter.)

"You start the game after witnessing those events, only to discover that Jolson has framed you for the murder, making it impossible for you to go to the police," explains Moore. "So you immediately begin to hate Jolson and the guys who shot your wife." To maintain the player's emotional investment in the plot, Team Soho integrated an unprecedented 1 hour and 23 minutes of film-quality cut-scenes, each approximately 2.5 minutes long. "The story line is never tacked on as an afterthought," adds Ellwood. "It is always at the forefront, pushing the game forward."

The Getaway's cut-scenes are innovative on a technical level as well. Rather than using prerendered scenes that are played frame by frame in a movie format, The Getaway's cinematics are processed in real time by the PS2's engine, as are the game graphics. This enabled the artists who created the clips to make changes on the fly during production and view the results instantly. "The total length of our cinematics is comparable to a full-length CG movie," says Moore. "We didn't have the time required by the usual process of rendering all night, making changes in the morning, and then rendering the scene again and again until we got it right."

Rather than rewrite the part of crime boss Charlie Jolson for a younger person, the artists "aged" the actor by applying effects makeup and then rescanning him.




To create the cinematics in real time, the team used Kaydara's MotionBuilder software, which is often employed for producing real-time animation in broadcast applications and for processing and applying motion-captured data to 3D models. "We were already using MotionBuilder for our mocap data," Moore says, "when we decided to use it as our real-time game editor too." He estimates that the real-time editing and rendering functions enabled the 12-person cut-scene team to complete what would have required 100 people to do using a traditional animation/rendering package. Furthermore, MotionBuilder's real-time playback feature helped the group achieve a consistent look between the game and cut-scene imagery. "The difference is not jarring, like it usually is," adds Moore.

The artists put the real-time rendering to the test with the game's robust character models, which appear during play and in the cinematics. From the main actors to the pedestrians, drivers, and work crews populating the scenes, each had to look the part it was playing. Early on, Simon Wood, a production designer from the James Bond films, created several character sketches, which the digital artists used as reference while building the models from scratch in Alias|Wavefront's Maya. Unfortunately, these models did not hold up well against the photorealistic backgrounds the artists had already created.

(From left to right) The artists regenerated the face of actor Ricky Hards (Charlie Jolson) with Eyetronics' ShapeSnatcher, which uses a light-projected grid to obtain depth information. The software then meshes together the multiple surfaces to form a complete 3D model of the head.




A better option, Moore determined, was to digitally scan the heads and faces of actual people and map the data onto 3D character models, giving them a realistic appearance. Thus, he did as any film director would do and held a full-fledged casting call. "We needed actors for the motion-capture session and for the voice work," he says, "so we scanned them too." Using Eyetronics' ShapeSnatcher Suite, the group captured the facial details of approximately 125 people, 25 of whom are principal actors in the game, while the remaining 100 appear as extras.

"The characters have such a high level of detail that we'd still be building them years from now if we had modeled them from scratch," contends artist Dave Smith of Team Soho. "You need the proper realism to fully project emotions in the characters. We wanted the player to see the expressions on a character's face, and you just can't achieve that level of detail by hand."

Where the developers hit a wall was in the sheer volume of work. That's because all the characters had to be modeled, rigged for animation, and set up with SCEE's proprietary real-time facial animation software. Early in the process, the group integrated the animation system into a complex, fully skinned head template that uses "joints" to simulate the muscles in the face for producing emotions. Then, the team wrapped the facial textures from the scans onto that base model, ensuring that the simulated muscles—positioned differently for each scanned actor to provide a unique appearance—lined up correctly under the skin on the model. The artists accomplished this by morphing the textures onto the template with Eyetronics' Liquid Faces software, which aligned the muscle joints to match the geometry of the scan and correctly positioned the character's skin over the template. "Then, we just added hair and tweaked the animation to match the actor's movements, using video of the real person as reference," says Smith.
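To make the mechanics of that retargeting step concrete, the sketch below shows one way a joint-driven face rig might be moved onto a scanned head once the scan has been morphed into vertex correspondence with the template. All names and structures here are hypothetical illustrations, not SCEE's or Eyetronics' actual code.

    // Hypothetical sketch of retargeting a joint-driven face rig to a scanned
    // head. Illustrative only; not SCEE's or Eyetronics' actual software.
    #include <vector>

    struct Vec3 { float x, y, z; };

    // One "muscle" joint on the template head, plus the index of the template
    // vertex it sits under. After retargeting, the joint moves to the same
    // vertex on the scanned head so expressions line up with the new face.
    struct MuscleJoint {
        Vec3 position;      // current position (template space, then scan space)
        int  anchorVertex;  // template vertex this joint is bound to
    };

    // Retarget every muscle joint from the template geometry to the scan.
    // scanVertices must already be in vertex correspondence with the template
    // (vertex i on the scan is the same anatomical point as vertex i on the
    // template), which is roughly what the morphing step guarantees.
    void retargetFaceRig(std::vector<MuscleJoint>& joints,
                         const std::vector<Vec3>& scanVertices)
    {
        for (MuscleJoint& j : joints) {
            j.position = scanVertices[j.anchorVertex];
        }
    }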

Not only do the digital actors walk the walk in The Getaway, but they also talk the talk. Using a combination of Puppet Works' Voiceworks and proprietary software running inside Maya, the artists synchronized the face and lip movements of the models with the voices of the real actors recorded during the motion-capture shoot. Team Soho even hired actors from specific areas of London so the characters would have the proper accents.
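The general idea behind that kind of lip synchronization can be sketched as a lookup from timed phonemes to mouth shapes, as below. This is a minimal illustration under assumed data structures; it does not reflect Puppet Works' or SCEE's actual tools.

    // Minimal lip-sync sketch, assuming dialogue has already been broken into
    // timed phonemes. Hypothetical structures, for illustration only.
    #include <map>
    #include <string>
    #include <vector>

    struct MouthPose { float jawOpen; float lipPucker; float lipWide; };

    struct TimedPhoneme { float startTime; std::string phoneme; };

    // Look up the viseme (mouth shape) for whichever phoneme is active at time t.
    MouthPose mouthPoseAt(float t,
                          const std::vector<TimedPhoneme>& track,
                          const std::map<std::string, MouthPose>& visemes)
    {
        MouthPose pose = {0.f, 0.f, 0.f};   // neutral, closed mouth by default
        for (const TimedPhoneme& p : track) {
            if (p.startTime <= t) {
                auto it = visemes.find(p.phoneme);
                if (it != visemes.end()) pose = it->second;
            } else {
                break;                       // track is sorted by start time
            }
        }
        return pose;
    }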

To make the characters move as realistically as they looked, the group shot approximately 11 hours of motion-capture footage, using both magnetic and optical systems to accurately record the real actors' movements as they performed a scene. For the cut-scenes, the team used Ascension Technology's MotionStar Wireless magnetic tracking system and a set of 14-sensor 5DT data gloves to motion-capture the actors as they performed on a fully built physical set. Because the system is magnetic, the team could accurately acquire the movements within a crowded space without incurring the data loss that often results when optical-system markers are occluded from the camera's view by props, sets, and other actors.

Motion data was tested on a low-res Maya model (above) before it was applied to a fully textured character (see below).




For the in-game graphics, which do not feature the same close, one-on-one interactions among the actors, the group used a Vicon 8i optical system from Vicon Motion Systems because it provides cleaner, more precise results when given unobstructed views, and its setup allows for a larger capture area.




"The motion-capture process was exciting for the actors," says Joe Rice, who plays detective Frank Carter in the game. "It's like working in a theater-in-the-round in the sense that the actor is seen from every angle, giving the person unlimited freedom of expression in his or her role."

The animators, using the real-time functionality in MotionBuilder, applied the resulting data from both mocap systems to fully rigged 3D body models created in Maya. Using measurements taken from a low-resolution digital scan of each actor, the artists had molded each body to precisely match the corresponding actor. MotionBuilder then processed more than 512MB of information in real time, giving producer Brendan McNamara a clear picture of the resulting animation as it was being created. "During the motion-capture session, McNamara was free to direct the actors and extract their best performances without worrying about getting accurate mocap data," says Moore. "Meanwhile, I was looking at the digital characters on the screen, so I could tell immediately whether we were getting the necessary motion data."

The artists spared no detail in creating game sets. Above is a photo of London's Regent Street; below is the virtual version.




Back at SCEE's offices, the animators tweaked the data inside MotionBuilder and animated the dynamic objects, such as doors, cars, and chairs. The team also used the software to light the scenes and perform the camera cuts—tasks usually done in the modeling software. According to Moore, by using MotionBuilder as the hub for its entire production pipeline, the group retained the integrity of its data by not transferring it among different programs.




The realism of The Getaway characters is matched by that of the game's virtual sets. Using survey maps and infrastructure data, the artists re-created 40 square kilometers of London to generate an authentic, living, breathing city. "We created our geometry in real-world units, so it takes the same amount of time to drive from one side of our London as it does in real life," Moore explains. And, as the player moves from one location to another, the look and feel of the neighborhoods gradually change.
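As a back-of-the-envelope illustration of what building at 1:1 scale buys (the distance and speed below are hypothetical, not figures from the game):

    // If the map is built in real-world units, travel time at a given speed
    // matches real life. Illustrative numbers only.
    #include <cstdio>

    int main() {
        const double distanceKm = 6.0;   // hypothetical cross-town route
        const double speedKmh   = 40.0;  // hypothetical average driving speed
        const double minutes    = distanceKm / speedKmh * 60.0;
        std::printf("Driving %.1f km at %.0f km/h takes %.1f minutes, "
                    "in the game and on the real street alike.\n",
                    distanceKm, speedKmh, minutes);
        return 0;
    }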

Yet, what makes these environments especially unusual is how they were built. First, a location scout surveyed the city, searching for both indoor and outdoor locales, and then the artists hit the streets with digital cameras, painstakingly photographing all the objects. This information was then used as reference for constructing not only a digital replica of a certain location, but a physical one as well. "At times we used some pretty dangerous settings, like crack houses," Moore says, "and we obviously couldn't just go into a place like that for very long with our actors and equipment."

Although the extra construction was time consuming, it provided the artists with the required accuracy. Because the actors, their movements, and now the sets were re-created digitally in exact detail, any motion an actor performed on the physical set—picking up a bottle, for instance—occurred identically in the digital world.

Creating an authentic replica of London was vital to the game but an enormous job, complicated by the city's changing seasons. "The game takes place in one day, so the weather has to remain consistent," Moore notes. For added realism, the team integrated a proprietary weather system that changes the conditions and the scene lighting—from bright and sunny to dark and rainy—to mimic an average fall London day, from sunrise to sunset.
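One simple way to realize such a day cycle is to key a handful of sky states across the game's single day and blend between them as the clock advances. The sketch below is an illustrative guess at that approach, not Team Soho's actual system.

    // Hypothetical time-of-day weather/lighting blend: a few keyed sky states
    // across the day, interpolated as the game clock advances.
    #include <cstddef>
    #include <vector>

    struct SkyState {
        float hour;        // time of day (e.g., 6.5 == 06:30)
        float brightness;  // overall light level, 0..1
        float rain;        // rain intensity, 0..1
    };

    // Linearly blend between the two keys that bracket the current hour.
    SkyState skyAt(float hour, const std::vector<SkyState>& keys)
    {
        if (hour <= keys.front().hour) return keys.front();
        if (hour >= keys.back().hour)  return keys.back();
        for (std::size_t i = 1; i < keys.size(); ++i) {
            if (hour <= keys[i].hour) {
                const SkyState& a = keys[i - 1];
                const SkyState& b = keys[i];
                float t = (hour - a.hour) / (b.hour - a.hour);
                return { hour,
                         a.brightness + t * (b.brightness - a.brightness),
                         a.rain       + t * (b.rain       - a.rain) };
            }
        }
        return keys.back();
    }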

Moreover, the group created a timed traffic pattern based on actual data, so cars in the game stop at red lights, just as real London drivers would. The team also added a physics system, so car crashes that occur during game play result in realistic damage.
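A timed signal system of that kind can be as simple as cycling each junction through phase durations taken from real-world data; AI drivers then brake whenever their junction reads red. The following sketch illustrates the idea with hypothetical structures, not the game's actual traffic code.

    // Hypothetical timed traffic signals driven by real-world phase durations.
    #include <cmath>

    struct SignalTiming {
        float green;   // seconds of green
        float amber;   // seconds of amber
        float red;     // seconds of red
    };

    enum class Light { Green, Amber, Red };

    // Work out the light state at a junction from the elapsed game time.
    Light lightAt(float gameSeconds, const SignalTiming& t)
    {
        const float cycle = t.green + t.amber + t.red;
        const float phase = std::fmod(gameSeconds, cycle);
        if (phase < t.green)           return Light::Green;
        if (phase < t.green + t.amber) return Light::Amber;
        return Light::Red;
    }

    // An AI driver stops when its junction shows red.
    bool shouldStop(float gameSeconds, const SignalTiming& junction)
    {
        return lightAt(gameSeconds, junction) == Light::Red;
    }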

Players can learn new information about the story line by assuming the role of rogue cop Frank Carter.




At times Team Soho feared that it was throwing too much information at the PS2, the only platform on which The Getaway runs. "Ken Kutaragi, who invented the PS2, was worried that we might burn out the console's DVD [drive]," Moore recalls, "because the laser would be constantly moving across the DVD [to read all the information], especially when it came to rendering the streets of London." To prevent this from happening, the group recorded its virtual city onto the DVD in map-specific order. That meant all the streets feeding off Oxford Street, for example, were placed onto the disc just as they appear in real life, so if the player turns right onto Oxford Street, the laser moves right as well. "The laser does not have to move far to load the data," Moore adds. "So, the load is instantaneous, and we reduced the wear on the PS2 motors."
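Conceptually, that map-specific layout amounts to writing each street's data next to the data of the streets that feed off it, so a turn in the game means only a short seek. A breadth-first ordering from a main street, sketched below with hypothetical structures, captures the idea; it is not Sony's actual mastering process.

    // Hypothetical disc-layout ordering: place each street's data near the
    // data of the streets connected to it, starting from a main street.
    #include <queue>
    #include <vector>

    struct Street {
        std::vector<int> connectedStreets;  // streets that feed off this one
    };

    // Breadth-first order from a chosen main street (e.g., Oxford Street):
    // each street's neighbours land close to it in the resulting layout.
    std::vector<int> discLayoutOrder(const std::vector<Street>& map, int mainStreet)
    {
        std::vector<int> order;
        std::vector<bool> placed(map.size(), false);
        std::queue<int> frontier;
        frontier.push(mainStreet);
        placed[mainStreet] = true;

        while (!frontier.empty()) {
            int s = frontier.front();
            frontier.pop();
            order.push_back(s);               // write this street's data next
            for (int n : map[s].connectedStreets) {
                if (!placed[n]) { placed[n] = true; frontier.push(n); }
            }
        }
        return order;
    }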

Achieving the authentic look and feel in The Getaway was an expensive investment in both time and money (nearly $8 million), but a necessary one for drawing the player into the story line. Even so, Moore considers the title a steal compared to the production costs of a typical movie.

"Games will be evolving tremendously during the next few years, pushing the level of realism and story lines even further as players look for innovative and exciting products," Moore says. The result, he adds, will be a win-win situation: Consumers will benefit because they will have better games, and the graphics industry will advance because the market will require better tools and techniques.

For now, the mix of real-time 3D graphics, 3D scanning, motion capture, and the PlayStation 2's hardware power has enabled SCEE to create a game unlike anything that has come before. And once players boot it up, they're unlikely to care about the technology that makes The Getaway a reality because they'll be too immersed in the story, characters, locations, and action on the virtual London streets.

Karen Moltenbrey is a senior editor for Computer Graphics World.




Alias|Wavefront www.aliaswavefront.com
Ascension Technology www.ascension-tech.com
Eyetronics www.eyetronics.com
Kaydara www.kaydara.com
Puppet Works www.puppetworks.com
Vicon Motion Systems www.vicon.com