EA races to add broadcast-quality realism to its PS2-based Formula One game
By Barbara Robertson
The "next-generation" game consoles such as Sony's PlayStation 2 are delivering 3D graphics that are tantalizingly close to realistic. These graphics are not photorealistic yet, but the promise of film-quality imagery, which would be the ultimate in real-time eye candy, tempts many game developers to focus on creating "cinematic games." Not Evan Hirsch, head of visual development at Electronic Arts Europe. "I think game developers should be looking at broadcast television as a model, not at film," he says. "Film is scripted, but games are live events, regardless of their genre. It's television, not film, that covers live events. Plus, while we're a ways away from film-quality imagery [in games], we're getting very close to broadcast-level images."
Thus, when Hirsch began thinking about the production of EA Sports' F1 Championship Season 2000 for the PS2, he decided to follow his own advice. In this game, players drive cars modeled exactly after last season's 22 Formula One cars, right down to the logos, driven on tracks modeled after all the courses on the circuit. EA even hired Andy Blackmore, a former industrial designer on the West McLaren Mercedes Formula One team, to model the cars. F1 Championship Season 2000, which is a mix between an arcade-style game and a simulation, uses a physics engine at the core of the game engine to help create a realistic driving experience. To give the player a more realistic visual experience, texture maps driven by artificial intelligence (AI) in the game engine simulate wear and tear on cars and track. In addition, Simon Britnell, lead texture artist, led a team that created texture maps for the environment around the tracks with trees, gravel, and grass appropriate for each location.
[Caption] Using a professional pit crew, EA's visual development team created scripts that allowed EA's motion capture team to collect data in groups of two or three for the 22-man crew. That data helped animators create eight pit crew scenarios for the int…
To further heighten the realism, two innovative elements help the game seem even more like a Formula One race: broadcast-style camera coverage for track previews and 30-second replays, and 22-man pit crews that were created with 3D graphics and animated with the help of motion-capture data.
Each track in the game has between 25 and 35 cameras unique to that track. A player can see the race via an onboard, first-person camera; the rest of the cameras, which are external, can give players a preview of the course via a "parade" lap and replay the previous 30 seconds of their race. "The cameras are always recording the race," Hirsch says. "We keep 30 seconds in a buffer so you can see your race in replay." Because he wanted these replays to mimic the televised coverage of Formula One races, Hirsch hired Keith McKenzie, the award-winning director of Formula One racing for the British television network ITV, as an advisor.
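A rolling 30-second buffer like the one Hirsch describes can be sketched as a fixed-size ring buffer: new frames push out the oldest ones, so the buffer always holds the most recent half minute. The class below is an illustrative sketch, not EA's implementation; the frame rate and frame contents are assumptions.

```python
from collections import deque

FPS = 60             # assumed frame rate
BUFFER_SECONDS = 30  # the game keeps the last 30 seconds

class ReplayBuffer:
    """Rolling buffer that always holds the most recent frames."""

    def __init__(self, fps=FPS, seconds=BUFFER_SECONDS):
        # deque with maxlen discards the oldest entry automatically
        # once capacity is reached -- a simple ring buffer.
        self.frames = deque(maxlen=fps * seconds)

    def record(self, frame_state):
        """Append the current frame; the oldest frame falls off the back."""
        self.frames.append(frame_state)

    def replay(self):
        """Return the buffered frames oldest-first for playback."""
        return list(self.frames)
```

In practice a game would store compact per-frame state (car positions, camera IDs) rather than rendered images, then re-render the replay from that state.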
"Half of photorealism is presenting information in the right way," Hirsch says. "Finally the game machines and PC hardware are allowing us to do these things, but they are new to game developers. We needed to bring in someone from the broadcast world who knows how to make the lighting and cameras for live coverage work."
With McKenzie's help, Hirsch determined where to place the cameras on the track and created rules for camera usage that would be implemented in the game engine. "A director knows whether to pick a wide shot or a tight shot when, say, one car is passing another," Hirsch says. "Our cameras can make those decisions during the game."
Each camera has different "behaviors," that is, different positions around the track, lenses, and heights above the track. One camera might represent a Steadicam; another might be mounted in a helicopter. The AI engine in the game uses rules to pick appropriate cameras in real time during a race. Here's how it works: When a car enters a zone, a camera starts recording. If the car is alone, it triggers a camera that zooms in. But if the car is in a pack, it triggers a camera that does a wide shot. "Keith [McKenzie] would tell me to use a high camera for a wide shot of the pack, but if the camera can't go wide enough, then to use a helicopter shot, and that's in the game," Hirsch says. If an event happens (a crash, a bump, or one car passing another, for example), the AI engine looks for the nearest camera that is flagged for that type of event.
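The rules above can be expressed as a small priority scheme: flagged-event cameras first, then wide shots for packs (with a helicopter fallback), then tight shots for lone cars. The sketch below is hypothetical; the camera names, zones, and field-of-view threshold are illustrative assumptions, not details from EA's engine.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    zone: int        # track zone this camera covers (-1 = roving)
    max_fov: float   # widest field of view, in degrees (illustrative)
    events: set      # event types this camera is flagged for

def pick_camera(cameras, zone, pack_size, event=None):
    """Choose a camera for the cars currently in a zone."""
    candidates = [c for c in cameras if c.zone == zone]
    if event:
        # An event (crash, bump, pass) takes priority: use a camera
        # flagged for that event type.
        flagged = [c for c in candidates if event in c.events]
        if flagged:
            return flagged[0]
    if pack_size > 1:
        # A pack of cars: pick the widest shot; if no trackside lens
        # goes wide enough, fall back to the helicopter.
        wide = max(candidates, key=lambda c: c.max_fov)
        if wide.max_fov >= 60:  # assumed "wide enough" threshold
            return wide
        return next(c for c in cameras if c.name == "helicopter")
    # A lone car: tightest available zoom.
    return min(candidates, key=lambda c: c.max_fov)
```

A real engine would also weigh camera distance to the car and cut-timing rules, but the selection logic follows the same if-this-then-that shape McKenzie supplied.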
It turns out, though, that the biggest challenge was not imitating broadcast coverage of a race, but figuring out how to have the cameras follow a car being driven by a nine-year-old or by someone intending to run it off the road or crash it.
"We had the cameras set for the race line, which is the line that a professional driver would use," says Hirsch. "We discovered [the cameras] were missing the races driven by first-time players, the young kids, and the theoretical lunatics."
Working in Alias|Wavefront's Maya, Mat Selby, senior 3D tools engineer, developed a proprietary camera rig system that Hirsch used to try to catch all the action no matter how unpredictable. "The camera sometimes makes mistakes," he says. "But that happens in the real world, too."
Real-world failures are intentionally mimicked in the game as well. Cars have fuel failures and other problems that require pit stops. In the real world, a pit crew can change a car's four tires and refuel it in six seconds. Hirsch wanted to imitate that process in this game using 3D characters. To do that, he knew he'd have to find a way for EA's motion capture team to collect animation data for 22 people working together, something no motion capture system is capable of doing. He hired the Orange Arrows pit team and a retired F1 team manager, borrowed a Formula One car, and began videotaping the pit team in action for each of the eight scenarios that would be in the game, filming each person with a close-up camera and a wide-angle camera. Then, he spent several months analyzing the resulting 12 hours of videotape. Ultimately, he was able to plot all the actions of each person on the crew during each pit stop. Hirsch says it was like creating a playbook for a football team by watching replays of a game. During one six-second pit stop, 12 people might be switching tires while three others manage the fuel hose, wipe the visor, watch to see when the car can leave the pit, and so forth. "It was a fascinating study," Hirsch says.
Once the motion study was completed, Hirsch and animator Graham Bell were ready for the motion capture sessions. EA's motion capture team, which is based in Vancouver, Canada, brought in equipment from Motion Analysis and its own proprietary software to capture the Benetton Playlife pit crew at work. Utilizing a complex set of motion trees developed from the videotape analysis, Hirsch's team had created scripts for motion capturing two and three people at a time. To be able to synchronize the 3D characters that would be created later, the scripts called for one person to repeat his movements in two sessions. For example, imagine that three people in the pit crew are changing a tire; call them A, B, and C. The team would first capture A and B doing their tasks, then capture B and C doing their tasks, with B repeating his movements from the first session. Later, the animators would use B to synchronize A and C so that the three animated characters would work together properly during the game. Of course, A, B, and C also had to be synchronized with the other 19 members of the pit crew. All told, it would take 66 motion capture sessions to capture data for one scenario.
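The overlapping-session scheme described above amounts to chaining performers into pairs so that each session shares one performer with the next; that shared performer's repeated take becomes the timing anchor when the clips are stitched together. A minimal sketch, with hypothetical crew-role names for illustration:

```python
def overlapping_pairs(crew):
    """Capture two performers per session; each session shares one
    performer with the next, whose repeated take is the sync anchor.
    E.g. [A, B, C] -> sessions (A, B) and (B, C), anchored on B."""
    return [(crew[i], crew[i + 1]) for i in range(len(crew) - 1)]
```

Chaining like this means any performer can be time-aligned to any other by walking through the shared anchors between their sessions, which is why a single repeated take per session pair was enough to synchronize all 22 characters.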
[Caption] With up to 35 cameras on the course available to record races for 30-second television-style replays, the trick was determining which camera to use at what point in real time during a race, particularly for cars being driven by first-time players.
Creating a complicated game such as this is hard enough, but to do so in less than a year for a new console was especially challenging. "We didn't have time to plan very far ahead," says Oliver Castle, who wrote a rendering engine from scratch specifically for this game. "We had to take an educated guess at what resolution we'd have and set that as a target for the artists."
Castle points out that one of the advantages of game consoles is that once they're in the market, the technology is stable, and as the developers learn how to work with the technology, the games get more sophisticated. In this case, though, they had to start from ground zero. They got their first PS2 developer kits in February; the game was released in December. In the intervening months, the F1 Championship Season 2000 team had to write the game engine, physics engine, AI engine, and rendering engine, and then build models and environments, paint texture maps, and learn how to work with the PS2's idiosyncrasies. For example, the machine has lightning-fast processors but only 32MB of RAM, so they had to learn how to, in effect, feed the machine's monster CPUs through a straw. "You have to look at different ways to drive the graphics synthesizer to maximize its potential," says Castle. Even though the rendering engine Castle developed for this game handles nine layers of textures per polygon and fast previews, he thinks of it as a starting point.
Other parts of the game will eventually be developed further, too. For example, while the cars, which were created in Side Effects Software's Houdini, are largely procedural and can tessellate on the fly during the game, the track, also originally created in Houdini, had to be converted to polygons for this first version. "It would have been nice to have a procedural track because then it would be resolution independent. Also, the artists could make a change in the track and the world would go with it," explains Hirsch. "The way it is now, if you change the track, the artist has to change everything that joins it: the shoulder, drainage ditch, and other surfaces."
"Creating a game is all about balance and constraints," adds Castle.
With sales of the PS2 already topping one million units in the US alone, this console is likely to stick around for a few years, giving game developers plenty of time to fine-tune that balance and find ways around those constraints. As for this team: "Our goal was to give game players a broadcast-level experience," says Hirsch. "I think we've gotten pretty close."
Barbara Robertson is Senior Editor, West Coast for Computer Graphics World.