During the past few years, as the boys of summer have taken the field, the cinematics team at Sony Computer Entertainment America (SCEA) has been looking ahead to the next season of its MLB franchise, finding ways to raise the bar for graphics, simulation, and realism. For the just-released MLB 2006, however, the group expanded its efforts beyond the immediate field of play, re-creating Major League Baseball parks across the country in their proper settings. As a result, game players can now experience the local excitement outside, as well as inside, the stadium prior to each virtual matchup.
At the start of this PlayStation 2 title, the player chooses a team and an opponent, and while the information is loading, a 15-second cut-scene movie begins, showing off the home stadium and its surrounding area, complete with the buzz and movement of realistic crowds and a variety of vendors hawking their goods. “We wanted to make the player feel as though he or she is walking toward a particular stadium on game day,” says Gregory Jüng, CG supervisor at SCEA’s Cinematic Solutions Group.
To this end, the group re-created all the city-specific ballparks in the league, providing both day and evening scenes for the approximately 50 locations. Whenever possible, the team used photographs as reference sources to accurately construct the parks, which were modeled and textured in Alias’s Maya. “The baseball fans out there know every detail about their home stadiums, down to the placement of the flowerpots,” says Jüng. “And because we were doing fly-throughs of the environments, the scope of each scene was that much greater because we had to include specific monuments and other landmarks from the surrounding area.”
In addition to the league stadiums, the artists built imaginary locales for exhibition contests and six old-time stadiums to coincide with matchups from yesteryear.
According to cinematic project supervisor Jahil Nelson, the Boston Red Sox’s Fenway Park was one of the more difficult stadiums to build digitally, mainly because of its long, boxy shape. “Fenway is a classic,” he notes, “but it was tough figuring out which angle to show it from; a camera approaching a flat surface looks boring.” To overcome this issue, the artists incorporated a larger number of vendors and crowds into the scene, which helped break up the three-dimensional space. “We did our best to capture the Boston ambiance,” adds Nelson. “Boston is a big sports town, so naturally there would be more sports fans wearing team jerseys and colors in the crowd compared to other locations.”
The crowds indeed became a focal point for the artists. In previous versions of MLB, the people outside the stadium were robotic and cartoon-like. “The crowds were sparse and the modeling, texturing, and animation were limited, with the same people and vendors appearing in each one,” says Jüng. “This time SCEA wanted a ‘feeling’ to the crowds, with the people performing different actions and interacting with one another and the vendors.” So, in addition to looking different, the characters act differently as well.
To achieve this, the artists first modeled, textured, and rigged 38 character archetypes within Maya, a task that the five-person team completed in less than four weeks. Because each character would be part of a large crowd simulation of 300 to 500 digital people, the models had to be lightweight, but they also had to be high quality for the desired level of realism.
“In a traditional pipeline, a team of our size could build about five or six fully textured and bound high-quality models, but they would not be very lightweight,” says Jüng. “To achieve the desired quality level, we needed low-res geometry with high-res color maps and displacement maps. Low-res geometry would be easy to bind and animate, and would be really light for the crowd simulation. And at render time, the high-res maps would re-create the model, transforming it from low to high resolution.”
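The render-time idea Jüng describes can be sketched in a few lines: a coarse mesh carries height values sampled from the high-resolution displacement map, and the renderer pushes each vertex (or micro-vertex, after subdivision) out along its normal to restore the detail. The function and array names below are illustrative only, not part of SCEA's actual pipeline, and a production renderer such as Mental Ray samples the map far more densely than this per-vertex toy.

```python
import numpy as np

# Conceptual sketch of render-time displacement (all names invented):
# a low-res mesh stores positions and normals; the displacement map
# supplies a height per vertex; each vertex is offset along its normal.
def displace(vertices, normals, heights, scale=1.0):
    """vertices, normals: (N, 3) float arrays; heights: (N,) map samples."""
    return vertices + normals * (scale * heights)[:, None]

# A flat patch regains relief purely from the map, so the asset that the
# animators bind and move stays lightweight.
verts = np.zeros((3, 3))
norms = np.tile([0.0, 0.0, 1.0], (3, 1))
heights = np.array([0.2, 0.8, 0.4])
print(displace(verts, norms, heights))
```

The animation-facing asset stays a few hundred vertices; the detail lives entirely in the maps and only materializes at render time.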
For the virtual crowds, such as the one shown here outside the San Francisco Giants’ SBC Park, the artists used a new character-creation technique along with a complex crowd simulation.
The challenge, though, was getting this lighter information from dense, scanned models. In the past, explains Jüng, artists would model and texture characters based on artwork, but using hand-built and hand-painted surfaces gave the imagery a cartoon quality. For this crowd, however, the group wanted the characters to look real. So, the team used real people as models. A casting call of sorts went out within the company, and various employees answered.
After selecting individuals with different body types, SCEA used an InSpeck body scanner to capture full-body and facial geometries of the volunteers, who were digitized wearing various outfits. This resulted in super-dense, high-resolution geometry that looked photoreal, notes Jüng, but was unusable for production.
Typically, artists rebuild a proper-size topology over the scan data, which takes a lot of time, and in doing so they lose all that wonderful minute detail, the exact thing the group wanted to retain from the scans. The team, however, discovered a way to solve this problem by using a combination of software: Maya, Headus’s CySlice, Geometry Systems, Inc.’s GSI Studio, and Adobe’s Photoshop. In essence, these programs produced a texture-defined model as opposed to a model with a texture on top of it.
“CySlice and GSI have been around for a while, but the new version of GSI should change the way people work with characters, and if it doesn’t, it should,” maintains Jüng. “When working with realistic models, it removes the artist from the equation somewhat, but the quality is there, and the process is much faster.”
The photoreal virtual visitors at Yankee Stadium are made from texture-defined 3D models.
To create the crowds, the artists first built a template character in Maya that outlined the desired geometric topology and UV layout. Next, they used GSI to align and merge the scanned data with the template, transferring the topology and UV layout. Then, CySlice generated the color and displacement maps.
“What’s great about this process was that we could decide what resolution we wanted: really high, really low, or something in between,” says Nelson. “And we wanted something very low in resolution but with high-res displacement and color maps.” As a result, the MLB 2006 crowd characters were the equivalent size of game-engine models, allowing them to be moved around quickly and easily within a scene.
“The low-res model looked poor, which worried a lot of people. But when it rendered, it looked great,” says Jüng. “The downside to this method is that in a traditional pipeline, you can see what you are working with, but we’d have to render it in [Mental Images’] Mental Ray every time we wanted to review the modeling or texturing. It’s a different way of working, and sometimes the artists were frustrated with the process until they got used to it.”
Using CySlice and GSI also streamlined the overall texturing process for the faces and clothing by allowing the group to define the desired surface topology and UV mapping within the program. In all, the team created more than 3000 clothing texture sets acquired from the scanning process. The downside to using texture-defined models, however, was that in order to make the cloth move, the simulation had to be done on the texture rather than the geometry, so the results were less accurate but, as Jüng notes, still fine for this application.
Immediately after the scanning session, the team used a Vicon mocap system to motion-capture the volunteers performing dozens of moves ranging from answering a cell phone to approaching a vendor, stopping, and then browsing through goods at a concession or souvenir stand. The artists then added these actions to a database. Next, the group set up a complex set of rules within BioGraphic Technologies’ AI Implant artificial intelligence software, and watched the scene unfold. The simulation technology randomly distributed the characters, mixing the crowd’s attributes based on the artists’ scripts. So if the crowd were at Boston’s Fenway Park, for instance, the artists would set a variety of attribute percentages (values for gender, age, race, clothing sets, etc.) to achieve a unique human mix that would be representative of that park’s location.
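The attribute-percentage mixing described above can be illustrated with a small sketch. Everything here, the category names, the weights, and the function, is hypothetical; the article says only that each park got its own set of percentages for traits such as gender, age, race, and clothing, from which the simulation sampled each crowd member.

```python
import random

# Invented example weights for a Fenway Park scene: a sports town gets a
# higher share of team jerseys in the mix. These numbers are illustrative,
# not SCEA's actual data.
FENWAY_WEIGHTS = {
    "clothing": {"team_jersey": 0.45, "casual": 0.40, "business": 0.15},
    "age":      {"child": 0.15, "adult": 0.65, "senior": 0.20},
}

def spawn_crowd(weights, size, seed=None):
    """Sample each character's traits independently from the park's weights."""
    rng = random.Random(seed)
    crowd = []
    for _ in range(size):
        person = {attr: rng.choices(list(opts), weights=list(opts.values()))[0]
                  for attr, opts in weights.items()}
        crowd.append(person)
    return crowd

crowd = spawn_crowd(FENWAY_WEIGHTS, size=400, seed=1)
jerseys = sum(p["clothing"] == "team_jersey" for p in crowd)
print(f"{jerseys} of {len(crowd)} fans wear team jerseys")
```

Swapping in a different weight table yields a crowd representative of a different city, which is the effect the artists' per-park scripts were after.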
Next, AI Implant intelligently selected walk cycles and animations from the library and applied them to the characters. So, sometimes a character interacts with a vendor and buys merchandise, while another may turn away or even avoid approaching the vendor at all.
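That rule-driven clip selection can likewise be caricatured in a few lines. The clip names and probabilities below are invented, and AI Implant evaluates far richer spatial and social rules, but the basic decision, picking an animation from the mocap library based on a character's situation, has this shape.

```python
import random

# Hypothetical clip library, standing in for the mocap database of
# dozens of captured moves mentioned in the article.
VENDOR_CLIPS = ["browse_stand", "buy_merchandise"]
WALK_CLIPS = ["walk", "answer_phone"]

def pick_clip(near_vendor, rng):
    """Choose a clip for one character; probabilities are illustrative."""
    if near_vendor and rng.random() < 0.5:
        # Some characters passing a vendor stop to interact...
        return rng.choice(VENDOR_CLIPS)
    # ...while the rest keep walking or take a phone call.
    return rng.choice(WALK_CLIPS)

rng = random.Random(7)
print([pick_clip(near_vendor=True, rng=rng) for _ in range(5)])
```

Because the choice is sampled per character, two runs of the same scene never produce identical crowds, which is exactly the variety the team was missing in earlier versions.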
In the end, the group created approximately 50 unique crowd scenes containing up to 14 layers, which were composited in Apple’s Shake. Rendering was done using Mental Ray (for more efficient displacement map usage), though standard Maya lights were used because they rendered faster.
Building the game’s old-time stadiums, including the Polo Grounds (which stood from 1911 to 1964), required extensive research by the artists, given that many of the parks no longer exist.
While the rest of the SCEA team spent a good portion of a year creating the game content for MLB 2006, the cinematics group was able to complete its work in about three months thanks to this new process. Jüng estimates that by contracting an outside vendor using a different methodology, the price would have been in the ballpark of $1.5 million or more. “We were able to deliver the project at a fraction of that price.” Indeed, the new modeling and simulation approach saved the team a great deal of time overall, though near the end of the project, the artists incurred longer than normal render times as the displacement maps were processed.
“We see this title every year, and we definitely broke new ground this time,” says Jüng, noting that the group is using the same character-creation techniques on SCEA’s new SOCOM III title. Looking ahead to next year’s baseball season, the artists would like to bat a thousand on the lighting for the nighttime crowd renderings, which were more generic for MLB 2006 than the group would have liked because of time restrictions.
Nevertheless, by trying new digital techniques, the cinematics team hit the ball out of the park in MLB 2006.
Karen Moltenbrey is a senior technical editor at Computer Graphics World.
For MLB 2006, Sony’s cinematics team re-created all Major League Baseball parks, including the Boston Red Sox’s Fenway Park (above), as well as nonexistent locations (below).