Volume 29, Issue 7 (July 2006)


By Karen Moltenbrey
Artists create a 3D animation, projected in 360 degrees, to promote EA's NBA Live 06

Electronic Arts recently challenged the digital artists at Brickyard VFX to a game of roundball, but the court, the players, and everything else involved were completely digital.
Electronic Arts wanted an animated piece promoting its EA Sports title NBA Live 06 that would feature the video game's characters, realistic virtual versions of actual professional NBA players, performing amazing feats on the court in front of a packed arena. And the presentation had to play out on a large 360-degree screen in front of the gaming industry's most discriminating eyes.
For this project, the name of the game was realism. NBA Live 06, released several months ago, takes character modeling and animation to a new level of realism, and the promo had to do the same, and more, since the animation would play out on a huge circular screen where every digital detail (or potential flaw) would be clearly visible. In the end, Brickyard achieved those goals: viewers had to look closely to determine whether the action they were seeing was CG or video footage from an actual game.

Brickyard was tapped to rework game assets from EA's NBA Live 06 for projection onto a large 360-degree screen. To accomplish this, Brickyard had to tweak many of the game assets by creating new lighting setups and advanced shaders.
To achieve this, Brickyard passed the project to its in-house 3D division, which opened last fall. There, the artists used game assets, motion-capture data, and sophisticated lighting techniques to create the final animation, which included a digital stadium filled with 25,000 fans. Brickyard delivered the final animation at high resolution for 360-degree projection on EA’s 2006 Video Wall during E3, the world’s largest electronic gaming show. “We were able to take game models and assets and, through lighting and shading techniques, show the kind of filmic quality and resolution that’s possible with 3D,” says Brickyard artist Yafei Wu.
According to Jay Lichtman, executive producer at Brickyard, the company established its 3D studio to support the ever-demanding and technical requests coming from various sectors of the industry: television commercials, gaming, and film. Previously, the company delivered high-end visual effects scenes through its in-house compositors but outsourced the CGI portion. “It made more sense to create an in-house 3D studio that worked directly with our compositors and graphic artists,” he says. “This enables a much more efficient internal pipeline and, inevitably, raises the level of quality on any project with the artists working side by side.”
Since its formation, the department had completed nearly 15 projects before getting tapped for the EA assist. Among those were high-end commercials that aired during the Super Bowl and the Olympics, in addition to CG promos for the gaming industry. Currently, Brickyard is in discussions with several studios about feature-film work.
In the Game
NBA Live 06 features photorealistic models of basketball superstars that perform, act, and actually look like their real-life counterparts. For the promo project, EA handed off those in-game assets, including the player models, stadium, textures, and even the motion-capture data used to animate the game's digital basketball players. (For an in-depth look at how the game was created, see “The Art of the Deal,” October 2005, pg. 15.) Because the artists did not have to create the models from scratch, they started well ahead of the game. On the other hand, the data required a great deal of work to make it usable for the promo animation, so the first order of business whenever EA delivered an asset was to clean it up. Sometimes this was accomplished with scripts the group had created; the remaining assets required basic manpower. Some of the tweaks were to the mocap data, while others demanded the development and application of advanced shaders and lighting setups so the CG players would look realistic on the large screen.
As Lichtman points out, the biggest challenge was the sheer volume of assets and textures needed to create a full CG stadium filled with a cheering crowd and on-court players. So it was vital, both for image quality and for meeting the deadline, that the group work with those assets in an efficient, manageable way. “Our R&D phase allowed us to establish an efficient pipeline, and the subsequent render times were no longer such an issue,” he adds.
Yet, to work with this amount of data, the team had to write myriad scripts to facilitate every process. As a result, the artists were able to focus on the aesthetics and let the pipeline handle the rest. For example, the shaders Brickyard created picked up the textures from the assets they were attached to, enabling the group to manage far fewer shaders.

Brickyard created scripts that cleaned up the game assets, so the team could work with the complex data more efficiently.
EA originally built the NBA roster using image-based techniques and, with its custom Vicon mocap setup and Autodesk's MotionBuilder software, applied the animation information and cloth simulation to the 3D models. When remodeling tweaks were necessary, Brickyard used Autodesk's Maya, along with Pixar's PhotoRealistic RenderMan (PRMan) for all the shading and rendering. Compositing for the project was done in Adobe's After Effects and Apple's Shake, while image enhancements were made in Adobe's Photoshop.
When Brickyard received the EA mocap data, it arrived sometimes as full Maya scene files from which the animation was extracted, and sometimes as animation files. Whenever changes had to be made to the motion-captured data, the revisions were minor: planting a player's feet on the floor, tweaking the way objects and cloth intersected, and re-animating to accentuate the original data and accommodate the client's requests. The artists also received the simulated cloth animation from EA that was used for the players' uniforms.
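The article doesn't detail Brickyard's cleanup tools, but one of the fixes it names, planting a player's feet, can be sketched as a simple clamp: foot heights that hover just above (or dip just below) the floor are snapped to the floor plane. A minimal toy version, with illustrative values:

```python
# Hypothetical sketch of a foot-planting cleanup pass (not Brickyard's
# actual script): heights within a small tolerance of the floor are
# clamped to it, removing float and jitter from the mocap curve.
def plant_feet(foot_heights, floor=0.0, tolerance=0.02):
    """Clamp near-floor foot heights to the floor plane."""
    return [floor if h < floor + tolerance else h for h in foot_heights]

# A foot that dips to -0.01 or hovers at 0.015 snaps to the floor;
# a foot mid-stride at 0.3 is left alone.
print(plant_feet([-0.01, 0.015, 0.3]))  # [0.0, 0.0, 0.3]
```

A production tool would operate on animation curves rather than bare samples, but the clamp-to-contact idea is the same.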
“There were revisions to the animation even through the last week of the schedule,” says Lichtman. “These changes, though, were facilitated by the use of a nice control rig that EA had created for the players.”
Faces in the Crowd
The game crowd, meanwhile, was driven by motion-capture animation recorded by EA using six performers. The team then applied those actions to six digital characters, and turned the movements into animation cycles. The Brickyard artists also received placement locators along with a script to place objects where the locators were. Next, they created RIB archives of each animated model so they could use simple proxy boxes in their scenes; at render time, the boxes were automatically switched to a random model with random animation based on predetermined parameters. The artists then replicated, shaded, textured, and rendered the crowd to create a randomized effect. According to Wu, the crew used live-action reference stills to add detail and efficiently output all the data, including a crowd of thousands, within PRMan.
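Brickyard's own scripts aren't published, but the proxy-swap idea described above can be sketched: at scene-generation time, each locator emits a RIB fragment that pulls in one of the pre-baked animated archives, chosen pseudo-randomly. The archive names, cycle counts, and scale range below are illustrative, not EA's or Brickyard's:

```python
import random

# Toy sketch of swapping proxy boxes for random crowd characters.
# Archive names and parameters are made up for illustration.
CHARACTER_ARCHIVES = ["fan_a", "fan_b", "fan_c", "fan_d", "fan_e", "fan_f"]
ANIM_CYCLES = 4  # number of baked animation cycles per character

def emit_crowd_rib(locators, seed=0):
    """Return RIB fragments that replace each proxy position with a
    random character archive, random cycle, and slight scale variation."""
    rng = random.Random(seed)  # fixed seed keeps re-renders repeatable
    frags = []
    for x, y, z in locators:
        char = rng.choice(CHARACTER_ARCHIVES)
        cycle = rng.randrange(ANIM_CYCLES)
        scale = rng.uniform(0.95, 1.05)  # per-fan scale variation
        frags.append(
            "AttributeBegin\n"
            f"  Translate {x} {y} {z}\n"
            f"  Scale {scale:.3f} {scale:.3f} {scale:.3f}\n"
            f'  ReadArchive "{char}_cycle{cycle}.rib"\n'
            "AttributeEnd"
        )
    return frags

frags = emit_crowd_rib([(0, 0, 0), (1.2, 0, 0)], seed=42)
print(frags[0])
```

In production, PRMan's delayed-read procedurals would defer loading each archive until the renderer actually needs it, which is what makes a 20,000-agent crowd tractable from a scene of simple boxes.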
“Of course, each character varied slightly in scale,” notes Lichtman. “Finding the right amount of hue shift proved to be the key factor in ultimately matching the reference images of real crowds supplied by the client. Because we ended up with more than 20,000 proxy boxes in the scene (the number of people rendered in the crowd), we actually made a RIB archive that included the stadium, as well. So in the end, our final scene only consisted of 10 hero players and one polygon box that, at render time, switched into a fully animated crowd populating a fully lit stadium.”
For the stadium, Brickyard modeled the space in 3D, and used HDR technology to light the interior building as well as the players and fans. According to Lichtman, the group started by lighting the stadium in the usual way, with spot, point, and area lights, and ambient occlusion. Once the lighters were happy with the results, they baked the lighting as a “brickmap,” which is a Pixar RenderMan feature that allows artists to bake information as a hierarchical 3D texture.
“In this case, we baked all the lighting information and used that in conjunction with a custom shader that multiplies the lighting information with our textures,” Lichtman explains. “This allowed us to change textures (primarily for logos and such, if and when the clients requested it) without the overhead of all the lighting calculations.”
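Lichtman's description amounts to a per-point multiply of precomputed irradiance with the surface texture, so swapping a texture never re-triggers a lighting calculation. A toy version of that composite, with made-up sample values standing in for the brickmap and texture data:

```python
# Toy sketch of the bake-and-multiply idea: lighting is computed once
# and stored (here, a flat list standing in for a brickmap), then
# multiplied with an easily swappable texture. Values are illustrative.
baked_lighting = [0.2, 0.8, 1.0, 0.5]   # precomputed irradiance samples
logo_texture_v1 = [1.0, 1.0, 0.1, 0.1]  # original logo texture
logo_texture_v2 = [0.1, 0.1, 1.0, 1.0]  # client-requested replacement

def shade(lighting, texture):
    """Final color = baked lighting * texture, per sample."""
    return [l * t for l, t in zip(lighting, texture)]

# Swapping the texture needs no new lighting calculation:
print(shade(baked_lighting, logo_texture_v1))
print(shade(baked_lighting, logo_texture_v2))
```

The real shader works in RenderMan's shading language against a 3D brickmap lookup rather than flat lists, but the separation of baked lighting from swappable texture is the point.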
With the stadium lighting complete, the group created an HDRI map of the result and used that as its base lighting setup for the crowd and players. For the players, the artists used an average of six area lights to beauty-light the players on top of the HDRI base, putting the rims and speculars wherever they were needed. The HDRI was used for the main diffuse and bounce lighting, while the rims and speculars were more directed to create an artistically appealing look.
“The shaders we created were fashioned in such a way that we could reuse as much as possible of what we had received from EA,” says Lichtman. “This meant we had to create a custom normal map shader that relinked the Maya Hypershade textures to our custom PRMan shaders. Our shaders for the players consisted of one diffuse layer, three different specular/reflective layers, one rim layer, and a subsurface scattering layer.”
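The layer breakdown Lichtman describes can be pictured as a simple sum of contributions per shading point. The weights below are invented, and a real PRMan shader filters and colors each term separately; only the layered structure comes from the article:

```python
# Toy sketch of the layered player shader: one diffuse layer, three
# specular/reflective layers, a rim layer, and a subsurface term,
# combined per shading point. All values are illustrative.
def shade_player(diffuse, speculars, rim, sss):
    """Combine the shader layers; real shaders weight and color each
    term, but the layered composition is the point of this sketch."""
    assert len(speculars) == 3  # three specular/reflective layers
    return diffuse + sum(speculars) + rim + sss

result = shade_player(0.4, [0.1, 0.05, 0.02], 0.1, 0.05)
print(result)  # roughly 0.72
```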
Because of the nature of the project, Brickyard tried to restrict itself to just using the textures from the game. The only textures created in-house were for the subsurface scattering layer. Brickyard generated those by building a rough skeletal structure inside every player, and using the distance between that and the player to determine the strength of the subsurface scattering. That was baked out, Lichtman explains, as a texture to accentuate the subsurface scattering calculation.
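The thickness-texture idea above can be sketched in a few lines: for each surface point, measure the distance to the nearest point of a rough internal skeleton, and map shorter distances (thinner flesh, such as ears or fingers) to stronger scattering. The skeleton points and falloff below are made up for illustration:

```python
import math

# Hypothetical sketch of baking a subsurface-scattering strength from
# the distance to a rough internal skeleton (not EA's or Brickyard's
# actual falloff). Closer to the skeleton means thinner geometry and
# therefore stronger scattering.
skeleton = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)]  # crude internal structure

def dist(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def sss_strength(surface_point, max_depth=0.5):
    """Bakeable scattering weight: 1.0 at the skeleton, falling
    linearly to 0.0 at max_depth."""
    d = min(dist(surface_point, s) for s in skeleton)
    return max(0.0, 1.0 - d / max_depth)

print(sss_strength((0.1, 0.0, 0.0)))  # near the skeleton: strong
print(sss_strength((0.6, 0.0, 0.0)))  # far from it: no scattering
```

Evaluating this once per texel and writing the result to a map matches the article's description of baking the strength out as a texture that later accentuates the subsurface calculation at render time.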
“We were working with a lot of information, but the real trick was lighting and rendering the piece for high resolution and a high degree of realism,” says Lichtman.

To create the 360-degree animation, the artists broke the scenes into three 120-degree films, and then stitched them together and dewarped them using Shake and After Effects plug-ins.
For rendering, the group used 12 dual-core Boxx render nodes with Pixar RenderMan. In the end, the stadium and crowd had an average render time of six minutes per frame with baked lighting and ambient occlusion, motion blur, and depth of field. The players had an average render time of six minutes per frame, including ambient occlusion, subsurface scattering, depth of field, motion blur, area lighting, and raytraced reflections for the floor.
Theater in the Round
The finished piece, three 120-degree films, was joined together using stitching and dewarping programs for projection in the round. As Lichtman explains, the group was tasked with creating a 12-second, full-CG animated basketball game at 60 frames per second in 3357x1117 resolution. This was then projected onto a 360-degree screen to create a buzz at E3. (Brickyard was given plug-ins and scripts for Shake and After Effects to warp and stitch the images together.)

Though the players are the main attraction in the animation, the background crowds add some excitement of their own. The fans were created with motion capture and then randomized.
The novel animation was an ambitious project, especially for a 3D rookie. But with the work, Brickyard’s 3D division scored a winning shot for the facility, and a slam dunk for EA.

Karen Moltenbrey is the chief editor for Computer Graphics World.  