Issue: Volume: 23 Issue: 9 (September 2000)

Time Warp

"No surprises." That's the mantra Ed Gross and Vince Pedulla have intoned daily for the past two years, ever since they began working on a large-format 3D film that will mark the debut of Madame Tussaud's New York, a 360-degree domed theater attraction scheduled to open this fall in Times Square. And thanks to a combination of careful planning and innovative 3D graphics tools, the project has gone just as they hoped.

"We took great pains to make sure everything would look right before we sat down to create it," Pedulla says. "We didn't want any unpleasant surprises because we couldn't afford to render this film more than once. It would've taken too long and choked too many resources."

Gross and Pedulla are technical director and animation director, respectively, at Century III, the teleproduction facility at Universal Studios Florida (Orlando) that was subcontracted by Evans & Sutherland (E&S) to complete the production and post-production work for the 12-minute film. Created for the Tussaud's Group, the film, tentatively titled "It Happened in New York," combines live-action footage shot in high definition, classic stock footage, and 3D animation to transport audiences on a virtual journey highlighting many of New York City's most memorable moments in history.
Artists inserted 3D animation into classic stock and live-action film footage to re-create some of New York's most memorable moments for a domed-theater short film made for Madame Tussaud's New York. The project features such highlights as Babe Ruth's game-winning home run.

One of the first venues that audiences visit is Yankee Stadium during the 1928 World Series, where they see Babe Ruth hit his game-winning home run. Other sites include the front of the marquee where Marilyn Monroe posed for her famous photo while standing over a subway grate during the filming of The Seven Year Itch, Little Italy in the 1940s, Radio City Music Hall in the 1950s, Elvis Presley's appearance on The Ed Sullivan Show, and Central Park in 1969, along with a ticker-tape parade welcoming the Apollo astronauts back to Earth.

All the environments are computer-generated, and in all but one venue, the famous personalities come from stock footage. The other people in the show scenes are live-action extras shot against bluescreen and composited into the CG environments.

Audiences will view the show in 360 degrees and hear it via a six-channel stereo surround system when the theater opens in November. E&S is providing the entertainment system and programming that will transform the imagery created by Century III into a format that can be projected onto the theater's 40-foot domed walls. At press time, Century III had approximately two months left of post-production work on the project.

According to Pedulla, the biggest challenge about this job is that it's an immersive presentation. "In a conventional job we'd have to consider only one screen, so we wouldn't have to build any imagery that would be behind viewers or out of the camera's field of view," he explains. "But with 360-degree domed presentations, five screens cover all the way around the theater, from the floor to the ceiling, and the camera's in constant motion. So we have to build not just what's in front of viewers, but also what's behind and above them, and to their left and right."

That means Pedulla and Gross had to create the equivalent of a five-camera rig inside Alias|Wavefront's Maya, the software the Century III modelers and animators used to produce the film. "We have five cameras grouped as one node that we move as a unit, and each camera shoots its own view: front, left, right, top, and behind, similar to a cube but without the bottom," Gross explains. "So instead of seeing the CG environment from the point of view of only one camera, we're showing it from the point of view of five, each with its own 90-degree view of the scene."
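The rig Gross describes can be sketched in a few lines. This is an illustrative stand-in, not the production's Maya setup: five fixed orientations parented under one rig node, so animating the single parent re-aims all five 90-degree views at once.

```python
# A minimal sketch (not the actual Maya rig) of the five-camera
# "cube without a bottom": one parent node, five child cameras,
# each covering a 90-degree view.

# Orientation of each camera relative to the rig node, as
# (yaw, pitch) in degrees. Yaw spins about the vertical axis;
# pitch tilts the view upward.
CAMERA_RIG = {
    "front": (0.0, 0.0),
    "left": (90.0, 0.0),
    "right": (-90.0, 0.0),
    "back": (180.0, 0.0),
    "top": (0.0, 90.0),
}

FOV_DEGREES = 90.0  # each view is one quarter-turn, so the five
                    # frusta tile everything except the floor

def world_orientation(rig_yaw, rig_pitch, cam_name):
    """Moving the rig node moves every camera: a camera's world
    orientation is the rig's orientation plus its fixed offset."""
    yaw, pitch = CAMERA_RIG[cam_name]
    return ((rig_yaw + yaw) % 360.0, rig_pitch + pitch)

# Animating the one rig node re-aims all five views together.
print(world_orientation(45.0, 0.0, "left"))  # (135.0, 0.0)
```

Because each camera's offset is fixed relative to the parent, the five rendered views always share one nodal point and can be stitched seamlessly.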

That also means the final rendered frames for the project are huge, at 10MB each, to provide film-quality resolution. "For every final frame we build, we have to render five frames because we have to render five different camera views that must be stitched together to form a seamless, high-resolution image," Gross adds.

Before any frames were created, Pedulla and Gross carefully planned the entire film. First, the storyboards, designed by Century III's Daniel Cruz, were scanned into an Avid Media Composer system. "That's where we played around with the timing to make sure we knew what needed to be in view, where and when," Pedulla says.

Next, modeling supervisors Jason Diaz and Eric Ortiz built "block sets," comprising simple polygons to represent where the buildings would go, using Maya on Hewlett-Packard NT-based Kayak XU workstations equipped with E&S Tornado 3000 graphics cards. "We also built a 'pawn' guy to represent where people would go so that we could get the scale of the buildings right," Pedulla says.

"These initial steps also determined the timing for our live-action production shoot," adds Gross. "Without doing this, we wouldn't have been able to time out the live action."

After creating Maya animation paths based on this data, the modelers began building detailed 3D buildings in Maya from their simple polygonal stand-ins and placing them into the appropriate scenes. After performing another motion test to ensure that everything was correctly scaled, the team began dressing the sets with 3D props such as fire hydrants, signs, garbage cans, and cars. To create trees and other vegetation, they used Greenworks' XFrog procedural L-systems generator, and to texture the models, Century III's Thomas Graham used Adobe Systems' Photoshop.

To lend more photorealism to the models, the team painted some of them using Right Hemisphere's Deep Paint 3D. "Deep Paint was useful, especially for painting the Statue of Liberty," comments Pedulla. "We wanted her to be photoreal, including the stains on her surface; we spent a few weeks painting just her."

Lighting the CG scenes was also an important consideration, according to Pedulla, because the project combines daytime and nighttime sequences. Concerned that daytime scenes lit with ambient lights alone would look too flat, the team used a combination of ambient lights, point lights, and spotlights in Maya to simulate the bounce lights one would find on a hard set.

Lighting the nighttime scenes was a bit more complicated. Explains Pedulla, "Whenever we created glows-shader glows, light glows, lens flares-they would dim out and then flash back to full strength as they passed from one camera to another in Maya, and it looked funny." To solve the problem, the team, headed by lead compositor Fawn Trivette, created glows, isolated them as flat white elements, used them as matte channels to create soft, defocused elements, and then composited them into the film negative. All the work was done in Adobe After Effects and Avid Media Illusion. "People like to see things like park lights and headlights with glows around them," Pedulla says. "They make CG environments look more realistic because they soften such harsh elements."
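The glow fix Pedulla describes can be illustrated at the pixel level. This is a hedged sketch of the general technique, not the team's After Effects or Media Illusion setup: render the glow as a flat white matte, soften it, then screen it over the frame, so its brightness stays stable rather than flashing as the element crosses camera seams.

```python
# Illustrative per-pixel sketch of glow compositing (names and the
# simple box blur are ours, not the production pipeline).

def box_blur(row, radius=1):
    """1D box blur standing in for the 'soft, defocused' pass."""
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def screen(base, glow):
    """Screen composite: result = 1 - (1 - base) * (1 - glow).
    Brightens like an additive glow but can never exceed 1.0."""
    return [1.0 - (1.0 - b) * (1.0 - g) for b, g in zip(base, glow)]

background = [0.2, 0.2, 0.8, 0.2, 0.2]  # a bright headlight at index 2
glow_matte = [0.0, 0.0, 1.0, 0.0, 0.0]  # flat white where the glow sits
softened = box_blur(glow_matte)         # spread the glow onto neighbors
print(screen(background, softened))
```

Because the glow lives in the composite rather than in Maya's renderer, it no longer depends on which of the five cameras is looking at it.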

According to Pedulla, rendering the scenes was a huge task. "Normally, you'd have to render 30 frames for each projected second of film. But because we're rendering five frames, one from each of our five cameras, we have to render 150 frames for each second, which basically has quintupled our rendering needs." To handle the challenge, the team used four Boxx Technologies RenderBoxx workstations, each equipped with a four-processor array and 1GB of memory. To further help in managing the rendering task, the team relied on two network rendering applications: RenderMax Lite from RenderCorp, and Lemon Pro from Martin Heigan, a senior 3D animator at The Video Lab Group, a post house in South Africa.
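The arithmetic behind Pedulla's "quintupled" claim is easy to verify. The totals below are our back-of-envelope extrapolation from the figures quoted in the article, not numbers from the production:

```python
# Back-of-envelope totals from the quoted figures (the totals are
# our extrapolation, not production numbers).

FPS = 30          # projected frames per second
CAMERAS = 5       # one render per camera per final frame
MINUTES = 12      # running time of the film
MB_PER_FRAME = 10 # quoted size of each final rendered frame

final_frames = MINUTES * 60 * FPS         # composite frames in the show
rendered_frames = final_frames * CAMERAS  # individual camera renders
per_second = FPS * CAMERAS                # the 150 frames/sec quoted

print(per_second)       # 150
print(rendered_frames)  # 108000
```

At 10MB per final frame, the stitched show alone runs to roughly 216GB of imagery, which is why re-rendering the film was never an option.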
As part of the film's finale, the camera pans past the Statue of Liberty, a Viewpoint model that artists realistically textured with stains created in Deep Paint 3D. The group generated the water using a procedural shader in Maya.

The Century III team utilized classic stock footage of famous personalities whenever possible. To extract the individuals from the footage, they rotoscoped the elements using After Effects, Pinnacle Systems' Commotion, and Media Illusion on HP Visualize workstations. They then composited the elements into the CG environments using After Effects.

The group also used After Effects to colorize the footage, which was in black and white. For artistic reasons, they decided to use sepia tones on the Babe Ruth footage. The only venue for which they couldn't use stock footage was the one with Marilyn Monroe. "We had to shoot a Marilyn Monroe double because the shot that we could get had pans and tilts, and she got completely cut off at the waist," Pedulla explains.

One of the biggest challenges the team encountered when combining the live-action extras in the various scenes with the CG backgrounds was matching their perspective with the perspective of the virtual cameras. Because the team was dealing with five camera frames, traditional motion-control techniques wouldn't do. "For example, if you're shooting an extra who moves from screen left to screen right, which matches a virtual camera that performs the same move, the motion of the camera will suffice," he says. "If the extra leaves the frame, that's fine; it's what he's supposed to do.
To achieve the proper timing of the live-action motion-control extras that were composited into the scene, Century III developed a Maya workflow using particle "cards" that were aim-constrained to the camera.

"In our case, this was a problem," Pedulla continues, "because if the person left the frame, he would enter the next frame." Creating a practical camera rig that matched Century III's virtual camera rig was impossible because the artists needed to be able to change the perspective of the extra without moving the person around in the frame. Also, the extra's rate of movement had to precisely match the timing of the CG cameras.

To deal with this issue, Century III motion-control specialist Bob Self and live-action director Jack Tinsley placed the extras on a turntable in front of a locked camera. Then they moved the turntable either clockwise or counterclockwise, depending on whether the extra was on the left or right side of the virtual camera rig.

Timing the movement of the turntable was critical in order to pull off the effect. "At the Radio City Music Hall venue, for instance, a lot of people are lined up for an audition," Pedulla explains. "We start off behind them, so you see their backs; then we move alongside them, so now they're at a 90-degree angle to us. Then we move past them, and we see the front of their bodies. It was important that we made sure we had the correct perspective throughout the shot for this to look right." To help do this, Gross says, Tinsley developed a formula that determined the extra's exact rate-of-perspective change in regard to the CG cameras.
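Tinsley's formula wasn't published, but the kind of math it needed can be reconstructed. The sketch below is a hypothetical stand-in: since the practical camera is locked off, the turntable must rotate the extra at the same rate that the virtual camera's bearing to the extra changes along its animated path.

```python
import math

# Hypothetical reconstruction of the rate-of-perspective idea
# (the production's actual formula is not known to us).

def bearing_deg(cam_xy, extra_xy):
    """Angle from the camera to the extra, in degrees."""
    dx = extra_xy[0] - cam_xy[0]
    dy = extra_xy[1] - cam_xy[1]
    return math.degrees(math.atan2(dy, dx))

def turntable_rate(cam_path, extra_xy):
    """Per-frame turntable rotation (degrees/frame) matching the
    change in the virtual camera's bearing along its path."""
    rates = []
    for a, b in zip(cam_path, cam_path[1:]):
        rates.append(bearing_deg(b, extra_xy) - bearing_deg(a, extra_xy))
    return rates

# A camera dollying past an extra standing at the origin: the
# required turntable rate speeds up as the camera draws level.
path = [(-2.0, -1.0), (-1.0, -1.0), (0.0, -1.0), (1.0, -1.0)]
print(turntable_rate(path, (0.0, 0.0)))
```

Driving the turntable from the same animation data as the CG cameras is what keeps the extras' perspective correct from backs, to profiles, to faces, as in the Radio City queue.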
Artists used Maya's built-in cloud particle renderer and complex procedural shaders to create the billowy effect around this image of the Empire State Building.

Because Gross and Pedulla didn't want the extras to appear as though they were standing still, some of them walked in place on treadmills that were situated on top of the turntable. The team then keyed out each extra using Media Illusion and added shadows using After Effects, depending on where the extra was placed in the 3D scene.

Compositing the extras into the CG environments was also difficult. Because the location of the extras had to be tracked in three dimensions at all times using multiple cameras, traditional compositing techniques were not a viable option. Therefore, the team applied the processed turntable footage for each extra to a single polygon in Maya as an animated texture map, including the shadows. "In addition, we added aim constraints so that each polygon would face the camera at all times and you would never see them from the side, since they were purely 2D elements," Gross notes.
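The aim constraint Gross mentions amounts to billboard math. Here is a simplified, illustrative version (Maya's aim constraint does this per frame internally): rotate each card so its normal points back at the camera, hiding the fact that it is flat.

```python
import math

# Simplified billboard aiming: each extra is a flat textured "card"
# that turns to face the camera so it is never seen edge-on.
# (Illustrative math, not Maya's constraint implementation.)

def billboard_yaw(card_xy, cam_xy):
    """Yaw (degrees about the vertical axis) turning a card at
    card_xy to face a camera at cam_xy, in the ground plane."""
    dx = cam_xy[0] - card_xy[0]
    dz = cam_xy[1] - card_xy[1]
    return math.degrees(math.atan2(dx, dz))

# As the camera circles the card, the card re-aims continuously.
for cam in [(0.0, 5.0), (5.0, 0.0), (0.0, -5.0)]:
    print(round(billboard_yaw((0.0, 0.0), cam), 1))  # 0.0, 90.0, 180.0
```

The perspective change itself comes from the turntable footage mapped onto the card, so the card only ever needs to stay camera-facing.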

After compositing individual extras into the CG scenes, the artists had to devise a way to composite crowds, such as the crowd in Yankee Stadium during the 1928 World Series. Realizing that inserting the people one at a time would be too time-consuming and memory-intensive, the team created a particle system in Maya and applied it to the 3D geometry that represented the shape of the ballpark stands. Initially, the idea was that each particle would be assigned to pick a random extra from 70 available shots, and each particle would then be given a random start frame so the timing of each extra wouldn't be synchronized. This required each extra to be morphed so the movement was loopable.
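The randomization scheme described above can be sketched briefly. This is illustrative only (the clip and loop counts are assumptions; the article confirms only the 70 available shots): each particle gets a random clip and a random phase offset into its loop, so no two seats in the stands move in sync.

```python
import random

# Illustrative crowd randomization: each particle picks one of 70
# looped extra clips plus a random start offset. LOOP_LENGTH is an
# assumed value; the extras were morphed so their motion loops.

CLIP_COUNT = 70   # available bluescreen extra shots (from the article)
LOOP_LENGTH = 48  # frames per loop (our assumption)

def assign_seat(rng):
    """Give one particle a random clip and a random phase offset."""
    return {"clip": rng.randrange(CLIP_COUNT),
            "offset": rng.randrange(LOOP_LENGTH)}

def source_frame(seat, scene_frame):
    """Which frame of the looped clip this seat shows now."""
    return (scene_frame + seat["offset"]) % LOOP_LENGTH

rng = random.Random(1928)  # seeded only for repeatability here
stands = [assign_seat(rng) for _ in range(5)]
print([source_frame(s, 0) for s in stands])
```

Because every clip loops, the per-seat offset desynchronizes the crowd without ever running a clip off its end.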
Below is one of the film's opening shots of modern-day New York. Above that is the same image in its "dome negative" format.

"The basic approach in this case was correct, but the Maya OpenGL renderer, which handles sprite particles, could not handle texture maps of this volume because it would have to load all of them into memory," Gross says. So the group used Maya's particle instancing technique, which replaced each particle extra with an individual polygon that in turn was randomly chosen from a group of 70 polygons, each representing an individual extra. To ensure that the instanced polygons faced the camera, the team, with the help of Alias|Wavefront support, created run-time expressions that forced each particle group's normal axis to aim toward the camera.

After the live-action and stock footage are composited into the rendered 3D scenes, Century III will run each group of five rendered images-front, back, left, right, and top-through a program called SkyStitch, from Sky-Skan, to create a dome negative. The dome negatives will then be delivered to the folks at E&S, who will run them through Sky-Skan's SkyVision Renderer software, which will break them out into six individual images that are edge-blended for each projector.

Projecting the images computed by the software over the surface of the dome will require six high-resolution Barco Projection Systems (Amsterdam, The Netherlands) projectors, which will receive the imagery from six Alcorn-McBride (Orlando) HDTV video playback units. Five of the projectors will cover the 360 degrees around the horizon and up to approximately 60 degrees above the horizon. The sixth one will cover the zenith area at the center of the dome. When the imagery is projected, the result will look like one large image that appears continuous to viewers.

Although this attraction will have taken Century III approximately 24 months to complete, Pedulla and Gross say the time will have been well spent. "Going to a movie is great, but I never really feel like I'm immersed in the movie," Pedulla continues. "I often want to look behind me, or to the left or right, out of the camera's frame, to see what's in the environment. But of course I can't.

"With this attraction, you'll feel totally immersed in the environment," he concludes. "We think it will get a lot of attention."

Freelance writer Audrey Doyle is a Computer Graphics World contributing editor.

Adobe Systems
San Jose, CA

Pinnacle Systems
Mountain View, CA

Right Hemisphere
Bellingham, WA

Hewlett-Packard
Palo Alto, CA

Alias|Wavefront
Toronto, Canada

Avid Technology
Tewksbury, MA

Boxx Technologies
Austin, TX

Sky-Skan
Nashua, NH

Evans & Sutherland
Salt Lake City

Greenworks
Karlsruhe, Germany