Photo Anime, Hyper Pop Art
Volume 31, Issue 4 (April 2008)


In June 2006, the team of filmmaking visionaries who had created breakthrough visuals for The Matrix series found themselves brainstorming again with Matrix directors Andy and Larry Wachowski. This time, the genre-busting Wachowski brothers wanted to create a live-action film based on the classic 1960s television series Speed Racer, an English-language adaptation of the seminal Japanese anime Mach GoGoGo. So the team—production designer Owen Paterson, art director Hugh Bateup, and visual effects supervisors John Gaeta and Dan Glass—fed the brothers ideas about set design, animation, and color treatments. As they did, it soon became clear that Speed Racer would extend the group’s foray into virtual cinema beyond anything they had done before.

“The really attractive thing about this project was that we were inventing another universe,” says Gaeta. “And, it wasn’t entirely clear what the rules of that universe were yet.” On May 9, the world will see the result when Warner Bros.’ Speed Racer bursts onto the screen in a techno-colored, fully saturated, and fast-moving world no one has seen before in live-action cinema.

Like the animated television series, the live-action Speed Racer centers on car racing, but for the most part, the only real elements in the film are the actors and a monkey, filmed largely on greenscreen stages with Sony’s new F23 digital camera. In the racing sequences, which comprise about half the 2000 visual effects shots, actors and their digital doubles drive some 63 digital cars racing on CG tracks in digital stadiums filled with millions of digital people. In the non-racing sequences, digital backgrounds wrap around the actors in another 1000 shots. But these are far from typical CG cars, tracks, stadiums, and backgrounds.

Shifting Paradigms
With a target audience aged 10 to teen in mind, the team felt free to ignore conventional filmmaking for both Speed Racer’s action scenes and its style. “We imagined what the generation that created X Games would invent for motor sports if they could,” says Gaeta. “We created elaborate tracks with ramps and crazy shapes, added Larry and Andy’s affinity for kung fu, and made cars that can drive on walls, jump sideways, and crash onto other cars.” They termed the car-on-car action “car fu,” and it happens with blazing speed inside hyper-real, pop-art, virtual environments.

After an initial four-minute test that the brothers showed to Warner Bros., Kim Libreri and Mohen Leo led an eight-member team at Industrial Light & Magic (ILM) that created further tests to help define the racing look. (Later, the two visual effects supervisors, both of whom had worked on films in The Matrix series, shifted over to Digital Domain, where much of the racing action took place.) Their mission was to pay homage to the original anime, but with CG and photography rather than cel animation.

Procedural systems helped animators keep the cars on tracks, as illustrated in the bottom images, shown before rendering (left) and after rendering (right). Any wheel on the CG cars can turn 180 degrees, as in this final shot (top). This helped animators create the “car fu” action shots.

“We decided to establish a cartoonish feel by using rostrum camera techniques,” says Leo. In part, the idea of imitating rostrum techniques was born of budget concerns. In animation and visual effects, for example, a rostrum camera might move across a painted background to create the illusion that a static car in the foreground layer is moving. “We could save money and harken back to the cartoon animation aesthetic we were after, so that arose fairly early in the design,” Glass says. “And that was one of the basic things that led us to shoot on greenscreen.”

Then, the crew discovered examples of high-quality QuickTime VR images (QTVR), more often seen in low res on real-estate Web sites for panning through photographed rooms. They decided to use the high-res QTVR to create 360-degree virtual environments.

Bubble Wrap
“We’ve been reconstructing camera moves through photographed environments for years to achieve pretty impossible camera moves,” Glass says. “So we took aspects of that into this simple environment. We call it ‘bubble technology.’”

Glass organized a world unit to build those environments. Teams of photographers took high-resolution bracketed digital stills around the world, shooting a minimum of eight photographs around a circumference in each location. In San Francisco, Dennis Martin, who had been a virtual cinematographer for The Matrix, led a team that used New House Internet Services’ PTGui software and scripts he wrote to stitch the multiple exposures into HDRI spheres. “We had 28K images around the circumference of a sphere,” he says.
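The bracket-merging half of that stitching pipeline can be sketched in a few lines. This is an illustrative reconstruction, not Martin’s actual scripts: a hat-shaped weighting trusts mid-tone pixels, backs out each frame’s exposure time, and blends the brackets into one radiance image.

```python
import numpy as np

def merge_brackets(exposures, times):
    """Merge bracketed LDR exposures (values in [0, 1]) into one HDR
    radiance image using a simple hat-shaped weighting -- the basic
    idea behind fusing brackets before stitching them into a sphere."""
    acc = np.zeros_like(exposures[0], dtype=np.float64)
    wsum = np.zeros_like(acc)
    for img, t in zip(exposures, times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # trust mid-tones most
        w = np.maximum(w, 1e-4)             # clipped pixels get ~zero weight
        acc += w * (img / t)                # back out the exposure time
        wsum += w
    return acc / wsum

# Three synthetic brackets of the same scene at 1/4x, 1x, and 4x exposure:
scene = np.array([[0.02, 0.2, 0.6]])        # "true" radiance
times = [0.25, 1.0, 4.0]
brackets = [np.clip(scene * t, 0.0, 1.0) for t in times]
hdr = merge_brackets(brackets, times)       # recovers the radiance,
                                            # even where one bracket clipped
```

The long exposure clips the brightest pixel, but its near-zero weight lets the other brackets recover the true value.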

A second team led by environment art director Lubo Hristov, who had been a digital matte painter for The Matrix Revolutions, among many other films, moved onto the set in Berlin to colorize and paint the assembled backgrounds (see “Bubbles on Set,” pg. 16).

“By the time we got to Berlin, we had QTVRs we could pan around in, and that everyone—the director of photography, the directors, and the actors—could see,” Glass says. “Traditionally, you shoot the live action on the greenscreens first, and then, when you know what angles you need, you shoot the backgrounds. We had to do things in the opposite order because of the number of [virtual] locations.” Also, these backgrounds were not typical.

First, the artists often combined shots from different corners of the world into one image without regard to geography. Second, they stacked spheres inside spheres to create onion-like layers upon layers, and cut holes in the layers to let the images beneath show through. Then, referencing multiplane camera techniques used in cel animation but applying them in a spherical world rather than on a flat plane, they moved the layers. Gaeta dubs the technique they developed “photo anime.”

Glass describes the look: “If the camera does a dolly move, you expect to see parallax, but with still plates, you don’t. We fake it by cutting a few layers to see a shift, but it isn’t fully realistic. The photographs look real, but they move in slightly odd ways.”
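The slightly odd motion Glass describes falls out of simple geometry. In this toy sketch (the function and numbers are illustrative, not production code), a feature painted on a near sphere swings through a much larger angle than one on a distant sphere when the camera dollies, which is the layered parallax the nested bubbles approximate:

```python
import math

def layer_shift(camera_dolly, layer_radius):
    """Angular shift (radians) that a dolly move induces on a feature
    painted on a sphere of the given radius. Near layers swing farther
    than distant ones, so sliding nested sphere layers at different
    rates fakes parallax from still photographs."""
    return math.asin(min(1.0, camera_dolly / layer_radius))

# A 2-meter dolly against three nested bubble layers:
for r in (10.0, 100.0, 1000.0):
    shift_deg = math.degrees(layer_shift(2.0, r))
    print(f"layer radius {r:7.1f} m -> apparent shift {shift_deg:.3f} deg")
```

The farthest layer barely moves, so cutting and sliding only the near layers gives a believable, if stylized, sense of depth.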

The team embraced the oddity and started experimenting. Into the bubbles, Stephen Lawes, the in-house compositor, slipped greenscreen elements of Gaeta or Glass walking on a treadmill or standing on a turntable—shot with very little camera motion. To create false camera moves, Lawes cut through the layers and moved them. And, similarly, artists at BUF Compagnie played with interesting multiplane ideas, as well.

Elastic Parallax Nuances
The group then pushed the look even further. Gaeta explains: “Once we got to the spot where we were slipping and sliding, slicing and dicing the layers to make elastic parallax nuances, we thought, ‘Why not enhance the colors?’ This is where Lubo’s work became critical.”

The crew had determined early on that they wouldn’t try to twist the digital technology into looking like film. “Speed Racer is grain-free and noise-free, so there’s no connection to film, but it wasn’t our intention to make it look like TV,” Gaeta says. “We wanted to find a place in between.”

Artists manipulated the virtual backgrounds, which they created from enhanced photographs, to amplify emotional moments.

For inspiration, they referenced the Manipulator Web site, pop art, high-end fashion photography, car commercials, and such masters of anime as Hayao Miyazaki. “We realized that pop-art photography and commercial photography have a strange overlap with anime,” Gaeta says, “the colors, the way they juxtapose and light things to create images more expressive than real.”

And, that was only the beginning. “When we decided we didn’t need to imitate reality, it started to get very exciting,” Glass says. “We started playing with people’s perceptions to create emotional moments, to exaggerate them.”

For example, to enhance the moment when a racer’s will to win takes over, the artists at Digital Domain swooped the environment into an abstract, highly saturated blue tunnel with fast-moving colored lines running through it, much as an anime artist might do.

“In a similar way, we had focus completely under our control,” Glass says. “We made deliberate choices about what is in focus and how things defocus. Traditionally, highlights defocus into circles. And in animation, whole backgrounds often have the same level of blur or defocus, which pops the foreground off the background. So, the idea of doing a simple box filter across the whole background was very interesting, especially with photographs. But John [Gaeta] prompted, ‘Why stick with a box?’”

As a result, Digital Domain, for example, wrote software to create what Glass calls an “amusing set of filters.” Anything in a scene, whether live-action photography or painted background bubble photography, can defocus into a variety of fuzzy shapes, including triangles, diamonds, or hearts.

Racing at Digital Domain
In addition to Digital Domain, more than a dozen visual effects studios worked on postproduction for the show—compositing greenscreen shots into the digital bubbles and creating the races. Of those studios, Digital Domain, Sony Pictures Imageworks, and ILM focused primarily on CG cars and the racetracks, with Digital Domain landing the bulk of the CG work.

A crew of 440 artists at Digital Domain drew on the studio’s experience in creating digital cars for television commercials, along with their expertise in visual effects, to create the stylized look of the cars, tracks, and environments in Speed Racer.

With two or three minor exceptions, the cars are always CG. Working in Autodesk’s Maya, Digital Domain’s modelers started with the previs cars to create the hero and background models. Animators referenced animatics and live-action footage of actors “driving” digital cars that compositing supervisor Darren Poe had assembled on set.

“We had hours and hours of animatics created by Digital Domain and BUF,” says Gaeta. “Digital Domain created our car rigs and standards for track surfaces so there would be no disconnect later. And, their animators did exquisite previs animation. Having the primary vendor doing previs animation with pre-rigged cars was the only way we could have survived. It accelerated animation approval.”

Car Fu
Johnny Gibson, digital effects supervisor, calls the car fu animation rig one of the most advanced he has seen. “The cars have advanced drive trains, the axles flex around so that the cars can drive sideways, and any wheel can turn 180 degrees,” he explains. “The wheels rotate at the correct rate for the car’s speed on the track. For medium to far shots, when the wheels turn, the steering wheel turns automatically.”

For close-ups of cars with greenscreen drivers, camera trackers at Digital Domain matched proxy 3D cars to cars “driven” by actors in cockpits on motion bases shot on set, and animators worked with the proxies to create hero animation. During the races, the digital cars race at 200 to 300 miles per hour over tracks created with two kilometers of geometry.

“It’s a ridiculous amount of geometry,” says Libreri.

Gibson estimates that the crew handled four times the number of assets that the studio had ever handled at one time for any previous project. “I get between 300 and 500 e-mails a day requesting assets. Models, textures, lighting, set layouts, animation rigs. I’ve never encountered anything as complex.”

To help animators control the fast-moving cars on the huge tracks, the crew created procedural systems. “The final race takes place in the middle of a city, in an enormous stadium with a crowd of two million digital people,” describes Leo. “We built something like a roller coaster with truss work and metal pieces—but a roller coaster the length of a racetrack.”

An animation system kept the cars on the twisting tracks unless animators made the cars jump, and in that case, a complex suspension assembly bounced the car’s body and deformed the tires appropriately when the vehicle landed. For straightaways, animators could set the amount of procedural bounce. 
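The roller-coaster idea reduces to parameterizing the track as a curve and keying the car by distance along it. A toy version, assuming a circular track for simplicity (the real tracks were far wilder, and the names here are illustrative):

```python
import numpy as np

def track_frame(t, radius=100.0):
    """Position and forward tangent on a circular toy track,
    parameterized by distance t traveled along the lap."""
    a = t / radius
    pos = np.array([radius * np.cos(a), radius * np.sin(a), 0.0])
    fwd = np.array([-np.sin(a), np.cos(a), 0.0])
    return pos, fwd

def advance_car(t, speed_mps, dt):
    """Procedural constraint: animators key only the speed, and the
    system keeps the car glued to the curve and oriented along it."""
    t_next = t + speed_mps * dt
    pos, fwd = track_frame(t_next)
    return t_next, pos, fwd

# Drive at 100 m/s (~220 mph) for one second in 24-fps steps:
t = 0.0
for _ in range(24):
    t, pos, fwd = advance_car(t, 100.0, 1.0 / 24.0)
```

Jumps and the bouncing suspension would layer offsets on top of this constrained frame, which is why the cars can go wild without ever leaving the track.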

To add stylized effects to the cars racing around the tracks, the crew could move the digital cars—in their deformed states on every frame—from Maya into Side Effects’ Houdini. In Houdini, clever systems automatically created stylized tire trails, skid marks, peel-out smoke, sonic booms, trailing bits of debris, and other effects on the fly.

For explosions caused by car crashes, the crew used Digital Domain’s Storm, which is a volumetric renderer. But, rather than motion-blurring sparks from car crashes and explosions, the effects artists built motion blur with geometry to control sparks at a sub-frame level. “The sparks are curves and streaks created based on velocity,” says Gibson. “The problem in doing this is that when the camera moves, you can get parallelograms. So, to get a line biased in the direction of the camera, we compute sparks in camera space.”
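The camera-space fix Gibson describes can be sketched directly: build each spark as a segment stretched along its velocity after transforming into camera space, so a moving camera cannot shear the streak into a parallelogram. A minimal illustration (the function name and shutter value are assumptions):

```python
import numpy as np

def spark_streak(pos_cam, vel_cam, shutter):
    """Build a spark as a line segment in CAMERA space, stretched along
    velocity over the shutter interval. Because the segment is computed
    after the camera transform, the streak stays biased in the
    direction seen by the camera instead of smearing into a
    parallelogram when the camera moves."""
    half = 0.5 * shutter * vel_cam
    return pos_cam - half, pos_cam + half

# A spark 5 m in front of the camera, moving sideways at 2 m/s,
# with a 180-degree shutter at 24 fps:
p0, p1 = spark_streak(np.array([0.0, 1.0, -5.0]),
                      np.array([2.0, 0.0, 0.0]),
                      shutter=1.0 / 48.0)
```

Rendering these sub-frame segments as geometry gives exact, controllable motion blur for thousands of sparks at once.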

To help the film keep a PG rating, the effects crew rescues the drivers in car crashes by covering them in foam. As the digital fluid flows over the driver, the system that the effects artists devised grows spheres everywhere the foam sticks. “On top of those spheres, we grow more and more spheres,” Gibson says. “We render both: the fluid-driven surface and the volume driven by the bubbles.”

As the cars race around the tracks, they reflect everything in the stadium, from the flashbulbs popping in the crowds to the neon signs. “It’s the worst of all possible worlds,” says Haarm-Pieter Duiker, CG supervisor. “We have highly reflective cars moving fast.” Another Matrix alum, Duiker took a hiatus from marketing his Color Symmetry software to lead the rendering team. Approximately 50 lighters worked in Mental Images’ Mental Ray to create the shots; Duiker had 20 lighters on his team alone.


Bubbles on Set
To give the directors, the cinematographer, and the actors a view into the artificial world that would later surround them, Digital Domain turned its former compositing software, Nuke (now owned by The Foundry), into a real-time system named “Sparky.” On set, Digital Domain’s Darren Poe and other compositors used Sparky to put footage of the actors, shot on greenscreen with HD cameras, inside the 360-degree QTVR “bubbles,” that is, spherical collages stitched together from color-corrected and otherwise manipulated photographs. It was, in effect, a live keying system.

“Having the ability to see on set what would later be in the show was useful,” says Mohen Leo, VFX supervisor at Digital Domain, “but also, they wanted to be sure that as they shot the sequences, the way in which they decided to line up the backgrounds would be preserved for post houses.” Thus, the post houses could see how the directors wanted the foreground and background images positioned, as well as the field of view. —Barbara Robertson

Because the goal was to have the scenes look like extremely processed photography, the rendering team started by creating photorealistic digital cars and tested their techniques side-by-side against real cars. “We shot reference Corvettes in Michael Bay’s parking lot,” says Gibson. When the real and unreal cars were indistinguishable in photographs, the rendering team moved on to complex scenes using raytracing and final gathering to create reflections and shadows.

Shader writer Jim Rothrock, who formerly worked at Mental Images, replaced Mental Ray’s entire final gathering solution with new technology he created from scratch. Duiker then organized and enhanced the new rendering solution into a production-ready system.

“In one approach, you can bounce lights off everything, and we did that for the tracks; we raytraced the entire scene,” Duiker says. “But for the cars, we rendered panoramas per frame per car.”

In effect, the digital cars act like the chrome balls used on set to gather light. Rays, sent out from the cars in every direction, gather data from the background images of the stands, the signs, the light banks, the video screens, and so forth.

“It’s as if we put a virtual chrome ball in the center of the car,” says Duiker. “So, on a per-frame, per-car basis, at that point in the frame, we know which lights to render; as each car moves, we see all the elements in the environment on the car. It gets us a long way.” In addition, lighters, for example, might use CG lights to simulate headlights.
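The “virtual chrome ball” amounts to rendering a panorama at the car’s center once per frame and looking it up by reflection direction at shading time. A minimal sketch of the lookup, assuming an equirectangular panorama (the article doesn’t specify the actual mapping Digital Domain used):

```python
import math

def dir_to_equirect_uv(d):
    """Map a unit reflection direction to (u, v) texture coordinates in
    an equirectangular panorama rendered from the car's center -- the
    'virtual chrome ball' lookup. Each shading point on the car body
    reflects into the per-car panorama instead of raytracing the
    entire stadium."""
    x, y, z = d
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)          # longitude
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi  # latitude
    return u, v

# Straight ahead (-Z) lands at the panorama center:
assert dir_to_equirect_uv((0.0, 0.0, -1.0)) == (0.5, 0.5)
```

Baking the environment into one image per car per frame turns thousands of reflected lights into a single texture fetch.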

To handle the rendering load, Digital Domain expanded the facility’s rendering capabilities. The facility put eight-core Intel machines with 16GB of RAM on every artist’s desk and installed 700 render machines with the same specs in a separate facility linked to the studio by fiber optics.

The crew rendered the scenes using a multichannel approach in OpenEXR to produce approximately 20 layers so that compositors could better control various aspects of the scene. For the races, the background bubbles tended to be of sky and stars because the environment was largely the CG stadium filled with crowds. Effects artists created the crowds with Massive software and by replicating 2D images of 20 extras filmed on greenscreen stages. By placing the 2D images on cards instanced using parameters to create such variations as color and size, and then applying the cards to points in a 3D point cloud that represented the stadium, the artists multiplied the 20 extras into thousands. “We could push the camera in on these images and see the people in detail,” Gibson says.
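The card-replication step can be sketched simply: scatter instances over the stadium’s point cloud, each instance picking one of the 20 filmed extras plus seeded per-instance variation. All parameter names here are illustrative, not from the production pipeline:

```python
import random

def instance_crowd(points, num_source_cards=20, seed=7):
    """Scatter 2D greenscreen 'cards' over stadium seat positions.
    Each instance picks one of the filmed extras and gets seeded
    per-instance variations in tint and scale, multiplying a handful
    of extras into a convincing crowd."""
    rng = random.Random(seed)
    crowd = []
    for p in points:
        crowd.append({
            "position": p,
            "card_id": rng.randrange(num_source_cards),
            "scale": rng.uniform(0.9, 1.1),
            "tint": (rng.random(), rng.random(), rng.random()),
        })
    return crowd

# A toy grandstand: 50 rows of 40 seats = 2,000 instances from 20 extras.
seats = [(x * 0.6, row * 0.8, row * 0.5)
         for row in range(50) for x in range(40)]
crowd = instance_crowd(seats)
```

Because the variation is seeded, the same crowd re-renders identically from frame to frame, which matters once the camera pushes in close.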

Poe led the team of 50 compositors who assembled the shots in The Foundry’s Nuke. “We can view proxy versions of the track and cars in the 3D environment from camera angles as we’re compositing,” he explains. While on set, he had done early tests to help refine the super-stylized pop-art look of the film by experimenting with color, lighting, and graphic defocusing. In postproduction, he and the compositors on his team applied the results of those tests. “We added the bling,” Poe says, “lots of lens flares, glows, and optics. We airbrushed in extreme color.”

Racing at Imageworks, ILM
Because most of Digital Domain’s shots take place in stadiums, that studio worked with simple bubbles. The three racing sequences in the 250 shots created at Imageworks, however, take place in a desert and in the mountains, which the production artists provided in the form of more complex bubbles. Like Digital Domain, Imageworks used Maya for modeling, rigging, and animation, and Houdini for effects. In addition, Imageworks relied on Maxon’s Cinema 4D for some projections of images onto geometry. And, as with Digital Domain’s racing shots, Imageworks’ races are all-CG except for the actors filmed in cockpits on motion bases. Both studios worked from shot designs laid out in previs.

“The cars are doing all kinds of over-the-top stunts and crazy things,” says Kevin Mack, visual effects supervisor. “Sometimes the ground is moving and not the cars—the road is on a conveyor belt to avoid moving through huge volumes of geometry, which is also part of the animation style. Our job was to keep things feeling real with proper weight and physics, despite the antics. It’s stylized, and we looked for opportunities to go super-stylized.”

For example, they might create more or less motion blur than normal, or put a car perfectly in focus for a couple frames while it’s ripping by at a ridiculous speed. “The shots weren’t supposed to be realistic,” says Peter Nofz, digital effects supervisor. “They were hyper-realistic. The regular visual effects rules didn’t apply; the idea was always to take everything to 11. We could have fun with the material.”

The bubbles were particularly fun for Mack, who works with powerful, saturated colors in his own fine-art digital paintings and who has for years been creating digital backgrounds using matte paintings and multiplaning for films such as Apollo 13. “But we’re taking that idea to such an extreme here,” he says. “It’s some of the coolest stuff I’ve worked on. It’s like pop art that’s alive and animated, and it’s tricky. You can’t just saturate an image. We’ve gone to great lengths to get absolute complements with gradations in hue as well as value. It’s amazing. Really beautiful; extremely cool.”

For ILM’s John Knoll, the fun of the project was moving out of his comfort zone. He supervised a small team that created a 30-shot sequence in which a race-car driver confronts bad guys in a big semi-truck on a lonely highway.

“I have strong opinions [on] lighting and texturing objects to look realistic from all the films I’ve worked on in the past,” Knoll says. “But the normal rule book for photography was inappropriate here. It’s a crazy, odd stylization, and it takes a little while to wrap your head around where to go with it. It’s a bold experiment. No one has ever done this before. In the end, we ended up with something attractive, appealing, and pleasing. And, it was fun.”

For Gaeta, part of the fun of this project was pulling together a crew that he had worked with on The Matrix, as well as talented effects supervisors in many studios. “We brought a lot of unbelievable people with us from our community,” he says, “and a dream team of creative and technical talent—Pierre Buffin, Kevin Mack, Kim Libreri, Dan Glass, John Knoll, and others, people on the highest plane of visual effects capability, the masters of making things look right.”

But, “right” in this film took on a new meaning. “Hats off to those who experimented with this new, unnamed genre in Sky Captain, Sin City, 300,” Gaeta says. “It’s the Matrix trilogy’s virtual cinema, virtual cinematography, and virtual effects mutated toward the new style and a looser application I’m calling photo anime.”

And, even though Speed Racer’s style is radical, filmmakers might use the techniques to create films in other styles ranging from photoreal to non-photoreal.

“It’s happening because of advancements in digital cinema and compositing that give us the ability to mount an end-to-end movie,” Gaeta says. “When you add ‘editography’ and post cinematography, you get the feeling that we’re moving the goal lines for production design and cinematography both. Credit George Lucas for seeing this early on. It’s not about visual effects, it’s about creating the form of a film.”  


John Gaeta and Dan Glass served as overall visual effects supervisors.
Jake Morrison of Exhaust led the in-house team that pioneered post cinematography, bubble compositing, and anime look development.
Dennis Martin sent photographers around the world to shoot images that he assembled into “bubbles.”
Matt Walters managed the asset pipeline.
Euisung Lee, a 3D previs artist, developed concept art for the car fu combat scenes, stunts, and races.
Stephen Lawes supervised the in-house compositing team.
Lubo Hristov of Christov Effects & Design manipulated the location photography bubbles to create hyper-real pop-art backgrounds.

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at