Volume 24, Issue 7 (July 2001)

Inside Moves



By Barbara Robertson

All images © 2001 Warner Bros. and DreamWorks.

When the magic tricks that visual effects wizards perform are successful, the illusion convinces an audience that the impossible is plausible. For the movie A.I. Artificial Intelligence, in addition to creating the effects, the team applied their wizardry to the process of transforming the director's vision into the illusion. By using computer graphics in new ways, they developed tools that gave the director the freedom to create new ideas in pre-production and on the set. In so doing, they made their own post-production work easier and less expensive.

Directed by Steven Spielberg, A.I. is the legendary film that Stanley Kubrick envisioned but never made. Kubrick (A Clockwork Orange, 2001: A Space Odyssey) began seeking visual effects advice for A.I. in 1993 from eight-time Oscar winner Dennis Muren at Industrial Light & Magic (San Rafael, CA), and had discussed the film with Spielberg (E.T., Jurassic Park) over the years. When Kubrick died in 1999, his widow reportedly presented Spielberg with 1000 A.I. storyboards. Spielberg wrote a script and called on ILM to create the effects. The result opened June 29.

Based on the Brian Aldiss short story, "Super-Toys Last All Summer Long," A.I. takes place in a future world in which the greenhouse effect has caused the polar ice caps to melt. With most of the land underwater, population growth is strictly controlled. Thus, "orgas" (people) rely on "mechas" (robots/androids) for services: gardening, joke-telling, companionship, sex. The actor Jude Law plays one such mecha, a personal pleasure android named Gigolo Joe, who becomes a sidekick for David (Haley Joel Osment), a mecha whose artificial intelligence computer gives him emotions. David serves as a "son" for a human couple with a terminally ill child. When the child recovers, Mom tells David to leave and come back when he is a real boy. So begins the androids' search for their humanity.
At ILM, technical directors used custom software to position 120 lights for this Rouge City entrance. Everything in the scene was created with computer graphics.




A.I. has approximately 200 effects shots, according to Muren, senior visual effects supervisor, with "some sort of computer graphics" in 80 percent of them. For example, an underwater city is made of miniatures and CG models. David's super teddy bear is sometimes CG, as is the character Dr. Know. Many of the robots are animatronics, but some of the actors playing robots have partially invisible heads thanks to computer graphics. Complex effects under the direction of Muren and co-visual effects supervisor Scott Farrar are scattered through the movie, but the new techniques were prompted by one sequence in which the androids' quest takes them inside the futuristic Rouge City.

Part Bangkok, part Amsterdam, part Las Vegas, Rouge City is outrageously huge, brilliantly colored, and overtly sexy. Cars drive through the open mouths of male and female statues to enter. Inside, the buildings are fanciful and erotic. One looks like a cocktail glass. Another resembles a woman lying on her back with her knees bent, legs spread apart. People crowd the streets, neon lights decorate the buildings, brightly colored holograms dance in the air.

The city was created with a mixture of stage sets, miniatures, and CG models. Typically, in "digital backlot" scenes, actors interact only with buildings on the stage; the miniatures and CG models are added to extend the stage set into the background. But in Rouge City, much of the live action takes place among and near buildings that were invisible to the actors.
ILM used a game engine to create an interactive tool for planning shots like this. Here, some buildings are miniatures; some are CG, as is the amphibicopter.




The process of building Rouge City started in the art department. Wilson Tang, lead concept designer, began with the original storyboards drawn by comic book artist Chris Baker. Taking a pragmatic approach, Tang created detailed drawings for the minimum number of buildings he thought would be needed to imply a large, dense city.

When these hand-drawn details were completed, he developed animatics, or 3D storyboards, which were pre-rendered walk-throughs of the proposed virtual set. For these, Tang used NewTek's (San Antonio, TX) LightWave running on a Macintosh to model and render representations of his proposed buildings, whether miniature or CG. Then he placed them as if they were really in position on the stage set. Because the new designs varied from the original storyboards, these animatics would show Spielberg which buildings would be visible from various camera angles.

For example, the lead characters leave Rouge City by flying a CG "amphibicopter" through the city, up the front of a male building, and out. For this shot, Tang's animatic kept the camera looking up, so street-level detail would never have to be built. "If Steven had said, 'I want the camera pointing down,' it would have affected everything," Tang says.

In some productions, 3D animatics mark the end of pre-visualization work. With A.I., the pre-viz fun had only just begun. Tang's models from the 3D animatics moved in two directions: into a game-like real-time 3D environment that Spielberg used to interactively plan shots in the virtual city; and into a virtual set created in ILM's motion-capture studio, which Spielberg used while shooting actors on stage.

To create the real-time 3D environment, Ward downloaded a game engine from the Internet that was good at managing outdoor scenes and customized it, adding virtual clones of real-world cameras and lenses with correct aspect ratios, camera heights, perspectives, and so forth. A player in this "game," Ward explains, might select a 98-degree point of view that matches a VistaVision 15mm lens.
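That lens matching is straightforward trigonometry: the horizontal field of view follows from the focal length and the width of the film gate. A minimal sketch, assuming a usable VistaVision gate width of roughly 35mm (the exact aperture depends on the format spec):

```python
import math

def horizontal_fov(focal_length_mm, gate_width_mm):
    """Horizontal field of view, in degrees, for a given lens and film gate."""
    return math.degrees(2 * math.atan(gate_width_mm / (2 * focal_length_mm)))

# VistaVision runs 35mm film horizontally, so the gate is unusually wide;
# 35mm is an assumed, approximate width used only for illustration.
print(f"{horizontal_fov(15, 35):.0f} degrees")  # 99 degrees, near the 98 cited
```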

"We also did a lot of development on the physics of cameras," says Ward. "If you've got six guys rolling a big camera down a track, it doesn't stop on a dime."

The result was burned onto a CD and given to Spielberg. "We like to think of it as virtual location scouting," says Tang while demonstrating the system's capabilities. As he moves around the virtual city using arrow keys and a mouse, he changes a wide-angle shot to a zoom. Pressing "H" for helicopter lifts the camera off the ground, where it had been placed for a child's point of view. The city looks complex as the virtual "helicam" flies past various buildings...until he points the camera down. Then the buildings look like chess pieces placed on a bare board. "The placement of the buildings was very carefully considered," Tang says.

With this real-time version of the animatic, Spielberg experimented with camera angles and became familiar with the virtual location he would see later on the real set.

The ILM team fashioned the virtual set system in a large room typically used for motion-capture sessions. Seth Rosenthal, set visualization supervisor, describes the department that created it as "the place that's the overlap between CG production and live-action production." Adds Mike Sanders, set visualization engineer, "We're bringing new technologies into the old paradigm."

To demonstrate the system, Rosenthal rolls a dolly with a large film camera and a camera tracker past several people moving around on the bluescreen stage. To the side, a monitor shows an entirely different scene: On the monitor, the people are walking inside Tang's virtual set.

Accomplishing this sleight of hand meant turning equipment and software designed for a fixed installation in a broadcast studio into a mutable system that worked on a film set. Muren wanted to give Spielberg complete freedom on the set to throw out all the storyboards if he wanted. "He really gets inspired by what he sees around him and he changes things," he says. "This system gave him a chance to do that on a bluescreen set."
From top to bottom: Spielberg checks the virtual set; Jude Law and Haley Joel Osment on the bluescreen set; CG elements; the miniature; the final shot with added CG elements.




Sanders created the CG part of the virtual set by bringing Tang's 3D models into Brainstorm Multimedia's (Valencia, Spain) Studio software. In action, a Radamec (Surrey, UK) camera tracker looks at disks hanging from the ceiling as the camera moves. The disks are printed with concentric circles like bull's-eye targets; the varying circle sizes on the hundreds of disks provide data, sent to a nearby SGI (Mountain View, CA) Onyx computer, that tells a virtual camera where to move in the CG world. An Ultimatte (Chatsworth, CA) compositing system pulls mattes from the bluescreen shots; that is, it turns the blue areas into an alpha channel, separates the people from the bluescreen, and then composites the people with the CG images. Video hardware in the Onyx syncs and locks the video signal so the CG images can be placed in the alpha channel. Brainstorm's software handles the real-time rendering for the CG elements.
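Ultimatte's actual algorithms are proprietary, but the matte pull described above can be sketched in a few lines of NumPy: estimate an alpha from how strongly blue dominates each pixel, then layer the keyed foreground over the CG background. This is a minimal illustration; real keyers add spill suppression, garbage mattes, and edge processing.

```python
import numpy as np

def pull_matte(fg):
    """fg: float RGB image in [0, 1]. Returns foreground alpha in [0, 1]."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Pixels where blue dominates the other channels are treated as screen
    # (alpha 0); everything else keeps the live-action foreground (alpha 1).
    return np.clip(1.0 - (b - np.maximum(r, g)), 0.0, 1.0)

def composite(fg, bg):
    """Standard 'over' composite of the keyed foreground onto a CG plate."""
    alpha = pull_matte(fg)[..., None]  # broadcast alpha across RGB channels
    return fg * alpha + bg * (1.0 - alpha)
```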

During the shoot, while Spielberg was directing, Sanders was driving the virtual set, constantly changing locations and angles, and quickly switching cameras and calibrating virtual lenses to follow. When asked, he could also move the virtual city around a fixed camera. The resulting composite was automatically recorded.

"The director got artistic control over the performance, and we got unambiguous direction as to what he wanted to see in the final shot," says Rosenthal.

To manage all this, Sanders created an interface in ILM's proprietary Zeon software, which he hooked into the Brainstorm software's open architecture. With Zeon, he calibrated lenses similar to those that would be used on the 150-foot set and could quickly change lens parameters, field of view, and the offset between the tracking camera and the film camera lens.
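One piece of that calibration, the offset between the tracking camera and the film lens, amounts to a rigid transform: the tracker reports its own pose, but the virtual camera must sit where the film lens actually is. A minimal sketch, with hypothetical measurements:

```python
import numpy as np

def film_camera_position(tracker_pos, tracker_rot, lens_offset):
    """World-space position of the film lens, given the tracker's pose.

    tracker_pos:  (3,) world position of the tracking camera
    tracker_rot:  (3, 3) world-space rotation matrix of the tracker
    lens_offset:  (3,) measured tracker-to-lens vector, in tracker space
    """
    return tracker_pos + tracker_rot @ lens_offset

# Hypothetical rig: tracker 1.5m up and level, lens 20cm below, 10cm ahead.
print(film_camera_position(np.array([0.0, 1.5, 0.0]),
                           np.eye(3),
                           np.array([0.0, -0.2, 0.1])))  # [0.  1.3 0.1]
```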

At one point in the Rouge City sequence, Jude Law's character mimics the actions of a dancer on a building, but Law sees a bluescreen, not a building. The building would be a miniature; the dancer would be a CG character. "Steven Spielberg would say to Jude Law, 'OK, I want you to be looking here,'" says Rosenthal. "[The system] let him place the action in context."

"It's very exciting to look at exactly the same view that's going to be in your movie," says Muren, who believes the new system has two main purposes. "It lets the director come up with a better idea than what he pre-visualized, and everybody on set sees it. We created shots that were different from the storyboards. That's a big creative plus. The other thing is that it helps with budgeting. A producer can say, 'Well, that looks terrific, but is that building going to cost a lot of money?'"

In fact, it turned out that some buildings in Tang's virtual set were never built. The tool designed to give a director creative inspiration ended up helping the effects studio save time and money. "There were buildings that never showed up in the shots," says Rosenthal. "We wouldn't have known that without the live system."




Barbara Robertson is Senior Editor, West Coast for Computer Graphics World.




Images courtesy ILM.

Photographs by David James.