Volume 23, Issue 10 (October 2000)

Reel People




by Barbara Robertson

People often speculate about if and when we will see digital people realistic enough to fool us. This past summer showed us the spectacular work done by Sony Pictures Imageworks to create a digital replica of Kevin Bacon from the inside out for Hollow Man, and by Digital Domain to create a digitally younger James Brown. Similarly, in "Young at Heart," a short film that premiered in the Electronic Theater at Siggraph 2000, a woman's face is digitally cloned and aged. In each of these works, the attention to photorealism is remarkable, and raises the question of when we'll see a digital character that can completely deceive us.

But equally interesting are an increasing number of digital representations of people that concentrate on what makes us human rather than solely on what makes us look like humans. These virtual humans may not be photorealistic, but they deliver an emotional impact that few photorealistic humans have yet accomplished.

This month, in addition to "Young at Heart," we've singled out two short films, "Jersey" and "Synchronicity," because together the three films represent three completely different approaches to creating digital humans. In "Young at Heart," photorealistic appearance and motion combine to create a bittersweet view of aging. In "Jersey," cartoon-style talking heads deliver a believable, gritty shtick. In "Synchronicity," nude dancers evolve in a surreal, painterly world.

Both commercial and proprietary software programs were used to create these animations, with "Young at Heart" and "Jersey" created, at least in part, to demonstrate proprietary software. The tools used for "Jersey" are now available commercially; software used for "Young at Heart" will be used to create virtual humans on the Internet.

Technologists will continue striving to build tools for creating the perfect, photorealistic digital human, and artists will continue using these tools in unusual ways, as these films demonstrate. When it comes to expressing human emotion, from the bawdy to the beautiful, sometimes nonphotorealistic representations are the most effective.




"I really like regional humor. There are so many attitudes and accents in the US, yet we don't see much of that being done in computer graphics," says animator Joe Alter. In Alter's award-winning film "Jersey," two fast-talking garbage men from New Jersey amuse themselves by verbally ogling women's body parts as they drive along in their truck. That is, until one realizes that it's his daughter the other is talking about. "I got the idea from my friends Craig and Salvatore," Alter says. "We'd ride around in a '72 Cadillac Seville, me in the middle, and they'd be talking about ass."

"Jersey" won the "Best Synthetic Actor" award at Imagina 2000. Since then, it has been making the film festival rounds, including the Siggraph 2000 Electronic Theater. What makes this animation particularly interesting from a computer graphics point of view is that the camera is focused solely on the faces of the characters. There's no animation in the background; the dialog and facial animation alone carry the three-minute film, something rarely accomplished in 3D animations.
To create the models for "Jersey," Joe Alter used NewTek's LightWave, and for facial expressions, his own LipService software, both running on Pentium III machines.




"It's got this shtick quality to it," Alter says. That it works is a tribute to Alter's talent as an animator and as a software developer.

When Alter began working on his own after 15 years as an animator and technical director in such visual effects studios as Robert Abel, Boss Film, Industrial Light & Magic, and Centropolis, he wasn't able to find commercial software with the same features he had grown accustomed to in the studios' proprietary software. "I really wanted to do character animation and there were gaps in the commercial software tools," he says.

Thus, he developed his own facial animation and lip-synching software, now dubbed LipService. Rather than relying on simple morph targets for facial animation, LipService is a shape interpolation program, which gives an animator more flexibility. In addition, because Alter incorporated modeling tools into the software, he uses it to tweak shapes as he's animating. "It's as important to have the right shape as it is to sync it with the sound," he says, and points out that he also uses the shape interpolation to create eyebrow expressions and to make characters breathe.
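LipService's internals are proprietary, but the general idea behind shape interpolation, as opposed to blending toward a single morph target, can be sketched: each facial shape is a complete set of vertex positions, and the animator mixes weighted offsets from several shapes at once per frame, combining, say, an open jaw with raised eyebrows. A minimal illustration (the mesh, shape names, and weights here are hypothetical, not LipService data):

```python
# Minimal sketch of blend-shape (shape interpolation) facial animation.
# All shapes and weights are hypothetical; LipService's actual
# implementation is proprietary.

def interpolate_shapes(base, shapes, weights):
    """Blend several target shapes over a base mesh.

    base    : list of (x, y, z) vertex positions for the neutral face
    shapes  : dict mapping shape name -> list of (x, y, z) positions
    weights : dict mapping shape name -> blend weight (0.0 to 1.0)

    Each target contributes its offset from the base, scaled by its
    weight, so multiple expressions combine additively.
    """
    result = [list(v) for v in base]
    for name, weight in weights.items():
        target = shapes[name]
        for i, (bv, tv) in enumerate(zip(base, target)):
            for axis in range(3):
                result[i][axis] += weight * (tv[axis] - bv[axis])
    return [tuple(v) for v in result]

# A two-vertex "face": 50% of an open-jaw shape combined with 100%
# of a raised-brow shape, each moving a different vertex.
neutral = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
targets = {
    "jaw_open":   [(0.0, -0.4, 0.0), (0.0, 1.0, 0.0)],
    "brow_raise": [(0.0, 0.0, 0.0), (0.0, 1.2, 0.0)],
}
frame = interpolate_shapes(neutral, targets,
                           {"jaw_open": 0.5, "brow_raise": 1.0})
```

Because the result is just a weighted sum of offsets, the same mechanism serves for the eyebrow expressions and breathing Alter mentions: any modeled shape can be dialed in and out over time.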

As it turned out, LipService not only made "Jersey" technically possible, it inadvertently led to financing for the film: To test the software, Alter created a 30-second animation titled "Zeke." A friend showed "Zeke" to people at Film Roman (known for The Simpsons); Film Roman provided funding for a short film; and Alter and Film Roman Creative Director Eric Redomski collaborated to create "Jersey."
During the three-minute film, the camera never leaves the characters' faces. "Jersey" won the "Best Synthetic Actor" award at Imagina 2000.




In turn, "Jersey's" success is driving sales of LipService. Some 30,000 people have downloaded "Jersey" from Alter's Web site (www.joealter.com), and some are buying LipService. "My software is what's supporting me, but I hope that will change as I do more shorts," Alter says. On the other hand, a new short film might cause him to develop new software. The two characters in "Jersey" are bald, for example, because he couldn't find commercial software that sufficiently addressed problems such as flipping and stretching that can happen when CG hair is animated with dynamics programs. Now he's written a hair program, Shave and a Haircut, and sells it along with LipService for $295. Both work with NewTek's (San Antonio, TX) LightWave. It will be interesting to see what this animator/software developer creates next. -BR




"Our goal was to put a digital actor into a standard context and fool the audience into thinking they were looking at an old woman," says Mark Savage, co-director of R&D for LifeFX, the company that made the short film "Young at Heart." Shown for the first time during the Electronic Theater at Siggraph 2000, the film was created to convince effects studios to use surgical simulation software for character animation. In "Young at Heart," a chanteuse sits in her dressing room remembering herself as a young and beautiful singer. As she sings, her face changes from that of an old woman to one of a young girl. Her face is digital throughout the film.

The skin of the old and the young woman was modeled with LifeFX's software, textures were painted with Avid's (Tewksbury, MA) Matador, and lighting was added with shaders using Pixar Animation Studios' (Pt. Richmond, CA) PhotoRealistic RenderMan. Facial animation was created with a combination of motion-capture data applied to the LifeFX model and dynamics calculated by the LifeFX software. The resulting animated faces were composited with the live-action scene using Kodak's (Rochester, NY) Cineon software. In the film, everything except the face of the singer is real.
To create the digital face, LifeFX used proprietary software developed originally for surgical simulations, along with Matador, RenderMan, Cineon, and Maya.




At the core of the LifeFX software is a finite-element model of a human face, a math field, Savage explains, that represents a block of tissue. Built into this underlying model is medical knowledge about human physiology, including information about bone and skin structure. To transform this generic model into the actress's face, LifeFX took measurements of her face with a proprietary motion capture system that was developed originally for surgical simulations, then used those measurements to reshape the underlying model. To age her, they asked make-up artist Todd Masters of MastersFX (Arleta, CA) to provide his impression of what she'd look like at 80. By adding that information to the math model, they created wrinkled skin, baggy eyes, and other signs of aging.

To animate her face, the company used motion-capture data for the "raw" performance, then tweaked it using parameters and algorithms. "Anything we can't track is calculated," says Savage. "The motion-capture data gives us a state, and the software uses that to calculate how the skin behaves." The software can calculate how eyelids contract, for example, and the subtle way the skin around the eye moves. "Anything that dynamically changes was created in LifeFX," Savage explains. Finally, to attach the LifeFX faces to the woman's body, the team used Alias|Wavefront's (Toronto) Maya to create skeletons that moved her neck appropriately.
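LifeFX's solver is proprietary, but the division of labor Savage describes, tracked markers supplying a state while simulation fills in everything else, resembles a standard damped-spring skin model: each untracked skin point is pulled toward a mocap-driven anchor and damped, so it lags and settles rather than snapping. A toy one-dimensional sketch of that idea (the stiffness, damping, and time-step constants are hypothetical, not LifeFX values):

```python
# Toy 1D illustration of simulation-augmented motion capture:
# a tracked marker position drives an untracked skin point through
# a damped spring, so the skin eases toward the marker rather than
# teleporting. Constants are hypothetical, not LifeFX parameters.

def simulate_skin(marker_positions, stiffness=80.0, damping=12.0,
                  dt=1.0 / 60.0):
    """Return simulated skin positions for a sequence of marker positions."""
    pos, vel = marker_positions[0], 0.0
    out = []
    for target in marker_positions:
        # Spring pulls the skin toward the tracked marker;
        # damping resists velocity so the motion settles.
        accel = stiffness * (target - pos) - damping * vel
        vel += accel * dt    # semi-implicit Euler integration
        pos += vel * dt
        out.append(pos)
    return out

# A marker that jumps from 0 to 1: the simulated skin point moves
# only slightly on the first frame and settles over the next second.
markers = [0.0] * 5 + [1.0] * 55
skin = simulate_skin(markers)
```

The same pattern scales up: with one such state per skin point in three dimensions, plus coupling between neighbors, the captured "state" drives the model while the dynamics supply the subtle secondary motion the markers never see.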

The result is a remarkable illusion, good enough to fool at least some of the audience most of the time. The company's goalposts, however, have shifted. Now, rather than trying to market the software to studios, LifeFX will use the technology to put photorealistic avatars on the Internet: virtual talking heads that can deliver your email messages, represent your company online, act as a Web site host, and so forth. LifeFX expects to unveil its Internet plans over the next few months and have the first avatars ready by the end of the year. We'll know then whether the chanteuse has landed a new day job or not. -BR


Top: Dots on the actress help track motion for compositing. Middle: The aged, flat-shaded digital face replaces the real one. Bottom: The final image.




"Computer graphics is usually photoreal or cartoony. I wanted to create a surreal painting, an animated painting," says Hans Uhlig, director of the short film "Synchronicity," which premiered at the Siggraph 2000 Electronic Theater. Produced by Tony Hurd, the film will be shown at several film festivals later this year, starting with LEAF in London. "It's a story about growing up," says Uhlig, describing the film. "The dancers are born, they fight over their identity like teenagers, and finally they realize that they have to live with each other. When they learn they don't have to protect themselves, the shell around them breaks apart and they're bathed in sun."

Uhlig picked female dancers for his surreal painting because he wanted to create an animation that showed the beauty of the female body without being insulting. To do this, he considered motion as well as picture quality, and decided to capture live dancers rather than rely on keyframe animation. "Computer graphics animated by hand has a particular quality," he says. "You always see the function curve. You see the ease in and ease out and you lose emotion. With motion capture, you get emotion for free."
To create his painterly animation, technical director Hans Uhlig relied on help from colleagues at ILM willing to work in their spare time.




Fortunately, Uhlig was in the right place-Industrial Light & Magic-at the right time to do such a film. Through ILM's Independent Projects Program, Uhlig had access to the effects studio's technology, including motion-capture equipment, and to any people he could convince to work with him in their spare time.

He began, though, by working with a non-ILM colleague, composer Paula Telander (San Anselmo, CA), who also helped choreograph the film. After a week of rehearsals with two dancers, movement for the entire film was captured in one morning using a seven-camera Vicon (Oxford, UK) optical motion-capture system. The data collected from 40 dots on each dancer was cleaned using Kaydara's (Montreal) Filmbox and brought into Avid's Softimage. Once the animated skeletons were in Softimage, ILM's Stefan Fangmeier began working with the camera. "He was like a director of photography running around in that 3D space, looking through each camera and picking what he liked," Uhlig says, explaining that Fangmeier could replay the motion-captured dancers repeatedly from different angles.

With the sequences in place, Uhlig could alter the skeletal animation created by the mocap data, using techniques developed by technical director James Tooley, adding keyframe animation to raise an elbow, for example, or lift a chin. Tooley also set up the chains that cause the skeletal bones to move the skin. Uhlig further refined the animation with layers of secondary muscle and skin movement using ILM's own Caricature software. "For the back flips, I layered between 60 and 100 shapes on top of the animation," he says. Finally, with artistic lighting created with RenderMan shaders by Christian Foucher, textures painted by Susan Ross, and painterly compositing by Tim Alexander, who used ILM's Comptime to smear 2D layers together, Uhlig began to see his vision come to fruition.
With motion capture, Uhlig believes he gets emotion that's difficult to hand animate. To add secondary animation, he uses CG tools.




"Synchronicity" represents the culmination of 15 years of work with computer graphics for the self-taught animator and former racing engineer. "I wanted to use the technology and talent at ILM to do an art piece," he says. The artistic experiment worked. "Synchronicity" is an aesthetic success and that success helped move the former technical director into a new role as a director in ILM's commercials division. Let's hope his success will inspire other artists to use computer graphics in experimental and nonphotorealistic ways to further widen our palette for expressing human emotion.

Barbara Robertson is Senior Editor, West Coast, for Computer Graphics World.