Issue: Volume: 24 Issue: 4 (April 2001)

All About Eve

A lifelike virtual 'personality' becomes a rising star

By Audrey Doyle

She hosts her own radio show every Saturday morning. In February she appeared on the cover and in a fashion spread in the popular French women's magazine Madame Figaro. She has a modest flat in Paris's 11th arrondissement, an agent with the Parisian talent agency Rouge, and a part-time job as a waitress at L'Opus Lounge, a trendy Paris bar.

Her name is Eve Laura Solal. And she's arguably the most humanlike virtual character ever created.

Eve is the brainchild of Paris-based Attitude Studio, a facility formed a year ago to provide end-to-end virtual character development services for applications in television, film, the Internet, and video games. Eve is Attitude's first, and so far only, finished creation, but a full-length, all-3D film starring some of the studio's other digital characters is already in the works, and a deal for a second 3D film is being negotiated.

"The demand for lifelike virtual 3D characters is very high today," says Marc Miance, chairman and managing director of Attitude. "We formed this company and developed our own technology to meet this demand."
Attitude Studio created a realistic virtual woman named Eve, then gave her an identity, including a job, a loft, vital statistics, and friends. Now the company is acting as her agent, trying to find her work as an actress. (Image courtesy Attitude Studio)

Attitude's technology, called emotion mapper, is a set of tools that the company developed to work within Alias|Wavefront's Maya to perfect motion data captured with Attitude's Oxford Metrics Vicon motion-capture system. Attitude created Eve last year as a way to demonstrate the emotion mapper tools, and to promote the company and its motion-capture, modeling, and animation services, at last November's Salon du Satis, a trade show and exhibition held in Paris. For the demo, Attitude created a 1-minute, 40-second video clip starring the 3D Eve as she was interviewed by a journalist. The demo was shot from the point of view of the reporter, so viewers saw Eve against a live-action background.

According to Miance, the facility's use of the emotion mapper tools is what differentiates Eve from other virtual characters, and Attitude from other facilities. "With emotion mapper, we were able to give Eve humanlike emotions, so she comes across as a real person in a much more believable way than other 3D characters," he says.

Emotion, Miance explains, is conveyed not just through facial expressions, but also through body movement. "In emotion, you have motion. And that's the first thing you notice about Eve, that her motions are humanlike," he says. "You can tell that other 3D characters are computer-generated because their movements are stiff. When you see those characters, you don't connect with them; you don't feel like you're looking at a real human. But when you see Eve moving, you sense from her realistic movements and her behavior that she has emotion. And because of that, you sense that she is a real person."

According to Miance, because Eve's lifelike qualities make her far more believable and appealing than existing 3D characters such as computer game heroine Lara Croft, Attitude can market Eve as a complete "personality." At Eve's Web site, you can, among other things, learn about Eve's life, conveyed through an interview with French journalist Thomas Baudouin-Alexandre; read her resume, which touts her education and her experience in film, television, advertising, and the record industry; and learn about her parents.

You also can learn personal details about Eve. For instance, she's 5 feet, 7 inches tall; she was born in Boulogne-Billancourt on May 3, 1978; her favorite color is red; and she loves cats. You can also view a number of streaming video clips: one is a tour of Eve's flat, shot and narrated by Eve herself; one is of her friends as they discuss their relationships with Eve; and one is of Eve's colleagues at Fun Radio, where she's a part-time DJ.
Eve recently landed a modeling job for Madame Figaro, a French women's fashion magazine, where her digital image graced the cover and inside pages. (Image copyright 2001 Madame Figaro/F.Farre and Attitude Studio/Bruce Tejtelbom)

While these elements of the Web site do a decent job of building a "life" for Eve, its most impressive part is the streaming video clip presented at Salon du Satis that shows Eve being interviewed by a journalist "friend." Shot from the point of view of the interviewer, the videotaped session takes place at L'Opus Lounge and begins with exterior footage of the locale, a real Paris establishment. The interviewer enters the lounge, which is packed with patrons drinking and mingling. As he approaches the dance floor, viewers see several real people dancing. One of those dancers, however, is not a real person; it's Eve. But her appearance and movements are so lifelike that it's nearly impossible to tell that she's a 3D character.

The interviewer catches Eve's attention and they walk to the bar, where Eve sits down. As she talks, her body, face, and lips move with amazing realism, making her on-screen presence quite convincing. At the conclusion of the session, the background fades to black, and Eve's body metamorphoses to a wireframe model.

According to Attitude motion-capture specialist Remi Brun, it took a team of 12 people five months to create the model and animation of Eve. The modelers, working from concept sketches hand-drawn by independent graphic artist Marc Majiori, built the skeletal structure for Eve's body and face using Maya running on NT workstations. Then the motion-capture session took place. A performer sporting approximately 40 markers on her body and 25 on her face, plus a pair of Virtual Technologies CyberGloves on her hands, acted out Eve's movements and spoke her dialog. Meanwhile, Attitude's 12-camera Vicon system captured the motion data, and a video camera captured the session and the audio.

After using Kaydara's FilmBox to map the marker data to the Maya skeleton, the animators built Eve's body in Maya and synchronized her lips to the audio using SyncMagic, a phoneme-based lip-sync program from the French company of the same name. They used the videotape of the motion-capture session as a reference during the lip-syncing process.
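The phoneme-based approach the team used can be illustrated in miniature: a lip-sync program maps each timed phoneme in the dialog track to a viseme (a mouth shape) and cross-fades the corresponding blend-shape weights over time. The sketch below is a generic Python illustration of that idea; the phoneme table, function names, and timing scheme are assumptions for illustration, not SyncMagic's actual design.

```python
# A tiny phoneme-to-viseme table (real systems use roughly 40 phonemes
# and 10-15 visemes; this subset is illustrative only).
PHONEME_TO_VISEME = {
    "AA": "open",    # as in "father"
    "B":  "closed",  # lips pressed together
    "F":  "dental",  # lower lip to upper teeth
    "OW": "round",   # rounded lips
}

def viseme_weights(timed_phonemes, t):
    """Return {viseme: weight} at time t by cross-fading between the two
    phonemes that bracket t. timed_phonemes is a time-sorted list of
    (start_time, phoneme) pairs."""
    prev, next_ = timed_phonemes[0], None
    for seg in timed_phonemes:
        if seg[0] <= t:
            prev = seg
        else:
            next_ = seg
            break
    if next_ is None:                      # past the last phoneme: hold it
        return {PHONEME_TO_VISEME[prev[1]]: 1.0}
    alpha = (t - prev[0]) / (next_[0] - prev[0])  # linear cross-fade
    weights = {}
    weights[PHONEME_TO_VISEME[prev[1]]] = weights.get(PHONEME_TO_VISEME[prev[1]], 0.0) + (1 - alpha)
    weights[PHONEME_TO_VISEME[next_[1]]] = weights.get(PHONEME_TO_VISEME[next_[1]], 0.0) + alpha
    return weights

# Dialog fragment: lips close, open, then round over half a second.
track = [(0.0, "B"), (0.2, "AA"), (0.5, "OW")]
w = viseme_weights(track, 0.1)  # halfway between "B" and "AA"
```

The returned weights would then drive the mouth's blend shapes in the animation package, which is why a viseme-level representation, rather than raw phonemes, is what the animator actually works with.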

When these steps were completed, the team turned to the emotion mapper tools. Miance says that the tools work within Maya to enable animators to rely more on motion-capture data and less on keyframe animation to obtain realistic results. "Before emotion mapper, to get the kind of movement that tricks the eye into thinking it's watching a real person and not a CG person, you had to keyframe the animation. You couldn't get those nuances of motion from the motion-capture performer because existing motion-capture technologies don't pick them up; you had to create them yourself frame by frame.

"With Eve, there is very little keyframing, and it's only in the face," he adds. "Nearly all of her movement comes from the performer, from the motion-capture data. So the resulting animation is more realistic."

According to Brun, the team used the emotion mapper tools in two ways. "First we used them to improve the look of certain areas of the body, such as the neck, shoulders, knees, elbows, wrists, and fingers; any part of the body that, when it moves, you'd normally see the bones and muscles moving also," he says. "Because of emotion mapper, when Eve turns her head, for instance, her neck area doesn't look like a flat cylinder. Instead, you see the muscles in her neck. You feel there's something inside the neck, that it's not just a hollow cylinder or a piece of rubber twisting as her neck turns."
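What Brun describes, a surface that bulges believably as a joint rotates, is commonly achieved with pose-driven corrective shapes: a sculpted "muscle" offset is blended in proportionally as the joint bends, so the mesh stops behaving like twisting rubber. The sketch below illustrates that general technique; emotion mapper itself is proprietary, so the function name, vertex data, and blending rule here are illustrative assumptions.

```python
def apply_corrective(base_verts, corrective_delta, joint_angle_deg, max_angle_deg=90.0):
    """Blend a sculpted corrective shape in proportion to how far the
    joint has rotated, so the muscle bulge appears only as the joint
    bends. Vertices and deltas are (x, y, z) tuples."""
    w = max(0.0, min(1.0, joint_angle_deg / max_angle_deg))  # clamp to [0, 1]
    return [(x + w * dx, y + w * dy, z + w * dz)
            for (x, y, z), (dx, dy, dz) in zip(base_verts, corrective_delta)]

# Two vertices of a neck mesh, plus a sculpted offset that bulges each
# one outward along z when the head turns.
base = [(0.0, 1.0, 0.0), (0.1, 1.0, 0.0)]
delta = [(0.0, 0.0, 0.05), (0.0, 0.0, 0.02)]
bent = apply_corrective(base, delta, joint_angle_deg=45.0)  # half the bulge
```

At 45 degrees of a 90-degree range, half the sculpted offset is applied; at rest, none is, which matches the behavior Brun describes of muscle appearing only as the joint moves.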
For Eve's dramatic debut, Attitude Studio created this videotaped interview with the digital character. The images above depict the progression of the live-action scene as the 3D Eve model is inserted into the shot.

They also used the tools to animate Eve's face. According to Brun, Eve's facial animation comprises three channels of data: motion capture, lip-sync, and keyframe animation. "Using emotion mapper," he says, "we mixed the channels of data and manipulated them to get the most realistic result."
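Mixing several channels of facial data can be sketched generically: each channel supplies a curve of values per frame for a facial control, and the mixer outputs a weighted average the animator can rebalance. The channel names and weighting scheme below are assumptions for illustration, not emotion mapper's actual design.

```python
def mix_channels(channels, weights):
    """channels: {name: [value per frame]}; weights: {name: float}.
    Returns the normalized weighted average of the curves, frame by frame."""
    total = sum(weights.values())
    n_frames = len(next(iter(channels.values())))
    return [sum(weights[name] * curve[f] for name, curve in channels.items()) / total
            for f in range(n_frames)]

# A "jaw open" control over four frames, as reported by each source:
# raw motion capture, the lip-sync solver, and hand-keyed correction.
jaw = mix_channels(
    {"mocap":    [0.2, 0.4, 0.6, 0.4],
     "lipsync":  [0.0, 0.8, 0.8, 0.0],
     "keyframe": [0.0, 0.0, 1.0, 0.0]},
    {"mocap": 0.5, "lipsync": 0.4, "keyframe": 0.1},
)
```

Raising or lowering a weight shifts the whole performance toward one source, which is the kind of manipulation Brun describes when he talks about mixing the three channels to get the most realistic result.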

After the animation was finished, the team created Eve's hair using Maya Fur and her clothes using Maya Cloth. Rendering and lighting were done in Maya as well. To track the animation of Eve to the live-action background footage, the team used RealViz's MatchMover, a camera-tracking utility that automatically recognizes points in a moving image and adjusts a number of predefined markers to track the points. Discreet's Flame was used for final compositing.
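The automatic point recognition such camera trackers rely on rests on a classic idea: find where a small image patch reappears in the next frame by searching for the position with the lowest matching error. The toy sketch below shows that core idea with a sum-of-squared-differences search; real trackers such as MatchMover are far more sophisticated, and everything here is a simplified assumption.

```python
def track_point(frame, template, search_top_left, search_size):
    """Slide `template` over a square search region of `frame` (both 2D
    lists of pixel intensities) and return the (row, col) where the sum
    of squared differences is smallest, i.e. the best match."""
    th, tw = len(template), len(template[0])
    r0, c0 = search_top_left
    best, best_pos = None, None
    for r in range(r0, r0 + search_size - th + 1):
        for c in range(c0, c0 + search_size - tw + 1):
            ssd = sum((frame[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# A 6x6 frame in which a bright 2x2 feature sits at row 3, column 2.
frame = [[0] * 6 for _ in range(6)]
for i in range(2):
    for j in range(2):
        frame[3 + i][2 + j] = 9
template = [[9, 9], [9, 9]]
pos = track_point(frame, template, search_top_left=(0, 0), search_size=6)
```

Repeating this search frame after frame yields a 2D trajectory for each tracked point, from which a matchmoving tool can solve for the live-action camera's motion and lock the CG character to the plate.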

According to Miance and Brun, the most challenging part of creating Eve was animating her face. "With facial animation, you must be precise because people see other people's faces every day, so they have something real to compare the animation to," says Miance. "It's not like animating a dinosaur, which no one has seen in real life."

"Facial animation has always been difficult, and it still is, even with emotion mapper," adds Brun. "But we're constantly improving the software." In fact, the full-length film Attitude is working on, called Renaissance and expected to be completed in 2002, will star characters whose facial animation will appear even more realistic than Eve's, predicts Brun.

The difficulties of facial animation notwithstanding, Miance, Brun, and the rest of the team at Attitude feel the future is bright for Eve. "We are pushing to get her on TV, and we are speaking with a client and agency for her to appear in a TV commercial. We expect the deal to close soon and for the commercial to air by the summer," says Miance. "Also, we are working with a broadcast network here in France to get her a series of her own by September."

They also believe the future is bright for Attitude. "I've been working in this industry for many years, and even though the content of the Eve animation isn't new-it's just a girl dancing and talking-we've brought it to a new level, where you can see the virtual actor talking to you with emotion," says Brun. "We hope this will bring a lot of people back to the idea that motion capture is good not just for action movies, but also for giving a virtual actor the presence of a real actor.

"I'm not saying we can get it now, but I think we're on the way," he concludes. "And what we're showing with Eve is just the beginning."

Audrey Doyle, a contributing editor to Computer Graphics World, is a freelance writer and editor based in Boston. She can be reached at

Alias|Wavefront
Attitude Studio
Discreet
Kaydara
Oxford Metrics
RealViz
SyncMagic
Virtual Technologies