The new software solution, which will be available in Q4 2013, captures a live actor's facial performance from any video source and instantly transfers that performance to a 3D character in real time. Faceware Live will be on display at this year's SIGGRAPH, booth #809.
Faceware Live captures an actor's facial performance without markers, using a computer's onboard camera or webcam, the Faceware Head-Mounted Camera, or any other video capture device, and streams the result into Autodesk MotionBuilder. After a quick one-button calibration, Faceware Live tracks and solves the performance directly onto any 3D model: if the actor smiles, so does the model, in real time. Faceware Live currently works on Windows. Future versions of Faceware Live will support additional Autodesk products, operating systems, and game engine technologies.
"We've seen growing demand across a number of industries for real-time facial motion capture," said Peter Busch, vice president of business development at Faceware Technologies. "Faceware Live is our revolutionary approach to meeting this demand. This product addresses the many challenges that arise in production and enables rapid content creation for anything from live stage performances to on-set pre-visualization to a desktop tool for animators."
Faceware Live has many applications, including:
- On-Set Pre-Visualization -- Directors, actors, and creative leads can see their on-set facial performances streamed onto their digital characters instantly, which helps them gain an immediate understanding of how a live performance translates to the digital character. Faceware Live even captures highly accurate eye movement, allowing for a true indication of performance for real-world production tasks such as 3D camera blocking and placement.
- Live, Interactive Experiences -- Digital characters can now interact live with real performers for an unlimited number of use cases, including broadcast television, corporate events, and tradeshows.
- Desktop Production Tool -- Artists can produce animation instantaneously for rapid prototyping and content creation, all at their own workstation.
How It Works
Faceware Live consists of two simple components: image processing and data streaming. In real time, Faceware Live reads from a selected video source running at any frame rate, then tracks and solves the facial performance into a set of animation values that are streamed from the application. Autodesk MotionBuilder accepts the stream of animation and applies it to a set of pre-defined controls. Using Faceware's Character Setup tool, the animation can then be easily mapped onto any digital character.
"We've designed Faceware Live so that the image processing component does all of the heavy lifting. There is almost no work for MotionBuilder other than applying animation curves to your character," said Jay Grenier, director of technical operations. "This gives you an extremely stable and scalable solution for real-time facial animation."
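To make the data-streaming half of the pipeline concrete, here is a minimal sketch of what a receiver on the MotionBuilder side might do with each streamed frame. Faceware has not published its wire format, so everything below is an assumption for illustration: a hypothetical JSON payload mapping control names to values in [0, 1], and a plain dictionary standing in for the rig's pre-defined controls.

```python
import json

def parse_frame(payload: str) -> dict:
    """Decode one hypothetical streamed frame of animation values.

    Assumes the tracker sends one JSON object per frame, e.g.
    '{"smile": 0.85, "brow_raise": 0.2}'. Values are clamped to the
    expected [0, 1] range before being applied.
    """
    frame = json.loads(payload)
    return {name: max(0.0, min(1.0, float(v))) for name, v in frame.items()}

def apply_to_controls(frame: dict, rig: dict) -> None:
    """Copy animation values onto a character's pre-defined controls.

    `rig` is a stand-in for the control set that Faceware's Character
    Setup tool would map onto an actual digital character; unknown
    control names in the stream are simply ignored.
    """
    for name, value in frame.items():
        if name in rig:
            rig[name] = value

# Example: one streamed frame driving two facial controls.
rig = {"smile": 0.0, "brow_raise": 0.0}
frame = parse_frame('{"smile": 0.85, "brow_raise": 1.4}')
apply_to_controls(frame, rig)
```

Because the heavy lifting (tracking and solving) happens before the stream, the receiving side reduces to exactly this kind of copy-and-clamp loop, which is what makes the approach stable at any frame rate.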