Faceware debuts plug-in for Epic Games’ Unreal Engine
August 12, 2015

SHERMAN OAKS, CA — At the SIGGRAPH show in Los Angeles, Faceware Technologies (www.facewaretech.com), the provider of markerless 3D facial motion capture solutions, launched the Faceware Live plug-in for Epic Games’ Unreal Engine. The new integration allows Unreal Engine 4 developers to capture facial movements with any camera and instantly apply those movements to characters in the engine.
The plug-in was co-developed by Australia’s Opaque Multimedia, the company behind the Kinect 4 Unreal plug-in. It is well suited to quickly generating facial animation for animatics during previsualization, or for live interactive events, shows and concerts.

Unreal Engine users capture an actor’s facial movements using any video source, such as a built-in computer camera or webcam, the Faceware Pro HD Headcam System, or any other video capture device. The captured movement data is then streamed instantly into Unreal Engine, where it drives a character’s facial animation in realtime through the engine’s Animation Blueprint system.
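In practice, the usual Unreal Engine 4 pattern for this kind of realtime stream is to expose the incoming values through a custom UAnimInstance so an Animation Blueprint can drive a character’s morph targets and bones from them. The sketch below is only a minimal illustration of that pattern; the plug-in’s actual classes and functions are not documented here, so FacewareLiveSource and GetExpressionWeight are hypothetical placeholders, not the real Faceware Live API.

```cpp
// FacialAnimInstance.h -- illustrative sketch of exposing streamed facial
// values to an Animation Blueprint. The Faceware-specific calls are
// hypothetical placeholders and are left commented out.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "FacialAnimInstance.generated.h"

UCLASS()
class UFacialAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Per-frame expression weights, readable from the Animation Blueprint
    // graph (for example, to feed morph target or curve nodes).
    UPROPERTY(BlueprintReadOnly, Category = "Faceware")
    float JawOpen = 0.0f;

    UPROPERTY(BlueprintReadOnly, Category = "Faceware")
    float BrowRaise = 0.0f;

protected:
    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);

        // Hypothetical accessors standing in for whatever the plug-in exposes:
        // JawOpen   = FacewareLiveSource::GetExpressionWeight(TEXT("jaw_open"));
        // BrowRaise = FacewareLiveSource::GetExpressionWeight(TEXT("brow_raise"));
    }
};
```

In an Animation Blueprint based on this class, the exposed properties would typically be wired into morph target or curve nodes so the streamed capture data moves the character’s face each frame.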
 
“Faceware Technologies has a long history of creating some of the most iconic and realistic faces in games and films, while Epic’s Unreal Engine is known for helping create some of the best-selling games ever,” says Peter Busch, vice president of business development at Faceware Technologies. “Integrating our realtime technology with their premier game engine was just a natural fit.”   
 
“The ability to combine the power of the realtime performance capture pipeline of Faceware Live, and the power and flexibility of Unreal Engine 4, gives developers an incredibly powerful tool,” says Norman Wang, director of development at Opaque Multimedia. “Developers will be delighted to see how easily they can access the data from the Faceware plug-in through the native Blueprint interface.”