New Approach to DIY Mocap
August 7, 2018


A new approach to DIY, full-performance motion capture will be showcased at this year’s Real-Time Live! at SIGGRAPH 2018.

The session, “Democratizing mocap: real-time full-performance motion capture with an iPhone X, Xsens, IKINEMA, and Unreal Engine,” takes place on Tuesday, August 14, from 6 pm to 7:45 pm at the Vancouver Convention Center’s West Building, Ballroom AB.

Cory Strassburger, co-founder of LA-based cinematic VR studio Kite & Lightning, will demonstrate how an iPhone X, used in tandem with Xsens inertial motion-capture technology, can capture full-body and facial performance simultaneously, with the final animated character streamed live, retargeted, and cleaned up via IKinema LiveAction into Epic Games’ Unreal Engine – all in real time.

He says: “Thanks to recent technology innovations, we now have the ability to easily generate high-quality full-performance capture data and bring our wild game characters to life – namely the iPhone X’s depth sensor and Apple’s implementation of face tracking, coupled with Xsens and the amazing quality they've achieved with their inertial body capture systems. Stream that live into the Unreal Engine via IKinema LiveAction and you've got yourself a very powerful and portable mocap system – one I'm very excited to show off at SIGGRAPH 2018.”

Taking a Beby character from Kite & Lightning’s upcoming “Bebylon” game, Strassburger will show on stage how this simple, DIY setup, powered by accessible technology, can drive real-time character capture and animation. He will demonstrate how the new approach to motion capture requires neither applying markers nor setting up multiple cameras for a mocap volume; it relies only on an Xsens MVN system, a DIY mocap helmet with an iPhone X directed at the user’s face, and IKinema LiveAction to stream and retarget (transfer) the motion to Beby in Unreal Engine. With this setup, users can act out a scene wherever they are.
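The data flow of that setup – a body-pose stream from the inertial suit merged with per-frame facial weights from the phone, then handed off to the engine – can be sketched in Python. Everything below is illustrative: the joint names, blendshape keys, and packet format are hypothetical stand-ins, since the article does not describe Xsens’ MVN network schema or IKinema LiveAction’s actual API.

```python
import json

# Hypothetical facial coefficient names, loosely modeled on the kind of
# 0..1 weights the iPhone X's face tracking produces. Not a real schema.
FACE_KEYS = ["jawOpen", "mouthSmileLeft", "mouthSmileRight"]

# Hypothetical subset of body joints from an inertial-suit stream.
BODY_JOINTS = ["hips", "spine", "head"]

def merge_frame(body_frame, face_frame):
    """Combine one body-capture frame (joint name -> rotation/position data)
    with one face-capture frame (blendshape name -> weight) into a single
    JSON packet that a downstream retargeting step could consume."""
    packet = {
        "body": {joint: body_frame[joint] for joint in BODY_JOINTS},
        # Clamp face weights into the expected 0..1 range before streaming;
        # missing keys default to a neutral 0.0.
        "face": {
            key: max(0.0, min(1.0, face_frame.get(key, 0.0)))
            for key in FACE_KEYS
        },
    }
    return json.dumps(packet)
```

In a live pipeline a loop like this would run once per capture frame, with the resulting packet sent over the network to the retargeting layer; here it simply serializes the merged frame to JSON.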

The Real-Time Live! session will also cover the implications and potential of this process for future creative projects, revealing how new scalable workflows can empower the games, CG, and animation industries at the indie level, without the need for huge budgets.