Real Time, A Virtual Reality
Hein Beute
Issue: Volume 40, Issue 3 (May/Jun 2017)

As I sat in the audience during the live Hellblade demo by Ninja Theory at GDC 2016, I was excited, even though I knew what was coming.

Real-time performances are special. There is a unique kind of energy generated on stage, and as a spectator, you become part of the spectacle. Real-time VFX performances are among the most exciting uses of the medium, but they also come with extra pressure.

I felt a bit of that pressure during the GDC stage show, as I knew about the effort that was put into this demo by all the different parties, including Epic, with its Unreal Engine 4, and Cubic Motion, IKinema, and Xsens, with their real-time mocap technologies.

It was a beautiful demonstration of what is possible using real-time motion capture in a game engine for a cinematic production. The real-time component, enabled by game engines, empowers animators to reinvent their pipeline.

Actors can control their characters in real time, in any scene. It creates new options and significantly reduces production time.

The demonstration created a stir in the community with reactions like, “Hellblade takes real-time motion capture to the next level” and “The demo takes a big step forward in crossing the ‘Uncanny Valley.’” It was indeed an amazing presentation. Epic and Ninja Theory have done an incredible job of blurring the boundaries between film and gaming production technology, demonstrating that a scene can be shot, captured, edited, and rendered to film production quality in minutes instead of days or weeks.

A few years ago, this was unheard of; now we are in danger of taking it for granted.

Access to real-time motion-capture technology not only saves time and money, but it also preserves the integrity of the actor’s performance. It becomes much easier to capture the subtle nuances of timing, emotion, and improvisation. Faithfully re-creating an actor’s performance is also crucial for immersive experiences such as VR games.

Playing VR games is all about the user believing that he or she is present in the virtual environment, and being immersed only by sight and a pair of floating hands – a common occurrence in many VR projects – does not help this cause.

Full-body immersion, where you can actually see your whole body and the body of your opponent, offers a richer experience that will immerse players for a longer period of time. An inertial mocap system, like Xsens’ MVN, allows users to comfortably wear inertial sensors that re-create their movements and send the data wirelessly. That truly puts them “in the game.”

Technology

We are frequently asked how difficult it is, and how much additional technical knowledge an artist needs, to create a realistic character. Is it a matter of plugging together commercial off-the-shelf hardware and software? Can anyone set up a live production for a concert, theme park, TV show, or theater? Or do you need a large team and months of preparation?

A believable character like Senua in the Hellblade demo is not created in a week or two. It takes expertise across several disciplines to ensure a believable virtual performance. To create a believable digital character, the VFX industry pays close attention to every detail, beginning with skinning and rigging.

Once you have a character, it then needs to be brought to life with rendering techniques and full-body motion capture. Integrating a motion-capture system with Faceware, IKinema, Manus VR, and HTC Vive, for instance, can take a lot of testing to figure out how to properly place the hardware on the performer’s body and to verify that the data can be properly combined in the game engine – the supplied plug-ins for streaming data into the engine don’t always cooperate. Xsens’ MVN has been integrated with many hardware vendors and game engines, enabling clients to slot it seamlessly into their desired setup.
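
As a rough illustration of what “properly combined” can mean in practice, the sketch below pairs frames from two independently streamed sources – say, a body-mocap stream and a hand or face stream – by matching the closest timestamps before a merged frame is handed to the engine. This is a minimal sketch under stated assumptions: the class name, buffer size, and 5 ms tolerance are all illustrative, not vendor code.

```python
# Minimal sketch (not vendor code): pairing two independently streamed
# capture sources by timestamp before passing a merged frame onward.
from collections import deque


class FrameCombiner:
    def __init__(self, tolerance_s=0.005):
        self.tolerance_s = tolerance_s
        self.body_frames = deque(maxlen=120)   # ring buffer of (timestamp, pose)
        self.hand_frames = deque(maxlen=120)

    def push_body(self, timestamp, pose):
        self.body_frames.append((timestamp, pose))

    def push_hands(self, timestamp, pose):
        self.hand_frames.append((timestamp, pose))

    def latest_merged(self):
        """Return the newest body frame paired with the closest hand frame,
        or None if the two streams have drifted apart beyond the tolerance."""
        if not self.body_frames or not self.hand_frames:
            return None
        body_t, body_pose = self.body_frames[-1]
        hand_t, hand_pose = min(self.hand_frames,
                                key=lambda frame: abs(frame[0] - body_t))
        if abs(hand_t - body_t) > self.tolerance_s:
            return None  # streams out of sync; skip rather than glitch
        return {"timestamp": body_t, "body": body_pose, "hands": hand_pose}
```

Skipping a frame when the streams drift apart, rather than mixing stale data, is one simple way to avoid the visible glitches that mismatched plug-ins can otherwise produce.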

Under Pressure

Real-time performances are incredibly demanding, as nothing can be “fixed in post.” The pressure of a real-time performance for a VR application or a live show can, however, be alleviated when you can trust the tech you are using.

In a real-time application, you cannot deal with the data gaps typically seen with optical systems when markers are occluded. Occlusion can be mitigated by adding more cameras, but that is not realistic if the system is to remain manageable (and affordable). For the price of one camera, however, you can purchase an entire additional MVN inertial system.

During the development of MVN Link, there was a clear focus on making it suitable for real-time applications, which resulted in robust hardware that can take a beating. The wireless link uses the standard Wi-Fi protocol to connect to a wireless network, ensuring that data is received in real time. A fully redundant inertial mocap setup can also be achieved by wearing a double set of trackers on the body, each connected to a separate PC on a different Wi-Fi channel.
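
To make the redundancy idea concrete, here is a minimal sketch of a receiver that listens to a primary and a backup stream on two UDP ports – as would be fed by the two tracker sets on separate Wi-Fi channels – and always forwards the freshest data, so a dropout on one link does not stall the show. The port numbers, timeout, and forwarding logic are assumptions for the sketch, not the actual MVN Link protocol.

```python
# Minimal sketch (assumed ports and protocol, not MVN Link itself):
# forward packets from a primary stream, falling back to a backup
# stream only while the primary has gone quiet.
import select
import socket
import time

PRIMARY_PORT, BACKUP_PORT = 9763, 9764   # illustrative port numbers
STALE_AFTER_S = 0.1                      # treat a silent stream as stale


def make_socket(port):
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind(("0.0.0.0", port))
    s.setblocking(False)
    return s


def run(forward):
    sockets = {make_socket(PRIMARY_PORT): "primary",
               make_socket(BACKUP_PORT): "backup"}
    last_seen = {"primary": 0.0, "backup": 0.0}
    while True:
        ready, _, _ = select.select(list(sockets), [], [], STALE_AFTER_S)
        now = time.monotonic()
        for sock in ready:
            packet, _ = sock.recvfrom(65535)
            name = sockets[sock]
            last_seen[name] = now
            # Forward primary packets by default; use the backup stream
            # only while the primary has been silent too long.
            if name == "primary" or now - last_seen["primary"] > STALE_AFTER_S:
                forward(packet)

# Example: run(print) would dump incoming packets to stdout for testing.
```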

An example of a real-time setup that requires redundancy and ease of use is the Royal Shakespeare Company’s (RSC’s) performance of The Tempest, which was staged in William Shakespeare’s hometown, Stratford-upon-Avon (see “Stage Presence,” January/February 2017).

The RSC is at the forefront of bringing VFX to theater with its performance of The Tempest. In a unique partnership with Intel and The Imaginarium, it reimagined the magic-filled stage production to include fully digital characters driven by mocap actors on stage. The Imaginarium built the setup with its experienced VFX team, then handed the real-time production over to the RSC, whose crew was trained during rehearsals and has been running it without a hitch ever since.

The Future

What is left to improve upon? I think that tighter hardware integration of the different systems will result in more accurate performances, thanks to better synchronization. It will also result in a more versatile and easier-to-use setup.

On the software side, improvements can be made – for example, why have separate retargeters? Would it not be great to have IKinema natively supported by Unreal, so the user does not have to install and set it up separately? There is room to grow.