ADAM: Humanizing Robotics With Performance Capture And Unity
July 10, 2018


Are you a fan of cult films? Dystopian worlds? Then you’ll love ADAM. This proof-of-concept demo has grown into a viral hit: a fully real-time series rendered in the Unity engine. Beyond this, the films also feature strikingly photorealistic performance capture, delivered by Animatrik and DI4D.
ADAM Unity demo

Two years ago, Academy Award-nominated director Neill Blomkamp wrote something he labelled “The ADAM bible”. This 50-page document explored the rich possibilities of an original, dystopian world created by writer and director Veselin Efremov in 2016. Whereas Efremov’s original story introduced a world where robots live in an oppressive and violent society, Blomkamp saw potential for so much more.

Leveraging the power of his independent creative outfit Oats Studios, Blomkamp decided to expand the world further. ADAM has now grown into a viral hit series of three short films, transforming what was once a Unity real-time demo into an immersive, engaging and ongoing story.



This was ADAM’s original intention: to explore the applications of Unity’s real-time game engine in filmmaking. It is truly impressive in this regard. All three films run fully in real time, with users able to pull the camera out and explore any scene in all of its intricate detail. Beyond this, the films also deliver strikingly photorealistic performance capture, created in partnership with Animatrik and DI4D.

“We wanted to keep everything in-camera as much as possible, and translate the real world into Unity’s game engine,” says Chris Harvey, VFX Supervisor at Oats. “We approached Animatrik, who have the biggest space available for independent performance capture in North America. Their team is one of the best in the world – I knew that from working with them on films like Chappie and Zero Dark Thirty.”

The blend of physical and digital

The creation of ADAM’s setting was an interesting process indeed. Oats scouted physical locations, set up props such as a fire truck, and built all of the costumes practically. Not a typical approach for a digitally created environment…

The goal was to capture a real environment and transfer it into the Unity engine: basically, digitising reality. After two days of photography, drone work and photogrammetry – coming in at 20,000 shots – Oats was able to take the gathered data and transplant it into a motion capture volume.

The upshot was that the Animatrik stage matched the imagined physical location. The crew could look through a virtual camera and see their fictionalised world, with everything in it, overlapping Animatrik’s 10,000-square-foot mocap volume.

“We’d load in the gathered photogrammetry data, placing sandbags and ramps where necessary to mimic rough terrain,” explains Harvey. “Animatrik went as far as to measure the exact angle of slopes and undulations in the ground.



“Animatrik helped us build believable, empathic digital characters. Our actors were stepping across, over and around obstacles – that was key to the body performance capture. Even when shot from the waist up, details like stumbling motions when crossing uneven ground came across in the character’s gait. Movement felt very natural.”

Emotive robotics

ADAM explores a world where human prisoners have their consciousness forcibly transferred into mechanical bodies. It was important that their underlying humanity came through in the animation. To achieve true-to-life performance, Oats cast real actors to play their corresponding robotic avatars.

“There’s a scene where one character, Mary, smashes her brother over the head with a rock,” recalls Harvey. “We were capturing her face, with all the little nuances and details. The actress really got into that scene. She broke down and started crying. It was incredibly emotional. We had to stop shooting for a moment and take a break.”

The team then needed to marry this body capture with facial performance. At the time of shooting, technical circumstances meant these two key character aspects couldn’t be captured simultaneously.

In partnership with Animatrik, facial performance capture company DI4D provided high-resolution 4D facial animation. The Oats team could capture performance geometry at 60fps, then stream that data into the Unity engine and attach it to the relevant characters.
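The article doesn’t detail how that attachment works under the hood. As a rough illustration of the kind of bookkeeping such a pipeline involves (purely a sketch with hypothetical names, not the actual Oats/DI4D tooling), the short Python snippet below resamples 60fps facial geometry frames onto the 30fps playback timeline that ADAM ran at:

    from dataclasses import dataclass
    from typing import List, Tuple

    CAPTURE_FPS = 60   # DI4D facial geometry capture rate cited in the article
    PLAYBACK_FPS = 30  # the real-time frame rate ADAM ran at

    @dataclass
    class FacialFrame:
        time: float                                  # seconds since the start of the take
        vertices: List[Tuple[float, float, float]]   # facial mesh vertex positions

    def resample(frames: List[FacialFrame], target_fps: float) -> List[FacialFrame]:
        """Pick the nearest captured frame for every playback tick (hypothetical helper)."""
        out, t, duration = [], 0.0, frames[-1].time
        while t <= duration:
            nearest = min(frames, key=lambda f: abs(f.time - t))
            out.append(FacialFrame(time=t, vertices=nearest.vertices))
            t += 1.0 / target_fps
        return out

    # Two seconds of dummy 60fps capture, one static vertex per frame, just to exercise it.
    take = [FacialFrame(time=i / CAPTURE_FPS, vertices=[(0.0, 0.0, 0.0)])
            for i in range(2 * CAPTURE_FPS)]
    print(len(resample(take, PLAYBACK_FPS)), "frames at playback rate")

In practice the real system carries far richer per-frame data than this, but the sketch shows why capturing above the playback rate leaves useful headroom for retiming.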



“We wanted to try something different from traditional facial capture,” says Harvey. “That was a key decision in going with DI4D’s facial capture system. The end result was so realistically human.”

Rendering in real time

Unity is the world’s most popular creation engine, reaching more than 3 billion devices worldwide. But it is rarely used in a filmmaking pipeline.

“It was all new to us,” says Harvey. “We’d never worked in a game engine before, so we had a lot of fun experimenting with real-time renders. Essentially, we were able to take a game-based tool and adapt it to work in a visual effects pipeline. The results were spectacular.”

Unity proved to be a powerful and instantly responsive toolkit. ADAM ran live at 30fps rather than being pre-rendered. According to Harvey, the biggest advantage of using a real-time engine was a newfound flexibility in creative iteration.

“We were fighting with some of the lighting in the outdoor scenes, which were shot at high noon with very straight, harsh sunlight. It’s hard to make that look aesthetically pleasing. As an experiment, we took 30 seconds out of the film and adjusted the lighting conditions to 2 o’clock, 4 o’clock and so on to see what they would look like.

“An hour later, we had each of these 30-second wedges at different times of day. You can’t experiment that quickly and see the final result without a real-time engine. Unity is just so fast. It’s all live, all the time. Every artist can see the big picture. Creative decisions are much more holistic and flexible, rather than isolated.”
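To make the idea of those lighting wedges concrete, here is a minimal sketch (not production code; the simple sunrise-to-sunset arc is an assumption made purely for illustration) of how a directional sunlight vector could be recomputed per candidate hour before re-rendering the same clip under each:

    import math

    def sun_direction(hour: float) -> tuple:
        """Crude symmetric sun arc: sunrise at 6:00, straight overhead at noon, sunset at 18:00.
        Returns the direction the sunlight travels, pointing down into the scene."""
        arc = (hour - 6.0) / 12.0 * math.pi   # 0 at sunrise, pi/2 at noon, pi at sunset
        elevation = math.sin(arc)             # 0 at the horizon, 1 at noon
        east_west = math.cos(arc)             # sweeps from east (+1) to west (-1)
        return (east_west, -elevation, 0.0)

    # One lighting "wedge" per candidate hour, as in the 2 o'clock / 4 o'clock test.
    for hour in (12.0, 14.0, 16.0):
        x, y, z = sun_direction(hour)
        print(f"{hour:4.1f}h -> sunlight direction ({x:+.2f}, {y:+.2f}, {z:+.2f})")

Driving the scene’s single sun light from a parameter like this is what makes it cheap to spin off several versions of the same shot and judge them side by side.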

Creative flexibility

“The whole point of the studio is to do what we want,” Blomkamp once said of Oats on Twitter, referring to the studio’s foundations as a freeform, independent entity capable of experimenting with the latest filmmaking technologies. “No constraints.”

By teaming up with Animatrik and using tools like DI4D’s PRO Facial Performance Capture System, Oats gained access to exactly that kind of flexible technology: the most up-to-date, efficient and emerging motion capture solutions.

“Brett and the team are probably the most professional, most knowledgeable performance capture specialists that I’ve ever worked with,” concludes Harvey. “They’re close partners in any project.”