Ninja Theory Rolls Back the Curtain on Its Real-Time Work
July 16, 2018

Since releasing its AAA independent video game Hellblade, and bringing its shockingly human protagonist Senua to life at promotional events, Ninja Theory has been on a roll.
Five BAFTA awards later, and freshly acquired by Microsoft, Ninja Theory technical artist Matthew Stoneham sat down with us for a deep dive into what's next for the studio.

Congratulations on the acclaim and BAFTA awards surrounding Hellblade. Did you anticipate this success?
We took some big risks with this project, and when you’re so close to something it’s always difficult to know how it will be received, but there was a point a few months from the end where we dared to hope. Seeing the accolades trailer for the first time was when I really realized that we’d made something special.

I think the secret to the success of the Hellblade project wasn't necessarily the production methodology; we didn't even have a producer on the project until the last four or five months! I think the key was the experience level of each team member; we were averaging over 10 years of experience per person. Everyone was a specialist in their own department, and that depth of knowledge and mutual trust really helped the team thrive.

Casting Melina as Senua and being able to shoot performance capture whenever we wanted was hugely beneficial. Typically, you'll cast an actor and only get access to them for very short periods of time. Having Mel in the office and available whenever we needed her helped us experiment and explore different ideas for both cinematics and gameplay.

Senua is very human for a video game character – how did you achieve that realism?
Hellblade is completely centered around Senua, what she’s experiencing, and how she perceives the world around her, and for this to be successful we needed to connect with people and communicate that emotion in an accurate and believable way. Right from the start there was a conscious decision to focus our resources on her, and to use the highest quality technology and solutions to achieve our aims. Finding partners that shared our creative vision was key, and Xsens was a huge part of this journey.

You teamed up with Xsens to bring Senua to life at industry events. Is that something you might continue for future projects?
Absolutely. One of the key aspects of Hellblade was our open development policy, which allowed us to connect with the community in a way that was previously not possible. The live events at GDC, FMX and SIGGRAPH as well as the live Facebook Q&A stream helped us generate a buzz around Hellblade, as well as being an awesome experience to work on.

It’s something we’re really keen to keep developing in the future. There’s a huge appetite for this kind of material; people we meet always want to talk about the development diaries and live performances.

So what made GDC such a success for you?
When we revealed partway through the cinematic that Senua’s body and facial performance was being driven live, in real time, by Melina, I think it really blew people’s minds. Later, I watched the stream on Twitch, and when we pulled back the curtain to show Mel in the Xsens suit, there were a good few seconds of silence in the comments as the reveal sank in. Then the comment stream went crazy; people were just blown away by it.

It was the first time it had really been done, certainly on that scale, and was the culmination of a huge amount of work by both ourselves and our partners. For it to be received so well was really awesome.

What would an inertial motion-capture setup allow you to do that an optical setup wouldn’t?
For logistical, cost or simply practical reasons it’s not always possible to use our optical stage.
We’ve used the Xsens suit on a number of key occasions: for the GDC demo, for our Facebook Live Q&A stream and at the Hellblade launch event at the Wellcome Trust in London. In each of these cases it simply wasn’t practical to consider optical, and the Xsens suit did a really great job for us.
At SIGGRAPH we did use an optical stage, but this was a huge undertaking that utilized significant amounts of hardware and required specialist knowledge to run, so there’s a clear trade-off here.
It’s also possible to use the Xsens suit untethered, saving the performance to the onboard SD card. This is a feature we’d like to explore in the future.

What is next for Xsens and Ninja Theory moving forward? Any new locations you're interested in exploring or any new movements or stunts?
It’s always been in our DNA to find new ways of working and to utilize technology that helps us work better and faster, and with Hellblade we tried to push this even further. We’re looking to build on that momentum, and that includes looking at how we can further develop our use of performance capture technology such as Xsens. Being able to use the Xsens inertial system is like having another string to our bow.

On a personal level, I’m really keen to do more events such as GDC and SIGGRAPH; although it was fairly stressful, it was also hugely rewarding. Working so closely with partners, we learnt a huge amount in a very short space of time, and this carries forward through Hellblade and into our future art pipelines.