Playacting
Issue: Volume 40, Issue 1 (Jan/Feb 2017)

ACTOR MARK QUARTLEY (SEATED), PLAYING THE CHARACTER ARIEL, WEARS MOCAP TECHNOLOGY EMBEDDED IN HIS STAGE COSTUME.

During the past decade or so, advances in performance capture and related technologies have given birth to many unforgettable screen characters, including Gollum from The Lord of the Rings and Caesar from Planet of the Apes. Now, that same technology, coupled with recent real-time developments, has taken center stage at the theater, giving digital life to various forms of a character in the Royal Shakespeare Company’s (RSC’s) stage production of “The Tempest.”

“The Tempest,” one of Shakespeare’s final plays, tells the story of Prospero, an exiled magician who decides to settle old scores, enlisting help from his servant, the spirit Ariel, who uses his magic and cunning. The enslaved Ariel obliges in hope that Prospero will adhere to his promise to set him free.

Throughout the play, Ariel assumes many non-human forms, including a water nymph and a Harpy. The character, visible only to Prospero, also features more stage directions than almost any other in Shakespeare’s repertoire. Lest we forget, the character is magical and able – and willing – to do just about anything that Prospero commands: “to fly, to swim, to dive into the fire, to ride on the curl’d clouds.”

For centuries, Shakespeare has reminded audiences of Ariel’s invisibility through lines spoken by Prospero, while theater companies have devised a host of creative solutions to illustrate Ariel’s shape-shifting and exuberance. In this recent production, the RSC has been able to illustrate the intended breathtaking wizardry of this character using real-time performance capture and CGI as Ariel morphs into various forms on stage in front of audiences.

Groundbreaking

This marks the first time that a completely digital character has been used in an RSC production. The feat was made possible through the RSC’s collaboration with Intel, in association with The Imaginarium Studios, the production company founded in 2011 by Actor-Director Andy Serkis (who performed Gollum and Caesar, among other digital characters) and Producer Jonathan Cavendish. Intel, which was involved in the technical process for “The Tempest” from start to finish, also provided the machinery and technology behind it.

The concept started a few years ago when Gregory Doran, artistic director at the RSC, wanted to stage some type of extravaganza to mark the 400th anniversary of Shakespeare’s death. He began focusing on the masque – a lavish entertainment staged for royalty in Shakespeare’s day that showcased the technical advances of the era, with pomp and pageantry costing a king’s ransom. “The Tempest” was chosen since it was one of the playwright’s last plays; it also contains a masque scene. “They were looking at what the masque would be like today, and Greg [Doran] decided he wanted something big and unique,” says Ben Lumsden, head of studio at The Imaginarium.

What caught Doran’s eye was the Intel CES 2014 keynote demonstration of an interactive adaptation of Scott Westerfeld’s Leviathan: a virtual world with a gigantic CG whale floating through it and breaching the screen. The unique presentation mixed storytelling, VR, AR, cinema, and 3D into a visually stunning display. Sarah Ellis, head of digital development at the RSC, met with Intel, and soon The Imaginarium was brought on board.

“In the end, our technology was not used in the masque element of the play, but that was where the impetus came from,” says Lumsden. Instead, the RSC opted to focus on Ariel in character mode, and determined that he would have to drive the performance in real time – to get the proper emotional connection between the two main characters, Ariel and Prospero. The question, however, was whether the actor playing Ariel (Mark Quartley) should be present on stage at the time his digital character, or avatar, appeared.

“Greg [Doran] was a firm believer that it was vitally important for the actor (Quartley) to be on stage. So, [with the setup] it was like he was a puppeteer and a puppet at the same time, akin to what they did with the theater production ‘War Horse,’” says Lumsden. Audiences see the puppeteers inside the life-size horse puppets but soon forget the humans are present. “You invest so much in the creatures,” he adds. “It’s about suspending disbelief – you are taking that extra step in the journey.”

While this unique approach required Quartley to “puppeteer himself,” says Lumsden, the process did not require a steep learning curve. The actor met with Serkis for a workshop, where the performance-capture veteran offered pointers. “But it’s all about the acting, and Mark is a great physical performer, which helped him with the part. He is in good shape and took to the physicality of the role.”

Technical development of the process, however, took much longer, spanning nearly two years.

How It Works

Ariel is a complex character, in more ways than one. According to the story, he is invisible except to Prospero. And he takes many forms, some human-like, some not. “The technology made Ariel a real spirit instead of just another character in the play. He could be shown in several forms other than his onstage presence,” says Hein Beute, product manager at Xsens, whose technology played an important role in acquiring the actor’s movements.

Whether as Ariel the spirit or in another form, Quartley performs onstage in a special costume designed by the RSC. It is a skintight garment, similar to a wet suit, that has the appearance of the body’s musculature, as if the skin were peeled back (a la the Body Worlds exhibition). Xsens’ MVN motion-capture technology is embedded within the costume, using 17 sensors to track the actor’s performance and movements.
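
For readers curious about what happens to the suit’s data downstream, the sketch below illustrates the general principle of inertial capture in Python: per-segment orientations are chained along a skeleton hierarchy to recover a pose, with no cameras involved. The segment names, hierarchy, and bone lengths are invented for illustration; this is not the actual Xsens MVN segment model or API.

```python
# A rough illustration of how per-segment orientation data becomes a pose:
# each segment's world rotation (as an inertial suit reports it) is used to
# place its child joints along a skeleton hierarchy. The segment names,
# hierarchy, and bone lengths here are invented.
import numpy as np

# segment -> (parent segment or None, bone offset expressed in the parent's frame)
SKELETON = {
    "pelvis":    (None,        np.array([0.0, 0.0, 0.0])),
    "spine":     ("pelvis",    np.array([0.0, 0.0, 0.30])),
    "head":      ("spine",     np.array([0.0, 0.0, 0.40])),
    "upper_arm": ("spine",     np.array([0.20, 0.0, 0.35])),
    "forearm":   ("upper_arm", np.array([0.28, 0.0, 0.0])),
}

def rot_z(deg):
    """Rotation about Z, used only to fabricate example 'sensor' readings."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def solve_pose(orientations):
    """orientations: segment -> 3x3 world rotation. Returns segment -> world position."""
    positions = {}
    for name, (parent, offset) in SKELETON.items():   # parents are defined first
        if parent is None:
            positions[name] = np.zeros(3)
        else:
            # place this joint by rotating its offset into the parent's world frame
            positions[name] = positions[parent] + orientations[parent] @ offset
    return positions

if __name__ == "__main__":
    frame = {name: rot_z(10.0) for name in SKELETON}  # one frame of fake readings
    for joint, pos in solve_pose(frame).items():
        print(f"{joint:10s} {np.round(pos, 3)}")
```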

A CG MODEL OF ARIEL (ABOVE) WAS USED FOR THE PROJECTION SHOWING THE CHARACTER TRAPPED WITHIN A TREE (BOTTOM).

When Ariel transforms into a non-human form, Quartley’s movements drive a CG avatar – a digital apparition of Ariel the spirit – that is projected somewhere on the stage. “When the avatar appears, Mark can still be seen on the stage as he performs the digital character,” explains Lumsden, noting that there doesn’t seem to be confusion among the audience, as they tend to look wherever Prospero looks – whether that is at the avatar or Ariel the performer.

Ariel assumes two distinct forms (a sea nymph and a bird-like Harpy); in addition, there are hundreds of color variations of Ariel, as well as effects such as the spirit bursting into flames, breaking into particles, and exploding or contracting.

The avatars and effects were created by Silvia Bartoli, Brenainn Jordan, and Dan Orchard at the Imaginarium. The digital assets were mocked up in Adobe’s Photoshop and modeled in Pixologic’s ZBrush, then brought into Autodesk’s Maya, where they were rigged before they were imported into Autodesk’s MotionBuilder for retargeting. That data is then run through Epic’s Unreal Engine, where the CGI resides, and output to d3 video servers with Intel Xeon processors. The servers are connected to the RSC lighting desk, which in turn controls the projectors.
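
The retargeting step mentioned above – mapping the actor’s captured joints onto the avatar’s differently proportioned rig – is handled by MotionBuilder in the production pipeline. Purely as an illustration of the underlying idea, the Python sketch below remaps one captured rotation onto a hypothetical Harpy wing joint by preserving motion relative to each skeleton’s bind pose; the joint names, bind angles, and mapping are made up.

```python
# A minimal sketch of rotation retargeting between two skeletons with different
# bind poses. MotionBuilder does the real work in the production; the joint
# names and bind orientations below are invented for this example.
import numpy as np

def rot_x(deg):
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

# bind-pose (rest) orientations of matching joints on each skeleton
ACTOR_BIND  = {"RightArm": rot_x(0.0)}
AVATAR_BIND = {"wing_R":   rot_x(35.0)}   # the Harpy wing rests at a different angle
JOINT_MAP   = {"RightArm": "wing_R"}

def retarget(actor_pose):
    """Map captured world-space joint rotations onto the avatar's joints,
    preserving each joint's motion relative to its own bind pose."""
    avatar_pose = {}
    for src, dst in JOINT_MAP.items():
        # motion of the actor joint relative to its rest orientation ...
        delta = actor_pose[src] @ ACTOR_BIND[src].T
        # ... applied on top of the avatar joint's rest orientation
        avatar_pose[dst] = delta @ AVATAR_BIND[dst]
    return avatar_pose

if __name__ == "__main__":
    captured = {"RightArm": rot_x(20.0)}       # one frame of fake capture data
    print(retarget(captured)["wing_R"])        # the wing ends up at 35 + 20 degrees
```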

The digital characters and imagery are projected onto unique screens situated around the theater – sometimes up high, sometimes low to the ground. “The projection of Ariel as an avatar is on every one of them at some point,” says Lumsden. “There are also times when he [Mark as Ariel] is on the stage but not projected – there is always a creative logic for the CG to appear, whether it’s to tell a story, illustrate a point, or express a strong emotion.”

For instance, at one point Ariel tells Prospero how he took the form of fire on the sinking ship that forced the magician’s rivals to the shores of his island, and shows himself as a flame. In another scene, Prospero goads Ariel, asking if he remembers how he freed the spirit servant after the witch Sycorax imprisoned him within a hollow tree. In this amazing scene, a cylindrical cage wraps around Ariel and traps him onstage, while the projection shows him encased within a massive tree structure.

A total of 27 projectors and 30 projection screens are used during the production. “The main Ariel projection surface is a system we call ‘the cloud,’ ” explains Lumsden. “There are many viewing angles in the theater, and it becomes complicated when you are talking about projecting a computer graphic. It’s not like going to the cinema.”

The imagery is projected onto a mesh surface that is similar to mosquito netting, which is transparent when there are no projections. “It captures light quite well,” adds Lumsden. Vicon cameras and tracking technology are used to determine where the screens are, on the stage and around it, so the video server knows where the projections should occur.
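
The article doesn’t spell out how those tracked screen positions are used, but a common way to map a rendered frame onto a tracked planar surface is a homography fitted between the image corners and the surface’s corners. The Python sketch below shows that textbook calculation with invented coordinates; it is not the actual d3 or Vicon implementation.

```python
# Textbook direct linear transform (DLT): solve the 3x3 homography H that maps
# the rendered image's corners onto a tracked screen's corners, then warp any
# image point onto the screen. All coordinates below are invented.
import numpy as np

def homography(src, dst):
    """Solve H (up to scale) so that dst ~ H @ src for 4+ point correspondences."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)    # null-space vector = the homography entries

def warp_point(H, p):
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]            # back to inhomogeneous coordinates

if __name__ == "__main__":
    image_corners  = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
    # where the tracked screen's corners land in the projector's frame (made up)
    screen_corners = [(210, 140), (1730, 95), (1690, 1010), (250, 960)]
    H = homography(image_corners, screen_corners)
    print(np.round(warp_point(H, (960, 540)), 1))   # image center mapped onto the screen
```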

ACTOR QUARTLEY WEARING A HEAD-MOUNTED CAMERA AND MOCAP SUIT. ONE OF MANY CG VERSIONS OF ARIEL CREATED BY THE IMAGINARIUM FOR "THE TEMPEST."

Vicon’s optical camera system also tracks the whereabouts of moving objects on stage, some of which are held by the actors. So, precision is vital.

The Vicon Tracker software processes that data on Intel Xeon and Core i7 processors.

Without question, the setup – designed and devised by the crew at the RSC – is extremely complex.

The group did, in fact, consider having Quartley perform offstage in a capture volume, controlling the avatar projected onstage. “It would have been technically easier, and we would have used an optical system [as opposed to the inertial system], which gives you more geographic precision,” explains Lumsden. “But as Greg [Doran] asked, how do you get that intimate relationship between the two characters, the master and the spirit, Prospero and Ariel? They have to look at each other, and the relationship has to be true and authentic every night. That’s hard to do when one of the actors is offstage.”

Another alternative would have been to use pre-canned animation, but that would not have done justice to the play, either. “[The alternatives] would have been less challenging from a purely motion-capture standpoint,” says Lumsden. “But the creative messaging would not have been as strong.”

Motion Technology

Xsens’ inertial motion-capture systems are especially well suited for live performances, since there is no need for cameras, markers, or a preconfigured capture area. The data can be streamed in real time into Epic’s Unreal Engine 4, as it was in this case. “It gave the RSC the chance to be creative with Ariel’s costume by hiding the technology and focusing on its visual appearance,” says Beute. “It also gave Mark as Ariel the flexibility to move around the stage without technical limitations.”
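
To picture the real-time link, the toy Python sketch below streams per-frame pose data over UDP at roughly 50 fps from a sender to a receiver standing in for the render engine. The packet format, port number, and joint names are invented for the example; this is not the Xsens MVN network protocol or Unreal’s Live Link.

```python
# Toy streaming pattern: one process pushes JSON pose packets over UDP at ~50 fps,
# another consumes them (standing in for the render engine). The packet layout and
# port are made up; real systems use their own binary protocols.
import json
import socket
import time

HOST, PORT = "127.0.0.1", 9001   # arbitrary local address for the example

def stream_frames(num_frames=250, fps=50.0):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for i in range(num_frames):
        frame = {
            "frame": i,
            # fake joint rotations (Euler degrees) standing in for suit data
            "joints": {"RightArm": [0.0, 0.0, i % 90], "Spine": [0.0, 5.0, 0.0]},
        }
        sock.sendto(json.dumps(frame).encode("utf-8"), (HOST, PORT))
        time.sleep(1.0 / fps)     # crude pacing; a real sender schedules more carefully

def receive_frames():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))
    while True:
        data, _ = sock.recvfrom(65535)
        frame = json.loads(data)
        # in the show, this is the point where data would be written onto the rig
        print(frame["frame"], frame["joints"]["RightArm"])

# In practice the two functions would run in separate processes (or machines).
```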

As Beute points out, motion capture is typically a one-off shoot: You plan a shoot, capture the movements, and you are done. Taking motion capture to the theater meant the technology needed to be robust and reliable enough to sustain several performances, or shoots, a week.

It’s worth noting that despite the cutting-edge nature of this application, the RSC used an off-the-shelf Xsens system with the latest MVN Studio software.

While Lumsden cites the overall project as the biggest challenge, he is quick to point out that the real-time facial-capture technology for this application proved especially difficult. “We created our own real-time setup for the show, and it was really challenging to get that working. It’s proprietary software we wrote, but now it will be part of future projects we do at the Imaginarium,” he says.

While there are some commercial options available, those are closed systems. The Imaginarium’s Grip platform – which calibrates, tracks, and retargets facial movement – works in real time and is scalable, making it ideal for this situation.

The setup entails a head-mounted camera (HMC), which in this case is tethered, meaning the video stream is sent through a cable at the back rather than transmitted wirelessly. The HMC also illuminates Quartley’s face, as the stage environment is fairly dark; because the group wanted the audience to be able to understand what they’re seeing, IR illumination wasn’t considered, Lumsden notes.

“We ingest the video data at 720p50 and feed it into our real-time facial tracker,” he says. “A nice feature here is that we don’t need to apply any special makeup for this step (as Quartley is wearing his Ariel stage makeup).”

As Lumsden explains, the tracker had been trained on images of Quartley captured in several sessions over the last year, so it’s well tuned to variations in the image that occur between shows. The Imaginarium annotated those training images with the position of the actor’s facial features, such as specific points on the lips, jaw, nostrils, eyebrows, eyelids, and pupils. The crew compensates for small variations in the point of view of the helmet by identifying stabilizing points at the temples, on the bridge of the nose, and where the nose meets the lip – areas that move just slightly. The rest of the tracked points are interpreted as animation of the face. Those animations are retargeted onto the Harpy rig controls and streamed to the rendering engine (UE4), where they are combined with the body controls, rendered, and then projected.
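
The stabilization step can be pictured with a little math: fit a similarity transform (scale, rotation, translation) from the near-rigid stabilizing landmarks to their reference positions, apply it to every tracked point, and treat whatever motion remains as facial animation. The Python sketch below uses a classic Procrustes fit on made-up 2D points; it is a simplified stand-in, not the Imaginarium’s Grip implementation.

```python
# Stabilization sketch: fit a 2D similarity transform from the "stabilizing"
# landmarks (temples, nose bridge) to their reference positions, then apply it
# to all tracked points so the residual motion is facial animation, not helmet
# drift. Classic Procrustes fit; all point data below is invented.
import numpy as np

def similarity_fit(src, dst):
    """Least-squares scale s, rotation R, translation t with dst ~ s * R @ src + t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    A, B = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(B.T @ A)
    R = U @ Vt
    if np.linalg.det(R) < 0:        # guard against a reflection solution
        U[:, -1] *= -1
        R = U @ Vt
    s = S.sum() / (A ** 2).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

def stabilize(landmarks, stab_idx, stab_reference):
    """Re-express all landmarks in the reference head frame."""
    s, R, t = similarity_fit(landmarks[stab_idx], stab_reference)
    return (s * (R @ landmarks.T)).T + t

if __name__ == "__main__":
    # reference positions of three stabilizing points (made up)
    stab_ref = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 3.0]])
    # one tracked frame: the same three points plus one lip point, with the helmet shifted
    frame = np.array([[2.0, 1.0], [12.0, 1.0], [7.0, 4.0], [7.0, -4.0]])
    print(np.round(stabilize(frame, slice(0, 3), stab_ref), 2))
```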

“We wrote a blueprint in UE4 that receives the animation data over the network and writes it onto the rig,” Lumsden says.

REAL-TIME FACIAL AND BODY CAPTURE PERFORMED ON STAGE BY ACTOR QUARTLEY DROVE THE ANIMATION OF THIS 3D HARPY.

The retargeting is learned from hand-tuned expressions on the rig, each mapped to a corresponding facial expression that Quartley provided in a facial range-of-motion (FACS ROM) training set. “From there, we can make corrections to keyframes to improve the result,” says Lumsden.
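
One simple way to learn such a mapping from example pairs – a tracked expression on one side, an artist’s hand-tuned rig values on the other – is an ordinary least-squares fit, as in the Python sketch below. The data sizes and values are invented, and Grip’s actual solver is proprietary and presumably far more sophisticated.

```python
# Learn a linear map from stabilized-landmark vectors to rig-control values
# using example pairs (ROM expression -> hand-tuned controls). Sizes and data
# are invented; this is only the simplest possible version of the idea.
import numpy as np

def fit_retarget(landmark_vectors, control_vectors):
    """Fit W, b so that controls ~ W @ landmarks + b (ordinary least squares)."""
    X = np.hstack([landmark_vectors, np.ones((len(landmark_vectors), 1))])
    coeffs, *_ = np.linalg.lstsq(X, control_vectors, rcond=None)
    return coeffs[:-1].T, coeffs[-1]          # W, b

def apply_retarget(W, b, landmarks):
    return W @ landmarks + b                  # per-frame rig control values

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 40 training expressions: 20 landmark coordinates in, 8 rig controls out (made-up sizes)
    X_train = rng.normal(size=(40, 20))
    true_W = rng.normal(size=(8, 20))
    y_train = X_train @ true_W.T + 0.01 * rng.normal(size=(40, 8))  # "hand-tuned" targets
    W, b = fit_retarget(X_train, y_train)
    print(np.round(apply_retarget(W, b, X_train[0]) - y_train[0], 3))  # near-zero residuals
```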

As Lumsden points out, the group does not use timecode synchronization for this setup. All of the systems – face, body, render, projection – are running in real time, at approximately 50 frames per second. “Of course, there’s latency, and more than I’d like, but the objective here is to be as close to live as possible,” he says. “Normally I’d like to run the face at a higher frame rate, but we’re limited by the choice of hardware suited to the HMC form factor.”

The face is primarily a blendshape rig, making it easier to be certain of uniformity across the various platforms where it exists (Maya, Unreal, Grip). Going forward, the Imaginarium is looking at Fabric Software’s Fabric Engine as a potential way to maintain that consistency across platforms.
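
Part of what makes a blendshape rig easy to keep consistent is how little there is to evaluate: the deformed mesh is simply the neutral mesh plus a weighted sum of per-shape deltas, so any package that can add vectors produces the same result. The Python sketch below shows that evaluation on a few invented vertices; the shape names are hypothetical.

```python
# Blendshape evaluation: deformed = neutral + sum_i( weight_i * (shape_i - neutral) ).
# Vertex data and shape names are invented for illustration.
import numpy as np

def evaluate_blendshapes(neutral, shapes, weights):
    """neutral: (V, 3) vertices; shapes: name -> (V, 3) target vertices;
    weights: name -> float (typically 0..1). Returns the deformed vertices."""
    result = neutral.copy()
    for name, target in shapes.items():
        result += weights.get(name, 0.0) * (target - neutral)
    return result

if __name__ == "__main__":
    neutral = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.5, 1.0, 0.0]])
    shapes = {
        "jaw_open": neutral + np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, -0.3, 0.0]]),
        "smile":    neutral + np.array([[-0.1, 0.1, 0.0], [0.1, 0.1, 0.0], [0.0, 0.0, 0.0]]),
    }
    print(evaluate_blendshapes(neutral, shapes, {"jaw_open": 0.5, "smile": 0.8}))
```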

“The animation isn’t going to be comparable to a Hollywood blockbuster, where it’s been perfected by animators in postproduction,” says Lumsden. “It’s enough for now to be opening the door of this new world of real-time performance capture, and I’m thrilled that Intel and the RSC have been bold enough to want this, and very proud of the work that our team at the Imaginarium has put together to deliver it.”

Moving Forward

According to Lumsden, a mandate at the Imaginarium from Serkis – who comes from a theater background – is to make performance capture as prevalent as possible within different forms of media. “He loves the idea of pushing the boundaries,” he adds.

Theater is now entering an era where characters and scenes can be presented in ways that are more visually engaging, and, in many cases, far beyond what even the authors originally imagined.

“Inertial motion capture is changing how far productions can push their craft, bringing high-end digital characters into live shows,” says Beute. “With ‘The Tempest,’ the RSC is creating a real-time application that is both immediate and novel, something audiences always want to see on their nights out.”

Indeed, performance capture has been a game changer in film, but doing a live performance every night in real time on a theater stage presented a unique challenge, one that has been met with resounding success.

“In films, we are used to highly realistic visual effects. The ability to create visual effects in real time brings a lot of freedom to any application. Being up close to the stage and performing live are big challenges for bringing these visual effects to the theater, and until now, they have relied on visual tricks, like onstage magicians do,” says Beute. “Adding our technology brings live, realistic visual effects a step closer to the theater.”

Imogen Moorhouse, CEO of Vicon, adds: “Combining one of Shakespeare’s most renowned plays with innovative augmented reality driven by Vicon [and others’] technology signals a change in theater production and traditional media. It’s important to continue to innovate in motion capture, and the Royal Shakespeare Company has demonstrated just how versatile the technology is when used to ignite the imagination of theater-goers old and new.”

Yet, it’s not just motion capture and facial capture that are opening new doors. Advances in real-time technology are making an impact across entertainment, as well.

“There is some crazy stuff going on in real time,” says Lumsden, pointing to Unreal Engine’s Sequencer, a multi-track editor for creating and previewing cinematic sequences in real time, as a particular technology to watch. In addition, many inside and outside of the gaming world are closely watching the progress of Ninja Theory’s work on the upcoming video game Hellblade: Senua’s Sacrifice, which utilizes real-time cinematography to capture the nuances of the digital character’s facial expressions, lighting, and visual effects, then renders them directly within UE4’s Sequencer.

Whether it’s film, gaming, theater, or another genre, there will always be those who think outside the box, even for a production that is centuries old and has been performed countless times.

Two years ago, when the RSC was searching for a technology that would not only dazzle audiences but also enhance the theater experience, the group asked itself what Shakespeare would be using if he were alive today. The answer, they agreed, was that he would be looking at all the cutting-edge tools and technology available. And that is exactly what this group decided to do, and as a result, it gave the performance of a lifetime.

Karen Moltenbrey is the chief editor of Computer Graphics World.