Between the Lines
Issue: Volume 40, Issue 1 (Jan/Feb 2017)

It is a period of civil war. Rebel spaceships, striking from a hidden base, have won their first victory against the evil Galactic Empire.

During the battle, Rebel spies managed to steal secret plans to the Empire’s ultimate weapon, the DEATH STAR, an armored space station with enough power to destroy an entire planet.

Pursued by the Empire’s sinister agents, Princess Leia races home aboard her starship, custodian of the stolen plans that can save her people and restore freedom to the galaxy....

Rogue One: A Star Wars Story is the eighth Star Wars feature film and the eighth for which artists at Industrial Light & Magic (ILM) have created the visual effects. With a 1,700-shot total, this “outlier,” squeezed into the Star Wars timeline between Episode III: Revenge of the Sith and Episode IV: A New Hope, ranks among the most effects-intensive. Produced by Lucasfilm and distributed by Walt Disney Studios Motion Pictures, the feature, at press time, was a weekend away from becoming a billion-dollar box-office hit.

The idea for Rogue One came from ILM’s Oscar-winning Chief Creative Officer John Knoll, who was visual effects supervisor on Episodes I, II, and III, and supervised the effects for the 1997 special editions of Episodes IV and VI.

For Rogue One, Knoll was executive producer, writer (story by), and overall visual effects supervisor. Gareth Edwards directed Rogue One. Felicity Jones, who received an Oscar nomination for The Theory of Everything, has the starring role as the renegade Jyn Erso. Diego Luna plays Rebel Alliance Captain Cassian Andor.

“On most pictures, I’m a visual effects supervisor, and the boundaries are pretty well defined,” Knoll says. “On this picture, it was interesting to have a say in the broader part of how the film was made. I still did the same VFX job, but I could exploit more detailed knowledge to make that more efficient. I could prioritize things knowing which shots were more stable and what was likely to change.”

THE STAR DESTROYER SEEN HOVERING MENACINGLY IN THE ATMOSPHERE WAS ONE OF MANY SPACESHIPS BUILT AT ILM.

The Battle Begins

At the end of Episode III, the Empire controlled the galaxy, and the surviving heroes were in hiding or pretending to support the Imperials. The plot for Rogue One exists between the lines in the opening crawl of Lucasfilm’s Episode IV: A New Hope, the first Star Wars film.

That scrolling text described a Rebel attack and the theft of Death Star plans by Rebel spies. But no feature film had shown Rebel spaceships winning their first victory against the evil Galactic Empire, and none had featured Rebel spies stealing secret plans for the Death Star. That opened an opportunity no Star Wars fan could resist, and ILM artists are among the most passionate Star Wars fans. The first two acts of Rogue One set up the theft attempt, which plays out during a gigantic space battle that occupies the third act.

“This project was one of the most fun I’ve ever been on,” says Rogue One Visual Effects Supervisor Nigel Sumner. “It’s Star Wars.”

Artists in all four of ILM’s studios worked on the film creating digital set extensions, ships, and characters, with Sumner supervising work in San Francisco and Vancouver, Mohen Leo supervising effects in London, and Alex Pritchard overseeing the artists in Singapore. Hal Hickel was overall animation supervisor. All four supervisors – Knoll, Sumner, Leo, and Hickel – received BAFTA nominations for their work. ILM’s artists received seven VES award nominations, the most of any feature film in competition this year. And, Knoll, Leo, Hickel, and Special Effects Supervisor Neil Corbould received Oscar nominations.

“All our studios have great talent, so we don’t put the hardest work here or there,” Hickel says. “It’s really down to the producers and coordinators to balance the work among the studios.”

When the volume of work permitted, each studio concentrated on one of the four key locations. London artists built digital sets and created effects for sequences filmed in Jordan, which stood in for the desert moon Jedha. Singapore artists handled the Rebel attack on the stormy planet Eadu. Vancouver artists worked from plate photography shot in the Maldives and at the former RAF Bovingdon airfield in the UK for the Scarif ground-battle sequences. The San Francisco crew managed the space battle that intercuts with the Scarif ground battle and occupies the entire third act.

Technical innovations for the film largely centered on new systems developed for the production and the digital humans Grand Moff Tarkin and Princess Leia Organa, created in the San Francisco studio.

On-Set VFX

The filmmakers shot most of Rogue One on stages at Pinewood Studios near London. There, to give the director, cinematographer, and actors a view of the larger surroundings in which they’d appear in the film, ILM displayed pre-rendered visual effects on huge LED screens. Sometimes those images would be in-camera and appear in final shots. For example, when Jyn is on Coruscant, actor Jones could see the city-covered planet out her window, and the director and cinematographer could film those views.

“It was nice for Gareth [Edwards] and the DP because they could get the appropriate lighting,” Leo says. “Also, it allowed Gareth to choose compositions he might not have chosen if he had only seen bluescreen.”

A 270-degree dome of LED panels surrounding a hydraulic motion base took the idea to an extreme. ILM worked with The Third Floor to generate 360-degree animated content displayed on the LED panels. Those moving images would surround an actor riding inside a spaceship “cockpit” on the motion base.

“The cockpit moved, and on all sides the actors would see the animated content, the planets, ships, laser fire, and explosions,” Leo says. “They’d feel like they were in space. Gareth could switch from one planet to another, get the lighting he wanted, and call out explosions and attacks with TIE fighters.”

With the press of a button, the LED screens could even flash dramatic streaks as if a ship entered hyperspace.

“People could not get enough of it,” Leo says, laughing as he admits to taking a few hyperspace rides himself. “A lot of people came by and wanted to be sent into hyperspace. It was a really impressive setup. We used it quite a bit for the space battle and some other scenes. Most of the X-wing and U-wing content was shot on that stage because we got such nice lighting interaction on the actors and the cockpit. In many cases, we would replace what we see out the cockpit windows with higher-quality images appropriate for where the shot landed in cuts later, but we had moving reflections and lights. And, it helped the actors get into it.”

MOTION FROM ACTOR ALAN TUDYK, WHO WORE STILTS AND ARM EXTENSIONS ON SET, WAS USED AS A BASIS FOR K-2SO’S PERFORMANCE.

Back on the ground, scans of every practical asset, taken during production in London, gave ILM artists a digital repository they could use to augment and create digital assets, including digital doubles.

“We didn’t go for full-3D asset builds of Jyn and Cassian,” Leo says, “but we had enough to create a likeness.”

The London team also added tentacles to Bor Gullet, a mind-reading, octopus-like creature that took four or five puppeteers to move on set, and transferred data captured from actor Alan Tudyk onto the CG model of K-2SO to help animators perform that droid.

K-2SO

Neal Scanlan’s special effects team designed and built a full-sized maquette of the tall, lanky droid for on-set reference, and then ILM artist Landis Fields altered the design to give animators a 3D character with more articulation. On set, Tudyk wore a motion-capture suit as well as hand extensions and stilts with bionic ankles designed by Scanlan. The extenders gave the director, other actors, and the ILM motion-capture crew the 7-foot-2-inch-tall droid’s actual proportions. When it was dangerous for the actor to be on stilts, Tudyk wore a backpack fitted with rods that held a facsimile of K-2’s head to create the appropriate eyeline.

“That long-legged gait felt right, but putting the motion-capture data from Alan [Tudyk] onto K-2 wasn’t push-button,” Hickel says. “We didn’t have to change his performance, but we did amplify his pantomime a bit in some places. A little cock of the head, maybe. And even though Alan was capable on the stilts, occasionally there’d be an extra wobble. We kept those weird wrinkles unless they made him look unsteady.”

A week before filming began on location in Jordan, ILM brought Tudyk to the San Francisco studio’s motion-capture stage to practice.

“He was wearing the motion-capture suit and arm extensions, and was on the stilts,” Hickel says. “We retargeted his motion to our K-2 model in real time so he could see himself as K-2. And, we recorded the session and gave it to him so he could keep looking at it. He had a journey to figure out how to express himself without relying on facial expressions.”
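ILM’s real-time retargeting setup is proprietary, but the general technique it describes, copying captured joint rotations onto a character with different proportions while rescaling the root motion, can be sketched briefly. The Python below is only an illustration under those assumptions; the names and numbers are hypothetical, not ILM’s pipeline.

```python
# Minimal retargeting sketch (illustrative only; ILM's real-time pipeline is proprietary).
# Assumes both skeletons share joint names and that the difference in proportions
# is handled by scaling the root translation by the ratio of hip heights.
from dataclasses import dataclass, field


@dataclass
class Pose:
    root_translation: tuple                        # world-space hip position (x, y, z)
    rotations: dict = field(default_factory=dict)  # joint name -> rotation (e.g., a quaternion)


def retarget(actor_pose: Pose, actor_hip_height: float, char_hip_height: float) -> Pose:
    """Map an actor's captured pose onto a character with different proportions."""
    scale = char_hip_height / actor_hip_height
    tx, ty, tz = actor_pose.root_translation
    return Pose(
        root_translation=(tx * scale, ty * scale, tz * scale),  # taller character covers more ground
        rotations=dict(actor_pose.rotations),                   # rotations transfer joint-for-joint
    )


# Hypothetical frame: an actor on stilts driving a taller droid rig.
actor_frame = Pose(root_translation=(0.0, 1.15, 0.3),
                   rotations={"hips": (1, 0, 0, 0), "neck": (0.98, 0.0, 0.2, 0.0)})
droid_frame = retarget(actor_frame, actor_hip_height=1.15, char_hip_height=1.30)
print(droid_frame.root_translation)
```

In practice, the interesting work Hickel describes happens on top of this mapping: keeping the stilt wobbles that read as character while cleaning up the ones that read as instability.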

Like the droid C-3PO, K-2 has an unmoving face. However, unlike C-3PO, K-2 can at least move his eyes.

“I took on the eyes as my project,” Hickel says. “Historically, the design aesthetic for droids is industrial sensibility. Before K-2, we didn’t have eyes that turn. K-2’s eyes are like security cameras that swivel around. It gave him an expressiveness and humanity a notch above what we’ve seen design-wise in Star Wars droids before, without violating the rules. That little trick can tell you the character is thinking.”

THE EMPIRE'S STORMTROOPERS LEAD THE NEW AT-ACT WALKERS IN THEIR BATTLE AGAINST THE REBELS.

Forty animators spread across ILM’s four facilities worked on the film, with the London animators handling K-2, Bor Gullet’s tentacles, the two-legged AT-ST walkers, and ships. Vancouver animators performed the four-legged AT-ACT walkers and other assets for the third-act ground battle on Scarif. Singapore artists animated all digital assets during the Rebel attack on Eadu. San Francisco animators also contributed to K-2, but the bulk of their work was on the digital humans (see “Grand Moff Tarkin and Princess Leia” below) and the space battle.

Rebel Spaceships

During preproduction, modelers and texture artists in San Francisco led by Digital Model Supervisor Russell Paul and Viewpaint Supervisor Steve Walton created most of the ships: X-wings and new U-wings, TIE fighters and the new TIE striker, Krennic’s Imperial Shuttle, transports, gunships, freighters, a new Hammerhead Corvette, the Mon Calamari Admiral Raddus’s MC75 cruiser, blockade runners, medical frigates, and more.

“I believe we made all the vehicles from scratch,” Sumner says. “We had access in San Francisco to Lucasfilm’s reference photos of original models, and the Lucas Museum’s archived maquettes, artwork, paintings – a lot of the original Star Wars pieces.”

For example, to match the color of their digital Star Destroyer with the original, they referenced an archived 3-foot Star Destroyer model.

“We lifted part of the model and found a paint swatch that hadn’t decayed or been exposed to sunlight,” Sumner says. “Using a color sensor, we discovered we were five percent off the original paint color.”

Knowing that the modelers who worked on the 1977 New Hope had created ships with “kit bashing,” that is, by using pieces of various commercial model kits to add details, Paul found some of those same 1977 model kits.

“He scanned parts from those kits into the computer and started building a library from these little pieces in off-the-shelf kits,” Sumner explains. “Because this movie is tied so closely to New Hope, we wanted to capture the aesthetic using today’s techniques.”

Action!

Although there was a previs stage for the space battle, the ILM team in San Francisco defined much of the action.

“Our animators had proxy, or low-res, versions of all our assets, and sometimes worked with our real-time engine,” Sumner says.

Knowing that Director Edwards liked to use a Steadicam and handheld cameras on set to explore the space and look for angles, the ILM team devised a system for doing the same thing in the space battle’s all-digital scenes.

“By putting together HTC Vive hardware with VR tracking and an iPad with a controller, we had a virtual camera system that we could set up anywhere in 20 minutes,” Leo says. “We’d record Gareth’s clips, give them to editorial, and ILM would refine the animation.”

Animators blocked out the space battle in CG, sometimes using their own work and sometimes 3D Maya scenes from The Third Floor that they had optimized for real time. Edwards could then fly around a scene, framing angles and looking for moments while the battle happened around him.

“We tracked Gareth’s movements and his viewpoint changes,” Hickel says. “We could attach his [virtual] camera to a ship, miniaturize him, or make him huge as he looked for camera angles through his virtual viewfinder. He could change the lens and the focus. The virtual lenses matched the lenses used to film the show.”
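ILM has not published the internals of that virtual camera, but the underlying idea is ordinary scene-graph math: the tracked headset pose is scaled, to miniaturize or enlarge the operator relative to the CG battle, and parented under whichever ship the camera rides on. Here is a minimal, hypothetical sketch of that transform chain.

```python
# Hypothetical virtual-camera transform chain (not ILM's system): the operator's
# tracked pose is scaled and parented under a ship so the camera rides along.
import numpy as np


def scale_operator(tracked_head: np.ndarray, s: float) -> np.ndarray:
    """Scale where the operator stands in the CG scene while keeping the orientation unit-length."""
    m = tracked_head.copy()
    m[:3, 3] *= s
    return m


def virtual_camera_world(ship_world: np.ndarray, tracked_head: np.ndarray, operator_scale: float) -> np.ndarray:
    """World matrix of the virtual camera: the scaled headset pose parented under the ship."""
    return ship_world @ scale_operator(tracked_head, operator_scale)


# Example with made-up numbers: attach the camera to a ship and shrink the operator 50x.
ship = np.eye(4); ship[:3, 3] = [120.0, 40.0, -300.0]    # ship's position in the battle
head = np.eye(4); head[:3, 3] = [0.2, 1.7, 0.0]          # headset about 1.7 m above the stage floor
camera = virtual_camera_world(ship, head, operator_scale=0.02)
print(camera[:3, 3])                                     # camera ends up riding just off the ship
```

Swapping the parent matrix re-attaches the camera to a different ship, and changing one scale factor is what makes the operator feel miniature or enormous, which matches the behavior Hickel describes.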

Keeping the three main story threads of the third act clear, however, was a challenge for everyone.

“At times it could be frustrating, but it was really exciting to be able to suggest story beats,” Hickel says. “In the middle of trying to keep all the motivations and goals clear, John [Knoll], who conceived the story, said he thought it would be cool if we could find a way to push one Star Destroyer into another. So, animator Euisung Lee made a kind of mini movie from that idea. We’d do that throughout. We’d establish a beat and then assign and cast the shot to animators with the right skill sets, and they’d move it along. Once the battle started to gel and we could proceed in a more linear fashion, it was a lot of fun.”

Fun seems to be the operative word for the visual effects crew on Rogue One. Perhaps it was having one of their own come up with the concept and original story draft. Perhaps it was simply that this is a Star Wars film. Either way, their exuberance shines through in the final frames.

“I wish this project could keep going,” Sumner says. “There was so much foundation work, and then we’d see a new awesome shot appear in dailies. That’s where the payoff is. The first time I saw a space battle with X-wings arcing down, I knew the film would be fun and dynamic. I wanted that moment, when everything comes together, to last longer.”

Grand Moff Tarkin and Princess Leia

“I felt strongly from early on that the best ending for a movie about the mission to capture the Death Star plans would be with Leia getting the plans,” says John Knoll, executive producer, writer, and visual effects supervisor on Rogue One. “And, that a movie which features the Death Star as the central McGuffin needed to have Tarkin playing a role. It would be odd considering how important he is in [Star Wars: Episode IV – A New Hope] if he didn’t appear in this movie.”

But New Hope was released in 1977, and Peter Cushing, who played Tarkin, died in 1994.

“The way we thought of it was like trying to do a historic figure,” Knoll says. “I felt that given how recognizable Peter Cushing is, he didn’t fall into the category of someone we could recast and call Cushing; it would look odd if our Tarkin didn’t look like the Tarkin we knew from New Hope. But, as soon as we made that decision, whether we’d use makeup or digital effects [to reproduce Tarkin], we had to cast someone who would embody Peter Cushing’s mannerisms, someone who had the right physical build and, ideally, someone who could do his voice as well.”

They found actor Guy Henry.

The ILM crew captured Henry’s performance with a head-mounted rig fitted with two infrared cameras. Henry had infrared dots on his face and wore ILM’s iMocap suit.

“Previously, we had used visible light cameras with white LEDs mounted next to the camera to flood an actor’s face with light,” Knoll explains. “That’s good for tracking data, but it’s bad for everyone else. I didn’t want that extra light on the faces in footage taken with the visible light camera. During the Pirates movies, we had always used the taking plate as reference, and it was a big part of how we made Bill Nighy look present in the plates. We could see how the shadows fell, the color of the shadows, and the ratios. So, we tested the infrared cameras and realized it is a much better solution. The actor can’t see the infrared lights, and the taking camera can’t, either. It doesn’t interfere.”


As they had done for Warcraft and Turtles, the crew built digital models of the actor being captured and the character, in this case, Peter Cushing as Tarkin, whose facial expressions would be driven by the captured data.

To create Tarkin’s model, ILM artists looked at hundreds of images in the Lucasfilm archives and at shots from New Hope. The images didn’t provide geometry, but they gave modelers information about face size, proportions, and symmetry. From there, the process involved a lot of handwork to capture the likeness.

New Tools

“We’ve also been developing a new tool at ILM we call Flux, which allows us to do shape recovery from archival footage,” Knoll says. “It’s pretty flexible and powerful, and has a lot of potential. We can give Flux a model, and it will try to fit it to an image or animation. So, we fed our Tarkin model into Flux and ran a handful of shots from New Hope through the solver to see what our model needed to do, to learn what we could about Peter Cushing’s facial expressions.”
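Flux itself is an in-house ILM tool, so its details aren’t public, but “fitting a model to an image” is commonly posed as an optimization: solve for the blendshape weights that minimize the reprojection error of tracked facial landmarks. The sketch below illustrates that general idea on synthetic data; nothing in it comes from Flux.

```python
# Conceptual sketch of shape recovery from footage (not ILM's Flux): solve for
# blendshape weights that best explain observed 2D facial landmarks. All data is synthetic.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
N_LANDMARKS, N_SHAPES = 40, 6

neutral = rng.normal(size=(N_LANDMARKS, 3))                       # neutral-face landmark positions (3D)
deltas = rng.normal(scale=0.1, size=(N_SHAPES, N_LANDMARKS, 3))   # per-blendshape offsets


def landmarks_2d(weights: np.ndarray) -> np.ndarray:
    """Deform the neutral shape by weighted deltas, then project orthographically (drop depth)."""
    shape = neutral + np.tensordot(weights, deltas, axes=1)
    return shape[:, :2]


# Stand-in for landmarks tracked in an archival frame.
true_weights = np.array([0.8, 0.0, 0.3, 0.0, 0.5, 0.1])
observed = landmarks_2d(true_weights) + rng.normal(scale=0.001, size=(N_LANDMARKS, 2))


def residuals(weights: np.ndarray) -> np.ndarray:
    return (landmarks_2d(weights) - observed).ravel()


fit = least_squares(residuals, x0=np.full(N_SHAPES, 0.5), bounds=(0.0, 1.0))
print(np.round(fit.x, 2))   # recovered expression weights, close to true_weights
```

A production solver would add a perspective camera, head-pose estimation, and regularization, but the principle is the same: the archival frames constrain which expressions the model must be able to hit.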

To build a facial shape library for the Tarkin model, the artists started by transferring a library of FACS shapes captured from Henry and then refined the shapes when the muscles Henry used were different from those Cushing used to make particular phonemes.

The ILM crew captured the data and applied it to the Henry model first, using an updated version of the studio’s Muse and SnapSolve software to move the data onto the CG characters.

“The model of Guy Henry was our confidence test step,” Knoll says. “We checked the tracked performance to know we’d squeezed all the juice out of it we could get. Once we felt that transfer worked and was maximized, we transferred it onto our Tarkin model. If something didn’t look right, we wanted to make sure that it was because of the difference in the way Cushing made an ‘f’ sound. Not because of an error.”
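Muse and SnapSolve are ILM’s in-house tools, so the snippet below only illustrates the logic of that confidence-test step in generic blendshape terms: the same solved weight curve drives two shape libraries, first the Guy Henry digital double to validate the solve, then the Tarkin model. Every name and value here is hypothetical.

```python
# Generic illustration of the confidence-test step (not Muse/SnapSolve): one set of
# solved FACS-style weights evaluated against two different shape libraries.
import numpy as np

N_VERTS = 5  # tiny stand-in meshes


def evaluate(neutral: np.ndarray, shapes: dict, weights: dict) -> np.ndarray:
    """Standard blendshape evaluation: neutral + sum(weight * delta)."""
    mesh = neutral.copy()
    for name, w in weights.items():
        mesh += w * shapes[name]
    return mesh


rng = np.random.default_rng(1)
henry_neutral, tarkin_neutral = np.zeros((N_VERTS, 3)), np.zeros((N_VERTS, 3))
henry_shapes = {"jawOpen": rng.random((N_VERTS, 3)), "lipFunneler": rng.random((N_VERTS, 3))}
tarkin_shapes = {"jawOpen": rng.random((N_VERTS, 3)), "lipFunneler": rng.random((N_VERTS, 3))}

solved_frame = {"jawOpen": 0.6, "lipFunneler": 0.25}  # one frame of solved performance weights

henry_check = evaluate(henry_neutral, henry_shapes, solved_frame)     # confidence test on the Henry double
tarkin_final = evaluate(tarkin_neutral, tarkin_shapes, solved_frame)  # same weights on the Tarkin model
```

Because the weights are identical in both evaluations, any difference between the Henry check and the Tarkin result comes from the refined shapes themselves, which is exactly the distinction Knoll wanted to isolate.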

In all, the process of creating and animating the digital Tarkin took about a year and a half. Digital Leia was easier.

“She’s in one shot and she delivers one line of dialog,” Knoll says. “We didn’t do scans; we sculpted the model by hand. And we motion-captured [actor Ingvild Deila] using standoff witness cameras and long lenses framed in tight enough for good data. We did this with Carrie Fisher’s permission. She saw the work in progress and the finished result. If she hadn’t been happy with it, we wouldn’t have put it in the movie.” – Barbara Robertson

Barbara Robertson (BarbaraRR@comcast.net) is an award-winning writer and a contributing editor for CGW.