Bounty-ful VFX
Issue: Edition 1 2020

Images courtesy Industrial Light & Magic. ©Lucasfilm, Ltd.

The challenge for the filmmakers working on Lucasfilm's television series The Mandalorian was creating the look and feel of the Star Wars universe - the same look and feel that blockbuster feature film audiences have viewed for 40 years - but on a television series budget and with an eight-chapter schedule. Eight chapters with approximately 3,800 visual effects shots.

Industrial Light & Magic visual effects supervisor Richard Bluff, the production's overall visual effects supervisor, oversaw that work with help from ILM visual effects supervisor Jason Porter.

"It's two and a half movie's worth of work in half a movie's time," Porter says. "Even with two supervisors, we were jammed all day, every day. But, no one wanted to do the TV version of Star Wars. We wanted to do the Star Wars version. We needed that quality and scope."

A practical set piece of the RazorCrest positioned in the StageCraft volume for real-time interactive lighting and reflections.

Several vendors, including four of ILM's five global studios (San Francisco, London, Vancouver, and Singapore) contributed to the show. ILM was the lead vendor. Hybride, Pixomondo, Image Engine, Ghost, StereoD, Base, Il Ranchito, MPC, and ILP also worked on the series.

Created by Jon Favreau, The Mandalorian, which streams on the Disney+ channel, is set approximately five years after the feature film Return of the Jedi. In Return of the Jedi (Episode VI, 1983), we saw the Galactic Empire fall and a New Republic begin to take shape. Thirty years later, in The Force Awakens (Episode VII, 2015), the First Order has gained enough strength to threaten the New Republic.

Meanwhile, during those 30 years between, especially in the Outer Rim along the edges of the galaxy, chaos reigned. It is in this time and place that an armored bounty hunter bound by a Mandalorian creed earned his living as a hired gun.

This Mandalorian is a man with no name early in the series. He's called "Mando" - even after we learn his name in the eighth chapter. Mando wears a suit of armor and a helmet he doesn't remove. According to the creed, if a living human were to see his face, he could never wear the helmet again. Mando carries a variety of clever weapons within his suit, which is upgraded in an early chapter to the all-metal Tesla-sleek version. He reflects the world around him.

"On our show, Mando is a mirror ball with a shiny helmet," Porter says. "Normally, from a purely effects standpoint, if I saw something like that I'd groan, 'We'll be painting out greenscreen on every shot. Or making a CG helmet.' But because we had a virtual environment instead of a greenscreen, we had only a few minor paint-outs. In 90 percent of the shots there were no fixes."

The Mandalorian DP Greig Fraser lines up a shot inside the StageCraft LED volume. The inset shows the final in-camera shot from the scene.

Actors in a Virtual Environment

"Jon Favreau, series creator and executive producer, had a distinct vision of how he wanted to film the series and the technology that would be required," Bluff says.

Thus, more than half the content in the eight chapters of the first season was filmed using a new virtual production workflow that ILM has dubbed StageCraft. With StageCraft, the filmmakers could capture in-camera shots of actors performing within virtual environments. LED screens that encircled the actors displayed dynamic photorealistic digital landscapes and set extensions on curved walls and the ceiling.

ILM's visual effects team created all the photorealistic real-time environments that were loaded into StageCraft, which the crew used to manage and live-edit those environments during production. A key feature of StageCraft is that it kept the perspective in the displayed graphics correct from the camera's point of view as the camera moved. The environment data was sent through Unreal Engine for final-pixel display on the volume, with Nvidia GPU-based systems powering the high-resolution display.

"The volume we used for Season 1 was 75 feet in diameter and had LEDs for approximately 270 degrees of a continuous circle," Porter says. "At the open end, we had large LED flyaway panels on chain rigs that we could pull out of the way to get actors, sets, and equipment in and out. So we had close to a 360 environment."

The 270-degree video wall was 20 feet high and extended into the ceiling. The environments created by ILM that played on these LED walls could be edited in real time during the shoot, and lit and rendered from the camera perspective. Thus, the images in-camera had accurate interactive lighting on the actors and practical sets from the 3D graphics on the video wall.

"We have been developing these concepts of virtual environments for some time, and Jon Favreau wanted to push forward with technology developed for his Lion King and other shows," Porter says. "On previous shows, we had used LED panels for lighting but not for final pixels. We didn't have the whole package. On The Mandalorian, we baked the cake from all the ingredients. This was the first time we have made a cohesive system that is more than just an experiment. We have final-quality imagery on the LED wall, real-time camera tracking for perspective correct view, and the availability to view and make changes on the fly."

To make the cohesive system and achieve his vision for the series, Favreau brought ILM, Epic Games, Golem Creations, and technology partners Fuse, Lux Machina, Profile Studios, and ARRI together.

"We thought it might give us a few shots - we estimated 15 to 20 percent of the workflow," Porter says. "But it grew and grew, and became so successful we used it for probably 50 percent of the photography on the whole show. A lot of shots went straight to camera. And again, our hit rate there was greater than we expected. Approximately 60 percent of our environment work went from shooting in the volume to finals, without the need for additional post work. For other shots, of course, we had minor rig removal for the motion-tracking cameras, and I'm not counting shots in which we put a droid or spaceship or something. But, it ended up more successful than we imagined. From our first tests, we saw very quickly that bringing in photography mapped onto 3D geometry could be very successful."

In one of the early tests, Mando creeps outside The Client's (Werner Herzog's) safe house in an alleyway. The alleyway is a partial set, but the sky and the distant buildings are virtual.

"You watch him walk around and the light and reflections on him are perfect," Porter points out. "We saw the system sing, and people started to get excited."

Note the reflections on Mando’s suit and the droid.

Before the Shoot

Lucasfilm design director Doug Chiang and his team created concept art and initial designs that went to a virtual art department managed by the show's production designer, Andrew Jones. That department constructed real-time graphics for VR location scouting.

"A big step in our process came from Jon [Favreau]'s desire to have a virtual art department and virtual scouting in the prep work," Porter says. "He brought that over from The Lion King and previous shows. He wanted a rough environment brought into a shared virtual environment in VR headsets so the entire creative team could be in the same virtual space doing location scouting and blocking. It wasn't the same as on The Lion King, where they were actually creating shots. For us, it was more like a real scout. Walk around. We want the sun over there. I think the camera should be here, what do you think?"

The VR location scout took place at the Mandalorian production office, a space in Playa del Rey shared with The Lion King. Notes from the scout traveled from the virtual art department to The Third Floor for previs, and to ILM for final resolution imagery that would go on the wall.

In addition, knowing they would create all the digital environments in the volume, ILM environments supervisor Enrico Damm had teams do traditional location scouting and photogrammetry capture of real-world locations.

"We brought all that data back here, did our proprietary magic to de-light it so it could be re-lit for lighting choices on the day, and reconstructed it so that the final could be played in real time," Porter says.

Meanwhile, in close collaboration with series DP Barry "Baz" Idoine, Richard Bluff, and the episodic directors, Jones and his crew built the physical sets, many of which needed to integrate tightly with the virtual environment inside the StageCraft volume. Sand dunes built by the production team on set, for example, fit seamlessly with the photography on the wall.

"We may make minor adjustments on the physical side and/or on the virtual set, moving little pieces, making color tweaks if the virtual didn't match the color in the real world," Porter says. "We had a tech team working behind the scenes to integrate the physical and virtual environments even on the days we were shooting. On dailies, we couldn't tell where the set ended and the virtual environment started."

Scenes were often lit with HDR photos of real lighting instruments - that is, with virtual lights in the volume's LED walls and ceiling. In StageCraft, DPs Idoine and Greig Fraser could have bounce cards and other lighting modifiers that would affect the physical environment.

"The DP could instantly put up a card anywhere in the world without having a grip hold a card," Porter says. "The DP could say, 'I want the left side of the sky to come down two stops,' and it could happen immediately. They loved shooting in the virtual environment. They could have magic hour 24 hours a day. They could say, 'I want the sun over there,' and we'd move it. And actors loved being in a final environment. They hate talking to a tennis ball in front of a greenscreen stage. Instead, they could concentrate on being the character without having to imagine what was going to be there later. The best praise we got was from the actors."

The changing perspective wasn't even a problem. To give the environment the correct perspective, the view the camera saw was calculated just for that camera: the virtual environment is fully 3D, rendered correctly from the camera position, and put onto the wall, with photography projection-mapped onto the spot the camera will see.

"If the environment didn't change as the camera moved, it would look like a picture on a wall, not like a real environment," Porter explains. "That's one of the key pieces of technology that makes this work. The change is not that distracting; it happens quickly. The computation is really fast. I suppose if you were to stand in the middle and we'd spin the camera all the way around, we could make you sick in minutes, but of course we don't do that. That fact that it could, though, is a testament to how much you think the world is real."

Compositors added digital characters and ships to plates filmed in the virtual environment.

Not Only Virtual

In addition to the virtual environments inside the StageCraft volume, the crew also shot exteriors and sets on a backlot and sets on a traditional stage. The prison break in Chapter 6, for example, was shot on stage in a practical prison ship set. Scenes on the sanctuary planet were shot outdoors with pools of water, landscaping out to 20 yards, and bluescreen beyond, with digital set extensions composited in later.

And, for the scene with the AT-ST walker crashing through woods, the filmmakers built a forest surrounded by bluescreen on a stage for some shots and used a virtual forest for others, depending on the action.

StageCraft in the Future

The ability to control a virtual background gives filmmakers benefits beyond the amazing ability to shoot a digital environment in camera. “We can do a lot of cool tricks,” says animation supervisor Hal Hickel. For example, if the crew knew the content on the LED wall would be replaced in postproduction because they planned to add a large creature, they’d still use the digital environment, but with animated green cards behind the actors to make it easy to swap backgrounds later. The actors would still be lit with the ambient lighting in the digital environment. Similarly, the director of photography could place digital cards that affect the lighting on the actors.

In addition, although StageCraft had a minimal impact on the work of animators, that could change. It’s possible that some time in the future, the crew might motion-capture an actor on a separate stage and retarget that data in real time onto a creature that’s part of the digital environment.

“It would also be super cool to have a character on the wall, maybe a huge character, that we pre-animated so the actors could watch it walk by,” Hickel says. “The eyelines would be correct, and they can react to it. If they were sitting in a car, the creature would reflect on the windshield. There are a lot of things that would have been hard to do in the past. I see this affecting my job in good ways.”


For animation supervisor Hal Hickel, though, the show was business as usual - as usual for a Star Wars film, that is - except for the television schedule. Shots in The Mandalorian have digital creatures, droids, and ships, and, as did the early Star Wars films especially, shots with motion-controlled miniature ships.

"The video wall didn't shape the character work that much," Hickel says. "We had the same role as on a feature: anything that moved, whether creatures, characters, or flying spaceships. StageCraft changed how some things are done - sets have to be ready to go as the film started, so that shifted the environment work further upstream and changed their working style. But, as amazing as StageCraft is, and it is amazing, for me, it was down to the usual fun stuff. I got plates and we put in whatever."

There was one exception. For the chapter in which Mando signs up for a heist mission to break people out of prison, animators created digi-doubles that appeared in environments on the LED walls. And digital flying ships were projected onto the walls in some backgrounds.

"Sometimes, the ships were there just for eyelines, but sometimes they appeared in camera," Hickel says. "We also did some ships flying through clouds, and I expect we will have more of that now that we have figured out what we can do. But so far, 99 percent of the character and ship work was more like traditional postproduction."

Hickel began working with Favreau long before postproduction, as the story and characters coalesced.

"Jon likes to take characters barely seen in the original films and put them front and center," Hickel says.

The Blurrg, a two-legged beast of burden, was one of those characters. The cumbersome creature first appeared in the 1985 television movie Ewoks: The Battle for Endor and in one episode of the animated series Star Wars: The Clone Wars in 2009. 

"The Blurrgs were hard to get right," Hickel says. "They're all mouth with a pair of legs and a tail.  It was difficult to get a credible run and walk, and they have to leap across track in the terrain. For the Mudhorn creature, our London team could bring up rhinoceros reference. But nothing moves like a Blurrg."

Actor Misty Rosas wore a prosthetic suit and a puppeteered mask to play Kuiil. Actor Pedro Pascal played Mando.

The Blurrg, the hover cart, and the background are all digital. Actors rode motion bases programmed with Blurrg animation curves.

In The Mandalorian, the Ugnaught Kuiil, performed by Misty Rosas in a full prosthetic suit and mask built and puppeteered by Legacy Effects, and voiced by Nick Nolte, teaches Mando how to ride a Blurrg. On set, the stunt actors playing Kuiil and Mando rode a sophisticated motion base in the volume.

"I've been on a fair number of shows where we discuss having a proper motion base, but so far it's gone off the rails," Hickel says. "We always end up doing a 'poor man's' motion base with a buck on cushion bags, or bungee cords with grips off camera. Then we're tied into that motion in animation. This time was the first time where the motion base did exactly what we said it would do. Pixomondo created the run and walk cycles for the Blurrg that drove the motion base. When we first saw Kuill riding the motion base using the same curves as Blurrg, it was awesome."

But for shots of Mando riding a bucking Blurrg, the crew had a stunt actor ride a mechanical bull surrounded by mats instead. In wide shots, both are digital.

Hal Hickel also oversaw the character development of two creatures unique to The Mandalorian: One, a half alligator/half blubbery sea lion, attacks during Chapter 1 in a difficult icy CG environment executed by ILM. Another, executed by Pixomondo, was a pterodactyl-like flying creature that attacks Mando and others on Nevarro trying to save The Child.

"We wanted to do a reverse Jaws," Hickel says. "It's dark, so we can't see the creatures that come diving out of the sky and picking off people. We also did The Alien thing where we had a stroboscopic blaster light up the creatures as they come into view."

The Child, aka Baby Yoda, in the StageCraft volume. The character was often digital when walking.

Baby Yoda

The character in The Mandalorian garnering the most attention, however, is The Child, which has become known as "Baby Yoda." Lucasfilm concept artist Christian Alzmann designed Baby Yoda, and John Rosengrant's team at Legacy Effects built and puppeteered him.

"Our job was to match the puppet," Hickel says. "There have been articles about what a success Baby Yoda is because it's a puppet. But, the banner photo they often use is one of our CG shots. They're not wrong - the skill of the puppeteers made it a massive success. And if the puppet works, the shot is done except for maybe having rods to paint out. So, we can go to town on other characters."

However, Baby Yoda was digital in many shots: when the baby tries to eat a frog, most of the time when it's walking, and in a scene in Chapter 3 when the baby crawls out of its pram and pulls a knob off the dashboard.

"It's hard to have a puppet perform something like that," Hickel says. "But also, that scene was shot early in the schedule and everyone was unsure about how much puppet and how much CG we'd use. Later in the schedule, it might have been the puppet."


Of course, there are also droids in The Mandalorian - security droids, mechanics, and a star, the droid IG-11, who in the early chapters is a bounty hunter, too.

Taika Waititi, who directed Chapter 8 of The Mandalorian, provided IG-11's voice.

"There's a Butch and Sundance thing between IG and Mando," Hickel says. "The shoot-out they had happened on a backlot for the real sunlight. But the majority of IG-11 shots happened in the volume."

Prop master Josh Roth built a full-sized IG-11 prop from the waist up for on-set lighting reference. However, after seeing how good it looked on camera, Favreau asked Legacy to add rods in order to puppeteer it in some shots. "As with Baby Yoda, we looked at the puppeteered performance for reference," Hickel says. "IG-11's torso wobbled when its arm came up, for example. We emulated that."

A motion-captured actor also contributed to the performance.

"If he did something on the day that was great, we'd use it, but IG did things humans can't do," Hickel points out. "We knew we couldn't use most of the motion capture, but he gave the actors eyelines."

Animators at Hybride led by Ken Steel created most of the IG-11 performances, with ILM matching Hybride's IG.

Crawler, Walker, and Speeders

The Sandcrawler was a physical set with a cast of small people playing the Jawas crawling over it, but the walker and speeder bikes were digital.

A transport vehicle, the two-legged AT-ST walker was briefly seen in Star Wars: The Empire Strikes Back and more prominently in Return of the Jedi. It crashes through a forest on the sanctuary planet in The Mandalorian's Chapter 4 and stops, hesitantly, at the edge of a pond.

"That was great fun," Hickel says. "We had a devil of a time getting the step into the water right. It was the nexus of editing and animating, of trying different orders of the shots. Jon really wanted it to be a character, so we don't look inside.

The speeder bikes were also digital. For their animation, Hickel referenced speeders from the first Star Wars films.

"I put together a reel of all the practical shots of Luke Skywalker's speeder," Hickel says. "They had mounted it to a pole and swung it around like a merry-go-round. Some speeders were three-wheeled vehicles with mylar around the base so they'd look like they were hovering. I've always been a fan and amateur historian of the visual effects ILM has done over the years, and I was programming motion-control rigs when I was doing stop motion at Will Vinton's studio in the '80s and '90s. So, I'm familiar with the basic technology."

That came in handy for Mando's ship, the RazorCrest, which is sometimes digital and sometimes a motion-controlled miniature.

The RazorCrest was sometimes digital and sometimes a motion-control miniature.


To stay within the Star Wars aesthetic, Hickel paid close attention to the way spaceships moved in the early films so that shots with digital ships would have the same look.

"I think a lot of the look came from George [Lucas] using scenes from the Battle of Britain, from real aerial photography of fighter planes, of WWII planes shot with long lenses in which they're trying to keep the plane in frame. That set the tone. So you don't see the ships traveling a lot in camera, but the sky whooshes by. Also, you don't see spaceships coming from infinity and whooshing past back into infinity because they didn't have enough motion-control track."

So when Favreau asked if ILM could build a miniature and shoot the RazorCrest shots practically, the crew agreed. If it had been a known ship, the answer would probably have been no because the studio has so much experience making the digital ships look like the old ones. But, the RazorCrest was new. And having a miniature would be helpful as they built and lit the digital version.

Also, although they didn't have time to build a model with working landing gear, and flying over terrain would be tricky, they found around 15 shots that could work with motion control.

John Goodson, who had worked in ILM's model shop in the 1980s and '90s and then moved into digital shipbuilding, built the miniature RazorCrest in his garage. John Knoll, ILM's chief creative officer, built the motion-control rig in his garage and wrote the software to run it. Hickel designed the camera moves. They set up the rig with a 50-foot-long track and a camera with a pan and tilt head on ILM's motion-capture stage.

"Everyone was excited about building models and doing the shots old school," Porter says. "It was a very successful way to give the shots a Star Wars look."

The team also relied on others at ILM who have been with the studio since the early days, such as visual effects supervisor Dennis Muren and Lorne Peterson, former model supervisor at ILM, now retired. Peter Daulton, an assistant cameraman at ILM on Return of the Jedi, and now a digital artist at ILM, was especially helpful as Hickel designed the camera moves.

"I'd ask him if he were shooting, what lens would he use, how long was the tracking," Hickel says. "Shooting with physical motion control limited us in the same way as the original filmmakers. It gives the shots a certain look."

Thus, in The Mandalorian, we can see the best of 20th century and 21st century visual effects filmmaking techniques. Shooting actors in an LED volume is a long way from filming spaceships with a motion-control rig, but the combination keeps audiences guessing and creates the unique Star Wars look and feel. On to Season 2! 

A Group Effort

Several vendors created the visual effects for The Mandalorian. Industrial Light & Magic was the lead vendor: ILM developed StageCraft, created all the digital environments for the LED wall, and worked on several sequences. Overall visual effects supervisor Richard Bluff, with help from ILM visual effects supervisor Jason Porter, led teams of artists in San Francisco, Vancouver, London, and Singapore who worked on the show. They crafted the AT-ST’s nighttime journey through the forest and the battle with the walker in Chapter 4, the lava tunnel and TIE Fighter attack in Chapter 8, and all the miniature flying space shots of the RazorCrest. ILM’s Hal Hickel supervised all the animation on the series.

Hybride did most of the IG-11 performances, plus environment work, especially in Chapter 1 at the raider village. Pixomondo worked on the Blurrgs, the desert in Chapter 5, mud planet work in Chapter 2, and other environments throughout the series. Image Engine artists were heavily involved with the nighttime street battle at the end of Chapter 3 and the prison-break droid work in Chapter 6, among other shots.

Ghost created set extensions for the sanctuary garden planet in Chapter 4, working along with ILM artists who did the establishing shots. They also handled the flashback scenes with the child Mando being chased by killer droids. Base also did set extensions on the garden planet and the forge effects in the armorer’s workshop. MPC did the entire nighttime speeder bike chase in Chapter 5, a brief scene outside the hangar with The Child and Mando on bikes, an encounter with the Tusken Raiders, and the hover cart traveling shots in Chapter 4. ILP created holograms and fitted characters with digital prostheses in the bar on the ice planet. StereoD did cleanup and probably had the highest shot count of all the vendors.

Barbara Robertson is an award-winning writer and a contributing editor for CGW.