ILM & Epic Games Develop Groundbreaking LED Stage Production Tech for 'The Mandalorian'
February 21, 2020

SAN FRANCISCO — Industrial Light & Magic (ILM), a division of Lucasfilm Ltd., and Epic Games (maker of the Unreal Engine), together with production technology partners Fuse, Lux Machina, Profile Studios, Nvidia, and ARRI, unveiled a new filmmaking paradigm developed in collaboration with Jon Favreau’s Golem Creations to bring The Mandalorian to life.

The new virtual production workflow allows filmmakers to capture a significant number of complex visual effects shots in-camera, using real-time game engine technology and LED screens to represent dynamic, photo-real digital landscapes and sets with creative flexibility previously unimaginable. As part of today’s announcement, ILM is making its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies, and showrunners worldwide.

Over 50 percent of The Mandalorian Season 1 was filmed using this groundbreaking new methodology, eliminating the need for location shoots entirely. Instead, actors in The Mandalorian performed within a massive, immersive 20-foot-high, 270-degree semicircular LED video wall and ceiling enclosing a 75-foot-diameter performance space, where practical set pieces were combined with digital extensions on the screens. Digital 3D environments created by ILM played back interactively on the LED walls and could be edited in real time during the shoot, allowing for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by Nvidia GPUs.

The environments were lit and rendered from the perspective of the camera to provide parallax in real time, as if the camera were really capturing the physical environment, with accurate interactive light on the actors and practical sets. This gave showrunner Jon Favreau, executive producer and director Dave Filoni, visual effects supervisor Richard Bluff, cinematographers Greig Fraser and Barry “Baz” Idoine, and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and to achieve real-time in-camera composites on set.

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of partners such as Golem Creations, Fuse, Lux Machina, Profile Studios, and ARRI together with ILM’s StageCraft virtual production filmmaking platform and, ultimately, the real-time interactivity of the Unreal Engine platform.  

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of real-time, in-camera rendering,” explained Jon Favreau, adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working towards using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” said Rob Bredow, Executive Creative Director and Head of ILM. “StageCraft is the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated in real time on stage, providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting, was really a dream come true.”

Richard Bluff, Visual Effects Supervisor for The Mandalorian, added, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”  

“We saw this collaboration as an opportunity to push game engine technology beyond the state of the art, into high-end photo-realistic VFX pipelines,” said Kim Libreri, CTO, Epic Games. “Jon Favreau had the foresight and experience with the teams and technology to know that we could pull off the momentous task of bringing the creative center of gravity back on set in a direct-able and highly collaborative manner. Rendering CGI in real time has always been the holy grail for visual effects, and thanks to the Unreal development team, we are now starting to make this a reality that will forever change the way we make films.”