Turning Vision into Reality through Technology
With The Mandalorian, filmmakers Jon Favreau and Dave Filoni have been explicit in their desire to “bring Star Wars to the screen in a new way.” With the scope and ambition of the series only increasing in its second season, it was crucial that the actors and viewers not only experience a huge range of new worlds, but truly believe in the reality of the worlds being created and build emotional connections with the characters.
This ambition has required new filming techniques to be rapidly developed and deployed — chief among them virtual production techniques, including camera tracking for in-camera VFX (ICVFX).
Virtual production in its simplest form is the merger of physical and digital worlds. Through a combination of immersive technologies like virtual reality (VR) and augmented reality (AR), as well as ILM StageCraft and real-time render engines, virtual production allows filmmakers to view their projects live on set and quickly react and make changes as needed, rather than having to wait until post-production. Virtual production also offers several logistical benefits: it allows more iterations of scenes or shots to be created with fewer personnel in a shorter space of time, thereby significantly reducing production costs.
Allowing the creative team and the actors themselves to better visualize the environments on shoot day is paramount. Production teams previously had to imagine the final scene while using greenscreens to shoot, with visuals applied in post-production after the fact.
ILM has invested heavily in leading the way with these techniques, and projects such as The Mandalorian and George Clooney’s feature The Midnight Sky have been a tour de force of just what is possible with virtual production.
Motion-capture technology in a virtual production pipeline is a crucial component in making these endeavors a reality. Vicon’s technology has allowed ILM to re-create the universe of Star Wars in compressed time with 60 different live environments, which they can use over and over again.
Everything from VR scouting, previsualization, and performance capture to in-camera VFX using giant LED walls can make use of Vicon technology in some way.
One of the biggest leaps forward has been real-time capture in the volume itself, which requires high-resolution cameras and large frustums. The latest Vicon hardware has enabled ILM to accurately track cameras moving about the set, whether handheld, on a crane, on a Steadicam, or on other support equipment. This has helped to create 360-degree virtual production environments at large scale, such as ILM’s pioneering StageCraft LED volumes, enabling the studio to capture a whole new category of shots that blend photoreal visual effects with live action in ways that previously weren’t possible.
Making the Impossible, Possible for 25 years
For Rachel Rose, the ILM R&D supervisor who oversees the studio’s developments for virtual production, the success of The Mandalorian and all ILM projects requiring motion capture owes much to the collaboration: “Since day one Vicon has enabled us to do things that were never possible before — and that’s as true today as it was in the ’90s. Vicon’s technology and hardware have constantly advanced throughout our relationship, and the processing power available to us with their technology is like no other. We can deploy and always count on Vicon’s tech as it’s such reliable, robust hardware requiring only a quick calibration.”
“ILM always looks to collaborate with those who are making best-in-class software/hardware solutions for problems we’re solving. If a solution doesn’t exist, we’ll solve it on our own, but we’re not looking to reinvent a solution that’s already there. We are incredibly lucky that we have a long-standing relationship with such an innovative company like Vicon. The absolute best thing I can say is that with Vicon I have a powerful performance capture system that just works.”