Although not a new concept, virtual production has come into its own in recent years. What exactly is virtual production? In simplest terms, virtual production combines live-action footage and computer graphics in real time. These productions rely on a game engine, in most instances Epic Games' Unreal Engine or Unity Technologies' Unity engine.
Virtual production can influence every aspect of the production pipeline, from development and preproduction to production and postproduction. And it's not just big Hollywood productions that see its value. It is becoming a practical solution for projects of all sizes, down to the independent or student filmmaker. That's because the advantages it offers are many: faster and more creative workflows, more iteration and collaboration, and a reduction in time spent on set, to name a few.
In this section on virtual production, experts at Epic and Unity shed light on this growing trend.
EPIC: THE FUTURE OF VIRTUAL PRODUCTION IS HERE
By Miles Perkins, Business Development, Epic Games
Entering 2021, the film and TV industries are on the precipice of a significant change. There's been an explosion of interest in virtual production over the past year, with more and more departments outside of VFX exploring what they can do with real-time game engine technology, VFX and animation facilities starting to significantly overhaul their traditional pipelines to leverage game engines, and studios beginning to grasp that the creative benefits of shifting to real-time methods make it a risk worth taking.
Our industry has been approaching the perfect storm of advancements in both software and hardware for many years, and we're finally arriving at the moment where we are able to create and render photoreal images in real time. The possibilities that come with that are incredibly exciting and can be liberating for filmmakers.
Virtual production means different things for different projects, and it could include one or more different techniques touching different departments. Some examples of virtual production techniques include, but are not limited to: in-camera visual effects, whereby a CG background is displayed on an LED wall that provides proper lighting and reflections, and captured in real time by the camera, rather than being created in post; virtual location scouting, whereby filmmakers can go into virtual reality and interactively explore and manipulate their CG environments to make creative decisions prior to filming; and real-time character animation, whereby motion-capture or facial-capture data is used to bring a CG character to life instantaneously.
For me, the broadest definition of virtual production is anything that bridges the gap between what is physical and what is virtual within milliseconds. What we're doing is using technology to create a continuum where the physical and virtual live in harmony, and filmmakers don't need separate vocabularies or separate instructions to realize their vision.
We're creating a working environment where a director and cinematographer (and more) can interface with their world in the same way, whether it's physical, virtual, or a combination of both. The way we're able to do this is with real-time game engine technology that lets everyone see and iterate on the creative vision together, instantly. Whereas in the past, render times of hours, days, or even weeks meant that filmmakers had to compartmentalize their decision-making process; now with game engines, a production can be much more iterative and collaborative.
With an approach such as in-camera visual effects, the benefits are incredibly clear - everyone can see and react to the environment on the LED wall, with all the lighting and reflections captured accurately in-camera right from the get-go. The phenomenal quality of shows filmed with this approach, starting with The Mandalorian and more recently Westworld and others, has created a huge appetite for in-camera visual effects. At the transition from 2019 to 2020, I could count the number of professional LED stages on one hand. Now, as we transition from 2020 to 2021, that number is well over 100, with more being built each month.
Of particular importance with the pandemic, LED stages provide the benefit of being able to create any location without having to travel, and with more modest on-set crews.
Diamond View Studios’ high-res LED wall is used for extended VR productions. Image courtesy Diamond View Studios.
But there are so many other ways that storytellers are experimenting with virtual production. These techniques may not get as much attention, but they are still exciting to me because they show the true scale and impact of real-time technology on filmmaking. Game engines are now being used in departments where traditional digital content creation (DCC) tools were rarely present.
For instance, in editorial, if there was a continuity issue, the editor traditionally had to send notes back, and either the production would have to reshoot the scene or the post team would have to re-render it.
Today we're seeing editors using the Sequencer tool in Unreal Engine, so if there are continuity issues with anything digital - such as the way birds are flying in the background - the editor can fine-tune it themselves and move on. That is so powerful because it lets editors focus on telling the story and being iterative as they make changes, instead of being burdened by the process of passing notes and waiting on other departments to fix the issue.
Another fascinating area for game engines is action design, or the stunt department. Because game engines are interactive and can be physically accurate, the stunt designer or stunt coordinator can pre-test every move ahead of time virtually, and ensure that everything is safe, effective, and has continuity.
For instance, with a virtual car, you can ultimately make it perform like a digital twin of your physical car, experimenting with the suspension or the speed at which you take a turn and seeing the results instantly. Mapping out a stunt sequence like this also helps the cinematographer determine ahead of time the camera moves that will be necessary, along with the lighting design. So by the time everyone gets to the set, it's just a matter of execution. This approach is not only a lot safer, but it can further reduce the number of shoot days - thus saving precious time and money.
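As a back-of-the-envelope illustration of the kind of physics feedback a stunt team gets from a digital twin, the classic circular-motion relationship (lateral acceleration = v²/r) predicts whether a turn at a given speed exceeds tire grip. This standalone Python sketch is not engine code, and the friction coefficient and turn radius are illustrative values:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def lateral_g(speed_ms, radius_m):
    """Lateral acceleration (in g's) for a car holding a turn of the
    given radius at the given speed: a = v^2 / r."""
    return speed_ms ** 2 / (radius_m * G)

def max_corner_speed(radius_m, mu=0.8):
    """Top speed (m/s) at which tire friction (coefficient mu) can
    still supply the required lateral acceleration: v = sqrt(mu*g*r)."""
    return math.sqrt(mu * G * radius_m)

# A 30 m radius turn with an assumed grip coefficient of 0.8:
v_max = max_corner_speed(30.0)  # roughly 15.3 m/s, about 55 km/h
print(f"max corner speed: {v_max:.1f} m/s")
print(f"at 20 m/s the turn demands {lateral_g(20.0, 30.0):.2f} g")  # exceeds grip
```

In an actual engine the physics solver evaluates this continuously, but the point stands: numbers a stunt coordinator once had to discover through trial runs can now be checked in milliseconds.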
Today's audiences have an insatiable desire for more and more content, and they're consuming it across a multitude of platforms. Whereas previously studios would have to rebuild the same assets for every use case - a film, a video game, a theme-park ride, an immersive experience, and so forth - now with game engines, the same original asset can be ingested and reused anywhere. This not only saves time and resources for all the vendors, but it ensures continuity and quality for the brand, no matter how their asset is being deployed.
Ultimately, human beings are storytellers. We connect and thrive by telling our own stories and bearing witness to the stories of others. With virtual production techniques and the efficiency and democratization of today's technology, more stories are going to be told and reaching more people than ever before. After many years of building toward this inflection point, I'm thrilled to finally see what unencumbered creative storytelling might look like.
Miles Perkins is a key member of Epic Games' Unreal Engine team, driving strategic business development and adoption of real-time workflows in the M&E industry, informed by his extensive experience in film, television, and emerging technologies. He built his career over 23 years at Lucasfilm, and then moved to Jaunt, an early startup in the XR industry.
UNITY: REAL-TIME REV AT THE HEART OF VIRTUAL PRODUCTION
By Rory Armes, Vice President of Solutions Development, Unity
It's no exaggeration to call 2020 the most challenging year for TV and movie production in recent memory. The COVID-19 pandemic had dramatic, industry-altering effects once it became clear that few studios on the planet would be spared. Making any kind of scripted entertainment is an elaborate, expensive process even during the best of times, to say nothing of when it's suddenly too dangerous to do things in person that you'd usually take for granted.
Yet while mandatory bubbles and extensive protocols became one way to keep everyone involved in entertainment production safe, the logistics of shooting episodic TV or films during a pandemic accelerated a trend that was already well under way: the shift toward virtual production.
What exactly is virtual production? Let's start by agreeing that the term itself is probably too narrow to describe its impact. Virtual production generally refers to using software to combine live-action footage with computer-generated imagery, which many would consider the domain of traditional VFX. But it's more precise to say that real-time production is what has caught on and revolutionized the future of TV and filmmaking, particularly given the realities of our current time.
The technology needed to get to this point has woven its way into the industry over time. Digital editing was arguably the first big step forward, giving editors the ability to start their work while filming was still going on. VFX was another key area of technological improvement, as artists began conjuring impressive results from their own workstations. Still, most of the developments of the previous few decades have been devoted to speeding up the traditional steps in a time-honored process of assembling TV shows or movies.
VIRTUAL PRODUCTION BENEFITS
Real-time production shakes up the entire paradigm because it removes the need to create a film in a step-by-step, linear fashion. Instead, every part of a project can be worked on simultaneously, and the lines between them are blurred or even erased altogether. For example, VFX has historically been part of postproduction - to make a classic like Star Wars, you shot the film first and added the effects later. Virtual production techniques make that distinction meaningless, as you can now add the effects as you go, and do it in real time, within a completely digital environment if need be.
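Adding the effects "as you go" ultimately comes down to compositing CG over live-action at frame rate rather than in post. As a minimal illustration (plain Python, not production code), here is the standard straight-alpha "over" blend applied pixel by pixel to a toy frame:

```python
def over(fg, alpha, bg):
    """Straight-alpha 'over' blend for one channel: out = fg*a + bg*(1-a)."""
    return fg * alpha + bg * (1.0 - alpha)

def composite_frame(fg_frame, alpha_frame, bg_frame):
    """Blend a CG foreground frame over a live-action background plate,
    pixel by pixel. Frames are rows of (r, g, b) tuples in 0..1."""
    return [
        [tuple(over(f, a, b) for f, b in zip(fpx, bpx))
         for fpx, a, bpx in zip(frow, arow, brow)]
        for frow, arow, brow in zip(fg_frame, alpha_frame, bg_frame)
    ]

# Toy 1x2 "frame": a fully opaque red CG pixel next to a transparent one
fg    = [[(1.0, 0.0, 0.0), (0.0, 0.0, 0.0)]]
alpha = [[1.0, 0.0]]
bg    = [[(0.5, 0.5, 0.5), (0.5, 0.5, 0.5)]]  # gray live-action plate
print(composite_frame(fg, alpha, bg))
# first pixel is pure CG red; second shows the plate untouched
```

Real pipelines run this math (and far more) on the GPU for every pixel of every frame; what has changed is that it now happens fast enough to watch on the monitor while shooting.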
The benefits in terms of speed are obvious. The entire moviemaking timeline can be compressed when the talent involved can build out all parts of it simultaneously. Additionally, those people aren't required to be in the same physical location - a huge bonus in a year like the one that just ended. Team members can easily collaborate in a scene - in the same space, seeing the changes live - yet work remotely.
An area where these tools have shined during the pandemic is in pre-production, when shots can be assembled in CG before anyone heads to a live set. When restrictions require everyone to stay away from shooting locations, team members have the power to craft more of the film or show remotely, making decisions in advance of when actors and teams can head to set again. This pre-production work with virtual tools also allows grips to see where they need to lay the dolly tracks, and set designers and carpenters to see where to place props with regard to shadows and how that placement affects the tracks.
More than anything, though, real-time production puts power and control back in the director's hands. The traditional filmmaking steps often turned decisions into risk/reward equations. Do I try that shot again, hoping for a better angle or more perfect lighting? Or do I press on because it isn't worth the extra time and money involved? When you're able to do everything from visualizing an entire virtual set to determining the perfect camera placement on the fly in real time, it puts some of the spontaneity back into the mix, allowing for serendipitous what-if moments to flourish where they once caused fear.
This creative freedom isn't limited only to those with bottomless pockets. Using real-time production technology in the pre-visualization portion of a project leads to significant cost savings by more fully informing every choice. Traditionally, postproduction was the biggest drain of time and money, responsible for delays and budget overruns too numerous to mention over the years. By enabling more collaboration before any filming even begins, anyone can now render those kinds of concerns moot.
From Love and 50 Megatons. By Josephine Ross, producer; Denis Krez, VFX supervisor; Paulo Scatena, TD.
That kind of collaboration and freedom dovetails nicely with one of our core beliefs at Unity: "The world is a better place with more creators in it." Our goal has been to develop real-time production tools that anyone can use, from students making their first film to those working on a big Hollywood production. Since so many virtual production tools have hit the market in the past couple of years, we've had the chance to learn what works and what doesn't. We noticed a lot of holes in the way creators were developing content, and that the best way to plug them meant scalable architecture, responsive programming, and an all-in-one solution.
Unity's status as a leading platform for creating and operating interactive, real-time 3D content means we can offer that solution. Our foundation in the gaming industry is the starting point, as developers use Unity on all platforms to build video games that need to have thousands, sometimes millions, of interactive objects rendered at 60 fps as a player advances through a world. That kind of power is a natural fit for real-time production, and more and more creators are realizing it.
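To make that 60 fps constraint concrete: real-time rendering gives the engine a fixed time budget per frame, roughly 16.7 ms at 60 fps, and every system (animation, physics, lighting, rendering) must fit inside it together. A back-of-the-envelope sketch, where the per-system millisecond costs are made-up illustrative numbers rather than measurements:

```python
TARGET_FPS = 60
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS  # ~16.67 ms to do everything, every frame

def frame_fits(costs_ms):
    """True if the summed per-frame costs stay inside the real-time budget."""
    return sum(costs_ms.values()) <= FRAME_BUDGET_MS

# Hypothetical per-frame costs for one shot (illustrative only)
costs = {"animation": 2.0, "physics": 3.0, "lighting": 4.0, "render": 6.5}
print(f"budget: {FRAME_BUDGET_MS:.2f} ms, spent: {sum(costs.values()):.2f} ms")
print("fits 60 fps:", frame_fits(costs))  # 15.5 ms total -> True
```

Compare that with offline film rendering, where a single frame can take hours; the hard per-frame budget is precisely what makes on-set, interactive iteration possible.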
Jon Favreau is one of them. The acclaimed director and producer put Unity to use while creating The Lion King, constructing a virtual Serengeti inside the engine. Of course, the result wasn't animation, but rather an ambitious live-action film that utilized traditional camera techniques within a virtual space. What appeared to viewers as a gorgeously realized expanse was almost entirely filmed on a set just 70 x 40 feet.
Unity's commitment to real-time production is only ramping up. With moves like the acquisition of Digital Monarch Media, which built proprietary technology (that shined in movies like Blade Runner 2049) on top of the Unity engine for virtual cinematography even before joining our team, we continue to provide robust tools for anyone looking to unlock their own creative potential.
Change can be scary, and there's no question that real-time production is a massive change from how TV and movies have long been put together. But limits are even more frightening, and if 2020 taught us anything, it's that you never know precisely when you might run into new ones. What real-time production does above all else is give all the talented people in film a limitless environment to work in, free from hardware or budget restrictions. If you have every lens, infinite dolly track, infinite lights, infinite everything, wouldn't you be able to get to the heart of the story you want to tell that much faster?
That freedom to experiment and collaborate, from start to finish and regardless of location, is at the heart of what most people call virtual production. But it's the real-time aspect of it all that is the key, and even when other factors remain uncertain, you can be sure that it will only grow in prominence as we forge ahead in 2021 and beyond.
Rory Armes is the vice president of Solutions Development at Unity, where he leads development of broader platforms, solutions, and applications that help facilitate the virtualization of digital content processes using real-time 3D. Prior to joining Unity, Armes developed games both personally and for Electronic Arts,