In-camera VFX: Shifting the Paradigm with Real-time Tools
David Morin, Epic Games
January 20, 2020

In animation and visual effects, we're in the midst of a great transition from traditional post workflows to real-time workflows, driven by improvements in real-time rendering tools such as game engines and by the industry's growing familiarity with VR.

(Image: VR location scouting)

Though traditional rendering tools will always have a place, Unreal Engine has now been used on over 70 film and television productions since Rogue One: A Star Wars Story in 2016, and with each project we find more people across departments realizing that they can use the engine to get faster feedback and, ultimately, reach their desired creative vision more efficiently.

In 2020, we expect this trend to continue and to culminate in an overall shift of creative decision-making back to the set as real-time tools enable more collaborative and interactive workflows, even when visual effects are involved.

Several years ago, VR drove the adoption of game engines in the motion-picture business, as film and TV studios needed real-time interactivity to develop ancillary VR entertainment experiences. More recently, the consumer VR market has cooled relative to its initial expectations, but a lasting side effect of those years of experimentation is that studios realized immersive headsets and game engines are great tools for certain aspects of film and TV production, and they began incorporating them into their production workflows. At the same time, Epic has been working with leading creatives to develop new Unreal Engine features specific to film and TV, making the engine more user-friendly and more easily tailored to these kinds of projects.

(Lighting demo)

One of the most beneficial use cases to arise from this is VR location scouting, which lets key stakeholders (the director, cinematographer, production designer, art director, visual effects supervisor and others) collaboratively and interactively explore a digital environment, home in on key creative choices, and finalize a scene or a look ahead of production rather than punting those decisions to post. This type of interactive multi-user VR exploration can also be used to review any digital element, from spaceships to characters. With real-time rendering, all users can test lighting options, traverse environments and more on the fly, enabling more informed artistic decisions earlier in the process.

All of this helps to set creative expectations ahead of time, allowing all departments to work more effectively toward a shared vision. This also allows CG and VFX teams to get involved earlier in production and offer more meaningful creative contributions up front, thereby heading off some typical challenges such as compensating for poorly lit plates, or slogging through tedious iterations.

(Virtual camera)

Over the past year, with additional hardware advancements and new virtual production-focused features in Unreal Engine such as nDisplay and Live Link, productions have begun using LED walls on live-action sets for even greater immersion and collaboration, ultimately capturing accurate lighting, reflections and visual effects all in-camera. Rather than being confined to VR, LED walls let filmmakers bring digital environments, characters and other elements directly onto the set, truly blending the digital and physical worlds. In this context, everyone can see and interact with the same digital elements in a meaningful way, rather than trying to light a scene or deliver lines against a green screen. The visual fidelity achievable today with real-time rendering can reach final-pixel quality, with lighting, reflections, digital environments and visual effects elements all captured live in-camera.
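Part of what makes the illusion hold is that the image on the wall is rendered from the tracked film camera's position, so perspective and parallax stay correct through the lens as the camera moves. As a conceptual sketch only (this is not Unreal Engine's nDisplay or Live Link API, and the function and parameter names here are purely illustrative), the snippet below shows the standard off-axis projection math used to render a perspective-correct view of a virtual scene onto a flat display from an arbitrary camera position:

```python
# Conceptual sketch: off-axis ("generalized") perspective projection for a planar
# LED wall viewed by a tracked camera. Not Unreal Engine code; names are illustrative.
import numpy as np

def off_axis_projection(pa, pb, pc, eye, near, far):
    """Return a 4x4 projection*view matrix for a planar screen.

    pa, pb, pc -- wall corners: lower-left, lower-right, upper-left (world space)
    eye        -- tracked camera position (world space)
    near, far  -- near/far clip distances
    """
    pa, pb, pc, eye = map(np.asarray, (pa, pb, pc, eye))

    # Orthonormal basis of the wall plane.
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # right
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # up
    vn = np.cross(vr, vu)                      # normal, pointing toward the camera
    vn /= np.linalg.norm(vn)

    # Vectors from the camera to the wall corners.
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)                        # camera-to-wall distance

    # Asymmetric frustum extents at the near plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard perspective frustum matrix.
    P = np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

    # Rotate the world into the wall's frame, then move the camera to the origin.
    M = np.eye(4)
    M[:3, :3] = np.vstack([vr, vu, vn])
    T = np.eye(4)
    T[:3, 3] = -eye

    return P @ M @ T

# Example: a 6 m x 3 m wall, with the tracked camera 4 m back and 1 m off-center.
mvp = off_axis_projection(pa=[-3, 0, 0], pb=[3, 0, 0], pc=[-3, 3, 0],
                          eye=[1.0, 1.5, 4.0], near=0.1, far=1000.0)
```

In a real LED volume the engine recomputes this per frame, fed by live camera-tracking data; the point of the sketch is simply that the wall is not showing a static backdrop, but a view calculated from where the camera actually is.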

Improvements to real-time technology are also benefiting animated productions: actors on a motion-capture stage can see immediately how their performances are informing the characters, and adjust accordingly. Real-time tools are likewise reshaping all the different flavors of "vis" (previs, techvis, stuntvis, postvis and pitchvis), with game engines enabling better-quality content, faster. The common thread across all of these improvements is that visual quality keeps rising, allowing for near-final, if not final, pixels at every stage.

In addition to the many projects already released, we are looking forward to the 2020 premieres of a number of film and TV projects that use Unreal Engine in new and exciting ways, and to the continued adoption of real-time workflows across projects of all shapes and sizes. With this anticipated shift in the creative center of gravity, the set is being freed from the limitations of the green screen and transformed into a collaborative sandbox where everyone can contribute ideas, reach creative consensus and encounter the kinds of "happy accidents" that only happen through interactive trial and error.


David Morin is head of the Los Angeles Lab of Epic Games, developer of the Unreal Engine.