DNeg: Creating Reality in an Unreal World for 'The Matrix'
Karen Moltenbrey
February 18, 2022

When The Matrix Resurrections, the fourth installment of the popular science-fiction film series, was finally released, viewers were ready for stunning visual effects. After all, this was a Matrix film. And helping make that a reality required some extraordinary work by a number of VFX facilities, including DNeg, the lead effects vendor on the film.


In all, DNeg was responsible for 723 shots, split among the facility’s London, Vancouver, and India sites. According to Huw Evans, DNeg’s (London) VFX supervisor on Resurrections, the work involved quite a large mix: some especially heavy and incredibly complex environment builds, along with creature work for the Synthients, digi-doubles for lead actors, complex FX development work, multiple fully-CG shots, tricky de-aging work, and a host of the usual wire-removal/crew removal tasks.

A good deal of that work took place in the so-called “real world” of the Matrix. In particular, the London crew was responsible for the sequence in which Neo wakes up in the Anomaleum chamber, his escape through the power plant, the underground megacity of Io, the flashback to the Machine War, and scenes involving the CG ships traveling down tunnels or through derelict structures. The group also handled the sequence outside of the real world involving Neo’s fight in the dojo.

The Vancouver team, headed by Ahron Bourland, worked on the modal (a testing sandbox for game code) rooftop chase at the start of the movie, the theater where Neo takes the red pill, the bullet train segment, and the flashback to when Neo and Trinity were being operated on by the machines in the lab. The India team, meanwhile, supported the work at the other two locales.

The project presented a multitude of technical challenges in various areas. In particular, the sheer scale and density of detail required for the huge environment builds were at the top of the list, along with lighting and rendering them at 4K in the multitude of broken-up passes, which had to be combined in the final comp.

In fact, the London DNeg team had to devise new ways of handling and passing its assets across departments. “While we were confident we could use USD to construct the assets, then scatterers to help fit a vast amount of geometry into RAM at render time, a separate concern was handing off these assets to other departments,” explains Evans. To achieve this, the group organized its work so that specific areas could be output as hero USD assets, with enough detail to allow animation to interact with the geometry, but still light enough to work at usable frame rates within Autodesk’s Maya.
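Evans describes hero USD assets that carry full render detail yet stay light enough for interactive frame rates in Maya. One standard USD mechanism for balancing those two needs (shown here as a general illustration, not DNeg’s actual asset structure) is to author heavy and lightweight geometry under one prim and tag each with a purpose, so viewports draw only the proxy while the renderer pulls in the full-detail version:

```usda
#usda 1.0

def Xform "HeroBuilding"
{
    def Mesh "render_geo"
    {
        # Full-detail geometry; only loaded at render time
        uniform token purpose = "render"
    }

    def Mesh "proxy_geo"
    {
        # Decimated stand-in drawn in the interactive viewport
        uniform token purpose = "proxy"
    }
}
```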

Rendering, however, was difficult because of this overall complexity and the full-4K working resolution. To deal with the sheer scale and complexity of the environments, there was a clear need to establish a system to ingest, manipulate, and standardize exports for memory-intensive assets, according to Evans. “We created a bespoke FX pipeline that slotted between the environment and lighting departments, allowing us to pick up, process, and pass data along as efficiently as possible,” he adds.



In the Dojo
At one point in the film, the new Morpheus (Yahya Abdul-Mateen II) and Neo find themselves in a dojo, a modern-style structure situated in the middle of a lake with colorful foliage-filled trees visible along the banks. Morpheus’s goal: to find out if Neo retained his kung-fu skills. The answer is yes, although it takes some time for those skills to return.

To create this sequence, DNeg did not call on its own muscle memory, but instead utilized a new move: using Epic’s Unreal Engine to render out these CG film environments. “We wanted to push real-time rendering in Unreal to achieve final-quality renders, which hold up at 4K resolution, not just in the background on an LED screen, but front and center as a fully digital environment,” says Evans.

These scenes, which involved live actors and a practical set piece along with the digital environment, mark the first time DNeg had used the game engine for a full film sequence. As Evans explains, many of the tools and techniques relied on for rendered VFX weren’t yet available in Unreal Engine.

“We were running custom bleeding-edge releases, giving us OCIO color support and layered rendering, to name a few, which weren't originally present when this work was first started,” Evans says. “This gave us the ability to run renders through our compositing pipeline, helping us push the quality level to where we needed it for a fully-digital background with a forest of trees and falling leaves as far as the eye can see and a large lake with wind blowing across the water surface.”
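Evans credits OCIO color support for letting Unreal renders flow through the compositing pipeline. As a general illustration of the kind of display transform an OCIO config encodes (a minimal sketch, not DNeg’s or Epic’s actual configuration), here is the standard scene-linear-to-sRGB encoding:

```python
def srgb_encode(x: float) -> float:
    """Encode a scene-linear value with the standard sRGB transfer function."""
    if x <= 0.0031308:
        return 12.92 * x  # linear toe for very dark values
    return 1.055 * x ** (1 / 2.4) - 0.055  # gamma segment

# Middle gray (18% scene-linear reflectance) lands near mid display range:
# srgb_encode(0.18) ≈ 0.461
```

In a managed color pipeline, transforms like this are applied consistently by OCIO rather than hand-coded, which is what keeps renders from different sources matching once they reach comp.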

Getting high-quality results straight from Unreal helped the group quickly block out the sequence and figure out the geography between shots, while also giving editorial something to cut with. The opening shot combined a code reveal into the real-time Unreal environment with a descent toward the digital dojo, which was built to closely match the practical set piece using Lidar scans and texture photography. It is here that the practically shot Neo is shown standing.

Morpheus was a full-hero digi-double that DNeg had to match and blend between two takes of a practical performance, then simulate and transition between two digital costumes to create the final effects. Throughout the sequence, the artists completed digital head replacements to remove stunt performers, digital set extensions, and set repair to ensure continuity with the damaged sections. This all culminated with the dojo being blasted apart, which was crafted so that it could break and splinter in an art-directable manner while still feeling grounded in reality.

For an in-depth look at more of DNeg’s work on the film, see the Jan./Feb./March 2022 issue of CGW.