Our Flag Means Death: Venturing into uncharted virtual production territory
Kendra Ruczak
May 26, 2022


Sometimes the greatest journeys are born from the decision to leave everything behind and chart a course for the unknown. This is the path wealthy 18th-century aristocrat Stede Bonnet (Rhys Darby) takes in the hit HBO Max original comedy Our Flag Means Death. Created by David Jenkins and based loosely on a true story, the swashbuckling series follows Bonnet as he abandons his life of privilege to become a pirate and set sail on the high seas. Known as the “Gentleman Pirate”, the woefully inexperienced Captain Bonnet leads a ragtag crew of buccaneers aboard his ship, the Revenge. Executive producer Taika Waititi also stars as Blackbeard, a notoriously fearsome pirate who forges an unexpected bond with Bonnet. 
A virtual voyage
With the show set on a pirate ship surrounded by the ocean, the most significant challenge was creating an affordable yet robust production environment for the cast and crew. Cinematographer and VFX supervisor/producer Sam Nicholson, ASC, developed a virtual production workflow to make principal photography for the series both manageable and cost-effective. Nicholson is also the founder of Stargate Studios, which has provided groundbreaking VFX and production services to hundreds of television series, feature films, and other media since its inception in 1989. In addition to offering compositing, 3D, matte painting, and editorial services, the studio specializes in both computer-generated and photographic plate virtual production.

Virtual production, which has become an increasingly effective alternative to on-location shooting or traditional green screen work, relies on what can be delivered on set in real-time. The goal of virtual production is to recreate reality. Through a combination of well-planned off-axis projection, interactive lighting, top-notch LED display operation, and clever cinematography, the illusion of reality can be achieved. If it feels real to the cast and crew, it will elevate the content and ultimately feel real to the viewer.
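One of those ingredients, interactive lighting, comes down to driving practical fixtures from the same imagery shown on the wall. The sketch below is a minimal, hypothetical illustration of that idea in Python: it averages the plate pixels nearest a fixture and turns them into a rough dimmer and color value. The function and region values are invented for illustration; the production system described later in this piece uses dedicated pixel-mapped lighting control rather than anything this simple.

```python
import numpy as np

def region_to_light(frame: np.ndarray, region: tuple) -> dict:
    """Average the plate pixels nearest a practical fixture and map them to a
    rough dimmer level and color. `frame` is an HxWx3 array in 0..1 linear
    light; `region` is a (row_slice, col_slice) patch of the wall image.
    A conceptual sketch only, not the show's pixel-mapped lighting system."""
    patch = frame[region]                       # pixels "seen" by the fixture
    rgb = patch.reshape(-1, 3).mean(axis=0)     # average linear RGB
    return {"dimmer": float(rgb.mean()), "rgb": rgb.round(3).tolist()}

# Example: a synthetic warm "sunset" plate drives a key light near the deck.
frame = np.dstack([np.full((1080, 1920), c) for c in (0.8, 0.4, 0.2)])
print(region_to_light(frame, (slice(0, 540), slice(0, 960))))
```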


Courtesy of Stargate Studios

Nicholson knew virtual production would be ideal for Our Flag Means Death after completing the HBO comedy thriller series Run (2020), which takes place almost entirely in a virtual train environment. “It's an extremely challenging concept to do an entire pirate series on the water, on a sound stage with no water,” Nicholson explains. “HBO is a very forward-thinking studio. We had just completed Run, which was an entire series on a train where the train doesn't go anywhere. That was great, and it worked very well in virtual production. So we carried that forward. The HBO execs called and said, ‘Can you do this? Is it possible?’” 

The 10-episode season would require a 14-week shoot on a sound stage with limited time for pre-production. “It’s a half-hour comedy. It's not a hundred-million-dollar movie. So first of all, it has to make economic sense,” Nicholson notes. “We only had six weeks of pre-production. I shot a test very early on before the show was greenlit when I first heard about it, which gave everybody the confidence that we could do it.”

Nicholson was very excited to collaborate with Waititi, who helmed the creative direction of the series. “The opportunity to work with a creative genius like Taika doesn't come along very often. He just has a fabulous vision and once you're on board that ship, he trusts everybody to do their thing and bring their A-game,” he recalls. “It's an opportunity for me to take my whole team and say, ‘This is going to be an incredible project; let's figure out how to do it.’”



Testing the waters
In addition to creating the virtual ocean environment, Nicholson and his team were also tasked with managing the project’s on-set playback. This required them to capture ultra-high-resolution plate imagery that would be stitched together and displayed on a massive 30-foot by 165-foot LED wall surrounding a practical pirate ship set. The team began conducting extensive virtual production tests in their facility well in advance, working with smaller screens, existing footage, computer-generated ships, and other tools to establish a production and data management pipeline that would be reliable when scaled up on set. This included shooting plates locally and testing grain, resolution, curvature of the screen, ambient bounce light, and any other factors that would influence production.

For virtual production, it’s essential to have ample time during pre-production to shoot high-quality plates, design the set to properly fit the stage and LED volume parameters, integrate the lighting, and test the look and colorimetry of the screen versus the cameras. All of these aspects play into the finished look, which will be created live on set. “We scaled that test up and started looking at the reality of going to Puerto Rico to shoot all the plates,” Nicholson recalls. “We decided on practically shooting water, particularly in landscapes, because we did not have the time or the money to render everything in CG. And by the way, why do it if you can get on a plane and go get it?”


Courtesy of Stargate Studios

Assembling equipment
Carefully selecting equipment that could handle the project’s intense virtual production workflow was a vital step in the process. Nicholson and his team decided to utilize a 360-degree rig composed of multiple Blackmagic Pocket Cinema Camera 6K Pros mounted at 45-degree increments, plus five URSA Mini Pro 12K cameras for a total horizontal resolution of 60K. “We decided on the Blackmagic 12K cameras and the 6K cameras because we could afford to put five or six or nine of them together in different configurations,” Nicholson explains. He finds that the Blackmagic Design ecosystem fills a critical niche in high-resolution imaging, especially for VFX, thanks to a fast, intuitive user interface paired with robust dynamic range and color reproduction. “The data itself, the Blackmagic Raw [BRAW], is highly efficient and very high quality. Our target was a 20K screen in real-time that would be full resolution edge to edge,” Nicholson continues. His team also decided to use Blackmagic products for the project’s complete data management system. This included DaVinci Resolve Studio, an ATEM Constellation 8K switcher to control the multiple camera arrays, the HyperDeck Extreme 8K HDR, multiple DeckLink 8K Pro units to distribute the data that feeds the Unreal Engine during production, and more.
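As a rough sanity check on those figures, five 12K sensors laid side by side come out at roughly 61,000 pixels across, which is where the quoted 60K comes from, and roughly three times the stated 20K real-time playback target. A quick back-of-the-envelope calculation, assuming the nominal 12,288-pixel width of a 12K frame and ignoring stitching overlap:

```python
# Back-of-the-envelope check of the stated figures. The camera width is the
# nominal horizontal resolution of a 12K frame; the other numbers come from
# the article, and stitching overlap is ignored for simplicity.
CAM_WIDTH_PX = 12_288        # one URSA Mini Pro 12K frame
NUM_12K_CAMERAS = 5
WALL_TARGET_PX = 20_000      # stated real-time playback target

array_width = NUM_12K_CAMERAS * CAM_WIDTH_PX
oversample = array_width / WALL_TARGET_PX

print(f"combined horizontal resolution: {array_width:,} px (~60K)")
print(f"oversampling vs. the 20K wall target: {oversample:.1f}x")
```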



Capturing plates
Six weeks before principal photography began, Nicholson and his team traveled to Puerto Rico to capture plates for the project. Using specialized camera array rigs, they shot a wide range of ocean footage across different times of day and weather conditions: day, night, magic hour, inclement weather, and more. This allowed them to build a diverse library of high-resolution footage for the show. The dependability of the camera arrays was vital to completing this stage of the project. “When you're on a boat swinging around in the middle of the ocean and the sun is going down and you're five miles offshore, half your crew is sick, you're shorthanded, and then here comes the rain…the last thing you want to do is worry about the technology,” Nicholson recalls.

With such a large volume of footage captured from multiple cameras, building an efficient data management pipeline was key. “We had to shoot very long takes—five minutes times nine cameras or five 12Ks. So each camera had a four-terabyte SanDisk SSD attached to the top of it so that we could rig the cameras and have 20 terabytes of memory without having to reload,” Nicholson explains. The team’s workflow was designed to maximize the efficiency of the data offloading process. “Don't forget that after a 14-hour shoot day, when you get back to the land, you have to go back to some hotel room and download 20 terabytes of data a day, and it has to be ready for the next morning.” The data would also need to be backed up, which amounted to a massive 40-terabyte overnight transfer after each shoot day.
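To put that nightly transfer in perspective, the sketch below estimates the copy time for 20 terabytes plus a full backup at an assumed sustained throughput. The 2 GB/s figure is purely illustrative; the real rate depends on the drives, interfaces, and how many copies run in parallel.

```python
# Rough cost, in hours, of the nightly offload plus backup. The sustained
# throughput is an assumed illustrative figure; real rates depend on the
# drives, interfaces, and how many copies run in parallel.
TB = 1e12                   # decimal terabyte, in bytes
daily_footage_tb = 20       # stated per-day capture
copies = 2                  # working copy + backup, i.e. ~40 TB moved
throughput_gb_s = 2.0       # assumed sustained transfer rate, GB/s

hours = daily_footage_tb * TB * copies / (throughput_gb_s * 1e9) / 3600
print(f"{daily_footage_tb * copies} TB at {throughput_gb_s} GB/s ≈ {hours:.1f} hours")
```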


Courtesy of Stargate Studios

Processing and stitching
After completing the plate photography in Puerto Rico, Nicholson and his team transported the footage back to Stargate Studios. When the ingest and transcoding process was completed, they created a nine-way split to allow all of the cameras to be viewed simultaneously. “You can't look at 12K, but Resolve allows us to play back in real-time and pull a proxy that people can look at very rapidly,” Nicholson notes. “That all has to be done before you start shooting because you have to get it in front of the director and the directors of photography, and they have to make selects so you can up-res it again and get it back on the wall. So it's a unique pipeline.”
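The arithmetic behind that nine-way split is straightforward: to tile nine camera streams onto a single review display, each stream has to be downscaled by roughly an order of magnitude. The sketch below assumes a UHD review monitor and a 3x3 grid, both illustrative choices rather than details confirmed by the production:

```python
# How far each stream must be downscaled so a 3x3 grid of all nine camera
# angles fits on one review display. The UHD monitor and the 3x3 layout are
# illustrative assumptions; the source size is the nominal 12K frame.
SOURCE_W, SOURCE_H = 12_288, 6_480     # 12K frame
MONITOR_W, MONITOR_H = 3_840, 2_160    # assumed UHD review monitor
GRID = 3                               # 3 x 3 tiles for nine cameras

tile_w, tile_h = MONITOR_W // GRID, MONITOR_H // GRID
scale = min(tile_w / SOURCE_W, tile_h / SOURCE_H)
print(f"each proxy tile: {tile_w}x{tile_h} px (~{scale:.3f}x of the source)")
```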

The team stitched the footage into a single 60K composite, then split it into 8K quadrants that were processed through the DeckLink 8K Pro and Resolve pipelines. An anamorphic squeeze was applied in processing and de-squeezed on set, which gave the image the parallax of a ship while keeping it as steady as if it had been shot from land. Nicholson was impressed with the workability of the Blackmagic files: though high-resolution and high dynamic range, they remained manageable and well suited to a virtual production pipeline.
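A minimal sketch of that packaging step, assuming a 2:1 horizontal squeeze (the factor actually used on the show is not stated) and using OpenCV purely for resizing: slice the stitched master into 8K-wide sections and squeeze each one to a smaller transport width, leaving playback to de-squeeze it on the wall.

```python
import numpy as np
import cv2  # OpenCV, used here only for resizing

def squeeze_into_tiles(master: np.ndarray, tile_w: int = 7_680,
                       squeeze: float = 2.0) -> list:
    """Slice a stitched master frame into tile_w-wide sections and apply a
    horizontal anamorphic squeeze so each section fits a smaller transport
    resolution; playback de-squeezes it on the wall. The 2:1 squeeze factor
    is an assumption for illustration, not the value used on the show."""
    h = master.shape[0]
    tiles = []
    for x in range(0, master.shape[1], tile_w):
        section = master[:, x:x + tile_w]
        out_w = max(1, round(section.shape[1] / squeeze))
        tiles.append(cv2.resize(section, (out_w, h), interpolation=cv2.INTER_AREA))
    return tiles

# Stand-in "master" frame; real stitched frames are on the order of 60K wide.
master = np.zeros((1080, 30_720, 3), dtype=np.uint8)
print([t.shape for t in squeeze_into_tiles(master)])
```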


Courtesy of Stargate Studios

Setting sail on set
Principal photography for the series was conducted on a large sound stage on the Warner Bros. lot in Burbank, California. During pre-production, Nicholson worked closely with production designer Ra Vincent to build a virtual volume that would accommodate the show’s main set piece, a full-sized pirate ship. Built with a flexible modular design, the ship could split into three segments that were wheeled in and out to create different ships. This allowed the set piece to function not only as the primary pirate ship, the Revenge, but also as English and French warships and other vessels. Nicholson collaborated with Sweetwater to design a curved LED volume composed of approximately 3,000 panels that would be an ideal fit for the space.

Blackmagic hardware is an essential component of ThruView, Stargate Studios’ proprietary virtual production system. “ThruView is a combination of volumetric tracking; on-set, very high-resolution playback (in this case, 20K); color timing and everything that the Resolves give us; and integrated pixel track lighting, with all the lights on the set being controlled through the playback system,” Nicholson explains. The wall was divided into six 8K sections that made up a full-resolution LED volume. This setup required six synchronized Resolve systems for full color timing and focus control, six HyperDeck Extreme 8K HDR units, and essentially six of each component required for playback, plus two backups of each. “With a system that complex, your failure percentage goes way up,” Nicholson adds. “You have six times more wiring, six times more potential for disruption or failure. So the dependability of each component has to be bulletproof, and you can't afford to have any variance in the engineering of the products themselves. If they act a little bit differently, something's going to fail.”
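The “failure percentage goes way up” point is simple probability: with six independent playback chains, the chance that at least one misbehaves on a given day is roughly six times the per-chain rate, and over a long schedule a completely clean run becomes unlikely without bulletproof components and spares. A worked example with assumed numbers (the 1% per-chain daily failure rate and the 70 shoot days are illustrative guesses, not production figures):

```python
# Why six parallel playback chains push the failure rate up, and why spares
# matter. The 1% per-chain daily failure probability and the 70 shoot days
# (14 weeks of 5-day weeks) are assumed illustrative values.
p_chain = 0.01
chains = 6
shoot_days = 70

p_any_failure_today = 1 - (1 - p_chain) ** chains
p_clean_run = (1 - p_any_failure_today) ** shoot_days

print(f"chance at least one chain fails on a given day: {p_any_failure_today:.1%}")
print(f"chance of a fully clean {shoot_days}-day run: {p_clean_run:.1%}")
```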



With such a high potential for system failure, Nicholson was very grateful for the support he received from the Blackmagic team throughout the process. “They were fantastic to support this effort because I said, ‘We're going to try to do this,’ and they said, ‘Great, how can we help?’ And I really think that's what makes a company like Blackmagic so unique. They are highly responsive to the people who are using the hardware in very innovative ways,” he notes.

Matching the image perspectives with the curvature of the LED volume required a complex coordination process. “We had 14 computers running 14 Nvidia A6000 graphics cards on that wall,” Nicholson explains. “Once you have all these systems coordinated, it goes into the Unreal Engine to do the off-axis display because the cameras are tracked. We're on a 165-foot curved J-shaped wall. It's got a 180-degree curve at one end and then it goes straight. It's a very unique shape of a wall, and all of the images have to be mapped onto it and displayed based on the correct perspective. Or you wind up with a horizon that bends, and horizons don't bend.” Even with the multitude of challenges and the complexity of the system, Nicholson was thrilled with the overall performance of the setup. “I'm very happy that I can honestly say in 14 weeks we didn't have a minute of downtime because of the virtual production,” he recalls.
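The “horizons don't bend” requirement is what off-axis (asymmetric-frustum) projection solves: the projection is recomputed every frame from the tracked camera position relative to each wall section, so straight lines in the plate stay straight on the wall. The sketch below is the standard generalized perspective projection for a single flat wall section, written in Python with NumPy; the corner and eye coordinates are made-up example values, and the production Unreal Engine setup additionally handles the wall's curvature, color pipeline, and tracking latency.

```python
import numpy as np

def off_axis_projection(pa, pb, pc, eye, near, far):
    """Asymmetric-frustum projection for a flat screen defined by three
    corners (pa = lower-left, pb = lower-right, pc = upper-left) and a
    tracked camera position `eye`. This is the standard generalized
    perspective projection; the view-matrix half (rotating the screen axes
    into place and translating by -eye) is omitted for brevity."""
    pa, pb, pc, eye = (np.asarray(v, dtype=float) for v in (pa, pb, pc, eye))
    vr = pb - pa; vr /= np.linalg.norm(vr)             # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)             # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)    # screen normal

    va, vb, vc = pa - eye, pb - eye, pc - eye          # eye-to-corner vectors
    d = -np.dot(va, vn)                                # eye-to-screen distance
    l = np.dot(vr, va) * near / d                      # frustum extents at
    r = np.dot(vr, vb) * near / d                      # the near plane
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    return np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

# Example: a 10 m x 3 m flat wall section with the camera 4 m back and
# 1.5 m off-center -- made-up numbers purely for illustration.
P = off_axis_projection(pa=[-5, 0, 0], pb=[5, 0, 0], pc=[-5, 3, 0],
                        eye=[1.5, 1.7, 4.0], near=0.1, far=100.0)
print(np.round(P, 3))
```

In practice the matrix would be rebuilt every frame as the tracking system reports a new camera position, which is what keeps the perspective, and the horizon, locked as the camera moves.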



Invisible immersion
The success of the project’s virtual production workflow was especially impressive considering that the visual effects were not meant to be noticed. “This is a half-hour comedy. This is not a Marvel movie. The visual effects or the virtual production are supposed to be invisible,” Nicholson notes. “You're not supposed to sit there and go, ‘Oh my gosh, what a great visual effect.’ You're supposed to believe you're on the ocean or in a harbor.”

Nicholson and his team collaborated closely with visual effects supervisor David Van Dyke throughout the production. “We continually had discussions about what we could achieve in real-time and what would be better left for traditional visual effects,” he recalls. “One of the most important questions that every production should ask if you're thinking about virtual production is ‘What should we not do in virtual production?’” During production, the stakes are high, and anything that goes wrong on set can be disastrous. This is why it is vital to focus on what actually needs to be done virtually and to exclude anything that could cause unnecessary issues on set and incur huge costs as a result. “I'm not a gymnast in Cirque du Soleil, but it would be like falling off the trapeze in front of everybody at a big show,” Nicholson notes. “It's a high-risk thing, so you have to be incredibly well-rehearsed. This is live performance, so you have to be very confident in the tools that you're using and what you can achieve. And again, that comes back to relying on the technology.”

Not only does the technology need to function seamlessly, but it also needs to be intuitive and easy to operate. “The user interface has to be pretty seamless,” Nicholson explains. “Then an operator can learn that and become very fluid in controlling the hardware and making it seem really easy. It's like throwing away the sheet music for a pianist. The last thing you want to see is an artist on set taking out a manual, trying to figure something out. That is not an option.”



The future of virtual production
According to Nicholson, the future of virtual production will require greater control to manage even greater demands. The ability to make adjustments seamlessly in real-time is key to the success of future virtual productions. “It’s going to get more complex; things aren't getting simpler,” Nicholson notes. “A director will come back and say, ‘I really like that sequence, but can you make it shorter?’ or ‘Can you make it faster?’ or ‘Can you change the clouds?’ Now we're talking about visual mixing in real-time.” He envisions a real-time visual mixing console, analogous to an audio console, to control the visual effects for virtual production. “The Unreal Engine is all about that. It's rendering things in real-time,” he continues. “It’s all a matter of data and processing. That's why our relationships with Nvidia, Unreal and Epic, and Blackmagic are so important to us, because it's the intersection of all these technologies where the magic is really happening.”

Virtual production allows Nicholson and his team to keep looking forward and going beyond what they originally thought was possible. “We are now designing impossible projects,” he shares. “But if you really understand the technology and you have good communication with the hardware manufacturers and the software, you know where they're going to be in six months. So, we're designing with tools that don't even exist yet. It comes back to the dependability of futureproofing your idea. If you're going beyond what everybody else is doing, you need to be somehow getting access to tools that they may not have.”



Virtual production takes the technical and artistic elements of visual effects work and adds the element of real-time performance into the equation. “If you say, ‘What is virtual production?’ It's what you can achieve in real-time,” Nicholson explains. “It doesn't matter how you get there. It doesn't matter whether it's laser projection, OLED, LED, big camera, little camera, a television set that you buy at Best Buy, or 5,000 panels…It's what you can achieve in real-time for your show.”
Nicholson is optimistic about where virtual production is headed and about the possibilities of blending photographic and rendered worlds. “It's a very bright future,” he notes. “Virtual production is not just what you can render, it's what you can see. It doesn't matter whether it's 2D, 3D, or 2.5D; it doesn't matter whether it's 3DoF [three degrees of freedom] or 6DoF [six degrees of freedom] or how you're tracking, or anything. It’s different for each production and you can start very small and scale it up to something massive.”

Nicholson’s advice for those interested in the field of virtual production is to go beyond what they think is possible and to never stop educating themselves. “If you're pushing the limits and you bring fresh ideas to a company like Blackmagic or many of these companies, they need the creative input. They need users who are doing unique things with their products. So first of all, learn about their products, learn them inside and out,” he adds. “It's an exciting future, and it's just getting better.”

Kendra Ruczak is the Managing Editor of CGW.