AWS SIGGRAPH Keynote Looks to the Past to Uncover the Future of Content Production
August 11, 2021

With more than a year of almost fully virtual events to learn from, organizers and companies are reimagining the trade show keynote experience. Fittingly, the same technology that helped keep productions rolling throughout the pandemic – safely and remotely – is now also being used to deliver more thoughtful and dynamic presentations.
“Through the looking glass: the next reality for content production,” an Amazon Web Services (AWS) Featured Speaker Session from SIGGRAPH 2021, provides both visual spectacle and thoughtful introspection, making use of the latest virtual production techniques and best practices. Hosted by Eric Iverson, CTO for AWS Media & Entertainment, the keynote pulls together leaders from Netflix, Weta Digital, Company 3, AWS, Epic Games, and Amazon Studios, and lays the foundation for today’s computer graphics technology with a look at past inflection points.

During the session intro, Iverson’s digital backdrop changes to reflect the advancements of each decade, before he embarks on a stroll through a leafy park that is in fact recreated on a virtual production stage. Captured on the XR Stage at the Line 204 facility in Pacoima, CA, the presentation was made possible by ICVR and Brendan Bennet Productions. Directed by Scott Kelley, with ETC Head of Adaptive and Virtual Production Erik Weaver as Executive Producer, it was filmed with a three-camera setup, an unconventional approach considering most virtual productions employ a single camera. Epic Games’ Unreal Engine was used for real-time rendering, and the dynamic digital park environment was created by ICVR, which leveraged a range of AWS services during production. Using AWS, ICVR was able to iterate on the environment, with a remote team working across time zones to ensure the visuals closely matched the physical set, which included foliage, a bench, and dirt flooring.

Charting the Beginnings of CG
Iverson kicks off the presentation by highlighting developments that have helped democratize CG production. The brief tour through history serves to remind viewers how significantly the industry has progressed in a few short decades, with each milestone achievement accelerating technological advancement. Evaluating the past brings the possibilities of the present and future into clearer focus.

Along with bell-bottoms and disco, the 1970s gave rise to the CG industry, including the launch of the SIGGRAPH conference, the creation of the first 3D animated hand, and Atari’s release of “Pong.” The decade also marked the release of “Star Wars,” inspiring a new generation of creatives. In the 1980s, CG experiences continued to advance, alongside the releases of the Nintendo Entertainment System and Apple’s Macintosh personal computer. Progress continued into the 1990s with the founding of Epic Games, the launch of Adobe Photoshop, and the debuts of “Terminator 2” and Pixar’s “Toy Story.”

As the millennium rolled in, so did AWS with a vision for cloud-based computing. Personal computers with meaningful compute capabilities became more widely accessible, and filmmakers began to use advanced production techniques, such as real-time rendering on set for “Avatar” and artificial intelligence in “Lord of the Rings.” In the 2010s, cloud-based computing became more widely accepted, with studios offloading burst render workloads as shot complexity and volume increased. Untold Studios became the first fully cloud-based creative studio in 2018, blazing the trail for other studios to follow suit. Marvel’s 2019 billion-dollar blockbuster “Avengers: Endgame” alone featured 2.5 times more VFX shots than “Iron Man,” which was released in 2008. Other ground-breaking projects include Disney’s “The Jungle Book” and “The Lion King,” and the Disney streaming series “The Mandalorian,” which showcased the latest iteration of virtual production technology powered by game engines.

Examining Modern Content Production
In 2020, at the onset of COVID-19, the industry found itself yet again at a tipping point in its evolution. Content creators had to rethink how to plan, shoot, and edit projects, and in some cases, adopt technology they either hadn’t considered or had planned to adopt later on. Fortunately, the necessary tools were production-ready, and many studios found that integrating the cloud, whether via hybrid or all-in scenarios, allowed them to not only survive but also capitalize on unprecedented opportunities.

The rise of remote work

Although some creative studios were choosing to move their production pipelines to the cloud pre-pandemic, it was rare for a VFX or animation studio to operate remotely. Due to infrastructure constraints or reluctance to alter decades-long practices, studios preferred artists to be physically present to work on projects. This meant that studios were limited to recruiting talent based on geographic location. Now, newer studios are increasingly opting for a more agile approach to infrastructure, while established facilities are rethinking their hardware roadmaps.

With successful implementations of remote production and cloud-based collaboration prompted by the pandemic, many studios’ preference for on-premises hardware has begun to fade. Cloud-based remote production workflows provide access to a larger, more accessible talent pool, allowing studios to benefit from more diverse skillsets and perspectives. In turn, artists are able to choose the projects they want to work on without uprooting their families. At the same time, cloud-based pipelines can also be deployed in-studio, providing companies with more flexibility in how they work and eliminating the costly upfront expenditures that come with purchasing and managing on-premises compute resources.

Using the cloud, Netflix has found remote work incredibly beneficial, allowing its teams to tell more authentic stories and its creatives to be more imaginative. Global production is here to stay, according to Laura Teclemariam, Director of Product, Animation at Netflix, and companies that don’t lean in will be left behind. Netflix connects with artists globally using its NetFX cloud-based pipeline, which provides the infrastructure needed to create Netflix-caliber projects. Netflix Director of Engineering Rahul Dani, who oversees NetFX, noted that the pipeline has been deployed on 35 productions spanning four continents.

Meeting an ever-increasing demand for VFX and animation
As the content quality bar continues to rise, the demand from streaming services and studios for compelling visuals is outstripping the pace at which creators can deliver. To ramp up capabilities, Weta Digital is taking a two-pronged approach: empowering more artists and harnessing effectively limitless compute power. Alongside a shift from on-premises resources to AWS, a move announced in late 2020, Weta also develops tools that increase creativity by decreasing the time needed for work that can be automated. Weta Digital CEO Prem Akkaraju referenced the studio’s proprietary tree growth software that procedurally generates geographically accurate forests, which was used on the “Planet of the Apes” films. He also touched on the role of advanced digital humans in storytelling, from Gollum to the 23-year-old version of Will Smith featured in “Gemini Man,” and how other studios can leverage Weta tools via WetaM, a cloud-based software-as-a-service offering recently announced in partnership with Autodesk.

Transparent infrastructure and ‘compute from anywhere’
Company 3 SVP of Technology Robert Keske began investigating cloud-based workflows in 2011 as a cost management solution. He noted that a CapEx approach to infrastructure had previously hindered the company’s growth, whereas using AWS allows it to scale from a few users to hundreds of users and thousands of render nodes, virtually at a moment’s notice. He also stressed the importance of making infrastructure as transparent as possible to artists, and of having a ‘compute from anywhere’ approach that brings the resources to the talent, instead of the other way around.

Lightbulb moments and cultivating diverse voices

This artist-first mindset also inspired Rex Grignon to develop a solution that helps animation and VFX studios move more of their workloads to the cloud. The idea for Nimble Studio was sparked when Grignon accessed his office computer from a training room, calling it a ‘lightbulb moment.’ Once he realized that a powerful workstation doesn’t need to be located under your desk to do work, he started learning about the cloud and the pieces came together. As Director of Amazon Nimble Studio Go to Market at AWS, Grignon now works to ensure that the service provides a frictionless yet customizable experience for all artists. That inclusive spirit also informs AWS’ support for important industry organizations promoting open source and diversity, including the Academy Software Foundation (ASWF), the Blender Foundation, and Women in Animation, of which AWS recently became a corporate member. AWS CG Supervisor Hayley Kannall noted her excitement for the new partnership, first announced during the keynote, citing the importance of having women and other underrepresented communities in decision-making roles.

The Future of Content Production
Offering insights on virtual production from differing yet complementary perspectives, Amazon Studios’ Ken Nakada, who serves as Virtual Production Supervisor for Prime Video, and Epic Games Industry Manager David Morin found significant common ground in terms of the impact and anticipated direction of the technology.

Noting the contributions of virtual production pioneers James Cameron and Robert Zemeckis, Morin emphasized that the game-changing capabilities of real-time CG tools have made virtual production a more accessible tool for filmmakers, not just those with multi-million-dollar budgets.

Morin explained how essentially every department on a production can use virtual tools to augment what it is doing in the real world. Looking to the future, Nakada emphasized the importance of making virtual production technology more intuitive in order to reach a broader set of filmmakers, predicting significant progress will happen in the next year.

Technology in Action
The shoot’s three-camera setup included an ARRI Alexa Mini and two Sony Venice cameras. With 270 degrees of displayed imagery across more than 35 million pixels, the stage featured ROE Visual BP 2.8mm LED panels in a curved configuration for the main display, and Planar Systems LED panels for ceiling imagery, with LED video processing handled by Brompton 4K Tessera LED processors. Motion tracking was achieved with Stype. AWS services used by ICVR to create the background imagery include Amazon Simple Storage Service (Amazon S3), Amazon Elastic Compute Cloud (Amazon EC2), and Amazon Route 53. ICVR also used AWS Fargate serverless compute and Amazon Cognito to manage its proprietary RendezVu interactive 3D world app.

The keynote can be viewed on-demand via the SIGGRAPH website until October 29, 2021. For free registration and access, visit: