As one year draws to a close and another begins, we’ll be picking the most relevant trends and putting them under a microscope to see whether our predictions came true, where each trend is at now, and what the future holds for them in the visual effects industry.
So without further ado…
We’ve written a lot on locational VR this year, like the unexpected ways in which it could end up in your home and how it powered up London’s first VR arcade.
The buzz surrounding it persisted as 2019 drew to a close, with household-name brands lending their IP to locational VR experiences, such as Marvel’s collaboration with The Void, which is testament to an enduring public interest and appetite.
This same experience was well-received upon its release in October, with a review from UploadVR praising its use of the Marvel Cinematic Universe: “What really makes Avengers: Damage Control stand-out above other experiences from The VOID I’ve tried and even other [location-based entertainment] experiences in general is just how connected and in-tune with the rest of the MCU it felt. This really did make me feel like I was stepping inside of the cinematic universe rather than just playing a quick vertical slice of mostly unrelated content.”
This is just one example of how Location-based Entertainment (LBE) has fulfilled its promise, and our prediction, that in 2019 LBE would become increasingly tied to film releases as a way of letting fans engage with the cinematic universes they’ve come to know and love.
Elsewhere, in Five of the biggest trends from FMX 2019, we touched on “a novel technique that sees rollercoasters augmented with virtual reality headsets [give] a glimpse into a potential future of theme park attractions.”
This came to fruition thanks to VR Coasters, and you can read all about the amazing things they’ve done with locational VR here.
As unusual use cases like these become more pervasive in everyday life, the added exposure this gives to high-quality VR experiences might be enough to make a positive cultural shift toward people trying, and accepting, VR as the next standard in household media and entertainment.
After all, how will consumers accept that VR is the future of technology and entertainment—espoused time and
again—if they never have a chance to try it for themselves?
So, has VR found its home outside of the home, thanks to LBE? It certainly seems that way, and 2020 may see it settle in and get extra comfortable in its own defined space. Safe to say, it’s not going anywhere soon.
Foundry predicted: “We expect to see a sharp rise in the breadth of application of deep learning.”
And the answer? Deep learning itself provided it, with a resounding “yes.”
Earlier this year, Adobe and Autodesk both announced machine-learning features in their products, and Foundry announced the open-sourcing of our deep learning platform, ML Server.
As part of this, we’ve open-sourced the tools we use for our own deep learning experiments, so our customers can try the same things we’re trying. You can read all about it here.
And with deep learning promising to become even more pervasive as we move into 2020, machine learning has become a key focus for our R&D efforts in the new year.
There are still challenges to crack when it comes to deep learning—namely, building a bridge between artist and algorithm, and streamlining the deployment of machine learning tools in VFX pipelines—but with the rising adoption of ML by the VFX industry, and increasing R&D investment in its application and usage in artists’ pipelines, it might be that solutions to these come sooner rather than later.
Crossing the uncanny valley was a main aim in 2019 for visual effects companies working on digital humans and AI techniques. As part of this, Foundry expected “digital humans to become ever more life-like—to the point where they're indistinguishable from the real thing.”
So has this aim been realized?
It certainly seems close—especially if judging by Weta Digital’s recent visual effects work on Gemini Man, where Will Smith was digitally re-created by the VFX studio to hearken back to his Fresh Prince days.
The aim: to finally cross the uncanny valley by making man and machine indistinguishable, through a painstaking process that took nearly two years and demanded unprecedented craft and artistry.
The results speak for themselves. Will Smith’s 23-year-old digital double, dubbed “Junior” by Weta Digital and its VFX artists, is an amazingly life-like re-creation—right down to the nuances of blood flow and breathing.
It’s also a testament to the artistry needed to create high-quality digital humans, especially at a time when inexpensive and accessible deepfake technology is giving rise to quick-and-easy, “subpar” results.
This same technology is also behind concerns raised recently in the news surrounding the implications of fakery in the 2020 US elections and beyond.
Digital humans and deepfakes are becoming harder to detect as AI software develops and the uncanny valley narrows. With this come ethical and moral concerns: if anyone can be recreated, and made to say and do anything by their creator, what can be trusted, and should the technology be made so readily available?
But whilst the human eye has the shrewdness to tell real from fake, even down to an imperceptible, unconscious sense that something isn’t quite right, the technology built to do the same hasn’t kept pace with the speed at which AI, digital human and deepfake software is developing, especially as we wrap up 2019.
The challenge in 2020 then becomes one of technology playing catch-up to a beast of its own making, so identification software can confidently tell an imposter from the real thing—and politicians and celebs can breathe a little easier.
As VFX is increasingly used in feature films, virtual production has arisen as a way of filming that allows the director to see any CG elements mixed with live action, so they know the shot will look and work as intended.
Inroads are continually being made into this space to scale with improving VFX capabilities, culminating most recently in the production of The Lion King, which was filmed in a purely virtual environment.
It’s the first example of a blockbuster hit being brought to our screens through a completely virtual set, and if its success is anything to go by, then we might see other pioneering studios follow suit with virtual production in 2020 to make award-winning films.
The only challenge tempering widespread adoption of virtual production is the expense and expertise required to make it really work. Each show—The Lion King being no exception—is typically set up by specialists. By the time these specialists move on to their next project, they want to build it again, but better, reducing any “plug-and-play” element of virtual production. It’s not one-size-fits-all; they’re learning as they go.
Nonetheless, virtual production, whilst still niche, remains an important approach in bringing decision-making forward in the production process. It’s this usefulness that might see it adopted wholesale by big-budget studios and projects in 2020.
The amount Foundry has written on open standards is testament to its emergence as one of 2019’s hottest trends.
Open-source technology continues to play a huge role in powering collaboration and interoperability across the industry. Foundry’s own efforts to improve our tools and software make extensive use of USD, such as our Advanced Viewport Technology. This runs as a Hydra render delegate inside Katana to display USD assets with their UsdPreviewSurface shader descriptions, for a consistent visual experience across multiple digital content creation applications.
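For context, a UsdPreviewSurface description is just a small shader definition living inside a USD layer, which any Hydra-based viewport can interpret the same way. A minimal sketch of what such a layer looks like (the asset name and values here are purely illustrative, not taken from any Foundry product):

```usda
#usda 1.0

def Material "PreviewMaterial"
{
    token outputs:surface.connect = </PreviewMaterial/Shader.outputs:surface>

    def Shader "Shader"
    {
        # "UsdPreviewSurface" is the standard preview shader id
        uniform token info:id = "UsdPreviewSurface"
        color3f inputs:diffuseColor = (0.8, 0.2, 0.2)
        float inputs:roughness = 0.4
        float inputs:metallic = 0.0
        token outputs:surface
    }
}
```

Because the schema is standardized, this one description renders consistently whether the asset is opened in Katana, usdview, or any other application with a conforming render delegate.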
Alongside this, both Nuke and Katana rely on open standards like OpenEXR and OpenColorIO to improve artist efficiency and creativity. In fact, most industry-standard post-production tools now do.
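These standards are concrete: OpenColorIO, for instance, exists so that every tool in a pipeline applies color transforms identically. As a hedged illustration of the kind of transform such a config encodes (this is the standard sRGB transfer function from IEC 61966-2-1, not Foundry’s or OpenColorIO’s implementation):

```python
def srgb_to_linear(s: float) -> float:
    """Convert an sRGB-encoded value in [0, 1] to scene-linear light,
    using the piecewise sRGB transfer function."""
    if s <= 0.04045:
        return s / 12.92                      # linear segment near black
    return ((s + 0.055) / 1.055) ** 2.4       # power-law segment elsewhere

# Mid-grey in sRGB encoding (0.5) is only about 0.214 in linear light,
# which is why compositing in the wrong space visibly shifts brightness.
print(round(srgb_to_linear(0.5), 3))  # prints 0.214
```

If two applications disagree on even this one curve, the same EXR file looks different in each; centralizing such transforms in a shared OCIO config is what keeps viewers and renderers in sync.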
The benefits of open source tech are undeniable, yet there are concerns that its rapid rate of development means that it’s “running before it can walk.”
Each new standard brings a huge amount of excitement and promise, but real-world adoption and integration are hard, and it takes a lot of effort to iron out the kinks.
The bottom line is: Best buckle up. Open standards will continue to develop at pace in 2020, and the industry should prepare for the unbridled potential—and pain points—that this will bring.