In a 'Fortnite'
Brian Pohl
Issue: Volume 41, Issue 2 (Edition 2, 2018)

In July 2017, Epic released Fortnite, a video game where players save survivors of a worldwide cataclysmic storm, building fortifications and constructing weapons from scavenged materials to fight off enemies. In late 2016, in preparation for the game's release, Epic Games began production on a three-minute cinematic trailer for Fortnite.

Traditionally, game trailers don't use in-game, real-time graphics because the quality just isn't there. Instead, trailers have been rendered like short films using the same production workflow CG artists have used for decades, involving long wait times between pressing the render button and seeing the result.

For the Fortnite trailer, Epic's goal was to create a short movie with the same level of quality expected in a trailer, but rendered in real time with Unreal Engine. Furthermore, real-time rendering would be an integral part of the pipeline from start to finish, including set design, character animation, and special effects. With real-time rendering, they'd be able to collaborate, review, and iterate a lot faster than with the traditional render-and-wait approach.

This was a tall order, but Epic had a point to make beyond showing off what Unreal Engine can do: they wanted to show how real-time rendering gives more freedom in the creative process. Here, we'll go over some of the highlights.

Rough Layout

One of the first steps in an animated short film is a rough layout of character motion and camera placement within the CG set. This usually starts with the creation of low-resolution sets and characters, but Epic had the advantage of existing game assets they could use for the task.

To get the rough layout for character animation, Epic used a process they call "first unit previs," combining filmmaking terms for the principal photography team (first unit) and methods for visualizing complex action prior to actual filming (previsualization, or previs). First unit previs combines motion capture with real-time rendering.

The Fortnite trailer's script included action and dialog from four main characters. For the first unit previs session, actors performed the script, and their actions were captured in multiple long mocap takes. Each take's mocap was immediately applied to a rigged character within the environment so the performances could be reviewed, and any necessary re-takes could be ordered right then and there.

These performances were analyzed to determine which ones best served the trailer. The best long takes became the basis of the interactive rough layout, where the director and others could review, re-capture, and swap out motions until they were satisfied with the result.

Because the director and cinematographer were free to conceptualize the movie firsthand, the result was a more naturally cinematic approach to the rough layout process.

Set and Character Models

The in-game set and character models were improved for the trailer, increasing both poly count and texture resolution. Epic used a variety of DCC software, such as Autodesk's 3ds Max and Maya, Pixologic's ZBrush, and Foundry's Modo for modeling.

As camera placements came through from first unit previs, set models were optimized for real-time playback by removing backs of buildings and anything else that wasn't visible to the camera.

ALL FACIAL PERFORMANCE WAS MANUALLY KEYFRAMED IN MAYA.

Character models got a poly count upgrade to work with the new animation. For the characters' upgraded heads, Epic implemented a single, unified facial topology across all four characters so the facial rigs could be kept consistent and animation could be shared among them.

Each character model had about 185,000 triangles.

For body rigs, Epic used Unreal Engine's Animation & Rigging Toolset (ART), a full suite of animation tools that operates as a Maya plug-in. Animation of ART rigs is easy to pop into Unreal Engine for real-time review.

Facial animation posed a different challenge for the Fortnite team. The final facial rig was a combination of 201 joints, lattices, and blendshapes. At the time of the Fortnite trailer's production, FBX format was able to export up to eight influences per vertex, but the team needed more than that to get the facial performance they wanted. Alembic exports baked vertex positions, so the team used as many influences and deformers as necessary within Maya before exporting the facial animation to Unreal via Alembic cache.
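As a rough illustration of that hand-off, a baked facial mesh can be written out of Maya with the AbcExport command; the mesh path, frame range, and output file below are placeholders rather than Epic's actual production settings.

# Minimal sketch of exporting a baked facial mesh from Maya as an Alembic cache.
# The mesh path, frame range, and output file are placeholders.
import maya.cmds as cmds

cmds.loadPlugin("AbcExport", quiet=True)   # make sure the Alembic exporter is loaded

job = (
    "-frameRange 1 4320 "          # roughly three minutes at 24 fps
    "-uvWrite -worldSpace "        # bake world-space vertex positions per frame
    "-root |character_head_GEO "   # the deformed facial mesh (hypothetical name)
    "-file D:/fortnite_trailer/cache/character_face.abc"
)
cmds.AbcExport(j=job)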

The facial rig provided on-surface controllers for animators to modify the facial performance as needed, as opposed to a virtual blendshape control board. All facial performance was manually keyframed in Maya.

When an animation is exported via Alembic, the ABC files store the position of every vertex on every frame. For a model with thousands of vertex positions on each frame in a three-minute animation, the amount of data can add up very quickly. Analyzing and playing back such a large volume of data in real time isn't feasible.
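A quick back-of-envelope calculation shows why. The vertex count below is an assumption for illustration, not a figure from the production:

# Rough estimate of raw Alembic-style position data for one cached mesh.
# The vertex count is an assumed, illustrative figure.
vertices = 30_000              # assumed vertices in the cached mesh
fps, seconds = 24, 180         # roughly a three-minute piece
bytes_per_frame = vertices * 3 * 4            # x, y, z as 32-bit floats
total = bytes_per_frame * fps * seconds
print(total / 1e9)             # about 1.6 GB of uncompressed positions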

Instead, Unreal Engine imports and stores Alembic animation data using principal component analysis (PCA) compression. During import, the process distills the enormous amount of vertex-based animation data to a smaller, more manageable set of morph targets while keeping the fidelity of the original animation.

During playback, Unreal Engine loads the new set of morph target data into memory and blends them per-frame in real time. In this way, the Fortnite facial animation exported via Alembic format could be played back in real time in Unreal Engine.
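The idea behind that compression and playback can be sketched in a few lines of NumPy; Unreal's actual importer differs in its details, and the basis count here is arbitrary:

# Illustrative sketch of PCA-compressing a vertex cache into morph-target-like
# bases, then rebuilding a frame as a weighted blend. Not Unreal's implementation.
import numpy as np

def compress(frames, num_bases=32):
    # frames: (num_frames, num_vertices * 3) array of baked vertex positions
    mean = frames.mean(axis=0)
    centered = frames - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    bases = vt[:num_bases]            # principal components = morph-target shapes
    weights = centered @ bases.T      # per-frame blend weights
    return mean, bases, weights

def playback(mean, bases, weights, frame):
    # Per-frame reconstruction: the mean shape plus a weighted blend of the bases,
    # which is what a morph-target blend does at runtime.
    return mean + weights[frame] @ bases

With a few dozen bases, the runtime only has to store and blend a small set of vertex deltas instead of thousands of full frames.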

REAL-TIME VOLUMETRIC RENDERING KEPT THE EFFECTS LIVE AND ADJUSTABLE IN UE.

AnimDynamics, another UE feature, was also used to enhance the characters' secondary animation detail, such as hair and clothing movements, all in real time.

Pulling It All Together

While modeling, rigging, and animation for each asset in the Fortnite trailer were done in a DCC package, rough and final layout of each scene was done in Unreal Engine. This approach differs from a traditional CG pipeline, where scene assembly happens within the DCC package itself. By finalizing each asset separately within a DCC package and aggregating them in Unreal Engine, the team was able to work on characters and sets in parallel.

Transfer of data between DCCs and Unreal Engine was accomplished with either of two file formats: FBX, for transfer of models and editable/animatable rigs to the Unreal Engine ART, or Alembic, for transfer of complex animation such as baked morph targets.

With all these files going back and forth into Unreal Engine, the team needed an efficient way to track who was working on the latest version of each file.

In traditional pipelines, data tends to be stored across multiple file servers in various locations with different naming conventions, which can require carefully designed directory structures and scripts to point to the proper file. There is usually a "home" directory of sorts, but it can be confusing.

For the Fortnite trailer, Epic kept things organized by using a centralized Perforce Software depot. Using the source control mechanisms within Unreal, a user can check out and check in assets as needed for modifications, much like checking out a book from a library. The checkout process temporarily locks the asset against changes by other users, though they can still access the existing file for read-only purposes. Once users complete their changes, the files are uploaded to the depot through Unreal's source control tools and the UnrealGameSync (UGS) utility (a customized wrapper that controls Perforce). UGS then records and annotates all the modifications into a change list that other users can synchronize across the entire project.
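The editor integration and UGS hide these steps, but the underlying operations map roughly onto standard Perforce commands; the depot path and change description below are placeholders:

# Sketch of the sync / check-out / submit cycle that Unreal's source control
# integration and UGS wrap. Paths and the change description are placeholders.
import subprocess

def p4(*args):
    subprocess.run(["p4", *args], check=True)

p4("sync")                                                # pull the latest change list
p4("edit", "//depot/FortniteTrailer/Maps/Shot010.umap")   # check the asset out
# ... modify the asset in the Unreal editor ...
p4("submit", "-d", "Adjusted lighting in shot 010")       # check the change back in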

A FORTNITE TRAILER SCENE.

A MOTION-CAPTURE SESSION.

Shots were composed and reviewed as level sequences within the Unreal Engine Sequencer, a non-linear editing and animation sequencing tool. Within Sequencer, scene elements such as characters, props, lighting, and special effects can be animated, layered together, and reviewed sequentially.

Post Processing and Special Effects

Traditional animation pipelines rely on renderfarms to calculate visual effects and post-processing enhancements as separate render passes that must be composited in a dedicated compositing program such as Adobe's After Effects or Foundry's Nuke. While the use of such programs has its place, Unreal Engine eliminates a number of these steps by providing those capabilities within the engine.

The game's smoke-like storms and enemy deaths required special attention. In Fortnite, storms roll in as purple fog. Enemies always emerge from the storms themselves, and when killed, they evaporate into purple smoke. For the Fortnite trailer, real-time volumetric rendering methods were used to keep all the effects live and adjustable in Unreal Engine, rather than importing them from an off-line package.

Some shots called for cloud effects to evoke the feel of the impending storm but didn't require the custom shapes needed by enemies and storms. For such shots, the Fortnite team made use of Unreal Engine's built-in Volumetric Fog.

Storm clouds called for curling, flowing shapes, which were animated using 3D Flowmaps, a type of animated texture in Unreal Engine. The Flowmaps were hand-painted in virtual reality using custom Unreal Engine tools, which proved a more natural experience than painting with a mouse or stylus.
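The article doesn't spell out how the painted Flowmaps drive the clouds, but the general flowmap technique looks something like the sketch below: sample a flow vector, advect the lookup coordinate along it, and cross-fade two half-offset phases so the distortion can reset without popping. This is a generic illustration, not Epic's material:

# Generic flowmap-style animation sketch (not Epic's actual material). 'noise' and
# 'flow' are any callables returning a density value and a flow vector for a point.
def flowmap_sample(noise, flow, pos, time, cycle=1.0, strength=0.1):
    phase0 = (time / cycle) % 1.0
    phase1 = (phase0 + 0.5) % 1.0
    v = flow(pos)                                    # painted flow direction
    sample0 = noise(pos - v * phase0 * strength)     # advected lookup, phase A
    sample1 = noise(pos - v * phase1 * strength)     # advected lookup, phase B
    blend = abs(phase0 - 0.5) * 2.0                  # triangle-wave cross-fade
    return sample0 * (1.0 - blend) + sample1 * blend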

For enemy deaths, the Fortnite team used a fully real-time fluid simulation. To start off the smoke for each fluid simulation, the monsters' skeletal meshes were converted into emitters. Both the monsters' motion just prior to death and normals from the skeletal meshes contributed to the fluid's motion.
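The production setup isn't described in more detail, but the seeding step it implies can be sketched on a simple grid: each surface vertex adds smoke, and its velocity plus surface normal push the fluid. Grid resolution and strengths here are arbitrary:

# Schematic of seeding a grid smoke sim from a skeletal mesh, as described above.
# density: (X, Y, Z) array; velocity: (X, Y, Z, 3) array. Values are illustrative.
import numpy as np

def inject_from_mesh(density, velocity, verts, vert_velocities, normals,
                     cell_size=0.1, normal_push=0.5):
    for p, v, n in zip(verts, vert_velocities, normals):
        i, j, k = (np.asarray(p) / cell_size).astype(int)   # nearest cell (no bounds check)
        density[i, j, k] += 1.0                             # emit smoke at the surface
        velocity[i, j, k] += v + n * normal_push            # carried motion plus outward push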

The team implemented a number of clever tactics to improve playback speed, such as forgoing multiple light bounces on the smoke volumes and using blurring instead. Because the action is so fast, the visual difference was negligible.

By using a color instead of a scalar value for shadow density, the team made light hitting a volume change color with shadow depth, giving the illusion of depth while allowing a wider range of artistic effects.
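A toy example of the idea: with a per-channel extinction instead of a single scalar, transmitted light shifts hue as it travels deeper into the volume. The values are arbitrary:

# Toy illustration of colored shadow density: per-channel Beer-Lambert falloff
# makes transmitted light change hue with depth. All values are arbitrary.
import numpy as np

light = np.array([1.0, 1.0, 1.0])          # white light entering the volume
sigma_rgb = np.array([0.3, 0.9, 0.2])      # colored density: green is absorbed fastest

for depth in (0.5, 1.0, 2.0, 4.0):
    transmitted = light * np.exp(-sigma_rgb * depth)
    print(depth, transmitted.round(3))      # drifts toward purple as depth increases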

REAL-TIME PLAYBACK OF ALEMBIC FACIAL MOTION IN UNREAL ENGINE.

Each sim was controlled via the Sequencer. This allowed for very rapid iterations when compared with the wait times incurred by traditional off-line rendered simulations.

Where It's Going

Unreal Engine gave Epic the opportunity to push boundaries by showing that film-quality simulations are possible in real time. Creating all these scenes and visual effects in real time allowed the Fortnite team to achieve a quality and integration that would not have been possible with a traditional render-and-wait workflow.

By using real-time rendering from start to finish on the Fortnite trailer, Epic wanted to inspire filmmakers with a new way of working. Unreal Engine releases filmmakers from the shackles of long render times, making real-time creativity and collaboration possible.

Brian Pohl is Epic's M&E technical account manager for North America.