A Boy's Life
Issue: Volume 38 Issue 4: (Jul/Aug 2015)

For years now, the industry has been waiting for the day when the great divide between film and game creation ceases to exist. Unquestionably, the two worlds have been inching closer and closer during the past decade. Game artists and feature-film animators/VFX artists have crossed the void to work on projects “on the other side.” Many of the same tools are being used, as are many of the same techniques.

As close as the industry has come, true convergence has not occurred – until now.

Just recently, a group at Epic Games, led by CTO Kim Libreri, merged these two worlds with the stunning digital short film “A Boy and His Kite.” The production is epic in scale, spanning 100 square miles of terrain, and the environments are breathtaking, filled with photorealistic flora and fauna. The scenery transitions from a nice, easy landscape to one that is much harder to negotiate. There are meadows, a river valley, a forest, a lake, a valley ridge, and a cave. And the main character, a pre-teen boy, looks as if he stepped out of a high-end feature film.

The two-minute film – which debuted at the Game Developers Conference this past spring – follows the young boy as he chases his beloved kite after it breaks free from its tether. Dashing over the picturesque landscape, he attempts to retrieve his possession, finally catching up to it at the mouth of a mysterious cave high up in the mountains.

The story also mirrors the struggle to unite the game and film worlds, as technologists and artists have long pursued tools and techniques that would finally bridge – and even merge – the two genres.

Yet, the film is so very much more than a great story and its underlying message. Aesthetically, the imagery looks as if it was generated on high-end machines running the latest content creation software and employing state-of-the-art techniques. And in truth, it was. However, at the heart of the process is Epic’s Unreal Engine 4. And those techniques? Well, they were fine-tuned and optimized for the game engine. Oh, and one more thing: The digital short is presented in real time, running at 30 frames per second (fps) on Unreal Engine 4.

“WE WANTED TO DO SOMETHING BEAUTIFUL FOR BEAUTY’S SAKE AND SOMETHING
EMOTIONAL THAT WOULD RESONATE WITH PEOPLE IN GENERAL. AND DO IT WITH AN ENGINE ANYONE CAN USE.”

NEW DIRECTIONS

“A Boy and His Kite” began as a project to showcase the capabilities of the new Unreal Engine 4.8, but resulted in a touching, beautiful digital short in its own right. Indeed, Epic is well known for creating trailers that show off the latest features and capabilities within its newly released version of Unreal Engine. Mostly these have consisted of sci-fi, first-person shooter imagery – explosions and mass destruction – within feature-rich environments. Not this time.

“We have done dark, monster-y cinematic demos in the past, which were not so much stories as they were showcases of new features or technical improvements in the engine. This time we wanted something different. The engine is capable of so much more than dark surfaces, explosions, and gunshots. It’s very much a blank canvas for people to paint on. You can do anything you can think of within it,” says Gavin Moran, director of the short as well as an animator and storyboard artist at Epic. “Kim’s idea was to make a short film that looks like a CG short and would blow people away (albeit in a different sense). He wanted it to star a human, and we were not going to hide anything. We were going to push the engine as far as we could.”
One of the improvements to the new Unreal Engine is its ability to visualize large-scale environments. But how do you avoid becoming overwhelmed by the challenge of creating 100 square miles of open, outdoor terrain – and of building tools and technology that everyone can understand and that will deliver the desired results on the other side?

Insofar as the extensive terrain was concerned, the group knew it wanted an outdoor environment reminiscent of Scotland. “And, we wanted to do something beautiful for beauty’s sake,” says Libreri, “and something emotional that would resonate with people in general. And do it with an engine anyone can use. We planned a large set with lots of fauna and flora, lots of animated characters appearing on screen at the same time. We wanted to show the vastness and the flowers and trees that were procedurally generated – assets that are beautiful.”

The objective was twofold: to entertain and to showcase the engine and its capabilities. Well, threefold actually. “We wanted to inspire people as creators to tell a story with this new technology, as opposed to the laborious, classic process of film production in which you are unable to see things evolve right in front of your eyes,” Libreri says. “It is quite a liberating process [using the game engine]. I thought, let’s show the classic media industry what they can do with real-time technology today.”

The digital short was created in less than nine weeks, from concept to completion – an amazing feat with a crew of approximately 15 people. Of course, this did not include the team working on the new game engine technology.

INSIDE THE WORLD

As Libreri points out, features such as dynamic lighting that can vary according to time of day, and a large-scale mountainous terrain with varied, realistic foliage, are very difficult to do in-engine, but new systems within Unreal Engine 4 make those tasks easier to achieve.


Epic’s “A Boy and His Kite” is a technology demo as well as a digital short film running in real time at 30 fps.

To this end, the terrain in “A Boy and His Kite” consists of all geometry – no matte paintings, notes François Antoine, lead technical artist, who assumed the role of asset supervisor on the project. The world has creatures that live and gather in the areas [of the environment] where they should. It is entirely lit with dynamic lighting, contains ray-traced shadows, and features full direct and indirect global illumination, and the lighting can be changed to reflect any time of the day or night. “Nothing is baked in,” he says.

The film features cinematic depth of field and motion blur. Moreover, there are trees, rocks, and vegetation everywhere – PBR (physically based rendering) photo-modeled assets placed procedurally with rules.

According to Libreri, one of the biggest challenges of the project was the large-scale environment creation. “We had many people working on the open world – the LOD management, the procedural foliage and forestation,” he says.

An intelligent procedural process within Unreal Engine handled the foliage. As a result, vegetation does not grow well in areas of the canyon where there is not a lot of light.

Says Antoine, “We used a new rendering technology – distance field ambient occlusion,” which traces cones against mesh distance fields to determine the sky visibility at each point and, thus, how dark it should be.
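
The core idea is straightforward to sketch. The following is a minimal, illustrative C++ sketch of cone-traced occlusion against a signed distance field – not Epic’s implementation; the `sdf` and `coneDirection` functions stand in for the engine’s mesh distance fields and sample patterns:

```cpp
// Minimal sketch of distance-field ambient occlusion (not Epic's code):
// march a few cones toward the sky and darken the point wherever nearby
// geometry intrudes into a cone.

#include <algorithm>

struct Vec3 { float x, y, z; };

Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

float sdf(Vec3 p);                        // distance to nearest surface (assumed)
Vec3  coneDirection(int i, Vec3 normal);  // i-th cone axis over the hemisphere (assumed)

// Returns sky visibility in [0,1]: 1 = fully open sky, 0 = fully occluded.
float distanceFieldAO(Vec3 p, Vec3 normal, int numCones = 9)
{
    const float coneHalfAngleTan = 0.35f;  // cone spread
    const float maxDistance      = 10.0f;  // how far occluders can matter
    const int   maxSteps         = 8;

    float visibility = 0.0f;
    for (int i = 0; i < numCones; ++i) {
        Vec3  dir     = coneDirection(i, normal);
        float coneVis = 1.0f;
        float t       = 0.5f;              // start slightly off the surface
        for (int s = 0; s < maxSteps && t < maxDistance; ++s) {
            float coneRadius = t * coneHalfAngleTan;
            float d          = sdf(add(p, scale(dir, t)));
            // If the nearest surface pokes into the cone at this distance,
            // the cone is partially blocked.
            coneVis = std::min(coneVis, std::clamp(d / coneRadius, 0.0f, 1.0f));
            t += std::max(d, 0.1f);        // sphere-trace-style stepping
        }
        visibility += coneVis;
    }
    return visibility / numCones;
}
```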
The engine, in fact, enables a plethora of cutting-edge achievements. It is used as a sandbox, with various tools and capabilities available – a physics system, animation system, lighting system, layout system, terrain management system, AI, camera-accurate depth of field, motion blur similar to what is used in visual effects, new global illumination techniques for bouncing light off the ground, and more. And now these functions are available in Version 4.8 of the engine.

GENERATING ASSETS

According to Libreri, the theme for the terrain in “A Boy and His Kite” is from the Isle of Skye in Scotland. Because the crew was working on a reduced schedule, they began researching the area virtually, using Google Earth. Then they started breaking down the components: cliff faces, large rocks, vegetation, small rocks, medium rocks, terrain rocks, ground dirt, and so forth. As it was December, the heavy rains and early darkness made an asset shoot there impractical, so a two-person team instead headed to New Zealand, where the vegetation was lush and green and the landscape similarly varied and enticing.

The photographs were used in two ways: as reference and as photogrammetry sources for 3D meshes and 2D ground textures. For the latter, the group used Canon 5D Mark IIIs with Canon 24-70mm lenses, acquiring full panoramic pictures as well. “When we found an asset we wanted to capture, we started with a reference picture in its natural setting. Then we used a chrome and gray ball setup on a rig to capture the light intensity, and got a 360-degree HDR panorama,” explains Antoine.

This was no quick feat – they shot 30,000 photos in seven days to cover 250 assets totaling 1TB of data. A team of 10 artists across three Epic sites then reconstructed the data. “We were sending a massive amount of data across the network for each asset – between 7GB and 45GB,” says Antoine, noting that a new process was required to facilitate the transfers.

Depending on their preference, modelers used either The Foundry’s Modo or Autodesk’s Maya/3ds Max to generate the film models. Lighting, layout, camera work, effects, and AI all occurred within the engine.

“It’s very different from a visual effects process,” says Moran. “It’s all about getting [the film] to run at 30 fps. It’s not like a visual effects pipeline, where things get better every day, day after day, until you finish. Here, you might come up with a solution for grass. You put it in, and it does not run at the speed you need. So, although it might look beautiful, you have to throw it out and start again, since you don’t have the performance you need.”
For the digital short, the group processed approximately 50 of the 250 assets acquired, with the goal of eventually completing them all. These are high-poly assets, which needed to be turned into game assets: low poly, with three levels of detail (LOD), cleaned textures, and a capped version of the mesh so the asset can be placed anywhere in the world.

THE ENVIRONMENT

“The world is 10 times bigger than a Skyrim level,” says Antoine of “A Boy and His Kite.” Ryan Brucks, a senior technical artist at Epic, created the realistic environments using Stephen Schmitt’s World Machine for realistic 3D terrain generation, then imported the result into the game engine, where the crew used Unreal Engine tools to break up the imagery into streaming levels with the continuous LOD loading system. The artists shaped the landscape with textured weight maps, using masks to blend together the different types of landscaping.



The short illustrates many complex techniques, including full-scene HDR reflections and photometric sampling of real-world data, within 100 square miles of terrain.

The team employed two different systems for the terrain. One was for generating the foliage (trees and rocks), which used collision and had a mix of manual/procedural placement tools, enabling the artists to move a tree from here to there, for instance. “That is the difference between live-action and CG. If you don’t like the basic layout and want to change the distribution of flowers in the meadow, you can change the numbers and get something completely different,” explains Libreri.
Initially, the artists were going to re-create the rocks using vector displacements or even tessellation, but instead set up a proprietary process. Now, the rocks comprise geometry, normal maps, and meshes.

The other system (for grass, flowers, and pebbles) did not use collision and supported material-driven placement. “You cannot render a billion little rocks, so they are really a texture,” explains Libreri. “One of our lead artists took the rocks that François [Antoine] generated and hand-dropped them and sprinkled them like he was sprinkling colored [sugar] on a cake. And they would scatter with a physics system. Then, we used the engine to generate normal or displacement maps, to produce the impression of rock.”

While they are two different systems, both use the same rendering technology offered by the engine.

“For the foliage, we could render a million instances of things,” says James Golding, lead programmer at Epic. The objects ranged from 32 polys for a billboard-like object to a hero object with 115,000 polys. To improve the low-poly object’s aesthetics, the artists could tweak the information; also, because there is depth information, the shadow is perceived like a real tree.
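
That 32-poly-to-115,000-poly range implies a level-of-detail ladder driven by how large an instance appears on screen. Below is a hedged C++ sketch of that selection logic; the top and bottom triangle counts echo the figures above, but the middle counts, the screen-size thresholds, and the `fovScale` term are illustrative assumptions, not Epic’s values:

```cpp
// Hedged sketch of screen-size-driven LOD selection (not engine code).

struct LODLevel { int triangles; float minScreenSize; };

// Ordered from most to least detailed: hero mesh down to a 32-poly billboard.
const LODLevel kTreeLODs[] = {
    {115000, 0.30f},   // hero mesh when the tree fills >30% of screen height
    { 12000, 0.10f},   // mid LODs (counts assumed)
    {  1500, 0.03f},
    {    32, 0.00f},   // billboard-like imposter for everything farther away
};

// boundingRadius and distance in world units; fovScale converts the
// radius/distance ratio into an approximate fraction of screen height.
int pickLOD(float boundingRadius, float distance, float fovScale)
{
    float screenSize = (boundingRadius / distance) * fovScale;
    for (int i = 0; i < 3; ++i)
        if (screenSize >= kTreeLODs[i].minScreenSize)
            return i;
    return 3;  // smallest on screen: use the billboard
}
```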

The artists wanted a natural look for the foliage placement, with clustering, self-thinning, and growth curves (mature and younger plants). They also wanted to iterate fast and intervene when they wanted to alter the landscape. As a result, they came up with an ecosystem to simulate nature, with species and density settings as well as the ability to age and seed plants. The system also places rocks, bushes, and so forth.

“We have rules to determine which species is the best fit for an area relative to size, shade, altitude, and slope, for a nice wooded environment,” says Golding.

The simulation takes a long time to run, so the team optimized it, then did touch-ups with painting tools, editing the results as they would with other types of foliage.
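
Golding’s description suggests a simulation loop of roughly the following shape. This is a speculative C++ sketch, not Epic’s code: the terrain queries (`altitudeAt`, `slopeAt`, `shadeAt`) and the helpers (`growthCurve`, `randomOffset`, `overlapsLargerPlant`) are assumed stand-ins for the engine’s landscape data and spatial queries:

```cpp
// Speculative sketch of one ecosystem step. Each species scores candidate
// spots by altitude, slope, and shade; seedlings that sprout under a
// larger plant's canopy are self-thinned away.

#include <cmath>
#include <vector>

struct Species {
    float minAltitude, maxAltitude;  // meters
    float maxSlope;                  // degrees
    float shadeTolerance;            // 0 = needs full sun, 1 = thrives in shade
    float spreadRadius;              // how far seeds land from the parent
};

struct Plant { int species; float x, y, age, radius; };

// Assumed queries against the landscape and foliage data.
float altitudeAt(float x, float y);
float slopeAt(float x, float y);
float shadeAt(float x, float y);     // 0..1, sky occlusion
float growthCurve(const Species& s, float age);
float randomOffset(float maxRadius);
bool  overlapsLargerPlant(const Plant& seedling, const std::vector<Plant>& all);

// How well a species fits a spot; 0 means it cannot grow there.
float fitness(const Species& s, float x, float y)
{
    float alt = altitudeAt(x, y);
    if (alt < s.minAltitude || alt > s.maxAltitude) return 0.0f;
    if (slopeAt(x, y) > s.maxSlope) return 0.0f;
    // Shade-intolerant species score poorly under the canopy, and vice versa.
    return 1.0f - std::abs(shadeAt(x, y) - s.shadeTolerance);
}

void simulateYear(std::vector<Plant>& plants, const std::vector<Species>& table)
{
    std::vector<Plant> seedlings;
    for (Plant& p : plants) {
        p.age += 1.0f;
        p.radius = growthCurve(table[p.species], p.age);
        // Scatter a seed near the parent; keep it only if the spot fits.
        float sx = p.x + randomOffset(table[p.species].spreadRadius);
        float sy = p.y + randomOffset(table[p.species].spreadRadius);
        if (fitness(table[p.species], sx, sy) > 0.5f)
            seedlings.push_back({p.species, sx, sy, 0.0f, 0.0f});
    }
    // Self-thinning: seedlings crowded out by established plants die.
    for (const Plant& s : seedlings)
        if (!overlapsLargerPlant(s, plants))
            plants.push_back(s);
}
```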

In all, the level contains 193,000 rocks, 226,000 trees, 1,170,000 bushes, 6,358,000 flowers, and 7,879,000 tufts of grass (generated in the foliage simulation system only, which excludes what was created with the grass system).
To place the grass, the artists found that using painted weight maps did not work well. Instead, they used the material graph to define the grass density, then rasterized the terrain on the CPU to find instance locations. The new global illumination technique bounces light off the ground, so if there is green grass, it bounces green light back up into the trees.
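
A hedged sketch of what that CPU rasterization step might look like: walk the terrain in a jittered grid, evaluate a density function standing in for the material graph output, and keep an instance whenever a random draw falls under the local density. `grassDensityAt` is an assumed placeholder, not engine API:

```cpp
// Illustrative material-driven grass placement (not Epic's code).

#include <random>
#include <vector>

struct Instance { float x, y, rotation, scale; };

float grassDensityAt(float x, float y);   // 0..1, assumed material evaluation

std::vector<Instance> placeGrass(float width, float height, float cellSize,
                                 unsigned seed = 1234)
{
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);

    std::vector<Instance> out;
    for (float y = 0; y < height; y += cellSize) {
        for (float x = 0; x < width; x += cellSize) {
            // Jitter inside the cell so the grid never shows.
            float px = x + uni(rng) * cellSize;
            float py = y + uni(rng) * cellSize;
            if (uni(rng) < grassDensityAt(px, py))
                out.push_back({px, py,
                               uni(rng) * 6.2831853f,     // random yaw
                               0.8f + uni(rng) * 0.4f});  // slight scale variance
        }
    }
    return out;
}
```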

The sky, meanwhile, was made using traditional visual effects, for the most part. Technical artists could animate between multiple versions of the sky that changed for time of day. “We are basically swapping out textures,” says Libreri.
Illustrating the passing of time was a bit tricky, however. Standard cinematic language, such as the cross-dissolve, wasn’t available to the artists for technical performance reasons (they would have had to render twice as much data on an already maxed-out system), so they had to come up with different cinematic approaches to show this effect.

ANIMAL LIFE

“Now we had this beautiful, lush world, but it was pretty static,” says Golding. So, the crew added herds of red deer, which are found in Scotland. “We were learning how our dynamic systems scale in large worlds, and we were pushed to do things we hadn’t done before.” 

In this regard, they animated and added AI to the deer in the same way they did the human character. Pathfinding kept the animals from running off a cliff, and crowd simulation prevented the fauna from colliding with each other and with objects such as trees. Navigation data helped them get from point A to point B and avoid obstacles.

Because the navigation mesh would have been too large to calculate for the entire world – it would have required approximately 5GB of memory – the team added a feature to the engine that let them generate tiles on demand, so they could spread the work across multiple cores fairly quickly. The AI gave the deer a goal, and the state machine determined the best animations for the deer to use. A spawning feature divided the world into cells and determined the best location within each cell, based on rules, for the animals to spawn automatically as a person navigates the world (which can be done in the slower walking mode or faster drone mode).
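
On-demand tile generation of this kind maps naturally onto one job per tile. The sketch below is an illustrative C++ approximation, not Epic’s implementation; `buildTile` stands in for the expensive voxelize-and-polygonize work, and the futures-based scheduling is simply one way to spread tiles across cores:

```cpp
// Illustrative on-demand navmesh tiles: only tiles near the player are
// built, each as an independent job that can run on its own core.

#include <future>
#include <map>

struct TileKey {
    int x, y;
    bool operator<(const TileKey& o) const {
        return x < o.x || (x == o.x && y < o.y);
    }
};

struct NavTile { /* polygons, off-mesh links, ... */ };

NavTile buildTile(TileKey key);  // expensive voxelize + polygonize (assumed)

class NavMeshCache {
    std::map<TileKey, std::shared_future<NavTile>> tiles_;
public:
    // Kick off builds for every missing tile within `radius` of the player.
    void request(TileKey center, int radius) {
        for (int dy = -radius; dy <= radius; ++dy)
            for (int dx = -radius; dx <= radius; ++dx) {
                TileKey k{center.x + dx, center.y + dy};
                if (!tiles_.count(k))
                    tiles_[k] = std::async(std::launch::async, buildTile, k).share();
            }
    }
    // Pathfinding blocks only if it needs a tile that is still building.
    const NavTile& tile(TileKey k) { return tiles_.at(k).get(); }
};
```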

Assisting Libreri’s team with the deer AI were a handful of animators from the game group. Ray Arnett and Jay Hosfelt provided walks, runs, trots, idle cycles, and multi-directional bolts, which provided the AI with plenty of choices in regard to how to react to a given situation.


The world contains 15 million pieces of vegetation and an ecosystem that simulates nature.

BUILDING A BOY

The central character in the film is a young boy, unique territory for a company whose bread and butter has been muscle-bound future soldiers and monsters. Before the animators can pose and move the boy, digital muscles and bones have to be put into the static model, allowing it to deform and move – which can be a time-consuming process.

Epic has made a name for itself focusing on tools and pipeline. With the next generation of gaming hardware able to sustain such a large jump in visual fidelity, teams cannot scale to create the content using antiquated pipelines. 

In fact, Epic sometimes has to create multiple digital characters in a single week, and to achieve this, Lead Technical Animator Jeremy Ernst created the Animation and Rigging Toolkit (ART), which automates the rigging process. Every character at Epic is created with this tool set. When the engine was released for free, Epic allowed Ernst to distribute the tools freely online, making it easy for anyone to create an animatable character for their game or film.

As with all next-gen pipelines, no matter what the content, outsourcing must be planned from day one. With so little time, Epic partnered with 3Lateral, a studio that creates high-end digital characters and focuses on rigging high-fidelity faces for game cinematics and films. 3Lateral was able to plug its proprietary facial system directly into Epic’s ART tool set, which made creating an engine-ready boy puppet much easier and allowed 3Lateral to focus on the face.

“We focused on the character’s face early in the process, to be sure the character could meet the performance requirements,” says Moran, who worked with 3Lateral throughout December on the child’s design. By the end of January, 3Lateral provided the team with a rig to animate.



Subsurface scattering can be seen on the boy’s face, which also contains 550 blendshapes.

One of the pillars of the production was to use film pipelines and procedures to create content, not just appropriate or adapt them as game engines have done in the past. Epic wanted to make sure that linear content (film, television) could easily be created with Unreal Engine 4, especially as Epic was about to release it royalty-free for that use.

The boy’s face is a film-quality FACS rig containing more than 550 sculpted blendshapes spread across eight different facial meshes (eyebrows, eyelids, and so forth). “Representing facial animation with 550 blendshapes is quite difficult. Our animation team had to work hard to get the performance down,” says Libreri.

By the time the Epic group returned from holiday break in early January, it had an approved head and facial design, enabling 3Lateral to begin working on the final body. Until this point, a mannequin representing the boy’s proportions was being used to allow all departments to move forward with production. The final boy mesh would be built to the existing proportions and skeleton of the mannequin body, to make sure the update was as seamless as possible.

Motion capture was then obtained using Epic’s Vicon system. “We were in the motion-capture studio early, trying to get as many movements as we could,” says Libreri.

Early on, the artists determined that the boy would be 5 feet tall, an arbitrary height for a 12-year-old child, and based certain decisions on that information. The child actor then performed various motions, such as running and jumping, which were motion-captured and fed into the engine to generate the animatic. Later, the scenes were exported into Maya and given to the animators.

“At that point, we had to determine whether to keep the mocap, throw it away, or stylize it,” says Moran. “Some shots were animated completely by hand in Maya from scratch, while some used the motion-capture as a basis but were highly modified to better suit the scene, and others were purely cleaned-up motion capture.”

The model contains corrective shapes, enabling the riggers to artistically direct the poses, a capability supported by Unreal Engine 4. Also, the engine supports improved skin shading, including custom SSS diffusion profiles.

“Initially, the boy was going to have shorter hair. 3Lateral sent an idea for the design of his hair that was longer all over, and I modified it so he had more hair on the top and it was shorter on the back and sides, to avoid the unnecessary overlap,” says Moran. “Ernst, who is Epic’s rigging lead, added spring bones to the fronds at the front of the haircut, which gave us a cheap and easy simulated hair in real time within the engine and didn’t require any further animation resources.”
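
A spring bone of the sort Moran describes can be captured in a few lines. This is a generic, illustrative sketch rather than the actual rig, with assumed stiffness and damping values: each frond bone chases the position the animated skeleton dictates via a damped spring, so the hair lags and settles naturally with head motion:

```cpp
// Generic spring-bone sketch (illustrative, not the actual rig): the bone
// chases its animated target through a damped spring, so hair fronds lag
// and settle naturally with head motion at no extra animation cost.

struct Vec3 { float x, y, z; };

Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

struct SpringBone {
    Vec3  position;   // simulated bone position, fed back to the skeleton
    Vec3  velocity;
    float stiffness;  // pull toward the animated target (assumed, e.g. ~80)
    float damping;    // velocity retained per step, 0..1 (assumed, e.g. ~0.85)

    // Called once per frame with the position the animation says the
    // bone should occupy.
    void update(const Vec3& target, float dt) {
        Vec3 toTarget = sub(target, position);
        velocity = add(scale(velocity, damping),
                       scale(toTarget, stiffness * dt));
        position = add(position, scale(velocity, dt));
    }
};
```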

The overall look of the character was inspired in part by “Monster House.” He was realistic enough to convey complex emotions, yet gained an aesthetic advantage – and audience acceptance – by staying firmly on the more stylized side of the Uncanny Valley.

“It would’ve been nice to have had more VFX hair on our character, and maybe we will upgrade that over time,” Libreri adds. 

REVOLUTION IN FILMMAKING

Many artists have crossed the bridge from feature films to games, including Libreri and Moran, who previously worked at Escape/ILM and DreamWorks/Imageworks, respectively. For “A Boy and His Kite,” they used their collective skill sets to tell a story, albeit with a unique set of tools and in a unique way.

Want to change the boy’s hair color? No problem, it can be done instantly, and the results can be seen instantly. “There is no renderfarm involved,” Libreri points out. All you need is a good computer to play the demo, fly around, and modify the imagery. Want to see a different angle? Just alter the focal length, camera position, and so forth.

“Imagine a world in the future, where you are watching an animated movie and it can be different every single time you watch it,” Libreri adds. “You can buy clothing, and the audience can change the outfits based on their own personal tastes. The system is living and breathing.”

Libreri points out that the visual fidelity achieved with Unreal Engine is probably the highest available with video gaming technology today. And while there were still some artifacts in “A Boy and His Kite” that would not be present in an animated movie, they are minor – so minor, in fact, that most people don’t realize that it is “playing” in real time.

“It’s not just an animated short made in Unreal Engine. It’s an animated short that runs live at 30 fps,” says Libreri. “The difference is, while you are playing the cinematic, you can pause it at any time and fly around the environment. We can adjust the time of day. We can fly through the clouds and down into the forest. We essentially built the cinematic in a virtual world.”



CG Techniques in ‘A Boy and His Kite’

Dynamic direct/indirect illumination
Cinematic depth of field and motion blur
Procedurally placed trees and foliage
Physically-based rendering
Ray-traced distance-field soft shadows
Full-scene HDR reflections
Distance field ambient occlusion
2-sided foliage shading model
15 million pieces of vegetation
Temporal anti-aliasing
Procedural asset placement (deer)
Dynamic global illumination from height fields
Photometric sampling of real-world data
Subsurface scattering
550 sculpted blendshapes
Everything blended in real time

It’s Libreri’s belief that once people realize the creative empowerment of real-time systems, they will begin to embrace them more. While this new method comes with a unique set of constraints, including the 30 fps target, the advantages are many. “Creativity is about iteration, experimentation, and collaboration. When you can do that in a more intuitive way, you will get better images and story.”

“It really transcends, from a certain point of view, what people think of as a game engine,” says Moran of the demo.

Today, visual effects has reached the point of being synonymous with photorealism, and Libreri is confident that within the next five years, the industry will have crossed the Uncanny Valley (see “Almost Human,” page 50). And real time will be ready for that. 

“Our fidelity level in real time was decades behind pre-rendered imagery not so long ago, but now it is just a few years behind,” Libreri points out. “It is getting better and better. It is now about the workflow and letting the artists get creative, as opposed to simply mimicking nature. I think what you will see, especially with virtual reality, is that storytelling will be told in the real-time domain, so people can be involved.”

Libreri further predicts that on the horizon will be studios producing live-generated content, whether for the PC, Mac, tablet, iPhone, or Android devices, all rendered in the cloud. “We have amazing power capabilities now to render high-end graphics on the fly,” he says, noting that the Epic Games demo could have easily been generated live from Amazon Cloud or Google Cloud, direct to consumers. After all, the tools are readily available – it can be done with Unreal Engine 4.8 running on a home computer.

“[Real time] will open a new world in terms of how animation companies think about their content and how their content evolves over time,” Libreri says. “We want to inspire other artists to explore real-time technology in video games and as a hybrid between the two worlds [of gaming and filmmaking] to tell stories and share experiences.”

While the storyline of “A Boy and His Kite” illustrates life’s simple pleasures, the actual production is a story of revolution, a new way of filmmaking that could change the genre forever.