Science Project
Volume 37, Issue 6 (Nov/Dec 2014)

Disney studios might be known more widely for animation than technical advances, but anyone involved in computer graphics knows that pushing filmmaking science and technology forward has been part of the studio's DNA from its black-and-white beginning. At Walt Disney's studio on Hyperion Avenue, Mickey Mouse made his debut in the 1928 film Steamboat Willie, the first animated short to successfully synchronize sound and images. The Hyperion studio was also the first to use three-strip Technicolor, and the first studio in the US to produce a full-length animated feature, Snow White and the Seven Dwarfs. Inventors there refined the multiplane camera and, much later, replaced it with the digital CAPS system, developed in conjunction with Pixar.

This year, the staff at Disney paid homage to that tradition with its latest animated feature, Big Hero 6, a visually extravagant, action-packed superhero comedy adventure. One superhero, called Baymax, often looks more like a balloon than Iron Man. The others are whiz kids: techie teenagers who are brilliant and funny.

"This movie is a celebration of science," says Don Hall, who directed the film with Chris Williams. "We have a film about characters engaged in cutting-edge science made by people engaged in cutting-edge science."

Based loosely on a Marvel comic of the same name that celebrated Japanese pop culture, Big Hero 6 centers on a 14-year-old supergenius named Hiro, who lives in San Fransokyo, which is, as it sounds, a mash-up of San Francisco and Tokyo.

At the beginning of the film, Hiro wastes his time 'bot fighting. To inspire him to give it up, his older brother, Tadashi, takes him to his university lab and introduces him to a group of friends. Tadashi has invented a Personal Healthcare Companion named Baymax, a plus-size inflatable robot that detects and treats any ailment.

After Tadashi dies, Hiro transforms Baymax into a crime-fighting warrior who, along with Hiro and a group of young geniuses, forms the superhero team Big Hero 6.

In a stunning scene, Baymax ignites his new thrusters and soars over San Fransokyo with Hiro on his back, flying north beyond the Golden Gate Bridge with its pagoda-style stanchions, west to the Pacific Ocean, east over the entire city and above the Oakland Bay Bridge, then swooping down to fly nearly at street level and back up again, dodging the balloon-like wind turbines that drift 1,500 feet above the city.

On any other film, it would have been a painting. On this film, it is an entirely 3D sequence. The details give the movie an immersive feeling unlike any other.

"If you lived in San Francisco, you couldn't find your house," says Technical Supervisor Hank Driskill, "but you could find a house on your lot that is the right size and height."

The city has 260,000 trees, 250,000 streetlights, 100,000 vehicles of 15 types, 83,000 buildings placed according to assessor data, thousands of crowd characters, and tons of details from stairwells to satellite dishes.

"This city is way beyond what we do normally," says Andy Hendrickson, chief technology officer. "It has more geometry than the last three films combined."

How do you handle that much geometry? Light it? Turn it into pictures?

With a new global illumination rendering system named Hyperion.

 "We were calculating 10,500,000,000 rays per frame," Hendrickson says. "That's what makes the scenes in Big Hero 6 special. We had a lot of geometry."

Risky Business

One reason for all that geometry is that the renderer could handle it. "The show changed as work on the renderer progressed," Driskill says. "We started to realize what we could throw at it, what would break it. And our appetite grew. As the movie got bigger and bigger, we wanted to do more and more. Individual artists were saying, 'Look what we can do now.' "

The decision to use Hyperion, however, was an act of faith.


"It was not the safe choice," Hendrickson says. "We weren't sure even a year ago that it would work, so we were living on the edge. We had a world that's awesome and a deadline that was absolute. And, we were always in beta, all the time. But, we get a little bit of a rush putting it altogether."

Director Hall began working on the film in 2011, and in November of that year, Brent Burley approached Hendrickson with his idea for a radically new type of raytracer. Eighteen months later, production began on the film.

"I've been in the industry since 1989," Hendrickson notes. "Brent said, 'We should do it this way,' and all of a sudden I realized, 'Heck, yeah.' "

Until this point, the studio had rendered its films using Pixar's RenderMan software. Hendrickson asked Burley and his team to run tests with various amounts of geometry.

"The tests came back favorably," Hendrickson says. "The next step would be to go for it. I sent a memo out to the company early last year saying we were at a dividing point, a fork in the road where we had to decide whether to use Hyperion or not. The experiments were so favorable; it looked like it could be the new way to work. So I invoked executive privilege. I said, 'We're doing this.' We never looked back. Because we couldn't."

Breakthrough

In June 2013, Burley, along with Christian Eisenacher, Gregory Nichols, and Andrew Selle, had pushed the idea far enough along to publish the paper "Sorted Deferred Shading for Production Path Tracing." It received the Best Paper distinction at the 2013 Eurographics Symposium on Rendering.

They wrote: "Raytraced global illumination is becoming widespread. However, incoherent ray traversal and shading has traditionally limited raytracing to scenes that fit in memory. To combat these issues, we introduce a sorting strategy for large, potentially out-of-core ray batches, and we sort and defer shading of ray hits. As a result, we achieve perfectly coherent shading and texture access, removing the need for a shading cache."

Sean Jenkins, the technical supervisor who, along with Chuck Tappan, helped move the renderer into production, simplifies the description: "Hyperion's power is that it takes all the rays it casts out, sorts them, and then works on all the rays aiming toward one place, like the ceiling. Then, it works on aiming all the rays toward the table. It's a different way of looking at the problem. Brent identified coherence as the main bottleneck, as what would limit us as far as scale. By capitalizing on coherence, everything else got easier. The software doesn't have to swap back and forth between the ceiling and the table and the chairs. Sorting the rays is pretty straightforward. The algorithm we published in the paper hasn't changed a whole lot since then."
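
A minimal sketch of that sorting idea, in Python, with toy data and a hypothetical shade_object_batch callback (Hyperion's real data structures are not public): hits are grouped by the object they struck, and each object's hits are shaded as one coherent batch.

```python
# A toy illustration of sorted deferred shading: collect all ray hits,
# group them by the object they struck, and shade each object's hits
# together. Names here are illustrative, not Hyperion's actual API.
from collections import defaultdict

def shade_hits_coherently(hits, shade_object_batch):
    """hits: list of (object_id, hit_point) pairs from one trace pass."""
    by_object = defaultdict(list)
    for object_id, hit_point in hits:
        by_object[object_id].append(hit_point)
    # Each object's shader and textures are touched once, for all of its
    # hits, instead of being swapped in and out hit by hit.
    for object_id, points in by_object.items():
        shade_object_batch(object_id, points)

# Toy usage: "shading" just reports how many hits each object received.
hits = [("ceiling", (0, 3, 1)), ("table", (1, 1, 0)),
        ("ceiling", (2, 3, 2)), ("chair", (1, 0, 1))]
shade_hits_coherently(hits, lambda obj, pts: print(obj, len(pts), "hits"))
```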

Most raytracing systems send rays out from the camera and calculate the shading for each spot each ray hits. As the rays bounce, the renderer calculates the shading for each hit, and continues doing so through a complex spiderweb of rays bouncing and hitting objects in a scene. Thus, studios often limit raytracers to as few bounces as possible. By contrast, Hyperion shades a fixed set, a batch, of 33 million rays going in roughly the same direction.
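
To see why bounce limits arise, consider a toy version of that conventional recursive approach; the probabilistic one-surface "scene" below merely stands in for real intersection and shading code.

```python
# A toy version of the conventional approach: each camera ray is followed
# recursively, shading at every hit, with a hard cap on bounces. Only the
# depth-limited recursive structure matters here.
import random

MAX_BOUNCES = 3        # studios traditionally keep this small to bound cost
SURFACE_ALBEDO = 0.5   # fraction of light a bounce passes along
LIGHT_EMISSION = 1.0

def trace(depth=0):
    if depth >= MAX_BOUNCES:
        return 0.0                 # cut off: deeper light is simply lost
    if random.random() < 0.2:      # 20% chance this ray hits the light
        return LIGHT_EMISSION
    # Otherwise the ray hits a diffuse surface. Shading spawns a new ray
    # and the renderer immediately recurses, hopping to whatever object
    # that ray strikes next; this incoherent traversal is the bottleneck.
    return SURFACE_ALBEDO * trace(depth + 1)

samples = [trace() for _ in range(100_000)]
print("estimated pixel radiance:", sum(samples) / len(samples))
```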

Burley provides more detail: "We start by sending rays from the camera, generate a few hundred million rays, and organize them into batches of 33 million going to a similar part of the image in a similar direction," he explains. "The renderer records the ray hits and switches to shading mode. So, all the rays that hit a chair, for example, get processed all at once. The process of shading generates new rays and, unless the thing being shaded is a light, the rays bounce more. Those rays are sent to a queue where they are bucketed by direction. There's an open batch for each of six directions. Once the batch is full, we have a ready batch of rays going in the same direction. We trace, shade, trace, shade until there are no more rays. It happens over the course of a few seconds. Because we're sorting rays based on hit-point coherence, they are shading the same object. They have the same shader, the same texture, even a portion of a texture on an object. They do everything they need and are done with it."
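
In code, that trace/shade cycle might be organized like the following schematic sketch, where the batch size, the toy rays, and the termination rule are stand-ins for illustration (Hyperion's real batches hold 33 million rays, not four).

```python
# A schematic of the trace/shade cycle described above: one open batch per
# direction, full batches queued as "ready," and shading pushing bounce
# rays back into the queues until none remain.
from collections import deque
import random

BATCH_SIZE = 4
DIRECTIONS = ["+x", "-x", "+y", "-y", "+z", "-z"]

def render(camera_rays):
    open_batches = {d: [] for d in DIRECTIONS}
    ready = deque()

    def enqueue(ray):
        batch = open_batches[ray["dir"]]   # bucket by dominant direction
        batch.append(ray)
        if len(batch) == BATCH_SIZE:       # a full batch becomes "ready"
            ready.append(batch[:])
            batch.clear()

    for ray in camera_rays:
        enqueue(ray)

    while ready or any(open_batches.values()):
        if not ready:                      # nothing full: flush a partial batch
            d = next(d for d in DIRECTIONS if open_batches[d])
            ready.append(open_batches[d][:])
            open_batches[d].clear()
        batch = ready.popleft()
        # "Trace" the batch (all rays travel roughly the same way, so
        # traversal is coherent), then "shade" the hits. Each shaded hit
        # may spawn a bounce ray, re-queued by its own direction; a 10%
        # chance of termination stands in for rays that hit a light or
        # leave the scene.
        for ray in batch:
            if random.random() > 0.1:
                enqueue({"dir": random.choice(DIRECTIONS)})

render([{"dir": random.choice(DIRECTIONS)} for _ in range(20)])
```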

And that's why Hyperion could calculate 10.5 billion rays per frame. The effect was dramatic.

"Because we could capitalize on coherence," Jenkins says, "we could trace more rays and trace rays through multiple bounces. We started seeing significant results from six bounces. We still had energy filling in the overall scene. And it gave an overall illumination within the characters. Baymax is hollow and bright. We could bounce light through him many times."

Hendrickson adds, "It's the first time we could do caustics, glass, steel, volume rendering, and atmospherics with no limits. We could have done this film without Hyperion. But, it would have been a different film."

Other Technical Achievements

As remarkable as Hyperion is, though, it wasn't the only technical advance realized for the film.

"This was our most ambitious film ever," Driskill says, and lists several reasons why.

"We built a geographically accurate version of San Francisco with every lot and building height," he says. "We worked on aspects of it for over a year, going back and forth with visual development in terms of complexity. We built it procedurally using template environments for districts. Our biggest shot had 350,000 unique elements."


Baymax and Hiro soar over a beautifully rendered, highly detailed San Fransokyo.

To populate the city, the crew designed a new system called Denizen that put 6,000 people in the opening shot, and created an animation retargeting system to work with Denizen.

"CG films have often felt like they were filmed on a soundstage," Driskill says. "Fifty feet out you have a painted backdrop, and you have the same six crowd characters. We kept pushing further and further because we felt unlimited."

The team also built a new system called Parade to help animators. When an animator manipulates rigging controls, Parade calculates the effect of those changes on neighboring frames, which helps animators iterate more quickly.
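
The article doesn't describe Parade's internals, but the workflow it implies might look roughly like this hypothetical sketch, where a control change triggers rig re-evaluation over a window of neighboring frames.

```python
# A hypothetical sketch of the implied workflow: when a rig control
# changes, re-evaluate the character over a window of nearby frames so
# the animator sees the edit in motion, not just a single pose.
def evaluate_rig(controls, frame):
    # Toy rig: the pose is just the control value scaled by the frame.
    return controls["arm_lift"] * frame

def on_control_changed(controls, current_frame, window=3):
    frames = range(current_frame - window, current_frame + window + 1)
    return {f: evaluate_rig(controls, f) for f in frames}

poses = on_control_changed({"arm_lift": 2.0}, current_frame=10)
print(poses)  # poses for frames 7 through 13, ready for playback
```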

Some of the new tools centered on Hyperion. For example, they refined a frame-interpolation technique that sister company Industrial Light & Magic had built for Pacific Rim.

"If we ended up without enough bandwidth to render every frame, we could have rendered every other frame and interpolated," Driskill says. "By the time we were in shot production, we didn't need it, but we now have a cool frame-averaging technology. We can break out layers and average them, and it does nice noise reduction as a side effect. Hyperion, like any path tracer, is nonlinear in the time it takes to squash the last bits of noise. This helps clean up those last bits of noise in the image, so we can chop off one turn of the crank."

To further reduce rendering time for some shots, the team created Façade.

"Global illumination is interesting because you bounce light in the whole world to create imagery," Driskill says. "But, when we're in Hiro's bedroom with the sun coming through the window as the light source, Hyperion spent most of the time rendering San Francisco Bay out the window. It had loaded the entire city. Even though Hyperion allows an obscene amount of geometry in a scene, we didn't need that. Facade pre-bakes the illumination outside, then builds a piece of geometry outside the window with that pre-baked illumination."

The effects team also faced technical challenges, especially in creating Hiro's little microbots, which use magnetism to connect to and disconnect from one another. Other challenges lay in creating effects particular to each character, and in producing a signature effect for the climax, which involves an environment created entirely with volumes.

But of all the accomplishments, the breakthrough idea for rendering encapsulated in Hyperion stands out. It's tempting to call the Hyperion team the Big Heroes on this film.

"Hyperion was absolutely a crazy idea," Driskill says. "When Brent brought the idea out, we went, 'Oh, wow. It's revolutionary.' We went to Ed Catmull, and he said, 'Go for it.' "

And they did. The rest is CG history.

"I've watched the San Fransokyo flyover probably 100 times," Driskill says, "and I'm still excited about it."