|DreamWorks forges a new stereo 3D experience and brings a Viking world to animated life in a CG adventure comedy
The title of DreamWorks Animation’s latest film is How to Train Your Dragon, and indeed, the star of this animated feature, a teenaged Viking named Hiccup, does just that, albeit in his own way. So, it makes sense that the biggest challenges for the real-world animators, visual effects artists, modelers, and riggers centered on “training” CG tools to help the crew create these dragons. Seven dragons in all. Unique dragons devised by Annie Award-winning character designer Nicolas Marlet (Kung Fu Panda, Over the Hedge), who also designed the main characters for the film: Hiccup, his big-bearded father Stoick, a young female Viking named Astrid, and various other Vikings young and old.
Dean DeBlois and Chris Sanders, the team who wrote and directed Disney’s Lilo & Stitch, directed How to Train Your Dragon and wrote the screenplay. It’s the second film created in stereo 3D at DreamWorks from start to finish.
The iconoclastic young Viking Hiccup rides the dangerous Night Fury dragon after befriending the creature that other Vikings considered an enemy. At right, Tuffnut, another teenager, will take some convincing.
The story, based on the children’s book by Cressida Cowell, pits the brainy Hiccup against his brawny dragon-slaying tribe: Hiccup breaks with tradition, befriends a dragon, and dubs his dragon friend “Toothless.” The tribe is not amused.
Simon Otto, head of character animation, began working with Marlet, a design team, modelers, and riggers three and a half years before the film released, as part of a small development group that brought the two-dimensional drawings into the 3D world. “The design language of the movie pushed caricatured shapes set in a realistically textured world with live-action-esque lighting,” Otto says. “We had exaggerated shapes, but the story is epic and naturalistic. So we needed to be sure we could deliver the emotional beats with realistic acting.”
Production started a year before release. “The magnitude of the dragons was a main task to tackle,” Otto says. “And on the human side, we wanted to make sure the Vikings had beards, and that created challenges for the rigging and character effects departments.”
At first, the riggers thought they could set up the controls for one dragon and apply them to the others, but it wasn’t that simple. “We ended up with seven bespoke dragons,” says Nathan Loofbourrow, character TD co-supervisor. “They share many controls, but each is unique with special powers of its own.” They all had legs and wings, but one had spikes that move up and down based on the dragon’s emotional state, and another puts his wings down and crawls. They each walked differently, flew differently, and spewed different forms of CG fire. And, they had different personalities. Unlike the dragon in PDI/DreamWorks’ Shrek, these dragons are primary characters.
“We tried to hit a tone that was hopefully fresh,” Otto says. “Dragons have been in many films, but they’ve mainly been in 2D or live-action movies, like Dragonheart. We didn’t need to match a live-action design or match to plates, and we could make a more complex dragon than a drawing. Nico [Marlet] looked for a shape language—what a short, stubby dragon would look like, for example. We drew inspiration from real life and steered the designs into something naturalistic and recognizable for the audience. We wanted to have fun with them. Make them believable. And at the same time, somewhat silly in their nature.”
The animators drew from personal experiences with their cats and dogs for Toothless. But Toothless is also a dragon of the Night Fury species: part bird of prey, part panther, black like a stealth bomber, and he fires white-hot lightning bolts. “Toothless has four legs, two sets of wings, a tail, and a tail fin,” Otto says. “[For us] to have maximum artistic control, he had four times the number of controls as Hiccup, the main character.”
The Deadly Nadder, on the other hand, has the muscular legs and aggressive nature of an ostrich and the beautifully colored feathers of a parrot. His tail is spiked, and he shoots swirling, white-hot sparkly shapes, but he doesn’t see well.
Gronkle is a tubby, green dragon with tiny wings that Otto describes as a cross between a walrus, crocodile, bumblebee, and bulldog. “He’s silly like a bulldog is silly,” Otto says. “The dragons aren’t funny as in a Tex Avery cartoon, but there is a funny aspect to their design and behavior that’s drawn from real-life observation.” Gronkle flies like a hummingbird, but scoops up rocks and turns them into lava balls.
At top, two Terrible Terrors illustrate the color range in this species of dragon, as they fight over a bit of food. At bottom, Hiccup shows his astonished yet dubious friends how to train a Monstrous Nightmare.
Hideous Zippleback has two heads that zip together. One head spurts gas, the other head ignites it. The tiny Terrible Terror attaches itself to the larger dragons for free rides. “His fire is close to propane gas,” says Craig Ring, visual effects supervisor. “The funny thing is that it’s so out of scale for the dragon. It’s like a 20- or 30-foot blowtorch.” The red and black Monstrous Nightmare, which looks most like a classical dragon, sets itself on fire. Lastly, the Red Death is the biggest of all, in every way.
Rather than have the seven supervising animators be responsible for having their teams perform all the characters in entire sequences, Otto organized the supervisors and their teams by characters, using a system typically implemented for traditionally animated films. In addition, separate sets of animators worked on crowd scenes with armies of Vikings and big groups of dragons.
“Most of the supervisors were classical 2D animators who had worked at the studio for a long time,” Otto says, “so we persuaded the studio to go back to a supervisor-per-character system.” The supervisors led 51 animators at PDI/DreamWorks in northern California and at DreamWorks Animation in southern California. Although Otto noted that organizing the production by character rather than sequence can sometimes be more difficult to manage and less efficient, for this production that wasn’t the case.
“It turned out that it helped us in the long run,” Otto says. “Because the animators worked on one character, it was significantly more efficient, particularly with complex characters. And, the consistency of the characters grew as we went through the film.”
Systems within the animation rig helped the animators handle that complexity.
Wings and Fire
“We spent a lot of time optimizing,” Ring says. “The biggest problem was the complexity of the characters. For example, when you have hundreds of spikes on a dragon’s back that can be moved independently, putting in controls that don’t bog down the animators is a big challenge.”
This film was the first to use the studio’s rewrite of its in-house rigging system. “That definitely helped because it’s faster,” Ring says. “Also, we gave animators low-res proxy versions of dragons. They could turn off the parts they didn’t need.”
To help the animators control the dragons’ wings, the riggers started by looking at movies of bats for reference. “Then we broke down the wings mathematically,” Loofbourrow says. “Each wing had five, six, or seven divisions based on how they would fold up.”
For each dragon wing type, the animators created flap cycles for different flying maneuvers—landing, coasting, and so forth. Then, the riggers added those flap cycles to the wing rig, and the animators could make the dragon fly using a few simple controls. “It was almost like they could turn a crank in the dragon and the wing would flap,” Loofbourrow says. “They could dial in weak, medium, and strong cycles as they animated. We embedded the curves the animators crafted, and the system interpolated between them over time and strength. They could slide between no flapping to the strongest flap. The curve shape would change, and the dragon would move smoothly.” The combination of flight cycles and tweaks gave each dragon its unique method of flight, whether hummingbird or bird of prey.
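The crank-style control Loofbourrow describes can be sketched as interpolation between embedded flap curves. The following Python sketch is illustrative only; the cycle data, function names, and linear-blend scheme are assumptions, not DreamWorks’ actual rig:

```python
# Illustrative sketch of a "crank"-style wing control: embedded flap
# cycles are blended by a strength dial. Cycle data, names, and the
# linear-blend scheme are assumptions, not DreamWorks' actual rig.

def lerp(a, b, t):
    """Linear interpolation between a and b."""
    return a + (b - a) * t

def sample_cycle(cycle, phase):
    """Sample a flap cycle (evenly spaced keys over one wingbeat)
    at a phase in [0, 1)."""
    x = (phase % 1.0) * (len(cycle) - 1)
    i = int(x)
    j = min(i + 1, len(cycle) - 1)
    return lerp(cycle[i], cycle[j], x - i)

# Wing angle (degrees) over one beat, per strength setting.
CYCLES = [
    (0.0, [0.0,  0.0, 0.0,   0.0, 0.0]),  # no flapping (coasting)
    (0.5, [0.0, 20.0, 0.0, -20.0, 0.0]),  # medium flap
    (1.0, [0.0, 60.0, 0.0, -60.0, 0.0]),  # strongest flap
]

def flap_angle(phase, strength):
    """Turn the crank: blend between the two cycles bracketing the
    requested strength, so the curve shape changes smoothly."""
    strength = max(0.0, min(1.0, strength))
    for (s0, c0), (s1, c1) in zip(CYCLES, CYCLES[1:]):
        if strength <= s1:
            t = (strength - s0) / (s1 - s0)
            return lerp(sample_cycle(c0, phase),
                        sample_cycle(c1, phase), t)
    return sample_cycle(CYCLES[-1][1], phase)
```

Animating only the phase and the strength dial keeps the wing flapping continuously while the intensity slides from none to strongest, which is the behavior the riggers describe.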
As with all the procedural and simulation systems created for the characters, whether dragon or human, the animators could tweak the performances. And because each dragon breathed fire differently, the riggers created controls within the rigging system for those performances, as well. The character effects team, though, created the CG fire—seven different types. The directors wanted each dragon to have different fire and wanted the fire to be different from any fire seen before in films.
“We looked at movies with live-action dragons and discovered that the fire tends to be the same propane gas explosion style,” Ring says. “Propane gas makes great explosions on location without setting everything on fire. But, we didn’t have that problem. So, we decided to make it more dangerous.”
The effects crew primarily used Autodesk’s Maya particles and fluids for base simulations, with Side Effects Software’s Houdini breaking apart things the dragons exploded. They added color and detail via an in-house renderer.
Each dragon had its own set of effects artists working on its particular style of fire. “All the fire was independently developed,” Ring says. “We shared some generic fire tools and volume rendering, but we had different artists hitting different targets, and it really paid off.”
For example, inspired by high school chemistry class experiments in which students burned magnesium powder, Deadly Nadder’s artists twisted Maya particles into sparkly fireworks. Similarly, particle-driven fire emitted from a hand-animated rock helped the Gronkle turn a rock into an exploding lava ball. Houdini helped break the flaming rock apart on impact.
A Maya fluid simulation, on the other hand, streamed gas from one of the Hideous Zippleback’s heads. The same tool set with different parameters converted the gas into a fiery explosion when the second head lit it. And, Terrible Terror’s giant blowtorch used both Maya fluids and particles.
Red Death’s fire was perhaps the most onerous. “He has smoky, dirty fire, like an oil refinery,” Ring says. “The fire is driven by a creature 500 feet long. At one point, he sets a whole fleet of ships on fire. So, the fire had to look big.”
Coincidentally, a huge wildfire erupted near where some of the artists lived. “There were 50-foot flames,” Ring says. “Some artists were evacuated out of their homes. It was on the news constantly.” He and the other artists studied the fire to determine what made it look so big.
“It’s about having pieces of flame breaking off, rather than having a continuous flame,” Ring says, “and about lots of small detail. Our old volume renderer would have bogged down and crashed, so we rewrote it.” The crew ran the base simulation for the huge fire in Maya, then added details on top.
Authoring in Stereo
This is the fifth film stereo supervisor Phil “Captain 3D” McNally has worked on and the second film at DreamWorks since CEO Jeffrey Katzenberg mandated that the studio would create all future films in stereo 3D. As a result, at DreamWorks, the minute a film moves from the storyboard stage, everyone is working in stereo 3D.
“We worked out what authoring in 3D is on Monsters vs. Aliens,” McNally says. “We had to get the technical gear together—put the technical pipeline in place so we could author in 3D all the way from previs—train the artists, and so forth. So a lot of that work was in getting everyone up to speed and seeing that it was 100 percent successful in terms of delivering sophisticated 3D without hurting anyone.”
Before work on How to Train Your Dragon began, McNally went back to Monsters and looked at what they had done. “I analyzed the settings we, as a group, had derived by sitting in the theater and deciding what we liked,” he says. “Then, I came up with a little calculation that’s now in our tools.”
The calculation, which McNally dubbed the “Happy Ratio”—the tool even comes with a smiley face—gives artists a baseline from which to design the stereo. “Even if an artist hasn’t worked in stereo before, we can get consistency,” McNally says.
The way it works is that the tool has default settings for the volume of objects in a scene and the depth of a scene. “If we have a tennis ball floating within arm’s reach, the ball needs volume because in real life we see around the sphere,” McNally explains. “But a distant planet might be flat and far away in depth. So the Happy Ratio is a combination of these things.”
McNally uses the way photographers manipulate camera apertures as an analogy for the way artists use the tool. “We have planes within a scene,” he explains. “The artists decide which part is important. If it’s a close-up, they’ll set a marker on the nose of a character. The system knows the distance from the nose to the camera and guides the artist on the stereo numbers.”
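A baseline in the spirit of the tool described might converge the cameras on the marked focus point and then size the interaxial separation so the scene’s depth fits a parallax budget. The parallax model, function names, and numbers below are assumptions for illustration, not the actual Happy Ratio formula:

```python
# Hypothetical baseline stereo calculation: converge on the marked
# focus point (e.g., the nose), then choose the interaxial so the
# farthest visible point spends exactly the allotted parallax budget.
# The model and names are assumptions, not DreamWorks' tool.

def parallax(interaxial, convergence, depth):
    """Relative screen parallax of a point at `depth`: zero at the
    convergence distance, negative in front of it (off the screen)."""
    return interaxial * (1.0 - convergence / depth)

def baseline_stereo(focus_dist, far_dist, depth_budget):
    """Place the focus marker on the screen plane and spend the whole
    parallax budget on the farthest visible point."""
    convergence = focus_dist
    interaxial = depth_budget / (1.0 - convergence / far_dist)
    return interaxial, convergence
```

For example, with the nose 2 units from the camera, the forest 10 units away, and a budget of 0.4, the sketch returns an interaxial of 0.5: the nose sits on the screen plane and the forest uses the full budget.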
The tool also made it easier for McNally and head of layout, Gil Zimmerman, to make creative decisions as they designed the stereo experience.
“We would sit down at the beginning of a sequence and work out how the stereo should emotionally support the script,” McNally says. “We had done this at the studio to some degree on Monsters, but this was the first time we could concentrate more on the emotional arc than the technology.”
McNally offers an example, and here’s the setup: It’s a sequence in which Hiccup has brought down the Night Fury. He goes into the forest to find out where the dragon crashed. Hiccup’s capture device has netted the dragon, so Hiccup decides he’ll kill the beast with his little dagger, take its heart back to the village, and prove he’s a Viking.
“So as he’s gearing up to do this,” McNally says, “we ramp up the stereo intensity in sync with the camera push-in. It’s not a stereo effect we think the audience will notice, but the intensity goes up further and further. One of the shots is a close-up of Hiccup with the forest behind, and he’s standing with the raised dagger. At the beginning, the stereo is set at 50 percent of normal. The character is a bit flat, the background is close. He’s in our personal space a bit. As we ramp up, it isn’t like the shot goes farther away or comes closer, it just expands. The volume within expands. It has the effect of the character getting closer and the background getting farther away. The audience feels the impact of the sequence, the music, lighting, and camera, but they don’t pick out what we’re doing in stereo. We think that’s the perfect use of stereo: adding emotional intensity without drawing attention to the technique.” When Hiccup decides he can’t kill the dragon, the stereo drops down to be less deep than even in a normal sequence. The adrenaline drains from the scene.
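The expansion McNally describes, where the character stays put while the volume around him grows, can be sketched by ramping the interaxial over the shot while keeping convergence locked on the character. All numbers and names here are illustrative assumptions:

```python
# Hypothetical sketch of the stereo ramp: scale the interaxial over the
# shot while convergence stays locked on the character, so he neither
# advances nor recedes; the depth around him simply expands.
# Numbers and names are illustrative assumptions.

def parallax(interaxial, convergence, depth):
    """Relative screen parallax of a point at `depth`."""
    return interaxial * (1.0 - convergence / depth)

def shot_depth(t, base_ia, char_dist, bg_dist, start=0.5, end=1.5):
    """Stereo state at normalized shot time t in [0, 1]: ramp the
    interaxial from 50 to 150 percent of its base value and return
    (character parallax, background parallax)."""
    ia = base_ia * (start + (end - start) * t)
    return (parallax(ia, char_dist, char_dist),  # always 0: on screen
            parallax(ia, char_dist, bg_dist))    # grows with t
```

Because the character sits at the convergence distance, his parallax stays at zero throughout; only the background’s parallax grows, so the shot neither advances nor recedes, it expands.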
In another sequence, Hiccup, who is riding on the back of the dragon, does a free fall along the z axis. “They do a rolling dive, like going off a roller coaster,” Otto says, “and then they’re finally flying.”
The animators working on that sequence helped create the stereo 3D experience. “We want to make sure the 3D experience feels truthful and real,” Otto says. “It’s in the framing—the characters have to be within the frame—and in the timing. It’s how long you play certain moments out, how you allow certain depth cues to happen.”
In the free fall, for example, the animators hold back a moment as Hiccup is finally flying, to give the audience the same thrill Hiccup felt. But there are other ways in which stereo 3D has affected animation. “In 2D, everything is about silhouette,” Otto says. “Stereo gives you more readability; it reads very differently. We’re only scratching the surface.”
McNally notes, for example, that the action in animated films typically happens on a proscenium stage, with the characters on the left and right and the action playing across the frame. “In 3D, though, we no longer have the confusion of characters overlapping,” he says. “3D can also carry more visual density. In 2D, simplicity works better, and we spent a lot of time in the past clearing the space and simplifying the shots. But, the more the better in 3D. We can keep putting stuff into the shots and it never gets confusing.”
Although many Vikings in the film have various types of beards, Hiccup’s father Stoick has the mother of all beards. “His beard is probably as complex as Shrek in his entirety,” Otto points out.
Stoick’s beard obscures his entire upper body and most of his face. “His face is the beard,” Loofbourrow laughs. “His beard is his lips. His cheeks. Even his brows are hairy eyebrows. You have to look at the hair to understand his performance, to see the smile in his beard. We had to make sure his expressions showed through.”
At top, Ruffnut and Tuffnut face a Deadly Nadder. Note how the lighting in this shot and the lack of background detail focuses attention on the characters. At bottom, the gruff Viking trainer Gobber, who lost his limbs fighting dragons, stands out from the teenagers behind him.
Riggers Doug Stanat and Sean Nolan handled Stoick’s face and beard rigs, working together to shape his facial expressions and make sure he had follow-through on his dynamic beard. As in most hair systems, guide curves controlled the overall shape of the characters’ hair and beards, with dynamics driven by the underlying performance creating the movement. Usually a character effects group runs the dynamics after the animators finish. Not this time. “In this case, because we knew the animators had to see the beard as they worked, we couldn’t send the beard to simulation and back,” Loofbourrow says. “It had to be part of the animation process. It wasn’t fast. But, it was fast enough.”
So that the animators could see the guide curves, the modelers turned them into tubes. “That gave the animators a low-res preview of the volume of the hair,” says Ring. “We also tried to get as much movement built into the rig as we could so they could see the movement. We had the ability to mix hand-animation controls with physically-based dynamics.”
Stoick’s beard had 100 guide hairs draped over his chest and flowing along his face. The animators could look at the tube geometry and, using a magnet, pull the curves in a specific direction. “For really tricky cases, we’d take the guide curves back into Maya and use the whole suite of Maya tools—sometimes dynamic calculations, sometimes hand-animated keyframes—to get the hair-to-hair contact working right,” Ring says.
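A magnet-style pull of guide curves can be sketched as a falloff-weighted move toward a target position; the linear falloff shape and all names here are assumptions, not the studio’s tool:

```python
# Hypothetical sketch of a magnet pull on a guide curve: each control
# point moves toward a target, weighted by a linear falloff from the
# magnet position. Falloff shape and names are assumptions.

def magnet_pull(curve, magnet, target, radius):
    """Pull curve points toward `target`; points farther than `radius`
    from the magnet are left untouched."""
    out = []
    for p in curve:
        d = sum((a - b) ** 2 for a, b in zip(p, magnet)) ** 0.5
        w = max(0.0, 1.0 - d / radius)  # 1 at the magnet, 0 at radius
        out.append(tuple(a + (t - a) * w for a, t in zip(p, target)))
    return out
```

The falloff keeps the pull local: points near the magnet move fully to the target, while the rest of the curve eases off smoothly, so a single drag can reshape one region of the beard.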
A surfacing department added shader parameters that controlled the hairs’ shininess, kinkiness, color, and so forth, and the studio’s in-house renderer then multiplied the guide hairs into the thousands of beard hairs. The process wasn’t always straightforward, though. “A whole bunch of departments are involved, and sometimes they worked in parallel,” Ring says. “So you get to the end, look at it, and the animators move a guide curve, rigging adjusts the rigging, surfacing changes a shader parameter—iteration after iteration.”
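Multiplying a handful of guide hairs into thousands of render hairs is commonly done by interpolation: each render hair is a weighted blend of nearby guides. The following minimal sketch assumes a simple convex-combination scheme; it is not the studio’s in-house renderer:

```python
# Hypothetical sketch of multiplying guide hairs at render time: each
# render hair is a convex combination of the guide curves. The blending
# scheme is an assumption, not the studio's in-house renderer.
import random

def blend_guides(guides, weights):
    """One render hair: each control point is the weighted average of
    the corresponding points on the guide curves."""
    hair = []
    for pts in zip(*guides):  # corresponding points across guides
        hair.append(tuple(sum(w * c[axis] for c, w in zip(pts, weights))
                          for axis in range(3)))
    return hair

def populate(guides, count, seed=0):
    """Interpolate `count` render hairs from a few guides, using
    random normalized weights per hair."""
    rng = random.Random(seed)
    hairs = []
    for _ in range(count):
        raw = [rng.random() for _ in guides]
        total = sum(raw)
        hairs.append(blend_guides(guides, [r / total for r in raw]))
    return hairs
```

Note that flaws in the guides multiply, too: if two guide curves pass through each other, every render hair blended from them inherits the crossing, which is why problems caught at the guide level matter so much.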
One reason for the iterations was that the crew created the film in stereo 3D. “This was one of the first films in which we looked at the hair and fur in stereo 3D,” Loofbourrow says. “All kinds of stuff can happen in a bushy beard that you don’t see until you put on 3D glasses.” The problems typically happened with the guide hairs, with two guide hairs passing through each other—a problem exaggerated when the renderer interpolated the guide hairs into thousands.
“We previewed the beard in 3D as much as possible,” Loofbourrow says. “We’d slap on the little red-blue glasses. We couldn’t see the beard in color, but we could see the problems.”
Fifty people at DreamWorks worked on fire and water effects, and another 20 on cloth, hair, and fur. In addition, a lighting crew of approximately 50 artists brought the scenes to life using techniques from live-action filmmaking.
Light My Fire
“One of the things we hadn’t done before was to bring in a live-action director of photography,” says Ring. “This time we had Roger Deakins as a visual consultant. He came in once every month or two depending on his scheduling and sat in on color grading, too.”
The eight-time Oscar nominee for best cinematography helped the team work interactively to develop the look. “In the past, we did paintings to guide lighting and provide inspiration,” Ring says. “We did a lot less of that on this show and not at all for many sequences.”
By using Maya, they could, just as Deakins might do on a live-action set, put a light outside a window or place a soft bounce card in a scene. “I think it really paid off for a style for the film,” Ring says. “We had more contrast, richer blacks. We threw away detail to concentrate on the characters and pushed the live-action feel.”
Animators could control the Vikings’ hair and beards using rigs with built-in dynamic simulations.
How to Train Your Dragon has a look and feel that’s different from any of DreamWorks’ previous films: Madagascar, Kung Fu Panda, Shrek, Over the Hedge, Monsters vs. Aliens, Shark Tale, Flushed Away. In part, that’s due to Deakins’ influence; in part, to the design decision to set the caricatured shapes in a realistically textured world. But the result is a film unlike any other, and one that, we expect, the studio hopes will lead to yet another successful franchise.
“What I love about this film beyond the visual design,” Otto says, “is that we’re in it for the long haul. From beginning to end, it’s a deeply touching and charming experience. Of course, I’m close to it, but there’s a sequence where there is zero dialog for six minutes and the story is told very clearly at that moment. I think the film really hits home in regard to heart, emotion, and charm. It’s very truthful. That’s what I like best about the film.”