Virtual Verite: 'The Jungle Book'
Issue: Volume 39, Issue 2 (Mar/Apr 2016)


The feral boy Mowgli’s adventures with animals in India have been a familiar part of many people’s lives since Rudyard Kipling first published his series of children’s stories in magazines during 1893–1894. The collection, now known as “The Jungle Book,” has inspired comic books, cartoons, animated movies, and live-action films. 

The most famous of these adaptations is, arguably, Disney’s popular 1967 animated feature The Jungle Book. And it is from this film that Director Jon Favreau drew inspiration for this year’s CG/live-action “re-imagining.” Favreau’s The Jungle Book tries to capture the magic of the animated feature while embracing the technology available in 2016.

“The film pays homage to the 1967 film, but it’s a movie rooted in real life,” says Visual Effects Supervisor Rob Legato. “It’s a fun experience, and part of that fun is getting dark and scared. When the tiger is chasing Mowgli, that tiger looks real. Your heart will be pumping like crazy. But the film also has light moments, comedy, and music.”

The tiger, however, is CG, as are all the animals in the film. Ninety-five percent of the movie was shot in sets on greenscreen or bluescreen stages. 

Produced by Walt Disney Pictures, The Jungle Book features an all-star cast of voices for Mowgli’s animal guides, protectors, companions, and enemies. Ben Kingsley is Bagheera, the black panther that leads Mowgli to the wolf den. Lupita Nyong’o and Giancarlo Esposito provide the voices of the mother and father wolves, Raksha and Akela. The Bengal tiger Shere Khan threatens Mowgli’s life using Idris Elba’s voice. Baloo, a bear that speaks with Bill Murray’s voice, guides Mowgli away from danger with help from Bagheera. Along the way, Mowgli encounters the dangerous and manipulative animals Kaa, a python voiced by Scarlett Johansson, and King Louie (Christopher Walken), an orangutan-like ape based on the prehistoric Gigantopithecus. Lastly, Neel Sethi plays the child Mowgli, the only live-action actor in the film.

Legato was the overall visual effects supervisor for The Jungle Book, with Andy Jones the overall character animation supervisor. The Moving Picture Company (MPC) created all the CG animals and environments for 1,200 shots, which comprise most of the film. Weta Digital handled a sequence with King Louie and various monkeys (see “Monkey Business,” page 14). Adding the environments and characters to live-action plates filmed with Sethi took nearly 1,000 postproduction artists in the two studios more than a year.  

At MPC, Visual Effects Supervisor Adam Valdez oversaw the work of 600 people in London and Bangalore, India, who were on the show at any one time, and 800 people overall. Also at MPC, Ben Jones supervised character development, Audrey Ferrara supervised the environment work, and Peta Bayley and Gabriele Zucchelli were animation supervisors.

One key to these postproduction artists’ success was the previsualization process (see “Directing and Lighting Previs”).

Expert Previs

“Picture stepping onto a motion-capture stage,” Legato says. “[Neel Sethi] is in a motion-capture suit. On the stage are gray-shaded pieces of wood and ramps that create a lumpy landscape. On the computer, we have facsimiles of the places we want to shoot. We have people in motion-capture suits representing the animals. On the computer, we see a cartoon version of Mowgli and facsimiles of the animals. As we tell the actors how to move, we block out the scenes.”

Working together to create the previs using the motion-capture stage and virtual cinematography were the director, Director of Photography Bill Pope, Legato, Jones, Production Designer Christopher Glass, Valdez, a crew from Digital Domain, editors on the film, and others.

“The coverage, the editing, all the choices were made by these people, not previs people,” Valdez says. “Rob Legato and Bill Pope were shooting with the virtual camera. Andy Jones was helping stage the action. Jon Favreau directed people in the motion-capture suits on set. Bill [Pope] was talking with Christopher Glass about the look and light. So we had a group of filmmakers creating the previs. That gave it intention, tone, style, and look from the beginning. I thought our biggest challenge would be how to make the light connect between what would be filmed and CG. But, Bill [Pope] was lighting from the beginning, from previs.”

After previs, Pope sat with Lighting Technical Director Michael Hipp at MPC to refine the lighting. 

“Because we use a raytracer now [Pixar RenderMan RIS], the light is accurate,” Valdez says. “We had gray models, no texture yet, but we could show him where the light was, how much fill there would be, how much bounce. Michael worked with Bill to light every scene and create key angle frames that went beyond previs, and then we put them on an iPad so Bill could remember his lighting while on set. Gravity had just come out with its pre-lit shots designed in the computer and brought to set, so we referenced that idea.”

Once Pope approved the lighting, the MPC team created 360-degree panoramas and loaded them onto iPads using an app called “Location.” On set, Pope and others could point the iPad toward the bluescreen stage and see a 360-degree view of the corresponding CG scene lit by Pope.

“It is like looking at a scene in Google Street View,” Valdez says. “When a person is surrounded by bluescreen, it’s difficult to know there’s a big tree over there. But, Bill wanted to know when Mowgli walked through the tree’s shadow. With the iPad panoramas, he could see the tree, the dappled light over there, the open sun beyond. It was essential to have his pre-lighting brought on set.”

Talk to the Animals

Earlier, during the previs shoot, stunt actors in motion-capture suits had played the part of the animals. During the live-action shoot, puppeteers in blue suits gave the young actor Neel Sethi eye lines. 

“They were like stand-ins,” Valdez says. “Often they had sock puppets on their hands, but there were also puppets on set that helped when Neel needed to make contact.”

At peak, 78 animators at MPC performed the approximately 70 animals that appear in the film. Of those animals, the hero animals and several secondary animals talked. 

Animation Supervisor Peta Bayley divided the MPC team into groups under seven leads assigned to specific sequences, with each lead animator having a deputy to help manage the flow of information. Each lead worked with riggers to develop the hero characters and to create those characters’ “bibles.” In addition to Bayley, Overall Animation Supervisor Andy Jones worked at the studio for much of the project.

“We were so lucky to work with him,” Bayley says. “Often we hear feedback on a project secondhand or thirdhand. It was helpful to have him here; he’s a very talented animator. It was motivating to have someone of his caliber in-house.”

The challenge for the animation team was to create believable performances for animals that, by nature, vocalize but don’t speak dialog. 

 “Andy Jones kept taking us back to the sensibility of the film,” Legato says. “If it were real, how would you shoot it? So, if an animal could talk, what would it look like?”

Thus, rather than relying on motion capture for facial expressions and lip sync, the animation team turned toward photographic reference.

“We looked at the way the animal’s mouth moved when eating or chewing, to see if we could incorporate that detail,” Bayley says, “at the way they vocalize, lick lips, and groom, as well as their breathing patterns. When they pant, you can see it in their throats and diaphragm, and we could incorporate that movement with dialog to help sell the idea that the character made that noise.”

The team amassed a large library of animal reference, including facial close-ups. They had video footage of the actors doing voice recordings and of the actors in other films. And, they referenced studies of animal behavior. Animators would sometimes get approval from the director on performances based on reference footage and combinations of reference that they found. 

“Sometimes, we matched reference second by second,” Bayley says. “It’s difficult to stray from the reference and still have the same weight, but we also had to adapt.”

Legato adds, “At first, the animators nailed each mouth fluctuation. But, people don’t enunciate every syllable. When we tucked it back, it became more believable.”

The most difficult animals, according to Bayley, were the snake Kaa and the bear Baloo.

“A snake is not inherently expressive,” Bayley says, “so it was a massive challenge. We didn’t want to introduce things that don’t exist onto that face.”

The bear presented the other side of the coin – an animal that is somewhat anthropomorphic by nature.

“Baloo was the most lively and had the most range of emotion,” Bayley says. “We found some fascinating footage of bears. Their range of emotion in their mouth is incredible. And, there’s so much character in Bill Murray’s voice, so we wanted to keep the sense of him. But you want to believe Baloo is a bear. So, that balance was the most difficult.”

Once the director approved the animals’ performances, the characters moved into technical animation – techanim – for secondary muscle and skin movement, and for fur and hair grooming. 

“Ben Jones completely rebuilt how we do muscles and tissue,” Valdez says. “He created new ways for the muscles to react to physics and interact with each other, and new ways for the skin to receive information from the muscles. We had multiple influences to manipulate fine details in the skin. Ben worked a lot with the character modeling and asset departments.”

Rather than rely on displacement maps for detailed textures, modelers built those details into the models.

“In the past, we created wrinkling via displacement maps,” Valdez says. “We shied away from using geometry due to render times. But, I like geometry. I like to have faces and polygons reacting with light. Any time you have geometry, you get more realistic light play. You can’t simulate light that well with maps.”

To alleviate the burden of high sampling rates with raytracing, MPC collaborated with Pixar and Disney on de-noising tools.

“Raytracing is the way to go forward because it gives you the obvious realism everyone seeks, but it demands that everyone raise their game,” Valdez says. “The lighting department has to be more physical in their setups. Lookdev people need to create more robust surfaces that work in all lighting cases. Asset people have to provide more realistic objects.”

To groom and move the animals’ fur, the team used and updated MPC’s seasoned custom Furtility tool. 

“We had new levels of interaction between fur and water,” Valdez says. “In the lazy river sequence, Mowgli rides down a river on Baloo’s belly. So we had to work out how to make the bridge between water and fur simulation.”

The Jungle

The lazy river sequence was one of many environments the crew at MPC needed to create for this largely bluescreen film.

“Filling up the bulk of the frame for 90 minutes is intimidating,” Valdez says. “This movie is somewhere between an animated feature and an Avatar type of movie. The environments team had probably the single most complicated contribution to the film.”

To populate the environment with rocks, ground, and vegetation, Environments Supervisor Audrey Ferrara led crews in London and Bangalore that grew to 120 artists – modelers, texture artists, lookdev artists, environment artists, and technical animators. These artists created 58 different sets and 224 subsets for the film.

“We had the wolf den and its subsets; peace rock where all the animals drink water and Shere Khan shows up; the home jungle where the wolves and Mowgli train, run, and play; the lazy river with Mowgli and Baloo; Kaa’s jungle, which is big and scary; and more,” Ferrara says. “At the beginning of the show, I wasn’t exactly terrified, but I thought, oh my God, how will we build these massive jungles with flowers and vegetation everywhere? At different times of days. In different seasons. In stereo.”

They started in India. Glass and his art department created rules for the vegetation based on species growing in India, and provided reference material. Working from that, Ferrara planned photo shoots. 

“I learned where in India we could find the species the art department wanted,” Ferrara says. “Then I scheduled three photo sessions in different seasons. We had people go to 43 locations all over India. They shot 20TB of photographs. From those, we could do photogrammetry and start to build our sets. We built 1,200 assets, more or less, for this show.”

Although Ferrara was initially worried about building the vegetation, she and the team discovered that sets with rock formations were more challenging. “You can hide imperfections in the chaos of a jungle,” she says. “Your eye looks at it as a whole. But with a big rock formation, you analyze the information. Rocks were unforgiving.”

For rocks, the team typically used photogrammetry for the structure and photos for the textures. The more complex shapes of the plants, though, meant modelers sculpted those by hand, working in Autodesk’s Maya, IDV’s SpeedTree vegetation software, and Pixologic’s ZBrush.

“The challenge was to glue all those assets together,” Ferrara says. “You can’t just place one tree on one rock on one piece of ground. We had to add details to glue all the components together. Dead leaves, moss, dust, pebbles in the little cracks between a root and a rock.”

Although artists placed most of the assets by hand, a custom procedural tool helped scatter small elements, like twigs and leaves, on the ground, and MPC’s Furtility tool grew moss and grass. 

“We could place the moss on the trunks, and it would grow toward the north of the scene, just like in real life,” Ferrara says.
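
The idea behind that kind of procedural placement can be sketched in a few lines. The toy below is purely illustrative – it is not MPC’s Furtility, and every name in it is hypothetical. It scatters moss samples over a trunk and keeps each sample with a probability weighted by how closely the surface normal faces a chosen “north” direction, so the moss naturally clusters on the north side:

```python
import random
import math

NORTH = (0.0, 1.0, 0.0)  # stand-in "north" direction for the scene

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def scatter_moss(surface_points, density=0.5, seed=7):
    """surface_points: list of (position, unit_normal) samples on a trunk.
    A sample survives with probability proportional to how much its
    normal faces NORTH, so moss clusters on the north side."""
    rng = random.Random(seed)
    placed = []
    for pos, normal in surface_points:
        facing = max(0.0, dot(normal, NORTH))  # zero on the south side
        if rng.random() < density * facing:
            placed.append(pos)
    return placed

# A toy trunk cross-section: points on a unit circle, normals outward.
samples = []
for i in range(1000):
    a = 2 * math.pi * i / 1000
    n = (math.cos(a), math.sin(a), 0.0)
    samples.append((n, n))  # position == normal on a unit circle

moss = scatter_moss(samples)
```

Weighting a random cull by a dot product like this is a common trick for any directional growth effect (snow on upward faces, lichen on windward faces), with the production tool layering real growth simulation on top.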

The team occasionally used 2.5D cards in the background, but only the very far background, and then, seldom. 

“Most often, we used brute force,” Ferrara says. “This is stereo, so everything had to be perfect and make sense in the space. We sculpted every blade of grass and modeled every single polygon in the scene. When Shere Khan jumps on Mowgli, every single blade of grass in that shot was modeled.”

The technical animators on the team managed the interaction between the characters and elements in the environment. 

“Almost every plant had a rig for the techanim team to simulate wind, rain, and fire,” Ferrara says, “and then we had fancier rigs for trees and branches the actor and the characters touched and interacted with. Character interaction required a big collaboration between animators and environment artists to make sure everything was perfect and that we tracked all the changes.”

Prior to beginning work on the project, Ferrara had gone to Los Angeles to collaborate with Glass and afterward decided to implement the art department’s on-set methods for the CG team, as well.

“I realized that I needed to use the same process,” Ferrara says. “Select the plants. Put them in place. Think about the action and what might happen in the scene. Build modules so we could change position, rotation, and heights if we wanted and if Mowgli interacted with them or not.”

When the environments included water, MPC’s simulation artists turned to Side Effects Software’s Houdini, rendering the shots with RenderMan RIS. 

“There was quite a bit of water,” Legato says. “But this new RenderMan re-creates what real water does. How much you see through it. What’s reflected and refracted. When you see a shot with water, the water makes you believe the scene is real, all by itself. The eddies, the breeze on it, the flow all feels real. We had some frames that took 40 hours to render, but the result is photographic.”

Helping to make all this – the believable characters and the realistic backgrounds – possible was a reasonable schedule. 

“I’m glad Jon [Favreau] chose to make a movie of this length [90 minutes],” Valdez says. “We could focus on making each sequence the best it could be in terms of storytelling and the quality of the finish. We didn’t just blast out a movie as fast as we could. The work feels good, and it’s engaging. It rides that balance between naked photorealism and an enhanced, stylized, lyrical tone. It’s a bit different and a bit special.”

“It wasn’t easy,” Legato says. “But the achievement was high.”  


Artists at Weta Digital created one main sequence in the film. The sequence begins in the forest. Mowgli is on a perch, until, suddenly, he’s captured by monkeys that drag him to the temple of King Louie and his minions. A confrontation there leads to a song and dance moment. Mowgli escapes. And King Louie comes to his demise as the temple falls to ruins.

Dan Lemmon was visual effects supervisor when Weta Digital joined the production in early fall 2014. Keith Miller took over as supervisor a year later, as Lemmon moved on to preproduction for the third Planet of the Apes film.

“On most shows,” Miller says, “we integrate a CG character into a live-action environment. This show was unusual. We have a single live-action element, Mowgli, that we integrate into a full-CG world.”

The kidnap moment is a perfect example. 

“It’s fun to look at the plates,” Miller says. “It’s a bluescreen environment with 20 or more stagehands in bluescreen suits passing Mowgli overhead like a rock star who’s crowd surfing.”

To create the final shots, Weta Digital artists and animators replaced the blue-suited stagehands with monkeys and added a CG jungle with vegetation. The vegetation interacted with Mowgli and with the fire brigade of monkeys passing him along overhead.

“We made minor improvements in our Lumberjack program in terms of the tree dynamics to give animators more control so they could drive the primary interactions,” Miller says. “Then the effects team managed the simulations. We also added instancing to our proprietary renderer, Manuka, to handle the large numbers of heavily detailed trees.”
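
Instancing of the kind Miller describes – many placements of heavy tree geometry sharing one copy of the mesh data – can be illustrated with a toy structure. The class and field names below are hypothetical and have nothing to do with Manuka’s actual internals:

```python
from dataclasses import dataclass

@dataclass
class TreeMesh:
    """Heavy, shared geometry: stored once, however many copies appear."""
    name: str
    vertices: list  # imagine millions of points here

@dataclass
class Instance:
    """A lightweight placement: a transform plus a reference to shared data."""
    mesh: TreeMesh
    translate: tuple
    rotate_y: float = 0.0
    scale: float = 1.0

# One detailed tree, thousands of placements: the renderer walks the
# instances and re-uses the same vertex data for each, so memory grows
# with the number of transforms rather than the number of trees.
banyan = TreeMesh("banyan", vertices=[(0, 0, 0), (0, 10, 0)])
forest = [Instance(banyan, translate=(x * 5.0, 0.0, z * 5.0))
          for x in range(50) for z in range(50)]
```

The payoff is that 2,500 trees cost one mesh plus 2,500 small transforms, which is what makes jungle-scale raytracing tractable.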

The monkeys – King Louie’s minions – were species specific to India: pig-tailed and lion-tailed macaques, zippy langurs, and gibbons. In Disney’s 1967 film, King Louie was an orangutan. In this film, he is an orangutan-like prehistoric ape called a Gigantopithecus, which, unlike orangutans, might have lived in India.

“We didn’t have a lot of information about it,” Miller says. “Fossil records don’t tell us whether the Gigantopithecus was a quadruped or biped, but it was probably 10 feet tall. We received concept art and previs models, and then went through our design process.”

Because Christopher Walken would provide King Louie’s voice, Weta Digital artists sculpted a facial model with geometry matching Walken. Then, using their experience in incorporating Andy Serkis’s facial shapes to create Planet of the Apes’ Caesar, a technique honed in other films as well, they began modifying forms to roughly match the volume and shapes of an orangutan’s face.

“We studied photography, video reference of Walken performing his lines in the sound booth,” Miller says. “We tried to incorporate corners of his mouth. His signature wrinkles. The line above the ball of his chin. It’s an iterative process. You always run the danger of anthropomorphizing to such a degree that you lose the feel of the character and what you would expect an ape to do.”

Animators had video of Walken voicing the dialog for reference, but because he had read the lines in the recording booth, they also turned to reference of the actor in other movies. 

“Sometimes Jon [Favreau] would send selections from films, moments we could use as a template for shots,” Miller says.

For King Louie’s body performance, the crew motion-captured Director Favreau playing the part, but the combination of Favreau’s body performance and Walken’s facial performance didn’t, as Miller puts it, “play nicely together.”

“We tried to find pieces that worked to craft a new performance,” Miller says. “Jon [Favreau] gave us a run-through of the scenes to establish a tone, so we could pick out actions here and there. But, this was used only as reference for animators. Any pieces we wanted to use were more easily keyframed.”

Song and Dance

Animators also used keyframing rather than motion capture for King Louie’s minions. As it turns out, the motion-capture techniques the crew had used for Planet of the Apes were not appropriate for the monkeys in this film. 

“The chimpanzees in Apes do knuckle walking that you can mimic with humans,” Miller says. “But our monkeys had quick direction changes that didn’t lend themselves to human performances. With Apes, we could capture a performance in camera on set during live action, so the director could craft the performance and it was a good jumping-off point for animators. For this film, the animators spent more time exploring and crafting the performances and assembling components into a cohesive piece.”

That process was especially intense for a sequence in King Louie’s throne room during which as many as 30 monkeys took part in a song and dance routine.

“They were jumping around and bouncing,” Miller explains. “So, we had a lot of complex keyframe work. Jon Favreau wanted a sense of progression; he wanted us to whip up a frenzy, to continue ratcheting up the action as the song goes on. We tried to do motion capture, but the complexity of the motion was more suited to keyframe.”

In fact, originally the sequence had been planned without a song and dance component, and with King Louie taking a harsher tone.

“It was leaning toward too dark and menacing,” Miller says. “And the test audiences missed the song. Because it was added later, we didn’t have any reference from Favreau. We had reference of Walken reading lines in the sound booth, but it was a big shift from dark and menacing to song and dance. We preserved the facial expressions we had as much as we could, and animated our way through it. It was a bit of work, but it was fun.”


The filmmakers shot Mowgli [Neel Sethi] on the bluescreen stage in a set dressed to match the throne room designed by Glass, but the song and dance sequence was, for the most part, entirely CG. A handful of plates were salvaged for Mowgli elements, while a digital-double Mowgli helped fill out the rest. The entire temple behind the set, inside and out, was CG.

“Our Reference Photographer Matt Mueller went to seven different temples in India and shot tens of thousands of photographs of buildings and the monkeys there,” Miller says. “That really helped us understand the construction and the weathering geometrically and texturally. We worked those details into the structure.”

Because King Louie’s temple would be destroyed, modelers considered the building’s demise as they constructed the complex system of passages and tunnels within the temple, as well as the overall geometry.

“We used a lot of modular pieces,” Miller says. “We might assemble hero stone blocks into a column, for example. Sometimes we had a shell piece that the effects team would fracture. But the collapse needed to be art-directed, and we never knew where the stress would cause fractures and how the forms would collapse until we got into it.”

When King Louie or other characters caused destruction by pushing on areas of the building, animators blocked in large forms to indicate timing and direction. Otherwise, the effects team started the action and, by using a rigid-body dynamics solver, would collapse the temple according to the laws of physics.
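
The division of labor Miller describes – animation blocks in the kickoff, then a solver carries the collapse forward under physics – mirrors the basic structure of any rigid-body step. The sketch below is a deliberately crude toy (explicit Euler, one axis, a flat ground plane), not Weta’s solver:

```python
# Toy rigid-body fragments falling under gravity onto a ground plane.
# Animators would supply the initial positions and velocities (the
# "push"); the solver integrates forward from there.
GRAVITY = -9.8
DT = 1.0 / 24.0  # one film frame

def step(fragments):
    """Advance every fragment one frame with explicit Euler integration."""
    for f in fragments:
        f["vy"] += GRAVITY * DT
        f["y"] += f["vy"] * DT
        if f["y"] < 0.0:          # crude ground collision
            f["y"] = 0.0
            f["vy"] *= -0.3       # lose most energy on impact

# Stone blocks stacked at the top of a column, released by the animation.
fragments = [{"y": 10.0 + i, "vy": 0.0} for i in range(5)]
for _ in range(240):  # ten seconds of collapse at 24 fps
    step(fragments)
```

A production pipeline replaces every piece of this – full 6-DOF bodies, convex collision shapes, stacking constraints – but the hand-off point is the same: once the art-directed impulse is applied, the motion comes from integration, not keyframes.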

“Postdestruction, we might swap out pieces when the result of geometry from the rigid-body dynamics was simpler than we wanted,” Miller says. “We wanted to be sure we were looking at detailed pieces of ruins.”

The reason Favreau would turn to Weta Digital for a sequence involving apes and requiring detailed environmental work is obvious. Even so, the work proved more interesting for the crew of artists and animators than they might have expected.

“Animating the monkeys was time-consuming,” Miller says. “But, it was fun to have that control and to explore ideas and changes as we went. It was exciting; a growing process for everyone.”