'Lion King' from a Compositor's and Lighter's Perspective
September 26, 2019

A lot of work went into the making of the groundbreaking The Lion King. While CGW covered the CG aspects quite heavily, we simply did not have the room to address some key aspects of the work. So here we present further information about the making of this film, in this interview with Gianluca Dentici, senior compositor key artist at MPC London, and Max Centra, senior lighting TD at MPC London.

When and how was the production of The Lion King born? 

After the incredible breakthrough of The Jungle Book, and even before the Academy Award ceremony where that film won the Oscar for Best Visual Effects, Disney had already started working with director Jon Favreau, visual effects supervisor Rob Legato and Adam Valdez on The Lion King, trying once again to push the level of realism even further.

The first phase was, of course, referencing the original 1994 film, since Disney wanted to employ the same directorial style; in fact, in the new remake, viewers will spot some of the more iconic shots many of us grew up with.

The next goal was establishing a suitable aesthetic look while keeping the realism; therefore, visiting the locations where the story is set was crucial. The production organized a location scout in Kenya to visit those places, observe animals in their natural habitat and acquire the visual references the artists would use to re-create the characters' look, study their movements and rebuild them in the digital realm; the environment and lighting teams used the information as well.

The team also captured many other details, including different types of leaves, plants and trees, all in HDR, to see how the different surfaces react to different lighting conditions. In addition, 360-degree HDR shots of the skies and sun were captured to build a core library of lighting situations and backgrounds.
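To illustrate the kind of capture described above, here is a minimal sketch of how bracketed exposures can be merged into a single HDR image with OpenCV. The file names, exposure times and the Debevec calibration step are assumptions for illustration, not MPC's actual capture pipeline.

```python
# Minimal sketch: merge bracketed exposures into an HDR image with OpenCV.
# File names and exposure times below are placeholders, not production data.
import cv2
import numpy as np

# Bracketed exposures of the same subject (e.g., a leaf or a sky tile).
exposure_times = np.array([1/1000.0, 1/250.0, 1/60.0, 1/15.0], dtype=np.float32)
images = [cv2.imread(f"bracket_{i}.jpg") for i in range(4)]

# Recover the camera response curve, then merge the brackets into a
# linear-radiance HDR image that can drive image-based lighting.
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(images, exposure_times)

merge = cv2.createMergeDebevec()
hdr = merge.process(images, exposure_times, response)

cv2.imwrite("surface_reference.hdr", hdr)  # Radiance .hdr preserves float values
```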

The trip to Kenya, however, was not only about finding visual references; it was also used to carry out photogrammetric surveys for rebuilding the CG environments. Photogrammetry is a technique for reconstructing three-dimensional environments by processing multiple photographs taken from different points of view. For The Lion King, MPC used this technique to scan Mount Kenya and its surroundings, flying a helicopter at 17,000 feet.
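As a rough illustration of the principle only (MPC's production pipeline works at a far larger scale, with many views and bundle adjustment), here is a minimal two-view photogrammetry sketch in Python with OpenCV; the image paths and camera intrinsics are assumed values.

```python
# Two-view photogrammetry sketch: match features between two photographs,
# recover the relative camera pose and triangulate sparse 3D points.
import cv2
import numpy as np

img1 = cv2.imread("aerial_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("aerial_002.jpg", cv2.IMREAD_GRAYSCALE)

# Detect and match SIFT features between the two photographs.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe's ratio test

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Assumed camera intrinsics (focal length, principal point) for illustration.
K = np.array([[3000.0, 0, img1.shape[1] / 2],
              [0, 3000.0, img1.shape[0] / 2],
              [0, 0, 1.0]])

# Estimate the essential matrix, recover relative pose, triangulate points.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
points_4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points_3d = (points_4d[:3] / points_4d[3]).T  # sparse point cloud of the terrain
print(f"Triangulated {len(points_3d)} points")
```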

In all, this photo-scouting work saw six photographers and four video operators traveling between the north and south of Kenya, covering a total of 18,000 km. The crew returned to the same locations several times to photograph the same environments under different lighting and weather conditions, and to observe the behavior and growth of the surrounding vegetation.


Cinematographer Caleb Deschanel.

When did the "virtual filming" begin, and what does shooting a film using virtual reality equipment really mean?

At the moment, The Lion King probably represents the largest production ever made using virtual reality as a tool for navigating an immersive shooting environment. After meticulous pre-production work, the first low-resolution environments and characters were created and delivered for the virtual filming. This began in June 2017, when Adam Valdez, MPC's visual effects supervisor, Rob Legato (VFX supervisor for Disney) and cinematographer Caleb Deschanel, along with Unity technicians and specialists from other game engines, gathered in a Los Angeles studio to put on VR headsets. Through virtual reality, it was possible to immerse them all in the digital environments created by MPC and observe them as if walking through a real location scout.

At this point, the director and his collaborators were able to define the shooting setups (camera position, lenses and camera moves) while observing the animation of the characters already playing within the virtual environment. All the traditional shooting equipment, such as tripods, dollies, drones and cranes, was "encoded," which means it was also part of the virtual environment, so that each physical move corresponded to a change in the shooting point of view within the virtual world. During the filmmaking process, the director could check the shot on the set monitor and make changes of various kinds, which was only possible thanks to virtual cinematography tools such as Unity.
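To make the "encoded" equipment idea concrete, here is a minimal, hypothetical sketch of how per-frame readings from a physical dolly, crane head and lens encoder might be mapped onto a virtual camera pose. The class names, fields and geometry are illustrative assumptions, not the actual production rig or its Unity integration.

```python
# Sketch of the "encoded equipment" idea: encoder readings from physical gear
# drive a virtual camera transform each frame. Names and geometry are illustrative.
from dataclasses import dataclass
import math

@dataclass
class EncoderReading:
    dolly_m: float      # position along the track, meters
    boom_deg: float     # crane boom angle
    pan_deg: float      # fluid-head pan
    tilt_deg: float     # fluid-head tilt
    focal_mm: float     # focal length from the lens encoder

@dataclass
class VirtualCamera:
    position: tuple     # (x, y, z) in scene units
    rotation: tuple     # (pan, tilt, roll) in degrees
    focal_mm: float

def encoder_to_camera(r: EncoderReading, boom_len_m: float = 4.0) -> VirtualCamera:
    """Convert one frame of physical encoder data into a virtual camera pose."""
    # The boom arm raises or lowers the camera around the crane pivot.
    height = boom_len_m * math.sin(math.radians(r.boom_deg))
    reach = boom_len_m * math.cos(math.radians(r.boom_deg))
    return VirtualCamera(
        position=(r.dolly_m, height, reach),
        rotation=(r.pan_deg, r.tilt_deg, 0.0),
        focal_mm=r.focal_mm,
    )

# Example: one sampled frame from the set hardware.
frame = EncoderReading(dolly_m=2.5, boom_deg=15.0, pan_deg=-30.0, tilt_deg=5.0, focal_mm=35.0)
print(encoder_to_camera(frame))
```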

Basically, it was like shooting on a real set with flesh-and-blood actors, but in a controlled environment and with the possibility of repeating takes as many times as they wanted or changing the shooting setup on the fly, in a short time. Among other things, the performance remained exactly the same for each take, so it was possible to build up the shots with much greater precision.

DP Deschanel also benefited from the virtual environment, as he was able to carry out tests using different virtual lighting conditions. One of the first tests featured the Rafiki character under six different lighting conditions, in an attempt to find the most appropriate and natural look, one that would be plausible in the real location in Africa.

How did the creation of visual effects take place? What were the more crucial steps?

While Adam Valdez was in Los Angeles working along with the virtual production team, VFX supervisor Elliot Newman at MPC London started the long process of creating assets (characters, environments, animations) needed for shooting the scenes with the virtual system in L.A. 

For The Lion King, a sort of 'evolving' workflow was employed, so the assets were constantly being improved and enriched with more detail on animals, environments and so on; at that point an asset could be sent back to the virtual cinematography crew in L.A. with more elements than it had before, giving the director more cues and options for a potential re-shoot.

How did the work of creating the necessary assets for the film take place?

The creation of environments and animals required significant R&D on specific tools to speed up the building of the many elements featured in the show. MPC managed a huge amount of asset work across characters and environments.

Among these: 17 main characters, 63 unique species (in 365 variations, from zebras to anteaters), and in total 9,063 animated characters and 31,421 virtual crowd actors. As for environments, the work was truly massive and was inspired by the reference material gathered in Kenya; MPC created 66 natural environments covering a total area of 150 square km, about 11 times the city of Los Angeles! These environments were finally populated by a real ecosystem of 921 species of plants, trees and flowers, not even counting the different variations of each type!


What was your experience on The Lion King?

From the first day, we were told the film's look was going to be more like a documentary than that of a standard full-CG feature, but honestly I didn't expect we could go that far!

I worked in the Compositing department as a senior compositor key artist, and our task was to create the final shot by putting together all the characters, settings and effects coming from the other departments and make it look photorealistic. I must say that when I received the CG character renders from the lighting department on the very first shot, I was amazed by their realism! The characters themselves, their eyes, the fur were really impressive, and when a compositor gets such high-quality material to start with, the work is much more exciting and pleasing.

We were divided into two, and at some point three, compositing units based on sequence and environment criteria. I worked in the second unit, which was responsible, among other things, for the nighttime and atmospheric shots, like those inside the hyenas' grotto. I must say our task was very complex, perhaps even more so than the other units', given that when you are dealing with scenes in penumbra, you have to keep as much detail as possible while at the same time creating a typical Disney-style atmosphere in a realistic manner, without revealing too much; otherwise it would break the lighting work. It is therefore a matter of delicately balancing contrast, light and shadow, which often requires some testing time before reaching the desired result.

Valdez was often with us to tackle shots and define the visual look for the most complicated sequences.

I remember that when we worked on such shots, the point was that the only plausible light in a forest at night is that of the moon; therefore, you must be able to show the most relevant elements of the shot and their details without overdoing it to the point where the moonlight looks like a giant light reflector. It is a complicated phase that demands attention, time and interaction between several artists.

The film was made in stereo, which means that for each shot a double stream of images had to be produced: one for the left eye and one for the right. Technically, this could mean doubling the rendering time for the CG departments, while for us in compositing it added a further step to overcome before reaching final approval.

In fact, as soon as a shot was approved by the internal supervisors, by Valdez and Legato and, ultimately, by Jon Favreau, the second eye was carried out; this final stage is critical, and occasionally unforeseen issues come up.

Some areas of the shot in 3D space might turn out to be very extreme and intrude into the audience space; even a tree branch or just a leaf could come too far out of the screen. In those cases, we had to work with the stereo supervisor to change the convergence of the stereo stream, which translates into shifting the scene's stereo space.
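As a simple illustration of what a convergence change amounts to (not the actual stereo tool used on the show), here is a sketch of a horizontal image translation applied to one eye, which shifts the whole scene's apparent depth relative to the screen plane.

```python
# Sketch of adjusting stereo convergence with a horizontal image translation (HIT):
# shifting one eye's image horizontally moves the scene forward or back relative
# to the screen plane. With the convention used here, a positive shift pushes the
# scene away from the viewer. Illustrative only; frame sizes are dummies.
import numpy as np

def shift_convergence(left: np.ndarray, right: np.ndarray, shift_px: int):
    """Shift the right-eye image horizontally by shift_px pixels."""
    shifted = np.zeros_like(right)
    if shift_px > 0:
        shifted[:, shift_px:] = right[:, :-shift_px]
    elif shift_px < 0:
        shifted[:, :shift_px] = right[:, -shift_px:]
    else:
        shifted = right.copy()
    return left, shifted

# Example with dummy frames: push an element that pokes too far into audience
# space back toward the screen plane by a few pixels of convergence change.
h, w = 1080, 1920
left_eye = np.random.rand(h, w, 3).astype(np.float32)
right_eye = np.random.rand(h, w, 3).astype(np.float32)
left_eye, right_eye = shift_convergence(left_eye, right_eye, shift_px=6)
```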

Sometimes even strong light spots and small highlights can cause an annoying "ghosting" effect if they sit on top of high-contrast areas; essentially, those points appear shifted to one side and doubled, so in those cases, too, we had to intervene to minimize the problem.
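One common way to tame such hot spots, sketched below purely as an illustration with assumed thresholds, is to roll off very bright values with a soft knee so the high-contrast peaks that trigger ghosting are compressed.

```python
# Sketch: soften tiny, very bright highlights by compressing linear-light values
# above a knee toward a ceiling, instead of leaving hard high-contrast peaks.
# Threshold values are illustrative, not the production settings.
import numpy as np

def soft_clip_highlights(img: np.ndarray, knee: float = 4.0, ceiling: float = 8.0):
    """Compress values above `knee` so they smoothly approach `ceiling`."""
    out = img.copy()
    hot = img > knee
    # Exponential roll-off: continuous at the knee, asymptotic to the ceiling.
    out[hot] = ceiling - (ceiling - knee) * np.exp(-(img[hot] - knee) / (ceiling - knee))
    return out

# Example on a dummy linear-light frame with one extremely hot specular ping.
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
frame[500:505, 900:905] = 40.0  # tiny, extremely bright highlight
softened = soft_clip_highlights(frame)
print(frame.max(), softened.max())  # the peak is compressed toward the ceiling
```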

You also worked on The Jungle Book at MPC. What are the main differences you found working on the two films?

Well, look-wise, the qualitative leap of the computer graphics – which already had a very high profile on The Jungle Book – was immediately obvious! Besides, on The Lion King we didn't have any live-action characters, so we saved a lot of time by not having to pull any bluescreen keys. The same goes for the stereoscopic part: on The Jungle Book, one of the difficulties was precisely blending the stereo CG work with the live-action stereo setup, which required the Stereo department to work rigorously on 'Triage,' a delicate phase of retouching and improving the stereo live footage before it could be composited with the CG.

Another substantial difference I noticed was the incredible work done by the animation department. All the characters move with impressive realism throughout the entire film: expressions and secondary actions on the bodies, and small movements like the typical slight twitch of a lion's ear when it tries to shake off a gnat, make the characters even more credible and alive.


Can you give us some numbers on the workflow and the approval phase for the shots?

Well, speaking on behalf of the Compositing department, it was calculated that we sent 6,182 compositing versions, and considering the show counts 1,490 shots in total, it means that each shot was reviewed on average 4.1 times before reaching final approval.

Across all the other departments, from Animation to Environment, 847,013 versions were produced, equivalent to 46 days of visual material, of which only 3.6% was sent to Disney, the supervisors and the director for review.

Some other fun facts: there are 676,578 flies and insects in the film, and about 100 billion tufts of grass were generated, whose dynamic simulation was handled by PaX, a proprietary MPC system for dynamic simulation of virtual environments.

The production involved 1,250 artists: 650 in London, 550 in Bangalore and 50 in Los Angeles.

Is there an anecdote that you remember with particular affection?

The making of the film was an incredible life experience that united and amazed all the artists who were part of the creative process! Every artist was committed to doing his or her best, and all those efforts really paid off in achieving realistic-looking images of a kind nobody had seen before. During the crew screening organized by MPC at the IMAX cinema in London, we all shared a spontaneous and contagious explosion of joy at the end of the film while our credits rolled on the screen.

We went back home with the memory of an unprecedented experience and a posh branded bag containing a Lion King agenda, specially designed by Disney and given to each artist.