For the Birds: Digital Flocks in 'Noah'
April 23, 2014

The feature film Noah is an epic tale inspired by the biblical story of Noah’s Ark and the Great Flood. Directed by Darren Aronofsky, the movie contains many visual effects, including a menagerie of animals, a deluge of water, and a digital ark that is used in place of the practical piece, especially in the apocalyptic scenes. While ILM took on the majority of the work – particularly the water and the animal pairs herded into the ark – there were a number of other effects handled by Look Effects.

Here, Look Effects’ Dan Schrecker, VFX supervisor, and Dave Zeevalk, 3D supervisor, discuss the studio’s work on Noah with CGW Chief Editor Karen Moltenbrey.

The Noah trailer is available in the CGW video section.

What type of work did you do on Noah?

The biggest ticket item was the birds, but we also did some antediluvian (before the flood) environments and set extensions, the Garden of Eden, a little bit of CG water work, and a number of shots in the Rivulet sequence.


Let’s start with the birds. Can you describe that work?

We created every bird in the film except for the three practical birds: two doves and a raven. The challenges we faced in completing this work were numerous, both technical and creative. We worked with Darren [Aronofsky] on Black Swan and did the final sequence in which Nina sprouts wings and turns into a swan, so we were designated as the bird guys. Needless to say, there is a vast difference between one pair of wings and building multiple shots with two of every kind of bird on Earth.     

To complete the work, we had to build two different systems from the ground up: a flocking system and a feather system. In addition, a tremendous amount of work went into our infrastructure in terms of managing our pipeline to deal with so much data. Noah was the biggest show we had ever done and consisted of work that was more complex and challenging than anything we had done before, so, as a facility, it required us to take a fresh look at many of the things we had done prior.

Starting with the design of the birds, we had to figure out ways to create the tremendous volume required to give the appearance that Noah has collected two of every bird on Earth. The art department gave us a number of hero birds to build from, and we ended up with 14 different body types. From each of those, we created a number of unique plumages, which we called ‘grooms.’ By varying overall color and size, we were able to give the impression of tremendous diversity from a relatively small number of base birds.

To make sure we had the right amount of verisimilitude throughout the process, we did extensive research early on. In addition to countless viewings of Winged Migration, we also took a field trip to The Edna Lawrence Nature Lab at The Rhode Island School of Design. This gave our team access to real bird skeletons and feathers, and allowed an up-close look at what it was we had to build. Later in the process, the producer of the film, Ari Handel, put us in touch with Ben Parslew, a research associate at The University of Manchester in the UK. Affectionately referred to as ‘the Bird Nerd,’ Parslew was invaluable in helping ensure that the flight animations of our hero doves were scientifically accurate.

We had two approaches for our birds that we could use, depending on the shot, in some cases combining the two. 

For shots with more distant birds, we used our flocking system, which allowed us to populate up to two million birds. As the camera gets closer, we hand-animated ‘hero’ birds, which we used in tight spaces and when a finer level of detail was required in their performance. All told, our animators worked on over 3,000 individual, hand-animated birds, beginning with key ranges of motion that were fed into the flocking system; these also served as a starting point for our hero birds.

Another key component of our bird work was our use of Deep compositing. Used for the first time on a large scale for a live-action VFX project by Weta on Rise of the Planet of the Apes, Deep compositing allows compositors to work in 3D space in a way that provides added flexibility and functionality that would not be available in a normal 2D compositing approach. Simply put, without our use of Deep compositing, we would not have been able to do this work. It was the basis of how we put together our most complex shots and was one of the first decisions we made. Going forward, it made the bulk of our task easier, especially later in the process when changes that might have otherwise been major were handled more easily due to this approach.
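As a rough illustration of the mechanism, a deep pixel can be sketched as a depth-sorted list of samples that is flattened only at the very end. This is a toy in Python, not Look’s or Weta’s actual pipeline, and the DeepSample, merge_deep, and flatten names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class DeepSample:
    depth: float   # distance from camera
    rgb: tuple     # premultiplied color
    alpha: float   # opacity of this sample

def merge_deep(*pixels):
    """Pool samples from separate deep elements (individual birds,
    roto mattes for a practical set) and sort them by depth."""
    samples = [s for px in pixels for s in px]
    return sorted(samples, key=lambda s: s.depth)

def flatten(deep_pixel):
    """Composite front-to-back with the 'over' operator to produce a
    flat RGBA value. Because occlusion is resolved per sample in
    depth, one element can pass both in front of and behind others
    without hand-built holdout mattes."""
    r = g = b = a = 0.0
    for s in deep_pixel:   # already sorted near to far
        w = 1.0 - a        # remaining transmittance
        r += s.rgb[0] * w
        g += s.rgb[1] * w
        b += s.rgb[2] * w
        a += s.alpha * w
    return (r, g, b, a)

# Pulling one element out of the comp late in the process is just a
# matter of dropping its samples before flattening.
```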

A good example of how this was invaluable can be seen in our biggest shot in the film, in which Noah and his family walk down the aisles of bird roosts and put the flock to sleep with their magic smoke. Because of the overlapping nature of the bird roosts and the over 1,800 hand-animated hero birds, we needed to find a way to put it all together while maintaining flexibility throughout the process. As the camera dollies across the set and the birds fall asleep in their roosts, we were faced with the challenge of how to properly seat our digital birds into the practical set. Because they were essentially crisscrossing each other during the course of the shot, such that a single bird would appear both in front of and behind other birds and the practical set, standard holdouts were impossible. We relied on Deep techniques to combine the CG elements with the roto mattes that were required for the practical roosts. This allowed us to do things like pull individual birds out of the comp fairly late in the process, stay flexible with color and depth of field without introducing edge issues, and deal with the semi-transparent curtains that were present in the practical set.

In addition, we rendered our elements with full Deep RGB and breakout passes. Normally, the approach is to render Deep opacity and shadow maps because there is less data to manage and, in most scenarios, this is enough. But because of our volume of birds and the amount of flexibility we wanted, we took a more extensive approach. This led to huge amounts of data that we had to manage, but in the end it enabled us to achieve what we set out to do.

Can you expand on the feather system that Look had to develop? 

The feather system itself was made up of a number of parts, including a feather creation tool, a grooming tool, and a deployer. For a variety of reasons, we decided not to use an off-the-shelf feather system and undertook the challenge of building our own. We needed something that would allow us to create a tremendous amount of variety and volume, while also working efficiently and with as much optimization as possible. Our system hit those marks and was a crucial piece of what we did on the film. 

Early on, we realized that we couldn’t use a traditional hair or fur system because of the randomness with which those systems distribute points across a surface. Because the layout of feathers across a bird is not completely random the way hair is, we looked for other ways to apply our feathers, and we ultimately adopted a blue-noise pattern as the basis for our distribution. In this system, every point on a surface has a scale that represents the scale of that feather, and its distance to its neighboring feathers is based on that scale. We wrote our own blue-noise solver that would apply those points on a surface and then resolve any spacing issues. We used multiple techniques to then instance those feathers.
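To make the distribution idea concrete, here is a minimal dart-throwing, variable-radius blue-noise sketch in Python on a flat patch. It is an illustrative stand-in, not Look’s in-house solver (which worked on actual bird surfaces and resolved spacing iteratively), and the function name and parameters are hypothetical:

```python
import math
import random

def scatter_feathers(scales, patch=(1.0, 1.0), tries=30):
    """Toy blue-noise sampler: each candidate point carries a scale
    (the size of the feather it will hold) and is rejected if it lands
    closer to an accepted point than the average of their two scales,
    so larger feathers claim more space."""
    accepted = []  # list of (point, scale)
    for scale in scales:
        for _ in range(tries):
            p = (random.uniform(0, patch[0]), random.uniform(0, patch[1]))
            if all(math.dist(p, q) >= 0.5 * (scale + s) for q, s in accepted):
                accepted.append((p, scale))
                break  # placed; move on to the next feather
    return accepted

# Small feathers pack densely, larger ones sparsely:
points = scatter_feathers([random.uniform(0.01, 0.05) for _ in range(2000)])
```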

The feathers themselves were a series of cards that were created using a photo-real feather-creation tool that we built. This separate application allowed us to define shape, length, and other attributes, starting with a flat, oval-shaped, low-resolution feather. We were able to push points around until we were happy with the general shape of the envelope, at which point the tool would generate the rachis along the spine of the envelope and spawn hundreds of curves to represent the barbs. The artist had fine-grained control over splitting, curl, noise, and all the other various attributes that define the look of a realistic feather. The end result was then rendered to a texture to speed up rendering, as we had too many birds to directly render the curves of every feather.
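A toy version of that envelope-to-barbs step might look like the following Python sketch. The build_feather function and its parameters are hypothetical, and the real tool offered far finer artist control before baking the result to a texture:

```python
import math

def build_feather(length=1.0, width=0.3, barbs=120, curl=0.15):
    """Generate a toy feather as polylines: a rachis (central shaft)
    along the envelope's spine, plus barb curves spawned toward an
    oval envelope edge."""
    rachis = [(0.0, (i / 20.0) * length) for i in range(21)]
    barb_curves = []
    for i in range(barbs):
        t = i / (barbs - 1.0)                 # position along the shaft
        side = -1.0 if i % 2 else 1.0         # alternate left/right
        reach = width * math.sin(math.pi * t) # oval envelope profile
        root = (0.0, t * length)
        tip = (side * reach, t * length + curl * reach)
        barb_curves.append([root, tip])
    return rachis, barb_curves
```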

With our feathers built and our distribution model in place, we commenced grooming the birds. Artists interactively painted maps on birds, which allowed us to control a wide range of feather attributes, including scale, base color, tip color, lift (how much a feather lifts off of the body), and which feather types to use. Our groomers did not have to generate every single feather, instead relying on our grooming tool to instance and blend them together into the desired effect.

Another tool that proved useful was one that allowed artists to draw in areas of finer detail where we needed them. For instance, in the areas surrounding a bird’s eye, we needed a higher density of feathers, and this tool allowed us to achieve that without increasing the resolution of the underlying geometry. A separate tool allowed us to control basic feather direction with a few pen strokes, instead of having to hand-comb each feather.

A final piece of the puzzle was our deployer, a multi-faceted tool responsible for distributing and applying grooms and feathers within each shot. On a macro flock level, the deployer decided which birds received which grooms. 

With our volume of birds, we wanted a way to procedurally assign grooms to individual birds, so our system looked at what types of birds existed in a given shot and programmatically dispersed grooms, taking note of which birds were in proximity so we didn’t end up with two similar birds butting up against one another.
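That macro-level assignment can be sketched as a greedy pass over the flock. This is an illustrative toy, with hypothetical names and distances; the production deployer also matched grooms to each bird’s body type:

```python
import math

def assign_grooms(bird_positions, grooms, min_dist=2.0):
    """Give each bird a groom that no neighbor within min_dist is
    already wearing, cycling through the available grooms. Falls back
    to the last candidate if every groom is taken nearby."""
    assignment = {}
    for i, p in enumerate(bird_positions):
        taken = {assignment[j] for j, q in enumerate(bird_positions[:i])
                 if math.dist(p, q) < min_dist}
        for k in range(len(grooms)):
            candidate = grooms[(i + k) % len(grooms)]
            if candidate not in taken:
                break  # first groom not used by a close neighbor
        assignment[i] = candidate
    return assignment
```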

Digging deeper down to the feather level, our deployer took each groom, which was initially designed in a rest position, and applied it to our animated birds, recalculating feather positions frame by frame to maintain relative angles from one feather to the next. The idea here was that because of the high volume of birds, we couldn’t solve for collisions between individual feathers. Looking at averages, the deployer detected outliers and brought those feathers in line with the others surrounding them. This was essentially a method of avoiding most collisions when a true collision-detection system was out of reach for the scale on which we were working. Other tasks handled by the deployer included caching feather geometry, which was then instanced at render time; communicating with [Side Effects’] Houdini to define exactly what needed rendering (that is, culling feathers that weren’t visible to camera); and applying static and dynamic noise to feathers for ruffling and air resistance as they flew.
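The outlier heuristic can be sketched in a few lines of Python. The threshold and blend values here are illustrative only; the production deployer worked per frame on full grooms:

```python
def tame_outliers(lift_angles, neighbors, threshold=0.35, blend=0.5):
    """Compare each feather's lift angle (radians) to the average of
    its neighbors and blend outliers back toward that average, as a
    cheap substitute for true feather-vs-feather collision detection.
    'neighbors' maps a feather index to a list of nearby indices."""
    result = list(lift_angles)
    for i, nbrs in neighbors.items():
        if not nbrs:
            continue
        avg = sum(lift_angles[j] for j in nbrs) / len(nbrs)
        if abs(lift_angles[i] - avg) > threshold:
            # pull the stray feather back in line with its neighbors
            result[i] = (1.0 - blend) * lift_angles[i] + blend * avg
    return result
```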


What about the flocking system?

The second major system we built was our flocking system. As with everything we did, we looked for ways to maintain flexibility while also keeping in mind the scale of the task at hand and how we could work most efficiently. We started with a series of loopable animation cycles and progressed to different ranges of motion, building a library to draw from.  

Because we had a number of shots that required flocking, we created multiple ways to generate a flock itself. For shots outside the ark, we were able to work more loosely, generally using curves to drive direction. Inside the ark, we followed a 3D volumetric approach, using the geometry of the interior to confine the movement of the flocking birds. The system would then choose which animation cycle to use and when to transition, based on a bird’s motion in the world. In order to prevent collisions, each bird had a field of view of around 120 degrees. Anything outside of that range was ignored, and as birds moved around and past each other, they would constantly update and recalculate their flight paths based on what they ‘saw’ in front of them. This flocking AI was the backbone of our system.
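A minimal sketch of that field-of-view test and the avoidance it drives might look like this in Python, kept 2D and greatly simplified relative to the production system; the function names are hypothetical:

```python
import math

def sees(pos, heading, other, fov_deg=120.0):
    """True if 'other' falls inside this bird's field of view.
    'heading' is assumed to be a unit vector."""
    dx, dy = other[0] - pos[0], other[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return False
    cos_to_other = (dx * heading[0] + dy * heading[1]) / dist
    return cos_to_other >= math.cos(math.radians(fov_deg / 2.0))

def avoidance(pos, heading, flock, radius=1.0):
    """Push away from visible neighbors that are too close. A full
    flocking system layers alignment, cohesion, mate-pairing, and
    animation-cycle selection on top; only the FOV-driven avoidance
    step is shown here."""
    push_x = push_y = 0.0
    for other in flock:
        if sees(pos, heading, other):
            dx, dy = pos[0] - other[0], pos[1] - other[1]
            d = math.hypot(dx, dy)
            if 0.0 < d < radius:
                push_x += dx / d  # unit push away from the neighbor
                push_y += dy / d
    return (push_x, push_y)       # added to velocity each step
```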

Birds were then instanced to create the full count we needed for each shot. In addition, every bird would spawn a mate, and these would stay in close proximity in order to fly two-by-two as required by the story.

Another key piece of the system was a method to give the appearance of feathers without the added data of actually feathering all of the flocking birds. We were able to generate shaders based on each of our grooms and apply them to the flocking birds. This made them appear consistent with our hero birds while saving a great deal of processing that their distance from camera made unnecessary.

Did Look do any set extensions?

We did a number of set extensions inside the ark to give the impression of more depth. The practical set was fairly large, but not as big as was required for the story. As a result, we used on-set reference, plate photography, and LiDAR scans to build out the environment. This was also key for us when we needed our birds to fly through the ark upon their arrival.

In addition, we did a number of digital matte paintings in the beginning of the world, showing how mankind had laid waste to their environment. Using plates shot in Iceland, we added burned-out cities, toxic pools, and other environmental damage to help sell the idea that humans had not taken care of all that the Creator had entrusted to them.


Water, of course, is important to the story. Did Look have a hand in any digital water for the film?

We did only a handful of water shots; the majority were handled by ILM. We built a few CG oceans, but they were mainly calm seas. We used a variety of tools to create those, ranging from full fluid simulations to simpler particle sims and even displaced geometry, depending on what was called for in the shot.

One sequence that is worth mentioning is the Rivulet sequence, in which a magic stream emanates from Noah’s camp and spreads out across the land, leading the animals to the ark. This proved to be a challenging sequence from a conceptual standpoint because we needed to sell a couple of ideas: that time was passing and that the rivulets were branching out from their origin point in the forest. We initially had time-lapse plates to work with, but as the concept of the sequence changed, we ended up creating a fully digital environment. To show the passage of time, we assembled a series of sunset images for the skies, flashing each one for two frames and animating different elements in each shot to further sell the idea. For instance, in one of the shots, we see a series of buildings crumble before our eyes and piles of trash disappear as they are scavenged by humans who flash through the frame. The entire effect is one of time-lapse photography that foreshadows the work ILM did later in the film, as Noah recounts the tale of Genesis to his family.

So, let’s step into the Garden of Eden. 

Our work in the Garden of Eden consisted of a number of different things. First, we created the Tree of Knowledge, a matte painting broken into different pieces to get some internal movement within the tree. The fruit itself was a fully CG ‘apple’ that was designed to look like a beating heart. We looked at reference of how a heart beats and alluded to it by breaking the fruit up into chambers and staggering the animation accordingly. In order for the actor to pluck the fruit, a practical fruit was hanging from the tree. We simply scaled our CG fruit up a tiny bit to cover the practical one.  

Another aspect of this scene was Adam and Eve. Clearly, how the first humans look in the film is a sensitive issue, so working from concept art, we gave them a glowing look reminiscent of the Tzohar effect we developed for other scenes in the film. This glow was a visual element that carried throughout the film, from Adam and Eve, to the valuable golden rocks, to the inner light of the Watchers.

The snake was the final piece of the scene, a fully CG creature that incorporated two different designs: the green snake that we initially see and the four-eyed black snake that reveals itself as the skin is shed. We used Autodesk Maya to model the snake’s base geometry, and then sculpted and textured it in Mudbox. Look development happened in V-Ray for Maya, using vector displacement for the snake scales and surface detail. We created a fast subsurface scattering shader for the skin and used two-sided materials for the peeling skin. The eyes combined glossy, refractive shaders with additional subsurface materials to achieve the black, inky look that Darren wanted.

The shedding itself was a combination of keyframe animation and a simulation that allowed us to control how the skin peeled off and collapsed as the black snake pulls away. A plug-in called L-Brush allowed us to sculpt the look of the simulation on specific frames. To complete the snake shots, we also created a fully CG environment, complete with rocks, macro-fragment soil (where every bit of grain was placed using a particle scatter), and CG grass that interacts with the snake as it slithers toward camera; we ran multiple dynamic simulations to get believable interaction. Plates shot in Iceland provided HDR reference for us, but ultimately the shots were 100 percent CG.


Did you have to digitally augment any real animals? 

The only real animals in the film were the two doves and the raven mentioned earlier. That doesn’t mean there weren’t other practical beasts, but they were stuffies. The art department and special effects make-up team collaborated to create a menagerie of sleeping animals: mammals, birds, and reptiles. We augmented those by adding breathing, small movements such as twitching ears, and subtle steam vapor rising from the bodies. In this way, we were able to give some life to creatures that would otherwise have been lifeless.

What surprised you most about the work you did on Noah?

The scale of what we had to do.   

What other studios contributed work in Noah?

ILM, an in-house team, and Mister X Gotham.

What was the most challenging aspect of the work you had to do?

Definitely the birds. The only way we were able to overcome the immense challenges before us was to get the right people. We were fortunate to land some really great artists and supervisors. It was a small team, but everyone was very passionate about what we were doing and pushed beyond anything we had ever done prior to working on Noah.