Volume 31, Issue 3 (March 2008)

Making History

Having shown in the film The Day After Tomorrow what the world might look like if global warming melted the polar ice caps, director Roland Emmerich and visual effects supervisor Karen Goulekas turned the clock back to the day before yesterday. A few million yesterdays back, in fact. Their latest collaboration, 10,000 BC, follows the young hunter D’Leh (played by actor Steven Strait), who falls into the role of tribal leader after a mammoth hunt. When slave traders raid his village and kidnap the beautiful Evolet (Camilla Belle), D’Leh leads a band of hunters on a rescue mission, encountering terror birds and a saber-toothed tiger along the way.

Goulekas won a BAFTA award for visual effects that sent walls of water through Manhattan and later froze New York City in The Day After Tomorrow. In this film, the effects center on the animals, all of which are CG, and on shots near the end, set in Giza, in which we see thousands of slaves and a few hundred woolly mammoths building the pyramids. “The historians are going to go nuts,” she laughs.

Goulekas worked solely with London-based effects houses on this film. Of the 641 effects shots in 10,000 BC, Double Negative and The Moving Picture Company (MPC) handled the 300 most complex shots and all the animals. Senate Visual Effects provided digital makeup, shot fixes, and composites for 105 shots, while Machine worked on composites and added weapons to 104 shots. An in-house team took care of wire removals and more composites to finish the remaining shots.

Prior to filming, Goulekas worked with a team of 18 previs artists at Nvizage in Pinewood Studios (UK), some of whom were hired specifically to previsualize the film. The process extended over two years. “The previs was elaborate,” Goulekas says. “I had 14 senior character animators, three asset builders, and an editor.” Some members of the team accompanied the crew on location in New Zealand, South Africa, and Namibia. And later, some of the crew continued working on “postvis,” that is, integrating the previs into the live-action plates, when they returned to London. The postvis was later handed off to the effects studios.

“It was great,” Goulekas says of having postvis in-house. “Of course, the studios did their own blocking for their shots, but they didn’t have to do as many iterations to figure out what we wanted. And, what was also interesting is that I handpicked the animation supervisors from the previs team.” Rob Hemmings joined Double Negative; Greg Fisher joined MPC.

Mammoth Undertaking

The film begins with the mammoths grazing, as hunters sneak up on them through tall grass. By teasing the lead mammoth into charging, the hunters run the beasts into a canyon. There, they trap the trailing mammoth in a net, but the huge animal escapes. The hunters jump onto the net and then fall off, all except D’Leh, who is tangled in the net.

MPC handled this sequence and shots with mammoths during the end sequence. “Because the mammoths are 18 feet tall, they had three-and-a-half-foot-long hair,” says Goulekas. “So, getting the grooming right and dealing with the dynamics was hard, and then making it more complex, the net was made of mammoth fur and bones. So that whole performance of a mammoth with three-and-a-half-foot-long hair interacting with the net was tricky. And, of course, they had ground replacement for a herd of 300 mammoths.”

Special rigs that warned animators when the harnesses stretched too far helped them control the mammoths as the animals aided the sometimes-digital slaves building CG pyramids in Giza.
Nicolas Aithadi led a crew of approximately 150 people at MPC who worked on the studio’s 150 shots for close to 18 months. “The shots were insanely complicated,” he says. “We had at least one mammoth in every one of our shots. We spent between six and eight months designing the fur. We had 15 effects technical directors working just on the grass and dust. About 40 of the shots in the opening sequence are full-CG shots—the mammoths, the environment, the sky. Sometimes the only thing that’s real is the hunter, and sometimes even he is completely CG.”

For the grazing herds, MPC used the studio’s custom crowd-simulation software, Alice, which applied keyframed motion cycles. “We developed a whole library of mammoth behavior,” says Greg Fisher, animation supervisor. “Walking slowly, quickly, uphill, downhill, around corners, grazing, running, galloping, moving their heads, eating food, with variations for males and females at different ages. We had hundreds of cycles.” Once in Alice, the mammoths moved along predetermined paths using these motion cycles.
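The cycle-library approach Fisher describes can be sketched as a nearest-match lookup: each agent on its path picks the pre-animated cycle closest to its current state. Alice is proprietary, so every name and number below is an invented illustration, not MPC's actual system.

```python
# Hypothetical sketch of a motion-cycle library for crowd agents.
# An agent following a predetermined path plays back whichever
# pre-animated cycle best matches its current speed and slope.

def pick_cycle(speed, slope, library):
    """Choose the closest-matching motion cycle for an agent's state."""
    return min(library,
               key=lambda c: abs(c["speed"] - speed) + abs(c["slope"] - slope))["name"]

# Illustrative library entries (speeds in arbitrary units):
library = [
    {"name": "graze",       "speed": 0.0, "slope": 0.0},
    {"name": "walk_slow",   "speed": 1.0, "slope": 0.0},
    {"name": "walk_uphill", "speed": 1.0, "slope": 0.3},
    {"name": "gallop",      "speed": 6.0, "slope": 0.0},
]

print(pick_cycle(0.2, 0.0, library))   # graze
print(pick_cycle(5.5, 0.1, library))   # gallop
```

In a production system the chosen cycles would also be blended at transitions; this sketch only shows the selection step.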

Hemmings, who later became animation supervisor at Double Negative, worked on previs at Nvizage for MPC’s mammoth sequence. “One of the important things we worked out was how fast D’Leh and his friends—that is, the live-action actors—would have to run when they’re running among the mammoths,” he explains.

Even so, because the director wanted the mammoths to gallop, the actors couldn’t run fast enough to keep up. “Each stride of the mammoth covers a big distance, and when they were galloping, they were moving 90 miles per hour,” Aithadi says. So, MPC created digi-doubles for the hunters.

But that wasn’t the biggest problem. The gallop was. Elephants, and presumably mammoths, can’t gallop; they never get airborne because, at their size, they’d crunch their bones when they landed. “Elephants always have one foot on the ground,” MPC animation supervisor Fisher points out. “They’re not the most agile of creatures; they’re almost like a box on stilts. They don’t have much range of motion. At full speed, they’re walking fast or running. But, Roland [Emmerich] wanted them to gallop. So we tried to do something between a gallop and a run that wouldn’t be too unbelievable.”

For the sequence in which the mammoth is caught in the net, MPC had live-action plates of a net on set as reference, but in the final shots, the net is always CG. To animate the beast and the net, the team started with postvis animation of the net. “We knew at what time the mammoth would hit the net and how far it would move,” says Aithadi. From that, animators keyframed the mammoth. Effects TDs simulated the net based on the keyframed mammoth animation using Syflex’s software. “We’d basically throw the mammoth at the cloth,” Aithadi says. “Because we wanted the net to look heavy and not stretchy, we did several simulations. Sometimes, we’d pull and push some elements by hand.”
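The workflow Aithadi describes—keyframe the mammoth first, then simulate the net against it—can be illustrated with a toy damped spring: the animated contact point is an input the simulated net reacts to, lagging behind and then catching up without stretching indefinitely. Syflex's solver is proprietary; every constant below is invented for illustration.

```python
# Toy 1D "net point" driven by a keyframed collider, mimicking the
# pipeline above: animation in, dynamics out. Semi-implicit Euler.

def step_spring_1d(x, v, anchor, k=50.0, damping=10.0, dt=0.01):
    """One integration step of a damped spring pulling x toward anchor."""
    force = k * (anchor - x) - damping * v
    v += force * dt          # update velocity first (semi-implicit Euler)
    x += v * dt              # then position, for stability
    return x, v

# The "mammoth" hits the net at frame 10: the keyframed contact point
# jumps forward, and the simulated net point follows with a heavy,
# non-stretchy feel.
x, v = 0.0, 0.0
for frame in range(200):
    anchor = 1.0 if frame > 10 else 0.0   # keyframed contact position
    x, v = step_spring_1d(x, v, anchor)

print(round(x, 2))   # 1.0
```

Raising the damping relative to the stiffness is what keeps the motion from looking rubbery—the same "heavy, not stretchy" goal Aithadi mentions, and the same reason TDs sometimes pulled elements by hand when the solver alone fell short.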

Hairy Beasts

MPC built a new fur system, Furtility, from scratch for this film. “We had hair that couldn’t be solved with off-the-shelf software,” Aithadi says. “The size of it, the length. We couldn’t do it with Maya hair or any software on the market.”

Furtility uses Autodesk Maya’s interface for its grooming and dynamics tools. “It’s similar to Maya hair in the way we use it,” Aithadi says. The flexible, procedural system works with nodes that can be switched on or off and re-ordered.

“A big problem was that after we groomed the hair, it looked too clean and nice,” Aithadi says. “So, using a combination of textures and geometry objects, we messed it up. The other problem was the length. It complicated every aspect.”

Because the mammoths’ hair is three and a half feet long, MPC developed Furtility, a new fur system, to groom and move it. A hero mammoth had one million hairs, each with 35 vertices.
For example, to make the hair bend, the software uses vertices down the length of the hair to compute positions. The more points, the longer the simulation takes. Because these mammoths had one million hairs and each had 35 vertices, the crew initially looked at render times of 16 hours per frame. “We managed to trim the renders to three or four hours,” Aithadi says, “but some shots, when we had 50 hero mammoths in a shot, still took a full week to render. If you remove the points, though, you get hair that’s not bendy.”
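The figures quoted above make the scale of the problem concrete. Treating cost as roughly proportional to vertex count is a simplifying assumption for illustration, not MPC's actual cost model.

```python
# Back-of-the-envelope arithmetic from the numbers in the article:
# one hero mammoth carried 1,000,000 hairs with 35 vertices each.

hairs_per_mammoth = 1_000_000
vertices_per_hair = 35

vertices_one = hairs_per_mammoth * vertices_per_hair
print(f"{vertices_one:,} hair vertices per hero mammoth")
# 35,000,000 hair vertices per hero mammoth

hero_mammoths = 50   # the worst-case shot Aithadi mentions
print(f"{vertices_one * hero_mammoths:,} vertices in a 50-mammoth shot")
# 1,750,000,000 vertices in a 50-mammoth shot
```

At 1.75 billion simulated points per frame, it is easy to see why trimming 16-hour frame renders down to three or four hours still left week-long shots.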

To control the long hair, the groomers positioned hairs at the base of the fur to control how it would fall, and then let gravity take over. To simulate occlusion, the technical directors largely used shadow maps. “The big clumps of fur needed to shadow each other to see the volume,” Aithadi says. “We rendered a single frame of real ambient occlusion that we used under the fur. Shadow maps projected under the fur and lined up at the same angle as the camera simulated occlusion.”

Similar techniques helped the crew create the tall grass and the furry net. For the grass, MPC implemented special nodes in Furtility to attach seeds to the top of grass blades. For the net, a node allowed a TD to have the hair twist around a core rather than hang. Additional tools made it possible to attach bones, clumps of dirt, and other objects to the net as well as to the mammoth’s fur.

Because the hunters had made the net from mammoth fur, to distinguish one from the other when the two interacted, the crew used shading, lighting, and occlusion. Once again, shadow maps projected under the hair simulated occlusion for the furry net, but the shadows had a second function as well: handling the connection between the net fur and the mammoth hair as the two interacted.

“We used the shadow maps to push the fur from the net into the mammoth,” Aithadi explains. “We did that in a dynamic way using animated maps. At each frame, wherever there were shadows, the fur went into the mammoth hair.” That created the illusion that the mammoth was touching the net.
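The idea Aithadi describes—using an animated map as a per-frame mask that displaces fur—can be reduced to a tiny sketch. The mask here is a boolean list standing in for a shadow-map texture, and the push amount is invented; the production system worked on real render-time maps.

```python
# Illustrative sketch: wherever the animated "shadow map" says the net
# overlaps the mammoth this frame, push the fur down toward the body.

def push_fur(fur_heights, shadow_mask, push=0.25):
    """Lower fur height in every texel the mask marks as shadowed."""
    return [h - push if shadowed else h
            for h, shadowed in zip(fur_heights, shadow_mask)]

heights = [1.0, 1.0, 1.0, 1.0]
mask_frame_12 = [False, True, True, False]   # net overlaps the middle texels
print(push_fur(heights, mask_frame_12))      # [1.0, 0.75, 0.75, 1.0]
```

Because the mask is re-evaluated every frame, the contact region follows the net automatically—which is what sells the illusion that the two fur systems are touching.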

Harnessing the Slaves

The enslaved mammoths that move the pyramid building blocks during the final sequences had less fur because they supposedly had been mistreated. But, they caused an additional problem: Harnesses made of little pieces of rope, metal rings, and leather straps needed to move in a coherent way each time a mammoth moved.

“We’d have four mammoths harnessed together, walking up and down ramps, being whipped by guards as they pull a massive quarry block,” says Fisher. Because the harness couldn’t look elastic, riggers created a rig that warned animators by turning the harness red if they stretched it too far. After the animators created the performance, TDs ran the skin and muscle system and a cloth simulation that added dynamics to the harness.
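A stretch-warning rig like the one described above boils down to comparing the current distance between harness attachment points against a rest length. The threshold and names below are invented for illustration; the production rig drew the warning directly in the animator's viewport.

```python
# Sketch of a harness stretch warning: flag "red" when the distance
# between two animated attachment points exceeds a stretch limit.
import math

def harness_color(p_a, p_b, rest_length, max_stretch=1.05):
    """Return 'red' when the harness is stretched past its limit."""
    strain = math.dist(p_a, p_b) / rest_length
    return "red" if strain > max_stretch else "normal"

print(harness_color((0, 0, 0), (2.0, 0, 0), rest_length=2.0))   # normal
print(harness_color((0, 0, 0), (2.3, 0, 0), rest_length=2.0))   # red
```

The payoff of a warning like this is that animators can fix overstretching during blocking, before the skin, muscle, and cloth simulations are run downstream.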

To create the environment for the final sequences in Giza, MPC started with shots of two pyramids and a long ramp built and filmed in a Namibian desert. “The miniature pyramids are 1/24 scale,” Goulekas says. “They are huge. We plunked them in the Namibian sand dunes and then shot tiles with Spydercams, cranes, everything. We also Lidar’d the set.”

The crew filmed a huge, 1/24-scale miniature of the pyramids built in the Namibian desert. MPC used that footage to texture-map a matching CG model, thus providing flexibility in post.
To fill the space with CG people and the digital mammoths, MPC rebuilt the miniature in CG. “It was the same problem that we had with the grass in the beginning,” Aithadi says. “We realized we had to re-create the whole environment in CG.”

For Giza, the crew used the photographs and the data from the Lidar scans to build a low-resolution CG model of the set, and then texture-mapped the digital model with the photographs. “The miniature was the length of a football pitch,” says Aithadi. “And they had 20,000 blocks along the ramp. But, our work went quickly. We put our blocks on top of the real blocks in the photos and then projected the photos onto the blocks, and it just worked. Once we had the 360-degree environment, Roland [Emmerich] could come up with new shots.”
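The projection step Aithadi describes—putting CG blocks on top of the photographed blocks and projecting the photos onto them—amounts to firing each surface point back through the plate camera to find the photo pixel that should color it. Below is a minimal pinhole sketch; the camera model and all values are simplified assumptions, not MPC's pipeline.

```python
# Minimal pinhole camera projection: a camera-space point on a CG block
# maps to pixel coordinates in the photograph, whose color is then
# "projected" onto the geometry. Focal length is expressed in pixel
# units; no lens distortion or axis flips, for simplicity.

def project(point, focal=35.0, width=2048, height=1556):
    """Project a camera-space (x, y, z) point to image pixel (u, v)."""
    x, y, z = point
    u = focal * x / z + width / 2    # perspective divide, then
    v = focal * y / z + height / 2   # offset to the image center
    return u, v

# A point one unit right of the optical axis, 10 units from the camera:
print(project((1.0, 0.0, 10.0)))   # (1027.5, 778.0)
```

Because the color lookup only depends on the camera that shot the plate, the same projected textures hold up when a new CG camera moves through the scene—which is what let Emmerich invent new shots inside the 360-degree environment.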

Of the 50 shots in Giza that MPC created, Aithadi estimates that 95 percent had full-CG backgrounds, and half were completely CG, including the people and the sky. For the digi-doubles, MPC motion-captured people walking up ramps, pulling blocks, hammering, cheering, running, and so forth. “It’s a busy construction site,” says Goulekas. “The hunters infiltrate the site and look for the lead mammoth.” As they did in the opening sequence, they cause the mammoth to charge, which results in a mammoth stampede; thousands of slaves overrun the guards, as D’Leh tries to rescue the lovely Evolet.

Terror Birds and Tigers

Sandwiched between the mammoth opening and ending sequences are the action adventures of D’Leh and the small group of hunters who wind their way through Africa looking for their kidnapped friends.

First, as they step out of the grassy plains and into a jungle, they meet the terror birds. The large, ostrich-like birds are a variation of creatures that actually existed in prehistoric times, but the 12-foot-tall CG killers have bigger heads and claws on their wings. “It’s a really fun chase sequence with near-miss shots, bites, birds flapping,” says Goulekas. “Double Negative had to create the birds and re-create ferns and jungle foliage they interacted with.”

Jesper Kjolsrud led the crew at Double Negative that created the terror birds and this sequence, as well as two sequences with saber-toothed tigers and shots at the end along the Nile River. Hemmings led the animation team.

“The first terror bird appears as just a shadow,” Hemmings says. “Then there’s a big chase.” To allow changes in camera angles and ease the interaction between bird and foliage, most of the environments were fully CG. “We saved and logged everything,” says Hemmings, “every camera, animation, environment. The editing was constantly changing as we were animating. [Emmerich and Goulekas] might like the animation with the camera from a week ago, or the camera with the animation from a week ago.”

The birds never fly, but one climbed a tree and jumped. For this shot and others, the animators worked in Maya on motion cycles before animating the shot. “In the chase sequence, the camera didn’t seem to go fast enough,” Hemmings says. “So, we adapted the bird to make it look like it was running fast in relation to the camera.”

The tiger shots were more difficult and more interesting for the crew. After D’Leh makes his escape from the terror birds, he hunts for food. Just as he’s about to throw his spear at a fleet-footed animal, though, he falls into a trap and lands at the bottom of a pit. It’s raining when he wakes up, the pit is filling with water, and, worse, pinned beneath the logs at the bottom of the pit is a saber-toothed tiger. The only way D’Leh can escape from the pit is by moving the log pinning the tiger. Vultures, balanced on the branches, wait. The vultures, the branches, and the tiger are all CG.

“I had special effects make a life-sized blue tiger for me to position in the pit,” Goulekas says. “We laid branches over it to give the camera something to frame. Later, Jesper and his crew had to paint out and replace all the trees and logs as the tiger struggles.”

Double Negative worked on developing the tiger as a proof of concept in November 2005, before filming began. “Karen [Goulekas] needed to be confident that down the line we could deliver the tiger and be flexible enough to suit the way she worked,” Kjolsrud says. “We had done fur development before, but this would be a showcase for it, and it would have to interact with the water. But the biggest challenge was one I didn’t quite expect. We had to build a pipeline that made it possible for Karen and Roland to keep developing the look of the tiger.”

Animators at Double Negative created the tiger’s performance as the animal struggles in the water by imagining how the digital water would affect the digital tiger’s movements.
The crew knew roughly what size the animal would be and, technically, how to create it, but they didn’t know what it would look like when it was finished. “Everything had to be built knowing that Karen and Roland would keep tweaking far into the process,” Kjolsrud says. “We knew that if we tweaked the model, it would ripple through the pipeline and affect the fur dynamics, the muscle dynamics, and the water simulation. The development stretched out for almost two years. We managed it quite well, though.”

Although the Nvizage artists had previs’d the sequence for the live-action crew, Hemmings also created previs animation at Double Negative. “As the water rises, the tiger gets frenetic,” he says. “D’Leh walks behind the tiger and lifts the heavy log that’s pinning him down. There’s an explosive moment when the tiger is free. The tiger comes back to D’Leh, there’s a moment of eye contact, and the tiger leaps out of the pit.”

Even though the tiger would be in water in the final shots, Hemmings animated the creature in Maya, imagining how the water would affect the tiger’s movements. For reference, he talked a friend into holding his cat down and filmed that, eventually incorporating the cat’s action of pushing backwards into his friend’s hand. “When the tiger was pinned, we had constraints in the rig to fix his elbows to the ground,” Hemmings explains. “But otherwise, the rig was the same as we used throughout.”

For shots of the saber-toothed tiger stalking through a village in a later sequence, though, Hemmings used reference footage of real tigers. In fact, he used that live-action footage to create the previs. “It was quite difficult to do,” he says of the village sequence. “CG creature stuff is often fast and in the dark—fast action, fast camera moves. This was slow camera moves, slow walking, daylight. There was nowhere to hide.”

Double Negative also created shots in Giza, near the end of the film. “Compared to what we were doing with the tiger and terror birds, it was straightforward,” Kjolsrud says, “rivers, boats, matte paintings, and crowds. We also had little one-off shots that one person could work on.” For these shots, motion-capture cycles imported into Massive software animated the crowds of people.

All told, although 160 people worked on the shots at Double Negative, the crew ranged between 50 and 60 most of the time. Kjolsrud believes the prehistoric film will help the studio move more into creature work in the future.

“It’s easy to be typecast,” Kjolsrud says. “We did the water for Below, so for quite a few years we’ve done water. But we did quite a bit of creature work on Harry Potter [and the Order of the Phoenix].” Now, this film has given the studio another big push toward creature creation. “It allowed us to prove we can do it.” 
Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World.
