The critics have not been kind to Ridley Scott’s film Exodus, but none deny the power of its visual effects.
Once Moses leads the Jews away from Egypt and out of slavery, the spectacle begins in full force. There are plagues of locusts, frogs, and flies. Thousands of people. Crocodiles. Dramatic chariot races on the edges of cliffs. And, of course, the parting of the Red Sea. In the past, biblical epics relied on sets and thousands of extras. For this film, Scott relied on crews of thousands who created the cinematic illusions with computer graphics.
Peter Chiang from Double Negative (DNeg) was overall supervisor, managing the work at DNeg, The Moving Picture Company (MPC), Method, Lola, Peerless, and Scanline.
“MPC had worked with Ridley on Prometheus, so it tackled the toughest shots, finishing the work from when the Hebrews are released until the finale,” Chiang says. “They also used their Alice software to create the battle. DNeg handled the plagues and the Egyptian landscape. Lola did the landslide. Peerless did greenscreen set extensions of live-action sets. Method did a lot of skies and some one-off shots. And, Scanline helped MPC with the water simulations at the end, since MPC uses their Flowline software.”
DNeg and the Plagues
For the plagues, Scott wanted each to seem plausible in nature, yet dramatic.
“He wanted the plagues amped up to 11,” Chiang says. “But, if you look at YouTube footage of locusts, it’s quite underwhelming, so we multiplied the number of particles DNeg was simulating. And frogs surged like a football crowd through a narrow gap. We had to create that dimension so when seen from a height, they wouldn’t look like a flat carpet layer.”
Some of the frogs were real. When asked how many frogs they’d need to film, Chiang requested 400.
“I just picked a number, and lo and behold there were 400 frogs on set,” Chiang says. “They came in a bucket. After every take, we had to pick them all up and count them. Ridley, Dariusz Wolski the DP, the set dressers, the animal wranglers. We all joined in. The frogs went everywhere. It was harder picking them up than filming them. I’m glad I didn’t ask for 1,000.”
During another plague, this one with flies, Scott wanted great vistas of Egypt seen through a wide-angle lens.
“In true scale, you wouldn’t see the flies because the depth of field put them so far away,” Chiang says. “So we had some [CG flies] bouncing on the camera lens, and to make others read, they ended up the size of rabbits. You don’t notice the cheats as the images flow from one cut to the next. But if you looked at the 3D scene closely, you’d see that it’s full of cheats in speeds and sizes.”
Of all the plagues, Chiang found the crocodiles most difficult.
“All that water,” he says. “We shot the film in native stereo, so we couldn’t do any 2D cheats. We had to respect the Z space. So those shots are mostly digital. The flow of the Nile and the water interaction with the crocs is all CG. The odd splashes on the people are real, though. Special effects built a gimbaled boat that could sink, which we shot against a bluescreen.” To create those shots, the crew at DNeg used a fluid system based on Side Effects’ Houdini.
MPC: Crowds and Water
For its part, MPC artists in Montreal handled the Battle of Kadesh sequence, extending environments filmed in Jordan’s Wadi Rum mountains and adding digital crowds of soldiers.
“These are the same mountains Ridley used for Prometheus,” says Max Wood, CG supervisor at MPC. “So that was great for us. We had photogrammetry of the mountains already and lots of reference.”
MPC’s London-based crew moved the crowds through deserts and mountains and onto the beach across an additional nine sequences, and created the Red Sea.
Although the water shots represented the bulk of the difficult work at MPC, crowd shots dominated the work as a whole.
“Our biggest crowd had 200,000 agents,” Wood says. “We hoped we would break our record of 250,000 agents in World War Z, but we didn’t get a plate that was wide enough.”
Alice, MPC’s proprietary system, managed the crowds, whether birds, animals, or people.
“We had a lot of birds, tens of thousands of birds,” Wood says, “seagulls mostly flying around. We thought we could use elements, but there were so many, we did them all in CG. And, this was the first time we had crowds of horses and chariots.”
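Alice is MPC’s proprietary system and its internals aren’t public, but flock-style crowd agents of the kind used for those CG seagulls are commonly built on boids-style steering rules: separation, alignment, and cohesion. A minimal sketch of one such update step, with all weights and radii invented for illustration:

```python
import numpy as np

def boids_step(pos, vel, dt=0.04, r=2.0,
               w_sep=1.5, w_ali=1.0, w_coh=0.8, vmax=5.0):
    """One update of a toy boids flock (weights and radius are made-up values).

    pos, vel: (N, 3) arrays of agent positions and velocities.
    """
    n = len(pos)
    acc = np.zeros_like(pos)
    for i in range(n):
        offset = pos - pos[i]                  # vectors to every other agent
        dist = np.linalg.norm(offset, axis=1)
        near = (dist > 0) & (dist < r)         # neighbours within radius r
        if not near.any():
            continue
        # Separation: steer away from close neighbours.
        acc[i] -= w_sep * (offset[near] / dist[near, None] ** 2).sum(axis=0)
        # Alignment: match the neighbours' average velocity.
        acc[i] += w_ali * (vel[near].mean(axis=0) - vel[i])
        # Cohesion: steer toward the neighbours' centroid.
        acc[i] += w_coh * (pos[near].mean(axis=0) - pos[i])
    vel = vel + dt * acc
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    vel = np.where(speed > vmax, vel * vmax / speed, vel)  # clamp speed
    return pos + dt * vel, vel
```

A production crowd system would replace the brute-force neighbour search with spatial hashing and layer goal-seeking and terrain avoidance on top, but the three rules above are the core of flock motion.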
To derive performance data from the horses and chariots, MPC worked with Audiomotion Studios in Wheatley, UK. MPC provided the CG skeletons; Audiomotion handled the motion capture at an equestrian center near London.
“We had two horses pulling a chariot with two people in back,” Wood explains. “We had special suits built for the horses. Audiomotion captured everything – the wheels turning and other elements on the chariots. We had captured horses for other films before, but never two running with chariots. There was a gap of only eight to 12 inches between the horses, so there was a lot of shadowing, but Audiomotion did a great job. They cleaned up the data and sent us clips quickly.”
Motion cycles included data for straight passes, left turns, right turns, acceleration, and shots of riders pulling on the reins.
“We blended those set pieces together in Alice,” Wood says. “The difficulty was that horses are left- and right-footed, so we had to make sure the blends had the two horses running with the same stride.”
During one sequence, a cliff collapses, sending riders and chariots down the side of a mountain. The filmmakers shot the sequence on an abandoned roadway carved through mountains in the Canary Islands. MPC artists extended the mountains, collapsed the cliff, and added the animated digital soldiers, horses, and chariots. Kali, MPC’s proprietary rigid- and soft-body solver, handled the large rock fall. A second program, Papi, moved the small rocks and shale. And the simulation artists added fine layers of dust with Autodesk’s Maya, The Foundry’s Nuke, and Flowline.
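Kali and Papi are MPC-proprietary, so their internals aren’t public, but the “small rocks and shale” layer of a collapse like this is often approximated as ballistic point particles with a damped ground bounce, leaving full rigid-body contact to the hero rocks. A generic sketch under that assumption:

```python
import numpy as np

def settle_debris(pos, vel, steps=200, dt=1 / 24, g=9.81,
                  restitution=0.3, friction=0.8):
    """Toy point-particle pass for small debris: gravity plus a damped
    bounce off a ground plane at y=0 (no inter-rock collisions).

    pos, vel: (N, 3) arrays; restitution/friction values are invented.
    """
    pos, vel = pos.copy(), vel.copy()
    for _ in range(steps):
        vel[:, 1] -= g * dt                # gravity
        pos += vel * dt                    # explicit Euler step
        below = pos[:, 1] < 0.0
        pos[below, 1] = 0.0                # project back onto the ground
        vel[below, 1] *= -restitution      # damped bounce
        vel[below, 0] *= friction          # lose tangential speed on contact
        vel[below, 2] *= friction
    return pos, vel
```

Cheap passes like this scale to the tens of thousands of fragments a shot needs, with the dust layer then advected through a fluid solver on top.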
The sequences involving the parting of the Red Sea formed the bulk of the difficult work at MPC, where a crew worked for nearly a year on sky replacements, beach extensions, seabed environments, giant waves, and tornadoes to create the iconic images. In fact, the studio used an image illustrating this sequence for the movie poster.
Sky replacements became necessary when the production unit decided to film in the Canary Islands’ sunny Fuerteventura.
“Ridley wanted dramatic skies and, as the sequence goes on, tornadoes and waterspouts,” says Wood. “So we replaced skies across the whole sequence, which must have been more than 100 shots.”
Actors and animals filmed in stereo against a bright sky meant the rotoscope artists had their work cut out for them.
“It took about 20 days of roto per shot to remove all those people and animals,” Wood says. “The clothes. The hair. Yeah – 20 days or more per shot to get the roto done.”
To have water flow around the live-action actors, the production crew had six people on Jet Skis pull a metal framework off camera to create a river-like motion.
“It looked great in-camera,” Wood says. “But it dissipated quite quickly. So we extended the water to create an entire sea using CG elements simulated in Flowline and some [live-action] elements shot of rivers in Canada by people in our Vancouver office.”
As the water recedes, the seabed reveals rocks, puddles, dead fish, and other objects.
“There were set rocks, but we extended the seabed far into the distance,” Wood says. “We replaced 90 percent of the plates to give the dramatic feel Ridley wanted for the seabed.”
After the water recedes, a big wave begins to form at the horizon. “You can see what Pete Chiang called ‘a silver thread,’ a boiling line a few pixels high that you wonder is it there or isn’t it,” Wood says. “We created that through Nuke in compositing. Then, you see tornadoes above as the action starts to build.”
To create the tornadoes, the team animated simple geometry, and then filled and enhanced the geometry with a combination of particles and volumes. They rendered the tornadoes from different angles to create a library that compositors could access to insert twisters into the shots.
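The exact setup isn’t described, but “filling simple geometry with particles” for a twister typically means seeding points inside a funnel profile that widens with height and swirls about an animated axis. A toy version of that seeding step, with every constant invented:

```python
import numpy as np

def tornado_particles(n=5000, height=200.0, t=0.0,
                      r_base=4.0, r_top=30.0, swirl=3.0, seed=0):
    """Seed particles in a swirling funnel (toy stand-in for the particle
    fill of a tornado's guide geometry; all numbers are made up).

    Returns an (n, 3) array of positions at time t, with y up.
    """
    rng = np.random.default_rng(seed)
    h = rng.uniform(0.0, height, n)                         # height in funnel
    radius = r_base + (r_top - r_base) * (h / height) ** 2  # funnel profile
    radius *= rng.uniform(0.7, 1.0, n)       # fill the volume, not a shell
    # Angular speed falls off with radius so the core spins fastest.
    theta = rng.uniform(0, 2 * np.pi, n) + swirl * t / (1.0 + radius / r_base)
    x = radius * np.cos(theta)
    z = radius * np.sin(theta)
    return np.column_stack([x, h, z])
```

Rendering a point cloud like this (plus volumes) from a ring of camera angles is what builds the kind of element library the compositors drew on.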
For the wave itself, the team – working from CG previs provided by The Third Floor, Scott’s drawings, which the team calls “Ridleygrams,” and the live-action plates – decided on several approaches.
For small to medium wave shots, the crew started with a surface blocked out in postvis by animators using rigged geometry.
“We had lots of talks about how to make a long wave interesting,” Wood says. “We wanted different areas pushed forward and backward, and different heights at different points to show the wave constantly evolving. So, we blocked out all the waves with a rig to get the right speed and shape. Ridley signed off on quite simple geometry in which he could see the advancement of the waves.”
They moved the wave between 45 and 100 mph, but most often at 50 mph.
“We read a lot of literature and found that real waves can travel up to 100 mph, although it’s difficult to estimate exactly,” Wood says. “Our average was 50 mph, with a few shots slower for artistic reasons. It was quite challenging having different speeds in different shots.”
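Those miles-per-hour figures translate directly into how far the wave front must advance per frame. A quick conversion, assuming a 24 fps project with metric scene units:

```python
MPH_TO_MS = 0.44704  # exact: 1 mph = 0.44704 m/s

def wave_metres_per_frame(mph, fps=24):
    """How far a wave front travelling at `mph` advances in one frame."""
    return mph * MPH_TO_MS / fps

# The article's range: 45-100 mph, most often 50 mph.
for mph in (45, 50, 100):
    print(f"{mph} mph -> {wave_metres_per_frame(mph):.2f} m per frame at 24 fps")
```

At 50 mph the front moves just under a metre per frame, which is the kind of number the animators’ blocking rigs had to hit consistently from shot to shot.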
Once the team had an animated surface, the artists used a Tessendorf deformer to add choppiness and scale.
“From there, we meshed the Tessendorf surface into geometry to add a leading edge, the churn along the ground, and the boiling water that hits the rest of the water,” Wood says. “We’d get spray off the wavelets and cresting water, all through effects simulation in Flowline.”
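The deformer takes its name from Jerry Tessendorf’s FFT ocean technique: build a statistical spectrum of waves in frequency space, evolve each wave with the deep-water dispersion relation, and inverse-FFT to a tiling heightfield. A minimal height-only sketch of the idea (spectrum constants are arbitrary, and production versions add choppy horizontal displacement):

```python
import numpy as np

def tessendorf_height(N=64, size=256.0, t=0.0, wind_speed=30.0,
                      amplitude=1e-3, seed=7):
    """Toy FFT ocean heightfield after Tessendorf (constants are arbitrary).

    Returns an (N, N) array of real heights at time t.
    """
    g = 9.81
    rng = np.random.default_rng(seed)
    k1 = 2 * np.pi * np.fft.fftfreq(N, d=size / N)
    kx, ky = np.meshgrid(k1, k1)
    kmag = np.hypot(kx, ky)
    ksafe = np.where(kmag == 0, 1.0, kmag)
    L = wind_speed ** 2 / g               # largest wave driven by the wind
    # Phillips spectrum, wind blowing along +x.
    P = amplitude * np.exp(-1.0 / (ksafe * L) ** 2) / ksafe ** 4 \
        * (kx / ksafe) ** 2
    P = np.where(kmag == 0, 0.0, P)
    # Random initial amplitudes h0(k), complex Gaussian per wave.
    h0 = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) \
        * np.sqrt(P / 2)
    neg = (-np.arange(N)) % N             # FFT index of -k
    h0_neg = h0[np.ix_(neg, neg)]
    omega = np.sqrt(g * kmag)             # deep-water dispersion relation
    # Hermitian combination keeps the inverse FFT real.
    hk = h0 * np.exp(1j * omega * t) + np.conj(h0_neg) * np.exp(-1j * omega * t)
    return np.fft.ifft2(hk).real * N * N  # undo NumPy's 1/N^2 scaling
```

Because every wave advects at its physically correct phase speed, the surface animates believably over time, which is why the technique remains the standard base layer under hero simulations like these.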
Similarly, the crew used base geometry from rigged animation for the monster wave.
“One of the best things about working with Ridley Scott is the Ridleygrams,” Wood says. “He’s so quick and good at drawing, he gives everyone a great idea of what’s in the movie and how to frame it. At the beginning of the movie, we had a breakdown on how he wanted the wave to work and the size of it compared to a person. We did our own concept art, but it was based on what he showed us.”
To simulate the wave, the crew relied on Flowline.
“We had Flowline create a thin layer of water that sits on top of the wave,” Wood says. “From that, we got spray off the crest and a more natural feel from the water that flows up and over the wave. The waves are 120 to 200 feet, to accommodate the framing Ridley wanted, but most are 160 to 180 feet tall. We generated the surface, the refined sheet of water on top, and the particle simulations, all through Flowline.”
To help simulate some of the large wave shots in Flowline, MPC called on Scanline.
“We were using their software, so they were the right guys to go to for help with some of the large shots,” Wood says. “We had defined a look of the churning water along the front edge. It was a question of getting through the body of work.”
And what a body of work. At their best, visual effects support a story. It’s unfortunate when good visual effects support an otherwise flawed film.
Writes Christopher Orr in “The Atlantic” of the effects in Exodus: “The CGI is often spectacular – the palaces, the armies, the plagues of frogs and hail and locusts – and it comes as close as anything else to offering a rationale for the movie’s existence.”
Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.