Beauty and the Beast
Barbara Robertson
August 1, 2012

For Ridley Scott's Prometheus, Weta Digital, one of 10 visual effects studios on the film, creates some of the most memorable sequences.

When Ridley Scott decided, after a 30-some-year hiatus from science fiction, to follow up his breakthrough film Alien, fans of the sci-fi horror genre that he invented began celebrating. Could he conceive another film as wildly grotesque, and as surreally beautiful, as the famous scenes in Alien? The answer, it seems, was yes. While critics haven’t always applauded Prometheus’s story, there is resounding acclaim for the visuals and, for those who appreciate such scenes, the grotesque in Scott’s latest feature.

Richard Stammers of The Moving Picture Company (MPC) was overall visual effects supervisor on the film, leading the work of 10 studios that created 1,400 visual effects shots: MPC, Weta Digital, Fuel, Hammerhead, Rising Sun Pictures, Lola, Luma Pictures, Prologue, Pixel Pirates, and Invisible Effects—the latter two working in-house on comps and cleanups.

The two leading vendors were MPC and Weta Digital. MPC’s 450 shots centered on realizing Scott’s vision through establishing the environment for the film: the ethereal planetary landscapes and the stormy atmosphere, alien and human spaceships and their travels, and a dangerously epic crash sequence. Weta Digital created some of the most iconic sequences in the movie—the opening sequence in which the “engineer” sacrifices himself, a gruesome surgery, the trilobites, and the pilot’s chair.

Weta artists used silicone castings of cabbage leaves and digital loofahs to give the helmet an organic-looking interior.

“One of the things I was most excited about as a fan was the pilot’s chair,” says Martin Hill, who supervised Weta Digital’s postproduction work. Matt Sloan represented the studio on location.

“Alien was one of the films that got me into visual effects,” Hill says, “and that guy in the pilot’s chair has always been a mystery: Who was he? To be able to work on that sequence was incredible.”

The sequence affected Stammers, as well. “The pilot chamber had the space jockey seats from the original Alien,” he says. “It was a great moment when we went to the interior set and saw the same presentation of the set we all saw 33 years ago. There was an amazing buzz on set.”

That Scott wanted to film the actual chairs on set represented a method of working throughout the film. “Ridley [Scott] wants to work in camera,” Hill says. “He wants to work practically. But, he knew there were some things that he wanted to do that went beyond prosthetics.”

In the opening sequence, for example, a spaceship created at Weta Digital lands and the ghost engineer (actor John Lebar) steps out. He drinks from a cup of black, pulsating goo, effectively committing suicide. “It destroys him from within,” Hill says. “It sends black goo through his veins. He crumbles and collapses. His bones break. His spine rips open. He falls into a waterfall at the bottom of a river, and his head falls off.”

In the middle of all that action, Weta Digital artists sent the camera zooming into the engineer’s arm, deep into his bloodstream, and deeper, into the DNA, until we can see the black goo at a microscopic level tear apart and distort the DNA. Then, we see the broken bits reform into a recognizable DNA, one found on Earth, with cells that form and split.

“Ridley referenced the statue of David in the design of the engineer,” Stammers says. “He wanted godlike, alabaster skin with translucency. Weta did a fantastic job of matching the live-action actor.” An actor slathered with silicone.

The engineer begins to dissolve into particulate matter, thanks to Weta Digital.

Weta artists began by referencing a sculpted maquette that Scott had placed on a lazy Susan, lit, and filmed in 2010. “Ridley asked us if we could match it,” Hill says. “We applied shaders and lit it to match, and when we got to the point where we were happy with that, we added a facial system.” The digital head they sent back to Scott rotated once, and then on the second turn, it blinked and formed an expression.

“That brought the head to life and gave Ridley the confidence that we could create a digital engineer,” Hill says.

There would always be a point at which the engineer would become digital, of course. The question was when. “The original plan was to see if we could apply the early stages of the disintegration effect to plates of the engineer,” Hill says. “It turned out that we have two shots with the effects starting on the digital actor. The rest is fully digital.”

The advantage in starting with the actor wearing the prosthetic was that the animators had a performance they could match. That was also the disadvantage. “Obviously, if you have a guy in a loin cloth wearing a silicone suit, his muscles don’t fire and deform like real muscles,” Hill says. “We wanted to put in a full muscle, fat, and fascia system, and have all the sliding and tensing of skin to make everything physically correct. But, it didn’t match the prosthetic. We had to make our digital model less accurate than we usually make creatures, and more faithful to the prosthetic.”

That was true for the engineer’s skin, as well. The silicone was so thick that instead of scattering rays a centimeter beneath the surface, the artists needed to reset their subsurface scattering system to scatter light six or seven centimeters deep. “His fingers felt like wax, though,” Hill says. “We had to include internal blockers to make him look substantial and not like a block of silicone.”
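To give a rough sense of what deepening the scattering distance means, here is a minimal sketch, assuming a simple Beer-Lambert falloff rather than Weta Digital's actual renderer; the function name and numbers are invented for illustration only.

```python
import math

def transmittance(depth_cm, mean_free_path_cm):
    # Beer-Lambert falloff: fraction of light surviving to a given depth.
    # A longer mean free path means light penetrates (and scatters) deeper.
    return math.exp(-depth_cm / mean_free_path_cm)

# Shallow scattering (roughly skin-like, ~1 cm) vs. the much deeper
# setting described for the thick silicone (~6.5 cm).
for mfp in (1.0, 6.5):
    print(f"mfp={mfp} cm -> light surviving at 3 cm: {transmittance(3.0, mfp):.3f}")
```

At a centimeter-scale mean free path, almost no light survives a few centimeters in, which is why the thick silicone needed a far deeper setting to avoid looking waxy.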

Re-engineering a Life

To create the effect of the engineer disintegrating, the artists once again took the lead from Scott’s emphasis on practical effects and looked for ways to incorporate real-world elements into the process. “We carved veins into blocks of silicone,” Hill says. “We pumped ink and oil, and all sorts of liquid combinations through the veins, backlit the blocks, and filmed them. We also shot drying clay, drying paint, and other practical elements.”

The shading artists didn’t use these practical elements directly. Instead, the elements became drivers for shaders and textures that they applied to the engineer; that is, shaders procedurally driven by practical effects.

“It was quicker to give blood pulsing and cracks forming a more natural motion by using real elements, so we decided to use filmed footage to drive shaders,” Hill says. “We heavily processed the images, some of which we filmed at high speed, through [The Foundry’s] Nuke. Some shaders were procedural. Some used painted maps. And some included this natural motion to assist the shaders. We’d augment the footage in different ways, and it became information maps for different effects and stages of the disintegration. For example, we took a filmed vein pattern and applied it to the skin with a time delay to create bruising. We had veins that became darker and shinier as they popped out: As the skin became taut, the specular would increase, and as the skin dried out, it would become more leathery with lower subsurface depth and affected displacement. The main thing was to keep the natural motion in the effect and have it escalate dramatically in every shot.”
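The idea of footage-driven shaders can be sketched in a few lines. The toy function below is hypothetical, not Weta's pipeline: it treats each frame of processed vein footage as a flat intensity map, uses the current frame to raise specular (taut, shiny skin) and a time-delayed frame to drive bruise darkness, mirroring the time-delay trick described above. All names and constants are invented.

```python
def shader_params(vein_frames, t, bruise_delay=12):
    # vein_frames: list of per-frame intensity maps (flat lists of 0..1
    # values) derived from filmed footage. The current frame drives
    # specular; a delayed frame drives bruising, so the bruise lags
    # behind the visible vein motion.
    current = vein_frames[t]
    delayed = vein_frames[max(0, t - bruise_delay)]
    specular = [min(1.0, 0.2 + 0.8 * v) for v in current]
    bruise = [0.6 * v for v in delayed]
    return specular, bruise

frames = [[0.0], [0.5], [1.0]]  # three frames, one "pixel" each
spec, bruise = shader_params(frames, t=2, bruise_delay=2)
```

In the example, the last frame's bright vein drives full specular while the bruise map still reads from the dark first frame, two frames behind.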

As they had done for the disintegration, the effects team also looked for examples from reality to create the DNA part of the sequence. “For the engineer, Ridley [Scott] had a strong idea about what he should look like, but we were able to design the DNA shots,” Hill says. “The brief from Ridley was fantastic. He said, ‘It’s like a war in there.’”

To create the engineer’s DNA, the artists used, as reference, the spine bones of a fish that they wrapped digitally into a double helix and rendered with the material quality of tooth enamel. “That gave us room to infect it with the dark and sinister and corrupting infection,” Hill says. Burning polystyrene footage provided elements that the team fed into shaders to create the evolving infection.

“Our effects team did a fantastic job of smashing up the DNA and creating the blood in the veins,” Hill says. “We used [Autodesk’s] Maya to create an enormous particle simulation, and used [Side Effects Software’s] Houdini for cell mitosis later.” For rendering, the team used Pixar’s PRMan, and for compositing, Nuke, all with the studio’s proprietary code running on top.
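A cell-mitosis simulation of the kind Hill describes can be reduced to a very simple particle rule: each step, some cells split into two daughters near the parent. The sketch below is a toy stand-in for illustration only, not the Houdini setup itself; every name and probability is invented.

```python
import random

def mitosis_step(cells, split_prob=0.3, jitter=0.05):
    # One step of a toy division sim: each cell (a 3D point) may split
    # into two daughters offset slightly from the parent position.
    out = []
    for (x, y, z) in cells:
        if random.random() < split_prob:
            out.append((x - jitter, y, z))
            out.append((x + jitter, y, z))
        else:
            out.append((x, y, z))
    return out

random.seed(7)
cells = [(0.0, 0.0, 0.0)]
for _ in range(10):
    cells = mitosis_step(cells)
```

Run over many steps, a rule like this gives the escalating "splitting and growing" population; a production version would add collision, drift, and growth animation per cell.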

The sequence takes place near a waterfall filmed in Iceland, and when the engineer falls, he splashes into digital water created to allow interaction with the digital character. “The mood of the shot when the spine rips open, his head falls off, and his arm breaks is dark and violent,” Hill says. “It continues to be fast paced, but by the time he reaches the bottom of the waterfall, the mood changes. We took the particle sim of all the cells that have broken apart, all the broken bits of DNA, and reformed them into a fantastic-looking DNA with a clear double helix structure. We wanted something filmic, not a medical visualization. We used extreme depth of field to make the whole scene look big. We see the DNA re-create with hundreds of cells splitting and growing. We see that the engineer sacrificed himself to create life. Thelvin Cabezes and Christoph Saltzmann, the leads on the sequence, did a stunning job on this shot.”

Abominable Surgery

In stark contrast to the beautiful opening sequence, the artists at Weta Digital also created what is perhaps the most horrific sequence in the film, one people are as likely to remember as the classic chest-bursting scene in the first Alien.

“When we saw the medpod sequence as previs, our jaws dropped,” Hill says. “We thought, ‘They can’t want to do this. It’s too disgusting.’ But, they did. We became kind of numb to it after a few dailies.”

In this sequence, the character Elizabeth Shaw (played by Noomi Rapace) is pregnant. The father of her “child” is Charlie Holloway (actor Logan Marshall-Green), but the “child” is an alien because alien DNA has infected Holloway.

Shaw crawls into a medpod bay and reprograms a machine designed to perform abdominal surgery on men to do a C-section on her instead. “She’s lying there and the creature inside is bashing against her abdomen,” Hill says. “We see baby feet and elbows pushing and distorting her abdomen.”

The surgical instruments in this horrific scene are CG, as is the human torso. 

To create that effect, the artists match-moved Rapace’s torso, rebuilt it digitally, projected the plate onto the digital model, and then enhanced that model to correct stretching and fix specular highlights. “She’s covered in sweat,” Hill says. “We needed to re-apply all the reflections and shadows from all the tools on her distorting and undulating body.”

Animators referenced car machinery to create the motion of the tools, all of which are CG. “We added a slightly spidery motion to make the tools feel sinister,” Hill says. “And we added one or two tools more than necessary to make the scene feel claustrophobic.”

The first gory shot happens when a laser slices open her belly. “We needed to match-move Noomi [Rapace], create belly geometry, slit that open, and build her internals,” Hill says. “You can see steam rising from where the laser slices her. Then the spreaders come in to open her stomach, and a giant claw comes down and pulls out a baby trilobite.”

The baby trilobite was a practical puppet that, while in the placental sac, didn’t move. “When you see it twitching in the sac, that’s our shot,” Hill says. “You can see embryonic goo and strings of viscera connecting it back to the stomach.”

When the sac bursts, the trilobite thrashes around wildly. “We did some of the more precise motion, but it was also practical. About half digital, half practical,” Hill says. “The goo that splats back onto the stomach was practical. But, we had a lot of simulation for fluids dripping off that had to match what was on set precisely because this is in stereo. So, we did a lot of re-projection onto models, some CG work, and Alfred Murrle’s team did some very good comp work for the sequence.”

The sequence doesn’t end there. “The really grotesque part is of her becoming stitched back up,” Hill says. “We knew they wanted a stapler; they had put enormous staples in her on set. We needed to make a machine that felt solid enough to punch those staples. We looked at jackhammers and pneumatic road drills for motion reference. Whenever the stapler punches in, we have it punch two or three centimeters and create deformations around that. It’s quite brutal.”

A Trilobite Fight

Despite the complexity of these two sequences, Hill points to shots of a fight between the last engineer (actor Ian Whyte) and a trilobite as the most difficult.

“As an adult, the thing that came out of [Rapace] is 14 feet across from tentacle to tentacle,” Hill says. “When the engineer and trilobite fight, Noomi [Rapace] is caught in the middle. These shots were the hardest because they lit everything in a strobe environment. No frame was the same as the one before. And, Ridley wanted to catch it all in camera. They had Ian Whyte all wired up and fighting against air where the trilobite would be. We had to match-move him perfectly to give animators something to work with and to position the digital trilobite. We had to replace a limb on the engineer in some shots, and we needed tight tracking for the creature rigs. Tracking software doesn’t respond well to incoherent flashing light.”

The fight between the last engineer and the trilobite was the most difficult         for the artists at Weta Digital.

On set, the crew had three reference cameras and a main camera to help provide an accurate representation of the action to help animators, lighters, and rendering artists fit the creature into the shots. “The trilobite is very deformable,” Hill says. “It has six or seven tentacles wrapping and constantly moving in this strobing environment, and one tentacle compresses against the engineer’s limbs and body. So on top of the muscle model our creature team used, we needed an extra simulation layer. We used a cloth simulation to produce the longitudinal wrinkles on the trilobite, and then compressed them around the axis of the tentacle. We passed all this information to the shaders so the skin would become shiny when tense and rougher when compressed.”
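The tension-to-shader link Hill describes, shinier when tense, rougher when compressed, boils down to a simple mapping from the simulation's strain value to material parameters. The sketch below is an invented illustration of that idea, with hypothetical names and constants, not Weta's shader code.

```python
def skin_response(strain):
    # strain from the simulation layer: > 0 means stretched (tense),
    # < 0 means compressed. Clamp to [-1, 1] for the toy mapping.
    s = max(-1.0, min(1.0, strain))
    specular = 0.3 + 0.5 * max(0.0, s)    # taut skin gets shinier
    roughness = 0.4 + 0.4 * max(0.0, -s)  # compressed skin gets rougher
    return specular, roughness
```

Passing a per-point strain from the cloth layer through a mapping like this is what lets the shading follow the tentacle's wrinkling frame by frame.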

All told, approximately 350 people at Weta Digital worked on the studio’s 250 shots. “It was hard,” Hill says, “but the whole team was elated to be working on such an iconic piece of cinema.”

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at