Heavy-Handed
Volume 31, Issue 7 (July 2008)

The biggest thing about this year’s big-screen interpretation of Marvel Comics’ Hulk character is not the size of the CG superhero. It is how much Hulk has changed since his feature-film debut in Ang Lee’s 2003 film. From all appearances, the uncomfortably green fighting machine spent the past five years working out in the gym. He’s beyond toned; his muscles strain against his taut skin.

Directed by Louis Leterrier, Universal Studios’ Incredible Hulk stars Edward Norton as Bruce Banner (Hulk) and Tim Roth as his nemesis, Emil Blonsky (Abomination). Three studios—Rhythm & Hues, Soho VFX, and Hydraulx—created the digital characters and surrounding environments, with Rhythm & Hues taking the lead on character design and modeling, and in creating the bulk of the digital behemoths’ close-up shots. Rhythm & Hues’ Betsy Paterson supervised a crew of approximately 250 artists working in Los Angeles and Mumbai, India, who gave Hulk his new, buff body and Abomination his grotesque shape, and sent the two battling through the streets of New York City.

Universal Studios, Leterrier, Marvel, and overall VFX supervisor Kurt Williams knew from the beginning how they wanted this rendition of Hulk to differ from the previous film’s cartoonier giant. “They had done their research,” says Keith Roberts, animation director at Rhythm & Hues. “They knew what people liked and didn’t like. They wanted Hulk to be much more of a street brawler. Nasty. Rough. Edgy. When people saw him, they wouldn’t immediately know that he was a good guy.”

Rhythm & Hues, though, is famous for its award-winning furry animals, not edgy monsters: The studio won Oscars for Babe and The Golden Compass, and earned an Oscar nomination for The Chronicles of Narnia: The Lion, the Witch and the Wardrobe. This is the first film for which the artists at Rhythm & Hues have created a digital humanoid.

“We pushed ourselves to the limit technically and creatively,” Paterson says. “We pushed everything we could already do to 11. Maybe 15.”

The push started even before they had a contract: Strong animation tests helped the studio secure the job.

“[The production unit] gave us a rough model that conceptual artist Aaron Sims did,” Roberts says. “Of course, we re-modeled and re-rigged it. And, in three days, we had motion tests on the rig to show the executives. Winning this film was a big coup for us.”

Character rigging supervisor Matt Derksen masterminded the rig development for the test and the evolution of that rig for the film. It was, Paterson believes, a state-of-the-art breakthrough. Hulk needed to have zero body fat, pulsing veins, and straining muscles, and move through scenes in full daylight. And the crew had to transform Bruce Banner into this huge monster in close-ups under laboratory lighting, and then back into his human form.

Zero Percent Body Fat
Rhythm & Hues uses Autodesk’s Maya for modeling and Side Effects’ Houdini for effects, but rigging, animation, lighting, and rendering happen within the studio’s proprietary software. “The major thing we had to develop was a new skin-slide system,” says Derksen. “We had to slide Hulk’s skin tightly over his muscles without using a simulation approach. It was important. Without it, he would look unbelievable.”

The new system uses two geometries acting differently within the same space; that is, two skin “binds,” one sliding over the other. The riggers started with a pre-existing system. “We build the character as if it is a real person, defining each muscle using volumes,” Derksen explains. “Then we bind across all of those muscles based on the skeletal structure.” If Hulk bends, the muscles squish up and hold their volume, and the skin moves across his body appropriately.

“If he lifted an arm, you’d see the skin pull and tug across his ribs,” Derksen says. “It’s a very organic bind.” That might have been enough for some creatures, but not for Hulk. The riggers added a second, newer bind.

The second bind is simpler, less distributed, and more rigid. It doesn’t slide; it bends only at the joints. “We take that rigidly bound skin, relax it, and suck it against the initial bind per frame,” Derksen says. The second skin bind becomes smooth and shrink-wrapped against the original bind, but still somewhat rigid.

“The major benefit of sucking the one skin against the muscles was to make the Hulk look like he has zero percent body fat,” Derksen says, “which was an important part of the character’s redesign.”

As a result, when Hulk moves, the flexible bind crawls under the more rigid skin. When he roars, you see tense muscles push against his skin from his huge neck to his feet. “[The rigging system] gives you the sense of tight skin, as if you pushed your fist against a sheet of rubber and moved it around underneath,” describes Derksen. “You can see the muscles and inner structure moving beneath.”
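
Rhythm & Hues’ rigging software is proprietary, but the relax-and-shrink-wrap idea Derksen describes can be sketched in generic Python with NumPy. Everything below (the mesh arrays, neighbor lists, and iteration counts) is an illustrative assumption, not the studio’s implementation:

import numpy as np

def relax(verts, neighbors, iterations=10, amount=0.5):
    """Laplacian smoothing: pull each vertex toward the average of its neighbors."""
    v = verts.copy()
    for _ in range(iterations):
        avg = np.array([v[idx].mean(axis=0) for idx in neighbors])
        v = v + amount * (avg - v)
    return v

def shrink_wrap(verts, muscle_points, pull=1.0):
    """'Suck' each relaxed vertex onto the closest point of the inner muscle bind."""
    wrapped = verts.copy()
    for i, p in enumerate(verts):
        nearest = muscle_points[np.argmin(np.sum((muscle_points - p) ** 2, axis=1))]
        wrapped[i] = p + pull * (nearest - p)
    return wrapped

def tight_skin_frame(rigid_bind_verts, neighbors, muscle_bind_points):
    """Per frame: relax the rigid, joints-only bind, then wrap it onto the organic
    muscle bind so the outer skin reads as taut, zero-body-fat skin."""
    return shrink_wrap(relax(rigid_bind_verts, neighbors), muscle_bind_points)

Because the wrapped skin is re-derived from the rigid bind every frame, it stays taut over whatever shape the underlying muscles take.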


Actor Edward Norton fights against the inevitable transformation into Hulk. His eyes and his veins turn green, his bones lengthen and stretch his skin, and his muscles expand beneath.

Because the riggers built this movement into the rig, rather than creating the skin sliding through a simulation, animators could control the entire process. Working with the director, the animators set key poses and determined the bind for those poses—that is, the amount of striation we see across Hulk’s chest, for example, and the amount of muscle volume preserved.

For animating Hulk and Abomination, the studio started with motion captured from stunt actors by Giant Studios (see “Maximum Motion,” below). But in addition, animators could control every muscle in the creatures’ bodies by hand and see them take shape.

“Usually, we set up a fast bind for the animators to see in real time while they work,” Derksen says, “but the final bind is fast enough that they could see in their file exactly what shape the muscle made, which is beneficial.”

A similar approach using layers of volume-preserving muscles and sliding skin worked for Hulk’s facial animation as well. “We slid the tight skin over his skull,” Derksen explains. “The most important parts of his face were his giant Neanderthal eyebrows and cheeks, so we needed to get the skin to slide over those bones without making it feel as if the bones moved.”

For reference, the riggers and animators used data captured from Edward Norton during motion-capture sessions using Mova’s facial-capture system. “It was like having a cyber scan for every frame,” says Roberts. “We had 24 incredibly detailed models per second. We could see subtleties—micro-movements in the cheeks and under the eye—because we could study his face in detail. That was a great thing that Mova gave us.”

The animators, however, ended up hand-keying Hulk’s face to give him comic-book expressions, using the Mova data primarily to help with timing. “Hulk doesn’t have Edward Norton’s expressions, but the two are eerily similar in facial timing,” Roberts says.

The Mova data also helped Derksen design the skin deformations. “We could see how Edward Norton’s face moved in 3D,” Derksen says. “We could see how his skin slides over his face, so we interpreted that and put it into Hulk. It helped us determine what controls we needed.”

The animators moved individual muscles to make final expressions using a master control, but they could also exert a finer level of control for any part of Hulk’s face, down to 10 vertices. A new user interface allowed them to pick any part of the face, click on that part, and drag it to move it. “We wanted to give them more of a sculptural approach,” Derksen says. “They could pick a part and scrub it.”
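
In spirit, that sculptural pick-and-drag control resembles a soft selection with a distance-based falloff. Here is a minimal sketch; the radius and falloff curve are assumptions, not the studio’s actual interface:

import numpy as np

def drag_region(verts, picked_index, drag_vector, radius=2.0):
    """Move the picked vertex and its neighborhood; influence falls off smoothly
    to zero at 'radius' so a tweak to the brow never reaches the jaw."""
    dist = np.linalg.norm(verts - verts[picked_index], axis=1)
    weights = np.clip(1.0 - dist / radius, 0.0, 1.0) ** 2
    return verts + weights[:, None] * np.asarray(drag_vector)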

Becoming Hulk
The animators’ ability to control Hulk’s muscles and skin was especially important for the transformations, the most obvious of which happens during a scene in which Banner is on an operating table in a laboratory, taking what he hopes is a cure.

“The idea [of the transformation] is that the gamma radiation that turns Banner into Hulk is stored in the back of his brain, and when he transforms, everything radiates out from there,” Paterson says. “You see it first in his eyes. Then as the green blood moves through his veins, the skin changes color, his muscles striate, the bones enlarge, and the muscles catch up to the bones. It happens in a non-symmetrical way, so it feels organic.” The rig made it possible for animators to achieve an art-directed transformation, even in close-ups.

One rig handled both the bipedal performance and the transformation. To accomplish this, the team created models for Hulk and Banner that precisely matched, vertex for vertex. “We procedurally generated the Bruce Banner model based on Hulk’s geometry by relaxing the geometry in the Hulk model and then sucking it up against a 3D scan of Edward Norton,” Derksen explains. “Once we had that, we took Hulk’s binding and re-proportioned it into Bruce Banner. And, once we did that, we could morph locally around a given joint.”

The animators had two sets of controls for the rig: one for the bipedal performance and one for the transformation. With these controls, the animators could transform any part of Bruce Banner’s body—even one finger—into the Hulk at any time. Because they could control selected body parts, they could offset and propagate the transformation through his body and limbs.
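
One way to picture those controls: because the Banner and Hulk meshes match vertex for vertex, every vertex can blend between the two according to how far its governing joint has transformed, with each joint given its own onset time so the change propagates along a limb. The sketch below uses hypothetical joint names and frame numbers to illustrate the idea, not the production rig:

import numpy as np

def transform_weight(frame, onset, duration=24.0):
    """0 = still Banner, 1 = fully Hulk; each joint starts changing at its own onset frame."""
    return float(np.clip((frame - onset) / duration, 0.0, 1.0))

def blend_body(frame, banner_verts, hulk_verts, vert_joint, joint_onsets):
    """Blend the vertex-matched Banner and Hulk meshes per vertex, driven by each
    joint's own onset, so the change can start in one finger and travel up the limb."""
    w = np.array([transform_weight(frame, joint_onsets[j]) for j in vert_joint])
    return (1.0 - w)[:, None] * banner_verts + w[:, None] * hulk_verts

# Hypothetical onsets: the hand starts changing at frame 10, the forearm at 20,
# and the upper arm at 30, so the transformation visibly propagates up the arm.
# joint_onsets = {"hand": 10, "forearm": 20, "upper_arm": 30}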

Some controls affected bone length; the animators could elongate a bone they were animating. When they did, the skin tightened around it and created an emaciated look because, for a short time, the character had Bruce Banner muscles and Hulk bones.

Using a separate control, animators determined when the muscles grew and filled in under the tight skin. “That was an aesthetic choice,” Derksen says. “The director wanted the transformation to feel painful.”

Point attributes in the rig drove animated color maps and vein displacements. As the animators caused an area to flex and transform, the rig sent information, in effect, to lighters who could animate the color change for that area and the vein displacement.
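
Schematically, a per-point “transformation amount” attribute can drive both the color ramp and the vein displacement at render time. The colors and scale factors below are illustrative guesses rather than the production’s values:

import numpy as np

BANNER_SKIN = np.array([0.80, 0.62, 0.52])   # illustrative flesh tone
HULK_SKIN   = np.array([0.25, 0.55, 0.20])   # illustrative green

def shade_point(transform_attr, normal, vein_pattern, max_vein=0.4):
    """A per-point 'how transformed is this area' value (0..1) drives both the
    ramp toward green and how far the vein pattern displaces the surface."""
    color = (1.0 - transform_attr) * BANNER_SKIN + transform_attr * HULK_SKIN
    displacement = np.asarray(normal) * (vein_pattern * transform_attr * max_vein)
    return color, displacement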


Rhythm & Hues used Hulk’s muscle and skin tools and techniques to create a more grotesque body for Abomination.

“The animators knew if they were transforming a hand into a Hulk hand, it would turn green and the displacement maps would change the detail and striation in the render,” Derksen says. “But they didn’t see that detail until we rendered it.” When Banner’s bone pops out of joint in a close-up, though, the crew used the studio’s proprietary cloth engine to simulate the skin shrink-wrapping over the shoulder and the muscles crawling over the clavicle.

Abomination
Once the rigging team had developed Hulk, they moved his muscle and skin tools and techniques to Abomination for that creature’s scenes. Abomination is the result when an overachieving fighter forces a scientist to quickly turn him into a Hulk-like human weapon. The experiment goes horribly wrong.

“The concept is that he grew so fast, he exploded out of his human skin and has remnants still on him,” Derksen says. “So he has an outer layer of skin.” Also, his bones protrude—he has a lizard-like spine. To snug his skin up against the protrusions, the team used areas of influence that caused the skin to compress as it moved up against the bone.

For facial animation, the team returned to Mova for a facial-capture session with Tim Roth. “Roth had acted with a mask for Planet of the Apes,” Derksen says, “so he was great at exaggerating facial motion that translated well into Abomination. We used a lot of that data to develop Abomination’s facial structure and poses.”

Action
When Hulk appears in a shot, it’s usually an action sequence; the monstrous superhero is angry. He fights an army battalion, tanks, soldiers, cannons, and ray guns on a college campus, and fights Abomination in Harlem. Soho VFX and Hydraulx helped with the action scenes and some other shots, working with models (meshes) and textures from Rhythm & Hues for both characters.

Soho VFX handled the “first reveal” of Hulk in a bottling plant early in the film and a complex scene with the characters fighting on a Harlem rooftop from the time they climb up the buildings until a helicopter crashes on the roof. (Rhythm & Hues took the “Hulk fighting Abomination” shots from the crash to the climax.) In addition, Soho VFX gave Roth a muscular body during a locker-room shower scene.

In the bottling plant sequence, which begins in the nearby Brazilian slum, Banner tries to control his excitement while thugs and soldiers chase him. Eventually, though, his inner Hulk bursts out. “It’s dark, and we try not to show too much at first, but by the end, we see him entirely, chasing through the bottling plant,” says Allan Magled, Soho VFX visual effects supervisor. Anything Hulk interacts with in the plant is CG, and he interacts with tons of stuff—literally. At one point, he tosses a CG water tank that’s six feet in diameter and nine feet long, and near the end of the sequence, throws a CG forklift.

Starting with the Rhythm & Hues model and textures, Soho VFX assembled Hulk in their pipeline, adding their own hair and eyes. “We had a basic static OBJ file of the model and a bunch of OpenEXR files, each with a thousand texture maps or more, for displacement, textures, and subsurface scattering.” The same was true for Abomination, although that creature didn’t need hair or cloth.

Because Soho had shared assets with Rhythm & Hues for Narnia, they had a system in place to handle the differences in the studios’ pipelines. “The hardest part was making their maps work with our rendering technology,” says Berj Bannayan, co-visual effects supervisor. Rhythm & Hues uses proprietary rendering software; Soho VFX uses 3Delight, a RenderMan-compliant program from DNA Research.


Rhythm & Hues shared shots in the film with Hydraulx and Soho VFX. For example, Hydraulx built the city for the fight between Hulk and Abomination (above). Soho VFX took the fight to the rooftop, and then Rhythm & Hues brought it to a climax.

“We animate and light in Maya, and then use our own tools to bridge between Maya and 3Delight,” Bannayan says. “For cloth and hair, we have custom software extensions in Maya. Everything starts with Maya as a base, but we have custom geometry tools and our own ways of deforming.”

Rooftop Rage
All told, Soho VFX’s crew of approximately 100 artists created 150 shots, most taking place during the nighttime battle between Hulk and Abomination on Harlem rooftops that extends for nearly two minutes of screen time. For those shots, Soho VFX built highly detailed CG rooftops and the streets seen below. “We had to be able to shoot everything close up,” Magled says. “Every water tank, ledge, and brick. The previs kept changing; we had to be prepared for anything.”

A crew of approximately 35 modelers and texture painters spent five months constructing the environments, working from Lidar scans and 19GB of photographs.

When they finished, the entire asset—the texture maps, models, shaders, and so forth—totaled 750GB, but animators could work with only the sections they needed for particular shots. They could also select whether they wanted to put the characters into low-, medium-, or high-resolution backgrounds as they worked. “We had only one shot with the entire rooftop,” Magled says. “In that shot, Abomination runs from end to end, with the helicopter firing at him.”

For the creatures’ musculature, Soho VFX worked with Autodesk on a custom build of the muscle technology in Maya, and then spent months creating additional tools. “We spent as much time on Hulk’s muscles as we spent on building the rooftop,” says Bannayan. “It’s all about seeing how the creatures’ veins and tendons pop, and how the muscles interact with the skin.”

Animators could work with traditional weighted skin to get coarse animation and then activate the muscle rigging to see the muscles jiggle and deform. In addition, localized displacement maps linked to muscle movement added wrinkles, veins, and fine details. A lightweight shader set provided quick renders for verification during the process. And, a new system of blendshapes accelerated character cleanup.

“Before, character cleanup was tedious,” Bannayan says. “We now have a system of cleanup shapes that we can use to fix any part of a character without affecting other parts.” The crew also rebuilt the lighting rigs to smooth the process and reuse lights set up for similar environments.
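
A masked corrective shape captures the gist of such a cleanup system: the sculpted fix is weighted by a per-vertex mask, so it can be dialed in locally without disturbing the rest of the character. A minimal sketch, with names that are assumptions rather than Soho VFX’s tools:

import numpy as np

def apply_cleanup_shape(verts, corrective_verts, mask, strength=1.0):
    """Add a sculpted cleanup shape only where the per-vertex mask is nonzero,
    so a fix to, say, a shoulder never disturbs the face."""
    return verts + strength * mask[:, None] * (corrective_verts - verts)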

On the Edge
Hydraulx, on the other hand, had only three months to create its 300 shots, which included an Abomination transformation, CG environments, and multiple effects. In Abomination’s transformation, we see him change part by part, starting with his boots.

“We had a surface model and texture maps, but not shaders,” says Greg Strauss, who shared the job as visual effects designer with Colin Strauss. “We rigged the model, giving animators spheres of influence—proprietary plug-ins for Maya—they could use to precisely control what part would transform.” Modelers also created details with Autodesk’s Mudbox and with Pixologic’s ZBrush displacements baked into the model.

“We had a second layer of deformation on top when the skin was growing and the monster beneath was pushing it aside,” says Chris Wells, visual effects supervisor, describing how the group created the “snake shedding its skin” effect. “We had multiple ways to push things around with the deformer. Once we applied a deformer, it would tear the model open without doing horrible damage to the UVs.” Animated textures that coincided with the deformers affected the geometry according to color; the texture maps animated off as the green skin pushed through. Animators keyframed his boots tearing apart, used a rigging trick to pop off the threads, and used Syflex cloth simulation to tear his pants.
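
The skin-shedding effect can be sketched as a per-vertex mask that simultaneously pushes the outer skin layer aside and fades its texture. This is a schematic stand-in for Hydraulx’s deformers and animated maps, not their plug-in code:

import numpy as np

def shed_skin(outer_verts, normals, tear_mask, push=1.0):
    """The animated tear mask (0..1, sampled per vertex) both pushes the outer
    skin layer aside and fades its texture, so the surface underneath shows through."""
    displaced = outer_verts + push * tear_mask[:, None] * normals
    old_skin_opacity = 1.0 - tear_mask   # the old texture 'animates off' as the mask grows
    return displaced, old_skin_opacity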

Perhaps most important for this work, though, was a new photometric lighting system that Hydraulx installed in time for this film. “In the past, we’d cheat the light fall-off values,” Strauss says. “What was in the fill light would be a cheat. Now, it’s physically accurate. We match the true light, the energy level of the true lights.”
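
The difference is easiest to see in falloff terms: a cheated light dims by a hand-tuned exponent, while a photometric light dims with the square of the distance as its energy spreads over a sphere. The intensities and distances below are illustrative, not production values:

def cheated_falloff(intensity, distance, exponent=1.2):
    """The old approach: a hand-tuned exponent so the fill 'looks right'."""
    return intensity / distance ** exponent

def photometric_falloff(intensity, distance):
    """Physically accurate: energy spreads over a sphere, so it falls off with distance squared."""
    return intensity / distance ** 2

# For a light of intensity 1000 measured at 2 and 8 units away:
#   cheated:      ~435.3 and ~82.5  (flatter; the fill stays artificially bright)
#   photometric:   250.0 and  15.6  (steeper; it matches the true light's energy)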

Much of that work happens with lens shaders and output shaders in Mental Images’ Mental Ray (now owned by Nvidia). Hydraulx changed the color space for calculations at the end of the Mental Ray pipeline to a photometric color space. But the studio also implemented final gathering, in which the color from every object in a scene influences its surrounding environment.

“Final gathering is particularly important on daytime exteriors because of the fill light in the Earth’s atmosphere,” Strauss says. “We thought it was too time-intensive in the past, but the tables have turned. Now, it takes too long to fake it. We turn on final gathering and all of a sudden, things look photoreal. Even our less-experienced artists can make stuff look good.”
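
In schematic terms, final gathering fires rays over the hemisphere above each shading point and gathers the color of whatever they hit, which is how a green Hulk standing on gray asphalt ends up tinting the ground around him. The toy sketch below only stands in for Mental Ray’s far more sophisticated implementation; trace_color is a hypothetical stand-in for the renderer’s ray query:

import numpy as np

def final_gather(point, normal, trace_color, num_rays=64, rng=None):
    """Rough indirect-light estimate: shoot rays over the hemisphere above a shading
    point and average the cosine-weighted color of whatever each ray hits, so nearby
    surfaces bleed their color into the point's illumination."""
    rng = rng or np.random.default_rng(0)
    normal = np.asarray(normal, dtype=float)
    gathered = np.zeros(3)
    for _ in range(num_rays):
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)
        if np.dot(d, normal) < 0.0:
            d = -d                      # keep the ray in the hemisphere above the surface
        gathered += np.dot(d, normal) * trace_color(point, d)
    return gathered / num_rays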

Maximum Motion

Rhythm & Hues animation director Keith Roberts has performed animal characters in Babe, Harry Potter, The Chronicles of Narnia, and Garfield, but until The Incredible Hulk, he had never animated a human character, nor had he worked with motion capture. Perhaps as a result, he approached this project with his eyes wide open.

Giant Studios managed the equipment and sessions, providing real-time playback that allowed the performers to see their avatars on stage in Toronto and in Los Angeles.

“I had three performers with their own different styles of motion,” Roberts says. “I cast the performers per-character and also per-action. Some people don’t have a body type suited to certain actions—it’s a subtle thing, but I picked up on it straight away.”

For example, one performer could run and walk like Hulk, but couldn’t roll in the way Roberts imagined Hulk would roll. Another performer had a particularly good stance for Hulk when the giant roared.

“Motion from motion capture is as pure as it gets, so once you see the differences, you want to start with something right,” Roberts says. “If you give an animator data from a performer who moves his arms too much, or looks too bowlegged, your chances of success are diminished.”


Roberts worked primarily with two performers in Toronto: Terry Notary, a former gymnast, Cirque du Soleil performer, and choreographer; and Cyril Raffaelli, a martial arts expert, acrobat, and parkour practitioner (parkour being the discipline of moving quickly and efficiently over obstacles in an urban or rural environment). Although the production had motion-capture suits for both lead actors, Edward Norton (Hulk) didn’t wear his, and Tim Roth’s (Abomination) motion wasn’t right. “Cyril was faster and could do Abomination better,” Roberts says. “So, that left Terry more of the Hulk work to do.” He estimates that the motion-capture data landed in scenes without tweaking from animators in only about four of the 240 shots.

“If we wanted the characters to look like guys in suits, we could have plugged in the data and gone ahead,” Roberts notes. “But they’re larger than life, heavier, stronger, and faster, so we had to speed up some parts, slow down other parts, and reconstruct the motion to give the characters more weight.”
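
Retiming captured motion in this way amounts to resampling each channel through a time-warp curve: compress a span of frames and the action plays faster, stretch it and the action slows and gains apparent weight. A minimal sketch with illustrative frame values, not data from the production:

import numpy as np

def retime_channel(frames, values, warp_out, warp_in):
    """Resample a motion channel through a time warp: output frame t reads the source
    curve at warp(t), so compressed spans play faster and stretched spans play slower."""
    source_times = np.interp(frames, warp_out, warp_in)
    return np.interp(source_times, frames, values)

# Illustrative warp: play the first half of the take twice as fast for the impact,
# then let the landing settle over the remaining frames.
# warp_out = [0, 25, 100]   # output frames
# warp_in  = [0, 50, 100]   # source frames they read from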

Roberts also discovered that they lost flexibility in the characters’ upper torso when they applied the motion-capture data to the rig. “I think that’s because you don’t put the targets directly on the skin,” he says. “So, it’s difficult to get the enormous amount of compression and extension happening in that area.”

Giant Studios provided Rhythm & Hues with only a rough track; Roberts wanted raw data. “We never asked them to clean it up,” he says. “I always wanted my best animators to make those decisions. I wanted raw materials coming into my ‘kitchen,’ not premixed sauces.”

All in all, Roberts found his first experience with motion capture educational.

“They teach you the principles in animation school,” says Roberts. “But it’s only when you study the motion capture, the way the body twists and torques, the way all the action comes from the hip, that you can see what they’re talking about. It was an epiphany.”

But, it didn’t make the job easier. “This is the hardest show I’ve done,” Roberts says. “With a cartoon character, you can get away with an enormous amount of dodgy animation. But with a human, my God. It’s so much more specific, so much more difficult.” –Barbara Robertson

Hydraulx’s lighting TDs work on eight-processor machines equipped with 16GB of RAM. “In the old days, when we were working out of our apartments, that was our entire renderfarm,” Strauss says. “Now, that’s one kid’s workstation.”

In addition to the creature work, Hydraulx worked on the university battle scenes. An OBJ model of Rhythm & Hues’ final Hulk acted as a proxy to hold out the dirt, dust, explosions, and the illusion of rays created with Maya fluid sims. Hydraulx also crashed and exploded the CG Apache helicopters. For a grotto shot, Hydraulx added a waterfall created using Next Limit’s RealFlow. And for the fight between Abomination and Hulk in the streets of New York City, Hydraulx built the city.

“We matched Soho’s aesthetic because they had final shots for what the city needed to look like,” explains Strauss. Using textures from their library and painted textures, the group quickly modeled five blocks close up and a larger area for midground shots. “The sequence is at night with a fast camera, so the midground buildings didn’t have to be so detailed,” Strauss says. The CG team worked 18-hour days, and two shifts of compositors worked on Autodesk Inferno systems to complete the shots. “The night crew left as the morning crew showed up,” Strauss says. But, together, they pumped out a remarkable 300 complex shots in three months.

It’s easy, when you watch a film like The Incredible Hulk, to focus on the action and forget that CG artists created every bit of muscle and tendon straining Hulk’s skin; that they touched every chunk of concrete ripped from the roof of a building, every chain around Abomination’s neck, every propeller blade on a helicopter, and every metal fragment that lands on the ground; that they created the sad look on Hulk’s face; that it’s all digital. But that’s the point, of course.

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.