Living Art
Volume 32, Issue 6 (June 2009)


Rhythm & Hues and CafeFX help make possible a rollicking romp with art and artifacts for Night at the Museum: Battle of the Smithsonian


Finally, someone shot a feature film inside the Smithsonian Museums. And, such a film. Directed by Shawn Levy, the 20th Century Fox feature magically brings to life works of art, historical figures large and small, creatures, statues, and Einstein bobbleheads in a sequel aptly named Night at the Museum: Battle of the Smithsonian.


R&H animators could cut loose with the Einstein bobbleheads and have fun creating facial expressions for the cartoony characters.

As did the first Night at the Museum, the battle features Larry Daley (Ben Stiller), the museum guard from the earlier film, now an entrepreneur and inventor; Jedediah (Owen Wilson), a miniature cowboy; Octavius (Steve Coogan), another miniature; and Teddy Roosevelt (Robin Williams). Introduced in the sequel are Kahmunrah (Hank Azaria) and Amelia Earhart (Amy Adams). But, enough about the actors. The fun in this film centers on watching familiar inanimate objects and iconic artifacts take on a life of their own.

Rhythm & Hues (R&H) was the primary visual effects house for the film, and a crew of approximately 300 in its studios in Los Angeles and in Mumbai and Hyderabad, India, created an animated octopus, the Einstein bobbleheads, various sculptures, airplanes, falcon heads for the Horus creatures, a squirrel, and digital doubles. CafeFX concentrated on bringing paintings to life.

Co-visual effects supervisor Raymond Chen, along with Dan DeLeeuw, led R&H’s work on the 535 shots. “The fun of this project was that we had different kinds of work. It wasn’t like having an orange cat for 500 shots,” says Chen. “We had a furry creature but also hard surfaces, metal, marble, and octopus flesh—a lot of different problems to solve.”

The octopus was the most difficult character, the bobbleheads the most fun. At Rhythm & Hues, artists model with Autodesk’s Maya, and for displacement, they use Pixologic’s ZBrush and Autodesk’s Mudbox. For most other tasks along the pipeline, the crew uses the studio’s proprietary tools.

To handle the eight octopus tentacles, which were like limbs without bones, character riggers created a system that included procedural animation. “The director wanted something anthropomorphic and not too cartoony,” Chen says. “The rig had splines controlled with lots of knobs, automated volume preservation so that if you extended a tentacle, it would get thinner as it stretched, and methods that helped the tentacles and suckers slide over each other.”
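The volume-preservation behavior Chen describes can be pictured with a small sketch—not R&H’s proprietary rig, just the general principle: as a tentacle segment stretches along its control spline, its cross-section shrinks so the enclosed volume stays roughly constant.

```python
# Minimal sketch of spline-segment volume preservation (illustrative only,
# not R&H's actual rig): volume = pi * r^2 * L is held constant, so the
# radius scales with 1 / sqrt(stretch).

import math

def preserved_radius(rest_length, current_length, rest_radius):
    """Radius of a cylindrical segment after stretching."""
    stretch = current_length / rest_length
    return rest_radius / math.sqrt(stretch)

# Example: a segment stretched to twice its rest length thins to ~71%
# of its original radius.
print(preserved_radius(1.0, 2.0, 0.5))  # ~0.354
```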

To prevent interpenetration, geometry on the ground or on another tentacle could displace a tentacle resting on top of it. “When animators dragged one tentacle over the top of another, they didn’t have to worry about the contact,” Chen says. “The rig would cause the bottom tentacle to push the top one up.”
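A rough way to picture that automatic contact behavior (assumed logic, not the actual rig): each sample point on the dragged tentacle is simply pushed upward wherever it would otherwise sink into whatever lies beneath it.

```python
# Hedged sketch of automatic contact resolution: points on a tentacle are
# lifted so they rest on, never inside, the geometry below them.

def resolve_contact(point_heights, ground_heights, thickness):
    """Return heights displaced so each point sits at least one tentacle
    thickness above the surface underneath it."""
    return [
        max(p, g + thickness)
        for p, g in zip(point_heights, ground_heights)
    ]

# Example: the middle point would interpenetrate, so it is pushed up.
print(resolve_contact([1.0, 0.2, 1.0], [0.0, 0.5, 0.0], 0.1))
# [1.0, 0.6, 1.0]
```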

After the animators completed their work, the effects department slimed the octopus based on proximity to other surfaces, using geometry rendered with transparency. “When the tentacles wrap around Ben Stiller, we tracked in geometry for him,” Chen explains, “and put slime in the areas closest to the tentacle. As a tentacle got farther away, the slime would stretch and then snap.”
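The proximity-driven slime Chen describes could be driven by a per-point weight like the one sketched below; the distances and thresholds are invented for illustration—strongest at contact, stretching as the gap grows, then snapping to zero past a break distance.

```python
# Hedged sketch of a proximity-based slime weight (illustrative values,
# not the effects department's actual setup).

def slime_weight(distance, full_dist=0.05, break_dist=0.5):
    """1.0 in contact, fading toward 0.0, then cut off entirely."""
    if distance <= full_dist:
        return 1.0
    if distance >= break_dist:
        return 0.0  # the strand "snaps"
    return 1.0 - (distance - full_dist) / (break_dist - full_dist)

for d in (0.02, 0.2, 0.6):
    print(d, round(slime_weight(d), 2))
```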

The marble statue of Abraham Lincoln presented the opposite challenge: It couldn’t look stretchy or rubbery. “The animation style was more constrained, but even in the rig, we limited the area of influence for movements in the mouth,” says Chen. Similarly, to stiffen the cloth, the simulation artists hid the movement inside the folds.


At left, actor Ben Stiller tosses water from a Turner painting animated by CafeFX onto an octopus created and animated by Rhythm & Hues, while Amy Adams looks on. At right, a special rig helped the crew at Rhythm & Hues lock CG falcon heads onto stunt actors’ shoulders to create the Horus.

In one scene, Lincoln smashes through a window and attacks the Egyptian god Horus, a humanoid with the head of a falcon. Horus emerges from an effects-heavy netherworld filled with particle smoke and mist. “Transforming flocks of these bird-like creatures from the mist and chaos of the underworld was a healthy amount of work, even though it’s a handful of shots,” Chen says. “We did some things with procedures to warp and transform the geometry.”


A constrained animation style and limitations in the rig helped animators create a performance for the statue of Lincoln without losing its marble essence.
Images ©2009 Twentieth Century Fox Film Corporation.


The creatures’ bodies were stunt actors filmed on greenscreen, onto which the artists fit the bird-like heads using a rig that helped lock down the feathers in the transition zone. To create the feathers—a number of which are metallic and inlaid with jade, several are dreadlocks with metal claws at the end, and many others are like more typical hair and fur—the crew used multiple techniques. “Some were cards of geometry with opacity maps,” Chen says. “Some were hair or fur groomed in layers to look like feathers. The dreadlocks and metallic feathers used simulations—cloth sims to swing the dreadlocks and rigid-body dynamics to swing the metallic feathers.”

The challenges for the squirrel and the bobbleheads centered more on animation artistry than on technology, for the most part. “The squirrel is a scale gag,” Chen says. “We see him most of the time from Octavius’s point of view, and he’s one of the miniature diorama guys. So, part of the joke is that the squirrel rears up and appears to be roaring, but he’s still a small, cute squirrel.” The studio used its well-developed fur tools to keep the critter looking fluffy, even when it interacts with the White House lawn, which the crew created with CG hair.

The animators had the most freedom in animating the Einstein bobbleheads. Modelers worked from scanned practical models to create the caricatured versions of the renowned scientist. “The bobbleheads were great to work with because of their proportions,” Chen says. “Einstein has a huge head that’s constantly bobbling, and short arms and legs. The animators didn’t have to worry about keeping him stiff like the sculptures. He had a lot of dialog, so the animators could be expressive. It was about getting the best performance.”

In addition to these characters and creatures, R&H created digital doubles that the crew used in whole and in part for Stiller (Larry), Adams (Amelia), and several other actors—especially for Coogan (Octavius), who rides the squirrel. And, the studio animated such famous sculptures as Auguste Rodin’s “The Thinker,” Edgar Degas’ “La Petite Danseuse de Quatorze Ans,” Jeff Koons’ “Balloon Dog,” and an abstract sculpture by Isamu Noguchi.

Leaping Off the Wall

The artistry at CafeFX, on the other hand, centered on paintings in the National Gallery. “When Ben Stiller gets hit with a snowball thrown by a kid in Agnes Tait’s painting, it’s the clue that everything comes alive in the museum,” says Scott Gordon, visual effects supervisor at CafeFX, referring to Tait’s famous “Skating in Central Park” painting of winter revelers from 1934.

For background paintings, the animation is subtle—you might see a bit of a wall breaking, or a curtain moving slightly. But some paintings played a more active role. For example, Stiller grabs a pitchfork from Grant Wood’s “American Gothic.” And, in another shot, a JMW Turner seascape saves the life of Rhythm & Hues’ octopus. “The gag is that the octopus is having trouble breathing,” Gordon says. “So Ben Stiller grabs the painting and heaves water onto the octopus.”


CafeFX brought several paintings in the National Gallery to life, including this seascape by JMW Turner, using tools from RE:Vision Effects enhanced with additional features and techniques whose origins trace back to the optical-flow particle-manipulation technology used for What Dreams May Come.

Gordon, who had been a CG supervisor at Mass Illusions for the painted-world sequence in What Dreams May Come, which won a visual effects Oscar, relied on toolkits from RE:Vision Effects, whose founders won a technical achievement award for those tools. “Pete Litwinowicz and Pierre Jasmin had created products based on the ideas in the optical-flow particle-manipulation software we used for What Dreams May Come,” Gordon says. “In its simplest form, you can track the motion in a filmed scene and attach paint strokes to it. And, you can do a lot of things with the paint strokes and the way they appear.” For example, artists can base the paint strokes on sprites created in Adobe’s Photoshop from scanned brush strokes, or use algorithms to generate the strokes, and then render the strokes so they look like paint.
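At its core, the track-and-attach idea Gordon outlines amounts to sampling a motion (optical-flow) field at each stroke’s anchor and carrying the stroke along with it frame to frame; the sketch below assumes a simple per-pixel flow layout rather than the actual RE:Vision data structures.

```python
# Minimal sketch of attaching paint strokes to tracked motion (assumed
# data layout: flow[y][x] = (dx, dy) in pixels).

def advect_strokes(strokes, flow):
    """Move each stroke anchor by the flow vector sampled at its pixel."""
    moved = []
    for x, y in strokes:
        dx, dy = flow[int(y)][int(x)]
        moved.append((x + dx, y + dy))
    return moved

# Example: a 2x2 flow field pushing everything one pixel to the right.
flow = [[(1.0, 0.0), (1.0, 0.0)],
        [(1.0, 0.0), (1.0, 0.0)]]
print(advect_strokes([(0.0, 0.0), (1.0, 1.0)], flow))
# [(1.0, 0.0), (2.0, 1.0)]
```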

“We contracted with RE:Vision to add some new features to their products,” Gordon says. “For example, they have a feature that builds random variation into the brush strokes, and we wanted to break down the variation to control how much was hue-based, saturation-based, and luminance-based. That’s just one example of many.”
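The kind of per-channel control Gordon mentions might look like the sketch below, with separate jitter amounts for hue, saturation, and luminance; the parameter names are illustrative, not the actual RE:Vision controls.

```python
# Hedged sketch of per-channel stroke-color variation: separate random
# jitter amounts for hue, saturation, and luminance (illustrative only).

import colorsys
import random

def vary_stroke_color(rgb, hue_amt, sat_amt, lum_amt, rng=random):
    h, l, s = colorsys.rgb_to_hls(*rgb)
    h = (h + rng.uniform(-hue_amt, hue_amt)) % 1.0
    s = min(max(s + rng.uniform(-sat_amt, sat_amt), 0.0), 1.0)
    l = min(max(l + rng.uniform(-lum_amt, lum_amt), 0.0), 1.0)
    return colorsys.hls_to_rgb(h, l, s)

print(vary_stroke_color((0.8, 0.2, 0.1), hue_amt=0.02, sat_amt=0.1, lum_amt=0.05))
```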

A number of the paintings started as live-action footage that CafeFX altered—manipulating the photography, as in “American Gothic,” to stretch and squash the filmed actors into the shape of the farmers in the Grant Wood painting. Some paintings were entirely CG, as was the Turner painting, except for foreground characters in the last scene. For modeling, CafeFX used Maya, and for the water, Maya particles with Side Effects Houdini procedures. And, of course, the RE:Vision tools.

“We tried to be faithful to each painting in every way we could,” Gordon says. For Jackson Pollock’s swirling “Convergence,” they pushed Maya particles along splines using a system developed by CG lead Scott Palleiko. And, CG supervisor Will Nicholson caused Roy Lichtenstein’s “Crying Girl” to wipe her real tears. “That was a combination of CG animation, various contour renderings, and some compositing tricks supervised by Theresa Rygiel,” Gordon says. “It was quite complicated to make the face look exactly like what we expect Lichtenstein’s idealized girl to look like when she turns around.”
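Pushing particles along splines, as Palleiko’s system did for the Pollock, can be approximated in a few lines; the sketch below substitutes a simple polyline for a true spline and advances each particle’s curve parameter every frame.

```python
# Simplified stand-in for "particles along splines": a polyline replaces a
# real spline, and each particle's parameter t advances per frame.

def point_on_path(path, t):
    """Linearly interpolate a point at parameter t in [0, 1] along a
    polyline given as a list of (x, y) points."""
    n = len(path) - 1
    f = min(t, 1.0) * n
    i = min(int(f), n - 1)
    u = f - i
    (x0, y0), (x1, y1) = path[i], path[i + 1]
    return (x0 + (x1 - x0) * u, y0 + (y1 - y0) * u)

path = [(0, 0), (1, 2), (3, 2), (4, 0)]
particles = [0.0, 0.3, 0.7]               # parameters along the curve
particles = [t + 0.1 for t in particles]  # advance each particle one frame
print([point_on_path(path, t) for t in particles])
```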

For most paintings, the artists tried to reproduce the painter’s brush strokes in oil, charcoal, and other mediums, and then scanned those into Photoshop. Applying such brush strokes to live-action footage made it possible to create the illusion that an actor filmed smashing a bottle is behind the counter in Edward Hopper’s “Nighthawks.”

When the artists took liberties, it was in adding motion—pouring water from a seascape and causing clouds in the sky to move in Turner’s dramatic “London from Greenwich Park,” for example, and in adding touches like god rays to Albert Bierstadt’s “Among the Sierra Nevada Mountains.” “Bierstadt’s painting was the one most like What Dreams May Come in terms of its setting,” Gordon says. “It’s only a three-second shot. We put in enough movement to see it, but not so much it took away from the foreground. It wasn’t always about what looked right in the painting, but what looks right in the shot.”

With effects ranging from setting into motion a painting’s captured moment in time to causing Abraham Lincoln to rise from his marble chair, these studios have shown, once again, that, in the hands of artists, computer graphics tools can make the impossible look real.


Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.