Artists use digital effects to give a modern-day slant to a classic fairy tale
Images © 2001 Henson/Hallmark.
If you think you know the story of Jack and the Beanstalk, well, you don't know Jack. That's because Jim Henson Television has created a contemporary follow-up to the classic fairy tale, in which a modern-day Jack follows in his great-great-great grandfather's footsteps up the fabled beanstalk, this time seeking to right the wrongs committed by his family over the years.
Jim Henson's Jack and the Beanstalk: The Real Story blends computer-generated effects created by Jim Henson's Creature Shop (Camden, England) with live action in an adventure/comedy starring Matthew Modine as Jack, Vanessa Redgrave as the immortal family matriarch, and Jon Voight as a conniving business associate. In the two-part CBS Television Network miniseries, which aired in December, Jack Robinson discovers the secret to his family's wealth: a goose that lays golden eggs, stolen from a giant long ago by Jack the First, one of the greatest thieves of all time. Since then, the riches enjoyed by the family have come with a hefty price, a curse that cuts short the lives of the family's males, which Jack Robinson seeks to break by revisiting the fantastic world discovered by his namesake.
The show marks the first time in The Creature Shop's history that all the visual effects work has been handled in-house for a project of this size. All told, the company produced more than 550 visual effects, including 3D animation, camera tracking, digital matte painting, rotoscoping, and compositing, completed in less than five months. "It made sense to use The Creature Shop not only from a financial perspective but also for the benefits of having everyone working under one roof," says Brian Henson, director and executive producer. "The same group that conceived the characters and visual effects was also responsible for their creation, resulting in a unified, consistent look."
One of the program's most dramatic and innovative effects occurs when a computer-generated beanstalk suddenly erupts through the leafy forest floor of a live-action setting, throwing leaves, dirt, and other debris into the air. Although a 20-foot-high practical beanstalk was used for most of the close-up shots, a team of five artists procedurally modeled and textured a digital replica using Alias|Wavefront's Maya for the wide-angle and "growth" shots. The use of procedural techniques resulted in high-resolution, definitive textures, enabling the artists to match the digital beanstalk with the practical one as the camera cut between the two. "We had to grow and age the 3D model from a young sprout to an ancient, gnarled plant that was miles high, all in a minute and a half," says Sean Feeney, CGI supervisor. "It was a challenge technically and aesthetically to make the plant look organic and friendly, not ghoulish or horrific, as it grew."
To accomplish this, the group "entwined" the texturing and animation processes. To grow the beanstalk, they used Maya animation curves to "creep" the geometry up a profile curve so it could be animated by a series of clusters along its length. Because the geometry was grown frame by frame, the UV coordinates had to be updated in each frame, making texturing extremely difficult. To overcome this obstacle, the team scripted tools so the animation would also drive the textures and the customized shaders for Pixar Animation Studios' RenderMan. "By taking advantage of the open, common language of the two programs, we were able to automate some processes, which saved a tremendous amount of time," says Alex Wuttke, lead technical director for Jack and the Beanstalk. "As a result, the beanstalk textures evolved as the plant grew and matured." Additionally, the artists used high-dynamic-range imaging techniques for replicating the actual lighting setup in the digital space.
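The coupling Wuttke describes, one driving parameter updating both the geometry and its shading so the two can never drift apart, can be illustrated outside any particular package. The following is a minimal sketch: the segment structure, the `bark_roughness` attribute, and the growth rates are all invented for illustration and are not The Creature Shop's actual tools.

```python
def grow_beanstalk(t, max_segments=100):
    """Sketch: one normalized time t in [0, 1] drives both how much geometry
    exists (segments 'creep' up a profile curve frame by frame) and how aged
    each segment's texture looks, so shading always matches the growth."""
    num_segments = max(1, int(t * max_segments))
    segments = []
    for i in range(num_segments):
        birth = i / max_segments           # when this segment first appeared
        age = max(0.0, t - birth)          # segments near the base are oldest
        segments.append({
            "height": i,                             # position along the profile curve
            "radius": 0.5 + 2.0 * age,               # the stalk thickens as it matures
            "bark_roughness": min(1.0, 4.0 * age),   # texture 'ages' with the animation
        })
    return segments

stalk = grow_beanstalk(0.5)
print(len(stalk))                          # halfway through growth: 50 segments
print(stalk[0]["bark_roughness"])          # gnarled base: 1.0
print(stalk[-1]["bark_roughness"])         # fresh tip: nearly smooth
```

Because texture age is derived from the same `t` that grows the geometry, there is no separate texture animation to keep in sync, which is the point of the scripted tools described above.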
[Image caption: To grow the magical beanstalk for Jim Henson's Jack and the Beanstalk: The Real Story, animators procedurally modeled and textured the plant in Maya. To ensure that the model matured as it shot upward, the artists scripted software tools so that the animation also drove the textures.]
After the computer-generated beanstalk was created, the team rendered it using RenderMan, then composited it with the live-action elements, replacing an actual prop that was buried under a pile of leaves and manually winched upward to produce the desired chaos. Tracking such a shot is difficult because of all the foreground movement. "We were afraid that all the motion would prevent us from getting an accurate track," recalls Wuttke. However, 2d3's boujou automatic tracking software locked onto the stationary leaves that filled the background of the frame, allowing the shot to be tracked in about an hour and a half.
The group also used boujou to track dozens of other shots throughout the film, especially when the backgrounds provided little depth information, such as when the forest floor fills the frame, forming a defocused background. The combination of soft focus and the lack of reference to features at various distances from the camera lens typically makes tracking extremely difficult, notes Wuttke, which would have been disastrous for this production in which so many shots were filmed in an organic forest setting.
[Image caption: The miniseries included many fantastical characters, such as Harmonia, a 3D model with curved features and a reflective gold color that made lighting her difficult.]
"Camera tracking was used extensively throughout this movie," says Wuttke. "There were a lot of shots where the camera motion contained noise because of the instability of shooting in an uneven wooded setting. We also used it to create a 360-degree set extension at the end of the movie when the giants are casting judgment on Jack within The Great Hall."
Once Jack climbs up the beanstalk, he arrives in a fantastical world inhabited by Harmonia, a 3D female character fused to a golden harp, as well as a live-action giant with an animatronic head, a Henson-puppeteered goose, and many other characters both real and digitally enhanced. "In each instance, we tried to use the most appropriate [character creation] technique, whether it was through animatronics, puppetry, prosthetics, CGI, or hybrids," Feeney says.
The goose that lays golden eggs, a creature covered in feathers, was an ideal choice for an animatronic. But creating an entire human animatronic for the giant Cernos or the harpist Harmonia would have been nearly impossible because of the space that would've been needed to fit all the necessary controllers inside their limbs. So for Cernos, a giant who is half man, half stag, the group used an animatronic headpiece worn by a live actor. "We had The Creature Shop build us a mask that took advantage of the things that animatronics do well, like jaw, eye, and ear movement, and added the rest of the effects in post, such as the lip sync and resulting muscle deformation," explains Henson. "We didn't attach the head prosthetically. It was easier and less expensive to paint out the seams between the actor's face and the mask. Then we warped the lower face of the headpiece to emulate muscle deformation using [Discreet's] flame."
[Image caption: The giant Cernos was played by a real actor wearing an animatronic headpiece. Some effects, such as lip sync, were easiest to accomplish in post.]
Predicts Feeney: "In the future, you'll see both sides of the fence, animatronics and CG, coming together more frequently to create a dynamic effect because of its efficiency."
Ensuring that the practical and CG effects of the diverse characters blended smoothly was the responsibility of animation supervisor Mak Wilson, who oversaw the 3D animation of Harmonia and the puppeteering of the photorealistic goose, built at The Creature Shop and animated both by hand and with the Henson Performance Control System.
[Image caption: The goose that lays the golden eggs was a practical model whose body was animated by puppeteers (above, left) who were later removed from the shot (above).]
By far the most complex creature, according to Henson, was Harmonia, mostly because of her golden coloring. "It's surprising how difficult it is to texture and light gold so that it looks like metal and has the proper reflective characteristics," he says. Complicating those tasks even more were the numerous scenes in which Harmonia appears, all of which required different lighting setups. To improve her reflective properties, the team used its proprietary environmental mapping techniques as well as Softimage|XSI's radiosity function. "The ultimate nightmare was lighting two diverse golden objects: the CG harp, which consists of flat shapes and planes, and Harmonia, who is all curved surfaces," explains Feeney. "Unfortunately, there is no plug-in to solve this issue; we had to do it by trial and error."
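Although the team's environment-mapping tools are proprietary, the difficulty Henson describes is general: a metal's reflectance is both colored and view-dependent, so gold only reads as metal when reflections of the surroundings are tinted correctly at every viewing angle. The sketch below uses Schlick's Fresnel approximation with typical published normal-incidence values for gold; the numbers are generic reference figures, not production data.

```python
def schlick_fresnel(cos_theta, f0):
    """Schlick's approximation: reflectance rises toward 1 at grazing angles.
    For metals, f0 is colored per channel, which is why gold only looks
    metallic when the reflected environment is tinted by it."""
    return tuple(f + (1.0 - f) * (1.0 - cos_theta) ** 5 for f in f0)

# Approximate normal-incidence (F0) reflectance of gold, linear RGB.
GOLD_F0 = (1.00, 0.77, 0.34)

def shade_gold(env_color, cos_theta):
    """Tint a reflected environment sample by gold's view-dependent reflectance."""
    f = schlick_fresnel(cos_theta, GOLD_F0)
    return tuple(e * fc for e, fc in zip(env_color, f))

# Viewed head-on, a white environment reflects strongly gold-tinted:
print(shade_gold((1.0, 1.0, 1.0), 1.0))  # (1.0, 0.77, 0.34)
# At grazing angles the reflection approaches the raw environment color:
print(shade_gold((1.0, 1.0, 1.0), 0.0))
```

This also hints at why flat planes (the harp) and curved surfaces (Harmonia) behave so differently under the same lights: a curved surface presents a continuous range of `cos_theta` values, and therefore a continuous range of tints, in a single view.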
The harp used in the production begins as a physical carved figurine of a sleeping woman nestled around the instrument. But when the woman is summoned by the giant Thunderdell to perform, a 3D model replaces the prop. Originally, the team had considered using an oversize harp embraced by a gold-painted actress. "Henson wanted that kind of lifelike performance in the CG realm, since the gag was having a gold object come to life," he says. "But if we had used an actress, she wouldn't have looked magical; she would have simply looked like a person covered in gold paint hugging a harp."
[Image caption: Digital matte paintings refashioned a flat, barren landscape (above) into a lush mountain vista (above, right) for this scene in the giants' kingdom.]
Before creating the digital model, artisans developed a larger, more detailed version of the prop, only with the woman's arms outspread, and then digitized it using a Cyberware scanner. "One of the issues was that the carved figurine was in a sleeping position and hugging the body of the harp, so if we scanned the practical object used in the production, a large part of her geometry would have been obscured," explains Feeney.
Next, the artists imported the scan data into XSI and rebuilt the model using subdivision surfaces. This enabled them to model, texture, and animate a complex, one-piece character while maintaining a very low resolution until it was time to render the model using XSI's integrated mental ray raytracing technology from Mental Images. Using XSI's Animation Mixer feature, they created a library of poses and movements for Harmonia, including expressions, constraints, and fixed shapes, so they could maintain a consistent look for the character in the various scenes in which she appears. "Altogether there are about four and a half minutes of harp animation, which was done by eight different people," notes Feeney. "Using Mixer helped keep everyone on the same page."
[Image caption: The giants and "little" people are the same size, but were shot on different-size sets and at different film speeds to give the giants a slower, weightier look. The shots were then recalibrated to the appropriate speed using RE:Vision Effects' ReelSmart Tw…]
In addition to the beanstalk and characters, the artists also spent a great deal of time creating digital matte paintings, especially for the backgrounds in the giants' kingdom, using mostly Adobe Systems' Photoshop, as well as Right Hemisphere's Deep Paint 3D, NewTek's LightWave, and Maya. "We had camera moves within the plates, so we applied the textures to the geometry in LightWave and Maya, which enabled us to multi-plane through the matte paintings," explains Feeney. "So on a still frame, the backgrounds had an imaginary look, but when the cameras were panning and tilting, the depth gave them a real feel."
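The multi-planing Feeney describes works because painting layers projected onto geometry at different depths slide across the screen by different amounts as the camera pans, and that parallax is what sells the depth. A toy version of the computation (the depths and focal length are arbitrary assumptions, not values from the production):

```python
def parallax_offsets(layer_depths, camera_pan, focal_length=1.0):
    """Screen-space offset of each painting layer for a sideways camera pan:
    nearer layers slide farther than distant ones, producing parallax."""
    return [focal_length * camera_pan / depth for depth in layer_depths]

# Foreground trees, midground hills, far mountains (arbitrary depth units).
offsets = parallax_offsets([10.0, 50.0, 500.0], camera_pan=2.0)
print(offsets)  # [0.2, 0.04, 0.004]
```

On a still frame all three layers look like one flat painting; only the differing offsets during a pan or tilt reveal the depth, matching Feeney's description.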
All told, the majority of the 550 effects shots in Jack and the Beanstalk involved 2D compositing and rotoscoping using a variety of software, including Avid Technology's Media Illusion, Nothing Real's Shake, and Discreet's flame and combustion running on SGI Octanes, and Pinnacle Systems' Commotion on a "roto farm" of Macintosh G4s. "We had to do a lot of rotoscoping for this project," says Feeney. "When you use bluescreen for large outdoor locations, it's hard to get it flat, and then it rains and the screen gets crinkled. It is therefore often easier to allow production to carry on with the filming and then rotoscope later." For instance, filming began in the UK during late winter when the actual wooded landscape was fairly barren. So the artists rotoscoped characters from their original backgrounds, which were replaced with environmental matte paintings to achieve the desired lush forest scenery as well as the vast open landscapes and waterfalls within the land of the giants.
Some of the composited scenes involved blending 3D characters and objects into a live-action scene. In other instances, the compositing is more subtle. For example, the actors playing the giants and little people are the same size, but the giants were filmed on undersized sets and the little people on oversized sets, then the imagery was combined to produce the desired effect. "The reality of doing this is that sometimes the two sets wouldn't fit together, but instead of warping and manipulating one set to fit the other, we just rotoscoped one and composited it onto the other," explains Feeney.
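Filming the giants at a higher frame rate and playing the footage back at the normal rate slows their motion, which is what gives them weight. A common rule of thumb from miniature and scale photography is to shoot at the base rate times the square root of the scale factor, since in dynamic similarity time scales with the square root of size. A sketch, with an assumed 24 fps base rate (the scale values are illustrative, not figures from this production):

```python
import math

def overcrank_rate(scale_factor, base_fps=24.0):
    """For a character scale_factor times normal size, scale-model practice
    says motion should take sqrt(scale_factor) times longer: shoot at
    sqrt(scale_factor) * base_fps, then play back at base_fps."""
    return base_fps * math.sqrt(scale_factor)

# A giant four times human height: shoot at 48 fps, play back at 24 fps,
# so every action unfolds at half speed and reads as massive.
print(overcrank_rate(4.0))  # 48.0
```

Retiming software of the kind mentioned in the article can then resample footage between such rates when the shot speed chosen on set needs adjusting in post.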
Although Jack and the Beanstalk has been told for centuries, Jim Henson Television breathes new life into the tale through a clever updated story line and state-of-the-art digital effects, which, like the original Jack, often "steal" the scenes.
[The author] is a senior associate editor at Computer Graphics World.
2d3 · www.2d3.com
Adobe Systems · www.adobe.com
Alias|Wavefront · www.aliaswavefront.com
Apple Computer · www.apple.com
Avid Technology · www.avid.com
Cyberware · www.cyberware.com
Discreet · www.discreet.com
Mental Images · www.mentalimages.com
NewTek · www.newtek.com
Nothing Real · www.nothingreal.com
Pinnacle Systems · www.commotionpro.com
Pixar Animation Studios · www.pixar.com
RE:Vision Effects · www.revisionfx.com
Right Hemisphere · www.righthemisphere.com
SGI · www.sgi.com
Softimage · www.softimage.com