Volume 25, Issue 6 (June 2002)

Beasts of Eden

By Karen Moltenbrey

In the lavish and technically complex six-hour television miniseries Dinotopia, two teenage brothers unexpectedly find themselves marooned on a fantastic lost continent where humans and dinosaurs peacefully coexist. Successfully portraying this near-utopian society required a similarly harmonious coexistence, forged by postproduction studio FrameStore-CFC (The Computer Film Company), among the live-action and computer-generated characters, environments, and props. "A large portion of the shots contained some type of integrated digital content," notes Mike McGee, the film's visual effects supervisor and creative director of London-based FrameStore-CFC.

The Hallmark Entertainment mega-movie, which aired last month on ABC, is based on two books by author/illustrator James Gurney. The three-part miniseries brings to life the lost Dinotopian society that's ruled according to long-established principles of mutual respect. However, dangerous human outlaws threaten this ideal world, as does the presence of deadly carnivores who lurk just beyond Dinotopia's capital, Waterfall City. In an effort to restore harmony, the bickering brothers, aided by a multilingual stenonychosaurus named Zippo, embark on a daring journey to save their new homeland.
Images courtesy Hallmark Entertainment and FrameStore-CFC.

The production, which took 18 months to complete, was the largest ever filmed at London's Pinewood Studios, home to the James Bond films. It also pushed the technical capabilities of the FrameStore-CFC team, which had honed its skills for animating reptiles while creating the Emmy-winning Walking with Dinosaurs series two years ago. To complete Dinotopia, its largest project yet, FrameStore-CFC hired more than 70 animators and technical directors to create the show's 1700 effects shots, 1200 of which contain computer-generated content.

Aside from generating more than 62 variations of 3D dinosaurs, the FrameStore-CFC team created entire CG environments and virtual people. In other instances, the animators enhanced live-action scenes and backgrounds with 2D and 3D graphics. Of all those effects, the two biggest challenges facing McGee's team were the juxtaposition of the humans and dinosaurs, which required every shot to be scaled to exact proportions, and the elaborate digital extensions of the largest single set constructed in Europe.

Charismatic Characters
All but one of the dinosaurs in Dinotopia resemble actual species, though their expressive faces make them more stylized than photoreal. And nearly all of them were created digitally, with a few exceptions in shots involving extensive interaction with live actors. In particular, these included the crocodile-like mosasaurus as it springs out of the water to attack the heroes and the infant dinosaur known as Number 26. During close-up shots in these sequences, animatronics created by Jim Henson's Creature Shop were interchanged with their CG counterparts, which were used in the medium-range shots. To ease these transitions, FrameStore-CFC worked with the Creature Shop during the design of the maquettes and their skin textures so their shapes and forms would appear seamless as the camera cut from live action to CGI and back.

To create the digital dinosaur models, the artists used Softimage 3D as their core modeling and animation platform. They also developed a number of proprietary plug-ins originally created for Walking with Dinosaurs, including an updated muscle deformation system to simulate muscle movement beneath the skin. The textures were hand-generated in Adobe Systems' Photoshop by an in-house team of specialized animal-skin painters. The skin designs contained numerous color and pattern variations as well as subtle wrinkles and creasing over underlying musculature. Once the textures were applied to the models, the artists fine-tuned them using Right Hemisphere's Deep Paint with Texture Weapons.
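The article doesn't detail FrameStore-CFC's proprietary muscle system, but the core idea behind muscle deformation, a joint-driven bulge pushing skin vertices outward along their normals, can be sketched roughly as follows. This is a minimal illustrative model; the function names, the sinusoidal bulge curve, and the per-vertex weighting are all assumptions, not the studio's actual code:

```python
import math

def bulge(flex_angle, max_bulge=0.2):
    """Assumed model: bulge grows smoothly with joint flexion (0..pi/2 radians)."""
    clamped = min(max(flex_angle, 0.0), math.pi / 2)
    return max_bulge * math.sin(clamped)

def deform_skin(vertices, normals, weights, flex_angle):
    """Push each skin vertex outward along its normal, scaled by how strongly
    the underlying muscle influences it (weight) and how flexed the joint is."""
    b = bulge(flex_angle)
    return [
        tuple(p + b * w * n for p, n in zip(v, nrm))
        for v, nrm, w in zip(vertices, normals, weights)
    ]
```

At zero flexion the skin stays at rest; at full flexion a heavily weighted vertex moves by max_bulge along its normal, which is enough to suggest musculature sliding beneath the skin as a limb animates.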

The character animation, like the textures, was also created from scratch. "We looked at motion capture but decided [keyframing] was more practical and gave us more freedom in determining the final look," says Alec Knox, supervising technical director. "Using actors to generate motions for these large nonhuman figures would have been extremely awkward and difficult to translate onto the models." The flexibility achieved through keyframing was especially important to the team of 14 animators who worked exclusively on the signature character Zippo, whose movements range from playing table tennis to gesturing with his glasses in a scholarly manner to floundering in a mosasaurus-infested water hole.
Starring alongside the cast of human actors is the digital character Zippo. Created in Softimage, this versatile reptile interacts seamlessly within a live-action environment containing actual and digital props.

For Zippo's facial animation, as well as that of the other dinosaurs, the artists constructed an animation setup in Softimage, which enabled them to achieve a range of expressions. The team investigated using lip-sync products but didn't believe they would work well given the unusual shape of the creatures' mouths. They also wanted to give the animators total control over the facial expressions and lip movements, which were closely intertwined. "Zippo's face doesn't look at all human; its structure is more like a cross between that of a dog and a lizard," explains Knox. "Furthermore, his mouth extends to the back of his long jaw, so we extended our mouth and lip movements to the back as well, making the movement appear more believable."

Yet, not all of the digital characters in Dinotopia were reptilian. To fill out the crowds in some of the larger Waterfall City scenes, the director used nearly 275 human extras. But in some instances, such as those containing dynamic camera moves like a virtual flythrough or aerial view, digital extras and replicas of the main characters were used instead. "We determined that we could place the digital actors within 15 feet of the camera without the eye detecting that they were not real," notes Knox. To generate these models more quickly, the artists took a series of high-resolution photographs, which were used as texture maps. Then, using a proprietary plug-in, the artists constructed subdivision surface models in Softimage, which enabled them to achieve more facial detail after applying the photographic textures to the digital people.

Character Integration
For integrating the digital dinosaurs into the live action, the group used a number of techniques, including camera tracking and previsualization. For instance, throughout the movie, real actors are seen riding on the backs of the digitally created Overlander species, which was artistically contrived. To ensure that the bucking movements blended naturally, FrameStore-CFC built a motion simulator that translated the Softimage animation data in real time into actual movements powered by the machine's hydraulics. "The actors would sit on saddles atop the rig, and we filmed them against a bluescreen as they were bounced around," explains McGee. "Back at the studio when we keyed the actors onto the CG dinosaurs, their motion was perfectly synchronized to the dinosaur's gait."
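Conceptually, the rig's control loop samples the animated saddle motion at the hardware's update rate and clamps the result to what the hydraulics can physically do. The sketch below is a hypothetical illustration of that pipeline; the keyframe values, the 50 Hz rate, and the stroke limits are invented:

```python
def sample_curve(keys, t):
    """Linearly interpolate a sorted list of (time, value) keyframes at time t."""
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return v0 + f * (v1 - v0)

def actuator_command(height, stroke=(0.0, 0.30)):
    """Clamp the requested saddle height (meters) to the ram's physical travel."""
    lo, hi = stroke
    return min(max(height, lo), hi)

# Drive a hypothetical rig at 50 Hz from a one-second bucking-gait height curve.
gait = [(0.0, 0.05), (0.5, 0.25), (1.0, 0.05)]
commands = [actuator_command(sample_curve(gait, i / 50.0)) for i in range(51)]
```

Because the commands come from the same animation curves that drive the CG dinosaur, the filmed actor's bounce lines up frame for frame with the creature's gait when the two are composited.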

At times, though, the rig's movements became too strong, requiring it to be bolted to the floor. As a result, the camera had to move around the character, rather than vice versa, to create a change in perspective. Therefore, to achieve a wider range of shots, the team used Softimage to drive a motion-controlled camera, which was also synchronized to the motion simulator.
Nearly half the environments in Dinotopia were computer generated. The artists chose soft, subtle lighting with warm highlights that helped create a magical feel for the fantastical lost world.

McGee also used Softimage to achieve naturalistic camera moves. This was done by first programming into Softimage the location of the actual camera used to film the sequences. Then, encoding devices were attached to the camera head, so as it panned and tilted, the CG camera followed the identical path through the virtual scene, shown on a monitor located on the set. This particular setup, he notes, was especially helpful for the shots that panned from the top of a brachiosaur's head, down its long neck, across its back to the human character sitting in the saddle, and then down to the crowd below.
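That encoder setup amounts to a simple per-frame mapping from encoder counts to angles, applied to the virtual camera relative to a pose recorded when the real and CG cameras were lined up. A minimal sketch, assuming a 4096-count-per-revolution encoder (the actual hardware's resolution isn't stated in the article):

```python
COUNTS_PER_REV = 4096  # assumed encoder resolution, counts per full revolution

def counts_to_degrees(counts):
    """Convert raw encoder counts from the camera head to an angle in degrees."""
    return counts * 360.0 / COUNTS_PER_REV

def cg_camera_angles(pan_counts, tilt_counts, pan_zero=0, tilt_zero=0):
    """Mirror the live camera's pan/tilt on the virtual camera, measured
    relative to the zeroed reference pose captured during lineup."""
    return (
        counts_to_degrees(pan_counts - pan_zero),
        counts_to_degrees(tilt_counts - tilt_zero),
    )
```

Reading the encoders every frame and feeding the resulting angles to the CG camera keeps the virtual scene on the on-set monitor locked to the operator's real pans and tilts.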

Because so many scenes contained digital characters and environments that would be added during postproduction, it was often difficult for the actors and director to establish a clear picture in their minds of the final shot composition. To overcome this problem, physical or digital stand-ins were used to provide a visual aid. For instance, dimensionally accurate physical stand-ins for the CG models helped give the director a better sense of scale and enabled him to review full shot compositions during the actual filming. Additionally, digital stand-ins for the CG elements were often viewed in real time with an "on set" monitor, which McGee used when helping to direct some of the crowd scenes. "We'd get approximately 20 or so extras at a time to gather around the monitor and review the Softimage scene so they could get a sense of scale and establish accurate eye lines," McGee explains. "Then, once we integrated the dinosaurs in and around the humans as they walked about Waterfall City in the final shot, it all looked natural."
Actors were filmed against a bluescreen as they sat atop a motion simulator, whose movements were controlled by the Softimage animation data.

Living Large
Techniques similar to those used for integrating the characters were also used to blend the real and digital environments. To support the massive walls and roofs of the 2.5-acre exterior Waterfall City set, carpenters used more than 85 miles of metal scaffolding, far more than is typically used for a motion-picture set. Despite its enormity, the physical Waterfall City set represents only 10 percent of the environment shown on screen. The rest was created digitally by FrameStore-CFC using Softimage and NewTek's LightWave.

To ensure that the large set extensions blended into the scene, the group conducted a survey of the physical set using a laser-ranging system, which samples remote surface points to within 1.5mm accuracy over a quarter-mile-square area. "We'd have a building that ended halfway up an arch, then we'd continue the arch in CG," explains Knox. "The only way to do that accurately with such organic, free-form architecture is to have a precise and thorough survey of the original set." Once that was accomplished, the digital artists converted the sample points into 3D space so they could be used to construct the virtual model.
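Converting one laser-ranging sample into a point the modelers can use boils down to a spherical-to-Cartesian conversion from the scanner's position. A sketch under assumed axis conventions (Y up, scanner at the origin); the real survey system's conventions aren't specified:

```python
import math

def range_sample_to_xyz(azimuth_deg, elevation_deg, distance):
    """Turn one scanner reading (two angles plus a range) into a 3D point,
    with Y up and the scanner at the origin."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.sin(az)
    y = distance * math.sin(el)
    z = distance * math.cos(el) * math.cos(az)
    return (x, y, z)
```

A full survey is just this conversion applied to every sample, with each scanner setup's surveyed position added as an offset so all the points land in one shared coordinate space for the virtual model.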
For Canyon City, an environment that was nearly 80 percent digital, the artists provided a realistic sense of depth by using sweeping camera moves through a landscape of towering, jagged rock formations.

Waterfall City's most impressive geological features by far are its 20 different types of cascading waterfalls, fed by an intricate web of canals that encircles the city; picture Venice situated atop Niagara Falls, as McGee describes it. While the underlying imagery for the waterfalls was shot in Brazil, McGee's team enhanced the shots with particles created in Alias|Wavefront's Maya. "We developed a technique whereby we took the essence of the live-action movement and used that to drive our 3D particle system," Knox says of the proprietary process. "This gave us total camera control so we could look at each cascade from any angle."
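The proprietary process isn't described in detail, but the stated idea, using the live-action plate's movement to drive a 3D particle system, can be sketched as seeding particles from a per-pixel motion field and then simulating them freely. Everything here is an illustrative stand-in: the flow field would in practice be extracted from the Brazil footage, and the simple Euler integrator is only a placeholder for Maya's dynamics:

```python
import random

def emit_from_flow(flow, count, scale=1.0, seed=1):
    """Spawn particles at random pixels of the plate, inheriting the local
    motion vector of the footage as their initial velocity."""
    rng = random.Random(seed)
    h, w = len(flow), len(flow[0])
    out = []
    for _ in range(count):
        row, col = rng.randrange(h), rng.randrange(w)
        vx, vy = flow[row][col]
        out.append({"pos": [float(col), float(row)], "vel": [vx * scale, vy * scale]})
    return out

def advance(particles, dt=1.0, gravity=(0.0, 0.8)):
    """Basic Euler step: move each particle, then pull it downward."""
    for p in particles:
        p["pos"][0] += p["vel"][0] * dt
        p["pos"][1] += p["vel"][1] * dt
        p["vel"][0] += gravity[0] * dt
        p["vel"][1] += gravity[1] * dt
```

Because the particles only borrow the plate's motion at birth and then live in 3D, the cascade can be re-rendered from any camera angle, which is the "total camera control" Knox describes.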

The artists also used Maya to build the large rock-like Canyon City environment. For texturing, a team shot high-resolution still images of the Grand Canyon from a helicopter, and mapped them onto the complex geometry, resulting in a heavy, highly detailed image. "We were literally rendering something akin to the Grand Canyon, and we needed the extra geometry so the rendered plates would reveal all the minute details that we could have never achieved using just painted textures," notes Knox. To achieve the detail, the canyon scenes were rendered with Maya using volumetric lighting and depth shaders. The 3D canyon model was then integrated with a small portion of practical elements and 2D matte paintings, resulting in a mixed landscape of practical and CG buildings and geology.

To show off these detailed environments, the group decided early on not to lock off the cameras, so they were always moving in these environment shots. "Consequently, we had an entire department of people who just did camera tracking throughout our entire production cycle," McGee says. "But the end result was worth the effort. You wouldn't know where the real buildings stop and the CG architecture begins." The tracking itself was accomplished using the information gathered from the detailed surveys of the live-action sets, in conjunction with 2d3's boujou and RealViz's MatchMover digital tracking software.
In addition to the digital dinosaurs and sets, the artists also built a variety of CG props, such as carts, harvesting machines, and a plethora of armor and clothing for the beasts.

After using these sophisticated modeling, texturing, and tracking techniques for Dinotopia's digital elements, a team composited them into the live-action scenes using Discreet's inferno and smoke, Adobe's After Effects, and Nothing Real's Shake. In one scene that challenged the compositors, a digital dinosaur runs amok, knocking over physical props, a shot that required perfect timing and integration. In another scene, which takes place in Waterfall City, compositors compiled hundreds of layers to complete the final look. Knox notes that all the digital assets (17TB) were stored on large RAID disk arrays until the very end of the cycle, so they were easily accessible through a shared networked environment, helping to speed the compositing process. By integrating the 2D and 3D cameras into the production pipeline, the team was also able to mix and match the cameras while compiling the final scenes, enabling the artists to achieve the optimal viewpoint for a particular shot.

Epic Proportions
All told, FrameStore-CFC generated visual effects for 273 of the 336 scenes in the production-the scope of which Knox likens to that required for two full-length feature films. For FrameStore-CFC, this far surpassed its groundbreaking work in Walking with Dinosaurs. "In Dinotopia, you have real architecture and real actors next to digital elements and creatures, so your sense of reality has to be exact in terms of character movement, weight, scale, shadowing, and lighting," he says.

Having just completed the miniseries, FrameStore-CFC is already working on the first episodes for a 22-part television series for ABC, which will continue the Dinotopia story. The team is using many of the same models and animation cycles produced for the miniseries, which will make the production process faster and easier. "Instead of shelving the material from the mega-series, we're now reusing it for the television show," adds McGee. By doing this, FrameStore-CFC is, in effect, saving Dinotopia's inhabitants from extinction-at least for one broadcast season.

Karen Moltenbrey is a senior associate editor at Computer Graphics World.

Dinotopia Tool Vendors
2d3 (infoNOW 80)
Adobe Systems (infoNOW 81)
Alias|Wavefront (infoNOW 82)
Discreet (infoNOW 83)
NewTek (infoNOW 84)
Nothing Real (infoNOW 85)
RealViz (infoNOW 86)
Right Hemisphere (infoNOW 87)
Softimage (infoNOW 88)