Computer Graphics World, Volume 24, Issue 5 (May 2001)


By Audrey Doyle

Nobody likes to work under the constraints of a tight deadline and a small budget. But in television, such limitations are common. "So, when directors and producers reach the end of a shoot and they've accomplished as much as they can with the time and money they have, they rely on effects companies to bridge the gap between what they had hoped to get on film and what they actually got," says Tim McHugh, visual-effects supervisor at Area 51 (Burbank, CA). "On Frank Herbert's Dune, that was a pretty long bridge."

Frank Herbert's Dune, which aired on the Sci Fi Channel, is a six-hour mini-series based on one of the most popular futuristic stories ever published. Besides Area 51, three other facilities created visual effects for the show: AI Effects (North Hollywood, CA); and Flat Earth and Netter Digital, both of which have since closed their doors. Originally, 250 effects shots were scheduled for the miniseries, but by the time principal photography finished, that number had grown to more than 500.

"As the footage started coming in, our jobs grew bigger," says Frank Isaacs, CEO and effects supervisor at AI Effects. It also quickly became apparent that the facilities would not only be handling the shots they'd been hired to work on, but that the work would become more involved in order to compensate for time and money shortfalls in production.

For the show's "Hunter-Seeker" sequence, AI Effects was asked to model and animate a CG robotic dart (the Hunter-Seeker) and composite it into live-action footage. The Hunter-Seeker, which AI Effects' Michael Hoover created in NewTek's LightWave on NT-based PCs, travels from an intricately designed wall, stops in front of the veiled face of a maiden, moves across the room to confront the main character, Paul, and extends a poisonous needle toward his eye.

According to Isaacs, this sequence was more involved than originally anticipated because AI Effects received very little camera information from which to work. As he explains, a sequence in which a moving 3D CG element must interact with characters or objects in moving 2D footage ideally should be shot with a camera outfitted with a motion-controlled rig that records the camera's movements. The visual-effects artist would import the camera-movement information into the tracking software, feed the tracking data into the 3D animation package, and use the software's virtual camera to animate the CG element based on the tracking data. Because the virtual camera would now be moving perfectly in sync with the real camera, it would look as though the CG element was part of the original shot.
Because of time and money issues, CG effects in the mini-series Dune were used to remedy problems resulting from the live-action shoot. Originally, this battle scene was to contain only a few effects. But when the film footage was never shot, the entire sequence had to be created digitally.

For this show, however, budgetary limitations prevented the production crew from using a computer-controlled camera rig. As a result, the AI Effects team had only limited camera information (essentially camera height, camera type, and lens type) on which to base its virtual camera. Further complicating matters was the fact that the background plates were soft in focus, making it difficult to track features in the scene manually.

To help overcome these challenges, AI Effects turned to RealViz's MatchMover. After scanning the footage into their PC and choosing some points in the footage to track, the artists input the known camera data, and MatchMover automatically created the tracking data. The group then imported the tracking data into LightWave. "Using MatchMover, we established a basic track for the Hunter-Seeker to follow to make it look like it was moving through the air and interacting with the show's characters," says Isaacs. When the Hunter-Seeker was barely moving, AI Effects composited it into the footage using Eyeon Software's Digital Fusion.
To insert the digital dart into this scene, animators had to perform their own camera tracking using MatchMover. For the slower action, they created the scene using LightWave and Digital Fusion. (Images courtesy Area 51.)

Additional areas in Dune in which AI Effects had to compensate for production shortfalls can be seen in the version of the "Underground Bunker" sequence that wasn't included in the broadcast but is in the DVD release of the series. In this sequence, viewers see a room with a large window that's supposed to be overlooking a desert. Due to budgetary constraints, principal photography for Dune was shot on an interior soundstage in Prague, the Czech Republic, rather than on location in a desert. Therefore, desert imagery had to be composited into the window.

According to Isaacs and McHugh, most of the shots in Dune that were supposed to take place in the desert were produced using a TransLight, which is essentially a giant, front-lit billboard that's hung at the back edge of a soundstage. "To create the imagery in the TransLight, the production crew scanned photos of deserts into Adobe Photoshop and manipulated them. The resulting image was then used for the TransLight," McHugh explains. For the DVD version of the "Underground Bunker" sequence, AI Effects had to composite the TransLight into the window in the live-action footage, ensuring that as the real camera moved from side to side, the perspective in the window changed appropriately.
Artists used MatchMover to provide tracking data for compositing static desert scenery into the window and for achieving the proper visual perspective during camera pans in this underground bunker scene. (Images courtesy AI Effects.)

But once again, information about the real camera's movement was unavailable. Furthermore, when the crew shot the window opening with greenscreen, no one placed tracking targets on the material. "So, we not only had to track the scene, we also had to put the proper perspective of the TransLight in that window so that if the camera moved to the right side of the room, for example, viewers would see more of the left side of the image out the window, and vice versa," Isaacs explains.
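The geometry Isaacs describes is ordinary parallax, and a toy 2D model makes it concrete. In the sketch below (all dimensions invented; this is not AI Effects' actual setup), the window is an opening in a wall, the TransLight hangs some distance behind it, and the camera sits in front. Extending the rays from the camera through the window edges onto the backdrop plane gives the slice of the backdrop that should be visible, and moving the camera right shifts that slice left, exactly the behavior the compositors had to reproduce.

```python
def visible_backdrop(cam_x, window_w=4.0, cam_dist=6.0, backdrop_dist=3.0):
    """Return the (left, right) horizontal extent of a backdrop plane
    visible through a window, for a camera at horizontal offset cam_x.

    The window spans [-window_w/2, +window_w/2] at z = 0; the camera sits
    cam_dist in front of it and the backdrop hangs backdrop_dist behind it.
    Toy 2D model with made-up dimensions, for illustration only.
    """
    def through(edge_x):
        # Extend the ray camera -> window edge onto the backdrop plane
        # using similar triangles.
        return edge_x + (edge_x - cam_x) * backdrop_dist / cam_dist
    return through(-window_w / 2), through(window_w / 2)

# Camera centered vs. moved one unit to the right: the visible slice of
# the backdrop shifts left, so the viewer sees more of the image's left
# side out the window, as Isaacs describes.
centered = visible_backdrop(0.0)
moved_right = visible_backdrop(1.0)
```

With tracking targets on the greenscreen, solving for cam_x per frame would be routine; without them, the team had to recover that motion from the soft-focus plates themselves.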

As with the "Hunter-Seeker" scene, AI Effects used MatchMover for some tracking in the DVD version of the sequence. For shots in which the camera's movement was minimal, the group used traditional compositing techniques in Pinnacle Systems' Commotion.

Like AI Effects, Area 51 also had to make up for deficiencies in production for its work on Dune. Ironically, much of that work involved maintaining the look of the TransLight. For example, in several scenes the show's characters travel in a 'thopter, essentially a six-person flying vehicle that Area 51 modeled and animated in LightWave. Whenever the 'thopter was stationary, the production group used a full-scale model of the vehicle; whenever it was supposed to be flying through the air, they used a CG model.
Artists created a 3D version of a fictional helicopter (below) that was substituted during flying scenes for a full-scale physical model. To the left are scenes with the actual model (top) and with a 3D harvester and chopper (bottom). (Images courtesy Area 51.)

According to McHugh, the 'thopter appears in about 40 shots. A few of those were hand-tracked, with the CG 'thopter composited into live-action footage using LightWave. In most of them, however, it's flying through a 3D desert environment created in LightWave. "This meant that every time we created a 3D animated background of the desert, it had to match the desert in the TransLight perfectly," says McHugh. "And we couldn't just use the 2D Photoshop file of the desert because our 3D 'thopter was supposed to be moving through a 3D environment, flying into canyons, and being attacked by giant 3D worms.

"Sometimes our version of the desert looked more realistic than the TransLight, but we had to maintain what was established in Prague," McHugh adds. In some instances, Area 51 created the desert environment entirely in LightWave, while in others the crew used LightWave to composite elements from the TransLight-such as hills in the distance-into the LightWave environment.

In addition to the desert images, Area 51 also created a battle sequence complete with CG characters and 3D backgrounds. "The way this was storyboarded and planned, we were supposed to get battle footage back from Prague, and all we'd have to do was composite a few 'thopters flying overhead and stick in a few laser blasts," says McHugh. "But for time and budgetary reasons this battle footage was never shot. So, we had to create all of it."

The Area 51 team began by building an entire city in LightWave. For the characters, the animators used motion-capture data that was acquired by Foundation Imaging (Valencia, CA) using an Oxford Metrics Vicon system for other sequences in the show. "We took our mocap guys and created crowds of people running through the streets and fighting," notes McHugh. "Then we added explosions and 'thopters flying overhead."

According to Isaacs and McHugh, despite the time and money constraints under which Frank Herbert's Dune was produced, the show did phenomenally well. "This was the highest-rated show the Sci Fi Channel ever broadcast," says Isaacs. "Not only that, but it looks like it might be considered for an Emmy."
Not all the effects shots in Dune were postproduction fixes. For this scene, animators created a digital worm rising out of the sand to snatch a 3D 'thopter, then receding into the desert. (Images courtesy Area 51.)

Because the show was so successful, the Sci Fi Channel is already considering a sequel based on Herbert's novel Children of Dune. In fact, John Harrison, who wrote and directed this mini-series, has already been hired to write the screenplay for the sequel, which, if plans move forward, is tentatively scheduled to air in the fall of 2002.

And if AI Effects and Area 51 are asked to work on the sequel, what recommendations would they make to ensure fewer surprises in post? "My first recommendation would be to burn those TransLights," McHugh says with a laugh. "But seriously, having gone through it once, I'd ask that we all sit down before principal photography and discuss how best to shoot the footage with visual effects in mind."

Audrey Doyle, a contributing editor to Computer Graphics World, is a freelance writer and editor based in Boston. She can be reached at