Volume 23, Issue 4 (April 2000)

USER FOCUS: Britannic's Last Voyage

By Audrey Doyle

Ask Pat Corbitt to explain the innovation behind the effects his studio produced for the recent TV movie Britannic, and he answers, without hesitation: "We created the same effects that were in the film Titanic but on a budget much lower than Titanic's catering bill."

A Regent Entertainment production, Britannic tells the story of the RMS Titanic's sister ship, HMHS Britannic, which sank on Nov. 21, 1916, as she passed the Greek island of Kea. Originally intended to be a luxury liner, Britannic was instead pressed into service as a World War I hospital ship, and she completed five voyages before an unexplained explosion sent her to the bottom of the Aegean Sea, killing 30 people. Officially, Britannic's sinking was attributed to the ship hitting a mine, but many of the 1000 or so hospital staff and crew who survived alleged that she was hit by a German torpedo.
The Britannic, sister ship to the Titanic, sank in Greece under mysterious circumstances in 1916.

In January, the mysterious fate of the Britannic resurfaced in a two-hour thriller on Fox Family Channel. For the film, Corbitt's small creative house, Corbitt Design in Old Bridge, New Jersey, created more than 200 photorealistically rendered 3D scenes. According to Corbitt, what makes this feat remarkable is that he and two colleagues created the scenes (about 15 minutes of CG animation), which included the Britannic, water, people, and other elements, in just nine months and on a budget of only $150,000.

Luck as well as skill played major roles for the studio in landing the job of creating the effects in Britannic. According to Corbitt, after seeing the film Titanic, the studio's principal modeler and animator, Dan DiPierro, decided to test his skills by building in NewTek's (San Antonio, TX) LightWave a 3D computer-generated replica of the digital Titanic used in the film. "But it took him only two weeks to create it," says Corbitt. During a chance meeting, Corbitt showed DiPierro's model to a Regent Entertainment producer who, coincidentally, was thinking of developing a TV movie based on the Titanic's sister ship. "The producer liked what he saw, and we got the job," he recalls.

That job, Corbitt states, started out as 85 shots, most of which involved the Britannic. After scouring the Internet for information about the ship, DiPierro used LightWave, running on a Boxx Technologies (Austin, TX) dual-Pentium 400MHz RenderBOXX workstation, to transform his Titanic model into the Britannic by giving it more prominent lifeboat davits (cranes that hold the boats onto the ship) and by painting the ship appropriately. "Those were the two main visible differences between the Titanic and the Britannic," he points out. According to DiPierro, whenever the boat is shown sailing on the water, it's computer-generated. "Only small parts of the boat were built as set pieces," he says.

Corbitt Design's initial role quickly grew to creating more than 200 shots after the movie's director, Brian Trenchard-Smith, saw what the studio was capable of doing. Though the number of shots nearly tripled, the studio's tight deadline remained about the same. "We knew we would need machines that were capable of extremely fast rendering times if we were going to get these shots done on time," says Corbitt. "So we bought four dual-Pentium 500MHz RenderBOXX workstations just for rendering. We turned them on, and they didn't stop for nine months, 24 hours a day, seven days a week."
To create this lifeboat scene, the animators at Corbitt Design composited a live-action boat and actors (left) into a scene containing a digital lifeboat's digital reflection in real water.

In addition to beefing up their arsenal of rendering machines, the Corbitt Design artists devised several clever workarounds - one of them for creating water - to ensure that they wouldn't miss their deadline. Computer-generated water appears throughout Britannic: when the ship is shown sailing on the ocean; when the ship is flooding and water is pouring in through bulkheads and portholes; and in some underwater shots. But getting the water to look photorealistic required a great deal of trial and error.

"A lot of people use physics programs to create water today. In fact, we bought one-a plug-in for LightWave-but we had some problems with it," DiPierro says. "We wound up with buzzing polygons-polygons that would be up in one frame and then down a frame later. The result was a nightmare, and we didn't have a lot of time to experiment to get rid of the problem."

To resolve the dilemma, DiPierro created a mesh of triangulated polygons in LightWave and used the software's fractal noise setting to create a basic wave structure and turbulence. "I found that when I shifted fractal noise from left to right, I got a realistic-looking wave action," he says. As a finishing touch, he added LightWave-generated sea foam in layers where necessary "to get a realistic frothy look."
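LightWave's fractal-noise internals aren't public, but the idea DiPierro describes - summing octaves of noise into a heightfield and shifting the sample offset sideways each frame - can be sketched in a few lines of Python. The noise basis, octave counts, and scale factors below are illustrative stand-ins, not the values the studio used:

```python
import math

def smooth_noise(x, y):
    """Cheap, deterministic stand-in for a smooth noise basis
    (LightWave's actual fractal-noise basis is not public)."""
    return math.sin(x * 1.7 + math.cos(y * 2.3)) * math.cos(y * 1.3 - math.sin(x * 0.7))

def fractal_height(x, y, octaves=4, lacunarity=2.0, gain=0.5):
    """Sum several octaves of noise: each octave doubles the
    frequency and halves the amplitude, giving a fractal look."""
    amp, freq, h = 1.0, 1.0, 0.0
    for _ in range(octaves):
        h += amp * smooth_noise(x * freq, y * freq)
        amp *= gain
        freq *= lacunarity
    return h

def wave_mesh(size, frame, speed=0.3):
    """Heightfield for one frame: shifting the noise-sample offset
    left to right per frame is what makes the waves travel."""
    offset = frame * speed
    return [[fractal_height(x * 0.3 + offset, y * 0.3)
             for x in range(size)] for y in range(size)]

frame0 = wave_mesh(16, frame=0)
frame1 = wave_mesh(16, frame=1)
```

With the shift per frame matched to the sample spacing, each frame's heightfield is the previous one slid over by one vertex - the "left to right" motion DiPierro mentions. Foam layers could then be keyed to the highest crests (a simple height threshold), though the article doesn't say exactly how the foam was placed.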

For the porthole and most of the bulkhead shots, which were entirely computer-generated, the trick was getting the CG water to shoot through these openings in a realistic manner. The solution, DiPierro says, involved using a combination of particles created in the LightWave plug-in Particle Storm 2 (Dynamic Realities; Waukesha, WI) and spheres modeled in LightWave, all of which he deformed using LightWave's Taper and Bend built-in deformation operators.
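The article names LightWave's Taper and Bend operators but doesn't show how they work. As a rough illustration of how a sphere can be deformed into a water jet, here are generic, simplified versions of the two deformers - these are common textbook formulations, not LightWave's implementation:

```python
import math

def taper(points, amount=0.5):
    """Taper deformer: linearly scale the x/z cross-section along y,
    so y=0 keeps full width and y=1 is scaled by (1 - amount).
    Assumes y is normalized to [0, 1]."""
    out = []
    for x, y, z in points:
        s = 1.0 - amount * y
        out.append((x * s, y, z * s))
    return out

def bend(points, angle=math.pi / 4):
    """Simplified bend deformer: rotate each point about the z axis
    by an amount proportional to its height y, curving the shape."""
    out = []
    for x, y, z in points:
        a = angle * y
        out.append((x * math.cos(a) - y * math.sin(a),
                    x * math.sin(a) + y * math.cos(a), z))
    return out
```

Tapering a sphere narrows it into a jet shape, and bending arcs it away from the opening - stretched spheres plus a particle spray is a plausible reading of the combination described above.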

Although most of the bulkhead shots were computer-generated, in one shot the set was live action, to which DiPierro added computer-generated water. "I created the water in the same way I had for the other bulkhead shots, starting with a LightWave water mesh and enhancing it with particles, spheres, and deformation. Then I modeled a white version of the set, and in the computer I projected an image of the real set through a spotlight onto the white model," he explains. "I placed my CG water where I thought it should be, and it worked." Compositor Brian Dean then added live footage of real foam to the water and composited all the elements using Adobe Systems' (San Jose, CA) After Effects.
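Projecting an image of the real set through a spotlight onto a white model is a form of projective texturing: each point on the model is mapped back into the projector's image plane to find which pixel of the set photo lands on it. A minimal sketch of that mapping, assuming a projector aimed down -Z with a hypothetical field of view (the article gives no camera parameters):

```python
import math

def project_to_uv(point, proj_pos, fov_deg=60.0):
    """Map a world-space point into a projector ("spotlight") at
    proj_pos aimed down -Z; returns (u, v) texture coordinates in
    [0, 1] when the point lies inside the projection frustum, else
    None. An illustrative stand-in for spotlight image projection."""
    px, py, pz = (point[i] - proj_pos[i] for i in range(3))
    if pz >= 0:          # behind the projector
        return None
    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)
    # perspective divide, then remap [-1, 1] -> [0, 1]
    u = (f * px / -pz) * 0.5 + 0.5
    v = (f * py / -pz) * 0.5 + 0.5
    if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
        return (u, v)
    return None
```

A point straight ahead of the projector lands at the image center (0.5, 0.5). Because the white model simply receives the projected photo as its surface color, the CG water's reflections pick up the look of the real set - which is what made the reflections in the composite read as genuine.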
A digital submarine fires a torpedo through water created with a LightWave water mesh.

"I was amazed at how well this technique worked," DiPierro enthuses. "You can actually see reflections of the real set in the CG water."

As for the underwater scenes, many of which included LightWave-modeled submarines, DiPierro once again started with a water mesh created in LightWave. Then he simply flipped it upside down so that what would have been the topside of the water-mesh surface was now on the bottom. "In LightWave I tweaked the mesh to get the wave motion to behave well, and I turned on fog. Then, in After Effects, I played with color and density until the water really popped," he says. "I got a lot of mileage out of that LightWave mesh. Reusing the mesh helped cut down on production time."
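The two steps DiPierro describes - flipping the water mesh and turning on fog - can be sketched simply. The fog model below is the classic exponential falloff; the colors and density are illustrative guesses, not the values dialed in for the film:

```python
import math

def flip_mesh(heights):
    """Turn a water heightfield upside down so the former top
    surface becomes the rippling 'ceiling' seen from underwater."""
    return [[-h for h in row] for row in heights]

def apply_fog(color, distance, fog_color=(0.05, 0.25, 0.35), density=0.08):
    """Classic exponential fog: blend a surface color toward a murky
    blue-green with distance. t is 1 at the camera and falls toward
    0 far away. (Illustrative parameters, not LightWave's.)"""
    t = math.exp(-density * distance)
    return tuple(t * c + (1 - t) * f for c, f in zip(color, fog_color))
```

Distant objects dissolve into the fog color, which is what sells the underwater murk; the final color and density grade was then pushed further in After Effects, as described above.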

Additional shortcuts also lent realism to scenes while saving time. For instance, 85% of the boat shots include digital actors-mainly soldiers and hospital personnel-shown at a distance of less than 100 feet. Whenever a shot required CG people, DiPierro loaded into LightWave's Modeler module eight different human models taken from the Acuris Perfect People disk and resized some of them so they weren't identical in height. Next, using reference photos of the costumes the live actors were wearing, he dressed the digital models in their appropriate garments by building the clothing in 3D using NURBS.

In addition to modeling and animation, the Corbitt Design team devised some clever tricks to light the CG scenes and elements. The catch: "to get realistic renders, you must fill in all the shadows with fill lights." That can take a lot of machine time, which the designers didn't have.

To deal with this, DiPierro invented what he dubs a spin-light technique, which he used to achieve realistic natural lighting. "I found that if I took point lights in LightWave, placed them in a square, and spun the square for one frame, I got a nice motion blur in which all my shadows overlapped," he explains. "I used whitish light for the main lighting and a grayish-blue color for my fill lights. It really softened the whole effect and made the lighting look realistic." For interior versus exterior scenes, DiPierro simply adjusted the color of the lights.
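Spinning a square of point lights through one frame sweeps each light along an arc, and averaging the sub-frame samples overlaps the shadows into a motion-blurred, soft-shadow look - effectively a poor man's area light. A sketch of the sample positions such a setup would generate, with an assumed quarter-turn per frame and square size (the article gives neither):

```python
import math

def spin_light_positions(center, half_size=0.5, samples=8):
    """Positions swept out by four point lights arranged in a square
    around `center` and spun through a quarter turn over one frame.
    Rendering from these positions and averaging overlaps the shadows
    into soft, motion-blurred shadows. An illustrative sketch, not
    LightWave's own motion-blur machinery."""
    cx, cy, cz = center
    corners = [(half_size, half_size), (-half_size, half_size),
               (-half_size, -half_size), (half_size, -half_size)]
    positions = []
    for s in range(samples):
        angle = (math.pi / 2) * s / samples  # quarter turn across the frame
        ca, sa = math.cos(angle), math.sin(angle)
        for x, y in corners:
            positions.append((cx + x * ca - y * sa, cy + x * sa + y * ca, cz))
    return positions

lights = spin_light_positions((0.0, 3.0, 0.0))
```

Every sample stays on the same circle around the center, so the averaged result behaves like a ring-shaped area light - white for the key, grayish-blue for the fills, per DiPierro's recipe.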

In addition to the digital scenes and elements in the movie, Corbitt Design's Brian Dean rotoscoped cloud footage into several sequences using After Effects. He also completed a number of challenging green-screen shots. In one sequence, for instance, the producer wanted a shot of survivors in a lifeboat on the ocean. "The production company put a 35mm waterproof camera on a float in a water tank, placed 15 actors in a lifeboat in the same tank, and stuck a green-screen panel in the background," says Corbitt. That footage was supposed to be composited with a computer-generated ocean and a digital Britannic. "But it was a nightmare. We had all this green being reflected from the panel into the water in the tank," he notes.

To perfect the shot, Dean and DiPierro built a lifeboat in LightWave and cast the reflection of that lifeboat into footage of the real water. Then they used After Effects plug-ins-Ultimatte (Ultimatte Corp.; Chatsworth, CA) and Primatte (Puffin Designs; Sausalito, CA)-to eliminate the green spill from the earlier shots. Finally, they used Puffin Designs' Commotion image-editing and effects software to track the movement of the real lifeboat, ensuring that the 3D background moved in sync with the rest of the picture.

"In terms of difficulty, this project was head-and-shoulders above any other work we've done," he adds. "But we think we've succeeded in showing people that with NT technology and the right folks driving it, you can produce big-time effects on a very reasonable budget."

Freelance writer Audrey Doyle is a Computer Graphics World contributing editor and former editor-in-chief of Digital Magic. She can be reached at

LightWave, NewTek

RenderBOXX workstations, Boxx Technologies