Dead Ringers
Issue: Volume: 28 Issue: 2 (Feb 2005)

Adolf Hitler, Franklin Roosevelt, and Winston Churchill shaped world history more than 60 years ago. Yet even today, these stalwart figures are immediately recognizable, thanks largely to their unique, bold features: Hitler, with his pinched face, abbreviated mustache, and comb-over hairstyle; Roosevelt, with his deep-set, squinty eyes and extended jawline; and Churchill, with his bushy eyebrows, bulbous nose, and fleshy jowls. However, when the Discovery Channel set out to make a new type of film, a simulated documentary, featuring these men, their famous faces posed a problem.

The concept of a simulated documentary is similar to that of a regular documentary: to reconstruct an event as convincingly as possible using documents, photographs, archival film, and, if necessary, reenactments by professional actors to produce a historically accurate picture of what happened at a particular time. Except with a simulated documentary, the intention is to blur the line between the past and the present so that the reenactment is indistinguishable from authentic archival footage.

“When you are watching an actor play a part, you have to suspend your disbelief because you know you are looking at an actor,” says David Abraham, executive vice president of Discovery Channel UK. “As a result of using state-of-the-art [digital] techniques, however, you can become emotionally involved because the newly created film now looks like original footage. This means what you’re seeing is much more shocking because it feels as if you are really there watching the events unfold.”

Three years ago, the Discovery Channel began looking at how computer graphics could be added to historical film archives. “CGI had been used successfully to create fantasy worlds and even to bring dinosaurs back to life,” reasons Abraham, “so why not take it a step further and re-create the faces of famous historical figures, and witness what they said and did on a certain day?”

For its first so-called Virtual History project, Discovery Channel chose to re-create the events of one day from World War II in the style of an authentic 1940s archive, mainly because a great deal of authentic footage and photos existed from the period that could aid in the research.

The documentary, called Virtual History: The Secret Plot to Kill Hitler, includes present-day reenactments, which are augmented by the authentic archival footage whenever possible. However, the biggest obstacle preventing viewers from buying into the concept that the staged action was actually newly discovered vintage footage was the actors. Merely resembling their famous, or infamous, counterparts was not enough; the actors had to look exactly like the world leaders they were portraying.

To accomplish this feat, current history was made on a number of technology fronts. Using computer-modeling techniques coupled with 3D motion scanning for replicating real-life facial movement, digital artists re-created the faces of Hitler, Roosevelt, and Churchill, and then replaced the facial structures of the actors playing those roles with the virtual geometry.

Face replacements in live-action productions have been accomplished before by cyberscanning actors to create fantasy characters. But this time the CG faces were those of actual, recognizable people who are no longer alive. Moreover, this type of intricate cloning work previously had been reserved for big-budget films, such as The Polar Express, rather than lower-budget cable shows. Nevertheless, by meeting these challenges, the filmmakers achieved their biggest milestone of all: the creation of a novel form of documentary, in which history is no longer all about past events.
To create a simulated documentary, in which modern reenactments look like authentic archival film footage, actors were augmented with virtual masks to look exactly like their famous counterparts. Below, the technique was used to transform an actor into Hitler.

The premise of Virtual History was to show simulated archival footage of the World War II leaders on July 20, 1944. While the actual film clips are contrived, the events that occurred on that day are real and historically significant. It was during this 24-hour period that some German army officers failed in their attempt to kill Hitler with a briefcase bomb. At the same time, Churchill and his generals met to discuss the possibility of using chemical warfare on German citizens. And across the Atlantic, Roosevelt was nominated to run for another term as US president and then, ironically, came closer to death than Hitler after suffering a stroke aboard the presidential train.

In fact, some authentic footage leading up to and immediately following these happenings does exist, and was used in Virtual History. The main incidents, however, were not captured on film. So for storytelling purposes, actors who resembled the historical figures played out the missing scenes. Later, postproduction facility The Moving Picture Company (MPC) digitally replaced the actors’ faces in close-ups with virtual masks created from the digital cloning process.

“We had to portray the leaders as realistically as possible so the viewers would consider this as newly discovered archives,” explains Jim Radford, visual effects supervisor at MPC. In addition, MPC processed the new film to resemble dated newsreel footage: grainy and scratched, with poor color quality and hyper movement of the people in the film. As a result of these techniques, it appears as if all the events from July 20, 1944, as shown in the program, were caught on film at the time they actually occurred.

The realism of Virtual History starts with the creation of lifelike digital masks of Hitler, Roosevelt, and Churchill. To craft these masks, artists first sculpted clay models of the historical figures. These clay models were then cyberscanned, and the resulting data used to generate 3D models within Alias’s Maya. “We were portraying these people at a particular time, so we had to take our reference information from moving and still photos during this particular time period,” says Radford. “We were not simulating a youthful Hitler, but rather how he appeared near the end of his campaign, when he was heavier and his face was more haggard.”

Texturing the faces proved even more difficult due to the unique properties of human skin, such as its translucency. To overcome this hurdle, the group referenced skin textures from people today who were within the men’s age groups, not only to see how light played on their skin surface, but also to obtain the proper amount of wrinkles, flexibility, and flaws. Next, the team used Adobe’s Photoshop to blend that mixture of surface data with photographic textures of the actual leaders that they garnered from photos, and then applied those surfaces to the 3D models using Right Hemisphere’s Deep Paint. Later, the artists rendered the models using Pixar’s RenderMan software.
To craft a lifelike virtual replica of Roosevelt (from left to right), technicians first captured the facial geometry of the actor, from which artists made a 3D wireframe, shaded, and textured model. The fifth image illustrates how the artists fit the CG mask to the actor’s face.

After the 3D models were completed, they were, in essence, dead heads, devoid of any movement, including the mannerisms unique to their real-life counterparts. Breathing life into them was left to 3D data-acquisition company Eyetronics, to whom MPC provided a mesh of the static CG models. Eyetronics then used its proprietary technology to capture and record detailed facial performances of three actor doubles, one for each of the famous leaders.

As each actor sat on a chair, a fine grid was projected onto the person’s face, so the facial shape changes could be followed with the highest possible degree of accuracy. This technique produced approximately 70,000 points on each face for every frame captured.

Technicians at Eyetronics then took this facial-performance capture technique one step further than most other applications by applying approximately 75 dots on each actor’s face to record how the skin shifted while the person spoke. In comparison, 151 markers were used in the highly publicized technique for tracking Tom Hanks’s facial motion so it later could be applied to all-CG characters in The Polar Express. The shape changes, however, were not tracked for that film, as they were for Virtual History. “By combining the two sets of information, the skin motion and the changing of the facial shape, we obtained the greatest amount of information possible pertaining to a moving face,” says Eyetronics CEO Dirk Callaerts.

In all, it took the group about half of a day to capture the data for each actor’s face, which was recorded at 25 frames per second (fps) to coincide with MPC’s postproduction requirements. During the same capture sessions, a cyberscan was taken of each actor’s face with a neutral expression. This mesh served as the basic reference for tracking the performance over the various frames. Once the Eyetronics team completed the capture process, it used the company’s proprietary Shape Snatcher software to turn the information into 3D animated data.

First, the group obtained the 3D shape of the face for each frame. “At this stage, every frame has a different topology,” explains Callaerts, “so there are no correspondences between the 3D shapes in each of the frames.” Though the technicians tried to automate as much of the work as possible to reduce the delivery time, they nevertheless had to clean up part of the data by hand to achieve the necessary level of accuracy. In parallel, the movement of the markers was tracked in Maya, allowing Eyetronics to extract the skin motion.
Artists created several textures for each face, among them Churchill’s, that were mapped onto the model. These included the base color (left), shininess (top right), and bump map (bottom right).

Next, with Eyetronics’ proprietary Liquid Faces morphing software, the team blended a head template with the neutral reference model of the actor’s head. Then, using both the full-3D information and the skin motion data, the group morphed the template into each frame of the performance.
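The correspondence problem this solves can be illustrated with a toy sketch: fit a fixed-topology template to each frame’s scan so that every frame ends up sharing the template’s vertex ordering. The snippet below is a bare-bones 2D nearest-point snap, not Eyetronics’ proprietary Liquid Faces algorithm; all names and coordinates are invented for illustration.

```python
# Toy illustration of template fitting (2D points stand in for mesh vertices;
# the real pipeline is proprietary and far more sophisticated).
# Each captured frame is a point cloud with its own topology; snapping every
# template vertex to its nearest scan point yields per-frame meshes that all
# share the template's vertex ordering, establishing correspondences.

def fit_template(template, scan):
    """Move each template vertex to the nearest point in the scan."""
    def nearest(p):
        return min(scan, key=lambda q: (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)
    return [nearest(v) for v in template]

template = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]           # fixed topology
frame_scans = [                                            # differing clouds
    [(0.1, 0.0), (0.9, 0.1), (0.5, 1.1), (0.4, 0.5)],      # frame 1: 4 points
    [(0.0, 0.1), (1.1, 0.0), (0.6, 0.9)],                  # frame 2: 3 points
]
fitted = [fit_template(template, scan) for scan in frame_scans]
# Every fitted frame now has exactly len(template) corresponding vertices.
```

In production, the marker data constrains such a fit so the template tracks actual skin motion rather than merely the nearest surface point.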

Later, the technicians applied the same facial animation to the digital faces by merging the acquired motion information with the 3D model meshes provided by MPC. This step is considered the actual performance cloning, and was executed in Liquid Faces and in Side Effects Software’s Houdini. While a translation setup eased the conversion between the actor and the digital double, some tweaking had to be done to correct for extreme situations such as a wide-open mouth. In fact, the most challenging part of the process, notes Callaerts, was adjusting the data around the eyes and the mouth. “When the mouth opens, there is a cavity, and the light grid cannot be projected there,” he explains, “thereby forcing us to do manual adjustments [to the data] using Houdini.”
The director took a “less-is-more” approach to filming: The less the actors did, the more realistic the fake archive looked, because most 1940s movies (such as this authentic one of Hitler) showed the men engaged in mundane activities.

Back at MPC, the team tracked the original footage of each actor’s head motion, and placed the cloned characters into the live-action scenes. While most “official” footage of the leaders was filmed using a tripod, their more private moments, like those contrived for the program, contained little action and were usually captured with a handheld camera. The MPC group replicated this style, emulating the simple actions and camera moves indicative of the time. However, because the current shots were not locked, they required a great deal of camera tracking, which was accomplished in 2d3’s Boujou software. Within these shots, the artists also object-tracked the head of each actor, so the individual CG faces could be accurately applied in Maya.

To accommodate the groundbreaking CG work, each scene had to be filmed three times: without the actors for a clean plate, with the actors, and with the actors wearing a tracking device that outlined the jaw, eyes, and nose, serving as a guide for attaching the CG masks afterward. According to director David McNab, the process was not merely time-consuming, it also tested the entire film crew, as it was nearly impossible to determine the quality of the acting “when you knew the person’s face and facial movements would later be replaced with a digital version of the performance.”

The virtual faces are used in 20 “hero” shots throughout the 90-minute program: 10 for Hitler, and five each for Roosevelt and Churchill. In the other shots, the actors appear in heavy makeup. The actors had been chosen for the roles less for overall resemblance than for the technical shape of their faces and bodies (distance between the eyes, length of the nose, and so forth), so that the digital masks would fit well. As an added layer of realism, technicians altered the actors’ voices so they, too, sounded authentic.

In these scenes, the actors are far enough from the camera, or obscured enough by the lighting (which was basic, as it would have been in the 1940s), to pass for the real men. In fact, at times the film looks underexposed and dark, and at other times it appears overexposed, a recurring artifact of filmmaking during this time period.
After three years of planning and months of practical application, the Discovery Channel achieved virtual history. By re-creating previously undocumented past events with unprecedented realism, filmmakers were able to bring history to life for a new audience.

Despite these “forgiving” conditions, seamlessly tracking the actors and compositing the new faces into the scenes was, without question, the most time-consuming aspect of the project for MPC. “We kept refining the work to make sure that each actor’s face was oriented the correct way, because it had to be a seamless composite,” says Radford. When it came to Churchill, the work intensified because of the abundant flesh around his neck. So instead of blending the CG halfway down his neck, the team tucked it underneath the chin area so the actor’s entire neck area would remain practical.

For tracking purposes, the group extended the character masks beyond each person’s hairline, but the geometry was not rendered past the hairline, which also remained practical. In Hitler’s case, the hair is combed over the forehead, so the artists simply pulled the actor’s real hair back on top of the CG.

While nearly all the animation was performance-captured, the artists had to hand-animate the eyes, which could not be fitted with markers for tracking during the motion-capture shoot. To give the artists more control over the timing, the mocap was turned off around the eye regions, which the artists then modeled, shaded, and textured by hand. By decoupling the dialogue from the facial expressions around the eye region, the artists also were able to mix and match motions, allowing for full control over eye blinks and other subtle movements.
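Turning the capture off in one region while keeping it live elsewhere amounts to weighting how much each part of the face follows the mocap versus the artist’s keyframes. Below is a minimal sketch using invented per-vertex weights and 1D positions; the production blend in Maya and Houdini was far more involved.

```python
# Sketch of region-based blending between mocap and hand animation.
# weight 1.0 = follow the capture, weight 0.0 = follow the hand animation;
# in-between values feather the seam so the two sources mix smoothly.

def blend(mocap_pose, hand_pose, weights):
    """Per-vertex linear blend between captured and hand-animated positions."""
    return [w * m + (1.0 - w) * h
            for m, h, w in zip(mocap_pose, hand_pose, weights)]

weights    = [1.0, 1.0, 0.5, 0.0, 0.0]   # last vertices = eye region
mocap_pose = [2.0, 2.0, 2.0, 2.0, 2.0]   # captured positions (1D for brevity)
hand_pose  = [0.0, 0.0, 0.0, 4.0, 4.0]   # artist's eye-blink keyframes
final = blend(mocap_pose, hand_pose, weights)
# final = [2.0, 2.0, 1.0, 4.0, 4.0]: mocap elsewhere, keyframes at the eyes
```

Because the eye weights zero out the capture, blinks can be retimed freely without disturbing the captured dialogue performance.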

After the team completed the digital faces, it composited them into the various scenes using Apple’s Shake. To help blend the CG into the shots, the artists used image-based lighting, but not full high-dynamic range imagery, since the footage would be heavily processed to resemble archival film. In fact, film from this era ranged in quality and color saturation, with even more apparent differences among the English, German, and American film. For consistency, the MPC team decided on a standard period style from each of those countries, enabling viewers to immediately associate a certain look with Churchill, Roosevelt, or Hitler.

Processing the film also made it easier to blend the contrived footage with selections of actual footage cut into the show. “For instance, the events surrounding the briefcase bomb explosion were reconstructed, but authentic footage of the aftermath was used to carry the story forward from that point,” explains Radford.

In addition to matching the color and grain, the group also processed the present film to resemble 1944 newsreel footage, in which people appear to be moving faster than they had been, a result of the film being shot at 16 fps and then projected at 25 fps. For Virtual History, the team shot the scenes at 25 fps, and integrated the CG into the film plate. Next, the group retimed each shot using a Shake plug-in, making the footage appear as if it had been captured at 16 fps, like vintage film, so the action (and integrated CG) seemed to move faster than normal. “We considered that [version] our exposed negative,” says Radford, “to which we added all the other effects, like dust and dirt.” Finally, the group retimed the footage back to 25 fps, so when it was played in real time, it looked as if the film had been shot at 16 fps.
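The frame-rate trick can be sketched as simple index resampling, treating footage as a list of frames. The function name and nearest-frame method below are illustrative assumptions, not the actual Shake plug-in.

```python
# Sketch of the vintage-speed trick: footage shot at 25 fps is resampled to
# a 16 fps "exposed negative"; projecting those frames at 25 fps compresses
# one real second of action into 16/25 of a playback second, so motion looks
# ~1.56x faster, like newsreel film shot at 16 fps.

def retime(frames, src_fps, dst_fps):
    """Resample a frame list from src_fps to dst_fps by nearest-index pick."""
    duration = len(frames) / src_fps             # seconds of footage
    n_out = round(duration * dst_fps)            # frame count at the new rate
    return [frames[min(int(i * src_fps / dst_fps), len(frames) - 1)]
            for i in range(n_out)]

# One second of action captured at 25 fps (frames labeled 0..24).
shot_25 = list(range(25))

# The "exposed negative": the same second reduced to 16 frames.
negative_16 = retime(shot_25, 25, 16)

# Played back at 25 fps, the action appears sped up by 25/16 = 1.5625x.
speedup = 25 / 16
```

The final production step, retiming back to 25 fps, simply duplicates frames so the sped-up look survives real-time playback.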
Left: Eyetronics experienced more difficulty processing the facial data for the Churchill character than for the others because of mismatched features, such as the mouth size.

In addition, the group conducted research to determine what the film would have looked like after 60 years of being copied and stored; for instance, what kind of artifacts would be present, such as dust and mold, and the amount of flicker and weave that would have been introduced. “We learned that you can’t just apply every single effect,” Radford says. “You have to be sparing when applying the artifacts, because not every single type of artifact would be present in every single frame.” Moreover, the group had to consider the film source. The American footage, maintains Radford, is the cleanest and the most colorful of the three grades, so the artists added fewer artifacts to it than they did to the two other grades.
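The “sparing” application Radford describes can be modeled as a per-frame probability for each artifact type, with lower odds for the cleaner American grade. The probabilities and names below are invented for illustration, not MPC’s actual values.

```python
import random

# Sketch of sparing artifact application: each artifact type appears in only
# a fraction of frames, and the cleaner American grade gets fewer hits than
# the English or German grades. (Illustrative odds, not production values.)
ARTIFACT_ODDS = {
    "american": {"dust": 0.10, "scratch": 0.03, "flicker": 0.05},
    "english":  {"dust": 0.25, "scratch": 0.08, "flicker": 0.15},
    "german":   {"dust": 0.30, "scratch": 0.10, "flicker": 0.20},
}

def damage_frame(grade, rng):
    """Return the list of artifacts applied to one frame of a given grade."""
    return [name for name, p in ARTIFACT_ODDS[grade].items() if rng.random() < p]

rng = random.Random(1944)                       # fixed seed for repeatability
reel = [damage_frame("american", rng) for _ in range(1000)]
clean_frames = sum(1 for frame in reel if not frame)
# Most frames of the American grade remain untouched, as intended.
```

Keeping most frames clean is what sells the effect: a reel where every frame is damaged reads as synthetic, not aged.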

While Eyetronics had used its facial performance capture before in movies such as The League of Extraordinary Gentlemen (see “Invisible Effects,” September 2003, pg. 10), this is the first time it has cloned an actual nonliving person, and done so outside the realm of a feature film. In taking this bold step for television, the groups were able to pioneer a new genre of film that, according to Discovery Channel’s Abraham, has the potential to revolutionize the way viewers watch historical documentaries in the future.

As CGI evolves, contends Abraham, it will be possible to accurately re-create scenes not just from the 1940s, but from the time film was first used to record history right up to the present. “We’re going to see more computer animation in documentaries, and as the technology improves, it will be possible to combine it with more modern film [that is sharper and less forgiving],” he says. “Footage that was lost or never shot will be re-created to give us real insight into how and why momentous events occurred.”

Karen Moltenbrey is a senior technical editor at Computer Graphics World.