In 2007, actor Nicolas Cage enthralled audiences with his fiery portrayal of the comic-book antihero Johnny Blaze in the film Ghost Rider—with a great deal of assistance, of course, from digital effects. Five years later, the actor reprised his role in the sequel Ghost Rider: Spirit of Vengeance, and once again a VFX team had to up the ante in creating the character's flaming form with controllable CG fire and smoke. This time, however, they faced an even bigger challenge: the film is in stereo 3D.
Smoke and fire are essential elements in the sequel, just as they were in the original movie. The premise of the story is that by day, stunt motorcyclist Johnny Blaze assumes his human form, but by night, he becomes the blazing skeletal biker Ghost Rider. In the sequel, the action—and the smoke and fire—are dirtier and grittier than before, as the antihero cuts a path of destruction through Eastern Europe in an attempt to rescue a young boy, and the world, from a burning fate.
“The directors brought a different aesthetic to this film. They have a much more down-and-dirty type of style,” says The Creative-Cartel founder Jenny Fulle, who served as VFX producer on the movie. “So while the effects are similar in nature to those of the first film, they were executed in a way that is darker and more gritty.”
This time around, the VFX team at Iloura in Australia heated up the screen as the main visual effects vendor for the movie. Its work ranged from digital doubles and machinery/vehicles to decaying and withering effects, as well as the scene-stealing CG fire. According to Glenn Melenhorst, Iloura's VFX supervisor for Ghost Rider 2, of the 450 shots the studio completed, approximately 30 percent were for fire-related effects on objects or on Ghost Rider himself (see "Fire Effects," pg. 40).
Iloura created the visual effects for the movie, including the fire, which was done digitally.
They puppeteered the fire to achieve the desired effects for the smoke and flames.
According to Fulle, the fire had to act like a character; it had to show emotion. As if creating realistic, emotive fire throughout the film weren’t daunting enough, there was also the matter of the stereo 3D, since the movie would be filmed in 2D and then converted to stereo 3D for release. “You cannot get the three-dimensionality of fire when you are converting it during a post process,” Fulle explains. “3D fire is transparent and moves organically, so there is no way to pull it from the plate and place it elsewhere without it looking like a flat plane, especially when it is close to the camera. It is so integrated with its environment that it ends up looking like a pop-up storybook. It’s hard to deal with anything that has transparency when you are doing a conversion. So we knew we had to come up with something clever to pull this off.”
This challenge prompted Fulle and her Creative-Cartel team, including CTO Craig Mumma, the stereographer on the film, to meet with Iloura and Gener8 3D, the visual effects facility and conversion house, respectively, to come up with a better way to tackle the problem at hand.
“Necessity is the mother of invention,” says Fulle. “We were budgetarily challenged. We were also technically challenged. How were we going to convert all this fire? Early on we made sure that all the vendors were willing to sit together at the table and come up with a hybrid pipeline to eliminate redundant work and enable us to render some VFX shots natively.”
A New Workflow
For Ghost Rider: Spirit of Vengeance, VFX house Iloura had the challenge of creating all the 3D fire effects throughout the movie.
Due to budget constraints, these effects had to be generated digitally rather than practically. To accomplish this, the artists did all the modeling and animation in Autodesk's Maya, and then moved the files into Autodesk's 3ds Max, where they employed Sitni Sati's FumeFX, a Max plug-in, to craft the fire. Compositing, meanwhile, was done using The Foundry's Nuke, and Cebas's ThinkingParticles within Max was used for some of the simulations. Although Iloura has a matchmove department in-house, the decision was made to send that work outside to Yannix.
“The [filmmakers] covered the action in a wild manner, throwing the camera around with few, if any, tracking markers,” says Glenn Melenhorst, VFX supervisor at Iloura. “So we had rolling-shutter and tracking issues. Yannix managed to track stuff that we thought was pretty much untrackable.”
While a handful of vendors worked on the movie, Iloura did the lion’s share of the effects, including the fire and smoke. Perhaps most challenging was art-directing the fire on Ghost Rider’s skull to accomplish a range of emotions. “We were puppeteering the fire to get what we wanted out of it, blending the smoke and flames in such a way that it still felt natural but also performed as required,” says Melenhorst. “Yet, we didn’t want it to seem like you were looking at all-digital stuff all the time. It had to appear natural and photoreal.”
Directing the fire, Melenhorst notes, also required a great deal of trial and error, and scripting. The group also used Thinkbox Software’s Krakatoa, which enabled them to more easily control the billions of little particles—to make the fire wrap around a finger, for example. Compared to the fire in the first Ghost Rider movie, these flames were blacker and sootier, with embers and heat haze.
Because fire simulations can take a long time to run, Iloura code writer Jordan Walsh devised scripts that enabled the team to generate multiple variations of the same simulation, allowing the group, for example, to dial the heat level and speed up or down and output a QuickTime of those iterations—variations from cold/slow to hot/fast—to review. "We found that tool to be useful because the overhead of the simulations is so great," Melenhorst says.
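The batch-variation idea Melenhorst describes can be sketched in a few lines of Python. The parameter names and ranges below are illustrative assumptions, not Iloura's actual tools:

```python
# Illustrative sketch of batching fire-sim variations for review
# (hypothetical parameter names; not Iloura's actual scripts).
from itertools import product

def build_variations(base_settings, heat_levels, speed_levels):
    """Return one named settings dict per (heat, speed) combination."""
    variations = []
    for heat, speed in product(heat_levels, speed_levels):
        settings = dict(base_settings, heat=heat, speed=speed)
        settings["name"] = f"sim_h{heat}_s{speed}"
        variations.append(settings)
    return variations

base = {"resolution": 256, "frames": 120}
variations = build_variations(base, heat_levels=[0.5, 1.0, 2.0],
                              speed_levels=[0.75, 1.5])

# In production, each entry would be submitted to the render farm and
# written out as a QuickTime strip for side-by-side review.
for v in variations:
    print(v["name"])
```

Sweeping the parameter grid up front trades farm time for artist time: supervisors review a strip of cold/slow-to-hot/fast takes rather than waiting on one simulation at a time.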
Each time Ghost Rider touches an object, it becomes a hellish version of what that object had been—including his signature motorcycle and an earthmover used in the climax. In addition, there is lots of death and destruction, all of which required Iloura’s magical touch. –Karen Moltenbrey
From the outset, Ghost Rider 2 was planned as a 3D film. Initially, Mumma's intention was to shoot everything practically in 3D using stereo cameras. But shooting in 3D is like bringing NASA on set—there are lots of moving parts. And it soon became apparent that the filming style of the directors, Mark Neveldine and Brian Taylor, with their fast cuts and wildly moving, untethered cameras, would not lend itself to that method. As a result, the stereo would be done through post conversion, a method that has not always yielded stellar results.
Aware of that fact, Mumma said it was important that the parties come up with a methodology for the conversion that made sense and provided the quality everyone was looking for. “If we were going to go down this path, then we needed to develop a workflow that we could take to a conversion house that was savvy enough to understand visual effects, and adapt a whole pipeline to share everything—and get the results out of a good conversion rather than a bad one,” Mumma says.
The underlying question, notes Mumma, a former VFX supervisor, seemed simple enough: Why couldn't they treat 3D conversion the way they had treated visual effects for years, whereby the various build images are mapped back onto a 3D model? That concept became the starting point for the new 3D conversion workflow devised mainly by Iloura and Gener8.
“We needed to have a visual effects understanding with our conversion vendor and a conversion understanding with our visual effects vendor,” Fulle says.
The key was to work in a non-linear way within a shared pipeline, particularly for the hybrid, or pre-conversion shots—those containing visual effects. For the 3D conversion on these shots, the plates and assets moved back and forth between the effects house and the stereo vendor, as opposed to the usual process of effects finishing a shot before passing it to the conversion vendor (sometimes supplying pieces or elements of a shot, albeit at an added cost). On Ghost Rider 2, the shots were converted ahead of visual effects, and then the effects group would render the CG elements, like the fire, in a true stereo environment, resulting in a much higher-quality product with a more efficient scheduling process, as shots constantly moved through the pipeline.
Due to the fire and other factors, a new pipeline was devised to deal with the stereo 3D conversion on the film.
The workflow went like this: Iloura would first receive the plates and perform the 3D tracking, roto, and cleanup, and prep the plate for the CG elements. Then they would send the plate, along with the roto and other elements, including the CG camera, to Gener8, which would extract the 3D data from Iloura’s assets and produce a left- and right-eye camera. Next, Gener8 would send those plates, along with the CG and stereo cameras, back to the VFX studio to complete the visual effects work, render the elements, and composite them into the 3D scenes.
“Everything fit perfectly into place. Our VFX house didn’t have to guess on the 3D work, ever,” says Mumma.
The facilities also had to contend with two other types of shots that followed more traditional workflows: straight-conversion shots, for live-action, non-VFX footage, which went directly to Gener8, bypassing Iloura; and post-conversion shots, for which Iloura would finish its work before passing the plates and assets (roto and track) to Gener8 for stereo completion.
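The three shot categories amount to a simple routing rule. The sketch below is a hypothetical illustration of that decision, not production code from the show:

```python
# Hypothetical sketch of the three-way shot routing described above.
def route_shot(has_vfx, convert_before_vfx):
    """Return the ordered list of stops a shot makes in the pipeline."""
    if not has_vfx:
        # Straight conversion: live-action shots go directly to Gener8.
        return ["Gener8 (conversion)"]
    if convert_before_vfx:
        # Hybrid/pre-conversion: plates prepped and tracked at Iloura,
        # converted at Gener8, then returned for stereo CG and comp.
        return ["Iloura (track/roto/prep)",
                "Gener8 (stereo cameras)",
                "Iloura (render/composite)"]
    # Post-conversion: effects finished first, then converted.
    return ["Iloura (VFX)", "Gener8 (conversion)"]

print(route_shot(has_vfx=True, convert_before_vfx=True))
```

The hybrid branch is the novel one: because the stereo cameras come back to the effects house, CG elements such as the fire are rendered natively in stereo rather than being extracted and converted after the fact.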
And while each studio used specific software to get its job done, there were shared tools in the pipeline, including Autodesk’s Maya and The Foundry’s Nuke. A key component of the pipeline, notes Mumma, was Gener8’s proprietary conversion tool set, which he calls “the kingpin of the operation.”
As Gener8 producer Paul Becker explains, what sets his company apart from other conversion facilities is that Gener8 re-creates scenes in true 3D space. "We create a camera track, which is exactly what effects does to create their workspace in which to place objects like fire and smoke," he says. "So we are more like visual effects' little brother in the sense that we live in a true 3D space. The prevailing (and cheaper) method of conversion neither understands actual 3D photography nor plays by its rules."
Gener8 typically uses what Becker calls a “model and project” method, which provides exceptional results but is thought to be costly and slow. However, the company has integrated programming into the manual steps, making it more efficient and eliminating human error. For the most part, Gener8 also employs off-the-shelf software whenever possible: 3D animation programs, such as Maya, Imagineer’s Mocha Pro or SilhouetteFX’s Silhouette for roto, and Nuke for finishing and painting. Proprietary tools, such as Gener8’s StereoComposer, facilitate communication among the packages. “We come from a game-development background, and we are used to building pipelines to suit each project’s needs. We don’t force our clients to do it our way. So when Jenny [Fulle] said, ‘Let’s try something new,’ we thought, ‘Awesome.’ We like to innovate and change, and we liked the idea of economizing without adversely affecting the quality of the 3D.”
Indeed, the 3D fire was the driving force behind the hybrid process, enabling the group to get depth and volume from the CG pyro. But there were also time and money considerations. If he had to estimate the timesavings reaped from the pipeline, Mumma would put it at months. "Shared pipelines are usually so scary, but The Creative-Cartel team managed the process so well. I was blown away," he says. "Usually you come crashing and burning into the final date, and this was a soft landing."
For Iloura, though, it was not so much about timesavings as it was having its CG imagery fit comfortably within the stereo scenes. “However, it was more back-and-forth work than we usually do,” Melenhorst adds.
At Gener8, Becker estimates that the unique workflow saved his company nearly two months’ production time. “Jenny [Fulle] found a way for us to solve the more challenging elements, which were fire and smoke, by implementing this new pipeline,” he says. “We never had to worry about extracting Ghost Rider’s fire from the shot; it was under the control of visual effects.” That workflow was quite different from Gener8’s conversion process on a previous project that contained fiery CG characters.
“The cost of breaking out fiery elements was high for our client—and perhaps unnecessary. What Jenny has done is eliminate redundancy by providing the effects house with stereo assets prepared by Gener8—maximizing on the scheduling time,” Becker notes. “We didn’t have to wait for the shot to be done before converting it; the CG was added after the conversion. This gave the effects artists more time to make things sexy and not worry about setting stereo.”
Fulle, who is a proponent of shooting traditionally and then converting to 3D afterward, believes the new workflow gives filmmakers more freedom on set and lets them make choices later than if they had filmed in 3D from the start, which would have forced them to commit to decisions during the shoot. "It allowed us to get more boom for our buck," she says.
Studios have been doing stereo conversion for a few years now, and in that time conversion has often been regarded as a postproduction process—which it is not, contends Fulle.
Typically, conversion is done after visual effects, and the plates are not shared back and forth. And that process is best suited for filmmakers who want to see the whole film in context or who make a lot of changes in editing. But for those who can cut and shoot in their mind and know exactly what needs to be changed when they see the cut for the first time, the benefits they can reap from this type of hybrid pipeline can pay off big.
“We didn’t have the luxury of waiting until the whole cut was locked. We had to turn over sequences as they were ready, so we could push the sequences through and keep the pipe full,” says Fulle.
The converts to such a new workflow extend beyond the immediate facilities working on this movie. For instance, Mumma points to a tent-pole film that shot in 3D. After the pickups, the production team was going to reshoot some sequences in stereo—until they saw this hybrid pipeline. It prompted them instead to completely retool their own pipeline to match the hybrid one and better handle the changes. “I showed them the work and the pipeline, and they completely changed their methodology,” he adds.
Becker also knows of a number of studios interested in this pipeline. “It’s a new paradigm,” he says, adding that any film with CG in it would benefit from the workflow. “Filmmaking often is a conflict between art and industry. In tough economic times, the desire to save money can often result in inferior products. But what we did here was create a better product while saving time and money.”
While he agrees that this new method is not for everyone and that there are still shots that call for 3D cameras on set, Mumma does not believe 3D cameras are the solution for every film, either—especially given how much conversion is still done on projects shot in 3D, owing to camera alignment and calibration issues, among other factors.
“This is a maturing industry and everyone is learning a lot,” says Mumma. “This is a way of finding a happy medium for hybrid shots.”