Working with director Martin Scorsese, CafeFX visual effects supervisor Ben Grossmann blurs the line between post and production for Shutter Island
It’s difficult to estimate the number of visual effects shots created for Martin Scorsese’s film Shutter Island, and that’s a good thing.
“Ballpark, I’d say we ended up with maybe 200 of the 700 visual effects shots in the film,” says Ben Grossmann, visual effects supervisor at CafeFX. Grossmann worked with overall VFX supervisor Rob Legato, and with CafeFX’s affiliate, The Syndicate, which gave the Santa Maria, California-based studio a base in Los Angeles closer to the production.
“But, over the course of the project, we created 300 or 400 shots because of the way Marty [Scorsese] and Thelma [Schoonmaker], the editor, work.” “It’s surprising,” says Grossmann. “You could think of [Scorsese] as an old-school filmmaker, but he’s more comfortable with visual effects than most directors. That’s what was most unique about this picture. He treats the visual effects department on par with all the other parts of the film. There’s no delineation between production and postproduction.”
That resulted in a new way of working, and a new business model as well. What it meant in practice for the visual effects crew was that they would rough out shots during production so that Scorsese and Schoonmaker had rough composites to work with in editing. “We did that with the dailies,” Grossmann says. “If there were three takes Marty liked, we would temp all three takes. We’d do them flash to flash--from when the camera kicks in to where Marty yells, ‘Cut.’ And, if Thelma didn’t know what her edit would be, we’d do a rough take immediately after the shoot.”
To do this, the crew worked with Autodesk’s 3ds Max for 3D modeling, The Foundry’s Nuke for compositing, and Andersson Technologies’ SynthEyes for camera tracking/matchmoving.
“I call them rough takes because we knew the final quality would be higher,” Grossmann says, “but they were pretty high quality. Our goal was to make them so good that the effects wouldn’t take Marty [Scorsese] and Thelma [Schoonmaker] out of the story. They’re crafting the story with these shots. We wanted them to be free to edit with shots that functioned.”
Grossmann had first created digital environments for Scorsese to help him with a sequence in his short film Key to Reserva, which was released in 2007. “We wanted to shoot in Carnegie Hall, but we couldn’t work out the schedule,” he says. “We could get a four-hour window, though. So, I set up a camera rig to capture a 360-degree environment at extremely high resolution. Marty did a quick walk-through and said he wanted to shoot from here, here, and there, so I did a lot of coverage for those locations. I shot high-dynamic-range 360-degree domes using around 160 photographs for each position. When we finished, Marty could say, ‘I want to point the camera here,’ and we could rotate the world and give him a quick comp.”
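Conceptually, a 360-degree dome like the ones Grossmann describes is often stored as an equirectangular (lat-long) panorama, and “rotating the world” amounts to spinning the view direction before looking it up in that image. The sketch below is illustrative only--it is not CafeFX’s pipeline--and simply shows, under that assumption, how a camera direction maps to panorama coordinates and how a yaw rotation re-aims the dome.

```python
import math

def direction_to_latlong_uv(x, y, z):
    """Map a view direction to (u, v) in an equirectangular (lat-long)
    panorama, with u and v in [0, 1). Assumes -Z is 'forward'."""
    length = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / length, y / length, z / length
    longitude = math.atan2(x, -z)    # -pi..pi, 0 when looking forward
    latitude = math.asin(y)          # -pi/2..pi/2
    u = longitude / (2 * math.pi) + 0.5
    v = 0.5 - latitude / math.pi
    return u, v

def rotate_yaw(x, y, z, degrees):
    """'Rotate the world': yaw the view direction around the vertical axis,
    which is equivalent to spinning the dome around the camera."""
    a = math.radians(degrees)
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))
```

Looking straight ahead lands in the center of the panorama; yawing the direction 90 degrees shifts the lookup a quarter of the way across the image, which is all a quick “point the camera here” comp needs.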
For Shutter Island, Grossmann, CG supervisors Adam Watkins and Luke McDonald, compositing supervisor Alex Henning, and a team of VFX artists replicated and extended the process. For some shots, the team replaced the digital stills with moving footage; in others, they brought the stills to life using animated CG objects--debris, blowing leaves, seagulls, and so forth.
“There are a couple of scenes in the film where Teddy [Leonardo DiCaprio] and Chuck [Mark Ruffalo] stand and talk for what feels like forever on a cliffside,” Grossmann says. “You’d never think the backgrounds are visual effects, but all the trees, the seagulls, all that life is in one of the domes we created.”
Many of the scenes that benefited from this process centered on Teddy and Chuck on coastal cliffs. The film, based on a book by Dennis Lehane, takes place on a creepy island in Boston Harbor, where US marshal Teddy Daniels investigates the disappearance of a patient, a murderer, from an asylum for the criminally insane.
“The big sequences with Leonardo and Mark on cliffs were filmed with just those two guys on bluescreen,” Grossmann says, “so it was challenging for Thelma [Schoonmaker] and Marty [Scorsese] to edit the scenes without seeing what the environments would look like.” Although the crew would build these environments in 3D for the final shots later, Grossmann’s team used still photos of cliffs and waves to stitch together the temp domes, and fast matchmovers created a virtual camera that replicated the movement of the camera on set. By placing that camera, footage of the actors, and the dome in Nuke, Scorsese could see the entire scene that would be in the film from the camera’s point of view.
“We’d take the temp, apply footage, and see what we’d have on the day,” Grossmann says. “Wherever the camera moved, the actors were in this 360-degree representative environment that replaced the bluescreens.” Scorsese and Schoonmaker could use these composites as they edited the film, and comments from Scorsese and Legato about the digital stills used for the environments helped Grossmann’s team as they moved toward creating final images.
“On most pictures, the process is to wrap principal photography, lock a rough edit, do visual effects,” Grossmann says. “We were doing visual effects the entire time during principal photography, and we kept shooting with a visual effects unit for a year after principal photography wrapped to flesh out the scenes and fill in holes. Rob [Legato] and I would do location scouts, take film footage, and send the pictures to Marty. He’d say, ‘Fine,’ or ‘Oh, no. Not high enough. Not violent enough,’ and we’d look for another location until we could film the actual plates that we’d use in the movie. We’d put [the new footage] into the dome and see how it worked.”
For example, in one sequence, Teddy climbs up and down a cliff that juts up from the ocean. The cliff was on set. Everything else was digital, created in a dome. “We knew what the sky needed to be, so we used digital stills that we touched up in Photoshop,” Grossmann says. “And, we added animated clouds to that matte painting, or used time-lapse photography of moving clouds.”
Obviously, though, they couldn’t use digital stills for the crashing waves. “There was some pressure to do the water digitally,” Grossmann says. “People would say, ‘Oh, you can do that in the computer, can’t you?’ But Rob [Legato] and I like to have real elements if we can. And, in this case, we didn’t need to control the water, we just needed to find something that looked right.”
The visual effects film crew shot some of the water for that scene at Acadia National Park in Maine, using a SpyderCam--a camera on a stabilized rig--hung off the edge of the cliff on a construction crane. “We divided the area into quadrants and then shot tiles using 35mm film,” Grossmann says. “We might shoot 600 frames, 15 seconds, for each tile. Then we blended them together and stitched them into moving footage. So, we had tiled motion-picture photography of waves crashing on rocks, matte paintings and textured geometry for the cliffs, animated matte paintings for the sky, and bits of stuff blowing in the wind. We had 180 degrees of coverage. The camera could point up or down and see continuity in the waves.”
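Blending adjacent tiles into one continuous plate typically means cross-fading, or feathering, through the region where neighboring tiles overlap. The toy function below is a minimal sketch of that idea on a single scanline of pixel values--it is not the production tool, and real stitching would also align, retime, and color-match the tiles.

```python
def feather_blend(tile_a, tile_b, overlap):
    """Join two horizontally adjacent scanlines (lists of floats) that
    share `overlap` samples, cross-fading linearly through the overlap.
    Illustrative only: assumes the tiles are already aligned."""
    assert 0 < overlap <= min(len(tile_a), len(tile_b))
    left = tile_a[:-overlap]           # unshared part of the left tile
    right = tile_b[overlap:]           # unshared part of the right tile
    blended = []
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)    # weight ramps from tile_a to tile_b
        a = tile_a[len(tile_a) - overlap + i]
        b = tile_b[i]
        blended.append((1 - w) * a + w * b)
    return left + blended + right
```

With a one-sample overlap, two flat tiles of values 1.0 and 0.0 meet at a midpoint of 0.5 rather than a hard seam, which is why feathering hides tile boundaries in the stitched footage.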
Similarly, for a scene in which a character apparently catches fire, the crew used practical elements. “We could have done the fire digitally, but we’ve done production at The Syndicate,” Grossmann says, “and my background is production-oriented, so we decided it would be just as easy to do practical fire. It was only two shots. We built a miniature black version of the set at New Deal Studios, lit each piece of fire by hand, and composited the elements into the plate.”
The rats, which Teddy hallucinates but which appear on screen, are real as well. He is at the bottom of a cliff, standing in crashing waves looking for his partner, when rats crawl out of the rocks and surround him. On set in Massachusetts was a small water basin with practical rocks; two dump tanks custom-built for the task by special effects supervisor Bruce Steinheimer would alternately pour water into the basin to create a steady ebb and flow. While DiCaprio stood in what was, essentially, a bathtub, an animal trainer released 25 trained rats during a break in the waves. DiCaprio could nudge the real rats with his foot and push them off the rock.
Later, the visual effects crew surrounded DiCaprio and the rats with crashing water shot in Maine, put rocks and ocean waves shot in California behind, added a sky photographed in Florida, and layered in additional rats shot on bluescreen in California. “The rats hop from a rock shot in Massachusetts to a rock shot in Maine,” Grossmann says.
In the same way that the crew used footage of real-world locations in the dome, they added footage of miniatures created at New Deal Studios to the environments. “When a shot was mostly about the miniature, New Deal created it, and when it needed a heavy amount of CG based on the models, we did that,” Grossmann says.
For some shots with a lighthouse, for example, which Grossmann describes as a miniature around 20 feet tall, the visual effects crew built a CG model from blueprints of the miniature, photographed the miniature, projection-mapped the photographs onto the CG model, and put it into a dome.
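Projection mapping of the kind described for the lighthouse works by projecting each vertex of the CG model through the camera that took the photograph, and using the resulting image position as that vertex’s texture coordinate. The function below is a deliberately simplified, hypothetical setup--a pinhole camera looking down -Z with no rotation or lens distortion--meant only to show the geometry of that lookup.

```python
def project_to_photo_uv(point, cam_pos, focal, width, height):
    """Project a 3D point onto the photographic plate of a pinhole camera
    at cam_pos looking down -Z. Returns pixel (u, v), or None if the
    point is behind the camera. Simplified: no rotation, no distortion."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z >= 0:                      # behind the camera plane
        return None
    u = width / 2 + focal * (x / -z)    # perspective divide by depth
    v = height / 2 - focal * (y / -z)   # image v runs top-down
    return u, v
```

A point straight ahead of the camera lands in the center of the plate; offsetting it sideways shifts its UV in proportion to focal length over depth, which is why photographs re-projected this way line up with the model from the original camera position.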
Rebuilding the Model
Grossmann believes the entire crew enjoyed being a bigger part of the production process than usual, as much as he did.
“Everyone always says that the visual effects production process is no longer just post, but it isn’t on par with production all that often,” Grossmann says. “With this film, we started with pre-production, and we were there all the way through. It was great. If Marty trusts you, you’re part of the team. It’s like the snake bites from the head. When the director sets a bar and demonstrates respect, everyone follows suit. The grip, gaffers, DP, everyone, says, ‘Visual effects needs this, let’s get on it,’ instead of ‘Oh, no, another stupid request.’ ”
Because that respect extended to the entire visual effects team, Grossmann felt comfortable bringing compositors on set to work on the sequence with the fire elements. “Instead of having compositors sitting at their desks, they were standing behind the camera,” Grossmann says. “It totally changed their level of enthusiasm.”
This way of working changed the business side of the picture, as well, for the postproduction house. “You can’t use the same business model as in a traditional pipeline where you get shots locked by the edit and you know what they’re supposed to be,” Grossmann says. “In that case, you can give a flat bid and start working. But in this case, you don’t know how many shots. You don’t know how long. You can’t bid by frame count. You can’t lock anything down because you’re part of the filmmaking process. You’re creating shots on an as-needed basis, so from the outset you have to change the way you budget and schedule the film. Our VFX producer Ron Ames really pushed this model and made it possible for all of us to work within it.”
As a result, the visual effects crew became a production service, not a vendor. “You have to keep a budget like any other department,” Grossmann says, “not a bid. If you find yourself about to go over budget, you adjust the budget just like any other department. You don’t bid and try to track a shot granularly. We were part of the production process, and that continued to the end of the film. With Ron [Ames] nurturing everything along and balancing the workload between teams, we got to focus on putting beautiful things on the screen, rather than numbers in a spreadsheet. It was tons of fun and I hope to do it again sometime.”
Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.