Volume 29, Issue 6 (June 2006)

On Virtual Location

Episodic television opens a new frontier for effects artists setting a scene
By Martin McEachern
 
A revolution is under way, and it is being televised. Only, no one has noticed—even as it unfolds before millions of eyes each week on such hit shows as Crossing Jordan, CSI, and Las Vegas. And, it’s happening, quite literally, in the background of every show.

 
Look closely at the detectives on Fox's prime-time drama Bones as they pass through Arlington National Cemetery or at the military officers on NBC’s E-Ring as they stride through the parking lot of the Capitol Building, for example. In fact, wherever you find your favorite characters each week—be it near Boston Harbor, the Las Vegas strip, Central Park, the Lincoln Memorial, even in Afghanistan—chances are they were filmed on a back lot, in a parking lot, or in a studio in Los Angeles. The worlds they inhabit are increasingly virtual.
 



Images © Bruckheimer Television, Warner Bros.
With a limited travel budget, E-Ring relied on Zoic to dress a sparse greenscreen
set (top) using 3D elements, resulting in this final shot (bottom).
 
Overwhelmed by shrinking budgets, tightening production schedules, and rising viewer expectations, TV producers are in desperate need of a new production paradigm—one that can transport actors each week through virtual sets and set extensions created within a matter of days, not weeks. But with the larger effects companies tailored to long-term, high-budget work, these producers are pinning their hopes on smaller, “boutique” houses to fulfill the promise of a brave new virtual world—companies with production pipelines that emphasize speed and simplicity, which can unshackle the imaginations of writers and production designers while still reducing costs. With that goal, they've turned to facilities such as Zoic Studios, Look Effects, and Stargate Digital.
 
As these companies rise to the challenge, a new and potentially lucrative frontier for the industry hangs in the balance. At stake is a future where digital set asset libraries become a valuable commodity, and digital set design and production design become intertwined. But changing the channel on the future will not be easy. Artists must learn to work much faster than their counterparts in film, who often labor for months, sometimes years, on a single shot.
 
They also must learn to use simple, all-purpose software packages. NewTek's LightWave, for example, shines on the episodic stage; rarefied and highly specialized tool sets do not. And because there's no time to fix mistakes on an episodic schedule, the artists are invested with a level of trust and creative responsibility found nowhere else in the industry. Moreover, they must learn to collaborate with directors, production designers, actors, and cinematographers, assuming a far more active and integrated role.
 
Capitol-izing on Zoic
 
When producers for NBC’s E-Ring (about the top-secret missions of two Pentagon military officers) asked Zoic Studios to transport the actors from a Los Angeles parking lot to the parking lot of the US Capitol, the crew sprang into action on multiple fronts, sending one artist to Washington, DC, to gather reference photos for texture mapping and another to the LA set to gather lighting and camera data, while the rest of the team began the modeling process using LightWave and Autodesk Media and Entertainment’s Maya.
 
For Zoic, which also does virtual sets for CSI: Miami and Battlestar Galactica, E-Ring was a golden opportunity to showcase its talents because the production was severely limited in its travel budget, making virtual locations a necessity. “Usually, set dressing would furnish the live-action sets with foreground elements or anything else interacting with the actors,” says visual effects supervisor Andrew Orloff. “But for the Capitol shot, there was no dressing at all, not even cars in the lot. We had to create everything in 3D because the director planned to shoot the action with moving, panning, tilting, and craning cameras—not lock-offs. We couldn’t just put a matte painting back there and track it in [Adobe] After Effects or [Autodesk Discreet] Combustion; it wouldn’t look real. We need the parallax to sell the shots.”
 
With only a week to complete the composite, the team arrived on set with a 360-degree camera to gather high dynamic range imagery for reflection mapping. As it rotates on a special tripod mount, the camera takes a picture every 20 degrees along the x and y axes, allowing artists to stitch the exposures together to form a spherical image of the location. The entire process takes about 15 minutes, and when done, the resulting image is mapped to a sphere in Maya. Zoic uses proprietary tools and HDR Shop’s free LightGen plug-in for Maya to “read” the sphere and place directional lights across the virtual sky dome that approximate the on-set lighting. Artists then use these lights to create reflection maps and shadows for the scene.
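
To make the technique concrete, here is a minimal Python sketch of the idea behind that light placement: carve the stitched panorama into angular cells and emit one directional light per cell. The file name, cell counts, and grid sampling are hypothetical simplifications; tools like LightGen use more sophisticated clustering.

```python
# Minimal sketch of LightGen-style light extraction from an equirectangular
# HDR panorama. File name and cell counts are hypothetical; reading .hdr
# files may require imageio's freeimage plugin.
import numpy as np
import imageio.v3 as iio

def extract_directional_lights(hdr_path, lat_cells=9, lon_cells=18):
    img = iio.imread(hdr_path).astype(np.float64)  # H x W x 3, linear HDR
    h, w, _ = img.shape
    lights = []
    for i in range(lat_cells):           # 9 x 18 cells ~ every 20 degrees
        for j in range(lon_cells):
            # Average the radiance over this cell of the panorama.
            rows = slice(i * h // lat_cells, (i + 1) * h // lat_cells)
            cols = slice(j * w // lon_cells, (j + 1) * w // lon_cells)
            color = img[rows, cols].mean(axis=(0, 1))
            # Cell center in spherical coordinates -> direction vector (y up).
            theta = np.pi * (i + 0.5) / lat_cells       # 0 at zenith .. pi
            phi = 2.0 * np.pi * (j + 0.5) / lon_cells   # around the horizon
            direction = (np.sin(theta) * np.cos(phi),
                         np.cos(theta),
                         np.sin(theta) * np.sin(phi))
            lights.append({"direction": direction, "rgb": tuple(color)})
    return lights
```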
 
The studio constructed the Capitol as a light polygonal mesh in Maya. Using the reference photos, the artists refined the textures in Adobe’s Photoshop and projection-mapped them to the building in LightWave. For UV mapping, or stretching a texture map across a piece of geometry, the team also used Maya. To enhance the lighting and detail of the concrete surfaces, the artists used Pixologic’s ZBrush to paint normal maps, which encode surface orientation along all three axes. “With ZBrush, we're able to create normal maps very quickly, getting a massive amount of detail without upping the geometry, which can cripple workflow and protract rendering times,” says Orloff.
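
For readers unfamiliar with normal maps, the sketch below shows the underlying encoding: each pixel packs a surface direction into RGB, which is how detail is added without extra geometry. It derives normals from a grayscale height image; the file names and bump strength are hypothetical, and ZBrush's own baking is far more involved.

```python
# Sketch: encode a height map as a tangent-space normal map.
# Assumes a single-channel (grayscale) height image; names are hypothetical.
import numpy as np
import imageio.v3 as iio

height = iio.imread("concrete_height.png").astype(np.float64) / 255.0
dy, dx = np.gradient(height)            # slope along each image axis
strength = 4.0                          # hypothetical bump strength
n = np.dstack((-dx * strength, -dy * strength, np.ones_like(height)))
n /= np.linalg.norm(n, axis=2, keepdims=True)    # unit-length normals
rgb = ((n * 0.5 + 0.5) * 255).astype(np.uint8)   # pack [-1,1] into [0,255]
iio.imwrite("concrete_normal.png", rgb)
```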
 
Prior to arriving on set, Zoic creates a previz in Maya to block out shots, building virtual dollies, cranes, and other rigs for the CG camera and constraining it to the speeds and motions of the real camera. Taking the previz to the set, the team can then make suggestions as to what is needed, such as a 50 mm lens.
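
The constraint Orloff describes can be as simple as capping how far the CG camera may travel per frame. Below is a hedged sketch of that idea; the speed limit is a hypothetical stand-in for a real dolly's capability, not Zoic's actual rig data.

```python
# Sketch: constrain a previz camera to what a physical rig can do by
# clamping per-frame translation to a maximum step (hypothetical limit).
def clamp_move(prev_pos, target_pos, max_step):
    """Move toward target_pos, but no farther than max_step per frame."""
    delta = [t - p for p, t in zip(prev_pos, target_pos)]
    dist = sum(d * d for d in delta) ** 0.5
    if dist <= max_step:
        return tuple(target_pos)
    scale = max_step / dist
    return tuple(p + d * scale for p, d in zip(prev_pos, delta))

# A dolly limited to 0.04 units/frame can't jump a full unit in one frame:
print(clamp_move((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 0.04))  # (0.04, 0.0, 0.0)
```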
 
Once the director and director of photography execute the shots, the group works closely with the assistant camera crew to gather precise information about the camera lens. This data is then fed into 2d3’s Boujou. The goal on set, according to Orloff, is to be minimally invasive, never imposing restrictions on the camerawork that could undermine the signature visual style of a show. “When you see the shot, you want it to feel like they shot the Capitol, not like they made a hundred concessions to composite the Capitol into the scene,” says Orloff. “Our effects have to conform to the visual style of the show.”
 
Using Boujou and artistic finesse, the team allows both the cameraman and director to work unfettered, tracking the camera movements and applying them to the 3D camera in Maya. This same camera is used by the compositors in After Effects and Combustion to add clouds, trees, telephone poles, flocks of birds, and other set dressings in 2D. Allowing the 3D and 2D artists to work in parallel with a synced-up camera is crucial to finishing shots on an episodic schedule.
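
Keeping the 2D and 3D artists “synced up” usually means the solved camera lives in one place that every package reads. As a hedged illustration only (the file format and values here are hypothetical, not Zoic's pipeline), a per-frame camera export might be as simple as:

```python
# Hypothetical per-frame camera export: one row per frame with translation,
# rotation (degrees), and focal length, written once so the 3D and 2D
# departments all work to the same solved move.
import csv

solved_camera = [
    # frame, tx, ty, tz, rx, ry, rz, focal_mm  (illustrative values)
    (1, 0.00, 1.60, 10.00, 0.0, 0.00, 0.0, 23.75),
    (2, 0.05, 1.60,  9.95, 0.0, 0.25, 0.0, 23.75),
    (3, 0.10, 1.61,  9.90, 0.0, 0.50, 0.0, 23.75),
]

with open("capitol_cam.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "tx", "ty", "tz", "rx", "ry", "rz", "focal_mm"])
    writer.writerows(solved_camera)
```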
 
Once the modeling, camera tracking, and texturing are finished, Zoic uses a proprietary system to export the Maya scenes to LightWave for rendering. The in-house software translates everything—from rigged characters and hair dynamics to camera and particle data—into LightWave. “LightWave has this tool called background radiosity, which does image-based radiosity lighting,” says Orloff. “It’s fast, and looks great.”
 
Crime-Scene Work
 
The Capitol shots were completed in just under a week, but for the crew at Zoic, there was no time to rest on their laurels. In the episodic world, where jobs are short and fast, juggling multiple projects is key. So at the same time, the team rotated through other projects, including regular work for CSI, E-Ring, and Battlestar Galactica.
 
According to Orloff, Mental Images’ Mental Ray, Maya’s Fluid Effects, and Next Limit’s RealFlow are crucial tools for the group’s work on CSI—in all its incarnations—especially when the camera whooshes through a body, down hair shafts, pushing through capillaries, connective tissues, and bodily fluids. CSI relies heavily on Zoic’s previz work not only for staging virtual set shots, but also for motion-control shots. “We handle the motion-control shooting for CSI, so we'll swap data from the laptop and actually program the motion-control rig using the data from the 3D file,” explains Orloff.
 
In addition, the producers of CSI: Las Vegas depend so heavily on virtual set extensions that, at the beginning of every season, they send Zoic to that city to shoot 360-degree panoramic plates. This way, the studio remains up-to-date with the ever-changing sprawl of hotels. “They give us our own camera crew and location manager. We go to the top of buildings with a miniature version of a 360-degree rig and shoot panoramic moving plates on 35 mm, so we have the entire strip from the east, west, north, and south stitched together on multiple moving film plates,” says Orloff. “You can see these plates in the backgrounds of panning shots, especially when the buildings are in the far distance.”
 



Images © CBS Productions, Bruckheimer Television, Alliance Atlantis.
The first image is from raw footage shot for the show CSI: Las Vegas. The
team at Zoic digitally removed some trees to the left and added the top of
a Las Vegas hotel, as seen in the second image.
 
Over the years, Zoic has witnessed the tides of change in TV land. “Somehow, in the last few seasons, VFX shots have gotten really long,” muses Orloff. “For a typical TV show, we used to do sixty 90-frame shots; if we got a 200-frame shot, we’d look sideways and say, ‘Wow!’ Now, we’re doing 1,500 900-frame shots as a matter of course. Every single pilot this season had multiple shots that were over 1000 frames.” In the past, VFX shots were governed by the rule of “get in and get out,” he notes. But those days are long gone; effects shots are now interwoven into pivotal plot points that demand close attention and expanded screen time.
 
On Sacred Ground
 
In one such pivotal plot point for an episode of Fox’s Bones, the detectives discover the charred remains of a body lying against a headstone in Arlington National Cemetery. The production could not get permission from the government to shoot on the hallowed grounds. So they turned instead to Look Effects, whose credits include Criminal Minds, CSI: NY, Malcolm in the Middle, and The OC. “Because they didn’t just want to put something identifiable in the image, like the Chrysler Building, to ‘tag’ it, they needed all of Arlington Cemetery so they could have total freedom in blocking the shots,” says visual effects supervisor Max Ivims.
 
With just under two weeks to complete the shot (which Ivims concedes is much longer than usual), the crew arrived at the arboretum in Arcadia, California, where set dressers had improvised a small military cemetery using 100 Styrofoam headstones and a fake fountain. The cemetery had to be digitally expanded to 5000 headstones for various day and night shots. In the 900-frame opening shot, which was shot entirely against greenscreen, the camera moves with the main characters as the entire cemetery fills the background. To keep rendering times low, artists constructed a simple polygonal model of Arlington in Maya. Because there was nothing in the shot to enable matchmoving, the artists used a combination of eye-matching the 3D and making 2D adjustments in Apple’s Shake and After Effects. For panning and craning shots, the crew used Boujou and The Pixel Farm’s PFTrack to synchronize their virtual cameras.
 
For texture mapping and re-creating the on-set lighting in its virtual cemetery, the team did extensive photography of the area, including the fake headstones, trees, grass, and other plant life in the arboretum. However, the group did not go to Arlington; Ivims says it was more important for the textures to be consistent with the faux cemetery than with the real one. Unfortunately, the team could not use the texture maps derived from the Styrofoam headstones. “The set-dressing headstones looked fine when we were on set and in practical photography, but when we took photos of them and tried to map them to the virtual ones, it just registers in your subconscious that something is wrong. Real headstones have a little stain on the bottom from rain splashing up from the ground. The fake ones don’t have that.”
 
To solve the problem, Look Effects took photographs of gravestones at a local cemetery. The secret to virtual environments, argues Ivims, lies in the little details that take the “curse” off them. Those details include color depth, texture-map complexity, and, in the case of the gravestones, undoing the perfect geometric order in which the digital ones had been arrayed across the virtual field. “When we had them all lined up, they looked mechanical, so we had to angle them just slightly, so they’re not in perfect rows and not perfectly vertical,” explains Ivims. “For the grass, we didn’t use repeated texture maps, and we added patches of greener grass interspersed with swaths of burnt and dying grass. The more little details you add, the less your brain recognizes it as a synthetic environment.” The artists prepared all the texture maps in Photoshop. And while the crew normally uses Maya’s Paint Effects to generate foreground trees, most of the trees here were confined to the far background, so it was easier to project the tree images onto cards.
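
The de-mechanizing step Ivims describes translates almost directly into a layout script. The following Python sketch jitters a regular grid of headstone positions and tilts; the 5000 count comes from the article, while the spacings and offset ranges are hypothetical.

```python
# Sketch: lay headstones out on a grid, then nudge positions and tilts
# with small random offsets so the rows no longer read as mechanical.
import random

random.seed(7)  # repeatable layout for iteration with the supervisor
headstones = []
for row in range(50):
    for col in range(100):                            # 50 x 100 = 5000
        x = col * 1.2 + random.uniform(-0.15, 0.15)   # meters, jittered
        z = row * 2.0 + random.uniform(-0.15, 0.15)
        tilt = random.uniform(-3.0, 3.0)              # degrees off vertical
        lean = random.uniform(-2.0, 2.0)
        headstones.append((x, z, tilt, lean))
```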
 



Images courtesy Look Effects.
For this scene from the new hit show Bones, the team from Look Effects
filmed at an arboretum, where set dressers had added Styrofoam headstones,
which were digitally multiplied, to simulate Arlington Cemetery. CG elements
and greenscreened actors were also added later.
 
Later, the team adds as much depth of field as possible in Maya before rendering the scenes in Mental Ray and, sometimes, LightWave. According to Ivims, prodigious use of depth of field is integral to the extreme micro close-ups in CSI: NY. When a shot shows a substance being absorbed through the skin, the camera quickly zooms in to a real arm, then the team uses a motion blur to dissolve into a CG arm. After applying a photoreal texture map, artists composite multiple renders using transparency to give the illusion of subsurface scattering.
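
The subsurface illusion described here boils down to a stack of semi-transparent renders combined back to front. A minimal sketch of that compositing idea follows; the layer files and opacities are hypothetical, and a real comp in After Effects or Shake would use per-pixel alpha rather than one opacity per layer.

```python
# Sketch: fake subsurface scattering by layering semi-transparent renders
# back to front with a simple "over" composite (assumes RGB images).
import numpy as np
import imageio.v3 as iio

def over(front_rgb, alpha, back_rgb):
    """Over operator with a uniform layer opacity."""
    return front_rgb * alpha + back_rgb * (1.0 - alpha)

layers = [("bone.png", 1.0), ("tissue.png", 0.55), ("skin.png", 0.35)]
result = None
for path, opacity in layers:        # deepest layer first, skin last
    rgb = iio.imread(path).astype(np.float64) / 255.0
    result = rgb if result is None else over(rgb, opacity, result)
iio.imwrite("arm_composite.png", (result * 255).astype(np.uint8))
```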
 
After Look Effects proved it could reproduce Arlington Cemetery, the producers of Bones returned with an even bigger challenge: re-create Washington, DC for an elaborate aerial pullout from the Washington Monument that lands in the lap of the Lincoln Memorial. The nighttime aerial originates from a polygonal model of the Washington Monument sculpted in Maya, and then flies over an entire virtual set of Washington toward the Lincoln Memorial. “We built everything, including the Memorial, the malls on either side, and the reflecting pool,” says Ivims. The water was a simple polygonal mesh to which the artists applied a sine wave to make the surface ebb and flow. It was then mapped with a highly reflective shader and raytraced in Mental Ray to create reflections of the world above.
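
The reflecting-pool trick is simple enough to show directly: displace a flat vertex grid with a traveling sine wave so the surface ebbs and flows over time. The amplitude, wavelength, and speed below are hypothetical stand-ins, not the values used on the show.

```python
# Sketch: animate a flat water mesh by displacing vertex heights with a
# traveling sine wave; parameters are illustrative.
import numpy as np

def displace(xs, zs, t, amplitude=0.02, wavelength=4.0, speed=1.5):
    """Height of each (x, z) vertex at time t (seconds)."""
    k = 2.0 * np.pi / wavelength        # spatial frequency of the wave
    return amplitude * np.sin(k * (xs + zs) - speed * t)

# A 10 x 10 unit pool sampled on a 50 x 50 vertex grid, half a second in.
xs, zs = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
heights = displace(xs, zs, t=0.5)
```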
 
Stargate to the Virtual Backlot
 
While Zoic Studios and Look Effects have become adept at assembling digital sets in a matter of days, the ability to offer clients a library of ready-made virtual sets is what many experts argue will become the next big step in streamlining episodic production. And that’s exactly what’s being tried at Stargate Digital, where founder/CEO Sam Nicholson and his crew have been building 360-degree, high-resolution mega-mattes of every conceivable environment a director could want. It’s called a Virtual Backlot, and Nicholson likens it to shooting a movie without actors. “By shooting those images at extremely high resolutions (8000 lines), the directors and DP can visit the environments from the positions we’ve shot them and go in much further if they want to,” he says. “They can plan their own camera moves within the environment, and no two people would create the same sequence with the same footage.” Nicholson estimates it takes about two hours of footage to create a variable-resolution, 360-degree environment.
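
The reason a director can “plan their own camera moves” inside such a plate is that any view direction maps to a fixed pixel in an equirectangular panorama. The sketch below shows that mapping; the 16,000-by-8000 resolution echoes the 8000-line figure above, and the function illustrates the math, not Stargate's pipeline.

```python
# Sketch: map a unit view direction to a pixel in an equirectangular
# panorama (y up, 2:1 aspect), so any camera pan/tilt re-samples the plate.
import math

def direction_to_pixel(dx, dy, dz, width=16000, height=8000):
    """Unit view direction -> (column, row) in an equirectangular image."""
    phi = math.atan2(dx, dz)                      # longitude, -pi..pi
    theta = math.acos(max(-1.0, min(1.0, dy)))    # 0 at zenith .. pi at nadir
    col = (phi / (2.0 * math.pi) + 0.5) * (width - 1)
    row = (theta / math.pi) * (height - 1)
    return col, row

# Looking straight down the +z axis lands at the horizontal center line:
print(direction_to_pixel(0.0, 0.0, 1.0))  # (~7999.5, ~3999.5)
```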
 
Rather than dollying around a circular set surrounded by a greenscreen, which Nicholson says creates too much bounce light, the team places the actors on a rotating turntable while the lighting grid above turns with them. Using the Virtual Backlot and nested 3D geometry, Stargate has built virtual sets and set extensions for ER, Las Vegas, CSI, and Steven Spielberg's Into the West. “Crossing Jordan also uses them a lot because it's easier to re-create Boston on a greenscreen than it is for the actors to fly there,” says Nicholson. “Moreover, time stands still in the Virtual Backlot. Because magic hour lasts forever, you can afford to perfect a background and add CG foreground elements and put actors and partial set work in between.”
 
Stargate’s most recent Virtual Backlots were created for the TNT original film Battleground, starring William Hurt. For the movie, Stargate created scores of digital characters using a combination of greenscreened live actors and virtual people created in Maya and mapped with motion-captured data. All the shots were composed on the circular turntable at Stargate’s Van Nuys, California, stage.
 
For creating and lighting nested 3D geometry, such as the massive digital layout of ancient Rome for the miniseries Spartacus, modelers rely primarily on Maya and occasionally on LightWave; compositors use After Effects and Shake; texture artists and matte painters work with Photoshop exclusively; and matchmovers track shots in Boujou. “With Boujou, we don’t have to waste time on set gathering lens information,” says Nicholson. “Indeed, no two lenses are the same, so it’s better to let Boujou determine the [focal length] of the lens. Instead of 24 mm, it will say 23.75, and it processes it—and it’s pixel perfect.”
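
Why does a solved value like 23.75 matter? Because the pinhole relation turns focal length in millimeters into focal length in pixels, which is what a CG camera match ultimately needs. A small sketch, assuming a Super 35-size film gate and a hypothetical plate width:

```python
# Sketch: pinhole camera relation between focal length in millimeters and
# in pixels. The 24.89mm Super 35 aperture width and 2048-pixel plate are
# assumptions for illustration.
def focal_mm_to_pixels(focal_mm, sensor_width_mm, image_width_px):
    """f_px = f_mm * (image width in pixels / sensor width in mm)."""
    return focal_mm * image_width_px / sensor_width_mm

# A solved 23.75mm lens on a 24.89mm-wide gate, scanned at 2048 pixels:
print(focal_mm_to_pixels(23.75, 24.89, 2048))  # ~1954 pixels
```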
 
Stargate’s render farm houses approximately 200 processors, primarily in Boxx computers along with some Dell machines. The studio also maintains about 100 TB of storage to efficiently run its proprietary VOS (Visual Operating System) software, which organizes visual information and disseminates it to a number of artists in various countries. “Transparently tracking all our effects shots, VOS provides real-time access to, and playback of, an extensive stock-footage library as well as every rendered effects shot from any desktop in our facility,” adds Nicholson.
 
The Digital Road Ahead
 
Looking to the future, the three companies are broadly optimistic, not only about the prospects for small effects houses tailored to episodic work, but also about the ability of virtual sets to expand the creative horizons of television and completely redefine the role of the digital effects artist. “With the absence of sci-fi shows, visual effects artists have to change from a bunch of guys who can render spaceships to heads of departments who interface with TV production and collaborate on set,” says Look Effects’ Ivims. “We get to be more creatively involved with all the other departments.”
 



Images courtesy Stargate Digital.
These images are from Stargate Digital’s Virtual Backlot Russia location, for
a project the group completed last year. The first image is a greenscreen shot,
with the digital set applied in the final.
 
 
Moreover, television’s reliance on virtual sets could serve as a seedbed for change in the film world as well. Vlad Bina, a digital set designer from xyBlue Design whose credits include the Matrix films, agrees: “Stargate’s Virtual Backlot is one of the first attempts to create a system and pipeline for acquiring and organizing digital set assets. Surprisingly, such a system is missing in most of the large companies doing million-dollar digital set shots. Sometimes they end up reinventing the wheel for every new project. To make a comparison, modern architecture today is an assembly process based on huge catalogs of ready-made construction components that are assembled according to a design grammar. An architect works with custom elements only sporadically; most of the design vocabulary is based on a library of tried and trusted components. I see the same thing happening in the future, to some extent, for virtual sets.”
 
And Bina’s vision of the future gets even brighter. With z-depth cameras and crossover technologies from advanced game-rendering engines and video card chips, real-time virtual set compositing could be only 10 years away, he says. Indeed, a primitive system has already been successfully tried on Disney's The Book of Pooh TV series. Stargate’s Nicholson also emphasizes the importance of standardized, workhorse software like Maya and LightWave, which can deliver the “good, fast, and cheap” triangle and shorten the learning curve for new artists. And it’s the artists themselves who will profit the most from the growing market, says Orloff.
 
“TV tends to be dominated by smaller companies because we're able to harness the collective talents of a small group of artists better than larger companies can. When we get a project, I’ll tell four guys to put their heads together and pitch me the best solution that exploits everybody's talents,” says Orloff. “On the episodic stage, everybody gets to shine.”



Martin McEachern is an award-winning writer and a contributing editor for Computer Graphics World. He can be reached at martin@globility.com.