Rethinking Moviemaking
Volume 31, Issue 11 (Nov. 2008)


Could we be about to witness a revolution in filmmaking as profound as the introduction of sound and color? Proponents of stereoscopic 3D films think so, and it looks like they might be right.

In 2005 when Disney created a stereo 3D version of its first CG feature Chicken Little, the stereo 3D version played in only 84 theaters equipped with the new, digital projector-based RealD systems, which use disposable, polarized glasses. The following year, stereo 3D versions of Sony’s Monster House and Disney’s The Nightmare Before Christmas landed in 200 RealD theaters. By November 2007, when Paramount Pictures released Beowulf in stereo 3D, the number of RealD theaters had grown to 900. Now, the chickens and the eggs—that is, the theaters and the content—are quickly moving into place for a major revolution.

Digital projection is the key to stereo 3D’s theatrical success. The stereo 3D systems, such as those from RealD, that sit in front of the digital projectors control the dual images (left eye, right eye) with split-second accuracy. This has helped eliminate the headache-producing misalignments of left-eye/right-eye frames that can result when two sprocket-based projectors put images on screen (see “Supersized,” January 2007).

However, the slow rollout of digital projectors in movie theaters has held back the adoption of stereo 3D. That’s about to change. On October 1, a consortium of Hollywood studios—including Disney, Paramount Pictures, Universal Pictures, Twentieth Century Fox, and Lions Gate—announced a pledge of more than $1 billion to upgrade 20,000 North American movie theaters to digital projector systems.

“Digital conversion is the major cost,” says Jim Dorey of MarketSaw.blogspot.com, a Web site focused on 3D movies. “Once that’s done, it’s a relatively minor cost for a theater to move to 3D.”

The digital projection installation project will unspool during the next three years, but 1300 theaters already show 3D films, and RealD, the leader in this field, boasts deals in place for future installations that would bring its total to 5000 theaters. Further increasing the potential number of theaters is Dolby, which entered the picture in 2007. Dolby’s stereo 3D systems, which don’t require special screens but do need reusable polarized glasses, have now landed in 150 US theaters and an additional 350 around the world. And, just last month, Sony announced a 3D adapter for its 4K-resolution movie-theater projectors that it plans to ship in March 2009.

The 5000-theater number would be a milestone: When a film studio releases a so-called tent-pole film—a movie, for example, like Iron Man, WALL-E, Indiana Jones, or The Dark Knight—it lands in approximately 4000 theaters on opening day in the US. So, having 5000 potential 3D-capable theaters has sparked excitement in the major studios.


Director Henry Selick is using varying depths to dramatize story points for Focus Features’ Coraline, a stop-motion animation created at Laika and scheduled for release in February 2009.

“We anticipate there will be between 2500 and 3000 screens in North America by the first quarter next year,” stated John Batter, co-president of production at DreamWorks Animation SKG, speaking at “The Conversation,” a forum co-hosted by Scott Kirsner in early October. As for content, such prominent tent-pole filmmakers as Steven Spielberg, James Cameron, Tim Burton, Robert Zemeckis, and Peter Jackson have 3D films in the works.

In fact, Dorey lists on his Web site 21 3D films scheduled for release in 2009—CG features, stop-motion animation, and live action films—including Zemeckis’s 3D animation A Christmas Carol, Pixar’s Up, Sony Pictures Imageworks’ Cloudy With a Chance of Meatballs, Fox’s Ice Age: Dawn of the Dinosaurs, and Focus Features’ Coraline (wide release). Cameron’s 3D live-action science-fiction thriller Avatar opens in December 2009. Spielberg’s 3D animation Tintin, produced by Peter Jackson, is scheduled for 2010.

Moreover, Disney plans to author and release all its CG films in 3D. So does DreamWorks Animation.

“We’re looking at the point where there’s going to be 3D content coming out every couple of weeks now,” says Phil “Captain 3D” McNally, stereoscopic supervisor at DreamWorks Animation. McNally supervised the conversion of Disney’s Chicken Little into 3D at Industrial Light & Magic and was stereoscopic supervisor at Disney for Meet the Robinsons.


Belgian director Ben Stassen floated the characters in his stereo 3D film Fly Me to the Moon in front of the screen, over the heads of the audience. Stassen created the film exclusively in 3D.

Fundamental Changes
The studios’ drive to release 3D films is causing a fundamental change in filmmaking that is ripping through the production process. “Every movie we release starting in March 2009 will be authored in 3D,” says Batter, alluding to the March release of Monsters vs. Aliens. “We’ll conceive in 3D and shoot in 3D. It’s the next great frontier.” That’s the tipping point: Filmmakers are now considering stereoscopy from the beginning. “Chicken Little and Meet the Robinsons were post conversions,” says Robert Neuman, stereoscopic supervisor at Walt Disney Animation Studios. “Bolt is the first film in which 3D is part of production. As we’re laying out the films and setting out the cameras for 2D, we’re building the 3D version.”

Similarly, at the visual effects studio Sony Pictures Imageworks—where artists created the animated films The Polar Express, Monster House, and Beowulf, all shown in 3D—stereo is moving further upstream, according to Rob Engle, senior stereographer and digital effects supervisor. “With Polar and Monster House, it was sort of an afterthought,” he says. “We are much more embedded into production from the outset now.”


Disney’s Meet the Robinsons was the second CG animation that the studio converted into stereo 3D.

Engle finds that this is also true for live-action directors. “Filmmakers are asking us how they can adjust their films for the stereoscopic medium,” he says. “They’re talking to us about which camera angles and compositions work better. These conversations didn’t happen two years ago.”

And, that’s why the practitioners of stereoscopy compare stereo 3D today to the introduction of color and sound in years past.

“If you look at the advances in the history of cinema, they all had technical hurdles to overcome,” says Neuman. “Color had to have an emulsion developed that conveyed the spectrum properly. And, in the earliest color movies, filmmakers were using color more for spectacle than for storytelling. Color was so saturated that reviewers said it hurt.” Much like color, Neuman explains, 3D had false starts because of the technology and the spectacle of 3D that seduced early filmmakers: “The studios would go for the gimmicks, the ‘throw everything at the audience’ approach.”

Now, filmmakers working with animated and live-action features are using 3D more creatively. To learn what they are discovering, we talked with several 3D pioneers creating animated CG films, stop-motion features, and live-action films.

Sculptural Movies
“I talk about stereo 3D now as spatial moviemaking,” says McNally. “The difference is like comparing painting to sculpture. We have the potential to conceive the whole storytelling art form as a spatial art form, especially when we combine CG with stereoscopic moviemaking.”

Close-up shots provide one example of how 3D might differ from traditional moviemaking. In the latter, when a director shoots a star for his or her “close-up,” the character in the film appears closer because it looks bigger. “It’s part of the illusion you create when you’re working in a medium that doesn’t support distance,” McNally says. “But, in a 3D movie, it’s possible that the close-up could be shot with the framing quite wide, and with the character moving closer to us. Is that more powerful, or is it distracting? These are questions we’re trying to understand.”

When stereographers structure a 3D experience, they work within certain limits.  “We have something we call the stereo budget, or the parallax budget,” Engle says. “That’s the technical limitation for how far away or close something can be before it becomes uncomfortable [to look at]. So, the creative aspect is in using that space to give the audience the best experience.”
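The budget itself is just arithmetic over screen parallax. A back-of-the-envelope sketch (the comfort limits below are illustrative rule-of-thumb values, not Imageworks’ actual numbers) expresses a point’s left/right separation as a fraction of screen width and flags shots that overrun the budget:

```python
# Hypothetical parallax-budget check; the limits are illustrative only.
def within_budget(parallax_px, screen_width_px=2048,
                  max_positive=0.01, max_negative=0.02):
    """parallax_px > 0 puts a point behind the screen; < 0 brings it out
    in front. Limits are expressed as fractions of screen width."""
    fraction = parallax_px / screen_width_px
    return -max_negative <= fraction <= max_positive

print(within_budget(15))    # ~0.7% behind the screen: comfortable -> True
print(within_budget(-60))   # ~2.9% in front of it: over budget -> False
```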


Director Eric Brevig filmed the live-action Journey to the Center of the Earth in stereo 3D. The July 2008 release into multiplex theater chains earned $100 million at the box office.

Within that limitation, a director can bring objects out into the audience or use stereo 3D as a window into which the audience looks deeply. “I’ve often heard James Cameron say he’s keeping the subject of interest on the screen,” McNally notes. “He’s shooting live action, and you can understand why you would want to edit without worrying about the subjects jumping around. The other extreme is Fly Me to the Moon. The space of the movie is detached from wherever the screen is, almost nothing plays at the screen, and almost every character is within arm’s reach. We’re using both techniques and working between [them].”

Don Hahn, who was the producer at Disney for a number of feature animations, including Beauty and the Beast, The Lion King, The Nightmare Before Christmas, and The Emperor’s New Groove, is currently the executive producer for Tim Burton’s 3D Frankenweenie, scheduled for December 2009. “Directors are entering a whole new space with new rules,” he says. “For example, you might have a deep set, like a big ballroom, but you want the audience to focus downstage or way upstage. So, you converge the eyes to different points on the screen. Also, you can play with distance to create a child-like wide-angle perspective or a more normal adult sense of a room. But, I suppose pacing is the biggest difference. When you convert flat animation to 3D, there are times when you wish you could linger and let the eye adjust to a scene. The [stereo 3D] directors have extra time to follow objects to the z plane and back again.”

The possibility of such leisurely pacing is something that Adam Holmes, vice president of Wide Band Entertainment, finds enticing. Holmes is currently executive producer for a stereo 3D feature animation, an independent production made feasible by the promise of 5000 theaters. “If we can design stereo 3D films in which we hold on to the scenes, we might be able to bring some artistry back into our ADD world,” he says. “We may be able to extract emotion from films, not just fast cuts and fast pacing.”

Emotional Depth
In designing the overall look through the arc of a film, the stereoscopic teams frequently compare the conceptual use of 3D to that of a sound track. “We constantly play with the depth like a symphony,” McNally says. “We might start quiet, build to a crescendo, and then fall back so that depth is something that flows.” The stereoscopic artists consider depth within individual scenes as well.

“When filmmakers went for the gimmick of hurtling stuff at the audience, they pulled the audience out of the film, instead of creating an immersive experience,” Neuman says. “We’re taking a more mature approach.” For example, in the film Bolt (see “Back to the Future,” pg. 20), Disney stretches the environment fully for action scenes, but when the dog Bolt talks with the cat Mittens, and it’s a lighthearted moment, the crew tones down the depth.

The instruments that stereographers use to move objects toward the audience or push them far away are the distance between the two cameras (one for each eye) and the convergent point for the two images. The choice of camera lens affects depth as well.

“Let’s say you want to use a long lens, maybe a 100mm lens,” Neuman says. “That gives you an unsatisfactory result in 3D: There is a large separation in depth between the characters from foreground to background, but each character looks like cardboard. To compensate, you can increase the interocular or interaxial distance [between the cameras], which increases the internal volume in the characters, but the gaps between the characters are also magnified.” And that could quickly use up the stereo budget. To solve this problem, Disney and other studios use multiple sets of cameras—perhaps one for the foreground, one for midground, and one for the background—and dial in the depth to create the internal volume and roundness for the character. Then, they composite the layers together.
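The long-lens trade-off Neuman describes falls out of basic parallax geometry. In this hedged sketch (an assumed camera model—parallel cameras with a horizontal image shift—and invented numbers, not Disney’s pipeline), the parallax change across a single character’s own depth stays tiny at a 100mm focal length, which is the cardboarding effect, while widening the interaxial scales all parallax up, restoring roundness but also magnifying the gaps between characters:

```python
# Illustrative parallax model: p(z) = f * t * (1/c - 1/z), converted to
# screen pixels. All numbers are assumptions for the sake of the sketch.
def parallax(z, interaxial, convergence, focal_mm,
             sensor_mm=36.0, screen_px=2048):
    """Approximate on-screen parallax (pixels) of a point at distance z
    (meters). Positive values sit behind the screen, negative in front."""
    shift = focal_mm * interaxial * (1.0 / convergence - 1.0 / z)  # mm on sensor
    return shift * screen_px / sensor_mm                           # sensor mm -> pixels

# 100mm lens, 65mm interaxial: a character spanning 9.5-10.5m of depth
# gets only a few pixels of internal parallax -- the cardboard look.
flat = parallax(10.5, 0.065, 10.0, 100) - parallax(9.5, 0.065, 10.0, 100)

# Widening the interaxial to 300mm multiplies every parallax value,
# rounding out the character but also magnifying inter-character gaps,
# which eats into the stereo budget.
full = parallax(10.5, 0.30, 10.0, 100) - parallax(9.5, 0.30, 10.0, 100)
print(flat, full)
```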

With that kind of control available, directors can choose where to place characters in scenes and how much depth to give characters based on the story they’re telling. “One metaphor we build on is equating the emotional depth of a scene to the depth of the character,” Neuman says. “We might increase the depth until we get a nice, round, full character for an emotional beat. A second literal metaphor we use is equating emotional separation to depth.”

For example, when directors want the audience to connect with a character, they might place that character on the audience’s side of the frame; that is, bring the character out into the audience a little. When they want the audience to feel detached from the character, they might push the character back.


Imageworks turned the CG film Beowulf into a stereo 3D version using techniques honed on The Polar Express and Monster House.

To do this without changing other characteristics in the scene, Neuman sometimes uses another stereoscopic technique called the floating window:  The stereoscopy crew puts a black mask on the edges of the images and then varies the thickness of that mask between the two eyes to change the perceived location of the theater screen.

“We create a virtual screen (a window) that we can float into the theater, push back into the screen space, or change the orientation of the image,” Neuman says. This helps them fix problems—perhaps a frame edge that crosses in front of an object appearing in front of the screen. Moving the frame puts the depth cues back in sync. But, he also has begun using the floating window more creatively, to temper the balance between emotional and stereo depth.
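The masking trick is easy to illustrate. In this toy sketch (grayscale images as lists of pixel rows, nothing like a production compositor), blacking out different amounts of the frame edge in each eye gives that edge its own parallax, so the window appears to float in front of or behind the physical screen:

```python
# Toy floating-window mask: black out differing edge widths per eye.
def apply_window(image, left_mask, right_mask):
    """Black out left_mask columns on the left edge and right_mask columns
    on the right edge of a row-major grayscale image (list of lists)."""
    out = [row[:] for row in image]
    width = len(out[0])
    for row in out:
        for x in range(left_mask):
            row[x] = 0
        for x in range(width - right_mask, width):
            row[x] = 0
    return out

left_eye  = [[128] * 8 for _ in range(4)]
right_eye = [[128] * 8 for _ in range(4)]

# Masking the left eye's edge wider than the right eye's shifts the left
# eye's visible edge rightward -- negative parallax, so that side of the
# window floats toward the audience, ahead of anything breaking frame there.
floated_left  = apply_window(left_eye,  left_mask=2, right_mask=0)
floated_right = apply_window(right_eye, left_mask=1, right_mask=0)
```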

“Sometimes if we can just pull out one corner, we can retain the proximity we want,” Neuman says. “And, we’re experimenting with tilting the window during action sequences to create tension. We’re sculpting a 3D environment artistically based on where the eyes are drawn and where the action is taking place.”

At Laika, Coraline director Henry Selick is using stereo 3D to enhance a main story point for the stop-motion animation. “Our main character, Coraline, is a little girl who lives in a bland, overcast place,” says Brian Van’t Hul, VFX supervisor who won an Oscar for visual effects in King Kong while he was at Weta Digital. “In that real world, we keep the stereo flat. But when she goes to the magical land, we increase the depth to enhance the move. We’re using stereo 3D in an intentionally dramatic fashion.”

Because stop-motion animators shoot their films one frame at a time, in order to create the stereo 3D version, they simply shoot two stills. “We shoot the left eye with the digital still camera and then move it over and shoot the right eye,” explains Van’t Hul. Compositors layer the stereo images into backgrounds moments after an animator shoots the frame.

DEEP TOOL KITS

New tools from several software companies now help artists working in 3D. For example, Autodesk’s Lustre speeds the work of colorists who are grading 3D films. The Foundry’s Ocular plug-in for Nuke helps compositors who paint one eye to create the second eye in the correct depth. Quantel’s Pablo system aids artists matching the colors on the left and right. And, Tweak’s RV image-viewing software gives artists the ability to see images in stereo 3D on standard monitors.

The animation and visual effects studios are also creating 3D-specific tools, and those tools provide a glimpse into the state of the art for stereoscopy.

Weta Digital
“We’ve had to come up with a whole new workflow,” says Matt Welford, head of compositing at Weta Digital, where work is under way on James Cameron’s Avatar and Steven Spielberg’s Tintin. “We use [Apple’s] Shake and Nuke for compositing, which are both node-based. Initially, we made a left tree for the left eye and a right tree for the right eye, so we had two trees per shot rather than one, as in traditional films. To reduce the number of files and make file management neater for artists, we worked on a way to represent stereo images in a single image file. To do that, we formalized the names of the left and right eye within the EXR file format.”

Weta called the new format SXR, and worked with the OpenEXR community to make it broadly useful. “Over the last six months, we’ve seen companies starting to adopt, or at least support, this format,” Welford says. “Now, within Nuke, for example, there’s a left-eye and a right-eye button in a single file.”
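The naming idea can be sketched without the OpenEXR library itself. The convention below is illustrative of the multi-view approach—the real format also records a multiView header attribute, and the default view may keep unprefixed channel names—rather than the exact SXR layout:

```python
# Illustrative multi-view channel naming for a stereo image file.
def stereo_channels(views=("left", "right"), channels=("R", "G", "B", "A")):
    """Map view-prefixed channel names to their view and color component."""
    named = {}
    for view in views:
        for ch in channels:
            named[f"{view}.{ch}"] = {"view": view, "component": ch}
    return named

layout = stereo_channels()
# Both eyes now live under one set of channel names in a single file,
# e.g. "left.R" and "right.R", instead of two parallel file trees.
print(sorted(layout))
```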

DreamWorks Animation
“We have a whole suite of tools that take care of the nuts and bolts of stereo for the artists,” says Phil McNally, stereoscopic supervisor at DreamWorks. “If an animator has a scene with a character running up a road straight to camera, we would want the character to feel distant at the beginning and literally get closer to us at the end. We give the artists controls so they can dial the stereo as they watch the performance.” Artists can have a different stereo setting on every keyframe without worrying about distortions, and without knowing the nuts and bolts.

“We also have something called the multi-rig,” McNally adds. “In CG, it’s easy to have three, four, five stereo rigs all pointing to the same scene in much the way we have multiple lights putting rim lights on one character, for example, but not another. Unless you’re doing a 3D science project, why not have everything up for manipulation and artistic interpretation?”

Disney Feature Animation
“We developed a camera rig that uses a results-driven paradigm. I can define how far out from the screen I want the closest part of the scene to be, how deep into the farthest part of the scene, and where I want the convergent point to be,” says Robert Neuman, stereoscopic supervisor at Disney. “Based on those three things, the system sets up the camera positions for us.”
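A results-driven rig of this sort amounts to inverting the depth mapping. The sketch below is a reconstruction from first principles, not Disney’s actual code: given where the nearest and farthest scene points should land relative to the screen, it solves a simple parallax model for the camera separation and convergence distance:

```python
# Illustrative rig solver; the parallax model p(z) = k * t * (1/c - 1/z)
# and the constant k (lens and screen scaling lumped together) are assumptions.
def solve_rig(z_near, z_far, p_near, p_far, k=5000.0):
    """z_near/z_far: scene distances (m). p_near/p_far: target screen
    parallax in pixels (negative = out in front of the screen).
    Returns (interaxial t, convergence distance c)."""
    t = (p_far - p_near) / (k * (1.0 / z_near - 1.0 / z_far))
    c = 1.0 / (p_far / (k * t) + 1.0 / z_far)
    return t, c

# Place the nearest point 20 px in front of the screen and the farthest
# 30 px behind it; the solver returns the camera setup that achieves it.
t, c = solve_rig(z_near=2.0, z_far=50.0, p_near=-20.0, p_far=30.0)
```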

Neuman continues: “We also have a tool to visualize the floating window within a 3D scene. We can see exactly where the floating window lies. So, if I have an over-the-shoulder shot with a character breaking the frame on frame right, I can grab that floating window within the camera space and pull it out until it’s in front of the character.”

To avoid the cardboarding that can happen when the crew uses a telephoto lens, they have tools that tell them the internal volume of characters. “If a character has a factor of 1.0, it’s a nice, round character the audience relates to. If it’s .1, or only 10 percent round, that’s a problem, and we might need to use multiple camera rigs,” Neuman explains. “We also have tools that allow us to view what it would look like with multiple rigs. All the tools we developed in-house work within [Autodesk’s] Maya.”

Imageworks
“We have a huge suite of tools, from image viewing to dialing the depth within Maya, that we’re constantly refining,” says Rob Engle, senior stereographer at Imageworks. “But one area we’re particularly interested in moving forward is the 2D-to-3D conversion. The most relevant example is our work on G-Force.”

In this film, which is scheduled for July 2009, a specially trained squad of guinea pigs becomes a force for doing good. “We’re taking 2D plates, isolating elements that would be at different depths within the plate, and producing the other eye’s point of view. We slide the elements over to create depth and then fill in the holes that the other eye would see. We’re developing a wide variety of tools that allow us to do that. And then, we have the added challenge of putting elements from the virtual world into the plates. And, we’re still investigating the techniques we’ll need for Tim Burton’s Alice in Wonderland—imagine Beowulf but with live people in a 3D virtual world that’s Tim Burton’s vision of Alice in Wonderland.”
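The slide-and-fill step Engle describes can be illustrated on a single scanline. This toy sketch (a painter’s-order shift plus a naive left-neighbor hole fill, far cruder than Imageworks’ tools) synthesizes the second eye’s scanline from per-pixel disparities:

```python
# Toy 2D-to-3D conversion for one scanline: shift pixels by disparity,
# let nearer (larger-disparity) pixels win overlaps, then fill the holes.
def synthesize_eye(row, disparity_row):
    out = [None] * len(row)
    # Paint far-to-near (smallest disparity first) so foreground wins.
    for x in sorted(range(len(row)), key=lambda i: disparity_row[i]):
        nx = x + disparity_row[x]
        if 0 <= nx < len(out):
            out[nx] = row[x]
    # Fill holes exposed behind the shifted element from the left neighbor.
    for x in range(1, len(out)):
        if out[x] is None:
            out[x] = out[x - 1]
    return out

scanline  = [10, 10, 10, 99, 99, 10, 10, 10]   # 99 marks a foreground element
disparity = [0,  0,  0,  2,  2,  0,  0,  0]    # foreground slides 2 px
print(synthesize_eye(scanline, disparity))
# -> [10, 10, 10, 10, 10, 99, 99, 10]: the element moved, holes were filled
```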

Frantic Films
Frantic created 200 shots for Journey to the Center of the Earth using Autodesk’s 3ds Max and Mudbox, Pixologic’s ZBrush, Nvidia’s Gelato, and several proprietary tools, including the studio’s water-simulation software. For compositing, particularly to work with stereo, the studio created a series of scripts and plug-ins for Eyeon’s Digital Fusion 10 that they’ve named Awake. The studio sells the tools it developed to work on Journey and other films on its FranticFilms.com Web site.

“We wrote a completely new pipeline for Journey,” says Chris Harvey, co-visual effects supervisor at Frantic with Mike Shand. “The entire movie was shot in stereo, so we had double plates for everything. We could have decided to work on one eye and then ask artists to create the second eye, or we could have had scripts generate the second eye. But, we came up with a third approach. Our artists worked in stereo all the way from tracking to finishing the shots. We literally worked on the shots with both eyes at the same time stacked side by side or vertically. We called it ‘stereo stacking.’ It was a huge time savings.”

–Barbara Robertson


Before beginning the project, Selick consulted with Lenny Lipton, CTO at RealD and a renowned stereo 3D pioneer. The studio brought in other stereo experts, as well. “They taught us all the rules, and it’s good to know the rules,” Van’t Hul says. “But many of us are from creative backgrounds. We’re trying to break as many rules as we can.”

Depth of field is one point of contention. “The experts say everything should be as sharp as possible to make stereo work,” Selick says. “That way there’s no eye strain and you can see everything. I say, ‘Yeah, but the whole point is that we don’t want the audience to look everywhere.’ The easiest thing when you have so much in frame is to go a bit wider and throw the background out of focus so the foreground pops.”

Thus, Van’t Hul and his crew have decided to flout conventional wisdom when they can. “We don’t launch an animator on a shot lightly,” he says. “We don’t have time to go back. But sometimes given the energy of a shot, we can be bold if the shots aren’t on the screen long enough to cause eye strain.”

Live Action
One of the biggest 3D success stories for 2008 has been the feature film Journey to the Center of the Earth, directed by Eric Brevig, who received two Oscar nominations for best visual effects while at ILM (Pearl Harbor, Hook) and a special Achievement Award from the Academy for Total Recall. This Walden Media/Warner Bros. film, which Brevig shot in 3D, has grossed $178 million worldwide.

Frantic Films, one of several studios that worked on the feature, created approximately 200 VFX shots, including those in which they integrated footage of actor Brendan Fraser and others in a raft that floated on CG water. “We had to track two cameras for every shot,” says Chris Harvey, co-VFX supervisor with Mike Shand. “And for every frame, we had to create two separate images.”

Typically, the crew would track one eye view and then use the studio’s software tools to generate the second eye view. Even so, the artists needed to tweak the result. “The physical cameras caused discrepancies,” says Harvey, “and the two tracks had to match perfectly.”

Similarly, rotoscoped images needed to match perfectly, even though the original images might not. For example: “The camera views of Brendan Fraser’s hair might be slightly different,” says Shand. “You might see a curl of hair in the left-eye image but not in the right-eye image. That confuses the viewer’s brain and causes what I call buzzing. So, the artists had to tweak the images by hand.”

Specular highlights captured by the camera were often different for the left and right eyes, as were lens distortions and colors. “The biggest things that would make this process easier would be if the camera systems produced footage that was more identical in color,” Shand says, “and, if we could improve the interocular tracking.”

As do filmmakers working with 3D animation, Frantic’s visual effects crew used the distance between the cameras as a storytelling device for their largely digital shots. “If you increase the distance, things effectively look small, as if you were a giant,” Harvey says. “If you do the reverse, everything feels bigger and more imposing. So, we constantly shifted interocular distance and convergence to tell the story. We did a lot of adjusting—literally sliding the images left or right.”

STEREO 3D AT HOME

Although movie theaters are excited that stereoscopic 3D promises to pull people away from their screens at home, consumer products are in the works for viewing 3D at home and even on the road. Samsung and Philips have demonstrated no-glasses 3D TVs. Hyundai sells a 46-inch 3D TV that requires glasses. At the Ceatec show in Japan recently, Panasonic showed a 3D high-definition home theater, which requires 3D glasses, NEC demoed a nine-inch glasses-free 3D LCD, and KDDI showed a 3.1-inch glasses-free 3D LCD display. And Fuji has announced a pocket-sized camera that shoots 3D movies.

In addition, Nvidia is introducing new glasses and a transmitter that works with existing CRT displays, high-definition DLPs, and the new 120Hz LCD monitors. Software in the system converts a standard 3D video game into a stereoscopic 3D game. "Since all the data is coming down the pipe in real time, we don't have to pre-author," says Andrew Fear, product manager. "We have a predefined depth amount, but if the end user wants to adjust the depth, he or she can."

Although the target for the glasses is gamers, Nvidia expects that when a Blu-ray or other standard emerges for high-definition 3D movies, the DVD crowd will want to view stereo 3D movies at home, too. "I think we're probably six months to a year away from having a rigid format," Fear predicts. "In the meantime, we're building our architecture to play back whatever they adopt."

Also aimed at gamers is iZ3D's $599 22-inch LCD monitor that displays stereo 3D with the help of a pair of glasses. The display, which gamers can also use as a standard monitor on PC systems equipped with a dual-output video card, displays 1680x1050-resolution images and stereo with a 170-degree viewing angle.

Meanwhile, such display manufacturers as Alioscopy are readying glasses-free 3D systems. Adam Holmes, producer of an announced stereo 3D feature animation, has been using these monitors for pre-production and looks forward to the day when glasses-free displays will find a home in living rooms. For now, though, "I'm most excited about the possibility of monitors like these being used as digital billboards," Holmes says. "That could really help advertise our independent movie."

–Barbara Robertson


In doing so, the crew quickly discovered that making those adjustments on a computer screen wasn’t sufficient. “You have to look at these images big,” Harvey says. “To get things to sit properly in stereo, we were moving things at a sub-pixel level.”

For example, differences in the live-action images sometimes caused ghosting. “Ghosting happens when high-contrast areas close to each other are at different depths,” Harvey says. “It causes buzzing in your eyes, and it’s hard to resolve. If you’re working only in CG, you can move the images onto a zero plane with no convergence—the left and right images are the same. In live action, what was shot is what was shot.”

At Weta Digital, where work on Cameron’s Avatar is ongoing, Matt Welford, compositing head, is running into some of the same challenges. “Often, we find that pulling a key for the left eye will not work for the right because of differences in lighting,” he says. “We have to make sure the quality is perfect for both eyes, but the cameras are looking in slightly different areas.”

That makes the job of painting the images much more difficult. “Hand painting on a single view is already difficult,” Welford says. “But now, if we have to remove something from a scene and hand-paint the image, we’re painting a very specific depth. If we put something in a slightly different depth for one eye, we throw the stereo off. We had to toss a lot of the tips and tricks we used to use out the window. However, it’s good to be doing something new and different.”

Moving Deeper
Welford is certain that although the studio will solve some of the complex problems in the next few years, creating hybrid live-action/CG films, such as those he’s working on, won’t become a push-button process anytime soon.

“I think maybe for people working in fully CG movies, it will be easier to make 3D a process, but it will take a long time for people working with live-action stereo films to develop automated processes that will be at a standard that’s acceptable for final production.”

Even so, Imageworks’ Engle believes that the choice to make a 3D film will eventually become as insignificant as the choice between filming in color or black and white.


Disney’s 2005 animation Chicken Little was the first stereo 3D film, shown in 84 theaters newly equipped with RealD projectors. Experts estimate that in the first quarter of 2009, approximately 3000 North American theaters will be stereo 3D-ready.

“We have three hurdles to overcome,” Engle says. “One, audience acceptance among adults as much as kids. Two, filmmaker acceptance—not every filmmaker is as excited as Burton, Spielberg, Zemeckis, and Cameron. It remains to be seen at what stage everyone jumps onboard, and that may never happen. And three, making the decision a no-brainer. We’re not there yet. We have a lot of work to do with tools, but what we’re really missing right now is experience. The visual language is still evolving.”

What might we see in the future as stereographers such as Engle, McNally, Neuman, and various directors and visual effects supervisors continue exploring the possibilities? Engle offers a few thoughts: “So far, everyone is using 3D as a representational experience,” he says. “They’re mimicking reality. Why do we need to do that? I think there is a filmmaker out there who will make a revolutionary movie—someone who will really push the technology. Someone who will break the rules.”

McNally is looking in that direction, too. “You don’t expect a painting to be a photograph,” he says. “You expect to see the medium. What does it mean when we’re in a full three-dimensional spatial delivery of a story? That’s what is so exciting. We have an undiscovered medium.”

But with 5000 3D theaters in the offing and directors readying new content, it’s a medium that’s ready for its close-up.

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net