Setting the Stage
Issue: Volume 26, Issue 11 (November 2003)


Digital video and computer graphics effects have been starring on the big screen for years, and now the same tools and techniques that have wowed movie audiences are captivating theater-goers of a different kind. From rock concerts to operas, ballets to modern dance, digital images are taking center stage in these traditional theatrical productions. In some instances, the DV and CG elements serve as backgrounds; other times they are used as props or even performers on the stage. While the role or the particular art form in which they appear may vary, their function remains the same—to enhance the performance.

At the University of Kansas (KU), students and faculty have been integrating computer graphics in a number of unique projects (see "Performance Enhancer," pg. 52). Starting with the 1994 staging of Elmer Rice's The Adding Machine, KU's University Theatre and Institute for the Exploration of Virtual Realities have featured stereoscopic and virtual scenic environments in seven "experimental" works, the most recent being Mozart's The Magic Flute, which also includes projected video.

"The current generation is bombarded with television, movies, games, and so on, and just putting up the traditional box setting with painted muslin and having it sit there for two hours isn't really speaking to the audience," says designer/technologist Mark Reaney. "Instead, we need to use a language they understand, which is modern, digital, and loaded with information."

Yet Reaney is quick to note that while DV and CGI can enhance theatrical performances, the productions must also retain the characteristics that make live performance special and distinctive. One way to achieve this is by using real-time DV and CG as opposed to pre-animated imagery, so every performance is unique and fresh.

The first segment of this two-part series highlights an interactive application of DV at a rock concert and the fusion of scientific imagery with the soothing notes of a string quartet. The second segment will focus on a range of CG technologies used in an opera produced by KU, as well as in a ballet and a modern dance. Though the applications and technologies may vary in these examples, the end result is the same—a brand-new look for 21st-century stage productions. —Karen Moltenbrey

DV|Rock Concert

By Stephen Porter

When Lollapalooza first appeared on the music scene in 1991, it was a groundbreaking event that introduced the music world to the concept of the touring, multi-band summer music festival. Not surprisingly, when Lollapalooza returned this season after a six-year hiatus, it again struck an innovative note, this time through the use of digital video technology that turned a traditionally passive concert experience into an interactive affair.

More than just a concert, Lollapalooza is an all-day event at which the attendees can watch multiple bands perform on two different stages or can wander around the grounds, which are packed with activities, performers, and an assortment of vendor booths. The vision for creating an interactive concert experience originated with Perry Farrell, the front man for the group Jane's Addiction and the founder of the original Lollapalooza.

"Integrating music, technology, and lifestyle into an interactive entertainment experience has been Farrell's vision for years," says Michael Abrams, managing director of Lollapalooza. And the return of Lollapalooza provided him with the perfect venue.

Photos courtesy Barry Brecheisen Photography.

(Top) Video producer Paul Harb, sitting at an Avid Adrenaline editing system nestled among packing crates in an area off the main Lollapalooza stage, places the final touches on DV imagery that was subsequently projected in near-real time onto one of the large LED screens at the venue.

A video game pavilion, for instance, gave concert-goers a chance to test their computer gaming skills, or they could use their mobile phones to enter contests and respond to trivia questions displayed on large Lighthouse LED screens near the main concert stage as well as on the "Lollatron" screens in a central location on the festival grounds. Occasionally, fans could even dial up to vote on which songs they wanted the bands to play.

The most impressive display of interactivity, however, was achieved through an innovative application of video. Indeed, most musical groups today utilize large screens positioned around the stage to show IMAG (image magnification) video, which consists primarily of close-up shots of the band and the audience. The feeds are live, from the cameras directly to the screens. On occasion, a concert also will feature projected pre-produced video. These are segments that are edited prior to the event and played at certain times during the show.

At Lollapalooza, Farrell and Abrams used both applications, in addition to a third, which required them to film, edit, and broadcast the resulting video in near-real time. Specifically, their goal was to film candid video footage of the crowds and edit the material into polished segments for display in short "newsreels," in effect, enabling the concert-goers to truly become part of the show.

One of the people brought in to help make this vision a reality was Paul Harb, a video producer and owner of Wrong Beach Multimedia in Long Beach, California. Harb, who joined the project about two weeks before the start of the festival, admits that he was skeptical when he first heard the plan. "I think a lot of us walked out of that initial meeting with our jaws open, wondering how we were going to do this," he recalls. "Everybody appeared to have their own concerns, but for me, it was about the schedule. Getting these [segments] up in that short amount of time would be difficult, especially when so many other variables necessary to make it happen were out of my control."

One of Harb's first requests was for an Avid Technology Adrenaline system. At the time, the Adrenaline had just become available, but Harb had read about its speed. A member of Avid's new DNA (Digital Nonlinear Accelerator) family of editing systems, the Adrenaline combines the company's next-generation Media Composer software with a DNA hardware accelerator—a stand-alone box that connects to the computer with a single FireWire cable. Whereas the earlier Meridien-based Media Composer can handle two real-time streams of uncompressed standard-definition video, the Adrenaline can handle up to five streams (and up to eight in draft mode). It also supports a variety of video formats, including DV25, DV50, MPEG, IMX, ABVB and Meridien media, and uncompressed NTSC ITU-R 601, all of which can be mixed within the same timeline.

Two weeks prior to Lollapalooza, Harb used the system to edit 11 hours of video that was used as pre-produced content. But the real test came once the tour started, when he had to create six newsreels a day, spending less than a half hour producing each one.

For the content, a team of three videographers using three Canon XL1s and one GL1 mini-DV camcorder obtained candid footage of festival attendees. A favorite subject was the Mindfield actors, who mingled among the crowd playing pranks and gags on unwary victims. Once the trio shot enough footage, they delivered the mini-DV tapes to Harb at his Hewlett-Packard xw8000 workstation, which was tucked among the stacks of packing crates in one of the wings off the main stage. There, Harb had approximately 20 minutes to edit the footage, and then export it, compress it, and send it out over a wireless network to the Lollatron and the Lighthouse LED screens at the main stage.

Concert-goers, including this wandering performer, became part of the Lollapalooza show, as video shot during the event was edited on site and projected onto the screens at the locale.

Not only was this a complex process that had to be completed within a short period, but it had to be done six times a day. "We'd strive to get newsreels for every scheduled set break," Harb says. "But there were days when I'd miss that window, and then we'd have to wait for the next break."

In fact, the process was so complex that it took a couple weeks' worth of shows to iron out all the kinks. "We had to work out the workflow," explains Harb. "Because I didn't get pulled into the project until late, we didn't have time to rehearse or practice, to see what worked and what didn't. So when we went on the road, there were certain issues that had to be worked out, like what's the best bit rate and what's the best codec to use that will give us the best quality without stutter. In addition, certain graphics that we used during the beginning of the tour didn't look good during the daytime because of the color contrasts, so we'd only pull those out and play them in the evening."
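The bit-rate question Harb describes comes down to simple arithmetic: an encoded clip has to cross the venue's wireless link well inside the roughly 20-minute production window. The sketch below works that tradeoff out; the link speed, overhead factor, and bit rates are illustrative assumptions, not figures from the tour.

```python
# Back-of-envelope check of the bit-rate tradeoff: a higher bit rate
# improves picture quality but lengthens the wireless transfer, eating
# into the editing window. All numbers here are illustrative guesses.

def transfer_minutes(clip_seconds, video_kbps, link_mbps, overhead=0.35):
    """Estimate wireless transfer time for an encoded clip.

    overhead models protocol framing and retransmissions on a busy
    wireless link, so usable throughput is (1 - overhead) * nominal.
    """
    clip_megabits = clip_seconds * video_kbps / 1000.0
    usable_mbps = link_mbps * (1.0 - overhead)
    return clip_megabits / usable_mbps / 60.0

# A 3-minute newsreel left at DV25's ~25 Mbps would take on the order of
# ten minutes to push over an early-2000s 11 Mbps wireless link, which is
# why re-encoding down to a few Mbps matters.
for kbps in (25000, 4000, 1500):
    t = transfer_minutes(180, kbps, link_mbps=11)
    print(f"{kbps:>6} kbps -> {t:5.1f} min over a nominal 11 Mbps link")
```

The codec choice interacts with the same budget: a more aggressive codec buys a lower bit rate at the same perceived quality, at the cost of longer encode times on the workstation.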

Despite these types of issues and the occasional network problems that inevitably plague a highly wired event, Abrams says he couldn't have been more pleased with the results. "It really helped the attendees become part of the show and involved in the story and the magic of the moment, rather than appearing as 'the audience' between celebrity shots," he adds. "The crowd became stars of the show, fulfilling Farrell's dream of making the concert experience more than just a spectator sport."

Stephen Porter is a contributing editor of Computer Graphics World and a freelance writer.

DV|Orchestra Concert

By Karen Moltenbrey

An international tour by the Kronos Quartet combined music, science, and DV technology during its performance of new-age composer Terry Riley's Sun Rings, a multimedia extravaganza of string-quartet music, chorus, space sounds, and video. Commissioned by the NASA Art Program to champion space exploration and to commemorate Voyager's 25 years in space, the recent concert featured the sights and sounds of outer space set to a stringed accompaniment. And the timing couldn't have been better for the audiences, as the topic of Mars dominated the news when its orbit brought the planet into close proximity to Earth.

The person responsible for the visual and musical collaboration was visionary designer Willie Williams, who has worked on video stage presentations for U2 and the Rolling Stones, among others, including the dance troupe La La La Human Steps (featured in Part 2 of this series). The involvement of a video specialist like Williams represented a major departure for a live classical-music production. Until recently, high costs kept video out of the price range of most performing groups, especially non-profit musicians such as Kronos, notes Punk Films director Mark Logue, whose studio edited the imagery for the concert. But with the arrival of powerful laptops and relatively inexpensive editing software, that situation has changed.

For the Sun Rings production, Williams culled rare video footage of astronomic phenomena filmed in space and provided by NASA, NASA's Jet Propulsion Laboratory (JPL), the University of Iowa, and the University of Alaska to accompany Riley's approximately 90-minute work, which was performed without interruption. After Williams selected the archived imagery, Punk Films in London edited the video and generated related visuals of space ships, planets, diagrams, equations, and photos, all of which were projected onto a large screen at the rear of the stage. Among the treasures Logue worked with was footage from a satellite launched in 1998 that captured close-up video of the sun's surface, which served as the foundation for many of the projected sequences. "I'd never before seen images like these," he says. "They were stunning."

Images courtesy Punk Films.

Punk Films edited raw scientific imagery, from NASA and other agencies and universities, that served as a visual accompaniment to the recorded space sounds and musical augmentation by the Kronos Quartet during a NASA-sponsored arts event.

As Logue points out, the digital video was an accompaniment to the music, not a simple visual collection that was projected onto a screen. "We listened to the music as we edited the selections," he explains. "We approached each as a separately composed structured edit—as though we were making a film—as opposed to looping abstract graphics that bore no relation to the instrumentals. We timed and structured the edits so they matched the pauses, quiet pieces, and busy segments, and as a result, the imagery worked in tandem with the music."

This was important, Logue says, because the music of a quartet rises and falls to where "you can hear a pin drop," and the visuals had to follow that rhythm without overtaking the music. Conversely, rock music is loud, lending itself to "louder" imagery that can be looped, he notes, drawing on Punk Films' experience of creating stage visuals for recent concert tours by Queen, Aerosmith, U2, and the Rolling Stones. While the type of music and visuals may vary, Logue contends that the imagery must adhere to one basic rule: It cannot upstage the band.

Many of the "space arrangements" from Sun Rings were based on the recordings of so-called whistlers (produced by lightning disturbing the plasma outside the Earth's atmosphere) and other sounds collected during research by professor/physicist Donald Gurnett of the University of Iowa. Sun Rings actually began with sounds from Gurnett's audio tapes that were captured during the past 40 years by scientific instruments aboard spacecraft such as Voyager I and II. The concert then progressed into an integrated sensory experience as the trio of space sounds, musical notes, and video blended together with mesmerizing results.

In one visual backdrop, Logue infused video footage of Gurnett, along with mathematical formulas and related symbolic imagery such as a spinning gyroscope and a satellite. In another segment, which related to the rotation of the Earth, the group projected a large NASA image of the planet tipped on its side, then rotated it so the dark side penetrated the stage beneath the quartet. "It looked awesome as the globe loomed over the quartet," says Logue. But this was one instance when scientific accuracy overshadowed the aesthetic value. "On the first night of the performance, some physicist friends of Gurnett were in the audience," he recalls, "and during this selection we could hear a ruckus. We had spun the Earth in the wrong direction."

That was just one of many aesthetic alterations that Punk Films made to some of the scientific imagery. "Obviously, as you clean it up, you are destroying the original information," Logue says. Yet, there was no question that the initial satellite imagery required editing because of its crude quality. "It wasn't intended to be watched as a beautiful visual sequence," he notes. "Rather, it was for scientists to pore over and analyze." According to Logue, the video required a significant amount of panning, tilting, perspective correction, color correction, and similar enhancements to turn the crackly, low-resolution, noise-filled segments into a smooth, soothing piece that could be projected onto a large screen. The data—acquired as a series of individual snapshots over long stretches of time—also had to be edited into a flowing, sensible sequence.

Logue and his partner, Marina Fiorato, used Adobe Systems' After Effects, along with a series of plug-ins, as their main video-editing tool for polishing the data and stripping out the noise, or grain. Sometimes the pair encountered big jumps between frames, or missing or partial frames, often caused by satellite repositioning. When this occurred, the editors used RealViz's Re-Timer to create the missing snapshots. Once the editing was completed, Logue and his colleague used Apple Computer's Final Cut Pro to generate a concert-ready version of the sequences.
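The gap-filling Re-Timer performed can be illustrated with a much cruder stand-in: linearly cross-dissolving between the good frames on either side of a gap. Real retiming tools use motion-aware interpolation rather than blending, and the function and flat-list frame model below are hypothetical, purely for illustration.

```python
# Toy stand-in for filling missing frames between two good snapshots:
# synthesize the in-betweens by linear cross-dissolve. Production
# retiming tools instead track motion (optical flow); this sketch only
# shows the simplest form of temporal interpolation. Frames are modeled
# as flat lists of pixel intensities for clarity.

def interpolate_frames(frame_a, frame_b, n_missing):
    """Return n_missing frames blended linearly from frame_a to frame_b."""
    out = []
    for i in range(1, n_missing + 1):
        t = i / (n_missing + 1)  # 0 < t < 1, evenly spaced in the gap
        out.append([(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)])
    return out

# Example: one missing frame between a dark and a brighter snapshot
# lands halfway between them.
mid = interpolate_frames([0.0, 0.0], [1.0, 0.5], 1)
print(mid)  # [[0.5, 0.25]]
```

A cross-dissolve is adequate when the camera and subject barely move between snapshots, which is roughly the situation with slowly evolving solar footage captured at long intervals.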

"What we did wasn't exactly rocket science," Logue says. "But the combination of unique imagery and stimulating music resulted in a performance that was out of this world."

Karen Moltenbrey is a senior technical editor at Computer Graphics World.


Adobe Systems
Apple Computer
Avid Technology