CASE STUDIES: Storage stokes the creative process
Issue: Volume 29, Issue 3 (March 2006)

Change is a constant in today’s studios as they struggle with tight deadlines, the need to juggle multiple projects at once, and the prospect of overhauling their server, storage, and networking architectures to make way for the growing wave of high-definition (HD) work.

In our latest look at the state of storage technology in fast-paced studio environments, we found everything from “storage-on-the-go” systems used for real-time editing in the field to elaborate enterprise-class installations that mirror those of many Fortune 500 companies. These high-powered implementations can encompass hundreds of terabytes of data and “tiered-storage” architectures that move archival and backup data onto lower-cost storage systems. We also found facilities that use homegrown virtualization software to let users access specific files without knowing which physical device actually holds them.

Underlying these storage strategies is the goal of making work in progress instantly available, shareable, and reusable from a central storage repository. It’s also about producing quality content as efficiently as possible.

“Anytime your user data becomes centralized and available from more locations, it can be worked on in a more cost-effective manner,” says Matthew Schneider, director of technology at PostWorks, New York, a film and HD post facility that has been involved in a variety of independent feature films and TV shows.

“Sooner or later, storage will become part of the lifeblood that makes it all happen,” says Schneider. “Whether it’s interesting to you or not, it’s something you’ll be forced to learn how to do. Storage is at least half the equation, if not more.”

PostWorks’ storage infrastructure includes a variety of Avid workstations connected to Avid’s Unity shared storage systems via 2 Gb/sec Celerity Fibre Channel host bus adapters (HBAs) from Atto Technology (which also offers high-speed 4 Gb/sec Fibre Channel adapters). PostWorks’ total Unity storage capacity exceeds 30 TB.

For storage and playback of high-bandwidth digital 2K film-mastering files, PostWorks also uses Facilis Technology’s TerraBlock 4 Gb/sec Fibre Channel SAN disk arrays, which are based on low-cost, high-capacity Serial ATA (SATA) disk drives. TerraBlock capacities range from 2 TB to 12 TB, with support listed for up to 24 simultaneous users and 16 streams of uncompressed HD video. PostWorks also uses TerraBlock storage, along with its own self-assembled “nearline” storage systems, as a temporary holding area for older files.

Matthew Schneider, PostWorks’ director of technology, uses 4 Gb/sec Fibre Channel HBAs from Atto in most of the studio’s Avid workstations.
Maciek J. Maciak, director of engineering at The Napoleon Group, sits in front of what he calls the storage backbone at the facility, a 3.5 TB Max-T Sledgehammer disk array from Maximum Throughput.

Configuration files for key applications have been modified to automatically export to the Max-T system for archiving. According to Maciak, “Operators know to look for those types of folders on the Max-T, so there’s no more ‘share your drive and I’ll put this on your desktop.’ This just simplifies the whole archiving process.”

Maciak also maintains 3 TB of nearline storage, which is simply a PC with a RAID storage controller. When a job no longer needs to be on the primary Max-T system, it is sent to a “holding” folder used to stage files for nearline storage. “We have a rule that says if the Max-T is 75 percent full, dump all the contents of the holding folder to nearline storage. Once the nearline storage gets full, we move it over to tape,” Maciak explains.
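
The rule Maciak describes is simple enough to script. A minimal sketch of that kind of tiering policy, written here in Python and assuming hypothetical mount points for the Max-T volume, the holding folder, and the nearline PC, might look like this:

    import shutil
    from pathlib import Path

    # Hypothetical mount points for illustration; the real paths depend on the facility.
    PRIMARY = Path("/mnt/max-t")          # primary Max-T Sledgehammer volume
    HOLDING = PRIMARY / "holding"         # jobs queued for migration
    NEARLINE = Path("/mnt/nearline")      # the PC with the RAID controller

    def usage_fraction(volume: Path) -> float:
        """Return how full a volume is, as a fraction between 0 and 1."""
        stats = shutil.disk_usage(volume)
        return stats.used / stats.total

    def migrate_holding_folder(threshold: float = 0.75) -> None:
        """Once the primary volume passes the threshold (75 percent here),
        move everything in the holding folder to nearline storage."""
        if usage_fraction(PRIMARY) < threshold:
            return
        for job in HOLDING.iterdir():
            shutil.move(str(job), str(NEARLINE / job.name))

    if __name__ == "__main__":
        migrate_holding_folder()

The final hop, from nearline storage to tape, would follow the same pattern once the nearline volume itself approaches capacity.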

Sausalito, California-based music editor Malcolm Fife knows firsthand how much inefficient storage can cost a project. Fife has performed sound work on movies such as King Kong and The Lord of the Rings trilogy. He is also a partner in Tyrell LLC, which focuses on sound design, music production, and postproduction editing and mixing.

Fife knows that when the director of a big-budget film comes to play the latest sound reel and suggests a few changes, you don’t want to hold up the process trying to move the file from suite to suite to cut a new version.

“These are multimillion-dollar timelines. If that change isn’t handled instantly, you could easily blow thousands of dollars,” says Fife.

After working with London’s Abbey Road recording studio on The Lord of the Rings soundtracks, Fife and his partners became enamored of the workflow there, which was based on several Studio Network Solutions A/V SAN Pro Fibre Channel storage systems that helped streamline the recording, editing, and mixing of the score for The Lord of the Rings: The Two Towers. (In addition to Abbey Road, A/V SAN Pro disk arrays are used by facilities such as Sony/ATV, Wally’s World, and Vidfilm/Technicolor.)

Fife and his partners decided to duplicate much of the Abbey Road setup in their Sausalito facility. However, instead of Fibre Channel SAN connections, they went with two Studio Network Solutions iSCSI-based globalSAN X-4 shared storage systems running on a Gigabit Ethernet network. (iSCSI SANs provide a low-cost alternative to Fibre Channel SANs.) Because the systems are based on Ethernet and iSCSI, Fife’s team found it could even do quick checks of files using a laptop in a different room. By installing SNS client software on the laptop, the group could plug into the SAN from anywhere, even over the Internet.

During peak production for South Park, it’s not uncommon for the 60-plus animators and editors at Los Angeles-based South Park Studios to work approximately 100 hours per week.

According to J.J. Franzen, South Park Studios’ technology supervisor, work begins in earnest a week before the show is due to air, with changes often made as late as 12 hours before airtime. This timeline requires systems to be available at all times, a requirement that was put to the test at the start of the ninth season, when a network switch failure made it impossible to access any work in progress until the problem was fixed.

“We realized then we had to get rid of our older stuff and remove single points of failure on our network,” Franzen explains. The studio replaced the main file server, another potential point of failure, with Apple Xserve RAID arrays and a 15 TB Apple Xsan configuration capable of handling the 30 MB to 50 MB needed for an average animated scene, as well as the 150 MB to 200 MB required for larger scenes. The storage is also mirrored and supports automatic failover.

During the upgrade, Franzen also implemented Atempo’s Time Navigator software, which takes incremental backups four times a day. Time Navigator now provides time-stamped backups of earlier file versions, so animators and editors can quickly retrieve them without re-rendering.
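
Time Navigator manages its own catalog and scheduling, but the underlying idea, copying only what has changed since the last run into a time-stamped snapshot, can be sketched in a few lines of Python (the project and backup paths below are hypothetical):

    import shutil
    import time
    from pathlib import Path

    # Hypothetical locations for illustration only.
    PROJECT = Path("/san/projects/current_episode")
    BACKUPS = Path("/backup/current_episode")
    STAMP_FILE = BACKUPS / ".last_run"

    def last_run_time() -> float:
        """Time of the previous backup run, or 0 if this is the first run."""
        return float(STAMP_FILE.read_text()) if STAMP_FILE.exists() else 0.0

    def incremental_backup() -> Path:
        """Copy files modified since the last run into a time-stamped
        snapshot folder, then record this run's time for the next pass."""
        since = last_run_time()
        snapshot = BACKUPS / time.strftime("%Y%m%d-%H%M%S")
        for src in PROJECT.rglob("*"):
            if src.is_file() and src.stat().st_mtime > since:
                dest = snapshot / src.relative_to(PROJECT)
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dest)
        BACKUPS.mkdir(parents=True, exist_ok=True)
        STAMP_FILE.write_text(str(time.time()))
        return snapshot

Run four times a day from a scheduler, each snapshot gives editors a time-stamped copy of earlier work that can be pulled back without re-rendering.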

Another studio that knows how to keep the digital pipeline humming is Los Angeles-based Rhythm & Hues, an animation and digital effects studio that became known as the “talking animal house” for its award-winning work in the film Babe.

Recently, Rhythm & Hues put 650 people to work during the peak production phase of the Disney movie Narnia. Chief tasks included animating key characters such as Aslan the lion and developing their underlying muscle movements, along with computer-generated simulations of smoke, fire, and the lion’s fur.

According to Rhythm & Hues’ vice president of technology, Mark Brown, this required about 24 TB of data to move through the system each night, a process managed through a combination of several high-speed Titan storage systems from BlueArc and Rhythm & Hues’ homegrown virtualized file system. The file system allows data to be replicated across the BlueArc disk arrays so that bandwidth levels are always maintained.
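
Rhythm & Hues has not published details of its file system, but the basic idea can be illustrated: a virtualization layer keeps a catalog of where each copy of a file lives and spreads read requests across the replicas so no single Titan head becomes a bottleneck. The catalog entries and paths in this Python sketch are invented for illustration:

    import random

    # Illustrative catalog: one logical file, replicated behind several Titan heads.
    replica_map = {
        "/show/aslan/fur_cache_0420.dat": [
            "titan01:/vol3/fur_cache_0420.dat",
            "titan02:/vol1/fur_cache_0420.dat",
            "titan03:/vol2/fur_cache_0420.dat",
        ],
    }

    def pick_replica(logical_path: str) -> str:
        """Return one physical copy of the file, chosen at random so that
        thousands of render processes don't all hit the same storage head."""
        return random.choice(replica_map[logical_path])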

Rhythm & Hues used three high-speed storage systems from BlueArc to handle the renders and the nightly processing of up to 24 TB of data during peak production of Narnia.
Image © Disney Entertainment

“You can put 256 TB on a storage server [which is what each Titan system can support], but our problem is that we have so many processors going at it that we wouldn’t have the bandwidth we need. So we only put 4 TB to 6 TB behind each Titan head so that we have the bandwidth to get to the data,” says Brown. BlueArc’s Titan storage systems deliver 300 MB/sec to 400 MB/sec of sustained throughput for Rhythm & Hues and are capable of scaling from 5 Gb/sec to 20 Gb/sec of throughput, according to BlueArc.
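
A back-of-the-envelope calculation using the article’s own figures shows why the capacity behind each head is kept deliberately small:

    # Roughly 24 TB moves each night; each Titan head sustains 300 MB/sec to 400 MB/sec.
    nightly_tb = 24
    mb_per_tb = 1024 * 1024

    for heads in (1, 3):
        for rate_mb_s in (300, 400):
            hours = nightly_tb * mb_per_tb / rate_mb_s / 3600 / heads
            print(f"{heads} head(s) at {rate_mb_s} MB/sec: {hours:.1f} hours")

    # 1 head(s) at 300 MB/sec: 23.3 hours
    # 1 head(s) at 400 MB/sec: 17.5 hours
    # 3 head(s) at 300 MB/sec: 7.8 hours
    # 3 head(s) at 400 MB/sec: 5.8 hours

One head would need most of a day just to stream the nightly data; spreading it across three heads brings the job down to an overnight window.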

What do you do when the clips you produce might end up on other TV channels, on other shows, or even on the Internet? According to Jeff Mayzurk, senior vice president of technology at E! Networks, it means the underlying IT infrastructure has to let production teams quickly reuse content as needed, in a variety of forms, as soon as it’s produced.

From a storage perspective, this has meant a “hub-and-spoke” architecture that captures any footage acquired from the field once and copies it into the network’s central storage repository. The material is then distributed to various edge (or spoke) locations for their own use, whether on an in-house Avid system or on an Apple Final Cut Pro system connected from a director’s home.

What makes this model work is custom virtualization software that masks the underlying complexity of the storage systems in use. Storage resources at E! Networks include about 200 TB of SATA-based NAS from Isilon Systems, Network Appliance NAS servers, and two Fibre Channel SANs from DataDirect Networks. Mayzurk says the virtualization capability allows him to remain vendor-agnostic: “We didn’t want to be tied to a particular vendor or type of technology. We want the flexibility to migrate as storage technology improves.”
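
E!’s virtualization software is custom and not publicly documented, but the principle is straightforward: clients address a single logical namespace, and only a catalog knows which vendor’s system currently holds a given file, so content can migrate between systems without touching client-side paths. The paths in this Python sketch are invented for illustration:

    # Illustrative catalog mapping logical paths to whichever system holds the file.
    catalog = {
        "/content/redcarpet/clip_88213.mov": "isilon-nas:/pool2/clip_88213.mov",
    }

    def resolve(logical_path: str) -> str:
        """Clients ask for the logical path; the catalog hides the physical device."""
        return catalog[logical_path]

    def migrate(logical_path: str, new_location: str) -> None:
        """After copying a file to another vendor's system (copy not shown),
        repoint the catalog entry; client-side paths never change."""
        catalog[logical_path] = new_location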

Isilon’s IQ series of clustered storage systems includes the OneFS distributed file system, which scales to 250 TB of capacity; the company claims throughput of 3 GB/sec. In addition to standard Gigabit Ethernet connections, Isilon’s IQ series systems are also available with higher-speed, lower-latency InfiniBand connections.

Jeff Mayzurk, senior vice president of technology for E! Networks, fuels the company’s multimedia productions with several hundred terabytes of networked storage and custom virtualization software.



Michele Hope is a freelance writer focusing on trends and advancements in the storage industry. She can be contacted at mhope@thestoragewriter.com.