Storage Space
By Randi Altman
Issue: Volume 37, Issue 1 (Jan/Feb 2014)


What is the best kind of storage? The kind you don't have to worry about. Across the board, that is the consensus from visual effects pros. 

With 2K, 4K, and stereo projects coming into studios, storage is more important than ever, and having a system that is fast, flexible, and able to grow with artists' needs certainly hits home. Also important? Not needing a small electrical plant to power the system or having to break the bank to keep things cool.

Element X
Dallas-based Element X Creative (www.elementxcreative.com) is a 25-person, full-service visual effects, motion design, and animation studio targeting commercial, television, and film work. Recently the studio began creating its own animated properties.

"We focus on design and storytelling, while investing heavily in our staff," explains Element X CEO/partner Chad Briggs, who believes in not getting too big and, instead, relying on a manageable core staff.

Briggs feels that not being tied to only one aspect of the work has given Element X Creative (EXC) what he calls a leg up. "I've never been a snob to one medium; I love visual storytelling of all kinds," he says. "Our guys tend to cross-pollinate between visual effects, animation, and graphics. And it's that cross-pollination that drives the design because we are not limited in how we think about a project - we can approach it from the best angle to tell the story."

All of that couldn't be accomplished without having a deep, fast, and expandable storage platform. "Storage is the lifeblood," maintains Briggs. "You have to have enough, and you always need more, especially these days with 2K, 4K, and 4K stereoscopic. The demands of production on the visual effects side of things continue to get more extreme."


THE CREATIVE-CARTEL was on set for After Earth, employing its own management system, Joust. On the hardware side, the studio uses JMR equipment.

EXC has had a long-standing relationship with EMC Isilon. "They have been rock-solid, and the support is great," Briggs says.

The ability to add storage as needed, as well as to have a group of users connected to that storage, is hugely important to Briggs. In addition, the latest Isilon version offers more gigabit networking on each node, "so we have the ability to connect a lot more users in a centralized place."

EXC has three EMC Isilon F200 nodes of 6TB each - every node has four 1Gb Ethernet ports and 6GB of RAM - for a total of 18TB. Even so, the staff often hits that limit, and Briggs says the studio has begun looking into buying another node or two.
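The arithmetic behind those figures, sketched in Python using only the numbers quoted above (raw capacity and line-rate bandwidth, before protection overhead or protocol losses):

```python
# Cluster figures as quoted: three nodes, 6TB and four 1Gb Ethernet ports each.
NODES = 3
TB_PER_NODE = 6
GBE_PORTS_PER_NODE = 4

raw_tb = NODES * TB_PER_NODE             # 18TB raw, before protection overhead
agg_gbps = NODES * GBE_PORTS_PER_NODE    # 12Gb/s of aggregate link capacity
print(f"{raw_tb}TB raw, {agg_gbps}Gb/s aggregate (~{agg_gbps * 125:.0f}MB/s line rate)")
```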

All that storage comes with the need for robust power and cooling. When EXC moved into its present location about four years ago, it had to add a new electrical subsystem to accommodate the storage. "It gets really hot," Briggs says. "Once you start throwing in Isilon nodes and backup systems and renderfarms - we have about 35 to 40 machines dedicated to rendering, each running different software/engines - that gets power-intensive. Then you have the cooling aspect, keeping it under 70 degrees."

In terms of projects, Element X this past summer completed a national spot for 7-Eleven out of the Integer Group promoting the convenience store's July 11 free Slurpee giveaway. It features comedian Nathan Barnatt doing the "Slurpee dance" in front of a greenscreen with retro '80s graphics behind him. The visuals were composited and animated at EXC. Charlieuniformtango handled the edit.

Another job was for Occam Marketing client LeapFrog. EXC created a flurry of CG products and graphics for the piece.

The studio's primary 3D software package is Autodesk's Softimage. The facility also employs Pixologic's ZBrush and The Foundry's Nuke, NukeX, and Nuke renderfarm nodes; a seat of Mari was expected to be added, as well.

The Creative-Cartel
The Creative-Cartel (www.thecreative-cartel.com) in Culver City, California, manages projects from camera to DI, providing everything from engineering digital pipelines, to near-set lab and dailies services on location, to VFX and stereo production management. Its résumé includes Priest, Ted, and After Earth.

Storage is incredibly important to every aspect of what The Creative-Cartel does and how the studio does it. It's all about efficiency and streamlining the workflow - so much, in fact, that the studio created its own management system, called Joust, which acts as a repository for all digital media and metadata during principal photography, including data wrangling, script, and camera notes, as well as pertinent color information for each shot. After the data is collected, Joust becomes a dailies and vendor-review system, with the ability to create bid packages, watermark images, and automate vendor submissions. Plus, it allows editorial to manage plate pulls and transcoding so that VFX plates are delivered the same day.
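Joust is proprietary, so its data model isn't public, but a minimal sketch suggests the kind of per-shot record such a system carries from set through vendor review; all field names here are hypothetical:

```python
from dataclasses import dataclass, field
from enum import Enum

class Stage(Enum):
    PRINCIPAL = "principal"   # data wrangling, script and camera notes
    DAILIES = "dailies"       # review and bid packages
    VENDOR = "vendor"         # watermarked submissions out to VFX houses
    DELIVERED = "delivered"   # plates back from the vendor, into the DI

@dataclass
class ShotRecord:
    shot_id: str
    camera_notes: str = ""
    script_notes: str = ""
    color_info: str = ""                       # e.g., a CDL from set
    source_files: list[str] = field(default_factory=list)
    stage: Stage = Stage.PRINCIPAL

    def promote(self, new_stage: Stage) -> None:
        """Advance the shot to the next phase of the pipeline."""
        self.stage = new_stage
```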

The Creative-Cartel has been using JMR tools for its infrastructure needs for the past three years. While Joust has become a prime part of The Creative-Cartel's pipeline, the facility still relies heavily on JMR, particularly the company's BlueStor line and its servers. "We will call on JMR for pretty much all our hardware needs at this point," says CEO Jenny Fulle.

The relationship began when the studio was looking for technology to do stereo playback at 4k for reviews. CTO Craig Mumma approached a couple of companies to see if they could help, but "they all said they didn't have the architecture and hardware to do it." Then he tried JMR. A week after the request, The Creative-Cartel had a box for testing over at The Amazing Spider-Man. "They pushed the limit on the playback speed of dual 4k 3D, which was amazing from a single box." In addition to the technology, Mumma appreciates the customer service: "You get directly to the heads of the company at any time. That's important in our industry because the speed of the production is ridiculous and you need answers right away."


SAVAGE VISUAL EFFECTS stays on the cutting edge of technology when working on series such as House of Cards, and that includes its storage solution.

One of The Creative-Cartel's most recent jobs was the Sony F65-shot After Earth, which brought its own set of workflow concerns. "When we started, we were pioneering the workflow for the F65. The last thing we wanted to worry about was our storage solutions because we had to worry about cameras more than anything," explains Mumma. "We went to JMR and said we need to have enough storage for all the camera files, and we are going to be traveling and need robust equipment that will last for all the different areas." The production took them to Costa Rica; Pennsylvania; Eureka, California; and Moab, Utah.

After Earth was the first film in which The Creative-Cartel used Joust almost as a complete package. "That meant keeping all those original Raw files live and online, which was a big deal," says Fulle. The group had 150TB live, rolling with them from town to town.

The Creative-Cartel provided production management for After Earth and acted as the hub for eight or nine visual effects companies, which produced about 700 shots. "We started at camera and did the mobile lab," explains Fulle. "So we processed all the dailies and kept all the files online, and once we started engaging with the vendors, Joust did all the transcoding from Raw files to OpenEXR, which was the format we worked in. We then managed all the digital images - moving them between the vendors and bringing them back in, showing them to the director, getting them to the DI house. After we finished the dailies, it was all about managing the visual effects workflow. We were able to do light grading on the [FilmLight] Baselight Transfer Station for visual effects stuff, too. It was a robust pipeline we worked out between Joust and the equipment we had on hand."
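The transcode step Fulle describes - Raw camera files out to OpenEXR for the vendors - is the sort of job a pipeline runs in bulk. A minimal sketch, with hypothetical paths and a placeholder 'raw2exr' command standing in for whatever converter the pipeline actually invokes:

```python
import subprocess
from pathlib import Path

RAW_DIR = Path("/mnt/online/raw")   # hypothetical paths
EXR_DIR = Path("/mnt/online/exr")

def transcode_to_exr(raw_frame: Path) -> Path:
    """Convert one camera-original frame to OpenEXR via an external tool;
    'raw2exr' is a placeholder for whatever converter the pipeline uses."""
    out = EXR_DIR / (raw_frame.stem + ".exr")
    subprocess.run(["raw2exr", str(raw_frame), str(out)], check=True)
    return out

EXR_DIR.mkdir(parents=True, exist_ok=True)
for frame in sorted(RAW_DIR.glob("*.mxf")):   # F65 Raw typically lands in MXF
    transcode_to_exr(frame)
```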

Mumma points out how important power and cooling are to a drive's efficiency, especially considering how much traveling the crew had to do on After Earth. "That is important to consider when you are moving around with these drives. The JMR [drives] have a low power requirement, so you don't have to build a power plant to get these things up and going. We don't have to have big, special rooms. Now we can set them up in a hotel room with basic cooling and power."

Fulle gives an example of when production took them to a remote location in Costa Rica. "They set us up in a hotel, and by hotel, I mean in the middle of the jungle with bugs and lights flickering. We had all our equipment set up, and we had run out of outlets for all the gear. Craig was able to take a sconce off a wall, pull the wires out, and wire up a plug. Five years ago, you couldn't have done something like that because you would have set the place on fire."

And indeed, there has never been a better time to take advantage of storage. "It's not as cost-prohibitive anymore," concludes Fulle. "One hundred terabytes is not going to break the bank, so you can keep it live and online, and save all the days that were wasted before and put them back into the hands of the artists."

Savage Visual Effects
Six-year-old Savage Visual Effects (www.savagevisualeffects.com) focuses on film, television, and spot work, with studios in Los Angeles and Pittsburgh. While Savage might not be a huge company, it does work with some big names, such as directors David Fincher (The Social Network, The Girl with the Dragon Tattoo, House of Cards), Louis Leterrier (Now You See Me), and Bryan Singer (Valkyrie).

Currently, Savage has a core staff and builds up as needed, but the studio is moving toward staffing a bigger office in Pittsburgh with more full-time personnel.

The studio's ties to Pittsburgh begin with co-owner James Pastorius, who grew up and went to school in the city. As such, he has many contacts and artists to call on.

Savage typically uses Apple's Shake and The Foundry's Nuke for its compositing needs, and relies heavily on various 3D applications and Pixar's RenderMan, but will call on other software packages as needed depending on which freelancers are brought on for certain jobs.

A Tiered Approach

Rising Sun Pictures (RSP) in Adelaide, Australia, is a visual effects specialist whose recent credits include The Great Gatsby, The Wolverine, Gravity, and Seventh Son. Founded in 1995, the company has seen its data storage needs grow exponentially over the years and currently operates a tiered storage infrastructure that includes a pair of EMC Isilon NL series storage clusters, a cluster of Avere FXT 3200 edge filers, and an open-source Lustre storage cluster. Connectivity comes via 10Gb Ethernet on the front end and QDR InfiniBand on the back. Additionally, the company has 1.3PB (petabytes) of long-term data written to tape.

From RSP's perspective, storage systems work best when they are invisible. If artists are worrying about versioning, backups, or access, it means there is a problem or they are not working efficiently. Routine functions, such as making redundant backups and moving files from online to offline, are now fully automated.

"Ten years ago, it was a constant battle to ensure we could store data and had a fast enough infrastructure," recalls RSP Visual Effects Producer Ian Cope. "Today, we have systems in place so that all the legwork is done by scripts. If we get a hard drive in from a client, we don't have to worry about shuffling data around. We simply copy it across and start working right away."


The Wolverine © 2013 Twentieth Century Fox. Image courtesy Rising Sun Pictures.

In 2008, RSP became the first Australian company to acquire an Isilon storage cluster. As data packages continued to grow, the shop added a second cluster. More recently, it incorporated the Avere edge filer to serve as a front-end caching system. That made it much easier for artists to share, manage, and work with files for current projects.

"Previously, if two render nodes hit one file, it would slow things down. Placing the Avere caching appliance between those nodes removed the load from the system," says Director of Engineering Mark Day.

Some of RSP's more challenging projects would have been difficult to manage without the performance boost provided by the Avere system. Cope points to The Wolverine, where RSP was charged with re-creating the atomic bomb attack on Nagasaki, Japan.

"There was a lot of data involved when the computers were trying to work out the nuclear explosion," Cope says, "especially as we were doing multiple passes and iterations. For high-end simulation work like that, you need robust systems in place or you are going to have trouble."

"On shows like The Wolverine and Gravity, we're working with large, complex data sets, non-stop every day," Cope says. "With our current infrastructure, there is little direct involvement from our artists. They can focus on solving creative problems and how things are going to look on the screen. And that, after all, is why we're here."

In terms of hardware, Savage co-owner Brice Liesveld recognizes that having the right kind of storage is hugely important. "It's all about efficiency," he says. "We are constantly needing more storage as things evolve, and if you can't access data in real time, your efficiency just drops through the floor. It's important for everyone to have access to what they need without the hiccup of having to go pull something online."

The need for storage never seems to end, especially with new technologies and the growing prevalence of 4K and beyond. "We aren't Weta by any means, but we chew through a lot of data, even at our size," explains Liesveld. "It was 2K, now it's 4K and 5K, and next it will be 6K and 8K. You have multi-layered files like EXR, deep compositing, which is file-size intensive, and newer cameras generating more and more metadata. Every day there is another chunk of data you need to store and access. Without having reliable, consistent, and large enough storage, you can't do your job efficiently."
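That growth compounds quickly, since doubling resolution roughly quadruples the pixel count. A back-of-the-envelope calculation for uncompressed frames at illustrative resolutions:

```python
# Rough per-frame sizes for an uncompressed four-channel, 16-bit (half-float)
# image; real EXRs compress, but extra layers and AOVs push the other way.
BYTES_PER_CHANNEL = 2
CHANNELS = 4

for name, (w, h) in {"2K": (2048, 1080), "4K": (4096, 2160),
                     "8K": (8192, 4320)}.items():
    mb = w * h * CHANNELS * BYTES_PER_CHANNEL / 1e6
    print(f"{name}: ~{mb:.0f}MB/frame, ~{mb * 24 / 1000:.1f}GB/sec at 24fps")
```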

In order to get a system that worked for them, Savage contacted Venice, California's Open Drives, which offers a scalable and easy-to-manage data storage platform built specifically for the media and entertainment industry.

"Jeff Brue at Open Drives based the system on hardware from SuperMicro, and he uses OpenIndiana, which is an open-source operating system that he has fine-tuned for film and media," reports Liesveld. "Our current Open Drives system gives us 50tb of live storage with the capacity to expand to approximately 150tb through the purchase of additional disks."

Savage's production setup offers 50TB of SAS disks that sit behind 960GB of L2ARC cache, which Liesveld describes as an SSD RAID, offering very fast read/write capabilities. "Elements and plates that are used frequently are automatically pushed to the L2ARC cache, and are then served off those faster disks instead of relying on the slower SAS pool."
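In ZFS terms, that arrangement is an L2ARC: SSDs added to a pool as cache devices, which ZFS populates with frequently read blocks on its own. A minimal sketch, assuming a hypothetical pool named "prod" and hypothetical device names:

```python
import subprocess

# Adding SSDs as cache devices to an existing ZFS pool; ZFS then promotes
# frequently read blocks to this L2ARC tier automatically, as described above.
subprocess.run(["zpool", "add", "prod", "cache", "/dev/nvme0n1", "/dev/nvme1n1"],
               check=True)
```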

The Open Drives platform uses RAID-Z, which Liesveld likens to RAID-5, but it manages file space better and is self-healing. "ZFS also gives us hot-swap capabilities for backups, so I could create a data pool of disks, push project data to it, and then pull those drives out for archive instead of having to go to LTO-3 or a FireWire-type backup solution," he says.
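A sketch of how that hot-swap flow might look, with hypothetical pool and dataset names: build a RAID-Z pool on removable disks, replicate a project snapshot to it, and export the pool so the drives can be pulled for the shelf:

```python
import subprocess

def run(*cmd):
    subprocess.run(cmd, check=True)

# Build a RAID-Z pool on removable disks (hypothetical device names).
run("zpool", "create", "vault", "raidz", "/dev/sdx", "/dev/sdy", "/dev/sdz")

# Snapshot the project dataset and replicate it to the removable pool.
run("zfs", "snapshot", "prod/projects@wrap")
send = subprocess.Popen(["zfs", "send", "prod/projects@wrap"],
                        stdout=subprocess.PIPE)
subprocess.run(["zfs", "receive", "vault/projects"],
               stdin=send.stdout, check=True)
send.wait()

# Export the pool; the disks are now safe to pull and archive.
run("zpool", "export", "vault")
```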

The studio has 10Gb Ethernet (10GigE) fiber connections to all its workstations. "That allows us to get real-time 2K stereo and 4K files directly from shared storage to the artist," says Liesveld. "It also gives us the luxury of working with full-resolution plates rather than introducing proxies into the workflow."

While working on the first season of the Netflix series House of Cards, Savage was able to keep the entire show, along with all related elements and reference footage, online and available to artists from start to finish. "Prior to teaming up with Open Drives, we had to do a bit of digital juggling, archiving, and restoring shots and assets to manage space," he adds.

The VFX studio provided more than 300 shots for season one, including a CG library, greenscreen car shots, monitors, sky replacements, and a variety of other invisible effects.

"House of Cards was shot at 5k with the Red Epic camera, so considering the volume of work we had coming in, the ability to put together affordable and scalable storage was essential," says Savage.

Now that the first season is completed, Savage has moved the critical data to nearline storage, which is essentially the same as the facility's 50TB setup without the SSD cache in front of it. "We push recently wrapped data to our nearline storage and let it sit for a while before it gets fully archived. That way it's easy to access, and if we need to get data back on the production server quickly, we can," Liesveld says.
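A nearline policy like that reduces to a simple age-based sweep; a minimal sketch, with hypothetical mount points and an arbitrarily chosen settle period:

```python
import shutil
import time
from pathlib import Path

NEARLINE = Path("/mnt/nearline")   # hypothetical mount points
ARCHIVE = Path("/mnt/archive")
SETTLE_DAYS = 90                   # how long wrapped data "sits"; an assumption

def sweep_nearline() -> None:
    """Move nearline projects untouched for SETTLE_DAYS to the archive tier."""
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    cutoff = time.time() - SETTLE_DAYS * 86400
    for project in NEARLINE.iterdir():
        if project.is_dir() and project.stat().st_mtime < cutoff:
            shutil.move(str(project), str(ARCHIVE / project.name))
```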

The Need for Speed

With the rapid increase in content that digital filmmaking now generates, it's extremely important for filmmakers to optimize their creative time in post. As production of digitally acquired 3D films is becoming more commonplace, uncompressed digital content can translate into hundreds of terabytes per project, and the traditional tools used to manage the content are often overwhelmed by new demands of data acquisition, collaboration, distribution, and long-term, protective archiving.

A premier postproduction facility located in New Zealand, Park Road Post Production was developed by filmmakers for filmmakers, and has worked on some of the largest-budget Hollywood films, independent American and foreign films, and lower-budget New Zealand features and short films.

Pressing demands from digital postproduction required Park Road Post Production to dramatically increase its capacity and throughput. For example, on a recent project, the crew processed an average of 6TB to 12TB of new material each day, and on a really busy day, that figure could reach 20TB. What's more, all this new material needed to be processed and delivered to the client within 12 hours. This quick turnaround is important so filmmakers have the option of reshooting scenes before the sets are struck.
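The busiest case implies a demanding sustained rate; the arithmetic:

```python
# 20TB processed and delivered within a 12-hour window, as described above.
TB, HOURS = 20, 12
mb_per_sec = TB * 1e6 / (HOURS * 3600)
print(f"~{mb_per_sec:.0f}MB/s sustained")   # ~463MB/s, before any re-reads
```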

While evaluating various systems, the group at Park Road Post Production knew the solution had to be flexible and tailored to fit the facility - not the other way around.

After much due diligence, the studio execs decided to extend the infrastructure using Quantum's StorNext and Scalar i6000 for virtualized tape storage.

StorNext touches every part of the workflow, from ingest to archive, at Park Road Post Production. Now, source data is rapidly acquired into a StorNext environment, either on set or from field LTO-5 tapes, for collaborative processing via multiple SGO Mistika workstations. These workstations access source material concurrently over dual 8Gb/sec Fibre Channel. The source data and all metadata generated on set and derived through processing is automatically archived to LTO-5 tape via StorNext Storage Manager. Tapes are retained within the Scalar i6000 library for quick retrieval back to the shared storage pool for further processing.

Tapes are also "vaulted" from the library for long-term archive. With StorNext, the studio can rely on the software to take care of the heavy lifting of moving terabytes of data, allowing the crew to focus on improving creative processes.

Savage, meanwhile, started on House of Cards' second season this past summer.

Sums up Liesveld: "The last thing you want to worry about is, 'Do we have enough disk space?' because then you can't focus on the actual work."

Randi Altman is the former editor in chief of Post Magazine and a writer in the postproduction industry.