Imagining Zork
Issue: Volume 38, Issue 1 (Jan/Feb 2015)

An innovative course project combines cinema, animation, and game design

A classic game shrouded in mystery. A cold, deep west-central Wisconsin cave. Two professors who wanted to experiment a little bit.

It sounds like the makings of a Halloween thriller.

For Kevin Pontuti and Dave Beck, from the School of Art and Design at the University of Wisconsin-Stout, it was even better than that. They devised a semester-long, innovative course project that channeled the power of the game Zork, challenged their students, and used cutting-edge industry technology.

Pontuti and Beck brought students together from multiple classes (Digital Cinema Studio and Advanced 3D Modeling and Animation) to explore the convergence of cinema, animation, and games. Working in teams, students created an animated short film (or trailer) for Zork, one of the world's first interactive fiction games.

Zork, a text-based game created in the late 1970s at the Massachusetts Institute of Technology, represents the dawn of game design. The "engine" understands basic text commands, such as "Look Up," "Go East," or "Open Mailbox," and each command elicits a response from the game describing its outcome. The game was later divided into three parts and distributed by Infocom, and it still has a cult-classic following today. Although Zork includes no visuals at all, it remains popular in an age dominated by video games.
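The command/response pattern the article describes can be sketched in a few lines of Python. This is purely illustrative, not Infocom's actual engine, which used a far more sophisticated parser and world model; the commands and responses below are invented for the example.

```python
# Minimal sketch of a Zork-style command/response loop (illustrative only).
# Real interactive-fiction engines track rooms, objects, and game state;
# this toy version just maps a parsed (verb, noun) pair to a canned reply.
RESPONSES = {
    ("open", "mailbox"): "Opening the mailbox reveals a leaflet.",
    ("go", "east"): "You walk east into a dense forest.",
    ("look", "up"): "You see a clear blue sky.",
}

def respond(command: str) -> str:
    """Lowercase and split the typed command, then look up a response."""
    words = tuple(command.lower().split())
    return RESPONSES.get(words, "I don't understand that.")

print(respond("Open Mailbox"))  # -> Opening the mailbox reveals a leaflet.
print(respond("xyzzy"))         # -> I don't understand that.
```

Because the entire game lives in text like this, the player's imagination supplies all the visuals, which is exactly the gap the students' videos set out to fill.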

Pontuti and Beck decided to use Zork as a point of departure for the class project. "2015 marks the 35th anniversary of Zork, so it seemed like a perfect opportunity to celebrate the game," says Beck, an assistant professor and chair of the design department.

Many of the students in the class are game design majors, yet most had never heard of or played the game, so it was a great way to introduce them to this classic title, notes Pontuti, an associate professor and director of the entertainment design undergraduate program. Also, because the game is entirely text-based, with no visuals, it is very different from contemporary video games: the viewer/player has to imagine the world as it is explored and experienced, so the game functions essentially as an interactive script. Students, therefore, were required to create a video interpretation of the game.

A total of 20 students from the two classes worked for four months on the cross-course, interdisciplinary project, creating their pieces with live action and computer-generated imagery. (The videos can be seen at http://stoutcinemagames.wordpress.com/student-videos/.)

"Since we were framing the semester around the idea of cinema and game convergence, we decided to craft a project where the teams would create film trailers or hook scenes for a game that was being adapted into a film, or basically, a game-inspired short," says Pontuti.

One of the goals of the project was to have students collaborate in a way similar to the entertainment industry approach, with cinema, animation, and games converging.

"The idea to team-teach our cinema and 3D animation courses emerged," Pontuti says. "Students had to adapt to team positions (director, cinematographer, editor, 3D artist, game-engine scripter, and so forth) and learn how to best communicate and manage their time on these large, multi-step projects."

Special Interest Groups


Four student teams created visuals for the Zork game, including a CG group whose work appears above.

Once the project and theme were introduced, students were divided into teams. Given that some of the students were more interested in CG, VFX, or animation projects, while others gravitated toward traditional live action or experimental art films, the instructors split the teams along those lines. The goal was to give the students some creative flexibility and a variety of production options that would demonstrate a range of outcomes. In the end, there were four teams, ranging in size from three to nine students.

The previsualization team primarily comprised game design students, who were more comfortable working in the Unity game engine than in Autodesk MotionBuilder. Students created their animations using motion capture with a Microsoft Kinect sensor. The animations and camera work were then implemented in Unity using Cinema MoCap and the beta version of Cinema Director.

The traditional live-action team (director, cinematographer, editor) used contemporary live-action film techniques to create a four- to six-minute short. Meanwhile, an experimental live-action team (director, cinematographer, editor) took a more experimental, art-film approach to its own four- to six-minute short.

Students working on the live-action teams used Crystal Cave near Spring Valley, Wisconsin, as a filming location, after first considering the basement of a dorm slated for demolition, as well as miniature and full-scale sets. The students also expected a visit to the cave to yield good reference photos and textures for the CG team. Cave owner Eric McMaster donated cave rental time for location filming, though the students had to wait until spring, when the cave's resident bats were finished hibernating.

The CG team was a hybrid live-action CG and greenscreen group (director, cinematographer, editor, compositor, matte painter, character artists, environment artist, animators, and lighting designer) that created a trailer using Autodesk's Maya, Pixologic's ZBrush, The Foundry's Nuke, and Adobe's Creative Suite.

When the live-action groups began their work, the CG students started modeling characters, props, and environments for both VFX films and the previs project. Although there were discussions about possibly sharing assets, for the most part, the artists designed and built separate models for the films. Some of the students wore multiple hats, covering a variety of duties and even sometimes working for multiple teams as additional props and characters were added.

At the time, the school was piloting a new renderfarm solution, outsourcing the 3D rendering to service provider Render Rocket. "Since our rendering needs have more than quadrupled over the past four years, we've been looking for models to serve the courses' needs. Students were able to test-render the animations locally in our existing lab to check the artistic and technical details," explains Pontuti. "Once the tests were done and the files were double checked, the students could submit the renders through the Render Rocket Web interface."

Students also created matte paintings and utilized the school's greenscreen studio for augmenting some of their location needs.

Valuable Connections

Another key aspect of the project was industry cooperation. Pontuti and Beck worked with Cinema Suite, Inc., which provided use of its then soon-to-be-released Cinema Director software. Cinema Suite also provided its motion-capture program, Cinema MoCap.

Cinema Suite, a companion offering for the Unity game engine, allows a production team to previsualize all shots and sequences in a film. Traditionally, storyboards have been the primary previsualization tool for directors to help "imagine" their films. Many productions are moving toward 3D visualization, allowing the entire film to be viewed in a format that resembles a low-resolution 3D animated movie, Pontuti points out.

In fact, UW-Stout students began using Cinema Suite during its beta-testing phase, and provided product feedback to the technical team at Cinema Suite. "Our partnership with UW-Stout was critical for the future development of our cinematic tools. This collaborative project gave us in-depth feedback on our products and allowed us to make them even better for the general public," says Dan Gamsby of Cinema Suite.

In addition to the Cinema Suite software, Canon contributed to the project by providing students with a Canon C100 cinema camera for filming in Crystal Cave and for other aspects of the project.

Furthermore, Matthew Kuchta, director of the school's Physics Image Lab, let students borrow an ultra-high-speed camera used for studying movement and physics-based collisions, recording at 10,000 frames per second. 

"Based on our initial conversations, we decided to build a small miniature set where we could demonstrate some examples, including topics of how scale relates to movement and physics," says Pontuti. "Using a toy Transformer robot and some model-train components, we quickly mocked up a shot where the robot would slowly start to teeter and eventually fall into a dusty cloud of baby powder. After a few takes and some help from some canned air, we had a pretty decent effect that demonstrated the potential."

Pontuti, who teaches in the entertainment design program, and Beck, from game design and development, plan to continue with their cross-course projects in the future.