Moving and Tracking
All the video animation was motion-captured at Origami Digital’s own mocap studio. The facility is unique in that it is located inside Hotz’s home. It comprises a 20x30-foot volume (expandable to 40x40 feet) and a 48-camera PhaseSpace active LED-based system that provides data directly from the markers. Acquisition is faster and the system more affordable than traditional mocap offerings (see “Moving On Up” Parts 1 and 2, October and November 2007, respectively).
On the software side, Origami Digital used its proprietary real-time Loco solution, which the company developed a year ago but only made public with this project.
Even though the short videos are live action, CG plays a major role in them. Most of the soldiers are actors, while the creatures are digital. For the most part, the models were created in NewTek’s LightWave, while all the CG character animation was motion-captured using PhaseSpace and Loco.
“When shooting the mocap, we don’t look at the data or dots on the screen,” says Hotz, referring to the typical “dotted” stick figure many motion-capture facilities look at as a reference for their animation data. “We look at the actual character model moving around the environment in real time. We use Loco to drive that.”
The system integrates a rudimentary version of the performer and the environment, so the technicians and animators, as well as the director, can see the action unfold in context while the mocap is being shot. In this instance, Blomkamp, who was in New Zealand, could view the same screen as the Origami Digital crew, which was in California, during the capture. As a result, Blomkamp could make immediate judgment calls during the session, avoiding the need for a re-shoot.
Files are then saved and opened within a 3D application—in this case, LightWave.
For this project, Origami Digital used the motion-capture system in 35 to 40 shots across all three minis; no keyframe animation was used. Furthermore, the data comes out rather clean, requiring little touch-up. “Approximately 90 percent of the motions were never touched after the capture,” says Hotz of the Halo video animation.
Besides the creatures and Brutes, the team also created digital set extensions to the physical structures, including CG walls and bullet marks. They also crafted a 3D environment, vehicle, and passengers for a car-chase sequence in the last piece.
According to Hotz, the most technically challenging aspect of the project was the tracking (accomplished with Andersson Technologies’ SynthEyes), since most of the filming was done in a handheld style with no locked-off shots.
“When you look at some of the shots from a technical perspective, they are very difficult to track,” says Hotz. “The director cares about the creative, and we are always thinking about that and not necessarily about the technical aspect at the time of the shoot. If we need to, we’ll figure out a way around a problem later.”
To achieve the handheld camera effect for the CG shots, the group again used Loco. The software allows the cinematographer to hold a device resembling a real-world camera, which, when the operator looks through its viewfinder, shows the digital environment and characters rather than their real-world counterparts, explains Hotz. The mocapped actors can wear goggles, allowing them, too, to be immersed in the digital environment.
At the end of a take, Loco automatically writes out a Maya or LightWave scene containing the entire content of the capture. “It provides live-action people the ability to work in a completely digital world, using the tools and devices that they already know, without having to get accustomed to a digital workflow,” adds Hotz.
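Loco’s export format is proprietary and not described in the article, but the general idea of that hand-off step can be sketched: bake each captured channel into a scene file the animation package already understands. The sketch below, written as a minimal Python example, emits an illustrative Maya ASCII (.ma) fragment for captured joint channels; the node names, sample data, and helper functions are assumptions for illustration, not Loco’s actual output.

```python
# Hypothetical sketch of an end-of-take export: bake captured joint
# animation into a minimal Maya ASCII (.ma) fragment. Node names and
# sample data are illustrative only, not Loco's real format.

def write_ma_channel(joint, attr, keys):
    """Emit Maya ASCII for one animated channel.

    joint : transform node name, e.g. "hip"
    attr  : short attribute name, e.g. "tx" (translateX)
    keys  : list of (frame, value) pairs from the capture
    """
    curve = f"{joint}_{attr}_anim"
    return "\n".join([
        f'createNode animCurveTL -n "{curve}";',
        # keyframes as one flat list: frame value frame value ...
        f'\tsetAttr -s {len(keys)} ".ktv[0:{len(keys) - 1}]"  '
        + " ".join(f"{f} {v}" for f, v in keys) + ";",
        f'connectAttr "{curve}.o" "{joint}.{attr}";',
    ])

def write_ma_scene(channels):
    """Wrap the channels in a minimal .ma header plus transform nodes."""
    out = ["//Maya ASCII scene (illustrative fragment)",
           'requires maya "2008";']
    for joint in sorted({j for j, _, _ in channels}):
        out.append(f'createNode transform -n "{joint}";')
    for joint, attr, keys in channels:
        out.append(write_ma_channel(joint, attr, keys))
    return "\n".join(out)

if __name__ == "__main__":
    # Three frames of captured hip translation on X.
    print(write_ma_scene([("hip", "tx", [(1, 0.0), (2, 0.5), (3, 1.2)])]))
```

The payoff of this kind of export is exactly what Hotz describes: animators open an ordinary scene file in the package they already know, with the take’s motion baked onto the rig.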
In all, Origami Digital worked on 140 shots during a 2.5-month period. However, work on the first release was fast-tracked; the group had just two weeks for that video. “That was our concentration point,” says Hotz. Afterward, the group—comprising five artists—worked on the second and third shorts in parallel.
Given the limited staff and time, Hotz believes the group broke some new ground, particularly in methodology.
In the scene above, the environment (the hangar) is computer generated, as is the helicopter. The artists re-created the handheld camera look of the live-action videos in the virtual world using Loco, Origami Digital’s proprietary software.
“Normally such creature work takes a lot of people, but with our mocap pipeline, we were able to approach it differently,” Hotz says. “We were able to make the creatures do specific tasks by shooting and integrating the animation in 10 minutes because we do not have to go through all the steps that typical motion capture and larger facilities require.”
While the war between the humans and the Covenant has ended for Hotz and his crew, for many gamers, the third battle has just begun.