The visual quality of video games has improved dramatically with the arrival of Microsoft’s Xbox 360 and, soon, Sony’s PlayStation 3. In fact, some games—in particular, racing titles—approach the much-promised goal of photorealism. However, amid all the talk about HD, new shader models, and lighting, many consumers have come to realize that the animations still look distinctly last-gen.
This high rendering fidelity exposes low animation quality far more than previous consoles did. Characters look great in screenshots, but they can seem unnatural, clunky, and even robotic in motion.
A second problem has surfaced as well: after playing any game for a little while, users will notice that the animations always look the same. This didn’t matter so much when simple rendering signaled that we were merely playing a computer game. However, when a photorealistic Kobe Bryant repeats a move for the hundredth time with digital precision, something seems very wrong.
The reason for these two problems is that our game animation pipelines have not changed very much since the 16-bit days. Animators create keyframe animation or processed motion-captured clips, and the data is played back at more or less appropriate times in the game. There are two issues with this approach. First, animators have little creative control over the last section of the pipeline (that is, what their animations look like in-game). Second, played-back data can never be truly interactive.
The first problem usually manifests when animators walk over to the game’s animation programmer and complain that their carefully crafted animations look totally wrong in-game. The way to solve this, and to get in-game animation to the quality that the raw material already supports, is to provide animators with intuitive tools to perform the tasks currently carried out by animation programmers. Those include the determination of the transitions between animations, the nature and timing of blends, the distribution and weighting of different animations across the character’s body, and even the responsiveness of the animation to user or AI input. All of these are creative tasks and should therefore be under the full control of the animator, with very quick turnaround times.
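To make the programmer-side tasks above concrete, here is a minimal sketch of one of them: a crossfade transition that blends two poses over an animator-tunable duration. All names and the single-angle joint representation are illustrative assumptions, not any particular engine’s API; real systems blend quaternions per joint, but the structure is the same.

```python
def lerp(a, b, t):
    """Linear interpolation between two scalar joint channels."""
    return a + (b - a) * t

def blend_pose(pose_a, pose_b, weight):
    """Blend two poses (dicts of joint name -> angle) by one weight."""
    return {joint: lerp(pose_a[joint], pose_b[joint], weight) for joint in pose_a}

def crossfade_weight(elapsed, duration):
    """Smoothstep ease-in/ease-out, so the transition has no visible pop.
    'duration' is exactly the kind of value an animator should own."""
    t = min(max(elapsed / duration, 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

# Two single-frame stand-ins for full clips: a walk pose and a run pose.
walk = {"hip": 10.0, "knee": 25.0}
run  = {"hip": 30.0, "knee": 55.0}

w = crossfade_weight(elapsed=0.15, duration=0.3)  # halfway through a 0.3 s blend
pose = blend_pose(walk, run, w)
```

Per-body-part weighting (for instance, blending an aim animation onto the upper body only) is the same idea with a per-joint weight instead of a single one.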
The solution to the second problem (that is, non-interactivity and repetitiveness) lies in the CPU power available on next-generation hardware. Rather than pre-producing every clip beforehand, we can use the console’s processing power to synthesize animation on the fly, as the game unfolds. This, of course, is not straightforward, as it requires a full (and real-time) simulation of the 3D character (including body, muscles, and motor/nervous system), but it is now possible on the Xbox 360 and PS3. Not only does this approach increase the visual quality of the game, but it also enables unique moments that have never happened before, and will never happen again: every tackle is your tackle; every haymaker is your haymaker. Crucially, animators require full control over synthesis, too. In this case, rather than creating baked animation, they determine the style and parameters of adaptive behaviors. They therefore work at a higher level, akin to a director.
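The idea of simulated motors producing unique motion each run can be illustrated with a toy example: a single joint driven toward a target by a damped spring, where stiffness and damping play the role of the animator’s style parameters and small per-instance variation guarantees no two runs are identical. This is a sketch of the principle only, with assumed names and values; it is not NaturalMotion’s actual technique.

```python
import random

def simulate_reach(target, stiffness, damping, steps=200, dt=0.01):
    """Toy joint 'motor': a damped spring drives the joint angle toward a
    target. The trajectory is synthesized each run, not played back from
    stored data."""
    angle, velocity = 0.0, 0.0
    for _ in range(steps):
        accel = stiffness * (target - angle) - damping * velocity
        velocity += accel * dt
        angle += velocity * dt
    return angle

# Each character instance gets slightly different 'muscle' parameters,
# so every reach looks subtly different while hitting the same goal.
rng = random.Random(42)
stiffness = 80.0 + rng.uniform(-5.0, 5.0)
final = simulate_reach(target=1.0, stiffness=stiffness, damping=12.0)
```

In a real system the simulated body is a full articulated rigid-body skeleton with many such motors, plus balance and contact handling; the directorial control the article describes is exactly the tuning of parameters like these.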
The upshot of all of this is that animation will become the predominant quality determinant of next-gen game visuals and experiences. The role of the animator will grow significantly, from keyframing to controlling in-game blends, to directing motion synthesis. There’s never been a better time to be in this industry.
graduated with a degree in biology from Oxford University and holds a master’s degree in evolutionary and adaptive systems from
. Prior to founding NaturalMotion, he was researching for a PhD in complex systems at
, and in 2003, was named one of the world’s top 100 innovators by MIT’s Technology Review. Along with CTO and co-founder Colm Massey, he developed NaturalMotion’s core technology. Based on
research on the control of body movements, NaturalMotion’s Euphoria synthesizes 3D character animation in real time on the PS3, Xbox 360, and PC, creating unique game moments and previously unachievable interactivity. NaturalMotion’s other DMS product, Endorphin, creates offline animation an order of magnitude faster than traditional techniques, and is used in the film and gaming industries.