New Coke Classic
Issue: Volume 35 Issue 3 April/May 2012


It took nearly 70 years for the beloved Coca-Cola polar bear characters to migrate from their first print ad to their first CG-animated television spot. It took nearly 20 more years for the computer-generated characters to make the move to real time, but the results were well worth the wait.

This impressive feat, which merged a live digital stream and social media activity with traditional advertising, took place during one of the biggest events of the year: the Super Bowl. Coca-Cola has made some big moves during the big game over the years, but never with the bears, which have mostly appeared in commercials during the Christmas holidays and, occasionally, the Olympics. That is, until this year, when the characters threw their own Coca-Cola Polar Bowl Party and invited more than a hundred million viewers to join them.

The lovable, iconic bears starred in the multi-discipline ad campaign that spanned television, the Internet, and social media. The traditional segment contained three television commercials (“Argh,” “Catch,” and “Superstition”) featuring two adult bears, each sporting a scarf in the team colors of either the New England Patriots or the New York Giants. The commercials, conceived by ad agency Wieden + Kennedy and created by Animal Logic, were playful and fun, and done in the recognizable visual style of past Coke polar bear commercials.



The big news, though, was not the actual TV commercials. It really wasn’t the social media portion per se, either. Rather, it was the four and a half hours of live-streamed animation that spanned the entire game as well as the pre-game activities and post-game celebrations. Shown on www.CokePolarBowl.com during the evening of the game, the live stream “occurred” from within the bears’ cave, as the polar bears, seated on a couch fashioned out of snow, watched and reacted in real time to their team’s performance. On occasion they were joined by friends (mischievous penguins) and family (one bear’s young son).

Not only did the bears and environment look the same in the streamed version as they did in the pre-rendered commercials, but the action and animation also coincided with the traditional spots. For instance, when the Patriots fan bear, unhappy with the way the game was going, got up from the couch and left the cave in the TV spot “Catch,” the stationary cave camera in the streamed piece also showed the bear getting up and leaving, then captured the remaining character occasionally glancing toward the doorway, wondering when the other bear would return. Likewise, at the end of the commercial, after the Patriots bear makes a spectacular saving catch of a Coke thrown by a group of bears outdoors, the camera pulls back to show the Giants bear on the couch inside the cave, nodding its head and clapping in appreciation of the catch; at the same moment, the live-streamed animation showed the head nod and the clap as well. So for fans using a second screen (a computer) for related content during the game, the action was never broken between the various media.

And it was not just the Coke commercials that provoked a reaction from the bears. “If something exciting was going on in one of the commercials, we would cover it in our live stream—so it would feel like the TV spots were somewhat live,” says Framestore’s Digital Creative Director Mike Woods.

The illusion continued via the social media campaign. In another Coca-Cola first, the real-time bears took over Coke’s @CocaCola Twitter account, with the characters tweeting using the hashtag #GameDayPolarBears; they also interacted with fans live via the Coca-Cola Facebook page, responding to requests, displaying messages, and even reacting to the traditional commercial breaks during the game. At times, the tweets encouraged viewers to send photos of their Super Bowl parties in progress, and the live-streamed bears incorporated the pictures, as well as messages, into the presentation, holding up a tablet computer that displayed them, for instance.

As trendy and technologically savvy as the bears were, the inspiration behind the integrated media blitz was the ad team at Wieden + Kennedy, while VFX facility Framestore and game developer Blitz Games Studios were responsible for the technology that made possible the cutting-edge, real-time animation in this game-changing campaign.

Two Worlds Converge
The digital platform that brought the computer-generated bears to life and allowed fans to follow their reactions throughout the evening employed proprietary technology developed by the two independent UK-based studios, Framestore and Blitz. The bears, an aesthetic match to their pre-rendered counterparts for visual continuity, were animated, or puppeteered, live via conventional video game controllers.

The core technology capitalized on Framestore’s expertise in high-quality linear characters and animation, and Blitz’s expertise in real-time rendering and multiplatform game engines. “Blitz fundamentally understands performance (Blitz is a character-based game studio),” says Woods. “This is not just about computer graphics and gameplay. They have an in-house development team that studies facial muscles, emotions. They have a different way of working compared to other game companies, and it fits with what we do with character design.”

It was Framestore’s Woods who initially came up with the idea of melding the two sectors by taking characters from the film and television worlds and putting them in a game engine so they could be controlled in real time with game console controllers for advertising or other purposes. “We take care of the digital side of campaigns for clients, and we often have to match content from TV spots with Web-site content,” says Woods, who notes that he was looking for technology that offered far more than what could be done in Flash, which the studio was using at the time.

Woods shared his vision and concept with Blitz, inquiring whether the developer could start prototyping a game engine that could indeed run one of Framestore’s characters, with the goal of creating a high-end real-time interactive character. “We realized that the game world was slightly behind the film world when it came to what we could do with visual effects and animation,” he says. “Grand Theft Auto and Modern Warfare still do not look as good as Harry Potter and Batman, but they are not that far behind when you are dealing with CG characters, not photoreal humans, as the starting point.”

Blitz’s R&D Art Director Jolyon Webb was intrigued. “Our stock in trade is producing fully reactive characters, though they didn’t have as high a degree of polish and finish as those in a high-end TV commercial or film,” he says. “We decided to combine our synergies [with Framestore] to see if we could make an interactive character that matched as closely as possible a rendered character you would see on TV and gave a consistent, adaptive performance.”

Once Framestore and Blitz got the technology working, Woods contacted a former colleague who was working at Wieden + Kennedy and showed him the prototype. A few months later, Woods received a call from the agency, which was interested in using the technology for the Coke campaign. So in August, Woods began working even more closely with Webb to turn their concept into technology that could be used for this specific project. Adding to the project’s complexity was the agency’s desire to match the characters and their performances to those in the television commercials that would air during the broadcast.

Blitz took the reins of the Coca-Cola real-time animation concept, designing the system that would run the streamed segment of the ad campaign. “The challenge was too good not to have a go at it,” notes Webb, whose R&D team, like Woods’s, is always on the hunt for new options and opportunities for its technology.

Real-time Moves
According to Webb, the digital system had to handle two types of animation: longer performance-based segments of pre-generated content, and the reactive real-time puppeteering—both of which would be fed and controlled through the Blitz-developed game engine.

To get an understanding of the types of character reactions that might be needed for the streamed experience, the artists reviewed footage from previous Super Bowls. While the bears would be reacting to unpredictable situations on the field, the animators also discovered that there was quite a lot of obvious, expected action, such as the reactions associated with time-outs, first downs, touchdowns, and so forth. Since Madonna had already been signed for the halftime show, they knew to prepare a “Vogue” dance, as well.

Framestore assumed responsibility for the pre-generated bespoke animations, which were less crucial to the immediate beats of the football game but important for adding variety to the content during the four-plus hours it appeared on the Web. These included nearly 25 minutes’ worth of non-looping animations pertaining to the coin toss, the National Anthem, and so on.

“We packed the engine with content that we knew would be applicable, and then we packed the engine full of fun fluff to produce a party atmosphere, with the bears walking around the room, with baby bears that wander in and dance, and penguins that bother the bears by continually stealing their popcorn,” says Woods. “There was constant opportunity to bring in the pre-generated content on a whim.”
Blitz, meanwhile, devised the technology and the components for the engine, as well as the game-like animations that would be processed within the engine for the puppeteered action (postural shifts, mood adaptations, gestures, and so forth; the Patriots bear shaking its head in frustration, for example) that would coincide with the immediate beats of the game.
 
“There is nothing in there that we could have gotten from another game company,” says Woods of the BlitzTech engine.

Although Blitz developed technology specifically for this project, at the core is the BlitzTech game engine. According to Webb, the R&D group at Blitz develops small, modular components that are added to the engine as needed for each new project. For this one, they organized the flow of information that allowed the bears to change moods, and developed the components (eye blinks, the way the bears gaze at objects) that kept the characters interesting and alive.



“It let us set up control states to match the operations we wanted,” explains Webb. “We also created some new components, especially on the rendering side, to match the Animal Logic renders [in the commercials].”
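To make the idea of those keep-alive components concrete, here is a minimal, hypothetical C++ sketch of an eye-blink timer and a gaze controller that eases the head toward a target each frame. The names (Pose, BlinkComponent, GazeComponent) are invented for illustration; they are not BlitzTech code, just one plausible shape for such components.

```cpp
// Sketch of two "keep the character alive" components: a randomized eye blink
// and a smoothed gaze. Illustrative only; not the BlitzTech API.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>

struct Pose {
    float eyelidClose = 0.f;   // 0 = eyes open, 1 = fully closed
    float headYaw     = 0.f;   // radians, current head direction
};

class BlinkComponent {
    std::mt19937 rng{42};
    std::uniform_real_distribution<float> interval{2.f, 6.f}; // seconds between blinks
    float timeToNextBlink = 1.f;
    float blinkPhase = -1.f;                                  // < 0 means not currently blinking
public:
    void update(float dt, Pose& pose) {
        if (blinkPhase < 0.f) {
            timeToNextBlink -= dt;
            if (timeToNextBlink <= 0.f) { blinkPhase = 0.f; timeToNextBlink = interval(rng); }
        }
        if (blinkPhase >= 0.f) {
            blinkPhase += dt / 0.15f;  // each half of the blink (close, open) lasts ~150 ms
            // Triangle curve: eyelid closes as phase goes 0 -> 1, opens as it goes 1 -> 2.
            pose.eyelidClose = 1.f - std::abs(1.f - std::min(blinkPhase, 2.f));
            if (blinkPhase >= 2.f) { blinkPhase = -1.f; pose.eyelidClose = 0.f; }
        }
    }
};

class GazeComponent {
    float targetYaw = 0.f;
public:
    void lookAt(float yaw) { targetYaw = yaw; }   // e.g., glance toward the cave doorway
    void update(float dt, Pose& pose) {
        // Exponential ease toward the target so the head never snaps.
        float k = 1.f - std::exp(-dt * 4.f);
        pose.headYaw += (targetYaw - pose.headYaw) * k;
    }
};

int main() {
    Pose pose;
    BlinkComponent blink;
    GazeComponent gaze;
    gaze.lookAt(0.6f);                              // look toward the doorway
    for (int frame = 0; frame < 120; ++frame) {     // simulate ~2 seconds at 60 fps
        blink.update(1.f / 60.f, pose);
        gaze.update(1.f / 60.f, pose);
    }
    std::printf("eyelid=%.2f headYaw=%.2f\n", pose.eyelidClose, pose.headYaw);
}
```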

In essence, the “puppeteering” that produced the real-time animation was like playing a video game. Each bear was controlled in a so-called relative state by a person using the joystick and buttons on a gaming joypad controller. “We could make him look anywhere and put him in various states,” explains Webb. Each bear had a positive, a neutral, and a negative state, which reflected its mood based on its team’s performance at the time. Pushing the joystick up or down put the bear into the various states. The controller buttons, meanwhile, triggered reactions (strong, medium, or weak) according to the state. A strong positive reaction might prompt the bear to stand up, clap its hands, and settle back down onto the sofa; a strong negative reaction could make the character grasp its head in its hands and sigh.

“As a user, the only two determinations you need to make are, ‘What’s my state?’ and ‘Am I making a strong reaction?’” says Webb.
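As a rough illustration of that two-part control scheme, and only as an illustration since the article does not show Blitz’s code, the mapping can be thought of as a tiny state machine: the stick selects the bear’s mood, a button selects a reaction strength, and the pair indexes a table of reaction clips. The clip names below are invented.

```cpp
// Sketch of the two-axis puppeteering scheme: joystick sets the mood state,
// buttons pick a reaction strength, and the pair selects a reaction clip.
// All names are hypothetical; this is not BlitzTech code.
#include <cstdio>

enum class Mood { Negative, Neutral, Positive };
enum class Strength { Weak, Medium, Strong };

// (mood, strength) -> reaction clip to play over the seated idle pose.
static const char* reactionClip(Mood mood, Strength s) {
    static const char* table[3][3] = {
        //  Weak            Medium             Strong
        { "sigh",          "shake_head",      "head_in_hands" },  // Negative
        { "glance_at_tv",  "shift_on_couch",  "lean_forward"  },  // Neutral
        { "smile_nod",     "clap_seated",     "stand_up_clap" },  // Positive
    };
    return table[static_cast<int>(mood)][static_cast<int>(s)];
}

struct BearPuppet {
    Mood mood = Mood::Neutral;

    // Joystick up/down nudges the bear between negative, neutral, and positive.
    void onStick(float y) {
        if (y > 0.5f)       mood = Mood::Positive;
        else if (y < -0.5f) mood = Mood::Negative;
        else                mood = Mood::Neutral;
    }

    // A face button triggers a reaction whose flavor depends on the current mood.
    void onButton(Strength s) const {
        std::printf("play clip: %s\n", reactionClip(mood, s));
    }
};

int main() {
    BearPuppet patriotsBear;
    patriotsBear.onStick(-1.0f);              // team is struggling: negative state
    patriotsBear.onButton(Strength::Strong);  // -> "head_in_hands"
    patriotsBear.onStick(+1.0f);              // touchdown: positive state
    patriotsBear.onButton(Strength::Strong);  // -> "stand_up_clap"
}
```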

At any point, the Framestore pre-generated animations in the game engine could be triggered via a Web interface. For instance, if the Patriots bear was jumping up and down after a touchdown (which was puppeteered), a pre-generated animation of the baby bear wandering into the room could be triggered from the user interface. “There was no visible difference between our pre-generated clips and the emotional content driven by the game controllers,” he says.
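One plausible way to picture why no seam was visible, offered here as an assumption about the general approach rather than a description of the actual engine, is a simple two-source blend: while a triggered pre-generated clip plays, it crossfades over the puppeteered pose and then hands control back when it finishes.

```cpp
// Hypothetical blend between a controller-driven pose and an operator-triggered
// pre-generated clip; short crossfades at both ends hide the hand-off.
#include <algorithm>
#include <string>

struct AnimState {
    std::string activeClip;    // empty = fully puppeteered
    float clipTimeLeft = 0.f;  // seconds remaining in the triggered clip
    float blendWeight  = 0.f;  // 0 = controller pose, 1 = pre-generated clip
};

// Called when the operator triggers a clip from the web interface.
void triggerClip(AnimState& s, const std::string& clip, float duration) {
    s.activeClip   = clip;
    s.clipTimeLeft = duration;
}

// Per frame: fade toward the clip while it plays, fade back when it ends.
void update(AnimState& s, float dt, float fadeSeconds = 0.25f) {
    bool clipPlaying = s.clipTimeLeft > 0.f;
    if (clipPlaying) s.clipTimeLeft -= dt;
    float target = clipPlaying ? 1.f : 0.f;
    float step   = dt / fadeSeconds;
    s.blendWeight += std::clamp(target - s.blendWeight, -step, step);
    if (!clipPlaying && s.blendWeight <= 0.f) { s.blendWeight = 0.f; s.activeClip.clear(); }
}

int main() {
    AnimState bear;
    update(bear, 1.f / 30.f);                               // controller-driven only
    triggerClip(bear, "baby_bear_wanders_in", 6.f);         // operator fires a clip
    for (int i = 0; i < 300; ++i) update(bear, 1.f / 30.f); // clip plays, then fades back out
}
```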

One person from the agency was assigned to puppeteer each bear, with a third person, a content director, triggering the longer pre-generated segments via an interface created by Blitz. A creative director called out when animations should run and which ones; he also interfaced with the social media team. “It was all perfectly coordinated from the same room,” says Woods.

The real-time puppeteering was done on a PC containing a high-end graphics card and running the BlitzTech game engine; the two game controllers were also connected to the PC. Much heavier computing power was required for the live streaming, however. To this end, nearly 30 people involved in the project camped out at Major League Baseball’s headquarters in New York City, which has one of the world’s largest collections of streaming servers. “We knew we needed the capability to stream content for a few hours, and we had to be sure we could accommodate the fans following the real-time action on the Internet, Facebook, and Twitter,” says Woods.

Build A Bear

For years, there has been a discernible difference between real-time and pre-rendered content. However, to successfully pull off the Coke mixed-media campaign, the CG Coca-Cola polar bears featured in each segment had to look and feel like the same bears.
 
Animal Logic, which created the pre-rendered TV commercials, set the visual target for all the work, sending its assets (models, meshes, and lighting and texture references) to Framestore and Blitz Games to match as closely as possible. This task was especially difficult since the main characters were covered in fur, which is hard to achieve in the pre-rendered world and an even bigger challenge in the real-time world, but it was accomplished through teamwork.

With its experience in creating game imagery, Blitz modeled the bears and the cave environment and generated the textures. Blitz also created the short-beat animations, while Framestore crafted the longer, pre-generated segments.

Basically, the models and scenes were created like those in a typical video game, so they could be rendered in real time. Yet the processes were quite different from those used for pre-rendered work. For instance, the shape of the fur was built into the mesh of the model, “which is unheard of in the pre-rendered world,” Woods says. The textures were read into the game engine, as was the shading, which was projected live from within the engine. Lighting was also done in engine, and rendering happened on demand there, not hours later via a renderfarm.

The team at Blitz modeled the bear meshes in Autodesk’s Maya, using a fairly standard setup and straightforward bipedal character rigging. They also used Adobe’s Photoshop for texturing and Pixologic’s ZBrush for sculpting. The shader surfaces were created with in-house software. “We have our own suite of tools, including the main engine, which has proprietary production tools, including a shader editor,” says Jolyon Webb, Blitz’s R&D art director.

For the fur, Blitz used a well-established game technique called “fur shells,” whereby the surface is rendered repeatedly, each pass pushed farther from the base layer, with a noise pattern alpha-layered over the surface to produce dots. As each successive layer is rendered, the alpha is eroded a little more, so the dots get thinner and thinner and eventually taper away to nothing. The dots stack on top of one another, akin to building up a stack of coins, until a strand of hair appears.
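On the GPU the shells are drawn as extra passes over the mesh, but the erosion math itself is simple enough to sketch on the CPU. The hypothetical C++ snippet below (not Blitz’s shader code) shows the core idea: each shell is pushed a little farther along the surface normal, and the alpha-test threshold rises with each shell, so only the strongest noise values survive to the outer shells and stack into strands.

```cpp
// CPU sketch of the "fur shells" idea: the same surface is drawn N times, each
// shell offset farther along the normal, and a per-texel noise value is
// alpha-tested against a threshold that rises with the shell index. Texels
// that keep passing the test stack up like coins into visible strands.
// Illustrative only; a real implementation renders the shells on the GPU.
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const int   numShells = 16;     // draw passes over the same mesh
    const float furLength = 0.05f;  // offset of the outermost shell, in model units
    const int   numTexels = 8;      // tiny 1D "noise texture" for the demo

    // Random noise stands in for the fur density texture.
    std::mt19937 rng(7);
    std::uniform_real_distribution<float> uni(0.f, 1.f);
    std::vector<float> noise(numTexels);
    for (float& n : noise) n = uni(rng);

    for (int shell = 0; shell < numShells; ++shell) {
        float t         = float(shell) / float(numShells - 1); // 0 at the skin, 1 at the fur tips
        float offset    = t * furLength;  // push vertices along their normals by this much
        float alphaTest = t;              // rising threshold erodes the dots on outer shells

        std::printf("shell %2d  offset %.4f  visible texels:", shell, offset);
        for (int i = 0; i < numTexels; ++i)
            if (noise[i] > alphaTest)     // texel survives the alpha test on this shell
                std::printf(" %d", i);
        std::printf("\n");
    }
}
```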

“There was a lot of work to optimize the characters so they would run in the engine, but if you look at our models and theirs, they really do look the same,” says Mike Woods, Framestore’s digital creative director, about the comparison between Animal Logic’s pre-rendered assets and the Framestore/Blitz assets rendered in engine.

Not only did Framestore and Blitz have to worry about the outward appearance of the models, but also about what’s inside (in particular, the rig) in order to animate the characters properly. As Woods points out, the animators were restricted to using the actual rig; these rigs could not move around or change size as they do in the pre-rendered world. “There is not as much flexibility,” says Woods, noting that a significant amount of R&D occurred between Framestore and Blitz to devise a rig that could be used in the game engine and that Framestore was comfortable with.
 
“There was definitely a learning curve for the animators here in that regard,” says Woods. He notes that the FK and IK systems in the rig are separate, requiring the team to blend animations and splines in the neck and arms differently than they usually would. “We’d get clipping on the neck if we went too far, and it would overlap the shoulder, and we would have to blend that in. In a pre-rendered world, that doesn’t happen,” he adds. In the end, the artists added several FK and IK systems to the rig, but once the models were up and running, the artists could animate them in Maya as they would pre-rendered models, and deliver the Maya scenes to the game engine.

“It was a nice pipeline; there was no rendering involved,” says Woods. “We would have a Maya scene of a bear jumping up and doing a touchdown dance, and once we were happy with how it looked on the Blitz rig, we would send it off to Blitz, and they would put it into the game engine for us and update the game-engine build we had here in the office. We could then see it textured immediately, so if there was clipping on the neck or if we extended the neck too far, we could adjust it based on the feedback.”

According to Webb, one interesting challenge of producing a real-time character to match a TV spot was that on the TV side, things change up until the last minute. For games, though, a working system has to be maintained the entire way through production. “The bears continued to evolve over the four months we worked on them, and Animal Logic didn’t finalize their assets until about two weeks before the game,” he says. “We had to constantly keep our assets up to date, and each asset revision could mean a complete change in the rigging, skinning, and hierarchies, and then we’d have to go back and repurpose existing animations and make sure the nodes that were important to some of the procedural systems would adapt to the changes. We would overlay the basic animation rig data with the informational nodes that were hooked to the game engine, the kind of things you would not worry much about in an edited video.”

Of course, the look did change a good deal. As Webb notes, the direction from the agency was clear and unchanging: The bears had to look engaging and constantly alive. The specifics of the bears—their size, proportions—however, were malleable and changed over the course of the project.
 
“With a film, you spec out shots and know what you need, and you keep the sequence as a unit, so if you have to revise it later, you go back to that one animation (a character getting up out of a chair and walking across the room, for instance) and make the change,” Webb explains. “With real time, you have to produce a lot of content that is blended together. So you might have walk loops, getting-out-of-a-chair loops, head-turning code: 15 to 20 elements that fit together to produce the same movement. So if the shoulders of a character suddenly get wider, we have to cascade that change through all the components that will be influenced by the change. It’s a typical game problem, and it surprises people if they have not encountered it before.”

–Karen Moltenbrey


Tip of the Iceberg
Although the project used retro characters, the initial test was done with a more recent, more detailed CG character. “The key is that the audience should not notice the difference between real time and pre-rendered. These bears have a lovable, cartoonish quality, and physiologically they are different from real bears, but that does not mean this technology can’t stand up to a character that is more realistic,” says Woods.

Woods and Webb believe that real-time CG rendering such as this has the potential to be a massive game changer, if you will, for visual effects and animation. “Real-time rendering is the future and spells the end for time- and data-consuming renderfarms. The possibilities are endless,” says Woods.

For the Coke project, it was folks from the ad agency who controlled the characters, streamed directly from the Blitz engine. But with cloud gaming and remote rendering, the control could easily be shifted to fans. “It can be done now, and it’s a big step up from how things like FarmVille look and play at the moment,” notes Woods. In fact, Woods’ Framestore team is currently working on a prototype for remotely rendered content so people can play with branded characters through an Internet connection and their own keyboard.



“We’ve hardly scratched the surface in terms of what can be achieved,” says Woods.

Webb agrees, adding, “The interesting stuff will come from real-time characters. Film and TV will continue to push the boundaries of CG work, but it is beginning to plateau. It is already spectacular—look at Avatar and John Carter. But they also closely match other fantastic movies. It is not a personal experience,” he says. “Film plays all the way through and you go along for the ride. The interesting new stuff is not going along with someone else’s ride, but directing it yourself or adapting in response to other things happening in the world. If digital characters give appropriate responses and attention to what’s happening in the real world, that is massively powerful.”

Karen Moltenbrey is the chief editor of Computer Graphics World.