Spot On
Issue: Volume 38, Issue 2 (Mar/Apr 2015)



Television commercials use VFX to make creative customer pitches

Sometimes TV commercials use celebrity spokesmen to get viewers' attention. Sometimes they use comedy. And, sometimes they incorporate amazing animation and stunning visual effects.

Today, many VFX facilities that handle high-end film work also have a commercials division: Digital Domain, The Mill, Moving Picture Company (MPC), and Framestore are but a few. Often, the visual effects teams working on advertising projects face bigger hurdles than their feature-film counterparts, with tighter budgets and shorter deadlines. And although the projects are short form, usually 30 to 60 seconds, they are long on creativity and pizazz.

The following are some recent spots that push the creative boundaries with the help of computer-generated imagery.

IKEA’s The Joy of Storage

It’s not unusual to give anthropomorphic attributes to all manner of creatures. But, in the commercial “The Joy of Storage” for IKEA, visual effects artists at MPC Advertising were challenged with applying animal characteristics – specifically those of migratory birds – to T-shirts.
The commercial follows a flock of shirts, which behave like a flock of Canada geese, in their quest to find a new home. Their journey, while epic in scope, is far from easy. It starts out in the Arctic tundra, as the characters are seen high in the sky, over mountaintops barely touched by a dull sun. As they land on the snowy ground, viewers learn for the first time that these are shirts, not birds. The characters huddle together to stave off a cold, harsh wind, before taking off again. The landscape beneath them eventually begins to show splotches of green, as the characters push onward, even through a downpour. They rest momentarily by the sea and later in a field. They are chased by a child, then by a dog, and eventually find sanctuary in an IKEA wardrobe within a city apartment.

“It’s a visual spectacle as well as an emotional plight – you really feel for the characters,” says MPC’s Diarmid Harrison-Murray, visual effects supervisor and CG lead on the project. On average, a team of eight CG artists and five compositors handled the work, completing more than 35 VFX shots in just three weeks.

To sell the concept, the shirts had to feel realistic and the animation had to be spot on. So the modelers and animators spent a good amount of time reviewing film of actual flocks and their shapes, slow-motion clips of birds in flight, and so forth.
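
The article does not detail MPC’s animation setup, but the kind of take-off-and-land flock motion described here is often previsualized with classic Reynolds-style “boids” rules (separation, alignment, cohesion). A minimal, purely illustrative sketch of that idea, not MPC’s actual approach:

```python
# Illustrative only: classic "boids" flocking rules (separation, alignment,
# cohesion) -- one common way flock motion like this is previsualized.
# The article describes hand animation informed by reference footage,
# so this is NOT MPC's actual setup. All values are arbitrary.
import numpy as np

N = 40                                  # number of "shirts" in the flock
pos = np.random.rand(N, 3) * 10.0       # starting positions
vel = np.random.randn(N, 3) * 0.1       # starting velocities

def step(pos, vel, dt=0.04, radius=2.0,
         w_sep=1.5, w_ali=1.0, w_coh=0.8, max_speed=2.0):
    new_vel = vel.copy()
    for i in range(N):
        offsets = pos - pos[i]
        dists = np.linalg.norm(offsets, axis=1)
        near = (dists < radius) & (dists > 1e-6)
        if near.any():
            sep = -(offsets[near] / dists[near, None] ** 2).mean(axis=0)
            ali = vel[near].mean(axis=0) - vel[i]
            coh = offsets[near].mean(axis=0)
            new_vel[i] += dt * (w_sep * sep + w_ali * ali + w_coh * coh)
        speed = np.linalg.norm(new_vel[i])
        if speed > max_speed:            # clamp so the flock stays coherent
            new_vel[i] *= max_speed / speed
    return pos + new_vel * dt, new_vel

for _ in range(250):                     # advance the flock a few seconds
    pos, vel = step(pos, vel)
```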



MPC animated these CG shirts to resemble a flock of birds as they take off and land.

Nearly all the T-shirts and their movement are computer-generated; the rest, such as one close to the camera in the rainstorm, were puppeteered by Blink.

Director Dougal Wilson liked the performance he was getting with the puppets on the ground during preproduction, so the MPC artists infused a puppeteered feel into the way they moved their digital characters. “We got some rough video while they were developing the puppet rig – we could see where it was articulating, where the wing curve was – so we could have our [CG] rig behave in a similar way,” Harrison-Murray says.

The rig, Harrison-Murray notes, was deceptively simple and difficult to get right. The animators had to control it in flight, where the characters appear to have arms and wings, and on the ground, where they step, and then blend between these motions. “We spent a lot of time on the rig and the cloth simulation on top of it, which was key in terms of creating this type of character made of cloth,” he explains. “We went through many iterations and continued to tweak the rig during production.”

Following the shoot, the team continued to squeeze more and more emotion out of the characters. “It was clear from the director that emotion was key,” Harrison-Murray says. “He didn’t want a cold narrative.”

The characters were modeled and animated in Autodesk’s Maya. Texturing was done mainly with Adobe’s Photoshop, in addition to Mari from The Foundry. Maya nCloth was used for the simulation of the shirt material. All the rendering was done in Solid Angle’s Arnold; compositing in The Foundry’s Nuke. For the majority of the tracking, the group used Science-D-Visions’ 3DEqualizer.
In shots where the CG T-shirts became wet, such as when some of the characters land on and take flight from the river and sea, the artists added digital water interaction using Side Effects’ Houdini.

“I believe we achieved the goal of finding a way to suspend a person’s disbelief, and a lot of it hinged on how much to dial in the cloth effects and how much we would fill in the underlying animation of the T-shirts,” says Harrison-Murray. “We didn’t want them flapping around in the wind and not have any muscular power to them. But, too little and it would have looked like a body without a head. There was a thin line to make it feel like a dynamic piece of cloth.”
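
The dial Harrison-Murray describes comes down to blending the simulated cloth result back toward the keyframed animation underneath it. A minimal sketch of that kind of per-vertex blend, with hypothetical data and weights (not MPC’s actual rig):

```python
# Hypothetical sketch: blend a cloth-simulated mesh back toward the
# keyframed animation beneath it. A weight of 0.0 keeps the pure
# "muscular" animation; 1.0 keeps the raw sim ("flapping in the wind").
import numpy as np

def blend_cloth(anim_verts, sim_verts, weight=0.6):
    """anim_verts, sim_verts: (num_vertices, 3) arrays for one frame."""
    return (1.0 - weight) * anim_verts + weight * sim_verts

# Example with placeholder data: a 2,000-vertex shirt on a single frame.
anim = np.random.rand(2000, 3)
sim = anim + np.random.randn(2000, 3) * 0.01   # sim deviates slightly
blended = blend_cloth(anim, sim, weight=0.6)
```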

Unlike the characters, the backgrounds are natural. The spot was filmed in multiple locations throughout England as well as Sweden and Scotland, although some of the landscapes were graded to appear more winter-like.

“You can’t help but feel the suffering of the T-shirts and their drive to keep on flying,” says Harrison-Murray. “It takes a great animator to get that kind of magic out of a T-shirt.”

Nissan’s Winter Allies

So-called bad weather can take many forms: sleet, ice, snow, hail…. In a commercial for Nissan, the manufacturer’s crossover vehicles each take on a harsh road condition that manifests in the form of a CG character. “We were asked to create four different monsters that represent bad weather: potholes with teeth snapping up at the cars, a mud monster emerging from the saturated earth, a snowy tree monster, and an ice-laden bridge with cables snapping and waving like tentacles,” says Vicky Osborn, MPC Advertising’s VFX supervisor and CG lead on the spot.
The commercial that aired during the Super Bowl featured all four scenarios; it was a compilation of three separate commercials, plus the pothole sequence created specifically for the Super Bowl version.

The majority of the work was done in MPC’s New York studio with a team of approximately five CG artists and four compositors using The Foundry’s Nuke and Autodesk’s Flame as they tackled the individual spots. MPC’s LA studio handled most of the digital work for the icy bridge sequence. The 60-second compilation contains close to 100 VFX shots.

In all the encounters, the Nissan vehicles are real, as are the basic environments shot on location. The CG artists later made the real environments more treacherous and built the location-specific monsters that interacted with the practical vehicles.

All the modeling and animation was done in Autodesk’s Maya. Nuke and Flame were used for compositing the effects, Vicon’s Boujou for tracking. Rendering was accomplished with Nvidia’s Mental Ray, Side Effects’ Mantra, and Solid Angle’s Arnold. Side Effects’ Houdini provided the simulations that appear in each encounter. The monster shapes were exported from Maya into Houdini. For the mud monster sequence, the team also exported into Houdini the Maya animation showing the beast grabbing at the car, making it seem as if the mud was moving across the car’s surface.   
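
The article doesn’t say which cache format carried the Maya animation into Houdini; Alembic is a common choice for this kind of handoff. A hedged sketch of that step, with made-up node and file names:

```python
# Hypothetical sketch of a Maya-to-Houdini handoff via an Alembic cache.
# The node name and file path are made up for illustration; the actual
# format MPC used is not stated in the article.
import maya.cmds as cmds

cmds.loadPlugin("AbcExport", quiet=True)

# Bake the animated monster geometry (frames 1-120) to an Alembic file
# that a Houdini artist can read back in with an Alembic/File SOP.
job = ("-frameRange 1 120 -uvWrite -worldSpace "
       "-root |mudMonster_GEO -file /jobs/nissan/cache/mudMonster.abc")
cmds.AbcExport(j=job)
```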


Artists used a film plate (top) and added a CG character to the scene (bottom).

All the monsters contain Houdini simulations. In the Juke encounter with the potholes, the artists used sims to break up the ground as the creatures emerge from the pavement. Their jaws were animated in Maya, but the little bits of debris that fall off were simulated in Houdini. In the Rogue mud encounter, all the creature parts are slick and viscous, created with a sim. The Pathfinder encounter with the snowy tree uses simulations for the flinging branches and the falling snow. Some branches were animated in Maya and combined in comp. In the Murano encounter with the icy bridge, all the falling, snapping debris and smashing ice were simulated.

According to Osborn, the biggest difficulty was controlling the creatures in such a way that they are believable and recognizable as monsters. The dripping mud and swinging branches with falling snow offered an extra challenge. Artists started off with character designs to establish a look for all the creatures and then built the underlying shapes and rigged them. Underlying surfaces drove the mud sim over the monster’s structure to keep the intended shape; particle sims were combined with practical elements when the creature crashes into the vehicle’s windshield.
“We combined as much reality as we could to make it as believable as possible,” Osborn notes. “Mud is one of those things that is difficult to do convincingly. We also shot a lot of rain in camera to get all that interaction with the rain and the mud on the ground. But the rain in camera also made the CG integration more difficult.”

The group animated the tree in Maya, using grayscale models in all the shots. Because the branches would ultimately be simulated in Houdini, the artists imported stand-in geometry into Maya to visually represent the space the tree would occupy in the shot. Then they simulated all the individual branches moving. “They were all created with a wire simulation, so as the character walked along the ground, the branches would bounce and react, and we would have all this secondary movement that made it feel convincing,” Osborn says.
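
Houdini’s wire solver handles this internally, but the “bounce and react” secondary movement Osborn describes is essentially damped-spring dynamics layered on top of the keyframed branch motion. A stripped-down, illustrative version with arbitrary values:

```python
# Illustrative only: damped-spring secondary motion for a branch tip that
# lags behind and overshoots its keyframed root -- roughly the behavior a
# wire-style solver provides automatically. Values are arbitrary.
import numpy as np

def secondary_motion(root_positions, stiffness=40.0, damping=6.0, dt=1.0 / 24):
    """root_positions: (num_frames, 3) keyframed positions of the branch root."""
    tip = root_positions[0].copy()
    vel = np.zeros(3)
    out = []
    for target in root_positions:
        accel = stiffness * (target - tip) - damping * vel
        vel += accel * dt
        tip = tip + vel * dt
        out.append(tip.copy())
    return np.array(out)      # trails the root, overshoots, then settles

# Example: the root swings along X over 48 frames; the tip follows with bounce.
root = np.zeros((48, 3))
root[:, 0] = np.sin(np.linspace(0, np.pi, 48))
tip_positions = secondary_motion(root)
```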


A CG mud monster interacts with a practical vehicle.


      “We combined as much reality as we could to make it as believable as possible.”


In that tree sequence, the snow on the ground was real; the snow falling from the sky and from the tree was not. The snow sliding from the tree was linked to the animation and simulated in Houdini, while the flakes falling from the sky were generated in Maya.

For the icy bridge, the artists used a great deal of practical ice but took it further using CGI. The big ice chunk that falls on the vehicle is real, while all the smaller incidental bits are digital.

The group at MPC Advertising has encountered many challenging scenarios over the years. And while the weather monsters in “Winter Allies” did their best to wreak havoc, the artists, like the vehicles in the spot, handled them magnificently.

Pepsi’s Halftime Touches Down


Commercials that air during the Super Bowl are very expensive, so companies making that investment have high expectations for the work. The good news is that most spots continue to air after the game. That was not the case for Pepsi’s “Halftime Touches Down,” however, as it was specifically designed to introduce the halftime entertainment. Since it would air live only once, for 30 seconds, each one of those seconds had to count. And they did, with out-of-this-world effects.

“Every year the world’s biggest TV audience tunes into the game with the expectation that something spectacular might happen, so from the moment Mekanism came to us with [Director] Tommy Means’ genius idea for setting the stage for halftime, we knew we had our work cut out for us,” says Ben West, Framestore’s director of the spot.

Framestore directed and created all the effects in the piece, which has a UFO descending on Glendale, Arizona, the site of the big game. “Revealing the UFO as the stadium was always going to be the great payoff,” West says. “In some ways, it’s a no-brainer – that stadium really looks like a UFO.”

Nearly 20 people in the Framestore LA office handled the work. Post was completed in less than three weeks.

It was up to Chris Eckardt, VFX supervisor, to bring West’s concepts to fruition, which, for the most part, centered on the environments. In short, the goal was to convey that this was the Arizona desert (which carries the mystique of alien encounters) and that a spaceship touches down and transforms into the actual stadium. Before this, there are images that hint at the entertainment that’s to come: a Pepsi machine, a Pepsi truck, guitars, and Katy Perry’s iconic blue wig. When the spaceship hovers overhead and a beam of light floods the area, objects suddenly lose their battle with gravity.

The practical effects were just as challenging as the digital effects for this commercial, as the live action required a massive amount of rigging and lighting to achieve the base elements required to tell the story. In terms of visual effects, “a big part of the job was creating a seamless connection between what we shot and where we wanted to take it,” West says. “We did a lot in camera, which gave us a good reference point, making things a little easier.”

Shooting took place in the California and Arizona deserts, as the production crew used a 100-ton crane and a huge mothership-style light rig to create the effect of a hovering stadium-size UFO as it beams up components of halftime.


Complex practical lighting, along with CGI, resulted in this commercial’s successful homage to alien encounters.


Initially, West toyed around with lifting the truck in CG. “But it was important for me to have a real truck and lift it with a crane on set so we would have all the interaction with the lights,” he explains. An actual trailer was used on set for establishing shots, but during the lift, only the cab was used; the trailer was built in CG.

In the studio, the VFX artists created the UFO/stadium, referencing the real University of Phoenix Stadium. They also populated the landing area with 14,000 high-res CG cars, created with an Arnold (Solid Angle) archive system which generated proxies that could be rendered quickly. Smoke, dust, and debris effects helped achieve the look of the tractor beam and conveyed its antigravity pull, as well as the stadium’s massive scale as it lands.
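
The general idea behind the archive (stand-in) workflow mentioned here is to bake each hero car once and then scatter thousands of lightweight references by transform, letting the renderer expand them at render time. A renderer-agnostic sketch of that placement step; paths, counts, and layout are hypothetical:

```python
# Hypothetical, renderer-agnostic sketch of the archive/stand-in idea:
# bake a few hero car assets once, then emit thousands of lightweight
# instances as (archive path, transform) pairs for the renderer to expand
# at render time. Paths, counts, and layout are made up for illustration.
import random

ARCHIVES = [f"/jobs/pepsi/ass/car_v{i:02d}.ass" for i in range(1, 7)]

def parking_lot(rows=100, cols=140, spacing=6.0):
    instances = []
    for r in range(rows):
        for c in range(cols):
            x = c * spacing + random.uniform(-0.3, 0.3)    # slight jitter
            z = r * spacing + random.uniform(-0.3, 0.3)
            rot_y = random.choice([0.0, 180.0]) + random.uniform(-2.0, 2.0)
            instances.append({
                "archive": random.choice(ARCHIVES),
                "translate": (x, 0.0, z),
                "rotate_y": rot_y,
            })
    return instances

lot = parking_lot()    # 14,000 placements, but only six unique assets to load
```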

The main digital tools used by Framestore included Autodesk’s Maya (modeling and animation), along with Arnold (rendering). Most of the texturing happened in Maya, with some work also done in The Foundry’s Mari. Compositing and last touches were handled in The Foundry’s Nuke. Final adjustments and color leveling occurred in Autodesk’s Flame. Science-D-Visions’ 3DEqualizer was used for tracking. The smoke and particles were produced in Side Effects’ Houdini.

Lighting was important to selling the commercial. “We were trying to match the overall lighting on the set, which was put together by film cinematographer John Schwartzman (Jurassic World, Armageddon, Pearl Harbor). He used these beautiful concert-based lighting setups on set to re-enact an alien ship hovering and beaming things up,” says Eckardt. A second crane on set lifted the lighting rig. Meanwhile, the CG crew acquired a substantial number of HDR references, and the artists then pushed the lighting further in Arnold.

The light beam, meanwhile, was coordinated between the 2D and 3D departments. Michael Ralla, the 2D lead on the project, developed the look of the light beam and integrated the effects for the live action and CG. “We didn’t want a color separation between the Pepsi blue UFO beam and the rest of the scene, so the lighting rig was enhanced with volumetric lighting and particle effects for a beam with a little more solidity and opaqueness that we associate with a UFO beam,” West says.

The CG stadium was built in Maya; CAD data of the actual stadium was used for reference.

“There’s no more suitable arena to unleash this magic than at the Super Bowl, an epic congregation when America stands to attention to witness a moment of greatness,” says West. That would include the performances by the players, Perry, and, certainly, Framestore.
 
Mercedes’ Fable

We all know the story of the tortoise and the hare. But for the Mercedes commercial titled “Fable,” CG artists at MPC Advertising put a unique spin on this fable, updated for 2015 sensibilities. “The director and agency wanted to keep that magical quality but with some edge,” says MPC’s Chris Bernier, CG lead on the project.

The 60-second spot features two creatures – a turtle and rabbit – in a race, cheered on by their fans, other woodland animals. The rabbit is cocky, sure of himself. The turtle, slow but eager. When an opportunity presents itself in the form of a Mercedes-AMG sports car, the turtle is suddenly in the driver’s seat, literally and figuratively.

The spot contains a large amount of CGI, mainly in the form of the characters. A lot of the environments and the car were shot live action, with kicked-up leaves and debris added in post. “Every shot had some VFX element, from a large number of CG characters to set cleanup and augmentation,” says Bernier.

Most of the work was done by MPC’s New York studio, with the Los Angeles, London, Bangalore (India), and Mexico crews assisting. At any given time, a core team of approximately 15 worked on the project during a three-month period.  

MPC created 15 individual characters – including the hare, turtle, fox, mole, toad, crow, snail, possum, and more – and a number of variations of those animals. The models were built with Autodesk’s Maya and textured with Autodesk’s Mudbox, Pixologic’s ZBrush, and The Foundry’s Mari. Some proprietary tools, along with Maya, were used for rigging. Lighting was done in Maya using the Arnold (Solid Angle) renderer. The Foundry’s Nuke and Autodesk’s Flame were the compositing tools.


Artists integrated at least a dozen CG furry animals into live backdrops for “Fable.”

According to Graeme Revell, lead animator, there were discussions about which animals would be bipeds and which would be quadrupeds. “The client wanted the main characters to be anthropomorphic and walk like a human on two legs, but during the race, they are on four legs,” he explains. “So we had to build each of the main characters as both bipeds and quadrupeds, doubling our pipeline for those characters.”

The two types of character models differed in terms of their proportions – the legs are longer in one version than in the other, and the body is fatter or thinner. “The challenge was to make it look like the same character, even though the shapes are different,” says Bernier. The performances and clever placements in the shots helped make that happen.

Another challenge was dealing with the large number of furred or feathered animals appearing on screen at the same time – 12 of the 15 main creatures have fur or feathers. Using Furtility, a flexible, robust in-house tool from the MPC Film group, helped in this regard. “The Arnold renderer really crunches through the tons of curves and rendered the large amounts of geometry, which allowed us to really raise the fur count and detail on our characters,” says Revell.

Cloth simulation was another challenge when it came to character creation. The hare, for instance, is wearing a jogging outfit, and the fox a referee’s shirt. “So we used our fur system, then did the cloth simulation after that,” Revell emphasizes. For the cloth simulation, the artists used Maya’s nCloth.

Bernier notes that the hare was the most complicated character from a technical standpoint, with the cloth, fur, some muscle sims, and so forth. This is also true from a story standpoint. “The first shot is a close-up of him talking, and as he does so, he whips off his sunglasses. In those two seconds you get a snapshot of his personality,” he says. “He is the cool bad guy through the story, so nailing that performance right away was important.” In that close-up, the hare model has more than one million hairs.

Meanwhile, a camera car filmed the Mercedes traveling along an actual street. Even shots with the turtle appearing behind the wheel contain an actual car. Some of the scenes were shot in Portland, where the camera crew filmed the car and the woodsy environment. In fact, at the starting line of the race, much of the foreground was shot on set, and anything behind the starting line was handled with matte paintings, VFX, and set extensions. To give the environments a “magical” feel, the CG team added particles in the air and other atmospherics.

“A huge amount of the foliage was practical,” says Bernier. “We didn’t do much CG-rendered plant life, but we did an extended bluescreen shoot of plant life because we knew we would have to augment the plates heavily. We shot branches, brush, and ferns being pushed around, so when we got back into the studio, we could add them.”

Although the MPC team was retelling a familiar story in “Fable,” the reality was that they had to push the creative and technical envelopes to deliver a visual tale, with a large number of furry CG animals that had a lot of personality and style.

Avocados from Mexico’s First Draft Ever

Animals and humor tend to be fantastic ingredients for a commercial, and the Avocados from Mexico’s “First Draft Ever” spot is proof that this combination leads to successful results.

The commercial, from production company Biscuit and VFX facility a52, mixes the modern (Jerry Rice and Doug Flutie behind an ESPN-like desk) with the prehistoric (a caveman co-anchor and a sparse desert/rock environment). The potential draftees are as unique as they come, with each country selecting an icon associated with that nation today. For instance, Brazil selects the sloth; Australia, the kangaroo; the US, wheat; Madagascar, the lemur; China, ginkgo biloba. And, Mexico selects the avocado.

As each selection is announced, the camera rolls to the drafted animal (or object) within a coliseum-like environment. The animals’ faces go tense as a trade is announced, with the dodo bird keeling over. The polar bear, meanwhile, is wearing a sombrero and chanting “beach” under its breath in hopes of being chosen by Mexico.  

“It is a fun and wonderfully ridiculous, charming, tongue-in-cheek ad,” says a52 VFX Supervisor Andy Barrios.
The live action was shot on stage by Siren Studios. Approximately 10 percent of the sets were built practically, while the rest consist of matte paintings and CG set extensions.

What happens, then, when you put a snake in close proximity to a mouse? An alligator by a lemur? A lion near a zebra? Mayhem. So, they were all filmed separately on bluescreen and then composited into the scenes. The opening shot alone contained more than 150 layers.
The animals react to the draft notices, mostly with facial expressions, so a lot of the VFX work required eyelid, mouth, and ear movement in post. The artists also added some exaggerated expressions, all using Autodesk’s Flame.

Some CG animals joined the draft, including the digital polar bear, brown bear, mouse, and penguin. Naturally, the animals with fur – the bears, especially – posed the biggest challenge.

The wheat and avocado were shot live on stage, while the rest of the plants and objects were photographed separately from a range of angles, so when they were later composited into the shots, they filled the frame as needed. The tumbleweed, however, was created in CG.
“Everything starts off with a good base, and that means a rock-solid camera track,” says Barrios. The group used PFTrack from The Pixel Farm for that task. Later, the crew used Autodesk’s Maya and Pixologic’s ZBrush for modeling and then animating the cast as needed. All the models were run through ZBrush for an added detail pass used as displacement maps.

Chaos Group’s V-Ray for Maya was used for shading and rendering, which gave the artists renders with added control layers for compositing. Because of the fur, they also used V-Ray’s HairMtl3 shader.

The artists sculpted the hair grooms in ZBrush using FiberMesh, and then exported a reduced number of curves to act as guides in Maya. Because of the short turnaround, they used Maya’s dynamic curves to drive the simulated nHair. The ZBrush guide curves were attached to the base meshes as dynamic hairs and simulated so they moved correctly as the animals moved, explains Barrios.

“By using the dynamic guide hairs as a base for a PFX hair system, we were able to replicate and achieve our final hair density at render time,” he adds. A number of systems were also used to create flyaways and randomness.
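
The density trick Barrios describes boils down to interpolating each dense render hair from its nearest simulated guide curves, so only the sparse guides ever need to be simulated. A toy version of that interpolation (not the Maya PFX implementation):

```python
# Toy version of guide-hair interpolation: each dense render hair is a
# weighted blend of its nearest simulated guide curves. Not the actual
# Maya PFX implementation; data shapes and counts are illustrative.
import numpy as np

def interpolate_hairs(guide_roots, guide_curves, render_roots, k=3):
    """
    guide_roots:  (G, 3) root positions of the simulated guide curves
    guide_curves: (G, P, 3) points along each guide (P points per curve)
    render_roots: (R, 3) scalp positions where dense hairs should grow
    Returns (R, P, 3) interpolated render curves.
    """
    out = np.empty((len(render_roots),) + guide_curves.shape[1:])
    for i, root in enumerate(render_roots):
        d = np.linalg.norm(guide_roots - root, axis=1)
        nearest = np.argsort(d)[:k]
        w = 1.0 / (d[nearest] + 1e-6)
        w /= w.sum()
        # Blend the guide shapes, then re-root the curve at this follicle.
        blended = np.tensordot(w, guide_curves[nearest], axes=1)
        out[i] = blended - blended[0] + root
    return out

# Tiny demo with random data: 20 guides, 5 points each, 500 render hairs.
guides_r = np.random.rand(20, 3)
guides_c = guides_r[:, None, :] + np.linspace(0, 1, 5)[None, :, None] * 0.1
render_hairs = interpolate_hairs(guides_r, guides_c, np.random.rand(500, 3))
```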

The spot was cute, imaginative, and funny, and could not have been pulled off without the right amount of digital magic, thanks to the a52 team drafted for the job.



Supercell’s Clash of Clans: Revenge

Many people have played Clash of Clans, a 2012 freemium mobile MMO strategy game from Supercell. And what’s not to love? There are barbarians, archers, goblins, giants, skeletons, wizards, healers, dragons, and more.

In this game, clans face off against other clans. But, as this commercial by Psyop shows, it might be wise to know who you are doing battle with.

“This is the 13th Clash of Clans commercial, and being that it was for the Super Bowl, we wanted to do something different and very special. Introducing live action by showing the link between the Clash of Clans world and the real world was a great twist,” says Director Fletcher Moules from Psyop.

The group decided to parody actor Liam Neeson’s famous monologue from the film Taken, and then pit that film character against another player in the Clash of Clans community. Playing under the handle AngryNeeson52, the actor swears vengeance on BigBuffetBoy85, who demolishes Neeson’s base in the opening sequence.

The spot starts out with the CG battle, generated by Psyop, in progress. Psyop had been working with Supercell for a year and a half to bring the game’s universe and stories behind the characters and gameplay to life for previous commercials. “In that time, we’ve built a fully realized animated world with all the immersive detail and depth of a long-running Saturday-morning cartoon series, including a wide range of characters, props, and set pieces,” says Chris Sage, CG lead.



Psyop put actor Liam Neeson “in the game.”

With those assets in hand, the Psyop crew directed and produced 10 fully CG shots that provide the backstory for Neeson’s performance as AngryNeeson52. “The shot we created for this spot focused on the utter destruction of Liam’s base, so we created all the animated mayhem and VFX that were needed to bring that moment to life,” Sage explains. In any given scene, there are 30 to 40 animated characters, each with their own personality and signature battle moves.

As Sage points out, the spot shows for the first time in the campaign the world from inside the defenders’ base while they are being attacked. “It’s a unique POV that most players don’t get to see in the game,” he says of the purpose-built animation. “We also had to get across the total annihilation of Liam’s base and the experience of a player logging in to see his base destroyed. This meant that no wall or defense was left unscathed, and everything was fair game for carnage here.”

As a result, there is more destruction than Psyop had ever created before, so the crew knew that integrating all the effects into the rendered scenes was going to be difficult.

“We do most of our effects in [Side Effects’] Houdini and comp them into [Solid Angle] Arnold-rendered scenes,” says Sage. “For this spot, we decided to take advantage of the layering and compositing functions of the new Deep EXR pipeline. By rendering Deep Matte passes, we were able to composite the effects elements without having to worry about matching displacement and motion-blur consistency.”
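
Deep compositing works because each pixel stores multiple depth-sorted samples, so a Houdini effects element can be slotted in at the correct depth without hold-out mattes that would have to match displacement and motion blur exactly. A toy single-pixel merge illustrating the idea (real Deep EXR data involves far more bookkeeping):

```python
# Toy illustration of the deep-compositing idea for a single pixel: each
# element contributes (depth, color, alpha) samples, and merging is just a
# depth sort followed by a front-to-back "over" -- no hold-out mattes
# needed. Sample values are invented for the example.

def deep_merge(*sample_lists):
    """Each sample: (depth, (r, g, b), alpha). Returns flattened RGB, alpha."""
    samples = sorted(s for lst in sample_lists for s in lst)
    rgb, alpha = [0.0, 0.0, 0.0], 0.0
    for _, color, a in samples:                 # front to back
        contrib = (1.0 - alpha) * a
        rgb = [c + contrib * ci for c, ci in zip(rgb, color)]
        alpha += contrib
    return rgb, alpha

# A CG building render and a Houdini dust element, merged by depth: the
# dust sample behind the opaque building is correctly occluded.
building = [(12.0, (0.30, 0.25, 0.20), 1.0)]
dust = [(8.0, (0.6, 0.6, 0.6), 0.25), (15.0, (0.5, 0.5, 0.5), 0.4)]
print(deep_merge(building, dust))
```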

The artists created the animation before the live-action shoot of Neeson occurred. Therefore, the crew had to tell the animated story in a modular way that could be edited after animation was completed, Moules notes.

After the animation, the piece transitions to Neeson, who had just witnessed the carnage. As he stares at the screen message telling him his defense lost the battle, the actor delivers his threatening line directed at BigBuffetBoy85.

While revenge is a dish best served cold, in this instance it was delivered via a spot that was red hot, judging by the accolades it received after airing.