Cool Commercials
Volume 30, Issue 3 (March 2007)


 

Shooting a commercial on the moon isn’t feasible—yet, anyway—so The Mill in New York was called on to create FedEx’s first office on the moon, with approximately 20 CG shots. The Mill’s special effects supervisor Yann Mabille and Autodesk Flame compositor Alex Lovejoy attended the shoot at Downey Studios, a soundstage in Los Angeles, measuring everything that was built so they could match the CG with the practical sets.

Sometimes that wasn’t so easy. For instance, there is a platform in the office where the desks sit at a 30-degree angle, and the directors wanted the actors to be able to walk freely and naturally at that tilt. The work to create the illusion was extremely painstaking, says Mabille, requiring the production crew to build two desks on a horizontal surface, and then shoot the actors walking around the desks and interacting with them. The rest of the office, including the central area with the photocopy machine, was shot separately on a different set.

“We first shot the inside of the office, including the round corridor where the boss and his guests are walking,” Mabille says. “We got the camera animation from that, imported it into Softimage’s XSI, tweaked it, and then re-exported it so we could shoot the horizontal desks with the camera tilted at 30 degrees. That way, when it’s comped, the desks look like they’re tilted.”
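The retargeted-camera trick amounts to composing the tracked camera animation with a fixed 30-degree rotation before re-export. A minimal sketch of the idea in Python (the function names and the choice of tilt axis are illustrative assumptions, not The Mill's actual XSI export data):

```python
import math

def rotation_x(deg):
    """3x3 row-major rotation matrix about the x (tilt) axis."""
    r = math.radians(deg)
    c, s = math.cos(r), math.sin(r)
    return [[1.0, 0.0, 0.0],
            [0.0, c, -s],
            [0.0, s, c]]

def mat_mul(a, b):
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def tilt_camera_animation(frames, tilt_deg=30.0):
    """Pre-rotate every frame of a tracked camera-orientation curve by a
    constant tilt, so a set built flat reads as tilted once composited."""
    tilt = rotation_x(tilt_deg)
    return [mat_mul(tilt, m) for m in frames]

# One frame of a level (identity) camera, re-exported with a 30-degree tilt:
frames = tilt_camera_animation([[[1.0, 0.0, 0.0],
                                 [0.0, 1.0, 0.0],
                                 [0.0, 0.0, 1.0]]])
```

Because the same tilt is applied to every frame, the composited plates of the horizontal desks line up with the tracked move through the tilted office.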

To make the moon office look like it had been there for a while, The Mill team added lots of detail against the base of the building, such as a box with wires going into the structure, and some vehicles. “These little props were important to give the notion of scale, as well,” adds Mabille.

Once outside, on the moon’s surface, the astronauts were shot on a 40x40-foot platform, to get the proper interaction of shadows with the surface. “Beyond the astronauts and a small bit of surface around them, we created the rest in 3D using XSI,” adds Mabille. “We gathered reference photos from NASA, which we tweaked heavily and then projected onto the 3D models.” The entirely computer-generated FedEx spaceship was modeled based on designs from the production company. Realistic CG details included landing gear, rotating reactors, and dust kicking up from the surface, the latter created using the XSI particle system.
 
3D played an out-of-this-world role in the spot. Top shows the CG elements against greenscreen inside XSI; middle shows a simple rendered view for visualization purposes; bottom is the final image.
 

The FedEx “Moon Office” spot contains a mixture of live action, some filmed against greenscreen, and CG elements. In the shot above, the astronauts were filmed live on a large platform. Except for the actors and a tiny amount of surface around them, the scene was created in XSI.
  

“One detail that no one notices is that the astronauts were shot without any glass visors in front of their faces, so we could control the reflections in post,” Mabille points out. The team tracked the imagery using Science-D-Vision’s 3D-Equalizer, and then regenerated it in 3D with XSI. Rendering was accomplished with the Mental Ray renderer inside XSI, and compositing was done entirely in Autodesk’s Flame. Meanwhile, the meteor that takes out one of the astronauts was the subject of a lot of testing. “The goal was to avoid something that looked like a crashing, burning plane,” admits Mabille. Using XSI’s particle system, the group came up with a gaseous, graceful comet.

Another easy-to-miss detail is the coffee that one of the space workers is imbibing. The artists generated the mesh of the fluid in Next Limit Technologies’ RealFlow, and then exported it to XSI, where they rendered it.

Slow-motion shoots were the lighthearted moments on the set, including a “flying” dog that jumped from a trainer’s arms to the floor, shot at more than 500 frames per second (fps). The floating secretary grabbing at papers drifting from the photocopy machine is another comical touch that was created by suspending the woman with a wire setup and adding the flying papers later in CG. All the body animation was shot at 48 fps, but the faces were shot at 24 fps so lip sync would work with the track. The Flame artists then tracked the faces onto the 48 fps bodies. 

“If you see the offline, there’s greenscreen everywhere,” says Mabille, who notes that the spot was completed by five people in 3½ weeks. As a matter of fact, “every shot has CG in it.”

Unlike many Super Bowl production companies, Nexus Productions had a relatively generous amount of time to create “Videogame,” in which the gritty landscape and activities of the popular computer game Grand Theft Auto take a decidedly altruistic, upbeat turn, evolving into full-on song and dance.  According to Nexus senior 3D supervisor Ben Cowell, Nexus created the first “proof of concept” images for Wieden & Kennedy in October 2005. “We picked the two major ends of the story—the beginning when the city is grim, and the end when it’s all happy,” he explains. “Amazingly, they ended up looking a lot like the final images.” Nexus Productions got the go-ahead in February 2006, when the crew started production.

Although initially the plan was to have some live-action elements, such as kids playing computer games at the end of the spot, the commercial evolved into a totally CG version. The generous timeframe was perhaps not so liberal given the fact that the spot features almost 60 “hero” characters and a cast of more than 8000 digital extras. That massive amount of character creation spurred “a few key innovations in the process,” says Cowell. Rather than modeling a character from scratch each time, there was a style for all the characters that acted as a starting point. “We had a [basic] female character and a [basic] male character, and using sliders we could make them taller, shorter, fatter, thinner, and then take a snapshot of the 3D mesh and start detailing,” he adds.

Using Autodesk’s 3ds Max as the core software, eight people focused on modeling, skinning, rigging, and texturing the hero characters; though Nexus Productions has a staff of only two or three full-time CG artists, for this project the production swelled to 35 people, the majority of whom were 3D artists, along with illustrators and compositors.

 
“We had to make a bit of a pipeline to churn the characters through quickly,” Cowell says. “We did that mostly by splitting the group into teams of two or three artists, and each team was assigned 10 characters. We’d start from the beginning and push them through until the end. Each character had its own deadline, so the teams could begin work on the right day. It was more regimented than how we usually worked, but with so many characters, it had to be micromanaged.”
 
Point caching was another trick that helped speed up production. As Cowell explains, with point caching, the character animator can save all the deformations of the mesh, where the character has moved to, and all its morphs. Then, someone else could off-load that and, rather than having the rig and everything else that’s heavy, the person could start lighting the skinned mesh without the bones. Cowell adds that point caching was especially helpful when it came to the end shot with 8000 characters. “There’s no way you could load all those characters with rigs,” he says. 
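At its core, a point cache is just baked vertex positions per frame, written to disk so downstream artists never touch the rig. A toy sketch of the concept (the binary layout and function names here are invented for illustration; 3ds Max's actual point-cache files use their own format):

```python
import os
import struct
import tempfile

def write_point_cache(path, frames):
    """Bake per-frame deformed vertex positions to disk, so a shot can be
    lit and rendered from the cached mesh alone, without rig or bones."""
    with open(path, "wb") as f:
        # Header: frame count and vertex count, little-endian ints.
        f.write(struct.pack("<ii", len(frames), len(frames[0])))
        for verts in frames:
            for x, y, z in verts:
                f.write(struct.pack("<fff", x, y, z))

def read_point_cache(path):
    """Load the baked positions back: a list of frames of (x, y, z) tuples."""
    with open(path, "rb") as f:
        n_frames, n_verts = struct.unpack("<ii", f.read(8))
        return [[struct.unpack("<fff", f.read(12)) for _ in range(n_verts)]
                for _ in range(n_frames)]

# Round-trip a tiny one-frame, two-vertex cache:
path = os.path.join(tempfile.mkdtemp(), "extra_042.pc")
write_point_cache(path, [[(0.0, 1.0, 2.0), (3.0, 4.0, 5.0)]])
cached = read_point_cache(path)
```

The payoff is exactly what Cowell describes: a lighting artist loads a lightweight stream of positions instead of 8000 fully rigged characters.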
 
In addition, the massive crowd scenes proved challenging, and Cowell was initially concerned about creating animations that wouldn’t appear randomized. “We made a library of animations for the people in the back waving their arms, dancing, and so on,” he says. “We would point-cache them out and then place the characters in the scene. If you have 20 guys dancing, you can slide the animation a little bit on each one so they’re all doing the actions slightly differently and it looks more natural. We could create randomization quickly that way, and it’s amazing that you can’t spot repetitions or patterns [in the final output].”
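The time-sliding trick can be sketched in a few lines: every extra shares one cached clip but reads it at a random offset, so identical performances fall out of phase. A hypothetical illustration (the names and numbers are assumptions, not Nexus's actual pipeline code):

```python
import random

def scatter_offsets(n_extras, clip_len, seed=7):
    """Assign each background extra a random start offset into one shared
    cached dance clip. A fixed seed keeps the crowd reproducible per shot."""
    rng = random.Random(seed)
    return [rng.randrange(clip_len) for _ in range(n_extras)]

def clip_frame(offsets, extra, shot_frame, clip_len):
    """Which frame of the shared clip a given extra displays on shot_frame.
    The modulo wraps the clip so it loops for the length of the shot."""
    return (shot_frame + offsets[extra]) % clip_len

# Twenty dancers sharing one 48-frame loop, each slightly out of step:
offsets = scatter_offsets(n_extras=20, clip_len=48)
```

One cached animation serves the whole group, yet no two extras hit the same pose on the same frame, which is why repetitions are so hard to spot.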

Nexus planned the intricate all-CG spot using a full animatic.
The finished commercial features 60 hero characters and more than 8000 digital extras that populate the scenes.
 
To make the dancing scenes as believable as possible, Nexus Productions hired choreographers, and even went to a dance studio and filmed the choreographers dancing to the beat of the spot’s score. “Animators didn’t rotoscope [the action], but rather used it as reference so the motion feels natural,” says Cowell, who reveals that the spot’s directors were also filmed in the dance video. 

Other tools used include Adobe’s Photoshop for texturing and After Effects for compositing. For theatrical release, the commercial was given a professional digital intermediate grade at Midnight Transfer in London. “There was a lot of range because the frames were 16-bit,” says Cowell. “In the past, we’ve output things to film, and it can be painful because the colors can go crazy. With [Midnight Transfer’s] big DI suite, it allowed us to grade the piece as if we were seeing it through a cinema projector.”

Sparking some debate, “Robot”—about a charming robot that contemplates suicide after losing its GM assembly line job—nevertheless was a big hit among this year’s commercials. For the spot, GM wanted to stress its tagline, “obsessed with quality,” and quality is what the company got in this Phil Joanou-directed spot that combined animatronics from Stan Winston and seamless CG from Sway Studios.

Sway Studios owner/creative director Mark Glaser says his firm jumped in at the preproduction meeting and shortly thereafter was on its way to East Lansing, Michigan, to film in an auto assembly plant, followed by an additional three days of filming in the Los Angeles area. “The storyboard was fairly straightforward,” says Glaser. “The part we thought would be challenging was the water in the Los Angeles River. There wasn’t any water when we shot, but we definitely added quite a bit.”

 
The endearing robot in this GM spot is mostly an animatronic. At times, though, it is CG, built using Luxology’s Modo.
 
Stan Winston Studios built two working robots, puppeteered via a hydraulic remote control. Also, the Sway artists constructed a digital replica of the robot, which was used in the scenes when the robot is traveling from Point A to Point B, including the shots when it was on top of the bridge. “All those are CG,” Glaser notes.

In addition to taking photos of the animatronic robot from every angle (which were used to create texture maps), the digital artists videotaped the robot performing some of the actions on the set, which enabled Joanou to direct the performance of the CG robot. “The motions were a great reference for us,” notes Glaser. 

The artists accomplished all the modeling with Luxology’s Modo, and did the animation in NewTek’s LightWave and Autodesk’s 3ds Max. “One key feature in Modo that’s really helpful is the ability to customize the orientation of the workspace while you’re modeling,” says Glaser. “The robot’s parts didn’t always align with an x, y, z work plane, but Modo made it easy to model the robot in all its odd angles.” 

To make certain that the transitions from the real to the digital robot would be seamless, Sway Studios paid close attention to lighting, shooting High Dynamic Range Imagery (HDRI) with a Canon 1DS Mark II SLR camera on the set and locations, and then applied the information within 3ds Max and The Chaos Group’s V-Ray renderer to come up with a result that closely mimicked real life. According to Glaser, the group has a homegrown tool that it uses on every job to assemble HDRIs and calibrate them with color charts. He explains: “We use a Macbeth color chart and shoot it with the real film camera and the HDRI. Once it goes through film development and transfer, we color-correct the HDRI to match the film. It’s basically a correspondence tool between two types of photography, and it was helpful.”
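The correspondence step boils down to solving for per-channel gains that map the chart patches sampled in the HDRI onto the same patches in the film scan. A simplified sketch of that calibration (a least-squares gain per channel; Sway's in-house tool is proprietary and surely more sophisticated):

```python
def channel_gains(film_patches, hdri_patches):
    """Per-channel linear gains mapping HDRI chart samples onto the film-scan
    samples of the same Macbeth patches (least squares through the origin)."""
    gains = []
    for ch in range(3):
        num = sum(f[ch] * h[ch] for f, h in zip(film_patches, hdri_patches))
        den = sum(h[ch] ** 2 for h in hdri_patches)
        gains.append(num / den)
    return gains

def apply_gains(pixel, gains):
    """Color-correct one HDRI pixel toward the film look."""
    return tuple(v * g for v, g in zip(pixel, gains))

# One gray patch that reads twice as bright on film as in the raw HDRI:
gains = channel_gains([(0.2, 0.4, 0.6)], [(0.1, 0.2, 0.3)])
```

With the chart shot through both pipelines, the same correction generalizes to every HDRI captured on that set, which is what makes the two types of photography correspond.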

Glaser also credits an Nvidia Quadro-based solution for giving the team the ability to work at the high level of resolution and detail required to make the seamless transitions between the live-action and CG robots. Compositing, meanwhile, was done primarily in D2’s Nuke, with some finishing touches and beauty work on the cars done with Autodesk’s Linux-based Flame.   

As to the challenges of putting water in the Los Angeles River, Glaser once again used Nuke and the advanced features of the Quadro card to replicate the nuances of a real river by applying its features to a CG water surface as multiple layers of procedural textures. 

Judged to be the number one spot during this year’s Super Bowl, “King Crab” had an auspicious beginning. “It was a great storyboard with a lot of potential,” recalls senior 3D artist Ben Smith of The Mill in New York. “There was funny character animation, good potential for great CG mixed with good live-action plates. We were excited about it.”

Stan Winston Studios built animatronic crabs and took two fully articulated versions to the shoot in Mexico. The idea from the outset was to shoot the animatronic crabs, with the plan that The Mill team would replace about 50 percent of the live-action shots with CG. To that end, two Mill artists, Douglas Luca from the New York office and Dave Parker from the Los Angeles office, attended the shoot. There, they took reference and HDRI photos in preparation for all the CG close-ups, and worked on the textures and the animation rig.

During offline, however, everyone loved the close-up shots of the animatronics. “It became the X factor,” says Smith. “People had responded well to that look, and we had a lot of conversations about why those were good. Although we couldn’t quite put it into words as to why, we knew it was better to stay with that look. The fact that it did look slightly goofy and clunky made [the bit] funnier.”

The “King Crab” spot started out with mostly CG crabs, but the look and feel of the animatronics won out in the end. So, the digital artists had to revamp their 3D models, shown in some scenes, to blend with the Stan Winston models.
  

Replacing 10 animatronic shots with CG versions, The Mill’s artists needed to closely match the animatronic, which was now seen close up. “Previously, we had taken real crabs from a market and used them for texture—and then ate them,” says Smith. “But once we needed to use the animatronic close-ups for reference, all those extra details didn’t work anymore. The animatronic is not anatomically a crab. It’s a caricature of a crab. So the most challenging part was making the crab look as photoreal as possible, but in an unreal sense, because it was an unreal-looking animatronic.”

As a result of having to match the animatronic, the CG texturing and rendering processes weren’t as tricky as they might otherwise have been, since the artists had a physical model to work from. But the animation, accomplished in Softimage’s XSI, did prove challenging. “The animatronic moved in a shaky, goofy way,” Smith says. “We had quite a number of animation revisions until we got to someplace that felt right. In our first versions, we over-animated, trying to put a lot of character and cute crab-like features into it. To match the animatronic movement, they had to be readable and, therefore, as simple as possible.”

The group wrote a special software tool to generate footprints in the sand. “That worked on some of the shots,” says Smith. “For the close-up shots, we used another trick. We got a sand pit and an HD camera and lighting. We basically re-created the camera angles and just prodded the sand with a little poker. The Flame guys used that in the composite to get nice interactions with the crab legs in the sand, which helped sell the image.”

Six or seven standard passes—subsurface scattering, ambient occlusion, specular, and others—were rendered out and given to the Flame artists, who had a batch setup for the composite. “All the levels were combined in a way that would get them 60 percent of the way there,” says Smith. “And then they worked on them frame by frame.” The HDRI photography was used for the image-based lighting pass, and the shots were lit using Final Gathering, a feature of Mental Ray. For the big sweeping shots at the end and the panning shot at the very beginning, the group used Science-D-Vision’s 3D-Equalizer for the tracking.
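A pass recombine of this kind typically starts from a simple per-channel formula: occlusion attenuates the diffuse pass, and specular and subsurface contributions add on top. A hedged sketch (a generic comp formula, not The Mill's actual Flame batch setup):

```python
def combine_passes(diffuse, occlusion, specular, subsurface):
    """First-pass beauty rebuild: ambient occlusion attenuates the diffuse
    pass, then specular and subsurface scattering add on top, per channel."""
    return tuple(d * o + sp + ss
                 for d, o, sp, ss in zip(diffuse, occlusion,
                                         specular, subsurface))

# One RGB pixel rebuilt from four of the rendered passes:
pixel = combine_passes(diffuse=(0.5, 0.4, 0.3),
                       occlusion=(0.8, 0.8, 0.8),
                       specular=(0.1, 0.1, 0.1),
                       subsurface=(0.05, 0.02, 0.01))
```

A mechanical rebuild like this is roughly the “60 percent” starting point; the remaining polish is the frame-by-frame work Smith describes.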

This was another spot with a tight deadline: seven artists, two of them compositors, over a four-week period—a fast turnaround. But, the result was definitely a hit with viewers.


 

Debra Kaufman is a freelance writer in the entertainment industry. She can be reached at dkla@ca.rr.com.