Cinesite Gives Marmaduke Some Bite
Karen Moltenbrey
July 8, 2010

Cinesite, a leading film visual effects house, has completed over 650 shots on the new Twentieth Century Fox film Marmaduke.

Based on the newspaper comic strip by Brad Anderson, the film follows a suburban family that moves into a new neighborhood with their large, loveable Great Dane, Marmaduke, who has a tendency to wreak havoc in his own oblivious way. Directed by Tom Dey, the film features a raft of celebrity voices, including Owen Wilson (Marmaduke), George Lopez (Carlos), Kiefer Sutherland (Bosco), and Fergie (Jezebel).

Working for a total of 11 months on the film, Cinesite’s visual effects supervisor Matt Johnson and his team centered their pipeline on existing proprietary software and custom-built tools to overcome the challenges of animating 10 different live-action dogs. “It was a massive task for us,” explains Johnson. “Placing CGI faces over live-action animals sounds like a simple task, but each breed of dog or cat brought its own unique challenges, and the detailing involved meant we had to push our pipeline to its limits.”

Cinesite used a hybrid technique, combining fully textured and lit CG passes with parts of the original photography re-projected over the animated geometry. To create the CG faces of the different canine characters, Johnson’s team built base head models in Autodesk’s Maya using photographic references of the dog actors. Blendshapes based on individual muscle shapes were then integrated into the rig using in-house tools. A customized rig mimicked the muscle structure of a dog, forming the primary layer of canine muscles; a secondary layer, mimicking the muscles of a human face, further refined those muscles during the animation process.
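At its core, a blendshape layer like the one described deforms a neutral head by a weighted sum of sculpted muscle-shape offsets. A minimal numpy sketch of that arithmetic follows; the shape name, weights, and tiny mesh are illustrative assumptions, not details of Cinesite’s rig:

```python
import numpy as np

def apply_blendshapes(base, targets, weights):
    """Deform a neutral head mesh by a weighted sum of muscle shapes.

    base:    (N, 3) array of neutral vertex positions
    targets: dict of name -> (N, 3) sculpted muscle-shape meshes
    weights: dict of name -> float in [0, 1], driven by the animator
    """
    result = base.copy()
    for name, target in targets.items():
        w = weights.get(name, 0.0)
        # Each shape contributes its offset from the neutral pose.
        result += w * (target - base)
    return result

# Illustrative example: a 2-vertex "mesh" with one hypothetical jowl shape.
base = np.zeros((2, 3))
targets = {"jowl_lift": np.array([[0.0, 1.0, 0.0],
                                  [0.0, 0.0, 0.0]])}
posed = apply_blendshapes(base, targets, {"jowl_lift": 0.5})
```

Layering works the same way: a secondary, human-style set of shapes is simply a second pass of weighted offsets applied on top of the canine layer.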

“Because the dogs had human traits about them, we had a team of artists studying facial expressions and writing muscle codes that could be added into our pipeline,” continues Johnson. “We carefully studied Owen Wilson’s mannerisms and reflected this in the expressions we gave Marmaduke beyond simply making the dog talk. We were attempting to capture a real sense of Owen Wilson in a 200-pound Great Dane.”

In many shots, Cinesite further defined Marmaduke by adding in 3D eyes and whiskers. “Using Cinesite’s hybrid texture projection and CG fur techniques allowed our talking animals to sit seamlessly alongside the production’s live-action animal performances,” adds Johnson.

The final result was 10 live-action dogs with CGI work seamlessly blended into their faces and around their necks, sitting pixel for pixel alongside the real fur.

“It was great to work on our third canine project following the successes of Beverly Hills Chihuahua and Underdog,” says Antony Hunt, managing director at Cinesite. “Working on Marmaduke has allowed us to further develop our dog simulation pipeline, making it an incredibly innovative and powerful tool for us to rely upon for many projects to come.”

In this Q&A with chief editor Karen Moltenbrey, Matt Johnson, visual effects supervisor, provides further details about the project.

What specific changes did Cinesite make to its production pipeline to accommodate this work?

Our talking animal pipeline is based around a hybrid technique combining fully textured and lit CG passes with parts of the original photography re-projected over the animated geometry. For this show we also added CG eyes and ears to Marmaduke in many shots.

Why were the changes needed?

Every director and each film comes with its own artistic requirements, particularly regarding the look and style of the talking animals. In this instance, the client was looking for a ‘photorealistic cartoon dog.’ Eyeball, eyelid, and ear animation were all necessary to achieve the desired expressions.

Which software packages did you use, and for which specific tasks?

The team used:

• Apple’s Shake and The Foundry’s Nuke for compositing
• Autodesk’s Maya and Pixar’s RenderMan for lighting
• The Pixel Farm’s PFTrack for tracking
• SilhouetteFX’s Silhouette for rotoscoping
• Maya for all 3D tasks, such as modeling, rigging, tracking, and animation
• Autodesk’s Mudbox, Pixologic’s ZBrush, and Adobe’s Photoshop for texturing
• PRMan 15.0 for rendering
• Assimilate’s Scratch for 2K dailies reviews

What about the hardware?

Artists’ workstations were quad-processor 2.6GHz HP 8400s or similar, running Fedora Linux. We used an 80-machine renderfarm and IBM workstations for 3D.

Describe the custom tools you used on this film. What did they do?

There are several custom tools involved in the pipeline we’ve developed. They include:

• CharacterCameraProjection — An in-house Shake tool that enables compositors to re-project paint and scan imagery over the animated geometry.

• MotionWarper — A 2D warping tool that allowed for re-creation of the projected imagery in 2D with the added bonus of being able to manipulate the motion vectors and, therefore, control areas that suffered from exaggerated stretching where needed at the compositing stage.

• UVMapping — UV passes rendered by 3D were used to great effect at the 2D stage for a wide variety of purposes, including mapping paint work, 2D noise maps, and roto onto the animated geometry.

• Modeling / Rigging — “Trickle down node” and “dog blender” are in-house tools that intelligently blend muscle shapes from mouth movements.

• Animation — CANI (Cinesite Animation Interface) is an asset management tool that keeps animators informed as to the current status of the track and rig assets they’re working with. In a creative and flexible pipeline, with frequent client requests for changes, this tool is essential.

• Fur — In-house fur-grooming tools and customized RenderMan shaders.

Several 3D tools were also created, including a one-button set-up for creating full CG eyes that match the HDRI lighting; an artist-friendly lighting pipeline, further developed with many processes automated to give a consistent look; advanced pre-render checking tools; and a tool to groom fur around mouths and give more detail to lip lines.
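The UV passes mentioned above work as a per-pixel lookup: each rendered pixel stores the (u, v) coordinate of the animated geometry at that point, so 2D paint, noise, or roto drawn in UV space can be pulled onto the moving surface. A minimal nearest-neighbor sketch of that remap in numpy; the resolutions and names are illustrative, not Cinesite’s implementation:

```python
import numpy as np

def remap_by_uv(texture, uv_pass):
    """Warp a 2D texture onto animated geometry via a rendered UV pass.

    texture: (th, tw, 3) painted image in UV space
    uv_pass: (h, w, 2) per-pixel UV coordinates in [0, 1] from the 3D render
    """
    th, tw = texture.shape[:2]
    # Convert normalized UVs to integer texel indices (nearest neighbor).
    u = np.clip((uv_pass[..., 0] * (tw - 1)).round().astype(int), 0, tw - 1)
    v = np.clip((uv_pass[..., 1] * (th - 1)).round().astype(int), 0, th - 1)
    return texture[v, u]

# Illustrative example: a 2x2 texture sampled onto a 1x2 output frame.
texture = np.array([[[1., 0., 0.], [0., 1., 0.]],
                    [[0., 0., 1.], [1., 1., 1.]]])
uv = np.array([[[0.0, 0.0], [1.0, 0.0]]])  # left and right texels, top row
warped = remap_by_uv(texture, uv)
```

As paint only needs to be done once in UV space and then follows the animation for free, this is why the passes were, as Johnson says, used "to great effect" at the 2D stage; production tools would typically add bilinear filtering rather than nearest-neighbor sampling.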

Describe the work that you did on the film. Was it all canine mouth replacements?

Due to the nature of the animation style, our work varied from shot to shot. In some instances, the amount of movement in the eye area or corner of the muzzle meant that we needed to rely on large areas of fully furred and rendered CG. In other instances, projected textures were used. Due to the unpredictable nature of the dogs’ live-action performances, there were no hard and fast rules. Our pipeline allowed full flexibility when it came to using the best methodology for each given shot.

Again, due to the nature of filming with live animals, another challenge on this type of movie is that, in many cases, any given frame is composed of multiple passes, each featuring multiple animals. Some of these can be split, while others are greenscreen elements, and we have to allow flexibility within the pipeline to accommodate these requirements.
I was also on set throughout principal photography, both in Vancouver and Los Angeles.

Which dog was the most difficult, and why?

Several of the dogs presented extra challenges. Marmaduke is a Great Dane with very large and mobile jowls. Every shot required ‘jowl clean-up’ and we had to be particularly mindful of how the appearance of his jowls affected his performance in each shot.

Mazie the cat was also quite a challenge. She had very visible white whiskers that needed removing in every shot. She also had a varied fur pattern which was tricky to match.

In total, we animated 10 talking dogs and cats. Many of them blinked in the original scans, and the animation director often wanted the blinks synched with animation cues. This meant removing the original blinks with 2D paint work, followed by the extra integration work needed to add CG eyelids.

How did you overcome those difficulties from a technical standpoint?

Jowl clean-up was tackled with standard 2D patch-tracking methods and custom motion analysis tools that, in certain situations, could track patches very accurately, even with panting and breathing dogs.
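At its simplest, 2D patch tracking is template matching: slide a reference patch across each frame and keep the offset with the best normalized cross-correlation score. Cinesite’s in-house motion-analysis tools are not public, so the following brute-force numpy sketch only illustrates the principle; the frame, patch, and pattern location are made up for the example:

```python
import numpy as np

def track_patch(frame, patch):
    """Locate `patch` in `frame` (both grayscale 2D arrays) by normalized
    cross-correlation. Returns the (row, col) of the best-matching corner."""
    ph, pw = patch.shape
    p = patch - patch.mean()
    pnorm = np.sqrt((p * p).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(frame.shape[0] - ph + 1):
        for c in range(frame.shape[1] - pw + 1):
            window = frame[r:r + ph, c:c + pw]
            w = window - window.mean()
            denom = np.sqrt((w * w).sum()) * pnorm
            if denom == 0:
                continue  # flat region: correlation undefined, skip it
            score = (w * p).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Illustrative example: a diagonal 2x2 pattern placed with its corner at (3, 5).
frame = np.zeros((8, 8))
frame[3, 5] = frame[4, 6] = 1.0
patch = np.array([[1., 0.], [0., 1.]])
pos = track_patch(frame, patch)  # finds (3, 5)
```

Because the score is normalized, the match is robust to overall brightness changes, which is part of what lets a tracker stay locked onto a patch of fur even as a panting dog moves through changing light.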

Whisker removal was helped greatly by custom motion-analysis tools that we used to track patches. However, it still required a fair amount of manual processing, animated warps, and rotoscoping.

What other technical challenges did the overall work present?

The biggest challenge lay in creating high-quality 3D models, rigs, and animation, with realistic lighting across a series of characters, while maintaining a high level of built-in flexibility to allow quick response to client feedback.

There’s inevitably a challenge where fully 3D elements sit side by side with live action, since those areas of the shot come under greater scrutiny from the film’s audience. I believe our work stands out, and I’m very proud of what our visual effects team have achieved.

Have you done similar work in the past? How was this work different? How was your approach different?

I was personally involved with Beverly Hills Chihuahua, which presented similar challenges. However, the main difference on Marmaduke was the type of dogs being used. For example, Marmaduke, as a Great Dane, had long jowls and a very creased and wrinkled face, which posed very specific challenges for the 3D pipeline compared to the relatively simple structure of the smaller Chihuahua.

We have seen talking animals in the past, and even currently. Is this different?

Previous talking-animal movies have broadly fallen into two camps. Firstly, the cartoon-style CG characters such as Scooby-Doo, Garfield, and Alvin & The Chipmunks. These feature a very broad style of animation, and the characters are rendered in a humorous and non-photorealistic way. The second popular style is for more realistic animals with more subtle, less characterful performances, like Beverly Hills Chihuahua. What we were asked to do in Marmaduke is combine the two styles and create something that is simultaneously cartoony and photorealistic.

How has the approach to talking animals changed over the past few years?

Originally, many talking animal movies used animatronics and other mechanical effects. In addition, there has always been a tendency to impose rigid parameters on the animal performances while filming. However, as the tool sets and techniques have improved over the years, filmmakers have far more flexibility both at the filming stage and to make editorial decisions based on the best performances. They know that we’re far more able now to accommodate their requirements in postproduction.

What will we see in the future?

There will always be a market for animal-based family entertainment. I grew up watching the Disney Homeward Bound movies and The Littlest Hobo. Previous generations watched Lassie. The modern techniques have changed so that what would once have been voice-over has become a more complete and realistic animal performance. Whatever your feelings about the genre, kids love these films, and there will always be a flourishing market for them at the cinema.

See Clips From Marmaduke:

- Clip #1 "How's My Breath?"
- Clip #2 "The Duke's Ranger"
- Clip #3 "Pretty Good Bee"