Science Project
Volume 31, Issue 9 (September 2008)

Sid the Science Kid, a new television series debuting on PBS Kids this month, may be geared for youngsters, but the technology used to create the animation—a puppeteer-controlled real-time motion-capture system—is far from child’s play.

Sid the Science Kid, produced by the Jim Henson Company in partnership with Los Angeles public television station KCET, is an animated series that encourages curiosity and teaches the foundations of science to preschoolers. Although the show is aimed at children, it is also certain to catch the attention of animators and animation fans everywhere. That’s because Sid is the fruit of the Henson Company’s years-long effort to create the Henson Digital Puppetry Studio, a viable real-time, motion capture-based animation system (formerly known as the Henson Digital Performance System).

In 40 half-hour episodes, the inquisitive five-year-old Sid, with the help of family and friends, gets answers to questions he has about science as it pertains to everyday life: Why do bananas go “bad”? How does my juice box straw work? How does a bird fly without a plane? “With each production, we’ve set the bar higher,” says Kerry Shea, head of digital production. In fact, the group is able to perform five characters in real time, “with four cameras, audio…everything a director needs to see a performance in real time,” she says. “We’ve always been able to do it in bits and pieces, but now we can see it fully fleshed out.”

The concept is the Holy Grail of animation: to be able to direct animation like live action, with real-time feedback and no onerous, costly rendering times. That’s been done to some degree with motion capture, but the Henson Company is after something more finessed, using puppeteers to control facial expressions and fingers, as well as voice the dialog (and sometimes improvise it). The idea was to create different environments, props, and a space large enough for multiple characters to interact—to the point that an episode could be directed exactly like a sitcom.

In fact, live-action director Katy Garretson, who has directed more than 50 prime-time sitcom episodes, had never directed a children’s show, much less an animated one, when she tackled her first episode of Sid. “I was petrified coming into the show,” she admits. “It’s different from anything I had seen before, and it was daunting. I thought I had a lot to bring to it, with my comedy background, but wondered if I could grasp this technology.”

After meetings and discussions with Brian and Lisa Henson, two of Jim Henson’s five adult children who run the privately owned studio, Garretson decided to give it her best shot. Her first surprise came on stage, when she found the camera operators were the same ones she had worked with on live-action productions. Nevertheless, she still had to get used to the stage setup—adult and child characters perform in different areas of the stage, with the puppeteers who control facial expressions and hands, and do the voices, off to the side of the stage—as well as learn the terminology.

“You learn to look at the monitor [and not the stage] when everything is coming together,” Garretson says. “It takes a strong imagination to do a show like this. You don’t see it when you look at it. You have to see it in your mind. When I’m giving direction to a character, I’m talking to the mocap actor doing the body and the puppeteer doing the face and the voice. But that started to become second nature.”

Garretson eventually discovered the freedoms that came with a Henson Digital Puppetry Studio production. “The cameras can do so much more,” she notes. “It’s like having mini Technocranes that can go under, around in circles…things you could never imagine doing in a sitcom. The creativity is huge in terms of what you can do visually. And the puppeteers did such wonderful improvisation.”

Although many of the pieces of the Henson Digital Puppetry Studio were already in place, Sid is the first production to take advantage of a tool set that was complete and speedy enough to direct like a live-action sitcom. That required a specialized production and postproduction pipeline, which was put together by digital effects supervisor Steffen Wild, and innovative character templates and rigging, headed up by CG supervisor Jeff Christie.


The Jim Henson Company’s Digital Puppetry Studio enables the production crew working on the new television series Sid the Science Kid to perform as many as five characters at a time, in real time, directing the animation as they would live action.

Pre-production
This groundbreaking production took place at the Henson Company stages on the historic Charlie Chaplin lot in Hollywood. In a building called the School House, artists turned artwork into clay models of Sid and the show’s other characters. “It’s like a live-action pre-pro experience where you build sets, characters, and costumes,” Shea says. “One thing we employed from live action was a production designer who came in and designed all the sets for us. We found it was incredibly effective.”

Christie and Wild oversaw a crew of 30 people comprising modelers, texture painters, riggers, and look supervisors. After the maquettes were built, Christie helped supervise the modelers as they scanned the shapes into the computer.

“We came up with a single, generic mesh topology that we could get to match up to the sculpted scan,” says Christie. “By having a single template character, we’d have the same vertex count and the same relative vertex position, so that when we were working on skin weights, for example, we could come up with one rig and easily transfer it to another character. It would get you 85 percent of the way there with rigging.”
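
The article doesn’t spell out the transfer mechanics, but the idea is easy to sketch in Maya’s Python interface: because the template guarantees the same vertex count and ordering on every character, skin weights can be copied vertex-for-vertex. The mesh and skinCluster names below are hypothetical; this is a minimal illustration, not the Henson pipeline’s actual code.

```python
# Minimal sketch: copy skinCluster weights between two meshes that share
# the template's vertex count and ordering. All node names (templateMesh,
# sidMesh, and the skinCluster names) are hypothetical.
import maya.cmds as cmds

def transfer_skin_weights(src_mesh, src_skin, dst_mesh, dst_skin):
    """Copy per-vertex joint weights vertex-for-vertex."""
    count = cmds.polyEvaluate(src_mesh, vertex=True)
    assert count == cmds.polyEvaluate(dst_mesh, vertex=True), \
        "template transfer requires identical vertex counts"
    joints = cmds.skinCluster(src_skin, query=True, influence=True)
    for i in range(count):
        weights = cmds.skinPercent(src_skin, '%s.vtx[%d]' % (src_mesh, i),
                                   query=True, value=True)
        cmds.skinPercent(dst_skin, '%s.vtx[%d]' % (dst_mesh, i),
                         transformValue=list(zip(joints, weights)))

# e.g.: transfer_skin_weights('templateMesh', 'templateSkin', 'sidMesh', 'sidSkin')
```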

Using Autodesk’s Maya, each modeler handled two characters. In all, it took about a month to complete each of the six characters. The first pass took a week, the rigging two weeks, and then some tweaking was required. While the Henson Digital Puppetry Studio includes an in-house viewer, Maya is the linchpin, says Christie.

Christie also developed the facial animation system, which is based on placing bones underneath the mesh. The face is broken down into a number of discrete areas, and for each region—for example, right eyebrow, left cheek, and so forth—the group developed a series of control attributes. “We have control attributes that drive a series of bones to manipulate the mesh into the shape we want,” he says. “There’s a collaboration between the puppeteer and the rigger. The puppeteer gets access to [the control attributes] and then can mix them and work with them. As the puppeteer works on the character and develops its unique personality, he or she comes up with little tics and personal movements. They give feedback to the rigger—maybe they want something to move farther or a little control area we haven’t spelled out that they might want control over.”
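
As a rough illustration of that pattern, the Maya Python sketch below adds a keyable control attribute to a face-control node and uses set-driven keys to map its 0-to-1 range onto a brow joint’s translation. The node and attribute names are hypothetical, and set-driven keys merely stand in for whatever connection scheme the Henson rig actually uses.

```python
# Hypothetical sketch of a control attribute driving a face bone.
# 'faceCtrl' and 'rBrowJoint' are assumed to exist in the scene.
import maya.cmds as cmds

# Expose a 0..1 "rightBrowRaise" attribute the puppeteer can mix with others.
cmds.addAttr('faceCtrl', longName='rightBrowRaise',
             attributeType='double', min=0, max=1, keyable=True)

# Map attribute values to bone translation: 0 = rest pose, 1 = brow fully raised.
for driver_value, translate_y in [(0.0, 0.0), (1.0, 0.4)]:
    cmds.setDrivenKeyframe('rBrowJoint.translateY',
                           currentDriver='faceCtrl.rightBrowRaise',
                           driverValue=driver_value, value=translate_y)
```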

According to Wild, the pipeline was created from scratch to handle this “interactive, very immersive process whereby actors and puppeteers can work together in one virtual environment on the fly in real time.” He adds, “We have takes that are about 4000 frames long without a camera cut. To do that in a keyframe environment, we would have to sit for months and months. Things are possible in this environment that are unique.”

The machines were “beefy enough” to handle two million polygons per scene, in real time. “That encompasses the environment and up to five characters,” Wild says. For this real-time approach, the team had to enhance the system so that the memory footprint is 4GB of RAM per core. The Henson Company uses eight-core computers, so eight processors work with 32GB of RAM in every machine. The computers themselves rely on a mix of AMD and Intel chips.

Production
The Henson Company’s main production software for Sid is Maya 2008, but the Henson Digital Puppetry Studio supplies the code Maya lacks for capturing real-time data on stage. “There’s no real-time format out there,” Wild says. “We tested a couple of products, but we had unique requirements of up to five characters interacting with one another, and not only body motion, but simultaneous facial motion as well as finger motion, plus being able to capture up to four cameras in real time. There was no off-the-shelf package available that could do that, so the Henson Company developed it.”
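
The Henson format itself isn’t public, but the requirements Wild lists imply what each frame of such a stream must carry. The Python sketch below is purely illustrative: one record per frame, holding body, face, and finger channels for up to five characters plus four camera transforms, all sharing a single frame stamp.

```python
# Illustrative data layout only -- not the Henson Company's format.
from dataclasses import dataclass
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class CharacterFrame:
    body_joints: Dict[str, Vec3]       # mocap skeleton rotations
    face_controls: Dict[str, float]    # puppeteer facial control attributes
    finger_controls: Dict[str, float]  # puppeteer finger channels

@dataclass
class CaptureFrame:
    frame: int                             # global frame stamp for sync
    characters: Dict[str, CharacterFrame]  # up to five performers
    cameras: Dict[str, Tuple[Vec3, Vec3]]  # position, rotation for four cameras
    audio_offset: int                      # sample offset into the audio take
```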

Once the five characters, 11 environments, and multiple props were built and ready to go, production moved to a massive soundstage that contained a Motion Analysis system handling a 40- by 60-foot motion-capture volume. In the middle of the stage are the mocap performers, wearing the familiar suit with optical dots and an unusual rig. “Henson came up with Outriggers, which is what he used for extensions of the body,” explains Shea. “We do this so the performers know the extent of their [CG] character. You don’t want to put your hand through your stomach. With the Outriggers, it’s like wearing a costume, so they really get to know the physicality of their character.”

On the perimeter of the stage sit the puppeteers, who control the facial animation with a hand-manipulated rig. Two performers—the mocap performer and the puppeteer—create each character. Everywhere around the stage are monitors, just as on a live-action soundstage, except that these show the 3D CG characters walking, talking, and moving in the CG environment in real time.


Artists modeled and rigged the characters in Autodesk’s Maya, while puppeteers provided the movement behind the animation.

“At the back of the stage, three camera operators are in front of controls, moving cameras,” says Shea. “You have a hanging screen with a quad split, so you’re seeing all the camera angles of the characters, all at the same time. The director has two monitors: the quad split and, on the other, the camera angle that is being called by the TD (in this case, the television director, who calls the cameras) upstairs—just like a real sitcom.”

Also in the booth are someone recording audio and a script supervisor—again, just like TV. Instead of rolling film, the group has what it calls a “mission control” operator who, when the director calls “Action!,” hits a Record button that records all the camera angles, animation, and audio at the same time. “I truly envision it as digital dailies. If you said, ‘Roll back tape,’ that’s what they’d have,” Shea notes.

All those cameras that are able to go anywhere in the environment can make for tricky scenes. “The camera operators are performers as much as the mocap performers or puppeteers,” says Christie. “They’re operating in real time, and the switch director is following it on the fly. It’s quite a choreography.” To assist in that choreography, the Henson Company developed a way to manage the different assets: characters, props, sets, and environments. “If we have Sid’s bedroom, composed of walls and floors, and populate it with various props, like a bed and bookshelf, all those assets are referenced into the scene and can be updated at any point in time with textures as materials develop,” he explains. “We can turn off a wall if we need a camera to fly in. And we can have the walls broken down into sections, so we can also turn off a particular section if need be.”
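
Maya’s standard file-referencing commands support exactly this kind of scheme. The short Python sketch below shows the pattern Christie describes, with hypothetical file paths and node names standing in for the production’s actual assets.

```python
# Sketch of the referencing scheme; paths and node names are hypothetical.
import maya.cmds as cmds

# Reference the bedroom set and a prop so textures and materials can be
# updated in the source files at any point in time.
cmds.file('/assets/sets/sidBedroom/master.ma', reference=True,
          namespace='bedroom')
cmds.file('/assets/props/bed/master.ma', reference=True, namespace='bed')

# Turn off a wall (or just one section of it) so a camera can fly in.
cmds.setAttr('bedroom:northWall.visibility', 0)
cmds.setAttr('bedroom:eastWall_sectionB.visibility', 0)
```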

After a day of shooting, the director picks the takes he or she likes, and those takes are hardware-rendered. “They create circle takes, just like in TV,” says Shea.

“This is a very similar process to live action where we’d send out our dailies,” adds Wild. “We have a special hardware renderfarm for it. If a take is 5000 frames long, they can render for all four cameras, which is 20,000 frames.”

Postproduction
According to Wild, the Henson Company has also written custom code to automatically import and export animation into and out of Apple’s Final Cut Pro via its XML description language. “That way, the editors don’t have to do that work,” he points out. “They can start working the same day. If we have the stage going in the morning and a circle take is called, that render starts instantly. By late afternoon, all those frames are available for editorial. Editorial takes two and a half to three days to put together an entire episode.”
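
The Henson exporter is proprietary, but the general idea—describing a rendered take in Final Cut Pro’s XML interchange format (xmeml) so editorial can pick it up without manual logging—can be sketched in a few lines of Python. The take name, path, and frame counts below are hypothetical.

```python
# Rough sketch of writing a take as a Final Cut Pro XML (xmeml) clip.
# Element layout is simplified; real xmeml clips carry more metadata.
import xml.etree.ElementTree as ET

def take_to_fcp_xml(name, media_path, frames, timebase=30):
    root = ET.Element('xmeml', version='2')
    clip = ET.SubElement(root, 'clip', id=name)
    ET.SubElement(clip, 'name').text = name
    ET.SubElement(clip, 'duration').text = str(frames)
    rate = ET.SubElement(clip, 'rate')
    ET.SubElement(rate, 'timebase').text = str(timebase)
    media = ET.SubElement(clip, 'file', id=name + '-file')
    ET.SubElement(media, 'pathurl').text = 'file://' + media_path
    return ET.tostring(root)

# e.g.: take_to_fcp_xml('ep12_sc04_tk03_camA',
#                       '/renders/ep12/sc04/tk03_camA.mov', 5000)
```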

At this point, says Wild, the pipeline becomes more similar to that of the typical 3D CGI workflow. “Now we have the selects and we have the scene, take, and camera information,” he says. “We are basically shot-based, where we know exactly what to light and render for the final product.” The team uses Mental Images’ Mental Ray for rendering.

Prior to this process is the animation cleanup. “Although our performers on stage are all very good, in the heat of the action, there could be a hand going through a table,” Wild says. “In cleanup, we adjust for those little inconsistencies and make that a seamless integration of the details.”

A challenge for the entire production and post pipeline was to come up with a system to manage all the data for the 40 all-CGI episodes. The group used Cirque Digital’s GDI database system as the front-end interface to the Henson Company’s proprietary database, though the GDI interface was customized to act as a virtual assistant. “If an artist creates a new file, the system automatically knows what to do with it,” says Wild. “There’s no headache dealing with administrative stuff, and the artists can focus on the creative.”

Christie adds that GDI enabled the group to have as many as 999 versions of any character, environment, or prop. “We always tie the latest good version to a master version, and that master version is the source for use in the scene,” he explains. “We also have our own in-house database running on Linux, and that database is the back end to GDI. GDI gives us the interface to that database.” Another piece of Henson-written code allows the animator to publish to GDI directly from Maya.
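
The GDI interface itself is proprietary, so the Python sketch below only illustrates the version-then-repoint pattern Christie describes: each publish lands in a numbered slot (up to 999), and a “master” link is repointed at the latest good version. The directory layout and three-digit naming scheme are assumptions.

```python
# Hypothetical sketch of versioned publishing with a repointable master.
import os
import shutil

def publish(work_file, asset_dir):
    """Copy a scene file into the asset's version store and repoint master."""
    versions = [f for f in os.listdir(asset_dir) if f.startswith('v')]
    next_num = len(versions) + 1
    assert next_num <= 999, "version limit reached"
    versioned = os.path.join(asset_dir, 'v%03d.ma' % next_num)
    shutil.copy2(work_file, versioned)
    master = os.path.join(asset_dir, 'master.ma')   # source used in scenes
    if os.path.lexists(master):
        os.remove(master)
    os.symlink(versioned, master)  # master always tracks the latest good version
    return versioned
```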

Real-time Animation
In Shea’s opinion, the experience of marrying the worlds of live action and animation was efficient, despite the surprises. “When you expand the director’s tool set and give her a bigger box of crayons, you’re always amazed at what she can come up with in the moment,” Shea notes. “And you have to be ready.” For example, traditional CG productions rely on painstakingly developed storyboards.

“We don’t have storyboards,” Shea adds. “We have multiple cameras, and you never know where the director will put them. You get great shots that way, but as CG production people, we always have to be prepared for the unexpected.”

That is exactly what happened in the last episode, where the group decided to add a dog to the mix. “We successfully motion-captured a dog and performed its head and face in real time,” says Shea. “I’ve never seen anything sillier than a real dog running around in a mocap suit, but our lead character, Sid, had a dog.”

From Wild’s point of view, the beauty of the Henson Digital Puppetry Studio is that the production results look outstanding yet they are accomplished with a small crew. “We can keep it all in-house and maintain creative control,” he says. In other words, there is no need for any kind of outsourcing.

“This is a serious use of technology and an advanced application that everybody realizes is needed,” notes Shea. “What the Henson shop has done is remove the technology from the creative process, so the people giving performances and making creative decisions don’t even notice the huge firepower required to make it happen. Their skill is what makes it all worthwhile.”

Debra Kaufman is a freelance writer in the entertainment industry. She can be reached at dkla@ca.rr.com.