Produced quickly with Xsens mocap technology, the film was recently an official selection at Comic-Con International, proving that the right tools are all it takes to update a rom-com.
ENSCHEDE, NETHERLANDS – How do you catch robot love on film? The answer for Chapman University’s Dodge College of Film and Media Arts students was “Some Like It Bot,” a short film that depicts the clumsy first date of a human and his cyborg crush.
The centerpiece of the film is a robot-only bar that the lead human has to sneak into for his date. This set piece allowed the team to design an accompanying cast of interesting background characters. In doing so, however, they quickly realized they’d need a unique approach to capturing them on screen.
“We had some really fun ideas like bartenders that served from the ceiling and intimidating 12-foot-tall robot bouncers; a cast like that could have cost us quite a bit of time and money if we didn’t have Xsens MVN,” says Alex U. Griffin, a recent graduate of Chapman University’s Dodge College of Film and Media Arts and budding filmmaker. “With the suit on hand, our team could act out our moves anywhere and then feed that data into MotionBuilder. The characters that we visualized in our heads began coming to life before our eyes.”
The experienced cast and crew spent the next few weeks defining a production plan that would help expedite the film. Griffin gave extra attention to how actors interacted with the cameras and the Xsens MVN suit, so the live-action shots could be captured as seamlessly as possible.
“Our actors (myself included) knew from the rehearsals and shot plans where the cameras were going to be, and more importantly, where their sightlines were,” says Griffin. “This way, we could use motion capture for the actors to play off their digital characters on set. After all, it’s one thing to animate a whole world. A live-action blend with digital counterparts is a whole different beast!”
When the students were ready to start filming, the Xsens team paid a visit to help them kick it off. After a morning walkthrough that included a quick system set-up and information on how to raise the output quality, students started to experiment with the system’s capabilities. Three days later, the entire film had moved on to post-processing.
“We tried some real-time filming with the Xsens MVN suit, often doing a shot with an actor as the robot and another with our live human interacting with them from a distance, in order to keep our frame clean,” says Griffin. “Our crew really liked using this technique with over-the-shoulder shots or close ups. Everyone had a basic idea of where they were in the shot so we were able to match eyelines on set instead of spending a ton of time re-shooting or editing in post.”
That weekend of production was followed by a quick post workflow, thanks to the clean data from Xsens’ inertial system. This came as a welcome change for students accustomed to keyframing for hours or even days after filming.
“Since we were shooting live-action, we relied on Xsens MVN for its flexibility to work on set. After our quick ten-minute calibration at the beginning of each day of production, our crew was ready to film. We found that Chapman’s optical motion capture room really only works well for an animated piece,” notes Griffin.
Recently, “Some Like It Bot” was an official selection at Comic-Con International, where its creators led a short Q&A session at a screening of the film. With “Some Like It Bot” under his belt and headed toward even more film festivals, Griffin is interested in seeing what motion capture can do, especially in blending with and enhancing live-action footage.
“Working with Xsens’ technology really opened up this kind of live-action production for us,” notes Griffin. “We were amazed at what we could do with this size of budget and crew. I can’t wait to get ahold of the suit again to tackle an even more ambitious project.”