Moving the Future
Volume 31, Issue 2 (Feb. 2008)

They’re an unlikely couple—Paul S. Otellini, Intel CEO, and Steve Harwell, lead singer of the alt rock band Smash Mouth. But, with the help of VR, motion-capture, and other leading-edge technology companies, they made music in what Otellini calls “third life” at this year’s Consumer Electronics Show in Las Vegas.

Onstage, Otellini begins the keynote by stepping through Intel’s vision of a future in which the Internet becomes personal: “proactive, predictive, and context-aware.” To demonstrate, he takes the audience on a tour of Beijing with the help of a set that looks like the outside of a Chinese restaurant, a 15x50-foot screen, a few off-stage processors, and Total Immersion’s D’Fusion augmented-reality software.

A colleague points a cell phone at a street sign and sees the English translation replace the Mandarin characters on his screen. Similarly, when he points his cell phone at a menu on the side of the restaurant, he sees pictures of the food and a video review overlaid on the video captured by his phone.
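
Conceptually, the overlay step is simple once the phone has detected the sign and translated its text. The sketch below assumes those pieces (the frame, the detected region, and the translated string) as inputs; Total Immersion’s actual D’Fusion pipeline is proprietary and far more involved.

```python
# Minimal sketch of the AR overlay step: paint a translated label over the
# detected sign region of a camera frame. The frame, the region, and the
# translated text are all assumed inputs for illustration.
from PIL import Image, ImageDraw

def overlay_translation(frame, box, text):
    """Cover the detected sign with an opaque label showing the translation."""
    out = frame.copy()
    draw = ImageDraw.Draw(out)
    draw.rectangle(box, fill="white", outline="black")   # hide the original sign
    x0, y0, x1, y1 = box
    draw.text((x0 + 6, (y0 + y1) // 2 - 6), text, fill="black")  # default font
    return out

frame = Image.new("RGB", (320, 240), "gray")             # stand-in camera frame
result = overlay_translation(frame, (60, 40, 260, 90), "Beijing Road")
result.save("overlay.png")
```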

Otellini notes that although this is only a demo, with more powerful and energy-efficient processors, ubiquitous wireless broadband infrastructure, a context-driven Internet, and more natural user interfaces, we can expect to have applications such as these in our hands “in a blink.” Now, it’s time for the climax.

The Smash Mouth band members are already online, connecting from wherever they are in the world. Harwell takes the mike, and by using eJamming’s AUDiiO software, sings “Walking on the Sun” with his band. They’re perfectly in sync. “I can’t believe how close in time that is when we’re this far apart,” says Harwell.
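
Harwell’s surprise has a physical basis: even at light speed, audio takes measurable time to cross a continent, and musicians generally notice delays beyond roughly 25 to 30 milliseconds. A back-of-the-envelope sketch (the distance and the speed factor here are illustrative assumptions, not eJamming’s measured figures):

```python
# Rough propagation-delay estimate for networked jamming. Real latency also
# includes encoding, buffering, and routing, so this number is only a floor.
def one_way_delay_ms(distance_km, fraction_of_c=0.66):
    """Signal travel time in fiber; light moves at roughly 2/3 c in glass."""
    c_km_per_ms = 299_792.458 / 1000.0    # ~300 km per millisecond in vacuum
    return distance_km / (c_km_per_ms * fraction_of_c)

# ~4,000 km is an illustrative coast-to-coast distance.
print(f"4,000 km one way: {one_way_delay_ms(4000):.1f} ms before any processing")
```

At roughly 20 ms of raw propagation delay, there is little headroom left under the point at which ensemble playing starts to feel late, which is what makes the tight sync notable.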
 
Steve Harwell, lead singer for Smash Mouth, jams with his band in real time and in a virtual garage during the keynote address at CES, even though they’re in different parts of the world.

But Otellini hasn’t finished. He brings Big Stage cofounder Jonathan Strietzel onto the stage. Big Stage, a start-up that debuted in December 2007, converts photographs into 3D avatars. Strietzel takes three photographs of Harwell’s face: one from the front and one from each side. From those photographs (in this case, photos Strietzel took earlier), the software creates a 3D model texture-mapped with Harwell’s face in less than a minute. Digital Steve Harwell is bald, but he has the real Steve’s eyebrows and goatee, and he can wink, blink, and move his mouth. After a little funny business, in which Strietzel gives Harwell a blond Mohawk, puts him on a still image of a chopper, and changes his hair and expression, Strietzel pulls up a Smash Mouth music video and replaces Harwell’s head with an Otellini avatar.

Then, Otellini shows Harwell a video on the big screen.

“That’s my neighborhood,” Harwell exclaims as the video rolls on the 15x50-foot onstage screen. The camera stops in front of a garage.

“We’ll raise the garage door,” says Otellini.

“That’s my band right there,” says Harwell.

On screen, the band members, who are in different parts of the world, appear in avatar form at their instruments. Harwell steps into Organic Motion’s space onstage, and his avatar joins the band in the garage. As the band plays, you can see Harwell’s avatar in the garage replicate his onstage motion in real time as he sings “All Star,” one of the band’s songs from the Shrek soundtrack. It was the first-ever virtual live jam session.

eJamming handled the sound, which was separate from the video. Serious games developer Virtual Heroes used Epic Games’ Unreal Engine 3 technology to pull the garage and the graphics together.

“We had a month to put it all together,” says Andrew Tschesnok, Organic Motion CEO. “There was some work getting our motion data triangulated into what the Unreal Engine uses, but most of the work was on the graphics side. Virtual Heroes put the instruments into the garage and got the avatar heads to sit right on the shoulders.”
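
The conversion work Tschesnok describes typically boils down to mismatched conventions between systems. As a hypothetical illustration (the actual mapping the two teams used isn’t public): many capture systems emit right-handed, Y-up coordinates in meters, while Unreal Engine expects left-handed, Z-up coordinates, with a unit scale that has varied across engine versions.

```python
# Hypothetical axis/unit conversion from a right-handed, Y-up mocap space
# (meters) to Unreal Engine's left-handed, Z-up space. The centimeter scale
# and the axis mapping are assumptions; the real Organic Motion-to-Unreal
# conversion is not documented publicly.
def mocap_to_unreal(x, y, z):
    """x=right, y=up, z=toward viewer (mocap) -> x=forward, y=right, z=up (Unreal)."""
    scale = 100.0                # assume meters -> centimeters
    return (-z * scale,          # mocap -Z (into the scene) becomes Unreal forward
             x * scale,          # mocap right stays right
             y * scale)          # mocap up becomes Unreal up

print(mocap_to_unreal(0.5, 1.8, -2.0))   # -> (200.0, 50.0, 180.0)
```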

During the keynote, data from Organic Motion moved via Ethernet cable to a server behind the stage that drove the characters on the big screen. But, during the month of preparations, Organic Motion worked remotely with Virtual Heroes. “The Virtual Heroes people in North Carolina were running software that drove the garage graphics, and we were in Manhattan,” Tschesnok says. “We needed to figure out how far Steve could walk, so I stepped into our booth and they’d say ‘Move a little to the left.’ One time, I threw up my hands, and they said, ‘OK, OK, just relax.’ It was surreal. I’m on the phone and they’re watching me.”

The Organic Motion software captures motion using only cameras; the subject being captured doesn’t wear markers. “Marker-based systems see dots,” Tschesnok explains. “We see the entire body and figure out what the dimensions are.” Built into the system is a theoretical human model, which speeds the calibration process: when Harwell stepped into the white-walled space onstage, calibration was nearly instant, and the 14 cameras were ready to go. He didn’t need to wear a special suit or markers of any kind.
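
The principle behind that model-based approach can be sketched in miniature: maintain a body model, render candidate poses, and keep the pose whose rendering best overlaps the observed silhouette. The toy below uses one camera and a single one-joint “limb”; Organic Motion’s actual solver is proprietary and fits a full body from 14 views.

```python
# Toy model-fitting loop behind markerless capture: score candidate poses of
# a body model against an observed silhouette and keep the best match.
# Single camera, single rotating limb -- purely to show the principle.
import numpy as np

H, W = 64, 64

def render_limb(angle):
    """Rasterize a 25-pixel limb from the image center at the given angle."""
    img = np.zeros((H, W), dtype=bool)
    for t in np.linspace(0.0, 25.0, 100):
        r, c = int(H / 2 - t * np.sin(angle)), int(W / 2 + t * np.cos(angle))
        if 0 <= r < H and 0 <= c < W:
            img[r, c] = True
    return img

observed = render_limb(0.7)                  # stand-in for a camera silhouette

best_angle, best_iou = None, -1.0
for angle in np.linspace(0.0, np.pi, 180):   # sweep candidate joint angles
    cand = render_limb(angle)
    iou = np.logical_and(observed, cand).sum() / np.logical_or(observed, cand).sum()
    if iou > best_iou:
        best_angle, best_iou = angle, iou

print(f"recovered angle: {best_angle:.2f} rad (true value 0.70)")
```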

The white walls help the cameras distinguish subjects from the background in the captured images. “Our technology isn’t confined to a space, but without the white background you’d need more processing power,” Tschesnok says. “For our first product, we focused on putting together a turnkey system.”
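
In code, the benefit of the white backdrop is easy to see: against a near-uniform white wall, “not white” does most of the segmentation work. The threshold and the synthetic frame below are assumptions for illustration.

```python
# Minimal silhouette extraction against a white backdrop: any pixel with a
# channel well below the white level is treated as foreground.
import numpy as np

def silhouette(frame, white_thresh=230):
    """Boolean mask: True where a pixel is darker than the backdrop."""
    return (frame < white_thresh).any(axis=-1)

frame = np.full((240, 320, 3), 255, dtype=np.uint8)   # synthetic white wall
frame[80:200, 140:180] = (90, 60, 50)                 # stand-in for a subject
mask = silhouette(frame)
print(f"foreground pixels: {mask.sum()}")             # 120 x 40 = 4800
```

Against a busy background, that one-line test becomes a full background-modeling problem, which is the extra processing power Tschesnok alludes to.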

The first systems Organic Motion plans to release this spring include the 14 custom cameras and a proprietary hardware/software system that plugs into an off-the-shelf computer. The proprietary system processes the images captured by the cameras and sends the resulting data to Autodesk’s MotionBuilder, C-Motion’s Visual3D, or other software.
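
The handoff to tools such as MotionBuilder implies some wire format for the solved skeleton data. That format isn’t public, so the sketch below invents a trivial JSON-over-UDP packet of per-joint rotations purely to show the shape of such a pipeline; every name and number in it is hypothetical.

```python
# Hypothetical mocap streaming loop: send solved joint rotations to a
# downstream consumer each frame. The joint list, port, and JSON packet
# format are invented for illustration; the real protocol is proprietary.
import json
import socket
import time

JOINTS = ["hips", "spine", "head", "l_arm", "r_arm"]    # illustrative subset

def stream_pose(host="127.0.0.1", port=5005, fps=60, frames=3):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for frame in range(frames):
        pose = {"frame": frame,
                "joints": {j: [0.0, 0.0, 0.0] for j in JOINTS}}  # Euler angles
        sock.sendto(json.dumps(pose).encode("utf-8"), (host, port))
        time.sleep(1.0 / fps)
    sock.close()

stream_pose()
```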

Tschesnok expects it will be three to five years before consumers have personal access to the technology, but they will use it long before that. “Retail access will happen very soon with our product,” he says, noting that the company is signing deals with international manufacturers that make consumer products. The manufacturers will then work with retailers. “This isn’t going to be an absolutely huge thing overnight,” he adds. “We’ll supply quantities in the hundreds, not the thousands. But you should see something in the six- to nine-month timetable.”

What kinds of things? Tschesnok can’t provide details of an actual application. “I can use golf as an example, though, because it isn’t golf,” he says. “Imagine you go into a store, swing a golf club, and then get a custom golf club built for yourself based on your swing. This hasn’t been possible before because the manufacturers were dependent on store clerks for data, so there was a quality-control issue. But, when you have a computer giving consistent results based on statistics, you can base a manufacturing process on that.” Similarly, a runner might get custom shoe inserts manufactured based on a treadmill run.

“Some of the most interesting applications are from people who approached us,” Tschesnok says. “We didn’t approach them. They’ve ranged from theme parks to medical research—anything dealing with human motion. It’s hard for us to guess what the next big thing might be.”

Now, Tschesnok hopes that the spotlight Intel focused on the company during CES will help it reach people who might not have been aware of its motion-capture technology. “There are so many different areas that can use this technology,” he says. “So, we were thrilled to be part of the keynote. It really helped introduce the idea that you can see people and quantify what they’re doing.”
Or, as he puts it, show them: “Here’s this thing that can see you move.”

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.