Getting Personal
Volume 25, Issue 10 (October 2002)

Ask any group of people you know to list the ways computers are changing, and chances are they would start by saying that systems are getting smaller, faster, and cheaper. After that, they might add that computers are becoming more ubiquitous and connected. And finally, they might mention that input and output devices are improving or perhaps even getting better at sensing their surroundings. Indeed, most people would probably produce this same list, regardless of their age, education, or profession. And they would all be right.

But is the list as it should be? Is this the right order of research priorities for computer system developers? No way. In fact, the items on the list are in exactly the reverse order if computers are to better meet our needs and if the computing industry is to halt its fiscal decline. One industry visionary who promotes this view is Bill Buxton, chief scientist at Alias|Wavefront and a speaker on a panel titled "The Future of Computer Graphics" at the most recent Siggraph conference. The problem with focusing on making computers smaller, faster, cheaper, and so forth, he says, is that each of these is a purely technological attribute, and technology for its own sake matters little in terms of what will have social and economic importance.

Buxton is right. Rather than taking a techno-centered approach, developers need to begin creating computers that serve people, instead of continuing to require that people serve computers. What should be at the top of the priority list, instead of at the bottom, is the development of systems that can make sense of their environment, in terms of recognizing the identity, location, movement, and speech of the people around them. For example, understanding a person's identity and location is critical because it provides context, Buxton explains. "When I'm at the podium, you know I'm the speaker. You know that the other panelists are speakers, even though they're not speaking. We need to spend time making such semantics implicit for machines so we don't have to specify them explicitly." But this has not happened because we're still focused on technology rather than on people, he says. "And there's not a computer science department in the world where in order to get a degree in computer science it's necessary to have ever written a program used by another human being."

While that may be true, at least one university-based alliance is striving to create a new, "human-centered" form of computing. Called Project Oxygen, the MIT-industry partnership aims to develop systems that will enable computers to recognize us, understand our gestures, converse with us in plain English (or other languages), interact with other machines and computers without our intervention, and be available anytime, anywhere—like oxygen.

With seed funding from DARPA and support and research from leading high-tech companies including Hewlett-Packard, Philips, Nokia, Acer, Delta Electronics, and NTT, the five-year project has led to significant advances in the two years since its inception. For example, at the second annual meeting of Oxygen's partners held recently, the researchers introduced a host of new technologies, including an integrated vision and speech system that tracks a speaker's location and arm position, extracts the speaker's voice from background noise, and responds to gestures and spoken commands. In fact, the theme of Oxygen's two-year progress report was the integration of speech recognition, vision systems, and other state-of-the-art technologies to achieve the participants' grand vision of pervasive, intelligent, and virtually invisible technology that responds to us and serves our needs.

The approach may someday lead to systems that have existed only in the realm of science fiction. For instance, developers envision creating applications such as a "holodeck" office, with an array of microphones for speech recognition, multiple cameras for face and gesture recognition, and huge 3D screens mounted on each wall for displaying visual information to the user.

Integration between components such as handheld devices and larger displays is also on the Oxygen agenda. Accomplishing that would, for example, enable a user listening to a mobile phone or viewing its display to walk into an Oxygen environment and see visuals of the person on the other end of the line or other images automatically displayed on a nearby wall-mounted screen.

Despite these recent advances and lofty goals, a tremendous amount of work remains before such human-centered systems are ready for users. But the potential payoff is huge. Indeed, the technology could be used in countless new ways and ultimately reach a vast new market of consumers, numbering perhaps in the billions, including those who have never used a computer before.

Project Oxygen is a breath of fresh air for the computer industry. And, to be fair, so is the work being done by a select few other researchers, including Buxton and his group. Focusing on building smaller, faster, cheaper, technology-centered systems will not lead us out of the economic doldrums. As Buxton warns, "We understand pretty pictures in this community. But it's not about putting pixels on the screen; it's about interacting with pixels on the screen. Unless we put human beings back in front, we're going to stay in this slump. We're going to get more and more arcane papers. And we're going to become more and more irrelevant." It's time to finally put the person in personal computing.

A New Look

By now you may have noticed that Computer Graphics World has a new look, complete with a new cover layout and logo. But that's not all that's new about the magazine. We also have a sharper focus—spelled out in our updated tag line: "The magazine for digital content professionals." And we have a tighter style—aimed at delivering more articles on innovative applications and products for sophisticated users in film, TV, DV, gaming, the Web, and graphic arts. We hope you like our new design and direction. Please drop us a line and let us know what you think. —The editors