Volume 25, Issue 8 (August 2002)

reality redefined



Just emerging from the labs, augmented reality (AR) blends computer graphics and the world around us for a variety of uses

Through the computerized looking glass of augmented reality (AR), you may, like Alice in Wonderland, see the real world virtually annotated with informative labels, or what's visible overlaid with what's usually hidden, or animated playmates fading in and out of view. Such real-time, location-based blending of reality and computer-generated images could soon be widely used in education, tourism, defense, medicine, manufacturing, and entertainment.

AR is still largely confined to academic and corporate research labs. Nonetheless, it has become far more practical in the three years since Computer Graphics World last visited this realm of technology in depth (see "Better than Real," pg. 32, February 1999). Augmented reality has gone from custom-built, proof-of-concept systems to readily available components and software, with everyday applications now being prototyped and tested. In New York, a virtual restaurant guide displays reviews that float beside the actual eateries near the Columbia University campus; at the University of North Carolina, breast biopsies are guided with sonogram images superimposed on patients; and from Japan have come AR games that have delighted Siggraph attendees in past years. This year at Siggraph, several exhibits offered further evidence of AR's imminent emergence from the labs.
The Battlefield Augmented Reality System (BARS) provides a test bed for military applications that could guide the soldier of the future through hostile terrain. Some form of AR informational display is expected to be part of standard GI equipment before




Increasingly in the graphics community, augmented reality is seen as one point along a continuum known as "mixed reality," a taxonomy first proposed in 1994 by Paul Milgram and Fumio Kishino. In between 100 percent real-world environments and entirely computer-generated virtual environments lies augmented reality, where virtual objects are added to real environments (there's also the less-used 'toon-style "augmented virtuality," where real objects are added to virtual environments).

In principle, auditory or other sensations can be associated with features in the augmented world, but most work focuses on integrating visual information with the perceived environment. Aligning computer graphics with a user's point of view was pioneered by virtual reality and heads-up display researchers, so the head-mounted display (HMD) they use is also the preferred AR display medium. In such eyewear, left- and right-eye LCD computer images either are reflected onto a half-silvered mirror with see-through views of the real world or are shown on opaque displays mixed with real-world stereoscopic views provided by miniature video cameras perched on the HMD. For proper alignment of computer graphics and real-world scenes, an AR system must precisely measure a user's location and viewing orientation. The computing power needed to process this information, render left- and right-eye images, and superimpose them on real-world or video views with sufficiently little lag was beyond portable systems until recently.
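To make the registration step concrete, here is a minimal Python sketch of what such a system must compute once tracking data is in hand: a world-anchored annotation is transformed into the viewer's head frame (position from one tracker, orientation from gyroscopes and a compass) and projected onto the display. The coordinate conventions, focal length, and sample numbers are illustrative assumptions, not drawn from any particular system described here.

```python
import numpy as np

def orientation_matrix(yaw, pitch, roll):
    """Head orientation (radians) as a rotation matrix, e.g. as reported by an
    inertial/compass tracker. The axis convention is illustrative."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    r_yaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])    # about the up (y) axis
    r_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # about the right (x) axis
    r_roll = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # about the forward (z) axis
    return r_yaw @ r_pitch @ r_roll

def project_annotation(world_point, head_position, head_rotation, focal_px, center_px):
    """Project a world-anchored label into display pixel coordinates.
    World frame here: x right, y up, z forward. Returns None if the point
    is behind the viewer."""
    p = head_rotation.T @ (np.asarray(world_point, float) - np.asarray(head_position, float))
    if p[2] <= 0:
        return None
    u = center_px[0] + focal_px * p[0] / p[2]
    v = center_px[1] - focal_px * p[1] / p[2]   # flip y: pixel rows grow downward
    return u, v

# Hypothetical numbers: a restaurant label 30 m ahead of the user, at storefront height.
label = [0.0, 3.0, 30.0]
head = [0.0, 1.7, 0.0]
R = orientation_matrix(yaw=0.0, pitch=0.0, roll=0.0)
print(project_annotation(label, head, R, focal_px=800, center_px=(640, 360)))
```

The lag the paragraph mentions comes from having to repeat this transform-and-project step for every annotation, for both eyes, each time the trackers report a new pose.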
The Mobile Augmented Reality System (MARS) developed at Columbia University was the first to demonstrate AR outdoors. In the background is a local eatery made famous by "Seinfeld," with its restaurant-guide entry displayed in AR.




Off-the-Shelf Augmentation
At Columbia University's Computer Graphics & User Interfaces Laboratory, the Mobile Augmented Reality System (MARS) is the result of a decade of development led by professor of computer science Steve Feiner. Only in 2001 did commercially available portable computers become sufficiently powerful to replace custom-assembled configurations. The current MARS, used for the virtual restaurant guide, a historical tour of the Columbia campus (including virtual buildings no longer there), and other outdoor demos, is still cumbersome: its principal components, mounted on a backpack, weigh 26 pounds. These include a laptop computer with a Pentium III processor running at 1GHz (boosted by a GeForce2 Go graphics processor), extra batteries to power the display glasses, and position and orientation devices. Location sensing accurate to within centimeters is implemented with a real-time kinematic GPS receiver, hence the flying-saucer-shaped antenna attached above the apparatus. For orientation, atop the Sony Glasstron see-through HMD is an InterSense IS-300 Pro inertial/magnetic tracker, which combines miniature gyroscopes and accelerometers to detect head movements with an electronic compass that establishes the orientation of the user's head relative to the terrestrial magnetic field.
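GPS fixes arrive as latitude, longitude, and altitude, so an outdoor system of this kind must first express them in a local Cartesian frame before graphics can be registered against them. The sketch below shows the standard geodetic-to-Earth-centered-to-east-north-up conversion; the sample coordinates are hypothetical, only loosely placed near the Columbia campus, and the code is an illustration of the principle rather than MARS's own software.

```python
import numpy as np

# WGS-84 ellipsoid constants
A = 6378137.0                  # semi-major axis, meters
F = 1 / 298.257223563          # flattening
E2 = F * (2 - F)               # eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert a latitude/longitude/height fix to Earth-centered Cartesian meters."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1 - E2 * np.sin(lat) ** 2)
    x = (n + h) * np.cos(lat) * np.cos(lon)
    y = (n + h) * np.cos(lat) * np.sin(lon)
    z = (n * (1 - E2) + h) * np.sin(lat)
    return np.array([x, y, z])

def ecef_to_enu(p, lat0_deg, lon0_deg, h0):
    """Express an Earth-centered point in an east-north-up frame anchored at a reference fix."""
    lat0, lon0 = np.radians(lat0_deg), np.radians(lon0_deg)
    d = p - geodetic_to_ecef(lat0_deg, lon0_deg, h0)
    east = np.array([-np.sin(lon0), np.cos(lon0), 0.0])
    north = np.array([-np.sin(lat0) * np.cos(lon0), -np.sin(lat0) * np.sin(lon0), np.cos(lat0)])
    up = np.array([np.cos(lat0) * np.cos(lon0), np.cos(lat0) * np.sin(lon0), np.sin(lat0)])
    return np.array([east @ d, north @ d, up @ d])

# Hypothetical fixes: the user and an annotated restaurant a block or so away.
user = (40.8050, -73.9660, 40.0)
restaurant = (40.8054, -73.9655, 40.0)
print(ecef_to_enu(geodetic_to_ecef(*restaurant), *user))   # offset in meters: east, north, up
```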

Augmented reality research at Columbia has been supported by grants from the Office of Naval Research, which is also funding a team at the Naval Research Laboratory in Washington, DC to replicate the civilian MARS system and adapt it for prototyping military applications as the Battlefield Augmented Reality System (BARS). Heads-up displays that superimpose computer graphics conveying crucial battle data on a combatant's field of view are commonplace in fighter aircraft and tanks. But providing up-to-the-second visual cues on enemy positions to the foot soldier is a harder problem because of constraints on size and weight, and it has drawn interest from planners throughout the US armed services.

Indoors, augmented reality can take advantage of fixed position markers (infrared LEDs, black squares, or special bar codes) detected by sensors to achieve millimeter-level accuracy, a must for medical applications. At the Chapel Hill campus of UNC, patients undergoing routine breast biopsies are being randomly assigned to the first clinical medical tests of AR: the radiologist performing the procedure wears an opaque HMD, with sonogram imaging mixed into a video view of the patient, in effect giving the practitioner virtual X-ray vision to see lumps or other anomalies within the body. Positioning data for the HMD is provided by infrared LEDs mounted on it and registered by an Image-Guided Technologies FlashPoint 5000 optical tracker. For wider-range tracking within a room, UNC researchers have developed the HiBall tracker system, consisting of infrared arrays and an egg-sized HMD-worn optical sensor, now offered commercially by the start-up company 3rdTech.
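Conceptually, such an overlay reduces to chaining rigid-body transforms: the optical tracker reports the pose of the ultrasound probe and of the HMD in a common frame, and a feature located in the sonogram can then be re-expressed in the viewer's frame for rendering. The following sketch illustrates that chain with made-up poses; it is not the UNC system's code, and the frame names are purely illustrative.

```python
import numpy as np

def rigid_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation (meters)."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Illustrative, made-up poses as an optical tracker might report them (world frame):
world_from_probe = rigid_transform(np.eye(3), [0.10, 0.00, 0.30])   # ultrasound transducer
world_from_hmd = rigid_transform(np.eye(3), [0.00, -0.50, 0.40])    # viewer's head

# A sonogram feature expressed in the transducer's own frame,
# e.g. a suspicious lump 4 cm below the probe face (homogeneous coordinates).
feature_in_probe = np.array([0.0, 0.0, 0.04, 1.0])

# Chain the transforms: probe frame -> world frame -> HMD frame.
hmd_from_world = np.linalg.inv(world_from_hmd)
feature_in_hmd = hmd_from_world @ world_from_probe @ feature_in_probe
print(feature_in_hmd[:3])   # where to draw the feature relative to the viewer's eyes
```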
A demonstration at the University of North Carolina shows a system for superimposing ultrasound visualizations during breast biopsies. Patients have now taken the place of the lab dummy in the first clinical trials of AR for medical use.




Industrial-Strength AR
The first readily available software for calculating a viewer's position and orientation in augmented reality has been developed at the University of Washington's Human Interface Technology (HIT) Lab. The ARToolKit, now in version 2.60, can track simple black squares or can be adapted to recognize other marker patterns and is optimized to run fast enough for real-time augmented reality applications. The software can be downloaded from the HIT Lab's Web site in Windows, Linux, or SGI formats.
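ARToolKit itself is a C library, but the principle it implements, recovering the camera's pose from the four corners of a detected square, can be sketched with a generic perspective-n-point solver. The example below uses OpenCV's solvePnP as a stand-in, with made-up corner pixels, marker size, and camera calibration; it illustrates the underlying computation, not ARToolKit's actual API.

```python
import numpy as np
import cv2  # OpenCV is used here only as a stand-in; ARToolKit is a separate C library

# The four corners of a square fiducial marker in the marker's own frame
# (an 80 mm square; the size is hypothetical).
MARKER_SIZE = 0.08
object_points = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float64)

# Corner pixels as a marker detector might report them (made-up values),
# plus a simple pinhole camera calibration.
image_points = np.array([[310, 210], [420, 215], [415, 330], [305, 325]], dtype=np.float64)
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# Solve for the camera pose relative to the marker; this is the step that lets
# virtual objects be drawn in registration with the real black square.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
rotation, _ = cv2.Rodrigues(rvec)
print("marker position in camera frame (m):", tvec.ravel())
print("marker orientation in camera frame:\n", rotation)
```

Once the pose is known, the graphics pipeline simply renders virtual content with that camera transform, which is why marker tracking of this kind runs comfortably in real time.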

Because a head-mounted display is often cumbersome and even uncomfortable after prolonged use, some researchers are working on hand-held or projected AR displays that allow computer images and real-world environments to be easily combined. At Boeing, researcher David Princehouse has developed a position-sensing Palm Pilot-size hand-held display that can be moved around an unfinished part to indicate with corresponding CAD images in exact registration where holes need to be drilled and at what angle. This Augmented Drilling Demonstrator derives its tracking precision from a HiBall system, with infrared markers placed in the ceiling above.

Incidentally, the term "augmented reality" was coined in 1990 at Boeing by researcher Tom Caudell for a workplace aid in manual manufacturing processes: a head-mounted display could provide an easily reconfigurable alternative to the large plywood boards with pegs and color-coded lines then used as guides for workers assembling the electrical wiring harnesses that connect on-board electronic equipment in aircraft. Though Boeing has yet to install AR systems on the factory floor, the potential for training and guidance in manufacturing is gaining widespread recognition. In Germany, AR is being pursued by industrial firms like Siemens and by a consortium known as ARVIKA (a German acronym for Augmented Reality for Development, Production, and Servicing), which includes automakers Audi, BMW, DaimlerChrysler, Ford, and Volkswagen, and industrial giants Zeiss and Airbus, as well as Siemens.
Japan's Mixed Reality Systems Laboratory took shoot-'em-up video games to another level, blending fantasy avatars and real players in Aqua Gauntlet, demonstrated at Siggraph 2000.




Pursuing larger and more lifelike AR images for simulations, in this case for the US Army, the University of Southern California's Institute for Creative Technologies has been developing Flat World, a training environment combining "flats," the movable walls used in set design for Hollywood movies, with projected stereoscopic computer images. When combat trainees wearing lightweight polarized-lens glasses move around this environment, they can experience a convincing 3D illusion of landscapes, buildings, and hazards in a specific location.
At Siggraph 2001, Contact Water, a kinder, gentler game from Mixed Reality Systems Laboratory, allowed players to toss and catch animated dolphins in augmented reality.




Augmented Entertainment
Not all AR applications are a matter of life or death, nor of greater industrial efficiency; mixing virtual and real objects lends itself to entertainment, too. In Japan, the Mixed Reality Systems Laboratory, a government/private consortium now part of Canon, has developed an improved HMD called COASTAR (Co-Optical Axis See-Through Augmented Reality) that eliminates the parallax errors characteristic of most head-mounted video cameras. To demonstrate it, researchers have designed multi-player AR games. Aqua Gauntlet, a three-player shooting game superimposing avatars and targets on people and objects in a studio setting, was shown in the Emerging Technologies exhibit at Siggraph 2000. The Contact Water game, which lets four users play catch with animated dolphins tossed from hand to hand, was featured in Siggraph 2001's Art Gallery.

This year at Siggraph, the Emerging Technologies exhibit included two more examples of augmented reality from Japan. "Occlusive Optical See-Through Displays in a Collaborative Setup" takes advantage of another new HMD, called ELMO (developed at the Communications Research Laboratory of Japan's Ministry of Posts and Telecommunications), which adds a secondary LCD to block out incoming light from a viewed environment, interposing opaque virtual objects anywhere in a user's field of view and creating the illusion of overlapping real and computer-generated images in a see-through display. For this demonstration, two users sitting across from each other were able to manipulate virtual objects as if they were real.

Researchers at the Nippon Telegraph and Telephone Corporation have come up with a 3D computer graphics capture system using a single camera on a rotating arm and "sensor cubes" that determine the lighting and spatial characteristics of the environment into which the 3D image can be integrated. In this system, called "Regeneration of Real Objects in the Real World," Siggraph visitors were able to use a hand-held tablet-style PC to see a virtual Noh mask whose apparent emotional expression changes with the lighting and the angle of view.

Also in Siggraph 2002's Emerging Technologies area was "The Virtual Showcase: A Projection-Based Multi-User Augmented Reality Display" (see "Showcasing Augmented Reality," pg. 18, November 2001). This convex-mirror and projection screen display system from the Fraunhofer Center for Research in Computer Graphics in Providence, RI, allows virtual objects and real ones to share the same museum-style display space. Multiple viewers wearing untethered shutter glasses can experience 3D stereoscopic views without the weight and complexity of head-mounted displays.

How far could AR eventually evolve in terms of realism? In 1965, in a classic paper, Ivan Sutherland envisioned "The Ultimate Display." For the pioneer of computer graphics and developer of the first augmented reality system (though he didn't call it that), the only limit was reality itself. "The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet would be fatal. With appropriate programming such a display would literally be the Wonderland into which Alice walked."




Steve Ditlea is a New York-based journalist who frequently covers emerging technologies for Computer Graphics World.






Visitors to Siggraph 2002 witnessed what may soon be the first commercial product to embody augmented reality. At the Emerging Technologies exhibit, the Sonic Flashlight, created by George Stetten, assistant professor of bioengineering at the University of Pittsburgh, with researchers from Carnegie Mellon University, gave users a black-and-white view of the bones and blood vessels inside their hands or other parts of their bodies, much like the spreading beam of a virtual X-ray flashlight. Doing away with special headgear, the Sonic Flashlight is a hand-held ultrasound transducer that, with the help of a half-silvered mirror, seemingly casts its computer-generated sonogram images into the part of the body being examined. Currently awaiting funding for clinical trials, the Sonic Flashlight could prove to be an indispensable medical diagnostic tool in examining rooms and hospitals, thanks to the ease with which it reveals what lies beneath the body's surface.