The success of any virtual-reality experience depends on the user's sense of presence in the virtual environment. While believable graphics and familiar modes of interaction are important to achieving such presence, both are of little value if the user's navigation through the digital scene feels slow, stilted, or out of sync, or if the user feels prohibitively constrained in space. All of these are effects of less-than-perfect motion tracking.
A motion-tracking system is the hardware/software configuration through which a user's head position and orientation in a scene are communicated to the computer, which in turn adjusts the displayed image to reflect changes in the user's location. This is generally achieved through one of five types of motion-tracking technologies: mechanical, electromagnetic, inertial, optical, or acoustic.
An ideal tracking system is one that is able to generate position and orientation computations in real time without compromising the graphical display or the user's freedom of motion. Unfortunately, such an ideal has yet to be realized. In reality, most tracking systems are forced to make technology tradeoffs.
|The HiBall technology is packaged in a golf-ball-sized device fitted with specialized optical sensors and lenses that can be head-mounted to track the user's view perspective through a virtual or augmented reality scene.|
With mechanical tracking, for instance, the tracking hardware is physically attached to the object or person being tracked, so while such devices offer high accuracy, they restrict movement. Electromagnetic tracking measures the strength of magnetic fields in coils attached to objects, which enables fast results, but the systems are prone to interference from metallic objects in the physical surroundings and operate only within a limited range. Inertial tracking systems, which operate by integrating voltages from gyros and accelerometers, suffer from drift and thus usually must be combined with another technology to be useful.
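The drift problem with inertial tracking follows directly from the double integration: even a tiny constant sensor bias, integrated once into velocity and again into position, produces an error that grows with the square of elapsed time. A minimal sketch (the bias and time-step values are illustrative, not measured figures):

```python
# Toy illustration of inertial drift: a constant accelerometer bias,
# integrated twice, yields a position error that grows quadratically
# with time. Numbers here are illustrative.
def drift_after(bias, dt, steps):
    """Position error (meters) after `steps` integration steps of
    size `dt` seconds, given a constant acceleration bias (m/s^2)."""
    v = x = 0.0
    for _ in range(steps):
        v += bias * dt  # integrate acceleration -> velocity
        x += v * dt     # integrate velocity -> position
    return x

# A 0.01 m/s^2 bias sampled at 100 Hz for 10 seconds already
# accumulates roughly half a meter of position error.
error = drift_after(bias=0.01, dt=0.01, steps=1000)
```

This quadratic growth is why purely inertial systems need periodic correction from an absolute-reference technology such as optical tracking.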
In traditional optical systems, a fixed-position camera monitors the pulsations of light-emitting diodes (LEDs) attached to an object or user. As with magnetic systems, the response time is fast, but the systems are prone to line-of-sight problems and interference caused by ambient lighting. Finally, acoustic systems, which use ultrasound waves to measure position and orientation, are burdened by the slow speed of sound.
As part of an ongoing effort to develop a system that avoids such tradeoffs, the Tracker Research Group at the University of North Carolina (http://www.cs.unc.edu/~tracker) has created a wide-area optoelectronic tracking technology that lets users move freely through full-scale virtual worlds in real time. Such a capability not only enables VR applications that would otherwise be difficult or impossible to achieve, such as the exploration of life-size architectural designs and room-filling molecular models, but it is also expected to be of value to augmented reality (AR).
In AR, real and digital worlds are superimposed into one scene through the use of see-through head-mounted displays that rely on either mirrors or video input to represent the physical world. Highly accurate motion tracking is crucial because even small tracking errors can result in unacceptable misregistration between real and virtual objects.
Called the HiBall Tracking System, the new technology is able to meet the needs of such applications through its implementation of four unique components: ceiling panels that house LED targets, a miniature optical-sensor cluster (the HiBall) that senses and digitizes the LED flashes, a custom interface board that facilitates communications among the various components of the system, and tracking software that processes the communications in real time.
Unlike traditional optical tracking methods, in which targets are attached to the object or person to be tracked and sensed by a camera in the environment, the HiBall system employs an "inside-out" approach, in which the sensors are user-mounted and the LED targets are fixed in the environment. This distinction is important, says UNC research assistant professor Greg Welch, because it ensures constant sensitivity to orientation over the working area. Also, because the targets are in the ceiling tiles, the tracking area can be expanded simply by adding more tiles.
The HiBall itself is unique in that it does not rely on the same charge-coupled devices (CCDs) that most digital cameras employ. Rather, it uses lateral-effect photodiodes (LEPDs). Unlike CCDs, LEPDs are not imaging devices. They are 2D optical sensors that produce four analog voltages, which together indicate the 2D position of the center of the light hitting the sensor. "There is no image to capture and interpret, simply four voltages to digitize, which is done right inside the HiBall," says Welch.
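The arithmetic behind a lateral-effect photodiode is simple: the light spot's position along each axis is recovered from the ratio of the difference to the sum of the opposing electrode voltages. A minimal sketch of that conversion (the function name and the -1 to +1 coordinate convention are illustrative, not taken from the HiBall's actual firmware):

```python
def lepd_position(v_left, v_right, v_bottom, v_top):
    """Convert the four analog voltages of a lateral-effect
    photodiode into the normalized 2D position of the light spot's
    centroid. Each coordinate runs from -1 to +1 across the sensor;
    a perfectly centered spot yields equal voltages and (0, 0)."""
    x = (v_right - v_left) / (v_right + v_left)
    y = (v_top - v_bottom) / (v_top + v_bottom)
    return x, y
```

Because this is four subtractions, additions, and divisions rather than a full image capture, the HiBall can digitize a sighting far faster than a camera-based system could grab and process a frame.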
The control center of the tracking system is the Ceiling-HiBall Interface Board (CIB), which sends LED addresses and control signals to the ceiling to direct the flashing of the LEDs. It also communicates with the HiBall, sending control signals and receiving the digitized LEPD values. The PC tracking software sends requests to the CIB for a sample of a particular ceiling LED from a particular optical sensor. In response, the CIB tells the ceiling to flash the LED and tells the HiBall to sample the LEPD. The digitized LEPD data it receives is sent back to the PC.
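The request/response cycle described above can be sketched in a few lines. The class and method names below are illustrative stand-ins, not the actual CIB firmware interface; the ceiling and HiBall objects are assumed to expose simple flash and sample operations:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    led_address: int   # which ceiling LED was flashed
    sensor_index: int  # which HiBall optical sensor observed it
    voltages: tuple    # the four digitized LEPD values

class CIB:
    """Illustrative stand-in for the Ceiling-HiBall Interface Board.
    For each PC request naming one (LED, sensor) pairing, it directs
    the ceiling to flash that LED, has the HiBall digitize the
    sensor's four LEPD voltages, and returns the result."""

    def __init__(self, ceiling, hiball):
        self.ceiling, self.hiball = ceiling, hiball

    def request_sample(self, led_address, sensor_index):
        self.ceiling.flash(led_address)       # control signal to the ceiling
        v = self.hiball.sample(sensor_index)  # digitization inside the HiBall
        return Sample(led_address, sensor_index, v)
```

The key design point is that the PC never talks to the ceiling or the HiBall directly; the CIB coordinates the flash and the sample so the two stay synchronized.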
|The HiBall system's LED-studded ceiling tiles are designed to match standard acoustic tiles so they can be easily dropped into place.|
The system's tracking code relies on an estimation approach called SCAAT (single constraint at a time) tracking, which turns individual LED sightings into a complete position and orientation, or pose, estimate for the HiBall. With SCAAT, individual observations are reported as soon as they're acquired, rather than at the end of a complete collection of measurements, so each provides some information about the user's pose. Subsequent measurements build on previous ones to improve the estimates. A filtering technique fuses the continuous sequence of these incomplete, single-LED sightings into an ongoing sequence of complete estimates. To enhance the quality of the estimates and ensure low latency, thousands of LED sightings are generated per second. An autocalibration process compensates for shifts in the tiles and for inherent estimate inaccuracies.
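The core SCAAT idea, folding each single incomplete measurement into the running estimate the moment it arrives, can be illustrated with a toy one-dimensional Kalman-style filter. The real HiBall filter estimates a full six-degree-of-freedom pose from 2D sightings; the scalar state and the noise values below are simplifications for illustration only:

```python
class ScaatFilter1D:
    """Toy 1D illustration of SCAAT-style filtering: every single
    measurement immediately refines the state estimate instead of
    waiting for a complete batch of sightings. (The actual HiBall
    filter estimates 6-DOF pose; values here are illustrative.)"""

    def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.1):
        self.x, self.p = x0, p0  # state estimate and its variance
        self.q, self.r = q, r    # process and measurement noise

    def update(self, z):
        self.p += self.q                # predict: uncertainty grows over time
        k = self.p / (self.p + self.r)  # gain: weight of the new measurement
        self.x += k * (z - self.x)      # correct with this one measurement
        self.p *= 1.0 - k               # uncertainty shrinks after the update
        return self.x
```

Each call to `update` mirrors one LED sighting: an incomplete observation that nonetheless nudges the full estimate toward the truth, with confidence accumulating across thousands of such updates per second.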
On the agenda for the HiBall system is the development of a wireless capability between the HiBall and the CIB. The researchers are also investigating more flexible LED strategies, including LED strips that can be hung from ceilings wherever needed. The group's long-term objective is to develop hybrid tracking approaches that will reduce the system's infrastructure to allow users to move beyond the lab, eventually outdoors, while maintaining system performance.
In the meantime, the existing HiBall technology is headed toward commercialization by a new company called HiBall Tracker Inc., which is currently negotiating a technology license with UNC Chapel Hill.
Diana Phillips Mahoney is chief technology editor of Computer Graphics World.