Volume 23, Issue 3 (March 2000)

Mirror, Mirror in the Hand



Over the past few years, virtual reality has become a practical reality for many applications thanks to a number of technology innovations. Among these has been the development of rear-projection display systems, including such media as virtual tables and workbenches, powerwalls, and surround-screen rooms. Unfortunately, the nature of rear-projection technology often prohibits applications that rely on such displays from taking full advantage of another important VR technology: head tracking.

Head-tracking techniques monitor the position of a user's head to enable realistic navigation through a virtual environment. Applying head tracking in conjunction with a rear-projection display, however, can cause an unnatural clipping of large or high-lying objects by the edges of the display's projection plane because of the limited viewing volume. Additionally, because navigation is enabled by tracking the viewer's head and rendering the scene from that viewpoint, it's difficult for the user to see all of a large image or to see any image from every viewpoint. For a bird's-eye view of a large model on a tabletop display, for example, the viewer would either have to climb onto the table or downscale the scene, thus sacrificing detail.

In an effort to overcome this shortcoming, researcher Oliver Bimber at the Fraunhofer Institute of Computer Graphics in Rostock, Germany, in collaboration with colleagues L. Miguel Encarnação at the Fraunhofer Center's Rhode Island facility and Dieter Schmalstieg at the Vienna University of Technology in Austria, has developed a mirror-like tool that provides a broader perspective on rear-projected virtual scenes, approximates simultaneous multi-user navigation, and extends standard VR tools into the realm of augmented reality, in which virtual objects are superimposed on real scenes.
A conceptual display interface couples a virtual workbench with a mounted "transflective" surface that transmits and reflects light so digital parts can be evaluated in conjunction with physical components.




The system consists of a number of components: a hand-held, six-degrees-of-freedom pad made of a sheet of Plexiglas laminated with a semi-transparent foil, an electromagnetic tracking device to compute the position and orientation of the pad, a virtual table, a head-tracking mechanism, and stereoscopic shutter glasses. The reflective pad serves as the user interface to the digital world by reflecting the virtual scene on its surface. To achieve views that would be difficult or impossible to attain using head tracking alone, users simply reorient the mirrored pad or view it from a different line of sight.

Unlike traditional head-tracking applications, in which stereo images are computed from the user's physical eye positions, the mirror system tracks the corresponding reflections of the eyes within the reflection space behind the mirror plane. The projection of the scene appears as a perspectively correct reflection in the mirror.
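At its core this is a standard mirror transform: the tracked eye position is reflected across the plane of the pad, and the rear-projected scene is rendered from that reflected viewpoint. The sketch below illustrates the geometry in Python; the function, variable names, and the specific poses are illustrative assumptions, not the researchers' actual code.

```python
import numpy as np

def reflect_point_across_plane(point, plane_point, plane_normal):
    """Mirror a 3D point across the plane defined by a point on the
    plane and its normal (the classic reflection transform)."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = np.dot(point - plane_point, n)   # signed distance to the plane
    return point - 2.0 * d * n           # mirror image of the point

# Hypothetical values: eye position from the head tracker, pad pose
# from the electromagnetic sensor attached to the pad.
eye        = np.array([0.2, 1.6, 0.8])   # tracked eye position (meters)
pad_center = np.array([0.0, 1.0, 0.0])   # a point on the pad's plane
pad_normal = np.array([0.0, 0.0, 1.0])   # the pad's facing direction

reflected_eye = reflect_point_across_plane(eye, pad_center, pad_normal)
# Rendering the rear-projected image from `reflected_eye` makes the scene
# appear as a perspectively correct reflection in the hand-held mirror.
```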

The unique physical characteristics of the pad make it a useful interface for both traditional virtual-reality and augmented-reality applications. The pad's foil surface can either reflect light or transmit it, depending on its orientation to the light source in the environment. For example, if the viewer and the light source are located on opposite sides of the pad, the foil transmits the light, so the user can see through the pad. When the user and the light source are on the same side of the pad, the foil reflects the light, providing a mirror view of the environment in front of the pad. When the pad is illuminated on both sides in a certain way, the foil simultaneously transmits and reflects the light, making both the reflected and transmitted images visible to the observer.
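In geometric terms, which behavior the observer experiences comes down to which side of the pad's plane the observer and the light source occupy. The following sign test is only an illustrative sketch of that geometry, not part of the researchers' system; the names are assumptions.

```python
import numpy as np

def same_side_of_pad(viewer, light, pad_center, pad_normal):
    """Return True if the viewer and the light source lie on the same side
    of the pad's plane (the foil acts as a mirror for that viewer), and
    False if they lie on opposite sides (the foil appears transparent)."""
    n = pad_normal / np.linalg.norm(pad_normal)
    return np.dot(viewer - pad_center, n) * np.dot(light - pad_center, n) > 0.0
```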

In the transparent mode, the pad can be enhanced with 3D graphics from the Virtual Table display, offering a range of interactive possibilities. For example, says Bimber, "the transparent pad can serve as an object palette. It can carry regular window-control elements such as buttons and sliders to augment graphics displayed on the table, which are visible by looking through the pad. It also lets us apply special rendering techniques to see different things, such as an X-ray view, by looking at the pad."

In the reflective mode, the pad can be used to reflect the virtual scene. "Only the projection plane and its displayed pixels are physically reflected by the mirror, but the reflected virtual scene can still be perceived in 3D," says Bimber. Because the reflected view is recomputed continuously, navigation happens in real time. To move through objects in a scene, the user can attach virtual clipping planes to the pad.
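Attaching a clipping plane to the pad amounts to re-deriving a plane equation from the pad's tracked pose on every frame. Below is a minimal sketch, assuming the renderer accepts a user-defined plane of the form ax + by + cz + d = 0; the names and poses are illustrative.

```python
import numpy as np

def clip_plane_from_pad(pad_center, pad_normal):
    """Build a plane equation (a, b, c, d), with ax + by + cz + d = 0,
    from the pad's tracked position and orientation, suitable as a
    user-defined clipping plane in the renderer."""
    n = pad_normal / np.linalg.norm(pad_normal)
    return np.array([n[0], n[1], n[2], -np.dot(n, pad_center)])

# Re-derive the plane from the pad's 6-DOF sensor each frame so the cut
# through the model follows the pad as the user moves it.
plane = clip_plane_from_pad(np.array([0.0, 1.0, 0.3]),
                            np.array([0.0, 0.0, 1.0]))
```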

The third interactive mode, the transflective mode, uses the pad in both the transparent and reflective modes. As such, the pad can be used to integrate real objects into virtual worlds, an effect that traditional rear-projection setups can't support. "[With conventional displays] the real objects occlude the projection plane, thus they occlude virtual objects," says Bimber.
A physical nozzle augmented with virtual parts is viewed through a mirrored pad that both reflects and transmits light.




To avoid this effect, the pad can be used as an interactive image plane that merges the transmitted image of the surrounding physical scene with the reflected image of the virtual scene. Users can then interact with the augmented surrounding space.

Another advantage of the technology is that it makes the virtual scene accessible to multiple viewers simultaneously. It does this by tracking the movement of the mirror, rather than the movement of a single viewer. The perspective of the images visible in the mirror is approximately correct for all viewers. While there is some perspective and stereoscopic error for every observer in the group, the distortion is minimal, primarily because the size of the mirror requires that groups cluster around it, resulting in only slight perspective differences.

The researchers hope to use the mirror-tracking system to extend common virtual-reality devices toward augmented-reality applications. To this end, they have conceived a new kind of virtual workbench, usable in both virtual- and augmented-reality modes, that would, for example, let users project digital designs onto physical counterparts for evaluation.

Many applications could benefit from such a tool, says Bimber. "For example, the ship-building industry still builds large plastic models before they build the real ship. It is difficult to replace the plastic models with a pure virtual design task. An intermediate step could be to use the plastic model in conjunction with virtual parts."

Although the major technical problems presented by their mirror-tracking system (such as the refraction/reflection computations) have been resolved, a few remain. For example, notes Bimber, "our solution relies on two tracking sensors to compute the stereo images [one attached to the pad, one to the observer's head]." The resulting nonlinear distortion, time lag, and noise are higher than those of single-sensor systems, such as see-through head-mounted displays.

Another limitation is the indirect line-of-sight problem: part of the projection plane must be visible in the mirror, which restricts the pad's range of application. The problem, says Bimber, can be mitigated through the use of large, fixed transflective surfaces or by augmenting the immediately surrounding physical space with a small hand-held pad.

The researchers have applied for patents on the transflective pad and its modes of interaction. On their research agenda are better tracking technology, methods for handling magnetic-field distortion, and algorithms to detect collisions between virtual and real objects. They're also developing techniques for more intuitive interaction in virtual- and extended-reality applications.

Diana Phillips Mahoney is chief technology editor of Computer Graphics World.