Volume 23, Issue 3 (March 2000)

Ready for Take-off




It sounds like a theme-park attraction, it looks like a computer game on steroids, and it has the potential to alter forever our deep-seated negative impressions of airport traffic operations. "It" is a full-scale virtual airport-control tower on the grounds of the NASA Ames Research Center in Moffett Field, California. Called FutureFlight Central, the facility was designed by the space agency in collaboration with the Federal Aviation Administration (FAA) to serve as an R&D testbed for potential solutions to the air- and ground-traffic problems plaguing commercial airports.

The two-story, 5130-square-foot facility officially opened for business in December 1999, culminating a two-year, $10 million development process aimed at increasing airport safety and reducing costly traffic delays. "The idea is that a photorealistic, multisensory simulation of an airport will enable everything from better training and management of airport systems and emergency rescue operations to better loading of luggage," says Linda Jacobsen, a VR expert with SGI (Mountain View, CA), who with her colleagues worked closely with NASA to get the project off the ground.
The tarmac of a simulated San Francisco Airport can be monitored from inside the cab of the FutureFlight Central control tower.




The only facility of its kind in the world, FutureFlight Central (FFC) is a virtual haven for simulation junkies. It re-creates, down to minute detail, the experience of being in an actual air-traffic control tower. The top floor of the structure is a 24-foot circular tower cab surrounded by "windows" looking onto the simulated airport tarmac. The cab can support up to 12 air-traffic controllers, as many as might be found at large commercial airports. Contributing to the perceptual reality of the system are its full-size consoles with functionally accurate computer displays that replicate controller equipment. Graphical user interfaces on 16-inch flat-panel touchscreens simulate such tools as radar displays, wind indicators, clocks, and altimeters. The perimeter console is based on FAA standards and air-traffic control tower configurations, and the modular center console can be configured to match most tower-cab layouts.

The lower floor of the structure is designed to house the multiple support people involved in air-traffic control. The area can accommodate five ramp controllers, 13 pseudo-pilots, three airport operators, three simulation engineers, two software developers, and two researchers. The computer displays in the pseudo-pilot, ramp-control, and test-engineer rooms are programmable to accommodate different airport configurations and new technologies as they become available. A simulated radio and phone system lets the controllers in the cab communicate in real time with the pilots and ramp technicians on the first floor.

What distinguishes this structure from the real thing is that behind the FFC tower cab's tempered-glass windows are 12 seamlessly stitched display screens on which high-resolution 3D graphics, imaging, and video data are rear-projected at a 60Hz frame rate to create the illusion of reality. The visual display is driven by a monstrous 12-pipe, 16-processor SGI Onyx2 image generator. Air-traffic control simulation software developed by Raytheon Computer Systems is integrated with the image generator and Intel-based workstations to support the visual display.

FutureFlight Central is airport-independent and thus can be used to simulate the air-traffic control environment of any airport by reconfiguring the console set-up, reprogramming the interface displays, and creating and employing the visual database representing the respective airport. To ensure the accuracy of a given airport model, the visual database incorporates data from various sources, including 3D CAD files, satellite data, terrain data, and aerial photographs. In addition to providing a 3D view of the airport environment, the system can depict a realistic view of weather conditions, environmental effects, and the movement of up to 200 active aircraft and ground vehicles.

FutureFlight Central was born as a spin-off from another NASA research endeavor: the development of the Surface Movement Advisor, a tool designed to improve communication between the airlines and the air-traffic tower. "Like many other tools targeting airport operations, this one was difficult to test in a real airport situation, so the idea came to develop a facility in which to test the tool before it was deployed in an actual airport," says FFC simulation director Boris Rabin.

"The fact that there is no good way to assess new tools before actually installing them has always been an issue. This sounded like the perfect solution for testing and validation." As the project evolved, it began to get more attention from the FAA as well as from airports, pilots, and controllers. As a result of the increased interest, the plan for the venue blossomed from what initially was going to be a three-screen display to the ultimate full-scale setup. "The project snowballed," says Rabin. "The more requirements that were being proposed by the FAA, the airlines, the airports, and the controllers, the more the budget grew and the more we could spend on the display system and image generator. Ultimately, instead of using an existing room or lab for putting this facility together, we had to design and build a separate facility."

The physical layout of FutureFlight Central is as important as the simulations being run there, because in addition to the testing and validation of new operational tools, another primary application for the facility is human-factors research. "The goal is to try to understand the stress and workload impact of proposed changes on a crew of air- and ground-traffic personnel," says Rabin. "Putting each controller in front of one screen wouldn't provide the necessary realism to accurately assess the consequences of a proposed change. Only by putting controllers in environments that are very close to the actual tower scenario will researchers be able to get valid human-factors information."

The requirement for environmental realism presented the FFC development team with a number of unique challenges, the most pressing of which was figuring out how to enable a multi-user virtual environment without any encumbering tracking or display technology. "Multi-user VR is a difficult thing in general," notes Rabin. "When you're dealing with a virtual environment for just one user, you can use tools like head-tracked, head-mounted displays to allow the user to move around the virtual space and have a practically unlimited field of view. It's not so straightforward in a multi-user setting." Although the use of some sort of mechanical tracking system or even stereo glasses could go a long way toward enhancing the visual experience for multiple users in the virtual setting, doing so would decrease the physical reality of the setting and thus compromise the validity of the human-factors research.

As a result, says Rabin, "we had to determine how to provide all of the users with a sufficient amount of visual detail, as much as controllers get in a real tower, to achieve the sensation of having a 3D world behind the windows using nothing but a straight view from the rear-projected 2D screens." Although it might have been easier to achieve this illusion using a front-projection system, doing so would require hanging projectors over the controllers' heads, which would also compromise the facility's physical reality. Given these constraints, the designers carefully configured the display screens based on such considerations as the square footage of the tower cab, the throw distance of the projectors, and the optimal distance between a given user's eyepoint and the display. A change on any of these fronts meant a change on all of them, says Rabin. "When we increased the size of the tower, we increased the size of the screens, effectively increasing the distance from the eyepoint to the screen and the throw distance from the projectors. To get it all together, we finally had to go to this enormous square footage."
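The coupling Rabin describes can be sketched with simple geometry: in a circular cab ringed by 12 panels, pushing the screens farther from the controllers' eyepoint widens each panel, which in turn lengthens the projector throw. The following Python sketch illustrates the idea; the radii and throw ratio are hypothetical illustration values, not FFC specifications.

```python
import math

def panel_geometry(eye_to_screen_ft, num_panels=12, throw_ratio=1.8):
    """Illustrative model of the coupled display constraints.

    eye_to_screen_ft: distance from the central eyepoint to the screens
    num_panels: screens ringing the cab (12, per the article)
    throw_ratio: hypothetical projector throw-distance-to-image-width ratio
    """
    # Each of the 12 panels spans 360/12 = 30 degrees of the circular view.
    arc = 2 * math.pi / num_panels
    # The chord at that radius approximates the flat panel's width.
    panel_width = 2 * eye_to_screen_ft * math.sin(arc / 2)
    # Rear-projection throw distance grows in proportion to panel width.
    throw = throw_ratio * panel_width
    return panel_width, throw

# Enlarging the cab enlarges everything downstream of it:
for radius in (8.0, 10.0, 12.0):  # hypothetical eyepoint-to-screen radii, feet
    width, throw = panel_geometry(radius)
    print(f"radius {radius:4.1f} ft -> panel {width:4.2f} ft wide, "
          f"throw {throw:4.2f} ft behind the screen")
```

Running the sketch shows why the designers "had to go to this enormous square footage": every foot added to the eyepoint-to-screen distance adds width to all 12 panels and depth to all 12 projection bays at once.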

The efforts paid off, says Rabin. Guests at the facility's opening-many of whom were familiar with airport-simulation technology-deemed the FFC simulation of the San Francisco (SFO) airport to be the most realistic simulation they'd ever seen.
The FFC cab layout mimics that of control towers at large airports. It supports up to 12 controllers at perimeter stations and numerous support people at the center console. Behind the windows are 12 rear-projected display screens.




As the first "customer" to sign on for the virtual experience, SFO has grand expectations for the technology. Because of fog and unpredictable weather conditions, SFO is the nation's leader in ground delays. The airport management is planning to add a new runway to help alleviate the problem. The plan is to use the virtual airport tower to help experts evaluate the new runway's placement for best air-traffic flow, noise reduction, and environmental impact, and to evaluate new decision-support tools for controllers.

In addition to SFO, a number of other customers are waiting in the FFC wings, says Rabin. "Boeing has signed on, as well as a number of airports interested in testing new runway configurations and airlines that want to test new ramp-tower locations."

These activities represent another first for NASA: the agency's first foray into the commercial sector. "NASA is actually going to start generating funds by renting out this capability, which is significant," says SGI's Jacobsen. "NASA is taking its amazing technology and making it available to commercial institutions ranging from airports to airlines to businesses that rely on airports and airlines, such as FedEx and UPS."

Although the FFC is unquestionably one-of-a-kind, it's easy to foresee how similar set-ups could be employed in other applications. "An air-traffic control tower is unique in that it sits in the middle of a large expanse and lets you look around, but this type of virtual facility could be useful in any environment in which users need to control things going on in another area," says Jacobsen. For example, she says, "In a Chernobyl-like situation, you might want a really good telepresence system that lets you watch and control people and vehicles operating in a dangerous setting." A more traditional "NASA-type" application might be the use of a tower simulator to enable space exploration. "You could be looking onto a Mars scene from within the simulator that's receiving live 360° pictures, and you're looking at them as 360° pictures instead of single stitched pictures on a monitor."

Regardless of the application, FutureFlight Central is the classic VR story, says Jacobsen. "It's what VR is all about. It's about testing out different scenarios and getting people's reaction to them."

Diana Phillips Mahoney is chief technology editor of Computer Graphics World.