Student Features

Whatever Happened to Virtual Reality?
In "The Matrix" film sequels, the metal gates to the city of Zion are operated by air traffic controllers who work inside a virtual control tower. It is a computer-generated, heavenly-white space where controllers use fancy virtual control panels to guide science-fiction hovercraft. This fantasy scenario must seem familiar to anyone who rode the wave of virtual reality (VR) hype during the 1980s.

Helmet-mounted displays, power gloves, and 3-D sights and sounds were supposed to make immersive environments commonplace. These technologies would revolutionize everything from video games to stock market analysis to psychotherapy.

Image to right: VR helmets are now lighter and deliver a greater sense of feedback. Credit: NASA

It didn't happen. "The technology of the 1980s was not mature enough," explains Stephen Ellis, who leads the Advanced Displays and Spatial Perception Laboratory at NASA's Ames Research Center. VR helmets and their optics were too heavy. Computers were too slow. Touch-feedback systems often didn't work. The only things consistently real about VR were headaches and motion sickness, common side effects of 1980s-era helmets.

Twenty years later, things have improved. Computers are thousands of times faster. VR peripherals are lighter and deliver a greater sense of feedback and immersion. And, researchers are beginning to understand crucial human factors. They're eliminating nausea and fatigue from the VR experience. Once again, virtual reality seems promising, and NASA is interested.

Picture this: an astronaut on Mars sends a rover out to investigate a risky-looking crater. Slip-sliding down the crater wall, the rover sends signals back to the Mars Base where the astronaut, wearing VR goggles and gloves, feels like she is on the slope. Is the find important enough to risk venturing out in person? VR helps decide. In another scenario, astronauts could use VR to perform repairs on the outside of their spacecraft by controlling a human-like robot, such as NASA's Robonaut.

Image to left: Astronauts can use VR and rovers to decide if an area is worth exploring in person. Credit: NASA

Ellis, who holds advanced degrees in psychology and behavioral science, evaluates VR for space applications. At the moment, he's investigating user interfaces for robots such as the Autonomous Extravehicular Robotic Camera (AERCam), a spherical free-flying robot being developed at NASA's Johnson Space Center to inspect spacecraft for trouble spots. AERCam is designed to float outside a vehicle (the International Space Station or Space Shuttle, for instance) using small xenon-gas thrusters and solid-state cameras. It will view the vehicle's outer surfaces to find damage in places where a human space walker or the Orbiter's robotic arm can't safely go. The current plan is to use a laptop and a normal, flat monitor to operate AERCam. But, Ellis is conducting research, funded by NASA's Office of Biological and Physical Research, to see if a virtual environment might be a better option. With a VR system, the astronaut could maneuver the melon-sized AERCam with standard hand controls, while head movements rotate AERCam to let the astronaut "look around."

Ellis's research is necessary because "VR isn't always the best choice." For example, at Wright-Patterson Air Force Base, researchers have tested VR interfaces for pilots. Time after time, their tests showed that pilots perform better with traditional panel-mounted displays. Why? No one is sure. One possibility is that the VR helmet's field of view was narrower than the pilots' natural peripheral vision. Ellis believes these helmets effectively divorced the pilots from the cockpit -- the environment in which they learned to fly.

Image to right: Children quickly abandoned the virtual reality glove because it was tiring to use for long periods. Credit: NASA

"There are some surprisingly simple ergonomic issues that can interfere with VR interfaces," adds Ellis. "In the early 1990s, Mattel sold the PowerGlove (a simple VR glove) as a novel way to control video games. It was cool. But, kids quickly discovered that it's very tiring to hold your hand up in front of you long enough to play an entire game." You'd have to be an athlete to use it. The glove is no longer sold.

Since the 1980s, there has been a dawning awareness among researchers that human factors are crucial to VR. Age, gender, health and fitness, peripheral vision, posture, and the sensitivity of the vestibular system all come into play. Even self-image matters. One study showed that people wearing VR helmets like to glance down and see their own virtual bodies. It helps "ground them" in the simulation. And, the body should be correct: arms, legs, torso (male for men and female for women). For every virtual environment, there is a human-computer interface, and if the interface doesn't match the person... game over.

To address these human factors, Ellis's group performs fundamental research on human senses and perception. One central concern is how people cope with "latencies," or delays, in the VR system. When you swing your head, does the virtual view follow immediately, or is there a split-second lag? If your eyes and your inner ear (where vestibular organs sense orientation) send conflicting reports to the brain, you might need a motion-sickness bag. Vestibular adaptation is a key human factor in VR interface design.

Image to left: NASA has helped reduce the amount of delay in the virtual reality system, making it less likely to affect human balance. Credit: NASA

"The question is: how much delay can you tolerate?" Ellis says. For movement within the virtual environment to feel natural, most people need the delay to be less than 15 milliseconds (thousandths of a second), according to his group's research. However, Bernard Adelstein, Durand Begault, and Elizabeth Wenzel, who work with Ellis in the Advanced Displays Laboratory at Ames, have discovered that sound can help compensate for the delay in touch. In a virtual environment, sounds can be generated much faster than touch feedback from a VR glove. For example, when grabbing a virtual object, the immediate "click" sound of contact enhances the user's sensory perception of realism.

The years of research are finally beginning to pay off, Ellis says. "The fully immersive, head-mounted system is getting to be high enough fidelity for practical use. We'll probably have the AERCam experiment running by August (2004)." "The Matrix" will take a little longer.

Published by NASAexplores