Human-Centered Interfaces for Teleoperations
Human factors principles are applied to optimize teleoperation interface design in support of human-robotic teaming for space exploration, yielding interfaces that are safe, reliable, and effective.
The Human Information Processing Research Branch is internationally recognized for its work in advanced user interfaces. Safe and effective operator interface design is critical to space exploration systems, as is data-rich virtual presence.
Our in-house expertise draws upon a diverse, multi-disciplinary team of psychologists and engineers. We apply a variety of task analytic, experimental, and modeling techniques to characterize interface requirements and test potential solutions.
Recent research and development projects include:
- Haptic Interfaces for Teleoperations
We are developing human factors guidelines for effective haptic (force-reflecting) manual interfaces for multi-sensory virtual simulation and teleoperation displays. Major program goals include: 1) the design of a novel, very-high-performance, 3-DOF force-reflecting manual interface device; and 2) examination of operator perception and manual task performance via psychophysical and target acquisition experiments. A patent has been awarded for our 3-DOF parallel mechanical linkage.
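To illustrate the kind of computation a force-reflecting interface performs, the sketch below renders a virtual wall along one axis with a standard spring-damper model. This is a minimal, hypothetical example (the gains, function names, and single-axis simplification are assumptions for illustration, not the device's actual control law).

```python
def wall_force(position, velocity, wall_pos=0.0, k=500.0, b=2.0):
    """Force (N) reflected to the operator's hand for a virtual wall.

    A spring-damper model is a common choice for haptic rendering:
    the deeper the hand penetrates the wall, the harder it is pushed
    back; damping (b) bleeds off energy to keep contact stable.
    position, velocity: hand state along one axis (m, m/s)
    wall_pos: wall location; the wall occupies position < wall_pos
    k: virtual stiffness (N/m); b: virtual damping (N*s/m)
    """
    penetration = wall_pos - position
    if penetration <= 0.0:
        return 0.0  # free space: no force reflected
    return k * penetration - b * velocity


# In a real device, this runs in a high-rate servo loop (typically
# ~1 kHz) that reads the linkage encoders and commands motor torques.
```

A loop of this form is why haptic hardware demands high mechanical bandwidth: stiff virtual surfaces (large k) become unstable if the update rate or device dynamics cannot keep up.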
- Spatial Auditory Displays
We have developed and validated technologies to synthesize spatially localized sounds. By applying real-time transformation, any acoustic source (voice, warning tones) can be localized to a point in 3D space. Sounds can be displayed via speakers or headphones. These technologies have been tested in a number of aerospace environments (including ATC, aircraft flight deck, and mission control). We have demonstrated consistent improvements in situational awareness and speech intelligibility with these advanced acoustic displays. Several patents have been awarded or are pending.
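The "real-time transformation" that localizes a sound to a point in 3D space is typically realized by convolving the mono source with a pair of head-related impulse responses (HRIRs), one per ear, measured for the desired direction. The sketch below shows that core operation in plain Python; the HRIR values and function names here are illustrative placeholders, not measured data.

```python
def convolve(signal, ir):
    """Direct (time-domain) convolution of a mono signal with an
    impulse response. Real-time systems use fast block convolution,
    but the math is the same."""
    out = [0.0] * (len(signal) + len(ir) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(ir):
            out[i + j] += s * h
    return out


def spatialize(mono, hrir_left, hrir_right):
    """Render a mono source to binaural left/right channels by
    filtering it through direction-specific HRIRs. Interaural time
    and level differences are encoded in the HRIRs themselves
    (e.g. the far ear's response is delayed and attenuated)."""
    return convolve(mono, hrir_left), convolve(mono, hrir_right)
```

For example, a toy source rendered with a one-sample delay in the right-ear HRIR arrives at the right channel one sample later than the left, which the listener perceives as a shift toward the left. Head tracking simply swaps in the HRIR pair for the current source-relative direction each frame.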
- Space Perception in Virtual Environments
We are assessing the acquisition of situational awareness via immersive (virtual environment) and non-immersive (desktop) interface displays, and are developing a model to predict the degree of interface fidelity required for specific visualization tasks. Our findings have been published in PRESENCE, the journal of virtual environment research.
Right: High Performance Manual Interface Device.
The Space Exploration Vision recognizes the need for robust teaming of human and robotic partners. The composition of the on-site teams will vary; in some cases, human operators will work in close proximity to their robotic assistants. For other missions, the human operator will teleoperate a robot, utilizing a tight in-the-loop control regime. Still other scenarios call for robots to operate in a semi-autonomous mode; the human operator will supervise operations, but only intervene in case of emergency or off-nominal conditions.
To support such diverse missions, we must develop safe, reliable, and effective interfaces. At Ames, we have been at the forefront of teleoperation interfaces for two decades.
Much of the pioneering work in telepresence and virtual environments was performed in our laboratories. Today we continue our research in advanced multi-modal displays and interfaces.
The Exploration Vision presents unique challenges for interface design such as:
- How do we compensate for the divergent gravitational-inertial environments of the operator and the robotic agent?
- How can operators maintain tight, inner-loop control in the presence of multi-second transmission delays?
- How do we optimize situational awareness for supervisory control regimes, ensuring the operator both understands the robots’ current activities and can anticipate future states?
Right: Users Interface with Robotic Missions in a Simulated Virtual Environment.