Advancing human-technology interaction for NASA missions
About human factors
Rapidly advancing technologies require humans to make critical decisions in increasingly complex environments. The field of human factors studies how humans interact with engineered systems, with the goal of ensuring safe, efficient and cost-effective operations, maintenance and training, both in flight and on the ground.
Ames human factors is unique in that multiple labs encompass a full range of aerospace-related work, from basic research like visual perception, motor control and psychophysiology, to direct applications like flight deck design, air traffic management, mission controller displays and crew operational procedures.
Human factors and human systems integration work at the center is carried out by over 150 researchers across more than 20 labs to dramatically improve safety, efficiency and mission success.
Ames human factors areas include:
- Human-Machine Interaction: improves NASA software through careful application of human-computer interface methods.
- Human Performance: develops new technologies, human performance models and evaluation tools to enhance human productivity and safety for both space and aviation environments.
- Integration and Training: develops and evaluates methodologies to integrate human factors principles and improve aviation capacity, safety and training.
- Intelligent Systems: conducts user-centered computational sciences research.
- Aviation Systems: conducts research and development in air traffic management and high-fidelity flight simulation.
Featured example: Mars Science Laboratory InterfaCE (MSLICE)
How do you plan a rover's day?
In 2012, the rover Curiosity landed on Mars to assess whether the planet has ever had an environment able to support life. With its extensive array of scientific instruments, Curiosity requires specific instructions each day on what actions to take to carry out its mission, such as driving along the surface, analyzing rocks, taking imagery and transmitting data. These instructions must reflect the requirements of scientists and engineers back on Earth while respecting the constraints on the rover's power and computational resources.
Fortunately, Curiosity's activities are scheduled with an integrated planning tool called the Mars Science Laboratory InterfaCE (MSLICE). This software tool is a key part of the daily science planning and commanding process for a distributed team of mission scientists and engineers.
Using human factors design principles, Ames computer scientists designed MSLICE to provide critical resource modeling and activity planning capability. With this approach, MSLICE ensures that mission scientists can work closely with rover and instrument engineers to create plans that maximize science return from the limited resources available to the rover.
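MSLICE itself is mission software whose internals are not described here, but the kind of resource-aware activity planning it supports can be illustrated with a minimal, hypothetical sketch: given a set of requested activities, each with an energy cost, a duration and a science priority, select the highest-value plan that fits the rover's daily budgets. The `Activity` fields, budget values and greedy strategy below are all illustrative assumptions, not the actual MSLICE algorithm.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str            # illustrative activity label
    power_wh: float      # assumed energy cost (watt-hours)
    duration_min: int    # assumed duration (minutes)
    science_value: int   # relative priority assigned by scientists

def plan_sol(requests, power_budget_wh, time_budget_min):
    """Greedily pick the highest-priority activities that still fit
    within the sol's energy and time budgets (a simplification of
    real mixed-constraint scheduling)."""
    plan, power, time = [], 0.0, 0
    for act in sorted(requests, key=lambda a: a.science_value, reverse=True):
        if (power + act.power_wh <= power_budget_wh
                and time + act.duration_min <= time_budget_min):
            plan.append(act)
            power += act.power_wh
            time += act.duration_min
    return plan

# Example request list (all numbers invented for illustration)
requests = [
    Activity("drive", 300, 120, 5),
    Activity("chemcam_raster", 120, 40, 8),
    Activity("mastcam_pan", 60, 30, 6),
    Activity("drill_sample", 500, 180, 9),
]
plan = plan_sol(requests, power_budget_wh=700, time_budget_min=300)
```

In this toy example the drive is dropped because, after the higher-priority science activities are scheduled, it would exceed the power budget; a real planner would also model dependencies between activities and let scientists iterate on the trade-offs interactively.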
Featured example: Human performance in the 20-g centrifuge
Can humans handle mission tasks while under the vibration conditions of launch?
As it plans for human exploration beyond low-Earth orbit, NASA intends to have human crews fully engaged in mission-critical activities (e.g., monitoring displays, flipping switches and pushing buttons) during all phases of flight. Those phases include launch and re-entry, periods when the crew will be subjected to significantly elevated gravito-inertial and vibration loads. To properly design the interface systems and plan crew operations, mission managers will need to understand how human performance may be altered during these phases of flight.
Consequently, Ames researchers enhanced the 20-g centrifuge with a vibration capability, enabling it to simulate both the vibration and the elevated g-forces experienced during launch. A first set of experiments collected human performance data on the impact of vibration on display readability and usability; the results were used to set limits on acceptable vibration levels.
In a second set of experiments, three types of human performance data were collected:
1) eye-fixation data to measure gaze-stabilization reflexes
2) gaze target-acquisition data to measure eye-movement reaction time, accuracy, and precision
3) manual pointing data to measure hand-movement reaction time, accuracy, and precision.
Using these data, scientists can dissect the contributions of each of the major sub-elements (i.e., vestibular, visual, biomechanical, and proprioceptive) to the overall impact of G-plus-vibration loading on human sensorimotor control.
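To make the three performance measures concrete, here is a minimal sketch of how reaction time, accuracy and precision might be summarized from target-acquisition trials. The trial format (reaction time in seconds, angular error in degrees) and the function name are assumptions for illustration, not the experiments' actual analysis pipeline.

```python
import statistics

def summarize_trials(trials):
    """Summarize target-acquisition trials.

    Each trial is a (reaction_time_s, error_deg) pair: the assumed time
    from target onset to movement onset, and the angular distance from
    the target at acquisition.
    """
    rts = [t[0] for t in trials]
    errs = [t[1] for t in trials]
    return {
        "mean_rt_s": statistics.mean(rts),        # reaction time
        "accuracy_deg": statistics.mean(errs),    # mean error = accuracy
        "precision_deg": statistics.stdev(errs),  # error spread = precision
    }
```

The distinction matters for the centrifuge studies: vibration could slow responses (reaction time), bias them away from the target (accuracy), or make them more variable (precision), and each effect points to a different sensorimotor sub-element.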
Featured example: Planning for air traffic management within NextGen
How do you know if you need more air traffic controllers?
NASA researchers often work in collaboration with the Federal Aviation Administration (FAA). One major new effort for the FAA is the development of a new set of concepts and automation tools for managing air traffic operations in the future, called the Next Generation Air Transportation System, or "NextGen". Under NextGen, aircraft will be able to fly more efficient and direct routes than are available today.
In support of this activity, NASA researchers in the Airspace Operations Lab conducted a series of studies to determine how these routes would be developed, evaluated and coordinated by air traffic control, then issued to the pilots. One question was whether a new air traffic control position would be needed, or whether, with new tools and procedures, the work could be performed by current controller staff. NASA concluded that a new position was not needed, provided that appropriate automation tools and coordination procedures were implemented. The FAA is in the process of incorporating NASA's recommendations into its planning for NextGen.