Raven (STP-H5 Raven) - 12.15.16
Future robotic spacecraft operating thousands of miles from Earth need advanced autopilot systems to help them safely navigate and rendezvous with other objects. The Raven investigation studies a real-time spacecraft navigation system that provides the eyes and intelligence to see a target and steer toward it safely. The investigation enables future exploration missions near Earth and beyond, including satellite servicing and repair, asteroid exploration and redirect missions, and the Orion program.

Science Results for Everyone
Information Pending

Experiment Details
Ross M. Henry, Ph.D., Goddard Space Flight Center, Greenbelt, MD, United States
Goddard Space Flight Center, Greenbelt, MD, United States
Sponsoring Space Agency
National Aeronautics and Space Administration (NASA)
Technology Demonstration Office (TDO)
ISS Expedition Duration
September 2016 - February 2017; March 2017 - September 2017
Raven is developed, integrated, and operated by the Satellite Servicing Capabilities Office (SSCO), the same team that built and conducts the multi-phased Robotic Refueling Mission (RRM) experiment on the ISS. A previous, single-sensor version of Raven technology flew as the Relative Navigation Sensor (RNS) Payload on STS-125 during the Hubble Space Telescope Servicing Mission 4. The Raven visible camera is a repurposed flight unit from the STS-125 demonstration. Raven also reuses the flash lidar flown as part of the Sensor Test for RelNav Risk Mitigation (STORRM) demonstration on STS-134. Raven uses a version of the infrared camera manifested on STS-128, STS-131, and STS-135 as part of the Neptec TriDAR demonstrations. Both RRM and Raven are part of SSCO’s ongoing campaign to develop and test advanced technologies, tools and techniques for the Restore-L servicing mission.
- Many future exploration missions have to robotically or autonomously dock, capture, or land on exploration sites, whether those sites are other spacecraft, orbital outposts, asteroids, comets, or planetary moons. With such a breadth of missions ahead of it, NASA desires a single solution to meet each of these missions' relative navigation needs. Raven seeks to demonstrate that a single suite of sensors, avionics, and algorithms can indeed navigate a spacecraft to cooperative objects (those with visual aids designed for navigation purposes) as well as to legacy satellites that were not designed to be serviced, or to natural planetary bodies.
- Raven uses a complex and compact system to image and track the many visiting vehicles that journey to the space station each year. It contains three separate sensors that span multiple wavelengths, coupled with high-performance, reprogrammable avionics that process imagery. Raven's vision processing and navigation algorithms convert the imagery collected by the sensors into an accurate relative navigation solution between Raven and the vehicle. Raven also uses a two-axis gimbal to point its sensors at the vehicle, extending the time it can follow the vehicle through docking or berthing.
- Successful Raven operations allow NASA to demonstrate that a similar system can be used to pilot a satellite-servicing vehicle to a client spacecraft, verifying a key technology for the Restore-L servicing mission. Additionally, Raven demonstrates that NASA can move to a common specification on autonomous rendezvous and docking technologies, which means that a single set of hardware could meet the needs of multiple missions. This approach saves the government money over time, as it provides a standard specification that mission managers can order from, and that vendors can build to.
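The two-axis gimbal pointing described above reduces to simple geometry: given the target's relative position in the sensor's frame, compute the azimuth and elevation angles that place the target on the boresight. The sketch below illustrates that calculation only; the frame convention and function name are assumptions for illustration, not Raven's actual flight software.

```python
import math

def gimbal_angles(rel_pos):
    """Compute two-axis gimbal angles (azimuth, elevation, in radians)
    that point the sensor boresight at a target at relative position
    rel_pos = (x, y, z). Hypothetical body frame: x forward along the
    stowed boresight, y to the right, z down."""
    x, y, z = rel_pos
    azimuth = math.atan2(y, x)                     # pan toward the target
    elevation = math.atan2(-z, math.hypot(x, y))   # tilt up/down to the target
    return azimuth, elevation

# Example: a vehicle 100 m ahead, 10 m right, 5 m below the boresight
az, el = gimbal_angles((100.0, 10.0, 5.0))
```

A real tracker would feed these angles to a rate-limited servo loop and update them as the relative navigation solution refreshes; the geometry itself, however, is just the two arctangents shown here.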
Future exploration spacecraft may dock with, capture, or land on many different objects, from other spacecraft to asteroids, moons and planets. An autonomous navigation system that could handle any of these tasks would be more versatile than systems designed for one type of use. The Raven investigation demonstrates that a single set of sensors, controls, and computer software can be used to rendezvous with both human-built satellites and celestial objects.
Raven uses a tracking system to image the many crew and cargo ships that visit the International Space Station (ISS) each year. Its vision processing and software convert images into a relative navigation solution between Raven and the object. This software could also potentially inform and educate teams working on robotic navigation technology on Earth, including autonomous underwater vehicles and unmanned aerial vehicles.
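One ingredient of turning imagery into a relative navigation solution is range estimation, and the simplest version uses the pinhole-camera model: a target of known physical size subtends fewer pixels the farther away it is. The sketch below shows only that textbook relation under assumed parameter names; Raven's actual algorithms fuse visible, infrared, and lidar data and estimate full six-degree-of-freedom pose.

```python
def range_from_apparent_size(focal_len_px, target_width_m, width_px):
    """Pinhole-camera range estimate. A target of known width
    target_width_m (metres) that appears width_px pixels wide, imaged
    by a camera with focal length focal_len_px (expressed in pixels),
    lies at approximately focal_len_px * target_width_m / width_px
    metres. Illustrative only."""
    return focal_len_px * target_width_m / width_px

# A 10 m-wide vehicle spanning 200 px with a 2000 px focal length
r = range_from_apparent_size(2000.0, 10.0, 200.0)  # -> 100.0 m
```

This single-measurement estimate is noisy in practice, which is one reason multi-sensor suites and filtering are used for rendezvous-class navigation.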
Operational Requirements and Protocols
- Operations of the experiment are conducted from the ground via TReK workstations by the Principal Investigator.
- Perched on an outside platform of the ISS, Raven is a technology demonstration of the real-time, relative navigation part of this system.
- As vehicles approach and depart from the space station, Raven tracks them in action and sends collected data to Earth.
- NASA operators then evaluate how Raven's technologies work as a system and make system adjustments to increase Raven's tracking performance.
- Over the course of Raven’s nominal on-orbit lifespan, Raven is expected to monitor approximately 50 individual rendezvous or departure trajectories.
Decadal Survey Recommendations
Information Pending
Satellite Servicing Capabilities Office, GSFC