The Mars Science InterfaCE (MSLICE) – pronounced "EM-slice" – is a collaborative effort between NASA's Ames Research Center, Moffett Field, Calif., and NASA's Jet Propulsion Laboratory (JPL), Pasadena, Calif. The Human Computer Interaction Group at Ames led the planning component of the MSLICE system, enabling mission scientists and engineers to prepare the hundreds of daily activities for the rover to perform. The planning software ensures that mission scientists can work closely with both rover and instrument engineers to create a plan that will maximize scientific data and be safe for the rover to perform. By allowing scientists to understand how long Curiosity's activities will take to perform, what instruments to use, and what resources the activities will consume, the software lets scientists focus on science while supporting the generation of complex activity plans.
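The kind of constraint checking described above can be illustrated with a minimal sketch. The activity names, durations, and budget figures below are invented for illustration and are not actual MSLICE data structures or rover parameters:

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    duration_min: float   # minutes of rover time consumed
    energy_wh: float      # watt-hours of energy consumed

def plan_is_feasible(activities, time_budget_min, energy_budget_wh):
    """Return True if the plan fits within both the time and energy budgets."""
    total_time = sum(a.duration_min for a in activities)
    total_energy = sum(a.energy_wh for a in activities)
    return total_time <= time_budget_min and total_energy <= energy_budget_wh

# Hypothetical one-day plan checked against hypothetical budgets.
plan = [
    Activity("mastcam_panorama", 45, 120),
    Activity("drive_10m", 30, 300),
    Activity("mahli_closeup", 20, 80),
]
print(plan_is_feasible(plan, time_budget_min=480, energy_budget_wh=900))  # True
```

A real planning tool must also reason about ordering, thermal limits, and communication windows; this sketch only shows the simplest form of resource accounting.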
“Each day, [Mars Science Laboratory] scientists and engineers will be under time pressure to make sense of the data that is sent back from the rover and to plan what the rover should do the next day on Mars,” said Joy Crisp, Curiosity deputy project scientist at JPL. “MSLICE is the collaborative software tool that will enable our team of hundreds of scientists and engineers to view data products from Mars, select targets, prepare rover activities and command sequences that meet all of the constraints we have. We will be relying heavily on this tool.”
"The MSLICE developers have created a clean and intuitive interface between we humans and our wonderfully complex machine on Mars,” said Ashwin Vasavada, deputy project scientist for Curiosity at JPL. “With MSLICE, our 400 scientists around the world can quickly view the latest data, share results with each other, plan the rover's activities within the available resources, and generate detailed commands to send to the rover. It's an amazing tool that enables us to be scientists onMars, and not programmers."
Engineers in the Human Systems Integration Division and the Intelligent Systems Division contributed to the design and development of the planning system. To build MSLICE, Ames and JPL engineers used open-source software, including Eclipse, Java from Oracle, and Rhino from Mozilla, among others.
Footage of the MSLICE software
The Mars Science Laboratory (MSL) mission's Curiosity rover has a suite of scientific instruments and sensors onboard that includes four science cameras: two "Mastcam" mast-mounted remote sensing cameras, the Mars Hand Lens Imager (MAHLI) microscope mounted at the end of the rover's Robotic Arm (RA), and the Mars Descent Imager (MARDI), a chassis-mounted descent imager. A team of engineers and computer scientists at Ames, in collaboration with Malin Space Science Systems (MSSS), San Diego, Calif., and JPL, has provided the Mastcam, MAHLI, and MARDI (MMM) science teams with interactive 3D visualization and simulation software, dubbed "Antares," for developing MMM instrument command sequences, maintaining situational awareness, and better understanding the rover's surroundings.
To operate Curiosity from Earth, mission scientists and engineers must synthesize an understanding of the rover's state and environment from remote sensor and science instrument data. A communications delay of approximately 15 minutes, due to the distance between Earth and Mars and the limited high-bandwidth communication windows to orbiting satellites, compounds the difficulty of rover operations. Because of this, careful "offline" planning of activities is necessary to mitigate risk and enhance productivity.
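The one-way light-time delay follows directly from the Earth-Mars distance, which varies from roughly 0.37 AU at closest approach to about 2.67 AU near conjunction. A short sketch (the 1.8 AU distance below is an illustrative value, not a mission figure):

```python
# One-way signal delay between Earth and Mars for a given distance.
SPEED_OF_LIGHT_KM_S = 299_792.458
AU_KM = 149_597_870.7

def one_way_delay_minutes(distance_au):
    """Minutes for a radio signal to cross the given distance in AU."""
    return distance_au * AU_KM / SPEED_OF_LIGHT_KM_S / 60.0

# At ~1.8 AU the one-way delay is close to the ~15 minutes cited above.
print(round(one_way_delay_minutes(1.8), 1))  # 15.0
```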
Ames' Antares software provides a number of capabilities that enhance the MMM science teams' ability to comprehend the operational environment and productively plan image acquisition command sequences, including:
- Interactive 3D visualization of reconstructed Mars Digital Terrain Models (DTMs) and Digital Elevation Models (DEMs) covering a large range of scales (sub-millimeter to kilometer)
- Interactive command sequence editing that allows users to edit camera command parameters while previewing a simulation of what the new images will look like
- Terrain following – or the simulation’s ability to virtually display the rover’s progress as its wheels move across the Martian terrain
- Robotic Arm inverse kinematics that allow users to use a computer mouse to position and point the MAHLI camera at the end of the robotic arm
- Camera view visualizations to predict what the rover will see at a new location before it arrives there
- Interactive simulation of shadows based on the time-of-day on Mars
- Measurement tools, including rulers for distance, location, and heading, as well as coordinate grids
- Terrain overlays such as maps that display false color slopes and elevations
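A slope overlay of the kind listed above can be derived directly from a DEM by differencing neighboring elevation samples. The 3x3 elevation grid and 1-meter cell size below are invented for illustration:

```python
import math

def slope_degrees(dem, row, col, cell_size_m):
    """Slope at an interior DEM cell, via central differences on elevation."""
    dz_dx = (dem[row][col + 1] - dem[row][col - 1]) / (2 * cell_size_m)
    dz_dy = (dem[row + 1][col] - dem[row - 1][col]) / (2 * cell_size_m)
    return math.degrees(math.atan(math.hypot(dz_dx, dz_dy)))

# Hypothetical DEM: elevation rises 0.5 m per meter in the x direction.
dem = [
    [10.0, 10.5, 11.0],
    [10.0, 10.5, 11.0],
    [10.0, 10.5, 11.0],
]
print(round(slope_degrees(dem, 1, 1, cell_size_m=1.0), 1))  # 26.6
```

Mapping such slope values to a color ramp yields the false-color slope overlays described above.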
The Ames Antares team also provides the MMM team the capability to derive image-based data products that augment other imaging data products produced by JPL. Examples include terrain models (DTMs and DEMs), which are generated using a stereo correlation technique to calculate the 3D positions of objects and a sense of distance, similar to the way human vision works. Other examples include:
- Aligned and merged DTMs and DEMs from multiple locations
- Panoramas created using a mosaic of images
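At the heart of the stereo correlation technique mentioned above is a simple triangulation relation: once a feature has been matched in both images of a stereo pair, its depth is the focal length times the camera baseline divided by the pixel disparity. The focal length, baseline, and disparity below are illustrative values, not actual Mastcam parameters:

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Depth of a matched feature from a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A feature with 20 px disparity, 1000 px focal length, 0.25 m baseline:
print(depth_from_disparity(1000.0, 0.25, 20.0))  # 12.5 (meters)
```

Repeating this for every matched pixel produces the dense range data from which DTMs and DEMs are built.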
The Ames team also developed a data services architecture to support operations with a distributed science team. Due to the mission's two-year duration, science team members and instrument staff will be distributed across various institutions for the majority of the mission; the science team will be collocated at JPL only for the first three months.