NASA Podcasts

NE@Robonaut 2
› Download Vodcast (133MB)

[To Enable Closed Captioning]

Note: if you're using QuickTime X (Mac OS X Snow Leopard), closed captioning is already available.

1. After downloading the vodcast, locate the downloaded file on your computer. Your file should have an .mp4 extension and include "NE00041910_at25_Robonaut2" in the name. Please note that our example file name is generic.

2. Change the extension of the file from .mp4 to .m4v.

3. Using the latest version of QuickTime, open the new .m4v file.

4. Closed Captioning and chapter marks should now be available options.

5. Enjoy the captioned show!
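Step 2 above amounts to renaming the downloaded file. In a Mac or Linux terminal, this can be done with a single `mv` command. A minimal sketch (the file name below is the generic example from step 1; substitute your actual download, and the `touch` line only creates a stand-in file for illustration):

```shell
# Stand-in for the downloaded vodcast (use your real download instead).
touch NE00041910_at25_Robonaut2.mp4

# Renaming changes the extension from .mp4 to .m4v so QuickTime
# exposes the closed-captioning and chapter options.
mv NE00041910_at25_Robonaut2.mp4 NE00041910_at25_Robonaut2.m4v
```

You can also rename the file in the Finder or Windows Explorer; the command line is just one way to do it.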

NE@Robonaut 2

Robonaut 2
- Josh Mehling
- Adam Sanders

Robonaut 2 is more than just chiseled abs of titanium and brilliant neural net processors. It is actually the logical next step when it comes to humans and robots working hand in hand, for both NASA and General Motors. Jacky Cortez and Franklin Fitzgerald talk to NASA's Josh Mehling and GM's Adam Sanders to get a first-hand look at many of the new and improved features of Robonaut 2 prior to its journey to the ISS. In fact, the only restraining bolt needed in this episode is for the not-so-chiseled Co-Host.

JACKY: Welcome to the Robonaut 2 Lab. We’re here with Josh Mehling. I’m a little star-struck with Robonaut 2 standing next to you. Can you tell me a little about your involvement with Robonaut 2?

JOSH: I’m the Lead Mechanical Engineer on the project. That means I’m responsible for the structural design and mechanism design. The robot as a whole, here in this lab, we’re looking at designing the next generation of robot that can work side by side with people and help make their work more efficient.

JACKY: You’re saying side by side, not necessarily replacing anyone, but helping us.

JOSH: Right. The whole design of the robot has been focused on something that can work safely next to people. The software, the controls, the mechanism, it’s all about helping people be more efficient with their jobs.

FRANKLIN: Robonaut 2 looks like a human torso. Why is it set up like that?

ADAM: If you have a humanoid robot, it can use the same tools as a human can use. It can help assist humans. And it also moves in a way similar to how humans move. It’s very easy for you or me to work with a robot and understand and visualize how it is going to move, as opposed to a standard industrial robot. It’s a little harder to visualize that.

JOSH: We have tendons inside the forearm that pull the fingers in various motions. We have very similar joint structure to people. We have the different degrees of freedom in the thumb and the fingers so it all bends very human-like.

JACKY: As humans we can touch something and sense when we are holding something too hard. Will Robonaut be able to do that?

JOSH: That has been one of the main advances we’ve gone after in this robot, that sensing ability, especially the sense of touch. Throughout the robot’s hand we have various sensors that can give you a feel for the forces you’re exerting on the environment and the sense of touch of the things you’re picking up, and that really allows it to take the next step towards doing the human tasks that you do by feeling. It has a full seven degrees of freedom, just like the human arm. That gives you the ability to put your hand on an object and still move your elbow. That’s really effective when you’re doing tasks, working side by side with somebody. As you’re doing something you can keep your hands in the position they need to be in, but you can move out of the way if someone is bumping into you. It gives you the extra ability to be flexible on the fly.

JACKY: What about the head? Can it see anything?

JOSH: We have cameras built into the head. It really is a powerful place for a camera platform. There is a reason our eyes as people focus best on the work surface at the intersection of our two hands. We followed that same design structurally with the robot. We have the cameras mounted right where it would make sense for them to see what the hands are doing.

ADAM: When you’re looking at tasks you want to automate, the three types of tasks that are prime for automation are applications that are dull, dirty, or dangerous. These are the environments you don’t want to put a human in. These are the kinds of things robots can assist us with and really help make our jobs easier and safer.

JOSH: There are a variety of different control methods you could use with the robot. One is somebody using a joystick or virtual reality equipment to move the robot through motions. On top of that you can build levels of computer control, autonomy, so the robot can think and do actions on its own. There’s a whole range of how to control the robot, whether it is preprogrammed, learning from its environment and doing tasks, or whether you’re controlling it with a human in the loop.

JACKY: This is Robonaut 2. There must have been a Robonaut 1. What are the differences between 1 and 2?

JOSH: With Robonaut 2, we’ve gone for faster actuators. It’s about four times faster. It’s a little bit stronger, but the real advances are in the dexterity and the sensing, especially in the hands. It moves a lot more humanlike. It can make better grasps and has a variety of different sensors, so it can better learn about its environment, not just through the cameras but also through its sense of touch. That is good for doing tasks and also good for being safe around people.

JACKY: Is the idea to only keep the top half of the body or is it going to have legs as well?

JOSH: Right now, it’s a modular design. We have an upper body. You could add legs onto it. You could also add other lower bodies. You could put it on a wheel platform for handling rough terrain. You could put a zero-g leg on it so it could climb around the outside of the space station. It leaves options open for different lower bodies for different operations.

FRANKLIN: Can you give me some future, real-world applications for the Robonaut 2 robot?

ADAM: Sure. We see a lot of technologies from this robot being applied to our cars and also in our plants. And really, just the idea of humans and robots interacting safely, working together to do a better job.

JOSH: Wherever we’re sending people, wherever we’re trying to perform some science or do work, whether it’s on the space station or on the moon or Mars, wherever we might go, we’re looking to take the robot and assist in those types of missions. It’s not just the robot for the robot’s purpose but the internal technology can be applied to other things we’re doing, computer control, autonomous vision, force sensing. It’s a wide application of things beyond robotics also.

JACKY: I know you’re an engineer. You worked on Robonaut 1, and now Robonaut 2. Did you ever think you’d work on a robot growing up?

JOSH: I think that was always a dream from the time I was playing with Legos as a kid but it’s hard to imagine a more exciting job. It’s really like big kid Legos.

JACKY: Right.

FRANKLIN: You’re watching NASA EDGE, an inside and outside look at all things NASA.

FRANKLIN: Can I get one of these at Best Buy?
