NASA Podcasts

NASA EDGE: Technology Demonstration Missions (TDM) Part 1
› Download Vodcast (472MB)

NASA Technology Demonstration Missions
- John McDougal
- Terry Fong
- Mark Micire
- Mike Weiss
- Michelle Munk


ANNOUNCER: The future is shaped by technology; by the advances we make here and in space. What new technologies are being developed at NASA? How will demonstrating these technologies help us learn more about our universe? And is the Co-Host a technological liability? Find out on NASA EDGE.


BLAIR: Now John, I understand you are the Program Manager for TDM or Technology Demonstration Missions. Tell me a little bit about what TDMs are.

JOHN: Within NASA, missions want to use technologies that are proven and demonstrated, because these missions are generally very expensive. So within the Technology Demonstration Missions program, what we would like to do is demonstrate new and revolutionary technologies in lower-cost, simpler missions where we can accept more risk, to enable those technologies to be infused into the more expensive missions. We look at other spacecraft and see whether there are environments where we can put our demonstrations on those other missions. One approach is ride sharing, where we launch on somebody else’s rocket with other spacecraft; there may be multiple spacecraft on the same rocket. Commercial communication satellites are a good example, because they are already going to a particular point in space. We put the payload on their satellite and ride along with them. Another approach would be, as in the case of MEDLI, to put experimental sensors on the heat shield of the Mars Science Laboratory, fly into the planet with it, and take data as we’re entering the atmosphere.

BLAIR: There are how many TDMs? Are there like nine?

JOHN: There are nine Technology Demonstration Missions in our portfolio right now.

BLAIR: Thinking of your broad TDM schedule, what would be the best three missions for us to cover in our first episode?

JOHN: The ones I would talk about now would probably be LCRD, the Laser Communication Relay Demonstration; Human Exploration Telerobotics; and MEDLI.

BLAIR: MEDLI is the heat shield technology deal, kind of like Captain America’s shield.

JOHN: Well, it’s actually some thermal and pressure sensors that flew in on the Mars Science Lab aeroshell. It’s currently in the investigation and analysis phase.

BLAIR: If you want to get technical but yes.

JOHN: Well, it is technology, so we do get technical.

BLAIR: I’ve always wanted to be a part of an investigation. Do you think I could maybe be the lead investigator for MEDLI as a new technology?

JOHN: Do you think you can lead an investigation?

BLAIR: Yeah, seriously. I’m on it.

CHRIS: John, are you sure this is a good idea?

JOHN: If he can handle the pressure…

CHRIS: Oh, he’ll bring his “A” game.

FRANKLIN: Well, at least his “B” game.

CHRIS: We’ve got to cover the first three TDMs, so we better go.

FRANKLIN: Yeah, Terry Fong is waiting for us over at Ames.

FRANKLIN: Terry, give me the Telerobotics big picture as it relates to human exploration and working alongside robots.

TERRY: Sure. The Telerobotics project is all about how you take remotely operated robots and use them to improve the way humans live and work in space.

FRANKLIN: Under Telerobotics, you have quite a few projects you’re working on. The last time I met you, we were at Moses Lake in Washington State, and you were working with K10. Tell us a little bit about where that has gone since then.

TERRY: The K10 rover project is something we’ve been doing for about seven years now. Back when we were at Moses Lake, we were looking at how you use robots like K10 to do scouting, working ahead of a human mission on surface-level reconnaissance. Since then we’ve also looked at the flip side: how do you use robots to follow up after humans? We did some testing up in the Canadian Arctic a couple of years ago, where we ran a human mission and then a robotic mission following it. So there’s this whole theme of robots working before and robots working after, and now, with the Telerobotics project, we’re looking at the middle ground: robots working side by side, in support, or in parallel. This coming summer, we’re going to be doing an interesting test where astronauts on the Space Station control the K10 robot here in this planetary analog environment. To do that, we’re going to finish transforming this space into a small-scale replica of portions of the moon, because the mission we’re trying to simulate is a possible future mission to the far side of the moon, where a robot like K10, or a future version of K10, would deploy a lunar telescope. To simulate that mission, we’ve created this large space that we’re going to use as part of our testing.

FRANKLIN: Another project that falls under Telerobotics is R2.

TERRY: Yeah.

FRANKLIN: How is R2 being used on the ISS?

TERRY: R2 is a dexterous, humanoid robot. We’re trying to get it to work with the tools astronauts use, to do things that require manipulating with your hands or arms. At the end of the day, it’s really all about doing the work that’s unproductive for astronauts to spend their time on.

FRANKLIN: Could we soon see R2 looking something like C-3PO, with legs?

TERRY: Funny you should ask. Over the next year or so, one of the things we’re really looking forward to is making the current version of R2 mobile. On the Space Station right now, R2 sits on a pedestal in one location. We’re actually learning a lot about how to operate a two-armed dexterous robot in space, but during the next year we’re going to be sending up a pair of legs. This will allow it to be self-mobile; it can literally climb around inside the Space Station to get from point A to point B.

FRANKLIN: Just like an astronaut?

TERRY: Just like an astronaut.

FRANKLIN: Are we making science fiction reality?

TERRY: I think in a lot of ways we are. A lot of our inspiration for the things we do at NASA comes from science fiction, whether in print or in the movies. There are a lot of really great concepts out there that, back in the day, we didn’t have the technology to make real, but now we do. Now we have the opportunity to see how they work. That’s really part of what we’re trying to do: take those ideas and concepts, be they from science fiction or elsewhere, and make them real.

[Light saber sound]

MARK: Interestingly, the lightsaber training droid was a source of inspiration for three different free-flyer projects that NASA and MIT have worked on in the past, and SPHERES is actually one of those.

CHRIS: What is Smart SPHERES all about?

MARK: Smart SPHERES is all about taking the existing SPHERES platforms we have on Station and expanding their capabilities. We’re very interested in seeing how we can use robotics to help astronauts and ground controllers with their daily tasks, acting as an assistant and providing eyes, ears, and other sensors in areas of the Station they wouldn’t otherwise be able to reach.

CHRIS: I’m looking at this SPHERE next to you. The first thing I notice is the cell phone.

MARK: It’s a Samsung Nexus S phone. It’s the same phone that you can buy in any store.

CHRIS: Okay.

MARK: We then modified it for flight; you have to make modifications to fly on the ISS. We also modified the software and the operating system running under the hood. It ends up being a really great resource for us. It’s humbling to say that NASA can’t even outrun the advancements happening in the cell phone industry.

CHRIS: If you gave me the cell phone number for that phone, I could call it right now on Station?

MARK: Actually, all of the cell phone capabilities have been disabled. So, much like you have to put your phone into airplane mode when you are on an airplane, we have to disable all of the cell phone capabilities when it’s on ISS.

CHRIS: Give us a breakdown of what’s inside that SPHERE.

MARK: Sure. Internally, the SPHERE has liquid CO2 tanks, the same kind you would use on a paintball gun. The liquid CO2 comes up through this regulator on top, which takes it from high pressure to low pressure.

CHRIS: Okay.

MARK: The low-pressure gas then passes through tubes on the inside and comes out through these little thrusters. There are two of these little silver thrusters on each of the faces, and each thruster has a solenoid in it, an electromagnet the computer uses to control whether gas comes out of it or not.

CHRIS: Okay.

MARK: It uses the liquid CO2 to move itself through the Station. It also has these little ultrasonic beacons, which help the SPHERE know where it is. If you tell the SPHERE to go from point A to point B, it needs to know it’s at point A before it can figure out how to get to point B. Inside, of course, there’s a computer doing all the math involved in figuring out which combination of thrusters to fire to go from point A to point B.
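Mark’s description of the guidance loop, know where you are, then fire the thruster combination that closes the gap to the goal, can be sketched in a simplified form. This is a hypothetical illustration, not NASA flight code; the axis-by-axis bang-bang logic and the deadband value are assumptions made for clarity.

```python
# Minimal sketch (not actual SPHERES software): choose which axis
# thrusters to fire to drift from point A toward point B, given a
# position estimate (e.g., from the ultrasonic beacons).

def position_error(current, target):
    """Per-axis error between where the robot is and where it should be."""
    return [t - c for c, t in zip(current, target)]

def thruster_commands(current, target, deadband=0.05):
    """Pick a +/- thruster per axis; stay quiet inside the deadband
    so the robot doesn't chatter around the goal."""
    commands = {}
    for axis, err in zip("xyz", position_error(current, target)):
        if err > deadband:
            commands[axis] = "+"   # fire the thruster pushing along +axis
        elif err < -deadband:
            commands[axis] = "-"   # fire the thruster pushing along -axis
        else:
            commands[axis] = "0"   # keep the solenoid valves closed
    return commands

# Example: robot at point A, goal at point B (meters, made-up numbers)
print(thruster_commands([0.0, 0.0, 0.0], [1.0, -0.5, 0.02]))
# -> {'x': '+', 'y': '-', 'z': '0'}
```

A real free flyer would run a proper controller with velocity damping and a thruster-allocation solve; the point here is only the sense-compare-fire cycle Mark outlines.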

CHRIS: Can you control that through a joystick, or is it autonomous? And is it controlled by astronauts or by ground control?

MARK: All of the above. SPHERES was originally designed to run canned experiments: a programmer would set up the flight path he wanted the SPHERE to fly, that would be uploaded to the SPHERE, the crew would load it and say go, and the SPHERE would begin flying around. Part of what the Smart SPHERES project is doing is extending that, so ground controllers are able to establish the path and do all of the programming from the ground.

CHRIS: Do you maybe see it working on the outside of the International Space Station to help astronauts?

MARK: Absolutely. The idea of robots helping us in all the situations in which we’re doing space science just seems natural. We live in a world today where we have Roombas vacuuming our floors and autonomous cars being developed that will eventually drive us. The idea of astronauts working and interacting with robots that help them with everyday tasks is very important to the ISS and any other future missions we’re working on.

[Blair humming]

BLAIR: All right.

[Adjusting desk chair]

INVESTIGATOR 3: Blair, you do realize you’ve been given the responsibility of being the lead investigator on this mission?

BLAIR: Absolutely. I’ve got all kinds of data here, some information, documents, so I should be good to go.

INVESTIGATOR 3: You have twenty-four hours.

BLAIR: [sighs] Okay, let’s get busy here. [yawning]


CHRIS: Mike, tell us, what is LCRD all about?

MIKE: The Laser Communication Relay Demonstration: a demonstration of real-time optical communication from Earth to geosynchronous orbit.

CHRIS: So, right now we’re pretty much communicating, I understand, by radio frequency.

MIKE: Right. That’s really where we’re at today, but we’re quickly reaching the limits of what we can do with radio frequency transmissions. And as you know, NASA has some pretty bold plans to go farther and transmit a whole lot more data. We really need a new architecture, a new way of communicating. We’re going to need things that can transmit at faster rates and are more efficient. Imagine if we could make humongous trades in mass and power and not have to spend those kinds of resources essentially on communication systems.

CHRIS: So, with laser communications, you can send more information over a wider bandwidth? How does that work?

MIKE: It’s basically about a 10- to 100-times increase in the amount of data and the speed of data.

CHRIS: You look at these radio dishes. With laser, we’re not even looking at a dish, are we?

MIKE: You’re looking at something that transmits a laser beam, basically a telescope, and something that collects that beam: either another telescope or some type of photon collector or detector.

CHRIS: When you send an RF signal from Mars, from the moon, or from a spacecraft, it spreads over a wide area, but with laser it’s going to be a narrow angle.

MIKE: Right. It’s a narrow beam so you need a whole lot less power.

CHRIS: Since it’s going to be a narrow beam, could it be harder to pick up, in terms of collecting that information down on Earth?

MIKE: Well, it’s definitely difficult to pick up. A lot of technology development has gone into what we call “pointing and tracking.” We’ve definitely worked on that; we know how to do it, so we’ve solved that problem. Now it’s just a matter of tackling the other things that are a little bit different with optical communications, such as the Earth’s atmosphere.

CHRIS: Right. I was just going to say, what happens if it’s a cloudy day or there’s a storm passing over the relay station?

[Man laughing]

MIKE: Well, there are two ways to solve that problem. One is to get rid of the atmosphere. I don’t think that one’s going to work.

CHRIS: Right. Right. Right.

MIKE: The other one is to deal with the atmosphere.

CHRIS: Okay.

MIKE: You can put your receivers in places where you know the atmosphere has very little effect. The other is to correct for what the atmosphere does to these signals, using techniques such as adaptive optics. We have that technology now, and we’re going to apply it to these laser transmissions.

CHRIS: In looking at LCRD, there’s another mission coming up shortly that’s LLCD or Lunar Laser Communication Demonstration. Is that a precursor to LCRD?

MIKE: It is. Lunar Laser Comm is going to be a demo of a laser transmission from the moon. To give you an example: LRO, a highly successful mission that produced a huge amount of data, a digital map of the moon, transmitted at 150 megabits a second. Lunar Laser Comm is going to transmit at 622 megabits a second, a four-to-five-times increase in speed. From that, we’re going to go demonstrate a full network on LCRD. We’re going to have two types of modulation; one is going to be slightly faster than what we can do with today’s RF. But it’s not so much about the speed; it’s really about demonstrating that relay capability. Nobody has ever done that before. That is what LCRD is all about: the optical equivalent of what we do with TDRSS today.
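The rates Mike quotes can be sanity-checked with quick arithmetic. The two data rates below come from the interview (150 Mbps for LRO, 622 Mbps for Lunar Laser Comm); the 1-terabit data set is a made-up example used only for comparison.

```python
# Back-of-the-envelope comparison of the downlink rates quoted above.

LRO_RATE_MBPS = 150     # LRO radio downlink, per the interview
LLCD_RATE_MBPS = 622    # Lunar Laser Comm demo downlink, per the interview

speedup = LLCD_RATE_MBPS / LRO_RATE_MBPS
print(f"Speedup: {speedup:.1f}x")   # matches the "four to five times" quoted

# Time to downlink a hypothetical 1-terabit data set at each rate:
DATA_TBITS = 1.0
for name, rate_mbps in [("RF (LRO)", LRO_RATE_MBPS), ("Optical (LLCD)", LLCD_RATE_MBPS)]:
    hours = DATA_TBITS * 1e6 / rate_mbps / 3600   # terabits -> megabits -> hours
    print(f"{name}: {hours:.1f} h")
```

The ratio works out to roughly 4.1x, which is consistent with Mike’s “four to five times” figure.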

CHRIS: In theory, the hope is that one day TDRSS will be replaced with this optical laser system?

MIKE: Yeah, probably not right away. It will probably be a combination of radio frequency and optical, but eventually, once we prove all these technologies and make some improvements in other areas, like disruption-tolerant networking and onboard processing, the hope is that some day we will use only optical systems. Imagine if we’re on the moon sending real-time data back, along with high-definition television…

CHRIS: Right.

MIKE: …along with engineering data, along with spacecraft telemetry, we are talking huge, huge data volumes.

CHRIS: Right.

MIKE: If we’re going to get there, we need different communication systems, because today’s just aren’t going to get us there.

[Wake Up song]

[Phone beeps]



[Video noises]

VIDEO VOICE: You’ve lost communication.


[Bull horn]

BLAIR: Thanks, Michelle, for coming in this morning to this investigative part of our review. It’s very important that we understand all the aspects, phases and success of Melody.


BLAIR: Yeah, MEDLI. Okay, could you please explain for me the technology behind MEDLI?

MICHELLE: MEDLI is a series of pressure and temperature measurements that we made on the heat shield of the Mars Science Laboratory as it descended to the surface of Mars. On August 5th or 6th, depending on what time zone you were in…

BLAIR: What time zone were you in?

MICHELLE: I was actually in the Eastern Time zone.

BLAIR: Good. Eastern Time zone. Very good.

MICHELLE: For us, it landed at 1:31 a.m. We measured the pressure and temperature on the heat shield as the landing was occurring. That gave us more information to validate our models, so we can better design the next Mars lander.

BLAIR: So, you were gathering data during the MSL mission? Explain a little more clearly what that data was.

MICHELLE: We had seven pressure measurements. From before we entered the atmosphere until a few seconds after the parachute deployed, we were reading the pressure on the front of the capsule eight times a second. This is a section of the aeroshell structure, and it would have the thermal protection system material on the front of it. We drilled a hole through that material, and the pressure was sensed through that hole, through this tube, and into the transducer; the signal then came out through an electrical harness and was collected by our electronics box, which was also mounted inside the aeroshell.
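As a rough illustration of the cadence Michelle describes (seven pressure ports, each read eight times a second), here is a back-of-the-envelope tally. The roughly seven-minute entry duration is an assumed figure borrowed from the “seven minutes of terror” phrase, not an exact MEDLI timeline.

```python
# Rough sketch of the MEADS pressure-data volume, using the figures
# from the interview plus an assumed ~7-minute entry timeline.

PORTS = 7          # pressure ports on the heat shield
SAMPLE_HZ = 8      # readings per second, per port
EDL_SECONDS = 7 * 60   # assumed entry, descent, and landing duration

samples = PORTS * SAMPLE_HZ * EDL_SECONDS
print(samples)     # total pressure readings over the assumed window
```

Even at this modest rate, the instrument returns tens of thousands of readings, which is why the team could reconstruct the trajectory in detail afterward.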

BLAIR: These are not the actual hardware units that were used during the mission, correct?

MICHELLE: No, those are now on the surface of Mars.

BLAIR: On Mars. Very good. Awesome.

BLAIR: Now, in your professional opinion, and let me remind you, you’re under oath: was that a success?

MICHELLE: It was a great success. You want to see the data?

BLAIR: Absolutely.

MICHELLE: Here is our MEADS pressure data, and here is a figure of the front of the heat shield; the coloring denotes the pressure. We have the highest pressure down here, in what we call the stagnation region, and lower pressures toward the outside of the heat shield. Remember that it’s 4.5 meters in diameter, so it’s fairly big, and it’s flying slightly nose-up, at an angle of attack, so it can steer itself down to the landing site, which is why it landed so precisely near Mount Sharp.

BLAIR: That’s a unique feature of steering itself.

MICHELLE: It is. This is the first time we’ve done that on Mars.

BLAIR: That’s pretty impressive.

MICHELLE: It was awesome.

BLAIR: Awesome.

MICHELLE: Here, the white dots show the seven locations of our pressure ports. The first thing we noticed about this data is that it’s incredibly clean: there’s no noise, and there are no dropouts. It’s beautiful. We were really happy with it.

BLAIR: No failure on any of the instruments during the process?

MICHELLE: Nope, it was all beautiful. From this, our team has been reconstructing the trajectory for the past three months. The reconstructions show that our predictions of how the vehicle was going to fly were pretty much right on. It’s all really good news.

BLAIR: One more question here, did you or anyone on your team experience all seven minutes of terror?

MICHELLE: We definitely did.

BLAIR: Any associated health problems as a result or is everyone okay at this point?

MICHELLE: Definitely.

BLAIR: Very good.

MICHELLE: We were all elated at the end.

BLAIR: Elated, okay. Very good. Well, I believe this concludes the investigative part of our review.

INVESTIGATOR 2: I’m sorry Michelle. We have a few questions.

INVESTIGATOR 3: What percentage of the heat shield was degraded during entry into the Martian atmosphere? And how happy are you with the data?

MICHELLE: That’s a really good question. We’re still analyzing the extent of the degradation of the heat shield. Our thermal plug right here, called MISP, is what measured the temperature as the heat soaked through the material. We also had a sensor that measured the char depth, or how much it burned. Right now, we’re showing that it charred about 0.3 inches down into the heat shield. It actually did not recess or burn away completely at all. We designed margin into the material, so we still had plenty of material left; we were nowhere near endangering the spacecraft at any time.

INVESTIGATOR 3: During the descent, you could actually see the heat shield fall away. What was going through your mind at that time? And did you stop the video to analyze the heat shield?

MICHELLE: Oh yes, we’ve definitely used the MARDI camera images over and over again. It’s such a high-resolution camera that in the first few frames, as the heat shield is falling away, you can almost read the serial numbers on the MEDLI transducers. You can see all the harnessing as it’s laid into the heat shield. Our team members out at JPL are also using the images to measure how the heat shield fell away, to see whether there were any wind effects on it and what its aerodynamics were. It was really interesting to us that it just fell straight down and never flipped or turned on its side. That camera data is useful in many ways.

BLAIR: I’m going to need to see the serial numbers from the…

MICHELLE: Okay. We have those.

BLAIR: Thank you.

INVESTIGATOR 2: Michelle, are you still in contact with the MEDLI team?

MICHELLE: Yes, I am.

INVESTIGATOR 3: We’re going to need you to reassemble that team.

INVESTIGATOR 2: Blair, please hand her File 1138.

INVESTIGATOR 3: Can we count on you?

MICHELLE: Absolutely.

INVESTIGATOR 3: Melody? Dude.

INVESTIGATOR 2: I’m not sure you actually formed the questions.

INVESTIGATOR 1: A number of your questions were either irrelevant or nonsensical.

INVESTIGATOR 3: You brought your “C” game.

BLAIR: I was just going to do an investigation, and I thought it would be funny to ask some serious questions and try to get behind the program. I didn’t know there were rules. I didn’t know this would be used against me. Somehow this stuff seems unconstitutional. I mean, you wanted an investigation; I did an investigation. A few “if,” “then,” “why” kind of questions. I didn’t stress anybody. I wasn’t pressuring anybody. I wasn’t insulting… well, maybe I was a little insulting, but who doesn’t get a little insulting when you’re doing an interview, or at least trying to act? And I haven’t been to acting school. I don’t know what acting is. I just kind of show up and do what I think is natural. Maybe that’s wrong, but I can’t help that. That’s who I am. I’ll never ask another question as long as I live, no matter what.

BLAIR: Okay, I think I have a better idea now of exactly what they’re looking for when they want to conduct an investigation of a TDM. So, next time I think I know better how I can prepare. It should be a much more rigorous investigation.
