Andres Almeida (Host): We all take risks nearly every day — from crossing the street to taking a new job. But what about managing risk when the stakes are as high as building a new spacecraft, landing a probe on another world, or protecting astronauts in orbit? There’s an entire discipline around this, and it’s both an art and a science. It’s called Risk Management, and today we’re talking about it with the person who helped write the literal handbook on it, Dr. Mary Skow, NASA’s Agency Risk Management Officer. This is Small Steps, Giant Leaps.
Welcome to Small Steps, Giant Leaps, your podcast from NASA’s Academy of Program/Project & Engineering Leadership. I’m your host, Andres Almeida.
Mary took a chance on us and came onto the podcast to talk about NASA’s approach to risk and the challenges of mitigating it when you don’t yet have all the data.
Hey, Mary, thanks for being here.
Dr. Mary Skow: Hi, how’s it going?
Host: Pretty well. How are you feeling today?
Skow: You know, I’m alive. I’m here. We’re just ready to go. That’s how we’re doing today!
[Laughter]
Host: Well, we’re glad you’re here! What does your role at NASA entail?
Skow: Sure, so, I’m the agency risk management officer here at NASA. This is the first time that this role has ever existed.
So, I have been in this role since November of 2023, and it consists of a lot of things, I will say. It started out as: hey, what does this role even look like at NASA? Is it like a chief risk officer at a federal agency? Or is NASA a little bit different, because we do more technical work than, say, financial work, right? Is it like the chief risk officer at a more technical place, in academia or in industry?
So, we first took a lot of hard looks at like, what this role could look like here, and we’ve done a lot of work there. So today, I would say that my role here is to help the agency make risk informed decisions at the highest of levels. We’re also here to help programs and projects and offices and organizations make risk-informed decisions at their levels.
So, depending on what level you sit at, we’re here to help you make good, risk-informed decisions. And when I say good, I just mean with the information you have available, in the timeframe you have available, and at the level of risk you’re willing to take to meet your objectives.
Host: So, it’s known that you won’t always have the data. How do you identify those opportunities?
Skow: First, you have to recognize that there is a lack of data, which, I’ll be honest, is not always easy to know, right?
Host: Until you’re in it.
Skow: Until you’re in it, or until you have someone come to you and ask you a question. This is why, for some folks who are aware of some of the work I do, I love learning about the power of questions, how to ask questions, and what questions can actually bring out in people and in environments. So, we’ll start with that. How do you even know that you’re missing data?
As an old project manager, I would tell you I knew everything all the time, right? That was my job, except I definitely didn’t. And so, it’s always important to seek other perspectives, have other viewpoints, and have people ask me questions, which can be really, really hard.
I’ll be honest, having people ask you questions about your technical work or about your programmatic cost, or your schedule, or the return on investment of your project or your organization is really difficult, especially when it’s in an open environment.
So, first, be open to that. And when they ask those questions, don’t ask yourself, “Well, how dare this person ask me this question?” Really go, “Why?” Like, what is this question implying? Do I actually have the data to answer that question? Am I aware of it? Or is there something else going on? Ask them a follow-on question to better understand the question. And you might come to find out that there’s data missing, or they’re bringing a different perspective to the table because they’ve seen something happen, an issue on their old program or project, and that issue could be affecting you, and you might not even know…
Host: They’re being proactive, right?
Skow: They’re trying to be proactive and go, “Hey, this thing happened to me. I don’t want it to happen to you. So let me ask you some questions about whether you’ve considered these factors or not.” And then take an honest, hard look at yourself, by the way, and it’s okay to say you don’t know. And then go try to figure out if you can find that information. So first figure out if you’re missing information – which, by the way, we all are most of the time, so no hate there – but simultaneously be open to understanding that you’re missing the information.
Next is like actually saying, “Hey, I am missing information. Now what do I go do?” Sometimes you can do stuff, sometimes you can’t. Sometimes the information is available. Sometimes it costs money to get that information. Sometimes it takes time to get that information.
Sometimes it’s both. Sometimes it costs money, time, and a lot of technical work to do that, right? So, it really depends on the situation, but it should go back to this: how much effort you put into tracking down additional information should be commensurate with the level of risk you are willing to take to meet the objective that information will help you make a decision on.
So, to put it another way, if I have objective ABC and I’m willing to take a really high risk, I’m willing to just wing it, and whatever happens, happens. Well, maybe I don’t actually need that information, because it’s okay if I fail completely. That might work.
In another case, you might have an objective that’s XYZ, and it needs to work like 100 percent, or 90-plus percent, of the time. That thing better turn on, it better take those measurements, it better shut off, and it better be accurate, right? And what you all don’t see is my wonderful hand movements that are going on here.
And so, if that’s going to happen, well, then you might need that actual information to truly understand if that thing is going to turn on or not and be accurate. So, then you’re going to put in that effort to get that information. And of course, there’s a whole bunch of gray zones between those two examples, right? So it really depends on the level of risk you’re willing to take to meet your objective, and what information you need so that you can make risk-informed decisions, and then really figuring that out and applying that knowledge to whether you have to go get that additional information or not, or additional testing or not, and things like that.
And, by the way, that’s called “objectives-driven, risk-informed, and case-assured,” for future reference.
Host: So, usually when people hear risk, the general public, and even within the agency, they think of risk like, “Okay, we’re gonna attempt this feat, this technological feat. How can we make it safe?” But what about other risk factors that are unknown/underappreciated during program development?
Skow: Oh, sure.
Host: Can you talk about those?
Skow: Yeah, definitely. And I’m gonna also link it to, like, organizational development, not just program/project management, because a lot of things cross over, and we don’t always think that they do. So, I’m going to give some fun examples.
So, for example, let’s utilize security. When I say, “Security,” what do you think of immediately?
Host: Oh, I think of IT. I think of harm reduction, right?
Skow: Yeah. So, cyber, maybe physical security, like you badge in, badge out, but then badging has a cybersecurity aspect to it as well, right? How many programs and projects consider how much risk they’re willing to take in terms of security for their program or project?
Host: I mean, I’m gonna say, “Not enough.”
Skow: I’m gonna say, “Not a lot!” Right? Because the program/project management is usually not doing it, because they’re not the ones who own the communications, right? They build a thing that will communicate, but they don’t own the satellites that it communicates on. They don’t own the ground infrastructure it communicates on. They don’t own the physical IT assets that allow the cables to run around the world, right? So, they don’t actually think about all of that immediately. So that’s usually a really underappreciated risk.
They also don’t think about how their system could actually cause a risk to the bigger system. So, they think, “Oh, all I need to do is create this system and let it talk. Yay, go, team! It shouldn’t be that hard, right?” Except for what happens if there’s a hole and someone finds that hole and they try to, you know, take advantage of the unlocked door. Let’s say there’s an unlocked door and someone tries to take advantage of that unlocked door.
Well, that asset that’s flying is now a risk to the larger assets on the ground and to the broader network, not just itself. And a lot of people don’t consider that. So that’s an even larger, underappreciated risk. So that’s just a quick one, but those are kind of the negative ones. There are also reputational risks, right, that we’re willing to take.
So how many of us in organizations, when we’re sitting in our branch meetings or our divisional meetings or directorate meetings, are considering the reputation of NASA, the reputation of your organization, to the next level up of management? If you promise that you’re going to do something for ABC dollars and ABC time with ABC objectives committed, and you fail to do that for whatever reason (technical issues, non-technical issues, things gone wrong, things costing more money, supply chain risks that a lot of people don’t think about either), there’s a reputational hit. So that’s another good one. So how many of us actually truly understand the level of reputational risk we’re willing to take? Have you ever been asked to take a project that you knew was going to fail?
Host: Oh yeah.
Skow: Did you take it?
Host: Of course!
Skow: Right! Why? Because there’s an opportunity, isn’t there? Yeah, see, it’s not just negatives, though, right?
Host: No, I’m a perpetual optimist as well.
Skow: Well, most engineers are; that’s another thing we have a lot of problems with too. But I will say, there’s an opportunity that goes along with that risk, right? So, your reputation was on the line, and you were willing to say, “Yes,” right? And then what happened? Even if you failed, do people still appreciate you because you tried?
Host: Probably.
Skow: Probably, yeah, right, exactly.
Host: Because we are all in it together, right?
Skow: Right! We’re all in it together. We all knew that the thing we set you up for wasn’t going to be great. Everybody knew, so they actually appreciated the fact that you were willing to try.
So, there’s a level of risk associated with both a negative and a positive on each side, right? So how much risk are you willing to take to meet that objective? Going back to objectives-driven – I’m going to keep going back to it, because that’s my thing right now – how much risk are you really willing to take? And when you made that call, when someone came to you and said, “Hey, this project is going to be rough, it might not succeed. Are you willing to do it?” and you said, “Yes,” you did that mental math in your head. You figured out what you were willing to do.
And you made a risk-informed decision, because someone else in your shoes may have said no, because they’re naturally more risk-averse than maybe you are in that instance or in that specific part. Pretty cool, huh?
So, just to clarify, we covered cybersecurity and physical security. We covered reputation. We also mentioned supply chain in there. These are all generally underappreciated, or not necessarily overly considered, risk areas to think about.
Host: That has to be challenging when it comes to international partnerships.
Skow: Oh, it absolutely is.
Host: And collaborations, yeah.
Skow: Yeah. Because we have different, we have different regulations, rules, things like that, that get in the way, and sometimes we talk past each other. So, when I say risk, and you say risk, are we even using the same definition? I can promise you, the answer is probably no.
And in fact, when I give a lot of presentations, and I probably should have done this here, I should have told you what I mean by risk.
Host: Not the board game.
Skow: Yeah, like, not the board game, even though the board game is a lot of fun. But like, what is the definition of risk?
So for me, a risk is the likelihood, the possibility, that something could go wrong with respect to a stated objective; that’s basically the quickest way to put it. I have an objective; I want to meet my objective. Maybe that’s get to work on time. Maybe it’s do my project for ABC dollars, XYZ cost or schedule, and meet whatever technical objectives. Maybe it’s a workforce objective to keep my people at a specific happiness level on the FEVS, right? Whatever my objective is, a risk is the likelihood that something will cause harm to that objective, making it not succeed, at whatever that probability is. So, yeah.
Host: When you mentioned FEVS, that’s the Federal Employee Viewpoint Survey.
Skow: Yeah, so, some of us care a lot about whatever employees think of us, so we want to make sure that we’re keeping them happy.
Host: Reputational!
Skow: Yeah, going back to that, yeah, exactly.
Host: So, this is going to be a basic question for you. How does NASA define risk management, both when it comes to human spaceflight and robotic missions?
Skow: That is not a simple question!
[Laughter]
I don’t know why you thought that was simple. Risk management is… I’m actually going to have to pull this up. You’ll have to give me five seconds. I want to make sure I have the exact way that NASA defines it. So yeah, here we go. Let’s define risk first, shall we? Let’s just define risk.
All right, so the definition of risk, according to my own handbook that I definitely memorized, is the potential for shortfalls with respect to achieving explicitly established and stated objectives. So basically, what’s the likelihood of something going wrong when you try to meet one of your objectives?
Now, the definition of risk management at NASA, risk management is an overarching process that encompasses identification, analysis, mitigation, planning and tracking of root causes and their consequences.
So, basically, it’s the process in which we use to identify, respond to and track risks.
Host: In regards to robotic missions and human spaceflight, that’s the same approach?
Skow: Absolutely. And in fact, we approach risk management for the entire organization of NASA, like your branch management, legal, contracting, the same exact way.
We actually utilize the same framework across the board. It’s outlined in NPR 8000.4, for those who don’t know, so they can definitely Google that, and it’s online and open to the public, which makes life really easy.
Host: Yeah, we’ll link that to the page.
Skow: Yeah, but basically, risk management follows some key principles. We like, I like to use the words objectives driven, risk informed, case assured is, is the overarching Safety and Mission Success Criteria that OSMA [Office of Safety and Mission Assurance] utilizes. But in there is the risk management part of it.
So, the risk management framework, just to really put some more keywords around that, is holistic. We really believe that risks come from everywhere, external and internal. Sometimes at NASA, we’re really good at just looking internally at the risks we control.
So, what I mean by that is, if you’re a program or project manager, you control your cost, your schedule, your technical risks, and a lot more; you don’t control it all perfectly. You control the safety aspects, right? You can control the design. There are a lot of things you can control in there that have risk. There are also a lot of things that happen to you as a project manager that you don’t control, right? So, for example, supply chain: you don’t control whether a manufacturer is just no longer going to build your FPGA, and you definitely don’t control the rate at which they’re going to build it.
Host: And what is FPGA?
Skow: Oh, yeah, sorry. Oh, you’re gonna ask me to define FPGA.
[Laughter]
Oh, gosh, we’re gonna have to look that one up later [field-programmable gate array]. But basically, it’s the big processor, the microcontroller, on your spacecraft or your instrument: the brains of your instrument or system that helps make sure it turns on and does the things it’s supposed to do.
Host: Yeah, the project manager doesn’t have control of that critical component, right?
Skow: Exactly. You don’t have control of the lasers. If you need a laser, you don’t have control over the metal either, right? So, these things are supply chain, but they affect you. So that’s why we talk about holistic risk management. We want to make sure that we’re looking everywhere for the risk.
Now, of course, there’s a point at which looking everywhere is maybe not helpful. At some point, talking about hurricanes hitting Maine is probably not going to be as helpful a conversation as the potential of a hurricane coming and hitting your spacecraft while it’s sitting on a launch pad in the middle of July in Florida, right? So that’s probably a different conversation, right? And there are probabilities associated with that.
We also talk about risk as being probabilistic, so the likelihood of a risk occurring one way or another, right? And this goes back to risk management, because part of risk management is identifying the risk. Great, yes.
So, going back to the actual identification of risk: there’s also the probability of it actually occurring, right? At some point it’s a higher likelihood, and something we really need to pay attention to and care about. And other times it’s maybe really, really low, like a hurricane hitting, I don’t know, Canada. So we have to pay attention to those aspects of it.
And then there’s organization-specific risks. So, I’m going to go back to that objectives-driven part.
My objective as the agency risk management officer is very different than your objective, or the objective of someone sitting in HR, or the objective of someone sitting in legal, or the objective of someone who is a systems engineer on a[n] instrument for SMD [NASA’s Science Mission Directorate], or someone who is doing the life support system for the next, you know, crewed mission.
Host: [It] requires many different conversations.
Skow: Completely different objectives, I may add. So, because of that, it’s very organization-specific and like what you’re doing based on what your objectives are.
So, it’s always really important to go back to those objectives and make sure that your risk management framework is helping you. The framework is there to help you make decisions. It’s there to help your boss to make decisions so that you can communicate in a way that will help them make the best risk informed decision that they possibly can.
So, I really wanted to point that one out, because sometimes we get caught up in the minutiae of the process, of the pretty five-by-five matrices, and sometimes that’s not actually helpful. Sometimes what’s more helpful is actually going, “Hey, I have this risk, I have this likelihood. I think that this thing could happen. It’s going to really affect my objective. And I want to make sure that you are aware that it’s happening, because I could use your help on deciding if I should expend more resources and take the time to do it. Let’s have a conversation.” That conversation is probably the most important part, not the five-by-five.
So I really want to say, when we’re taking a whole hard look at risk management, it’s the whole thing, but it’s also there to help you make good decisions, to help your boss make good decisions, and to understand what’s really going on in terms of the level of risk you’re willing to take and how much effort you’re going to put into mitigating it, or maybe accepting the risk because it’s okay.
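The five-by-five matrix Skow mentions can be made concrete with a small sketch. The snippet below is purely illustrative (the thresholds and labels are hypothetical, not NASA’s actual scoring criteria): it scores likelihood and consequence from 1 to 5 and bins the product into a rough risk level.

```python
# Illustrative sketch of a 5x5 risk matrix: likelihood and consequence
# are each scored 1 (lowest) to 5 (highest). The thresholds below are
# hypothetical, not NASA's actual criteria.

def classify_risk(likelihood: int, consequence: int) -> str:
    """Bin a (likelihood, consequence) pair into a rough risk level."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("scores must be between 1 and 5")
    score = likelihood * consequence
    if score >= 15:
        return "high"      # likely AND severe: elevate for a decision
    if score >= 6:
        return "medium"    # worth tracking and discussing
    return "low"           # may be acceptable as-is

# The classification is only a prompt for the conversation Skow
# describes, not the decision itself.
print(classify_risk(likelihood=4, consequence=4))  # high
```

As the conversation above emphasizes, the matrix output is only the starting point for a risk-informed discussion with the decision maker, not a substitute for it.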
Host: What I pick up on is that it also has to be a people-focused objective when you do risk management. Think of your team.
Skow: Yes.
Host: Think of the people around you and the conversation; they’re real people who are…
Skow: Absolutely. And it’s continuous too, right? It’s not just, “Oh, hey, I’m going to tell you about this one risk one time and never talk to you about it again,” right? Because that’s not helpful, is it? You forget. You have different objectives. You have different people you’re talking to. I have different things that I care about on a daily basis.
So, risk management is risk-informed decision making, but also continuous risk management. If we’re going to identify a risk, we need to keep tracking it. We need to keep making sure that, hey, this risk is worth mitigating, or maybe it’s no longer worth mitigating, maybe it’s okay to accept, or maybe we’re researching it and we’re going to keep researching it until we see something pop, or there’s an indicator that maybe it’s trending worse. Our research is showing that something’s getting worse, and maybe that’s an indicator, right? So, it’s things like that. It’s not just a one-off where I only tell you about something one time, forget about it, and never track it.
Host: No, it’s knowledge sharing.
Skow: You’re gonna make sure we’re constantly talking about it. We’re not just ignoring it.
Host: Absolutely. So, in the NASA Risk Management Handbook, risk leadership is defined as a fundamental pillar of risk management culture. What does risk leadership entail, exactly? How can an organization maintain that risk leadership culture?
Skow: Sure. Great question. So first, let’s define risk leadership. Risk leadership is leading, by an accountable leader (that part is really important: accountable), risk acceptance decision making within the limits of a defined risk posture – and I’m going to talk about risk posture in just a second – including the authority to allocate portions of the risk posture to subordinate or supporting organizations. That’s a lot. So, what does that even mean?
So, first and foremost, I like to use the example of learning how to drive a car, or teaching someone how to drive a car. So, do you have kids? Have you taught your kid how to drive a car?
Host: I don’t, but I’ve taught my nephews.
Skow: Okay? So, I mean, you know, you sat in the seat. So, while you’re sitting in the seat and you’re teaching your nephew how to drive, I’m sure the first thing you’re gonna do is make sure that they know where the pedals are.
Host: Yep, absolutely.
Skow: That they know what a brake is, and that they know how to use the gear shifting, right? Okay, so you’re probably not gonna take them on the highway on the first round, right? You’re probably gonna keep them in the driveway, or even on a straight road with no obstacles in the way.
So, you’re doing that because you understand the level of risk it means to drive a vehicle, and you understand that it’s very risky and there’s a lot of potential for harm to occur, but there’s also a lot of opportunity and good that can come from it. So, you need to do it in a slow, controlled manner.
So, you are actually understanding the level of risk, and you are encompassing the level of risk that you both are willing to take by keeping it to a small area, by keeping the obstacles out of the way. You’re saying, we’re not going to take a lot of risk today; we’re going to keep that risk low. That’s called the risk posture. You’ve defined the risk posture. You are the risk leader in that situation, because you are the accountable leader. You do know better, right? You do know the rules. You have a broader understanding.
Well, the next day, after he passes that and, you know, doesn’t hit anything, you might take him to a parking lot, right? Parking lots are bigger. They have obstacles. So, you’ve now said, “Hey, I’m willing to allow more risk. We’re willing to take more risk in this area, not a lot, but more,” right? So, you’ve opened up that risk posture as the risk leader. You’ve allocated additional risk to that person sitting in the driver’s seat. That right there: risk leadership.
And it can change. I’m pointing out the fact that it changes, too, right, based on the environment and the scenario and the information you have, and they have.
Now, while your nephews were there, did your nephew ever tell you that they didn’t feel safe, or felt uncomfortable and weren’t ready?
Host: They didn’t feel like they were capable just yet, yeah, basically.
Skow: Yeah, yeah! They actually showed up as a risk leader, just so you know. And you did a great job as a risk leader, creating a safe environment for them to tell you no.
Host: Well, the car’s totaled now.
[Laughter]
Skow: Oh no, gosh, don’t tell me that!
[Laughter]
But you created a safe environment for them to say, “Hey, no, I’m not comfortable. I need more time.” And so, you created a safe environment as a risk leader to have a conversation. Good job. You created the level of risk, and you communicated the level of risk that you were willing to take. They had the ability and authority to push back and say, “No, I’m not willing to,” or, “Yes, I am willing to.” Those are both risk leaders. You’re both risk leaders in your own sense, just at different levels, right?
You are much like a higher-up (I’m air-quoting this) “risk leader,” because you have a broader understanding, and they are a lower organization, a sub-organization to you, with less understanding of the full situation. But they knew enough to say, “Hey, I’m not comfortable. Let’s have more conversations about the level of risk you’re actually asking me to take, or whether I’m capable, right?” But sooner or later they were capable, and they wanted to, right? And then you took them on the highway.
And so, yes, it was scary, I’m sure, but that is what a risk leader is, right? There you are, supporting them. You’re clearly defining expectations about the level of risk. You gave them the space to come talk to you and say, “Hey, I’ve met the level of risk, or I’m exceeding it. I feel like I’m exceeding my level of comfort in accepting this risk. I need to talk to you.” That’s called elevation for decision making, by the way.
Or they reported something up to you during the process, saying, “Hey, I need more information. Can you give me more time here?” or something to that effect, where you had to help them additionally. That’s reporting. They’re like, “Hey, I have a risk here. Help me out a little bit here.”
You did a great job. I’m really proud of you. Hopefully your nephews appreciate you for all the work you did there.
Host: They’re doing great!
Skow: Okay, good job.
[Laughter]
But that’s what a risk leader is in the real world, outside of NASA. Inside of NASA, a risk leader is the same concept, just with your team. If you’re a project manager, you had to trust your systems engineers, you had to trust your mechanical design folks. You gave them objectives and areas within which they could work, and you said, “Hey, I’m cool if you make decisions in this area, but if anything pops out of this area, please call me.” Same idea, right?
Or, hey, you were given something, and you’re like, “Hey, I’m cool with taking this level of risk and making these decisions, but I’m pretty sure my boss is going to want to know if I need an extra 50%,” right? You’re going to have to go talk to somebody at that point, right? And they probably don’t want you to just come tell them at the last minute that you need 50% more funding. They probably want some warning. But yeah, that’s the idea, right? That’s the whole concept of risk leadership.
Now, you heard me mention the concept of risk posture a few times, and I kind of put it in the terms of teaching someone how to drive, or maybe you were taught how to drive.
Host: It was very helpful.
Skow: But the actual definition, for those who care, is the limits of acceptable risk to the established or stated objectives whose achievement is of direct concern to the stakeholders.
Host: Acceptable risks.
Skow: Acceptable risks. That’s right, the level of acceptable risk: how much risk are you willing to take to meet an objective, versus how much risk was your nephew willing to take to meet an objective? And those should align, and if they don’t, you’ve created a safe space to have that conversation.
Host: One more question for you, Mary: What do you consider to be your giant leap?
Skow: I’ll go with it from, like, a turning-point scenario. We’ll answer it from that one.
So, some people in the agency already know this, but for those who don’t: So, I used to really do a great job of ignoring teachers.
[Laughter]
I was one of those kids who sat at the back of the class and threw stuff and did not pay attention, and definitely failed, like, every spelling test. By the way, I still can’t spell; don’t ask me to. And because of that, I had a really good habit of not necessarily caring about my grades. Well, in the fifth grade, someone came in a blue suit. Her name was Pam Melroy.
So, Pam Melroy, astronaut Pam Melroy at the time, came to my elementary school when I was in fifth grade to give a talk to us, because she’s actually from my hometown. She came and walked down the aisle. I remember sitting on the gymnasium floor, looking up at this woman in a blue suit, and I was like, “I want to be her. This is awesome.” So basically, that’s what changed my trajectory.
So, starting in the fifth grade, in the middle of science class, I decided I was going to be an astronaut, and if I couldn’t be an astronaut, I was going to run NASA. So, for those who don’t know, I do have a goal to be the NASA AA [Administrator] one day.
So, with that…
Host: Were you able to share that story with Pam Melroy?
Skow: Oh, absolutely. She knew, yeah. When I found out that she was the deputy administrator here, I was able, through a friend, to actually meet her and say, “Hi,” which was hilarious, by the way. But with that in mind, yeah, that was my turning point. I always saw that blue suit, and I was like, “Absolutely.”
Host: Wow.
Skow: I didn’t even know that was an option until she came to my school. Because, for those who don’t know, I’m from Upstate New York. There’s no NASA center hanging out there, but we got to check out a Challenger Center, I think is what they were called back then. And I got to be, like, a mission support person. And from there, I was absolutely in love with science and math again. Don’t ask me to spell anything, but science and math, yes.
Host: Yeah, maybe we won’t add a spelling officer to the agency.
Skow: Yeah, please don’t, please don’t do that. That’s what Word is for.
Host: That’s why we like acronyms here.
Skow: Yeah, acronyms are great. Don’t ask me to spell acronyms, either!
Host: So, later on in your career, you had multiple roles and you got your Ph.D. What do you consider a big moment for you? I’m sure you always feel like you want to aspire for more, like you just mentioned, but do you remember a point where you thought, “This feels great at this moment”?
Skow: Oh yeah, I have a lot of those. Like, I try to take all the wins.
So, you don’t just become the NASA administrator one day, I figured out. And you don’t just become an astronaut one day, either; I also figured that out. Still not an astronaut. It’s very depressing.
When I graduated from my undergrad, I was excited, because I worked a full-time job and went to undergrad full time. When I got into grad school, that was an achievement. When I graduated from grad school, that was a huge achievement. I’m the first Ph.D. in my family, or really anyone with a graduate degree. Great, awesome. Super excited about that.
Got into NASA as a co-op before I graduated with my Ph.D. Huge excitement. I was bouncing off the walls. We won’t go into the story, but I will tell you that my professor did not appreciate the fact that I got a co-op at NASA, so I actually had to get a new professor and restart all of my research.
Host: Wow!
Skow: Yeah, just so I could come to NASA. That’s how much I wanted to be at NASA and serve, you know, the American public, and be an astronaut one day.
So, I had to restart my research. So, when I graduated, that was a huge achievement. I got to work full time at NASA: a huge achievement. In fact, at some point I realized I had already achieved all of my life goals but, like, two, which were to become an astronaut or run NASA, and those are going to take a long time. What am I supposed to do between now and then? So, I had to come up with new goals, because there’s a big gap, apparently.
So, I have slowly been knocking those goals down. I wanted to be a project manager; I became a project manager. I wanted to learn how the agency does the budget; I worked for the Office of the Chief Financial Officer. I wanted to make sure that I was helping people do project management; I started helping with the chief program management officer. I wanted to really improve how risk management is done across the agency, because good project management is good risk management, by the way.
So, when this opportunity became available, I said, “Yes,” because I love a challenge. So, when I asked you if you ever said, “Yes,” to something that you knew was gonna fail, I have definitely said, “Yes,” to a few things I knew were gonna fail. This was not one of them.
Host: Clearly!
Skow: But there were other projects along the way. And so yes, absolutely. So, like every time I get a new job, it’s like a new excitement.
Host: Do you have any closing words for people who aspire to work at NASA or follow in your footsteps?
Skow: Yeah. Know your values and stick to ’em. That’s all I’m gonna say.
My number one value is have fun. So, if you ever see me wandering around the halls laughing and giggling, it’s because I believe in having fun every day. Otherwise, it just gets really boring.
Host: Well, thank you, Mary. Thanks for your time.
Skow: No, thank you!
Host: That’s it for this episode of Small Steps, Giant Leaps. For more on Mary and the topics we discussed today, visit appel.nasa.gov. That’s A-P-P-E-L dot NASA dot gov. And while you’re there, check out our other podcasts like Houston, We Have a Podcast, Curious Universe, and Universo curioso de la NASA. Thanks for listening.