April 26, 2010 — Vol. 3, Issue 4
Learning from Failure
OCO-2 Gets Underway
The Orbiting Carbon Observatory team is applying lessons learned in a unique way after getting a rare second chance to fly.
The early morning of February 24, 2009, was cold, wet, and beautiful. Patrick Guske, Mission Operations System Engineer for the Orbiting Carbon Observatory (OCO), sat in the Orbital Sciences Mission Operations Center in Dulles, Virginia. On a big screen, he saw the Taurus XL rocket rumble away from the ground at Vandenberg Air Force Base in California. The rocket carried OCO successfully into the air with a bright blue streak trailing behind it—but not for long.
OCO came down much sooner than anyone expected. "I [saw] people starting to get a little nervous," Guske recalled. "Then they got very nervous. Then they got very quiet." OCO had missed its injection orbit and plunged into Antarctic waters.
The OCO team later learned that during ascent, the payload fairing (the nose-cone covering that protects the satellite as it goes through the atmosphere) failed to separate from the launch vehicle. The additional weight prevented the final stage from boosting OCO into the injection orbit.
Guske had planned to stay at the Dulles site for two weeks. He boarded a plane to California in a matter of hours.
OCO was an Earth System Science Pathfinder project run by the Jet Propulsion Laboratory (JPL). Its mission was to make precise, time-dependent global measurements of atmospheric carbon dioxide (CO2) that would help scientists better understand the processes that regulate atmospheric CO2 and its role in the carbon cycle. The observatory had three high-resolution spectrometers dedicated to measuring Earth’s carbon dioxide levels.
Scientists know that carbon dioxide from human activity and natural processes is absorbed into "sinks," like the ocean and growing plants. "But we know that we have put more carbon dioxide into the atmosphere than we see," said Guske, "and we're not sure where all of this carbon dioxide is going. How is it being absorbed and where? Are there seasonal variations?" While OCO didn't have the opportunity to answer these questions, OCO-2 can.
OCO-2 will follow OCO's original plan. It will join the Afternoon Constellation (A-Train), a formation of six satellites flying along the same orbital track and studying various aspects of Earth's natural systems. OCO-2 will compare its data with measurements from other instruments and observe daily and seasonal variations of atmospheric carbon dioxide.
"We have met the customer and he is us."
Within 24 hours after the launch failure, project closeout for OCO began. This included capturing lessons learned, a process that is often treated as a pro forma activity resulting in "lessons listed." Though no one knew it at the time, this had a different significance for OCO, because unlike most missions, it would ultimately get a second chance to fly.
Guske led the OCO lessons learned effort. He thought it was important to consider the people who would be reading the document his team was charged with creating. With cartoonist Walt Kelly in mind, Guske said, “We have met the customer and he is us.”
“We wrote these lessons learned to ourselves because we’re going to use these lessons learned,” said Guske. The lessons had to be written so the team could understand them. “There is a difference between how we dealt with lessons learned on this project, OCO, and how other missions deal with their lessons learned,” he added. For OCO, the lessons learned would be active, not passive.
From Listed to Learned
The process began with Guske sending out an email to everyone on the team: engineers, scientists, contractors, librarians, and secretaries. He asked for feedback regarding what worked and what didn’t. When the responses came back, he sorted through all of them to generate a streamlined list.
In total, Guske collected 78 lessons learned. They ranged from secretaries asking that team lists be kept up to date to larger programmatic issues such as sorting out lines of authority and clearly defining deliverables. At its simplest, each lesson met three criteria: it was positive, didn’t point fingers, and offered a solution to a problem. Guske welcomed all of the feedback, good and bad, and evaluated each lesson against these criteria and against how it would affect the team the next time around.
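The three-part screen described above is essentially a filter over the submissions. As a minimal sketch, assuming a hypothetical `Lesson` record (the field names here are illustrative, not anything from the OCO project), the triage might look like this:

```python
from dataclasses import dataclass

@dataclass
class Lesson:
    summary: str         # what worked or didn't
    positive: bool       # phrased constructively
    no_blame: bool       # doesn't point fingers
    solution: str        # proposed fix ("" if none offered)

def triage(submissions):
    """Keep only lessons that meet all three criteria:
    positive, no finger-pointing, and a solution offered."""
    return [l for l in submissions if l.positive and l.no_blame and l.solution]

lessons = triage([
    Lesson("Team contact lists went stale", True, True,
           "Assign an owner to refresh lists monthly"),
    Lesson("Deliverables were ambiguous", True, True,
           "Define deliverables in the project plan up front"),
    Lesson("Vendor ruined everything", False, False, ""),  # fails all three
])
print(len(lessons))  # → 2
```

The same filter applies whether a lesson is a secretary's request or a programmatic issue, which is what lets a single list hold all 78.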
During this process, Guske emphasized the dangers of “better is the enemy of good enough.” The team wanted to avoid any attempt to make the spacecraft “better”—they wanted OCO-2 to be as close to the original as possible. Changes were considered only if improvements would reduce risk, or if components didn’t have spares or had become obsolete. For the most part, OCO-2 is a near-clone of OCO.
Guske assigned each member of the OCO team specific lessons to implement when rebuilding the observatory. He also began documenting the implementation of the lessons learned effort, with the intention of conducting a post-launch evaluation of the effectiveness of the process.
One of the lessons the OCO team learned had to do with testing. Given the mission’s low cost and compressed schedule, the team decided not to test the instrument detectors in flight-like conditions, instead accepting the detector screening done by the vendors. However, the screening processes did not mimic the operational use of the detectors.
After integrating the instrument and putting it into the thermal vacuum chamber, the team discovered a problem: the instrument had a residual image. The effect is similar to the bright spot you see after someone takes your picture with a flash, explained Guske. Faced with two choices—replace the detector or correct for the anomaly—the team decided to develop an algorithm that would correct for the residual image.
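The article doesn't describe the team's actual algorithm. One common first-order model of detector persistence, offered here purely as an illustrative sketch, treats the ghost as a small fixed fraction of the previous exposure and subtracts that estimate from the current frame (the `ALPHA` value below is an assumption; a real value would come from detector characterization tests):

```python
import numpy as np

ALPHA = 0.02  # assumed persistence fraction, not an OCO value

def correct_residual(frame, previous_frame, alpha=ALPHA):
    """Subtract an estimated residual image left by the prior exposure."""
    corrected = frame - alpha * previous_frame
    return np.clip(corrected, 0.0, None)  # photon counts can't go negative

prev = np.array([[1000.0, 0.0], [0.0, 0.0]])      # bright spot in last frame
curr = np.array([[120.0, 100.0], [100.0, 100.0]])  # ghost contaminates one pixel
print(correct_residual(curr, prev))  # the 120 becomes 100 after correction
```

The point of the sketch is the trade the team faced: a correction like this costs analysis effort and adds calibration risk, whereas flight-like detector screening would have surfaced the persistence before integration.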
This time around, the OCO-2 instrument manager had the time and money to test the detectors in flight-like conditions. By screening the detectors ahead of time, the team will know about any problems before integration.
Another lesson learned by the OCO team related to data transfer. While testing the observatory in the thermal vacuum chamber, the mission operations team in Dulles, Virginia, downloaded raw data from the instrument in three files of roughly a gigabyte each (one for each spectrometer). It then had to send the data to JPL, which was responsible for analyzing it, but the JPL team couldn't receive the files because of security firewalls at each location.
Since this problem cropped up late in the schedule, the solution the OCO team developed involved shuttling the data back and forth on portable hard drives carried on commercial flights. Although slow and inefficient, this fixed the problem for the time being.
At one point, when the OCO team was asked to remove the observatory from the thermal vacuum chamber, it was hesitant to do so because it had not received and analyzed all of its instrument data (which was on a plane somewhere over the United States). There was the possibility that the team would not have all of the measurements needed for fully assessing the instrument and its operation.
The team went ahead and removed OCO from the chamber without the data. When the data did arrive, it was incomplete. Fortunately, the OCO team was able to reconstruct the necessary dataset using an ambient temperature chamber. Despite this successful mitigation, however, the OCO team added this experience to its lessons learned. For now, the team has discarded air travel as a method of data transfer and is exploring more efficient options.
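The article doesn't say which transfer options the team is weighing, but any method that replaces the hard drives has to confirm that gigabyte-scale files arrive intact and complete. As a hypothetical illustration, hashing a file in chunks keeps memory use constant no matter how large the file is:

```python
import hashlib

def file_digest(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read a megabyte at a time,
    so even multi-gigabyte spectrometer files never load fully into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# The sender computes the digest before transmission and the receiver
# recomputes it on arrival; a mismatch means the file must be resent.
```

An integrity check like this would also have flagged the incomplete dataset before the observatory left the chamber, rather than after the drives landed.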
Getting to Fly...Again
OCO made it to launch. Its design was mature and approved for flight. Since OCO-2 is nearly identical, the team has been granted what Guske called a “free pass” on reviews before its Critical Design Review in August. The project is conducting a "tailored formulation phase" to ensure the updated OCO-2 is developed correctly and completely.
The team is still holding peer reviews for a few interface changes that resulted from a lack of spare parts, but on the whole it is "just making sure things fit together and flow together," according to Guske.
The OCO-2 team will track the status of each of the 78 lessons learned. Guske said he believes the process is going well, and he looks forward to evaluating the process in hindsight after the launch in February 2013. "We're doing it," he said. "People have the battle scars to show the lessons they have learned, and they're getting to implement those changes now."