What the Mars Rover Teaches Us about Risk Management
I recently read an LA Times article about potential contamination of the Mars rover with bacteria from Earth. The story suggested that a breach in protocol had inadvertently contaminated the drill of the Mars rover “Curiosity.” As a consequence, it would be a problem if the drill were to come in contact with water on Mars.
It seems that the box containing the drill bits had been sterilized and sealed several months before the launch, consistent with protocol, to ensure that no Earth bacteria were on the bits. Sometime after it was sealed, someone recognized that if the drill bit loading mechanism were damaged during landing, the drill would be unusable, curtailing mission-critical activities.
To mitigate this risk, someone decided to open the sealed and sterilized box of drill bits and preload one into the drill. This had a positive consequence: at least one drill bit would be available even if the loading mechanism failed.
The risk management geek in me loves this example of risk identification and mitigation, and I would be proud to be the person who came up with this option. It is also a great example of how mitigating one risk can create another.
Apparently the folks implementing the risk mitigation violated protocol and did not involve the NASA officials responsible for keeping Earth bacteria off of Mars. Because the NASA Office of Planetary Protection wasn’t involved in planning and implementing the decision to preload the drill bit, they had no input into the process, and the bits may have been contaminated as a result: the work was done in a clean room, but not a sterile one.
The project management geek in me has several questions: Was there a conscious decision to skip the protocol, and if so, why? Was it a hassle? Was there a lot of paperwork? Would it have caused a mission delay? Did the people skipping this step understand the potential implications for the mission? Was this a conscious, informed decision made at the appropriate level of the organization?
The answers to these questions aren’t clear from the LA Times article or a subsequent NPR interview with the NASA Planetary Protection Officer about the incident.
This story brings into focus the trade-offs made all the time on complex projects. In this case, we don’t have enough information to know whether what occurred was a mistake or an informed decision made at an appropriate level of the organization. The carefully ambiguous language in both the article and the interview offers few clues.
I’m most curious about the decision-making process. Given the highly visible history of risk management at NASA (see Challenger and Columbia), I would be encouraged to learn that this decision had been made with full knowledge of the consequences at the highest levels of the organization.
For context, you might find it interesting to read this analysis of the Space Shuttle Columbia accident and the cultural and risk management issues surrounding it. I read the whole thing, but the chapter starting on page 185 is particularly noteworthy.