
Safety Lessons From Challenger

This Jan. 28 marks the 35th anniversary of the Challenger accident. The loss of the crew was a
tragedy felt by their families, friends and coworkers at the agency, as well as people
throughout the world. The Challenger accident taught us tough lessons and brought forward
what have become recognizable phrases: normalization of deviance, organizational silence and
silent safety program. Sadly, we learned these lessons again in 2003 with the loss of Columbia
and her crew. This shows how vital it is that we pause to revisit these lessons and never let
them be forgotten. We cannot become complacent. In this month's Safety Message, Harmony
Myers, director of the NASA Safety Center, discusses the Challenger accident and the lessons
it continues to teach us today.
10 Key Lessons Learned From Challenger
Lesson #1: Risks Hiding in Plain Sight. The flight history of O-ring performance would have demonstrated
the correlation between O-ring damage and low temperature. Yet project leadership didn’t review that history, and
thus were unprepared to properly evaluate the launch risks. For many new organizational initiatives, there
are often indications “in the files” that offer warning signs of possible risks. Project leaders should commit
the time to look for them.

Lesson #2: When the Signs Are Right In Front of You. While the manufacturer didn’t tell NASA not to
launch, it warned that the impact of launchpad ice on the shuttle was an unknown condition, and that
risk of ice striking the shuttle was a potential flight safety hazard. Yet NASA proceeded. Effective risk
management depends on the ability to identify, process and give adequate consideration to signs and
indicators that can be critical to project success.

Lesson #3: Those Right Hand, Left Hand Problems. Senior launch officials were unaware of key
warnings expressed by others: the most recent problem with the O-Rings; a contractor’s recommendation
not to launch below 53 degrees Fahrenheit; the similar warnings of project engineers and the manufacturer’s
concerns with launchpad ice. Project information flow must provide decision-makers with access to the
perspectives of all meaningful project participants.
Lesson #4: Protecting Checks and Balances. At some point in the launch process, NASA management made an
independent decision to waive previously established launch constraints designed to assure flight safety. Basic
principles of risk oversight make it imperative that both the establishment of project compliance checks and
balances, and decisions to override them, are subject to review by higher levels of management.

Lesson #5: Increasing Levels of Acceptable Risks. The response of NASA and a key shuttle contractor to early
indications of a design flaw was to increase the amount of damage considered to be an acceptable risk. This was
justified, according to the commission, “because we got away with it the last time.” Effective project leaders
recognize that the ability to “get away with” something is never an acceptable basis for assuming material risk.

Lesson #6: Yielding to Pressure Rarely Works Out. A primary contractor reversed its opinion and recommended the
shuttle launch, contrary to the strenuous safety concerns of its engineers. This was done to accommodate a major
customer of the contractor. Effective risk evaluation processes typically involve “give and take” exchanges with various
interested parties, but they must provide protections against the excessive influence of purely financial pressures.

Lesson #7: When the Fox Guards the Chicken Coop. Organizational structures at key shuttle project levels placed safety,
reliability and quality assurance offices under the supervision of the very organizations and activities whose efforts they
were responsible for supervising. Effective risk management and compliance structures pay close attention to
organizational hierarchies and administrative structures in order to assure appropriate checks and balances.
Lesson #8: Insidious Risk Equations. The decision to launch in the presence of so many obvious “yellow
flags” of safety created the impression that NASA was actually requiring a contractor to prove that it was not
safe to launch, as opposed to proving that it was safe to launch. The effectiveness of risk management and
compliance protocols can be severely compromised by review standards that are biased in favor of project
design feasibility.

Lesson #9: Beware the Counterproductive Culture. The management culture at a primary NASA
facility was criticized for a propensity to contain potentially serious problems and seek to resolve them
internally instead of reporting them upstream. Such silos are antithetical to effective risk
management. Leaders should promote knowledge and information sharing among management and
board committees with risk, compliance and legal responsibilities.

Lesson #10: “Just Say No”. Five booster rocket engineers employed by the prime NASA contractor made a
last-ditch (but ultimately unsuccessful) effort to stop the launch because of fears that the O-Rings would fail in
the cold. The rightness of their efforts underscores the importance of strong whistleblowing and futility
bypass mechanisms in project processes. Project participants need the freedom and access to say,
“Stop — we’re not ready!”
Made by: Sameer (108), Arun (072), CSE-2
