
How to make better decisions in a crisis
WHITEPAPER
Contents

INTRODUCTION
‘GOOD-ENOUGH’ DECISIONS
THE IMPORTANCE OF PLANNING
AVOIDING CRISIS MANAGEMENT ERRORS
GROUPTHINK
HOW DO YOU OVERCOME THESE ERRORS?
THE AFTERMATH
FURTHER READING
INTRODUCTION
In a business setting, the term crisis refers to a period of intense difficulty or danger.

Crisis management describes how challenging and important decisions are made, and how
actions are taken to reduce danger. The danger might be to an organisation’s reputation, to its
finances, or to life. Decision makers are under the most pressure when the wrong
decision can lead to deaths and injuries.

Often one type of crisis follows another. A production problem is a business risk, but can lead
to an explosion and fire, with threats to life. This leads to an environmental problem as toxic
smoke fills the air, contaminated runoff water seeps into the earth or oil gushes into the ocean.
Residents and businesses are affected, client needs are unfulfilled. The damage to reputation
and future prosperity needs to be managed.

For some crises, there are warnings. A pandemic like COVID-19 had been predicted for
decades, although few organisations had detailed plans in place. When the pandemic started in
2019 the crisis developed slowly – timescales for considering evidence and making decisions
could be measured in days and weeks.

For other crises, the decision-making timescales are much shorter. The Buncefield
explosion occurred at 06.01 on a Sunday morning in 2005. The fire engulfed twenty storage
tanks within minutes. The Grenfell Tower fire was reported in a fourth-floor flat just before
01.00 on 14 June 2017. Within 30 minutes, the fire had spread all the way to roof level, above
the 24th floor.

Even when the initiating event takes only a few minutes, the need for crisis management might
continue for days or weeks. It took a full 24 hours to put the fire out at Grenfell, and several
days at Buncefield. The clear-up post-Buncefield took months, while cleaning up after
Deepwater Horizon took years. The crisis of building safety and regulation raised by Grenfell
seems likely to continue for decades.

What each crisis has in common is that decisions must be made under time pressure if
damage – whether to life, property or reputation – is to be limited. This whitepaper looks at
how improving our understanding of the way we think can lead to better decision-making
during a crisis.



‘GOOD-ENOUGH’ DECISIONS
How do you make decisions?

If you were buying a new home, you might make lists of all the desirable criteria and all the
homes available, and then score each choice against the criteria. You might do the same for a
car.

You probably wouldn’t go to this effort for every item in your weekly shopping – it would take
too long. We take a shortcut, selecting one or two criteria as the most important – the price of
a bar of chocolate, the familiarity of a breakfast cereal, or the ethics of a cup of coffee might
trump other considerations.
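The two styles can be made concrete in a few lines. The sketch below (in Python, with invented homes, scores and weights) contrasts scoring every option against every criterion with accepting the first option that is ‘good enough’ on a single prioritised criterion.

```python
# A minimal sketch of the two decision modes described above. The homes,
# criteria, scores and weights are hypothetical illustrations.

homes = {
    "flat on the high street": {"price": 7, "space": 4, "garden": 2},
    "terrace near the park":   {"price": 5, "space": 6, "garden": 8},
    "cottage out of town":     {"price": 3, "space": 9, "garden": 9},
}

def optimise(weights: dict[str, float]) -> str:
    """Slow, careful thinking: score every option against every weighted criterion."""
    return max(homes, key=lambda name: sum(
        weights[criterion] * score for criterion, score in homes[name].items()))

def satisfice(priority: str, good_enough: float) -> str | None:
    """Fast thinking: accept the first option that is 'good enough' on the
    single criterion that matters most right now."""
    for name, scores in homes.items():
        if scores[priority] >= good_enough:
            return name
    return None

print(optimise({"price": 1.0, "space": 1.0, "garden": 0.5}))  # weighs everything
print(satisfice("price", good_enough=5))  # first acceptable option on one criterion
```

Run the sketch and the two modes disagree: the full scoring picks the cottage, while satisficing on price alone stops at the flat. The satisficed choice is quick, but not necessarily the best available.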

This ability to consider options carefully and slowly, and to switch to faster, more
instinctive thinking, normally serves us well. It developed for good evolutionary purposes.
In our distant past, our ancestors could plan slowly and methodically where to pick berries
or look for prey. However, if under threat while hunting – for example from an animal that
saw them as prey – our ancestors didn’t have time to work out the options. They had to
decide quickly whether to run, attack or hide. The shortcuts they learned became part of our
evolutionary inheritance.

We take similar shortcuts during a crisis when there isn’t the luxury of time to consider all
the options and all the criteria. We consider options one at a time, against a limited number
of prioritised criteria. When one seems ‘good enough’ it is selected. It might not be the best
option – if you had the time to consider all the options, against all the criteria, a much better
one might have been available.

Sometimes these shortcuts work out – see Box 1. But other times they make a crisis into a
disaster.

BOX 1: US Airways Flight 1549

When both engines failed on US Airways Flight 1549 on 15 January 2009, Captain Chesley
B. Sullenberger (‘Sully’) and his crew had less than four minutes to make a decision and land
the plane somewhere.

The three-page checklist normally completed before a landing would take 30 minutes.
‘We forced calm on ourselves’5 Sully explained. He prioritised the criteria that seemed the
most important – to save every life on the plane and of the people below. He knew which
technical steps in the checklist would have the best impact on that outcome. He landed on
the Hudson River with no loss of life, but the plane was damaged.

At a hearing after the event, Sully was told that calculations had shown he might have made
it to a runway, and might have saved the plane. But demonstrating that had taken hours of
simulations of the optimal approach – time that the crew of Flight 1549 didn’t have.



“In the moment, it can be hard to think. Having a plan is vital, and having the flexibility to
deliver it even more so.”6

Dr Alistair Teager, consultant clinical neuropsychologist at Salford Royal Hospital

THE IMPORTANCE OF PLANNING
When we make decisions under pressure, we don’t have time to consider all the options.
Planning takes advantage of our ability to think carefully about multiple priorities.

If we could predict every crisis, we could plan our response and weigh up the alternatives,
applying our methodical optimising slow-brain thinking to the problem, rather than our ‘good-
enough’ fast-brain. While we can’t plan for everything that will ever happen, we can plan for a
broad range of scenarios. When a crisis happens, we have some tools ready to use.

But which types of incidents should we plan for? Most organisations have plans in place for fire
evacuation and put their staff through a fire drill once or twice a year. For many organisations,
this is where emergency planning stops.

To understand what to plan for, we need to manage the information available – and to be aware
of the shortcuts we take when assessing the information.

INFORMATION MANAGEMENT BEFORE THE CRISIS

In relation to the Grenfell fire, the London Fire Brigade (LFB) Commissioner, Dany Cotton,
justified the lack of training that firefighters had for a cladding fire with a startling
comparison:

“In the same manner that I wouldn’t develop a training package for a space shuttle to land
on the Shard … I wouldn’t expect us to be developing training or response to something that
simply shouldn’t happen.”7



It shouldn’t have happened, but it did. The Grenfell Inquiry found evidence that if the
information available across the organisation had been shared appropriately, a Grenfell-like
fire could have been predicted and planned for. The existence of ‘a silo’ had prevented
information getting to the right people.

This emphasises the importance of sharing information across an organisation. You can’t
assess the likelihood of a crisis if you don’t have the relevant information. Sharing everything
that everyone knows would be impossible, and overwhelming. The right IT system can help
you organise what is known so that it is easier to identify which threats to plan for.

You need to find a balance between spending a lot of effort on something that will never
happen, and ignoring things that haven’t happened so far but could occur in the future. Take a
list of possible events, and brainstorm with the management team what processes you have
in place to deal with them. Some might be well covered already. Others might be manageable
by tailoring existing plans.

Others – like a space shuttle landing on your worksite – might be regarded as so unlikely as
to be beyond consideration. But considering the impact of unlikely – but catastrophic –
incidents is a useful way to test your plans.



SHORTCUTS IN PLANNING

While better information management will make us more aware of what we need to plan for, we
need to be aware of the shortcuts that can make us ignore the information we have.

Normalcy and familiarity

Even in the face of evidence, we don’t like to think about bad things happening. Too many of
us were slow to respond to COVID-19 because shaking hands, travelling and seeing people
were normal things to do.

Climate change is already claiming lives across the world, and yet too many people still deny
its existence, or assume that science will find a solution in time to protect our lifestyles.

We plan based on what we already know. When things stay the same, this works. When things
change, it makes it difficult to see the need for change. Pilots land planes on runways, not in
rivers. In high-rise buildings, firefighters put out fires in one self-contained flat; they don’t
evacuate people from 120 separate flats.

Near misses feed optimism

In safety management, we are supposed to learn from near misses – someone nearly trips
on an uneven floor, so we repair the floor to prevent someone falling. At the scale of a crisis,
near misses make us think we are already resilient.

Europe had seen the rest of the world deal with SARS (severe acute respiratory syndrome),
swine flu, Ebola and avian flu. That these illnesses didn’t impact most Europeans perhaps led
to false optimism – we didn’t need to plan for such events.

When Hurricane Ivan turned aside from New Orleans in 2004, the authorities appeared to
scale down their preparations for a hurricane, rather than escalate to deal with the impact of
Hurricane Katrina a year later.

While it’s good to have a generally positive outlook on life, when planning for emergencies
you need to be a pessimist. What if the fire spreads from one location into another? How
would we respond if there is a bomb attack in two locations at the same time? How will we
cope if these key people are not available?

Hyperbolic discounting

An aversion to a loss now for something better in the future also prevents good planning. It
is why it can be difficult to get young people to save for a pension, or a dieter to refuse a
sugary treat for the long-term benefit of being healthier.

It might be the reason we don’t keep a stockpile of personal protective equipment (PPE) that
might never get used.

It has been used to explain why, despite warnings in the 1990s, there was no tsunami
warning system in place throughout the Indian Ocean. Such a system might have reduced the
230,000 deaths in 2004.

Ask yourself: what could cause my plans to fail, and if they fail, what will I do next?



AVOIDING CRISIS MANAGEMENT ERRORS
INFORMATION DURING THE CRISIS

As with planning, information is a key tool for managing a crisis. People need the right
information at the right time to make the right decisions. Behaviours regarded after an event
as ‘panic’ can often be explained by the lack of information at the time a decision was made
(see Box 2). This is particularly frustrating where information is available in the organisation,
but it hasn’t been passed to the right people.

However, too much information can be as much of a problem as not having enough. Keeley
Foster had to manage efforts to control large grass fires in outer London when she was
Deputy Assistant Commissioner at London Fire Brigade in 2018:

“I got there at 4 pm and was bombarded with information. If you take too much on board
you’re going to be overloaded and your stress levels will go up. So it’s about processing that
information, prioritising it, and allocating it to the right people at the right time.”5

The right IT tools can make a significant difference to information management and
communication. Tools to specify who gets a message by role or department or location make
it more likely people have the right information on which to base their choices.
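As an illustration of this kind of targeting, the sketch below filters a recipient directory by attributes such as role, department and site. It is a minimal sketch only: the directory, attribute names and function are hypothetical, not any particular product’s API.

```python
# A minimal sketch of routing a crisis message by role, department or
# location. The directory and attribute names are hypothetical - a real
# tool would draw them from an HR system or on-call roster.

recipients = [
    {"name": "A. Khan",  "role": "incident lead", "department": "operations", "site": "north"},
    {"name": "B. Osei",  "role": "responder",     "department": "operations", "site": "north"},
    {"name": "C. Lopez", "role": "responder",     "department": "finance",    "site": "south"},
]

def route(message: str, **criteria: str) -> dict[str, str]:
    """Map each recipient matching every supplied attribute to the message."""
    return {person["name"]: message for person in recipients
            if all(person.get(key) == value for key, value in criteria.items())}

# Only operations staff at the north site receive this update.
print(route("Gas main isolated; hold the outer cordon",
            department="operations", site="north"))
```

The design point is that the sender names attributes, not individuals, so the right people still receive the message when the roster changes mid-incident.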

BOX 2
The psychology of ‘panic’

Too often, when looking back at a crisis, people are accused of panicking – making decisions
that were clearly irrational.

In Box 1 we saw that with plenty of time, the inquiry following the crash of US Airways
Flight 1549 was able to devise a solution that might have been better, and they challenged
Captain Sullenberger on why he hadn’t taken that option. Sully was able to explain the
rationale behind his decisions, and the board of inquiry had to admit that his choice was
reasonable, given the information and time available.

What might look irrational to us with the benefit of hindsight might have been perfectly
reasonable to the people making that decision at the time. If you think you might have
difficulty getting petrol or toilet roll or pasta next week, it is reasonable to top up your
supplies this week, even if normally you’d have waited another few days. If, when running
away from a fire, your preferred route is full of hot, acrid smoke, it is understandable that
you might jump out of a window.

It is important therefore that people making decisions have the best information available to
make the decisions.



SHORTCUTS DURING CRISIS MANAGEMENT

Under the pressure of a crisis, we are even more likely to resort to shortcuts, and less likely to spot
the errors these shortcuts introduce.

Exponential myopia

Our evolution taught us that people, food crops and animals tend to grow steadily, in a linear way.

Pandemics, fires and other crises often develop exponentially. One person infects two others, who
infect four others. On 23 February 2020 there were ten reported cases of COVID-19 in the UK.
Politicians imagining linear growth believed the National Health Service would be able to cope.
As cases doubled every three or four days, by the time of the first UK lockdown a month later there
had been over 12,000 cases in the UK.
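To see how far apart the two mental models end up, here is a minimal worked sketch using the figures quoted above; the steady ‘linear’ rate is an invented assumption for illustration.

```python
# A worked sketch of exponential myopia, using the figures quoted above:
# 10 reported cases, with numbers doubling roughly every three days.
# The linear "steady increase" rate is an assumption for illustration.

initial_cases = 10
days = 30
doubling_period_days = 3       # "doubled every three or four days"
linear_daily_increase = 10     # assumed steady-growth mental model

linear_estimate = initial_cases + linear_daily_increase * days
exponential_estimate = initial_cases * 2 ** (days / doubling_period_days)

print(f"Linear mental model after {days} days: {linear_estimate:,} cases")
print(f"Doubling every {doubling_period_days} days: {exponential_estimate:,.0f} cases")
# Linear: 310 cases. Exponential: 10,240 cases - the same order of
# magnitude as the 12,000+ actually recorded a month later.
```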

If some bad publicity about your company is shared in one social media post and seen by 10 people
who repost, by the end of a single day the bad news might have been seen by thousands, or even
millions.

If you suffer from exponential myopia and assume that a crisis will grow linearly, you are likely to
miss important cues that will tell you to change course, to try a different approach, or to prioritise one
aspect of a crisis over another.

Outcome focussed

It is painful to remember when we got something wrong. It is much nicer to dwell on success. Hence,
during a crisis the options we think of first are more likely to relate to past successes than to past
failures. This is one reason we so often fail to learn from an event.

In fighting a fire, a commander might consider all the many fires where more water eventually put the
fire out without loss of life; they would need help to recall the fires where this strategy wasn’t
successful. If sandbags held back the flood last time, it might be assumed they will do the same this
time.

This focus on outcomes can escalate – each time we get away with a little more, we become bolder
next time. The driver who gets home safely after one drink might try two drinks next week. The
homeowner who didn’t evacuate during the last weather warning decides not to put up their storm
shutters this time. Before the Columbia space shuttle disaster in 2003, NASA ‘got away’ with at least
14 previous incidents of damage to space shuttles.

The damage threshold creeps up with each successful flight, until the flight where the shuttle
disintegrates.



Confirmation bias and sunk costs

Think back to a big decision you made in the past. Perhaps it was deciding which home to buy, which
person to ask out on a date, or which job offer to take. If you are still in that home, still dating or
married to that person, or still enjoying the career path that job led to, you probably feel you made
the right decision. We reduce any unease by looking for confirmation that the decision we have made
is correct – and place less weight on any evidence that we have made the wrong decision.

We do this because of the way our brains are wired. Neuroscientists have detected parts of the brain
shutting down when presented with information that contradicts current beliefs, while ‘happy’
circuits light up when given information that confirms existing understanding.

Confirmation bias can be psychologically healthy. It prevents us wasting our lives worrying about
whether we should have chosen a better partner, or home, or career. However, while managing a
crisis we need to be more versatile, and more critical of our own decisions, or those made by other
decision makers.

Imagine you have invested much time and effort in crafting a response to a situation. You have
mobilised resources to follow this plan. To admit that there is evidence your plan isn’t working –
to abandon the plan and ask for more help – might feel like failure.

Professor Andrew Hopkins of the Australian National University in Canberra, an author and
consultant on industrial safety and accident analysis, summarises the problem with decision
making in the lead-up to the explosion at Deepwater Horizon:

“The group was subject to powerful confirmation bias because the cement job had already been declared
a success... they were not testing whether the well was safe, they were confirming that it was… they
repeated the test in various ways until finally they got a result that could be interpreted as a success,
at which point, it was ‘mission accomplished’.”4

A plan assumes a theory or model of how we think the world works. We assume blocks of flats will be
built and maintained such that a fire on one floor will not spread to another floor. Concrete in a deep-
sea well will hold back hydrocarbons. When evidence is missing to support our theory, our brains will
fill in the gaps.

Look at Figure 1. What shapes do you see? Are these the same as the shapes that have been drawn,
or is your brain filling the gaps to see a triangle and a square? Just as we do this with shapes, our
brains do the same with facts. We connect the pieces of information most readily available to us to
reinforce the theories we already have. Sometimes, we need to step back to see what’s really
happening.

Figure 1: What shapes do you see? Are they there, or are you filling the gaps?



GROUPTHINK
We recognise that intelligent, hardworking Conformance to the norm
individuals will take shortcuts which can
lead to serious errors in planning and in We have evolved from people who had to be
managing a crisis. Would forming a team, part of a group to survive – for shelter, food and
where we put overly cautious people with mutual defence. The sense of needing others
risk-takers, optimists with pessimists and persists, and it is uncomfortable to express an
people who care about the detail with ‘big opinion that might upset people around us.
picture’ people, result in better choices
during a crisis? Even in a non stressful situation with strangers
we try to fit in. In a classic study in the 1950s,
Professor Solomon Asch asked people to
Imagine two children in a play area. One is
compare the length of lines on a piece of paper.
very cautious and nervous about using the Look at the lines in Figure 2 – which of A, B or
slide. One is very bold, and likes to climb up C most closely matches the length of the target
the slide the wrong way. If the two children line?
play together, do we think their behaviours
will ‘average out’ so that they both play While this seems like a simple, non-
nicely? Probably not. But somehow, controversial task, around three-quarters of the
businesses think that a group of people subjects got it wrong at least once. What they
might ‘average’ out their decisions. didn’t know was that the other people in the
study were in on the test. The subject would be
asked for their opinion after several other
Psychological research shows us that
people had given the same wrong answer.
without support, information and challenge,
a group is likely to take greater risks than That the majority of test subjects gave in to
any individual member of the group would pressure to agree with an obviously wrong
have taken on their own. This phenomenon decision is alarming. Now imagine a tightly knit
was defined as ‘groupthink’ by the American group of decision makers in an organisation with
social psychologist Irving Janis in 1972. pressure to make the right decision, quickly, and
the knowledge that the wrong decision could
Characteristics of groupthink include: make a crisis into a disaster.

How many people then will be willing to stand


• an unswerving belief in the decisions of
out against their colleagues and challenge a
the group, supported by group pressure decision?
against different opinions
Figure 2: How many people would have to disagree
• self censorship of any doubts. with you before you’d decide that the target line
wasn’t the same length as line C?

Target Line A B C



“An error does not become a mistake until you refuse to correct it.”1

Orlando Aloysius Battista, a Canadian chemist and author, in 1946; quoted by US President
John F. Kennedy in 1961

HOW DO YOU OVERCOME THESE ERRORS?
Simply understanding how we make errors in our decision-making is not enough to stop us
from making the wrong decisions. We need to have plans and processes in place to catch
errors before they become disasters. We also need the flexibility to know when a plan isn’t
working and needs to be changed.

PLAN, PRACTICE, PLAN, PRACTICE

“There wasn’t time in that critical moment to try to devise my strategy... I decided that we were
only going to do the highest priority items and do them very, very well.”5

Chesley Sullenberger (‘Sully’), in an interview in 2018 (see Box 1).

Events are likely to be more variable than we think they will be, and we will have to change our
plans more often than we expect. Sully’s ability to devise a life-saving strategy within moments
was remarkable, and based on 19,663 total flight hours. We should not rely on such notable
achievements in our own plans. Planning before a crisis allows us to weigh up the options with
our slow-thinking brain. Practising that plan makes it more likely that in a crisis our fast-
thinking brain will take the right shortcuts.

You cannot plan the exact details of every crisis you might face. But you can plan and practise
for a broad range of scenarios, and build up a set of skills that can be applied when something
novel comes along.

The most usual form of rehearsal is the fire evacuation drill, where an alarm sounds and staff
march out to the assembly area to be checked. But practice doesn’t have to be a full-scale
operation. Table-top exercises involving key decision makers are a cost-effective way of testing
communications, technology and procedures. You can test expected emergencies – and
unexpected ones too.

After any form of practice, review the plans and processes and enhance them, so that people
are fully prepared when the plans need to be applied in earnest.



STATUS CHECKPOINTS

Since our perception of time when under pressure can be inaccurate, it is unsafe to rely on
someone noticing that a crisis is growing more quickly than anticipated, or that the plan isn’t
working as expected. We need a system to bring together and summarise all the key clues
that decision makers should be looking at, and a regular ‘alarm clock’ that reminds us to take
time – even when we think we have none to spare – to stop and reconsider the plan we are
following.

Status meetings can be in person or virtual, depending on the type of crisis being handled.
They might be timetabled to take two minutes every 20 minutes during a fire, or half a day
every week during a pandemic.

However long or frequent, status meetings need some structure to avoid falling into
confirmation bias and groupthink. You need a process that requires decision makers to
respond to particular prompts – for example, about communication with stakeholders, the
status of equipment, or the growth of the crisis. Set criteria for signals that must not be
ignored – like an oil pressure test, or indications that a fire hasn’t been contained within one
flat, or damage to equipment beyond a set threshold.
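As an illustration, the sketch below shows one way such prompts and criteria could be structured. The signal names, readings and thresholds are hypothetical, chosen only to echo the examples above.

```python
# A minimal sketch of a structured status checkpoint. Every prompt demands
# an answer, and breached criteria are flagged automatically rather than
# left to someone noticing. All names and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class Signal:
    prompt: str       # the question decision makers must answer
    reading: float    # the latest reported value
    threshold: float  # the level that must not be ignored

    def breached(self) -> bool:
        return self.reading >= self.threshold

checkpoint = [
    Signal("Floors showing signs of fire spread", reading=3, threshold=2),
    Signal("Stakeholder updates overdue",         reading=0, threshold=1),
    Signal("Appliances out of service",           reading=1, threshold=2),
]

for signal in checkpoint:
    status = "MUST NOT IGNORE" if signal.breached() else "within criteria"
    print(f"{signal.prompt}: {signal.reading:g} ({status})")
```

Because the criteria are set in advance, with slow-brain thinking, the checkpoint resists the in-the-moment temptation to explain a breached signal away.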

The Shuttle Programme management rules at the time of the Columbia mission stated that
the Mission Management Team should meet every day. It met just five times during a 16-day
mission. More frequent and more structured meetings might have considered the impact of
damage to the shuttle more carefully.



“In the case of NASA, our assessment after the Columbia tragedy found that many of
their team meetings were not conducive to open and productive communication.

Often, disagreements were ‘unsafe’ in that someone would win (and be celebrated) and
someone would lose (and be marginalized) from the discussion, even if everyone was
raising valid points.”3

Dr Thomas Krause, author of several books on organisational safety

TEAM DYNAMICS

A successful team will collectively have all the knowledge needed to manage a situation. This
can include regular members of the operational team, or specialists brought in for their
expertise. But assembling a team is not simply a matter of having one person to cover each
skill area.

Team members need to co-operate around a common goal, while communicating with each
other honestly. The quote above refers to the culture at NASA before the disintegration of
the Space Shuttle Columbia in 2003, resulting in the deaths of seven crew members. NASA
engineers knew that the shuttle had been damaged on take-off. They suspected there was a
risk of failure on the return journey. But managers didn’t want to consider this possibility, nor
how to avoid it. Following this second major disaster with the shuttle programme, leadership
at NASA rebuilt their teams.

As Krause explains:

“Subsequently, leadership at NASA took on the task of creating a more open environment that
encourages communication and values dialogue about disagreement. Progress was measured
via surveys and observations, and positive results were evident in the first year.” 3

To avoid groupthink, consider bringing in a new person to take a fresh look at the evidence. If
they are not invested in the original plan, they will be more likely to suggest a new approach if
one is needed.

One approach is to employ someone as a ‘devil’s advocate’ – a person whose specific job it
is to disrupt and to point out the flaws in someone else’s line of thinking. Results from Asch’s
line-matching study support this approach: even though the right answer remained a minority
position, if just one of the confederates disagreed with the majority’s incorrect decision, the
subject was more likely to stick with the right answer.

In one variation of the line-comparing study, Asch asked people to write down their decision
privately, so that the rest of the group didn’t know how they had responded. The likelihood of
choosing the wrong line decreased. Technological solutions – where people can respond from
separate locations, rather than having to come together to a single location – might have the
added benefit of reducing group pressures to conform.



THE AFTERMATH
Applying some of the approaches described in this whitepaper, supported by tools and
resources, will improve your ability to manage the crisis to a successful conclusion. Whatever
happens during a drill or a real emergency, there will be something to learn.

Failing to learn lessons makes it more painful when mistakes are repeated. Lessons from the
Challenger Space Shuttle disaster, where seven astronauts died in 1986, did not prevent
seven fatalities on the Columbia Space Shuttle in 2003; lessons from the otherwise
unremarkable and unmemorable fire at Knowsley Heights near Liverpool in 1991, where no
one died, were not used to prevent 72 people dying in Grenfell Tower in London in 2017.

Many of the shortcuts we take in planning or managing a crisis also apply when looking at
what happened during an event. The ‘good enough’ option that was pragmatic in the heat of
an emergency appears a poor choice with the benefit of hindsight and with time to consider
what other options were available. It is often easiest to blame the people on the front line.

The pilot who made the best of tight turnaround times, the fire service working in a climate
where safety regulations are mocked, the engineers who would be ostracised for pointing out
potential problems – these are the easy targets. But the senior managers, the local authorities
and, where relevant, national governments must face up to their role in creating the
environment for a crisis.

We judge people more harshly if a decision didn’t work out than if it did, even if the reasons
for success or failure were outside their control. A doctor with a difficult medical judgement
to make about whether or not a patient should have a heart bypass that could improve their
life – or kill them – might be judged reckless if the patient dies, and brave if the patient lives.
We apply a similar bias to other decision makers – judging them by the outcome, not by how
they used the information available to them.

An accurate account of information received, decisions made, and actions taken can make
the review process more objective.

Paper records can be unreliable, so where possible use a system that allows you to create
a time-stamped log, including information from equipment, people, systems and multiple
locations. Knowing what information was available at the time each decision was made will
provide better improvements to future planning than hindsight ever can.
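As an illustration of the kind of record that supports an objective review, here is a minimal sketch of a time-stamped log. The field names, sources and example entries are hypothetical.

```python
# A minimal sketch of a time-stamped decision log. Field names and sources
# are hypothetical; the aim is to capture what was known when each decision
# was made, so the review judges the decision, not the outcome.

import json
from datetime import datetime, timezone

def log_entry(source: str, information: str, decision: str | None = None) -> dict:
    """Record information received (and any decision taken) with a UTC timestamp."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": source,            # equipment, person, system or location
        "information": information,  # what was known at the time
        "decision": decision,        # action taken, if any
    }

incident_log = [
    log_entry("sensor/tank-12", "pressure above alarm threshold"),
    log_entry("watch commander", "fire not contained to flat of origin",
              decision="move from stay-put to full evacuation"),
]
print(json.dumps(incident_log, indent=2))
```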



FURTHER READING

References for quotes:

1. Kennedy, J (1961). The President and the Press: Address before the American Newspaper Publishers Association, April 27, 1961. Accessed at www.jfklibrary.org/archives/other-resources/john-f-kennedy-speeches/american-newspaper-publishers-association-19610427

2. Tavris C and Aronson E (2008) Mistakes were made, but not by me. Pinter & Martin.

3. Krause T (2010). What Caused the Gulf Oil Spill? This article also mentions the Columbia space shuttle disaster. Accessed at www.csrwire.com/press_releases/30321-what-caused-the-gulf-oil-spill-

4. Hopkins A (2012) Disastrous Decisions: The Human and Organisational Causes of the Gulf of Mexico Blowout. CCH Australia.

5. Pires C (2018). Don’t panic! Meet the experts with a steady hand when catastrophe strikes. In The Guardian (UK newspaper). Accessed at www.theguardian.com/world/2018/sep/09/dont-panic-meet-the-experts-with-a-steady-hand-when-disaster-strikes

6. Florance I (2018) ‘When the Manchester Arena attack happened we developed our plans on the way to work’. In The Psychologist, November 2018. British Psychological Society. Accessed at https://thepsychologist.bps.org.uk/volume-31/november-2018/when-manchester-arena-attack-happened-we-developed-our-plans-way-work

7. Grenfell Tower Public Inquiry website (ongoing) at www.grenfelltowerinquiry.org.uk

Background to specific accidents:

Columbia Accident Investigation Board (2003) Volume 1. Accessed at www.nasa.gov/columbia/home/CAIB_Vol1.html

National Transportation Safety Board (2010) Loss of Thrust in Both Engines After Encountering a Flock of Birds and Subsequent Ditching on the Hudson River, US Airways Flight 1549. Accessed at www.ntsb.gov/investigations/AccidentReports/Reports/AAR1003.pdf

Deepwater Horizon Accident Investigation Report, Executive Summary (Sept 2010). Accessed at www.bp.com/content/dam/bp/business-sites/en/global/corporate/pdfs/sustainability/issue-briefings/deepwater-horizon-accident-investigation-report-executive-summary.pdf

National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling (2011) Deep Water: The Gulf Oil Disaster and the Future of Offshore Drilling. Accessed at www.govinfo.gov/content/pkg/GPO-OILCOMMISSION/pdf/GPO-OILCOMMISSION.pdf

HSE (2011) Buncefield: Why did it happen? Accessed at www.hse.gov.uk/comah/buncefield/buncefield-report.pdf

References on how we think:

Asch S (1952) Effects of group pressure upon modification and distortion of judgement. In Guetzkow H (1952) Groups, Leadership and Men. Carnegie Press.

Janis I (1972) Victims of Groupthink. Houghton Mifflin.

Reason J (2013) A Life in Error. Routledge.

Kunreuther H and Meyer R (2017) The Ostrich Paradox. Wharton School Press.

Collins R and Leathley B (1995) Psychological predisposition to error in failure analysis. In Safety and Reliability, Vol 14(3).

See also Reference 2 above (Tavris and Aronson, 2008).



Crisis Management Software
EcoOnline’s Crisis Management Software makes it easier for
your business to safeguard lives, safety, important assets and
the environment in the event of an incident.
