decisions in a crisis
WHITEPAPER
Contents
INTRODUCTION
‘GOOD-ENOUGH’ DECISIONS
GROUPTHINK
THE AFTERMATH
FURTHER READING
INTRODUCTION
In a business setting, the term crisis refers to a period of intense difficulty or danger.
Crisis management describes how challenging and important decisions are made, and how
actions are taken to reduce danger. The danger might be to an organisation's reputation,
to its finances, or to life. Decision makers are under the most pressure when the wrong
decision can lead to deaths and injuries.
Often one type of crisis follows another. A production problem is a business risk, but can lead
to an explosion and fire, with threats to life. This leads to an environmental problem as toxic
smoke fills the air, contaminated runoff water seeps into the earth or oil gushes into the ocean.
Residents and businesses are affected, and client needs go unmet. The damage to reputation
and future prosperity needs to be managed.
For some crises, there are warnings. A pandemic like COVID-19 had been predicted for
decades, although few organisations had detailed plans in place. When the pandemic started in
2019, the crisis developed slowly – timescales for considering evidence and making decisions
could be measured in days and weeks.
For other crises, the decision-making timescales are much shorter. The Buncefield
explosion occurred at 06.01 on a Sunday morning in 2005. The fire engulfed twenty storage
tanks within minutes. The Grenfell Tower fire was reported in a fourth-floor flat just before
01.00 on 14 June 2017. Within 30 minutes, the fire had spread all the way to roof level, above
the 24th floor.
Even when the initiating event takes only a few minutes, the need for crisis management might
continue for days or weeks. It took a full 24 hours to put out the fire at Grenfell, and several
days at Buncefield. The clean-up after Buncefield took months, while cleaning up after
Deepwater Horizon took years. The crisis of building safety and regulation raised by Grenfell
seems likely to continue for decades.
What each crisis has in common is that decisions must be made under time pressure if
damage – whether to life, property or reputation – is to be limited. This whitepaper looks at
how improving our understanding of the way we think can lead to better decision-making
during a crisis.
‘GOOD-ENOUGH’ DECISIONS
If you were buying a new home, you might make lists of all the desirable criteria and all the
homes available, and then score each choice against the criteria. You might do the same for a
car.
You probably wouldn’t go to this effort for every item in your weekly shopping – it would take
too long. We take a shortcut, selecting one or two criteria as the most important – the price of
a bar of chocolate, the familiarity of a breakfast cereal, or the ethics of a cup of coffee might
trump other considerations.
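The contrast between the two styles of choosing can be written down. Below is a minimal sketch in Python; the homes, criteria and weights are illustrative assumptions, not figures from this whitepaper.

```python
# The methodical 'score every option against every criterion' route,
# beside the fast single-criterion shortcut. All values are made up.

homes = {
    "Home A": {"price": 7, "location": 9, "garden": 4},
    "Home B": {"price": 8, "location": 6, "garden": 8},
    "Home C": {"price": 5, "location": 8, "garden": 9},
}
weights = {"price": 0.2, "location": 0.5, "garden": 0.3}

def weighted_score(scores: dict) -> float:
    """Slow-brain route: weigh every criterion for one option."""
    return sum(weights[c] * s for c, s in scores.items())

methodical = max(homes, key=lambda h: weighted_score(homes[h]))

# Fast-brain shortcut: judge on a single criterion, e.g. price alone.
shortcut = max(homes, key=lambda h: homes[h]["price"])

print("Methodical choice:", methodical)  # Home C on the full weighting
print("Shortcut choice:  ", shortcut)    # Home B on price alone
```

With these illustrative numbers the two routes pick different homes, which is exactly why the shortcut is only ‘good enough’ when one criterion genuinely dominates.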
THE IMPORTANCE OF PLANNING
When we make decisions under pressure, we don’t have time to consider all the options.
Planning takes advantage of our ability to think carefully about multiple priorities.
If we could predict every crisis, we could plan our response and weigh up the alternatives,
applying our methodical optimising slow-brain thinking to the problem, rather than our ‘good-
enough’ fast-brain. While we can’t plan for everything that will ever happen, we can plan for a
broad range of scenarios. When a crisis happens, we have some tools ready to use.
But which types of incidents should we plan for? Most organisations have plans in place for fire
evacuation and put their staff through a fire drill once or twice a year. For many organisations,
this is where emergency planning stops.
To understand what to plan for, we need to manage the information available – and to be aware
of the shortcuts we take when assessing the information.
INFORMATION MANAGEMENT BEFORE THE CRISIS
Planning emphasises the importance of sharing information across an organisation. You can’t
assess the likelihood of a crisis if you don’t have the relevant information. Sharing everything
that everyone knows would be impossible, and overwhelming. The right IT system can help
you organise what is known so that it is easier to identify which threats to plan for.
You need to find a balance between spending a lot of effort on something that will never
happen, and ignoring things that haven’t happened so far but could occur in the future. Take a
list of possible events, and brainstorm with the management team what processes you have
in place to deal with them. Some might be well covered already. Others might be manageable
by tailoring existing plans.
Others – like a space shuttle landing on your worksite – might be regarded as so unlikely as
to be beyond consideration. But considering the impact of unlikely – but catastrophic –
incidents is a useful way to test your plans.
While better information management will make us more aware of what we need to plan for, we
need to be aware of the shortcuts that can make us ignore the information we have.
Even in the face of evidence, we don’t like to think about bad things happening. Too many of
us were slow to respond to COVID-19 because shaking hands, travelling and seeing people
were normal things to do.

Climate change is already claiming lives across the world, and yet too many people still deny
its existence, or assume that science will find a solution in time to protect our lifestyles.

We plan based on what we already know. When things stay the same, this works. When things
change, it makes it difficult to see the need for change. Pilots land planes on runways, not in
rivers. In high-rise buildings, firefighters put out fires in one self-contained flat; they don’t
evacuate people from 120 separate flats.

In safety management, we are supposed to learn from near misses – someone nearly trips
on an uneven floor, so we repair the floor to prevent someone falling. At the scale of a crisis,
near misses make us think we are already resilient.

Europe had seen the rest of the world deal with SARS (severe acute respiratory syndrome),
swine flu, Ebola and avian flu. That these illnesses didn’t impact most Europeans perhaps led
to false optimism – we didn’t need to plan for such events.

When Hurricane Ivan turned aside from New Orleans in 2004, the authorities appeared to
scale down their preparations for a hurricane, rather than escalate to deal with the impact of
Hurricane Katrina a year later.
As with planning, information is a key tool for managing a crisis. People need the right
information at the right time to make the right decisions. Behaviours regarded after an event
as ‘panic’ can often be explained by the lack of information at the time a decision was made
(see Box 2). This is particularly frustrating where information is available in the organisation,
but it hasn’t been passed to the right people.
However, too much information can be as much of a problem as not having enough. Keeley
Foster had to manage efforts to control large grass fires in outer London when she was
Deputy Assistant Commissioner at London Fire Brigade in 2018:
“I got there at 4 pm and was bombarded with information. If you take too much on board
you’re going to be overloaded and your stress levels will go up. So it’s about processing that
information, prioritising it, and allocating it to the right people at the right time.”5
The right IT tools can make a significant difference to information management and
communication. Tools to specify who gets a message by role or department or location make
it more likely people have the right information on which to base their choices.
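As one way to picture such routing: a minimal sketch, assuming a hypothetical recipient list and matching rules; the names, roles and the route function are illustrative, not a real product’s API.

```python
# Routing a crisis update by role, department and location, so each
# person receives only what is relevant to them. All records are made up.

from dataclasses import dataclass

@dataclass
class Recipient:
    name: str
    role: str
    department: str
    location: str

recipients = [
    Recipient("Asha", "incident commander", "operations", "site A"),
    Recipient("Ben", "first aider", "operations", "site B"),
    Recipient("Cora", "press officer", "communications", "head office"),
]

def route(message: str, role=None, department=None, location=None):
    """Deliver the message only to recipients matching every given filter."""
    for r in recipients:
        if role and r.role != role:
            continue
        if department and r.department != department:
            continue
        if location and r.location != location:
            continue
        print(f"To {r.name}: {message}")

# Only operations staff at site A are told to evacuate; the press
# officer is not flooded with operational detail.
route("Evacuate to assembly point 2", department="operations", location="site A")
```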
BOX 2
The psychology of ‘panic’
In Box 1 we saw that, with plenty of time, the inquiry following the ditching
of US Airways Flight 1549 was able to devise a solution that might
have been better, and it challenged Captain Sullenberger on why he
hadn’t taken that option. Sully was able to explain the rationale behind
his decisions, and the board of inquiry had to admit that his choice was
reasonable, given the information and time available.
What might look irrational to us with the benefit of hindsight might have
been perfectly reasonable to the people making that decision at the
time. If you think you might have difficulty getting petrol or toilet roll
or pasta next week, it is reasonable to top up your supplies this week,
even if normally you’d have waited another few days. If when running
away from a fire, your preferred route is full of hot, acrid smoke, it is
understandable that you might jump out of a window.
Under the pressure of a crisis, we are even more likely to resort to shortcuts, and less likely to spot
the errors these shortcuts introduce.
Exponential myopia
Our evolution taught us that people, food crops and animals tend to grow steadily, in a linear way.
Pandemics, fires and other crises often develop exponentially. One person infects two others, who
infect four others. On 23 February 2020 there were ten reported cases of COVID-19 in the UK.
Politicians imagining linear growth believed the National Health Service would be able to cope.
As cases doubled every three or four days, by the time of the first UK lockdown a month later there
had been over 12,000 cases in the UK.
If some bad publicity about your company is shared in one social media post and seen by 10 people
who repost, by the end of a single day the bad news might have been seen by thousands, or even
millions.
If you suffer from exponential myopia and assume that a crisis will grow linearly, you are likely to
miss important cues that will tell you to change course, to try a different approach, or to prioritise one
aspect of a crisis over another.
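To see how quickly the two assumptions diverge, here is a minimal sketch of the arithmetic, loosely based on the UK figures quoted above; the linear daily increase and the exact doubling time are illustrative assumptions.

```python
# 'Exponential myopia': the same ten starting cases projected linearly
# and exponentially. The linear increase and the three-day doubling
# time are assumptions for illustration.

cases = 10           # reported UK cases on 23 February 2020
daily_increase = 10  # a linear planner's guess: ~10 new cases per day
doubling_days = 3    # cases were doubling roughly every three to four days

for day in (7, 14, 21, 31):
    linear = cases + daily_increase * day
    exponential = cases * 2 ** (day / doubling_days)
    print(f"Day {day:2}: linear ~{linear:,}   exponential ~{exponential:,.0f}")

# After a month the linear projection is still in the hundreds, while
# the exponential one passes 12,000 -- the scale actually seen by the
# first UK lockdown.
```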
Outcome focussed
It is painful to remember when we got something wrong. It is much nicer to dwell on success. Hence,
during a crisis the options we think of first are more likely to relate to past successes than to past
failures. This is one reason we so often fail to learn from an event.
In fighting a fire, a commander might consider all the many fires where more water eventually put the
fire out without loss of life; they would need help to recall the fires where this strategy wasn’t
successful. If sandbags held back the flood last time, it might be assumed they will do the same this
time.
This focus on outcomes can escalate – each time we get away with a little more, we become bolder
next time. The driver who gets home safely after one drink might try two drinks next week. The
homeowner who didn’t evacuate during the last weather warning decides not to put up their storm
shutters this time. Before the Columbia space shuttle disaster in 2003, NASA ‘got away’ with at least
14 previous incidents of damage to space shuttles.
The damage threshold creeps up with each successful flight, until the flight where the shuttle
disintegrates.
Confirmation bias
Think back to a big decision you made in the past. Perhaps it was deciding which home to buy, which
person to ask out on a date, or which job offer to take. If you are still in that home, still dating or
married to that person, or still enjoying the career path that job led to, you probably feel you made
the right decision. We reduce any unease by looking for confirmation that the decision we have made
is correct – and place less weight on any evidence that we have made the wrong decision.
We do this because of the way our brains are wired. Neuroscientists have detected parts of the brain
shutting down when presented with information that contradicts current beliefs, while ‘happy’
circuits light up when given information that confirms existing understanding.
Confirmation bias can be psychologically healthy. It prevents us wasting our lives worrying about
whether we should have chosen a better partner, or home, or career. However, while managing a
crisis we need to be more versatile, and more critical of our own decisions, or those made by other
decision makers.
Imagine you have invested much time and effort in crafting a response to a situation. You have
mobilised resources to follow this plan. To admit that there is evidence your plan isn’t working – to
abandon the plan and ask for more help – might feel like failure.
Professor Andrew Hopkins of the Australian National University in Canberra, an author and
consultant on industrial safety and accident analysis, summarises the problem with decision making
in the lead-up to the explosion at Deepwater Horizon:
“The group was subject to powerful confirmation bias because the cement job had already been declared
a success… they were not testing whether the well was safe, they were confirming that it was… they
repeated the test in various ways until finally they got a result that could be interpreted as a success,
at which point, it was ‘mission accomplished’.” 4
A plan assumes a theory or model of how we think the world works. We assume blocks of flats will be
built and maintained such that a fire on one floor will not spread to another floor. Concrete in a deep-
sea well will hold back hydrocarbons. When evidence is missing to support our theory, our brains will
fill in the gaps.
Look at Figure 1. What shapes do you see? Are these the same as the shapes that have been drawn,
or is your brain filling the gaps to see a triangle and a square? Just as we do this with shapes, our
brains do the same with facts. We connect the pieces of information most readily available to us to
reinforce the theories we already have. Sometimes, we need to step back to see what’s really
happening.
Figure 1: What shapes do you see? Are they there, or are you filling the gaps?
[Figure: the Asch line-matching task – a target line and comparison lines A, B and C]
“There wasn’t time in that critical moment to try to devise my strategy... I decided that we were
only going to do the highest priority items and do them very, very well.”5
Events are likely to be more variable than we think they will be, and we will have to change our
plans more often than we expect. Sully’s ability to devise a life-saving strategy within moments
was remarkable, and based on 19,663 total flight hours. We should not rely on such notable
achievements in our own plans. Planning before a crisis allows us to weigh up the options with
our slow-thinking brain. Practising that plan makes it more likely that in a crisis our fast-
thinking brain will take the right shortcuts.
You cannot plan the exact details of every crisis you might face. But you can plan and practise
for a broad range of scenarios, and build up a set of skills that can be applied when something
novel comes along.
The most usual form of rehearsal is the fire evacuation drill, where an alarm sounds and staff
march out to the assembly area to be checked. But practice doesn’t have to be a full-scale
operation. Table-top exercises involving key decision makers are a cost-effective way of testing
communications, technology and procedures. You can test expected emergencies – and
unexpected ones too.
After any form of practice, review the plans and processes and enhance them, so that people
are fully prepared when they need to be applied in earnest.
Since our perception of time when under pressure can be inaccurate, it is unsafe to rely on
someone noticing that a crisis is growing more quickly than anticipated, or that the plan isn’t
working as expected. We need a system to bring together and summarise all the key clues that
decision makers should be looking at, and a regular ‘alarm clock’ that reminds us to take time –
even when we think we have none to spare – to stop and reconsider the plan we are following.
Status meetings can be in person or virtual, depending on the type of crisis being handled.
They might be timetabled to take two minutes every 20 minutes during a fire, or half a day
every week during a pandemic.
However long or frequent, status meetings need some structure to avoid falling into
confirmation bias and groupthink. You need a process that requires decision makers to
respond to particular prompts – for example, about communication with stakeholders, the
status of equipment, or the growth of the crisis. Set criteria for signals that must not be
ignored – like an oil pressure test, or indications that a fire hasn’t been contained within one
flat, or damage to equipment beyond a set threshold.
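One way to keep such prompts and criteria from being talked away is to write them down before the crisis. Below is a minimal sketch; the prompts, signal names and thresholds are illustrative assumptions, not taken from any real procedure.

```python
# Structured status-meeting prompts plus 'must not ignore' criteria.
# Every prompt is read out; any breached criterion forces escalation.

PROMPTS = [
    "Have all stakeholders been updated since the last meeting?",
    "What is the status of key equipment?",
    "Is the crisis growing faster than the current plan assumes?",
]

# Hypothetical hard criteria: each maps an observation to 'acceptable?'.
HARD_CRITERIA = {
    "fire_contained_to_one_flat": lambda contained: contained is True,
    "damaged_equipment_items": lambda count: count <= 2,
}

def review(observations: dict) -> list[str]:
    """Return the names of criteria that have been breached."""
    return [name for name, ok in HARD_CRITERIA.items()
            if name in observations and not ok(observations[name])]

breached = review({"fire_contained_to_one_flat": False,
                   "damaged_equipment_items": 5})
for prompt in PROMPTS:
    print("PROMPT:", prompt)
for name in breached:
    print("ESCALATE:", name)  # a breached criterion cannot be argued away
```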
The Shuttle Programme management rules at the time of the Columbia mission stated that
the Mission Management Team should meet every day. It met just five times during a 16-day
mission. More frequent and more structured meetings might have considered the impact of
damage to the shuttle more carefully.
GROUPTHINK
TEAM DYNAMICS
A successful team will collectively have all the knowledge needed to manage a situation. This
can include regular members of the operational team, or specialists brought in for their
expertise. But assembling a team is not simply a matter of having one person to cover each
skill area.
Team members need to co-operate around a common goal, while communicating with each
other honestly. The culture at NASA before the disintegration of the Space Shuttle Columbia
in 2003, which resulted in the deaths of seven crew members, shows what happens when that
honesty is missing. NASA engineers knew that the shuttle had been damaged on take-off. They
suspected there was a risk of failure on the return journey. But managers didn’t want to
consider this possibility, nor how to avoid it. Following this second major disaster with the
shuttle programme, leadership at NASA rebuilt their teams.
As Krause explains:
“Subsequently, leadership at NASA took on the task of creating a more open environment that
encourages communication and values dialogue about disagreement. Progress was measured
via surveys and observations, and positive results were evident in the first year.” 3
To avoid groupthink, consider bringing in a new person to take a fresh look at the evidence. If
they are not invested in the original plan, they will be more likely to suggest a new approach if
one is needed.
One approach is to employ someone as a ‘devil’s advocate’ – a person whose specific job it
is to disrupt and to point out the flaws in someone else’s line of thinking. Results from Asch’s
line-matching study support this approach. Even when the majority chose the wrong line, if
just one of the confederates broke ranks and disagreed with the majority’s incorrect choice,
the subject was far more likely to stick with the right answer.
In one variation of the line-comparing study, Asch asked people to write down their decision
privately, so that the rest of the group didn’t know how they had responded. The likelihood of
choosing the wrong line decreased. Technological solutions – where people can respond from
separate locations, rather than having to come together to a single location – might have the
added benefit of reducing group pressures to conform.
THE AFTERMATH
Failing to learn lessons makes it more painful when mistakes are repeated. Lessons from the
Challenger Space Shuttle disaster, where seven astronauts died in 1986, did not prevent
seven fatalities on the Columbia Space Shuttle in 2003; lessons from the otherwise
unremarkable and unmemorable fire at Knowsley Heights near Liverpool in 1991 where no
one died were not used to prevent 72 people dying in Grenfell Tower in London in 2017.
Many of the shortcuts we take in planning or managing a crisis also apply when looking at
what happened during an event. The ‘good enough’ option that was pragmatic in the heat of
an emergency appears a poor choice with the benefit of hindsight and with time to consider
what other options were available. It is often easiest to blame the people on the front line:
the pilot who made the best of tight turnaround times, the fire service working in a climate
where safety regulations are mocked, the engineers who would be ostracised for pointing out
potential problems. But the senior managers, the local authorities and, where relevant,
national governments must also face up to their role in creating the environment for a crisis.
We judge people more harshly if a decision didn’t work out than if it did, even if the reasons
for success or failure were outside their control. A doctor with a difficult medical judgement
to make about whether or not a patient should have a heart bypass that could improve their
life – or kill them – might be judged reckless if the patient dies, and brave if the patient lives.
We apply a similar bias to other decision makers – judging them by the outcome, not by how
they used the information available to them.
An accurate account of information received, decisions made, and actions taken can make
the review process more objective.
Paper records can be unreliable, so where possible use a system that allows you to create
a time-stamped log, including information from equipment, people, systems and multiple
locations. Knowing what information was available at the time each decision was made will
provide better improvements to future planning than hindsight ever can.
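As one way to picture such a log: a minimal sketch assuming a simple append-only JSON-lines file; the file name, fields and helper function are hypothetical, not a specific product’s format.

```python
# A time-stamped, append-only log of information received and decisions
# taken, so a later review can judge each decision against what was
# actually known at the time.

import json
from datetime import datetime, timezone

LOG_FILE = "crisis_log.jsonl"  # one JSON record per line, append-only

def log_entry(source: str, information: str, decision: str = "",
              decided_by: str = "") -> None:
    """Append a time-stamped record of information and any decision."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": source,            # person, system or piece of equipment
        "information": information,  # what was known at this moment
        "decision": decision,        # action taken, if any
        "decided_by": decided_by,
    }
    with open(LOG_FILE, "a") as f:
        f.write(json.dumps(record) + "\n")

log_entry("pump 3 sensor", "pressure reading above threshold")
log_entry("incident commander", "reading confirmed by second gauge",
          decision="shut down pump 3", decided_by="A. Example")
```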
FURTHER READING
1. Kennedy, J (1961). The President and the Press: Address before the American Newspaper Publishers Association, April 27, 1961. Accessed at www.jfklibrary.org/archives/other-resources/john-f-kennedy-speeches/american-newspaper-publishers-association-19610427
2. Tavris, C and Aronson, E (2008). Mistakes were made, but not by me. Pinter & Martin.
3. Krause, T (2010). What Caused the Gulf Oil Spill? This article also mentions the Columbia space shuttle disaster. Accessed at www.csrwire.com/press_releases/30321-what-caused-the-gulf-oil-spill-
4. Hopkins, A (2012). Disastrous Decisions: The Human and Organisational Causes of the Gulf of Mexico Blowout. CCH Australia.
5. Pires, C (2018). Don’t panic! Meet the experts with a steady hand when catastrophe strikes. The Guardian (UK newspaper). Accessed at www.theguardian.com/world/2018/sep/09/dont-panic-meet-the-experts-with-a-steady-hand-when-disaster-strikes
6. Florance, I (2018). ‘When the Manchester Arena attack happened we developed our plans on the way to work’. The Psychologist, November 2018. British Psychological Society. Accessed at https://thepsychologist.bps.org.uk/volume-31/november-2018/when-manchester-arena-attack-happened-we-developed-our-plans-way-work
National Transportation Safety Board (2010). Loss of Thrust in Both Engines After Encountering a Flock of Birds and Subsequent Ditching on the Hudson River, US Airways Flight 1549. Accessed at www.ntsb.gov/investigations/AccidentReports/Reports/AAR1003.pdf
Deepwater Horizon Accident Investigation Report, Executive Summary (Sept 2010). Accessed at www.bp.com/content/dam/bp/business-sites/en/global/corporate/pdfs/sustainability/issue-briefings/deepwater-horizon-accident-investigation-report-executive-summary.pdf
National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling (2011). Deep Water: The Gulf Oil Disaster and the Future of Offshore Drilling. Accessed at www.govinfo.gov/content/pkg/GPO-OILCOMMISSION/pdf/GPO-OILCOMMISSION.pdf
Asch, S (1952). Effects of group pressure upon modification and distortion of judgement. In Guetzkow, H (ed), Groups, Leadership and Men. Carnegie Press.
Kunreuther, H and Meyer, R (2017). The Ostrich Paradox. Wharton School Press.
Collins, R and Leathley, B (1995). Psychological predisposition to error in failure analysis. Safety and Reliability, Vol 14(3).