Beyond the Worst Case Scenario: 'Managing' the Risks of Extreme Events
Denis Fischbacher-Smith
CHERR – Centre for Health, Environment, Risk and Resilience, University of Glasgow, Glasgow, UK.
E-mail: d.fischbacher-smith@lbss.gla.ac.uk
Introduction
The history of risk and crisis management is littered with narratives about
the ways in which organizations failed to deal with the demands of
‘extreme events’. Extreme events, by definition, are a class of outcomes
that have very high consequences (often exceeding the perceived worst
case scenario) but also have a low probability of occurrence. As such, they
represent a difficult area for analysis and research. Some would dismiss
their significance due to the fact that they are not representative of the
‘normal’ state of affairs within the ‘system’ under consideration. However,
as our opening quote from Taleb suggests, it is precisely their unique na-
ture that makes them important for research and theory. Extreme events
call into question our understanding of the various classes of phenomena
in which they are found and the strategies that organizations have in place
to deal with them. They challenge organizational claims around the con-
trol of systems and can often call into question many of the fundamental
assumptions that are held about the nature of risk.
In aviation, for example, the worst case was often described as 'two fully
laden Boeing 747 aircraft colliding in mid-air over a sports stadium that
was full to capacity'. Many would have dismissed this
scenario as being beyond the bounds of credibility due to the multiple
levels of organizational defences that are in place to prevent such a sce-
nario. We have, of course, seen both ‘ground’ and mid-air collisions of
aircraft as well as terrorist attacks on aircraft that have caused fatalities
© 2010 Macmillan Publishers Ltd. 1460-3799/10 Risk Management Vol. 12, 1, 1–8
www.palgrave-journals.com/rm/
Editorial Review
on the ground. In each of these cases, multiple levels of system defences were
bypassed allowing the catastrophic event to occur. Anyone who suggested
before September 11th that terrorists would hijack civilian airliners and
fly them into prominent buildings would have been asked to provide strong
‘evidence’ of such a threat given the lack of historical support for such a thesis
and the existence of multiple levels of ‘supposed’ defences that were in place to
prevent such a hijacking.
The case of the Space Shuttle Challenger also illustrates the difficulties that
surround the management of risk in the absence of unequivocally clear evi-
dence about a hazard. In this case, Roger Boisjoly, who was an engineer at
Morton Thiokol, was concerned about the significance of the evidence that he
was collecting regarding the risk of o-ring erosion in the solid rocket boosters
(Boisjoly, 1987, 2006). However, when expressing his concerns in a telecon-
ference presentation to NASA managers on the evening before the Challenger
launch he was asked for stronger proof of the risk than he was able to provide.
This ‘burden of proof’ is inevitably problematic within organizations as the
competing task demands of performance and risk management often come
into conflict. Without strong proof of the hazard and an associated predictive
validity of failure, managers are often reluctant to take actions that could
constrain organizational performance on the basis of a weak and inconclusive
evidence base. This, of course, runs counter to the precautionary approach that is
often advocated within the academic literature (see Calman and Smith, 2001)
and, in the case of NASA, may well have run counter to their own policies on
the raising of concerns by contractors (Boisjoly, 2006).
Extreme events can also be found in geo-physical phenomena such as natu-
ral hazards, extreme weather conditions and longer-term problems such as
climate change (EPA, 2009). The flooding events that occurred in the North
West of England and Southern Scotland in November 2009 illustrate the dif-
ficulties that such ‘extreme’ conditions can generate. Cumbria received some
314mm of rain in a 24 hour period (which was equivalent to two months’
average rainfall) and whilst there was a clear warning that heavy rain was due
to arrive, the scale of the event surprised many people. Again, it is often the
scale of these events that presents challenges around prediction. This, in turn,
can often create quite fundamental difficulties in the provision of mitigating
advice to those who are exposed to risk. There are, for example, attempts to
provide early warnings about the onset of hazards and these can be found in
the tsunami warning systems that are in use on the Pacific coast and in the
monitoring of Vesuvius in Italy. In both cases the potential population deemed
to be ‘at risk’ and the difficulties that exist around effective prediction neces-
sitated the introduction of early warning systems. The absence of such warning
systems in the case of the Indian Ocean tsunami of 2004, and the inability of
agencies to communicate with those at risk, resulted in a death toll of some
276 000 according to some estimates.1
What seems to typify many examples of extreme events is the fact that there
is often a sense of denial held by many before the event, particularly if the
acknowledgement of the likelihood of an accident required considerable
investment and a loss of revenue (see Dilnot, 2008). The lack of historical evi-
dence often led organizations and individuals to argue that the probability of
the hazard being realized was so low as to render the provision of early warn-
ing or other mitigation strategies un-economic. A key element here was that
this ‘denial’ was a function of the lack of a priori evidence that was available
to decision makers, even though the predictive validity associated with many
risk assessments can be low. A study of extreme events should challenge the
ways in which risk is managed in organizations and by governments, espe-
cially at the policy level where the costs of mitigation are high.
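The economic logic behind this denial can be sketched with a simple expected-loss comparison. All figures below are hypothetical and purely illustrative; the point is how sharply the verdict on mitigation spending depends on a probability estimate that, for extreme events, rests on little or no historical evidence.

```python
# Illustrative sketch (hypothetical figures): why a low a priori probability
# can make mitigation look 'un-economic', and how sensitive that judgement
# is to the probability estimate itself.

def expected_annual_loss(p_event: float, consequence: float) -> float:
    """Expected loss per year from an event with annual probability p_event."""
    return p_event * consequence

consequence = 1_000_000_000   # hypothetical damage from the extreme event
mitigation_cost = 5_000_000   # hypothetical annual cost of warning systems/defences

for p in (1e-6, 1e-4, 1e-2):
    loss = expected_annual_loss(p, consequence)
    verdict = "economic" if loss > mitigation_cost else "un-economic"
    print(f"p={p:.0e}: expected annual loss {loss:,.0f} -> mitigation looks {verdict}")
```

A two-order-of-magnitude error in the probability estimate, which is entirely plausible when there is no historical record, flips the conclusion.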
This special issue of Risk Management seeks to examine issues around the
generation and management of extreme events as a starting point in a wider
debate within the journal around such phenomena. The papers selected for this
edition represent an attempt to provide a cross section of theoretical approaches
to dealing with extreme events – they should not be seen as being exhaustive
accounts of the issues but rather the first stages of a longer term debate. The
challenges raised by a study of extreme events will remain a long-term goal of
the journal and the remainder of this editorial attempts to set out some of the
issues that will be considered here and in subsequent volumes.
‘Extreme events’ are typified by being both high consequence and low probabil-
ity events. They are events that have the potential to overwhelm our defences
and yet they occur so infrequently that we are unable to develop enough experi-
ence of them to develop effective management control strategies that are
grounded in the normal trial-and-error learning processes that characterize
organizations. Extreme events are, however, also characterized by the various
attempts made to ‘manage’ them – to prevent the processes of escalation that
move a system from within the boundaries of its normal perturbations towards
an extreme position where it can no longer be controlled and has the potential
to cause considerable levels of damage. Indeed, Taleb et al (2009) suggest that
A Kōan of Risk Management – Recognizing the Limits of Knowledge
Black Swan logic makes what you don’t know far more relevant than what you
do know. Consider that many Black Swans can be caused and exacerbated by
their being unexpected. (Taleb, 2007, p. xix)
The phrase 'How do we know what we don't know?' almost sounds like a
managerial kōan3 – a phrase that has no obvious rational answer but which
causes us to reflect upon our intuition and our emotions. The manner in which
we constrain our thinking, and our reluctance to accept challenges to our dom-
inant paradigms, are important factors in shaping our willingness to consider
rare and seemingly illogical failure pathways.
Seidl (2007) highlights the nature of this problem in what he terms our
‘structures of observation’. We essentially make a decision to carry out a set of
observations around performance as these are deemed to be important to the
ways in which we ‘manage’ (that is, control) the system. We inevitably impose
limits on our observations and ignore those issues that lie outside of these
boundaries. This is, of course, inherently problematic when the system has
potential for emergent properties and where those properties can interact with
© 2010 Macmillan Publishers Ltd. 1460-3799/10 Risk Management Vol. 12, 1, 1–8 5
Editorial Review
other elements of the system to generate failure. Boisjoly, for example, was
concerned at the lack of ‘real world’ data for the performance of the solid
rocket o-rings in relation to cold temperatures. As the boosters were
contracted to work within specified limits, the failure of the o-rings to work
properly would have invalidated that specification as well as having the
potential to initiate a failure. The data that caused Boisjoly concern were only acquired as the
shuttle was used in a real-world setting. Our structures of observation will,
therefore, change over time and if we fail to incorporate that new information
into our decision making then we run the risk of generating potential gaps in
the system's defences.
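One way to picture the incorporation of new field data into decision making is a simple Bayesian update of a failure-probability estimate. This is an illustrative sketch of my own, not a method described in the editorial, and all figures (including the flight counts) are hypothetical:

```python
# Illustrative sketch (hypothetical figures): a Beta-Binomial update showing
# how real-world observations should shift a prior belief about a
# component's failure probability.

def beta_update(alpha: float, beta: float, failures: int, trials: int):
    """Return posterior Beta parameters after observing `failures`
    out of `trials` operations."""
    return alpha + failures, beta + (trials - failures)

def beta_mean(alpha: float, beta: float) -> float:
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Optimistic prior: roughly 1 failure expected in 100 operations.
alpha, beta = 1.0, 99.0
print(f"prior mean failure probability: {beta_mean(alpha, beta):.3f}")

# Hypothetical field data: 3 anomalies observed in 24 cold-weather flights.
alpha, beta = beta_update(alpha, beta, failures=3, trials=24)
print(f"posterior mean failure probability: {beta_mean(alpha, beta):.3f}")
```

A decision process that ignores the update, treating the original prior as fixed, is precisely a structure of observation that has failed to change with the evidence.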
Another problem occurs due to the fact that management generally only
deals with those issues that it can measure as this allows for performance mon-
itoring to take place. There is, of course, a problem with the generation of
emergence in that it isn’t normally part of this performance monitoring process
and so will inevitably sit outside of the organization’s structures of observa-
tion. Seidl argues this in his observation that:
being the likely failure modes and effects that will impact upon their activities.
As such, and on the basis of the information that they have available, manag-
ers within the organization will generate a series of contingency plans and
associated responses to this portfolio of events. If we accept Seidl’s arguments,
then these plans will be shrouded by the potential for failure that sits outside
of the accepted structures for observation and, at any point in time, the per-
ceived processes for escalation that the organization has planned for will be
surrounded by emergent conditions that the organization is not monitoring. As
the ‘event’ unfolds, the responses of the organization to the conditions that
they face will be guided by the contingency plans that are in place. When man-
agers are faced with a set of conditions for which they have little or no experi-
ence, then there is the potential to generate 'points of inflection' (Handy, 1994,
1995) in which the decisions taken can escalate the problem further. These
points of inflection will take on considerable importance once events move
outside of the contingency limits that the organization has planned for, or
where the problems that confront managers sit outside of their normal struc-
tures of observation. In these cases, the potential for escalation to occur that
moves the system beyond the (perceived) worst case scenario is high.
An important element of the processes by which risk is managed concerns
the manner in which we develop and test our knowledge about the problems
that we face. Risk analysis is a robust process when dealing with high proba-
bility events as we invariably have a body of data relating to failure modes and
effects. However, this breaks down when dealing with low probability events
or emergent forms of risk as, by definition, we do not have the same level of
data concerning the phenomena. Problems occur here as organizations seek to
consider the significance of highly damaging events that have an extremely low
probability. However, perhaps more problematic are those low-probability/
low-consequence events that have the potential to act as triggers for high con-
sequence failures. Such triggers expose the vulnerabilities within systems and
societies and inevitably challenge the abilities of those who ‘manage’ the ‘sys-
tem’ to contain the damage that is generated. In some respects, this exposure
of vulnerability is intrinsically linked to the nature of extreme events as or-
ganizational controls are invariably based on the likely credible range of acci-
dents and not the extremes. A key issue that sits at the core of this discussion
on extreme events is, therefore, the processes by which knowledge is construct-
ed and used within organizations and the manner in which that knowledge
can, in turn, shape our vulnerabilities.
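The data problem for low-probability events can be made concrete with the statistical 'rule of three' (a standard rule of thumb, not drawn from this editorial): when zero failures have been observed in n independent trials, an approximate 95 per cent upper confidence bound on the true failure probability is still 3/n. Even a long failure-free history constrains the risk estimate only weakly.

```python
# Illustrative sketch: with zero observed failures in n trials, the data
# place only a weak upper bound on the true failure probability.
# Uses the standard 'rule of three' approximation for a 95% upper bound.

def rule_of_three_upper_bound(n_trials: int) -> float:
    """Approximate 95% upper confidence bound on the failure probability
    when zero failures have been observed in n_trials."""
    return 3.0 / n_trials

for n in (10, 100, 10_000):
    bound = rule_of_three_upper_bound(n)
    print(f"{n} failure-free trials: failure probability could still be ~{bound:.4f}")
```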
an academic field and a system of practice. The papers included in this volume
cover both socio-technical and geo-physical forms of risk in an attempt to
frame some of the wider theoretical issues associated with dealing with
‘extremes’. The study of extreme events presents some fundamental challenges
to management in general and to risk management in particular. Inevitably,
this issue can only touch the surface of the problems associated with extreme
events but if it can stimulate further coverage of the issues in subsequent
volumes of Risk Management then it will have achieved its purpose.
Notes
1 The figure was obtained from the Red Cross Report 'America's Disasters 2004', accessed at www.redcross.org/www-files/Documents/pdf/corppubs/amdisasters2004.pdf.
2 This would also call into question the definition of crisis that was being used.
3 Kōans have their origins in Buddhist thought, where a phrase that has no intuitive rationality is used as a means of attaining enlightenment. The answer is perhaps not the true purpose of the kōan but rather the processes by which that answer is obtained. For our purposes it would be seen as a challenge to our 'boundaries of consideration' around risk.
References
Boisjoly, R.M. (1987) Ethical Decisions – Morton Thiokol and the Space Shuttle Challenger Disaster. The American Society of Mechanical Engineers; 13–18 December, Boston, MA. New York: ASME.
Boisjoly, R.M. (2006) Ethical decisions – Morton Thiokol and the space shuttle Challenger disaster. Online Ethics Center for Engineering, National Academy of Engineering, www.onlineethics.org/CMS/profpractice/ppessays/thiokolshuttle.aspx, accessed 12 November 2009.
Calman, K. and Smith, D. (2001) Works in theory but not in practice? Some notes on the
precautionary principle. Public Administration 79(1): 185–204.
Dilnot, C. (2008) The triumph of greed. New Statesman, 8 December: 36–39.
EPA. (2009) Extreme events, http://www.epa.gov/climatechange/effects/extreme.html.
Handy, C. (1994) The Empty Raincoat: Making Sense of the Future. London: Hutchinson.
Handy, C. (1995) The Age of Unreason: New Thinking for a New World. London: Random House.
Perrow, C. (1984) Normal Accidents. New York: Basic Books.
Seidl, D. (2007) The dark side of knowledge. Emergence: Complexity & Organization
9(3): 16–29.
Taleb, N.N. (2007) The Black Swan. The Impact of the Highly Improbable. London:
Penguin Books.
Taleb, N.N., Goldstein, D.G. and Spitznagel, M.W. (2009) The six mistakes executives
make in risk management. Harvard Business Review 87(10): 78–81.