Fig. 10. Event tree for LPG tanker accident, with the probability of each outcome shown (Dryden and Gawecki, 1987)
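The arithmetic behind such a tree is conditional branching: the probability of each outcome is the product of the branch probabilities along its path, and the outcomes together account for a total probability of one. The short Python sketch below illustrates this with purely hypothetical branch values; it does not attempt to reconstruct the Dryden and Gawecki tree.

```python
# A minimal sketch of event tree arithmetic, using purely hypothetical branch
# probabilities (not the values from the Dryden and Gawecki tree). Each
# outcome's probability is the product of the conditional branch probabilities
# along its path, and all outcomes together must sum to one.

p_release = 0.001     # hypothetical probability of a loss of containment
p_ignition = 0.3      # hypothetical probability of ignition given a release
p_immediate = 0.4     # hypothetical probability that ignition is immediate

outcomes = {
    "no release":                  1 - p_release,
    "release, no ignition":        p_release * (1 - p_ignition),
    "release, immediate ignition": p_release * p_ignition * p_immediate,
    "release, delayed ignition":   p_release * p_ignition * (1 - p_immediate),
}

for name, p in outcomes.items():
    print(f"{name:30s} {p:.6f}")

# The leaves of a complete event tree are exhaustive and mutually exclusive,
# so their probabilities must sum to one.
assert abs(sum(outcomes.values()) - 1.0) < 1e-12
```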
Fig. 11. An example of an FN diagram, with the unacceptable region marked (Dryden and Gawecki, 1987)
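An FN diagram plots the cumulative frequency F of accidents causing N or more fatalities against N, usually on logarithmic axes, with criterion lines separating unacceptable from broadly acceptable regions. The sketch below shows how such a curve can be assembled from a set of scenario estimates; the frequencies and fatality numbers are hypothetical and serve only to illustrate the cumulative calculation.

```python
# A minimal sketch of how an FN curve is assembled from scenario estimates.
# Each scenario has a frequency (per year) and a number of fatalities; F(N)
# is the combined frequency of all scenarios causing N or more fatalities.
# The figures below are hypothetical, for illustration only.

scenarios = [
    # (frequency per year, fatalities)
    (1e-3, 1),
    (1e-4, 10),
    (1e-5, 100),
    (1e-6, 1000),
]

def f_of_n(n, scenarios):
    """Cumulative frequency of accidents causing n or more fatalities."""
    return sum(freq for freq, fatalities in scenarios if fatalities >= n)

for n in (1, 10, 100, 1000, 10_000):
    print(f"N = {n:>6}: F = {f_of_n(n, scenarios):.2e} per year")

# Plotted on logarithmic axes, the (N, F) pairs trace the stepped curve that
# is compared against 'unacceptable' and 'broadly acceptable' criterion lines.
```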
Often these standards and those for individual risk were originally deduced from the statistics of natural hazards or common life hazards that appear to be acceptable to the general public.
Risk analysis as an engineering decision technique appears
rational but has nevertheless attracted much technical criticism
(Whittaker, 1991). Firstly, the role of human error is largely ignored.
Although average small mistakes are accounted for in the statistics
of equipment failure, the incidence of gross failure is ignored. Yet
tragic failures such as Bhopal, Challenger, Chernobyl and even the
King's Cross Underground fire can be sheeted home to human
errors that could not have been anticipated by risk analysis
(Reason, 1990). Secondly, the accumulating historical evidence of
failures in the nuclear, aerospace and petrochemical industries
indicates that fault tree predictions grossly underestimate the
probability of hazardous events. This is partly due to the
dominance of unanticipated failure interactions in complex
facilities (Whittaker, 1991). Yet these technical doubts pale before
the torrent of criticism that has been voiced by social scientists. The
essence of this critique is that the technical rationality of risk
analysis and, indeed, applied expected utility theory in general,
does not match humanity's real sense of risk.
Cultures of risk
This chapter is about how we make decisions concerning engineer-
ing hazards. This would appear to be a universal theme indepen-
dent of cultural factors. Such a view is unsustainable if we imagine
ourselves in the position of (for example) a peasant in Bangladesh.
As a poor farmer we expose ourselves to hazards that would be
unthinkable in the rich northern countries. We may face our dan-
gers and hardships with fatalism or bitterness but face them we
must—or die. The fear of starvation eclipses all other fears. Other
fears may also dominate the fear of technological risk, such as those
generated by deeply held religious beliefs. Indeed, from the point
of view of much of the world's population, always living close to
Low probability, high consequence risks are feared much more than more common risks with potentially high impacts (von Winterfeldt et al., 1981; Kunreuther, 1992). Probabilities are irrelevant because people either discount them intuitively ('it can't happen to me') or focus on the consequences. Why is this?
The work of Slovic et al. (1986) and Slovic (1992) is central to this question. Using elaborate questionnaires and complex statistical analysis techniques, these researchers have mapped the attitudes of diverse groups in the USA to risk, and other researchers have replicated their findings in other countries.
The key finding is that experts and non-experts think about risk in very different ways. Risk experts equate the degree of risk associated with a technology with the likely casualties. Non-experts could make casualty estimates similar to those of the experts, and did rate this factor as important. Nevertheless, non-experts were distinguished by a broader concept of risk: factors such as catastrophe potential, controllability and threat to future generations were also important. Analysis found that two clusters of factors were related and important in any assessment of whether a technology was risky. The first cluster indicated how unfamiliar, unknown or delayed the effects were, and the second indicated how uncontrollable, dreaded, catastrophic, fatal, unfair, dangerous to future generations, increasing and involuntary they were. An increase on one or both of these scales indicated increased fear of a technology. Technologies associated with DNA, radioactivity and toxic chemicals were high on the dread scale. Many technologies associated with everyday life and sport that produced significant numbers of deaths were low on these risk scales. Most people felt that the levels of technological risk were too high and should be subject to more regulatory control, particularly for technologies high on the dread scale.
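One crude way to picture this two-factor structure is to average questionnaire ratings over the attributes that load on each cluster, so that every technology is placed at a point on a dread scale and an unknown scale. The Python sketch below does exactly that with invented ratings and attribute groupings; it illustrates the idea only and does not reproduce Slovic's data or factor loadings.

```python
# An illustrative sketch of the two-factor picture of perceived risk. Mean
# ratings (1 = low, 7 = high) over attributes loading on the 'dread' and
# 'unknown' clusters place each technology on the two scales. All numbers
# and attribute groupings are invented; they are not Slovic's data.

DREAD_ATTRS = ("uncontrollable", "catastrophic", "involuntary", "fatal")
UNKNOWN_ATTRS = ("unfamiliar", "unobservable", "delayed_effects")

ratings = {
    "nuclear power": {
        "uncontrollable": 6, "catastrophic": 7, "involuntary": 6, "fatal": 6,
        "unfamiliar": 6, "unobservable": 6, "delayed_effects": 7,
    },
    "skiing": {
        "uncontrollable": 2, "catastrophic": 1, "involuntary": 1, "fatal": 3,
        "unfamiliar": 1, "unobservable": 1, "delayed_effects": 1,
    },
}

def scale_score(tech_ratings, attrs):
    """Mean rating over the attributes assigned to one cluster."""
    return sum(tech_ratings[a] for a in attrs) / len(attrs)

for tech, r in ratings.items():
    print(f"{tech:15s} dread = {scale_score(r, DREAD_ATTRS):.1f}, "
          f"unknown = {scale_score(r, UNKNOWN_ATTRS):.1f}")
```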
Other work by Stallen and Thomas (1984), in Holland, indicated
that a feeling of personal control was important. In line with
related stress research, they concluded that people cope with
threats much better if they feel that their destiny is in their own
hands.
Amplification
Early work on the psychology of risk indicated that small incidents
associated with technologies high on the dread and unknown
scales produced strong negative reactions (Slovic, 1992). The
nuclear accident at Three Mile Island demonstrated this principle.
Although no fatalities resulted from the leakage of radiation, the
social effects were very great. People saw the accident as a signal or
warning that much more catastrophic accidents could just as easily
occur in that industry. The resulting political fallout was fatal for
US nuclear technology.
Kasperson et al. (1988) have put together a model to explain why
technically minor incidents often cause strong public reactions.
This model uses the analogy of a radio signal to account for the
way the nature of risk changes as it moves from the domain of
experts into the public sphere. The risk event gives out certain
signals which are interpreted in different ways by different groups
and translated into role-dependent languages. Thus, the original
signal is decoded, filtered, subjected to cognitive manipulation and value-loaded in a variety of ways. These processes may serve to
radically amplify the signal as the receptor groups pass the
message on. Amplification may result from the volume of news surrounding the event, the conflict generated between experts, and the event's real or potential symbolic and dramatic nature. The social
consequences such as litigation, investor flight and community
mobilization may be seen to be a function of this social
amplification rather than the risk event itself.
The reader may note the similarity between this model and the
Brunswik lens model discussed in the next chapter. In a
Brunswikean sense the perceptual cues associated with the event
are selected and integrated in different ways by different people.
An engineer will tend to look at the physical damage, its causes
and casualties. The surrounding population will notice those cues
that indicate the long term safety of the facility and the worth of the
safety features. Officials will be concerned with the aspects that
contravene regulations or indicate that regulations may have to be
modified. The press, of course, are looking for sensational cues that
will sell newspapers. If they cannot find them at the site of the
incident they can always wheel out a hired expert to point
the finger of blame and warn of the potential for catastrophe. The
owner will tend to move from a pragmatic mood confined to the
physical and production consequences, through an increasingly
alarmed period where the publicity and conflict surrounding the plant seem to be getting out of hand, to the astonished realization that the resulting commercial damage was unrelated to the engineering damage.
Stigma
An important social factor in the way non-specialists react to risky
technology is the phenomenon of stigmatization. For example,
waste disposal sites may be judged as repellent, ugly, upsetting and
disruptive as well as dangerous. These are all negative stig-
mata associated with such facilities which are easily passed on to
associated entities. For example, Slovic (1992) cites the example of
dairy cows that became slightly contaminated by polychlorinated
biphenyl (PCB). Even though the dairy was within health limits, the
stigma associated with PCB was sufficient to cause consumer
flight. Whole cities or states can become stigmatized by an
association with a source of fear. Slovic describes a study to look
at the impact of locating a nuclear waste storage facility in the
Yucca Mountains in Nevada. Using out-of-state subjects, the team
was able to demonstrate that the stigma of such a site was suffici-
ently strong to have a potential effect on the attractiveness of
Nevada, and Las Vegas in particular, as a tourist destination. The
potential for the stigmatization of the whole state was such that
any further investigatory work in the Yucca Mountains was
actively resisted by the State Government (Slovic, 1992).
From an engineer's point of view, we must face the possibility
that the phenomenon of stigmatization has permanently affected
the public's view of certain industrial sectors. For example, in the
word association tests used by Slovic and the consulting team, the
most frequent word associated with chemicals was 'dangerous', followed by closely related terms such as toxic, hazardous, poison and deadly. Beneficial uses of chemicals were rarely mentioned.
Summary
Engineers often structure physical risk as a function of the
probability of a hazard and its negative consequences.
The veracity of the technical analysis of physical risk is
often degraded because of the difficulties of estimating
event probabilities and an ignorance of human error.
The general public do not use the same rationality as
engineers. Hazardous technology is often feared for
cultural and psychological reasons unrelated to casualties
or probabilities. For this reason, decision making in this
field must take account of both the technical and social
construction of risk.