
Engineering and social risk analysis

This chapter uses physical risk as a means of demonstrating the
way that engineering management decision making can be
negatively influenced by our tendency to model problems in
purely technical terms. Engineering risk analysis is an excellent
example of the way that our engineering values and methods can
blind us to other world views. It demonstrates that, when
engineers are required to make apparently technical decisions,
they may mistakenly ignore other important social constructions of
events.

The technical construction of risk


We engineers are a conservative lot. We do not like risk, and as a
consequence we have developed a culture that is guided by stan-
dards, codes of practice and specifications to ensure that what we
create is safe as well as functional. We have a philosophy of safety,
and our traditional language, using phrases like safety factor,
reflects this philosophy. Only quite recently have design codes
talked of failure probabilities—shamefully admitting our covert,
unacknowledged lack of perfection. Despite our rules, things do
break, collapse and blow up. Engineers have faced the truth that
unanticipated events can spoil their plans, that we do not know the
full nature of the applied loads, the characteristics of the materials,
or the consequences of all the possible interactions—that, in reality,
our world is probabilistic. The point of emphasis varies across
engineering. In some cases it is the extremes of loading arising from
natural forces that have attracted probabilistic attention. In others, it
is the short and long term carrying capacity of new materials, and
in the case of the nuclear and petrochemical industries, it is the
potential catastrophes which may result from the failure of com-
plex, tightly coupled systems. In civil and mechanical engineering
the emphasis has been on failure probabilities (reliability) with the
consequences being only implicitly acknowledged. However, in
those industries dependent on new untested technology, analysis
has been forced to explicitly take account of the potential for death
or injury. As a consequence, risk analysis in these industries has
adopted a form of expected utility theory to link failure probabil-
ities with their outcomes. One such industry, petrochemicals,
illustrates the nature of our taken-for-granted technical risk ration-
ality and its consequences for decision making.

Probabilistic risk analysis


Petrochemical plants and transportation vehicles often contain
material that, if released, would explode, burn or gas people. The
danger may be confined to the immediate vicinity of the container
or may involve the surrounding built-up area. Casualties can be
high. In an effort to design petrochemical facilities in such a way
that they can be considered safe, engineers have made liberal use of
probabilistic risk analysis. At the core of this method is a require-
ment to evaluate the probability of a hazardous event and the
likely consequences for the surrounding population. The units used
are normally estimated frequencies (F) and number of deaths (N),
and risk is defined as a function of these factors (Philipson and
Napadensky, 1982). Thus, as in expected utility theory, a small
probability of a major consequence is deemed equivalent to a high
probability of a small consequence—although the relationship is
not necessarily linear. The risk analysis typically contains the
following steps (European Federation of Chemical Engineers, 1985;
NATO, 1984).
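As a concrete illustration of this F–N formulation, the short sketch below (in Python, using invented figures rather than any published data) shows how a linear expected-value measure of risk treats a rare catastrophe and a frequent minor accident as equivalent.

```python
# A minimal sketch of the F x N (frequency-consequence) measure of risk.
# The two scenarios and their numbers are invented for illustration.

def annual_risk(frequency_per_year: float, deaths_per_event: float) -> float:
    """Risk expressed as expected deaths per year, i.e. F x N."""
    return frequency_per_year * deaths_per_event

# A rare catastrophe and a frequent small accident carry the same expected
# value, so a purely linear F x N measure treats them as equivalent, even
# though the public rarely does.
rare_catastrophe = annual_risk(frequency_per_year=1e-5, deaths_per_event=1000)
frequent_mishap = annual_risk(frequency_per_year=1e-2, deaths_per_event=1)
print(f"rare catastrophe: {rare_catastrophe:.3g} expected deaths/year")
print(f"frequent mishap:  {frequent_mishap:.3g} expected deaths/year")
```

A risk-averse criterion would weight N more heavily than this linear form, reflecting the caveat above that the relationship between probability and consequence is not necessarily linear.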


Quantify hazardous event probabilities


The estimated frequency of a hazardous event in the petrochemical
industry cannot often be quantified directly from records of previ-
ous catastrophes because of their rarity and circumstantial depen-
dence. Hence attempts are made to predict rare large events from a
knowledge of the frequency of more common small failures. The
fault tree is used in large complex plants and links the uncommon
top event to the failure of basic causal events (like a sticking valve),
using Boolean logic. Thus, a knowledge of the past history of
failure of the subunits (such as valves) enables the calculation of
the probability of failure of a critical part of the plant likely to
precipitate a hazardous event. Human error is often included in the
statistics of equipment failure, but nevertheless, remains a major
source of estimation error (Cox, 1982). Event trees are designed to
predict the probability of certain types of outcomes if the nature of
the hazardous event is known. Thus, if the probability of a danger-
ous fault or accident can be estimated from the fault tree or
historical records, then an event tree can be used to calculate the
probability that this will result in various types of consequences.
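The fault tree calculation itself can be sketched in a few lines. The fragment below is a toy example only: it assumes independent basic events, and the component names and failure probabilities are hypothetical rather than drawn from any real plant or data set.

```python
# A toy fault tree evaluation, assuming independent basic events.
# Component names and failure probabilities are hypothetical.

from math import prod

def and_gate(probabilities):
    """All inputs must fail: P = product of the individual probabilities."""
    return prod(probabilities)

def or_gate(probabilities):
    """Any input failing is enough: P = 1 - product of the survival probabilities."""
    return 1.0 - prod(1.0 - p for p in probabilities)

# Basic events: annual failure probabilities taken from (hypothetical) records.
p_valve_sticks = 1e-3
p_sensor_fails = 5e-4
p_operator_misses_alarm = 1e-2

# Top event: a release occurs if the valve sticks AND the protective system
# fails, where the protective system fails if EITHER the sensor fails OR the
# operator misses the alarm.
p_protection_fails = or_gate([p_sensor_fails, p_operator_misses_alarm])
p_top_event = and_gate([p_valve_sticks, p_protection_fails])
print(f"Estimated top-event probability: {p_top_event:.2e} per year")
```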
As an example, we can consider the problem of transporting
LPG through city streets (Dryden and Gawecki, 1987). Firstly, the
tanker route is divided up into segments and the probability of a
tanker crash calculated using transportation accident statistics. This
is the estimated frequency of the top event. Given the occurrence of
this hazardous event, the probability of a range of consequences
can be calculated using data from a fault tree analysis or historical
records. A typical example of an event tree for an LPG tanker
accident is shown in Fig. 10. It can be seen that the probabilities of
really dangerous events such as a large torch fire, a fireball, or an
unconfined vapour cloud explosion (UVCE) are small, but the
consequences in a built-up area could be grave.
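The calculation behind an event tree of this kind is essentially a chain of conditional probabilities. The sketch below is illustrative only: the crash frequency and branch probabilities are invented and are not the values reported by Dryden and Gawecki or shown in Fig. 10.

```python
# Illustrative event tree for an LPG tanker crash. All numbers are invented
# and are not the values shown in Fig. 10.

crash_frequency = 1e-6   # assumed top-event frequency per trip on one route segment

# Conditional probabilities of each outcome, given that a crash has occurred.
outcomes_given_crash = {
    "no release": 0.90,
    "release, no ignition": 0.07,
    "torch fire": 0.02,
    "fireball": 0.007,
    "unconfined vapour cloud explosion (UVCE)": 0.003,
}

# The branches are exhaustive, so their conditional probabilities sum to one.
assert abs(sum(outcomes_given_crash.values()) - 1.0) < 1e-9

# The frequency of each outcome is the top-event frequency multiplied by the
# conditional probabilities along the branch leading to it.
for outcome, p in outcomes_given_crash.items():
    print(f"{outcome:45s} {crash_frequency * p:.2e} per trip")
```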

[Event tree branching from a tanker crash into outcomes, each with its probability and fatality radius (m)]
Fig. 10. Event tree for LPG tanker accident (Dryden and Gawecki, 1987)

Quantify the consequences

A knowledge of the physics of the hazardous event enables an
estimate to be made of the likely casualties. For example, the crash
of an 8 tonne LPG truck that resulted in a UVCE would almost
certainly kill exposed people standing within 85 m of the explosion
and cause injury to those within 170 m (Pitblado, 1984). Thus,
depending on the time of day and the use of the street, the con-
sequences, expressed in terms of deaths, can be calculated for each
route segment.
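As a rough sketch of this consequence step, the fragment below counts the people inside the fatality and injury radii quoted above for each route segment. The population densities, segment names and the assumption of uniform circular exposure are hypothetical simplifications, not part of the published analysis.

```python
# A rough sketch of the consequence step: exposed population inside the
# fatality and injury radii on each route segment. The radii follow the
# 85 m / 170 m figures quoted above; the densities and segments are invented.

from math import pi

FATALITY_RADIUS_M = 85.0
INJURY_RADIUS_M = 170.0

def exposed_people(density_per_km2: float, radius_m: float) -> float:
    """People inside a circle of the given radius at the given density."""
    area_km2 = pi * (radius_m / 1000.0) ** 2
    return density_per_km2 * area_km2

# Daytime and night-time densities (people per km2) for two hypothetical segments.
segments = {
    "industrial estate": {"day": 2000.0, "night": 100.0},
    "shopping street": {"day": 15000.0, "night": 500.0},
}

for name, density in segments.items():
    deaths_day = exposed_people(density["day"], FATALITY_RADIUS_M)
    injuries_day = exposed_people(density["day"], INJURY_RADIUS_M) - deaths_day
    print(f"{name}: ~{deaths_day:.0f} potential fatalities, "
          f"~{injuries_day:.0f} injuries (daytime)")
```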

Quantify the risk


• Individual risk. This is often expressed as the probability
that an individual within an exposed group will be killed
in the next year because of a hazardous event. This is then
compared with an acceptable risk criterion, such as an
increased probability of death of 1 × 10⁻⁶ per year.
• Societal risk. This is often expressed as the estimated
frequency of hazardous events with consequences above
N, where N is the number of casualties. The acceptability
of these frequencies is checked against some standard
such as the one shown in Fig. 11. This was adopted
for the analysis of risks associated with the transport
of dangerous goods in Sydney (Dryden and Gawecki,
1987).

[F–N plot: estimated frequency of events per year (vertical axis) against number of casualties N from 1 to 10 000 (horizontal axis), divided into 'Unacceptable', 'Risk reduction desirable' and 'Acceptable' regions, with a typical societal risk curve plotted for comparison.]
Fig. 11. An example of an FN diagram (Dryden and Gawecki, 1987)
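A minimal sketch of how these two checks might be coded is given below. The individual risk limit of 1 × 10⁻⁶ per year follows the criterion quoted above; the F–N criterion line used here (a frequency limit inversely proportional to N, anchored at an assumed value) is purely illustrative and is not the Sydney standard of Fig. 11.

```python
# Illustrative acceptance checks for individual and societal risk. The F-N
# anchor value and the example event set are assumed, not taken from Fig. 11.

INDIVIDUAL_LIMIT = 1e-6   # additional probability of death per person per year
FN_ANCHOR = 1e-3          # assumed frequency limit at N = 1 (criterion F = FN_ANCHOR / N)

def individual_risk_acceptable(annual_death_probability: float) -> bool:
    """Compare one person's incremental annual risk with the individual criterion."""
    return annual_death_probability <= INDIVIDUAL_LIMIT

def societal_risk_acceptable(events: list[tuple[float, int]]) -> bool:
    """events: (frequency per year, casualties N) for each hazardous outcome.

    Checks the cumulative frequency of events with at least N casualties
    against a criterion line of the form F = FN_ANCHOR / N.
    """
    for _, n in events:
        cumulative_f = sum(f for f, m in events if m >= n)
        if cumulative_f > FN_ANCHOR / n:
            return False
    return True

events = [(5e-6, 2), (5e-7, 50), (1e-8, 500)]   # hypothetical outcome set
print(individual_risk_acceptable(3e-7))   # True: below 1e-6 per year
print(societal_risk_acceptable(events))   # True for this hypothetical set
```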


Often these standards and those for individual risk were origi-
nally deduced from the statistics of natural hazards or common life
hazards that appear to be acceptable to the general public.
Risk analysis as an engineering decision technique appears
rational but has nevertheless attracted much technical criticism
(Whittaker, 1991). Firstly, the role of human error is largely ignored.
Although average small mistakes are accounted for in the statistics
of equipment failure, the incidence of gross human failures is not. Yet
tragic failures such as Bhopal, Challenger, Chernobyl and even the
King's Cross Underground fire can be sheeted home to human
errors that could not have been anticipated by risk analysis
(Reason, 1990). Secondly, the accumulating historical evidence of
failures in the nuclear, aerospace and petrochemical industries
indicates that fault tree predictions grossly underestimate the
probability of hazardous events. This is partly due to the
dominance of unanticipated failure interactions in complex
facilities (Whittaker, 1991). Yet these technical doubts pale before
the torrent of criticism that has been voiced by social scientists. The
essence of this critique is that the technical rationality of risk
analysis and, indeed, applied expected utility theory in general,
does not match humanity's real sense of risk.

Cultures of risk
This chapter is about how we make decisions concerning engineer-
ing hazards. This would appear to be a universal theme indepen-
dent of cultural factors. Such a view is unsustainable if we imagine
ourselves in the position of (for example) a peasant in Bangladesh.
As a poor farmer we expose ourselves to hazards that would be
unthinkable in the rich northern countries. We may face our dan-
gers and hardships with fatalism or bitterness but face them we
must—or die. The fear of starvation eclipses all other fears. Other
fears may also dominate the fear of technological risk, such as those
generated by deeply held religious beliefs. Indeed, from the point
of view of much of the world's population, always living close to

poverty, the discussion of technological risk must seem bizarre or
frivolous. Yet in our industrialized cultures, an obsession with
technological risk is pervasive and growing. We live in what Beck
(1992) calls a 'risk society' where, to some perceptible degree, the
culture of techno-economic development is losing legitimacy to be
replaced by a risk culture. As commentators such as Wartofsky
(1986) have pointed out, this is associated with an increasing ten-
sion between our sense that we should control our own personal
destinies and the apparently increasing imposition on us of tech-
nical hazards. Risk is salient to us because we see that such hazards
can be avoided if we take collective action. Unlike the helpless poor
of the world we feel empowered to demand the mitigation of im-
posed hazards. Whatever risks we are willing to take in our vol-
untary leisure time, we feel indignation when others impose risks
upon us or our families. And we are self confident enough to take
political action in an attempt to frustrate such impositions.
Although industrial cultures are often more sensitive to
technological risks than other cultures, there are groups within
such societies that take a more robust view. Generally speaking,
those involved in or committed to technology or business have less
of a sense of risk than those who do not (Buss et al., 1986; Buss and
Craik, 1983; Deluca et al., 1986). This is understandable, as making
a living is an important motivator for an engineer or factory
worker. It is so much easier for a teacher to worry about the
pollution from a power station than it is for the shift engineer with three
children to bring up. Indeed, a group's attitude to risk can often be
predicted from a knowledge of their attitude towards, and trust in,
industry, government agencies and science—the key institutions of
modernity (Bastide et al., 1989; Otway and von Winterfeldt, 1982;
Bord and O'Connor, 1990).

The psychology of risk


The application of expected utility theory to risk does not corres-
pond to most people's judgement of riskiness. Some low prob-
ability, high consequence risks are feared much more than more
common risks with potentially high impacts (von Winterfeldt et al.,
1981; Kunreuther, 1992). Probabilities are irrelevant because people
either discount them intuitively ('it can't happen to me') or they
focus on the consequences. Why is this?
The work of Slovic et al. (1986) and Slovic (1992) is central to this
question. Using elaborate questionnaires and complex statistical
analysis techniques, these researchers have mapped the attitudes of
diverse groups in the USA to risk, and other researchers have
replicated these findings in other countries.
The key finding is that experts and non-experts think about risk
in very different ways. Risk experts equate the degree of risk
associated with a technology with the likely casualties. Non-
experts could make casualty estimates similar to those of the
experts, and did rate this factor as important. Nevertheless, non-
experts were distinguished by a broader concept of risk. Factors
such as the catastrophe potential, controllability, and threat to
future generations were important. Analysis found that two
clusters of factors were related and important in any assessment
as to whether a technology was risky. The first cluster indicated
how unfamiliar, unknown or delayed were the effects and the
second cluster indicated how uncontrollable, dreaded, cata-
strophic, fatal, unfair, dangerous to future generations, increasing,
and involuntary were the effects. An increase on one or both of
these scales indicated increased fear of technology. Technologies
associated with DNA, radioactivity and toxic chemicals were high on
the dread scale. Many technologies associated with everyday life
and sport that produced significant numbers of deaths were low
on these risk scales. Most people felt that the levels of
technological risk were too high and should be subject to more
regulatory control—particularly those high on the dread scale.
Other work by Stallen and Thomas (1984), in Holland, indicated
that a feeling of personal control was important. In line with
related stress research, they concluded that people cope with
threats much better if they feel that their destiny is in their own
hands.


Amplification
Early work on the psychology of risk indicated that small incidents
associated with technologies high on the dread and unknown
scales produced strong negative reactions (Slovic, 1992). The
nuclear accident at Three Mile Island demonstrated this principle.
Although no fatalities resulted from the leakage of radiation, the
social effects were very great. People saw the accident as a signal or
warning that much more catastrophic accidents could just as easily
occur in that industry. The resulting political fallout was fatal for
US nuclear technology.
Kasperson et al. (1988) have put together a model to explain why
technically minor incidents often cause strong public reactions.
This model uses the analogy of a radio signal to account for the
way the nature of risk changes as it moves from the domain of
experts into the public sphere. The risk event gives out certain
signals which are interpreted in different ways by different groups
and translated into role-dependent languages. Thus, the original
signal is decoded, filtered, subject to cognitive manipulation and
value loaded in a variety of ways. These processes may serve to
radically amplify the signal as the receptor groups pass the
message on. Amplification may result from the volume of news
surrounding the event, the conflict generated between experts, and
the event's real or potential symbolic and dramatic nature. The social
consequences such as litigation, investor flight and community
mobilization may be seen to be a function of this social
amplification rather than the risk event itself.
The reader may note the similarity between this model and the
Brunswik lens model discussed in the next chapter. In a
Brunswikean sense the perceptual cues associated with the event
are selected and integrated in different ways by different people.
An engineer will tend to look at the physical damage, its causes
and casualties. The surrounding population will notice those cues
that indicate the long term safety of the facility and the worth of the
safety features. Officials will be concerned with the aspects that
contravene regulations or indicate that regulations may have to be
modified. The press, of course, are looking for sensational cues that
will sell newspapers. If they cannot find them at the site of the
incident they can always wheel out a hired expert to point
the finger of blame and warn of the potential for catastrophe. The
owner will tend to move from a pragmatic mood confined to the
physical and production consequences, through an increasingly
alarmed period where the publicity and conflict surrounding the
plant seems to be getting out of hand, to the astonished realization
that resulting commercial damage was unrelated to the engineer-
ing damage.

Stigma
An important social factor in the way non-specialists react to risky
technology is the phenomenon of stigmatization. For example,
waste disposal sites may be judged as repellent, ugly, upsetting and
disruptive as well as dangerous. These are all negative stig-
mata associated with such facilities which are easily passed on to
associated entities. For example, Slovic (1992) cites the example of
dairy cows that became slightly contaminated by polychlorinated
biphenyl (PCB). Even though the dairy was within health limits the
stigma associated with PCB was sufficient to cause consumer
flight. Whole cities or states can become stigmatized by an
association with a source of fear. Slovic describes a study to look
at the impact of locating a nuclear waste storage facility in the
Yucca Mountains in Nevada. Using out-of-state subjects, the team
was able to demonstrate that the stigma of such a site was suffici-
ently strong to have a potential effect on the attractiveness of
Nevada, and Las Vegas in particular, as a tourist destination. The
potential for the stigmatization of the whole state was such that
any further investigatory work in the Yucca Mountains was
actively resisted by the State Government (Slovic, 1992).
From an engineer's point of view, we must face the possibility
that the phenomenon of stigmatization has permanently affected
the public's view of certain industrial sectors. For example, in the
word association tests used by Slovic and the consulting team, the
most frequent word associated with chemicals was 'dangerous', or a
closely related term such as toxic, hazardous, poison or deadly.
Beneficial uses of chemicals were rarely mentioned.

Managing hazardous decisions


It is increasingly apparent that the engineering sciences face a
historic turning point: they can continue to think and work in
the worn-out ways of the nineteenth century. Then they will
confuse the problems of the risk society with those of early
industrial society. Or they can face the challenges of a genuine,
preventive management of risks. Then they must rethink and
change their own conceptions of rationality, knowledge and
practice, as well as the institutional structures in which these
are put to work (Beck, 1992, p. 71).
The ability of an engineering manager to successfully manage de-
cisions involving risk is limited by the magnitude of the social
forces associated with the risk society. The sensitivity of the public
to risk will increase despite the efforts of engineers to mitigate the
negative effects of technology. Moreover, there is a danger that the
whole of the engineering profession will be stigmatized in the
process, rendering engineers a liability in decision making. Never-
theless, certain fundamentals are important when making deci-
sions that involve a risk to life.
1. Do your risk analysis carefully. Your professional imperatives
demand that you should be sure that all practical precautions
have been taken, and that the analysis is robust enough to
withstand peer group scrutiny.
2. Analyse the potential public reaction to your decision using a
knowledge of the psychology of risk and the effects of
amplification and stigmatization.
3. Give these social factors equal weight to the risk analysis and
make your judgement holistically.

4. Use group or strategic decision management techniques to
coordinate your efforts.
5. Communicate with all parties in an open and honest fashion,
remembering that trust is the key to success.
6. Communicate in terms of consequences not probabilities.
7. Emphasize safety not risk; in particular, the safety record of
your profession, its ethic of public safety, and the trust, built
up over hundreds of years, that engineers are the protectors
of the community.

Summary
• Engineers often structure physical risk as a function of the
probability of a hazard and its negative consequences.
• The veracity of the technical analysis of physical risk is
often degraded because of the difficulties of estimating
event probabilities and an ignorance of human error.
• The general public do not use the same rationality as
engineers. Hazardous technology is often feared for
cultural and psychological reasons unrelated to casualties
or probabilities. For this reason, decision making in this
field must take account of both the technical and social
construction of risk.
