An engineer's view of human error

Third edition

Trevor Kletz

The theme of this book:

Try to change situations, not people


The information in this book is given in good faith and belief in its accuracy, but does not imply the acceptance of any legal liability or responsibility whatsoever, by the Institution, or by the author, for the consequences of its use or misuse in any particular circumstances.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the publisher.

Published by
Institution of Chemical Engineers (IChemE),
Davis Building, 165–189 Railway Terrace, Rugby, Warwickshire CV21 3HQ, UK
IChemE is a Registered Charity

© 2001 Trevor Kletz

First edition 1985
Second edition 1991

ISBN 0 85295 430 1

Printed in the United Kingdom by Bell & Bain Limited, Glasgow


Foreword to the third edition

In this book I set down my views on human error as a cause of accidents and illustrate them by describing a number of accidents that have occurred, mainly in the oil and chemical industries. Though the book is particularly addressed to those who work in the process industries, I hope that it will interest all who design, construct, operate and maintain plant of all sorts, that it may suggest to some readers new ways of looking at accident prevention and that it will reinforce the views of those readers who already try to change situations rather than people. It is intended for practising engineers, especially chemical engineers rather than experts in human factors and ergonomics, and I hope it will be of particular interest to students. Although human factors are not specifically mentioned in the Institution of Chemical Engineers' publication Accreditation of University Degree Courses — A Guide for University Departments (November 1996), they are essential if the course is to 'develop an awareness of the necessity of safe design', which is included.

Many of the accidents I describe can be used as the raw material for discussions on the causes of accidents and the action needed to prevent them happening again, using the methods described in the Institution's safety training packages, collections of notes and slides on accidents that have occurred. Some of the incidents are included in the package on Human Error and other packages.

Some readers may wonder why I have added to the existing literature on human error. When so much has been written, is there need for more? I felt that much of the existing literature is too theoretical for the audience I have in mind, or devoted to particular aspects of the problem. I felt there was a need for a book which would suggest to engineers how they might approach the problem of human error and do so by describing accidents which at first sight seem to be the result of human error. My approach is therefore pragmatic rather than theoretical, and influenced by engineering methods. Thus I have:


• questioned the accepted wisdom;
• started with situations as they are rather than as we would like them to be;
• judged possible actions by their effectiveness;
• suggested actions based on the best current information, as in industry we often cannot postpone decisions until more information is available.

I do not claim any great originality for the ideas in this book. Many of them are to be found in other books such as Man-Machine Engineering by A. Chapanis (Tavistock, 1965) but while that book is primarily concerned with mechanical and control equipment, this one is concerned mainly with process equipment.

Not all readers may agree with the actions I propose. If you do not, may I suggest that you decide what action you think should be taken. Please do not ignore the accidents. They happened and will happen again, unless action is taken to prevent them happening.

Thanks are due to the many friends and colleagues, past and present, who suggested ideas for this book or commented on the draft — without their contributions I could not have produced this book — and to Mr S. Coulson who prepared the illustrations for Chapter 14. Thanks are also due to the many companies who allowed me to describe their mistakes and to the Science and Engineering Research Council for financial support for the earlier editions.

In the second edition I added three new chapters, on 'Accidents that could be prevented by better management', on 'Errors in computer-controlled plants' (as people, not computers, make the errors) and on 'Some final thoughts', and made a number of additions to the existing chapters. In this third edition I have added more examples of accidents caused by the various types of human error. I have extended the chapters on errors made by managers and designers and now include errors due to their ignorance of various options such as inherently safer design. I have also expanded the chapter on computer control and added an appendix on 'Some myths of human error', widespread beliefs that are not wholly true. John Doe has been joined in his adventures by Joe Soap but the reasons for their errors are very different.

I have often been irritated by authors who use phrases such as 'as discussed in an earlier chapter', without saying which one. I have therefore included cross-references whenever a topic is discussed under more than one heading.

US readers should note that in the UK 'supervisor' is another name for a foreman while 'manager' is used to describe anyone above the level of foreman, including those who are known as supervisors or superintendents in most US companies. Though there has been a welcome increase in the number of women employed in the process industries, the manager, designer or accident victim is still usually a man. To avoid the clumsy phrases 'he or she' and 'him or her' I have usually used 'he' and 'him'.

Trevor Kletz
May 2001


Glossary

Some words are used in this book in a specialized sense.

Error
A failure to carry out a task in the way intended by the person performing it, the way expected by other people or in a way that achieves the desired objective. This definition has been worded so as to include the various types of error listed below. It is consistent with the Shorter Oxford English Dictionary, which defines error as 'something incorrectly done through ignorance or inadvertence' and as 'a transgression'. Some writers use error in a narrower sense to include only slips and lapses of attention.

Mistake
An error that occurs as a result of ignorance of the correct task or the correct way to perform it. The intention is fulfilled but the intention is wrong (see Chapter 3).

Mismatch
An error that occurs because the task is beyond the physical or mental ability of the person asked to perform it, perhaps beyond anyone's ability (see Chapter 4).

Non-compliance or violation
An error that occurs because someone decides not to carry out a task or not to carry it out in the way instructed or expected. The motive can range from sabotage, through 'can't be bothered', to a belief that the instructions were incorrect. In such cases the non-compliance may prevent a mistake (as defined above) (see Chapter 5).

Slips and lapses of attention
Errors that occur as a result of forgetfulness, habit, fatigue or similar psychological causes. Compared with mistakes (as defined above), the intention is correct but it is not fulfilled (see Chapter 2).

The opening words of Tolstoy's Anna Karenina are, 'All happy families resemble one another, each unhappy family is unhappy in its own way.' Similarly, there is a right way for every task but each error is erroneous in its own way.

Contents

Foreword to the third edition

1 Introduction
1.1 Accept men as we find them
1.2 Meccano or dolls?
1.3 Types of human error
1.4 Two simple examples
1.5 Accident investigation
1.6 A story
1.7 Research on human error

2 Accidents caused by simple slips
2.1 Introduction
2.2 Forgetting to open or close a valve
2.3 Operating the wrong valve
2.4 Pressing the wrong button
2.5 Failures to notice
2.6 Wrong connections
2.7 Errors in calculations
2.8 Other medical errors
2.9 Railways
2.10 Other industries
2.11 Everyday life (and typing)
2.12 Fatigue

3 Accidents that could be prevented by better training or instructions
3.1 Introduction
3.2 Three Mile Island
3.3 Other accidents that could be prevented by relatively sophisticated training
3.4 Accidents that could be prevented by elementary training
3.5 Contradictory instructions
3.6 Knowledge of what we don't know
3.7 Some simple ways of improving instructions
3.8 Training or instructions?
3.9 Cases when training is not the best answer

4 Accidents due to a lack of physical or mental ability
4.1 People asked to do the physically difficult or impossible
4.2 People asked to do the mentally difficult or impossible
4.3 Individual traits and accident proneness
4.4 Mind-sets

5 Accidents due to failures to follow instructions
5.1 Accidents due to non-compliance by managers
5.2 Accidents due to non-compliance by operators
5.3 Actions to improve compliance
5.4 Alienation
5.5 Postscript

6 Accidents that could be prevented by better management
6.1 An accident caused by insularity
6.2 An accident due to amateurism
6.3 The fire at King's Cross railway station
6.4 The Herald of Free Enterprise
6.5 The Clapham Junction railway accident
6.6 Piper Alpha
6.7 What more can senior managers do?
6.8 The measurement of safety
6.9 Conclusions

7 The probability of human error
7.1 Why do we need to know human error rates?
7.2 Human error rates — a simple example
7.3 A more complex example
7.4 Other estimates of human error rates
7.5 Two more simple examples
7.6 Button pressing
7.7 Non-process operations
7.8 Train driver errors
7.9 Some pitfalls in using data on human reliability
7.10 Data on equipment may be data on people
7.11 Who makes the errors?
7.12 Conclusions

8 Some accidents that could be prevented by better design
8.1 Isolation of protective equipment
8.2 Better information display
8.3 Pipe failures
8.4 Vessel failures
8.5 The Sellafield leak
8.6 Other design errors
8.7 Conceptual shortcomings
8.8 Problems of design contractors
8.9 Domestic accidents

9 Some accidents that could be prevented by better construction
9.1 Pipe failures
9.2 Miscellaneous incidents
9.3 Prevention of construction errors

10 Some accidents that could be prevented by better maintenance
10.1 Incidents which occurred because people did not understand how equipment worked
10.2 Incidents which occurred because of poor maintenance practice
10.3 Incidents due to gross ignorance or incompetence
10.4 Incidents which occurred because people took short cuts
10.5 Incidents which could be prevented by more frequent or better maintenance
10.6 Can we avoid the need for so much maintenance?

11 Some accidents that could be prevented by better methods of operation
11.1 Permits-to-work
11.2 Tanker incidents
11.3 Some incidents that could be prevented by better instructions
11.4 Some incidents involving hoses
11.5 Communication failures
11.6 Examples from the railways
11.7 Simple causes in high tech industries

12 Errors in computer-controlled plants
12.1 Hardware failures
12.2 Software errors
12.3 Specification errors
12.4 Misjudging responses to a computer
12.5 Entering the wrong data
12.6 Failures to tell operators of changes in data or programs
12.7 Unauthorized interference with hardware or software
12.8 The hazards of old software
12.9 Other applications of computers
12.10 Conclusions

13 Personal and managerial responsibility
13.1 Personal responsibility
13.2 Legal views
13.3 Blame in accident investigations
13.4 Managerial wickedness
13.5 Managerial competence
13.6 Possible and necessary

14 The adventures of Joe Soap and John Doe

15 Some final thoughts
Postscript

Appendix 1 — Influences on morale
Appendix 2 — Some myths of human error
Appendix 3 — Some thoughts on sonata form

Further reading
Index

'... decreasing the amount of human responsibility in the operation of the plant increases the amount of human responsibility in the design of the plant ...'
The Use of Computers in Safety-critical Applications (HSE Books, UK, 1998), page 13

'... there is a belief amongst many engineers and managers that human error is both inevitable and unpredictable. However, human error is inevitable only if people are placed in situations that emphasise human weaknesses and that do not support human strengths.'
Martin Anderson, IChemE Safety and Loss Prevention Subject Group Newsletter, Spring 1999

'... unlike all other organisms, Homo sapiens adapts not through modification of its gene pool to accommodate the environment but by manipulating the environment to accommodate the gene pool.'
M.T. Smith and R. Layton, The Sciences, 1989, 29(1), 10

'If the honeye that the bees gather out of so manye floure of herbes ... that are growing in other mennis medowes ... maye justly be called the bees' honeye ... so maye I call it that I have ... gathered of manye good autores ... my booke.'
William Turner (1510–1568). Quoted by Frank Lees in Loss Prevention in the Process Industries, 2nd edition (Butterworth-Heinemann, Oxford, UK, 1996) and reproduced here as a tribute to one who did so much to further the subject.

1 Introduction

'Man is a creature made at the end of the week when God was tired.'
Mark Twain

1.1 Accept men as we find them

The theme of this book is that it is difficult for engineers to change human nature and therefore, instead of trying to persuade people not to make errors, we should accept people as we find them and try to remove opportunities for error by changing the work situation — that is, the plant or equipment design or the method of working. Alternatively, we can mitigate the consequences of error or provide opportunities for recovery. (When it is possible for them to do so, people are better at correcting errors than at not making them.)

The method used is to describe accidents which at first sight were due to human error and then discuss the most effective ways of preventing them happening again. The accidents occurred mainly, though not entirely, in the oil and chemical industries, but nevertheless should interest all engineers, not just chemical engineers, and indeed all those who work in design or production. I developed my views as the result of investigating accidents and reading accident reports. I did not collect incident reports to illustrate or support my views on prevention. Apart from their intrinsic interest, the accident reports will, I hope, grab the reader's attention and encourage him or her to read on. They are also more important than the advice. You may not agree with my recommendations but you should not ignore the reports. I hope the book will remind engineers of some of the quirks of human nature so that they can better allow for them in design.

Browsing through old ICI files I came across a report dating from the late 1920s in which one of the company's first safety officers announced a new discovery: after reading many accident reports he had realized that most accidents are due to human failing, meaning by that the failure of the injured man or a fellow-worker. The remedy was obvious. We must persuade people to take more care. Since then people have been exhorted to do just this, and this policy has been supported by tables of accident statistics from many companies which show that over 50%, sometimes as many as 90%, of industrial accidents are due to human failing.

(Managers and designers, of course, are not human or do not fail.) This is comforting for managers. It implies that there is little or nothing they can do to stop most accidents.

Many years ago, when I was a manager, not a safety adviser, I looked through a bunch of accident reports and realized that most of the accidents could be prevented by better management — sometimes by better design or method of working, sometimes by better training or instructions, sometimes by better enforcement of the instructions. There was, of course, an element of human failing in the accidents. They would not have occurred if someone had not forgotten to close a valve, looked where he was going, not taken a short-cut, etc. But what chance do we have of persuading people not to do these things?

To say that accidents are due to human failing is not so much untrue as unhelpful, for three reasons:

(1) Every accident is due to human error: someone, usually a manager, has to decide what to do; someone, usually a designer, has to decide how to do it; someone, usually an operator, has to do it. All of them can make errors but the operator is at the end of the chain and often gets all the blame. We should consider the people who have opportunities to prevent accidents by changing objectives and methods as well as those who actually carry out operations (see Appendix 2, item 1, page 261).

(2) Saying an accident is due to human failing is about as helpful as saying that a fall is due to gravity. It is true but it does not lead to constructive action. Instead it merely tempts us to tell someone to be more careful. But no-one is deliberately careless; telling people to take more care will not, without management action of some sort, prevent an accident happening again. Instead we should look for changes in design or methods of working that can prevent the accident happening again.

(3) The phrase 'human error' lumps together different sorts of failure that require quite different actions to prevent them happening again (see Section 1.3, page 4).

If all accidents are due to human errors, how does this book differ from any other book on accidents? It describes accidents which at first sight seem to be due wholly or mainly to human error, accidents which at one time would have been followed by exhortations to take more care or follow the rules, and emphasizes what can be done by changing designs or methods of working. The latter phrase includes training, instructions, audits and enforcement as well as the way a task is performed. Together these may be called changing the work situation.

It is better to say that an accident can be prevented by better design, better instructions, etc., than to say it was caused by bad design, etc. Cause implies blame and we become defensive. We do not like to admit that we did something badly, but we are willing to admit that we could do it better.

I do not say that it is impossible to change people's tendency to make errors. Those more qualified than engineers to do so — teachers, clergymen, social workers, psychologists — will no doubt continue to try and we wish them success. But the results achieved in the last few thousand years suggest that their results will be neither rapid nor spectacular and where experts achieve so little, engineers are likely to achieve less. We can, to some extent, change people's performance by better training and instructions, better supervision and better motivation. What we cannot do is enable people to carry out tasks beyond their physical or mental abilities or prevent them making occasional slips or having lapses of attention. We can, however, reduce the opportunities for such slips and lapses of attention by changing designs or methods of working.

Let us therefore accept that people are the one component of the systems we design that we cannot redesign or modify. We can design better pumps, compressors, distillation columns, etc., but we are left with Mark I man and woman. People are actually very reliable but there are many opportunities for error in the course of a day's work and when handling hazardous materials we can tolerate only very low error rates (and equipment failure rates), lower than it may be possible to achieve. We may be able to keep up a tip-top performance for an hour or two while playing a game or a piece of music but we cannot keep it up all day, every day. Whenever possible, therefore, we should design user-friendly plants which can tolerate human error (or equipment failure) without serious effects on safety, output or efficiency³.

1.2 Meccano or dolls?

Let me emphasize that when I suggest changing the work situation, I am not simply saying change the hardware. Safety by design should always be the aim, but it is not always possible and economic; sometimes redesign is impossible, or too expensive, and we have to modify procedures. In over half the accidents that occur there is no reasonably practical way of preventing a repetition by a change in design and we have to change the software — the method of working.

At present, most engineers are men and as boys most of us played with Meccano rather than dolls. We were interested in machines and the way they work.

Most of us are very happy to devise hardware solutions, otherwise we would not be engineers. We are less happy when it comes to software solutions — writing instructions, persuading people to follow them, devising new training programmes or methods, checking up to see that they are being followed and so on. But these actions are just as important as the hardware ones, and require as much of our effort and attention.

One reason we are less happy with software solutions is that continual effort — what I have called grey hairs¹ — is needed to prevent them disappearing. It is easy to install new protective equipment — all you have to do is persuade someone to provide the money. If a hazard can be removed or controlled by modifying the hardware or installing extra hardware, we may have to fight for the money, but once we get it and the equipment is modified or installed it is unlikely to disappear. In contrast, if a hazard is controlled by modifying a procedure or introducing extra training, we may have less difficulty getting approval, but the new procedure or training programme may vanish without trace in a few months once we lose interest. Procedures lapse, trainers leave and are not replaced. Procedures are subject to a form of corrosion more rapid and thorough than that which affects the steelwork. A continuous management effort — grey hairs — is needed to maintain our systems. No wonder we prefer safety by design whenever it is possible and economic. However, as we shall see, when we do go for safety by design, the new equipment may have to be tested and maintained. You will get more grey hairs seeing that the equipment is tested and maintained and that people are trained to use it properly and do not try to disarm it.

Human errors occur for various reasons and different actions are needed to prevent or avoid the different sorts of error. Unfortunately much of the literature on human error groups together widely different phenomena which call for different action, as if a book on transport discussed jet travel and seaside donkey rides under the same headings (such as costs, maintenance and publicity). Most classification systems are designed primarily to help us find the information. I have used a system that helps us find the most effective way of preventing the accidents happening again. I find it useful to classify human errors as shown below.

1.3 Types of human error

• Errors due to a slip or momentary lapse of attention (discussed in Chapter 2).

The intention is correct but the wrong action or no action is taken. We should reduce opportunities for errors by changing the work situation.
• Errors due to poor training or instructions (discussed in Chapter 3). Someone does not know what to do or, worse, thinks he knows but does not. The intention is carried out but is wrong. These are called mistakes. We need to improve the training or instructions or simplify the job.
• Errors which occur because a task is beyond the physical or mental ability of the person asked to do it, perhaps beyond anyone's ability (discussed in Chapter 4). There is a mismatch between the ability of the person and the requirements of the task. We need to change the work situation.
• Errors due to a deliberate decision not to follow instructions or accepted practice (discussed in Chapter 5). These are often called violations but non-compliance is a better term, as people often believe that the rule is wrong or that circumstances justify an exception. Note that if the instructions were wrong, non-compliance may prevent a mistake (as defined above). There is a fine line between initiative and breaking the rules. We should ask why the rules were not followed. Did someone not understand the reasons for the instructions, were they difficult to follow, have supervisors turned a blind eye in the past?

Chapter 6 discusses those errors made by managers, especially senior managers, because they do not realize that they could do more to prevent accidents. These errors are not a fifth category but are mainly due to ignorance of what is possible. Many managers do not realize that they need training and there is no-one who can tell them. Today it is widely recognized that all accidents are management failings: failures to reduce opportunities for error, failures to provide adequate training and instruction, failures to ensure that people follow the rules, and so on. Chapter 6, however, looks at management failures in a narrower sense.

The boundaries between these categories are not clear-cut and often more than one factor may be at work. Thus an error might be due to poor training compounded by limited ability (a mismatch) and the fact that the foreman turned a blind eye on previous occasions, thus winking at non-compliance.

Tasks are sometimes divided into skill-based, rule-based and knowledge-based. Skill-based actions are highly practised ones carried out automatically with little or no conscious monitoring. Rule-based actions have been learned but have not been carried out often enough to become automatic; the degree of conscious control is intermediate. Knowledge-based actions are carried out under full conscious control, because the task is non-routine or unfamiliar; they take longer than skill-based ones as each step has to be considered.

Note that the same action may be skill-based for one person, rule-based for another and knowledge-based for a third. The errors in my first group — slips and lapses of attention — are errors in skill-based tasks. Those errors in my second group — mistakes — that are due to poor training are mainly errors in knowledge-based tasks; those that are due to poor instructions are mainly errors in rule-based tasks. Violations can be any of these types of error.

Chapter 7 describes some of the attempts that have been made to quantify the probability of human error. These methods apply only to the first sort of error. We can estimate — roughly — the probability that someone will have a moment's aberration and forget to open a valve, or open the wrong valve, but we cannot estimate the probability that he will make a mistake because the training or instructions are poor, because he lacks the necessary physical or mental ability, or because he has a 'couldn't care less' attitude. People often assume that these errors have been eliminated by selection, training, instructions and monitoring, but this is not always true. Each of these factors can contribute from 0 to 100% to the probability of failure. All we can do is assume that they will continue in the future at the same rate as in the past, unless there is evidence of change.

Chapters 8–12 examine some further accidents due to human failing, but classified somewhat differently:
• accidents that could be prevented by better design;
• accidents that could be prevented by better construction;
• accidents that could be prevented by better maintenance;
• accidents that could be prevented by better methods of operation;
• accidents in computer-controlled plants.

Finally, Chapter 13 looks at legal views and at the question of personal responsibility. If we try to prevent errors by changing the work situation, does this mean that people who make errors are entitled to say, 'It is your fault for putting me in a situation where I was able to make a mistake'?

1.4 Two simple examples

The various types of human error may be illustrated by considering a simple everyday error: forgetting to push in the choke on a car when the engine has warmed up (a rare error today when most cars have an automatic choke). There are several possible reasons for the error:

• It may be due to a lapse of attention. This is the most likely reason and similar errors are discussed in Chapter 2.

• With an inexperienced driver the error may be due to a lack of training or instruction; he or she may not have been told what to do or when to do it. Similar errors are discussed in Chapter 3.
• The error may be due to a lack of physical or mental ability — unlikely in this case. Such errors are discussed in Chapter 4.
• The error may be due to the fact that the driver cannot be bothered — the hired car syndrome. Similar errors are discussed in Chapter 5.
• If I employ a driver, I may leave everything to him, take no interest in the way he treats the car and fail to investigate the reasons why my engines wear out so quickly. These management errors are discussed in Chapter 6.

If the error is due to lack of training or instruction then we can provide better training or instruction. There is not much we can do in the other cases except change the work situation — that is, provide a warning light or alarm or an automatic choke (now usual). The latter adds to the cost and provides something else to go wrong and in some such cases it might be better to accept the occasional error. Examples of situations in which automatic equipment is not necessarily better than an operator are given in Chapter 7.

As another example of the five sorts of error, consider spelling errors:

• If I type opne or thsi it is probably a slip, overlooked when I checked my typing. Telling me to be more careful will not prevent these errors. Using the spell-checking tool on my word processor will prevent many of them.
• If I type wieght it may be the result of following a rule that is wrong: 'I before e, except after c'.
• If I type recieve or seperate, it is probably due to ignorance. Better training or instructions might prevent the errors but persuading me to use the spell-checking tool would be cheaper and more effective.
• If I write Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch (a village in Wales), it is probably because it is beyond my mental capacity to spell it correctly from memory.
• If I type thru or gray, it is probably due to a deliberate decision to use the American spelling. To prevent the errors someone will have to persuade me to use the English spelling.
• If I work for a company, perhaps the managers take little or no interest in the standard of spelling. They may urge me to do better but do not realize that they could do more, such as encouraging me to use the spell-checking tool. (Can you spot the error? Answer on page 10.)

However, the spell-checking tool can be misused and can introduce errors. The following, copied from various sources, probably occurred because someone misspelled a word, either through ignorance or as the result of a slip, and then accepted whatever the computer offered, on the assumption that the all-powerful computer could not possibly be wrong.

They illustrate the way computers (and other tools) can be misused when we do not understand their limitations.

• A wonderful conversation of a period house to create four homes. (From an advertisement.)
• The opera is in three parts. The first is an appeal to density. (From a concert programme.)
• Bedsit with separate fatalities. (From an advertisement.)
• Classic old brass and copper geezer. (From a classified advertisement.)
• A red box containing assaulted tools worth £100 was stolen ... (From a newspaper report.)
• Concrete arched lentils. (From a newspaper report.)
• Should you have any queers please contact this office. (From a letter from a building society.)
• Tumble dryer, vertically unused. (From a classified advertisement.)

1.5 Accident investigation

The output from an accident investigation may tell us much about the culture of the company where it occurred. If it has a blame culture, people are defensive and not very forthcoming. It is difficult to find out what has happened and the true causes may never be found. The design engineer, for example, will try to show that the accident could not have been prevented by better design. In a blame-free environment, when actions are discussed, the design engineer will suggest ways in which better design (or changes in the design procedure, such as more use of Hazop or inherently safer design) could prevent a repetition. People from other departments will behave similarly. Each person will consider what he or she might have done to prevent the accident or make it less likely.

If we wish to find out why accidents occurred and find ways of preventing them happening again, we need a sympathetic attitude towards people who have made slips, had lapses of attention or not learnt what their training was supposed to teach them. In addition, we need a sympathetic attitude towards people who have committed so-called violations. (Possible reasons are discussed in the following chapters.) This applies to everyone. We should not blame the workman who for one reason or another failed to take action that could have prevented an accident. We should ask why he did not take that action and make recommendations accordingly. It is a small price and is worth paying.

Similarly we should not blame the senior manager who failed to take action that could have prevented an accident, but ask why. The reason is usually not wickedness but failure to foresee the actions he might have taken (see Chapter 6 and Sections 13.3—13.5, page 227).

Unfortunately a desire to blame someone for misfortunes seems to be deeply engrained in human nature. Amongst the Australian Aborigines, for example, 'rites were performed often at the grave or exposure platform of the dead to discover the person to be blamed for the murder'. Since death was not considered a natural event, a cause for it was always sought in the evil intentions of someone else, usually a member of another local group4. We are much the same and are ready to believe that every accident is someone's fault, especially the person at the 'sharp end'.

In 1998 a plane, flying low in an Alpine valley, broke the cable of a cable railway, causing many deaths. At first the pilot was blamed for flying too low, but the inquiry showed that his altimeter was faulty, so he was much lower than he thought, and that the cable railway was not marked on his map. Nevertheless, television pictures showed relatives of those killed expressing their outrage at his acquittal. As with every accident, many people had an opportunity to prevent it. Why was the altimeter faulty? Was it poor design, unsuitable for the application, or poor maintenance? Why was the railway not shown on the map? Was an old edition in use? And so on.

1.6 A story
The following story illustrates the theme of this book. A man went into a tailor's shop for a ready-made suit. He tried on most of the stock without finding one that fitted him. Finally the tailor said, 'I'm sorry, sir, I can't fit you. You're the wrong shape.'

Should we as engineers expect people to change their (physical or mental) shapes so that they fit into the plants and procedures we have designed, or should we design plants and procedures to fit people? We know that they cannot change their physical shape. If a man cannot reach a valve we do not tell him to try harder or grow taller. We provide a step, move the valve or remove the need for the valve. But some people expect others to change their mental shape and never have slips or lapses of attention. This is as difficult as growing taller. We should instead change the design or method of working to reduce the opportunities for error.

1.7 Research on human error
The Third Report of the UK Advisory Committee on Major Hazards2 and many other reports recommend research on human reliability and make some suggestions. While there is undoubtedly much we would like to know, lack of knowledge is not the main problem. Accidents, with a few exceptions, are not caused by lack of knowledge, but by a failure to use the knowledge that is available. This book is an attempt, in a small way, to contribute towards accident reduction by reminding readers of facts they probably know well and reinforcing them by describing accidents which have occurred, in part, as a result of ignoring these facts.

Table 1.1 summarizes the four types of human error and the corresponding actions needed. They are discussed in more detail in the following chapters.

Table 1.1 Types of human error and the action required
Error type                              Action required
Mistakes — Does not know what to do     Better training and instructions / CHAOS
Violations — Decides not to do it       Persuasion / CHAOS
Mismatches — Unable to do it            CHAOS
Slips and lapses of attention           CHAOS
CHAOS = Change Hardware And/Or Software

References in Chapter 1
1. Kletz, T.A., 1984, Plant/Operations Progress, 3(4): 210.
2. Advisory Committee on Major Hazards, 1984, The Control of Major Hazards, Third Report, Appendix 13 (HMSO, London, UK).
3. Kletz, T.A., 1998, Process Plants: A Handbook for Inherently Safer Design (Taylor and Francis, Philadelphia, PA, USA).
4. Australian Aboriginal Culture, 1973, Australian National Commission for UNESCO, page 44.

The correct spelling of the village in Wales (see page 7) is: Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch

2 Accidents caused by simple slips

'I haven't got a memory, only a forgettory.'
Small boy quoted on Tuesday Call, BBC Radio 4, 6 December 1977

'It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilisation advances by extending the number of important operations which we can perform without thinking about them.'
A.N. Whitehead

2.1 Introduction
This chapter describes some accidents which occurred because someone forgot to carry out a simple, routine task such as closing or opening a valve, or carried it out wrongly — that is, he closed or opened the wrong valve. The intention was correct: he knew what to do, had done it many times before, was capable of doing it and intended to do it and do it correctly, but he had a moment's aberration. Such slips or lapses of attention are similar to those of everyday life and cannot be prevented by exhortation, punishment or further training. We must either accept an occasional error — probabilities are discussed in Chapter 7 — or remove the opportunity for error by a change in the work situation — that is, by changing the design or method of working. Alternatively we can, in some cases, provide opportunities for people to observe and correct their errors or we can provide protection against the consequences of error.

Note that errors of the type discussed in this chapter — errors in skill-based behaviour — occur not in spite of the fact that the man who makes the error is well-trained but because he is well-trained. Routine tasks are given to the lower levels of the brain and are not continuously monitored by the conscious mind. We could never get through the day if everything we did required our full attention, so we put ourselves on auto-pilot. When the normal pattern or programme of action is interrupted for any reason, errors are liable to occur.

Those interested in the psychological mechanisms by which slips and lapses of attention occur should read J. Reason and K. Mycielska's book on everyday errors1. Here, as engineers, we are not concerned with the reasons why these errors occur, but with the fact that they do occur and that we can do little to prevent them. We should therefore accept them and design accordingly.

Can we reduce errors by selecting people who are less error-prone? Obviously if anyone makes a vast number of errors — far more than an ordinary person would make — there is something abnormal about his work situation or he is not suitable for the job. But that is as far as it is practicable to go. If we could identify people who are slightly more forgetful than the average person — and it is doubtful if we could — and weed them out, we might find that we have weeded out the introverts and left the extroverts, and those are the people who cause accidents for another reason: they are more inclined to take chances.

People who have made an absent-minded error are often told to keep their mind on the job. It is understandable that people will say this, but it is not very helpful. No-one deliberately lets their mind wander, but it is inevitable that we all do so from time to time. However, slips and lapses are most likely to occur when we are in a hurry or under stress and these are the times when we are least likely to pause. People can be encouraged, before closing a valve, to stop and ask themselves, 'Is this the right valve?', much as we pause and look both ways before we cross the road. The psychology is the same. Behavioural safety training (see Section 5.3 on page 107) will have little if any effect, as slips and lapses of attention are not deliberate. Section 4.3 (page 85) returns to this subject.

Note that the unconscious mind, as used here, is not the same as the Freudian unconscious, a place of preventive detention where we hold captive thoughts that would embarrass us if they became conscious. In contrast the non-Freudian unconscious is a mechanism which does not require consciousness to do its job. It is rather like a computer operating system running quietly in the background, or an auto-pilot25.

The examples that follow illustrate some of the ways by which we can prevent or reduce slips and lapses of attention or recover from their effects. They include better displays of information, interlocks that prevent an incorrect action being performed, warnings that it has been performed, trips which prevent serious consequences and reducing stress or distraction.

Figure2. The open and one closed.2.2 (page 14) shows two similarfilters.1(a) Filter 13 . The steam supply was then isolated. The filterdoor was held closedbyeightradialbars whichfittedinto U-bolts on thefilterbody. with filter leavesattached.1(b) onpage 14). Steam (100psig) Vent From plant Filteredproduct to storage Drain Figure 2. the inlet valvewas closed and the liquid in the filter blown out with steam. the pressure blown off through the vent and the fall in pressure observedon apressuregauge.1 Opening equipment which has beenunder pressure A suspendedcatalystwas removedfrom a process stream in a pressure filter (Figure2.fixed to thedoor.To withdrawtheradialbars from theU-boltsandopen thedoor theoperatorhad to turn a large wheel. Whena batchhad beenfiltered.1(a)).2 Forgettingto open or close a valve 2. The operatorthenopenedthe filter forcleaning. could thenbewithdrawn (Figure 2.ACCIDEt'iS CAUSEI) IY SIMPLE SLIPS 2.

Figure 2.1(b) Filter door and fastening: end view of the door and enlarged side view, showing the frame, the U-bolts and the radial rods withdrawn by turning the wheel

Figure 2.2 Two filters similar to the one involved in the accident described in Section 2.2.1. The one on the left is open and the one on the right is closed. The operator who was killed was standing in front of the wheel seen on the right. He had forgotten to vent the pressure in the filter. When he started to open the door, the door flew open.

One day an operator, a conscientious man of great experience, started to open the door before blowing off the pressure. He was standing in front of it and was crushed between the door and part of the structure. He was killed instantly. The accident occurred at the end of the night shift, an hour before the operator was due to start his annual holiday. His mind may not have been fully on the job; he may have been thinking of his holiday. Who can blame him?

This accident occurred some years ago and at the time it seemed reasonable to say that the accident was due to an error by the operator. It showed the need, it was said, for other operators to remain alert and to follow the operating instructions exactly. Only minor changes were made to the design. However, we now see that in the situation described it is inevitable, sooner or later, that an operator will forget that he has not opened the vent valve and will try to open the filter while it is still under pressure. The accident was the result of the work situation. It is not very helpful to say that the accident was due to human failing, and we would recommend the following changes in the design:
(1) Whenever someone has to open up equipment which has been under pressure using quick-release devices:
(a) Interlocks should be fitted so that the vessel cannot be opened until the source of pressure is isolated and the vent valve opened (one way of doing this would be to arrange for the handles of ball valves on the steam and vent lines to project over the door handle when the steam valve is open and the vent valve closed); and
(b) The design of the door or cover should be such that it can be opened about ¼ inch (6 mm) while still capable of carrying the full pressure, and a separate operation should be required to release the door fully. If the cover is released while the vessel is under pressure, this is immediately apparent and the pressure can be allowed to blow off through the gap or the door can be resealed. This is now required in the UK by the Health and Safety Executive23.
(2) The pressure gauge and vent valve should be located near the door so that they are clearly visible to the operator when he is about to open the door. (They were located on the floor above.)
(3) The handle on the door should be modified so that it can be operated without the operator having to stand in front of the door.

Recommendations (2) and (3) were made and carried out at the time, but not (1), the most important.

Many similar accidents have occurred when operators have had to open up equipment which has been under pressure (see Section 12.3, page 207). We all have moments when, for one reason or another, our minds are not on the job. This is inevitable. Of course, every day, in every factory, equipment which has been under pressure is opened up safely for repair, but this is normally done under a permit-to-work system, normally by carefully slackening bolts in case there is any pressure left inside. One person prepares the equipment and issues a permit to another person who opens it up. Safety is obtained by following procedures: the involvement of two people and the issue of a permit provides an opportunity to check that everything necessary has been done. Accidents are liable to happen when the same person prepares the equipment and opens it up, and in these cases we should look for safety by design. In this case the accident could have been prevented by better design of the equipment4.

Many design engineers accept the arguments of this book in principle, but when safety by design becomes difficult they relapse into saying, 'We shall have to rely on the operator.' They should first ask what failure rate is likely and whether that rate is tolerable (see Chapter 7). We can rely on the operator to open a valve 99 times out of 100, perhaps more, perhaps less if stress and distraction are high (see Chapter 7), but one failure in 100 or even in 1000 is far too high when we are dealing with an operation which is carried out every day and where failure can have serious results. One design engineer, finding it difficult to install the devices recommended in (1) above, said that it was 'reasonable to rely on the operator'. He would not have said it was reasonable to rely on the operator if a tonne weight had to be lifted; he would have installed mechanical aids. Similarly, if memory tasks are too difficult we should install mechanical (or procedural) aids. We should list as the causes of an accident only those we can do something about.

Would a check-list reduce the chance that someone will forget to open a valve? Check-lists are useful when performing an unfamiliar task — for example, a plant start-up or shutdown which occurs only once per year. It is unrealistic to expect people to use them when carrying out a task which is carried out every day or every few days. The operator knows exactly what to do and sees no need for a check-list. If the manager insists that one is used, the list will be completed at the end of the shift. When chokes on cars were manual, we all forgot occasionally to push them in when the engines got hot, but we would not have agreed to complete a check-list every time we started our cars.
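The arithmetic behind the 'rely on the operator' argument can be made concrete with a short sketch. The 1-in-100 and 1-in-1000 rates come from the text; the assumption of one demand per day, and the code itself, are mine:

```python
# Expected failures per year for a routine task, given a per-demand
# error probability (a hedged illustration, not data from the book).

def expected_failures(error_prob, demands_per_year=365):
    """Expected failures = per-demand probability x yearly demands."""
    return error_prob * demands_per_year

# 99-times-out-of-100 reliability still fails several times a year:
print(expected_failures(1 / 100))   # 3.65 failures per year
# Even 1 in 1000 fails roughly once every three years:
print(expected_failures(1 / 1000))  # 0.365 failures per year
```

On a daily task, even a very good human reliability figure translates into regular failures, which is why the text insists on safety by design rather than reliance on the operator.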

It is true that aircraft pilots go through a check-list at every take-off, but the number of checks to be made is large and the consequences of failure are serious.

2.2.2 Preparation and maintenance by same person
A crane operator saw sparks coming from the crane structure. He turned off the circuit breaker and reported the incident to his supervisor, who sent for an electrician. The electrician found that a cable had been damaged by a support bracket. He checked that the circuit was dead and signed a permit-to-work to confirm that he had done so. He asked a crane maintenance worker to make the necessary repairs and reminded him to lock out the circuit breaker. The maintenance worker, not realizing that the breaker was already in the 'off' position, turned it to the 'on' position before hanging his tag on it. While working on the crane he received second and third degree burns on his right hand.

It is not clear from the report whether the circuit was locked out or just tagged out. It should have been locked out, and it should have been impossible to lock it open. It would have been far better if the electrician had isolated the circuit breaker himself. As in the filter incident in Section 2.2.1, incidents are liable to occur when the same person prepares the equipment and carries out the maintenance.

The incident occurred in the US. The report said that the root cause was personnel error: inattention to detail, failure to confirm that the circuit was de-energized. However, this was an immediate cause, not a root cause. The root cause was a poor method of working. The report26 is a good example of the way managers, in writing a report, comment on the failings of the injured person but fail to see that their procedures could be improved. If it had occurred in the UK, the Health and Safety Executive would, I think, have commented on the poor permit system.

2.2.3 Emptying a vessel
Figure 2.3 (page 18) shows the boiler of a batch distillation column. After a batch was complete, the residue was discharged to a residue tank through a drain valve which was operated electrically from the control room. To reduce the chance that the valve might be opened at the wrong time, a key was required to operate the valve, and indicator lights on the panel showed whether it was open or shut. One day the operator, while charging the still, noticed that the level was falling instead of rising and then realized that he had forgotten to close the

drain valve after emptying the previous batch. A quantity of feed passed to the residue tank, where it reacted violently with the residues.

Figure 2.3 Premature discharge of the boiler caused a reaction to occur in the residue tank (drain valve remotely operated by key on panel; indicator lights show its position; feed from other units)

The operator pointed out that the key was small and easily overlooked and that the indicator lights were not easily visible when the sun was shining. As an immediate measure a large metal tag was fitted to the key. (This was something that any operator could have done at any time.) Later the drain and feed valves were interlocked so that only one of them could be open at a time.

A similar incident occurred on the reactor shown in Figure 2.4. When the reaction was complete the pressure fell and the product was discharged into the product tank. To prevent the discharge valve being opened at the wrong time, it was interlocked with the pressure in the reactor so that it could not be opened until the pressure had fallen below a gauge pressure of 0.3 bar. A drain valve was provided for use after the reactor was washed out with water between batches. One day a batch failed to react and it was decided to vent off the gas. When the gauge pressure fell below 0.3 bar, the discharge valve opened automatically

Figure 2.4 Arrangement of valves on a batch reactor (vent; high pressure switch prevents the discharge valve opening until the pressure is less than 0.3 bar gauge; discharge valve, operated from the control panel, to product tank; drain valve, left open). The drain valve was left in the open position. When the pressure in the reactor fell below 0.3 bar gauge, the discharge valve opened and the contents of the reactor were discharged.

and, as the drain valve had been left open, the contents of the reactor were discharged into the working area. Fortunately, though they were flammable, they did not catch fire. The remote actuator on the discharge valve had been left in the open position and as soon as the pressure in the reactor fell the interlock allowed the valve to open. Again, it is too simplistic to say that the accident was the result of an error by the operator who left the actuator on the discharge valve in the open position. The accident could have been prevented by a better designed protective system. The weakness in the design of the protective system could have been foreseen by a hazard and operability study.

2.2.4 Emptying a pump
A pump had to be drained when not in use and the drain valve left open, as the liquid inside it gave off gas on standing and the pressure would damage the seal. Before starting up the pump, the drain valve had to be closed. One day a young operator, in a hurry to start up the spare pump and prevent interruption to production, forgot to close the drain valve, although he knew that he should, had done it before and an instruction to do so was included in the plant operating instructions. The liquid came out of the open drain valve and burnt him chemically on the leg.

The accident was said, in the report, to be due to human failing and the operator was told to be more careful. According to the accident report, it was carelessness which led him to '... open liquors to the pump with the vent open ...'. The report continued: 'Labels will be provided to remind operators that certain pumps may be left with open vents. Process supervisors will be told to ensure that all operators, and particularly those with limited experience, are fully aware of the elementary precautions they are expected to take in carrying out their duties.' These comments, written by the plant manager (the equivalent of a supervisor in the US), suggest that he was rather reluctant to make any changes.

Nevertheless, changes were made to make the accident less likely and protect the operator from the consequences of error. The drain line was moved so that the drainings were still visible but less likely to splash the operator, and a notice was placed near the pump to remind the operators to close the drain valve before starting the pump. After a while, however, the notice will have become part of the scene and may not be noticed. A better solution would have been to fit a small relief valve to the pump. Interlocking the drain valve and the starter would be possible but expensive and complex and probably not justified.
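The weakness in the reactor's protective system in Section 2.2.3 above can be summarized as an interlock-logic sketch. This is hypothetical code: the 0.3 bar gauge threshold is from the text, but the function and the added drain-valve condition are my illustration of what a hazard and operability study might have suggested:

```python
# Sketch of the discharge-valve permissive logic on the batch reactor.

def discharge_permitted(pressure_barg, drain_valve_closed):
    """Improved permissive: low pressure alone is not enough; the
    drain valve must also be confirmed closed before discharge."""
    return pressure_barg < 0.3 and drain_valve_closed

# As installed, only the pressure was checked, so venting the gas
# below 0.3 bar gauge opened the discharge valve over an open drain:
def as_installed(pressure_barg):
    return pressure_barg < 0.3

print(as_installed(0.2))                                   # True: spill
print(discharge_permitted(0.2, drain_valve_closed=False))  # False: blocked
```

The point of the sketch is that an interlock only protects against the conditions it actually tests; the open drain valve lay outside the single condition the designers chose.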

The accident occurred some years ago and today most managers would be more willing to change the design. We can imagine the comments that the other people involved might have made at the time. The design engineer might have said, 'Our usual practice is to fit relief valves only on positive pumps, not on centrifugal ones. No-one told me that the drain valve on this pump had to be left open. If I had known I might have changed the design.' The foreman would probably have been blunter than the manager and said the accident was entirely due to carelessness. The injured man might have said, 'It's easy to forget there is something special about a particular pump when you are busy and trying to do everything at once.' The shop steward would have said, 'Blame the working man as usual.' The safety officer might have suggested a relief valve instead of an open drain valve.

2.3 Operating the wrong valve
2.3.1 Trip and alarm testing
A plant was fitted with a low pressure alarm and an independent low pressure trip, arranged as shown in Figure 2.5. There was no label on the alarm but there was a small one near the trip. An instrument artificer was asked to carry out the routine test of the alarm. The procedure, well known to him, was to isolate the alarm from the plant, open the vent to blow off the pressure and note the reading at which the alarm operated.

Figure 2.5 Arrangement of trip and alarm connections (the label and the connection to the plant are shown)

By mistake he isolated and vented the trip. When he opened the vent valve, the pressure in the trip fell and the plant was automatically shut down. It would have been of little use telling the artificer to be more careful. To reduce the chance of a further mistake:
• provide better labels;
• put the trip and alarm further apart;
• possibly paint the trip and alarm different colours.
Though not relevant to this incident, note that the trip and alarm should be connected to the plant by separate impulse lines to reduce the chance of a common mode failure: choking of the common impulse line.

2.3.2 Isolation of equipment for maintenance
To save cost, three waste heat boilers shared a common steam drum. Each boiler had to be taken off line from time to time for cleaning (see Figure 2.6). On two occasions the wrong valve was closed (D3 instead of C2) and an on-line boiler was starved of water and over-heated. On the first occasion the damage was serious; it took 36 hours to get the boiler back to normal. High temperature alarms were then installed on the boilers. On the second occasion they prevented serious damage, but some tubes still had to be changed. The chance of an error was increased by the lack of labelling and the arrangement of the valves — D3 was below C2. A series of interlocks were then installed so that a unit has to be shut down before a key can be removed; this key is needed to isolate the corresponding valves on the steam drum.

A better design, used on later plants, is to have a separate steam drum for each waste heat boiler (or group of boilers if several can be taken off line together). There is then no need for valves between the boiler and the steam drum. This is more expensive but simpler and free from opportunities for error. Note that we do not grudge spending money on complexity but are reluctant to spend it on simplicity.

Figure 2.7(a) (page 24) illustrates another accident involving valves that were out of line. A series of parallel pipes rose up the wall of a room, went part way across overhead and then came down the centre of the room. There was a horizontal double bend in the overhead section of each pipe. As a result the pipes going up and the pipes coming down were out of line. During a discussion on this accident a design engineer remarked that designers have enough problems arranging complex pipework in the space available without having to worry whether or not corresponding valves are in line with each other. This may be true, but such simple things produce error-prone situations.

Figure 2.6 Waste heat boilers sharing a common steam drum (feed from No. 1 unit shown)

A valve in a pipe in the middle of the room had to be changed. One isolation valve was beneath it. The other isolation valve was on the wall. The man who isolated the valves overlooked the bends overhead and closed the valve on the wall that was in line with the valve that had to be changed. This valve was in the wrong line. When the topwork on the valve was unbolted, the pressure of the gas in the line caused the topwork to fly off and imbed itself in a wall.

Figure 2.7(a) (page 24) shows the pipes as seen from above with the ceiling removed. It is easy to see which valve should have been closed. Standing amongst the pipes, the correct valve is less obvious. Note that both the valves closed were the third from the end of the row, as an extra pipe on the wall led elsewhere. Colour-coding of the corresponding valves could have prevented the accident (see Figure 2.7(b) on page 24). The report did not mention this but recommended various changes in procedures. A common failing is to look

for changes to procedures first and to consider changes in design only when changes in procedures are not possible. This is the wrong order (see Section 8, page 165, last item); we should also consider ways of removing the hazard rather than controlling it, leaving procedural controls as a last resort.

Figure 2.7(a) Valve A had to be changed. The operator closed the valve below it. To complete the isolation, he intended to close the valve on the other side of the room in the pipe leading to valve A, the one opposite valve A. He overlooked the double bends overhead and closed valve B. The bends in the overhead pipes are in the horizontal plane. Both of the valves that were closed were the third from the end of their row.

Figure 2.7(b) Valves or switches arranged like this (INLET/OUTLET, ON/OFF) lead to errors. If the valves or switches cannot be moved, use one colour for the pair labelled 1, another colour for the pair labelled 2, and so on.

2.4 Pressing the wrong button
2.4.1 Beverage machines
Many beverage vending machines are fitted with a panel such as that shown in Figure 2.8. I found that on about 1 occasion in 50 when I used these machines I pressed the wrong button and got the wrong drink. Obviously the consequences were trivial and not worth worrying about, but suppose a similar panel was used to charge a batch reactor or fill a container with product for sale. Pressing the wrong button might result in a runaway or unwanted reaction or a customer complaint. It is therefore worth studying the factors that influence the probability of error and ways of reducing it.

Figure 2.8 Panel of beverage vending machine (buttons for tea, tea + sugar, black tea + sugar, white coffee, white coffee + sugar, black coffee, black coffee + sugar, hot chocolate and hot soup, with a 'sold out when lit' light and a 'press for reject coins' button)

This example of human error is of interest because many of the uncertainties that are present in other examples do not apply. The errors were not due (I hope) to lack of training or instructions, or to lack of physical or mental ability. I knew what to do, was able to do it and wanted to do it, but nevertheless made an occasional error. They were certainly not due to lack of motivation because when I got the wrong drink I had to drink it (being too mean or short of change to throw it away and try again). My error rate was increased by a certain amount of stress and distraction. (The machines are in the corridor.) It is shown in Section 7.5 (page 141) that in a situation free from stress and distraction the error rate would probably be about 3 in 1000.

If we wished to reduce the error rate we could:
• Place the machine in a place where there is less distraction. (Stress is harder to remove.)
• Put the buttons further apart.
• Redesign the panel, as shown in Figure 2.9(a): separate the choice of drink (tea, coffee, chocolate, soup) from the choice of milk (yes or no) and sugar (yes or no).
If this did not give an acceptable error rate we might have to consider attaching a microprocessor which could be told the name of the product or customer and would then allow only certain combinations of constituents. Alternatively it might display the instructions on screen and the operator would then have to confirm that they were correct.

After a few weeks I moved to a new building where the beverage machines were of a different type. The panel was arranged as shown in Figure 2.9(b). I started to drink lemon tea, obtained by pressing the isolated central button, and I felt sure that I would make no more mistakes. Nevertheless, I did get the wrong drink. I used a machine in a different part of the building and when I pressed the centre button I got hot chocolate. The labelling was quite clear, but I was so used to pressing the centre button that I did not pause to read the label. Obviously it does not matter if I get the wrong drink, but a similar mistake on a plant could be serious.

A situation like this sets a trap for the operator as surely as a hole in the ground outside the control room door. If there are two panels in the control room and they are slightly different, we should not blame the operator if he makes an error. The panels should be identical or entirely different. If they have to be slightly different — because of a different function — then a striking notice or change of colour is needed to draw attention to the difference.
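The two error rates quoted above (about 1 in 50 in practice, about 3 in 1000 without stress and distraction) can be put on a common footing by asking how likely at least one error is over a year of use. This is a hypothetical sketch; the figure of 250 uses per year (one per working day) is my assumption:

```python
# Probability of at least one wrong drink over repeated, independent tries.

def p_at_least_one(p_per_try, tries):
    return 1 - (1 - p_per_try) ** tries

for rate in (1 / 50, 3 / 1000):
    print(round(p_at_least_one(rate, tries=250), 2))
# With one drink per working day, a 1-in-50 slip rate makes at least
# one error over the year almost certain (0.99); even the stress-free
# 3-in-1000 rate gives roughly an even chance (0.53).
```

The same arithmetic explains why a rate that sounds small is still intolerable when the 'button' charges a batch reactor rather than pouring a drink.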

Figure 2.9 Panels of two similar vending machines: (a) the choice of drink (tea, coffee, hot chocolate, lemon tea) separated from the choice of milk and sugar; (b) a panel with an isolated central button

2.4.2 Overhead cranes
Figure 2.10 (page 28) shows the buttons on the control unit for an overhead crane. Operators become very skilled in their use and are able to press two or three buttons at a time so that they seem to be playing the controls like a concertina. Nevertheless occasional errors occur. In this case the operator sees the load move the wrong way and can usually reverse the movement before there is an accident. Nevertheless we must expect an occasional 'bump'.

2.4.3 Charging a reactor
A similar mistake to the one with the beverage machine caused a serious fire in which several men were killed. Two identical units shared a common control room. The two panels were arranged as mirror images of each other. This led to numerous errors. Two reactors on a batch plant, Nos 4 and 6, were shut down for maintenance. The work on No. 4 was completed and the foreman asked an operator to open the feed valve to No. 4. The valve was electrically operated and the operator went to the panel, but by mistake pressed the button controlling the inlet valve to No. 6 reactor. This reactor was still under repair. The valve opened, flammable gas came out and caught fire.

Figure 2.10 Control unit for a travelling overhead crane (buttons: up slow, up fast, down slow, down fast, left, right, forward, reverse, reset, stop)

The company concerned said, 'What can we do to prevent men making errors like this?' The answer is that we cannot prevent men making such errors, though we can make them less likely by putting the buttons further apart, providing better labelling and so on, but errors will still occur, particularly if the operator is under stress (see Figure 14.3, page 250).
was still under repair. The valve opened, flammable gas came out and caught fire. The company concerned said, 'What can we do to prevent men making errors like this?' The answer is that we cannot prevent men making such errors, though we can make them less likely by putting the buttons further apart, providing better labelling and so on; but errors will still occur, particularly if the operator is under stress (see the figure on page 250).

Figure 2.10 Control unit for a travelling overhead crane

Figure 2.11 Arrangement of operating buttons for the inlet and exit valves on a group of batch reactors

We should never tolerate a situation in which such a simple slip has such serious consequences. The valves on reactors under maintenance should have been defused and locked off and the inlet and exit lines should have been slip-plated. The accident could have been prevented by a better method of working, by better management. The operator was not to blame for the accident.

2.4 Shutting down equipment
A row of seven furnaces was arranged as shown in Figure 2.12(a) (page 30). An operator was asked to close the fuel valve on No. 5 furnace. He realized that he had to isolate the fuel to the furnace on the extreme left, so he went to the button on the extreme left. But the buttons for isolating the fuel to the furnaces were arranged as shown in Figure 2.12(b) (page 30). He pressed the wrong button and isolated the fuel to the wrong furnace. He made the sort of error that everyone makes from time to time.

2.5 Failures to notice
The failures discussed so far have been failures to carry out an action (such as forgetting to close a valve) or carrying it out incorrectly (closing the wrong valve). Accidents can also occur because someone fails to notice the signal for an action. We should not tell people to 'Wake up' or 'Pay more attention', but should make the signal more prominent.

Figure 2.12 Arrangement of furnaces and control panel: the furnaces are arranged as in (a), but the switches are arranged as in (b)

For example, a company fitted small discs to every ladder, giving the date on which they were due for inspection. They were fitted to the top rungs. Someone pointed out that they were more likely to be seen if they were fitted to the sixth rung from the bottom! (see the figure on page 249).

An operator charged a reactor with a chemical from a drum that looked like the drums he normally used. Unfortunately he did not notice or check the label and there was a runaway reaction. The cover of the reactor hit the ceiling, 2 m above. The report on the explosion blamed inadequate operator training, inadequate procedures and poor supervision but did not point out that if drums of different chemicals look alike then sooner or later someone, under stress or distraction, will have a momentary lapse of attention and use the wrong drum21.

While checking a stock of dust filters for respirators, a man found that some were the wrong type and unsuitable for use. He checked the records and found that four similar ones had been issued and used. It was then found that the person who ordered the filters had left out one letter from the part number. The filters looked similar to those normally used and the storekeeper did not check the part number27,31. Reference numbers should be chosen so that a simple error (such as leaving out a letter or number, or interchanging a pair) cannot produce a valid result.

I had a similar experience when I telephoned an insurance company to ask about progress on an accident claim. I quoted the claim number (19 letters and numbers) but the man I spoke to typed MH instead of KM when entering

it into his computer. We spoke at cross-purposes. As a result he asked me to telephone another office. I repeated the number (including KM) but many of the claims they dealt with included the letters MH and they heard what they expected to hear. They did not like to say that they hadn't a clue what I was talking about and brushed me off with remarks such as, 'We will keep you informed when we get more information.'

2.6 Wrong connections
Figure 2.13 shows the simple apparatus devised in 1867, in the early days of anaesthetics, to mix chloroform vapour with air and deliver it to the patient. If it was connected up the wrong way round, liquid chloroform was blown into the patient with results that could be fatal. Redesigning the apparatus so that the two pipes could not be interchanged was easy; all that was needed were different types of connection or different sizes of pipe. Persuading doctors to use the new design was more difficult and the old design was still killing people in 1928.

Figure 2.13 An early chloroform dispenser. It was easy to connect it up the wrong way round and blow liquid chloroform into the patient

Doctors believed that highly skilled professional men would not make such a simple error but, as we have seen, everyone can make slips, however well-motivated; in fact, slips occur only when we are well-trained28.

Do not assume that chemical engineers would not make similar errors. In 1989, in a polyethylene plant in Texas, a leak of ethylene exploded, killing 23

people. The leak occurred because a line was opened for repair while the air-operated valve isolating it from the rest of the plant was open. It was open because the two compressed air lines, one to open the valve and one to close it, had identical couplings, and they had been interchanged. As well as this slip there was also a violation: a decision (authorized at a senior level) not to follow the normal company rules and industry practice, which required a blind flange or double isolation valve (Figure 2.14)29 (see also Section 5.5, page 105). Following the explosion there was a review of the adequacy of the protective system and many extra trips were added.

The operating staff may have been unable to influence the design but they could have painted the two hoses and the corresponding connections different colours. This is hardly a new idea. In the middle of the 19th century the Festiniog Railway in North Wales had a passing loop halfway along its 14-mile single-track line. Before entering each half of the line the engine driver had to possess the train staff for that section. In 1864 a railway inspector wrote, 'I have however recommended that the train staffs should be painted and figured differently for the two divisions of the line.'30

2.7 Errors in calculations
A batch reaction took place in the presence of an inorganic salt which acted as a buffer to control the pH. If the pH was not controlled, a violent exothermic side-reaction occurred. Each batch was 'tailor-made' for the particular purpose for which the product was required and the weights of the raw materials required were calculated from their compositions and the product specification. As the result of an error in calculation, only 60% of the buffer required was added. There was a runaway reaction and the reactor exploded.

On the modified plant, the amount of buffer added was increased to twice that theoretically necessary. However, errors in calculation were still possible, as well as errors in the quantities of reactants added, and so there was a need for the protective instrumentation. Loss of agitation, high temperature or low pH resulted in the reaction being automatically aborted by addition of water. In addition, a level switch in the water tank prevented operation of the charge pump if the level was low.

An error by a designer resulted in the supports for a small tank being too light. When the tank was filled with water for a construction pressure test, it fell to the ground, unfortunately fracturing an oil line and causing a fire which killed a man.
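The defences adopted on the modified batch plant described above, a deliberate excess of buffer over the calculated quantity backed by independent trips that abort the reaction with water, can be sketched in code. This is an illustration of the logic only: the safety factor of two comes from the text, but the trip settings, units and names are invented.

```python
# Illustrative sketch only: trip values and names are invented, not taken
# from the incident report.

SAFETY_FACTOR = 2.0   # the modified plant charged twice the theoretical buffer

def buffer_charge_kg(theoretical_kg: float) -> float:
    """Buffer to charge: double the calculated requirement, so that a modest
    calculation error no longer leaves the batch short of buffer."""
    return SAFETY_FACTOR * theoretical_kg

def abort_batch(agitator_running: bool, temperature_c: float, ph: float) -> bool:
    """Independent protection: abort (add water) on loss of agitation,
    high temperature or low pH. Trip settings are assumptions."""
    TRIP_TEMP_C = 120.0
    TRIP_LOW_PH = 5.0
    return (not agitator_running) or (temperature_c > TRIP_TEMP_C) or (ph < TRIP_LOW_PH)

# A calculation that comes out 40% low, as in the incident, still results in
# 1.2 times the true requirement being charged:
true_requirement = 100.0
miscalculated = 0.6 * true_requirement
print(buffer_charge_kg(miscalculated) / true_requirement)   # 1.2
```

The point of the sketch is the layering: the margin absorbs calculation errors, and the trips remain as an independent defence if the charging itself goes wrong.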

Figure 2.14 Product settling leg on a polyethylene plant. The top valve was normally open and the product take-off valve normally closed

In another case an experienced foreman said that the supports for a new pipebridge were too far apart. The manager (myself) said, 'What you mean is that they are further apart than in the past. New methods of calculation,' I suggested, 'resulted in a cheaper design.'

After a flanged joint on a pipeline on the bridge had leaked, it was found that there had been an error in calculation: the pipebridge had sagged and an extra support had to be added. Despite these two incidents, calculation errors by designers are comparatively rare and I do not suggest any changes to normal design procedures. But anything that looks odd after construction should be checked. What does not look right may not be right.

As the result of a calculation error, a newborn baby was killed by an overdose of a drug. A doctor, working under stress, gave the baby 10 micrograms of the drug per kg of body weight, instead of 1 microgram/kg. The baby's weight was reported in pounds and the error occurred in converting them to kg. An inquiry found the doctors involved not guilty of serious professional misconduct. There are obvious ways of improving the work situation: metric units should be used throughout and labels on the drug containers could quote typical doses for patients of various weights. The newspaper that reported the incident got the dose out by a factor of a thousand; perhaps it was a slip or perhaps the reporter did not know the difference between micrograms and milligrams31.

2.8 Other medical errors
In a long article a United States surgeon has described some of the errors that he and other doctors have made32. He writes:

'... all doctors make terrible mistakes ... [a] study found that nearly four per cent of hospital patients suffered complications from treatment which prolonged their hospital stay or resulted in disability or death, and that two-thirds of such complications were due to errors in care ... It was estimated that, nationwide, 120,000 patients die each year at least partially as a result of such errors in care ... Most surgeons are sued at least once in the course of their careers ... If error were due to a subset of dangerous doctors, you might expect malpractice cases to be concentrated among a small group, but in fact they follow a uniform bell-shaped distribution ... The important question isn't how to keep bad physicians from harming patients, it's how to keep good physicians from harming patients ... Medical equipment is rife with latent errors. Cardiac defibrillators, for example, have no standard design.'

The author goes on to suggest ways of reducing medical errors. Many hospitals hold weekly meetings at which doctors can talk candidly about their errors.
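The drug-dose incident above is, at bottom, unit arithmetic, and it is worth seeing how the two slips scale the dose. The sketch below is hypothetical (the function names and the example weight are mine, not from the inquiry); its point is that one explicit conversion step keeps the unit handling in a single, checkable place.

```python
# Hypothetical illustration of the dose arithmetic; names and numbers are mine.

LB_PER_KG = 2.20462

def dose_micrograms(weight_kg: float, dose_ug_per_kg: float) -> float:
    """Dose in micrograms for a patient weighed in kilograms."""
    return weight_kg * dose_ug_per_kg

weight_lb = 7.0                      # a typical newborn, weighed in pounds
weight_kg = weight_lb / LB_PER_KG    # one explicit conversion step

correct = dose_micrograms(weight_kg, 1.0)      # the intended 1 microgram/kg
ten_fold = dose_micrograms(weight_kg, 10.0)    # the doctor's tenfold slip
print(round(correct, 1), round(ten_fold, 1))   # 3.2 31.8
```

The newspaper's further slip, confusing micrograms with milligrams, multiplies the reported dose by another factor of 1000, which is why unit names should never be abbreviated ambiguously on labels.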

'The doctor is often only the final actor in a chain of events that set him or her up to fail.' Error experts, he says, believe that it's the process, not the individuals in it, which requires closer examination and correction. In a sense they want to industrialize medicine. And they can already claim one success story: the speciality of anesthesiology, which has adopted their precepts and seen extraordinary results. Section 12.5 (page 213) describes two medical errors involving computers.

For several months nurses in a hospital in South Africa found a dead patient in the same bed every Friday morning. Checks on the air-conditioning system and a search for infection produced no clues. Finally, the cause was discovered: every Friday morning a cleaner unplugged the life-support system and plugged in her floor polisher. When she had finished she replaced the plug on the life-support system, but by now the patient was dead. She did not hear any screams above the noise made by the floor polisher33.

2.9 Railways
By considering accidents due to human error in other industries we can check the validity of our conclusion: that occasional slips and lapses of attention are inevitable, that we must either accept them or change the work situation, and that it is little use exhorting people to take more care. Because we are not involved, and do not have to rethink our designs or modify our plants, we may see the conclusion more clearly.

Railways have been in existence for a long time; railway accidents are well documented in the official reports by the Railway Inspectorate and in a number of books5-8,18-20, and they provide examples illustrating all the principles of accident investigation, not just those concerned with human error. Those interested in industrial safety will find that the study of railway accidents is an enjoyable way of increasing their knowledge of accident prevention.

2.9.1 Signalmen's errors
A signalman's error led to Britain's worst railway accident, in 1915, at Quintinshill just north of the Scottish border on the London-Glasgow line, when 226 people were killed, most of them soldiers9. Figure 2.15 (page 36) shows the layout of the railway lines. Lines to London are called up lines; lines from London are called down lines.

Figure 2.15 Layout of lines at Quintinshill, the scene of Britain's worst railway accident. (It was also thought that smoke from the down goods train obliterated the wreckage of the first crash from the driver of the sleeping car express.)

The two loop lines were occupied by goods trains and so a slow north-bound passenger train was backed on to the up line in order to let a sleeping car express come past. The accident occurred because the signalman forgot that there was a train on the up line and accepted another train, though he could see it from his window and had just got off it. The signalman, who had just come on duty, had had a lift on the slow train and had jumped off the foot-plate as it was backing on to the up line. He could see the slow train through the signalbox window. Nevertheless, he completely forgot about it and accepted a south-bound troop train, which ran into the slow train. A minute or so later the north-bound express train ran into the wreckage. The wooden coaches of the troop train caught fire and many of those who survived the first impact were burned to death.

A contributory cause was the failure of the signalman who had just gone off duty to inform the signalman in the next signalbox that the line was blocked and to put a reminder collar on the signal lever. One signalman had a lapse of memory; obviously it was not deliberate. The other signalman was taking short cuts, omitting to carry out jobs which he may have regarded as unnecessary.

2.9.2 What should we do?
As in the other incidents discussed in this chapter, there are three ways of preventing similar incidents happening again:
(1) Change the hardware.
(2) Persuade the operators to be more careful.
(3) Accept the occasional accident (perhaps taking action to minimize the consequences).

(1) Changing the hardware was, in this case, possible but expensive. The presence of a train on a line can complete a 'track circuit' which prevents the signal being cleared, and so protects a signalman who forgets that there is a train outside his signalbox, especially one he has just got off. At the time, track-circuiting was just coming into operation, and the inspector who conducted the official enquiry wrote that Quintinshill, because of its simple layout, would be one of the last places where track-circuiting would be introduced. It was not, in fact, installed there until the electrification of the London-Glasgow line many years later. Although a similar accident could happen today on the many miles of UK branch lines that are still not track-circuited, the consequences would be less serious, as modern all-steel coaches and modern couplings withstand accidents much better than those in use in 1915.

(2) Both signalmen were sent to prison (it was war-time and soldiers had been killed), but prison sentences were probably ineffective. It is doubtful if prison, or the threat of it, can prevent a lapse of memory. Prison, or the threat of it, might prevent people taking short cuts, but a better way is management supervision. Did anyone check that the rules were followed? It would be surprising if the accident occurred on the first occasion on which a collar had been left off or another signalman not informed (see Chapter 5).

(3) In practice, though we do not like to admit it, society has accepted that sooner or later other similar accidents will occur. Sometimes accepting an occasional accident is the right solution. So we tell people to be more careful if the accident is trivial, punish them if the accident is serious and pretend we have done something to prevent the accident recurring. But the accidents arise out of the work situation and, if we cannot accept an occasional accident, we should change the work situation.

Figure 2.16 Track layout at Chinley North Junction

A similar accident to Quintinshill occurred in 1979 on British Railways, though the consequences were less serious. A signalman forgot there was a freight train standing on the wrong line and accepted another train, allowing the passenger train to approach (Figure 2.16). The track-circuiting prevented him releasing the signal, so he assumed there was a fault in the track-circuiting and displayed a green hand signal to the driver of the oncoming train.

As at Quintinshill, there was an irregularity in the signalman's procedures. He should have gone down to the track to give the green hand signal, not displayed it from the signalbox. Had he done so, he might have seen the standing train. He did not even check the illuminated diagram in his signalbox, which would have shown the standing train10. This incident shows how difficult it is to design protective equipment which is proof against all human errors. If signalmen are not permitted to use green hand lamps, what do they do when track-circuiting and signals go out of order? The incident is also an example of a mind-set, discussed later (Section 4.4, page 88): once we have come to a conclusion we close our minds to further evidence and do not carry out the simplest checks.

A change since 1915 is that there was no suggestion that the signalman should be punished. Instead the official report wonders if his domestic worries made an error more likely.

2.9.3 Drivers' errors: signals passed at danger (SPADs)
Many accidents have occurred because drivers passed signals at danger. At Aisgill in the Pennines in 1913, 14 people were killed and the driver, who survived, was imprisoned6. He was distracted by problems with his engine and

the Inquiry criticized the management for the strictness with which they punished delay and incorrect treatment of engines. As a result of this policy drivers were more afraid of being late than of having an accident.

At Harrow in 1952, 112 people were killed; at Moorgate in 1975, 42 people, including the driver, were killed12. The driver is the person most at risk and many other drivers have been killed in this way. Many of these accidents are described in official reports. Davis11 has analysed a number of cases in detail and has shown that while a few of the drivers were clearly unsuited to the job, the majority were perfectly normal men with many years' experience who had a moment's aberration; they have no incentive to break the rules. As with signalmen, we should either accept an occasional error (the probability is discussed in Section 7.8, page 147), which may be tolerable on little-used branch lines, or install a form of automatic train protection.

On the UK Automatic Warning System (AWS) a hooter sounds when a driver approaches a caution (amber or double amber) or a stop (red) signal and, if he does not cancel the hooter, the brakes are applied automatically. It is possible for the driver to cancel the hooter but take no further action. This is the correct procedure if the speed of the train is already low! On busy lines drivers are constantly passing signals at amber and cancelling the alarm can become almost automatic, so accidents still occur.

In March 1989 a British Railways train driver passed a caution signal without slowing. He did not brake until he saw a red signal and his train crashed into the rear of the train in front. Five people were killed and 88 injured. In September 1990, in a throwback to the 19th century, the driver was prosecuted for manslaughter, convicted and sentenced to six months' imprisonment, reduced to four months on appeal, of which two were actually served. Following a similar accident in 1996, another driver was prosecuted for manslaughter but acquitted40. The official report on the accident39 pointed out that the signal involved had been passed at red on several previous occasions. After the accident a 'repeater' was installed to improve its visibility. The managers who did not provide the equipment that could have prevented the accident were not, of course, unconcerned about safety; at best they may have decided that more lives would be saved if the railway's resources were spent in other ways; at worst they did not understand the nature of human error.

Passing signals at danger became increasingly common during the 1980s and in 1986 the Chief Inspector of Railways, in his Annual Report for 1985, asked British Railways to consider a form of Automatic Train Protection (ATP) which could not be

cancelled by the driver and which would automatically apply the brakes to a train passing a stop signal. Even at this date some railway officials did not see the need to do anything more to prevent driver errors. The cost was high and the saying 'The driver is paid to obey signals' was often heard22. Nevertheless, following further accidents, in 1989 British Railways agreed to implement a system over a period of ten years23. However, the cost turned out to be excessive, estimated at some tens of millions of pounds per year to prevent on average two deaths per year, and the Government refused to authorize it. Road engineers could save far more lives with this money.

Ten years later, no changes had been made and Railtrack, the now privatized successor to British Railways, agreed to adopt a simpler but less effective system than that originally proposed. In the Train Protection and Warning System (TPWS), as it is called, the speed of a train is measured. If it is approaching a danger signal, a buffer stop or a speed restriction too rapidly, the brakes are applied automatically. They are also applied if a train passes a signal at danger, whatever the speed36. (Unofficially, TPWS is known as ATP-Lite.) In August 1999 the Government approved regulations requiring TPWS to be fitted to 40% of the track by the end of 2003.

In March 2001 a Public Inquiry43 recommended that ATP should be installed on all major railway lines. ATP was recommended as SPADs have the potential to kill some tens of people in a single accident (31 were killed at Paddington in 1999) and as public opinion is particularly averse to railway accidents38. Many people will see this recommendation as democracy in action. Others may regard it as giving the most to those who shout the loudest.

The psychological mechanisms that may have caused the driver to act as he did are discussed by Reason and Mycielska13. From an engineering viewpoint, it is sufficient to realize that for one reason or another there is a significant chance (see Section 7.8, page 147) of a driver error and we must either accept the occasional error or prevent it by design (see Section 13, page 227). For many years the concentration on human error diverted attention from what could be done by better engineering, such as:
• Improving the visibility of signals, especially those where SPADs have occurred more than once (about two-thirds of all SPADs).
• Providing drivers with diagrams of little-used routes through complex junctions and taking them through on simulators.
• Redesigning track layouts so as to reduce conflicting movements, where one train has to cross the path of another; 90% of the accidents due to SPADs have occurred as a result of such movements.
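The contrast drawn above between AWS (a warning the driver can cancel and then forget) and TPWS (an intervention based on measured speed that the driver cannot cancel) can be sketched in code. This is an illustration of the logic only; the threshold and function names are mine, not the railway's.

```python
# Illustrative comparison of a cancellable warning (AWS-like) with a
# speed-supervised intervention (TPWS-like). The overspeed limit is invented.

def aws_brakes_applied(driver_cancels: bool) -> bool:
    """AWS-like: the hooter sounds at a caution or stop signal; braking
    happens only if the driver fails to cancel. A driver who cancels out
    of habit and takes no further action defeats the protection."""
    return not driver_cancels

def tpws_brakes_applied(speed_mph: float, at_danger_signal: bool,
                        overspeed_limit_mph: float = 25.0) -> bool:
    """TPWS-like: brakes apply if the train passes a danger signal at any
    speed, or approaches it faster than the set limit. Nothing the driver
    does can cancel the intervention."""
    return at_danger_signal or speed_mph > overspeed_limit_mph

# The habitual-cancel slip: AWS gives no protection, TPWS still intervenes.
print(aws_brakes_applied(driver_cancels=True))                     # False
print(tpws_brakes_applied(speed_mph=60.0, at_danger_signal=True))  # True
```

The engineering point is that the second design does not depend on the driver doing anything at all, so a momentary aberration cannot defeat it.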

2.10 Other industries

2.10.1 Aircraft
Many aircraft have crashed because the pilots pulled a lever the wrong way14. For example, most modern jets are fitted with ground spoilers, flat metal plates hinged to the upper surface of each wing, which are raised after touch-down to reduce lift. They must not be raised before touch-down or the aircraft will drop suddenly. On the DC-8 the pilot could either:
(1) Lift a lever before touch-down to arm the spoilers (they would then lift automatically after touch-down); or
(2) Wait until after touch-down and pull the same lever.
It was inevitable that sooner or later someone would move the lever the wrong way. One day a pilot pulled the lever before touch-down. Result: 109 people killed.

The accident was not the fault of the pilot. It was the result of bad design. The reaction of the US Federal Aviation Administration was to suggest putting a notice in each cockpit alongside the spoiler lever saying 'Deployment in Flight Prohibited'. They might just as well have put up a notice saying 'Do Not Crash this Plane'. The manufacturer of the DC-8, McDonnell Douglas, realized the notice was useless but wanted to do nothing. After two, perhaps three, more planes had crashed in the same way they agreed to fit interlocks to prevent the ground spoilers being raised before touch-down.

Many other aircraft accidents are blamed on pilot error when they could have been prevented by better design. Hurst writes:

'Some 60% of all accidents involve major factors which can be dismissed as "pilot error", but I want to state categorically that I do not believe in pilot error as a major cause of accidents. Pilot error accidents occur, it is true. There are a very few rare cases where it seems clear that the pilot wilfully ignored proper procedures and got himself into a situation which led to an accident. But this sort of thing perhaps accounts for one or two per cent of accidents, not 60% ... not because they have been sloppy, careless, or wilfully disobedient, but because we on the ground have laid booby traps for them, into which they have finally fallen. This sort of diagnosis gives a feeling of self-righteousness to those who work on the ground.'15

Figure 2.17 Two methods of attaching a metal plate to a hoist: (a) the screw must be tightened to the right extent; (b) this design is not dependent on someone tightening a screw to the correct extent
2.10.2 Roads
Motorists, like engine drivers, can pass red signals as the result of a temporary lapse of attention, but in this case the police have no option but to assume that the action was deliberate. An 'Accident Spotlight' issued jointly by a police force and a local radio station gave a diagram of a road junction where seven injury-causing accidents occurred in one year and then said, 'Principal cause: road user error'. There is no reason to believe that road users behave any worse at this junction than at any other and little hope that they will change their ways. Drawing attention to the number of accidents that occur at the junction may persuade a few people to approach it more cautiously, but it would be better to redesign the junction. That, however, is expensive. It is cheaper to blame road users.

2.10.3 Mechanical handling
A plate fell from a clamp while being lifted because the screw holding it in position had not been tightened sufficiently (Figure 2.17(a)). The accident was put down as due to human failing and the operator was told to be more careful in future. The operator knew what to do but for some reason did not do it. The accident could have been prevented by better design: it would have been better to use a type of clamp such as that shown in Figure 2.17(b), which is not dependent for correct operation on someone tightening it up to the full extent.

2.11 Everyday life (and typing)
A man whose telephone number is 224 4555 gets frequent calls for a doctor whose number is 224 5555. It seems that people look up the doctor's number, say 'four fives' to themselves and then key a four, followed by several fives. People may be under some stress when telephoning the doctor and this may make errors more likely.

I am indebted to my former secretary, Eileen Turner, for the following incident. In an unusually houseproud mood, she cleaned the bedroom before going early to bed one night. She woke the next morning at six o'clock and, finding she couldn't get back to sleep, decided to get up and wash her hair. After showering, brushing her teeth and washing her hair, she went into the living room, where after a few minutes she noticed that the time by the

rather old clock there was ten past one (Figure 2.18). The clock had obviously had its day and was going haywire, but Eileen went to the bedroom to check. On first glance the time was now twenty to seven, but closer examination showed that the clock was upside down! (Figure 2.18). To prevent a similar incident happening again she could change the hardware, by modifying the top of the clock so that it could not be put down upside down, or change the software, that is, give up dusting!

Figure 2.18 Two ways of putting a clock down

The writer Bill Bryson admits that he has gone out to buy some tobacco and post a letter, bought the tobacco first and then posted it. He also reports that a bank robber who covered his face with a balaclava forgot to remove his identity badge, with his name on, from his clothing.

A contractor was hired to demolish a row of dilapidated cowsheds. He sent a driver with an excavator to do the job. Instead of the cowsheds on one side of the road, the driver demolished an historic 18th century farmhouse on the other side42.

Fire brigade officers (not in the UK) were provided with a 50-page booklet which listed in three columns the names of chemicals they might come across, their United Nations numbers and the fire-fighting agent to be used. The typist left the first line in the third column blank and moved all the other entries in that column one line down, as shown in Table 2.1. The booklets were in use for many months before the error was spotted.

Table 2.1 The top line in the third column was accidentally left blank and all the entries in the column were moved one row down

United Nations number    Name    Fire-fighting agent
1001                     Abcd
1002                     Efgh    Water
1003                     Ijkl    Dry powder
1004                     Mnop    Foam
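The booklet error is a classic failure mode of data kept in parallel columns: one missing cell silently re-pairs every later row, and nothing looks obviously wrong. A sketch of the idea, using the placeholder values from Table 2.1 (the validation at the end is my own illustration, not anything the fire brigade used):

```python
# Illustrative sketch of the Table 2.1 failure: three parallel columns, with
# one blank cell at the top of the third column shifting every later entry.

numbers = ["1001", "1002", "1003", "1004"]
names   = ["Abcd", "Efgh", "Ijkl", "Mnop"]
# The intended agents column began ["Water", "Dry powder", "Foam", ...];
# the typist's blank first cell pushed each agent down one row:
agents  = ["", "Water", "Dry powder", "Foam"]

# Every chemical after the blank is now paired with the wrong agent:
print(dict(zip(names, agents)))

# Storing each row as one record, and rejecting incomplete records, turns the
# silent mispairing into a visible error:
rows = list(zip(numbers, names, agents))
incomplete = [row for row in rows if not all(row)]
print(incomplete)   # [('1001', 'Abcd', '')]
```

Like the check on reference numbers suggested earlier in the chapter, the design goal is that a single slip should produce an obviously invalid result rather than a plausible wrong one.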

The following two 'human failing' accidents occurred during demonstration lectures.

'A deplorable accident has taken place at the Grenoble Lycée. The professor of chemistry was lecturing on salts of mercury, and had by his side a glass full of a mercurial solution. In a moment of distraction he emptied it, believing he was drinking a glass of eau sucrée. The unfortunate lecturer died almost immediately.'16

'The experimenter demonstrated the power of nitric acid to the subject by throwing a penny into it. The penny, of course, was completely disintegrated ... While the subject's view of the bowl of acid was cut off by the experimenter, an assistant substituted for it a like-sized bowl of ... water ... The hypnotized subject was then ordered to throw the dish of nitric acid (in actual fact, of course, innocuous water) over the assistant who was present in the same room. Under these conditions it was possible to induce, under hypnosis, various subjects to throw what they considered to be an extremely dangerous acid into the face of a human being ... Actually, in this particular experiment the person in charge made what he calls a "most regrettable mistake in technique" by forgetting to change the nitric acid to the innocuous dish of water, so that in one case the assistant had real nitric acid thrown over him.'17

The first accident could be prevented by using different containers for drinks and laboratory chemicals, and the second by not indulging in such a foolish experiment. Many other slips, mostly trivial in their consequences, are described by Reason and Mycielska1.

2.12 Fatigue
Fatigue may make slips, and perhaps errors of other sorts, more likely. In the Clapham Junction railway accident (see Section 6.5, page 118) a contributory factor to a slip was 'the blunting of the sharp edge of close attention' due to working seven days per week without a day off. The man involved was not, however, physically tired.

The report on a chemical plant accident (according to press reports a flare stack was filled with oil which overflowed, producing flames several hundred feet high) said, 'It is clear that in this case operating teams overlooked things that should not have been overlooked, and misinterpreted instrument readings. A major influence over the behaviour of the operating teams was their tiredness and frustration.' A trade union leader is quoted as saying that the


management team members were more tired than the operators as the managerswere working 12-hour shifts24. Obviously we shoulddesignwork schedules that do not produce excessive fatigue, but inevitably some people will be tired from time to time as the result of factors that have nothingto do with work. When we can we should design plants and methods of working so that fatigue (like other causes of error)does not haveseriouseffectson safety, outputand efficiency.

References in Chapter 2

1. Reason, J. and Mycielska, K., 1982, Absent-minded? The Psychology of Mental Lapses and Everyday Errors (Prentice-Hall, Englewood Cliffs, New Jersey, USA).
2. Technical Data Note 46, Safety at Quick-opening and Other Doors of Autoclaves, 1974 (Factory Inspectorate, London, UK).
3. Guidance Note PM/4, High Temperature Dyeing Machines, 1976 (HMSO, London, UK).
4. Kletz, T.A., 1980, Loss Prevention, 13: 1.
5. Rolt, L.T.C., 1987, Red for Danger, 4th edition (David and Charles, Newton Abbot, UK).
6. Schneider, A. and Masé, A., 1968, Railway Accidents of Great Britain and Europe (David and Charles, Newton Abbot, UK).
7. Hamilton, J.A.B., 1981, Trains to Nowhere, 2nd edition (Allen and Unwin, London, UK).
8. Gerard, M. and Hamilton, J.A.B., 1984, Rails to Disaster (Allen and Unwin, London, UK).
9. Hamilton, J.A.B., 1969, Britain's Greatest Rail Disaster (Allen and Unwin, London, UK).
10. Olver, P.M., 1981, Railway Accident: Report on the Collision that Occurred on 14th February 1979 at Chinley North Junction in the London Midland Region of British Railways (HMSO, London, UK).
11. Davis, D.R., 1966, Ergonomics, 9: 211.
12. McNaughton, I.K.A., 1976, Railway Accident: Report on the Accident that Occurred on 28th February 1975 at Moorgate Station on the Northern Line, London Transport Railways (HMSO, London, UK).
13. As Reference 1, page 204.
14. Eddy, P., Potter, E. and Page, B., 1976, Destination Disaster (Hart-Davis and MacGibbon, London, UK).
15. Hurst, R. and Hurst, L.R. (editors), 1982, Pilot Error, 2nd edition (Aronson, New York, USA).
16. Eysenck, H.J., 1957, Sense and Nonsense in Psychology (Penguin Books, London, UK).
17. Nature, 18 March 1880, quoted in Nature, 20 March 1980, page 216.
18. Nock, O.S. and Cooper, B.K., 1982, Historic Railway Disasters, 4th edition (Allen and Unwin, London, UK).
19. Hall, S., 1987, Danger Signals (Ian Allan, London, UK).
20. Reed, R.C., 1968, Train Wrecks (Bonanza Books, New York, USA).
21. Loss Prevention Bulletin, 1989, No. 090, page 29.
22. As Reference 19, page 126.
23. Hidden, A. (Chairman), 1989, Investigation into the Clapham Junction Railway Accident, Sections 14.27–14.31 and 15.8–15.12 (HMSO, London, UK).
24. Evening Gazette (Middlesbrough), 24 and 25 August 1987.
25. Miller, J. in Silvers, R.B. (editor), 1997, Hidden Histories of Science, page 1 (Granta Books, London, UK and New York Review of Books, New York, USA).
26. Operating Experience Weekly Summary, 1998, No. 98–52, page 3 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA).
27. Operating Experience Weekly Summary, 1999, No. 99–17, page 1 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA).
28. Sykes, W.S., 1960, Essays on the First Hundred Years of Anaesthesia, Chapter 1 (Churchill Livingstone, Edinburgh, UK).
29. The Phillips 66 Company Houston Chemical Complex Explosion and Fire, 1990 (US Department of Labor, Washington, DC, USA).
30. Tyler, W.H., 1864, in Public Record Office file MT 29 25, page 676, quoted by Jarvis, P., 1999, Ffestiniog Railway Magazine, Spring, No. 164, page 329.
31. Weaver, M., Daily Telegraph, 6 March 1999; Deacock, A.R., ibid., 8 March 1999; Davies, C., ibid., 21 April 1999; Poole, O., ibid., 14 April 1999.
32. Gawande, A., 1999, The New Yorker, 1 February, page 20.
33. Cape Times, 13 June 1996, quoted in IRR News, August 1997 (Institute for Risk Research, Toronto, Canada).
34. Hall, S., 1989, Danger on the Line (Ian Allan, London, UK).
35. Hall, S., 1996, British Railway Accidents (Ian Allan, London, UK).
36. Modern Railways, June 1999, page 417.
37. Modern Railways, September 1999, page 659.
38. Evans, A., Modern Railways, September 1999, page 638.
39. Report on the Collision that Occurred on 4th March 1989 at Purley in the Southern Region of British Railways, 1990 (HMSO, London, UK).
40. Marsh, A., Daily Telegraph, 23 March 1998.
41. Bryson, B., 1998, Notes from a Big Country, pages 104 and 294 (Doubleday, London, UK).
42. Pile, S., undated, The Return of Heroic Failures, page 18 (Secker & Warburg, London, UK).
43. Uff, J. and Cullen, W.D., 2001, The Joint Inquiry into Train Protection Systems (HSE Books, Sudbury, UK).


Accidents that could be prevented by better training or instructions
'I am myself under orders, with soldiers under me. I say to one, "Go!" and he goes; to another, "Come here!" and he comes; and to my servant, "Do this!" and he does it.'
A centurion, in St. Matthew, Chapter 8, Verse 9 (New English Bible)


'The painful task of thinking belongs to me. You need only obey implicitly without question.'
Admiral Sir George Rodney, to his captains, 1780

3.1 Introduction
These quotes may have been true at one time, and they may still be true in some parts of the world, but they are no longer true in Western industry and I doubt if they are still true in the armed services. Tasks have changed and people's expectations have changed. According to a book on the war-time Home Guard, the part-time defence force25, 'As General Pownall knew from bitter experience, the Home Guard could be coaxed and persuaded to act in a certain way, but it could not be treated like the army where orders were simply orders.'

Training as used here includes education (giving people an understanding of the technology and of their duties) and also includes teaching them skills such as the ability to diagnose faults and work out the action required, while instructions tell them what they should and should not do. Oversimplifying, we can say that training should prevent errors in knowledge-based and skill-based behaviour while instructions should prevent errors in rule-based behaviour. These errors are called mistakes to distinguish them from the slips and lapses of attention discussed in Chapter 2.

If a task is very simple, instructions may be sufficient, but in the process industries most tasks are no longer simple. We cannot lay down detailed instructions to cover every contingency. Instead we have to allow people judgement and discretion and we need to give them the skills and knowledge they need to exercise that judgement and discretion. We should give them the skill and knowledge needed to diagnose what is wrong and decide what



action to take. We shall look at some accidents that occurred because people were not adequately trained.

A biologist who has studied the ability of insects to learn writes that this ability has evolved when the environment is too unpredictable for fixed, inborn behaviour patterns to be appropriate. The individual, if it is to survive, has to learn how to adjust its behaviour in response to changing circumstances. Many insects — indeed, most of those investigated — have been shown to be capable learners39. In the same way, process plants are too unpredictable to be always operated by fixed rules. Operators have to learn how to adjust their behaviour in response to changing circumstances.

People's expectations have changed. They are no longer content to do what they are told just because the boss tells them to. Instead they want to be convinced that it is the right thing to do. We need to explain our rules and procedures to those who will have to carry them out, and discuss them with them, so that we can understand and overcome their difficulties. Of course, this argument must not be carried too far. Nine of your staff may agree on a course of action. The tenth man may never agree. He may

have to be told, 'We have heard and understand your views but the rest of us have agreed on a different course of action. Please get on with it.'

The man who thinks he knows but does not is more dangerous than the man who realizes he is ignorant. Reference 20 describes accidents which occurred as the result of incorrect beliefs.

Always check to make sure that training and instructions have been understood. The message received may not be the same as the one transmitted. Sometimes no message is received. If operators are given a list of flash points or occupational exposure standards, do they know what they mean or how they should use them?

The examples that follow illustrate some of the ways we can prevent mistakes — that is, accidents that occur because people do not know the correct action to take:

• The obvious way is to improve the training or instructions, but before doing so we should try to simplify the job or redesign the work situation so as to reduce opportunities for error (most of this chapter).
• Train contractors as well as employees (Section 3.3.4, page 56).
• Tell people about changes in equipment or procedures (Section 3.3.5).
• Explain the designer's intentions and the results of not following them (Sections 3.3.7 and 3.3.8, page 61).
• Avoid contradictions (Section 3.5, page 65).
• Make people aware of the limitations of their knowledge (page 66).
• Check that instructions are easy to read, readily available and written to help the reader rather than protect the writer. Explain and discuss new instructions with the people who are expected to follow them (Section 3.7).
• Train people to diagnose faults and act accordingly. Instructions cannot cover all eventualities (Section 3.8, page 72).
• Include the lessons of past accidents in training (Sections 3.3.1 and 3.3.3, pages 53 and 54).
Discussions are better than lectures or writing, as people have an opportunity for feedback and they remember more.

Sometimes accidents have occurred because of a lack of sophisticated training, as at Three Mile Island; sometimes because of a lack of basic training.

3.2 Three Mile Island
The accident at Three Mile Island nuclear power station in 1979 had many causes and many lessons can be drawn from it1,2 but some of the most important ones are concerned with the human factors. In particular, the training the operators had received had not equipped them to deal with the events that occurred. To understand these I must briefly describe the design of the pressurized water reactor of the type used at Three Mile Island, and the events of 28 March 1979.

Figure 3.1 shows a very simplified drawing. Heat generated in the core by radioactive fission is removed by pumping primary water round and round it. This water is kept under pressure to prevent it boiling (hence it is called a pressurized water reactor, to distinguish it from a boiling water reactor). The primary water gives up its heat to the secondary water, which does boil. The resulting steam then drives a turbine, before being condensed and the condensate recycled. All the radioactive materials, including the primary water, are enclosed in a containment building so that they will not escape if there is a leak.

The trouble started when a choke occurred in a resin polisher unit, which removes impurities from the secondary water. To try to clear the choke the operators used instrument air. Unfortunately it was at a lower pressure than the water. As a result water got into the instrument air lines and the turbine tripped.

Figure 3.1 A pressurized water reactor — simplified

This stopped the heat being removed from the radioactive core, and this caused the primary water to boil. The production of fission heat stopped automatically because silver rods which absorb neutrons dropped down into the core and stopped the radioactive fission. However, heat was still produced by radioactive decay at about 6% of the full load. The pilot operated relief valve (PORV) lifted and the make-up pumps started up automatically to replace the water which had evaporated from the primary circuit. Unfortunately the PORV stuck open. The operators did not realize this because a light on the panel told them it was shut. However, this light was not operated by the valve position but by the signal to the valve. The operators had not been told this or had forgotten. Several other readings should have suggested to the operators that the PORV was stuck open and that the primary water was boiling. However, they chose to believe the PORV light and ignore the other readings, partly because they did not really understand how the temperature and pressure in the primary circuit depended on each other, and partly because their instructions

and training had emphasized that it was dangerous to allow the primary circuit to get too full of water. The operators thought the PORV was shut. Conditions were clearly wrong and their training had emphasized the danger of adding too much water. They therefore shut down the make-up water pumps. With the make-up water isolated, the level in the primary circuit fell and damage occurred.

Note that the only action taken by the operators made matters worse. If they had done nothing, the system would have cooled down safely on its own.

The training of the operators was deficient in three major respects:
(1) It did not give them an understanding of the phenomena taking place in the primary water circuit; they did not understand how the pressure and temperature were related.
(2) It did not tell them how to recognize a small loss of water — though it covered a major loss such as would occur if there was a break in the primary circuit — and what action to take if this occurred.
(3) It did not train them in the skills of diagnosis. We cannot foresee everything that will go wrong and write instructions accordingly — though what did go wrong should have been foreseen — and so we need to train operators to diagnose previously unforeseen events. One successful method of doing so has been described by Duncan and co-workers3. The operator is shown a mock-up display or panel on which various readings are shown. From them he has to diagnose the fault and say what action he would take. The problems gradually increase in difficulty. The operator learns how to handle all foreseeable problems and acquires general skills which will help him handle unforeseen problems.

The accident at Chernobyl nuclear power station was also due to human error but in this case the primary error was a failure to follow instructions, though lack of understanding may have played a part (see Section 5.1, page 101).

Better training will make a repeat of Three Mile Island less likely but a better solution, in the long term, is to develop designs of nuclear reactor that are more user-friendly — in particular, less affected by human error (of any sort) or equipment failure. Gas-cooled reactors and fast reactors (the fast refers to the speed of the neutrons) are friendlier than water-cooled reactors and a number of other inherently safer designs have been proposed2.

3.3 Other accidents that could be prevented by relatively sophisticated training
The training required may not seem very sophisticated until we compare the incidents described in Section 3.4 (page 62).

3.3.1 Re-starting a stirrer
A number of accidents have occurred because an operator found that a stirrer (or circulation pump) had stopped and switched it on again. Reactants mixed suddenly with violent results. For example, an acidic effluent was neutralized with a chalk slurry in a tank. The operator realized that the effluent going to drain was too acidic. On looking round, he found that the stirrer had stopped. He switched it on again. The acid and the chalk reacted violently, blew off the manhole cover on the tank and lifted the bolted lid. No-one was injured.

A similar incident had occurred on the same plant about four years earlier. An instruction was then issued detailing the action to be taken if the stirrer stopped. No copy of it could be found on the plant. The operator had not seen the instruction or, if he had seen it, he had forgotten it.

It is difficult to prevent accidents such as this by a change in design. An alarm to indicate that the stirrer had stopped might help and the manhole on the tank should be replaced by a hinged lid which will lift when the pressure rises.

Instructions, like labels, are a sort of protective system and like all protective systems they should be checked regularly to see that they are complete and maintained as necessary. Initial training is insufficient. Regular refreshers are necessary. It is also useful to supplement training with a plant 'black book', a folder of reports on incidents that have occurred, both from the plant and other similar plants. It should not be cluttered up with reports on tripping accidents and other trivia, but should contain reports on incidents of technical interest. It should be compulsory reading for newcomers and it should be kept in the control room so that others can dip into it in odd moments.

3.3.2 Clearing choked lines
Several incidents have occurred because people did not appreciate the power of gases or liquids under pressure and used them to clear chokes. For example, high pressure water wash equipment was being used to clear a choked line. Part of the line was cleared successfully, but one section remained choked so the operators decided to connect the high pressure water

directly to the pipe. As the pressure of the water was 100 bar (it can get as high as 650 bar) and as the pipe was designed for only about 10 bar, it is not surprising that two joints blew. Instead of suspecting that something might be wrong, the operators had the joints remade and tried again. This time a valve broke.

On another occasion gas at a gauge pressure of 3 bar — which does not seem very high — was used to clear a choke in a 2 inch pipeline. The plug of solid was moved along with such force that when it hit a slip-plate it made it concave. Calculation, neglecting friction, showed that if the plug weighed 0.5 kg and it moved 15 m, then its exit velocity would be 500 km/hour!

An instrument mechanic was trying to free, with compressed air, a sphere which was stuck inside the pig chamber of a meter prover. Instead of securing the chamber door properly he fixed it by inserting a metal rod — a wheel dog — into the top lugs. When the gauge pressure reached 7 bar the door flew off and the sphere travelled 230 m before coming to rest, hitting various objects on the way4.

In the incident described in Section 2.2 (page 13), the other operators on the plant found it hard to believe that a pressure of 'only 30 pounds' caused the door of the filter to fly open with such force. They wondered if a chemical explosion had occurred. The operators did not understand the difference between force and pressure. They did not understand that a force of 30 pounds was exerted against every square inch of a door that was 3.5 feet in diameter and that the total force on the door was therefore 20 tons. Many operators find it hard to believe that a 'puff of air' can damage steel equipment.

In all these cases it is clear that the people concerned had no idea of the power of liquids and gases under pressure. Everyone should know the safe working pressure of their equipment and should never connect up a source of higher pressure without proper authorization by a professional engineer who should first check that the relief system is adequate.

3.3.3 Failures to apply well-known knowledge
It is not, of course, sufficient to have knowledge. It is necessary to be able to apply it to real-life problems. Many people seem to find it difficult. They omit to apply the most elementary knowledge. For example, scaffolding was erected around a 75 m tall distillation column so that it could be painted. The scaffolding was erected when the column was hot and then everyone was surprised that the scaffolding became distorted when the column cooled down5.
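The 'force and pressure' figure quoted above and the scaffolding surprise can both be checked in a few lines. The sketch below is illustrative only: the long-ton conversion, the steel expansion coefficient and the 100°C cool-down of the column are assumptions of mine, not figures from the incidents.

```python
import math

# Rough checks of two figures discussed in the text.
# Assumptions: 1 long ton = 2240 lb; steel expansion coefficient
# 1.2e-5 per degC; a guessed 100 degC cool-down for the column.

# Force on a 3.5 ft (42 inch) diameter door at 30 psi gauge.
area_in2 = math.pi * (42 / 2) ** 2   # door area in square inches
force_lb = 30 * area_in2             # 30 pounds on every square inch
force_tons = force_lb / 2240         # about 18.6 long tons

# Contraction of a 75 m steel column cooling by about 100 degC.
shrink_mm = 1.2e-5 * 75 * 100 * 1000  # about 90 mm
```

With these assumptions the door carries about 18.6 tons of force, the text's 'therefore 20 tons' in round figures, and the column shortens by roughly 90 mm, easily enough to distort scaffolding fixed around it.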

We all know the weight of water but many people are surprised by the pressure produced by a column of it (Figure 3.2). A choked strainer had to be removed. There was a column of hot water 4 feet (1.2 m) tall above the liquid. The joint above the liquid was broken to let the water run out. It came out with such force that two men several feet away were sprayed and scalded. As a rough rule of thumb, a column of liquid x feet tall may spread x feet through a cracked joint.

Figure 3.2 A choked strainer had to be removed; a 4 feet column of hot water stood above it

Many pressure vessels have burst when exposed to fire. For example, at Feyzin in France in 1966 a 2000 m3 sphere containing propane burst, killing 18 people and injuring many more10. The vessel had been exposed to fire for 1½ hours before it burst and during this time the fire brigade had, on the advice of the refinery staff, used the available water for cooling surrounding vessels to prevent the fire spreading. The relief valve, it was believed, would prevent the vessel bursting. Below the liquid level the boiling liquid kept the metal cool, but above the liquid level the metal softened and burst.

I have discussed the Feyzin fire on many occasions with groups of students and with groups of experienced operating staff and their reaction is often the same: the relief valve must have been faulty or too small or there must have been a blockage in its inlet or exit pipe. When they are assured that the relief valve was OK it may still take them some time to realize that the vessel burst because the metal got too hot and lost its strength.
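The pressure under the 4 feet column of water in the strainer incident above yields to elementary arithmetic. A minimal check, assuming a water density of 1000 kg/m3 (hot water is in fact slightly lighter):

```python
# Hydrostatic pressure under a 4 feet (1.2 m) column of water.
# Assumptions: density 1000 kg/m3, g = 9.81 m/s2, 1 psi = 6895 Pa.
rho, g, h = 1000.0, 9.81, 1.2
p_pa = rho * g * h    # pressure at the broken joint, in pascals
p_bar = p_pa / 1e5    # about 0.12 bar
p_psi = p_pa / 6895   # about 1.7 psi
```

About 0.12 bar does not sound like much, but it is enough to drive scalding water several feet through a broken joint.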

The Institution of Chemical Engineers provides sets of notes and slides for use in this way11.

Everyone knows that metal loses its strength when it becomes hot, but there was a failure to apply that knowledge, both by the refinery staff at the time and by the people who attended my discussions. However, today people realize why the sphere burst more quickly than they did in the past. Most of us keep 'book learning' and 'real life' in separate compartments and the two rarely meet. One method of helping to break down the barrier between them is by the discussion of incidents such as Feyzin. The group puzzle out why it occurred and say what they think should be done. One of the major problems in education is not giving knowledge to people, but persuading them to use it.

Here is another example of failure to apply well-known knowledge because it belongs to a different compartment of life. Some empty 45-gallon drums were transported from sea level to a site at 7200 feet (2200 m). The pressure inside the drums was then 3 psi (0.2 bar) above the outside pressure. When the drums were being opened, the lid of one was forcibly ejected. Afterwards people recalled that similar incidents had occurred before and that a restraining device had been acquired to hold the lids in position while the pressure was released26. We all know the pressure falls as the altitude increases but no-one applied the knowledge.

Failure to apply the knowledge we have is not, of course, a problem peculiar to plant operators.

3.3.4 Contractors
Many accidents have occurred because contractors were not adequately trained. Many pipe failures have occurred because contractors failed to follow the design in detail or to do well what was left to their discretion12. Many of them do not realize the nature of the materials that will go through the completed pipelines and the fact that leaks may result in fires, explosions, poisoning or chemical burns.

Storage tanks are usually made with a weak seam roof, so that if the tank is overpressured the wall/roof seam will fail rather than the wall/floor seam. On occasions contractors have strengthened the wall/roof seam, not realizing that it was supposed to be left weak. The remedy lies in better inspection after construction, but is it also possible to give contractors' employees more training in the consequences of poor workmanship or short cuts on their part? Many engineers are sceptical of the value of such training. The typical construction worker, they say, is not interested in such things. Nevertheless, it might perhaps be tried. (Construction errors are discussed further in Chapter 9.)
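The drum incident in Section 3.3.3 above can be checked with the barometric formula. This is a rough sketch: the isothermal-atmosphere assumption and the 8400 m scale height are approximations I have chosen, not data from the report.

```python
import math

# Pressure difference across a sealed drum carried from sea level
# to 7200 feet (2200 m). Assumptions: isothermal atmosphere with a
# scale height of about 8400 m; 1 psi = 6.895 kPa.
p_sea_kpa = 101.325
p_site_kpa = p_sea_kpa * math.exp(-2200 / 8400)  # about 78 kPa at altitude
dp_kpa = p_sea_kpa - p_site_kpa                  # about 23 kPa
dp_psi = dp_kpa / 6.895                          # about 3.4 psi
```

The result, about 3.4 psi, agrees with the 3 psi (0.2 bar) quoted in the text.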

3.3.5 Information on change
Accidents have occurred because changes in design necessitated changes in construction or operating methods, but those concerned were not told. Our first example concerns construction.

In Melbourne, Australia in 1970, a box girder bridge, under construction across the Yarra river, collapsed. The cause was not errors in design, but errors in construction: an attempt to force together components that had been badly made and did not fit. It is customary for construction teams to force together components that do not fit. However, with a box girder bridge — then a new type — components must not be forced together; if they do not fit they must be modified. No-one told those concerned. The consulting engineers made no attempt to ensure that the contractors understood the design philosophy and that traditional methods of construction could not be used. Nor did they check the construction to see that it was carried out with sufficient care13.

A 25 tonne telescopic jib crane was being used to remove a relief valve, weight 117 kg, from a plant. The jib length was 38 m and the maximum safe radius for this jib length is 24 m. The driver increased the radius to 31 m and the crane fell over onto the plant (Figure 3.3). Damage was slight but the plant had to be shut down and depressured while the crane was removed.

Figure 3.3 This crane tried to lift too great a load for the jib length and elevation

The crane was fitted with a safe load indicator of the type that weighs the load through the pulley on the hoist rope; it does not take into account the weight of the jib. Because of this the driver got no warning of an unsafe condition and, as he lifted the valve, the crane overturned. The driver had been driving telescopic jib cranes for several years but did not appreciate the need not to exceed the maximum jib radius. He did not realize that the crane could be manoeuvred into an unstable position without an alarm sounding. Those responsible for training the driver had perhaps failed to realize themselves that the extra degree of freedom on a telescopic jib crane — the ability to lengthen the jib — means that it is easier to manoeuvre the crane into an unsafe position. They had also failed to realize that an extra degree of freedom requires a change in the method of measuring the approach to an unsafe condition. Certainly they had failed to take this into account in the training of the driver.

At a more prosaic level, everyone knows that substances contract on cooling. In the summer of 1974, a plastic gas main was laid in a street in Freemont, Nebraska, alongside a hotel. The pipe was 106 m long and was fixed to metal mains at each end by compression couplings. In the following winter the pipe contracted about 75 mm14 and nearly pulled itself out of the couplings. The next winter was colder and the pipe came right out. Gas leaked into the basement of the hotel and exploded. Apparently nobody told the men who installed the pipe that when plastic pipe is used in place of metal pipe, it is necessary to allow for contraction. (This might be classified as an accident due to failure to apply well-known knowledge.)

Accidents have also occurred because people were not told of changes made while they were away. An effluent had to be neutralized before it left a plant. Sometimes acid had to be added, sometimes alkali. Sulphuric acid and caustic soda solution were supplied in similar plastic containers (polycrates). The acid was kept on one side of the plant and the alkali on the other. While an operator was on his days off someone decided it would be more convenient to have a container of acid and a container of alkali on each side of the plant. When the operator came back no-one told him about the change. Without checking the labels he poured some excess acid into a caustic soda container. There was a violent reaction and he was sprayed in the face. Fortunately he was wearing goggles15 (Figure 3.4).

Figure 3.4 A change was made while an operator was away. Without checking the labels, he poured some excess acid into a caustic container and it blew out into his face. Always check labels; tell people about changes made while they were away

An operator who had been on temporary loan to another plant returned to his normal place of work. He was asked to charge some catalyst to the plant. A small catalyst storage tank had the connections shown on the left of Figure 3.5. This used to be done by opening valve B, but valve B became blocked and so the lines were rearranged as shown on the right. Nobody told him about the change. When he got to the catalyst tank he noticed that something seemed to be different. The blanked valve B was obviously not the feed valve so he opened the valve alongside it. Several tons of an expensive catalyst drained into a sump where they reacted with rainwater and produced toxic fumes. A better operator would have traced the lines to make quite sure before opening any valve, but proper labelling would have prevented the accident. It is bad practice to rely on operational methods to overcome poor design.

Figure 3.5 When valve B became blocked the lines were rearranged but nobody told the operator, who was away at the time

A common failing is not changing instructions or training after a change in organization. To quote from a report on a case of water hammer, 'operational responsibility for the laundry steam system had been assigned to a new organization. The previous organization's practice had been to drain the steam system at a known low point before re-introducing steam following an outage. The new organization was not given a turnover on how to drain the system and they tried to drain it at a location that was not the low point.' Fortunately there was no damage to the steam lines, only violent banging, but water hammer has ruptured steam lines on many other occasions27. More steam traps should have been fitted.

3.3.6 Team working
With the reduction in manpower on many plants it has become fashionable to employ 'team working'. The operators are not assigned specific tasks but each is capable of carrying out the whole range of operations and each task is done by whoever is most conveniently available at the time. This obviously encourages efficiency, gives people a larger range of interests and sounds good, but there are snags:
• Extensive training is necessary to make sure that every operator can really carry out the full range of tasks.
• A situation in which anyone can do a job can easily become a situation in which no-one does it. Everybody thinks someone else is doing it. A system is needed for making sure that this does not occur.

One incident occurred (in part) because a control room operator asked the outside operators, over the loudspeaker, to close a valve. There were two outside operators who worked as a team. The control room operator did not ask a particular outside man to close the valve. Both were some distance away, and each left it to the other whom he thought would be nearer.

3.3.7 The limitations of instructions
The following incidents show what can occur when an unforeseen problem arises and people have not been trained to understand the hazards and the design intention.

Aluminium powder was manufactured by spraying molten liquid through an atomizing nozzle. It was then sorted by size by conveying it pneumatically through a series of cyclones and screens. The concentration of the powder could not be controlled but this did not matter as the oxygen content of the exhaust gas was low. The powder was then transferred into road tankers and offloaded using the exhaust from the tanker's engine. The concentration of powder was kept below the minimum concentration at which a dust explosion was possible.

One day the contents of a tanker had to be recycled through the sorting section of the plant. The operators decided to empty the tanker in the usual way, using its exhaust gas. The tanker was put as near to the plant as possible but its hoses would not reach. The operators looked for an extension hose but the only one they could find had a larger bore. They stuck the tanker's hose inside the end of the larger one, filled the gap with rags and wrapped tape round the joint. They then realized that the velocity in the larger hose would be low, so to assist the transfer they inserted a compressed air line inside the joint between the two hoses. They did not see anything wrong with this as air was used for moving the powder inside the plant. They did not understand anything about explosive concentrations of either powder or oxygen. When the transfer started there was an explosion. The source of ignition was an electrostatic discharge from the unearthed metal parts of the large hose28.

A Hazop gives the commissioning or operating manager a good understanding of the designer's intentions, which he can then include in the operating instructions. A Hazop also brings out the consequences of not following those intentions and these can be included in the plant instructions.

3.3.8 Ignorance of what might occur
Air filters contaminated with lithium hydride were made safe by placing them in a vat of water. The water reacted with the hydride forming lithium hydroxide and hydrogen. No problem. One day a filter contained more lithium hydride than usual. The heat developed by the reaction burnt the wooden frame of the filter, even though it was under water; some bubbles of air were probably entrapped. The hydrogen exploded, blew the lid off the vat and scattered pieces of the filter29.

At one time old filters were perforated to release trapped air before they were immersed in water. This incident is also an example of forgotten knowledge, forgotten when experienced people left and were replaced (see also Section 3.7, page 67).

Hazard and operability studies (Hazops) force us to consider deviations such as 'More of', but simple operations such as putting a piece of discarded material in water are not usually Hazoped. This incident shows that they should be when hazardous materials are handled. Once we ask whether more lithium hydride than usual could be present we are halfway to preventing the accident. Obvious reasons are more hydride in the air being filtered or delay in changing a filter.

3.4 Accidents that could be prevented by elementary training
This section describes some incidents which occurred because people lacked the most elementary knowledge of their duties or the properties of the material they handled — knowledge so basic that it was considered unnecessary to impart it. What seems obvious to us may not be obvious to others.

(1) Many accidents have occurred because operators wrote down abnormal readings on the plant record sheets but did nothing about them. They thought their job was just to take readings, not to respond to them. There is an example in Section 4.2 (page 91).

(2) Men were sent to look for chlorine leaks. They did not expect to find any so they did not wear breathing apparatus; yet if there was a need to test, leaks were clearly possible. Unfortunately there were some leaks, and the men were affected by the chlorine.

(3) A liquid was transferred into a tank by vacuum. The tank was emptied with a vacuum pump and then the pump was shut down and isolated and the liquid drawn into the tank. This was not in the instructions but was a 'trick of the trade'. Someone told them that spray from the inflow would otherwise be drawn into the pump. The operators said, 'No, it doesn't: you see, all the time the pump is running there's a strong stream of "vac" running into the tank and it keeps the splashes away.'16 It is fair to point out that this incident occurred many years ago.
'No. Unfortunately therewere someleaks. Obvious reasons are more hydridein the air being filtered or delay in changing a filter. A Hazop brings out the consequences of not following intentions and these canbe includedin the plantinstructions. Whatseemsobviousto us maynotbe obviousto others.The tankwasemptiedwith a vacuumpumpand thenthepump was shut down and isolatedand theliquid drawnintothetank. Ifthere was a need to test. page 67).This was not in the instructions but was a 'trickof the trade'.'16 It is fair to point out that this incidentoccurred many years ago. Once we ask whethermore lithiumhydride than usual could be presentwe are halfwayto preventingthe accident. the designer's 3. and the men were affectedby the chlorine.2(page 91).4. The operators said. They thoughttheirjob wasjust to take readings.They did not expect to find any so theydid not wear breathing apparatus.4 Accidentsthat could be prevented by elementary training This section describes some incidents whichoccurredbecausepeople lacked themost elementary knowledge oftheirduties or thepropertiesof the material theyhandled.not to respondto them. it doesn't: you see. (3) Aliquidwas transferred intoatankbyvacuum. There is an example in Section4.1. At one time old filters were perforatedto release trapped air before they were immersed in water. knowledge so basic that it was considered unnecessary to impart it. (1) Many accidentshave occurred because operators wrote down abnormal readings on the plant record sheets but did nothing about them. Someone told them that spray from the inflow would be drawn into the pump. This incidentis also an example of forgotten knowledge.AN ENGINEER'S VIEW OF HUMAN ERROR should be whenhazardous materials are handled. forgotten when experienced people left and were replaced(see also Section 3.7. therewas no need to test. all the time the pump is running there's a strong streamof "vac" running into the tank and it keeps the splashesaway. 
Ifwe were sure therewould be no leaks. They did not eventell theirforemen. (2) After changing a chlorine cylinder. 62 . two men openedvalves to make sure therewereno leaks on the linesleadingfrom thecylinder.Thiswas not quick enough forone shift whokeptthepump running. then leaks were possible and breathing apparatus should havebeenworn.

' (7) Maintenance workers as well as operators occasionally display the most extraordinary ignorance of whatpracticesare or are not acceptable. (8) A reaction mixture started to boil violently. The enginecaughtfire whenthe ignition was switched's thissafer. (6) Soon after green (that is. To save climbingonto the top of the tankertwice. Aman got out ofa mini-van just intime to preventtheattendant fillingit up throughthe ventilatorin the roof A young attendant used his cigarettelighter to check the level in a road In this last case the attendant was watched by the driver who himself removedthe oil filler cap'7. Result: the wagon was suckedin. (iii) \Vhen the tankeris empty.A new branch was made and the old one removed.the attendants seemedto he themost untrained of all employees.while thenew one was 24 inchesexternaldiameter. the old branch was 24 inches internal petrol that doesn'tburn. The supervisor thereforedecided to make a series of parallel cuts in the branch.splay them out until theyfitted the hole in the vessel and weld up the gaps! Fortunately before the job was complete it came to the notice of a seniorengineerandwas stopped. 'Don'tworry. One day he had an idea. Fortunately 63 . despite the hazardous nature ofthematerial theyhandled. • An attendant put petrol into the oil filler pipe of a rear-engined van. rising towards the vessel manhole. His instructions said: (i) Openthevent valve on the top of the wagon. When the new branch was offeredup to the opening in the vesselit was found to be a little bit too small. OR INSTRUCTWNS (4) Anoperatorhad to empty tankers by gravity. lead-free)petrol became available. The operatordid not understand that when a liquidflows out of a tank it leaves an empty spacebehind. As itwas near the gatehouse. The gateman said. he askedthe gatemanifhehadseen it and takenanyaction.ACCEOET5 PREVErTARLE B BETTER TRAININC. An operatortried tocontainitby closing the manholelid. 
a university safetyofficersaw petroldripping fromacar in the university carpark.For example: • • tanker. close the vent valve. A 24 inch manhole branch on a vessel had corroded and had to be replaced. (5) Before the days of self-service petrol pumps. he decided to carry out step (iii) before step (ii). (ii) Openthe drain valve.

Thepersonwhoauthorized thework should have warned the subcontractor. Whenit didrise. employed to remove contamination from the plantequipment.thejetofliquid escaping through the manholepunched a holein the asbestos-cement roof4 m above30. (13)An American at a car rental desk in the UK was overheard complaining that thecarshe hadbeengivenwas veryslow. Three operatorswho opened the drums were eiposed to vapour concentrations abovethe short-termexposure limit quoted in the material safety data sheet. Almost all reactions go faster as the temperature rises. Engineers as well as operators can be unaware of the power exertedby a runaway reaction. It shouldnot be necessary to tell operatorsnot to pump with the lid fastened.Theyleft the lid on the drum to preventsplashing.sprayeda water-based solvent onto abox containing440-voltelectrical equipment. therewas an explosionandpeople were killed27. (11)Two operatorswere askedto transfersome sludge into a 60 gallon drum usingapositivepumpthat delivered1—1. When the pressure in the drum reachedthe delivery pressure of the pump. Don't assume that everybody knows what seems obvious. causingseriousarcing. It would bedifficult for those who write instructions to think of all the situations in whichpressure mightbe present. the lid flew offand hit him on the forehead3. (12)Softwarecompanies haveconsidered changing the instruction 'Strike any key' becausesomanypeoplehavetelephoned themto say that theycannotfind the Any' key33. Neverthelessthe designers of a plant came to the conclusion that a certain reaction would fade away if the temperature rose above a certain figure.AN ENGINEERS VIEW GE HUMAN ERROR he was not successful or the vesselmighthaveburst. Shehadneverbeforedrivena car 64 . (9) Somedrums ofresin. normally storedin a coollocation(as recommended in the material safetydata sheet). One of the operators decidedtoremovethe lid.were movedto alocationwhere thetemperature was at times above 45°C. 
Whiletryingto loosen the locking ring. A year later the incident was repeatedon a site that had been sentthe report32. As it was. Operators shouldbe given enoughunderstanding ofpressureand its formation to makesuch detailedinstructions unnecessary. the transfer stopped. Trainingrather thaninstructions isthebetterway topreventaccidents such as this.3 gauge(15—20psig). With the passage of time everyone had forgotten why the drums were keptin a coolarea3 1 (10) A subcontractor.

Operators were instructed to heat a reactantto 45°C as they added it to a reactor over a period of60—90 minutes. without telling anybody. the fire alarm sounded and the driver continued with the journey. A vehicle caught fire in 1996. This went on for a long time.with emissionoftoxic fumes. 65 . Knowledge that is obviousto many may not be obvious to people with adifferentbackground.5 Contradictoryinstructions Operators sometimes receive contradictory instructions. until a runawayreactionoccurred. They believed this to be impossible. an alarm sounds in the driving cab and the driver is instructed to stop the train.1 (page 48). 3. He decided prop had dropped.OENT\ PREVENTABLE BY BETTER TRAINING OR INSTRUCTIONS with amanualgearbox. they decided to add it at a lower temperature and heat the materialalready in the reactor. One suspects that the author of the instruction wants to turn a operate blind eye but also wants to be able to avoid responsibility when a valve is damaged. or by the other ways suggested in Section3. and became custom and practice. Soon afterwards another alarm indicated that a instructions.If a prop descendsat other times. Heavy road vehicles are camied through the Channel Tunnel between England and France on railway wagons. or even occasionally. he would have noticed that his instructions were not being followed. The incidents could have been prevented by better management. so. they do not like to say so and instead they take what they think is the nearestequivalent action. The driver then had contradictoiy to stop the train as a dropped prop might derailit. because the heater was not powerful enough. if operators believe that their instructions cannot be followed. The prop had not dropped. in these cases by better training and instructions. or at least severaltimesper week. In a sense all these incidents and the others describedin this chapterwere due to human failingbut it is not very helpful to say so. 
the alarmwas false and stoppingthe train greatlyincreased the damageto the train and to the tunnel. These wagons are fitted with supports (props) which are lowered when vehicles are being loaded and unloaded. She put the gearleverin the positionmarked 1 andleft it there34. Unfortunately. If a fire occurs drivers are instructed not to stop until they are out of the tunnel. For example. they on maybetold neverto use wheel-dogs stiffvalvesbutmaystill be expectedto them.PCQ1. Otherincidents are describedin Chapter9. A first-line manager should try to look at every figure onthe recordsheetsevery day. If the managerhad examinedtherecord sheets regularly.

7 m) diameter temporary pipe which was designedand installed very quickly by men whohadgreatpracticalexperience anddrive but did not know howto design largepipesoperating athigh temperature and pressure. was due to the failure of a large (0. he should have. more subtle than those just described. They were unconscious oftheir incompetence. 'ThePresidentcreated. The factory normally 66 . A report from the US Congresson allegations that arms were sold to Iran and the proceeds used. The responsibility lay with the senior managers who asked them to undertake a task for which they were unqualified.'21 See also Sections 5. Sometimes relaxation of the normal safety rules is justified.' It also said. requiring different actions. This is understandable. A manager emphasizes the importance of completing repairs.8. They did not know what they did not know. illegally. might operate at the same time and that their operation could be contingentrather thancoincidental. to buy aims for the Nicaraguanrebels said. experiments or a production run m a certain periodof time. If so the managershouldsay so clearly in writing and acceptresponsibility. Without actually saying so he gives the impression that the normal safety procedures can be relaxed.Instead they went ahead on their own.1 and 13. which killed28 people. But they did not know this and didnotrealizethattheyshouldhavecalledin an expertinpipingdesign. He should not drop hints and put his staff in a 'Heads I win. Unfortunately no-one foresaw that two alarms. 3.6 Knowledge of what we don't know The explosion at Flixborough in 1974. tails you lose' position(see Section 3. They could not be blamed. Most contradictory instructions are. however. highly-stressed pipes is a specialized branchofmechanical engineering.AN ENGINEER'S VIEW 1)1- HUMAN ERROR The officialreport is not entirely clear but it seems likely that the operation of the prop alarm was the result of damage to the alarm system by the fire. page 72). 
'If the Presidentdid not know what his national security advisers were doing.2 (pages 99 and 223). They lacked the professional training which would have allowedthem to see when expert advice was needed. It is most unlikely that a random failure occurred by coincidence during the fire35'36. the design of large. or at least tolerated. an environment wherethose that did know of the diversion believed with certainty that they were carrying out the President'spolicies. Their only drawing was a full-sizedsketch in chalkon the workshop floor22'23. If there is an accident he can say that he never authorized any relaxation.

With reductions in staff, people may be asked to undertake additional duties. A control engineer, for example, may be made responsible for electrical maintenance. There will be a fully-qualified electrical engineer available for consultation, but does the control engineer know when to consult him? Similarly, if a company dispenses with in-house experts and decides to hire outside consultants as and when required, will the staff know when they ought to be consulted?

3.7 Some simple ways of improving instructions
As already stated, training gives people the knowledge and understanding they need to do their job properly, while instructions tell them the precise course of action to be followed. With computer control, precise instructions are essential (see Chapter 12). In addition to the well-established methods of training, there are newer methods touched on in Sections 3.2 and 3.3 (pages 50 and 54). Here are some questions that should be asked about instructions:
• Are they easy to read?
• Are they written to help the reader or to protect the writer?
• Are they explained to those who will have to carry them out?
• Are they maintained?

3.7.1 Are they easy to read?
Many instructions are not. Figure 3.6 (page 68) is typical of the language and appearance of many. Men are remarkably good at detecting meaning in a smog of verbiage, but they should not be expected to do so. Sooner or later they will fail to comprehend. Figure 3.7 (page 69) shows a much better layout.

Table 3.1 (page 70) is an instruction (based on Reference 37) for the transfer of liquid into a batch tank. It was developed in consultation with the operators and is intended for the training of new operators. Unlike many similar instructions, it explains the reasons for the various steps. The example is rather simple but illustrates the need to explain why as well as what.

6 Figure Are your plant instructions like this? 68 . Make him work at it. include as much information as possible on administration. consider whether you have considered every eventuality so that if at any time in the future anyone should make a mistake whilst operating one of the plants on East Section. Don't use one word when five will do. 3. particularly that which you need when the lights have gone out following a power failure. use numbers. never start anything. all else having failed. EAST SECTION MANAGER AUTHOR: DATE: 1 DECEMBER 1996 UNCLE TOM COBBLEY AND ALL COPIES TO: Firstly. Don't use words.AN ENGINEER'S VIEW Of HUMAN ERROR INSTRUCTIONNO: TITLE: WC 101 HOW TO LAY OUT OPERATING INSTRUCTIONS SO THAT THEY MAY BE READILY DIGESTHD BY PLANT OPERATING STAFF. do not use any logic in the indexing system. Remember that the man reading this has turned to the instructions in desperation. you will be able to point to a piece of paper that few people know exists and no-one other than yourself will have read or understood. randomly distributed throughout the folder so that useful data are well hidden. Whenever possible use the instruction folder as an initiative test. it's a good way to learn. put the last numbered instruction first. and therefore this is a good time to introduce the maximum amount of new knowledge. routine tests. which would make their meaning rapidly clear. being careful to avoid explanations or visual displays. always initiate it. maintenance data. plants which are geographically close and training. be meticulous in your use of the English language and at all times ensure that you make every endeavour to add to the vocabulary of your operating staff by using words with which they are unfamiliar. for example.




The following extract from a plant instruction shows the action a supervisor and four operators should take when the induced draught fan providing air to a row of furnaces (known as A side) stops. Compare the layout with that of Figure 3.6.

Panel Operator
1 Shut TRCs on manual
2 Reduce feed rate to affected furnaces
3 Increase feed to Z furnace
4 Check temperature of E54 column

Furnace Operator
1 Fire up B stream and Z furnaces
2 Isolate liquid fuel to A stream furnaces
3 Change over superheater to B stream
4 Check that output from Z furnace goes to B stream

Centre Section Operator
1 Change pumps onto electric drive
2 Shut down J43 pumps

Distillation Operator
1 Isolate extraction steam on compressor
2 Change pumps onto electric drive

Figure 3.7 ... or like this?


Table 3.1 An instruction that explains the reason for each step

Instructions for charging Compound A to Batch Tank 1

Step 1. Wear standard protective clothing.
Details: Standard protective clothing is hard hat, rubber gloves, safety glasses and safety shoes.

Step 2. Check that Tank 1 is empty.
Details: If tank is not empty it will overflow.

Step 3. Check that inlet valves to other tanks are closed.
Details: If valves are not closed another tank might overflow.

Step 4. Check that discharge valve on Tank 1 is closed.
Details: Valve is located on far side of tank from flowmeter and is labelled.

Step 5. Open the inlet valve on Tank 1.
Details: Valve is located near the flowmeter and is labelled.

Step 6. Enter charge quantity on flowmeter.
Details: Charge quantity is shown on batch card.

Step 7. Start the transfer using the switch on the flowmeter.
Details: Check that the pump has started. Check that the level indicator on Tank 1 is rising; if not, stop pump and check for closed valve.

Step 8. During transfer, check for leaks.
Details: If a leak is found, stop pump and inform …

Step 9. When flowmeter reaches setpoint, check that flow has stopped.
Details: If flow has not stopped, shut down pump.

Step 10. Close inlet valve to Tank 1.

Step 11. Start stirrer on Tank 1.

Step 12. Charge inhibitor.
Details: Batch card shows type and amount. Failure to add inhibitor could cause polymerization and rise of temperature.



3.7.2 Are they written to help the reader or to protect the writer?
It is not difficult to recognize instructions written to protect the writer. They are usually written in legalistic language, are very long and go into excessive detail. As a result they are often not read. An instruction that covers 95% of the circumstances that might arise, and is read and understood, is better than one that tries to cover everything but is not read or not understood.

3.7.3 Are they explained to those who will have to carry them out?
On one works the instructions on the procedure to be followed and the precautions to be taken before men were allowed to enter a vessel or other confined space ran to 23 pages plus 33 pages of appendices; 56 pages in all. There were many special circumstances, but even so this seems rather too long. However, when the instruction was revised it was discussed in draft with groups of supervisors and the changes pointed out. This was time-consuming for both supervisors and managers but was the only way of making sure that the supervisors understood the changes and for the managers to find out if the changes would work in practice. Most people, on receiving a 56-page document, will put it aside to read when they have time, and you know what that means. New instructions should always be discussed with those who will have to carry them out (see Section 6.5, page 118).

3.7.4 Are they maintained?
Necessary maintenance is of two sorts. First, the instructions must be kept up-to-date and, second, regular checks should be made to see that the instructions are present in the control room and in a legible condition. If too worn they should obviously be replaced. If spotlessly clean, like poetry books in libraries, they are probably never read and the reasons for this should be sought. Perhaps they are incomprehensible. A senior manager visiting a control room should ask to see the instructions, operating as well as safety. He may be surprised how often they are out-of-date, cannot readily be found or are spotlessly clean.
One explosion occurred because an operator followed out-of-date instructions he found in a folder in the control room. Finally, a quotation from H.J. Sandvig:


'Operators are taught by other operators and ... each time this happens something is left unsaid or untold unless specific operating instructions are provided, specific tasks are identified and written, and management reviews these procedures at least annually and incorporates changes and improvements in the process.'18


A plant contained four reactors in parallel. Every three or four days each reactor had to be taken offline for regeneration of the catalyst. The feed inlet and exit valves were closed, the reactor was swept out with steam and then hot air was passed through it. One day a fire occurred in the reactor during regeneration. Afterwards the staff agreed that the steam purging should be carried out for longer and should be followed by tests to make sure that all the feed had been swept out. An instruction was written in the shift handover log and a handwritten note was pinned up in the control room, but no change was made to the folder of typed operating instructions. A year later an operator who had been on loan to another unit returned to his old job. He saw that the note on extra sweeping out and testing had disappeared and assumed that the instruction had been cancelled. He did not carry out any extra sweeping out or any tests. There was another, and larger, fire. There was no system for maintaining instructions. (In addition, the content of the instructions was inadequate. Feed may have been entering the reactor through a leaking valve. The reactor should have been isolated by slip-plates or double block and bleed valves.) Section 11.5 (page 194) describes some further incidents which could have been prevented by better instructions.
3.7.5 Other common weaknesses in instructions
• They do not correspond to the way the job is actually done.
• They contain too much or too little detail. Complex unfamiliar tasks with serious results if carried out incorrectly need step-by-step instructions. Simple familiar tasks may be tools of the trade and need no instructions at all. Different levels of detail are needed by novices and experienced people. For example, if a line has to be blown clear with compressed gas, novices should be told how long it will take and how they can tell that the line is clear. Table 3.1 (page 70) is an example of an instruction for novices.
• The reasons for the instructions are not clear (see Chapter 5).
• The boundaries of space, time and job beyond which different instructions apply are not clear.

3.8 Training or instructions?

As stated at the beginning of this chapter, training gives us an understanding of our tasks and equips us to use our discretion, while instructions tell us precisely what we should and should not do; training equips us for knowledge-based and skill-based behaviour while instructions equip us for rule-based behaviour. Which do we want?

While admitting that most jobs today require some knowledge-based behaviour, many managers feel that in some situations, especially those involving safety, rule-based behaviour is essential. In practice the rules never cover every situation and people have to use their discretion. They are then blamed either for not following the rules or for sticking rigidly to the rules when they were obviously inapplicable, another example of contradictory instructions (see Section 3.5, page 65). To try to cover every situation the rules become more and more complex, until no-one can understand them or find the time to read them. People may adopt the attitude that since they are going to be blamed in any case when things go wrong, it hardly matters whether they follow the rules or not. It would be better to recognize that people have to be given some discretion, give them the necessary training, distinguish between the rules that should never (well, hardly ever) be broken and those that on occasions may be, and accept that from time to time people will make the wrong decisions. Of course, whenever possible, exceptions to safety rules should be foreseen and authorized in advance.

The results of trying to legislate for all eventualities have been described by Barbara Tuchman, writing about naval warfare in the 18th century. In the 17th century individual captains used their own tactics, and this often caused unmanageable confusion. The UK Admiralty therefore issued Fighting Instructions, which required ships to act in concert under the signalled order of the commanding officer and forbade action on personal initiative:

'In general, the result did make for greater efficiency in combat, though in particular instances ... it could cause disaster by persuading a too-submissive captain to stick by the rule when crisis in a situation could better have been met by a course determined by the particular circumstances. As deviations from the rule were always reported by some disgruntled officer and tried by a court-martial, the Instructions naturally reduced, if not destroyed, initiative except when a captain of strong self-confidence would act to take advantage of the unexpected. Action of this kind was not infrequent, even though no people so much as the British preferred to stay wedded to the way things had always been done before. In allowing no room for the unexpected that lies in wait in the waywardness of men, not to mention the waywardness of winds and oceans, Fighting Instructions was a concept of military rigidity that must forever amaze the layman.'

In 1744, after the battle of Toulon, Admiral Mathews was court-martialled and dismissed for not sticking strictly to the rules. Aware of this decision, in


1757 Admiral Byng (who had been on the bench which tried Mathews) stuck to the rules at the battle of Minorca and was court-martialled and shot for not having done his utmost to relieve the garrison of Minorca. (This produced the famous saying of Voltaire, 'Dans ce pays-ci il est bon de tuer de temps en temps un amiral pour encourager les autres.' ['In that country they find it pays to kill an admiral from time to time to encourage the others.']) During the American War of Independence many British officers refused to accept commands, as they were afraid of being made scapegoats if anything went wrong.

Though the penalties are less, similar incidents have occurred in industry. For example, a waste heat boiler is fitted with a low water level alarm and the operators are told to shut it down at once if the alarm sounds. An operator does so; the alarm was false and production is lost. Nothing is said to the operator, but he senses that everyone wishes he had not shut the plant down. Perhaps someone suggests to the foreman that this operator's talents might be better employed on another unit, where strict attention to the rules is necessary. Everyone gets the message. Next time the alarm sounds it is ignored and the boiler is damaged!

Getting the right balance between following the rules and using one's discretion is difficult in every walk of life. A midrash (a commentary in the form of a story) on Genesis, written about the 5th century AD, says:

'Abraham said to God: "If You wish to maintain the world, strict justice is impossible; and if You want strict justice, the world cannot be maintained. You cannot hold the cord at both ends. You desire the world. You desire justice. Take one or the other. Unless You compromise, the world cannot endure."'


3.9 Cases when training is not the best answer

3.9.1 Electric plugs
Ordinary domestic three-pin electric plugs are sometimes wired incorrectly. This may be due to a slip, or to ignorance of the correct method. The usual remedy is training and instruction: electric appliances come with instructions for fitting the plug. However, it is not difficult to reduce the opportunities for error. One method, now compulsory in the UK, is to fit the plug in the factory. Another method, so simple it is surprising that it has not been used more often, is to colour the terminals in the plug the same colour as the wires.





3.9.2 Kinetic handling
Training in kinetic handling methods is often recommended as a cure for back injuries, but in many cases it is only part of the story. Before introducing a training programme the layout and design of areas where people have to work should be examined to check that they are the best possible. For example, a man has been seen working with his feet at different levels, lifting 30 lb from a conveyor belt onto staging behind him. A hoist was needed, not training of the man. In another example there were two almost identical conveyor layouts, yet back accidents occurred on one and never on the other. An examination showed the first line was close to very large doors so that, when men stopped for a rest after getting hot and sweaty, they were standing in a draught, with the not unnatural result that they suffered back trouble. In another example, trucks brought loads from stores and deposited them in unoccupied areas near to but not at the actual place where the goods would be required, with the result that men were then called upon to manhandle them over the last part. Very often the temporary resting places of these goods were pebbled areas and other unsatisfactory places, so that men did not have a proper footing, with inevitable falls and strains as a result. In short, we should examine the layout and planning of a task before considering training.

3.9.3 Attitudes
It is sometimes suggested that through training we should try to change people's attitudes. I doubt if such training is either justified or effective. It is not justified because a person's attitude is his private affair. We should concern ourselves only with whether or not he achieves his objectives. It is not effective because it is based on the assumption that if we change a person's attitude we change his behaviour. In fact, it is the other way round. An attitude has no real existence; it is just a generalization about behaviour. If someone has too many accidents, let us discuss the reasons for them and the action needed to prevent them happening again. After a while he may act differently, he may have fewer accidents and everyone will say that he has changed his attitude. Behavioural safety training (see Section 5.3, page 107) can be very effective. In short:

Don't try to change people's attitudes. Just help them with their problems, for example by tactfully pointing out unsafe acts and conditions.

William Blake wrote, 'He who would do good to another must do it in Minute Particulars. General Good is the plea of the scoundrel, hypocrite and flatterer.' Similarly, do not try to persuade Boards of Directors to change their policies. It is better to suggest ways of dealing with specific problems.


4. Accidents due to a lack of physical or mental ability

'Nothing is impossible for people who do not have to do it themselves.'
Anon

These accidents are much less common than those described in Chapters 2 and 3 but nevertheless do occur from time to time. Most of them occur because people are asked to do more than people as a whole are capable of doing, physically or mentally. Only a few occur because someone was asked to do more than he (or she) was individually capable of doing. To prevent the accidents described in this chapter all we can do is to change the work situation — that is, the design or method of working.

4.1 People asked to do the physically difficult or impossible
(1) A steel company found that overhead travelling magnet cranes were frequently damaging railway wagons. One of the causes was found to be the design of the crane cab. The driver had to lean over the side to see his load. He could then not reach one of the controllers, so he could not operate this control and watch the load at the same time1.
(2) A refinery compressor was isolated for repair and swept out with nitrogen but, because some hydrogen sulphide might still be present, the fitters were told to wear air-line breathing apparatus. They found it difficult to remove a cylinder valve which was situated close to the floor, so one fitter decided to remove his mask and was overcome by the hydrogen sulphide. Many companies would have been content to reprimand the fitter for breaking the rules. The company concerned, however, asked why he had removed his mask and it then became clear that he had been asked to carry out a task which was difficult to perform while wearing a mask2. Following the incident lifting aids were provided.

(3) Incidents have occurred because valves which have to be operated in an emergency were found to be too stiff. Such valves should be kept lubricated and exercised from time to time.
(4) Operators often complain that valves are inaccessible. It is reasonable to expect operators to fetch a ladder or scramble into a pipe trench on rare occasions, say once per year or less often. But other valves, if they have to be operated, can, for one reason or another, be out of reach. Emergency valves should, of course, always be readily accessible. Designers should remember that if a valve is just within reach of an average person then half the population cannot reach it. They should design so that 95% (say) of the population can reach it.
(5) It is difficult to give lorry drivers a good view while they are reversing. Aids such as large mirrors should be provided in places such as loading bays where a lot of reversing has to be done1.
(6) Related to these accidents are those caused by so-called clumsiness. Such accidents are the physical equivalent of the mental slips discussed in Chapter 2. We all have off days. For example, an electrician was using a clip-on ammeter inside a live motor-starter cubicle. A clip-on ammeter has two spring-loaded jaws which are clipped round a conductor, forming a coil which measures the current by induction. The jaws are insulated except for the extreme ends. The electrician accidentally shorted two live phases (or a live phase and earth) with the bare metal ends of the jaws. He was burnt on the face and neck and the starter was damaged3.
In many companies such an accident would be put down to 'clumsiness' and the man told to take more care. But the method of working makes an occasional accident almost inevitable. Sooner or later the electrician's co-ordination of his movements will be a little poorer than normal and an accident will result. When handling hazardous materials or equipment we usually provide at least two safeguards (defence in depth). In this case the only safeguard was the electrician's skill. The method of working is hazardous — though accepted practice — and a better method should be sought. Another, physical safeguard should be provided19.
(7) The following is another example of what some people might call clumsiness but was in fact poor layout. A man was asked to vent gas at a gauge pressure of 16 bar (230 psi). The vent valve, a ball valve, was stiff and had no handle and so he used a wrench with a pipe extension on the handle. While opening the valve he accidentally moved in front of the opening and was thrown 2 m (6 feet)

by the escaping gas. He would have been thrown further if a tank had not been in the way. The vent valve should have been fitted with a tail pipe to direct the escaping gas in a safe direction26. The missing tail pipe and handle could have been spotted during plant inspections or anyone who saw them could have reported them. Whether or not they do so depends on the encouragement they receive and on whether or not such reports are acted on.
(8) A leak of petrol from an underground tank at a service station went undetected for two weeks, by which time 10,000 gallons (50 m3) had leaked into the surrounding ground and contaminated a stream and a river. When the staff were asked why they did not carry out a daily check by dipping, as they had been told to do, it was found that the manhole cover over the dip pipe was heavy and difficult to move. It was replaced by one with a small central disc that could be removed easily20.
(9) When train drivers pass a signal at red it is usually due to a lapse of attention (Section 2.9.3, page 38). One case, however, was due to diabetic eye disease that had reduced the driver's colour vision. He was aware of his condition but had not reported it27.

4.2 People asked to do the mentally difficult or impossible
Many of the incidents described in Chapter 2 almost come into this category. If a man is told to close a valve when an alarm sounds or at a particular stage in an operation he cannot be expected to close the right valve in the required time every time. However, he will close the right valve most of the time. His error rate will be higher if the layout or labelling are poor or he is under stress or distracted. This section considers some accidents which occurred because people were expected to carry out tasks which most people would fail to carry out correctly on many occasions. These failures are of several sorts: those due to information or task overload or underload, those involving detection of rare events, those involving habits, and those involving estimates of dimensions.

4.2.1 Information or task overload
A new, highly-automated plant developed an unforeseen fault. The computer started to print out a long list of alarms. The operator did not know what had occurred and took no action. Ultimately an explosion occurred.

Afterwards the designers agreed that the situation should not have occurred and that it was difficult or impossible for the operator to diagnose the fault. However, they then said to him, 'Why didn't you assume the worst and trip the plant? Why didn't you say to yourself, "I don't know what's happening so I will assume it is a condition that justifies an emergency shutdown. It can't be worse than that"?' Unfortunately people do not think like that. If someone is overloaded by too much information he may simply switch off (himself, not the equipment) and do nothing. The action suggested by the designers may be logical, but this is not how people behave under pressure (see also Section 12.4, page 210).

The introduction of computers has made it much easier than in the past to overload people with too much information. If quantities of computer print-out are dumped on people's desks every week, then most of it will be ignored, including the bits that should be looked at. Reference 37 describes methods to reduce alarm overload.

Plant supervisors sometimes suffer from task overload — that is, they are expected to handle more jobs at once than a person can reasonably cope with. This has caused several accidents, in management as well as operating jobs. For example, two jobs had to be carried out simultaneously in the same pipe trench, 20 m apart. The first job was construction of a new pipeline. A permit-to-work, including a welding permit, valid for the whole day, was issued at 08.00 hours. At 12.00 hours a permit was requested for removal of a slip-plate from an oil line. The foreman gave permission, judging that the welders would by this time be more than 15 m (50 ft) from the site of the slip-plate. He did not visit the pipe trench, which was 500 m away, as he was dealing with problems on the operating plant. Had he visited the trench he might have noticed that it was flooded.

Although the pipeline had been emptied, a few gallons of light oil remained and ran out when the slip-plate joint was broken. It spread over the surface of the water in the pipe trench and was ignited by the welders. The man removing the slip-plate was killed.

The actual distance between the two jobs — 20 m — was rather close to the minimum distance — 15 m — normally required. However, the 15 m includes a safety margin. Vapour from a small spillage will not normally spread anything like this distance. On the surface of water, however, liquid will spread hundreds of metres.

It was realized that it was unrealistic to expect the foreman, with his primary commitment to the operating plant, to give the construction work the attention it required. Afterwards a special day supervisor was appointed to supervise the construction operations.

Overloading of shift supervisors can be reduced by simple arrangements. If extensive maintenance or construction work is to be carried out on an operating plant — for example, if part of it is shut down and the rest running — extra supervision should be provided. One supervisor should look after normal plant operations and the other should deal with the maintenance or construction organization.

Overloading of a supervisor at a busy time of day may have contributed to another serious accident. A shift process supervisor returned to work after his days off at 08.00 hours on Monday morning. One of his first jobs was to issue a permit-to-work for repair of a large pump. When the pump was dismantled, hot oil came out and caught fire. Three men were killed and the plant was destroyed. It was found that the suction valve on the pump had been left open. The supervisor said he had checked the pump before issuing the permit and found the suction valve (and delivery valve) already shut. It is possible that it was and that someone later opened it. It is also possible that the supervisor, due to pressure of work, forgot to check the valve.

The real fault here, however, is not the overloading of the supervisor — inevitable from time to time — but the lack of a proper system of work. The valves on the pump should have been locked shut and in addition the first job, when maintenance started, should have been to insert slip-plates. If valves have to be locked shut, then supervisors have to visit the scene in order to lock them. They should not, of course, start work at the same time as the maintenance team.

4.2.2 Detection of rare events
If a man is asked to detect a rare event he may fail to notice when it occurs, or may not believe that it is genuine. The danger is greatest when he has little else to do. It is very difficult for night-watchmen, for example, to remain alert when nothing has been known to happen and when there is nothing to occupy their minds and keep them alert. Compare Jesus' remark to the disciples who had fallen asleep, 'Could ye not watch with me one hour?' (St. Matthew 26, 40).

On some of the occasions when train drivers have passed a signal at danger (see Section 2.9.3, page 38), the drivers, in many years of driving regularly along the same route, had never before known that signal to be at danger. Similarly, in the days before continuously braked trains became the rule, railway signalmen were expected to confirm that each train that passed the signal box was complete, as shown by the presence of tail lights.

In 1936 an accident occurred because a signalman failed to spot the fact that there were no tail lights on the train. The signalman was blamed (in part) for the accident but it is difficult to believe that this was justified. In his 25 years as a signalman he had never previously had an incomplete train pass his box4. It is difficult for anyone to realize that an event that has never occurred in 25 years has in fact just happened.

Similarly, it is difficult for night-watchmen to remain alert. During the Second World War, studies were made of the performance of watchkeepers detecting submarines approaching ships. It was found that the effectiveness of a man carrying out such a passive task fell off very rapidly after about 30 minutes.

In September 1984 the press reported that, after a large number of false alarms had been received, an ambulance crew failed to respond to a call that two boys were trapped in a tank. The call was genuine (see also Section 5.7, page 106).

4.2.3 Task underload
Reliability falls off when people have too little to do as well as when they have too much to do. It is sometimes suggested that we should restrict the amount of automation on a plant in order to give the operators enough to do to keep them alert. I do not think this is the right philosophy. If automation is needed to give the necessary reliability, then we should not sacrifice reliability in order to find work for the operators. Similarly, if automation is chosen because it is more efficient or effective, we should not sacrifice efficiency or effectiveness in order to find work for the operators. We should look for other ways of keeping them alert.

As already stated, I doubt if process plant operators often suffer from task underload to an extent that affects their performance. Although in theory they have little to do on a highly automated plant, in practice there are often some instruments on manual control; there are non-automated tasks to be done, such as changing over pumps and tanks; there is equipment to be prepared for maintenance, routine inspections to be carried out, and so on. If, however, it is felt that the operators are seriously underloaded, then we should not ask them to do what a machine can do better but look for useful but not essential tasks that will keep them alert and which can be set aside if there is trouble on the plant — the process equivalent of leaving the ironing for the babysitter. One such task is the calculation and graphing of process parameters such as efficiency, fuel consumption, catalyst life and so on.

Another task is studying a training programme on a video or CD-ROM (but not on the control computer). This is best done under the supervision of a foreman who stops the programme at suitable intervals, adds extra explanation if necessary and discusses it with the operators.

More important than occupying the operator's time is letting him feel that he is in charge of the plant — able to monitor it and able to intervene when he considers it necessary — and not just a passive bystander watching a fully automatic system. The control system should be designed with this philosophy in mind6 (see also Section 12.4, page 210).

Despite what has been said above, fully automated plants are not necessarily the most reliable. Hunns5 has compared three designs for a boiler control system: a largely manual design, a partly-automated design and a fully automated design. The comparison showed that the partly-automatic design was the most reliable. Section 7.2 (page 143) shows that an operator may be more reliable at some process tasks than automatic equipment.

In deciding whether to give a task to a person or a machine we should give it to whichever is least likely to fail. Machines are better than people at boring tasks such as monitoring equipment, repetitive tasks, tasks which consist of many simple elements (as people have many opportunities for slips or lapses of attention) and tasks which require very rapid response. People are necessary for dealing with unforeseen situations, situations which are not fully defined (that is, all the necessary information is not available) and situations where the action cannot be specified in advance28.

4.2.4 Habits and expectations
If we expect people to go against established habits, errors will result. For example, if a man is trained to drive a vehicle with the controls laid out in one way and is then transferred to another vehicle where the layout is different, he will probably make errors. A good example occurred in a company which was developing an attachment to fit on the back of a tractor. The driving controls had to be placed behind the seat so that the driver could face the rear. On the development model the control rods were merely extended to the rear and thus the driver found that they were situated in the opposite positions to those on a normal tractor. When someone demonstrated the modified tractor, he twice drove it into another vehicle1.

Habits are also formed by general experience as well as experience on particular equipment. In the UK people expect that pressing a switch down will switch on a light or appliance. If a volume control is turned clockwise we expect the sound to get louder. If the controller on a crane is moved to the

right we expect the load to move to the right. Errors will occur if designers expect people to break these habits.

Expectations are closely related to habits. Trials have shown that most people leaving a building in an emergency ignore emergency exit signs and instead leave by the routes they know best29. (If the normal route includes a double door, one half of which is kept closed, very few people unbolt it.)

A plant was fitted with a manually operated firewater sprinkler system. The valves were above the ceiling and were operated by chains. Operators checked that the valves were open by closing them slightly and then opening them fully. They reported that they were open when they were in fact closed. They were so sure that they would be unable to open them that they did not try very hard. In a better system the valves would be visible and it would be possible to see at a glance whether they were open or closed31.

A biologist studying the behaviour of migrating chirus (a relative of antelopes, native to China) noticed that they detoured round an area of grass. Ten thousand years ago the grassland was a lake and the chirus still seemed to follow their ancient, circuitous route30.

4.2.5 Estimates of dimensions
People are not very good at estimating distances. Vehicles are often driven through openings which are too narrow or too short. Drivers should know the dimensions of their vehicles and the dimensions of narrow openings should be clearly displayed1.

4.3 Individual traits and accident proneness
A few accidents occur because individuals are asked to do more than they are capable of doing, though the task would be within the capacity of most people. For example, at a weekly safety meeting a maintenance foreman reported that one of his men had fallen off a bicycle. He had been sent to the store for an item that was required urgently and on the way, while entering a major road, had collided with another cyclist. The bicycle and road surface were in good condition, and the accident seemed to be a typical 'human failing' one. Further inquiries, however, disclosed that the man was elderly, with poor eyesight, and unused to riding a bicycle. He had last ridden one when he was a boy. The foreman had sent him because no-one else was available.

In 1935 a signalman on a busy line accepted a second train before the first one had left the section. It came out in the inquiry that although he had 23 years' experience and had worked in four different boxes, he had been

chosen for this present box by seniority rather than merit and had taken five weeks to learn how to work it4.

A train driver passed a signal at danger and caused an accident in which one person was killed and 70 injured. The inquiry disclosed that between four and eight years earlier he had committed six irregularities. Three involved braking, one excessive speed and two passing signals at danger. His suitability as a driver was obviously suspect32. Most signal-passed-at-danger incidents, however, occur to normally good drivers who have a momentary lapse of attention (see Section 2.9.3, page 38).

This brings us to the question of accident proneness, which has been touched on already in Section 2.1 (page 11). Hunter7 writes:

'Accident proneness is greatly influenced by the mental attitude of the subjects. The accident-prone are apt to be insubordinate, temperamentally excitable and to show a tendency to get flustered in an emergency. These and other defects of personality indicate a lack of aptitude on the part of the subjects for their occupation. But whenever we are considering how to prevent accidents we must avoid the danger of laying too much blame on abnormalities of temperament and personality. We should indeed be guilty of a grave error if for any reason we discouraged the manufacture of safe machinery. Let us beware lest the concept of the accident-prone person be stretched beyond the limits within which it can be a fruitful idea.'

It is tempting to think that we can reduce accidents by psychological testing but such tests are not easy and few accidents seem to occur because individuals are accident-prone. Eysenck9, however, shows that personality testing of South African bus-drivers reduced accident rates by 60% over 10 years. Whatever may be the case in some industries, in the process industries very few individuals seem to have more than their fair share of accidents. If anyone does, we should obviously look at his work situation and if it is normal, consider whether he is suitable for the particular job. Remember, however, that a worker may have more than the average share of accidents by chance.

Swain8 describes several attempts to measure the contribution of accident-prone individuals to the accident rate. In one study of 104 railway shunters over three years, the ten shunters with the highest accident rate in the first year were removed from the data for the following two years. The accident rate for these two years actually rose slightly. Similar results were obtained in a study of 847 car drivers.

Suppose that in a factory of 675 workers there are in a given period 370 accidents. Obviously there are not enough accidents to go round

and many workers will have no accidents and most of the others will have only one. If the accidents occur at random, then:
• 11 workers will have 3 accidents each;
• 1.5 workers will have 4 accidents each;
• 0.15 workers will have 5 accidents each.
That is, once in every six or seven periods we should expect one worker to have five accidents by chance.

Swain8 describes a study of 2300 US railway employees. They were divided into 1828 low-accident men, who had four or less accidents, and 472 high-accident men, who had five or more accidents. If the accidents were distributed at random then we would expect 476 men to have five or more accidents.

In examining accident statistics it is, of course, necessary to define the boundaries of the sample before examining the figures. If we select a group of accidents and then fix the accident type, job definition and time period so that the number of accidents is unusually high, we can prove anything. This is similar to the Texas sharpshooter who empties his gun at the barn door and then draws a target round the bullet holes to prove what a good shot he is.

Another reason why some men may have an unusually large number of accidents is that they deliberately but unconsciously injure themselves in order to have an excuse for withdrawing from a work situation which they find intolerable10 (or, having accidentally injured themselves, use this as an excuse to withdraw from work). If we believe that a man's injuries occur in this way then we have to try to find out why he finds work intolerable. Perhaps he does not get on with his fellow workers; perhaps his job does not provide opportunities for growth, achievement, responsibility and recognition (see Appendix 1). Obviously changing the job so as to remove opportunities for accidents will not prevent accidents of this type. The people will easily find other ways of injuring themselves. High accident rates in a company may even be a form of mass psychogenic illness, a mass reaction to poor working conditions21.
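The 'by chance' figures quoted above follow from treating the accidents as falling at random on the workforce, so that each worker's count is approximately Poisson-distributed with mean 370/675, about 0.55 accidents per worker. A short sketch (an illustration added here, not part of the original text) reproduces them:

```python
import math

def expected_workers(n_workers, n_accidents, k):
    # Mean accidents per worker if the accidents fall at random.
    lam = n_accidents / n_workers
    # Poisson probability that a given worker has exactly k accidents,
    # scaled up to the expected number of such workers.
    p_k = math.exp(-lam) * lam ** k / math.factorial(k)
    return n_workers * p_k

for k in (3, 4, 5):
    print(k, round(expected_workers(675, 370, k), 2))
# Gives roughly 10.71, 1.47 and 0.16 workers: the 11, 1.5 and 0.15
# quoted in the text.
```

The reciprocal of the last figure, about 1/0.16, also reproduces the remark that one worker with five accidents is expected roughly once in every six or seven periods.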

To sum up on accident proneness, if a man has an unusually large number of accidents, compared with his fellows, this may be due to:
• chance;
• lack of physical or mental ability;
• personality;
• possibly, psychological problems.
Accidents may be a symptom of withdrawal from the work situation. In some countries accidents may occur because employees' command of the written or spoken language used is poor or because their education and background make them reluctant to ask questions; the culture of the home and the factory may not be the same11. The last three categories do not seem to contribute a great deal to accidents in the process industries.

4.4 Mind-sets
We have a problem. We think of a solution. We are then so busy congratulating ourselves that we fail to see that there may be a better solution, that some evidence points the other way, or that our solution has unwanted side-effects. This is known as a 'mind-set' or, if you prefer a more technical term, Einstellung. 'When a scenario fits the current situation, there is a tendency to believe that this scenario is the only one that can occur. Other scenarios may well be possible, but these are rejected automatically.'22 De Bono writes, 'Even for scientists there comes a point in the gathering of evidence when conviction takes over and thereafter selects the evidence.'12

Mind-sets are described by Raudsepp13:
'Most people, when faced with a problem, tend to grab the first solution that occurs to them and rest content with it. Rare, indeed, is the individual who keeps trying to find other solutions to his problem. This is especially evident when a person feels under pressure.
'And once a judgement is arrived at, it is difficult to revise or drop it in the face of contradictory evidence. Once an explanation is articulated, we tend to persevere in it even when the evidence is overwhelming that we are wrong.
'Many interesting psychological experiments have demonstrated the fixating power of premature judgements. In one experiment, color slides of familiar objects, such as a fire hydrant, were projected upon a screen. People were asked to try to identify the objects while they were still out of focus. Gradually the focus was improved through several stages. The striking finding was this: If an individual wrongly identified an object while it was far

out of focus, he frequently still could not identify it correctly when it was brought sufficiently into focus so that another person who had not seen the blurred vision could easily identify it. What this indicates is that considerably more effort and evidence is necessary to overcome an incorrect judgement, hypothesis or belief than it is to establish a correct one. A person who is in the habit of jumping to conclusions frequently closes his mind to new information, and limited awareness hampers creative solutions.'

Mind-sets might have been discussed under 'training', as the only way of avoiding them seems to be to make people aware of their existence and to discourage people from coming to premature conclusions. However, they are included in this chapter because they are a feature of people's mental abilities which is difficult to overcome and which we have to live with. Here are some examples of mind-sets. Others are described in Sections 2.9.4, 3.2 and 10.2 (pages 35, 50 and 178).

4.4.1 An operator's mind-set
A good example is provided by an accident that occurred many years ago on the coker shown, in a simplified form, in Figure 4.1.

Figure 4.1 Simplified line diagram of coker (labels: 'To condenser', 'From storage')
The accident occurred while the unit was being started up following a shutdown to empty the coker of product, an operation that took place every few days. The normal procedure was to fill the plant with oil by opening the vent on the top of the dephlegmator and operating the low pressure filling pump and high pressure circulating pump in parallel. When oil came out of the vent, it was closed, the filling pump shut down, circulation established and the furnace lit. This procedure, though primitive, was not unsafe as the oil had a high flash-point (32°C).

On the night of the accident the afternoon shift operator forgot to open the vent. When the night shift came on duty at 11.00 pm they found that the plant was filling with oil more slowly than usual and that the pump delivery pressure was higher than usual. (Though far above normal for the filling stage, it was below the normal on-line operating pressure and so the relief valves did not lift.) As it was a very cold night they decided (about 2.00 am) that the low pumping rate and high pressure were due to an unusually high viscosity of the oil and they decided to light one burner in the furnace. Their diagnosis was not absurd, merely wrong. On earlier occasions lighting a burner had cured the same symptoms. On this occasion it did not. The filling pump got hot and had to be shut down. It was pumping against a rising pressure, which it could not overcome. The operators, however, ignored this clue and blamed the overheating of the pump on poor maintenance. They were reluctant to consider that their theory might be wrong. Finally, at about 5.00 am the pump delivery pressure started to rise more rapidly. The operators at last realized that their theory might be wrong. They decided to check that the vent valve was open. They found it shut. Before they could open it an explosion occurred, killing one of the operators.

The cause of the explosion is interesting, though not related to the subject of this book. The dephlegmator acted as a giant slow-moving diesel engine, the rising level of oil compressing the air and oil vapour above it and raising their temperature until it exceeded the auto-ignition temperature of the oil vapour.

The incident is also interesting as an example of the slips discussed in Chapter 2. On the plant and a neighbouring one there had been 6000 successful start-ups before the explosion. The design and method of working made an error in the end almost inevitable but the error rate (1 in 6000) was very low, lower than anyone could reasonably expect. (1 in 1000 would be a typical figure.) However, it may be that the vent valve had been left shut before but found to be so in time, and the incident not reported. If it had been reported, other shifts might have been more aware of the possible error.
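To put those two rates in perspective, a back-of-envelope calculation (added here as an illustration; only the figures 1 in 1000 and 6000 come from the text above) shows what the 'typical' error rate would have produced over the same number of start-ups:

```python
# Illustrative arithmetic only: at the 'typical' rate of 1 error in 1000
# opportunities, 6000 start-ups would be expected to produce about six
# missed vents, and a run with none at all would be very unlikely.
p_error = 1 / 1000    # typical human error rate quoted in the text
n_startups = 6000     # successful start-ups before the explosion

expected_errors = n_startups * p_error          # binomial mean, n * p
p_error_free_run = (1 - p_error) ** n_startups  # chance of no errors at all

print(expected_errors)             # 6.0
print(round(p_error_free_run, 4))  # about 0.0025
```

In other words, the observed record of one slip in 6000 start-ups really was far better than the typical figure, which is the point being made.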

4.4.2 A designer's mind-set
A new tank was designed for the storage of refrigerated liquid ethylene at low pressure (a gauge pressure of 0.8 psi or 0.05 bar). Heat leaking into the tank would vaporize some ethylene which was to be cooled and returned to the tank. When the refrigeration unit was shut down — several days per year — the ethylene in the tank would be allowed to warm up a few degrees so that the relief valve lifted (at a gauge pressure of 1.5 psi or 0.1 bar) and the vapour would be discharged to a low stack (20 m high).

After construction had started, it was realized that on a still day the cold, heavy ethylene vapour would fall to ground level where it might be ignited. Various solutions were considered and turned down. The stack could not be made taller as the base was too weak to carry the extra weight. The stack could not be turned into a flare stack, as the heat radiation at ground level would be too high. Finally, someone had an idea: put steam up the stack to warm up the ethylene so that it continued to rise and would not fall to the ground. This solution got everyone off the hook and was adopted.

When the plant was commissioned, condensate from the steam, running down the walls of the stack, met the cold ethylene gas and froze, completely blocking the 8 inch diameter stack. The tank was overpressured and split near the base. Fortunately the escaping ethylene did not ignite and the tank was emptied for repair without mishap. Afterwards a flare stack was constructed.

In retrospect, it seems obvious that ethylene vapour at −100°C might cause water to freeze, but the design team were so hypnotized by their solution that they were blind to its deficiencies. Use of a formal technique, such as a hazard and operability study15, for considering the consequences of changes would probably have helped.

This incident is interesting in another respect. For 11 hours before the split occurred the operators (on two shifts) were writing down on the record sheet readings above the relief valve set point. They steadied out at a gauge pressure of 2 psi (0.14 bar), the full-scale deflection of the instrument. The operators did not realize the significance of the readings, and took no action. They did not even draw the foremen's attention to them and the foremen did not notice the high readings on their routine tours of the plant. Obviously better training of the operators was needed, but in addition a simple change in the work situation would help: readings above which action is required should be printed, preferably in red, on the record sheets.

4.4.3 A classical mind-set
If marble columns lie on the ground while awaiting erection, they may become discoloured. In Renaissance times, therefore, they were often supported on two wooden beams, as shown in Figure 4.2(a). Someone realized that a long column might break under its own weight, as shown in Figure 4.2(b). A third beam was therefore added (Figure 4.2(c)). This had the additional advantage that it was easier to get a lifting rope underneath them. No-one realized that if one of the end beams sank into the ground then the cantilevered end might break under its own weight, as shown in Figure 4.2(d).

(a) (b) (c) (d)
Figure 4.2 Two ways of supporting a marble column so that it is not in contact with the ground.

Petroski comments that, as with so many design modifications, the modification was made to overcome one problem but created another problem. It is the designer's responsibility, he says, to see how a change interacts with the rest of the existing system, and possible failure modes33.

4.4.4 A simple mind-set
Riggers were trying to lift, with a crane, a casting weighing 180 kg (400 lb). They had been told that it had been unbolted from its support but it did not budge. Nevertheless, they continued to try to raise it. On the third attempt the

nylon sling broke, though it was rated for 5 tonnes. This caused the crane hook to swing, fortunately without causing any injury or damage. The casting was in fact still secured and in addition had sharp edges that may have cut the sling. There was no load cell to indicate the force being applied by the crane34.

4.4.5 Scholarly mind-sets
Mind-sets are not restricted to operators and designers. Here are two examples.

Some camel-skin parchments were offered for sale in Jordan in 1966. The writing was similar to early Hebrew but not identical with it. A distinguished biblical scholar dated them to the 9th–7th centuries BC — 600 or 700 years earlier than the Dead Sea Scrolls — and suggested that the language was Philistine. If so, they would be the first documents yet discovered in that language. Later, other scholars pointed out that the writing was a copy of a well-known early Hebrew inscription, but with the words in the wrong order. The documents were obviously fakes. The original identifier of the documents stuck to his opinion. The parchments were dated by radio-carbon tests. These showed that they were modern. The scholar then said that the tests were meaningless; the parchments had been handled by so many people that they could have been contaminated16.

There are many examples of distinguished scholars who, having adopted a theory, continue to believe in it even though the evidence against it appears, to others, to be overwhelming. One classical scholar, convinced that the ancient Greeks could write nothing but great poetry, thought he had discovered the metrical system for the poetry of the Linear B tablets — no mean poetic achievement when one realizes that these tablets are administrative texts connected with the wool industry, the flax industry, coppersmithing and the manufacture of perfumes and unguents17.

Miller writes35: 'If a theory is persuasive enough ... it can successfully eliminate or discredit any evidence which might be regarded as contradictory ... The history of science is full of such examples ... once an idea lodges in the imagination ... Even when the theory has become an intellectual slum, perilously propped and patched, scientists will accommodate inconsistent or anomalous findings by decorating the accepted theory with hastily improvised modifications ... the community will not abandon the condemned premises until alternative accommodation has been developed.'

4.4.6 Cognitive dissonance
These examples illustrate what is sometimes called 'cognitive dissonance' — literally, an unpleasant noise in the mind23. If new information upsets our beliefs or means that we have to change our ways, we deny it or minimize its importance. A person believes what he wants to believe and rejects what does not verify what he already knows, or thinks he knows. The historian Barbara Tuchman writes that in wartime, 'It is very difficult for a recipient of secret information to believe its validity when it does not conform to his preconceived plans or ideas.'24 As Paul Johnson writes (in a different context), 'As with all conspiracy theories, once the first imaginative jump is made, the rest follows with intoxicating logic,' and, 'preconceived notions can be more damaging than cannon.'25

What should we do when we meet this reaction? If time permits, we should be patient. We cannot expect people to discard in a few minutes, or even a few weeks, the beliefs they have held for years. However, if we believe that incorrect diagnosis of a plant problem may cause an accident, or operating difficulties, there may be no time for patience. We have to persuade those concerned that another scenario is possible. The diagnosis is probably recent and will not be as deeply ingrained as views that have been held for a long time.

4.4.7 Notorious mind-set
A frightening example of people's ability to see only the evidence that supports their view and ignore the rest is provided by the story of the witches of Salem, as told by Marion Starkey13. In the 17th century several people in the town of Salem in New England were accused of witchcraft and executed, and for a time hysteria gripped the town. Even after the original accusers had suspiciously fled, some of those in authority were unwilling to admit that they might have been wrong. Similar incidents occurred in England, and more frequently in Scotland, at about the same time. Until the 13th century the Church considered the traditional power of witches as imaginary, bred of insanity or dreams; it was heretical to express belief in their reality36. We are not always more enlightened than our forebears.

References in Chapter 4
1. Sell, R.G., Petroleum Review, January 1964, page 34; April 1982; London.
Ergonomics Versus Accidents (Ministry of Technology, UK).
2.

1973. 1965. J. 25. 20. 21. Pamphlet No 4 (Tavistock Publications, O., Sevenoaks, J.M., Swain, Lloyds Casualty Week, J., USA. Schmitt, Tuchman, D., The Devil in Massachusetts (Knopf, A., 1983, 10(3): 66, Houston, 1984. US Department of Energy, page 1064 (English 3, Texas, page 18, Aldershot, 10, 1984, page 69 (Butterworths, page 47, 3rd edition, New York, 2, and Kumamoto, 9(5): 74, 1984, 1975. Hazop and Hazan — Identifying and Assessing Process Industry Hazards, No. Pipkin, 1982, 6, 075, and Fitzgerald, 1985, 99-20, 7, E., page 27, 1949, London. Loss Prevention Bulletin, 403, 1969). Johnson, 1985. Raudsepp, Eysenck, H. (editor), Safety Management (South Africa), page 95 (Gulf Publishing Co., quoted in Disaster Prevention and Management, London, 1987, 1969). 1985. 11, 1999, A., in Vervalin, 60(9): 29, 14, M., 2, 19, 1984, T., 21, O., 1973, 1965, J., 25, 20, 21. Sevenoaks, UK. Hill, Biblical Archaeology Review, J.J., 211 (Weidenfeld and Nicholson, 4, 1997. Safety Management (South Africa), page 335 (Prentice-Hall, UK).
Bond, J.M., 21. J.D., Swain, A.D., July 1984. Tuchman, B., The First Salute, page 254 (Knopf, New York, 1988), quoted in Disaster Prevention and Management, 10(3): 66. Houston, Texas, USA. Schmitt, Rails to Disaster, 1983. US Department of Energy, page 1064 (English Universities Press, USA). A History of the Jews, page 18. Aldershot, UK. 1984, page 69 (Butterworths, London, 1990), page 47, 3rd edition, New York, USA. Henley, E.J., and Kumamoto, H., Designing for Reliability and Safety Control, 9(5): 74, 1984. 1975. Kletz, T.A., Hazop and Hazan — Identifying and Assessing Process Industry Hazards, 4th edition (Institution of Chemical Engineers, Rugby, UK, 1999). Pipkin, O.A., 1982. 6, No. 075, and Fitzgerald, 1985. Operating Experience Weekly Summary, No. 99-20, page 7 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA, 1999). Starkey, M.L., The Devil in Massachusetts (Knopf, New York, USA, 1949) (also published by Anchor Books). A work situation approach to job safety, in Widner, J.T. (editor), Selected Readings in Safety, page 371 (Academy Press, Macon, Georgia, USA, 1973). Hill, J.M., and Trist, E.L., Industrial Accidents, Sickness and Other Absences, Pamphlet No 4 (Tavistock Publications, London, UK, 1962). Hunns, D.M., Terotechnica, 2. Hunter, D., The Diseases of Occupations, 5th edition. de Bono, E., An Atlas of Management Thinking, page 129 (Maurice Temple Smith, London, UK, 1981). Eysenck, H.J., Fact and Fiction in Psychology, Chapter 6 (Penguin Books, 1965). Raudsepp, E., Hydrocarbon Processing, 60(9): 29, 1981. Johnson, P., A History of the Jews, page 211 (Weidenfeld and Nicholson, London, UK, 1987), pages 72 and 59 (Allen and Unwin, London, UK). Vervalin, C.H. (editor), Fire Protection Manual for Hydrocarbon Processing Plants, page 95 (Gulf Publishing Co., Houston, Texas, USA, 1985). Muhly, J.D., Biblical Archaeology Review, 10(3): 66, July 1984. Foxcroft, Gerard, Love, Safety Management (South Africa), September 1984, September 1989, page 33. Lloyds Casualty Week, 18 April 1977. Loss Prevention Bulletin, No. 075, page 27. Bond, J., The Chemical Engineer, No. 403, page 29. Psychogenic Illness (Erlbaum, Englewood Cliffs, New Jersey, USA), page 335 (Prentice-Hall, New Jersey, USA). Kletz, T.A., Critical Aspects of Safety and Loss Prevention, page 69 (Butterworths, London, UK, 1990). The witches of Salem are also the subject of Arthur Miller's play, The Crucible.

The Body in Question, Miller, J., pages 189 and 190 (Cape, London, UK, 1978). 35. 34. 36. Petroski, H., Design Paradigms, Chapter 14 (Cambridge University Press, Cambridge, UK, 1994). 37. Alarm Systems, a Guide to Design, Management and Procurement, 1999 (Engineering Equipment and Materials Users Association, London, UK). Marsden, P., Disaster Prevention and Management, 7(5): 401, 1998. Nelson, G., Daily Telegraph, 28 October 1996, page 5. Schaller, Natural History, May 1996, page 48. Graves, A., 1998. Peel, and Southern, The Trials of the Lancashire Witches (David & Charles, Newton Abbot, UK, 1969) (also published by Hendon Publishing, UK). The Use of Computers in Safety Critical Applications, Study Group on the Safety of Operational Computer Systems, page 13 (HSE Books, Sudbury, UK, 1998). Operating Experience Weekly Summary, No. 98-50, page 9 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA, 1998). 28. Operating Experience Weekly Summary, No. 99-21, page 7 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA, 1999). 29. 30. 31. 32. 33.

Accidents due to failures to follow instructions

'When my boss doesn't do it, he is too busy. When I don't do it, I am lazy.'
Anon

This chapter considers some accidents which occurred not because of slips, ignorance or lack of ability, but because of a deliberate decision to do or not to do something. These decisions are often called violations but many are errors of judgement: sometimes people break the rules to make life easier for themselves but at other times they do so with the intention of benefiting the organization as a whole. Non-compliance is therefore a better term than violation. These decisions are divided below into those made by managers and those made by operators. The division into two categories is not, of course, absolute.

The errors discussed in this chapter and the next are the only ones in which people have any choice and in which blame has any relevance. If operators cut corners it may be because they are not convinced that the procedure is necessary, in which case the accident is really due to a lack of training (see Chapter 2). They may however cut corners because all people carrying out a routine task become careless after a time, and they may also cut corners if their bosses do not check up from time to time. Junior managers may need training themselves, if they are not convinced of the need for safety procedures. Managers should keep their eyes open to see that the proper procedures are being followed.

As with the slips and lapses of attention discussed in Chapter 2, whenever possible we should look for engineering solutions — designs which are not dependent on the operator carrying out a routine task correctly — but very often we have to adopt a software solution, that is, we have to persuade people to follow rules and good practice. Even here, before blaming people for so-called violations, we should ask:
• were the rules known and understood?
• was it possible to follow them?
• did they cover the problem?
• were the reasons for them known?
• were earlier failures to follow the rules overlooked?
• was he or she trying to help?
• if there had not been an accident, would he or she have been praised for their initiative?

There is a fine line between showing initiative and breaking the rules:

'The carefully considered risk, well thought out and responsibly taken, is commended. Unless it fails, when like as not, it will be regarded as a damn stupid thing to have done in the first place. The taking of risks without proper responsible thought is reckless. Unless it comes off, when like as not, it will be regarded as inspired.'12

The English language contains a number of irregular verbs such as:
• I am firm;
• you are stubborn;
• he is pig-headed.
to which we can add:
• I show initiative;
• you break the rules;
• he is trying to wreck the job.

The examples that follow illustrate the ways in which non-compliance can be prevented or reduced. We should:
• Explain the reasons for the rules and procedures. We do not live in a society in which people will automatically do what they are told. They want to be convinced that the actions are necessary. Rules imposed by authority rather than conviction soon lapse when the boss moves on or loses interest. The most effective way of convincing them is to describe and discuss accidents that would not have occurred if the rules had been followed.
• Make sure everyone understands the rules. Do not send them through the post but explain them to and discuss them with the people who will have to carry them out.
• If possible, make the task easier. If the correct way of carrying out a job is difficult and there is an easier, incorrect way, the incorrect way will be followed as soon as our backs are turned. If people have to walk up and down stairs several times to operate a series of valves, there is a great temptation to operate them in the wrong order. We can tell them that this lowers the plant efficiency but moving the valves or operating them remotely will be more effective.

• Carry out checks and audits to make sure that the rules are being followed. We are not expected to stand over people all the time but we are expected to make occasional checks and we should not turn a blind eye when we see someone working unsafely (see Section 13.2, page 223). To repeat the spelling example of Section 1.4 (page 6): I should be blamed for writing 'thru' only if there is a clear instruction not to use American spelling and no-one turned a blind eye when I used it in the past.

In a survey of the reasons for non-compliance13, employees said that the following were the most important:
• if followed to the letter the job wouldn't get done;
• people are not aware that a procedure exists;
• people prefer to rely on their skills and experience;
• people assume they know what is in the procedure.
They recommended the following strategies for improvement:
• involving users in the design of procedures;
• writing procedures in plain English;
• updating procedures when plant and working practices change;
• ensuring that procedures always reflect current working practices (see Section 3.7, page 67).

5.1 Accidents due to non-compliance by managers
Chapter 6 considers the errors of senior managers who do not realize that they could do more to prevent accidents. This section is concerned with deliberate decisions not to carry out actions which someone has clearly been told to take or which are generally recognized as part of the job.

Accidents of this type have been particularly emphasized by W.B. Howard, one of the best-known US loss prevention engineers, particularly in a paper sub-titled 'We aint farmin' as good as we know how'1. He tells the story of a young graduate in agriculture who tried to tell an old farmer how to improve his methods. The farmer listened for a while and then said, 'Listen son, we aint farmin' now as good as we know how.' Howard describes several accidents which occurred because managers were not 'farming' as well as they could — and should — have done. For example: due to pressure of work a maintenance section were not carrying out the routine examination of flame arresters. The design was changed so that the job could be done by operators. Another example:

An explosion occurred in a hold tank in which reaction product had been kept for several hours. It was then found that:
• The pressure had been rising for several hours before the explosion but the operator on duty that day had had no training for the operation and did not realize the significance of the pressure rise.
• No tests had been made to see what happened when the reaction product was kept for many hours. Tests had been repeatedly postponed because everyone was 'too busy' running the unit.
The accident was not the result of equipment failure or human error as usually thought of — that is, a slip or lapse of attention (see Chapter 2) — but rather the result of conscious decisions to postpone testing and to put an inexperienced operator in charge.

Before you blame the operating management, remember they were not working in a vacuum. They acted as they did because they sensed that their bosses put output above safety. In these matters official statements of policy count for little. Little things count for more. When senior managers visited the plant, did they ask about safety or just about output? (See Section 3.5, page 65.) If someone telephoned the head office to report an incident, what was the first question? Was it, 'Is anyone hurt?' or was it, 'When will you be back on line?'

Many years ago, when I was employed on operations, my works manager changed. Both managers spent a good deal of their time out on the site, and were good at talking to operators. The first works manager, unlike the second, frequently brought safety matters into the conversation. After a few months, someone told the new works manager that the employees believed him to be less interested in safety than his predecessor. He was genuinely shocked. 'Whatever have I said,' he asked, 'to give that impression? Please assure everyone that I am fully committed to safety.' It was not what he had said that created the impression but what he had not said.

Howard also described a dust explosion and fire which occurred because the operating management had bypassed all the trips and alarms in the plant, in order to increase production by 5%.

Another example of a deliberate management decision was described in Hansard2. Many years ago, after a disgruntled employee had blown the whistle, a factory inspector visited a site where there were six tanks containing liquefied flammable gas. Each tank was fitted with a high level alarm and a high level trip. The factory inspector found that five of the alarms were not working, that no-one knew how long they had been out of order and that the trips were never tested, could not be tested and were of an unreliable design.
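A rough calculation shows why untested trips and alarms give so little real protection. A standard first-order model from reliability engineering (not from this chapter; Kletz treats it in Hazop and Hazan, cited in the references) says that a trip whose unrevealed failure rate is λ, proof-tested at intervals T, is failed for a fraction of roughly λT/2 of the time, so the longer testing is postponed, the more likely the trip is dead when the demand arrives. The numbers below are purely illustrative:

```python
# Fractional dead time (FDT) of a protective trip or alarm: the fraction of
# time it is failed, unrevealed, between proof tests. The approximation
# lambda * T / 2 is only valid while lambda * T is small compared with 1.
def fractional_dead_time(failure_rate_per_year: float, test_interval_years: float) -> float:
    return failure_rate_per_year * test_interval_years / 2.0

FAILURE_RATE = 0.5  # illustrative: one unrevealed failure every two years

for interval in (0.25, 0.5, 1.0):  # tested quarterly, half-yearly, yearly
    fdt = fractional_dead_time(FAILURE_RATE, interval)
    print(f"test every {interval:4.2f} y -> dead {fdt:.1%} of the time")
```

A trip that is never tested drifts towards being dead all the time, which is effectively the state the factory inspector found.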

In this case the top management of the organization concerned was committed to safety, as they employed a large team of safety specialists in their head office, estimating accident probabilities. (Their calculations were useless as the assumptions on which they were based — that trips and alarms would be tested — were not correct.) They would have been better employed out on the plant testing the alarms. However the top management had failed to get the message across to the local management who took a deliberate decision not to test their trips and alarms.

When managers make a deliberate decision to stop an activity, they usually let it quietly lapse and do not draw attention to the change. However one report frankly stated, 'The data collection system ... was run for a period of about 3 years ... The system is now no longer running, not because it was unsuccessful, but because the management emphasis on the works has changed. The emphasis is now on the reduction of unnecessary work, not on the reduction of breakdowns.'3

Accidents are sometimes said to be system failures or organizational failures but systems and organizations have no minds of their own. Someone, a manager, has to change the system or organization. The accident report should say who should do so, in which way and by when.

5.1.1 Chernobyl
Chernobyl, a town in the Ukraine, was the scene of the world's worst nuclear accident, in 1986, when a water-cooled reactor overheated and radioactive material was discharged to the atmosphere. Although only about 30 people were killed immediately, several thousand more may die during the next 30 years — a one-millionth increase in the death rate from cancer in Europe. However, this forecast is based on the pessimistic assumption that the risk is proportional to the dose even when the dose is a small fraction of the normal background radiation.

There were two major errors in the design of the Chernobyl boiling water reactor, a design used only in the former Soviet Union:
(1) The reactor was unstable at outputs below 20%. Any rise in temperature increased the power output and the temperature rapidly rose further. In all other commercial nuclear reactor designs a rise in temperature causes a fall in heat output. The Chernobyl reactor was like a marble balanced on a convex surface. If it started to move, gravity made it move increasingly fast. Other reactors are like a marble on a concave surface (Figure 5.1, page 102). If the temperature rises the heat output falls.

Chernobyl — Other nuclear reactors
Figure 5.1 The Chernobyl nuclear reactor was like a marble on a convex surface. If it started to move, gravity made it move increasingly fast. Other nuclear reactors are like a marble on a concave surface.

(2) The operators were told not to go below 20% output but there was nothing to prevent them doing so.

The accident happened during an experiment to see if the reactor developed enough power, while shutting down, to keep auxiliary equipment running during the minute or so that it took for the diesel-driven emergency generators to start up. There was nothing wrong with the experiment but there were two major errors in the way it was carried out:
• the plant was operated below 20% output;
• the automatic shutdown equipment was isolated so that the experiment could be repeated.
When the temperature started to rise, it rose 100-fold in one second.

It is not clear why those in charge at Chernobyl ignored two major safety precautions. Perhaps they did not fully understand the hazards of the actions they took. Perhaps the importance of carrying out the tests — there would not be another opportunity for a year — had given them the impression that the safety rules could be relaxed (see Section 3.5, page 65). Whatever the reason, the most effective way of preventing similar accidents in the future is to change the design. The reactors should be redesigned so that they are not unstable at low outputs and it should be impossible to switch off the protective equipment. While we should try to persuade people to follow the rules — by explaining the reasons for them and by checking up to see that they are followed — when the results of not following the rules are serious we should not rely on this alone and should design safer plants8,9.

Another fundamental error at Chernobyl was a failure to ask what would occur if the experiment was not successful. Before every experiment or change we should list possible outcomes and their effects and decide how they will be handled.
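The marble analogy used for the reactor instability can be put in numbers with a toy model (mine, not the book's): at each time step a temperature disturbance is multiplied by a feedback gain, greater than 1 for the Chernobyl design at low output and less than 1 for designs in which a temperature rise cuts the heat output:

```python
# Toy feedback model: a temperature disturbance is multiplied by a gain g
# at each step. g > 1 -> runaway (marble on a convex surface);
# g < 1 -> the disturbance dies away (marble on a concave surface).
def evolve(disturbance: float, gain: float, steps: int) -> float:
    for _ in range(steps):
        disturbance *= gain
    return disturbance

print(f"gain 1.6: {evolve(1.0, 1.6, 10):.1f}")   # grows roughly 100-fold in 10 steps
print(f"gain 0.6: {evolve(1.0, 0.6, 10):.4f}")   # shrinks to roughly 0.006
```

The gains and step count are arbitrary; the point is only that with geometric growth no operator can react fast enough, which is why the cure had to be a change of design, not of behaviour.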

5.2 Accidents due to non-compliance by operators
These are also deliberate decisions, but easily made. When operators have to carry out routine tasks, the temptation to take short cuts is great, and an accident is unlikely if the task, or part of it, is omitted or shortened, perhaps because the operator is exceptionally busy or tired. There is a temptation, for a quick job, not to bother. It is done once, then a second time and soon becomes custom and practice. No-one says anything, to avoid unpleasantness, and nothing goes wrong, so they do so again. Some examples follow.

5.2.1 No-one knew the reason for the rule
Smoking was forbidden on a trichloroethylene (TCE) plant. The workers tried to ignite some TCE and found they could not do so. They decided that it would be safe to smoke. No-one had told them that TCE vapour drawn through a cigarette forms phosgene.

5.2.2 Preparation for maintenance
Many supervisors find permit-to-work procedures tedious. Their job, they feel, is to run the plant, not fill in forms. The fitter is experienced, has done the job before, so let's just ask him to fix the pump again. Ultimately the fitter dismantles the wrong pump, or the right pump at the wrong time, and there is an accident. Or the fitter does not bother to wear the proper protective clothing requested on the permit-to-work; ultimately the fitter is injured.

To prevent these accidents, and most of those described in this chapter, a three-pronged approach is needed:
(1) We should try to convince people that the procedure — in this case a permit-to-work procedure — is necessary, preferably by describing accidents that have occurred because there was no adequate procedure or the procedure was not followed. Discussions are better than lectures or reports and the Institution of Chemical Engineers' safety training packages on Safer Maintenance and Safer Work Permits4 can be used to explain the need for a permit-to-work system. The discussion leader should outline the accident and then let the group question him to find out the rest of the facts. The group should then say what they think should be done to prevent the accident happening again.
(2) We should make the system as easy to use as possible and make sure that any equipment needed, such as locks and chains and slip-plates, is readily

2. The management had not noticed this and had failed to impress on the operators the importance ofregular analyses and the results of not detecting hydrogen in the oxygen5 (see alsoSection4. protectiveclothing shouldbe as comfortable to wear as possibleand readily available. (3) Managers shouldcheck from time to time that the correctprocedures are being followed. Both streams were supposed to be analysed every hour. of course. factory inspectors went through old record sheets and found that when conditions changed the analytical results on the sheets changed at once.5.2.2. After the explosion. A good managerwould have spotted it and stopped it.2. 5. Engineering solutions. although it would take an hour for a change to occur on the plant. Thefirst step down the road to a serious accidentoccurs when a manager turnsa blind eye to amissingblind. page82). are usually better than reliance on manual tests and operator action and the official report recommendedautomatic monitoring and shutdowns (butsee Section 7. legally and morally. As a result of corrosionsome of the hydrogen had entered the oxygen stream. Ifhe does not. Similarly. Engine drivers were temptedto tamper with the relief valves on 104 . page223). It is more likely that short-cutting has been goingon for weeks or months. expectedto stand over his team at ail times. 5.3 An incident on a hydrogen/oxygen plant Anexplosion occurred on aplantmakinghydrogenandoxygenby theelectrolysis of water.2.AN ENGINEER'S VIEW OF HUMAN ERROR available. even though he is not on the site at the 143). As already stated. it is unlikely that it occurred the first time the short cut was taken. a manageris not.4 An example fromthe railways The railways — or some them — realized early on that when possible of designers should make it difficult or impossible for people to bypass safety equipment. Ifthe correct method of working is difficultor time-consuming then an unsafe methodwill be used. 
But he should carry out periodic inspections to check that procedures are being followed and he should not turn a blind eye when he sees unsafe practices in use (see Section 13.4, page 253). A friendly word the first time someone takes a short cut is more effective than punishing someone after an accident has occurred.

5.2.4 An example from the railways
The railways — or some of them — realized early on that when possible designers should make it difficult or impossible for people to bypass safety equipment. Engine drivers were tempted to tamper with the relief valves on early locomotives. As early as 1829, before their line was complete, the Liverpool and Manchester Railway laid down the following specification for their engines: 'There must be two safety valves, one of which must be completely out of reach or control of the engineman.' However, contrary to popular belief — a view encouraged, to some extent, by locomotive superintendents — very few locomotive boilers blew up because the driver tampered with the relief valve. Most explosions were due to corrosion (preventable by better inspection or better design), poor maintenance or delaying maintenance to keep engines on the road. Many of the famous locomotive designers were abysmal maintenance engineers6.

5.2.5 Taking a chance
A man went up a ladder onto a walkway and then climbed over the handrails onto a fragile flat roof which was clearly labelled. He tried to stand only on the cross girders but slipped off onto the sheeting and fell through into the room below. The accident was put down to 'human failing', the failure of the injured man to follow the rules. It shows, however, that too little attention was paid to safety in the plant and that in 'taking a chance' the injured man was following the example set by his bosses. They were, to some extent, responsible for the accident. When the site of the accident was visited, it was found that the foot of the ladder leading to the roof was surrounded by junk, as shown in Figure 5.2 on page 106. Incidentally, this incident also shows the importance, in investigating an accident, of visiting the scene and not just relying on reports. (See Figure 14.2, page 253.)

5.2.6 Jobs half done
A particular form of non-compliance is to leave a job half-done. This may be harder to spot than a job omitted entirely. The monthly test of a fire alarm system showed that the fire station was unable to receive signals from a number of the heat detectors. Three weeks earlier a new data terminal unit had been installed. The installers checked that the unit could send signals to the fire station; however, they did not check that the fire station was also able to receive signals. It could not do so14.

A woman collected her car from a garage after repair of some accident damage. On the short journey home she had to make several turns. On each occasion she was hooted by other cars. When she got home and looked for the reason she found that the front indicator lights were connected wrongly.

7 Non-compliance can be better than compliance instructions are wrong. 5. Suppose a new fitter asks. non-compliance may prevent an accidentor furtherthe objectives ofthe organization. However.' The rule is often ignored. Did this scene encourage a man to take a chance9 106 . he had checked the rear lights but not the front ones.2. impracticable or simply unwise. well-meaning store managersoften estimate the money tied up by 'squirrels' and issue instructions such as. If Figure 5.He keepsa few in his locker.' In other organizations thejobis bodgedordelayed(orbodging and delaystartwhenFredretires). For example.there is a fuzzy borderbetween non-compliance and initiative. When she complained. 'It is forbidden to draw material from store in anticipation of a possible future requirement. 'What shall I do? There are no size4widgets inthe store. the mechanic said he always checked indicatorhghts to make sure that they were connected correctly.2 Foot oftheladderleadingto a fragileroof. to the organization'sadvantage. 'AskFred.RN ENGINEER'S VIEW OF HUMAN EEROE when she indicated that shc wanted to turn left. the right front indicator flashed. As statedearlier.' He maybe told.

In case anyone is tempted by motivational appeals, perhaps I should say that little or no improvement will result from generalized exhortations to people to work safely, follow the rules, be responsible or otherwise avoid sin. As already mentioned, a search of the literature of industrial psychology has failed to show a single controlled experiment on the real effectiveness of safety motivation campaigns. Swain writes: 'Motivational appeals have a temporary effect because man adapts. He learns to tune out stimuli which are noise, in the sense of conveying no information. Safety campaigns which provide no useful, specific information fall into the "noise" category. They tend to be tuned out.'7 Swain is discussing general exhortations. Information and advice on specific problems can, of course, be effective.

If the group as a whole supports safe methods of working they may be able to influence people who carry out unsafe acts. People are more influenced by their peers than by their bosses. People can be trained to spot unsafe practices and to speak tactfully to those responsible. As discussed in Sections 3.1 and 3.7 (pages 48 and 67), a friendly word before an accident is more effective than a reprimand afterwards. These behavioural science techniques can be very effective in preventing accidents caused by, for example, not wearing the correct protective clothing, using the wrong tool for a job, lifting incorrectly or leaving junk lying around15.

5.3 Actions to improve compliance

To improve compliance with instructions we should first make sure that they are clear, easy to read, up to date, and contain the right amount of detail. If they are too complex people will not refer to them and will simply follow custom and practice. Complex rules are often written to protect the writer rather than help the reader. Table 5.1 (page 108) gives some guidance on the degree of detail required.

We should, whenever possible, involve people in the preparation of the instructions they will be expected to follow. We may find that it is impossible to follow the instructions or that a better method of doing the job is possible. Reference 13 describes a formal procedure for involving operating staff in the development of good practice. We should discuss new instructions with people and also the reasons for them (see page 103). Regular checks should then be made to make sure that instructions are followed.

Table 5.1 Decision aid to determine the level of support required (based on a table supplied by A.G. Foord and W.G. Gulland, private communication)

The table classifies each task by its criticality (high, medium or low), its familiarity (carried out frequently, infrequently or rarely) and its complexity (low, medium or high), and for each combination indicates the level of written support required:

NWI = No Written Instruction required
JA = Job Aid required (for example, a check-list or memory aid)
SBS = Step-By-Step instruction required

Broadly, the more critical, the less familiar and the more complex the task, the greater the support required: no written instruction is needed for familiar, simple tasks of low criticality, while step-by-step instructions are needed for rare, complex, highly critical tasks.
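The logic of a decision aid like Table 5.1 can be sketched in a few lines of code. The sketch below is illustrative only: the scoring scheme and thresholds are my assumptions, not the actual entries of Table 5.1, which should be taken from the table itself.

```python
# Illustrative sketch of a Table 5.1-style decision aid.
# The mapping is assumed: the level of support rises with task
# criticality and complexity, and falls with familiarity.

CRITICALITY = {"low": 0, "medium": 1, "high": 2}
FAMILIARITY = {"frequent": 0, "infrequent": 1, "rare": 2}
COMPLEXITY = {"low": 0, "medium": 1, "high": 2}


def support_level(criticality: str, familiarity: str, complexity: str) -> str:
    """Return NWI (no written instruction), JA (job aid, e.g. a
    check-list or memory aid) or SBS (step-by-step instruction)."""
    score = (CRITICALITY[criticality]
             + FAMILIARITY[familiarity]
             + COMPLEXITY[complexity])
    if score <= 2:
        return "NWI"
    if score <= 4:
        return "JA"
    return "SBS"


print(support_level("low", "frequent", "low"))    # NWI
print(support_level("high", "rare", "high"))      # SBS
```

The point of encoding the aid, even crudely, is that it makes the policy explicit and auditable, rather than leaving the amount of detail in each instruction to the whim of whoever writes it.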

Companies have achieved low lost-time and minor accident rates by behavioural safety training, felt that their safety record was good, and then had a serious fire or explosion because an aspect of the technology was not fully understood or the precautions necessary had been forgotten. Behavioural safety training is effective in preventing accidents that may injure the person carrying out the unsafe act. It cannot, of course, have much effect on accidents due to poor training or instructions, slips or lapses of attention, lack of ability, non-compliance by managers or poor process or engineering design. It has been much less effective in preventing incidents such as the two following. Both occurred in a factory that had introduced behavioural training and achieved major reductions in the accident rate.

There were two filters in parallel — one working, one spare. The working one choked and the spare was put on line. No-one reported this or had the choked filter replaced. When the second filter choked, the unit had to be shut down. In many cases, when we neglect a safety measure, we get away with it; in this case the result was inevitable.

A fitter was asked to dismantle a valve of an unfamiliar type. A more safety-conscious man would have got a spare valve out of store and dismantled that first. However, this would have held up the job, and it would have meant admitting to his fellow workers that he did not know how the valve was constructed. The manager would have accepted the delay, but the culture of the workplace was 'get stuck in and get the job done'. The pipeline he was working on had not been isolated correctly, but if he had dismantled the valve in the correct way the leak could have been controlled. As a result there was a serious leak of a hazardous chemical. The behavioural training was not powerful enough to overcome these two aspects of the culture, and the trainers had not recognized the problem.

Another limitation of behavioural training is that its effects can lapse if employees feel that their pay or working conditions (what Herzberg calls hygienic factors, see Appendix 1) are below expectations.

5.4 Alienation

Failure to follow instructions or recognized procedures can be due to alienation. In extreme cases an employee may deliberately damage products or equipment. From his point of view his decisions are not wrong, as his objectives are not the same as those of the managers. In such cases advice is needed from an employee relations expert rather than a safety professional (see Appendix 1). An example from history rather than industry will show how failure to take the

realities of human nature into account can wreck a project. The 12th century Crusader castle of Belvoir in Israel, overlooking the Jordan Valley, appears impregnable. There are cliffs on three sides, a deep dry moat and several defensive walls. The innermost fortification was reserved for the Crusader knights. How, the visitor wonders, did such a castle ever fall to an enemy? Many of the defenders were mercenaries. The mercenaries knew that they were ultimately expendable, and when the going got rough they were tempted to change sides10. Mercenaries are contractors. If we treat contractors less favourably than employees, we cannot expect the same commitment from them. The safety of a project depends on the quality of the procedures as well as the quality of the hardware.

5.5 Postscript

Describing the attitude of the Amish, an American Puritan sect, towards children, Huntingdon writes11: 'When they disobey it is generally because the parents have not stated the instructions in the way they should, have been inconsistent, have expected too much of the child, or have not set a good example.' The case for and against punishment and prosecution is discussed in Section 13 (page 227).

References in Chapter 5
1. Hansard, 8 May 1980.
2. Kletz, T.A., 1984, Plant/Operations Progress, 3(3): 147.
3. Report of Annual General Meeting, UK Atomic Energy Authority Systems Reliability Service, Warrington, UK, 1974.
4. Health and Safety Executive, 1976, The Explosion at Laporte Industries Limited on 5 April 1975 (HMSO, London, UK).
5. Safety training packages 028 Safer Work Permits and 033 Safer Maintenance (Institution of Chemical Engineers, Rugby, UK).
6. Hewison, C.H., 1983, Locomotive Boiler Explosions (David and Charles, Newton Abbot, UK).
7. Swain, A.D., 1973, 'A work situation approach to improving job safety', in Widner, J.T. (editor), Selected Readings in Safety, page 371 (Academy Press, Macon, Georgia, USA).
8. Kletz, T.A., 2001, Learning from Accidents, 3rd edition, Chapter 12 (Butterworth-Heinemann, Oxford, UK).
9. Gittus, J.H., et al., 1987, The Chernobyl Accident and its Consequences (HMSO, London, UK).
10. Murphy-O'Connor, J., 1998, The Holy Land, 4th edition, page 179 (Oxford University Press, Oxford, UK).
11. Huntingdon, G.E., 1983, quoted by Long, Natural History, page 74.
12. Anon., 1998, Risk and Benefit — A Report and Reflection on a Series of Consultations (St George's House, Windsor, UK).
13. Embrey, D., et al., 1998, 'CARMAN: A systematic approach to risk reduction by improving compliance to procedures', Hazards XIV — Cost Effective Safety, Symposium Series No. 144, page 94 (Institution of Chemical Engineers, Rugby, UK).
14. Operating Experience Weekly Summary, No. 98-48, 1998, page 2 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA).
15. Sellers, G. and Eyre, P., 2000, 'The behaviour-based approach to safety', Hazards XV: The Process, its Safety and the Environment — Getting it Right, Symposium Series No. 147 (Institution of Chemical Engineers, Rugby, UK).

Accidents that could be prevented by better management

'When did you last see the senior members of your team and re-emphasise the policy to them, ask how they were getting on, what problems they were meeting, what success they had had?'
Lord Sieff (former chairman of Marks & Spencer)1

The last four chapters have described four sorts of accidents: those due to slips or lapses of attention, those due to poor training or instruction (mistakes), those due to a mismatch between the job and the ability of the person asked to do it, and those due to non-compliance with rules or accepted practice. The accidents described in this chapter are not due to a fifth type of error. They are due to the failure of senior managers to realize that they could do more to prevent accidents. As stated in Section 5.1, these accidents are sometimes said to be due to organizational failures, but organizations have no minds of their own. They are thus mainly due to lack of training, but some may be due to lack of ability and a few to a deliberate decision to give safety a low priority.

Although senior managers repeatedly say that safety is important, they rarely show the level of detailed interest that they devote to other problem areas. If output, costs, efficiency or product quality require attention, senior managers identify the problems, agree actions and ask for regular reports on progress. This approach is rarely seen where safety is concerned. Many look only at the lost-time accident rate, but in many companies today all it measures is luck and the willingness of injured employees to return to work. If senior managers comment on the lost-time accident rate and nothing else, they give staff the impression that they are not interested in the real problems. The results are the opposite of those intended: everyone thinks, 'If the senior managers are not interested, the problems can't be important.'

Someone has to change the culture of the organization and this needs the involvement, or at least the support, of senior people. During my time as a safety adviser with ICI Petrochemicals Division, my team did bring about a change in the Division's attitude to safety, but we could only do so because everyone knew that the Board supported our activities.

In earlier chapters I tried to make it clear that blame is rarely an appropriate response after an accident. Instead we should look for ways of reducing opportunities for error, of improving training or instruction and so on. The same applies to managers' errors, such as those summarized in Sections 6.1 and 6.2 (pages 114 and 115). They are mistakes, not slips or lapses of attention, as managers have time to check their work. Few are due to a deliberate decision to ignore risks.

This point is worth emphasizing because writers who are ready to excuse the errors of ordinary people, even those of criminals, and prefer to tackle underlying causes rather than look for culprits, seem to take a different view when managers fail. For example, two academics have written a book called Toxic Capitalism: Corporate Crime and the Chemical Industry. A reviewer summarized its message as 'corporations are unable to see anything other than safety as an avoidable cost' and 'the safety professional is helpless against the higher pressures of corporate strategies that play down their role'. I do not think this applies to the majority of companies, though it obviously applies to a few cowboys.

When the accidents described in this chapter were the subject of official reports, the recommendations were quite clear and can be summed up by the words of the report on the Clapham Junction railway accident (Section 6.5, page 118): a concern for safety which is sincerely held and repeatedly expressed but, nevertheless, is not carried through into action, is as much protection from danger as no concern at all. Unfortunately, most managers do not think this applies to them, and who will tell them otherwise? Authors of company accident reports are unlikely to suggest that their directors could do better, and most directors do not attend safety conferences. They send the safety officer instead.

At this point in earlier chapters I summarized the actions needed to prevent accidents similar to those described in the pages that followed.

If output, costs, efficiency and product quality fall, the results are soon apparent and obviously require immediate action. In contrast, if safety standards fall, it may be a long time before a serious accident occurs. The fall in standards is hidden or latent, and managers do not recognize it; it is not the result of a deliberate decision to ignore hazards. This may be one reason why safety is given insufficient detailed attention and instead exhortation to work safely replaces consideration of the real problems. Another reason is that senior managers want a single measure of safety performance. The lost-time accident rate seems to supply this. Other possible measures are discussed in Section 6.8 (page 125).

Directors are not supermen. They fail as often as the rest of us and for much the same reasons: not wickedness, but ignorance, occasionally incompetence, and all the other weaknesses of human nature

that affect the rest of us, such as repeatedly postponing jobs that can wait until tomorrow in order to do something that must be done today, and ignoring matters peripheral to their core responsibilities.

6.1 An accident caused by insularity2

An explosion, which killed four men, occurred in a plant which processed ethylene at high pressure. A leak from a badly-made joint was ignited by an unknown cause. The conclusion drawn was that the leak was due to a badly-made joint and so joints must be made correctly in future. After the explosion many changes were made to improve the standard of joint-making: the training, tools and inspection were all improved. No expense was spared to achieve this aim, but the underlying weaknesses in the management system went largely unrecognized.

Poor joint-making had been tolerated for a long time before the explosion because all sources of ignition had been eliminated, or so it was believed, and so leaks could not ignite. The plant was part of a large group, but the individual parts of it were technically independent. Handling flammable gases at high pressure was, they believed, a specialized technology and little could be learnt from those who handled them at low pressure. Unfortunately the managers of the ethylene plant had hardly any technical contact with the other plants in the group, though they were not far away. The other plants had never believed that leaks of flammable gas will not ignite. They knew from their own experience that sources of ignition are liable to turn up, even though we do everything we can to remove known sources, and therefore strenuous efforts must be made to prevent leaks.

If the management of the plant where the explosion occurred had been less insular and more willing to compare experiences with other people in the group, or if the managers of the group had allowed the component parts less autonomy, the explosion might never have occurred. It is doubtful if the senior managers of the plant or the group ever realized or accepted this or discussed the need for a change in policy. However, some years later, during a recession, the various parts of the group were merged.

The factory was a monastery: a group of people isolating themselves from the outside world. The explosion blew down the monastery walls. Similarly, the official report on the King's Cross fire (see Section 6.3, page 116) said that there was 'little exchange of information or ideas between departments and still less cross-fertilisation with other industries and outside organisations' (Paragraph 9).

6.2 An accident due to amateurism3

This accident is somewhat similar to the last one. A chemical company, part of a large group, made all its products by batch processes. They were acknowledged experts and had a high reputation for safety and efficiency. Sales of one product grew to the point that a large continuous plant was necessary. No-one in the company had much experience of such plants, so they engaged a contractor and left the design to him. The contractor sold them a good process design but one that was poor in other ways: the layout was very congested, the drains were open channels and the plant was a mixture of series and parallel operation. When some of the parallel sections of the plant were shut down for overhaul, other sections were kept on line, isolated by slip-plates.

One of the slip-plates was overlooked. Four tonnes of a hot, flammable hydrocarbon leaked out of an open end and was ignited by a diesel engine used by the maintenance team. Two men were killed and the plant was damaged. The congested design increased the damage and the open drainage channels allowed the fire to spread rapidly.

The immediate cause of the fire was the missing slip-plate. The foreman who decided where slip-plates should be inserted to isolate the two sections should have followed a more thorough and systematic procedure; it should not be left for a foreman to sort out a few days before the shutdown. But the underlying cause was the amateurism of the senior management and their failure to consult the other companies in the group. People can be expert in one field but amateurs in another into which their technology has strayed. If they had consulted the other companies in the group, they would have been told to watch the contractor closely and told some of the points to watch: to avoid congestion; to put the drains underground; to divide the plant into blocks with breaks in between, like fire breaks in a forest; and not to maintain half the plant while the other half is on line. If it is essential to do so, then they would have been told to build the two halves well apart and to plan well in advance, at the design stage if possible.

I doubt if it ever occurred to the senior managers of the company or group that their decisions had led to the accident, though they fully accepted that they were responsible for everything that went on. No changes in organization were made immediately, but a few years later responsibility for the continuous plant was transferred to another company in the group.

6.3 The fire at King's Cross railway station

The fire at King's Cross underground station, London, in 1987 killed 31 people and injured many more. The immediate cause was a lighted match, dropped by a passenger on an escalator, which set fire to an accumulation of grease and dust on the escalator running track. A metal cleat which should have prevented matches falling through the space between the treads and the skirting board was missing, and the running tracks were not cleaned regularly. The fire spread to the wooden treads, skirting boards and balustrades, and after 20 minutes a sudden eruption of flame occurred into the ticket hall above the escalator.

Although the combination of a match, grease, dust and a missing cleat were the immediate causes of the fire, an underlying cause was the view, accepted by all concerned, including the highest levels of management, that occasional fires on escalators and other equipment were inevitable and could be extinguished before they caused serious damage or injury. From 1958 to 1967 there were an average of 20 fires per year, called 'smoulderings' to make them seem less serious. Some had caused damage and passengers had suffered from smoke inhalation, but no-one had been killed. The view thus grew that no fire could become serious, and they were treated almost casually.

Yet escalator fires could have been prevented, or reduced in number and size, by replacing wooden escalators by metal ones, by using non-flammable grease, by replacing missing cleats, by regular cleaning, by installing smoke detectors which automatically switched on the water spray, by better training in fire-fighting and by calling the fire brigade whenever a fire was detected, not just when it seemed to be getting out of control. No water was applied to the fire. The water spray system installed under the escalator was not actuated automatically, and the acting inspector on duty walked right past the unlabelled water valves. London Underground employees, promoted largely on the basis of seniority, had little or no training in emergency procedures and their reactions were haphazard and unco-ordinated.

The management of safety in London Underground was criticized in the official report. There was no clear definition of responsibility, no interest at senior levels, no auditing, no defence in depth. Recommendations made after previous fires were not followed up. It seems that London Underground ran trains very competently and professionally but was less interested in peripheral matters such as stations. (In the same way, many process managers give service lines less than their fair share of attention, and the service lines are involved in a disproportionate number of incidents.) (See also the quotation at the end of Section 6.1, page 114.)

6.4 The Herald of Free Enterprise5

In 1987 the cross-Channel roll-on/roll-off ferry Herald of Free Enterprise sank, soon after leaving Zeebrugge in Belgium en route for Dover, with the loss of 186 passengers and crew. The inner and outer bow doors had been left open because the assistant bosun, who should have closed them, was asleep in his cabin and did not hear an announcement on the loudspeakers that the ship was ready to sail. The officer in charge of loading did not check that the assistant bosun was on the job; he was unable to recognize him, as the officers and crew worked different shift systems. Before sailing the captain was not told that everything was OK; if no defects were reported, he assumed it was. There was pressure to keep to time and the boat was late.

In the holds of most ships there are partitions to confine any water that enters. On most ro-ro ferries the partitions were omitted so that vehicles had a large uninterrupted parking space. Safety depended on keeping water out. Defence in depth was lost. Any water that entered could move to one side, as the ship rolled, and make it unstable.

In addition to the detailed errors in design, such as the absence of indicator lights, there was a more serious and fundamental error: the underlying causes were weaknesses in design and poor management. The official report, from which this and other quotations are taken, said that 'the underlying or cardinal faults lay higher up in the Company... The Board of Directors did not appreciate their responsibility for the safe management of their ships... They did not have any proper comprehension of what their duties were.' In particular, those charged with the management of the Company's ro-ro fleet were not qualified to deal with many nautical matters and were unwilling to listen to their Masters, who were well-qualified. Responsibility for safety was not clear: one director told the Court of Inquiry that he was responsible; another said no-one was. There was no monitoring system.

'From top to bottom the body corporate was infected with the disease of sloppiness.' Managerial sloppiness occurred at all levels. Requests for indicator lights to show that the doors were shut were not merely ignored but treated as absurd. In fact, after the disaster they were fitted at a cost of only £500 per ship. Ships were known to have sailed before with open doors, but nothing was done. Complaints of overloading were brushed aside. The Certificates of Competency of the captain and first officer were suspended, but the Directors seem to have continued in office.

This report, like the others quoted in this chapter, shows that if we look below the obvious causes and recommendations we find ways of improving the management system. However, six months after the tragedy the Chairman of the holding company was quoted in the press as saying, 'Shore based

management could not be blamed for duties not carried out at sea.'6 He had spent most of his career in property management and may not have realized that managers in industry accept responsibility for everything that goes on, even though it is hundreds of miles from their offices. This may have been difficult on ships in the days of Captain Cook, when ships on long voyages were not seen for months, even years, but today a ferry on a short sea crossing is as easy to audit as a fixed plant. The Zeebrugge report should be read by any director or senior manager who thinks that safety can be left to those on the job and that all they need do is produce a few expressions of goodwill.

6.5 The Clapham Junction railway accident

The immediate causes of this 1989 accident, in which 35 people were killed and nearly 500 injured, were repeated non-compliances of the type discussed in Chapter 5 and a slip of the type discussed in Chapter 2, but the underlying cause was failure of the managers to take, or even to see the need to take, the action they should have taken. (Paragraph numbers refer to the official report7.)

The non-compliances were errors in the way wiring was carried out by a signalling technician: not cutting disused wires back, not securing them out of the way and not using new insulation tape on the bare ends. They were not isolated incidents; they had become his standard working practices (Paragraph 8.22). 'That he could have continued year after year to continue these practices, without discovery, without correction and without training illustrates a deplorable level of monitoring and supervision within BR (British Rail) which amounted to a total lack of such vital management actions' (Paragraph 8.27). In addition the technician made 'two further totally uncharacteristic mistakes' (disconnecting a wire at one end only and not insulating the other, bare end), perhaps because 'his concentration was broken by an interruption of some sort' and because of 'the blunting of the sharp edge of close attention which working every day of the week, without the refreshing factor of days off, produces' (Paragraph 8.26).

'Any worker will make mistakes during his working life. No matter how conscientious he is in preparing and carrying out his work, there will come a time when he will make a slip. It is those unusual and infrequent events that have to be guarded against by a system of independent checking of his work.' Sooner or later people fail to carry out routine tasks, and so a monitoring or audit system should be established. Such a system, a wire count by an independent person or even by the person who did the work, was lacking.

The supervisor was so busy working himself that he neglected his duties as supervisor (Paragraph 8.44). The supervisor above the immediate supervisor of the technician turned a blind eye to the fact that good practice was not being followed. He had been 'pitched in the twilight of his career into work that was foreign to him'.

The original system of checking was a three-level one: the installer, the supervisor and a tester were supposed to carry out independent checks. In such a system people tend after a while to neglect checks, as they assume that the others will find any faults. Asking for too much checking can increase the number of undetected faults (a point not made in the official report). See Section 7.9 (page 147).

Other errors contributing to the accident were:
• Failures in communication: instructions were not received or not read (Paragraphs 8.18 and 9.30-41). New instructions should be explained to those who will have to carry them out, not just sent to them through the post (see Section 3.7, page 67).
• A failure to learn the lessons of the past: similar wiring errors had been made a few years before, though with less serious results (Paragraphs 9.23 and 9.64-66).
• A failure to follow up: one manager identified the problems in the signalling department (poor testing) and arrived at a solution (new instructions and an elite group of testers with a new engineer in charge), but then he turned his attention to other things and made the assumption that the solution would work and the problem would go away (Paragraphs 9.50-53 and 16.9).
• A failure to employ suitable staff: 'the testing and commissioning engineer was doing a job which he had really no wish to do at a place where he had no wish to be. If he had little liking for the job he had less enthusiasm' (Paragraph 16.44).

All these errors add up to an indictment of the senior management, who seem to have had little idea what was going on. The official report makes it clear that there was a sincere concern for safety at all levels of management, but there was a failure to carry that concern through into action. 'It has to be said that a concern for safety which is sincerely held and repeatedly expressed but, nevertheless, is not carried through into action, is as much protection from danger as no concern at all' (Paragraph 17.59).

. 'I've appointed a good man as safei' adviser andgiven him the resources he wants.7. This calls in question the qualityof Occidental'smanagementof safety. havehad to consider a numberofshortcomings in what existedor took place on Piper. 'Safety is not an intellectual exercise to keep us in work. It is a matterof life and death. .1 The need for user-friendlydesigns As discussedin Chapter2. It is the sum of our contributions to safety management that determines whetherthe peoplewe work with live or die. The problemsdiffer from company to company and from time to time butthe following are widespread.AN ENGINEER'S VIEW OF HUMAN ERROR In 1988 the explosion and fire on thePiperAlphaoil platformin theNorthSea 6.) 6. they said that safety was important but they did not get involved in the precise actions required. To quote from the official report9: safety. Technicalassessor to the public enquiry. What more canI do?' Here are some suggestions.' (BrianAppleton..' 'No senior managerappearedto "own" the problem [blockages in water delugelines] and pursueit to an early and satisfactory conclusion.' 'I do not fault Occidental'spolicy or organisation in relationto matters of I Thetop men in Occidental were not hard-nosed and uncaring peopleinterested only in profit and unconcerned about safety. fromequipment well aspeople. 120 .7 What more can senior managers do? I suggest at thebeginningof this chapterthat seniormanagersshould identify the major safetyproblems. see that they were carried out and monitorprogress. They said and believed all the right things.6 Piper Alpha claimed167 lives. On Piper Alpha they died.Whenhazardous materials arehandledthe lowesterrorratesobtainable. peopleare actually very reliablebut thereare many opportunities forerrorin the course ofa day's work. Theimmediatecause was apoorpermit-to-work systemand poorhandoverbetweenshiftsbut theunderlying cause reinforcesthemessage of the other incidents describedin this chapter. 6. However. 
agreeactionsand followup to see that actions have been taken. as may be too high. and in particularwhetherthe systemswhich they had for implementing the company safety policy were being operatedin an effective manner.

Increasingly, therefore, the process industries are trying to design plants and equipment that are inherently safer or user-friendly — that is, they can withstand human error or equipment failure without serious effects on safety (and output and efficiency). In many cases they can do so12. Such plants are often cheaper as well as safer, as they contain less added-on protective equipment, which is expensive to buy and maintain. The following are some of the ways in which plants can be made more user-friendly:

• Use so little hazardous material that it does not matter if it all leaks out (intensification). 'What you don't have, can't leak.' In addition, if we can intensify, then we need smaller equipment and the plant will be correspondingly cheaper.
• Use a safer material instead (substitution).
• Use hazardous materials in the least hazardous form (attenuation).
• Limit effects, not by adding on protective equipment but by equipment design: by limiting the energy available, by changing reaction conditions or by eliminating hazardous phases.
• Make incorrect assembly impossible.
• Make the status of equipment (open or closed, on-line or off-line) clear.
• Use equipment that can tolerate poor operation or maintenance.
• Avoid knock-on or domino effects.
• Simplify the design. Complexity means more opportunities for human error and more equipment that can fail.

These may seem obvious but until after the explosion at Flixborough in 1974 (see page 66) little or no thought was given to ways of reducing the inventory of hazardous material in a plant. People accepted whatever inventory was required by the design, confident of their ability to keep the lions under control. Flixborough weakened the public's confidence in that ability and ten years later the toxic release at Bhopal (over 2000 killed) almost destroyed it. Now many companies are coming round to the view that they should see if they can keep lambs instead of lions.

Senior managers should ask for regular reports on progress in reducing the inventories in new and existing plants and ask how new designs compare with old ones.

6.7.2 The need to see that the right procedures are used and the right mix of knowledge and experience is available
To achieve the user-friendly designs just described we need to consider alternatives systematically during the early stages of design. Very often safety studies do not take place until late in design when all we can do is control hazards by

adding on additional protective equipment or procedures. Such a change will not come about unless senior managers actively encourage it.

There has been an explosion of interest in safety management systems in recent years and no topic is more popular at conferences. Some recent incidents have left me with an uneasy feeling that some managers believe that a good safety management system is all they need for safe performance. All it can do, however, is ensure that people's knowledge and experience are applied systematically and thus reduce the chance that something is missed. If the staff lack knowledge and experience then the system is an empty shell. People will go through the motions but the output will be poor.

This is a particular danger at the present time when companies are reducing manning and the over-fifties are looked upon as expenses to be eliminated rather than assets in which thirty years' salary has been invested. The present enthusiasm for downsizing makes the problem worse: we can manage without advice and experience until we fall into the trap that no-one knew was there. Senior managers should systematically assess the levels of knowledge and experience needed and ensure that they are maintained. The same applies to safety management systems and specific procedures such as those for controlling modifications and preparing equipment for maintenance.

6.7.3 The need for more thorough investigation of accidents
When an accident has occurred the facts are usually recorded but, as the accidents described in this chapter show, we often draw only superficial conclusions from them. We identify the immediate technical causes but we do not look for the underlying weaknesses in the management system. Some accident reports tell us more about the interests and prejudices of the writer than about the most effective methods of prevention. Petroski writes13:

'Case histories of failures often tend to focus in excruciating detail on the specifics of the failed design and on the failure mechanism. Such a concentrated view often discourages all but the most narrow of specialists from pursuing the case history to any degree, for the lessons learned and conclusions drawn tend to be so case-specific as to seem hardly relevant to anything but a clone of the failed design.'

Accident investigation is rather like peeling an onion or dismantling a Russian doll. Beneath each layer of cause and effect are other layers. The outer ones deal with the immediate technical causes, the inner ones with ways of avoiding the hazards and with weaknesses in the management system. We tend to pick on the one that is nearest to our current interests, often the immediate technical cause.

It is often difficult for a junior manager, closely involved in the detail, to see beyond the immediate technical causes. If someone from another plant is included in the investigating team he may see them more clearly. Senior managers should not accept reports that deal only with immediate causes.

6.7.4 The need to learn and remember the lessons of the past
Even when accidents are investigated thoroughly the lessons are often forgotten. After a while people move, taking their memories with them, and the lessons learnt after an accident are forgotten. In a typical factory it is often hard to find anyone who has been on the same plant for more than ten years. Someone, keen to improve efficiency, asks why we are following a time-consuming procedure or using inconvenient equipment. No-one knows, except the people who retired early, so the procedure is abandoned or the equipment removed and the accident it was introduced to prevent happens again. All too often we forget the lessons of the past and the same accidents happen again. If we have paid the high price of an accident we should at least turn it into a learning experience15, a very desirable thing to do.

The Australian Aborigines lack (or did at the time) literacy, computers and management consultants. Yet they did better than many high technology companies do today. In 1943 an elderly Aborigine led his group on a six-month trek to escape a drought in Western Australia. His first goal was a water hole at the extreme north-western corner of the tribal territory, which he had visited only once in his youth, more than half a century earlier. When the resources there started to fail, he led them westwards again, through territory known to him only through the verses of a song cycle sung at totemic ceremonies and depicting the legendary wanderings of 'ancestral beings'. The trek led on through a sequence of more than 50 water holes, with the only additional clues to the route being occasional marks left by earlier movements of peoples. The little band finally emerged more than 360 miles from where they started14.

However, the Aborigines have one advantage that we lack: people stay in the same group for a long time. The following actions can help us remember the lessons of the past:

• Include in every instruction, code and standard a note on the reason why it was introduced and the accidents that would not have occurred if it had been followed.

• Describe old accidents as well as recent ones in safety bulletins and newsletters and discuss them at safety meetings. Giving the message once is not enough.
• Follow up at regular intervals to see that the recommendations made after accidents are being followed.
• Remember that the first step down the road to the next accident occurs when someone turns a blind eye to a missing blind.
• Include important accidents of the past in the training of undergraduates and company employees.
• Never remove equipment before we know why it was installed. Never abandon a procedure before we know why it was adopted.
• Devise better retrieval systems so that we can find, more easily than at present, details of past accidents, in our own and other companies, in design as well as operations, and the recommendations made afterwards.
• Read books that tell us what is old as well as magazines that tell us what is new.
• Ask experienced people, before they leave or retire, to write down their know-how, especially the information that younger and less experienced people are not aware of.

Existing accident databases (including the Institution of Chemical Engineers') are not being used as much as we hoped they would be. We consult them only when we suspect that there may be a hazard. If we don't suspect there may be a hazard, we don't look. The computer is passive and the user is active: the user has to ask the database if there is any information on, say, accidents involving non-return (check) valves. We need a system in which the user is passive and the computer is active.

As I type these words the spell-check and grammar-check programs are running in the background, drawing my attention to my (frequent) spelling and grammar errors. In a similar way, if someone types the words 'non-return valve' (or perhaps even makes a diary entry that there is going to be a meeting on non-return valves), the computer will signal that the database contains information on these subjects and a click of the mouse will then display the data. Filters could prevent it repeatedly referring to the same hazard. With such a system, a safety database could draw attention to any subject on which it has data.

Another weakness of existing search engines is their hit or miss nature. We either get a hit or we don't. Suppose we are looking in a safety database to see if there are any reports on accidents involving the transport of sulphuric acid. Most search engines will either display them or tell us there are none. 'Fuzzy' search engines will offer us reports on the transport of other mineral acids or perhaps on the storage of sulphuric acid.
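One way to implement such a 'fuzzy' fallback is to arrange keywords in a family tree and, when a keyword has no reports of its own, offer its siblings or parent instead. A minimal sketch follows; the tree, the keywords and the report titles are invented for illustration, not taken from any real database:

```python
# Sketch of a keyword "family tree" search: if a keyword has no reports,
# offer reports filed under its sibling or parent keywords instead.
# The tree and the report index below are invented for illustration.

TREE = {
    "acids": None,                            # root keyword
    "sulphuric acid": "acids",
    "hydrochloric acid": "acids",
    "sulphuric acid transport": "sulphuric acid",
    "sulphuric acid storage": "sulphuric acid",
}

REPORTS = {
    "sulphuric acid storage": ["Tank overflow, 1987"],
    "hydrochloric acid": ["Road tanker leak, 1990"],
}

def search(keyword):
    """Return (matched_keyword, reports), falling back to siblings then parent."""
    if REPORTS.get(keyword):
        return keyword, REPORTS[keyword]
    parent = TREE.get(keyword)
    # Siblings: other children of the same parent.
    for k, p in TREE.items():
        if p == parent and k != keyword and REPORTS.get(k):
            return k, REPORTS[k]
    if parent and REPORTS.get(parent):
        return parent, REPORTS[parent]
    return None, []

# No reports on transport, so the sibling keyword 'storage' is offered:
print(search("sulphuric acid transport"))
# -> ('sulphuric acid storage', ['Tank overflow, 1987'])
```

The point of the design is that the user never sees a bare 'no hits': the lookup walks outward through the family tree until it finds something related to offer.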

This is done by arranging keywords in a sort of family tree. If there are no reports on the keyword, the system will offer reports on its parents or siblings16.

6.7.5 Management education
A survey of management handbooks17 shows that most of them contain little or nothing on safety. The Financial Times Handbook of Management (1184 pages, 1995) has a section on crisis management but 'there is nothing to suggest that it is the function of managers to prevent or avoid accidents'. The Essential Manager's Manual (1998) discusses business risk but not accident risk, while The Big Small Business Guide (1996) has two sentences to say that one must comply with legislation. In contrast, the Handbook of Management Skills (1990) devotes 15 pages to the management of health and safety. Syllabuses and books for MBA courses and National Vocational Qualifications in management contain nothing on safety or just a few lines on legal requirements.

6.8 The measurement of safety
If the lost-time accident rate is not an effective measure of safety, what could we measure instead or as well? Here are some suggestions. Reference 18 is an anthology of the methods used in various companies.

(1) A monthly summary of the cost of fires and other dangerous occurrences can draw attention to problem areas and their effect on profits. The costs can be divided into insured costs, for which precise figures are available, uninsured costs and business interruption costs.

(2) As suggested in Section 6.7.1 (page 120), an annual report on the progress made on reducing inventories of hazardous materials, both in process and in storage, can concentrate attention on one of the most effective ways of preventing large leaks.

(3) Many companies use the results of audits to compare one plant with another and one year with another. Other companies prefer surveys of employees' perceptions and/or performance monitoring19. Compared with the lost-time accident rate and the costs of damage, these methods try to detect falling standards before an accident occurs.
(4) The number of faulty permits-to-work found by routine inspection (not after incidents) and/or the number of trips and alarms found to be faulty.

A good practice is to display charts, similar to Figure 6.1, in each control building showing the status of all trips and alarms. Everyone entering the control room can see their status at a glance. However, this method is not quantitative.

(5) Many accidents and dangerous occurrences are preceded by near misses, such as leaks of flammable liquids and gases that do not ignite. If we learn from these we can prevent many accidents. Coming events cast their shadows before. If too much attention is paid to the number of dangerous occurrences rather than their lessons, or if numerical targets are set, then some dangerous occurrences will not be reported.

Figure 6.1 (chart grid not reproduced) Charts similar to this one show when tests are due and the results of recent tests. Everyone entering the building can see their present and recent condition at a glance. A, B, C, ... indicate the names or reference numbers of trips and alarms; the columns show the weeks of the year. Key: OK = test passed; F = test failed, awaiting repair; FR = test failed, now repaired; RP = test refused by process team; RM = test refused by maintenance team; – = test due. Stars, squares etc. of different colours stand out more clearly than letters.

6.9 Conclusions
The left-hand side of Figure 6.2 (adapted from Reference 8) shows the relative effort devoted in the past to the prevention of equipment failure, the prevention of human error (as traditionally understood) and the prevention of management failures. The right-hand side shows their relative importance.

Figure 6.2 (diagram not reproduced) The effort expended on the causes of accidents and their relative importance; the bars compare 'effort expended' with 'actual importance' for equipment failure, human error and management failure.

'He saw the open and more superficial errors in persons and individuals but not the hidden and more profound errors in structures and systems. And because of that he sought the roots of Roman decline not in its empire but in its emperors, never recognizing the latter as but the former's personification.'
The Roman author Tacitus described by John D. Crossan20 (I have changed evil to errors). But who can change the empire but the emperor?

References in Chapter 6
1. Kletz, T.A., 2001, Learning from Accidents, 3rd edition, Chapter 4 (Butterworth-Heinemann, Oxford, UK).
2. Kletz, T.A., 2001, Learning from Accidents, 3rd edition, Chapter 5 (Butterworth-Heinemann, Oxford, UK).
3. Hidden, A. (Chairman), 1989, Investigation into the Clapham Junction Railway Accident (HMSO, London, UK).
4. Fennell, D. (Chairman), 1988, Investigation into the King's Cross Underground Fire (HMSO, London, UK).
5. Department of Transport, 1987, MV Herald of Free Enterprise: Report of Court No. 8074: Formal Investigation (HMSO, London, UK).
6. Sieff, M., 1988, Don't Ask the Price, page 329 (Collins Fontana, London, UK).
7. Daily Telegraph, 10 October 1987.

8. Batstone, R.J., paper presented at the International Symposium on Preventing Major Chemical Accidents, Washington, DC, USA, February 1987 (not included in published proceedings).
9. Cullen, W.D., 1990, The Public Inquiry into the Piper Alpha Disaster, Paragraphs 14.10 and 14.51 (HMSO, London, UK).
10. Crainer, S., 1993, Zeebrugge — Learning from Disaster, page 78 (Herald Families Association, Newcastle, UK).
11. Gill, M., 1999, Health and Safety at Work, 21(1): 34.
12. Kletz, T.A., 1998, Process Plants: A Handbook for Inherently Safer Design, 2nd edition (Taylor & Francis, Philadelphia, PA, USA).
13. Petroski, H., 1994, Design Paradigms, page 82 (Cambridge University Press, Cambridge, UK).
14. Buttolph, M., February 1997, Natural History, page 12.
15. Kletz, T.A., 1993, Lessons from Disaster — How Organisations Have No Memory and Accidents Recur (Institution of Chemical Engineers, Rugby, UK).
16. Chung, P.W.H. and Jefferson, M., 1998, Integration of accident database with computer tools, IChemE Research Event, Newcastle, UK, 7–8 April 1998.
17. Health and Safety at Work, 1999, 21(7): 20, 21(8): 24 and 21(9). The book reviewed is by F. Pearce and S. Tombs and is published by Ashgate, Aldershot, UK.
18. van Steen, J. (editor), 1996, Safety Performance Measurement (Institution of Chemical Engineers, Rugby, UK).
19. Petersen, D., 1996, Safety by Objectives, 2nd edition (Van Nostrand Reinhold, New York, USA).
20. Crossan, J.D., 1995, Who Killed Jesus?, page 15 (Harper Collins, San Francisco, CA, USA).

The probability of human error

'The best of them is but an approximation, while the worst bears no relation whatever to the truth.'
E.C.K. Gonner12

This chapter lists some estimates that have been made of the probability of error. However, these data have several limitations.

First, the figures are estimates of the probability that someone will, for example, have a moment's forgetfulness or lapse of attention and forget to close a valve or close the wrong valve, press the wrong button, make a mistake in arithmetic and so on — errors of the type described in Chapter 2. They are not estimates of the probability of error due to poor training or instructions (Chapter 3), lack of physical or mental ability (Chapter 4), lack of motivation (Chapter 5) or poor management (Chapter 6). There is no recognized way of estimating the probabilities of such errors, but we can perhaps assume that they will continue in an organization at the same rate as in the past, unless there is evidence of change.

Second, the figures are mainly estimates by experienced managers and are not based on a large number of observations. Where operations such as soldering electric components are concerned, observed figures based on a large sample are available, but this is not the case for typical process industry tasks. Because so much judgement is involved, it is tempting for those who wish to do so to try to 'jiggle' the figures to get the answer they want. (A witness at an inquiry said, 'I tried to talk my staff into doing it as they, at least, work for me. And if they put in a number I didn't like, I could jiggle it around.'14 He was not speaking particularly of figures on human error but there is no branch of hazard analysis to which his remarks are more applicable.) Anyone who uses estimates of human reliability outside the usual ranges (see later) should be expected to justify them.

One company has developed a system for carrying out an audit of the management, awarding marks under various headings and multiplying equipment failure rates by a factor between 0.1 and 100 derived from the results13. They assume that equipment failure rates vary over a range of 1000:1 depending on the quality of the management (see Section 7.10, page 149).

Third, the actual error rates depend greatly on the degree of stress and distraction. The figures quoted are for typical process industry conditions, but too much or too little stress can increase error rates (see Section 4.3, page 83).

Fourth, everyone is different and no-one can estimate the probability that he or she will make an error. All we can estimate is the probability that, if a large number of people were in a similar situation, errors would be made.

7.1 Why do we need to know human error rates?
In brief, because men are part of the total protective system and we cannot estimate the reliability of the total protective system unless we know the reliability of each part of it.

Consider the situation shown in Figure 7.1. When the alarm sounds and flashes in the control room the operator is expected to go outside, select the correct valve out of many and close it within, say, 10 minutes. We can estimate fairly accurately the reliability of the alarm and if we think it is too low it is easy to improve it. We can estimate roughly the reliability of the valve — the probability that the operator will be able to turn the handle and that the flow will actually stop — and if we think it is too low we can use a better quality valve or two in series. But what about the man in the middle? Will he always do what he is expected to do?

In my time in industry, people's opinions varied from one extreme to another. At times people, particularly design engineers, said that it was reasonable to expect him to always close the right valve. If he did not he should be reprimanded. I hope that the incidents described in Chapter 2 will have persuaded readers of the impracticality of this approach. At other times people, particularly managers responsible for production, have said that it is well-known that men are unreliable. We should therefore install fully automatic equipment, so that the valve is closed automatically when the alarm condition is reached.

Figure 7.1 Reliabilities in a man/machine system

        Reliability         Easy to improve?
Alarm   Known accurately    Yes
Man     ?                   No
Valve   Known roughly       Yes

Both these attitudes are unscientific. We should not say, 'The operator never ...' or 'The operator will always ...', but ask, 'What is the probability that the operator will ...?' Having agreed a figure we can feed it into our calculations. If the consequent failure rate is acceptable, we can continue with the present system. If it is too high, we can consider a fully automatic system. Automation does not remove dependence on people; it merely transfers some or all of the dependence onto different people: we are also dependent on those who design, construct, install, test and maintain the trip.

An example may make this clearer. Suppose the temperature in a vessel is automatically controlled by varying the heat input. In addition, an independent trip isolates the source of heat if the temperature in the tank rises above the set point. The trip fails once in 2 years (a typical figure) and is tested every 4 weeks (another typical figure). The trip fails at random so on average it will be dead for 2 weeks every 2 years or for 2% of the time. Its fractional dead time or probability of failure on demand is 0.02.

Assume the temperature rises above the set point 5 times/year and a hazardous situation would develop unless the operator or the trip intervenes. (This is not a typical figure; every situation is different.) Operators usually notice that a plant is approaching a dangerous condition and take action before the emergency trip operates. Reliability engineers often assume that operators will detect an approach to trip conditions four times out of five or nine times out of ten and that only on the fifth or tenth occasion will the trip have to operate. Assume also that the operator spots the rising temperature and intervenes on 4 out of the 5 occasions. The demand rate on the trip is then once/year, the probability that it will be in a failed state is 0.02, and the vessel will overflow once every 50 years. The consequences are such that this is considered tolerable. If they were not, we would have to install more reliable trip systems than we normally install.

But how confident are we that the operator will intervene 4 times out of 5? Even if experience shows that he did so when the plant was new, will he continue to do so when manning is reduced and the operator has extra duties? Will anyone even ask this question? If you are a designer, control engineer or plant manager, do you know how often you are expecting your operators to intervene before your trips operate? Is the answer realistic? This simple example shows that installing a trip has not removed our dependence on the operator.
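The arithmetic of this example can be checked in a few lines, using only the figures quoted above (the one-half factor is the usual approximation for a fault that can occur at any time in the test interval):

```python
# Worked example from the text: trip fractional dead time and hazard rate.
failure_rate = 1 / 2          # trip failures per year (fails once in 2 years)
test_interval = 4 / 52        # years between tests (tested every 4 weeks)

# Average fraction of time the trip is dead (failed but not yet found):
fdt = 0.5 * failure_rate * test_interval
print(round(fdt, 3))          # 0.019, i.e. dead about 2% of the time

demands = 5                   # temperature excursions per year
operator_misses = 1 / 5       # operator fails to intervene 1 time in 5

demand_on_trip = demands * operator_misses      # once per year
hazard_rate = demand_on_trip * fdt              # about 0.02 per year
print(round(1 / hazard_rate))                   # 52, i.e. once in about 50 years
```

The small difference between 50 and 52 years simply reflects the rounding of the fractional dead time to 2% in the text.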

7.2 Human error rates — a simple example
The following are some figures I have used for failure rates in the situation described — that is, the probability that a typical operator will fail to close the right valve in the required time, say 10 minutes.

(1) When failure to act correctly will result in a serious accident such as a fire or explosion: failure probability 1 (1 in 1). The operator's failure rate will not really be as high as this, but it will be very high (more than 1 in 2), as when someone is in danger, or thinks he is in danger, errors increase15. We should assume 1 in 1 for design purposes.

(2) In a busy control room: 0.1 (1 in 10). This may seem high but before the operator can respond to the alarm another alarm may sound, the telephone may ring, a fitter may demand a permit-to-work and another operator may report a fault outside. Also, on many plants the control room operator cannot leave the control room and has to contact an outside man, by radio or loudspeaker, and ask him to close the valve. This provides opportunities for misunderstanding, as described in Section 2.2 (page 13).

(3) In a quiet control room: 0.01 (1 in 100). Busy control rooms are the more common, but control rooms in storage areas are usually quieter places with less stress and distraction.

(4) If the valve is immediately below the alarm: 0.001 (1 in 1000). The operator's failure rate will be very low but an occasional error (1 in 1000) may still occur. A failure rate of 1 in 1000 is about the lowest that should be assumed for any process operation — for example, failure to open a valve during a routine operation.

In some cases a figure between 1 in 10 and 1 in 100 can be chosen.

7.3 A more complex example
Figure 7.2 shows the fault tree for loss of level in a distillation column followed by breakthrough of vapour at high pressure into the downstream storage tank. The estimates are conservative figures based on the judgements of experienced managers. A line diagram and further details, including a detailed quantitative hazard analysis, are given in Reference 1.

Figure 7.2 (tree diagram not reproduced) Fault tree for vapour breakthrough from high pressure distillation column into downstream equipment due to loss of level in distillation column base (original design). The top event — level in column lost; high pressure vapour breaks through into downstream equipment and storage tank at full line rate — has a calculated frequency of 0.0184 per year. Each branch combines an initiating event ('level in column starts to fall') with the probability that the operator fails to prevent a continuing fall in level in the time available, based on the low level pre-alarm. The initiating events are:
• level control valve goes too far open due to level impulse line or transmitter faults;
• level control valve goes too far open due to level controller faults (0.15 per year);
• level control valve goes too far open due to valve positioner faults (0.15 per year);
• level control valve goes too far open due to control valve faults (0.1 per year);
• level controller misdirected too far open when on manual control, including plant start-up (0.1 per year);
• control valve bypass misdirected too far open when on bypass control during on-line valve maintenance (0.02 per year);
• control valve bypass left open in error after on-line valve maintenance (0.02 per year).

The right-hand column of the tree shows the 'bottom events', the initiating events that can lead to loss of level. The tree shows the probabilities, P, that the operator will fail to prevent a continuing fall in level in the time available; an alarm warns the operator that the level is falling. The estimates include an allowance (P = 0.005) for the coincidental failure of the alarm.

For the top branch this probability is 1, because the alarm and the level indicator in the control room would also be in a failed condition. The operator could not know that the level was falling. For the other branches, values of P between 0.02 and 0.25 have been estimated:
• For the second branch P has been assumed to be 0.05 (5% or 1 in 20).
• For the third and fifth branches P = 0.02 (2% or 1 in 50), as the operator is normally in close attendance when a controller is on manual and correction is possible from the control room.
• For the fourth branch P has been assumed to be 0.1 (10% or 1 in 10), as the operator has to ask an outside man to adjust a manual valve. The outside operator has to be contacted but he should be near the valve and expecting to be contacted. The control room operator may delay making the request in order to make sure that the level really is falling. If loudspeakers are inaudible in part of the plant, or if several outside operators can respond to a request, so that each leaves it to the others (see page 53), the probability that they will fail will be high.
• For the sixth branch P = 0.04 (4% or 1 in 25), as conditions are more unexpected and the operator is more likely to be busy elsewhere.
• For the last branch P = 0.25 (25% or 1 in 4). The fault is unusual and the outside man may overlook it.

In the study of which Figure 7.2 formed a part, the estimates of operator reliability were agreed between an experienced hazard analyst and the commissioning manager. This reduces the chance that the figures will be 'jiggled' (see quotation on page 129) to get a result that the designer, or anyone else, would like. The figures used apply to the particular plant and problem and should not be applied indiscriminately to other problems.

In another similar study Lawley2 used the figures set out in Table 7.1 in a form used by Lees3.
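The way a tree of this shape is evaluated can be sketched in a few lines: the top event frequency is the sum over all branches of the initiating-event frequency times the probability that the operator (and alarm) fail. The frequencies and probabilities below are illustrative stand-ins, not the exact values of Figure 7.2:

```python
# Sketch of evaluating a fault tree of this shape: top event frequency is the
# sum over branches of (initiating-event frequency) x (failure probability).
# The numbers below are illustrative stand-ins, not those of Figure 7.2.

branches = [
    # (initiating event, frequency per year, operator failure probability P)
    ("impulse line/transmitter fault (alarm also dead)", 0.02, 1.0),
    ("controller fault",                                 0.15, 0.05),
    ("controller misdirected on manual",                 0.10, 0.02),
]

alarm_allowance = 0.005   # allowance for coincidental failure of the alarm

top = 0.0
for name, freq, p_operator in branches:
    # When P is already 1 (operator cannot know), the alarm allowance is moot.
    p_fail = p_operator if p_operator >= 1.0 else p_operator + alarm_allowance
    top += freq * p_fail

print(f"Top event frequency: {top:.4f} per year")
```

The structure, not the particular numbers, is the point: every extra initiating event adds its own frequency-times-probability term to the total.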

Table 7.1 Industrial fault tree analysis: operator error estimates

Crystallizer plant                                                Probability
Operator fails to observe level indicator or take action          0.04
Operator fails to observe level alarm or take action              0.03
Manual isolation valve wrongly closed (p)                         0.05 and 0.025
                                                                  Frequency, events/y
Control valve fails open or misdirected open                      0.5
Control valve fails shut or misdirected shut (l)                  0.5

Propane pipeline                                  Time available  Probability
Operator fails to take action:
• to isolate pipeline at planned shutdown                         0.001
• to isolate pipeline at emergency shutdown       limited         0.005
• opposite spurious tank blowdown given alarms
  and flare header signals                        30 min          0.002
• opposite tank low level alarm                                   0.01
• opposite tank level high given alarm with
  (a) controller misdirected or bypassed when on manual           0.04
  (b) level measurement failure                                   0.05
  (c) level controller failure                                    0.05
  (d) control valve or valve positioner failure                   0.1
• opposite slowly developing blockage on heat exchanger
  revealed as heat transfer limitation                            0.1
• opposite pipeline fluid low temperature given
  alarm                                           30 min          0.05
• opposite level loss in tank supplying heat transfer
  medium pump given no measurement (p)                            0.2
• opposite tank blowdown without prior pipeline isolation
  given alarms which operator would not regard as
  significant, and pipework icing
  (a) emergency blowdown                          5 min           0.4
  (b) planned blowdown                            5–10 min        0.2
• opposite backflow in pipeline given alarm       5 min           0.6
• opposite temperature low at outlet of heat exchanger
  given failure of measuring instrument common to
  control loop and alarm                          extremely short 0.8–1

Continued overleaf

Table 7.1 (cont'd) Industrial fault tree analysis: operator error estimates

                                                                  Frequency
Misvalving in changeover of two-pump set (standby pump
left valved open, working pump left valved in)                    0.0025/changeover
Pump in single or double pump operation stopped manually
without isolating pipeline                                        0.01/shutdown
LP steam supply failure by fracture, blockage or
isolation error (p)                                               0.1/y
Misdirection of controller when on manual (assumed small
proportion of time)                                               1/y

Notes: l = literature value; p = plant value. Other values are assumptions.

7.4 Other estimates of human error rates
Bello and Columbori4 have devised a method known as TESEO (Tecnica Empirica Stima Errori Operatori). The probability of error is assumed to be the product of five factors — K1–K5 — which are defined and given values in Table 7.2.

Swain5 has developed a method known as THERP (Technique for Human Error Rate Prediction). The task is broken down into individual steps and the probability of error estimated for each, taking into account:
• the likelihood of detection;
• the probability of recovery;
• the consequence of the error (if uncorrected);
• a series of 'performance shaping factors' such as temperature, fatigue, monotony, complexity, hours worked, availability of tools, and group identification (about 70 in total).

Table 7.3 (pages 138–139) is a widely quoted list of error probabilities taken from the US Atomic Energy Commission Reactor Safety Study (the Rasmussen Report)6.

Table 7.2 TESEO: error probability parameters

Type of activity (K1)
Simple, routine 0.001
Requiring attention, routine 0.01
Not routine 0.1

Temporary stress factor for routine activities (K2)
Time available, s:
2 10
10 1
20 0.5

Temporary stress factor for non-routine activities (K2)
Time available, s:
3 10
30 1
45 0.3
60 0.1

Operator qualities (K3)
Carefully selected, expert, well-trained 0.5
Average knowledge and training 1
Little knowledge, poorly trained 3

Activity anxiety factor (K4)
Situation of grave emergency 3
Situation of potential emergency 2
Normal situation 1

Activity ergonomic factor (K5)
Excellent microclimate, excellent interface with plant 0.7
Good microclimate, good interface with plant 1
Discrete microclimate, discrete interface with plant 3
Discrete microclimate, poor interface with plant 7
Worst microclimate, poor interface with plant 10
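The TESEO product can be written out directly. A minimal sketch (the function name is mine; the example factor values are taken from Table 7.2 for an activity requiring attention, more than 20 s available, average training, a potential emergency and a good interface with the plant):

```python
# TESEO: error probability is the product of the five K factors from
# Table 7.2, capped at 1.0 since it is a probability.

def teseo_probability(k1, k2, k3, k4, k5):
    """Error probability as the product K1*K2*K3*K4*K5, capped at 1.0."""
    return min(k1 * k2 * k3 * k4 * k5, 1.0)

p = teseo_probability(0.01, 0.5, 1, 2, 1)
# p = 0.01, i.e. 1 error in 100 attempts
```

The cap matters: an unfavourable combination of factors (for example, a non-routine task with little time, poor training and a grave emergency) multiplies out to more than 1 and simply means failure is near-certain.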

Table 7.3 General estimates of error probability used in US Atomic Energy Commission reactor safety study

Activity — Estimated error probability

Selection of a key-operated switch rather than a non-key switch (this value does not include the error of decision where the operator misinterprets the situation and believes the key switch is the correct choice) — 10⁻⁴

Selection of a switch (or pair of switches) dissimilar in shape or location to the desired switch (or pair of switches), assuming no decision error. For example, operator actuates large-handled switch rather than small switch — 10⁻³

General human error of commission, e.g. misreading label and therefore selecting wrong switch — 3 × 10⁻³

General human error of omission where there is no display in the control room of the status of the item omitted, e.g. failure to return manually operated test valve to proper configuration after maintenance — 10⁻²

Errors of omission, where the items being omitted are embedded in a procedure rather than at the end as above — 3 × 10⁻³

Simple arithmetic errors with self-checking but without repeating the calculation by re-doing it on another piece of paper — 3 × 10⁻²

Given that an operator is reaching for an incorrect switch (or pair of switches), he selects a particular similar appearing switch (or pair of switches), where x = the number of incorrect switches (or pairs of switches) adjacent to the desired switch (or pair of switches). The 1/x applies up to 5 or 6 items. After that point the error rate would be lower because the operator would take more time to search. With up to 5 or 6 items he does not expect to be wrong and therefore is more likely to do less deliberate searching — 1/x

Given that an operator is reaching for a wrong motor operated valve (MOV) switch (or pair of switches), he fails to note from the indicator lamps that the MOV(s) is (are) already in the desired state and merely changes the status of the MOV(s) without recognizing he had selected the wrong switch(es) — 10⁻¹

Table 7.3 (cont'd) General estimates of error probability used in US Atomic Energy Commission reactor safety study

Same as above, except that the state(s) of the incorrect switch(es) is (are) not the desired state. Note: with continuing feedback of the error on the annunciator panel, this high error rate would not apply — 1.0

If an operator fails to operate correctly one of two closely coupled valves or switches in a procedural step, he also fails to operate correctly the other valve — 1.0

Monitor or inspector fails to recognize initial error by operator — 10⁻¹

Personnel on different work shift fail to check condition of hardware unless required by check-list or written directive — 10⁻¹

Monitor fails to detect undesired position of valves, etc., during general walk-around inspections, assuming no check-list is used — 5 × 10⁻¹

General error rate given very high stress levels where dangerous activities are occurring rapidly — 0.2–0.3

Given severe time stress, as in trying to compensate for an error made in an emergency situation, the initial error rate, x, for an activity doubles for each attempt after a previous incorrect attempt, until the limiting condition of an error rate of 1.0 is reached or until time runs out. This limiting condition corresponds to an individual's becoming completely disorganized or ineffective — 2x

Operator fails to act correctly in first 60 seconds after the onset of an extremely high stress condition, e.g. a large LOCA (loss of cooling accident) — 9 × 10⁻¹

Operator fails to act correctly after the first 5 minutes after the onset of an extremely high stress condition — 10⁻¹

Operator fails to act correctly after the first 30 minutes in an extreme stress condition — 10⁻²

Operator fails to act correctly after the first several hours in a high stress condition — 10⁻³

After 7 days after a large LOCA, there is a complete recovery to the normal error rate, x, for any task — x
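The last five entries of Table 7.3 can be read as a step function of time since the onset of the emergency. A minimal sketch; the study only gives values at the tabulated points, so the cut-over times and the behaviour between them are my reading, not the study's:

```python
# Probability of failing to act correctly under extremely high stress,
# stepped at the times given in Table 7.3. The boundaries between steps
# are my own reading of "after the first X minutes/hours".

def p_fail_under_extreme_stress(minutes_since_onset):
    if minutes_since_onset < 5:
        return 0.9      # table: first 60 seconds and the minutes after
    if minutes_since_onset < 30:
        return 0.1      # after the first 5 minutes
    if minutes_since_onset < 240:
        return 0.01     # after the first 30 minutes
    return 0.001        # after the first several hours
```

The point of the shape is the argument made later in the chapter: extra time helps only if the action is within the person's ability at all.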

An advantage of THERP is its similarity to the methods used for estimating the reliability of equipment: each component of the equipment is considered separately and its basic failure rate is modified to allow for environmental conditions such as temperature and vibration. The method thus appeals to reliability engineers. The disadvantages of THERP are the amount of time and expert effort required. Many human factors experts believe that any gain in accuracy compared with the simpler methods already described is out of all proportion to the effort expended.

Embrey23 gives more detailed descriptions of these and other techniques, including a description of the Influence Diagram Approach, a technique for estimating the influence of training and instructions on the probability of error.

The University of Birmingham has developed a database of human error probabilities called CORE-DATA (Computerised Operator Reliability and Error DATAbase)24. Taylor-Adams and Kirwan25 have published some human error failure rates based on one company's experience, but their paper does not show whether the errors were due to slips or lapses of attention, poor training or instructions, or non-compliance with instructions. Their data could be used to estimate future error rates in the company of origin, if there is no evidence of change, but not in other companies.

A study of batch chemical operations produced the following results:

Probability per batch
Ingredient omitted 2.6 × 10⁻⁴ (1 in 3900 batches)
Ingredient undercharged 2.3 × 10⁻⁴ (1 in 4375 batches)
Ingredient overcharged 1.1 × 10⁻⁴ (1 in 8750 batches)
Wrong ingredient added 2.3 × 10⁻⁴ (1 in 4375 batches)
Total errors 8.3 × 10⁻⁴ (1 in 1200 batches)

These error rates seem rather low. However they do not include:
• errors which were corrected — estimated at 80% of the total;
• errors which were too small to matter — estimated at 80% of the total;
• errors which were not reported — estimated at 50% of the total.

If these are included, the total error rate becomes 4 × 10⁻² per batch or 1 in 25 batches. There are on average four charging operations per batch, so the error rate becomes 10⁻² per operation (1 in 100 operations), which is in line with other estimates.
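The adjustment for unseen errors is worth making explicit. This is my arithmetic for one consistent reading of the percentages in the text: only errors that are uncorrected, large enough to matter and actually reported appear in the records, so the recorded rate understates the true rate by a factor of 0.2 × 0.2 × 0.5 = 0.02.

```python
# Scale an observed batch error rate up to allow for errors that were
# corrected (80% of the total), too small to matter (80%) and not
# reported (50%), treating the three filters as independent.

def true_error_rate(observed, corrected=0.8, too_small=0.8, unreported=0.5):
    """Observed rate divided by the fraction of errors that get recorded."""
    seen_fraction = (1 - corrected) * (1 - too_small) * (1 - unreported)
    return observed / seen_fraction

rate = true_error_rate(8.3e-4)   # observed total: 1 in 1200 batches
# rate is about 4e-2, roughly 1 error in 25 batches, as in the text
```

Treating the three corrections as independent is itself an assumption; in practice large errors are probably more likely to be reported than small ones.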

Data on slips and lapses of attention are generic; data on errors due to poor training or instructions or non-compliance with instructions apply only to the place where they were measured (see Section 7.10, page 149).

7.5 Two more simple examples
7.5.1 Starting a spare pump
As an example of the uses of some of these figures, let us consider a simple process operation: starting up a spare pump after the running pump has tripped out. The task can be split into a number of steps:
(1) Walk to pump.
(2) Close delivery valve of failed pump.
(3) Close suction valve of failed pump.
(4) Open suction valve of spare pump.
(5) Press start button.
(6) Open delivery valve of spare pump.
It does not matter if the operator forgets to carry out steps (2) and (3), so there are four steps which have to be carried out correctly. Step (1) is included as perhaps one of the commonest sources of error is failing to carry out this step — that is, the operator forgets the whole job because he has other things on his mind, or goes to the wrong pump.

Many analysts would use the simple approach of Section 7.2 and assume that the job will be done correctly 99 times out of 100 in a normal, low stress situation, and rather less often — perhaps 9 times out of 10 — if the stress is high, say the operator knows that the plant will shut down in 5 minutes if the spare pump is not started up correctly. Let us see if a more analytical approach is helpful. From Table 7.3, the lines which seem most applicable are lines three and five:

Activity — Estimated error probability
General human error of commission, for example misreading label and therefore selecting wrong switch — 3 × 10⁻³
Errors of omission, where the items being omitted are embedded in a procedure — 3 × 10⁻³

There are four critical steps. If these figures are assumed to apply to each step, the total probability of error is 12 × 10⁻³ = 0.012 (1 in 80).

Table 7.3 is not much help in a condition of moderate stress, though it does consider very high stress situations (last five lines), so try applying Table 7.2:

Type of activity: Requiring attention, routine — K1 = 0.01
Stress factor: More than 20 secs available — K2 = 0.5
Operator qualities: Average knowledge and training — K3 = 1
Activity anxiety factor: Potential emergency — K4 = 2
Activity ergonomic factor: Good microclimate, good interface with plant — K5 = 1

Probability of error per step = K1K2K3K4K5 = 0.01, so for four steps the total probability of error is 0.04 (1 in 25).

However, if we assume each step is 'simple', rather than one 'requiring attention', the error rate is 10 times lower for each step, and is now 1 in 250 for the task as a whole. This illustrates the limitation of these techniques, when successive figures in a table differ by an order of magnitude, and shows how easy it is for an unscrupulous person to 'jiggle' the figures to get the answer he or she wants. The techniques are perhaps most valuable in estimating relative error probabilities rather than absolute values.

It is interesting to compare these estimates with the reliability of an automatic start mechanism. A typical failure rate will be 0.25/year and, if the mechanism is tested every 4 weeks, its fractional dead time (probability of failure on demand) will be:

½ × 0.25 × 4/52 = 0.01 or 1 in 100

None of our estimates of human reliability is lower than this, thus confirming the instinctive feel of most engineers that in this situation an automatic system is more reliable and should be installed if failure to start the pump has serious consequences.
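The three estimates in this section reduce to a few lines of arithmetic. The figures come from the text; the variable names and the layout are mine:

```python
# The spare-pump example: Table 7.3 per-step figures, a TESEO estimate,
# and the fractional dead time of an automatic start mechanism.

steps = 4                                  # critical steps in starting the pump

# Table 7.3: ~3e-3 per step for both the commission and omission entries
p_table = steps * 3e-3                     # 0.012, about 1 in 80

# TESEO, per step: K1 * K2 * K3 * K4 * K5
p_teseo_step = 0.01 * 0.5 * 1 * 2 * 1      # 0.01 per step
p_teseo_task = steps * p_teseo_step        # 0.04, about 1 in 25

# Automatic start mechanism: failure rate 0.25/y, tested every 4 weeks;
# fractional dead time = 1/2 * failure rate * test interval
fdt = 0.5 * 0.25 * (4 / 52)                # ~0.01 probability of failure on demand
```

Adding the per-step probabilities is the usual small-probability approximation to 1 − (1 − p)⁴ and is the same shortcut the text takes.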

7.5.2 Filling a tank
Suppose a tank is filled once/day and the operator watches the level and closes a valve when it is full. The operation is a very simple one, with little to distract the operator, who is out on the plant giving the job his full attention. Most analysts would estimate a failure rate of 1 in 1000 occasions, or about once in 3 years. This is confirmed by Table 7.2, which gives:

K1 = 0.001, K2 = 0.5, K3 = K4 = K5 = 1
Failure rate = 0.5 × 10⁻³, or 1 in 2000 occasions (6 years)

In practice, men have been known to operate such systems for 5 years without incident.

An automatic system would have a failure rate of about 0.5/year and, as it is used every day, testing is irrelevant and the hazard rate (the rate at which the tank is overfilled) is the same as the failure rate, about once every 2 years. The automatic equipment is therefore less reliable than an operator. Replacing the operator by automatic equipment will increase the number of spillages unless we duplicate the automatic equipment or use Rolls-Royce quality equipment (if available). (For a more detailed treatment of this problem see Reference 7.)

As mentioned in Section 7.1 (page 130), if we replace an operator by automatic equipment we do not, as is often thought, eliminate the human element. We are merely dependent on different people. We may remove our dependence on the operator, but we are now dependent on the people who design, construct, install, test and maintain the automatic equipment. It may be right to make the change (it was in one of the cases considered, not the other) as these people usually work under conditions of lower stress than the operator, but do not let us fool ourselves that we have removed our dependence on people.

7.5.3 More opportunities — more errors
In considering errors such as those made in starting a pump, filling a tank, etc., do not forget that the actual number of errors made by an operator, as distinct from the probability of errors, depends on the number of times he is expected to start a pump, fill a tank, etc. I once worked in a works that consisted of two sections: large continuous plants and small batch plants. The foremen and operators on the batch plants had a poor reputation as a gang of incompetents who were always making

mistakes: overfilling tanks, putting material in the wrong tank, etc. Some of the best men on the continuous plants were transferred to the batch plants, but with little effect. Error rates on the batch plants were actually lower than on the continuous plants, but there were more opportunities for error: many more pumps were started up, tanks filled, etc., many more times per day.

7.5.4 The effect of the time available
The more time we have to take action, the less likely we are to take the wrong action, and attempts have been made to quantify the effect (see Table 7.3, last five items). Skill-based actions, discussed in Chapter 2, are automatic or semi-automatic and need less time for successful completion than rule-based actions, which in turn need less time than knowledge-based actions (both discussed in Chapter 3). Plant designers sometimes assume that if someone has 30 minutes or more in which to act, the probability that he will fail to do so is small and can be ignored. Of course, this assumes that the action required is within the person's physical or mental ability (Chapter 4), that he does not, for any reason, decide to ignore instructions (Chapter 5), that he has been told to carry out the action (if it is rule-based) and that he has been given the necessary training (if it is knowledge-based) (see Chapter 3). These assumptions are not always true. If the action required is beyond the ability of the person concerned, or he decides not to carry out the action, extra time will make no difference.

7.6 Button pressing
The American Institute for Research has published a series of papers on the reliability of simple operations such as those used in operating electronic equipment8. The application and limitations of the data can be illustrated by applying it to one of the button-pressing operations described in Section 2.4 (page 25), the operation of a beverage vending machine. The pushbuttons are considered under several headings. The first is size. The probability that the correct button will be pressed depends on the size, as shown below:

Miniature 0.9995
Up to 1¼ inch 0.9999
More than 1¼ inch 0.99994 ←

The pushbuttons on the beverage machines fall into the largest category, so I have put an arrow against the last item.

Next we consider the number of pushbuttons in the group, single column or row:

1–5 0.9997
4–10 0.9995
11–25 0.9990 ←

The next item to be considered is the number of pushbuttons to be pushed:

2 0.9996
4 0.9993
8 0.9965

On a beverage machine only one button has to be pushed, so I assumed that the probability of success is 0.9998 ←.

The fourth item is the distance between the edges of the buttons; for the spacing on these machines the value is 0.9995 ←.

The fifth item is whether or not the button stays down when pressed:

Yes 0.9998 ←
No 0.9993

The final item is clarity of the labelling:

At least two indications of control positioning 0.9999
Single, but clear and concise indication of control positioning 0.9997
A potentially ambiguous indication of control positioning 0.9991 ←

The total probability of success is obtained by multiplying these six figures together:

Reliability = 0.99994 × 0.9990 × 0.9998 × 0.9995 × 0.9998 × 0.9991 = 0.997

(that is, three errors can be expected in every 1000 operations). I actually got the wrong drink about once in 50 times (that is, 20 in 1000 times) that I used the machines.
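The vending machine estimate is a straight product of the arrowed figures. My arithmetic, with the six headings as dictionary keys of my own naming:

```python
# Multiply the per-heading success probabilities to get the overall
# reliability of pressing the right button on the beverage machine.
from math import prod

factors = {
    "button size": 0.99994,
    "number in group": 0.9990,
    "number to be pushed": 0.9998,
    "spacing": 0.9995,
    "button stays down": 0.9998,
    "labelling": 0.9991,
}
reliability = prod(factors.values())
# reliability is about 0.997: roughly 3 wrong drinks per 1000 operations
```

Note how one mediocre factor (the 11–25 button group) dominates the loss of reliability; the rest are nearly 1.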

Perhaps I am seven times more careless than the average person, or I may be thinking about the work I have just left. (The machines are in the corridor, so I may talk to the people who pass.) It is more likely that the difference is due to the fact that the method of calculation makes no allowance for stress or distraction (ignored in this mechanistic approach) and that the small amount present is sufficient to increase my error rate seven times. Remember that many errors can be corrected and that not all errors matter (or cause 'degradation of mission fulfilment', to use the jargon used by many workers in this field).

Reference 31 lists the factors which affect the probability of an error in keying a telephone number.

7.7 Non-process operations
As already stated, for many assembly line and similar operations error rates are available, based not on judgement but on a large database. They refer to normal, not high stress, situations. Some examples follow.

Swain9 quotes error rates per operation for operations involved in the assembly of electronic equipment — excess solder, insufficient solder, two wires reversed, capacitor wired backwards — and error rates for inspectors measuring the sizes of holes in a metal plate — errors in addition, subtraction and division, algebraic sign errors, measurement reading errors, and copying errors per six-digit number. The quoted rates range from 0.0005 to 0.025 per operation. These should not, of course, be used uncritically in other contexts.

7.8 Train driver errors
Section 2.3 (page 38) discusses errors by train drivers who passed signals at danger, so it may be interesting to quote an estimate of their error rates10. In 1972, in the Eastern Region of British Rail, 91 signals were passed at danger (information from British Rail). It was estimated that many incidents are not reported at the time and that the true total was 2–2.5 times this figure, say 185. The total mileage covered in the Eastern Region in 1972 was 72.2 × 10⁶ miles (information from British Rail). The spacing of signals is about 1 every 1200 yards (0.7 mile) on the main lines, rather less frequent on branch lines, say 0.75 mile on average. Therefore, in 1972, 72.2 × 10⁶/0.75 = 96 × 10⁶ signals were passed, and the chance that a signal will be passed at danger is 1 in 96 × 10⁶/185 = 1 in 5 × 10⁵ signals approached (not signals at danger approached). If we assume that between 1 in 10 and 1 in 100 (say 1 in 35) signals approached is at danger, then a signal will be passed at danger once every 5 × 10⁵/35 (about 14,000) occasions that a signal at danger is approached. This is at the upper end of the range quoted for human reliability26. Another recent estimate is once in 17,000 signals at danger approached30, in good agreement.

Taylor and Lucas27 quote 2 incidents per million train-miles for 1980 and 3 for 1988. In the same way, 185 signals passed at danger in 72.2 million train-miles is equivalent to 2.6 incidents per million train-miles, so Taylor and Lucas' figures suggest that my estimates are not too far out. However, nearly half their incidents were due to insufficient braking, so that the train came to a halt a short distance past the signal; the rest were due to other causes such as signals being suddenly altered as the train approached. Such incidents may be excluded from the figures I was given.

7.9 Some pitfalls in using data on human reliability
7.9.1 Checking may not increase reliability
If a man knows he is being checked, he works less reliably. If the error rate of a single operator is 1 in 100, the error rate of an operator plus a checker is certainly greater than 1 in 10,000 and may even be greater than 1 in 100 — that is, the addition of the checker may actually increase the overall error rate.

A second man in the cab of a railway engine is often justified on the grounds that he will spot the driver's errors. In practice, the junior man is usually reluctant to question the actions of the senior16.
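The Section 7.8 arithmetic chains together cleanly. The figures come from the text; the variable names are mine:

```python
# Chance of a train driver passing a signal at danger, Eastern Region of
# British Rail, 1972, following the chain of estimates in Section 7.8.

miles = 72.2e6                    # train-miles covered in 1972
signal_spacing = 0.75             # miles between signals, on average
signals_passed = miles / signal_spacing            # ~96e6 signals approached

passed_at_danger = 185            # 91 reported, scaled up for under-reporting

p_per_signal = passed_at_danger / signals_passed   # ~1 in 5e5 signals approached

# If about 1 in 35 signals approached is at danger:
p_per_signal_at_danger = p_per_signal * 35         # ~1 in 14,000
```

Each link in the chain is itself an estimate, which is why the order-of-magnitude agreement with the independent figure of 1 in 17,000 is the most that should be claimed.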

In the same way, the second officer on the flight deck of an aircraft may be reluctant to question the actions of the captain. One reason suggested for Australia's outstandingly good air safety record is that they have a culture in which second officers are not reluctant to question the actions of the captain17.

If two men can swop jobs and repeat an operation, then error rates come down. Reference 11 describes the calibration of an instrument in which one man writes down the figures on a check-list while the other man calls them out. The two men then change over and repeat the calibration. The probability of error was put at 10⁻⁵.

Requiring a man to sign a statement that he has completed a task produces very little increase in reliability, as it soon becomes a perfunctory activity (see Section 6.5, page 118).

Some organizations expect the man who is going to carry out the maintenance to check the isolations. Other organizations encourage him to do so but do not insist on it, as they believe this will weaken the responsibility of the isolator. Either way, the man who is going to carry out the maintenance will be well-motivated: he is the one at risk.

The following incident shows how checking procedures can easily lapse. A man was asked to lock off the power supply to an electric motor and another man was asked to check that he had done so. The wrong breaker was shown on a sketch given to the men, but the right one was shown on the permit and the tag and was described at a briefing. Both the isolator and the verifier used the drawing with the incorrectly circled breaker to identify the breaker to be locked out. The wrong motor was locked off. The report on the incident states, 'In violation of the site conduct-of-operations manual, which requires the acts of isolation and verification to be separated in time and space, the isolator and the verifier worked together to apply the lockout. Neither referred to the lockout/tagout permit or the lockout tags, although both initialled them.'28

7.9.2 Increasing the number of alarms does not increase reliability proportionately
Suppose an operator ignores an alarm on 1 in 100 of the occasions on which it sounds. Installing another alarm (at a slightly different setting or on a different parameter) will not reduce the failure rate to 1 in 10,000. If the operator is in a state in which he ignores the first alarm, then there is a more than average chance that he will ignore the second. (In one plant there were five alarms in series. The designer assumed that the operator would ignore each alarm on one occasion in ten, and the whole lot on one occasion in 100,000!)
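An illustrative model of the five-alarms fallacy; the interpolation scheme is mine, not the book's. Ignoring one alarm and ignoring the next are correlated, because both depend on the operator's state, so the true probability of ignoring them all lies far from the independence figure:

```python
# Why alarms in series do not multiply up reliability: interpolate between
# independence (correlation 0) and the worst case where one distracted
# operator ignores every alarm (correlation 1).

def p_ignore_all(p_single, n_alarms, correlation=0.0):
    independent = p_single ** n_alarms
    fully_correlated = p_single
    return independent + correlation * (fully_correlated - independent)

p_ignore_all(0.1, 5, correlation=0.0)   # 1e-5: the designer's assumption
p_ignore_all(0.1, 5, correlation=1.0)   # 0.1: extra alarms add nothing
```

Any realistic correlation puts the answer much nearer 0.1 than 10⁻⁵, which is the point of this section.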

7.9.3 If an operator ignores a reading he may ignore the alarm
Suppose an operator fails to notice a high reading on 1 occasion in 100 — it is an important reading and he has been trained to pay attention to it. Suppose that he ignores the alarm on 1 occasion in 100. Then we cannot assume that he will ignore the reading and the alarm on one occasion in 10,000. On the occasion on which he ignores the reading, the chance that he will ignore the alarm is greater than average.

7.9.4 Unusually high reliability
I have suggested above that about 1 in 1000 actions is the lowest error rate that should be assumed for any process operation, and Table 7.3 and Section 7.6 give 1 in 10,000 as the error rate for the simplest tasks. However, in certain circumstances people are capable of much higher reliability. Rasmussen and Batstone18 quote a net error rate (after recovery) of 1 in 10⁵ per opportunity for air traffic controllers and for flight deck operations on aircraft carriers. They suggest that to achieve these high reliabilities the operator:
• must be able to recover from the effects of error;
• must be in continuous interaction with a rapidly changing system where errors, if not corrected, will lead to loss of control;
• must be thoroughly familiar with the task.
In process operations the operator only interacts with the system from time to time; he makes an adjustment, such as pressing a button, and then turns his attention elsewhere. Also, process operations are usually designed so that there is a large margin between normal operation and loss of control, and the operators are not continuously operating at the boundaries of safe operation.

While the first edition of this book was in the press, a review appeared of attempts to validate human reliability assessment techniques19. The author concludes, 'Even the most comprehensive study of assessment techniques cannot tell us whether any of the methods it evaluated are worth much further consideration. Perhaps the only real message to emerge is that Absolute Probability Judgment (APJ) was the least worst of the methods evaluated.' APJ is another term for experienced judgement, gut feel or, if you prefer, guessing.

7.10 Data on equipment may be data on people
Does data on the reliability of equipment tell us something about the equipment, or does it tell us more about the people who maintain and operate the equipment?

Consider light bulbs as an example of a mass-produced article usually used under standard conditions. We do not expect the failure rate to vary from one user to another, and we are confident that failure data will tell us something about the inherent properties of the bulbs. Even here, however, if someone reported an unusually high failure rate we would question the conditions of use (were they used horizontally or upside down, or subject to vibration?) or wonder if the user (or supplier) had treated them roughly. We would use someone else's data with confidence.

Instruments come into the same category as light bulbs. The failure rate gives us information about the equipment and does not vary by more than a factor of about four, even in a harsh environment20.

Mechanical equipment is different: the failure rate depends on the conditions of use and the maintenance policy. These vary greatly from one location to another, and great caution is needed in using other people's data.

Beverage vending machines are a simple example. Only rarely do the machines in the office fail to deliver (or deliver cold). If you have tried to use similar machines in public places you will know that they often fail. The design may be the same, but the customers treat them differently and there is probably no system for reporting and repairing damage. Machinery sometimes fails because it has not been lubricated regularly. The failure data measure the training or competence of the operating team and tell us nothing about the machinery21.

A more technical example is bellows. A study in one company showed that 1 in 50 failed per year and had to be replaced. The main cause of failure was poor installation and the second most common cause was wrong specification of the material of construction. The failure rate varied a great deal between one factory and another and told us more about the training and motivation of the maintenance and construction teams, and the quality of the supervision and inspection, than it told us about the bellows. The information was, nevertheless, still useful. It told us that bellows cannot withstand poor installation and are therefore best avoided when hazardous materials are handled; expansion loops should be used instead. If bellows are used we should specify the method of installation in detail, train the maintenance team and check that the bellows are properly installed.

The other cause of bellows failure — incorrect specification — is easy to put right once it is recognized. The failure rate due to this cause tells us about the knowledge of the specifiers and the company's procedures for transferring information from one design section to another.

Similarly, data on pipework failures tell us about the quality of design and construction rather than the quality of the pipes used (Section 9.1, page 168).

A man had three Ford cars and crashed each of them, so he decided to try another make. Does this tell us something about Ford cars or about the man?

7.11 Who makes the errors?
Designers, construction workers, maintenance workers, operators and others all make errors. In some activities maintenance errors may predominate, in others operating errors, and so on. Scott and Gallaher22 quote the following for the distribution of errors leading to valve failure in pressurized water nuclear reactors:

Physical causes 54%
Human causes 46%
  Administrative errors 7%
  Design errors 8%
  Fabrication errors 4%
  Installation errors 5%
  Maintenance errors 16%
  Operator errors 7%

Chapters 8–11 consider in turn errors made during design, construction, maintenance and operations.

Risks are sometimes said to be due either to human failures or equipment failures. The latter are due to failures by designers or other people (or to a decision to accept an occasional failure), but we find it convenient to use in our calculations the resultant figures for the failure rate of the equipment rather than the failure rate of the designer.

7.12 Conclusions
Enthusiasts for the quantification of risk sometimes become so engrossed in the techniques that they forget the purpose and limitations of the study, so it may be worth restating them. The purpose of risk quantification is to help us decide whether or not the risk is tolerable. In some other countries employers are, in theory, expected to remove all hazards, however unlikely to occur or small in their consequences, regardless of the type of error involved, but in practice this is impossible. There are risks so high that we do not tolerate them, risks so small that we accept them, and in between we reduce them if the costs of doing so are not excessive. In the UK this philosophy has been clearly spelt out by the Health and Safety Executive29.

If we decide (quantitatively or otherwise) that a risk is too high, the first step should be to see if we can use an inherently safer design.

For example, if a flammable solvent is used, could a non-flammable one be used instead? If not, can the probability of a leak and fire be reduced by using better designs of equipment or, if that is not practicable, better procedures? Fault tree analysis, as in Figure 7.1, is widely used for estimating hazard rates and risks to life. Its great virtue, however, is that it shows very clearly which failures, of equipment or people, contribute most to the probability of failure. It thus tells us where we should seek improvements.

References in Chapter 7
1. Lees, F.P., The assessment of human reliability in process control, Conference on Human Reliability in the Process Control Centre, Manchester (North West Branch, Institution of Chemical Engineers), 1983.
2. Williams, H.C., Seminar on Human Reliability (UK Atomic Energy Authority, Risley, Warrington, UK), 1973.
3. Kletz, T.A. and Lawley, H.G., Chapter 2.1 of Green, A.E. (editor), High Risk Safety Technology (Wiley, Chichester, UK), 1982.
4. Bello, G.C. and Colombari, V., Reliability Engineering, 1980, 1(1): 95.
5. Swain, A.D. and Guttmann, H.E., Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, Report No. NUREG/CR-1278 (Sandia Laboratories, Albuquerque, New Mexico, USA), 1980.
6. Reactor Safety Study — An Assessment of Accident Risk in US Commercial Nuclear Power Plants, Appendix III, Report No. WASH 1400 (Atomic Energy Commission, Washington, DC, USA), 1975.
7. Kletz, T.A., Hazop and Hazan — Identifying and Assessing Process Industry Hazards, 4th edition (Institution of Chemical Engineers, Rugby, UK), 1999.
8. Payne, D. et al., An Index of Electronic Equipment Operability, Report Nos AD 607161–5 (American Institute for Research, Pittsburgh, PA, USA), 1964.
9. Swain, A.D. (Sandia Laboratories, Albuquerque, New Mexico, USA).
10. Kletz, T.A., Journal of Occupational Accidents.
11. See Section 3.1 of Green, A.E. (editor), High Risk Safety Technology (Wiley, Chichester, UK), 1982.
12. Gonner, E.C.K., Royal Statistical Society Journal, 1913, page 89; quoted by Boreham, N.C., Vital Statistics (BBC Publications, London, UK).
13. Kunreuther, H.C. and Linnerooth, J., Risk Analysis and Decision Processes (Springer-Verlag, Germany), 1983.
14. Livingstone-Booth, A., Reliability Engineering, 13(4): 211.
15. Pitblado, R.M., Williams, J.C. and Slater, D.H., Plant/Operations Progress, 1990, 9(3): 169.
16. Railroad Accident Report: Head-on Collision of Two Penn Central Transportation Company Freight Trains near Pettisville, Ohio, Report No. NTSB-RAR-76-10 (National Transportation Safety Board, Washington, DC, USA), 1976.

DC. Lees. 149 20. USA). 6(5). M.W. Near Miss Reporting as a Safety Tool (Butterworth-Heinemann. R. a Crnnputemised Human Error Prohainlity Database (HSEBooks. 1999. 1989. NUREG/C'R 0848 (quoted in The MANAGER Technique. Chapter5 (American Instituteof Chemical Engineers. UK) 25. November 1988 (Technica. Disaster Pmei'entzoii and Manage/ne/it. Rasmussen. Taylor-Adams. Oxford.6 (Buttenvorth-Heinemann.M. 18. 1985. ContractResearch Report245/1999.. and Lucas. I 13.T 35 (WorldBank. 1999 (HSEBooks.H. Operating Experience with Valves in Lighrn'ater-reacsor Nuclear Power Plantsfor the Period 1965—]978.USA). R. 28. 1 December 1984.UK). W. Boyes. Reducing Risks.B. Loss PreventionBulletin. Reliability Engineering. No. 23. page 11 (Office of Nuclearand FacilitySafety. Section 13. Oxford. USA) 29. in van der Schaaf.Ti-iC i'ROBAIiiLiTY OF HUMAN ERROR NTSB—RAR—76—10. Taylor. 146. T. UK). S. 1994. and Batstone. 1998.No.B.. December 2000. Embrey. USA). Reliability Engineering. Washington.. 2nd edition (HSE Books. Flight International. J. Scott. ModernRailways. Sudbury. UK)). 99—24. AR.P. US Department ofEnergy.P. 1985. D. 1991. Williams. W..L. pnvate communication 27. Health and Safety Executive. 1992. page45. 26. Lucas. Ford..H. 1976 (National Transportation Safety Board. Ramsden. 153 . UK). ProtectingPeople. page No. F. Gibson. and Hale. T. Washington. 21.NewYork. DC.A. 11 185 22. 30. 1999. page 29.D. (editors). Loss Preventionin the Process Industries.. See also HSE Discussion Document. The Tolerability ofRiskfrom Nuclear Power Stations. D. 318. DC. Why Do Complex OrganisanonalSystems Fail7. 2nd edition. 24. and Megaw... D.Washington. OperatingExperience Weekly Summary. and Megaw.. London. and Semmens. Guidelines for PreventingHuman Error in Process Safely. 11(3). 1997. R...W.T. The Implementation of CORE-DATA. J. B . Report No. 1996.K. UK). and Gallaher. R.. page 1449 and 26 January 1985. Gibson. Sudbury. Kletz. R. and Kirwan. 17. 31. 1972.A. J.A. Sudhury. 19.
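The AND/OR arithmetic behind such a fault tree can be sketched in a few lines. This is a minimal illustration only: the tree, the event names and the probabilities below are invented for the example, not taken from the book.

```python
# Minimal fault-tree arithmetic: OR-gates and AND-gates over independent
# basic events. Event names and probabilities are invented for illustration.

def p_or(*ps):
    """Probability that at least one of the independent input events occurs."""
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def p_and(*ps):
    """Probability that all of the independent input events occur."""
    q = 1.0
    for p in ps:
        q *= p
    return q

# Top event 'fire' needs BOTH a leak AND ignition (AND-gate);
# a leak occurs if EITHER the pump seal OR a flange fails (OR-gate).
p_seal, p_flange, p_ignition = 0.1, 0.02, 0.5
p_leak = p_or(p_seal, p_flange)
p_fire = p_and(p_leak, p_ignition)
print(f"P(leak) = {p_leak:.4f}, P(fire) = {p_fire:.4f}")

# The tree shows where to seek improvements: the seal dominates the leak
# probability, so improving the seal cuts the risk far more than the flange.
print(f"P(fire, better seal)   = {p_and(p_or(0.0, p_flange), p_ignition):.4f}")
print(f"P(fire, better flange) = {p_and(p_or(p_seal, 0.0), p_ignition):.4f}")
```

The point the text makes falls straight out of the numbers: the dominant basic event is the one worth designing out first.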

Some accidents that could be prevented by better design

'Don't take square-peg humans and try and hammer them into round holes. Reshape the holes.'
P.W. Foley

'It is much simpler to find a few good designers and inspectors than to staff, with assurance, potentially thousands of plants with skilled operators.'
C.W. Forsberg13

So far, in describing accidents due to human error, I have classified them by the type of error. This chapter describes some more accidents due to various sorts of human error that could be prevented by better design — in particular, by changing the design so as to remove opportunities for error. Just as we try to prevent some accidents by changing the work situation, so we should try to prevent other accidents by changing the design situation — that is, we should try to find ways of changing the design process so as to produce better designs.

I do not, of course, wish to imply that accidents are due to the negligence of designers. Designers make errors for all the reasons that other people make errors: a lapse of attention, ignorance, lack of ability, a deliberate decision not to follow a code. Unlike operators, they usually have time to check their work, so slips and lapses of attention are frequently detected before the design is complete.

As stated in Chapter 1, safety by design should always be our aim, but often there is no reasonably practicable or economic way of improving the design and we have to rely on improvements to procedures. We cannot buy our way out of every problem. However, this chapter gives examples of cases where changes in design are practicable, often without extra cost, at least on new plants.

The changes necessary will be clearer when we have looked at some examples, but the main points that come out are:
• Cover important safety points in standards or codes of practice.
• Make designers aware of the reasons for these safety points by telling them about accidents that have occurred because they were ignored. As with

operating staff, discussion is better than writing or lecturing (see Section 3.3.7, page 54).
• Carry out hazard and operability studies1 on the designs. As well as the normal Hazop on the line diagrams, an earlier Hazop on the flowsheet (or another earlier study of the design concept) may allow designers to avoid hazards by a change in design instead of controlling them by adding on protective equipment2 (see Section 8.7, page 162).
• Make designers more aware of the concepts described in Section 8.7.

8.1 Isolation of protective equipment
An official report3 described a boiler explosion which killed two men. The boiler exploded because the water level was lost. The boiler was fitted with two sight glasses, two low level alarms, a low level trip which should have switched on the water feed pump and another low level trip, set at a lower level, which should have isolated the fuel supply. All this protective equipment had been isolated by closing two valves.

We do not know why the valves were closed. Perhaps they were closed in error. Perhaps they had been closed for maintenance and someone forgot to open them. Perhaps the operator deliberately isolated them to make operation easier, or because he suspected the protective equipment might be out of order. Perhaps he was not properly trained. It does not matter. It should not be possible to isolate safety equipment so easily.

The report recommended that it should not be possible to isolate all the protective equipment so easily. It is necessary to isolate safety equipment from time to time, but each piece of equipment should have its own isolation valves, so that only the minimum number need be isolated. Trips and alarms should be isolated only after authorization in writing by a competent person, and this isolation should be signalled in a clear way — for example, by a light on the panel — so that everyone knows that it is isolated.

In addition, although not a design matter, regular checks and audits of protective systems should be carried out to make sure that they are not isolated. Such surveys, in companies where they are not a regular feature, can bring appalling evidence to light. For example, one audit of 14 photo-electric guards showed that all had been bypassed by modifications to the electronics, modifications which could have been made only by an expert4.
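The administrative safeguards just described — written authorization by a competent person, a clear signal such as a panel light, and regular audits — can be made concrete in a short sketch. This is purely illustrative, with hypothetical names; real plants use formal permit and interlock-management systems, not scripts.

```python
# Illustrative sketch only (hypothetical names, not a real plant system):
# a register that refuses to isolate a trip or alarm without written
# authorization, signals every isolation (here, a printed "panel light"),
# and supports the regular audit the text recommends.

class TripRegister:
    def __init__(self):
        self.isolated = {}                       # tag -> who authorized it

    def isolate(self, tag, authorized_by=None):
        if not authorized_by:
            raise PermissionError(f"{tag}: written authorization required")
        self.isolated[tag] = authorized_by
        print(f"PANEL LIGHT ON: {tag} isolated (authorized by {authorized_by})")

    def restore(self, tag):
        self.isolated.pop(tag, None)
        print(f"PANEL LIGHT OFF: {tag} back in service")

    def audit(self):
        """Regular check: list everything still isolated, so nothing is forgotten."""
        return sorted(self.isolated)

reg = TripRegister()
reg.isolate("low-level trip", authorized_by="competent person, in writing")
print("Audit finds isolated:", reg.audit())
reg.restore("low-level trip")
```

The design point is that isolation without authorization is impossible by construction, and the audit has one obvious place to look — the opposite of two unmarked valves that silence everything at once.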

8.2 Better information display
The tyres on a company vehicle were inflated to a gauge pressure of 40 psi (2.7 bar) instead of the recommended 25 psi (1.7 bar). A tyre burst and the vehicle skidded into a hut. Again we do not know why the tyres were overinflated. It may have been due to ignorance by the man concerned, or possibly he was distracted while inflating them. However, the factory management found a simple way of making such errors less likely — they painted the recommended tyre pressures above the wheels of all their vehicles.

There is another point of interest in this incident. The company concerned operated many factories. The other factories were informed, but only the one where the incident occurred made the change; the others may have considered one but decided to take no action — or never got round to it. This is a common failing. When an accident has occurred to us we are willing to make a change to try to prevent it happening again, so willing that we sometimes over-react. When the accident has happened elsewhere, the safety adviser has to work much harder to persuade people to make a change. A small fire at our place of work has more effect than a large fire elsewhere in the country or a conflagration overseas.

Another example of prevention of errors by better information display is shown in Figure 8.1.

8.3 Pipe failures
About half of the large leaks that occur on oil and chemical plants are due to pipe failures. These have many causes. Here are accounts of a few that could have been prevented by better design6.

8.3.1 Remove opportunities for operator errors
The wrong valve was opened and liquid nitrogen entered a mild steel line, causing it to disintegrate. This incident is similar to those described in Chapter 2. For one reason or another an error is liable to be made sooner or later, and an engineering solution is desirable. The pipe could be made of stainless steel, or the valve could be an automatic one, kept shut by a low temperature trip.

Designers would never design a plant so that operation of a valve in error caused equipment to be overpressured; they would install a relief valve or similar so that it would not matter if the valve was opened. They are less willing to prevent equipment getting too cold (or too hot). Design codes and procedures should cover this point7.
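The pressure figures quoted in Section 8.2 are easy to check. A small sketch of the psi-to-bar conversion (the text rounds the results to 2.7 and 1.7 bar):

```python
# Checking the figures in the text: 1 psi = 0.0689476 bar, so the overinflated
# tyre at 40 psi was about 2.76 bar against a recommended 25 psi (about
# 1.72 bar); the text rounds these to 2.7 and 1.7 bar.
PSI_TO_BAR = 0.0689476

for psi in (40, 25):
    print(f"{psi} psi = {psi * PSI_TO_BAR:.2f} bar")
```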

Figure 8.1 Puzzle — which way do you turn the knob to increase the reading? It is better to put the scale on the base-plate instead of the knob. There is then no doubt which way the knob should be turned.

8.3.2 Remove opportunities for construction errors
A fire at a refinery was caused by corrosion of an oil pipeline just after the point at which water had been injected8 (Figure 8.2(a), page 158). A better design is shown in Figure 8.2(b) (page 158): the water is added to the centre of the oil stream through a nozzle so that it is immediately dispersed. However, a plant that decided to use this system found that corrosion got worse instead of better. The nozzle had been installed pointing upstream instead of downstream (Figure 8.2(c), page 158).

Figure 8.2 Methods of adding water to an oil stream: (a) original design; (c) revised design as installed

Once the nozzle was installed it was impossible to see which way it was pointing. On a better design there would be an external indication of the direction of flow, such as an arrow, or, better still, it would be impossible to assemble the equipment incorrectly. We should design, when possible, so as to remove the opportunity for construction and maintenance errors as well as operating errors.

8.3.3 Design for all foreseeable conditions
A bellows was designed for normal operating conditions. During normal operation the maximum deflection was 1 inch. When it was steamed through at a shutdown, it was noticed that one end was 7 inches higher than the other, although it was designed for a maximum deflection of ±3 inches. The man who carried out the detailed design of the pipework was dependent on the information he received from other design sections. The design organization should ensure that he receives information on transient conditions, such as start-up, shutdown and catalyst regeneration, as well as normal operating conditions.

Another error made by vessel designers is inadvertently providing pockets into which water can settle. When the vessel heats up and hot oil comes into contact with the water, it may vaporize with explosive violence. In these cases the action required is better education of the designers, particularly the process engineering section, who were probably not aware of the hazard.

8.4 Vessel failures
These are very rare, much less common than pipework failures. Nevertheless, some do occur, although only a few of these could be prevented by better design. Many are the result of treating the vessel in ways not foreseen by the designer — for example, overpressuring it by isolation of the relief valve. If designers do fit isolation valves below relief valves, then they are designing an error-prone plant. In certain cases, when the consequences of isolation are not serious, isolation of a relief valve may be acceptable.

8.5 The Sellafield leak
A cause célèbre in 1984 was a leak of radioactive material into the sea from the British Nuclear Fuels Limited (BNFL) plant at Sellafield, Cumbria. It was the subject of two official reports9,10 which agreed that the discharge was due to an operating error, though it is not entirely clear whether the error was due to a lack

of communication between shifts, poor training or wrong judgement. The authors of the official reports seem to have made the common mistake of looking for culprits instead of looking for ways of changing the work situation — in this case, by improving the design process.

As a result of the operating error, some material which was not suitable for discharge to sea was moved to the sea tanks (Figure 8.3). This should not have mattered, as BNFL thought they had 'second chance' design — the ability to pump material back from the sea tanks to the plant. Unfortunately the return route used part of the discharge line to sea. The return line was 2 inches diameter; the sea line was 10 inches diameter, so solids settled out in the sea line, where the linear flow rate was low, and were later washed out to sea.

The design looks as if it might have been the result of a modification. Whether it was or not, it is the sort of design error that would have been detected by a hazard and operability study1, if one had been carried out. Both official reports failed to point out that the leak was the result of a simple design error. Reference 1 describes many other accidents that could have been prevented by Hazop.

Figure 8.3 Simplified line diagram of the waste disposal system at Sellafield (2 inch return line to plant; 18 inch line; 10 inch line to sea)
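The settling problem is a matter of simple geometry: for the same volumetric flow, linear velocity falls with the square of the pipe diameter, so the 10 inch sea line ran 25 times slower than the 2 inch return line. A rough sketch (the flow figure is invented for illustration):

```python
# For a fixed volumetric flow, linear velocity in a pipe scales with
# 1/diameter^2, so the 10 inch sea line runs (10/2)^2 = 25 times slower
# than the 2 inch return line - slow enough for solids to settle out.
# The flow figure below is hypothetical, chosen only for illustration.
import math

def velocity_m_per_s(flow_m3_per_h, diameter_inch):
    d = diameter_inch * 0.0254                  # inches -> metres
    area = math.pi * d * d / 4.0                # pipe cross-section, m^2
    return (flow_m3_per_h / 3600.0) / area

flow = 20.0                                     # m^3/h, hypothetical
v_return = velocity_m_per_s(flow, 2)
v_sea = velocity_m_per_s(flow, 10)
print(f"2 inch return line: {v_return:.2f} m/s")
print(f"10 inch sea line:   {v_sea:.3f} m/s")
print(f"velocity ratio: {v_return / v_sea:.0f}")
```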

8.6 Other design errors

8.6.1 Breathing apparatus
A report14 described several failures of the compressed air supply to breathing apparatus or compressed air suits. All the incidents could have been prevented by better design:
• A plunger in a quick-disconnect valve in a compressed air line was reversed. It should have been designed so that incorrect assembly was impossible.
• Another quick-disconnect valve came apart in use. A manufacturer's stamp on one of the components prevented it holding securely.
• A nozzle broke as the result of excessive stress.

8.6.2 Stress concentration
A non-return valve cracked and leaked at the 'sharp notch' shown in Figure 8.4(a) (page 162). The original flange had been replaced by one with the same inside diameter but a smaller outside diameter. The pipe stub on the non-return valve had therefore been turned down to match the pipe stub on the flange, leaving a sharp notch. The design was the result of a modification. A more knowledgeable designer would have tapered the gradient as shown in Figure 8.4(b) (page 162).

The detail may have been left to a craftsman. He might resent being told to avoid sharp edges where stress will be concentrated. Some knowledge is considered part of the craft, and we should not need to explain it to a qualified craftsman. It is not easy to know where to draw the line. Each supervisor has to know the ability and experience of his team.

At one time church bells were tuned by chipping bits off the lip. The ragged edge led to stress concentration, cracking, a 'dead' tone and ultimately to failure15.

8.6.3 Choice of materials
A dozen Cessna aircraft crashed in a period of 29 months, killing more than 29 people. In every case corrosion of the exhaust system was known or suspected to have occurred. After another crash in 1999 a Cessna spokesman was quoted in the press as saying that there was no fault in the design of the aircraft: 'It was absolutely a viable design and they were viable materials for the aircraft ... It is a feature of the material which has shown that it does not take the wear over a number of years ...'23 Doesn't the designer choose the materials of construction?

Figure 8.4(a) Turning reduced the thickness of the stub on the non-return (check) valve but left a sharp notch. Stress concentration led to failure.
Figure 8.4(b) A safer solution

8.7 Conceptual shortcomings
Most of the incidents described so far have been mistakes: the designers were unaware of specific hazards, good practice or the practicalities of plant operation. Many of these designers would have benefited from a period as maintenance engineers or members of the start-up team on the plants they had designed. Ideally every design engineer should have worked on operating plant.

In addition, many designers are unaware of a whole range of design possibilities. Foremost amongst these is the concept of inherently safer design. Instead of controlling hazards by adding on protective equipment we can often avoid the hazard. Thus we can use so little hazardous material that it

hardly matters if it all leaks out, we can use a safer material instead, or we can use the hazardous material in the least hazardous form. We can often simplify the design so that there are fewer opportunities for errors and less equipment to fail (see Section 6.1, page 88). Very substantial reductions in inventory are sometimes possible. Reference 2 describes many examples of what has been or might be done and suggests procedures that can help us to identify possibilities.

The following example illustrates the concept. A potentially hazardous chemical reaction is the manufacture of ethylene oxide (EO) by oxidizing ethylene with pure oxygen, close to the explosive limit. However, the reaction takes place in the vapour phase and so the inventory in the reactor is small. When explosions have occurred, they have been localized incidents. The biggest hazard on many older EO plants is not the ethylene/oxygen mixture in the reactor but the method used for cooling it. The reactor tubes are surrounded by about 400 tonnes of boiling paraffin under pressure. Any major leak could produce a devastating vapour cloud explosion, similar to the one that occurred at Flixborough in 1974. Later plants use water under pressure instead of paraffin as the coolant16.

Psychologists (and others) often say that we have reached the limit of what can be done by changes in design to make plants safer and we now need to concentrate on changing behaviour. This is not true, as the examples in this book show. The best companies may be near the limit of what can be done by adding on protective equipment, but the potentialities of inherently safer designs have still to be grasped; many companies are far from using the best available designs.

There are many reasons why inherently safer designs have been adopted much more slowly than techniques such as Hazop and QRA2,17. Surveys have shown that most safety advisers are now familiar with the concept of inherently safer design. While most designers are aware of it, they have a poor understanding of its scope and benefits. Most senior managers are unaware, and inherently safer designs will not become widespread until senior managers become more aware and actively encourage their use. Inherently safer designs also require a major change in the design process: more time to consider alternatives in the early stages of design. One possible reason is that designers and others have a mind-set (see Section 4.4, page 120) that any technique of value must have a certain degree of complexity. Perhaps my catch-phrase, 'What you don't have, can't leak', sounds more like a gimmick than a recipe for action. Could it be a turn-off rather than a spark?

Other shortcomings of some designers are:

Failure to consider options for the future
When comparing alternative designs we should favour those that make further developments possible.

Excessive debottlenecking
It draws resources away from innovation and delays the date when brand new plants are built. New plants, using new technology, can make greater improvements in efficiency, safety and protection of the environment18.

Poor liaison with research chemists, especially on scale-up
According to Basu19, the chemists do not know what information will be required and cannot take it into account when planning their work. He lists questions that should be asked.

Reluctance to consider possible failure modes
According to Zetlin12, 'Engineers should be slightly paranoiac during the design stage. They should consider and imagine that the impossible could happen. They should not be complacent and secure in the mere realization that if all the requirements in the design handbooks and manuals have been satisfied, the structure will be safe and sound.'

Reluctance to learn from failures
According to Petrowski20 (writing in 1994), 'By the late nineteenth and early twentieth century the increasing dramatic successes of engineers drove from the literature and, apparently, the mind of the engineering profession the failures and mistakes they then seemed doomed to repeat.' Judging from the paucity of professional literature on failures (as opposed to failures themselves) in the first three-quarters of the present century, to deal too explicitly with the errors of engineering was unfashionable at best, and unprofessional at worst. This is unfortunate, but itself appears to be an error that is now being rectified.

Poor liaison with other design sections
There was an example in Section 8.3.3 (page 159). Here is another. A plant extension was provided with its own control unit, located on a convenient site (electrical area classification Zone 2) a few metres from the main structure. It contained electrical equipment not suitable for use in a Zone 2 area, and the electrical section of the design department therefore arranged for the equipment to be placed in a metal box and purged with nitrogen. They did not ask if the control unit had to be in a Zone 2 area. That was not their job. Their job was to provide equipment suitable for the area classification already agreed. If they had queried the choice of site, the control unit could have been moved to a safe area only a few metres away.

Not only was the blanketing expensive but its design was poor. It failed in its task and an explosion occurred22.

Poor liaison with operators
Some designers fail to talk to their colleagues; more fail to discuss the design with those who will have to operate the plant. Many control system designers have never worked on a plant and may expect operators to look at various display pages at frequent intervals. Those who have operating experience will know that the same page may be left on display for long periods, and will therefore provide read-only displays which, like old-fashioned chart recorders, will provide a continuous display of trends. Some companies appoint the commissioning manager early in design, and he is based in the design department or contractor's office, to provide an operating input. One of the advantages of Hazop is that it forces different design sections and the commissioning manager to talk to each other, but unfortunately rather late in design.

Looking for alternatives in the wrong order
As already discussed, we should remove hazards when possible by inherently safer design. If we cannot do so, then we should add on protective equipment to keep the hazards under control. Only when that is not possible should we depend on procedures. Unfortunately, the default action of many designers, when they discover a hazard, is to act in the reverse order. They first propose a procedural solution. If they have been convinced that this is impractical, they propose adding on protective equipment. Only rarely do they look for ways of removing the hazard.

The fundamental shortcomings described in this section will not be overcome until senior managers recognize the need to do so and make the necessary organizational changes.

8.8 Problems of design contractors
Design contractors who are asked for a cheap design are in a quandary. Do they leave out safety features and safety studies such as Hazop, which are desirable but perhaps not essential, in order to get the contract?
Obviously there is a limit beyond which no reputable contractor will go but, with this reservation, perhaps they should offer a minimum design and then say that they recommend that certain extra features and studies are added.

T.' No. fillingin disused reservoirs and canalsor closing access throughderelict land. UK).. 166 . or indeed anyone whois tired or under someform ofstress.4th edition(Institution of Chemical Engineers. London. 3rd edition. The Journal of the Association for PetroleumActs Administration.. items 2 and 5 (Taylor & Francis. 'A more caiing society would be the idealbut thereis no wayofobligingpeople to bemore caring..London. Hazop and Hazan — tdentz'ing and Assessing Process IndustryHazards. drunks. Kletz..PH iU HRR S 'JIEW 0? HUMAN ERROR 8. page 42 (Tavistock Publications. USA) 3. Boiler Explosions Act 1882 and 1890 — Prelimznar' Enqiur. 8. Oxford. Philadelphia.A. 6. Already the design professions seem to take seriouslydesigning for the physicallyhandicapped who are otherwise active.A. UK)..For example: • • • • Since 1955 new paraffin heaters have had to be self-extinguishing when tippedover. 3453. 1969 (HMSO. UK). 1998. A. Philadelphia. 6(12): 16 5. Maybethey will soon see the needto designforthe infirm. Chapanis. Kietz. Since then these heaters have caused fewer fires. Man-machine Engineering.M. neglected children. The Bulletin. 1996.. Chapter 16 1 References in Chapter 8 (Butterworth-Heinemann. They couldbeprevented by usingfires withoutexposedheatingelements.' Kletz. T. 3rd edition. Kletz. Many drownings ofsmallchildren could be prevented by making the edges ofponds shallow. Field. Health andSaferv at Work.. Process Plants: A Handbook of Inherently Safer Design (Taylor & Francis. Manybumsaretheresultoffallsonto electric fires. April 1971. PA. 1984. T. the mentally ill.Rugby. Many drownings due to illness or drunkenness could be prevented by providingrailingsor closing paths atnight.. 4. 2. T. 1965. 2001. PA. USA). 7.A. The report concludes that.UK). 1999. 'Environmental and product design is the most reliable long-term means of accident prevention .' They therefore gaveexamples ofways inwhichbctterdesign has reducedorcouldreduceaccidents in or near the home. 
Dispelling ChenucalIndustryMyths. Learning from Accidents. Since 1964 new children's nightdresses have had to have increased resistance to flame spread.A. etc.9 Domestic accidents A report on domestic accidents1' said.. drug addicts. Since then fire deathshave fallen.

1984(Department ofEnvironment. Section 4. Kletz. R. August 1990. CA.Rugby.W.UK)... Design Paradigms. 15. Uhlig. Daily Telegraph... Chapter 2 (Butterworth-Heinemann. 1990. 1988. T.London.. 14 Operating Experience Weekly Swnmaiy. London. 1994. Science Dimension. 20. 22. Malpas. T. 1986. 10.ninatton ofthe Beach Incidentat BNFL Sellafield. page 7 (Office of Nuclearand FacilitySafety.Washington. DC.. Philadelphia. Endeavour.UK). 1984. quoted by Petrowski. 21. 19.tCCDEWtS E'ENTABLE 13Y BETTER DESIGN The C'onta.R. 2001. 11.T. Passive and inherent safety technology for lightwater nuclearreactors. UK). San Diego. USA. Basu. Petrowsld. Zetlm. Process Plants: A Handbook for Inherently Safer Design. Poyner. Researchand Innovation for the 1990s. 16. 13 Forsberg. 1988. 16(6) 13.P. No. 3rd edition. Design Paradigms. 1998. Chemical Engineering Progress.. AIChE Summer Meeting. in Atkinson. 1980. 98—52. L. USA). P. London.1. Cambridge.A. 9.2(Taylor&Francis. An Incident Leading to Contamination of the Beaches near to the BNFL Windscale and Calder Works. UK).. 98 (Cambridge Umversity Press. 1998.A. 1979. 4 September 1999.K. 1999. H. 1984 (Health and SafetyExecutive. Kletz. 18. page 3 (Cambridge University Press. Oxford. Personal Factors in Domestic Accidents (Department of Trade. page 28 (Institution ofChemical Engineers.. 18(1): 64. 3(1): 19.H. C. Learning from Accidents.B. 167 . UK). PA. Foley. US Department ofEnergy. 12. Kletz.94(9):75.. USA).A. 1994. 17. UK). ProcessSafety Progress. UK). B. Cambridge.

Some accidents that could be prevented by better construction

'How very little, since things were made,
Things have altered in the building trade.'
Rudyard Kipling, A Truthful Song

'On the one hand the most brilliant workmanship was disclosed ... while on the other hand it was intermingled with some astonishing carelessness and clumsiness.'
Flinders Petrie (describing the pyramids)3

This chapter describes some accidents which were the result of construction errors; in particular, they resulted from the failure of construction teams to follow the design in detail, or to do well, in accordance with good engineering practice, what had been left to their discretion. Many of the failures occurred months or years after the construction errors. The reasons for the failures may have been lack of training or instructions or a lack of motivation, and perhaps better training would help, but the actions needed are the same: better inspection during and after construction, in order to see that the design has been followed and that details left to the discretion of the construction team have been carried out in accordance with good engineering practice.

9.1 Pipe failures
The following examples of construction errors which led to pipe failures (or near-failures) are taken from a larger review of the subject1. The construction error that led to the collapse of the Yarra bridge has already been discussed (see Section 3.5, page 57).
• A temporary support, erected to aid construction, was left in position.
• A plug, inserted to aid pressure testing, was left in position. It was not shown on any drawing and blew out 20 years later.
• Pipes were inadequately supported, vibrated and failed by fatigue. (The operating team could also have prevented the failure, by seeing that the pipe was adequately supported.)

• On several occasions bellows have failed because the pipes between which they were installed were not lined up accurately. The men who installed them apparently thought that bellows can be used to take up misalignment in pipes; the bellows were forced to take up the misalignment and tore in trying to do so. In fact, when bellows are used, pipes should be aligned more, not less, accurately than usual (see Section 7.10, page 149).
• A construction worker cut a hole in a pipe at the wrong place and, discovering his error, patched the pipe and said nothing. The patch was then covered with insulation. As the weld was not known to exist it was not radiographed. It was substandard and corroded. There was a leak of phosgene and a man was nearly killed.
• When a pipe expanded, a branch came in contact with a girder and was knocked off. The pipe was free to expand 125 mm (5 inches) but actually expanded 150 mm (6 inches). Did the construction team know that the pipe would be used hot and would therefore expand and, by how much?
• Insufficient forged T-pieces were available for a hot boiler feed water system, so three were improvised by welding together bits of pipe. Four years later they failed.
• Two pipe ends, which were to be welded together, did not fit exactly and were welded with a step between them.
• A new line had the wrong slope, so the contractors cut and welded some of the hangers. Other hangers failed due to incorrect assembly and absence of lubrication.
• Pipes have been fixed to supports so firmly that they were not free to expand. They failed.
• Dead-ends have been left in pipelines in which water and/or corrosion products have accumulated. The water has frozen, or the pipe has corroded, splitting the pipe.
• On many occasions construction teams have used the wrong material of construction. Often the wrong material has been supplied, but the construction team did not carry out adequate checks. The responsibility lies with the construction management rather than the workers.

As an example of the last bullet, an ammonia plant was found to have the following faults, none of which were detected during construction and all of which caused plant shutdowns2:
• Turbine blades were made from the wrong grade of steel.
• The bolts in the coupling between a turbine and a compressor had the wrong dimensions.

• There was a machining error in another coupling.
• The rivets in air compressor silencers were made from the wrong material.

Other faults detected in time included:
• Defects in furnace tubes.
• Some furnace tube bends made from the wrong grade of steel.
• Some furnace tubes made from two different grades of steel.
• Blistering caused by welding fins onto furnace tubes.
• Mineral-filled thermocouples were filled with the wrong filler.

Following similar experiences, in the 1970s and 1980s many companies introduced positive materials identification programmes. If use of the wrong grade of material could have adverse results, such as premature failure or corrosion, all construction and maintenance material, from welding rods to complete items, was analysed to check that it was made from the material specified. Many of these programmes were abandoned when suppliers achieved quality certification. Since a minor error — from the supplier's point of view — can have disastrous results, is this wise?

A list of points to check during and after construction is included in Reference 1. Many of the points are not included in any code of practice as they are considered obvious; nevertheless, they are good engineering practice. Probably no code, for example, says that pipes must not be fixed so firmly that they cannot expand. When inspecting a plant after construction we have to look out for errors that no-one has ever specifically forbidden.

9.2 Miscellaneous incidents

9.2.1 Contractors made it stronger
A compressor house was designed so that the walls would blow off if an explosion occurred inside. The walls were to be made from lightweight panels secured by pop rivets. The construction team decided that this was a poor method of securing the wall panels and used screws instead. When an explosion occurred in the building, the pressure rose to a much higher level than intended before the walls blew off, and damage was greater than it need have been. In this case a construction engineer, not just a construction worker, failed to follow the design. He did not understand, and had probably not been told, the reason for the unusual design. Section 3.3.4 (page 56) describes another similar incident.

9.2.2 Contractors exceeded authorization
An explosion occurred in a new storage tank, still under construction (Figure 9.1). Without permission from the operating team, and without their knowledge, the construction team had connected up a nitrogen line to the tank. Although the contractors closed the valve in the nitrogen line (Figure 9.1), it was leaking and a mixture of nitrogen and flammable vapour entered the new storage tank. The vapour mixed with the air in the tank and was ignited by a welder who was completing the inlet piping to the tank. The roof was blown off and landed, by great good fortune, on one of the few pieces of empty ground in the area.

The contractors had failed to understand that:
• The vapour space of the new tank was designed to be in balance with that of the existing tank. The nitrogen will therefore always be contaminated with vapour.
• Nitrogen is a process material and it should be treated with as much care and respect as any other process material. No-one was harmed in this case, but nitrogen can cause asphyxiation.
• Once new equipment is connected to existing plant it becomes part of it and should be subject to the full permit-to-work procedure. No connection should be made to existing equipment without a permit-to-work, additional to any permit issued to construct new equipment.

They would not, they said, have connected up a product line, but they thought it would be quite safe to connect up the nitrogen line.

Figure 9.1 Arrangement of nitrogen lines on two tanks: connection of the new tank to the nitrogen system led to an explosion. (Figure labels: reducing valve; valve closed but leaking; existing tank (in use); new tank (empty); welder working on pipeline.)

9.2.3 Six errors in twelve inches
Six errors, any one of which could have led to a major incident, were found in a 12-inch length of equipment following a minor modification: insertion of a flanged valve made from a socket weld valve, two socket weld flanges and two short pipes.
• The flanges were made from the wrong grade of steel.
• They were machined from bar, though forgings were specified.
• The bar had cracked during heat treatment and this was not discovered until the flanges failed in service.
• The valve was also made from the wrong grade of steel, a grade unsuitable for welding.
• The post-welding heat treatment was omitted.
• Pressure testing of the completed assembly was ineffective or not carried out.

In engineering we communicate by quoting specifications. This is ineffective if the people we are talking to do not understand them or do not realize the importance of following them.

9.2.4 A few small holes left out
A new atmospheric pressure tank was constructed for the storage of water. The level was measured by a float suspended inside a vertical tube the full height of the tank. Holes were drilled in the tube to make sure that the level inside it was always the same as the level outside it. One day the tank was overfilled and the roof came off, although the level indicator said it was only about half full. It was then found that the construction team had not drilled any holes in the top half of the tube. As the level in the tank rose, the air in the tube was compressed and the level in the tube rose less than the level in the tank. Perhaps the tank was inspected less thoroughly than usual, as it was 'only a water tank'.

9.2.5 Poor co-ordination
Some contractors, welding near the air inlet to a ventilation system, accidentally set fire to some insulation. Fumes were sucked into the building. The fire was soon extinguished; no-one was harmed by the fumes but a $17,000 experiment was ruined. No-one considered the effects the contractors might have on the ventilation⁴.
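The behaviour of the float tube in the overfilled water tank incident described above can be illustrated with Boyle's law. The sketch below is not from the original account; the 10 m tank height, the highest hole at half height, and a full tank are assumed, illustrative figures.

```python
import math

# Illustrative model of a float tube with no holes in its top half.
# Assumed figures (not from the original incident): 10 m tall tank,
# highest hole at 5 m, water in the tank at the full 10 m level.
P0 = 101325.0   # atmospheric pressure, Pa
RHO_G = 9810.0  # density of water times g, Pa per metre of head
TOP = 5.0       # length of the un-drilled top half of the tube, m
HEAD = 5.0      # tank level above the highest hole, m

# Let x be the rise of water inside the tube above the highest hole.
# Boyle's law for the trapped air:  P0 * TOP = P_air * (TOP - x)
# Hydrostatic balance:              P_air = P0 + RHO_G * (HEAD - x)
# With TOP == HEAD this reduces to a quadratic in u = TOP - x:
#   RHO_G * u**2 + P0 * u - TOP * P0 = 0
u = (-P0 + math.sqrt(P0**2 + 4.0 * RHO_G * TOP * P0)) / (2.0 * RHO_G)
indicated = TOP + (TOP - u)  # level seen by the float, m

print(f"Actual level: 10.0 m, indicated level: {indicated:.1f} m")
# → Actual level: 10.0 m, indicated level: 6.3 m
```

With these assumed figures the float would read about 6.3 m with the tank actually full — qualitatively the behaviour reported, where the indicator showed the tank only about half full.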

9.3 Prevention of construction errors
In the opening paragraph of this chapter I suggested that errors by construction teams are best detected by detailed inspection during and after construction. Who should carry out the inspection? The checks made by construction inspectors in the past are clearly not sufficient. The team who will start up and operate the plant have an incentive to inspect the plant thoroughly, as they will suffer the results of any faults not found in time. The designers may see more readily than anyone else when their intentions have not been followed, though they may not spot their own errors. The inspection of the plant during and after construction should therefore be carried out by the start-up team, assisted by one or more members of the design team.

Because of the nature of the task, it is difficult to prevent construction errors by changes in design and the approach must be mainly a software one — better inspection and perhaps training. However we should, when possible, avoid designs which are intolerant of construction errors. For example, errors similar to that shown in Figure 8.2 (page 158) can be avoided by designing the equipment so that it is impossible to assemble it incorrectly. Similarly, bellows may fail if the pipes between which they are placed are not lined up accurately. Fixed piping can be forced into position, but not bellows. Bellows are therefore best avoided when hazardous materials are handled, and flexibility obtained by incorporating expansion loops in the piping, thus avoiding errors such as some of those described in Sections 9.1 and 9.2.

This chapter is not intended as an indictment of construction teams. They have a different background to those who design and operate plants. What is obvious to designers and operators is not obvious to them. Could we reduce the number of construction errors by taking more trouble to explain to construction workers the nature of the materials to be handled and the consequences of not following the design and good practice? Usually little or no attempt is made to carry out such training, because of the itinerant nature of the workforce, and many construction engineers are sceptical of its value. Nevertheless, perhaps it might be tried. Certainly, there is no excuse for not telling construction engineers and supervisors why particular designs have been chosen.

References in Chapter 9
1. Kletz, T.A., 2001, Learning from Accidents, 3rd edition, Chapter 16 (Butterworth-Heinemann, Oxford, UK).
2. Petrie, W.M.F., 1893, Ten Years' Digging in Egypt (Religious Tract Society, London, UK).
3. Kolff, S.W., 1984, Co-ordinating Construction/Maintenance Plans with Facility Manager may Deter Unexpected Problems and Accidents, Plant/Operations Progress, 3(2): 117.
4. Safety Note DOE/EH-0127, 1990 (US Department of Energy, Washington, DC, USA).

Some accidents that could be prevented by better maintenance

'The biggest cause of breakdowns is maintenance.'
Anon

This chapter describes some accidents which were the result of maintenance errors. Sometimes the maintenance workers were inadequately trained, sometimes they took short cuts, sometimes they made a slip or had a moment's lapse of attention. Sometimes it is difficult to distinguish between these causes, or more than one was at work. As with construction errors, it is often difficult or impossible to avoid them by a change in design and we are mainly dependent on training and inspection. However, designs which can be assembled incorrectly should be avoided, as should equipment, such as bellows, which is intolerant of poor quality maintenance (see Section 7, page 149, and Section 9.3, page 173).

10.1 Incidents which occurred because people did not understand how equipment worked
(a) People have been injured when dismantling diaphragm valves because they did not realize that the valves can contain trapped liquid (Figure 10.1, page 176). Here again, a hardware solution is often possible — liquid will not be trapped if the valves are installed in a vertical section of line.

(b) On a number of occasions men have been asked to change a temperature measuring device and have removed the whole thermowell. One such incident, on a fuel oil line, caused a serious refinery fire¹. One fire which started in this way killed six men.

(c) On several occasions, when asked to remove the actuator from a motorized valve, men have undone the wrong bolts and dismantled the valve²,³. On other occasions trapped mechanical energy, such as a spring under pressure, has been released. A hardware solution is possible in these cases. Bolts which can safely be undone when the plant is up to pressure could be painted one colour; others could be painted red⁴. A similar suggestion is to use bolts with recessed heads and fill the heads with lead if the bolts should not be undone when the plant is up to pressure.

Figure 10.1 Liquid trapped in a diaphragm valve. This can be avoided by locating the valve in a vertical line.

(d) As discussed in Section 5.3 (page 109), a fitter was asked to dismantle a valve. The design was unfamiliar and as a result there was a sudden release of trapped pressure. A more safety-conscious fitter might have got a spare valve out of the store and dismantled it first to make sure he knew how the valve was constructed. However, this would have caused delay and would have meant admitting that he did not know the construction of the valve. The culture of the workplace caused him to press on. The company had made a big investment in behavioural safety training. People worked more safely and the accident rate, both lost-time and minor, had fallen dramatically. However, the behavioural training had not recognized and tackled the macho culture that led to the accident.

(e) Figure 10.2 illustrates another incident. Cables were suspended from the roof of a large duct by wire hangers with a hook at each end. Both ends of the hangers were hooked over a steel girder. The cables had to be moved temporarily so that some other work could be carried out. The three men who moved the cables decided to hook the hangers over the top of a bracket that was fixed to the adjoining wall. There would have been no problem if they had hooked both ends of each hanger onto the bracket. However, they hooked only one end of each hanger onto the bracket and hooked the other end onto the hanger itself, as shown in the lower part of Figure 10.2. They did not realize that this doubled the weight on the end fixed to the wall. The weight of the cables straightened the end of one of the hangers. The adjacent hangers then failed. Altogether 60 m (200 feet) of cable fell 5 m (15 feet). One man was injured, fortunately not seriously¹³.

How far do we need to go in giving detailed instructions about every aspect of every job, especially to trained craftsmen? In this case, the hazard is one that many people might overlook. Each supervisor must know his team and their abilities. There is a move today towards self-managed groups. Does this make incidents such as the one just described more likely? A hardware solution is possible: the hangers could be made strong enough to support the weight even if they are used as shown in the lower part of Figure 10.2.

Figure 10.2 Two ways of supporting a bundle of cables with a wire hanger

(f) A cylinder containing gas at 2200 psi (150 bar) was connected to a filter and then to a reducing valve which lowered the pressure to 300 psi (20 bar). The drawing incorrectly showed the filter rated for 300 psi. A craftsman therefore installed a filter suitable for this pressure. It ruptured as the isolation valve on the cylinder was opened.

Many craftsmen would have spotted the unsuitability of the filter as soon as they started work¹⁴. A number of similar incidents were reported¹⁵.

(g) A ventilation system failed because a craftsman put too much grease on the air-flow sensors. The site originally employed its own craftsmen but, as the result of budget cuts, it changed to an outside organization. The site could no longer rely on 'the skill of the craft' now that repairs were carried out by a succession of strangers.

10.2 Incidents which occurred because of poor maintenance practice
(a) It is widely recognized that the correct way to break a bolted joint is to slacken the bolts and then wedge the joint open on the side furthest from the person doing the job. If there is any pressure in the equipment, the leakage is then controlled and can either be allowed to blow off or the joint can be remade. Nevertheless accidents occur because experienced men undo all the bolts and pull a joint apart. For example, two men were badly scalded when they were removing the cover from a large valve on a hot water line. They removed all the nuts, attached the cover to a hoist and lifted it, although the gauge pressure was only 9 inches of water (0.33 psi or 0.023 bar).

(b) On many occasions detailed inspections of flameproof electrical equipment have shown that many fittings were faulty — for example, gaps were too large, glasses were broken, screws were missing or loose. This example illustrates the many-pronged approach necessary to prevent many human error accidents:

Hardware
Flameproof equipment requires careful and frequent maintenance. It is sometimes used when Zone 2 equipment — cheaper and easier to maintain — would be adequate. Flameproof equipment requires special screws and screwdrivers but spares are not always available.

Training
Many electricians do not understand why flameproof equipment is used and what can happen if it is badly maintained.

Inspection
Experience shows that it is necessary to carry out regular inspections or audits of flameproof equipment if standards are to be maintained. Often equipment at ground level is satisfactory but a ladder discloses a different state of affairs!
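The water-gauge figure quoted in incident (a) can be checked with the hydrostatic formula p = ρgh, and the same arithmetic shows why 'only 9 inches of water' can still injure: the force on a large cover is substantial. The 0.5 m cover diameter below is an assumed, illustrative value, not from the original account.

```python
import math

RHO_WATER = 1000.0  # kg/m^3
G = 9.81            # m/s^2

h = 9 * 0.0254          # 9 inches of water, in metres
p = RHO_WATER * G * h   # gauge pressure, Pa
print(f"{p:.0f} Pa = {p / 6894.76:.2f} psi = {p / 1e5:.3f} bar")
# → 2243 Pa = 0.33 psi = 0.022 bar (the text rounds to 0.023 bar)

# Force on an assumed 0.5 m diameter valve cover at this pressure:
area = math.pi * (0.5 / 2.0) ** 2
force = p * area
print(f"Force on cover: {force:.0f} N (about {force / G:.0f} kg weight)")
```

A cover of that size would be held down with a force equivalent to roughly 45 kg — more than enough to throw off a loosened cover and scald the men removing it.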

(c) When a plant came back on line after a turnaround it was found that on too many joints the stud bolts were protruding far on one side and not enough on the other, so that some nuts were secured by only a few threads. Four-bolt joints were secured with clamps until the next shutdown. On eight-bolt joints the bolts were changed one at a time. We depend on training to make sure people understand the procedures and on inspections to make sure people follow the procedures.

(d) A young engineer was inspecting the inside of a 1.5 m (60 inch) diameter gas main, wearing breathing apparatus supplied from the compressed air mains. He had moved 60 m (200 feet) from the end when his face mask started to fill with water. He pulled it off, held his breath, and walked quickly to the end. He discovered that the air line had been connected to the bottom of the compressed air main instead of the top. As a young engineer, with little experience, he had assumed that the 'safety people' and the factory procedures would do all that was required and that he could rely on them. It is, of course, the responsibility of those who issue the entry permit and those who look after the breathing apparatus to make sure everything is correct. Nevertheless, before entering a vessel, men should be encouraged to carry out their own checks.

A hardware solution is possible in this case. If breathing apparatus is supplied from cylinders or from portable compressors instead of from the factory compressed air supply, contamination with water can be prevented. Of course, no system is perfect. Compressors may fail or be switched off. Those responsible may fail to supply spare cylinders, or may not change them over in time, or may even supply cylinders of the wrong gas. We are dependent on procedures to some extent, whatever the hardware. But some designs provide more opportunities for error than others and are therefore best avoided. On the whole piped air is more subject to error than air from cylinders or a compressor.

(e) The temperature controller on a reactor failed and a batch overheated. It was then found that there was a loose terminal in the controller. The terminal was secured by a bolt which screwed into a metal block. The bolt had been replaced by a longer one which bottomed before the terminal was tight.

(f) Companies have procedures for examining all lifting gear at regular intervals. Nevertheless, corroded or damaged wire ropes and slings often fail in service¹⁶. Being small, they are easily mislaid, reported missing and then found and used again. A good practice is to fix a coloured tag to each item after examination and to change the colour every six months. Everyone can then see if an item is overdue for examination.

(g) On many occasions someone has broken the wrong joint, changed the wrong valve or cut open the wrong pipeline. Sometimes the process team members have identified the equipment wrongly; sometimes the job has been described but the description has been misunderstood. Instructions such as 'The pump you repaired last week is giving trouble again' lead to accidents. Sometimes the job has been shown to the fitter who has gone for his tools and then returned to the wrong joint or valve; sometimes a chalk mark has been washed off by rain or the fitter has been misled by a chalk mark left from an earlier job. Equipment which is to be maintained should be identified by a numbered tag (unless it is permanently labelled) and the tag number shown on the permit-to-work¹⁷. For another incident see Section 11.1, first item, page 187.

10.3 Incidents due to gross ignorance or incompetence
As in Section 3.4 (page 62), some incidents have been due to almost unbelievable ignorance of the hazards. A shaft had to be fitted into a bearing in a confined space. It was a tight fit so the men on the job decided to cool the shaft and heat the bearing. The shaft was cooled by pouring liquefied petroleum gas onto it while the bearing was heated with an acetylene torch⁵!

A fitter was required to remake a flanged joint. The original gasket had been removed and the fitter had to obtain a new one. He selected the wrong type and found it was too big to fit between the bolts. He therefore ground depressions in the outer metal ring of the spiral wound gasket so that it would fit between the bolts (Figure 10.3)! As if this were not bad enough, he ground only three depressions, so the gasket did not fit centrally between the bolts but was displaced 12 mm to one side. The fitter's workmanship is inexcusable but the possibility of error in selecting the gasket was high, as four different types of joint were used on the plant concerned.

To stop a steam leak a section of the steam line was bypassed by a hot tap and stopple. After the job was complete it was found that the steam leak continued. It was then found that the leak was actually from the condensate return line to the boiler, which ran alongside the steam line. The condensate was hot and flashed into steam as it leaked. Having decided that the leak was coming from the steam main, everyone closed their minds to the possibility that it might be coming from somewhere else. Several people then said that they had thought that the steam seemed rather wet¹¹! This incident could be classified as a 'mind-set' (see Section 4.4, page 88).

To prevent these incidents we should:
• train people in the reasons for the permit system and the sort of accidents that occur if it is not followed;
• check from time to time that the procedures are being followed (see Chapter 5).

10.4 Incidents which occurred because people took short cuts
Many accidents have occurred because permit-to-work procedures were not followed. Often it is operating teams who are at fault (see Chapter 11) but sometimes the maintenance team are responsible. Common faults are:
• carrying out a simple job without a permit-to-work;
• carrying out work beyond that authorized on the permit-to-work;
• not wearing the correct protective clothing.
There is, unfortunately, no shortage of examples and some incidents are described and illustrated in the Institution of Chemical Engineers' safety training package on Safer Maintenance⁶.

Figure 10.3 A spiral wound gasket was ground away to make it fit between the bolts of a joint

The following are examples of accidents caused by short-cutting:

(a) A fitter and an apprentice were affected by gas while replacing a relief valve in a refinery. They had not obtained a permit-to-work. If they had, it would have stipulated that breathing apparatus should be worn⁷. The fitter's 'excuse' was that he was unaware that the equipment was in use. The fitter and apprentice both smelled gas after removing blanks but took no notice. Many accidents could be avoided if people responded to unusual observations.

(b) Several accidents have occurred because fitters asked for and received a permit to examine the bearings (or some other external part) on a pump or other machine and later decided that they needed to open up the machine. They then did so without getting another permit and a leak occurred. If a permit is issued for work on bearings, the process lines may not have been fully isolated and the machine may not have been drained.

(c) A permit was issued for a valve to be changed on a line containing corrosive chemicals. The line had been emptied and isolated by locked valves but some of the corrosive liquid remained in the line. The permit stated that gloves and goggles must be worn. The fitter did not wear them and was splashed with the chemical.

At first sight this accident seems to be due to the fault of the injured man and there is little that management can do to prevent it, except to see that rules are enforced. However, examination of the permit book showed that every permit was marked 'Gloves and goggles to be worn', though many of them were for jobs carrying no risk. The maintenance workers therefore ignored the instruction and continued to ignore it even on the odd occasion when it was really necessary. If the fitter had been better at putting his case he would have said, 'Why didn't you tell me that for once goggles were necessary? Why didn't you write on the permit, "Gloves and goggles to be worn and this time I mean it"?'

Why did the operating team ask for more protective clothing than was necessary? I suspect that at some time in the past a supervisor had asked for too little, someone had been injured and the supervisor had been blamed. All supervisors then started to ask for gloves and goggles every time. Do not ask for more protective clothing than is necessary. Ask only for what is necessary and then insist that it is worn.

If we allow those who are responsible to us to use their discretion in borderline cases, then inevitably they will sometimes come to different decisions than those we might have come to ourselves. But coming to a different decision does not justify blame. We should check their decisions from time to time and discuss with them the reasons for any we consider doubtful.

(d) Many accidents have occurred because the operating team failed to isolate correctly equipment which was to be repaired. It is sometimes suggested that the maintenance workers should check the isolations and in some companies this is required. In most companies, however, the responsibility lies clearly with the operating team. Any check that maintenance workers carry out is a bonus. Nevertheless they should be encouraged to check. It is their lives that are at risk.

On one occasion a large hot oil pump was opened up and found to be full of oil. The suction valve on the pump had been left open and the drain valve closed. The ensuing fire killed three men and destroyed the plant. The suction valve was chain-operated and afterwards the fitter recalled that earlier in the day, while working on the pump bearings, the chain had got in his way. He picked it up and, without thinking, hooked it over the projecting spindle of the open suction valve! This incident also involved a change of intention. Originally only the bearings were to be worked on. Later the maintenance team decided that they would have to open up the pump. They told the process supervisor but he said that a new permit was not necessary.

(e) A fitter was asked to fit a slip-plate between a pair of flanges. Finding it difficult to wedge them apart he decided, without consulting his supervisor or the process team, to fit the slip-plate between another pair of flanges. As they were on the other side of the isolation valve which isolated the joint he was supposed to break, a corrosive chemical sprayed onto his face. He was wearing goggles but knocked them off as he flung up his arms to protect his face.

This incident raises several questions. Did the fitter understand the nature of the liquid in the pipeline and the need to follow instructions precisely? Had he fitted slip-plates in the wrong position before? Had anyone had difficulty slip-plating this joint before? If so, had modifications, or slip-plating elsewhere, been considered? When the need for slip-plating can be foreseen during design, do designers ensure that access and flexibility are adequate?

10.5 Incidents which could be prevented by more frequent or better maintenance
All the incidents just described could be prevented by better management — that is, by better training, supervision, etc. Other incidents have occurred because of a misjudgement of the level or quality of maintenance required. Such incidents are rare in the oil and chemical industries, but occasionally one hears of a case in which routine maintenance or testing has been repeatedly postponed because of pressure of work.

For example, after a storage tank had been sucked in, it was found that the flame arresters in the three vents were choked with dirt. Although scheduled for three-monthly cleaning they had not been cleaned for two years. The flame traps had to be unbolted and cleaned by maintenance workers. If they could have been secured without bolts, so that they could have been cleaned by process operators, it is less likely that cleaning would be neglected. In many cases, a change in design might have prevented the accident.

It may not matter if the routine cleaning of flame arresters is postponed for a week or a month, or even if a whole cycle of cleaning is omitted. Usually, if we neglect safety measures, we take a chance and there may be no accident, but if we continue to put off cleaning an accident in the end is inevitable. In the case just described an accident was almost certain.

This simple incident illustrates the theme of this book: it is easy to talk of irresponsibility and lack of commitment and to urge engineers to conform to schedules. It is easy to blame the other man. It is harder, but more effective, to ask why schedules are not followed and to change designs or methods so that they are easier to follow.

A book on railway boiler explosions⁸ shows that in the period 1890–1920 the main reasons for these were poor quality maintenance or a deliberate decision to keep defective boilers on the road. Few failures were due to drivers tampering with relief valves or letting the water level in the boiler get too low, though locomotive superintendents encouraged the view that many explosions were due to drivers tampering with relief valves. Many of the famous railway engineers, such as William Stroudley and Edward Fletcher, come out badly. Great designers they may have been but — as mentioned in Section 5.3 (page 107) — their performance as maintenance engineers was abysmal: too much was left to the foremen.

In 1966 a colliery tip collapsed at Aberfan in South Wales, killing 144 people, most of them children. The official report⁹,¹⁰ showed that similar tips had collapsed before, though without serious consequences, and that the action needed to prevent collapse had been well known for many years. Responsibility for tip inspection had been given to mechanical engineers instead of civil engineers. Following a collapse in 1965 all engineers were asked to make a detailed examination of tips under their control. However, the Aberfan tip was inspected in the most perfunctory manner. The Inquiry criticized the engineer responsible 'for failing to exercise anything like proper care in the manner in which he purported to discharge the duty of inspection laid upon him'.

10.6 Can we avoid the need for so much maintenance?
Since maintenance results in so many accidents — not just accidents due to human error but others as well — can we change the work situation by avoiding the need for so much maintenance? Technically it is certainly feasible. In the nuclear industry, where maintenance is difficult or impossible, equipment is designed to operate without attention for long periods or even throughout its life. In the oil and chemical industries it is usually considered that the high reliability necessary is too expensive. Often, however, the sums are never done. When new plants are being designed, often the aim is to minimize capital cost and it may be no-one's job to look at the total cash flow. Capital and revenue may be treated as if they were different commodities which cannot be combined. While there is no case for nuclear standards of reliability in the process industries, there may sometimes be a case for a modest increase. Some railway rolling stock is now being ordered on 'design, build and maintain' contracts. This forces the contractor to consider the balance between initial and maintenance costs. For other accounts of accidents involving maintenance, see Reference 12.

Afterthought
'Once correctly tightened, a simple line is painted on both nut and bolt. It is then easy to see if the bolt has loosened — the lines on nut and bolt fall out of alignment with loosening.'

'I saw plenty of high-tech equipment on my visit to Japan, but I do not believe that of itself this is the key to Japanese railway operation — similar high-tech equipment can be seen in the UK. Pride in the job, attention to detail, equipment redundancy, constant monitoring — these are the things that make the difference in Japan, and they are not rocket science.'

'There have been many study tours by British railway officials to Japan with little follow-up. We should start applying the lessons or we should stop going on the study tours.'
James Abbott¹⁸

References in Chapter 10
1. Petroleum Review, October 1981, page 21.
2. Petroleum Review, April 1989, page 29.
3. Kletz, T.A., 1998, What Went Wrong? — Case Histories of Process Plant Disasters, 4th edition, Chapter 1 (Gulf Publishing Co., Houston, Texas, USA).
4. Ramsey, C.W., 1984, AIChE Loss Prevention Symposium.
5. Cloe, W.W., 1982, Selected Occupational Fatalities Related to Fire and/or Explosion in Confined Workspaces as Found in Reports of OSHA Fatality/Catastrophe Investigations, Report No. OSHA/RP-82/002 (US Department of Labor, Washington, DC, USA).
6. Safety training packages 028 Safer Work Permits and 033 Safer Maintenance (Institution of Chemical Engineers, Rugby, UK).
7. Kletz, T.A., in Fawcett, H.H. and Wood, W.S. (editors), 1982, Safety and Accident Prevention in Chemical Operations, Chapter 36 (Wiley, New York, USA).
8. Hewison, C.H., 1983, Locomotive Boiler Explosions (David and Charles, Newton Abbot, UK).
9. Report of the Tribunal Appointed to Inquire into the Disaster at Aberfan on October 21st, 1966, 1967, paragraph 171 (HMSO, London, UK).
10. Kletz, T.A., 2001, Learning from Accidents, 3rd edition, Chapter 13 (Butterworth-Heinemann, Oxford, UK).
11. The Chemical Engineer, April 1984, No. 406, page 37.
12. Health and Safety Executive, 1987, Dangerous Maintenance: A Study of Maintenance Accidents in the Chemical Industry and How to Prevent Them (HMSO, London, UK).
13. Operating Experience Weekly Summary, 1998, No. 98-22, page 2 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA).
14. Operating Experience Weekly Summary, 1998, No. 98-23, page 11 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA).
15. Operating Experience Weekly Summary, 1999, No. 99-08, page 3 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA).
16. Operating Experience Weekly Summary, 1998, No. 98-06, page 1 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA).
17. Kletz, T.A., 1995, Plant modifications: maintain your mechanical integrity, in Grossel, S.S. and Crowl, D.A. (editors), Handbook of Highly Toxic Materials Handling and Management, Chapter 11 (Dekker, New York, USA).
18. Abbott, J., January 1995, Modern Railways, page 37.

Some accidents that could be prevented by better methods of operation

'When safe behaviour causes trouble, a change occurs in the unsafe direction.'
K. Hakkinen8

This chapter describes some incidents which occurred as the result of errors by operating staff. It is not always clear which of the types of error discussed in Chapters 2–5 were involved. Often more than one was at work.

11.1 Permits-to-work
Accidents which occurred as the result of a failure to operate a permit-to-work system correctly are discussed briefly in Section 5.1 (page 103), while Section 10.2.4 (page 181) describes incidents which occurred because maintenance workers cut corners. Here is another incident:

What happened: A permit was issued to connect a nitrogen hose to flange A (see Figure 11.1 on page 188), so that equipment could be leak tested, and then disconnect it when the test was complete.
What should have happened: Two permits should have been issued — one to connect the hose and a second one, when the time came, to disconnect it.

What happened: Flange A was not tagged.
What should have happened: It should have been tagged. As the job is done several times per year there should have been a permanent tag.

What happened: When the leak test was complete the process supervisor (who was a deputy) asked the lead fitter to disconnect the hose. He did not show him the job or check that it was tagged.
What should have happened: He should have shown him the job and checked that it was tagged.

What happened: The fitter who was asked to do the job — not the one who had fitted the hose — misunderstood his instructions and broke joint B. He was new to the Works.
What should have happened: He should have been better trained.

What happened: The lead fitter signed off the permit without inspecting the job.
What should have happened: He should have inspected it.

What happened: The process supervisor accepted the permit back and started up the plant without inspecting the job.
What should have happened: He should have inspected it.

Toxic gas came out of the open joint B. Fortunately, no-one was injured.

Figure 11.1 Joint B was broken instead of joint A

There were at least eight 'human errors':
(1) and (2) Failure of two process supervisors to tag the job.
(3) Failure of the process supervisor to show the job to the second fitter.
(4) Failure of the lead fitter to inspect the completed job.
(5) Failure of the process supervisor to inspect the completed job.
(6) Failure of the manager to see that the deputy process supervisor was better trained.

(7) Failure of the maintenance engineer and foreman to see that the new fitter was better trained.
(8) Failure of the manager and maintenance engineer to monitor the operation of the permit system.

In addition the process supervisor issued one permit instead of two, though issuing two permits would probably not have prevented the accident.

If any one of these errors had not been made, the accident would probably not have occurred. In a sense, (8) includes all the others. If the manager and engineer had kept their eyes open before the incident and checked permits-to-work regularly, they could have prevented their subordinates from taking shortcuts and recognized the need for further training.

Prevention of similar accidents depends on better training and monitoring. However, a hardware approach is possible. If the nitrogen hose is connected by means of a clip-on coupling, the maintenance organization need not be involved at all and the nitrogen supply cannot be disconnected at the wrong point. Alternatively, since the nitrogen is used several times per year, it could be permanently connected by a double block and bleed valve system. The method actually used was expensive as well as providing opportunities for error. Adoption of the hardware solution does not lessen the need for training and monitoring. Even if we prevent, by a change in design, this particular accident occurring again, failure to follow the permit system correctly will result in other accidents.

11.2 Tanker incidents
11.2.1 Overfilling
Road and rail tankers are often overfilled. If the tankers are filled slowly, by hand, the filler is tempted to leave the job for a few minutes. He is away longer than he expects to be and returns to find the tanker overflowing. If the tankers are filled quickly, then the operator has only to be distracted for a moment for the tanker to be overfilled. In addition, accurate metering is difficult. For these reasons most companies use automatic meters. The quantity to be put into the tanker is set on a meter which closes the filling valve automatically when this quantity has been passed. Unfortunately these meters do not prevent overfilling as:

• the wrong quantity may be set on the meter, either as the result of a slip or because of a misunderstanding of the quantity required;
• there may be a residue in the tanker left over from the previous load which the operator fails to notice;
• there may be a fault in the meter.

For these reasons many companies have installed high level trips in their tankers. They close the filling valve automatically when the tanker is nearly full. Although a separate level measuring device is required in every compartment of every tanker, they are relatively cheap.

11.2.2 Overfilling pressure tankers
When liquefied gases are transported, pressure tankers are used. Overfilling does not result in a spillage as the vapour outlet on the top of the tanker is normally connected to a stack or a vapour return line, but it can result in the tanker being completely full of liquid. If it warms up, the pressure inside will rise. This does not matter if the tanker is fitted with a relief valve, but in the UK tankers containing toxic gases are not normally fitted with relief valves. One such tanker was overfilled in Spain in 1978 and burst, killing over 200 people12. (The tanker is believed to have been weakened by use with ammonia.)

Failure to keep the vapour return line open can also result in the overpressuring of tankers which do not have relief valves. After the insulation fell off a rail tanker while it was being filled, it was found that a valve in the vapour return line was closed and the gauge pressure was 23 bar instead of the usual 10 bar.

Relief valves are fitted to all pressure tankers in the US but they are not fitted in continental Europe. This is justified on the grounds that:
• frequent small leaks are more hazardous than an occasional burst, especially on rail tankers which have to pass through long tunnels;
• the valves may be knocked off in an accident;
• they are difficult to maintain on vehicles that are away from base for a long time;
• the relief valve is of no value if the vehicle falls over (as then the valve will discharge liquid);
• if a tanker is overpressured the manhole bolts will stretch and relieve the pressure.

On the Continent many tankers carrying liquefied flammable gases are also not fitted with relief valves7.

In a typical UK compromise, relief valves are fitted to tankers carrying liquefied flammable gases but not those carrying liquefied toxic gases. Overfilling is prevented by following procedures and it is said that since 1945 no tanker has burst. Maintenance of the relief valves, the reason most often quoted for not using them, should be a small problem compared with maintenance of the vehicle as a whole. Small leaks from relief valves can be prevented by fitting bursting discs below the relief valves. The case for and against relief valves should be re-examined7. A cheap hardware solution to a problem, if there is one, is better than relying on procedures that, for one reason or another, may not always be followed.

11.2.3 A serious incident3
Automatic equipment was installed for loading tankers at a large oil storage depot. The tanker drivers set the quantity required on a meter, inserted a card which was their authorization to withdraw product, and then pressed the start button. There was a manual valve in each filling line for use when the automatic equipment was out of order. To use the manual valves the automatic valves had first to be opened; this was done by operating a series of switches in the control room. They were inside a locked cupboard and a notice on the door reminded the operators that before operating the switches they should first check that the manual valves were closed.

The automatic equipment broke down and the supervisor decided to change over to manual filling. He asked the drivers to check that the manual valves were closed and then operated the switches to open the automatic valves. Some of the manual valves were not closed; petrol and other oils came out of the filling arms and either overfilled tankers or splashed directly on the ground. The petrol caught fire, killing three men, injuring 11 and destroying the whole row of 8 filling points.

It is easy to say that the accident was due to the errors of the drivers who did not check that the manual valves were closed or to the error of the supervisor who relied on the drivers instead of checking himself, but the design contained too many opportunities for error. In particular, the filling points were not visible to the person operating the over-ride switches. In addition, the official report made several other criticisms:
• The employees had not been adequately trained. When training sessions were arranged no-one turned up as they could not be spared.
• The drivers had very little understanding of the properties of the materials handled.

• Instructions were in no sort of order. Bundles of unsorted documents were handed to the inspector for study.
• There were no regular inspections of the safety equipment.

To quote from the official report, '... had the same imagination and the same zeal been displayed in matters of safety as was applied to sophistication of equipment and efficient utilisation of plant and men, the accident need not have occurred.'

11.2.4 Loads delivered to the wrong place
A relief tanker driver arrived at a water treatment works with a load of aluminium sulphate solution. He was told where to deliver the load and given the key to the inlet cover on the tank. He discharged the load into the wrong tank, which the key also fitted. The solution entered the water supply and many customers complained of an unpleasant taste and, later, of diarrhoea, sore throats and green hair!

Earlier in the day, there had been blockages in the lime dosing pump. The operators therefore assumed that the complaints about taste were due to abnormal lime dosing and flushed out the lime dosing system. The real cause of the problem was not found until two days later9. When we have a problem and know there is something wrong, we assume it is the cause of the problem and stop thinking about other possible causes (see Section 4.4, page 88).

11.3 Some incidents that could be prevented by better instructions
This section is concerned with day-to-day instructions rather than the permanent instructions discussed in Section 3.5 (page 65). Sometimes care is taken over the permanent instructions but 'anything will do' for the daily instructions.

A day foreman left instructions for the night shift to clean a reactor. He wrote, 'Agitate with 150 litres nitric acid solution for 4 hours at 80°C.' He did not tell them to fill the reactor with water first. He thought this was obvious as the reactor had been cleaned this way in the past. The night shift did not fill the reactor with water. They added the nitric acid to the empty reactor via the normal filling pump and line which contained some isopropanol. The nitric acid displaced the isopropanol into the reactor and reacted violently with it, producing nitrous fumes. The

reactor, designed for a gauge pressure of 3 bar, burst. If it had not burst, the gauge pressure would have reached 30 bar. No relief system can be designed to cope with unforeseen reactions involving chemicals not normally used.

This accident could be said to be due to the failure of the night shift to understand their instructions or use their knowledge of chemistry (if any). It can be prevented only by training people to write clear, unambiguous instructions, especially for handling hazardous materials.

A vented overhead tank was overfilled and some men were asked to clean up the spillage. They were working immediately below the tank, which was filled to the top of the vent. A slight change in pressure in one of the lines connected to the tank caused it to overflow again — onto one of the men. The plant where this occurred paid great attention to safety precautions when issuing permits for maintenance work and would have resented any suggestion that their standards were sloppy, but no-one realized that cleaning up spillages should receive as much consideration as conventional maintenance.

11.4 Some incidents involving hoses
Hoses, like bellows (see Section 8.1, page 155), are items of equipment that are easily damaged or misused. Hoses often burst or leak for a variety of reasons, usually because the wrong sort was used or it was in poor condition. People are often blamed for using the wrong hose or using a damaged hose, and while better training, instructions, inspections, etc. may reduce the number of incidents, we should try to change the work situation by using hoses as little as possible. In these cases a hardware solution is possible, as the following incidents show.

A hose was secured by a screw coupling but only two threads were engaged. When pressure was applied the hose came undone with such force that the end of it hit a man and killed him. Carelessness on the part of the man who connected the hose? A moment's aberration? Lack of training so that he did not realize the importance of engaging all the threads? It is safer to use flanged hoses if the pressure is high.

To extinguish a small fire a man picked up a hose already attached to the plant. Unfortunately it was connected to a methanol line. On another occasion a man used a hose attached to a caustic soda line to wash mud off his boots. If hoses must be left attached to process lines, then the unconnected ends should be fitted with self-sealing couplings.

The chance of failure can be

reduced by using the hose at as low a pressure as possible. For example, when unloading tankers it is better to use a fixed pump instead of the tanker's pump, as then the hose is exposed to the suction rather than the delivery pressure of the pump.

Accidents frequently occur when disconnecting hoses because there is no way of relieving the pressure inside and men are injured by the contents of the hose or by mechanical movement of it. For example, three men were sprayed with sediment when disconnecting a clogged hose at a quick-disconnect coupling. There was no way of releasing the pressure trapped between the positive pump and the blockage. In another similar incident five men standing up to ten feet away were sprayed with contaminated sludge12. A man was sprayed with nitric acid mist when the unsecured end of a hose lifted out of a floor drain. The hose moved when the flow of liquid was followed by a rush of compressed air13. If the end of a hose is not connected to equipment, it should be secured.

The hardware solution is obvious. Fit a vent valve to the point on the plant to which the hose is connected. It should not project or it will be knocked off and it should not be possible to leave it open. After an incident there is usually a campaign to fit such vents, but after two years hoses are connected to other points on the plant and the accident recurs. It is easy to blame the operators for not pointing out the absence of vent points. There is a need for a design of vent valve that can be incorporated in a hose as standard. Hose connections should be inspected regularly.

Why are hoses damaged so often? Perhaps they would suffer less if we provided hangers for them instead of leaving them lying on the ground to be run over by vehicles.

11.5 Communication failures
Communication failures can of course affect design, construction and maintenance, but seem particularly common in operations, so they are discussed here. Instructions are considered in Sections 3.5 and 11.3 (pages 65 and 192).

11.5.1 Failures of written communication
Figure 11.2 shows three ambiguous notices. Notice (a) appeared on a road tanker. The manager asked the filler why he had not earthed the tanker before filling it with flammable liquid. The filler pointed out the notice. It actually referred to the electrical system. Instead of

using the chassis as the earth, there was a wired return to the battery. It had nothing to do with the method used for filling the tanker.

Notice (b) is error-prone. If the full stop is not noticed, the meaning is reversed. What does (c) mean?

Figure 11.2 Three ambiguous notices (notice (b) reads: NO ENTRY. CLEARANCE CERTIFICATE REQUIRED)

Designers often recommend that equipment is 'checked' or 'inspected' but such words mean little. The designer should say how often the equipment should be checked or inspected, what should be looked for and what standard is acceptable.

An operator was found inside a confined space — the inside of an incinerator — although no entry permit had been issued. An industrial hygienist had written on the work permit that entry was acceptable. The operator took this to mean that entry had been approved but all the hygienist meant was that the atmosphere inside was safe to breathe. He did not mean that the isolation and other necessary precautions were in place. It was not his responsibility to authorize entry.

Pictorial symbols are often used in the hope that they will be understood by people who do not understand English and may not know the meaning of, for example, 'fragile'. However, pictorial symbols can be ambiguous. A storeman saw some boxes marked with broken wineglasses, meaning that the contents were fragile. He took the picture to mean that the contents were already broken and that therefore it did not matter how they were treated. The same man, who lived in a hot country, found some boxes marked with an umbrella, to indicate that they should not be allowed to get wet. He took the picture to mean that the boxes should be kept out of the sun.

The following was printed on the base of a packet of food from a supermarket14:

Do not turn upside down

The following appeared in a medical journal10:

MANAGEMENT OF SERIOUS PARACETAMOL POISONING (12 December 1988)
Under the section on encephalopathy we said the patient should be nursed at 30–40 degrees. This referred to the angle in bed — not the temperature.

Two wartime newspaper headlines have become classic examples of ambiguity:

BRITISH PUSH BOTTLES UP GERMANS
MACARTHUR FLIES BACK TO FRONT


11.5.2 Failures of verbal communication
The incidents described below were really the result of sloppy methods of working — that is, of poor management. People should not be expected to rely on word of mouth when mishearing or misunderstanding can have serious results. There should be better methods of communication. For example, a famous accident occurred on the railways in 1873. Two trains were ready to depart from a station. The signalman called out, 'Right away, Dick,' to the guard of one train. Unknown to him, the guard of the other train was also called Dick4.

A fitter was asked, by his supervisor, to dismantle heat exchanger 347C. The fitter thought the supervisor said 347B and started to dismantle it. The fitter should, of course, have been shown the permit-to-work, but it is easy to hear a letter incorrectly, particularly on the telephone. It is a good idea to use the international phonetic alphabet shown below. The heat exchanger would have been 347 Bravo, unlikely to be confused with 347 Charlie.

A member of a project team was asked to order the initial stocks of raw material for a new plant. One of them was TEA. He had previously worked on a plant on which TEA meant tri-ethylamine, so he ordered some drums of this chemical. Actually tri-ethanolamine was wanted. The plant manager ordered some for further use and the mistake was discovered by an alert storeman who noticed the two similar names on the drums and asked if both chemicals were needed.

After a number of people had been scalded by hot condensate, used for clearing choked lines, it was realized that some operators did not know that 'hot condensate' was boiling water.

A flat lorry was backed up against a loading platform and loaded with pallets by a fork-lift truck which ran onto the back of the lorry. When the fork-lift truck driver had finished, he sounded his horn as a signal to the lorry driver to move off. One day the lorry driver heard another horn and drove off just as the fork-lift truck was being driven off the lorry. It fell to the ground.

A ALFA        N NOVEMBER
B BRAVO       O OSCAR
C CHARLIE     P PAPA
D DELTA       Q QUEBEC
E ECHO        R ROMEO
F FOXTROT     S SIERRA
G GOLF        T TANGO
H HOTEL       U UNIFORM
I INDIA       V VICTOR
J JULIET      W WHISKEY
K KILO        X X-RAY
L LIMA        Y YANKEE
M MIKE        Z ZULU
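To show how mechanically the alphabet above removes ambiguity, here is a short illustrative sketch. It is mine, not the book's; the function name spell_tag and the word lists are assumptions made for the example:

```python
# Hypothetical sketch: spell out an equipment tag using the phonetic
# alphabet, so that '347B' is read as 'three four seven Bravo' and
# cannot be confused with '347C' ('three four seven Charlie').

PHONETIC = {
    "A": "Alfa", "B": "Bravo", "C": "Charlie", "D": "Delta", "E": "Echo",
    "F": "Foxtrot", "G": "Golf", "H": "Hotel", "I": "India", "J": "Juliet",
    "K": "Kilo", "L": "Lima", "M": "Mike", "N": "November", "O": "Oscar",
    "P": "Papa", "Q": "Quebec", "R": "Romeo", "S": "Sierra", "T": "Tango",
    "U": "Uniform", "V": "Victor", "W": "Whiskey", "X": "X-ray",
    "Y": "Yankee", "Z": "Zulu",
}

DIGITS = {
    "0": "zero", "1": "one", "2": "two", "3": "three", "4": "four",
    "5": "five", "6": "six", "7": "seven", "8": "eight", "9": "nine",
}

def spell_tag(tag: str) -> str:
    """Return a spoken, letter-by-letter form of an equipment tag."""
    words = []
    for ch in tag.upper():
        if ch in PHONETIC:
            words.append(PHONETIC[ch])
        elif ch in DIGITS:
            words.append(DIGITS[ch])
        # other characters (hyphens, spaces) are simply skipped
    return " ".join(words)

print(spell_tag("347B"))  # three four seven Bravo
print(spell_tag("347C"))  # three four seven Charlie
```

Spoken this way, the two exchanger tags that were confused in the incident above can no longer be misheard for one another.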

A contractor was employed to remove the surface coating from a floor. He was told to order some muriatic acid (that is, hydrochloric acid). He misheard and ordered 'muriatic acetone'15. There being no such substance, the supplier supplied acetone. It was ignited by a floor buffing machine. Two men were badly burned and several were exposed to high concentrations of acetone vapour.

If there are two pumps on the same duty (one working, one spare) they are often labelled — for example, J25A and J25B. On one plant they were called instead J25 and JA25. An electrician asked to replace the fuses in JA25 replaced them in J25 instead. Say the names out loud: Jay 25 and Jay-ay 25 sound too much alike.

Communication failures easily occur when valves (or other equipment) can be operated by more than one group of people. On one plant, when team A opened or closed valves they informed team B, who in turn informed team C. On one occasion team A closed an isolation valve for maintenance. No-one from team B was available so team A informed team C directly. Team B then tried to move some contaminated water through the closed valve to another part of the plant. The water backed up another line, flooded a pump room and seeped into several other rooms1. Only one team should be allowed to operate each item of equipment.

Operators often communicate with the control room by radio. The radios are often noisy and other people talking in the control room make communication difficult. Nimmo has often asked control room operators who have been speaking on the radio, 'What did he say?' and got the answer that they did not know. They deduce the answer from the length of the sentences, key words and the tone of voice17. In one case a field operator calmly reported that a tank was leaking and oil was filling the bund. The control room operator could not understand the message but, from the tone of the field operator's voice and the length of the message, concluded that everything was under control. When the level in the tank was seen to fall a supervisor went to investigate and found the operator desperately trying to contain the leak. We should remember this when investigating incidents.

11.6 Examples from the railways
(a) Train doors
Many accidents have occurred because passengers opened train doors before the trains had stopped. For years, passengers were exhorted not to do so. A more effective method of prevention is automatic doors controlled by the crew, used on some suburban electric lines for many years, or doors which are

automatically locked when the train is moving. In 1983 British Rail decided that these should be fitted to all new coaches5.

(b) Off-loading tankers
While petrol was being off-loaded from a train of rail tankers, they started to move, the hoses ruptured and 90 tons of petrol were spilt. Fortunately it did not ignite.

The tankers moved because the brakes were not fully applied. The design of the braking system was poor. They could be applied in two ways:
• Automatically — the brakes were then held on by the pressure in compressed air cylinders. As the pressure leaked out of the cylinders, the brakes were released. This took from 15 minutes to over 4 hours, depending on the condition of the equipment.
• By hand — using a long lever which was held in position by a locking pin.

If the hand brake was applied while the brakes were already held on by compressed air, then when the air pressure fell the brakes were applied more firmly, were hard to release by hand and may have had to be dismantled. Instructions therefore stated that before the hand brakes were applied the compressed air in the cylinders must be blown off. Unfortunately, the operators found an easier way: they applied the hand brakes loosely so that when the air pressure leaked out the brakes would be applied normally. On the day of the accident they did not move the handbrake levers far enough.

Equipment should be designed so that correct operation is no more difficult than incorrect operation. The official report6 did not make this point but it did draw attention to several other unsatisfactory features:
• The siding should be level.
• The hoses should be fitted with breakaway couplings which will seal if the hoses break.
• Emergency isolation valves or non-return valves should be fitted between the hoses and the collecting header.
• The supervisors should have noticed that brakes were not being applied correctly.

(c) Single-line working
Two freight trains collided head-on on a single-line railway in the United States, killing three crew members and seriously injuring a fourth. Six locomotives were destroyed and the total damage was $4 million. There was no

signalling system, no automatic equipment to prevent two trains entering the same section of track and not even a single-line token system such as that used in the UK (in which the driver has to be in possession of a token for the section of line). A 'dispatcher' authorized the two trains to enter the line in opposite directions and the official report11 gave the cause of the accident as 'inadequate personnel selection criteria which resulted in the placement of an individual without sufficient training or supervision into the safety critical position of train dispatcher'. While this may have contributed to the accident, even an experienced man could have had a moment's lapse of attention and either additional safety equipment or a better method of working should have been recommended.

11.7 Simple causes in high tech industries
Very often, the causes of failures and accidents are elementary, even in high technology industries. Some are described in Section 9.1 (page 168). Here are a few more:
• A pump bearing failure was traced to poor lubrication. Drums of oil had been left standing out-of-doors with the lids off, allowing rain to enter.
• Some welds in purchased equipment showed unexpected failures. The materials experts from the purchasing company visited the supplier and discussed various possible but esoteric reasons for the failures. They then asked if they could see the workshop. They found that welding was carried out near a large open door, and that dust was blown about by the wind.
• The cooling coils in a storage vessel showed unexpected corrosion. I attended a meeting at which possible causes were discussed. A materials expert outlined several possible but complex chains of events. I then asked if the pipework had been checked before installation to make sure that it was made from the grade of steel specified. It had not been checked (see Section 9.4, page 170).

The nuclear industry has a good safety record. One of the few accidents at one nuclear site occurred when a ceiling tile fell 20 feet and injured a man in the canteen. A steam trap in the space above the ceiling had been leaking. It was replaced but no-one checked that the repair was successful. It continued to leak and water accumulated18.

The following two incidents also occurred at nuclear sites:
• As reported in Section 3.4 (page 62), penultimate paragraph, an aqueous cleaning solution was sprayed onto electrical equipment, protected only by plastic sheets, causing arcing and burning a 6-inch hole in the casing.

• Hazardous laboratory wastes were placed in a plastic bottle for disposal. No-one kept a record of what was in it. Different wastes reacted and the bottle exploded19.

On several occasions cylinders of hydrogen fluoride (HF) have ruptured after being kept for 15–25 years. The HF reacts with the steel to form iron fluoride and hydrogen20.

References in Chapter 11
1. Operating Experience Weekly Summary, No. 98-17, page 1 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA), 1998.
2. Operating Experience Weekly Summary, No. 99-06, page 2 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA), 1999.
3. Official Report on the Fire at the West London Terminal of Esso Petroleum, 1968 (HMSO, London, UK). (This report contains a number of errors.)
4. Stinton, Great Western Broad Gauge Album (Oxford Publishing Company, Oxford, UK), 1972.
5. Modern Railways, 1983, page 189.
6. Health and Safety Executive, Report on a Petroleum Spillage at Micheldever Oil Terminal, Hampshire, 2 February 1983 (HMSO, London, UK), 1983.
7. Kletz, T.A., Journal of Hazardous Materials, 1986.
8. Hakkinen, S., Scandinavian Journal of Work and Environmental Health, 5(3): 160.
9. The Times, 1988, quoted in Atom, No. 400, 1990.
10. Drug and Therapeutics Bulletin, 1990, 7: 393.
11. News Digest, 4(2) (National Transportation Safety Board, Washington, DC, USA), 1984.
12. Hymes, I., The Physiological and Pathological Effects of Thermal Radiation, Report No. SRD R 275 (UK Atomic Energy Authority, Warrington, UK), 1983.
13. Operating Experience Weekly Summary, No. 98-18, page 6 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA), 1998.
14. The Times, 1 December 1998.
15. Chemistry in Britain, 1988, 24(10): 980.
16. Operating Experience Weekly Summary, No. 98-41, page 10 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA), 1998.
17. Nimmo, I., Chemical Engineering Progress, 1995, 91(9): 36.
18. Operating Experience Weekly Summary, No. 98-47, page 8 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA), 1998.
19. Operating Experience Weekly Summary, No. 98-51, page 12 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA), 1998.
20. Operating Experience Weekly Summary, No. 99-25, page 14 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA), 1999.

Errors in computer-controlled plants

'To err is human. To really foul things up needs a computer.'
Anon

'Computers allow us to make more mistakes, faster than ever before.'
Anon

The uses of computer control continue to grow and if we can learn from the incidents that have occurred we may be able to prevent repetitions. The failures are really human failures: failures to realize how people will respond, failures to allow for foreseeable faults. The equipment used is variously described as microprocessors, computers and programmable electronic systems (PES). The last phrase is the most precise as microprocessors do not contain all the features of a general purpose digital computer. Nevertheless I have used 'computer' throughout this chapter because it is the word normally used by the non-expert.

The incidents described are classified as follows:
12.1 Hardware failures: the equipment did not perform as expected and the results of failure were not foreseen (see Figure 12.1).
12.2 Software errors: errors in the instructions given to the computer.
12.3 Specification errors, including failures to understand what the computer can and cannot do. This is probably the most common cause of incidents.
12.4 Misjudging the way operators will respond to the computer.
12.5 Errors in the data entered in the computer.
12.6 Failure to tell the operators of changes in data or programs.
12.7 Unauthorized interference with the hardware or software.

For more information on the subject of this chapter see References 1, 17, 18, 19 and 34. References 19 and 34 are thorough examinations of the subject.

Figure 12.1

12.1 Hardware failures
In many cases the results of failure could reasonably have been foreseen and precautions taken or designs changed, as shown by the examples below.

It is generally agreed that the ultimate safety protection of plant or equipment should be independent of the control system and should be hardwired or based on an independent computer system. Alarms can be part of the control system, but not trips and emergency shutdown systems. Interlocks which prevent a valve being opened unless another is shut are an intermediate category. They are often part of the control system, but where the consequences of failure are serious, independence is desirable.

For example, a hardware failure caused a number of electrically-operated isolation and control valves to open at the wrong time. Hot polymer was discharged onto the floor of a building and nitrogen, used for blanketing the vessels from which the spillage occurred, was also released. A watchdog which should have given warning of the hardware failure was affected by the fault and failed to respond2,3. The software had not been designed so as to minimize the effects of hardware failure4,5. On the plant concerned, trip systems were independent of the control computer but the designers did not realize that other safety systems, such as interlocks to prevent valves being open at the wrong time, should also be independent.

Note that the initial hardware failure was merely a triggering event. It would not have had serious results if the watchdog or the interlocks had been truly independent. The underlying cause was a failure to separate the control and the safety systems. A hazard and operability study which questioned what would happen if foreseeable hardware failures occurred would have disclosed the design faults.

The vendors of a microprocessor-based fire protection system have reported that power transients and nearby electric fields can produce false alarms20.

An incident from another industry: on 3 June 1980 the screens at US Strategic Air Command showed that missiles were heading towards the US. The immediate cause of the false alarm was a hardware failure. The system was tested by sending alarm messages in which the number of missiles was shown as zero. When the hardware failure occurred the system replaced zero by random numbers.

12.2 Software errors
Software errors are similar to the slips discussed in Chapters 2 and 7. They can be subdivided into errors in the systems software, bought with the computer, and errors in the applications software, written for the particular application. To reduce the first, if possible only well-tested systems should be used — not always easy in a field that is changing rapidly. Also, watchdogs should not be affected by failures elsewhere. To reduce the latter, thorough testing is essential; it can take longer than design. Gondran6 quotes figures for the probability that there will be a significant error in the applications software of a typical microprocessor-based control system: 10^-2 to 10^-3 for a normal system, falling with considerable extra effort, and reaching 10^-6 when the additional effort is as great as the initial development effort.

Others consider that the figure for normal systems is too optimistic for all but the simplest systems. The number of possible paths through a system can be enormous and it may be impossible to test them all. When we have found that an error is present it is often difficult to locate the fault. Wray7 writes, 'I was involved with a machine which failed to stop when a man put his hand through a photo-electric guard; fortunately he wasn't injured but I've now got a program of 279 pages of assembly code with the comments in German and which I suspect contains a fault.' (It was found to be easier to scrap the software and start again.)

While hardware errors are usually random (as equipment is rarely in use long enough for wear-out failures to occur), software failures are more like time bombs. They lie in wait until a particular combination of process conditions occurs.

Modifications to software can introduce faults. In the first incident described in Section 12.1 (page 204), the software had been extensively modified and was very different from that originally installed; the errors were of the type discussed in Section 12.7. No change should be made until it has been authorized by a responsible person who should first carry out a systematic study of the consequences by Hazop or similar techniques. Software changes (and hardware changes, see Section 12.7.1, page 214) should be treated as seriously as changes to plant or process and subject to similar control, as discussed in Sections 6.1 and 8.7 (pages 120 and 162). Access to software should be restricted.

Leveson19 recommends that instead of trying to make software ultra-reliable, we should try to make the plant safe even if the software is unreliable. This can be done in two ways: by installing independent trips and interlocks, the defence-in-depth approach, or, preferably, by developing when possible user-friendly and inherently safer designs — that is, plant and equipment designs in which the hazards are avoided rather than kept under control. Of course, we should design such plants whenever we can, regardless of the type of control.

We should be on our guard against the view, sometimes heard but more often implicit, 'Don't worry about the hazards; the control system will keep them under control.' This is not a view held by control engineers, who know only too well the limitations of their equipment and the ways in which it can be neglected, but other engineers sometimes hold this view.

Software errors are rather like spelling and grammatical errors in written instructions. However, people usually know what is meant despite spelling or grammatical errors.

We understand what we are expected to do if we are told to 'Save soap and waste paper' or to 'Wash the teapot and stand upside down in the sink'. However, even though our spelling and grammar are correct, our instructions, to people or computers, may still be wrong. This is discussed in the next section.

12.3 Specification errors
A number of incidents have occurred because a computer failed to perform in the way that the designer or operators expected it to perform. These incidents did not occur because of equipment faults or errors in the software, but because the program logic was not specified correctly. Sometimes the software engineer did not understand the designer's or operators' requirements or was not given sufficiently detailed instructions covering all eventualities. Sometimes the operators did not understand what the computer could and could not do. They treated it as a black box, something that will do what we want without the need to understand how it works or what are its limitations. The following incidents are typical of many.

The first occurred on a rather simple batch reactor control system (Figure 12.2, page 208). The computer was programmed so that, when a fault occurred on the plant, it would sound an alarm and then hold everything steady until the operator told it to carry on. One day the computer received a signal that there was a low oil level in a gearbox. It sounded the alarm and kept all its output signals steady. The computer had just added the catalyst to the reactor and it would normally have increased the cooling water flow to the condenser; it kept the flow steady. The operator was busy investigating the cause of the alarm (it turned out to be a false alarm) and did not notice the rising temperature. The reactor overheated and the contents were lost to atmosphere through the relief valve.

A Hazop had been carried out on the plant but had not included the computer. It had been treated as a black box, something that will do what we want even though we do not understand what goes on inside it, and the design team acted as if the computer would somehow cope with any problems. During the Hazop the team should have asked what actions the computer would take, for all possible deviations, at all stages of the batch. Actions which would affect safety or operability would then have come to light.

Blanket instructions are undesirable whether directed at people or computers. We should always check them by considering in turn all possible situations.

Figure 12.2 Computer-controlled batch reactor

Note that an operator, if given the same instructions as the computer, would have used his discretion and increased the cooling before investigating the reason for the alarm. This would probably have been done if the instructions had been addressed to the operators. Men can do what we want them to do but a computer can do only what we tell it to do. To use the expressions used in earlier chapters, computers can carry out only rule-based actions, not knowledge-based actions. (Expert systems convert what seem like knowledge-based decisions into rule-based ones.) The rules can be more complex than a man could handle but will be followed to the letter when even the dimmest operator will see the need for an exception.

When the manager asked the software engineer to keep all variables steady when an alarm sounds, did he mean that the cooling water flow should be kept steady or that the reactor temperature should be kept steady? He probably never thought about it.

Another incident occurred on a pressure filter controlled by a computer. To measure the pressure drop through the cake, the computer counted the number of times the air pressure in the filter had to be topped up in 15 minutes. If fewer than five top-ups were needed, filtration was complete and the computer could move on to the next phase, smoothing the cake. If more than five top-ups were needed, the liquid was circulated for a further two hours.

There was a leak of compressed air into the filter, which misled the computer into thinking that filtration was complete. It signalled this fact to the operator, who opened the filter door, and the entire batch — liquid and solid — was spilt. To be fair to the computer, it had detected that something was wrong — there was no increase in power consumption during smoothing — and had signalled this fact by stopping the operation, but the operator ignored this warning sign, or did not appreciate its significance8.

Many accidents have occurred because operators opened up autoclaves or other pressure vessels while they were up to pressure (see Section 2.1, page 13). Opening up a vessel while it is full of liquid at low pressure is not as dangerous but nevertheless dangerous enough. Again a Hazop would probably have disclosed the weakness in the system for detecting the pressure drop through the cake and changes could have been made. In particular it would be desirable to provide some means of preventing the operator opening up the filter while it is full of liquid. A Hazop would have provided an opportunity to discuss this point.

As these incidents show, we should not treat the computer as a black box but should understand what it will do, under all foreseeable conditions. It is not necessary to understand the electronics but we should understand the logic. In place of black boxes we need 'glass boxes' in which the logic is visible to all. Chapter 2 of Reference 1 discusses Chazops, variations of Hazop suitable for computerized systems.

One of the problems is that the software is incomprehensible to most of the customers for whom it is written (it is addressed to the computer, or rather to the programmer). One way of solving communication problems is to combine the jobs of those who find communication difficult. There is a need for technologists who are equally at home in the fields of process engineering and computer control. There are such people, greatly valued by their managers, but we need more and I know of no courses that set out to train them.

If a process engineer asks someone to write a conventional instruction for him, he will read it through to check that his intentions have been understood and carried out. In contrast, if a process engineer asks a software engineer to prepare a computer program, the result is no more readable than if it was written in Babylonian cuneiform. Computer languages are difficult or impossible to read unless one is thoroughly familiar with them and using them frequently; they are not scrutable and cannot readily be checked. If a process engineer asks for a certain method of control, he will look at the process and instrumentation diagram to check that his intentions have been understood and carried out. Every process engineer can read a process and instrumentation drawing; the 'language' is soon learnt. If we want to see that something will do what we want, we normally look at the final product, not the instructions for making it. I want to see food in the supermarket before I buy it, not recipes.

Even the specification — the instructions to the programmer — may not be easy to check. In the long term perhaps we shall see the development of plain English languages which anyone can read21. Until this time comes we shall have to make do with the specifications, but at least we should subject them to a thorough Hazop.

To sum up this section, here is a quotation from an HSE report18: 'The achievement of the required reliability by the hardware and software alone, however, is not enough to guarantee overall plant safety. If the specification of the safety-critical system is inadequate then the overall nuclear system may be unsafe even though the hardware and software implementation of the safety-system is completely reliable (with respect to its specification).'

12.4 Misjudging responses to a computer
Adding alarms to a computerized control system is cheap and easy, and the operators like to know when a measurement changes on a page that is not on display at the time so that they can intervene if necessary, so more and more alarms are added. When a serious upset occurs hundreds of alarms may sound at once; the operators have no idea what is happening and they switch off (themselves, not the computer). In addition, low priority alarms can divert operators' attention from more important ones. Fitting alarms illustrates the maxim that if we deal with each problem as it arises, the end result may be the opposite of that intended (see also Section 4.3, page 83). As discussed in Section 4.1 (page 80), the operator should feel that he is in charge of the plant and not a passive bystander.

To quote Reference 18 again:

'Human beings can only maintain the skills and commitments needed to carry out tasks by actually doing those tasks, and increased computer operation may give the operator less opportunity for this. So any increase in computer responsibilities means that more care is needed in supporting operators by interface and job design, and by training, to ensure that they can carry out their remaining tasks effectively.'

Another problem with alarms is that the information operators need for handling alarms is often distributed amongst several pages of the display. The pages of a display sometimes look alike, but can cause confusion in an emergency because an operator may turn to the wrong page and not realize he has done so. It should be possible to link together on a special page for each significant alarm the information needed for dealing with it. More simply, additional screens could be installed to continuously show trends in important parameters.

The following incident occurred on a plant where the possibility of a leak of liquid had been foreseen and a sump had been provided into which any leaks would drain. A level alarm would then sound. Unfortunately, when a leak occurred it fell onto a hot pipe; most of it evaporated, leaving a solid residue, and none entered the sump. The leak was not detected for several hours. The relevant measurements were not normally displayed and had to be called up. The operators could have detected that something was wrong by a careful comparison of trends in a number of measurements but they saw no need to make such a comparison because they were not aware of any problem. The system could have been programmed to carry out mass balances, compare readings for consistency and/or to sound an alarm (or, better, display a message advising the operators that something was amiss) when unexpected measurements were received, but no-one had foreseen the need to ask it to do so. Afterwards the operators said that the spillage would have been detected earlier if the chart recorders had not been removed from the control room when the computers were installed.

On one plant the computer collected spot values of each instrument reading at the beginning of every minute and then, every five minutes, it wrote them onto a hard disk. The hard disk survived a runaway reaction and an explosion but all other data were lost. The explosion occurred near the end of a five-minute period and nearly five minutes' data were therefore lost. The highest pressure recorded was 4 bar although the bursting pressure of the reactor that burst was about 60 bar22. Data are usually written onto a disk at intervals, as this saves development time and thus cost. These intervals should be frequent.
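The mass-balance check suggested above is simple in principle. The sketch below is a hypothetical illustration only — the flows, scan interval and threshold are invented, not taken from the plant described: it accumulates the apparent loss of material between two flow measurements and alarms when the imbalance grows too large to be measurement noise.

```python
# Hypothetical sketch only: flows, scan interval and threshold are invented.

class MassBalanceAlarm:
    """Accumulate the apparent loss of material between two flow meters and
    alarm when the imbalance exceeds a threshold (a possible leak)."""

    def __init__(self, threshold_kg=50.0, scan_interval_h=1.0 / 60.0):
        self.threshold_kg = threshold_kg
        self.scan_interval_h = scan_interval_h
        self.apparent_loss_kg = 0.0

    def scan(self, flow_in_kg_h, flow_out_kg_h):
        """Call once per scan; returns True when the alarm should sound."""
        imbalance = (flow_in_kg_h - flow_out_kg_h) * self.scan_interval_h
        self.apparent_loss_kg = max(self.apparent_loss_kg + imbalance, 0.0)
        return self.apparent_loss_kg >= self.threshold_kg

alarm = MassBalanceAlarm()
# One-minute scans: 100 kg/h enters the unit but only 40 kg/h leaves (a leak).
states = [alarm.scan(100.0, 40.0) for _ in range(60)]
print(states[-1])   # True: after an hour the 60 kg imbalance exceeds the threshold
```

A real check would also have to allow for hold-up in the unit and for meter error; the effort, as this chapter keeps stressing, lies in specifying such allowances, not in the arithmetic.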

The limit switch on an automatic valve (A) in a feed line to a reactor was out of order. Whatever the position of the valve, the limit switch told the computer that the valve was open, so the operator sent for an instrument mechanic. In order to check whether or not the limit switch was faulty, he sent a false signal to the computer to say that the limit switch was closed. The computer received the signal that the valve had closed and moved on to the next stage of the process — addition of hydrogen to the reactor. When the mechanic arrived the valve was actually open. The hydrogen passed through valve A, in the wrong direction, into a low-pressure vessel, which ruptured. The mechanic and the operators did not foresee what would happen when a false signal was sent to the computer9. As on plants which are not computer-controlled, we should ask, during Hazops, what will happen when foreseeable faults occur.

A batch plant had two parallel trains, A and B, which could be controlled (remote manually) from any of three control stations. It was custom and practice to connect control station 1 to stream A, control station 3 to stream B and control station 2 to either stream. Stream B was shut down. A number of operators wanted to observe what was happening on stream A and so stations 1 and 2 were connected to it. An engineer came into the control room to check on progress on stream A. As there was a crowd around stations 1 and 2, he used station 3. He then left stream A on display.

Soon afterwards an operator arrived to prepare stream B for start-up. As station 3 was normally used only for stream B, he assumed it was the one on display and operated several valves. He actually moved the valves on stream A and ruined the batch23.

There were, of course, errors by the engineer who failed to follow custom and practice and to switch off the control station when he left, and by the operator who did not check that station 3 was connected to stream B. But the system was user-unfriendly. Station 1 could have been permanently connected to A, and station 3 to B, while 2 could have been a read-only station connected to either. Only the operators responsible for A should be allowed to operate valves on A or make other changes. Different colours could have been used to distinguish the A and B displays.

As in one of the incidents described in Section 11.6 (page 199), accidentally hitting the wrong button should not produce serious effects, but there have been some such incidents in the financial markets. For example, a bank trader in London accidentally and repeatedly hit an 'instant sell' button; 145 sell orders for French 12-year bonds caused their price to fall drastically24.
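The user-friendly arrangement suggested above — each station permanently bound to one stream, with the remaining station read-only — can be sketched as follows. The class and names are invented for illustration; they are not taken from the plant concerned.

```python
# Hypothetical sketch of fixed station-to-stream bindings; names invented.

class ControlStation:
    """A control station that may operate only the stream it is bound to."""

    def __init__(self, name, controls=None):
        self.name = name
        self.controls = controls   # stream this station may operate, or None (read-only)

    def operate_valve(self, stream, valve, position):
        if self.controls != stream:
            raise PermissionError(
                f"station {self.name} is read-only for stream {stream}")
        return f"{valve} on stream {stream} moved to {position}"

station1 = ControlStation("1", controls="A")   # permanently bound to stream A
station2 = ControlStation("2")                 # read-only, may display either stream
station3 = ControlStation("3", controls="B")   # permanently bound to stream B

print(station1.operate_valve("A", "feed valve", "open"))   # allowed

# The error in the incident: trying to operate stream A from station 3.
try:
    station3.operate_valve("A", "feed valve", "open")
except PermissionError as error:
    print("refused:", error)
```

The point of the sketch is that the binding is enforced by the system rather than left to custom and practice, so the engineer's and operator's slips could not have ruined the batch.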

12.5 Entering the wrong data
An operator calculated the quantities of reagents required for a batch reaction and asked the foreman to check his figures. The foreman found an error. By this time a shift change had occurred; the new operator did not realize that one of the figures had been changed and he used the original ones. The error was not hazardous but the batch was spoilt.

Accidental entering of wrong data can be minimized by requiring operators to carry out at least two operations, for example, moving a cursor and entering figures. If necessary, the computer can be programmed to reject or query instructions which are outside specified ranges or do not satisfy consistency tests. During the Hazop of computer-controlled plants we should consider the results of errors in entering data.

Entering the wrong data has caused incidents in other industries and activities. A pilot set the heading in a plane's inertial navigation system as 270° instead of 027°. The plane ran out of fuel and had to land in the Brazilian jungle. Twelve people were killed10.

Another incident occurred when a clerk at the Australian Reserve Bank was asked to send an e-mail message announcing an increase in interest rates from 5 to 5.5% with effect from 9.30 am. She typed the message at 9.24 am and specified a delay of 6 minutes before transmission. However, she forgot that a separate menu had to be called up to activate the delay. As a result the computer carried out its default action and sent the message immediately. At 9.24 am the Australian dollar was worth US$63.20. So much Australian currency was bought during the next five minutes that at 9.30 am it was worth US$63.70. Speculators made a profit of four million Australian dollars35,36.

The bank admitted that there were shortcomings in their procedures but nevertheless the clerk and her supervisors were transferred to other positions and their salaries were reduced, an action about as effective in preventing further incidents as announcing after someone has slipped on an icy road that no-one should do so in future. To quote John Humphrys, 'Responsibility has been redefined to include being seen to do something, in response to transient public sentiment often generated by television images, rather than coolly assessing what are realistically the best options'37. If any people were responsible they were the senior managers who did not have their systems reviewed for features likely to produce errors.
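The safeguards mentioned earlier in this section (rejecting entries outside specified ranges, and requiring a second, confirming operation) can be sketched as follows, using the pilot's 270°/027° slip as the example. The function and its checks are a hypothetical illustration, not any real avionics or plant interface.

```python
# Hypothetical sketch only: the checks and limits are invented for illustration.

def accept_heading(entry, confirming_entry):
    """Accept a compass heading only if it is in range and entered twice."""
    heading = int(entry)
    if not 0 <= heading < 360:
        return None                 # reject: outside the specified range
    if int(confirming_entry) != heading:
        return None                 # query: the two entries disagree
    return heading

print(accept_heading("027", "027"))   # 27: valid and confirmed
print(accept_heading("270", "027"))   # None: the two entries disagree, so query
print(accept_heading("400", "400"))   # None: rejected, outside the range 0-359
```

Note that a range check alone would not have caught the 270/027 transposition, since both are valid headings; only the second, independent entry (or a consistency test against the flight plan) exposes it.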

A patient called Simpkin was admitted to hospital with suspected meningitis. A first test was negative and she was diagnosed as suffering from acute influenza. A second test was positive but the result was entered into the computer under the name Simkin and was thus never seen by her doctor. The patient died25. Like the incidents discussed in Chapter 2, it is far too simplistic to say that the death was due to a slip by the person who entered the data. The program should have contained safeguards such as the need to check that there was a patient with that name in the hospital and that she was suffering from suspected meningitis.

A lady had an emergency pacemaker fitted on 4 November 1996 while visiting Canada. She was told to arrange for it to be checked a month later. On asking for an appointment at her local hospital, when she had returned to the UK, she was surprised to be given one for April 1997. The date of the operation was shown on her Canadian notes as 11/4/96. The UK hospital interpreted this as 11 April 1996 and gave her an appointment for an annual check26. Now that so much communication is international, months should be spelt, to avoid ambiguity.

12.6 Failures to tell operators of changes in data or programs
I do not know of any incidents in the process industries that have occurred for this reason but it was the cause of an aircraft crash. In 1979 an Air New Zealand plane flew into a mountainside, while on a sight-seeing tour of Antarctica, with the loss of 257 lives. Unknown to the crew, the co-ordinates of the destination way-point had been moved 2 degrees to the East. The inertial navigation system guided the plane, which was flying at low altitude so that the passengers could view the scenery, along a valley that ended in a cliff. It looked very similar to the valley that the crew expected to follow and they did not realize that they were on the wrong course11,12.

12.7 Unauthorized interference with hardware or software
12.7.1 Hardware
A plant could be operated manually or by computer control. It was switched to manual control to carry out some maintenance on a reactor. One of the valves connected to the reactor had to be closed but an interlock prevented it closing.

The connections to the limit switches on the valve were therefore interchanged so that the valve was closed when the computer thought it was open. When the maintenance was complete, the plant was switched back to computer control before the limit switches were restored to their normal positions. The computer thought the valve was open and decided to close it. It actually opened it, releasing flammable material2.

This incident, like most of those described in this chapter, was not really the result of using a computer. It was the result of a totally unsatisfactory method of preparing equipment for maintenance, a frequent cause of accidents on plants of all sorts (see Chapter 10). The incident could have occurred on manual control if the operator had forgotten that the switches were interchanged.

According to the report on one incident, the operator did not believe it was a real emergency, even when alarms were sounding. Perhaps he had forgotten what had been done. Or perhaps he expected the all-powerful computer to somehow know what had happened — 'the computer can cope'. Eberts says that some operators expect computers to behave like humans and cannot understand why they make mistakes that no human would make13.

Files had to be transferred from a control computer to a training simulator. At first, the files were transferred to a free-standing workstation and then from there to the simulator; there was no direct connection. To simplify the transfer a direct cable connection was made between the control computer and the simulator. Unfortunately the address of the gateway in the control computer used for the data transfer was the same as that used to connect to the distributed control system (DCS). As a result, data flowed from the simulator through the control computer to the DCS and replaced the current input data by historic data. Some conditions on the plant started to change. Fortunately this was soon noticed by alert operators and the plant brought back under control.

This incident can teach us several lessons:
• No modification should be made to hardware (or software, see Section 12.2, page 205) until a systematic attempt has been made to identify the consequences and it has been authorized by a professionally qualified person, such as the first level of professional management. Since the explosion at Flixborough in 1974 this principle has been widely accepted for modifications to process equipment and operating conditions; it should also be applied to modifications to computerized systems.
• Connecting a control computer to another system is a modification and should only be carried out after systematic study of possible consequences. If made, data flow should be possible only in the outward direction.

• All systems should be secure. Houses need doors; the doors on control systems are less tangible than those on houses but just as important. The possible results of access by malevolent hackers are serious: any disruption of computerized control systems could be more serious than loss of accounting or technical data15.

12.7.2 Software
Suppose an operator has difficulty charging a reactor because the vacuum is too low. He assumes the vacuum transmitter is faulty and decides to override it. He is able to do so by typing in an acceptable vacuum measurement in place of the figure received from the transmitter. Sometimes the operator is right and production continues. On other occasions the transmitter is not faulty and an incident occurs14.

It is usually more difficult today for operators to interfere with the software in this way than in the early days of computer control. However, some companies allow operators to have 'keys' which allow them to override data, change interlock settings and so on. Other operators acquire them in various ways, much as operators have always acquired various tools and adaptors that they were not supposed to have. While we may never be able to prevent completely this sort of action by operators, we can at least make it difficult. This is not as simple as it might seem. And we should ask ourselves why operators find it necessary to acquire illegal tools and keys.

12.7.3 Viruses
I have seen only one report of a virus in computer control software. It was found in the central computer of a Lithuanian nuclear reactor and was said to have been inserted by the person in charge of the safety-control systems, so that he could help in the investigation and demonstrate his ability27.

Computer viruses are rather like AIDS: do not promiscuously share data and disks and you are unlikely to be affected16. For infection to occur, viruses would have to be present in the original software or introduced via connections to other computers. At one time plant control computers were rarely connected to other computers or, if they were connected, the flow of data was outwards only, but inward flow is becoming increasingly common (as described in Section 12.7.1, page 214). Viruses can also be introduced by allowing operators to play computer games.

If the machine in charge of a production process is a programmable controller, we would expect it to be immune to viruses or compatibility problems. A controller does not have a disk drive, and there is no operating system; once programs have been set up, they are added to infrequently.

However, when the time comes to upgrade the facility, the controller program has to be changed. The new program may be generated on a PC.

12.8 The hazards of old software
In the process industries the hazards of reusing old equipment are well documented, though this does not mean that incidents no longer occur. Old equipment may be built to lower standards than those used today; it may have been modified during its life; it may have suffered from exposure to extreme conditions of use; and it may require so much modification for its new duty that it might be simpler and cheaper to scrap it and start again. All these reservations apply to old software, except that (unfortunately?) it never wears out.

Leveson28 has described an incident on the Therac, an apparatus for irradiating cancer patients. Shen-Orr29 has described other incidents and Lions30 has described the loss of the Ariane 5 space rocket:

'In Ariane 4 flights using the same type of inertial reference system there has been no such failure because the trajectory during the first 40 seconds of flight is such that the particular variable related to horizontal velocity cannot reach, with an adequate operational margin, a value beyond the limit present in the software. Ariane 5 has a high initial acceleration and a trajectory which leads to a build-up of horizontal velocity which is five times more rapid than for Ariane 4. The higher horizontal velocity of Ariane 5 generated, within the 40-second timeframe, the excessive value which caused the inertial system computers to cease operation.'

A function that did not serve any purpose in Ariane 5 was thus left in 'for commonality reasons' and the decision to do so 'was not analysed or fully understood'.

12.9 Other applications of computers
If any calculations are carried out as part of a control or design program, their accuracy should be checked by independent calculation. For example, manual checks of a program for calculating pipe stresses showed that gravitational stresses had been left out in error31.
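The pipe-stress example shows what an independent check looks like in miniature. The sketch below is hypothetical — the stress terms and numbers are invented, not from the incident: the program's result is compared with a separately coded calculation, and the discrepancy exposes the omitted gravitational (dead-weight) term.

```python
# Hypothetical sketch only: the stress terms and numbers are invented.

def program_stress(pressure_stress, thermal_stress):
    # Suppose the program, in error, omits the gravitational (dead-weight) term.
    return pressure_stress + thermal_stress

def independent_stress(pressure_stress, thermal_stress, deadweight_stress):
    # An independently coded check calculation that includes every term.
    return pressure_stress + thermal_stress + deadweight_stress

computed = program_stress(40.0, 25.0)           # what the program reports, MPa
check = independent_stress(40.0, 25.0, 8.0)     # the independent figure, MPa
print(abs(computed - check) > 1.0)              # True: the cross-check exposes the error
```

The value of the check lies in its independence: a second run of the same program, or a calculation copied from it, would have repeated the omission.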

As with almost everything else in this chapter, computers provide new opportunities for making old errors. Of course, one does not need a computer to apply formulae uncritically outside their range of applicability; I have described examples elsewhere³³. Petroski gives the following words of caution³²:

'... a greater danger lies in the growing use of microcomputers. Since these machines and a plethora of software for them are so readily available and so inexpensive, there is concern that engineers will take on jobs that are at best on the fringes of their expertise. And being inexperienced in an area, they are less likely to be critical of a computer-generated design that would make no sense to an older engineer who would have developed a feel for the structure through the many calculations he had performed on his slide rule.'

Some software contains codes which prevent it being used after a prescribed period if the bill has not been paid.

12.10 Conclusions

As we have seen, computers do not introduce new sorts of error; they provide new opportunities for familiar errors. We can make errors faster than before. On any plant incidents can occur if we do not allow for the effects of foreseeable errors or equipment failures, if operators are given blanket instructions or are not told of changes in control settings, operating instructions or batch compositions, if there is poor handover at shift change or if equipment or procedures are changed without authority. However, some of the incidents are particularly likely to occur on computer-controlled plants because different departments may be responsible for operation of the plant and changes to the computer program, operators (and designers) may have exaggerated views of the computer's powers and many people may have limited understanding of what it can and cannot do and how it does it.

References in Chapter 12
1. Kletz, T.A., Chung, P.W.H., Broomfield, E. and Shen-Orr, C., Computer Control and Human Error (Institution of Chemical Engineers, Rugby, UK, 1995).
2. Nimmo, I., Nunns, S.R. and Eddershaw, B.W., Lessons learned from the failure of computer systems controlling a nylon polymer plant, Safety & Reliability Society Symposium, Altrincham, UK, November 1987.
3. Eddershaw, B.W., Loss Prevention Bulletin, No. 088, page 3, 1989.
4. Borning, A., Communications of the Association for Computing Machinery, 30(2), 1987.

5. Operating Experience Weekly Summary, No. 99-07, page 5 (Office of Nuclear and Facility Safety, US Department of Energy, Washington, DC, USA, 1999).
6. Applying computers to safe control and operation of hazardous process plants, Instrument Asia 87 Conference, Singapore, May 1987.
7. Gabbett, J.F., PVC computer control experience, AIChE Loss Prevention Symposium, Anaheim, CA, USA, 1984.
8. Chemical Engineering Progress, 81(12), 1985.
9. Mahon, P., Verdict on Erebus (Collins, Auckland, New Zealand, 1984).
10. Becket, M., The Daily Telegraph, 26 September 1989.
11. Financial Times, 19 October 1998.
12. Reader's Digest, 1982.
13. Learmont, D., Flight International, 17–23 January 1990.
14. Computer Control and Human Limits: Learning from IT and Telecommunications Disasters, page 37 (Programme on Information and Communications Technologies, Brunel University, Uxbridge, UK, 1993).
15. Leveson, N.G., Safeware — System Safety and Computers (Addison-Wesley, Reading, MA, USA, 1995).
16. Safety Critical Systems Newsletter, 5(2), 1998.
17. Quarterly Safety Summary, 55(221): 6 (Chemical Industries Association, London, UK), 1984.
18. Quarterly Safety Summary, 55(220): 95 (Chemical Industries Association, London, UK), 1984.
19. Out of Control: Why Systems Go Wrong and How to Prevent Failure (HSE Books, Sudbury, UK, 1995).
20. The Use of Computers in Safety-critical Applications (HSE Books, Sudbury, UK, 1998).
21. Wray, A.M., Design for Safe Operation — Experimental Rig to Production, page 36 (Institute of University Safety Officers, Bristol, UK).
22. Siddall, E., Paper No. 33 (Institute for Risk Research, University of Waterloo, Canada, 1986).
23. Gondran, M., launch meeting of the European Safety and Reliability Association, Brussels; reported in European Safety & Reliability Association Newsletter.
24. Mooney, D.G., An overview of the Shell fluoroaromatics explosion, in Hazards XI — New Directions in Process Safety, Symposium Series No. 124 (Institution of Chemical Engineers, Rugby, UK, 1991).
25. Hendershot, D.C., Process Safety Progress, 18(2): 113, 1999.
26. Eberts, R.E., Computer Programming in English, 1985.
27. Foord, A.G., May 1997.
28. Leveson, N.G., Safeware — System Safety and Computers, Appendix A (Addison-Wesley, Reading, MA, USA, 1995).

29. Shen-Orr, C., in Kletz, T.A. (editor), Computer Control and Human Error, Chapter 3 (Institution of Chemical Engineers, Rugby, UK, 1995).
30. Lions, J.L., Ariane 5 — Flight 501 Failure (European Space Agency, Paris, France, 1996).
31. Errors in Commercial Software Increase Potential for Process Piping Failures, Bulletin No. 89-B (US Department of Energy, Washington, DC, USA, 1989).
32. Petroski, H., To Engineer is Human (St Martin's Press, New York, USA, 1985).
33. Kletz, T.A., Dispelling Chemical Engineering Myths, page 123 (Taylor & Francis, Philadelphia, PA, USA, 1996).
34. Tweeddale, H.M., Nourishing and poisoning a 'safety culture', Chemeca Conference, Perth, Australia, 2000.
35. Neumann, P.G., Computer Related Risks (ACM Press, New York, USA, 1995).
36. The Australian, 19 February 2000.
37. Humphrys, J., Devil's Advocate, page 63 (Hutchinson, London, UK, 1999).

Personal and managerial responsibility

'There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight.'
— Report on the Clapham Junction railway accident¹⁶

'Personal responsibility is a noble ideal, and a necessary individual aim, but it is no use basing social expectations upon it — they will prove to be illusions.'
— B. Inglis, Private Conscience — Public Morality, 1964

13.1 Personal responsibility

The reader who has got this far may wonder what has happened to the old-fashioned virtue of personal responsibility. Has that no part to play in safety? Should people not accept some responsibility for their own safety?

We live in a world in which people are less and less willing to accept responsibility for their actions. If a man commits a crime it is not his fault, but the fault of those who brought him up, or those who put him in a position in which he felt compelled to commit the crime. A criminal may say that his crimes were the result of present or past deprivation, but most deprived people do not turn to crime. If someone is reluctant to work, he or she is no longer work-shy or lazy but a sufferer from chronic fatigue or some other recently discovered syndrome. This attitude is parodied in the story of the modern Samaritan who found a man lying injured by the roadside and said, 'Whoever did this to you must be in need of help', but offered sympathy. And in the story of the schoolboy in trouble who asked his father, 'What's to blame, my environment or my heredity?' Either way it was not him. He should not be blamed.

Many people react to this attitude by re-asserting that people do have free will and are responsible for their actions. One psychologist writes:

'criminals aren't victims of upbringing ... Nor are criminals victims of their peer group ... young criminals-to-be choose their peers, not the other way around ... parents and siblings are victims of the individual's criminality ... Drugs and alcohol ... are less the reason why some become criminals than the tool they use ... to provide themselves the "courage" to hit a bank.'¹

How do we reconcile these conflicting opinions? We should distinguish between what we can expect from individuals and what we can expect from people en masse.

As individuals we should accept (and teach our children to accept) that we are responsible for our actions; otherwise we are mere computers, programmed by our genes, our parents, our environment or society at large. We must try to work safely, do our best, try not to forget, try to learn.

But as managers, dealing with large numbers of people, we should expect them to behave like average people — forgetting a few things, making a few mistakes, taking a few short cuts, following custom and practice, even indulging in a little petty crime when the temptation is great — not to a great extent but doing so to the extent that experience shows people have done in the past. Changing people, if it can be done at all, is a slow business compared with the time-scale of plant design and operation. So let us proceed on the assumption that people will behave much as they have done in the past.

Reichel, discussing crime in libraries, writes:

'Professors who assign research projects that require hundreds of students to use a single source in a library inevitably invite trouble in our competitive and permissive society, unless they make a co-ordinated effort to provide multiple copies of the source and to subsidize convenient photostat machine operating costs.'¹⁵

An old method for discouraging the theft of fire buckets is to use buckets with a rounded base. They are kept on a hook.

A number of three-pin plugs were stolen from plant instruments. The manager left a box of plugs in the control room with a notice saying 'Help yourself'. Some of them were taken but not all and the thefts stopped.

In the early 19th century, Bank of England notes were easily forged. Instead of making forgery difficult, the authorities tried to suppress it by savagery. More than 300 people were hanged, and many transported for the lesser crime of passing forged notes. The cartoonist George Cruikshank protested against the penalties by drawing the cartoon note shown in Figure 13.1, before the notes were redesigned¹⁷.

Figure 13.1 The cartoonist George Cruikshank protested against the severe penalties for forgery by drawing a Bank Restriction Note grimly decorated with skulls, gibbets and a hangman's noose. © The British Museum

13.2 Legal views

UK law supports the view that we should expect people to behave in the future as they have behaved in the past, as these quotations from judges show:

'(A person) is not, of course, bound to anticipate folly in all its forms, but he is not entitled to put out of consideration the teachings of experience as to the form those follies commonly take.'²

We could replace 'folly' by 'human error'.

'The Factories Act is there not merely to protect the careful, the vigilant and the conscientious workman, but, human nature being what it is, also the careless, the indolent, the weary and even perhaps in some cases the disobedient.'³

'The standard which the law requires is that (the employers) should take reasonable care for the safety of their workmen. In order to discharge that duty properly an employer must make allowance for the imperfections of human nature. When he asks his men to work with dangerous substances he must provide appliances to safeguard them; he must set in force a proper system by which they use the appliances and take the necessary precautions, and he must do his best to see that they adhere to it. He must remember that men doing a routine task are often heedless of their own safety and may become slack about taking precautions. He must, therefore, by his foreman, do his best to keep them up to the mark and not tolerate any slackness. He cannot throw all the blame on them if he has not shown a good example himself.'⁴

'In Uddin v Associated Portland Cement Manufacturers Limited, a workman in a packing plant went, during working hours, to the dust extracting plant — which he had no authority to do — to catch a pigeon flying around in the roof. He wanted it for the pot. He climbed a vertical steel ladder to a platform where he apparently leant over some machinery and caught his clothing on an unfenced horizontal revolving shaft, as a result of which he lost his arm. The trial judge found that the workman's action was the height of folly, but that the employer had failed to fence the machinery. The judge apportioned 20 per cent of the blame to the employer.

'In upholding the award, Lord Pearce, in his judgment in the Court of Appeal, spelt out the social justification for saddling an employer with liability whenever he fails to carry out his statutory obligations. The Factories Act, he said, would be quite unnecessary if all factory owners were to employ only those persons who were never stupid, careless, unreasonable or disobedient or never had moments of clumsiness, forgetfulness or aberration; hence the necessity for legislation with the benevolent aim of enforcing precautions to prevent avoidable dangers in the interest of those subjected to risk (including those who do not help themselves by taking care not to be injured). Humanity was not made up of sweetly reasonable men.

'The accident in the pigeon case, he said, would never have happened but for the unauthorised and stupid act of the employee. But then the accident would equally not have happened if the machinery had been properly fenced. Once the machinery is shown to be dangerous and require fencing, the employer is potentially liable to all who suffer from any failure to fence. And the duty is owed just as much to the crassly stupid as to the slightly negligent employee. It would not be in accord with this piece of social legislation that a certain degree of folly by an employed person should outlaw him from the law's protective umbrella.'⁵

The judge added that the workman's actions were not 'actuated by benevolence towards the pigeon'. The judge's comments suggest that many failures to work safely are deliberate. In fact I think more are due to a moment's forgetfulness.

The law, like this book, is not primarily concerned with the reasons for failures to work safely but accepts that, for a variety of reasons, men will not always follow the rules and therefore designs and methods of operation should take this into account.

The quotations above were all made during claims for damages by injured people — that is, under the civil law. Under the criminal law, the company is usually prosecuted and prosecution of individuals is rare. It seems to occur only when there has been gross negligence, deliberate damage to safety equipment or failure to carry out duties which were clearly laid down as part of the job. In practice, a manager can be prosecuted if he turns a blind eye to breach of the law — that is, if he sees someone working unsafely and says nothing. In 1996 an asbestos contractor who took no precautions to prevent the spread of asbestos was jailed for three months. This was the first time someone was imprisoned for a health and safety offence²³. Later the same year, after a fatal accident, the managing director of a haulage company was imprisoned for a year for failing to provide adequate protective equipment and a safe system of work²⁴. In 1998 a self-employed contractor was imprisoned for dumping asbestos; the company and its directors were fined²⁵.

'Where an offence under any of the relevant statutory provisions committed by a body corporate is proved to have been committed with the consent or connivance of, or to have been attributable to any neglect on the part of, any director, manager, secretary or other similar officer of the body corporate or a person who was purporting to act in any such capacity, he as well as the body corporate shall be guilty of that offence and shall be liable to be proceeded against and punished accordingly.'⁶

'"Connivance" connotes a specific mental state not amounting to actual consent to the commission of the offence in question, concomitant with a failure to take any step to prevent or discourage the commission of that offence.'⁷ (See Section 3.6, page 66.)

As long as one does one's best and exercises reasonable care and skill, an error of judgement is not negligence. Lord Denning said in the Court of Appeal, 'The Courts must say firmly, that in a professional man, an error of judgement is not negligence.'¹⁸ If an engineer guesses the size of a relief valve instead of calculating it, or getting a competent person to calculate it, this is not an error of judgement but a failure to apply reasonable care and skill. (Your employer might be prosecuted for not employing a more competent person, but that is another matter.) I do not think that there is any danger of prosecution.

According to Section 6 of the Health and Safety at Work Act 1974, the designer, manufacturer, supplier and user of equipment must ensure, so far as is reasonably practicable, that the equipment is safe and without risk to health when properly used. Section 6(1) imposes duties on any person who 'designs, manufactures, imports or supplies any article for use at work'. According to a 1987 amendment, 'an absence of safety or a risk to health shall be disregarded in so far as it could not reasonably have been foreseen'¹⁹. Section 6(7) states that such liability 'shall extend only to things done in the course of a trade, business or other undertaking carried out by him (whether for profit or not) and to matters within his control'. This seems to exclude the employee designer.

It is interesting to note that the law on safety has a long history. In the East the roofs of houses were (and still are) used as extra living space (Figure 13.2). The Bible tells us: 'When thou buildest a new house, then thou shalt make a battlement for thy roof, that thou bring not blood upon thine house, if any man fall from thence.'⁸

Figure 13.2 Roof of house in present-day Jerusalem showing 'battlements'

13.3 Blame in accident investigations

It follows from the arguments of this book that there is little place for blame in accident investigations. Everybody makes mistakes or slips and has lapses of attention from time to time and sometimes they result in accidents. People should not be blamed for behaving as most people would behave. Plants should be designed and operated so that these foreseeable errors do not result in accidents (or, if the consequences are not serious, the occasional accident can be accepted).

If we consider separately the various types of human error discussed in this book, there is no place for blaming the operator if the error is due to a slip or lapse of attention (Chapter 2), to lack of training or instructions (Chapter 3) or to lack of ability (Chapter 4). (Sometimes the employer can fairly be blamed for not providing a better system of work or better training or instructions or for not employing a more competent person.) Blame is relevant only when the person concerned has a choice, and the deliberate decisions discussed in Chapter 5 are the only errors that could justify blame. Even here, the violation may have occurred because the rules were not clear, or the reasons for them had not been explained, or someone turned a blind eye when they were broken in the past. Before blaming anyone we should answer the questions listed at the beginning of Chapter 5. Of course, if someone makes repeated errors, more than a normal person would make, or shows that he is incapable of understanding what he is required to do, or is unwilling to do it, then he may have to be moved.

Sometimes people are blamed because those in charge wish to deflect criticism from themselves on to a scapegoat (see Section 3.5, page 72). People may be blamed for a moment's slip or lapse of attention because a manager, perhaps unconsciously, wishes to divert attention from his own failure to provide a safe plant or system of work. In my experience, managers in the process industries, at least in the larger companies, do not often do this. After an accident they are usually willing to stand in a white sheet and admit that they might have done more to prevent it.

The same is not true for all industries. In 1976 a mid-air collision in Yugoslavia killed 176 people. The air traffic controllers were overworked and the control equipment was poor. One controller went to look for his relief, who was late, leaving his assistant alone for eight minutes. The assistant made a slip which led to the accident. All the controllers on duty were arrested, eight were tried and the overworked assistant was sentenced to seven years' imprisonment. Following widespread protests he was released after two years²⁰.

Even if blame is justified it may not be wise, as it may result in people being less forthcoming in the future and make it harder for us to find out what happened (see Section 1.5, page 8).

If there is ever a martyrology of loss prevention, the Yugoslav controller should be the first candidate for inclusion. Another candidate for inclusion is the UK train driver who, in 1990, was sent to prison for passing a signal at danger²² (see Section 2.9.3, page 38).

13.4 Managerial wickedness

It has been said that the three causes of accidents are:
• ignorance;
• apathy; and
• avarice.

The reasons for human error that I have discussed in this book, and repeated in the last section, do not include indifference to injury, as very few accidents result from a deliberate cold-blooded decision, by managers or workers, to ignore risks for the sake of extra profit or output. (I say 'cold-blooded' because in the heat of the moment we are all tempted to take a chance that, if we had time to reflect, we would recognize as unwise.) Accidents occur because managers lack knowledge, imagination and drive, do not see the hazards that are there and are subject to all the other weaknesses that beset human nature, but not because they would rather see people hurt than take steps to prevent them getting hurt. Managers are genuinely distressed when an accident occurs on their plant and would do anything to be able to put the clock back and prevent it. (A time machine such as Dr Who's Tardis would be a useful piece of equipment on every plant.) The report on the Clapham Junction railway accident²¹ (see Section 6.3, page 118) says, '... I am satisfied that the errors which were made did not result from any deliberate decision to cut corners on safety', and, '... the sincerity of the beliefs of those in BR at the time of the Clapham Junction accident ... cannot for a moment be doubted.'

When companies try to get away from a blame culture, employees are sometimes so used to using blame as an explanation that they start to blame equipment.

The view that accidents are not due to managerial wickedness is not universally shared. Some writers seem to believe that it is the principal cause of accidents. They look on industrial safety as a conflict between 'baddies' (managers) and 'goodies' (workers, trade union officials, safety officers, and perhaps regulators). A book intended for trade union members says, 'Workers have seen how little their lives have often been valued.'⁹ This may have been true at one time, but not today.

I would agree about the first two but the third is the least important (see Section 6.6, page 120).

13.5 Managerial competence

If accidents are not due to managerial wickedness, they can be prevented by better management. While we would like individual workers to take more care, we should try to design our plants and methods of working so as to remove or reduce opportunities for error. And if individual workers do take more care it will be as a result of managerial initiatives — action to make them more aware of the hazards and more knowledgeable about ways to avoid them, and to pay more attention to the rules. The words in italics sum up this book. Exhortation to work safely is not an effective management action.

To conclude this section, here are two quotations from official reports. The first is from the Robens Report¹⁰, which led to the 1974 Health and Safety at Work Act:

'The fact is — and we believe this to be widely recognized — the traditional concepts of the criminal law are not readily applicable to the majority of infringements which arise under this type of legislation. Relatively few offences are clear-cut, few arise from reckless indifference to the possibility of causing injury, few can be laid without qualification at the door of a particular individual. The typical infringement or combination of infringements arises rather through carelessness, oversight, lack of knowledge or means, inadequate supervision, or sheer inefficiency. In such circumstances the process of prosecution and punishment by the criminal courts is largely an irrelevancy. The real need is for a constructive means of ensuring that practical improvements are made and preventative measures adopted.'

The second is from the Aberfan Report¹¹:

'... there are no villains in this harrowing story ... but the Aberfan disaster is a terrifying tale of bungling ineptitude by many men charged with tasks for which they were totally unfitted, of failure to heed clear warnings, and of total lack of direction from above. Not villains, but decent men, led astray by foolishness or ignorance or both in combination, are responsible for what happened at Aberfan.'

All my recommendations call for action by managers. Behavioural safety training, as mentioned earlier, can produce substantial reductions in those accidents which are due to people not wearing the correct protective clothing, using the wrong tools for the job, leaving junk for others to trip over, etc. However, a word of warning: experience shows that a low rate of such accidents and a low lost-time injury rate do not prove that the process safety is equally good. Serious process accidents have often occurred in companies that boasted about their low rates of lost-time and mechanical accidents (see Section 5.3, page 107).

Why then do published accident statistics say that so many accidents — over 50% and sometimes 80 or 90% — are due to 'human failing'? There are several reasons:
(1) Accident reports are written by managers and it is easy to blame the other person.
(2) It is easier to tell a man to be careful than to modify the plant or method of working.
(3) Accidents are due to human failing. This is not untrue, merely unhelpful. To say accidents are due to human failing is like saying falls are due to gravity. It is true but it does not help us prevent them. We should list only those accident causes we can do something about (see Section 1.3, page 4).
(4) Sometimes there is a desire to find scapegoats (see previous section).

You can prevent most accidents, if you are prepared to make the effort. So my counsel for managers is not one of comfort but one of challenge. Let me end this section with a few quotations. The first is from a union official:

'I place the blame for 99 per cent of all accidents fairly and squarely on the shoulders of management, directors, foremen and chargehands. The workman takes his cue from management. If the management doesn't care the workman doesn't care either until something happens, not immediately, but in the long run.'¹²

The second is from a safety engineer:

'Unsafe to dangerous practices are carried out which anybody with an observant eye could see if they wished, and if they do see, do they do anything about it? Not until a serious accident happens and then the excuse is "It has never happened before. The job has always been done like this." Sole responsibility for any accident should be placed fairly and squarely on the shoulders of the Departmental Manager. He should go about with his eyes open instead of sitting in his office, to be able to note unsafe and dangerous practices or work places, and get something done about them as soon as possible. If he gives instructions on these matters he should enforce them.'

The third is a headline in an old copy of the Farmers Weekly:

TAIL BITING IN PIGS
FAULTS IN MANAGEMENT AND SUPERVISION

'... outstanding safety performances occur when the plant management does its job well. A low accident rate, like efficient production, is an implicit consequence of managerial control. Amplification of the desired managerial effect is more certain when managers apply the same vigorous and positive administrative persuasiveness that underlies success in any business function.'¹³

Figure 13.3 shows what can be achieved by determined management action. It shows how ICI's fatal accident rate (expressed as a five-year moving average) fell after 1968 when a series of serious accidents drew attention to the worsening performance¹⁴.

Figure 13.3 ICI's fatal accident rates (the number of fatal accidents in 10⁸ working hours, or in a group of 1000 men in a working lifetime) expressed as a five-year moving average, for the period 1960–1982 (from Reference 14)
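The five-year moving average used for the figure can be computed as follows. This is a minimal sketch: the yearly fatal accident rates below are invented for illustration and are not ICI's actual figures.

```python
# Sketch of the five-year moving average used for a fatal accident rate
# (FAR) chart. The FAR values below are invented for illustration only;
# they are not ICI's actual data.

def moving_average(values, window=5):
    """Average of each run of `window` consecutive values."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]


# Fatal accident rate by year, 1960-1969 (invented numbers)
far = [4.0, 4.2, 4.1, 4.5, 4.8, 5.0, 5.2, 5.5, 5.1, 4.6]

smoothed = moving_average(far)  # points 1960/64, 1961/65, ..., 1965/69
assert len(smoothed) == 6
assert smoothed[0] == sum(far[:5]) / 5  # the 1960/64 point
```

Each plotted point (for example '1960/64') is the mean of five consecutive yearly rates, which smooths year-to-year fluctuations at the cost of responding slowly to real changes in performance.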

13.6 Possible and necessary

I have heard colleagues suggest that the improved performance was due to Hazop, QRA or some other technique. There was probably symbiosis between them. In fact, we made many changes — we could not carry out controlled experiments — and we do not know which changes were the most effective.

There are some things, such as turning lead into gold, we cannot do because there are no known methods. Preventing accidents is not one of them. The information is available, some of it in this book. What is often lacking is recognition of the need, knowledge of the actions needed and commitment. In this and other books I have tried to contribute towards the first two. Commitment is up to you.

Following the advice in this book calls for a lot of effort. Is it really necessary? A lot of the time it may not be. A lot of the time we can get by with a performance that is just adequate. Only in key situations, when we are the last line of defence, are our best efforts really necessary. But we never know until afterwards which are the key situations. We never know when the others have failed and everything depends on us. The only safe way is to assume that our contribution is essential.

References in Chapter 13
1. Science 84, September 1984.
2. A quotation from a judge's summing up, origin unknown.
3. A quotation from a judge's summing up which appeared in several newspapers in October 1968.
4. Fife, I. and Machin, E.A., Redgrave's Health and Safety in Factories, 2nd edition, page 339 (Butterworth, London, 1982).
5. As Reference 4, page 432. The quotation is from a House of Lords judgement.
6. Health and Safety at Work Act (1974), Section 37(1).
7. 'Justinian', The Guardian, 7 February 1966.
8. Deuteronomy, Chapter 22, Verse 8.
9. Eva, D. and Oswald, R., Health and Safety at Work, page 39 (Pan Books, London, 1981).
10. Safety and Health at Work — Report of the Committee 1970–1972 (the Robens Report), Paragraph 47 (HMSO, London, 1972).
11. Report of the Tribunal Appointed to Inquire into the Disaster at Aberfan on October 21st, 1966, Paragraph 26.1 (HMSO, London, 1967).
12. Safety (British Steel Corporation), August 1971.
13. Grimaldi, J.V., Management and Industrial Safety Achievement, Information Sheet No. 13 (International Occupational Safety and Health Information Centre (CIS), Geneva, Switzerland, June 1965).

14. Hawksley, J.L., Paper 3, Proceedings of the CHEMRAWN III World Conference (CHEMical Research Applied to World Needs), The Hague, 25–29 June 1984.
15. Reichel, M., Journal of Academic Librarianship, November 1984.
16. Hidden, A. (Chairman), Investigation into the Clapham Junction Railway Accident, Paragraphs 16.3, 13.13 and 13.3 (HMSO, London, 1989).
17. Hewitt, V. and Keyworth, J., As Good as Gold (British Museum Publications, London, 1987).
18. Daily Telegraph, 6 December 1979.
19. Redgrave's Health and Safety in Factories (Butterworth-Heinemann, Oxford, UK, 1998).
20. Stewart, S., Air Disasters, page 131 (Ian Allan, London, 1986).
21. As Reference 16, page 219.
22. The Times, 4 September 1990.
23. Health and Safety at Work, 18(3): 5, 1996.
24. Health and Safety at Work, 18(12): 5, 1996.
25. Health and Safety at Work, 20(10): 4, 1998.

The adventures of Joe Soap and John Doe

'Today, a worker injury is recognised to be a management failure.'
P.L. Thibaut Brian, Preventing Major Chemical and Related Process Accidents, Symposium Series No. 110, page 65 (Institution of Chemical Engineers, Rugby, UK), 1988

Joe Soap makes frequent mistakes — that is, he takes the wrong action as he has not been told what is the right action. In contrast, John Doe makes the little slips we all make from time to time. We could tell Joe to be more careful. Or we could make simple changes to plant design or methods of operation and remove opportunities for error.

These cartoons originally appeared on page-a-month calendars. As a result no picture was displayed for so long that it became tatty or became part of the unnoticed background. Poor Joe makes a mistake every month. All these incidents happened to someone — though not to the same man — and may happen to us unless...

Figure 14.1 [caption illegible]
Figure 14.2 Joe left the job for a minute while draining water
Figure 14.3 Joe asked a fitter to do a job on tank F1374B
Figure 14.4 Joe carried petrol... in a BUCKET!
Figure 14.5 Joe saw a vent near a walkway
Figure 14.6 Joe drained some oil into a FLOODED trench. It caught fire 100 feet away
Figure 14.7 Joe off-loaded liquid nitrogen without testing
Figure 14.8 Joe left off his goggles and stood too close to take a sample
Figure 14.9 Joe tested the inside of a vessel with a gas detector and — getting a zero reading — allowed welding to go ahead. Oils with a flash point above atmospheric temperature are not detected by gas detectors
Figure 14.10 Joe added hot oil...
Figure 14.11 Joe tied a plastic bag over the vent to keep the tank clean
Figure 14.12 Joe put steam into a cold main without opening the steam trap by-pass
Figure 14.13 Joe hopes that NEXT YEAR will be better

Further adventures of Joe Soap

Once again poor Joe makes a mistake every month. All these incidents happened to someone — though not to the same man — and may happen to us unless...

Figure 14.14 [caption illegible]
Figure 14.15 [caption illegible]
Figure 14.16 Joe ignored an instrument reading which 'could not possibly' be correct...
Figure 14.17 Joe tested the line at 10 a.m. The welder started...
Figure 14.18 Joe let the tractor leave and then emptied the rear compartment first
Figure 14.19 Joe disconnected a flex before releasing the pressure
Figure 14.20 Joe let a furnace tube get 150°C too hot for 8 hours
Figure 14.21 Joe left a portable light face down on a staging
Figure 14.22 Joe put his oily overalls on a warm line to dry
Figure 14.23 Joe put some cylinders into a closed van...!
Figure 14.24 Joe had the size of a vent changed
Figure 14.25 Joe left a pipe — used only at start-up — full of water. Frost split it
Figure 14.26 Joe put his head inside a vessel to see if it was clean
Figure 14.27 Now learn from the adventures of John Doe
Figures 14.28–14.39 The adventures of John Doe (captions illegible in this copy)

Some final thoughts

'The best that can be hoped for from the ending is that sooner or later it will arrive.'
N.F. Simpson, A Resounding Tinkle

'Never attribute to malice or other deliberate decision what can be explained by human frailty.'
After H.S. Kushner1

I have tried to show that human errors are events of different types — slips and lapses of attention, mistakes (including ignorance of responsibilities), violations, errors of judgement and mismatches — made by different people (managers, designers, operators, construction workers, maintenance workers and so on) and that different actions are required to prevent them happening again: in some cases better training or instructions, in other cases better enforcement of the rules, in most cases a change in the work situation (see Table 1.1, page 10).

Human error is one of those phrases, like 'Act of God' and 'cruel fate', that discourage critical thought. These terms imply that we can do little or nothing about natural disasters; human error implies that we can do little more than tell people to take more care, and it is too easy to tell someone to be more careful. Almost any accident can be said to be due to an error by someone, and the use of the phrase discourages constructive thinking about the action needed to prevent it happening again. If human error covers such different actions, made by different people, requires such different remedies and discourages thought about these remedies, is the concept a useful one?

Wrigglesworth wrote (in 1972)2:

'Accidental death is now the only source of morbidity for which explanations that are essentially non-rational are still socially acceptable. Suggestions that "it was an Act of God" or "it was a pure accident" or "it was just bad luck" are still advanced as comprehensive explanations of the causes of accidents and are accepted as such even by professionally trained persons who, in their own fields of activity, are accustomed to apply precise analytical thinking. This approach is typical of the fables, fictions and fairy tales that mark our community approach to the accident problem.'

There has been much progress since this was written but human error is still quoted, more often than Acts of God or bad luck, as the cause of accidents, as if it was a sufficient explanation.

In a book on radio3 that I was given as a school prize in 1940 there is a chapter on the ether. It says, 'The whole of space is filled with ether — a continuous very dense and very elastic medium... the ether itself consists of very fine particles which, though practically incompressible, can easily be shifted in respect to each other.' Today no-one would write like this. We have realized that the concept of the ether is unnecessary: it does not explain anything that cannot be explained without it. To quote the Medawars, it is 'evaporating, a fate that would certainly not have overtaken it if it had served any useful purpose'4. Ether, incidentally, has twice been discarded as a concept of no value. It originally meant a fifth element, in addition to earth, air, fire and water, something halfway between mind and matter that could convert thoughts into physical events5.

Similarly, biologists have abandoned the idea of protoplasm — something which permeates inanimate structures and gives them vitality — as the physical basis of life. 'Today the word "protoplasm" is falling into disuse.'

According to Dewey6, intellectual progress usually occurs through sheer abandonment of questions together with both of the alternatives they assume — 'an abandonment that results from their decreasing vitality... We do not solve them: we get over them. Old questions are solved by disappearing... while new questions corresponding to the changed attitude of endeavour and preference take their place.'

Perhaps the time has come when the concept of human error ought to go the way of phlogiston, the ether and protoplasm. Perhaps we should let the term 'human error' fade from our vocabularies, stop asking if it is the 'cause' of an accident, and instead ask what action is required to prevent it happening again. Perhaps 'cause' as well as 'human error' ought to go the way of phlogiston, ether and protoplasm.

References in Chapter 15
1. Based on a quotation from Kushner, H.S., How Good Do You Have to Be?, page 109 (Brown, Boston, MA, USA), 1996.
2. Wrigglesworth, E.C., Occupational Safety and Health, April 1972, page 10.
3. Stranger, R., The Outline of Wireless, page 202 (Newnes, London, UK), 1939.
4. Medawar, P. and Medawar, J.S., Aristotle to Zoos, page 220 (Oxford University Press, Oxford, UK), 1983.
5. Miller, J., The Body in Question, page 290 (Cape, London, UK), 1978.
6. Dewey, J., The influence of Darwinism on philosophy, 1910, quoted in Gardner, M. (editor), The Sacred Beetle and Other Great Essays in Science, 2nd edition, page 20 (Oxford University Press, Oxford, UK), 1985.

Postscript

'...there is no greater delusion than to suppose that the spirit will work miracles merely because a number of people who fancy themselves spiritual keep on saying it will work them.'
L.P. Jacks, The Education of the Whole Man, page 77 (University of London Press), 1931 (also published by Cedric Chivers, 1966)

Religious and political leaders often ask for a change of heart, often unaccompanied by any practical programme. Heroes and saints may be able to transcend human nature, but few ordinary mortals can. Perhaps, instead of asking for a change in attitude, they should, like engineers, accept people as they find them and try to devise laws, institutions, codes of conduct and so on that will produce a better world without asking for people to change. Perhaps, instead of asking people to change, they should just help people with their problems.

For example, after describing the technological and economic changes needed to provide sufficient food for the foreseeable increase in the world's population, Goklany writes1: 'the above measures, while no panacea, are more likely to be successful than fervent and well-meaning calls... to reduce populations, change diets or life-styles... or embrace asceticism.'

Reference
1. Goklany, I.M., in Morris, J. and Bate, R. (editors), Fearing Food: Risk, Health and Environment, page 256 (Butterworth-Heinemann, Oxford, UK), 1999.

Appendix 1 — Influences on morale

In Section 4.3 (page 85) I suggest that some people may deliberately but unconsciously injure themselves in order to withdraw from a work situation which they find intolerable (or, having accidentally injured themselves, use this as an excuse for withdrawing from work). According to Husband and Hinton, some children react to a difficult family situation by hurting themselves2. (See Reference 10 of Chapter 4.) If a group of employees feel oppressed they may strike. If an individual employee feels oppressed he may drag his feet, swing the lead, feign ignorance, not comply, make the most of any injury and perhaps even injure himself. Perhaps they do not get on with their fellow workers; perhaps their job does not provide opportunities for growth, achievement, responsibility and recognition; perhaps canteens and washrooms are filthy and they are treated like dirt; perhaps there is no security. The following notes develop this latter theme a little further, but I want to emphasize that, as stated in Section 4.3, it is not a major contribution to accident rates. It may contribute more to high sickness absence rates.

I want to avoid technical terms as far as possible but there are two that must be used: hygienic (or maintenance) factors and motivators. Hygienic factors include rates of pay, fringe benefits, working conditions, the workplace, status symbols and social grouping. They have to reach a minimum acceptable standard or employees will be dissatisfied and will not work well: people will not work as well as they might if the pay is low. All these factors must be brought up to an acceptable standard before we can hope to get anyone to work well, but improving them beyond the minimum will not make anyone work better. The minimum acceptable standard varies from time to time, from place to place and from person to person. For example, some companies have given all

— high pay, 'perks', fine canteens, sports fields, clubs, pensions, security, good holidays, children's outings — but still staff and payroll remain 'browned-off and bloody-minded'. What's gone wrong? The answer, according to Herzberg1, is that to get people to work well we must:
• First, bring the hygienic factors up to scratch (if they are below it).
• Second, fulfil people's need for growth, achievement, responsibility and recognition. Give them a job with a definite aim or objective rather than a collection of isolated tasks, involve them in deciding what that objective should be and how it is achieved, give them as much freedom as possible to decide how they achieve that objective, show them how their work fits into the wider picture, tell them when they have done a good job and make it clear that there are opportunities for promotion.

This theory explains why we are so well motivated in wartime (the aim or objective is obvious) and why process workers, who have the feeling they are driving the bus, are usually better motivated than maintenance workers.

The 'maintenance seeker' is seeking all the time for:
• more salary;
• more 'perks';
• more security;
• more status trappings;
• better working conditions;
• less supervision.

He knows all the rules and all the injustices, little and big. If asked about his job, he describes the working conditions. The 'motivation seeker', on the other hand, is motivated by the work itself rather than the surroundings. He is seeking all the time to complete more jobs, solve more problems, take more responsibility and earn recognition.

The theory is easier to apply to staff jobs than payroll ones but nevertheless let us try to apply it to the floor-sweeper. It is usual to give the floor-sweeper a weekly schedule, drawn up by someone else, which says which areas are to be swept each day. The object is obvious and unexciting. What about involvement? Why not involve the floor-sweeper in drawing up the schedule? He knows better than anyone which areas need sweeping daily. How often does the manager (or anyone else on the plant) thank him for keeping it so clean and make him feel he is 'in charge of the plant'? (He forgets to add 'for cleaning' when he repeats it to his wife.)

Some men attach more importance to hygienic factors than others.

But people are not born maintenance seekers or motivation seekers. Surroundings rich in opportunities for satisfying motivation needs breed motivation seekers, and vice versa.

At one time I used to attend regular meetings of works managers at which the lost-time accident rates were displayed. If a works had a high accident rate, the works manager would often explain that there had not really been an increase in accidents but that a number of men had decided to take time off after suffering minor injuries. The production director would then ask, 'Why do your men lose time over minor injuries when the men in other works do not?'

References in Appendix 1
1. Herzberg, F., Work and the Nature of Man (Staples Press, London, UK), 1968.
2. Husband, P. and Hinton, P., Care in the Home, January 1973, page 13.

Appendix 2 — Some myths of human error

'Often, the ideas put forward made (and make) perfect sense. Their only problem is that they are wrong.'
Steve Jones1

In my book Dispelling Chemical Engineering Myths2 I list nearly a hundred beliefs about technology, management and the environment which are not wholly true, though there is often some truth in them. The following are some myths on human error. Most of the points made have been discussed already in this book but it may be useful to have them collected together. The first six are discussed in greater detail in Reference 2.

1. Most accidents are due to human error
This is true, if we include errors by designers and managers. But this is not what people usually mean when they say that most accidents are due to human error. They usually mean that most accidents are caused by people at the sharp end: the operator who opens the wrong valve, the airline pilot who pulls the wrong lever, the railway engine driver who passes a signal at danger. In fact, they are at the end of a long line of people all of whom could have prevented the accident.

Figure A2.1 (page 262) summarizes an example which combines bits from several incidents. A bellows, used to save cost, was incorrectly installed so that it was distorted. After some months it leaked and a passing vehicle ignited the escaping vapour. Damage was extensive as the surrounding equipment had not been fire-protected. The leak would not have occurred, or the damage would have been less, if:
• Bellows were not allowed on lines carrying hazardous materials.
• The use of bellows had been questioned during design. A hazard and operability study would have provided an opportunity to do so.
• The fitter who installed the bellows had done a better job. Did he know the correct way to install a bellows and the consequences of incorrect installation?
• There had been a thorough inspection after construction and regular inspections of items of equipment whose failure could have serious consequences.
• Everyone had kept their eyes open as they walked round the plant.
• Gas detectors and emergency isolation valves had been installed.
if: • Bellows were not allowedon lines carrying hazardous materials.or the damage would havebeenless.

• An expert in process safety was involved during design, as he would have drawn attention to many of these items.
• The plant had been laid out so that vehicles delivering supplies did not have to pass close to operating equipment.
• There had been better control of vehicle movements. Did the designers know that diesel engines can ignite leaks of flammable vapour?
• The fire protection had been better.

Figure A2.1 An example of an accident chain:

Event — Recommendations for prevention/mitigation
Extensive damage — Better fire-protection
Ignition by passing vehicle — Better control of vehicle movements; better layout
Leak — Install gas detectors and emergency isolation valves
Bellows installed incorrectly — Better training of fitters; better inspection after construction; regular inspections
Decision to use bellows for this application — Critical examination of designs by Hazop
Decision to allow use of bellows — Do not use bellows in lines carrying hazardous materials; more involvement by process safety experts in design

There were thus at least ten ways in which the chain of events leading to the damage could have been broken and many people who could have broken it. They were all responsible to some extent and it would be wrong and unfair to pick on one of them, such as the fitter who installed the bellows incorrectly, and make him the culprit. (See the extract from the Robens Report in Section 13.5, page 228.)

Unfortunately, after a serious accident the press think there must be a 'guilty man' who is responsible. There are also managers who lay a smoke screen over their own responsibility by blaming workmen who broke a rule or did a bad job, though they never enforced the rule or checked the workmanship.

2. Accidents are caused by people so we should eliminate the human element whenever possible
We cannot eliminate the human element. If we automate an operation we are no longer dependent on the operator but we are now dependent on the people who design, manufacture, install, test and maintain the automatic equipment. They also make errors. It may be right to make the change, as these people have more time and opportunities than operators to check their work, but we must not kid ourselves that we have removed the human element (see Section 7.2, page 143).

3. Automatic equipment is unreliable; it is better to rely on people
This is the opposite of the second myth. Automatic equipment is not always more reliable than people, and the more we install, the more failures we get. Up to a point, however, the reliability of a trip can be increased to any desired level by more frequent testing or by redundancy or diversity (that is, by duplicating components or providing other components capable of carrying out the same function), and voting systems can prevent spurious trips. Each application should be considered on its merits (see Section 7.5.2, page 143).

4. If operators have adequate time — say, 30 minutes — to carry out a simple task, such as closing a valve after an alarm sounds, we can assume that they will always do so and there is no need for automatic equipment
This is true only when:
• The alarm can be heard.
• The equipment is clearly labelled.
• The operator knows what he is expected to do.
• The operator knows why he is expected to do it. If not, he may consider it unimportant.
• The operator does not have too many more urgent tasks to carry out first.
• The task is within the operator's physical and mental ability (for example, the valve is not too stiff or out of reach).
• A blind eye has not been turned to past non-compliance.
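The point about more frequent testing and redundancy can be made concrete with the standard simplified approximations for a trip system's average probability of failure on demand (PFD). This is a minimal sketch: the formulas are textbook low-demand approximations from functional-safety practice, not from this book, and the failure rate and test interval are invented for the example.

```python
# Illustrative only. lambda = dangerous undetected failure rate, T = proof-test
# interval; both values below are assumptions chosen for the example.

def pfd_single(failure_rate, test_interval):
    """Single (1oo1) trip: PFD ~ lambda * T / 2, the average fraction of the
    test interval the trip spends in an unrevealed failed state."""
    return failure_rate * test_interval / 2

def pfd_duplicated(failure_rate, test_interval):
    """Duplicated (1oo2) trip with independent failures: PFD ~ (lambda * T)**2 / 3,
    since both channels must be failed at the moment of demand."""
    return (failure_rate * test_interval) ** 2 / 3

lam = 0.1   # dangerous failures per year (assumed)
T = 1.0     # proof-test interval in years (assumed)

print(pfd_single(lam, T))       # 0.05  -> failed on about 1 demand in 20
print(pfd_single(lam, T / 2))   # 0.025 -> testing twice as often halves the PFD
print(pfd_duplicated(lam, T))   # ~0.0033 -> duplication gives a larger gain
```

Halving the test interval halves the chance that the single trip is in a failed state when a demand arrives; duplication gives a much larger gain, at the cost of more spurious trips unless a voting arrangement is used — which is the trade-off the myth ignores.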

6. The technical people know their jobs
The safety adviser in a high-technology industry can leave the technology to them and attend to human problems. This myth is believed by those company heads who make the safety advisers responsible to the human resources manager (and by those safety advisers who do not understand the technology and take an interest in simple mechanical accidents only). Unfortunately, experience shows that many technical people are not familiar with the accidents of the past and the actions necessary to prevent them happening again. It should be one of the safety adviser's jobs to see that they are trained when they arrive and continually reminded and updated throughout their careers.

7. The best workers are those who get things done quickly
Not always. When hardware, methods of operation or organizations have to be changed, the best workers are those who first look critically and systematically at proposed changes to see if there will be any undesirable consequences. The 'go-getter' who rushes may repent at leisure.

8. Following detailed rules and regulations can prevent accidents
In some countries, including the US, this is widely believed and the authorities write books of detailed regulations that look like telephone directories though they are less interesting to read (and probably consulted less often). In the UK, since 1974, there has been a general obligation to provide, so far as is reasonably practicable, a safe plant and system of work and adequate instruction, training and supervision, though advice and codes of practice are available. The regulations set goals that must be achieved but do not specify exactly how this must be done. The UK system is better because:
• It is impracticable to write detailed rules for complex and changing technologies.
• Codes and advice can be updated more quickly than regulations.
• Employers cannot say, 'It must be safe as I am following all the rules.'
• Regulators do not have to prove that a rule has been broken. It is sufficient for them to show that an employer has not provided a safe plant or system of work.

especially Section 8. In some companies the reportpointsthe fingerat thepersonattheend ofthechain.7. There is some truthin this. using differenthose connectors on compressed air and nitrogen lines (see Section 2. • He is pigheaded. as shown by the fact that UK accidentshave fallen yearby year though the numberof cars on theroad has increased. 9. people compensate byworkingless safely. He requires extra training or reprimanding. Butnot allpeople do. Sometimes theyarenotawareof cheap minorchangesthat could improve safetyatlittleornocost. • You break therules. From nowon we should concentrateon changing human behaviourand on management. 10. An accidentreport maytell us more aboutthe cultureofacompany and the beliefs of the writer than about the most effective means of prevention. 11. They keep the risk level constant. We have done all we can to improve equipment. page 162). page31) or usingvalves whoseposition(openor closed) is visible at a glance. So far as safetyis concerned. unsafe acts or unsafe equipment This used to be a common belief among old-stylesafetyofficers. or seat belts are made compulsory. • You arestubborn. This is oftenheard but is nottrue. as the casehistories in thisbookhave shown. For example. Accidents are due to either human failureor equipment failure— that is. somepeople compensate by drivingfaster or taking other risks. In fact.Inindustry many accidents are notunder the controlofoperators at all. They occur as the result ofbaddesignor ignorance ofhazards. • He is tryingto wreckthejob. If we reduce risks by betterdesign. who opened the wrong valve. They might suggest improving the labelling (though there may be no system for checkingthat labels havenot vanished). Sometimes they arenot aware of fundamentally differentdesign options such asinherently saferdesigns (seeChapter8. we have: • I show initiative. If roadsand cars are made safer. most accidentscould have been prevented in many ways and many people had the opportunity to preventthem.6. 
We are still producingpoor designs because thedesigners arenotawareofthealternatives.APPENDIX 2 — SOME MYTHS OF HUMAN ERROR • lamfirm. Inother companies theyask whythere 265 .

was no trip, interlock or relief valve to prevent such a simple slip having such serious consequences. In only a few companies do people ask if the hazard could be removed and why that was not done. Consider a simple example: someone falls into a hole in the road. At one time the man or men who dug the hole would simply have been blamed for not fencing it. Today many companies are less simplistic. They ask if responsibilities were clear, if the fencing was readily available and everyone knew where it was kept. They also ask if holes had been left unprotected in the past and if no-one had said anything. In only a few companies, however, will someone ask why it was necessary to dig a hole in the road. Why not run pipes and cables above ground where access is better and corrosion less? If it was essential to run a new cable or pipe under the road, why not bore a hole under the road? Why not install conduits under the roads when plants are built? We can't fall into holes that aren't there.

12. The result of an accident measures the degree of negligence
This is widely believed by the general public and the police, though it is obviously untrue. If someone throws a scaffold pole off the top of a structure, instead of carrying it downstairs, this might kill someone, injure someone, cause damage or have no effect. The reaction of other people, including the police, would however be very different, but the degree of negligence is the same.

13. In complex systems, accidents are normal
In his book Normal Accidents5, Perrow argues that accidents in complex systems are so likely that they must be considered normal (as in the expression SNAFU — System Normal, All Fouled Up). Errors or neglect in design, construction, operation or maintenance, component failure or unforeseen interactions are inevitable and will have serious results, especially when systems are tightly-coupled — that is, changes in one part produce results elsewhere. Complex systems, particularly nuclear power plants, which are very complex and very tightly-coupled, are accident-prone. His diagnosis is correct but not his remedy. His answer is to scrap those complex systems we can do without, and try to improve the rest. He does not consider the alternative, the replacement of present designs by inherently safer and more user-friendly designs (see Section 8.7 on page 162 and Reference 6) that can withstand equipment failure and human error without serious effects on safety (though they are mentioned in passing and called 'forgiving'). He was writing in the early 1980s so his ignorance of these designs is excusable, but the same argument is still heard today.

14. To protect people from a deluge of time-wasting, unnecessary paper we should tell them only what they need to know
In some organizations employees are given the information they need to do their job, but no more. This system has snags. We don't know what we need to know until we know what there is to know. Engineers need to build up a ragbag of bits and pieces of knowledge that may one day be useful. New ideas come from a creative leap that connects disjointed ideas, from the ability to construct fruitful analogies between fields7. If our knowledge is restricted to one field of technology or company organization we will have few new ideas. Creative people often 'belong to the wrong trade union': if a civil engineer, say, has a new idea in, say, control engineering, the control engineers may be reluctant to accept it. They will never do this if they are restricted (or restrict themselves) to what they need to know.

15. If the answer to a problem is correct, the method used to obtain it will be correct
The following is said to have been set as an examination question in the days when there were 20 shillings (s) in the pound and 12 pence (d) in the shilling8:

Make out a bill for the following:
¼ lb butter @ 2s 10d per lb
2½ lb lard @ 10d per lb
3 lb sugar @ 3¼d per lb
6 boxes matches @ 7d per dozen
4 packets soap flakes @ 2½d per packet

One boy added up the prices on the right and got the correct answer (4s 8¾d).

A more technical example: if we assume that the Earth absorbs all the radiation from the sun that falls on it, then the average surface temperature works out as about 7°C, in good agreement with experience. However, about a third of the radiation is reflected by the clouds and if we allow for this the average temperature becomes −18°C. The heat produced by radioactivity makes the actual temperature higher9.

16. The purpose of an accident investigation is to find out who should be blamed
This is widely believed by the public at large, newspaper reporters and the police. If you have got this far and still believe it, I have failed in my task.
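The coincidence in the examination bill can be checked in a few lines of exact arithmetic. One assumption to flag: the butter quantity is read here as ¼ lb (the OCR of this copy is unclear, and ¼ lb is the reading under which the figures balance); everything else follows the bill as printed.

```python
from fractions import Fraction as F

# The bill, priced in old pence (12d = 1s): (quantity, unit price in pence).
items = [
    (F(1, 4), F(34)),     # 1/4 lb butter @ 2s 10d (= 34d) per lb  [assumed reading]
    (F(5, 2), F(10)),     # 2 1/2 lb lard @ 10d per lb
    (F(3),    F(13, 4)),  # 3 lb sugar @ 3 1/4d per lb
    (F(1, 2), F(7)),      # 6 boxes of matches = 1/2 dozen @ 7d per dozen
    (F(4),    F(5, 2)),   # 4 packets soap flakes @ 2 1/2d per packet
]

def as_s_d(pence):
    """Convert pence to shillings-and-pence form."""
    s, d = divmod(pence, 12)
    return f"{int(s)}s {float(d)}d"

correct_total = sum(q * p for q, p in items)  # the proper bill: quantity x price
boys_total = sum(p for _, p in items)         # the boy's method: add the price column

print(as_s_d(correct_total))  # 4s 8.75d  (i.e. 4s 8 3/4d)
print(as_s_d(boys_total))     # 4s 8.75d  -- same answer, wrong method
```

Both methods give 4s 8¾d, which is exactly the trap the examiner set: a correct answer is no proof of a correct method.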

References in Appendix 2
1. Jones, S., In the Blood, page 12 (Harper Collins, London, UK), 1996.
2. Kletz, T.A., Dispelling Chemical Engineering Myths (Taylor and Francis, London, UK), 1996.
3. Kletz, T.A., What Went Wrong? — Case Histories of Process Plant Disasters, 4th edition, Chapter 2 (Gulf Publishing Company, Houston, Texas, USA), 1998.
4. Sanders, R.E., Chemical Process Safety: Learning from Case Histories (Butterworth-Heinemann, Boston, MA, USA), 1999.
5. Perrow, C., Normal Accidents (Basic Books, New York, USA), 1984.
6. Kletz, T.A., Process Plants: A Handbook for Inherently Safer Design, 2nd edition (Taylor and Francis, Philadelphia, PA, USA), 1998.
7. Gould, S.J., The Panda's Thumb, page 57 (Penguin Books, London, UK), 1983.
8. Maxwell, E.A., Fallacies in Mathematics, page 9 (Cambridge University Press, Cambridge, UK), 1959.
9. deGrasse Tyson, N., Natural History, April 1999, page 92.

Appendix 3 — Some thoughts on sonata form

Listening to music one evening, after working on this book during the day, it struck me that its argument could be summarized in sonata form. For those unfamiliar with musical terms, the structure of sonata form, widely used in symphonies, concerti and quartets as well as sonatas, is as follows:
• A theme A is stated.
• It is followed by a contrasting theme B.
• Next, there is a development section in which both themes are developed but one may predominate. This section can be as short as a few bars or can take up most of the movement.
• Finally the themes are repeated.
Bridge passages may connect the different sections.

The argument of this book may be expressed as follows:

Theme A
We can prevent human errors by getting people to behave differently.

Contrasting theme B
It is more effective to change the task than to change the person.

Bridge passage
We need to consider the different sorts of human error in a different key.

Development, 1st part
Some errors (mistakes) are due to poor training or instructions: the person who made the error did not know what to do or how to do it. At first theme A is dominant: we can get people to behave differently by providing better training or instructions. But then theme B appears: perhaps we can make the task simpler and then we won't need so much training or such detailed instructions.

Development, 2nd part
Some errors (violations or non-compliance) are the result of someone deciding not to follow instructions or recognized good practice. Again theme A is at first dominant: we can persuade people to follow instructions by explaining the reasons for them and, if that does not work, imposing penalties. But then theme B appears again: perhaps we should make the task easier; perhaps we ignored previous violations and thus implied that the tasks (or the way they are done) are not important.

Development, 3rd part
Some errors (mismatches) are beyond the mental or physical ability of the person asked to carry out the task, perhaps beyond anyone's ability. Theme B is dominant: we should make the task easier. There is just a suggestion of theme A: in some cases we may have to ask a different person to carry out the task in future.

Development, 4th part
Many errors are the result of slips and lapses of attention, particularly if we are busy, tired, distracted or under stress. Theme A starts faintly: someone telling us to take more care, pay more attention. Then theme B interrupts, loudly and clearly telling us that we all make slips from time to time and there is little that we can do to prevent them. We should redesign tasks so that there are fewer opportunities for error (or so that the results of errors are less serious or recovery is possible).

Recapitulation
The two themes are repeated but now theme B is the main one while theme A, though still present, is now less assertive.

Further reading

Chapanis, A., 1965, Man-machine Engineering (Tavistock Press, London, UK) (229 pages). The theme is very similar to the theme of this book but the examples are taken from mechanical and control engineering rather than the process industries.

Edwards, E. and Lees, F.P., 1973, Man and Computer in Process Control (Institution of Chemical Engineers, Rugby, UK) (303 pages). A collection of papers on the role of the operator, interface design and job design.

Brazendale, J. (editor), Human Error in Risk Assessment, Report No. SRD/HSE/R510 (UK Atomic Energy Authority, Warrington, UK) (68 pages). A brief report on methods of classification and prediction.

Health and Safety Executive, 1989, Human Factors in Industrial Safety (HMSO, London, UK) (17 pages). A brief account of the actions needed.

Ergonomic Problems in Process Operation, Symposium Series No. 90, 1984 (Institution of Chemical Engineers, Rugby, UK).

Reason, J. and Mycielska, K., 1982, Absent-minded? The Psychology of Mental Lapses and Everyday Errors (Prentice-Hall, Englewood Cliffs, New Jersey, USA) (263 pages). An account of the mechanisms underlying everyday and more serious slips.

Reason, J., 1990, Human Error (Cambridge University Press, Cambridge, UK) (302 pages). For a shorter account see Reason, J., 1984, Little slips and big disasters, Interdisciplinary Science Reviews, 9(2): 179–189.

Center for Chemical Process Safety, 1994, Guidelines for Preventing Human Error in Chemical Process Safety (American Institute of Chemical Engineers, New York, USA) (390 pages).

A Manager's Guide to Reducing Human Errors: Improving Human Performance in the Chemical Industry, 1990 (Chemical Manufacturers Association, Washington, DC, USA) (63 pages).

Improving Maintenance — A Guide to Reducing Error, 2000 (HSE Books, Sudbury, UK) (74 pages).

Petroski, H., 1982, To Engineer is Human — The Role of Failure in Successful Design (St Martin's Press, New York, USA) (245 pages).

Petroski, H., 1994, Design Paradigms — Case Histories of Error and Judgement in Engineering (Cambridge University Press, Cambridge, UK) (209 pages).

