
Process Safety Engineering

Module 5.3 : Human Factors


CSB video – Tanker Unloading
This video is about offloading a hazardous chemical
from a road tanker: a simple operation that went badly
wrong due to human error.

Make a note of the different types of error, e.g.


 Design faults
 Procedure faults
 Lapses and violations

How could it have been prevented ?


How could the response to the incident be improved ?

11 mins
CSB video – Tanker Unloading
Design faults
Identical couplings, no labelling, no remote shut-off, no
toxic gas detection / alarm.

Procedure faults
PPE locked away, no driver PPE, responder delay

Lapses and violations


Complacency, unsupervised connection and transfer

Prevention – address all the above


Improved response – practice drills, public siren, better
communication with responders
What are Human Factors?
“Human factors refer to environmental,
organisational and job factors, and human and
individual characteristics which influence
behaviour at work in a way which can affect
health and safety.”

Health and Safety Executive (1999)


Reducing Error and Influencing Behaviour
Human Factors in Process Safety Engineering

 Human Factors usually means ‘human error’, e.g. mistakes.
Many underlying causes are possible – physical or mental;
in HF terminology, ergonomics and psychology.
 Process Safety Engineering can consider ergonomics issues
but not psychological issues such as stress, attitudes,
motivation, etc. These are best handled by an HF specialist.
 We will consider why workers make errors and their
underlying causes, and then consider how HF is handled
during Process Safety Engineering.
Resistance to Human Factors
 HF is widely used in nuclear and defence industries (high
consequences of errors) and could contribute to the success
or failure of process safety programs in the chemical process
industries.
 But most of industry has not developed special initiatives to
comprehensively address human factors.
• Human factors has seemed:
• too ambiguous or subjective (difficult for engineers to
handle)
• too involved and comprehensive (time intensive)
• potential for high cost for redesign (if done too late)
• difficult to change human behaviour (if we don’t know
how to)
• unnecessary – what will happen will happen
Too much focus on hardware issues

• e.g. reliability of alarm systems:
– Electrical integrity
– Uninterruptible power supplies

• But what about the reliability of the person who has to
respond to the alarm?
– Will they notice the alarm?
– Will they know what action to take?
– Will they take it?
What are ‘Human Factors’?

Job – Task, work overload, fatigue, procedures, time pressure,
maintenance
Environment – Noise, lighting, heat, design of displays & controls,
ergonomics, labelling and signage
Organisation – Culture, leadership, resources, work pattern,
communication
Personal – Training, competence, skill, personality, attitudes,
risk perception
Causes of incidents at a major
offshore oil operator in 1997 (n = 276)

Violations – 33%
Equipment/technical failure – 28%
Error/omission – 28%
Design failure – 12%

Human factors (violations plus errors/omissions) account for
more than 60% of incidents.

Typical Human Error Rates

Type of operation | Error rate | Example
High level of stress | 9 in 10 | Major emergency
Complicated, non-routine | 1 in 4 | Plant upset with multiple alarms
Non-routine with other duties carried out at the same time | 1 in 10 | Operator recently transferred from another plant
Non-routine without other pressing duties | 1 in 30 | Mental arithmetic error without checking calculation on paper
Routine but requiring care | 1 in 100 | Act of omission, e.g. failure to operate valve when valve status is not displayed in control room
Routine, average task | 1 in 300 | Misreading a label and operating wrong switch
Routine and simple | 1 in 1000 | Operation of the wrong sort of switch (large rather than small)
Very simple operation | > 1 in 10,000 | Selection of a key switch
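As a rough illustration of how rates like those above combine, the chance that a multi-step task fails can be estimated by assuming each step fails independently. This is a simplification (formal human reliability methods such as HEART or THERP also account for dependence and performance-shaping factors), and the function name and example steps below are illustrative only:

```python
# Illustrative sketch: combine per-step human error probabilities into an
# overall task failure estimate, assuming the steps fail independently.

def task_failure_probability(step_error_rates):
    """Probability that at least one step of the task fails."""
    p_all_succeed = 1.0
    for p in step_error_rates:
        p_all_succeed *= (1.0 - p)  # each step must succeed
    return 1.0 - p_all_succeed

# Example: three routine-but-careful steps (1 in 100 each) and one
# routine-and-simple step (1 in 1000).
steps = [1/100, 1/100, 1/100, 1/1000]
print(round(task_failure_probability(steps), 4))  # prints 0.0307
```

Even with apparently low per-step rates, a few steps quickly push the overall failure chance towards a few percent, which is why long procedures need checklists and independent checks.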


Focus on Personal Safety
How is safety performance reported in your company?
“No lost time injuries”
“Zero incidents”
“XX hours worked without injury”

There is often a bias towards personal safety, rather
than major accident prevention:
– e.g. focus on the safety of people carrying out
maintenance, rather than reviewing maintenance error
as an initiator of major incidents
– e.g. focus on the ill-health effects of fatigue, rather than
on fatigue as a cause of human performance issues
Focus of Safety Management

[Figure: plot of event severity against frequency / probability.
Major hazard accidents sit at the high-severity, low-frequency
end, but most of the management system – e.g. performance
measures, audits, behaviour modification – is aimed at the
low-severity, high-frequency end.]
Four Types of Human Error

• Slips – errors of action
• Lapses – forgetting
• Mistakes – applying the wrong rule or
making a wrong decision
• Violations – taking risks

These depend on three types of behaviour:

• Skill-based behaviour
• Rule-based behaviour
• Knowledge-based behaviour
Types of Human Failure

Human Failure
  Skill-based behaviour
    Slips (action errors)
    Lapses (memory errors)
  Rule-based behaviour
    Mistakes
    Violations: routine, situational, exceptional
  Knowledge-based behaviour
    Mistakes
Skill-based Behaviour

Errors made when working in ‘autopilot’ mode.

For example, carrying out routine tasks with little conscious
thought or attention to the task, such as driving a car.

Repetitive operational tasks and some maintenance tasks
are like this. These tasks are vulnerable to error if attention
is diverted even momentarily (e.g. texting while driving).

Skill-based errors can be either slips or lapses.


Slips

• Intend to perform one action but carry out another
• Often described as carelessness
• Characteristic of skilled behaviour
Examples of Slips
A slip is a failure in carrying out an intended action, such as:

• Performing an action at the wrong time


• Omitting a step in a sequential task
• Doing too much or too little (e.g. over-tightening a bolt)
• Doing the right thing to the wrong object (e.g. press the
wrong button)
• Doing the wrong thing to the right object (e.g. misread
an instrument dial)
• Doing something in the wrong direction (e.g. move a
switch up instead of down)
Lapse
A lapse is caused by a loss of memory, i.e. forgetting to
carry out an action, or forgetting our place in a sequence of
actions.
Typical examples are a tanker driver forgetting to
disconnect the loading hose before driving off, or a pilot
forgetting to lower the wheels before landing, or a
maintainer forgetting to replace a component when re-
assembling a pump.

Avoid lapses by minimising distractions and interruptions
during the task, and by providing a reminder, such as a
checklist, for tasks which take time to complete or involve
waiting time.
Lapse – Going the Wrong Way on the Motorway
Rule-based Behaviour
Following a rule like ‘if that happens – then do this’.
An example of a ‘rule-based action’ is ‘if the trip alarm
sounds, then check that valve X is closed and pump Y has
stopped’.
Mistakes occur when we do the wrong thing believing it to
be right, even when there is clear evidence that we are wrong.

Example: an operator was familiar with filling a tank, which
always took about 30 minutes. But the pipework had been
changed such that it filled more quickly. He ignored the high
level alarm because he thought it must be wrong, and the
tank overflowed.
(How often do we blame the instrumentation ??)
Rule-based Violations
Violations are deliberate breaches of rules or
procedures but are rarely malicious. Examples are
defeating an interlock or bypassing a trip, not
wearing PPE or removing a machinery guard.

Violations can be :
‘routine’ if people believe that the rules are unnecessary or are
never enforced.

‘situational’ if the time, resources or tools are inadequate for the task
or people consider the rules lead to unsafe working.

‘exceptional’ if the task is different from the norm and people decide to
take a risk, believing that the benefit outweighs the risk.
An example is the low-load test of the Chernobyl nuclear reactor,
which led to the explosion.
Knowledge-based Behaviour

When there is no ‘rule’ to work to or a


known ‘rule’ does not seem to apply.

It’s really about problem solving, where an operator has to
work out a plan of action from first principles or from
experience of similar situations.

Mistakes can happen where the operator has to consider


multiple sources of information that may be incomplete,
possibly spurious or even conflicting -
– and must apply considerable mental effort to arrive at a
diagnosis of what is happening and find a suitable solution.
Exercise 7 - Case Study
Formosa Plastics VCM Incident

• Explosion at Formosa
Plastics VCM facility in
2004
• 5 Fatalities
• Plant didn’t reopen
(employed 139)
• Human and system errors
caused incident

Hold at 05:42
Formosa Plastics VCM Incident

Which is an example of a slip, lapse, mistake or violation?

1. Equipment configuration and layout

2. Employees rushing towards incident

3. Locating emergency air supply near reactor safety valve

4. Operator turning wrong way at base of stairs

5. Operator bypassing reactor safety valve

Ways to Reduce Human Error

1. Always assume that human error will occur


2. Provide multiple means of warning of error in the early
stages of the event sequence
3. Provide means of stopping an error before it becomes critical
4. Design with layers of defence that control error outcomes
5. Incorporate human factors into inherently safer design
practices, management practices, and into improvements in
the work environment
6. Ensure human factors are embedded into the culture
Examples of How to Avoid Human Errors
• Opening equipment under pressure
– Locate the pressure gauge (PG) and vent valve adjacent to the door/hatch
– Avoid need to stand in front of door/hatch when opening it
– Mechanically interlock handle to the pressure and vent valves
– Design the door/hatch to open slightly before opening fully.
• Operating the wrong valve / Pressing the wrong button
– Colour coding and labelling
– Avoid ambiguous or confusing layout
• Testing for leaks of toxic / flammable substances
– Assume that a leak will occur and wear appropriate PPE
• Non-routine activities
– Use checklist and second signature
• PPE not suitable for the task, e.g. in confined space
– Re-design the task

Human Factors in HAZOP
Given the number of industrial accidents where human failures are a
major contributor, there is a wide consensus that HAZOP should
include HF. The question is ‘how’ ?

‘Procedural HAZOP’ (sometimes referred to as ‘Human HAZOP’) is one
approach, in which all the operating steps are reviewed in detail and
those considered to be safety critical are selected for HAZOP.

‘Safety critical’ means having the potential to cause or limit the


escalation of major accident hazards (MAH), generally associated with
‘loss of containment’ with serious consequences, e.g. a major injury
on-site or worse.

They are then examined in more detail using the ‘Procedural HAZOP’
guidewords.
Procedural HAZOP Guidewords

Guideword | Prompt
No/none | Not completed at all
More/less | Too fast/much/long; too slow/little/short
Reverse | In the wrong direction
Sooner/later | Too early/too late; at the wrong time; in the wrong order
Part of | Partially completed
Other than | On the wrong object
As well as | Wrong task selected; task repeated
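The guidewords above are applied mechanically to every safety-critical step, which means they can be held as data and iterated over. A minimal sketch of that idea follows; the structure and names are illustrative only, not taken from any standard HAZOP tool:

```python
# Illustrative sketch: hold the Procedural HAZOP guidewords as data and
# generate one review question per prompt for a given task step.

GUIDEWORDS = {
    "No/none":      ["Not completed at all"],
    "More/less":    ["Too fast/much/long", "Too slow/little/short"],
    "Reverse":      ["In the wrong direction"],
    "Sooner/later": ["Too early/too late", "At the wrong time",
                     "In the wrong order"],
    "Part of":      ["Partially completed"],
    "Other than":   ["On the wrong object"],
    "As well as":   ["Wrong task selected", "Task repeated"],
}

def review_prompts(step):
    """Yield one review question per guideword prompt for a task step."""
    for guideword, prompts in GUIDEWORDS.items():
        for prompt in prompts:
            yield f"{guideword}: could step '{step}' fail as '{prompt.lower()}'?"

for question in review_prompts("Close valve X"):
    print(question)
```

This mirrors how the team works through the table in a session: every guideword is considered for every safety-critical step, so no deviation mode is skipped by accident.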
Other Ways of Studying Human Factors
The examination of human factors in the HAZOP should be
regarded as a first pass through the issues.
It is not the only way in which human factors can be assessed.

• 3-D model walk-throughs can also highlight improvements


to the plant’s layout, maintainability and constructability.

• SIL/LOPA analysis can identify the need for new alarms and
interlocks or re-organising tasks.

• Emergency procedures can sometimes provide insights into


task content and sequences.
Alarm Issues

Why do alarms and trips so often fail to protect against hazards?

 Alarms sounding too frequently
 Poor choice of alarm priorities
 No response defined for alarms – the operator has to work out the cause
 Insufficient time to respond to an alarm (e.g. set too close to the trip
setting)
 Alarms and trips known to have failed but not repaired
 Alarms not known to be inoperable (left in test position or overridden)
 Alarms reset manually instead of automatically (inactive if not reset)
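The "insufficient time to respond" issue above comes down to simple arithmetic: the margin between the alarm and trip setpoints, divided by the worst-case rate of change of the process variable, must exceed the time the defined operator response takes. A hedged sketch of that check (the function names and example numbers are illustrative, not taken from EEMUA 191):

```python
# Illustrative sketch: flag alarms set too close to the trip setting to
# allow the operator's defined response to be completed in time.

def available_response_time(alarm_sp, trip_sp, rate_of_change):
    """Time (minutes) between the alarm activating and the trip point,
    given a worst-case rate of change in %/min (or any consistent units)."""
    return abs(trip_sp - alarm_sp) / rate_of_change

def alarm_ok(alarm_sp, trip_sp, rate_of_change, required_response_min):
    """True if the operator has at least the required response time."""
    return available_response_time(alarm_sp, trip_sp, rate_of_change) >= required_response_min

# Example: high-level alarm at 80%, trip at 90%, level rising at 2%/min,
# defined operator response takes 10 minutes.
print(available_response_time(80, 90, 2))  # prints 5.0 (minutes available)
print(alarm_ok(80, 90, 2, 10))             # prints False: set too close to trip
```

With only 5 minutes available against a 10-minute response, either the alarm setpoint must move earlier or the response must be redesigned, which is exactly the review EEMUA 191 calls for.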
Alarm Management

Guidance on alarm design and management is provided by EEMUA 191:


‘Alarm systems - a guide to design, management and procurement’.

• Alarms should direct the operator’s attention towards plant


conditions requiring timely assessment or action;
• Alarms should alert, inform and guide required operator action;
• Every alarm should be useful and relevant to the operator, and have a
defined response;
• Alarm levels should be set such that the operators have sufficient time
to carry out their defined response before the plant condition
escalates;
• The alarm system should accommodate human capabilities and
limitations.
Key Learning Points

Possible reasons why the industry has been slow to adopt
Human Factors:
• It is too time consuming.
• It involves extra costs for redesign - especially if it is done
too late.
• Engineers are trained to understand how ‘things’ behave
but not ‘people’.
• It is too subjective so there is no ‘right’ answer.
• Difficult to change human behaviour – especially if we
don’t know how.
• It is unnecessary – what will happen will happen.
