HUMAN ERROR I
Error Classification & HFACS
Christopher Tan
PSY 340 | christopher.tan@help.edu.my
OVERVIEW
• Human error classification
o Omission, commission
o Slips, lapses, mistakes
o SRK-based errors
• Errors, near misses, and accidents
• Factors contributing to errors
o Individual vs. system factors
o Reducing errors
• HE Models
o Person vs. system approaches
o Swiss Cheese & HFACS
HUMAN ERROR
What is human error?
Human Error (sometimes “operator error”)
• “Actions outside the bounds of expected or acceptable performance for a situation or
system” – Miller & Swain (1987)
• “Inappropriate human behaviour that lowers levels of system effectiveness or safety” –
Wickens (2003)
• May or may not result in accident, damage, injury, etc.
• Often unintentional
HUMAN ERROR
What is human error?
Analysed in…
• White collar workspaces
• Manufacturing (assembly, QC)
• Healthcare (doctors, nurses)
• Transportation (aviation, cars, rail, ships)
• Power plants
• Machinery/Computers
HUMAN ERROR
What is human error?
Examples:
Doors
• Push vs. pull, walking into glass
Appliances & devices
• Stoves, light switches, gadgets, applications
Miscommunications
‘Everyday slips’
HUMAN ERROR
What is human error?
Examples:
Driving
• 94% of serious crashes due to human error (NHTSA) → avoidable
• Causes include DUIs, speeding, distracted driving, fatigue, traffic law violations, etc.
Aviation
• 80% of aviation accidents caused by human error (pilots, air traffic controllers, mechanics, etc.)
• Only 20% due to machine/equipment failures
Shipping
• Commonly judgement errors, poor watchkeeping, non-compliance
• Analysis of 100 accidents at sea (Dutch Shipping Council)
• 96 of 100 accidents involved human error → people involved could have prevented accidents
• BUT human error only made up 15% of cited causes (345/2,250 causal factors)
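As a quick check on that last bullet (not an additional finding, just the arithmetic behind the cited proportion):

$\frac{345}{2{,}250} \approx 0.153 \approx 15\%$

So human error was present in nearly every accident, yet it made up only a small fraction of the individual causal factors cited.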
HUMAN ERROR
What is human error?
Examples:
Healthcare
• Doctors, nurses, pharmacists
• In the U.S., 44,000 preventable deaths per year due to human error
• Of which, 7,000 deaths attributable to doctors’ sloppy handwriting (3-4 billion prescriptions annually)
• Case example:
o “Woman given erectile dysfunction cream for dry eye” (BBC, 2019)
o Doctor changes a dosage from 25mg to 50mg; pharmacist dispenses 250mg, and a 14-year-old boy with leukemia dies
o “Isordil” misread as “Plendil”; patient dies within a week due to complications
HUMAN ERROR
What is human error?
“Isordil” or “Plendil”?
HUMAN ERROR
Towards a systems perspective
• The perception of error is changing
• From human as cause of error (i.e., operator error model) to problems within system as a whole
o Similar to Reason’s (1990) Person Approach vs. System Approach (discussed later)
• Reducing HE involves ensuring human-system compatibility
o In general, accidents do not occur → humans adapt to and cope with poor system design
o But performance can be improved; error likelihood reduced
o Also recall: depletion of cognitive & attentional resources; stress
HUMAN ERROR
Towards a systems perspective
• Approx. 75-95% of industrial accidents → caused by human error (Norman, 2013)
o Machine error is low
o Operator incompetence?
o Systems must be intuitive to reduce error
• When things go wrong, what is our response?
o Built structures collapse
o Electronic devices malfunction
o Vehicles break down
• BUT when accidents are thought to be caused by humans?
o Blame & shame
o Often no attempt to remediate system
HUMAN ERROR CLASSIFICATION
Commission vs. Omission
• Errors can occur at input, process, and output stages
• Initial focus → on behavioural aspects of HE (i.e., output error)
• Errors of commission vs. omission (Swain, 1963)
Commission
• Inappropriate action executed (e.g., incorrect, out of sequence, not timely, lacking quality)
• Possible causes: inadequate training, poor instruction, misunderstanding the system, etc.
• E.g., pressed wrong button, selected incorrect mode, took wrong turn
Omission
• Action/behaviour that should be performed is not executed (i.e., forgotten, skipped)
• Possible causes: misunderstanding the system, confusion, distraction, low vigilance, etc.
• E.g., not wearing safety belts, skipping steps on checklist, missing out data entry
HUMAN ERROR CLASSIFICATION
Slips, lapses, & mistakes
• Error evaluation now focuses more on cognitive processes than merely behavioural aspects
• Distinction between slips & lapses (execution error) & mistakes (planning error)
Slips (commission; unintentional; output phase)
• Error in intended execution: understanding & intention are correct, but execution is wrong
• Examples:
o Typing off-position, wrong gear shift, slips of the tongue
Lapses (omission; unintentional; output phase)
• Failure to execute the appropriate action
• Examples:
o Sending email without attachment, not using parking brake, forgetting protocol/procedural steps (e.g., medical staff), car repair instances
Mistakes (commission; intentional; input & process phases)
• Understanding & intention are incorrect; the wrong behaviour is executed correctly/as intended
• Wrong goal/plan established via deliberate and conscious thinking
• Examples:
o Turning into 1-way street, late assignment submission, taking a longer route
HUMAN ERROR CLASSIFICATION
SRK-based errors
• Skill-, rule-, and knowledge-based errors (Rasmussen, 1974; 1983; Reason, 1990)
o Errors occur when actions are inappropriate to the situation
Skill-based errors
• Incorrect execution of overlearned & automatic behaviour
• Usually slips & lapses
Rule-based errors
• Misclassification of the situation; application of the wrong rule
• Often occurs when there are exceptions to the rule
o “If situation A, then invoke rule Y” (but the situation is actually not A)
• Examples:
o Road system in U.S. vs. everywhere else (and unintuitive roads), doctor prescribing incorrect dosages to target patient group, cooking steaks of different thicknesses
Knowledge-based errors
• Novel situations require a novel response (i.e., no known skills/rules to follow)
• Caused by knowledge deficit/misapplication; wrong understanding of situation/cues
• Also due to cognitive limitations, biases, etc.
• Examples:
o Emergency landings, surgery, firefighting
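To keep the three SRK levels apart, here is a minimal illustrative sketch (in Python) that frames Rasmussen's distinction as a rough decision rule about how routine or familiar the situation is. The function name, its inputs, and the printed examples are hypothetical teaching aids, not part of Rasmussen's or Reason's formulation.

# Illustrative sketch only: the SRK framework as a rough decision rule.
# Category labels follow the slides; the function and its inputs are hypothetical.

def srk_level(is_routine: bool, known_rule_applies: bool) -> str:
    """Return which level of performance (and likely error type) is engaged."""
    if is_routine:
        return "skill-based"       # overlearned & automatic; errors appear as slips/lapses
    if known_rule_applies:
        return "rule-based"        # familiar problem; errors come from applying the wrong rule
    return "knowledge-based"       # novel situation; errors from knowledge gaps, biases, limits

print(srk_level(is_routine=True,  known_rule_applies=True))   # e.g., routine gear shift -> skill-based
print(srk_level(is_routine=False, known_rule_applies=True))   # e.g., unfamiliar road layout -> rule-based
print(srk_level(is_routine=False, known_rule_applies=False))  # e.g., emergency landing -> knowledge-based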
HUMAN ERROR CLASSIFICATION
SRK-based errors
[Figure: SRK continuum. Skill-based performance handles routine, expected tasks (automatic processing); rule-based performance handles familiar problems; knowledge-based performance handles novel, complex, difficult problems (conscious processing).]
HUMAN ERROR CLASSIFICATION
Violations
• Intentional failure; deliberately performing incorrect actions; “illegal”
• Correct understanding of situation
• Intentional formulation and execution of incorrect action
• Often comes with good intentions & deliberate reasoning
o Recall: Workarounds
• Examples:
o Using lift during fire hazard evacuation
o Exceeding speed limit (esp. construction zones)
o Running traffic lights
o Driving in spite of flashing warning symbols
o Signing off on behalf of authority, clients, etc.
HUMAN ERROR CLASSIFICATION
Violations
Case Study: BP Texas City Refinery Explosion
• Planned safety inspections failed to be carried out;
safety regulations not enforced
• Common practice to fill raffinate splitter tower to
99%, even when official procedures specified no
more than 50%
• Day-shift supervisor left work early with no
designated replacement (BP requires an experienced
supervisor to be present)
ERRORS, NEAR MISSES, & ACCIDENTS
• Accidents
o “When something happens unexpectedly without intention leading to consequence of
damage or injury”
o Accidents caused by errors, but not all errors lead to accidents
o Without intention → No intention to cause accident, even if behaviour is intentional
• Near Miss (i.e., close calls)
o When there is error and no accident
o BUT there was a potential accident
o E.g., Almost crashing, took wrong medication, slipped and almost fell
• Also: Incident (Wickens, 2003)
o Occurrence of event that could have resulted in accident, but did not
o Near misses → considered incidents
ERRORS, NEAR MISSES, & ACCIDENTS
Accident & error reporting
• Errors/incidents that result in major negative consequences → very rare
• Most unsafe behaviours cause very little to no loss at all
• BUT reporting near misses is crucial
o Evaluate preceding events to determine causal factors
o Only then can effort be made to reduce error
• Reporting only when accidents occur is often too late; severe damage already done
• Many factors/error combined together lead to major accidents
o Reporting incidents & near misses help to identify and address these individual factors
before they eventually cause accidents
ERRORS, NEAR MISSES, & ACCIDENTS
Challenges in error reporting
• What are potential challenges in error reporting?
o Recall: operator error model
• Reporting incidents, near misses, or errors that did not result in loss
o Incidents have no obvious cost to organization/employee, so why report?
o Reluctance → people get into trouble; punished/fined
o Reporting behaviour is valuable, yet system punishes it
FACTORS CONTRIBUTING TO ERROR
• Errors occur at various stages (i.e., input, processing, output)
o Therefore, errors must have various causes
• Causes of error may arise from the individual or the system (Miller & Swain, 1987)
CONTRIBUTING FACTORS
Individual Factors
• Internal to operators
• Individual differences
• Cognitive capacity and limitations
System Factors
• Components of the system as a whole
• External to operators
FACTORS CONTRIBUTING TO ERROR
Individual factors
• Individual differences
o Personality
o Attitudes
• Human limitations → Cognitive capacity/limitations
o Decision-making
o Information processing
o Memory
• Cognitive processes impacted by additional factors
o Expertise
o Sleep deprivation
o Stress
FACTORS CONTRIBUTING TO ERROR
Individual factors
Expertise
• Performance increases with expertise
• Higher expertise → higher automaticity → more available cognitive resources / reduced
workload
• Experts have more knowledge-based skills to solve novel problems
Sleep Deprivation
• Reduces ability to think systematically
• Impairs memory, perception, concentration, & reaction times
• People acknowledge that SD negatively impacts their performance, but still overestimate their abilities under sleep-deprived conditions (Jones et al., 2006)
• Younger people (20s to 30s) rate the impact of SD on themselves as lower than its impact on older people, but the reverse is actually true (Phillip et al., 2004)
FACTORS CONTRIBUTING TO ERROR
Individual factors
Stress
• Arises when perceived demand on operators exceeds ability to cope
• Stressors → can be environmental, psychological, temporal
• Environmental
o Physical aspects of the environment (e.g., air quality, temperature, lighting, office set-up)
• Psychological
o Workload, cognitive appraisal (e.g., task complexity, high risk, time pressure)
• Temporal
o Fatigue, sleep deprivation, work shifts
Individual differences in stress
• Stress is perceived → individual response to stressor
• Dependent on individuals’ characteristics & expertise
FACTORS CONTRIBUTING TO ERROR
Individual factors
Psychological Stress – Mental Workload
• Task demands on operators’ limited information processing capacity
• Often viewed in relation to operator resources
• “… portion of operator information processing capacity or resources that is actually
required to meet system demands” (Omolayo & Omole, 2013)
• “… the cost of performing a task in terms of a reduction in the capacity to perform…”
(Kramer & Sirevaag, 1987)
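One way to make the two quoted definitions concrete (an illustration, not a formula from the lecture or the cited papers) is to treat mental workload as the fraction of available processing resources a task actually demands:

$\mathrm{MWL} \approx \dfrac{\text{resources required by the task}}{\text{resources available to the operator}}$

On this reading, the overload and underload cases on the next slides correspond to this ratio sitting near (or above) 1 versus staying far below it.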
FACTORS CONTRIBUTING TO ERROR
Individual factors
Psychological Stress – Mental Workload
• Work Overload
o MWL is too high
o Task demands > user capacity; time pressure
o High stress/pressure situations
o E.g., Military operations, emergency flight situations, firefighting
FACTORS CONTRIBUTING TO ERROR
Individual factors
Psychological Stress – Mental Workload
• Work Underload
o MWL is too low (user has to maintain sustained attention in low arousal situations)
o Boredom; fatigue; also prone to cognitive & performance deterioration
o Recall: Vigilance
o E.g., Lifeguard, long-haul pilots/drivers
o E.g., Air traffic controller errors typically occur more during “low-workload shifts” (Stager et
al., 1989)
Both work overload and underload can impair performance and contribute to fatigue
FACTORS CONTRIBUTING TO ERROR
System factors
• How does the larger system affect performance outcomes / likelihood of error?
• Mistakes result from decisions made long before erroneous behaviour – Don Norman
• Systems include interaction of various components:
o People
o Tasks
o Environment
o Tools; technology
o Organizational structure & culture
o Policies, procedures, protocol, regulations, laws
REDUCING ERROR
Generally, reducing error can be achieved via 2 approaches (recall: individual & system factors)
• Individual Factors: Increasing human reliability (through selection & development)
• System Factors: Designing (or redesigning) systems for better human use
INDIVIDUAL FACTORS
Expertise
• Training → Increase knowledge & skill; reduces
error at all SRK levels
• Practice; exposure
Leverage optimal stress
• Yerkes-Dodson Law (1908)
• Experts have higher ‘threshold’ of stress than
novices
• Bridge ‘gap’ by training for expertise or stress
management & coping skills
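A minimal sketch of the Yerkes-Dodson idea, assuming a Gaussian-shaped inverted-U (the law itself is only qualitative, so the curve shape and parameter values here are illustrative assumptions). The expert/novice difference from the slide is modelled simply as a wider, right-shifted curve:

import numpy as np

def performance(arousal, optimum=0.5, tolerance=0.2):
    """Inverted-U: performance peaks at an optimal arousal level and falls off on either side."""
    return np.exp(-((arousal - optimum) ** 2) / (2 * tolerance ** 2))

arousal = np.linspace(0, 1, 5)                               # arousal/stress from low to high
novice = performance(arousal, optimum=0.4, tolerance=0.15)   # narrower tolerance for stress
expert = performance(arousal, optimum=0.6, tolerance=0.30)   # higher 'threshold' before decline

# At maximum arousal, the expert curve stays far closer to its peak than the novice curve
print(novice[-1], expert[-1])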
REDUCING ERROR
INDIVIDUAL FACTORS
Checklists
• Compensate for human information processing (HIP) and memory limitations; reduce load on the human operator
• Helpful with complex systems and tasks; long behavioural sequences
• Used in healthcare, aviation cockpit procedures, construction, finance, etc.
• E.g., Medical checklists → reduced surgery-related complications and deaths by 30%
REDUCING ERROR
SYSTEM FACTORS
Task redesign
• Implement human factors principles into task & work environment
• Prevent overload (excessive workload on operators) and prevent underload (reduced vigilance;
boredom)
o Design displays to be easily interpreted
o Consider MWL in duty scheduling
▪ E.g., Army helicopter flight time limits based on workload; night & low-to-ground flying
imposes higher workload (shorter duty cycles)
o Increase use of automation → operators can focus on more complex aspects of task
o Higher frequency of breaks; shorter shifts → prevents vigilance deterioration
REDUCING ERROR
SYSTEM FACTORS
Task redesign
• Standardized work processes → reduce complexity of tasks (less demands on WM)
o E.g., SOPs, protocol
• Environmental design/conditions
o Temperature
o Lighting
o Workspace
REDUCING ERROR
SYSTEM FACTORS
Warnings
• Alert and supply information that helps prevent harm or dangerous behaviours
o E.g., Car headlight reminder, seat belt light, fire alarms, warning labels, etc.
• Use of warnings sharply increased in the 1980s due to legal risks
o But how much do warnings help?
• Effective warnings → noticed, understood, implementable
• Must consider:
o Characteristics of warning mechanisms
o Target audience
REDUCING ERROR
SYSTEM FACTORS
Warnings (cont’d)
• Usually visual or auditory (sometimes both)
• Visual warnings
o Visible/salient – i.e., size of visual, location, colour (e.g., contrast), movement (e.g., flashing,
blinking)
o Written warnings → readable (e.g., font type & size, contrast)
o Easily interpretable – i.e., universally understood pictures, symbols
• Auditory warnings
o Volume (consider setting in which warnings will be used)
o Messages must be shorter than visual-verbal warnings → working memory (WM) limitations
REDUCING ERROR
SYSTEM FACTORS
Safety Culture
• Safe behaviours as norms
• Incident/near miss reporting
• Open communication around safety issues and concerns
• Top-down support; management as ‘role models’
• Seek solutions > blame
MODELS OF HUMAN ERROR
Approaches to HE (Reason, 2000)
Person Approach Human operator as the main cause of error
System Approach Human error is a consequence of systemic factors
“We cannot change the human condition, but we can change the
conditions under which humans work”
- Reason (2000)
MODELS OF HUMAN ERROR
Approaches to HE (Reason, 2000)
Person Approach
• Focuses on errors of individuals; “unsafe acts”
• Blames people for incompetence, negligence, moral weakness, etc.
o Individuals are the “sharp end” (i.e., front-line operators)
o E.g., Doctors, nurses, pilots, bankers, etc.
• Assumes unsafe acts are due to human fault
o E.g., Negligence, incompetence, carelessness, recklessness, non-compliance
• Countermeasures directed at reducing unwanted human behaviour → blame & shame
o E.g., Disciplinary measures, fines, revocation of license, litigation threats
• Isolates unsafe acts from their system context
System Approach
• Focuses on the conditions under which individuals work (system components and factors)
• Assumes human fallibility as given; errors are consequences of systemic factors
• Countermeasures based on building ‘defences’ within the system to reduce error
• When accidents occur…
o “How and why did defences fail?”
o NOT “Who blundered?”
MODELS OF HUMAN ERROR
Reason’s Swiss Cheese Model
Defence layers:
• Put in place to prevent error & accidents
o Engineered (alarms, walls, auto shutdowns)
o People (surgeons, nurses, control operators)
o Procedures & regulations
• Layers have ‘holes’ that allow errors
[Figure: Swiss Cheese Model. Slices represent defence layers; holes represent system/defence failures; an accident occurs when holes line up across all layers.]
MODELS OF HUMAN ERROR
Reason’s Swiss Cheese Model
Holes in defences are due to:
• Active Failures
• Latent Failures (or latent conditions)
Active Failures
• Unsafe acts committed by operators (humans in direct contact with system)
• Slips, lapses, mistakes, violations
Latent Failures/Conditions
• “Resident pathogens”; may lie dormant and unnoticed within system for long periods
• When holes ‘align’, latent conditions combine with active failures to trigger accidents
• E.g., Unworkable procedures, faulty alarm systems, design & construction deficiencies, lack of
safety culture, poor supervision, etc.
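A small simulation sketch of the model's core logic: an accident trajectory gets through only when holes in every defence layer line up, so (roughly independent) layers multiply the accident probability down. The per-layer failure probabilities below are made-up numbers for illustration:

import random

def holes_align(hole_probs):
    """An accident occurs on this occasion only if every defence layer fails."""
    return all(random.random() < p for p in hole_probs)

# Hypothetical per-occasion failure probabilities: engineered, human, and procedural defences
layers = [0.10, 0.20, 0.05]
trials = 100_000
rate = sum(holes_align(layers) for _ in range(trials)) / trials
print(rate)  # expected near 0.10 * 0.20 * 0.05 = 0.001, far lower than any single layer's failure rate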
MODELS OF HUMAN ERROR
Reason’s Swiss Cheese Model
• Primarily theoretical
• Helps conceptualize how accidents occur – i.e., accident trajectory
• But a little vague
o What exactly do the holes in each layer represent? → Unclearly defined
o How do we analyse specific accidents?
• Operationalized as HFACS
o Human Factors Analysis & Classification System
o Human error framework originally developed for and used by US military aviation (the Navy and Marine Corps), and since adopted more widely
o Used in investigating accidents & identifying patterns of HE to ultimately reduce error
HUMAN FACTORS ANALYSIS & CLASSIFICATION SYSTEM
HFACS (Shappell & Wiegmann, 2000)
HFACS describes 4 levels of failure (i.e., holes in different layers of cheese):
4. Organizational influences (latent failures)
3. Unsafe supervision (latent failures)
2. Preconditions of unsafe acts (latent failures)
1. Unsafe acts (active failures) → accident
HUMAN FACTORS ANALYSIS & CLASSIFICATION SYSTEM
HFACS (Shappell & Wiegmann, 2000)
Unsafe acts
• Errors & violations → committed by frontline operators
• Active failures that directly cause accidents
• Errors:
o Skill-based errors: Execution errors; slips & lapses
o Decision errors: Choose wrong response; mistakes
o Perceptual errors: Sensory input degraded; faulty info
• Violations:
o Routine violations: Habitual; often tolerated (“rule bending”)
o Exceptional violations: Isolated departures from authority; one-off behaviours; not
condoned by management
HUMAN FACTORS ANALYSIS & CLASSIFICATION SYSTEM
HFACS (Shappell & Wiegmann, 2000)
Preconditions to unsafe acts
• “Why do errors & violations take place?”
• Conditions that led to unsafe acts
• Can be categorized as
o Environmental factors
o Conditions of operators
o Personnel factors
HUMAN FACTORS ANALYSIS & CLASSIFICATION SYSTEM
HFACS (Shappell & Wiegmann, 2000)
Environmental Factors
• Physical environment: temperature, lighting, toxins, workspace layout, weather, terrain
• Technological environment: design issues (controls, displays/interfaces, automation)
Conditions of Operators
• Adverse mental state: stress, mental fatigue, loss of situational awareness, non-vigilance, overconfidence, complacency, confusion, misunderstanding, haste
• Adverse physiological state: medical conditions, illness, physical fatigue, sleep deprivation
• Physical/mental limitations (when task demands exceed capabilities): sensory & information processing limitations
Personnel Factors
• Crew resource management: communication, coordination, planning, & teamwork factors (i.e., lack thereof)
• Personal readiness: off-duty activities/factors required for optimal performance (rest requirements, fitness requirements, alcohol restrictions)
HUMAN FACTORS ANALYSIS & CLASSIFICATION SYSTEM
HFACS (Shappell & Wiegmann, 2000)
Unsafe supervision
• Poor managerial decisions & oversights
• Failure to provide adequate guidance, oversight, & training
• Failure to correct/report known problems
o Potentially hazardous issues known but ignored
o E.g., Unsafe behaviours, ignorance of safety precautions, faulty equipment, deficiencies in
expertise/training
• Supervisory violations
o Wilful disregard of rules & regulations
o E.g., Authorize hazardous actions, proceed in spite of insufficient documentation
HUMAN FACTORS ANALYSIS & CLASSIFICATION SYSTEM
HFACS (Shappell & Wiegmann, 2000)
Organizational influences
• Organizational climate → working atmosphere
o Organizational culture & norms
o “How things are done around here”
o Reflected in structure, command chain, communication flow, & policies
• Resource management
o Organization-level decision-making on resource allocation/maintenance (people, financial,
facilities, etc.)
o Usual considerations → safety vs. cost-efficiency; safety rarely wins
• Established operational processes, rules, & procedure → govern work activities
o E.g., SOPs, scheduling, policies, incentive systems, KPIs, production quotas, etc.
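To show how the four levels work together as a coding scheme during an accident investigation, here is a compact sketch. The level and category labels follow the slides (after Shappell & Wiegmann, 2000); the dictionary layout and the example 'finding' are purely illustrative, not an official HFACS tool.

# Illustrative only: the HFACS taxonomy as a nested structure for coding causal factors.
HFACS = {
    "Unsafe acts": {
        "Errors": ["Skill-based errors", "Decision errors", "Perceptual errors"],
        "Violations": ["Routine violations", "Exceptional violations"],
    },
    "Preconditions of unsafe acts": {
        "Environmental factors": ["Physical environment", "Technological environment"],
        "Conditions of operators": ["Adverse mental state", "Adverse physiological state",
                                    "Physical/mental limitations"],
        "Personnel factors": ["Crew resource management", "Personal readiness"],
    },
    "Unsafe supervision": ["Inadequate guidance/oversight/training",
                           "Failure to correct/report known problems",
                           "Supervisory violations"],
    "Organizational influences": ["Organizational climate", "Resource management",
                                  "Operational processes"],
}

# Hypothetical coding of one contributing factor from an incident report
finding = {
    "level": "Preconditions of unsafe acts",
    "category": "Conditions of operators",
    "subcategory": "Adverse mental state",
    "note": "Operator fatigued and working under time pressure (hypothetical example)",
}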
ÜBERLINGEN MID-AIR COLLISION
Case study example
Video link: https://www.youtube.com/watch?v=NlKu7BtMe8I&ab_channel=JohnGood
3:21 – 13:30
38:38 – 44:50
ÜBERLINGEN MID-AIR COLLISION
Case study example
Unsafe Acts
Preconditions to Unsafe Acts
Unsafe Supervision
Organizational Influences
RESOURCES
• Introduction to Human Factors: Applying Psychology to Design (Stone et al., 2017)
• The Human Factors Analysis and Classification System – HFACS. (Shappell & Wiegmann, 2000)
• Reason, J. (2000). Human error: Models and management. British Medical Journal, 320, 768-770.