Safety management system basics
SMS1 | Safety management system basics
Contents
Introduction
Why SMS?
Case study: a tale of two siblings
SMS – what’s in it for you?
Business benefits – parallels between business, safety and quality management
© 2014 Civil Aviation Safety Authority. First published July 2012; fully revised December 2014 (2nd edition)
For further information visit www.casa.gov.au/sms
This work is copyright. You may download, display, print and reproduce this material in unaltered form only (retaining this notice) for your
personal, non-commercial use or use within your organisation. Apart from any use as permitted under the Copyright Act 1968, all other
rights are reserved. Requests for further authorisation should be directed to: Safety Promotion, Civil Aviation Safety Authority, GPO Box 2005
Canberra ACT 2601, or email safetypromotion@casa.gov.au
This kit is for information purposes only. It should not be used as the sole source of information and should be used in the context of other
authoritative sources.
The case studies featuring ‘Bush Aviation and Training’ and ‘Outback Maintenance Services’ are entirely fictitious. Any resemblance to actual
organisations and/or persons is purely coincidental.
7. ‘SMS for small, non-complex organisations’
8. ‘SMS in practice’.

Safety management is not a dark art – its central concepts are simple. In fact, safety management was succinctly described at an ICAO working group as ‘organised common sense’.

‘A safety management system (SMS): a businesslike approach to safety – a systematic, precise and proactive process for managing safety risks.’
Transport Canada

The guidance provided by this resource kit, including the checklists throughout the booklets, is not legal advice, is not a substitute for individual advice, and may not be applicable to everyone’s situation.
A father wants to give his two children a good start to their working lives.

Both are qualified pilots and are keen to run an aviation business together. However, they cannot agree on where to base their business, as one wants to live in Sydney and the other loves Melbourne.

The father purchases two Metro III aircraft and gives one to each child, who both apply for an air operator’s certificate (AOC). On receiving an AOC, the Sydney-based sibling secures some regular contract work flying mining workers to and from regional centres. The Melbourne-based one is able to sign a contract with the Victorian government for regular scheduled services throughout the state. Over the next few months both businesses grow and take on more pilots, ground handling, engineering and maintenance, and administration staff to cope with additional passenger numbers and extra services. Their fleet size also expands to meet these demands.

While each business continues to be successful, their proud father notices their approaches to safety management are very different.

The Sydney-based sibling adopts a formal safety management system (SMS) based on six simple strategies:

1. Appointing one of the best line pilots as a part-time safety officer
2. Regular staff meetings to identify safety risks to the operation and controls to manage these
3. Establishing a confidential safety reporting system for staff to report safety hazards
4. Weekly safety meetings to manage and resolve identified safety issues
5. Central recording and capture of safety information to identify emerging safety risks
6. Regular distribution of safety information to staff, reinforcing a ‘safety-first’ culture.

In contrast, the Melbourne-based operation relies on less formal methods to manage safety. These tend to be ‘on the run’.

Eight months later, the father asks an independent auditor to have a look at each business. While both businesses are financially sound, the auditor finds evidence that the Sydney-based operation has a stronger safety culture than the Melbourne-based one.
This story shows the vital role safety culture plays in the safety and operational
success of an organisation.
A small to medium-sized operator on a limited budget does not have to spend large
amounts of money to improve its safety culture.
In fact, implementing safety management programs will help to improve
operational safety, reducing inefficiencies and leading to reduced operating costs.
According to ICAO (1993), the characteristics of a ‘safe culture’, which should guide decision-makers in modelling corporate safety culture, include the following:

»» Senior management places strong emphasis on safety as part of the strategy of controlling risks
»» Decision makers and operational personnel hold a realistic view of the short- and long-term hazards involved in the organisation’s activities
»» Those in senior positions do not use their influence to force their views on other levels of the organisation, or to avoid criticism
»» Possess a structure which has been designed with a suitable degree of complexity
»» Have standardised procedures and centralised decision-making consistent with organisational objectives and the surrounding environment
»» Rely on internal responsibility, rather than regulatory compliance, to achieve safety objectives
»» Put long-term measures in place to mitigate latent safety risks, as well as acting short term to mitigate active failures.

‘If you are convinced that your organisation has a good safety culture, you are almost certainly mistaken. A safety culture is strived for, but rarely attained. The process is more important than the product.’
James Reason
‘The five key ingredients of an effective safety culture’
James Reason’s model

INFORMED CULTURE: Those who manage and operate the system have current knowledge about the human, technical, organisational and environmental factors that determine the safety of the system as a whole.

REPORTING CULTURE: An organisational climate in which people are prepared to report their errors and near-misses.

JUST CULTURE: There is an atmosphere of trust. People are encouraged (even rewarded) for providing essential safety-related information, but they are also clear about where the line must be drawn between acceptable and unacceptable behaviour.

FLEXIBLE CULTURE: An organisation can adapt in the face of high-tempo operations or certain kinds of danger - often shifting from the conventional hierarchical mode to a flatter mode.

LEARNING CULTURE: An organisation must possess the willingness and the competence to draw the right conclusions from its safety information system and be willing to implement major reforms.
»» adopting scientifically based risk-management methods
»» systematic monitoring of safety performance
»» creating a non-punitive work environment which encourages hazard and error reporting
»» senior management commitment to pursue safety as vigorously as financial results
»» adopting safe practices and safety lessons learned
»» stringent use of checklists and briefings to ensure consistent application of standard operating procedures (SOPs)
»» integrating human factors in safety training to improve error management skills.

Safety policy
The safety policy should include:

»» The overall safety objectives of the organisation
»» The commitment of senior management to provide the resources necessary for effective safety management
»» A statement about responsibility and accountability for safety at all levels of the organisation
»» Management’s explicit support of a ‘positive safety culture’, as part of the overall safety culture of the organisation.

Safety objectives
The safety objectives should state an intended safety outcome—what you are going to do. These objectives may be expressed in terms of short-, medium- and long-term safety goals.

To be able to measure the effectiveness of operational safety objectives, they should be SMART (specific, measurable, achievable and realistic, with a specified timeframe within which they are to be achieved).
For more information, see booklet 2 in this kit.

Generally, the hazard exists now, while the risk associated with that hazard might occur in the future. A large number of white ibis at the landfill centre adjacent to the aerodrome is a present hazard—they are sizable birds. The future risk is that if they are involved in a bird strike, they could cause engine failure and an aircraft crash.
Risk assessment
Risk is the chance (likelihood), high or low, that somebody could be harmed by various hazards, together with an indication of how serious (consequence) the harm could be.

Don’t overcomplicate the process. Many aviation organisations know their hazards well and the necessary control measures are easy to apply. If you run a small organisation and you are confident you understand what is involved, you can do the assessment yourself.

Risk management is fundamental to safety management and involves five essential steps, with communication and consultation throughout:

1. Hazard identification – identify hazards in equipment, procedures, the organisation, etc.
2. Risk analysis (probability) – analyse the likelihood of the consequence
3. Risk analysis (severity) – evaluate the seriousness of the consequence if it does occur
4. Risk assessment and tolerability – is the assessed risk acceptable and within the organisation’s safety performance criteria?
5. Risk control/mitigation – if yes, accept the risk; if no, take action to reduce the risk to an acceptable level.

Poor meal choice
Mike, a captain working for a small Essendon airport-based charter operation, meets some friends at a seafood restaurant. He chooses the curried prawns and does not drink any alcohol. As the night wears on, Mike starts to feel unwell and leaves, going to bed early. However, he is up for most of the night with food poisoning and manages to get only two hours sleep. He arrives at work early the next morning dehydrated and fatigued, and does not pay enough attention to the NOTAMs forecasting low cloud and thunderstorms en route. Mike is forced to divert around the ‘unexpected’ weather and, with the extra miles tracked, nearly runs out of fuel.

Mike made a number of errors (unsafe acts). He chose to come to work knowing he was not fit for duty (mistake) and he paid little attention to the NOTAMs (slip). His errors resulted from fatigue (workplace condition).

However, as with most incidents, there is more to it than that. During investigation, we discover that Mike’s fellow pilots also admit to coming to work not fit for duty, and not declaring it, because of management pressure not to call in sick because of a shortage of pilots. So it’s not just Mike. His not declaring he was unfit for duty can now be considered as a routine violation (cultural practice).
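The five steps above are often operationalised with a simple likelihood–consequence matrix. The sketch below is illustrative only: the rating scales, scores and tolerability bands are assumptions for demonstration, not values prescribed by CASA or ICAO.

```python
# Illustrative risk matrix: combines a likelihood rating and a
# consequence rating into a risk level. The scales and tolerability
# bands are assumed for demonstration, not CASA-prescribed values.

LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "certain": 5}
CONSEQUENCE = {"negligible": 1, "minor": 2, "major": 3, "hazardous": 4, "catastrophic": 5}

def assess_risk(likelihood: str, consequence: str) -> str:
    """Steps 2-4 of the process: analyse probability and severity,
    then judge tolerability against (assumed) criteria."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 15:
        return "intolerable - take action to reduce the risk"
    if score >= 6:
        return "tolerable - manage with controls and review"
    return "acceptable - accept and monitor"

# Example: the ibis hazard at the landfill next to the aerodrome
print(assess_risk("possible", "catastrophic"))  # intolerable - take action...
```

A small operator could substitute its own scales and criteria; the point is that the judgement at step 4 is made against criteria decided in advance, not invented per hazard.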
1. Organisational factors – the organisation establishes the work practices and environment. Organisational processes can affect safety through:
»» robust, clear work procedures
»» providing appropriate time and resources to do the job
»» providing adequate and appropriate supervision or training
»» positive organisational culture.

Example: Poor pilot induction training can result in inadequate knowledge of company procedures.

2. Workplace conditions – task, equipment, environment or human limitations that increase the likelihood of human error. These error-producing conditions can include:
»» inappropriate, poor or faulty equipment
»» high workload
»» unfamiliar tasks
»» fatigue
»» excessive noise or temperature
»» inclement weather
»» use of prescribed medications or alcohol and other drugs (AOD)
»» personal or financial stress
»» lack of proficiency.

Example: An airport closed due to fog means flight crew must make a decision about the best alternate airport.

3. Unsafe acts – actual errors or violations made by those doing the job. Unsafe acts are usually the last elements of the chain of accident causation and include:
»» operating equipment outside limitations
»» forgetting a crucial step in a procedure
»» misdiagnosing a problem
»» wilfully breaking a work-related rule or procedure.

Example: The flight crew incorrectly calculate the fuel required to divert to the chosen alternate airport.

As well as these three elements, there is a critical fourth area: defences.

4. Defences are barriers or safeguards against errors, and can range from hard-engineered safety devices (seatbelts, electronic warning and detection systems) to soft defences, such as standard operating procedures (SOPs), or raising staff awareness through education or training programs. There are usually multiple defences within any system.

Example: The flight operations policy of loading additional fuel ensures that the incorrect fuel calculation does not result in fuel starvation during aircraft diversion.

On their own, each of the four types of failures will not usually result in an accident. However, a breakdown at each failure level can create the opportunity for an accident to occur.

Flawed defences: ‘Swiss cheese’
James Reason’s approach to accident causation is often referred to as the ‘Swiss cheese’ model. The model illustrates that an organisation’s defences (slices of Swiss cheese) move around constantly, but if their holes align a hazard can pass through multiple layers of defences (or slices of cheese).
[Diagram: Swiss cheese model – hazards pass through aligned holes in successive defences, resulting in an accident]
According to the Swiss cheese model, some of the holes in defences are due to errors (active failures) made by employees who are typically on the front line.

Other holes in the defences are caused by organisational factors (latent conditions), or other error-producing conditions in the workplace.

The Swiss cheese example suggests that no defences are perfect. However, the critical task in maintaining safety is to find the holes in the defences, and build stronger and better layers of defence.

The following airline safety incident illustrates the Swiss cheese model:

What life raft?
In August 1998, a Boeing B737-300 aircraft was diverted to Adelaide due to poor weather at Melbourne Airport. During the overnight service in Adelaide, the engineering and maintenance staff performed an over-water-return check on the aircraft, which should have included the removal of only one life raft. However, due to high workload and the unfamiliarity of Adelaide engineering staff with the permanent life raft modification program, all three life rafts on the aircraft were removed instead.

The aircraft then operated to Sydney, via Melbourne, where another over-water preparation check was made before the aircraft flew the Sydney to Wellington service. This check normally included an inspection of the two permanent life rafts and the loading of one additional life raft. However, while the usual process of fitting the additional life raft took place, the engineering staff did not check to see if the two permanent life rafts were fitted, as they assumed that the permanent life raft installation program had been completed.

Before departure, the captain completed his pre-flight walk around, which included checking to ensure that all life raft equipment was on board. This involved looking through a narrow inspection or viewing hole. Shortly after boarding, the customer service manager (CSM) received a report from two flight attendants that the emergency equipment, including life rafts, had been checked. The aircraft subsequently flew over water to Wellington without the legally required life rafts.

This Boeing 737-300 incident is significant if you consider the implications of a trans-Tasman Sea ditching without sufficient life rafts.
If we apply the four elements of the Swiss cheese model, we can quickly see that the incident involved more than just a series of errors by aircrew:

Absent/failed defences – life raft removal and maintenance was usually carried out in either Melbourne or Sydney. Because the aircraft was diverted to Adelaide, the engineers were not familiar with the procedure. The wording of the engineering instruction: ‘remove all over-water equipment’ was also misleading. The engineering system was not flexible enough to cope with a change in normal procedures and so this defence failed. The aircraft technical log also did not indicate that the life rafts had been removed.

Unsafe acts – the captain, the cabin crew and the engineer in Adelaide all made errors. The captain said he had inspected the life rafts, but for some reason, missed that they had been removed. The CSM relied on the information provided by two cabin crew members that they had checked the life raft equipment correctly. The Adelaide engineer misunderstood the engineering instructions, relying on how the procedure used to be done, and as a result, removed all three life rafts instead of the required one.

Workplace conditions – fog in Melbourne, and the subsequent diversion to Adelaide, set up the incident to occur. High workload is a common workplace factor, often increasing the likelihood of human error. Engineering staff in Adelaide faced a high workload with unscheduled maintenance on several aircraft diverted from Melbourne. The cabin crew bus was 15 minutes late, meaning they were under time pressure to complete all their checks before passengers boarded. A misleading placard located adjacent to the raft inspection hole also stated: ‘life rafts permanently fitted’. This might have created an expectation among the crew that the life rafts were never removed. To complicate matters further, the design of the raft inspection hole was poor. It was very narrow, and crew members had to position themselves directly beneath the hole to view the contents of the overhead bin clearly.

Organisational factors – the investigation revealed a number of deficiencies in crew training on how to check emergency equipment, as well as the checking procedures themselves. For example, the technical crew manual indicates how many life rafts are required on a B737, but does not lay down a set procedure for checking them. The crew also reported vast differences in their emergency equipment check training. Senior management’s decision to modify the life raft equipment on the B737 over an extended period increased the opportunity for error.

The life raft incident clearly illustrates breakdowns at each failure level; but had just one of those failures not occurred, the outcome might have been different. For example, despite engineering staff in Adelaide misinterpreting the life raft removal procedures, the operating crew in Sydney had the opportunity to identify the lack of life rafts.

The life raft incident shows that one person alone usually does not cause an accident. Rather, an accident is the result of a combination of failures, not just by crew, but throughout the entire company and beyond. While you may have limited control over the actions of others, there are many things you can do to prevent the holes in the Swiss cheese lining up.

Reference
‘What life raft?’ Edkins, G. (2001).

Swiss cheese model for the life raft incident

Organisational factors
»» Training deficiencies in life raft checking procedures
»» Protracted life raft modification process

Workplace conditions
»» Poor weather at Melbourne
»» Time pressure on aircrew and engineers
»» Placard read: ‘life raft permanently fitted’

Unsafe acts – errors & violations
»» Captain missed the life raft removal
»» Cabin crew missed the life raft removal
»» Adelaide engineer misunderstood the job task card

Defence failures
»» Adelaide engineering staff unfamiliar with ‘over-water return’ procedure
»» Ambiguous wording of entry into service (EIS) instruction
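Reason’s central point, that an accident needs the holes in every defence layer to line up at once, can be illustrated numerically. The sketch below is illustrative only: the per-layer failure probabilities are invented, and the layers are loosely named after the life raft incident’s defences.

```python
# Swiss cheese illustration: with roughly independent defence layers,
# the chance that a hazard passes through ALL of them is the product
# of the individual failure probabilities. All numbers are invented
# for illustration; they are not drawn from the incident investigation.

layers = {
    "engineering procedures followed correctly": 0.05,
    "technical log records equipment removal": 0.10,
    "captain's pre-flight walk-around": 0.10,
    "cabin crew emergency equipment check": 0.10,
}

p_accident = 1.0
for layer, p_fail in layers.items():
    p_accident *= p_fail  # every layer must fail for the holes to align

print(f"P(all four defences fail together) = {p_accident:.6f}")
```

Each layer alone fails fairly often in this toy example, yet all four aligning is rare, which is why closing even one hole (say, a reliable technical log entry) can break the accident chain.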
»» safety reporting and data analysis
»» investigating incidents.

Errors are as normal as breathing oxygen, and about as certain as death and taxes.

The SHELL model
ICAO uses the SHELL model to represent the main components of human factors. The letters SHELL stand for:
»» S = software: the procedures and other aspects of work design
»» H = hardware: the equipment, tools and technology used in work
»» E = environment: the environmental conditions in which work occurs, including the organisational and national cultures influencing interaction
»» L = liveware: the human aspects of the system of work
»» L = liveware: the interrelationships between humans at work.

For more information about human factors and aviation, see booklet 6 in this kit.
Toolkit
Index of toolkit items
References
Jargon busters—abbreviations,
acronyms and definitions
Abbreviations
A
AC Advisory circular
ALAR Approach-and-landing accident reduction
ALARP As low as reasonably practicable
ALoS Acceptable level of safety (used in conjunction with ICAO member states’ state safety program)
AME Aircraft maintenance engineer
AOC Air operator’s certificate
AQF Australian Qualification Framework
AS/NZS Australian/New Zealand Standard
ATSB Australian Transport Safety Bureau
B
BITRE Bureau of Infrastructure, Transport and Regional Economics
C
CAAP Civil Aviation Advisory Publication
CAIR Checklist for assessing institutional resilience against accidents
CAO Civil Aviation Order
CAP Civil Aviation Publication (United Kingdom)
CASA Civil Aviation Safety Authority
CASR Civil Aviation Safety Regulation
D
DEEWR Department of Education, Employment and Workplace Relations
E
ERP Emergency response plan
ESB Effective safety behaviours
F
FAA Federal Aviation Administration (United States)
FDA Flight data analysis
FDAP Flight data analysis program
FMAQ Flight management attitudes questionnaire
FRMS Fatigue risk management system
FTO Flight training organisation
G
GAPAN Guild of Air Pilots and Air Navigators
GIHRE Group interaction in high-risk environments
H
HF Human factors
HMI Human-machine interface
I
ICAM Incident cause analysis method
J
JAR-OPS Joint Aviation Requirements - Operations
M
MEDA Maintenance error decision aid
MOS Manual of standards
MOSA Maintenance operations safety audit
MoU Memorandum of understanding
N
NTS Non-technical skills, see also ‘Human factors’
O
OH&S Occupational health & safety. See WHS
P
POH Pilot’s operating handbook
Q
QA Quality assurance
QMS Quality management system
R
RPT Regular public transport
S
SOP Standard operating procedure
SRB Safety review board
SSAA Safety-sensitive aviation activity (used in relation to alcohol and other drugs regulation – CASR Part 99)
SSP State Safety Program
SWI Safe work instruction/s
T
TEM Threat and error management
TNA Training needs analysis
U
UT University of Texas
V
VFR Visual flight rules
W
WHS Workplace Health and Safety (new term for OH&S)
Definitions
Accident: an occurrence associated with the operation of an aircraft which takes place between the time any person boards the aircraft with intention of flight until such time as all such persons have disembarked, in which:
»» a person is fatally or seriously injured¹ as a result of:
-- being in the aircraft, or
-- direct contact with any part of the aircraft, including parts which have become detached from the aircraft, or
-- direct exposure to jet blast,
except when the injuries are from natural causes, self-inflicted or by other persons, or when the injuries are to stowaways hiding outside the areas normally available to the passengers and crew; or
»» the aircraft sustains damage or structural failure which:
-- adversely affects the structural strength, performance or flight characteristics of the aircraft, and
-- would normally require major repair or replacement of the affected component,
except for engine failure or damage when the damage is limited to the engine, its cowlings or accessories; or for damage limited to propellers, wing tips, antennas, tyres, brakes, fairings, small dents or puncture holes in the aircraft skin; or
»» the aircraft is missing or is completely inaccessible².

Notes
1. For statistical uniformity only, an injury resulting in death within thirty days of the date of the accident is classified as a fatal injury by ICAO.
2. An aircraft is considered to be missing when the official search has been terminated and wreckage has not been located.

ALARP: as low as reasonably practicable, means a risk is low enough that attempting to make it lower, or the cost of assessing the improvement gained in an attempted risk reduction, would actually be more costly than any cost likely to come from the risk itself.

ALoS: acceptable level of safety. Used in reference to ICAO member states’ ‘state safety programs’.

Assessment: process of observing, recording, and interpreting individual knowledge and performance against a required standard.

Behavioural marker: a single non-technical skill or competency within a work environment that contributes to effective or ineffective performance.

Change management: a systematic approach to controlling changes to any aspect of processes, procedures, products or services, both from the perspective of an organisation and of individuals. Its objective is to ensure that safety risks resulting from change are reduced to as low as reasonably practicable.

Competency: a combination of skills, knowledge and attitudes required to perform a task to the prescribed standard.

Competency-based training: develops the skills, knowledge and behaviour required to meet competency standards.

Competency assessment: the process of collecting evidence and making judgements as to whether trainees are competent.

Complex organisation: an organisation with more than 20 employees performing safety-sensitive aviation activities. Such organisations can discuss their assessment as non-complex with CASA if they feel they meet all other criteria other than the number of employees performing SSAA. (See also ‘Small, non-complex organisations’.)
Management: planning, organising, resourcing, leading or directing, and controlling an organisation (a group of one or more people or entities) or effort for the purpose of accomplishing a goal.

Maintenance operations safety audit (MOSA): behavioural observation data-gathering technique to assess the performance of maintenance engineers during normal operations.

Non-technical skills (NTS): specific HF competencies such as critical decision making, team communication, situational awareness and workload management.

Operational safety-critical personnel: perform or are responsible for safety-related work, including being in direct contact with the physical operation of the aircraft, or having operational contact with personnel who operate the aircraft.

Operational safety-related work: safety-related activity in one or more of the following work areas:
»» maintenance
»» flying an aircraft
»» cabin crew operations
»» dispatch of aircraft or crew
»» development, design, implementation and management of flight operations, safety-related processes (including safety investigations)
»» any other duties prescribed by an AOC holder as flight operations safety-related work.

Quality management system (QMS): a set of policies, processes and procedures required for planning and execution (production/development/service) in the core business areas of an organisation.

Risk: the chance of something happening that will have an impact on objectives.
»» A risk is often specified in terms of an event or circumstance and any consequence that might flow from it.
»» Risk is measured in terms of a combination of the consequences of an event, and its likelihood.
»» Risk can have a positive or negative impact.

Risk assessment: the overall process of risk identification, risk analysis and risk evaluation.

Risk identification: the process of determining what, where, when, why and how something could happen.

Risk management: the culture, processes and structures directed towards realising potential opportunities whilst managing adverse effects.

Safety: the state in which the probability of harm to persons or property is reduced to, and maintained at, a level which is as low as reasonably practicable through a continuing process of hazard identification and risk management.

Safety culture: an enduring set of beliefs, norms, attitudes, and practices within an organisation concerned with minimising exposure of the workforce and the general public to dangerous or hazardous conditions. A positive safety culture is one which promotes concern for, commitment to, and accountability for, safety.

Safety manager (SM): person responsible for managing all aspects of an organisation’s safety management system.

Safety management system (SMS): a systematic approach to managing safety, including the necessary organisational structures, accountabilities, policies and procedures.

Safety-sensitive aviation activity: any aviation activities in an aerodrome testing area.

Service level agreement: see ‘Contractors’.

Small, non-complex organisations: organisations with 10 or fewer employees performing safety-sensitive aviation activities (SSAA) are automatically considered to be small and non-complex. Organisations with more than 10, but fewer than 20 SSAA employees, and which do not exceed any of the other criteria for non-complex organisations, may also be considered small and non-complex.
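The employee-count thresholds in the ‘Complex organisation’ and ‘Small, non-complex organisations’ definitions can be expressed as a simple classification rule. This is a sketch only: the other non-complex criteria are reduced to a single boolean, and the treatment of exactly 20 SSAA employees (a gap between ‘fewer than 20’ and ‘more than 20’ in the definitions) is an assumption.

```python
# Sketch of the organisation-size classification in the definitions
# above. 'meets_other_criteria' stands in for the other non-complex
# criteria, which are not enumerated here.

def classify(ssaa_employees: int, meets_other_criteria: bool = True) -> str:
    # 10 or fewer SSAA employees: automatically small, non-complex
    if ssaa_employees <= 10:
        return "small, non-complex (automatic)"
    # 11-20 and within the other criteria: may be considered small,
    # non-complex (including exactly 20 in this band is an assumption)
    if ssaa_employees <= 20 and meets_other_criteria:
        return "may be considered small, non-complex"
    # more than 20, or other criteria exceeded: complex
    return "complex"

print(classify(8))    # small, non-complex (automatic)
print(classify(15))   # may be considered small, non-complex
print(classify(25))   # complex
```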
References
CASA regulations and advisory material
Civil Aviation Advisory Publication (CAAP) SMS-1(0). (2009).
Civil Aviation Safety Regulation (CASR) Part 99 – alcohol and other drugs.
Safety Behaviours: Human Factors for Pilots. Civil Aviation Safety Authority, Australia. (2009).
Safety Behaviours: Human Factors for Engineers. Civil Aviation Safety Authority, Australia. (2013).

ICAO publications
‘Human factors, management and organization’. Human Factors Digest No. 10. ICAO, Montreal, Canada. (1993).
ICAO Annex 19 – Safety Management, 1st edition.
ICAO Safety Management Manual (Document 9859), 3rd edition. (2013).

Further reading
‘The INDICATE safety program: A method to proactively improve airline safety performance’. Edkins, G.D. Safety Science, 30: pp 275–295. (1998).
‘What life raft?’ Edkins, G.D. Qantas Flight Safety, Issue 2: Spring, pp 5–9. Qantas Airways, Sydney. (2001).
Safety at the Sharp End: A Guide to Non-technical Skills. Flin, R., O’Connor, P. & Crichton, M. Ashgate. (2008).
Culture at Work in Aviation and Medicine: National, Organizational and Professional Influences. Helmreich, R.L. & Merritt, A.C. Ashgate. (1998).
‘Safety culture and human error in the aviation industry: In search of perfection’. Hudson, P.T.W. In B. Hayward & A. Lowe (eds.) Aviation Resource Management. Ashgate, UK. (2000).
Human Error. Reason, J. Cambridge University Press, Cambridge. (1992).
Managing the Risks of Organisational Accidents. Reason, J. Ashgate, Aldershot, UK. (1997).
Managing Maintenance Error: A Practical Guide. Reason, J. & Hobbs, A. Ashgate. (2003).
A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis & Classification System. Wiegmann, D.A. & Shappell, S.A. Ashgate. (2003).