
This article was downloaded by: [University of Delaware]

On: 07 October 2014, At: 12:54


Publisher: Taylor & Francis
Informa Ltd Registered in England and Wales Registered Number:
1072954 Registered office: Mortimer House, 37-41 Mortimer Street,
London W1T 3JH, UK

Ergonomics
Publication details, including instructions for
authors and subscription information:
http://www.tandfonline.com/loi/terg20

Human factors and aviation safety: what the industry has, what the industry needs

Daniel E. Maurino
International Civil Aviation Organization
Published online: 10 Nov 2010.

To cite this article: Daniel E. Maurino (2000) Human factors and aviation safety: what the industry has, what the industry needs, Ergonomics, 43:7, 952-959, DOI: 10.1080/001401300409134

To link to this article: http://dx.doi.org/10.1080/001401300409134

ERGONOMICS, 2000, VOL. 43, NO. 7, 952-959

Human factors and aviation safety: what the industry has, what
the industry needs

DANIEL E. MAURINO*
Flight Safety and Human Factors Programme, International Civil Aviation Organization, 999 University Street, Montreal, Quebec H3C 5H7, Canada

Keywords: Human error; Cultural factors; Aviation safety; Safety paradigm; Proaction; Accident investigation.

The use of statistical analyses to assert safety levels has persuasively been established within the aviation industry. Likewise, variations in regional statistics have led to generalizations about safety levels in different contexts. Caution is proposed when qualitatively linking statistics and aviation's resilience to hazards. Further caution is proposed when extending generalizations across contexts. Statistical analyses (the favoured diagnostic tool of aviation) show sequences of cause/effect relationships reflecting agreed categorizations prevalent in safety breakdowns. They do not, however, reveal the processes underlying such relationships. It is contended that the answers to the safety questions in contemporary aviation will not be found through the numbers, but through the understanding of the processes underpinning the numbers. These processes and their supporting beliefs are influenced by contextual constraints and cultural factors, which in turn influence individual and organizational performance. It is further contended that the contribution of human factors is fundamental in achieving this understanding. This paper, therefore, (1) argues in favour of a macro view of aviation safety, (2) suggests the need to revise a long-standing safety paradigm that appears to have ceased to be effective, and (3) discusses the basic premises upon which a revised safety paradigm should build.

1. Introduction
As aviation is about to celebrate its first centennial, a dilemma that is part of the fabric of the modern world (challenging established dogma) is upon the industry. The apparent stagnation of aviation's accident record suggests a systemic nature in the safety problems faced by contemporary aviation, and makes the case for innovative ways to pursue aviation safety. Long-standing beliefs, anchored in conservatism and convention, are under challenge by emerging thinking about safety issues. Such thinking argues for a shift in a safety paradigm that appears to be unresponsive to the demands of a complex socio-technical system. At the same time, it fosters a growing recognition of the need to attack causes rather than symptoms of safety deficiencies. It would seem timely to revisit the safety paradigm aviation has held for over five decades, essentially based on the compilation of statistics, the investigation of accidents, and the punishment of safety offenders. Paramount to a revised paradigm is to distance those at the operational end of events from the allocation of blame, in favour of a macro appraisal of the aviation system. This implies the need to stay away from the 'design,

*Author for correspondence. e-mail: dmaurino@icao.int


Ergonomics ISSN 0014-0139 print/ISSN 1366-5847 online © 2000 Taylor & Francis Ltd
http://www.tandf.co.uk/journals

blame and train' cycle so cherished by aviation; and to consider all components of the aviation system rather than only those obviously connected to safety breakdowns. It is imperative for aviation to accept that the failures of people involved in daily routines are symptoms of deficiencies at the deep foundations of the system (Reason 1997).
This paper presents a practitioner's conceptual view of the potential contribution of human factors to a revised safety paradigm. Such a paradigm should build upon at least four fundamental premises:

(1) safety must be considered to be a social construct;
(2) human and organizational performance must be considered to be inseparable from the contexts within which they take place;
(3) conventional views on human error must be revisited; and
(4) proaction must replace reaction.

2. Safety as a social construct
Ultimately, safety is a state of mind. Safety partly rests in formal structures and protocols, but fundamentally in attitudes. Safety intertwines with risk and human life. The perception of what constitutes risk and the value of human life are far from uniform across the global village. An unacceptable risk for one society might be quite tolerable for another. There are societies where, for example, body-counting outweighs moral values deeply ingrained in others. There are societies where liability substitutes for collective participation in decisions about risk. Here power is the main factor, not risk and safety. Indeed, there are many who believe that societies have the number of accidents and loss of life that they are willing to accept before allocating resources to reduce risks. Far from suggesting amoral calculation, this rather implies that there is a strong cultural component associated with safety endeavours (Hood and Jones 1996).
The link between attitudes, the value of human life, and the allocation of resources to reduce risks would be undisputed by neutral observers, but it does not sit very well with aviation. After all, decades of pontificating and smoke-blowing about safety have led to a vernacular conviction that in aviation, safety comes first. This naive slogan suggests that safety is a universal value, and ignores that judging what constitutes an acceptable risk is a subjective, social activity that will be different in different cultures and even in different organizations within a single culture. Following this line of reasoning, it is suggested that safety may be judged according to discrete standards but it can hardly be universally measured.
Therefore, before any safety endeavour is attempted, it is essential to acquire a perspective of where the values sustaining safety fit within the cultural beliefs of the receiving group. Such beliefs define the group's 'operational space', the boundaries of which are provided by national culture. In order to be effective, safety endeavours should be consistent with the group's beliefs. Safety and risk are not properties inherent to processes or artefacts; they are constructed by common group history and experience. Rather than being the objective process that conventional knowledge would have aviation believe, safety is a subjective process of quantifying and accepting risks. Safety, and risks, are simply in the eyes of the beholder (Vaughan 1996).
There are no 'culture-free' endeavours: training, safety, design, research or any kind of human endeavour have strong cultural components. Simply put, humans cannot dissociate themselves from their heritage. As a consequence, the generalized

practice of designing safety solutions within the industrialized belt of aviation, and then exporting them world-wide expecting that their effectiveness will remain intact, is open to challenge. Technology, the most widely-accepted means to advance aviation safety, is a good example to illustrate the pitfalls in exporting uncalibrated safety solutions. Technology is designed within a very narrow band of the industry, yet it is used throughout the world without cross-cultural considerations. There are important cultural issues in the transfer of technology, beyond those related to anthropometry and biomechanics. The use of technology involves procedures that are not inherent to the equipment, but which must be designed. Such design follows the standards of the originating culture. The authority gradient expected among users of technology reflects that associated with the originating culture. Such a gradient may be different in other cultures, and the quality of feedback essential to the proper use of technology will then suffer accordingly. Last but by no means trivial, the difficulties regarding knowledge and understanding of English should not be underestimated (Meshkati 1989).
Culture also affects the performance of tasks in various and subtle ways, fostered by social biases, perceptions of status, mental models and education. It is therefore naive to export a safety solution which worked in one context to another context and expect linear benefits. In fact, such attempts might generate aberrant solutions, since they will probably address symptoms rather than causes of safety deficiencies. Different contexts present distinct problems which require distinct, culturally-calibrated solutions, underpinned by contextual needs and constraints.

3. The importance of the organizational context
The relevance of cultural calibration becomes obvious when considering that aviation organizations do not operate in a vacuum, but conduct their businesses by navigating through socially-defined operational spaces broadly delineated by national cultures. Furthermore, while the ways in which organizations within specific social contexts conduct their everyday routines might seem similar, these routines encode preferences, making each organization unique and distinct. As people interact in work groups, organizational culture (the way we do business and the way we speak about how we do business here) is created, biased by overarching social beliefs, and adapted to meet the production demands of the organization. Organizational culture determines acceptable and unacceptable behaviours and becomes a mandate that affects individual and organizational decision-making (Helmreich and Merritt 1998).
Understanding the organizational context that gives behaviours their true meaning thus becomes a prerequisite to understanding human and organizational performance 'in the wild'. This allows one to view events as they presented themselves and changed for the actors at the time of the action, therefore preserving the real-time view. Most important, such an approach distances analysts from 'Monday morning quarterbacking': words are cheap and it is always easy to be wise with the benefit of hindsight, but it is questionable whether such wisdom serves any purpose other than the allocation of blame.
From a contextual perspective, assessing safety through statistical analyses is a limited exercise. The numbers may convey different meanings and generate different perceptions about safety in different groups, as fostered by national and organizational culture. Furthermore, for all the rigour statistics may reflect, cultural beliefs, as well as the benefit of hindsight, play a role in defining the numbers: an analyst must make a deliberate effort to stick to absolute impartiality and to resist 'seeing what one wants to see' in the data being evaluated. Statistics reveal a succession of cause-effect relationships, but they fail to reveal the processes underlying such relationships. The answers to understanding safety questions lie not in the numbers, but in the interpretation of the processes behind the numbers.
Therefore, when evaluating a particular system's safety, markers more reliable than statistics, the absence of accidents or any of the other traditional parameters aviation has used for years are necessary. One core marker is the concept of safety culture, defined as the set of beliefs, norms, attitudes, roles and social and technical practices that are concerned with minimizing the exposure of employees, managers, customers and members of the general public to conditions considered to be dangerous or hazardous. The social ingredient as to what constitutes danger or hazard is the variable that allows for cultural calibration of the concept. The building blocks of safe cultures are well-defined:

· decision-makers place a strong emphasis on identifying hazards and controlling risks;
· both decision-makers and operational personnel hold realistic views of the short- and long-term hazards involved in the group's activities;
· those in top positions do not use their influence to force their views and avoid criticism;
· those in top positions foster a climate that is receptive to criticism, comment and feedback from lower levels of the group;
· there is an awareness of the importance of communicating relevant information on hazards at all levels of the group;
· there is promotion of appropriate, realistic and workable rules relating to hazards and to potential sources of damage. Such rules are supported and endorsed throughout the group.

The bridge between safety as a social construct, the organizational context, and the safety cultures developed by organizations thus becomes the macro-micro connection that fosters the understanding of human and organizational performance in context. Such consideration has conspicuously been neglected by the conventional safety paradigm. Neither human nor organizational performance can be understood without taking into account the social and organizational context within which they manifest themselves.

4. Human error: does it exist?
A macro approach to safety introduces the opportunity for reflection on human error. Analyses of human performance in safety events are largely conducted in abstraction from contextual influences. Furthermore, aviation research on human judgement and decision-making has until recently been conducted out of context, and it has ignored the natural component of the decision-making process. Such research has been applied as a template to measure human performance (and therefore error) in different cultures and operational contexts. Likewise, errors have been considered to emerge from either the technology or the human, but seldom from the joint human-technology system. The weakness in this approach becomes evident when considering the influence technology exerts upon human and organizational performance. It becomes more so when considering that, in spite of abundant research and the development of prescriptive/normative models for aviation training in judgement and decision-making, both continue to be assessed as prevailing factors in safety breakdowns (Klein et al. 1993).
Conventional analysis of human error in aviation backtracks an event under scrutiny until a point at which analysts find a particular kind of human or organizational performance that produced results other than those intended by the actors. At that point, human error is pronounced, with limited consideration of the process leading to the outcome, and with knowledge of the 'bad' outcome that is obvious to the analysts but was certainly unknown to the actors before it occurred. This analysis largely ignores the conditions as they presented themselves at the time the event took place, and which may have influenced the 'improper' performance. It also neglects the organizational culture that gives meaning to events and performance. From this perspective, consideration of human error is a reactive judgement (the harshness of which is proportional to the magnitude of the 'bad' outcome) on what at the time of the event was perceived to be normal performance by the actors (Amalberti 1996).
Further weaknesses of this analysis of error come to light when considering that, because of aviation's defences, the relationship between process and outcome is not linear: numerous errors are committed during routine operations which seldom result in bad outcomes. Monitoring systems indicate instances in which 'bad' processes result in 'good' outcomes, because of system defences. Likewise, relatively 'good' processes result in 'bad' outcomes, often because of chance. Although the relationship between process and outcome is loose in terms of causality, the concept has yet to penetrate the armour of aviation's prevailing convention. Therefore, the fact remains that, unless a bad outcome exists, human error is not pronounced.
Finally, in aviation, with the inherent competition between production and safety goals, operational decision-making (and therefore error) must balance both production and safety demands. The optimum performance to achieve the production demands may not always be fully compatible with the optimum performance to achieve the safety demands. Operational decision-making lies at the intersection of production and safety, and is therefore a compromise. In fact, it might be argued that the trademark of experts is how effectively they manage this compromise.
A contemporary safety paradigm should therefore consider errors as symptoms rather than causes of safety breakdowns, because error-inducing factors are latent in the context, largely bred by the balancing compromise between safety and production. Furthermore, aviation must acknowledge that error is a normal component of human performance. This reinforces the value of monitoring and reporting systems, so that error-inducing factors are uncovered before they combine with flaws in human and organizational performance to produce safety breakdowns. Most important, assessing that an error, be it individual or organizational, has occurred should be the starting rather than the stopping point of the safety investigation process. Digging into the architecture of the system will yield countermeasures aimed at error detection, error tolerance and error recovery, rather than pathetic efforts aimed at error suppression.

5. Proaction and reaction
Consideration of the social underpinnings of safety, how they influence organizational performance, and how they shape safety cultures tolerant to human error are the pillars of proaction. A proactive approach to safety should also put to rest the myth of zero accidents. While the goal of zero accidents might be pursued as a matter of philosophical principle, it should not cloud the rational approach of subject-matter experts. Accidents will continue to happen simply because it is not viable, financially speaking, to deploy the huge amount of resources necessary to completely prevent them. Aviation, similar to other high-technology production systems, is replete with hazards. However, because of its defences, the risks generated by such hazards are relatively low. There are specific techniques to minimize risks, and most of the risks can be cancelled, but it is financially impractical to cancel all risks. Management tools, such as monitoring and reporting systems, are fundamental in keeping watch upon those risks that it is impractical to cancel. Accidents will also continue to happen because error-free human performance will simply never happen. It is impossible to anticipate and deal with unexpected and random issues such as forgetfulness, tiredness, inattention and so forth. Furthermore, errors serve to calibrate human performance, just as accidents are a measure of the state of calibration of the system, provided that feedback mechanisms are in place.
It is often the case in aviation that signals of potential danger are masked by the noise and demands of daily routines. To exercise proaction, it must be possible to separate noise from signal, to clearly identify potential dangers towards which one can direct attention and prioritize the allocation of resources. This can only be achieved through a free flow and exchange of information. Monitoring and reporting systems are the right tools to filter noise from signal, and they help to differentiate process from outcome. In proactive endeavours, safety is considered to be an outcome, an 'ideal' state that the system aims to achieve. The actual process is hazard identification and risk cancellation. By continuous vigilance over the routine activities engaged in by the system while pursuing its production goals, monitoring and reporting systems allow the constant tracking of hazards and evaluation of the risks that they involve. The result of hazard identification and risk cancellation is safety management.
Monitoring and reporting systems allow prevention by control: they imply the recognition that the system is imperfect and under-specified by design, and must remain under constant scrutiny. They seek to anticipate the status of known markers of the system's resistance to hazards in order to strengthen the system's design and defences. Monitoring and reporting systems embody proaction because they work essentially forward, exercise prevention by control, and focus on the process regardless of the outcome.
At the opposite end, the objective pursued by accident investigations is to determine where, when and how deviation from existing rules has taken place, thereby implying that deviation from rules was the cause of the breakdown and that the system is without flaws. Conventional safety builds upon the simplistic notion that accidents are the product of misfits engaging in inappropriate performance. In fact, accidents are brutal declarations of sickness of the system, and involve the failure to heed visible and identifiable warnings in advance. Accident investigations work essentially backwards, exercise prevention by design and focus almost exclusively on the outcome with scant consideration of the process, thus embodying the essence of reaction (Woods et al. 1994).

6. The role of the accident investigation process
The failures in basic organizational processes, such as allocation of resources, planning, budgeting, financing, establishing goals and so forth, are the causes behind the symptoms observed on flight decks, in air traffic control rooms, on ramps and in maintenance hangars. While it is important to address symptoms while longer-term strategies aimed at the causes take place, it would be regrettable if all energies continue to be devoted to myopic attempts to address symptoms exclusively. While accident investigation must be recognized for its historical contribution to aviation safety, the industry cannot afford to use up meagre resources in reactive endeavours. It cannot afford, either ethically or financially, to wait for accidents to learn safety lessons. More importantly, it need not wait. At the heart of this paper is the contention that, through the application of human factors knowledge to prevention strategies, there exists the possibility of proactively anticipating those flaws which already exist in the system and which will eventually lead to accidents. It is possible to apply techniques to identify latent unsafe conditions within the system, before they combine with failures in operational contexts to provoke accidents.

However, the shift in the safety paradigm and prevention strategies will not be possible unless air safety investigators acknowledge the value of applied human factors knowledge. Human factors knowledge has been incorporated into some countries' accident investigation protocols. Benchmark reports have been produced over recent years, attenuating by their sound and broad approach the reactive nature of the accident investigation process. By addressing the collective rather than the individual, these reports have offered potential for significant improvement in safety and operational efficiency. This potential, of course, can only be realized if the recommendations in these reports are acknowledged by decision-makers both in industry and government. However, make no mistake about it: most accident investigation agencies pay only lip service to human factors. Within these organizations, the change implied by the investigation of the broader human factors issues conflicts with conservatism and convention, and is dodged for reasons of convenience.
There are reasons for concern as to what the future might hold for a contemporary safety paradigm unless greater numbers of air safety investigators acknowledge that human factors is a core discipline. This is an exceedingly important point, because aviation has yet to arrive at the point at which existing barriers are demolished and proactive auditing schemes become fully accepted and implemented. Until then, the accident investigation process will remain the workhorse of safety for the immediate future, and the vehicle to fortify the architecture of the aviation system. If the accident investigation process does not generate a meaningful product, there is no way to feed back and, most important, feed forward prevention strategies.
As long as accident investigations consider human error without consideration of contexts and as the cause rather than the symptom, and as long as they do not dig into the deeper layers of the processes surrounding the events under scrutiny, their only accomplishment will be to put losses behind and to reassert trust and faith in the system; that is to say, to fulfil political purposes. Only if accident investigations consider human error as a symptom and look for causes in the context might aviation have a chance to learn about system vulnerability and to develop strategies for change, thus improving system reliability (Maurino et al. 1995, Pariès 1996).

7. Conclusions
Within the aviation industry, human factors is neither an end in itself, nor an opportunity to generate research, nor the last frontier of aviation safety, nor a frontier of any kind. The incorporation of human factors knowledge into aviation operations and practices presents another opportunity to contribute to the aviation system's production goals: the safe and efficient transportation of people and goods. If human factors knowledge is expected to be effective against systemic flaws and failures, its application must be predicated upon an understanding of systemic safety and a safety paradigm that are relevant to contemporary civil aviation. Some encouraging progress has been made, but there is need for improvement.
This paper suggests one possible way to move forward. At its foundation it carries the belief that the integration of human factors knowledge into aviation operations represents a cost-effective approach to anticipating human error, rather than continuing to regret its consequences.

References
AMALBERTI, R. 1996, La Conduite de Systèmes à Risques (Paris: Presses Universitaires de France).
HELMREICH, R. L. and MERRITT, A. C. 1998, Culture at Work in Aviation and Medicine: National, Organizational and Professional Influences (Aldershot: Avebury).
HOOD, C. and JONES, D. K. C. 1996, Accident and Design: Contemporary Debates in Risk Management (London: UCL Press).
INTERNATIONAL CIVIL AVIATION ORGANIZATION 1993, Human Factors Digest No. 10, Human Factors, Management and Organization, Circular 247-AN/148, International Civil Aviation Organization, Montreal.
KLEIN, G. A., ORASANU, J., CALDERWOOD, R. and ZSAMBOK, C. E. 1993, Decision Making in Action: Models and Methods (Norwood, NJ: Ablex).
MAURINO, D. E., REASON, J., JOHNSTON, A. N. and LEE, R. 1995, Beyond Aviation Human Factors (Aldershot: Avebury).
MESHKATI, N. 1989, Technology transfer to developing countries: a tripartite micro- and macroergonomic analysis of human-organization-technology interfaces, International Journal of Industrial Ergonomics, 4, 101-115.
PARIÈS, J. 1996, Evolution of the aviation safety paradigm: towards systemic causality and proactive actions, in B. Hayward and H. Lowe (eds), Proceedings of the 1995 Australian Aviation Psychology Symposium (Aldershot: Avebury), 39-49.
REASON, J. 1997, Managing the Risks of Organisational Accidents (Aldershot: Avebury).
VAUGHAN, D. 1996, The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA (Chicago, IL: University of Chicago Press).
WOODS, D. D., JOHANNESEN, L. J., COOK, R. I. and SARTER, N. B. 1994, Behind human error: cognitive systems, computers and hindsight, CSERIAC State of the Art Review, No. 94.01, Crew Systems Ergonomics Information Analysis Center, Wright-Patterson Air Force Base, OH.
