Human Error
Most industrial accidents are caused by human error: estimates range between 75 and 95 percent.
How is it that so many people are so incompetent? Answer: They aren’t. It’s a design problem.
Physical limitations are well understood by designers; mental limitations are greatly
misunderstood. We should treat all failures in the same way: find the fundamental causes and
redesign the system so that these can no longer lead to problems. We design equipment that
requires people to be fully alert and attentive for hours, or to remember archaic, confusing
procedures even if they are only used infrequently, sometimes only once in a lifetime. We put
people in boring environments with nothing to do for hours on end, until suddenly they must
respond quickly and accurately.
Error occurs for many reasons. The most common is in the nature of the tasks and procedures that
require people to behave in unnatural ways—staying alert for hours at a time, providing precise,
accurate control specifications, all the while multitasking. Interruptions are a common reason for
error, not helped by designs and procedures that assume full, dedicated attention yet that do not
make it easy to resume operations after an interruption.
Root cause analysis is the name of the game: investigate the accident until the single, underlying
cause is found. Searching for a single cause sounds sensible, but the approach is flawed for two
reasons. First, most accidents do not have a single cause: there are usually multiple things that
went wrong.
Second, why does the root cause analysis stop as soon as a human error is found?
When root cause analysis discovers a human error in the chain, its work has just begun: now we
apply the analysis to understand why the error occurred, and what can be done to prevent it.
The Japanese have long followed a procedure for getting at root causes that they call the “Five
Whys,” originally developed by Sakichi Toyoda and used by the Toyota Motor Company as part of
the Toyota Production System for improving quality. The Five Whys do not guarantee success.
The question “why” is ambiguous and can lead to different answers from different investigators.
There is still a tendency to stop too soon, perhaps when the limit of the investigator’s understanding has
been reached.
When people err, change the system so that type of error will be reduced or eliminated. When
complete elimination is not possible, redesign to reduce the impact.
One big problem is that the natural tendency to blame someone for an error is shared even by
those who made the error, who often agree that it was their fault.
Why do people err? Because the designs focus upon the requirements of the system and the
machines, and not upon the requirements of people.
Deliberate Violations
Errors are not the only type of human failure. Sometimes people knowingly take risks. When the
outcome is positive, they are often rewarded. When the result is negative, they might be
punished.
Error is the general term for all wrong actions. There are two major classes of error: slips and
mistakes. Slips are further divided into two major classes and mistakes into three. These categories
of errors all have different implications for design.
Slips: A slip occurs when a person intends to do one action and ends up doing something else.
There are two major classes of slips: action-based and memory-lapse. In action-based slips, the
wrong action is performed. In lapses, memory fails, so the intended action is not done or its results
are not evaluated.
Mistakes: A mistake occurs when the wrong goal is established or the wrong plan is formed. From
that point on, even if the actions are executed properly they are part of the error, because the
actions themselves are inappropriate. Mistakes have three major classes: rule-based, knowledge-
based, and memory-lapse.
Slips are the result of subconscious actions getting waylaid en route. Mistakes result from
conscious deliberations.
Most everyday errors are slips. Intending to do one action, you find yourself doing another.
An interesting property of slips is that, paradoxically, they tend to occur more frequently to skilled
people than to novices.
capture slips
description-similarity slips
memory-lapse slips
mode errors
CAPTURE SLIPS
The capture slip is defined as the situation where, instead of the desired activity, a more
frequently or recently performed one gets done instead: it captures the activity. Capture errors
require that part of the action sequences involved in the two activities be identical, with one
sequence being far more familiar than the other. After doing the identical part, the more frequent
or more recent activity continues, and the intended one does not get done.
DESCRIPTION-SIMILARITY SLIPS
In the slip known as a description-similarity slip, the error is to act upon an item similar to the
target. This happens when the description of the target is sufficiently vague.
A description that usually suffices may fail when the situation changes so that multiple similar
items now match the description. Description-similarity errors result in performing the correct
action on the wrong object.
MEMORY-LAPSE SLIPS
Memory lapses are common causes of error. They can lead to several kinds of errors: failing to do
all of the steps of a procedure; repeating steps; forgetting the outcome of an action; or forgetting
the goal or plan, thereby causing the action to be stopped.
The immediate cause of most memory-lapse failures is interruptions, events that intervene
between the time an action is decided upon and the time it is completed.
MODE-ERROR SLIPS
A mode error occurs when a device has different states in which the same controls have different
meanings: we call these states modes. Mode errors are inevitable in anything that has more
possible actions than it has controls or displays; that is, the controls mean different things in the
different modes. Mode error is really design error. Mode errors are especially likely where the
equipment does not make the mode visible, so the user is expected to remember what mode has
been established.
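The design cure follows directly: always display the current mode. As a minimal sketch (the device and its behavior are hypothetical, not from the source), consider a watch whose one button means different things in different modes, and whose display keeps the mode continuously visible:

```python
# Illustrative sketch of a mode error: the same control produces different
# actions depending on hidden state. The "watch" and its modes are invented
# for illustration.

class Watch:
    def __init__(self):
        self.mode = "time"      # hidden state: "time" or "alarm"
        self.alarm_set = False

    def press_button(self):
        # The same physical button has a different meaning in each mode.
        if self.mode == "time":
            return "display toggled to 24-hour format"
        else:
            self.alarm_set = True
            return "alarm armed"

    def status_line(self):
        # Design cure: make the established mode continuously visible,
        # so the user need not remember it.
        return f"[MODE: {self.mode.upper()}]"

watch = Watch()
watch.mode = "alarm"            # set earlier, easily forgotten
print(watch.status_line())      # a visible mode heads off the slip
print(watch.press_button())
```

Without the status line, the user who believes the watch is in time mode would press the button expecting a display change and silently arm the alarm instead; the mode indicator moves that knowledge from the head into the world.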
Mistakes result from the choice of inappropriate goals and plans or from faulty comparison of the
outcome with the goals during evaluation.
Many mistakes arise from the vagaries of human thought, often because people tend to rely upon
remembered experiences rather than on more systematic analysis. We make decisions based
upon what is in our memory.
The Danish engineer Jens Rasmussen distinguished among three modes of behavior: skill-based,
rule-based, and knowledge-based.
RULE-BASED MISTAKES
When new procedures have to be invoked or when simple problems arise, we can characterize
the actions of skilled people as rule-based. Some rules come from experience; others are formal
procedures in manuals or rulebooks, or even less formal guides, such as cookbooks for food
preparation. What is a designer to do? Provide as much guidance as possible to ensure that the
current state of things is displayed in a coherent and easily interpreted format—ideally graphical.
KNOWLEDGE-BASED MISTAKES
Knowledge-based behavior takes place when the situation is novel enough that there are no skills
or rules to cover it. In this case, a new procedure must be devised.
With knowledge-based behavior, people are consciously problem solving. They are in an unknown
situation and do not have any available skills or rules that apply directly.
MEMORY-LAPSE MISTAKES
Memory lapses can lead to mistakes if the memory failure leads to forgetting the goal or plan of
action. A common cause of the lapse is an interruption that leads to forgetting the evaluation of
the current state of the environment.
The design cures for memory-lapse mistakes are the same as for memory-lapse slips: ensure that
all the relevant information is continuously available.
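One way to make that cure concrete in software is a persistent checklist: the display, not the operator's memory, tracks which steps remain, so an interruption cannot erase progress. A minimal sketch, with hypothetical step names:

```python
# Sketch of "keep relevant information continuously available": procedure
# state lives in the world (the checklist), not in the operator's head.
# The procedure and its steps are invented for illustration.

class Checklist:
    def __init__(self, steps):
        self.steps = list(steps)
        self.done = set()

    def complete(self, step):
        self.done.add(step)

    def remaining(self):
        # After any interruption, the remaining steps are simply read
        # off the display rather than recalled from memory.
        return [s for s in self.steps if s not in self.done]

prep = Checklist(["power on", "run self-test", "load sample"])
prep.complete("power on")
# ...interruption happens here...
print(prep.remaining())   # ['run self-test', 'load sample']
```

The design choice is that resumption requires no recall at all: the operator asks the checklist what is left instead of reconstructing where the procedure was abandoned.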
A subtle issue that seems to figure in many accidents is social pressure. Although at first it may
not seem relevant to design, it has a strong influence on everyday behavior. In industrial settings,
social pressures can lead to misinterpretation, mistakes, and accidents.
It is relatively easy to design for the situation where everything goes well, where people use the
device in the way that was intended, and no unforeseen events occur. The tricky part is to design
for when things go wrong. Many systems compound the problem by making it easy to err but
difficult or impossible to discover error or to recover from it. It should not be possible for one
simple error to cause widespread damage.
Difficulties arise when we do not think of people and machines as collaborative systems, but
assign whatever tasks can be automated to the machines and leave the rest to people.
What we call “human error” is often simply a human action that is inappropriate for the needs of
technology.
Given the mismatch between human competencies and technological requirements, errors are
inevitable.
Put the knowledge required to operate the technology in the world. Don’t require that all the
knowledge must be in the head.
Use the power of natural and artificial constraints: physical, logical, semantic, and cultural.
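A logical constraint can be sketched in software as an interface that only offers the actions valid in the current state, the way a physical interlock makes the wrong action impossible rather than merely punishable. The device below and its states are hypothetical, chosen only to illustrate the principle:

```python
# Sketch of a logical constraint: invalid actions are absent from the
# interface, so the error cannot be made. The "microwave" is invented
# for illustration.

class Microwave:
    def __init__(self):
        self.door_open = True
        self.running = False

    def available_actions(self):
        # Like a physical interlock: "start" simply does not exist while
        # the door is open, instead of failing after the fact.
        if self.door_open:
            return ["close_door"]
        return ["stop"] if self.running else ["open_door", "start"]

    def do(self, action):
        if action not in self.available_actions():
            raise ValueError(f"{action!r} not possible in this state")
        if action == "close_door":
            self.door_open = False
        elif action == "open_door":
            self.door_open = True
            self.running = False   # opening the door also stops the oven
        elif action == "start":
            self.running = True
        elif action == "stop":
            self.running = False

oven = Microwave()
oven.do("close_door")
oven.do("start")   # allowed only because the door is closed
```

Constraining the offered actions puts the knowledge of what is possible in the world: the user never has to remember that starting with the door open is forbidden.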
Bridge the two gulfs, the Gulf of Execution and the Gulf of Evaluation. Make things visible, both
for execution and evaluation.