
Automation vs. Human Intervention: What Is the Best Fit for the Best Performance?
Joel M. Haight, PhD, PE, and Vladislav Kecojevic, PhD
Dept. of Energy and Geo-Environmental Engineering, Penn State University, University Park, PA 16802; jmh44@psu.edu (for correspondence)

Published online 3 January 2005 in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/prs.10050

In today's complex industrial processes, automated control systems are a necessity. However, is complete automation the answer? Whereas control system automation provides predictable, consistent performance, it is lacking in human judgment, adaptability, and logic. Although humans provide these, we are unpredictable, inconsistent, and subject to emotions and motivation. To maximize system performance, should we automate humans out of the system? Or do we maximize human input and lose efficient, consistent, error-free system performance? The answer is likely somewhere in the middle of these two extremes and different for each system and situation. This paper provides a review of the existing literature covering control schemes and parameters that determine system performance. It attempts to help answer the questions "How can we minimize human error while still maximizing system performance?" and "What is the right human–machine mix?" © 2005 American Institute of Chemical Engineers Process Saf Prog 24: 45–51, 2005

INTRODUCTION
Does automation of control systems in today's industry help to reduce human error? Intuitively, one might expect that if we engineered humans out of the system, errors would decrease. This seems to be a worthwhile goal. In fact, well-known management consultant Walter Bennis says the factory of the future will have only two employees, a human and a dog. The human is there only to feed the dog, and the dog is there to bite the human if he or she touches anything [1].

Human error is inevitable, so we may be tempted to think like Mr. Bennis [2]. Although it may be appealing to automate humans out of the system, humans do provide judgment, logic, experience, and opinions. As a component in the system, we are interactive, variable, and adaptable. We fill many roles because we can adapt and specialize, but our natural variability makes it possible for us to take actions that a system may not be able to tolerate [3]. Therefore, our input is a "blessing and a curse" [3] to those responsible for system design or performance.

Human error is often implicated as a cause of or contributor to process industry incidents that result in injuries, fires, spills, unplanned equipment downtime, and the like. Even with increasing automation, various sources report that between 50 and 90% of all industrial incidents are caused by human error. It is difficult to say what the real figure is because it depends on one's perspective. However, whatever your perspective, it can be reasonably well argued that a large percentage of industrial incidents are contributed to or caused by human error [4].

As technology improves and efficient, productive output becomes necessary for financial survival, we are inclined to automate control systems. However, there is a cost to the growth in this phenomenon. Twenty-two years ago, this author, while learning oil production operations, spent much time in the field with experienced first-line supervisors. It was impressive to watch them in action. A supervisor would stop the truck upon hearing something out of the ordinary (a hiss, an unfamiliar vibration). He would listen more closely, place his hand on a pump or piping, and then radio maintenance to request the repair of a leak, a bad-order bearing, or some other problem. These supervisors relied on "sentient" knowledge (obtained from the senses and experience; the concept of "gut feel" may partially describe this) to operate the process. They had a "feel" for the system. With today's reliance on computer-controlled automation, it is less likely that operators will develop this same level of sentient knowledge.

To maximize system performance, we must properly consider the strengths and weaknesses of both the automated control system and the human operator. The designer must work with existing optimization models built to consider the application as well as the human and machine variables to optimize system performance. In assessing a human–machine system, one must account for human variability; thus, important assumptions must be made. The assumptions made to establish the context of this review paper are shown in Table 1.



Table 1. Important assumptions made in this review.

• Human error is inevitable [2].
• Under appropriate conditions, automation of industrial system performance is an accepted means to reduce human error and its impact.
• Humans provide characteristics such as judgment, flexibility, adaptability, experience, and sentient knowledge (feel for the process) and are therefore essential components in industrial systems.
• To achieve maximum system performance while minimizing human error and risk to human operators, system designers must fully understand human physical, cognitive, and emotional capacities and limitations and account for these in their designs.
• Human errors and system risk can be minimized if system designers incorporate known principles of human–machine interaction and ensure humans remain mentally engaged in system operation [5].
• System performance can be maximized if designers incorporate into their designs optimized levels of the strengths of each of the human and machine components in the system [6].

The strengths and weaknesses in the human–machine interface that will be considered in this review paper are shown in Table 2.

Table 2. Machine and human strengths and weaknesses.

Human strengths | Human weaknesses | Machine strengths | Machine weaknesses
Can apply judgment | Inconsistent | Consistent | No judgment
Adaptable | Subject to errors | Predictable | Cannot be programmed for all eventualities
Can apply sentient knowledge | Unpredictable and possibly unreliable | Efficient | No sentient knowledge
Interactive | Subject to emotion and motivations | Uniform and reliable | Constrained by human limitations in design, installation, and use

WHY AUTOMATE A SYSTEM?

General
An automated system is thought to perform a function more efficiently, reliably, and accurately than a human operator. There is also an expectation that the automated control system can perform a function at a lower cost than the operator can. Therefore, as technology advances and becomes more available, its use is growing at a rapid rate [7]. There is little argument against the efficiency, reliability, and accuracy of an automated system.

Safety
With higher reliability, one may be justified in thinking an automated system is a safer system. System failures and upsets often lead to injuries, loss of containment of toxic or flammable materials, or catastrophic rupture of equipment resulting in significant damage to the surroundings. It may be argued that keeping the human out of the system, in a sense, protects the operator from him/herself and thus improves the safety of the system. Therefore, even though economics has been a primary driving force in the development and increase in automation in recent years, it could be argued that automating a system will provide a higher level of system safety.

Time Savings and Efficiency
One might automate a system to relieve humans of time-consuming and labor-intensive tasks [7], to speed the operation, to increase production rates, to extend an operation to a longer shift or even to continuous production, to reduce system inefficiency, or to ensure physical specifications are maintained and consistent. Automated systems can also free up the operator's resources to allow him/her the time and opportunity for long-range planning or decision making [8].

WHY ACTIVELY ENGAGE HUMANS IN SYSTEM PERFORMANCE?
To answer this question, we should first explore the role humans play in modern production systems. Although the decision to automate today's complex production systems seems inarguable, humans remain a critical component. Humans are more flexible, adaptable, and creative than a fully automatic system, and humans are better able to respond to changes or unforeseen conditions [7]. In a process plant, this adapted response may often be necessary. Humans often contribute a supervisory role to the process; however, it is difficult to know to what level supervisory responsibility is required [8]. Humans still provide system intelligence, planning, creative thinking, and decision making. Other higher cognitive functions are being explored (artificial intelligence and neural networks) for assignment to higher-order machines, but it should be noted that humans still provide these functions better and more completely than machines [7].

Because automated systems are designed by humans, they are subject to human limitations. For example, it is not possible for the designer to foresee, plan for, and design an automated response for every possible situation that may arise in complex plant operations. Therefore, the best the designer can hope for is to understand the system's operating environment well enough to design in as many responses as possible. Then the system designer must integrate the human operator into the system to an adequate level to provide the necessary judgment and flexibility to implement adapted responses as unforeseen events present themselves. This is where the challenge to the designer lies. It is difficult to integrate humans into a system. Humans are driven by ambition and emotion and are subject to inconsistencies and forgetfulness. Humans allow cognitive functions to disengage without realizing it. We switch from "habits of mind" to "active thinking" and back again several times throughout the workday. It is difficult to tell what triggers and motivates the switch [9]. It is known that, in general, the more humans remain engaged in the process, the more likely they will maintain the "active thinking" mode. System designers should seek to maintain an operator's "active thinking" mode as he or she decides what to automate, and to what level.



Unfortunately, increasing automation tends to promote the likelihood of switching to "habits of mind" [9]. To maximize system performance, research indicates the need to take advantage of the human and control system strengths and then to create effective and active communication between them [10]. Table 3 shows the possible roles that the human operator can fill in a human–machine system.

Table 3. Human roles in an automated system.

Acknowledge control system signals (no change to process)
Acknowledge control system signals (make required changes)
Record data and instrument readings and adjust as needed
Monitor system status and override as necessary
Monitor system status and report findings with no change

HOW SHOULD AN ENGINEER INTEGRATE THE HUMAN AND AUTOMATED COMPONENTS?
Although human error may be caused by, among other things, the work environment, research shows that it is nonetheless inevitable [2]. Although minimizing human error is not the only reason for automating a system, it is one of a number of driving forces. Before determining how and to what level an engineer/designer should automate a system, it is necessary to establish what "automation" means in the context of this article. One author defines automation as the execution by a machine agent (usually a computer) of a function that was previously carried out by a human, and explains that, given this definition, what is currently considered "automation" will change over time [7].

A fear that workers have had in the past is that automated systems are meant to replace them. This fear may have been encouraged by the computer system "HAL" in the movie 2001: A Space Odyssey. Automation is not intended to remove the human; it merely changes the nature of the role from one of doer to one of overseer [7]. The operator manages the human–machine interface to keep system performance high [8]. A problem occurs, however, when the designer does not fully consider the ways in which the human role is changed by the automation. Unintended or unanticipated human responses result when the designer has not adequately thought through all potential responses.

Given this, it appears that an optimized system that maximizes performance and minimizes human errors will dynamically operate somewhere between full automation and complete manual control. Where on this continuum a system should operate depends on the application. Whatever the application, though, to maximize system performance, a designer should maximize the need for and use of what machines do best: accurate, consistent, fast, continuous, economic operation. To ensure minimized human error, a designer must integrate human input into the system in such a way that the operator stays mentally and physically engaged. As a "manager" of the interface, the operator should have a monitoring role with override capabilities, receive adequate system status feedback with enough time to respond, and trust system accuracy and reliability [11]. This is a tall order and different for each system. Often, the level of interaction between the operators and the control system develops over time. As personnel, training, and experience change over time, the human input will change. An experienced, well-trained operator can become bored and less vigilant as he or she learns the system. Less vigilance often means more errors of omission in the process. This phenomenon is not fully in the control of the system designer.

An engineer must understand not only how the automation hardware and software operate; he or she must also understand the inner workings of the human operator: physical, mental, motivational, emotional, training level, experience level, and the like. Often, an engineer will automate a system to a level that maximizes economic benefit but then, almost as an afterthought, leaves the human operator to manage the system as best he or she can [7]. This can lead to increased errors, upsets, injuries, fires, and spills, for example. The idea of allocating tasks between humans and machines within the context of a complex system is referred to as function allocation [6]. It is critical to safe operation that designers carefully consider function allocation. Decisions related to task allocation among humans and between humans and machines are design-based decisions [6]. Because there are so many variables involved and their interrelationships are complex, it is expected that optimization of these variables is desirable and should result in better overall system performance [6].
Because designers are not able to "program in" all possibilities, they tend to integrate humans into the system as a supervisor or monitor. This sometimes puts the operator in a position of having to respond to an action already taken by the system. The operator then has to respond on the system's terms. Under these terms, the operator is in a "catch-up" mode and at a disadvantage in terms of heading off an incident [7]. Unfortunately, sometimes designers give the operator override capabilities and room for discretion without providing adequate feedback about the system's intentions or actions soon enough to allow the operator to take correct or complete actions to avoid a system upset or incident. In the airplane cockpit as well as the process plant control room, visual, auditory, and tactile feedback is necessary [11].

The full potential of the automated system cannot be realized if the human operator makes errors that bring the system down. This issue was addressed above, but human input can and will have an effect on system performance. Human performance variables, such as adaptability, judgment, future-planning ability, and creativity, must be quantified and integrated into a design. Application of the concept of a dynamic system of task allocation is required in the design of complex systems, and no one design fits all systems [6]. This is the cognitive human factors version of the adjustable-height office chair [6]. The system should be designed to allow dynamic and variable levels of task allocation between operator and machine. This puts design and operation decision making in the hands of the human again, however, and issues such as training (may be mandated), experience (may be difficult to quantify), and bias (may be unavoidable), for instance, must be considered when personnel selection decisions are made. In day-to-day process plant operations, realistically, these selection decisions may not be possible. Problems can arise when the human "on the shop floor" makes decisions as to how much input he or she will provide to the control system today. Problems associated with this will be discussed later in this paper, but these may be questions for future research.
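To make the idea of dynamic task allocation more concrete, the following minimal sketch (not from the paper; the control levels, the "engagement" and "workload" scores, and the thresholds are all hypothetical) shows how a system might shift its level of automation to keep the operator engaged without overloading him or her.

```python
# Illustrative sketch only: a dynamic task-allocation policy that shifts
# control between the automated system and the operator. The level names,
# scores, and thresholds are hypothetical, not taken from the paper.

from dataclasses import dataclass
from enum import Enum


class ControlLevel(Enum):
    MANUAL = 1           # operator performs the task, automation advises
    SHARED = 2           # automation acts, operator must confirm changes
    SUPERVISED_AUTO = 3  # automation acts, operator monitors with override


@dataclass
class OperatorState:
    engagement: float  # 0.0 (disengaged, "habits of mind") .. 1.0 (active thinking)
    workload: float    # 0.0 (idle) .. 1.0 (overloaded)


def allocate(state: OperatorState, current: ControlLevel) -> ControlLevel:
    """Pick a control level that keeps the operator engaged but not overloaded."""
    if state.workload > 0.8:
        # Operator is saturated: push routine work to the automation.
        return ControlLevel.SUPERVISED_AUTO
    if state.engagement < 0.3:
        # Operator is drifting toward "habits of mind": hand back an active role.
        if current is ControlLevel.SUPERVISED_AUTO:
            return ControlLevel.SHARED
        return ControlLevel.MANUAL
    return current  # no change needed


if __name__ == "__main__":
    level = ControlLevel.SUPERVISED_AUTO
    for engagement, workload in [(0.9, 0.4), (0.2, 0.3), (0.5, 0.9)]:
        level = allocate(OperatorState(engagement, workload), level)
        print(engagement, workload, "->", level.name)
```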
Many industry people are familiar with the phenomenon of operators circumventing the automated component of a system [8]. They may circumvent the automation system because the system makes their work more difficult or time consuming. An automation system malfunction also creates problems in the human–machine interface. For example, if an operator is charged with filling a vessel with liquid product and that vessel is equipped with an automated level control system and alarm, he or she may be tempted to open the fill valve and leave the site to do other work while the vessel fills. If the control system fails and the tank overfills, chances are that human error will be cited as the cause. If this system were manual only, the operator would probably open the fill valve and stand by for the entire fill time. This is time consuming and tedious, let alone the fact that it idles an operator who could be doing other productive work. If the system were completely automated, the filling operation could take place without human intervention and the operator would be free to work on other productive activity. However, now the operator is no longer engaged, physically or mentally, and it is likely that the operator would not follow this operation very closely [12]. This can result in the problem of overreliance: the operator completely relies on the control system and does not respond to a pending overfill or upset. Conversely, if the fill system malfunctioned, the operator would then learn not to trust it. This could cause the opposite problem of underreliance and result in an operator potentially forgetting to close the fill valve and overfilling the tank.
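As a rough illustration of one way a designer might keep the operator in the loop during such a fill (a sketch only, not from the paper; the setpoints, acknowledgement interval, and simple level model are invented), the automation below trips the valve on high level by itself, but it also escalates to the operator if he or she has not checked in recently or if the independent alarm level is reached.

```python
# Illustrative sketch only: an automated tank-fill controller that still
# demands periodic operator acknowledgement. All setpoints and timings
# are hypothetical.

HIGH_LEVEL = 0.90     # automation closes the valve here (fraction of full)
ALARM_LEVEL = 0.95    # independent high-level alarm
ACK_INTERVAL_S = 300  # operator must check in every 5 minutes


def fill_step(level: float, valve_open: bool, fill_rate: float = 0.01) -> tuple[float, bool]:
    """Advance the fill one time step; automation trips the valve at HIGH_LEVEL."""
    if valve_open:
        level = min(1.0, level + fill_rate)
    if level >= HIGH_LEVEL:
        valve_open = False  # automated trip
    return level, valve_open


def needs_escalation(seconds_since_ack: int, level: float) -> bool:
    """Escalate (page the operator, sound the alarm) if the operator has gone
    quiet or the independent alarm level is reached despite the trip."""
    return seconds_since_ack > ACK_INTERVAL_S or level >= ALARM_LEVEL


if __name__ == "__main__":
    level, valve_open, t_since_ack = 0.5, True, 0
    while valve_open:
        level, valve_open = fill_step(level, valve_open)
        t_since_ack += 10
        if needs_escalation(t_since_ack, level):
            print("escalate to operator at level", round(level, 2))
            t_since_ack = 0
    print("valve closed automatically at level", round(level, 2))
```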
Given this dilemma, the appropriate level of automation for an operating system must be properly designed into the system. This depends on who the system is being designed for; it may be age dependent, training-level dependent, or technological-savvy dependent. To address this issue, the designer has a number of considerations, and it is assumed that successful consideration of design variables can result in higher productivity and fewer errors (Table 4).

Table 4. Design considerations affecting operator performance in an automated system.

Ease of use of the system by the operator
Type of automation and type of feedback to the operator
Frequency and perceptibility of operational feedback
Level to which the operator is kept mentally or physically engaged
Ease of switching the control system from automatic to manual and the system reliability

Source: DeKock [13], Sarka and Pukkila [14], Pukkila [15], and Molloy and Parasuraman [16].

An early traditional approach to automating a system is one in which either the operator or the machine has full control of the system at any one time; the switch between automatic and manual control was made manually [7, 17]. Today's systems, however, have become increasingly complex, and in some cases the functions can be performed adequately by either the human operator or the automated system. There are also a number of functions requiring the attention of both at the same time. This requires task-allocation decisions to be made by the system designer with input from the system operators [5]. The system designer must consider each function or task to be performed by the system and then must determine whether the human, the automated system, or both should be given control. The designer must also consider whether the human operator should be able to switch easily between automatic and manual control, whether the human operator should be able to override automatic control, and how he or she will be alerted when the switch is necessary.

Communication and coordination between operator and system are critical. This is important in situations where the operator and the automated systems share control, such as in today's complex flight systems or pharmaceutical processes [10].



If the operator cannot easily tell when a change is required or that the system has become disengaged or is no longer functioning, problems arise and errors are made. This potential problem was evident when Eastern Airlines flight 401 crashed in the Florida Everglades in the early 1970s. The pilots failed to recognize that the autopilot had become disengaged while they were distracted by a faulty indicator light [16]. The crew became preoccupied with an inoperative landing gear indicator light. They placed the aircraft on automatic pilot and set the altitude for 2000 feet while they addressed the landing gear. When the automatic pilot system accidentally became disengaged, no one recognized it (the alarm sound was obscured by cockpit discussion), and the plane continued to fly on manual control without human input. It gradually lost altitude until it contacted the ground, killing almost everyone onboard. This is also illustrated in longwall mines when an operator is surprised by movement of a longwall shield. If the operator is unaware of the shield's operating cycle status or is preoccupied, he or she may position him/herself inside the shield's operating envelope and range of motion [18]. In each of these cases, it is acceptable to switch between automatic control and manual control, but the operators must be made aware that the switch is necessary or has already occurred.

There is much to consider (much more than shown here) and many variables associated with each of these considerations [5]. More research is needed to develop an improved design model that can be used to help process-plant control-system designers with these decisions.

WHAT CAN GO WRONG IN AN AUTOMATED SYSTEM?

General
One definition of human error is: "human error consists of any significant deviation from a previously established, required or expected standard of human performance" [19, 20]. There are other definitions, but from the perspective of designing, operating, and maintaining an industrial system, this provides a working foundation. When humans make errors and cause a system to fail, it usually does not fail as the result of any one reason [19]. It usually fails because of the human–machine interface decisions the designer made, the kinds of people operating the system, the amount of training operators received, and the level to which they are physically or mentally able to cope with the system and its changes. System failure can be a function of the operating procedures provided for the people or the environment in which they are working [21]. It is important to recognize that most human errors are not made because people are not intelligent or because they are wrong [19]. They commit errors because, in the heat of the moment, they make decisions and take actions that seem logical, given the situations and systems in which they are operating. Errors are "caused" [19]. Human actions are taken based on, among other things, information provided by the automation system.

Trust and Reliance
Once a designer has created a system in which the operator remains engaged, he or she must be sure that the hardware remains reliable and that malfunction is minimized. An important component in the human–machine system is "trust." The operator has to be able to trust the automation to be accurate, functional, reliable, and consistent. Once the operator sees that the hardware is not accurate or is subject to malfunction, the trust vanishes and the operator begins to underrely on the automation [8]. This means he or she may circumvent or disable the automation and rely too heavily on his or her own manual input. The safety, risk, and error-reducing benefits of the automation are then lost [7, 16]. With a level alarm controller that is subject to frequent spurious trips and false alarms, operators learn to either disconnect the alarm, acknowledge it and not respond, or switch to complete manual operation and risk forgetting to close the fill valve and overfilling the vessel.

The opposite of this phenomenon is also a problem. If the system is designed to minimize human input and it is known to be accurate and reliable, the operator may be more likely to switch to the "habits of mind" mode and may tend to overrely on the automation [7, 9]. The operator then does not allocate any attention to the system, as he or she is counting on the automation to take care of everything. A common joke around an oil refinery is to ponder the question, "If everyone just walked away and left the refinery to function on its own, how long would it continue to operate before it experienced either a complete shutdown or a catastrophic event?" Thankfully, the answer to this question has not been determined experimentally, but it would probably be "not long." An example of overreliance on an automated system is Eastern Airlines flight 401 (discussed earlier), which crashed in the Florida Everglades. There were a number of problems, as the pilots received inadequate feedback or alarm that the automatic pilot had become disengaged [11], but the biggest problem was that the pilots overrelied on the automatic pilot [16].

Feedback
One way to help keep an operator cognitively engaged is to provide accurate and understandable feedback about system status and mode. If the operator understands the system's status, he or she is in a much better position to respond to upset conditions [22]. This feedback has to be provided in a timely manner and be delivered such that the human operator is not overwhelmed. In one refinery case, a furnace explosion occurred after the high-temperature detection and alarm system malfunctioned, allowing the furnace tubes to overheat and rupture. A large volume of flammable liquid spilled into the firebox and ignited. The rapid combustion catastrophically ruptured the firebox. While this was happening, the operator received over 300 alarms in less than 5 min. Under these conditions, human error is inevitable. Engaging operators in the automation system can be done by requiring manual data recording during the process and expecting an analysis and manual trending of the process data.
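The refinery example above suggests that alarm presentation itself is a design variable. The following sketch (an illustration only, not taken from the paper; the priority scheme, display budget, and instrument tags are hypothetical) shows one way an alarm system might rank and de-duplicate a flood of alarms so the operator sees the most safety-critical items first instead of hundreds of undifferentiated messages.

```python
# Illustrative sketch only: ranking and de-duplicating an alarm flood so the
# operator is not overwhelmed. Priorities, budget, and tags are hypothetical.

from dataclasses import dataclass


@dataclass
class Alarm:
    tag: str        # instrument tag, e.g., a furnace tube temperature point
    priority: int   # 1 = safety-critical ... 3 = informational
    timestamp: float


def select_for_display(alarms: list[Alarm], budget: int = 10) -> list[Alarm]:
    """Show at most `budget` alarms: highest priority first, newest first,
    with repeats of the same tag collapsed into a single entry."""
    seen: set[str] = set()
    ranked = sorted(alarms, key=lambda a: (a.priority, -a.timestamp))
    display: list[Alarm] = []
    for alarm in ranked:
        if alarm.tag in seen:
            continue  # duplicate chatter from the same instrument
        seen.add(alarm.tag)
        display.append(alarm)
        if len(display) >= budget:
            break
    return display


if __name__ == "__main__":
    flood = [Alarm(f"TI-{i % 20}", priority=1 if i % 20 < 3 else 2, timestamp=float(i))
             for i in range(300)]
    for a in select_for_display(flood, budget=5):
        print(a.tag, a.priority)
```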



There are a number of ways to engage operators, but care must be taken to ensure the expected operator activity is real and necessary. We learn fast whether our actions are important, necessary, valued, and integral to the process.

As mentioned above, designers tend to automate a system with economic benefit being the driving force for determining which aspects will be automated and how. They then integrate the human into the system "after the fact," leaving him or her to manage the results of the automation [7]. This approach can work provided the designer considers issues such as performance feedback, the level of operator training and experience, the level to which the automation allows the operator to remain mentally engaged, and the speed at which the operator must respond to the feedback [11, 12]. Automated systems designers, in general, have not yet adequately considered these performance and training variables and have not integrated, systematically and fully, these human variables into process industry systems. Several accidents outside the process industry have illustrated this improvement opportunity. Airplanes operating on automatic pilot have been flown into terrain, and rail cars operating with speed constraints have derailed because of high speed [7]. There may be a "disconnect" between automated system designers and human operators. System designers must work closely with the operators using their designed system.

Use, Abuse, Misuse, Disuse
Incidents involving automated systems appear to involve human operators abusing, disusing, or misusing the automated components they are operating [7]. Misuse is defined as underreliance or overreliance on the automated components. Underreliance, as noted above, refers to an operator not relying on the automation when it is required or prudent. This is characterized by an operator circumventing a storage vessel's level control system or disconnecting a turbine's overspeed trip mechanism to avoid an alarm. Overreliance refers to an operator turning over the operation to the control system and withdrawing his or her own valuable input to the system's performance [7]. This could be characterized by an operator leaving a job site and allowing the flooded boot of a reflux drum to be drained out on its own by the control system. Well-known comic Gary Larson showed in one of his Far Side cartoons the image of two airplane pilots who seemingly trusted their automated control system so much that, when they peered through an opening in the clouds and saw a mountain goat standing there, they asked, "What is that mountain goat doing way up here in this cloud bank?" [23].

Sentient Knowledge
As the level of automation increases, operators may be losing the sentient knowledge of the process that they once had. There seem to be fewer and fewer people who can tell, simply by smell, sound, and feel, what is going wrong in a process. If our human operators are not able to tell that there is an impending upset in the system, they will have to rely on the automation to catch and quell a problem. At this stage, automation cannot be designed and built to account for every possibility, and we need our human operators to make judgments, interpretations, and moves to prevent incidents. Without the sentient knowledge that one develops from experience and living with the system and all of its faults, the risk of incident is higher. The more automated a system becomes, the more overreliant an operator may become, the more he or she stays in the "habits of mind" mode, and the less likely he or she is to switch cognitive gears to "active thinking" when needed [9].

Mathematical Modeling of the "Right Mix"
One of the first steps in a design process is to develop and quantify system performance variables. For most engineers, this is an everyday part of the job. In this case, however, most engineers are not trained to understand, much less quantify, the human variables associated with motivation, training, emotion, judgment, flexibility, adaptability, fatigue, or boredom. Engineers must also account for the operator's trust in the system in terms of designing in maximum reliability [8]. To account for minimally available information, the designer has to quantify system variables in terms of human needs. More research is needed to aid in this quantification. Once this is complete, an optimization function must be built from an understanding of the mathematical relationship between these variables and system performance. Although flight systems have been thoroughly studied, more research is needed to understand this mathematical relationship in process plant applications. A generalized optimization function model may follow this format:

max Y = A1 + A2 + ... + An + H1 + H2 + ... + Hn
subject to Y > 0

min E = H1 + H2 + ... + Hn
subject to E > 0

where Y is the overall system performance, A represents the automation variables, H represents the human variables, and E represents errors.

The modeling concept is not new, but a challenge still remains: defining the appropriate variables and quantifying them appropriately for use in the model is difficult. This is still somewhat undeveloped territory and so far involves much subjective thinking. In many cases, engineers consider the human operator as an afterthought who must align with the system or be left to struggle with it [7]. More research is needed so that system designers, operators, and human factors engineers can jointly study the human–machine interface in the chemical process plant.
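To show how such a formulation might be exercised once the A and H terms have actually been quantified, the toy sketch below enumerates human/machine allocations of a few hypothetical tasks and scores each allocation against the Y and E terms above. The task list, the per-task scores, and the weighting of E against Y are invented for illustration; they are not part of the paper's model.

```python
# Illustrative sketch only: a toy instantiation of the generalized
# max-Y / min-E formulation. All task names, scores, and weights are invented.

from itertools import product

# For each task: (performance if automated, performance if human-operated,
#                 expected human-error contribution if human-operated)
TASKS = {
    "record_readings":  (0.9, 0.6, 0.10),
    "adjust_setpoints": (0.8, 0.7, 0.15),
    "diagnose_upsets":  (0.4, 0.9, 0.20),  # judgment-heavy: humans score higher
    "plan_production":  (0.3, 0.8, 0.10),
}
ERROR_WEIGHT = 1.5  # how strongly human error is penalized relative to Y


def evaluate(allocation: dict[str, str]) -> tuple[float, float]:
    """Return (Y, E) for a given human/machine allocation of the tasks."""
    y = sum(auto if allocation[t] == "machine" else human
            for t, (auto, human, _err) in TASKS.items())
    e = sum(err for t, (_auto, _human, err) in TASKS.items()
            if allocation[t] == "human")
    return y, e


def best_allocation() -> dict[str, str]:
    """Exhaustively search allocations, maximizing Y - ERROR_WEIGHT * E."""
    best, best_score = None, float("-inf")
    for choice in product(["machine", "human"], repeat=len(TASKS)):
        allocation = dict(zip(TASKS, choice))
        y, e = evaluate(allocation)
        score = y - ERROR_WEIGHT * e
        if y > 0 and score > best_score:  # keep the Y > 0 constraint
            best, best_score = allocation, score
    return best


if __name__ == "__main__":
    allocation = best_allocation()
    y, e = evaluate(allocation)
    print(allocation, "Y =", round(y, 2), "E =", round(e, 2))
```

As the text notes, the search itself is trivial; the real research need lies in quantifying the automation and human variables that would feed such a model.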


CONCLUSIONS
Mr. Bennis's slightly exaggerated, slightly real portrayal of the factory of the future may never come. There are many who hope it does not, and rightly so. Humans provide valuable input to any system, and it should be a goal of every system designer to maximize that input to take advantage of our judgment, flexibility, experience, adaptability, and motivation. At the same time, the engineer must maximize system performance by relying on automation to cover for our inattentiveness, inconsistencies, lack of endurance, lack of vigilance, and all of our physical and cognitive limitations. Although automated flight systems have undergone extensive study, more research is needed in chemical and hydrocarbon processing operations. We need to understand better the relationship between automation and human variables so that appropriate quantification can be made to facilitate the design process. More education of our engineering students is needed in the area of human factors to achieve this level of understanding in future designers of industrial systems [24–26].

LITERATURE CITED
1. M. Paradies and L. Unger, TapRoot®: The system for root cause analysis, problem investigation, and proactive improvement, System Improvements, Inc., Knoxville, TN, 2000.
2. K.R. Hammond, Human judgment and social policy: Irreducible uncertainty, inevitable error, unavoidable injustice, Oxford University Press, New York, 1996.
3. P.E. Lorenzo, A manager's guide to reducing human errors: Improving performance in the chemical industry, Chemical Manufacturers Association, Washington, DC, 1990.
4. J.M. Haight, To err is human and that is hard for engineers, Eng Times 24 (2003), 4–5.
5. R. Parasuraman, M. Mouloua, and R. Molloy, Effects of adaptive task allocation on monitoring of automated systems, Hum Factors 38 (1996), 665–679.
6. P.E. Waterson, M.T. Older Gray, and C.W. Clegg, A sociotechnical method for designing work systems, Hum Factors 44 (2002), 376–391.
7. R. Parasuraman and V. Riley, Humans and automation: Use, misuse, disuse, and abuse, Hum Factors 39 (1997), 230–253.
8. A. Kirlik, Modeling strategic behavior in human–automation interaction: Why an "aid" can (and should) go unused, Hum Factors 35 (1993), 221–242.
9. M.R. Louis and R. Sutton, Switching cognitive gears: From habits of mind to active thinking, Hum Relat 44 (1991), 55–76.
10. A. Degani and M. Heymann, Formal verification of human–automation interaction, Hum Factors 44 (2002), 28–43.
11. A.E. Sklar and N.B. Sarter, Good vibrations: Tactile feedback in support of attention allocation and human–automation coordination in event-driven domains, Hum Factors 41 (1999), 541–552.
12. L.J. Prinzel, F.G. Freeman, M.W. Scerbo, P.J. Mikulka, and A.T. Pope, Effects of a psychophysiological system for adaptive automation on performance, workload, and the event-related potential P300 component, Hum Factors 45 (2003), 601–613.
13. A. DeKock, New hazards in the underground coal mining environment caused by the introduction of radio remote and other electronic controls, OHS Safety, The Queensland Resources Council, Australia, 1999, pp. 1–10.
14. P. Sarka and J. Pukkila, Intelligent mine and its implementation, Proc 30th Int Symp on Application of Computers and Operations Research in the Mineral Industry, Society for Mining, Metallurgy and Exploration, Littleton, CO, 2002.
15. J. Pukkila, Implementation of mine automation: The importance of work safety and motivation, Civil Engineering and Building Constructions Series No. 118, Acta Polytechnica Scandinavica, ISBN 951-666-526-8, p. 153.
16. R. Molloy and R. Parasuraman, Monitoring an automated system for a single failure: Vigilance and task complexity effects, Hum Factors 38 (1996), 311–322.
17. P.M. Fitts, Human engineering for an effective air navigation and traffic control system, National Research Council, Washington, DC, 1951.
18. J.J. Sammarco, J. Kohler, and T. Novak, Safety issues and the use of software-controlled equipment in the mining industry, Proc 32nd Annual Meeting of the IEEE Industry Applications Society, New Orleans, LA, October 5–9, 1997.
19. D. Peterson, Human error reduction and safety management, Van Nostrand Reinhold, New York, 1996.
20. G. Peters, Human error: Analysis and control, J Am Soc Safety Eng 11 (1966), 1.
21. A. Chapanis, New approaches to safety in industry, InComTec, London, 1972.
22. R.J. Mumaw, E.M. Roth, K.J. Vicente, and C.M. Burns, There is more to monitoring a nuclear power plant than meets the eye, Hum Factors 42 (2000), 36–55.
23. G. Larson, Far Side gallery 2, Andrews McMeel Publishing, Kansas City, MO, 1986.
24. H. Petroski, Design paradigms: Case histories of error and judgment in engineering, Duke University, Durham, NC / Cambridge University Press, Cambridge, UK, 1994.
25. T.A. Salthouse, A theory of cognitive aging, Elsevier, Amsterdam, 1985.
26. J.W. Senders and N.P. Murray, Human error: Cause, prediction, and reduction/analysis and synthesis, Erlbaum Associates, Hillsdale, NJ, 1991.

