Editor: Daniel Zeng, University of Arizona and Chinese Academy of Sciences, zengdaniel@gmail.com
In developing any complex system that involves the integration of human decision making and an automated system, the question often arises as to where, when, and how much humans and automation should be in the decision-making loop. Allocating roles and functions between the human and computer is critical in defining efficient and effective system architectures. However, despite the recognition of this problem more than 60 years ago, in this case by NASA (see Figure 1), little progress has been made in balancing role and function allocation across humans and computers.

The problem of human-automation role allocation isn't an academic exercise or limited to a few highly specialized domains such as NASA. The rise of drones (or unmanned aerial vehicles) and the problems with remote human supervision are an extension of well-documented human-automation interaction problems in fly-by-wire systems in commercial aviation. Mining industries increasingly use automation to augment and in some cases outright replace humans, and robots that require human interaction are on the battlefield and in surgical settings. While these applications might seem far from everyday life, Google's recent announcement to introduce driverless cars to the mass market in 2017 and the race to develop in-home robots will make the human-automation allocation issue and its associated computing demands ubiquitous.

The predominant engineering viewpoint across these systems is to automate as much as possible and minimize the amount of human interaction. Indeed, many controls engineers see the human as a mere disturbance in the system that can and should be designed out. Others may begrudgingly recognize that humans must play a role in such systems, either for regulatory requirements or for low-probability event intervention (such as problems in nuclear reactors).

But how do we know what's the right balance between humans and computers in these complex systems? Engineers and computer scientists often seek clear design criteria, preferably quantitative and directive. Most engineers and computer scientists have little to no training in human interaction with complex systems and don't know how to address the inherent variability that accompanies all human performance. Thus, they desire a set of rules and criteria that reduce the ambiguity in the design space, which for them typically means reducing the role of humans or at least constraining human behavior.

A Brief Historical Perspective
In 1951, a National Research Council committee attempted to characterize human-computer interaction (then called human-machine interaction) prior to developing a national air traffic control system.1 The result was a set of heuristics about the relative strengths and limitations of humans and computers (see Table 1), sometimes referred to as what "men are better at" and what "machines are better at" (MABA-MABA).

The heuristic role allocation approach, exemplified in Table 1, has been criticized as attempting to determine points of substitution, because, for example, such approaches provide engineers with justification (possibly erroneous) for replacing the human with automation.2 For traditional engineers with no training in human-automation interaction, this is exactly what they're trained to do: reduce disturbances and variability in a system and make it more predictable. Indeed, they're trying to "capitalize on the strengths [of automation] while eliminating or compensating for the weaknesses,"2 and this is an important piece of ethnographic information critical for understanding why traditional engineers and computer scientists are so attracted by such representations.
Figure 1. The role allocation conundrum for the Apollo missions. (Photos provided courtesy of The Charles Stark Draper Laboratory, Inc.)
In part to help traditional engineers and computer scientists understand the nuances of how humans could interact with a complex system in a decision-making capacity, Levels of Automation (LOAs) were proposed. LOAs generally refer to the role allocation between automation and the human, particularly in the analysis and decision phases of a simplified information processing model comprising acquisition, analysis, decision, and action.3,4 Such LOAs can range from a fully manual system with no computer intervention to a fully automated system where the human is kept completely out of the loop, and this framework was later expanded to include 10 LOAs (see Table 2).

For LOA scales like that exemplified in Table 2, at the lower levels the human is typically actively involved in the decision-making process. As the levels increase, the automation plays a more active role in decisions, increasingly removing the human from the decision-making loop. This scale addresses authority allocation (for example, who has the authority to make the final decision), and to a much smaller degree, it addresses types of collaborative interaction between the human and computer. Raja Parasuraman and his colleagues later clarified that the LOAs could be applied across the primary information processing functions of perception, cognition, and action, and not strictly to the act of deciding, but again using the same 10 levels.4
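One rough way to make this framing concrete is to treat an LOA as an integer from 1 (fully manual) to 10 (fully autonomous) that is assigned separately to each information-processing function. The short Python sketch below is only illustrative; the class and field names, the bounds checking, and the authority threshold are assumptions made for the example, not a reproduction of Table 2 or of any published scale.

```python
from dataclasses import dataclass, fields

# Notional level-of-automation scale: 1 = fully manual, 10 = fully autonomous.
MIN_LOA, MAX_LOA = 1, 10


@dataclass(frozen=True)
class FunctionAllocation:
    """An LOA (1-10) assigned to each information-processing function."""
    acquisition: int  # gathering and filtering information
    analysis: int     # integrating and interpreting information
    decision: int     # selecting a course of action
    action: int       # executing the chosen action

    def __post_init__(self) -> None:
        # Reject levels outside the notional 1-10 range.
        for f in fields(self):
            level = getattr(self, f.name)
            if not MIN_LOA <= level <= MAX_LOA:
                raise ValueError(f"{f.name} level {level} is outside {MIN_LOA}-{MAX_LOA}")

    def human_decides(self, authority_threshold: int = 5) -> bool:
        """True if the human keeps final decision authority.

        The threshold is purely illustrative: below it the automation only
        recommends options; at or above it the automation can act unless vetoed.
        """
        return self.decision < authority_threshold


# Example: automation handles most sensing and analysis but only recommends
# decisions, leaving final authority (and most execution) with the human.
supervisory = FunctionAllocation(acquisition=7, analysis=6, decision=4, action=3)
print(supervisory.human_decides())  # True
```

Separating the functions this way mirrors the later clarification that a level of automation need not apply uniformly to a whole system; a design can automate perception heavily while still keeping the human in the decision-making loop.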
Other taxonomies have proposed alternate heuristic-based LOAs, attempting to highlight less rigid and more dynamic allocation structures,5 as well as to address the ability of humans and computers to coach and