

Bulletin of the Atomic Scientists
http://bos.sagepub.com/

Stopping killer robots

Mark Gubrud
Bulletin of the Atomic Scientists 2014, Vol. 70(1): 32–42
DOI: 10.1177/0096340213516745

The online version of this article can be found at:
http://bos.sagepub.com/content/70/1/32

Published by SAGE Publications (http://www.sagepublications.com) on behalf of the Bulletin of the Atomic Scientists.

Version of Record - Jan 6, 2014

Downloaded from bos.sagepub.com at PRINCETON UNIV LIBRARY on January 6, 2014


Feature

Bulletin of the Atomic Scientists
2014, Vol. 70(1) 32–42
© The Author(s) 2014
DOI: 10.1177/0096340213516745
http://thebulletin.sagepub.com

Stopping killer robots

Mark Gubrud

Abstract
Autonomous weapons are robotic systems that, once activated, can select and engage targets without further
intervention by a human operator. Advances in computer technology, artificial intelligence, and robotics may
lead to a vast expansion in the development and use of such weapons in the near future. Public opinion runs
strongly against killer robots. But many of the same claims that propelled the Cold War are being recycled to
justify the pursuit of a nascent robotic arms race. Autonomous weapons could be militarily potent and there-
fore pose a great threat. For this reason, substantial pressure from civil society will be needed before major
powers will seriously consider their prohibition. However, demands for human control and responsibility and
the protection of human dignity and sovereignty fit naturally into the traditional law of war and imply strict
limits on autonomy in weapon systems. Opponents of autonomous weapons should point out the terrible threat
they pose to global peace and security, as well as their offensiveness to principles of humanity and to public
conscience.

Keywords
autonomous weapons, Convention on Certain Conventional Weapons, international humanitarian law, killer
robots, unmanned weapons

Since the first lethal drone strike in 2001, the US use of remotely operated robotic weapons has dramatically expanded. Along with the broader use of robots for surveillance, ordnance disposal, logistics, and other military tasks, robotic weapons have spread rapidly to many nations, captured public attention, and sparked protest and debate. Meanwhile, every dimension of the technology is being vigorously explored. From stealthy, unmanned jets like the X-47B and its Chinese and European counterparts, to intelligent missiles, sub-hunting robot ships, and machine gun-wielding micro-tanks, robotics is now the most dynamic and destabilizing component of the global arms race.

Drones and robots are enabled by embedded autonomous subsystems that keep engines in tune and antennas pointed at satellites, and some can navigate, walk, and maneuver in complex environments autonomously. But with few exceptions, the targeting and firing decisions of armed robotic systems remain tightly under the control of human operators. This may soon change.

The X-47B unmanned combat aircraft during a 2011 test. Photo credit: US Air Force

Autonomous weapons are robotic systems that, once activated, can select and engage targets without further intervention by a human operator (Defense Department, 2012). Examples include drones or missiles that hunt for their targets, using their onboard sensors and computers. Based on a computer's decision that an appropriate target has been located, that target will then be engaged. Sentry systems may have the capability to detect intruders, order them to halt, and fire if the order is not followed. Future robot soldiers may patrol occupied cities. Swarms of autonomous weapons may enable a preemptive attack on an adversary's strategic forces. Autonomous weapons may fight each other.

Just as the emergence of low-cost, high-performance information technology has been the most important driver of technological advance over the past half-century (including the revolution in military affairs already seen in the 1980s and displayed to the world during the 1991 Gulf War), so the emergence of artificial intelligence and autonomous robotics will likely be the most important development in both civilian and military technology to unfold over the next few decades.

Proponents of autonomous weapons argue that technology will gradually take over combat decision making: "Detecting, analyzing and firing on targets will become increasingly automated, and the contexts of when such force is used will expand. As the machines become increasingly adept, the role of humans will gradually shift from full command, to partial command, to oversight and so on" (Anderson and Waxman, 2013). Automated systems are already used to plan campaigns and logistics, and to assemble intelligence and disseminate lethal commands; in some cases, humans march to orders generated by machines. If, in the future, machines are to act with superhuman speed and perhaps even superhuman intelligence, how can humans remain in control? As former Army Lt. Colonel T. K. Adams observed more than a decade ago (2001), "Humans may retain symbolic authority, but automated systems move too fast, and the factors involved are too complex for real human comprehension."

Almost nobody favors a future in which humans have lost control over war machines. But proponents of autonomous weapons argue that effective arms control would be unattainable. Many of the same claims that propelled the Cold War are being recycled to argue that autonomous weapons are inevitable, that international law will remain weak, and that there is no point in seeking restraint since adversaries will not agree, or would cheat on agreements. This is the ideology of any arms race.

Is autonomous warfare inevitable?

Challenging the assumption of the inevitability of autonomous weapons and building on the work of earlier activists, the Campaign to Stop Killer Robots, a coalition of nongovernmental organizations, was launched in April 2013. This effort has made remarkable progress in
its first year. In May, the United Nations Special Rapporteur on extrajudicial killings, Christof Heyns, recommended that nations immediately declare moratoriums on their own development of lethal autonomous robotics (Heyns, 2013). Heyns also called for a high-level study of the issue, a recommendation seconded in July by the UN Advisory Board on Disarmament Matters. At the UN General Assembly's First Committee meeting in October, a flood of countries began to express interest or concern, including China, Russia, Japan, the United Kingdom, and the United States. France called for a mandate to discuss the issue under the Convention on Certain Conventional Weapons, a global treaty that restricts excessively injurious or indiscriminate weapons. Meeting in Geneva in November, the state parties to the Convention agreed to formal discussions on autonomous weapons, with a first round in May 2014. The issue has been placed firmly on the global public and diplomatic agenda.

Despite this impressive record of progress on an issue that was until recently virtually unknown, or scorned as a mixture of science fiction and paranoia, there seems little chance that a strong arms control regime banning autonomous weapons will soon emerge from Geneva. Unlike glass shrapnel, blinding lasers, or even landmines and cluster munitions, autonomous weapon systems are not niche armaments of negligible strategic importance and unarguable cost to humanity. Instead of the haunting eyes of children with missing limbs, autonomous weapons present an abstract, unrealized horror, one that some might hope will simply go away.

Unless there is a strong push from civil society and from governments that have decided against pursuing autonomous weapons, those that have decided in favor of them, including the United States (Gubrud, 2013), will seek to manage the issue as a public relations problem. They will likely offer assurances that humans will remain in control, while continually updating what they mean by control as technology advances. Proponents already argue that humans are never really out of the loop because humans will have programmed a robot and set the parameters of its mission (Schmitt and Thurnher, 2013). But autonomy removes humans from decision making, and even the assumption that autonomous weapons will be programmed by humans is ultimately in doubt. Diplomats and public spokesmen may speak in one voice; warriors, engineers, and their creations will speak in another.

The development and acquisition of autonomous weapons will push ahead if there is no well-defined, immovable, no-go red line. The clearest and most natural place to draw that line is at the point when a machine pulls the trigger, making the decision on whether, when, and against whom or what to use violent force. Invoking a well-established tenet of international humanitarian law, opponents can argue that this is already contrary to principles of humanity, and thus inherently unlawful. Equally important, opponents must point out the threat to peace and security posed by the prospect of a global arms race toward robotic arsenals that are increasingly out of human control.

Humanitarian law vs. killer robots

The public discussion launched by the Campaign to Stop Killer Robots has mostly centered on questions of legality
under international humanitarian law, also called the law of war. "Losing Humanity," a report released by Human Rights Watch in November 2012 (coincidentally just days before the Pentagon made public the world's first open policy directive for developing, acquiring, and using autonomous weapons), laid out arguments that fully autonomous weapons could not satisfy basic requirements of the law, largely on the basis of assumed limitations of artificial intelligence (Human Rights Watch and International Human Rights Clinic at Harvard Law School, 2012).

The principle of distinction, as enshrined in Additional Protocol I of the Geneva Conventions, and viewed as customary international law, thus binding even on states that have not ratified the treaty, demands that parties to a conflict distinguish between civilians and combatants, and between civilian objects and military objectives. Attacks must be directed against combatants and military objectives only; weapons not capable of being so directed are considered to be indiscriminate and therefore prohibited. Furthermore, those who make attack decisions must not allow attacks that may be expected to cause excessive harm to civilians, in comparison with the military gains expected from the attack. This is known as the principle of proportionality.

"Losing Humanity" argues that technical limitations mean robots could not reliably distinguish civilians from combatants, particularly in irregular warfare, and could not fulfill the requirement to judge proportionality.1 Distinction is clearly a challenge for current technology; face-recognition technology can rapidly identify individuals from a limited list of potential targets, but more general classification of persons as combatants or noncombatants based on observation is well beyond the state of the art. How long this may remain so is less clear. The capabilities to be expected of artificial intelligence systems 10, 20, or 40 years from now are unknown and highly controversial within both expert and lay communities.

While it may not satisfy the reified principle of distinction, proponents of autonomous weapons argue that some capability for discrimination is better than none at all. This assumes that an indiscriminate weapon would be used if a less indiscriminate one were not available; for example, it is often argued that drone strikes are better than carpet bombing. Yet at some point autonomous discrimination capabilities may be good enough to persuade many people that their use in weapons is a net benefit.

Judgment of proportionality seems at first an even greater challenge, and some argue that it is beyond technology in principle (Asaro, 2012). However, the military already uses an algorithmic "collateral damage estimation methodology" (Defense Department, 2009) to estimate incidental harm to civilians that may be expected from missile and drone strikes. A similar scheme could be developed to formalize the value of military gains expected from attacks, allowing two numbers to be compared. Human commanders applying such protocols could defend their decisions, if later questioned, by citing such calculations. But the cost of this would be to degrade human judgment almost to the level of machines.


On the other hand, IBM's Watson computer (Ferruci et al., 2010) has demonstrated the ability to sift through millions of pages of natural language and weigh hundreds of hypotheses to answer ambiguous questions. While some of Watson's responses suggest it is not yet a trustworthy model, it seems likely that similar systems, given semantic information about combat situations, including uncertainties, might be capable of making military decisions that most people would judge as reasonable, most of the time.

"Losing Humanity" also argues that robots, necessarily lacking emotion,2 would be unable to empathize and thus unable to accurately interpret human behavior or be affected by compassion. An important case of the latter is when soldiers refuse orders to put down rebellions. Robots would be ideal tools of repression and dictatorship.

If robot soldiers become available on the world market, it is likely that repressive regimes will acquire them, either by purchase or indigenous production. While it is theoretically possible for such systems to be safeguarded with tamper-proof programming against human rights abuses, in the event that the world fails to prohibit robot soldiers, unsafeguarded or poorly safeguarded versions will likely be available. A strong prohibition has the best chance of keeping killer robots out of the hands of dictators, both by restricting their availability and stigmatizing their use.

Accountability is another much-discussed issue. Clearly, a robot cannot be held responsible for its actions, but human commanders and operators, or even manufacturers, programmers, and engineers, might be held responsible for negligence or malfeasance. In practice, however, the robot is likely to be a convenient scapegoat in case of an unintended atrocity: a technical failure occurred, it was unintended and unforeseen, so nobody is to blame. Going further, David Akerson (2013) argues that since a robot cannot be punished, it cannot be a legal combatant.

These are some of the issues most likely to be discussed within the Convention on Certain Conventional Weapons. However, US Defense Department policy (2012) preemptively addresses many of these issues by directing that "[a]utonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force."

Under the US policy, commanders and operators are responsible for using autonomous weapons in accordance with the laws of war and relevant treaties, safety rules, and rules of engagement. For example, an autonomous weapon may be sent on a hunt-and-kill mission if tactics, techniques, and procedures ensure that the area in which it is directed to search contains no objects, other than the intended targets, that the weapon might decide to attack. In this case, the policy regards the targets as having been selected by humans and the weapon as merely semi-autonomous, even if the weapon is operating fully autonomously when it decides that a given radar return or warm object is its intended target. The policy pre-approves the immediate development, acquisition, and use of such weapons.

Although the policy does not define "appropriate levels," it applies this rubric even in the case of fully autonomous lethal weapons targeting human beings without immediate human supervision. This makes it clear that appropriate levels, as understood within the policy, do not necessarily require direct human involvement in the decision to kill
a human being (Gubrud, 2013). It seems likely that the United States will press other states to accept this paradigm as the basis for international regulation of autonomous weapons, leaving it to individual states to determine what levels of human judgment are appropriate.

Demanding human control and responsibility

As diplomatic discussions about killer robot regulation get under way, a good deal of time is apt to be lost in confusion about terms, definitions, and scope. "Losing Humanity" seeks to ban "fully autonomous weapons," and Heyns's report used the term "lethal autonomous robotics." The US policy directive speaks of "autonomous and semi-autonomous weapon systems," and the distinction between these is ambiguous (Gubrud, 2013). The Geneva mandate is to discuss "lethal autonomous weapon systems." Substantive questions include whether non-lethal weapons and those that target only matériel are within the scope of discussion. Legacy weapons such as simple mines may be regarded as autonomous, or distinguished as merely automatic, on grounds that their behavior is fully predictable by designers.3 Human-supervised autonomous and semi-autonomous weapon systems, as defined by the United States, raise issues that, like fractal shapes, appear more complex the more closely they are examined.

Instead of arguing about how to define what weapons should be banned, it may be better to agree on basic principles. One is that any use of violent force, lethal or non-lethal, must be by human decision and must at all times be under human control. Implementing this principle as strictly as possible implies that the command to engage an individual target (person or object) must be given by a human being, and only after the target is being reliably tracked by a targeting system and a human has determined that it is an appropriate and legal target.

A second principle is that a human commander must be responsible and accountable for the decision, and if the commander acts through another person who operates a weapon system, that person must be responsible and accountable for maintaining control of the system. "Responsible" refers here to a moral and legal obligation, and "accountable" refers to a formal system for accounting of actions. Both elements are essential to the approach.

Responsibility implies that commanders and operators may not blame inadequacies of technological systems for any failure to exercise judgment and control over the use of violent force. A commander must ensure compliance with the law and rules of engagement independently of any machine decision, either as to the identity of a target or the appropriateness of an attack, or else must not authorize the attack. Similarly, if a system does not give an operator sufficient control over the weapon to prevent unintended engagements, the operator must refuse to operate the system.

Accountability can be demonstrated by states that comply with this principle. They need only maintain records showing that each engagement was properly authorized and executed. If a violation is alleged, selected records can be unsealed in a closed inquiry conducted by an international body (Gubrud and Altmann, 2013).4
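The record-keeping idea behind this compliance scheme can be sketched as follows. Records stay sealed, but a published digest (hash) lets an international body later prove they were not altered. The field names, the JSON canonicalization, and the choice of SHA-256 are illustrative assumptions, not part of the Gubrud and Altmann proposal:

```python
# Minimal sketch of sealed-but-verifiable engagement records: archive each
# time-stamped record locally, disclose only its hash, and in a later closed
# inquiry demonstrate that the unsealed record matches the published digest.
# All field names and encoding choices here are hypothetical.

import hashlib
import json

def seal_engagement_record(commander_id: str, operator_id: str,
                           justification: str, timestamp: float):
    """Build a record and return it with the digest to be published."""
    record = {
        "commander": commander_id,    # identities can be kept secret;
        "operator": operator_id,      # only the digest is disclosed
        "justification": justification,
        "timestamp": timestamp,
    }
    # Canonical serialization so the same record always yields the same hash.
    canonical = json.dumps(record, sort_keys=True).encode("utf-8")
    return record, hashlib.sha256(canonical).hexdigest()

def verify_record(record: dict, published_digest: str) -> bool:
    """At a later inquiry, prove the unsealed record was not altered."""
    canonical = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest() == published_digest

record, digest = seal_engagement_record("CDR-7", "OP-22",
                                        "verified hostile radar", 1388966400.0)
assert verify_record(record, digest)        # unaltered record checks out
record["justification"] = "edited later"
assert not verify_record(record, digest)    # any alteration is detectable
```

This is the same commitment mechanism used for tamper-evident logs generally: publishing the hash commits a state to the record's contents without revealing them.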


This framing, which focuses on human control and responsibility for the decision to use violent force, is both conceptually simple and morally compelling. What remains then is to set standards for adequate information to be presented to commanders, and to require positive action by operators of a weapon system. Those standards should also address any circumstances under which other parties (designers and manufacturers, for instance) might be held responsible for an unintended engagement.

There is at least one exceptional circumstance in which human control may be applied less strictly. Fully autonomous systems are already used to engage incoming missiles and artillery rounds; examples include the Israeli Iron Dome and the US Patriot and Aegis missile defense systems, as well as the Counter-Rocket, Artillery, and Mortar system. The timeline for response in such systems is often so short that the requirement for positive human decision might impose an unacceptable risk of failure.

The Counter-Rocket, Artillery, and Mortar (C-RAM) gun can fire thousands of rounds per minute at incoming projectiles. Photo credit: US Air Force

Another principle, the protection of life from immediate threats, comes into play here. An allowance seems reasonable, if it is strictly limited. In particular, autonomous return fire should not be permitted, but only engagement of unmanned munitions directed against human-occupied territory or vehicles. Each such system should have an accountable human operator, and autonomous response should be delayed as long as possible to allow time for an override decision.

The strategic need for robot arms control

Principles of humanity may be the strongest foundation for an effective ban of autonomous weapons, but they are not necessarily the most compelling reason why a ban must be sought. The perceived military advantages of autonomy are so great that major powers are likely to strongly resist prohibition, but by the same token, autonomous weapons pose a severe threat to global peace and security.

Although humans have (for now) superior capabilities for perception in complex environments and for interpretation of ambiguous information, machines have the edge in speed and precision. If allowed to return fire or initiate it, they would undoubtedly prevail over humans in many combat situations. Humans have a limited tolerance of the physical extremes of acceleration, temperature, and radiation, are vulnerable to biological and chemical weapons, and require rest, food, breathable air, and drinkable water. Machines are expendable; their loss does not cause emotional pain or political backlash. Humans are expensive, and their replacement by robots is expected to yield cost savings.

While today's relatively sparse use of drones, in undefended airspace, to target irregular forces can be carried out by remote control, large-scale use of robotic weapons to attack modern military forces would require greater
autonomy, due to the burdens and vulnerabilities of communications links, the need for stealth, and the sheer numbers of robots likely to be involved. The US Navy is particularly interested in autonomy for undersea systems, where communications are especially problematic. Civilians are sparse on the high seas and absent on submarines, casting doubt on the relevance of humanitarian law. As the Navy contemplates future conflict with a peer competitor, it projects drone-versus-drone warfare in the skies above and waters below, and the use of sea-based drones to attack targets inland as well.

In a cold war, small robots could be used for covert infiltration, surveillance, sabotage, or assassination. In an open attack, they could find ways of getting into underground bunkers or attacking bases and ships in swarms. Because robots can be sent on one-way missions, they are potential enablers of aggression or preemption. Because they can be more precise and less destructive than nuclear weapons, they may be more likely to be used. In fact, the US Air Force's Long Range Strike Bomber is planned to be both nuclear-capable and potentially unmanned, which would almost certainly mean autonomous.

There can be no real game-changers in the nuclear stalemate. Yet the new wave of robotics and artificial intelligence-enabled systems threatens to drive a new strategic competition between the United States and other major powers, and lesser powers, too. Unlike the specialized technologies of high-performance military systems at the end of the Cold War, robotics, information technology, and even advanced sensors are today globally available, driven as much by civilian as military uses. An autonomous weapons arms race would be global in scope, as the drone race already is.

Since robots are regarded as expendable, they may be risked in provocative adventures. Recently, China has warned that if Japan makes good on threats to shoot down Chinese drones that approach disputed islands, it could be regarded as an act of war. Similarly, forward-basing of missile interceptors (Lewis and Postol, 2010) or other strategic weapons on unmanned platforms would risk misinterpretation as a signal of imminent attack, and could invite preemption.

Engineering the stability of a robot confrontation would be a wickedly hard problem even for a single team working together in trust and cooperation, let alone hostile teams of competing and imperfectly coordinated sub-teams. Complex, interacting systems-of-systems are prone to sudden unexpected behavior and breakdowns, such as the May 6, 2010 stock market crash caused by interacting exchanges with slightly different rules (Nanex, 2010). Even assuming that limiting escalation would be a design objective, avoiding defeat by preemption would be an imperative, and this implies a constant tuning to the edge of instability. The history of the Cold War contains many well-known examples in which military response was interrupted by the judgment of human beings. But when tactical decisions are made with inhuman speed, the potential for events to spiral out of control is obvious.

The way out

Given the military significance of autonomous weapons, substantial pressure from civil society will be needed before the major powers will seriously consider accepting hard limits, let alone
prohibition. The goal is as radical as, and no less necessary than, the control and abolition of nuclear weapons.

The principle of humanity is an old concept in the law of war. It is often cited as forbidding the infliction of needless suffering, but at its deepest level it is a demand that even in conflict, people should not lose sight of their shared humanity. There is something inhumane about allowing technology to decide the fate of human lives, whether through individual targeting decisions or through a conflagration initiated by the unexpected interactions of machines. The recognition of this is already deeply rooted. A scientific poll (Carpenter, 2013) found that Americans opposed to autonomous weapons outnumbered supporters two to one, in contrast to an equally strong consensus in the United States supporting the use of drones. The rest of the world leans heavily against the drone strikes (Pew Research Center, 2012), making it seem likely that global public opinion will be strongly against autonomous weapons, both on humanitarian grounds and out of concern for the dangers of a new arms race.

In the diplomatic discussions now under way, opponents of autonomous weapons should emphasize a well-established principle of international humanitarian law. Seeking to resolve a diplomatic impasse at the Hague Conference in 1899, Russian diplomat Friedrich Martens proposed that for issues not yet formally resolved, conduct in war was still subject to "principles of international law derived from established custom, from the principles of humanity, and from the dictates of public conscience." Known as the Martens Clause, it reappeared in the second Hague Convention (1907), the Tehran Conference on Human Rights (1968), and the Geneva Convention additional protocols (1977). It has been invoked as the source of authority for retroactive liability in war crimes and for preemptive bans on inhumane weapons, implying that a strong public consensus has legal force in anticipation of an explicit law (Meron, 2000).

Autonomous weapons are a threat to global peace and therefore a matter of concern under the UN Charter. They are contrary to established custom, principles of humanity, and dictates of public conscience, and so should be considered as preemptively banned by the Martens Clause. These considerations establish the legal basis for formal international action to prohibit machine decision in the use of force. But for such action to occur, global civil society will need to present major-power governments with an irresistible demand: Stop killer robots.

Funding

This research was supported in part by a grant from the John D. and Catherine T. MacArthur Foundation.

Notes

1. Philosopher Peter Asaro (2012) takes this a step further, arguing that those who plan or decide military attacks are assumed to be human, and that proportionality, in particular, is inherently subjective and represents not just an algorithmic criterion but a moral burden upon commanders to give due consideration to the human costs in judging whether a lethal action is justified.

2. However, emotion is an active area of research in robotics and artificial intelligence. Research goals include the recognition of human emotion, the simulation of emotional responses, social interaction, and internal states as moderators of robot behavior. Moreover, it is not clear that emotion is
Hard rules such as "Don't open fire on civilians" might be preferable (Arkin, 2009).
3. Mines have already been addressed by the Convention on Certain Conventional Weapons and by the landmines and cluster munitions treaties. Their inclusion in an autonomous weapons ban would strengthen it from the point of view of conceptual and moral clarity, but might needlessly complicate the negotiations.
4. Such records would logically include the data on which the engagement decision was based, as well as the identities of commander and operator, and some evidence of their human action. Identities can be kept secret, but the commander's account of the justification for an attack may be crucial. The records can be time-stamped, encrypted, and archived, and a digital signature (hash) of each record published or held by an international body, in order to prove that records were not later altered. Some data, such as the time-stamped engagement command, could be recorded in real time by tamper-proofed "glass box" verification devices (Gubrud and Altmann, 2013).

References

Adams TK (2001) Future warfare and the decline of human decisionmaking. Parameters 31(4): 57–71.
Akerson D (2013) The illegality of offensive lethal autonomy. In: Saxon D (ed.) International Humanitarian Law and the Changing Technology of War. Leiden: Martinus Nijhoff, 65–98.
Anderson K and Waxman M (2013) Killer robots and the laws of war. Wall Street Journal, November 4, A19. Available at: http://online.wsj.com/news/articles/SB10001424052702304655104579163361884479576.
Arkin RC (2009) Governing Lethal Behavior in Autonomous Robots. Boca Raton, FL: CRC.
Asaro P (2012) On banning autonomous weapon systems: Human rights, automation, and the dehumanization of lethal decision-making. International Review of the Red Cross 94(886): 687–709.
Carpenter C (2013) US public opinion on autonomous weapons. Available at: www.whiteoliphaunt.com/duckofminerva/wp-content/uploads/2013/06/UMass-Survey_Public-Opinion-on-Autonomous-Weapons_May2013.pdf.
Defense Department (2009) Chairman of the Joint Chiefs of Staff instruction 3160.01: No-strike and the collateral damage estimation methodology. February 13. Available at: www.aclu.org/files/dronefoia/dod/drone_dod_3160_01.pdf.
Defense Department (2012) Directive 3000.09: Autonomy in weapon systems. November 21. Available at: www.dtic.mil/whs/directives/corres/pdf/300009p.pdf.
Ferrucci D, Brown E, Chu-Carroll J, et al. (2010) Building Watson: An overview of the DeepQA Project. AI Magazine 31(3): 59–79. Available at: www.aaai.org/ojs/index.php/aimagazine/article/view/2303.
Gubrud M (2013) US killer robot policy: Full speed ahead. Bulletin of the Atomic Scientists, web edition, September 20. Available at: http://thebulletin.org/us-killer-robot-policy-full-speed-ahead.
Gubrud M and Altmann J (2013) Compliance measures for an autonomous weapons convention. International Committee for Robot Arms Control Working Paper no. 2. Available at: http://icrac.net/wp-content/uploads/2013/05/Gubrud-Altmann_Compliance-Measures-AWC_ICRAC-WP2.pdf.
Heyns C (2013) Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions. United Nations Document A/HRC/23/47, April 9. Available at: www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.
Human Rights Watch and International Human Rights Clinic at Harvard Law School (2012) Losing humanity: The case against killer robots. November 19. Available at: www.hrw.org/reports/2012/11/19/losing-humanity-0.
Lewis GN and Postol TA (2010) How US strategic antimissile defense could be made to work. Bulletin of the Atomic Scientists 66(6): 8–24.
Meron T (2000) The Martens Clause, principles of humanity, and dictates of public conscience. American Journal of International Law 94(1): 78–89.
Nanex (2010) Analysis of the 'Flash Crash'. June 18. Available at: www.nanex.net/20100506/FlashCrashAnalysis_Intro.html.
Pew Research Center (2012) Global opinion of Obama slips, international policies faulted. June 13. Available at: www.pewglobal.org/2012/06/13/global-opinion-of-obama-slips-international-policies-faulted/.
Schmitt MN and Thurnher JS (2013) 'Out of the loop': Autonomous weapon systems and the law of armed conflict. Harvard National Security Journal 4(2): 231–281.
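The tamper-evident record-keeping scheme described in note 4 can be sketched in a few lines of code. This is a minimal illustration only: the record fields, the use of SHA-256, and the function names are assumptions made for the example, not details specified in the article or in the Gubrud and Altmann (2013) working paper.

```python
# Sketch of note 4's scheme: time-stamp an engagement record, compute its
# digital signature (hash), and later prove the archived record was not
# altered by re-hashing and comparing with the published digest.
import hashlib
import json
import time

def seal_record(record: dict) -> tuple[dict, str]:
    """Add a timestamp and return the record with its hash. Publishing
    only the hash commits to the contents while keeping identities secret."""
    record = dict(record, timestamp=time.time())
    digest = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record, digest

def verify_record(record: dict, published_digest: str) -> bool:
    """Re-hash an archived record and check it against the published hash."""
    recomputed = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return recomputed == published_digest

record, digest = seal_record({"commander": "redacted", "justification": "..."})
assert verify_record(record, digest)       # unmodified record checks out
record["justification"] = "altered"
assert not verify_record(record, digest)   # any later change is detectable
```

Holding the digest at an international body, as the note suggests, means the record itself can stay encrypted and archived nationally; only if an attack is challenged need the record be opened and checked against the committed hash.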


Author biography

Mark Gubrud is a member of the International Committee for Robot Arms Control. Research for this article was done while he was a postdoctoral research associate in the Program on Science and Global Security at Princeton University. He did his doctoral work in low-temperature and nanoscale experimental physics at the University of Maryland and has taught physics at the University of North Carolina.
