Published by SAGE Publications (http://www.sagepublications.com) on behalf of the Bulletin of the Atomic Scientists.
Mark Gubrud
University of North Carolina at Chapel Hill
Abstract

Autonomous weapons are robotic systems that, once activated, can select and engage targets without further intervention by a human operator. Advances in computer technology, artificial intelligence, and robotics may lead to a vast expansion in the development and use of such weapons in the near future. Public opinion runs strongly against killer robots. But many of the same claims that propelled the Cold War are being recycled to justify the pursuit of a nascent robotic arms race. Autonomous weapons could be militarily potent and therefore pose a great threat. For this reason, substantial pressure from civil society will be needed before major powers will seriously consider their prohibition. However, demands for human control and responsibility and the protection of human dignity and sovereignty fit naturally into the traditional law of war and imply strict limits on autonomy in weapon systems. Opponents of autonomous weapons should point out the terrible threat they pose to global peace and security, as well as their offensiveness to principles of humanity and to public conscience.

Keywords

autonomous weapons, Convention on Certain Conventional Weapons, international humanitarian law, killer robots, unmanned weapons
Since the first lethal drone strike in […] the most dynamic and destabilizing com- […] its first year. In May, the United Nations Special Rapporteur on extrajudicial killings, Christof Heyns, recommended that nations immediately declare moratoriums on their own development of lethal autonomous robotics (Heyns, 2013). Heyns also called for a high-level study of the issue, a recommendation seconded in July by the UN Advisory Board on Disarmament Matters. At the UN General Assembly's First Committee meeting in October, a flood of countries began to express interest or concern, including China, Russia, Japan, the United Kingdom, and the United States. France called for a mandate to discuss the issue under the Convention on Certain Conventional Weapons, a global treaty that restricts excessively injurious or indiscriminate weapons. Meeting in Geneva in November, the state parties to the Convention agreed to formal discussions on autonomous weapons, with a first round in May 2014. The issue has been placed firmly on the global public and diplomatic agenda.

Despite this impressive record of progress on an issue that was until recently virtually unknown – or scorned as a mixture of science fiction and paranoia – there seems little chance that a strong arms control regime banning autonomous weapons will soon emerge from Geneva. Unlike glass shrapnel, blinding lasers, or even landmines and cluster munitions, autonomous weapon systems are not niche armaments of negligible strategic importance and unarguable cost to humanity. Instead of the haunting eyes of children with missing limbs, autonomous weapons present an abstract, unrealized horror, one that some might hope will simply go away.

Unless there is a strong push from civil society and from governments that have decided against pursuing autonomous weapons, those that have decided in favor of them – including the United States (Gubrud, 2013) – will seek to manage the issue as a public relations problem. They will likely offer assurances that humans will remain in control, while continually updating what they mean by control as technology advances. Proponents already argue that humans are never really out of the loop because humans will have programmed a robot and set the parameters of its mission (Schmitt and Thurnher, 2013). But autonomy removes humans from decision making, and even the assumption that autonomous weapons will be programmed by humans is ultimately in doubt. Diplomats and public spokesmen may speak in one voice; warriors, engineers, and their creations will speak in another.

The development and acquisition of autonomous weapons will push ahead if there is no well-defined, immovable, no-go red line. The clearest and most natural place to draw that line is at the point when a machine pulls the trigger, making the decision on whether, when, and against whom or what to use violent force. Invoking a well-established tenet of international humanitarian law, opponents can argue that this is already contrary to principles of humanity, and thus inherently unlawful. Equally important, opponents must point out the threat to peace and security posed by the prospect of a global arms race toward robotic arsenals that are increasingly out of human control.

Humanitarian law vs. killer robots

The public discussion launched by the Campaign to Stop Killer Robots has mostly centered on questions of legality […]
[…] autonomy, due to the burdens and vulnerabilities of communications links, the need for stealth, and the sheer numbers of robots likely to be involved. The US Navy is particularly interested in autonomy for undersea systems, where communications are especially problematic. Civilians are sparse on the high seas and absent on submarines, casting doubt on the relevance of humanitarian law. As the Navy contemplates future conflict with a peer competitor, it projects drone-versus-drone warfare in the skies above and waters below, and the use of sea-based drones to attack targets inland as well.

In a cold war, small robots could be used for covert infiltration, surveillance, sabotage, or assassination. In an open attack, they could find ways of getting into underground bunkers or attacking bases and ships in swarms. Because robots can be sent on one-way missions, they are potential enablers of aggression or preemption. Because they can be more precise and less destructive than nuclear weapons, they may be more likely to be used. In fact, the US Air Force's Long Range Strike Bomber is planned to be both nuclear-capable and potentially unmanned, which would almost certainly mean autonomous.

There can be no real game-changers in the nuclear stalemate. Yet the new wave of robotics and artificial intelligence-enabled systems threatens to drive a new strategic competition between the United States and other major powers – and lesser powers, too. Unlike the specialized technologies of high-performance military systems at the end of the Cold War, robotics, information technology, and even advanced sensors are today globally available, driven as much by civilian as military uses. An autonomous weapons arms race would be global in scope, as the drone race already is.

Since robots are regarded as expendable, they may be risked in provocative adventures. Recently, China has warned that if Japan makes good on threats to shoot down Chinese drones that approach disputed islands, it could be regarded as an act of war. Similarly, forward-basing of missile interceptors (Lewis and Postol, 2010) or other strategic weapons on unmanned platforms would risk misinterpretation as a signal of imminent attack, and could invite preemption.

Engineering the stability of a robot confrontation would be a wickedly hard problem even for a single team working together in trust and cooperation, let alone hostile teams of competing and imperfectly coordinated sub-teams. Complex, interacting systems-of-systems are prone to sudden unexpected behavior and breakdowns, such as the May 6, 2010 stock market crash caused by interacting exchanges with slightly different rules (Nanex, 2010). Even assuming that limiting escalation would be a design objective, avoiding defeat by preemption would be an imperative, and this implies a constant tuning to the edge of instability. The history of the Cold War contains many well-known examples in which military response was interrupted by the judgment of human beings. But when tactical decisions are made with inhuman speed, the potential for events to spiral out of control is obvious.

The way out

Given the military significance of autonomous weapons, substantial pressure from civil society will be needed before the major powers will seriously consider accepting hard limits, let alone
prohibition. The goal is as radical as, and no less necessary than, the control and abolition of nuclear weapons.

The principle of humanity is an old concept in the law of war. It is often cited as forbidding the infliction of needless suffering, but at its deepest level it is a demand that even in conflict, people should not lose sight of their shared humanity. There is something inhumane about allowing technology to decide the fate of human lives, whether through individual targeting decisions or through a conflagration initiated by the unexpected interactions of machines. The recognition of this is already deeply rooted. A scientific poll (Carpenter, 2013) found that Americans opposed to autonomous weapons outnumbered supporters two to one, in contrast to an equally strong consensus in the United States supporting the use of drones. The rest of the world leans heavily against the drone strikes (Pew Research Center, 2012), making it seem likely that global public opinion will be strongly against autonomous weapons, both on humanitarian grounds and out of concern for the dangers of a new arms race.

In the diplomatic discussions now under way, opponents of autonomous weapons should emphasize a well-established principle of international humanitarian law. Seeking to resolve a diplomatic impasse at the Hague Conference in 1899, Russian diplomat Friedrich Martens proposed that for issues not yet formally resolved, conduct in war was still subject to "principles of international law derived from established custom, from the principles of humanity, and from the dictates of public conscience." Known as the Martens Clause, it reappeared in the second Hague Convention (1907), the Tehran Conference on Human Rights (1968), and the Geneva Convention additional protocols (1977). It has been invoked as the source of authority for retroactive liability in war crimes and for preemptive bans on inhumane weapons, implying that a strong public consensus has legal force in anticipation of an explicit law (Meron, 2000).

Autonomous weapons are a threat to global peace and therefore a matter of concern under the UN Charter. They are contrary to established custom, principles of humanity, and dictates of public conscience, and so should be considered as preemptively banned by the Martens Clause. These considerations establish the legal basis for formal international action to prohibit machine decision in the use of force. But for such action to occur, global civil society will need to present major-power governments with an irresistible demand: Stop killer robots.

Funding

This research was supported in part by a grant from the John D. and Catherine T. MacArthur Foundation.

Notes

1. Philosopher Peter Asaro (2012) takes this a step further, arguing that those who plan or decide military attacks are assumed to be human, and that proportionality, in particular, is inherently subjective and represents not just an algorithmic criterion but a moral burden upon commanders to give due consideration to the human costs in judging whether a lethal action is justified.

2. However, emotion is an active area of research in robotics and artificial intelligence. Research goals include the recognition of human emotion, the simulation of emotional responses, social interaction, and internal states as moderators of robot behavior. Moreover, it is not clear that emotion is the best way to govern behavior in robots.
Hard rules such as "Don't open fire on civilians" might be preferable (Arkin, 2009).

3. Mines have already been addressed by the Convention on Certain Conventional Weapons and by the landmines and cluster munitions treaties. Their inclusion in an autonomous weapons ban would strengthen it from the point of view of conceptual and moral clarity, but might needlessly complicate the negotiations.

4. Such records would logically include the data on which the engagement decision was based, as well as the identities of commander and operator, and some evidence of their human action. Identities can be kept secret, but the commander's account of the justification for an attack may be crucial. The records can be time-stamped, encrypted, and archived, and a digital signature (hash) of each record published or held by an international body, in order to prove that records were not later altered. Some data, such as the time-stamped engagement command, could be recorded in real time by tamper-proofed "glass box" verification devices (Gubrud and Altmann, 2013).

References

Adams TK (2001) Future warfare and the decline of human decisionmaking. Parameters 31(4): 57–71.

Akerson D (2013) The illegality of offensive lethal autonomy. In: Saxon D (ed.) International Humanitarian Law and the Changing Technology of War. Leiden: Martinus Nijhoff, 65–98.

Anderson K and Waxman M (2013) Killer robots and the laws of war. Wall Street Journal, November 4, A19. Available at: http://online.wsj.com/news/articles/SB10001424052702304655104579163361884479576.

Arkin RC (2009) Governing Lethal Behavior in Autonomous Robots. Boca Raton, FL: CRC.

Asaro P (2012) On banning autonomous weapon systems: Human rights, automation, and the dehumanization of lethal decision-making. International Review of the Red Cross 94(886): 687–709.

Carpenter C (2013) US public opinion on autonomous weapons. Available at: www.whiteoliphaunt.com/duckofminerva/wp-content/uploads/2013/06/UMass-Survey_Public-Opinion-on-Autonomous-Weapons_May2013.pdf.

Defense Department (2009) Chairman of the Joint Chiefs of Staff instruction 3160.01: No-strike and the collateral damage estimation methodology. February 13. Available at: www.aclu.org/files/dronefoia/dod/drone_dod_3160_01.pdf.

Defense Department (2012) Directive 3000.09: Autonomy in weapon systems. November 21. Available at: www.dtic.mil/whs/directives/corres/pdf/300009p.pdf.

Ferrucci D, Brown E, Chu-Carroll J, et al. (2010) Building Watson: An overview of the DeepQA Project. AI Magazine 31(3): 59–79. Available at: www.aaai.org/ojs/index.php/aimagazine/article/view/2303.

Gubrud M (2013) US killer robot policy: Full speed ahead. Bulletin of the Atomic Scientists, web edition, September 20. Available at: http://thebulletin.org/us-killer-robot-policy-full-speed-ahead.

Gubrud M and Altmann J (2013) Compliance measures for an autonomous weapons convention. International Committee for Robot Arms Control Working Paper no. 2. Available at: http://icrac.net/wp-content/uploads/2013/05/Gubrud-Altmann_Compliance-Measures-AWC_ICRAC-WP2.pdf.

Heyns C (2013) Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions. United Nations Document A/HRC/23/47, April 9. Available at: www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.

Human Rights Watch and International Human Rights Clinic at Harvard Law School (2012) Losing humanity: The case against killer robots. November 19. Available at: www.hrw.org/reports/2012/11/19/losing-humanity-0.

Lewis GN and Postol TA (2010) How US strategic antimissile defense could be made to work. Bulletin of the Atomic Scientists 66(6): 8–24.

Meron T (2000) The Martens Clause, principles of humanity, and dictates of public conscience. American Journal of International Law 94(1): 78–89.

Nanex (2010) Analysis of the 'Flash Crash'. June 18. Available at: www.nanex.net/20100506/FlashCrashAnalysis_Intro.html.

Pew Research Center (2012) Global opinion of Obama slips, international policies faulted. June 13. Available at: www.pewglobal.org/2012/06/13/global-opinion-of-obama-slips-international-policies-faulted/.

Schmitt MN and Thurnher JS (2013) 'Out of the loop': Autonomous weapon systems and the law of armed conflict. Harvard National Security Journal 4(2): 231–281.
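A short sketch may help make concrete the tamper-evidence scheme described in note 4: hash each engagement record, publish or escrow only the digest, and later prove that a disclosed record was not altered. This is an illustration only, not part of the original proposal; the record fields, the choice of SHA-256, and the function names are assumptions of the sketch.

```python
import hashlib
import json

def seal_record(record: dict) -> tuple[bytes, str]:
    """Serialize an engagement record deterministically and return
    (archived_bytes, digest). The record itself can stay encrypted and
    archived; only the digest need be published or held by an
    international body at the time of the engagement."""
    blob = json.dumps(record, sort_keys=True).encode("utf-8")
    return blob, hashlib.sha256(blob).hexdigest()

def verify_record(archived_blob: bytes, published_digest: str) -> bool:
    """Recompute the hash of a later-disclosed record and compare it
    with the digest published earlier; a mismatch proves the record
    was altered after the fact."""
    return hashlib.sha256(archived_blob).hexdigest() == published_digest

# A hypothetical engagement record of the kind note 4 describes.
record = {
    "timestamp": "2014-05-13T09:30:00Z",
    "commander_id": "redacted-7f3a",  # identities may be kept secret
    "operator_id": "redacted-91bc",
    "engagement_command": "FIRE",
    "justification": "hostile act observed",
}

blob, digest = seal_record(record)  # digest published at engagement time
assert verify_record(blob, digest)  # the unaltered record verifies
assert not verify_record(blob.replace(b"FIRE", b"HOLD"), digest)  # tampering is detectable
```

Because publishing the digest reveals nothing about the record's contents, verification can coexist with operational secrecy; the tamper-proofed "glass box" devices of Gubrud and Altmann (2013) would extend the same idea to real-time capture of the time-stamped engagement command.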