
23rd Annual Session of the Seoul Model United Nations

Forum: General Assembly One (DISEC)


Question of: Establishing guidelines for the use of lethal
autonomous weapon systems (LAWS)
Student Officer: Audrey Chan, Deputy Assistant President

Introduction

In a world where new technologies are constantly emerging, autonomous technologies first expanded the use of unmanned ground vehicles and have since been actively developed for military purposes, leading toward fully developed lethal autonomous weapons (LAWs) that could revolutionize warfare and have sparked debate all over the world. The capabilities of these robotic weapons bring forth new humanitarian and legal challenges, especially for the protection of civilians and compliance with international human rights and humanitarian law. As LAWs become more sophisticated and versatile, it is timely that such unpredictable technologies be viewed through a human rights lens and brought under legal review.

Although there is not yet an internationally agreed definition of LAWs, they are fundamentally a type of autonomous military system that can independently select and attack targets1 through engineered algorithms, and are therefore capable of functioning without the intervention of a human operator. Lethal autonomous weapons can be used in the air (drone aircraft), on land (unmanned ground vehicles), on and under water (unmanned submarines)2, or in space (unmanned space drones). LAWs cover a wide range of weapon systems, from fully autonomous weapons that can launch attacks on their own to semi-autonomous weapons that require affirmative human action to carry out a mission. Semi-autonomous weapons already exist today, such as military drones that only fire under the control of soldiers in areas where the United States is at war or engaged in military operations3. This may not sound like much of a threat at present, because lethal autonomous weapons are not yet fully developed, but “people who work in the related technologies think it’d be relatively easy to put together a very effective weapon with the existing technology that can replace humans with an algorithm in the next two years”, says Stuart Russell, a computer science professor and leading AI researcher.4 Given how fast artificial intelligence (AI) has grown in recent years, it is inevitable that lethal autonomous weapons will emerge in the near future, posing a greater threat to society, as advanced AI capabilities such as facial and object recognition and instant tactical decision-making are potential technologies essential to the development of LAWs.

1 file:///Users/audreychan/Downloads/4221-002-autonomous-weapons-systems-full-report.pdf
2 https://www.naval-technology.com/projects/sea-hunter-asw-continuous-trail-unmanned-vessel-actuv/
3 https://www.vox.com/2019/6/21/18691459/killer-robots-lethal-autonomous-weapons-ai-war
4 Ibid


LAWs will play an important role in present and future warfare as unpredictable weapons that may either help reduce collateral damage and the harms of war5 or become a force capable of destroying the world, which highlights the importance of preventing such negative consequences.

Due to the ambiguity of lethal autonomous weapons, it is difficult to determine the autonomy of existing (and future) weapon systems, which creates conflict because of differing opinions on how autonomy in a given weapon system should be assessed.6 The US Department of Defense has divided autonomous weapons into three main categories based on the level of autonomy and human control: autonomous weapon systems (human out-of-the-loop), supervised autonomous weapon systems (human on-the-loop) and semi-autonomous weapon systems (human in-the-loop).7 According to the report of the International Committee of the Red Cross expert meeting in Geneva8, a speaker stated that the criteria for evaluating the implications of autonomy in a weapon system are the mission to be performed, the sophistication or complexity of the system, and the extent of human intervention and supervision. Given the increase in military interest in autonomous weapons due to their high appeal, various factors must be taken into account by both militaries and member states when assessing the desirability of autonomy in selecting and attacking targets and the acceptable level of autonomy for a given weapon system.9 These may include the advantage that such autonomy offers to militaries, how strongly it is needed for a particular task or mission, and the reliability of communications between machine and operator. Hence the underlying problem of LAWs: “the balance it strikes between humanitarian concerns and military necessity”10, as stated by India’s representative at the CCW meeting in 2018.

A major issue regarding the concept of LAWs is their potential violation of international law, along with the ethical concerns they raise.

Definition of Key Terms

Lethal Autonomous Weapons (LAWs)


A weapon system that, once activated, can select and engage targets without further intervention by a human operator.11 Also known as Autonomous Weapon Systems (AWS) or Lethal Autonomous Robots (LARs), the most significant aspect of such systems is that they have an autonomous ‘option’ with regard to the selection of a target and the use of lethal force.12
5 Ibid
6 file:///Users/audreychan/Downloads/4221-002-autonomous-weapons-systems-full-report.pdf
7 US Department of Defense (2012) Autonomy in Weapon Systems, Directive 3000.09, 21 November
2012, Glossary, Part II Definitions.
8 Ibid footnote 6
9 Ibid
10 https://www.un.org/press/en/2018/gadis3611.doc.htm
11 Ibid footnote 7


Supervised Autonomous Weapon System


Autonomous weapon system which allows human operators to step in and terminate
engagements in the case of a malfunction or system failure.13

Semi-autonomous Weapon System


A weapon system that can only engage individual targets or specific target groups that have been selected by a human operator.14

Meaningful human control


Meaningful human control is a concept that has been the main focus of international
discussions regarding AWS, as the unique and identifying feature of a lethal autonomous
weapons system is the ability to operate without meaningful human control. According to
Article36, the requirement for meaningful human control develops from two basic concepts:
“1) that a machine applying force and operating without any human control whatsoever is
broadly unacceptable
2) that a human pressing a ‘fire’ button in response to indications from a computer,
without cognitive clarity or awareness, is not sufficient to be considered ‘human control’
in a substantive sense.” 15

Timeline of Key Events

December 2, 1983 - Convention on Prohibitions or Restrictions on the Use of Certain


Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have
Indiscriminate Effects entered into force.
The convention seeks to ban or restrict the use of certain conventional arms which are
excessively injurious or have indiscriminate effects.16 The convention covers landmines, booby
traps, incendiary weapons, blinding laser weapons, and clearance of explosive remnants of war.17
The Convention is commonly shortened to the Convention on Certain Conventional Weapons (CCW).

12 C. Heyns, Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof
Heyns. UN General Assembly, A/HRC/23/47 (9 April 2013), para. 38,
http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.
13 Ibid footnote 10
14 Ibid
15 http://www.article36.org/wp-content/uploads/2016/04/UK-and-LAWS.pdf
16 https://news.un.org/en/story/2019/03/1035381
17https://en.wikipedia.org/wiki/Convention_on_Certain_Conventional_Weapons#:~:text=The%20United
%20Nations%20Convention%20on,or%20whose%20effects%20are%20indiscriminate.


September 1, 2009 - Establishment of the International Committee for Robot Arms Control
The International Committee for Robot Arms Control (ICRAC) was established to call for the “prohibition of the development, deployment and use of armed autonomous unmanned systems”.18

October 19, 2012 - Formation of the Campaign to Stop Killer Robots


The “Campaign to Stop Killer Robots” was formed by representatives from seven non-governmental organizations.19 It is a growing global coalition of 165 international, regional and national NGOs in 65 countries that aims to ban fully autonomous weapons.20

November 21, 2012 - United States becomes the first government to establish a policy on fully autonomous weapons

The United States Department of Defense published a policy directive (Number 3000.09)
on autonomous weapons, marking the country as the first government to establish a policy on
said weapon systems.21

May 30, 2013 - First Human Rights Council debate on LAWs

20 nations spoke about lethal autonomous weapons after a presentation of the report by
the UN special rapporteur on extrajudicial killings.22

May 13, 2014 - First Multilateral meeting on LAWs at the UN in Geneva

Convened under the CCW, UN agencies, the Campaign to Stop Killer Robots, the ICRC and representatives from 87 nations participated in the meeting, which “featured presentations by 18 experts on technical, ethical, legal and operational questions raised by the weapons”.23

September 26, 2019 - Declaration on LAWs opened for endorsement during the Alliance for Multilateralism event by France and Germany

The declaration on LAWs reflects the ongoing work at the CCW, using a set of guiding principles to outline a framework for the development and use of AWS.

Position of Key Member Nations and Other Bodies

United States of America

18 https://www.stopkillerrobots.org/about/
19 https://www.stopkillerrobots.org/action-and-achievements/
20 Ibid
21 Ibid
22 Ibid
23 Ibid


Since 2012, autonomy has become an official component of the US national security strategy through the Department of Defense policy directive. This policy allows semi-autonomous weapon systems to engage targets pre-selected by human operators, and fully autonomous weapons to do so after senior-level DoD approval.24 The United States has invested heavily in artificial intelligence for military use and is developing air-, land- and sea-based AWS. An example is the US Phalanx system for Aegis-class cruisers, which automatically detects, tracks and engages anti-air warfare threats.25 Although the United States currently has no lethal autonomous weapons, senior military and defense leaders say it may have to develop LAWs in the future if US adversaries choose to do so.26

Russian Federation
Russia has opposed proposals for a legally binding instrument on LAWs and is investing in the development of weapon systems with less and less human control over the selection and engagement of targets27, having made military investments in robotics and artificial intelligence a top priority for national defense.28 In line with the Russian programs for the 'Creation of Prospective Military Robots by 2025' and the 'Concept for the Deployment of Robotic Systems for Military Use by 2030,' Russia is preparing to have autonomous systems protecting its weapons silos by 2020 and plans to have 30% of its fighting force partially or completely autonomous by 2030.29 President Putin has stated that whoever leads in AI will rule the world, and Russia has already deployed the Uran-9 robotic tank to Syria.

China
China has always been ambiguous about where it stands on fully autonomous weapon systems. In 2018, China stated that it fully supported prohibiting fully autonomous weapons, but clarified that the ban it supported was limited to their use on the battlefield, meaning the country still supports their development and production. Xi Jinping aims to make the country the world leader in AI by 2030, and technologies made by Chinese AI companies, such as smart surveillance cameras, voice recognition capabilities, big data services and facial recognition devices, are being sold to the government and exported.30 Daan Kayser from PAX, a European peace organization, said that “these technologies could easily be a key component for autonomous weapons”.31 Since 2019, China has not repeated its call for a new international treaty for the prohibition of LAWs.32

24 https://onlinelibrary.wiley.com/doi/full/10.1111/1758-5899.12713
25 https://www.timesofisrael.com/israeli-killer-robots-could-be-banned-under-un-proposal/#gs.fresdn
26 https://fas.org/sgp/crs/natsec/IF11150.pdf
27 https://www.stopkillerrobots.org/2019/08/russia-united-states-attempt-to-legitimize-killer-
robots/#:~:text=During%20the%20early%20morning%20negotiation,of%20lethal%20autonomous
%20weapons%20systems.
28 Government of Russia, Statement to the Convention on Conventional Weapons Group of
Governmental Experts on lethal autonomous weapons systems, November 16, 2017,
https://conf.unog.ch/digitalrecordings/index.html?guid=public/61.0500/419FD8D8-C666-45AE-
8CAAFD3AA0DA9949_10h17&position=3296 (accessed July 20, 2020).
29 https://onlinelibrary.wiley.com/doi/full/10.1111/1758-5899.12713
30 https://time.com/5673240/china-killer-robots-weapons/


UK
The United Kingdom considers existing international humanitarian law sufficient for regulating LAWs and thus has no plans to support an international ban.33 According to the Ministry of Defence in 2011, the UK wishes to increase levels of automation in weapon systems only where it would be more effective, and does not intend to develop fully autonomous weapons.34 It is currently developing autonomous weapon systems, but the decision to strike must be overseen by human authority, i.e. commanders and operators.35 The UK’s definition of LAWs (“higher-level and futuristic weapon systems that are self-aware, and significantly more sophisticated than ‘killer robots’”36) has raised concerns because it suggests that the future development of fully autonomous weapons is not opposed. The UK currently has the Taranis jet-propelled combat drone prototype, which can autonomously search for, identify and locate enemies but engages only after human approval.37
Israel
Israel has remained open-minded about the capabilities and potential of future LAWs, believing that they may allow for better compliance with the laws of armed conflict compared to human soldiers.38 Israel is currently testing, producing, developing and using autonomous weapon systems, such as the Harpy, a “fire-and-forget” autonomous weapon system designed to detect, attack and destroy radar emitters39, while refusing calls to discuss a new international treaty to ban or fully restrict autonomous weapons.40

31 Ibid
32 https://cisp.cachefly.net/assets/articles/attachments/82945_arms0820_web.pdf
33 Government of the United Kingdom, Statement to the UN Human Rights Council, May 30, 2013,
https://stopkillerrobots.org/wp-content/uploads/2013/03/KRC_ReportHeynsUN_Jul2013.pdf, pp. 21-22
(accessed June 16, 2020).
34 United Kingdom Ministry of Defence, “Joint Doctrine Note 2/11: The UK Approach to Unmanned
Aircraft Systems,” March 30, 2011,
https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/33711/20110505JDN_211_
UAS_v2U.pdf (accessed July 17, 2020).
35 Government of the United Kingdom, Statement to the Convention on Conventional Weapons informal
meeting of experts on lethal autonomous weapons systems, November 15, 2017,
https://conf.unog.ch/digitalrecordings/index.html?guid=public/61.0500/EDA01C29-125B-4280-9EC1-
92C44A4E9830_10h09&position=3223 (accessed July 17, 2020); Government of the United Kingdom,
Statement to the Convention on Conventional Weapons informal meeting of experts on lethal
autonomous weapons systems, April 13, 2015,
http://www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2015/meeting-
expertslaws/statements/15April_UK.pdf (accessed June 16, 2020).
36 http://www.article36.org/wp-content/uploads/2016/04/UK-and-LAWS.pdf
37 https://www.timesofisrael.com/israeli-killer-robots-could-be-banned-under-un-proposal/#gs.fqy6r1
38 Government of Israel, Statement to the Convention on Conventional Weapons Meeting of High
Contracting Parties, November 14, 2014, https://unoda-web.s3-
accelerate.amazonaws.com/wpcontent/uploads/assets/media/A9D6A596BC5B169DC1257D9700471102/
file/Israe_LAWS_MSP.pdf (accessed June 15, 2020).


South Korea
Although South Korea is “wary of fully autonomous weapons systems that remove meaningful human control from the operation loop, due to the risk of malfunctioning, potential accountability gap, and ethical concerns”41, it actively researches, develops and invests in AI and autonomous weapons for military use (partly because it relies on mandatory conscription), although, like France, it does not possess any LAWs and does not wish to acquire them.42 South Korea’s main focus is the development of defensive autonomous weapon systems against the threat from the North; in 2006 it developed the semi-autonomous Samsung Techwin surveillance and security guard robot, which is deployed in the demilitarized zone between North and South Korea and detects targets through infrared sensors.43

France
France does not possess, and does not intend to ever obtain, lethal autonomous weapons44, and recognizes the complex ethical, legal, operational and technological concerns that come with removing human control from such weapon systems. France launched multilateral talks on LAWs in November 2013 as the CCW president and refuses to entrust life-or-death decisions to fully autonomous weapon systems.45 In 2019, during the Alliance for Multilateralism event, France and Germany declared that a human must always be responsible for the decision to use these systems and that states must examine, at the design stage, the legality of the new weapons they are developing or acquiring,46 as part of their proposal that the CCW agree to a non-legally binding political declaration.
39 https://www.timesofisrael.com/israeli-killer-robots-could-be-banned-under-un-proposal/#gs.fqy6r1
40 https://cisp.cachefly.net/assets/articles/attachments/82945_arms0820_web.pdf
41 Government of the Republic of Korea, Statement to the Convention on Conventional Weapons
informal meeting of experts on lethal autonomous weapons systems, April 13, 2015,
https://www.unog.ch/80256EDD006B8954/(httpAssets)/2A22908A9A03E949C1257E29005B90C1/$file/2
015_LAWS_MX_R oK_GS+Corr.pdf (accessed June 15, 2020).
42 Government of the Republic of Korea, Statement to the Convention on Conventional Weapons Group
of Governmental Experts meeting on lethal autonomous weapons systems, 25 March 2019,
https://twitter.com/BanKillerRobots/status/1110209366614044675 (accessed July 17, 2020).
43 Ibid footnote 25
44 Government of France, Statement to the UN Human Rights Council, May 30, 2013,
http://stopkillerrobots.org/wpcontent/uploads/2013/05/HRC_France_10_30May2013.pdf (accessed June
15, 2020).
45 Florence Parly, Minister of Defense, “Intelligence artificielle et defense,” April 5, 2019 (unofficial
translation), https://www.defense.gouv.fr/english/salle-de-presse/discours/discours-de-florence-
parly/discours-de-florence-parlyministre-des-armees_intelligence-artificielle-et-defense (accessed July 20,
2020)
46 https://www.diplomatie.gouv.fr/en/french-foreign-policy/united-nations/multilateralism-a-principle-of-
action-for-france/alliance-for-multilateralism-63158/article/11-principles-on-lethal-autonomous-weapons-
systems-laws


Campaign to Stop Killer Robots


CCW

Suggested Solutions

The concept of lethal autonomous weapon systems raises questions in four main areas: technological, legal, political and ethical. Yet the most urgent question to be answered is that of definition. What is a lethal autonomous weapon, and what counts as one? To what extent should autonomy be allowed? Who is to be held accountable if someone is wrongfully killed? How should ‘mistakes’ be addressed? Should LAWs be banned or regulated, and how? These are the biggest concerns at the root of the issue, and delegates should aim to address them in their resolutions.

Governments and member states should first of all continue their support for the negotiation of international treaties. It is vital for member states to lay down and agree on a collective definition of lethal autonomous weapons, as there are currently only national definitions in certain countries; a standard international definition does not yet exist. Delegates must work towards a common understanding to form a precise definition, taking into account the tradition of ethics in armed forces. Given this ambiguity, the most effective solution currently is to establish a legally binding instrument, i.e. a treaty, that sets a political and legal framework for LAWs. Member states should form an agreement in this treaty, based on their country stances, to regulate the research, development, creation and use of lethal autonomous weapons. Regulations could include restricting the number of LAWs through a universal consensus reached by countries, and establishing a new committee to monitor and supervise countries’ compliance with the regulations. In any case of a breach of the rules, said committee would also be in charge of deciding how a country should be prosecuted based on the severity of its actions. Delegates may come up with their own set of punishments and should be able to give reasoning behind them.

A way to ensure that the process of transferring decisions and responsibility to an autonomous weapon proceeds as desired, and that human control is always maintained, is constant monitoring and supervision by a higher authority, perhaps the committee previously mentioned, together with a shutdown mechanism that can only be operated by humans and cannot be altered by the system itself. Member states may also prevent potential hacking, collateral damage and mistakes by strengthening firewalls and adding encryption and antivirus software to these systems.


Another idea is to call for the creation of an agency or committee dedicated to multifaceted research into the effects of autonomous technology, especially its potential military use. This would play a significant role in regulation, as many advancing technological developments in the civilian and military sectors need to be critically analyzed and accompanied by political oversight.47 The results of such research could be used to create international norms for the extent of development, research and autonomy allowed for LAWs.

With regard to violations of ethical principles and international law such as IHL, involving judicial bodies such as the International Court of Justice is a good way to clarify accountability, offer advisory opinions and ensure neutrality. Member states could also recommend raising awareness of and education on LAWs, hosting expert meetings, and spreading the latest news and developments on LAWs.

Bibliography

Some Suggested Additional Resources:

Project Ploughshares, a Canadian peace research institute with a focus on disarmament efforts and international security, specifically in the areas of the arms trade, emerging military and security technologies, nuclear weapons, and outer space security -
https://ploughshares.ca/tag/lethal-autonomous-weapons-systems/

The Role of the United Nations in Addressing Emerging Technologies in the Area of Lethal
Autonomous Weapons Systems - https://www.un.org/en/un-chronicle/role-united-nations-
addressing-emerging-technologies-area-lethal-autonomous-weapons

Autonomous weapons that kill must be banned, insists UN chief -


https://news.un.org/en/story/2019/03/1035381

All drone strikes ‘in self-defence’ should go before Security Council, argues independent
rights expert - https://news.un.org/en/story/2020/07/1068041

47 https://www.swp-berlin.org/10.18449/2019RP03/#hd-d15993e964
