The general idea of LAWS is that once such a system is developed and activated, it would, with the help of AI, machine learning, sensors and complex algorithms, identify, seek out and attack targets without any human intervention. As of now, no country in the world has succeeded in developing a fully autonomous weapon system; however, rapid advancement has made this a distinct possibility in the near future. It is important to note that, at this point, several countries already use near-autonomous defensive systems. These are used in a protective role to intercept incoming attacks. Such systems do not actively seek out and attack targets but instead respond to predetermined threats. Israel's Iron Dome is one such example.[2]
The question of concern is whether LAWS can comply with existing International Humanitarian Law (“IHL”) without breaching ethical norms.
Given the increased emphasis on ethics and human dignity, the degree of autonomy becomes very relevant. A majority of CCW State parties and other non-governmental parties believe that LAWS must remain subject to some human control, since it is difficult for a machine ever to achieve human-like judgement. The relevant question is what degree of human control such systems require. An ideal approach would be to let the system work autonomously while ensuring that, as and when needed, humans retain a veto over the functioning of LAWS. A principle of human control should be internationally recognised within the framework of the CCW, and possibly in other instruments of international law, as the basis from which requirements can be developed as part of a norm-shaping process.[10]
IHL rests on three basic principles: distinction, proportionality and precaution. These principles govern, among other things, the distinction between military objectives and civilian objects. It is hard to be confident that a machine would ensure distinction, judge proportionality or take precautions should circumstances change.[11] Ideally, these obligations cannot and should not be transferred to a machine, as a machine is hard to hold accountable; hence the relevance of the limited human control highlighted above.
Further, to bridge the gap between ethical considerations and IHL, it is relevant to highlight the Martens Clause,[12] which requires systems and their usage to meet the ‘principles of humanity and the dictates of public conscience’. It applies to cases not covered by existing treaties and underscores the importance of upholding the basic ethical principle of human dignity. Since this is a new arena with no precedent whatsoever, the predictability, accuracy and reliability of such systems are uncertain; we therefore need these pillars of humanity to place reasonable restrictions on their development and usage.
A few ways to regulate LAWS are listed below; each of the options presented carries certain legal, ethical and operational implications depending on the perspective of the State parties.
A third option could be banning only the use of LAWS, as opposed to banning both development and use. This option would allow States to continue to research and develop such systems without any standard of regulation. It would also allow development and use at the domestic level, as well as the export of technologies to State and non-State actors. As an alternative to completely banning the use of LAWS, a protocol could be considered that sets limitations on certain uses rather than imposing a complete ban.
A Way Forward:
This debate on LAWS is at a very early stage and is therefore very broad and vague. A pragmatic approach to dealing with LAWS is necessary. To begin with, the international community must first agree on a working definition so that detailed discussion of individual elements can start. The lack of consensus on a working definition is delaying discussion of a regulatory framework, even as many countries have already started developing the technology required for LAWS and several major powers have made significant progress in this arena. At present, there are many challenges that need to be addressed by the GGE over and above the existing concerns: first, the process is delayed by the lack of consensus on the precise technology; second, the exchange and misuse of technology by non-State actors do not fall within the existing scope of the discussion; third, the existing framework is grounded only in IHL and does not include other legal frameworks such as human rights law, criminal law and product liability; fourth, the debate focuses solely on weapons and their use in armed conflicts, while their usage and potential benefits in other spheres of security and peacekeeping operations are not under discussion.
References:
[1] Brigadier Saurabh Tewari, “Impact of Disruptive Technologies on Warfare”, CLAWS Issue Brief No. 185, June 18, 2019. URL: https://www.claws.in/publication/impact-of-disruptive-technologies-on-warfare/
[4] United Nations, UNODA, CCW, Reports of the Meetings of Experts of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons (2014, 2015 and 2016). Available on the Internet.
[7] International Human Rights Clinic, Heed the Call: A Moral and Legal Imperative to Ban Killer Robots (August 2018). Available on the Internet.