
Scalability of Weapons Systems and Proliferation Risk

Lethal autonomous weapons (LAWs) are a type of autonomous
military robot that can independently search for and engage targets
based on programmed descriptions and constraints.

1. Unpredictable Performance

Lethal autonomous weapons (LAWs) have been called
"unpredictable by design," mainly because the behavior of these
weapons in real-world settings cannot be fully anticipated.

2. Escalation Risk

Given the speed and scale at which autonomous weapons are
capable of operating, these weapons systems carry risks of
unintentional escalation; the way adversarial AI systems can rapidly
spiral out of control provides a good parallel. Simply keeping a
human in control of an autonomous weapon eliminates an estimated
80% of these problems. Recent research by the RAND Corporation
has noted that "the speed of autonomous weapons did lead to
inadvertent escalation in the wargame."
3. Scalability of Weapons Systems and Proliferation Risk
Artificial intelligence enables tasks to be accomplished at scale and
at lower cost. The resulting ability to mass-produce autonomous
weapons cheaply creates a dynamic that is highly destabilizing to
society. When a human is required to make the decision to kill, there
is an inherent limit to how many weapons that person can adequately
supervise, on the order of one to a few individual weapons.
Removing human judgment also removes the limit on the number of
weapons systems that can be activated, meaning a single individual
could activate hundreds, thousands, or even millions of weapons at
once.
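The supervision-limit argument can be made concrete with a toy calculation. All numbers and function names below are illustrative assumptions, not figures from any study:

```python
# Toy model: fielded-weapons capacity under two control regimes.
# All parameters are illustrative assumptions.

def supervised_capacity(operators: int, weapons_per_operator: int) -> int:
    """With a human required for each kill decision, capacity scales
    linearly with the number of human supervisors."""
    return operators * weapons_per_operator

def autonomous_capacity(operators: int, weapons_activated: int) -> int:
    """Without the human-judgment requirement, the number of weapons a
    single operator can activate is unbounded by supervision capacity."""
    return weapons_activated  # independent of operator count

# One operator, adequately supervising at most a few weapons at a time:
print(supervised_capacity(operators=1, weapons_per_operator=3))    # 3
# The same single operator activating a mass-produced autonomous swarm:
print(autonomous_capacity(operators=1, weapons_activated=10_000))  # 10000
```

The destabilizing dynamic is the gap between these two functions: the first is bounded by the human workforce, the second only by the number of weapons produced.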

4. Selective Targeting of Groups

Selecting individuals to kill based on sensor data alone, especially
through facial recognition or other biometric information, introduces
substantial risks for the selective targeting of groups based on
perceived age, gender, race, ethnicity, or dress.

Ways to Reduce the Damage from LAWs:

1. The Positive Obligation of Human Control: The first element is a
commitment by countries that all weapons systems must
operate under meaningful human control. This means that
humans, not algorithms, decide to kill.

2. Prohibitions on Systems Incapable of Human Control: The
second element is for countries to agree to prohibit weapons
systems incapable of meeting the human control requirement.
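Purely as an illustration, the "meaningful human control" requirement can be sketched as an authorization gate: the system may produce a recommendation, but no engagement proceeds without an explicit human decision. The class and function names here are hypothetical, not taken from any real system:

```python
from dataclasses import dataclass

@dataclass
class EngagementRequest:
    """A hypothetical request produced by a weapon system's sensors."""
    target_id: str
    sensor_confidence: float

def engage(request: EngagementRequest, human_approved: bool) -> str:
    """Meaningful human control as a gate: the algorithm may recommend,
    but only an explicit human decision can authorize lethal action."""
    if not human_approved:
        return "HOLD: no human authorization"
    return f"AUTHORIZED by human operator for {request.target_id}"

# Regardless of sensor confidence, the default is to hold fire:
req = EngagementRequest(target_id="track-42", sensor_confidence=0.99)
print(engage(req, human_approved=False))  # HOLD: no human authorization
```

A system "incapable of human control" in the sense of element 2 would be one where no such gate exists, so the default path leads to engagement rather than to a hold.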
