www.publicserviceeurope.com/article/2750/the-future-of-global-warfare-killer-robots

The future of global warfare: killer robots
20 November 2012

Pre-emptive prohibition on fully autonomous weapons is needed, as giving machines the power to decide who lives and dies on the battlefield would take defence technology a step too far - warns campaigner

Despite a lack of public awareness and public debate, a number of governments, including European states, are pushing forward with the development of fully autonomous weapons - also known as killer robots. These are weapon systems that will function without any human intervention: the armed robot itself will select its target and will determine when to fire. This is a frighteningly dangerous path to follow in terms of the need to protect civilians during armed conflict.

Killer robots would be unable to distinguish adequately between combatants and civilians in the increasingly complex circumstances of modern battlefields, and would be unable to make proper proportionality determinations - that is, to judge whether the military advantages of an attack exceed the potential harm to civilians. Giving machines the power to decide who lives and dies on the battlefield would take technology too far. Killer robots would lack the human qualities necessary to protect civilians and comply with international humanitarian law. They would lack the ability to relate to humans and to apply human judgment. They would also create an accountability gap, as it would be unclear who should be held responsible for the inevitable violations of the law that would occur.

Fully autonomous weapons do not yet exist, though precursors do. These precursors demonstrate the rapid movement toward autonomy and toward replacing humans on the battlefield. The United States is the most active in developing these technologies, but others include China, Germany, Israel, Russia, South Korea and the United Kingdom.
Sophisticated fully autonomous weapons may be fielded within 20 or 30 years, according to many experts; some indicate that cruder versions could be available much sooner - in a matter of years, not decades. Armed drones are not fully autonomous weapons and are not part of the call for a ban. Human Rights Watch has extensively criticised the way drones have been used - for extrajudicial killings, for example - but the key issue with drones is not the nature of the weapon, as it is with fully autonomous weapons. Drones have a 'man-in-the-loop', with a human remotely selecting the target and deciding when to fire. With killer robots, the human is out of the loop and the machine determines what to attack and when.

Human Rights Watch has just released the first in-depth report by a non-governmental organisation on this issue, 'Losing humanity: the case against killer robots'. We conclude that these weapons would not be able to comply with international humanitarian law standards and would pose unacceptable dangers to civilians. And we are calling for a pre-emptive ban on the development, production and use of fully autonomous weapons. Governments should enact such a ban at the national level, as a stepping stone to an international treaty with a comprehensive prohibition.

At present, militaries mostly speak of retaining some degree of human oversight over armed robots for the foreseeable future. But numerous military planning documents make clear that many see fully autonomous weapons as the desirable and inevitable future of warfare, at least for rich nations. With each passing day, more money will be ploughed into the research and development of fully autonomous weapons. The more that is invested, the more such weapons will become part of plans and doctrine for future fighting. Killer robots need to be stopped now, before it is too late and their march from science fiction to reality becomes irreversible.
Steve Goose is director of the arms division at the Human Rights Watch campaign group