
Engholm 1

Brandon Engholm
CST 300L
December 9th, 2016
Automated Systems within the Military
Since the advent of the microchip, people and organizations have flocked to see what technology can do for them. The military is no stranger to this, as its mission is to defend the United States using every asset available to it. Accordingly, the military's use of automated systems and machines is increasing. Should this research and development continue?
Automated combat systems such as the Predator drone, and automated defense systems such as the Phalanx CIWS (Close-In Weapon System, pronounced "sea-whiz"), are seeing growing investment by the military, and the question is whether research and development of these technologies for use in combat should continue. To understand these systems, we must first know their history.
Background
The earliest recorded uses of Unmanned Aerial Vehicles (UAVs), also termed drones, date as far back as 1849, when Austria used unmanned hot air balloons to drop explosives on the Italian city of Venice (Naughton, 2003). World War II brought the next major experimentation with drones. Development was led by Reginald Denny, who later sold his idea for radio-controlled drones to the US military; his company became one of the first firms in the country to do so (Dennyreginald). Surprisingly, Denny's first audition for the military was a fiasco:

"Unbeknownst to the military that day, the aircraft went completely out of control," wrote Reginald's son. "The brass was extremely impressed with the wild aerobatics, while my father and his group were terrified that the drone might dive into the reviewing stands." (Dennyreginald, p. 11)
Denny won the contract and went on to produce thousands of drones for anti-aircraft training.
Post-WWII, the next development came from the Army Signal Corps, which built a pilotless drone that could take aerial photographs (Pilotless Photo Drone Takes Aerial Pictures, 1956). Just 25 years after this, the idea of the Predator system would begin.
The Predator system was initially developed in the early 1980s, but under a different name and company. That company ran into financial trouble, and in 1990 all of its assets were purchased by General Atomics (General Atomics Predator, 2008). General Atomics Aeronautical Systems, a subsidiary of General Atomics (GA), was awarded the contract to develop the Predator system in January 1994, finishing development in mid-1995. The Predator had been designed in response to a DOD requirement to provide the warfighter, that is, the soldier in the field, with persistent intelligence, surveillance, and reconnaissance information combined with a strike capability (U.S. Air Force, 2015). It was originally used as a reconnaissance system; however, in 2002 its designation was changed and it was given two Hellfire missiles, which are designed to strike armored targets, enabling it to strike ground targets from the air. A newer version, termed Predator B, was developed in 2001 with a heavier payload capability and increased speed and size (Predator B RPA). According to the datasheet by General Atomics, the Predator B features both remotely piloted and fully autonomous capabilities (Predator_B021915). More recently, General Atomics (US Air force STEALTH UAV armed with LASER GUN named General Atomics Avenger, 2014) and DARPA have each been considering possibilities for autonomous and networked drones (Operating in Contested Environments, 2015). Both organizations have envisioned a swarm-style autonomous system, although with varying configurations and deployment options (Engelking, 2015).
There have been incidents involving drones for several years. According to a New America report, somewhere between 750 and 1,000 civilians have been killed in drone strikes since 2006 (Grier, 2009). In another report, the Bureau of Investigative Journalism found that rescuers of those injured by a nearby strike had been targeted after the initial strike itself (Shane, 2012). The Bureau continued its investigation the following year, estimating that around 2,537 people unaffiliated with terrorism had been killed in drone strikes (McVeigh, 2013). The Bureau also found that many children had been killed in strikes (Doble, 2011).
The second major technology is the Phalanx close-in weapon system, or CIWS, developed by General Dynamics, whose relevant division is now owned by Raytheon. The system was developed in response to the invention of the guided missile, which is too fast for conventional anti-missile and anti-aircraft weapons to defend against. It uses a 20-millimeter Gatling-style autocannon mounted on a rotating platform, linked to a fire-control and radar system, and is capable of detecting, monitoring, and attacking any enemy missile or aircraft within its range (NavWeaps, 2010).
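The detect-and-decide step of such a system can be pictured as a small sense-decide-act loop. The sketch below is purely illustrative: the engagement range, track format, and decision rule are invented for this example and do not reflect the actual Phalanx logic or parameters.

```python
import math

# Assumed engagement envelope for this illustration only (not a real figure).
ENGAGEMENT_RANGE_M = 5500.0

def distance(track):
    """Straight-line distance of a radar track from the mount, in meters."""
    return math.hypot(track["x"], track["y"])

def should_engage(track, mode="automatic"):
    """Decide whether to engage a track.

    In the (hypothetical) automatic mode, any hostile track inside the
    envelope is engaged; in manual mode, an operator's approval is also
    required before firing.
    """
    in_range = distance(track) <= ENGAGEMENT_RANGE_M
    hostile = track.get("hostile", False)
    if mode == "automatic":
        return hostile and in_range
    return hostile and in_range and track.get("operator_approved", False)
```

The contrast between the two branches is the crux of the incidents discussed next: in automatic mode, no human judgment sits between detection and engagement.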
The Phalanx has had several incidents in its operational history. The first was on February 10th, 1983, when the USS Antrim, an Oliver Hazard Perry-class guided-missile frigate, was conducting live-fire exercises off the Atlantic coast, testing its Phalanx CIWS against a small target drone. Upon its destruction, the drone fell onto the Antrim, where its fuel caught fire. A civilian instructor near the crash area died from burns sustained in the fire (USS Antrim (FFG 20), n.d.).
On October 11th, 1989, the amphibious cargo ship USS El Paso was conducting similar live-fire exercises alongside another amphibious cargo ship, the USS Iwo Jima. This time, after the target drone was engaged and destroyed, the CIWS detected the falling drone, reactivated, and continued to fire. Shots from the Phalanx struck the bridge of the Iwo Jima, damaging the ship, injuring one sailor, and killing one officer (Plunkett, 1989).
On February 25th, 1991, during the Gulf War, a group of four ships, including the battleship USS Missouri, the British destroyer HMS Exeter, and the guided-missile frigate USS Jarrett, was operating together. The Missouri was conducting naval bombardment of a Kuwaiti island held by Iraqi forces when an oil well exploded. The battlegroup believed itself to be under attack by an Iraqi missile, and the Missouri responded by releasing missile countermeasures known as chaff. The Phalanx system aboard the USS Jarrett, set to automatic mode, automatically fired a short burst at the chaff. Several rounds struck the Missouri, denting the ship's side and piercing a guest berthing room. No one, however, was injured in the incident, although coincidentally there is a false report that one person was killed (LEAD REPORT, 1998).

The last and most recent incident occurred on June 4th, 1996, during RIMPAC, the Rim of the Pacific exercise, an international maritime warfare exercise held biennially. An A-6E Intruder aircraft was towing a target about 1,500 miles west of Hawaii. The Japanese destroyer JDS Yūgiri, equipped with a Phalanx, locked onto the A-6E Intruder itself before it had left the CIWS engagement envelope, the maximum range of the system, and opened fire. The Intruder was hit and went down into the ocean; both pilots, however, ejected safely. It was initially thought that the Phalanx had been in automatic mode, but the post-incident investigation found that the CIWS had been activated before the A-6E was out of the engagement envelope (WATANABE, 1996).
One thing must be made clear: while the Phalanx is a fully automated defense system, the Predator is not. The Predator is, however, still highly automated. It is capable of fully automated take-off and landing, along with GPS navigation and an autopilot-style function, meaning that it can maintain flight at a predetermined speed and direction (Asaro, 2012).
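The GPS-navigation capability mentioned above boils down to calculations like the one sketched here: given the aircraft's position and the next waypoint, compute the course to hold. This is a standard initial-bearing formula offered only as an illustration of the kind of computation an autopilot performs; it is not taken from any Predator documentation.

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees
    clockwise from true north. Coordinates are in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    # atan2 handles all quadrants; normalize into [0, 360).
    return math.degrees(math.atan2(x, y)) % 360.0
```

An autopilot loop would repeatedly compare this commanded bearing with the aircraft's actual heading and steer to close the difference, which is what lets the vehicle hold a predetermined track without a pilot's continuous input.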
Stakeholders
There are multiple stakeholders involved in this issue: the U.S. Government, the military, the manufacturers of both technologies, the soldiers in the armed forces, foreign civilians, and potential bystanders.
As a stakeholder, the government holds many positions on this issue, given its very nature. Each presidential administration has different views on how the U.S. Armed Forces should operate, ranging from increased military operations to a decreased military budget.
The two manufacturers, Raytheon and General Atomics, aim to make money by selling their products; any resolution that involves not using their products runs directly against that objective.


Soldiers are a more general stakeholder; they will favor technologies that keep them safe and, most importantly, help them get the job done.
Foreign civilians are a more direct stakeholder because they are impacted by the use of these technologies; their opinion is shaped by how these systems are operated and by the outcomes they produce. The opinion of foreign civilians can also influence the opinion of the government, creating a connection between the two that can cause difficulties for other stakeholders.
The military is a direct stakeholder, as it directly controls and operates these automated systems. With Congress and the Presidency in control of the military budget, leaders of the Armed Forces must weigh a vast number of alternatives while maintaining their mission: being able to eliminate any enemy force that threatens the country. Declining defense budgets have directly shaped the opinion of military leadership and the development, demand, and deployment of these automated technologies. For example, Vice Admiral Rowden, Commander of Naval Surface Forces, has called for maximum lethality, challenging the CRUSER Robo-Ethics 2015 workshop to develop machines of maximum lethality under flat or declining defense budgets (Englehorn, 2015).
Bystanders are somewhat of an invisible stakeholder; by their nature, they become stakeholders only when involved in an incident related to these automated systems. As can be seen in the Phalanx incidents, bystanders, be they civilians, military personnel, or any other persons, can be affected if they are close enough. Predator strikes that cause collateral damage can likewise injure or kill bystanders.


Issues
There are many issues involved with these automated technologies; some lie in the past, while others are more recent. UAVs such as the Predator have seen several issues while in service. There are four main issues involved with the Predator UAV. The first is collateral damage, which can involve anything from damaged vehicles or houses to destroyed buildings and fire. The second is desensitized pilots: pilots who, while working in the control room of a UAV, feel no regret, remorse, or any other emotion when they kill their target. The third is accidental death, which occurs when a Predator drone hits its target but also strikes innocent bystanders or causes collateral damage that results in further deaths; this can often be attributed to bad intel. Lastly, there is pilot stress, which can occur when a pilot has been sitting too long in a control room and becomes stressed from the intensive activity (Chow, 2013).
Next there is the Phalanx CIWS, which, as previously described, has been involved in several incidents, the most recent in 1996, when a Japanese destroyer accidentally shot down a US aircraft. There are two main issues surrounding the use of the Phalanx system. The first is collateral damage, which can occur when the Phalanx shoots at a target and an intermediary object moves between the two, causing damage to that object. The second is automatic mode, in which the system can fire without a human decision, a factor in the incidents described above.
The third major issue is known as the kill chain: the steps it takes to identify a target and destroy it. The military developed an easy-to-remember acronym for the kill chain, F2T2EA: find (locate a target), fix (make it difficult for the target to move), track (observe the movement of the target), target (select the appropriate weapon or kill system), engage (apply the selected weapon), and assess (evaluate the effect(s) of the operation) (Hutchins, Cloppert, & Amin, 2010). The ethical question regarding the kill chain is exactly how much of it should be automated. Much of the Phalanx kill chain is already automated and simply involves the operator determining what mode to use against an incoming missile or airplane. Once the mode is selected, there are two pathways in which the kill chain operates. In automatic mode, the first portion of the kill chain is performed continuously by the Phalanx radar, which searches for a target inside its engagement range. Once a target enters the engagement envelope, the system tracks it, locks on, and opens fire (Raytheon: Phalanx Close-In Weapon System). An important note regarding the kill chain is that every system capable of lethal force has one, but they are not all the same. The kill chain for a Predator strike involves confirmation from the President of the United States to strike the target (Bohn, 2016). Essentially, once the drone operator has found the target, the target is nominated for a strike by the agency operating the drone, and either the President or the recommending agency (with major national security officials unanimously agreeing it should be undertaken) can initiate the strike (Bohn, 2016).
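The F2T2EA sequence can be sketched as an ordered series of stages with a gate before the engage step. This toy model is an assumption of this essay's discussion, not any fielded system's logic; it simply makes concrete where a human-approval requirement would sit in an otherwise automated chain.

```python
from enum import Enum

class Stage(Enum):
    """The six F2T2EA stages, in order."""
    FIND = 1
    FIX = 2
    TRACK = 3
    TARGET = 4
    ENGAGE = 5
    ASSESS = 6

def run_kill_chain(approved: bool):
    """Walk the chain in order, halting before Engage unless a human
    authority has approved the strike. Returns the stage names completed."""
    completed = []
    for stage in Stage:  # Enum iterates in definition order
        if stage is Stage.ENGAGE and not approved:
            break  # lethal delegation stays with the human decision-maker
        completed.append(stage.name)
    return completed
```

In this framing, the Phalanx's automatic mode corresponds to running the loop with the approval gate removed, while the Predator's strike process keeps the gate and places it at the presidential or inter-agency level.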
The fourth and final major issue is lethal delegation, that is, delegating to a machine the decision to use lethal force against a person or object. There is considerable pushback on this issue from major industry leaders and notable scientists, including Stephen Hawking and Elon Musk. In an open letter presented at the International Joint Conference on Artificial Intelligence (Gibbs, 2015), Musk, Hawking, and thousands of other industry professionals advocated against the use of "killer robots," as they put it, in today's military. They believe it would spark an international arms race in lethal automated weapons systems, or, in their words, killer robots (Tucker, 2015). Another concern is that automated systems would be limited to the ethics they have been programmed with, or that they have learned through a highly specific training course.
Options
With these issues come possible options, and each stakeholder supports some of them, sometimes matching the positions of others but for different reasons. The first potential option would be to ban all automated or autonomous technology used by the military. This option would face substantial pushback from the military, the manufacturers, and in some cases the government. It is guided by the rights approach and supported by foreign citizens and potential bystanders. The rights approach falls along the lines of Kantian duty-based ethics but also draws from the philosopher John Locke (Brown, n.d.). It specifies that the best action is the one that secures the rights of those affected by the issue(s) (Brown, n.d.). This action would remove all harmful or potentially harmful equipment from operation, thereby securing the rights of these stakeholders.
The second potential option would be to ignore the complaints and do absolutely nothing. This option directly contradicts the first, which is supported by foreign citizens and potential bystanders, and it is also not a direction the military would take. It is guided by the egoistic approach and supported by the manufacturers, Raytheon and General Atomics, and by the government. The egoistic approach is a modification of the utilitarian framework and can be considered the ethics of self-interest (Brown, n.d.): it modifies the utilitarian ideal to produce the greatest good for oneself rather than the greatest good for all. The specific reasons behind the approach, however, differ between the government and the manufacturers. The manufacturers seek to continue producing and selling their products, whereas the government seeks to maintain its self-image, meaning any image that is non-negative and does not hurt its politics. Thus, actions taken by either entity alone are less likely to benefit the other. For example, the government may flip-flop its opinion or action on something, which may work against the manufacturers; conversely, the manufacturers may build something into their product that does not maintain the government's self-image, which does not benefit the government. This option does little to mitigate the issues and only looks out for the interests of the manufacturers and the government. The government, for example, has done little to mitigate issues involving these technologies; according to the Stimson Center, "The Obama administration received poor grades including three Fs" (Stimson, 2016).
The last potential option would be to change the Rules of Engagement and operational procedures involving the Predator system and the Phalanx CIWS. This means that the exact policies for operating these technologies, and the kill chains for both systems, would be modified to ensure that the primary issues of accidental death and collateral damage do not recur. This option, in a general sense, is closest to the government and manufacturer positions; however, its intent is to remove the problem while continuing to use the technologies. The ethical framework of this option lies in the mindset of its supporter, the military: the duty-based approach. The duty-based approach centers on the intent of one's actions and stems from duty, the obligation to perform the action (Brown, n.d.). For the military, it is their duty to protect and serve the United States of America, and thus they are fully committed to achieving that objective to the highest degree possible.
Concessions
My view of the issues does not follow a mutually exclusive solution. While I support the third option of modifying the Rules of Engagement and related policies, I believe in a three-step potential option. All three steps can happen simultaneously: reviewing the procedures for operating these technologies, improving the technologies to ensure they are not faulty, and training persistently on all technologies concerned. I acknowledge that even with these three steps implemented, there is always the factor of human error; in some of the incidents, human error was indeed a cause of the event. It is entirely possible that even with the changes I have recommended, no actual change will come of it. I will also concede that banning drones offers, as a potential strength, the immediate removal of the threat to foreign civilians.
Technology has often created greatness, but it has its share of downsides. Each side has a view and a need, and it is this need that we must understand. The technology may have problems, but systems of the past were never perfect either. Take the automobile: it has faults and has caused deaths. Do we shy away from it? No; we have ironed out its problems over many years and created rules to guide our use of it. By this same mindset, we can solve this issue.


Works Cited
Amirfar, C. (2016, April 14). U.S. Delegation Statement on "LAWS and Human Rights/Ethics". Retrieved December 16, 2016, from Mission of the United States, Geneva, Switzerland: https://geneva.usmission.gov/2016/04/14/u-s-delegation-statement-on-laws-and-human-rightsethics/
Arkin, R. C. (2008). Arkin [PPT]. Atlanta: Mobile Robot Laboratory, Georgia Institute of Technology. Retrieved December 16, 2016, from http://agi-conf.org/2008/slides/arkin.ppt
Asaro, P. (2012, June 1). On banning autonomous weapon systems: human rights, automation, and the dehumanization of lethal decision-making. International Review of the Red Cross, 94(886), p. 23. Retrieved December 16, 2016, from https://www.icrc.org/eng/assets/files/review/2012/irrc-886-asaro.pdf
Bohn, K. (2016). Newly released US drone policy explains how targets can be chosen. Retrieved December 16, 2016, from http://www.cnn.com/2016/08/06/politics/obama-administration-drone-policy/
Bowcott, O. (2015, April 13). UK opposes international ban on developing 'killer robots'. Retrieved December 16, 2016, from The Guardian: https://www.theguardian.com/politics/2015/apr/13/uk-opposes-international-ban-on-developing-killer-robots
Brown University. (n.d.). Retrieved December 17, 2016, from https://www.brown.edu/academics/science-and-technology-studies/framework-making-ethical-decisions
Chow, D. (2013, November 5). Drone Wars: Pilots Reveal Debilitating Stress Beyond Virtual Battlefield. Retrieved December 16, 2016, from LiveScience: http://www.livescience.com/40959-military-drone-war-psychology.html
Dennyreginald [PDF]. (n.d.). Muncie: Academy of Model Aeronautics.
Doble, A. (2011). Study reveals 168 child deaths in Pakistan drone war. Retrieved December 16, 2016, from https://www.channel4.com/news/study-reveals-168-child-deaths-in-pakistan-drone-war
Drone Wars Pakistan: Analysis. (n.d.). Retrieved December 16, 2016, from http://securitydata.newamerica.net/drones/pakistan-analysis.html
Engelking, C. (2015, April 6). DARPA's Plan to Overwhelm Enemies With Swarming Drones. (Kalmbach Publishing Co.) Retrieved December 16, 2016, from Discover Magazine: http://blogs.discovermagazine.com/drone360/2015/04/06/darpas-swarming-drones/#.VrHhSPnhDcc
Englehorn, L. (2015, June). CRUSER Robo-Ethics 2015 Summary Report [PDF]. Monterey: Naval Postgraduate School.
General Atomics Predator. (2008, January). Retrieved December 16, 2016, from Spyflight: http://spyflight.co.uk/Predator.htm
Gibbs, S. (2015, July 27). Musk, Wozniak and Hawking urge ban on warfare AI and autonomous weapons. Retrieved December 16, 2016, from The Guardian: https://www.theguardian.com/technology/2015/jul/27/musk-wozniak-hawking-ban-ai-autonomous-weapons
Grier, P. (2009). Drone aircraft in a stepped-up war in Afghanistan and Pakistan. Retrieved December 16, 2016, from http://www.csmonitor.com/USA/Military/2009/1211/Drone-aircraft-in-a-stepped-up-war-in-Afghanistan-and-Pakistan
Hutchins, E. M., Cloppert, M. J., & Amin, R. M. (2010). Intelligence-Driven Computer Network Defense Informed by Analysis of Adversary Campaigns and Intrusion Kill Chains. Lockheed Martin Corporation. Retrieved December 16, 2016, from http://www.lockheedmartin.com/content/dam/lockheed/data/corporate/documents/LM-White-Paper-Intel-Driven-Defense.pdf
LEAD REPORT. (1998). Retrieved December 16, 2016, from http://www.gulflink.osd.mil/du_ii/du_ii_refs/n52en417/8023_034_0000001.htm
Lin, P., Bekey, G., & Abney, K. (2008). Autonomous Military Robotics: Risk, Ethics, and Design. San Luis Obispo: California Polytechnic State University. Retrieved December 16, 2016, from http://ethics.calpoly.edu/onr_report.pdf
McVeigh, T. (2013). Investigation to record victims of US drone attacks in Pakistan. Retrieved December 16, 2016, from https://www.theguardian.com/world/2013/sep/22/journalists-website-breaks-silence-victims-drone
Naughton, R. (2003, February 2). Remote Piloted Aerial Vehicles. Retrieved December 16, 2016, from Monash.edu: http://www.ctie.monash.edu/hargrave/rpav_home.html#Beginnings
NavWeaps. (2010). Retrieved December 16, 2016, from http://www.navweaps.com/Weapons/WNUS_Phalanx.php
Operating in Contested Environments. (2015, March 30). Retrieved December 16, 2016, from DARPA: Defense Advanced Research Projects Agency: http://www.darpa.mil/news-events/2015-03-30
Pike, J., & Aftergood, S. (2002, November 6). RQ-1 Predator MAE UAV. Retrieved December 16, 2016, from FAS: Intelligence Resource Program: https://fas.org/irp/program/collect/predator.htm
Pilotless Photo Drone Takes Aerial Pictures. (1956, June). Popular Mechanics Magazine, p. 144. Retrieved December 16, 2016, from https://books.google.com/books?id=QuEDAAAAMBAJ&pg=PA144&dq=1954+Popular+Mechanics+January&hl=en&sa=X&ei=jLnBT_OmOpT3gAfc2_WlBQ&ved=0CD4Q6AEwAjgy#v=onepage&q&f=true
Plunkett, A. J. (1989). Iwo Jima Officer Killed In Firing Exercise. Retrieved December 16, 2016, from http://articles.dailypress.com/1989-10-12/news/8910120238_1_iwo-jima-ship-close-in-weapons-system
Predator B RPA. (n.d.). Retrieved December 16, 2016, from http://www.ga-asi.com/predator-b
Predator_B021915 [PDF]. (n.d.). General Atomics Aeronautical.
Raytheon: Phalanx Close-In Weapon System. (2016). Retrieved December 16, 2016, from http://www.raytheon.com/capabilities/products/phalanx/
Scotty, G. (2016). Ethics in Comm & Tech. CST373-02.
Shane, S. (2012). U.S. Said to Target Rescuers at Drone Strike Sites. Retrieved December 16, 2016, from http://www.nytimes.com/2012/02/06/world/asia/us-drone-strikes-are-said-to-target-rescuers.html
Stimson. (2016). Obama Administration Receives Poor Grades On Reforming US Drone Policy. Retrieved December 17, 2016, from http://www.stimson.org/content/obama-administration-receives-poor-grades-reforming-us-drone-policy
TAB H -- Friendly-fire Incidents. (n.d.). Retrieved December 16, 2016, from http://www.gulflink.osd.mil/du_ii/du_ii_tabh.htm
Tucker, P. (2015, July 28). US Drone Pilots Are As Skeptical of Autonomy As Are Stephen Hawking and Elon Musk. Retrieved December 16, 2016, from Defense One: http://www.defenseone.com/technology/2015/07/us-drone-pilots-are-skeptical-autonomy-stephen-hawking-and-elon-musk/118680/
U.S. Air Force. (2015). Retrieved December 16, 2016, from http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104469/mq-1b-predator.aspx
US Air force STEALTH UAV armed with LASER GUN named General Atomics Avenger. (2014, January 17). YouTube. Retrieved December 16, 2016, from https://www.youtube.com/watch?v=mPvBDlQOtqY
USS Antrim (FFG 20). (n.d.). Retrieved December 16, 2016, from http://www.navysite.de/ffg/FFG20.HTM
Watanabe, T. (1996). Japanese Ship Accidentally Downs U.S. Jet. Retrieved December 16, 2016, from http://articles.latimes.com/1996-06-05/news/mn-11915_1_japanese-ship
Wisdom, K. (2016). Major ProSeminar. CST300-01.
