Beintema 1

Jacob Beintema
Professor Brown
Business Ethics BA 209
22 February 2015
Driverless Cars
With increasing technology in today’s automobile industry, cars have come a long way
since I was born twenty-two years ago. Back in 1992, most cars were still being manufactured
with the common single-barrel carburetor, which fueled an engine that needed an annual tune-up
or a rebuild kit. This may sound like rocket science to many, but to those with even a little
knowledge of cars, carburetors sound like ancient history and would hardly be seen in any cars
on the road today. Today you still see a human driving the automobile, but in the future you may
not see a human driver. Major manufacturers are working on and designing driverless cars:
automobiles operated by software and computers, which brings the robot concept to mind.
Although these driverless cars may bring an array of safety features, I am focused on one
question: is having driverless cars on the roads today ethical?
Every time a driver gets behind the wheel of a car, they are faced with decisions on the
road. Driving basically consists of making decisions and carrying out the physical actions. A
major question that comes up with driverless cars is “what happens to the cars in a difficult
situation where one’s life is on the line?” (Stokes). Jason Millar, a PhD candidate in a
department of philosophy, studies this question closely and has come up with the idea of the
“Tunnel Problem.” It basically says: “You are driving in an autonomous car along a narrow road,
headed towards a one-lane tunnel when a child errantly runs on to the road and trips. The car
cannot brake fast enough to avoid hitting the child and so it must decide whether to swerve off
the road, effectively harming you, or remain driving straight, harming the child.” (Stokes)
Now this is a decision that a computer would have to make. Who is going to be at fault,
the auto manufacturer? Could the programmer of the computer be at fault? I do not know how
you could leave this decision up to a computer. There will be an accident and a police report, and
usually a ticket would be issued. Who will the ticket be made out to? How would a computer
serve jail time, if need be, if convicted of manslaughter? Bryant Walker Smith, a law professor
who has written extensively on driverless cars, says “the cars are going to crash, and that is
something that the companies need to accept and the public needs to accept" (Press). I agree
that the cars will crash no matter what, and also that a computer may react faster than most
human beings. Lives might be saved with driverless cars, but the owner will still be held
accountable for their accidents. State and federal lawmakers really have not paid much attention
to making new laws for these cars either. “Just four states have passed any rules governing
self-driving cars on public roads, and the federal government appears to be in no hurry to
regulate them” (Press). Before anyone can ride in a driverless car, there will have to be clear
laws with set rules and regulations.
One solution has been brought up: when an accident may occur, the decision would be
left up to the owner, made in advance through the programmer. It is like health care, where,
when an ethical decision comes into play, the doctors or nurses leave the decision up to the
patient. This still would not make these situations perfect, because you would still be picking
who gets hurt and, essentially, whose life is more or less valuable. It would be easy to make
decisions beforehand for the users, but what happens when your decision puts your child’s or
spouse’s life at risk? Then whose life would be more valuable? This is why putting the decisions
in the user’s hands is not quite perfect either. When and if driverless cars become mainstream, “companies
are going to need to include a code of ethics training program in the operating manual that will
educate users on the new ethical choices that this type of driving opens them up to, and how to
deal with situations like those dangerous situations” (Heim).
Overall, in ten to fifteen years I do believe that these driverless cars will be on the road
whether we like it or not. Google is already far into the process along with other manufacturers,
and Mercedes-Benz has already shown a driverless automobile at this year’s car show in
Detroit. It was just a concept car, but it usually does not take too long for concepts to be put into
production after they grab consumers’ attention. These cars will crash on the road, whether
because other humans make the wrong decisions or simply because of dangerous, unavoidable
situations, and ethics will come into play as to who was at fault. Manufacturers and users will
have to be ready to answer these questions and come up with ethical solutions.


Works Cited
Heim, Jillian. “Should Google’s ‘Driverless’ Car Require Code of Ethics Training?” 20 Sept.
2014. Web. 22 Feb. 2015.
Press, Associated. “Driverless Car Ethics: Who Should Die?” 27 Nov. 2014. Web. 22 Feb. 2015.
Stokes, Andrew. “The Ethics of Driverless Cars.” 20 Aug. 2014. Web. 22 Feb. 2015.