
The Legislation, Liabilities and Ethics of Self-Driving Cars


Shane Laros (http://www.engineering.com/Author/ID/361857/ShaneLaros) posted on November 04, 2016


On Saturday, May 7, 2016, Joshua Brown was killed in an automotive collision when an articulated truck turned left in front of his Tesla Model S.
The accident was highly publicised (https://www.engineering.com/AdvancedManufacturing/ArticleID/12569/What-Does-the-Tesla-Autopilot-
Fatality-Mean-for-Self-Driving-Vehicles.aspx) and initiated intense debates
(https://www.engineering.com/AdvancedManufacturing/ArticleID/12641/VIDEO-Media-Flame-War-Continues-over-Tesla-Autopilot-Fatality.aspx)

on the legislation and liability surrounding autonomous vehicles. While fatal car accidents are an unfortunately common occurrence, this incident
was notable as it was the first time someone was killed by an autonomous vehicle driving without input from its passenger.

Brown was hardly a novice when it came to operating his Model S. His YouTube channel (https://www.youtube.com/user/NexuJosh/videos)
displays multiple videos of him operating the car and showing the “autopilot” feature in operation. He notes in comments that the system was
able to learn as he used it, improving the car’s ability to handle difficult driving situations and curves in the road.

One video shows the car avoiding an accident with a truck on the highway.

Autopilot Saves Model S


“Tessy did great. I have done a lot of testing with the sensors in the car and the software capabilities,” Brown noted in the video description from
April 2016. “I have always been impressed with the car, but I had not tested the car's side collision avoidance. I am VERY impressed. Excellent job
Elon!”

So, what went wrong on May 7th? A Tesla blog (https://www.tesla.com/blog/tragic-loss) stated the car was unable to identify the white truck
against a bright sky, while others have asserted that the fault lies with Tesla, or with Brown himself.

"By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security," said Laura MacCleery, vice president of consumer
policy and mobilization for Consumer Reports (http://www.consumerreports.org/tesla/tesla-autopilot-too-much-autonomy-too-soon/).

"In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we're deeply concerned that consumers
are being sold a pile of promises about unproven technology," she added, referring to Tesla’s “public beta test” status of the software.

While the Autopilot software does warn drivers to keep their hands on the wheel, the alert is not immediate, and it discounts the human factor.

If the car is handling all the driving responsibilities, why would a driver pay attention – and isn’t that the point?

That question depends a little on the driver’s perspective, and a lot more on the legislation of autonomous cars.

Autonomous Vehicle Legislation


Legislation exists for other in-car distractions, from cell phone use to watching movies, but with the advent of vehicle autonomy, new laws will be
needed to help keep the roads safe.

The Society of Automotive Engineers (SAE) has identified six levels of driving automation (http://www.sae.org/servlets/pressRoom?OBJECT_TYPE=PressReleases&PAGE=showRelease&RELEASE_ID=2715), based on the functional aspects of the available technology and the varying levels of human involvement in the act of driving. Its cut-off point for calling a driving system “automated” is the monitoring of the driving environment.
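
To keep the scale straight, here is a minimal sketch of those six levels as published in SAE J3016, with the environment-monitoring cut-off made explicit. The enum values follow SAE’s published labels; the helper function is an illustrative addition, not part of the standard.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six SAE J3016 levels of driving automation."""
    NO_AUTOMATION = 0           # human performs all driving tasks
    DRIVER_ASSISTANCE = 1       # a single assist feature, e.g. adaptive cruise
    PARTIAL_AUTOMATION = 2      # combined steering and speed assistance
    CONDITIONAL_AUTOMATION = 3  # system drives; human must take over on request
    HIGH_AUTOMATION = 4         # system drives within a defined operating domain
    FULL_AUTOMATION = 5         # system drives under all conditions

def system_monitors_environment(level: SAELevel) -> bool:
    """SAE's cut-off: from Level 3 upward, the system, not the human
    driver, monitors the driving environment."""
    return level >= SAELevel.CONDITIONAL_AUTOMATION
```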


The SAE levels differ from the older U.S. Department of Transportation's National Highway Traffic Safety Administration (NHTSA) policy (http://www.nhtsa.gov/Research/Crash+Avoidance/Automated+Vehicles) on vehicle automation, which was developed in 2013. The NHTSA defines vehicle automation (http://www.nhtsa.gov/About+NHTSA/Press+Releases/U.S.+Department+of+Transportation+Releases+Policy+on+Automated+Vehicle+Development) as having five levels, based on the automation of primary vehicle controls (brake, steering, throttle, and motive power). This could include assisted braking systems going as high as level 2, if combined with traction control systems.
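
For comparison, a short sketch of the 2013 NHTSA scale follows. The level names come from NHTSA’s preliminary policy; the mapping to SAE levels is an approximate, unofficial correspondence added here purely for illustration.

```python
# The five levels in NHTSA's 2013 preliminary policy, keyed by how many of
# the primary controls (brake, steering, throttle, motive power) are automated.
NHTSA_2013_LEVELS = {
    0: "No-Automation",
    1: "Function-specific Automation",  # one control automated at a time
    2: "Combined Function Automation",  # at least two controls work in unison
    3: "Limited Self-Driving Automation",
    4: "Full Self-Driving Automation",
}

# Rough, unofficial correspondence to the SAE scale: NHTSA's top level
# spans SAE Levels 4 and 5 (an assumption for illustration only).
NHTSA_TO_SAE = {0: (0,), 1: (1,), 2: (2,), 3: (3,), 4: (4, 5)}
```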

During the Automated Vehicles Symposium (http://www.nhtsa.gov/About+NHTSA/Speeches,+Press+Events+&+Testimonies/mr-2016-av-symposium-07202016) in July 2016, Mark R. Rosekind, Administrator of the National Highway Traffic Safety Administration, spoke about vehicle automation as a safety measure:

“At the National Highway Traffic Safety Administration, there are two numbers that explain exactly why we are so forward-leaning on this issue,” he said. “The first is 35,200. That is how many people we lost on American roads last year. The second number is 94. That’s the percentage of crashes that can be tied back to a human choice or error.”

Rosekind’s sentiments echo the idea that vehicle automation is about safety and saving lives.

While this may be true, there will need to be firm agreement on what constitutes an autonomous vehicle before any legislation can be passed to bring more of these vehicles onto the road.

Will this new technology fall under a different class of laws from existing vehicles, or will manufacturers be held accountable for errors in judgement, as human drivers are currently?


It would be difficult, if not impossible, to convict a piece of software of legal misconduct; even if the system is particularly “smart” and the car is being trained by observing human drivers, it may be difficult to pin down exactly who is at fault.

Of course, legislating autonomous vehicles is only a part of what will be needed going forward. Cybersecurity for the vehicles’ software will be extremely important, which suggests that much of the liability could likely fall on manufacturers themselves.

But is that really feasible?

Autonomous Vehicle Liability


Should Tesla be on the hook for calling its autonomous driving system “Autopilot”?

The name implies that the car is doing all the work, but that implication is contradicted by the brand’s own marketing - as noted in the press kit
(https://www.tesla.com/presskit/autopilot), “Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep
their hands on the steering wheel.”

While the message is clear enough, some confusion is understandable.

When an individual is driving their non-automated car, they are responsible for the errors that account for Rosekind’s 94 percent.

If the car is driving itself and gets into an accident, who is at fault: the driver who should have been paying attention, or the company that released the software?

Interestingly, Volvo has taken the initiative and stated that it will accept full liability for the actions (http://www.volvocars.com/intl/about/our-
innovation-brands/intellisafe/intellisafe-autopilot/news/volvo-cars-responsible-for-the-actions-of-its-self-driving-cars) of its autonomous cars in
an effort to speed along legislation and development of the technology. There have been similar statements (http://www.cbsnews.com/news/self-


driving-cars-google-mercedes-benz-60-minutes/) from Google and Mercedes-Benz, all carrying the caveat that the companies will take
responsibility provided the fault is in the software or the vehicle itself - not the driver.

This may not be an option for a fledgling company like Tesla. Musk may have demonstrated an impressive amount of forward thinking, but the company lacks the clout of its larger automotive competitors, or Google, for that matter.

Accepting liability does, however, go a long way in showing a company’s confidence in its technology. The Google fleet has covered a lot of ground in its testing on the streets of California, and although there have been accidents, the vast majority (http://www.theverge.com/2016/2/29/11134344/google-self-driving-car-crash-report) were determined to be the fault of other drivers.

All vehicles with autonomous capabilities that are currently on the road are built with the idea that the driver still needs to be in control, aware
and responsible for their actions.

Unfortunately, this necessary driver interaction introduces a problem that software will always struggle to compensate for - human error. 

Google Self-Driving Car on City Streets


People, by their very nature, make for idiosyncratic drivers. Some get angry in response to gridlock, others get distracted. One person may see the truck in their peripheral vision while another would react too late, causing an accident with an otherwise uninvolved car.

For that reason, the easiest solution would be to have cars automated, networked and working together (https://www.sbdautomotive.com/en/autonomous-cars-traffic) to ensure smooth driving.
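
As a thought experiment, the sketch below shows what the smallest unit of such cooperation might look like: each car broadcasting its position and intent, and deferring when a manoeuvre would conflict. The message format and field names are invented for illustration; real deployments would build on standards such as the SAE J2735 basic safety message.

```python
from dataclasses import dataclass

@dataclass
class IntentMessage:
    """Hypothetical vehicle-to-vehicle broadcast; field names are invented
    for illustration, not taken from any real V2V standard."""
    vehicle_id: str
    lane: int          # assume lower index = further left
    speed_mps: float
    intent: str        # "keep_lane", "merge_left", "merge_right", "brake"

def safe_to_merge_left(me: IntentMessage, neighbours: list[IntentMessage]) -> bool:
    """Toy cooperative rule: merge left only if no broadcasting neighbour
    occupies the target lane or has announced a merge into it."""
    target = me.lane - 1
    for n in neighbours:
        if n.lane == target:
            return False  # target lane already occupied
        if n.intent == "merge_right" and n.lane == target - 1:
            return False  # another car is merging into the target lane
        if n.intent == "merge_left" and n.lane == target + 1:
            return False  # a car in our own lane wants the same gap
    return True
```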

It might sound like a prohibitively expensive endeavour, but in light of the costs associated with the development (http://www.torontoenvironment.org/campaigns/transit/LRTfaq#howmuchspent) or maintenance (http://www.citylab.com/commute/2011/11/1-billion-doesnt-buy-much-transit-infrastructure-anymore/456/) of any transit system, it doesn’t seem so unattainable, especially if it will save lives and reduce traffic (http://www.theglobeandmail.com/globe-drive/culture/commuting/how-self-driving-cars-will-ease-traffic-congestion/article15876882/).

However, this goes against the design of current, distinctly independent vehicles, and would require infrastructure and a common set of protocols for all manufacturers to follow. Once again, the hydra-headed legislative issues return.

Of course, all the necessary steps, red tape and bureaucracy will always have to contend with one distinctly human fact: change is difficult.

There are people who would outright refuse to use vehicle automation, whether they are driving enthusiasts (http://driving.ca/auto-
news/news/lorraine-autonomous-cars) or skeptical of the software's capability (http://www.roadandtrack.com/car-
culture/features/a26991/autonomous-cars-the-hard-truth/). They may not live in urban centers where vehicle automation is likely to have the
greatest impact, preferring to use their old Chevy pickup truck on dirt roads or taking a modified 4x4 off-road. These are not easy cases to deal
with via automation. But even putting the idiosyncratic cases aside, there is one other major hurdle to the mass implementation of self-driving
cars.

Autonomous Vehicle Ethics


Some people like to stay in control; others would be more than willing to give up that control for an easy commute to the office or safety on busy streets. People often have more faith in themselves than in others, or in software designed to accomplish a specific task—whether that confidence is warranted or not.

People’s expectations for autonomous vehicles are no different.

Humans make split-second decisions when driving, working on a mix of training and instinct. Autopilot software, on the other hand, while capable of a measure of “learning”, ultimately follows a set of rules.
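
To make “a set of rules” concrete, here is a deliberately simplified, hypothetical decision function. Nothing here reflects any manufacturer’s actual code; it only illustrates how a fixed priority ordering bakes an ethical stance directly into software.

```python
def choose_maneuver(options):
    """Pick the option with the fewest expected casualties, breaking ties
    in favour of the occupants -- one possible hard-coded 'set of rules'."""
    def cost(option):
        _, pedestrian_harm, occupant_harm = option
        return (pedestrian_harm + occupant_harm, occupant_harm)
    return min(options, key=cost)[0]

# A scenario of the kind discussed below: one rider, ten pedestrians.
options = [
    ("swerve_off_road", 0, 1),  # spare the pedestrians, risk the rider
    ("stay_in_lane", 10, 0),    # protect the rider, hit the crowd
]
print(choose_maneuver(options))  # -> "swerve_off_road": the utilitarian choice
```

Swap the tuple ordering so occupant harm is weighed first and the same function becomes an occupant-first policy — exactly the kind of divergence between manufacturers discussed later in this section.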

What happens when these rules run counter to our ethical judgements?

https://www.engineering.com/DesignerEdge/DesignerEdgeArticles/ArticleID/13602/The-Legislation-Liabilities-and-Ethics-of-Self-Driving-Cars.aspx 6/9
9/8/2019 The Legislation, Liabilities and Ethics of Self-Driving Cars > ENGINEERING.com

A study co-authored by an MIT professor (http://news.mit.edu/2016/driverless-cars-safety-issues-0623) asked respondents if they preferred their automated vehicles to “minimize casualties in situations of extreme danger” or, to put it into an example, “having a car with one rider swerve off the road and crash to avoid a crowd of 10 pedestrians.”


Unsurprisingly, most people preferred the utilitarian approach, i.e., the action that would save the most lives. However, respondents also indicated that they would be unwilling to drive or own a car that would show preference to others’ lives over their own.

This harkens back to what philosophers and ethicists call the Trolley Problem (https://en.wikipedia.org/wiki/Trolley_problem), which involves weighing one life against several others.

This results in a social dilemma where self-interest could end up making roads more dangerous for everyone. This moral dissonance could
potentially have a severe impact on the adoption of self-driving cars.

Not content to sidestep the issue, a representative of Daimler AG recently stated (http://blog.caranddriver.com/self-driving-mercedes-will-prioritize-occupant-safety-over-pedestrians/) that the company’s cars would prioritize saving the driver and the car’s passengers over pedestrians.

“If you know you can save at least one person, at least save that one. Save the one in the car,” said Christoph von Hugo, manager of driver
assistance systems at Mercedes-Benz.

“If all you know for sure is that one death can be prevented, then that’s your first priority.”


Despite a later correction (http://jalopnik.com/now-mercedes-says-its-driverless-cars-wont-run-over-ped-1787890432) noting that the “statement by Daimler on this topic has been quoted incorrectly,” this points towards potential ethical problems as well.

If one auto manufacturer decides to put its passengers first, it may make the cars more attractive to consumers, but potentially at the cost of overall safety on the road. In line with the problem of human error, if every car is working on a different measure of “morality” without working together to ensure these commands do not conflict, will the roads be any safer?

Daimler’s official policy revolves less around ‘whom to save’ and more around ‘staying out of that situation in the first place.’ The idea is that by making automated driving systems essentially perfect, the Trolley Problem won’t even come up, and everyone can go home happy and healthy, at least in theory.

Von Hugo concluded with a statement supporting this official stance: “This moral question of whom to save: 99 percent of our engineering work is to prevent these situations from happening at all.”

Remember, the NHTSA did say that 94 percent of accidents are due to human error.

This all ties back into issues of liability in identifying the culprit of a fatal crash where the car decided who should be saved: its driver, or pedestrians.

Is it cynical to think that not all human drivers would make a better moral decision than their car when relying on instinct?

In the end, these ethical questions will be decided by the engineers who design autonomous vehicles, much to the philosophers’ chagrin.

Self-Driving Cars: Legislation, Liability and Ethics

Automated vehicle technology is reaching a point where it will either be widely adopted and become a part of our daily drive, or be cast aside in favor of better driver assistance - leaving control firmly in the hands of the person behind the wheel.
There are more questions than answers at this point, and while it is, for better or worse, up to lawyers and politicians to decide how the automotive industry will move forward, it is the engineers who are at the forefront of this technological revolution.

Vehicle automation will only succeed if we ask the difficult moral questions, identifying the balance between personal security and saving lives. The software also needs to be secure enough to resist interruption or intervention by an outside party; solid cybersecurity may be the only way people will feel safe enough to purchase one of these cars.

Finding a way to integrate a person’s individualism with the potential fleet of automated people movers will be a challenge that may be more difficult than legislating vehicle automation. Anticipating these needs and factoring them into a design will not be easy.

For automotive manufacturers to be comfortable enough to follow in the steps of Volvo, Mercedes and Google, their cars must reach a level of safety and efficiency where the automation can make up for the myriad ways human error can interfere with the safe operation of a car.

What it comes down to is that there is no easy answer that will get vehicle automation into the mainstream faster. Safety standards will need to be refined before people will be comfortable relinquishing the wheel, and those, too, will be in the hands of engineers.

These hindrances don’t stop automated vehicle technology and the discussions surrounding it from being immensely interesting, a big step towards our technological future—and hopefully a faster, more productive commute.


For more information on the history of autonomous vehicles, check out our feature covering one hundred years of driverless cars (https://www.engineering.com/DesignerEdge/DesignerEdgeArticles/ArticleID/12665/The-Road-to-Driverless-Cars-1925--2025.aspx).

If you’re more interested in autonomous vehicle technology itself, check out our feature on what tech it will take to put self-driving cars on the road (https://www.engineering.com/DesignerEdge/DesignerEdgeArticles/ArticleID/13270/What-Tech-Will-it-Take-to-Put-Self-Driving-Cars-on-the-Road.aspx).

