
Autonomy" means a unit that can guide itself without human control.

Autonomous cars are built on comparatively old technologies such as ABS and ESP. These systems are activated automatically in dangerous situations and provide driving assistance. Developments in these systems form the first steps of autonomous car technology, but it should also be noted that one aspect of autonomous cars relates to developments in artificial intelligence and robotics.1

The European Commission states in its communication to the European Parliament dated 17.05.2018 that automation should be evaluated under six levels.2

Level 0 (zero autonomy – driver only): the driver performs all driving tasks.
Level 1: the vehicle is controlled by the driver, but there are some driving assist features.
Level 2: the vehicle has combined automated functions, but the driver must remain engaged with the driving task.
Level 3: the driver is not required to monitor the environment but must be ready to take control of the vehicle.
Level 4: the vehicle is capable of performing all driving functions under identified usage conditions.
Level 5: the vehicle is capable of performing all driving functions; a driver is not required.
Vehicles assisting the driver (levels 1-2) are already available on the EU market, while level 3-4 vehicles are being tested and some are expected to be available by 2020.

General Legal Framework

According to the Turkish Road Traffic Law and the 1968 Vienna Convention on Road Traffic, a driver must be present in moving vehicles. Under their definitions, "driver" means a natural person; therefore a legal revision is necessary for the use of fully autonomous vehicles.

Similarly, under other domestic laws such as the California Vehicle Code and the articles of the German Road Traffic Law amended in 2017, vehicles are required to be suitable for driver intervention.

According to the UNECE World Forum for Harmonization of Vehicle Regulations (WP.29), the level of safety to be ensured by automated/autonomous vehicles shall be such that they do not cause any traffic accidents resulting in injury or death that are reasonably foreseeable and preventable.3 Within this scope, current studies focus especially on vehicle safety and stipulate that topics such as protection of vehicles against cyber-attacks, software updates and data recorders should be regulated.4

Autonomous Vehicle Accidents

For vehicles under the control of drivers (level 0, 1 and 2 vehicles), there is no particular concern; however, for level 3 vehicles and subsequent levels, autonomous systems perform the functions of the driver.

Theoretically, the driver cannot be blamed for an accident resulting from an action taken while the autonomous system was in use. Whether the driver or the system was in control at the time of the accident may be determined with the help of a data recorder (black box), which autonomous vehicles are required to carry.
Under the laws mentioned above, vehicles must be produced so as to remain suitable for human intervention throughout the development of autonomous technology. Driver intervention is deemed mandatory on the assumption that the systems are not free of errors. As the following example cases show, the driver's intervention is examined even if the accident is caused by the system.

Tesla Self-Driving Car Accident

In 2016, a 2015 Tesla vehicle crashed into a tractor trailer that was crossing its path, and the accident resulted in the death of the Tesla driver. This accident led to discussions on legal responsibility. The NHTSA (National Highway Traffic Safety Administration) investigated the crash and, based on the data recorder, reached the following findings:5

Tesla was being operated in Autopilot mode at the time of the crash.
The Automatic Emergency Braking (AEB) system did not work or provide any warning to the driver.
The driver took no braking, steering or other actions to avoid the crash.
NHTSA's examination did not identify any defects in the design or performance of the AEB or Autopilot systems of the subject vehicles, since AEB systems are rear-end collision avoidance technologies that are not designed to reliably perform in all crash modes, including crossing path collisions such as the one in this case. The Autopilot system is an Advanced Driver Assistance System (ADAS) that requires the driver's continual and full attention to monitor the traffic environment and be prepared to take action to avoid crashes.

Nilsson v. General Motors6

In 2018, the first known lawsuit concerning an autonomous car accident was filed against a manufacturer. An autonomous car attempted to merge into the left lane, but the vehicle ahead in that lane slowed down, so the autonomous car re-centered itself in its original lane. Meanwhile, a motorcycle behind the autonomous car had moved forward; the two collided and the motorcyclist was injured. The parties settled the case.

Uber Autonomous Car Accident

In 2018 in Arizona, Elaine Herzberg, who was crossing the street, was killed during a test drive of Uber's self-driving car. Although the sensors detected the pedestrian, the car did not stop: the emergency stop system had been disabled, so the system neither warned the driver nor stopped the vehicle automatically.7

As a result of the investigations, the prosecutors decided not to hold Uber responsible, as the accident was described as preventable by the driver. It was stated that the vehicle could be operated both with and without a driver, and that the driver was watching a video on a phone when the accident happened.8 It was therefore concluded that the driver should have intervened. In order to determine the driver's fault, the following criteria should be taken into account: the autonomy level of the vehicle and whether the driver had been informed that the emergency stop system was disabled.

Liability For Autonomous Car Accidents Under Turkish Law

I. Civil Responsibility

The responsibilities of the operator, the manufacturer and the driver may be raised and discussed. The driver's responsibility is evaluated under the general fault liability provisions and article 49 of the Turkish Code of Obligations. One of the controversial issues here is whether the person loses the status of driver in the framework of autonomous use. We are of the opinion that the person still remains the driver, since control over the car is transferred of the person's own free will and intervention in driving is still possible.9 This should not be considered absolute liability; for responsibility to be allocated to the driver, the accident must have been avoidable through the driver's intervention.

The operator's liability is regulated as strict (hazard-based) liability under articles 85-86 of the Turkish Road Traffic Law.10 This liability is also deemed absolute liability; therefore, the automation level of the vehicle will not change the scope of liability.

The source of the manufacturer's liability is disputed under Turkish law. The draft Law on Product Safety and Technical Regulations establishes the legal basis of the manufacturer's liability, regulating it as liability for defective products and imposing additional obligations on the manufacturer.

II. Criminal Responsibility

In terms of criminal law, Turkish criminal legislation regulates the criminal responsibility of natural persons. According to article 20 of the Turkish Criminal Code, legal entities do not have criminal liability, but the sanctions qualified as security measures foreseen in the law are reserved.

The legal ground for excluding criminal liability of legal entities and of living creatures other than humans is that they cannot act on the basis of free will. Criminal liability of legal persons is nevertheless regulated in various European countries.11 An autonomous vehicle cannot yet be said to have free will.12 The autonomous vehicle makes the appropriate decision for driving safety based on the data it collects while driving. Only if the system were to make a decision of its own, departing from its intended purpose, could it be argued that the vehicle has free will. For these reasons, an autonomous car cannot be claimed to have criminal liability in today's circumstances.

Therefore, the driver, the programmer and the manufacturer are those who may be held liable; when it comes to manufacturer or programmer liability, it is important to identify the act and connect it to a natural person, since legal persons have no criminal liability.

The liability of programmers and manufacturers will be discussed in scenarios such as intentionally programming the vehicle system to cause an accident, intentionally producing a vehicle incompatible with the software, or disabling the use of the program. These persons should act in accordance with the requirements of their area of expertise; otherwise they might be liable for their negligence.

The driver's intent may be raised in every case where it is possible to interfere with the course of the vehicle, but liability for negligence is a more controversial issue. According to recent decisions, the autonomous system does not eliminate the duty of care and attention. Level 4 vehicles are fully autonomous within their defined usage and driver intervention is not required, so the driver's responsibility will not be raised for level 4 and level 5 vehicles, which are under the control of the autonomous system.

Conclusion

Today, the vehicles being tested are produced so as to remain open to driver intervention, as required by local laws. Until level 4 vehicles, which are fully autonomous within specific areas of use, are released to the market, the driver's liability will remain the principal liability. Upon the introduction of fully autonomous vehicles, it can be foreseen that legal responsibility will pass to the manufacturers.

With the transition to level 4 technology, the responsibility of manufacturers and programmers will arise more often in terms of criminal liability, but it is clear that current regulation will be insufficient in this regard. On the way to liability of artificial intelligence, introducing criminal liability for legal entities may be considered a solution for the legal gaps that may arise during the transition phase.
