
Table of contents

List of Figures vi

Glossary of Terms vii

Abstract viii

General Introduction 1

Chapter 1 Project Framework 3

Introduction 3
1 Context 3
1.1 What is an Autonomous Car? 4
1.2 Levels of Autonomy 4
Figure I.1 Levels of driving automation summary. 4
1.3 What are the benefits of autonomous cars? 5
1.4 What are the challenges with autonomous cars? 6
2 Objectives and Motivation 6

Conclusion 6

Chapter 2 Requirements 7

Introduction 7
1 Functional requirements 7
Figure II.1 Self-driving car interior 8
2 Hardware requirements 8
2.1 Exteroceptive Sensors: 9
Figure II.3 Exteroceptive sensors range. 9
2.1.1 Ultrasonic sensors: 9
2.1.2 RADAR (Radio Detection And Ranging): 10
Figure II.5 RADAR mounted on a self-driving car. 11
2.1.3 LIDAR (Light Detection And Ranging) 11
2.1.4 Camera 12
2.2 Proprioceptive sensors 12
2.3 Choice of Sensors 12

2.4 Computing Hardware: 15
3 Software requirements 16
3.1 Carla Simulator 16
3.2 Python Programming Language 17

Conclusion 19

Chapter 3 System Conception 20

Introduction 20
Software Architecture 20

Conclusion 22

Chapter 4 State Estimation and Localization 23

Introduction 23
1 Sensor fusion 23
2 Motion model 25
3 State Estimation Algorithm 25
4 Unscented Kalman filter design 28

Conclusion 29

Chapter 5 Environment Perception for Self-Driving Cars 30

Introduction 30
1 Object Detection 30
1.1 Convolutional Neural Networks (CNN) 31
1.1.1 Architecture Overview 31
1.1.2 The Feature Extractor 33
1.1.3 Prior Boxes 33
1.1.4 Output layer 34
1.2 IOU (Intersection-Over-Union) evaluation metric 35
1.3 Non-max Suppression 37
1.4 Object Tracking 38
2 Semantic Segmentation 39
2.1 The Semantic Segmentation Problem 39
2.2 The basic ConvNet configuration used for Semantic Segmentation 40
2.3 Semantic Segmentation For Road Scene Understanding 40
2.3.1 Drivable Space Estimation 41

2.3.2 Semantic Lane Estimation 41

Conclusion 42

Chapter 6 Motion Planning 43

Introduction 43
1 Driving scenarios 43
2 Mission Planning 44
3 Behavioral Planning 46
4 Path Planning 49
4.1 Parametric Curves Problem Formulation 50
4.2 Conformal Lattice Planner 51
4.3 Velocity Profile Generation 52

Conclusion 52

Chapter 7 Controller 53

Introduction 53
1 Architecture of Vehicle Control Strategy 53
2 Vehicle Modelling 54
2.1 The bicycle kinematics model 54
2.2 Longitudinal Vehicle Model 55
3 Controller 57
3.1 Longitudinal Speed Control with PID 57
3.2 Vehicle Lateral Control 58

Conclusion 60

Chapter 8 Implementation 61

Introduction 61
1 Vehicle Model 61
1.1 Circle Trajectory 61
1.2 Slope 62
2 State Estimation 63
3 Visual Perception 64
4 Motion Planning 66
5 Putting it all together 66

Conclusion 68

General Conclusion 69

References 70

List of Figures

I.1 Levels of driving automation summary

II.1 Self-driving car interior

II.2 An example of typical sensors used to perceive the environment

II.3 Exteroceptive sensors range

II.4 Ultrasonic sensor in action

II.5 RADAR mounted on a self-driving car

II.6 LIDAR mounted on a self-driving car

II.7 Comparison of various sensors used in autonomous vehicles

II.8 The approach by Tesla

II.9 The approach by Volvo

II.10 The approach by Waymo

II.11 NVIDIA computing hardware

II.12 Communication between CARLA and Python script

III.1 Software Architecture

IV.1 Sensor Fusion

IV.2 Bicycle model

IV.3 State estimation algorithm


V.1 Object detection of cars, trucks and pedestrians
V.2 A regular 3-layer Neural Network vs ConvNet with its neurons arranged in three dimensions
V.3 Basic ConvNet configuration used for object detection

V.4 VGG architecture
V.5 Output Layer
V.6 Object detection during training
V.7 IOU evaluation metric
V.8 Object detection with multiple bounding boxes per object
V.9 Negative and positive anchors
V.10 Non-max Suppression algorithm
V.11 Semantic segmentation output
V.12 Adding the feature decoder to the architecture
V.13 Drivable space estimation
V.14 Lane estimation
VI.1 Left Turning
VI.2 Lane Changing
VI.3 Motion Planning architecture
VI.4 Graph
VI.5 A* Algorithm vs Dijkstra
VI.6 FSM
VI.7 Path Planning Problem
VI.8 Conformal Lattice Planner
VI.9 Trajectory
VI.10 Linear Ramp Profile
VII.1 The vehicle control architecture
VII.2 Forces applied on the vehicle
VII.3 Longitudinal control with PID
VII.4 Feedforward control structure
VII.5 Plant Model
VII.6 Geometric representation of Stanley
VIII.1 Circle Trajectory
VIII.2 Slope representation
VIII.3 Throttle profile

VIII.4 Velocity and distance over time
VIII.5 GNSS and LIDAR data
VIII.6 Filtered Data
VIII.7 Error rate on each degree of freedom
VIII.8 From output of semantic segmentation to estimation of the drivable space
VIII.9 Lane Boundaries
VIII.10 Object detection and distance to impact
VIII.11 Testing algorithms on inference data
VIII.12 Path generated with A* algorithm
VIII.13 Following lead vehicle
VIII.14 Obstacle Avoidance
VIII.15 Stopping at a stop sign

Glossary of Terms

Ego: A term to express the notion of self, used to refer to the vehicle being controlled
autonomously, as opposed to other vehicles or objects in the scene. It is most often used in the
form ego-vehicle, meaning the self-vehicle.

GNSS: Global Navigation Satellite System. A generic term for all satellite systems that provide
position estimation. The Global Positioning System (GPS) is a type of GNSS.

IMU: Inertial Measurement Unit

LIDAR: Light Detection And Ranging

LTI: Linear Time Invariant

LQR: Linear Quadratic Regulation

MPC: Model Predictive Control

ODD: Operational Design Domain

OEDR: Object and Event Detection and Response

PID: Proportional Integral Derivative Control

RADAR: Radio Detection And Ranging

SONAR: Sound Navigation And Ranging

Abstract

The major goal of this project was to develop a control system and 3D visualization
of an autonomous vehicle capable of driving on a road, avoiding obstacles, and
choosing the best path, among other things. The project was created in a simulation
environment named "CARLA" and comprises a 3D model of a road as well as
numerous real-world scenarios in which the vehicle moves. On the circuit, some
objects can hinder the car's progress, resulting in dangerous scenarios. In the case of a
possible collision, the system reacts by steering around the obstacle or stopping the
vehicle. The input data sources for this control system include images from the
camera sensor, LIDAR, and all the other sensors.

General Introduction

An autonomous car is also called a self-driving car, a driverless car, or a robotic car. Whatever
the name, the aim of the technology is the same. Tests of autonomous vehicle technology started in
the 1920s, with cars controlled by radio. Over the past few years, automation technology has been
improving day by day and now touches nearly every aspect of daily human life: agriculture,
medicine, transportation, the automobile and manufacturing industries, the IT sector, and more.
Over the last ten years, the automobile industry has come forward to research autonomous vehicle
technology (Waymo, Google, Uber, Tesla, Renault, Toyota, Audi, Volvo, Mercedes-Benz, General
Motors, Nissan, Bosch, Continental, etc.), and the first Level 3 autonomous cars came out in 2020.
Every day, autonomous vehicle technology researchers solve new challenges. In the future, robots
may manufacture autonomous cars based on customer requirements, without human help, using
IoT technology, and these vehicles promise to be very safe and comfortable for transporting people
and cargo. Autonomous vehicles need continuously updated data, and here IoT and artificial
intelligence help share information from device to device.
This is why we chose to address this subject in this report, which is composed of eight
chapters, each focusing on a different aspect of the developed work. The first chapter serves as an
introduction to the project's main theme: a brief overview of the current state of autonomous
driving, together with a description of the importance of simulation in the autonomous driving
field. This opening chapter also contextualizes and explains the motivation and objectives of this
report. We also define what an autonomous vehicle is, acquire some knowledge about the levels of
autonomy, and examine the benefits that autonomous vehicles offer and the challenges they face.
In Chapter 2, we explore the requirements needed to create a fully functional self-driving car.
Several sensors are investigated to gain a better understanding of the present landscape, identify
common traits, and determine whether the market offer meets the requirements for Level 4 and 5
automation. We also look at the computing hardware that is used. Chapter 3 gives a brief
introduction to the software architecture and how our work was divided. Subsequently, Chapters
4, 5, 6, and 7 give an insight into the work done in each module. We start with the state estimation
and localization of the car, which lets the car know its position in the environment. We then
address visual perception, which allows the vehicle to detect obstacles, define its drivable space, and
track objects in real time. Finally, we introduce motion planning, which lets the vehicle make its
decisions, choose the best path, and interact with other dynamic objects. Chapter 8 then describes
all the results obtained throughout the work, and the general conclusion draws together the
outcomes of this project.

Chapter 1
Project Framework

Introduction

This first chapter describes the general overview of the project and briefly contextualizes the
theme that underpins it. We also explore the definition of a self-driving car, as well as its benefits
and the challenges it faces. Finally, we address the objectives of this project and why we chose to
work on it.

1 Context
Motorized transportation has changed the way we live, and autonomous vehicles are on the
verge of doing so once again. Both technical innovation and social factors have fueled the evolution
of our modes of transportation, from horses and carriages to cars and driverless vehicles.

When we look at the state of autonomous vehicles at the beginning of the 2020s, we can
observe that significant milestones have been reached, such as Waymo, Aptiv, and Yandex
offering autonomous taxis in designated locations since mid-2018. At the same time, technology
developers have encountered unexpected difficulties.

“It’s been an enormously difficult, complicated slog, and it’s far more complicated and involved
than we thought it would be, but it is a huge deal.”

Nathaniel Fairfield, distinguished software engineer and leader of the ‘behavior team’ at Waymo,
December 2019

Furthermore, in a wide range of areas, including robotics and the automotive industry, the
capacity to employ computers to simulate a car's driving environment correctly in real time is
crucial. An accurate simulation of the position, dimensions, and movements of real-world elements
(such as other cars and pedestrians) is needed to test the car's ability to work in a real-world
environment and to be as safe as we would like it to be.

1.1 What is an Autonomous Car?


A self-driving car is a vehicle capable of sensing its surroundings and operating without the need for
human intervention. At no point is a human passenger required to take control of the car, nor is a
human passenger required to be present in the car at all. A self-driving automobile can go anywhere
a traditional automobile can go and accomplish everything a skilled human driver can
accomplish.[1]

1.2 Levels of Autonomy


The Society of Automotive Engineers (SAE) currently defines 6 levels of driving automation
ranging from Level 0 (fully manual) to Level 5 (fully autonomous). These levels have been adopted
by the U.S. Department of Transportation.[2]

Figure I.1 Levels of driving automation summary.


● Level 0: No automation

● Level 1: Advanced Driver Assistance Systems (ADAS) are introduced: features that either
control steering or speed to support the driver. For example, adaptive cruise control
automatically accelerates and decelerates based on other vehicles on the road.

● Level 2: Steering and acceleration are simultaneously handled by the autonomous system.
The human driver still monitors the environment and supervises the support functions.

● Level 3: Conditional automation. The system can drive without the need for a human to
monitor and respond; however, the system might ask a human to intervene, so the driver
must be able to take control at all times.

● Level 4: These systems are highly automated and can fully drive themselves under certain
conditions. The vehicle will not drive if not all conditions are met.

● Level 5: Full automation. The automated driving system (ADS) has the same mobility as a
human driver; the vehicle can drive anywhere, at any time.

1.3 What are the benefits of autonomous cars?


The possibilities for improving ease and quality of life are endless. The elderly, the
physically disabled, and children would be able to get around independently.

However, the real promise of self-driving automobiles is their ability to drastically reduce
CO2 emissions. Experts identified three trends in a recent study that, if implemented
simultaneously, would unleash the full potential of autonomous vehicles: vehicle
automation, vehicle electrification, and ridesharing. By 2050, these "three urban
transportation revolutions" could:

● Reduce road traffic (30 percent fewer vehicles on the road)
● Reduce transportation costs by 40 percent (in terms of vehicles, fuel, and infrastructure)
● Improve walkability and livability
● Free up parking lots for other uses (schools, parks, community centers)
● Reduce CO2 emissions in cities worldwide by 80 percent.

1.4 What are the challenges with autonomous cars?
Fully autonomous (Level 5) vehicles are now being tested in several locations throughout the
world, although none are presently available to the general public; we are still a long way from
that. The difficulties range from technological to legal to environmental and philosophical in
nature. The following are only a few of the unknowns:
● Weather Conditions
● Traffic Conditions and Laws
● Accident Liability
● Others.

2 Objectives and Motivation


This project falls within the topic of autonomous driving simulation. By the end of the work,
we want to develop a self-driving car that can safely drive in a 3D environment (e.g. follow lanes,
avoid obstacles, react in dangerous situations, etc.). To accomplish this, software capable of world
representation and accurate simulation had to be selected to serve as the starting point for the
development of this project. The main motivation for addressing this theme of work was the rapid
evolution of the automotive sector connected to the autonomous driving industry, which is already
a reality today, as many companies are working in this field. Coupled with this, the ability to
simulate this whole autonomous world has its advantages and plays an essential role in making
self-driving technology ever more real.

Conclusion
In this chapter, we looked at the exact definition of a self-driving car, the levels of autonomy
defined by the SAE, the benefits of self-driving cars, the challenges they face, and finally the
objectives of this project and the reasons that led us to tackle it.

Chapter 2

Requirements

Introduction
In this chapter, we will explore the different functional, hardware, and software requirements.
First, we will examine what a self-driving car is supposed to offer its passengers. Then, we will
briefly explain the main sensors and the computing hardware that enable autonomous driving, and
justify the choice of these exact sensors. Finally, we will look at the software requirements,
explaining the choice of the CARLA simulator and the Python programming language.

1 Functional requirements

The majority of people believe that self-driving cars are the way of the future. According to studies,
self-driving cars could reduce the number of deaths resulting from car accidents in the United
States by more than 90% by 2050. However, this isn't the only impact that self-driving cars will
have in the future. So the question is: what can these vehicles really offer their owners?
The functional requirements that a self-driving car should offer:
● Longitudinal control (acceleration, braking and road speed)
● Lateral control (lane discipline)
● Environment monitoring (headway, side, rear)
● Minimum risk maneuver
● Transition demand
● Human Machine Interface (internal and external)
● Driver monitoring.

Self-driving cars must also meet safety requirements such as:
● System safety
● Failsafe Response
● HMI/Operator information
● OEDR (Object and Event Detection and Response)
A self-driving car working in its operational domain should offer a safe and comfortable ride to its
owners with an experience as close to the human driver experience as possible or even better.

Figure II.1 Self-driving car interior

2 Hardware requirements
Sensors are devices that measure or detect a property of the environment or changes to a property.
They are classified into two groups based on the type of data they collect. Exteroceptive sensors
record a property of the environment; extero refers to the outer world or one's surroundings.
Sensors that record a property of the ego vehicle itself, on the other hand, are proprioceptive;
proprio is a Latin word that means "within" or "on one's own". [3]

Figure II.2 An example of typical sensors used to perceive the environment.

2.1 Exteroceptive Sensors:

In order to perceive a vehicle’s direct environment, object detectors are used.

Figure II.3 Exteroceptive sensors range.

2.1.1 Ultrasonic sensors:


Ultrasonic sensors (also known as SONAR: Sound Navigation And Ranging) use ultrasound waves
for ranging and are the most basic and inexpensive of these devices. They are easily influenced by
poor weather conditions such as rain and dust, but not by lighting conditions. Interference from
other sound waves can also impair performance; this can be minimized by utilizing several sensors,
relying on different types of sensors, and combining them with sensor fusion algorithms.
Furthermore, as sound waves lose energy with increasing distance, these sensors are only effective
over short ranges, such as in park assistance.
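Since ultrasonic ranging relies on the time of flight of a sound pulse, the underlying computation is simple. The following is a minimal sketch, not taken from any particular sensor's datasheet; the speed-of-sound constant and the example timing are illustrative assumptions:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 °C (assumed constant here)

def ultrasonic_range(time_of_flight_s: float) -> float:
    """Estimate the distance to an obstacle from the round-trip time of an
    ultrasound pulse. The pulse travels out and back, so the one-way
    distance is half the total path length."""
    return SPEED_OF_SOUND * time_of_flight_s / 2.0

# Example: an echo received 12 ms after emission is roughly 2.06 m away.
print(f"{ultrasonic_range(0.012):.2f} m")
```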

Figure II.4 Ultrasonic sensor in action.

2.1.2 RADAR (Radio Detection And Ranging):


RADAR uses radio waves for ranging; these travel at the speed of light and have the lowest
frequencies (longest wavelengths) of the electromagnetic spectrum. RADAR signals are reflected
well by materials that have considerable electrical conductivity, such as metallic objects. Other
radio waves can interfere with RADAR performance, and transmitted signals can easily bounce off
curved surfaces, blinding the sensor to such objects.

Overall, the key advantages of RADAR include its maturity, low cost, and resistance to low light
and harsh weather. However, because RADAR detects objects with limited spatial resolution and
provides little information about their spatial structure, discriminating between several objects or
separating objects by direction of arrival can be difficult. This has relegated RADAR to more of a
supporting role in automotive sensor suites.
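Besides ranging, a RADAR can measure a target's relative (radial) speed directly through the Doppler shift of the returned wave, a property not discussed above but central to its automotive use. Below is a hedged sketch of the standard monostatic Doppler relation; the 77 GHz carrier is a common automotive band, used here purely as an assumption:

```python
C = 3.0e8  # speed of light in m/s

def radial_speed_from_doppler(doppler_shift_hz: float,
                              carrier_hz: float = 77e9) -> float:
    """Compute a target's radial speed from the measured Doppler shift.
    For a monostatic radar, f_d = 2 * v_r * f_c / c, hence
    v_r = f_d * c / (2 * f_c); a positive shift means the target is
    approaching."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# Example: a +5.13 kHz shift at 77 GHz is a closing speed of about 10 m/s.
print(f"{radial_speed_from_doppler(5130.0):.1f} m/s")
```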

Figure II.5 RADAR mounted on a self-driving car.

2.1.3 LIDAR (Light Detection And Ranging)


LIDAR uses light in the form of a pulsed laser. To cover an area, LIDAR sensors shoot out 50,000
to 200,000 pulses per second and combine the returned signals into a 3D point cloud. Objects and
their movement can be recognized by comparing the differences between consecutive point clouds,
resulting in a 3D map with a range of up to 250 m.
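Each returned pulse contributes one point: the measured range, combined with the known firing direction of the laser, gives a 3D coordinate. The sketch below shows how raw (range, azimuth, elevation) returns can be assembled into a point cloud; the function name and example values are illustrative, not part of any specific LIDAR driver:

```python
import numpy as np

def spherical_to_point_cloud(ranges, azimuths, elevations):
    """Convert raw LIDAR returns (ranges in meters, angles in radians)
    into an N x 3 array of XYZ points in the sensor frame."""
    r = np.asarray(ranges)
    az = np.asarray(azimuths)
    el = np.asarray(elevations)
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.stack([x, y, z], axis=1)

# Example: three returns from a single sweep.
cloud = spherical_to_point_cloud([10.0, 12.5, 8.2],
                                 [0.0, 0.1, -0.1],
                                 [0.02, 0.0, -0.02])
print(cloud.shape)  # (3, 3)
```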

LIDAR has its own lighting source, so the lighting in the surrounding area has little effect on it.
When working in poor or changing lighting conditions, LIDAR does not experience the same
issues as cameras.

Figure II.6 LIDAR mounted on a self-driving car.

Comparison metrics:

● Number of beams
● Points per second
● Rotation rate
● Field of view

2.1.4 Camera
Cameras are a type of light-collecting sensor that can capture a large amount of
information about a scene. In fact, some argue that the camera is the single sensor that is
truly necessary for self-driving cars. However, state-of-the-art performance cannot yet be
achieved by vision alone.

Comparison metrics:

● Resolution
● Field of view
● Dynamic range

2.2 Proprioceptive sensors


Sensors that detect ego characteristics are called proprioceptive sensors. The most frequent types
here are Global Navigation Satellite Systems (GNSS), such as GPS or Galileo, and inertial
measurement units (IMUs). GNSS receivers are used to determine the position, velocity, and, in
certain cases, heading of the ego vehicle; the actual positioning methods and corrections utilized
have a significant impact on accuracy. Aside from these, the IMU measures the ego vehicle's
angular rotation rates and accelerations, and the combined measurements can be used to determine
the vehicle's 3D orientation; for vehicle control, the most critical quantity is heading. Last but not
least, there are wheel odometry sensors. These measure the rate of rotation of the wheels and use
that information to calculate the ego car's speed and rate of change of heading. This is the same
sensor that keeps track of your vehicle's mileage.
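As an illustration, the wheel-odometry computation amounts to scaling the measured wheel rotation rate by the wheel radius; with two wheels on one axle, the difference between their speeds also yields the rate of change of heading. This is a sketch under assumed geometry: the wheel radius and track width are made-up values, not measurements from a real vehicle:

```python
def wheel_odometry(omega_left: float, omega_right: float,
                   wheel_radius: float = 0.3, track_width: float = 1.6):
    """Estimate forward speed (m/s) and yaw rate (rad/s) from the angular
    rates (rad/s) of the two wheels on one axle: speed is the mean of the
    wheel speeds, and yaw rate comes from their difference across the axle."""
    v_left = wheel_radius * omega_left
    v_right = wheel_radius * omega_right
    speed = (v_left + v_right) / 2.0
    yaw_rate = (v_right - v_left) / track_width
    return speed, yaw_rate

# Example: a slightly faster right wheel means the car is turning left.
speed, yaw_rate = wheel_odometry(26.5, 26.9)
print(f"{speed:.2f} m/s, {yaw_rate:.3f} rad/s")  # ~8.01 m/s, ~0.075 rad/s
```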

2.3 Choice of Sensors

While all the sensors presented have their strengths and shortcomings, no single one would be a
viable solution for all conditions on the road, hence the need for many sensors. A vehicle must be
ready to avoid close objects while also sensing objects far away from it. It must be ready to operate
in varied environmental and road conditions with challenging light and weather. This suggests that
to reliably and safely operate an autonomous vehicle, a mix of sensors is usually employed.
The following technical factors affect the selection of sensors:
● The scanning range, which determines the amount of time you have to react to an object
being sensed (see the sketch after this list).
● The resolution, which determines how much detail the sensor can provide.
● The angular resolution, or field of view, which determines how many sensors are required
to cover the region you want to perceive.
● The ability to discriminate between multiple static and moving objects in three dimensions,
which determines the number of objects you can track.
● The refresh rate, which determines how frequently the sensor's data is updated.
● General accuracy and dependability in a variety of environments.
● Cost, size, and software compatibility.
● The amount of data generated.
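To make the first criterion concrete, the reaction-time budget granted by a sensor's scanning range is simply the range divided by the closing speed. A minimal sketch with illustrative numbers; the 200 m range and 30 m/s closing speed are assumptions, not figures from a specific sensor:

```python
def reaction_time_budget(detection_range_m: float,
                         closing_speed_mps: float) -> float:
    """Time available to react to an object first seen at the sensor's
    maximum range, assuming a constant closing speed."""
    return detection_range_m / closing_speed_mps

# Example: detecting an object at 200 m while closing at 30 m/s (~108 km/h)
# leaves about 6.7 s to perceive, plan, and brake.
print(f"{reaction_time_budget(200.0, 30.0):.1f} s")
```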


Figure II.7 Comparison of various sensors used in autonomous vehicles

Different approaches by Tesla, Volvo-Uber, and Waymo:

Figure II.8 The approach by Tesla

Figure II.9 The approach by Volvo

Figure II.10 The approach by Waymo

2.4 Computing Hardware:


The computational brain, the car's main decision-making unit, is the most important component.
It receives all sensor data and generates the commands required to operate the vehicle. Most
companies prefer to create their own computing systems, tailored to their sensors and algorithms.
The most common examples are NVIDIA's Drive PX and Intel and Mobileye's EyeQ.

Figure II.11 NVIDIA computing hardware
Any self-driving computer brain must have both serial and parallel compute components,
particularly for image and LIDAR processing tasks such as segmentation, object detection, and
mapping. We use GPUs, FPGAs, and custom ASICs, which are specialized hardware designed to
perform a certain type of computation.

The Drive PX units, for example, have many GPUs, and the EyeQ units have FPGAs, both used to
accelerate parallelizable compute tasks such as image processing and neural network inference. It is
essential to correctly synchronize the different modules in the system and serve a common clock,
because we want to make driving decisions based on a coherent picture of the road scene.

3 Software requirements
3.1 Carla Simulator
There are several simulation platforms currently available that focus on the study of autonomous
driving, as well as others designed simply as racing games: Autoware, AirSim, CARLA, TORCS,
the Udacity Self-Driving Car Simulator, and Grand Theft Auto are some of them. The term
"engine" refers to the software development environment in which a