
2020

AUTONOMOUS VEHICLE TECHNOLOGY REPORT
The guide to understanding the current state of the art in hardware & software for self-driving vehicles.

Sponsored by Nexperia.

For the people who aim to create a better version of the future.
Contributors

Introduction
  How this report came to be: a collaborative effort

Levels of Autonomy

Sensing
  Environmental mapping
  Passive sensors
  Active sensors
  Choice of sensors
  Geolocalization
  Maps

Thinking & Learning
  SLAM and Sensor Fusion
  Machine Learning Methods
  Gathering Data
  Path Planning

Acting
  Architectures: Distributed versus Centralized
  Power, Heat, Weight, and Size challenges

User experience
  Inside the vehicle
  The external experience

Communication & Connectivity
  DSRC or C-V2X

Use case: Autonomous Racing

Summary

About Nexperia

About Wevolver

References
Contributors

Editor in Chief

Bram Geenen
Amsterdam, The Netherlands
CEO Wevolver.

Editors

Ali Nasseri
Vancouver, Canada
Lab manager at the Programming Languages for Artificial Intelligence (PLAI) research group at the University of British Columbia. Previously Chair of the Space Generation Advisory Council. Cum Laude PhD. in Engineering Physics, Politecnico di Torino.

Adriaan Schiphorst
Amsterdam, The Netherlands
Technology journalist. MSc Advanced Matter & Energy Physics at University of Amsterdam and the California Institute of Technology. Previously editor at Amsterdam Science Journal.

Contributing Experts

Norman di Palo
Rome, Italy
Robotics and Machine Learning researcher, conducting research on machine learning for computer vision and control at the Istituto Italiano di Tecnologia, Genova, Italia. Cum Laude MSc. Engineering in Artificial Intelligence and Robotics, Sapienza Università di Roma, and graduate of the Pi School of Artificial Intelligence.

Jordan Sotudeh
Los Angeles, USA
Senior Strategic Analyst at NASA Jet Propulsion Laboratory. Master International Science and Technology Policy, Elliott School of International Affairs, Washington DC, USA.

Matthew Nancekievill
Manchester, United Kingdom
Postdoctoral researcher, submersible robotics, University of Manchester, UK. PhD. Electrical and Electronics Engineering, University of Manchester, UK. CEO Ice Nine Robotics.

Jeremy Horne
San Felipe, Baja California, Mexico
President Emeritus of the American Association for the Advancement of Science, Southwest Division. Science advisor and curriculum coordinator at the Inventors Assistance Center. Ph.D. in philosophy from the University of Florida, USA.

Drue Freeman
Cupertino, California, USA
CEO of the Association for Corporate Growth, Silicon Valley. Former Sr. Vice President of Global Automotive Sales & Marketing for NXP Semiconductors. Board Director at Sand Hill Angels. Advisory Board Member of automotive companies Savari and Ridar Systems, and Advisory Board Member of Silicon Catalyst, a semiconductor focused incubator. Bachelor of Science in Electrical Engineering, San Diego State University, MBA from Pepperdine University, Los Angeles.

Akbar Ladak
Bangalore, India
Founder, CEO, Kaaenaat, which develops autonomous robots for logistics, retail and security use cases, as well as Advanced Driver Assistance Systems (ADAS) for 2- & 4-wheeler vehicles for chaotic driving conditions in Asia & Africa. Master in Electrical Engineering, Georgia Institute of Technology.

Zeljko Medenica
Birmingham, Michigan, USA
Principal Engineer and Human Machine Interface (HMI) Team Lead at the US R&D Center of Changan, a major Chinese automobile manufacturer. Previously led research on novel and intuitive HMI for Advanced Driver Assistance Systems at Honda. PhD. in Electrical and Computer Engineering from the University of New Hampshire.

Mark A. Crawford Jr.
Baoding City, China
Chief Engineer for Autonomous Driving Systems at Great Wall Motor Co. PhD. Industrial and Systems Engineering - Global Executive Track, at Wayne State University. Previously Technical Expert at Ford.

William Morris
Detroit, Michigan, USA
Automotive engineer.

Joakim Svennson
Norrköping, Sweden
Senior ADAS Engineer, Function Owner Traffic Sign Recognition and Traffic Light Recognition at Veoneer. MSc. Media Technology, Linköping University, Sweden.

Fazal Chaudry
Headington, United Kingdom
Product Development Engineer. Master of Science, Space Studies, International Space University, Illkirch Graffenstaden, France.

Shlomit Hacohen
Tel Aviv, Israel
VP of Marketing at Arbe Robotics, developing ultra high-resolution 4D imaging radar technology. MBA at Technion, the Israel Institute of Technology.

Maxime Flament
Brussels, Belgium
Chief Technology Officer, 5G Automotive Association (5GAA). PhD. in Wireless Communication Systems, Chalmers University of Technology, Göteborg, Sweden.

Designer

Bureau Merkwaardig
Amsterdam, The Netherlands
Award winning designers Anouk de l'Ecluse and Daphne de Vries are a creative duo based in Amsterdam. They are specialized in visualizing the core of an artistic problem. Bureau Merkwaardig initiates, develops and designs.

Illustrations

Sabina Begović
Padua, Italy
Croatian born Sabina is a visual and interaction designer. She obtained a Master's in Visual and Communication Design at Iuav, University of Venice, and a Masters in Art Education at the Academy of Applied Art, Rijeka, Croatia.

Cover Photographer

Benedict Redgrove
London, United Kingdom
Benedict has a lifelong fascination with technology, engineering, innovation and industry, and is a dedicated proponent of modernism. This has intuitively led him to capturing projects and objects at their most cutting edge. He has created an aesthetic of photography that is clean, pure and devoid of any miscellaneous information, winning him acclaim and numerous awards. Redgrove has amassed a following and client base from some of the most advanced companies in the world. A career spent recording the pioneering technology of human endeavours has produced a photographic art form that gives viewers a window into an often unseen world, such as Lockheed Martin Skunk Works, UK MoD, European Space Agency, British Aerospace and NASA. Whether capturing the U-2 reconnaissance pilots and stealth planes, the Navy Bomb Disposal Division or spending time documenting NASA's past, present and future, Benedict strives to capture the scope and scale of advancements and what they mean to us as human beings. His many awards include the 2009 AOP Silver, DCMS Best of British Creatives, and the Creative Review Photography Annual 2003, 2008, and 2009.
At Wevolver we are a great fan of Benedict's work and how his pictures capture a spirit of innovation. We're grateful he has enabled us to use his beautiful images of the Robocar to form the perfect backdrop for this report.

Many thanks to

The people at Roborace, specifically Victoria Tomlinson and Alan Cocks.

Edwin van de Merbel, Dirk Wittdorf, Petra Beekmans - van Zijll and all the other people at Nexperia for their support.

Our team at Wevolver, including Sander Arts, Benjamin Carothers, Seth Nuzum, Isidro Garcia, Jay Mapalad, and Richard Hulskes. Many thanks for the proofreads and feedback.

The Wevolver community for their support, knowledge sharing, and for making us create this report.

Many others that can't all be listed here have helped us in big or small ways. Thank you all.

Beyond the people mentioned here we owe greatly to the researchers, engineers, writers, and many others who share their knowledge online. Find their input in the references.

Media Partner

Supplyframe
Supplyframe is a network for electronics design and manufacturing. The company provides open access to the world's largest collection of vertical search engines, supply chain tools, and online communities for engineering. Their mission is to organize the world of engineering knowledge to help people build better hardware products, and at Wevolver we support that aspiration and greatly appreciate that Supplyframe contributes to the distribution of this report among their network.

“It’s been an enormously
difficult, complicated
slog, and it’s far more
complicated and involved
than we thought it would
be, but it is a huge deal.”

Nathaniel Fairfield,
distinguished software
engineer and leader of the
‘behavior team’ at Waymo,
December 2019 [1]

Introduction

Bram Geenen
Editor in Chief, CEO of Wevolver

Motorized transportation has changed the way we live. Autonomous vehicles are about to do so once more. This evolution of our transport - from horses and carriages, to cars, to driverless vehicles - has been driven by both technical innovation and socioeconomic factors. In this report we focus on the technological aspect.

Looking at the state of autonomous vehicles at the start of the 2020s, we can see that impressive milestones have been achieved, such as companies like Waymo, Aptiv, and Yandex offering autonomous taxis in dedicated areas since mid-2018. At the same time, technology developers have run into unforeseen challenges.

Some industry leaders and experts have scaled back their expectations, and others have spoken out against optimistic beliefs and predictions.[2,3] Gartner, a global research and advisory firm, weighs in by now placing 'autonomous vehicles' in the Trough of Disillusionment of its yearly Hype Cycle.[4]

The engineering community is less affected by media hype: over 22% of the engineers visiting the Wevolver platform do so to gain more knowledge on autonomous vehicle technology.[5] Despite how much topics like market size and startup valuations have been covered globally by the media, many engineers have expressed to our team at Wevolver that comprehensive knowledge to grasp the current technical possibilities is still lacking.

Therefore, this report's purpose is to enable you to be up to date and understand autonomous vehicles from a technical viewpoint. We have compiled and centralized the information you need to understand what technologies are needed to develop autonomous vehicles. We will elaborate on the engineering considerations that have been and will be made for the implementation of these technologies, and we'll discuss the current state of the art in the industry.

This report's approach is to describe technologies at a high level, to offer the baseline knowledge you need to acquire, and to use lots of references to help you dive deeper whenever needed.

Most of the examples in the report will come from cars. However, individual personal transportation is not the only area in which Autonomous Vehicles (AVs) will be deployed and in which they will have a significant impact. Other areas include public transportation, delivery & cargo, and specialty vehicles for farming and mining. All of these come with their own environment and specific usage requirements that are shaping AV technology. At the same time, all of the technologies described in this report form the ingredients for autonomy, and thus will be needed in various applications.

How this report came to be: a collaborative effort

Once the decision was made to create this report, we asked our community for writers with expertise in the field, and for other experts who could provide input. A team of writers and editors crafted a first draft, leveraging many external references. Then, in a second call-out to our community, we found many engineers and leaders from both commercial and academic backgrounds willing to contribute significant amounts of their time and attention to providing extensive feedback and collaborating with us to shape the current report through many iterations. We owe much to their dedication, and through their input this report has been able to incorporate views from across the industry and 11 different countries.

Because this field continues to advance, we don't consider our work done. We intend to update this report with new editions regularly as new knowledge becomes available and our understanding of the topic grows. You are invited to play an active role and contribute to this evolution, be it through brief feedback or by submitting significant new information and insights to our editorial team (info@wevolver.com); your input is highly appreciated and invaluable to further the knowledge on this topic.

This report would not have been possible without the sponsorship of Nexperia, a semiconductor company shipping over 90 billion components annually, the majority of which are within the automotive industry. Through their support, Nexperia shows a commitment to the sharing of objective knowledge to help technology developers innovate. This is the core of what we do at Wevolver.

The positive impact these technologies could possibly have on both individual lives and our society and planet as a whole is an inspiring and worthwhile goal. At Wevolver we hope this report provides the information and inspiration for you, in any way possible, to be a part of that evolution.

“Autonomous vehicles
are already here – they’re
just not very evenly
distributed.”
William Gibson,
Science fiction writer,
April 2019 [12]

Levels of Autonomy

When talking about autonomous vehicles, it is important to keep in mind that each vehicle can have a range of autonomous capabilities. To enable classification of autonomous vehicles, the Society of Automotive Engineers (SAE) International established its SAE J3016™ "Levels of Automated Driving" standard. Its levels range from 0-5, and a higher number designates an increase in autonomous capabilities.[6]

Level 0 (L0): No automation.

Level 1 (L1): Advanced Driver Assistance Systems (ADAS) are introduced: features that either control steering or speed to support the driver. For example, adaptive cruise control that automatically accelerates and decelerates based on other vehicles on the road.

Level 2 (L2): Now both steering and acceleration are simultaneously handled by the autonomous system. The human driver still monitors the environment and supervises the support functions.

Level 3 (L3): Conditional automation: the system can drive without the need for a human to monitor and respond. However, the system might ask a human to intervene, so the driver must be able to take control at all times.

Level 4 (L4): These systems have high automation and can fully drive themselves under certain conditions. The vehicle won't drive if not all conditions are met.

Level 5 (L5): Full automation; the vehicle can drive wherever, whenever.

[Figure: Levels of driving automation summary. Adapted from SAE by Wevolver.[6]
0 No Automation | 1 Driver Assistance | 2 Partial Automation | 3 Conditional Automation | 4 High Automation | 5 Full Automation.
Levels 0-2: you are the driver and monitor the environment, even when automation features are turned on. Level 3: when the system requests, you must take control. Levels 4-5: no requirement for you to take over control.
Level 1: steering OR speed are automated; the system supports you driving. Level 2: steering AND speed are automated. Levels 3-4: the system operates when specific conditions are met. Level 5: the system operates in all conditions.]
The context and environment (including rules, culture, weather, etc.) in which an autonomous vehicle needs to operate greatly influences the level of autonomy that can be achieved. On a German Autobahn, the speed and accuracy of obstacle detection, and the subsequent decisions that need to be made to change the speed and direction of the vehicle, need to happen within a few milliseconds, while the same detection and decisions can be much slower for a vehicle that never leaves a corporate campus. In a similar manner, the models needed to drive in sunny Arizona are more predictable than those in New York City, or Bangalore. That also means an automated driving system (ADS) capable of L3 automation in the usual circumstances of e.g. Silicon Valley might need to fall back to L2 functionality if it were deployed on snowy roads or in a different country.

The capabilities of an autonomous vehicle determine its Operational Design Domain (ODD). The ODD defines the conditions under which a vehicle is designed to function and is expected to perform safely. The ODD includes (but isn't limited to) environmental, geographical, and time-of-day restrictions, as well as traffic or roadway characteristics. For example, an autonomous freight truck might be designed to transport cargo from a seaport to a distribution center 30 km away, via a specific route, in day-time only. This vehicle's ODD is limited to the prescribed route and time-of-day, and it should not operate outside of it.[7–9]

Level 5 ADS have the same mobility as a human driver: an unlimited ODD. Designing the autonomous vehicle to be able to adjust to all driving scenarios, in all road, weather and traffic conditions, is the biggest technical challenge to achieve. Humans have the capability to perceive a large amount of sense information and fuse this data to make decisions using both past experience and our imagination. All of this in milliseconds. A fully autonomous system needs to match (and outperform) us in these capabilities. The question of how to assess the safety of such a system needs to be addressed by legislators. Companies have banded together, like in the Automated Vehicle Safety Consortium, to jointly develop new frameworks for safety.[10]

Major automotive manufacturers, as well as new entrants like Google (Waymo), Uber, and many startups are working on AVs. While design concepts differ, all these vehicles rely on using a set of sensors to perceive the environment, advanced software to process input and decide the vehicle's path, and a set of actuators to act upon decisions.[11] The next sections will review the technologies needed for these building blocks of autonomy.

Sensing

Because an autonomous vehicle operates in an (at least partially) unknown and dynamic environment, it simultaneously needs to build a map of this environment and localize itself within the map. The input to perform this Simultaneous Localization and Mapping (SLAM) process needs to come from sensors and pre-existing maps created by AI systems and humans.

[Figure: Example of the variety of static and moving objects that an autonomous vehicle needs to detect and distinguish from each other: static objects, moving objects, road markings, lane markings, traffic lights, street signs. Image: Wevolver, based on a photo by Dan Smedley.]

Environmental mapping

In order to perceive a vehicle's direct environment, object detection sensors are used. Here, we will make a distinction between two sets of sensors: passive and active. Passive sensors detect existing energy, like light or radiation, reflecting from objects in the environment, while active sensors send their own electromagnetic signal and sense its reflection. These sensors are already found in automotive products at Level 1 or 2, e.g. for lane keeping assistance.

Passive sensors

Due to the widespread use of object detection in digital images and videos, passive sensors based on camera technology were one of the first sensors to be used on autonomous vehicles. Digital cameras rely on CCD (charge-coupled device) or CMOS (complementary metal-oxide semiconductor) image sensors, which work by changing the signal received in the 400-1100 nm wavelengths (visible to near infrared spectra) to an electric signal.[13,14]

The surface of the sensor is broken down into pixels, each of which can sense the intensity of the signal received, based on the amount of charge accumulated at that location. By using multiple sensors that are sensitive to different wavelengths of light, color information can also be encoded in such a system.

While the principles of operation of CCD and CMOS sensors are similar, their actual operation differs. CCD sensors transport charge to a specific corner of the chip for reading, while each pixel in a CMOS chip has its own transistor to read the interaction with light. Colocation of transistors with sensor elements in CMOS reduces its light sensitivity, as the effective surface area of the sensor for interaction with the light is reduced. This leads to higher noise susceptibility for CMOS sensors, such that CCD sensors can create higher quality images. Yet, CMOS sensors use up to 100 times less power than CCDs. Furthermore, they're easier to fabricate using standard silicon production processes.

Most current sensors used for autonomous vehicles are CMOS based and have a 1-2 megapixel resolution.[15]

While passive CMOS sensors are generally used in the visible light spectrum, the same CMOS technology can be used in thermal imaging cameras, which work in the infrared wavelengths of 780 nm to 1 mm. They are useful sensors for detecting hot bodies, such as pedestrians or animals, and for peak illumination situations such as the end of a tunnel, where a visual sensor will be blinded by the light intensity.[16]

In most cases, the passive sensor suite aboard the vehicle consists of more than one sensor pointing in the same direction. These stereo cameras can take 3D images of objects by overlaying the images from the different sensors. Stereoscopic images can then be used for range finding, which is important for autonomous vehicle applications.

[Figure: An example of typical sensors used to perceive the environment: cameras, IR cameras, LIDAR, long range RADAR, short/medium range RADAR, ultrasound, GNSS, and IMU. Note that various vehicle manufacturers may use different combinations of sensors and might not use all of the displayed sensors. For example, increasingly multiple smaller LIDAR sensors are being used, and long range backward facing RADAR can be incorporated to cover situations like highway lane changing and merging. The placing of the sensors can vary as well. Image: Wevolver]

The main benefits of passive sensors are[17]:

• High resolution in pixels and color across the full width of the field of view.
• Constant 'frame-rate' across the field of view.
• Two cameras can generate a 3D stereoscopic view.
• The lack of a transmitting source reduces the likelihood of interference from another vehicle.
• Low cost due to matured technology.
• The images generated by these systems are easy for users to understand and interact with.

Indeed, Tesla cars mount an array of cameras all around the vehicle to gather visual field information, and London based startup Wayve claims that its cars, which only rely on passive optic sensors, are safe enough for use in cities. The main shortcoming of passive sensors is their performance in low light or poor weather conditions; because they do not have their own transmission source, they cannot easily adapt to these conditions. These sensors also generate 0.5-3.5 Gbps of data,[18] which can be a lot for onboard processing or communicating to the cloud. It is also more than the amount of data generated by active sensors.

If a passive camera sensor suite is used on board an autonomous vehicle, it will likely need to see the whole surrounding of the car. This can be done by using a rotating camera that takes images at specific intervals, or by stitching the images of 4-6 cameras together through software. In addition, these sensors need a high dynamic range (the ability to image both highlights and dark shadows in a scene) of more than 100 dB,[22] giving them the ability to work in various light conditions and distinguish between various objects.

Dynamic range is measured in decibels (dB), a logarithmic way of describing a ratio. Humans have a dynamic range of about 200 dB. That means that in a single scene, the human eye can perceive tones that are about 1,000,000 times darker than the brightest ones. Cameras have a narrower dynamic range, though they are getting better.
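As a back-of-the-envelope illustration (ours, not from the report) of that logarithmic scale, here is the arithmetic using the 20·log10 convention common for image sensors; note that conventions vary across fields:

```python
# Dynamic range in dB from a contrast ratio, using the 20*log10 convention
# common for image sensors (other fields sometimes use 10*log10 of power).
import math

def dynamic_range_db(contrast_ratio: float) -> float:
    return 20 * math.log10(contrast_ratio)

# The >100 dB requirement above corresponds to capturing scene tones spanning
# a roughly 100,000:1 brightness ratio within a single frame:
print(dynamic_range_db(1e5))  # 100.0
```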

"Once you solve cameras for vision, autonomy is solved; if you don't solve vision, it's not solved … You can absolutely be superhuman with just cameras."

Elon Musk,
2017 [19]

"At the moment, LIDAR lacks the capabilities to exceed the capabilities of the latest technology in radar and cameras."

Tetsuya Iijima,
General Manager of Advanced Technology Development for Automated Driving, Nissan,
May 2019 [20]

"Let's be candid, LIDAR is unaffordable in consumer vehicles, but if a lidar unit were available today that had good performance and was affordable, it would quietly show up in a Tesla car and this whole hubbub would go away."

Bill Colleran,
CEO, Lumotive,
June 2019 [21]

[Figure: The electromagnetic spectrum, from gamma-ray (wavelengths around 10^-12 m) through X-ray, UV, visible, IR and microwave to radio (around 10^6 m), and its usage for perception sensors: cameras in the visible band, thermal cameras and LIDAR in the infrared, RADAR in the microwave band.[16]]

Active Sensors

Active sensors have a signal transmission source and rely on the principle of Time of Flight (ToF) to sense the environment. ToF measures the travel time of a signal from its source to an object and back; the distance follows from the time it takes the signal to return. The frequency of the signal used determines the energy used by the system, as well as its accuracy. Therefore, determining the correct wavelength plays a key role in choosing which system to use.

[Figure: Time of flight principle, illustrated — a signal goes out, reflects off an object, and the distance is measured from the round trip. The distance can be calculated using the formula d = (v·t)/2, where d is the distance, v is the speed of the signal (the speed of sound for sound waves, and the speed of light for electromagnetic waves) and t is the time for the signal to reach the object and return. This calculation method is the most common but has limitations, and more complex methods have been developed; for example, using the phase-shift in a returning wave.[23] Image: Wevolver]

Ultrasonic sensors (also referred to as SONAR; SOund NAvigation Ranging) use ultrasound waves for ranging and are by far the oldest and lowest cost of these systems. As sound waves have the lowest frequency (longest wavelengths) among the sensors used, they are more easily disturbed. This means the sensor is easily affected by adverse environmental conditions like rain and dust. Interference created by other sound waves can affect the sensor performance as well and needs to be mitigated by using multiple sensors and relying on additional sensor types. In addition, as sound waves lose energy as distance increases, this sensor is only effective over short distances, such as in park assistance. More recent versions rely on higher frequencies, to reduce the likelihood of interference.[24]

RADAR (RAdio Detection And Ranging) uses radio waves for ranging. Radio waves travel at the speed of light and have the lowest frequency (longest wavelength) of the electromagnetic spectrum. RADAR signals are reflected well by materials that have considerable electrical conductivity, such as metallic objects. Interference from other radio waves can affect RADAR performance, while transmitted signals can easily bounce off curved surfaces, and thus the sensor can be blind to such objects. At the same time, using the bouncing properties of the radio waves can enable a RADAR sensor to 'see' beyond objects in front of it. RADAR has lesser abilities in determining the shape of detected objects than LIDAR.[25]

The main benefits of RADAR are its maturity, low cost, and resilience against low light and bad weather conditions. However, radar can only detect objects with low spatial resolution and without much information about the spatial shape of the object; thus distinguishing between multiple objects or separating objects by direction of arrival can be hard. This has relegated radars to more of a supporting role in automotive sensor suites.[17]

"We need more time for the car to react, and we think imaging radar will be a key to that."

Chris Jacobs, Vice President of Autonomous Transportation and Automotive Safety, Analog Devices Inc,
January 2019 [26]
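To make the formula in the time-of-flight caption concrete, here is a minimal sketch (ours, not from the report); the constants and example timings are illustrative:

```python
# Time-of-flight ranging as in the figure caption, d = (v * t) / 2.
SPEED_OF_LIGHT = 299_792_458.0  # m/s, for RADAR and LIDAR signals
SPEED_OF_SOUND = 343.0          # m/s in air at ~20 °C, for ultrasonic sensors

def tof_distance(round_trip_time_s: float, signal_speed_m_s: float) -> float:
    """Distance to a target from the round-trip travel time of a signal.

    The signal covers the sensor-target distance twice (out and back),
    hence the division by two.
    """
    return (signal_speed_m_s * round_trip_time_s) / 2.0

# A LIDAR return arriving 1 microsecond after emission implies a target ~150 m away:
print(tof_distance(1e-6, SPEED_OF_LIGHT))   # ≈ 149.9
# An ultrasonic echo after 10 ms implies an obstacle ~1.7 m away:
print(tof_distance(10e-3, SPEED_OF_SOUND))  # ≈ 1.715
```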

"Almost everything is in R&D, of which 95 percent is in the earlier stages of research, rather than actual development. The development stage is a huge undertaking — to actually move it towards real-world adoption and into true series production vehicles. Whoever is able to enable true autonomy in production vehicles first is going to be the game changer for the industry. But that hasn't happened yet."

Austin Russell, founder and CEO of Luminar,
June 2019 [21]

Imaging radar is particularly interesting for autonomous cars. Unlike short range radar, which relies on 24 GHz radio waves, imaging radar uses higher energy 77-79 GHz waves. This allows the radar to scan a 100 degree field of view for up to a 300 m distance. This technology eliminates former resolution limitations and generates a true 4D radar image of ultra-high resolution.[15,26,27]

LIDAR (LIght Detection And Ranging) uses light in the form of a pulsed laser. LIDAR sensors send out 50,000 - 200,000 pulses per second to cover an area and compile the returning signals into a 3D point cloud. By comparing the differences between consecutive perceived point clouds, objects and their movement can be detected, such that a 3D map, of up to 250 m in range, can be created.[28]

There are multiple approaches to LIDAR technology:

Mechanical scanning LIDARS use rotating mirrors and/or mechanically rotate the laser. This setup provides a wide Field Of Vision but is also relatively large and costly. This technology is the most mature.

Microelectromechanical mirrors (MEMS) based LIDARS distribute the laser pulses via one or multiple tiny tilting mirrors, whose angle is controlled by the voltage applied to them. By substituting the mechanical scanning hardware with an electromechanical system, MEMS LIDARS can achieve an accurate and power-efficient laser deflection that is also cost-efficient.[29]

Systems that do not use any mechanical parts are referred to as solid-state, and sometimes as 'LIDAR-on-a-chip.'

Flash LIDARS are a type of solid-state LIDARS that diffuse their laser beam to illuminate an entire scene in one flash. The returning light is captured by a grid of tiny sensors. A major challenge of Flash LIDARS is accuracy.[30]

Phased-Array LIDARS are another solid-state technology that is undergoing development. Such systems feed their laser beam into a row of emitters that can change the speed and phase of the light that passes through.[31] The laser beam gets pointed by incrementally adjusting the signal's phase from one emitter to the next.

Metamaterials: A relatively new development is to direct the laser by shining it onto dynamically tunable metamaterials. Tiny components on these artificially structured metasurfaces can be dynamically tuned to slow down parts of the laser beam, which through interference results in a beam that's pointing in a new direction. Lumotive, a startup funded by Bill Gates, claims its metamaterial based LIDARS can scan 120 degrees horizontally and 25 degrees vertically.[32]

[Figure: LIDAR provides a 3D point cloud of the environment. Image: Renishaw]

Interference from a source with the same wavelength, or changes in reflectivity of surfaces due to wetness, can affect the performance of LIDAR sensors. LIDAR performance can also be affected by external light, including from other LIDARS.[33] While traditional LIDAR sensors use 900 nm wavelengths, new sensors are shifting to 1500 nm, enabling the vehicle to see objects 150-250 m away.[26,28]

LIDAR has the benefit of a relatively wide field of vision, with potentially full 360 degree 3D coverage (depending on the type of LIDAR chosen). Furthermore, it has a longer range, more accurate distance estimates compared to passive (optical) sensors, and lower computing cost.[17] Its resolution, however, is poorer, and laser safety can put limits on the laser power used, which in turn can affect the capabilities of the sensor.

These sensors have traditionally been very expensive, with prices of tens of thousands of dollars for the iconic rooftop mounted 360 degree units. However, prices are coming down: market leader Velodyne announced in January 2020 a metamaterials LIDAR that should ship for $100, albeit offering a narrower field of vision (60° horizontal x 10° vertical) and shorter range (100 m).[34,35]

Among the three main active, ToF based systems, SONAR is mainly used as a sensor for very close proximity due to the lower range of ultrasound waves. RADAR cannot make out complex shapes, but it is able to see through adverse weather such as rain and fog. LIDAR can better sense an object's shape, but is shorter range and more affected by ambient light and weather conditions. Usually two active sensor systems are used in conjunction, and if the aim is to only rely on one, LIDAR is often chosen. Secondly, active sensors are often used in conjunction with passive sensors (cameras).

[Figure: Various object detection and mapping sensors are used for various purposes, and have complementary capabilities and ranges. Long range RADAR: object detection through rain, fog and dust; the signal can bounce around/underneath vehicles in front that obstruct view. Cameras: a broad spectrum of use cases, from distant feature perception to road sign recognition. LIDAR: 3D environment mapping, short-long range object detection. Short/medium range RADAR: short-mid range object detection, incl. side and rear collision avoidance and cross traffic detection. Ultrasound: close range object detection, for objects entering your lane and for parking. Image: Wevolver]

Choice of Sensors

While all the sensors presented have their own strengths and shortcomings, no single one would be a viable solution for all conditions on the road. A vehicle needs to be able to avoid close objects, while also sensing objects far away from it. It needs to be able to operate in different environmental and road conditions with challenging light and weather circumstances. This means that to reliably and safely operate an autonomous vehicle, usually a mixture of sensors is utilized.

The following technical factors affect the choice of sensors:

• The scanning range, determining the amount of time you have to react to an object that is being sensed.
• Resolution, determining how much detail the sensor can give you.
• Field of view, or the angular resolution, determining how many sensors you would need to cover the area you want to perceive.
• Ability to distinguish between multiple static and moving objects in 3D, determining the number of objects you can track.
• Refresh rate, determining how frequently the information from the sensor is updated.
• General reliability and accuracy in different environmental conditions.
• Cost, size, and software compatibility.
• Amount of data generated.

Vehicle manufacturers use a mixture of optical and ToF sensors, with sensors strategically located to overcome the shortcomings of the specific technology. By looking at their setups we can see example combinations used for perception:

• Tesla's Model S uses a forward mounted radar to sense the road, 3 forward facing cameras to identify road signs, lanes and objects, and 12 ultrasonic sensors to detect nearby obstacles around the car.
• Volvo-Uber uses a top mounted 360 degree LIDAR to detect road objects, short and long range optical cameras to identify road signals, and radar to sense close-by obstacles.
• Waymo uses a 360 degree LIDAR to detect road objects, 9 visual cameras to track the road, and a radar for obstacle identification near the car.
• Wayve uses a row of 2.3-megapixel RGB cameras with high dynamic range, and satellite navigation, to drive autonomously.[39]

Comparison of various sensors used in autonomous vehicles.[14,18,26,36–38]

Sensor     | Measurement distance (m) | Cost ($)     | Data rate (Mbps)
Camera     | 0-250                    | 4-200        | 500-3,500
Ultrasound | 0.02-10                  | 30-400       | < 0.01
RADAR      | 0.2-300                  | 30-400       | 0.1-15
LIDAR      | Up to 250                | 1,000-75,000 | 20-100

Note that these are typical ranges and more extreme values exist. For example, Arbe Robotics' RADAR can generate 1 GBps depending on requirements from OEMs. Also note that multiple low-cost sensors can be required to achieve comparable performance to high-end sensors.
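As a toy illustration (ours, not from the report) of how such trade-offs can be weighed programmatically when drafting a sensor suite, the sketch below encodes the table's typical figures and filters candidates by the range a function needs; the numbers and the cost-first selection rule are simplifications:

```python
# Encode the comparison table and query it for a given sensing task.
SENSORS = [
    # (name, max range m, typical high-end cost $, typical data rate Mbps)
    ("camera",     250,   200, 3500),
    ("ultrasound",  10,   400, 0.01),
    ("radar",      300,   400,   15),
    ("lidar",      250, 75000,  100),
]

def candidates(required_range_m: float):
    """Sensors that can see far enough, cheapest first."""
    usable = [s for s in SENSORS if s[1] >= required_range_m]
    return sorted(usable, key=lambda s: s[2])

# Highway-speed braking needs long range; parking needs only a few meters:
print([name for name, *_ in candidates(200)])  # ['camera', 'radar', 'lidar']
print([name for name, *_ in candidates(5)])    # all four, ultrasound now viable
```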

Different approaches by Tesla, Volvo-Uber, and Waymo

Companies take different approaches to the set of sensors used for autonomy, and where they are placed around the vehicle.[36,40-45]

Tesla Model S: 3x forward facing cameras (wide, main, narrow), forward facing RADAR, forward looking side cameras, rearward looking side cameras, a rear view camera, and 12 ultrasonics around the vehicle.

Volvo-Uber XC90: Volvo provides a base vehicle with pre-wiring and harnessing for Uber to directly plug in its own self-driving hardware, which includes the rig with LIDAR and cameras on top of the vehicle.
Volvo's hardware: RADAR front & back, forward facing cameras, side cameras, ultrasound front & back, and a rear camera.
Uber's hardware: forward facing cameras, LIDAR, and side and rear cameras.

Waymo Chrysler Pacifica: 4x RADAR, long-range LIDAR, 2x short-range LIDAR, 2x mid-range LIDAR, 360° cameras, and audio sensing.

Tesla's sensors contain heating to counter frost and fog, Volvo's cameras come equipped with a water-jet washing system for cleaning their nozzles, and the cone that contains the cameras on Waymo's Chrysler has water jets and wipers for cleaning.

Images: adapted from Tesla, Volvo, Waymo, by Wevolver.

Geolocalization

Once the autonomous vehicle has scanned its environment, it can find its location on the road relative to other objects around it. This information is critical for lower-level path planning to avoid any collisions with objects in the vehicle's immediate vicinity.

On top of that, in most cases the user communicates the place they would like to go to in terms of a geographical location, which translates to a latitude and longitude. Hence, in addition to knowing its relative position in the local environment, the vehicle needs to know its global position on Earth in order to be able to determine a path towards the user's destination.

The default geolocalization method is satellite navigation, which provides a general reference frame for where the vehicle is located on the planet. Different Global Navigation Satellite Systems (GNSS) such as the American GPS, the Russian GLONASS, the European Galileo or the Chinese Beidou can provide positioning information with horizontal and vertical resolutions of a few meters.

While GPS guarantees a global signal user range error (URE) of less than 7.8 m, its signal's actual average range error has been less than 0.71 m. The real accuracy for a user, however, depends on local factors such as signal blockage, atmospheric conditions, and the quality of the receiver that's used.[46] Galileo, once fully operational, could deliver a < 1 m URE.[47] Higher accuracy can be achieved using multi-constellation, where the receiver leverages signals from multiple GNSS systems. Furthermore, accuracy can be brought down to ~1 cm levels using additional technologies that augment the GNSS system.

To identify the position of the car, all satellite navigation systems rely on the time of flight of a signal between the receiver and a set of satellites. GNSS receivers triangulate their position using their calculated distance from at least four satellites.[48] By continuously sensing, the path of the vehicle is revealed. The heading of the vehicle can be determined using two GNSS antennas, by using dedicated onboard sensors such as a compass, or it can be calculated based on input from vision sensors.[49]
Earth in order to be able to determine or it can be calculated based on input Mobileye, 2017 [55]
a path towards the user’s destination. from vision sensors.[49]

The default geolocalization method While accurate, GNSS systems are


is satellite navigation, which provides also affected by environmental fac-
a general reference frame for where tors such as cloud cover and signal
the vehicle is located on the planet. reflection. In addition, signals can be
Different Global Navigation Satellite blocked by man-made objects such as
Systems (GNSS) such as the American tunnels or large structures. In some
GPS, the Russian GLONASS, the Euro- countries or regions, the signal might
pean Galileo or the Chinese Beidou also be too weak to accurately geolo-
can provide positioning information cate the vehicle.
with horizontal and vertical resolu-
tions of a few meters. To avoid geolocalization issues, an
Inertial Measurement Unit (IMU) is
While GPS guarantees a global signal integrated with the system.[50,51] By
user range error (URE) of less than 7.8 using gyroscopes and accelerometers,
m, its signal’s actual average range such a unit can extrapolate the data
error has been less than 0.71 m. The available to estimate the new loca-
real accuracy for a user however, de- tion of the vehicle when GNSS data is
pends on local factors such as signal unavailable.
blockage, atmospheric conditions, and
Maps

Today, map services such as Google Maps are widely used for navigation. However, autonomous vehicles will likely need a new class of high definition (HD) maps that represent the world at up to two orders of magnitude more detail. With an accuracy of a decimeter or less, HD maps increase the spatial and contextual awareness of autonomous vehicles and provide a source of redundancy for their sensors.

"If we want to have autonomous cars everywhere, we have to have digital maps everywhere."

Amnon Shashua,
Chief Technology Officer at Mobileye, 2017 [55]

[Figure: A 3D HD map covering an intersection. Image: Here]

By triangulating the distance from known objects in a HD map, the precise localization of a vehicle can be determined.
Another benefit is that the detailed information a high definition map contains could narrow down the information that a vehicle's perception system needs to acquire, and enable the sensors and software to dedicate more effort towards moving objects.[53]

HD maps can represent lanes, geometry, traffic signs, the road surface, and the location of objects like trees. The information in such a map is represented in layers, with generally at least one of the layers containing 3D geometric information of the world in high detail to enable precise calculations.

Challenges lie in the large efforts to generate high definition maps and keep them up to date, as well as in the large amount of data storage and bandwidth it takes to store and transfer these maps.[54]

Most in the industry express HD maps to be a necessity for high levels of autonomy, in any case for the near future, as they have to make up for the limited abilities of AI. However, some disagree or take a different approach. According to Elon Musk, Tesla "briefly barked up the tree of high precision lane line [maps], but decided it wasn't a good idea."[56] In 2015 Apple, for its part, patented an autonomous navigation system that lets a vehicle navigate without referring to external data sources. The system in the patent leverages AI capabilities and vehicle sensors instead.[57]

As another example, London based startup Wayve only uses standard sat-nav and cameras. They aim to achieve full autonomy by using imitation learning algorithms to copy the behavior of expert human drivers, and consequently using reinforcement learning to learn from each intervention of their human safety driver while training the model in autonomous mode.[58]

Researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) also took a 'map-less' approach and developed a system that uses LIDAR sensors for all aspects of navigation, only relying on GPS for a rough location estimate.[59–61]

"The need for dense 3-D maps limits the places where self-driving cars can operate."

Daniela Rus,
director of MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), 2018

Thinking & Learning

Based on the raw data captured by the AV's sensor suite and the pre-existing maps it has access to, the automated driving system needs to construct and update a map of its environment while keeping track of its location in it. Simultaneous localization and mapping (SLAM) algorithms let the vehicle achieve just that. Once its location on its map is known, the system can start planning which path to take to get from one point to another.

SLAM and Sensor Fusion

SLAM is a complex process because a map is needed for localization and a good position estimate is needed for mapping. Though long considered a fundamental chicken-or-egg problem for robots to become autonomous, breakthrough research in the mid-1980s and 90s solved SLAM on a conceptual and theoretical level. Since then, a variety of SLAM approaches have been developed, the majority of which use probabilistic concepts.[62,63]

In order to perform SLAM more accurately, sensor fusion comes into play. Sensor fusion is the process of combining data from multiple sensors and databases to achieve improved information. It is a multi-level process that deals with the association, correlation, and combination of data, and yields less expensive, higher quality, or more relevant information than when using a single data source alone.[64]

[Figure: The complex computation and decision making environment of an autonomous vehicle.[65] Sensing & data input (cameras incl. thermal cameras, RADAR, LIDAR, ultrasound sensors, IMU, GNSS, map data, vehicle-to-vehicle communication, vehicle-to-infrastructure communication) feeds computation & decision making (simultaneous localization and mapping, planning), which drives act & control of the vehicle (steering, accelerating, braking, signalling). Image: Wevolver]
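As a minimal illustration (ours, not from the report) of the probabilistic flavour of such fusion, here is a one-dimensional Kalman-style update that merges two noisy position estimates — say, a GNSS fix and a dead-reckoned estimate — weighting each by its uncertainty:

```python
# Minimum-variance combination of two independent scalar estimates.
def fuse(estimate_a, variance_a, estimate_b, variance_b):
    """Return the fused estimate and its (reduced) variance."""
    gain = variance_a / (variance_a + variance_b)   # trust B more if A is noisy
    fused = estimate_a + gain * (estimate_b - estimate_a)
    fused_variance = (1.0 - gain) * variance_a
    return fused, fused_variance

# GNSS says 105.0 m (variance 4.0); odometry says 102.0 m (variance 1.0).
# The fused estimate lands closer to the more certain source:
print(fuse(105.0, 4.0, 102.0, 1.0))  # (102.6, 0.8)
```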

For all the processing and decision making required to go from sensor data to motion, in general two different AI approaches are used[66]:

1. Sequentially, where the driving process is decomposed into components of a hierarchical pipeline. Each step (sensing, localization and mapping, path planning, motion control) is handled by a specific software element, with each component of the pipeline feeding data to the next one, or
2. An End-to-End solution based on deep learning that takes care of all these functions.

The question of which approach is best for AVs is an area of ongoing debate. The traditional, and most common, approach consists of decomposing the problem of autonomous driving into a number of sub-problems and solving each one sequentially with a dedicated machine learning technique from computer vision, sensor fusion, localization, control theory, and path planning.[67]

End-to-End (e2e) learning increasingly gets interest as a potential solution to the challenges of the complex AI systems for autonomous vehicles. End-to-end learning applies iterative learning to a complex system as a whole, and has been popularized in the context of deep learning. An End-to-End approach attempts to create an autonomous driving system with a single, comprehensive software component that directly maps sensor inputs to driving actions. Because of breakthroughs in deep learning, the capabilities of e2e systems have increased such that they are now considered a viable option. These systems can be created with one or multiple different types of machine learning methods, such as Convolutional Neural Networks or Reinforcement Learning, which we will elaborate on later in this report.[67,68]

[Figure: Two main approaches to the AI architecture of autonomous vehicles: 1) a sequential perception-planning-action pipeline (Perception & Localization + High-Level Path Planning + Behavior Arbitration + Motion Controllers (low-level path planning) = Autonomy); 2) an End2End system (End2End Learning = Autonomy).[66] Image: Wevolver]

First, we'll review how the data from the sensors is processed to reach a decision regarding the robotic vehicle's motion. Depending on the different sensors used onboard the vehicle, different software schemes can be used to extract useful information from the sensor signals.

There are several algorithms that can be used to identify objects in an image. The simplest approach is edge detection, where changes in the intensity of light or color in different pixels are assessed.[69] One would expect pixels that belong to the same object to have similar light properties; hence looking at changes in the light intensity can help separate objects or detect where one object turns into the next. The problem with this approach is that in low light intensity (say at night) the algorithm cannot perform well, since it relies on differences in light intensity. In addition, as this analysis has to be done on each shot and on multiple pixels, there is a high computational cost.
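A small sketch (ours, not from the report) of the edge-detection idea just described, using standard Sobel gradient filters over a synthetic image:

```python
# Mark pixels where image intensity changes sharply. Requires numpy.
import numpy as np

def sobel_edges(gray: np.ndarray, threshold: float = 1.0) -> np.ndarray:
    """Return a boolean map of pixels whose intensity gradient is large."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = gray.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    # Slide the 3x3 kernels over the image (no border padding, for brevity).
    for i in range(h - 2):
        for j in range(w - 2):
            patch = gray[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(kx * patch)
            gy[i, j] = np.sum(ky * patch)
    return np.hypot(gx, gy) > threshold  # strong gradient -> likely an edge

# A dark square on a bright background: edges appear only on its border.
img = np.ones((8, 8))
img[2:6, 2:6] = 0.0
print(sobel_edges(img).astype(int))
```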
LIDAR data can be used to compute the movement of the vehicle with the same principle. By comparing two point clouds taken at consecutive instants, some objects will have moved closer to or further from the sensor. A software technique called iterative closest point iteratively revises the transformation between the two point clouds, which enables calculating the translation and rotation the vehicle underwent.
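A compact sketch (ours, not from the report) of one iteration of that idea: match each point to its nearest neighbour in the previous scan, then recover the rigid rotation and translation via the standard SVD (Kabsch) step. Real ICP repeats this until the transform converges:

```python
# One iterative-closest-point step on 2-D point clouds. Requires numpy.
import numpy as np

def icp_step(prev_scan: np.ndarray, curr_scan: np.ndarray):
    """One ICP iteration on point clouds of shape (N, 2); returns R, t."""
    # 1. Correspondences: nearest previous-scan point for each current point.
    dists = np.linalg.norm(curr_scan[:, None, :] - prev_scan[None, :, :], axis=2)
    matched = prev_scan[np.argmin(dists, axis=1)]
    # 2. Best rigid transform between the matched sets, via SVD.
    mu_c, mu_m = curr_scan.mean(axis=0), matched.mean(axis=0)
    H = (curr_scan - mu_c).T @ (matched - mu_m)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_m - R @ mu_c
    return R, t                        # the vehicle's rotation and translation

# The same scan, shifted: the car moved +0.2 m in x, so static world points
# appear shifted by (-0.2, 0) in the new sensor frame.
prev = np.array([[0., 0.], [1., 0.], [0., 1.], [2., 2.]])
curr = prev - np.array([0.2, 0.0])
R, t = icp_step(prev, curr)
print(np.round(R, 3), np.round(t, 3))  # identity rotation, t ≈ [0.2, 0]
```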
While useful, the preceding approaches consume much computing time and cannot easily be scaled to the case of a self-driving vehicle operating in a continuously changing environment. That is where machine learning comes into play, relying on computer algorithms that have already learned to perform a task from existing data.
Machine Learning Methods

Different types of machine learning algorithms are currently being used for different applications in autonomous vehicles. In essence, machine learning maps a set of inputs to a set of outputs, based on a set of training data provided. Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN) and Deep Reinforcement Learning (DRL) are the most common deep learning methodologies applied to autonomous driving.[66]

CNNs are mainly used to process images and spatial information to extract features of interest and identify objects in the environment. These neural networks are made of a convolution layer: a collection of filters that tries to distinguish elements of an image or input data in order to label them. The output of this convolution layer is fed into an algorithm that combines them to predict the best description of an image. The final software component is usually called an object classifier, as it can categorize an object in the image, for example a street sign or another car.[69–71]

RNNs are powerful tools when working with temporal information such as videos. In these networks the outputs from the previous steps are fed into the network as input, allowing information and knowledge to persist in the network and be contextualized.[72–74]

DRL combines Deep Learning (DL) and Reinforcement Learning. DRL methods let software-defined 'agents' learn the best possible actions to achieve their goals in a virtual environment using a reward function. These goal-oriented algorithms learn how to attain an objective, or how to maximize along a specific dimension over many steps. While promising, a challenge for DRL is the design of the correct reward function for driving a vehicle. Deep Reinforcement Learning is considered to be still in an early stage regarding application in autonomous vehicles.[75,76]

These methods don't necessarily sit in isolation. For example, companies like Tesla rely on hybrid forms, which try to use multiple methods together to increase accuracy and reduce computational demands.[77,78]

Training networks on several tasks at once is a common practice in deep learning, often called multi-task training or auxiliary task training. This is to avoid overfitting, a common issue with neural networks. When a machine learning algorithm is trained for a particular task, it can become so focused on imitating the data it is trained on that its output becomes unrealistic when an interpolation or extrapolation is attempted. By training the machine learning algorithm on multiple tasks, the core of the network will specialize in finding general features that are useful for all purposes, instead of specializing in only one task. This can make the outputs more realistic and useful for applications.

[Figure: Algorithms turn input from sensors into object classifications and a map of the environment. Image: Wayve]
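As an illustration of the structure described above — convolution filters feeding an object classifier — here is a minimal sketch (ours, not from the report) using the PyTorch library; the layer sizes and the three example classes are assumptions:

```python
# A tiny CNN: convolution layers extract features, a linear head classifies.
import torch
import torch.nn as nn

class TinyObjectClassifier(nn.Module):
    def __init__(self, num_classes: int = 3):  # e.g. car, street sign, pedestrian
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # filters over RGB input
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # higher-level features
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)          # convolution layers label image elements
        x = x.flatten(start_dim=1)
        return self.classifier(x)     # scores for each object class

# One 64x64 RGB camera crop in, three class scores out:
scores = TinyObjectClassifier()(torch.randn(1, 3, 64, 64))
print(scores.shape)  # torch.Size([1, 3])
```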

Gathering Data

In order for these algorithms to be used, they need to be trained on data sets that represent realistic scenarios. With any machine learning process, a part of the data set is used for training, and another part for validation and testing. As such, a great amount of data is annotated by autonomous vehicle companies to achieve this goal.[77] Many datasets, with semantic segmentation of street objects, sign classification, pedestrian detection and depth prediction, have been made openly available by researchers and companies including Aptiv, Lyft, Waymo, and Baidu. This has significantly helped to push the capabilities of the machine learning algorithms forward.[79–81]

One way to gather data is by using a prototype car. These cars are driven by a driver. The perception sensors onboard are used to gather information about the environment. At the same time, an on-board computer will record sensor readings coming from the pedals, the steering wheel, and all other information that can describe how the driver acts. Due to the large amount of data that needs to be gathered and labelled by humans, this is a costly process. According to Andrej Karpathy, Director of AI at Tesla, most of the efforts in his group are dedicated to getting better and better data.[77]

Alternatively, simulators may be used. "Current physical testing isn't enough; therefore, virtual testing will be required," says Jamie Smith, Director of Global Automotive Strategy at National Instruments.[82] By building realistic simulators, software companies can create thousands of virtual scenarios. This brings the cost of data acquisition down but introduces the problem of realism: these virtual scenarios are defined by humans and are less random than what a real vehicle goes through. There is growing research in this area, called sim-to-real transfer, that studies methods to transfer the knowledge gathered in simulation to the real world.[83]

Using all the data from the sensors and these algorithms, an autonomous vehicle can detect objects surrounding it. Next, it needs to find a path to follow.

"We have quite a good simulation, too, but it just does not capture the long tail of weird things that happen in the real world."

Elon Musk,
April 2019 [84]

"At Waymo, we've driven more than 10 million miles in the real world, and over 10 billion miles in simulation."

Waymo CTO Dmitri Dolgov,
July 2019 [85]

[Figure: Simulators are used to explore thousands of variable scenarios. Image: Autoware.AI]
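As a small illustration (ours, not from the report) of the train/validation/test split mentioned at the start of this section; the 80/10/10 proportions are an assumption for the example:

```python
# Shuffle labelled samples and reserve held-out validation and test sets,
# so the model is always evaluated on drives it never trained on.
import random

def split_dataset(samples, train=0.8, val=0.1, seed=42):
    """Split a list of labelled samples into train/val/test lists."""
    samples = samples[:]
    random.Random(seed).shuffle(samples)
    n_train = int(len(samples) * train)
    n_val = int(len(samples) * val)
    return (samples[:n_train],                 # used to fit the model
            samples[n_train:n_train + n_val],  # used to tune it
            samples[n_train + n_val:])         # touched only once, at the end

train_set, val_set, test_set = split_dataset(list(range(1000)))
print(len(train_set), len(val_set), len(test_set))  # 800 100 100
```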

Path Planning

With the vehicle knowing the objects in its environment and its location, the large scale path of the vehicle can be determined by using a Voronoi diagram (maximizing the distance between vehicle and objects), an occupancy grid algorithm, or a driving corridors algorithm.[86] However, these traditional approaches are not enough for a vehicle that is interacting with other moving objects around it, and their output needs to be fine-tuned.

Some autonomous vehicles rely on machine learning algorithms to not only perceive their environment but also to act on that data to control the car. Path planning can be taught to a CNN through imitation learning, in which the CNN tries to imitate the behavior of a driver. In more advanced algorithms, DRL is used, where a reward is provided to the autonomous system for driving in an acceptable manner. Usually, these methods are hybridized with more classical methods of motion planning and trajectory optimization to make sure that the paths are robust. In addition, manufacturers can include additional objectives, such as reducing fuel use, for the model to take into account as it tries to identify optimal paths.[87]

Training neural networks, and inference during operations of the vehicle, require enormous computing power. Until recently, most machine learning tasks were executed on cloud-based infrastructure with excessive computing power and cooling. With autonomous vehicles, that is no longer possible, as the vehicle needs to be able to simultaneously react to new data. As such, part of the processing required to operate the vehicle needs to take place onboard, while model refinements could be done on the cloud. Recent advances in machine learning are focusing on how the huge amount of data generated by the sensors onboard AVs can be efficiently processed to reduce the computational cost, using concepts such as attention [88] or core-sets.[89] In addition, advances in chip manufacturing and miniaturization are increasing the computing capacity that can be mounted on an autonomous vehicle. With advances in networking protocols, cars might be able to rely on low-latency network-based processing of data to aid them in their autonomous operation.

"In most cases, if you look at what went wrong during a disengagement [the moment when the AV needs human intervention - note by editor], the role of hardware failure is 0.0 percent. Most of the time, it's a software failure, that is, software failing to predict what the vehicles are gonna be doing or what the pedestrians are gonna be doing."

Anthony Levandowski,
autonomous vehicle technology pioneer, April 2019 [90]

Autonomous vehicles deploy algorithms to plan the vehi-


cle’s own path, as well as estimate the path of other moving
objects (in this case the system also estimates the path of
the 2 red squares that represent bicyclists). Image: Waymo
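To make the occupancy-grid idea above concrete, here is a minimal sketch, assuming a simplified 2D grid world: A* search, a standard heuristic-search method, finds a shortest collision-free path between two cells. Real planners add vehicle dynamics, continuous space, and moving obstacles on top of this kind of core:

```python
import heapq

# Minimal sketch of path planning over an occupancy grid: 1 marks an
# occupied cell (obstacle), 0 a free cell. A* with a Manhattan-distance
# heuristic returns a shortest collision-free path of grid cells.

GRID = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
]

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start)]            # (estimated total cost, cell)
    came_from = {start: None}
    cost_so_far = {start: 0}
    while frontier:
        _, current = heapq.heappop(frontier)
        if current == goal:            # reconstruct path back to start
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_cost = cost_so_far[current] + 1
                if new_cost < cost_so_far.get((nr, nc), float("inf")):
                    cost_so_far[(nr, nc)] = new_cost
                    came_from[(nr, nc)] = current
                    # Manhattan distance is an admissible heuristic here.
                    priority = new_cost + abs(goal[0] - nr) + abs(goal[1] - nc)
                    heapq.heappush(frontier, (priority, (nr, nc)))
    return None  # no path exists

print(astar(GRID, (0, 0), (4, 4)))
```

Maximizing clearance, as a Voronoi-based planner does, can be layered on top by penalizing cells that lie close to obstacles in the cost function.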

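The imitation-learning approach can be sketched just as compactly. Below is a minimal, hypothetical behavioral-cloning loop in PyTorch: a small CNN maps camera frames to a steering angle and is trained to minimize the difference with what a human driver actually did. The architecture and dummy data are illustrative only, not any company’s production model:

```python
import torch
import torch.nn as nn

# Hypothetical sketch of imitation learning (behavioral cloning),
# assuming a dataset of recorded camera frames paired with the human
# driver's steering angles.

class SteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, 1),  # predicted steering angle
        )

    def forward(self, image):
        return self.head(self.features(image))

model = SteeringNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Dummy batch standing in for (camera frame, recorded steering) pairs.
frames = torch.randn(8, 3, 66, 200)   # 8 RGB camera frames
human_steering = torch.randn(8, 1)    # what the driver actually did

for step in range(10):                # training loop sketch
    predicted = model(frames)
    loss = loss_fn(predicted, human_steering)  # imitate the human
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final imitation loss: {loss.item():.4f}")
```

A DRL variant replaces the recorded steering targets with a reward signal for acceptable driving, at the cost of far more (usually simulated) driving experience.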
Acting

How does a vehicle act based upon all this information? In current cars driven by humans, the vehicle’s actions such as steering, braking, or signaling are generally controlled by the driver. A mechanical signal from the driver is translated by an electronic control unit (ECU) into actuation commands that are executed by electric or hydraulic actuators on board the car. A small number of current vehicle models contain Drive-by-Wire systems, where mechanical systems like the steering wheel column are replaced by an electronic system.

In a (semi-)autonomous car, such functionality is replaced by drive control software communicating directly with an ECU. This can provide opportunities to change the structure of the vehicle and to reduce the number of components, especially those added specifically to translate mechanical signals from the driver into electric signals for the ECUs.

Today’s vehicles contain multiple ECUs, from around 15-20 in standard cars to around a hundred in high-end vehicles.[91] An ECU is a simple computing unit with its own microcontroller and memory, and it uses those to process the input data it receives into output commands for the subsystem it controls, for example to shift an automatic gearbox.

Generally speaking, ECUs can be responsible for operations that control the vehicle, for safety features, or for running infotainment and interior applications.[92] Most ECUs support a single application like electronic power steering, and locally run algorithms and process sensor data.[93]

Architectures: Distributed versus Centralized

Increasing demands and complexity challenge engineers to design the right electronic architecture for the system, which needs to perform sensor fusion and simultaneously distribute decisions in a synchronized way to the lower level subsystems that act on the instructions.[94,95]

In theory, at one extreme of the possible setups one can choose a completely distributed architecture, where every sensing unit processes its raw data and communicates with the other nodes in the network. At the other end of the spectrum we have a centralized architecture, where all Remote Control Units (RCUs) are directly connected to a central control point that collects all information and performs the sensor fusion process.[96,97]

In a centralized architecture the measurements from different sensors are independent quantities, not affected by other nodes. The data is not modified or filtered at the edge nodes of the system, providing the maximum possible information for sensor fusion, and there is low latency. The challenge is that huge amounts of data need to be transported to the central unit and processed there. That not only requires a powerful central computer, but also a heavy wire harness with high bandwidth. Today’s vehicles contain over a kilometer of wires, weighing tens of kilos.[98]

A distributed architecture can be achieved with a lighter electrical system but is more complex. Although the demands on bandwidth and centralized processing are reduced greatly in such an architecture, it introduces latency between the sensing and actuation phases and increases the challenges of validating data.

In the middle of this spectrum are hybrid solutions that combine a central unit working at higher abstraction levels with domains that perform dedicated sensor processing and/or execute decision making algorithms. Such domains can be based on location within the vehicle, e.g. domains for the front and back of the car, on the type of function they control, or on the type of sensors they process (e.g. cameras).[93]

CAD render of the wire harness of a Bentley Bentayga. This is a Level 1 automated vehicle with advanced driver assistance systems: including adaptive cruise control, automatic braking in cities, pedestrian detection, night vision (which recognizes people and animals), traffic sign recognition and a system that changes speed in line with local speed limits. Image: Bentley
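A minimal sketch can make the centralized case concrete. Assuming two sensors report the raw range to the same object, a central node can fuse the unfiltered readings with inverse-variance weighting, the building block of Kalman-style sensor fusion; the sensor names and values below are invented for illustration:

```python
# Minimal sketch of centralized sensor fusion: two sensors (say, RADAR
# and LIDAR) report the raw distance to the same object with known
# measurement noise, and the central unit combines the unfiltered
# readings with inverse-variance weighting.

def fuse(measurement_a, var_a, measurement_b, var_b):
    """Inverse-variance weighted average of two noisy measurements."""
    weight_a = 1.0 / var_a
    weight_b = 1.0 / var_b
    fused = (weight_a * measurement_a + weight_b * measurement_b) / (weight_a + weight_b)
    fused_var = 1.0 / (weight_a + weight_b)
    return fused, fused_var

radar_dist, radar_var = 25.4, 0.9   # meters; assumed noisier range
lidar_dist, lidar_var = 24.8, 0.1   # meters; assumed tighter range

estimate, variance = fuse(radar_dist, radar_var, lidar_dist, lidar_var)
print(f"fused distance: {estimate:.2f} m (variance {variance:.3f})")
# The fused variance is lower than either sensor's alone, which is why
# feeding unfiltered edge data to one central node maximizes the
# information available for fusion.
```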

Power, Heat, Weight, and Size challenges

Next to the increased complexity of the system, automation also poses challenges for the power consumption, thermal footprint, weight, and size of the vehicle components.

Regardless of how much the architecture is distributed or centralized, the power requirements of the autonomous system are significant. The prime driver for this is the computational requirement, which can easily be up to 100 times higher for fully autonomous vehicles than for the most advanced vehicles in production today.[99]

This power-hungriness of autonomous vehicles increases the demands on the performance of the battery and the capabilities of semiconductor components in the system. For fully electric vehicles, the driving range is negatively impacted by this power demand. Therefore, some companies like Waymo and Ford have opted to focus on hybrid vehicles, while Uber uses a fleet of full gasoline SUVs. However, experts point to full electric ultimately being the powertrain of choice because of the inefficiency of combustion engines in generating electric power used for onboard computing.[98,100]

“To put such a system into a combustion-engined car doesn’t make any sense, because the fuel consumption will go up tremendously.”
Wilko Stark, Vice President of Strategy, Mercedes-Benz, 2018 [101]

The increased processing demand and higher power throughput heat up the system. To keep electronic components performing properly and reliably, they must be kept within certain temperature ranges, regardless of the vehicle’s external conditions. Cooling systems, especially those that are liquid based, can further add to the weight and size of the vehicle.

Extra components, extra wiring, and thermal management systems put pressure on reducing the weight, size, and thermal footprint of every part of the vehicle. From reducing the weight of large components like LIDARs, to tiny ones, like the semiconductor components that make up the electronic circuitry, there is a huge incentive for the suppliers of automotive components to change their products accordingly.

Semiconductor companies are creating components with smaller footprints, improved thermal performance, and lower interference, all while actually increasing reliability. Beyond evolving the various silicon components such as MOSFETs, bipolar transistors, diodes, and integrated circuits, the industry also looks at using novel materials. Components based on Gallium Nitride (GaN) are seen as having a high impact on future electronics. GaN would enable smaller devices for a given on-resistance and breakdown voltage compared to silicon because it can conduct electrons much more effectively.[102–104]

Executing all the algorithms and processes for autonomy requires significant computing and thus powerful processors. A fully autonomous vehicle will likely contain more lines of code than any software platform or operating system that has been created so far. GPU-accelerated processing is currently the industry standard, with Nvidia being the market leader. However, companies are increasingly pursuing different solutions; much of Nvidia’s competition is focusing their chip design on Tensor Processing Units (TPU), which accelerate the tensor operations that are the core workload of deep learning algorithms. GPUs, on the other hand, were developed for graphics processing and thus prevent deep learning algorithms from harnessing the full power of the chip.[105]

As seen, both the hardware and the software of vehicles will change significantly as vehicles increase their level of automation. Next to that, greater autonomy in vehicles will also impact how you as a user will interact with them.

A GPU processor based hardware platform for autonomous driving. Image: Nvidia
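A back-of-the-envelope calculation illustrates why this matters for electric vehicles in particular. All values below are assumptions chosen for illustration, not measurements of any particular vehicle:

```python
# Back-of-the-envelope sketch of how onboard compute eats into EV range.
# Every number here is an illustrative assumption.

battery_kwh = 60.0   # assumed EV battery capacity
drive_kw = 15.0      # assumed average traction power at ~75 km/h
speed_kmh = 75.0
compute_kw = 2.0     # assumed draw of AV sensors + computers

range_base = battery_kwh / drive_kw * speed_kmh
range_av = battery_kwh / (drive_kw + compute_kw) * speed_kmh

print(f"range without AV load: {range_base:.0f} km")
print(f"range with AV load:    {range_av:.0f} km")
print(f"lost to computing:     {100 * (1 - range_av / range_base):.1f}%")
# Roughly 12% of range disappears into computing under these
# assumptions, which is why the power efficiency of processors and
# semiconductor components gets so much attention.
```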

User experience

On a daily basis we interact with vehicles in various roles, including as a driver, fellow traffic participant (car, bicycle, etc.), pedestrian, or as a passenger. The shift towards autonomy must take into account the full spectrum of these subtle interactions. How do these interactions evolve, and what role do the new sensors and software play? How will they reshape interactions with the vehicle?

To start, we can look at two major players in the field, Waymo and Tesla, who have taken different approaches towards user experience.

Tesla is working on evolving what a car can do as a product. The car, by itself, does not change much. Yet when your car can autonomously drive you anywhere, park itself, and be summoned back, your experience changes dramatically. Suddenly, all non-autonomous cars seem ridiculously outdated. Tesla’s strategy at the moment is to build a feature that will radically differentiate them in the market.

Waymo, on the other hand, is trying to answer a completely different question: do you really need to own a car? If your city has a pervasive service of autonomous taxis that can drive people around, why even bother owning your own?

Hence, Waymo is trying to build an infrastructure of cars as a service. The user experience of their autonomous vehicles is completely different: you summon a car like you would summon an Uber. The only difference: no one is at the wheel. The car will safely drop you wherever you need to go, and you do not need to worry about anything after you reach your destination.

Due to the novelty of the technology, trust-building measures are highly important during the initial years of autonomous driving. People have trouble trusting machines and are quick to lose confidence in them. In a 2016 study, people were found to forgive a human advisor but stop trusting a computer advisor for the same, single mistake.[106]

Whether the car is the product or part of a service, to make autonomous vehicles work, users must feel good inside and outside of them. Setting the right expectations, building trust with the user, and communicating clearly with them as needed are the cornerstones of the design process.[107] We’ll review what the experience of riding in a self-driving car looks like now, at the beginning of the decade.

“Trust building is the major problem at the moment.”
Zeljko Medenica, Principal Engineer and Human Machine Interface (HMI) Team Lead at the US R&D Center of Changan, January 2020

Inside a Tesla with its Autopilot feature.

Inside the vehicle

While driving a Tesla with AutoPilot, an L2 autonomous vehicle, the user must be behind the wheel and needs to be aware of what’s happening. The car’s display shows the vehicles and what it sees on the road, allowing the user to assess the car’s ability to perceive its environment correctly. On a highway, the experience is smooth and reportedly 9 times safer than a human driver.[108]

Waymo has achieved Level 4, meaning the vehicle can come to a safe stop without a human driver taking over, although generally a safety driver is still involved.[109] Inside a Waymo, you feel like you are riding a taxi. The main design problem for this, as stated by Ryan Powell, UX Designer at Waymo, is reproducing the vast array of nonverbal communication that happens between the driver and the passenger.[110]

Even from the backseat, watching the behavior of the driver can tell you a lot about what is going to happen. The gaze of the driver directly shows what is drawing their attention, and the passenger can feel safe by seeing that they saw the woman crossing the road or the oncoming car at the intersection. This sensation is lost without the driver, and the Waymo passenger is left to passively interact with the vehicle through a screen in the backseat.

While the vehicle has a 360-degree, multi-layered view of its surroundings that is obtained with an array of various sensors, on the screen the user only sees a very minimal depiction of the surrounding cars, buildings, roads, and pedestrians; just enough to understand that the car is acknowledging their presence and planning accordingly.[111]

At an intersection, a driver usually looks to the left and right to see if there are oncoming cars. To show the user that the autonomous driving system is taking this into account, the map rotates to the left and right at intersections, thus showing the user that the vehicle is paying attention to traffic. This is quite an interesting design artifact: the system always takes into account the whole range of data and does not need to “look right and left,” but that map movement emulates the behavior of a human driver, helping the passenger to feel safe. The screen is hence both the “face” of the car and a minimal description of its environment, used, as said, to replicate the nonverbal communication that would happen with a human driver.

Until humans are familiar with interacting with autonomous vehicles, the experience of riding one needs to emulate what we are used to: a driver paying attention to the surroundings. Increasing automation impacts both User Experience and User Acceptance: research indicates that when levels of automation increase beyond level 1 ADAS systems, the perceived control and experience of fun decrease for users, and users can feel a loss of competence in driving.[112,113]

As trust in the technology increases, the user interface will probably simplify, as individuals will no longer care to know every single step that the vehicle is planning to do.

Preventing mode confusion is one element that contributes to growing trust in these systems. Mode confusion arises when a driver is unsure about the current state of the vehicle, for example whether autonomous driving is active or not. Addressing this issue becomes more important as the levels of autonomy increase (L2+). The simplest way to do this is to make sure that the user interface for the autonomous mode is significantly different from the one used in manual mode.

Inside a Waymo self-driving taxi. Image: Waymo

Interface of the Waymo self-driving taxi at an intersection. Image: Waymo

During the initial phases of autonomous driving implementation, autonomy will likely be restricted to predefined operational design domains. During a domain change, drivers may need to engage and control the vehicle. This transfer of control is another aspect that the user interface needs to facilitate. Bringing a driver back into the loop can be challenging, especially if the driver was disengaged for a long period of time. The transfer of control can be even more complex if the situation on the road is such that the driver needs to take over control immediately (this is the case with SAE L3 autonomous vehicles).

The question that simultaneously arises is what the vehicle should do if the driver does not take over control when requested. One approach that most automobile manufacturers are currently taking is gradually stopping the vehicle in its lane. However, in some situations, such as on a busy highway, bridge, or in a tunnel, this kind of behavior may not be appropriate. A different approach would be to keep some of the automation active in order to keep the driver safe until he/she takes over, or until the vehicle finds a more convenient place to pull over.

In the automated modes of driving, it is important that the logic of the system matches the way the user interprets how the system works, or, in the case where it doesn’t match expectations, that the logic is communicated to the user.

When technology evolves to the levels of autonomy that do not require any human driving capabilities, the user experience will undergo the most dramatic change.

Naturally, at levels 4 and 5 of automation steering wheels, pedals, and gear controls can be removed, shifting towards a system where the vehicle is controlled with a map interface on a screen, as is done on other robotic systems. Furthermore, the consoles designed in current cars aim to reduce distracted driving, a requirement that no longer holds at high and full autonomy.[114]

Removing the steering wheel and pedals and changing the role of the console leaves 2 functions for the interface of a fully autonomous vehicle[115]:

• Clear and adequate communication with the passengers
• Providing some form of manual control

On top of that, there are four challenges to address when designing interfaces for autonomous vehicles[116]:

1. Assuring safety
2. Transforming vehicles into places for productivity and play
3. Taking advantage of new mobility options (with autonomous cars moving from something we own to a service)
4. Preserving user privacy and data security

Finally, highly and fully automated vehicles could provide mobility to the elderly and people with disabilities. The opportunity for these previously excluded users can only be seized when the User Experience design takes their role and abilities into account.

The external experience

While customers want a great experience while riding an AV, we must not forget about all the other drivers, pedestrians, and infrastructure that the vehicle interacts with. Driving is a collective dance, defined by rules but also shaped by nonverbal communication and intuition.

The way a vehicle moves suggests what it is about to do, and human drivers expect a car to behave based on their experiences with other drivers on the road. Hence, from the perspective of human-machine interaction, it is fundamental to shape the behavior of a self-driving vehicle such that its intentions are clear. The need for this is made poignant by the numerous cases of a human driver rear-ending an autonomous vehicle because it behaved unexpectedly.[117]

In general, we have previously relied on simple signals (like turn indicators) and human-to-human interaction. Some of these ‘human’ habits apply to autonomous cars, such as signaling, but others, such as human-to-human interaction, need to be emulated using another method. In general, it’s easier for robots to interact with other robots than with humans, and vice versa.

Drive.ai’s vehicles featured displays to communicate with other road participants. The orange and blue color scheme of the vehicle was designed to draw attention. Image: Drive.ai

For example, when pedestrians cross the road and see a car approaching from a distance, they will safely assume that it will brake, especially if they can see the driver looking directly at them. This makes sure they know that they have the driver’s attention. How can we emulate this aspect in driverless cars?

Melissa Cefkin, a human-machine interaction researcher at Nissan, recently described how they are developing intent indicators on the outside of the car, like screens able to display words or symbols. These allow the car to clearly and simply suggest what it is about to do: for example, it can communicate to pedestrians that it has seen them, and that they can cross the road safely.[110]

Ford, together with the Virginia Tech Transportation Institute, has experimented with using external indicator lights to standardize signaling to other road participants. Ford placed a LED light bar on top of the windshield (where a pedestrian or bicyclist would look to make eye contact with a driver). If the vehicle was yielding, the lights would slowly move side-to-side; acceleration was communicated with rapid blinking; and when steadily driving, the light would shine completely solid.[118]

Drive.ai was another company that paid attention to teaching autonomous vehicles to communicate. Founded in 2015 by masters and PhD students in the Artificial Intelligence Lab at Stanford University, Drive.ai was acquired by Apple in summer 2019. Their vehicles featured LED displays around the vehicle that communicated its state and intentions with messages, icons, and animations. Initially their vehicles contained 1 large display on the roof of the vehicle, but the company learned that’s not where people look for clues. Other lessons learned from their work with user focus groups include the importance of the right phrasing of a message. For example, just “Waiting” didn’t communicate the vehicle’s state clearly enough and needed to be replaced with “Waiting for You.”[29,119,120]

“We want to be cognizant of the context in which you see the car, and be responsive to it.”
Bijit Halder, Product and Design lead, Drive.ai, 2018 [119]

When studying fleets of self-driving cars moving in the city, there is another area that must be analyzed: machine-machine interaction. It is crucial to understand if an AI trained to predict human behavior can also safely predict the intent of another AI. Enabling vehicles to connect and communicate can have a significant impact on their autonomous capabilities.

Communication & Connectivity

Enabling vehicles to share information with other road participants as well as traffic infrastructure increases the amount and type of available information for autonomous vehicles to act upon. Vice versa, it can provide data for better traffic management. Connectivity also enables autonomous vehicles to interact with non-autonomous traffic and pedestrians to increase safety.[12,121–123]

Furthermore, AVs will need to connect to the cloud to update their software and maps, and share back information to improve the collectively used maps and software of their manufacturer.

The digitalization of transport is expected to impact individual vehicles, public transport, traffic management, and emergency services. The communication needed can be summed up under the umbrella term of Vehicle-to-Everything (V2X) communications.[124] This term encompasses a larger set of specific communication structures, such as Vehicle-to-Vehicle (V2V), Vehicle-to-Infrastructure (V2I), Vehicle-to-Network (V2N), and Vehicle-to-Person (V2P).

A way for inter-vehicle coordination to impact the driving environment is through cooperative maneuvering. One application getting much attention is ‘platooning.’ When autonomous / semi-autonomous vehicles platoon, they move in a train-like manner, keeping only small distances between vehicles, to reduce fuel consumption and achieve efficient transport. Especially for freight trucks this is a highly investigated area, as it could save up to 16% of fuel.[125]

The concept of Vehicle-to-Everything (V2X) communication covers various types of entities that a connected vehicle communicates with (V2V, V2N, V2I, V2P). Image: Wevolver
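As a simplified illustration of the kind of payload a V2V safety service exchanges, the hypothetical sketch below shows a “sudden braking” broadcast like the demonstration described next. Real deployments use standardized message sets (e.g. SAE J2735 Basic Safety Messages) over DSRC or C-V2X radios, not ad-hoc JSON; the field names here are ours:

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical, simplified V2V safety broadcast. Real systems exchange
# standardized messages (e.g. SAE J2735 Basic Safety Messages) over a
# dedicated radio stack; this sketch only shows the idea.

@dataclass
class BrakeWarning:
    vehicle_id: str
    latitude: float
    longitude: float
    speed_ms: float          # current speed, m/s
    deceleration_ms2: float  # braking hardness, m/s^2
    timestamp: float

def broadcast(message: BrakeWarning) -> bytes:
    # Stand-in for handing the payload to a V2X radio.
    payload = json.dumps(asdict(message)).encode()
    print(f"broadcasting {len(payload)} bytes to nearby vehicles")
    return payload

def on_receive(payload: bytes):
    msg = BrakeWarning(**json.loads(payload))
    # A following vehicle reacts if the sender ahead brakes hard.
    if msg.deceleration_ms2 > 4.0:
        print(f"hard braking ahead by {msg.vehicle_id}: pre-charging brakes")

payload = broadcast(BrakeWarning("AV-042", 52.37, 4.90, 22.0, 6.5, time.time()))
on_receive(payload)
```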

Another example application of V2X was recently demonstrated by Fiat Chrysler Automobiles, Continental, and Qualcomm: V2V-equipped cars broadcasted a message to following vehicles in the case of sudden braking, to notify them in time of the potentially dangerous situation.[126]

The network enabling these features must be highly reliable, efficient, and capable of sustaining the data traffic load. V2X communication is predominantly supported by two networking standards, each with significantly different design principles[124,127]:

1. Dedicated short-range communication (DSRC), based on the IEEE 802.11p automobile-specific WiFi standard. DSRC uses channels of 10 MHz bandwidth in the 5.9 GHz band (5.850–5.925 GHz).[128]
2. Cellular V2X (C-V2X), standardized through the 3GPP release 15 (3GPP is a global cooperation of six independent committees that define specifications for cellular standards). The Cellular-V2X radio access technology can be split into the older LTE-based C-V2X and the newer 5G New Radio (5G-NR) based C-V2X, which is being standardized at the moment.[129]

DSRC and C-V2X both allow for direct communication between vehicles and other vehicles or devices, without network access, through an interface called PC5.[130] This interface is useful for basic safety services such as sudden braking warnings, or for traffic data collection.[131] C-V2X also provides another communication interface called Uu, which allows the vehicle to communicate directly with the cellular network, a feature that DSRC does not provide.

Both technologies are going through enhancements (802.11bd and 5G-NR V2X) to support the more advanced applications that require reliability, low latency, and high data throughput.[132]

Current fourth generation (LTE/4G) mobile networks are fast enough for gaming or streaming HD content, but lack the speed and resilience required to sustain autonomous vehicle network operations.[133] 5G brings three main capabilities to the table: greater data rate speed (25-50% faster than 4G LTE), lower latency (25-40% lower than 4G LTE), and the ability to serve more devices.[134]

In the case of V2N over a cellular connection, using the Uu interface, the requirements of a 5G network are[135]:

• Real data rates of 1 to 10 Gbit/s.
• 1 ms end-to-end latency.
• Ability to support 1000 times the bandwidth of today’s cell phones.
• Ability to support 10 to 100 times the number of devices.
• A 99.999% perceived availability and 100% perceived coverage.
• Lower power consumption.

5G does not necessarily bring all of these at the same time, but it gives developers the ability to choose the performance needed for specific services. In addition, 5G could offer network slicing (creating multiple logical networks, each dedicated to a particular application within the same hardware infrastructure) and cloud management techniques (edge computing) to manage data traffic and capacity on demand.[136]

Applications supporting fully autonomous vehicles could generate huge amounts of data every second. This has led semiconductor manufacturers such as Qualcomm and Intel to develop new application-specific integrated circuits. These combine large 5G bandwidth with innovative digital radio and antenna architectures, to change the autonomous vehicle into a mobile data center.[137,138]

At the same time, it may be noted that high data loads are not always needed. Choosing what the relevant and minimally required data is, and transferring it at the right time to the right receiver, can enable a lot of use cases to transfer less data.

“In an autonomous car, we have to factor in cameras, radar, sonar, GPS and LIDAR –components as essential to this new way of driving as pistons, rings and engine blocks. Cameras will generate 20–60 MB/s, radar upwards of 10 kB/s, sonar 10–100 kB/s, GPS will run at 50 kB/s, and LIDAR will range between 10–70 MB/s. Run those numbers, and each autonomous vehicle will be generating approximately 4,000 GB –or 4 terabytes –of data a day.”
Brian Krzanich, CEO of Intel, 2016 [39]
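As a rough sanity check of the quoted figures, the sketch below multiplies mid-range values of the per-sensor rates from the quote by an assumed sensor count and daily operating time; the counts and hours are our assumptions, not Intel’s:

```python
# Worked check of the per-sensor data rates quoted above. The rates are
# mid-range values from the quote; sensor counts and operating hours
# are illustrative assumptions.

MB = 1e6  # bytes

streams = {
    # name: (assumed number of sensors, bytes per second each)
    "camera": (4, 40 * MB),   # quote: 20-60 MB/s
    "radar":  (4, 100e3),     # quote: upwards of 10 kB/s
    "sonar":  (8, 50e3),      # quote: 10-100 kB/s
    "gps":    (1, 50e3),      # quote: 50 kB/s
    "lidar":  (1, 40 * MB),   # quote: 10-70 MB/s
}

total_bps = sum(count * rate for count, rate in streams.values())
hours_driven = 5.5            # assumed operating hours per day
total_per_day = total_bps * hours_driven * 3600

print(f"aggregate rate: {total_bps / MB:.0f} MB/s")
print(f"per day: {total_per_day / 1e12:.1f} TB over {hours_driven} h")
# With these assumptions the total lands at roughly 4 TB/day, matching
# the order of magnitude in the quote.
```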

DSRC or C-V2X

The question of whether DSRC or C-V2X is the best choice and which will prevail is the subject of strong debate. Performance and capabilities, deployment costs, and technology readiness level are among the considerations in this discussion. To make the two technologies co-exist in a geographic region would require overcoming the challenges of spectrum management and operational difficulties.[140]

DSRC is the older of the two technologies, and the current standard, 802.11p, was approved in 2009. In 1999, the U.S. government allocated a section of the 5.9 GHz band spectrum for automotive DSRC. During the Obama administration a rulemaking process was initiated to make DSRC required in all cars sold in 2023 and onwards, though this process stalled. In December 2019 the US Federal Communications Commission proposed splitting up the band in the 5.9 GHz spectrum that had been allocated to DSRC, and instead reserving big parts of it for commercial WiFi and C-V2X. According to the FCC, slow traction on DSRC prompted the changes.

The European Union also had been working towards enforcing DSRC as a standard, but recently most of its member states voted against DSRC and in favor of C-V2X.[141]

China moves singularly in the direction of 5G, cellular-based V2X. The country has plans to require C-V2X equipment in newly built cars from 2025 onwards. This stems from China’s existing ambitious investment in 5G connectivity, which renders C-V2X a choice that fits well with existing investments. In 2019, about 130,000 5G base stations were expected to become operational in China, with a projected 460 million 5G users by the end of 2025.[126]

Different automotive manufacturers are prioritizing different approaches for V2X. In 2017 Cadillac was one of the first companies to launch a production vehicle with V2X capabilities, and chose to incorporate DSRC. The new Golf model from Volkswagen will also be equipped with the WiFi-based technology. In contrast, BMW, Audi, PSA, and Ford are currently working on Cellular-V2X compatible equipment. Mid-2019 Toyota halted its earlier plans to install DSRC on U.S. vehicles by 2021, citing “a range of factors, including the need for greater automotive industry commitment as well as federal government support to preserve the 5.9 GHz spectrum band.”[141–143]

In terms of technical performance requirements for higher levels of autonomy, many experts voice that 5G-NR V2X is the technology of choice, and that neither DSRC nor LTE-V2X (PC5) will sufficiently support some key AV features.

The semiconductor manufacturer Qualcomm, together with Ford, compared the performance of C-V2X and DSRC in lab and field tests in Ann Arbor, Michigan and in San Diego, California. In a presentation to the 5G Automotive Association they concluded that C-V2X has a more extensive range and outperforms DSRC technology in robustness against interference, and in a number of scenarios, such as when a stationary vehicle obstructs V2V messages between two passing vehicles.[146]

Beyond the communication standard, the cloud network architecture is also a key component for autonomous vehicles. On that end, the infrastructure already developed by companies such as Amazon AWS, Google Cloud, and Microsoft Azure for other applications is mature enough to handle autonomous vehicle applications.[147–149]

“We’ve been looking at DSRC for a number of years along with Toyota, GM and Honda, so this is not a step that we take lightly in the sense of dismissing DSRC. But we think this is the right step to make given where we see the technology headed.”
Don Butler, Executive Director, Connected Vehicle Platform and Products, Ford, January 2019 [144]

“Wi-Fi is the only safe and secure V2X technology that has been tested for more than 10 years and is ready for immediate volume rollout.”
Lars Reger, Chief Technology Officer at NXP Semiconductors, October 2019 [145]

A demo of a V2X system warning for vulnerable road users. Image: Audi

The Robocar. Image: Benedict Redgrove

Use case: Autonomous Racing

“In an autonomous environment we don’t have to educate the driver. Instead we directly input those engineering results into our software.”
Alan Cocks, Chief Engineer, Roborace, November 2019 [150]

A company called ‘Roborace’ has been pushing the limits of technology with its autonomous racing vehicles. Roborace was announced at the end of 2015, and Robocar, their autonomous race car, was launched in February 2016. The Robocar currently holds the Guinness World Record for fastest autonomous vehicle, at a speed of 282.42 km/h (175.49 mph). Next to the Robocar, Roborace developed a second vehicle platform, the DevBot 2.0, which, contrary to the former, also provides space and controls for a human driver.

The DevBot is used in the Season Alpha program, Roborace’s debut competition. Here multiple teams are pitched against one another in head-to-head races. The hardware of both vehicles is managed centrally and is the same for each team, meaning that the only differentiator is the AI driver software the teams develop for the competition: for example, improved live path running, or modified LIDAR algorithms.

Roborace provides the teams with a base software layer. This is an entirely internal Automated Driving System (ADS), designed to be a starting point, a basis for various teams and projects to use and develop on top of. The code is open source and available to all teams, and next to that Roborace provides an API to simplify software development.

Driving in the controlled environment of a racing circuit can remove much of the unpredictability and variability that cars encounter in the real world. Therefore, Roborace is looking to augment the tracks with obstacles (both real and virtual) to simulate real-world environments. Furthermore, not needing to take care of passenger safety and user experience removes many other constraints. Roborace can focus on seeking the performance limits of its vehicles. That means their software is constantly learning the maximum settings the vehicles can use, probing the edge of performance possibilities live on the track in order to advance autonomous software at a faster rate.

Both DevBot and Robocar host an Nvidia Drive PX 2 computer, which is fairly common for autonomous vehicles. It’s a liquid-cooled machine that sports 12 CPU cores and has 8 teraflops worth of processing power, enabling it to achieve 24 trillion operations a second. On top of that, to adjust to racing conditions, they’ve added a Speedgoat computer, common in motorsport, to allow real-time processing aimed at increasing performance.

Furthermore, Roborace cars differ from normal autonomous vehicles in the abundance of sensors that have been included in order to provide a base system for multiple development teams to work on. You don’t need that many cameras, LIDARs, and GPS units, but their availability allows the teams to choose which system and setup they want to utilize for the race.

Looking forward, the big thing that will impact Roborace is 5G. Roborace insists on having full, live data telemetry at all times, so they know exactly what the car is doing. Next to that, they have a constant video stream. This means they have to create a 5G network around the entire racetrack. For each new race this requires several kilometers of fibre, numerous roadside units, and a lot of batteries.

Moving to 5G would allow the Roborace cars to basically run anywhere, assuming a network is available. Hugely reducing the time and work it takes to deploy these vehicles will enable development to focus on the cars’ software performance and on acquiring data. And that, according to Roborace, is exactly the area in which autonomous vehicles need the most development: their software, and testing various cases and situations.

Roborace is not only pioneering on a technical level. The company is also experimenting with future racing formats that combine the real and the virtual, and Roborace explores how to bring this entertainment to a global fanbase. Their second season, Season Beta, will begin in 2020 with 5 competing teams.[150]

Type                 | Robocar                                                  | DevBot 2.0
---------------------|----------------------------------------------------------|----------------------------------------
Perception sensors   | LIDAR; ultrasonic sensors; front radar; cameras (5x); military-spec GPS (with antennas at both ends of the car for heading)
Battery type         | Custom design, built by Rimac
Battery capacity     | 52 kWh                                                   | 36 kWh
Peak voltage         | 729 V                                                    | 725 V
Motor                | 4x integral powertrain CRB, 135 kW each (one per wheel)  | 2x integral powertrain CRB, 135 kW each
Total power          | 540 kW                                                   | 270 kW
Top speed (achieved) | 300 kph                                                  | 217 kph*
Range                | 15-20 mins**                                             | 15 mins**

*On track; note that no specific top speed runs have been attempted.
**At full racing performance, similar to a 1st generation Formula E car.

Summary

At the start of the 2020s, the state of autonomous vehicles is such that they have achieved the ability to drive without human supervision and interference, albeit under strictly defined conditions. This so-called level 4, or high automation, has been reached amid many unforeseen challenges for technology developers and scaled-back projections.

No technology is yet capable of Level 5, full automation, and some experts claim this level will never be achieved. The most automated personal vehicles on the market perform at level 2, where a human driver still needs to monitor and judge when to take over control, for example with Tesla’s Autopilot. One major challenge towards full autonomy is that the environment (including rules, culture, weather, etc.) greatly influences the level of autonomy that vehicles can safely achieve, and performance in, e.g., sunny California, USA, cannot easily be extrapolated to different parts of the world.

Beyond individual personal transportation, other areas in which autonomous vehicles will be deployed include public transportation, delivery & cargo, and specialty vehicles for farming and mining. And while all applications come with their own specific requirements, the vehicles all need to sense their environment, process input and make decisions, and subsequently take action.

Generally, a mixture of passive (cameras) and active (e.g. RADAR) sensors is used to sense the environment. Of all perception sensors, LIDAR is seen by most in the industry as a necessary element. Some are going against this conventional wisdom, including Tesla (relying on cameras, RADAR, and ultrasound), Nissan, and Wayve (relying on cameras only).

These sensors are all undergoing technological development to improve their performance and increase efficiency. LIDAR sees the most innovation, as it’s moving away from the traditional, relatively bulky and costly mechanical scanning systems. Newer solutions include microelectromechanical mirrors (MEMS), and systems that do not use any mechanical parts: solid-state LIDAR, sometimes dubbed ‘LIDAR-on-a-chip.’

For higher-level path planning (determining a route to reach a destination), different Global Navigation Satellite Systems beyond the American GPS have become available. By leveraging multiple satellite systems, augmentation techniques, and additional sensors to aid in positioning, sub-centimeter accuracy for positioning can be achieved.

Another essential source of information for many current autonomous vehicles are high definition maps that represent the world’s detailed features with an accuracy of a decimeter or less. In contrast, some companies, including Tesla and Apple, envision a map-less approach.

For the whole process of simultaneously mapping the environment while keeping track of location (SLAM), combining data from multiple sources (sensor fusion), path planning, and motion control, two different AI approaches are generally used:

1. Sequentially, where the problem is decomposed into a pipeline with specific software for each step. This is the traditional, and most common, approach.
2. An End-to-End (e2e) solution based on deep learning. End-to-End learning increasingly gets interest as a potential solution because of recent breakthroughs in the field of deep learning.

For either architectural approach, various types of machine learning algorithms are currently being used: Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), and Deep Reinforcement Learning (DRL) are the most common. These methods don’t necessarily sit in isolation, and some companies rely on hybrid forms to increase accuracy and reduce computational demands.

In terms of processors, most AV companies rely on GPU-accelerated processing. However, increasingly different solutions are becoming available, such as Tensor Processing Units (TPU) that are developed around the core workload of deep learning algorithms. More electronics, greater complexity, and increasing performance demands are met by semiconductor innovations that include smaller components and the use of novel materials like Gallium Nitride instead of silicon. Engineers also face questions about how much to distribute or centralize vehicles’ electrical architecture.

To increase the available data for autonomous driving systems to act upon and increase safety, vehicles need to share information with other road participants, traffic infrastructure, and the cloud. For this ‘Vehicle-to-Everything’ (V2X) communication, two major networking technologies can be chosen:

1. Dedicated short-range communication (DSRC), based on a WiFi standard,
2. Cellular V2X (C-V2X), which for AV applications needs to be based on 5G.

At the moment both DSRC and C-V2X are going through enhancements. The question of whether DSRC or C-V2X is the best choice is a subject of debate. Due to its rapid progress and performance, the latter is increasingly preferred, and experts express that DSRC won’t sufficiently support some key AV features.

In parallel with technological development, user experience design is an important factor for autonomous vehicles. For lower-level automated vehicles, where humans at times have to take control and drive, mode confusion can arise when the state of the vehicle is unclear, e.g. whether autonomous driving is active or not. Other key challenges for user experience design are trust-building and communicating the intentions of self-driving vehicles. Internally, for the passengers, human driver behavior is often emulated on displays. For external communication, companies are researching displays with words or symbols to substitute the human interaction that people heavily rely on when participating in traffic.

Wevolver’s community of engineers has expressed a growing interest in autonomous vehicle technology, and hundreds of companies, from startups to established industry leaders, are investing heavily in the required improvements. Despite a reckoning with too-optimistic expectations, it’s expected we will see continuous innovation, and autonomous vehicles will be an exciting field to follow and be involved in.

“The corner cases involving bad weather, poor infrastructure, and chaotic road conditions are proving to be tremendously challenging. Significant improvements are still required in the efficacy and cost efficiency of the existing sensors. New sensors, like thermal, will be needed which have the ability to see at night and in inclement weather. Similarly, AI computing must become more efficient as measured by meaningful operations (e.g., frames or inferences) per watt or per dollar.”
Drue Freeman, CEO of the Association for Corporate Growth, Silicon Valley, and former Sr. Vice President of Global Automotive Sales & Marketing for NXP Semiconductors, December 2019 [151]

About Nexperia

Nexperia is a global semiconductor manufacturer with over 11,000 employees, headquartered in Nijmegen, the Netherlands. Nexperia owns 5 factories: 2 wafer fabs in Hamburg (Germany) and Manchester (UK), as well as assembly centers in China, Malaysia, and the Philippines. They produce over 90 billion units per year. According to Nexperia, virtually every electronic design in the world uses Nexperia parts. Their product range includes Discretes, MOSFETs, and Analog & Logic ICs.

In 2017, Nexperia spun out of NXP, where it formed the Standard Products business unit, to become its own, independent company. NXP itself was formerly Philips Semiconductors, effectively giving Nexperia over 60 years of experience.

According to the company, miniaturization, power efficiency, and protection & filtering are the 3 major engineering challenges Nexperia aims to support with its products. Its portfolio consists of over 15,000 different products, and more than 800 new ones are added each year. Recently Nexperia launched Gallium Nitride (GaN) based high voltage power FETs as an alternative to traditional silicon-based high voltage MOSFETs.

The automotive sector is Nexperia’s most important market, and the company supplies many key players in the field of autonomous vehicles. Those include OEMs like Hyundai, pioneering AV technology developers like Aptiv, and tier 1 suppliers like Bosch, Continental, Denso, and Valeo.

Nexperia products show up in many areas of contemporary vehicles. In the powertrain they are part of components like converters, inverters, engine control units, transmissions, and batteries. In the interior they enable infotainment and comfort & control applications such as HVAC (heating, ventilation, and air conditioning) and power windows. Furthermore, Nexperia powers ADAS systems such as adaptive cruise control, and is expected to be a major supplier for the autonomous vehicle industry.

Two automotive semiconductor components produced by Nexperia. Image: Nexperia.

MOSFETs produced by Nexperia.

About Wevolver

Wevolver is a digital media platform & community dedicated to helping people develop better technology. At Wevolver we aim to empower people to create and innovate by providing access to engineering knowledge.

Therefore, we bring a global audience of engineers informative and inspiring content, such as articles, videos, podcasts, and reports, about state-of-the-art technologies.

We believe that humans need innovation to survive and thrive. Developing relevant technologies and creating the best possible solutions require an understanding of the current cutting edge. There is no need to reinvent the wheel.

We aim to provide access to all knowledge about technologies that can help individuals and teams develop meaningful products. This information can come from many places and different kinds of organizations: we publish content from our own editorial staff, our partners like MIT, or contributors from our engineering community. Companies can sponsor content on the platform.

Our content reaches millions of engineers every month. For this work Wevolver has won the SXSW Innovation Award and the Accenture Innovation Award, and has been named among the most innovative web platforms by Fast Company.

Wevolver is how today’s engineers stay cutting edge.

References
1. Hawkins AJ. Waymo’s driverless car: 7. On-Road Automated Driving (ORAD) 13. What is the difference between CCD 19. Greene B. What will the future look 26. Murray C. Autonomous Cars Look to 31. Christopher V. Poulton and Michael
ghost-riding in the backseat of a committee. Taxonomy and Defini- and CMOS image sensors in a digital like? Elon Musk speaks at TED2017. Sensor Advancements in 2019. In: R. Watts. MIT and DARPA Pack Lidar
robot taxi. In: The Verge [Internet]. tions for Terms Related to Driving camera? In: HowStuffWorks [Internet]. In: TED Blog [Internet]. 28 Apr 2017 Design News [Internet]. 7 Jan 2019 Sensor Onto Single Chip. In: IEEE
The Verge; 9 Dec 2019 [cited 27 Dec Automation Systems for On-Road HowStuffWorks; 1 Apr 2000 [cited 8 [cited 27 Dec 2019]. https://blog.ted. [cited 16 Dec 2019]. https://www. Spectrum: Technology, Engineering,
2019]. https://www.theverge. Motor Vehicles. SAE International; Dec 2019]. https://electronics.how- com/what-will-the-future-look-like- designnews.com/electronics-test/au- and Science News [Internet]. 4 Aug
com/2019/12/9/21000085/way-mo- 2018 Jun. Report No.: J3016_201806. stuffworks.com/cameras-photogra- elon-musk-speaks-at-ted2017/ tonomous-cars-look-sensor-advance- 2016 [cited 29 Jan 2020]. https://
fully-driverless-car-self-driv-ing-ride- https://saemobilus.sae.org/viewhtml/ phy/digital/question362.htm ments-2019/95504860759958 spectrum.ieee.org/tech-talk/semicon-
hail-service-phoenix-arizona J3016_201806/ 20. Tajitsu N. On the radar: Nissan stays ductors/optoelectronics/mit-lidar-
14. Nijland W. - Basics of Infrared cool on lidar tech, siding with Tesla. 27. Marenko K. Why Hi-Resolution Radar on-a-chip
2. Romm J. Top Toyota expert throws 8. [No title]. [cited 29 Jan 2020]. https:// Photography. In: Infrared Photogra- In: Reuters [Internet]. Reuters; 16 May is a Game Changer. In: FierceElectron-
cold water on the driverless car hype. users.ece.cmu.edu/~koopman/pubs/ phy [Internet]. [cited 27 Dec 2019]. 2019 [cited 29 Jan 2020]. https:// ics [Internet]. 23 Aug 2018 [cited 16 32. Ross PE. Lumotive Says It’s Got a Sol-
In: ThinkProgress [Internet]. 20 Sep Koopman19_SAFE_AI_ODD_OEDR.pdf https://www.ir-photo.net/ir_imaging. www.reuters.com/article/us-nissan-li- Dec 2019]. https://www.fierceelec- id-State Lidar That Really Works. In:
2018 [cited 8 Jan 2020]. https:// html dar-autonomous-idUSKCN1SM0W2 tronics.com/components/why-hi-res- IEEE Spectrum: Technology, Engineer-
thinkprogress.org/top-toyota-expert- 9. Czarnecki K. Operational Design olution-radar-a-game-changer ing, and Science News [Internet]. 21
truly-driverless-cars-might-not-be-in- Domain for Automated Driving Sys- 15. Rudolph G, Voelzke U. - Three Sensor 21. Coldewey D. Startups at the speed of Mar 2019 [cited 29 Jan 2020]. https://
my-lifetime-0cca05ab19ff/ tems - Taxonomy of Basic Terms. 2018 Types Drive Autonomous Vehicles. In: light: Lidar CEOs put their industry in 28. Koon J. How Sensors Empower spectrum.ieee.org/cars-that-think/
[cited 4 Feb 2020]. doi:10.13140/ FierceElectronics [Internet]. 10 Nov perspective. In: TechCrunch [Internet]. Autonomous Driving. In: Engineering. transportation/sensors/lumotive-
3. Ramsey M. The 2019 Connected RG.2.2.18037.88803 2017 [cited 16 Dec 2019]. https:// TechCrunch; 29 Jun 2019 [cited 29 com [Internet]. 15 Jan 2019 [cited 16 says-its-got-a-solidstate-lidar-that-
Vehicle and Smart Mobility HC. In: www.fierceelectronics.com/compo- Jan 2020]. http://social.techcrunch. Dec 2019]. https://www.engineering. really-works
Twitter [Internet]. 31 Jul 2019 [cited 9 10. Sotudeh J. - A Review Of Autonomous nents/three-sensor-types-drive-au- com/2019/06/29/lidar-startup-ceos/ com/IOT/ArticleID/18285/How-Sen-
Jan 2020]. https://twitter.com/MRam- Vehicle Safety And Regulations. In: tonomous-vehicles sors-Empower-Autonomous-Driving. 33. Sun W, Hu Y, MacDonnell DG, Weimer
sey92/status/1156626888368054273 Wevolver [Internet]. 31 january, 2002 22. Ohta J. Smart CMOS Image Sen- aspx C, Baize RR. Technique to sepa-
[cited 31 january, 2020]. https:// 16. Rosique F, Navarro PJ, Fernández C, sors and Applications. 2017. rate lidar signal and sunlight. Opt
4. Ramsey M. Hype Cycle for Connect-ed www.wevolver.com/article/a.review. Padilla A. - A Systematic Review of doi:10.1201/9781420019155 29. Yoo HW, Druml N, Brunner D, Express, OE. 2016;24: 12949–12954.
Vehicles and Smart Mobility, 2019.
of.autonomous.vehicle.safety.and. Perception System and Simulators Schwarzl C, Thurner T, Hennecke M, doi:10.1364/OE.24.012949
Gartner; 2019 Jul. Report No.: regulations for Autonomous Vehicles Research. 23. Royo S, Ballesta-Garcia M. An et al. MEMS-based lidar for auton-
G00369518. https://www.gartner. Sensors. 2019;19: 648. doi:10.3390/ Overview of Lidar Imaging Systems omous driving. e & i Elektrotechnik 34. Velodyne Lidar Introduces Ve-
com/en/documents/3955767/
11. Self-Driving Cars Explained. s19030648 for Autonomous Vehicles. Applied und Informationstechnik. 2018. pp. labitTM. In: Business Wire [Internet].
hype-cycle-for-connected-vehi-cles- In: Union of Concerned Scientists Sciences. 2019. p. 4093. doi:10.3390/ 408–415. doi:10.1007/s00502-018- 7 Jan 2020 [cited 29 Jan 2020].
and-smart-mobility-201 [Internet]. 26 Jan 2017 [cited 11 Dec 17. Marshall B. - Lidar, Radar & Digital app9194093 0635-2 https://www.businesswire.com/
2019]. https://www.ucsusa.org/re- Cameras: the Eyes of Autonomous news/home/20200107005849/
5. Wevolver. 2019 Engineering State of sources/self-driving-cars-101 Vehicles. In: Design Spark [Internet]. 24. Thompson J. Ultrasonic Sensors: More 30. Lee TB. Why experts believe cheaper, en/Velodyne-Lidar-Introduces-Ve-
Mind Report. In: Wevolver [Internet].
21 Feb 2018 [cited 19 Dec 2019]. Than Parking | Level Five Supplies. better lidar is right around the corner. labit%E2%84%A2/?feedref=JjAw-
22 Dec 2019 [cited 8 Jan 2020].
12. Beevor M. - Driving autonomous https://www.rs-online.com/de- In: Level Five Supplies [Internet]. In: Ars Technica [Internet]. 1 Jan JuNHiystnCoBq_hl-bV7DTI-
https://www.wevolver.com/arti-
vehicles forward with intelligent signspark/lidar-radar-digital-camer- [cited 31 Jan 2020]. https://levelf- 2018 [cited 29 Jan 2020]. https:// YheT0D-1vT4_bKFzt_EW40VMd-
cle/2019.engineering.state.of.mind. re infrastructure. In: Smart Cities World as-the-eyes-of-autonomous-vehicles ivesupplies.com/ultrasonic-sen- arstechnica.com/cars/2018/01/ K6eG-WLfRGUE1fJraLPL1g6Ae-
port/
[Internet]. 11 Apr 2019 [cited 19 Dec sors-more-than-just-parking/ driving-around-without-a-driver-li- UGJlCTYs7Oafol48Kkc8KJg-
2019]. https://www.smartcitiesworld. 18. Dmitriev S. Autonomous cars will dar-technology-explained/ ZoTHgMu0w8LYSbRdYOj2VdwnuKwa
6. Shuttleworth J. SAE Standards News:
net/opinions/opinions/driving-auton- generate more than 300 TB of data 25. The Tesla Team. Upgrading Autopilot:
J3016 automated-driving graphic
omous-vehicles-forward-with-intelli- per year. In: Tuxera [Internet]. 28 Nov Seeing the World in Radar. In: Tesla
update. In: SAE International [Inter-
gent-infrastructure 2017 [cited 12 Dec 2019]. https:// Blog [Internet]. 11 Sep 2016 [cited
net]. 7 Jan 2019 [cited 26 Nov 2019].
www.tuxera.com/blog/autonomous- 29 Jan 2020]. https://www.tesla.
https://www.sae.org/news/2019/01/
cars-300-tb-of-data-per-year/ com/blog/upgrading-autopilot-see-
sae-updates-j3016-automated-driv-
ing-world-radar
ing-graphic

72 73
35. Ross PE. Velodyne Will Sell a Lidar for $100. In: IEEE Spectrum: Technology, Engineering, and Science News [Internet]. 20 Jan 2020 [cited 29 Jan 2020]. https://spectrum.ieee.org/cars-that-think/sensors/automotive-sensors/velodyne-will-sell-a-lidar-for-100

36. Tsyktor V. LIDAR vs Radar vs Sonar: Which Is Better for Self-Driving Cars? In: CyberPulse [Internet]. 28 May 2018 [cited 9 Dec 2019]. https://cyberpulse.info/lidar-vs-radar-vs-sonar/

37. Tompkinson W, van Rees E. Blickfeld Cube Range has a range of up to 250 meters - SPAR 3D. In: SPAR 3D [Internet]. 17 Oct 2019 [cited 12 Dec 2019]. https://www.spar3d.com/news/lidar/blickfelds-latest-lidar-sensor-has-a-range-up-to-250-meters/

38. Thusu R. The Growing World of the Image Sensors Market. In: FierceElectronics [Internet]. 1 Feb 2012 [cited 27 Dec 2019]. https://www.fierceelectronics.com/embedded/growing-world-image-sensors-market

39. Sawers P. Wayve raises $20 million to give autonomous cars better AI brains. In: VentureBeat [Internet]. VentureBeat; 18 Nov 2019 [cited 27 Dec 2019]. https://venturebeat.com/2019/11/17/wayve-raises-20-million-to-give-autonomous-cars-better-ai-brains/

40. Lambert F. A look at Tesla's new Autopilot hardware suite: 8 cameras, 1 radar, ultrasonics & new supercomputer - Electrek. In: Electrek [Internet]. 20 Oct 2016 [cited 9 Jan 2020]. https://electrek.co/2016/10/20/tesla-new-autopilot-hardware-suite-camera-nvidia-tesla-vision/

41. Tesla. Autopilot. In: Tesla [Internet]. [cited 9 Jan 2020]. https://www.tesla.com/autopilot

42. Bashir E. Opinion Post: What failed in Uber's Accident that resulted in the death of a Pedestrian. In: Automotive Electronics [Internet]. 23 Mar 2018 [cited 9 Jan 2020]. https://www.automotivelectronics.com/uber-driverless-car-accident-technology/

43. IntelliSafe surround. In: Volvo Cars [Internet]. [cited 9 Jan 2020]. https://www.volvocars.com/intl/why-volvo/human-innovation/future-of-driving/safety/intellisafe-surround

44. Volvo Cars and Uber present production vehicle ready for self-driving. In: Volvo Cars Global Newsroom [Internet]. 12 Jun 2019 [cited 9 Jan 2020]. https://www.media.volvocars.com/global/en-gb/media/pressreleases/254697/volvo-cars-and-uber-present-production-vehicle-ready-for-self-driving

45. Waymo. Waymo Safety Report 2018: On the Road to Fully Self-Driving. Waymo; https://storage.googleapis.com/sdc-prod/v1/safety-report/Safety%20Report%202018.pdf

46. GPS.gov: GPS Accuracy. In: GPS.gov [Internet]. 5 Dec 2017 [cited 10 Dec 2019]. https://www.gps.gov/systems/gps/performance/accuracy/

47. GMV. Galileo General Introduction. In: ESA Navipedia [Internet]. [cited 29 Jan 2020]. https://gssc.esa.int/navipedia/index.php/Galileo_General_Introduction

48. What is GNSS? In: OxTS [Internet]. 14 Aug 2019 [cited 29 Jan 2020]. https://www.oxts.com/what-is-gnss/

49. Gade K. The Seven Ways to Find Heading. J Navig. 2016;69: 955–970. doi:10.1017/S0373463316000096

50. Noguchi N, Kise M, Reid JF, Zhang Q. Autonomous Vehicle Based on GPS and Inertial Sensors. IFAC Proceedings Volumes. 2001;34: 105–110. doi:10.1016/S1474-6670(17)34115-0

51. Teschler L. Inertial measurement units will keep self-driving cars on track. In: Microcontroller Tips [Internet]. 15 Aug 2018 [cited 29 Jan 2020]. https://www.microcontrollertips.com/inertial-measurement-units-will-keep-self-driving-cars-on-track-faq/

52. Edelkamp S, Schrödl S. Heuristic Search: Theory and Applications. Elsevier Inc.; 2012. https://doi.org/10.1016/C2009-0-16511-X

53. Waymo Team. Building maps for a self-driving car. In: Medium [Internet]. Waymo; 13 Dec 2016 [cited 29 Jan 2020]. https://medium.com/waymo/building-maps-for-a-self-driving-car-723b4d9cd3f4

54. Lyft. Rethinking Maps for Self-Driving. In: Medium [Internet]. Lyft Level 5; 15 Oct 2018 [cited 29 Jan 2020]. https://medium.com/lyftlevel5/https-medium-com-lyftlevel5-rethinking-maps-for-self-driving-a147c24758d6

55. Boudette NE. Building a Road Map for the Self-Driving Car. In: New York Times [Internet]. 2 Mar 2017 [cited 29 Jan 2020]. https://www.nytimes.com/2017/03/02/automobiles/wheels/self-driving-cars-gps-maps.html

56. Templeton B. Elon Musk Declares Precision Maps A "Really Bad Idea" -- Here's Why Others Disagree. In: Forbes [Internet]. Forbes; 20 May 2019 [cited 29 Jan 2020]. https://www.forbes.com/sites/bradtempleton/2019/05/20/elon-musk-declares-precision-maps-a-really-bad-idea-heres-why-others-disagree/

57. Al-Dahle A, Last ME, Sieh PJ, Lyon B. Autonomous Navigation System. US Patent 2017/0363430 A1, 2017. https://pdfaiw.uspto.gov/.aiw?Docid=20170363430

58. Kendall A. Learning to Drive like a Human. In: Wayve [Internet]. Wayve; 3 Apr 2019 [cited 29 Jan 2020]. https://wayve.ai/blog/driving-like-human

59. Conner-Simons A, Gordon R. Self-driving cars for country roads. In: MIT News [Internet]. 7 May 2018 [cited 29 Jan 2020]. http://news.mit.edu/2018/self-driving-cars-for-country-roads-mit-csail-0507

60. Ort T, Paull L, Rus D. Autonomous Vehicle Navigation in Rural Environments without Detailed Prior Maps. 2018. https://toyota.csail.mit.edu/sites/default/files/documents/papers/ICRA2018_AutonomousVehicleNavigationRuralEnvironment.pdf

61. Matheson R. Bringing human-like reasoning to driverless car navigation. In: MIT News [Internet]. 22 May 2019 [cited 29 Jan 2020]. http://news.mit.edu/2019/human-reasoning-ai-driverless-car-navigation-0523

62. Burgard W, Stachniss C, Arras K, Bennewitz M. SLAM: Simultaneous Localization and Mapping. Introduction to Mobile Robotics. http://ais.informatik.uni-freiburg.de/teaching/ss12/robotics/slides/12-slam.pdf

63. Huang B, Zhao J, Liu J. A Survey of Simultaneous Localization and Mapping. 2019. http://arxiv.org/abs/1909.05214

64. Castanedo F. A Review of Data Fusion Techniques. The Scientific World Journal. 2013;2013. doi:10.1155/2013/704504

65. Holstein T, Dodig-Crnkovic G, Pelliccione P. Ethical and Social Aspects of Self-Driving Cars. Arxiv. 2018. https://arxiv.org/abs/1802.04103

66. Grigorescu S, Trasnea B, Cocias T, Macesanu G. A Survey of Deep Learning Techniques for Autonomous Driving. Journal of Field Robotics. 2019; 1–25. doi:10.1002/rob.21918

67. Haavaldsen H, Aasboe M, Lindseth F. Autonomous Vehicle Control: End-to-end Learning in Simulated Urban Environments. 2019. http://arxiv.org/abs/1905.06712

68. Zhang J. End-to-end Learning for Autonomous Driving. New York University; 2019 May. https://cs.nyu.edu/media/publications/zhang_jiakai.pdf

69. Camera Based Image Processing. In: Self Driving Cars [Internet]. 26 Sep 2017 [cited 11 Dec 2019]. https://sites.tufts.edu/selfdrivingisaac/2017/09/26/camera-based-image-processing/

70. Prabhu. Understanding of Convolutional Neural Network (CNN) — Deep Learning. In: Medium [Internet]. Medium; 4 Mar 2018 [cited 12 Dec 2019]. https://medium.com/@RaghavPrabhu/understanding-of-convolutional-neural-network-cnn-deep-learning-99760835f148

71. Convolutional Neural Network Architecture: Forging Pathways to the Future. In: MissingLink.ai [Internet]. [cited 12 Dec 2019]. https://missinglink.ai/guides/convolutional-neural-networks/convolutional-neural-network-architecture-forging-pathways-future/

72. A Beginner's Guide to LSTMs and Recurrent Neural Networks. In: Pathmind [Internet]. [cited 12 Dec 2019]. http://pathmind.com/wiki/lstm

73. Banerjee S. An Introduction to Recurrent Neural Networks. In: Medium [Internet]. Explore Science & Artificial Intelligence; 23 May 2018 [cited 12 Dec 2019]. https://medium.com/explore-artificial-intelligence/an-introduction-to-recurrent-neural-networks-72c97bf0912

74. Introduction to Recurrent Neural Network. In: GeeksforGeeks [Internet]. 3 Oct 2018 [cited 12 Dec 2019]. https://www.geeksforgeeks.org/introduction-to-recurrent-neural-network/

75. Folkers A, Rick M. Controlling an Autonomous Vehicle with Deep Reinforcement Learning. ArXiv. https://arxiv.org/abs/1909.12153

76. Talpaert V, Sobh I, Kiran BR, Mannion P, Yogamani S, El-Sallab A, et al. Exploring applications of deep reinforcement learning for real-world autonomous driving systems. 2019. http://arxiv.org/abs/1901.01536

77. Karpathy A. Multi-Task Learning in the Wilderness. SlidesLive; 2019. https://slideslive.com/38917690

78. Figure Eight. TRAIN AI 2018 - Building the Software 2.0 Stack. 2018. https://vimeo.com/272696002

79. Open Datasets - Scale. [cited 29 Jan 2020]. https://scale.com/open-datasets

80. Dataset | Lyft Level 5. In: Lyft Level 5 [Internet]. [cited 29 Jan 2020]. https://level5.lyft.com/dataset/

81. Open Dataset – Waymo. In: Waymo [Internet]. [cited 29 Jan 2020]. https://waymo.com/open/
82. Smith J. Why Simulation is the Key to Building Safe Autonomous Vehicles. In: Electronic Design [Internet]. 3 Oct 2019 [cited 29 Jan 2020]. https://www.electronicdesign.com/markets/automotive/article/21808661/why-simulation-is-the-key-to-building-safe-autonomous-vehicles

83. Pan X, You Y, Wang Z, Lu C. Virtual to Real Reinforcement Learning for Autonomous Driving. 2017. http://arxiv.org/abs/1704.03952

84. Hawkins AJ. It's Elon Musk vs. everyone else in the race for fully driverless cars. In: The Verge [Internet]. The Verge; 24 Apr 2019 [cited 27 Dec 2019]. https://www.theverge.com/2019/4/24/18512580/elon-musk-tesla-driverless-cars-lidar-simulation-waymo

85. Etherington D. Waymo has now driven 10 billion autonomous miles in simulation. In: TechCrunch [Internet]. TechCrunch; 10 Jul 2019 [cited 29 Jan 2020]. http://social.techcrunch.com/2019/07/10/waymo-has-now-driven-10-billion-autonomous-miles-in-simulation/

86. Shukla D. Design Considerations For Autonomous Vehicles. In: Electronics For You [Internet]. 16 Aug 2019 [cited 17 Dec 2019]. https://electronicsforu.com/market-verticals/automotive/design-considerations-autonomous-vehicles

87. Katrakazas C, Quddus M, Chen W-H, Deka L. Real-time motion planning methods for autonomous on-road driving: State-of-the-art and future research directions. Transp Res Part C: Emerg Technol. 2015;60: 416–442. doi:10.1016/j.trc.2015.09.011

88. Venkatachalam M. Attention in Neural Networks. In: Medium [Internet]. Towards Data Science; 7 Jul 2019 [cited 19 Dec 2019]. https://towardsdatascience.com/attention-in-neural-networks-e66920838742

89. Baykal C, Liebenwein L, Gilitschenski I, Feldman D, Rus D. Data-Dependent Coresets for Compressing Neural Networks with Applications to Generalization Bounds. Arxiv. 2018. http://arxiv.org/abs/1804.05345

90. Anthony Levandowski on lessons learned at TC Sessions: Robotics+AI [Youtube]. TechCrunch; 2019. https://www.youtube.com/watch?v=fNgEG5rCav4

91. Grand View Research (GVN). Automotive Electronic Control Unit Market Size, Share, & Trends Analysis Report By Application, By Propulsion Type, By Capacity, By Vehicle Type, By Region, And Segment Forecasts, 2019 - 2025. Grand View Research (GVN); 2019 Jul. Report No.: 978-1-68038-367-6. https://www.grandviewresearch.com/industry-analysis/automotive-ecu-market

92. Everything You Wanted to Know About Types of Operating Systems in Autonomous Vehicles - Intellias. In: Intellias (Intelligent Software Engineering) [Internet]. 15 May 2019 [cited 27 Dec 2019]. https://www.intellias.com/everything-you-wanted-to-know-about-types-of-operating-systems-in-autonomous-vehicles/

93. van Dijk L. Future Vehicle Networks and ECUs: Architecture and Technology considerations. NXP Semiconductors; 2017. https://www.nxp.com/docs/en/white-paper/FVNECUA4WP.pdf

94. Scobie J, Stachew M. Electronic control system partitioning in the autonomous vehicle. In: eeNews Automotive [Internet]. 29 Oct 2015 [cited 27 Dec 2019]. https://www.eenewsautomotive.com/content/electronic-control-system-partitioning-autonomous-vehicle

95. Estl H. Sensor fusion: A critical step on the road to autonomous vehicles. In: eeNews Europe [Internet]. 11 Apr 2016 [cited 27 Dec 2019]. https://www.eenewseurope.com/news/sensor-fusion-critical-step-road-autonomous-vehicles

96. Sheikh AF. How Advanced Driver-Assistance Systems (ADAS) Impact Automotive Semiconductors. In: Wevolver [Internet]. 12 Nov 2019 [cited 27 Dec 2019]. https://www.wevolver.com/article/how.advanced.driverassistance.systems.adas.impact.automotive.semiconductors/

97. Murray C. What's the Best Computing Architecture for the Autonomous Car? In: Design News [Internet]. 17 Aug 2017 [cited 27 Dec 2019]. https://www.designnews.com/automotive-0/what-s-best-computing-architecture-autonomous-car/87827789257286

98. Complexity in basic cars: SEAT Ateca SUV has 2.2 km of wire, 100 sensors and control units. In: Green Car Congress [Internet]. 24 Feb 2019 [cited 27 Dec 2019]. https://www.greencarcongress.com/2019/02/20190224-ateca.html

99. Nvidia. Self-Driving Safety Report. Nvidia Corporation; 2018. https://www.nvidia.com/content/dam/en-zz/Solutions/self-driving-cars/safety-report/auto-print-safety-report-pdf-v16.5%20(1).pdf

100. Gawron JH, Keoleian GA, De Kleine RD, Wallington TJ, Kim HC. Life Cycle Assessment of Connected and Automated Vehicles: Sensing and Computing Subsystem and Vehicle Level Effects. Environmental Science & Technology. 2018;52: 3249–3256. doi:10.1021/acs.est.7b04576

101. Stewart J. Self-Driving Cars Use Crazy Amounts of Power, and It's Becoming a Problem. In: Wired [Internet]. WIRED; 6 Feb 2018 [cited 27 Dec 2019]. https://www.wired.com/story/self-driving-cars-power-consumption-nvidia-chip/

102. Preibisch JB. Putting high-performance computing into cars: automotive discrete semiconductors for autonomous driving. In: Wevolver [Internet]. 11 Dec 2019 [cited 27 Dec 2019]. https://www.wevolver.com/article/putting.highperformance.computing.into.cars.automotive.discrete.semiconductors.for.autonomous.driving/

103. Efficient Power Conversion Corporation. What is GaN? [cited 27 Dec 2019]. https://epc-co.com/epc/GalliumNitride/WhatisGaN.aspx

104. Davis S. GaN Basics: FAQs. In: Power Electronics [Internet]. 2 Oct 2013 [cited 27 Dec 2019]. https://www.powerelectronics.com/technologies/gan-transistors/article/21863347/gan-basics-faqs

105. Wang J. Deep Learning Chips — Can NVIDIA Hold On To Its Lead? In: ARK Investment Management [Internet]. 27 Sep 2017 [cited 27 Dec 2019]. https://ark-invest.com/research/gpu-tpu-nvidia

106. International Communication Association. Humans less likely to return to an automated advisor once given bad advice. In: Phys.org [Internet]. 25 May 2016 [cited 8 Dec 2019]. https://phys.org/news/2016-05-humans-automated-advisor-bad-advice.html

107. Punchcut. UX Design for Autonomous Vehicles. In: Medium [Internet]. Medium; 7 Aug 2019 [cited 20 Dec 2019]. https://medium.com/punchcut/ux-design-for-autonomous-vehicles-9624c5a0a28f

108. Houser K. Tesla: Autopilot Is Nearly 9 Times Safer Than the Average Driver. In: Futurism [Internet]. The Byte; 24 Oct 2019 [cited 10 Dec 2019]. https://futurism.com/the-byte/tesla-autopilot-safer-average-driver

109. Ohnsman A. Waymo Says More Of Its Self-Driving Cars Operating "Rider Only" With No One At Wheel. In: Forbes [Internet]. Forbes; 28 Oct 2019 [cited 29 Jan 2020]. https://www.forbes.com/sites/alanohnsman/2019/10/28/waymos-autonomous-car-definition-if-you-need-a-drivers-license-its-not-self-driving/

110. Design Is [Autonomous] – In Conversation with Ryan Powell, Melissa Cefkin, and Wendy Ju [Youtube]. Google Design; 2018. https://www.youtube.com/watch?v=5hLEiBGPrNI

111. Niedermeyer E. Hailing a driverless ride in a Waymo. In: TechCrunch [Internet]. TechCrunch; 1 Nov 2019 [cited 8 Dec 2019]. http://social.techcrunch.com/2019/11/01/hailing-a-driverless-ride-in-a-waymo/

112. Rödel C, Stadler S, Meschtscherjakov A, Tscheligi M. Towards Autonomous Cars: The Effect of Autonomy Levels on Acceptance and User Experience. Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. ACM Digital Library; pp. 1–8. doi:10.1145/2667317.2667330

113. Eckoldt K, Knobel M, Hassenzahl M, Schumann J. An Experiential Perspective on Advanced Driver Assistance Systems. Information Technology. 2012;54: 165–171. doi:10.1524/itit.2012.0678

114. Els P. Braking and steering systems to control a new generation of autonomous vehicle. In: Automotive IQ [Internet]. Automotive IQ; 8 May 2019 [cited 12 Dec 2019]. https://www.automotive-iq.com/chassis-systems/columns/braking-and-steering-systems-to-control-a-new-generation-of-autonomous-vehicle

115. Moyers S. Current UX Design Challenges for Driverless Cars. In: Digital Agency Network [Internet]. 5 Dec 2017 [cited 19 Dec 2019]. https://digitalagencynetwork.com/current-ux-design-challenges-for-driverless-cars/

116. Kun AL, Boll S, Schmidt A. Shifting Gears: User Interfaces in the Age of Autonomous Driving - IEEE Journals & Magazine. IEEE Pervasive Computing. 15: 32–38. doi:10.1109/MPRV.2016.14

117. Wang S, Li Z. Exploring the mechanism of crashes with automated vehicles using statistical modeling approaches. PLoS One. 2019;14. doi:10.1371/journal.pone.0214550
118. Shutko J. How Self-Driving Cars Could Communicate with You in the Future. In: Ford Social [Internet]. 13 Sep 2017 [cited 9 Jan 2020]. https://social.ford.com/en_US/story/ford-community/move-freely/how-self-driving-cars-could-communicate-with-you-in-the-future.html

119. Davies A. The Self-Driving Startup Teaching Cars to Talk. In: Wired [Internet]. WIRED; 20 Aug 2018 [cited 29 Jan 2020]. https://www.wired.com/story/driveai-self-driving-design-frisco-texas/

120. Contributors to Wikimedia projects. Drive.ai - Wikipedia. In: Wikimedia Foundation, Inc. [Internet]. 27 Jul 2018 [cited 31 Jan 2020]. https://en.wikipedia.org/wiki/Drive.ai

121. Zoria S. Smart Cities: A New Look at the Autonomous-Vehicle Infrastructure. In: Medium [Internet]. 19 Nov 2019 [cited 19 Dec 2019]. https://medium.com/swlh/smart-cities-a-new-look-at-the-autonomous-vehicle-infrastructure-3e00cf3e93b2

122. Litman T. Autonomous Vehicle Implementation Predictions: Implications for Transport Planning. Victoria Transport Policy Institute; 2019 Oct. https://www.vtpi.org/avip.pdf

123. Macleod A. Autonomous driving, smart cities and the new mobility future. Siemens; 2018. https://www.techbriefs.com/autonomous-driving-smart-cities-and-the-new-mobility-future/file

124. Hoeben R. V2X is Here to Stay—Now Let's Use It for Autonomous Cars. In: Electronic Design [Internet]. 22 Aug 2018 [cited 19 Dec 2019]. https://www.electronicdesign.com/markets/automotive/article/21806892/v2x-is-here-to-staynow-lets-use-it-for-autonomous-cars

125. Vinel A. 5G V2X – Communication for Platooning - Högskolan i Halmstad. In: Halmstad University [Internet]. 16 Dec 2019 [cited 29 Jan 2020]. https://www.hh.se/english/research/research-environments/embedded-and-intelligent-systems-eis/research-projects-within-eis/5g-v2x---communication-for-platooning.html

126. Wevolver. Towards 5G Mobility: The role of efficient discrete semiconductors. [cited 31 Jan 2020]. https://www.wevolver.com/article/towards.5g.mobility.the.role.of.efficient.discrete.semiconductors/

127. Krasniqi X, Hajrizi E. Use of IoT Technology to Drive the Automotive Industry from Connected to Full Autonomous Vehicles. IFAC-PapersOnLine. 2016;49: 269–274. doi:10.1016/j.ifacol.2016.11.078

128. IEEE Standards Association. IEEE 802.11p-2010 - IEEE Standard for Information technology-- Local and metropolitan area networks-- Specific requirements-- Part 11: Wireless LAN Medium Access Control (MAC) and Physical Layer (PHY) Specifications Amendment 6: Wireless Access in Vehicular Environments. 2010 Jul. Report No.: 802.11p-2010. https://standards.ieee.org/standard/802_11p-2010.html

129. 3GPP. Release 15. [cited 12 Jan 2020]. https://www.3gpp.org/release-15

130. Flynn K. Initial Cellular V2X standard completed. [cited 12 Jan 2020]. https://www.3gpp.org/news-events/3gpp-news/1798-v2x_r14

131. Dedicated Short Range Communications. In: Clemson Vehicular Electronics Laboratory [Internet]. [cited 12 Jan 2020]. https://cecas.clemson.edu/cvel/auto/systems/DSRC.html

132. Naik G, Choudhury B. IEEE 802.11bd & 5G NR V2X: Evolution of Radio Access Technologies for V2X Communications. ArXiv. 2019. https://arxiv.org/abs/1903.08391

133. Llanasas R. 5G's Important Role in Autonomous Car Technology. In: Machine Design [Internet]. 11 Mar 2019 [cited 8 Dec 2019]. https://www.machinedesign.com/mechanical-motion-systems/article/21837614/5gs-important-role-in-autonomous-car-technology

134. Segan S. What Is 5G? In: PC Magazine [Internet]. 31 Oct 2019 [cited 10 Dec 2019]. https://www.pcmag.com/article/345387/what-is-5g

135. 5G Implementation Guidelines - Future Networks. In: Future Networks [Internet]. 28 Mar 2019 [cited 29 Jan 2020]. https://www.gsma.com/futurenetworks/wiki/5g-implementation-guidelines/

136. Chaudry F. Towards A System Of Systems: Networking And Communication Between Vehicles. In: Wevolver [Internet]. 31 Jan 2020 [cited 31 Jan 2020]. https://www.wevolver.com/article/towards.a.system.of.systems.networking.and.communication.between.vehicles

137. Leswing K. Qualcomm announces chips for self-driving cars that could be in cars by 2023. In: CNBC [Internet]. CNBC; 6 Jan 2020 [cited 12 Jan 2020]. https://www.cnbc.com/2020/01/06/qualcomm-snapdragon-ride-system-announced-for-self-driving-cars.html

138. Autonomous Driving at Intel. In: Intel Newsroom [Internet]. 8 Jan 2020 [cited 12 Jan 2020]. https://newsroom.intel.com/press-kits/autonomous-driving-intel/

139. Cottrill CD. Data and digital systems for UK transport: change and its implications. UK government's Foresight Future of Mobility project; 2018 Dec. https://aura.abdn.ac.uk/bitstream/handle/2164/12742/Dataanddigital.pdf;jsessionid=1AF1CB7BEE7C498F8EA713CC3D7C1255?sequence=1

140. Naik G, Choudhury B, Park J-M. IEEE 802.11bd & 5G NR V2X: Evolution of Radio Access Technologies for V2X Communications. IEEE Access. 2019. pp. 70169–70184. doi:10.1109/access.2019.2919489

141. Yoshida J. The DSRC vs 5G Debate Continues. In: EET Asia [Internet]. 29 Oct 2019 [cited 29 Jan 2020]. https://www.eetasia.com/news/article/The-DSRC-vs-5G-Debate-Continues

142. Shepardson D. Toyota abandons plan to install U.S. connected vehicle tech by 2021. In: U.S. [Internet]. Reuters; 26 Apr 2019 [cited 31 Jan 2020]. https://www.reuters.com/article/us-autos-toyota-communication-idUSKCN1S2252

143. Why C-V2X may yet become the global automotive connectivity standard - Futurum. In: Futurum [Internet]. 14 Nov 2019 [cited 31 Jan 2020]. https://futurumresearch.com/the-war-between-c-v2x-and-dsrc-looks-to-be-steering-itself-towards-c-v2x/

144. Naughton K. Ford Breaks With GM, Toyota on Future of Talking-Car Technology. In: Bloomberg [Internet]. 7 Jan 2019 [cited 29 Jan 2020]. https://www.bloomberg.com/news/articles/2019-01-07/ford-breaks-with-gm-toyota-on-future-of-talking-car-technology

145. Reger L. VW Golf Brings WiFi-Based Safe, Secure V2X to the Masses - NXP Blog. In: NXP Blog [Internet]. 30 Oct 2019 [cited 29 Jan 2020]. https://blog.nxp.com/automotive/vw-golf-brings-wifi-based-safe-secure-v2x-to-the-masses

146. V2X Technology Benchmark Testing. 2018 Sep. https://www.qualcomm.com/media/documents/files/5gaa-v2x-technology-benchmark-testing-dsrc-and-c-v2x.pdf

147. Designing a Connected Vehicle Platform on Cloud IoT Core. In: Google Cloud [Internet]. 10 Apr 2019 [cited 12 Jan 2020]. https://cloud.google.com/solutions/designing-connected-vehicle-platform

148. ADAS and Autonomous Driving. In: Amazon Web Services, Inc. [Internet]. [cited 12 Jan 2020]. https://aws.amazon.com/automotive/autonomous-driving/

149. Autonomous Vehicle Solutions. In: Microsoft [Internet]. [cited 12 Jan 2020]. https://www.microsoft.com/en-us/industry/automotive/autonomous-vehicle-deployment

150. Geenen B. Developing An Autonomous Racing Car: Interview With Roborace's Chief Engineer. In: Wevolver [Internet]. 31 Jan 2020 [cited 31 Jan 2020]. https://www.wevolver.com/article/developing.an.autonomous.racing.car.interview.with.roboraces.chief.engineer

151. Freeman D. Following the Autonomous Vehicle Hype. In: Wevolver [Internet]. 31 Jan 2020 [cited 31 Jan 2020]. https://www.wevolver.com/article/following.the.autonomous.vehicle.hype
Collaboratively written by experts from across the autonomous vehicle field, this report provides a comprehensive understanding of the current cutting edge.

It's for engineers who need to deepen their knowledge, for leaders who want to grasp the technological challenges and breakthroughs, and for all who are interested in learning how many fascinating technologies come together to create the innovations that change our future.

Address
Plantage Middenlaan 62
1018 DH Amsterdam
The Netherlands

Contact
@wevolverapp
www.wevolver.com
info@wevolver.com
