
SPECIAL REPORT:

ADAS/CONNECTED CAR
JUNE 2020
Sponsored by Dell EMC
Accelerate ADAS and autonomous driving innovation: Dell EMC storage makes it easy to get the most value from your data across key workflows. Learn more at http://dellemc.com/Isilon
CONTENTS

FEATURES
2 Seeing with Sound: Next-Level Ultrasonic Sensors for Intelligent Cars
By mimicking bats, echolocation technology enables the detection of multiple objects in 3D space.

6 Heat Seekers
While engineers debate the use of thermal-imaging sensors for ADAS, their capability and value are being proven for AVs of all levels.

10 Radar's Evolving Role in ADAS and the AV Future
The automotive radar sensor market has been growing at a 21% annual rate, putting greater demands on chip design, testing, and module deployment.

14 Lidar in a Flash
Continental delivers a short-range 3D flash lidar sensor that's expected to find increasing application in commercial vehicles and off-highway machines.

18 Bandwidth for Sale
The FCC and transportation industry clash over vital vehicle-communications boundaries.

22 ZF Establishes Level 2+ ADAS
Cost-effective technology solutions to meet diverse customer needs are what the "new pragmatism" in driver-assistance tech is all about.

24 Shifting Design of Autonomous Architectures
Electronic controls centralize while providing commercial-level reliability.

27 Democratize AV Technology!
Aptiv's new generation of open-sourced architectures based on a few central processors aims to speed AV adoption.

ON THE COVER
The battle for bandwidth is a key issue in optimizing advanced ADAS vehicle systems, and one that requires increasing on-vehicle data-processing capability. The major chipmakers are playing a significant role in enabling vehicles to communicate with one another and with the ecosystem around them. See the article on page 18. (NXP Semiconductor)



Seeing with Sound: Next-Level Ultrasonic Sensors for Intelligent Cars

Even though ultrasound has been studied by scientists for many years, its capabilities in practical applications are yet to be fully harnessed. Bats have been nature's prototypes for sound-based navigation. They rely on echolocation to detect obstacles in flight, forage for food, and see in dark caves. By mimicking the bat's technique, an ultrasonic solution was developed that provides echolocation and enables the detection of multiple objects in three-dimensional space.

Close-Range Perception
How can it be possible that a staggering one out of five motor vehicle accidents takes place in a parking lot?


Even at the slow speeds usually found in parking scenarios, the driver cannot perceive his environment flawlessly, despite parking-assistance functions like cameras and PDCs. Tight parking spots in car parks or crowded parking areas in front of shopping malls maximize everyday driving challenges. Car manufacturers are beginning to support their cars with the latest sensor technologies so that multiple tons of moving metal are not left without eyes. Sensors are currently used to help drivers steer more safely and guide them into a parking spot via sound and light signals. In order to advance in areas like autonomous driving, mapping, and collision avoidance, sensors now need to detect more complex environmental scenarios, like steering a car through a construction site or reliably detecting people in a crowded parking area. As every ride starts and ends in a parking position involving some kind of parking maneuver, it is crucial to safely perceive a car's immediate environment.

While existing sensor technologies mostly focus on covering long distances, the immediate environment around a car (0 to 5 m) is often left out of the discussion. This is where advanced ultrasonic sensor systems come into play. In addition to measuring the distance to an object, advanced ultrasonic sensors can also calculate the horizontal and vertical position of an object relative to the sensor itself (i.e., providing 3D coordinates for detected objects).

[Figure: Ultrasonic perception for automobiles. The principle of echolocation: (1) the sensor emits ultrasound waves, (2) the waves reflect off a person or object, (3) the reflected waves are received back at the sensor.]

In general, ultrasonic sensors make use of high-frequency sound waves for a range of applications. For distance measurement, a typical ultrasonic sensor uses a transducer to periodically send out ultrasonic pulses into the air. These pulses get reflected from objects in the detection area of the sensor and are received back by the sensor. By measuring the time it takes an ultrasonic pulse to travel to the object and be captured back at the sensor, the distance to the object can be calculated. This principle is called time-of-flight measurement.
[Figure: Point-cloud example recorded by the automotive kit (real data recorded by the sensor).]

Conventional ultrasonic sensors used for parking assistance record only one-dimensional data: the distance to the closest object. Azimuth and elevation angles of objects are not calculated with this method, and vertical opening angles are severely limited. Thus, many objects, such as curbs and low-lying obstacles, are not picked up by 1D sensors.

The localization of objects in 3D space allows ultrasonic sensors to detect and distinguish among multiple objects in a single scan. In that sense, the principle of these sensors is similar to echolocation as used by bats. By comparison, a typical ultrasonic sensor will usually only measure the distance to the nearest object, and a limited opening angle is usually applied for this type of sensor. In contrast, advanced ultrasonic sensors allow for opening angles of up to 160°. The sensors provide reliable, rich 3D data for the close-range environment around a vehicle. They are therefore well-suited for applications in the automotive field and add yet another level of safety and redundancy to conventional radar, lidar, and camera technologies. The sensors can replace or complement existing optical sensing systems, providing both redundancy and an improved level of accuracy compared to standard ultrasonic sensors in various autonomous navigation applications.
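To make "providing 3D coordinates" concrete, here is a hedged sketch of how one range measurement plus azimuth and elevation angles convert to Cartesian coordinates. The axis convention and example values are our assumptions for illustration, not the vendor's actual output format:

    import math

    SPEED_OF_SOUND = 343.0  # m/s in air

    def to_xyz(echo_time_s, azimuth_deg, elevation_deg):
        # Sensor-relative (x, y, z) from time of flight plus two angles.
        # Assumed convention: x forward out of the sensor, y left, z up.
        r = SPEED_OF_SOUND * echo_time_s / 2.0  # time-of-flight range
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        x = r * math.cos(el) * math.cos(az)
        y = r * math.cos(el) * math.sin(az)
        z = r * math.sin(el)
        return x, y, z

    # An echo after 17.5 ms, 20 degrees to the left and 5 degrees up:
    print(to_xyz(0.0175, 20.0, 5.0))  # ~(2.81, 1.02, 0.26) m

Repeating this for every echo in a scan yields the kind of 3D point cloud shown in the figure above.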
The sensor data can also be used for additional comfort features, e.g. gesture control to open doors and trunks, positioning the vehicle for automated charging (for EVs), and collision avoidance for automatically opening doors.

Passenger Monitoring in the Car Interior
With further improvements in the autonomous-driving space, the behavior of the driver is also likely to change. While drivers today must be completely focused on the road—ready to react at a moment's notice—this is likely to change once cars are able to drive and steer fully automatically. Drivers would then be able to lean back and relax, work on their computers, turn to their children in the back seats, or temporarily enjoy an expanded infotainment program.

Such an eventuality puts new demands on assistance systems. Just like the numerous sensors available for analyzing a car's external environment, similar knowledge is needed for the interior in order to realize a more secure and intuitive interaction experience. In this context, the use of 3D ultrasound again provides interesting advantages. Data gained from an ultrasound sensor can be used to identify the number of people sitting in the car, their size, and their posture. Based on the information regarding where people are sitting and their physical characteristics, airbags could be adjusted to individual body sizes and further improve safety.

The technology does not collect any personal data, since ultrasound cannot evaluate visual input and instead only records anonymous point-cloud data. This is an especially important consideration in terms of privacy and data protection. Furthermore, gesture recognition in the interior of the car can be used for information and entertainment purposes, like controlling the car's infotainment systems with simple pre-configured actions.

There is no doubt that autonomous vehicles of the future will need more assistance from sensors to safely operate in populated places. Whether you are living in a big city, where detecting people and accurate parking play a major role, or in the countryside, where automated charging is indispensable, 3D ultrasonic sensing will provide many benefits to the everyday life of future drivers.

This article was written by Andreas Just, Head of Marketing for Toposens, Munich, Germany. For more information, visit http://info.hotims.com/76502-121.





HEAT SEEKERS

[Image: Current-generation thermal-imaging sensors, such as AdaSky's Viper shown, are palm-sized and cost in the low hundreds of dollars at scale, companies claim.]

While engineers debate the use of thermal-imaging sensors for ADAS, their capability and value are being proven for AVs of all levels.

by Lindsay Brooke

What specific sensor types will comprise the advanced driver-assistance systems (ADAS) of the 2020s? That's a controversial subject among engineers who are developing SAE Level 2 and 3 (and the so-called "L2+") ADAS sensing suites for new vehicles. Many of them believe that visible-light cameras fused with radar will suffice to deliver the object-identification accuracy, redundancy—and cost effectiveness—that OEMs and the driving public expect of ADAS-equipped vehicles.

But a case is building for additional sensing capability, particularly for automatic emergency braking (AEB) and pedestrian detection. Those safety-critical functions currently rely on camera-radar inputs to "see" ahead. They enable the vehicle to react to a range of scenarios—from stalled traffic on a highway to humans and animals suddenly appearing in the road. (In Michigan alone, there were 53,464 traffic accidents involving deer in 2018, up 14% from 2016, according to state DoT data.)



Pedestrian fatalities in the U.S. are alarming: one dies every 88 minutes, on average, in traffic crashes. In 2018, 6,283 pedestrian lives were lost, up from 5,977 in 2017. That's the most since 1990 and represents an increase of more than 35% since 2008. And three-quarters of all pedestrian fatalities occur after sunset, NHTSA reports.

[Image: Rainy street scene thermal image showing heat emissions (in white) from vehicles, lamps and humans.]

Detecting people and critters without fail in rain, snow, fog and darkness can be daunting for systems solely based on optical-and-radar-fused sensing. In October 2019, the American Automobile Assoc. tested several production AEB systems in various scenarios. During daylight tests, the test vehicle driving 20 mph (32 kph) struck the target 60% of the time. In nighttime conditions, the vehicle under test traveling at 25 mph (40 kph) hit the soft pedestrian target 100% of the time. (https://www.aaa.com/AAA/common/aar/files/Research-Report-Pedestrian-Detection.pdf)

"If NHTSA rules truly 100 percent all-weather performance for pedestrian detection by 2021, to meet its 5-Star safety criteria, the industry will have to adjust its sensing strategies," said Raz Peleg, sales director at AdaSky, an Israel-based developer of thermal-imaging systems. "In some places there is bad driving weather for half the year," he noted. For those reasons, thermal-imaging technology is under consideration for both ADAS and SAE Level 4 self-driving AV applications.

Outdistancing Headlamps
A family of thermal sensors used in automotive, known as Far Infrared (FIR), operates in a long-wavelength range outside the visible-light spectrum to "see" the relative intensities of heat (infrared) energy being emitted or reflected from an object—including buildings, parked vehicles and pavement. The infrared spectrum consists of a near-infrared (NIR) section, with wavelengths of 0.75-1.0 μm; a short-wave infrared (SWIR) section, with wavelengths of 1-3.0 μm; a mid-infrared (MIR) section, with wavelengths of 3.0-5.0 μm; and the far-infrared (FIR) section, with wavelengths of 7.5-14.0 μm.
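Those band boundaries can be captured in a small lookup table. This sketch is purely for reference (the ranges are exactly as listed above; gaps between bands are left unclassified):

    # Infrared bands and wavelength ranges (micrometers), as listed above.
    IR_BANDS = {
        "NIR":  (0.75, 1.0),
        "SWIR": (1.0, 3.0),
        "MIR":  (3.0, 5.0),
        "FIR":  (7.5, 14.0),  # the band automotive thermal sensors use
    }

    def classify(wavelength_um: float) -> str:
        for band, (lo, hi) in IR_BANDS.items():
            if lo <= wavelength_um <= hi:
                return band
        return "outside the listed IR bands"

    print(classify(10.0))  # FIR - near the peak emission of room-temperature objects
    print(classify(0.85))  # NIR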


Heat-emitting objects stand out against the background. Thermal sensors are particularly effective when used in less-than-ideal lighting situations and are outstanding in smoke and darkness. Identifying the thermal radiation emitted, FIRs can detect objects at distances well beyond those of conventional headlamps, says one OEM engineer. "Object detection for L2 through L5 must have 100 percent fidelity in all conditions," he told Automotive Engineering in an email not for attribution. "No one can apologize for cameras and radars not seeing a child stepping out from behind a parked vehicle in dense rain, for example. For that reason, we're very interested in thermal imaging."

Thermal + Radar AEB Testing at ACM
To develop a proof-of-concept automatic pedestrian-detection system that fuses radar and thermal camera data and can estimate the distance of a pedestrian from the front of a test vehicle, FLIR Systems contracted VSI Labs. The test vehicle was programmed to stop automatically when the dummy pedestrian was at a proximity the system determined to be an emergency-stop distance.

Initial tests were completed in December 2019 at the American Center for Mobility (ACM) near Detroit. The test design was based on Euro NCAP, but not all testing requirements were met: weather during the testing period was colder than the specified testing temperature range, and wet, slick snow-covered roadways and wind interfered with the test fixtures.

Three test cases were conducted in both daylight and darkness, giving six datasets and 35 total test runs using an adult Euro NCAP Pedestrian Target (EPTa):

1. EPTa stationary in the middle of the test vehicle's lane, in Car-to-Pedestrian Longitudinal Adult 50% (CPLA-50) tests.
2. EPTa crossed in front of the vehicle from the roadside, in Car-to-Pedestrian Far Side Adult 50% (CPFA-50) tests.
3. EPTa crossed in front of the vehicle from an obstructed position, in Car-to-Pedestrian Far Side Adult Obstructed 50% (CPFAO-50) tests.

[Image: FLIR's testing at the ACM facility showed promise for future pairing of thermal sensors with AEB.]

Test results were promising, according to VSI. In all runs for all test cases, the car's AEB system successfully brought it to a stop before impacting the EPTa. Additional testing is planned for spring/summer 2020, following AEB algorithm optimization and EPTa heating improvements, and when weather is within test parameters.

—LB
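For a sense of the distances involved in such an emergency stop, a rough textbook estimate can be sketched as below. This is not VSI's algorithm; the friction coefficient and system reaction time are assumed values:

    # Rough emergency-stop distance: latency travel plus braking distance,
    # d = v * t_react + v**2 / (2 * mu * g)
    G = 9.81        # gravitational acceleration, m/s^2
    MU = 0.7        # assumed tire-road friction, dry pavement
    T_REACT = 0.2   # assumed AEB system reaction time, s

    def stop_distance_m(speed_kph: float) -> float:
        v = speed_kph / 3.6
        return v * T_REACT + v ** 2 / (2 * MU * G)

    # At the AAA test speeds cited earlier:
    print(stop_distance_m(32))  # 20 mph -> ~7.5 m
    print(stop_distance_m(40))  # 25 mph -> ~11.2 m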

In AdaSky's testing, the company's latest Viper sensor has recognized pedestrians at distances greater than 300 yards (275 m), or about twice the range of low-beam headlamps. The latest automotive-grade thermal sensors made by FLIR Systems offer similar object-recognition capability, claims technical project manager Kelsey Judd. Considered the mobility industry's incumbent, Oregon-based FLIR has been selling thermal cameras into automotive—mostly as a driver-warning aid—since 2002 through a partnership with Veoneer (formerly Autoliv).



Foresight, another Israeli company in this space, offers a binocular sensor array that fuses thermal images with those from the visual spectrum to produce a clear 3D view, the company says. Its stereoscopic vision technology uses two synchronized cameras to generate a depth map, enabling extremely precise object detection.

Evolved from military sensor tech, FIRs use deep learning and machine-vision algorithms. They are a necessary complement to cameras and radar for ADAS, as well as to the camera/radar/lidar "triad" for SAE Level 4 self-driving AVs, maintains Peleg, a former F-16 pilot for the Israeli Defense Force.

"Our system can be fused with those other sensors. It's an essential combination; our system can't read street signs, for example, while cameras can," he said. "But in low-visibility weather, and in corner cases such as with sun blinding and oncoming headlamps and exiting tunnels, thermal imaging does much better—and we do it uninterrupted, 24 hours a day."

He reports that a European carmaker and a North American OEM will have AdaSky's Viper in production vehicles by 2021. Peleg and other experts draw distinctions between the thermal-imaging technology being readied for ADAS and AV systems and that used by some OEMs as night-vision aids.

"Those [night-vision systems] display their information on a head-up display for the driver to see," Peleg said. "Our image, of higher detail quality, is not displayed to the driver. Instead, it's shown to a software layer which interfaces with the vehicle's subsystems. As with AEB, the driver doesn't see anything; the vehicle only reacts."

Lingering Skeptics
Demonstration drives of AV sensor systems are best conducted in bad weather, for the best real-world evaluation of system performance. Our drive in a Ford Fusion kitted with a roof-mount AdaSky Viper and data-acquisition hardware was in a perfect confluence of rain, drizzle and fog. Peleg piloted the car while we watched a dash-mounted screen. It displayed what the car's software was seeing: heat signatures of everything in an urban setting during lunch hour. Traffic, pedestrians and commercial delivery drivers stood out brightly against a grey, cluttered background. Images we captured from the demo car's screen show how the thermal sensor consistently picked out humans on or near the road at a distance.

[Pull quote: "Object detection for SAE Level 2 through Level 5 must have 100% fidelity and reliability in all conditions," asserts Raz Peleg of AdaSky.]

Sensor performance continues to improve. The current-gen AdaSky Viper FIR offers 640 x 480-pixel resolution at refresh rates of 60 frames per second. It can detect a delta-T as small as 0.05 kelvin, making it capable of classifying road-surface conditions ahead of the vehicle. FLIR's Judd describes the same sensitivity: "It can discern the temperature difference between your middle and index fingers." Whether the object is a human, a deer, or a dog, the computer doing the analytics makes those decisions: it looks at temperature from the sensor's data and at visible color, shape, movement patterns and other inputs.

[Image: Sensor delta-T sensitivity is at millikelvin levels, says FLIR's Kelsey Judd.]

In December 2019, FLIR Systems and ANSYS announced a partnership to integrate a fully physics-based thermal sensor into ANSYS' driving simulator to model, test and validate thermal camera designs within what the companies call "an ultra-realistic virtual world." Their aim is to reduce OEM development time in optimizing thermal cameras for use with AEB and pedestrian detection.

On picking out a 98.6°F human walking across an Arizona road on an equally hot summer day, the experts agree: it can be a challenge to differentiate the two. It's why multiple sensor types that complement each other are necessary, they said. Peleg said AdaSky's image sensing processors (ISP) are key to detailing an object's emissivity. "We have patents on this chip, which is manufactured for us by STMicroelectronics, same ones doing it for [machine-vision specialist] Mobileye." Such chips draw only 750 milliamps, he added.

Not all of the industry is convinced thermal-imaging sensing is ready—and right—for near-term ADAS deployment. "Although it does fill a 'white space' among the current sensor technologies, I think we serve a very good range with our camera, radar and lidar," said Marcus Christensen, North America customer chief engineer at Continental Automotive, when asked about his company's plans in this area.

[Image: ZF VP Aaron Jefferson believes thermal imaging is not yet cost-effective for SAE Level 2.]

"Thermal sensing is ideal for detecting objects behind the vehicle—in fact, I think rear-facing thermal imaging is a viable use case," observed Aaron Jefferson, VP of product strategy, global electronics, at ZF. "It will definitely be needed for Level 4. And there might be specialized use cases, but at low volume, particularly for Level 2. At the moment, however, it doesn't enhance Level 2 or Level 2+ capability, where the driver's always responsible, enough to justify its cost. It's a hardware and functionality cost leap."

But series-production vehicles equipped with the technology, appearing in the next year, may change skeptics' minds.


Radar’s Evolving Role
in ADAS and the
AV Future
by Rick Kundi

The automotive radar sensor market is growing at a 21% compound annual growth rate, putting greater demands on chip design, testing, and module deployment.

Until recently, various vehicle OEMs were vying to be first to put self-driving cars on the road by the magical turn of the decade. But as 2020 arrived, the bullish tone switched to a more cautionary outlook. While the truly self-driving SAE Level 5 autonomous vehicle (AV) without a steering wheel is still years away, innovations needed to support the intelligent self-driving car of the future continue to emerge and improve. Developers grapple with the need to comply with safety and security standards as the average car sees more sensor technology connecting it with onboard devices as well as with its environment. Within the increasingly capable sensor suite (see table below), automotive radar is now an indispensable technology that enables sub-systems for advanced driver assistance systems (ADAS).

About a decade ago, 24-GHz radar sensors entered the scene and began to impress engineers, enabling all-weather ADAS functionality such as blind-spot detection, lane-change and parking assistance, and collision avoidance. Minimally affected by lighting conditions or bad weather, radar quickly gained favor over camera technology for ADAS applications.

RADAR
  Technology: detection of distance (range) and motion (velocity and angle) by millimeter waveforms
  Application: adaptive cruise control; automatic emergency braking systems; blind-spot detection; parking assistance; surround view; rear collision warning; cross-traffic alert

CAMERA
  Technology: recognition and classification by images
  Application: traffic-sign recognition; lane-keep systems; parking assistance; blind-spot detection; ACC; AEBS

LIDAR
  Technology: 360-degree 3D view by laser light
  Application: emergency brake assist for pedestrians; crash-imminent braking; mapping



[Figure: 24- and 77-GHz frequency band usage in automotive radar.]

Radars Evolve and Proliferate
The essence of automotive radar is the ability to scan the three-dimensional space and gather information about other road users and stationary objects. This ranges from picking up the presence of other vehicles, pedestrians, and pets to details such as location, speed, direction, shape, and identity. In most implementations, the radar system generates an RF/microwave or millimeter-wave signal and beams it toward the target in question. The same antenna that transmitted the signal collects the feedback, which triggers the electronic control units on board the car to activate the appropriate ADAS response: a lane-change or vulnerable-road-user alert, for example, or a trigger to activate adaptive cruise control to help drivers maintain a safe platooning distance.
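At its core, the scan-and-detect principle reduces to two measurements: round-trip delay for range and Doppler shift for radial velocity. A minimal sketch, assuming a 77-GHz carrier and hypothetical example targets (illustrative constants, not from the article):

    # Range and radial velocity from the two basic radar measurements.
    C = 299_792_458.0   # speed of light, m/s
    F_CARRIER = 77e9    # typical automotive radar carrier, Hz (assumed)

    def range_from_delay(round_trip_s: float) -> float:
        # The echo travels out and back, hence the divide-by-2.
        return C * round_trip_s / 2.0

    def velocity_from_doppler(doppler_hz: float) -> float:
        # Radial velocity from Doppler shift: v = f_d * wavelength / 2.
        wavelength = C / F_CARRIER
        return doppler_hz * wavelength / 2.0

    # A car 100 m ahead returns an echo after roughly 667 ns:
    print(range_from_delay(667e-9))    # ~100 m
    # A +5 kHz Doppler shift at 77 GHz is a ~9.7 m/s closing speed:
    print(velocity_from_doppler(5e3))  # ~9.7 m/s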
Due to spectrum regulations by the European Telecommunications Standards Institute (ETSI) and the Federal Communications Commission (FCC), the 24-GHz wide bandwidth and UWB bandwidth will not be available for new automotive radar devices after January 1, 2022. These changes are spurring market growth for 76-81-GHz band usage in automotive radar applications. The higher frequency bands allow designers to create smaller sensor packages, with more bandwidth available to achieve greater resolution of detected objects.

[Image: Grappling with more sensor test challenges as cars evolve toward Level 5 autonomous driving.]

The automotive radar sensor market is growing at a compound annual growth rate of 21%. By 2023, it is expected to exceed $8 billion, according to Microwave Journal, outstripping other radar sectors like environmental monitoring, security surveillance, and aerospace defense. Meeting this market growth without compromising on performance and reliability requires rigorous testing of each radar from chip design to module deployment.
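As a quick sanity check on those market figures (the 21% rate and the 2023 value are from the article; the implied base year is our own arithmetic):

    # Compound annual growth: value_n = value_0 * (1 + rate) ** years
    RATE = 0.21
    MARKET_2023 = 8.0  # $ billion, per Microwave Journal

    # Implied market size four years earlier, in 2019:
    base_2019 = MARKET_2023 / (1 + RATE) ** 4
    print(f"Implied 2019 market: ${base_2019:.1f}B")  # ~$3.7B

    # Equivalently, a ~$3.7B market compounding at 21% per year:
    for year in range(2019, 2024):
        print(year, round(base_2019 * (1 + RATE) ** (year - 2019), 2))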
A self-driving car can have up to 24 radar sensors. Interference effects can arise between sensors within the same car or with other onboard devices. Even marginal errors in measurement, such as a wrong angle calculation at a busy road junction, can result in dire consequences. Therefore, engineers must characterize the behavior of each new radar module before mass production and installation in the vehicle. These days, engineers use radar emulation equipment to generate and analyze different signals, with software to create test cases for different conformance-test standards.

Some test managers also use intelligent laboratory-operations software to help them manage the thousands of tests for their devices under test (DUT). This can help them precisely determine prototype DUT readiness for mass production. At the functional-test level, engineers can now simulate multiple targets for radars operating in the 76-81-GHz band. This allows both radar module developers and carmakers to test a multitude of realistic scenarios before the car rolls onto real roads.

Current radar technology still struggles to provide the resolution needed to discern different objects. This is where lidar does a better job. A lidar sensor uses a pulsed laser to detect objects, usually with higher resolution than radar. Lidar's higher degree of granularity can provide a much more complete view of the vehicle's environment. On the downside, lidar is generally more expensive than camera and radar sensor technology. New players are trying to produce cheaper lidar, with some innovations going for under $1,000, versus typical prices in the ~$10,000 range. Lidar has other limitations, including high data rate and power consumption, and poorer performance in low lighting.

It will be interesting to see if new disruptive technologies help make lidar cheaper and better for wider adoption.

[Image: From sensor to cellular networks, seamless connectivity is key to enabling autonomous driving.]

Enabling the Future Connected Car
While sensor technology plays a vital role in enabling safety and in-cabin comfort, it is high-bandwidth, low-latency cellular vehicle-to-everything (C-V2X) communication that will help realize the vision of truly autonomous and connected cars. C-V2X connects the vehicle's sensors to driving data such as speed, location and traffic, to other cars, and to data such as real-time updated maps. With a supercomputer on wheels, testing for signal integrity, performance reliability, and automotive cybersecurity—from the physical backplane all the way through the various protocol layers—becomes critical.

4G LTE technology, and eventually 5G, will form the infrastructure to support C-V2X capabilities (see figure). According to S&P Global Market Intelligence, 4G LTE is expected to surpass 80% of total in-vehicle cellular systems in 2024. The industry is concurrently banking on 5G technology, which advocates believe will carry mission-critical communications faster and more reliably for autonomous vehicles.

The potential of 5G-enabled C-V2X is exciting, and the road ahead is likely to see more collaboration among autonomous vehicle developers, who face common challenges in meeting new and evolving industry standards. Global standards organizations including SAE International and IEEE are working on new standards for artificial intelligence in autonomous vehicles.

Initiatives to introduce self-driving vehicles currently are confined to service fleets operating in safe zones. It may take several more years to see self-driving cars alongside those still controlled by human drivers. Meantime, more work lies ahead for automotive design and test engineers to juggle the jungle of conformance standards, alongside customer expectations for the creature comforts of a self-driving gizmo on wheels, with all risks of malfunctions and accidents mitigated.

Rick Kundi is a solutions marketing engineer within the Automotive and Energy Solutions business at Keysight Technologies, responsible for automotive radar, Ethernet and software solutions. He has over 10 years of industry experience in application engineering, sales, and marketing, and has presented at professional events covering topics such as RF and wireless basics, and advanced automotive radar analysis and generation solutions.


Lidar in a Flash
by Ryan Gehm

Continental delivers a short-range 3D flash lidar sensor that's expected to find increasing application in commercial vehicles and off-highway machines in 2020.

A precise three-dimensional profile of a vehicle's surroundings is a fundamental prerequisite for automated driving. A new high-resolution 3D flash lidar (HFL) sensor developed for use at close range—50 m (164 ft) or less—delivers on this capability, according to Thomas Laux, head of business development and sales for the HFL segment in Continental's Advanced Driver Assistance Systems (ADAS) business unit.

"This is automotive grade and solid state, meaning no moving parts. It's a bunch of semiconductors, which is ideal for commercial vehicles and even more so for the off-highway environment—they don't have a lot of hours [of operation] but their environments can be pretty rigorous," said Laux. The IP6K9K packaging is very important for these applications, he said, as is a 3D Global Shutter feature that eliminates motion distortion and enables persistence mode and geo-registration of the point cloud.

[Image: Continental's Contadino agricultural robot concept is equipped with the HFL110 lidar sensor along with radar, real-time kinematic GPS, camera and ultrasound to ensure accurate object detection and classification, tracking and a GNSS accuracy of 3 cm.]

Laux, who is located in Carpinteria, Calif., as part of Continental Advanced Lidar Solutions U.S. LLC, was working for a Tier-2 supplier acquired by Continental a few years ago.



"We started off with 22 engineers and now we have 300, working across domains on everything from the software to the test and validation, which is one of the most crucial pieces," Laux said. About 25% of the activity in developing the automotive-grade lidar sensor is devoted to test and validation, he added.

"For this sensor we designed short range first. We thought this was a bigger issue, which turned out we're right," Laux said. The supplier is providing samples of its HFL110 lidar sensor to commercial vehicle (CV) and off-highway manufacturers for evaluation. Application markets include agriculture, construction, mining, UAV delivery and precision infrastructure inspection.

Continental's patent-protected flash lidar is expected to be in mass production by the end of 2020. The supplier already has launch customers for automotive and off-highway, Laux said, but not yet on the CV side.

[Image: The CUbE, shown here on display at NACV 2019, is Continental's autonomous electrified development platform for urban "first or last mile" mobility.]

Particulars of Flash Lidar
Complementing the supplier's existing ADAS sensor suite, including radar and 2D color sensors, the HFL110 sensor delivers a detailed 3D profile in 330 nanoseconds per frame, regardless of lighting or weather conditions. The sensor generates a high-res 3D point cloud 25 times per second within its 120° x 30° field of view. Its multiple distance measurements capture 4,096 contiguous pixels (128 x 32 pixels) of depth data in real time across the field of view.
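Two of those specifications can be cross-checked with simple arithmetic. This is our calculation, not Continental's spec sheet: the quoted frame time is close to light's round trip at the sensor's 50-m maximum range, and pixel count times frame rate gives the depth-data throughput:

    # Light round-trip time at the HFL110's 50-m maximum range:
    C = 299_792_458.0              # speed of light, m/s
    round_trip_s = 2 * 50.0 / C
    print(round_trip_s * 1e9)      # ~333 ns, in line with the ~330-ns frame

    # Depth-data throughput at 128 x 32 pixels, 25 frames per second:
    pixels = 128 * 32              # 4,096 pixels per frame
    print(pixels * 25)             # 102,400 depth measurements per second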
"We flash a laser pulse for 4 nanoseconds to illuminate all of those pixels, 4,096 pixels with a single flash. We use something called an 'engineered diffuser' that spreads the light," Laux said, adding that the unit has up to four returns to counter impediments such as rain, spray, dust, smoke or fog. A built-in heater and blockage detector, and an optional washer system, also ensure reliable imaging.

The contiguous pixels tolerate nearby high- or low-reflectivity surfaces, producing a point cloud and/or an object list to define and track the vehicle's surroundings and moving objects within it. The sensor's range is 22 m (72 ft) at 10% reflectivity.

The sensor operates at 1,064 nm in the near-infrared range, which is Class 1 eye-safe, and the unit weighs approximately 700 g (24.7 oz). "It doesn't replace a 2D sensor; we can't see color with this," Laux explained. "As a human you interact with the real-world environment. Similarly, with vehicles we need lidar, radar, 2D—a full sensor suite."

[Image: Immune to vibration or speed distortion, Continental's high-res HFL110 3D flash lidar delivers a 50-m range and 120° x 30° field of view.]


So, is lidar a must to achieve SAE Level 4 autonomy? "I know Elon Musk might have a different opinion, but why would you cripple yourself and operate either without eyes or without ears," Laux said. "I think the more redundancy you have in different wavelengths, where if one doesn't work you've got a fallback, is critical for Levels 3 and 4."

For dense urban environments, two HFL110 units can be mounted to provide imaging with overlap in the middle. "You can get really good geometry; you can read the FedEx on a delivery truck," he said. "That kind of geometry is necessary for merging environments, when things are whipping by really fast, when it's raining or you get truck spray."

The new sensors currently are "a bit expensive," which is typical of a launch product, Laux noted, referencing the high cost of radar over a decade ago. "Volume and manufacturing enhancements will bring cost down. It will happen."

Off-Highway Machine Vision
The HFL110 3D flash lidar featured prominently at Agritechnica 2019 in Hanover, Germany. In the John Deere innovation section of its booth, the HFL110 was demonstrated on a concept tractor, and a large wall-mounted monitor displayed what the lidar sensor detected on the OEM's stand. Continental received strong interest from a number of equipment manufacturers, said Laux, noting that several initiatives are under way but cannot be commented on at this time.

[Image: In the John Deere innovation section of its booth at Agritechnica, a concept tractor demonstrated the HFL110 sensor (seen just under the Deere logo).]

An agricultural robot concept, the Continental Contadino, also employs the HFL110 sensor. Introduced at Agritechnica, the fully electric and autonomous Contadino features a modular design that allows exchangeable implements and different track widths. The implement carrier can be used for different light-duty tasks such as seeding, weeding, spraying, fertilizing and monitoring. Smart farming is possible with precise application of pesticides and fertilizers or a selective sowing process.

The robot is equipped with lidar scanner, radar, real-time kinematic GPS, camera and ultrasound to ensure accurate object detection and classification, tracking and a global navigation satellite system (GNSS) accuracy of 3 cm (1.2 in). The tools are connected to the robot via open interfaces. This connection provides the implement with electric power and acts as a data line enabling access to sensor data and cloud communication.

Several robots can work collaboratively in the field in fleet operation. A trailer that transports the autonomous machines to the field also functions as a charging and refill station for seeds, fertilizers, etc. Automatic recharging allows 24/7 operation. A cloud connection of the fleet allows mobile monitoring and a constant overview of the process.

Contadino is currently in the prototype phase. Continental is looking for partners in the agricultural sector to realize first field applications in 2020.



Bandwidth for Sale
by Terry Costlow

The FCC and transportation industry clash over vital vehicle-communications boundaries.

Vehicle-to-vehicle/infrastructure (V2X) communication often is touted as a critical technology for automated driving, and even to derive the utmost benefit from advanced driver-assistance systems (ADAS). But despite plenty of verbal support from industry developers, there's been little actual deployment. What critics label "foot-dragging" by the industry now may catastrophically stall market implementation.

V2X technology had a bright future in 1999, when the FCC allocated 75 megahertz of spectrum in the 5.9 GHz radio band for Dedicated Short-Range Communications (DSRC). Years of development and demonstrations seemed ready to pay off in 2014, when the National Highway Traffic Safety Admin. (NHTSA) appeared ready to issue a mandate requiring deployment.

But the U.S. mandate never came, nor did a European edict. That created uncertainty that led to chaos in the market. Now, DSRC proponents are trying to stymie an FCC plan to sell a portion of the 5.9 GHz range originally earmarked only for transportation purposes. Then came another wild card: while DSRC languished, a cellular-based alternative, cellular V2X (C-V2X), came onto the scene.

Though there's uncertainty about the prospects of V2X, advocates say the technology can do a lot to prevent accidents today and improve the operations of autonomous vehicles when they emerge.

[Image: DSRC is NXP's choice for chips that let vehicles communicate with others.]


[Image: Vehicle-to-everything (V2X) communication can help reduce accidents, particularly at intersections.]

"Driverless vehicles cannot exist without being connected," said Guillaume Devauchelle, VP of innovation and scientific development at Valeo. "When you're running at high speed, it tells you about things happening at distances that are far longer than on-vehicle sensors [can detect]. That helps guarantee safety and it makes driving more pleasant. You can slow down one kilometer before construction work instead of quickly stopping when you get close."

At the SAE Government/Industry Meeting in Washington in January, Continental's Bettina Erdem explained that 45% of all accidents occur at intersections, where V2X can provide far more safety-related information than on-vehicle sensors. She highlighted the importance of V2X by saying that 60% of serious and fatal intersection accidents cannot be prevented by onboard safety systems; they need V2X input.

Though it's largely viewed as an important element in safety and autonomy, V2X has to date seen little acceptance. Volkswagen announced plans to ship cars with DSRC capability, but Toyota scuttled its plan to deploy DSRC in the U.S. Still, proponents feel there are many reasons to deploy.

While OEMs and infrastructure providers waited for DSRC to catch fire, others stepped into the void. Industries hungry for more wireless bandwidth convinced the FCC to sell some of the spectrum set aside for DSRC, while concurrently, mobile-phone suppliers developed C-V2X, which leverages on-board modems to provide many of the benefits of DSRC with reduced design-in and dedicated-infrastructure costs.

Bandwidth for Sale
FCC chairman Ajit Pai proposed making the lower 45 MHz of the 5.9 GHz band available for unlicensed uses such as Wi-Fi, while retaining the upper 30 MHz of the spectrum for transportation use. The upsides of the potential sale could be significant: a 2018 study by the Rand Corp. estimated that opening automotive's 5.9 GHz band for Wi-Fi could provide gains in economic welfare for consumers and producers ranging from $82.2 to $189.9 billion.

The proposed auction is equally huge for DSRC's outlook. If the FCC proceeds with its plan and the bandwidth is used for non-automotive services, interference from Wi-Fi could render the V2X spectrum "useless," according to John Kenney, director and senior principal researcher at Toyota InfoTech Labs. He told attendees at the SAE conference that if automotive lost 60% of the V2X spectrum, the industry would lose 60% of potential V2X benefits. Proponents need to quickly coalesce around a workable deployment strategy, he added.

"There needs to be a consensus regarding V2X technology or people are not going to deploy. Once we have a consensus, we can go to the FCC and say, 'This is what we want you to do,'" Kenney said.

While the FCC ponders its auction, fellow government officials at NHTSA contend that the auto industry needs the bandwidth in question.
NHTSA division chief Bob Kreeb predicted that selling the bandwidth will "limit deployment of existing technology that's ready and usable. We're disappointed in the FCC's current proposal that they've outlined. We also think it's going to limit innovation and creativity."

Another plus for DSRC is that it's undergone plenty of real-world testing, which helped developers weed out bugs and verify performance. Maturity often is a desired trait given the high reliability levels demanded in automotive markets. Supporters also note that DSRC has far better latency performance than C-V2X, which can be critical in high-speed safety environments.

"Some of the key performance parameters of C-V2X are not as good as DSRC," said Huanyu Gu, senior product manager for V2X at NXP Semiconductors. "While over time C-V2X performance may further improve, it will take many more years. The question is: Should society be kept waiting for C-V2X to mature, while the mature technology, DSRC, is ready for deployment?"

Cellular Challenge
The challenge from C-V2X began gaining ground around 2017. Cellular and chip providers created specifications, and groups such as the 5G Automotive Association (5GAA) voiced endorsement. Regulators in the Chinese automotive market are actively supporting the technology, and Tier 1 suppliers are following suit now that a Chinese mandate appears imminent.

"China represents about one third of the market; everyone will have to develop C-V2X for China," Devauchelle said. "In the U.S. and Europe, it's difficult to invest. It's not that one is that much better, it's more a matter of regulations and a battle between different stakeholders."

Momentum for C-V2X also comes from Ford, which is planning C-V2X rollouts while supporting the FCC's spectrum auction. On the semiconductor side, major cellular chip suppliers including Qualcomm, Intel, Samsung, Huawei, and CATT/Datang have 5GAA's C-V2X technology in their roadmaps.

Foggy Future
Support for both DSRC and C-V2X creates questions for product planners at all levels. Design teams don't want to support different technologies in various geographies; that adds complexity for design and support teams. On the upside, the price of implementing either option isn't too onerous. "The add-on cost to equip a car is around $15-$20," Devauchelle said.

Automakers would like to see one technology emerge globally so they can trim overhead and benefit from volume pricing. As the two specifications vie for acceptance, only one can operate efficiently in each region. The benefits of V2X grow as more vehicles and roadside infrastructure share data. Nor is it practical for a country or region to have vehicles with incompatible technologies sharing the roads: accidents could still happen when vehicles with different technologies encounter one another.

"Cars that utilize different technologies cannot 'talk' to each other, which greatly compromises the effectiveness of each technology," Gu said. "To utilize both technologies within a given region, a theoretical solution could be to equip V2X stations that support both technologies and translate one into the other. The additional latency caused by translating one standard to the other will likely make such a solution a non-starter for safety-critical applications."

[Image: Automated-truck developer TuSimple won't rely on V2X in its forthcoming autonomous haulers.]

All the uncertainty surrounding V2X is driving some autonomous vehicle developers to eschew the technology. TuSimple, a trucking startup that plans to begin delivering freight hauled by its driverless vehicles next year, isn't settling on either technology. Instead, design teams will rely strictly on on-vehicle sensors.

"V2X is nice to have, but we must be able to determine our surroundings without it," said Chuck Price, TuSimple's chief product officer. "Even if we just talk to our own vehicles, V2V can enhance operations. But given all the things that can happen, we have to assume we can't talk to other vehicles."


ZF Establishes Level 2+ ADAS
by Lindsay Brooke

Cost-effective technology solutions to meet diverse customer needs are what the "new pragmatism" in driver-assistance tech is all about. Senior VP Aine Denari explains.

The wall of hype surrounding the self-driving future has given way to a new pragmatism, and engineers are breathing a collective sigh of relief. The industry, for the most part, has come to grips with the myriad challenges of making autonomous vehicles perform with utmost safety, in all driving scenarios and weather conditions. Cost sensitivity remains a significant factor, particularly in advanced driver assistance systems (ADAS), whose development is closely aligned with NCAP safety requirements. Tier 1s that are driving both innovation and systems integration in this dynamic space have been adjusting their strategies accordingly.

A new realism has emerged, asserted Wolf-Henning Scheider, CEO of tech supplier ZF. During the 2020 CES in Las Vegas, he said his company's developments are currently focused on two parallel tracks: ADAS that ZF calls "Level 2+" for passenger vehicles, and SAE Level 4 systems for commercial-vehicle applications and people/cargo movers in defined use cases.

ZF's new "coAssist" system is its first Level 2+ play. It offers capability between standard SAE Level 2 and Level 3 and is claimed to meet projected Euro NCAP 2024 test protocols. CoAssist will use Mobileye's latest EyeQ chip and ZF's new Gen21 medium-range radar when it enters production with "a major Asian OEM" later this year.

"The range of needs is so diverse across the customer sets. The way one [customer] defines Level 2+ might be very different from the way another defines it," explains Aine Denari, senior VP for ADAS at ZF. "Although the majority of the volume of the market will remain GSR- or NCAP-focused, we think the largest portion of the AD-focused passenger-car market will be Level 2+, which essentially relies on cameras and radar. Some, however, want solutions that are scalable up to Level 4." The common architectures help minimize having to test and re-validate across product lines.

"We've seen some customers deliberate the tradeoff of having a scale-up solution to meet the high-end niche that's maybe five percent or less of the total market, versus having the most cost-optimized solution for maybe 80 percent of the market," she noted. "As a full-systems supplier, we need to be able to offer everything."

Lidar at SAE "Level 2-Plus"
Does lidar—which many engineers still consider an immature and expensive technology—play a role in ZF's Level 2+ plans? CEO Scheider hinted that some advanced full-range radars now under development may deliver much of lidar's imaging capability.

"Level 2+ systems still require drivers to have their eyes on the road," Denari stated. "But when you move into Level 3 and beyond, you must have a third redundant sensor." She noted that some customers might want the additional performance that lidar provides for enhanced safety, particularly in their technology-leading flagship vehicles: "We definitely see customers putting lidar into Level 2+ vehicles to enhance availability and reliability."

The more an ADAS architecture can be scaled up, the more it will inherently cost. This can lead to optimizing for the lowest-cost, mass-market solution. ZF's new coAssist package, including software and functionality, is priced "well under $1,000" per vehicle.


Denari revealed that increased efficiencies across ZF's validation process, using more AI-based tools in simulation, have helped reduce development costs that can be at least a million dollars per platform, per region.

[Image: Aine Denari: Delivering "the biggest impacts in the most cost-effective way."]

Technology partnerships are increasingly vital to ZF's automated-driving strategy. Cameras are a collaborative effort with long-time partner Mobileye. Lidars developed with Ibeo Automotive Systems, along with long-range radars and 360-degree surround vision, will help enable Level 4 functionality. Hella is the short-range radar partner. ZF's powerful ProAI compute platform can use a range of SoCs, including a scalable Nvidia chip or those from Xilinx and Qualcomm, to process signals from a sensor suite that can also include ultrasonics.

"In terms of our decisions to invest in a technology or partner, we're focusing on the places where we have a long-term, sustainable, competitive advantage," Denari said. "We look for where we can have a USP [unique selling proposition]. There are both new sellers and new customers breaking into the market, and we won't invest in anything that just makes us a 'me too.' We'll only invest where we can bring something to the table that's better than everybody else's."

A technology that ZF has decided not to invest in, for the short term at least, is vehicle-to-infrastructure [V2X]. Denari acknowledges that V2X is an important part of an overall ADAS and autonomous-driving ecosystem. "We looked at the market landscape and what the other players can do, at what the potential pricing is and what the real differentiation ability is. We concluded that in that [V2X] space it makes more sense for us to partner with somebody."

Taking the Cubix Approach
HD mapping and driver monitoring are technologies that ZF is watching closely. The former "will become more ubiquitous as its resolution increases," Denari said. And she anticipates that driver monitoring is "not unlikely" to become mandated as more-capable ADAS systems such as coAssist gain popularity.

"If we're going to allow drivers to take their hands off the wheel—and some customers looking for Level 2+ will want that for sure—it's going to be a requirement," she said. Incumbent Level 2 systems now in use that don't have driver monitoring present a danger to their drivers as well as to other road users, she believes.

ZF's central-compute initiatives are centered around use case. The basic NCAP solutions require no SDE (safety domain electronic controller), with sensor fusion done on the camera or radar. With many OEMs moving to all-new electrical architectures, ZF is preparing for the widespread move to centralized domain controllers.

"Our goal is to offer a range of solutions," Denari said. "For example, we can offer a ProAI that's purely for ADAS and AD, and a ProAI that houses the Cubix [ZF's new software that integrates numerous chassis systems], in which we can incorporate the powertrain function, too."

Throughout AVE's interview with Denari, she continued to stress ZF's focus on executing "what will have the biggest impacts in the most cost-effective way," delivered flawlessly within a business and technology space where cost + value is the holy grail.


Shifting Design of Autonomous Architectures
by Terry Costlow

Electronic controls centralize while providing commercial-level reliability.

[Image: Inputs from many sensors are fused and analyzed by a centralized system.]

T
he controls for fully autonomous trucks must deal chief product officer. “A lot of companies are building
with inputs from a number of sensors, analyzing prototypes that aren’t ready for commercial use—things
this data and taking actions that enable the will wear out before they leave the driveway.”
vehicle to complete its route safely and efficiently.
Design teams are racing to put together the perfect control Getting Centered
architecture for their vehicles, while ensuring that these Before autonomy became a focus, many sensors fed data
complex real-time systems operate over long vehicle lifetimes. to their own dedicated controllers. That’s now shifted to
Thousands of megabytes of data from different centralized management, which makes it easier to make
combinations of cameras, radars, lidars and ultrasonic sensors decisions based on 360-degree inputs. Combining controls
flow into controllers that meld this information into a map for various subsystems into a single module simplifies
of the vehicle’s surroundings. Within milliseconds, these the control architecture. It also makes it easier to update
controllers must determine adjustments that keep the vehicle software over the long lifetime of a commercial vehicle.
on its path and operating safely. The computing challenges “A key benefit of the new architectures is that we can
are matched by the needs of creating bulletproof systems. provide a centralized infrastructure for all electronic control
“Sensor fusion is used to create an understandable surrounding, making it possible for the vehicle to conduct the transport mission in a safe way,” said Johan Larsson, director of Autonomous Solutions, Volvo Trucks North America. “Of course, the perception system setup is redundant, with ‘belt and suspenders,’ meaning that you can operate in a fail-safe mode even if one of your system components would fail.”

The controllers that analyze sensor inputs and determine what the vehicle should do need to operate at extremely high speeds, with little margin for delays or failures. Engineers tasked with building control modules also must ensure that components can meet the strict reliability levels of commercial trucks.

“We’re talking about how to make this commercially viable, that is the key to realizing the goal of getting drivers out of the vehicle,” said Chuck Price, TuSimple’s chief product officer. “A lot of companies are building prototypes that aren’t ready for commercial use—things will wear out before they leave the driveway.”

Getting Centered
Before autonomy became a focus, many sensors fed data to their own dedicated controllers. That’s now shifted to centralized management, which makes it easier to make decisions based on 360-degree inputs. Combining controls for various subsystems into a single module simplifies the control architecture. It also makes it easier to update software over the long lifetime of a commercial vehicle.

“A key benefit of the new architectures is that we can provide a centralized infrastructure for all electronic control units,” said Martin Schleicher, vice president, strategy, Elektrobit. “This centralized function not only increases the vehicle’s safety, but also ensures that the vehicle’s software can be kept up-to-date during its entire lifecycle. The automated vehicle is another device in the IoT.”

Many vendors are planning moves from limited operating domains to more common on-highway driving. More diverse operating realms require more analysis of the surrounding environment; analyzing data from sensors quickly enough to avoid accidents requires hefty amounts of processing power. Development teams are forging partnerships to help them meet these demands while getting to market quickly.

“It is because of this, and the importance of being early in the market, that we have decided to partner up with Nvidia,” Larsson said. “Nvidia has world-class knowledge of artificial intelligence and computing. We will use their hardware, simulation tools and some of their software. Combining this with our world-class knowledge of vehicles and vehicle control is a good base for the development of a high-performing automated driving system.”


A mix of GPUs, CPUs and FPGAs handle different tasks.

Centralized systems facilitate over-the-air updating.

Fostering Diversity
Many truck and automotive companies are using Nvidia’s graphics processing units (GPUs), but GPUs are only part of the processing equation. Today’s systems typically include at least one conventional microcontroller that usually handles decision making, among other tasks. And GPU makers are beginning to include microcontrollers, which may reduce the dominance of traditional central processing units (CPUs).

At the same time, field-programmable gate arrays (FPGAs) give design teams the ability to combine conventional CPUs with several parallel processors like those found on GPUs. FPGAs can be adapted to meet different challenges, letting developers adapt configurations to meet specific requirements.

“It’s a challenge to find the right balance between GPU demands and general-purpose computing demands,” Price said. “Today, we’re mixing ruggedized GPU systems along with Intel Xeon processors that run non-GPU algorithms. It’s the role of FPGAs to optimize certain elements of the system.”
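A short sketch of the kind of workload routing Price describes; the task names and the routing table itself are hypothetical, not TuSimple’s design.

#include <string>

enum class Engine { Gpu, Cpu, Fpga };

// Hypothetical routing in the spirit of the mix described above: parallel
// perception kernels on GPUs, general-purpose decision code on CPUs, and
// fixed-function optimization in FPGA fabric. Task names are invented.
Engine placeWorkload(const std::string& task) {
    if (task == "image_inference" || task == "pointcloud_segmentation")
        return Engine::Gpu;        // massively parallel math
    if (task == "sensor_timestamp_align" || task == "camera_preprocess")
        return Engine::Fpga;       // fixed, latency-critical stages
    return Engine::Cpu;            // planning, arbitration, everything else
}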

These centralized architectures still need to be designed to fit in limited space, dissipate heat and keep wiring harnesses short. At the same time, components must be cost-effective to replace. Modular designs can help engineers meet these requirements.

“Coming from a distributed architecture, we are moving in the direction of a more centralized architecture,” Larsson said. “But we do use a modular approach in the design of the automated-driving system, with clear interfaces to support applications with different requirements.”

Powering all the cameras and chips for a large truck takes a lot of energy. Some processor chips can draw close to 100 W, and most systems have a large number of processors. Cameras also consume lots of power. Price said that electrical power requirements for TuSimple’s autonomous vehicles range from 3-6 kilowatts depending on the size of the truck and version of the system architecture. That puts pressure on the vehicle’s power generators.

“Power is a big deal for us. We’re not compute-constrained, we are power-constrained,” he said. “We consume almost all the power in a Class 8 truck with a big engine and a big alternator. Systems are quite power hungry, there are a lot of cameras.”
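The arithmetic is easy to sketch. In the illustrative budget below, only the roughly 100-W processor draw and the 3-6-kW system total come from the figures above; every device count is an assumption.

// Back-of-envelope budget. Only the ~100-W processor draw and the 3-6-kW
// system total are from the text above; every count below is an assumption.
constexpr double kProcessorWatts = 100.0;   // per high-end compute device
constexpr int    kProcessorCount = 20;      // assumed
constexpr double kCameraWatts    = 10.0;    // assumed, per camera
constexpr int    kCameraCount    = 10;      // assumed
constexpr double kOtherWatts     = 1000.0;  // assumed: radar, lidar, cooling, networking

constexpr double kTotalWatts = kProcessorWatts * kProcessorCount
                             + kCameraWatts * kCameraCount
                             + kOtherWatts;  // 3100 W with these assumptions

static_assert(kTotalWatts >= 3000.0 && kTotalWatts <= 6000.0,
              "lands inside the 3-6 kW range quoted above");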




Remote Control
While the goal of long-term autonomous programs is to get people out of the vehicle, some human intervention may be necessary when unusual circumstances arise. Truly driverless vehicles will pull over and stop when their controllers can’t navigate through confusing situations. Some suppliers believe that when onboard systems can’t understand tricky situations, help can be provided by remote operators.

Starsky Robotics last year demonstrated its capabilities with a nine-mile drive managed by an operator who sat in a remote data center. Another remote-control proponent, Designated Driver, is demonstrating the technology on buses in Texas. The company thinks that commercial vehicles are an area ripe for expansion.

Remote operators can help buses and trucks navigate.

“Commercial truck and heavy equipment manufacturers are definitely looking at teleoperations as a potential solution for enabling a tighter and younger labor pool, creating work environments that are more comfortable, appealing and safer for a new generation of employees,” said Walter Sullivan, CTO, Designated Driver. “Being able to distribute employee resources across distant geographies for some jobs can be a huge advantage in managing employees, utilization, and service.”

Not everyone agrees with this concept. TuSimple’s Price feels that latency and other issues make it quite difficult for remote operators to make real-time decisions.

“We don’t believe remote driving can be made safe or reliable,” Price said. “We do believe we can remotely alter the plan for the vehicle when it’s in a minimum-risk condition. We can tell it to move forward 20 feet and left 2 feet, and to do that using the vehicle intelligence to determine if there’s anything in its path and make other decisions.”
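Price’s example implies a narrow command interface rather than live steering. The sketch below illustrates that split of responsibility; the message layout and function names are invented.

struct PlanAdjustment {
    double forward_ft;   // e.g., 20.0: "move forward 20 feet"
    double left_ft;      // e.g., 2.0: "and left 2 feet"
};

// The operator proposes a small offset; the vehicle itself must verify the
// path and execute. The checks here are invented for illustration.
bool applyRemoteAdjustment(const PlanAdjustment& cmd,
                           bool inMinimumRiskCondition,
                           bool pathClearPerOnboardSensors) {
    if (!inMinimumRiskCondition) return false;     // no live remote driving
    if (!pathClearPerOnboardSensors) return false; // onboard intelligence decides
    // ...hand cmd to the onboard planner for execution...
    (void)cmd;
    return true;
}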

Borrowing from military, industrial sectors
The sensing and computing requirements of autonomous trucks are high, but volumes are low, driving design teams to borrow technologies from other fields. Automotive systems are the obvious choice, but as processing requirements rise, military equipment suppliers also are focusing on large commercial vehicles.

Commercial-vehicle developers have long leveraged the technologies and volume pricing of the automotive industry and that model will continue for autonomous technologies. While many automotive-grade boards and sensors are designed into trucks, ruggedization of these components is a necessity because reliability requirements are dramatically different.

“Cars are designed for longevity of 100,000 to 200,000 miles, so that’s what component suppliers design to,” said Chuck Price, TuSimple’s chief product officer. “Heavy trucks are specified to a million miles. That changes the nature of how things are built.”

The size of trucks is another major difference from passenger cars. It’s harder to stop large trucks and it’s trickier to maneuver them, especially when traffic gets heavy. Control systems and sensors need to account for these differences, mainly by extending the sensors’ field of view.

“Most of the technologies can be borrowed from passenger cars, but in many cases, there will be specific tailoring to the ‘truck use case.’ One thing that will lead to specific technology development for trucks is the need for a longer range of the perception system,” said Johan Larsson, director of Autonomous Solutions, Volvo Trucks North America. “A truck needs to see much longer than a passenger car to be able to secure a safe stopping distance. Another challenge is that some traffic situations, for example, merge scenarios, are more complex with a tractor-trailer than with a passenger car.”

Ruggedization specifications and computing levels for electronics used in commercial vehicles are quite similar to the requirements set for many military components. Connected-vehicle needs are also like those of industrial equipment, which has for years offered connection to the Internet of Things (IoT). Connectivity also is a standard feature on many trucks. Those commonalities are attracting some military system providers.

“Trucking, smart agriculture and mining vehicles have requirements and environments that are similar to defense systems,” said Joe Eicher, director of business development at Kontron. “These applications are also tying into areas like the IoT, where our industrial group plays, so our eyes are squarely on autonomous vehicles in mining, ag and trucks.”

Terry Costlow



Democratize AV Technology!
by Lindsay Brooke

Aptiv’s new generation of open-sourced architectures based on a few central processors aims to speed AV adoption. CTO Glen DeVos explains.

The software-intensive, electrified and increasingly automated vehicle will define the 2020s. Its rise is driving both the industry-wide re-thinking of electrical architectures and the growth of engineering employment behind it. At the forefront of this trend is Aptiv, the technology Tier 1 spun off from Delphi in 2017. It now has more than 19,000 engineers among its 160,000 staff, comprising one of the highest engineer-to-employee ratios among large suppliers.

“We’ve been adding about 1,500 engineers per year, primarily in software and systems engineering, at our 15 major technical centers,” said CTO Glen DeVos. These resources, he noted, will help Aptiv accelerate its customers’ development of new vehicle platforms with greater active-safety capability, including automated-driving functionality.

Glen DeVos is leading Aptiv into new vehicle architecture technology.
The OEMs want full upgradeability of software (FOTA, firmware over the air, and SOTA, software over the air) capabilities, DeVos explained. He noted they’ll also want centralization of compute—moving from today’s multiple ECUs to a few domain controllers—and zonal control, all with reduced complexity and cost.

DeVos called this broad trend “a blank-sheet approach to move away from traditional architectures” to more advanced, open-sourced ones. “We formed our Smart Vehicle Architecture group a little over two and a half years ago when we saw a trend developing: The massive content occurring in SAE Levels 1, 2 and 3 vehicles that is creating pain points for our OEM customers,” he said.

More features equals more data, and not just for Level 4. “It’s across the board,” he said. “We realized that to pack everything they wanted and that we were thinking about into an L4 vehicle, there was no way to do it economically without fundamental architecture change.”






Central Compute Cluster is the heart of Aptiv SVA.

Based on its booked orders, Aptiv expects deployment to begin in 2022 in premium vehicles, ramping up steadily from 2025. And while the company has multiple programs developing SAE Level 4 automated-driving functionality with customers aimed at commercial geo-fenced operations, such performance is not anticipated to be ready in consumer-level vehicles until the 2030 timeframe.

“We’ve always talked about automated driving being on the continuum of active safety,” said DeVos, who was part of Delphi’s pioneering work on Jaguar’s first Active Cruise Control launched in 1998. “Going from Level 2 to 3 and ultimately to Level 4 and 5 is all on the continuum. We want to take the technologies we’re developing for Levels 4/5 and apply it to Levels 2/3 as the next generation of advanced capabilities and features. It’s important to think how we can bring both ends of the spectrum together.”

Focus on SAE Level 2/3
SAE Level 2 to Level 3 is currently Aptiv’s main focus for automated-driving systems development. “We know Level 0 and Level 1 systems, with the progression of Euro and U.S. NCAP [impact safety] requirements, is going to be the baseline by 2025,” DeVos says. “You won’t have cars that are significantly de-contented from Level 2. That’s where the market is moving. It’s our ‘sweet spot.’”

An ADAS domain controller provides the fusing and perception modelling, the “brains” that DeVos compares to having a file server on board. In the new architectures, consolidating from the dozens of discrete ECUs on today’s vehicles to up to five powerful central compute controllers will drive a change to smaller “decontented” cameras and radars with less integrated processing capacity and thus lower cost. The controllers would be responsible for active safety, the user experience (UX), propulsion and chassis systems.
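A sketch of the consolidation DeVos outlines: the domain list follows the article, but the grouping of domains onto controllers is purely illustrative.

#include <vector>

// The four domains follow the text above; the grouping onto controllers
// is purely illustrative.
enum class Domain { ActiveSafety, UserExperience, Propulsion, Chassis };

struct CentralController {
    int id;
    std::vector<Domain> domains;  // dozens of former discrete ECUs fold in here
};

// "Up to five powerful central compute controllers"; three shown:
const std::vector<CentralController> kCluster = {
    {0, {Domain::ActiveSafety}},                  // ADAS fusing and perception
    {1, {Domain::UserExperience}},                // infotainment / UX
    {2, {Domain::Propulsion, Domain::Chassis}},
};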



“We’re seeing costs getting ‘democratized’ for those [up to SAE Level 2] systems,” DeVos noted. “But as you go from Level 2 to Level 3, however, there’s an inflection point. This is driven by everything that supports the driver being out of the loop. In our view, Level 3 is advanced driver assistance where the car is basically in control. It’s able to make decisions with the driver disengaged.”

That brings the need for fail-operational and safe-stop capability, and the need for redundancies. “Power systems, controls, everything that avoids a single-point failure,” DeVos explained. “With a Level 2 system, the driver is that redundancy. With Level 3, it drives a lot of additional components in today’s architectures.” That includes driver sensing and some level of mapping, the latter typically provided by lidars, which remain expensive.

Then there’s the reality of what DeVos describes as “just more sensors.” While for SAE Level 2 the vehicle may have forward-looking cameras with 360-deg. radar—a cost-effective approach—going to a Level 3 system may include an array of 360-deg. camera, 360-deg. radar and a lidar sensor.

“360 vision systems add a lot more complexity and drive a lot more processing,” DeVos noted. “The compute requirements go up dramatically. The domain controller would have a lot more capability than your previous Level 2-plus controller, and you need a secondary controller in case that fails. Adding those pieces together, the cost adds up.”

Driving down the cost curve on ADAS technologies will take some time and will be a function of volume and systems cost optimization, DeVos said. Then ultimately it will be a function of vehicle architecture, because today’s Level 3 systems basically are an overlay on the Level 2 architectures. The redundancy is an add. “But with the next-gen 2025 architectures, there are things we can do to bring the cost and complexity down. That will be helpful in terms of market adoption,” he noted.

A key aspect of Aptiv’s new approach to system architecting is what engineers call ‘Safe Dynamic Partitioning’. A traditional operating system (OS) would never mix Infotainment (typically a Linux- or Android-based platform) with anything that has functional safety aspects such as ADAS. Each has its separate ECU. And both are typically underutilized.

“The industry norm is not to use any more than 80% of a box at peak load,” DeVos said. “But when I add all that up and look at the total loading, I’m grossly underutilizing the silicon that’s in the vehicle. And I’m paying for each box, over and over again.” He explained that Safe Dynamic Partitioning allows design engineers to take a general compute platform and install whatever they want—infotainment or functional safety, each partitioned and managed safely.

“I don’t need two boxes; I can consolidate them. Without the ability to have this mixed criticality, you end up with a lot of redundant boxes,” DeVos said. “For example, you can use the infotainment compute as a backup [such as if a failure were to occur] and put my Level 3 ADAS controls on it. If I architect the product right, I can get redundancy and fail-operational capability without duplication.”
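In effect this is hypervisor-style mixed-criticality partitioning. A minimal sketch, assuming a static two-partition layout; the configuration format is invented for illustration.

// Invented configuration illustrating the goal: mixed criticality on shared
// silicon, with each partition isolated and managed safely.
struct Partition {
    const char* name;
    const char* integrity;   // an ASIL level, or QM for non-safety software
    int cores;               // isolated share of the compute platform
    bool failoverHost;       // may absorb workloads if a peer partition fails
};

constexpr Partition kPartitions[] = {
    {"adas_l3",      "ASIL-D", 4, false},
    {"infotainment", "QM",     4, true},  // spare headroom doubles as redundancy
};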
The advent of purpose-built EV architectures entering volume production this decade can help reduce cost and improve the speed to market of Level 3.

Aptiv is increasingly serving as a middleware integrator in the transformation from traditional to software-defined architectures.




The industry’s move to purpose-engineered EV platforms such as Ford’s 2021 Mustang Mach-e (shown) works in favor of the trend to the new AV architectures.

Properly architected, they will not require add-in duplication to get redundancy. Instead, it can be accomplished through more effective sharing among controllers. “We’ll have capability for moving processes from a failed controller to another, as opposed to just duplication, which is where we are today,” he said.
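A minimal sketch of that failover-by-migration idea; the scheduling policy is invented for illustration, not Aptiv’s implementation.

#include <vector>

struct Controller {
    int id;
    bool healthy;
    std::vector<int> processes;  // workload IDs currently hosted
};

// On a failure, re-home workloads onto surviving controllers rather than
// carrying duplicate hardware for every function. Simplistic policy:
// always pick the least-loaded healthy peer.
void migrateFrom(Controller& failed, std::vector<Controller*>& peers) {
    for (int pid : failed.processes) {
        Controller* target = nullptr;
        for (Controller* c : peers)
            if (c->healthy &&
                (!target || c->processes.size() < target->processes.size()))
                target = c;
        if (target) target->processes.push_back(pid);
    }
    failed.processes.clear();
}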
An Agnostic Approach
Aptiv’s ‘SVA’ approach is based on lessons learned from mobile computing (smart phones) and other industries where software is embedded and inseparably connected to the hardware in purpose-built machines, each one separate from the other and from one generation to the next. The lessons include:

• Abstracting software from hardware means decoupling software development from the underlying ECU or component development. DeVos admits that today it is a massively complex task in getting everything to work properly. Proof of that came in 2014, the first year that warranty costs for software at the OEMs became greater than those for hardware. The situation will only worsen as today’s distributed architectures proliferate, according to DeVos.

• Separating I/O from computing—with all the sensors, actuators and data that’s flowing around the vehicle, with hard connections back to each of the compute platforms, changing those sensors and actuators at new-model time requires changing everything—re-architecting the compute and sensor interfaces.

And that’s not how servers operate, DeVos said. They abstract compute from the I/O. “All the I/O comes in standard format to that server so it’s managed very carefully,” he said. “And that’s the third important point: enabling the ‘serverization’ of the platform.” This involves aggregating compute into several modules that support all the features of the vehicle and doing it more effectively.

“Essentially, what we want to do is make the compute agnostic and independent from all those sensors and actuation,” DeVos explained. “For us it’s not reinventing the wheel; it’s applying this separation to the automotive space.”
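The server analogy translates to a normalized ingress format, so a sensor can change at new-model time without touching the compute applications. A hedged sketch with an invented message envelope:

#include <cstdint>
#include <vector>

// Invented normalized envelope: every sensor, whatever its vendor or bus,
// reaches the compute cluster in one standard format. Compute nodes
// subscribe by sensor class, never by wiring or part number, so swapping
// a sensor at new-model time does not touch application code.
struct SensorEnvelope {
    std::uint16_t schema_version;       // lets payload formats evolve safely
    std::uint16_t sensor_class;         // camera, radar, lidar, ultrasonic, ...
    std::uint64_t timestamp_ns;         // common time base across the vehicle
    std::vector<std::uint8_t> payload;  // schema-versioned sensor data
};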



