ADAS Connected Car SR 0620
ADAS/CONNECTED CAR
JUNE 2020
Sponsored by
Accelerate ADAS and
autonomous driving
innovation
Dell EMC storage
makes it easy to get
the most value from
your data across
key workflows
Learn more at
http://dellemc.com/Isilon
Copyright © 2019 Dell Inc. or its subsidiaries. All Rights Reserved. Dell Technologies, Dell, EMC, Dell EMC and other trademarks are trademarks of Dell Inc.
or its subsidiaries. Other trademarks may be trademarks of their respective owners.
CONTENTS

FEATURES

Sensing with Sound: Next-Level Ultrasonic Sensors for Intelligent Cars
By mimicking bats, echolocation technology enables the detection of multiple objects in 3D space.

Heat Seekers
While engineers debate the use of thermal-imaging sensors for ADAS, their capability and value are being proven for AVs of all levels.

Radar’s Evolving Role in ADAS and the AV Future
The automotive radar sensor market has been growing at a 21% annual rate, putting greater demands on chip design, testing, and module deployment.

ZF Establishes Level 2+ ADAS
Cost-effective technology solutions to meet diverse customer needs are what the “new pragmatism” in driver-assistance tech is all about.

Shifting Design of Autonomous Architectures
Electronic controls centralize while providing commercial-level reliability.

Democratize AV Technology!
Aptiv’s new generation of open-sourced architectures based on a few central processors aims to speed AV adoption.
Next-Level Ultrasonic Sensors

are yet to be fully harnessed. Bats have been nature’s prototypes for sound-based navigation. They rely on echolocation to detect obstacles in flight, forage for food, and see in dark caves. By mimicking the bat’s technique, an ultrasonic solution was developed that provides echolocation
Close-Range Perception
How can it be possible that a staggering one out of five motor vehicle accidents takes place in a parking lot? Even at the slow speeds usually found in parking scenarios, the driver cannot perceive his environment flawlessly, despite parking assistance functions like cameras and PDCs. Tight parking spots in car parks or crowded parking areas in front of shopping malls maximize everyday driving challenges. Car manufacturers are beginning to support their cars with the latest sensor technologies so that multiple tons of moving metal are not left without eyes. Sensors are currently used to help drivers steer more safely and guide them into a parking spot via sound and light signals. In order to advance in areas like autonomous driving, mapping, and collision avoidance, sensors now need to detect more complex environmental scenarios like steering a car through a construction site or reliably detecting people in a crowded parking area. As every ride starts and ends in a parking position involving some kind of parking maneuver, it is crucial to safely perceive a car’s immediate environment.

While existing sensor technologies mostly focus on covering long distances, the immediate environment around a car (0 to 5 m) is often left out of the discussion. This is where advanced ultrasonic sensor systems come into play. In addition to measuring the distance to an object, advanced ultrasonic sensors can also calculate the horizontal and vertical position of an object relative to the sensor itself (i.e., providing 3D coordinates for detected objects).

1. Sensor; 2. Person or object; 3. Received ultrasound waves.

In general, ultrasonic sensors make use of high-frequency sound waves for a range of applications. For distance measurement, a typical ultrasonic sensor uses a transducer to periodically send out ultrasonic pulses in the air. These pulses get reflected from objects in the detection area of the sensor and are received back by the sensor. By measuring the time it takes an ultrasonic pulse to travel to the object and get captured by the sensor, the distance to the object can be calculated. This principle is called time-of-flight measurement. Conventional ultrasonic sensors used for parking assistance only record one-dimensional data, which is the distance to the closest object. Azimuth and elevation angles of objects are not calculated with this method and vertical opening angles are severely limited. Thus, many objects, such as the curb and low-lying obstacles, are not picked up by 1D sensors.

Point cloud example recorded by automotive kit (real data recorded by sensor).

The localization of objects in 3D space allows ultrasonic sensors to detect and distinguish among multiple objects in a single scan. In that sense, the principle of ultrasonic sensors is similar to echolocation, as used by bats. In comparison, a typical ultrasonic sensor will usually only measure the distance to the nearest object. Because of this, a limited opening angle is usually applied for this type of sensor. In contrast, advanced ultrasonic sensors allow for opening angles of up to 160°. The sensors provide reliable, rich, 3D data for the close-range environment around a vehicle. The sensors are therefore well-suited for applications in the automotive field and add yet another level of safety and redundancy to conventional radar, LiDAR, and camera technologies. The sensors can replace or complement existing optical sensing systems, providing both redundancy and an improved level of accuracy compared to standard ultrasonic sensors in various autonomous navigation applications.

The sensor data can also be used for additional comfort features, e.g. gesture control to open doors and trunks, positioning the vehicle for automated charging (for EVs), and collision avoidance for automatically opening doors.

Passenger Monitoring in the Car Interior
With further improvements in the autonomous driving space, the behavior of the driver is also likely to change. While drivers today must be completely focused on the road—ready to react at a moment’s notice—this is likely to change once cars are able to drive and steer fully automatically. Drivers would then be able to lean back and relax, work on their computers, turn to their children in the back seats, or temporarily enjoy an expanded infotainment program.

Such an eventuality puts new demands on assistance systems. Just like the numerous sensors available for analyzing a car’s external environment, similar knowledge is needed for the interior in order to realize a more secure and intuitive interaction experience. In this context, the use of 3D ultrasound again provides interesting advantages. Data gained from an ultrasound sensor can be used to identify the number of people sitting in the car, their size, and their posture. Based on the information regarding where people are sitting and their physical characteristics, airbags could be adjusted to individual body sizes and further improve safety.

The technology does not collect any personal data since ultrasound cannot evaluate visual input and instead only records anonymous point-cloud data. This is an especially important consideration in terms of privacy and data protection. Furthermore, gesture simulation in the interior of the car can be used for information and entertainment purposes, like controlling the car’s infotainment systems with simple pre-configured actions.

There is no doubt that autonomous vehicles of the future will need more assistance from sensors to safely operate in populated places. Whether you are living in a big city where detecting people and accurate parking plays a major role, or in the countryside where automated charging is indispensable, 3D ultrasonic will provide many benefits to the everyday life of future drivers.

This article was written by Andreas Just, Head of Marketing for Toposens, Munich, Germany. For more information, visit http://info.hotims.com/76502-121.
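The time-of-flight principle and the 3D localization described in this article can be sketched in a few lines of Python. The speed-of-sound value, the timing numbers, and the function names below are illustrative assumptions, not Toposens specifications:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C (assumed; varies with temperature)

def tof_distance(echo_time_s: float) -> float:
    """Distance from a time-of-flight echo: the pulse travels out and back,
    so the one-way distance is half the round-trip path."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

def to_cartesian(distance_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert a (range, azimuth, elevation) detection, the kind of output an
    advanced 3D ultrasonic sensor reports, into x/y/z coordinates relative
    to the sensor."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)  # forward
    y = distance_m * math.cos(el) * math.sin(az)  # left/right
    z = distance_m * math.sin(el)                 # up/down
    return (x, y, z)

# A pulse that returns after ~11.7 ms corresponds to an object about 2 m away:
d = tof_distance(0.0117)
print(round(d, 2))  # 2.01
print(to_cartesian(d, 30.0, 10.0))
```

A conventional 1D parking sensor stops at tof_distance; the 3D sensors discussed here add the two angles, which is what turns a single range reading into a point in a point cloud.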
by Lindsay Brooke
What specific sensor types will comprise the advanced driver-assistance systems (ADAS) of the 2020s? That’s a controversial subject among engineers who are developing SAE Level 2 and 3 (and the so-called “L2+”) ADAS sensing suites for new vehicles. Many of them believe that visible-light cameras fused with radar will suffice to deliver the object-identification accuracy, redundancy—and cost effectiveness—that OEMs and the driving public expect of ADAS-equipped vehicles.

But a case is building for additional sensing capability, particularly for automatic emergency braking (AEB) and pedestrian detection. Those safety-critical functions currently rely on camera-radar inputs to “see” ahead. They enable the vehicle to react to a range of scenarios—from stalled traffic on a highway to humans and animals suddenly appearing in the road. (In Michigan alone, there were 53,464 traffic accidents involving deer in 2018, up 14% from 2016, according to state DoT data.)

Pedestrian fatalities in the U.S. are alarming: one dies every 88 minutes, on average, in traffic crashes. In 2018, 6,283 pedestrian lives were lost, up from 5,977 in 2017. That’s the most since 1990 and represents an increase of more than 35% since 2008. And three-quarters of all pedestrian fatalities occur after sunset, NHTSA reports.

Detecting people and critters without fail in rain, snow, fog and darkness can be daunting for systems solely based on optical-and-radar-fused sensing. In October 2019, the American Automobile Assoc. tested several production AEB systems in various scenarios. During daylight tests, the test vehicle driving 20 mph (32 kph) struck the target 60% of the time. In nighttime conditions, the vehicle under test traveling at 25 mph (40 kph) hit the soft pedestrian target 100% of the time. (https://www.aaa.com/AAA/common/aar/files/Research-Report-Pedestrian-Detection.pdf.)

“If NHTSA rules truly 100 percent all-weather performance for pedestrian detection by 2021, to meet its 5-Star safety criteria, the industry will have to adjust its sensing strategies,” said Raz Peleg, sales director at AdaSky, an Israel-based developer of thermal-imaging systems. “In some places there is bad driving weather for half the year,” he noted. For those reasons, thermal-imaging technology is under consideration for both ADAS and SAE Level 4 self-driving AV applications.

Outdistancing Headlamps
A family of thermal sensors used in automotive, known as far infrared (FIR), operates in a long-wavelength range outside the visible-light spectrum, to “see” the relative intensities of heat (infrared) energy being emitted or reflected from an object—including buildings, parked vehicles and pavement. The infrared spectrum consists of a near-infrared section (NIR), with wavelengths of 0.75-1.0 μm; a short-wave infrared section (SWIR), with wavelengths of 1-3.0 μm; a mid-infrared section (MIR), with wavelengths of 3.0-5.0 μm; and the far-infrared (FIR) section, with wavelengths of 7.5-14.0 μm.

Anything that generates or contains heat can be detected and classified with thermal imaging. Humans and animals have unique heat signatures that can be detected only with a thermal camera. The hotter the object (such as a parked vehicle’s engine), the more they
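The infrared band boundaries quoted in the article lend themselves to a small lookup. A minimal sketch, assuming the article’s ranges (the function name and the out-of-range label are invented for illustration):

```python
def infrared_band(wavelength_um: float) -> str:
    """Classify a wavelength (in micrometers) into the infrared sections
    listed in the article: NIR, SWIR, MIR, and FIR. Wavelengths outside
    those ranges (including the 5-7.5 um gap) get a fallback label.
    Shared endpoints go to the first matching band."""
    bands = [
        (0.75, 1.0, "NIR"),   # near infrared
        (1.0, 3.0, "SWIR"),   # short-wave infrared
        (3.0, 5.0, "MIR"),    # mid infrared
        (7.5, 14.0, "FIR"),   # far infrared, used by automotive thermal cameras
    ]
    for low, high, name in bands:
        if low <= wavelength_um <= high:
            return name
    return "outside listed bands"

print(infrared_band(10.0))  # FIR
print(infrared_band(0.9))   # NIR
```

Body-temperature objects radiate most strongly around 10 μm, which is why the FIR section is the one relevant to the pedestrian-detection use case discussed here.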
1. EPTa stationary in the middle of the test vehicle’s lane in Car-to-Pedestrian Longitudinal Adult 50% (CPLA-50) tests.
2. EPTa crossed in front of the vehicle from the roadside in Car-to-Pedestrian Far Side Adult 50% (CPFA-50) tests.
3. EPTa crossed in front of the vehicle from an obstructed position in Car-to-Pedestrian Far Side Adult Obstructed 50% (CPFAO-50) tests.

Test results were promising, according to VSI. In all runs for all test cases, the car’s AEB system successfully brought it to a stop before impacting the EPTa. Additional testing is planned for spring/summer 2020 following AEB algorithm optimization, EPTa heating improvements and when weather is within test parameters.
—LB
ZF VP Aaron Jefferson believes
thermal imaging is not yet cost
effective for SAE Level 2.
Until recently, various vehicle OEMs were vying to be first to put self-driving cars on the road by the magical turn of the decade. But as 2020 arrived, the bullish tone has switched to a more cautionary outlook. While the truly self-driving SAE Level 5 autonomous vehicle (AV) without a steering wheel is still years away, innovations needed to support the intelligent self-driving car of the future continue to emerge and improve.

Developers grapple with the need to comply with safety and security standards as the average car sees more sensor technology connecting it with onboard devices as well as with its environment. Within the increasingly capable sensor suite (see table), automotive radar is now an indispensable technology that enables sub-systems for advanced driver assistance systems (ADAS).

About a decade ago, 24-GHz radar sensors entered the scene and began to impress engineers, enabling all-weather ADAS functionality such as blind spot detection, lane change and parking assistance, and collision avoidance. Minimally affected by lighting conditions or bad weather, radar quickly gained favor over camera technology for ADAS applications.
Telecommunications Standards Institute (ETSI) and the Federal Communications Commission (FCC), the 24-GHz-wide bandwidth and UWB bandwidth will not be available for new automotive radar devices after January 1, 2022. These changes are spurring market growth for 76-81-GHz band usage for automotive radar applications. These higher frequency bands allow designers to create smaller sensor

radar. Lidar’s higher degree of granularity can provide a much more complete view of the vehicle’s environment. On the downside, lidar is generally more expensive versus camera and radar sensor technology. New players are trying to produce cheaper lidar, with some innovations going for under $1,000, versus typical prices in the ~$10,000 range. Lidar has other limitations including high data rate and power consumption, and poorer performance in low lighting. It will be interesting to see if new disruptive technologies help make lidar cheaper and better for wider adoption.

From sensor to cellular networks, seamless connectivity is key to enabling autonomous driving.

Enabling the Future Connected Car
While sensor technology plays a vital role in enabling safety and in-cabin comfort, it is high-bandwidth, low-latency cellular vehicle-to-everything (C-V2X) which will help realize the vision of truly autonomous and connected cars. C-V2X connects the vehicle sensors to driving data such as speed, location, traffic, other cars, and data such as real-time updated maps. With a supercomputer on wheels, testing for signal integrity, performance reliability, and automotive cybersecurity from the physical backplane all the way through the various protocol layers becomes critical.

4G and LTE technology, and eventually 5G, will form the infrastructure to support C-V2X capabilities (see Figure 3). According to S&P Global Market Intelligence, 4G LTE is expected to surpass 80% of total in-vehicle cellular systems in 2024. The industry is concurrently banking on 5G technology, which advocates believe will be able to carry mission-critical communications faster and better for autonomous vehicles.

The potential of 5G-enabled C-V2X is exciting, and the road ahead is likely to see more collaboration among autonomous vehicle developers who face common challenges of meeting new and evolving industry standards. Global standards organizations including SAE International and IEEE are working on new standards for artificial intelligence in autonomous vehicles.

Initiatives to introduce self-driving vehicles currently are confined to service fleets operating in safe zones. It may take several more years to see self-driving cars alongside those still controlled by human drivers. Meantime, more work lies ahead for automotive design and test engineers to juggle the jungle of conformance standards, alongside customer expectations for the creature comforts of a self-driving gizmo on wheels, with all risks of malfunctions and accidents mitigated.

Rick Kundi is a solutions marketing engineer within the Automotive and Energy Solutions business at Keysight Technologies, responsible for automotive radar, Ethernet and software solutions. He has over 10 years of industry experience in application engineering, sales, and marketing and has presented at professional events covering topics such as RF and wireless basics, advanced automotive radar analysis, and generation solutions.
A precise three-dimensional
profile of a vehicle’s
surroundings is a fundamental
prerequisite for automated
driving. A new high-resolution 3D
flash lidar (HFL) sensor developed
for use at close range—50 m (164 ft)
or less—delivers on this capability,
according to Thomas Laux, head
of business development and sales
for HFL segment in Continental’s
Advanced Driver Assistance
Systems (ADAS) business unit.
“This is automotive grade and solid
state, meaning no moving parts. It’s a
bunch of semiconductors, which is ideal
for commercial vehicle and even more so
for the off-highway environment—they
don’t have a lot of hours [of operation]
but their environments can be pretty
rigorous,” said Laux. The IP6K9K
packaging is very important for these
applications, he said, as is a 3D Global
Shutter feature that eliminates motion
distortion and enables persistence mode
and geo-registration of point cloud.
Laux, who is located in Carpinteria, Calif., as part of Continental Advanced

“It will happen.”

for merging environments, when things are whipping by really fast when it’s raining or you get truck spray.” The new sensors currently are “a bit expensive,” which is typical with a launch product, Laux noted, referencing

monitoring and a constant overview of the process. Contadino is currently in prototype phase. Continental is looking for partners in the agricultural sector to realize first field applications in 2020.
Vehicle-to-vehicle/infrastructure (V2X) communication often is touted as a critical technology for automated driving and even to derive the utmost benefit from advanced driver-assistance systems (ADAS). But despite plenty of verbal support from industry developers, there’s been little actual deployment. What critics label “foot-dragging” by the industry now may catastrophically stall market implementation.

V2X technology had a bright future in 1999, when the FCC allocated 75 megahertz of spectrum in the 5.9 GHz radio band for Dedicated Short-Range Communications (DSRC). Years of development and demonstrations seemed ready to pay off in 2014 when the National Highway Traffic Safety Admin. (NHTSA) appeared ready to issue a mandate requiring deployment. But the U.S. mandate never came, nor did a European edict. That created uncertainty that led to chaos in the market.

Now, DSRC proponents are trying to stymie an FCC plan to sell a portion of the 5.9 GHz range originally earmarked only for transportation purposes. Then came another wild card. While DSRC languished, a cellular-based alternative, cellular V2X (C-V2X), came onto the scene.

Though there’s uncertainty about the prospects of V2X, advocates say the technology can do a lot to prevent accidents today and improve the operations of autonomous vehicles when they emerge.
DSRC is NXP’s choice for chips that let vehicles communicate with others.
“Driverless vehicles cannot exist without being connected,” said Guillaume Devauchelle, VP of innovation and scientific development at Valeo. “When you’re running at high speed, it tells you about things happening at distances that are far longer than on-vehicle sensors [can detect]. That helps guarantee safety and it makes driving more pleasant. You can slow down one kilometer before construction work instead of quickly stopping when you get close.”

At the SAE Government/Industry Meeting in Washington in January, Continental’s Bettina Erdem explained that 45% of all accidents occur at intersections, where V2X can provide far more safety-related information than on-vehicle sensors. She highlighted the importance of V2X by saying that 60% of serious and fatal intersection accidents cannot be prevented by onboard safety systems. They need V2X input.

Though it’s largely viewed as an important element in safety and autonomy, V2X has to date seen little acceptance. Volkswagen announced plans to ship cars with DSRC capability, but Toyota scuttled its plan to deploy DSRC in the U.S. Still, proponents feel there are many reasons to deploy.

While OEMs and infrastructure providers waited for DSRC to catch fire, others stepped into the void. Industries hungry for more wireless bandwidth convinced the FCC to sell some of the spectrum set aside for DSRC, while concurrently, mobile-phone suppliers developed C-V2X, which leverages on-board modems to provide many of the benefits of DSRC with reduced design-in and dedicated-infrastructure costs.

Bandwidth for Sale
FCC chairman Ajit Pai proposed making the lower 45 MHz of the 5.9 GHz band available for unlicensed uses such as Wi-Fi, while retaining the upper 25 MHz of the 5.9 GHz spectrum for transportation use. The upsides of the potential sale could be significant. A 2018 study by the Rand Corp. estimated that opening automotive’s 5.9 GHz band for Wi-Fi could provide gains to economic welfare for consumers and producers that range from $82.2 to $189.9 billion.

The proposed auction is equally huge for DSRC’s outlook. If the FCC proceeds with its plan and the bandwidth is used for non-automotive services, interference from Wi-Fi could render the V2X spectrum “useless,” according to John Kenney, director and senior principal researcher at Toyota InfoTech Labs. He told attendees at the SAE conference that if automotive lost 60% of V2X spectrum, the industry would lose 60% of potential V2X benefits. Proponents need to quickly coalesce around a workable deployment strategy, he added.

“There needs to be a consensus regarding V2X technology or people are not going to deploy. Once we have a consensus, we can go to the FCC and say, ‘This is what we want you to do,’” Kenney said.

While the FCC ponders its auction, fellow government officials at NHTSA contend that the auto industry needs the bandwidth in question. NHTSA division chief Bob Kreeb predicted that selling the bandwidth will “limit deployment of existing technology that’s ready and usable. We’re disappointed in the FCC’s current proposal that they’ve outlined. We also think it’s going to limit innovation and creativity.”

Another plus for DSRC is that it’s undergone plenty of real-world testing, which helped developers weed out bugs and verify performance. Maturity often is a desired trait given the high reliability levels demanded in automotive markets. Supporters also note that DSRC has far better latency performance than C-V2X, which can be critical in high-speed safety environments.

“Some of the key performance parameters of C-V2X are not as good as DSRC,” said Huanyu Gu, senior product manager for V2X at NXP Semiconductors. “While over time C-V2X performance may further improve, it will take many more years. The question is: Should society be kept waiting for C-V2X to mature, while the mature technology, DSRC, is ready for deployment?”

On the upside, the price of implementing either option isn’t too onerous. “The add-on cost to equip a car is around $15-$20,” Devauchelle said.

Automakers would like to see one technology emerge globally so they can trim overhead and benefit from volume pricing. As the two specifications vie for acceptance, only one can operate efficiently in each region.

Automated-truck developer TuSimple won’t rely on V2X in its forthcoming autonomous haulers.
Cellular Challenge
The challenge from C-V2X began gaining ground around 2017. Cellular and chip providers created specifications and groups such as the 5G Automotive Association (5GAA) voiced endorsement. Regulators in the Chinese automotive market are actively supporting the technology, and Tier 1 suppliers are following suit now that a Chinese mandate appears imminent.

“China represents about one third of the market; everyone will have to develop C-V2X for China,” Devauchelle said. “In the U.S. and Europe, it’s difficult to invest. It’s not that one is that much better, it’s more a matter of regulations and a battle between different stakeholders.”

Momentum for C-V2X also comes from Ford, which is planning C-V2X rollouts while supporting the FCC’s spectrum auction. On the semiconductor side, major cellular chip suppliers including Qualcomm, Intel, Samsung, Huawei, and CATT/Datang have 5GAA’s C-V2X technology in their roadmaps.

The benefits of V2X grow as more vehicles and roadside infrastructure share data. Nor is it practical for a country or region to have vehicles with incompatible technologies sharing the roads. Accidents could still happen when vehicles with different technologies encounter one another.

“Cars that utilize different technologies cannot ‘talk’ to each other, which greatly compromises the effectiveness of each technology,” Gu said. “To utilize both technologies within a given region, a theoretical solution could be to equip V2X stations that support both technologies and translate one into the other. The additional latency caused by translating one standard to the other will likely make such a solution a non-starter for safety-critical applications.”

All the uncertainty surrounding V2X is driving some autonomous vehicle developers to eschew the technology. TuSimple, a trucking startup that plans to begin delivering freight hauled by its driverless vehicles next year, isn’t settling on either technology. Instead, design teams will rely strictly on on-vehicle sensors.

Foggy Future
Support for both DSRC and C-V2X creates questions for product planners at all levels. Design teams don’t want to support different technologies in various geographies. That adds complexity for design and support teams.

“V2X is nice to have, but we must be able to determine our surroundings without it,” said Chuck Price, TuSimple’s chief product officer. “Even if we just talk to our own vehicles, V2V can enhance operations. But given all the things that can happen, we have to assume we can’t talk to other vehicles.”
The wall of hype surrounding the self-driving future has given way to a new pragmatism, and engineers are breathing a collective sigh of relief. The industry, for the most part, has come to grips with the myriad challenges of making autonomous vehicles perform with utmost safety, in all driving scenarios and weather conditions. Cost sensitivity remains a significant factor, particularly in advanced driver assistance systems (ADAS) whose development is closely aligned with NCAP safety requirements. Tier 1s that are driving both innovation and systems integration in this dynamic space have been adjusting their strategies accordingly.

A new realism has emerged, asserted Wolf-Henning Scheider, CEO of tech supplier ZF. During the 2020 CES in Las Vegas, he said his company’s developments are currently focused on two parallel tracks: ADAS that ZF calls ‘Level 2+’ for passenger vehicles, and SAE Level 4 systems for commercial-vehicle applications and people/cargo movers in defined use cases.

ZF’s new “coAssist” system is its first Level 2+ play. It offers capability between standard SAE Level 2 and Level 3 and is claimed to meet projected Euro NCAP 2024 test protocols. CoAssist will use Mobileye’s latest EyeQ chip and ZF’s new Gen21 medium-range radar when it enters production with “a major Asian OEM” later this year.

“The range of needs is so diverse across the customer sets. The way one [customer] defines Level 2+ might be very different from the way another defines it,” explains Aine Denari, senior VP for ADAS at ZF. “Although the majority of the volume of the market will remain GSR or NCAP-focused, we think the largest portion of the AD-focused passenger car market will be Level 2+, which essentially relies on cameras and radar. Some, however, want solutions that are scalable up to Level 4”—the common architectures help minimize having to test and re-validate across product lines.

“We’ve seen some customers deliberate the tradeoff of having a scale-up solution to meet the high-end niche that’s maybe five percent or less of the total market, versus having the most cost-optimized solution for maybe 80 percent of the market,” she noted. “As a full-systems supplier, we need to be able to offer everything.”

Lidar at SAE “Level 2-Plus”
Does lidar—which many engineers still consider an immature and expensive technology—play a role in ZF’s Level 2+ plans? CEO Scheider hinted that some advanced full-range radars now under development may deliver much of lidar’s imaging capability.

“Level 2+ systems still require drivers to have their eyes on the road,” Denari stated. “But when you move into Level 3 and beyond, you must have a third redundant sensor.” She noted that some customers might want the additional performance that a lidar provides for enhanced safety, particularly in their technology-leading flagship vehicles: “We definitely see customers putting lidar into Level 2+ vehicles to enhance availability and reliability.”

The more an ADAS architecture can be scaled up, the more it will inherently cost. This can lead to optimizing for the lowest-cost, mass-market solution. ZF’s new coAssist package, including software and functionality, is priced “well under $1,000” per vehicle. Denari revealed
by Terry Costlow
T
he controls for fully autonomous trucks must deal chief product officer. “A lot of companies are building
with inputs from a number of sensors, analyzing prototypes that aren’t ready for commercial use—things
this data and taking actions that enable the will wear out before they leave the driveway.”
vehicle to complete its route safely and efficiently.
Design teams are racing to put together the perfect control architecture for their vehicles, while ensuring that these complex real-time systems operate over long vehicle lifetimes. Thousands of megabytes of data from different combinations of cameras, radars, lidars and ultrasonic sensors flow into controllers that meld this information into a map of the vehicle's surroundings. Within milliseconds, these controllers must determine adjustments that keep the vehicle on its path and operating safely. The computing challenges are matched by the need to create bulletproof systems.

"Sensor fusion is used to create an understandable surrounding, making it possible for the vehicle to conduct the transport mission in a safe way," said Johan Larsson, director of Autonomous Solutions, Volvo Trucks North America. "Of course, the perception system setup is redundant, with 'belt and suspenders,' meaning that you can operate in a fail-safe mode even if one of your system components would fail."

The controllers that analyze sensor inputs and determine what the vehicle should do need to operate at extremely high speeds, with little margin for delays or failures. Engineers tasked with building control modules also must ensure that components can meet the strict reliability levels of commercial trucks.

"We're talking about how to make this commercially viable; that is the key to realizing the goal of getting drivers out of the vehicle," said Chuck Price, TuSimple's …

Getting Centered
Before autonomy became a focus, many sensors fed data to their own dedicated controllers. That's now shifted to centralized management, which makes it easier to make decisions based on 360-degree inputs. Combining controls for various subsystems into a single module simplifies the control architecture. It also makes it easier to update software over the long lifetime of a commercial vehicle.

"A key benefit of the new architectures is that we can provide a centralized infrastructure for all electronic control units," said Martin Schleicher, vice president, strategy, Elektrobit. "This centralized function not only increases the vehicle's safety, but also ensures that the vehicle's software can be kept up-to-date during its entire lifecycle. The automated vehicle is another device in the IoT."

Many vendors are planning moves from limited operating domains to more common on-highway driving. More diverse operating realms require more analysis of the surrounding environment; analyzing data from sensors quickly enough to avoid accidents requires hefty amounts of processing power. Development teams are forging partnerships to help them meet these demands while getting to market quickly.

"It is because of this, and the importance of being early in the market, that we have decided to partner up with Nvidia," Larsson said. "Nvidia has world-class knowledge of artificial intelligence and computing. We will use their hardware, …"
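Larsson's "belt and suspenders" description of redundant sensor fusion can be illustrated with a minimal sketch. This is not Volvo's implementation: the sensor names, matching tolerances and mode names below are all illustrative assumptions.

```python
# Minimal sensor-fusion sketch: merge per-sensor detections into one
# 360-degree object list, and degrade gracefully if a sensor drops out.
# Sensor names, tolerances and mode names are illustrative only.

def fuse(detections_by_sensor, match_tolerance_m=1.0):
    """Merge detections {sensor: [(angle_deg, range_m), ...]} into one list.

    Detections from different sensors count as the same object when
    angle and range agree within small tolerances.
    """
    fused = []
    for sensor, detections in detections_by_sensor.items():
        for angle, rng in detections:
            for obj in fused:
                if (abs(obj["angle"] - angle) < 5.0
                        and abs(obj["range"] - rng) < match_tolerance_m):
                    obj["sensors"].add(sensor)
                    break
            else:
                fused.append({"angle": angle, "range": rng, "sensors": {sensor}})
    return fused

def operating_mode(active_sensors, required=frozenset({"camera", "radar", "lidar"})):
    """Belt-and-suspenders check: full operation needs every modality; a
    single failure drops to fail-safe rather than stopping outright."""
    missing = required - active_sensors
    if not missing:
        return "nominal"
    return "fail-safe" if len(missing) == 1 else "minimum-risk-stop"

frames = {
    "camera": [(10.0, 42.0), (95.0, 12.5)],
    "radar":  [(10.4, 42.3)],   # same object as the camera's first hit
    "lidar":  [(180.0, 7.0)],
}
objects = fuse(frames)
print(len(objects))                         # 3 distinct objects
print(operating_mode(set(frames)))          # nominal
print(operating_mode({"camera", "radar"}))  # fail-safe (lidar lost)
```

The point of the sketch is the mode logic: losing one modality degrades the mission rather than ending it, which is the fail-safe behavior Larsson describes.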
Fostering Diversity
Many truck and automotive companies are using Nvidia's graphics processing units (GPUs), but GPUs are only part of the processing equation. Today's systems typically include at least one conventional microcontroller that usually handles decision making, among other tasks. And GPU makers are beginning to include microcontrollers, which may reduce the dominance of traditional central processing units (CPUs).

At the same time, field-programmable gate arrays (FPGAs) give design teams the ability to combine conventional CPUs with several parallel processors like those found on GPUs. FPGAs can be adapted to meet different challenges, letting developers tailor configurations to specific requirements.

"It's a challenge to find the right balance between GPU demands and general-purpose computing demands," Price said. "Today, we're mixing ruggedized GPU systems along with Intel Xeon processors that run non-GPU algorithms. It's the role of FPGAs to optimize certain elements of the system."

These centralized architectures still need to be designed to fit in limited space, dissipate heat and keep wiring harnesses short. At the same time, components must be cost-effective to replace. Modular designs can help engineers meet these requirements.

"Coming from a distributed architecture, we are moving in the direction of a more centralized architecture," Larsson said. "But we do use a modular approach in the design of the automated-driving system, with clear interfaces to support applications with different requirements."

Powering all the cameras and chips for a large truck takes a lot of energy. Some processor chips can draw close to 100 W, and most systems have a large number of processors. Cameras also consume lots of power. Price said that electrical power requirements for TuSimple's autonomous vehicles range from 3 to 6 kilowatts, depending on the size of the truck and the version of the system architecture. That puts pressure on the vehicle's power generators.

"Power is a big deal for us. We're not compute-constrained, we are power-constrained," he said. "We consume almost all the power in a Class 8 truck with a big engine and a big alternator. Systems are quite power hungry; there are a lot of cameras."

Remote Control
While the goal of long-term autonomous programs is to get people out of the vehicle, some human intervention may be needed, said Walter Sullivan, CTO, Designated Driver. "Being able to distribute employee resources across distant geographies for some jobs can be a huge advantage in managing employees, utilization, and service."

Not everyone agrees with this concept. TuSimple's Price feels that latency and other issues make it quite difficult for remote operators to make real-time decisions.

"We don't believe remote driving can be made safe or reliable," Price said. "We do believe we can remotely alter the plan for the vehicle when it's in a minimum-risk condition. We can tell it to move forward 20 feet and left 2 feet, and to do that using the vehicle intelligence to determine if there's anything in its path and make other decisions."

The requirements for electronics used in commercial vehicles are quite similar to those set for many military components. Connected-vehicle needs are also like those of industrial equipment, which has for years offered connection to the Internet of Things (IoT). Connectivity also is a standard feature on many trucks. Those commonalities are attracting some military system providers.

"Trucking, smart agriculture and mining vehicles have requirements and environments that are similar to defense systems," said Joe Eicher, director of business development at Kontron. "These applications are also tying into areas like the IoT, where our industrial group plays, so our eyes are squarely on autonomous vehicles in mining, ag and trucks."

Terry Costlow
Democratize AV Technology!
The software-intensive, electrified and increasingly automated vehicle will define the 2020s. Its rise is driving both the industry-wide re-thinking of electrical architectures and the growth of engineering employment behind it. At the forefront of this trend is Aptiv, the technology Tier 1 spun off from Delphi in 2017. It now has more than 19,000 engineers among its 160,000 staff, comprising one of the highest engineer-to-employee ratios among large suppliers.

"We've been adding about 1,500 engineers per year, primarily in software and systems engineering, at our 15 major technical centers," said CTO Glen DeVos. These resources, he noted, will help Aptiv accelerate its customers' development of new vehicle platforms with greater active-safety capability, including automated-driving functionality.

(Photo caption: Glen DeVos is leading Aptiv into new vehicle architecture technology.)
The OEMs want full upgradeability of software (FOTA, firmware over the air, and SOTA, software over the air) capabilities, DeVos explained. He noted they'll also want centralization of compute (moving from today's multiple ECUs to a few domain controllers) and zonal control, all with reduced complexity and cost.

DeVos called this broad trend "a blank-sheet approach to move away from traditional architectures" to more advanced, open-sourced ones. "We formed our Smart Vehicle Architecture group a little over two and a half years ago when we saw a trend developing: the massive content occurring in SAE Levels 1, 2 and 3 vehicles that is creating pain points for our OEM customers," he said.

More features equals more data, and not just for Level 4. "It's across the board," he said. "We realized that to pack everything they wanted and that we were thinking about into an L4 vehicle, there was no way to do it economically without fundamental architecture change."

"We want to make the compute agnostic and independent from all those sensors and actuation."

Based on its booked orders, Aptiv expects deployment to begin in 2022 in premium vehicles, ramping up steadily from 2025. And while the company has multiple programs developing SAE Level 4 automated-driving functionality with customers aimed at commercial geo-fenced operations, such performance is not anticipated to be ready in consumer-level vehicles until the 2030 timeframe.

"We've always talked about automated driving being on the continuum of active safety," said DeVos, who was part of Delphi's pioneering work on Jaguar's first Active Cruise Control, launched in 1998. "Going from Level 2 to 3 and ultimately to Level 4 and 5 is all on the continuum. We want to take the technologies we're developing for Levels 4/5 and apply them to Levels 2/3 as the next generation of advanced capabilities and features. It's important to think how we can bring both ends of the spectrum together."

Focus on SAE Level 2/3
SAE Level 2 to Level 3 is currently Aptiv's main focus for automated-driving systems development. "We know Level 0 and Level 1 systems, with the progression of Euro and U.S. NCAP [impact safety] requirements, are going to be the baseline by 2025," DeVos said. "You won't have cars that are significantly de-contented from Level 2. That's where the market is moving. It's our 'sweet spot.'"

An ADAS domain controller provides the fusing and perception modelling, the "brains" that DeVos compares to having a file server on board. In the new architectures, consolidating from the dozens of discrete ECUs on today's vehicles to up to five powerful central compute controllers will drive a change to smaller, "decontented" cameras and radars with less integrated processing capacity and thus lower cost. The controllers would be responsible for active safety, the user experience (UX), propulsion and chassis systems.

"We're seeing costs getting 'democratized' for those …" the speed to market of Level 3. Properly architected, they will not require add-in duplication to get redundancy. Instead, it can be accomplished through more effective sharing among controllers. "We'll have capability for moving processes from a failed controller to another, as opposed to just duplication, which is where we are today," he said.

(Photo caption: Aptiv is increasingly serving as a middleware integrator in the transformation from traditional to software-defined architectures.)

An Agnostic Approach
Aptiv's "SVA" approach is based on lessons learned from mobile computing (smartphones) and other industries where software is embedded and inseparably connected to the hardware in purpose-built machines, each one separate from the other and from one generation to the next. The lessons include:

• Abstracting software from hardware means decoupling software development from the underlying ECU or component development. DeVos admits that today it is a massively complex task to get everything to work properly. Proof of that came in 2014, the first year that warranty costs for software at the OEMs became greater than those for hardware. The situation will only worsen as today's distributed architectures proliferate, according to DeVos.

• Separating I/O from computing: with all the sensors, actuators and data flowing around the vehicle, and hard connections back to each of the compute platforms, changing those sensors and actuators at new-model time requires changing everything, re-architecting the compute and sensor interfaces.

And that's not how servers operate, DeVos said. They abstract compute from the I/O. "All the I/O comes in standard format to that server so it's managed very carefully," he said. "And that's the third important point: enabling the 'serverization' of the platform." This involves aggregating compute into several modules that support all the features of the vehicle and doing it more effectively.

"Essentially, what we want to do is make the compute agnostic and independent from all those sensors and actuation," DeVos explained. "For us it's not reinventing the wheel; it's applying this separation to the automotive space."
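The separation DeVos describes, compute consuming standardized messages so a sensor can be swapped at new-model time without re-architecting the controller, can be sketched as a thin adapter layer. The message fields and adapter classes below are hypothetical, not Aptiv's actual SVA interfaces.

```python
# Sketch of "serverization": the controller consumes a standard message
# format, so replacing a sensor means writing one adapter, not changing
# the compute. All class and field names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Detection:
    """Standardized I/O message every sensor adapter must emit."""
    source: str
    angle_deg: float
    range_m: float

class LegacyRadarAdapter:
    """Wraps an older radar that reports (range, bearing) tuples."""
    def read(self, raw):
        rng, bearing = raw
        return Detection("radar", bearing, rng)

class NewCameraAdapter:
    """Wraps a new-model camera that reports a dict; the controller
    below needs no change when this sensor replaces another."""
    def read(self, raw):
        return Detection("camera", raw["azimuth"], raw["distance"])

def nearest_object(detections):
    """Hardware-agnostic compute: operates on Detections, never raw I/O."""
    return min(detections, key=lambda d: d.range_m)

inputs = [
    LegacyRadarAdapter().read((42.0, 10.0)),
    NewCameraAdapter().read({"azimuth": 180.0, "distance": 7.0}),
]
print(nearest_object(inputs).source)   # camera
```

The design choice mirrors DeVos's file-server analogy: only the adapters know the hardware, so `nearest_object` survives sensor changes from one vehicle generation to the next.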