
ENHANCED DRIVER SAFETY WITH ADVANCED VISION SYSTEMS

Dwight Howard
Delphi Automotive Systems IDI/NA Division
Kokomo, Indiana USA
dwight.howard@aptiv.com

INTRODUCTION
With increasing information diverting attention and longer commutes causing greater fatigue, safe operation of vehicles is a growing concern. The concept of workload management in the operation of vehicles is driving the automotive industry to address these issues. This paper discusses two primary technologies that will realize the objectives of workload management. Driver State Monitoring (DSM) and Gesture Recognition (GR) systems are being implemented in production vehicles on a rapid and urgent path. The impact of these technologies on driver fatigue and safety will be demonstrated. Subsystem architecture and the integration of DSM/GR subsystems into vehicle architecture will be illustrated. Sensor technologies, image recognition/processing, and decision logic will be discussed. Finally, the state of these technologies, current limitations, and development paths will be discussed.

BACKGROUND
Over the history of motor vehicles, safety has been one of the most, if not the most, important factors in the operation of vehicles and equipment. As complex mechanical and electrical systems, motor vehicles and motorized equipment must be designed to provide the utmost safety for operators, passengers, pedestrians, and property. The majority of safety relies on the skill, experience, and vigilance of the operators of vehicles and equipment. Technology has played a significant role in the capability of vehicles and equipment to function safely and in the ability of vehicle operators to operate vehicles safely. Examples of safety technologies are too numerous to list exhaustively. As a foundation, it is worth noting primary legacy features and subsystems that are important to safety. These are shown in Table 1.

Table 1. Example Legacy Automotive Safety Systems/Features

As electronics and microprocessor controls permitted greater sophistication in automotive safety, even more effective safety technologies have emerged since the 1980s. These are shown in Table 2.

Table 2. Examples of Safety Technologies Evolved Since ~1980 (Evolution of Automotive Safety Technologies)

While these legacy features have become standard in production vehicles, highly advanced safety features appear each model year. Many of these are categorized under emerging Advanced Driver Assistance Systems (ADAS) technologies that provide an important foundation and pathway to full autonomous driving. Examples of these features are listed in Table 3.

Table 3. Examples of recent and emerging features now classified as ADAS

ADAS provides assistance to the driver and enhances the driving experience. The primary function of ADAS is to ensure the safety of the driver and pedestrians. In the future, ADAS will be used for very complex purposes. For example, a number of vehicles will travel in a mode referred to as platooning. Vehicle-to-vehicle communications will allow vehicles to maintain close spacing even at very high speeds. Sensors on each vehicle will facilitate the precision required to keep the platoon in perfect synchrony. ADAS provides many detection and warning functions. For example, when a vehicle drifts across the lane, ADAS can issue a warning or engage brakes and steering to avoid a collision.

To function reliably, ADAS must be able to recognize objects, signs, the road surface, and moving objects on the road. ADAS must be able to make decisions and issue alerts or act on behalf of a driver. Another important benefit ADAS provides to the driver is offsetting the information overload from all of the complex and diverse technologies in vehicles. Despite all the benefits provided by technologies in vehicles, the downside is the increasing amount of information with which drivers must contend. This has led to a growing concern for what is generally described as "workload management". Drivers can rely on ADAS to detect and respond when potential risks arise. This reduces stress on drivers.

ADAS technology employs a variety of sensors: radar, LIDAR, ultrasonic, photonic mixer device (PMD), cameras, and night-vision devices. These enable ADAS-equipped vehicles to handle near and far detection in various environmental conditions. ADAS capabilities available in motor vehicles are shown in Figure 1.

Figure 1. Advanced Driver Assistance System Vehicle Implementation/Capabilities

ADAS covers a range of use cases. The Driver Assistance System (DAS) informs and warns, provides feedback on actions, increases comfort, and reduces workload by actively stabilizing or maneuvering the vehicle. ADAS is considered a subset of DAS, with increased use of complex processing algorithms to detect and evaluate the vehicle environment based on data collected from a variety of sensor inputs. ADAS capabilities that are available in current production vehicles are shown in Figure 2. While some ADAS functions require less computing power, other functions necessitate extremely powerful real-time processing and adaptive intelligence. Of even greater impact on compute power is the fusion and democratization of what can be a large number of sensors providing a range of real-time analog and digital data streams to the DAS.

Figure 2. DAS/ADAS Function

While ADAS capabilities have been in deployment since the early 2000s, the concept of ADAS as a distinct set of technologies emerged later in the decade. The features and capabilities within ADAS are commonly described as a progression of technology steps, each having a higher level of sophistication, leading to fully autonomous driving vehicles. This model is shown in Figure 3.

Figure 3. Levels of ADAS Leading to Autonomous Driving

In terms of autonomous driving, the U.S. National Highway Traffic Safety Administration (NHTSA) has identified five stages. These are shown in Figure 4.

Figure 4. Levels of Autonomous Driving According to NHTSA
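The lane-drift example mentioned above, in which ADAS either warns or intervenes, can be sketched as a simple decision function. This is purely an illustrative sketch: the data structure, thresholds, and function names below are assumptions for this paper, not a production ADAS algorithm.

```python
from dataclasses import dataclass

@dataclass
class LaneState:
    """Simplified lane-tracking output from a forward camera (illustrative)."""
    lateral_offset_m: float   # distance from lane center; sign gives drift direction
    lane_half_width_m: float  # half the detected lane width
    turn_signal_on: bool      # driver has indicated an intentional lane change

def lane_departure_action(state: LaneState, warn_margin_m: float = 0.3) -> str:
    """Return the ADAS response for a drift across the lane.

    A signaled lane change produces no action; an unsignaled drift
    first triggers a warning, then intervention at the lane boundary.
    """
    if state.turn_signal_on:
        return "none"            # driver intends the maneuver
    drift = abs(state.lateral_offset_m)
    if drift >= state.lane_half_width_m:
        return "intervene"       # crossing the boundary: brake/steer assist
    if drift >= state.lane_half_width_m - warn_margin_m:
        return "warn"            # approaching the boundary: audible alert
    return "none"

print(lane_departure_action(LaneState(1.7, 1.8, False)))  # prints "warn"
```

The key design point is that driver intent (the turn signal) gates the whole decision, so the system warns only on unintended drift.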
The long-term vision is that AD will obviate the need for conventional vehicle controls. Steering wheels, brake and accelerator pedals, and other physical controls will disappear. An example of this is a deployment of AD shuttles in the U.S. city of Las Vegas, Nevada. Images of this vehicle are shown in Figure 5. The photos show that there are no conventional controls. This vehicle is integrated into a dedicated transportation service over a short route and operates in autonomous mode.

Figure 5. Example of an autonomous shuttle bus with no physical controls

Safety Standards and Governance
Automotive industry safety has been overseen by government regulatory agencies. The increasing sophistication of automated safety features in motor vehicles requires safety standards and regulations that cover new technologies, applications, and use cases. The International Organization for Standardization provides industry safety guidelines in its release of ISO 26262. ISO states that the intent of ISO 26262 is for the guidelines to be applied to safety-related systems and electrical and/or electronic (E/E) systems installed in standard production passenger cars of up to 3500 kg gross vehicle weight. ISO 26262 does not address unique electrical and electronic systems in special configurations that are not in mass production. Automotive safety systems that fall under ISO 26262 safety guidelines are illustrated in Figure 6.

Figure 6. Automotive safety systems that fall under ISO 26262

For practical application of ISO 26262 guidelines, further definition is found in the Automotive Safety Integrity Level, or ASIL, criteria. ASIL provides criteria according to risk, severity, and impact in the event of failure. This is applied in classifying the design of systems (and software) in automotive vehicles. Automotive systems designers must assure their designs meet the appropriate ASIL criteria. Compliance is necessary to produce and market automotive vehicles. ASIL for automotive systems is shown in Figure 7.

Figure 7. ASIL for automotive systems

Automotive Safety Sensors
In practical implementations, ADAS capabilities are realized using various sensor types. Examples are shown in Figure 8.

Figure 8. Sensor types and packages

A variety of sensors are used because each sensor technology has strengths and limitations. Vehicle safety systems engineers must utilize and optimize multiple sensor types in vehicle architectures to provide the best possible overall coverage under all conditions. ADAS sensor types are compared with the latest radar sensor technology, millimeter wave, across primary use cases and conditions. This is shown in Figure 9.

Figure 9. Millimeter wave radar sensor technology compared with common existing ADAS sensor types (http://www.autoroad.cn/en/product/1/)

Figure 11. Sensor ranges for ADAS applications
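Returning briefly to the ASIL criteria discussed under Safety Standards and Governance, the classification can be sketched in code. ISO 26262-3 derives the ASIL from three classes: severity (S0-S3), probability of exposure (E0-E4), and controllability (C0-C3). The sketch below is an illustrative encoding, not normative text from the standard: it relies on the observation that the published risk graph is additive (S3/E4/C3 yields ASIL D, and lowering any class by one step lowers the ASIL by one level, bottoming out at QM), and the function name is an assumption.

```python
def asil(s: int, e: int, c: int) -> str:
    """Map severity (S0-S3), exposure (E0-E4), and controllability
    (C0-C3) classes to an ASIL, following the ISO 26262-3 risk graph.

    QM (quality management) means no ASIL requirement applies.
    """
    if not (0 <= s <= 3 and 0 <= e <= 4 and 0 <= c <= 3):
        raise ValueError("classification out of range")
    if s == 0 or e == 0 or c == 0:
        return "QM"  # no injury risk, incredibly low exposure, or controllable in general
    # Additive encoding of the risk graph: only the top four sums carry an ASIL.
    return {10: "D", 9: "C", 8: "B", 7: "A"}.get(s + e + c, "QM")

# e.g., unintended full braking at highway speed: S3, E4, C3 -> "D"
print(asil(3, 4, 3))  # prints "D"
```

A table-driven encoding like this keeps the hazard-classification logic auditable against the standard's own table during a safety assessment.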

As mentioned earlier, real-world use cases present extreme complexities for ADAS sensors. Optimal implementations drive architectures that combine sensor types. Combined sensor packaging is most common with forward-looking radar or LIDAR and camera systems. An implementation combining forward-looking camera and radar sensors for object detection and recognition is shown in Figure 8.

Implementations may also utilize multiple camera angles to achieve 3D, combined with radar. This is illustrated in Figure 10.

Figure 10. Millimeter wave forward-looking radar packaged with a forward-looking camera
Source: http://toyota.com.cn/innovation/safety_technology/safety_technology/technology_file/pre_crash/index.htm

Detection range is critical, particularly in locations where vehicles travel at high speeds and closing rates are extremely fast. ADAS sensors must provide reliable performance at the greatest possible distances to permit a safe response to incidental situations. This dictates the types of sensors that must be used for specific applications. Various applications and sensor technologies are illustrated in Figure 11.

ADAS Sensors vs. Vehicle-to-Vehicle and Vehicle-to-Infrastructure (V2V/V2I) Technology
While V2X is beyond the scope of this paper, it is worth highlighting key points about V2V/V2I (V2X) technology in relation to ADAS. V2X provides instantaneous, full-time information exchange between vehicles. V2X information includes a range of vehicle operational states. A few examples of information exchanged via V2X include steering angle, deceleration/acceleration, position and location, and suspension/stabilization shocks. V2X information sharing permits onboard ADAS systems to provide increased safety. In fact, many V2X features far exceed what ADAS sensor technologies alone provide. Driver intentions (turn signals) and road and weather hazards (stability system and wiper/defrost data) can be passed to nearby vehicles. While V2X and ADAS coexist, V2X clearly exceeds ADAS alone by operating in a virtual network with all in-range vehicles. Examples of this advantage include blind intersections, hidden driveways, and instances where passing is a risky operation and visibility is obscured by a chain of vehicles.

Distracted Driving and DMS-GC
Distracted driving has long been a major cause of accidents leading to personal injuries and deaths. Government safety agencies and the insurance industry compile incident reports from immense amounts of data. For example, NHTSA reports that distracted driving caused nearly 3,500 deaths and 391,000 injuries in 2015. The use of electronics while driving is known to be a major factor. In this same period, NHTSA estimated over half a million drivers were using electronic devices while driving. These data are shown in Figure 12.

Figure 12. U.S. driver distraction statistics for 2015 (https://www.nhtsa.gov/risky-driving/distracted-driving)

Driver Monitoring Systems (DMS)
Driver Monitoring Systems address the problem of distracted driving. DMS is intended to detect driver distraction, i.e., loss of driver focus/attention on the road, and provide ADAS with the risk severity. When DMS indicates distraction to ADAS, ADAS can signal the driver with an audible or vibration alert. In an extreme situation, ADAS may control steering and braking to avert a lane departure and collision.
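The alert-then-intervene escalation described above can be sketched as a small state decision. The thresholds and names here are hypothetical, chosen only to illustrate the escalation policy; a production DMS would derive them from validated human-factors data.

```python
def dms_response(eyes_off_road_s: float, collision_risk: bool) -> str:
    """Escalating DMS/ADAS response to driver distraction.

    Hypothetical policy: a brief glance away is tolerated, a longer one
    draws a visual cue and then an audible/haptic alert, and sustained
    inattention combined with an imminent hazard hands authority to
    ADAS steering and braking.
    """
    if collision_risk and eyes_off_road_s > 2.0:
        return "intervene"   # ADAS takes steering/braking authority
    if eyes_off_road_s > 2.0:
        return "alert"       # audible chime or seat/wheel vibration
    if eyes_off_road_s > 1.0:
        return "warn"        # visual cue in the instrument cluster
    return "ok"

print(dms_response(3.0, True))  # prints "intervene"
```

Separating the distraction estimate (eyes-off-road time) from the hazard estimate (collision risk) mirrors the division of labor described above: DMS reports driver state, and ADAS decides the response.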

DMS is described by various terminologies and capabilities. The general purpose, functionality, features, and architecture are common. Differences may exist in the level of capability of a particular DMS implementation. Early DMS capabilities provided face tracking. This yielded limited utility. For example, a head turned away from the road would be discerned, but detection of drowsiness would not be possible. Current DMS technology is capable of blink detection and eye gaze tracking. These capabilities allow greater accuracy in discerning actual drowsiness and distraction. By the early 2020s, DMS will have the ability to discern a driver's physical state. This will be possible by way of biometric assessment in the DMS algorithms.

DMS architecture is built around camera systems. A camera is the primary sensor in the DMS. A basic DMS utilizes a single camera. The DMS camera and IR illuminators are commonly mounted at the steering column, on the instrument cluster, or on the driver door pillar. For greater coverage of head movement, a second camera may be located on the instrument panel near the center air vents. An example of an implementation is shown in Figure 13.

Figure 13. DMS and gesture recognition components and example locations

In expanded applications, an additional camera may be mounted over the center stack area. Generally this additional camera is for gesture recognition. Gesture recognition may be integrated with the DMS to provide extremely fast and accurate selection and control of infotainment and climate control systems. This is referred to as gesture control (GC). GC can greatly reduce driver distraction and facilitate safer operation of vehicle features. During GC operation, the DMS identifies the feature control symbol or icon at which the driver's gaze is directed. The driver can make a slight physical gesture in the field of view of the GC camera. The DMS-GC recognizes the intent to select the feature identified by the driver's gaze. The feature is initiated.

As mentioned above, accuracy in discerning actual distraction and effecting refined gesture recognition requires eye-gaze tracking. DMS must detect specific features of the eye. Most important is the iris. This is performed by extremely complex software algorithms. Details of these algorithms are beyond the scope of this paper. An example of what the software algorithm "sees" is shown in Figure 14. This is a still image captured from a development system monitor, which displays the camera video and superimposed tracking indicators on the subject's features. This process is used to evaluate and refine the tracking capabilities of a DMS.

Figure 14. Head/face/eye gaze tracking in a development system

An example of a DMS eye gaze processing architecture is shown in Figure 15. This example illustrates the relationships and functions of key subsystems that make up a DMS. Facial features are provided into this part of the processing architecture as coordinate data, specifically the vertices of key facial features. Within these data are the eye features. The eye gaze algorithm processes these data to resolve the gaze solution.
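The coordinate-data input described above can be illustrated with a deliberately simplified sketch: given three 2-D facial vertices (the two eye corners and the iris center), the iris position between the corners is mapped to a horizontal gaze angle. A real DMS resolves a full 3-D gaze vector with far more sophisticated algorithms; the linear mapping, function name, and maximum-yaw parameter here are assumptions for illustration only.

```python
import math

def gaze_yaw_deg(eye_outer, eye_inner, iris_center, max_yaw_deg=45.0):
    """Estimate horizontal gaze angle from three 2-D facial vertices.

    The iris center's position along the corner-to-corner axis is mapped
    linearly to yaw: iris at the midpoint -> 0 degrees (looking straight
    ahead), iris at either corner -> +/- max_yaw_deg.
    """
    (x0, y0), (x1, y1), (xi, yi) = eye_outer, eye_inner, iris_center
    eye_width = math.hypot(x1 - x0, y1 - y0)
    if eye_width == 0:
        raise ValueError("degenerate eye corner vertices")
    # Project the iris center onto the corner-to-corner axis, normalized to [0, 1].
    t = ((xi - x0) * (x1 - x0) + (yi - y0) * (y1 - y0)) / eye_width**2
    return (t - 0.5) * 2.0 * max_yaw_deg  # midpoint maps to 0 degrees

# Iris exactly between the corners: gaze straight ahead.
print(gaze_yaw_deg((100, 200), (140, 200), (120, 200)))  # prints 0.0
```

Even this toy version shows why continuous vertex tracking matters: the gaze solution is only as good as the per-frame eye-corner and iris coordinates fed into it.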
Figure 15. A DMS eye gaze architecture highlighting key image data, data flows, and processing blocks

To provide the eye-gaze algorithm with usable data, the vertices of key facial points must be tracked continuously. This represents the upstream subsystem of the overall eye gaze tracking engine. Partitioning the algorithms in this way achieves faster lock and eye gaze tracking. Examples of face tracking combined with eye tracking and iris recognition for various face angles are shown in Figures 16 and 17.

Figure 16. Example of a DMS locked onto and tracking irises in the presence of head and eye movement

Figure 17. Examples of face tracking under different face poses (a), (b), (c) and facial expressions (d), (e), (f)

SUMMARY AND CONCLUSIONS
ADAS plays an important role in vehicle safety. ADAS sensors are the "eyes" and "ears" for detecting objects and people. There is a range of sensor types that can be implemented in an ADAS-equipped vehicle. Each type of sensor has particular strengths and limitations. ADAS must employ multiple types of sensors to cover all detection requirements and capabilities. Complementing ADAS is V2V technology. V2V has greater range and does not rely on line-of-sight. The combination of ADAS and V2V provides even greater safety.

Vision plays a vital role in the array of ADAS sensors. Cameras provide true vision to ADAS for parking assist, automated parking, backup vision, surround views, and driver monitoring. Distracted and drowsy driving cause thousands of deaths annually. Driver monitoring is a promising answer to this. Driver monitoring technology will be common in vehicles. DMS reduces the risk of accidents resulting from distraction and drowsiness. Complex algorithms are the intelligence in a DMS. These algorithms must follow the driver's head, eye, and gaze to track effectively and correctly assess distraction and drowsiness.

Gesture recognition is another vision-based feature. Gesture recognition facilitates gesture control. Combining gesture recognition with DMS facilitates hands-free operation of vehicle infotainment and climate systems. DMS-GC reduces distraction and helps reduce the driver's burden. Greater capability and safety will be realized in future DMS systems by way of advancing biometric detection. This will facilitate detection of driver impairment from drugs, alcohol, or loss of physical capacity. ADAS could prevent a vehicle from being started and driven, or take control to bring a vehicle to a safe stop.

ADAS capabilities are evolving in steps that will ultimately lead to full autonomous driving. As vehicles become fully automated, the need for the legacy controls used to operate vehicles fades. Most vehicles in this future time may not have steering wheels and control pedals. Vehicles of this type are already being deployed for limited and low-speed transportation services. The concept of ADAS is likely to gradually drop from view, but the technologies that started the evolution will still be vital and ever present.

ACKNOWLEDGEMENTS
Brian Bickford, Manager, Advanced Engineering, Delphi Automotive, LLC
Richard Torke, Senior Engineer, Driver Monitoring Applications, Delphi Automotive, LLC
Douglas Welk, Chief Engineer, Advanced Engineering, Delphi Automotive, LLC
Randell Wilson, Senior Engineer, Imaging Applications Engineering, Delphi Automotive, LLC

REFERENCES
[1] Distracted Driving, NHTSA. https://www.nhtsa.gov/risky-driving/distracted-driving
[2] Technology of Pre-crash Safety, Toyota. http://toyota.com.cn/innovation/safety_technology/safety_technology/technology_file/pre_crash/index.htm
[3] Automotive millimeter-wave radar imaging. http://www.autoroad.cn/en/product/1/
[4] Advanced Driver Assistant System: Threats, Requirements, Security Solutions, Intel. https://www.intel.com/content/www/us/en/embedded/automotive/advanced-driver-assistant-system-paper.html
[5] Autonomous driving in the car seat of the future (IAA 2015: Johnson Controls reveals response to the autonomous driving megatrend). https://www.presseportal.de/pm/19526/3123207
[6] Powering the autonomous world: introducing the I6500-F for functional safety, MIPS. https://www.mips.com/blog/introducing-the-i6500-f-for-functional-safety/
