
A TECHNICAL SEMINAR REPORT ON

VEHICLE INTELLIGENCE

IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE

AWARD OF THE DEGREE OF

‘BACHELOR OF TECHNOLOGY’

IN

MECHANICAL ENGINEERING (MECHATRONICS)


Contents
1) Introduction
2) History
3) Components in Vehicle Intelligence
(a) Mechatronics System
(b) Sensors
(c) Artificial Intelligence
4) Levels of automation
5) Features
6) Benefits of automation
7) Applications
8) Present and Future
9) Drawbacks
10) References
INTRODUCTION
Although the automotive industry has always been a leading force behind many engineering innovations, this trend has become especially apparent in recent years. Competitive pressure creates an unprecedented need for innovation to differentiate products and reduce cost in a highly saturated automotive market, and to satisfy the ever-increasing demand of technology-savvy customers for increased safety, fuel economy, performance, convenience, entertainment, and personalization. With innovation thriving in all aspects of the automotive industry, the most visible advancements are probably in the area of vehicle controls, enabled by the proliferation of on-board electronics, computing power, wireless communication capabilities, and sensor and drive-by-wire technologies. The increasing sophistication of modern vehicles is also accompanied by the growing complexity of the required control models. It is therefore not surprising that numerous applications of methodologies generally known as "intelligent", "computational intelligence", and "artificial intelligence" have become increasingly popular in the implementation of vehicle systems. Vehicle intelligence is the application of sensors, mechatronics and artificial intelligence (AI) to enhance vehicles or to make them fully autonomous, driverless cars.
HISTORY:
It was not until the 1980s that this interest was transferred to the civil sector: governments worldwide launched the first projects, which supported a large number of researchers in these topics. The interest of the automotive industry in developing real products was only triggered after feasibility studies were successfully completed and the first prototypes were demonstrated. Testing of autonomous vehicles on real roads in a real environment was one of the most important milestones in the history of intelligent vehicles. This happened in the mid to late 1990s. In the summer of 1995, the Carnegie Mellon Navlab group ran their No Hands Across America experiment. They demonstrated automated steering, based solely on computer vision, over 98% of the time on a 2800 mile trip across the United States. Later in 1995 the Bundeswehr Universität Munich (UBM), Germany, fielded a vehicle that was demonstrated on a pioneering 1758 km trip from Munich to Copenhagen in Denmark and back.
[Figure: pioneering autonomous vehicles, from top to bottom: (a) NAVLAB, (b) UBM, and (c) Argo vehicles]
The vehicle was able to drive autonomously for 95% of the trip. The car suggested and executed maneuvers to pass other cars. Unlike later robot cars, this car located itself on the current road and followed it until instructed otherwise. It did not localize itself in global coordinates and could drive without the Global Positioning System (GPS) and road maps found in modern automotive navigation systems. The car's trunk was full of transputers and ad hoc hardware. A different approach was followed by VisLab at the University of Parma within the Argo project: the passenger car that was designed and developed was based on a low-cost approach. An off-the-shelf Pentium 200 MHz personal computer (PC) was used to process stereo images obtained from low-cost cameras installed in the driving cabin. The vehicle was able to follow the lane, locate obstacles, and, when instructed, change lane and overtake slower vehicles. The main milestone of this project was the successful test of the Argo vehicle in a tour of Italy of more than 2000 km, called 'Millemiglia in Automatico', in which the vehicle drove itself for 94% of the total distance.
Current research initiatives are oriented towards the development of intelligent vehicles in realistic scenarios. However, due mainly to legal issues, full autonomy has not yet been set as the ultimate goal: the automotive industry has set as its primary goal the need to equip vehicles with supervised systems and – more generally – advanced driving assistance systems (ADAS) instead of automatic pilots. In other words, the driver is still in charge of running the vehicle, but the drive is monitored by an electronic system that detects possibly dangerous situations and reacts by either warning the driver in due time, or taking control of the vehicle in order to mitigate the consequences of the driver's inattention. The good results obtained by ADAS in the automotive arena in recent years have induced the military sector to give a new vigorous push to the original idea of automating its fleet of ground vehicles. The Defense Advanced Research Projects Agency (DARPA) launched the Grand Challenge in 2003, a race for autonomous vehicles that had to travel more than 200 km in unstructured environments. This unprecedented challenge attracted a large number of top-level research institutes, who worked with the million-dollar prize in mind and helped the scientific community take a considerable step forward. In 2005 the DARPA Grand Challenge required autonomous driving in a rough-terrain desert scenario with no traffic, obstacle types known in advance, and few if any road markers, on a course predefined by 2935 GPS points.
Benefits of Intelligent Vehicles
Having intelligent vehicles running on our road network would bring a number of social, environmental, and economic benefits. An intelligent vehicle able to assess the driving scenario and react in case of danger would allow up to 90% of traffic accidents, those caused by human errors, to be eliminated, saving human lives. According to the World Health Organization, an estimated 1.2 million people worldwide are killed each year, and about forty times this number injured, due to traffic accidents. At the same time, vehicles able to drive at high speeds and very close to each other would decrease fuel consumption and polluting emissions; furthermore, they would also increase road network capacity. Vehicles communicating with a ground station could share their routes and be instructed to reroute in order to maintain a smooth traffic flow. Vehicles that can sense and obey speed limits or traffic rules would reduce the possibility of misinterpretation and antisocial driving behavior. Fully automatic vehicles would also offer a higher degree and quality of mobility to a larger population, including young, old, or infirm individuals, reducing the need even for a driving licence. Finally, the availability of vehicles that could drive themselves would increase the quality of mobility for everyone, turning personal vehicles into taxis able to pick up people and take them to their final destination in total safety and comfort, leaving the driving time free for their preferred activities.
However, this full application of intelligent vehicles is far from being complete, since unmanned vehicle technology is still under development for many other applications. The automation of road vehicles is perhaps the most common everyday task that attracts the greatest interest from industry. However, other domains such as agriculture, mining, construction, search and rescue, and other dangerous applications in general, are looking to autonomous vehicles as a possible solution to the issue of the ever-increasing cost of personnel. If a vehicle could move autonomously in a field to seed, enter a mined field, or even perform dangerous missions, the number of individuals put at risk would drastically decrease and, at the same time, the efficiency of the vehicle itself would be increased thanks to a 24/7 operational schedule. The key challenge for intelligent vehicles is safety; accidents must not occur due to automation errors and there is zero tolerance of human injury and death.
Components in Vehicle Intelligence:
1) Mechatronics system
2) Sensors
3) Artificial intelligence
Mechatronics System:
"Mechatronics" is an artificial built synonym for the combination of the three
disciplines mechanics, electronics and computer science. A typical
mechatronical
system differs to classical built up systems by an higher number of
elements
(complexity).
The field of mechatronics in automotive engineering refers to a
combinatorial
approach to design, with emphasis on contributions from mechanical,
electrical,
computer, and control engineering groups.
As manufacturers of passenger vehicles seek opportunities to differenti
offerings, design teams collaborate on novel uses of emerging technology
to
address elements of consumer demand.
Sophisticated driver support systems that deliver safety and performance
enhancements have emerged from such collaborations. Assuring
performance in such systems is mandatory, especially in those that involve
safety
enhancing devices.
" tronics differentiate their
quality and
Systems based on mechatronics use microelectromechanical systems (MEMS) sensors to pick up signals from the environment, process them to generate output signals, and transform those output signals into forces, motions, and actions.
Examples of mechatronic systems include digitally controlled combustion engines, self-adaptive machine tools, and contact-free magnetic bearings. Advanced control capabilities resulting from close integration of mechanical systems with sensors and microprocessors are among the most important aspects of mechatronics. Interdisciplinary in nature, mechatronics requires the input and coordination of design elements with control electronics and embedded software as integral components. Embedded software is so crucial to the functionality of mechatronics-based systems that it is typically referred to as a "machine element."
The most sophisticated of these systems incorporate a range of other sensors that can be utilized to gather inputs on road conditions and proximity to other vehicles and objects. These sensors can be integrated with mechanical control systems to provide automatic braking and throttle control capabilities.
Tire pressure monitoring systems (TPMS) are an example of automotive technology driving MEMS technology. TPMS could soon be mandatory in European vehicles. Putting the pressure sensor into the tire instead of the rim supports additional measurements, such as the tire's contact with the street. In such an intelligent TPMS, MEMS devices could also serve as energy harvesters that replace the battery. At least two companies are reportedly working on such an 'intelligent tire'.
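To illustrate the kind of decision logic a TPMS node performs, the following short Python sketch classifies a single pressure reading; the placard pressure and warning thresholds are assumed values for illustration only, not figures from any real system.

# Minimal sketch of a tire-pressure check; thresholds are assumed values.
RECOMMENDED_KPA = 230.0        # assumed placard pressure for this vehicle
CRITICAL_FRACTION = 0.75       # assumed "far below placard" cut-off
LOW_FRACTION = 0.90            # assumed "inflate soon" cut-off

def check_tire(pressure_kpa: float) -> str:
    """Classify one tire reading as OK, LOW or CRITICAL."""
    if pressure_kpa < CRITICAL_FRACTION * RECOMMENDED_KPA:
        return "CRITICAL: pressure far below placard, stop soon"
    if pressure_kpa < LOW_FRACTION * RECOMMENDED_KPA:
        return "LOW: inflate tire"
    return "OK"

print(check_tire(165.0))   # CRITICAL
print(check_tire(215.0))   # OK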
Electronics and systems based on mechatronics are among the principal drivers of innovation in passenger vehicles, and the value they contribute represents substantial advancements in safety as well as performance. Unfortunately, they also occupy a leading position in failure statistics. The greatest challenge is therefore to master the ever-rising levels of complexity while achieving zero-error production and structural durability over the entire performance range, and while continuing the integration of new features. For this reason, the goal of mechanical testing in this field is to ensure safety, elevate production quality, and increase structural durability.
SENSORS:
A sensor is a device that detects and responds to some type of input from the physical environment. The specific input could be light, heat, motion, moisture, pressure, or any one of a great number of other environmental phenomena. The output is generally a signal that is converted to a human-readable display at the sensor location or transmitted electronically over a network for reading or further processing.
Different Types of Sensors
The following is a list of different types of sensors that are commonly used in various applications. All these sensors are used for measuring one of the physical properties like temperature, resistance, capacitance, conduction, heat transfer, etc.
Temperature Sensor
Proximity Sensor
Accelerometer
IR Sensor (Infrared Sensor)
Pressure Sensor
Light Sensor
Ultrasonic Sensor
Smoke, Gas and Alcohol Sensor
Touch Sensor
Color Sensor
Humidity Sensor
Tilt Sensor
Flow and Level Sensor
Autonomous cars have nearly 40-50 sensors, of several different types.
Why do autonomous cars need so many sensors?
Imagine trying to drive down the road with a completely frosted-over windscreen. It would be a matter of seconds before you hit something or ran off the road. Autonomous vehicles are no different. They must be able to 'see' their environment in order to know where they can and cannot drive, detect other vehicles on the road, stop for pedestrians, and handle any unexpected circumstances they may encounter.
Each type of sensor has its own strengths and weaknesses in terms of range, detection capabilities, and reliability. A host of technologies is required to provide the redundancy needed to sense the environment safely. When you bring together two heterogeneous sensors, such as a camera and a radar, this is called sensor fusion.
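As a rough illustration of sensor fusion, the following Python sketch pairs camera detections (which give an object class and a bearing) with radar returns (which give range and closing speed) by nearest bearing. The data formats and the 3-degree association threshold are illustrative assumptions, not any particular vendor's interface.

# Minimal sketch of camera/radar fusion by nearest-bearing association.
from dataclasses import dataclass

@dataclass
class CameraDetection:
    bearing_deg: float   # angle of the object in the image
    label: str           # e.g. "car", "pedestrian"

@dataclass
class RadarReturn:
    bearing_deg: float
    range_m: float
    closing_speed_mps: float

def fuse(cams, radars, max_bearing_gap_deg=3.0):
    """Pair each camera object with the closest radar return in bearing."""
    fused = []
    for cam in cams:
        best = min(radars, key=lambda r: abs(r.bearing_deg - cam.bearing_deg),
                   default=None)
        if best and abs(best.bearing_deg - cam.bearing_deg) <= max_bearing_gap_deg:
            fused.append((cam.label, best.range_m, best.closing_speed_mps))
    return fused

print(fuse([CameraDetection(1.8, "car")],
           [RadarReturn(2.1, 42.0, -3.5)]))   # [('car', 42.0, -3.5)]

The result is a list of objects that carry both a class (from the camera) and an accurate range (from the radar), which is exactly the complementarity that motivates fusing the two sensors.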
Autonomous vehicle sensor categories
Automotive sensors fall into two categories: active and passive sensors.
Active sensors send out energy in the form of a wave and look for objects based upon the information that comes back. One example is radar, which emits radio waves that are returned by reflective objects in the path of the beam.
Passive sensors, such as cameras, simply take in information from the environment without emitting a wave.
Cameras
Cameras are already commonplace on modern cars. Since 2018, all new vehicles in the US have been required to fit reversing cameras as standard. Any car with a lane departure warning (LDW) system will use a front-facing camera to detect painted markings on the road.
Autonomous vehicles are no different. Almost all development vehicles today feature some sort of visible-light camera for detecting road markings, and many feature multiple cameras for building a 360-degree view of the vehicle's environment. Cameras are very good at detecting and recognizing objects, so the image data they produce can be fed to AI-based algorithms for object classification. Some companies, such as Mobileye, rely on cameras for almost all of their sensing.
However, cameras are not without their drawbacks. Just like your own eyes, visible-light cameras have limited capabilities in conditions of low visibility. Additionally, using multiple cameras generates a lot of video data to process, which requires substantial computing hardware.
Beyond visible-light cameras, there are also infrared cameras, which offer superior performance in darkness and additional sensing capabilities.
Radar
As with cameras, many ordinary cars already have radar sensors as part of their driver assistance systems – adaptive cruise control, for example.
Automotive radar is typically found in two varieties: 77 GHz and 24 GHz; 79 GHz radar will be offered soon on passenger cars. 24 GHz radar is used for short-range applications, while 77 GHz sensors are used for long-range sensing.
Radar works best at detecting objects made of metal. It has a limited ability to classify objects, but it can accurately tell you the distance to a detected object.
LiDAR (Light Detection and Ranging)
LiDAR (Light Detection and Ranging) is one of the most hyped sensor technologies in autonomous vehicles and has been used since the early days of self-driving car development.
LiDAR systems emit laser beams at eye-safe levels. The beams hit objects in the environment and bounce back to a photodetector. The returned beams are brought together as a point cloud, creating a three-dimensional image of the environment. This is highly valuable information, as it allows the vehicle to sense everything in its environment, be it vehicles, buildings, pedestrians or animals. This is why so many development vehicles feature a large 360-degree rotating LiDAR sensor on the roof, providing a complete view of their surroundings.
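The point-cloud idea can be illustrated with a short Python sketch that converts individual laser returns, given as range, azimuth and elevation, into Cartesian points; the sample returns are made-up values and real sensors stream many thousands of such returns per rotation.

# Minimal sketch: turning LiDAR range/angle returns into 3-D points.
import math

def to_cartesian(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return from spherical to (x, y, z) coordinates."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

returns = [(12.4, 0.0, -2.0), (12.6, 0.5, -2.0), (48.0, 90.0, 1.0)]  # made-up
point_cloud = [to_cartesian(*r) for r in returns]
print(point_cloud)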
While LiDAR is a powerful sensor, it's also the most expensive sensor in use. Some of the high-end sensors run into thousands of dollars per unit. However, there are many researchers and startups working on new LiDAR technologies, including solid-state sensors, which are considerably less expensive.
Ultrasonic sensors
Ultrasonic sensors have been commonplace in cars since the 1990s for use as parking sensors, and are very inexpensive. Their range can be limited to just a few metres in most applications, but they are ideal for providing additional sensing capabilities to support low-speed use cases.
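Ultrasonic sensors and radar both rely on the same time-of-flight principle: distance is half the propagation speed multiplied by the echo time. A minimal Python sketch of that calculation, with made-up echo times:

# Distance = (propagation speed x echo time) / 2, since the wave travels
# to the object and back.
SPEED_OF_SOUND_MPS = 343.0           # ultrasonic, in air at about 20 C
SPEED_OF_LIGHT_MPS = 299_792_458.0   # radar

def distance_m(echo_time_s: float, speed_mps: float) -> float:
    return speed_mps * echo_time_s / 2.0

print(distance_m(0.006, SPEED_OF_SOUND_MPS))    # ~1.03 m parking obstacle
print(distance_m(6.7e-7, SPEED_OF_LIGHT_MPS))   # ~100 m radar target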
Artificial Intelligence:
For an automobile to be autonomous, it needs to be continuously aware of its surroundings: first by perceiving (identifying and classifying information), and then by acting on that information through the autonomous computer control of the vehicle. Autonomous vehicles require safe, secure, and highly responsive solutions which need to be able to make split-second decisions based on a detailed understanding of the driving environment. Understanding the driving environment requires an enormous amount of data to be captured by myriad different sensors across the car, which is then processed by the vehicle's autonomous driving computer system.
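The perceive-then-act cycle described above can be summarised in a deliberately simplified Python sketch; the functions below are stand-ins for what are, in reality, large AI and control subsystems, and the data format and thresholds are invented for illustration.

# Highly simplified perceive -> decide -> act cycle (illustrative only).
def perceive(sensor_frame):
    """Keep only confidently classified objects from one batch of sensor data."""
    return [obj for obj in sensor_frame if obj.get("confidence", 0) > 0.5]

def decide(objects, ego_speed_mps):
    """Pick a driving action from the perceived scene."""
    hazards = [o for o in objects
               if o["range_m"] / max(ego_speed_mps, 0.1) < 2.0]  # < 2 s away
    return "brake" if hazards else "keep_lane"

def act(command):
    print("actuator command:", command)

frame = [{"label": "pedestrian", "range_m": 18.0, "confidence": 0.9}]
act(decide(perceive(frame), ego_speed_mps=14.0))   # -> actuator command: brake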
For the vehicle to be truly capable of driving without user control, an extensive amount of training must initially be undertaken for the artificial intelligence (AI) network to understand how to see, understand what it is seeing, and make the right decisions in any imaginable traffic situation. The compute performance of the autonomous car is on par with some of the highest-performance platforms that were only possible just a few years ago.
The autonomous vehicle is projected to contain more lines of code than any other software platform created to date. By 2020, the typical vehicle is expected to contain more than 300 million lines of code, more than 1 TB (terabyte) of storage, and memory bandwidth of more than 1 TB per second to support the compute performance necessary for autonomous driving platforms.
A self-driving car's AI system requires a continuous, uninterrupted stream of data and instructions in order to make real-time decisions based on complex data sets. With less reliance on needing to recognize the route, the attention of the autonomous computer can be paid to traffic, pedestrians and the other potential real-time hazards. This deliberately restricted range of operation is referred to as geo-fencing, and reflects the approach that early self-driving deployments are embracing for vehicles that are truly driverless. While geo-fencing can lead to a solution that works over a limited route, an autonomous vehicle with heavy reliance on geo-fencing in one part of the world may not function as well in another.
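The geo-fence test itself is simple; the complexity lies in everything around it. The following Python sketch checks whether the vehicle is inside an approved circular operating area around a depot; real deployments use detailed polygon maps, and the coordinates and radius here are made up.

# Minimal geo-fence check against a circular service area (illustrative).
import math

def inside_geofence(lat, lon, centre_lat, centre_lon, radius_km):
    """Rough distance check using an equirectangular approximation."""
    km_per_deg = 111.0
    dx = (lon - centre_lon) * km_per_deg * math.cos(math.radians(centre_lat))
    dy = (lat - centre_lat) * km_per_deg
    return math.hypot(dx, dy) <= radius_km

# Hypothetical service area: 5 km around a city-centre depot.
print(inside_geofence(52.951, -1.143, 52.954, -1.155, 5.0))   # True
print(inside_geofence(53.100, -1.400, 52.954, -1.155, 5.0))   # False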
Levels of Automation:
In the United States, the National Highway Traffic Safety Administration (NHTSA) has established an official classification system.
Level 0: The driver completely controls the vehicle at all times.
Level 1: Individual vehicle controls are automated, such as electronic stability control or automatic braking.
Level 2: At least two controls can be automated in unison, such as adaptive cruise control in combination with lane keeping.
Level 3: The driver can fully cede control of all safety-critical functions in certain conditions. The car senses when conditions require the driver to retake control and provides a "sufficiently comfortable transition time" for the driver to do so.
Level 4: The vehicle performs all safety-critical functions for the entire trip, with the driver not expected to control the vehicle at any time. As this vehicle would control all functions from start to stop, including all parking functions, it could include unoccupied cars.
Level 5: Fully automated. There is no need for a driver; the car takes care of the driving entirely, even in emergency cases. This type of automation is generally associated with many risks.
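For illustration, the classification above can also be written as a small lookup table that on-board software could consult; the wording is paraphrased from the levels listed above and the helper function is a simplifying assumption.

# Compact restatement of the levels above as a lookup table (illustrative).
AUTOMATION_LEVELS = {
    0: ("No automation", "driver controls everything at all times"),
    1: ("Function-specific", "individual controls automated, e.g. ESC or auto braking"),
    2: ("Combined functions", "at least two controls automated in unison"),
    3: ("Conditional", "driver may cede control but must retake it when asked"),
    4: ("High automation", "vehicle handles the whole trip, may be unoccupied"),
    5: ("Full automation", "no driver needed, even in emergencies"),
}

def driver_required(level: int) -> bool:
    """A human driver must stay available up to and including Level 3."""
    return level <= 3

for lvl, (name, desc) in AUTOMATION_LEVELS.items():
    print(lvl, name, "- driver required:", driver_required(lvl), "-", desc)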
Features:
1) Advanced driver assistance systems (ADAS). ADAS are defined here as vehicle-based intelligent safety systems which could improve road safety in terms of crash avoidance, crash severity mitigation and protection, and the post-crash phase. ADAS can, indeed, be defined as integrated in-vehicle or infrastructure-based systems which contribute to more than one of these crash phases. For example, intelligent speed adaptation and advanced braking systems have the potential to prevent a crash or mitigate its severity.
ADAS systems fit into various levels of autonomy, depending on how many of the following individual elements are contained within the car:
Adaptive cruise control (ACC)
Glare-free high beam and pixel light
Adaptive light control: swivelling curve lights
Automatic parking
Automotive navigation system, typically with GPS and TMC for providing up-to-date traffic information
Automotive night vision
Blind spot monitor
Collision avoidance system (Precrash system)
Crosswind stabilization
Cruise control
Driver drowsiness detection
Driver Monitoring System
Electric vehicle warning sounds used in hybrids and plug-in electric
vehicles
Emergency driver assistant
Forward Collision Warning
Intersection assistant
Hill descent control
Intelligent speed adaptation or intelligent speed advice (ISA)
Lane departure warning system
Lane change assistance
Night Vision
Parking sensor
Pedestrian protection system
Rain sensor
Surround View system
Tire Pressure Monitoring
Traffic sign recognition
Turning assistant
Vehicular communication systems
Wrong-way driving warning
2) Alcohol ignition interlock systems are automatic control systems which are designed to prevent driving with excess alcohol by requiring the driver to blow into an in-car breathalyser before starting the ignition. The alcohol interlock can be set at different levels and limits.
3) Autonomous emergency braking (AEB) systems detect approaching vehicles or other road users and apply braking to either prevent a collision from occurring or reduce the impact severity. Early systems were relatively slow in analysing the information from the camera or LIDAR sensors, and were therefore only able to brake sufficiently to avoid a collision at a relative velocity of around 15 km/h; such systems were commonly termed "City-AEB" or "low-speed AEB". More recent systems can operate faster and can therefore detect obstacles at greater travel speeds (a sketch of the time-to-collision logic behind AEB follows this list).
4) Anti-lock braking systems are in-vehicle devices which aim to prevent the wheels from locking during emergency braking, so preventing, for example, a motorcyclist from falling off their vehicle.
5) Lane keeping warning devices are electronic warning systems that are activated if the vehicle is about to veer off the lane or the road. Their effectiveness strongly depends on the reaction of the driver and on the visibility of the road markings. Times to collision in safety-critical lane changes are normally much less than one second.
6) Adaptive cruise control (ACC) enhances the automatic cruise control found in many new vehicles by automatically maintaining a set following distance from the vehicle in front. The distance to the preceding vehicle is measured by radar, laser systems or both. When the speed of the vehicle in front is lower than the set speed, the ACC system adjusts the vehicle speed to keep a safe distance to the lead vehicle (a sketch of this gap-keeping behaviour also follows this list).
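The time-to-collision (TTC) logic behind AEB (feature 3 above) can be sketched in a few lines of Python; the 1.2 s braking and 2.5 s warning thresholds are illustrative assumptions, not values from any production system.

# Minimal sketch of TTC-based autonomous emergency braking.
def time_to_collision_s(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:          # not closing in on the obstacle
        return float("inf")
    return range_m / closing_speed_mps

def aeb_command(range_m, closing_speed_mps, brake_ttc_s=1.2, warn_ttc_s=2.5):
    ttc = time_to_collision_s(range_m, closing_speed_mps)
    if ttc < brake_ttc_s:
        return "full autonomous braking"
    if ttc < warn_ttc_s:
        return "forward collision warning"
    return "no action"

print(aeb_command(range_m=10.0, closing_speed_mps=12.0))   # braking (TTC ~0.8 s)
print(aeb_command(range_m=60.0, closing_speed_mps=12.0))   # no action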
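Similarly, the following-distance behaviour of ACC (feature 6 above) can be sketched as a speed command that blends the driver's set speed with a gap-keeping correction; the 1.8 s time gap and the gain are illustrative assumptions.

# Minimal sketch of an ACC speed command (illustrative gains and gaps).
def acc_target_speed(set_speed_mps, own_speed_mps, lead_speed_mps,
                     gap_m, time_gap_s=1.8, gain=0.3):
    """Return the speed the ACC controller should command next."""
    if gap_m is None:                      # no vehicle detected ahead
        return set_speed_mps
    desired_gap_m = time_gap_s * own_speed_mps
    # Match the lead vehicle's speed, corrected for the gap error.
    follow_speed = lead_speed_mps + gain * (gap_m - desired_gap_m)
    return min(set_speed_mps, max(0.0, follow_speed))

print(acc_target_speed(33.0, 30.0, 25.0, gap_m=40.0))   # slows down: gap too small
print(acc_target_speed(33.0, 30.0, 25.0, gap_m=None))   # free road: set speed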
Benefits of vehicle intelligence:
1. Comfort
2. Driving pleasure
3. Safety
4. Convenience
These are the basic benefits we get from intelligent vehicles, although such vehicles also offer several more advanced capabilities.
Vehicle intelligence combines many different types of information and communications technology to create a network of systems that help manage traffic, protect roads and more. As more and more parts of our transportation network become connected, vehicle intelligence will change the way drivers, businesses and governments deal with road transport. These advanced systems can help improve transportation in several ways.
1) Improving traffic safety
Unsafe speeds, dangerous weather conditions and heavy traffic can all lead to accidents and the loss of life; intelligent transportation systems help with all of these. Real-time weather monitoring systems collect information on visibility, wind speed, rainfall, road conditions and more, giving traffic controllers up-to-the-minute information on driving conditions. In fully networked systems, this information can then be used to update warning signs and even speed limits as soon as the need arises, keeping drivers alert to the conditions around them. Emergency vehicles can respond quickly to accidents as real-time traffic monitoring alerts them. Intelligent traffic control helps divert traffic away from busy or dangerous areas, preventing traffic jams and also reducing the risk of collisions.
2) Reducing infrastructure damage
Heavy vehicles can put a lot of strain on the road network, particularly when they're overloaded. Weigh stations and other older forms of weight control reduce the risk of overloading, but at the expense of wasted time and delayed traffic. Weigh-in-motion systems measure the type, size and weight of vehicles as they move, communicating the collected data back to a central server. Overloaded vehicles can be identified and appropriate measures taken, resulting in higher compliance among hauliers and reduced damage to roadways. Not only do these systems make enforcement simpler, they can reduce expenditure on road repair, allowing it to be allocated elsewhere.
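The enforcement step can be illustrated with a short Python sketch that sums the measured axle loads and flags vehicles above their permitted gross weight; the vehicle classes, limits and record format are assumptions chosen for illustration.

# Minimal sketch of a weigh-in-motion overload check (illustrative limits).
PERMITTED_GROSS_KG = {"rigid_2_axle": 18_000, "artic_5_axle": 40_000}

def check_vehicle(vehicle_class: str, axle_loads_kg: list) -> dict:
    gross = sum(axle_loads_kg)
    limit = PERMITTED_GROSS_KG.get(vehicle_class)
    return {
        "class": vehicle_class,
        "gross_kg": gross,
        "overloaded": limit is not None and gross > limit,
    }

# A 5-axle articulated lorry measured at 43.5 t is flagged for enforcement.
print(check_vehicle("artic_5_axle", [7000, 9500, 9000, 9000, 9000]))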
3) Traffic control
Existing centralised traffic control systems go some way toward alleviating traffic congestion and ensuring the smooth flow of vehicles through a road network. Intelligent transportation systems, however, allow traffic lights to respond to changing patterns themselves. Adaptive traffic light systems create smart intersections that control traffic in response to the patterns they observe among the vehicles using them. They can also prioritise specific forms of traffic, such as emergency vehicles or public transit. Large numbers of adaptive intersections working together produce a system in which lights change in response to traffic patterns rather than on a fixed schedule, reducing wait times and keeping traffic moving smoothly.
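A very simple form of such adaptive timing is sketched below in Python: each approach receives green time in proportion to its measured queue, within fixed bounds. The detector data, cycle length and bounds are illustrative assumptions.

# Minimal sketch of adaptive signal timing proportional to queue length.
def green_times(queues, cycle_s=90, min_green_s=7, max_green_s=60):
    """queues: dict of approach name -> queued vehicle count."""
    total = sum(queues.values()) or 1
    times = {}
    for approach, count in queues.items():
        share = cycle_s * count / total
        times[approach] = round(min(max_green_s, max(min_green_s, share)))
    return times

# The heavier northbound queue gets the larger share of the cycle.
print(green_times({"north": 24, "east": 6}))   # {'north': 60, 'east': 18}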
4) Parking management
Illegal parking contributes to crowded, dangerous city streets and creates problems for disabled drivers, city vehicles and others needing access to reserved parking spaces. Overstaying drivers slow traffic to a crawl in busy areas as visitors find themselves unable to park. Traditional parking enforcement systems can be costly and inefficient; they may even add to crowding themselves.
Smart parking violation detection systems scan parked vehicles and communicate with parking meters to identify and record illegally parked vehicles. Instead of taking their chances with a human parking enforcement officer, drivers know they will automatically be cited for illegal or extended parking. These automatic systems help improve traffic flow by increasing driver compliance and smoothing the turnover of parking spaces.
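At its core, overstay detection is a comparison between the observed presence of a vehicle and the time paid at the meter, as the following Python sketch shows; the record layout and the five-minute grace period are assumptions for illustration.

# Minimal sketch of automatic overstay detection (illustrative record layout).
from datetime import datetime, timedelta

def is_violation(arrival: datetime, observed: datetime, paid_minutes: int,
                 grace_minutes: int = 5) -> bool:
    """True if the vehicle is still present past its paid time plus grace."""
    allowed_until = arrival + timedelta(minutes=paid_minutes + grace_minutes)
    return observed > allowed_until

arrival = datetime(2024, 5, 1, 9, 0)
print(is_violation(arrival, datetime(2024, 5, 1, 10, 12), paid_minutes=60))  # True
print(is_violation(arrival, datetime(2024, 5, 1, 9, 50), paid_minutes=60))   # False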
5) Gathering traffic data
Proper traffic planning is impossible without detailed data about road-use patterns. Existing traffic sensors can learn a lot about how many vehicles use a particular road or intersection, but intelligent transportation systems can do much more. Electronic traffic counters can measure the number and type of vehicles using a particular road or visiting a particular part of a city, as well as peak traffic times, journey length and other data.
As ever-growing populations of urban drivers and commuters put more and more stress on our road network, cities will need new and better tools in the constant struggle to prevent traffic congestion and keep drivers safe. The growing vehicle intelligence revolution represents a new way to think about traffic and road network management.
Applications
Intelligent vehicle technologies commonly apply to car safety systems and self-contained autonomous electromechanical sensors generating warnings that can be transmitted within a specified targeted area of interest, say within 100 meters of the transceiver. In ground applications, intelligent vehicle technologies are utilized for safety and commercial communications between vehicles, or between a vehicle and a sensor along the road.
Intelligent vehicle technologies provide instantaneous, on-the-road information to a motorist who wishes to map a route to a specific destination and expects the system to assist in determining the best course of travel. The in-vehicle system updates, approximately every minute (depending on the speed of the vehicle), all the transmitter beacon information self-recorded by the vehicle while traveling on the road. That is, all vehicles traveling on the highway upload such information to the local mile markers via DSRC telematics. The mile markers in turn communicate with the regional monitoring station and upload data so as to populate a statistical bar-graph trend of traffic flow progression. The data collected by the system is further made available for access, in an established data exchange format, through standard Internet Protocol (IP) communications links.
PRESENT AND FUTURE:
Present: parking sensors, automatic night lights.
Remote access to the car's performance: basic checks and simple analysis will be offered with the Mercedes-Benz B-Class Electric Drive.
Google has already made its prototype based on vehicle intelligence (VI), and it has eight sensors.
Working:
With driverless cars set to hit the roads in trials in three British cities next year, how do self-driving cars work? Google has been testing its prototype car on US roads – it's yet to be trialled in the UK – and has revealed some details about how its self-driving cars work.
Much of the autonomous technology used in Google's self-driving cars is already found on the road: for example, the Volkswagen Polo's automatic braking or the Ford Focus' automatic parallel parking, which both build on the increasingly common use of proximity sensors to aid parking. Combine these sensors with the automated-steering technology used for parking, throw in the seemingly old-hat technology that is cruise control, and you have the loose framework for a self-driving car.
Google's driverless car has eight sensors.
The most noticeable is the rotating roof-top LiDAR – a sensor that uses an array of 32 or 64 lasers to measure the distance to objects and build up a 3D map at a range of 200 m, letting the car "see" hazards.
The car also sports another set of "eyes": a standard camera that points through the windscreen. This also looks for nearby hazards – such as pedestrians, cyclists and other motorists – and reads road signs and detects traffic lights.
Externally, the car has a rear-mounted aerial that receives geolocation information from GPS satellites, and an ultrasonic sensor on one of the rear wheels that monitors the car's movements.
Internally, the car has altimeters, gyroscopes and a tachometer (a rev counter) to give finer measurements of the car's position. These combine to give the car the highly accurate data needed to operate safely.
How Google’s driverless car works
No single sensor is responsible for making Google's self-driving car work. GPS data, for example, is not accurate enough to keep the car on the road, let alone in the correct lane. Instead, the driverless car uses data from all eight sensors, interpreted by Google's software, to keep you safe and get you from A to B.
The data that Google's software receives is used to accurately identify other road users and their behaviour patterns, plus commonly used highway signals. For example, the Google car can successfully identify a bike and understand that, if the cyclist extends an arm, they intend to make a manoeuvre. The car then knows to slow down and give the bike enough space to operate safely.
Digital DriveStyle App: allows access to key functionalities of the smartphone in the vehicle. It was developed with road safety in mind and is designed to avoid distracting drivers during their journey.
Future:
"The car learns, adapts, predicts and interacts with the driver."
"Autonomous driving won't happen overnight. It will need more detailed map data and computing power."
Luxurious interiors, super-advanced crash-avoidance technology and drastically reduced carbon emissions are expected.
DRAWBACKS
There are some drawbacks to autonomous cars, although they pale in comparison to the improvements that are on the horizon. These include unemployment of skilled workers (taxi drivers and truck drivers), expensive technology (LiDAR car systems), reduced tax and insurance collection, functional dependency (on AI) and the debate on new laws and legislation linked to the new technology. It does seem, however, that these drawbacks are not enough to stop the research and development already underway.
DRIVER’S LICENSE OBSOLETE?
In the next few years, the main players – Tesla, Nissan, Toyota, Audi, Google, Lexus, BMW, Mercedes-Benz and Volvo – are preparing to launch autonomous vehicles, with an emphasis on the electrification of cars as well. This means that the driver's license may be approaching obsolete status, along with the classic "hot rod gas guzzlers" sporting oversized diesel and gasoline engines that are currently being outperformed by electric cars across the board (with the exception of range).
The future of automobile transport is tightly linked to communications technology (IoT) and the sharing economy, to boost efficiency and safety in an unprecedented way. We can expect major changes in the next 10-20 years in the way our roads and transport services are operated, with new avenues for value creation in transport as a service and major upgrades in safety and efficiency.
References:
1) Artificial Intelligence: https://igniteoutsourcing.com/automotive/artificial-intelligence-inautomotive-industry/
2) The future of autonomous vehicles: https://www.futuresplatform.com/blog/future-autonomous-vehicles
3) ADAS features: http://roboticsandautomationnews.com/2017/07/01/adas-features-ofadvanced-driver-assistance-systems/13194/
4) Google Prototype: https://www.alphr.com/cars/7038/how-do-googles-driverless-cars-work
5) ADVISORS (2003) Action for advanced Driver assistance and Vehicle control systems Implementation, Standardisation, Optimum use of the Road network and Safety, Brussels, 2003.
6) Aga, M. and Okada, A. (2003) Analysis of Vehicle Stability Control (VSC)'s effectiveness from crash data. ESV Paper 541, 18th ESV Conference, Nagoya, 2003.
