Conference Paper · July 2019
DOI: 10.1109/ICTS.2019.8850952

12th International Conference on Information & Communication Technology and System (ICTS) 2019

Lidar-based Obstacle Avoidance for the Autonomous Mobile Robot

Dony Hutabarat
Department of Electrical Engineering
Institut Teknologi Sepuluh Nopember
Surabaya, Indonesia
donyhutabarat.18071@mhs.its.ac.id

Muhammad Rivai
Department of Electrical Engineering
Institut Teknologi Sepuluh Nopember
Surabaya, Indonesia
muhammadJivai@ee.its.ac.id

Djoko Purwanto
Department of Electrical Engineering
Institut Teknologi Sepuluh Nopember
Surabaya, Indonesia
djoko@ee.its.ac.id

Harjuno Hutomo
Department of Electrical Engineering
Institut Teknologi Sepuluh Nopember
Surabaya, Indonesia
harjuno270396@gmail.com

Abstract—In conditions that are dangerous for humans and their environment, the use of robots can be a solution to overcome these problems. Various sensors are used to determine the obstacle-free path and the exact position of the robot. However, conventional sensors have limitations in terms of detection distance, spatial resolution, and processing complexity. In this study, an autonomous mobile robot has been developed equipped with a Light Detection and Ranging (LiDAR) sensor to avoid obstacles. The Braitenberg vehicle strategy is used to navigate the movements of the robot. Sensor data collection and the control algorithm are implemented on a Raspberry Pi 3 single-board computer. The experimental results show that this sensor can measure distance consistently, unaffected by the object's color and the ambient light intensity. The mobile robot can avoid colored objects of different sizes. This autonomous mobile robot can also navigate inside a room without any impact on the wall or the obstacles.

Keywords—Autonomous mobile robot, Braitenberg vehicle, LiDAR, Obstacle avoidance

I. INTRODUCTION

Some dangerous situations, such as chemical industries, polluted environments, and mining areas, can endanger workers and the residents around them. These can result in both financial and life losses. Therefore, the use of robots can be a solution to overcome these problems [1]. Mobile robots equipped with several sensors can investigate in a room or in an open area [2], as well as in industrial fields [3, 4]. The autonomous mobile robots that have been developed are equipped with obstacle avoidance features as one type of intelligent robot [5].

The ability to detect walls and obstacles around them to predict collision-free paths automatically is the main feature of autonomous mobile robots [6]. Various sensors such as infrared and ultrasonic range finders, cameras, and GPS are used to determine the obstacle-free path and the exact position of the mobile robot [7]. However, these conventional sensors have limitations in terms of detection distance, spatial resolution, and processing complexity. For instance, blanking intervals and angular uncertainty are limitations of ultrasonic distance sensors [8].

In this study, Light Detection and Ranging (LiDAR) technology is used as the obstacle avoidance system on an autonomous mobile robot. LiDAR has several advantages, such as a high level of precision with a long detection distance [5, 7].

II. LITERATURE REVIEW

A. Light Detection and Ranging
LiDAR is an optical scanning technology that measures the properties of radiated light to find the distance and other information about a target. Shooting pulses of laser light onto the object's surface is one method for determining the distance of an object, as shown in Figure 1. Light travels at a speed of about 300,000 kilometers per second, or 0.3 meters per nanosecond, in vacuum. The time difference between the light emitted and the light received back can be used to determine the distance of the object [9-11].

The standard information from LiDAR is an arrangement of points in polar coordinates [8]. This distance-measuring technology is widely used in the fields of geodesy, archeology, geography, geology, geomorphology, seismology, and remote sensing [12]. In this study, we use the YDLiDAR X4, a LiDAR equipped with a motor so that it can rotate to provide a 360-degree view, which has a working voltage of 5 V and a measurement range from 0.12 m to 10 m indoors. The data communication interface uses a serial port or USB adapter.

Fig. 1. The principles of LiDAR: the time difference between the emitted pulse and the pulse received back at the photo detector is converted to distance.

978-1-7281-2133-8/19/$31.00 ©2019 IEEE 197
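The time-of-flight principle described above reduces to a one-line conversion. A minimal sketch follows; the constant and function name are ours, for illustration only, since the YDLiDAR performs this conversion internally:

```python
# Speed of light in vacuum: about 0.3 m per nanosecond, as noted above.
C_M_PER_NS = 0.3

def tof_distance_m(round_trip_ns: float) -> float:
    """Convert a measured round-trip time (ns) to object distance (m).

    The pulse travels to the object and back, so the one-way
    distance is half the total path length.
    """
    return C_M_PER_NS * round_trip_ns / 2.0

# A pulse that returns after 10 ns corresponds to a 1.5 m target.
print(tof_distance_m(10.0))  # 1.5
```

The division by two is the essential step: the measured time covers the round trip, while the reported distance is one-way.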


B. Raspberry Pi
Raspberry Pi is a single-board computer that can be used in various applications. The Raspberry Pi 3 works at a voltage of 5 V using a micro USB connector. This computer uses the Broadcom BCM2837 chipset processor with 1 GB of memory, as shown in Table 1. The Raspberry Pi uses an SD memory card for data storage, as shown in Figure 2.

Table 1. Raspberry Pi 3 technical specifications [13].
Microprocessor: Broadcom BCM2837 64-bit Quad Core Processor
Processor Operating Voltage: 3.3 V
Raw Voltage Input: 5 V, 2 A power source
Maximum current through each I/O pin: 16 mA
Maximum total current drawn from all I/O pins: 54 mA
Flash Memory (Operating System): 16 GB SD memory card
Internal RAM: 1 GB DDR2
Clock Frequency: 1.2 GHz
GPU: Dual Core VideoCore IV® Multimedia Co-Processor. Provides OpenGL ES 2.0, hardware-accelerated OpenVG, and 1080p30 H.264 high-profile decode. Capable of 1 Gpixel/s, 1.5 Gtexel/s, or 24 GFLOPs with texture filtering and DMA infrastructure.
Ethernet: 10/100 Ethernet
Wireless Connectivity: BCM43143 (802.11 b/g/n Wireless LAN and Bluetooth 4.1)
Operating Temperature: -40 °C to +85 °C

C. Mobile Robot
Many kinds of sensors can be implemented on a mobile robot to determine its environmental conditions. The sensors commonly used are ultrasonic, infrared, cameras, and others [15]. In this study, a LiDAR sensor is implemented on a differential drive mobile robot, which has two wheels, each driven by a DC motor, and one free wheel [16].

The difference in speed between the right and left wheels is used to control the robot in this steering type [15,17,18]. The speed difference results in a kinematic motion of the mobile robot that forms a rotation angle when the robot changes direction. The kinematics model for differential steering is shown in Figure 3, where ICC is the Instantaneous Center of Curvature, R is the radius from the ICC to the midpoint between the two robot wheels, B is the width of the robot, VR is the linear speed of the right wheel, and VL is the linear speed of the left wheel. The relationship between the linear and angular velocities is as follows:

v = ω R                               (1)
ω = (VR - VL) / B                     (2)
R = (B / 2) (VR + VL) / (VR - VL)     (3)

Fig. 3. The differential drive kinematics model [12].
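The kinematics in equations (1)-(3) can be checked with a short sketch; the wheel speeds and track width below are illustrative values, not the parameters of the robot used in this study:

```python
def diff_drive_kinematics(v_r: float, v_l: float, b: float):
    """Return (v, omega, R) for a differential-drive robot.

    v_r, v_l : linear speeds of the right and left wheels (m/s)
    b        : distance between the two wheels (m)
    R is the distance from the ICC to the midpoint between the
    wheels; it is infinite when the robot moves straight.
    """
    v = (v_r + v_l) / 2.0            # linear speed of the wheel midpoint
    omega = (v_r - v_l) / b          # angular speed about the ICC
    if v_r == v_l:
        return v, 0.0, float("inf")  # straight-line motion, no ICC
    r = (b / 2.0) * (v_r + v_l) / (v_r - v_l)
    return v, omega, r

# Right wheel faster than left: the robot curves to the left.
v, omega, r = diff_drive_kinematics(v_r=0.6, v_l=0.4, b=0.2)
print(v, omega, r)  # 0.5 1.0 0.5 -> consistent with v = omega * R
```

Note that the returned values satisfy equation (1): v = ω R holds term by term for any unequal wheel speeds.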

Fig. 2. Block diagram of the Raspberry Pi 3 [14].

D. Braitenberg Vehicle
The Braitenberg vehicle is a vehicle equipped with two sensors and two motors, but with different connections between them [19]. Figure 4 shows the Braitenberg vehicle types 2a and 2b. The Braitenberg vehicle 2a is the type used for obstacle avoidance, while type 2b is used for tracking. The '+' sign indicates that a higher sensor signal will increase the DC motor speed.

In the Braitenberg algorithm, a number of sensors are connected to the motor settings, where the motor speed is affected by the sensor input [20]. The algorithm produces a weight matrix that converts the sensor inputs into motor speeds. The matrix is a two-dimensional array whose columns and rows correspond to the number of sensors and the number of motors, respectively.

Fig. 4. Braitenberg vehicles 2a and 2b [16].

Fig. 5. Block diagram of the mobile robot system.

Equation (4) represents the weight matrix, in which WL,Si is the weight of sensor Si used to adjust the left motor speed, while equation (5) gives the sensor values. The motor speeds can then be calculated as shown in equation (6), where Smax is the maximum value of the sensor.

W = | WL,S1  WL,S2  ...  WL,Sn |
    | WR,S1  WR,S2  ...  WR,Sn |      (4)

S = [ S1  S2  ...  Sn ]T              (5)

[ VL  VR ]T = Vmax W (S / Smax)       (6)

Fig. 6. The YDLIDAR X4 with 360° scanning [21].
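The weight-matrix mapping described in Section II.D can be sketched as follows; the 2x2 weights and sensor readings are made-up values chosen only to illustrate the cross-wiring of a tracking-type vehicle, not the weights used in this study:

```python
def braitenberg_speeds(weights, sensors, s_max, v_max):
    """Map raw sensor readings to (left, right) motor speeds.

    weights : 2 rows (one per motor) x N columns (one per sensor)
    sensors : N raw sensor readings, each in [0, s_max]
    Speeds are clamped to [0, v_max].
    """
    speeds = []
    for row in weights:  # one row of the weight matrix per motor
        activation = sum(w * s / s_max for w, s in zip(row, sensors))
        speeds.append(max(0.0, min(v_max, activation * v_max)))
    return tuple(speeds)

# Cross-wiring: the right sensor mainly drives the left motor and
# vice versa, so the robot steers toward the stronger stimulus.
W = [[0.2, 1.0],   # left motor: dominated by the right sensor
     [1.0, 0.2]]   # right motor: dominated by the left sensor
left, right = braitenberg_speeds(W, sensors=[100, 900], s_max=1000, v_max=1.0)
print(left > right)  # True: the robot turns toward the right-hand stimulus
```

Swapping the two rows of W would give the opposite wiring, where the robot turns away from the stronger stimulus instead.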


III. RESEARCH METHODS
Figure 5 depicts the overall block diagram of the obstacle avoidance system. The values used as references in this system are the angle and distance coordinates of the object obtained from the LiDAR. The Robot Operating System (ROS) and the YDLiDAR driver are installed on the Raspberry Pi 3 single-board computer. This study applies the Braitenberg vehicle 2b method, where the sensor data on the left side of the mobile robot are used to control the right motor, and vice versa. The Pulse Width Modulation (PWM) method is used to set the motor speed.

Figure 6 shows the LiDAR, which acts as a sensor that scans the distances of objects in a clockwise direction. In this study, only 360 data points are used, covering the range from 90° to -90° in half-degree steps. The weight values and the results of the distance measurements are presented in equations (7) and (8), respectively. Equations (9) and (10) are used to adjust the motor speeds, where Vmax is the motor speed at a PWM duty cycle of 100%, dmin = 0.15 m is the minimum distance value, and dmax = 0.5 m is the maximum distance value.

W = [ W1  W2  ...  W360 ]                            (7)
d = [ d1  d2  ...  d360 ]                            (8)
VL = Vmax (1 - WL [1 - (d - dmin) / (dmax - dmin)])  (9)
VR = Vmax (1 - WR [1 - (d - dmin) / (dmax - dmin)])  (10)

If the distance is less than 0.5 m at an angle between 90° and 0°, the right motor speed gradually decreases until the distance reaches 0.15 m. The motor speed then reaches its minimum, making the robot turn right, and vice versa. The flow diagram of the obstacle avoidance system is shown in Figure 7.
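The slow-down rule above (full speed beyond 0.5 m, a linear ramp down to the minimum at 0.15 m) can be sketched as follows; the weight parameter w is illustrative, not a value taken from the paper:

```python
D_MIN = 0.15  # m, distance at which the motor speed reaches its minimum
D_MAX = 0.50  # m, distance below which the robot starts slowing down

def motor_speed(d: float, v_max: float, w: float = 1.0) -> float:
    """Speed of the motor opposite the side where distance d was measured.

    Above D_MAX the obstacle is ignored; between D_MAX and D_MIN the
    speed ramps down linearly; at or below D_MIN it is at its minimum.
    """
    if d >= D_MAX:
        return v_max
    d = max(d, D_MIN)
    # Fraction of the ramp still remaining: 1 at D_MAX, 0 at D_MIN.
    ratio = (d - D_MIN) / (D_MAX - D_MIN)
    return v_max * (1.0 - w * (1.0 - ratio))

print(motor_speed(0.60, v_max=1.0))  # 1.0 (no obstacle influence)
print(motor_speed(0.15, v_max=1.0))  # 0.0 (minimum speed, robot turns)
```

With w = 1 the speed falls all the way to zero at the minimum distance; a smaller weight would only partially slow the motor, giving a gentler turn.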
Fig. 7. Flowchart of the obstacle avoidance system.

IV. RESULTS AND DISCUSSION

A. LiDAR measurement
The mobile robot platform used in this experiment is shown in Figure 8. The robot consists of the LiDAR as a 360° scanning sensor for obtaining the angle and distance of objects around the robot, a Raspberry Pi 3 as the data collector from the LiDAR and as the motor controller, a motor driver module, an 11.1 V Lithium Polymer (LiPo) battery pack, and a buck converter step-down module as a 5 V power supply. Table 2 shows that the minimum and maximum distance measurements of the LiDAR are 0.12 m and 10.5 m, respectively, with an average error of 0.9%. To assess the reliability of the data obtained by the LiDAR, the measurements are carried out under various ambient light intensities and with different object colors. There are six object colors in each experiment, namely red, green, blue, orange, purple, and black, at object distances of 1 m and 2 m. The results of the LiDAR measurements at light intensities of 83 lux, 15.5 lux, and 0.04 lux are shown in Table 3, Table 4, and Table 5, respectively. The results of this experiment show that the ambient light intensity and the surface color of the object do not significantly influence the LiDAR measurements.

Fig. 8. The mobile robot platform used in this experiment.

Table 2. The LiDAR measurements.
Distance (m)   Read Distance (m)   Error (%)
0-0.11         0                   -
0.12           0.121               0.83
0.16           0.161               0.83
0.20           0.201               0.83
0.24           0.241               0.83
0.5            0.505               1
1.5            1.495               0.33
2.5            2.509               0.36
3.5            3.517               0.48
4.5            4.531               0.68
5.5            5.525               0.45
6.5            6.561               0.93
7.5            7.606               1.41
8.5            8.600               1.17
9.5            9.679               1.88
10.5           10.662              1.54
11.5           0                   -
12.5           0                   -
Average                            0.9

Table 3. The LiDAR measurements at a light intensity of 83 lux.
               Object distance: 1 m        Object distance: 2 m
Object Color   Measured (m)   Error (%)    Measured (m)   Error (%)
Red            0.994          0.6          2.016          0.8
Green          0.984          1.6          2.008          0.4
Blue           0.980          2            2.013          0.65
Orange         0.992          0.8          2.006          0.3
Purple         0.991          0.9          2.001          0.05
Black          1.003          0.3          2.033          1.65
Average        0.990          1.03         2.013          0.64
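The Error column in Table 2 is the relative error of each reading; a quick check against a few rows of the table:

```python
def error_pct(actual_m: float, read_m: float) -> float:
    """Relative error of a LiDAR reading, in percent."""
    return abs(read_m - actual_m) / actual_m * 100.0

# A few (actual, read) pairs taken from Table 2.
for actual, read in [(1.5, 1.495), (2.5, 2.509), (10.5, 10.662)]:
    print(round(error_pct(actual, read), 2))  # 0.33, 0.36, 1.54
```

These reproduce the corresponding entries of the Error column, confirming how the table was computed.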
Table 4. The LiDAR measurements at a light intensity of 15.5 lux.
               Object distance: 1 m        Object distance: 2 m
Object Color   Measured (m)   Error (%)    Measured (m)   Error (%)
Red            0.992          0.8          2.099          0.49
Green          0.991          0.9          2.091          0.45
Blue           0.992          0.8          2.069          0.34
Orange         0.980          2            2.005          0.25
Purple         0.991          0.9          2.088          0.44
Black          1.003          0.3          2.022          1.1
Average        0.991          0.95         2.062          0.51

Table 5. The LiDAR measurements at a light intensity of 0.04 lux.
               Object distance: 1 m        Object distance: 2 m
Object Color   Measured (m)   Error (%)    Measured (m)   Error (%)
Red            0.992          0.8          2.013          0.65
Green          0.993          0.7          1.978          1.1
Blue           0.983          1.7          1.994          0.3
Orange         0.991          0.9          1.982          0.9
Purple         0.995          0.5          1.990          0.5
Black          1.002          0.2          2.018          0.9
Average        0.992          0.80         1.996          0.72

B. Avoidance of the mobile robot to various obstacles
In the first experiment, avoidance by the mobile robot is tested with obstacles of various colors at distances of 25 cm and 50 cm. Table 6 shows that the mobile robot can avoid objects of various colors with a success rate of 100%. In the second experiment, avoidance is tested with obstacles of various widths at distances of 25 cm and 50 cm. Table 7 shows that the mobile robot can avoid objects of various widths with a success rate of 100%. The third experiment is carried out with obstacles of various colors and sizes at distances of 25 cm and 50 cm. Table 8 shows that the mobile robot can avoid colored objects of different sizes, namely a glass bottle, a jerry can, and a cardboard box. However, the robot cannot avoid transparent objects such as acrylic and a polycarbonate bottle. This can be caused by the laser pulses emitted by the LiDAR not being reflected back by the object. The overall experiment includes navigation of the autonomous mobile robot through an indoor room both with and without obstacles. The results of the two experiments show that the mobile robot manages to navigate the room without hitting the wall or the obstacles. The movements of the mobile robot are shown in Figure 9 and Figure 10.

Table 6. The mobile robot avoidance of various obstacle colors.
Object Color   Collisions (25 cm)   Collisions (50 cm)
Red            No                   No
Green          No                   No
Blue           No                   No
Orange         No                   No
Purple         No                   No
Black          No                   No
Total          0% Yes, 100% No      0% Yes, 100% No

Table 7. The mobile robot avoidance of various obstacle widths.
Object Width (cm)   Collisions (25 cm)   Collisions (50 cm)
5                   No                   No
10                  No                   No
15                  No                   No
20                  No                   No
25                  No                   No
30                  No                   No
Total               0% Yes, 100% No      0% Yes, 100% No

Table 8. The mobile robot avoidance of various obstacles.
Object                 Collisions (25 cm)   Collisions (50 cm)
Acrylic                Yes                  Yes
Polycarbonate bottle   Yes                  Yes
Glass bottle           No                   No
Jerry can              No                   No
Cardboard box          No                   No

Fig. 9. The mobile robot navigation in a room without obstacles.

Fig. 10. The mobile robot navigation in a room with obstacles.

V. CONCLUSION
In this study, a differential drive autonomous mobile robot equipped with LiDAR has been developed to avoid obstacles. The Braitenberg vehicle strategy is used to navigate the movements of the robot. The distance measurements and the control algorithm are implemented on a Raspberry Pi 3 single-board computer. The experimental results show that the LiDAR can measure distances between 0.12 m and 10.5 m with an error rate of 0.9%. The color of the object and the intensity of the ambient light do not affect the measurements obtained from the LiDAR. The autonomous mobile robot can avoid colored objects of different sizes. However, the robot cannot avoid transparent objects. This autonomous mobile robot can navigate an indoor room, both with and without obstacles, with no impact on the wall or the obstacles.

ACKNOWLEDGMENT
This research was carried out with financial aid support from the Ministry of Research, Technology and Higher Education of the Republic of Indonesia (Kemenristekdikti RI).

REFERENCES
[1] M. Vuka, E. Schaffernicht, M. Schmuker, V. H. Bennetts, F. Amigoni, A. J. Lilienthal, "Exploration and localization of a gas source with MOX gas sensors on a mobile robot - A Gaussian regression bout amplitude approach", ISOEN - ISOCS/IEEE Int. Symp. Olfaction Electron. Nose, 2017, pp. 3-5.
[2] H. Widyantara, M. Rivai, D. Purwanto, "Wind direction sensor based on thermal anemometer for olfactory mobile robot", Indonesian Journal of Electrical Engineering and Computer Science, vol. 13, no. 2, 2019, pp. 475-484.
[3] M. Rivai, Rendyansyah, D. Purwanto, "Implementation of fuzzy logic control in robot arm for searching location of gas leak", International Seminar on Intelligent Technology and Its Application, 2015, pp. 69-74.
[4] G. A. Rahardi, M. Rivai, D. Purwanto, "Implementation of hot-wire anemometer on olfactory mobile robot to localize gas source", International Conference on Information and Communications Technology, 2018, pp. 412-417.
[5] Y. Peng, D. Qu, Y. Zhong, S. Xie, J. Luo, J. Gu, "The obstacle detection and obstacle avoidance algorithm based on 2-D lidar", IEEE Int. Conf. Inf. Autom. ICIA, in conjunction with IEEE Int. Conf. Autom. Logist., 2015, pp. 1648-1653.
[6] M. Bengel, K. Pfeiffer, B. Graf, A. Bubeck, A. Verl, "Mobile robots for offshore inspection and manipulation", IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2009, pp. 3317-3322.
[7] K. H. Hsia, H. Guo, K. L. Su, "Motion guidance of mobile robot using laser range finder", iFUZZY - Int. Conf. Fuzzy Theory Its Appl., 2013, pp. 293-298.
[8] N. A. Rahim, M. A. Markom, A. H. Adom, S. A. A. Shukor, A. Y. M. Shakaff, E. S. M. M. Tan, "A mapping mobile robot using RP Lidar scanner", 2016, pp. 87-92.
[9] M. D. Adams, "Lidar design, use, and calibration concepts for correct environmental detection", IEEE Trans. Robot. Autom., vol. 16, no. 6, 2000, pp. 753-761.
[10] M. M. Atia, S. Liu, H. Nematallah, T. B. Karamat, A. Noureldin, "Integrated indoor navigation system for ground vehicles with automatic 3-D alignment and position initialization", IEEE Trans. Veh. Technol., vol. 64, no. 4, 2015, pp. 1279-1292.
[11] J. Liu, Q. Sun, Z. Fan, "TOF Lidar Development in Autonomous Vehicle", IEEE 3rd Optoelectron. Glob. Conf., 2018, pp. 185-190.
[12] J. Zhang, S. Singh, "Visual-lidar odometry and mapping: Low-drift, robust, and fast", Proc. IEEE Int. Conf. Robot. Autom., 2015, pp. 2174-2181.
[13] "Raspberry Pi 3 Pinout, Features, Specifications & Datasheet." [Online]. Available: https://components101.com/microcontrollers/raspberry-pi-3-pinout-features-datasheet. [Accessed: 03-Jul-2019].
[14] "pi3-block-diagram-rev4.png (1259x900)." [Online]. Available: http://doc.xdevs.com/doc/RPi/pi3-block-diagram-rev4.png. [Accessed: 03-Jul-2019].
[15] M. S. Saidonr, H. Desa, M. N. Rudzuan, "A differential steering control with proportional controller for an autonomous mobile robot", Proc. IEEE 7th Int. Colloq. Signal Process. Its Appl. CSPA, 2011, pp. 90-94.
[16] J. Singh, P. S. Chouhan, "A new approach for line following robot using radius of path curvature and differential drive kinematics", 6th Int. Conf. Comput. Appl. Electr. Eng. - Recent Adv. CERA, 2017, pp. 497-502.
[17] R. Watiasih, M. Rivai, R. A. Wibowo, O. Penangsang, "Path planning mobile robot using waypoint for gas level mapping", International Seminar on Intelligent Technology and Its Application, 2017, pp. 244-249.
[18] R. Watiasih, M. Rivai, O. Penangsang, F. Budiman, Tukadi, Y. Izza, "Online Gas Mapping in Outdoor Environment using Solar-Powered Mobile Robot", International Conference on Computer Engineering, Network and Intelligent Multimedia, 2018, pp. 245-250.
[19] X. Yang, R. V. Patel, M. Moallem, "A fuzzy-braitenberg navigation strategy for differential drive mobile robots", J. Intell. Robot. Syst. Theory Appl., vol. 47, no. 2, 2006, pp. 101-124.
[20] E. Fabregas, G. Farias, E. Peralta, H. Vargas, S. Dormido, "Teaching control in mobile robotics with V-REP and a Khepera IV library", IEEE Conf. Control Appl. CCA, 2016, pp. 821-826.
[21] "YDLIDAR X4 DATASHEET," 2015.
