IGCESH2018
Universiti Teknologi Malaysia, Johor Bahru, Malaysia, 13-15 August 2018

GREENHOUSE ENVIRONMENT MAPPING USING A LOW-COST LIDAR VIA ROBOT OPERATING SYSTEM

M.S.A. Mahmud*1, M.S. Zainal Abidin 1 and Z. Mohamed 1


1 Control and Mechatronics Department, Faculty of Electrical Engineering, Universiti Teknologi Malaysia, Skudai, Johor, Malaysia.
(E-mail: msazimi2@live.utm.my, shukri@utm.my, zahar@fke.utm.my)

ABSTRACT

In agricultural robotics, the nature of the environment is very important for any type of agricultural robotic research. The high degree of environmental complexity in agricultural areas encourages researchers to use expensive sensors in order to produce reliable map data of the agricultural environment. This paper implements an inexpensive sensor, the RPLIDAR, to produce a map of a greenhouse environment. The experiment is executed via the Robot Operating System, implementing the Hector SLAM method for mapping. To map the LIDAR data, a Turtlebot platform is used. The resulting map was then compared with a simulated environment built from detailed prior measurements, and the results show a high degree of similarity between the sensor-produced map and the simulated environment.

Key words: agriculture, mapping, Robot Operating System

INTRODUCTION
Nowadays, mobile robots are widely applied in agriculture, especially in greenhouses, orchards and vineyards. Much robotics research has been conducted to solve the problem of autonomous motion planning and control of wheeled mobile robots. To design an autonomous robotic system, detailed information about the environment is needed, as the robot will traverse a dynamic and complex agricultural area.

Several methods have been used to model the agricultural environment, such as sonar, machine vision and laser radar (LIDAR). However, the methods most commonly implemented in agriculture are machine vision and LIDAR, because of their good compatibility with the agricultural environment.

Machine vision is a very versatile sensing method and is often implemented for mobile robot navigation and guidance, plant or object detection, and plant characteristic measurement, which are included in many agricultural tasks. Usually, a machine vision sensor consists of a camera with a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) imager. A machine vision device is a very inexpensive and powerful sensing tool; however, it must be combined with other sensors within a proper framework to be usable in certain applications [1-2]. In addition, when multiple camera positions and viewing angles are recorded, the arrangement is called stereo vision. However, such a system requires stable lighting conditions to remain stable, and a very stable lighting condition is difficult to achieve in agricultural areas.
The LIDAR sensor is commonly used for distance measurement, mapping, and obstacle detection and avoidance. This sensory system has been found to be promising for mapping owing to its ability to accurately measure a relative vectorial position (distance and direction) [3-4]. An algorithm was developed in [5] to build detailed orchard maps for precision agriculture applications. It used a 2D LIDAR sensor (SICK LMS-291) to compute the orchard map. The sensor was mounted on a moving ground platform and the data were analyzed using a hidden semi-Markov model. The results are promising, with average matching performances of 98.2% and 96.8% for localization and recognition, using data sets recorded one full year apart. However, the high cost of LIDAR sensors has limited their applicability in the development of inexpensive mobile robot systems.
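The "relative vectorial position" a 2D LIDAR returns can be projected into world-frame map points once the robot pose is known. The following is a minimal illustrative sketch, not taken from the paper; the function and parameter names are hypothetical:

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, pose):
    """Project a 2D LIDAR scan (distances and bearings) into world-frame
    (x, y) points, given the robot pose (x, y, heading in radians)."""
    px, py, theta = pose
    points = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue  # no return for this beam
        bearing = angle_min + i * angle_increment
        # distance + direction -> Cartesian point in the robot frame,
        # then rotate/translate into the world frame
        points.append((px + r * math.cos(theta + bearing),
                       py + r * math.sin(theta + bearing)))
    return points

# Robot at the origin facing +x; three beams at -90, 0 and +90 degrees
pts = scan_to_points([1.0, 2.0, 1.0], -math.pi / 2, math.pi / 2,
                     (0.0, 0.0, 0.0))
```

Accumulating such points over many poses, and rasterizing them into a grid, is the essence of the occupancy-grid maps that SLAM packages produce.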

This paper presents an application of an inexpensive LIDAR sensor to compute an agricultural map for robot navigation. An inexpensive RPLIDAR A1 sensor was mounted on a Turtlebot platform. The result was then compared with a simulated environment designed by taking exact measurements of the real environment. The mapping was computed using the Robot Operating System (ROS) running on an Intel Core i7 notebook.
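For reference, a pipeline of this kind is typically wired together in ROS roughly as follows. This is a generic sketch based on the standard rplidar_ros and hector_mapping packages, not the authors' configuration; the serial port and file names are assumptions:

```xml
<!-- hypothetical mapping.launch: RPLIDAR driver feeding Hector SLAM -->
<launch>
  <!-- publish /scan from the RPLIDAR A1 (serial port assumed) -->
  <node pkg="rplidar_ros" type="rplidarNode" name="rplidar">
    <param name="serial_port" value="/dev/ttyUSB0"/>
  </node>
  <!-- build the occupancy-grid map from /scan -->
  <include file="$(find hector_mapping)/launch/mapping_default.launch"/>
</launch>
```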

MAIN RESULTS

Figure 1(a) shows the simulated environment computed using MATLAB Simulink 3D. The simulated environment was designed in SOLIDWORKS by taking exact measurements of the real greenhouse environment. Figure 1(b) shows the environment computed using the RPLIDAR, obtained by navigating the Turtlebot platform throughout the greenhouse. The map was computed by combining laser and odometry data using Monte Carlo Simultaneous Localization and Mapping (SLAM).
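The laser-plus-odometry fusion step can be sketched as a minimal Monte Carlo (particle-filter) localization loop. This is an illustrative reconstruction under strong simplifying assumptions (a 1D corridor with a known wall position), not the authors' implementation:

```python
import math
import random

def mc_update(particles, odom_delta, measured_range, wall_x, noise=0.05):
    """One Monte Carlo localization step in a 1D corridor.

    particles: list of candidate x positions; odom_delta: odometry
    displacement; measured_range: LIDAR distance to a wall at wall_x."""
    # Motion update: apply the odometry displacement plus noise
    moved = [p + odom_delta + random.gauss(0.0, noise) for p in particles]
    # Measurement update: weight each particle by how well the laser
    # reading agrees with the range it would predict
    weights = []
    for p in moved:
        err = measured_range - (wall_x - p)
        weights.append(math.exp(-err * err / (2 * noise ** 2)))
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resampling: redraw particles in proportion to their weights
    return random.choices(moved, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
# Robot truly at x = 2.0; wall at x = 10.0, so the laser reads 8.0
particles = mc_update(particles, odom_delta=0.0,
                      measured_range=8.0, wall_x=10.0)
estimate = sum(particles) / len(particles)
```

After a single update the particle cloud collapses around the true position, which is the mechanism that lets noisy odometry and laser data be fused into a consistent pose, and hence a consistent map.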

Figure 1. Simulated (a) and computed (b) maps

Figure 1 shows that the map computed using the RPLIDAR shares the pattern of the simulated environment. Therefore, the low-cost RPLIDAR sensor has been shown to be able to map the greenhouse environment with reasonable accuracy. In this work, the accuracy of the map was also tested by selecting a destination; the mobile robot was able to navigate to that destination without colliding with obstacles. Therefore, the computed map is reliable enough to guide the mobile robot without any collision.

CONCLUSION
This paper implements a low-cost LIDAR sensor, the RPLIDAR, in a greenhouse mapping application. The sensor was mounted on a Turtlebot platform, and the mobile robot was navigated throughout the greenhouse environment to compute the map from the sensory data. The results show that the computed map has the same pattern as the simulated environment, which was drawn by taking exact measurements of the environment. The computed map will be used to design a navigation planner in future work.

Acknowledgment: The authors are grateful to Universiti Teknologi Malaysia, the Ainuddin Wahid Scholarship and the Ministry of Higher Education (MOHE) for their partial financial support through research funds, Vote No. R.J130000.7823.4F759.

REFERENCES
1. Hague, T., Marchant, J. A., and Tillet, N. D. Ground based sensing systems for autonomous agricultural
vehicles. Computers and Electronics in Agriculture 25(1-2), 11-28, 2000.
2. Fernandes, R., Salinas, C., Montes, H. and Sarria, J. Multisensory system for fruit harvesting robots.
Experimental testing in natural scenarios and with different kinds of crops. Sensors 14(12), 23885-23904,
2014.
3. Bietresato, M., Carabin, G., Vidoni, R., Gasparetto, A., and Mazetto, F. Evaluation of a LiDAR-based 3D-
stereoscopic vision system for crop-monitoring applications. Computers and Electronics in Agriculture 124,
1-13, 2016.
4. Subramaniam, V., Burks, T. F., Arroyo, A. A. Development of machine vision and laser radar based
autonomous vehicle guidance systems for citrus grove navigation. Computers and Electronics in Agriculture,
53(2), 130-143, 2006.
5. Underwood, J. P., Jagbrant, G., Nieto, J. I. and Sukarrieh, S. Lidar-based tree recognition and platform
localization in orchards. Journal of Field Robotics, 32(8), 1056-1074, 2015.
