
Data Acquisition from Greenhouses by Using Autonomous Mobile Robot

Halil Durmuş, Ece Olcay Güneş, Mürvet Kırcı


Department of Electronics and Communication Engineering
Istanbul Technical University, Electrical and Electronics
Engineering Faculty, Maslak, Istanbul, Turkey
durmushalil@gmail.com, ece.gune@itu.edu.tr, ucerm@itu.edu.tr

Abstract—In this study, it is aimed to monitor and control plant production in greenhouses and to increase the efficiency of greenhouses by doing interdisciplinary research in the fields of agriculture, electronics, robotics, and data mining. In the small and numerous greenhouses used by farmers, collecting data with static sensor systems is both costly and impractical. In this study, an autonomous mobile robot has been developed and used to collect information from the greenhouse and to map the greenhouse. By using the autonomous mobile robot, which was designed and produced for the project's needs, sensory data such as an RGB-D map of the greenhouse, moisture, temperature, and light were obtained. Besides navigating autonomously in the greenhouse, the mobile robot can also be controlled manually by an operator. The robot runs the Robot Operating System (ROS). By using the RGB-D mapping and SLAM packages in ROS, the robot can find its position in the greenhouse, measure the temperature, moisture, and light density at that location, and generate a three-dimensional map of the greenhouse. In this study, data about how the greenhouse environment and the plants grown in it changed were obtained with measurements made at regular time intervals. The next step of this study is to process the data gathered by the robot in real time to obtain information such as the number of crops, the phenological phase of the crops, or the condition of the greenhouse.

Keywords—Greenhouse; Robotics; ROS; RGB-D map
I. INTRODUCTION

Greenhouses have a very important role in agriculture all around the world and in Turkey. Inside greenhouses, unlike in outdoor farming, temperature, moisture, air flow, and light can be controlled. This control gives greenhouses their production efficiency, but greenhouses have limited indoor area compared to outdoor farming.

In Turkey, greenhouse farming has become widespread and continues to grow. According to the Turkish Statistical Institute [1], 663,621 decares of land were under protective cover in 2015; of this, 389,407 decares were used for glass and plastic greenhouse farming.

In some previous studies, it was suggested that greenhouses can be monitored via wired or wireless sensor networks [2-5]. These networks are very good in terms of continuous data acquisition and greenhouse monitoring, but the problem with them is that they are static: a sensor network must be established in each greenhouse. Considering that greenhouses have relatively small areas and are high in number, installing such sensor networks in every greenhouse would be costly.

Alternatives to static sensor networks are manned or unmanned systems. Unmanned systems consist of mobile robots, which can either move freely on wheels or run on rails. For mobile robots, navigation is crucial when tasks are performed autonomously. Previous surveys discuss the navigation techniques of mobile robots [6-8].

There are many mobile robot applications in the literature. In some of these studies, autonomous mobile robots were used for monitoring the plants inside the greenhouse [9], harvesting crops [10], and detecting pests and spraying remedies on the infected plants [11].

In this study, an autonomous robot is used for data acquisition from greenhouses. Different from the other studies, in this work the robot generates an RGB-D map of the greenhouse and measures the temperature, moisture level, light density, and air pressure at local positions.

In this paper, firstly the design of the robot is briefly discussed, then the navigation and data acquisition strategy is explained, and finally results from a real greenhouse are presented.

Fig. 1. Final state of the robot in the field.


Fig. 2. Electronic hardware design of the robot.

II. DESIGN OF THE ROBOT

The final state of the robot is shown in Fig. 1. The robot was designed according to the project requirements: movement on rough terrain, low cost, modularity, and the conditions of the observed greenhouse. In addition to these requirements, on-board high processing power and extra communication protocols were added. In this section, the hardware and software structures of the robot are briefly described.

A. Hardware Design

The mechanical hardware of the robot consists of the chassis, motors, wheels, and body. The chassis has six wheels, and each wheel has its own motor. The wheels were selected according to the terrain. The body of the robot, which contains the electronic equipment, sensors, and batteries, was designed after all parts were selected. Lastly, the motor ratings were decided.

As seen in Fig. 2, the electronic hardware of the robot has two levels: the first is the electronics level and the second is the processing level.

1) Electronics Level

The electronics level comprises the battery, power board, motors, motherboard, RF communication, ultrasonic sensors, motor sensors, external sensor reading board, IMU, GPS, and environment sensors.

• The battery is the power source of the robot.
• The power board controls and monitors the battery charge and discharge states.
• The motors provide the robot's motion.
• The motherboard drives the motors, reads the sensors, controls the robot, and communicates with the other boards.
• A 2.4 GHz RF link is added to the system to control and communicate directly with the microcontroller unit (MCU) of the motherboard.
• Ultrasonic sensors are added for distance measurement.
• Motor sensors read the speed of the motors.
• The external sensor reading board reads the IMU, GPS, and environment sensors, then sends the sensory data to the motherboard (a read-and-forward sketch is given at the end of this subsection).
• The IMU finds the relative orientation of the robot.
• The GPS finds the position of the robot.
• The environment sensors measure temperature, humidity, light, and pressure; they are added to the robot to measure the conditions of the greenhouse.

2) Processing Level

The processing level was added to the robot for high-level processing tasks such as image processing and mapping. Its parts are described as follows.

• External storage is added to store the map, image, and video data of the greenhouse.
• A microphone is used to control the robot with voice.
• Over the GSM connection, the data gathered on the robot can be uploaded to the TARBIL database.
• The Wi-Fi connection is used to give navigation commands and to visualize the RGB-D map, sensor data, and camera feed on the remote laptop.
• The stereo camera rig is used for capturing images. These images are processed on board; from them, the depth image and the map are obtained on the advanced processing board.
• The advanced processing board (APB) is an NVIDIA Jetson TX1. This board has an on-board graphics processing unit (GPU) with 256 cores. By using this board and parallel programming, advanced tasks that require a lot of processing power, such as real-time image processing, mapping, and localization, can be performed.
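To make the electronics level's data path concrete, the following is a minimal read-and-forward sketch for the external sensor reading board, as promised in the bullet list above. It is illustrative Python rather than the board's real MCU firmware; the placeholder readings, the dictionary packet, and the send_to_motherboard helper are all assumptions.

```python
import time

def read_environment_sensors():
    # Placeholder values; on the real board these would come from the
    # temperature, humidity, light, and pressure sensor drivers.
    return {"temp_c": 30.0, "humidity_pct": 45.0,
            "light_lux": 354.0, "pressure_hpa": 1015.0}

def send_to_motherboard(packet):
    # Stub: the real board forwards the packet over its serial link
    # to the motherboard, which relays it onward.
    print("->", packet)

if __name__ == "__main__":
    for _ in range(3):  # the firmware would loop indefinitely
        packet = read_environment_sensors()
        # IMU and GPS readings are forwarded the same way; omitted here.
        send_to_motherboard(packet)
        time.sleep(1.0)  # the forwarding period is an assumption
```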

Fig. 3. ROS structure of the robot.

B. Software Design

Like the hardware design, the software design also has two parts. The first part is designed and implemented for the electronics level, and the second part is designed for the advanced processing board.

1) Electronics Level

As described in the hardware design, the electronics level has three boards with MCUs: the motherboard, the power board, and the external sensor reading board.

• For the motherboard, drivers for the peripherals, communication protocols, a PID controller for the motors, sensor reading routines, and the main program have been designed and implemented (a minimal PID sketch is given after this list).
• For the power board, peripheral drivers, battery management unit functions, and communication protocols have been programmed.
• For the external sensor reading board, a driver program for each sensor has been written. The main functionality of this board's MCU is reading the sensor measurements and then sending them to the motherboard.
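The paper names a PID controller for the motors but gives no further details; the following discrete PID step is a minimal Python sketch to make the control loop concrete. The gains and the ticks-per-second units are illustrative assumptions, not values tuned on the robot.

```python
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured, dt):
        # One discrete PID update: speed error -> motor drive command.
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Illustrative gains; the firmware's real values would be tuned on the robot.
pid = PID(kp=1.2, ki=0.5, kd=0.05)
command = pid.step(setpoint=100.0, measured=92.0, dt=0.01)  # ticks/s from motor sensors
```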
2) Processing Level

On the advanced processing board, the Ubuntu operating system runs with the Robot Operating System (ROS). By using ROS, the integration of different sensors and programs can be done easily. The processing level programs run on the advanced processing board. Besides the ROS nodes, short scripts were written to start the ROS programs and to handle errors.

Fig. 3 shows the ROS structure of this application. Five ROS nodes are used; the purpose of each node is described in the following.

• ZED_ROS_Wrapper: This node is the driver of the ZED stereo camera. It publishes depth image, RGB image, and visual odometry messages, and it uses the GPU capabilities of the advanced processing board. In Fig. 4, the left camera image, depth image, and point cloud can be seen.
• Rtabmap: The Real-Time Appearance-Based Mapping node uses the RGB-D SLAM approach based on a global Bayesian loop closure detector. This node creates the RGB-D map of the surrounding area [13-14].
• navigation_node: This node drives the robot to the next waypoint by finding the shortest traversable route. The waypoint list is given to this node by the operator from the remote laptop.
• serial_com_node: This node transmits the navigation commands to the motherboard of the robot using the UART protocol (a sketch of such a bridge is given below).
• rtabmapviz: This node is used to visualize the map on the operator's laptop. The operator can also give new instructions to the robot through the rviz interface.

Other than the nodes shown in Fig. 3, a voice command program, mobile data communication with the server, and a manual control program over the Wi-Fi network were added to the robot.
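As a sketch of how a bridge node like serial_com_node could forward navigation commands, the snippet below assumes velocity commands arrive as geometry_msgs/Twist messages on a cmd_vel topic and uses pyserial for the UART link. The port name, baud rate, and ASCII frame format are assumptions; the paper does not specify its protocol.

```python
#!/usr/bin/env python
import rospy
import serial
from geometry_msgs.msg import Twist

def make_handler(uart):
    def on_cmd_vel(msg):
        # Encode linear and angular velocity into a simple ASCII frame
        # and forward it to the motherboard over UART.
        frame = "V,%.3f,%.3f\n" % (msg.linear.x, msg.angular.z)
        uart.write(frame.encode("ascii"))
    return on_cmd_vel

if __name__ == "__main__":
    rospy.init_node("serial_com_node")
    # Port and baud rate are assumptions for illustration.
    uart = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=0.1)
    rospy.Subscriber("cmd_vel", Twist, make_handler(uart))
    rospy.spin()
```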

Fig. 4. Left camera image, depth image, and point cloud image from the stereo camera.
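The streams shown in Fig. 4 come from the ZED_ROS_Wrapper node's topics, so they can be tapped by any small subscriber, e.g. for logging. The sketch below is a minimal example; the topic names follow common zed_ros_wrapper defaults and should be treated as assumptions to check against the actual launch configuration.

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import Image
from nav_msgs.msg import Odometry

def on_rgb(msg):
    rospy.loginfo("RGB frame %dx%d", msg.width, msg.height)

def on_depth(msg):
    rospy.loginfo("Depth frame %dx%d", msg.width, msg.height)

def on_odom(msg):
    p = msg.pose.pose.position
    rospy.loginfo("Visual odometry: x=%.2f y=%.2f", p.x, p.y)

if __name__ == "__main__":
    rospy.init_node("zed_listener")
    # Topic names are assumptions based on common zed_ros_wrapper defaults.
    rospy.Subscriber("/zed/rgb/image_rect_color", Image, on_rgb)
    rospy.Subscriber("/zed/depth/depth_registered", Image, on_depth)
    rospy.Subscriber("/zed/odom", Odometry, on_odom)
    rospy.spin()
```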

III. NAVIGATION AND DATA ACQUISITION STRATEGY

The biggest constraint in all greenhouses is the lack of proper roads and corridors, which makes it harder for robots to find a path. In previous studies, image processing, different types of range finders, and odometry-based path-finding techniques have been used [7-8].

In this study, the robot is trained by the operator to learn the initial map of the greenhouse. The operator can then set tasks such as roaming in the greenhouse and collecting data, or taking measurements at given locations. These missions can be given to the robot periodically. The robot updates the RGB-D map in every run through the greenhouse.

The algorithm for learning and navigating in the greenhouse can be seen in Fig. 5. This algorithm runs on the robot every time the robot starts working.

Fig. 5. Algorithm for learning and autonomous navigation in the greenhouse.

When the operator starts the robot, the operator chooses a working mode: the learn mode or the navigate mode. By default, the robot starts in learn mode unless navigation waypoints have been set.

In the learn mode, the robot is driven by the operator, either with the remote controller that is directly connected to the motherboard of the robot, or with the joystick that is connected to the remote laptop. While the operator is driving, the robot creates an RGB-D map of the surrounding area and takes temperature, humidity, moisture, and light measurements at ten-second intervals. This loop runs until the operator stops driving the robot. After the learn mode, the operator gives waypoints to the robot from the remote laptop.

The next time the robot starts working, it checks for waypoints. If waypoints are defined in the known RGB-D map, the robot enters the navigation mode. In the navigation mode, the robot autonomously moves to the first waypoint. The program then checks for the next waypoint; if there is one, the robot moves to it. If there is no waypoint left, the robot returns to where it started. While moving to the next waypoint, the robot updates the map and gathers the external sensor measurements.
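The mode selection and waypoint loop just described (and drawn in Fig. 5) reduce to a small control routine, sketched below. The robot interface here is hypothetical; load_waypoints, move_to, and the other helpers stand in for the actual ROS nodes and are not the paper's code.

```python
class RobotStub:
    """Hypothetical interface standing in for the real robot software."""
    def load_waypoints(self): return []  # none set -> learn mode
    def learn_mode(self, sample_period_s):
        print("learn mode: operator drives, sampling every", sample_period_s, "s")
    def current_position(self): return (0.0, 0.0)
    def move_to(self, wp): print("moving to", wp)
    def update_map(self): print("updating RGB-D map")
    def take_measurements(self): print("logging temperature, humidity, light, pressure")

def run(robot):
    waypoints = robot.load_waypoints()   # defined by the operator after a learn run
    if not waypoints:
        # No waypoints in the known RGB-D map: start in learn mode.
        robot.learn_mode(sample_period_s=10.0)
        return
    start = robot.current_position()
    for wp in waypoints:                 # navigation mode
        robot.move_to(wp)                # shortest traversable route to the next waypoint
        robot.update_map()               # RGB-D map is refreshed while moving
        robot.take_measurements()        # sensors sampled at the waypoint
    robot.move_to(start)                 # no waypoints left: return to the start

run(RobotStub())
```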
IV. RESULTS

For the tests, greenhouse selection had two constraints: the first is the corridor width and the second is the plant height. To overcome the first constraint, the robot was built as slim as possible; its width is 30 cm. For the second constraint, the stereo camera rig was installed on a length-adjustable pole. With this setup, the stereo camera rig can capture images from 80 cm, so the plant height should be at least 20 cm below the stereo camera rig.

According to these constraints, a cabbage farming greenhouse located in Istanbul was selected. Inside the greenhouse, the robot was operated using the remote controller. In the learning mode, the RGB-D map of the greenhouse was gathered; it is shown in Fig. 6. The temperature, humidity, air pressure, and light sensor measurements are given in Table I. For simplicity, only the average, maximum, and minimum values are shown.

After the learning mode, waypoints were defined from the remote laptop and sent to the robot. After restarting, the robot entered the navigation mode. In this autonomous run, the robot followed the given path. While wandering inside the greenhouse, the robot updated the RGB-D map and took measurements at the waypoints. Once the robot had passed all the waypoints, it returned to the starting point.

Fig. 6. RGB-D map of the greenhouse.

TABLE I. SENSOR READINGS FROM THE GREENHOUSE.

Sensor        Average     Maximum     Minimum
Temperature   30 °C       31.2 °C     29.4 °C
Humidity      45 %        59 %        43 %
Pressure      1015 hPa    1017 hPa    1013 hPa
Light         354 lux     375 lux     326 lux
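The Table I statistics are plain aggregates over the logged samples; for completeness, a minimal sketch of how they could be computed follows. The example readings are illustrative, not the study's raw data.

```python
def summarize(samples):
    # samples: list of numeric readings logged for one sensor
    return {"average": sum(samples) / len(samples),
            "maximum": max(samples),
            "minimum": min(samples)}

temperature_c = [29.4, 30.1, 31.2, 29.8]  # illustrative readings
print(summarize(temperature_c))
```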
V. CONCLUSION

In this study, it is shown that greenhouses can be mapped in RGB-D and that this map can be used to navigate autonomously through the greenhouse. By using this technique, valuable data such as temperature, moisture, light, and pressure can be measured locally in the greenhouse. Since the RGB-D map is updated in every run, changes inside the greenhouse environment can be examined.

By using this robot system, precision farming tasks, for instance spraying remedies and fertilization, can be done autonomously inside the greenhouse. Also, by using advanced spectrometers or spectral cameras, more valuable data can be gathered from the plants.

RGB-D mapping is sensitive to changes in the area: if the previously mapped area changes between two measurements, the robot will have a hard time localizing itself and will rely only on visual odometry for localization.

In the future, the integration of more capabilities into the robot software, such as obstacle avoidance, voice commands, and autonomous charging, is planned. Adding a robot arm to the platform is also one of the design stages.

ACKNOWLEDGMENT

This research was funded by the T.R. Ministry of Food, Agriculture and Livestock, and the ITU TARBIL Environmental Agriculture Informatics Applied Research Center. The authors are also grateful to MSc. Serkan Gungor from Delta Tech. and Innovation Inc. and Akilli Teknoloji Tic. A.Ş. for their support.

REFERENCES

[1] Turkish Statistical Institute, 2015. [Online]. Available: http://www.turkstat.gov.tr/. Accessed: Jun. 29, 2016.
[2] J. Paek, J. Hicks, S. Coe, and R. Govindan, "Image-based environmental monitoring sensor application using an embedded wireless sensor network," Sensors, vol. 14, no. 9, pp. 15981–16002, Aug. 2014.
[3] R. K. Nikhade and S. L. Nalbalwar, "Monitoring greenhouse using wireless sensor network," International Journal of Advanced Computer Research, vol. 3, no. 10, pp. 79–85, Jun. 2013. [Online]. Available: http://accentsjournals.org/PaperDirectory/Journal/IJACR/2013/6/15.pdf. Accessed: Jun. 29, 2016.
[4] D. D. Chaudhary, S. P. Nayse, and L. M. Waghmare, "Application of wireless sensor networks for greenhouse parameter control in precision agriculture," International Journal of Wireless & Mobile Networks, vol. 3, no. 1, pp. 140–149, Feb. 2011.
[5] Y. W. Zhu, X. X. Zhong, and J. F. Shi, "The design of wireless sensor network system based on ZigBee technology for greenhouse," Journal of Physics: Conference Series, vol. 48, pp. 1195–1199, Oct. 2006.
[6] R. González, F. Rodríguez, J. Sánchez-Hermosilla, and J. G. Donaire, "Navigation techniques for mobile robots in greenhouses," Applied Engineering in Agriculture, vol. 25, no. 2, pp. 153–165, 2009.
[7] E. Kamran, A. Mirasi, and M. Samadi, "Navigation techniques of mobile robots in greenhouses," International Journal of Advanced Biological and Biomedical Research, vol. 2, pp. 1438–1449, 2014.
[8] M. H. Ko, B.-S. Ryuh, K. C. Kim, A. Suprem, and N. P. Mahalik, "Autonomous greenhouse mobile robot driving strategies from system integration perspective: Review and application," IEEE/ASME Transactions on Mechatronics, vol. 20, no. 4, pp. 1705–1716, Aug. 2015.
[9] F. D. Von Borstel, J. Suárez, E. de la Rosa, and J. Gutiérrez, "Feeding and water monitoring robot in aquaculture greenhouse," Industrial Robot: An International Journal, vol. 40, no. 1, pp. 10–19, Jan. 2013.
[10] E. J. van Henten et al., "An autonomous robot for harvesting cucumbers in greenhouses," Autonomous Robots, vol. 13, no. 3, pp. 241–258, 2002.
[11] Y. Li, C. Xia, and J. Lee, "Vision-based pest detection and automatic spray of greenhouse plant," 2009 IEEE International Symposium on Industrial Electronics, pp. 920–925, Jul. 2009.
[12] S. S. Mehta, T. F. Burks, and W. E. Dixon, "Vision-based localization of a wheeled mobile robot for greenhouse applications: A daisy-chaining approach," Computers and Electronics in Agriculture, vol. 63, no. 1, pp. 28–37, Aug. 2008.
[13] M. Labbe and F. Michaud, "Online global loop closure detection for large-scale multi-session graph-based SLAM," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2661–2666, 2014.
[14] M. Labbe and F. Michaud, "Appearance-based loop closure detection for online large-scale and long-term operation," IEEE Transactions on Robotics, vol. 29, pp. 734–745, 2013.
