Abstract— This study aims to monitor and control plant production in greenhouses and to increase greenhouse efficiency through interdisciplinary research in agriculture, electronics, robotics, and data mining. In the small and numerous greenhouses used by farmers, collecting data with static sensor systems is both costly and impractical. In this study, an autonomous mobile robot was developed and used to collect information from the greenhouse and to map it. Using this robot, designed and produced for the needs of the project, sensory data such as an RGB-D map of the greenhouse, moisture, temperature, and light were obtained. The robot can navigate the greenhouse autonomously and can also be controlled manually by an operator. The robot runs the Robot Operating System (ROS). Using the RGB-D mapping and SLAM packages in ROS, the robot can find its position in the greenhouse, measure the temperature, moisture, and light density at that location, and generate a three-dimensional map of the greenhouse. Measurements taken at regular intervals yielded data on how the greenhouse environment and the plants grown in it changed over time. The next step of this study is to process the data gathered by the robot in real time to obtain information such as the number of crops, the phenological phase of the crops, or the condition of the greenhouse.

Keywords—greenhouse; robotics; ROS; RGB-D map

I. INTRODUCTION
Greenhouses play a very important role in agriculture around the world and in Turkey. Inside a greenhouse, unlike in outdoor farming, temperature, moisture, air flow, and light can be controlled. This control makes greenhouse production efficient, although greenhouses have limited indoor area compared to outdoor farming.
In Turkey, greenhouse farming has become widespread and continues to grow. According to the Turkish Statistical Institute [1], 663,621 decares of land were under protective cover in 2015, of which 389,407 decares were used for glass and plastic greenhouse farming.
Some previous studies suggested that greenhouses can be monitored via wired or wireless sensor networks [2-5]. These networks are very good for continuous data acquisition and greenhouse monitoring, but they are static: a sensor network must be established for each greenhouse. Considering that greenhouses have relatively small areas and are high in number, installing such sensor networks in all greenhouses would be costly.
Alternatives to static sensor networks are manned or unmanned systems. Unmanned systems consist of mobile robots, which can move freely on wheels or run on rails. For mobile robots, navigation is crucial when tasks are performed autonomously. Previous surveys discuss navigation techniques for mobile robots [6-8].
There are many mobile robot applications in the literature. In some studies, autonomous mobile robots were used for monitoring plants inside the greenhouse [9], harvesting crops [10], and detecting pests and spraying remedies on infected plants [11].
In this study, an autonomous robot is used for data acquisition from greenhouses. Unlike other studies, in this work the robot generates an RGB-D map of the greenhouse and measures the temperature, moisture level, light density, and air pressure at local positions.
In this paper, the design of the robot is first briefly discussed, then the navigation and data acquisition strategy is explained, and results from a real greenhouse are presented.

Fig. 1. Last state of the robot in the field.
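Each measurement the robot takes can be thought of as a map position tagged with the sensor readings taken there. A minimal sketch of such a record in Python (the field names and structure are illustrative assumptions, not the robot's actual data format):

```python
from dataclasses import dataclass

@dataclass
class GreenhouseSample:
    """One localized measurement taken by the robot at a map position."""
    x: float              # position in the RGB-D map frame (m), illustrative
    y: float
    temperature_c: float  # local temperature (°C)
    humidity_pct: float   # relative humidity (%)
    light_lux: float      # light density (lux)
    pressure_hpa: float   # air pressure (hPa)

# Example record at one waypoint
sample = GreenhouseSample(x=1.2, y=3.4, temperature_c=30.0,
                          humidity_pct=45.0, light_lux=354.0,
                          pressure_hpa=1015.0)
```

Collecting such records on every run is what allows changes in the greenhouse environment to be compared over time.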
Authorized licensed use limited to: SLIIT - Sri Lanka Institute of Information Technology. Downloaded on November 03,2020 at 02:22:17 UTC from IEEE Xplore. Restrictions apply.
Fig. 2. Electronic hardware design of the robot.
Fig. 3. ROS structure of the robot.
• Advanced processing board (APB): an NVIDIA Jetson TX1. This board has an on-board graphics processing unit (GPU) with 256 cores. Using this board and parallel programming, advanced tasks that require a lot of processing power, such as real-time image processing, mapping, and localization, can be performed.

B. Software Design
Like the hardware design, the software design has two parts. The first part is designed and implemented for the electronics level, and the second part for the advanced processing board.

1) Electronics Level
As described in the hardware design, the electronics level has three boards with MCUs: the motherboard, the power board, and the external sensor reading board.
• For the motherboard, drivers for peripherals, communication protocols, a PID controller for the motors, sensor reading routines, and the main program have been designed and implemented.
• For the power board, peripheral drivers, battery management unit functions, and communication protocols have been programmed.
• For the external sensor reading board, a driver program for each sensor has been written. The main functionality of this board's MCU is to read sensor measurements and send them to the motherboard.

2) Processing Level
On the advanced processing board, the Ubuntu operating system runs with the Robot Operating System (ROS). ROS makes it easy to integrate different sensors and programs. Processing-level programs run on the advanced processing board. Besides the ROS nodes, short scripts are written to start the ROS programs and handle errors.

Fig. 3 shows the ROS structure of this application. Five ROS nodes are used; the purpose of each node is described below.
• ZED_ROS_Wrapper: the driver of the ZED stereo camera. This node publishes depth image, RGB image, and visual odometry messages, and uses the advanced processing board's GPU capabilities. Fig. 4 shows the left camera image, the depth image, and the point cloud.
• Rtabmap: the Real-Time Appearance-Based Mapping node uses an RGB-D SLAM approach based on a global Bayesian loop closure detector. This node creates the RGB-D map of the surrounding area [13-14].
• navigation_node: drives the robot to the next waypoint by finding the shortest traversable route. The waypoint list is given to this node by the operator from the remote laptop.
• serial_com_node: transmits the navigation commands to the motherboard of the robot over the UART protocol.
• rtabmapviz: visualizes the map on the operator's laptop. The operator can also give new instructions to the robot through the rviz interface.
Other than the nodes in Fig. 3, a voice command program, mobile data communication with a server, and a manual control program over a Wi-Fi network have been added to the robot.

III. NAVIGATION AND DATA ACQUISITION STRATEGY
The biggest constraint in greenhouses is the lack of proper roads and corridors, which makes path finding harder for robots. In previous studies, image processing, different types of range finders, and odometry-based path finding techniques have been used [7-8].
Fig. 4. Left camera image, depth image and point cloud image from the stereo
camera.
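The point cloud in Fig. 4 is obtained by back-projecting each depth pixel through the pinhole camera model. A minimal sketch of this step (the focal lengths and principal point below are illustrative values, not the ZED's actual calibration, which the ZED driver provides):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an N x 3 point cloud.

    depth  : (H, W) array of depth values along the optical axis
    fx, fy : focal lengths in pixels
    cx, cy : principal point in pixels
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx   # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)
```

Each pixel's depth fixes Z, and the pixel's offset from the principal point, scaled by Z over the focal length, fixes X and Y.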
TABLE I. SENSOR READINGS FROM THE GREENHOUSE.

Sensor        Average    Maximum    Minimum
Temperature   30 °C      31.2 °C    29.4 °C
Humidity      45 %       59 %       43 %
Pressure      1015 hPa   1017 hPa   1013 hPa
Light         354 lux    375 lux    326 lux

V. CONCLUSION
In this study, it is shown that greenhouses can be RGB-D mapped and that this map can be used to navigate autonomously through the greenhouse. Using this technique, valuable data such as temperature, moisture, light, and pressure can be measured locally in the greenhouse. Since the RGB-D map is updated on every run, changes inside the greenhouse environment can be examined.
Using this robot system, precision farming tasks such as spraying remedies and fertilization can be performed autonomously inside the greenhouse. With advanced spectrometers or spectral cameras, even more valuable data can be gathered from the plants.
RGB-D mapping is sensitive to changes in the area: if the previously mapped area changes between two measurements, the robot will have a hard time localizing itself and will rely only on visual odometry.
In the future, integrating more capabilities into the robot software, such as obstacle avoidance, voice commands, and autonomous charging, is planned. Adding a robot arm to the platform is also among the design stages.

ACKNOWLEDGMENT
This research was funded by the T.R. Ministry of Food, Agriculture and Livestock and the ITU TARBIL Environmental Agriculture Informatics Applied Research Center. The authors are also grateful to MSc. Serkan Gungor from Delta Tech. and Innovation Inc. and Akilli Teknoloji Tic. A.Ş. for their support.

REFERENCES
[1] Turkish Statistical Institute, 2015. [Online]. Available: http://www.turkstat.gov.tr/. Accessed: Jun. 29, 2016.
[2] J. Paek, J. Hicks, S. Coe, and R. Govindan, "Image-based environmental monitoring sensor application using an embedded wireless sensor network," Sensors, vol. 14, no. 9, pp. 15981–16002, Aug. 2014.
[3] R. K. Nikhade and S. L. Nalbalwar, "Monitoring greenhouse using wireless sensor network," International Journal of Advanced Computer Research, vol. 3, no. 10, pp. 79–85, Jun. 2013. [Online]. Available: http://accentsjournals.org/PaperDirectory/Journal/IJACR/2013/6/15.pdf. Accessed: Jun. 29, 2016.
[4] D. D. Chaudhary, S. P. Nayse, and L. M. Waghmare, "Application of wireless sensor networks for greenhouse parameter control in precision agriculture," International Journal of Wireless & Mobile Networks, vol. 3, no. 1, pp. 140–149, Feb. 2011.
[5] Y. W. Zhu, X. X. Zhong, and J. F. Shi, "The design of wireless sensor network system based on ZigBee technology for greenhouse," Journal of Physics: Conference Series, vol. 48, pp. 1195–1199, Oct. 2006.
[6] R. González, F. Rodríguez, J. Sánchez-Hermosilla, and J. G. Donaire, "Navigation techniques for mobile robots in greenhouses," Applied Engineering in Agriculture, vol. 25, no. 2, pp. 153–165, 2009.
[7] E. Kamran, A. Mirasi, and M. Samadi, "Navigation techniques of mobile robots in greenhouses," International Journal of Advanced Biological and Biomedical Research, vol. 2, pp. 1438–1449, 2014. Accessed: Jun. 29, 2016.
[8] M. H. Ko, B.-S. Ryuh, K. C. Kim, A. Suprem, and N. P. Mahalik, "Autonomous greenhouse mobile robot driving strategies from system integration perspective: Review and application," IEEE/ASME Transactions on Mechatronics, vol. 20, no. 4, pp. 1705–1716, Aug. 2015.
[9] F. D. Von Borstel, J. Suárez, E. de la Rosa, and J. Gutiérrez, "Feeding and water monitoring robot in aquaculture greenhouse," Industrial Robot: An International Journal, vol. 40, no. 1, pp. 10–19, Jan. 2013.
[10] E. J. van Henten et al., "An autonomous robot for harvesting cucumbers in greenhouses," Autonomous Robots, vol. 13, no. 3, pp. 241–258, 2002.
[11] Y. Li, C. Xia, and J. Lee, "Vision-based pest detection and automatic spray of greenhouse plant," in Proc. 2009 IEEE International Symposium on Industrial Electronics, pp. 920–925, Jul. 2009.
[12] S. S. Mehta, T. F. Burks, and W. E. Dixon, "Vision-based localization of a wheeled mobile robot for greenhouse applications: A daisy-chaining approach," Computers and Electronics in Agriculture, vol. 63, no. 1, pp. 28–37, Aug. 2008.
[13] M. Labbe and F. Michaud, "Online global loop closure detection for large-scale multi-session graph-based SLAM," in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, 2014, pp. 2661–2666.
[14] M. Labbe and F. Michaud, "Appearance-based loop closure detection for online large-scale and long-term operation," IEEE Transactions on Robotics, vol. 29, pp. 734–745, 2013.