
Journal of Terramechanics 91 (2020) 155–183


A review of autonomous agricultural vehicles (The experience of Hokkaido University)

Ali Roshanianfard a,*, Noboru Noguchi b, Hiroshi Okamoto b, Kazunobu Ishii c

a Department of Biosystems Engineering, Faculty of Agriculture and Natural Resources, University of Mohaghegh Ardabili, Daneshghah Street, Ardabil, Iran
b Laboratory of Vehicle Robotics, Graduate School of Agriculture, Hokkaido University, Kita-9, Nishi-9, Kita-ku, Sapporo 060-8589, Japan
c Research Faculty of Agriculture, Hokkaido University, Kita-9, Kita-ku, Sapporo, Hokkaido 060-8589, Japan

Article info

Article history:
Received 18 January 2020
Revised 25 June 2020
Accepted 29 June 2020

Keywords:
Smart farming
Digital farming
Precision farming
Agricultural robot
Artificial intelligence
Autonomous vehicle

Abstract

Robotic farming will play an undeniably significant role in future sustainable agriculture. Autonomous agricultural vehicles for arable crops and their components are reviewed herein, and their differing possible components, advantages and disadvantages are discussed. The autonomous agricultural vehicles are qualified from technical points of view, including each vehicle's hardware unit (platform development, platform, transporter system, operation functions, communication, sensors [positional sensor, attitude sensor, and safety sensor], and control unit), the physical environment, and the control algorithm. The development process for different vehicles is described. As an operational case study, all investigations conducted at Hokkaido University between 1990 and 2018 are discussed and evaluated based on the described classification. The development procedure, the process of component selection, developmental challenges, and the performance indicators of the vehicles are discussed in detail. The development process and applicability of each component are then presented, and recommendations for future studies are noted. Finally, the most important experiences and lessons, and some recommendations are outlined.

© 2020 ISTVS. Published by Elsevier Ltd. All rights reserved.

Contents

1. Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
2. The characterization of autonomous vehicles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
2.1. The variation in AV hardware . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
2.1.1. The variation in platform development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
2.1.2. The variation in platforms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
2.1.3. The variation in transporter systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
2.1.4. The variation in operation function. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160
2.1.5. The variation in communication . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160
2.1.6. The variation in sensors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161
2.1.7. The variation in controlling unit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 164
2.2. The variation in environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 164
2.3. The variation in controlling algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
2.4. Performance indicators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
3. Literature review results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
3.1. Autonomous vehicles in Hokkaido University . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
3.1.1. AV-1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
3.1.2. AV-2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
3.1.3. AV-3 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 170
3.1.4. AV-4 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
3.1.5. AV-5 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171

* Corresponding author at: Assistant Professor of Biosystem Engineering, Department of Agriculture and Natural Resources, University of Mohaghegh Ardabili.
E-mail addresses: alirf@uma.ac.ir, alirf@bpe.agr.hokudai.ac.jp, ali.roshanian@yahoo.com (A. Roshanianfard), noguchi@cen.agr.hokudai.ac.jp (N. Noguchi), hiro@bpe.agr.hokudai.ac.jp (H. Okamoto), ici@bpe.agr.hokudai.ac.jp (K. Ishii).

https://doi.org/10.1016/j.jterra.2020.06.006
0022-4898/© 2020 ISTVS. Published by Elsevier Ltd. All rights reserved.

Nomenclature

AAV      Autonomous agricultural vehicle
AD       Analog-to-digital board
AI       Artificial intelligence
ANN      Artificial neural network
API      Application programming interface
AR(s)    Agricultural robot(s)
ATV      All-terrain vehicle
AV-n     Autonomous vehicle - number
CAN-bus  Controller area network bus
CB       Circle back
CBM      Controlled by modifications
CWM      Controlled without modification
DA       Digital-to-analog board
DGPS     Differential GPS
DOF      Degrees of freedom
ECU      Engine control unit
FL       Fuzzy logic
FOG      Fiber optical gyroscope
GA       Genetic algorithm
GDS      Geomagnetic direction sensor
GIS      Geographic information system
GNSS     Global navigation satellite system
GPS      Global positioning system
HRI      Human-robot interaction
HST      Hydrostatic transmission
IMU      Inertial measuring unit
INS      Inertial navigation system
IO       Input/output
IOT      Internet of things
JAEI     Japan Aviation Electronics Industries
LIDAR    Light detection and ranging
LNAV     Lateral navigation
LRF      Laser range finder
LSM      Least squares method
LTE      Long-term evolution
MIMO     Multi-input and multi-output
MPC      Model predictive control
multi-RTs Multi-robot tractors
OSV      Omnidirectional stereovision
PC       Personal computer
PDS      Personal digital assistant
PID      Proportional-integral-derivative
PMC      Pulse motor control
PPW      Payload per weight
PTO      Power take-off
PTU      Pan-tilt unit
RMSE     Root mean square error
RTK-GPS  Real-time kinematic GPS
SAE      Society of Automotive Engineers
SHB      Sliding hitch bar
SNAV     Smooth navigation
TMS      Transcranial magnetic sensor
USB      Universal serial bus
USV      Unmanned surface vehicle
UTV      Utility vehicle
VeBots   Vehicle robotics
VRS      Virtual reference station
XNAV     X-ray pulsar-based navigation

3.1.6. AV-6 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171


3.1.7. AV-7 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
3.1.8. AV-8 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
3.1.9. AV-9 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
3.1.10. AV-10 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
3.1.11. AV-11 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
3.1.12. AV-12 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
3.1.13. AV-13 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
3.2. Characterization results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 175
3.3. The development of communication system. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
3.4. The development of positioning system . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
3.5. The development of attitude sensors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
3.6. The development of safety sensors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
3.7. Performance qualification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
4. Lessons learned . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 180
5. Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
Declaration of Competing Interest . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181

1. Introduction

Being able to provide food for the world's population will be a significant concern for governments in the near future, and self-sufficiency in food production is the main criterion with which countries will struggle. It is estimated that the world's population will increase to 11.8 billion by 2100 (Statistics Bureau of Japan, 2017). Supporting this population will not be possible unless long-term planning is included in the future policies of the agriculture industry, which has faced other new challenges over the past few decades, including food self-sufficiency and rural migration to cities. The number of agricultural harvestable farms worldwide is limited, and these farms must deal with water crises, labor shortages, natural phenomena, and economic issues that could

markedly reduce farming opportunities. The change of forests to agricultural fields will also have irreversible consequences on the ecosystem, air pollution levels, and oxygen production.

In developed countries such as Japan and the United States, the younger generation's interest in agriculture is steadily declining because of farmers' low income compared to office work, the difficulty of farm work, and changes in cultural attitudes. According to global agriculture statistics, the average age of farmers is 65.9 years in Japan (Statistics Bureau of Japan, 2017), compared to 55.9 years in the U.S. (USDA, 2019) and 52 years in Iran (Asadollahpour et al., 2014). With declining farming populations, the majority of farmers are considered 'too old' to handle the rigorous demands of the industry. According to the latest report from the Statistics Bureau of Japan, the number of laborers continued to decrease over the period from 1960 to 2013, from 20.33 million (30.2% of total workers) to 13.40 million (3.7% of total workers). This problem and others have had a negative effect on the agricultural output in Japan, which was 8.47 trillion yen in 2013, down 0.7% from 2012 (Statistics Bureau of Japan, 2017).

However, based on a Global Agricultural Productivity report, a 100% increase in agricultural production is necessary over the next 40 years (Ellen et al., 2013). It is apparent that the world will have fewer farmers, older farmers, limited farmland, water crises, climate change, and population growth in the future, all of which continue to increase the complexity of agriculture issues. These factors comprise a multi-variable problem that scientists must solve considering a single overarching challenge: how to produce more food with limited resources.

Artificial intelligence (AI) and robotic technology will be key tools for addressing this complicated situation. Agricultural robots (ARs) could become a benchmark technology for saving human society when properly designed systems for unpredictable agricultural environments are achieved. ARs can work all day long; they can be programmed for different tasks; their efficiency can be increased with an optimized algorithm; and they are economically efficient in the long term. In the case of robot tractors, ARs can have multiple applications, and several ARs can collaborate to do one or several tasks in one or more fields.

However, there is still a long way to go before ARs can be used effectively worldwide. More opportunities to develop ARs with high efficiency and at reasonable economic cost are needed. The inception of ARs has provided a new horizon for the future of farming. Labor issues must be considered, but the potential capability of robotic science and farming experience can be merged. Some of the challenges noted above are related to changes in natural resources, whereas other issues have arisen due to human resources.

Several research groups around the world have achieved very promising advances in agricultural robotics. Qiao et al. (2005) used a mobile fruit-grading robot to study a database that was created for sweet peppers. An agricultural robot designed to improve the quality of products, the cost, and safety (especially in greenhouses) was developed by Belforte et al. (2006). Kondo et al. (2009) presented a machine vision system for automatic tomato harvesting. A wearable agri-robot mechanism for farmers was developed by Toyama and Yamamoto (2009b). Rajendra et al. (2009) described the methodology they used to develop an image processing algorithm for a strawberry harvester robot.

To address the challenges and difficulties of daily work in an orchard on slopes, Kurashiki et al. (2010b) studied laser-based vehicle control for use in orchards, and they proposed a self-localization algorithm for 2D laser range finders for mobile robots (Kurashiki et al., 2010a). A paper discussing the advantages of 3D light detection and ranging (LIDAR) compared to traditional sensor fusion was presented by Weiss and Biber (2011). Kurita et al. (2012) introduced technology for automated unloading by a robotic head-feeding combine harvester that uses image processing. A fuzzy controller in a mushroom-growing hall was modeled by Faizollahzadeh Ardabili et al. (2016). Thanpattranon et al. (2016) developed a control algorithm for a single-sensor tractor-trailer navigation system for traveling between plots for various in-field tasks, such as product loading/unloading, using a laser range finder (LRF). The LRF was used to navigate the vehicle, and a sliding hitch bar (SHB) was developed to modify the operation in narrow rows and to support the navigation system with a wider turn for the trailer than a conventional single hitch point.

More agricultural robots worth mentioning are a cucumber-harvesting robot (Van Henten et al., 2009), a robot platform in a sugar beet field (Bakker et al., 2011), an asparagus-harvesting robot (Dong et al., 2011), a mobile grading machine for citrus fruit (Kohno et al., 2011), a robot combine harvester for beans (Saito et al., 2013), a robotic system for paddy field farming (Tamaki et al., 2013), a sweet pepper harvester (Eizicovits et al., 2016), and a multi-arm robotic harvester (Zion et al., 2014). Robots designed to harvest strawberries (Hayashi et al., 2010), apples (De-An et al., 2011), white asparagus (Barawid et al., 2007), cherries (Tanigaki et al., 2008), tomatoes, petty-tomatoes, cucumbers, and grapes (Kondo et al., 1996) have been reported. Moreover, stationary robots are used for sheep shearing (Tanner et al., 2001), wearable robots are available for agricultural work (Toyama and Yamamoto, 2009a), and robot tractors have been designed (Noguchi and Barawid, 2011; Zhang et al., 2015). Pettersson et al. (2010) designed a magnetorheological robot gripper for the handling of food products with varying shapes such as carrots, strawberries, tomatoes, and grapes. Pettersson et al. (2011) designed an end-effector with the same target, to handle varying small-sized crops. A pneumatic gripper equipped with a pressure sensor to grasp eggplants and apples was reported by Blanes et al. (2016). Kim et al. (2008) designed and manufactured a hybrid robotic system for heavy crops such as melons by combining a parallel mechanism and an end-effector. The above-cited papers are just a sampling of the studies in the field of autonomous agricultural vehicles.

Many valuable reviews have been presented in the field of agricultural autonomous vehicles. Li et al. (2009) briefly characterized the autonomous guidance systems for agricultural vehicles (AVs), including navigation sensors (GPS, machine vision, dead-reckoning sensors, laser-based sensors, IMUs, and GDSs), computational methods for feature extraction and fusion, navigation planners to supply control algorithms, and steering controllers to control position and orientation. As mentioned in that paper, at the time the article was published, guidance systems had not yet been fully developed and commercialized. Mousazadeh (2013) also reviewed and classified autonomous agricultural vehicles (AAVs) from the positioning and navigation point of view, focusing more on navigation theorems and algorithms such as artificial neural networks (ANNs), genetic algorithms (GAs), and the Kalman filter. Also, a review presented by Zhou et al. (2020) focused on path planning problems with multi-modality constraints. They classified this issue into 3 main stages: (1) basic ingredients (shape, kinematics, and dynamics), (2) route planning, and (3) trajectory and motion planning, and reviewed different methods applied to unmanned surface vehicles (USVs).

Bochtis et al. (2014) presented a review of the advantages of agricultural machinery management for future intelligent manned and/or unmanned operations. This paper focused more on mechanization and its infrastructure for future smart farming at the strategic, tactical, operational, and evaluation levels.

Vasconez et al. (2019b) reviewed human-robot interaction (HRI), where humans and robots interact to solve complicated

problems including the agriculture industry. They compared the autonomy levels of field crops versus other crops that require more complicated robotic harvesting. They mentioned the problem of labor shortage in some countries and how they struggle with it. Also, Vougioukas (2019) reviewed the challenges imposed on AAVs by agricultural environments, which vary in conditions, canopy structures, and physical and/or chemical characteristics. They discussed the growth of agricultural production and labor shortage and presented a general review of the recent developments of agricultural robots and navigation, the challenges, and possible solutions.

Lezoche et al. (2020) presented an impressive review about Agri-Food 4.0, the supply chains, and the technologies for future agriculture, to understand the future of the Agri-Food sector. As mentioned in this paper, all agricultural machinery will incorporate electronic controls and enhance their current performance soon. Using sensors and drones will support data collection (weather, animal and crop behaviors, and the farm life cycle). There are also more review articles about agricultural robotics used in poultry production (Ren et al., 2020), Agriculture 4.0 (the fourth evolution in farming technology) and its requirements (Zhai et al., 2020), and a general view of smart farming, its challenges and advantages (Charania and Li, 2020).

Although some of the mentioned review articles presented the sensors, algorithms, and control methods used in agricultural autonomous vehicles, the mentioned literature has not reviewed AAVs from the following points: (1) they do not provide detailed information about the configurations and operation methods of each sensor (machine vision, GDS, LIDAR, . . .), algorithm (GA, ANN, FL, . . .), and control method (open-loop, PID control, adaptive controller); (2) most of them did not discuss the different types of platforms, transportation systems, operation functions, communication methods (internal and external), sensors, control units, and environment types in detail; (3) they did not present the development progress and challenges in the use of each component used in AAVs; (4) at the time of publishing some of the articles, some components such as guidance systems had not yet been fully developed and commercialized (Li et al., 2009); (5) this paper also deals with the terrain-vehicle interaction (Section 3.1.9) and the kinds of wheels used in AAVs (Section 2.1.3); and (6) none of the papers talked about the development procedure of AAVs in a research center, the connection between their components, the process of component development and selection, and the creation of an applicable study resource to develop new AAVs.

In this regard, this paper is a structured review of AAVs developed in the Laboratory of Vehicle Robotics, Hokkaido University, between 1990 and 2018, classifying the AAVs based on their components and categorizing each component. The many relevant investigations and reports are reviewed and categorized in the following sections. The overview of the presented paper is illustrated in Fig. 1.

Fig. 1. The flow chart of the presented paper.

2. The characterization of autonomous vehicles

The existing autonomous vehicles (AVs) used for agriculture are classified herein based first on their main characterization. Autonomous vehicles vary based on the type of hardware used (see Section 2.1 below). The hardware can be a specifically developed/used platform equipped with different sensors and actuators. The AVs are controlled with the use of various control algorithms and software (Section 2.3). In addition, AVs have been developed, optimized, and/or tested in different farming environments (Section 2.2). These three aspects (hardware, controls, and the environment) comprise the main infrastructure used herein for evaluating the existing AVs.

2.1. The variation in AV hardware

The hardware of AVs varies based on the platform development procedure (Section 2.1.1), the type of platform used (Section 2.1.2), the transportation type (Section 2.1.3), the AV's functionality in operation (Section 2.1.4), the communication type (Section 2.1.5), and the sensors used (Section 2.1.6).

2.1.1. The variation in platform development

Autonomous vehicles must be installed on a mobile platform. The platform can be developed in the laboratory (these are known as prototype platforms), or a commercialized platform can be used. The commercialized platforms usually do not have all of the required functionality for autonomous navigation. In such a case, researchers may modify the mechanical components, actuators, chassis, wheels, and/or any other mechanical unit. At this stage, such a platform is known as a semi-commercialized platform. With a semi-commercialized platform, a research group can mechanically modify the commercialized platform based on the objective of the research. For example, an extra wheel could be added; a unit for controlling the power take-off (PTO) may be necessary; or an actuator to control the steering could be needed. Platforms with limited controllable mechanical functionality are generally semi-commercialized platforms.

In contrast, the platform to be used can be developed at a factory, with all required functionality, and the users do not need to modify the platform mechanically. This type of platform is known as a commercialized platform. Generally, during the development of a new system, a prototype platform is first developed in the laboratory (the prototype). A factory then uses the prototype to manufacture the system. During the development flow between the laboratory and the factory, the platforms are always semi-commercialized, since the platforms are later modified mechanically. A factory can then include the function that was modified for new models. Finally, when a platform reaches its mechanical 'maturity,' the platform is known as a fully commercialized system or a commercialized platform (Fig. 2). At present, most mechanical modifications are included, and all mechanical components of a platform are controllable using an engine control unit (ECU). At this stage, a control system (or control unit) and all of the required components of the robotic system, such as controllable steering, a programmable ECU connected to a PC, and the required input/output ports to connect sensors and actuators, can be applied to the platform.
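The point above, that on a commercialized platform steering and other functions are commanded from a PC through a programmable ECU, can be sketched on the PC side as a simple command-frame encoder. The 8-byte frame layout, the CAN identifier, and the 0.01-degree scaling below are illustrative assumptions for the sketch, not values from the paper or from any real tractor ECU:

```python
import struct

# Hypothetical frame layout: 8-byte payload carrying the steering angle as a
# signed 16-bit integer in units of 0.01 degree; remaining bytes are padding.
STEERING_CAN_ID = 0x18FF0001  # illustrative 29-bit identifier, not a real standard
ANGLE_SCALE = 100             # counts per degree (0.01 deg resolution)

def encode_steering_frame(angle_deg: float) -> bytes:
    """Pack a steering-angle command into an 8-byte payload."""
    counts = int(round(angle_deg * ANGLE_SCALE))
    counts = max(-32768, min(32767, counts))  # clamp to the int16 range
    return struct.pack('<h6x', counts)        # little-endian int16 + 6 pad bytes

def decode_steering_frame(payload: bytes) -> float:
    """Recover the commanded steering angle (degrees) from a payload."""
    (counts,) = struct.unpack('<h6x', payload)
    return counts / ANGLE_SCALE
```

On real hardware the payload would be handed to a CAN interface under the chosen identifier; keeping the encoding itself hardware-free makes it checkable in isolation.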

Fig. 2. The platform development procedures used in the AVs.

2.1.2. The variation in platforms

The platforms used for agricultural machinery vary; the most common and well-known platforms are tractors and combine harvesters. Combines are used for harvesting tasks, but tractors do the majority of farm work using different implements for tillage, weeding, seeding, fertilizing, and watering. Some airboats and rice transplanters have been used recently in paddy fields, and utility vehicles are used for transportation in other specific cases. Thus, AV platforms can generally be divided into five types: tractors, combine harvesters, utility vehicles, transplanters, and airboats. The applications and varieties of each platform are presented in Table 1.

2.1.3. The variation in transporter systems

The transporter system is the main component of a mobile platform and AV. The four currently used transporter system types are the wheel type, half-crawler type, crawler type (continuous track), and robotic leg (Fig. 3). Each of these transporter systems was designed for a certain application and performance, and each has advantages and disadvantages. The designers of an agricultural AV must consider the manageability of the AV in different farming environments before selecting the appropriate transporter system. The differences between the transportation systems are summarized in Table 2.

Table 2
Comparison of wheel-type, crawler-type, and foot transporter systems.

                                          Wheel          Crawler                Foot
Complexity of designing and development   Low            High                   High
Reparation                                Easy           Hard (easy to break)   Hard
Life time                                 Long           Short                  Short
Cost                                      Low            High                   High
Material type                             Various        Only hard material     Various
                                                         (e.g. metal)
Weight                                    Light          Heavy                  Light
Ground pressure                           High           Low                    High
Soil compaction                           High           Less                   Low
Power efficiency                          Low            High                   Medium
Friction                                  Low            High                   Low
Maneuverability                           Good           Bad                    Limited
Maneuver environment                      Dry and solid  All surfaces           Limited
Traction on slope, slippery/wet ground    Not good       Good                   High
Drive over various obstacles              Hard           Easy                   Medium
Traction system                           Not optimized  Optimized              Not optimized
Steering                                  Good           Poor                   Poor
Spinning                                  Good           Better than wheel      Medium
Turning                                   Big place      Small place            Poor
Speed                                     High           Low                    Low
Precision                                 Good           Bad                    Low

The most commonly used transporter type is wheel-based (Fig. 3a). This type of transporter is economically affordable, easy to repair, easy to design and develop, and lightweight, and it provides good steering and vehicle speed. However, the wheels put high pressure on the soil, which increases the compaction possibility. The friction is low because the wheels have a smaller contact surface compared to crawlers, but this can increase the possibility of wheelspin and reduce the efficiency. The wheels can be made of different materials and can provide precise maneuvering for the vehicle.

The crawlers have larger contact surfaces, which increases the friction, reduces the wheelspin possibility, reduces soil compaction over the long term, and reduces the maneuvering speed (Fig. 3c). Compared to the wheel type, this type of transporter system is more complex to make and repair, and the number of components increases the cost, break possibility, and weight; additional disadvantages are a shorter system lifetime and reduced maneuverability, steering, and speed. The large contact surface enables crawler vehicles to maneuver in a variety of environments such as paddy fields and muddy agricultural fields. The first autonomous vehicles were equipped with wheels or crawlers, but a new

Table 1
Different platforms used in autonomous agricultural vehicles.

Tractor
  Varieties: Row crop tractors, general purpose tractors, tracklayers, two-wheel tractors
  Applications as AV: Pulling/pushing different machinery (agricultural machinery, tanks, vehicles, or trailers), pre-planting processes, plowing, tilling, disking, harrowing, planting, weeding, watering, fertilizing, harvesting

Combine harvester
  Varieties: Wheel type (self-propelled), crawler type (track), tractor mounted
  Applications as AV: Harvesting, winnowing, and threshing

Utility vehicle
  Varieties: Utility vehicles (UTV), all-terrain vehicles (ATV)
  Applications as AV: Transporting, plowing (field, snow), raking, harrowing, mowing grass, building fences, spreading seeds, catching calves, carrying firewood

Transplanter
  Varieties: Rice transplanter, vegetable transplanter, flower transplanter
  Applications as AV: Transplanting seedlings

Boats
  Varieties: Motorboat, airboat
  Applications as AV: Fertilizing, weeding

Fig. 3. Transporter system types: (a) wheel-type, (b) half-crawler, (c) crawler-type (YANMAR CO., 2018), and (d) robotic leg (Boston Dynamics, 2019).

generation of half-crawler vehicles has also been commercialized (Fig. 3b). These vehicles provide functionality that is between those of wheel-type and crawler-type vehicles: medium speed, medium friction, and medium cost.

An infrequently used type of transporter system is robotic legs or robotic feet. This special type of transporter system is used for discontinuous movement applications. A transporter system with robotic legs or feet may have disadvantages such as high complexity, poor repairability, high cost, and limited maneuverability, but a robotic foot-type system is the only transporter type that can move on an unpredictable, uneven, and complicated surface. In some special cases in which farmers are using an airboat to fertilize paddy fields, the transporter system of the airboat can be a propeller. This type of transporter is rarely used for agricultural AVs.

2.1.4. The variation in operation function

A vehicle as a mechanical platform has different functionalities. In the case of agricultural vehicles, the number of functionalities is greater than those of vehicles used for daily applications such as cars and buses. As shown in Fig. 4, AVs have seven main operation functions: steering control by a handle, forward and backward motion control, braking, shifting, rotary speed, hitching, and a PTO. The steering is usually controlled by one or more driving wheels (front wheels) in the case of wheel-type and half-crawler-type vehicles, and by controlling the crawlers in the case of crawler-type vehicles and combine harvesters. The forward/backward motion is mainly controlled by shifts in the wheel-type and half-crawler-type tractors and by the swash plate of a hydrostatic transmission (HST) system in crawler-type tractors and combine harvesters. The brake as a resistant unit for motion inhibition, the shift as a transmission ratio changer, and the rotary speed of the engine are the three functionalities that are available in all vehicles.

The main difference between agricultural vehicles and other vehicles is the hitch and PTO functions. The hitch, or three-point hitch, in tractors is developed for attaching different implements to the tractor. This function is not available in combine harvesters. A PTO is a rotary shaft behind the tractor that is directly connected to the engine for the transmission of power to an installed implement or another vehicle.

All of the above-mentioned functions are controllable manually (M) or using an ECU. For autonomous applications, all of the functions should be controllable electrically using a command from a personal computer (PC) connected to the ECU. With some platforms, some functions are not controllable electrically. In such a case, an electric, hydraulic, or pneumatic actuator could be added to make the functions controllable. This type of function is referred to herein as 'controlled by modifications' (CBM). For example, electric circuit modification or the addition of a hydraulic cylinder to the steering and brake are CBM functions. Several companies have recently developed agricultural vehicles that include a controllable function without modification. These functions are known as 'controlled without modification' (CWM). The CWM functions are easily controllable using a PC connected to the ECU. A function can also be not available (NA); examples are a hitch or PTO in combine harvesters and boats. The operation functions can thus be classified in four categories: M, NA, CBM, and CWM.

2.1.5. The variation in communication

There are two different communication types used in AVs: internal and external communications. Internal communication is always the communication between the controlling PC inside the vehicle and its ECU (see Section 2.1.5.1 below). External communication can be between the AV and another controlling unit far from the AV, such as a host PC, an emergency switch, or another vehicle (Section 2.1.5.2).

2.1.5.1. The variation in internal communication. All types of communication that can connect different devices and elements inside an AV are known as internal communication methods. In the beginning stages of AV research, the agricultural vehicles had no ECU, and because of that, communication between the vehicle and a PC was not possible. In such cases, the actuators can be controlled directly from the PC, which is called direct control. In the direct control method, there is no communication system, and the control is achieved using an input/output (IO) port, an analog-to-digital board (AD), a digital-to-analog board (DA), and a pulse motor control board (PMC), as shown in Fig. 5. The IO controls the actuators and relays using output signals/commands, and it receives input commands from a switch. An AD is installed on a PC bus to receive information from sensors, and the DA and PMC control the actuators and motors, respectively.

When autonomous vehicles were later equipped with an ECU, the controlling signals from a PC were sent directly to the ECU, and the ECU would then control the actuators. At that time, several serial communication standards were becoming popular, including RS232c for low-speed communication, RS422 and RS485 for a multidrop serial bus, ethernet, and the universal serial bus (USB). The RS232c standard can communicate only between two devices (Fig. 6a), whereas the RS422 and RS485 standards can provide communication between 10 and 32 devices at the same time. The RS232c had some limitations, including a limited cable length, a low data-transmission speed, and limited multi-connection functionality. These limitations obliged researchers to use USBs instead of RS232c in 1996. The use of a USB improves the speed and some other limitations, but its serial communication has a master-slave protocol in which the slave cannot interact without a host, as shown in Fig. 6b. Decisions must be made from a controlling PC, which can increase the process time and decrease the response speed.
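The master-slave pattern can be made concrete with a small sketch: a PC-side master polls an ECU-like slave over a point-to-point link, and the slave only ever answers when asked. The frame layout below (start byte, command, value, XOR checksum) and the command codes are hypothetical illustrations, not from the paper or any real ECU protocol.

```python
# Illustrative sketch (hypothetical protocol): a master-slave exchange in
# which the slave cannot initiate traffic, so the PC must poll it.

START = 0x02

def build_frame(command: int, value: int) -> bytes:
    """Pack a command into a frame ending with an XOR checksum byte."""
    payload = bytes([START, command, value & 0xFF])
    checksum = 0
    for b in payload:
        checksum ^= b
    return payload + bytes([checksum])

def parse_frame(frame: bytes) -> tuple[int, int]:
    """Validate the checksum and return (command, value)."""
    *payload, checksum = frame
    xor = 0
    for b in payload:
        xor ^= b
    if xor != checksum or payload[0] != START:
        raise ValueError("corrupted frame")
    return payload[1], payload[2]

class SlaveDevice:
    """A slave that only answers when polled, as on a master-slave serial link."""
    def __init__(self):
        self.steering_angle = 0
    def poll(self, frame: bytes) -> bytes:
        command, value = parse_frame(frame)
        if command == 0x10:                 # hypothetical 'set steering' command
            self.steering_angle = value
        return build_frame(0x90, self.steering_angle)  # echo current state

slave = SlaveDevice()
reply = slave.poll(build_frame(0x10, 25))   # the master initiates every exchange
print(parse_frame(reply))                   # -> (144, 25)
```

Because every exchange starts at the master, each extra device adds a full poll round-trip, which is the response-time penalty the text describes.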

Fig. 4. Operation functions of AVs. (1) Steering, (2) forward and backward, (3) brake, (4) shift, (5) rotary speed, (6) hitch, (7) PTO.

Fig. 5. Topology of direct control.
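The direct-control topology of Fig. 5 can be sketched in a few lines: the program on the PC closes the loop itself, reading a sensor through an A/D channel and driving an actuator through a D/A channel, with no ECU or bus in between. The board classes below are hypothetical stand-ins for real AD/DA interface cards, and the gain and voltages are arbitrary.

```python
# Illustrative sketch of direct control: the PC program itself links the
# A/D input to the D/A output. The board classes are hypothetical.

class ADBoard:
    """Pretend analog-to-digital board: returns successive sensor voltages."""
    def __init__(self, readings):
        self._readings = iter(readings)
    def read(self) -> float:
        return next(self._readings)

class DABoard:
    """Pretend digital-to-analog board: records voltages sent to the actuator."""
    def __init__(self):
        self.outputs = []
    def write(self, volts: float):
        self.outputs.append(max(-10.0, min(10.0, volts)))  # clamp to +/-10 V

# Proportional direct control: drive the actuator toward a target sensor value.
ad, da = ADBoard([1.0, 2.5, 3.8, 4.0]), DABoard()
target, gain = 4.0, 2.0
for _ in range(4):
    error = target - ad.read()
    da.write(gain * error)
print(da.outputs)   # one clamped output voltage per control cycle
```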

Fig. 6. Topologies of (a) the RS232c standard and (b) a USB.

In 1990, the international communication protocol ISOBUS was published. The communication systems of agricultural vehicles adopted the ISO 11783 standard, which is based on the Society of Automotive Engineers (SAE) standard J1939. With this standard, the controller area network (CAN-bus) was introduced (AEF, 2019). Using CAN-bus, different devices and microcontrollers can communicate with each other without any host PC, as shown in Fig. 7. In this system, several ECUs, PCs, internet of things (IoT) devices, and microcontrollers as nodes can connect with each other using a bus connection.

Different control units have been used to control an AV's steering, forward and backward motion, brake, shift, rotary speed, hitch, PTO, user interface, and implements' ECUs. In vehicle robotics laboratories, the internal communication of AVs was first developed based on 'direct control', then improved to RS232, and eventually the CAN-bus was used. The details of this progress are discussed below in Section 3.1.

2.1.5.2. The variation in external communication. AVs have also been designed to communicate with controlling rooms, monitoring rooms, cloud servers, user interfaces, and separate vehicles such as a tractor, combine harvester, or drone. This communication requires a wireless communication system, which is a type of external communication (Fig. 8). It can be two-way communication, one-way communication, or master-slave communication. Based on the application of the system and the target of the research, the external communication can be a Wi-Fi signal, a Bluetooth signal, or long-term evolution (LTE) with the use of a Wi-Fi router, a Bluetooth device, a personal digital assistant (PDA), or a cell phone.

Fig. 7. Topology of CAN-bus.

Fig. 8. The external communication of an AV.

2.1.6. The variation in sensors

One of the main components of a robot is its sensors, which detect information from the environment and send the related signals to the controlling unit. For agricultural vehicles, the sensors can be divided into three main groups: positioning sensors (Section 2.1.6.1 below) that provide the location of the robot, attitude sensors (Section 2.1.6.2) that provide the orientation of the robot, and safety sensors (Section 2.1.6.3) for emergency situations.

2.1.6.1. The variation in positioning sensors. The well-known positioning sensors for AVs are image sensors, lateral navigation (LNAV), smooth navigation (SNAV), X-ray pulsar-based navigation (XNAV), the Global Navigation Satellite System (GNSS), the global positioning system (GPS), differential GPS (DGPS), and real-time kinematic GPS (RTK-GPS). The accuracy, interval, and methodology of these sensors differ. The topology of image sensors (positioning sensors) as a positioning system is illustrated in Fig. 9. This system is composed of a single main system, which is mounted on a robotic platform, and two static image sensors as subsystems (Ishii et al., 1998a) that can control the entire system based on the principle of triangulation. The system calculates the robot position using two angles from the base line to a visual marker mounted on the robot and a predetermined distance (L) between the two subsystems.

Because the positioning system recognizes the visual marker based on chromaticity, two fluorescent lamps with red and white radii have been used as the visual marker mounted on the robot. Each subsystem is composed of a rotary encoder, a CCD camera, a stepper motor, and a computer. To measure the angle to the visual marker on the robot (θ), the rotation angle of the CCD camera (θc) and the deflection angle of the visual marker in an image (θd) are detected by the rotary encoder and the CCD camera, respectively. The controlling system performs an image analysis to calculate the robot's position using the two angles from the subsystems, and it transfers the location data via a radio receiver (e.g., a wireless modem) (Ishii et al., 1998b).

Fig. 9. Topology of positioning systems (image sensors) based on the principle of triangulation.

Fig. 10. Positioning systems. (a) LNAV: (1) cable C for end detection and perimeter operation (9.8 kHz); (2) cable A for returning and perimeter operations (4 kHz); (3) field border; (4) cable B for returning and perimeter operations (1.5 kHz); (5) left sensor; (6) right sensor; (7) unmanned returning operation; (8) a teaching run. (b) SNAV. (c) XNAV: (9) target; (10) mobile station; (11) reference station Topcon AP-L1 (Kondo et al., 2011).

LNAV was an off-the-wire electromagnetic induction system with power cables installed around the field (Fig. 10a) (Kondo et al., 2011). This navigation system was developed by Kubota Co., and the cables installed around the field provided a magnetic field. The generated magnetic fields are strongly dependent on the distance from each cable. In this system, two magnetic field
A. Roshanianfard et al. / Journal of Terramechanics 91 (2020) 155–183 163

sensors were used at each side of the vehicle. To use this positioning system, it was necessary to manually 'teach' the robot along the field boundary. This system could be effective in different weather conditions, but the cost of construction can increase dramatically in large fields.

The SNAV developed by Japan Aviation Electronics Industries (JAEI) used DGPS, a terrestrial magnetic sensor (TMS), and an inertial measuring unit (IMU) (Fig. 10b) (Kondo et al., 2011). The IMU and TMS were used to improve the interval of the positioning system. The accuracy of SNAV was sufficient, but the costs, the information service, and the reference station were major limitations for the use of SNAV.

XNAV was developed by BRAIN-IAM and manufactured by Sanyo Electric Co. Generally, XNAV looks like an image sensor, but this positioning sensor used an optical measurement system (Fig. 10c) (Kondo et al., 2011). The target installed on the vehicle (Fig. 10 (9)) was observed from the reference station, and the diagonal distance (L) and the horizontal angle could then be obtained. The coordinates of the target were thus obtained.

The GNSS, GPS, DGPS, and RTK-GPS are four important positioning systems nowadays. The general principles of GNSS, DGPS, and RTK-GPS are based on a global navigation satellite system (GNSS), but each has small differences in methodology, accuracy, topology, and application. Several GNSSs are now in use, including NAVSTAR (U.S.), QZSS (Japan), GLONASS (Russia), IRNSS (India), BeiDou-3 (China), and GALILEO (Europe). The GPS uses three satellites for the determination of latitude and longitude and one satellite for altitude, providing accuracy within 3 m. The mobile unit (rover) carries a GNSS antenna (or cellphone) to receive GNSS signals. DGPS enhances the position using a base station which has known positional coordinates. DGPS can provide an accuracy of 10 cm. For some agriculture applications, a minimum accuracy of 5 cm is required, and thus RTK-GPS could be the best option. RTK-GPS corrects the position of the mobile unit using the positional coordinates of a base station, which are transferred by a transmission antenna. In this case, the mobile unit should carry a GNSS antenna and a radio antenna. Several companies currently provide both antennas in one package. Some positioning sensors that are commonly used in vehicle robotics laboratories, along with their accuracy and intervals, are shown in Table 3.

2.1.6.2. The variation in attitude sensors. The attitude sensor is a complementary unit for autonomous navigation that indicates the orientation of the vehicle. It can also improve the positioning accuracy using sensor fusion methods when the GPS signal is unavailable because of tunnels, trees, or buildings. Geomagnetic direction sensors (GDSs) were previously used for agricultural robots to measure the heading angle (yaw). These sensors had two main problems: their accuracy is affected by the surrounding magnetic field, and errors resulting from vehicle inclinations can occur. Fiber optic gyroscope (FOG) sensors then became popular, but FOG sensors are quite expensive, and they are not cost-effective for use in agricultural robots.

Inertial measurement units (IMUs) are known as the best choice for indicating the orientation of vehicles. These sensors can measure the quaternions, headings (pitch, roll, and yaw velocity, and indirectly pitch and roll angle, and eventually yaw angle), linear accelerations, and gravity. An IMU consists of gyroscopes and accelerometers, and magnetometers are also available in some IMUs. The combination of an IMU and RTK-GPS with a sensor fusion method can guide vehicles with high accuracy. Some companies have recently offered an IMU as part of their RTK-GPS package. The attitude sensors commonly used in vehicle robotics laboratories are listed in Table 4.

2.1.6.3. The variation in safety sensors. An autonomous vehicle should have a specific safety zone to prevent accidents with humans, other vehicles, and any other possible obstacles. This safety zone can be controlled by different safety sensors such as a laser scanner (2D or 3D), a camera (2D, 3D, or an omni-directional stereo vision [OSV] camera), and different switches (e.g., a tap switch, proximity switch, bumper switch, and emergency switch). These safety sensors can cover a specific safety zone (such as a 10-m-dia. circle), or they can work by physical force (mostly

Table 3
Different types of positioning sensors and examples.

Type | Abbr. | Brand | Model | Accuracy (mm) | Frequency (Hz) | Price range
Image sensors | PS-1 | Prototype | Prototype | 300 | 0.84 | Prototype
LNAV | PS-2 | Kubota | LNAV | 50 | 10 | Prototype
SNAV | PS-3 | JAEI | SNAV | 100 | 1 | Prototype
XNAV | PS-4 | Topcon | AP-L1 | 10–50 | 2 | $5k–$6k
RTK-GPS | PS-5 | Trimble | MS750 | 20 | 20 | $4k–$5k
RTK-GPS | PS-6 | Topcon | Legacy-E | 10 | 10 | $5k–$13k
RTK-GPS | PS-7 | Topcon | AGI-3 | 20 | 10 | $5k–$30k
RTK-GPS | PS-8 | Trimble | SPS855 | 8 | 20 | $5k–$15k
DGPS | PS-9 | Hemisphere | V100 | 600 | 0.05 | $3k–$8k
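Receivers like those in Table 3 commonly stream NMEA-0183 sentences over a serial port; the GGA sentence carries the position and a fix-quality flag that distinguishes plain GPS, DGPS, RTK fixed, and RTK float solutions. The sketch below builds and parses one such sentence; the coordinates are made up, and the error handling is minimal.

```python
# Illustrative sketch: decoding an NMEA-0183 GGA sentence of the kind a
# GNSS/RTK-GPS receiver streams. The sample coordinates are invented.

def nmea_checksum(body: str) -> str:
    """XOR of all characters between '$' and '*', as two hex digits."""
    xor = 0
    for ch in body:
        xor ^= ord(ch)
    return f"{xor:02X}"

def parse_gga(sentence: str) -> dict:
    """Return latitude/longitude in decimal degrees plus the fix quality."""
    body, _, checksum = sentence.lstrip("$").partition("*")
    if nmea_checksum(body) != checksum.strip():
        raise ValueError("checksum mismatch")
    f = body.split(",")
    lat = float(f[2][:2]) + float(f[2][2:]) / 60.0      # ddmm.mmmm
    lon = float(f[4][:3]) + float(f[4][3:]) / 60.0      # dddmm.mmmm
    if f[3] == "S": lat = -lat
    if f[5] == "W": lon = -lon
    quality = {0: "invalid", 1: "GPS", 2: "DGPS", 4: "RTK fixed", 5: "RTK float"}
    return {"lat": lat, "lon": lon, "fix": quality.get(int(f[6]), "other")}

# Build a sample sentence (Sapporo area) with a valid checksum, then parse it.
body = "GPGGA,123519,4304.1234,N,14120.5678,E,4,09,0.9,25.0,M,34.0,M,,"
fix = parse_gga(f"${body}*{nmea_checksum(body)}")
print(fix)   # fix quality 4 means an RTK fixed solution
```

In practice the navigation software would read such sentences continuously and fall back to dead reckoning whenever the fix quality drops below the RTK level.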

Table 4
Different types of attitude sensors and examples.

Type | Abbr. | Brand | Model | Price range
GDS | AS-1 | Watson Industries | FGM-300A | –
Gyroscope | AS-2 | Gyrostar | ENV-05F-03 | $100–$1k
Inclinometer | AS-3 | Omron | D5R-L02-15 | $100–$3k
FOG | AS-4 | JAE Ltd. | JG35FD (1-axis FOG) | $100–$2k
IMU | AS-5 | JAE Ltd. | JCS-7401A (3-axis FOG) | $1k
IMU | AS-6 | VectorNav | VN-100 (MEMS*) | $800
IMU | AS-7 | JAE Ltd. | JCS-7402A (3-axis FOG) | $1k–$2k
IMU | AS-8 | Topcon | AGI-3 on-RTK-GPS (MEMS) | On board

* MEMS: microelectromechanical systems.
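A common lightweight instance of the sensor fusion mentioned above is a complementary filter: the gyroscope is smooth but drifts, the accelerometer-derived pitch is noisy but drift-free, and blending the two gives a stable attitude estimate. The gain, rates, and synthetic signals below are assumptions for illustration, not values from the paper.

```python
# Illustrative sketch: a complementary filter fusing a biased rate gyro with
# a noisy accelerometer pitch, one simple form of IMU sensor fusion.
import random

def complementary_filter(gyro_rates, accel_pitches, dt=0.01, alpha=0.98):
    """Blend the integrated gyro rate with the accelerometer pitch each step."""
    pitch = 0.0
    for rate, accel_pitch in zip(gyro_rates, accel_pitches):
        pitch = alpha * (pitch + rate * dt) + (1 - alpha) * accel_pitch
    return pitch

random.seed(0)
true_pitch = 5.0                                   # deg, vehicle on a steady slope
n = 2000                                           # 20 s of data at 100 Hz
gyro = [0.3 + random.gauss(0, 0.05) for _ in range(n)]         # biased gyro (deg/s)
accel = [true_pitch + random.gauss(0, 2.0) for _ in range(n)]  # noisy accel (deg)
estimate = complementary_filter(gyro, accel)
print(round(estimate, 2))   # stays near 5 deg despite gyro bias and accel noise
```

The same idea, generalized to three axes and augmented with RTK-GPS positions, is what commercial IMU/RTK-GPS packages implement internally, usually with a Kalman filter instead of fixed gains.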

Table 5
Different types of safety sensors and examples.

Type | Abbr. | Brand | Model | Price range
2D laser scanner | SS-1 | SICK Ltd. | LMS 291 | $2.5k
OSV | SS-2 | Point Grey Research | Ladybug3 | $10k–$12k
Camera | SS-3 | DWINC Co. Ltd. | MTV-54KON | $500–$1k
Tap switch | SS-4 | Tapeswitch | T5-16 | $10–$100
Proximity switch | SS-5 | KEYENCE Ltd. | ED-130 | $10–$100
3D camera | SS-6 | PMD Tec. | CamCube 2.0 | $12k
2D laser scanner | SS-7 | Hokuyo Automatic Co. Ltd. | UTM-30LX | $4.5k–$5k
Wireless video transmission | SS-8 | Cosmowave Co. | 4CH Quad | $500–$1k
Bumper switch | SS-9 | – | – | –
Emergency switch | SS-10 | – | – | –
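For scanners like the SICK LMS 291 or Hokuyo UTM-30LX in Table 5, a safety-zone check reduces to testing each range return against a radius (a 10-m-dia. circle gives a 5 m radius). The scan below is synthetic, and the geometry convention (0 rad straight ahead, beams evenly spaced) is an assumption for the example.

```python
# Illustrative sketch: deciding whether any 2D laser-scan return falls
# inside a circular safety zone around the vehicle.
import math

def breach_safety_zone(ranges, angle_min, angle_step, radius=5.0):
    """Return (x, y) of the first scan return inside the safety radius, else None."""
    for i, r in enumerate(ranges):
        if math.isfinite(r) and r < radius:
            theta = angle_min + i * angle_step
            return (r * math.cos(theta), r * math.sin(theta))
    return None

# A 181-beam scan sweeping 180 degrees: open field except one return dead ahead.
ranges = [30.0] * 181
ranges[90] = 3.2                   # an obstacle 3.2 m in front of the scanner
hit = breach_safety_zone(ranges, -math.pi / 2, math.pi / 180)
print(hit)                         # obstacle coordinates in the scanner frame
```

A real system would debounce over several scans and command the brake or emergency stop through the control unit whenever a hit is reported.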

switches). The safety sensors that are frequently used in vehicle robotics laboratories are listed in Table 5.

2.1.6.4. Other classifications. There are many other classifications of sensors in the literature for mobile and autonomous vehicles, which classify them based on (1) the type of sensing parameter, (2) the direction of energy, and (3) the application of the sensor (Siegwart et al., 2011; Vasconez et al., 2019b).

(1) The first and most common classification of sensors is based on the type of the sensor's parameters, which can be internal (proprioceptive sensors) or external (exteroceptive sensors) (Narvaez et al., 2017; Soter et al., 2018). Proprioceptive sensors measure the internal parameters of AVs, such as encoders to measure motor speed and joint angles, or load cells to measure forces. Exteroceptive sensors acquire information from the surrounding environment of AVs, such as LIDAR, laser scanners, or sonar for obstacle detection and distance measurements (Vasconez et al., 2019a). Exteroceptive sensors are usually applied for environment observation, mapping, autonomous navigation, action estimation, or characterization of the environment's variables. They may be used as a safety sensor, positioning sensor, attitude sensor, or a combination of them.

(2) The second classification is based on the direction of energy, which can be from the sensor (active sensors) or to the sensor (passive sensors) (Narvaez et al., 2017). Active sensors check the environmental reaction by emitting energy into the environment, such as encoders, ultrasonic sensors, and laser rangefinders. Passive sensors measure ambient environmental energy entering the sensor, such as thermal sensors and probes for temperature measurement, and cameras to acquire image parameters (Vasconez et al., 2018).

(3) The last and most practical classification is based on the application of sensors, which can be divided into ground-based beacons, heading sensors, tactile sensors, active ranging sensors, motion/speed sensors, and vision-based sensors (Ampatzidis et al., 2009; Narvaez et al., 2017). Ground-based beacons are for the localization and positioning of AVs, as mentioned in Section 2.1.6.1 (e.g., GNSS, RTK-GPS, and RF/active ultrasonic/reflective beacons) (Prado et al., 2018). Heading or orientation sensors measure the orientation and attitude of AVs, as mentioned in Section 2.1.6.2 as attitude sensors (e.g., IMUs, FOGs, compasses, and inclinometers). Tactile sensors are for the detection of physical contact, introduced as safety sensors in Section 2.1.6.3 (e.g., contact switches, bumper switches, and proximity sensors). Active ranging sensors check the reflectivity, time-of-flight, and geometric triangulation, mentioned as safety sensors in Section 2.1.6.3 (e.g., ultrasonic sensors, laser scanners, and sonar sensors). Motion/speed sensors measure the velocity and acceleration of AVs relative to fixed or moving objects (e.g., Doppler radar). The last group is the vision-based sensors, mentioned as safety sensors in Section 2.1.6.3, which are used for visual ranging, image analysis, segmentation, and obstacle recognition (e.g., CCD/CMOS cameras, OSV cameras, and visual ranging packages) (Moshou et al., 2011; Rahnemoonfar and Sheppard, 2017; Vasconez et al., 2018).

2.1.7. The variation in controlling unit

The autonomous navigation of an AV is always controlled using a control unit. The main control unit is a computer (PC) with different connections. The controlling program is always written on the PC in various languages such as C and C#. When there was no ECU available in vehicles, the autonomous navigation of robots was designed based on a direct control system. In this type of controlling system, a PC was usually connected to the actuators and sensors using different AD, DA, or PMC boards or IO ports (Fig. 11). In this case, the control of an actuator was achieved based on the controlling program written on the PC, with the data received from the sensors. Later control units were designed to use a microcontroller such as an Arduino or Raspberry Pi. With this type of control unit, the pre-written program on the PC can control the actuator based on signals from sensors. This type of control system is suitable for some vehicles that have no ECU. A control unit using a microcontroller can play the role of an ECU for the vehicle.

For vehicles that are equipped with an ECU, a PC connected to the ECU can control the system. Unlike a direct control system, which has different control boards, in this system only a PC is connected to the ECU. The PC only sends the command to the ECU; the ECU controls the vehicle's motion. Factories that manufacture ECUs provide an application programming interface (API) library for the development of a control algorithm based on pre-determined standard codes. In this system, some vehicle-related sensors and actuators are connected to the ECU, and some external sensors and external communication systems are connected to the PC. The communication system between the PC and ECU progressed from RS-232c to CAN-bus. When CAN-bus is used as the communication system, different ECUs for different units can be added. The external ECU can be connected with implements and external functions.

Fig. 11. Controlling unit.

2.2. The variation in environment

Autonomous vehicles have been designed for a variety of farming environments. The AVs can be designed to maneuver in open

fields such as corn and soybean fields; they can be designed for
orchards such as those of oranges and apples; and they can be
designed for specific fields such as paddy fields. Based on the
new innovations in agriculture, the target environments can be
changed. Each environment can have specific features and limita-
tions. In open fields, the sun’s radiation, the wind, and muddy soil
can present different limitations. In orchards, the surrounding
trees can reduce the efficiency of a navigation system. In paddy
fields, muddy soil and maneuvering limitations are major con-
cerns. The parameters for designing an AV can and do vary based
on the farming environment.

2.3. The variation in controlling algorithm

The control algorithm is the decision unit of an AV. The main objective of this algorithm is the development of a kinematic and dynamic model to control the lateral and longitudinal errors of the vehicle and thus guide the vehicle along a desired path. The kinematic model of an autonomous vehicle must be defined as shown in Fig. 12. The parameters include the coordinates of the vehicle (x_c, y_c, θ_c), which indicate the position of the vehicle and can be measured by applying positioning sensors such as those used with RTK-GPS; the reference coordinates (x_0, y_0, θ_0); the objective position or desired location (x_d, y_d, θ_d); the orientation of the vehicle (ω_v), which is measured using attitude sensors such as an IMU; the steering angle (δ); the longitudinal velocity (V_x); the heading error (e_head); and the lateral error (e_lat). The e_head is the difference between the heading angle of the vehicle and the desired angle of the path. The e_lat is the distance between the control point of the vehicle and the desired path. The general kinematic model for autonomous vehicles is shown in Eq. (1), and the dynamic model is shown in Eq. (2).

Fig. 12. General kinematic model of AVs.

\[
\begin{bmatrix} e_{long} \\ e_{lat} \\ e_{head} \end{bmatrix} =
\begin{bmatrix} \cos\theta_c & \sin\theta_c & 0 \\ -\sin\theta_c & \cos\theta_c & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x_d - x_c \\ y_d - y_c \\ \theta_d - \theta_c \end{bmatrix}
\tag{1}
\]

\[
\begin{bmatrix} \dot{e}_{long} \\ \dot{e}_{lat} \\ \dot{e}_{head} \end{bmatrix} =
\begin{bmatrix} \cos\theta_c & 0 \\ \sin\theta_c & 0 \\ 0 & 1 \end{bmatrix}
\begin{bmatrix} v \\ \dot{\varphi} \end{bmatrix}
\tag{2}
\]

Then, the global coordinate frame is as follows:

\[
\begin{bmatrix} x_c \\ y_c \\ \theta_c \end{bmatrix} =
\begin{bmatrix} \int_0^t v \cos\left(\int_0^t \dot{\varphi}\,dt\right) dt \\ \int_0^t v \sin\left(\int_0^t \dot{\varphi}\,dt\right) dt \\ \int_0^t \dot{\varphi}\,dt \end{bmatrix} +
\begin{bmatrix} x_0 \\ y_0 \\ \theta_0 \end{bmatrix}
\tag{3}
\]

where e_long is the longitudinal error and \(\dot{\varphi}\) is the yaw rate. However, in real field conditions, the values of v and φ can vary due to the slip angle and different parameters related to the features of the soil.

As autonomous navigation in agricultural fields has non-linear behavior, different logics or control systems could be used to control the lateral and longitudinal errors, such as a neural network, fuzzy logic, a proportional-integral-derivative (PID) controller, or model predictive control (MPC). A neural network can be used to define the output of attitude sensors in non-linear conditions (Fig. 13a). When a neural network was used in 1992, GNSS and GPS were not sufficiently developed and/or were too expensive for autonomous navigation, and a GDS was commonly used as an attitude sensor. Neural networks were used to improve the performance of navigation, because the image sensors (the positioning sensors at that time) had a large interval and low accuracy.

A neural network was composed of three layers: an input layer, a hidden layer, and an output layer. The hidden layer could be a single layer or several layers based on the requirements. As the GDS was affected by the surrounding magnetic fields, the neural networks were modified to overcome the errors of inclination and magnetism. However, it was necessary to train the neural network, and the AV had to be trained before the main maneuvers were performed. One of the major disadvantages of neural networks was the black boxes, i.e., some unidentified boxes in programming due to the hidden layers. The presence of such boxes resulted in some uncontrollable behaviors in navigation that could make the AVs dangerous.

Fuzzy logic is a many-valued logic. To control a vehicle, there are three different functions that handle the lateral error based on the steering angle (Fig. 13b), which are defined as trapezoid-shaped curves. When the lateral error is almost zero, f_0 controls the steering angle. When the lateral error starts to increase, the density of f_0 decreases and the system switches to f_1; the same applies when the lateral error is decreasing. The functions can be developed based on the required parameters in the algorithm.

A PID controller is a type of controller that can correct the required navigation parameters using the fed-back signal, which produces the error value. This controller uses proportional, integral, and derivative terms to control the error (Fig. 13c). PID controllers are now used as an appropriate control for autonomous navigation because of their high capability and controllable functions.

Because of the disadvantages of neural networks and fuzzy logic, the use of a PID controller is becoming more popular nowadays. In this regard, mainly using two PID controllers, the dynamic model can be developed for the control of lateral (PID-1) and longitudinal (PID-2) motion by adjusting the steering angle and gas pedal command, respectively. The PID-1 controller is for lateral control, which consists of two control systems: a P control based on e_head and a PID control based on e_lat, as shown in Eq. (4) and Fig. 14a. The objective of this control system is to control the vehicle's steering angle along a desired path; the system tries to keep e_lat and e_head near zero.

\[
\delta_n = K_{p,lat}\,e_{lat} + K_i \sum_{i=1}^{n} e_{lat}\,\Delta t + K_d\,\dot{e}_{lat} + K_{p,head}\,e_{head}
\tag{4}
\]

The PID-2 controller is for longitudinal control (Fig. 14b). This control system adjusts the current longitudinal velocity of the vehicle (v_c) based on the desired velocity (v_d). The velocity error (e_v) comes to the PID-2 controller as an input value, and the PID-2 controls the gas command (U_v) of the ECU. This can be a pedal control output for the vehicle, controlled by external actuators. The control function is as follows:

Fig. 13. Controlling logics: (a) neural network, (b) fuzzy logic, (c) PID controller, and (d) theoretical principle of MPC.

\[
U_v = K_p\,v_e + K_i \sum_{i=1}^{n} v_e\,\Delta t + K_d\,\dot{v}_e
\tag{5}
\]

where

\[
v_e = v_d - v_c
\tag{6}
\]

Fig. 14. PID control system for (a) lateral control (PID-1) and (b) longitudinal control (PID-2).

For agriculture applications (Fig. 12), a simplified steering control is sometimes used as follows:

\[
\delta_n = k_1\,e_{lat} + k_2\,e_{head}
\tag{7}
\]

in which k_1 and k_2 are gains. The e_lat error is:

\[
e_{lat} = \frac{a\,E_{GPS} + b\,N_{GPS} + C}{\sqrt{a^2 + b^2}}
\tag{8}
\]

in which a, b, and C are constant coefficients in the linear equation, and E_GPS and N_GPS are the RTK-GPS easting and northing, respectively. The e_head is:

\[
e_{head} = \arctan\!\left(\frac{e_{lat}}{L}\right) + \theta_c
\tag{9}
\]

The MPC is a feedback control algorithm that uses a model to predict the future of the process; this prediction can be used for the optimized control of an AV, including its steering angle and velocity. There are many reasons to use the MPC controller:

1. The MPC can handle multi-input and multi-output (MIMO) systems, the same as an ANN. This functionality is useful when it is predicted that there are some interactions between inputs and outputs (Fig. 15). The MPC consists of three parts: a prediction model, rolling optimization, and feedback adjustment. In an AV, the steering wheel angle can affect the velocity output because of wheel slip or other possible reasons. Using PID controllers can make the controlling algorithm more complicated because the two PID control loops would operate independently of each other, as if there were no interactions between the loops; designing larger systems would be even more challenging. The advantage of MPC is its multivariable controller, which controls the outputs simultaneously by considering all the interactions between system variables.
2. The MPC can handle constraints, which is important because violating them leads AVs to undesired consequences. An AV must obey speed limits and maintain a safe distance from other vehicles, other AVs (in a multi-robot system), and obstacles. There are also constraints due to the physical limitations of AVs, such as limits on acceleration. If the AV works based on an MPC algorithm, the controller will track a desired trajectory while satisfying all these constraints.
3. The MPC has a preview capability, which is similar to feedforward control. When an AV travels on a curvy path or turns at the headland, if the controller does not know that a corner is coming ahead, the AV is only able to apply the brake while it takes the corner. However, if the AV is equipped with a safety sensor like a camera that provides trajectory features, the controller will know in advance about the upcoming corner, so it can brake sufficiently to safely stay in the lane. The MPC can easily incorporate future reference information into the control problem to improve controller performance.

The MPC has all the advantages mentioned above, but to apply this algorithm, a powerful and fast processor with a large memory is required. This is because the MPC solves an online optimization problem at each time step.

A valuable review paper about the MPC, its development procedure, and its application in agriculture was published by Ding et al. (2018). The paper explained different types of MPC, such as hybrid MPC, adaptive MPC, robust MPC, nonlinear MPC, tube-based MPC, stochastic MPC, distributed MPC, and explicit MPC, which were
A. Roshanianfard et al. / Journal of Terramechanics 91 (2020) 155–183 167

Fig. 15. MPC algorithm of an AV.

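The control laws of Eqs. (5)-(9) can be condensed into a small sketch. The following Python illustration is ours, not the implementation used on the vehicles reviewed here; the default gain values and the look-ahead distance are illustrative assumptions.

```python
import math

def pid_speed_command(errors, kp, ki, kd, dt):
    """Longitudinal PID of Eq. (5): Uv from the speed error v_e = v_d - v_c.
    `errors` is the history of v_e samples, most recent last."""
    e = errors[-1]
    integral = sum(errors) * dt
    derivative = (errors[-1] - errors[-2]) / dt if len(errors) > 1 else 0.0
    return kp * e + ki * integral + kd * derivative

def lateral_error(e_gps, n_gps, a, b, c):
    """Eq. (8): signed offset of the RTK-GPS fix (E, N) from the path line
    a*E + b*N + C = 0."""
    return (a * e_gps + b * n_gps + c) / math.hypot(a, b)

def heading_error(e_lat, look_ahead, theta_c):
    """Eq. (9): heading error from the lateral offset over the distance L,
    plus the current heading term theta_c."""
    return math.atan(e_lat / look_ahead) + theta_c

def steering_command(e_lat, e_head, k1=0.4, k2=1.0):
    """Eq. (7): simplified proportional steering law (gains are illustrative)."""
    return k1 * e_lat + k2 * e_head
```

A production controller would also add integral anti-windup and actuator saturation, which are omitted from this sketch.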
2.4. Performance indicators

The performance of an AV can generally be defined using two main parameters: ehead and elat. When an AV is being developed, these two parameters must be defined (Fig. 16a). The determination of whether the designed vehicle is accurate enough or not is based on these two parameters. In Japan, the lateral error must be within 8 cm, because the average distance between crop rows is 66 cm, whereas the width of wheels or crawlers is ~50 cm. In this case, a maximum error of 8 cm is acceptable for AVs (Fig. 16b). Otherwise, the vehicle can damage the crop rows, and the efficiency of the AV can be dramatically decreased.
For the determination of the values of ehead and elat, the root mean square error (RMSE) must be defined. The average values cannot be representative, because an autonomous vehicle can have an average error of zero but still show high turbulence in lateral and heading control. The maximum and minimum values can be used as sub-main parameters. After the evaluation of each AV based on the different indicators described here, the performance indicators of all AVs will be compared in Section 3.7.

Fig. 16. Performance indicators (a) ehead and elat, and (b) the maximum limitation of lateral error.

3. Literature review results

3.1. Autonomous vehicle in Hokkaido University

The laboratory of vehicle robotics (VeBots) at Hokkaido University is one of the pioneer laboratories studying autonomous vehicles (Roshanianfard). In this section, the VeBots studies conducted between 1990 and 2018 are summarized, and the development components for each AV are discussed. The AVs developed at the VeBots lab have gone through several evolutions, from a path-planning system in 1997 to a multi-robot tractor (multi-RT) system and intelligent systems (Roshanianfard and Noguchi, 2020) in 2018. The 12 VeBots lab projects are introduced in the following section with 'AV-n' titles; the n is the project number.
Table 6 provides a general description of each project (which are described in Section 3 and illustrated in Fig. 21) and a summary of the variety in platform types, transportation systems, and mobile platforms. The development procedure for autonomous agriculture vehicles can be divided into three eras: the development of technology (AV-1 to AV-3), commercialization (AV-4 to AV-11), and intelligent systems (AV-12 and AV-13). During the development era, the main target was designing, manufacturing, and evaluating an autonomous tractor that can maneuver in fields with little human interposition. At the end of that approx. 8-year era, the AV-3 could maneuver autonomously with high accuracy. After this technology was established, laboratories decided to commercialize their systems and apply them on different platforms. A collaboration between Hokkaido University and the University of Illinois produced the AV-4. The VeBots laboratory then developed further systems by applying the autonomous system to a crawler-type tractor (AV-5; model CT801, Yanmar), a utility vehicle (AV-6; E-Gator, John Deere), a commercialized wheel-type tractor (AV-7; model EG83, Yanmar), a combine harvester (AV-8; model AG1100, Yanmar), an airboat (AV-9; model RB-26, Yanmar), a rice transplanter (AV-10; model EP8D, Kubota), and a half-crawler-type tractor (AV-11; model EG105, Yanmar). After the VeBots laboratory successfully expanded their technology, the multi-robot tractor (AV-12; model EG453, Yanmar) and the robotic harvesting system for heavy-weight crops (AV-13; model YT5113, Yanmar) were developed. A detailed explanation of each system is presented in the following sections.

3.1.1. AV-1

Noguchi et al. (1992) introduced the first robot tractor from the VeBots lab in a paper titled "A Study of an Intelligent Industrial Vehicle with a Neural Network" at the Second Intelligent Systems Symposium–Japan in 1992. This was the first authentic step in the development of an intelligent autonomous vehicle for agricultural applications. Before that study, several research groups had designed agricultural vehicles controlled on a predetermined path with the use of mechanical, optical, ultrasonic, or radio guidance, and in some cases using leader cables. Each of these methodologies has specific disadvantages (e.g., large errors, lack of communication, and positioning errors) which made them unsuitable for use with an auto-guidance system. At that time, the lack of a stable and accurate position-sensing system necessitated the development of different positioning systems.
The GNSS and GPS became available in 1990 only for military applications, and Noguchi et al. (1996) then developed a mobile agricultural robot that used a geomagnetic direction sensor (GDS) and image sensors (Ishii, 1997). Their objective was to establish a mobile robotic system for agricultural applications that can detect its location using a positioning system and a heading angle sensor. They investigated a mobile robot that could be controlled by positions obtained from the

Table 6
General description of projects.

Era                        Project No.1  Type of platform   Transportation system  Brand       Model        Project name  Start  Finish
Development of technology  AV-1          Tractor            Wheel-type             Prototype   Prototype    NA            1992   1997
                           AV-2          Tractor            Wheel-type             Kubota      GL320        BRAIN2        1993   1998
                           AV-3          Tractor            Wheel-type             Kubota      MD77         NEDO3         1997   2000
Commercialization          AV-4          Tractor            Wheel-type             Case-IH     Magnum 7220  UIUC4         1997   1999
                           AV-5          Tractor            Crawler-type           Yanmar      CT801        Yanmar co.    2002   2008
                           AV-6          Utility vehicle    Wheel-type             John Deere  E-Gator      Yanmar co.    2005   2008
                           AV-7          Tractor            Wheel-type             Yanmar      EG83         MAFF5         2010   2014
                           AV-8          Combine harvester  Combine harvester      Yanmar      AG1100       MAFF          2010   2014
                           AV-9          Airboat            Propeller              Yanmar      RB-26        NA            2012   2016
                           AV-10         Rice transplanter  Wheel-type             Kubota      EP8D         BRAIN         2014   2018
                           AV-11         Tractor            Half-crawler           Yanmar      EG105        SIP6          2015   now
Intelligent systems        AV-12         Tractor            Half-crawler           Yanmar      EG453        SIP           2015   now
                           AV-13         Tractor            Half-crawler           Yanmar      YT5113       SIP           2016   now

1-AV: Autonomous vehicle.
2-BRAIN: Bio-oriented Technology Research Advancement Institution.
3-NEDO: New Energy and Industrial Technology Development Organization.
4-UIUC: University of Illinois Urbana-Champaign.
5-MAFF: Ministry of Agriculture, Forestry and Fisheries.
6-SIP: Cross-ministerial Strategic Innovation Promotion Program.
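The evaluation procedure of Section 2.4 (RMSE rather than the mean, with the maximum and minimum as sub-main parameters, checked against the 8-cm lateral tolerance) can be sketched as follows. The function and key names are our own; the 0.08 m constant is the lateral-error limit cited in Section 2.4 for Japanese crop rows.

```python
import math

LATERAL_TOLERANCE_M = 0.08  # 8-cm lateral-error limit for Japanese crop rows

def performance_indicators(errors):
    """Summarize a series of e_lat (or e_head) samples.
    The RMSE is used instead of the plain average, which can be zero even
    when the vehicle oscillates strongly around the target path."""
    n = len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    return {
        "rmse": rmse,
        "max": max(errors),   # sub-main parameters
        "min": min(errors),
        "within_tolerance": rmse <= LATERAL_TOLERANCE_M,
    }
```

An oscillating error series such as [0.05, -0.05, 0.05, -0.05] has a mean of zero but an RMSE of 0.05 m, which is why the mean alone is not representative.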

image sensors and heading angles from a GDS. As shown in Fig. 21 (AV-1), this mobile robot tractor (AV-1) was a small, prototype, rear-wheel-drive tractor with a petrol engine that could electrically control the steering angle (φ) using a potentiometer (max. ±35°, later improved to ±40° (Noguchi and Terao, 1997)) and the rotation of a rear wheel by a rotary encoder (max. 7°/s). It was also equipped with a clutch and a brake controllable by a computer, which provided movement speeds in the range of 0.4–1.2 m/s. In this AV, the image sensors were used for the positioning system.
In a robotic system, not only should the location of the robot be predetermined as auto-guidance input; the orientation of the system should also be recognized. In 1996, GDSs were popular as low-cost sensors for measuring the heading angle (ψ), and one was mounted almost 1 m above the robot tractor to avoid noise effects. The GDS outputs were also corrected using an inclination sensor. The heading angle could be measured over the 360° range with roll and pitch inclinations of ±6°. Using a GDS involved two main errors: the error due to the magnetism which exists around the robot, and the error from the robot inclinations. In this regard, Noguchi et al. (1996) developed a neural network to refine the GDS output. Their neural network was composed of two networks: a correction network for vehicle inclinations, and a correction network for the magnetism that surrounds the robot (Fig. 17, left). The training data of the neural network were obtained by giving the tested robot inclinations in the outdoor environment.
Ishii et al. (1994) compared their neural network and a PID controller in order to identify the best control system for a robot tractor, a vehicle with a nonlinear kinematic system. The results indicated that the neural network controller, with a maximum error of 7.5 cm and 0.105 rad, had an accuracy higher than that of the PID controller (max. error of 31 cm and 0.195 rad). Noguchi and Terao (1997) later developed a control technique that used a combination of a neural network and a genetic algorithm (GA) which was able to create a suboptimal path for the developed

Fig. 17. The neural network method for refining the geomagnetic direction sensor output (left) and for the motion of a robot tractor (right).

robot tractor (Fig. 17, right). The results of their research showed that the RMSEs of the forward, lateral, and yaw angular velocities were 0.0171 m/s, 0.0155 m/s, and 0.0156 rad/s, respectively. The developed method was able to effectively determine a maneuver path for the robot tractor.
The control algorithm and guidance geometry of the robot are shown in Fig. 18. Based on these, the robot could measure the angle of the base line using the GDS before the robot's autonomous operation. Next, the target positions (which were previously measured using the positioning system) and the order of positions for arrival are indicated to the robot. After this procedure, the robot turns to the direction of the first target position using the data from the GDS. The straight-line path from the current position to the first target position (i = 1) can then be created.
The parameters a and b are the gains, which were experimentally determined to achieve satisfactory control stability and accuracy. The parameter d is the displacement, φd is the path direction from the GDS, and φ is the measured heading angle. A trial run of the robot was conducted on grassland at 0.5 m/s (Ishii et al., 1995), and the results showed that the RMSE of the neural network method (1°) was almost 20% that of the conventional method (5.7°). The absolute maximum error and the RMSE of the position for the predetermined path were 51 cm and 23 cm, respectively. The maximum errors of the direction for the conventional and neural network methods were approx. 14° and 1°, respectively. The average error of the final position for each target position was found to be approx. 40 cm. This error was caused by measurement error of the angles, a time delay, and the measurement interval of the positioning system.
The above results clearly indicated that the neural network method was more effective than the conventional method, because the angles detected from the GDS indicated more precise values. This result can be attributed not only to the position error, but also to the poor steering response due to the lack of power of the stepper motor with which the robot was equipped. It seemed that the position error of the robot was satisfactory for transport work in a field, but the measurable area of the positioning system should be increased for practical uses. In addition, more accurate measurements of the robot position are required for various farm operations.

Fig. 18. Robot guidance geometry using the heading angle and displacement errors.

3.1.2. AV-2

Noguchi and his colleagues were able to successfully develop their first autonomous vehicle, but the navigation system and interval of the vehicle required some modification. The accuracy of the positioning system should be optimized, and the communication time delay between a mobile vehicle and the station should be kept to a minimum. Yukumoto et al. (2000a) developed an autonomous agricultural robot that can find a position with an error of <5 cm (Yukumoto et al., 2000b); the positioning error was almost 40 cm when they used image sensors and the principle of triangulation. In another study, Yukumoto et al. (2000a) used a Kubota tractor (model GL320) (Kubota Co., 2018) as a platform equipped with a TMS (geomagnetic) heading sensor, a control box, and a reflector for the navigation system. The configuration of this robot tractor (AV-2), named ROBOTRA, is shown in Fig. 21 (AV-2).
For newly developed autonomous systems with limited experimental data, it is essential to evaluate different auto-guidance navigation systems. Yukumoto et al. (2000a) designed and evaluated three different navigation systems: (1) a system using an off-the-wire method, named LNAV, (2) a method based on a combination of a DGPS and an inertial navigation system (INS), named 'SNAV,' and (3) an optical method named XNAV that uses two different types of reference stations, i.e., XNAV and AP-L1. LNAV was developed by Kubota Co. (Tokyo). SNAV was developed by Japan Aviation Electronics Industries (Tokyo). XNAV was designed by BRAIN-IAM (The Institute of Agricultural Machinery, Saitama, Japan), and XNAV by AP-L1 was developed by Topcon Co. (Tokyo).
The experiments were performed in a 100 × 150 m paddy field at the speed of 0.5 m/s. The results indicated the following positioning accuracy and interval values: LNAV, 50 mm and 0.1 s; SNAV, 100 mm and 0.1 s; XNAV, 50 mm and 0.52 s; and XNAV by AP-L1, 10 mm and 0.5 s (Table 7). However, the interval of the SNAV system improved when the system was equipped with an IMU. The XNAV results with the reference station AP-L1 were the most satisfactory.
The four navigation systems described above were costly, and this was a commercialization limitation. To reduce the cost of orientation detection, Mizushima et al. (2000) developed a sensor fusion method (Kalman filter) that uses a GDS and a fiber optical gyroscope (FOG) based on dead reckoning. They constructed the kinematic model of the vehicle and conducted experiments with three autonomous guidance systems: the GDS (FGM-300A) alone, the FOG (JG35FD) alone, and the sensor fusion mode. They set up three GDS or FOG sensors to make an IMU for measuring the roll, pitch, and yaw angles. The results showed that the lateral offsets of the GDS, FOG, and sensor fusion were 200, 400, and 61 mm, respectively, indicating that the sensor fusion can support maneuvers more accurately. They then applied the sensor fusion method on a Kubota GL320 platform, operated the vehicle manually, and collected the positioning data using RTK-GPS (MS750, Trimble Ltd.) with an accuracy of ±2 cm. The results indicated that the RMSE of the lateral offset was 38 mm, the maximum error from the regression line was 100 mm, and the angular error was 0.004°.
Mizushima et al. (2000) also reported that when the geomagnetic field around the sensor increased, the RMSE of the lateral offset, the maximum error from the regression line, and the angular error could increase to 92 mm, 170 mm, and 0.057°, respectively. Their developed sensor fusion method thus improved the auto-guidance performance compared to the use of a single navigation sensor, although the limitation of the GDS due to the geomagnetic field could decrease the performance accuracy.
Mizushima et al. (2011) later developed a low-cost attitude sensor composed of three vibratory gyroscopes (Gyrostar ENV-05F-03, Murata Manufacturing Co., Nagaokakyo, Japan), two inclinometers (D5R-L02-15, Omron Co., Kyoto, Japan), and a sensor fusion algorithm. This attitude sensor could measure the roll, pitch, and yaw

Table 7
Specifications of different navigation systems used with the ROBOTRA robot tractor.

Navigation system          LNAV   SNAV   SNAV + IMU   XNAV   XNAV + AP-L1
Positioning accuracy (mm)  50     100    100          50     10
Interval (s)               0.1    1      0.1          0.52   0.5
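The GDS/FOG sensor fusion described in Section 3.1.2 is reported only at a high level in the reviewed papers. As an illustration of the general idea (not the authors' implementation), a one-state Kalman filter that predicts heading by dead reckoning from the gyro and corrects it with the absolute but noisy compass reading might look like this; the noise variances q and r are invented for the example.

```python
def fuse_heading(gyro_rates, gds_headings, dt, q=0.01, r=0.5,
                 heading0=0.0, p0=1.0):
    """One-state Kalman filter over the heading angle (radians).
    Predict by integrating the gyro yaw rate (dead reckoning, which drifts
    over time); correct with the absolute but noisy compass (GDS) reading.
    q and r are the process and measurement noise variances."""
    heading, p = heading0, p0
    estimates = []
    for rate, z in zip(gyro_rates, gds_headings):
        # predict: dead reckoning from the gyro
        heading += rate * dt
        p += q
        # correct: blend in the GDS observation via the Kalman gain
        k = p / (p + r)
        heading += k * (z - heading)
        p *= 1.0 - k
        estimates.append(heading)
    return estimates
```

With a stationary vehicle (zero gyro rate) and a constant compass reading, the estimate converges smoothly toward the compass value instead of jumping with every noisy measurement, which is the behavior the fused system exploits.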

angles with RMSE values of 0.43°, 0.61°, and 0.64°, respectively, to improve the positioning accuracy of the vehicle in a sloping field from 25.9 cm to 3 cm and in a bumpy field from 8.4 cm to 3.7 cm. This low-cost attitude sensor provided high accuracy at a price (USD $500) that was only 5% of the cost of previous IMUs such as combined TMS or FOG sensors. To reduce the price of attitude sensors such as the FOG and GDS, Liu et al. (2014) evaluated a low-cost IMU (S4E5A0A0, Seiko Epson Co., Suwa, Japan) using a mean filter for pre-processing and a Kalman filter for sensor fusion. They aimed to provide acceptable accuracy and attitude.
The navigation systems that are dependent on satellite signals have shown weak performance in covered areas such as tunnels or orchards surrounded by trees. Since some agricultural fields include buildings, trees or different objects, sometimes RTK-GPS cannot work efficiently. Tsubota et al. (2004) thus developed an automatic guidance system that uses a 2D laser scanner (LM291) attached on the front of a robot tractor, a virtual reference station for RTK-GPS, and an IMU. Their test results showed that the lateral and heading errors of the system were 60 mm and 3°, respectively. These results were good enough to autonomously operate the robot tractor inside a coconut field (Barawid et al., 2008a). Those authors recommended further modifications for the improvement of the applications, capability, and performance of this robot tractor by including different applications or the cooperation of several robot tractors.

3.1.3. AV-3

To improve the efficiency of their developed autonomous tractor, Kise et al. (2001) designed another robot tractor that was able to maneuver in a field based on a geographic information system (GIS) predetermined path for different applications. They evaluated a robot tractor engaged in different agricultural work in actual fields. The configuration of this robot tractor (AV-3) is shown in Fig. 21. This was a modified Kubota MD77 robot tractor (Kubota Co., 2018) equipped with RTK-GPS (#MS750, Trimble, Sunnyvale, CA), a FOG sensor (#JS35-FD, JAEI), and emergency switches. The data communication between the ECU and a personal computer was accomplished with a CAN-bus. Because of FOG drift, a Kalman filter was applied to the control algorithm. This sensor fusion method was proposed by Michio et al. [56], and it could estimate the heading angle with 45-mm accuracy on different types of paths (straight, curve, and turns).
The AV-3 robot tractor was tested in a field at the operation speed of 1.5 m/s, and the results indicated that the robot tractor could maneuver based on the predetermined path with a lateral error of 6 cm and a heading error of 1.3°. The system could also operate a rotary tillage implement while satisfying the required parameters (Michio et al., 2001). Noguchi et al. (2001a) tested the same robot tractor after adding an IMU (a three-axis FOG sensor, #JCS7401A, JAEI) to the control configuration. They also used the same sensor fusion method, and they tested the robot tractor at a speed of 2.5 m/s in a soybean field in different operations including tillage, planting, fertilizing, and cultivating (Noguchi et al., 2002). The results demonstrated that the applied sensor fusion system with an IMU improved the accuracy of maneuvers; the lateral error of the robot tractor was 30 mm. Those authors did not mention the heading error results in that study, but this error can be expected to improve (Mizushima et al., 2002). Mizushima et al. (2005) obtained the same heading errors (1.59°, 0.41°, and 0.65° in the yaw, roll and pitch angles, respectively) when they used a microcomputer (#H8S2612, Renesas Co.) instead of a personal computer.
Because a robot tractor should be able to maneuver for various types of operation, the turning task at the end of a field remained a serious problem to solve. Noguchi et al. (2001b) addressed this problem by developing a switch-back turning function based on a spline function. They conducted this research by computer simulation and real-world experimentation. The computer simulation determined the desired motion path, and they applied the developed function to the robot tractor (AV-3). The turning function was composed of a forward and a backward movement. The experimental results confirmed that the system could complete all turns successfully (Kise et al., 2002a). The positioning error was 70 mm overall for the turns (Kise et al., 2002b).
Yokota et al. (2005) later installed a 2D laser scanner (#LMS291, SICK, Minneapolis, MN) on top of the robot tractor to generate a 3D GIS map that could be used for the collection of field information for precise farming. They were able to generate a map with a positioning error of 65 cm and an attitude error of 19 cm. The same laser scanner was installed by Barawid et al. (2007) on the front of a tractor (70 cm above the ground) to develop an autonomous navigation system that can travel between trees in an orchard. After calibration (Barawid et al., 2008b), the completed tests at various speeds (0.36, 0.48, 0.62, 0.86, 1.23, and 1.43 m/s) indicated that the best heading and lateral errors were 0.5° and 50 mm, respectively, when the robot's speed was 0.36 m/s. Those results, in a comparison with the system developed by Tsubota et al. (2004) for the second robot tractor (AV-2), revealed that the lateral error and heading error were improved from 60 to 50 mm and from 3° to 1.5°, respectively, demonstrating that autonomous navigation using different navigation and attitude sensors was significantly improved.
When a third generation of robot tractors was being developed, it was necessary to devise a communication system between two robots for multi-robot cooperation. A system that could connect two independent robot tractors with sufficient safety was desired. Noguchi et al. (2004) presented a master-slave system for farm applications. Their system consisted of two algorithms: a GOTO algorithm (the slave is guided to go to some predetermined position or path) and a FOLLOW algorithm (the slave mimics the master's behaviors with a lateral offset), as shown in Fig. 19.
To increase the safety of robot tractor operations, Yang and Noguchi (2012) created a human detection system that uses omnidirectional stereovision (OSV) in a 3D panorama image. They used two multi-lens-based high-definition omnidirectional cameras and two RTK-GPS systems. The position of an obstacle (a human, height 1.7 m) was detected using the RTK-GPS installed on the human's helmet, and this was compared with the results obtained with the OSV system. The RMSE of the distance error and the maximum error between the RTK-GPS data and the OSV data were 0.49 and 1 m, respectively. The OSV system also successfully detected the direction of obstacle movement.
Those results demonstrated that the system designed by Yang and Noguchi can detect a mobile or fixed obstacle in the daytime, and the accuracy of the system was better than those of laser scanners, ultrasonic sensors, and infrared sensors. Using an OSV

method could counteract the drawbacks of the above-mentioned sensors because of the inconsistency of appearance information such as the color of the obstacle. Jongmin (2014) later improved the performance of a robot tractor and a combine harvester for working during the nighttime.

Fig. 19. GOTO and FOLLOW algorithms for a master-slave robot system.

3.1.4. AV-4

In 1997, the VeBots laboratory collaborated with the University of Illinois to learn the latter's agricultural robot technology. Zhang et al. (1999) designed a robot tractor (AV-4) that uses DGPS, a FOG sensor, and a camera, with a positioning error of <150 mm and a steering error of <0.4° at the travelling speed of 3.6 m/s. They installed the navigation sensors, steering sensors, and a control unit on a modified Case-IH 7220 tractor (Case-IH Corp.), as shown in Fig. 21 (AV-4). Noguchi et al. (1999) later developed an intelligent vision system for a robot tractor that could classify crops. They used fuzzy logic (FL) for crop classification, a genetic algorithm (GA) for the FL optimization of the vision system, an artificial neural network (ANN) to estimate the crop height, and GIS for map creation. The system was expected to have a positioning accuracy of 200 mm. The developed system was able to accurately classify the estimated weight and height of soybean crops.
Image processing to detect orientation, in which the positioning error was improved to 44.7 mm, was reported by Pinto et al. (2000). The heading error of this guidance algorithm was 1.26°, which is greater than the error afforded by the sensor fusion method. The sensor fusion algorithm of the robot tractor (AV-4) was improved and recorded by Monte et al. (2000) as a patent entitled "Sensor-fusion navigator for automated guidance of off-road vehicles" in 2002.

3.1.5. AV-5

Hokkaido Island is the northernmost island in Japan, with a long winter and an average winter snowfall of 1 m. Only five months of the year can be used to cultivate rice, and the muddy rice fields decrease the efficiency of wheel-tractors. Takai et al. (2010) designed a crawler robot tractor (AV-5) by applying RTK-GPS (Legacy-E, Topcon) with a virtual reference station (VRS) and an IMU (#JCS-7401A, JAEI) on a modified crawler-type tractor (Model CT801, Yanmar, Osaka, Japan), as shown in Fig. 21 (AV-5). The least squares method (LSM) was used for the sensor fusion; the navigation map was provided by the GIS, and the control of each crawler motion was provided by a digital-to-analog converter of the ECU (Takai et al., 2014). At the speed of 0.6 m/s, the robot tractor's heading error was 0.22° and the positioning error was 14 mm, indicating that the maneuvering of this full-crawler robot tractor was more accurate than that of wheel-type robot tractors. This could be because of the specifications of crawlers (which can provide stable motion), or it could be due to the low-speed operation. The positioning error and heading error of the system were as follows: at the speed of 0.6 m/s, 10 mm and 0.24°; at 0.9 m/s, 18 mm and 0.47°; at 1.3 m/s, 29 mm and 0.76°; at 1.7 m/s, 22.3 mm and 0.63°; and at 2 m/s, 18 mm and 0.54° (Takai et al., 2011).
To increase the safety of the robotic maneuver, three types of safety sensors, i.e., a 2D laser scanner (#LMS291, SICK), a tape switch (#T5-16, Tapeswitch Japan, Matsudo, Japan), and two proximity switch sensors (#ED-130, Keyence, Osaka, Japan), were later used.

3.1.6. AV-6

In the subsequent research following the efforts of Barawid et al. (2008a) to develop an automatic guidance system for orchards in 2008, Barawid and Noguchi (2010) extended their studies by developing a small, low-cost robotic vehicle (AV-6) that uses an electric power source instead of fuel. They used an electric utility vehicle (E-Gator, John Deere Co., Moline, IL) because of its easy maintenance and energy efficiency (Fig. 21). The E-Gator was modified to have controllable steering, movement, and speed by applying an ECU that was connected to a personal computer using a CAN-bus. An RTK-GPS sensor (#MS750, Trimble) and an IMU (#VN-100, JAEI) were used as the positioning and attitude sensors, respectively. A proper motor as an electric engine was selected based on the vehicle maneuver in on-road and off-road conditions, and then the steering of the vehicle was calibrated.
The AV-6 was evaluated in straight-line tests, and the results showed that at the speeds of 0.86, 1.52, and 1.61 m/s there was a lateral error of 80, 180, and 410 mm and a heading error of 1.23°, 3.17°, and 7.67°, respectively. The performance of this robot was compared with that of the robot tractor AV-3 (MD77) described above, which used a wheel-type tractor. The results indicated that the electric vehicle AV-6 was more accurate than the fuel robot tractor AV-3 regarding positioning and attitude, although charging the batteries of the electric vehicle can be a limitation for long applications.
Barawid and Noguchi (2011) attached a camera on an AV-6 to develop a follower vehicle that uses target recognition. The system was observed to be able to follow a mobile target with a heading error of 4.6° and a lateral error of 1 m at the speed of 1.1 m/s (Barawid and Noguchi, 2011). Yin et al. (2013) also developed an obstacle detection system using a 3D camera (PMD CamCube 2.0), which had a lateral error of 56 mm and 71 mm in static and dynamic situations, respectively. The maximum positioning accuracy of 100 and 184 mm with an RMSE of 56 and 71 mm was achieved in static and dynamic motion, respectively (Yang, 2013b; Barawid, 2011).
Ospina and Noguchi (2016) used an AV-6 robot vehicle to estimate dynamic parameters and improve the efficiency of autonomous navigation in order to reduce the tire sideslip effect. The main objective was to dynamically model the relationship between the sideslip angle of the platform and the yaw rate (as outputs) and the vehicle velocity and steering angle (as inputs) using bicycle geometrical and dynamic models. They used potentiometers (#CPP-60, Midori Precisions, Tokyo) on one of the steering wheels to provide the steering angle, plus a microcontroller and a computer for communication and data processing.
The vehicle was tested on a flat concrete surface. The comparison of the experimental results and the modeled data confirmed that the developed model can be used for robotic vehicles with an estimate of dynamic parameters; however, some modification was needed because of the non-linearity of agriculture field conditions. Ospina and Noguchi (2018) later evaluated their developed

model in different soil conditions, including a flat experimental field, dry soil conditions, and wet soil conditions, to find a relation between the tire's lateral forces and slip angles in terms of soil moisture content and cone index. This study was done to improve the automatic steering controller systems of AVs. The results showed that the model could give a more accurate description of the relation between the tire's lateral forces and slip angles and could be applied to agricultural field conditions with slight modifications.

3.1.7. AV-7

Yang et al. (2016) developed a CAN-bus-based robot tractor (AV-7) (Fig. 21). They applied an RTK-GPS sensor (#AGI3, Topcon), an IMU, OSV, and two laser scanners (front and rear, a 2D laser scanner with 100-mm accuracy (Yang and Noguchi, 2014)) as safety sensors, a personal computer, and two CAN-buses (CAN-1: internal communication between the tractor's ECU and the sensors and actuators; CAN-2: communication between the PC and the robot tractor) on a wheel-type tractor (#EG83, Yanmar).
Three experiments were conducted in a rectangular agricultural field with eight paths with 5-m intervals and a turning algorithm

tractors due to its crawlers. In the case of robot tractor AV-5, its accuracy was already comparable with that of other robot tractors with similar horsepower values. This is because the crawlers have a greater contact surface, which reduces the momentary heading fluctuations.
Zhang et al. later optimized the control parameters by developing a kinematic model for steering and turning functions, and the robot was retested (Zhang et al., 2014b). They reported that the performance of the robot was optimized at the speed of 1.2 m/s, when the optimized lateral error was 250 mm; however, this lateral error was larger than those in their earlier report (Zhang et al., 2013b). Rahman et al. (2017) developed a dynamic model to estimate the heading angle and reduce the harvesting time. They also devised an optimum harvesting area for convex and concave polygon-shaped fields using sensor measurement. The test results indicated that the absolute estimated heading angle was sufficient to apply to exact crop periphery determinations (Rahman, 2018).
Choi et al. (2014) added a laser scanner (#UTM-30LX, Hokuyo USA, Indian Trail, NC) to the AV-8 combine harvester to provide a navigation system by obtaining more field information (such as the shape of the field). They did not use this sensor for obstacle detection or safety applications. Their findings in static and
developed by Kise et al. (2002b). The experiments were performed dynamic (speed: 0.97 m/s) conditions showed that the system
at (1) an experimental farm, without implements for an accuracy had heading errors of 200 and 700 mm and lateral errors of 0.8°
evaluation (at 1 m/s speed); (2) at a soybean field with a spraying and 3°, respectively. This system could distinguish the crop rows
implement for an evaluation of non-straight path navigation (the and harvested boundary precisely. In the manual operation of the
speed was not reported); and (3) at a normal 2-ha agricultural field system, the operator can detect the edge of harvester crops and
for almost 5 hr during the night-time for a test of the stability of tries to harvest the uncut crops. In the system’s autonomous oper-
the system for long-term work (speed: 0.6 m/s). At the first, sec- ation, sensor detection was needed to detect the cut/uncut crops
ond, and third experiments the heading error and lateral error and to operate the system to reduce the uncut rows.
were 0.57°, 0.77°, and 0.75°, and 50, 20, and 40 mm, respectively. In this regard, Teng et al. (2016) developed an edge detection
At the second test, the heading error was increased because of system that uses a laser rangefinder that was a laser scanner
the spraying implement’s weight. Compared to the previously (#UTM-30LX, Hokuyo) installed on a pan-tilt unit (PTU). The
developed robot tractors (AV-6 and before), the navigation param- indoor edge detection tests showed an average offset of 217 mm
eters were clearly improved in this robot tractor (AV-7). The third and an RMSE of 6.7 mm. In static and dynamic experiments in a
experiment was also the first reported experiment conducted in wheat field, the system had a lateral error of 42.7 and 101.5 mm,
the absence of daylight. respectively. The accuracy of the edge detection in the dynamic
The efficiency of farm work could increase if one driver could condition was affected by the vehicle’s vibration, the steering error
control two tractors, and if a robot tractor could follow another of the navigation (manually operated in that study), and the non-
human-driven tractor (Noguchi, 2015). (Zhang et al., 2014a) devel- linearity of the wheat edges. Because of the low resolution of laser
oped an algorithm for the AV-7 robot tractor to follow an AV-3 finder, the authors recommended the use of image processing or
tractor that was operated manually. The robot tractor was sensor fusion in future studies. Fan (2015) subsequently developed
equipped with wireless video transmission, four cameras, a laser a real-time quality sensor to predict the protein and moisture of
scanner, and a Bluetooth device. The best distance between the wheat using spectroscopy reflectance; however, this research
two tractors was 150 m. The experiments were done on six paths, was not directly related to the autonomous navigation (Choi
with a U-turning method, speed of 0.9 m/s, and when the robot et al., 2013).
tractor moved before the human-driven tractor moved. The results
showed that this system a had lateral error (straight line and turn- 3.1.9. AV-9
ing) of 30 mm and a heading error of 0.71° (Zhang et al., 2013a). The monitoring of paddy field growth and spraying has been
The lateral error of the human-driven tractor was improved from another common concern in Hokkaido (Roshanianfard). Liu and
60 mm to 30 mm because it followed the robot tractor (Zhang Noguchi (2016) developed an unmanned surface vehicle (USV)
et al., 2015; Zhang, 2014a; Yang, 2013a). (AV-9) for paddy fields. They used a radio-controlled air propeller
agricultural vessel (RB-26, Hokuto Yanmar Co., Ebetsu, Japan) as
3.1.8. AV-8 platform, a personal computer, a GNSS / GPS compass (V100:
Zhang et al. (2013b) developed a robot combine harvester (AV- DGPS + gyroscope, Hemisphere GPS, Calgary, Canada), an ECU
8) for different applications (Fig. 21). They used a commercial com- (Arduino UNO), servo motors for rudder control, and a magnetic
bine harvester (Model AG1100, Yanmar) equipped with RTK- sensor (#GV-101, Futaba, Mobara, Japan) to measure the rotary
GPS + IMU (#AGI3, Topcon). A disadvantage of the AGI3 Topcon speed of the engine, and a wireless router for data communication
RTK-GPS was that it should be used at speeds > 3 m/s, which prob- (Fig. 21).
ably decreases the efficiency of autonomous vehicles (Zhang, They evaluated the performance of the USV in three different
2014b). After Zhang et al. evaluated the AV-8 in a straight-line test conditions: (1) line-follow navigation to minimize lateral and
in the field, the RTK-GPS installation location was modified. heading errors, (2) headland turning to evaluate the turning func-
The experimental results demonstrated that the developed tions, and (3) map-based navigation. The developed robot USV
robot combine harvester had a heading error of 0.92° and a lateral showed linear and autonomous navigations with lateral errors of
error of 34.7 mm at the traveling speed of 1.0 m/s. This robot com- 250 and 380 mm, respectively. The USV was also able to navigate
bine harvester was clearly more accurate than the previous robot ellipses in turning paths. The large lateral errors were due to a
crosswind, and Liu et al. (2017) tested an idea to simulate and reduce the effect of a crosswind. The main objective of their study was a determination of the USV's maneuverability in paddy fields. The experimental results of a zig-zag maneuver test at 1.2 m/s were compared with those of a computer simulation and the related parameters. The simulation results indicated that the crosswind force can reach 2.85 N at the speed of 3 m/s under 0.1% turbulence. Liu et al. recommended that an optimized control algorithm that considers the crosswind force be used in future studies (Liu et al., 2016).

3.1.10. AV-10
Rice cultivation in paddy fields presents specific challenges, including maneuverability in the fields and labor concerns. Okamoto and his team have been working on the autonomous navigation of a specifically designed rice transplanter made by Kubota Co. (Japan) (Kawahito, 2016; Okada, 2017; Wada, 2017). As shown in Fig. 21 (AV-10), this specifically designed vehicle has metallic wheels and unique dynamic characteristics that make it easier to control the vehicle. Okamoto et al. developed a low-cost navigation system that uses an RTK-GNSS sensor (#SPS 855, Trimble) and a CAN-Bus connection. This rice transplanter can maneuver in a paddy field with maximum and minimum lateral errors of 33.6 and 6.7 cm, respectively. Okamoto et al. later equipped the vehicle with a camera (4 mm, Tamron, Saitama, Japan) to recognize pre-provided paths. They observed that they could control the new system with an average lateral error of 10.34 mm. This study is in progress.

3.1.11. AV-11
As mentioned, autonomous navigation includes the turning phase at the end of each straight maneuver (headland). To address this phase, several algorithms have been developed: the U-turn, keyhole turn (Takai et al., 2011), and fish-tail turn (Michio et al., 2001) algorithms. Wang and Noguchi (2016) designed a robot tractor (AV-11) with optimized maneuverability in straight lines and turning at the headland (Fig. 21). Their main objectives were the optimization of the trajectory, the headland distance, and economy indexes. A wheel-type tractor (Model EG105, Yanmar) was equipped with a personal computer, a control unit, an RTK-GPS sensor (SPS855, Trimble), and an IMU (#VN100, VectorNav Technology, Dallas, TX). Their vehicle's turning function was named 'circle back (CB)'; it includes two circular motions and one linear motion. Their experiments indicated that the robot tractor can maneuver with lateral and heading errors of 60 mm and 1.2°, respectively, when the speed of the platform is 1.4 m/s. This turning algorithm provided efficient and optimized maneuvering.

3.1.12. AV-12
In a continuation of their research into follower robots such as a master-slave system (GoTo and FOLLOW systems) (Noguchi et al., 2004) and a follower robot tractor (Zhang et al., 2014a, 2013a, 2015), Zhang and Noguchi (2015) developed a leader-follower system that is a collaboration of two independent robot tractors in the field, to reduce the total working time. They equipped a half-crawler tractor (Model EG453, Yanmar) with RTK-GPS (SPS855, Trimble), an IMU (VN100, VectorNav), a PDA (personal digital assistant), a personal computer, and Bluetooth as the robot tractor (AV-12, Fig. 21). This system has two major parts: the robot's navigator (all control functions) and the robot's server/client.

The system was first simulated on six paths at the average speed of 0.83 m/s when the distance between the robots was 12 m. The minimum distance between the safety zones was 1.56 m, and the total operation time was 22.5 min. When the simulation results satisfied the safe-distance criterion and the system was confirmed to work without collisions, Zhang and Noguchi tested the system in a field, using the same parameters as the simulation. The results indicated that the minimum distance between the safety zones was 0.8 m, and the total operation time was 24.5 min. These values were obtained because the software did not include all of the parameters of a real field, and due to IMU errors. Compared to a single robot tractor, the AV-12 system had 84.3% efficiency. The lateral errors of the first and second robot tractors were 70 and 50 mm, although the reported lateral error of the system was almost zero.

Zhang et al. (2016) then improved the algorithm so that the minimum distance between the safety zones could be decreased to 0.59 m; the total operation time decreased to 20.4 min, and the efficiency increased to 95.1%. As was the case for the VeBots laboratory's earlier robot tractors with reasonable heading errors, this parameter was not reported for the follower robots.

The spark of the multi-robot operation idea was provided by Noguchi and Barawid (2011) for rice, wheat, and soybean in Japan, planning a five-year project beginning in 2010. They introduced the developed autonomous vehicles (AV-2, AV-5, AV-6, AV-8, and AV-10) as platforms, RTK-GNSS and IMU as navigation sensors, and laser-scanner and ultrasonic sensors as safety sensors as the components of future multi-robotic systems. Their idea was communication among a robot management system (navigation sensors, sensor fusion, Kalman filter), a real-time monitoring system (management and monitoring), a navigation system, and a safety system using a wireless local-area network (LAN).

After several attempts, Zhang and Noguchi (2017) developed the first multi-robot tractor system for agricultural field work. They aimed to increase the efficiency and speed of farm work in order to decrease the risk of harvesting and other natural limitations such as rain and temperature variation. Their multi-robot system can decrease the economic indexes and labor costs, providing an infrastructure for high-efficiency agriculture. They used the robot tractor (AV-12) as the main platform in three different patterns: an I-pattern, a V-pattern, and a W-pattern using a U-shaped turning method. They controlled the lateral and longitudinal distances between the robots, and they defined a circular safety zone and a rectangular safety zone. They defined and evaluated two types of efficiency: the efficiency of the multi-robots' working time compared to that of a single robot, and the efficiency of the multi-robots' working time compared to the total consumed time.

As shown in Table 8, the I, V, and W patterns were simulated using three robot tractors (18 paths), five robot tractors (30 paths), and seven robot tractors (42 paths). The observed efficiencies of these combinations were 268.4%, 365.6%, and 437.8%, respectively, compared to a single robot tractor. The field experiments using three (V-pattern) and four (W-pattern) robot tractors showed efficiency values of 247.5% and 352.9%, respectively, compared to a single robot tractor.

The same study's comparison of working times indicated that the three-robot and four-robot tractor combinations had efficiency values of 80% and 82.1% and lateral errors of 50 and 42.5 mm, respectively. Based on the results of the safety experiments, the developed system was confirmed to be safe for field performance. The final efficiency evaluation showed that the I-pattern, V-pattern, and W-pattern have 83.2%–89.8%, 67.3%–78.9%, and 59.4%–65.8% ranges of efficiency, respectively (Fig. 20), which are strongly dependent on the field size (Zhang, 2017).

3.1.13. AV-13
Heavy-weight crops such as pumpkin, watermelon, melon, and cabbage have recently attained remarkable market value in Japan (especially Hokkaido), but the number of farmers in Japan continues to decrease because of the strenuous workloads and a labor shortage. For Japan as a developed country, farming is considered
Table 8
Details of the simulation and experimentation results in different patterns.

Mode             Robot tractors   Pattern   Paths   Work time, multi-robot (min)   Work time, single-robot (min)   Efficiency (%)
Simulation       3                I         18      19.57                          52.52                           268.4
Simulation       5                V         30      24.12                          88.17                           365.6
Simulation       7                W         42      28.28                          123.8                           437.8
Experimentation  3                V         12      14.02                          34.69                           247.5
Experimentation  4                W         16      13.2                           46.57                           352.9

Fig. 20. Different patterns for multi-robot systems.

an overwhelming job compared to office work. The current harvesters are not precise or careful enough, and this can result in damage to crops and a decline in market value. Robot technology is a potential solution for these issues, but most of the agricultural robots that are currently available are designed for small-sized and lightweight crops (Roshanianfard).

Vehicle robotics laboratories can design smart robot tractors for real farms, including farms with heavy crops. Such autonomous vehicles are equipped with different sensors (navigation sensors, safety sensors, and attitude sensors) and different platforms (tractors, combine harvesters, rice transplanters, airboats, and electric utility vehicles). As an intelligent system, a proportionate actuator is necessary as the complementary unit. This actuator could be applied for different applications. Roshanianfard and Noguchi (2017b) developed a 'harvesting robot for heavy-weight crops (HRHC)' system (Roshanianfard, 2018a). They sought to develop an automatic harvesting system to improve the Japanese farmers' work in different fields. The HRHC system consists of five main units: (1) an automatic robot tractor (which could be an AV-7, AV-11 or AV-12), (2) a 5-degrees-of-freedom (DOF) robotic arm, (3) a specifically designed end-effector, (4) main controlling units, and (5) a vision system (Fig. 21, AV-13). Automatic maneuvers in the field, crop detection by a vision system, field monitoring, and precise field operations are only some of the planned applications for the HRHC system (Roshanianfard and Noguchi, 2017c).

To develop this system, Roshanianfard and Noguchi (2016) first designed the development procedure for a low-cost 5-DOF robotic arm for farm use, considering the limitations when facing complex agricultural conditions with a limited workspace and payload. Different parameters such as joint torque, payload per weight (PPW), and repeatability were calculated and simulated dynamically and statically. After the system's development, the control algorithm (Kamata et al., 2018) was created using the Denavit-Hartenberg method in forward kinematics and inverse kinematics calculations (Roshanianfard and Noguchi, 2018b). The performance evaluation of the system revealed that the workspace volume was 8.024 × 10⁹ mm³, the front access was 3.518 × 10⁶ mm², and the harvesting area was 808 mm (Roshanianfard et al., 2019).

Before the end-effector of the system was designed, three varieties of pumpkin (JEJEJ, TC2A, Hokutokou) as a heavy-weight crop were characterized (Roshanianfard, 2018a). Three experiments were conducted: (1) an evaluation of the general physical properties for investigating the pumpkins' orientation and possible harvesting methodologies, (2) a compression strength test for measuring the yield force, and (3) a bending-shear test for measuring the yield force to cut the pumpkin stems. The results showed that the average pumpkin's lift weight was 26% more than the pumpkin's pure weight. A special applied technique simplified the harvesting procedure (Roshanianfard and Noguchi, 2018a).

Roshanianfard and Noguchi (2017a) then designed an end-effector with a unique harvesting methodology based on the extracted properties of pumpkins. The components were designed and dynamically simulated, and after several modifications, the final components were manufactured and assembled. The designed end-effector had five fingers designed to grasp and harvest heavy-weight crops with diameters of 170–500 mm by a sustainable force distribution. The trial results demonstrated that the system's fingers have enough capability under the maximum payload of the HRHC system (i.e., 25 kg), and the new end-effector could harvest the varieties of pumpkin mentioned above because the range of the system's radius, volume and mass values covered the extracted physical parameters of pumpkins.

After the control unit of the HRHC system was designed (based on a programmable logic controller system and consisting of a personal computer, a position board, amplifiers, servo motors, a switching unit, and emergency switches), the performance of the HRHC system was evaluated. The results showed that the HRHC system has a 92% harvesting success rate, a 0% damage rate, and an average cycle time of 42.4 s. The final parameters of the workspace were as follows: workspace volume, 5.662 × 10⁹ mm³; harvesting surface, 2.86 × 10⁶ mm²; and harvesting length, 800 mm. The average accuracy values were as follows: x-direction, 10.91 mm; y-direction, 9.52 mm; and repeatability, 12.74 mm. The control resolution in the x-, y-, and z-directions was 1 ± 0.075, 1 ± 0.05, and 1 ± 0.025, respectively (Roshanianfard et al., 2018).

The harvesting success rate of the HRHC system (100%) was higher than the overall average harvesting success rate of previous studies, 66% (40–86%), between 1984 and 2012 (Bac et al., 2014). The results of each calculation, simulation, development, modification, and characterization show that the HRHC system can be used for heavyweight crop robotic harvesting applications. Although this system was designed for specific crops (i.e., pumpkin, watermelon and cabbage), it could be used for other applications such as precise seeding, fertilizing, watering, and weeding.

As Fig. 22 shows, the number of published papers in the developing era increased rapidly. During the next era of commercialization, the number of publications decreased because the highest priority was the expansion of the technology. Nowadays, the VeBots laboratory is focusing on upgrading their technology in the field of artificial intelligence (AI). Table 9 presents more
Fig. 21. The configurations of the autonomous vehicles designed at Hokkaido University (Roshanianfard).

Fig. 22. The numbers of outcome relevant papers.

detailed information about the applied sensors for each project, the experimental conditions, and the results.

3.2. Characterization results

One of the main components of an autonomous vehicle is the mobile platform. For the category of autonomous agriculture vehicles, there are several platforms that can be used. The VeBots laboratory mostly used tractors (in nine projects and 60 papers). They also used a combine harvester (one project, eight papers), a utility vehicle (one project, eight papers), an airboat (one project, four papers), and a rice transplanter (one project, one paper) (Fig. 23). In general, a tractor was the platform in 70% of the projects (74% of the papers), and the other types of platforms were used in only
Table 9
The navigation results of the autonomous vehicles.

Project No.   Environment   Speed (m/s)   Heading error (degree)   Lateral error (mm): RMSE / Max / Avg   Navigation sensor: PS / AS / SS
(PS: positioning sensor; AS: attitude sensor; SS: safety sensor; NA: not available)
AV-1 Field 0.5 1 230 510 400 1 1 –
AV-2 Field 0.5 NA NA NA NA 2 1 9
Field 0.5 NA NA NA NA 3 1 –
Field 0.5 NA NA NA NA 3 1 –
Field 0.5 NA NA NA 50 4 – –
Field 0.5 NA NA NA NA 4 – –
Field 0.6 0.004 38 100 100 5 1,4 –
0.057 170
Orchard 0.5 3.11 60 150 NA 5 4 1
0.55 3.02 60
0.64 3.48 50
0.87 4.39 70
1.4 3.15 60
Field (Flat) 0.8 0.63 65 NA NA RTK 2,3 –
Field (Sloping) 0.64 30
Field (Bumpy) 0.59 37
AV-3 Field 1.5 1.3 60 150 NA 5 4 10
Field 2.5 NA 30 70 NA 5 4,5 10
Orchard 0.36 1.5 110 NA NA 5 5 1,10
Field 0.86 1.5 NA NA 110 6 5 2
1.23 3.3 140
1.43 4.9 190
AV-4 Field, GNSS-FOG 3.6 0.4 NA 150 NA U U
Field, Vision only 200
Field virtual 1.26 41.4 166 44.7 RTK –
AV-5 Field 0.6 0.22 14 NA NA 6 5 1
Field 0.6 0.24 10 NA NA 7 5 –
0.9 0.47 18
1.3 0.76 29
1.7 0.63 22.3
2 0.54 18
Field 0.5 0.2 10 NA NA 6 5 1,4.5
AV-6 Field 0.86 1.23 80 NA NA 5 6 10
1.52 3.17 180
1.61 7.67 410
Field static NA 56 100 NA 6 7 6
dynamic 71 184
AV-7 Field 1 0.57 50 70 NA 7 U 10,2,7
Field, implement – 0.77 20 30
Filed, night 0.6 0.75 40 100
Field 1.4, straight 0.98 41 57.1 NA 8 6 –
1, turning – – 53
Field 0.5 0.39 29 69 NA 7 U 3,8
AV-8 Field 1 0.92 34.7 200 NA 7 6 –
Field Static 0.8 20 40 NA 5 7 7
0.97 3 70 100
AV-9 Paddy field 1.2 NA 310 820 230 U 6 –
AV-10 Paddy field NA NA NA NA NA UD UD UD
AV-11 Field 1.4, straight 1.2 30 160 20 7 U 3,7
1, turning 10 120 80
AV-12 Field 1.16 NA 70 50 40 8 6 7
AV-13 Field UD UD UD UD UD UD UD UD
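Table 9 condenses each navigation test into RMSE, maximum, and average lateral errors. As a reminder of how these three summary statistics are obtained from a logged error trace, a small helper is sketched below; the sample values are hypothetical, not from the reviewed experiments:

```python
import math

def summarize_lateral_error(errors_mm):
    """Return (RMSE, maximum, average) of a logged lateral-error trace,
    i.e., the three summary columns used in Table 9. Signed offsets are
    reduced to magnitudes, since only the size of the deviation from
    the reference path matters here."""
    mags = [abs(e) for e in errors_mm]
    rmse = math.sqrt(sum(m * m for m in mags) / len(mags))
    return rmse, max(mags), sum(mags) / len(mags)

# Hypothetical trace of signed lateral offsets (mm) along one path:
rmse, err_max, err_avg = summarize_lateral_error([12, -8, 15, -20, 5])
```

Applied to a full GNSS log, this yields one Table 9 row per test condition.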

30% of the projects and described in 26% of the papers. These data are not surprising, because the tractor is a multi-application vehicle that can carry different types of implements and equipment and perform several types of farm work; a tractor can thus be used for various robotic applications.

Among the projects mentioned above, a wheel-type transporter system was used in seven projects (52 papers, 64.2%), a half-crawler type in three projects (14 papers, 17.3%), and a crawler transporter in two projects (11 papers, 13.6%). Most of the robotic projects thus used a wheel-type tractor, which is the most commonly used agricultural machine in the world (Fig. 24).

Table 10 summarizes the development procedure of the operation functions of the AVs, such as the steering control, forward and backward motion, brake, shift, rotary speed, hitch, and PTO. The advances in steering began with AV-1, the steering of which was controlled using a stepper motor connected to the steering shaft. An electro-hydraulic valve was used later in AV-2, -3, -4, -6, -7, -11, and -12. AV-5 and AV-8 each had a hydrostatic transmission (HST) system with which the steering could be controlled from a swash plate. The steering control of the AVs was not performed directly by the ECU until AV-13. This was because of Japan's strict governmental policy regarding vehicle traffic and the autonomous navigation of vehicles; this policy did not allow manufacturers to sell a vehicle with controllable steering. After 2016, the policy was revised and companies were allowed to offer controllable-steering vehicles. Before that, when an AV was being designed, some modification of the steering control was required. The other operation functions (i.e., forward and backward motion, brake, shift, rotary speed, hitch, and PTO control) were
Table 10
Operation functions of the developed autonomous vehicles and their development process.

Project No.   Steering   Forward & backward   Brake   Shift   Rotary speed   Hitch   PTO
AV-1          M          man                  M       NA      M              NA      NA
AV-2          M          M                    M       man     M              M       man
AV-3          M          M                    M       M       M              M       M
AV-4          M          M                    M       M       M              M       M
AV-5          M          M                    NA      NA      M              M       M
AV-6          M          M                    NA      NA      NA             NA      NA
AV-7          M          C                    C       C       C              C       C
AV-8          C          C                    C       C       C              NA      NA
AV-9          M          M                    M       NA      M              NA      NA
AV-10         C          C                    C       NA      C              NA      NA
AV-11         M          C                    C       C       C              C       C
AV-12         M          C                    C       C       C              C       C
AV-13         C          C                    C       C       C              C       C

C: can be controlled without modification. M: can be controlled with modification. man: manual. NA: not available.

originally performed manually using the related handle. A handle connected to a switch could control the actuator. Until the AV-6, an extra ECU was required to control these functions. This ECU was connected between a switch and the actuators and could control them using the signals received from a personal computer. This is why these functions needed further modification. From the AV-7 onward, manufacturers have included a commercialized ECU on the platforms that makes the functions controllable. After that (with the exception of the AV-9, which has a different topology, control system, and platform), the operation functions of all AVs have been controllable without modification.

Fig. 23. Different types of platforms used.

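On the CAN-bus platforms (AV-7 onward), an operation command such as a steering setpoint ultimately travels as a scaled integer packed into a frame payload addressed to the relevant ECU. The sketch below illustrates only this packing step; the message ID, payload layout, and 0.01 deg/bit scaling are invented for illustration and are not the manufacturers' actual message definitions:

```python
import struct

STEER_CMD_ID = 0x18FF1001  # hypothetical 29-bit CAN identifier

def encode_steer_cmd(angle_deg: float) -> bytes:
    """Pack a steering-angle command into an 8-byte CAN payload:
    signed 32-bit integer, little-endian, 0.01 degree per bit."""
    raw = int(round(angle_deg / 0.01))
    return struct.pack("<i4x", raw)  # 4 data bytes + 4 pad bytes

def decode_steer_cmd(payload: bytes) -> float:
    """Inverse of encode_steer_cmd, as an ECU would apply it."""
    return struct.unpack_from("<i", payload)[0] * 0.01

payload = encode_steer_cmd(-12.34)
```

A real implementation would hand STEER_CMD_ID and this payload to a CAN driver, with the receiving ECU performing the decoding.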
3.3. The development of communication system

Fig. 25 illustrates the development process of the internal communication system. As shown, the internal communication was directly controlled in the beginning because there was no ECU available on tractors. The research groups were obliged to use different interface cards to receive data from the sensors and to send commands to the actuators. Some prototype ECUs were eventually developed and installed on tractors. These ECUs were not commercialized at the time, but researchers and manufacturers then investigated several systems to obtain an ideal control unit. At that time, RS-232c (and, in some cases, USB) was used for communication between a personal computer and an ECU. After the ISO-Bus and CAN-Bus communication systems were developed, the communication of several ECUs became easier and faster. Tractors, like cars, can be equipped with CAN-Bus, and this communication method is now in constant use.

Fig. 24. Different transporter systems.
Fig. 25. The internal communication system development process.

3.4. The development of positioning system

For autonomous navigation, positioning sensors are known as the most important sensors because they can influence the accuracy of the entire autonomous system. When the AV-1 was developed, GNSS was not yet available, and its positioning sensor had been designed only for military applications. Image sensors were then created to guide a tractor (Fig. 26). As the accuracy of this
Fig. 26. The positioning sensor development process.


Fig. 28. The safety sensor development process.

positioning sensor was not good enough, researchers decided to


use some available positioning sensors, including LNAV, XNAV, 3.6. The development of safety sensors
SNAV, and DGPS sensors. The SNAV sensor was selected as the
optimal sensor because the navigation topology of SNAV is based Fig. 28 illustrates the safety sensors on different AVs. Safety
on RTK-GPS. However, RTK-GPS and DGPS were still at the early sensors — which protect an AV from various obstacles in agricul-
development stages at the time, and their accuracy was not very tural environments — have changed over time. Contact sensors
high. RTK-GPS was also too expensive to use for agricultural applications (see Fig. 27).
When Japan faced several challenges such as a labor shortage, the high price of autonomous vehicles became more acceptable, and several laboratories expressed their interest in the development of robotic vehicles. During this period, RTK-GPS became the standard position sensor in autonomous vehicles. The current RTK-GPS, with 2-cm accuracy, is known as an accurate but expensive positioning sensor. The development of a low-cost RTK-GPS or a new economical navigation method is necessary.

3.5. The development of attitude sensors

Overall, attitude sensors can be divided into three categories: GDS, FOG, and IMU sensors. GDS sensors were the most commonly used before the development of FOG and IMU sensors. GDS and TMS sensors, which are affected by the surrounding magnetic field, were unavoidably inaccurate, which resulted in large random and bias errors. FOG sensors then became more popular, and three-axis FOG sensors were soon used to measure the 3D orientation of vehicles. These sensors were applied in airplane navigation systems before being used in the agriculture industry. Because of the three-axis FOG sensors' high price, micro-electro-mechanical system (MEMS) IMUs were evaluated, and this type of attitude sensor is now frequently used. It thus seems that attitude sensors will continue to be used in future designs of autonomous agriculture vehicles.

(e.g., bumper switches) were used as the first safety sensors. This type of safety sensor has high reliability, but the large delay and long response time could cause serious damage. To increase the safety index, a camera was added to the next generation of AVs. Different methodologies based on cameras, such as stereovision, were applied to analyze the agricultural environment. A camera can collect valuable and diverse information about the environment, but this type of sensor had two significant disadvantages: (1) more information can reduce the safety, and (2) environmental factors such as lights and shadows can give rise to errors. Researchers then started to combine a camera with a laser scanner to increase the safety level.

The camera/laser scanner combination is now the main choice as a safety sensor, but the use of only a laser scanner has been examined. Compared to a camera, a laser scanner can cover a longer distance, and, in most cases, a laser scanner is more accurate. The development of safety sensors began with switches, moved on to cameras, and eventually arrived at the combination of a laser scanner and camera as the most effective sensor.

Agricultural environments have been the main substrates examined in the development of AVs for farming, which has provided many challenges in the design of robotic systems. This type of substrate is special because in an agricultural environment, many complex and varying factors must be considered. Examples include light reflection, which can affect image sensors; dusty environments, which can affect a mechanical platform or its components; and humidity, rain, wind, non-flat surfaces, isolation, and vibration. All of these factors increase the complexity of the design process, and this complexity is the reason why researchers select a specific environment when they start to design the required system.
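The switch-to-camera-to-camera-plus-laser progression described above can be illustrated as a simple conservative gate: the laser scanner, being the more reliable ranging device, triggers an immediate stop, while a camera-only detection merely degrades speed. This is a minimal sketch; the thresholds, scan format, and three-state output are illustrative assumptions, not the implementation of the reviewed vehicles.

```python
# Conservative camera + laser-scanner safety gate (illustrative sketch).
STOP_DISTANCE_M = 3.0   # assumed emergency-stop radius
MIN_VALID_M = 0.1       # returns closer than this are treated as noise


def scanner_sees_obstacle(ranges_m, stop_distance_m=STOP_DISTANCE_M):
    """True if any valid laser return falls inside the stop zone."""
    return any(MIN_VALID_M < r < stop_distance_m for r in ranges_m)


def safety_decision(camera_flag, ranges_m):
    """Fuse the two sensors conservatively: the laser scanner overrides,
    while a camera-only detection (prone to light/shadow errors) only
    slows the vehicle instead of stopping it."""
    if scanner_sees_obstacle(ranges_m):
        return "stop"
    if camera_flag:
        return "slow_down"
    return "continue"
```

In practice such a gate would run once per scan cycle and feed the platform's speed command.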
Fig. 27. The attitude sensor development process (from GDS through FOG to three-axis-FOG IMU and MEMS IMU, across AV-1 to AV-12).

As shown in Fig. 29, the VeBots laboratory at Hokkaido University has conducted research in three main environments: agricultural fields, an orchard, and a paddy field. Most of the research was performed in agricultural fields (87% of all experiments); the orchards and paddy field accounted for only 14%. The agricultural field experiments used different platforms (e.g., a tractor, combine harvester, or utility vehicle). The orchard experiments used only a tractor, and the single paddy field experiment used both a rice transplanter and an airboat.

3.7. Performance qualification

Fig. 30 illustrates the heading error of each of the 13 AVs and their travel speeds.

Fig. 29. Experiment environments (agricultural field: 87%, orchard: 7%, paddy field: 7% of the experiments).

The horizontal axis indicates the different experiments, e.g., AV-2, PS-9, 0.8, and Bumpy, along with the positioning sensor used (PS-9: RTK-GPS, Topcon, Legacy-E; see Table 3), the travel speed (0.8 m/s), and the environment specifications (e.g., Bumpy field). The blue vertical line indicates the threshold of the attitude sensor. This threshold is located between the AV-2 and the AV-3, when the attitude sensor was changed from GDS to IMU (three-axis FOG or MEMS). These data establish that the IMUs had a revolutionary effect on the heading error. Before the AV-3, GDSs were used in the AV-1 and AV-2 as the attitude sensor.

As is clear in the figure, the heading errors in (AV-2, PS-5, 0.55), (AV-2, PS-5, 0.6), (AV-2, PS-5, 0.64), (AV-2, PS-9, 0.8, Flat), and (AV-2, PS-9, 0.8, Slope) were 3.11°, 3.02°, 3.48°, 4.39°, and 3.15°, respectively (Fig. 30 (1)). These errors were quite high and were not suitable for autonomous applications. In standard experimental conditions, the heading error has always been < 2°. From AV-3 onward, three-axis FOGs began to be used, and this type of IMU changed over time to the MEMS type.

Some experiments indicated that high speed (Fig. 30 (3)), a rapid turn (Fig. 30 (4)), and using only a 2D laser scanner as the positioning sensor (Fig. 30 (2)) can cause large heading errors. Generally, the average heading errors before (using GDS) and after (using IMU) the threshold were 2° and 1.6°, respectively. Using an IMU thus has a positive effect on the heading error in navigation.

The next performance indicator of AVs is the lateral error (Fig. 31), which is directly related to the positioning sensor and also related to the platform, the transporter system, the tire slip angle, and the environmental conditions. The horizontal axis in Fig. 31 indicates the number of the autonomous vehicles, the positioning sensors, the travel speed, and the environmental specifications. The figure is divided into two periods using a threshold between AV-1 and AV-2. The threshold is the time point at which the positioning sensor changed to RTK-GPS. Before that, different positioning sensors (e.g., image sensor, LNAV, SNAV, XNAV, and DGPS) were used, and the results were not stable over all different conditions. The lateral error of the AV-1 was 23 cm, a large error due to the image sensor and stations.

In the first AV-2 projects, positioning sensors such as LNAV, SNAV, and XNAV were used, with poor results. When the positioning sensor was changed to RTK-GPS, the average lateral error decreased to 6.78 cm. The RTK-GPS, Topcon, Legacy-E positioning sensor had an average lateral error of 2.5 cm when attached to the AV-5 and AV-7. The RTK-GPS, Topcon, AGI-3 positioning sensor had an average lateral error of 2.61 cm when attached to AV-5, -7, -8, and -11.

The RTK-GPS, Trimble, SPS855 sensor had an average lateral error of 6 cm when attached to the AV-7 and AV-12. The RTK-GPS, Trimble, MS750 sensor had an average lateral error of 9.83 cm when attached to the AV-2, -3, -6, and -8. The DGPS, Hemisphere, V100 sensor had an average lateral error of 11 cm when attached to the AV-2 and the AV-9 (31 cm).

Fig. 30. Performance indicators of the AVs (heading error).



Fig. 31. Performance indicators of the AVs (lateral error).
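The two indicators reported in Figs. 30 and 31, heading error and lateral (cross-track) error, follow from elementary geometry between a navigation fix and the planned path segment. A minimal sketch of this standard formulation (names and angle conventions are illustrative, not taken from the reviewed systems):

```python
import math


def path_errors(x, y, heading_deg, ax, ay, bx, by):
    """Heading and lateral error of a vehicle at (x, y) with the given
    heading, relative to the planned straight segment A(ax, ay) -> B(bx, by).
    Returns (heading_error_deg, lateral_error_m); the lateral error is
    signed, positive to the left of the path direction."""
    path_heading = math.atan2(by - ay, bx - ax)
    # Heading error wrapped to (-180, 180] via atan2 of the angle difference
    diff = math.radians(heading_deg) - path_heading
    heading_err = math.degrees(math.atan2(math.sin(diff), math.cos(diff)))
    # Signed perpendicular distance from the path line (2D cross product)
    seg_len = math.hypot(bx - ax, by - ay)
    lateral_err = ((bx - ax) * (y - ay) - (by - ay) * (x - ax)) / seg_len
    return heading_err, lateral_err
```

Feeding RTK-GPS fixes and IMU headings through such a function per control cycle yields exactly the per-run error traces summarized in the two figures.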

Although it has not been possible to determine the lateral error by considering only the positioning sensors, the positioning sensors play the main role in attaining high position accuracy. The data in Fig. 31 also indicate that high speed (AV-6) and a wet environment such as a paddy field (AV-9) can dramatically increase the lateral error.

4. Lessons learned

Almost thirty years of study on agricultural autonomous vehicles in the Laboratory of Vehicle Robotics at Hokkaido University, Japan, has yielded valuable experience, the most important of which is outlined below (see videos at Roshanianfard (2020)):

1. To develop an autonomous vehicle (for various applications, including the agriculture industry) with minimum human intervention, the selected platform (such as a tractor, combine harvester, utility vehicle, or transplanter) must have an automatic transmission and an electrically controllable steering system. An automatic transmission helps to adjust the vehicle's maneuvering velocity by sending a command through the TECU. Speed control of platforms that have a manual transmission should be done with additional mechanical/hydraulic actuating systems (to change gears and control the clutch). Such additional actuators are not recommended, because their installation poses many challenges and their application introduces many errors.
2. The steering control is also an important characteristic that should be considered based on the national policies of each country. Many developed countries, such as Germany and Japan, have restricted the use of electronic steering on various vehicles for safety reasons. After the necessary standards and regulations were designed, this system gradually appeared on new designs and platforms such as the Yanmar YT5113 tractor. In some other countries, there may be similar policies that restrict the development of autonomous vehicles. Designers and researchers must carefully review the related standards and rules before starting their project to avoid wasting time and budget.
3. To control all the components of an AAV, there must be sufficient access to the ECU using USB, RS232c, or CAN-BUS. This is because controlling a platform through its ECU helps the automation process to control the components according to the factory's designed standards and applied APIs. This requires that the manufacturing company either allow access to the contents of the ECU or develop an external port to communicate with the ECU and provide the related controlling API. Otherwise, the designer should include a separate ECU for autonomous control. An external ECU may involve various challenges, such as (1) inconsistency of the controlling commands of the external ECU versus the main ECU, (2) a lack of equipment equipped with a CAN-BUS communication system, (3) multi-ECU control challenges when using a USB or RS232c communication system, and (4) an inability to stop the device when it gets out of control.
4. Each prototype AAV needs to be equipped with a computer that receives data from the sensors (such as RTK-GPS, IMU, and LIDAR) and sends the related commands to the actuators (such as steering and brake) using a related algorithm. It is strongly recommended to use solid-state drives (SSDs) to store data inside the computer so that the vibrations of the vehicle during maneuvers have less effect on the performance of the algorithm. This type of drive is more resistant to physical shock and can support fast communication with ECUs.
5. GNSS-based navigation systems have weak performance in covered areas such as tunnels or in environments with obstacles such as trees or buildings. It is important to consider this disadvantage in the designed algorithms. Many of the studies introduced in this review have developed different algorithms using sensor fusion, an LRF, or other methods to address it.
6. Although fuzzy logic, genetic algorithms, and artificial neural networks can help to handle the non-linear behavior of the system and support MIMO control, it is recommended to develop the controlling algorithms as simply as possible, disregarding the least important parameters. A simple algorithm runs fast, and its troubleshooting and coding are much easier. In this regard, proportional-integral-derivative (PID) and model predictive control (MPC) controllers are recommended.
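The preference for simple controlling algorithms noted above can be made concrete with a minimal PID loop mapping lateral error to a steering command. The class below is an illustrative sketch; the gains and the error-to-steering interface are assumptions, not values from the reviewed vehicles.

```python
class PID:
    """Minimal PID controller; here the input is a lateral error (m)
    and the output a saturated steering-angle command (deg)."""

    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit   # actuator saturation, symmetric
        self.integral = 0.0
        self.prev_err = None

    def step(self, err, dt):
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        u = self.kp * err + self.ki * self.integral + self.kd * deriv
        # Clamp to the steering actuator's physical range
        return max(-self.out_limit, min(self.out_limit, u))
```

For example, `PID(kp=8.0, ki=0.1, kd=2.0, out_limit=30.0)` would map a lateral error in meters to a bounded steering angle in degrees; the gains are placeholders to be tuned per platform.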
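The ECU access discussed in lesson 3 ultimately reduces to encoding command frames for the vehicle bus. The sketch below packs a hypothetical drive command; the identifier and payload layout are invented for illustration (production equipment would follow ISOBUS/J1939 message definitions instead).

```python
import struct

# Hypothetical 29-bit extended CAN identifier for a drive command frame.
CMD_ID = 0x18FF1001

def encode_drive_command(speed_mps, steer_deg, autonomous=True):
    """Pack a drive command into (can_id, 8-byte payload).

    Assumed payload layout (big-endian, illustrative only):
      uint16  target speed, 0.01 m/s resolution
      int16   steering angle, 0.1 deg resolution
      uint8   mode flag (1 = autonomous)
      3 bytes padding
    """
    payload = struct.pack(
        ">HhB3x",
        int(round(speed_mps * 100)),
        int(round(steer_deg * 10)),
        1 if autonomous else 0,
    )
    return CMD_ID, payload
```

With the python-can library, such a payload would then be transmitted as `can.Message(arbitration_id=CMD_ID, data=payload, is_extended_id=True)` on a configured bus.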

7. More detailed information about the characterization of the different components, including the hardware, environments, controlling algorithms, and performance indicators, is described in detail in Section 2. It is recommended to study this section before developing a new AV.

5. Conclusion

Different aspects of autonomous agricultural vehicles have been presented in detail: the variation in hardware (development, platform, transportation system, operation function, communication methods, sensors, and control units), the environment, control algorithms, and performance indicators. The development procedures and characteristics of each AV were compared and discussed. Case studies of autonomous vehicles designed at Hokkaido University and the development process were presented as practical cases and representative development challenges. The case studies also contribute to researchers' understanding of the aspects of future robotic systems and to the prevention of repeat failures and unnecessary challenges. It is hoped that this review will assist scientists who are working to develop robots for the benefit of humanity and a peaceful future.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

This study was supported by the Cross-ministerial Strategic Innovation Promotion Program (SIP), Tokyo, managed by the Cabinet Office of Japan.

References

AEF, 2019. ISOBUS.
Ampatzidis, Y.G., Vougioukas, S.G., Bochtis, D.D., Tsatsarelis, C.A., 2009. A yield mapping system for hand-harvested fruits based on RFID and GPS location technologies: field testing. Precis. Agric. 10, 63–72.
Asadollahpour, A., Omidinajafabadi, M., Jamalhosseini, S., 2014. Factors affecting the conversion to organic farming in Iran: a case study of Mazandaran rice producers. Sci. Int. (Lahore) 26, 1844–1860.
Bac, C.W., van Henten, E.J., Hemming, J., Edan, Y., 2014. Harvesting robots for high-value crops: state-of-the-art review and challenges ahead. J. Field Rob. 31, 888–911.
Bakker, T., van Asselt, K., Bontsema, J., Müller, J., van Straten, G., 2011. Autonomous navigation using a robot platform in a sugar beet field. Biosyst. Eng. 109, 357–368.
Barawid, J., Oscar, C., Mizushima, A., Ishii, K., Noguchi, N., 2007. Development of an autonomous navigation system using a two-dimensional laser scanner in an orchard application. Biosyst. Eng. 96, 139–149.
Barawid Jr., O.C., 2011. Development of electronic utility robot vehicle. Department of Environment Resources, Graduate School of Agriculture, Hokkaido University, p. 170.
Barawid Jr., O.C., Farrokhi Teimourlou, R., Noguchi, N., Ishii, K., 2008a. Automatic guidance system in real-time orchard application (Part 1), a novel research on coconut field application using laser scanner. J. Japanese Soc. Agric. Machinery 70, 76–84.
Barawid Jr., O.C., Ishii, K., Noguchi, N., 2008b. Calibration method for 2-dimensional laser scanner attached on a robot vehicle. In: Proceedings of the 17th World Congress, The International Federation of Automatic Control, Seoul, Korea.
Barawid Jr., O.C., Noguchi, N., 2010. Automatic guidance system in real-time orchard application (Part 2), development of low-cost and small scale electronic robot vehicle for orchard application. J. Japanese Soc. Agric. Machinery 72, 243–250.
Barawid, O.C., Noguchi, N., 2011. Automatic steering system for electronic robot vehicle. IFAC Proceedings Volumes 44, 2901–2906.
Belforte, G., Deboli, R., Gay, P., Piccarolo, P., Ricauda Aimonino, D., 2006. Robot design and testing for greenhouse applications. Biosyst. Eng. 95, 309–321.
Blanes, C., Mellado, M., Beltrán, P., 2016. Tactile sensing with accelerometers in prehensile grippers for robots. Mechatronics 33, 1–12.
Bochtis, D.D., Sørensen, C.G.C., Busato, P., 2014. Advances in agricultural machinery management: a review. Biosyst. Eng. 126, 69–81.
Boston Dynamics, 2019. The mobile robot designed for sensing, inspection, and remote operation (Spot).
Charania, I., Li, X., 2020. Smart farming: agriculture's shift from a labor intensive to technology native industry. Internet of Things 9, 100142.
Choi, J., Yin, X., Noguchi, N., 2013. Development of a laser scanner-based navigation system for a combine harvester. IFAC Proceedings Volumes 46, 103–108.
Choi, J., Yin, X., Yang, L., Noguchi, N., 2014. Development of a laser scanner-based navigation system for a combine harvester. Eng. Agric. Environ. Food 7, 7–13.
De-An, Z., Jidong, L., Wei, J., Ying, Z., Yu, C., 2011. Design and control of an apple harvesting robot. Biosyst. Eng. 110, 112–122.
Ding, Y., Wang, L., Li, Y., Li, D., 2018. Model predictive control and its application in agriculture: a review. Comput. Electron. Agric. 151, 104–117.
Dong, F., Heinemann, W., Kasper, R., 2011. Development of a row guidance system for an autonomous robot for white asparagus harvesting. Comput. Electron. Agric. 79, 216–225.
Eizicovits, D., van Tuijl, B., Berman, S., Edan, Y., 2016. Integration of perception capabilities in gripper design using graspability maps. Biosyst. Eng. 146, 98–113.
Ellen, L., William, C., Erica, S., Keith, F., 2013. Global Agricultural Productivity report.
Faizollahzadeh Ardabili, S., Mahmoudi, A., Mesri Gundoshmian, T., Roshanianfard, A., 2016. Modeling and comparison of fuzzy and on/off controller in a mushroom growing hall. Measurement 90, 127–134.
Fan, Z., 2015. Development of a Real-time Quality Sensor for Wheat on a Combine Harvester. Department of Environment Resources, Graduate School of Agriculture, Hokkaido University.
Hayashi, S., Shigematsu, K., Yamamoto, S., Kobayashi, K., Kohno, Y., Kamata, J., Kurita, M., 2010. Evaluation of a strawberry-harvesting robot in a field test. Biosyst. Eng. 105, 160–171.
Ishii, K., 1997. Study on Control Method of Agricultural Autonomous Mobile Robot. Department of Environment Resources, Graduate School of Agriculture, Hokkaido University, p. 146.
Ishii, K., Terao, H., Noguchi, N., 1994. Studies on self-learning autonomous vehicles (Part 1), following control using neuro-controller. J. Japanese Soc. Agric. Machinery 56, 53–60.
Ishii, K., Terao, H., Noguchi, N., 1995. Studies on self-learning autonomous vehicles (Part 2), verification of neuro-controller by model vehicle. J. Japanese Soc. Agric. Machinery 57, 61–67.
Ishii, K., Terao, H., Noguchi, N., 1998a. Studies on self-learning autonomous vehicles (Part 3), positioning system for autonomous vehicle. J. Japanese Soc. Agric. Machinery 60, 51–58.
Ishii, K., Terao, H., Noguchi, N., Kise, M., 1998b. Studies on self-learning autonomous vehicles (Part 4), online neuro-controller. J. Japanese Soc. Agric. Machinery 60, 53–58.
Jongmin, C., 2014. Development of Guidance System Using Local Sensors for Agricultural Vehicles. Department of Environment Resources, Graduate School of Agriculture, Hokkaido University.
Kamata, T., Roshanianfard, A., Noguchi, N., 2018. Heavy-weight crop harvesting robot - controlling algorithm. IFAC-PapersOnLine 51, 244–249.
Kawahito, H., 2016. Study on automatic steering of rice transplanter by detection of traveling marker using image processing (in Japanese). Agriculture Department, Hokkaido University.
Kim, Y.Y., Hwang, H., Cho, S.I., 2008. A hybrid robotic system for harvesting heavy produce. Eng. Agric. Environ. Food 1, 18–23.
Kise, M., Noguchi, N., Ishii, K., Terao, H., 2001. Development of the agricultural autonomous tractor with an RTK-GPS and a FOG. IFAC Proceedings Volumes 34, 99–104.
Kise, M., Noguchi, N., Ishii, K., Terao, H., 2002a. The development of the autonomous tractor with steering controller applied by optimal control. In: Automation Technology for Off-Road Equipment, Proceedings of the July 26-27, 2002 Conference (Chicago, Illinois, USA). ASABE, St. Joseph, MI, pp. 367–373.
Kise, M., Noguchi, N., Ishii, K., Terao, H., 2002b. Enhancement of turning accuracy by path planning for robot tractor. In: Automation Technology for Off-Road Equipment, Proceedings of the July 26-27, 2002 Conference (Chicago, Illinois, USA). ASABE, St. Joseph, MI, pp. 398–404.
Kohno, Y., Kondo, N., Iida, M., Kurita, M., Shiigi, T., Ogawa, Y., Kaichi, T., Okamoto, S., 2011. Development of a mobile grading machine for citrus fruit. Eng. Agric. Environ. Food 4, 7–11.
Kondo, N., Monta, M., Fujiura, T., 1996. Fruit harvesting robots in Japan. Adv. Space Res. 18, 181–184.
Kondo, N., Monto, M., Noguchi, N., 2011. Agricultural Robots.
Kondo, N., Yamamoto, K., Shimizu, H., Yata, K., Kurita, M., Shiigi, T., Monta, M., Nishizu, T., 2009. A machine vision system for tomato cluster harvesting robot. Eng. Agric. Environ. Food 2, 60–65.
Kubota Co., 2018. Kubota Tractor Corporation.
Kurashiki, K., Fukao, T., Ishiyama, K., Kamiya, T., Murakami, N., 2010a. Orchard traveling UGV using particle filter based localization and inverse optimal control. In: 2010 IEEE/SICE International Symposium on System Integration, pp. 31–36.
Kurashiki, K., Fukao, T., Nagata, J., Ishiyama, K., Kamiya, T., Murakami, N., 2010b. Laser-based vehicle control in orchard. IFAC Proceedings Volumes 43, 127–132.
Kurita, H., Iida, M., Suguri, M., Masuda, R., 2012. Application of image processing technology for unloading automation of robotic head-feeding combine harvester. Eng. Agric. Environ. Food 5, 146–151.

Lezoche, M., Hernandez, J.E., Alemany Díaz, M.d.M.E., Panetto, H., Kacprzyk, J., 2020. Agri-food 4.0: A survey of the supply chains and technologies for the future agriculture. Comput. Industry 117, 103187.
Li, M., Imou, K., Wakabayashi, K., Yokoyama, S., 2009. Review of research on agricultural vehicle autonomous guidance. Int. J. Agric. Biol. Eng. 2, 1–26.
Liu, Y., Noguchi, N., 2016. Development of an unmanned surface vehicle for autonomous navigation in a paddy field. Eng. Agric. Environ. Food 9, 21–26.
Liu, Y., Noguchi, N., Ishii, K., 2014. Development of a low-cost IMU by using sensor fusion for attitude angle estimation. IFAC Proceedings Volumes 47, 4435–4440.
Liu, Y., Noguchi, N., Ishii, K., Liang, L., 2016. Wind direction-based path planning for an agricultural unmanned airboat navigation. In: 2016 IEEE/SICE International Symposium on System Integration (SII), pp. 906–911.
Liu, Y., Noguchi, N., Roshanianfard, A., 2017. Simulation and test of an agricultural unmanned airboat maneuverability model. Int. J. Agric. Biol. Eng. 10.
Michio, K., Noboru, N., Kazunobu, I., Hideo, T., 2001. Field mobile robot navigated by RTK-GPS and FOG (Part 2) - autonomous operation by applying navigation map. J. Agric. Mech. Eng. 63, 80–85.
Mizushima, A., Ishii, K., Noguchi, N., Matsuo, Y., Lu, R., 2011. Development of a low-cost attitude sensor for agricultural vehicles. Comput. Electron. Agric. 76, 198–204.
Mizushima, A., Noguchi, N., Ishii, K., 2005. Development of Robot Tractor Using the Low-Cost GPS/INS System. ASAE, St. Joseph, MI.
Mizushima, A., Noguchi, N., Ishii, K., Terao, H., 2002. Automatic navigation of the agricultural vehicle by the geomagnetic direction sensor and gyroscope. In: Automation Technology for Off-Road Equipment, Proceedings of the July 26-27, 2002 Conference (Chicago, Illinois, USA). ASABE, St. Joseph, MI, pp. 204–211.
Mizushima, A., Noguchi, N., Ishii, K., Terao, H., Yukumoto, O., Yamamoto, S., 2000. Automatic guidance system composed of geomagnetic direction sensor and fiber optic gyroscope. IFAC Proceedings Volumes 33, 313–317.
Monte, A.D., Noboru, N., Qin, Z., John, F.R., Jeffrey, D.W., 2000. Sensor-fusion navigator for automated guidance of off-road vehicles. In: patent, U. (Ed.), USA.
Moshou, D., Bravo, C., Oberti, R., West, J.S., Ramon, H., Vougioukas, S., Bochtis, D., 2011. Intelligent multi-sensor system for the detection and treatment of fungal diseases in arable crops. Biosyst. Eng. 108, 311–321.
Mousazadeh, H., 2013. A technical review on navigation systems of agricultural autonomous off-road vehicles. J. Terramech. 50, 211–232.
Narvaez, F.Y., Reina, G., Torres-Torriti, M., Kantor, G., Cheein, F.A., 2017. A survey of ranging and imaging techniques for precision agriculture phenotyping. IEEE/ASME Trans. Mechatron. 22, 2428–2439.
Noguchi, N., 2015. Current status and future prospects for vehicle robotics on agriculture. J. Japan Soc. Precision Eng. 81, 22–25.
Noguchi, N., Barawid Jr., O.C., 2011. Robot farming system using multiple robot tractors in Japan agriculture. IFAC Proceedings Volumes 44, 633–637.
Noguchi, N., Reid, F.J., Zhang, Q., Will, D.J., Ishii, K., 2001a. Development of Robot Tractor Based on RTK-GPS and Gyroscope. In: 2001 ASAE Annual Meeting. ASAE, St. Joseph, MI.
Noguchi, N., Ishii, K., Terao, H., 1992. A Study on Intelligent Industrial Vehicle by using Neural Network (Modeling Vehicle Movement by Neural Network). In: Proceedings of the 2nd Intelligent Systems Symposium, Japan, pp. 367–372.
Noguchi, N., Ishii, K., Terao, H., 1996. Development of an agricultural mobile robot using a geomagnetic direction sensor and image sensors. J. Agric. Eng. Res. 67, 1–15.
Noguchi, N., Kise, M., Ishii, K., Terao, H., 2002. Field automation using robot tractor. In: Automation Technology for Off-Road Equipment, Proceedings of the July 26-27, 2002 Conference (Chicago, Illinois, USA). ASABE, St. Joseph, MI, pp. 239–245.
Noguchi, N., Reid, J., Zhang, Q., Will, D.J., 2001b. Turning function for robot tractor based on spline function. In: 2001 ASAE Annual Meeting. ASAE, St. Joseph, MI.
Noguchi, N., Reid, J.F., Zhang, Q., Tian, L., Hasan, A.C., 1999. Vision intelligence for mobile agro-robotic system. J. Robot. Mechatron. 11.
Noguchi, N., Terao, H., 1997. Path planning of an agricultural mobile robot by neural network and genetic algorithm. Comput. Electron. Agric. 18, 187–204.
Noguchi, N., Will, J., Reid, J., Zhang, Q., 2004. Development of a master–slave robot system for farm operations. Comput. Electron. Agric. 44, 1–19.
Okada, A., 2017. Study on automatic steering control of rice transplanter by satellite positioning system (in Japanese). Hokkaido University.
Ospina, R., Noguchi, N., 2016. Determination of tire dynamic properties: application to an agricultural vehicle. Eng. Agric. Environ. Food 9, 123–130.
Ospina, R., Noguchi, N., 2018. Alternative method to model an agricultural vehicle's tire parameters. Eng. Agric. Environ. Food 11, 9–18.
Pettersson, A., Davis, S., Gray, J.O., Dodd, T.J., Ohlsson, T., 2010. Design of a magnetorheological robot gripper for handling of delicate food products with varying shapes. J. Food Eng. 98, 332–338.
Pettersson, A., Ohlsson, T., Davis, S., Gray, J.O., Dodd, T.J., 2011. A hygienically designed force gripper for flexible handling of variable and easily damaged natural food products. Innovative Food Sci. Emerg. Technol. 12, 344–351.
Pinto, F.A.C., Reid, J.F., Zhang, Q., Noguchi, N., 2000. Vehicle guidance parameter determination from crop row images using principal component analysis. J. Agric. Eng. Res. 75, 257–264.
Prado, Á.J., Michałek, M.M., Cheein, F.A., 2018. Machine-learning based approaches for self-tuning trajectory tracking controllers under terrain changes in repetitive tasks. Eng. Appl. Artif. Intell. 67, 63–80.
Qiao, J., Sasao, A., Shibusawa, S., Kondo, N., Morimoto, E., 2005. Mapping yield and quality using the mobile fruit grading robot. Biosyst. Eng. 90, 135–142.
Rahman, M., 2018. Studies on tracked dynamic model and optimum harvesting area for path planning of robot combine harvester. Department of Environment Resources, Graduate School of Agriculture, Hokkaido University.
Rahman, M., Ishii, K., Noguchi, N., 2017. Study on tracked combine harvester dynamic model for automated navigation purposes. Adv. Robot. Autom. 6.
Rahnemoonfar, M., Sheppard, C., 2017. Deep count: fruit counting based on deep simulated learning. Sensors 17, 905.
Rajendra, P., Kondo, N., Ninomiya, K., Kamata, J., Kurita, M., Shiigi, T., Hayashi, S., Yoshida, H., Kohno, Y., 2009. Machine vision algorithm for robots to harvest strawberries in tabletop culture greenhouses. Eng. Agric. Environ. Food 2, 24–30.
Ren, G., Lin, T., Ying, Y., Chowdhary, G., Ting, K.C., 2020. Agricultural robotics research applicable to poultry production: a review. Comput. Electron. Agric. 169, 105216.
Roshanianfard, A., 2018a. Development of a harvesting robot for heavy-weight crop. Department of Environment Resources, Graduate School of Agriculture, Hokkaido University, p. 236.
Roshanianfard, A., Noguchi, N., 2020. Pumpkin harvesting robotic end-effector. Comput. Electron. Agric. 174, 105503.
Roshanianfard, A., 2020. YouTube Channel named "Agricultural Robots", Laboratory of Vehicle Robotics at Hokkaido University - Japan.
Roshanianfard, A., Kamata, T., Noguchi, N., 2018. Performance evaluation of harvesting robot for heavy-weight crops. IFAC-PapersOnLine 51, 332–338.
Roshanianfard, A., Noguchi, N., 2016. Development of a 5DOF robotic arm (RAVebots-1) applied to heavy products harvesting. IFAC-PapersOnLine 49, 155–160.
Roshanianfard, A., Noguchi, N., 2017a. Designing of pumpkin harvester robotic end-effector. In: 2017 The 3rd International Conference on Control, Automation and Robotics (ICCAR 2017). IEEE, Nagoya, Japan.
Roshanianfard, A., Noguchi, N., 2017b. Development of a heavyweight crop robotic harvesting system (HCRH). In: 2017 The 3rd International Conference on Control, Automation and Robotics. IEEE.
Roshanianfard, A., Noguchi, N., 2017c. Development of robotic harvesting system for heavy-weight crops. In: 11th Seminar of ASIJ - Academic Society of Iranians in Japan, Tokyo, Japan.
Roshanianfard, A., Noguchi, N., 2018a. Characterization of pumpkin for a harvesting robot. IFAC-PapersOnLine 51, 23–30.
Roshanianfard, A., Noguchi, N., 2018b. Kinematics analysis and simulation of a 5DOF articulated robotic arm applied to heavy products harvesting. Tarim Bilimleri Dergisi-J. Agric. Sci. 24, 91–104.
Roshanianfard, A., Noguchi, N., Kamata, T., 2019. Design and performance of a robotic arm for farm use. Int. J. Agric. Biol. Eng. (IJABE) 12, 146–158.
Saito, M., Tamaki, K., Nishiwaki, K., Nagasaka, Y., Motobayashi, K., 2013. Development of Robot Combine Harvester for Beans using CAN Bus Network. IFAC Proceedings Volumes 46, 148–153.
Siegwart, R., Nourbakhsh, I.R., Scaramuzza, D., 2011. Autonomous Mobile Robots. The MIT Press, p. 335.
Soter, G., Conn, A., Hauser, H., Rossiter, J., 2018. Bodily aware soft robots: integration of proprioceptive and exteroceptive sensors. In: 2018 IEEE International Conference on Robotics and Automation (ICRA), pp. 2448–2453.
Statistics Bureau of Japan, 2017. Statistical Handbook of Japan. Statistics Bureau, Ministry of Internal Affairs and Communications, Japan.
Takai, R., Barawid, O., Ishii, K., Noguchi, N., 2010. Development of Crawler-Type Robot Tractor based on GPS and IMU. IFAC Proceedings Volumes 43, 151–156.
Takai, R., Barawid, O., Noguchi, N., 2011. Autonomous Navigation System of Crawler-Type Robot Tractor. IFAC Proceedings Volumes 44, 14165–14169.
Takai, R., Yang, L., Noguchi, N., 2014. Development of a crawler-type robot tractor using RTK-GPS and IMU. Eng. Agric. Environ. Food 7, 143–147.
Tamaki, K., Nagasaka, Y., Nishiwaki, K., Saito, M., Kikuchi, Y., Motobayashi, K., 2013. A Robot System for Paddy Field Farming in Japan. IFAC Proceedings Volumes 46, 143–147.
Tanigaki, K., Fujiura, T., Akase, A., Imagawa, J., 2008. Cherry-harvesting robot. Comput. Electron. Agric. 63, 65–72.
Tanner, H.G., Kyriakopoulos, K.J., Krikelis, N.I., 2001. Advanced agricultural robots: kinematics and dynamics of multiple mobile manipulators handling non-rigid material. Comput. Electron. Agric. 31, 91–105.
Teng, Z., Noguchi, N., Liangliang, Y., Ishii, K., Jun, C., 2016. Development of uncut crop edge detection system based on laser rangefinder for combine harvesters. Int. J. Agric. Biol. Eng. 9, 21–28.
Thanpattranon, P., Ahamed, T., Takigawa, T., 2016. Navigation of autonomous tractor for orchards and plantations using a laser range finder: automatic control of trailer position with tractor. Biosyst. Eng. 147, 90–103.
Toyama, S., Yamamoto, G., 2009a. Development of Wearable-Agri-Robot mechanism for agricultural work. In: 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 5801–5806.
Toyama, S., Yamamoto, G., 2009b. Development of Wearable-Agri-Robot mechanism for agricultural work. In: 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 5801–5806.
Tsubota, R., Noguchi, N., Mizushima, A., 2004. Automatic guidance with a laser scanner for a robot tractor in an orchard. In: Automation Technology for Off-Road Equipment, Proceedings of the 7–8 October 2004 Conference (Kyoto, Japan). ASABE, St. Joseph, MI.
USDA, 2019. Farm Demographics - U.S. Farmers by Gender, Age, Race, Ethnicity, and More. United States Department of Agriculture, National Agricultural Statistics Service.
