
SENSOR ASSISTED MOTION

PLANNING ALGORITHM DESIGN


FOR AUTONOMOUS TRACTORS
M.Sc. Dissertation

Raşit TAŞKIN

Eskişehir 2019
SENSOR ASSISTED MOTION PLANNING ALGORITHM
DESIGN FOR AUTONOMOUS TRACTORS

Raşit TAŞKIN

M.Sc. DISSERTATION

Electrical and Electronics Engineering Program


Supervisor: Assoc. Prof. Dr. Nuray AT

Eskişehir
Anadolu University
Graduate School of Sciences

November 2019
FINAL APPROVAL FOR THESIS

This thesis titled "SENSOR ASSISTED MOTION PLANNING ALGORITHM DESIGN FOR AUTONOMOUS TRACTORS" has been prepared and submitted by Raşit TAŞKIN in partial fulfillment of the requirements of the "Anadolu University Directive on Graduate Education and Examination" for the Degree of Master of Science in the Electrical and Electronics Engineering Department, and has been examined and approved on 19/11/2019.

Committee Members Signature

Member (Supervisor) : Assoc. Prof. Dr. Nuray AT ………….

Member : Assoc. Prof. Dr. Hanife APAYDIN ÖZKAN ………….
Member : Asst. Prof. Dr. Mehmet KOÇ …………..

Prof. Dr. Murat TANIŞLI


Director of Institute of Graduate Programs
ABSTRACT

M.Sc. Dissertation
SENSOR ASSISTED MOTION PLANNING ALGORITHM
DESIGN FOR AUTONOMOUS TRACTORS

Raşit TAŞKIN

Anadolu University
Graduate School of Sciences
Electrical and Electronics Engineering Program
November 2019

Supervisor: Assoc. Prof. Dr. Nuray AT

In this thesis, a sensor-supported tractor system is designed that can determine its position and heading and can operate autonomously along a planned field path produced by route planning software. First, the characteristics of an autonomous tractor and of the field to be processed are determined; then the route pattern to be used in processing the field is calculated, and the route planning software is designed for the area bounded by the given corners.

For autonomous tractors, the "back and forth" and the "spiral" field covering techniques are tested. A test vehicle is designed and a navigation program is created for autonomous movement along the route determined by the route planning software. As a result of the tests, it is observed that the vehicle covers the area within the given corner coordinates autonomously, and the resulting graphs are presented.

Keywords: Autonomous Tractor, GPS, Electronic Compass, Positioning, Direction Finding, Field Path Planning, Back and Forth Algorithm, Spiral Path Planning Algorithm

ÖZET

Yüksek Lisans Tezi


OTONOM TRAKTÖRLER İÇİN SENSÖR DESTEKLİ
HAREKET PLANLAMA ALGORİTMASI TASARIMI

Raşit TAŞKIN

Anadolu Üniversitesi
Fen Bilimleri Enstitüsü
Elektrik-Elektronik Mühendisliği Ana Bilim Dalı
Kasım 2019

Danışman: Doç. Dr. Nuray AT

Bu tezde, sensör destekli, konum ve yön belirleyebilen ve hazırlanan rota planlama


yazılımı ile planlı tarla rotasında otonom olarak çalışabilen bir traktör sistemi
tasarlanmıştır. Öncelikle otonom bir traktörün ve işlenecek tarlanın özellikleri
belirlenmiş, sonrasında bir tarlanın işlenmesinde kullanılacak rota kalıbı belirlenerek
verilen köşelerin sınırladığı alanda rota planlama yazılımı tasarlanmıştır.

Otonom traktörler için “ileri ve geri” tarla sürme tekniği ile “spiral” yol teknikleri
test edilmesi amacıyla rota planlama yazılımı ile belirlenen rota üzerinde otonom olarak
hareket etmesi için bir test aracı tasarlanarak navigasyon programı oluşturulmuştur.
Yapılan testler sonucunda aracın verilen köşe koordinatlarındaki alanı otonom olarak
taradığı gözlemlenerek sonuç grafikleri sunulmuştur.

Anahtar Kelimeler: Otonom Traktörler, GPS, Elektronik Pusula, Konumlandırma, Yön Bulma, Tarla Rotası Planlama, İleri ve Geri Rota Planlama Algoritması, Spiral Rota Planlama Algoritması

STATEMENT OF COMPLIANCE WITH ETHICAL PRINCIPLES AND RULES

I hereby truthfully declare that this thesis is an original work prepared by me; that
I have behaved in accordance with the scientific ethical principles and rules throughout
the stages of preparation, data collection, analysis and presentation of my work; that I
have cited the sources of all the data and information that could be obtained within the
scope of this study, and included these sources in the references section; and that this
study has been scanned for plagiarism with “scientific plagiarism detection program”
used by Anadolu University, and that “it does not have any plagiarism” whatsoever. I also
declare that, if a case contrary to my declaration is detected in my work at any time, I
hereby express my consent to all the ethical and legal consequences that are involved.

Raşit TAŞKIN

………………

TABLE OF CONTENTS

Page
TITLE PAGE ................................................................................................................... i
FINAL APPROVAL FOR THESIS .............................................................................. ii
ABSTRACT .................................................................................................................... iii
ÖZET .............................................................................................................................. iv
STATEMENT OF COMPLIANCE WITH ETHICAL PRINCIPLES AND
RULES ............................................................................................................................. v
LIST OF FIGURES ..................................................................................................... viii
LIST OF TABLES .......................................................................................................... x
ABBREVIATIONS ........................................................................................................ xi
1. INTRODUCTION ................................................................................................... 1
1.1 Overview and Motivation ................................................................................ 1

1.2 Thesis Structure ............................................................................................... 6

2. PATH PLANNING .................................................................................................. 7


2.1 Complete Coverage Path Planning ................................................................. 7

2.2 Path Planning Techniques ............................................................................... 8

2.3 Field Characters ............................................................................................... 9

2.4 Test Area ......................................................................................................... 10

3. AUTONOMOUS DRIVING SYSTEM ............................................................... 11


3.1 Driving Speed ................................................................................................. 11

3.2 Sensors and Data Collection .......................................................................... 11

3.2.1 Initialization of the mission data ......................................... 11

3.2.2 Sensors and directional control ........................................................... 12

3.3 Navigational Calculations .............................................................................. 17

3.3.1 Heading calculation .............................................................................. 19

3.3.2 Target bearing calculation ................................................................... 19

3.3.3 Relative bearing calculation ................................................................ 21

3.3.4 Distance calculation .............................................................................. 21


3.4 Controllable Outputs ..................................................................................... 22

4. NAVIGATION PATH PLANNING .................................................................... 24


4.1 Calculating The First Point ........................................................................... 25

4.2 Indexing The Navigation Points for the Spiral Method ............................. 27

4.3 Determining and Indexing The Navigation Points for the Back and Forth
Method ............................................................................................................ 30

4.4 Data Formatting ............................................................................................. 31

5. TEST PLATFORM DESIGN ............................................................................... 33


5.1 System Architecture ....................................................................................... 34

5.2 Model Design .................................................................................................. 35

5.2.1 System components............................................................................... 36

5.2.2 Wiring design ........................................................................................ 39

5.3 Mission Planning ............................................................................................ 40

5.4 Initialization .................................................................................................... 41

5.5 System Run ..................................................................................................... 42

5.6 Data Logger .................................................................................................... 42

6. TEST AND RESULTS .......................................................................................... 44


6.1 Test Environment ........................................................................................... 44

6.2 Test Outputs ................................................................................................... 45

6.3 Results ............................................................................................................. 48

7. CONCLUSION ...................................................................................................... 52
REFERENCES.............................................................................................................. 53

LIST OF FIGURES
Page
Figure 1.1. Bear Flag Robotics Autonomous Tractor ..................................................... 3
Figure 1.2. Autonomous Tractor Corporation’s tractor .................................................. 4
Figure 1.3. CASE-IH Autonomous Tractor . ................................................................... 4
Figure 1.4. Automation levels definition . ....................................................................... 5
Figure 2.1 The waste areas of 180 degree turning . ......................................................... 9
Figure 2.2. Field covering methods Boustrophedon Motion (Left), Spiral Motion
(Right) ............................................................................................................................. 10
Figure 3.1. GPS satellites orbits . ................................................................................... 13
Figure 3.2. GPS satellites system . ................................................................................. 14
Figure 3.3. World’s magnetic fields . ............................................................................ 14
Figure 3.4. Turkey’s latitudes and longitudes map . ...................................................... 15
Figure 3.5. True and relative bearings . ......................................................................... 17
Figure 3.6. Vehicle heading and bearing . ..................................................................... 18
Figure 3.7. Navigational variables of the autonomous tractor system. .......................... 18
Figure 3.8. Navigational calculations. ........................................................................... 20
Figure 4.1. Example field coordinates ........................................................................... 24
Figure 4.2. Example field image .................................................................................... 25
Figure 4.3. Navigation point calculation........................................................................ 26
Figure 4.4. The calculated navigation points ................................................................. 27
Figure 4.5. A four corner field graph ............................................................................. 28
Figure 4.6. A five corner field graph ............................................................................. 29
Figure 4.7. Back and Forth path..................................................................................... 31
Figure 4.8. Path planning algorithm output path image................................................. 31
Figure 5.1. Autonomous vehicle test platform............................................................... 33
Figure 5.2. The basic system architecture for the autonomous driving. ........................ 34
Figure 5.3. Flowchart of the autonomous driving.......................................................... 35
Figure 5.4. DC reduced motors and wheels ................................................................... 36
Figure 5.5. DC motor driver .......................................................................................... 37
Figure 5.6. Servo motors ................................................................................................ 37
Figure 5.7. GPS sensor................................................................................................... 37
Figure 5.8. Magnetometer sensor................................................................................... 38

Figure 5.9. Arduino UNO microcontroller .................................................................... 38
Figure 5.10. Power supply modules ............................................................................... 39
Figure 5.11. SD card module and microSD card ........................................................... 39
Figure 5.12. Wiring system architecture of the vehicle ................................................. 40
Figure 6.1. The bird-eye image of the test field ............................................................. 44
Figure 6.2. Back and forth navigation path and driven path .......................................... 45
Figure 6.3. Back and forth navigation path navigational error. ..................................... 46
Figure 6.4. Spiral navigation path and driven path ........................................................ 46
Figure 6.5. Spiral navigation path navigational error .................................................... 47
Figure 6.6. Spiral navigation path for half sized distance points and driven path. ........ 47
Figure 6.7. Spiral navigation path for half sized distance points navigational error. .... 48
Figure 6.8. Error graphs for back and forth path, spiral path with long distance points
and spiral path with half distance points. ........................................................................ 48
Figure 6.9. Turn radius of a standard tractor . ............................................................... 50
Figure 6.10. Back and forth path turning area. .............................................................. 50
Figure 6.11. Waste areas on spiral path (left), on back and forth path (right). .............. 51
Figure 6.12. Total waste areas for the covering techniques. .......................................... 51

LIST OF TABLES
Page
Table 3.1. NMEA Recommended Minimum Sentence Parsing .................................... 13
Table 4.1. Path planning algorithm output navigational points’ coordinates ................ 32
Table 5.1. Six digit formatted target data table. ............................................................. 41
Table 5.2. Sample data log table after the mission ........................................................ 43
Table 6.1. Comparison from test outputs ....................................................................... 49

ABBREVIATIONS

GNSS : Global Navigation Satellite Systems


GPS : Global Positioning System
lat : Latitude
lon : Longitude
mG : Milligauss
RMC : Recommended Minimum Sentence C
NMEA : National Marine Electronics Association
rpm : Revolutions Per Minute
rms : Root Mean Square

1. INTRODUCTION
1.1 Overview and Motivation
In today's world, autonomous vehicles are receiving great attention both from academia and from industry. A human driver's sensing abilities and the dynamic variables of the outside world are the major elements of driving skill. Human senses, however, are quite limited and can sometimes be too slow to take immediate action. Electronic sensors and micro-electromechanical systems (MEMS), on the other hand, can sense a wide spectrum of environmental changes and can be very fast compared to human senses.
Unmanned autonomous car navigation is not a new idea. So far, many systems based on image processing have been implemented, but the main limitation on the way to full autonomy has been the availability of precise position data. Today, precise positioning systems make it possible to improve autonomous navigation systems and allow them to grow in new directions [1].
According to the Food and Agriculture Organization of the United Nations, the population of the world is estimated to reach 9.1 billion by the year 2050, and the amount of food needed to feed a population of this size is estimated to be about twice the current amount. In the future, the need for high productivity and precision in agricultural production will emerge, and this can only be achieved through the autonomization of agriculture. Autonomous devices are not affected by night conditions, and they can work longer hours than manual operators [2].
Labor expenditure in agriculture also creates economic pressure in the sector [3].
In an agricultural country such as China, the government follows a policy of supporting
agriculture with technology and science in order to prevent unstable agricultural growth
due to the aging of the people who can participate in the labor force [4].
In 2015, the average age of 2.097 million people engaged in agriculture in Japan
was found to be 66.7, and this number is predicted to decrease by 4.2% over the next five
years [5]. Therefore, the development of autonomous driving systems is also supported
in Japan.
In addition to the reasons listed above, the younger generation does not choose agriculture as a career; therefore, agricultural employment is gradually decreasing. The idea of developing autonomous systems gains importance in overcoming this shortage [6].
Another advantage of autonomous driving is that it allows the operator to monitor and control other tasks. Autonomous agriculture is a more marketable sector than military autonomous systems, and it is a large sector that also contributes to economic development. At the same time, the design of systems with high precision and accuracy as well as low cost is of great importance [7].
Autonomous vehicle research is not only about carrying humans and payloads; a wide range of agricultural studies is also based on agricultural driving. Agricultural fields are usually large areas to be driven across, and agricultural driving patterns are repetitive paths. For these reasons, human drivers may have trouble paying attention to the work for hours. Visibility problems, such as poorly illuminated areas, can also make driving harder for a human driver. The autonomous tractor concept can overcome many of the negative effects of human driving. Therefore, autonomous tractor driving systems are being developed in many countries, and they will occupy an increasingly large place in the agricultural economy.
In Turkey, the agricultural machinery production industry has grown in recent years. 39.2% of the producers work in cooperation with a university [8]. Precision fertilization, irrigation and spraying to increase productivity, remote sensing methods such as GPS, and remote control methods are becoming the main research areas for the near future [8].
Around the world, studies on autonomous agricultural machinery have been carried out by engineers to reduce the required labor force. Autonomous agricultural systems can perform various agricultural tasks. The most well-known agricultural machines are tractors; tractor driving aids are already being used by producers, who have started to think about and design autonomous driving. There are mainly two research areas in agricultural robotics: ground-based sensing and satellite-based systems.
One of the ground-based navigation aids is the oldest one, the Earth's magnetic field. The magnetic pole near the geographic North Pole produces a force strong enough to deflect a tiny magnet, so even a primitive device can detect the direction of magnetization and point toward magnetic north. In electronic technology, there are sensors that measure the magnetic field along three axes. The other navigational sensing method is the satellite-based approach, which depends mainly on global navigation satellite systems. Satellite navigation aids are the leading navigation technique today.
Farming can be a dangerous industry to work in due to inherent risks of working with
large equipment and other environmental factors. Satellite navigation technology will
also enhance the level of safety in farming [9].
Autonomous agricultural projects have been developed by large agricultural machine manufacturers. One of them is Bear Flag Robotics. Their system can command a fleet of tractors from one remote location; it can plan routes, schedule jobs, and receive real-time equipment alerts remotely. It supports custom routes with Record and Replay™ path capability, and it can optimize route planning and implement control that minimizes overlap and inputs. Their tractor operates in broad-acre fields, row crops, orchards, and vineyards with a clear view of the sky. These tractors have two modes, manual and autonomous, and they have a remote E-stop and obstacle avoidance [10]. Figure 1.1 shows the autonomous tractor of Bear Flag Robotics.

Figure 1.1. Bear Flag Robotics Autonomous Tractor [10]

Another example of an autonomous tractor manufacturer is Autonomous Tractor Corporation. They claim that their electric autonomous tractor reduces equipment costs through autonomous driving [11]. Figure 1.2 shows their system.

Figure 1.2. Autonomous Tractor Corporation’s tractor [11]

Another producer is CASE IH, which has designed a concept autonomous tractor. As seen in Figure 1.3, the company produced a tractor that has no seat or steering mechanism. The tractor has an antenna system to sense field characteristics and to communicate with external control processes. Also, because of heavy-load design requirements, strong double wheels are used at each wheel position.

Figure 1.3. CASE-IH Autonomous Tractor [12].

The Australian government studied the six largest manufacturers (John Deere, Case New Holland, AGCO, CLAAS, Deutz-Fahr and Kubota) for their autonomous farming designs [13].
CASE IH defined automation in terms of five main levels: guidance, coordination and optimization, operator-assisted autonomy, supervised autonomy and full autonomy. Figure 1.4 shows the automation levels defined by CASE IH [14].

Figure 1.4. Automation levels definition [14].

Guidance is the first automation level: the manned tractor gets help from sensors and route-planner screens.
Coordination and optimization is the coordination of two different agricultural machines that need to work together; the sensors and screens help the drivers decide where to move simultaneously.
Operator-assisted autonomy provides restricted autonomy, but it is unsafe without an operator: for safety and complexity reasons, a driver must sit in the driver's seat to control some critical tasks.
Supervised autonomy is a kind of operator-assisted autonomy in which the driver controls the critical tasks through a remote control system.
The last stage of the automation levels is full autonomy. In this mode, the tractor senses and collects data from its environment and processes the given initial mission data without any external control by a human operator. The aim of this study is to design a tractor system with full autonomy.

Autonomous systems may in some cases also include more than one cooperating vehicle: while one vehicle is driving in the field, another vehicle can serve as storage for the harvest [15].
In studies carried out on smart agriculture systems, the concepts of positioning, timing, and quantitative monitoring and management are used [16]. Data recording systems are also used in these systems in order to follow up and improve the studies [16].
In general, autonomous equipment uses multiple sensors to calculate posture,
direction and location [17]. Autonomous rotation operations are shaped according to the
difference in heading angle [17].

1.2 Thesis Structure


In this work, the autonomous tractor topic is studied with two path planning styles: the back and forth and the spiral planning algorithms are described and evaluated. Then, the external variables, such as the location of the tractor, latitude and longitude data, steering angle and direction, and the data collection mechanisms, are discussed together with the sensors.
In Chapter 2, path planning techniques are presented together with field characteristics. The autonomous navigation principles and techniques are described in Chapter 3; the sensing algorithms and the calculations that turn the raw sensor readings into useful data are also described in that chapter. The path planning algorithm, which is calculated dynamically with a MATLAB script to produce the navigation path in an area bounded by the given corners, is examined in Chapter 4.
The dynamically calculated route is used on a test platform. In Chapter 5, the implementation of this platform is given; the control algorithm of the platform and the logging of navigation data are also described there.
In Chapter 6, the results are given with supporting graphs, and the advantages and disadvantages of the path algorithms are discussed.

2. PATH PLANNING
Engineers have published hundreds of articles and patents on autonomous agricultural robots since the 1920s. Autonomous tractor behavior has been described as sensible, long-term behavior in a semi-natural environment while performing a useful task [18].
Many researchers have investigated the automation of mobile equipment in agriculture. There are two basic approaches. In the first, the vehicle drives a route based on an absolute reference frame: the route is planned by calculating a geometric coverage pattern over the field or by manually teaching the machine, and is driven using absolute positioning sensors such as a global positioning system (GPS), a magnetic compass, or visual markers. The planned route is driven as programmed or taught, without modification. Autonomy is needed here since operations such as spraying are performed frequently and are hazardous to both the vehicle operator and other workers in the field. In the other approach, the vehicle drives in a relative frame, guided by sensing known reference cues such as individual plants, a crop line, or the end of a row [19].
To generate a path, several sophisticated and classical algorithms based on graph theory exist. One of the specific characteristics of mobile robots is the complexity of their environment. Therefore, one of the critical problems for mobile robots is path planning, which remains an open problem studied extensively. Accordingly, one of the key issues in the design of an autonomous robot is navigation: the science of directing the course of a mobile robot as it traverses the environment. The goal of a mobile robot's navigation system is to move the robot to a named place in a known, unknown, or partially known environment [20].

2.1 Complete Coverage Path Planning


Complete coverage path planning (CCPP) is the type of planning in which the task is completed by passing through all points of a path defined over a certain area [21]. CCPP is also used in autonomous agricultural vehicles, since the vehicle path planning problem requires a way of scanning the whole target area [22]. Robot path planning algorithms can scan the entire area with CCPP [23]. An autonomous vehicle should follow the path continuously and without repetition, pass through all designated points, and avoid overlap [24].

CCPP is the problem of finding a path that passes through all the points in the workspace from a starting point to a final point. It is a fundamental problem in robotics with numerous real-world applications such as demining, agriculture and farming, cleaning, inspection of complex structures, seabed mining, and underwater operations, to name a few. The coverage efficiency of a CCPP algorithm is determined by the total coverage ratio, the total time required for complete coverage, the total path length, and the energy consumption required to cover the path [21].
Generally, coverage algorithms are categorized as offline or online. Offline coverage algorithms use fixed information, and the environment is known in advance; complete coverage planned by genetic algorithms, neural networks, cellular decomposition, spanning trees, spiral filling paths and ant colony methods falls into this category. Online coverage algorithms, in contrast, use real-time measurements and decisions to sweep the entire target area. In online approaches, a complete environment map can only be generated through the robot's exploration, such as executing an action and observing its consequences. Sensor-based approaches are popular candidates for this category [25].
A typical auto-guidance system consists of hardware and software: the hardware includes position and steering angle sensors and actuators, while the software includes the path planning and steering algorithms [26].
In autonomous systems, there are two types of planning: global path planning and local motion planning [27]. The global planning part finds the route to be followed, while local planning steers the vehicle to the selected point in order to follow the planned route. Two main parameters are used in autonomous systems: heading angle and time-dependent displacement [28].

2.2 Path Planning Techniques


The simplest technique for the complete coverage of an area is the back and forth (boustrophedon) path [29]. Another method used in the literature is the spiral scanning method [2]. Concave areas can be divided into convex subareas and covered by multi-step coverage methods [30].
As a way of performing the field covering task, CCPP has two standard basic motions. The first is the square spiral motion, and the second is the boustrophedon (back-and-forth) motion. The main advantage of these basic motions is that they can cover a region of any shape and can be used as a base for more complex movements, particularly in an environment full of obstacles. A field can be covered by a human driver in different ways, but autonomous systems generally use these two path styles. A CCPP algorithm is complete if the robot sweeps the workspace such that the union of all the sub-trajectories completely covers the workspace in finite time [25].
Energy consumption comes into prominence as a criterion in choosing the route to be used [31]. Other criteria are the minimum number of turns, minimum path length and minimum repetition [32]. The total number of turns can be used in practice as a measure of efficiency [33]. The turning process wastes time when it extends beyond the field area. The back and forth style has 180-degree turning points that lie outside the field area, while the spiral path has smaller turning angles and its turning areas are inside the field. The wasted areas are shown as blue arcs in Figure 2.1.

Figure 2.1. The waste areas of 180 degree turning [17].

2.3 Field Characters


Fields are enclosed agricultural areas of various shapes. Agricultural machines process the fields to produce agricultural goods, and autonomous tractors are going to be used in these fields. Covering the field is the main concern for autonomous driving. As described before, there are two kinds of routes for covering a field: one of them drives back and forth from one end to the other, and the other covers the field from the outer boundary to the middle, loop by loop, in a spiral motion. Figure 2.2 shows the back and forth method and the outer-to-inner loop method.
Figure 2.2. Field covering methods Boustrophedon Motion (Left), Spiral Motion (Right) [34].

The problem of covering an empty area can be solved using either a spiral pattern or back and forth motions. Both of these patterns can be a basis for a coverage algorithm that works in more general environments. These basic patterns are shown in Figure 2.2. The black lines are the calculated paths and the black dots are the finishing points [34].
In this study, a loop-based path planning algorithm is designed. It is assumed that there are no obstacles in the field and that the fields are convex. The geographical coordinates of the field are found from online Global Navigation Satellite System (GNSS) data. Both the spiral and the back and forth algorithms are implemented for a sensor-assisted autonomous vehicle, and the results of the two path planning types are evaluated.

2.4 Test Area


The test area is an open space in Eskişehir; the model car wheels need a clear area to move smoothly. The geographical coordinates of the center of Eskişehir are 39° 46' 36" North latitude and 30° 31' 14" East longitude. The tests are carried out in the Northern hemisphere, and the compass declination angle for Eskişehir is +5° 37' (positive, East). Note that compass sensors are highly sensitive to magnetic fields, so the test area needs to be clear of large magnetic sources and ferromagnetic vehicles.

3. AUTONOMOUS DRIVING SYSTEM
3.1 Driving Speed
Tractors are used at low speeds. A harvesting efficiency study shows that the efficient harvesting tractor speed is 2 to 7.7 kilometers per hour (km/h), so the autonomous tractor design should use a working-mode speed between these limits [35].
The speed of the tractor is calculated from the tire dimensions and the revolutions per minute. Let $S_t$ be the speed of the tractor in km/h, $d_w$ the dimension of the main wheel in meters, and $w_{s,rpm}$ the revolutions per minute of the wheel shaft:

$w_{s,rpm} = \dfrac{S_t}{d_w \cdot 60/1000}$   (3.1)

The rpm meters read the rpm of the shaft.
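As a quick illustration, the following C++ sketch evaluates Eq. (3.1); the function name is illustrative and the wheel dimension is assumed to be given in meters, as in the text. For example, a 1.5 m wheel dimension at a target speed of 5 km/h requires roughly 55.6 rpm.

#include <iostream>

// Minimal sketch of Eq. (3.1): wheel-shaft rpm for a target ground speed.
// St is the tractor speed in km/h and dw is the wheel dimension in meters.
double wheelShaftRpm(double St, double dw) {
    return St / (dw * 60.0 / 1000.0);   // Eq. (3.1)
}

int main() {
    // e.g. 5 km/h with a 1.5 m wheel dimension -> about 55.6 rpm
    std::cout << wheelShaftRpm(5.0, 1.5) << " rpm\n";
    return 0;
}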

3.2 Sensors and Data Collection


Autonomous driving systems are mainly based on sensing environmental variables and using them to keep the vehicle on the route by correcting the error. A conventional tractor driver mainly performs route planning in mind (seeing and analyzing), keeps the vehicle on the route (seeing and directional control), handles emergencies and stops the mission (sensing and braking), and controls the velocity (analysis and throttle control). In autonomous driving systems, on the other hand, these kinds of tasks are performed by sensors and microcontrollers; the human operator only gives the initialization data to the system, and the rest is done autonomously.

3.2.1 Initialization of the mission data


The first data required for the agricultural mission is the field data. As mentioned before, the coordinate data can be obtained from online GNSS data. After this data is given, the mission itself (plowing, harvesting, etc.) must be known; the mission data is given to the autonomous system by the operator. In this study, the initial data consists of the navigation point coordinates calculated by a path planning algorithm.

3.2.2 Sensors and directional control
The mission data are location variables, namely latitudes and longitudes. Global coordinates give the exact location with a negligible amount of error. Location data gives not only the current position of the vehicle but also the preplanned mission coordinates, in latitude and longitude format. The most common location finding method is GNSS. The widely used GNSS systems are the Global Positioning System (GPS) and the Globalnaya Navigatsionnaya Sputnikovaya Sistema (GLONASS), whose satellites orbit the Earth for global navigation purposes. In this study, the GPS system is used. Direction finding is the second sensing task in autonomous driving. The Earth's magnetic pole is located near the geographic North Pole, and most direction sensors use the Earth's magnetic field to find magnetic north. In reality, however, the magnetic pole is not exactly at the geographic North Pole. The declination angle for different locations and dates can be found on magnetic declination web sites [36]. The true north angle is found by adding the declination angle to the magnetic north angle.

3.2.2.1 GPS Sensor


Today, using GPS is a common method for navigation purposes. Ranging from driver-aid mobile GPS devices to smartphone mapping systems, GPS receivers are used in many kinds of devices. GPS receivers can obtain satellite positions, location data, accurate time data and other information such as almanac data. The most commonly used data packets from GPS are National Marine Electronics Association (NMEA) sentences, received from the 24 GPS satellites orbiting the Earth. One of the NMEA sentences is the Recommended Minimum Sentence C (GPRMC) sentence. The parsing of the following example sentence is shown in Table 3.1 [37]:

$GPRMC, 123519, A, 4807.038, N, 01131.000, E, 022.4, 084.4, 230394,003.1, W*6A

Table 3.1. NMEA Recommended Minimum Sentence parsing [37].

RMC           Recommended Minimum Sentence C
123519        Fix taken at 12:35:19 UTC
A             Status, A = active or V = void
4807.038, N   Latitude 48 deg. 07.038' N
01131.000, E  Longitude 11 deg. 31.000' E
022.4         Speed over the ground in knots
084.4         Track angle in degrees (true)
230394        Date: 23rd of March 1994
003.1, W      Magnetic variation
*6A           The checksum data, always begins with *
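To make the field layout concrete, the sketch below parses a GPRMC string of this form on a microcontroller by splitting it on commas and converting the latitude and longitude fields from the NMEA ddmm.mmmm format to decimal degrees. It is a simplified, illustrative example written for this discussion (the struct, function names and buffer size are assumptions); checksum verification and empty fields are not handled.

#include <string.h>
#include <stdlib.h>

// Simplified GPRMC parser sketch: extracts UTC time, status and the
// latitude/longitude (converted to decimal degrees) from a sentence
// like the example above. Checksum verification is omitted.
struct RmcFix {
    char   utcTime[11];   // e.g. "123519"
    char   status;        // 'A' = active, 'V' = void
    double latitude;      // decimal degrees, negative for South
    double longitude;     // decimal degrees, negative for West
};

// Convert an NMEA ddmm.mmmm (or dddmm.mmmm) field to decimal degrees.
static double nmeaToDecimal(const char* field, char hemisphere) {
    double v   = atof(field);
    int    deg = (int)(v / 100.0);
    double dec = deg + (v - deg * 100.0) / 60.0;
    return (hemisphere == 'S' || hemisphere == 'W') ? -dec : dec;
}

bool parseGprmc(const char* sentence, RmcFix* fix) {
    char buf[96];
    strncpy(buf, sentence, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';

    char* field[12] = {0};
    int n = 0;
    for (char* tok = strtok(buf, ","); tok != 0 && n < 12; tok = strtok(0, ",")) {
        while (*tok == ' ') ++tok;        // tolerate spaces after commas
        field[n++] = tok;
    }
    if (n < 7 || strstr(field[0], "RMC") == 0) return false;

    strncpy(fix->utcTime, field[1], sizeof(fix->utcTime) - 1);
    fix->utcTime[sizeof(fix->utcTime) - 1] = '\0';
    fix->status    = field[2][0];
    fix->latitude  = nmeaToDecimal(field[3], field[4][0]);
    fix->longitude = nmeaToDecimal(field[5], field[6][0]);
    return fix->status == 'A';
}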

GPS sensors provide the location of the device and the time of the moment with high accuracy. The system was first built for military purposes, but today civil receivers can be used to obtain NMEA data. In this study, the locations of the tractor and the fields are used in units of latitude and longitude, obtained from GPS sensors and from map data on Google Maps. Figure 3.1 [38] shows the orbits of the GPS satellites.

Figure 3.1. GPS satellites orbits [38].

GPS navigation data is used in air, sea and land vehicles. As seen in Figure 3.2 [39], the GPS satellites communicate with all receivers located on the various vehicles.
Figure 3.2. GPS satellites system [39].

In this study, the GPS vehicle location and the target location coordinates are used for calculating the distance and the relative bearing from the current location to the target location. GPS time data is also used as the time reference for the data logger.

3.2.2.2 Compass Sensor


The magnetic effect has been known since ancient times and has been used for navigation purposes, most commonly on ships. Needle-type compasses are still in use today. In addition, micro-electromechanical systems (MEMS) magnetometer sensors can provide a digital output of the angle to the Earth's magnetic field. Figure 3.3 [40] shows the Earth's magnetic field.

Figure 3.3. World’s magnetic fields [40].

The sensor measures the magnetic field along three axes relative to the strongest nearby magnetic field. Since interference from nearby strong magnetic fields can be high, the sensor needs to be located in a nonmagnetic area of the vehicle.
Because of tilt angle and unpredictable disturbances, a digital compass does not provide a precise measurement; some calibration procedures and compensation algorithms need to be applied to achieve higher precision [3].

3.2.2.3 Direction Finding


The Earth is not a perfect sphere; its shape is called a geoid. Its rotation axis defines the four cardinal directions: the Earth rotates from west to east, and the rotation axis points to the north and to the south. Latitude and longitude data are described in this cardinal direction system.
Turkey’s latitude and longitude map is shown in Figure 3.4 [41].

Figure 3.4. Turkey’s latitudes and longitudes map [41].

Coordinate systems are mainly based on cardinal directions. Given two different location coordinates, the direction between them can be expressed as a directional angle. A vehicle has a forward direction because of its natural direction of motion; this so-called heading angle can be measured by a fixed compass sensor. A bearing, on the other hand, is defined by two coordinates: the current location of the vehicle and the target point from the mission database.
There are two kinds of coordinate notation in the literature, the decimal system and the degrees-minutes system, and they can be converted as follows:

$AB°\,CD.EF' = AB + (CD.EF)/60$   (3.2)

For example:

$39°\,45.52' = 39 + 45.52/60 = 39.758666$ (decimal degrees)   (3.3)

3.2.2.4 Parallels and Meridians


The Earth's surface is divided by imaginary lines called parallels and meridians. The parallel circles are parallel to the largest parallel, the 0° equator circle (the great circle), and the meridians are arcs between the North and South Poles.
The properties of the parallels are:
• These circles extend from east to west.
• The first parallel is the 0° parallel, called the equator.
• Two neighboring parallels are 1° apart. There are 180 parallel circles, 90 in the north and 90 in the south; the North Pole and South Pole circles reduce to single points at the poles.
• From the equator to the poles, the parallel degrees increase numerically; in contrast, the lengths of the parallels decrease.
• Parallels are always perpendicular to the meridian arcs.
• The degrees of the northern parallels increase toward the North Pole and the degrees of the southern parallels increase toward the South Pole.
Some parallels have special properties:
• 0°: Equator
• 23° 27′ North: Tropic of Cancer
• 23° 27′ South: Tropic of Capricorn
• 66° 33′ North: northern polar circle
• 66° 33′ South: southern polar circle
• 90° North: North Pole
• 90° South: South Pole
• The distance between two neighboring parallels is 111 kilometers.
The properties of the meridians are:
• The prime meridian passes through Greenwich, London. There are 360 meridians on the Earth's surface, 180 to the east and 180 to the west.
• The degrees of the eastern meridians increase toward the east and the degrees of the western meridians increase toward the west.
• All meridians have the same length.
• At the equator, the distance between two neighboring meridians is 111 kilometers [42].

3.3 Navigational Calculations


In navigational systems, any location can be found using latitude and longitude data. The relative direction between two locations can also be found from their coordinates. Angles on the Earth's surface are described either in the 0°–359° degree system or in the 0 to 2π radian system. In addition, at any point on the Earth's surface there is a north vector pointing toward the North Pole [43].
Figure 3.5 shows that the ship of interest has an already known centerline direction, the heading. The bearing is the angle in degrees (clockwise) between North and the direction to the destination ship or navigation aid [44].

Figure 3.5. True and relative bearings [43].

On the other hand, the relative bearing is the angle in degrees between the heading of the vehicle and the direction to the destination or navigation aid. Figure 3.6 [44] shows that a radio beacon station can be treated as a target point whose relative direction is already known.

Figure 3.6. Vehicle heading and bearing [44].

In this study, the heading is the angle of the tractor's front direction from north. The bearing is the angle, measured from north, of the vector from the current location to the target position. The angle difference is the difference between the heading angle and the bearing angle. In addition, the distance between the two locations can be calculated. Figure 3.7 shows the navigational variables for the autonomous tractor design.

Figure 3.7. Navigational variables of the autonomous tractor system.

3.3.1 Heading calculation
The autonomous tractor navigation process is based on a point-to-point navigation algorithm. The algorithm needs to know which direction to travel. Magnetometers measure the magnetic field in three dimensions, "x, y and z": "x" is the axis along the front-back (heading) direction, "y" is the left-to-right axis and "z" is the bottom-to-top axis. Because the tractor does not roll, the vertical direction "z" is not considered as a variable. The heading angle is found by an algorithm that uses the "x" and "y" gauss readings for the current direction.
The magnetometer provides data for each axis in units of gauss, and this data needs to be converted to a degree value.
In order to obtain a direction vector, data from two axes are needed; in the two-dimensional case, the X and Y axes are used to calculate the heading angle. The LSB value of the sensor is 0.48828125 mG, giving a scale of 2048:

$X = x \cdot 0.48828125\ \mathrm{mG}$   (3.4)

where X is the gauss data in the x direction and x is the raw value received from the sensor.

$Y = y \cdot 0.48828125\ \mathrm{mG}$   (3.5)

where Y is the gauss data in the y direction and y is the raw value received from the sensor. The conversion of the gauss data to degrees is given by

$D = \arctan(Y/X) \cdot (180/\pi)$   (3.6)

where D is the heading angle in degrees [45].


However, there are some restrictions on this formula. First of all, if X is 0, the division goes to infinity; if this occurs, D is taken as 90 degrees. If D is greater than 360°, then 360° must be subtracted from D to obtain the value modulo 360.
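A compact C++ sketch of Eqs. (3.4)–(3.6), including the division-by-zero case above, might look as follows; the wrap-around of negative angles into the 0–360° range is an added assumption, since the arctangent alone can return negative values.

#include <math.h>

// Heading from raw magnetometer readings, following Eqs. (3.4)-(3.6).
// rawX and rawY are the raw axis readings; 0.48828125 mG is the sensor
// LSB value given in the text.
double headingDegrees(int rawX, int rawY) {
    const double LSB_MG = 0.48828125;          // milligauss per count
    double X = rawX * LSB_MG;                  // Eq. (3.4)
    double Y = rawY * LSB_MG;                  // Eq. (3.5)

    if (X == 0.0)                              // avoid division by zero
        return 90.0;                           // D is taken as 90 degrees

    double D = atan(Y / X) * (180.0 / M_PI);   // Eq. (3.6)
    if (D < 0.0)    D += 360.0;                // assumed wrap into [0, 360)
    if (D >= 360.0) D -= 360.0;                // mod-360 restriction from the text
    return D;
}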

3.3.2 Target bearing calculation


The second data needed for the point-to-point navigation algorithm is the target bearing. It is the angle, measured from the direction to the North Pole, of the vector connecting the two point locations. As seen in Figure 3.8, the bearing angle β is calculated as follows.

Figure 3.8. Navigational calculations.

Let X and Y be dummy variables used to clarify the basic formula:

$\beta = \mathrm{atan2}(X, Y)$   (3.7)

$X = \cos(l_t)\sin(d_L)$   (3.8)

$d_L = L_t - L_{cl}$   (3.9)

where $d_L$ is the longitude difference, $l_t$ is the target latitude, $L_t$ is the target longitude, $l_{cl}$ is the latitude of the current location and $L_{cl}$ is the longitude of the current location of the vehicle. All calculations are in radians; $\alpha_1$ and $\alpha_2$ are dummy variables:

$Y = \alpha_1 - \alpha_2$   (3.10)

$\alpha_1 = \sin(l_t)\cos(l_{cl})$   (3.11)

$\alpha_2 = \sin(l_{cl})\cos(l_t)\cos(d_L)$   (3.12)

The bearing angle β needs to be in the 0 to 2π range, which gives the following restrictions:

if β < 0, then $\beta = \beta + 2\pi$   (3.13)

if β > 2π, then $\beta = \beta - 2\pi$   (3.14)

Finally, the bearing in degrees is

$\beta_d = \beta \cdot \dfrac{180}{\pi}$   (3.15)

where $\beta_d$ is the angle in degrees and β is the angle in radians [45].
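As an illustrative C++ sketch, the target bearing of Eqs. (3.7)–(3.15) can be computed as below; the latitude and longitude arguments are assumed to be already converted to radians, and the variable names follow the text.

#include <math.h>

// Target bearing from the current location (lcl, Lcl) to the target
// (lt, Lt), following Eqs. (3.7)-(3.15). Inputs are in radians; the
// result is the bearing in degrees, measured clockwise from north.
double targetBearingDegrees(double lt, double Lt, double lcl, double Lcl) {
    double dL = Lt - Lcl;                        // Eq. (3.9)
    double X  = cos(lt) * sin(dL);               // Eq. (3.8)
    double a1 = sin(lt) * cos(lcl);              // Eq. (3.11)
    double a2 = sin(lcl) * cos(lt) * cos(dL);    // Eq. (3.12)
    double Y  = a1 - a2;                         // Eq. (3.10)

    double beta = atan2(X, Y);                   // Eq. (3.7)
    if (beta < 0.0)        beta += 2.0 * M_PI;   // Eq. (3.13)
    if (beta > 2.0 * M_PI) beta -= 2.0 * M_PI;   // Eq. (3.14)
    return beta * 180.0 / M_PI;                  // Eq. (3.15)
}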

3.3.3 Relative bearing calculation


So far, the heading and the bearing angles have been calculated. The main autonomous tractor algorithm always steers toward the target point within the steering angle limits. If the required angle exceeds the limit, the system holds the maximum steering angle until the angle difference between heading and target comes back within the limit. In other words, the system tries to drive the difference between the heading and the target bearing to zero.

$\alpha_{rb} = \alpha_{tb} - \alpha_h$   (3.16)

where $\alpha_{rb}$ is the relative bearing angle, $\alpha_{tb}$ is the target bearing angle and $\alpha_h$ is the heading angle of the vehicle. $\alpha_{rb}$ has the range 0° to 359°. To follow the target direction, the steering angle $\alpha_s$ is

$\alpha_s = \alpha_{rb}$   (3.17)

If the steering angle $\alpha_s$ is bigger than the maximum steering angle $\alpha_{maxS}$, then

$\alpha_s = \alpha_{maxS}$   (3.18)

If the steering angle $\alpha_s$ is smaller than the minimum steering angle $\alpha_{minS}$, then

$\alpha_s = \alpha_{minS}$   (3.19)

3.3.4 Distance calculation


The steering angle data sent to the steering actuators is found using the steering angle calculation described above. With this information, the autonomous vehicle can point toward the target location and drive to it using sensor data. The next task is to determine when the target position has been reached from the current position; the latitude and longitude data give the distance between the current location and the target location.
The radius of the hypothetical sphere of the Earth is 6 372 795 meters. Because the Earth is not an exact sphere, errors may be up to 0.5%. The haversine formula is accurate even for small distances [46]. The distance d is calculated using the haversine formula:
$d_l = l_t - l_{cl}$   (3.20)

$d_L = L_t - L_{cl}$   (3.21)

$A = \sin^2(d_l/2) + \cos(l_t)\cos(l_{cl})\sin^2(d_L/2)$   (3.22)

$C = 2 \cdot \mathrm{atan2}(\sqrt{A}, \sqrt{1-A})$   (3.23)

$d = R \cdot C$   (3.24)

where $d_l$ is the latitude difference, $d_L$ is the longitude difference, $l_t$ is the target latitude, $L_t$ is the target longitude, $l_{cl}$ is the latitude of the current location, $L_{cl}$ is the longitude of the current location of the vehicle, and R is the Earth radius (mean radius 6371 km). In these formulas, the angles need to be in radians [46].
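A matching C++ sketch of Eqs. (3.20)–(3.24) is given below; the coordinates are again assumed to be in radians, and the 6371 km mean radius from the text is used, so the result is in meters.

#include <math.h>

// Haversine distance between the current location (lcl, Lcl) and the
// target (lt, Lt), following Eqs. (3.20)-(3.24). Inputs in radians.
double haversineDistanceMeters(double lt, double Lt, double lcl, double Lcl) {
    const double R = 6371000.0;                     // mean Earth radius in meters
    double dl = lt - lcl;                           // Eq. (3.20)
    double dL = Lt - Lcl;                           // Eq. (3.21)

    double A = sin(dl / 2.0) * sin(dl / 2.0)
             + cos(lt) * cos(lcl) * sin(dL / 2.0) * sin(dL / 2.0);  // Eq. (3.22)
    double C = 2.0 * atan2(sqrt(A), sqrt(1.0 - A)); // Eq. (3.23)
    return R * C;                                   // Eq. (3.24)
}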

3.4 Controllable Outputs


In agricultural tractor driving, the driver controls a number of local tasks: ignition, gear shifting, throttle levering, steering, braking and stopping the tractor. Modern tractors and vehicles, on the other hand, handle these tasks autonomously through smart ignition, automatic shifting and smart braking systems.
In this project, the design has discrete start and stop outputs for these controls. The steering mechanism consists of pulse width modulation (PWM) controlled servo motors that move to 90° for straight ahead, 60° for a full right turn and 120° for a full left turn. Algorithm 1 shows the steering algorithm.

Algorithm 1. Steering control. The algorithm controls the angle of the steering wheels s. It receives the heading h and the target bearing c, then calculates the difference D between the heading and the target bearing.

1. Data: h, c
2. Result: s
3. D ← h − c;
4. if D < 5 and D > −5 then
5.   s ← 0;
6. else if D > 30 then
7.   s ← 30
8. else if D < −30 then
9.   s ← −30
10. else
11.  s ← D
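Expressed in C++, Algorithm 1 might be written as the following sketch, assuming the ±5° dead band and ±30° steering limits used above; no normalization of the angle difference is added beyond what the algorithm states.

// C++ sketch of Algorithm 1. h is the heading and c the target bearing,
// both in degrees; the return value is the steering angle s.
int steeringAngle(double h, double c) {
    double D = h - c;                  // difference between heading and target bearing
    if (D > -5.0 && D < 5.0) return 0; // within the dead band: steer straight
    if (D > 30.0)  return 30;          // clamp to the maximum steering angle
    if (D < -30.0) return -30;         // clamp to the minimum steering angle
    return (int)D;                     // otherwise steer by the difference itself
}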

4. NAVIGATION PATH PLANNING
In autonomous driving systems, without a doubt, the most important issue is the path planning algorithm. While mission planning is done by the driver in non-autonomous tractor operations, in an autonomous system architecture the mission and the path are calculated by the system. In this project, the main purpose is covering the field autonomously, and the path planning algorithm is the base structure of the project.
A path is the way to reach the next point. The field has some special points: the corner points, the starting point and the finishing point. In autonomous systems, the initialization data is given by the operator; the unique points are the corner points. Figure 4.1 shows an example of a field's corner point coordinates.

[Scatter plot of the five field corner coordinates; longitude on the horizontal axis, latitude on the vertical axis.]
Figure 4.1. Example field coordinates

The field has 5 corners. The coordinate data are found either by a GPS receiver placed on the corners of the field or from map applications; the most reliable method is to use the same kind of GPS receiver as the one on the autonomous tractor.
The given corners are the first navigation points. The number of given corners is n, and the (n+1)-th point must be calculated first.
Satellite images show that a common method for agricultural driving starts from the outer boundary of the field and finishes at the middle of the field. The proposed autonomous driving model is based on this method. Figure 4.2 shows a field image plowed from the outer frame toward the middle of the field.

Figure 4.2. Example field image

4.1 Calculating The First Point


The path must cover the whole given field. The corners are navigated first. After the n-th point, the (n+1)-th point must be moved inward by the working width of the autonomous tractor. It can be seen that the next inner points must lie on the lines connecting the corner points to the center of the remaining loops.
The center point must be calculated first. The center point $(x_c, y_c)$ of a loop in coordinate space can be found by:

$x_c = (x_1 + x_2 + \dots + x_n)/n$   (4.1)

$y_c = (y_1 + y_2 + \dots + y_n)/n$   (4.2)

It can be seen that all the navigation points are on the line between the center and
the previous loop point. The following calculations determine the 𝑛 + 1’th point.

Figure 4.3. Navigation point calculation

Figure 4.3 shows that the calculated point lies between the corner and the center and is offset by a given distance from the previous loop corner. This problem can be solved by using the Thales (intercept) theorem:

$\left|\dfrac{CA}{CB}\right| = k$   (4.3)

$x_n = \dfrac{x_c + k x_1}{1 + k}$   (4.4)

$y_n = \dfrac{y_c + k y_1}{1 + k}$   (4.5)

The distance |CA| is the offset distance corresponding to the coverage width of the autonomous tractor, and $x_n$ and $y_n$ are the coordinates of the (n+1)-th point of the path. The other points are calculated similarly using this formula. The limit of the points is reached when the remaining distance is shorter than the offset. The last index l is

$l = \left|\dfrac{CA}{CB}\right|$   (4.6)

In this equation the distance |CB| is needed; it can be calculated as

$|CB| = \sqrt{(y_c - y_n)^2 + (x_c - x_n)^2}$   (4.7)

The procedure is applied to all corner-to-center lines, and all the navigation points are found using this formula. The points are calculated sequentially from the first corner to the last corner. Figure 4.4 shows the calculated navigation points for the given field coordinates.

Figure 4.4. The calculated navigation points

At this stage of the procedure, the navigation points are known by their coordinates. The next step is to determine the index numbers of the coordinates so that they are driven in sequence from the start to the center point. The spiral path planning algorithm is given below.

Algorithm 2. Spiral Path Planning. The algorithm creates a coverage path. It receives the initial corner points p1 to pn and creates a table of latitudes and longitudes.

1. Data: p1 to pn
2. Result: lat(1) to lat(n), lon(1) to lon(n)
3. center(p1, …, pn)
4.   c ← (p1(i,j) + p2(i,j) + … + pn(i,j))/n;
5. for n = {1,2,…,N}
6.   thales(p(n), c)
7.     k ← (full_dist − dist)/dist;
8.     np(i) ← (c(i) + k ∗ p(n,i))/(k + 1);
9.     np(j) ← (c(j) + k ∗ p(n,j))/(k + 1);
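The same computation can be sketched in C++ as below (the MATLAB script itself is not reproduced here): the center is the average of the corner coordinates, as in Eqs. (4.1)–(4.2), and each inner point is placed on the corner-to-center line with the relation of Eqs. (4.4)–(4.5). The Point type, the container and the offset argument are illustrative assumptions.

#include <vector>
#include <cmath>

struct Point { double x, y; };   // illustrative coordinate pair (e.g. longitude, latitude)

// Center of the loop: average of the corner coordinates, Eqs. (4.1)-(4.2).
Point centerOf(const std::vector<Point>& corners) {
    Point c{0.0, 0.0};
    for (const Point& p : corners) { c.x += p.x; c.y += p.y; }
    c.x /= corners.size();
    c.y /= corners.size();
    return c;
}

// Navigation point on the line from corner p toward center c, offset
// inward from the corner by 'offset' (the tractor coverage width),
// using k = (full_dist - dist)/dist and Eqs. (4.4)-(4.5).
Point innerPoint(const Point& p, const Point& c, double offset) {
    double fullDist = std::hypot(c.x - p.x, c.y - p.y);   // |CB| as in Eq. (4.7)
    double k = (fullDist - offset) / offset;
    return Point{ (c.x + k * p.x) / (1.0 + k),
                  (c.y + k * p.y) / (1.0 + k) };
}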

4.2 Indexing The Navigation Points for the Spiral Method


In the previous section, the coordinates of all the navigation points are found in units of latitude and longitude, but they are not yet ordered. In order to cover the path from start to end, they need to be ordered. The path can be defined as loops from the outermost loop to the innermost loop. If the total number of navigation points is t and the number of initial corners is n, the number of loops $n_{loop}$ satisfies

$n_{loop} \le t/n$   (4.8)

Let i be the index of a point, n the number of corners in a loop, l the loop number and p the index of the calculated point along the line from the corner toward the middle. The index i can also be advanced in steps of n:

$i = l \cdot n + p$   (4.9)

where
l = 0, 1, 2, …, L;
n = 1, 2, …, N;
p = 1, 2, …, N;

The algorithm is tested with a MATLAB script. The script takes the coordinates of the initial corners and the corner count n. The output is a two-column table of i path coordinates to be used by the tractor processor. Figure 4.5 shows an example output graph for the given corner coordinates (100,100), (500,100), (500,500) and (100,500), in arbitrary units.

Figure 4.5. A four corner field graph

Figure 4.6 shows another example navigation path produced for the given corner coordinates (100,150), (500,100), (600,400), (500,500) and (200,400), in arbitrary units.

Figure 4.6. A five corner field graph

Algorithm 3 shows the spiral path indexing algorithm.

Algorithm 3. Spiral Path Indexing. The algorithm creates a spiral coverage path. It receives the navigation points p1 to pn and creates a latitude and longitude table forming a spiral path. L is the number of loops from outer to inner, N is the number of points per loop produced by Algorithm 2, and P is the number of corners given as input.

1. Data: np, l = {1,2,…,L}, n = {1,2,…,N}, p = {1,2,…,P}
2. Result: lat(1) to lat(n), lon(1) to lon(n)
3. index(np, sp)
4.   sp(i) ← np(l ∗ n + p);
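In C++, the indexing step might be sketched as follows, under the assumption that the navigation points np are stored so that index l·n + p addresses the p-th corner-line point of loop l, exactly as Algorithm 3 presumes; the types and names are illustrative.

#include <vector>

struct NavPoint { double lat, lon; };   // illustrative coordinate pair

// Order the navigation points into a spiral driving sequence using the
// index relation i = l*n + p of Eq. (4.9); 'cornersPerLoop' is n and
// 'loops' is the number of loops L.
std::vector<NavPoint> spiralOrder(const std::vector<NavPoint>& np,
                                  int cornersPerLoop, int loops) {
    std::vector<NavPoint> sp;
    for (int l = 0; l < loops; ++l)                   // outermost loop first
        for (int p = 0; p < cornersPerLoop; ++p)      // corners within the loop
            sp.push_back(np[l * cornersPerLoop + p]); // sp(i) <- np(l*n + p)
    return sp;
}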

4.3 Determining and Indexing The Navigation Points for the Back and Forth
Method
Algorithm 4 shows the Back and Forth path planning algorithm.

Algorithm 4. Back and Forth Path Planning. The algorithm creates a coverage path. It receives the initial corner points p1 to pn and creates a table of latitudes and longitudes. D is the ratio of the full distance to the distance between lines.

1. Data: upper corners p1, p3, lower corners p2, p4, i = {2,4,…,D}
2. Result: lat(1) to lat(n), lon(1) to lon(n)
3. thales(p1, p3)
4.   k ← (full_dist − dist)/dist
5.   np(i) ← (p1(i) + k ∗ p3(i))/(k + 1)
6.   np(j) ← (p1(j) + k ∗ p3(j))/(k + 1)
7. thales(p2, p4)
8.   k ← (full_dist − dist)/dist
9.   np(i+1) ← (p2(i) + k ∗ p4(i))/(k + 1)
10.  np(j+1) ← (p2(j) + k ∗ p4(j))/(k + 1)

Algorithm 5 shows the Back and Forth path indexing algorithm.

Algorithm 5. Back and Forth Path Indexing. The algorithm creates a back and forth coverage path. It receives the navigation points p1 to pn and creates a latitudes and longitudes table that forms a parallel path. 𝑙 is the length of the coordinate table. 𝑝𝑝 is the table of parallel point coordinates.

1. 𝐃𝐚𝐭𝐚: 𝑛𝑝, i = {1,2,…,l}
2. 𝐑𝐞𝐬𝐮𝐥𝐭: 𝑙𝑎𝑡(1) 𝑡𝑜 𝑙𝑎𝑡(𝑛), 𝑙𝑜𝑛(1) 𝑡𝑜 𝑙𝑜𝑛(𝑛)
3. 𝐢𝐧𝐝𝐞𝐱(𝐩𝐩)
4. for 𝑖 = 1 𝑡𝑜 𝑙 step 4
5. 𝑠𝑝(𝑖) ← 𝑝𝑝(𝑖)
6. 𝑠𝑝(𝑖 + 1) ← 𝑝𝑝(𝑖 + 1)
7. 𝑠𝑝(𝑖 + 3) ← 𝑝𝑝(𝑖 + 2)
8. 𝑠𝑝(𝑖 + 2) ← 𝑝𝑝(𝑖 + 3)
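The reordering can be illustrated with the C++ sketch below, which assumes the parallel end points are stored two per strip in generation order; swapping every second pair reproduces the back and forth driving order of Algorithm 5. The names and layout are assumptions for illustration only.

#include <cstddef>
#include <utility>
#include <vector>

// Reorder the parallel strip end points into a drivable back and forth sequence.
// Assumption: pp[0], pp[1] are the two ends of the first strip, pp[2], pp[3] of the
// second strip, and so on. The swap makes the vehicle drive down one strip and back
// up the next instead of always starting from the same side.
template <typename Point>
std::vector<Point> backAndForthIndex(const std::vector<Point>& pp) {
    std::vector<Point> sp(pp);
    for (std::size_t i = 0; i + 3 < sp.size(); i += 4)
        std::swap(sp[i + 2], sp[i + 3]);   // mirrors sp(i+3) <- pp(i+2), sp(i+2) <- pp(i+3)
    return sp;
}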

Figure 4.7 shows another example, in which a back and forth style navigation path is produced for the given corner coordinates (100,100), (500,100), (500,500), and (100,500) in units.

Figure 4.7. Back and Forth path.

4.4 Data Formatting


The MATLAB script produces coordinates with many digits after the decimal separator, whereas the autonomous tractor system uses only six digits after the separator. For this reason, the table created by the MATLAB script needs to be reformatted. MS Excel can format the data in this way and is used for this purpose. Figure 4.8 shows an example route and Table 4.1 shows the coordinates of the navigation points calculated and formatted by the path planning algorithm.
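The same six-digit formatting could also be done in code; the tiny sketch below only illustrates the idea and is not the Excel step actually used in this study.

#include <cstdio>

// Write one navigation point with exactly six digits after the decimal separator,
// matching the precision expected by the tractor program (hypothetical formatter).
void printFixed6(double lat, double lon) {
    std::printf("%.6f %.6f\n", lat, lon);
}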

Figure 4.8. Path planning algorithm output path image

Table 4.1. Path planning algorithm output navigational points’ coordinates.

  #     Latitudes    Longitudes
  1     39,814412    30,537485
  2     39,817426    30,532430
  3     39,818572    30,533512
  4     39,815653    30,538635
  5     39,814478    30,537423
  6     39,817401    30,532516
  7     39,818508    30,533575
  …     …            …
  131   39,816531    30,535548
  132   39,816432    30,535859

5. TEST PLATFORM DESIGN
The test platform is a battery-powered vehicle built for the application part of this study. The vehicle is controlled by an Arduino microcontroller, and the Arduino code is written in C++. The equations in this study are converted to C++ code and loaded to the microcontroller. As seen in Figure 5.1, the vehicle uses two real-time sensors: a GPS sensor and a compass sensor. The mission data is loaded to the vehicle via an SD card. The vehicle has two powered wheels and two steering wheels.

Figure 5.1. Autonomous vehicle test platform

5.1 System Architecture
The test system is designed to accomplish an autonomous mission. Figure 5.2 shows the basic system architecture.

[Block diagram: mission planning, GPS and compass sensors, control system with direction and velocity control, data log, autonomous tractor]

Figure 5.2. The basic system architecture for the autonomous driving.

As seen in the flowchart in Figure 5.3, the sensors are initialized first. After the initialization period, the first navigation point is read from the latitudes and longitudes table. Whenever a point is reached, the next point is read. After the last point is reached, the motors are shut down. Throughout the navigation, the navigational data is written to the SD card with a one-second period.
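This flow can be summarised by the following Arduino-style sketch; the helper functions are placeholders and the constants are example values, not the actual program used in the thesis.

// Simplified sketch of the flow in Figure 5.3 (illustrative only).
#include <SD.h>

const int TARGET_COUNT = 132;            // number of navigation points (example)
int targetIndex = 0;
double targetLat = 0.0, targetLon = 0.0;

void readTarget(int i, double &lat, double &lon) { /* read entry i of the SD coordinate files */ }
bool reachedTarget()                     { /* compare current GPS position with the target */ return false; }
void steerToward(double lat, double lon) { /* set servo angle and motor speed toward the target */ }
void stopMotors()                        { /* cut the PWM signals to the drive motors */ }
void logNavigationData()                 { /* append one record to the log file on the SD card */ }

void setup() {
  Serial.begin(9600);
  // compass and GPS initialisation: wait here until both deliver valid values
  SD.begin(10);                          // SD card module chip-select pin (assumed)
  readTarget(targetIndex, targetLat, targetLon);   // first navigation point
}

void loop() {
  steerToward(targetLat, targetLon);     // steer and drive toward the current target
  if (reachedTarget()) {
    targetIndex++;
    if (targetIndex >= TARGET_COUNT) { stopMotors(); while (true) {} }   // last point reached
    else readTarget(targetIndex, targetLat, targetLon);
  }
  logNavigationData();                   // navigational data written once per period
  delay(1000);
}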

Figure 5.3. Flowchart of the autonomous driving.

5.2 Model Design


The platform consists of a processing module, sensor modules, a moving module, battery modules and a data module.
In order to be autonomous, a vehicle has to know or solve four things: what the task is, how to complete it, what is already known, and what its position is relative to what is known [47]. In agricultural robotics, the task is usually to cover the whole field, not only to go from point A to point B [47].
The start point is the actual position of the vehicle. The goal of the navigation
process is to move to a named place in a known, unknown or partially known environment
[20].
In this study, the environment coordinates are known by the controller and need to
be loaded before the tractor starts operation.

5.2.1 System components


In this section, the components required for the test platform to run are described. These components are:

 2 reducer DC motors and 6 cm diameter wheels,
 L293N DC motor driver shield,
 2 SG90 servo motors,
 GY-GPS6MV2 GPS sensor,
 HMC5883L compass sensor,
 Arduino Uno microcontroller,
 2 power supply modules, one for the motors and one for the sensors, SD card module and microcontroller,
 SPI interface SD card module,
 16 GB SanDisk 80 MB/s SD card.

5.2.1.1 Reducer motors


The platform weighs 750 grams. For forward movement, the platform is driven by two 6 V, 250 RPM DC motor and wheel modules. The platform uses a rear-drive system with two powered wheels (Figure 5.4).

Figure 5.4. DC reduced motors and wheels

5.2.1.2 DC L293N motor driver


The DC motors need more than 1 ampere per motor. The microcontroller cannot provide this high current, so an L293N motor driver shield that can supply a maximum current of 2 A per motor is used in the system (Figure 5.5).

Figure 5.5. DC motor driver

5.2.1.3 SG 90 servo motors


The steering control mechanism has two SG90 servo motors with 180° of movement. The movement is restricted to the range 60°–120° because of the tire diameter; 90° corresponds to straight-ahead movement (Figure 5.6).

Figure 5.6. Servo motors

5.2.1.4 GY-GPS6MV2 GPS sensor


The platform uses a GPS sensor that works with a 3.3 or 5 V DC supply at a 9600 baud rate. The sensor supplies NMEA data packets from the GPS antenna to the processing section (Figure 5.7).

Figure 5.7. GPS sensor
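On an Arduino, the NMEA stream from such a module is commonly decoded with the TinyGPS++ library; the thesis does not state which parser is used, so the following sketch, including the pin numbers, is only an assumed example.

#include <SoftwareSerial.h>
#include <TinyGPS++.h>

SoftwareSerial gpsSerial(4, 3);   // assumed wiring: GPS TX -> pin 4, GPS RX -> pin 3
TinyGPSPlus gps;

void setup() {
  Serial.begin(9600);
  gpsSerial.begin(9600);          // the module streams NMEA sentences at 9600 baud
}

void loop() {
  while (gpsSerial.available())
    gps.encode(gpsSerial.read()); // feed NMEA characters to the parser
  if (gps.location.isValid()) {   // a position fix is available
    Serial.print(gps.location.lat(), 6);
    Serial.print(',');
    Serial.println(gps.location.lng(), 6);
  }
  delay(1000);
}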

5.2.1.5 HMC5883L compass sensor
A three-axis magnetometer is used as the direction compass. The module uses the I2C communication protocol. The sensor supplies raw three-axis magnetometer data to the processing system (Figure 5.8).

Figure 5.8. Magnetometer sensor

5.2.1.6 Arduino Uno microcontroller


An Arduino Uno is used for controlling the data flow and for driving the platform at the required speed. The board uses an ATmega328 microcontroller for processing (Figure 5.9).

Figure 5.9. Arduino UNO microcontroller

5.2.1.7 Power supply modules


The platform has two power modules. One is a 5 V / 1 A USB output power supply for the microcontroller, sensors and SD card module. The second one is an eight-battery pack that supplies 12 V DC at a maximum of 2 A for the drive motors and the steering servo motors (Figure 5.10).
Figure 5.10. Power supply modules

5.2.1.8 SD card module


The platform needs storage for the mission planning data and for data logging. For these reasons, an Arduino-compatible SD card module is used. The module communicates with the Arduino microcontroller over the SPI interface. An SD card adaptor is used as the interface between the computer and the SD card. A 16 GB SanDisk 80 MB/s SD card is used for mission planning and log data transfer (Figure 5.11).

Figure 5.11. SD card module and microSD card

5.2.2 Wiring design


Figure 5.12 shows the internal wiring, data and power architecture of the test
vehicle.

Figure 5.12. Wiring system architecture of the vehicle

5.3 Mission Planning


The mission planning structure is based on the path planning algorithm given earlier. The MATLAB path planning script gives an output that includes the latitudes and longitudes of the navigation points.

Table 5.1. Six digit formatted target data table.

  #     Latitude     Longitude
  1     39,814412    30,537485
  2     39,817426    30,532430
  3     39,818572    30,533512
  4     39,815653    30,538635
  5     39,814478    30,537423
  6     39,817401    30,532516
  …     …            …
  131   39,816531    30,535548
  132   39,816432    30,535859

These data are formatted in an Excel file so that they can be read by the C++ string class during execution of the autonomous tractor program. The formatted latitudes and longitudes for the mission are saved to the SD card as two text files. The program reads the coordinates from the latitudes and longitudes text files, respectively.
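A minimal sketch of how the two text files could be read with the standard Arduino SD library is shown below; the file names, array size and chip-select pin are assumptions made for illustration.

#include <SD.h>

const int CHIP_SELECT = 10;           // SD module chip-select pin (assumed)
const int MAX_POINTS  = 200;
double latTable[MAX_POINTS];
double lonTable[MAX_POINTS];
int pointCount = 0;

void setup() {
  Serial.begin(9600);
  if (!SD.begin(CHIP_SELECT)) {
    Serial.println("SD init failed");
    return;
  }
  File latFile = SD.open("LAT.TXT");  // assumed file names, one coordinate per line
  File lonFile = SD.open("LON.TXT");
  while (latFile.available() && lonFile.available() && pointCount < MAX_POINTS) {
    latTable[pointCount] = latFile.parseFloat();
    lonTable[pointCount] = lonFile.parseFloat();
    pointCount++;
  }
  latFile.close();
  lonFile.close();
  Serial.print("Loaded points: ");
  Serial.println(pointCount);
}

void loop() {}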

5.4 Initialization
First, the mission planning SD card is inserted into the platform's SD card reader module and the system is ready to start. The sensors need to be initialized before the motor power-on period starts. If the compass sensor does not start, the serial logger gives an alert message. After the compass sensor is ready and gives valid values, the GPS sensor starts to search for GPS satellites during the initialization period. The sensor gives the value 1.000 before initialization is complete, and the system gives a serial debug message that says “GPS waiting”. The GPS initialization period can last more than 3 minutes from a cold start and less than 30 seconds from a hot start. After the sensor initializations, the serial log sends the location, target and direction data to the serial port if a logging serial terminal is connected.

5.5 System Run
After the initialization period, the system reads the first target from the SD card. The program executes the computational process and steers toward the target location while the drive motors start. The switch between the power pack and the motors determines when the system starts moving. When the platform reaches the target point, the next target coordinates are loaded as the current target. When the last point is reached, the system stops.
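The computational process mentioned above amounts to computing the distance and the bearing from the current GPS position to the target point and comparing the bearing with the compass heading. A standard great-circle formulation (cf. [46]) is sketched below for illustration; it is not claimed to be the exact code running on the vehicle.

#include <cmath>

const double kEarthRadiusMeters = 6371000.0;
const double kPi = 3.14159265358979323846;

double toRad(double deg) { return deg * kPi / 180.0; }

// Great-circle (haversine) distance in metres between two latitude/longitude points.
double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
    double dLat = toRad(lat2 - lat1);
    double dLon = toRad(lon2 - lon1);
    double a = std::sin(dLat / 2) * std::sin(dLat / 2) +
               std::cos(toRad(lat1)) * std::cos(toRad(lat2)) *
               std::sin(dLon / 2) * std::sin(dLon / 2);
    return 2.0 * kEarthRadiusMeters * std::atan2(std::sqrt(a), std::sqrt(1.0 - a));
}

// Initial bearing from point 1 to point 2, in degrees clockwise from north (0..360).
double bearingDegrees(double lat1, double lon1, double lat2, double lon2) {
    double dLon = toRad(lon2 - lon1);
    double y = std::sin(dLon) * std::cos(toRad(lat2));
    double x = std::cos(toRad(lat1)) * std::sin(toRad(lat2)) -
               std::sin(toRad(lat1)) * std::cos(toRad(lat2)) * std::cos(dLon);
    return std::fmod(std::atan2(y, x) * 180.0 / kPi + 360.0, 360.0);
}

// Signed difference between the target bearing and the compass heading,
// wrapped to -180..180 degrees; the sign indicates which way to steer.
double headingError(double bearingDeg, double headingDeg) {
    return std::fmod(bearingDeg - headingDeg + 540.0, 360.0) - 180.0;
}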

5.6 Data Logger


The SD card module is also used to log the navigation data of the trip. The data consist of the current coordinates, the current target coordinates, the target angle, the current time from GPS, and the target number. These data are stored in a text file on the SD card every 2 seconds during the navigation process. The file can be read on a computer as a text file and analyzed by the user. Table 5.2 shows an example of data saved during the vehicle tests.

Table 5.2. Sample data log table after the mission.

Index  latitude    longitude   steering angle  motor speed  target lat  target lon  angle diff  bearing  heading  distance  target no  hours  minutes  seconds
1      39.769.119  30.555.053   0              50           39.769.496  30.554.758    0         329,17    0       49        1          22     11       33
2      39.769.119  30.555.053  -9              60           39.769.496  30.554.758  349         329,17  -19       49        1          22     11       33
3      39.769.119  30.555.053  -7              70           39.769.496  30.554.758  345         329,17  -15       49        1          22     11       33
4      39.769.119  30.555.053   0              80           39.769.496  30.554.758  334         329,17   -4       49        1          22     11       33
5      39.769.119  30.555.053   0              90           39.769.496  30.554.758  324         329,17    5       49        1          22     11       33
6      39.769.119  30.555.053   5              100          39.769.496  30.554.758  319         329,17   10       49        1          22     11       33
7      39.769.119  30.555.053   5              100          39.769.496  30.554.758  318         329,17   11       49        1          22     11       33
8      39.769.119  30.555.053   0              100          39.769.496  30.554.758  321         329,17    8       49        1          22     11       33
9      39.769.111  30.555.067   0              100          39.769.496  30.554.758  319         328,5     9       50        1          22     11       42
10     39.769.115  30.555.063   5              100          39.769.496  30.554.758  318         328,59   10       50        1          22     11       43
6. TEST AND RESULTS
6.1 Test Environment
The algorithms are tested on an autonomous model car. The final implementation and tests are done in a pedestrian-free test area. The test area is selected for its suitable conditions for testing the model car, and its dimensions are chosen so that the model can complete at least three loops.
The field coordinates are collected with an online mapping application [48]. The corner coordinates are processed by the path planning algorithm, the navigation points are determined, and they are then loaded to the SD card. The trajectory data are loaded to the model car by inserting the SD card.
For each test, the model car's sensor power supply is switched on and kept on until the GPS data stream is valid. After a few minutes, the motor power supply is turned on with the micro switch and, due to the program structure, the model starts to move slowly (at nearly 5 km/h) once the GPS signal is valid.
The model car is initially placed at an arbitrary point in the field, and nobody interferes with it until it finishes the job. After the model car finishes the work, the LOG file on the SD card is downloaded to a computer for analysis. The tests are performed on the test field. Figure 6.1 shows the top view image of the test field.

Figure 6.1. The bird's-eye view image of the test field

The compass sensor needs to be calibrated before each test. The compass sensor is calibrated using the calibration values 𝑋𝑐 and 𝑌𝑐. These values are calculated from the average values 𝑋𝑎 and 𝑌𝑎 of the maximum and minimum raw magnetometer readings of 𝑋 and 𝑌.
𝑋𝑎 = (𝑚𝑖𝑛(𝑋) + 𝑚𝑎𝑥(𝑋))/2    (6.1)

𝑌𝑎 = (𝑚𝑖𝑛(𝑌) + 𝑚𝑎𝑥(𝑌))/2    (6.2)

𝑋𝑐 = 𝑋 + 𝑋𝑎    (6.3)

𝑌𝑐 = 𝑌 + 𝑌𝑎    (6.4)
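A small sketch of this calibration step is given below. It tracks the extreme raw readings collected while the sensor is rotated through a full circle, forms the mid-range offsets of Eqs. (6.1)–(6.2), and applies them by subtraction, the usual hard-iron correction, before computing the heading; variable names are illustrative and the sign convention of Eqs. (6.3)–(6.4) depends on how 𝑋𝑎 and 𝑌𝑎 are defined.

#include <cfloat>
#include <cmath>

// Running extremes of the raw magnetometer X and Y readings from the calibration run.
double xMin = DBL_MAX, xMax = -DBL_MAX;
double yMin = DBL_MAX, yMax = -DBL_MAX;

void updateExtremes(double xRaw, double yRaw) {
    if (xRaw < xMin) xMin = xRaw;
    if (xRaw > xMax) xMax = xRaw;
    if (yRaw < yMin) yMin = yRaw;
    if (yRaw > yMax) yMax = yRaw;
}

// Heading in degrees from the calibrated readings.
double headingDegrees(double xRaw, double yRaw) {
    double xa = (xMin + xMax) / 2.0;        // Eq. (6.1)
    double ya = (yMin + yMax) / 2.0;        // Eq. (6.2)
    double xc = xRaw - xa;                  // centred reading, cf. Eq. (6.3)
    double yc = yRaw - ya;                  // centred reading, cf. Eq. (6.4)
    double heading = std::atan2(yc, xc) * 180.0 / 3.14159265358979323846;
    if (heading < 0) heading += 360.0;
    return heading;
}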

6.2 Test Outputs


Test results are analyzed by collecting position data for each test. In Figure 6.2, the pink points show the navigation path calculated by the path planning algorithm, and the blue dotted points show the followed coordinates collected by the GPS receiver with a one-second period.

[Chart “Back And Forth Path”: planned path and followed path, plotted as longitudes versus latitudes]

Figure 6.2. Back and forth navigation path and driven path

For the back and forth algorithm test, the model drove the planned path and covered the field. The error function is given in Figure 6.3.

[Chart “Back and Forth Path Error”: error (meters) versus position sample number]

Figure 6.3. Back and forth navigation path navigational error.

The root mean square (rms) of the error is 0.8210 meters for this test. The total path length is calculated as 360.9 meters.
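For reference, the rms value is obtained from the per-sample errors 𝑒𝑖 plotted in Figure 6.3 as 𝑒𝑟𝑚𝑠 = √((1/𝑁) ∑ 𝑒𝑖²), where 𝑁 is the number of position samples.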
The other test results are given for the spiral path planning algorithm. The outer to
inner spiral path test is given in Figure 6.4.

[Chart “Long Size Steerpoint Spiral Path”: planned path and followed path, plotted as longitudes versus latitudes]

Figure 6.4. Spiral navigation path and driven path

As seen in the figure above, the error on the longer segments of the path is greater than on the shorter segments. The error function of the spiral path is given in Figure 6.5.

[Chart “Spiral Navigation Path Error”: error (meters) versus position sample number]

Figure 6.5. Spiral navigation path navigational error

As seen in Figure 6.5, the error reaches a maximum of about 3 meters. The root mean square of the error is 1.2128 meters for this plan. The total path length is calculated as 381.1 meters. The error can be reduced by dividing the path segments into considerably smaller distances. For this purpose, the middle points' coordinates 𝑋𝑚 and 𝑌𝑚 are calculated for all navigation points on the spiral path model:

𝑋𝑚 = (𝑋𝑚−1 + 𝑋𝑚+1)/2    (6.5)

𝑌𝑚 = (𝑌𝑚−1 + 𝑌𝑚+1)/2    (6.6)
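In code, this subdivision simply inserts the midpoint of Eqs. (6.5)–(6.6) between every pair of consecutive navigation points; the C++ sketch below is illustrative only (the planner itself is a MATLAB script).

#include <cstddef>
#include <vector>

struct Point { double lat, lon; };

// Insert the midpoint of every pair of consecutive navigation points, halving the
// distance between successive targets along the path (Eqs. (6.5)-(6.6)).
std::vector<Point> halveIntervals(const std::vector<Point>& path) {
    std::vector<Point> out;
    for (std::size_t i = 0; i + 1 < path.size(); ++i) {
        out.push_back(path[i]);
        out.push_back({ (path[i].lat + path[i + 1].lat) / 2.0,
                        (path[i].lon + path[i + 1].lon) / 2.0 });
    }
    if (!path.empty()) out.push_back(path.back());
    return out;
}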

Figure 6.6 shows that dividing the path in this way yields better results compared to the earlier test.

[Chart “Spiral Path Planning using Half Sized Interval”: planned path (steerpoints) and followed path, plotted as longitudes versus latitudes]

Figure 6.6. Spiral navigation path for half sized distance points and driven path.

The navigational errors are reduced by this process; the resulting error is shown in Figure 6.7.

[Chart “Spiral Path Error”: error (meters) versus position sample number]

Figure 6.7. Spiral navigation path for half sized distance points navigational error.

The root mean square of the error is 0.79 meters for this path plan. The total path length is calculated as 381.17 meters.
6.3 Results
The study shows that the back and forth and spiral autonomous path covering methods give similar rms errors. Figure 6.8 shows the rms errors. The spiral path with long distances between navigation points has the highest error, but by dividing the path into navigation points with smaller distances, the spiral path error becomes lower than that of the back and forth path planning.

[Chart “RMS Error”, in meters: Back And Forth Path 0,81; Spiral Path Long Distance 1,21; Spiral Path Half Distance 0,79]

Figure 6.8. Error graphs for back and forth path, spiral path with long distance points and spiral path
with half distance points.

Moreover, in back and forth path planning, the turning areas are all outside the field, which results in lost time and fuel. With the spiral path, nearly all of the corners are inside the field and there is little lost time turning into the next path segment. This study shows that, due to its smaller error, the spiral algorithm can be used to enhance the efficiency of autonomous tractors.

Table 6.1. Comparison of the back and forth and spiral with half distance paths.

                              Back and forth path    Spiral path with half distance
Total Path Length             360,9 m                381,17 m
Total Number of Corners       17                     15
Number of Outer Corners       17                     4
Total Time                    306 sec                337 sec
Total Consumed Charge         56.95 mAh              62.71 mAh

In Table 6.1, the total path length of the spiral path is longer than that of the back and forth path by 5.3% (relative to the spiral path length). The consumed charge is higher for the spiral path because of the longer path. However, the total number of turns for the spiral path is less than for the back and forth path. The most important waste area is due to the outer corners, and the number of outer corners for the back and forth path is higher than for the spiral path. The test vehicle has small dimensions, so it can turn a sharp corner with a small turning radius, and the test outputs above show almost perfect turning movements in the field. A real-dimension tractor, on the other hand, would have wasted space because of its turning radius. Figure 6.9 shows the turning radius for a standard tractor.

Figure 6.9. Turn radius of a standard tractor [49].

Turning at an outer corner takes place in an area outside the field. Figure 6.10 illustrates the turning waste area for the back and forth path.

Figure 6.10. Back and forth path turning area.

The wasted areas for both path styles are shown in Figure 6.11. The dark blue area is the work area and the light blue area is the waste area.

Figure 6.11. Waste areas on spiral path (left), on back and forth path (right).

The field dimensions directly affect the waste areas of both field covering techniques. For a field with a constant horizontal dimension of 30 meters, the total waste areas with respect to the lateral dimension are shown in Figure 6.12.

[Chart “Total Waste Area”: total waste area (m²) versus lateral length of a field (meters), for the back and forth outer waste area and the spiral outer waste area]

Figure 6.12. Total waste areas for the covering techniques.

In Figure 6.12, the waste area increases with the lateral dimension of the field, and the increase is steeper for the back and forth path. Agricultural fields are considerably large areas, and the difference between the spiral path and the back and forth path grows for larger areas. One can therefore conclude that the spiral path covers the field more efficiently than the back and forth path. Because of its smaller number of turning areas, the spiral method is more usable than the back and forth method, and the use of the spiral path in an autonomous tractor provides higher path efficiency than the back and forth algorithm.

7. CONCLUSION
Studies on the autonomization of agricultural machinery will continue in the future in our country, which is an agricultural country.
In this study, back and forth and spiral covering path planning algorithms have been implemented to autonomize the operation of an agricultural machine on the field. A route that can be driven is formed from the corner coordinates of the field area given as input. Spherical coordinates are used for the field samples given in the experiments.
In the application part of the study, an embedded system is designed which drives the planned field route and records the course data so that the course can be analyzed. The designed vehicle obtains instantaneous position and direction information by means of a GPS sensor and a compass sensor. When the task data are loaded to the SD card, the direction and distance to the destination points are calculated from the loaded coordinate data. Navigation is carried out and the vehicle stops after the last navigation point.
The results of the study show that the designed vehicle is able to complete its task in an open space for both path planning types. The efficiency of the paths depends on the number of corners. It is important to calibrate the compass sensor to a low error rate. Limiting the distance between two navigation points can overcome the compass-related errors and gives higher navigation efficiency. The most important result of the study is that the number of outer corner points is the decisive factor in the time efficiency of field covering. Finally, the study shows that the spiral method is more usable than the back and forth path method, since the spiral method has a smaller wasted turning area. For these reasons, the use of the spiral path in an autonomous tractor provides higher path efficiency than the back and forth algorithm.

REFERENCES

[1] A. Rekow, T. Bell, D. M. O’Connor and D. B. Parkinson, "CDGPS BASED


IDENTIFICATION OF FAR TRACTOR," in 0-7803-4330-1/98/$10.00 1998
IEEE, 1998.

[2] E. Kayacan and E. Kayacan, "Learning in Centralized Nonlinear Model Predictive


Control: Application to an Autonomous Tractor-Trailer System," in IEEE
TRANSACTIONS ON CONTROL SYSTEMS TECHNOLOGY, VOL. 23, NO. 1,
JANUARY 2015, 2015.

[3] E. Kayacan, E. Kayacan, H. Ramon and W. Saeys, "Modeling and Identification


of the Yaw Dynamics of an Autonomous Tractor," in 978-1-4673-5769-
2/13/$31.00 ©2013 IEEE, Leuven, Belgium., 2013.

[4] Z. Guoyu, Z. Peng and Q. Junfei, "Path Planning Algorithm Based on Sub-Region
for Agricultural Robot," in 2010, Beijing 100124, China, 2010 2nd International
Asia Conference on Informatics in Control, Automation and Robotics.

[5] Y. Liu, N. Noguchi, K. Ishii and L. Liang, "Wind direction based path planning
for an unmanned airboat navigation," in Proceedings of the 2016 IEEE/SICE
International Symposium on System Integration,, Sapporo Convention Center,
Sapporo, Japan,, 2016.

[6] A. J. Bautista, S. O. Wane, F. Nario, J. L. Torres and T. E. Danao, "Development


of an Autonomous Hand Tractor Platform for Philippine Agricultural
Operations," in 2018 18th International Conference on Control, Automation and
Systems (ICCAS 2018), YongPyong Resort, PyeongChang, GangWon, Korea,
2018.

[7] L. Piyathilaka and R. Munasinghe, "Vision-only Outdoor Localization of Two-


wheel Tractor for Autonomous Operation in Agricultural Fields," in 2011 6th
International Conference on Industrial and Information Systems, ICIIS 2011,
Aug. 16-19, 2011,, Sri Lanka, 2011.

[8] H. G. Ünal, K. Saçalık, A. Gök and K. Gök, "Türkiye'deki Tarım Makinaları
Üreticilerine Farklı Bir Bakış," Tarım Makinaları Bilimi Dergisi, vol. 3, no. 18,
pp. 13-14-15, 2007.

[9] "The future of farming," Asirobots, [Online]. Available:


https://www.asirobots.com/ farming/. [Accessed 19 08 2019].

[10] "Think Outside The Cab," Bear Flag Robotics, [Online]. Available: https://
bearflagrobotics.com/#section-0. [Accessed 19 08 2019].

[11] K. Schulz and T. Anderson, "We are Electrifying," ATC, 2017,2018. [Online].
Available: https://www.autonomoustractor.com/. [Accessed 19 08 2019].

[12] "Autonomous Concept Vehicle," CASE IH, 2019. [Online]. Available:


https://www. caseih.com/northamerica/en-us/Pages/campaigns/autonomous-
concept- vehicle.aspx. [Accessed 19 08 2019].

[13] Nanalyze, "https://www.nanalyze.com," Nanalyze 2019, 30 12 2018. [Online].


Available: https://www.nanalyze.com/2018/12/autonomous-tractors-self-
driving-farm-equipment/. [Accessed 26 07 2019].

[14] C. Feuerhake, "https://www.caseih.com," 2019 CNH Industrial America LLC, 14


2 2018. [Online]. Available: https://www.caseih.com/emea/en-
za/News/Pages/2018-02-14-Case-IH-Defines-Categories-of-Autonomy - and -
Announces- Pilot- Program.aspx. [Accessed 26 7 2019].

[15] Z. Peng, Q. Junfei and Z. Hengyi, "Path Planning and Tracking for Agricultural
Master-slave Robot System," in 2010 International Conference on Computer and
Communication Technologies in Agriculture Engineering, China, 2010.

[16] Y. Yang and Y. Miao, "A Path Planning Method for Mobile Sink in Farmland
Wireless Sensor Network," in 978-1-5090-6414-4/17/$31.00 ©2017 IEEE,
Beijing, China;, 2017.

[17] H. Wang and N. Noguchi, "Autonomous Manuevras of a Robotic Tractor for


Farming," in Proceedings of the 2016 IEEE/SICE International Symposium on
System Integration,, Sapporo, Japan,, 2016.

[18] B. S. Blackmore and H. W. Griepentrog., "Autonomous Vehicles and Robotics.,"
The International Comission of Agricultural Engineering., vol. 6, pp. 204-2015,
2006.

[19] A. Stentz, C. Dima, C. Wellington, H. Herman and D. Stager, "A System for Semi
Autonomous Tractor Operations," AutonomousRobots, no. 13, pp. 87-104, 2002.

[20] O.Hachour, "Path Planning of Autonomous Mobile Robot," INTERNATIONAL


JOURNAL OF SYSTEMS APPLICATIONS, ENGINEERING &
DEVELOPMENT, vol. 2, no. 4, p. 180, 2008.

[21] E. Galceran and M. Carreras, "A Survey on Coverage Path Planning for
Robotics," Underwater Robotics Research Center, 2013.

[22] J. I. Vasquez-Gomez, M. Marciano-Melchor and J. C. Herrera-Lozada, "Optimal


Coverage Path Planning based on the Rotating Calipers Algorithm," in 2017
International Conference on Mechatronics, Electronics and Automotive
Engineering, 2017.

[23] P.-M. Hsu, C.-L. Lin and M.-Y. Yang, "On the Complete Coverage Path Planning
for Mobile Robots," in J Intell Robot Syst (2014) 74:945–963.

[24] S.Kalaivanan and R.Kalpana, "Coverage path planning for an autonomous robot
specific to agricultural operations," in 2017 International Conference on
Intelligent Computing and Control (I2C2), 2017.

[25] A. Khan, I. Noreen and Z. Habib, "On Complete Coverage Path Planning
Algorithms for Non-holonomic Mobile Robots: Survey and Challenges," J. Inf.
Sci. Eng. 2017, 2017.

[26] H. C. Moon, H. Woo, H. X. Zhe, H. K. J. Kim, Y. Kim and J. Kye, "Auto-


Guidance System for Tillage Tractor," in 2013 13th International Conference on
Control, Automation and Systems (ICCAS 2013), Daejeon, Korea, 2013.

[27] T. Makino, H. Yokoi and Y. Kakazu, "Development of a Motion Planning System


for an Agricultural Mobile Robot," in PR0001-3/99/0000-0959 $4.00 0 1999
SICE, Sapporo, Japan, 1999.

[28] J. Wang, J. Chen, S. Cheng and Y. Xie, "Double Heuristic Optimization Based on
Hierarchical Partitioning for Coverage Path Planning of Robot Mowers," in 2016

12th International Conference on Computational Intelligence and Security,
China, 2016.

[29] M. Coombes, W.-H. Chen and C. Liu, "FixedWing UAV Survey Coverage Path
Planning in Wind for Improving Existing Ground Control Station Software," in
Proceedings of the 37th Chinese Control Conference, Wuhan, China, July 25-27,
2018,.

[30] J. H. Lee, J. S. Choi, B. H. Lee and K. W. Lee, "Complete Coverage Path Planning
for Cleaning Task using Multiple Robots," in Proceedings of the 2009 IEEE
International Conference on Systems, Man, and Cybernetics, San Antonio, TX,
USA, October 2009.

[31] F. Al-Turjman and M. Biglarbegian, "Path Planning for Data Collectors in


Precision Agriculture WSNs," in 978-1-4799-0959-9/14/$31.00 ©2014 IEEE,
Guelph, ON, Canada, 2014.

[32] X. Guo, "Coverage Rolling Path Planning of Unknown Environments with


Dynamic Heuristic Searching," in 2009 World Congress on Computer Science
and Information Engineering, shenzhen, China, 2009.

[33] A. Janchiv, D. Batsaikhan, G. h. Kim and S.-G. Lee, "Complete Coverage Path
Planning for Multi-Robots Based on," in 2011 11th International Conference on
Control, Automation and Systems, Gyeonggi-do, Korea, 2011.

[34] M. Waanders, "Coverage Path Planning for Mobile Cleaning Robots," 15th
Twente Student Conference on IT, 2011, 2011.

[35] S. Dauda, D. Ahmad, A. K and a. O. J, "EFFECTS OF TRACTOR SPEEDS ON


THE PERFORMANCE OF A KENAF," International Conference on
Agricultural and Food Engineering for Life (Cafei2012), pp. 141-150, 2012.

[36] "www.magnetic-declination.com," Magnetic-Declination.com, 2018. [Online].


Available: www.magnetic-declination.com. [Accessed 26 7 2019].

[37] D. DePriest, "NMEA data," [Online]. Available: https://www.


gpsinformation.org/ dale/ nmea.htm. [Accessed 26 7 2019].

[38] Space Based Positioning Navigation & Timing, "https://www.gps.gov," Space
Based Positioning Navigation & Timing, 10 7 2019. [Online]. Available:
https://www.gps. gov/systems/gps/space/. [Accessed 26 7 2019].

[39] E. Calandrelli, "https://techcrunch.com," TheSpaceGal, 05 02 2016. [Online].


Available: https://techcrunch.com/2016/02/05/new- air- force- satellites-
launched- to-improve-gps/. [Accessed 26 7 2019].

[40] D. T. Sarıgül, "Dünya'nın Manyetik Kutupları Neden Yer Değiştirir ?," Bilim
Genç, vol. 15, p. 12, 2015.

[41] "Türkiye Hangi Paralel ve Meridyenler Arasındadır," 2019. [Online]. Available:


https://www.denkbilgi.com/ turkiye- hangi- paralel- ve- meridyenler- arasindadir.
html. [Accessed 26 7 2019].

[42] M. Eliçalışkan, " Paralel ve .meridyenler," 2014. [Online]. Available: http://www.


cografya.gen.tr/egitim/matcog/paralel-meridyen.htm. [Accessed 26 7 2019].

[43] "True-And-Relative-Bearings," [Online]. [Accessed 11 10 2019].

[44] R. B. Sutton, "The difference between heading and bearing in navigational terms,"
11 6 2011. [Online]. Available: https://diydrones.com/profiles/blogs/the-
difference-between-heading. [Accessed 28 7 2019].

[45] T. Bleything, "How to Convert Magnetometer Data into Compass Heading," 28


8 2017. [Online]. Available: https://blog.digilentinc.com/ how-to-convert-
magneto meter-data-into-compass-heading/. [Accessed 28 7 2019].

[46] C. Veness, "Calculate distance, bearing and more between Latitude/Longitude


points," 2 2019. [Online]. Available: https:// www.movable-type.co.uk/ scripts/
latlong.html. [Accessed 28 7 2019].

[47] A. Visala and T. Oksanen, "Path Planning Algorithms for Agricultural Machines,"
Agricultural Engineering International, no. 9, p. 2, 2007/June.

[48] Google, "Google maps," Google, 2005. [Online]. Available: https://www.google.


com/ maps. [Accessed 28 7 2019].

[49] "Massey Ferguson MF 6700S Brochure," 2017. [Online]. Available: http://www.


masseyferguson.com.au/mf6700s.aspx. [Accessed 14 11 2019].

