
Bachelor Degree Project in Production Engineering 2020

Carlos Jaime Mérida

SIMULATION OF AGVS IN
MATLAB
Virtual 3D environment for testing
different AGV kinematics and algorithms

Bachelor Degree Project in Production Engineering


G2E, 30 credits
Spring term 2020

Carlos Jaime Mérida

Supervisor: Bernard Schmidt


Examiner: Wei Wang

Abstract

The field of robotics is becoming increasingly important and, consequently, students need better
tools to gain knowledge of and experience with robots. The University of Skövde was interested in
developing a learning tool based on a virtual simulation of mobile robots. Although several
programmes could be used to create such a tool, MATLAB was preferred because of its strong presence
in educational institutions. The objectives were oriented towards testing different robot kinematics
in an adjustable virtual 3D environment. Moreover, the simulation needed a part in which future users
could design their own algorithms to control the AGVs. Therefore, sensors such as LIDAR were
necessary to enable interaction between the robot and the created scenario.

This project began with a preliminary study and a comparison of existing MATLAB projects and
tools. After that, the scenario and the simulation were produced. As a result, a virtual simulation has
been created in which the user can modify and adapt multiple parameters, such as the size
of the AGV, the shape of the virtual environment or the selection of forward or inverse kinematics, in
order to develop different types of algorithms. Other features, such as the type and number of sensors
as well as the SLAM conditions, can be adjusted manually. Finally, this thesis was conducted to provide
a foundation in mobile robots and to serve as a first step towards operating real robots. The simulation
also provides an easy-to-use interface that students can continue to build on through the introduction
of new applications related to image processing or more sophisticated algorithms and controllers.

Keywords

AGV, 3D simulation, virtual environment, kinematics, MATLAB, Simulink.


Certification

This thesis has been submitted by Carlos Jaime Mérida to the University of Skövde as a requirement
for the degree of Bachelor of Science in Production Engineering.

The undersigned certifies that all the material in this thesis that is not my own has been properly
acknowledged using accepted referencing practices and, further, that the thesis includes no material
for which I have previously received academic credit.

Carlos Jaime Mérida

Skövde 2020-05-19
Institutionen för Ingenjörsvetenskap/School of Engineering Science


Acknowledgements

I would like to dedicate these lines to everyone that contributed to this project and all people I met
during this Erasmus year in Skövde.

Firstly, I would like to thank the University of Skövde and my supervisor Bernard Schmidt for giving
me the opportunity to study here and for the help in developing my thesis. Moreover, I would like to
thank Rhonwen Bowen and the Study Support Centre at the university.

In addition, I wish to express my gratitude to the friends I met in Skövde and those I already knew
from Spain for all the moments we shared.

Last but not least, I am really thankful to my family, in particular my parents, for their unconditional
support and for allowing me to have this enriching experience.


Table of contents

1. Introduction ......................................................................................................................... 1

1.1. Background.............................................................................................................................. 1

1.2. Problem statement.................................................................................................................. 2

1.3. Aim and objectives .................................................................................................................. 2

1.4. Extent and delimitation ........................................................................................................... 2

1.5. Sustainability ........................................................................................................................... 3

1.6. Overview of the thesis............................................................................................................. 4

2. Approach .............................................................................................................................. 6

3. Literature review .................................................................................................................. 8

3.1. Role in education ..................................................................................................................... 8

3.2. Mobile robots’ applications with MATLAB .............................................................................. 9

3.2.1. Comparison with other software .................................................................................. 10


3.2.2. Research of similar projects .......................................................................................... 10
3.2.3. Simulink 3D Animation .................................................................................................. 11
3.3. Perception and navigation .................................................................................................... 12

3.4. Conclusions............................................................................................................................ 13

4. Theoretical frame of reference ............................................................................................ 14

4.1. Simulation.............................................................................................................................. 14

Dynamic systems: definitions ........................................................................................................ 14


4.2. AGVs ...................................................................................................................................... 15

4.2.1. Kinematic models .......................................................................................................... 15


4.2.2. Sensors and perception ................................................................................................. 18
4.2.3. Positioning and navigation systems .............................................................................. 19
5. Structure of the simulation.................................................................................................. 20

5.1. Sensors block ......................................................................................................................... 20

5.2. Control & Kinematics block ................................................................................................... 21

5.3. Visualization block ................................................................................................................. 22

6. Interact and modify the simulation...................................................................................... 24

6.1. Maze creator script ............................................................................................................... 24


6.2. 3D World Editor ..................................................................................................................... 25

6.2.1. How to develop an AGV model ..................................................................................... 26


6.2.2. Communication with Simulink ...................................................................................... 29
6.3. Stateflow ............................................................................................................................... 29

6.4. SLAM script ............................................................................................................................ 31

7. Case study .......................................................................................................................... 33

7.1. Simulation with two robots ................................................................................................... 33

7.2. SLAM validation ..................................................................................................................... 34

8. Results and discussion......................................................................................................... 36

8.1. Results ................................................................................................................................... 36

8.2. Discussion .............................................................................................................................. 38

9. Conclusions and future work ............................................................................................... 40

Future work ....................................................................................................................................... 41

References ................................................................................................................................. 42

Appendix A Work breakdown and time plan ............................................................................... 46

Appendix B Kinematics of mobile robots ..................................................................................... 48

B.1. Locomotion................................................................................................................................ 48

B.2. Kinematics .................................................................................................................................. 49

B.2.1. Robot position ..................................................................................................................... 49


B.2.2. Types of wheels ................................................................................................................... 49
Appendix C MATLAB code and Simulink files ............................................................................... 51

C.1. Maze_Creator.m ........................................................................................................................ 51

C.2. SLAM.m ...................................................................................................................................... 52

C.3. AGV_Simulation.slx .................................................................................................................... 54

C.3.1. Sensors section .................................................................................................................... 54


C.3.2. Control & Kinematics section .............................................................................................. 55
C.3.3. Visualization section ............................................................................................................ 57


List of figures
Figure 1 AGV market by region, Markets and Markets (2019) ............................................................... 1
Figure 2 Flow chart of the methodology ................................................................................................ 7
Figure 3 Differential drive robot............................................................................................................ 16
Figure 4 Tricycle drive and Ackerman steering models........................................................................ 16
Figure 5 Ackerman steering parameters Kamal Nasir (2016) ............................................................... 17
Figure 6 Swedish four-wheeled model, adapted from Hamid, Qiao & Ghaeminezhad (2015) ............ 18
Figure 7 Map-based navigation (Siegwart, Nourbakhsh & Scaramuza 2004) ....................................... 19
Figure 8 Main structure of the simulation ............................................................................................ 20
Figure 9 Sensors block ........................................................................................................................... 20
Figure 10 Control & Kinematics block (1st level).................................................................................... 21
Figure 11 Control & Kinematics block (2nd level).................................................................................. 21
Figure 12 Control block (3rd level) ......................................................................................................... 22
Figure 13 Kinematics block (3rd level) ................................................................................................... 22
Figure 14 Visualization block ................................................................................................................. 23
Figure 15 Mask of the simulation .......................................................................................................... 24
Figure 16 Example of a maze using a matrix ......................................................................................... 25
Figure 17 Structure of the virtual environment .................................................................................... 26
Figure 18 Transform node ..................................................................................................................... 26
Figure 19 Tree structure of a differential drive robot ........................................................................... 27
Figure 20 Sensor development ............................................................................................................. 28
Figure 21 Model of a differential drive ................................................................................................. 29
Figure 22 Stateflow example ................................................................................................................. 30
Figure 23 Rotation of the Z-axis ............................................................................................................ 30
Figure 24 Stateflow inputs and outputs ................................................................................................ 31
Figure 25 LIDAR parameters .................................................................................................................. 32
Figure 26 Loop closure parameters ....................................................................................................... 32
Figure 27 Robot persecution ................................................................................................................. 33
Figure 28 Final built map and occupancy map ...................................................................................... 34
Figure 29 Real map ................................................................................................................................ 35
Figure 30 Kinematic models .................................................................................................................. 36
Figure 31 AGV sensors........................................................................................................................... 37
Figure 32 Panoramic view ..................................................................................................................... 37
Figure 33 Dynamic perspectives of the simulation ............................................................................... 38


Figure 34 Basic types of wheels ............................................................................................................ 48


Figure 35 Swedish wheel ....................................................................................................................... 48
Figure 36 Global and local reference frame .......................................................................................... 49
Figure 37 Wheel parameters ................................................................................................................. 50
Figure 38 AGV Simulation file ................................................................................................................ 54
Figure 39 Sensor part ............................................................................................................................ 55
Figure 40 Kinematics and control part with scales block ...................................................................... 55
Figure 41 Inverse kinematics ................................................................................................................. 56
Figure 42 Forward kinematics ............................................................................................................... 56
Figure 43 Visualization part ................................................................................................................... 57


List of tables
Table 1 Description of the chapters ........................................................................................................ 4
Table 2 Matrix decision of MATLAB tools ............................................................................................. 11
Table 3 Algorithms used in perception subsystem ............................................................................... 12
Table 4 Guidance system comparison. Lynch et al. (2018) ................................................................... 12
Table 5 Original time plan ..................................................................................................................... 46
Table 6 Updated time plan .................................................................................................................... 47

List of abbreviations

AGV Automated Guided Vehicle

GUI Graphical User Interface

ICR Instantaneous Centre of Rotation

IT Information Technology

LIDAR Light Detection and Ranging

MARS Multi-Agent Robot Simulator

MCL Monte Carlo Localization

PID Proportional–Integral–Derivative

RoW Rest of World

SLAM Simultaneous Localization and Mapping

STEM Science, Technology, Engineering, Mathematics

SVM Support Vector Machine

US Ultra-sonic

USARSim Urban Search and Rescue Simulation

URDF Unified Robot Description Format


1. Introduction

This chapter aims to give the reader a first approximation of the project by explaining its main
aspects: the importance of Automated Guided Vehicles (AGVs) in industry, the problem description
and the aims of the thesis. In addition, the delimitations and an evaluation from a sustainability point
of view are presented.

1.1. Background

Thanks to technological development, automation has been increasing in many areas of industry,
allowing more tasks to be performed autonomously and improving working conditions. Consequently,
employees are principally in charge of supervising the machines. One example of this is the
implementation of AGVs in production lines and factories; they are commonly used to handle
workpieces between stations easily, flexibly and safely.

Figure 1 shows the growth of the AGV market by region from 2016 to 2024. Investment increases
over the years at a similar pace in each part of the world, and the prediction for 2024 is almost double
the figure for 2016. Note that Rest of World is abbreviated as RoW in the figure.

[Bar chart: AGV market size in USD billion (vertical axis, 0–2.5), per year 2016–2024, for North America, Europe, Asia-Pacific and RoW]

Figure 1 AGV market by region, Markets and Markets (2019)

On the other hand, before introducing this type of vehicle into a real factory, a prior virtual
simulation is highly recommendable in order to test a company's working conditions with different
AGV systems and to decide, for example, on the number of vehicles in the process, the appropriate
paths or how they are to be guided.


1.2. Problem statement

This project has been proposed by the University of Skövde to create a virtual 3D environment where
different models of AGVs can move around a maze with the aid of Light Detection and Ranging
(LIDAR) sensors to avoid collisions. The programme must allow the user to select and display the
typical mobile configurations, namely differential drive, tricycle drive, Ackerman steering, and a
four-wheeled model with Swedish (also called Mecanum) wheels. Therefore, four
different robot models need to be built.

Combining MATLAB and Simulink tools such as Robotics System Toolbox, Simulink 3D Animation and
Stateflow is one of the most important challenges of this thesis, because these tools are required to
create the simulation.

Furthermore, this programme is intended to be used in courses such as Autonomous
Robots/Systems to support and guide students in understanding the kinematics of mobile robots by
simulating a specific model and modifying its parameters, such as the velocity or size of the
AGV. In addition, students can use this software package to apply their own algorithms to control the
AGVs and to modify the virtual 3D environment.

To conclude, computer tools are increasingly employed and a wide array of software is available; the
MATLAB package is one of the most popular among students. It has been chosen as the simulator
in this project because it is frequently used for programming in educational institutions;
consequently, users do not need to learn many new functionalities to handle the
simulation.

1.3. Aim and objectives

The main aim of the thesis is to design an interactive simulation as a learning tool for gaining
knowledge about AGVs through MATLAB and Simulink features. In order to achieve this aim, the
following objectives were defined:

• Select and modify different kinematic models of AGVs

• Visualise AGVs in a virtual 3D environment with editable obstacles

• Simulate sensors such as LIDAR to avoid collisions

• Create algorithms for robot navigation

• Set up simulation/apply own algorithms to test different solutions
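To make the first objective concrete, the core of a differential-drive kinematic model is a small pose update: the two wheel speeds determine a linear and an angular velocity, which are integrated over time. The sketch below illustrates this in Python (the thesis itself works in MATLAB/Simulink); the function name and parameters are illustrative, not taken from the thesis code.

```python
import math

def diff_drive_step(x, y, theta, v_l, v_r, track, dt):
    """One Euler integration step of a differential-drive pose.

    v_l, v_r: left/right wheel linear velocities; track: distance between wheels.
    """
    v = (v_r + v_l) / 2.0          # linear velocity of the robot centre
    omega = (v_r - v_l) / track    # angular velocity about the ICR
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Equal wheel speeds -> straight-line motion along the current heading
x, y, th = 0.0, 0.0, 0.0
for _ in range(100):
    x, y, th = diff_drive_step(x, y, th, 1.0, 1.0, 0.5, 0.01)
print(round(x, 3), round(y, 3), round(th, 3))  # → 1.0 0.0 0.0
```

Unequal wheel speeds give a nonzero `omega`, so the same update makes the robot follow a circular arc around its Instantaneous Centre of Rotation (ICR); the tricycle and Ackerman models differ only in how `v` and `omega` are derived from the steering inputs.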

1.4. Extent and delimitation

There are some delimitations in the development of the programme. Firstly, dynamic
parameters such as the torques or forces of the robot and its interaction with the environment are not
considered. The simulation focuses only on the kinematic equations of the robot configurations to


obtain the proper movements according to each kinematic model. The movements are applied to the
whole robot; wheels do not rotate during a simulation.

Secondly, the representations of the different types of AGV models are approximations of real
models built from simple 3D shapes such as cylinders, spheres and boxes, because a simple mobile
robot is enough to show the kinematic behaviour. However, due to the modelling complexity of Swedish
wheels, these have been modelled as simple wheels (cylinders). Moreover, the accuracy of the robot
models depends on the 3D editor used.

Regarding sensors, LIDAR sensors are implemented virtually as lines in the object editor; when
these lines collide with an obstacle, they provide information about the distance or position, like a real
sensor.

Finally, the virtual environment does not represent a real factory layout; it is an editable 3D maze
in which the behaviour of the AGVs can be examined while they operate in the scenario.

1.5. Sustainability

According to the World Commission on Environment and Development (1987), sustainability is defined
as: "Sustainable development is development that meets the needs of the present without
compromising the ability of future generations to meet their own needs". Moreover, the Triple P
concept of People, Planet and Profit (in 2002 "Profit" was replaced by "Prosperity") is also included in
the definition. Nevertheless, this definition takes a rather anthropocentric perspective; the ecological
view is essential to consider by including the Precautionary Principle, which states that environmental
measures have to be adopted when scientific evidence shows that harmful consequences can occur.

A definition that embraces all the terms above would be the following: "Without compromising usual
design criteria such as costs, appearance and quality, all environmental impacts of a design outcome
throughout the complete life cycle should be taken into consideration, including social and economic
well-being, such as ethics and the environment" (Morris et al. 2007, 135-142).

The three perspectives considered in the following (economic, social and ecological) come from the
Triple P concept, replacing "People" with social, "Planet" with ecological and "Prosperity" with
economic. A growing number of people consider sustainability a requirement or order winner when
purchasing products; accordingly, companies seek to improve and refocus their processes not only for
their public image but also to minimize the resources involved. Consequently, simulation has become
an excellent method, linked to sustainability, for developing more sustainable goods and services,
while mobile robots serve as a tool to boost factory performance.

From an economic perspective, creating a simulation considerably reduces the investment required
to build a real system. There is no need to purchase a real AGV to demonstrate specific kinematics
and its behaviour inside an existing layout. Furthermore, many algorithms, as well as robot shapes,
can be applied, and different layouts can be designed, all without any cost. Moreover, although AGVs
sometimes require a great investment, they enhance the automation of a factory, reducing lead time
and work in progress between processes; hence, with correct planning a company can substantially
reduce costs.


Regarding social sustainability, this project aims to reinforce the learning and knowledge of mobile
robots. Therefore, not only students benefit from this thesis, but also anyone who wants to learn
how to control mobile robots in a virtual environment with MATLAB. Where physical mobile
robots are concerned, they improve labour conditions with respect to the Three D's of Industry,
which represent the dirty, the dull and the dangerous, because AGVs are usually used to pick and
place objects (Shea, 2016).

Last but not least, regarding the ecological aspect, a virtual simulation generally requires no physical
resources beyond the equipment needed to run it, such as a computer. Furthermore, applying mobile
robots optimises factory efficiency and generates less wastage; besides that, they do not produce CO2
emissions because they commonly run on electric batteries. Consequently, simulations and AGV
applications have a minimal environmental impact.

In conclusion, the social impact can vary depending on the purpose of the simulation, while AGVs
decrease labour risks between operators and machines. In any case, economic and ecological
sustainability are always present, thanks to the reduction of energy, natural resources and expenses.

1.6. Overview of the thesis

To conclude the introduction, the structure of the thesis is explained in this section. Table 1 gives a
brief description of each chapter of the project.

Table 1 Description of the chapters

Chapter 1: Introduction — Gives a general description of the thesis, including the problem statement and objectives, among other things

Chapter 2: Approach — Describes the methodology followed during the development of the project, which uses a waterfall model

Chapter 3: Literature review — Reviews the literature to show the role of robotics in education, current technological advances in mobile robots such as sensors and navigation systems, and applications using MATLAB

Chapter 4: Theoretical frame of reference — Covers the concepts and knowledge required to understand the project and its implementation, including the definition of simulation, the description of the robot models and their kinematic equations, and an introduction to sensors and navigation systems

Chapter 5: Structure of the simulation — Describes the structure of the simulation in Simulink

Chapter 6: Interact and modify the simulation — Introduces how to modify the simulation parameters and create new content

Chapter 7: Case study — Includes two experiments to test the simulation programme and check the objectives

Chapter 8: Results and discussion — Collects the results and discusses them

Chapter 9: Conclusions and future work — Summarises the main results of the simulation and its challenges, as well as future work

Appendix A: Work breakdown and time plan — Illustrates the time flow during the development of the thesis

Appendix B: Kinematics of mobile robots — Describes the fundamental concepts of mobile robot kinematics and how the equations are used and defined

Appendix C: MATLAB code and Simulink files — Collects the code of the MATLAB scripts and screenshots of the Simulink systems


2. Approach

A thesis is a combination of a written report and an argument to be defended at a viva voce
(Oates, 2006). Hence, an appropriate methodology should be chosen depending on the type of
research. This chapter explains the methodology followed during the development of the thesis,
namely Design and Creation. According to Oates (2006), this thesis can also be considered an
Information Technology (IT) application developed as a vehicle for something else, since the goal is to
create a simulation tool for students to learn about mobile robots, including designing their own
virtual environments and algorithms. The research here focuses on finding the best MATLAB tools to
accomplish the objectives.

The process consists of five steps: awareness, suggestion, development, evaluation
and conclusion (Vaishnavi & Kuechler, 2004, cited in Oates, 2006). Therefore, the simulation can be
improved step by step by adding more features.

1. Awareness: firstly, the problem must be identified; here, it is determined in the
Introduction. It corresponds to the need to support students in learning about mobile robots
and to let them practise by testing different features of AGVs.

2. Suggestion: the next step is to propose a possible solution to the problem. This includes
the literature review, which contrasts different projects with a special focus on MATLAB
projects, and the theoretical frame of reference, which builds knowledge about mobile
robots, while considering the objectives of the thesis. The proposed solution is a 3D simulation
tool in which students can experiment with different AGVs and set their preferences for the
scenery and the control algorithm.

3. Development: in this stage, the programme is divided into three main parts: the 3D environment and the robot models created with Simulink 3D Animation, the implementation of the kinematic equations, and the control part built with Stateflow. Moreover, editable parameters are added to cover the specific functionalities mentioned in the Problem Statement, making the simulation environment friendly and intuitive.

4. Evaluation: this step examines whether the simulation fulfils all the requirements and checks that the performance of the AGVs matches what was implemented in the previous stage. If the results are not validated, the problem could lie in the definition of the algorithms if the AGV moves correctly but does not navigate the virtual environment properly. Moreover, the equations would not have been applied satisfactorily if the AGVs do not move according to their kinematics. Finally, the errors could lie in the design stage if the objects do not have the proper size or appearance.

5. Conclusion: this determines the results obtained from the project once they are validated, and the possible future work is defined in order to improve the simulation tool achieved during the thesis and add more features to it.


When it comes to describing the system methodology, the waterfall model is employed. This method implies that each stage has to be completed before the next stage begins. However, the whole programme may need some iterations to be concluded. Figure 2 shows the steps to fulfil the simulation requirements:

[Figure: Start → Formulate the problem and identify its requirements (Awareness) → Research about 3D robotics MATLAB projects (Suggestion) → Design the virtual environment and the kinematic models → Implement kinematic equations → Apply navigation algorithms (Development) → Evaluate the results (Evaluation); if the results are not the expected ones, iterate; otherwise → Conclusion → End]

Figure 2 Flow chart of the methodology


3. Literature review

In this chapter, relevant literature related to the project is discussed: on the one hand, how robotics can be a teaching platform for enhancing students' interest in this field, and mobile robot applications using MATLAB tools; on the other hand, a review of the sensors and navigation systems most used in AGVs.

3.1. Role in education

There are numerous projects about how robotics can improve students' capabilities and knowledge. For instance, Yong-de et al. (2013) focused on the importance of encouraging students' innovative thinking through teaching applications that stimulate and develop their ideas and creativity. Furthermore, interest is promoted not only at higher levels but across all stages of education. The authors explained the benefits of these types of tools, such as achieving more active learning, experimenting with the theoretical knowledge acquired in lessons and becoming familiar with a continuously growing area such as robotics.

On the other hand, according to Carpin et al. (2007), because of the decrease in engineering students, the authors developed software called Urban Search and Rescue Simulation (USARSim) to be utilized as a support for competitions such as RoboCup, a robotics competition in which thousands of students around the world take part. Moreover, this simulation tool was intended to foster robotics among young people and to let the authors establish the programme as an educational tool, releasing examples and free code to support learning.

Referring to other tools developed using MATLAB, a website was designed to simulate fixed-robot kinematics as well as dynamic aspects in order to implement paths and motion control. Once the parameters are determined, a virtual simulation is shown to validate the model; in this way, the authors achieved better involvement of students in grasping robotic concepts (Sengupta, Jain & Kumar, 2013). Another case was the creation of a virtual lab because, from the authors' point of view, students do not know all the potential risks of laboratories that do not have all the necessary equipment. Thus, they developed a virtual lab for research on industrial robots, including experiments of different difficulties (Li, Fu & Wang, 2018).

In conclusion, due to the popularity of Science, Technology, Engineering and Mathematics (STEM) in education, many projects have been elaborated to propose possible tools such as the examples above. Considering that the field of robotics is currently booming, these projects are very important for students to gain a better understanding.


3.2. Mobile robots’ applications with MATLAB

MATLAB has become one of the most widely used software packages due to its many available features. When it comes to handling mobile robots, it provides tools for designing algorithms, creating virtual environments and avoiding collisions, among other functionalities.

First of all, starting at a basic level, Gonzalez, Mahulea & Kloetzer (2015) presented a free programme to introduce the basics of mobile robots such as path planning, motion control and modelling; using a Graphical User Interface (GUI), it is possible to observe the results obtained in detail. For path planning, two methods can be used, cell decomposition and road map, over a map previously designed by the user. In terms of kinematics, differential drive or Ackermann steering can be implemented, and dynamic parameters can also be selected to modify the motion.
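The cell-decomposition idea mentioned above reduces path planning to a graph search over free cells. The following sketch (in Python purely for illustration, since the tools discussed here are MATLAB-based; the occupancy grid is hypothetical) finds a shortest route on such a grid with breadth-first search:

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Breadth-first search on a 4-connected occupancy grid.
    grid[r][c] == 1 marks an obstacle cell; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:      # walk the parent chain back to start
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

# 0 = free, 1 = obstacle; the robot must go around the wall in the middle row.
world = [[0, 0, 0, 0],
         [1, 1, 1, 0],
         [0, 0, 0, 0]]
route = bfs_path(world, (0, 0), (2, 0))
```

Because BFS expands cells in order of distance, the first time the goal is reached the route is guaranteed to be a shortest one on the grid.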

Other projects include the Multi-Agent Robot Simulator (MARS), a simulator for experimenting with mobile robots built with Lego pieces. MATLAB was selected because students could easily design their algorithms to control one or more robots and design and test games or tasks. Examples include avoiding collisions, competing with other robots, Simultaneous Localization and Mapping (SLAM) or path planning in structured or unstructured 2D workspaces, which could be previously created by the user. Moreover, controllers could be centralized or not; this means that the robots can be controlled by one agent or control themselves, taking into account only their own simulation parameters. Although both holonomic and non-holonomic robots could be simulated, the latter type was implemented as unicycles (Casini & Garulli, 2016).

Secondly, once the basic concepts have been acquired thanks to projects such as those mentioned above, the next step would be tools such as a simulator to test mobile robots before the real implementation. Frontoni et al. (2006) added an algorithm to navigate utilizing Monte Carlo Localization (MCL), through global localization and robot position tracking. MCL is a method based on a probability density over a discrete representation of a map; the sensors in the test were sonar sensors due to their low price. In addition, laser-guided AGV implementations are also common (Xu et al. 2019). In this case, the work environment needs to be equipped with laser reflectors so that the position can be determined with a LIDAR laser through the intensity of the reflections; this method is very popular due to its simplicity and accuracy. Nevertheless, the algorithms must not take too much time while the robot is moving because the robot could otherwise estimate an incorrect position.
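The MCL idea can be illustrated with a toy one-dimensional example (Python, purely illustrative; all numbers are hypothetical and this is not the cited authors' implementation): a set of particles represents the probability density over positions, is propagated with the commanded motion, reweighted by how well a range measurement to a wall matches each hypothesis, and resampled:

```python
import math
import random

random.seed(0)

WORLD = 10.0        # corridor length; a wall sits at x = WORLD (hypothetical)
TRUE_POS = 2.0

# Initial belief: uniform over the corridor.
particles = [random.uniform(0.0, WORLD) for _ in range(2000)]

for _ in range(25):
    TRUE_POS += 0.2                                        # the robot drives forward
    # Motion update: move every particle, with a little process noise.
    particles = [p + 0.2 + random.gauss(0.0, 0.05) for p in particles]
    measured = WORLD - TRUE_POS                            # ideal range to the far wall
    # Measurement update: Gaussian likelihood of the range for each particle.
    weights = [math.exp(-(((WORLD - p) - measured) ** 2) / (2 * 0.1 ** 2))
               for p in particles]
    # Resample in proportion to the weights.
    particles = random.choices(particles, weights=weights, k=len(particles))

estimate = sum(particles) / len(particles)   # converges towards TRUE_POS
```

The estimate (the particle mean) tracks the true position even though the filter started with no idea where the robot was, which is the "global localization" behaviour the paragraph above describes.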

Finally, Pandey & Ramakrushna (2014) developed a navigation system based on fuzzy logic and an artificial neural network to improve the movements of mobile robots. It was designed to reach successful results in workspaces full of hurdles, in which complex path planning was required to avoid them and to seek the goal position. The fuzzy controller has as inputs the distances between the obstacles and the robot sensors; the output variable is the steering angle of the robot. Consequently, the robot could reach the goal position moving along a smooth trajectory; however, in the simulation it was only allowed to move in the forward direction.
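A minimal sketch of the fuzzy-logic idea just described (Python for illustration; the membership functions, set names and angle values are hypothetical, not the cited authors' design) maps an obstacle distance to a steering angle through two rules and weighted-average defuzzification:

```python
def mu_near(d):
    """Triangular 'near' membership: fully near at 0 m, gone beyond 1 m."""
    return max(0.0, 1.0 - d / 1.0)

def mu_far(d):
    """Complementary 'far' membership."""
    return min(1.0, d / 1.0)

def steering_angle(distance, turn_hard=45.0, go_straight=0.0):
    """Two rules: IF near THEN turn hard; IF far THEN go straight.
    The crisp output is the membership-weighted average of the rule outputs."""
    w_near, w_far = mu_near(distance), mu_far(distance)
    return (w_near * turn_hard + w_far * go_straight) / (w_near + w_far)

angle_close = steering_angle(0.2)   # obstacle close -> strong turn
angle_clear = steering_angle(2.0)   # path clear -> straight ahead
```

Because the memberships blend smoothly, the commanded angle changes gradually with distance, which is what produces the smooth trajectories mentioned above.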


3.2.1. Comparison with other software

There are some well-known programmes, such as Gazebo or Unity, for creating accurate 3D simulations. Gazebo is more focused on robot control, while Unity enables creating environments with a view to developing videogames. Conversely, although MATLAB is commonly regarded as a powerful calculating tool (its name comes from Matrix Laboratory), it has plenty of functionalities and tools to consider when building a virtual environment in which the behaviour of a robot can be analysed and controlled. Moreover, as the previous section shows, MATLAB allows designing detailed simulations to be used by students. Its detectable disadvantage is the poor graphic quality when rendering complex material textures.

Apart from that, software such as Gazebo or Unity requires more programming knowledge than future users might have. Moreover, MATLAB and Simulink are very popular among students due to their easy-to-use environment, their utilization in universities and their great variety of features, which makes it possible to work in many scientific areas.

3.2.2. Research of similar projects

The main research question of the thesis is which MATLAB 3D editor is the best for developing the virtual environment as well as the robot models according to the aims of the project. Therefore, some video tutorials and examples from the Simulink library were analysed.

The first project was Robotics Playground, developed by the MathWorks Student Competitions Team (2020). This project was designed entirely using Simscape Multibody, both for a modifiable scenario and for a robot model, which was a differential drive AGV. Simscape Multibody is a very useful tool for emulating real systems or creating very detailed simulations, because several options regarding joints, torques, forces and more parameters are provided in order to obtain a more accurate model. Furthermore, there is a video (Avendaño, 2018) which shows some functionalities and the results obtained with this app.

Secondly, Seshadri (2014) recorded a video explaining a virtual simulation of a real AGV model, built in order to test its behaviour under an algorithm the author developed. The difference from the previous project is that the virtual environment was designed using Simulink 3D Animation with 3D World Editor, while the AGV model was imported from a Unified Robot Description Format (URDF) file using Simscape Multibody. This format permits collecting numerous characteristics of the AGV from a CAD file, making the design of the robot easier. Nevertheless, URDF files for the kinematic prototypes required for this thesis were not available because they usually belong to companies, and building them would take too much time. In addition, the visualization of the simulation was implemented with Simulink 3D, so the graphic quality was lower than in the first project.

Finally, other examples such as Differential Wheeled Robot in a Maze (The MathWorks, Inc. 2019) were built using only Simulink 3D Animation. Therefore, their scenarios and AGV models were included in the same 3D file. Furthermore, many parameters and conditions did not need to be included, for instance gravity, torques or friction coefficients. Thus, the implementation is easier, achieving enough accuracy to obtain a suitable simulation environment. The graphic quality is the same as in the previous example; however, the model shape is less complex.

Consequently, several options were considered for developing the simulation; through a Pugh evaluation matrix as a selection method, the advantages and disadvantages of each approach are clearly shown in Table 2.

Table 2 Matrix decision of MATLAB tools

Key Criteria           | Simscape Multibody | Simscape Multibody & Simulink 3D Animation | Simulink 3D Animation
Ease of Implementation | -                  | -                                          | +
Ease of Use            | -                  | 0                                          | +
Graphical Quality      | +                  | 0                                          | 0
Sum of Positives (+'s) | 1                  | 0                                          | 2
Sum of Zeros (0's)     | 0                  | 2                                          | 1
Sum of Negatives (-'s) | 2                  | 1                                          | 0
Net Score              | -1                 | -1                                         | 2
Rank                   | 3                  | 3                                          | 1

As a result of this evaluation method, Simulink 3D Animation is the best tool to implement the
simulation and develop the scenario and robot models because of its simplicity.
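The net scores in Table 2 follow mechanically from the +/0/- marks; a small sketch (Python, purely for illustration) reproduces the calculation:

```python
# Pugh evaluation: each concept scores +1, 0 or -1 per criterion;
# the net score is the sum, and the highest net score ranks first.
# Marks per tool, in the order: Ease of Implementation, Ease of Use,
# Graphical Quality (taken from Table 2).
scores = {
    "Simscape Multibody":                         ["-", "-", "+"],
    "Simscape Multibody & Simulink 3D Animation": ["-", "0", "0"],
    "Simulink 3D Animation":                      ["+", "+", "0"],
}
value = {"+": 1, "0": 0, "-": -1}
net = {tool: sum(value[mark] for mark in marks) for tool, marks in scores.items()}
best = max(net, key=net.get)   # the top-ranked concept
```

Running this yields the net scores -1, -1 and 2 of Table 2, confirming Simulink 3D Animation as the top-ranked tool.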

3.2.3. Simulink 3D Animation

Once this internal tool was selected, the next step was to research the three editors available to create
and render the virtual simulation. These were: MATLAB Editor, V-Realm Builder, and 3D World Editor.

First of all, MATLAB Editor was the most challenging and arduous to use; however, it permitted modifying details easily and quickly because there was no need to open the virtual environment.

The most intuitive editor was V-Realm Builder; it allowed designing different shapes and changing their appearance easily (Echavez Morales, 2012). Despite that, sensors could not be included. Consequently, 3D World Editor was the chosen editor; besides, it was utilized in most of the MATLAB and Simulink examples.

To conclude, Simulink 3D Animation with 3D World Editor is the key toolset when it comes to developing the whole virtual simulation.


3.3. Perception and navigation

When it comes to discussing sensors, LIDAR sensors, as well as cameras and radars, are the most frequently utilized in AGVs; nevertheless, Ultra-Sonic (US) sensors and contact sensors are implemented for more specific tasks. Furthermore, a combination of different types of sensors is also common, to obtain more information about the scenario (Ilas, 2013).

Table 3, which is based on Ilas (2013), depicts the most common approaches in each stage of the perception subsystem.

Table 3 Algorithms used in perception subsystem (Ilas 2013)

Perception stages | Popular Algorithms
Detection         | Background subtraction and segmentation
Tracking          | Kalman filter
Classification    | Support Vector Machine (SVM)

Lynch et al. (2018) claimed that "the sensors used by an AGV to navigate are determined by the navigation system employed by the unit". In the industrial sector, the two most popular sensors are the laser scanner and the magnetic guide.

In order to compare and analyse the different navigation methods, Table 4 shows the four main types: Laser Guidance, Magnetic Spot, Line Following and Barcode Guidance, evaluated against the following criteria.

Table 4 Guidance system comparison. Lynch et al. (2018)

Criteria             | Laser Guidance | Magnetic Spot | Line Following | Barcode Guidance
Cost of Installation | High           | Low           | Low            | Low
Ease of Installation | High           | High          | High           | High
Complexity           | High           | Low           | Low            | Low
Flexibility          | High           | Medium        | Low            | Medium
Deviate from routes  | Yes            | Yes           | No             | No
Efficiency           | High           | Medium        | Low            | Medium
Ease of expansion    | High           | High          | Low            | High


As a consequence of this evaluation, Laser Guidance can be considered the best method due to its flexibility and efficiency, despite requiring high investment and complexity. On the other hand, Magnetic Spot could be the best-balanced solution for a company starting to implement AGVs in its processes because of the price, taking into account that it performs well with regard to deviating from routes, ease of expansion and ease of installation.

3.4. Conclusions

In closing the review of these conference papers and projects, it is demonstrated that robotics, one of the pillars of current technology, is an increasingly attractive sector for educational institutions from schools to universities thanks to the development of simulators. MATLAB has been utilized in most cases and, because of its versatility, assists in designing general programmes to test multiple functionalities or just to experiment with new specific methods and tools for mobile robots. Research has mainly focused on navigation algorithms and perception systems in order to improve robot movements and their interaction with the surrounding environment, reaching superior safety levels as well as higher performance. Regarding perception systems, LIDAR is the most common sensor, and Laser Guidance is the best navigation method despite its implementation cost.


4. Theoretical frame of reference

This fourth chapter includes all the concepts and definitions related to the simulation. Furthermore, the different kinematics of mobile robots are explained with their equations, as well as the sensors and the perception of the environment; finally, navigation systems are mentioned.

4.1. Simulation

The first aspect to explain is what a simulation is. A simple explanation could be "the imitation of the operation of a real-world process or system over time" (Banks et al., 2014). Nowadays, the simulation field is a leading sector and one of the nine technologies of Industry 4.0.

According to Stewart (2014), there are four types of simulation methods. The most common is focused on discrete events due to its use in abstract operation systems such as manufacturing lines, restaurants and even hospitals. In addition, Monte Carlo simulation is based on calculating probabilities of the subject of study to determine the outcome. Regarding the other methods, agent-based simulation is oriented to studying interactions between individual entities and their behaviour over time. The last method is called system dynamics; it is a methodology to understand complex behaviours that involve dependencies, feedback and delays in the process.

With regard to this thesis, the simulation that will be created with MATLAB can be considered a continuous simulation: it reacts to initial conditions and input parameters. This simulation can be seen as a dynamic system because it uses first-order differential equations for the velocity and the position of the robot model.

Dynamic systems: definitions

Some concepts must be introduced to obtain a complete understanding of this topic.

Firstly, a system is a set of components that interact with each other; thus, when one of them changes, it affects the states of the others. Moreover, modelling means obtaining linear mathematical relations, differential and algebraic equations, between parts of the system. Regarding differential equations, the derived variables correspond to the dynamic variables; besides, the order of the system is the highest derivative in the equations, and the majority are first- or second-order systems.

Secondly, as said above, systems react to initial conditions, which are the initial values of the dynamic variables; to perturbations, which are external factors that occur arbitrarily; and to input signals, which are the independent variables of the differential equations. Finally, the response, or behaviour, of the system can be obtained.
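As a minimal illustration of such a first-order dynamic system (Python for illustration only; the values of the input, the time constant and the step size are hypothetical), consider a velocity v that follows a commanded input u with time constant tau, integrated with the explicit Euler method, while the position x is the integral of v:

```python
# Forward-Euler integration of a first-order dynamic system:
# dv/dt = (u - v) / tau   (velocity follows the input with a lag)
# dx/dt = v               (position is the integral of the velocity)
def simulate(u=1.0, tau=0.5, dt=0.01, t_end=5.0):
    x, v = 0.0, 0.0                  # initial conditions
    for _ in range(int(t_end / dt)):
        v += dt * (u - v) / tau      # velocity update
        x += dt * v                  # position update
    return x, v

x_end, v_end = simulate()
```

After a few time constants, v settles at the input value u, and x grows at that rate, which is exactly the initial-condition/input/response behaviour described above.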


4.2. AGVs

AGVs are mainly used to transport a wide variety of products and to pick and place them. They are considered logistic tools in industry due to the advantages they offer in relation to material handling and safety. Furthermore, more and more companies are designing their own AGVs, for instance in order to clean their factories.

Due to the great variety of models, AGVs can work in many areas to boost process performance while increasing safety and accuracy. Applications include (MHI, 2012):

• Material handling: transporting raw material, pallets and unfinished goods between workstations, containers or finished products, including storage. This is the most common use
• Cleaning: companies are developing their own AGVs for cleaning purposes
• Staff transportation: in this case, AGVs constitute an autonomous car inside the factory

To conclude the introduction, this section explains the different kinematic models, according to Siegwart, Nourbakhsh & Scaramuza (2004) and other sources. Nevertheless, Appendix B Kinematics of mobile robots describes more preliminary concepts and equations to reach a better and deeper understanding of kinematics.

4.2.1. Kinematic models

“Kinematics is the most basic study of how mechanical systems behave” (Siegwart, Nourbakhsh & Scaramuza, 2004, p. 47). For mobile robots, the wheels play the main role in designing and understanding the kinematics. With that in mind, the wheels together with the chassis determine the robot's motion as well as impose its constraints.

One of the objectives of this thesis is to implement different kinematics such as differential drive, tricycle drive and Ackermann steering. Furthermore, a model with four Swedish wheels is implemented. These configurations are described in the following points.

Forward and inverse kinematics have been implemented in the simulation. The former describes the movement of the entire robot from the velocities of the wheels; the latter is the inverse: the velocities of the wheels are calculated from a previously defined movement of the robot.

• Differential drive

This first model uses two fixed standard wheels that drive the motion, with one or two optional wheels. These can be steered standard, castor or spherical wheels; in other words, they have to be omnidirectional, and their aim is to keep the balance of the robot. The conditions 𝛼 = 𝜋/2, where 𝛼 is the angle between P and the wheel, and 𝛽 = 0, where 𝛽 represents the angle between the wheel plane and the robot chassis, are applied for the left wheel (see Figure 3); the right wheel is analogous. The resulting forward kinematic model is equation (1) (Siegwart, Nourbakhsh & Scaramuza, 2004).


Figure 3 Differential drive robot (local frame 𝑋𝑅–𝑌𝑅 centred at 𝑃; left wheel at distance 𝑙, with 𝛼 = 𝜋/2 and 𝛽 = 0)

$$
\dot{\xi}_I = R(\theta)^{-1}
\begin{bmatrix} 1 & 0 & l \\ 1 & 0 & -l \\ 0 & 1 & 0 \end{bmatrix}^{-1}
\begin{bmatrix} r\dot{\varphi}_r \\ r\dot{\varphi}_l \\ 0 \end{bmatrix}
\;\Rightarrow\;
\begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{\theta} \end{bmatrix} =
\begin{bmatrix}
\frac{r\cos\theta}{2}\,(\dot{\varphi}_r + \dot{\varphi}_l) \\
\frac{r\sin\theta}{2}\,(\dot{\varphi}_r + \dot{\varphi}_l) \\
\frac{r}{2l}\,(\dot{\varphi}_r - \dot{\varphi}_l)
\end{bmatrix}
\qquad (1)
$$
Here the first matrix, 𝜉𝐼̇, contains the velocities in the global axes (𝑥̇, 𝑦̇) and the angular velocity 𝜃̇. Furthermore, 𝑅(𝜃)−1 is the inverse of the rotation matrix, 𝑟 is the radius of the wheels, 𝑙 is the distance from P to the centre of each wheel, and 𝜑̇𝑟 and 𝜑̇𝑙 are the angular velocities of the right and left wheels.

The inverse kinematics is defined in equation (2), where 𝑉 is the modulus of the velocity.

$$
\begin{bmatrix} \dot{\varphi}_r \\ \dot{\varphi}_l \end{bmatrix} =
\begin{bmatrix} \dfrac{V + \dot{\theta}\,l}{r} \\[6pt] \dfrac{V - \dot{\theta}\,l}{r} \end{bmatrix}
\qquad (2)
$$
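Equations (1) and (2) can be sketched in code as follows (Python purely for illustration; in the thesis they are implemented as Simulink blocks, and the wheel radius and half-axle length used here are hypothetical):

```python
import math

# Differential drive kinematics, following equations (1) and (2).
# r: wheel radius, l: distance from P to each wheel (hypothetical values).
def forward_kinematics(phi_r, phi_l, theta, r=0.05, l=0.2):
    """Wheel angular velocities -> global velocities (x_dot, y_dot, theta_dot)."""
    v = r * (phi_r + phi_l) / 2.0          # linear speed along the robot axis
    x_dot = v * math.cos(theta)
    y_dot = v * math.sin(theta)
    theta_dot = r * (phi_r - phi_l) / (2.0 * l)
    return x_dot, y_dot, theta_dot

def inverse_kinematics(v, theta_dot, r=0.05, l=0.2):
    """Desired linear speed V and yaw rate -> wheel angular velocities."""
    phi_r = (v + theta_dot * l) / r
    phi_l = (v - theta_dot * l) / r
    return phi_r, phi_l

# Round trip: command V = 0.3 m/s and theta_dot = 0.5 rad/s, then recover them.
phi_r, phi_l = inverse_kinematics(0.3, 0.5)
x_dot, y_dot, theta_dot = forward_kinematics(phi_r, phi_l, theta=0.0)
```

Feeding the wheel speeds from (2) back through (1) recovers the commanded body velocities, which is a quick consistency check between the two formulations.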

• Tricycle drive and Ackermann steering

On the one hand, tricycle drive is based on the same driven wheels as the previous model; however, these wheels always spin in the same sense, and their velocities depend on the angle of the third wheel, a steered standard wheel, which decides the direction of the robot. On the other hand, Ackermann steering shares the same characteristics with the difference that it has two wheels instead of one to control the direction of the robot. As a consequence, this model is more stable. See Figure 4.

Figure 4 Tricycle drive and Ackermann steering models

These models have only one degree of freedom. Hence, they can go forwards or backwards, and consequently they need to describe a circle when a rotation is required. The Instantaneous Centre of Rotation (ICR) is the centre of that circle.


Finally, the kinematics of both models, tricycle drive and Ackermann steering, are formulated in equations (3) and (4), for the forward and inverse kinematics respectively, and Figure 5 shows the main parameters of both. In the subscripts, the first letter, 𝑙 or 𝑟, indicates left or right, and the second letter, 𝑓 or 𝑟, denotes front or rear.

Figure 5 Ackermann steering parameters, Kamal Nasir (2016) (steer angle 𝛿, distance 𝑙, turning radius 𝑅 about the 𝐼𝐶𝑅, local frame 𝑋𝑅–𝑌𝑅 at 𝑃)

In addition, 𝑅 is the distance from the ICR to the centre of gravity, and 𝛿 is the steer angle. Although in reality the steer angles of the front wheels would differ slightly from 𝛿, both angles are considered equal to 𝛿. As a result, the forward kinematic matrix is adopted from Wang & Qi (2001), using the bicycle model to solve it. The matrix is the same for tricycle drive and Ackermann steering.
$$
\begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{\theta} \end{bmatrix} =
\begin{bmatrix}
\frac{r}{2}\,(\dot{\varphi}_f \cos\delta + \dot{\varphi}_r) \\
\frac{r}{2}\,(\dot{\varphi}_f \sin\delta) \\
\frac{r}{L}\,(\dot{\varphi}_f \sin\delta)
\end{bmatrix}
\qquad (3)
$$
Considering the inverse kinematic equations, the only difference for the tricycle model is the front wheel, for which 𝑙𝑓 = 0 (Kamal Nasir, 2016). This model offers more information about the wheel velocities.
$$
\begin{aligned}
R &= \frac{L}{\tan\delta}, \qquad \dot{\theta} = \frac{V}{L}\tan\delta \\
\dot{\varphi}_{lr} &= \frac{(R - l_f)\,V}{R \cdot r}, \qquad
\dot{\varphi}_{rr} = \frac{(R + l_f)\,V}{R \cdot r} \\
\dot{\varphi}_{lf} &= \frac{\sqrt{(R - l_f)^2 + L^2}\,\lvert\tan\delta\rvert\,V}{L \cdot r} \\
\dot{\varphi}_{rf} &= \frac{\sqrt{(R + l_f)^2 + L^2}\,\lvert\tan\delta\rvert\,V}{L \cdot r}
\end{aligned}
\qquad (4)
$$
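Equation (4) can be sketched as follows (Python for illustration; the wheelbase, half-track and wheel radius values are hypothetical):

```python
import math

# Inverse kinematics of the Ackermann model, following equation (4).
# L: wheelbase, l_f: half the front track, r: wheel radius (hypothetical values).
def ackermann_inverse(v, delta, L=0.5, l_f=0.15, r=0.05):
    """Body speed V and steer angle delta -> yaw rate and the four wheel speeds.
    For the tricycle model, set l_f = 0 (single centred front wheel)."""
    R = L / math.tan(delta)                      # turning radius about the ICR
    theta_dot = (v / L) * math.tan(delta)
    phi_lr = (R - l_f) * v / (R * r)             # left rear wheel
    phi_rr = (R + l_f) * v / (R * r)             # right rear wheel
    phi_lf = math.sqrt((R - l_f) ** 2 + L ** 2) * abs(math.tan(delta)) * v / (L * r)
    phi_rf = math.sqrt((R + l_f) ** 2 + L ** 2) * abs(math.tan(delta)) * v / (L * r)
    return theta_dot, phi_lr, phi_rr, phi_lf, phi_rf

# A left turn (positive delta): the outer (right) wheels must spin faster.
theta_dot, phi_lr, phi_rr, phi_lf, phi_rf = ackermann_inverse(v=0.4, delta=0.3)
```

Note that the yaw rate also equals V/R, since R = L/tan δ, which is a useful sanity check on the geometry.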
• Four Swedish wheels model

The last model is implemented with four Swedish wheels, which are omnidirectional wheels fitted with small rollers at a certain angle. This type of wheel does not have to rotate around the vertical axis of the plane and, as a consequence, the robot possesses three degrees of freedom. The model and its main parameters are shown in Figure 6. The


equations are adopted from Hamid, Qiao & Ghaeminezhad (2015), in which it was assumed that the angle of the roller rotation axis relative to the wheel plane (𝛾), as well as the angle between 𝑋𝑅 and the wheel, was 𝜋/4 radians. Conversely, 𝛽, the angle formed by 𝑋𝑅 and the transversal axis of the wheel, was 𝜋/2 radians.

Figure 6 Swedish four-wheeled model, adapted from Hamid, Qiao & Ghaeminezhad (2015) (wheels 1–4 around the chassis with centre 𝑃, angles 𝛼, 𝛽 and 𝛾, and distance 𝑙𝑦)

The distances 𝑙𝑥 and 𝑙𝑦 are the lengths from the local axes to the wheel centres. Consequently, the resulting forward kinematic matrix is written in (5).

$$
\begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{\theta} \end{bmatrix} =
\frac{r}{4}
\begin{bmatrix}
1 & 1 & 1 & 1 \\
-1 & 1 & 1 & -1 \\
\frac{-1}{l_x + l_y} & \frac{1}{l_x + l_y} & \frac{-1}{l_x + l_y} & \frac{1}{l_x + l_y}
\end{bmatrix}
\begin{bmatrix} \dot{\varphi}_1 \\ \dot{\varphi}_2 \\ \dot{\varphi}_3 \\ \dot{\varphi}_4 \end{bmatrix}
\qquad (5)
$$

While the inverse kinematic matrix is described in (6).

$$
\begin{bmatrix} \dot{\varphi}_1 \\ \dot{\varphi}_2 \\ \dot{\varphi}_3 \\ \dot{\varphi}_4 \end{bmatrix} =
\frac{1}{r}
\begin{bmatrix}
1 & -1 & -(l_x + l_y) \\
1 & 1 & (l_x + l_y) \\
1 & 1 & -(l_x + l_y) \\
1 & -1 & (l_x + l_y)
\end{bmatrix}
\begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{\theta} \end{bmatrix}
\qquad (6)
$$
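Equations (5) and (6) can be sketched in code, including a round trip through a pure sideways motion that only this holonomic model can perform (Python for illustration; the wheel radius and the distances are hypothetical):

```python
# Kinematics of the four-Swedish-wheel model, following equations (5) and (6).
# r: wheel radius; lx, ly: distances from the local axes to the wheel centres
# (hypothetical values).
R_WHEEL, LX, LY = 0.04, 0.15, 0.12

def inverse_kinematics(x_dot, y_dot, theta_dot, r=R_WHEEL, lx=LX, ly=LY):
    """Body velocities -> wheel angular velocities (phi1..phi4), equation (6)."""
    k = lx + ly
    return [
        (x_dot - y_dot - k * theta_dot) / r,
        (x_dot + y_dot + k * theta_dot) / r,
        (x_dot + y_dot - k * theta_dot) / r,
        (x_dot - y_dot + k * theta_dot) / r,
    ]

def forward_kinematics(phi, r=R_WHEEL, lx=LX, ly=LY):
    """Wheel angular velocities -> body velocities, equation (5)."""
    k = lx + ly
    x_dot = r * (phi[0] + phi[1] + phi[2] + phi[3]) / 4.0
    y_dot = r * (-phi[0] + phi[1] + phi[2] - phi[3]) / 4.0
    theta_dot = r * (-phi[0] + phi[1] - phi[2] + phi[3]) / (4.0 * k)
    return x_dot, y_dot, theta_dot

# Round trip: a pure sideways motion at 0.25 m/s with no forward speed or yaw.
wheels = inverse_kinematics(0.0, 0.25, 0.0)
recovered = forward_kinematics(wheels)
```

Applying (5) to the wheel speeds produced by (6) recovers the commanded body velocities, confirming that the two matrices are consistent with each other.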

4.2.2. Sensors and perception

The perception system is divided into three main stages: detection, classification and tracking. The fundamental task of sensors is to provide data about the environment (exteroceptive sensors) as well as about internal parts of the robot (proprioceptive sensors). There is a wide range of sensors; however, for this simulation, only a few types are required. It is assumed that there is no noise or interference in the signal readings.

Firstly, there are optical encoders to measure the velocity and position of the driven wheels, so the orientation can also be determined. Then there are active ranging sensors, such as LIDAR, to measure the distance between the obstacles and the robot; consequently, the robot can create a map of the environment, which can be 2D or 3D. Furthermore, 3D cameras are useful to capture the shapes of objects in the environment.

For the simulation, only exteroceptive sensors are required because the velocity and rotation of the wheels, as well as the position of the robot, are measured through odometry; in this case, data from the equations are used instead of motion sensors to estimate the position over time. LIDAR sensors are the ones used in the virtual environment; they determine the distance from the sensor to an obstacle by measuring the time between the emission of a ray of light and its capture by the sensor again.
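This time-of-flight principle amounts to one line of arithmetic: the ray travels to the obstacle and back, so the distance is half the round-trip time multiplied by the speed of light (a sketch in Python, with a hypothetical timing value):

```python
# Time-of-flight distance measurement as a LIDAR performs it:
# the ray travels out and back, so the distance is c * t / 2.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def lidar_distance(round_trip_time_s):
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A reflection captured about 33.4 ns after emission corresponds to an
# obstacle roughly 5 m away.
d = lidar_distance(33.36e-9)
```

The tiny time scales involved are why real LIDAR units need very fast timing electronics, although in the simulation this is abstracted away.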

Once the information is obtained and analysed, the robot can move around the scenario safely according to its programming. Sensors assist the control unit in completing a path or solving a maze, and they are also employed to avoid collisions.

4.2.3. Positioning and navigation systems

This is the next step after the information from the sensors is collected. Positioning, or localization, means identifying the robot's position in the environment; after that, the cognition stage begins, in which the robot decides what the next action is to complete the path or to keep moving. Finally, in the motion control stage, the robot defines the motor parameters to achieve the goal. Figure 7 depicts map-based navigation. In this method, the robot perceives its surroundings with its sensors and then updates its position in the environment. Although the map given to the robot has to be accurate enough to achieve the goal location, this method is very useful when the robot has to approach different scenarios.

[Figure: Sensors → Perception → Localization / Map-Building → Cognition / Planning → Motion Control → Actuators]

Figure 7 Map-based navigation (Siegwart, Nourbakhsh & Scaramuza 2004)

Another relevant method is SLAM. Here, the robot creates a map of the environment while it is moving, so a previously given map is not mandatory as in the other method.

To conclude, when there is no map, planning and reaction must be related and coordinated to allow the robot to move properly towards the objective; without a reaction from the system, a collision could happen during the performance. In this project, the navigation depends on the algorithm implemented as well as on the number of sensors. Although navigation is also important for developing the simulation, lower levels such as kinematics are given more weight because of the thesis goal.


5. Structure of the simulation

In this chapter, the structure of the simulation and how it is organized are explained. Figure 8 shows that there are three main blocks, each one managing a part of the simulation. The Sensors block supervises the inputs of the robots, while the Visualization block supervises their outputs. Finally, the second block contains the algorithms and equations. All the actual blocks built in Simulink can be seen in the third section of Appendix C MATLAB code and Simulink files.

[Figure: Sensors → Control & Kinematics → Visualization]

Figure 8 Main structure of the simulation

5.1. Sensors block

This first block includes all the available outputs of the virtual environment; these outputs are the inputs of the system. However, only the sensors of each model are displayed, grouped according to the type of model. Therefore, depending on the model and using a switch, only the signals of the sensors of the selected model move to the next stage.

The simulation initially provides four LIDAR sensors for each model, one at 0°, 90°, 180° and 270°, forming a cross. Consequently, the output of the Sensors block is a vector with four elements. Moreover, there is a SLAM LIDAR sensor for each model which provides information about the surrounding obstacles over 360° in order to create a map of the scenario; this information does not move to the next stage, however, because it does not affect the algorithms. Another switch chooses which SLAM sensor's data goes to the MATLAB workspace. The same applies to the camera sensor, which only provides information in the Visualization block. Figure 9 shows the internal structure of the Sensors block. The actual block in Simulink can be seen in Figure 39 Sensor part.
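As an illustration of the switch behaviour described above, the selection can be sketched in plain MATLAB; the numeric readings and the model numbering below are invented for the example and are not taken from the thesis:

```matlab
% Sketch of the Sensors-block switch: each row holds the four LIDAR
% readings (0 deg, 90 deg, 180 deg, 270 deg) of one model; the Model
% variable selects which row is passed to the next stage.
readings = [ ...             % hypothetical ranges in metres
    1.2, 4.0, 10.0, 3.5;     % 1: differential drive
    2.1, 9.9,  0.8, 5.0;     % 2: tricycle drive
    7.7, 1.1,  6.2, 2.4;     % 3: Ackermann steering
    0.5, 3.3,  8.8, 9.1];    % 4: Swedish wheels model

model = 1;                        % chosen before the simulation starts
sensorsOut = readings(model, :);  % 1x4 vector sent to Control & Kinematics
```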


Figure 9 Sensors block


5.2. Control & Kinematics block

This block is denser than the others: the forward and inverse kinematics equations and the algorithms are contained inside it, organised in several levels. The first level, represented in Figure 10, uses the sensors as input signals that go to both kinematics blocks. An internal variable, Kinematics, selects which kinematics the movement outputs come from, either forward or inverse. Furthermore, there is a Scales block in which the sizes of the wheels, robot bodies, and sensors are defined and controlled by a mask and internal data. All these output signals depart to the next stage, the Visualization block. Figure 40 Kinematics and control part with scales block shows the first level of this block in Simulink.


Figure 10 Control & Kinematics block (1st level)

The next level has a parallel structure in both kinematics blocks. It contains the algorithms and the kinematics of each robot model, Figure 11. Finally, only the translation, the rotation and, if the model is the tricycle or Ackermann, the wheel angle are collected from the selected variant. In addition, the position is fed back from the kinematics block in order to allow more complex algorithms. Figure 41 Inverse kinematics and Figure 42 Forward kinematics show the second level of this block in Simulink.


Figure 11 Control & Kinematics block (2nd level)

The third and last level corresponds to Figures 12 and 13. It collects the Stateflow charts of the models; an internal variable, Model, selects the corresponding chart, which uses the sensor signals, and its responses go to the kinematic part to calculate the movement of the robot. Depending on the chosen


kinematics, these outputs can vary. Furthermore, the position feedback, as well as the sensor signals, go into this block.


Figure 12 Control block (3rd level)

Figure 13 shows the Kinematics block. Once the movement outputs are calculated, two switches decide which results correspond to the model that the user chose before the simulation started. Because every kinematics block receives the signals from the Algorithms block, a selection of the results is required.


Figure 13 Kinematics block (3rd level)

5.3. Visualization block

The last stage of the simulation is the visualization of the robot moving around the scenario according to its programming in the previous part. The structure is similar to that of the first block, the Sensors block; here, however, the internal variable Model manages which robot model is activated and displayed. All the robot models receive the movement and size outputs, but the robots not selected are provided with null signals, thanks to some switches, in order to avoid overloading the simulation. Then, there is an image processor and a viewer to make future analyses of the results of


the camera sensor. All these parts are shown in Figure 14, and Figure 43 shows the actual block implemented.


Figure 14 Visualization block


6. Interact and modify the simulation

This part of the thesis describes how to interact with and modify the simulation so that future users can work with it and add their own features and tools. As a result, the programme can be improved and provide more functionalities.

The first tool for handling the simulation is a mask in the Control & Kinematics block in which the four robot models are listed, as well as the two kinematics. Once the model and the kinematics are selected, the user can define the dimensions of the robot and the wheels; note that all wheels share the same size. Finally, the initial position and rotation of the robot in the virtual environment can be set. The point (0,0,0) is in the middle of the scenario; however, only the X and Z coordinates can be set, because the Y coordinate is calculated automatically from the size of the robot and wheels. Figure 15 presents how the mask looks in Simulink.

Figure 15 Mask of the simulation

In addition, the following sections explain how to use a script to create a maze, how to use 3D World Editor to add new robot models or sensors to the simulation or to modify the existing models, and how Stateflow is used to create algorithms for the robot movements. The last part describes the SLAM script, whose aim is to develop a map from the robot simulation.

6.1. Maze creator script

The second tool is an m-script that adjusts the virtual environment to create a maze or simple obstacles in it, using a square matrix that represents the whole scenario in the XZ plane. A 0 is equivalent to free space and a 1 indicates a wall in that place; each cell, empty or occupied, covers 1 m² of area.


Thus, building the walls manually in the 3D editor is not necessary, and the user can save a lot of time when designing the shapes of different scenarios. The code of this script is placed in the Maze Creator section of Appendix C MATLAB code and Simulink files and is based on the example Terrain Visualization, The MathWorks, Inc. (1994-2020).

The walls created are linked to the node Walls in the 3D editor. The obstacles need to be deleted before building a new maze, because otherwise the resulting scenario would be a combination of both. Some parameters are modifiable, such as the appearance or the position of the walls. The most important one, however, is their size; in this case, the shape is [1, 1, 1] m, which affects the position (translation) because the reference point is in the centre of the object. Therefore, to place a wall at X = 7 m and Z = 6 m, the position in the global reference would be [6.5, 0.5, -5.5] m. Note the negative sign in Z: that axis is oriented in the opposite sense.
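The mapping from a wall position to the translation of its centre can be sketched as follows; the helper name wallTranslation is hypothetical, and only the [1, 1, 1] m wall size, the centred reference point and the reversed Z-axis come from the text:

```matlab
% Sketch: converts a wall whose far corner sits at (X, Z) in metres into
% the translation of its centre in the global reference. The wall is a
% [1, 1, 1] m cube, its reference point is its centre, and the world
% Z-axis points in the opposite sense (hence the sign change).
wallTranslation = @(X, Z) [X - 0.5, 0.5, -(Z - 0.5)];

t = wallTranslation(7, 6);   % gives [6.5, 0.5, -5.5], as in the text
```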

Consequently, 400 is the maximum number of walls, because the scenario is a 20x20 m square. Moreover, the size of the matrix and the size of the individual obstacles must be related, so that every position of the matrix corresponds to an exact place in the scenario. Figure 16 depicts an example of a matrix and its equivalent maze in the simulation.

map= [...
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1;
1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1;
1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1;
1,0,0,1,1,1,1,1,1,1,0,0,1,1,1,1,1,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1;
1,0,0,1,1,1,1,1,1,1,1,1,1,0,0,0,0,0,0,1;
1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1;
1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1;
1,0,0,1,0,0,0,0,0,0,0,0,1,1,1,1,1,1,1,1;
1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1;
1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1;
1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,1,0,0,1;
1,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,1,0,0,1;
1,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,1,0,0,1;
1,0,0,1,1,1,1,1,1,0,0,0,1,0,0,0,1,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,1,1,1,1,1,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1;
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1];

Figure 16 Example of a maze using a matrix

6.2. 3D World Editor

This section explains how to operate this programme, which allows the user to build a virtual 3D environment. It was utilized to create the robot models and their components from simple shapes such as cylinders, prisms, and lines. Although an AGV needs joints and internal mechanisms, these were not necessary because the kinematic equations are applied to the whole robot. Thus, the model can move around the scenario like a real robot, despite the fact that the wheels do not spin.


To introduce the software, the structure of the virtual environment is shown in Figure 17; all these elements are called nodes and belong to several categories. In this case, there are two PROTO objects; these are predefined shapes that can be used in other parts, such as in Kinematic Models, without copying or creating them again. Moreover, the Background provides a green ground and a blue sky for the rest of the scenario, and Viewpoints is a group that contains different perspectives of the simulation. The Directional Lights produce points of light that permit better monitoring.

Figure 17 Structure of the virtual environment

On the other hand, the Scenario node is where the ground and the walls are contained; it is a type of node called Transform that will be described later. Finally, the most important node, the switch Kinematic Models, allows the user to select and modify the distinct prototypes of AGVs. The next subsection is focused on how to create an AGV with one sensor; hence, a general idea of how the models were built can be obtained.

6.2.1. How to develop an AGV model

The development of an AGV, in this case a differential drive, consists of several steps. The first step is to define a new transform node; with this node, the position, rotation, centre, scale and scale orientation of the node itself and its linked objects, called children, can be changed at the same time. The parameters of these components consist of three elements that correspond to the XYZ axes. Note that the rotation and scale orientation fields have a fourth element, the angle of rotation in radians, while the first three indicate the axis of rotation (Figure 18).

Figure 18 Transform node


Once the transform node is built, shapes must be added. Hence, while the children node is selected, the user can insert a shape node using the Nodes tab or the right mouse button. After that, by selecting the geometry field, a new object such as a cylinder can be added, and its radius and thickness can be defined. However, these last parameters can only be modified inside this software; to control the size of an object from Simulink, the scale vector is necessary.

The next step is to provide the model with wheels. There are two options: the first would be to create them in the same way as the main body, and the second is to insert a proto object with the wheel already developed. As two identical wheels are required, with the second method it is enough to develop only one wheel. The only extra requirement is that the wheel must be introduced inside a proto object node. The process is not complete at that point: by clicking the right button and selecting Wrap By > Transform, the proto object is placed inside a transform node; therefore, the rotation, position and other parameters of the wheel are managed independently of the main transform node, which controls the whole model.

After all these steps, the resulting tree structure in the file is the one presented in the following figure, Figure 19. Additionally, the tree structure contains a group with viewpoints, a transform node with the ground and a directional light to illuminate the area. Creating and manipulating them follows a very similar procedure, and their variables, in the case of the directional light and the viewpoints, are simpler and more intuitive.

Figure 19 Tree structure of a differential drive robot


As a general remark, a node is always required when a characteristic of an object needs to be modified. For instance, when an object needs rotations about two different axes, nesting two transform nodes, one inside the other, makes this possible.

At this point, the next stage is the interaction between the model and the virtual environment; thus, sensors must be implemented. There are multiple types of sensors in the editor, such as plane, cylinder or sphere sensors, among others. However, in order to emulate a LIDAR sensor, there is another category of sensors called Pick Sensor Node, which contains the Line Pick Sensor. This type consists of a line; when the line collides with an obstacle, the sensor turns on a Boolean variable.

To apply this sensor, two parts are required. The first one is a shape which detects the obstacles, in this case an Indexed Line Set; the order is very important, as the shape must be placed before the sensor itself in the tree structure. To define the direction of the sensor, the initial and final points are included in the coord node. The second part is the line pick sensor itself, in which the picking geometry is the indexed line set and the pick target is the obstacles. These elements are selected by clicking the right button and choosing the proper component in the tab USE. To illustrate all these steps, Figure 20 shows the tree structure of the sensor. The node Sensors is inserted in the children of the Differential Drive transform; moreover, the node Sensor Base is a 3D object on which to place the sensors.

As a remark, when the same sensor needs to be applied more times, the implementation changes a little. The difference is related to the shape: a new definition of an indexed line set is not necessary, because the previous indexed line can be reused with a right mouse-click in the geometry field, selecting the previous object in the tab USE.

Figure 20 Sensor development


The final result of applying all these nodes, plus some appearance details, is the robot model presented in Figure 21. The green line corresponds to the sensor implemented.

Figure 21 Model of a differential drive

6.2.2. Communication with Simulink

When it comes to interacting with the virtual environment in Simulink, two blocks permit writing and reading data: VR Sink and VR Source, respectively. These blocks belong to the Simulink 3D Animation library.

Once they are in the simulation file, and by browsing the 3D file created before, it is possible to select the parameters to read and write. On the one hand, for the VR Source, only the isActive and range signals from the line pick sensors are necessary; the latter parameter is used for the SLAM sensor. On the other hand, for the VR Sink, the translation, rotation, and scale signals are enough to simulate the model. In fact, the Sensors block is mainly a VR Source block, while the Visualization part is mainly a VR Sink block.
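Besides the Simulink blocks, Simulink 3D Animation also allows node fields to be inspected and written from plain MATLAB, which can be useful for quick tests. The sketch below assumes a world file called scenario.wrl and a transform with the DEF name Differential_Drive; both names are placeholders, not those of the thesis files:

```matlab
% Sketch (assumed file and node names): open the virtual world from
% MATLAB and move a robot transform without running the Simulink model.
w = vrworld('scenario.wrl');        % handle to the VRML world
open(w);                            % load the world into memory
robot = vrnode(w, 'Differential_Drive');   % node with that DEF name
robot.translation = [1, 0.1, -2];   % write the translation field directly
pos = robot.translation;            % read it back
close(w);                           % release the world
```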

6.3. Stateflow

Stateflow allows designing algorithms through flow charts and transition diagrams, among other tools, in order to control and handle Simulink models. It uses a very graphical environment and is therefore more intuitive than building an m-script in MATLAB instead.

It has been utilized to create the states which govern the behaviour of the robot models. In the case of the forward kinematics, the charts set the velocity of each wheel individually, while for the inverse kinematics they set the linear and angular velocity of the robot itself.
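As an illustration of the two directions, the textbook differential drive relations can be sketched in MATLAB; the symbols r and d and the numeric values are assumptions for the example, not values taken from the thesis:

```matlab
% Forward kinematics: wheel angular velocities -> robot body velocities.
r  = 0.05;   % wheel radius [m] (assumed value)
d  = 0.30;   % distance between the wheels [m] (assumed value)
wL = 4.0;    % left wheel angular velocity [rad/s]
wR = 6.0;    % right wheel angular velocity [rad/s]

v     = r * (wR + wL) / 2;   % linear velocity of the robot
omega = r * (wR - wL) / d;   % angular velocity of the robot

% Inverse kinematics: desired body velocities -> wheel angular velocities.
wL_back = (v - omega * d / 2) / r;
wR_back = (v + omega * d / 2) / r;   % recovers wL and wR above
```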

The video of Carone (2010) provides a general idea of how it works and how to use the main elements of Stateflow: states, transitions, and junctions. Figure 22 describes a very simple algorithm which makes the previous robot model avoid frontal obstacles by rotating 90° to the right so that it can keep moving.


Figure 22 Stateflow example

Note that there are several types of variables. Firstly, there are inputs such as Theta or Front_Sens, which correspond to the angle of the robot and the front sensor; these are updated all the time. Secondly, there are outputs such as Left Wheel, which is the velocity of the left wheel; they update their value only when they are assigned in a state. Moreover, there is another type called local data, such as Ref Angle. These can take whatever value the user decides, and they are only updated in states. Note that setting a local data equal to an input signal does not mean that the local data keeps updating over time; it is updated only when the state which contains that assignment is reached.
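The behaviour of the chart in Figure 22 can be approximated with plain m-code; the state names, wheel speed values and the exact turn handling below are assumptions based on the description, since the real chart is graphical:

```matlab
% Sketch of the Figure 22 logic: drive forward until the front sensor
% fires, then rotate 90 degrees to the right before moving on again.
function [leftWheel, rightWheel, state, refAngle] = ...
        avoidStep(state, refAngle, theta, frontSens)
    switch state
        case "FORWARD"
            leftWheel = 1; rightWheel = 1;   % equal speeds: straight line
            if frontSens                     % obstacle detected ahead
                refAngle = theta - pi/2;     % target angle after the turn
                state = "TURN_RIGHT";
            end
        case "TURN_RIGHT"
            leftWheel = 1; rightWheel = -1;  % opposite speeds: rotate right
            if theta <= refAngle             % <= rather than ==, since theta
                state = "FORWARD";           % changes in discrete steps
            end
    end
end
```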

Furthermore, when it comes to adding MATLAB functions in a Stateflow chart, there are some differences compared with using them on a Simulink page. All the variables of the chart can be used in the function without any previous declaration in the function signature, because it is an internal function. Besides, the function only updates its result once each time it is called.

In the simulation, the implementation of the Stateflow blocks has the same structure for the forward as for the inverse kinematics. On the one hand, the inputs, which are the sensors, are the same for each model, because all of them have four sensors. Moreover, there is feedback of the position and rotation of the robot, and two variables for knowing the kinematics and the robot model, so that only one algorithm is activated. In addition, the Z position passes through a negative gain to change the sense of the Z-axis of the real file, because when it comes to programming, the Z-axis is reversed; see Figure 23 hereafter.
[Figure content: the real axes (X, Z') of the virtual world next to the adapted axes used for programming, in which the Z-axis points in the opposite sense.]

Figure 23 Rotation of the Z-axis

On the other hand, the output variables are shared between the different algorithms; for instance, in the case of the inverse kinematics, the Vx variable is defined in all models. Therefore, the same output appears in all control sequences, and its value can vary depending on the model selected. The reason for this


is to avoid having too many output variables that would have the same function but in different AGVs. Lastly, there is a clock in each chart to regulate the refresh time of the signals in the system. See Figure 24.

[Figure content: two charts, Forward Kinematics and Inverse Kinematics, both with the inputs X, Z, Theta, the four sensors (Front, Right, Left, Back), Model, Kinematics and a Clock. The forward kinematics chart outputs the individual wheel velocities (Left/Right Rear Wheel, Left/Right Front Wheel) and Phi; the inverse kinematics chart outputs Vx, Vz, ω and Phi.]

Figure 24 Stateflow inputs and outputs

6.4. SLAM script

This last script does not follow the concept of SLAM exactly, because in order to run, it needs the sensor ranges of the robot model recorded during a simulation. That means that the m-file can build the map only once a simulation is completed. The information is collected from Simulink with a sample time of 0.7 s; the more data collected, the better the representation, but too much data can overload the script when it is running. The code of this script is in the SLAM section of Appendix C MATLAB code and Simulink files and is based on the example Implement Online Simultaneous Localization And Mapping (SLAM) with Lidar Scans, The MathWorks, Inc. (1994-2020).

Depending on the length, range, and number of rays of the SLAM sensor, some parameters need to be adjusted beforehand. The angle between the rays and the initial position and rotation need to be set, as well as the appropriate SLAM and robot nodes. The main part is the creation of the LIDAR SLAM object, which is defined by the maximum LIDAR range and the map resolution. In the simulation, the LIDAR sensors have a length of 10 m, so the maximum scan range must be smaller to avoid losing accuracy. The resolution means cells per metre, so 20 cells give 5 cm of precision (Figure 25). Although there are several rays, the indexed line set is a single piece.


angles = 180:-1.5:-178.5;
angles = deg2rad(angles)';
maxLidarRange = 8;
mapResolution = 20;
slamAlg = lidarSLAM(mapResolution, maxLidarRange);
Figure 25 LIDAR parameters

The next step is to set the loop closure parameters; these have been defined empirically, as in the cited example (Figure 26). On the one hand, the higher the loop closure threshold, the more false positives are avoided in the loop closure identification process. Nevertheless, there is a limit; for instance, in an environment with many similar features, the chance that scans produce a false positive can grow. On the other hand, the loop closure search radius permits the algorithm to search a wider range of the map around the current position of the sensor.

slamAlg.LoopClosureThreshold = 200;
slamAlg.LoopClosureSearchRadius = 3;
controlRate = rateControl(10);
Figure 26 Loop closure parameters

Finally, the results obtained are an estimation of the real map and an occupancy grid map, which represents the maze as a probabilistic occupancy grid. As some parameters cannot be calculated exactly, more than one attempt might be required to achieve proper results.
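Following the cited MathWorks example, the recorded ranges could be fed to the slamAlg object as sketched below; the variable rangeLog, holding one row of sensor ranges per 0.7 s sample, is an assumption about how the data is exported from Simulink, while angles, slamAlg, controlRate, mapResolution and maxLidarRange are those of Figures 25 and 26:

```matlab
% Sketch: build the map after the simulation from the logged SLAM ranges.
for i = 1:size(rangeLog, 1)
    scan = lidarScan(rangeLog(i, :), angles);  % one 360-degree sweep
    addScan(slamAlg, scan);                    % incremental scan matching
    waitfor(controlRate);                      % pace the loop at 10 Hz
end

% Retrieve the optimised poses and build the occupancy grid map.
[scans, optimizedPoses] = scansAndPoses(slamAlg);
map = buildMap(scans, optimizedPoses, mapResolution, maxLidarRange);
figure; show(map);                             % probabilistic occupancy grid
```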


7. Case study

The two previous chapters explained the implementation of the simulation in terms of its structure and of the MATLAB tools least familiar to students, 3D World Editor and Stateflow. This chapter, however, is oriented towards evaluating the simulation environment through two tests: a pursuit with two robots and a SLAM example.

7.1. Simulation with two robots

This experiment consists of two differential drive AGVs in a simple virtual environment; in this case, walls only surround the limits of the scenario, so the scenario is a square. In the experiment, one of the AGVs (the red one) must pursue the other AGV (the blue one), while the latter must try to avoid the obstacles and avoid being caught.

Consequently, aspects such as visualization, control, and kinematics could be checked. Regarding visualization, Figure 27 shows a screenshot taken during this case, with the red AGV oriented towards the blue one. The conditions of the experiment were that the pursuer had a higher top speed and reoriented itself whenever the angle between both robots was bigger than pi/8 radians, while the pursued AGV had a higher rotation speed and rotated towards the part of the square with more space whenever the pursuer had reduced the distance between them. All this information was introduced in the algorithms.

To avoid the walls, the algorithms used the information provided by the sensors, which are the green lines in the picture. Moreover, the position and angle of the robots were calculated through the inverse kinematic equations; therefore, the robots behaved exactly according to these equations. This information also went to the control part of each AGV model as feedback, in order to obtain the distance and angle between the robots.
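The distance and angle between the robots used as feedback can be computed as sketched below; the function name and the XZ ground-plane convention are assumptions for illustration:

```matlab
% Sketch: relative distance and bearing from the pursuer at (xP, zP) to
% the target at (xT, zT), both in the XZ ground plane of the simulation.
relativePose = @(xP, zP, xT, zT) deal( ...
    hypot(xT - xP, zT - zP), ...    % Euclidean distance between robots
    atan2(zT - zP, xT - xP));       % bearing angle [rad]

[dist, bearing] = relativePose(0, 0, 3, 4);
% e.g. the pursuer reorients when abs(bearing - theta) > pi/8
```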

The video Persecution of two differential drives shows that the simulation provides a mask to set some parameters of the robot conditions. The red AGV has to pursue the blue one; the red one is faster, but it rotates more slowly than the second mobile robot.

Figure 27 Robot persecution


7.2. SLAM validation

The second experiment is a test for validating the SLAM script. The maze is more complex than the one in the previous test; hence, the Maze creator script was checked with a more difficult task. Furthermore, only one robot, the Swedish wheels model, performed in the virtual environment.

The aspects checked are thus very similar to the previous ones, with the difference that the kinematic equations as well as the algorithm are different; this algorithm was focused on navigating around the whole maze keeping a wall on the left side of the AGV. In addition, this algorithm was developed to use the forward kinematic equations of this model.

Furthermore, the SLAM sensor and the other sensors were validated. With the first sensor, it is possible to build a map of the original maze using the SLAM script; this is the most important aspect of this test. The rest detected the surrounding walls.

The results obtained are represented in Figure 28, in which there is an estimation of the real maze; the blue line is the trajectory of the robot and the pink ones are the obstacles found. An occupancy map, which is simpler, is also provided by the script. The video Estimation of a map using SLAM shows how the estimation of a map can be made after the collection of information from the virtual environment by the SLAM sensor while the robot was moving around this map.

Figure 28 Final built map and occupancy map

The maps are rotated -pi/2 radians because of the initial rotation of the robot. Despite that, the resulting maps are quite similar to the original one, shown in Figure 29 together with the AGV model and its sensors displayed; the SLAM sensor is the set of blue lines.


Figure 29 Real map

Finally, the video Camera sensor performance shows an AGV, the Swedish wheels model, moving around the maze of Figure 29 while the vision of its camera is shown in the window on the right.


8. Results and discussion

This chapter presents and evaluates the results obtained during the development of this thesis and from the previous chapter, Case study, in order to assess the fulfilment of the objectives mentioned in the introduction of the thesis.

8.1. Results

Chapter 7, Case study, is a validation of the main simulation built in Simulink, whose structure is explained in Chapter 5, Structure of the simulation. According to the objectives, this simulation provides four kinematic models: differential drive, tricycle drive, Ackermann steering, and the Swedish wheels model, which uses Mecanum wheels. These models are presented in the next figure, Figure 30. In addition, the dimensions of the model, the radius and thickness of the wheels, as well as the initial position and rotation of the AGV, are adjustable in the mask of the simulation.

[Figure panels: Differential Drive, Tricycle Drive, Ackermann Steering, Swedish Wheels Model]

Figure 30 Kinematic models

Additionally, these 3D models can perform in a modifiable virtual 3D environment with an area of 20x20 m; this environment is made by a script using a binary 20x20 matrix in which each number 1 corresponds to a cube of 1 m3 placed at the matching place on the ground. Therefore, up to 400 blocks can be placed in the scene, and several experiments can be developed to test a kinematic model in a specific situation. A more in-depth explanation of this can be found in the Maze creator script Section, and the code is in the Maze Creator.m section of Appendix C MATLAB code and Simulink files. Moreover, the user can modify the environment manually from 3D World Editor.

Regarding the interaction between the robot models and the virtual environment, LIDAR sensors were
implemented in order to detect obstacles and avoid collisions. Figure 31 shows four LIDAR sensors


represented by green lines, one every 90°. Their number and length can be varied using the editor. In addition, there is a sensor used for SLAM (blue lines) which covers 360° for the purpose of collecting information about the location of the walls. In combination with the SLAM script, this sensor can estimate a map of the original one and also create a binary occupancy map; see SLAM validation in Chapter 7 Case study for more details and the SLAM section of Appendix C MATLAB code and Simulink files for the code.

Figure 31 AGV sensors

In terms of control, there are two parts dedicated to designing algorithms, followed by a block which includes the forward and inverse kinematic equations. Therefore, control algorithms can be approached in two different ways, depending on the kinematics selected by the user to control the AGV.

Last but not least, the simulation can be observed from different points of view: Figure 32 depicts a panoramic perspective, while a plan view has been shown in previous chapters, such as in Chapter 7 Case study or the Maze creator script Section in Chapter 6. These views permit checking the behaviour of the robot in the virtual environment thanks to the broad overview.

Figure 32 Panoramic view

Apart from these static fields of vision, there are dynamic ones too: one point of view per model, which follows behind each robot to show its movements in more detail. Furthermore, there is another one


in the structure of the sensors that emulates an image processing camera, with the aim of applying possible image analysis features. Figure 33 presents both perspectives.

[Figure panels: Back Camera, Sensor Camera]

Figure 33 Dynamic perspectives of the simulation

To conclude the results, this simulation is a formative tool designed to test different types of AGVs in different situations; it also allows the implementation of the user's own algorithms, using LIDAR sensors to detect the objects in the virtual environment and creating an estimation of the map with an external script. Starting from this thesis, it is possible to add new functionalities in order to improve the simulation and to gain more knowledge about mobile robots.

In contrast with the MATLAB examples discussed in the Mobile robots' applications with MATLAB Section in Chapter 3, the majority of those projects were developed in 2D, and the ones developed in 3D were very specific and used Simscape Multibody instead of Simulink 3D Animation to create the virtual environment or the robot models. This is one of the biggest differences; besides that, this programme permits many possibilities in one file, so the user can experiment with more functionalities of mobile robots.

8.2. Discussion

This section presents an interpretation of the results previously mentioned while assumptions and
errors are taken into account.

The simulation provides an accurate virtual environment and accurate robot movements. However, only the robot body makes the rotation and translation movements, because the wheels are attached and fixed to the body. The models were not built with real joints or to emulate real AGVs; they are a set of 3D shapes of the editor, resulting in a simplification of the simulation, since there is no interaction between wheels and ground. Therefore, dynamic equations related to forces and torques were not considered, because they were not necessary to show the behaviour of a specific kinematic model.

During the development of the models in 3D World Editor, it was demanding to manage all the characteristics and understand all the parameters involved. There was a useful feature in the editor, the Script node, that could allow modifying some aspects related to the size or colour of the objects; this would have reduced the number of blocks inside the Simulink programming. Despite that, it could not be accomplished because of the lack of documentation about it. Because of that, tasks such as handling the scales of the sensors, robot bodies and wheels had to be managed by some equations in Simulink. Furthermore,


there was another editor, the MATLAB editor, which allows modifying parameters easily without loading the 3D file; consequently, modifications can be done faster.

Regarding simulation performance, the Maze Creator script could slow down the simulation if the maze
required too many walls, since each wall adds nodes to the hierarchy structure. These nodes were
necessary, but when there were too many of them, Simulink treated them as new inputs or outputs
even when they were not going to be used. Thus, some situations could not be analysed because of the
time required. In addition, one limitation when creating a maze is that obstacles are cubes; curved
shapes are not available when building the scenario with the Maze Creator script.
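
To illustrate the node-count issue, the following Python sketch (illustrative only; the thesis scripts are written in MATLAB) merges consecutive wall cells of one map row into single elongated pieces, the kind of preprocessing that would reduce the number of nodes the Maze Creator script generates:

```python
def merge_row(row):
    """Merge consecutive wall cells (1s) in one map row into (start, length)
    pieces, so each horizontal run becomes a single elongated box instead of
    many unit cubes (fewer nodes in the 3D file)."""
    pieces, start = [], None
    for i, cell in enumerate(row):
        if cell and start is None:
            start = i                      # a new run of walls begins
        elif not cell and start is not None:
            pieces.append((start, i - start))
            start = None                   # the run ended at cell i-1
    if start is not None:                  # run reaching the end of the row
        pieces.append((start, len(row) - start))
    return pieces

# One row of a Maze Creator-style matrix: 12 unit cubes collapse into 4 pieces
row = [1,0,0,1,1,1,1,1,1,0,0,0,1,1,1,1,0,0,0,1]
pieces = merge_row(row)
```

Each `(start, length)` pair could then be emitted as one stretched Box node instead of `length` separate cubes.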

Moreover, the programme has a sample time, so setting equality conditions in Stateflow, for instance
robot angle = pi/2, is not recommended, because the robot angle, like other parameters, changes
its value in discrete steps. Consequently, such a condition might never be reached, and conditions
need to be set with <= or >= operators to ensure they can be fulfilled. Therefore, some position or
rotation errors can accumulate during the performance of the robot. In addition, the SLAM script
parameters are set empirically, so the user may obtain inaccurate results in the first attempts at
estimating the original map.
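
The sampled-time effect described above can be reproduced with a short Python sketch (the sample time and angular velocity are illustrative values, not taken from the simulation): an exact-equality guard on the accumulated angle never fires, while a >= guard does:

```python
import math

dt = 0.05          # sample time [s] (example value)
omega = 0.7        # constant angular velocity [rad/s] (example value)
target = math.pi / 2

theta = 0.0
hits_exact = False
reached_at = None
for step in range(1000):
    theta += omega * dt                         # angle changes in discrete jumps
    if theta == target:                         # exact-equality guard: never true
        hits_exact = True
    if reached_at is None and theta >= target:  # >= guard: fires once
        reached_at = step
```

Since θ jumps by ω·dt per sample, it steps over π/2 without ever equalling it; only the >= condition triggers, which is why Stateflow guards must use <= or >=.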

Regarding sensors, although LIDAR sensors were used, more types could be implemented in the editor,
such as sonar or plane sensors. However, LIDAR sensors were the most used in the examples and
the most useful for the purposes of this thesis. One disadvantage was that modifications to the
number and dimensions of the sensors had to be made inside the editor.

Last but not least, the methodology followed during this thesis made it possible to contrast and
compare several MATLAB examples and apply their advantages to the simulation. The order of the
development steps was very important for reducing errors: the creation of the object structure in the
design step, which came first, was essential for making the later connections between objects and
equations. On the other hand, the validation stage was also useful for correcting errors quickly.


9. Conclusions and future work

The results have been presented and discussed in the last chapter, and the whole simulation has been
successfully developed with its functionalities proved. The conclusions of this project are therefore
presented hereafter.

• The programme is oriented not only to students from the University of Skövde, but also to
anyone interested in mobile robots in a virtual simulation environment. The tool developed
provides four modifiable kinematic 3D models: differential drive, tricycle drive, Ackermann
steering, and a Swedish-wheel model built with Mecanum wheels. Moreover, LIDAR sensors
and an adjustable virtual scenario have been implemented. The tool has been created so that
future users can add more features and increase its functionalities in order to cover more
aspects of mobile robots.

• Regarding the control part, users can implement their own algorithms for the different
kinematic models, using inverse or forward kinematics. Inverse kinematics is more suited to
developing complex algorithms, while forward kinematics helps in understanding how the AGV
moves, because its equations control the wheels of the AGV. However, more training with
Stateflow would be required in order to obtain more elaborate algorithms.

• The Maze Creator script has proved very useful for creating mazes and placing obstacles in the
virtual scene. Nevertheless, it delayed the performance of the simulation if the map was too
complex or many nodes were used. One recommendation would be to join the blocks in a row
or a column into a single piece, which would reduce the number of nodes and objects.

• MATLAB and Simulink tools have fulfilled the objectives satisfactorily. Simulink 3D Animation
has allowed the creation of a virtual 3D environment with sufficient accuracy; although only
LIDAR sensors have been added, more types of sensors were available. LIDAR sensors have
been implemented to avoid collisions and to collect information for the SLAM script, which
creates a binary occupancy map of the real map. The parameters of this script need to be set
empirically, so the user can obtain a wrong estimation of the map in the first attempts.

One suggestion related to sensors is to create a script like the Maze Creator for setting the
characteristics of the sensors, such as their length or number, without manually modifying
the 3D file.

• Although the graphic visualization with Simulink 3D Animation has been accurate enough for
the aims of this thesis, other tools such as Simscape Multibody would be recommendable if
the user wanted to implement an AGV model from URDF files; more parameters could then be
considered to achieve a more realistic interaction of the robot with the virtual environment.
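
As a minimal illustration of the forward-kinematics option mentioned in the control bullet above, the following Python sketch (illustrative only; the thesis implements this in Simulink) integrates the pose of a differential-drive robot from its wheel speeds, with example values for the wheel radius, wheel separation and sample time:

```python
import math

def diff_drive_step(x, y, theta, w_l, w_r, r=0.05, L=0.3, dt=0.1):
    """One forward-kinematics update of a differential-drive robot.

    w_l, w_r : left/right wheel angular velocities [rad/s]
    r        : wheel radius [m]; L : wheel separation [m] (example values)
    """
    v = r * (w_r + w_l) / 2.0      # linear velocity of the chassis
    omega = r * (w_r - w_l) / L    # angular velocity of the chassis
    # Discrete Euler integration in the global frame
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Equal wheel speeds -> the robot drives straight along its heading
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = diff_drive_step(*pose, w_l=2.0, w_r=2.0)
```

Feeding wheel speeds in and observing the resulting pose is exactly what makes forward kinematics instructive for understanding how the AGV moves.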


Future work

The objectives mentioned in the Introduction were covered and fulfilled in this thesis with a simulation
tool developed in MATLAB and mainly in Simulink. Therefore, it provides a basis with enough
functionalities to start practising with mobile robots and acquiring knowledge about them.

It could be interesting to investigate how to apply image-recognition algorithms, for instance for
seeking objects or colours, or for reading QR codes. These features would improve the interaction of the
robot with the virtual environment.

Other kinematic models could be included, such as a three-wheel omnidrive robot with standard or
Mecanum wheels, in order to test their kinematics.

On another note, this tool has been designed to test algorithms on different kinematic models. Thus,
complex algorithms focused on solving mazes or reaching certain positions in the scenario, while the
AGV avoids colliding with obstacles, could be implemented; this would achieve a more independent
and thorough behaviour for the robots. Furthermore, proportional–integral–derivative (PID) controllers
are relevant for handling parameters such as the velocity of the robot and for correcting errors.
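
A minimal sketch of such a PID velocity controller, written here in Python with a hypothetical first-order plant and arbitrary gains (none of these values come from the thesis simulation), could look like this:

```python
def pid_step(error, state, kp, ki, kd, dt):
    """One update of a discrete PID controller (hypothetical gains/plant)."""
    state["i"] += error * dt                 # integral term accumulator
    derivative = (error - state["e"]) / dt   # backward-difference derivative
    state["e"] = error
    return kp * error + ki * state["i"] + kd * derivative

# Drive a simple first-order velocity plant v' = (u - v)/tau to a 1 m/s setpoint
v, tau, dt = 0.0, 0.5, 0.01
state = {"i": 0.0, "e": 0.0}
for _ in range(2000):
    u = pid_step(1.0 - v, state, kp=2.0, ki=1.0, kd=0.0, dt=dt)
    v += (u - v) / tau * dt   # Euler step of the plant
```

The integral term removes the steady-state error, so after a few seconds of simulated time the velocity settles at the setpoint.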

Moreover, dynamic equations, as well as friction coefficients, could be implemented in order to
elaborate a more accurate simulation. With this in mind, by adding wheel movement separate from
the robot movements, an emulation of a real case could be made.


References

Books

Banks, J., Carson II, J.S., Nelson, B.L. & Nicol, D. (2014). Discrete-Event System Simulation. 5th ed.
Pearson.

Borenstein, J., Everett, H. & Feng, L. (1996). Navigating Mobile Robots. Massachusetts: A K Peters, Ltd.

Jonker, G. & Harmsen, J. (2012). Engineering for Sustainability: A Practical Guide for Sustainable
Design. 1st ed. Amsterdam: Elsevier B.V.

Oates, B. (2006). Researching Information Systems and Computing. London: SAGE Publications Ltd.

Siegwart, R., Nourbakhsh, I. & Scaramuzza, D. (2004). Introduction to Autonomous Mobile Robots.
Cambridge, MA: MIT Press.

Stewart, R. (2014). Simulation. s.l.: Palgrave Higher Ed M.U.A.

Woods, R. & Lawrence, K. (1997). Modeling and Simulation of Dynamic Systems. s.l.: Prentice-Hall,
Inc.

World Commission on Environment and Development (1987), Our Common Future. Oxford University
Press, Oxford, New York.

Conference papers

Carpin, S., Lewis, M., Wang, J., Balakirsky, S. & Scrapper, C. (2007). USARSim: a robot simulator for
research and education. Proceedings 2007 IEEE International Conference on Robotics and Automation,
Roma, Italy, 2007, April 10-14, pp. 1400-1405. Doi: 10.1109/ROBOT.2007.363180

Frontoni, E., Mancini, A., Caponetti, F. & Zingaretti, P. (2006). A framework for simulations and tests of
mobile robotics tasks. 14th Mediterranean Conference on Control and Automation, Ancona, Italy, 2006,
June 28-30, pp. 1-6. Doi: 10.1109/MED.2006.328842

Gonzalez, R., Mahulea, C. & Kloetzer, M. (2015). A Matlab-based interactive simulator for mobile
robotics. IEEE International Conference on Automation Science and Engineering (CASE), Gothenburg,
Sweden, 2015, August 24-28, pp. 310-315. Doi: 10.1109/CoASE.2015.7294097

Ilas, C. (2013). Perception in autonomous ground vehicles. Proceedings of the International Conference
on ELECTRONICS, COMPUTERS and ARTIFICIAL INTELLIGENCE - ECAI-2013, Pitesti, Romania, 2013, June
27-29, pp. 1-6. Doi: 10.1109/ECAI.2013.6636180

Li, C., Fu, L. & Wang, L. (2018). Innovate engineering education by using virtual laboratory platform
based industrial robot. Chinese Control and Decision Conference (CCDC), Shenyang, China, 2018, June
9-11, pp. 3467-3472. Doi: 10.1109/CCDC.2018.8407723

Lynch, L., Newe, T., Clifford, J., Coleman, J., Walsh, J. & Toal, D. (2018). Automated Ground Vehicle
(AGV) and Sensor Technologies- A Review. 12th International Conference on Sensing Technology (ICST),
Limerick, Ireland, 2018, December 4-6, pp. 347-352. Doi: 10.1109/ICSensT.2018.8603640


Sengupta, D., Jain. N. & Kumar, C. S. (2013). An Educational Website on Kinematics of Robots. IEEE Fifth
International Conference on Technology for Education (t4e 2013), Kharagpur, India, 2013, December
18-20. pp. 24-27. Doi: 10.1109/T4E.2013.14

Wang, D. & Qi, F. (2001). Trajectory planning for a four-wheel-steering vehicle. Proceedings. IEEE
International Conference on Robotics and Automation (Cat. No.01CH37164), Seoul, South Korea, 2001,
May 21-26. pp. 3320-3325 vol.4. Doi: 10.1109/ROBOT.2001.933130

Xu, H., Xia, J., Yuan. Z. & Cao, P. (2019). Design and Implementation of Differential Drive AGV Based on
Laser Guidance. 3rd International Conference on Robotics and Automation Sciences (ICRAS), Wuhan,
China, 2019, June 1-3, pp. 112-117. Doi: 10.1109/ICRAS.2019.8808992

Yong-de, Z., Jin-gang, J., Hai-yan, D. & Qin-xiang, Z. (2013). Training pattern of undergraduate's
innovative education using robot as an education platform. 8th International Conference on Computer
Science & Education, Colombo, Sri Lanka, 2013, April 26-28, pp. 1180-1184. Doi:
10.1109/ICCSE.2013.6554096

Journal articles

Casini, M. & Garulli, A. (2016). MARS: a Matlab simulator for mobile robotics experiments, IFAC-
PapersOnLine, Volume 49, Issue 6, Pages 69-74, ISSN 2405-8963, Doi:
https://doi.org/10.1016/j.ifacol.2016.07.155.

Hamid, T., Qiao, B. & Ghaeminezhad, N. (2015). Kinematic Model of a Four Mecanum Wheeled Mobile
Robot, International Journal of Computer Applications (0975 – 8887), Volume 113 – No. 3.

Morris, R., Childs, P. & Hamilton, T. (2007). Sustainability by design: a reflection on the suitability of
pedagogic practice in design and engineering courses in the teaching of sustainable design, European
Journal of Engineering Education, 32, pp. 135-142.

Pandey, A. & Ramakrushna Parhi, D. (2014). MATLAB Simulation for Mobile Robot Navigation with
Hurdles in Cluttered Environment Using Minimum Rule based Fuzzy Logic Controller, Procedia
Technology, Volume 14, Pages 28-34, ISSN 2212-0173, Doi:
https://doi.org/10.1016/j.protcy.2014.08.005.

Reports

Markets and Markets. (2019). Automated Guided Vehicle Market by Type, Navigation Technology,
Application, Industry, and Geography - Global Forecast to 2024 (SE 3351).
Available at: https://www.marketsandmarkets.com/Market-Reports/automated-guided-vehicle-
market-27462395.html?gclid=CjwKCAiAj-
_xBRBjEiwAmRbqYmPWwx9LWWVwLe4CGZP3cjHRC06RiCk8T_V0BMXBDhp7z2lJOI4iehoCo0YQAvD_
BwE [Accessed 6 February 2020].

Webpages

The MathWorks, Inc. (2019). Robotics System Toolbox.


Available at: https://es.mathworks.com/products/robotics.html#rms
[Accessed 28 January 2020].


The MathWorks, Inc. (2019). Simscape Multibody.


Available at: https://es.mathworks.com/products/simmechanics.html#imcad
[Accessed: 1 February 2020].

The MathWorks, Inc. (2019). Simulink 3D Animation.


Available at: https://es.mathworks.com/products/3d-animation.html#3dworlds
[Accessed 31 January 2020].

The MathWorks, Inc. (2019). Stateflow.


Available at: https://es.mathworks.com/products/stateflow.html
[Accessed 5th February 2020].

Eisele, R., (2008-2020). Ackerman Steering.


Available at: https://www.xarg.org/book/kinematics/ackerman-steering/
[Accessed 10th April 2020].

Examples

The MathWorks, Inc. (1994-2020). Collision Detection Using Line Sensor.


https://es.mathworks.com/help/sl3d/examples/collision-detection-using-line-sensor.html
[Accessed 25th February 2020].

The MathWorks, Inc. (1994-2020). Differential Wheeled Robot in a Maze.


https://es.mathworks.com/help/sl3d/examples/differential-wheeled-robot-in-a-maze.html
[Accessed 5th February 2020].

The MathWorks, Inc. (1994-2020). Differential Wheeled Robot with LIDAR Sensor.
https://es.mathworks.com/help/sl3d/examples/differential-wheeled-robot-with-lidar-sensor.html
[Accessed 26th February 2020].

The MathWorks, Inc. (1994-2020). Vehicle Dynamics Visualization with Video Output.
https://es.mathworks.com/help/sl3d/examples/vehicle-dynamics-visualization-with-video-
output.html
[Accessed 8th April 2020].

The MathWorks, Inc. (1994-2020). Terrain Visualization.


https://es.mathworks.com/help/sl3d/examples/terrain-visualization.html
[Accessed 17th March 2020].

MathWorks Student Competitions Team. (2020). Robotics Playground.


https://www.github.com/mathworks-robotics/robotics-playground
[Accessed 3rd February 2020].

The MathWorks, Inc. (1994-2020). Implement Online Simultaneous Localization And Mapping (SLAM)
with Lidar Scans.
https://es.mathworks.com/help/nav/ug/implement-online-simultaneous-localization-and-mapping-
with-lidar-scans.html
[Accessed 28th March 2020].


PowerPoints

Kamal Nasir, A. (2016). EE565:Mobile Robotics Lecture 2 [PowerPoint presentation]


Available at: http://web.lums.edu.pk/~akn/Files/Other/teaching/mobile%20robotics/spring201
6/lectures/Lecture2-Module1-Kinematics.pdf [Accessed 17th March 2020].

MHI. (2012). What are AGVs used for? [PowerPoint presentation]


Available at: http://www.mhi.org/downloads/industrygroups/agvs/elessons/AGVS-On-Line-Training-
Module-2-FinalT.pdf [Accessed 12th February 2020].

TFE-Moodle 2. Kinematics [PowerPoint presentation]


Available at: http://www.moodle2.tfe.umu.se/pluginfile.php/52988/mod_resource/content/2/
Kinematics.pdf [Accessed 2nd April 2020].

Weblog

Shea, C. (2016). Robots Tackling the Three D’s of Industry. Robotiq blog [blog] August 17.
https://blog.robotiq.com/robots-tackling-the-three-ds-of-industry [Accessed 15 February 2020].

Videos

Mobile Robot Simulation for Collision Avoidance with Simulink (2014)


[video] Seshadri, S., MathWorks
https://www.mathworks.com/videos/mobile-robot-simulation-for-collision-avoidance-with-simulink-
90193.html [Accessed 20th February 2020].

Simulating Mobile Robotics Using Virtual Worlds (2018)


[video] Avendaño, J., MathWorks
https://es.mathworks.com/support/search.html/videos/matlab-and-simulink-pass-
competitions-hub-getting-started-with-robotics-playground-virtual-worlds-
1533569380647.html?q=&fq=asset_type_name:video%20category:simulink/getting-started-with-
simulink&page=1 [Accessed 1st February 2020].

Stateflow Tutorials, Part 1: States and Transitions (2010)


[video] Carone, M., MathWorks
https://www.mathworks.com/videos/stateflow-tutorials-part-1-states-and-transitions-
81647.html?elqsid=1583927230010&potential_use=Student [Accessed 11th March 2020].

Temporal Logic Operators - Stateflow Video (2017)


[video] MathWorks
https://www.youtube.com/watch?v=ss4IH3b8Mis [Accessed 24th April 2020].

Tutorial Paso a Paso de un modelo 3D en V-Realm Builder (2012)


[video] Echavez Morales, W. J.
https://www.youtube.com/watch?v=uNRz65SfGWU [Accessed 2nd March 2020].


Appendix A Work breakdown and time plan

The schedule followed to make this thesis is represented in Table 5. This Gantt chart comes from the
specifications of the thesis. However, there were changes in the tasks during the development of the
simulation and the report.

Table 5 Original time plan

Activity/Week W4 W5 W7 W9 W11 W13 W15 W17 W19 W21

Start

Literature Review

Investigate Toolboxes

Design Scenario

Implement test simulation

Improve Scenario

Improve simulation

Report

Presentation

Table 6 is the resulting time plan followed after the problems and challenges encountered. The main
difference between the two plans is the swapped order of the first two tasks. This is because of the
importance of finding examples and tutorials in order to confirm that the thesis objectives could be
accomplished using MATLAB and Simulink. When enough material had been collected, the next step
was to work with it and develop some small models to become familiar with their characteristics.
After that, according to the results obtained with those models and a comparison of the examples found,
a justified selection of the most appropriate MATLAB tool was made.

Moreover, since several 3D editors were available in MATLAB, these were tested and compared like the
previous selection of MATLAB tools. Although VR Builder had initially been chosen because of its ease
of use, it was finally discarded in favour of 3D World Editor due to the availability of more resources
about sensors.

Regarding the delay in the improvements of the scenario and the simulation: on the one hand, a
problem emerged with the tool needed to modify the scenario easily, the Maze Creator script, which
did not compile at the beginning, and the maze was not saved in the file.

Besides that, connecting and organising each part and block of the simulation was also demanding in
order to procure an intuitive environment with the minimum number of Simulink blocks. On the other
hand, the experiments explained in Chapter 7 Case study had to be ready for Chapter 8 Results and
discussion for the purpose of discussing them and validating all the features. This part took more time
than planned because there were design errors and some modifications were required in the
programme.


Despite all these problems and challenges, the report and the simulation were accomplished by the
delivery date.
Table 6 Updated time plan

Activity/Week W4 W5 W7 W9 W11 W13 W15 W17 W19 W21

Start

Investigate Toolboxes

Literature review

Design scenario

Implement test simulation

Improve scenario

Improve simulation

Report

Presentation


Appendix B Kinematics of mobile robots

This appendix introduces the basis needed to understand the kinematic models used in this thesis,
explaining step by step the necessary background on locomotion and the behaviour of wheels,
together with their equations (Siegwart, Nourbakhsh & Scaramuzza 2004).

B.1. Locomotion

There are several means of locomotion; the best known use legs or wheels. Wheels have been more
popular because of their mechanical simplicity and easy implementation. However, flat and hard
ground is required to guarantee an efficient performance of the wheels. Wheels offer a large number
of configurations depending on their type and number. When it comes to choosing a type of wheel,
the choice focuses on three fundamental aspects: manoeuvrability, controllability, and stability.

There are four basic wheel types, shown in Figures 34 and 35: the standard wheel, the castor wheel
(very common in office chairs), the Swedish wheel, which is composed of small rollers, and finally the
ball or spherical wheel. Moreover, combining different wheels results in plenty of possible
configurations, taking into account that when there are more than three wheels, a suspension system
is necessary to avoid having one or more wheels without ground contact.

Figure 34 Basic types of wheels

Figure 35 Swedish wheel

Considering the three main aspects mentioned, more wheels mean more stability, although this can
affect manoeuvrability and controllability; the typical number of wheels in an AGV is two, three or
four. Furthermore, the more manoeuvrable the robot is, the more difficult it is to design the control
algorithms.


B.2. Kinematics

This subsection contains the technical aspects of the coordinates systems as well as the equations
according to each type of wheel.

B.2.1. Robot position

The robot is regarded as a rigid body on wheels moving in the plane. There is a maximum of three
degrees of freedom, two for the position and one for the rotation. Tanking this into consideration,
frames of reference have to be defined to obtain a correct representation of the robot. There is a
global frame of reference for the plane and a local one for the robot itself, Figure 36.
Figure 36 Global and local reference frame

The global frame of reference has origin O, and {X_I, Y_I} are the coordinates representing the position
in the environment. The local frame of reference is analogous: P is its origin, usually the centre of mass,
and {X_R, Y_R} are its coordinates. This frame also requires θ, the angle between both frames.
Therefore, the global position of the robot can be described with the following vector (7):

ξ_I = [x  y  θ]^T (7)
The robot motion is determined with a rotation matrix (8), where θ is measured in the anti-clockwise
direction:

R(θ) = [cosθ  sinθ  0; −sinθ  cosθ  0; 0  0  1] (8)

As a result, the orientation of the robot can be obtained and, when the velocity is given, the robot
motion along the local axes (9):

ξ̇_R = R(θ) ξ̇_I (9)
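
Equations (8) and (9) can be checked numerically; the following Python sketch (illustrative only, not part of the Simulink model) applies R(θ) to a global-frame velocity:

```python
import math

def rotation(theta):
    """Rotation matrix R(theta) mapping global-frame rates to the robot's
    local frame (equation (8)); theta is measured anti-clockwise."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]]

def to_local(theta, xi_dot_global):
    """xi_dot_R = R(theta) * xi_dot_I (equation (9))."""
    R = rotation(theta)
    return [sum(R[i][j] * xi_dot_global[j] for j in range(3)) for i in range(3)]

# Robot heading along the global Y axis (theta = pi/2), moving along global X:
# in its own frame that motion points along -Y_R.
local = to_local(math.pi / 2, [1.0, 0.0, 0.0])
```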

B.2.2. Types of wheels

Apart from that, to calculate the kinematics of a robot, the wheel kinematic constraints must be
defined; these relate to the radius of the wheels, their distance from P, and their spinning. The
equation from which the model is solved is (10):


ξ̇_I = R(θ)^-1 ξ̇_R (10)

where R(θ)^-1 is:

R(θ)^-1 = [cosθ  −sinθ  0; sinθ  cosθ  0; 0  0  1] (11)
Equation (10) provides a transformation which is used in all kinematic blocks of the simulation,
because the robot moves according to the global axes while the equations refer to its local axes.

Moreover, depending on the type of wheel, there will be different parameters to take into account.
Some assumptions must be made to reach a simplified model: there is only one point of contact
between the wheel and the ground, with no sliding, and the rotation axis of the wheel is always parallel
to the floor. Every wheel follows these conditions; in this thesis, the standard and Swedish wheel
equations are presented.

• Standard wheels

There are two models: one can rotate around its vertical axis, the steered standard wheel, and the
other cannot, the fixed standard wheel. Thus, the angle between the steered wheel plane and the robot
chassis, β(t), can vary over time. Equations (12) and (13) express the two assumptions above, the rolling
and sliding constraints respectively, where l and α are the distance and angle from P respectively, the
wheel radius is denoted by r, and its angular velocity by φ̇.

[sin(α + β)  −cos(α + β)  −l·cosβ] R(θ) ξ̇_I − rφ̇ = 0 (12)

[cos(α + β)  sin(α + β)  l·sinβ] R(θ) ξ̇_I = 0 (13)
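
As a numerical check of the rolling constraint (12), the following Python sketch (with example dimensions, not the thesis model's) verifies that a wheel rolls without slipping exactly when φ̇ = v/r:

```python
import math

def rolling_constraint(alpha, beta, l, r, theta, xi_dot_I, phi_dot):
    """Left-hand side of the standard-wheel rolling constraint (12);
    it is zero when the wheel rolls without slipping."""
    c, s = math.cos(theta), math.sin(theta)
    # R(theta) * xi_dot_I -> motion expressed in the robot frame
    xr = c * xi_dot_I[0] + s * xi_dot_I[1]
    yr = -s * xi_dot_I[0] + c * xi_dot_I[1]
    th = xi_dot_I[2]
    J = [math.sin(alpha + beta), -math.cos(alpha + beta), -l * math.cos(beta)]
    return J[0] * xr + J[1] * yr + J[2] * th - r * phi_dot

# A wheel at alpha = pi/2, beta = 0 on a robot translating along X_R at 1 m/s:
# the constraint is satisfied exactly when phi_dot = v / r.
residual = rolling_constraint(math.pi / 2, 0.0, l=0.2, r=0.05, theta=0.0,
                              xi_dot_I=[1.0, 0.0, 0.0], phi_dot=1.0 / 0.05)
```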

• Swedish wheel

This type of wheel is omnidirectional: it can move in any direction. The terms represent the same
variables as in (12) and (13), with the addition of γ, the angle of the roller rotation axis relative to the
wheel plane. The radius and angular velocity of the rollers are r_sw and φ̇_sw respectively. Finally,
this type of wheel does not impose kinematic constraints on the robot motion.

[sin(α + β + γ)  −cos(α + β + γ)  −l·cos(β + γ)] R(θ) ξ̇_I − rφ̇·cosγ = 0 (14)

[cos(α + β + γ)  sin(α + β + γ)  l·sin(β + γ)] R(θ) ξ̇_I − rφ̇·sinγ − r_sw·φ̇_sw = 0 (15)

To illustrate these equations, the majority of the parameters used are included in Figure 37.
Figure 37 Wheel parameters
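
Since Swedish wheels are used in the simulation's Mecanum model, a common closed-form inverse kinematics for a four-Mecanum-wheel platform (rollers at 45°) can be sketched in Python; the wheel ordering and dimensions here are assumptions for illustration, not the thesis model's:

```python
def mecanum_wheel_speeds(vx, vy, omega, r=0.05, lx=0.2, ly=0.15):
    """Inverse kinematics of a four-Mecanum-wheel platform (rollers at 45 deg),
    using one common wheel ordering (FL, FR, RL, RR); r, lx, ly are example
    values (wheel radius and half-distances to the wheel axles)."""
    k = lx + ly
    return [
        (vx - vy - k * omega) / r,  # front left
        (vx + vy + k * omega) / r,  # front right
        (vx + vy - k * omega) / r,  # rear left
        (vx - vy + k * omega) / r,  # rear right
    ]

# Pure sideways motion: the wheels on each diagonal spin together, so the
# longitudinal components cancel and only lateral motion remains.
w = mecanum_wheel_speeds(0.0, 0.5, 0.0)
```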


Appendix C MATLAB code and Simulink files

With the purpose of showing the MATLAB scripts and Simulink pages developed during this thesis,
this appendix collects all the programming material of the simulation.

C.1. Maze_Creator.m

This script creates a map of obstacles according to user preferences. The following matrix covers from
map(1,1) to map(20,20), where map(1,1) is placed at [-9.5, 1, -9.5] in the upper-left corner, not in the
centre of the scenario. In the code, it can be seen that the x3d file is opened to add new nodes, which
correspond to the obstacles. The comments in the script explain the instructions.

%Matrix of the map


map=[...
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1;
1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1;
1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1;
1,0,0,1,1,1,1,1,1,0,0,0,1,1,1,1,0,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1;
1,0,0,1,1,1,1,1,1,1,1,1,1,0,0,0,0,0,0,1;
1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1;
1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1;
1,0,0,1,1,0,0,0,0,0,0,1,1,1,1,1,1,1,1,1;
1,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1;
1,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1;
1,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1;
1,0,0,0,1,1,0,0,0,0,0,1,0,0,0,0,1,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,1,1,1,1,1,1,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1;
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1;
1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1];

%Each 1 corresponds to a cube of 1m^3

% Bring up template and create the obstacles


myworld = vrworld('Scenario.x3d');
% open the virtual world
open(myworld);
% create and handle a node 'AUTOTerrain', the node that will contain the DEM data
Walls_node = vrnode(myworld,'Walls');
%delete(vrnode(myworld,'Walls','children'));
for z=1:20
for x=1:20
if(map(z,x)>0.5)
% create a child of 'Walls' - transform
newTransform = vrnode(Walls_node, 'children', 'Wall_Transform', 'Transform');
newTransform.translation=[-10+(x-0.505)*1,0.5,-10+(z-0.505)*1];

newShape= vrnode(newTransform, 'children', 'Wall_Shape', 'Shape');


% create appearance field for the shape
newAppear = vrnode(newShape, 'appearance', 'Wall_Appearance', 'Appearance');
% create material field for the appearance
newMat = vrnode(newAppear, 'material', 'Wall_Material','Material');
newBox = vrnode(newShape, 'geometry', 'Wall_Box','Box');
% assign properties for the geometry field - use DEM data
newBox.size=[1.01 1 1.01];
newTexture = vrnode(newAppear, 'texture', 'Brick_small','ImageTexture');
newTexture.url = 'texture/Brick_2.jpg';
end
end
end

%Save the changes


save(myworld, 'Scenario.x3d')
close(myworld);
pause(2);
%delete(myworld);

C.2. SLAM.m

This second script contains the commands to create a LIDAR sensor object in the MATLAB workspace
and apply it with loop closure in order to obtain an estimation of the map in which the AGV operates.
The script only shows the creation of the map, because showing the robot requires more processing
time. Thus, the only information needed from the simulation is the sensor ranges. The comments in
the code explain every step in the script.

% Create Lidar SLAM Object

%The LIDAR sensor uses 240 laser lines and the angle between each one is 1.5
%degrees.
angles = 180:-1.5:-178.5;
angles = deg2rad(angles)';

%The sensor has a range of 10 m. The range set needs to be shorter than the
%original range so as not to lose accuracy. The grid map resolution is the
%number of cells per metre; it affects the precision.
maxLidarRange = 8;
mapResolution = 20;
slamAlg = lidarSLAM(mapResolution,maxLidarRange);

%The loop closure parameters are set empirically. Using a higher loop closure
threshold helps reject
%false positives in loop closure identification process. Keep in mind that a high-
score match may still
%be a bad match. Using a higher loop closure search radius allows the algorithm to
search a wider
%range of the map around the current pose estimate for loop closures.
slamAlg.LoopClosureThreshold = 200;
slamAlg.LoopClosureSearchRadius = 3;
controlRate = rateControl(10);


%%
% Observe the Effect of Loop Closure and Optimization Process
%The data is obtained from a simulation previously done in Simulink.
%Loop closures are detected as the range data is added. The pose graph optimization is performed
performed
%whenever a loop closure is detected. This can be checked using the output
optimizationInfo.IsPerformed value from addScan.
%A snapshot is shown to demonstrate the scans and poses when the first loop
%closure is identified and to verify the results visually. This plot shows
%overlaid scans and an optimized pose graph for the first loop closure.
%The plot is updated continuously as the robot navigates through the virtual scene.
firstLoopClosure = false;
scans = cell(length(out.Rang.Time),1);

figure
for i=1:length(out.Rang.Time)
% Read the range readings obtained from LIDAR sensor of the robot.
range=out.Rang.Data(:,i);

% The simulated lidar readings will give -1 values if the objects are
% out of range. Set all these values to be greater than maxLidarRange.
range(range==-1) = maxLidarRange+2;

% Create a |lidarScan| object from the ranges and angles.


scans{i} = lidarScan(range,angles);
[isScanAccepted,loopClosureInfo,optimizationInfo] = addScan(slamAlg,scans{i});

if isScanAccepted
% Visualise how scans plot and poses are updated
show(slamAlg);

% Visualise the first detected loop closure


% firstLoopClosure flag is used to capture the first loop closure event
if optimizationInfo.IsPerformed && ~firstLoopClosure
firstLoopClosure = true;
show(slamAlg,'Poses','off');
hold on;
show(slamAlg.PoseGraph);
hold off;
title('First loop closure');
snapnow
end
end
waitfor(controlRate);
end
%%
% Plot the final built map
show(slamAlg,'Poses','off');
hold on
show(slamAlg.PoseGraph);
hold off
title('Final Built Map of the Environment');
xlabel('X [meters]');
ylabel('Z [meters]');

%%
% Build Occupancy Grid Map


%The optimized scans and poses can be used to generate an occupancy map which
represents the environment as a probabilistic occupancy grid.
[scans,optimizedPoses] = scansAndPoses(slamAlg);
map = buildMap(scans,optimizedPoses,mapResolution,maxLidarRange);

figure;
show(map);
hold on
show(slamAlg.PoseGraph,'IDs','off');
hold off
title('Occupancy Grid Map Built');
legend('Trajectory of the Robot');

C.3. AGV_Simulation.slx

This last section includes screenshots of the main Simulink pages of the simulation programme. It can
be compared with Chapter 5 Structure of the simulation, since that chapter explains the characteristics
of each part of the simulation with more schematic figures; this section, therefore, only shows the real
blocks in Simulink. Figure 38 depicts the first page of AGV_Simulation.slx, in which the main blocks
are brought together.

Figure 38 AGV Simulation file

C.3.1. Sensors section

Figure 39 represents the sensors of the robot models. There are four sensors for each model and
another one for SLAM. Because this last sensor is not needed by the next part of the simulation, its
data is collected and processed separately in MATLAB by SLAM.m.


Figure 39 Sensor part

C.3.2. Control & Kinematics section

This block contains the forward and inverse kinematics as well as the algorithms of each model.
Furthermore, the scales block, which controls the size of the body, sensor and wheels of the selected
AGV, is included here because the mask of this block holds the size parameters (Figure 40). On the
other hand, the outputs of the kinematic blocks are ordered as shown so that they can be put together
in a single vector; to separate them afterwards, they must be ordered from greatest to smallest vector
length. As explained in Chapter 6 Interact and modify the simulation, the rotation needs four
parameters and the translation only three.

Figure 40 Kinematics and control part with scales block


The inverse kinematics block and the forward kinematics block are shown in Figure 41 and Figure 42
respectively. They are very similar, with the difference that the equations focus either on the robot
itself or on its wheels.

Figure 41 Inverse kinematics

Figure 42 Forward kinematics


C.3.3. Visualization section

The last part of the simulation is the visualization. The biggest block corresponds to a vrsink, which
contains all the outputs needed to show the simulation properly. Moreover, it has an output to the
video viewer, which includes a camera sensor for applying image-recognition algorithms. See Figure 43.

Figure 43 Visualization part
