
COLLISION FREE PATH PLANNING ALGORITHMS FOR ROBOT NAVIGATION PROBLEM

A Thesis presented to the faculty of the Graduate School University of Missouri-Columbia

In Partial Fulfillment of the Requirement for the Degree Master of Science

By Kyung min Han

Dr. Robert W. McLaren, Thesis Supervisor AUGUST 2007

The undersigned, appointed by the dean of the Graduate School, have examined the thesis entitled COLLISION FREE PATH PLANNING ALGORITHMS FOR ROBOT NAVIGATION PROBLEM, presented by Kyung min Han, a candidate for the degree of Master of Science, and hereby certify that, in their opinion, it is worthy of acceptance.

Dr. Robert W. McLaren, Professor Emeritus, Electrical and Computer Engineering

Dr. Guilherme Desouza, Assistant Professor, Electrical and Computer Engineering

Dr. Roger Fales, Assistant Professor, Mechanical and Aerospace Engineering

ACKNOWLEDGEMENTS

Looking back on my master's research years, I realize that I have gone through a great program which strengthened my academic knowledge and gave me a broader view of what the electrical engineering discipline really is. Needless to say, I have faced a number of situations that seemed too hard to overcome. However, I was lucky enough to have great faculty members and friends who were always ready to help me out. First of all, I deeply thank my Lord, who always listens to me whenever I pray. I am especially grateful to my adviser, Dr. Robert McLaren. His precious advice and support always encouraged me in the right direction toward my research goal. It would not have been possible for me to finish my work without his kind help and wise suggestions throughout all my master's years. I am also grateful to Dr. Guilherme DeSouza for his valuable teaching and encouragement, which always led me to make the right decision whenever I faced difficult situations. I would like to express my gratitude to Dr. Roger Fales for kindly agreeing to join my thesis committee. I am also grateful to Dr. Keller, who offered me such a useful teaching and study opportunity, which helped me develop a better idea of my research topic. I am grateful to Dr. Tai seung Jang for his generous help and advice as my senior. I am pleased to thank my friend Bon seok Koo for his valuable help. I am thankful to my lab mates: Yuanqiang Dong, Hui Peng, Vishal Rijhwani, Youyou Wang and Thomas Konig.


Last, but not least, I am deeply thankful to my family members in South Korea: my father Haeng yong Han, my mother Jung shim Yoo, and my sister Hye jin Han, for cheering me up to confront new challenges in my life.

TABLE OF CONTENTS

ACKNOWLEDGEMENTS ................................................................. ii
ABSTRACT ......................................................................... v

Chapter
1. INTRODUCTION .................................................................. 1
   1.1 Path planning
2. HISTORICAL OVERVIEW OF THE GA AND ACO ALGORITHM ............................... 3
   2.1 Genetic Algorithm
   2.2 Ant Colony Optimization Algorithm
3. PROBLEM DESCRIPTION ........................................................... 5
4. PROPOSED SOLUTIONS ............................................................ 6
   4.1 Genetic Algorithm
   4.2 Ant Colony Optimization Algorithm
5. RESULTS ...................................................................... 16
   5.1 Results with the Genetic Algorithm
   5.2 Results with the Ant Colony Optimization Algorithm
6. CONCLUSIONS .................................................................. 38
   6.1 Limitation in Genetic Algorithm
   6.2 Limitation in Ant Colony Optimization
   6.3 Scope for future work

APPENDIX
1. MATLAB CODE .................................................................. 41

REFERENCES ...................................................................... 53

ABSTRACT

Path planning, including maze navigation, is a challenging topic in robotics. Indeed, a significant amount of research has been devoted to this problem in recent years. This thesis addresses the path finding problem using two different approaches. The genetic algorithm is a popular approach that searches for an optimal solution in a given set of solutions. Considering via points as genes in a chromosome provides a number of possible solution paths on a grid map, and the path distance that each chromosome creates can be regarded as a fitness measure for the corresponding chromosome. In some cases a solution path passes through an obstacle. Assuming that the shape of an obstacle is a circle, such invalid solutions can easily be eliminated by setting up a simple equation between the line created by two via points and the obstacle. The ant colony optimization algorithm is another approach to this problem. Each ant drops a quantity of artificial pheromone on every point that it passes through. This pheromone simply changes the probability that the next ant becomes attracted to a particular grid point. Since each ant makes a decision at every grid point that it encounters, it is possible that an ant may wander around the grid map or become stuck among local grid points. In order to prevent this phenomenon, the proposed solution adopts a global attraction term which guides ants toward the destination point. Both algorithms are tested and compared in the results section. The experimental results demonstrate that these two methods have great potential to solve the proposed problem.

1.0 Introduction

1.1 Path Planning

The robot path planning problem is a very challenging problem in robotics, and a number of studies have been attempted, resulting in a significant number of solutions. The main goal of this problem is to construct a collision-free path from a starting position to an end or destination position. Once the optimum collision-free path is constructed, it is then a matter for the robot to accurately follow the pre-determined path.

As Ibrahim [17] pointed out, the mobile robot navigation problem is a challenging problem. Three major concerns in regard to robot navigation problems are efficiency, safety, and accuracy. Furthermore, this navigation problem includes several difficult phases that need to be overcome, such as obstacle avoidance, position identification, and so forth. Accordingly, this problem can be broken into several subtasks. A reliable navigation algorithm must be able to 1) identify the current location of the robot, 2) avoid any collisions, and 3) determine a path to the object.

The main scope of the path finding problem involves efficiency and safety issues. The efficiency of the algorithm is considered an important matter, since one of the main concerns is to find the destination in a short time. For this reason, a desirable path should not let the robot waste time taking unnecessary steps or becoming stuck in local minimum positions. Furthermore, a desirable path should avoid all known obstacles in the area. This safety issue is another critical part of the algorithm. A number of algorithms have been proposed to address these two important issues.

D. Huh, J. Park, U. Huh, and H. Kim [4] approached the path finding problem by combining global path planning and local path planning. They used the Dijkstra algorithm for global path planning and the potential field method for local path planning. S. Lee and G. Kardaras [5] used via points (VP) to find the optimum path. They developed a "smart" algorithm which could change the number of via points in response to a different level of complexity of the map upon which paths would be generated. N. Bourbakis and L. Vlachavas [6] presented a path planning algorithm that uses a neural network and a skeletonization technique. N. Sadati and J. Taheri [7] presented a combination method consisting of a Hopfield Neural Network (NN) and a genetic algorithm (GA). Thus, numerous approaches to solving the path finding problem have been published, including those indicated previously.

This thesis will apply two approaches to solving the path finding problem: 1) a genetic algorithm (GA) and 2) an ant colony optimization algorithm (ACO). The remainder of this thesis is organized into five major chapters. Chapter 2 discusses historical reviews of the Genetic Algorithm and the Ant Colony Optimization algorithm. Chapter 3 describes the proposed problem. Chapter 4 introduces the two proposed solutions (GA, ACO) for the proposed problem. In the following chapters, the proposed solutions are implemented and demonstrated through computer simulations; in Chapter 5, the simulation results are presented. Chapter 6 provides a conclusion.

2.0 Historical overview of the GA and ACO algorithm

2.1 Genetic Algorithm

An evolutionary computational strategy was first proposed by Rechenberg and Schwefel in 1965 [1] as a numerical optimization technique. In the mid 1960's, Fogel proposed the first evolutionary program. These two steps can be considered as pioneering works in the discipline of evolutionary computation. However, the current framework of the so-called "genetic algorithm," or GA, was developed by John Holland in 1975 [2]. His pioneering book, Adaptation in Natural and Artificial Systems, presented an evolutionary method that involves natural selection, crossover, and mutation. The genetic algorithm has become a well-known technique for optimization, intelligent search, and machine learning. For example, the GA represents a feasible approach to the classical traveling salesman problem, flowshop optimization, job shop scheduling, and the like [3]. These problems bear a strong similarity in that their main objective is that of optimizing or selecting the best solution out of a number of possible solutions. Thus, the GA has a reasonable motive for being employed in a path optimizing problem.

2.2 Ant Colony Optimization

The Ant Colony Optimization (ACO) method is a more recent, but very active, research area in the discipline of computational intelligence. The first ACO algorithm, called the Ant System, was developed by Marco Dorigo and his colleagues [10]. This algorithm takes inspiration from the social behavior of ants. The central concept of this algorithm is based on the pheromone trail and the trail-following behavior of real ants. Based on biological studies, some species, such as ants, are stimulated by how well they have performed (certain) tasks. The stimulation in the world of the ant can be equated to their pheromone trail. In the ACO algorithm, a collection of artificial ants construct potential solutions to a problem requiring optimization based on this pheromone (feedback) information. The constructed solutions are then evaluated by their quality in terms of the performance achieved. Then, the pheromone trail is updated in accordance with this evaluation. The so-called "ant cycle algorithm" of Dorigo [8] was based on the above principle. His algorithm was applied to the classical traveling salesman problem along with the asymmetrical traveling salesman problem (ATSP) and the job scheduling problem. The results achieved, as reported in his paper [14], demonstrated the algorithm's versatility as well as its robustness.

From the early 90's, after the first ACO or AS [10] algorithm(s) were proposed, a significant number of successful applications became available. M. Dorigo and M. Birattari [9] presented a theoretical approach to a classical traveling salesman problem. Similar to other biologically inspired algorithms, such as Particle Swarm Optimization [18], the ACO algorithm performs well in several different optimization problems. Examples of these applications include the classical TSP, telecommunication routing problems, and the "single machine total weighted tardiness scheduling problem" (SMTWTP) [8]. In fact, the algorithm performs well in finding an optimal path or an optimal sequence of ants' steps that defines a path. Thus, it is claimed that the ACO algorithm is a feasible approach to the proposed path planning problem. A significant portion of this report shows how the ACO meets such expectations.

3.0 Problem description

Consider a 2-D square map overlaid with a uniform pattern of grid points. The size of the map can be changed arbitrarily; here, the map consists of a 20 x 20 grid. The left bottom corner of the map is the starting point for a path, while the right top corner of the map is the destination point. The shape of an obstacle is always a circle, but the size of an obstacle is variable from 1 to 5 grid points. Furthermore, multiple obstacles are possible. The positions of the obstacles are randomly selected and can be located at any grid point in the map except at points close to the starting point region or close to the goal point region (say 5 grid points away). Fig 3.1 shows an example of such an arrangement.

Figure 3.1: Problem configuration (S: starting point, O1, O2: obstacles, D: destination point)
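To make this configuration concrete, the following small MATLAB sketch sets up such a grid and marks the points covered by one circular obstacle. The obstacle position and radius here are only illustrative values, not the exact maps used in the experiments.

map_size = 20;                          % the map is a 20 x 20 grid of integer points
S = [0 0];                              % starting point (bottom-left corner)
D = [map_size map_size];                % destination point (top-right corner)
obst = [10 10];  radius = 3;            % one circular obstacle (illustrative values)

% Enumerate every grid point and mark the ones lying inside the obstacle.
[gx, gy] = meshgrid(0:map_size, 0:map_size);
blocked  = (gx - obst(1)).^2 + (gy - obst(2)).^2 <= radius^2;

fprintf('%d of %d grid points are covered by the obstacle.\n', nnz(blocked), numel(gx));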

The goal is to construct a shortest path from the starting point, S, to the destination (goal) point, D, which avoids every obstacle in the map (a path cannot touch any obstacle). In order to solve this problem, two approaches, 1) the genetic algorithm and 2) the ACO algorithm, are proposed in the next section.

4.0 Proposed solutions

4.1 Genetic algorithm

Definitions of keywords [16]

Chromosome: a set of parameters (genes) which define a possible solution of the proposed problem.
Fitness: performance of each chromosome in terms of its output, evaluated by a fitness function.
Selection: survival of the chromosome that is the best fit in the current generation.
Crossover: exchange of bits between two parents.
Mutation: random change of one or more bits in a chromosome.

Fig 4.1a: Example of a single point crossover at the third bit position from the LSB
    parent1:  110101010        offspring1:  110101001
    parent2:  101011001        offspring2:  101011010

Fig 4.1b: Example of a mutation (mutation of a chromosome at the 6th bit)
    1010110101  ->  1010100101

In the proposed algorithm, via points for a path represent the genes or bits in each chromosome. Each grid point (x, y) represents a gene of a chromosome. The left-most gene is always the starting point (0, 0) and the right-most gene is the destination point (20, 20). Since the algorithm searches for the best path, these two genes are fixed; however, the genes between them are changeable. For example, if a path has 3 via points (5, 5), (10, 7), and (18, 17), the 5-gene chromosome for this path is [(0, 0) (5, 5) (10, 7) (18, 17) (20, 20)]. Fig 4.1c shows this configuration.
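As a small, hypothetical illustration of this encoding (not the thesis code; the via points are the ones from the example above), a chromosome and its path length can be written in MATLAB as:

Xch = [0  5 10 18 20];    % x-coordinates of the 5 genes: start, 3 via points, goal
Ych = [0  5  7 17 20];    % y-coordinates of the same genes

% Path length of the chromosome = sum of straight-line segments between genes.
d = sum( sqrt( diff(Xch).^2 + diff(Ych).^2 ) );
fprintf('Path length of this chromosome: %.4f\n', d);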

Fig 4.1c: A chromosome (path) with 3 genes (via points)

The proposed genetic algorithm consists of 3 main steps: natural selection, crossover, and mutation. The methodologies associated with these steps are described in subsections i) through v). Subsection vi) explains the obstacle avoidance technique. Subsections vii) and viii) present other important features of the proposed algorithm, such as the rules and assumptions and the entire program sequence.

i) Natural selection

At each generation (iteration), all the chromosomes are updated by their fitness. Thus, if a particular chromosome has better fitness (a shorter path distance) than the other chromosomes, then that particular chromosome is more likely to win the competition and clone itself. In other words, a chromosome with good fitness has a much higher probability than other, inferior chromosomes of appearing in the next generation.

ii) Crossover

A group of chromosomes undergoes crossover at each generation. All the crossover events are controlled by a predetermined crossover rate Pc. That is, the algorithm creates a random number in [0, 1] for each chromosome; if the generated number is less than Pc, the chromosome is a candidate for the crossover event. For the purpose of diversity, the crossover point (bit) is randomly selected in each generation. The left-most genes and the right-most genes do not take part in the crossover event, since these two points cannot be eliminated.

iii) Mutation

Unlike the crossover event, mutation is performed on a bit-by-bit basis. In other words, the population is expected to perform a 1% bit mutation if the given mutation rate is Pm = 0.01. For example, if 20 chromosomes exist with 5 bits in each one, then a 20 x 5 x 0.01 = 1 bit mutation is expected to happen.

iv) Fitness function

The fitness of each chromosome is evaluated in terms of its path distance. The fitness function used in this particular problem is

    ft(k) = 1 / d(k),

where d(k) is the path length of the kth chromosome. Thus, the smaller the distance is, the better the corresponding chromosome will be.

v) Elitism

In order to keep the best chromosome from each generation, the elitism method is employed. The main goal of the elitism rule is to keep the best chromosome from the current generation. Under this rule, the best chromosome from each generation does not undergo any mutation or crossover event and safely moves on to the next generation.
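A minimal sketch of the fitness evaluation, roulette-wheel selection, and elitism steps is shown below. It assumes the path lengths of the current population are already stored in a vector d; the values and variable names are illustrative only and differ from the full implementation in the Appendix.

d  = [31.2 35.8 29.4 42.0 33.1];   % example path lengths of 5 chromosomes
ft = 1 ./ d;                       % fitness ft(k) = 1/d(k): shorter path -> higher fitness
P  = ft / sum(ft);                 % roulette-wheel selection probabilities

[~, eliteIdx] = max(ft);           % elitism: remember the best chromosome of this generation

% Clone the next generation by spinning the roulette wheel once per population slot.
popSize = numel(d);
cumP    = cumsum(P);
newIdx  = zeros(1, popSize);
for k = 1:popSize
    newIdx(k) = find(cumP >= rand, 1, 'first');
end
newIdx(1) = eliteIdx;              % the elite chromosome skips crossover and mutation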

vi) Obstacle avoidance

New chromosomes appear in every generation. Even if the performance (path distance) of a new chromosome is acceptable, it could be a useless solution if the corresponding path passes through any of the obstacles. Since via points cannot lie on any of the obstacles, a line created by two via points either passes through an obstacle or it does not. Knowing this fact, one can check all the lines created between via points.

For example, consider a chromosome with 1 via point, (0, 0) (10, 10) (20, 20), and say an obstacle is located at the position (12, 12) with a radius of 2. Starting from the first line created by two via points, (0, 0) and (10, 10), the algorithm checks if the line passes through the obstacle. The line equation is given by

    y = x, where 0 < x < 10.                                   (4.1.1)

The obstacle can be expressed as

    (x - 12)^2 + (y - 12)^2 = 4.                               (4.1.2)

Combining Eqs. (4.1.1) and (4.1.2) will yield two possible solutions. If the x coordinate of the obstacle lies between the two given via points and the two solutions are real numbers, then the algorithm declares that this line passes through the obstacle and eliminates the chromosome. Fig 4.1.1 depicts this concept: although chromosome 1 has better performance compared to chromosome 2, only chromosome 2 will survive, because chromosome 1 passes through the obstacle.
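The intersection test can be expressed directly in code. The sketch below is a simplified, hypothetical helper (the routine test_chromosome in the Appendix follows the same idea): it substitutes the line through two via points into the circle equation and reports a collision when the real roots fall inside the segment's x-range.

function hit = segment_hits_obstacle(x1, y1, x2, y2, x0, y0, r)
% True if the segment from (x1,y1) to (x2,y2) crosses the circle centred at (x0,y0), radius r.
if x1 == x2                                  % vertical segment: x is fixed at x1
    c2 = r^2 - (x1 - x0)^2;
    if c2 < 0
        hit = false;                         % the line x = x1 misses the circle entirely
    else
        ys  = y0 + [-1 1]*sqrt(c2);          % the two intersection y-values
        hit = any(ys >= min(y1,y2) & ys <= max(y1,y2));
    end
else
    m = (y2 - y1) / (x2 - x1);               % slope of the line through the two via points
    c = y1 - m*x1;                           % intercept
    % Substitute y = m*x + c into (x-x0)^2 + (y-y0)^2 = r^2, giving a quadratic in x.
    A  = m^2 + 1;
    B  = 2*m*c - 2*y0*m - 2*x0;
    C  = c^2 - 2*y0*c + y0^2 + x0^2 - r^2;
    xs = roots([A B C]);
    hit = isreal(xs) && any(xs >= min(x1,x2) & xs <= max(x1,x2));
end
end

% Example from the text: the segment (10,10)-(20,20) against the obstacle at (12,12), r = 2,
% returns true, so a chromosome containing that segment would be eliminated.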

Fig 4.1.1: Obstacle avoidance example

vii) Rules and assumptions

Only integers are employed to locate positions on the map. The size of the map is 20 x 20 grid points. The starting point is (0, 0) and the goal point is (20, 20). The shape of an obstacle is a circle with a radius of 1 or 1.5 grid points most of the time; however, a few of the experiments are done with a radius of 5 grid points. A via point cannot be the starting point, the end point, or any point inside of an obstacle, and none of the obstacles can include the starting point or the goal point. The first bit and the last bit of a chromosome are (0, 0) and (20, 20) respectively. If 3 via points are chosen, then the chromosome may look like [(0, 0), (x1, y1), (x2, y2), (x3, y3), (20, 20)].

viii) Program sequence

Choose the size of the population (the number of chromosomes)
Choose the size of a chromosome
Create the number of obstacles and their positions (Xo, Yo)
Choose the mutation rate (Pm) and the crossover rate (Pc)
Initialize the population at the chosen size, for example 100
Do For generation = 1, 2, ..., N
    Compute the fitness (the distance of the path from (0, 0) to (20, 20)) of the chromosomes
    Apply cloning and natural selection to each chromosome
    Apply a crossover technique to each chromosome
    Apply mutation
    Eliminate all chromosomes (paths) that pass through any of the obstacles
    If the size of the current population < the size of the original population
        Generate extra chromosomes which avoid all obstacles
    End if
End For

4.2 ACO (Ant Colony Optimization) algorithm

Definitions of keywords [10]

Artificial ant: an artificial agent which makes movements based on the attraction of pheromone.
Pheromone: a chemical substance deposited by an ant when walking.

i) Methodology

The TSP (traveling salesman problem) is a paradigmatic optimization problem, since it was used to demonstrate the original AS (Ant System). Since then it has often been used as a benchmark to test new ACO concepts. The proposed path-finding algorithm employs the original concept of the ACO algorithm with some modification. The idea of the proposed algorithm is as follows.

Starting from the grid point (0, 0), an ant iteratively moves from a grid point to one of its neighboring grid points. When at the ith grid point (x, y), ant k can choose the next (jth) grid point from its 8 neighbor locations, [(x+1, y), (x-1, y), (x, y+1), (x, y-1), (x+1, y+1), (x+1, y-1), (x-1, y+1), (x-1, y-1)] ∈ N, where N in general is the set of all neighboring locations of the current location. The ant takes its next step randomly, based on the probability given by

    φ_ij^k(t) = ( τ_ij(t) + α * Ω_ij(t) ) / Σ_{l∈N} τ_il(t)          (4.2.1)

where τ_ij(t) is the accumulated pheromone on the jth grid point when the ith grid point is the ant's current location at time t, and α is a scale parameter. The quantity τ_il indicates every possible lth neighbor point when the ant is at the ith position. Ω_ij(t) is the dot product of the vector from i to j with the vector from i to the destination point. The first term is associated with the pheromone amount in the 8 neighboring grid points (a local pheromone). The second term is associated with a global attraction; the global term is defined as the dot product of the agent's heading direction and the food (destination) direction. The purpose of the global term is to guide the agent in the desired direction so that a large number of ants can reach the goal point in a limited number of steps.

    (x-1, y+1)   (x, y+1)   (x+1, y+1)
    (x-1, y)     (x, y)     (x+1, y)
    (x-1, y-1)   (x, y-1)   (x+1, y-1)

Fig 4.2.1: An agent's current position (bold) with 8 possible next positions (italic)

The solution construction ends after each ant reaches the destination or after the ant has taken too many steps (50 or 100 steps, for example). After a group of agents has finished its tour, the pheromone in the entire map is updated by

    τ_ij(t+1) = (1 - ρ) * τ_ij(t) + Σ_{k=1}^{m} Δγ_ij^k(t)           (4.2.2)

where 0 < ρ < 1 is the pheromone forgetting parameter. This parameter prevents the map from unlimited accumulation of the pheromone. The quantity Δγ_ij^k is the amount of pheromone that ant k deposits on the map. It is defined as

    Δγ_ij^k(t) = 1 / L_k(t)                                          (4.2.3)

where L_k(t) is the length of the kth ant's tour (path). However, the pheromone is deposited only if the ant reaches the destination position in less than a certain number of steps. When the kth ant cannot reach the destination in a certain number of steps, let L_k(t) = Infinity.
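The two rules above can be sketched compactly in MATLAB. Everything below is illustrative: the pheromone map tau, the parameter values, and the toy tour at the end are assumed for the example and are not the thesis settings (the ACO code in the Appendix gives the full version).

tau   = ones(20, 20);            % pheromone map, one value per grid point (assumed initial value)
alpha = 0.01;                    % scale of the global attraction term (assumed)
rho   = 0.01;                    % pheromone forgetting / evaporation rate (assumed)
dest  = [20 20];  cur = [5 7];   % destination and an example current ant position

% --- next-step probabilities, Eq. (4.2.1) ---
moves = [1 0; -1 0; 0 1; 0 -1; 1 1; 1 -1; -1 1; -1 -1];       % the 8 neighbour offsets
nbrs  = cur + moves;                                          % neighbouring grid points
tauN  = tau(sub2ind(size(tau), nbrs(:,1), nbrs(:,2)));        % local pheromone term
Omega = moves * (dest - cur)';                                % dot product with the goal direction
phi   = (tauN + alpha*Omega) / sum(tauN);                     % Eq. (4.2.1)
p     = max(phi, 0) / sum(max(phi, 0));                       % clip negatives and normalise
next  = nbrs(find(cumsum(p) >= rand, 1), :);                  % roulette-wheel choice of next point

% --- pheromone update after the whole group finishes, Eqs. (4.2.2) and (4.2.3) ---
tau = (1 - rho) * tau;                       % evaporation over the entire map
Lk  = 30.5;                                  % tour length of one successful ant (toy value)
tourPts = [1 1; 2 2; 3 3];                   % grid points visited on that tour (toy example)
idx = sub2ind(size(tau), tourPts(:,1), tourPts(:,2));
tau(idx) = tau(idx) + 1/Lk;                  % deposit 1/Lk along the tour; repeat for every ant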

ii) The elitist strategy

The main idea of the elitist strategy is to give a significant additional weight to the best tour constructed by a group of agents. In other words, each time the pheromone trails are deposited, the grid points belonging to the group's best tour get an additional amount of pheromone. For these grid points, equation (4.2.3) becomes:

    Δγ_ij^gb(t) = e / L_gb(t)                                        (4.2.4)

where L_gb represents the length of the group's best tour and e is a positive integer which reinforces the global best path.

Algorithm pseudo code

Do For iteration = 1, 2, ..., N
    Do For ant = 1, 2, ..., K
        Do For step = 1, 2, ..., M
            Compute the probability of the kth ant's next location
            Move to the next grid point according to the computed probability
            Store the history of past location points in an array
            If the current location is equal to the destination
                Break the step loop
            End if
        End For
        Store the tour distance made by the kth ant
        Compute the pheromone amount generated by the kth ant
    End For
    Update the pheromone amount of the entire map
End For
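Following the same sketch as before (and reusing its tau map), the elitist reinforcement of Eq. (4.2.4) is a single extra deposit along the global-best tour; the values here are again only assumed examples.

e     = 5;                                   % elitist weight (assumed value)
Lgb   = 29.0;                                % length of the global-best tour (toy value)
gbPts = [1 1; 2 2; 3 3; 4 4];                % grid points of the global-best tour (toy example)
idx   = sub2ind(size(tau), gbPts(:,1), gbPts(:,2));
tau(idx) = tau(idx) + e / Lgb;               % extra pheromone e/Lgb on the best tour, Eq. (4.2.4)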

5.0 Results

The results of applying the GA (Genetic Algorithm) and the ACO (Ant Colony Optimization) are presented in this section. All simulations were done with MATLAB. This section is divided into two subsections. In order to make objective comparisons of the proposed algorithms, different test conditions were set up to create a variety of experiments. Experiments were performed with a different number of obstacles, mutation rates, crossover rates, and numbers of chromosome bits. Furthermore, these studies include the effect of elitism. Appropriate diagrams and graphs are included to help illustrate the results. In every figure, the circular objects represent the obstacles and the star objects represent the chromosome bits; the size of an object in a diagram could be smaller than its actual size.

5.1 Results with the genetic algorithm

Experiment 1: When obstacles do not block the optimum path of the map

• Parameter specifications for Experiment 1: Pc (crossover rate) = 0.25; Pm (mutation rate) = 0.2; number of obstacles = 2; radius of the obstacles = 1; number of via points = 3; elitism: yes

Fig 5.1.1: The best chromosome (solution) at the 10th generation

• Discussion

The interpretation of the above figure is straightforward. Since neither of the obstacles has any influence on finding the best solution, the algorithm found an optimum solution at only the 10th generation.

Experiment 2: When two obstacles block the shortest path

• Parameter specifications for Experiment 2: Pc = 0.25; Pm = 0.2; number of obstacles = 2; radius of the obstacles = 1; number of via points = 3; elitism: yes

Fig 5.1.2: The best solution at the 10th generation

Fig 5.1.3: The best solution at the 100th generation

• Discussion

Again, an optimum solution was found in a relatively small number of generations. Note that there is no difference between the 10th generation solution and the 100th generation solution.

Experiment 3: When multiple obstacles exist

• Parameter specifications for Experiment 3: Pc = 0.25; Pm = 0.4; number of obstacles = 9; radius of the obstacles = 1; number of via points = 4; elitism: yes

Fig 5.1.4: The best path at the 10th generation

Fig 5.1.5: The best path at the 100th generation

• Discussion

It is verified that the best solution at the 100th generation is better than that at the 10th generation. As the number of obstacles increases, the algorithm needs more generations to find the best solution. In general, if the map complexity increases in terms of the number of obstacles as well as their distribution pattern, it is necessary to increase the number of bits in a chromosome. Such a consequence was expected, in the sense that similar characteristics are easily found in other computational intelligence algorithms. For example, an artificial neural network requires a large number of hidden neurons if the network has to deal with highly uncertain situations or nonlinearities.

Experiment 4: A large obstacle with a different number of via points

• Parameter specifications for Experiment 4: Pc = 0.25; Pm = 0.2; number of obstacles = 1; radius of the obstacle = 5; number of via points = 1, 2, 3; elitism: yes

Fig 5.1.6: One via point

Fig 5.1.7: Two via points

Fig 5.1.8: Three via points

• Discussion

The purpose of Experiment 4 is to see if the number of via points has an effect on the performance. Theoretically, more via points should lead to shorter paths. Indeed, as shown in the above figures (Fig 5.1.6 through Fig 5.1.8), the performance with three via points (33.1798) is better than for the other two cases. Both results are achieved by averaging 10 experiments for each case. However, as the number of via points increases, the algorithm requires a larger number of generations to converge.

Experiment 5: Elitism effects on the performance

• Parameter specifications for Experiment 5: Pc range = 0 ~ 0.5; Pm range = 0 ~ 0.8; number of obstacles = 1; radius of the obstacle = 5; number of via points = 1, 2, 3; elitism: yes

It is interesting to see how the performance of the algorithm varies due to employing elitism. The following two figures demonstrate the effect on the performance of the algorithm due to changing the crossover rate or the mutation rate.

Fig 5.1.9: Path distance vs. crossover rate, Pc

Fig 5.1.10: Path distance vs. mutation rate, Pm

• Discussion

The size and the number of obstacles in this experiment were the same as in Experiment 4. Fig 5.1.9 shows how the performance (path distance) varies as the crossover rate increases (with Pm = 0), and Fig 5.1.10 shows how the performance varies as the mutation rate increases (with Pc = 0). Both figures compare the case with elitism and the case without elitism. Notice that employing elitism gives stable and better results in both cases in terms of path distance. According to Fig 5.1.9, the crossover rate has little influence when elitism is employed. Fig 5.1.10 shows that a higher mutation rate leads to better performance, which is not true for the case without elitism.

5.2 Results with the ant colony optimization

With the ACO algorithm, the starting point is (1, 1) and the destination point is (20, 20). The obstacles can be located at any place on the map except at the starting point and at the destination point. The experiments were done with a different number of obstacles, different population sizes, and a different number of iterations. Furthermore, these studies include the effects of elitism. In every figure, the circular objects represent the obstacles (there is some discrepancy between the actual obstacle size and the apparent obstacle size in the figures).

Experiment 1: When obstacles do not exist in the map (W/O elitism)

Fig 5.2.1: The best tour found at the 50th iteration

Fig 5.2.2: Best tours at each iteration (minimum distances at each iteration)

Fig 5.2.3: Deposited pheromone in 3D space

Fig 5.2.4: Deposited pheromone in 2D contour map (red > yellow > blue)

Experiment 2: When obstacles do not exist in the map (W/ elitism)

Fig 5.2.5: The best tour found at the 50th iteration

Fig 5.2.6: Best tours at each iteration

Fig 5.2.7: Pheromone accumulation in 3D

Fig 5.2.8: Map pheromone contour map (red > yellow > blue)

Discussion of Experiments 1 and 2:

It is noted that a significant difference exists between the maximum value and the minimum value in the pheromone map in which elitism is employed, while the pheromone map for the original ACO is more evenly distributed. Also, the variation in the optimum solution settles down as the number of iterations increases, as shown in Fig 5.2.2.

Experiment 3: When an obstacle blocks the optimum path in the map (W/O elitism)

Fig 5.2.9: The best tour found at the last iteration

Fig 5.2.10: Deposited pheromone in 2D contour map

Experiment 4: When an obstacle blocks the optimum path in the map (W/ elitism)

Fig 5.2.11: Deposited pheromone in 2D contour map (red > yellow > blue)


Fig 5.2.12: The best path found by Genetic Algorithm

Discussion of Experiments 3 and 4:

It is clearly shown that employing the elite strategy reinforces a narrow region of the map. This means that the pheromone is mostly deposited on the best path of the map, while the other regions of the map receive little or nothing. Fig 5.2.12 is the result achieved with the previous algorithm (GA). Comparing Fig 5.2.12 with Fig 5.2.9, the GA produced a better result in terms of the path distance. This is because the ACO algorithm in this application has to deal with much more uncertainty than in the GA case. In other words, the GA method chooses the best via points (most of the time 3 or 4 points, at most 6), while the ACO algorithm makes a decision at every grid point.

Experiment 5: When multiple obstacles block the optimum path on the map (W/ elitism)

Fig 5.2.13: Best tour found at the last iteration


Fig 5.2.14: Map pheromone configuration

Discussion of Experiment 5:

The problem becomes more complicated as the number of obstacles increases. However, since every decision is made at the grid-point level, the algorithm always finds an optimized solution, which is not necessarily true for the GA method. For example, in the GA case, if the user does not choose a sufficient number of via points, the algorithm will require a significant amount of time to find a best solution, or may even fail to find a solution. Thus, the proposed ACO algorithm is better in that sense, since it will find a solution in any case.

Experiment 6: Effect of the number of ants (agents)

Table 5.2.1

The number of ants    Average path distance of 10 trials
10                    45.357
100                   38.6223
500                   36.237

Fig 5.2.15: An example path with the number of ants = 10

Fig 5.2.16: A path example with the number of ants = 100

Fig 5.2.17: A path example with the number of ants = 500

Discussion of Experiment 6:

Comparing Fig 5.2.15 and Fig 5.2.16 leads us to the conclusion that a larger group is better than a smaller group. If the group size is too small, not only does the number of ants that can reach the destination get smaller, but the amount of pheromone accumulation also decreases. However, it is not always true that a large number of ants is best: if the group size is greater than some minimum, there is not much improvement in performance. As shown in Table 5.2.1, a large difference in performance exists between 10 ants and 100 ants, while not much improvement is present between 100 ants and 500 ants.

6.0 Conclusion

In this thesis, the results of a detailed investigation of the GA and ACO algorithms applied to a path optimization problem have been presented. It has been demonstrated that both approaches have great potential as solutions to the proposed problem. In addition, it has been discovered that employing elitism leads the algorithms to find a solution more easily. In a majority of cases, the GA found a better solution than did the ACO algorithm in terms of performance. Such a result is due to the fact that in the ACO algorithm each ant only proceeds 1 grid point at a time, while this is unnecessary in the GA case. Despite the good performance of both algorithms, some limitations were found. Overcoming these limitations represents a challenge for future research. The following three subsections describe some of the limitations of the proposed algorithms along with possible future work.

6.1 Limitation in Genetic Algorithm

In order to guarantee obstacle avoidance, it is necessary to impose another constraint on the algorithm. That is, looking for any intersection point created by a path line and the obstacles can create a significant amount of computation. If only a few obstacles block the optimal path, this computation issue can be ignored. However, when a large number of obstacles exist in the map, the algorithm will require a significant amount of computation time. Although it turned out that the GA is better than the ACO algorithm in terms of finding an optimal path, the GA may pay for that advantage in terms of computation time. In contrast, the ACO algorithm has an advantage over the genetic algorithm in terms of execution time: no matter how many obstacles are present, it does not devote an inordinate amount of time to the iteration process.

6.2 Limitation in Ant Colony Optimization

As mentioned previously, a global attraction term had to be added to lead an ant to reach the goal point. Eliminating this term may cause the ant not only to wander around in the map, but also to become stuck at a point, so that the algorithm does not return the best solution. Furthermore, as discussed in the results section, this method is inferior to the GA method in the sense that the ACO approach takes some unnecessary steps. This limitation needs to be addressed.

6.3 Scope for future work

Perhaps the best approach to the path finding problem would be an algorithm that converges rapidly toward a meaningful solution. A more effective and reliable solution can be found if one can adopt only the positive phases of the two algorithms. Such a topic can be considered a challenging problem for future work. In addition, applying these algorithms to a real robot can be a challenging topic, because in a real environment the algorithm has to be able to deal with nonlinear factors such as noise, and so forth. Moreover, the 3-D path finding problem is also a challenging topic, as for example with aircraft or underwater vehicles; such applications may be found even more readily than some 2-D applications.

no_via_ind2=[]. Furthermore. 4. 8. [0 20 ]) % axis( [0 20 0 20] ) %%%%%%%% Make sure that the via points is not the staring point and goal %%%%%%%% point. 'markersize'. end end % % figure(1) % plot( Op(1.:).:)== Ep(1) & pos(2. it should not be lying on the obstacles.10.3. 14.25'. 10.10. pos=[pos pos_t]. %Op=[ 3. 6. Op(2.:) ==Sp(1) & pos(2. 10] .2.14. 16]'. map_size] %%%% Define the destination point %%%%%%%%%%%% create the possible locations in the map %%%%%%%%%%%%%% pos=[]. 41 . ra*28 ) % hold on % ezplot( '(x-10)^2 + (y-10)^2 . 'ro'. for k=1:map_size+1 for l=1:map_size+1 pos_t=[l-1. .:)== Sp(2) ) | ( pos(1. 10. %%%% Define the radius of the obstacle map_size=20. 14 . %%% size of the population Lg=[1 2 4 ]. no_via_ind1= find( ( pos(1.Appendix A MATLAB code clc clear all for aa=1:3 %%%%%%%%%%%%%%%%%%%% Define globally used variables %%%%%%%%%%%%%%% tic. 0] %%%% Define the starting point Op=[ 10.10. 16. Sp= [0. 10 . k-1].:)==Ep(2) ) ). 18]'%. %%%% Define the position of the obstacle Ep=[map_size. pop=100 . %%% Define the length of the gene ( the number of via points) ra=5. 4 .:). 10. 14.

.2) no_via_ind2=[no_via_ind2 find( (pos(1. . ch_order = [ [1:length(ch_fitness)']' ch_fitness] .:). .2. Ep. [ Xch.2. %%% generate possible solutions to start the algorithm ch_dist=[]. i-1). . ra). .15.i) ).Ych(:. % title( 'possible via points' ). %[ .2. .i-1)= sqrt( ( Xch(:. . . Sp. end ch_fitness = sum( ch_dist')'.001.^2 ). Pm=. .i-1).05.^2 + ( pos(2.1. 2) %%%% reoder the chromosomes in terms of their fitness %%%%%%%%%%%% iteration begions %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% for generation = 1:500 generation %%%%%%%%%%%%%%% divide the chromosomes into 10 groups %%%%%%%%%%%%%%%%%5 %%%%%%%%%%%%%%% Do crossover and mutation for the 10 groups %%%%%%%%%%%% %%%%%%%%%%%%% Variate Pc and Pm by the fitness of the group %%%%%%%%%%%%% Good fitness group <== low Pc and Pm %%%%%%%%%%%% Bad fitness group <== high Pc and Pm Pc=.2. no_via )=[]. %for i=1:10 42 .2.2. 'ro' ).:). .:) . pos( :. . . .i) ). .i) ).^2 <= ra^2 ) ].Op(1.3. % figure(2) % plot( pos(1. %% find points which are inside of the obstacle circle end no_via = union ( no_via_ind1.5 ].1. i ) ).4.for i=1: size (Op. best_ch_ind= find( ch_fitness == min( ch_fitness) ). Lg(aa). Op. pos(2.35.^2 + ( Ych(:.Xch(:. %[ .3. . . for i=2:Lg(aa)+2 ch_dist(:. . Ych ]= gen_ch ( pop .2.5]. ch_order = sortrows( ch_order. . no_via_ind2 ).01.:) -Op(2.05.35. pos.05.

%%% Make up the current population up to the number of the staring population Xchild1=[ Xchild1.1). Ychild1=[ Ychild1. chn_size=0. Op. Xch_t= [Xchn . Ychild1. :).. %%%%%%%%%%%%%% In case that the size of chromosome is short. Xch(2:end. Ych. pos. Ychdn2]. Xchdn2]. Op. Ychdn2 ]= gen_ch ( chn_size. i ) ).elite= ch_order(1.i) ).:). Lg(aa). Ychn2 ]= gen_ch ( chn_size. Ye=Ych( elite(:. Xch(1. Op.:). Ep. bit. ra ).:)= Ych_t(1:pop-1. Sp. i-1). Pm. %Ych_new. Ych(1.:).:)=Xe Ych_t= [Ychn .^2 + ( Ych(:.Ych(:.^2 ). end ch_fitness = sum( ch_dist')'.size(Xchild1. %Xch_new. ra) . Ychn= Ychild2. : ).. bit=round( rand*(Lg(aa)-1) ). Lg(aa). chn_size= pop. Xchn2].i-1)= sqrt( ( Xch(:. [ Xchild1. [ Xchild2... Sp. pos. [ Xchdn2.i-1).1). Ychild1 ] = crossover ( order. for i=2:Lg(aa)+2 ch_dist(:. 43 . Lg(aa)). ra ) . Pc .1). Xe=Xch( elite(:. pos. [ Xchn2. %order= ch_order( 1 + 10*(i-1) : 10*i. Xch..:). make up %%%%%%%%%%%%%% chromosome. best_ch_ind= find( ch_fitness == min( ch_fitness) ). %%% Perform the X over chn_size=0.Xch(:.size(Xchn. chn_size= pop. :) .1). Ychild2 ] = mutation( Xchild1. Ep.. Ychn2]. order=ch_order(2:end. Ych(2:end.:)=Ye %%%%%%%%%%%%%%%%%%%% chromosome re evaluation %%%%%%%%%%%%%%%%%%%%%%%%% ch_dist=[].:)= Xch_t(1:pop-1. %end %%%%%%%%%%%%%%%%%% Update all chromosomes %%%%%%%%%%%%%%%%%%%%%%%%%% Xchn= Xchild2. Op. ra.

:) ) % plot( Op(1. order10=order. Ych10(best10(1). num2str( ch_order50(1. ['path length ='. 'markersize'.2) )] ) hold off print ('-dtiff'. % figure(2) % %plot( Xch50(best50(1). 2) %%%% reoder the chromosomes in terms of their fitness %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%% if generation==10 Xch10=Xch.:). 'ro'. figure(1) plot( Xch10(best10(1).19.:). Op(2. order50=ch_order. Ych10=Ych. ch_order = sortrows( ch_order. ra*28 ) % hold off % title('the best chromosome found at the 50th generation') % text(12.:). Ych10(best10(1). 44 . Op(2. ra*27 ) title('the best chromosome found at the 10th generation') text(12.:). Ych50(best50(1). 'ro'.:). 'g*' ) hold on line( Xch10(best10(1).ch_order = [ [1:length(ch_fitness)']' ch_fitness] . best10= best_ch_ind. best50=best_ch_ind.:). Ych(best50(1). num2str( order10(1.:).num2str(aa)]) end if generation==50 Xch50=Xch.:).:) ) plot( Op(1. ['path length ='.2) )] ) % print -dtiff fig_50 end if generation==100 Xch100=Xch. Ych50=Ych. 'g*' ) % hold on % line( Xch(best50(1).19.:).[ 'ob3_10_'.:). 'markersize'.

Ych i=1. pos. best100=best_ch_ind.:). num2str( ch_order(1. ['path length ='.:).num2str(aa)]) end %of aa function [ Xch. %%% pop size of possible combinations ch_t=[ Sp pos(:. ra*27) title(' the best chromomosome found at the 100th generation' ) text(12. order100=ch_order.chs_ind) Ep ] .:) ) plot( Op(1. Ych100(best100(1). Ych=zeros(N. ra*28 ) % hold off % title('the best chromosome found at the 50th generation') % % text(12. 45 .2) )] ) hold off % print ('-dtiff'. figure(4) plot( Xch(best_ch_ind(1). Xch=zeros(N. ['path length ='.:).19.:).[ 'ob3_100_'. 'ro'.:). % figure(2) % plot( Xch100(best100(1). 'g*' ) % hold on % line( Xch100(best100(1). Lg. 'markersize'. Op(2. 'markersize'. Lg+2).:).:) ) % plot( Op(1.:). while i < N+1 ]= gen_ch ( N .:). 'ro'. Lg+2). Op. Ych100(best100(1).Ych100=Ych.Lg)*length(pos) ). 'g*' ) hold on line( Xch(best_ch_ind(1). Op(2.19. Sp. ra) chs_ind=ceil( rand(1.2) )] ) % print -dtiff fig_100h4 end end toc.:). Ych(best_ch_ind(1). Ep.:). num2str( order10(1. Ych(best_ch_ind(1).

Ych (i. y2=Yc(p. end end end . Op.e) descending order of X else m=(y2-y1)/(x2-x1). function [ test_result ]=test_chromosome( Xc.r). y1=Yc(p. i.. pol1= m^2+1.1) ). Ysol= roots( [ pol1 pol2 pol3] ) .:) .r). pol2= (2*m*c-2*y0*m-2*x0). i=i+1. Op. pol2= -2*y0.:). for p=1: size(Xc. %%%%%%%%%% if x1=x2 then find Ysol first %%%%% if x1==x2 pol1=1. %%%%% else represent in polynoimal form.2 ) for r=1: length( Op ) x2= Xc(p.q-1). ch_t(2. x0= Op(1. pol3= y0^2+x1^2-2*x0*x1+x0^2-ra^2.1) for q=2: size( Xc. x1=Xc(p.2).. v_fit=zeros( size(Xc.:)= ch_t(2.q). y0=Op(2. ra ) if test_result == 1 Xch (i. x2]. c=y1-m*x1. ra ) clear test_result. Xsol= [x1.. Yc.[ test_result ]=test_chromosome( ch_t(1.:). length(Op).q-1).q). 46 . Xsol= roots( [ pol1 pol2 pol3] ) .:)... pol3= c^2-2*y0*c+y0^2+x0^2-ra^2.:)= ch_t(1.size(Xc.

end end if sum(sum( v_fit(:. ch_fitness_t = order(:.2) 47 . % p_t= sum( ch_fitness ). Pc . y1]. bit. [Xsol' . sol_d = dist( [x1. y0 ] ). if isreal(sol) & ( obs_d > min( sol_d) & obs_d < max( sol_d ) ) v_fit(q.p ) = 1. % fit_sum=sum(ch_fitness ) . ra./ch_fitness .r. y1].:.min( order(:. Ych.Ysol= m*Xsol + c. end end end end %%%%%%%%%%% This function performs crossover for the give chromosome %%%%%%%%%%%%% Performing 1 point crossover at the 1st bit position %%%% function [ Xchild. Ychild ] = crossover ( order.2) .2) )+1.p) ) ) <1 test_result(p)=1. Xch. Ysol ] . [x0 . if bit < 0 | bit > 5 error( 'choose diff bit number' ) return %break end % ch_fitness= order(:. end sol= [ Xsol. Ysol'] ). Op. else test_result(p)=0. % p= p_t/sum(p_t). Lg) %bit=1. obs_d = dist( [x1.

/ch_fitness_t ch_fitness/ sum(ch_fitness) % fit_sum=sum(ch_fitness_t). :). : )=[]. for i=1:length(p) q(i)=sum( p(1:i) ) . end X_tail=Xparent(vc.1). 1:Lg-bit ).. Xchild=Xparent. qk(i)=qk_t(1).length(p) ).ch_fitness= 1. chromosome clone %%%%%%%%%%%%%%%% for i=1:length(p) qk_t= find( q > roulet(i) ) . [ test_result ]=test_chromosome( Xchild. Y_tail=Yparent(vc. Ychild( find( test_result == 0 ). Yparent = Yp .sqrt( 20^2+20^2 ) ) . Y_tail= flipud(Y_tail). ra ) ..1). %qk_1_t=find( q < roulet(1) ). % ch_fitness= fit_sum. : )=[]. Yp=Yp(qk.Lg-bit+1:end) . Op.. Ychild./ ( ch_fitness_t .:)= [ Y_head Y_tail ]. vc=find( roulet < Pc ). %%%%%%%%%%%%%% chromosome crossover %%%%%%%%%%%%%%%% roulet=rand(1.: ).:)= [ X_head X_tail ]. X_tail= flipud(X_tail). Xp= Xch( order(:. Ychild(vc. 48 . Yp= Ych( order(:. qk=qk_1_t(end) end Xp=Xp(qk. Y_head= Yparent(vc. end %%%%%%%%%%%%%% roulet=rand(1. 1:Lg-bit ). length(p) ).. :). p= ch_fitness/(sum(ch_fitness) ) .: ). Xchild(vc. Xchild( find( test_result == 0 ).Lg-bit+1:end).. Ychild=Yparent. X_head= Xparent(vc. Xparent = Xp ..

end) . 10. : )=[]. pos.. 13. 17. child_ind). 11.10. Op. . m_ind=find( Ppm < Pm).2) ). 14.10. [ test_result ]=test_chromosome( Xchild. %%%% The mutation will be performed on the body of the chomosome %%%% Ppm=rand( size( X_body. Ychild= [ Y_head Y_body Y_tail]. Xchild= [ X_head X_body X_tail]. 2:end-1 ).. 6. Ychild. ra=1. X_tail= Xp(:. Y_body( m_ind) = pos( 2. %%%% Define the radius of the obstacle Sp= [1 1] %%%% Define the starting point Op=[10. ra ). child_ind=ceil( rand( size( m_ind) )*length(pos) ). 8. 14. Pm . 19. 16. Op. Y_body= Yp(:. 13.10. 4. : )=[]. ra) X_head= Xp(:. child_ind) . Yp. X_body( m_ind )= pos( 1. Ychild( find( test_result == 0 ). 49 . 14 .1). Ychild ] = mutation( Xp. 14.10..1). 16. Y_tail= Yp(:. 2:end-1 ).. 18]'. Y_head= Yp(:. 14.10.%%%%%%%%%%%%%%%%%% end of the crossover %%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%%% This function performs mutation for the given chromosome function [ Xchild. %Op=[ 4. 12.1). map_size=20. 8. 10] . End /// ACO code here // clc clear all close all %%%%%%%%%%%%%%%%%%%% Define globally used variables %%%%%%%%%%%%%%% tic. size( X_body. Xchild( find( test_result == 0 ). 4 .end) .14. 10. X_body= Xp(:.

pos( :.. ra*28 ) % hold on % ezplot( '(x-10)^2 + (y-10)^2 . pp=[]. %% find points which are inside of the obstacle circle end no_pos = union ( no_pos_ind1.2) no_pos_ind2=[no_pos_ind2 find( (pos(1. %%%%%% pheromone decay factor ph_l=zeros(20. Furthermore.:) -Op(2. Q=5. for iteration=1:50 acc=1 . %%%%%% cp is the current ant position %%%%%% pp is history of the ant positions 50 . %%%%%% Initialize local pheromone map %tij=(1-p)*tij+dtij %%%%% Update the pheromone max_step=50 ant_number=500.:)== Sp(2) ) ). % figure(1) % plot( Op(1. no_pos_ind2 ). no_pos_ind1= find( ( pos(1.:). pos=[pos pos_t]. path_length=[].:) ==Sp(1) & pos(2..1. k ]. no_pos )=[]..Op(1.i) ).25'.^2 <= ra^2 ) ]. dT=[]. no_pos_ind2=[].i) ).Ep=[map_size map_size] %%%% Define the destination point %%%%%%%%%%%% create the possible locations in the map %%%%%%%%%%%%%% pos=[]. [0 20 ]) % axis( [0 20 0 20] ) % hold off a=. it should not be lying on the obstacles. non_emp_ind=[].:) .:). for k=1:ant_number k cp=Sp.2. 'ro'.20). 'markersize'. for i=1: size (Op.^2 + ( pos(2. for k=1:map_size for l=1:map_size pos_t=[l . end end %%%%%%%% point. pp_hist=[]. Op(2.

1) .. ph_g( find(ph_g <=0 ) )=0 . yp=cp(2). % ph_g= ph_g+abs( min( ph_g ) ) . zind= find( zindt). xp+1 yp+1. ph_g=dot( head_vector.00001. %%%% The ant's next movement should not be one of the past positions unless all neighbor positions belong to the past position P= Ps/sum(Ps) . pp=[pp . pp. 51 . Ps(zind) = 0. Ep' ) + 1 ) . dm=0. 'rows' ). for i=1:size(nps. %%%%%% Global pheromone for step=1:max_step%100%%%%%% repeat following process until the kth ant reach the goal position xp=cp(1). pos' . %%%% The ant's next location must be the one of the possible postions in the map %%%%%%%%%%%%%% Determine the ant's next position by probabilty based on the pheromone %%%%%%%%%%%% clear t. 1] ). nps.1) t(i)=ph_l( nps(i.1). %%%% Probabilities of the ant's next movement is dependent on the amount of pheromone in the next positions q=[].2) ). %ph_g=0. %mem_ind=ismember( intersect(nps. xp-1 yp-1. %nps( find( ~mem_ind ) ) = []. xp-1 yp+1. [size(nps.01. lm=0. head_vector= nps . %%% np neighbor points with respect to the current point nps=intersect(nps. cp ] .%ph_g=Ep. pos' .repmat( cp. xp yp1. for i=1:length(P) q_t=q_t+P(i). 2 ) .1). intersect( nps.xp-1 yp.xp yp+1. nps(i.1 ] ). xp+1 yp-1 ]. repmat( (Ep/norm(Ep) ). [size(nps. 'rows' ). 'rows' ) . Ps= t+ ph_g' . 'rows') . 'rows' ). q_t=0.%/ (dist( cp. %%%% pp is the past path points nps=[ xp+1 yp. end zindt= ismember( nps.

%%% the ant reached at the destination pp=[pp .. pp(:. :) %%%% np is the finally chosen next Lk= dm*sqrt(2) + lm. pp_hist{k. end cp=np. position %%%%%%%%%%%%%%%% If the ant moved to its diagonal direction then %%%%%%%%%%%%%%%% increment dm. if dist(cp.2}=dT(k) . end end %%% end of a group travel elite_ind=find( path_length == min( path_length) ).1). dT(k)= Q/(Lk-20*sqrt(2)) . Otherwise increment lm. qk=find(q > roullete) . 52 . pp_hist{k. np') > 1 dm= dm +1. % figure(1) % line( pp(:. np=nps(np_ind. np_ind=qk(1) .:). acc=acc+1. if cp == [20 20]. break end end % %%% end of the max number steps %%% wheel of the fortune..1}=pp . cp]. dT(k)= 0. else path_length( k ) = Inf. end roullete=rand . else lm= lm +1. non_emp_ind(acc)=k .2) ) if step < max_step path_length( k ) = Lk.q=[q q_t].

1). num2str( best_Lk(iteration) hold on %ezplot( '(x-10)^2 + (y-10)^2 . c_pp(m. for m=1:size(c_pp.2) ) text(12. Op(2. %%%%%%% Update local pheromone map %%%%%%%%%%%%%%%%%%%% for n=1:length( non_emp_ind ) c_pp= cell2mat( pp_hist( non_emp_ind(n) ) ).25'. ra*28 ) title('The best tour found in the last group') figure(3) plot( 1:iteration. ['path length ='.:).best_Lk(iteration)=path_length( elite_ind(1)).1).19.2) ) . c_dT= dT( non_emp_ind(n)). best_path(:. best_Lk(1:iteration)) title('Best tour at the each iteration') xlabel('iteration') ylabel('Best tour found by an agent') figure(4) bar3(ph_l) title( ' Map pheromone amount ' ) figure(5) contour(ph_l) title('Pheromone amount contour map' ) )] ) 53 .:). figure(2) line( best_path(:. 'ro'. end end end %%% end of the iteration best_path= pp_hist{elite_ind(1).1) indi=sub2ind(size(ph_l). c_pp(m. ph_l(indi)= ph_l(indi)* (1.1} . [0 20 ]) plot( Op(1.a) + c_dT. 'markersize'. Lk= dm*sqrt(2) + lm.

References:

[1] Xin Yao, "Evolutionary Computation: Theory and Applications," World Scientific, 1999.
[2] Laura F. Landweber and Erik Winfree, "Evolution as Computation," Springer.
[3] Amit Konar, "Computational Intelligence: Principles, Techniques and Applications," Springer, 2005.
[4] D. Huh, J. Park, U. Huh, and H. Kim, "Path Planning and Navigation for an Autonomous Mobile Robot," IECON'02, IEEE Annual Conference, 2002.
[5] S. Lee and G. Kardaras, "Collision-Free Path Planning with Neural Networks," 1997 IEEE International Conference on Robotics and Automation.
[6] N. Bourbakis and L. Tsoukalas, "Path Planning in a 2-D Known Space Using Neural Networks and Skeletonization," IEEE International Conference on Systems, Man, and Cybernetics, 1997.
[7] N. Sadati and J. Taheri, "Genetic Algorithm in Robot Path Planning Problem in Crisp and Fuzzified Environments," IEEE ICIT'02, Bangkok, Thailand.
[8] Marco Dorigo and Thomas Stutzle, "The Ant Colony Optimization Metaheuristic: Algorithms, Applications, and Advances," in New Ideas in Optimization, D. Corne et al., Eds., McGraw Hill, 1999, pp. 11-32.
[9] Marco Dorigo, Mauro Birattari, and Thomas Stutzle, "Ant Colony Optimization: Artificial Ants as a Computational Intelligence Technique," IEEE Computational Intelligence Magazine, 2006.
[10] Marco Dorigo, Vittorio Maniezzo, and Alberto Colorni, "The Ant System: Optimization by a Colony of Cooperating Agents," IEEE Transactions on Systems, Man, and Cybernetics, Part B, Vol. 26, No. 1, 1996.
[11] Luca M. Gambardella and Marco Dorigo, "Ant-Q: A Reinforcement Learning Approach to the Traveling Salesman Problem," in Proceedings of the Eleventh International Conference on Machine Learning, Morgan Kaufmann, pp. 622-627.
[12] C. Watkins, "Learning with Delayed Rewards," Ph.D. dissertation, Psychology Department, University of Cambridge, England, UK.

[13] Thomas Stutzle and Holger H. Hoos, "MAX-MIN Ant System," Future Generation Computer Systems, Vol. 16, No. 8, pp. 889-914, 2000.
[14] Marco Dorigo, Mauro Birattari, and Thomas Stutzle, "Ant Colony Optimization: Artificial Ants as a Computational Intelligence Technique," IRIDIA Technical Report Series, TR/IRIDIA/2006-023.
[15] Christian Blum and Marco Dorigo, "The Hyper-Cube Framework for Ant Colony Optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, Vol. 34, No. 2, April 2004.
[16] Amit Konar, "Computational Intelligence: Principles, Techniques and Applications," Springer Berlin Heidelberg New York, 2005, pg 18.
[17] M. Yousef Ibrahim and Allwyn Fernandes, "Study on Mobile Robot Navigation Techniques," IEEE ICIT, Tunisia, December 8-10, 2004.
[18] James Kennedy and Russell Eberhart, "Particle Swarm Optimization," Proceedings of the IEEE International Conference on Neural Networks, 1995.