Metaheuristic Methods and Their Applications

Lin-Yu Tseng
Institute of Networking and Multimedia, Department of Computer Science and Engineering, National Chung Hsing University

OUTLINE
I. Optimization Problems
II. Strategies for Solving NP-hard Optimization Problems
III. What is a Metaheuristic Method?
IV. Trajectory Methods
V. Population-Based Methods
VI. Multiple Trajectory Search
VII. The Applications of Metaheuristic Methods
VIII. Conclusions

I. Optimization Problems
Optimization problems arise in both Computer Science and Operational Research, for example:
• Traveling Salesman Problem
• Maximum Clique Problem
• Flow Shop Scheduling Problem
• P-Median Problem
Many optimization problems are NP-hard.

Optimization Problems
• Calculus-based method: hill climbing
• How about a function like this? [Figure: a multimodal landscape on which hill climbing gets trapped in a local optimum.]

An example: TSP (Traveling Salesman Problem)
[Figure: a five-city instance with edge weights.] A solution is a sequence of cities. For example, the tour 1-2-3-4-5 has length 31, while the tour 1-3-4-5-2 has length 63. There are (n-1)! tours in total.
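To make the encoding concrete, here is a small sketch (not from the slides) of how a tour's length is evaluated and how all (n-1)! tours could be enumerated for a tiny instance; the distance matrix `dist` is assumed to be given as a list of lists.

```python
import itertools

def tour_length(tour, dist):
    """Length of the closed tour, e.g. tour = (0, 1, 2, 3, 4)."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def brute_force_tsp(dist):
    """Fix city 0 as the start and enumerate the remaining (n-1)! orderings;
    feasible only for very small n, which is why heuristics are needed."""
    n = len(dist)
    best = min(((0,) + p for p in itertools.permutations(range(1, n))),
               key=lambda t: tour_length(t, dist))
    return best, tour_length(best, dist)
```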

II. Strategies for Solving NP-hard Optimization Problems
• Branch-and-Bound: finds an exact solution
• Approximation Algorithms: e.g., there is an approximation algorithm for TSP that finds a tour with tour length at most 1.5 × (optimal tour length) in O(n³) time
• Heuristic Methods: deterministic
• Metaheuristic Methods: heuristic + randomization

III. What is a Metaheuristic Method?
• Meta: in an upper level
• Heuristic: to find
A metaheuristic is formally defined as an iterative generation process which guides a subordinate heuristic by combining intelligently different concepts for exploring and exploiting the search space; learning strategies are used to structure information in order to find efficiently near-optimal solutions. [Osman and Laporte 1996]

Fundamental Properties of Metaheuristics [Blum and Roli 2003]
• Metaheuristics are strategies that "guide" the search process.
• The goal is to efficiently explore the search space in order to find (near-)optimal solutions.
• Techniques which constitute metaheuristic algorithms range from simple local search procedures to complex learning processes.
• Metaheuristic algorithms are approximate and usually non-deterministic.

Fundamental Properties of Metaheuristics (cont.)
• They may incorporate mechanisms to avoid getting trapped in confined areas of the search space.
• The basic concepts of metaheuristics permit an abstract level description.
• Metaheuristics are not problem-specific.
• Metaheuristics may make use of domain-specific knowledge in the form of heuristics that are controlled by the upper level strategy.
• Today's more advanced metaheuristics use search experience (embodied in some form of memory) to guide the search.

IV. Trajectory Methods
1. Basic Local Search: Iterative Improvement
The step Improve(N(S)) can use:
• First improvement
• Best improvement
• An intermediate option

[Figure: a trajectory method moves through a sequence of solutions x1 → x2 → x3 → x4 → x5.]
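As a sketch of iterative improvement (assuming a minimization problem and a user-supplied neighborhood function, neither of which is specified on the slide):

```python
def iterative_improvement(s, f, neighbors, mode="best"):
    """Move to an improving neighbor until none exists; the result is a local
    optimum. `neighbors(s)` must return the finite neighborhood N(s)."""
    while True:
        if mode == "first":   # first improvement: take the first better neighbor
            nxt = next((t for t in neighbors(s) if f(t) < f(s)), None)
        else:                 # best improvement: take the best neighbor, if better
            nxt = min(neighbors(s), key=f, default=None)
            if nxt is not None and f(nxt) >= f(s):
                nxt = None
        if nxt is None:
            return s
        s = nxt
```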

2. Simulated Annealing – Kirkpatrick (Science 1983)
• Probability of accepting a worse neighbor s' of s:
  p(T, s', s) = exp( -(f(s') - f(s)) / T )
• The temperature T may be defined by T_{k+1} = α·T_k, with 0 < α < 1.
• Random walk + iterative improvement.
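A minimal sketch of simulated annealing built directly on the two formulas above; the cooling parameters and `random_neighbor` are assumptions, not values from the slides.

```python
import math
import random

def simulated_annealing(s, f, random_neighbor, T=1.0, alpha=0.95, steps=10000):
    """Accept a worse neighbor s' with probability exp(-(f(s') - f(s)) / T)
    and cool geometrically: T_{k+1} = alpha * T_k with 0 < alpha < 1."""
    best = s
    for _ in range(steps):
        t = random_neighbor(s)
        delta = f(t) - f(s)
        if delta <= 0 or random.random() < math.exp(-delta / T):
            s = t                      # downhill moves always accepted,
            if f(s) < f(best):         # uphill moves with the probability above
                best = s
        T *= alpha                     # geometric cooling
    return best
```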

3. Tabu Search – Glover 1986
• Simple Tabu Search
• Tabu list
• Tabu tenure: the length of the tabu list
• Aspiration condition

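A sketch of simple tabu search under one common simplification (the tabu list stores whole solutions rather than move attributes, so solutions must be hashable); the aspiration condition used here, "better than the best found so far", is the classic choice.

```python
from collections import deque

def tabu_search(s, f, neighbors, tenure=7, iters=1000):
    """Take the best non-tabu neighbor each step; a tabu neighbor may still be
    taken if it satisfies the aspiration condition (beats the best so far)."""
    best = s
    tabu = deque(maxlen=tenure)          # tabu tenure = length of the tabu list
    for _ in range(iters):
        cands = [t for t in neighbors(s) if t not in tabu or f(t) < f(best)]
        if not cands:
            break
        s = min(cands, key=f)            # may be worse than the current s
        tabu.append(s)
        if f(s) < f(best):
            best = s
    return best
```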

4. Variable Neighborhood Search – Hansen and Mladenović 1999
• Uses a set of neighborhood structures N_1, N_2, ..., N_{k_max}.
• Each iteration is composed of three phases: shaking, local search, and move.
  k ← 1
  Repeat until k = k_max:
    Shaking: pick a random x' ∈ N_k(x)
    Local search: from x', obtain a local optimum x''
    Move: if x'' is better than x, set x ← x'' and k ← 1; otherwise set k ← k + 1

[Figure: shaking draws x' from neighborhoods N1, N2, ... around the current best x; local search then descends from x' to x''.]
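The three phases translate almost line for line into code; this sketch assumes user-supplied `shake` (returning a random point in N_k(x)) and `local_search` procedures.

```python
def vns(x, f, shake, local_search, k_max=3, iters=100):
    """Basic VNS: shake in N_k, descend by local search, move if improved."""
    for _ in range(iters):
        k = 1
        while k <= k_max:
            xp = shake(x, k)         # shaking: random solution x' in N_k(x)
            xpp = local_search(xp)   # local search from x' yields x''
            if f(xpp) < f(x):        # move: accept x'' and restart from N_1
                x, k = xpp, 1
            else:                    # otherwise try the next neighborhood
                k += 1
    return x
```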

V. Population-Based Methods
1. Genetic Algorithm – Holland 1975
• Coding: a solution is encoded as a chromosome
• Fitness of a solution: related to the objective function
• Initial population

[Figure: the five-city TSP instance again. A solution is a sequence of cities: the tour 1-2-3-4-5 has length 31; the tour 1-3-4-5-2 has length 63.]

Genetic Algorithm
• Reproduction (Selection)
• Crossover
• Mutation

Example 1: Maximize f(x) = x², where x is an integer and 0 ≤ x ≤ 31.

1. Coding of a solution: a five-bit integer, e.g., 01101.
2. Fitness function: F(x) = f(x) = x².
3. Initial population (randomly generated): 01101, 11000, 01000, 10011.

Reproduction: roulette-wheel selection. [Figure: a roulette wheel whose slots are proportional to fitness.]


Crossover. [Figure: two parent chromosomes exchange bit substrings.]

Mutation: each bit mutates with probability P_m = 0.001. For the 20 bits in the population, the expected number of mutated bits is 20 × 0.001 = 0.02.
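Putting the three operators together for Example 1, a runnable sketch; the single-point crossover is an assumption (the slide's crossover figure is not reproduced here), and the loop bookkeeping is illustrative.

```python
import random

def fitness(chrom):              # F(x) = f(x) = x^2 for a five-bit chromosome
    return int(chrom, 2) ** 2

def select(pop):                 # roulette-wheel (fitness-proportionate) selection
    return random.choices(pop, weights=[fitness(c) for c in pop], k=1)[0]

def crossover(a, b):             # single-point crossover at a random cut (assumed)
    p = random.randint(1, len(a) - 1)
    return a[:p] + b[p:], b[:p] + a[p:]

def mutate(chrom, pm=0.001):     # flip each bit independently with prob. pm
    return "".join(('1' if b == '0' else '0') if random.random() < pm else b
                   for b in chrom)

pop = ["01101", "11000", "01000", "10011"]   # the slide's initial population
for _ in range(20):              # a few generations
    pairs = [crossover(select(pop), select(pop)) for _ in range(len(pop) // 2)]
    pop = [mutate(c) for pair in pairs for c in pair]
print(max(pop, key=fitness))
```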


Example 2: Word matching problem
• "to be or not to be" → "tobeornottobe"
• The probability that a random 13-letter string matches is (1/26)^13 ≈ 4.03038 × 10^-19.
• The lowercase letters in ASCII are represented by numbers in the range [97, 122]; the target string corresponds to [116, 111, 98, 101, 111, 114, 110, 111, 116, 116, 111, 98, 101].

Initial population: randomly generated numbers in [97, 122], whose corresponding strings are:
rzfqdhujardbe
niedwrvrjahfj
cyueisosqvcb
fbfvgramtekuvs
kbuqrtjtjensb
fwyqykktzyojh
tbxblsoizggwm
dtriusrgkmbg
jvpbgemtpjalq
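The slides do not state the fitness function; a natural (assumed) choice is the number of positions at which a candidate already matches the target, sketched below.

```python
import random
import string

TARGET = "tobeornottobe"

def fitness(candidate):
    """Assumed fitness: count of positions matching the target word."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def random_string(n=len(TARGET)):
    return "".join(random.choice(string.ascii_lowercase) for _ in range(n))

population = [random_string() for _ in range(10)]
print(max(population, key=fitness))
```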


2. Ant Colony Optimization – Dorigo 1992
• Pheromone

Initialization
Loop
    Loop
        Each ant applies a state transition rule to incrementally build a solution
        and applies a local updating rule to the pheromone
    Until every ant has built a complete solution
    A global pheromone updating rule is applied
Until End_Condition

Example: TSP. A simple heuristic – the greedy method: from each node, choose the shortest outgoing edge. [Figure: the five-city instance.] Solution: 1-5-4-3-2.

• Each ant builds a solution using a step-by-step constructive decision policy.
• How does ant k choose the next city to visit from city r? Let δ_rs be a distance measure associated with edge (r, s), let η_rs = 1/δ_rs, and let J_r be the set of cities still to be visited. With q drawn uniformly from [0, 1]:
  s = arg max_{u ∈ J_r} { τ_ru · (η_ru)^β }   if q ≤ q_0, and s = S otherwise,
where S is a city selected according to the probability
  p_rs^k = ( τ_rs · (η_rs)^β ) / ( Σ_{u ∈ J_r} τ_ru · (η_ru)^β )   if s ∈ J_r, and p_rs^k = 0 otherwise.

• Local update of pheromone: τ_rs ← (1 − ρ)·τ_rs + ρ·Δτ_rs, where Δτ_rs = τ_0, τ_0 is the initial pheromone level, and 0 < ρ < 1.
• Global update of pheromone: τ_rs ← (1 − α)·τ_rs + α·Δτ_rs, with 0 < α < 1, where Δτ_rs = 1/L_gb if (r, s) lies on the globally best tour and 0 otherwise.
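The transition and update rules above can be sketched as follows; `tau` and `eta` are matrices (lists of lists), `J_r` is the sequence of still-unvisited cities, and the parameter values (beta, q0, rho, alpha, tau0) are illustrative assumptions.

```python
import random

def choose_next_city(r, J_r, tau, eta, beta=2.0, q0=0.9):
    """State transition rule: with probability q0, exploit the best edge out
    of r; otherwise sample s with probability prop. to tau * eta**beta."""
    if random.random() <= q0:
        return max(J_r, key=lambda u: tau[r][u] * eta[r][u] ** beta)
    weights = [tau[r][u] * eta[r][u] ** beta for u in J_r]
    return random.choices(list(J_r), weights=weights, k=1)[0]

def local_update(tau, r, s, rho=0.1, tau0=0.01):
    """tau_rs <- (1 - rho) * tau_rs + rho * tau_0 as edge (r, s) is used."""
    tau[r][s] = (1 - rho) * tau[r][s] + rho * tau0

def global_update(tau, best_tour, best_length, alpha=0.1):
    """Reinforce the edges of the globally best tour with delta = 1/L_gb."""
    for r, s in zip(best_tour, best_tour[1:] + best_tour[:1]):
        tau[r][s] = (1 - alpha) * tau[r][s] + alpha / best_length
```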

3. Particle Swarm Optimization – Kennedy and Eberhart 1995
• The population is initialized by assigning random positions and velocities; the potential solutions are then flown through hyperspace.
• Each particle keeps track of its "best" (highest fitness) position in hyperspace.
  – This is called "pBest" for an individual particle.
  – It is called "gBest" for the best position in the population.
• At each time step, each particle stochastically accelerates toward its pBest and the gBest.

Particle Swarm Optimization Process
1. Initialize the population in hyperspace (includes position and velocity).
2. Evaluate the fitness of each individual particle.
3. Modify velocities based on the personal best and the global best.
4. Terminate on some condition.
5. Go to step 2.

[Figure: initialization – particles are given random positions and velocities.]

[Figure: modifying a velocity – a particle at x_t is pulled toward its pBest and the gBest, giving a new velocity v and a new position x_{t+1}.]

Particle Swarm Optimization: modification of velocities and positions
  v_{t+1} = v_t + c1 * rand() * (pBest − x_t) + c2 * rand() * (gBest − x_t)
  x_{t+1} = x_t + v_{t+1}
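The two update equations map directly onto code; a per-particle sketch, where c1 = c2 = 2.0 is a commonly used (assumed) acceleration setting and a fresh random number is drawn per dimension.

```python
import random

def pso_step(x, v, pbest, gbest, c1=2.0, c2=2.0):
    """One velocity and position update for a single particle; x, v, pbest,
    and gbest are equal-length lists of floats (one entry per dimension)."""
    new_v = [v[d] + c1 * random.random() * (pbest[d] - x[d])
                  + c2 * random.random() * (gbest[d] - x[d])
             for d in range(len(x))]
    new_x = [x[d] + new_v[d] for d in range(len(x))]
    return new_x, new_v
```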

VI. Multiple Trajectory Search (MTS)
1. MTS begins with multiple initial solutions, so multiple trajectories are produced in the search process.
2. The initial solutions are classified as foreground solutions and background solutions, and the search is focused mainly on the foreground solutions.

3. Three candidate local search methods were designed. When we want to search the neighborhood of a solution, they are tested first so that we can choose the one that best fits the landscape around the solution.
4. The search neighborhood begins at a large scale and then contracts step by step until it reaches a pre-defined tiny size, at which point it is reset to its original size.

• The MTS focuses its search mainly on the foreground solutions. It performs an iterated local search using one of the three candidate local search methods. By choosing the local search method that best fits the landscape of a solution's neighborhood, the MTS may find its way to a local optimum or the global optimum.

• Step 1: Find M initial solutions that are uniformly distributed over the solution space.
• Step 2: Iteratively apply local search to all M initial solutions.
• Step 3: Choose K foreground solutions based on the grades of the solutions.
• Step 4: Iteratively apply local search to the K foreground solutions.
• Step 5: If the termination condition is met, stop; otherwise go to Step 3.

• The three local search methods begin their search in a very large "neighborhood". The neighborhood then contracts step by step until it reaches a pre-defined tiny size, after which it is reset to its original size.

Three Local Search Methods
• Local Search 1: searches along each of the n dimensions in a randomly permuted order.
• Local Search 2: searches along a direction determined by about n/4 of the dimensions.

• Local Search 3: also searches along each of the n dimensions in a randomly permuted order, but it evaluates multiple solutions along each dimension within a prespecified range. A sketch of the contracting search range appears below.
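A sketch in the spirit of Local Search 1 for a box-constrained problem on [lo, hi]^n; the probe offsets, contraction factor, and reset threshold are illustrative assumptions rather than the exact MTS parameters.

```python
import random

def mts_like_ls1(x, f, lo, hi, iters=200):
    """Probe each dimension in a randomly permuted order with the current
    search range; halve the range after a pass with no improvement, and
    reset it to the initial size once it becomes tiny."""
    n, sr0 = len(x), (hi - lo) / 2.0
    sr, fx = sr0, f(x)
    for _ in range(iters):
        improved = False
        for d in random.sample(range(n), n):   # random dimension order
            for step in (-sr, sr / 2.0):       # two probes along dimension d
                y = list(x)
                y[d] = min(max(y[d] + step, lo), hi)
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
        if not improved:
            sr /= 2.0                          # contract the neighborhood
            if sr < 1e-6 * sr0:
                sr = sr0                       # reset to the original size
    return x
```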

Multiple Trajectory Search for Large Scale Global Optimization
Winner of the Competition on Large Scale Global Optimization at the 2008 IEEE World Congress on Computational Intelligence.

Multiple Trajectory Search for Unconstrained/Constrained Multi-Objective Optimization
Second and third places in the Competition on Performance Assessment of Constrained/Bound-Constrained Multi-Objective Optimization Algorithms at the 2009 IEEE Congress on Evolutionary Computation.

VII. The Applications of Metaheuristics
1. Solving NP-hard optimization problems:
• Traveling Salesman Problem
• Maximum Clique Problem
• Flow Shop Scheduling Problem
• P-Median Problem
• Feature Selection in Pattern Recognition
• Automatic Clustering
• Machine Learning (e.g., Neural Network Training)
2. Search problems in many applications.

VIII. Conclusions
• For NP-hard optimization problems and complicated search problems, metaheuristic methods are very good choices.
• They obtain better-quality solutions than heuristic methods.
• They are more efficient than branch-and-bound.
• Hybridization of metaheuristics
• How to make the search more systematic?
• How to make the search more controllable?
• How to make the performance scalable?

References
1. Blum, C. and Roli, A., "Metaheuristics in Combinatorial Optimization: Overview and Conceptual Comparison", ACM Computing Surveys, 35(3), 2003, pp. 268-308.
2. Dorigo, M., Optimization, Learning and Natural Algorithms (in Italian), Ph.D. thesis, DEI, Politecnico di Milano, Italy, 1992.
3. Glover, F., "Future paths for integer programming and links to artificial intelligence", Computers & Operations Research, 13, 1986, pp. 533-549.
4. Hansen, P. and Mladenović, N., "An introduction to variable neighborhood search", in Voß, S., Martello, S., Osman, I.H., and Roucairol, C. (Eds.), Metaheuristics: Advances and Trends in Local Search Paradigms for Optimization, Kluwer Academic Publishers, 1999, pp. 433-458.
5. Holland, J.H., Adaptation in Natural and Artificial Systems, The University of Michigan Press, Ann Arbor, MI, 1975.
6. Kennedy, J. and Eberhart, R., "Particle Swarm Optimization", Proceedings of the 1995 IEEE International Conference on Neural Networks, IEEE Press, 1995, pp. 1942-1948.
7. Kirkpatrick, S., Gelatt, C.D., and Vecchi, M.P., "Optimization by simulated annealing", Science, 220(4598), 13 May 1983, pp. 671-680.
8. Osman, I.H. and Laporte, G., "Metaheuristics: A bibliography", Annals of Operations Research, 63, 1996, pp. 513-623.

9. Tseng, L.-Y. and Chen, C., "Multiple Trajectory Search for Large Scale Global Optimization", Proceedings of the 2008 IEEE Congress on Evolutionary Computation, Hong Kong, June 2-6, 2008.
10. Tseng, L.-Y. and Chen, C., "Multiple Trajectory Search for Unconstrained/Constrained Multi-Objective Optimization", Proceedings of the 2009 IEEE Congress on Evolutionary Computation, Trondheim, Norway, May 18-21, 2009.

