
Genetic Algorithms
A step by step tutorial

Max Moorkamp
Dublin Institute for Advanced Studies

Barcelona, 29th November 2005


Outline

1. General remarks and comparison with other methods
2. Genetic algorithms step-by-step
   - A hand-calculated example
   - Some experiments on simple problems
3. Advanced topics
   - Multi-objective optimization
   - A geophysical example


Comparison of different optimization methods

Method                Func. Eval.    Class of Problems
Analytical Gradient   1              One special problem
Numerical Gradient    10 − 10²       differentiable, lim. no. of min.
Genetic Algorithm     10³ − 10⁴      limited parameter range
Monte Carlo           10⁵            none
Grid Search           10⁸            none


Advantages and Disadvantages of Genetic Algorithms

Advantages:
- No derivatives needed
- Easy to parallelize
- Can escape local minima
- Works on a wide range of problems

Disadvantages:
- Need many more function evaluations than linearized methods
- No guaranteed convergence, even to a local minimum
- Have to discretize the parameter space

General program flow of a genetic algorithm

1. Determine the parameter encoding
2. Generate a random population
3. Calculate probabilities from the objective function
4. Select models into a new population
5. Crossover
6. Mutation
Steps 3-6 are repeated for N iterations.

- The general structure is the same for all genetic algorithms
- Individual steps can be done in a variety of ways
- We will only discuss the most basic operators
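To make the flow concrete, here is a minimal Python sketch of this loop for a minimization problem with binary strings, roulette-wheel selection, single-point crossover and bit-flip mutation, as in the hand-calculated example that follows. The helper names (`objective`, `decode`) and all default parameter values are illustrative placeholders, not code from the tutorial.

```python
import random

def run_ga(objective, decode, n_bits=10, pop_size=4, n_generations=6,
           p_crossover=0.6, p_mutation=0.01):
    # 2. Generate a random population of binary strings
    population = [[random.randint(0, 1) for _ in range(n_bits)]
                  for _ in range(pop_size)]
    for _ in range(n_generations):
        # 3. Objective values -> selection probabilities (minimization)
        values = [objective(*decode(ind)) for ind in population]
        worst = max(values)
        weights = [worst - v for v in values]
        if sum(weights) == 0:            # all models equally good
            weights = [1.0] * pop_size
        # 4. Select models into the new population (roulette wheel)
        population = [list(random.choices(population, weights=weights)[0])
                      for _ in range(pop_size)]
        # 5. Crossover: exchange the tails of neighbouring pairs
        for i in range(0, pop_size - 1, 2):
            if random.random() < p_crossover:
                point = random.randint(1, n_bits - 1)
                a, b = population[i], population[i + 1]
                a[point:], b[point:] = b[point:], a[point:]
        # 6. Mutation: flip single bits with a small probability
        for ind in population:
            for j in range(n_bits):
                if random.random() < p_mutation:
                    ind[j] = 1 - ind[j]
    # return the best decoded model of the final generation
    best = min(population, key=lambda ind: objective(*decode(ind)))
    return decode(best)
```

The individual operators used here are spelled out one by one on the following slides.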

Outline

1. General remarks and comparison with other methods
2. Genetic algorithms step-by-step
   - A hand-calculated example
   - Some experiments on simple problems
3. Advanced topics
   - Multi-objective optimization
   - A geophysical example

Encoding the parameters

Model parameters are encoded into a binary string:

x = (1.2, 3.4, ..., 100) ← (0101110011 ... 100011)

For each parameter a minimum value, stepsize and encoding length have to be specified. For example, for two parameters x1 ∈ [−1, 1] and x2 ∈ [0, 3.1]:

        5-bit                      10-bit
        Basevalue   Stepsize       Basevalue   Stepsize
x1      −1          0.0625         −1          0.002
x2      0           0.1            0           0.003
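A minimal sketch of this encoding step, assuming the 5-bit base values and stepsizes from the table; the bit groups are converted to integers with the least significant bit first, which is what reproduces the numbers on the next slide:

```python
def decode(bits, params):
    """bits: string like '0101110011'; params: list of (base, step, n_bits)."""
    values, pos = [], 0
    for base, step, n_bits in params:
        group = bits[pos:pos + n_bits]
        integer = int(group[::-1], 2)      # read LSB-first, as in the example
        values.append(base + integer * step)
        pos += n_bits
    return values

# x1 in [-1, 1] with 5 bits, x2 in [0, 3.1] with 5 bits
params = [(-1.0, 0.0625, 5), (0.0, 0.1, 5)]
print(decode("0101110011", params))   # [0.625, 2.5], as on the next slide
```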

Generating a population

At the start a population of N random models is generated:

p1 = (0101110011) → (01011) = −1 + 26 ∗ 0.0625 = 0.625,   (10011) = 0 + 25 ∗ 0.1 = 2.5
p2 = (1111010110) → (11110) = −1 + 15 ∗ 0.0625 = −0.0625,  (10110) = 0 + 13 ∗ 0.1 = 1.3
p3 = (1001010001) → (10010) = −1 + 9 ∗ 0.0625 = −0.4375,   (10001) = 0 + 17 ∗ 0.1 = 1.7
...
pN = (0110100001) → (01101) = −1 + 22 ∗ 0.0625 = 0.375,    (00001) = 0 + 16 ∗ 0.1 = 1.6

(Each 5-bit group is converted to an integer with the least significant bit first, e.g. (01011) → 26.)
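A sketch of this step, reusing the hypothetical `decode` and `params` from the previous sketch (N = 4 models of 10 bits, as in the example):

```python
import random

def random_population(n_models, n_bits):
    # each model is a random bit string of fixed length
    return ["".join(random.choice("01") for _ in range(n_bits))
            for _ in range(n_models)]

population = random_population(4, 10)
print(population)
print([decode(p, params) for p in population])   # decoded (x1, x2) pairs
```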

Calculating the objective function values

For each member of the population calculate the value of the objective function:

O(x1, x2) = x1² + x2²

p1 → O(0.625, 2.5) = 6.64
p2 → O(−0.0625, 1.3) = 1.69
p3 → O(−0.4375, 1.7) = 3.08
...
pN → O(0.375, 1.6) = 2.70

Assigning probabilities

Based on the objective function value, a probability P(pj) = f(O(xj)) is assigned to make it into the next generation. In our example

P(pj) = (max{O(xi)} − O(xj)) / Σi (max{O(xi)} − O(xi))

In our case Σi (max{O(xi)} − O(xi)) = 12.45, and this gives:

P(p1) = (6.64 − 6.64)/12.45 = 0.00
P(p2) = (6.64 − 1.69)/12.45 = 0.40
P(p3) = (6.64 − 3.08)/12.45 = 0.28
P(p4) = (6.64 − 2.70)/12.45 = 0.32

Now we can generate a new population of the same size.
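A sketch of this scaling; the numbers passed in are the objective values from the previous slide:

```python
def selection_probabilities(obj_values):
    # P(p_j) = (max O - O_j) / sum_i (max O - O_i); assumes not all values are equal
    worst = max(obj_values)
    weights = [worst - v for v in obj_values]
    total = sum(weights)
    return [w / total for w in weights]

probs = selection_probabilities([6.64, 1.69, 3.08, 2.70])
print([round(p, 2) for p in probs])   # [0.0, 0.4, 0.29, 0.32] -- cf. the rounded values above
```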

Selection

Each model's chance to mate is determined by its probability. The offspring undergo two steps: crossover and mutation.

Mating Cand.   Old Pos.   Prob.
m1             p2         0.40
m2             p3         0.28
m3             p4         0.32
m4             p2         0.40
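One common way to implement this step is roulette-wheel selection, sketched here with the strings and probabilities of the running example:

```python
import random

def select_mating_pool(population, probs):
    # fitness-proportionate ("roulette wheel") selection with replacement
    return random.choices(population, weights=probs, k=len(population))

population = ["0101110011", "1111010110", "1001010001", "0110100001"]
probs = [0.00, 0.40, 0.28, 0.32]
print(select_mating_pool(population, probs))   # e.g. [p2, p3, p4, p2], as in the table
```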

Crossover

With probability Pc two members of the population exchange their genes after a randomly chosen point, e.g. after position 3:

(000|111)          (000|001)
            −→
(110|001)          (110|111)

Typically 0.2 ≤ Pc ≤ 0.7.
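A sketch of single-point crossover; forcing the crossover point to 3 reproduces the example above (p_c = 0.6 is just a value in the typical range):

```python
import random

def crossover(a, b, p_c=0.6, point=None):
    # with probability p_c, exchange everything after a randomly chosen point
    if random.random() < p_c:
        if point is None:
            point = random.randint(1, len(a) - 1)
        return a[:point] + b[point:], b[:point] + a[point:]
    return a, b

print(crossover("000111", "110001", p_c=1.0, point=3))   # ('000001', '110111')
```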

Mutation

With a probability Pm one bit changes its value:

(000111) → (010111)

Typically Pm is chosen so that 80-90 percent of the population do not mutate.
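A sketch of bit-flip mutation; the per-bit probability p_m used here is an illustrative value chosen so that most strings pass through unchanged:

```python
import random

def mutate(bits, p_m=0.015):
    # flip each bit independently with probability p_m
    return "".join(str(1 - int(b)) if random.random() < p_m else b for b in bits)

# With 10-bit strings, p_m around 0.01-0.02 leaves roughly 80-90 % of strings untouched,
# since (1 - p_m)**10 is about 0.82-0.90.
print(mutate("000111", p_m=0.2))   # e.g. '010111' if only the second bit flips
```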

Combined effect of the reproduction operators

Result of selection, crossover and mutation:

New   Father   Mother   Cross.   Mut.   Result                              O(x)
p1    m1       m2       −        −      (1111010110)                        1.69
p2    m1       m2       −        −      (1001010001)                        3.08
p3    m3       m4       4        −      (0110|100001) → (0110010110)        2.08
p4    m3       m4       4        3      (1111|010110) → (1101100001)        3.03

The resulting genetic strings make up the new population.

Outline

1. General remarks and comparison with other methods
2. Genetic algorithms step-by-step
   - A hand-calculated example
   - Some experiments on simple problems
3. Advanced topics
   - Multi-objective optimization
   - A geophysical example

A slightly more difficult problem

O(x) = (x − 6)(x − 2)(x + 4)(x + 6) has two minima: O(−5.11) = −78.03 and O(4.41) = −335.475.

We search x ∈ [−10, 20] with 15 bits and a stepsize of 0.001. N = 20 and we calculate 6 generations, i.e. 120 forward calculations.

The objective function ↔ probability scaling is P(pi) ∝ exp(−O(xi)/T), with

T = 1000 for generations g < 3
T = 100  for generations g ≥ 3
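A sketch of this objective and of the exponential probability scaling; the temperature schedule follows the values quoted above:

```python
import math

def objective(x):
    return (x - 6) * (x - 2) * (x + 4) * (x + 6)

def boltzmann_probabilities(obj_values, generation):
    # P(p_i) proportional to exp(-O(x_i)/T), with the temperature lowered after generation 3
    T = 1000.0 if generation < 3 else 100.0
    weights = [math.exp(-v / T) for v in obj_values]
    total = sum(weights)
    return [w / total for w in weights]

print(round(objective(-5.11), 2), round(objective(4.41), 2))   # approx. -78.04 and -335.48
```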

Statistics of GA performance

[Figure: two panels — the function value of the best member and the mean performance of the population vs. generation, and the fitness of the GA result compared with the starting value over 2000 runs.]

Performance of the GA

[Figure: population members (+) plotted on the objective function O(x) for x ∈ [−10, 20], one panel per generation 0 to 5.]

Looking at some statistics

Histogram of x for 2000 runs with N = 20 and g = 6:

[Figure: histogram of the best x found in each run.]

Min.      1st Qu.   Median    Mean      3rd Qu.   Max.
−5.671    4.255     4.383     4.215     4.465     6.389

A hard problem

O(x) = |floor(x)| + 0.5 (1 − (|x| − |floor(x)|))

- N = 20
- 6 generations
- T = 1000

[Figure: the objective function for x ∈ [−10, 20] and a histogram of the result of 2000 runs.]

Statistics for the hard problem

Statistical results for different GA parameters (population size / generations / T):

[Table: Min., 1st Qu., Median, Mean, 3rd Qu. and Max. of the result over 2000 runs for the settings 20/6/1000, 30/10/1000, 20/6/10 and 30/10/10.]

- The algorithm fails to find the exact minimum, but identifies the general region
- More iterations do not improve the mean/median performance, but decrease diversity in the population
- In this case the parameter T has no big influence on the result

Outline

1. General remarks and comparison with other methods
2. Genetic algorithms step-by-step
   - A hand-calculated example
   - Some experiments on simple problems
3. Advanced topics
   - Multi-objective optimization
   - A geophysical example

  . .   O1 (xi )  O2 (xi )    O(xi ) =   . OM (xi ) Usually the weighted norm of this vector is then minimized e.g M W · O(xi ) 2 = j=1 (Wj Oj (xi ))2 Genetic algorithms can also directly work on the vector without weighting.General remarks and comparison with other method Genetic algorithms step-by-step Advanced topics Summary Multi-objective optimization A geophysical example Multi-objective optimization Multi-objective optimization means that we try to minimize several functions simultaneously. Max Moorkamp Genetic Algorithms .

Pareto optimality and domination

Domination: (x <p y) ⇔ (∀ i: xi ≤ yi ∧ ∃ i: xi < yi)

If a member of the population is not dominated by any other it is called non-dominated or Pareto-optimal.

[Figure: population fitness in generation 1, model roughness vs. MT RMS, with the members grouped into ranks 1 to 4.]
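A sketch of the domination test and of grouping a population into ranks (the non-dominated front is rank 1, the non-dominated front of the remainder is rank 2, and so on); the two-objective vectors in the example are made-up numbers in the spirit of the figure:

```python
def dominates(x, y):
    """x dominates y if it is no worse in every objective and better in at least one."""
    return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

def pareto_ranks(obj_vectors):
    ranks, remaining, rank = {}, set(range(len(obj_vectors))), 1
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(obj_vectors[j], obj_vectors[i])
                            for j in remaining if j != i)}
        for i in front:
            ranks[i] = rank
        remaining -= front
        rank += 1
    return [ranks[i] for i in range(len(obj_vectors))]

# two objectives per model, e.g. (MT RMS, model roughness); illustrative numbers
print(pareto_ranks([(10, 0.04), (12, 0.02), (15, 0.05), (11, 0.03)]))   # [1, 1, 2, 1]
```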

Transforming rank into probability

Pi = (1 − r/rmax)^α   for r < rmax
Pi = 0                for r ≥ rmax

with rmax = 40.

[Figure: fitness as a function of rank for α = 0.5, 1, 2, 3, 4.]
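The transform itself is a one-liner; α = 2 below is just one of the curves shown in the figure:

```python
def rank_probability(r, r_max=40, alpha=2.0):
    # P_i = (1 - r/r_max)**alpha for r < r_max, and 0 otherwise
    if r >= r_max:
        return 0.0
    return (1.0 - r / r_max) ** alpha

print([round(rank_probability(r), 3) for r in (1, 10, 20, 40)])
# larger alpha concentrates the selection probability on the best (lowest) ranks
```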

Designing an objective function

In linearized methods the objective function has to be scalar and is a linear combination of the individual objective functions:

Olin(di, si) = WMT OMT(di, si) + WRec ORec(di, si) + WReg OReg(xj)

WMT, WRec and WReg are usually constant, but could be variable during inversion.

In contrast genetic algorithms can handle vector-valued objective functions without a need to weight the individual objective functions:

Ogen(di, xj) = ( OMT(di, f(xj)), ORec(di, f(xj)), OReg(xj) )

with, for example,

OMT(di, f(xj)) = (1/n) Σ i=1..n ((di − si)/∆di)²,   si ∈ [ρa, φ, Z]

Outline

1. General remarks and comparison with other methods
2. Genetic algorithms step-by-step
   - A hand-calculated example
   - Some experiments on simple problems
3. Advanced topics
   - Multi-objective optimization
   - A geophysical example

Inversion of synthetic data

A simple two-layer model is inverted to test the inversion.

[Figure: input model and inversion result (S-velocity, density, P-velocity and resistivity vs. depth) together with the data fit (apparent resistivity ρa and phase vs. period for the input data and the inversion).]

Evolution of misfit and roughness

[Figure: best models without regularization — receiver function misfit vs. MT misfit for iterations 0, 20, 40, 60, 80 and 100, together with the maximum, minimum and median misfit and the minimum and median roughness.]

Inversion of a more realistic model

[Figure: inversion fit to the MT data (apparent resistivity and phase vs. period), the resulting MT models (resistivity vs. depth), the receiver function fit (amplitude vs. time) and the S-velocity models compared with the original model.]

Summary

- Genetic algorithms can be applied to a wide range of problems
- Their flexibility comes at the cost of more function evaluations
- There are many variants of genetic algorithms available, each with its own strengths and weaknesses

For Further Reading

D. E. Goldberg: Genetic Algorithms in Search, Optimization & Machine Learning. Addison-Wesley, 1990.

K. Deb: A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, Vol. 6, No. 2, April 2002.

Kanpur Genetic Algorithm Laboratory: NSGA-II and general information, http://www.iitk.ac.in/kangal/index.shtml

M. Wall: GALib, a genetic algorithm programming library in C++, http://lancet.mit.edu/ga
