

Introduction To Genetic Algorithms

Dr. Rajib Kumar Bhattacharjya


Department of Civil Engineering
IIT Guwahati
Email: rkbc@iitg.ernet.in
19 October 2012

References
D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, New York: Addison Wesley (1989).

John H. Holland, "Genetic Algorithms", Scientific American, July 1992.

Kalyanmoy Deb, "An Introduction to Genetic Algorithms", Sadhana, Vol. 24, Parts 4 and 5 (1999).

Introduction to optimization

[Figure: a multi-modal function with one global optimum and several local optima.]

Introduction to optimization

[Figure: a function with multiple optimal solutions.]

Genetic Algorithms

Genetic Algorithms are heuristic search and optimization techniques that mimic the process of natural evolution.

Principle of Natural Selection

"Select the best, discard the rest"

An Example.

Giraffes have long necks.

Giraffes with slightly longer necks could feed on leaves of higher branches when all the lower ones had been eaten off. They therefore had a better chance of survival. This favorable characteristic propagated through generations of giraffes. Now, the evolved species has long necks.

An Example.

The longer neck may initially have appeared due to the effect of mutation. However, as it was favorable, it was propagated over the generations.

Evolution of species

Initial population of animals → struggle for existence → survival of the fittest → surviving individuals reproduce and propagate favorable characteristics → (over millions of years) → evolved species


Thus genetic algorithms implement optimization strategies by simulating the evolution of species through natural selection.

Simple Genetic Algorithms



Flowchart: Start → Initialize population (T = 0) → Evaluate solutions → Optimum solution? If yes, stop; if no → Selection → Crossover → Mutation → T = T + 1 → Evaluate solutions again.

Simple Genetic Algorithm



function sga()
{
Initialize population;
Calculate fitness function;
While(fitness value != termination criteria)
{
Selection;
Crossover;
Mutation;
Calculate fitness function;

}
}
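A minimal runnable sketch of the same loop (in Python, as an illustration; the bit-string length, population size, rate parameters and the 1s-counting fitness below are arbitrary assumptions):

import random

def sga(fitness, n_bits=10, pop_size=20, pc=0.8, pm=0.05, max_gen=100):
    # Initialize population of random bit strings
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(max_gen):
        fit = [fitness(ind) for ind in pop]                 # calculate fitness function
        new_pop = []
        while len(new_pop) < pop_size:
            # Selection: binary tournament
            i1 = max(random.sample(range(pop_size), 2), key=lambda i: fit[i])
            i2 = max(random.sample(range(pop_size), 2), key=lambda i: fit[i])
            c1, c2 = pop[i1][:], pop[i2][:]
            # Crossover: single point, applied with probability pc
            if random.random() < pc:
                k = random.randint(1, n_bits - 1)
                c1[k:], c2[k:] = c2[k:], c1[k:]
            # Mutation: flip each bit with probability pm
            for c in (c1, c2):
                for j in range(n_bits):
                    if random.random() < pm:
                        c[j] = 1 - c[j]
            new_pop.extend([c1, c2])
        pop = new_pop[:pop_size]
    return max(pop, key=fitness)

best = sga(fitness=sum)   # example: maximize the number of 1s in the string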


Nature to Computer Mapping



Nature            Computer
Population        Set of solutions
Individual        Solution to a problem
Fitness           Quality of a solution
Chromosome        Encoding for a solution
Gene              Part of the encoding of a solution
Reproduction      Crossover

GA Operators and Parameters



Selection

Crossover

Mutation

Elitism


Encoding

Encoding is the process of representing a solution in the form of a string that conveys the necessary information. Just as each gene in a chromosome controls a particular characteristic of the individual, each bit in the string represents a characteristic of the solution.

Encoding Methods

The most common method of encoding is binary coding. Chromosomes are strings of 1s and 0s, and each position in the chromosome represents a particular characteristic of the problem.

[Figure: mapping between binary strings and their decoded decimal values, e.g. decoded values 52 and 26.]

Encoding Methods

Defining a string: a solution with two decision variables d and h, (d, h) = (8, 10) cm, is encoded as the chromosome [01000 01010], where the first five bits decode to d and the last five bits decode to h.
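A small decoding sketch (Python, assuming the 10-bit chromosome is read as two 5-bit substrings, one per decision variable):

def decode(chromosome, n_bits=5):
    # Split the bit string into equal-length substrings and decode each to an integer
    return [int(chromosome[i:i + n_bits], 2) for i in range(0, len(chromosome), n_bits)]

d, h = decode("0100001010")   # -> 8, 10, i.e. (d, h) = (8, 10) cm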

Fitness function

A fitness function value quantifies the optimality of a solution. The value is used to rank a particular solution against all the other solutions. A fitness value is assigned to each solution depending on how close it actually is to the optimal solution of the problem.

Assigning a fitness function



Considering c = 0.0654, the fitness (cost) of the chromosome [01000 01010], i.e. (d, h) = (8, 10) cm, works out to about 23; this follows the can-design example of Deb (1999), where the cost is c(πdh + πd²/2), so 0.0654 × (π × 8 × 10 + π × 8²/2) ≈ 23.

Selection

Selection is the process that determines which solutions are to be preserved and allowed to reproduce and which ones deserve to die out. The primary objective of the selection operator is to emphasize the good solutions and eliminate the bad solutions in a population while keeping the population size constant.

Selects the best, discards the rest.

Selection

Identify the good solutions in a population.

Make multiple copies of the good solutions.

Eliminate bad solutions from the population so that multiple copies of good solutions can be placed in the population.

Selection

There are different techniques to implement selection in Genetic Algorithms. They are:
 Tournament selection
 Roulette wheel selection
 Rank selection
 Steady-state selection, etc.

Tournament selection

In tournament selection, several tournaments are played among a few individuals chosen at random from the population. The winner of each tournament is selected for the next generation. Selection pressure can be adjusted easily by changing the tournament size: weak individuals have a smaller chance of being selected if the tournament size is large.
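A sketch of the operator (Python; the tournament size k and the use of raw fitness for comparison are assumptions):

import random

def tournament_selection(population, fitness, k=2):
    # Build a mating pool of the same size using k-way tournaments (maximization)
    mating_pool = []
    for _ in range(len(population)):
        competitors = random.sample(population, k)           # pick k individuals at random
        mating_pool.append(max(competitors, key=fitness))    # the fittest wins the tournament
    return mating_pool

# Example: individuals represented directly by their fitness values
pool = tournament_selection([32, 22, 25, 40], fitness=lambda x: x, k=2)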

Tournament selection

[Figure: example of tournament selection — pairs of solutions are drawn at random from the population, their fitness values are compared, and the winner of each tournament is copied into the mating pool.]

Roulette wheel selection



Parents are selected according to their fitness values. The better chromosomes have more chances to be selected.

Chromosome #     Fitness     % of Roulette wheel
Chromosome 1     127.50      51
Chromosome 2      50.00      20
Chromosome 3      25.00      10
Chromosome 4      20.00       8
Chromosome 5      15.00       6
Chromosome 6      12.50       5

[Figure: roulette wheel with sectors proportional to the fitness percentages above.]
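A sketch of one spin of the wheel (Python; using the fitness values from the table above):

import random

def roulette_wheel_select(population, fitness_values):
    # Select one individual with probability proportional to its fitness
    total = sum(fitness_values)
    r = random.uniform(0, total)          # spin the wheel
    cumulative = 0.0
    for individual, f in zip(population, fitness_values):
        cumulative += f
        if r <= cumulative:
            return individual
    return population[-1]                 # guard against floating-point rounding

chromosomes = [1, 2, 3, 4, 5, 6]
fitness = [127.5, 50.0, 25.0, 20.0, 15.0, 12.5]
picked = roulette_wheel_select(chromosomes, fitness)   # chromosome 1 wins about 51% of the time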

Rank selection

The Roulette wheel selection will have problems when the fitness values differ very much. In this case, the chromosomes with low fitness values will have very little chance of being selected. This problem can be avoided using rank selection.

Chromosome #     Fitness     % of Roulette wheel     Rank     % of wheel with rank selection
Chromosome 1     127.50      51                      6        28
Chromosome 2      50.00      20                      5        24
Chromosome 3      25.00      10                      4        19
Chromosome 4      20.00       8                      3        14
Chromosome 5      15.00       6                      2        10
Chromosome 6      12.50       5                      1         5

[Figure: roulette wheel with sectors proportional to rank instead of raw fitness.]
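A sketch of rank selection (Python; ranks are assigned 1 for the worst up to N for the best, and the wheel is then spun over the ranks, reproducing the percentages in the table above):

import random

def rank_selection(population, fitness_values):
    order = sorted(range(len(population)), key=lambda i: fitness_values[i])
    rank = [0] * len(population)
    for r, i in enumerate(order, start=1):   # worst gets rank 1, best gets rank N
        rank[i] = r
    pick = random.uniform(0, sum(rank))      # roulette wheel over ranks instead of raw fitness
    cumulative = 0
    for individual, r in zip(population, rank):
        cumulative += r
        if pick <= cumulative:
            return individual
    return population[-1]

# With the table above, chromosome 1 is now selected with probability 6/21, i.e. about 28%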

Steady state selection



In this method, a few good chromosomes are used for creating new offspring in every iteration. Then some bad chromosomes are removed and the new offspring are placed in their place. The rest of the population migrates to the next generation without going through the selection process.

Binary Crossover




The crossover operator is used to create new solutions from the existing solutions available in the mating pool after applying the selection operator. This operator exchanges gene information between the solutions in the mating pool. The most popular crossover selects any two solution strings randomly from the mating pool and exchanges some portion of the strings between them. The crossover point is selected randomly. A crossover probability is also introduced in order to give each individual solution string the freedom to determine whether it will undergo crossover or not.
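A sketch of single-point crossover with a crossover probability (Python; pc = 0.8 is an assumed value):

import random

def single_point_crossover(parent1, parent2, pc=0.8):
    # With probability 1 - pc the parents are copied unchanged
    if random.random() >= pc:
        return parent1[:], parent2[:]
    point = random.randint(1, len(parent1) - 1)     # crossover site chosen at random
    child1 = parent1[:point] + parent2[point:]      # exchange the tails of the two strings
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

c1, c2 = single_point_crossover([1, 1, 0, 1, 0, 0], [0, 0, 1, 0, 1, 1])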

Binary Crossover

[Figure: single-point crossover on binary strings. Source: Deb (1999)]

Binary Mutation





Mutation is the occasional introduction of new features into the solution strings of the population pool to maintain diversity in the population. Though crossover has the main responsibility of searching for the optimal solution, mutation is also used for this purpose. The mutation operator changes a 1 to 0 or vice versa with a small mutation probability. The mutation probability is generally kept low for steady convergence. A high value of mutation probability would make the search behave like a random search. (Source: Deb, 1999)
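A sketch of bitwise mutation (Python; pm is the per-bit mutation probability, kept small):

import random

def bitwise_mutation(chromosome, pm=0.01):
    # Flip each bit with probability pm (a 1 becomes 0 and vice versa)
    return [1 - bit if random.random() < pm else bit for bit in chromosome]

mutant = bitwise_mutation([1, 1, 0, 1, 0, 0], pm=0.05)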

Elitism

Crossover and mutation may destroy the best solution of the population pool. Elitism is the preservation of a few of the best solutions of the population pool. Elitism is defined as a percentage or as a number.

An example problem

Consider a 5-bit string to represent the solution; then 00000 = 0 and 11111 = 31.

Let us assume a population size of 4.

An example problem

[Figure: hand-simulation of one GA generation on the example problem. Source: Deb (1999)]

Real coded Genetic Algorithms



Disadvantages of binary-coded GA:
 More computation
 Lower accuracy
 Longer computing time
 Solution space discontinuity
 Hamming cliff

Real coded Genetic Algorithms



The standard genetic algorithm has the following steps:

1. Choose initial population
2. Assign a fitness function
3. Perform elitism
4. Perform selection
5. Perform crossover
6. Perform mutation

In the case of standard (binary-coded) Genetic Algorithms, steps 5 and 6 require bitwise manipulation.

Real coded Genetic Algorithms



Simple crossover: similar to binary crossover. With the crossover point after the second gene:

P1 = [8 6 | 3 7 6]
P2 = [2 9 | 4 8 9]
C1 = [8 6 4 8 9]
C2 = [2 9 3 7 6]

Real coded Genetic Algorithms



Linear Crossover

Parents: (x1, ..., xn) and (y1, ..., yn)
Select a single gene (k) at random.
Three children are created as:

(x1, ..., 0.5yk + 0.5xk, ..., xn)
(x1, ..., 1.5yk - 0.5xk, ..., xn)
(x1, ..., -0.5yk + 1.5xk, ..., xn)

From the three children, the best two are selected for the next generation.

Real coded Genetic Algorithms



Single arithmetic crossover

Parents: (x1, ..., xn) and (y1, ..., yn)
Select a single gene (k) at random.
Child 1 is created as:

(x1, ..., α·yk + (1 - α)·xk, ..., xn)

Reverse x and y for the other child. E.g. with α = 0.5:

Parent 1: 0.5 0.7 0.7 0.5 0.2 0.8 0.3 0.9 0.4
Child 1:  0.5 0.7 0.7 0.5 0.2 0.5 0.3 0.9 0.4
Parent 2: 0.1 0.3 0.1 0.3 0.7 0.2 0.5 0.1 0.2
Child 2:  0.1 0.3 0.1 0.3 0.7 0.5 0.5 0.1 0.2

Real coded Genetic Algorithms



Simple arithmetic crossover

Parents: (x1, ..., xn) and (y1, ..., yn)
Pick a random gene (k); after this point, mix the values.
Child 1 is created as:

(x1, ..., xk, α·yk+1 + (1 - α)·xk+1, ..., α·yn + (1 - α)·xn)

Reverse x and y for the other child. E.g. with α = 0.5:

Parent 1: 0.5 0.7 0.7 0.5 0.2 0.8 0.3 0.9 0.4
Child 1:  0.5 0.7 0.7 0.5 0.2 0.5 0.4 0.5 0.3
Parent 2: 0.1 0.3 0.1 0.3 0.7 0.2 0.5 0.1 0.2
Child 2:  0.1 0.3 0.1 0.3 0.7 0.5 0.4 0.5 0.3

Real coded Genetic Algorithms



Whole arithmetic crossover

Most commonly used.
Parents: (x1, ..., xn) and (y1, ..., yn)
Child 1 is:

α·x + (1 - α)·y   (applied to every gene)

Reverse x and y for the other child. E.g. with α = 0.5:

Parent 1: 0.5 0.7 0.7 0.5 0.2 0.8 0.3 0.9 0.4
Child 1:  0.3 0.5 0.4 0.4 0.4 0.5 0.4 0.5 0.3
Parent 2: 0.1 0.3 0.1 0.3 0.6 0.2 0.5 0.1 0.2
Child 2:  0.3 0.5 0.4 0.4 0.4 0.5 0.4 0.5 0.3
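A sketch of whole arithmetic crossover (Python); with α = 0.5 it reproduces the example above, where both children are the gene-wise average of the parents:

def whole_arithmetic_crossover(x, y, alpha=0.5):
    # Blend every gene of the two real-valued parents with weight alpha
    child1 = [alpha * xi + (1 - alpha) * yi for xi, yi in zip(x, y)]
    child2 = [alpha * yi + (1 - alpha) * xi for xi, yi in zip(x, y)]
    return child1, child2

p1 = [0.5, 0.7, 0.7, 0.5, 0.2, 0.8, 0.3, 0.9, 0.4]
p2 = [0.1, 0.3, 0.1, 0.3, 0.6, 0.2, 0.5, 0.1, 0.2]
c1, c2 = whole_arithmetic_crossover(p1, p2)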

Simulated binary crossover



Developed by Deb and Agrawal (1995).

The children are computed from the parents using a spread factor derived from a random number, together with a distribution index that controls the crossover process; a high value of this parameter will create near-parent solutions.
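A sketch following the commonly published SBX formulation (Python; the distribution index value used here is an assumption):

import random

def sbx(p1, p2, eta_c=2.0):
    # Simulated binary crossover on one pair of real-valued variables
    u = random.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta_c + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta_c + 1.0))
    c1 = 0.5 * ((1 + beta) * p1 + (1 - beta) * p2)
    c2 = 0.5 * ((1 - beta) * p1 + (1 + beta) * p2)
    return c1, c2

child1, child2 = sbx(3.0, 7.0, eta_c=10.0)   # a large eta_c keeps the children near the parents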

Random mutation

The variable is perturbed by a random amount proportional to a random number drawn between [0, 1], scaled by the user-defined maximum perturbation.

Normally distributed mutation



A simple and popular method: the variable is perturbed by a random value drawn from a Gaussian probability distribution with zero mean (and a user-chosen standard deviation).
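Sketches of the two mutation operators (Python; the forms below follow the usual definitions, e.g. in Deb (1999), and the Δ and σ values are assumptions):

import random

def random_mutation(x, delta_max=0.5):
    # Uniform (random) mutation: perturb x within the user-defined maximum perturbation
    r = random.random()                  # random number in [0, 1]
    return x + (r - 0.5) * delta_max

def gaussian_mutation(x, sigma=0.1):
    # Normally distributed mutation: add a zero-mean Gaussian perturbation
    return x + random.gauss(0.0, sigma)

y1 = random_mutation(3.2)
y2 = gaussian_mutation(3.2)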

Polynomial mutation

Multi-modal optimization

After Generation 200



Multi-modal optimization

Sharing function: sh(dij) = 1 - dij/σshare if dij < σshare, and 0 otherwise.

Niche count: nci = Σj sh(dij), summed over all members of the population (including solution i itself).

Modified fitness: fi′ = fi / nci.

(The hand calculation below uses σshare = 0.5.)

Hand calculation

Maximize f(x)

Sol   String   Decoded value   x       f(x)
1     110100   52              1.651   0.890
2     101100   44              1.397   0.942
3     011101   29              0.921   0.246
4     001011   11              0.349   0.890
5     110000   48              1.524   0.997
6     101110   46              1.460   0.992

Distance table

dij     1       2       3       4       5       6
1       0       0.254   0.730   1.302   0.127   0.191
2       0.254   0       0.476   1.048   0.127   0.063
3       0.730   0.476   0       0.572   0.603   0.539
4       1.302   1.048   0.572   0       1.175   1.111
5       0.127   0.127   0.603   1.175   0       0.064
6       0.191   0.063   0.539   1.111   0.064   0

Sharing function values



sh(dij)   1       2       3       4       5       6       nc
1         1       0.492   0       0       0.746   0.618   2.856
2         0.492   1       0.048   0       0.746   0.874   3.160
3         0       0.048   1       0       0       0       1.048
4         0       0       0       1       0       0       1.000
5         0.746   0.746   0       0       1       0.872   3.364
6         0.618   0.874   0       0       0.872   1       3.364

Sharing fitness value



Sol   String   Decoded value   x       f(x)    nc      f′ = f/nc
1     110100   52              1.651   0.890   2.856   0.312
2     101100   44              1.397   0.942   3.160   0.300
3     011101   29              0.921   0.246   1.048   0.235
4     001011   11              0.349   0.890   1.000   0.890
5     110000   48              1.524   0.997   3.364   0.296
6     101110   46              1.460   0.992   3.364   0.295
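A sketch of the sharing computation (Python; σshare = 0.5, which reproduces the niche counts and modified fitness values in the tables above):

def shared_fitness(x_values, fitness, sigma_share=0.5):
    # Divide each fitness by its niche count, computed with the linear sharing function
    modified = []
    for i, xi in enumerate(x_values):
        nc = 0.0
        for xj in x_values:
            d = abs(xi - xj)
            if d < sigma_share:
                nc += 1.0 - d / sigma_share    # sh(d); sh(0) = 1 for the solution itself
        modified.append(fitness[i] / nc)
    return modified

x = [1.651, 1.397, 0.921, 0.349, 1.524, 1.460]
f = [0.890, 0.942, 0.246, 0.890, 0.997, 0.992]
print(shared_fitness(x, f))   # approximately [0.312, 0.300, 0.235, 0.890, 0.296, 0.295]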

Solutions obtained using the modified fitness value

Evolutionary Strategies

ES use real parameter values.

ES does not use a crossover operator.

It is just like a real-coded genetic algorithm with selection and mutation operators only.

Two members ES: (1+1) ES



In each iteration, one parent is used to create one offspring by using the Gaussian mutation operator.

Two members ES: (1+1) ES






Step 1: Choose an initial solution and a mutation strength.
Step 2: Create a mutated solution.
Step 3: If the mutated solution is better than the parent, replace the parent with it.
Step 4: If the termination criterion is satisfied, stop; else go to Step 2.

Two members ES: (1+1) ES



The strength of the algorithm lies in choosing the proper value of the mutation strength.

Rechenberg's postulate: the ratio of successful mutations to all mutations should be 1/5. If this ratio is greater than 1/5, increase the mutation strength; if it is less than 1/5, decrease the mutation strength.

Two members ES: (1+1) ES



A mutation is defined as successful if the mutated offspring is better than the parent solution. For the ratio of successful mutations observed over n trials, Schwefel (1981) suggested a constant factor for the corresponding update rule of the mutation strength (see the sketch below).
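A sketch of the (1+1)-ES loop with this adaptation rule (Python; the update factor c = 0.817 is the value commonly attributed to Schwefel (1981), and the test function, adaptation interval and iteration budget are assumptions):

import random

def one_plus_one_es(f, x0, sigma=1.0, n_trials=10, c=0.817, max_iter=1000):
    # Minimize f starting from x0, adapting sigma with the 1/5 success rule
    x, fx = x0, f(x0)
    successes = 0
    for t in range(1, max_iter + 1):
        y = [xi + random.gauss(0.0, sigma) for xi in x]   # Gaussian mutation
        fy = f(y)
        if fy < fx:                                       # successful mutation
            x, fx = y, fy
            successes += 1
        if t % n_trials == 0:                             # adapt sigma every n trials
            if successes / n_trials > 0.2:
                sigma /= c      # too many successes: increase the mutation strength
            elif successes / n_trials < 0.2:
                sigma *= c      # too few successes: decrease the mutation strength
            successes = 0
    return x, fx

# Hypothetical example: minimize a quadratic whose optimum is near (3, 2)
best_x, best_f = one_plus_one_es(lambda v: (v[0] - 3) ** 2 + (v[1] - 2) ** 2, [0.0, 0.0])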

Matlab code

Some results of 1+1 ES



[Figure: contour plot of the objective function showing the progress of the (1+1)-ES.]

Optimal solution: X* = [3.00, 1.99]
Objective function value: f = 0.0031007

Multimember ES

(μ + λ) ES

Step 1: Choose an initial population of μ solutions and a mutation strength.
Step 2: Create λ mutated solutions.
Step 3: Combine the μ parent solutions and the λ mutated solutions, and choose the best μ solutions.
Step 4: Terminate? Else go to Step 2.

Multimember ES

[Diagrams: schematic of the multimember ES cycle — offspring are created from the current population through mutation, and the next generation is obtained through selection.]

Multi-objective optimization

[Figure: trade-off between two conflicting objectives, Comfort and Price.]

Multi-objective optimization

The two objectives are:
 Minimize weight
 Minimize deflection

Multi-objective optimization





 More than one objective
 Objectives are conflicting in nature
 Dealing with two search spaces: the decision variable space and the objective space
 There is a unique mapping between the two spaces, and often the mapping is non-linear
 The properties of the two search spaces are not similar
 Proximity of two solutions in one search space does not imply proximity in the other search space

Multi-objective optimization

NSGA II

Non-dominated Sorting Genetic Algorithm

 NSGA-II is an elitist non-dominated sorting Genetic Algorithm for solving multi-objective optimization problems, developed by Prof. K. Deb and his students at IIT Kanpur.
 It has been reported that NSGA-II can converge to the global Pareto-optimal front and can maintain the diversity of the population on the Pareto-optimal front.

Non-dominated sorting

[Figure: solutions plotted in the objective space (Objective 1 vs Objective 2, both to be minimized), showing the feasible and infeasible regions; non-dominated sorting ranks the population into fronts, with the non-dominated solutions forming the first front.]
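A sketch of the dominance check and the first non-dominated front (Python, for two minimization objectives):

def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly better in at least one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(objectives):
    # Indices of the solutions that no other solution dominates (front 1)
    return [i for i, a in enumerate(objectives)
            if not any(dominates(b, a) for j, b in enumerate(objectives) if j != i)]

# Hypothetical objective values for six solutions (both objectives minimized)
points = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0), (3.0, 4.0), (5.0, 2.0), (2.5, 2.5)]
print(non_dominated_front(points))   # -> [0, 1, 2, 5]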

NSGA-II flowchart:

1. Initialize a population of size N.
2. Calculate all the objective functions.
3. Rank the population according to the non-domination criterion.
4. Selection.
5. Crossover.
6. Mutation.
7. Calculate the objective functions of the new population.
8. Combine the old and new populations.
9. Perform non-dominated ranking on the combined population.
10. Calculate the crowding distance of all the solutions.
11. Get the N members from the combined population on the basis of rank and crowding distance, i.e. replace the parent population by the better members of the combined population.
12. If the termination criterion is satisfied, report the Pareto-optimal solutions and stop; otherwise go to step 4.

Calculation of crowding distance
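A sketch of the crowding distance calculation as defined in NSGA-II — the distance of a solution is the sum, over the objectives, of the normalized gap between its two neighbours on the same front, with boundary solutions assigned an infinite distance (Python):

def crowding_distance(front):
    # front: list of objective-value tuples belonging to the same non-dominated front
    n = len(front)
    dist = [0.0] * n
    for m in range(len(front[0])):
        order = sorted(range(n), key=lambda i: front[i][m])
        f_min, f_max = front[order[0]][m], front[order[-1]][m]
        dist[order[0]] = dist[order[-1]] = float('inf')   # boundary solutions are always kept
        if f_max == f_min:
            continue
        for k in range(1, n - 1):
            dist[order[k]] += (front[order[k + 1]][m] - front[order[k - 1]][m]) / (f_max - f_min)
    return dist

print(crowding_distance([(1.0, 5.0), (2.0, 3.0), (2.5, 2.5), (4.0, 1.0)]))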

Replacement scheme of NSGA II



[Diagram: the parent population Pt and the offspring population Qt are combined and sorted into non-dominated fronts F1, F2, F3, …; complete fronts are accepted into Pt+1 in order, the last front that does not fit completely is sorted by crowding distance, and the remaining solutions are rejected.]

THANKS