
WATER RESOURCES SYSTEMS ENGINEERING
WEEK 9

Doosun Kang, Kyung Hee University


Nonlinear Optimization Approach
“Local Search”

Strategies for Local Search

1. Select an initial point
2. Choose a direction to move
3. Choose a distance to move
4. Evaluate the function value at the new point
5. Accept or reject the step
6. Evaluate termination criteria
7. Adapt the search procedure based on information gained
8. Stop OR return to step 2
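The steps above can be sketched as a simple hill-climbing local search. This is an illustrative sketch, not the lecture's implementation; the step size, shrink factor, test function, and starting point are assumptions.

```python
import random

def local_search(f, x0, step=0.5, max_iter=1000, tol=1e-6):
    """Minimize f by repeatedly trying a nearby point (steps 1-8 above)."""
    x, fx = x0, f(x0)                        # 1. initial point, 4. evaluate
    for _ in range(max_iter):                # 8. return to step 2
        direction = random.choice([-1, 1])   # 2. choose direction to move
        candidate = x + direction * step     # 3. choose distance to move
        fc = f(candidate)                    # 4. evaluate function at new point
        if fc < fx:                          # 5. accept or reject step
            x, fx = candidate, fc
        else:
            step *= 0.95                     # 7. adapt search: shrink step on rejection
        if step < tol:                       # 6. termination criterion
            break
    return x, fx

# Usage: minimize (x - 3)^2 starting from x = 0
random.seed(0)
x_best, f_best = local_search(lambda x: (x - 3)**2, x0=0.0)
```

Note how the step-size shrinkage plays the role of step 7: every rejection suggests the current step is too coarse, so the search narrows around the incumbent point.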
Pros and Cons of the Local Search Method

Pros
• Simple and intuitive
• No sophisticated technique required

Cons
• No guarantee of finding the global optimum
• Inefficient (or infeasible) when the number of decision variables (DVs) is large

Need for a global search algorithm!


Nonlinear Optimization Approach
“Global Search Algorithm”

Global Search Algorithms

• GA, genetic algorithm
• PSO, particle swarm optimization
• ACO, ant colony optimization
• SA, simulated annealing
• HS, harmony search

These techniques have been successfully applied to a number of complex problems.

☞ More effective at finding optimal solutions than traditional mathematical approaches
History of Global Search Algorithms

Types of Algorithms – Initial Stage


History of Global Search Algorithms

Types of Algorithms – Recent


Genetic Algorithm (GA)

History
⚫ Pioneered by John Holland in the 1970s
⚫ Became popular in the late 1980s
⚫ Based on ideas from Darwinian evolution
⚫ Can be used to solve a variety of problems that are not easy to solve using other techniques
⚫ Evolution is based on “survival of the fittest”
Background of GA

Evolution
– Organisms (animals or plants) produce a number of offspring which are almost, but not entirely, like themselves
– Some of these offspring may survive to produce offspring of their own
  • The “better adapted” offspring are more likely to survive
  • Over time, later generations become better and better adapted
– Genetic Algorithms use this same process to “evolve” better solutions
– Good solutions survive, while bad ones die
Ideas for GA

A dumb solution
A “blind generate and test” algorithm:
➢ Repeat
  ✓ Generate a random possible solution
  ✓ Test the solution and see how good it is
➢ Until solution is good enough

Can we use this dumb idea?
Sometimes - yes:
  ✓ if there are only a few possible solutions
  ✓ and you have enough time
  ✓ then such a method could be used
For most problems - no:
  ✓ many possible solutions
  ✓ with no time to try them all
  ✓ so this method cannot be used
Ideas for GA

A “less-dumb” idea (GA)

➢ Generate a set of random solutions
➢ Repeat
  ✓ Test each solution in the set (rank them)
  ✓ Remove some bad solutions from the set
  ✓ Duplicate some good solutions
  ✓ Make small changes to some of them
➢ Until best solution is good enough


Terminology of GA

✓ Decision variable
✓ Objective function (=fitness function)
✓ Population (=individuals)
✓ Generation
✓ Parents/Offspring
✓ Crossover
✓ Mutation
✓ Elitism
Pseudo-code of GA

BEGIN
  INITIALIZE population with random candidate solutions;
  EVALUATE each candidate;
  REPEAT UNTIL ( TERMINATION CONDITION is satisfied ) DO
    1. SELECT parents;
    2. RECOMBINE pairs of parents;
    3. MUTATE the resulting offspring;
    4. EVALUATE new candidates;
    5. SELECT individuals for the next generation;
END
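The pseudo-code above can be turned into a runnable sketch. This is an illustrative toy GA, not the lecture's MATLAB implementation; the tournament selection, parameter values, and quadratic test function are assumptions.

```python
import random

def run_ga(fitness, n_vars, pop_size=30, generations=200,
           crossover_rate=0.8, mutation_rate=0.2, bounds=(-5.0, 5.0)):
    """Minimal real-coded GA that MAXIMIZES `fitness` (a sketch, not a tuned solver)."""
    lo, hi = bounds
    # INITIALIZE population with random candidate solutions
    pop = [[random.uniform(lo, hi) for _ in range(n_vars)] for _ in range(pop_size)]
    best = max(pop, key=fitness)             # EVALUATE each candidate
    for _ in range(generations):             # termination: fixed generation budget
        def pick():                          # SELECT parents (binary tournament)
            a, b = random.sample(pop, 2)
            return a if fitness(a) > fitness(b) else b
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = pick(), pick()
            # RECOMBINE pairs of parents (single-point crossover)
            if n_vars > 1 and random.random() < crossover_rate:
                cut = random.randrange(1, n_vars)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # MUTATE the resulting offspring (small Gaussian perturbation, clamped)
            child = [min(hi, max(lo, g + random.gauss(0, 0.1)))
                     if random.random() < mutation_rate else g for g in child]
            offspring.append(child)
        pop = offspring
        # EVALUATE new candidates; SELECT survivors with elitism (best never dies)
        gen_best = max(pop, key=fitness)
        if fitness(gen_best) > fitness(best):
            best = gen_best[:]
        pop[0] = best[:]
    return best

# Usage: maximize -(x^2 + y^2); the optimum is at (0, 0)
random.seed(1)
best = run_ga(lambda x: -(x[0]**2 + x[1]**2), n_vars=2)
```

Each step of the pseudo-code (INITIALIZE, SELECT, RECOMBINE, MUTATE, EVALUATE) appears as a comment at the matching point in the loop.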
GA flowchart
[GA flowchart: Initialize population → Evaluate fitness → loop (generation = generation + 1): Modify individuals (selection, crossover, mutation, elitism) → Evaluate fitness → Select survivors → constraints satisfied? No: repeat; Yes: stop]
GA processes - Crossover

▪ Crossover combines two parents to form new children for the next generation
▪ Crossover combines genetic material from two parents in order to produce superior offspring
▪ Types of crossover: scattered, single-point, two-point, intermediate

Material from Hoshin Gupta’s class note
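Of the crossover types listed, single-point is the simplest to sketch. A minimal illustration on bit-string parents (the function name and all-zero/all-one parents are hypothetical examples):

```python
import random

def single_point_crossover(parent1, parent2):
    """Cut both parents at one random position and swap the tails."""
    assert len(parent1) == len(parent2)
    cut = random.randrange(1, len(parent1))   # cut strictly inside the string
    child1 = parent1[:cut] + parent2[cut:]
    child2 = parent2[:cut] + parent1[cut:]
    return child1, child2

# Usage: each child gets a head from one parent and a tail from the other
c1, c2 = single_point_crossover([0, 0, 0, 0, 0, 0], [1, 1, 1, 1, 1, 1])
```

Two-point crossover swaps the middle segment instead of the tail; scattered crossover chooses each gene's parent independently.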
GA processes - Mutation

▪ Mutation introduces randomness into the population
▪ Mutation makes small random changes to the individuals in the population, which provides genetic diversity and enables searching a broader space
▪ The idea of mutation is to reintroduce divergence into a converging population
▪ Mutation is performed on a small part of the population, in order to avoid entering an unstable state

Material from Hoshin Gupta’s class note
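For bit-string individuals, mutation is commonly a small independent flip probability per gene; a minimal sketch (the rate value is an illustrative assumption):

```python
import random

def mutate(individual, rate=0.05):
    """Flip each bit independently with small probability `rate`."""
    return [1 - bit if random.random() < rate else bit for bit in individual]

# rate=0.0 leaves the individual untouched; rate=1.0 flips every bit
```

Keeping `rate` small matches the slide's point: only a small part of the population is perturbed, enough for diversity without destabilizing the search.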


GA processes – coding of the parameter space

Material from Hoshin Gupta’s class note


GA processes - Selection

▪ The selection function chooses parents for the next generation based on the fitness function values
▪ Main idea: better individuals get a higher chance to survive
  • Chances proportional to fitness value
  • Implementation: roulette wheel technique
    ➢ Assign each individual a portion of the roulette wheel
    ➢ Spin the wheel ‘n’ times to select ‘n’ individuals
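The roulette wheel can be sketched directly from the two steps above: assign each individual a slice proportional to its (positive) fitness, then spin n times. The fitness values below are illustrative.

```python
import random

def roulette_select(population, fitnesses, n):
    """Select n individuals with probability proportional to positive fitness."""
    total = sum(fitnesses)
    picks = []
    for _ in range(n):                                   # spin the wheel n times
        spin = random.uniform(0, total)                  # point on the wheel
        cumulative = 0.0
        for individual, fit in zip(population, fitnesses):
            cumulative += fit                            # each slice = one individual
            if spin <= cumulative:
                picks.append(individual)
                break
    return picks

# Usage: 'a' owns 70% of the wheel, so it should dominate the picks
random.seed(0)
picks = roulette_select(['a', 'b', 'c'], [7.0, 2.0, 1.0], n=1000)
```

In practice Python's standard library offers the same behavior via `random.choices(population, weights=fitnesses, k=n)`.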
Back to (Ideas for) GA

A “less-dumb” idea (GA)

➢ Generate a set of random solutions ➔ Initial population
➢ Repeat
  ✓ Test each solution in the set (rank them)
  ✓ Remove some bad solutions from the set ➔ Elitism
  ✓ Duplicate some good solutions ➔ Crossover
  ✓ Make small changes to some of them ➔ Mutation
➢ Until best solution is good enough


[Figures: GA population snapshots after 20, 50, and 100 generations]


GA Exercise using MATLAB

Hosaki Function (minima at [x1, x2] = local [1, 2] and global [4, 2])

f = [ 1 − 8 x(1) + 7 x(1)^2 − (7/3) x(1)^3 + 0.25 x(1)^4 ] · x(2)^2 · exp{−x(2)}
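Before running the MATLAB exercise, the two minima can be verified by evaluating the function directly; a quick check in Python (the exercise itself uses MATLAB):

```python
import math

def hosaki(x1, x2):
    """Hosaki test function: local minimum near (1, 2), global minimum near (4, 2)."""
    poly = 1 - 8*x1 + 7*x1**2 - (7/3)*x1**3 + 0.25*x1**4
    return poly * x2**2 * math.exp(-x2)

# The global minimum at (4, 2) is deeper than the local one at (1, 2)
f_local, f_global = hosaki(1, 2), hosaki(4, 2)
```

A local search started near x1 = 1 gets trapped at the shallower basin, which is exactly why this function is used to motivate global search.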
GA Exercise using MATLAB

“optimtool”
Next week

▪ More details of GA
▪ GA exercise using MATLAB
