PSO: a mini tutorial
Maurice Clerc (Maurice.Clerc@WriteMe.com), 2004-12-15
PSO originators: Russell Eberhart (eberhart@engr.iupui.edu) and James Kennedy (Kennedy_Jim@bls.gov)
Part 1: United we stand
Two kinds of particle neighbourhood: geographical (based on distances in the search space) and social (based on fixed information links).
Psychosocial compromise
[Figure: a particle at position x ("Here I am!") combines three tendencies to choose its next move: its current velocity v, the attraction towards its own best previous position p ("My best perf.", the i-proximity), and the attraction towards the best position g found by its neighbours ("The best perf. of my neighbours", the g-proximity). Inset panels 1-3 show the initial velocity, the fitness landscape and the search space.]
2004−12−15 Particle Swarm optimisation Maurice.Clerc@WriteMe.com
The circular neighbourhood
[Figure: eight particles placed on a virtual circle; particle 1's 3-neighbourhood consists of itself and its nearest neighbours on the circle.]
Random proximity
[Figure: the next position is drawn from the region defined by x, v, the i-proximity (towards p) and the g-proximity (towards g). This region is a hyperparallelepiped, so the distribution is biased.]
Animated illustration
[Figure: the swarm converging towards the global optimum; information spreads among informers along a "small world" graph: this way, or that way?]
The "Type 1''" form (global constriction coefficient):

v(t+1) = χ (v(t) + ϕ (q − x(t)))
x(t+1) = x(t) + v(t+1)

with

ϕ = rand(0, ϕ1) + rand(0, ϕ2) = ϕ'1 + ϕ'2
q = (ϕ'1 p + ϕ'2 g) / (ϕ'1 + ϕ'2)

Non-divergence criterion:

χ = 2κ / (ϕ − 2 + √(ϕ² − 4ϕ))   for ϕ > 4
χ = κ                           otherwise

Usual values: κ = 1, ϕ = 4.1 => χ ≈ 0.73, swarm size = 20, neighbourhood size = 3.
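The constriction coefficient and the update rule above can be sketched in Python. This is a minimal illustration of the formulas, not code from the tutorial; the function names and the 1-D setting are mine. Note that ϕ'1 (p − x) + ϕ'2 (g − x) equals ϕ (q − x) with q the weighted attractor, so the attractor need not be formed explicitly:

```python
import math
import random

def chi(phi, kappa=1.0):
    """Non-divergence criterion: 2*kappa / (phi - 2 + sqrt(phi^2 - 4*phi)) if phi > 4, else kappa."""
    if phi > 4:
        return 2.0 * kappa / (phi - 2.0 + math.sqrt(phi * phi - 4.0 * phi))
    return kappa

def step(x, v, p, g, phi=4.1, kappa=1.0):
    """One constricted PSO step for a single 1-D particle."""
    c = chi(phi, kappa)
    phi1 = random.uniform(0, phi / 2)   # phi'_1, drawn in (0, phi/2)
    phi2 = random.uniform(0, phi / 2)   # phi'_2
    v_new = c * (v + phi1 * (p - x) + phi2 * (g - x))
    return x + v_new, v_new
```

With the usual values κ = 1 and ϕ = 4.1, `chi` returns approximately 0.7298, the 0.73 of the slide.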
5D complex space
[Figure: a 3D section of the 5D complex space, plotted over Re(v), Re(y) and ϕ; the convergence and non-divergence regions are indicated.]
[Figure: a step-by-step discrete search example, ending with "Bingo!".]
Algebraic operators

⊗ : (coefficient, velocity) → velocity
∘ : (velocity, velocity) → velocity
Θ : (position, position) → velocity
⊕ : (position, velocity) → position

The tentative new position is then brought back into the search space by a confinement operator.
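For a combinatorial search space these operators can be made concrete. A common illustrative choice (mine, not from the slides) for permutation problems: a velocity is a list of transpositions, "position Θ position" finds the swaps turning one permutation into another, "coefficient ⊗ velocity" keeps only a fraction of the swaps, and "position ⊕ velocity" applies them:

```python
def sub(a, b):
    """Theta operator: velocity (list of swaps) that transforms permutation b into permutation a."""
    c = list(b)
    swaps = []
    for i, target in enumerate(a):
        if c[i] != target:
            j = c.index(target, i + 1)   # target cannot sit before i: those slots already match a
            c[i], c[j] = c[j], c[i]
            swaps.append((i, j))
    return swaps

def add(pos, vel):
    """(+) operator: apply the swaps to a copy of the position."""
    c = list(pos)
    for i, j in vel:
        c[i], c[j] = c[j], c[i]
    return c

def mult(coef, vel):
    """(x) operator: keep only a fraction of the swaps."""
    return vel[: round(coef * len(vel))]
```

By construction `add(b, sub(a, b))` returns `a`, which is exactly the consistency the algebraic view requires.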
Granularity => variants
Defining these operators for a given granularity opens the way to combinatorial PSO, hybrid methods, adaptive versions (TRIBES, OEP 0, 1, …) and multi-objective PSO.
Unbiased random proximity
[Figure: drawing the next position from hyperspheres centred on the i-proximity and g-proximity, instead of a hyperparallelepiped, removes the bias. Inset: the volume ratio hypersphere/hypercube drops quickly towards zero as the dimension grows from 1 to 29.]
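The shrinking sphere-to-cube volume ratio behind this argument is easy to verify with the standard formula for the volume of a D-ball (the function name is mine):

```python
import math

def sphere_cube_ratio(d):
    """Volume of the unit-radius D-ball divided by the volume of its bounding hypercube (side 2)."""
    ball = math.pi ** (d / 2) / math.gamma(d / 2 + 1)
    return ball / 2 ** d
```

The ratio is 1 in one dimension, about 0.785 in two, and collapses towards zero as the dimension grows, so a hypercube-based proximity spends almost all of its volume outside the corresponding ball.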
The three balls
[Figure: the next move combines αv with points drawn at random in two balls, one around the i-proximity (weighted by rand(0…b)(p − x)) and one around the g-proximity.]
The better I am, the more I follow my own way; the better my best neighbour is, the more I tend to go towards it.
Adaptive information links
"I'm the best, but there has not been enough improvement, so I try to generate a new particle." Information links and swarm size are adapted during the run.
[Figure: potential energy, kinetic energy and swarm size plotted against the number of evaluations; the target accuracy ε = 0.00001 is reached after 2240 evaluations.]
[Figure: the same quantities for a second run; ε = 0.00001 is reached after 1414 evaluations.]
[Figure: particles 2 to 8 linked to the best informer g; the current position x follows the g-proximity.]
End of Part 2
Example: find x ∈ [1, N]^D such that

∑_{i=1}^{D/2} x_i = ∑_{i=D/2+1}^{D} x_i

N = 100, D = 20. After 10^5 evaluations:
63+90+16+54+71+20+23+60+38+15 = 12+48+13+51+36+42+86+26+57+79 (= 450)
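A natural fitness for this problem is the absolute difference between the two half-sums, zero at a solution. The function is my sketch; the solution tuple is the one reported above:

```python
def half_sum_gap(x):
    """|sum of the first half - sum of the second half|; zero exactly at a solution."""
    h = len(x) // 2
    return abs(sum(x[:h]) - sum(x[h:]))

solution = [63, 90, 16, 54, 71, 20, 23, 60, 38, 15,
            12, 48, 13, 51, 36, 42, 86, 26, 57, 79]
```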
Knapsack
Find D distinct integers in {1, …, N} whose sum is S (granularity = 1):

x_i ∈ {1, …, N}
i ≠ j => x_i ≠ x_j
∑_{i ∈ I} x_i = S, with |I| = D, I ⊂ {1, …, N}

N = 100, D = 10, S = 100. 870 evaluations:
run 1 => (9, 14, 18, 1, 16, 5, 6, 2, 12, 17)
run 2 => (29, 3, 16, 4, 1, 2, 6, 8, 26, 5)
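One way to express these constraints as a single fitness to minimise (my penalty formulation, not necessarily the one used for the reported runs) is the distance to the target sum plus a penalty per duplicated value:

```python
def knapsack_fitness(x, S=100):
    """Zero iff the values are all distinct and sum to S (penalty form of the constraints)."""
    duplicates = len(x) - len(set(x))
    return abs(S - sum(x)) + duplicates

run1 = (9, 14, 18, 1, 16, 5, 6, 2, 12, 17)
run2 = (29, 3, 16, 4, 1, 2, 6, 8, 26, 5)
```

Both reported solutions evaluate to zero: ten distinct integers summing to 100.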
Apple trees
Distribute 20 apples over three trees (n1, n2, n3) as evenly as possible; minimise
f = (n1 − n2)² + (n2 − n3)²
Swarm size = 3.

Evaluation | n1 | n2 | n3
0 | 3 | 0 | 17
1 | 6 | 4 | 10
2 | 3 | 11 | 6
3 | 7 | 7 | 6  (best position)
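Evaluating the slide's formula on the four rows of the table confirms that the last one, (7, 7, 6), is the best position found:

```python
def f(n1, n2, n3):
    """Imbalance between the three trees, as defined on the slide."""
    return (n1 - n2) ** 2 + (n2 - n3) ** 2

history = [(3, 0, 17), (6, 4, 10), (3, 11, 6), (7, 7, 6)]
best = min(history, key=lambda t: f(*t))   # (7, 7, 6), with f = 1
```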
Graph Colouring Problem
[Figure: worked example of the operation position plus velocity = position, where positions are colourings of the graph's vertices and velocities are lists of colour changes.]
[Figure: a heterogeneous example over t1 … t7.]
He Z., Wei C., Yang L., Gao X., Yao S., Eberhart R. C., Shi Y., "Extracting Rules from Fuzzy Neural Network by PSO", IEEE IEC, Anchorage, Alaska, USA, 1998.
Secrest B. R., "Traveling Salesman Problem for Surveillance Mission using PSO", AFIT/GCE/ENG/01M-03, Air Force Institute of Technology, 2001.
Yoshida H., Kawata K., Fukuyama Y., "A PSO for Reactive Power and Voltage Control considering Voltage Security Assessment", IEEE TPS, vol. 15, 2001, p. 1232-1239.
Krohling R. A., Knidel H., Shi Y., "Solving numerical equations of hydraulic problems using PSO", Proceedings of the IEEE CEC, Honolulu, Hawaii, USA, 2002.
Omran M., Salman A., Engelbrecht A. P., "Image classification using PSO", Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning (SEAL 2002), Singapore, p. 370-374, 2002.
Coello Coello C. A., Luna E. H., Aguirre A. H., "Use of PSO to design combinational logic circuits", LNCS No. 2606, p. 398-409, 2003.
Onwubolu G. C., Clerc M., "Optimal path for automated drilling operations by a new heuristic approach using particle swarm optimization", International Journal of Production Research, vol. 4, p. 473-491, 2004.
[Figure: a fuzzy neural network with inputs e_{i,N}, transfer functions τ_{k,m}(entry, x_{k,m}) = 1 / (1 + e^(−x_{k,m} entry)) and outputs s'_{i,P}(t).]

Function to minimise:
f(X) = |S − S'|, with X(t) = (x_{k,m}(t)) the weights to find, S = (S_1 … S_nb_tests) the target outputs and S'(t) = (S'_1(t) … S'_nb_tests(t)) the outputs computed on the test set E = (E_1 … E_nb_tests).
To know more
THE site: Particle Swarm Central, http://www.particleswarm.info
Work in progress: better combinatorial approaches, metamodels.
Beat the swarm!
[Interactive demo: try to beat the swarm by hand; the display shows your current position, your best performance, and the best performance of the swarm.]
APPENDIX
[Figure: a curve plotted against ϕ for ϕ ∈ [1, 6], with values roughly between 0.5 and 2.5.]
Robustness
[Performance map: NeededIterations(κ, ϕ).]
Test function (Rosenbrock): f(x1, x2) = 100 (x2 − x1²)² + (1 − x1)², search space [0, 1]².
1 run: 1 solution, 143 evaluations. 10 runs: 3 solutions, 1430 evaluations.
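The test function above is the classic Rosenbrock function; its global minimum, f = 0 at (1, 1), lies on the corner of the [0, 1]² search space. A direct transcription:

```python
def rosenbrock(x1, x2):
    """f(x1, x2) = 100 (x2 - x1^2)^2 + (1 - x1)^2; global minimum 0 at (1, 1)."""
    return 100.0 * (x2 - x1 ** 2) ** 2 + (1.0 - x1) ** 2
```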
Model fitting (ARMA + AIC)
Autoregressive Moving Average + Akaike's Information Criterion.

∑_{i=0}^{n} φ_i y_{t−i} = ∑_{j=0}^{m} θ_j a_{t−j}

σ² = (1/N) ∑_{t=1}^{N} (y_t^data − y_t^ARMA)²

f = n log(σ²) + 2 (n + m)

The search space is heterogeneous: the orders n and m are integers, the coefficients φ_i and θ_j are real.
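The criterion to minimise is a one-liner once the residual variance σ² is known. This sketch only computes f from given values (estimating σ² from data is not shown, and the function name is mine):

```python
import math

def arma_aic(n, m, sigma2):
    """f = n log(sigma^2) + 2 (n + m), as on the slide; sigma2 is the residual variance."""
    return n * math.log(sigma2) + 2 * (n + m)
```

The 2(n + m) term penalises model complexity, so a particle cannot drive f down simply by inflating the ARMA orders.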
A binary PSO C code
Information links are modified at random if there has been no
improvement
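The slide refers to a C listing not reproduced here. As a stand-in, here is a minimal binary PSO sketch in Python (problem, parameters and names are mine): velocities stay real-valued, and each bit is set to 1 with probability sigmoid(velocity).

```python
import math
import random

def binary_pso(fitness, dim, swarm=20, iters=50, seed=1):
    """Minimal binary PSO: maximise `fitness` over bit strings of length `dim`."""
    rng = random.Random(seed)
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    X = [[rng.randint(0, 1) for _ in range(dim)] for _ in range(swarm)]  # positions
    V = [[0.0] * dim for _ in range(swarm)]                             # real velocities
    P = [x[:] for x in X]                                               # personal bests
    g = max(P, key=fitness)[:]                                          # global best
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                V[i][d] = 0.73 * (V[i][d]
                                  + rng.uniform(0, 2.05) * (P[i][d] - X[i][d])
                                  + rng.uniform(0, 2.05) * (g[d] - X[i][d]))
                # a bit is 1 with probability sigmoid(velocity)
                X[i][d] = 1 if rng.random() < sig(V[i][d]) else 0
            if fitness(X[i]) > fitness(P[i]):
                P[i] = X[i][:]
                if fitness(P[i]) > fitness(g):
                    g = P[i][:]
    return g

best = binary_pso(sum, dim=8)   # toy objective: maximise the number of ones
```

A full implementation, such as the one the slide mentions, would also rewire the information links at random when the swarm's best has not improved.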