
Particle Swarm Optimization:

Method and Applications

Rania Hassan
Post-doctoral Associate
Engineering Systems Division

© Rania Hassan 3/2004


Particle Swarm Optimization
A pseudo-optimization method (heuristic) inspired by the
collective intelligence of swarms of biological populations.

Flocks of Birds, Colonies of Insects, Schools of Fish, Herds of Animals

Inventors
Introduced in 1995: Kennedy, J. and Eberhart, R., “Particle Swarm
Optimization,” Proceedings of the IEEE International Conference on Neural
Networks, Perth, Australia 1995, pp. 1942-1945.

James Kennedy
Social Psychologist
US Department of Labor

Russell Eberhart
Dean of Engineering Research
Indiana Univ. Purdue Univ. Indianapolis

Swarming in System Design

• Swarming theory has been used in system design.


Examples of aerospace systems utilizing swarming theory
include formation flying of aircraft and spacecraft.

Flocks of birds fly in V-shaped formations to reduce drag and save energy on long migrations.

Swarming in System Design

Weimerskirch, H. et al. "Energy saving in flight formation." Nature 413 (18 October 2001): 697-698.

A study of great white pelicans has found that birds flying in formation use up to a fifth less energy than those flying solo (Weimerskirch et al.).

Swarming in System Design
• A space system in formation flying combines data from several
spacecraft rather than flying all the instruments on one costly satellite.
• It also allows the collection of data unavailable from a single satellite, such as stereo views or data collected simultaneously over the same ground scene at different angles.

Image taken from NASA's website. http://www.nasa.gov.


Lecture Overview
• Introduction
• Motivation
• PSO Conceptual Development
• PSO Algorithm
– Basic Algorithm
– Constraint Handling
– Discretization
• Applications
– PSO Demo
– Benchmark test problems
• Unconstrained
• Constrained
– System Problem: Spacecraft Design
• Final Comments on PSO
• References

“Personal” Motivation
PSO
– is a zero-order, non-calculus-based method (no gradients are needed).
– can solve discontinuous, multimodal, non-convex problems.
– includes some probabilistic features in the motion of particles.
– is a population-based search method, i.e., it moves from a set of points
(particles' positions) to another set of points with likely improvement in
one iteration (move). [is that good or bad?]
Does it remind you of another heuristic?
The Genetic Algorithm (GA)
– The GA is inherently discrete (in terms of handling design variables)
– PSO is inherently continuous (in terms of handling design variables)

– Some researchers report that PSO requires fewer function evaluations
than the GA (most problems studied are continuous).

– In Compendex there are 18,150 hits for the GA from 1990 to 2004,
whereas there are only 105 hits for PSO; many versions of PSO are
likely to appear.
PSO Conceptual Development
• Social Behavior Simulation → Optimizer
• The social model was intended to answer the following
questions:
– How do large numbers of birds (or other populations exhibiting
swarming behavior) produce seamless, graceful flocking choreography
while frequently and suddenly changing direction, scattering, and
regrouping?
– Are there any advantages to the swarming behavior for an
individual in a swarm?
– Do humans exhibit social interaction similar to the swarming
behavior in other species?

• Keep these questions in mind while watching clips from the French
documentary "Winged Migration" by Jacques Perrin.
PSO Conceptual Development

• How do large numbers of birds produce seamless, graceful flocking
choreography while frequently and suddenly changing direction,
scattering, and regrouping?
– “Decentralized” local processes.
– Manipulation of inter-individual distances (keep pace and avoid collision).

• Are there any advantages to the swarming behavior for an individual in
a swarm?
– An individual can profit from the discoveries and previous experience of other
swarm members when searching for food, avoiding predators, and adjusting to
the environment, i.e., information sharing yields an evolutionary advantage.

• Do humans exhibit social interaction similar to the swarming behavior
in other species?
– Absolutely, humans learn to imitate physical motion early on; as they grow
older, they imitate their peers on a more abstract level by adjusting their
beliefs and attitudes to conform with societal standards.

PSO Conceptual Development

• Phase I:

– Randomly generated particle positions in 2-d space.

– Randomly generated velocity vectors for each particle in 2-d space.

– For each swarm movement (iteration), each particle (agent) matches the
velocity of its nearest neighbor to provide synchrony.

– Random changes in velocities ("craziness") are added in each iteration to
provide variation in motion and a "life-like" appearance (an artificial addition).

PSO Conceptual Development
• Phase II:
– Heppner's simulation used a roost (cornfield) that the birds flocked
around before landing there, eliminating the need for "craziness".
– The mathematical formulation for the roosting model included:
• each particle evaluates its current fitness by comparing its position to
the roost position (100, 100); for the ith particle at the kth time unit:
$f_k^i = (x_k^i - 100)^2 + (y_k^i - 100)^2$
• each particle remembers its best-ever fitness value, $f^i$, and the
position associated with it, $p^i = [x^i \; y^i]$, and adjusts its velocity
accordingly:
if $x_k^i > x^i$ then $V_{x,k+1}^i = V_{x,k}^i - c_1 \cdot rand$
if $x_k^i < x^i$ then $V_{x,k+1}^i = V_{x,k}^i + c_1 \cdot rand$
where rand is a uniformly distributed random number; the velocity in the
y-direction is updated in the same manner.


PSO Conceptual Development
• Phase II:
• each particle also knows the position of the best particle in the current
swarm (the global best), $p_k^g = [x_k^g \; y_k^g]$. This is the particle with
the lowest fitness value, $f_k^g$ (closest to the roost):
if $x_k^i > x_k^g$ then $V_{x,k+1}^i = V_{x,k}^i - c_2 \cdot rand$
if $x_k^i < x_k^g$ then $V_{x,k+1}^i = V_{x,k}^i + c_2 \cdot rand$

• in the simulation, when $c_1$ and $c_2$ are set high, the flock is sucked
violently into the cornfield, whereas if they are set low, the flock swirls
around the cornfield and approaches it slowly (a code sketch of these
update rules follows at the end of this slide).
– The simulation with the known cornfield position appeared realistic and
raised the question: how do birds find "optimal" food sources in reality
without knowing the "cornfield" location?
• Example 1: studies have shown that parks in affluent neighborhoods
attract more birds than parks in poor neighborhoods.
• Example 2: put a bird feeder on your balcony and see what happens
in a few hours.
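The roosting-model velocity rules above can be sketched in a few lines of code. This is a minimal illustration, not the original simulation: it assumes a 2-d swarm, the roost at (100, 100), and hypothetical nudge constants C1 and C2.

```python
import random

ROOST = (100.0, 100.0)   # known cornfield/roost position from the slide
C1, C2 = 2.0, 2.0        # assumed nudge constants (personal-best and swarm-best)

def fitness(x, y):
    # squared distance to the roost, as defined on the slide
    return (x - ROOST[0])**2 + (y - ROOST[1])**2

def nudge_x_velocity(vx, x, x_best, x_swarm_best):
    # pull the x-velocity toward the particle's own best x and the swarm's best x
    vx += -C1 * random.random() if x > x_best else C1 * random.random()
    vx += -C2 * random.random() if x > x_swarm_best else C2 * random.random()
    return vx  # the y-velocity is updated in the same manner
```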
PSO Algorithm

Basic PSO Algorithm

• Phase III
– The swarming behavior of the birds could be the reason for finding optimal
food resources.
– The model developed in Phase II could be used (with minor modifications)
to find optimal solutions for N-dimensional, non-convex, multi-modal,
nonlinear functions.
– In this basic version of PSO, craziness and velocity matching are
eliminated.

Algorithm Description

• Particle Description: each particle has three features
– Position $x_k^i$ (the position of the ith particle at time k; note the vector notation)
– Velocity $v_k^i$ (similar to a search direction; used to update the position)
– Fitness or objective $f(x_k^i)$ (determines which particle has the best value in
the swarm and also determines the best position of each particle over time)
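As a concrete illustration of these three features, a particle can be stored as a small record. This sketch uses NumPy arrays and field names of my own choosing, not notation from the slides; the best-ever fields anticipate the particle memory introduced on the next slide.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Particle:
    x: np.ndarray       # position x_k^i (vector of design variables)
    v: np.ndarray       # velocity v_k^i (search direction per unit time)
    f: float            # current fitness f(x_k^i)
    p_best: np.ndarray  # best-ever position of this particle
    f_best: float       # best-ever fitness of this particle
```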

Basic PSO Algorithm

• Initial Swarm
– There are no well-established guidelines for swarm size; normally 15 to 30
particles are used.
– Particles are randomly distributed across the design space:
$x^i = x_{min} + rand \, (x_{max} - x_{min})$
where $x_{min}$ and $x_{max}$ are vectors of lower and upper limit values,
respectively.
– Evaluate the fitness of each particle and store:
• the particle's best-ever position (the particle memory $p^i$, here the same as $x^i$)
• the best position in the current swarm (the swarm influence $p^g$)
– The initial velocity is randomly generated:
$v^i = \dfrac{x_{min} + rand \, (x_{max} - x_{min})}{\Delta t} = \dfrac{position}{time}$
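A minimal sketch of this initialization step, assuming Δt = 1, a user-supplied objective f to minimize, and plain NumPy arrays (one row per particle) rather than the Particle record above; the default swarm size of 30 follows the guideline on this slide.

```python
import numpy as np

def initialize_swarm(f, x_min, x_max, n_particles=30, dt=1.0, rng=np.random.default_rng()):
    """Scatter particles and velocities randomly across the design space."""
    dim = len(x_min)
    x = x_min + rng.random((n_particles, dim)) * (x_max - x_min)          # positions
    v = (x_min + rng.random((n_particles, dim)) * (x_max - x_min)) / dt   # velocities
    fit = np.array([f(xi) for xi in x])
    p_best, f_best = x.copy(), fit.copy()      # each particle's best-ever position/fitness
    g_best = x[np.argmin(fit)].copy()          # best position in the current swarm
    return x, v, p_best, f_best, g_best
```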
Basic PSO Algorithm

• Velocity Update
– provides search directions
– Includes deterministic and probabilistic parameters.
– Combines the effects of current motion, the particle's own memory,
and swarm influence.
New velocity:
$v_{k+1}^i = w \, v_k^i + c_1 \, rand \, \dfrac{(p^i - x_k^i)}{\Delta t} + c_2 \, rand \, \dfrac{(p_k^g - x_k^i)}{\Delta t}$
The three terms are, respectively:
– current motion (inertia factor w, typically 0.4 to 1.4)
– particle memory influence (self confidence $c_1$, typically 1.5 to 2)
– swarm influence (swarm confidence $c_2$, typically 2 to 2.5)
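The update above translates into one vectorized expression. A minimal sketch, assuming Δt = 1 and one random factor per particle per term (some PSO variants draw one per coordinate); the default parameter values sit inside the typical ranges quoted on this slide.

```python
import numpy as np

def update_velocity(v, x, p_best, g_best, w=0.7, c1=1.5, c2=2.0, dt=1.0,
                    rng=np.random.default_rng()):
    """New velocity = current motion + pull toward own memory + pull toward swarm best."""
    n = len(x)
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))   # uniformly distributed factors
    return w * v + c1 * r1 * (p_best - x) / dt + c2 * r2 * (g_best - x) / dt
```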

Basic PSO Algorithm
• Position Update
– The position is updated by the velocity vector:
$x_{k+1}^i = x_k^i + v_{k+1}^i \, \Delta t$
[Figure: a particle moves from $x_k^i$ to $x_{k+1}^i$ along $v_{k+1}^i$, influenced by its own best position $p^i$ and the swarm best $p_k^g$.]

• Stopping Criteria
– The maximum change in the best fitness is smaller than a specified
tolerance for a specified number of moves (S):
$\left| f(p_k^g) - f(p_{k-q}^g) \right| \le \varepsilon, \quad q = 1, 2, \ldots, S$

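Putting the pieces together, a complete basic-PSO loop might look like the sketch below. It reuses the initialize_swarm and update_velocity sketches from the previous slides, clips positions to the side limits (one of the options discussed on the next slide), and applies the stopping rule above; the function and parameter names are illustrative, not from the original lecture.

```python
import numpy as np

def pso(f, x_min, x_max, n_particles=30, w=0.7, c1=1.5, c2=2.0,
        eps=1e-3, stall_moves=10, max_moves=500, rng=np.random.default_rng()):
    """Minimize f over the box [x_min, x_max] with a basic particle swarm."""
    x, v, p_best, f_best, g_best = initialize_swarm(f, x_min, x_max, n_particles, rng=rng)
    best_history = [f(g_best)]
    for _ in range(max_moves):
        v = update_velocity(v, x, p_best, g_best, w, c1, c2, rng=rng)
        x = np.clip(x + v, x_min, x_max)              # position update (dt = 1) + side limits
        fit = np.array([f(xi) for xi in x])
        improved = fit < f_best                        # refresh particle memories
        p_best[improved], f_best[improved] = x[improved], fit[improved]
        if fit.min() < best_history[-1]:
            g_best = x[np.argmin(fit)].copy()          # refresh swarm best
        best_history.append(min(best_history[-1], fit.min()))
        # stop when the best fitness changed by less than eps over the last S moves
        if len(best_history) > stall_moves and \
           best_history[-1 - stall_moves] - best_history[-1] <= eps:
            break
    return g_best, best_history[-1]

# usage with a hypothetical objective:
# pso(lambda x: np.sum(x**2), np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
```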
Constraint Handling
• Side Constraints
– Velocity vectors can drive particles to “explosion”.
– Upper and lower variable limits can be treated as regular constraints.
– Particles with violated side constraints could be reset to the nearest
limit.

• Functional Constraints
– Exterior penalty methods (linear, step linear, or quadratic):
$f(x) = \phi(x) + \sum_{i=1}^{N_{con}} r_i \, (\max[0, g_i])$
where $f(x)$ is the penalized fitness function, $\phi(x)$ is the objective
function, the summation is the penalty function, and the $r_i$ are penalty
multipliers.

– If a particle is infeasible, the last search direction (velocity) was not
feasible; set the current velocity term to zero:
$v_{k+1}^i = c_1 \, rand \, \dfrac{(p^i - x_k^i)}{\Delta t} + c_2 \, rand \, \dfrac{(p_k^g - x_k^i)}{\Delta t}$
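A sketch of the two ideas above: an exterior-penalty fitness wrapper and a velocity reset that drops the current-motion term for infeasible particles. The single penalty multiplier r, the linear penalty form, and the convention that feasibility means g_i(x) ≤ 0 are assumptions for illustration, not values from the slide.

```python
import numpy as np

def penalized_fitness(phi, constraints, x, r=1.0e3):
    """Objective plus an exterior penalty on violated constraints g_i(x) <= 0."""
    violation = sum(max(0.0, g(x)) for g in constraints)   # linear penalty; other forms possible
    return phi(x) + r * violation

def reset_velocity(x, p_best, g_best, c1=1.5, c2=2.0, dt=1.0, rng=np.random.default_rng()):
    """For infeasible particles, drop the inertia term so the last (bad) direction is not reused."""
    n = len(x)
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    return c1 * r1 * (p_best - x) / dt + c2 * r2 * (g_best - x) / dt
```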
Discretization

• System problems typically include continuous, integer, and discrete
design variables.

• Basic PSO works with continuous variables.

• There are several methods that allow PSO to handle discrete variables.

• The literature reports that the simple method of rounding particle
position coordinates to the nearest integers provides the best
computational performance.
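A minimal sketch of that rounding step, applied to a particle's position right after the position update; the list of discrete coordinate indices is supplied by the user (an assumption for illustration).

```python
import numpy as np

def round_discrete(x, discrete_idx):
    """Round only the discrete/integer coordinates of a position to the nearest integers."""
    x = np.asarray(x, dtype=float).copy()
    x[discrete_idx] = np.round(x[discrete_idx])
    return x

# example: coordinates 0 and 2 are integer design variables
# round_discrete([3.7, 0.52, 16.8], [0, 2])  ->  array([ 4.  ,  0.52, 17.  ])
```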

Applications

PSO Demo
• Demo provided by Gerhard Venter of Vanderplaats R&D.
• Smooth function with three local minima, a known global minimum at
[0, 0], and an optimal function value of 0.


Courtesy of Gerhard Venter. Used with permission.


PSO Demo
• Demo provided by Gerhard Venter of Vanderplaats R&D.
• Noisy function with many local minima, a known global minimum at
[0, 0], and an optimal function value of 1000.


Courtesy of Gerhard Venter. Used with permission.


Unconstrained Benchmark Problems
Eggcrate Function

Minimize $f(x) = x_1^2 + x_2^2 + 25 \, (\sin^2 x_1 + \sin^2 x_2)$ (see the code sketch below)
• Known global minimum of [0,0] and an optimal function value of 0.

• PSO parameters:
– Swarm Size = 30
– Inertia, w = 0.5 (static)
– Self Confidence, $c_1$ = 1.5
– Swarm Confidence, $c_2$ = 1.5
– Stopping Tolerance, ε = 0.001
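For reference, the Eggcrate objective in code, written to plug into the pso sketch from earlier slides. The coefficient 25 is the commonly used value and is an assumption here, since the original equation was reconstructed from a garbled slide.

```python
import numpy as np

def eggcrate(x):
    """Eggcrate test function; global minimum f = 0 at x = [0, 0]."""
    return x[0]**2 + x[1]**2 + 25.0 * (np.sin(x[0])**2 + np.sin(x[1])**2)

# e.g. x_best, f_best = pso(eggcrate, np.array([-2*np.pi, -2*np.pi]), np.array([2*np.pi, 2*np.pi]))
```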

Unconstrained Benchmark Problems
Eggcrate Function
[Figure: scatter plots of the swarm in the (x1, x2) plane at selected moves, with the corresponding swarm fitness statistics:]

Move 0:  Min = 15.1, Mean = 49.9, Max = 92.6
Move 5:  Min = 0.25, Mean = 26.5, Max = 86.6
Move 10: Min = 0.02, Mean = 13.6, Max = 66.5
Move 28: Min = 0.3E-5, Mean = 4.69, Max = 41.9

Unconstrained Benchmark Problems
Banana (Rosenbrock) Function

Minimize $f(x) = 100 \, (x_2 - x_1^2)^2 + (1 - x_1)^2$ (see the code sketch below)

• Known global minimum of [1,1] and an optimal function value of 0.

• PSO parameters:
– Swarm Size = 30
– Inertia, w = 0.5 (static)
– Self Confidence, $c_1$ = 1.5
– Swarm Confidence, $c_2$ = 1.5
– Stopping Tolerance, ε = 0.001
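Similarly, the Rosenbrock (banana) objective as reconstructed above; this is the standard two-variable form, consistent with the known minimum of 0 at [1, 1].

```python
def rosenbrock(x):
    """Rosenbrock (banana) test function; global minimum f = 0 at x = [1, 1]."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
```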

Unconstrained Benchmark Problems
Banana (Rosenbrock) Function
[Figure: scatter plots of the swarm in the (x1, x2) plane at selected moves, with the corresponding swarm fitness statistics:]

Move 0:  Min = 1.03, Mean = 1.4E4, Max = 5.3E4
Move 5:  Min = 0.26, Mean = 5.8E3, Max = 6.9E4
Move 25: Min = 0.36, Mean = 2.6E4, Max = 7.2E5
Move 53: Min = 0.6E-3, Mean = 1.3E3, Max = 2.7E4

Constrained Benchmark Problems
Golinski Speed Reducer
• Full problem description can be found at
http://mdob.larc.nasa.gov/mdo.test/class2prob4.html
• This problem represents the design of a simple gear box such
as might be used in a light airplane between the engine and
propeller to allow each to rotate at its most efficient speed.
• The objective is to minimize the speed reducer weight while
satisfying a number of constraints (11) imposed by gear and
shaft design practices.
• Seven design variables are available to the optimizer, and
each has an upper and lower limit imposed.
• PSO parameters:
– Swarm Size = 60
– Inertia, w = 0.5 (static)
– Self Confidence, $c_1$ = 1.5
– Swarm Confidence, $c_2$ = 1.5
– Stopping Tolerance, ε = 5 kg
Image taken from NASA's website. http://www.nasa.gov.
Constrained Benchmark Problems
Golinski Speed Reducer
• Known solution
x = [3.50 0.7 17 7.3 7.30 3.35 5.29]
f (x ) = 2985 kg
• PSO solution
x = [3.53 0.7 17 8.1 7.74 3.35 5.29]
f (x ) = 3019 kg

[Figure: PSO statistics, i.e., the minimum, mean, and maximum swarm fitness (×10^7) plotted against the move number (0 to 120).]

Constrained Benchmark Problems
Golinski Speed Reducer
[Figure: fitness value of each of the 60 particles at selected moves:]

Move 0:   # of feasible particles (out of 60) = 0
Move 5:   # of feasible particles (out of 60) = 24
Move 65:  # of feasible particles (out of 60) = 17
Move 112: # of feasible particles (out of 60) = 16

System Problem
Communication Satellite Design

[Diagram relating value to acquisition cost, reliability, payload/launch mass, and mission requirements.]

System Problem
Communication Satellite Design
Performance Estimation Tools
– Challenge: multidisciplinary framework (many design variables;
computationally expensive disciplinary design tools)
– Solution: focus on the conceptual design stage (fewer variables,
simpler models)

Optimization Algorithm
– Challenge: combinatorial problem (discrete, integer, and continuous
design variables)
– Solution: use non-calculus-based search methods like the Genetic
Algorithm (GA)
System Problem
Communication Satellite Design
Design Variables
There are 27 design variables that can be combined into 2.2 trillion different designs.

Design parameter: Description and discrete values
1 to 8: HPA type (TWTA or SSPA)
9: Solar array cell type (GaAs single junction, GaAs multi-junction, Si thin, Si normal, or hybrid Si with GaAs multi-junction)
10: Battery cell type (NiCd or NiH2)
11: N/S thermal coupling (no coupling or coupling)
12: N/S STK thruster technology (xenon plasma, arcjets, bi-propellant, and hydrazine monopropellant)
13: E/W STK thruster technology (bi-propellant or hydrazine)
14: Launch vehicle choice from 8 options

System Problem
Communication Satellite Design
Design Variables continued …
Design parameter: Description and discrete values
15: Redundancy level for the Ku-band transponder that has 12 operating HPAs (0, 1, 2, or 4 redundant HPAs)
16 to 22: Redundancy levels for C-band transponders, each with 12 operating HPAs (0, 1, 2, or 4 redundant HPAs)
23: Redundancy level for the last C-band transponder that has 2 operating HPAs (0 or 1 redundant HPA)
24: Propulsion subsystem redundancy level (0 or full redundancy)
25: ADCS subsystem redundancy level (0 or full redundancy)
26: TCR subsystem redundancy level (0 or full redundancy)
27: Solar array area redundancy level (0, 1.02, 1.04, or 1.06 × area)

[Figure: stowed satellite configuration in the launch vehicle fairing, showing the fairing diameter, the stowed solar array, and the stowed sidewall antenna.]
System Problem
Communication Satellite Design

• PSO parameters:
– Swarm Size = 60
– Inertia, w = 0.5 (static)
– Self Confidence, $c_1$ = 1.5
– Swarm Confidence, $c_2$ = 1.5
– Stopping Tolerance, ε = 5 kg

• Known solution
x = [1 0 1 1 1 1 1 1 4 2 1 1 0 0 1 1 0 1 1 1 0 0 1 1 0 0 3]
f (x ) = 3443 kg
• PSO solution
x = [1 1 1 0 1 1 1 1 4 2 1 1 0 1 0 2 1 1 0 1 1 0 1 0 1 1 2]
f (x ) = 3461 kg

System Problem
Communication Satellite Design
[Figure: fitness value of each of the 60 particles at selected moves:]

Move 0:  # of feasible particles (out of 60) = 0
Move 5:  # of feasible particles (out of 60) = 28
Move 15: # of feasible particles (out of 60) = 50
Move 33: # of feasible particles (out of 60) = 58

Final Comments on PSO

• This is a method "in the making"; many versions are likely to appear.

• Poor repeatability in terms of:
– finding the optimal solution
– computational cost

• More robust constraint (side and functional) handling approaches are
needed.

• Guidelines for the selection of swarm size, inertia, and confidence
parameters are needed.

References

• Kennedy, J. and Eberhart, R., "Particle Swarm Optimization," Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 1995, pp. 1942-1945.

• Venter, G. and Sobieski, J., "Particle Swarm Optimization," AIAA 2002-1235, 43rd AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, Denver, CO, April 2002.

• Kennedy, J. and Eberhart, R., Swarm Intelligence, Academic Press, 1st ed., San Diego, CA, 2001.
