
PARTICLE SWARM OPTIMIZATION
(PSO)

PREPARED BY:
VRAJESH CHOKSHI
(08CE10)
ARTIFICIAL INTELLIGENCE
Nowadays the concepts of artificial intelligence have come into existence; basically they are concerned with producing artificial systems that act like a human being, or that have memory and functionality similar to the human brain.

Artificial Intelligence includes many topics, such as swarm optimization, soft computing, memetic algorithms, genetic algorithms, and many more.

One of the main algorithms in artificial intelligence is PARTICLE SWARM OPTIMIZATION, which shows how the collective behavior of real-world entities makes it possible to work efficiently.
SWARM INTELLIGENCE
Swarms consist of many simple entities that have local interactions, including interacting with the environment.

The emergence of complex, or macroscopic, behaviors and the ability to achieve significant results as a team result from combining simple, or microscopic, behaviors.
Swarm intelligence techniques (note the difference from intelligent
swarms) are population-based stochastic methods used in combinatorial
optimization problems in which the collective behavior of relatively simple
individuals arises from their local interactions with their environment to
produce functional global patterns.
PARTICLE SWARM
OPTIMIZATION
Introduction
 Particle Swarm Optimization (PSO) is a relatively new, evolution-based search
and optimization technique. We explain the PSO algorithm in detail and
demonstrate its performance on one- and two-dimensional continuous search
problems.

 PSO algorithms are especially useful for parameter optimization in continuous, multi-dimensional search spaces. PSO is mainly inspired by social behaviour patterns of organisms that live and interact within large groups. In particular, PSO incorporates swarming behaviours observed in flocks of birds, schools of fish, or swarms of bees.
(Cont.)
 The term PSO refers to a relatively new family of algorithms that may be used to find optimal or near-optimal solutions to numerical and qualitative problems. It is easily implemented in most programming languages, since the core of the algorithm can be written in a few lines of code, and it has proven both very effective and quick when applied to a diverse set of optimization problems.

 PSO is an established method for parameter optimization. It represents a population-based adaptive optimization technique whose convergence behavior is influenced by several strategic parameters; choosing reasonable values for these parameters depends on the optimization task.
What does it basically mean?
 Particle swarm optimization (PSO) is one of the evolutionary computation techniques. Like the other evolutionary computation techniques, PSO is a population-based search algorithm and is initialized with a population of random solutions, called particles. Unlike the other evolutionary techniques, each particle in PSO is also associated with a velocity.

 The idea of PSO is to have a swarm of particles “flying” through a multidimensional search space, looking for the global optimum. By exchanging information, the particles can influence each other’s movements. Each particle retains an individual (or “cognitive”) memory of the best position it has visited, as well as a global (or “social”) memory of the best position visited by all the particles in the swarm. A particle calculates its next position based on a combination of its last movement vector, the individual and global memories, and a random component.
(Cont.)
 PSO is a global optimization algorithm for dealing with problems in which a point or surface in an n-dimensional space best represents a solution. Potential solutions are plotted in this space and seeded with an initial velocity. Particles move through the solution space and are evaluated against certain fitness criteria. Over time, particles accelerate toward those with better fitness values.
 Particle Swarm Optimization (PSO) applies the concept of social interaction to problem solving.

 It has been applied successfully to a wide variety of search and optimization problems.
 In PSO, a swarm of n individuals communicate search directions (gradients) either directly or indirectly with one another.
 PSO is a simple but powerful search technique.
Who basically found it?
James Kennedy and Russell Eberhart

It was developed in 1995 by James Kennedy and Russ Eberhart [Kennedy, J. and Eberhart, R. (1995). “Particle Swarm Optimization”, Proceedings of the 1995 IEEE International Conference on Neural Networks, pp. 1942-1948, IEEE Press.]
Advantages
 The main advantage of such an approach over other global minimization strategies, such as simulated annealing, is that the large number of members that make up the particle swarm makes the technique impressively resilient to the problem of local minima.

 An advantage of PSO is its ability to handle optimization problems with multiple local optima reasonably well, and its simplicity of implementation, especially in comparison to related strategies like genetic algorithms. In the field of cheminformatics, PSO has successfully been applied to quantitative structure-activity relationship (QSAR) modeling, including k-nearest neighbor and kernel regression, minimum spanning trees for piecewise modeling, partial least squares modeling, and neural network training.
How it works!!!
 The basic idea of the algorithm:

 The social metaphor that led to this algorithm can be summarized as follows:

 The individuals that are part of a society hold an opinion that is part of a “belief space” (search space) shared by every possible individual.
 Individuals may modify this “opinion state” based on three factors:
 The knowledge of the environment (its fitness value)
 The individual’s previous history of states (its memory)
 The previous history of states of the individual’s neighbourhood.
Algorithm
1. Initialize the particles randomly in the search space
2. Set the generation count
3. While not terminated:
   a. Evaluate the fitness of individual particles (pbest)
   b. Keep track of the best particle across the generation (gbest)
   c. Calculate particle velocities
   d. Update particle positions / locations
   e. Terminate if the stopping condition is met
4. Terminate if the generation count expires.
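The numbered steps above can be sketched in Python. This is a minimal illustration, not the slides' own code: the objective (the sphere function), the swarm size, the coefficients c1 = c2 = 2.0, and a velocity clamp Vmax (used in early PSO implementations) are all assumed for the example.

```python
import random

def pso(fitness, dim=2, n_particles=20, n_generations=100,
        c1=2.0, c2=2.0, bounds=(-10.0, 10.0)):
    """Minimal PSO loop following the numbered steps above."""
    lo, hi = bounds
    vmax = 0.5 * (hi - lo)  # assumed velocity clamp (Vmax)
    # 1. Initialize the particles randomly in the search space
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(p) for p in x]              # personal best positions
    pbest_fit = [fitness(p) for p in x]
    g = min(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = list(pbest[g]), pbest_fit[g]
    # 2.-4. Generation loop
    for _ in range(n_generations):
        for i in range(n_particles):
            # a. Evaluate fitness and update the personal best (pbest)
            f = fitness(x[i])
            if f < pbest_fit[i]:
                pbest_fit[i], pbest[i] = f, list(x[i])
                # b. Keep track of the global best (gbest)
                if f < gbest_fit:
                    gbest_fit, gbest = f, list(x[i])
            for d in range(dim):
                # c. Calculate the particle velocity
                v[i][d] = (v[i][d]
                           + c1 * random.random() * (pbest[i][d] - x[i][d])
                           + c2 * random.random() * (gbest[d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, v[i][d]))
                # d. Update the particle position
                x[i][d] += v[i][d]
    return gbest, gbest_fit

# Example: minimize the sphere function f(x, y) = x^2 + y^2 (optimum at the origin)
best, best_fit = pso(lambda p: sum(c * c for c in p))
print(best, best_fit)
```

Since gbest is only ever replaced by a better value, the returned fitness can never get worse from one generation to the next.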
Working of algorithm

 The PSO algorithm is simple in concept, easy to implement, and computationally efficient. The original procedure for implementing PSO is as follows:

 Initialize a population of particles with random positions and velocities in D dimensions in the problem space.

 For each particle, evaluate the desired optimization fitness function in D variables.

 Compare each particle's fitness evaluation with its pbest. If the current value is better than pbest, then set pbest equal to the current value, and pi equal to the current location xi in D-dimensional space.

 Identify the particle in the neighborhood with the best success so far, and assign its index to the variable g.
(Cont.)
 Change the velocity and position of the particle according to the following equations:

v_id = v_id + c1*rand()*(p_id - x_id) + c2*rand()*(p_gd - x_id)
x_id = x_id + v_id
 

 Loop to step b until a criterion is met, usually a sufficiently good fitness or a maximum number of iterations.
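As a worked numeric example of the two update equations (all numbers, including the fixed stand-ins for the two rand() draws, are invented purely for illustration):

```python
# One particle, one dimension d: current position x_id, velocity v_id,
# personal best p_id, and neighborhood best p_gd.
c1, c2 = 2.0, 2.0
x_id, v_id = 3.0, 0.5
p_id, p_gd = 2.0, 1.0
r1, r2 = 0.5, 0.5          # fixed stand-ins for the two rand() draws

# v_id = v_id + c1*rand()*(p_id - x_id) + c2*rand()*(p_gd - x_id)
v_id = v_id + c1 * r1 * (p_id - x_id) + c2 * r2 * (p_gd - x_id)
# x_id = x_id + v_id
x_id = x_id + v_id

print(v_id, x_id)  # -2.5 0.5
```

The particle is pulled past both its personal best (2.0) and the neighborhood best (1.0), overshooting to 0.5; this oscillation around the attractors is characteristic of PSO.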
Concepts to apply & how it works
 The connection to search and optimization is made by assigning direction vectors and velocities to each point in a multidimensional search space.

 Each point then 'moves' or 'flies' through the search space following its velocity vector, which is influenced by the directions and velocities of other points in its neighborhood.

 These localized interactions with neighboring points propagate through the entire 'swarm' of potential solutions.

 How much influence a particular point has on other points is determined by the 'fitness' assigned to a potential solution, which captures how good it is compared to all other solutions.

 Each solution is represented as a series of co-ordinates in n-dimensional space. A number of particles are initialized randomly within the search space.
(Cont.)
 Each particle has a very simple memory of its personal best solution, stored as pbest.

 The global best solution for each iteration is given as gbest.

 On each iteration, every particle is moved a certain distance from its current location, influenced by a random amount by the pbest and gbest values.

 Each particle keeps track of its coordinates in the problem space which are associated with the best solution it has achieved; this fitness value is also stored, and is called pbest. When a particle takes the whole population as its topological neighbors, the best value is a global best, given as gbest.
(Cont.)
 After finding these best values, each particle calculates its new velocity and position using formulas a & b respectively:

a. v[] = v[] + c1*rand()*(pbest[] - present[]) + c2*rand()*(gbest[] - present[])
b. present[] = present[] + v[]

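Formulas a & b can be written out directly over the bracketed arrays. Below is a minimal sketch using plain Python lists for v[], present[], pbest[], and gbest[], with assumed coefficients c1 = c2 = 2.0:

```python
import random

def step(present, v, pbest, gbest, c1=2.0, c2=2.0):
    """Apply formulas a & b to every dimension of one particle, in place."""
    for d in range(len(present)):
        # a. v[] = v[] + c1*rand()*(pbest[] - present[]) + c2*rand()*(gbest[] - present[])
        v[d] = (v[d]
                + c1 * random.random() * (pbest[d] - present[d])
                + c2 * random.random() * (gbest[d] - present[d]))
        # b. present[] = present[] + v[]
        present[d] = present[d] + v[d]
    return present, v

present, v = [3.0, -1.0], [0.0, 0.0]
pbest, gbest = [2.0, 0.0], [1.0, 1.0]
step(present, v, pbest, gbest)
print(present, v)
```

Because both attractors lie below 3.0 in the first dimension and above -1.0 in the second, the particle is always pulled in those directions on this step, whatever the rand() draws.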
APPLICATION
 Solving constrained optimization problems

 Solving min-max problems

 Solving multiobjective optimization problems

 Also used for solving the travelling salesman problem:
(Cont.)
 Here, in the travelling salesman problem, PSO is applied so as to get the minimum distance travelled by the salesman; the coloured dot is the beginning point while the others are destinations.

 Dynamic tracking

 Evolving the weights and structure of a neural network

 Analysing human tremor

 Registering 3D-to-3D biomedical images

Conclusion
 Particle swarm optimization is an extremely simple algorithm that seems to
be effective for optimizing a wide range of functions.

 We view it as a mid-level form of A-life or biologically derived algorithm, occupying the space in nature between evolutionary search, which requires eons, and neural processing, which occurs on the order of milliseconds. Social optimization occurs in the time frame of ordinary experience - in fact, it is ordinary experience.

 In addition to its ties with A-life, particle swarm optimization has obvious ties
with evolutionary computation.

 Conceptually, it seems to lie somewhere between genetic algorithms and evolutionary programming. It is highly dependent on stochastic processes, like evolutionary programming.
QUESTIONS?
Thank You!!!
