Combined Algorithm of Particle Swarm Optimization

Narinder Singh
Department of Mathematics, Punjabi University, Patiala (Punjab), INDIA - 147201
Email: narindersinghgoria@ymail.com

S.B. Singh
Department of Mathematics, Punjabi University, Patiala (Punjab), INDIA - 147201
Email: sbsingh69@yahoo.com

J.C. Bansal
ABV-Indian Institute of Information Technology and Management, Gwalior (M.P.), INDIA
Email: jcbansal@yahoo.com
 
Abstract:
A new optimization algorithm, the Combined Algorithm of Particle Swarm Optimization (CAPSO), is presented in this paper. It is based on a novel modification of the velocity update equation, obtained by combining two different PSO algorithms, namely Standard Particle Swarm Optimization (SPSO) and Personal Best Position Particle Swarm Optimization (PBPPSO). Its performance is compared with the standard PSO (SPSO) by testing it on a set of 15 scalable and 13 non-scalable test problems. Based on the numerical and graphical analyses of the results, it is shown that CAPSO outperforms SPSO in terms of efficiency, reliability, accuracy and stability.
Keywords: Particle Swarm Optimization, CAPSO (Combined Algorithm of Particle Swarm Optimization), global optimization, velocity update equation, Personal Best Position Particle Swarm Optimization.
I. INTRODUCTION
 Standard Particle Swarm Optimization (SPSO):
 
Particle swarm optimization (PSO) [1], [2] is a stochastic, population-based search method, modeled after the behavior of bird flocks. A PSO algorithm maintains a swarm of individuals (called particles), where each individual (particle) represents a candidate solution. Particles follow a very simple behavior: they emulate the success of neighboring particles as well as their own successes achieved so far. The position of a particle is therefore influenced by the best particle in its neighborhood, as well as the best solution found by the particle itself. Particle positions $x_i$ are adjusted using

$$x_i(t+1) = x_i(t) + v_i(t+1) \qquad (1)$$

where the velocity component $v_i(t)$ represents the step size. For the basic PSO,

$$v_{ij}(t+1) = w\, v_{ij}(t) + c_1 r_{1j}\,\big( y_{ij} - x_{ij} \big) + c_2 r_{2j}\,\big( \hat{y}_j - x_{ij} \big) \qquad (2)$$

where $w$ is the inertia weight [11], $c_1$ and $c_2$ are the acceleration coefficients, $r_{1j}, r_{2j} \sim U(0,1)$, $y_{ij}$ is the personal best position of particle $i$, and $\hat{y}$ is the neighborhood best position of particle $i$. The neighborhood best position $\hat{y}_i$ of particle $i$ depends on the neighborhood topology used [3], [4]. If a star topology is used, then $\hat{y}_i$ refers to the best position found by the entire swarm. That is,

$$\hat{y}(t) \in \{ y_0(t), y_1(t), \ldots, y_s(t) \} \;\text{ such that }\; f(\hat{y}(t)) = \min\{ f(y_0(t)), f(y_1(t)), \ldots, f(y_s(t)) \}$$

where $s$ is the swarm size. The resulting algorithm is referred to as the global best PSO. For the ring topology, the swarm is divided into overlapping neighborhoods of particles. In this case, $\hat{y}_i$ is the best position found by the neighborhood of particle $i$. The resulting algorithm is referred to as the local best PSO. The Von Neumann topology defines neighborhoods by organizing particles in a lattice structure. A number of empirical studies have shown that the Von Neumann topology outperforms other neighborhood topologies [4], [5]. It is important to note that neighborhoods are determined using particle indices and are not based on any spatial information.

A large number of PSO variations have been developed, mainly to improve the accuracy of solutions, diversity, and convergence behavior [6], [7]. This section reviews those variations used in this study, from which concepts have been borrowed to develop a new, parameter-free PSO algorithm.
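For concreteness, the following is a minimal Python/NumPy sketch of one global best SPSO iteration implementing equations (1) and (2). The inertia weight and acceleration coefficients shown are illustrative values only, not values reported in this paper, and the function name is a placeholder.

import numpy as np

def spso_step(x, v, pbest, gbest, w=0.72, c1=1.49, c2=1.49):
    # x, v, pbest: arrays of shape (swarm_size, n); gbest: array of shape (n,)
    s, n = x.shape
    r1 = np.random.rand(s, n)   # r_1j ~ U(0,1), drawn per particle and dimension
    r2 = np.random.rand(s, n)   # r_2j ~ U(0,1)
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update, equation (2)
    x_new = x + v_new                                               # position update, equation (1)
    return x_new, v_new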
Van den Bergh and Engelbrecht [8], [9], and Clerc and Kennedy [3], formally proved that each particle converges to a weighted average of its personal best and neighborhood best positions. That is,

$$\lim_{t \to \infty} x_{ij} = \frac{c_1\, y_{ij} + c_2\, \hat{y}_j}{c_1 + c_2} \qquad (3)$$

This theoretically derived behavior provides support for the barebones PSO developed by Kennedy [10], where the velocity vector is replaced with a vector of random numbers sampled from a Gaussian distribution with the mean defined by equation (3), assuming that $c_1 = c_2$, and deviation

$$\sigma = \left| y_{ij} - \hat{y}_j \right|.$$

The velocity equation changes to

$$v_{ij}(t+1) \sim N\!\left( \frac{y_{ij} + \hat{y}_j}{2},\; \sigma \right).$$

The position update then changes to

$$x_i(t+1) = v_i(t+1).$$

Kennedy [9] also proposed an alternative version of the barebones PSO, where

$$v_{ij}(t+1) = \begin{cases} y_{ij} & \text{if } U(0,1) < 0.5 \\[4pt] N\!\left( \dfrac{y_{ij} + \hat{y}_j}{2},\; \sigma \right) & \text{otherwise} \end{cases} \qquad (4)$$

Based on equation (4), there is a 50% chance that the j-th dimension of the particle changes to the corresponding personal best position. This version of the barebones PSO is biased towards exploiting personal best positions. Silva et al. (2002) presented a predator-prey model to maintain population diversity.
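A minimal sketch of the alternative barebones update (4) for a single dimension $j$ of particle $i$ is given below; the function name is a hypothetical helper introduced only for illustration.

import numpy as np

def barebones_velocity(y_ij, yhat_j):
    # Equation (4): with probability 0.5 keep the personal best component,
    # otherwise sample from a Gaussian centred midway between personal and neighborhood best.
    if np.random.rand() < 0.5:
        return y_ij
    sigma = abs(y_ij - yhat_j)                      # deviation sigma = |y_ij - yhat_j|
    return np.random.normal((y_ij + yhat_j) / 2.0, sigma)

Since the barebones position update is $x_i(t+1) = v_i(t+1)$, the value returned here directly becomes the new position component.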
II. THE PROPOSED COMBINED ALGORITHM OF PSO
The motivation behind introducing CAPSO is to combine, in a single velocity update equation, the velocity update rules of two different PSO algorithms, i.e., SPSO (Standard Particle Swarm Optimization) and PBPPSO (Personal Best Position Particle Swarm Optimization). Thus, we introduce a new velocity update equation as follows:

$$v_{ij}(t+1) = \left[ w\, v_{ij}(t) + c_1 r_{1j}\,( y_{ij} - x_{ij}) + c_2 r_{2j}\,( \hat{y}_j - x_{ij}) \right] + \left[ w\, v_{ij}(t) + c_1 r_{1j}\,( y_{ij} - x_{ij}) - c_2 r_{2j}\, x_{ij} \right]$$

or

$$v_{ij}(t+1) = 2 w\, v_{ij}(t) + 2 c_1 r_{1j}\,( y_{ij} - x_{ij}) + c_2 r_{2j}\,( \hat{y}_j - 2 x_{ij}) \qquad (5)$$

In the velocity update equation of this new PSO, the first term represents the current velocity of the particle and can be thought of as a momentum term. The second term, proportional to the vector $2 c_1 r_{1j}( y_{ij} - x_{ij})$, is responsible for attracting the particle's current position towards the positive direction of its own best position (pbest). The third term, proportional to the vector $c_2 r_{2j}( \hat{y}_j - 2 x_{ij})$, is responsible for attracting the particle's current position towards the neighborhood best position.

Figure I: Comparative movement of a particle in SPSO and CAPSO (figure omitted; it marks pbest, gbest and the particle's current position).
The pseudo code of CAPSO is shown below:

ALGORITHM CAPSO
For t = 1 to the maximum number of iterations,
  For i = 1 to the swarm size,
    For j = 1 to the problem dimensionality,
      Apply the velocity update equation (5);
      Update the position using equation (1);
    End-for-j;
    Compute the fitness of the updated position;
    If needed, update the historical information for the personal best position and the global best position;
  End-for-i;
  Terminate if the global best position meets the problem requirements;
End-for-t;
END ALGORITHM
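The following is a minimal Python/NumPy sketch of this pseudo code using the CAPSO velocity update (5). The Sphere objective, bounds, and parameter values (w, c1, c2, swarm size, iterations) are illustrative assumptions for demonstration, not the settings reported in the paper.

import numpy as np

def sphere(x):
    # Assumed demonstration objective (Sphere, Table 1 no. 8)
    return np.sum(x ** 2)

def capso(obj, dim=30, swarm=20, iters=1000, lb=-5.12, ub=5.12,
          w=0.7, c1=1.5, c2=1.5):
    x = np.random.uniform(lb, ub, (swarm, dim))      # initial positions
    v = np.zeros((swarm, dim))                       # initial velocities
    pbest = x.copy()                                 # personal best positions
    pbest_f = np.array([obj(p) for p in pbest])      # personal best fitness values
    g = pbest[np.argmin(pbest_f)].copy()             # global best position
    for _ in range(iters):
        r1 = np.random.rand(swarm, dim)
        r2 = np.random.rand(swarm, dim)
        # CAPSO velocity update, equation (5)
        v = 2 * w * v + 2 * c1 * r1 * (pbest - x) + c2 * r2 * (g - 2 * x)
        x = np.clip(x + v, lb, ub)                   # position update, equation (1), kept in bounds
        f = np.array([obj(p) for p in x])
        better = f < pbest_f                         # update personal bests
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()         # update global best
    return g, np.min(pbest_f)

best_x, best_f = capso(sphere)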
III. THE TEST BED
Often a proposed algorithm is evaluated only on a few benchmark problems. In this paper, however, we consider a test bed of thirty benchmark problems with varying difficulty levels and problem sizes. The relative performance of SPSO and CAPSO is evaluated on two kinds of problem sets. Problem Set 1 consists of 15 scalable problems, i.e., problems in which the dimension can be increased or decreased at will; in general, the complexity of the problem increases as the problem size is increased. Problem Set 2 consists of problems in which the problem size is fixed, but which have many local as well as global optima. Problem Set 1 is shown in Table 1 and Problem Set 2 is shown in Table 2.
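As an illustration of how the two problem sets differ, here is a minimal Python sketch of one scalable benchmark from Table 1 (Rastrigin) and one fixed-dimension benchmark from Table 2 (Becker and Lago), following the expressions listed in the tables; the function names are placeholders.

import numpy as np

def rastrigin(x):
    # Scalable (Problem Set 1, no. 5): dimension n is free; global minimum 0 at x = 0
    x = np.asarray(x)
    return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def becker_lago(x1, x2):
    # Fixed dimension (Problem Set 2, no. 1): n = 2; global minimum 0 at (|x1|, |x2|) = (5, 5)
    return (abs(x1) - 5) ** 2 + (abs(x2) - 5) ** 2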
Table 1: Details of Problem Set I (serial number, function name, expression, search space, objective function value)

1. Ackley: $\min f(x) = 20 + e - 20\exp\!\left(-0.02\sqrt{\tfrac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\!\left(\tfrac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right)$; search space $-30 \le x_i \le 30$; objective function value 0.

2. Cosine Mixture: $\min f(x) = \sum_{i=1}^{n} x_i^2 - 0.1\sum_{i=1}^{n}\cos(5\pi x_i)$; $-1 \le x_i \le 1$; objective function value $-0.1\,n$.

3. Exponential: $\min f(x) = -\exp\!\left(-0.5\sum_{i=1}^{n} x_i^2\right)$; $-1 \le x_i \le 1$; objective function value $-1$.

4. Griewank: $\min f(x) = 1 + \tfrac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n}\cos\!\left(\tfrac{x_i}{\sqrt{i}}\right)$; $-600 \le x_i \le 600$; objective function value 0.

5. Rastrigin: $\min f(x) = 10n + \sum_{i=1}^{n}\left[x_i^2 - 10\cos(2\pi x_i)\right]$; $-5.12 \le x_i \le 5.12$; objective function value 0.

6. Function '6': $\min f(x) = \sum_{i=1}^{n-1}\left[100\,(x_{i+1} - x_i^2)^2 + (x_i - 1)^2\right]$; $-30 \le x_i \le 30$; objective function value 0.

7. Zakharov's: $\min f(x) = \sum_{i=1}^{n} x_i^2 + \left(\sum_{i=1}^{n}\tfrac{i}{2}x_i\right)^2 + \left(\sum_{i=1}^{n}\tfrac{i}{2}x_i\right)^4$; $-5.12 \le x_i \le 5.12$; objective function value 0.

8. Sphere: $\min f(x) = \sum_{i=1}^{n} x_i^2$; $-5.12 \le x_i \le 5.12$; objective function value 0.

9. Axis parallel hyper-ellipsoid: $\min f(x) = \sum_{i=1}^{n} i\,x_i^2$; $-5.12 \le x_i \le 5.12$; objective function value 0.

10. Schwefel '3': $\min f(x) = \sum_{i=1}^{n}|x_i| + \prod_{i=1}^{n}|x_i|$; $-10 \le x_i \le 10$; objective function value 0.

11. Dejong: $\min f(x) = \sum_{i=1}^{n}\left(i\,x_i^4 + \mathrm{rand}(0,1)\right)$; $-10 \le x_i \le 10$; objective function value 0.

12. Schwefel '4': $\min f(x) = \max_i\{|x_i|,\; 1 \le i \le n\}$; $-100 \le x_i \le 100$; objective function value 0.

13. Cigar: $\min f(x) = x_1^2 + 100000\sum_{i=2}^{n} x_i^2$; $-10 \le x_i \le 10$; objective function value 0.

14. Brown '3': $\min f(x) = \sum_{i=1}^{n-1}\left[(x_i^2)^{(x_{i+1}^2+1)} + (x_{i+1}^2)^{(x_i^2+1)}\right]$; $-1 \le x_i \le 4$; objective function value 0.

15. Function '15': $\min f(x) = \sum_{i=1}^{n}\left[0.2\,x_i^2 + 0.1\,x_i^2\sin(2x_i)\right]$; $-10 \le x_i \le 10$; objective function value 0.
Table 2: Details of Problem Set II (serial number, function name, expression, search space, objective function value)

1. Becker and Lago: $\min f(x) = (|x_1| - 5)^2 + (|x_2| - 5)^2$; $-10 \le x_i \le 10$; objective function value 0.

2. Bohachevsky '1': $\min f(x) = x_1^2 + 2x_2^2 - 0.3\cos(3\pi x_1) - 0.4\cos(4\pi x_2) + 0.7$; $-50 \le x_1, x_2 \le 50$; objective function value 0.