A New Verified Particle Swarm Optimization Algorithm

Niusha Shafiabady1, Mohsen A. Fesharaki1, Ida Shafiabady1, Shahab Ahmadi1, Mohsen Fesharaki1

1 Science Society of Mechatronics, Azad University, Science and Research Branch

nshafiabady@yahoo.com, fesharaki@yahoo.com, ishafiabady@yahoo.com, shahab_ahmadi_01@yahoo.com, mnfesharaki@yahoo.com

Abstract— In this paper a verified PSO is introduced. In the proposed version of PSO the dimension of the velocities is increased. This verified PSO is compared with the conventional PSO algorithm using thirteen test functions. The verification done in the PSO algorithm has made it much better at achieving the optimization goals. The improvement is especially seen in higher dimensions.

Keywords- Particle Swarm Optimization.
I. INTRODUCTION

Particle swarm optimization (PSO) is a population-based evolutionary algorithm. It is similar to other population-based evolutionary algorithms in that the algorithm is initialized with a population of random solutions. It is unlike most other population-based evolutionary algorithms in that PSO is motivated by the simulation of social behavior instead of survival of the fittest, and each candidate solution is associated with a velocity. The candidate solutions, called particles, "fly" through the search space. The velocity is constantly adjusted according to the corresponding particle's own experience and the experience of the particle's companions. It is expected that the particles will move towards better solution areas. Mathematically, the particles are manipulated according to the following equations:

V_i = w V_{i-1} + \underbrace{\mathrm{rand} \cdot c_1 (X_{i,\mathrm{PersonalBest}} - X_i)}_{\text{cognitive term}} + \underbrace{\mathrm{rand} \cdot c_2 (X_{i,\mathrm{GlobalBest}} - X_i)}_{\text{social term}}    (1)

X_i = X_{i-1} + V_i

where the variable w is the inertia weight; its adaptation is an important issue, and a lot of work has been done to find its ideal value for better results. Here c_1 and c_2 are positive constants that set the emphasis put on the cognitive and social terms, and rand is a random number. For PSO to converge, c_1 + c_2 \le 4 should be satisfied. The balance between global and local search throughout the course of a run is critical to the success of an evolutionary algorithm. In this article a new method is proposed that increases the dimension of the search velocity, and better results are achieved using this verified version of PSO. The improvement is especially clear in search spaces with a higher number of variables.
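For readers who prefer code to notation, the following is a minimal NumPy sketch of the conventional update in equation (1). It is an illustration, not the authors' implementation (the paper's experiments were run in Matlab), and the default values of w, c1 and c2 shown here are placeholders rather than values prescribed by this section.

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.5, c1=2.0, c2=2.0, rng=None):
    """One iteration of the conventional PSO update of eq. (1).

    x, v, pbest: arrays of shape (n_particles, n_dims); gbest: shape (n_dims,).
    """
    if rng is None:
        rng = np.random.default_rng()
    r1 = rng.random(x.shape)                 # 'rand' in eq. (1)
    r2 = rng.random(x.shape)
    cognitive = c1 * r1 * (pbest - x)        # pull towards each particle's personal best
    social = c2 * r2 * (gbest - x)           # pull towards the swarm's global best
    v_new = w * v + cognitive + social       # eq. (1)
    x_new = x + v_new                        # position update X_i = X_{i-1} + V_i
    return x_new, v_new
```

In an actual run this step is applied repeatedly, with the personal and global bests refreshed after every position update.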
II. THE PROPOSED ALGORITHM

In order to improve the efficiency of PSO, a verification is done in the algorithm by adding a dimension to the velocity, as shown in equations (2)-(5):

Vx_i = w\, Vx_{i-1} + \underbrace{\mathrm{rand} \cdot c_1 (X_{i,\mathrm{PersonalBest}} - X_i)}_{\text{cognitive term}} + \underbrace{\mathrm{rand} \cdot c_2 (X_{i,\mathrm{GlobalBest}} - X_i)}_{\text{social term}}    (2)

Vy_i = w\, Vy_{i-1} + \underbrace{\mathrm{rand} \cdot cc_1 (X_{i,\mathrm{PersonalBest}} - X_i)}_{\text{cognitive term}} + \underbrace{\mathrm{rand} \cdot cc_2 (X_{i,\mathrm{GlobalBest}} - X_i)}_{\text{social term}}    (3)

V_i = Vx_i + Vy_i    (4)

X_i = X_{i-1} + V_i    (5)

The formulation gives one velocity in the direction of coordinate x and another velocity in the direction of coordinate y. As shown in equation (4), the two velocities are added to obtain the final velocity applied to the particles in each iteration. Here the constants are tuned to create a balance between the cognitive and the social terms. Their values are set to c_1 = 1.2, c_2 = 0.5, cc_2 = 1.2, cc_1 = 0.5 to create the mentioned balance. The velocity in the direction of x has a greater cognitive term, whereas the velocity calculated in the direction of y has a greater social term. By adding these two velocities together, the optimization power of PSO improves.
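A corresponding sketch of the verified update of equations (2)-(5) is given below, using the constants quoted above (c1 = 1.2, c2 = 0.5, cc1 = 0.5, cc2 = 1.2). Whether the two velocity updates share the same random numbers is not specified in the paper, so reusing r1 and r2 for both is an assumption of this sketch.

```python
import numpy as np

def verified_pso_step(x, vx, vy, pbest, gbest, w=0.5,
                      c1=1.2, c2=0.5, cc1=0.5, cc2=1.2, rng=None):
    """One iteration of the verified PSO update, eqs. (2)-(5).

    vx is the cognitive-heavy velocity and vy the social-heavy one; the
    constants default to the values quoted in Section II.
    """
    if rng is None:
        rng = np.random.default_rng()
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    vx_new = w * vx + r1 * c1 * (pbest - x) + r2 * c2 * (gbest - x)    # eq. (2)
    vy_new = w * vy + r1 * cc1 * (pbest - x) + r2 * cc2 * (gbest - x)  # eq. (3)
    v = vx_new + vy_new                                                # eq. (4)
    x_new = x + v                                                      # eq. (5)
    return x_new, vx_new, vy_new
```

Splitting the velocity this way lets the x-component emphasize each particle's own memory while the y-component emphasizes the swarm's best position, and the two pulls are recombined in equation (4).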
III. SIMULATION RESULTS

In order to compare the proposed method with PSO, both methods are used to optimize thirteen test functions. The simulations are done in Matlab and the test functions are given in equations (6)-(18). The simulations are done under the same conditions, with w = 0.5 for all runs. Figures 1-13 show the test functions' shapes. As f_2 is a function whose dimension cannot be increased, its results are given only for a two-variable space. The results obtained with PSO and with the verified PSO are given in Table I.

As the results show, the performance of the verified PSO is much better than that of normal PSO. Only for the sixth function has PSO been better than the verified PSO, although in that case the verified PSO has still achieved a desired result and met the optimization goal. On the whole, the performance of PSO has been improved by the verification. In order to show the improvement of the algorithm, a ten-dimensional search space is also used and the related results are given in Table II. In all the cases, adding the dimension has improved the performance of the algorithm, and it can be proposed as a better version of PSO. The test functions are as follows:
Spherical (f_1):
f_1(x) = \sum_{i=1}^{n} x_i^2, \quad x \in [-100, 100]^n, \quad \min f_1(x^*) = f_1(0) = 0    (6)

Six Hump Camel Back (f_2):
f_2(x) = \left(4 - 2.1 x_1^2 + \frac{x_1^4}{3}\right) x_1^2 + x_1 x_2 + \left(-4 + 4 x_2^2\right) x_2^2, \quad x_1 \in [-3, 3], \; x_2 \in [-2, 2], \quad \min f_2(x^*) = f_2(\pm(0.0898, -0.7126)) = -1.0316    (7)

Schwefel (f_3):
f_3(x) = -\sum_{i=1}^{n} x_i \sin\left(\sqrt{|x_i|}\right), \quad x \in [-500, 500]^n, \quad \min f_3(x^*) = f_3(420.9687, \ldots, 420.9687) = -418.9829\, n    (8)

Rastrigin (f_4):
f_4(x) = \sum_{i=1}^{n} \left(x_i^2 - 10 \cos(2\pi x_i) + 10\right), \quad x \in [-5.12, 5.12]^n, \quad \min f_4(x^*) = f_4(0) = 0    (9)

Hyperellipsoid (f_5):
f_5(x) = \sum_{i=1}^{n} i\, x_i^2, \quad x \in [-5.12, 5.12]^n, \quad \min f_5(x^*) = f_5(0) = 0    (10)

Griewank (f_6):
f_6(x) = \frac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left(\frac{x_i}{\sqrt{i}}\right) + 1, \quad x \in [-300, 300]^n, \quad \min f_6(x^*) = f_6(0) = 0    (11)

Easom (f_7):
f_7(x) = -\prod_{i=1}^{n} \cos(x_i)\, e^{-(x_i - \pi)^2}, \quad x \in [-100, 100]^n, \quad \min f_7(x^*) = f_7(\pi, \ldots, \pi) = -1    (12)

Drop Wave (f_8):
f_8(x) = -\frac{1 + \cos\left(12 \sqrt{\sum_{i=1}^{2} x_i^2}\right)}{\frac{1}{2} \sum_{i=1}^{2} x_i^2 + 2}, \quad x \in [-5.12, 5.12]^2    (13)

Ackley (f_9):
f_9(x) = -20 \exp\left(-0.2 \sqrt{\frac{1}{n} \sum_{i=1}^{n} x_i^2}\right) - \exp\left(\frac{1}{n} \sum_{i=1}^{n} \cos(2\pi x_i)\right) + 20 + e, \quad x \in [-32, 32]^n, \quad \min f_9(x^*) = f_9(0) = 0    (14)

Quadric (f_10):
f_{10}(x) = \sum_{i=1}^{n} \left(\sum_{j=1}^{i} x_j\right)^2, \quad x \in [-100, 100]^n, \quad \min f_{10}(x^*) = f_{10}(0) = 0    (15)

Rosenbrock (f_11):
f_{11}(x) = \sum_{i=1}^{n-1} \left(100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2\right), \quad x \in [-2.048, 2.048]^n, \quad \min f_{11}(x^*) = f_{11}(1, \ldots, 1) = 0    (16)

Zakharov (f_12):
f_{12}(x) = \sum_{i=1}^{n} x_i^2 + \left(\sum_{i=1}^{n} \tfrac{1}{2} i\, x_i\right)^2 + \left(\sum_{i=1}^{n} \tfrac{1}{2} i\, x_i\right)^4, \quad x \in [-10, 10]^n, \quad \min f_{12}(x^*) = f_{12}(0) = 0    (17)

Levy (f_13):
f_{13}(x) = \frac{\pi}{n} \left(k \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - a)^2 \left(1 + k \sin^2(\pi y_{i+1})\right) + (y_n - a)^2\right), \quad y_i = 1 + \tfrac{1}{4}(x_i - 1), \; k = 10, \; a = 1, \quad x \in [-10, 10]^n, \quad \min f_{13}(x^*) = f_{13}(1, \ldots, 1) = 0    (18)
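As a reference point for reimplementation, a few of the test functions above (Spherical, Rastrigin and Ackley, equations (6), (9) and (14)) could be coded as follows. This is a NumPy transcription of the formulas, not the authors' Matlab code.

```python
import numpy as np

def spherical(x):
    # f1: sum of squares, global minimum 0 at the origin (eq. 6)
    return np.sum(x ** 2)

def rastrigin(x):
    # f4: sum of x_i^2 - 10*cos(2*pi*x_i) + 10, global minimum 0 at the origin (eq. 9)
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def ackley(x):
    # f9: -20*exp(-0.2*sqrt(mean(x^2))) - exp(mean(cos(2*pi*x))) + 20 + e (eq. 14)
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + 20.0 + np.e)
```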
Fig. 1- The shape of Spherical function
Fig. 2- The shape of Six Hump Camel Back function
Fig. 3- The shape of Schwefel function
Fig. 4- The shape of Rastrigin function
Fig. 5- The shape of Hyperellipsoid function
Fig. 6- The shape of Griewank function
Fig. 7- The shape of Easom function
Fig. 8- The shape of Drop Wave function
Fig. 9- The shape of Ackley function
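To make the experimental setup concrete, the sketch below runs both the conventional and the verified update on the Spherical function with the settings reported in this section (20 particles, 100 iterations, w = 0.5, two variables as in Table I). The uniform initialization over the function's domain, the reuse of the same random numbers in both velocity components, and the values c1 = c2 = 2 for the conventional PSO are assumptions of the sketch, since the paper does not state them.

```python
import numpy as np

def sphere(x):
    # f1 from eq. (6); used here as the sample objective
    return np.sum(x ** 2)

def run_pso(verified, n_dim=2, n_particles=20, n_iter=100, w=0.5, seed=0):
    # Settings follow the paper: 20 particles, 100 iterations, w = 0.5.
    rng = np.random.default_rng(seed)
    x = rng.uniform(-100.0, 100.0, (n_particles, n_dim))  # init over the f1 domain (assumed uniform)
    vx = np.zeros_like(x)
    vy = np.zeros_like(x)                                  # second velocity, used by verified PSO only
    pbest = x.copy()
    pbest_val = np.array([sphere(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()

    for _ in range(n_iter):
        r1 = rng.random((n_particles, 1))
        r2 = rng.random((n_particles, 1))
        if verified:
            # Eqs. (2)-(4): cognitive-heavy Vx plus social-heavy Vy
            vx = w * vx + r1 * 1.2 * (pbest - x) + r2 * 0.5 * (g - x)
            vy = w * vy + r1 * 0.5 * (pbest - x) + r2 * 1.2 * (g - x)
            v = vx + vy
        else:
            # Eq. (1): conventional PSO; c1 = c2 = 2 is an assumption, not a paper value
            v = w * vx + r1 * 2.0 * (pbest - x) + r2 * 2.0 * (g - x)
            vx = v
        x = x + v                                          # eq. (5)
        vals = np.array([sphere(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return sphere(g)

print("PSO         :", run_pso(verified=False))
print("verified PSO:", run_pso(verified=True))
```

Because PSO is stochastic, the printed numbers will differ from run to run and from Table I; the sketch only mirrors the structure of the comparison, and setting n_dim=10 mirrors the Table II setting.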
Fig. 10- The shape of Quadric function
Fig. 11- The shape of Rosenbrock function
Fig. 12- The shape of Zakharov function
Fig. 13- The shape of Levy function

TABLE I- THE RESULTS OF PSO AND VERIFIED PSO
(All runs use 100 iterations and a population of 20.)

No. | Test Function        | X1          | X2          | Min Value
1   | Spherical            | -0.0009     | 0.0012      | 2.1920e-06
    |                      | 0.6297e-10  | 0.0317e-10  | 3.9752e-21
2   | Six Hump Camel Back  | 0.0898      | -0.7129     | -1.0316
    |                      | 0.0898      | -0.7127     | -1.0316
3   | Schwefel             | 420.9475    | 420.8334    | -837.9634
    |                      | 420.9655    | 421.1206    | -837.9629
4   | Rastrigin            | -0.3296e-03 | 0.6567e-03  | 1.0710e-04
    |                      | 0.4199e-08  | -0.5571e-08 | 7.1054e-15
5   | Hyperellipsoid       | -0.6027e-03 | -0.3044e-03 | 7.3380e-07
    |                      | -0.0350e-11 | -0.8498e-11 | 7.2713e-23
6   | Griewank             | -0.8326e-03 | -0.8022e-03 | 4.9540e-07
    |                      | 0.8527e-08  | -0.4228e-08 | 0
7   | Easom                | 3.1426      | 3.1405      | -1
    |                      | 3.1416      | 3.1416      | -1.0000
8   | Drop Wave            | -0.5201     | 0.0125      | -0.9362
    |                      | 0.6736e-05  | 0.2012e-05  | -1.0000
9   | Ackley               | -0.2015e-04 | 0.1678e-04  | 7.4182e-05
    |                      | -0.7981e-08 | 0.5752e-08  | 2.7827e-08
10  | Quadric              | 0.0013      | -0.0360     | 0.0417
    |                      | 0.0267      | -0.0132     | 8.9360e-04
11  | Rosenbrock           | 2.9941      | -0.6145     | 1.6166
    |                      | 1.6168      | 1.6168      | 0.7624
12  | Zakharov             | 0.1152      | 0.3038      | -0.1513
    |                      | 0.0147      | -0.0074     | 2.7198e-04
13  | Levy                 | -9.9855     | -9.1711     | 0.0820
In each table, the first row given for a test function denotes the result achieved by PSO and the second row denotes the result achieved by the verified PSO.
IV. CONCLUSION
The results show that by verifying PSO and adding a dimension to the velocities, its power in finding the optimal points of the test functions has been improved. The verified PSO has achieved better results, and it can be used as an optimization method when more accurate results are needed and accuracy is an important factor, especially when a higher-dimensional search space is required.

TABLE II- THE RESULTS OF PSO AND VERIFIED PSO WITH 10 VARIABLES
(All runs use 100 iterations and a population of 20.)

No. | Test Function  | Minimum Value
1   | Spherical      | 0.2798
    |                | 0.0214
2   | Schwefel       | -4.0472e+03
    |                | -4.3446e+03
3   | Rastrigin      | 13.6386
    |                | 12.1230
4   | Hyperellipsoid | 0.1490
    |                | 0.1273
5   | Griewank       | 0.0851
    |                | 0.0442
6   | Easom          | 0
    |                | 0
7   | Ackley         | 1.8651
    |                | 1.6534
8   | Quadric        | 1.7635
    |                | 1.1833
9   | Rosenbrock     | 85.7453
    |                | 74.8754
10  | Zakharov       | 2.7999
    |                | 1.8244
11  | Levy           | 0.0964
    |                | 1.1368e-04

REFERENCES

[1] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2005.
[2] Yuhui Shi and Russell C. Eberhart, Fuzzy Adaptive Particle Swarm Optimization, 2001.
[3] Marcin Molga and Czesław Smutnicki, Test Functions for Optimization Needs.
