
Applied Mathematics and Computation xxx (2007) xxx–xxx


www.elsevier.com/locate/amc

An improved particle swarm optimization algorithm


Yan Jiang a,b,*, Tiesong Hu a, ChongChao Huang c, Xianing Wu a

a State Key Laboratory of Water Resource and Hydropower Engineering Science, Wuhan University, Wuhan 430072, China
b College of Water Sciences, Beijing Normal University, Beijing 100875, China
c School of Mathematics and Statistics, Wuhan University, Wuhan 430072, China

Abstract

An improved particle swarm optimization (IPSO) algorithm is proposed in this paper. In the new algorithm, a population of points is sampled randomly from the feasible space. The population is then partitioned into several sub-swarms, each of which is made to evolve based on the particle swarm optimization (PSO) algorithm. At periodic stages in the evolution, the entire population is shuffled and points are reassigned to sub-swarms to ensure information sharing. This method greatly enhances the exploration and exploitation abilities of the swarm. Simulations on three benchmark test functions show that IPSO possesses a better ability to find the global optimum than the standard PSO algorithm. Both PSO and IPSO are also applied to identify the parameters of a hydrologic model. The results show that IPSO remarkably improves the calculation accuracy and is an effective global optimization method for calibrating hydrologic models.
© 2007 Elsevier Inc. All rights reserved.

Keywords: Particle swarm optimization; Improved particle swarm optimization; Global optimization; Hydrologic model; Parameter calibration

1. Introduction

The particle swarm optimization (PSO) algorithm is a member of the wide category of swarm intelligence methods for solving global optimization problems. It was originally proposed by Kennedy as a simulation of social behavior, and it was initially introduced in 1995 as an optimization method [3,5]. PSO is related to artificial life, specifically to swarming theories, and also to evolutionary computation, especially evolution strategies and genetic algorithms. PSO can be easily implemented and is computationally inexpensive, since its memory and CPU speed requirements are low.
As a swarm intelligence method, PSO is also prone to premature convergence, especially in complex multi-peak search problems, and many researchers have devoted themselves to this issue. Angeline [1] incorporates into PSO an explicit selection mechanism similar to those used in more traditional evolutionary computation. Løvbjerg et al. [6] propose a hybrid PSO combining the traditional velocity and position update rules with the ideas of breeding and subpopulations,

* Corresponding author. E-mail address: lirenjy@sohu.com (Y. Jiang).

0096-3003/$ - see front matter © 2007 Elsevier Inc. All rights reserved.
doi:10.1016/j.amc.2007.03.047


which has the potential to achieve faster convergence and to find better solutions. Parsopoulos and Vrahatis [7,8] introduce function "stretching" into PSO to alleviate the local minima problem. In [9], Parsopoulos and Vrahatis initialize the particle swarm optimizer using the nonlinear simplex method to explore the search space more efficiently and detect better solutions. Higashi and Iba [4] present particle swarm optimization with Gaussian mutation, which has been shown to acquire better results than PSO alone. Inspired by GA, Shi et al. [10] present a hybrid evolutionary algorithm that crosses over PSO and GA and possesses a better ability to find the global optimum than the standard PSO algorithm. Wang and Li [14] integrate PSO and simulated annealing to improve the performance of PSO.
Inspired by [2], we propose an improved particle swarm optimization (IPSO) algorithm in this paper. In IPSO, a population of points is sampled randomly from the feasible space. The population is then partitioned into several sub-swarms, each of which is made to evolve based on PSO. At periodic stages in the evolution, the entire population is shuffled and points are reassigned to sub-swarms to ensure information sharing.
This paper is organized as follows: Section 2 presents a review of PSO. A description of the proposed IPSO algorithm is given in Section 3. Numerical examples illustrating the efficiency of the proposed algorithm are given in Section 4. A case study on hydrologic model calibration is given in Section 5. Finally, Section 6 concludes this paper.

2. Overview of the standard PSO

PSO is a population-based optimization tool, in which the system is initialized with a population of random particles and the algorithm searches for optima by updating generations. Suppose that the search space is $n$-dimensional; then particle $i$ of the swarm can be represented by an $n$-dimensional vector $X_i = (x_{i1}, x_{i2}, \ldots, x_{in})$. The velocity of this particle can be represented by another $n$-dimensional vector $V_i = (v_{i1}, v_{i2}, \ldots, v_{in})$. The fitness of each particle can be evaluated according to the objective function of the optimization problem. The best previously visited position of particle $i$ is denoted its individual best position $P_i = (p_{i1}, p_{i2}, \ldots, p_{in})$. The position of the best individual of the whole swarm is denoted the global best position $G = (g_1, g_2, \ldots, g_n)$. At each step, the velocity of a particle and its new position are assigned according to the following two equations:

$$V_i = \omega V_i + c_1 r_1 (P_i - X_i) + c_2 r_2 (G - X_i), \qquad (1)$$
$$X_i = X_i + V_i, \qquad (2)$$

where $\omega$ is called the inertia weight, which controls the impact of the previous velocity of a particle on its current one; $r_1$ and $r_2$ are independent uniformly distributed random variables on (0,1); and $c_1$ and $c_2$ are positive constants called acceleration coefficients, which control the maximum step size.
In PSO, Eq. (1) is used to calculate a particle's new velocity from its previous velocity and from the distances of its current position to its own best historical position and to the best position of the entire population or its neighborhood. Generally, the value of each component of $V_i$ is clamped to the range $[-v_{\max}, v_{\max}]$ to control excessive roaming of particles outside the search space. The particle then flies toward a new position according to Eq. (2). This process is repeated until a user-defined stopping criterion is reached.
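To make the update rules concrete, the following Python sketch implements Eqs. (1) and (2) for a generic minimization problem. It is only an illustration, not the authors' code: the function signature, the default parameter values, the linearly decreasing inertia weight (0.9 to 0.4) and the choice of $v_{\max}$ equal to the position bound follow the settings described in Section 4, while everything else is our own assumption.

```python
import numpy as np

def pso(f, bounds, swarm_size=40, max_gen=1000, c1=2.0, c2=2.0):
    """Minimize f over box bounds using the standard PSO of Eqs. (1) and (2).

    bounds: array of shape (n, 2) holding the lower/upper limit per dimension.
    """
    rng = np.random.default_rng()
    lo, hi = bounds[:, 0], bounds[:, 1]
    n = len(lo)
    v_max = np.abs(hi)                        # V_max set equal to X_max (Section 4)
    X = rng.uniform(lo, hi, (swarm_size, n))  # positions
    V = rng.uniform(-v_max, v_max, (swarm_size, n))
    P = X.copy()                              # individual best positions P_i
    p_val = np.apply_along_axis(f, 1, X)
    G = P[p_val.argmin()].copy()              # global best position G

    for gen in range(max_gen):
        w = 0.9 - 0.5 * gen / max_gen         # inertia weight, linearly 0.9 -> 0.4
        r1 = rng.random((swarm_size, n))
        r2 = rng.random((swarm_size, n))
        V = w * V + c1 * r1 * (P - X) + c2 * r2 * (G - X)   # Eq. (1)
        V = np.clip(V, -v_max, v_max)         # clamp velocity to control roaming
        X = np.clip(X + V, lo, hi)            # Eq. (2), kept inside the box
        f_val = np.apply_along_axis(f, 1, X)
        better = f_val < p_val                # minimization: smaller is better
        P[better], p_val[better] = X[better], f_val[better]
        G = P[p_val.argmin()].copy()
    return G, p_val.min()
```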

3. Description of IPSO

In IPSO, a population of points is sampled randomly from the feasible space. The population is then partitioned into several sub-swarms (complexes). Each complex independently executes PSO or one of its variants, including the update of the particles' positions and velocities. After a certain number of generations, the sub-swarms are forced to mix and points are reassigned to ensure information sharing. The IPSO strategy is presented below and is illustrated in Fig. 1.

Step 1: Initializing. Select $p \ge 1$ and $m \ge 1$, where $p$ is the number of sub-swarms and $m$ is the number of points in each complex. Compute the sample size $s = pm$. Sample $s$ points $X_1, \ldots, X_s$ in the feasible space and compute the function value $f_i$ at each point $X_i$.


Fig. 1. The flow chart of IPSO.

Step 2: Ranking. Sort the points in order of increasing function value and store them in an array $E = \{X_i, f_i \mid i = 1, \ldots, s\}$.
Step 3: Partitioning. Partition $E$ into $p$ sub-swarms $A^1, A^2, \ldots, A^p$, each containing $m$ points, such that $A^k = \{X_j^k, f_j^k \mid X_j^k = X_{k+p(j-1)}, f_j^k = f_{k+p(j-1)}, j = 1, \ldots, m\}$, $k = 1, \ldots, p$.
Step 4: Evolving. Evolve each complex $A^k$ separately using particle swarm optimization (PSO).
Step 4.1: Initializing. Select $q$ and $T$, where $q$ is the population size of PSO and $T$ is the maximum number of iterated generations.
Step 4.2: Selecting. Choose $q$ distinct points $Y_1^k, \ldots, Y_q^k$ from $A^k$ according to their function values to construct a sub-swarm; better points in $A^k$ have a higher probability of being selected. Store them in $F^k = \{Y_i^k, V_i^k, u_i^k \mid i = 1, \ldots, q\}$, where $V_i^k$ is the velocity of particle $Y_i^k$ and $u_i^k$ is the corresponding function value. Find the best previously visited position $P_i^k$ of each particle and the position $G^k$ of the best individual of the whole sub-swarm.
Step 4.3: Comparing. Compare the function value of each particle $Y_i^k$ with that of $P_i^k$; if $Y_i^k$ is better than $P_i^k$, then set $P_i^k = Y_i^k$. Compare the function value of each particle $Y_i^k$ with that of $G^k$; if $Y_i^k$ is better than $G^k$, then set $G^k = Y_i^k$.
Step 4.4: Renewing. Update the position and velocity of each particle according to Eqs. (1) and (2).
Step 4.5: Iterating. Repeat Step 4.3 and Step 4.4 $T$ times, where $T$ is a user-specified parameter that determines how fast each complex should evolve.
Step 5: Sub-swarm shuffling. Replace $A^1, \ldots, A^p$ back into $E$ and sort $E$ in order of increasing function value.
Step 6: Checking convergence. If the convergence criteria are satisfied, stop. Otherwise, return to Step 3.
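As a rough illustration of Steps 2, 3 and 5, the following Python sketch shows how the ranked population can be dealt into complexes and later pooled and re-ranked. The helper names partition and shuffle are ours, and the point set is assumed to live in NumPy arrays; between two shuffles, each complex would run a PSO routine such as the one sketched in Section 2 for $T$ generations.

```python
import numpy as np

def partition(points, values, p):
    """Step 3: deal the ranked points into p complexes A^1, ..., A^p.

    Complex k takes points k, k+p, k+2p, ... of the sorted arrays, so every
    complex spans the full quality range of the shuffled population.
    """
    return [(points[k::p], values[k::p]) for k in range(p)]

def shuffle(complexes):
    """Steps 2 and 5: pool all complexes and re-rank by increasing value."""
    pts = np.vstack([c[0] for c in complexes])
    vals = np.concatenate([c[1] for c in complexes])
    order = np.argsort(vals)              # increasing function value
    return pts[order], vals[order]
```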

IPSO combines the strengths of particle swarm optimization, competitive evolution and the concept of complex shuffling. It greatly enhances survivability through the sharing of the information gained independently by each complex. In IPSO, each member of a complex is a potential parent with the ability to participate in the process of evolution. A sub-swarm selected from a complex acts like a pair of parents. To ensure that the evolution process is competitive, we require that better parents have a higher probability of contributing to the generation of offspring than worse parents. Finally, each new offspring replaces the worst point of the current sub-swarm, rather than the worst point of the entire population. This ensures that every member participates in the evolution process before being replaced or discarded; thus, none of the information contained in the sample is ignored.

Table 1
Optimization test functions

Name        Formula                                                                              Range              Optimal f
Rosenbrock  $f_1(x) = \sum_{i=1}^{n-1} (100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2)$                   $[-30, 30]^n$      0
Rastrigin   $f_2(x) = \sum_{i=1}^{n} (x_i^2 - 10\cos(2\pi x_i) + 10)$                            $[-5.12, 5.12]^n$  0
Griewank    $f_3(x) = \frac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos(x_i/\sqrt{i}) + 1$  $[-600, 600]^n$    0


4. Numerical experiments

In this section, experiments on three nonlinear benchmark functions that are commonly used in the literature [11,13] are performed. The functions, the admissible ranges of the variables and the optima are summarized in Table 1.
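For reference, the three test functions can be written in Python as follows. This is a straightforward transcription of the formulas in Table 1, with vectorized NumPy implementations chosen for convenience.

```python
import numpy as np

def rosenbrock(x):   # f1, global optimum 0 at x = (1, ..., 1)
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)

def rastrigin(x):    # f2, global optimum 0 at the origin
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def griewank(x):     # f3, global optimum 0 at the origin
    i = np.arange(1, x.size + 1)
    return np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0
```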
To evaluate the performance of the proposed IPSO, the basic PSO is used for comparison. For both methods, the inertia weight $\omega$ decreases linearly from 0.9 to 0.4, as recommended by Shi and Eberhart [12]. The acceleration constants $c_1$ and $c_2$ are both 2.0. $X_{\max}$ and $V_{\max}$ are set equal. For IPSO, the following default values are used: $p = 4$ and $m = 2q$, where $p$ is the number of sub-swarms, $m$ is the number of points in each sub-swarm, and $q$ is the number of particles selected from the points in each sub-swarm.
As in [11], three different dimension sizes are tested for each function: 10, 20 and 30, with the corresponding maximum numbers of generations set to 1000, 1500 and 2000. In addition, different population sizes, namely 20, 40, 80 and 160, are used for each function and dimension. The symmetric initialization method is adopted for population initialization. In IPSO, $G = KT$ and $T = 2n$, where $G$ is the total number of generations, $T$ is the number of generations each sub-swarm executes PSO independently, and $K$ is the number of shuffling stages. We execute the two algorithms in 50 independent runs. The mean fitness values of the best particle found over the 50 runs for the three functions are listed in Tables 2–4. From the tables, it can be seen that IPSO outperforms PSO. Figs. 2–4 show the mean fitness of the best particles found during 2000 generations with 30 dimensions and 80 particles for $f_1$, $f_2$ and $f_3$, respectively. They illustrate that IPSO continues to evolve when PSO has almost stagnated.

5. Application of IPSO to calibrate the Xin’anjiang model

Hydrologic models have spread to many fields of application, such as flood forecasting, water resources estimation, design flood, field drainage, water project programming, hydrological station planning, water quality accounting, etc.

Table 2
Mean function values for the Rosenbrock function

Population size  Dimension  Generation  PSO       IPSO
20               10         1000        42.6162   10.5172
20               20         1500        87.2870   75.7246
20               30         2000        132.5973  99.8039
40               10         1000        24.3512   1.2446
40               20         1500        47.7243   8.7328
40               30         2000        66.6341   14.7301
80               10         1000        15.3883   0.1922
80               20         1500        40.6403   1.5824
80               30         2000        63.4453   1.5364
160              10         1000        11.6283   0.0598
160              20         1500        28.9142   0.4771
160              30         2000        56.6689   0.4491


Table 3
Mean function values for the Rastrigin function

Population size  Dimension  Generation  PSO      IPSO
20               10         1000        5.2062   3.2928
20               20         1500        22.7724  16.4137
20               30         2000        49.2942  35.0189
40               10         1000        3.5697   2.6162
40               20         1500        17.2975  14.8894
40               30         2000        38.9142  27.7637
80               10         1000        2.3835   1.7054
80               20         1500        12.9020  7.6689
80               30         2000        30.0375  13.8827
160              10         1000        1.4418   0.8001
160              20         1500        10.0438  4.2799
160              30         2000        24.5105  11.9521

Table 4
Mean function values for the Griewank function

Population size  Dimension  Generation  PSO     IPSO
20               10         1000        0.0920  0.0784
20               20         1500        0.0317  0.0236
20               30         2000        0.0482  0.0165
40               10         1000        0.0762  0.0648
40               20         1500        0.0227  0.0182
40               30         2000        0.0153  0.0151
80               10         1000        0.0658  0.0594
80               20         1500        0.0222  0.0091
80               30         2000        0.0121  0.0004
160              10         1000        0.0577  0.0507
160              20         1500        0.0215  0.0048
160              30         2000        0.0121  0.0010

A hydrologic model usually contains a number of parameters that are calibrated by comparing the estimated and measured discharge series. For any hydrologic model to have practical utility, it is important to be able to identify proper values for the parameters that govern its functions. The procedure for doing this is called model calibration.

Fig. 2. Evolution of logarithmic average fitness of Rosenbrock function for PSO and IPSO.


Fig. 3. Evolution of logarithmic average fitness of the Rastrigin function for PSO and IPSO.

Fig. 4. Evolution of logarithmic average fitness of the Griewank function for PSO and IPSO.

The Xin'anjiang hydrologic model [15], developed in 1973 by Prof. Zhao, is a conceptual hydrologic model with 17 parameters that need to be calibrated. The ranges of these parameter values and their physical meanings are given in Table 5. In the Xin'anjiang model, the basin is divided into a set of sub-basins; the outflow hydrograph from each sub-basin is first simulated and then routed down the channels to the main basin outlet.

Table 5
The ranges of parameter values and their physical meanings

Parameter  Min   Max    Physical meaning
K          0.2   1.5    The ratio of potential to pan evapotranspiration
WUM        5     20     The mean tension water capacity of the upper soil layer
WLM        60    90     The mean tension water capacity of the lower soil layer
C          0.08  0.18   The evapotranspiration coefficient of the deeper layer
WM         80    170    The areal mean tension water capacity
IMP        0.01  0.05   The ratio of impervious area to the total area of the basin
B          0.1   0.4    The exponent of the tension water capacity distribution
SM         10    50     The free water storage capacity
EX         0.5   2      The exponent of the free water capacity distribution
KG         0.01  0.7    The outflow coefficient of free water storage to the groundwater flow
KSS        0.01  0.7    The outflow coefficient of free water storage to the interflow
KKG        0.95  0.998  The recession constant of groundwater storage
KKSS       0.5   0.9    The recession constant of lower interflow storage
CS         0.1   0.5    The recession constant of channel network storage
KE         0.2   0.5    Muskingum coefficient
XE         1     5      The residence time of water
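As an illustrative sketch, the parameter ranges of Table 5 can be collected into a bounds array and used to draw feasible starting points for the optimizer. The array covers the 16 parameters listed in the table, and the name XAJ_BOUNDS is ours.

```python
import numpy as np

# Lower/upper bounds of the Xin'anjiang parameters listed in Table 5, in the
# order K, WUM, WLM, C, WM, IMP, B, SM, EX, KG, KSS, KKG, KKSS, CS, KE, XE.
XAJ_BOUNDS = np.array([
    [0.2, 1.5], [5, 20], [60, 90], [0.08, 0.18], [80, 170], [0.01, 0.05],
    [0.1, 0.4], [10, 50], [0.5, 2], [0.01, 0.7], [0.01, 0.7], [0.95, 0.998],
    [0.5, 0.9], [0.1, 0.5], [0.2, 0.5], [1, 5],
])

lo, hi = XAJ_BOUNDS[:, 0], XAJ_BOUNDS[:, 1]
x0 = np.random.default_rng().uniform(lo, hi)   # one random feasible parameter set
```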


Fig. 5. Flow chart of the Xin'anjiang model.

The flow chart of the model is shown in Fig. 5. The inputs to the model are rainfall P and potential evapotranspiration EM, and the output is the outlet discharge TQ from the whole basin.

5.1. Objective function and statistical indices

The objective function is selected as formula (3), which is to be minimized:

$$f(x) = \frac{1}{N}\sum_{i=1}^{N}(Q_{obs,i} - Q_{sim,i})^2 \left(1 + \frac{|\bar{Q}_{obs} - \bar{Q}_{sim}|}{\bar{Q}_{obs}}\right), \qquad (3)$$

where $x$ is the vector of parameters to be calibrated, $Q_{obs,i}$ and $Q_{sim,i}$ are the measured and simulated discharges in the calibration period, respectively, $\bar{Q}_{obs}$ and $\bar{Q}_{sim}$ are their means, and $N$ is the length of the time series.
The variable constraint for the problem is $x_{\min} \le x \le x_{\max}$, where $x_{\min}$ and $x_{\max}$ are the lower and upper bounds of the variables.
Two statistical indices are selected to compare the performance of PSO and IPSO: the model efficiency coefficient $R^2$ and the water balance error $RE$. A smaller water balance error and a larger model efficiency coefficient mean that the simulated discharges are closer to the observed discharges:

$$R^2 = 1 - \frac{\sum_{i=1}^{N}(Q_{obs,i} - Q_{sim,i})^2}{\sum_{i=1}^{N}(Q_{obs,i} - \bar{Q}_{obs})^2}, \qquad (4)$$

$$RE = \frac{\bar{Q}_{sim} - \bar{Q}_{obs}}{\bar{Q}_{obs}} \times 100\%. \qquad (5)$$
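As a hedged illustration, the objective function and the two statistical indices might be coded as follows. The array names are ours, and Eq. (5) is implemented as the relative volume bias, which is our reading of the water balance error.

```python
import numpy as np

def objective(q_obs, q_sim):
    """Eq. (3): mean squared error inflated by the relative water-balance bias."""
    bias = abs(q_obs.mean() - q_sim.mean()) / q_obs.mean()
    return np.mean((q_obs - q_sim) ** 2) * (1.0 + bias)

def model_efficiency(q_obs, q_sim):
    """Eq. (4): model efficiency coefficient R^2 (Nash-Sutcliffe form)."""
    return 1.0 - np.sum((q_obs - q_sim) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)

def water_balance_error(q_obs, q_sim):
    """Eq. (5): water balance error RE, read here as the relative volume bias (%)."""
    return (q_sim.mean() - q_obs.mean()) / q_obs.mean() * 100.0
```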

5.2. Experimental results

In order to test the performance of IPSO for Xin'anjiang model calibration, we compare it with PSO. The catchment used is the Huangbohe basin, Hubei province. Five years of daily rainfall and evapotranspiration data, starting from January 1, 1988, are used.

Table 6
The calculation results of PSO and IPSO

Method  Function values                        Statistical indices
        Max     Min     Mean    Deviation      Water balance error (%)  Model efficiency
PSO     55.454  50.649  52.892  1.821          1.047                    0.819
IPSO    50.704  50.650  50.664  0.021          1.032                    0.825


Fig. 6. Runoff simulation results of PSO and IPSO.

In real-life applications the optimization cost is usually dominated by evaluations of the objective function, so the expected number of function evaluations is retained as the main performance criterion of the algorithms. For both methods, we therefore select a population size of 80. Ten runs are made with different initial seeds for the random number generator, resulting in different initial populations of points. Each run consists of 30,000 objective function evaluations. Table 6 shows the maximum, minimum and mean function values of the best particle found over the 10 runs, the standard deviation of the 10 mean function values, the model efficiency and the water balance error. Fig. 6 shows the runoff simulation results of PSO and IPSO in 1991. We can see that both methods simulate the observed discharge well. Fig. 7 shows the mean function values of the best particles of PSO and IPSO. From Table 6 and Fig. 7, it can be seen that both PSO and IPSO keep the water balance error within the allowed range

Fig. 7. Maximum, minimum and mean function values versus the number of objective function evaluations: (a) PSO and (b) IPSO.


(no more than 5%), but IPSO attains a higher model efficiency. The smaller mean function value and standard deviation of IPSO indicate that it is more stable than PSO. Therefore, IPSO performs better than PSO in the Xin'anjiang model calibration.

6. Conclusions

This paper presents a shuffled-complex evolution of the particle swarm optimization algorithm, called IPSO, designed to improve the performance of PSO. In IPSO, a population of points is sampled randomly from the feasible space. The population is then partitioned into several sub-swarms, each of which is made to evolve based on particle swarm optimization (PSO). At periodic stages in the evolution, the entire population is shuffled and points are reassigned to sub-swarms to ensure information sharing. This new method enhances survivability through the sharing of information gained independently by each complex. Three benchmark functions and a case study are examined in the simulation part. The performance comparisons indicate that IPSO is superior to PSO.

References

[1] P.J. Angeline, Using selection to improve particle swarm optimization, Proceedings of the IEEE Congress on Evolutionary
Computation, Anchorage, Alaska, USA (1998) 84–89.
[2] Q.Y. Duan, S. Sorooshian, V.K. Gupta, Effective and efficient global optimization for conceptual rainfall-runoff models, Water
Resources Research 28 (1992) 1015–1031.
[3] R.C. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, Proceedings of the Sixth International Symposium on
Micromachine and Human Science, Nagoya, Japan (1995) 39–43.
[4] N. Higashi, H. Iba, Particle swarm optimization with Gaussian mutation, Proceedings of the IEEE Swarm Intelligence Symposium
2003, Indianapolis, Indiana, USA (2003) 72–79.
[5] J. Kennedy, R.C. Eberhart, Particle swarm optimization, in: Proceedings of IEEE International Conference on Neural Networks,
Piscataway, NJ, 1995, 1942–1948.
[6] M. Løvbjerg, T.K. Rasmussen, T. Krink, Hybrid particle swarm optimiser with breeding and subpopulations, in: Proceedings of the
Genetic and Evolutionary Computation Conference, 2001.
[7] K.E. Parsopoulos, V.P. Plagianakos, G.D. Magoulas, M.N. Vrahatis, Improving particle swarm optimizer by function stretching,
Advances in Convex Analysis and Global Optimization (2001) 445–457.
[8] K.E. Parsopoulos, V.P. Plagianakos, G.D. Magoulas, M.N. Vrahatis, Stretching technique for obtaining global minimizers through
particle swarm optimization, in: Proceedings of the Workshop on Particle Swarm Optimization, Indianapolis, IN, 2001.
[9] K.E. Parsopoulos, M.N. Vrahatis, Initializing the particle swarm optimizer using the nonlinear simplex method, Advances in
Intelligent Systems, Fuzzy Systems, Evolutionary Computation, WSEAS Press, 2002, pp. 216–221.
[10] X. Shi, Y. Lu, C. Zhou, H. Lee, W. Lin, Y. Liang, Hybrid evolutionary algorithms based on PSO and GA, Proceedings of IEEE
Congress on Evolutionary Computation 2003, Canberra, Australia (2003) 2393–2399.
[11] Y. Shi, R.C. Eberhart, Empirical study of particle swarm optimization, in: Proceedings of the IEEE Congress on Evolutionary
Computation, Piscataway, NJ, 1999, 1945–1950.
[12] Y. Shi, R.C. Eberhart, A modified particle swarm optimizer, Proceedings of the IEEE Congress on Evolutionary Computation,
Piscataway, NJ (1998) 69–73.
[13] I.C. Trelea, The particle swarm optimization algorithm: convergence analysis and parameter selection, Information Processing Letters
85 (2003) 317–325.
[14] X.H. Wang, J.J. Li, Hybrid particle swarm optimization with simulated annealing, Proceedings of the Third International Conference
on Machine Learning and Cybernetics, Shanghai (2004) 2402–2405.
[15] R.J. Zhao, The Xinanjiang model applied in China, Journal of Hydrology (1992) 371–381.

