Abstract—Acceleration coefficients are the key parameters of the particle swarm optimization (PSO) algorithm; they control the movement of particles by modifying the cognitive and social components. Several variants have been proposed that modify the acceleration coefficients to improve the convergence speed of PSO in continuous search spaces. By contrast, little attention has been paid to improving the convergence speed of binary particle swarm optimization (BPSO). Moreover, BPSO treats all particles equally, ignoring the dispersion of particles in the search space. To address this issue, we propose a fitness-based acceleration coefficients Novel BPSO, called FAC-NBPSO. In the proposed algorithm, the fitness of each particle is used to modify its cognitive and social components. The performance of the proposed algorithm is tested on four benchmark test functions. The experimental results show that the proposed algorithm outperforms the compared algorithms with improved convergence speed.

Keywords—PSO; BPSO; convergence speed; acceleration coefficients.

I. INTRODUCTION

In recent years, an increasing number of complex optimization problems have emerged in various fields of science and technology. To solve such problems, researchers have proposed a number of evolutionary algorithms. Kennedy and Eberhart [1] introduced continuous particle swarm optimization (CPSO), inspired by the social behavior of bird flocks and fish schools. In CPSO, each individual in the swarm represents a particle that holds a potential solution to the problem. CPSO executes by initializing the particles randomly in the search space. All particles in the swarm communicate with each other and adjust their positions. On every iteration, the personal best position of a particle is determined by its own best position, while the global best position is determined by the best position found in the swarm. Many CPSO variants have been proposed to solve continuous optimization problems; however, CPSO is ineffective for solving binary optimization problems.

To solve binary optimization problems, Kennedy and Eberhart proposed BPSO [2] in 1997. BPSO represents each particle with a binary string of 0s and 1s, and each particle changes its position by selecting either 0 or 1. BPSO has also been successfully applied to many binary problems.

Acceleration coefficients are key parameters of CPSO that significantly affect its performance [3]. A number of modifications of the acceleration coefficients have been proposed to enhance the performance of CPSO. Zhengjia and Jianzhong [4] proposed a novel self-adaptive strategy for the individual inertia weight and the social acceleration coefficient. An adaptive PSO proposed by Tang and Zhang [3] introduced exponential time-varying acceleration coefficients that reduce the cognitive component and increase the social component over time. In [5], the acceleration coefficients change dynamically over the iterations. Ren et al. [6] proposed a fitness-feedback PSO that adjusts the inertia weight and acceleration coefficients dynamically in a non-deterministic way. Ma et al. [7] proposed a modified PSO with dynamic acceleration coefficients. A new self-adaptive PSO that defines the acceleration coefficients in terms of fitness was proposed by Dong et al. [8].

In this paper, we propose a fitness-based acceleration coefficients novel BPSO (FAC-NBPSO) algorithm to improve the performance of NBPSO [9]. The proposed algorithm modifies the cognitive and social components of the velocity update equation on the basis of the fitness of each particle.
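The CPSO update described above can be sketched as follows. Since the paper's own equations (1) and (2) do not appear in this excerpt, the standard textbook velocity and position equations are used, and the parameter values are illustrative defaults rather than the paper's settings.

```python
import numpy as np

def cpso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One textbook CPSO iteration for a single particle.

    v(t+1) = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
    x(t+1) = x + v(t+1)
    """
    if rng is None:
        rng = np.random.default_rng()
    r1 = rng.random(x.shape)  # random factor for the cognitive component
    r2 = rng.random(x.shape)  # random factor for the social component
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v_new, v_new
```

Each particle keeps its personal best (pbest), while the swarm shares the global best (gbest), matching the communication scheme described above.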
\[
x_{ij}(t+1) =
\begin{cases}
1, & \text{if } Rd_{ij} < S\!\left(v_{ij}(t+1)\right),\\
0, & \text{otherwise,}
\end{cases}
\tag{3}
\]

where Rd_{ij} is a random number chosen from the range [0, 1].

In recent years, a number of BPSO variants have been proposed to achieve better performance and to overcome the convergence problems associated with BPSO. Nezamabadi-pour et al. [10] proposed a new BPSO by defining probability functions to change velocity and position. Cervantes et al. [11] applied BPSO to classification problems. A novel BPSO was proposed in [9] that reinterprets the velocity and solves the problem of selecting a proper value of the inertia weight. A modified BPSO was proposed by Lee et al. [12] that adopts the genotype–phenotype concept and the mutation operator of genetic algorithms to optimize binary problems. Jeong et al. [13] proposed a quantum-inspired BPSO to address unit commitment problems by applying quantum computing concepts. An experiment was performed by Singh et al. [14] to address discrete optimization problems with a hybrid of BPSO and genetic crossover. Zhang et al. [15] developed a BPSO-EA feature selection method with an ENN classifier.

IV. FITNESS-BASED ACCELERATION COEFFICIENTS NBPSO (FAC-NBPSO) ALGORITHM

The problem with the NBPSO algorithm [9] is that it uses fixed acceleration coefficients for all particles in the swarm, which leads the swarm to converge slowly. To overcome this problem, we propose the FAC-NBPSO algorithm, which adopts fitness-based acceleration coefficients [8] to accelerate the convergence of NBPSO. Based on the fitness of each particle, the proposed FAC-NBPSO algorithm accelerates the convergence of NBPSO by modifying the cognitive and social components.

The purpose of increasing c1 and decreasing c2 for each particle with higher fitness in the population is to enhance the convergence of NBPSO.

FAC-NBPSO modifies c1 and c2 adaptively in terms of the fitness rank FR of each particle as:

\[
c_1 = 1.5 - (1 - FR)^2, \qquad c_2 = 0.01 \cdot (1 - FR)^3.
\tag{8}
\]
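Equation (8) can be sketched as follows. The fitness rank FR is assumed here to be normalized to [0, 1] with the fittest particle at FR = 1; the exact normalization is not spelled out in the text, but this choice gives fitter particles a larger c1 and a smaller c2, as intended.

```python
import numpy as np

def fac_coefficients(fitness, higher_is_better=True):
    """Per-particle acceleration coefficients from fitness rank, eq. (8).

    Assumes FR = rank / (n - 1) in [0, 1], with the fittest particle
    at FR = 1; this normalization is an illustrative assumption.
    """
    n = len(fitness)
    order = np.argsort(fitness)      # ascending fitness values
    if not higher_is_better:         # for minimization, invert the order
        order = order[::-1]
    ranks = np.empty(n, dtype=float)
    ranks[order] = np.arange(n)      # worst -> 0, fittest -> n - 1
    fr = ranks / (n - 1)
    c1 = 1.5 - (1.0 - fr) ** 2       # grows toward 1.5 for fit particles
    c2 = 0.01 * (1.0 - fr) ** 3      # shrinks toward 0 for fit particles
    return c1, c2
```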
To compute these coefficients, the FAC-NBPSO sorts all the particles in the population with respect to their fitness and then assigns a fitness rank to each particle. The FAC-NBPSO ranks the particle with the greatest fitness value first and the particles with smaller fitness values later. This improves the convergence speed.

The procedure of FAC-NBPSO is given below:

I. Initialize the particle swarm with random positions within the hypercube (particles take binary values 0 or 1).
II. Calculate the fitness of every particle using its present position.
III. Get pbest by comparing the fitness of each particle to its best fitness. Set the present position as the pbest position if the fitness value at the current position is better than that of its best position.
IV. Get gbest by comparing the fitness of each particle to the best fitness within the population. Set the present position as the gbest position if its fitness value is better than that of the best position.
V. Sort and rank all the particles according to their fitness values.
VI. Calculate the acceleration coefficients for each particle in terms of fitness using equation (8).
VII. Update each particle's velocity using equations (4) and (5).
VIII. Update the velocity of change of bits using equation (6).
IX. Update the position of each particle by generating a random number in the range [0, 1] and applying equation (7).
X. Go to step II and repeat until convergence occurs.

To illustrate the algorithm more clearly, its flow chart is given below.

V. EXPERIMENTAL SETUP

Benchmark test functions are utilized to validate and compare the characteristics and performance of optimization algorithms, such as convergence rate, robustness, precision, and general performance. The performance of the FAC-NBPSO algorithm, which modifies the cognitive and social components based on the fitness of each particle, is measured by comparing it with the NBPSO algorithm. The experiments are carried out on the minimization of the test functions.

The dimension is denoted by N, the domain of the problem by lb ≤ x_i ≤ ub, and the optimal solution by f(x*) = f(x_1*, ..., x_N*), where lb and ub are the lower and upper bounds of the variables, respectively. The four benchmark functions with zero global minimum used in the experiments on FAC-NBPSO are:

I. Sphere function

\[
f_1(x) = \sum_{i=1}^{N} x_i^2.
\tag{9}
\]

The Sphere function is continuous, differentiable, separable, scalable, and unimodal; the domain of the problem is 0 ≤ x_i ≤ 10. The optimal solution is x* = (0, ..., 0), f(x*) = 0.

II. Rosenbrock function

\[
f_2(x) = \sum_{i=1}^{N-1} \left( 100\left(x_{i+1} - x_i^2\right)^2 + \left(x_i - 1\right)^2 \right).
\tag{10}
\]

The Rosenbrock function is continuous, differentiable, non-separable, scalable, and unimodal; the domain of the problem is −30 ≤ x_i ≤ 30. The optimal solution is x* = (1, ..., 1), f(x*) = 0.
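The ten steps above can be sketched end-to-end as follows, using the Sphere function of eq. (9) restricted to binary strings. This is a minimal sketch under stated assumptions: the NBPSO-specific update equations (4)-(7) are not reproduced in this excerpt, so a standard BPSO sigmoid position update in the spirit of eq. (3) stands in for them, and the fitness rank is normalized so that the fittest particle has FR = 1.

```python
import numpy as np

def sphere(x):
    """Sphere function, eq. (9)."""
    return float(np.sum(x ** 2))

def fac_bpso(n_particles=20, n_bits=10, iters=100, w=0.5, seed=0):
    rng = np.random.default_rng(seed)
    # Step I: random binary positions in the hypercube
    x = rng.integers(0, 2, (n_particles, n_bits)).astype(float)
    v = np.zeros((n_particles, n_bits))
    # Steps II-III: initial fitness and personal bests (minimization)
    pbest = x.copy()
    pbest_f = np.array([sphere(p) for p in x])
    # Step IV: global best
    g = int(np.argmin(pbest_f))
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    for _ in range(iters):
        # Steps V-VI: rank particles (fittest = lowest value -> FR = 1)
        ranks = np.empty(n_particles)
        ranks[np.argsort(pbest_f)] = np.arange(n_particles)[::-1]
        fr = ranks / (n_particles - 1)
        c1 = 1.5 - (1.0 - fr) ** 2          # eq. (8)
        c2 = 0.01 * (1.0 - fr) ** 3
        # Step VII: velocity update (standard BPSO form stands in for the
        # paper's eqs. (4)-(6), which are not reproduced in this excerpt)
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = (w * v + c1[:, None] * r1 * (pbest - x)
                   + c2[:, None] * r2 * (gbest - x))
        # Step IX: sigmoid position update in the spirit of eq. (3)
        x = (rng.random(x.shape) < 1.0 / (1.0 + np.exp(-v))).astype(float)
        # Back to step II: re-evaluate and update the bests
        f = np.array([sphere(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = int(np.argmin(pbest_f))
        if pbest_f[g] < gbest_f:
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    return gbest, gbest_f
```

Swapping the rank-based c1 and c2 for fixed values such as those in Table I would give a fixed-coefficient baseline in the spirit of the algorithms the paper compares against.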
TABLE I. PARAMETER SETTINGS

Sr. no | Algorithm | Parameter | Value
1 | FAC-NBPSO | w | 0.5
  |           | c1 | 1.5 − (1 − FR)^2
  |           | c2 | 0.01 * (1 − FR)^3
2 | NBPSO | w | 0.5
  |       | c1 & c2 | 1.0
3 | BPSO | w | 0.9 to 0.1
  |      | c1 & c2 | 2

TABLE II. EXPERIMENTAL RESULTS FOR GLOBAL BEST USING SPHERE FUNCTION FOR MINIMIZATION IN 10 RUNS OF THE ALGORITHM

Dimension | FAC-NBPSO | NBPSO [6] | BPSO [6] | BPSO [6]
N = 3 | 2.0860*10^-9 | 2.0860*10^-9 | 0.003 | 0.0277
N = 5 | 7.4*10^-3 | 0.0099 | 0.2113 | 0.1503
N = 10 | 2.0860*10^-9 | 0.0519 | 0.8282 | 1.0254

TABLE III. EXPERIMENTAL RESULTS FOR GLOBAL BEST USING ROSENBROCK FUNCTION FOR MINIMIZATION IN 10 RUNS OF THE ALGORITHM

Dimension | FAC-NBPSO | NBPSO [6] | BPSO [6] | BPSO [6]
N = 3 | 0.1875 | 0.5164 | 0.9384 | 0.8645
N = 5 | 1.2275 | 2.5162 | 1406 | 3746.5
N = 10 | 8.8127 | 367.83 | 1.3094*10^6 | 1.52321*10^6

TABLE IV. EXPERIMENTAL RESULTS FOR GLOBAL BEST USING GRIEWANK FUNCTION FOR MINIMIZATION IN 10 RUNS OF THE ALGORITHM

TABLE V. EXPERIMENTAL RESULTS FOR GLOBAL BEST USING RASTRIGIN FUNCTION FOR MINIMIZATION IN 10 RUNS OF THE ALGORITHM

Rastrigin | 4.5109*10^-7 | 4.5109*10^-7 | 4.5109*10^-7 | 4.5109*10^-7

The next experiment was carried out on dimension 5 for all four test functions with 500, 200, 100, and 50 iterations. The experimental results are shown in Table VII; compared with the results for 1000 iterations presented in Tables II–V, they demonstrate that the convergence speed of the proposed FAC-NBPSO algorithm reduces for all test functions as the number of iterations is reduced.
TABLE VII. EXPERIMENTAL RESULTS FOR GLOBAL BEST USING THE FOUR TEST FUNCTIONS WITH DIFFERENT ITERATIONS IN 10 RUNS OF THE FAC-NBPSO ALGORITHM (N = 5)

Functions | 500 | 200 | 100 | 50
Sphere | 1.1369*10^-8 | 1.1369*10^-8 | 2.9559*10^-8 | 7.7603*10^-6
Rosenbrock | 2.7871 | 3.2948 | 4.4074 | 10.0468
Griewank | 0.0100 | 0.0252 | 0.0103 | 0.0621
Rastrigin | 4.5109*10^-7 | 4.5109*10^-7 | 4.5109*10^-7 | 4.5109*10^-7

The next experiment was carried out on dimension 10 for all four test functions with 500, 200, 100, and 50 iterations. The experimental results are shown in Table VIII; compared with the results for 1000 iterations presented in Tables II–V, they demonstrate that the convergence speed of the proposed FAC-NBPSO algorithm reduces for all test functions as the number of iterations is reduced.

TABLE VIII. EXPERIMENTAL RESULTS FOR GLOBAL BEST USING THE FOUR TEST FUNCTIONS WITH DIFFERENT ITERATIONS IN 10 RUNS OF THE FAC-NBPSO ALGORITHM (N = 10)

Functions | 500 | 200 | 100 | 50
Sphere | 2.2737*10^-8 | 4.0609*10^-6 | 0.0105 | 1.6026
Rosenbrock | 8.7830 | 8.8185 | 14.1053 | 476.3454
Griewank | 0.0148 | 0.0718 | 0.0651 | 0.2358
Rastrigin | 4.5109*10^-7 | 4.5109*10^-7 | 4.5109*10^-7 | 4.5109*10^-7

The experimental results of the proposed FAC-NBPSO algorithm shown in Tables VI–VIII illustrate that particles run for 1000 iterations converge quickly and efficiently, because they get a higher chance to explore the search space. When the iterations are reduced from 1000 to 500, 200, 100, and 50, the convergence speed reduces, as there is less chance for the particles to explore the search space. A comparison of the experimental results is also presented graphically in Figs. V–VII. It is concluded that, based on fitness, the proposed FAC-NBPSO algorithm first ranks the particle with the greatest fitness value and later ranks the particles with smaller fitness values. The proposed FAC-NBPSO algorithm attains a great improvement over NBPSO by modifying the cognitive and social components. This improves the convergence speed and contributes considerably to the performance of the NBPSO algorithm [9]. At the same time, the uniformity of the convergence speed enhances the robustness and stability of the FAC-NBPSO algorithm.

VIII. CONCLUSION

In our work, a fitness-based acceleration coefficient strategy is introduced into Novel BPSO to enhance its convergence speed. Unlike NBPSO, where fixed acceleration coefficients are adopted for all particles, ignoring the dispersion of particles in the search space, in the proposed FAC-NBPSO all particles are ranked according to their fitness and the cognitive and social components are modified accordingly. Experiments are performed on four benchmark test functions to evaluate the performance of FAC-NBPSO. The findings affirm the improved performance of the proposed FAC-NBPSO over the compared algorithms in terms of convergence speed. To validate the improved convergence speed of FAC-NBPSO, additional experiments are carried out for the four benchmark test functions with different numbers of iterations. The experimental results demonstrate that FAC-NBPSO performs better not only at 1000 iterations; it still improves the convergence speed when the iterations are reduced from 1000 to 500, 200, 100, and 50.

REFERENCES

[1] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proc. IEEE Int. Conf. Neural Networks, 1995, pp. 1941-1948.
[2] J. Kennedy and R. C. Eberhart, "A discrete binary version of the particle swarm algorithm," in Proc. IEEE Int. Conf. Systems, Man, and Cybernetics, 1997, pp. 4104-4108.
[3] T. Ziyu and Z. Dingxue, "A modified particle swarm optimization with adaptive acceleration coefficients," in Proc. Asia-Pacific Conf. Information Processing (APCIP), 2009, pp. 330-332.
[4] Z. Wu and J. Zhou, "A self-adaptive particle swarm optimization algorithm with individual coefficients adjustment," in Proc. Int. Conf. Computational Intelligence and Security, 2007, pp. 133-136.
[5] C. Banerjee and R. Sawal, "PSO with dynamic acceleration coefficient based on multiple constraint satisfaction: implementing fuzzy inference system," in Proc. Int. Conf. Advances in Electronics, Computers and Communications (ICAECC), 2014, pp. 1-5.
[6] R. Huifeng, X. Jun, and H. Guyu, "Fitness feedback based particles swarm optimization," in Proc. 34th Chinese Control Conference (CCC), 2015, pp. 2673-2677.
[7] G. Ma, R. Gong, Q. Li, and G. Yao, "An improved particle swarm optimization algorithm with dynamic acceleration coefficients," Bulletin of Electrical Engineering and Informatics, vol. 5, pp. 489-494, 2016.
[8] C. Dong, Z. Chen, and S. Sun, "The acceleration coefficients self-adapting in PSO," International Journal of Digital Content Technology and its Applications, vol. 7, p. 672, 2013.
[9] M. A. Khanesar, M. Teshnehlab, and M. A. Shoorehdeli, "A novel binary particle swarm optimization," in Proc. Mediterranean Conf. Control & Automation (MED), 2007, pp. 1-6.
[10] H. Nezamabadi-pour, M. Rostami-Shahrbabaki, and M. Maghfoori-Farsangi, "Binary particle swarm optimization: challenges and new solutions," CSI J. Comput. Sci. Eng., vol. 6, pp. 21-32, 2008.
[11] A. Cervantes, I. M. Galván, and P. Isasi, "Binary particle swarm optimization in classification," 2005.
[12] S. Lee, S. Soak, S. Oh, W. Pedrycz, and M. Jeon, "Modified binary particle swarm optimization," Progress in Natural Science, vol. 18, pp. 1161-1166, 2008.
[13] Y.-W. Jeong, J.-B. Park, S.-H. Jang, and K. Y. Lee, "A new quantum-inspired binary PSO: application to unit commitment problems for power systems," IEEE Transactions on Power Systems, vol. 25, pp. 1486-1495, 2010.
[14] D. Singh, V. Singh, and U. Ansari, "Binary particle swarm optimization with crossover operation for discrete optimization," International Journal of Computer Applications, vol. 28, pp. 15-20, 2011.
[15] N. Zhang, J. Xiong, J. Zhong, and L. Thompson, "Feature selection method using BPSO-EA with ENN classifier," in Proc. 8th Int. Conf. Information Science and Technology (ICIST), 2018, pp. 364-369.
[16] A. Zabidi, I. M. Yassin, N. M. Tahir, Z. I. Rizman, and M. Karbasi, "Comparison between binary particle swarm optimization (BPSO) and binary artificial bee colony (BABC) for nonlinear autoregressive model structure selection of chaotic data," Journal of Fundamental and Applied Sciences, vol. 9, no. 3S, pp. 730-754, 2017.
[17] J. Wei, R. Zhang, Z. Yu, R. Hu, J. Tang, C. Gui, and Y. Yuan, "A BPSO-SVM algorithm based on memory renewal and enhanced mutation mechanisms for feature selection," Applied Soft Computing, vol. 58, pp. 176-192, 2017.