
Evaluation Performance Study of Firefly Algorithm, Particle Swarm Optimization and Artificial Bee Colony Algorithm for Non-Linear Mathematical Optimization Functions

Siddharth Agarwal, M-Tech (CS), GGSIPU, New Delhi, India, siddharth1089@gmmail.com
Amrit Pal Singh (Asst. Prof.), GGSIPU, New Delhi, India, amritpal1986@gmail.com
Nitin Anand, M-Tech (IS), AIACTR, New Delhi, India, proudtobeanindiannitin@gmail.com

Abstract - This paper reviews state-of-the-art nature-inspired metaheuristic algorithms for optimization, including the Firefly algorithm (FA), the PSO algorithm and the ABC algorithm. By implementing these algorithms in Matlab, we use worked examples to show how each algorithm works. Many noisy non-linear mathematical optimization problems can be effectively solved by metaheuristic algorithms. Mathematical optimization, or programming, is the study of such planning and design problems using mathematical tools, and computer simulation has become an indispensable means of solving such optimization problems with various efficient search algorithms. Nature-inspired algorithms are among the most powerful algorithms for optimization. The Firefly algorithm is a recent metaheuristic and swarm intelligence algorithm inspired by the flashing behaviour of fireflies in nature: it simulates the flash pattern and characteristics of fireflies. In this context, three types of metaheuristics, the Artificial Bee Colony (ABC) algorithm, Particle Swarm Optimization (PSO) and the Firefly algorithm, were applied to find optimal solutions of noisy non-linear continuous mathematical models. A series of computational experiments using each algorithm was conducted. The simulation results of these experiments were analyzed and compared to the best solutions found so far; on each noisy non-linear optimization function the Firefly algorithm appears to perform better and more efficiently.

Keywords: ABC algorithm, Firefly algorithm, Metaheuristic algorithm, PSO.

1. Introduction

Most conventional or classic algorithms are deterministic. For example, the simplex method in linear programming is deterministic. Some deterministic optimization algorithms use gradient information; they are called gradient-based algorithms. Deterministic algorithms are almost all local search algorithms: they are quite efficient, but they aim at finding local optima. Metaheuristics, by contrast, are applied to problems classified as NP-hard or NP-complete by the theory of computational complexity. Metaheuristic algorithms such as the Firefly algorithm, PSO, ACO and BCO aim to find the optimal solution [11, 12]. Nature-inspired metaheuristics are currently among the most powerful tools for optimization. The Firefly algorithm (FA) is a recent nature-inspired technique that has commonly been used for solving non-linear optimization problems. Proposed by Xin-She Yang at Cambridge University, it is a novel metaheuristic inspired by the behaviour of fireflies. There are an estimated two thousand firefly species, and most of them produce short, rhythmic flashes. The primary purpose of a firefly's flash is to act as a signal system to attract other fireflies. The algorithm is therefore based on the behaviour of social insects: in social insect colonies each individual seems to have its own agenda, and yet the group behaves as if working on a common agenda and is highly organized. Algorithms based on nature have been demonstrated to solve difficult optimization problems effectively and efficiently. A swarm is a group of multi-agent systems, such as fireflies, in which simple agents coordinate their activities to solve complex problems.

Particle swarm optimization (PSO) is a population-based stochastic optimization technique developed by Dr. Eberhart and Dr. Kennedy in 1995, inspired by the social behaviour of bird flocking or fish schooling while searching for food. Each particle keeps track of its coordinates in the problem space, which are associated with the best solution (fitness) it has achieved so far. The goal is to discover solutions that are not dominated by any other in the objective space of the optimization problem. The particle swarm optimization concept consists of, at each time step, changing the velocity of (accelerating) each particle toward its pbest and lbest locations. Each particle moves through the search space in a cooperative search procedure. Acceleration is weighted by a random term, with separate random numbers being generated for acceleration toward the pbest and lbest locations. The group of possible solutions is a set of particles, called a swarm.
Artificial Bee Colony (ABC) optimization is a metaheuristic algorithm developed by D. Karaboga in 2005. In ABC, a population-based algorithm, the position of a food source represents a possible solution to the optimization problem, and the nectar amount of a food source corresponds to the quality (fitness) of the associated solution. The number of employed bees is equal to the number of solutions in the population. In this algorithm, forager bees are allocated to different food sources so as to maximize the total nectar intake. The colony has to optimize the overall efficiency of nectar collection; the allocation of bees thus depends on many factors, such as the nectar richness and the proximity to the hive. In the first step, a randomly distributed initial population (food source positions) is generated. After initialization, the population is subjected to repeated cycles of the search processes of the employed, onlooker and scout bees, respectively [5]. An employed bee produces a modification of the source position in her memory and discovers a new food source position.

In this paper we use three metaheuristic algorithms, PSO, ABC and the Firefly algorithm, to provide solutions. The multiple agents, called particles, swarm around the search space starting from some initial random guess.
2. Firefly algorithm

A Firefly Algorithm (FA) is a metaheuristic algorithm inspired by the flashing behaviour of fireflies. The primary purpose of a firefly's flash is to act as a signal system to attract other fireflies. The firefly algorithm simulates the flash pattern and characteristics of fireflies. Most fireflies produce short, rhythmic flashes and have different flashing behaviour [3]. Fireflies use these flashes for communication and for attracting potential prey. The firefly algorithm performs a powerful local search in which brightness is associated with the objective function, but it may sometimes become trapped in local optima and thus fail to search well globally. Two parameters of the algorithm, the attractiveness coefficient and the randomization coefficient, play an important role in locating the optimal solution in the search space: their values are crucially important in determining the speed of convergence and the behaviour of the FA. The parameters of the firefly algorithm may not change over time during the iterations. The firefly algorithm follows three basic idealized rules when one firefly moves toward another [12]:

- All fireflies are unisex, and they will move towards more attractive and brighter ones regardless of their sex.
- The degree of attractiveness of a firefly is proportional to its brightness. The brightness decreases as the distance from the other fireflies increases, because the air absorbs light. If there is no firefly brighter or more attractive than a particular one, it moves randomly.
- The brightness or light intensity of a firefly is determined by the value of the objective function of the given problem.

By using these three rules it is possible to achieve the optimum value of the objective function.
2.1 Variation of Light Intensity and Attractiveness

In the firefly algorithm there are two important points: the variation of light intensity and the formulation of attractiveness. For optimization problems, a firefly with higher light intensity will attract fireflies with lower intensity. The distance rij is the distance between firefly i and firefly j. Light intensity decreases with the distance from its source, and light is also absorbed by the medium. The light intensity I(r) varies according to the inverse-square law:

I(r) = Is / r^2    (1)

where Is is the intensity at the source. For a given medium with a fixed light absorption coefficient γ, the light intensity I varies with the distance r as

I = I0 e^(-γr)    (2)

where I0 is the initial light intensity.

Each firefly has a distinctive attractiveness, which determines how strongly it attracts other members of the swarm. The attractiveness varies with the distance between firefly i and firefly j. Since a firefly's attractiveness is proportional to the light intensity seen by adjacent fireflies, the attractiveness function is defined as

β = β0 e^(-γr^2)    (3)

where β0 is the attractiveness at r = 0 and γ is the light absorption coefficient. For a fixed γ, the characteristic length becomes

Γ = γ^(-1/m) → 1, as m → ∞.    (4)

The distance between any two fireflies i and j, at positions xi and xj respectively, is the Cartesian distance

rij = ||xi - xj||    (5)

The movement of a firefly i attracted to another, more attractive (brighter) firefly j is determined by

xi = xi + β0 e^(-γ rij^2) (xj - xi) + α εi    (6)

where the second term is due to the attraction and the third term is randomization, with α being the randomization parameter and εi a vector of random numbers drawn from a Gaussian or uniform distribution. In the simplest form, εi can be replaced by (rand - 1/2), where rand is a random number generator uniformly distributed in [0, 1]. For most implementations we can take β0 = 1 and α in [0, 1].

2.2 Algorithm

The pseudo code can be summarized as:

Input: Create an initial population of n fireflies within a d-dimensional search space xi,k, i = 1, 2, ..., n and k = 1, 2, ..., d.
Evaluate the fitness of the population f(xi,k), which is directly proportional to the light intensity Ii,k.
Algorithm parameters: β0, γ, α.
1) Objective function f(x), x = (x1, x2, ..., xd).
2) For maximization problems, I is proportional to f(x), or simply I = f(x).
Repeat
    for i = 1 to n
        for j = 1 to n
            if (Ij > Ii)
                Move firefly i toward j in d dimensions
            end if
            Attractiveness varies with distance r via exp[-γ r^2]
            Evaluate new solutions and update light intensity
        end for j
    end for i
    Rank the fireflies and find the current best
until stop condition is met
end
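As a sketch, the nested loop above together with the movement rule of eq. (6) can be written compactly. Python is used here for illustration (the paper's experiments were run in Matlab); the parameter defaults and the choice of γ scaled to the search range are our own assumptions, not values taken from the paper.

```python
import numpy as np

def firefly(objective, dim=2, n=25, bounds=(-5.0, 5.0), beta0=1.0,
            gamma=0.1, alpha=0.2, iterations=100, seed=0):
    """Sketch of the firefly loop for a maximization problem."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n, dim))           # firefly positions x_i
    intensity = np.array([objective(p) for p in x])  # I_i = f(x_i)
    for _ in range(iterations):
        for i in range(n):
            for j in range(n):
                if intensity[j] > intensity[i]:      # j brighter: move i toward j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)      # eq. (3)
                    x[i] = (x[i] + beta * (x[j] - x[i])     # eq. (6)
                            + alpha * (rng.random(dim) - 0.5))
                    x[i] = np.clip(x[i], lo, hi)
                    intensity[i] = objective(x[i])   # update light intensity
    best = int(np.argmax(intensity))                 # rank and take current best
    return x[best], intensity[best]
```

Note that the brightest firefly never moves (no other firefly outshines it), so the best solution found can only improve over the iterations.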

3. PSO algorithm

Particle swarm optimization (PSO) is an evolutionary computational intelligence technique: a population-based global optimization method proposed by Kennedy and Eberhart in 1995 [3]. It is inspired by the social behaviour of birds flocking while searching for food. PSO is a population-based search procedure. The individual agents of the swarm behave without supervision, and each agent's behaviour is stochastic, governed by its perception of the neighbourhood. Each particle in the swarm represents a solution in a high-dimensional space with four vectors: its current position, the best position it has found so far, the best position found by its neighbourhood so far, and its velocity. The particle adjusts its position in the search space based on the best position reached by itself (pbest) and the best position reached by its neighbourhood (gbest). The pseudo code is summarized as:

Objective function f(x), x = (x1, x2, ..., xp)^T
Initialize locations xi and velocities vi of n particles
Find g* from min{f(x1), ..., f(xn)} (at t = 0)
While (criterion)
    t = t + 1
    for loop over all n particles and all d dimensions
        generate new velocity using vk^(t+1) = vk^t + c1 r1 (xk* - xk^t) + c2 r2 (g* - xk^t)
        calculate new location using xk^(t+1) = xk^t + vk^(t+1)
        Find the current best for each particle xi*
    end for
    Find the current global best g*
End while
Output the final results xi* and g*

where xk^t represents the particle position, vk^t the particle velocity, xk* the best remembered (personal best) position, c1 and c2 the cognitive and social parameters, and r1, r2 are random numbers between 0 and 1.
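The update loop above can be sketched as follows. Python is used for illustration rather than the paper's Matlab; the inertia weight w, and the defaults for c1, c2 and the swarm size, are common practical choices (Clerc-style constriction values), not parameters taken from the paper.

```python
import numpy as np

def pso(objective, dim=2, n=30, bounds=(-5.0, 5.0), c1=1.49445,
        c2=1.49445, w=0.729, iterations=200, seed=0):
    """Sketch of the PSO loop for a minimization problem."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n, dim))                # particle positions
    v = np.zeros((n, dim))                           # particle velocities
    pbest = x.copy()                                 # best position per particle
    pbest_val = np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()           # global best g*
    for _ in range(iterations):
        r1 = rng.random((n, dim))                    # separate random numbers
        r2 = rng.random((n, dim))                    # for each attraction term
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)  # velocity update
        x = np.clip(x + v, lo, hi)                   # position update
        vals = np.array([objective(p) for p in x])
        better = vals < pbest_val                    # refresh personal bests
        pbest[better] = x[better]
        pbest_val[better] = vals[better]
        g = pbest[np.argmin(pbest_val)].copy()       # refresh global best
    return g, float(pbest_val.min())
```

Without an inertia weight (or a velocity clamp), the velocities of the original 1995 formulation tend to grow unboundedly, which is why the damped form is used in this sketch.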


4. ABC Algorithm

The artificial bee colony (ABC) optimization algorithm was first proposed by D. Karaboga in 2005. In the ABC algorithm, the bees in a colony are divided into three groups: employed bees, onlooker bees and scouts. It is assumed that there is only one artificial employed bee for each food source. In other words, the number of employed bees in the colony is equal to the number of food sources around the hive. Employed bees go to their food source, come back to the hive and dance in the dance area. The employed bee whose food source has been abandoned becomes a scout and starts to search for a new food source. Provided that the nectar of a new source is higher than that of the previous one, the bee memorizes the new position and forgets the old one. Onlookers watch the dances of the employed bees and choose food sources depending on the dances, with the probability value pi associated with food source i calculated by the following expression:

pi = fiti / Σ fitn, where the sum runs from n = 1 to SN

where fiti is the fitness value of solution i evaluated by its employed bee, which is proportional to the nectar amount of the food source at position i, and SN is the number of food sources, which is equal to the number of employed bees [13]. In this way, the employed bees exchange their information with the onlookers. The number of employed bees is equal to the number of food sources [9, 10]. The pseudo code is summarized as:

Objective function f(x), x = (x1, x2, ..., xn)^T
Encode f(x) into virtual nectar levels
Define dance routine (strength, direction) or protocol
While (condition)
    for loop over all n dimensions
        Generate new solutions and evaluate the best solution found so far
    end for
    Communicate and update the optimal solution set
end while
Decode and output the best results
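The paper's pseudo code is high level; a more concrete sketch following Karaboga's standard employed/onlooker/scout scheme (with the usual one-dimension neighbour move and a stagnation limit for scouts, both our assumptions rather than details from this paper) could look like this, again in Python rather than Matlab:

```python
import numpy as np

def abc(objective, dim=2, sn=20, bounds=(-5.0, 5.0),
        limit=50, iterations=200, seed=0):
    """Sketch of the ABC cycle for a minimization problem."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    food = rng.uniform(lo, hi, (sn, dim))        # food source positions
    val = np.array([objective(p) for p in food])
    trials = np.zeros(sn, dtype=int)             # stagnation counters

    def fitness(v):                              # nectar amount fit_i
        return 1.0 / (1.0 + v) if v >= 0 else 1.0 + abs(v)

    def try_neighbour(i):                        # one-dimension neighbour move
        k = rng.integers(sn - 1)
        k = k if k < i else k + 1                # random partner k != i
        j = rng.integers(dim)                    # one random dimension
        cand = food[i].copy()
        cand[j] += rng.uniform(-1, 1) * (food[i][j] - food[k][j])
        cand = np.clip(cand, lo, hi)
        cv = objective(cand)
        if cv < val[i]:                          # greedy selection
            food[i], val[i], trials[i] = cand, cv, 0
        else:
            trials[i] += 1

    for _ in range(iterations):
        for i in range(sn):                      # employed bee phase
            try_neighbour(i)
        fit = np.array([fitness(v) for v in val])
        p = fit / fit.sum()                      # onlooker probabilities p_i
        for i in rng.choice(sn, size=sn, p=p):   # onlooker bee phase
            try_neighbour(i)
        worst = int(np.argmax(trials))           # scout phase: abandon a
        if trials[worst] > limit:                # stagnated source
            food[worst] = rng.uniform(lo, hi, dim)
            val[worst] = objective(food[worst])
            trials[worst] = 0
    best = int(np.argmin(val))
    return food[best], val[best]
```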

5. Noisy Non-Linear Mathematical Functions

Consider the following non-linear mathematical models without constraints (taken from [6]). In order to demonstrate how the firefly, PSO and ABC algorithms work, we have implemented them in Matlab and the results are shown below. To show that both the global optima and local optima can be found simultaneously, we use the following mathematical functions.

5.1 Four Peak Function
f1(x,y) = e^(-(x-4)^2-(y-4)^2) + e^(-(x+4)^2-(y-4)^2) + 2[e^(-x^2-y^2) + e^(-x^2-(y+4)^2)]
Fig 1: 2D Matlab plot for Four Peak Function

5.2 Camelback Function
f2(x,y) = 10 - log(x^2*(4 - 2.1*x^2 + (1/3)*x^4) + x*y + 4*y^2*(y^2 - 1))
Fig 2: 2D Matlab plot for Camelback Function

5.3 Goldstein-Price Function
f3(x,y) = 10 + log(1/((1 + ((1+x+y)^2)*(19 - 14*x + 3*x^2 - 14*y + 6*x*y + 3*y^2)) * (30 + ((2*x-3*y)^2)*(18 - 32*x + 12*x^2 + 48*y - 36*x*y + 27*y^2))))
Fig 3: 2D Matlab plot for Goldstein-Price Function

5.4 Parabolic Function
f4(x,y) = 12 - (x^2 + y^2)/100
Fig 4: 2D Matlab plot for Parabolic Function

5.5 Axis Parallel Hyper-Ellipsoid Function
f5(x,y) = x^2 + 2*y^2
Fig 5: 2D Matlab plot for Axis Parallel Hyper-Ellipsoid Function

5.6 Styblinski Function
f6(x,y) = 275 - (((x^4 - 16*x^2 + 5*x)/2) + ((y^4 - 16*y^2 + 5*y)/2) + 3)
Fig 6: 2D Matlab plot for Styblinski Function

5.7 Rastrigin Function
f7(x,y) = 80 - (20 + x^2 + y^2 - 10*(cos(2*pi*x) + cos(2*pi*y)))
Fig 7: 2D Matlab plot for Rastrigin Function

5.8 Rosenbrock Function
f8(x,y) = 70*(((20 - ((1 - x/(-7))^2 + ((y/6) + (x/7)^2)^2)) + 150)/170) + 10
Fig 8: 2D Matlab plot for Rosenbrock Function

5.9 Rosenbrock's Valley
f9(x,y) = 100*(y - x^2)^2 + (1 - x)^2
Fig 9: 2D Matlab plot for Rosenbrock's Valley Function

6. Comparative Study of Firefly Algorithm, PSO and ABC Algorithm for Noisy Non-Linear Optimization Problems
The three algorithms were run on a dual-core processor on the mathematical non-linear optimization functions above and the results were evaluated. The results are not fixed and keep varying depending on the computer's processor speed and RAM. Each algorithm was run with different parameter settings, such as the number of swarm particles, fireflies or bees and their intensity of work. Based on these parameters, a comparison study was made.
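As an illustrative sketch, two of the Section 5 benchmarks, the Four Peak function f1 and the Rastrigin variant f7, can be coded directly from their formulas; Python is used here, while the paper's implementation was in Matlab.

```python
import math

def four_peak(x, y):
    """f1: two peaks of height ~1 near (4, 4) and (-4, 4),
    and two stronger peaks of height ~2 near (0, 0) and (0, -4)."""
    return (math.exp(-(x - 4)**2 - (y - 4)**2)
            + math.exp(-(x + 4)**2 - (y - 4)**2)
            + 2 * (math.exp(-x**2 - y**2)
                   + math.exp(-x**2 - (y + 4)**2)))

def rastrigin(x, y):
    """f7: highly multimodal; the 80 - (...) form follows the paper."""
    return 80 - (20 + x**2 + y**2
                 - 10 * (math.cos(2 * math.pi * x)
                         + math.cos(2 * math.pi * y)))
```

Any of the three algorithm sketches can then be handed such a function as its objective (negated where the sketch minimizes but the benchmark is a maximization surface).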

 

f(x,y)    No. of agents   PSO      FFA      ABC

          15              3.5222   3.3245   3.4564
          20              4.0012   3.8767   3.9899
          25              5.1221   4.8911   4.9900

          15              3.1123   2.6768   2.9899
          20              4.0011   3.1321   3.8912
          25              4.8976   4.7656   4.7888

          15              2.3224   2.2867   2.3566
          20              3.1786   2.9110   3.2121
          25              3.6545   3.5528   3.6298

f1(x,y)   15              3.6536   3.5470   3.5565
          20              3.0455   2.9121   2.9011
          25              4.4678   4.3564   4.4111

f2(x,y)   15              4.4516   4.3989   4.4232
          20              4.6232   4.2112   4.2000
          25              5.3232   5.2989   5.3298

f3(x,y)   15              6.2483   5.9923   6.3669
          20              6.7131   6.4878   6.5655
          25              7.0867   6.7776   7.1998

          15              2.5467   2.5211   2.3121
          20              2.9981   2.8800   2.8912
          25              3.6534   3.3121   3.3452

          15              1.7892   1.3221   1.9867
          20              1.9665   1.7534   1.7600
          25              2.5341   2.1876   2.3322

          15              3.1235   3.0011   3.1000
          20              3.6671   3.4111   3.5232
          25              3.9912   3.5121   3.8999

7. Conclusion and Future work

FFA tends to perform better, especially on functions with multiple peaks. The complexity or difficulty level of the functions had no effect on FFA, as expected, except on the Camelback function. FFA and ABC sometimes perform nearly the same at the various levels of complexity. Sometimes the ABC algorithm runs worse than PSO because of its fitness-function calculation. However, the execution time in each replication is dramatically higher when the algorithms are compared, especially when both local and global optima are found. The parameters can be adjusted to suit various problems of different scales.
References

[1] Christian Blum, Maria Jose Blesa Aguilera, Andrea Roli, Michael Sampels, Hybrid Metaheuristics: An Emerging Approach to Optimization, Springer, 2008.
[2] Hajo Broersma, "Application of the Firefly Algorithm for Solving the Economic Emissions Load Dispatch Problem", Hindawi Publishing Corporation, International Journal of Combinatorics, Volume 2011.
[3] I. De Falco, A. D. Cioppa, E. Tarantino, "Facing classification problems with particle swarm optimization", Applied Soft Computing 7 (3) (2007) 652-658.
[4] J. Kennedy, R. Eberhart, "Particle Swarm Optimization", Proceedings of IEEE International Conference on Neural Networks, IV, pp. 1942-1948, 1995.
[5] D. Karaboga, "An Idea Based on Honey Bee Swarm for Numerical Optimization", Technical Report TR06, Erciyes University, Turkey, 2005.
[6] N. Chai-ead, P. Aungkulanon, and P. Luangpaiboon, Member, IAENG, "Bees and Firefly Algorithms for Noisy Non-Linear Optimization Problems", International Multiconference of Engineers and Computer Scientists, 2011.
[7] Sh. M. Farahani, A. A. Abshouri, B. Nasiri, and M. R. Meybodi, "A Gaussian Firefly Algorithm", International Journal of Machine Learning and Computing, Vol. 1, December 2011.
[8] Rania Hassan, Babak Cohanim, Olivier de Weck, "A Comparison of the Particle Swarm Optimization Algorithm and the Genetic Algorithm", AIAA, 2004.
[9] S. Nakrani and C. Tovey, "On honey bees and dynamic server allocation in internet hosting centres", Adaptive Behaviour, 12, 233-240, 2004.
[10] Weng Kee Wong, "Nature-Inspired Metaheuristic Algorithms for Generating Optimal Experimental Designs", 2011.
[11] Xiang-yin Meng, Yu-long Hu, Yuan-hang Hou, Wen-quan Wang, "The Analysis of Chaotic Particle Swarm Optimization and the Application in Preliminary Design of Ship", International Conference on Mechatronics and Automation, August 2010.
[12] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, 2008.
[13] Dervis Karaboga, Bahriye Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm", Springer, April 13, 2007.
