
Performance of a Hybrid EA-DE-Memetic Algorithm on CEC 2011 Real World Optimization Problems

Hemant Kumar Singh
School of Engineering and IT
University of New South Wales
Australian Defence Force Academy
ACT 2601, Canberra, Australia
Telephone: +61 2 626 88479
Email: h.singh@adfa.edu.au

Tapabrata Ray
School of Engineering and IT
University of New South Wales
Australian Defence Force Academy
ACT 2601, Canberra, Australia
Telephone: +61 2 626 88479
Email: t.ray@adfa.edu.au

Abstract—Evolutionary Algorithms (EAs), in their traditional form or in combination as in Memetic Algorithms (MAs), have been quite successful in solving a variety of optimization problems in the past. More recently, several excellent Differential Evolution (DE) based algorithms have been proposed which have had outstanding success in IEEE Congress on Evolutionary Computation (CEC) competition problems. Inspired by previous studies, we propose an algorithm combining the strengths of EA, DE and MA in this paper. The algorithm starts with a population of random solutions and generates a child population through either EA or DE based evolution with equal probability. Local search is then performed from one of the solutions in the population to further improve the objective value. To avoid stagnation, re-initialization of the population is performed whenever the local search is unable to improve the objective value for a prescribed number of consecutive generations. The performance of the proposed algorithm is presented in this paper for the newly introduced real world optimization problems of the CEC 2011 competition.

I. INTRODUCTION

Optimization forms an indispensable component of a number of real world problems, including engineering design, scheduling, parameter estimation, control, visual graphics, and many more. Consequently, the development of fast and efficient optimization algorithms is an actively pursued research area. To aid this research, a number of benchmark test problems have been suggested in the past for evaluating and comparing the performance of various optimization algorithms. For the IEEE Congress on Evolutionary Computation (CEC) 2011, a set of real world optimization problems has been compiled by Das and Suganthan [1]. This paper studies the performance of a newly proposed memetic algorithm by the authors on this set of real world optimization problems.

In the last two decades or so, population based stochastic algorithms have gained popularity as generic optimizers. Their ability to handle highly non-linear, discontinuous or non-differentiable functions, and to deliver the solutions (Pareto front) of a multi-objective problem in a single run, gives them an advantage over conventional optimization methods for application to real world problems. The most widely studied of these are Evolutionary Algorithms (EAs), which are inspired by Darwinian evolution in nature. Since their inception by John Holland [2], EAs have been extensively studied and applied to various problems. Meanwhile, they have also inspired the development of other population based heuristic search methods, such as Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), Differential Evolution (DE), etc.

Currently, the Non-dominated Sorting Genetic Algorithm II (NSGA-II) [3] is arguably the most widely used canonical EA for optimization. As the name suggests, it uses non-dominated sorting of the solutions for ranking. Additionally, diversity is maintained using a crowding distance mechanism. For generating the offspring population, NSGA-II uses Simulated Binary Crossover (SBX) and Polynomial Mutation (though other operators can also be used).

Recent studies have demonstrated excellent performance of Differential Evolution (DE) when used with EAs [4], [5]. Looking at the history of the IEEE CEC competitions on evolutionary optimization over the last few years, the dominance of DE-based methods can be clearly observed [6]. DE makes exploratory moves in the search space using the difference between parameter vectors, unlike traditional EAs which usually use probabilistic crossover operators between parents.

At the same time, hybridization of global search with local search has also been emphasized in the literature [7], [8], [9], [10], [11]. By using local search, specific regions of the search space can be explored using fewer evaluations, and good quality solutions can be generated early during the search. At the same time, the global search ensures diversity of the population and generates good starting solutions for the local search. This hybrid approach is commonly referred to as a Memetic Algorithm (MA).

The algorithm presented in this paper attempts to combine the advantages of the above three approaches, viz., EA, DE and MA.



The evolution of the solutions is done using traditional EA and DE strategies, and at each generation, a local search is invoked from a promising solution in an attempt to further improve the solutions. Additional provisions are employed to avoid stagnation, including occasional local search from a random solution, and re-initialization of the population in case there is no improvement for a prescribed number of generations. The performance of the algorithm is then studied and reported on the set of 14 problems (22 including different instances) of the CEC 2011 competition.

The rest of the paper is organized as follows. The proposed algorithm is described in Section II. The performance of the proposed algorithm is reported in Section III. To conclude, a summary of the findings of the paper is presented in Section IV.

II. PROPOSED ALGORITHM

The proposed algorithm is a population based method incorporating two kinds of evolutionary mechanisms, either of which is used with equal probability during a generation for creating offspring. In addition, a local search is invoked from a promising solution in the population in each generation. An outline of the proposed algorithm is given in Algorithm 1. The main steps of the proposed algorithm are described in the following subsections.

Algorithm 1 Proposed algorithm
Require: Population size (N), number of generations (N_G), crossover and mutation parameters.
 1: pop_1 = Initialize()
 2: Evaluate(pop_1)
 3: for i = 2 to N_G do
 4:    if rand(0,1) <= 0.5 then
 5:       childpop_i = Evolve_EA(pop_{i-1})
 6:    else
 7:       childpop_i = Evolve_DE(pop_{i-1})
 8:    end if
 9:    Evaluate(childpop_i)
10:    pop_i = Rank(pop_{i-1} + childpop_i)   {retain the best N solutions}
11:    if rand(0,1) <= 0.2 then
12:       x = random solution from pop_i
13:    else
14:       x = Choose_start_x(pop_i)   {select a solution as described in Section II-D}
15:    end if
16:    x_best = Local_search(x)   {x_best is the best solution found by the local search from x}
17:    Replace the worst solution in pop_i with x_best
18:    pop_i = Rank(pop_i)   {rank the solutions in pop_i again}
19:    if the local search has not improved the objective value for K consecutive generations then
20:       pop_i = Re-initialize()   {retain the best solution while re-initializing}
21:    end if
22: end for
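For concreteness, the following is a minimal Python sketch of the main loop of Algorithm 1. It is an illustrative reconstruction, not the authors' Matlab implementation; the helper routines (initialize, evolve_ea, evolve_de, rank, local_search) are hypothetical names sketched alongside the corresponding subsections below, and the generation count is an assumed stand-in for the 150000-evaluation budget used in the experiments.

```python
import numpy as np

def hybrid_ea_de_ma(f, lb, ub, pop_size=200, generations=100, K=10, seed=0):
    """Illustrative sketch of Algorithm 1 (not the authors' Matlab code).

    f      -- objective function to be minimized
    lb, ub -- arrays with lower/upper bounds of the variables
    The helpers initialize, evolve_ea, evolve_de, rank and local_search are
    sketched in the examples accompanying Sections II-A to II-D below.
    """
    rng = np.random.default_rng(seed)
    pop = initialize(lb, ub, pop_size, rng)                 # Section II-A
    fit = np.array([f(x) for x in pop])
    pop, fit = rank(pop, fit, pop_size)
    best_x, best_f = pop[0].copy(), fit[0]
    stagnation = 0

    for gen in range(1, generations):
        # EA-like or DE-like offspring generation with equal probability (II-B)
        if rng.random() <= 0.5:
            child = evolve_ea(pop, lb, ub, rng)
        else:
            child = evolve_de(pop, lb, ub, rng)
        child_fit = np.array([f(x) for x in child])

        # keep the best pop_size of the combined parent+child population (II-C)
        pop, fit = rank(np.vstack([pop, child]),
                        np.concatenate([fit, child_fit]), pop_size)

        # local search from a random solution (prob. 0.2) or the current best (II-D)
        # (simplified: Section II-D also falls back to a random start when the
        #  previous local search failed to improve the best solution)
        i = rng.integers(pop_size) if rng.random() <= 0.2 else 0
        x_ls, f_ls = local_search(f, pop[i], lb, ub)

        # the local-search result replaces the worst solution in the population
        pop[-1], fit[-1] = x_ls, f_ls
        pop, fit = rank(pop, fit, pop_size)

        # book-keeping for stagnation and re-initialization (II-E)
        if f_ls < best_f:
            best_x, best_f, stagnation = x_ls.copy(), f_ls, 0
        else:
            stagnation += 1
        if stagnation >= K:
            pop = initialize(lb, ub, pop_size, rng)
            pop[0] = best_x                  # retain the best solution found so far
            fit = np.array([f(x) for x in pop])
            stagnation = 0
    return best_x, best_f
```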
A. Initialization

As is common practice, the population is initialized using random sampling from the solution space. The value of each variable is sampled between its lower and upper bound using a uniform probability distribution.
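A minimal sketch of this uniform initialization (the hypothetical initialize helper used in the loop sketch above):

```python
import numpy as np

def initialize(lb, ub, pop_size, rng):
    """Sample pop_size solutions uniformly between the variable bounds."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    return rng.uniform(lb, ub, size=(pop_size, lb.size))
```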
B. Evolution

In the proposed algorithm, two different evolution schemes are used with equal probability for generating the offspring population. These are:

1) EA-like evolution – This includes Simulated Binary Crossover (SBX) and Polynomial Mutation (an illustrative code sketch of both operators is given at the end of this subsection). SBX [12] is done as shown in Eq. (1):

\[
y_i^1 = 0.5\big[(1+\beta_{qi})\,x_i^1 + (1-\beta_{qi})\,x_i^2\big], \qquad
y_i^2 = 0.5\big[(1-\beta_{qi})\,x_i^1 + (1+\beta_{qi})\,x_i^2\big]
\tag{1}
\]

where \beta_{qi} is calculated as

\[
\beta_{qi} =
\begin{cases}
(2u_i)^{1/(\eta_c+1)}, & \text{if } u_i \le 0.5,\\[2pt]
\left(\dfrac{1}{2(1-u_i)}\right)^{1/(\eta_c+1)}, & \text{if } u_i > 0.5,
\end{cases}
\tag{2}
\]

and where u_i is a uniform random number in the range [0, 1) and \eta_c is a user defined parameter, the Distribution Index for Crossover. The Polynomial Mutation operator used in this study [13] is defined in Eq. (3):

\[
y_i = x_i + (x_i^{u} - x_i^{l})\,\bar{\delta}_i
\tag{3}
\]

where x_i^{u} and x_i^{l} are the upper and lower bounds of the i-th variable, and \bar{\delta}_i is calculated as

\[
\bar{\delta}_i =
\begin{cases}
(2r_i)^{1/(\eta_m+1)} - 1, & \text{if } r_i < 0.5,\\[2pt]
1 - [2(1-r_i)]^{1/(\eta_m+1)}, & \text{if } r_i \ge 0.5,
\end{cases}
\tag{4}
\]

where r_i is a uniform random number in the range [0, 1) and \eta_m is a user defined parameter, the Distribution Index for Mutation.

2) DE-like evolution – This includes the DE mutation and exponential crossover, as described in [6], [14] (an illustrative code sketch is also given at the end of this subsection). In DE, mutation forms a donor vector for each parent (target) vector of the current generation by adding the scaled difference of two randomly chosen population members to a third one:

\[
\vec{V}_{i,G} = \vec{X}_{r_1^i,G} + F\,\big(\vec{X}_{r_2^i,G} - \vec{X}_{r_3^i,G}\big)
\tag{5}
\]

where \vec{X}_{r_1^i,G}, \vec{X}_{r_2^i,G}, \vec{X}_{r_3^i,G} are three distinct parameter vectors sampled randomly from the population, whose indices correspond to r_1^i, r_2^i, r_3^i, and F is a scalar value, typically in the interval [0.4, 1).

Following the mutation, the donor vector undergoes crossover with the target vector \vec{X}_{i,G} to form the trial vector \vec{U}_{i,G} = [u_{1,i,G}, u_{2,i,G}, \ldots, u_{D,i,G}]. In the present study, exponential crossover is used. In exponential crossover, two integers, n and L, are chosen in the interval [1, D]. The integer n acts as a starting point in the target vector, from where the exchange of components with the donor vector starts, whereas L denotes the number of components the donor vector contributes to the target vector. The trial vector is obtained as

\[
u_{j,i,G} =
\begin{cases}
v_{j,i,G}, & \text{for } j = \langle n\rangle_D,\ \langle n+1\rangle_D,\ \ldots,\ \langle n+L-1\rangle_D,\\
x_{j,i,G}, & \text{for all other } j \in [1, D],
\end{cases}
\]

where the angular brackets \langle\cdot\rangle_D denote the modulo function with modulus D. A parameter called the Crossover rate (Cr) controls this exchange: it determines the probability with which each additional consecutive component is copied from the donor vector.

A more detailed discussion of the above evolution schemes can be found in [3], [6].
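As referenced in item 1) above, the following is a small Python sketch of the EA-like operators (SBX followed by polynomial mutation), written to match Eqs. (1)-(4). It is an illustration, not the authors' implementation; the default parameter values are taken from Table I, and evolve_ea is the hypothetical helper assumed in the earlier loop sketch.

```python
import numpy as np

def sbx_pair(x1, x2, eta_c, rng):
    """Simulated Binary Crossover on one pair of parents, Eqs. (1)-(2)."""
    u = rng.random(x1.size)
    beta = np.where(u <= 0.5,
                    (2.0 * u) ** (1.0 / (eta_c + 1.0)),
                    (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta_c + 1.0)))
    y1 = 0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2)
    y2 = 0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2)
    return y1, y2

def polynomial_mutation(x, lb, ub, eta_m, p_mut, rng):
    """Polynomial mutation, Eqs. (3)-(4), applied variable-wise with prob. p_mut."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    r = rng.random(x.size)
    delta = np.where(r < 0.5,
                     (2.0 * r) ** (1.0 / (eta_m + 1.0)) - 1.0,
                     1.0 - (2.0 * (1.0 - r)) ** (1.0 / (eta_m + 1.0)))
    y = x + (ub - lb) * delta
    y = np.where(rng.random(x.size) < p_mut, y, x)   # mutate only some variables
    return np.clip(y, lb, ub)

def evolve_ea(pop, lb, ub, rng, eta_c=15, eta_m=20, p_cross=0.9, p_mut=0.1):
    """Create an offspring population using SBX + polynomial mutation."""
    child = pop.copy()
    idx = rng.permutation(len(pop))
    for a, b in zip(idx[0::2], idx[1::2]):           # crossover on random pairs
        if rng.random() < p_cross:
            child[a], child[b] = sbx_pair(pop[a], pop[b], eta_c, rng)
    for k in range(len(child)):
        child[k] = polynomial_mutation(child[k], lb, ub, eta_m, p_mut, rng)
    return np.clip(child, np.asarray(lb, float), np.asarray(ub, float))
```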

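Similarly, a sketch of the DE-like scheme referenced in item 2), i.e., the DE/rand/1 mutation of Eq. (5) followed by exponential crossover. Again this is an illustrative reconstruction; the defaults F = 0.9 and Cr = 0.9 are the values listed in Table I, and evolve_de is the hypothetical helper used in the loop sketch.

```python
import numpy as np

def evolve_de(pop, lb, ub, rng, F=0.9, Cr=0.9):
    """DE mutation (Eq. 5) followed by exponential crossover."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    n_pop, dim = pop.shape
    child = np.empty_like(pop)
    for i in range(n_pop):
        # three distinct indices, all different from the target index i
        r1, r2, r3 = rng.choice([k for k in range(n_pop) if k != i], 3, replace=False)
        donor = pop[r1] + F * (pop[r2] - pop[r3])          # Eq. (5)

        # exponential crossover: copy L consecutive components (modulo dim)
        trial = pop[i].copy()
        n = rng.integers(dim)                              # random starting index
        L = 0
        while True:
            trial[(n + L) % dim] = donor[(n + L) % dim]
            L += 1
            if L >= dim or rng.random() >= Cr:
                break
        child[i] = np.clip(trial, lb, ub)
    return child
```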
C. Ranking and reduction

Since the test problems studied in this paper are formulated as single-objective, unconstrained minimization problems (constrained problems use penalty functions), the ranking is done by simply sorting the objective values in ascending order. The best N solutions from the combined (parent + child) population form the next generation.
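A sketch of this ranking-and-reduction step (the hypothetical rank helper assumed in the loop sketch above); for constrained problems the objective values are assumed to already include the penalty term:

```python
import numpy as np

def rank(pop, fit, n_keep):
    """Sort solutions by objective value (ascending) and keep the best n_keep."""
    order = np.argsort(fit)[:n_keep]
    return pop[order], fit[order]
```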
D. Local search

At each generation, in addition to the generation of new solutions using recombination and mutation, a local search is used for further improvement. Sequential Quadratic Programming (SQP) [15] is used for the local search in the present study. For certain generations (chosen with a probability of 0.2), the starting solution is chosen randomly from the population in order to maintain diversity. Otherwise, the starting solution is selected from the population in the following way:

• If the local search from a solution was able to improve the solution in the previous generation, then the current best solution is used as the starting solution for the local search. This is done because an improvement through the local search in the previous generation suggests a possible scope for further improvement through the local search.

• If the local search was unable to improve the best solution in the previous generation, it is evident that the existing best solution (in the previous generation) is either not a good starting solution for the local search, or close enough to an optimum (either local or global) such that further improvements are difficult. Therefore, in such a case, a random solution is selected from the population in an attempt to explore a different region for obtaining better objective values.

After performing the local search, the worst solution in the population is replaced by the best solution found by the local search.
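A possible realization of this step is sketched below. The authors used an SQP routine from their Matlab environment; here SciPy's SLSQP (an SQP-type method) is used as a stand-in, and the 50 x nvar evaluation cap per invocation (Section III-A) is imposed with a simple counting wrapper. This is an assumption about how one might implement it, not the authors' code; local_search is the hypothetical helper assumed earlier.

```python
import numpy as np
from scipy.optimize import minimize

class _BudgetExhausted(Exception):
    """Raised when the local-search evaluation budget is used up."""

def local_search(f, x0, lb, ub, evals_per_var=50):
    """SQP-style local search from x0, capped at evals_per_var * nvar evaluations."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    x0 = np.asarray(x0, float)
    budget = evals_per_var * x0.size
    best = {"x": x0.copy(), "f": f(x0)}
    count = [1]                                  # the evaluation above counts

    def wrapped(x):
        if count[0] >= budget:
            raise _BudgetExhausted
        count[0] += 1
        fx = f(x)
        if fx < best["f"]:                       # remember the best point seen
            best["x"], best["f"] = np.asarray(x, float).copy(), fx
        return fx

    try:
        minimize(wrapped, x0, method="SLSQP",
                 bounds=list(zip(lb, ub)),
                 options={"maxiter": 200, "ftol": 1e-9})
    except _BudgetExhausted:
        pass                                     # budget spent; keep best seen so far
    return best["x"], best["f"]
```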
E. Re-initialization

In order to prevent the algorithm from stagnating at local optima, the population is re-initialized if there is no improvement in the objective value for more than K (= 10) generations. The best solution found during the search is preserved in the population while re-initializing.
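This corresponds to the final block of the loop sketch given after Algorithm 1; a compact, hypothetical helper for the same step could look as follows:

```python
import numpy as np

def reinitialize(best_x, lb, ub, pop_size, rng):
    """Re-sample the population uniformly while retaining the incumbent best."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    pop = rng.uniform(lb, ub, size=(pop_size, len(best_x)))
    pop[0] = best_x                  # preserve the best solution found so far
    return pop
```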
III. RESULTS ON CEC 2011 REAL WORLD PROBLEMS

A. Experimental setup

Twenty-five runs of the proposed MA are performed on each of the test problems posed in the CEC 2011 competition [1]. The parameters used for the algorithm are the same for each problem, i.e., no tuning of parameters is done across the problems. The parameters are listed in Table I. A maximum of 50 x nvar function evaluations is allotted to the local search within each generation, where nvar is the number of variables in the problem.

TABLE I: Parameters used for the proposed algorithm

Parameter                                      Value
Population size                                200
Max. FES                                       150000
Crossover probability (SBX / Exponential)      0.9
Crossover index (SBX)                          15
Mutation probability                           0.1
Mutation index (Polynomial)                    20
Scale factor F (DE mutation)                   0.9

B. PC configuration

All the runs are performed on a cluster with DL140G3 5110 NHP Sata compute nodes, with the following configuration:
1) Processor - Dual-core Intel Xeon 5110
2) RAM - 4 GB
3) Operating system - Red Hat Linux
The proposed algorithm is implemented in Matlab 2009a.

C. Summary of results

The intermediate results for the problems (at 50000 and 100000 evaluations) are shown in Tables II and III respectively, whereas the statistics for the final objective values are shown in Table IV. The median value reported is the 13th value in the sorted list of objective values obtained across the 25 runs. Since the exact optima of most of these problems are not available, it is difficult to gauge the absolute performance of the algorithm at this stage. However, relative performance can be evaluated once the results of other algorithms on these problems become available.

IV. SUMMARY AND FUTURE WORK

In this paper, a hybrid memetic algorithm is presented. The algorithm attempts to combine the advantages of three well known approaches used for optimization – EA, DE and MA. In the proposed algorithm, the offspring are generated using EA and DE strategies with equal probability. Thereafter, the solutions in the population are further improved by invoking a local search during each generation using SQP. To prevent the algorithm from stagnating, the local search is occasionally started from a random solution. In addition, if the local search is not able to improve the solutions for a prescribed number of generations, then the population is re-initialized, while retaining the best solution found from the search until then. The performance of the proposed algorithm is studied on a set of 14 real world problems compiled for the CEC 2011 competition. The performance can be judged in greater detail once the results of the CEC 2011 competition are available.

TABLE II: Performance of the proposed algorithm after 5 x 10^4 evaluations
Problem Median Best Mean Worst S.D.
Problem 01 10.1675 1.16742e-11 7.60227 15.4409 6.02057
Problem 02 -23.1711 -28.4225 -23.0199 -16.5337 3.56379
Problem 03 1.15149e-05 1.15149e-05 1.15149e-05 1.15149e-05 1.17446e-18
Problem 04 15.0378 13.8542 15.4119 20.878 1.9663
Problem 05(1) -35.97 -36.9286 -35.2036 -31.6664 1.53468
Problem 05(2) -23.0059 -29.1661 -24.8334 -19.5118 3.05057
Problem 06 0.575118 0.5 0.584744 0.799613 0.0779346
Problem 07 220 220 220 220 0
Problem 08 107.477 14.781 989.755 5784.92 1592.49
Problem 09 -21.1363 -21.6104 -18.949 -12.2778 3.24434
Problem 10(1) 2.51557e+06 1.31554e+06 2.81229e+06 5.80252e+06 1.08889e+06
Problem 10(2) 2.67636e+07 2.3794e+07 2.6536e+07 2.87631e+07 999447
Problem 11(1) 15457.9 15446.3 15458.5 15471.4 8.27191
Problem 11(2) 19243.9 18875.5 19296.9 19678.8 184.361
Problem 11(3) 33016 32915.4 33021.7 33163.3 71.0478
Problem 11(4) 355731 343909 357899 382565 9666.42
Problem 11(5) 2.70822e+06 1.96663e+06 3.18125e+06 6.50302e+06 1.21192e+06
Problem 12(1) 2.18299e+06 1.38553e+06 2.18503e+06 3.94774e+06 580625
Problem 12(2) 2.68863e+06 1.4389e+06 2.83343e+06 4.01017e+06 714632
Problem 12(3) 2.18299e+06 1.38553e+06 2.18503e+06 3.94774e+06 580625
Problem 13 16.0693 8.91272 16.2315 22.3175 2.94555
Problem 14 19.5124 11.1583 19.7081 31.7659 4.82337

TABLE III: Performance of the proposed algorithm after 1 x 10^5 evaluations


Problem Median Best Mean Worst S.D.
Problem 01 1.29125e-09 1.16742e-11 3.8056 11.4902 5.20912
Problem 02 -25.5054 -28.4225 -24.6248 -16.5337 3.4294
Problem 03 1.15149e-05 1.15149e-05 1.15149e-05 1.15149e-05 1.02939e-18
Problem 04 14.3291 13.8431 14.7674 19.2249 1.38671
Problem 05(1) -35.97 -36.9286 -35.7792 -34.1647 0.996516
Problem 05(2) -27.4298 -29.1661 -26.1778 -21.2481 2.90947
Problem 06 0.521136 0.5 0.536185 0.634252 0.0375193
Problem 07 220 220 220 220 0
Problem 08 63.3651 8.07406 606.011 3210.24 937.201
Problem 09 -21.358 -21.7956 -20.6788 -13.0154 2.13134
Problem 10(1) 1.9824e+06 727529 1.95144e+06 3.34075e+06 751710
Problem 10(2) 2.19417e+07 2.09482e+07 2.19699e+07 2.28543e+07 520307
Problem 11(1) 15451.8 15445.5 15454.3 15471.3 7.53973
Problem 11(2) 19222.9 18875.5 19233 19460.2 125.175
Problem 11(3) 32960.8 32834.6 32958 33107.7 62.2709
Problem 11(4) 354351 343909 355255 369814 7454.82
Problem 11(5) 2.35573e+06 1.96663e+06 2.75347e+06 5.76762e+06 908764
Problem 12(1) 1.6995e+06 1.21632e+06 1.75407e+06 2.45722e+06 369321
Problem 12(2) 2.44336e+06 1.4389e+06 2.48706e+06 3.99085e+06 508226
Problem 12(3) 1.6995e+06 1.21632e+06 1.75407e+06 2.45722e+06 369321
Problem 13 15.4118 8.00076 14.5131 18.9582 2.62032
Problem 14 19.1884 8.63458 18.4534 24.7601 4.07166

ACKNOWLEDGMENT

The presented work was supported by grants from the Defence and Security Applications Research Center (DSARC) and the Research and Research Training Office (RRTO), Australian Defence Force Academy, University of New South Wales, Australia. The authors would also like to thank Dr. Amitay Isaacs for his support in the implementation of the presented algorithm.

TABLE IV: Performance of the proposed algorithm after 1.5 x 10^5 evaluations
Problem Median Best Mean Worst S.D.
Problem 01 6.08465e-10 1.16742e-11 2.09489 11.3743 4.30635
Problem 02 -26.4644 -28.4225 -25.8992 -20.8217 2.24252
Problem 03 1.15149e-05 1.15149e-05 1.15149e-05 1.15149e-05 9.97108e-19
Problem 04 14.3291 13.8431 14.3688 19.0598 0.995967
Problem 05(1) -36.8454 -36.9286 -36.3092 -34.1647 0.791614
Problem 05(2) -29.1661 -29.1661 -27.3355 -23.0059 2.61528
Problem 06 0.514659 0.5 0.527842 0.620619 0.0355253
Problem 07 220 220 220 220 0
Problem 08 47.9732 8.07406 522.144 3210.24 919.856
Problem 09 -21.3734 -21.7956 -21.2554 -16.9809 0.904825
Problem 10(1) 1.71608e+06 182645 1.51568e+06 3.16774e+06 717585
Problem 10(2) 2.03711e+07 1.96688e+07 2.05975e+07 2.24535e+07 605955
Problem 11(1) 15449.5 15445.5 15450.7 15461.8 4.24154
Problem 11(2) 19209.7 18875.5 19204.5 19393.1 124.229
Problem 11(3) 32911.8 32782.9 32909.1 33021.4 49.1015
Problem 11(4) 353060 343909 353046 368753 6093.27
Problem 11(5) 2.34603e+06 1.96663e+06 2.58355e+06 4.7126e+06 646958
Problem 12(1) 1.56474e+06 1.21632e+06 1.66379e+06 2.34766e+06 327744
Problem 12(2) 2.24644e+06 1.4389e+06 2.35142e+06 3.77107e+06 454677
Problem 12(3) 1.56474e+06 1.21632e+06 1.66379e+06 2.34766e+06 327744
Problem 13 14.1173 7.97952 13.7679 16.6897 2.37386
Problem 14 17.4063 8.63372 17.3857 23.2744 3.68906

REFERENCES

[1] S. Das and P. N. Suganthan, "Problem definitions and evaluation criteria for the CEC 2011 competition on testing evolutionary algorithms on real world optimization problems," Jadavpur University, Kolkata, India, and Nanyang Technological University, Singapore, Tech. Rep., December 2010.
[2] J. H. Holland, Adaptation in Natural and Artificial Systems. Cambridge, MA, USA: MIT Press, 1992.
[3] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, "A fast and elitist multiobjective genetic algorithm: NSGA-II," IEEE Transactions on Evolutionary Computation, vol. 6, pp. 182–197, 2002.
[4] S.-Z. Zhao, P. N. Suganthan, and S. Das, "Self-adaptive differential evolution with multi-trajectory search for large-scale optimization," Soft Computing - A Fusion of Foundations, Methodologies and Applications, pp. 1–11, 2010, doi: 10.1007/s00500-010-0645-4. [Online]. Available: http://dx.doi.org/10.1007/s00500-010-0645-4
[5] F. Neri and V. Tirronen, "Recent advances in differential evolution: a survey and experimental analysis," Artificial Intelligence Review, vol. 33, pp. 61–106, 2010, doi: 10.1007/s10462-009-9137-2. [Online]. Available: http://dx.doi.org/10.1007/s10462-009-9137-2
[6] S. Das and P. N. Suganthan, "Differential evolution: A survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. PP, no. 99, pp. 1–28, 2010.
[7] Y.-S. Ong, M. Lim, and X. Chen, "Memetic computation - past, present and future [research frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24–31, May 2010.
[8] H. Singh, T. Ray, and W. Smith, "Performance of infeasibility empowered memetic algorithm for CEC 2010 constrained optimization problems," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC), 2010, pp. 1–8.
[9] V. Tirronen, F. Neri, T. Karkkainen, K. Majava, and T. Rossi, "A memetic differential evolution in filter design for defect detection in paper production," in Applications of Evolutionary Computing, ser. Lecture Notes in Computer Science, M. Giacobini, Ed. Springer Berlin / Heidelberg, 2007, vol. 4448, pp. 320–329.
[10] L.-Y. Tseng and S.-C. Liang, "A hybrid metaheuristic for the quadratic assignment problem," Computational Optimization and Applications, vol. 34, pp. 85–113, 2006, doi: 10.1007/s10589-005-3069-9. [Online]. Available: http://dx.doi.org/10.1007/s10589-005-3069-9
[11] K. Ganesh and M. Punniyamoorthy, "Optimization of continuous-time production planning using hybrid genetic algorithms-simulated annealing," The International Journal of Advanced Manufacturing Technology, vol. 26, pp. 148–154, 2005, doi: 10.1007/s00170-003-1976-4. [Online]. Available: http://dx.doi.org/10.1007/s00170-003-1976-4
[12] K. Deb and S. Agrawal, "Simulated binary crossover for continuous search space," Complex Systems, vol. 9, pp. 115–148, 1995.
[13] K. Deb and M. Goyal, "A combined genetic adaptive search (GeneAS) for engineering design," Computer Science and Informatics, vol. 26, pp. 30–45, 1996.
[14] K. Price, R. Storn, and J. Lampinen, Differential Evolution - A Practical Approach to Global Optimization. Springer, Berlin, Germany, 2005.
[15] M. Powell, "A fast algorithm for nonlinearly constrained optimization calculations," in Numerical Analysis, G. Watson, Ed. Springer, 1978, pp. 144–157.

