
Applied Intelligence

https://doi.org/10.1007/s10489-018-1291-2

A hybrid clonal selection algorithm with modified combinatorial recombination and success-history based adaptive mutation for numerical optimization

Weiwei Zhang1 · Kui Gao1 · Weizheng Zhang1 · Xiao Wang1 · Qiuwen Zhang1 · Hua Wang1

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Abstract
Artificial immune systems are a class of computational intelligence methods that draw inspiration from the biological immune system. As a popular artificial immune computing model, the clonal selection algorithm (CSA) has been widely used for many optimization problems. When dealing with complex optimization problems that are multimodal, high-dimensional, or rotated, the traditional CSA often suffers from diversity loss, poor search ability, premature convergence and stagnation. To address these problems, a modified combinatorial recombination is introduced to bring diversity to the population and avoid premature convergence. Moreover, a success-history based adaptive mutation strategy is introduced to form a success-history based adaptive mutation clonal selection algorithm (MSHCSA) with improved search ability. The mutation operator is also modified and analyzed through experimental comparison. To further improve precision and cope with stagnation, a gene knockout strategy is proposed. The proposed algorithm is tested on the CEC 2014 benchmarks and compared with state-of-the-art evolutionary algorithms. The experimental results show that MSHCSA is quite competitive.

Keywords Immune system · Clonal selection algorithm · Optimization · Mutation · Adaptive

1 Introduction

Generally, an optimization problem can be described as

Optimize f(X), subject to X ∈ Ω   (1)

where X = [x1, x2, · · · , xD] is a D-dimensional vector of decision variables in the feasible region Ω. The optimization can be either a minimization or a maximization problem. Inspired by nature, people have developed many optimization methods to solve complicated optimization problems [1, 2]. Among them, the artificial immune system (AIS) has recently made great progress in both the academic and industrial communities. In view of the effectiveness and complexity of the biological immune system, its various functions and principles have been developed into applicable theories and systems [3]. AISs have been applied to intrusion detection [4], email spam detection [6, 7], feature selection and prediction [8], optimization [9] and other fields [5]. The clonal selection algorithm (CSA) is a newly emerging computational paradigm inspired by the adaptive immune response of immune B cells, including clonal proliferation, differentiation, affinity maturation and memory mechanisms, and is well suited for handling optimization problems [10, 11]. However, facing more and more complicated optimization problems, CSAs often suffer from premature convergence, poor accuracy and diversity loss.

Weizheng Zhang (corresponding author)
weizheng008@126.com

Weiwei Zhang
anqikeli@163.com

Kui Gao
945915123@qq.com

Xiao Wang
pandaxiaoxi@163.com

Qiuwen Zhang
zhangqwen@126.com

Hua Wang
haoyouqiezi@126.com

1 Zhengzhou University of Light Industry, Zhengzhou, 450000, China

In this paper, we propose an enhanced CSA named MSHCSA, by introducing a new combinatorial recombination and a modified success-history based adaptive mutation strategy. In MSHCSA, the combinatorial recombination brings diversity, while the success-history based adaptive mutation strategy improves the search ability. Moreover, to further improve the accuracy and avoid stagnation, a gene knockout strategy is proposed. These strategies work together to guide the evolutionary process of MSHCSA towards the global optimum.

The remainder of the paper is organized as follows. Section 2 reviews the basic theory of CSA and the strategies involved. Section 3 presents the proposed algorithm. Section 4 reports the experimental simulation and analysis. The conclusion is given in Section 5.

2 Clonal selection algorithms

In this section, we first present the basic framework of CSA and then give a brief review of its variants from the perspective of modified operators and hybrid algorithms.

2.1 Basic CSA

The main idea of clonal selection theory lies in the phenomenon that B cells can adaptively modify their receptors, termed antibodies, to react to invasive non-self antigens in the biological immune system [12]. Inspired by the clonal selection principle, various clonal selection algorithms have been proposed; CLONALG [13] is one of the representatives.

In the basic CSA, the antigen represents the optimization problem to be solved, while the antibodies are candidate solutions. The algorithm aims to locate the fittest solutions to the problem, which may be a global optimization, multimodal optimization, or multi-objective optimization problem. Fitness measures how well a solution fits the problem. Three operations are mainly involved: cloning, hypermutation and selection. The cloning operator provides candidates for the hypermutation to act on and bring diversity. The introduced diversity helps each antibody explore its near neighborhood. Then the antibody with the highest fitness in the population survives through the selection operator, and the others are eliminated. The framework of the basic CSA is described in Algorithm 1.

2.2 Improved CSAs

When implemented in practice, the basic CSA has gradually evolved into various variants to meet new challenges. The literature shows that modification of the operators and hybridization of different strategies and algorithms with the basic CSA point out a promising direction to further improve search effectiveness [10].

A degeneration recognizing method has been applied to the clonal selection algorithm to improve computing speed [9]. The Baldwinian effect was introduced to CSA, forming the improved Baldwinian Clonal Selection Algorithm for optimization [14]. In the hybrid GSCSA [15], a gravitational search algorithm performs exploration while CSA performs exploitation. Differential evolution (DE) has been combined with CSA to enhance operating efficiency and convergence rate [16]. A fast clonal algorithm designs a parallel mutation operator comprising Gaussian and Cauchy mutation strategies [17]. A quantum-inspired immune clonal algorithm (QICA) was proposed to cope with global optimization, in which quantum mutation and quantum recombination are introduced [18]. Cutello et al. introduced a real-coded clonal selection algorithm for global optimization, involving a cloning operator, inversely proportional hypermutation and an aging operator [19]. Baldwinian learning and orthogonal learning were incorporated into CSA to generate a hybrid learning clonal selection algorithm (HLCSA) [20], in which Baldwinian learning is employed for exploration and orthogonal learning for exploitation. Yang et al. introduced Lamarckian learning theory into CSA, forming the Lamarckian Clonal Selection Algorithm (LCSA), with a recombination operator and a tournament selection operator incorporated as well [21]. Li et al. [22] proposed an immune clonal PSO, in which particle swarm optimization (PSO) is integrated with clonal selection to enhance performance and suppress degradation phenomena. Gaussian and Cauchy mutations were introduced to CSA, and a Cell Repair Operator (CPO) and a Dynamic Mutation Size Operator (DMSO) were also proposed in the improved combined-mutation CSA to handle unimodal and multimodal functions [23]. Vaccination and Cauchy mutation were built into the clonal selection algorithm to accelerate the convergence rate and improve the detection rate [24]. A mixed mutation strategy combining Gaussian and Cauchy mutations was proposed in [25], and experimental results show that the mixed strategy can obtain the same or even better performance.

It can be observed that modification of the operators and hybridization of different strategies and algorithms are effective improvements to the basic CSA. However, how to modify the operators, and which strategies or algorithms to incorporate with CSA to benefit performance, are still worth studying. When the optimization problem becomes more and more complicated, it is delicate work to modify the inappropriate operations and assemble the

well-chosen strategies into a functional system. In this paper, a combinatorial recombination operator is modified, hypermutation is incorporated with a success-history based adaptive mutation strategy, and together with a gene knockout strategy they compose an enhanced CSA, termed MSHCSA, for complex optimization problems.

3 Proposed algorithm

3.1 Modified recombination

During the development of a B cell, the gene segments in the libraries are combined and rearranged at the level of the DNA. Through this rearrangement of gene segments, recombination creates a population of cells that vary widely in their specificity [12]. Combinatorial recombination takes responsibility for diversification just as hypermutation does in immunology. To simulate the arrangement of gene segments in recombination, many recombination operators have been proposed, such as DE-inspired recombination [26], binomial recombination [27] and intelligent recombination [28]. The combinatorial recombination operator in RHCSA [29] is a newly proposed recombination operator that introduces diversity to the population by fusing information between randomly chosen parents. The new individuals are generated by

X′A^VA = α XA^VA + (1 − α) XB^VB
X′B^VB = α XB^VB + (1 − α) XA^VA   (2)

where XA and XB are individuals randomly chosen from the population, and VA and VB are randomly chosen sets of m dimensions, m ∈ [1, D]. The recombination operator is illustrated in Fig. 1, in which two new individuals are generated through the combinatorial recombination. In the example, m equals 3. Note that i1 could be different from j1, and likewise i2 and j2, and i3 and j3, as long as the normalization has been done.

After the fitness of the newly generated individuals is evaluated, together with the original individuals, the two with the higher fitness survive and the other two individuals are deleted, i.e., the two individuals with the highest fitness are selected from the set {XA, XB, X′A, X′B}.

It is reasonable that diversity can be introduced by such information fusion within the population. However, as is well known, a distributed population tends to fall into local optima under the influence of the "attractors" of the search area. If the population falls into a small local area, the above recombination operator can no longer bring diversity, and the search process is liable to get stuck in stagnation.

To solve this problem, a modified version of the combinatorial recombination is proposed in this paper. During the recombination process, instead of being abandoned immediately, the generated candidate solutions that fail in the selection process are kept and stored in an archive A. Thereafter, in the proposed algorithm, one parent individual XA comes from the current population and the other parent individual XB is randomly chosen from the union of the current population and archive A. The recombination is implemented as (2). Then, in the selection part, instead of choosing the two individuals with the highest fitness from the set {XA, XB, X′A, X′B}, only XA is updated if the newly generated offspring have better fitness, i.e., the one with the best fitness replaces XA, and the other is added to archive A. The size of archive A is initialized to zero, and if the archive size exceeds a certain threshold, say Np, some solutions are randomly removed from the archive to keep the archive size at Np.
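To make the operator concrete, the following minimal Python sketch implements the archive-based recombination and selection described above. It is not the authors' code: the blend weight alpha, the higher-is-better fitness convention, and the simplification that both parents exchange the same m dimension indices are illustrative assumptions.

```python
import random

def modified_recombination(pop, fitness_fn, archive, m, alpha=0.5, np_max=None):
    """One step of the modified combinatorial recombination (Sec. 3.1 sketch).

    Parent X_A is drawn from the population; X_B from population + archive A.
    Only X_A may be replaced; the losing offspring is stored in the archive.
    """
    np_max = np_max if np_max is not None else len(pop)
    dim = len(pop[0])

    xa = random.choice(pop)
    xb = random.choice(pop + archive)          # union of population and archive A

    # Eq. (2): blend information on m randomly chosen dimensions.
    va = random.sample(range(dim), m)
    xa_new, xb_new = list(xa), list(xb)
    for d in va:
        xa_new[d] = alpha * xa[d] + (1 - alpha) * xb[d]
        xb_new[d] = alpha * xb[d] + (1 - alpha) * xa[d]

    # Selection: only X_A is updated; the other offspring goes to the archive.
    best, loser = sorted([xa_new, xb_new], key=fitness_fn, reverse=True)
    if fitness_fn(best) > fitness_fn(xa):
        pop[pop.index(xa)] = best
        archive.append(loser)

    # Keep the archive size bounded by Np, removing random entries.
    while len(archive) > np_max:
        archive.pop(random.randrange(len(archive)))
    return pop, archive
```

The archive bound mirrors the text: removal is random, so old failed candidates gradually make room for fresh ones.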

Fig. 1 Recombination process



3.2 Success-history based adaptive mutation based clonal selection

The success-history based adaptive mutation based clonal selection is composed of cloning, success-history based adaptive hypermutation and selection, as shown in Fig. 2.

Fig. 2 The main steps of the success-history based adaptive mutation based clonal selection

i. Cloning: Each antibody Xi generates Nc clones {Xi1, Xi2, · · · , XiNc} through cloning, where Nc is the clone number.

ii. Success-history based adaptive hypermutation: Hypermutation brings diversity to the population by introducing a perturbation for each clone. In view of the clonal selection principle, the hypermutation rate follows the inversely proportional law [13]. The scheme in which each candidate solution is subject to M mutations [30] is adopted, in which the inversely proportional law is used to determine the number of mutations M. Moreover, the way the Success-History based Adaptive DE (SHADE) [31, 32] generates the mutant vector, termed current-to-pbest/1 mutation, is adopted to implement the hypermutation. Each clone of Xi generates a hypermutated clone X̃i as (3):

X̃i^j = { Xi^j + Fi × (Xpbest^j − Xi^j) + Fi × (Xr1^j − Xr2^j)   if j ∈ randM(n)
       { Xi^j                                                    otherwise        (3)

where Xi^j is the jth dimension of the ith individual, Xpbest^j is the jth dimension of an individual randomly chosen from the top 100 × q% individuals of the current population, q ∈ (0, 1]. The indices r1 and r2 are randomly selected from [1, Np] with r1 ≠ r2 ≠ i. Fi determines the magnitude of the perturbation, and randM(n) ⊆ {1, · · · , n} is a set of M randomly chosen indexes without repetition. M is determined by

M = ⌊α × n⌋ + 1   (4)

α = exp(−ρ f∗(Xi))   (5)

where f∗(Xi) ∈ [0, 1] is the normalized fitness of Xi, ρ is the decay constant that controls the shape of the mutation-rate curve, and ⌊·⌋ returns the lower-bound integer.

Besides, we also tried a modification of the current-to-pbest mutation (3), given as (6):

X̃i^j = { Xi^j + Fi × (Xpbest^j − Xi^j) + Fi × (Xr1^j − Xr2^j) + Fi × (Xr3^j − Xr4^j)   if j ∈ randM(n)
       { Xi^j                                                                            otherwise        (6)

where r3 and r4 are randomly selected in the same way as r1 and r2. The added third term is expected to bring more diversity and improve the search ability. Illustrations of both the current-to-pbest/1 mutation and the modified mutation are shown in Fig. 3. As shown in Fig. 3, the added term brings extra perturbation on the basis of the current-to-pbest/1 mutation. In view of the huge search space when handling high-dimensional and rotated problems, the search process easily falls into stagnation. With the introduction of the difference of two pairs of randomly selected individuals

Fig. 3 (a) An illustration of the current-to-pbest/1 mutation scheme in 2D space. (b) An illustration of the modified mutation scheme in 2D space

is expected to bring more diversity. Moreover, the composition of the directions of the newly introduced difference vector and the original difference vector may somehow benefit solving the rotation problem. The detailed discussion is presented in the experimental part.

The control parameter Fi follows the success-history based adaptation rules of [31, 32]. As shown in Table 1, a historical memory MF that stores H historical values of Fi is maintained. In each generation, Fi is obtained by randomly selecting an element MFri, ri ∈ [1, H], from MF and computing

Fi = randc(MFri, 0.1)   (7)

where randc(μ, σ) is the Cauchy distribution with location parameter μ and scale parameter σ.

Table 1 The historical memory MF

Index   1     2     ...   H−1      H
MF      MF1   MF2   ...   MFH−1    MFH

In each generation, every Fi that succeeds in generating an individual better than its parent is recorded in SF, and at the end of the generation the contents of the memory are updated as follows:

MFk = { meanWL(SF)   if SF ≠ ∅
      { MFk          otherwise      (8)

meanWL(SF) = ( Σ_{k=1}^{|SF|} ωk × (SFk)² ) / ( Σ_{k=1}^{|SF|} ωk × SFk )   (9)

ωk = Δfk / ( Σ_{k=1}^{|SF|} Δfk )   (10)

Δfk = |f(X̃k) − f(Xk)|   (11)
where X̃k is the successful hypermutated clone and Xk its parent, and the memory index k ∈ [1, H] is initialized to 1 and increased circularly. If no better offspring is generated, i.e., SF = ∅, the contents of MF are not updated.

iii. Selection: Select the one with the highest fitness among Xi and its hypermutated clones {X̃i1, X̃i2, · · · , X̃iNc}.
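Steps i–iii above can be sketched as follows, assuming a real-coded population with higher-is-better fitness. The min–max normalization of f∗, the clamping of Fi into (0, 1], and all helper names are our own illustrative choices, not the paper's implementation:

```python
import math
import random

def shm_hypermutate(pop, fit, i, memory, q=0.2, rho=5.0):
    """Success-history based adaptive hypermutation of antibody i (sketch)."""
    n = len(pop[0])
    x = pop[i]

    # Eq. (7): sample F from a Cauchy around a random memory slot (scale 0.1),
    # here folded by abs() and clamped to at most 1 as a simplification.
    mf = random.choice(memory)
    f_i = min(1.0, abs(mf + 0.1 * math.tan(math.pi * (random.random() - 0.5))))

    # Eqs. (4)-(5): number of mutated dimensions, inversely proportional
    # to the (min-max normalized) fitness of X_i.
    lo, hi = min(fit), max(fit)
    f_star = (fit[i] - lo) / (hi - lo + 1e-12)
    m = int(math.exp(-rho * f_star) * n) + 1
    dims = random.sample(range(n), min(m, n))

    # Eq. (3): current-to-pbest/1 mutation on the chosen dimensions.
    ranked = sorted(range(len(pop)), key=lambda k: fit[k], reverse=True)
    pbest = pop[random.choice(ranked[:max(1, int(q * len(pop)))])]
    r1, r2 = random.sample([k for k in range(len(pop)) if k != i], 2)
    clone = list(x)
    for j in dims:
        clone[j] = x[j] + f_i * (pbest[j] - x[j]) + f_i * (pop[r1][j] - pop[r2][j])
    return clone, f_i

def update_memory(memory, k, sf, df):
    """Eqs. (8)-(11): weighted Lehmer mean of the successful F values sf,
    weighted by the fitness improvements df; k advances circularly."""
    if sf:
        w = [d / sum(df) for d in df]
        memory[k] = sum(wi * s * s for wi, s in zip(w, sf)) / \
                    sum(wi * s for wi, s in zip(w, sf))
        k = (k + 1) % len(memory)
    return memory, k
```

The Lehmer mean biases the memory towards larger successful F values, which is the same design rationale as in SHADE [31, 32].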

3.3 Gene knockout strategy

When handling optimization problems with multiple optima, evolutionary algorithms always suffer from premature convergence. The problem becomes more intractable as the dimension of the problem increases. When facing the huge search space of high-dimensional problems, stagnation is a common and hard-to-avoid situation. Stagnation is obviously harmful, since it brings no progress to the results while consuming computational resources. To solve this problem, a counter agei is introduced for each individual Xi to count how many times it fails to update itself. If agei > Maxage, one randomly chosen dimension of the individual Xi is replaced as follows:

Xi^j = elite^j + randn(0, 1)   (12)

where Maxage is a user-defined threshold, j ∈ [1, D] is a randomly chosen dimension, elite is an individual randomly chosen from the top 100 × q% best antibodies of the population, and randn(0, 1) is a Gaussian random number with mean 0 and variance 1.

3.4 The process of MSHCSA

Combining the new recombination and mutation strategies, the new algorithm, denoted MSHCSA, is presented in Algorithm 2.

4 Experiments

In this section, the experiments carried out to evaluate the performance of the proposed MSHCSA on a number of benchmark functions are described. The functions were taken from the suite of benchmark problems used in the Competition on Evolutionary Computation in Single Objective Real-Parameter Numerical Optimization, held under the 2014 IEEE Congress on Evolutionary Computation (CEC) [33]. The suite contains 30 benchmark functions whose features cover unimodal problems, multimodal problems, test problems composed by extracting features dimension-wise from several problems, graded levels of linkage, and rotated trap problems. The benchmark functions are listed in Table 2. The search range of the functions is [−100, 100]^D, where D is the dimension, Mi is the rotation matrix, oi = [oi1, oi2, · · · , oiD]^T is the shifted global optimum, which is randomly distributed in the search range, Fi∗ = Fi(x∗) is the optimal function value, and fi(x) is the basic function. In the hybrid functions, gi(x) is the basic function used to construct the hybrid function, N is the number of basic functions, and zi is the randomly shuffled x − oi. In the composition functions, λi controls the height of each gi(x), ωi is a weight value, and biasi defines which optimum is the global optimum. It is worth noting that these problems should be treated as black-box problems, which means the explicit equations cannot be used.

Among them, functions F1-F3 are unimodal rotated functions, F4-F16 are shifted and rotated multimodal functions, F17-F22 are hybrid functions and F23-F30 are composition functions. These functions pose great challenges. To save space, we only give one function, F12, as a representative to show the landscape distribution. The basic function of F12 is the Katsuura function:

f10(x) = (10/D²) ∏_{i=1}^{D} ( 1 + i Σ_{j=1}^{32} |2^j xi − round(2^j xi)| / 2^j )^{10/D^1.2} − 10/D²   (13)

Figure 4 shows the 3D map and the contour map of the 2D version of F12 (Shifted and Rotated Katsuura function).
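For illustration, the basic Katsuura function of (13) translates directly into code. This is a plain restatement of the formula only; the CEC2014 benchmark F12 additionally applies the shift, rotation and scaling shown in Table 2, and Python's round() may break ties differently from the suite's reference implementation:

```python
import math

def katsuura(x):
    """Basic Katsuura function, Eq. (13): continuous everywhere,
    differentiable nowhere, with a fractal landscape of local optima."""
    d = len(x)
    prod = 1.0
    for i, xi in enumerate(x, start=1):
        # Inner sum over 32 binary scales of the fractional perturbation.
        s = sum(abs(2**j * xi - round(2**j * xi)) / 2**j for j in range(1, 33))
        prod *= (1.0 + i * s) ** (10.0 / d**1.2)
    return (10.0 / d**2) * prod - 10.0 / d**2
```

At the origin every inner sum vanishes, so the product is 1 and the function value is exactly 0, its global minimum.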
Table 2 Summary of the CEC2014 benchmark functions

No.   Function                                      Equation                                           Fi∗ = Fi(x∗)

Unimodal functions
1     Rotated High Conditioned Elliptic Function    F1(x) = f1(M(x − o1)) + F1∗                        100
2     Rotated Bent Cigar Function                   F2(x) = f2(M(x − o2)) + F2∗                        200
3     Rotated Discus Function                       F3(x) = f3(M(x − o3)) + F3∗                        300

Simple multimodal functions
4     Shifted and Rotated Rosenbrock's Function     F4(x) = f4(M(2.048(x − o4)/100) + 1) + F4∗         400
5     Shifted and Rotated Ackley's Function         F5(x) = f5(M(x − o5)) + F5∗                        500
6     Shifted and Rotated Weierstrass Function      F6(x) = f6(M(0.5(x − o6)/100)) + F6∗               600
7     Shifted and Rotated Griewank's Function       F7(x) = f7(M(600(x − o7)/100)) + F7∗               700
8     Shifted Rastrigin's Function                  F8(x) = f8(5.12(x − o8)/100) + F8∗                 800
9     Shifted and Rotated Rastrigin's Function      F9(x) = f8(M(5.12(x − o9)/100)) + F9∗              900
10    Shifted Schwefel's Function                   F10(x) = f9(1000(x − o10)/100) + F10∗              1000
11    Shifted and Rotated Schwefel's Function       F11(x) = f9(M(1000(x − o11)/100)) + F11∗           1100
12    Shifted and Rotated Katsuura Function         F12(x) = f10(M(5(x − o12)/100)) + F12∗             1200
13    Shifted and Rotated HappyCat Function         F13(x) = f11(M(5(x − o13)/100)) + F13∗             1300
14    Shifted and Rotated HGBat Function            F14(x) = f12(M(5(x − o14)/100)) + F14∗             1400
15    Shifted and Rotated Expanded Griewank's plus Rosenbrock's Function   F15(x) = f13(M(5(x − o15)/100) + 1) + F15∗   1500
16    Shifted and Rotated Expanded Scaffer's F6 Function                   F16(x) = f14(M(x − o16) + 1) + F16∗          1600

Hybrid functions
17-22  Hybrid Functions 1-6                         F(x) = g1(M1 z1) + · · · + gN(MN zN) + F∗           1700-2200

Composition functions
23-30  Composition Functions 1-8                    F(x) = Σ_{i=1}^{N} {ωi ∗ [λi gi(x) + biasi]} + F∗   2300-3000

It is observed that F12 is a non-separable function with a huge number of local optima; it is continuous everywhere, yet differentiable nowhere. When handling this kind of function, an algorithm is very easily trapped in local optima.

In our experiments, the dimensions of these functions are set to 10, 30, 50, and 100, respectively, which brings more challenges. In the simulations, each experiment is run 30 times, and during each run the function is evaluated 10000∗D times (D is the dimension of the feasible solution space). "−", "+", and "≈" indicate that the performance of the corresponding peer algorithm is worse than, better than, and similar to that of MSHCSA, respectively. The best results among all the tested algorithms are marked in bold.

4.1 Experimental parameter setting

During experimentation, the parameters of the proposed MSHCSA were set as follows based on parameter

Fig. 4 (a) 3D map of the 2D function F12. (b) Contour map of the 2D function F12

tuning simulation results: the size of the antibody population N is set to 10∗D, the cloning number Nc is set to 1, and the number of selected dimensions m in recombination is set to 0.3∗D. In the gene knockout strategy, the elite rate q is set to 0.2. The maximal age Maxage is set to 6.

4.2 Experimental investigation of MSHCSA

New strategies are developed in the MSHCSA algorithm. Their efficiency is investigated through experimental simulation in the following subsections.

4.2.1 Investigation of modified recombination

Based on the basic CSA, we compared the performance of the basic CSA, the CSA with the modified recombination strategy, and the CSA with the combinatorial recombination of RHCSA [29], called "Basic CSA", "CSA+MR" and "CSA+CR", respectively. To save the limited space, Tables 3 and 4 show the experimental results on the CEC2014 benchmark functions with 30 and 100 dimensions.

It is shown that none of them, whether Basic CSA or the versions with combinatorial or modified recombination, obtains satisfactory results on the benchmark functions, which simply shows how challenging the benchmarks are. Even so, the results of both CSA+MR and CSA+CR are obviously better than Basic CSA on both 30 and 100 dimensions. That is to say, introducing recombination into Basic CSA improves its performance. The reason is that the recombination operator increases the diversity of the population through the exchange of gene segments, and the introduced diversity helps the population jump out of local optima and keep searching for the global optimum. Moreover, CSA+CR obtains better performance on the unimodal functions (F1-F3) in both 30 and 100 dimensions, while CSA+MR seems to be more suitable for the multimodal functions (F4-F16). It can be concluded that the original combinatorial recombination enhances the exploitation ability, while the modified recombination brings more diversity and stimulates the optima-searching ability due to the guiding effect of the failed candidate solutions. On the hybrid functions (F17-F22), CSA+CR obtains better results than CSA+MR in 30 dimensions, and even results in 100 dimensions. On the composition functions (F23-F30), the performance of CSA+MR is better than that of CSA+CR in 30 dimensions, and they tie with each other in 100 dimensions. It is found that when coping with the complicated hybrid and composition functions, neither CSA+MR nor CSA+CR stands out as it does on the simpler unimodal or multimodal functions. In view of the overall performance, CSA+MR is slightly better than CSA+CR.

4.2.2 Investigation of success-history based adaptive mutation

In the proposed algorithm, the success-history based adaptive mutation is introduced. We not only directly adopt the current-to-pbest/1 mutation but also try a modified version of the mutation equation. They are denoted as C-Mutation and M-Mutation, respectively, and the compared results are shown in Table 5. Comparing with Table 4, it can be observed that both mutation operations improve the performance of the algorithm to a great extent.

When the dimension is 30, C-Mutation achieves better results than M-Mutation on F4, F6, F8-F16, F19 and F21, but worse results on F7, F17, F18, F20 and F22. Both of them obtain satisfying results on the unimodal functions (F1-F3). Obviously, C-Mutation is superior to M-Mutation when handling the multimodal functions (F4-F16), whereas M-Mutation achieves better performance on the hybrid functions (F17-F22). They are even on the composition functions (F23-F30). In 30 dimensions, C-Mutation thus shows an advantage on the multimodal functions, while on the other types of functions the two perform about the same. However, C-Mutation starts to lose its superiority at 50 dimensions, and the situation continues at 100 dimensions. Notably, C-Mutation fails to locate the global optimum of F1 in 50 dimensions while M-Mutation succeeds, and the difference in error is an order of magnitude in 100 dimensions. With the increase of the dimension, M-Mutation shows better results on the unimodal functions (F1-F3), and the overwhelming advantage of C-Mutation on the multimodal functions (F4-F16) declines. On the hybrid and composition functions, M-Mutation obtains better and better results as the dimension increases. The reason is that when the dimension increases, the search space becomes huge; in this situation, the introduced diversity helps the algorithm jump out of stagnation, and obviously M-Mutation brings more diversity than C-Mutation.

Through the experimental comparison, the different ways to implement the recombination and hypermutation operators show their respective advantages. The modified recombination and the modified success-history based adaptive mutation are adopted in this paper and form the algorithm MSHCSA through comprehensive assessment.

4.3 Comparison with the state-of-the-art algorithms

The performance of MSHCSA is compared with the following five state-of-the-art algorithms on the benchmark suite of CEC2014: L-SHADE [34], RSDE [35], FWA-DM [36], CETMS [37], and OptBees [38]. For the competitor algorithms, the best parametric setup is
Table 3 Results (mean) of basic CSA with different Recombination operators on CEC2014 with 30D

Fun F1 F2 F3 F4 F5 F6 F7 F8 F9 F10 F11

Basic CSA 2.22E+09– 1.21E+11– 1.54E+05– 3.07E+04– 2.12E+01– 4.74E+01– 9.65E+02– 4.69E+02– 5.84E+02– 8.44E+03– 8.75E+03–
CSA+CR 6.49E+08– 6.13E+10+ 6.90E+04+ 8.87E+03– 2.11E+01+ 3.73E+01– 4.91E+02+ 2.98E+02– 3.39E+02– 6.40E+03– 7.52E+03–
CSA+MR 6.42E+08 6.25E+10 7.12E+04 8.46E+03 2.11E+01 3.64E+01 4.95E+02 2.96E+02 3.33E+02 6.20E+03 7.38E+03
Fun F12 F13 F14 F15 F16 F17 F18 F19 F20 F21 F22
Basic CSA 3.97E+00– 1.00E+01– 3.54E+02– 1.15E+07– 1.39E+01– 1.50E+08– 6.77E+09– 8.27E+02– 2.75E+05– 5.09E+07– 7.04E+03–
CSA+CR 2.87E+00– 7.17E+00– 1.74E+02– 5.96E+04+ 1.33E+01– 3.18E+07+ 1.68E+09+ 2.37E+02+ 6.44E+04– 8.38E+06+ 1.89E+03+
CSA+MR 2.74E+00 7.06E+00 1.68E+02 6.54E+04 1.31E+01 3.30E+07 1.73E+09 2.54E+02 4.61E+04 9.43E+06 2.12E+03
Fun F23 F24 F25 F26 F27 F28 F29 F30 −/+/≈
Basic CSA 1.46E+03– 5.12E+02– 3.63E+02– 1.81E+02– 1.66E+03– 7.79E+03– 5.39E+08– 4.34E+06– 30/0/0
CSA+CR 5.96E+02– 2.06E+02+ 2.04E+02– 1.39E+02– 7.52E+02+ 6.70E+03– 3.49E+08– 1.63E+06+ 17/13/0
CSA+MR 5.83E+02 2.07E+02 2.02E+02 1.15E+02 7.73E+02 6.70E+03 3.36E+08 1.65E+06

Table 4 Results (mean) of basic CSA with different Recombination operators on CEC2014 with 100D

Fun F1 F2 F3 F4 F5 F6 F7 F8 F9 F10 F11

Basic CSA 1.81E+10– 6.14E+11– 6.63E+05– 2.77E+05– 2.15E+01– 1.75E+02– 5.70E+03– 1.99E+03– 2.43E+03– 3.37E+04– 3.42E+04–
CSA+CR 6.40E+09+ 2.75E+11+ 3.91E+05– 7.55E+04– 2.14E+01– 1.52E+02– 2.85E+03+ 1.28E+03– 1.48E+03– 2.87E+04– 3.15E+04–
CSA+MR 6.54E+09 2.80E+11 3.53E+05 7.50E+04 2.14E+01 1.51E+02 2.85E+03 1.25E+03 1.44E+03 2.80E+04 3.07E+04
Fun F12 F13 F14 F15 F16 F17 F18 F19 F20 F21 F22
Basic CSA 5.40E+00– 1.41E+01– 1.66E+03– 4.87E+08– 4.83E+01– 2.76E+09– 7.58E+10– 1.88E+04– 1.47E+07– 1.20E+09– 1.03E+06–
CSA+CR 4.36E+00– 9.02E+00+ 8.08E+02+ 1.10E+07+ 4.71E+01– 9.00E+08+ 2.99E+10+ 5.94E+03+ 7.26E+05– 1.93E+08– 2.45E+04–
CSA+MR 4.21E+00 9.04E+00 8.26E+02 1.13E+07 4.68E+01 9.42E+08 3.07E+10 6.21E+03 5.14E+05 1.91E+08 2.27E+04
Fun F23 F24 F25 F26 F27 F28 F29 F30 −/+/≈
Basic CSA 8.90E+03– 1.90E+03– 1.76E+03– 1.51E+03– 7.13E+03– 3.93E+04– 7.01E+09– 3.77E+08– 30/0/0
CSA+CR 2.00E+02– 2.02E+02– 2.00E+02+ 2.00E+02≈ 6.02E+03+ 3.65E+04+ 5.39E+09– 1.56E+08+ 16/13/1
CSA+MR 2.00E+02 2.01E+02 2.00E+02 2.00E+02 6.16E+03 3.69E+04 5.31E+09 1.74E+08
Table 5 Results (mean(std)) of basic CSA with different Hypermutation operators on CEC2014 with varying dimensions

Dimension D=30 D=50 D=100

Fun C-mutation M-mutation C-mutation M-mutation C-mutation M-mutation

F1 0.00E+00(0.00E+00)≈ 0.00E+00(0.00E+00) 4.38E+04(2.85E+04)– 0.00E+00(0.00E+00) 9.17E+05(3.07E+05)– 9.95E+04(4.26E+04)


F2 0.00E+00(0.00E+00)≈ 0.00E+00(0.00E+00) 0.00E+00(0.00E+00)+ 2.51E-07(1.77E-07) 1.00E-05(1.29E-05)+ 2.46E+00(1.21E+00)
F3 0.00E+00(0.00E+00)≈ 0.00E+00(0.00E+00) 0.00E+00(0.00E+00)≈ 0.00E+00(0.00E+00) 1.12E-04(1.65E-04)– 6.59E-07(3.64E-07)
F4 5.32E-01(1.38E+00)+ 1.18E+00(5.44E-01) 2.00E+01(3.28E+00)+ 2.72E+01(8.12E-01) 1.07E+02(3.52E+01)– 8.44E+01(6.18E-01)
F5 2.00E+01(1.21E-04)≈ 2.00E+01(5.33E-05) 2.00E+01(2.28E-04)≈ 2.00E+01(2.44E-04) 2.00E+01(2.15E-03)≈ 2.00E+01(1.87E-03)
F6 4.62E-01(6.97E-01)+ 2.41E+01(9.47E-01) 3.38E+00(1.41E+00)+ 5.09E+01(1.79E+00) 2.04E+01(2.80E+00)+ 1.29E+02(2.61E+00)
F7 4.93E-04(1.88E-03)- 0.00E+00(0.00E+00) 9.86E-04(3.08E-03)– 0.00E+00(0.00E+00) 9.86E-04(2.56E-03)– 0.00E+00(0.00E+00)
F8 3.59E+00(1.62E+00)+ 5.24E+00(1.18E+00) 2.23E+01(2.82E+00)+ 3.26E+01(3.80E+00) 1.69E+02(1.22E+01)+ 2.36E+02(1.64E+01)
F9 2.09E+01(4.45E+00)+ 2.57E+01(5.10E+00) 5.00E+01(6.68E+00)+ 7.68E+01(8.82E+00) 2.52E+02(2.01E+01)+ 3.17E+02(1.82E+01)
F10 9.90E+00(4.02E+00)+ 2.17E+01(8.32E+00) 1.17E+03(2.74E+02)- 1.51E+03(2.24E+02) 1.06E+04(5.21E+02)+ 1.11E+04(5.54E+02)
F11 2.43E+03(3.07E+02)+ 2.66E+03(3.09E+02) 5.59E+03(3.87E+02)+ 5.79E+03(4.10E+02) 1.60E+04(5.55E+02)+ 1.69E+04(5.54E+02)
F12 4.03E-01(8.29E-02)+ 4.21E-01(9.78E-02) 6.89E-01(1.10E-01)– 6.82E-01(1.02E-01) 1.28E+00(1.28E-01)+ 1.33E+00(1.34E-01)
F13 1.55E-01(2.01E-02)+ 1.97E-01(3.21E-02) 2.54E-01(4.21E-02)+ 2.84E-01(2.99E-02) 3.54E-01(4.18E-02)+ 3.67E-01(3.12E-02)
F14 2.48E-01(3.96E-02)+ 1.79E-01(2.76E-02) 3.28E-01(7.26E-02)– 2.22E-01(4.02E-02) 3.57E-01(7.32E-02)– 3.12E-01(9.89E-02)
F15 3.05E+00(6.12E-01)+ 3.39E+00(6.21E-01) 8.75E+00(9.70E-01)+ 9.19E+00(1.24E+00) 3.14E+01(2.47E+00)– 3.00E+01(1.84E+00)
F16 1.04E+01(2.73E-01)+ 1.10E+01(4.18E-01) 1.92E+01(4.43E-01)+ 2.03E+01(3.24E-01) 4.21E+01(4.96E-01)+ 4.35E+01(4.63E-01)
F17 1.09E+03(3.95E+02)– 1.07E+03(1.56E+02) 2.25E+03(5.41E+02)+ 2.64E+03(2.99E+02) 9.63E+03(4.05E+02)– 5.87E+03(6.98E+02)
F18 6.96E+01(1.35E+01)– 1.78E+01(3.32E+00) 1.25E+02(2.99E+01)– 7.78E+01(1.02E+01) 2.86E+02(3.82E+01)– 2.82E+02(9.44E+01)
F19 1.08E+01(1.21E+00)+ 1.25E+01(1.01E+00) 1.52E+01(4.72E+00)+ 1.96E+01(1.56E+00) 4.90E+01(3.82E+00)– 4.31E+01(3.35E-01)
F20 1.72E+01(3.62E+00)– 1.20E+01(1.44E+00) 6.83E+01(2.81E+01)– 4.89E+01(5.76E+00) 4.40E+02(7.73E+01)– 2.18E+02(1.49E+01)
A hybrid clonal selection algorithm with modified combinatorial recombination and success-history...
F21 2.60E+02(1.07E+02)+ 3.29E+02(9.73E+01) 1.22E+03(3.83E+02)+ 1.50E+03(2.48E+02) 2.84E+03(6.88E+02)+ 4.76E+03(3.41E+02)
F22 8.21E+01(5.71E+01)– 6.23E+01(3.75E+01) 3.12E+02(1.75E+02)+ 4.63E+02(1.32E+02) 1.34E+03(1.92E+02)+ 1.53E+03(3.35E+02)
F23 3.14E+02(0.00E+00)≈ 3.14E+02(0.00E+00) 3.37E+02(1.88E-13)≈ 3.37E+02(2.85E-13) 3.45E+02(2.60E-13)≈ 3.45E+02(1.05E-12)
F24 2.24E+02(1.04E+00)– 2.10E+02(1.12E+01) 2.75E+02(2.27E+00)– 2.64E+02(3.30E-01) 3.87E+02(4.18E+00)– 3.84E+02(2.91E+00)
F25 2.00E+02(1.87E-03)≈ 2.00E+02(5.28E-04) 2.01E+02(4.86E+00)– 2.00E+02(1.59E-02) 2.00E+02(1.57E-13)+ 2.01E+02(1.70E-01)
F26 1.00E+02(3.21E-02)≈ 1.00E+02(2.81E-02) 1.00E+02(5.76E-02)≈ 1.00E+02(2.83E-02) 1.50E+02(5.06E+01)– 1.00E+02(2.89E-02)
F27 3.62E+02(1.16E+02)+ 9.05E+02(1.85E+02) 5.89E+02(3.87E+02)+ 1.73E+03(3.70E+01) 1.32E+03(9.45E+02)+ 3.71E+03(5.61E+01)
F28 3.76E+02(4.76E+00)– 3.67E+02(4.63E-01) 3.68E+02(3.75E+00)– 3.57E+02(4.41E-01) 5.10E+02(3.23E+01)– 4.04E+02(5.30E+00)
F29 2.11E+02(2.19E+00)+ 2.14E+02(6.42E-01) 2.13E+02(2.88E+00)+ 2.29E+02(8.98E-01) 2.54E+02(8.22E+00)– 2.43E+02(4.22E+00)
F30 4.64E+02(7.66E+01)– 3.25E+02(4.51E+01) 1.16E+03(2.16E+02)– 9.58E+02(1.77E+02) 3.03E+03(2.98E+02)– 2.85E+03(2.73E+02)
-/+/≈ 7/16/7 11/15/4 15/13/2
Table 6 Results (mean(std)) of MSHCSA and other algorithms on the CEC2014 benchmark functions with 10D

Fun L-SHADE RSDE FWA-DM CETMS OptBees MSHCSA
F1 0.00E+00(0.00E+00)≈ 0.00E+00(0.00E+00)≈ 5.01E+03(1.67E+04)– 0.00E+00(0.00E+00) ≈ 7.80E+02(6.90E+02)– 0.00E+00(0.00E+00)
F2 0.00E+00(0.00E+00)≈ 0.00E+00(0.00E+00)≈ 1.34E-04(9.49E-04)– 0.00E+00(0.00E+00) ≈ 9.80E-03(6.3E−02)– 0.00E+00(0.00E+00)
F3 0.00E+00(0.00E+00)≈ 0.00E+00(0.00E+00)≈ 1.88E-09(9.02E-09)– 0.00E+00(0.00E+00) ≈ 9.20E-01(2.90E+00)– 0.00E+00(0.00E+00)
F4 2.90E+01(1.30E+01)– 2.81E+00(2.81E+00)– 1.41E+00(1.60E+00)– 6.90E+00(1.50E+01)– 2.60E+00(2.80E+00)– 0.00E+00(0.00E+00)
F5 1.40E+01(8.80E+00)+ 1.92E+01(1.92E+01)+ 2.00E+01(4.17E-02)≈ 1.90E+01(7.6E−1)+ 1.90E+01(1.2E−04)+ 2.00E+01(2.00E+01)
F6 1.80E-02(1.30E-01)+ 5.29E-02(5.29E-02)+ 7.06E-01(6.40E-01)+ 1.70E-01(4.0E−1)+ 3.00E+00(1.20E+00)– 1.26E+00(1.26E+00)
F7 3.00E-03(6.50E-03)+ 3.55E-02(3.55E-02)– 9.48E-02(4.92E-02)– 3.10E-02(2.1E−2)– 1.50E-01(1.3E−01)– 6.94E-03(6.94E-03)
F8 0.00E+00(0.00E+00)≈ 6.61E-01(6.61E-01)– 2.54E-01(8.09E-01)– 0.00E+00(0.00E+00) ≈ 0.00E+00(0.00E+00) ≈ 4.46E-09(4.46E-09)
F9 2.30E+00(8.40E-01)+ 8.52E+00(8.52E+00)– 6.01E+00(2.45E+00)– 7.30E+00(2.90E+00)– 2.00E+01(7.50E+00)– 2.80E+00(2.80E+00)
F10 8.60E-03(2.20E-02)+ 6.84E+01(6.84E+01)– 1.59E+00(2.08E+00)– 2.20E+00(2.90E+00)– 2.10E+02(1.00E+02)– 1.94E-01(1.94E-01)
F11 3.20E+01(3.80E+01)+ 2.91E+02(2.91E+02)– 3.72E+02(1.53E+02)– 3.00E+02(2.10E+02)– 3.90E+02(1.60E+02)– 1.62E+02(1.62E+02)
F12 6.80E-02(1.90E-02)+ 2.21E-01(2.21E-01)– 4.25E-02(4.78E-02)+ 1.60E-01(4.0E−2)– 1.30E-01(7.8E−02)+ 1.45E-01(1.45E-01)
F13 5.20E-02(1.50E-02)+ 1.28E-01(1.28E-01)– 1.21E-01(7.18E-02)– 4.70E-02(2.4E−2)+ 4.10E-01(1.8E−01)– 6.81E-02(6.81E-02)
F14 8.10E-02(2.60E-02)– 1.36E-01(1.36E-01)– 2.14E-01(1.20E-01)– 1.00E-01(3.5E−2)– 3.60E-01(1.9E−01)– 7.28E-02(7.28E-02)
F15 3.70E-01(6.90E-02)+ 9.83E-01(9.83E-01)– 7.75E-01(2.63E-01)– 6.30E-01(1.3E−1)– 2.40E+00(1.20E+00)– 4.79E-01(4.79E-01)
F16 1.20E+00(3.00E-01)+ 2.23E+00(2.23E+00)– 1.76E+00(4.68E-01)+ 1.80E+00(2.8E−1)+ 2.60E+00(3.9E−01)– 1.86E+00(1.86E+00)
F17 9.80E-01(1.10E+00)+ 4.77E+01(4.77E+01)– 2.55E+02(1.77E+02)– 1.30E+02(6.60E+01)– 6.80E+02(9.40E+02)– 1.36E+01(1.36E+01)
F18 2.40E-01(3.10E-01)+ 2.00E+00(2.00E+00)– 2.52E+01(1.83E+01)– 1.80E+00(2.40E+00)– 3.30E+01(4.50E+01)– 5.07E-01(5.07E-01)
F19 7.70E-02(6.40E-02)+ 1.03E+00(1.03E+00)+ 1.30E+00(7.75E-01)– 2.90E-01(5.4E−2)+ 9.30E-01(3.7E−01)+ 1.20E+00(1.20E+00)
F20 1.80E-01(1.80E-01)+ 7.21E-01(7.21E-01)– 1.34E+01(1.16E+01)– 4.70E-01(8.8E−1)– 8.90E+00(2.20E+01)– 3.35E-01(3.35E-01)
F21 4.10E-01(3.10E-01)+ 1.21E+00(1.21E+00)+ 9.46E+01(9.80E+01)– 3.80E+00(7.70E+00)– 5.70E+01(6.20E+01)– 1.34E+00(1.34E+00)
F22 4.40E-02(2.80E-02)+ 1.17E+01(1.17E+01)– 3.41E+01(4.40E+01)– 2.90E-01(1.8E−1)+ 1.70E+01(7.40E+00) – 1.07E+01(1.07E+01)
F23 3.30E+02(0.00E+00)– 3.29E+02(3.29E+02)≈ 3.29E+02(5.59E-08)≈ 3.20E+02(0.00E+00)+ 2.70E+02(1.00E+02)+ 3.29E+02(3.29E+02)
F24 1.10E+02(2.30E+00)– 1.19E+02(1.19E+02)– 1.27E+02(2.90E+01)– 1.10E+02(3.20E+00)– 1.30E+02(9.90E+00)– 1.09E+02(1.09E+02)
F25 1.30E+02(4.00E+01)+ 1.30E+02(1.30E+02)+ 1.79E+02(2.76E+01)+ 1.20E+02(8.70E+00)+ 1.40E+02(1.30E+01)+ 1.84E+02(1.84E+02)
F26 1.00E+02(1.60E-02)≈ 1.00E+02(1.00E+02)≈ 1.00E+02(7.47E-02)≈ 1.00E+02(2.0E−2) ≈ 1.00E+02(1.9E−01) ≈ 1.00E+02(1.00E+02)
F27 5.80E+01(1.30E+02)+ 9.12E+01(9.12E+01)+ 3.21E+02(1.21E+02)– 2.20E+00(4.9E−1)+ 7.40E+00(2.40E+00)+ 2.83E+02(2.83E+02)
F28 3.80E+02(3.20E+01)– 3.87E+02(3.87E+02)– 3.47E+02(4.76E+01)– 3.60E+02(6.40E+00) – 3.00E+02(1.9E−01)+ 3.06E+02(3.06E+02)
F29 2.20E+02(4.60E-01)≈ 2.13E+02(2.13E+02)– 2.12E+02(2.08E+01)– 2.20E+02(4.20E+00) – 2.10E+02(2.20E+01)– 2.02E+02(2.02E+02)
F30 4.60E+02(1.30E+01)– 5.05E+02(5.05E+02)– 3.94E+02(1.18E+02)– 5.00E+02(1.40E+01)– 3.80E+02(7.90E+01)– 2.24E+02(2.24E+02)
-/+/≈ 6/18/6 19/6/5 23/4/3 16/9/5 19/9/2
W. Zhang et al.
Table 7 Results (mean(std)) of MSHCSA and other algorithms on the CEC2014 benchmark functions with 30D

Fun L-SHADE RSDE FWA-DM CETMS OptBees MSHCSA
F1 0.00E+00(0.00E+00)≈ 1.50E+03(1.70E+03)– 2.76E+05(1.82E+05)– 0.00E+00(0.00E+00) ≈ 8.50E+04(3.00E+05)– 0.00E+00(0.00E+00)
F2 0.00E+00(0.00E+00)≈ 0.00E+00(5.99E-09)≈ 1.08E-16(1.87E-16)≈ 0.00E+00(0.00E+00) ≈ 0.00E+00(0.00E+00) ≈ 0.00E+00(0.00E+00)
F3 0.00E+00(0.00E+00)≈ 4.74E-02(1.16E-01)– 4.42E-16(4.74E-16)≈ 0.00E+00(0.00E+00) ≈ 8.40E-03(3.7e−2)– 0.00E+00(0.00E+00)
F4 0.00E+00(0.00E+00)+ 3.05E+00(1.34E+01)– 2.04E+01(1.91E+01)– 0.00E+00(0.00E+00)+ 1.20E+01(1.30E+01)– 1.18E+00(5.44E-01)
F5 2.00E+01(3.70E-02)≈ 2.03E+01(9.88E-02)– 2.05E+01(5.36E-02)– 2.00E+01(3.4e−1) ≈ 2.00E+01(1.0e−5) ≈ 2.00E+01(5.33E-05)
F6 1.40E-07(9.90E-07)+ 5.16E+00(2.01E+00)+ 1.29E+01(8.25E+00)+ 1.80E+00(1.70E+00)+ 1.60E+01(3.40E+00)+ 2.41E+01(9.47E-01)
F7 0.00E+00(0.00E+00)≈ 8.46E-04(1.59E-03)– 8.55E-03(9.81E-03)– 6.50E-01(2.60E+00)– 3.70E-02(3.8e−2)– 0.00E+00(0.00E+00)
F8 0.00E+00(0.00E+00)+ 2.04E+01(7.04E+00)– 1.13E-13(4.51E-13)+ 1.30E+01(3.30E+00)– 0.00E+00(0.00E+00)+ 5.24E+00(1.18E+00)
F9 6.80E+00(1.50E+00)+ 5.80E+01(1.65E+01)– 5.66E+01(1.08E+01)– 1.20E+01(5.20E+00)+ 1.30E+02(3.20E+01)– 2.57E+01(5.10E+00)
F10 1.60E-02(1.60E-02)+ 3.29E+02(2.47E+02)– 8.53E+00(2.42E+00)+ 2.00E+03(1.20E+03)– 1.00E+03(2.50E+02)– 2.17E+01(8.32E+00)
F11 1.20E+03(1.80E+02)+ 2.74E+03(6.44E+02)– 2.63E+03(2.48E+02)+ 2.50E+03(1.00E+03)+ 2.70E+03(5.60E+02)– 2.66E+03(3.09E+02)
F12 1.60E-01(2.30E-02)+ 4.44E-01(1.66E-01)– 3.71E-01(6.66E-02)+ 2.60E-03(3.1e−3)+ 1.80E-01(6.1e−2)+ 4.21E-01(9.78E-02)
F13 1.20E-01(1.70E-02)+ 3.05E-01(5.50E-02)– 3.89E-01(5.51E-02)– 5.30E-02(1.9e−2)+ 5.60E-01(1.4e−1)– 1.97E-01(3.21E-02)
F14 2.40E-01(3.00E-02)– 2.36E-01(3.37E-02)– 2.69E-01(7.76E-02)– 5.10E-01(5.6e−2)– 3.90E-01(2.3e−1)– 1.79E-01(2.76E-02)
F15 2.10E+00(2.50E-01)+ 5.92E+00(2.59E+00)– 7.37E+00(8.46E-01)– 2.00E+01(8.30E+01)– 1.20E+01(6.90E+00)– 3.39E+00(6.21E-01)
F16 8.50E+00(4.60E-01)+ 1.06E+01(7.70E-01)– 1.10E+01(2.71E-01)≈ 1.30E+01(1.10E+00)– 1.00E+01(6.9e−1)+ 1.10E+01(4.18E-01)
F17 1.90E+02(7.50E+01)+ 1.24E+03(3.79E+02)– 6.29E+03(5.95E+03)– 1.40E+03(4.40E+02)– 2.70E+04(4.00E+04)– 1.07E+03(1.56E+02)
F18 5.90E+00(2.90E+00)+ 9.54E+01(4.34E+01)– 7.67E+01(3.66E+01)– 1.50E+02(4.70E+01)– 1.90E+02(4.70E+02)– 1.78E+01(3.32E+00)
F19 3.70E+00(6.80E-01)+ 5.65E+00(1.46E+00)+ 9.95E+00(1.93E+00)+ 7.40E+00(1.90E+00)+ 7.80E+00(1.80E+00)+ 1.25E+01(1.01E+00)
F20 3.10E+00(1.50E+00)+ 3.73E+01(2.55E+01)– 4.28E+01(2.61E+01)– 1.10E+02(1.00E+02)– 8.50E+02(7.70E+02)– 1.20E+01(1.44E+00)
F21 8.70E+01(9.00E+01)+ 4.71E+02(2.34E+02)– 7.29E+02(9.49E+02)– 1.00E+03(2.80E+03)– 1.70E+04(1.80E+04)– 3.29E+02(9.73E+01)
F22 2.80E+01(1.80E+01)+ 1.91E+02(1.19E+02)– 1.46E+02(8.83E+01)– 2.00E+02(1.40E+02)– 2.30E+02(9.20E+01)– 6.23E+01(3.75E+01)
F23 3.20E+02(0.00E+00)– 3.15E+02(1.62E-06)– 3.14E+02(9.28E-14)≈ 3.10E+02(3.4e−13)+ 3.10E+02(6.6e−2)+ 3.14E+02(0.00E+00)
F24 2.20E+02(1.10E+00)– 2.24E+02(1.65E+00)– 2.26E+02(3.59E+00)– 2.20E+02(6.50E+00)– 2.30E+02(5.40E+00)– 2.10E+02(1.12E+01)
F25 2.00E+02(5.00E-02)≈ 2.03E+02(1.17E-01)– 2.01E+02(1.98E-01)– 2.00E+02(1.40E+00) ≈ 2.00E+02(1.6e−1) ≈ 2.00E+02(5.28E-04)
F26 1.00E+02(1.60E-02)≈ 1.00E+02(4.14E-02)≈ 1.00E+02(5.35E-02)≈ 1.00E+02(2.20E+01) ≈ 1.00E+02(1.7e−1) ≈ 1.00E+02(2.81E-02)
F27 3.00E+02(0.00E+00)+ 4.69E+02(9.46E+01)+ 4.01E+02(3.06E+01)+ 3.70E+02(4.90E+01)+ 4.00E+02(9.7e−1)+ 9.05E+02(1.85E+02)
F28 8.40E+02(1.40E+01)– 9.05E+02(1.21E+02)– 3.93E+02(1.46E+01)– 9.30E+02(1.20E+02)– 4.30E+02(1.50E+01)– 3.67E+02(4.63E-01)
F29 7.20E+02(5.10E+00)– 6.52E+05(2.66E+06)– 2.11E+02(2.90E+00)+ 1.50E+06(3.40E+06)– 2.10E+02(1.10E+00)+ 2.14E+02(6.42E-01)
F30 1.20E+03(6.20E+02)– 1.70E+03(8.67E+02)– 4.51E+02(1.96E+02)– 2.90E+03(1.10E+03)– 5.90E+02(9.80E+01)– 3.25E+02(4.51E+01)
-/+/≈ 6/17/7 24/4/2 17/8/5 15/9/6 18/8/4
Table 8 Results (mean(std)) of MSHCSA and other algorithms on the CEC2014 benchmark functions with 50D

Fun L-SHADE RSDE FWA-DM CETMS OptBees MSHCSA
F1 1.20E+03(1.50E+03)– 2.25E+04(1.21E+04)– 6.15E+06(2.39E+06)– 0.00E+00(0.00E+00) ≈ 1.30E+05(9.70E+04)– 0.00E+00(0.00E+00)
F2 0.00E+00(0.00E+00)+ 3.58E+03(6.56E+03)– 4.83E+03(6.06E+03)– 0.00E+00(0.00E+00) + 1.00E-03(2.9e−3)– 2.51E-07(1.77E-07)
F3 0.00E+00(0.00E+00)≈ 4.10E-01(1.38E+00)– 7.66E+01(2.46E+01)– 0.00E+00(0.00E+00) ≈ 1.70E+02(1.70E+02)– 0.00E+00(0.00E+00)
F4 5.90E+01(4.60E+01)– 6.41E+01(3.62E+01)– 5.05E+01(1.88E+01)– 7.10E+01(4.40E+01)– 3.80E+01(3.40E+01)– 2.72E+01(8.12E-01)
F5 2.00E+01(4.60E-02)≈ 2.05E+01(9.61E-02)– 2.07E+01(3.61E-02)– 2.00E+01(3.3e−1) ≈ 2.00E+01(2.5e−5) ≈ 2.00E+01(2.44E-04)
F6 2.60E-01(5.20E-01)+ 1.72E+01(4.33E+00)+ 4.39E+01(2.31E+00)+ 4.20E+00(2.60E+00) + 3.00E+01(4.50E+00)+ 5.09E+01(1.79E+00)
F7 0.00E+00(0.00E+00)≈ 2.40E-03(4.40E-03)– 2.66E-03(4.25E-03)– 4.50E-01(1.90E+00)– 2.30E-02(3.9e−2)– 0.00E+00(0.00E+00)
F8 2.60E-09(7.50E-09)+ 5.04E+01(1.24E+01)– 9.17E+00(2.39E+00)+ 2.50E+01(5.50E+00)+ 0.00E+00(0.00E+00)+ 3.26E+01(3.80E+00)
F9 1.10E+01(2.10E+00)+ 1.42E+02(3.51E+01)– 1.47E+02(1.57E+01)– 2.40E+01(8.10E+00)+ 2.40E+02(4.80E+01)– 7.68E+01(8.82E+00)
F10 1.20E-01(4.10E-02)+ 1.52E+03(9.46E+02)– 6.14E+02(1.42E+02)+ 3.90E+03(1.20E+03)– 1.80E+03(3.40E+02)– 1.51E+03(2.24E+02)
F11 3.20E+03(3.30E+02)+ 6.15E+03(1.05E+03)– 5.43E+03(4.02E+02)+ 4.20E+03(1.50E+03)+ 5.10E+03(6.80E+02)+ 5.79E+03(4.10E+02)
F12 2.20E-01(2.80E-02)+ 5.38E-01(2.02E-01)+ 6.02E-01(8.72E-02)+ 1.50E-03(1.1e−3)+ 1.60E-01(4.6e−2)+ 6.82E-01(1.02E-01)
F13 1.60E-01(1.80E-02)+ 4.23E-01(6.05E-02)– 4.88E-01(4.63E-02)– 1.10E-01(4.2e−2)+ 6.00E-01(1.1e−1)– 2.84E-01(2.99E-02)
F14 3.00E-01(2.50E-02)– 2.78E-01(2.25E-02)– 3.30E-01(1.05E-01)– 5.00E-01(6.6e−2)– 4.70E-01(2.3e−1)– 2.22E-01(4.02E-02)
F15 5.20E+00(5.10E-01)+ 9.96E+00(6.53E+00)– 2.08E+01(1.45E+00)– 1.00E+01(2.20E+01)– 2.60E+01(9.10E+00)– 9.19E+00(1.24E+00)
F16 1.70E+01(4.80E-01)+ 1.93E+01(8.42E-01)+ 2.00E+01(3.21E-01)+ 2.20E+01(1.50E+00)– 1.80E+01(8.7e−1)+ 2.03E+01(3.24E-01)
F17 1.40E+03(5.10E+02)+ 4.10E+03(1.68E+03)– 1.08E+05(6.03E+04)– 2.10E+03(5.40E+02)+ 3.20E+04(3.50E+04)– 2.64E+03(2.99E+02)
F18 9.70E+01(1.40E+01)– 3.40E+02(2.38E+02)– 3.31E+03(3.72E+03)– 2.60E+02(6.40E+01)– 6.80E+02(1.00E+03)– 7.78E+01(1.02E+01)
F19 8.30E+00(1.80E+00)+ 1.46E+01(2.03E+00)+ 2.11E+01(3.17E+00)– 1.20E+01(3.10E+00)+ 1.50E+01(2.70E+00)+ 1.96E+01(1.56E+00)
F20 1.40E+01(4.60E+00)+ 1.60E+02(7.30E+01)– 4.00E+02(8.12E+01)– 3.50E+02(1.50E+02)– 1.90E+03(1.40E+03)– 4.89E+01(5.76E+00)
F21 5.20E+02(1.50E+02)+ 1.58E+03(6.55E+02)– 2.55E+04(1.58E+04)– 1.30E+03(4.20E+02)+ 6.00E+04(4.10E+04)– 1.50E+03(2.48E+02)
F22 1.10E+02(7.50E+01)+ 4.61E+02(2.34E+02)+ 5.45E+02(1.25E+02)– 3.60E+02(1.90E+02)+ 7.50E+02(2.00E+02)– 4.63E+02(1.32E+02)
F23 3.40E+02(4.40E-13)– 3.44E+02(1.19E-05)– 3.37E+02(1.31E-05)≈ 3.40E+02(2.8e−13)– 3.30E+02(2.8e−1)+ 3.37E+02(2.85E-13)
F24 2.80E+02(6.60E-01)– 2.76E+02(2.38E+00)– 2.66E+02(3.48E+00)– 2.70E+02(1.10E+00)– 2.60E+02(3.20E+00)+ 2.64E+02(3.30E-01)
F25 2.10E+02(3.60E-01)– 2.06E+02(7.81E-01)– 2.05E+02(2.26E+00)– 2.00E+02(2.70E+00) ≈ 2.00E+02(6.0e−1) ≈ 2.00E+02(1.59E-02)
F26 1.00E+02(1.40E+01)≈ 1.12E+02(4.97E+01)– 1.00E+02(5.98E-02)≈ 1.70E+02(1.00E+02)– 1.00E+02(1.2e−1) ≈ 1.00E+02(2.83E-02)
F27 3.30E+02(3.00E+01)+ 8.04E+02(1.00E+02)+ 1.45E+03(7.27E+01)+ 4.50E+02(7.30E+01)+ 1.10E+03(2.10E+02)+ 1.73E+03(3.70E+01)
F28 1.10E+03(2.90E+01)– 1.61E+03(3.90E+02)– 4.01E+02(3.06E+01)– 1.20E+03(8.60E+01)– 4.40E+02(3.80E+01)– 3.57E+02(4.41E-01)
F29 7.90E+02(2.40E+01)– 5.28E+06(1.64E+07)– 2.23E+02(2.67E+00)+ 2.20E+06(9.10E+06)– 2.30E+02(2.30E+00)– 2.29E+02(8.98E-01)
F30 8.70E+03(4.10E+02)– 1.12E+04(1.75E+03)– 8.36E+02(3.06E+02)+ 9.80E+03(1.70E+03)– 8.10E+02(1.20E+02)+ 9.58E+02(1.77E+02)
-/+/≈ 10/16/4 24/6/0 19/9/2 14/12/4 17/10/3
Table 9 Results (mean(std)) of MSHCSA and other algorithms on the CEC2014 benchmark functions with 100D

Fun L-SHADE RSDE FWA-DM CETMS OptBees MSHCSA
F1 1.70E+05(5.70E+04)– 8.33E+05(2.89E+05)– 2.28E+08(4.08E+07)– 0.00E+00(0.00E+00)+ 2.90E+05(1.00E+05)– 9.95E+04(4.26E+04)
F2 0.00E+00(0.00E+00)+ 7.39E+03(9.84E+03)– 1.62E+04(1.81E+04)– 0.00E+00(0.00E+00)+ 1.00E+01(2.90E+01)– 2.46E+00(1.21E+00)
F3 0.00E+00(0.00E+00)+ 9.77E-01(2.21E+00)– 2.95E+04(3.64E+03)– 0.00E+00(0.00E+00)+ 7.50E+02(7.80E+02)– 6.59E-07(3.64E-07)
F4 1.70E+02(3.10E+01)– 1.86E+02(4.06E+01)– 1.83E+02(1.00E+02)– 1.70E+02(4.10E+01)– 1.40E+02(5.40E+01)– 8.44E+01(6.18E-01)
F5 2.10E+01(3.10E-02)– 2.08E+01(7.85E-02)– 2.10E+01(2.44E-02)– 2.00E+01(3.2e−1) ≈ 2.00E+01(2.4e−5) ≈ 2.00E+01(1.87E-03)
F6 8.70E+00(2.30E+00)+ 6.02E+01(7.48E+00)+ 1.14E+02(2.54E+00)+ 1.00E+01(4.80E+00)+ 7.10E+01(7.70E+00)+ 1.29E+02(2.61E+00)
F7 0.00E+00(0.00E+00)≈ 1.27E-03(2.71E-03)– 1.29E-01(3.14E-02)– 1.90E-01(9.6e−1)– 5.80E-03(8.0e−3)– 0.00E+00(0.00E+00)
F8 1.10E-02(7.40E-03)+ 1.94E+02(3.02E+01)+ 1.08E+02(5.40E+00)+ 5.90E+01(1.20E+01)+ 0.00E+00(0.00E+00)+ 2.36E+02(1.64E+01)
F9 3.40E+01(5.00E+00)+ 3.20E+02(5.41E+01)– 5.52E+02(4.44E+01)– 6.20E+01(8.60E+00)+ 6.60E+02(9.70E+01)– 3.17E+02(1.82E+01)
F10 2.60E+01(5.80E+00)+ 9.31E+03(2.67E+03)+ 5.67E+03(3.04E+02)+ 9.10E+03(1.60E+03)+ 4.20E+03(4.20E+02)+ 1.11E+04(5.54E+02)
F11 1.10E+04(5.60E+02)+ 1.55E+04(1.54E+03)+ 1.46E+04(6.30E+02)+ 1.00E+04(1.80E+03)+ 1.20E+04(1.10E+03)+ 1.69E+04(5.54E+02)
F12 4.40E-01(4.70E-02)+ 7.42E-01(1.97E-01)+ 1.21E+00(7.78E-02)+ 7.40E-04(3.7e−4)+ 2.30E-01(5.0e−2)+ 1.33E+00(1.34E-01)
F13 2.40E-01(2.10E-02)+ 5.44E-01(4.09E-02)– 5.61E-01(3.61E-02)– 2.00E-01(4.1e−2)+ 5.80E-01(8.1e−2)– 3.67E-01(3.12E-02)
F14 1.20E-01(7.30E-03)+ 2.09E-01(1.23E-02)+ 1.89E-01(2.06E-02)+ 2.80E-01(2.4e−2)+ 2.20E-01(2.2e−2)+ 3.12E-01(9.89E-02)
F15 1.60E+01(1.20E+00)+ 5.24E+01(1.82E+01)– 8.74E+01(5.87E+00)– 2.20E+01(4.70E+01)+ 6.50E+01(1.80E+01)– 3.00E+01(1.84E+00)
F16 3.90E+01(4.80E-01)+ 4.24E+01(1.21E+00)+ 4.35E+01(3.75E-01)≈ 4.60E+01(1.10E+00)– 4.00E+01(1.10E+00)+ 4.35E+01(4.63E-01)
F17 4.40E+03(7.10E+02)+ 9.86E+04(4.60E+04)– 2.31E+07(5.63E+06)– 5.30E+03(6.90E+02)+ 1.00E+05(6.70E+04)– 5.87E+03(6.98E+02)
F18 2.20E+02(1.70E+01)+ 1.26E+03(1.08E+03)– 5.68E+03(8.70E+03)– 5.90E+02(1.00E+02)– 1.50E+03(2.10E+03)– 2.82E+02(9.44E+01)
F19 9.60E+01(2.30E+00)– 8.16E+01(2.58E+01)– 6.34E+01(2.43E+00)– 1.00E+02(1.60E+01)– 5.20E+01(1.50E+01)– 4.31E+01(3.35E-01)
F20 1.50E+02(5.20E+01)+ 5.50E+02(1.76E+02)– 6.93E+04(1.10E+04)– 7.10E+02(1.60E+02)– 1.00E+04(4.20E+03)– 2.18E+02(1.49E+01)
F21 2.30E+03(5.30E+02)+ 3.49E+04(1.90E+04)– 9.57E+06(2.31E+06)– 3.10E+03(5.40E+02)+ 3.10E+05(1.50E+05)– 4.76E+03(3.41E+02)
F22 1.10E+03(1.90E+02)+ 1.51E+03(4.63E+02)+ 1.51E+03(1.34E+02)+ 7.60E+02(3.60E+02)+ 2.00E+03(3.30E+02)– 1.53E+03(3.35E+02)
F23 3.50E+02(2.80E-13)– 3.48E+02(2.12E-03)– 3.46E+02(2.18E-01)– 3.40E+02(7.6e−11)+ 3.40E+02(9.2e−1)+ 3.45E+02(1.05E-12)
F24 3.90E+02(2.90E+00)– 4.06E+02(5.67E+00)– 3.63E+02(2.85E+00)+ 3.90E+02(5.90E+00)– 3.40E+02(1.00E+01)+ 3.84E+02(2.91E+00)
F25 2.00E+02(4.00E-13)+ 2.42E+02(7.50E+00)– 3.03E+02(1.74E+01)– 2.20E+02(6.00E+00)– 2.00E+02(1.10E+00)+ 2.01E+02(1.70E-01)
F26 2.00E+02(6.20E-13)– 1.98E+02(4.97E+01)– 1.61E+02(5.88E+01)– 2.00E+02(6.60E+01)– 1.00E+02(6.8e−2) ≈ 1.00E+02(2.89E-02)
F27 3.80E+02(3.30E+01)+ 2.01E+03(1.65E+02)+ 3.14E+03(4.13E+02)+ 5.90E+02(1.00E+02)+ 2.10E+03(1.60E+02)+ 3.71E+03(5.61E+01)
F28 2.30E+03(4.60E+01)– 4.11E+03(7.20E+02)– 1.60E+03(3.42E+02)– 2.50E+03(2.80E+02)– 6.10E+02(4.00E+01)– 4.04E+02(5.30E+00)
F29 8.00E+02(7.60E+01)– 8.29E+07(8.20E+07)– 2.70E+02(5.23E+00)– 7.40E+02(2.80E+01)– 2.70E+02(3.20E+00)– 2.43E+02(4.22E+00)
F30 8.30E+03(9.60E+02)– 1.31E+04(2.31E+03)– 2.23E+03(1.16E+03)+ 8.80E+03(1.00E+03)– 2.80E+03(2.40E+00)+ 2.85E+03(2.73E+02)
-/+/≈ 10/19/1 21/9/0 19/10/1 12/17/1 16/14/2
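The "-/+/≈" rows in Tables 5-9 summarize, for each competitor, on how many of the 30 functions it is worse than, better than, or statistically even with the reference algorithm. A minimal sketch of such a tally for a minimization problem is given below; it assumes a plain mean-based comparison with a relative tolerance, whereas the paper's counts come from comparisons over repeated runs whose exact statistical test is not reproduced here, and the function name `tally` is illustrative:

```python
def tally(competitor_means, reference_means, tol=1e-8):
    """Count on how many functions a competitor is worse (-), better (+),
    or even (~) than a reference algorithm, when minimizing error values.

    competitor_means, reference_means: per-function mean errors, same order.
    """
    worse = better = even = 0
    for c, r in zip(competitor_means, reference_means):
        if abs(c - r) <= tol * max(abs(c), abs(r), 1.0):
            even += 1      # difference within tolerance: counted as ~
        elif c > r:
            worse += 1     # larger mean error: competitor is worse (-)
        else:
            better += 1    # smaller mean error: competitor is better (+)
    return worse, better, even

# Example with three hypothetical functions:
print(tally([1.0, 0.0, 5.0], [2.0, 0.0, 1.0]))  # -> (1, 1, 1)
```

With such a helper, each "-/+/≈" row is simply the tally of one competitor column against the MSHCSA column.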
employed in accordance with their respective literatures. The experimental results on all 30 test functions from CEC2014, obtained with 10D, 30D, 50D and 100D, are displayed in Tables 6, 7, 8 and 9.

When the dimension is 10D, as shown in Table 6, four algorithms (L-SHADE, RSDE, CETMS and MSHCSA) achieve the global optima on all of the unimodal functions (F1-F3). Only MSHCSA succeeds in locating the global optimum of F4. From F5 to F11, L-SHADE performs best among the compared algorithms; against the remaining algorithms, MSHCSA shows no clear advantage but is generally acceptable. On the hybrid functions, L-SHADE presents an overwhelming advantage over the other algorithms, while MSHCSA is better than FWA-DM (on all hybrid functions F17-F22) and OptBees (F17-F22 except F19). On the composition functions (F23-F30), MSHCSA attains the best results on four functions (F24, F26, F29, and F30). As analyzed in Section 4.2, the modified success-history based adaptive mutation is not good at coping with multimodal functions, but shows advantages on unimodal and composition functions.

When the dimension is 30D, as shown in Table 7, only three algorithms (L-SHADE, CETMS and MSHCSA) achieve the global optima on all of the unimodal functions (F1-F3). Even though L-SHADE still maintains an advantage on most of the multimodal functions, MSHCSA obtains the best results on three of them (F5, F7 and F14). The results on the hybrid functions (F17-F22) do not differ much from the 10D case. MSHCSA achieves the best performance on 5 out of the 8 composition functions (F23-F30).

When the dimension is 50D, as shown in Table 8, only CETMS achieves the global optima on all of the unimodal functions (F1-F3). MSHCSA obtains the global optimum on F1 and F3, and an acceptably small error (2.51E-07) on F2. The advantage of L-SHADE in handling the multimodal functions continues to decline; it presents the best results on 8 out of 16 functions, while MSHCSA obtains four best results (F4, F5, F7, and F14). On the hybrid functions (F17-F22), L-SHADE wins the comparison on 5 out of 6 functions, the exception being F18, on which MSHCSA obtains the best result. On the composition functions, OptBees and MSHCSA are ahead of the other algorithms.

When the dimension is 100D, as shown in Table 9, CETMS keeps its advantage of locating the global optimum on all of the unimodal functions (F1-F3). MSHCSA does not obtain satisfactory results there, but it is still superior to RSDE, FWA-DM and OptBees in handling the unimodal functions. On the multimodal functions (F4-F16), L-SHADE obtains the best results on 8 out of 16 functions, while MSHCSA obtains the best results on 3 functions. Although the performance of all compared algorithms is unsatisfactory on the hybrid functions, L-SHADE shows a clear advantage there; MSHCSA is better than OptBees on all of these functions, superior to RSDE and FWA-DM on 5 out of 6 of them, and on par with CETMS. OptBees and MSHCSA maintain relatively better performance than the other algorithms in coping with the composition functions.

Looking through the overall performance of the algorithms on the benchmark functions with different dimensions, we can draw a summary. As indicated by the No Free Lunch theorem, every algorithm presents its own advantages in different aspects. Moreover, with the increase of dimension, the performance of most algorithms gradually deteriorates. Even so, the proposed MSHCSA obtains competitive results on most of the CEC2014 benchmark functions. On the unimodal group, MSHCSA achieves the best results with 10D and 30D, and suboptimal results with 50D; even with 100D, it is better than RSDE, FWA-DM and OptBees. Moreover, MSHCSA shows a potential capability of handling the composition functions. On the whole, MSHCSA is superior to RSDE, FWA-DM and OptBees on most of the benchmark functions at all tested dimensions, and attains better results than CETMS on most of the benchmark functions at low dimensions. The satisfactory performance of MSHCSA is due to the introduction of the modified combinatorial recombination and the success-history based adaptive mutation, where the former brings diversity to the population and the latter improves the search ability. Moreover, the gene knockout strategy further improves the precision and helps the search escape stagnation. Therefore, the proposed MSHCSA is able to solve complicated optimization problems effectively.

5 Conclusions

In this paper, a novel clonal selection algorithm denoted MSHCSA is proposed. The combinatorial recombination is introduced and modified to bring more diversity, while the success-history based adaptive clonal selection is developed to enhance the search ability. Moreover, the gene knockout strategy is proposed to further improve the accuracy and avoid stagnation. The proposed algorithm is tested on the CEC 2014 benchmark functions, which cover unimodal, multimodal and composition test problems constructed by extracting features dimension-wise from several problems, with graded levels of linkage and rotated trap problems, and with dimensions ranging from 10D to 100D. The two main introduced strategies are analyzed first, and then the proposed algorithm is compared with state-of-the-art algorithms. The experimental results show that the newly developed strategies greatly improve the performance of the algorithm, and that MSHCSA is very competitive compared with the other algorithms.
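The success-history based adaptation that MSHCSA borrows from SHADE keeps a small circular memory of control-parameter values; after each generation, the parameter values that produced improved offspring overwrite one memory slot via a weighted Lehmer mean. A minimal sketch of that memory update follows, assuming a single mutation-scale parameter F and a clipped Gaussian sampler in place of SHADE's Cauchy draw; the class name `SuccessHistory` and all constants are illustrative, not the authors' exact implementation:

```python
import random

class SuccessHistory:
    """Circular memory of mutation-scale values, updated SHADE-style."""

    def __init__(self, size=5, init_f=0.5):
        self.memory = [init_f] * size
        self.k = 0  # index of the next memory slot to overwrite

    def sample_f(self):
        # Draw F near a randomly chosen historical mean, clipped to [0, 1]
        # (SHADE uses a Cauchy draw; a Gaussian is used here for brevity).
        mu = random.choice(self.memory)
        return min(max(random.gauss(mu, 0.1), 0.0), 1.0)

    def update(self, successful_f, improvements):
        """successful_f: F values that produced a better offspring;
        improvements: the corresponding fitness gains, used as weights."""
        if not successful_f:
            return  # no successes this generation: memory unchanged
        total = sum(improvements)
        w = [d / total for d in improvements]
        denom = sum(wi * f for wi, f in zip(w, successful_f))
        if denom == 0.0:
            return
        # The weighted Lehmer mean biases the memory toward larger useful F.
        lehmer = sum(wi * f * f for wi, f in zip(w, successful_f)) / denom
        self.memory[self.k] = lehmer
        self.k = (self.k + 1) % len(self.memory)
```

Each generation, every individual samples its own F via `sample_f()`; the F values that improved their parent, together with the sizes of those improvements, are fed back through `update()`, so the memory gradually concentrates on parameter values that have recently worked.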
Acknowledgements This work is supported by the National Natural Science Foundation of China (No. 61403349, 61501405, 41601418), the Funding Program for Key Scientific Research Projects of Universities in Henan Province (No. 18A210025), the Science and Technology Research Key Project of Basic Research Projects in the Education Department of Henan Province (No. 15A520033, No. 14B520066), the doctoral foundation (No. 2013BSJJ044), and the Student Science and Technology Activity Project of Zhengzhou University of Light Industry.
Weiwei Zhang is currently an assistant professor in Zhengzhou University of Light Industry. She received the M.S. and Ph.D. degrees in computer science from Chongqing University, Chongqing, China, in 2008 and 2013 respectively. Her current research interests include evolutionary computation, dynamic optimization, image processing, and intelligent control.

Kui Gao received the B.S. degree in information technology from Zhengzhou University of Light Industry in the summer of 2018 and will be a graduate student in Henan University of Technology in the coming autumn. His current research interests include artificial immune systems and robotic control.

Weizheng Zhang is currently an assistant professor in Zhengzhou University of Light Industry. He received the M.S. degree in Signal and Information Processing from Shandong University of Science and Technology, and the Ph.D. degree in Agricultural Electrification and Automation from Zhejiang University, China, in 2012 and 2016 respectively. His current research interests include intelligent computing, agricultural engineering, and image processing.

Xiao Wang received the B.S. degree from Henan Normal University in 2004, the M.S. degree from the University of Electronic Science and Technology of China in 2007, and the Ph.D. degree from Tongji University in 2013. He is currently an assistant professor at the School of Computer and Communication Engineering, Zhengzhou University of Light Industry, Zhengzhou, China. His research interests include evolutionary optimization, machine learning, and bioinformatics.

Qiuwen Zhang received the Ph.D. degree in communication and information systems from Shanghai University, Shanghai, China, in 2012. Since 2012, he has been with the faculty of the College of Computer and Communication Engineering, Zhengzhou University of Light Industry, where he is currently an Associate Professor. His major research interests include 3D signal processing, pattern recognition, evolutionary optimization, and multimedia communication.

Hua Wang is currently an assistant professor at the School of Computer and Communication Engineering, Zhengzhou University of Light Industry, Zhengzhou, China. He received the B.S. and Ph.D. degrees from Wuhan University, Wuhan, China, in 2008 and 2013 respectively. His current research interests include evolutionary computation, geographic information systems, and image processing.