Available online at www.sciencedirect.com

ScienceDirect

Procedia Computer Science 167 (2020) 263–272
www.elsevier.com/locate/procedia
doi: 10.1016/j.procs.2020.03.220
International Conference on Computational Intelligence and Data Science (ICCIDS 2019)
Analysis of New Distributed Differential Evolution Algorithm with Best Determination Method and Species Evolution

Amit Ramesh Khaparde
Assistant Professor, Computer Science & Engineering, G. B. Pant Govt Engineering College, New Delhi, India

Abstract

This paper presents a new semi-distributed differential evolution algorithm with a best determination method and species evolution (DESBS). The algorithm is based on the fact that species evolve around niches. In DESBS, the best determination method identifies the best individuals of the population. These best individuals act as niches, and species are evolved around them. Over a period of time, each species evolves separately using the standard differential evolution algorithm. The evolving efficiency of each species is evaluated separately, and inefficient species are merged into nearby species. A scale-up study is performed to find the best parameter setting, and the results are compared with other state-of-the-art algorithms such as CoDE, EPSDE and the standard differential evolution algorithm. The results show that DESBS performs better than SDE on multimodal functions and better than CoDE and EPSDE on rotated optimization functions.
© 2020 The Authors. Published by Elsevier B.V.
This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Peer-review under responsibility of the scientific committee of the International Conference on Computational Intelligence and Data Science (ICCIDS 2019).
Keywords: Evolutionary Algorithm; hybrid composite function; optimization; niches

1. Introduction

Optimization is a methodology for finding the best solution from a set of feasible solutions. In engineering, optimization means maximizing system or application performance using minimal resources. Optimization problems are solved using two different categories of techniques [1]: exact algorithms and stochastic algorithms. Evolutionary algorithms (EA) are metaheuristics; they belong to the family of stochastic problem-solving techniques. EAs use evolutionary computational models as the prime element of their operation. Since the 1960s, many algorithms with population-based, fitness-oriented and variation-driven properties have been proposed [2], such as the genetic algorithm, evolutionary programming, evolution strategies, swarm intelligence, the differential evolution algorithm and genetic programming. They are all based on the common concept of evolution using selection, mutation, and reproduction.

The differential evolution algorithm (DE) [3], a heuristic method, solves non-linear, non-differentiable continuous-space optimization problems very effectively and efficiently, because it is simple to use, robust, requires few control parameters and lends itself to parallel computation [4][5][6][7]. DE has a higher convergence speed than other evolutionary algorithms and is well suited to engineering optimization problems [8]. Applications of the differential evolution algorithm are available in electronic engineering [9], electrical engineering [10], combinatorial mathematics [11], civil engineering [12], aeronautical engineering [13], operations research [14], the education sector [15], logistics design [16] and other soft computing techniques [17]. There are various other evolutionary algorithms, used alongside DE or as hybrid evolutionary algorithms, for solving optimization problems [18-21].

2. Differential Evolution Algorithm

The differential evolution algorithm (DE) is a parallel, direct search method. It has three decision parameters: population size (NP), crossover rate (Cr) and scaling factor (F). It proceeds as a cycle of a generation phase, mutation and crossover, followed by selection. Initially, NP individuals are generated by the following simple function:

X_{i,j} = X_i^{min} + rand[0,1] (X_i^{max} - X_i^{min})                                    (1)

X_i^{max} and X_i^{min} denote the upper and lower bounds of the decision variables, respectively. Each vector is denoted as X_{i,j} = (X^1_{i,j}, X^2_{i,j}, ..., X^D_{i,j}), where i = 1, 2, ..., NP and D is the number of variables in the jth generation. Evaluation of the newly created population is followed by mutation. Mutation gives the mutant vector; for each vector in the population, a mutant vector is created using the following function:

M_{i,j} = X_{r1,j} + F (X_{r2,j} - X_{r3,j})                                               (2)

r1, r2, r3 ∈ {1, ..., NP} are integers different from each other, and F > 0, F ∈ [0, 2], is the scaling factor. There are two different techniques [3] for selecting r1: either any random vector or the best vector of the given generation. Recombination (crossover), the next stage, provides diversity in the population. The trial vector is produced by mixing the mutant vector and the target vector; all population members become target vectors during evolution. Crossover can be represented as:

V_{i,j} = { M_{i,j},  if rand(i) ≤ Cr or i = randi(j)
          { X_{i,j},  otherwise                                                            (3)
rand(i) generates a random number in (0, 1], Cr is the crossover probability lying in (0, 1], and randi(j) is a random index in (1, D] which ensures that V_{i,j} has at least one parameter from M_{i,j}. Standard DE has two different types of recombination (crossover): binomial crossover and exponential crossover. The scheme above is binomial crossover.

Last is selection. DE has a very greedy approach to selection: tournament selection is applied between the trial vector and the target vector, and the fitter one is kept for the next generation. The selection process is represented by equation (4).

X_{i,j+1} = { V_{i,j},  if f(V_{i,j}) ≤ f(X_{i,j})
            { X_{i,j},  otherwise                                                          (4)
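To make the cycle of equations (1)-(4) concrete, the following is a minimal Python sketch of the DE/rand/1/bin variant described above. It is a sketch only, not the author's MATLAB implementation; the sphere objective, bounds and parameter values in the usage line are illustrative assumptions.

```python
import numpy as np

def de(fitness, dim=30, np_size=50, F=0.5, Cr=0.9, max_gen=1000, bounds=(-100.0, 100.0)):
    """Minimal DE/rand/1/bin cycle following equations (1)-(4)."""
    lo, hi = bounds
    # Eq. (1): random initial population inside the variable bounds
    pop = lo + np.random.rand(np_size, dim) * (hi - lo)
    fit = np.array([fitness(x) for x in pop])
    for _ in range(max_gen):
        for i in range(np_size):
            # Eq. (2): three mutually distinct random vectors, all different from i
            r1, r2, r3 = np.random.choice([k for k in range(np_size) if k != i], 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])
            # Eq. (3): binomial crossover, keeping at least one gene from the mutant
            j_rand = np.random.randint(dim)
            mask = np.random.rand(dim) <= Cr
            mask[j_rand] = True
            trial = np.where(mask, mutant, pop[i])
            # Eq. (4): greedy one-to-one selection between trial and target vector
            f_trial = fitness(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    return pop[np.argmin(fit)], fit.min()

# Example usage on the sphere function (an illustrative test problem)
best_x, best_f = de(lambda x: float(np.sum(x ** 2)), max_gen=200)
```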
Although DE has only three control parameters, setting them is a difficult task [22]. If the parameter setting is not done properly, it causes premature convergence or stagnation [23]. The control parameters allow the user to control population diversity during evolution [26]. There are some theoretical rules for choosing the control parameters [27][28] in the DE literature. Self-adaptation of the control parameters [29] has largely solved the parameter-setting problem.

The performance of DE relies on the initial population. Hence, a novel population initialization method [24] is required to generate a well-distributed initial population in the search space. The parameters of the strategy operators and the population size affect the performance of the differential evolution algorithm. A small population with a greedy strategy gives high convergence speed but has a high probability of premature convergence and stagnation. In contrast, a large population with a strategy of good exploration capacity reduces the probability of premature convergence and stagnation but slows the convergence speed [25].

The mutation strategy (MS) affects the convergence speed [30] and the parameter setting [31] in DE. An MS with low selection pressure [32] causes stagnation or premature convergence; therefore, a proper combination of exploration and exploitation is required during evolution. The performance of mutation strategies depends on the type of problem [33][34], and their settings vary according to the nature of the problem [35][36].

3. Related Work

Balanced exploration and exploitation is the phenomenon used to enhance the performance of DE [37]; it is achieved using a new framework with a second enhanced mutation operator [38], AMDE [40] and an asymmetric mutation operator [41]. A newly proposed mutation scheme, DE/current-to-gr_best/1 [39], a variant of the classical DE/current-to-best/1 scheme, has improved the performance of DE.
Unlike other EAs, DE does not depend on any probability density function (PDF). However, variants of DE such as Gaussian PBX-alpha (GPBX-alpha) [42], improved CRDE [43], the co-evolutionary method [44] and many others use a PDF to improve DE performance; the use of a PDF introduces extra decision parameters and costly computation.
In DE, the selection of individuals is done at two different levels: first, before mutation, DE selects random individuals for the mutation; second, at the selection phase, it selects the population for the next generation. The proximity-based mutation operator [45] selects individuals for mutation based on the distance between them and improves the performance of DE. The rank-based mutation operator [46], which uses the fitness value for vector selection, has enhanced the performance of SDE and other variants. Using Lagrange's mean value theorem in mutation [47] and eliminating recombination from DE enhances the performance of SDE, but it adds extra computation to the algorithm. In Modified Random Location (MRL) [48], the search space is divided into regions, which has increased the performance of SDE. A geometrical-centroid-based mutation operator (CDE) with local search in evolution [49] gives a better trade-off in convergence speed and improves the efficiency of standard DE. There are various mutation strategies in SDE; run-time selection of the mutation strategy based on historical knowledge or the problem trajectory during evolution gives the desired results on various problems [50]. Composite DE (CoDE) [51] uses the best of three different trial vector generation strategies for evolution to improve performance. The concepts of subpopulations and a covariance matrix in ISAMODE-CMA [21] have drastically improved performance compared to other algorithms.
On non-separable functions, DE does not make efficient progress because it lacks diversity and sufficient bias in the mutation step. Combinatorial Sampling Differential Evolution (CSDE) [53] addresses this problem and gives high convergence speed towards the global optimum. In the self-adaptive DE (SaDE) algorithm [54], the trial vector generation strategies and their associated control parameter values gradually overcome this problem.
Multimodal optimization problems are solved very efficiently by the neighborhood mutation operator [55][56]; mutation performed within a Euclidean neighborhood yields multiple optima during evolution. In EPSDE [57][58], competition between pools of distinct mutation and crossover strategies and control parameters for producing offspring is a successful tactic for solving such problems.
The efficiency of SDE is enhanced by adding the positive properties of Taguchi's method [60] to standard DE, called hybrid robust differential evolution (HRDE) [62,63]. The performance of HRDE on the multi-pass turning problem is much better compared to the particle swarm optimization algorithm, immune algorithm, hybrid harmony search algorithm, hybrid genetic algorithm, scatter search algorithm, genetic algorithm, and the integration of simulated annealing and Hooke-Jeeves pattern search.

4. Proposed Method

The pseudocode of DESBS is given in [59]; the distributed differential evolution algorithm is explained using the block diagram in Fig. 1. Like other evolutionary algorithms, DESBS starts with a random set of candidate solutions in the search space, called the initial population. Next, the best determination method (BDM) provides the best solutions using the standard differential evolution algorithm. After that, species evolution is used to generate species in the search space; the number of species equals the number of best solutions. Every species evolves using the differential evolution algorithm. The efficiency of each species is evaluated over a period of time, and non-performing species are merged into nearby species.

Fig. 1. Block diagram of DESBS: Initial Population → Best Determination Method → Species Evolution → Species Merging.

4.1. Best determination method

The population is nothing but a gathering of population members; in DESBS, these members are vectors. In the natural world, a population consists of male and female members. In DESBS, the fitness value of a vector decides whether the member is male or female: an individual (vector) whose fitness value exceeds the calculated threshold is a female individual, and the remaining individuals (vectors) are male. The best determination method (BDM) is used to detect the best vectors in the population, and these best individuals are considered the female individuals.

Since DESBS uses the standard differential evolution algorithm for offspring generation, all individuals (vectors) in DESBS have an equal opportunity to produce offspring. In BDM, the performance count of an individual is increased whenever the child produced by that individual is better than the individual itself. The performance counts of all population members are then compared with the threshold, and individuals whose performance count exceeds the threshold are the best vectors (members). This fertility count (the ability to produce fit offspring) is the tenet of BDM. The number of opportunities provided to produce offspring decides the selection pressure in BDM.
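As a rough illustration of the performance-count idea, the sketch below increments an individual's count whenever a child produced for it beats its own fitness, and then takes the top NP/M performers as best vectors. The helper `de_trial` and the exact NP/M threshold rule are assumptions made for illustration; the paper does not give this level of detail.

```python
import numpy as np

def best_determination(fitness, pop, fit, R, M, de_trial):
    """Sketch of the best determination method (BDM): count how often each
    member produces a fitter child and keep the top performers as best vectors."""
    np_size = len(pop)
    count = np.zeros(np_size)                 # performance (fertility) count
    for _ in range(R):                        # R opportunities per individual
        for i in range(np_size):
            child = de_trial(pop, i)          # one trial vector for member i (assumed helper)
            if fitness(child) <= fit[i]:      # child beat its parent
                count[i] += 1
    n_best = max(1, np_size // M)             # threshold derived from M (assumed rule)
    return np.argsort(-count)[:n_best]        # indices of the best (female) vectors
```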

4.2. Species Evolution Phase

In the teaching-learning method [61] (TLM), the best individual acts as the teacher, whereas the rest of the individuals act as learners. The learners update their knowledge based on the teacher's knowledge. This method solves the problem using knowledge transfer between learner and teacher, but the population structure in TLM is monolithic; hence it can sometimes cause improper exploration and exploitation of the search space. DESBS solves this problem by dividing the main population into subpopulations.

Organisms having similar characteristic features are called a species, and each member has its role, which decides its niche. In this method, each fittest individual is considered a niche in the population and species form around them. Two members are similar if the Euclidean distance between them is small, and these closer individuals form a species. Species formation is based on the Euclidean distance between best and non-best members. Each species contains one best member and zero or more nearby non-best members.
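A minimal sketch of this species-formation rule, assuming each non-best member simply joins the species of the best member (niche) nearest to it in Euclidean distance:

```python
import numpy as np

def form_species(pop, best_idx):
    """Group population members around the best vectors (niches) by Euclidean distance."""
    best_idx = [int(b) for b in best_idx]
    species = {b: [b] for b in best_idx}          # every species keeps its own niche
    for i in range(len(pop)):
        if i in species:
            continue                              # best members stay as species seeds
        # distance of member i to every niche; join the closest one
        d = [np.linalg.norm(pop[i] - pop[b]) for b in best_idx]
        species[best_idx[int(np.argmin(d))]].append(i)
    return species                                # niche index -> list of member indices
```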

4.3. Merging of Species

The distributed population structure allows exploration of the search space. Merging of species helps identify global/local optima directly [18] by returning the best individual of each species. Merging species produces hybridized species, which produce fitter individuals. Using the same concept, DESBS calculates the performance of each species, and a species which is not performing well is merged with the nearest species.

The decision of when to merge a species into a nearby species is crucial. If the merging decision is taken at an early stage, the individuals in the species will not get sufficient time to evolve (produce offspring). If it is taken at a late stage, there is a possibility of producing unproductive offspring, which causes unnecessary objective function computations.
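The merging step could be sketched as follows; the "non-performing" criterion (no recent improvement, passed in as a flag per niche) and the nearest-niche merge rule are assumptions, since the paper does not specify the exact test.

```python
import numpy as np

def merge_inefficient(species, pop, improved):
    """Merge each species whose niche has not improved (improved[niche] is False)
    into the species whose niche is closest in Euclidean distance."""
    for niche in list(species.keys()):
        if improved.get(niche, True) or len(species) == 1:
            continue                                   # keep performing species as they are
        others = [n for n in species if n != niche]
        d = [np.linalg.norm(pop[niche] - pop[n]) for n in others]
        target = others[int(np.argmin(d))]             # nearest remaining niche
        species[target].extend(species.pop(niche))     # hand over all members of the species
    return species
```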

5. Results and Discussion

DESBS uses the best determination method (BDM) to identify the best members of the population. BDM requires two parameters, 'R' and 'M'. The parameter 'R' decides the number of chances an individual is given for evolution and performance count computation. The parameter 'M' is used to compute the threshold value. The individuals whose performance is above the threshold are the best members, around which the species are formed. The effect of 'R' and 'M' on the performance of the algorithm is analyzed. The experiment is carried out in two phases:

• Scale-up study of 'R' and 'M', to decide the parameter setting of 'R' and 'M' in DESBS.

• Performance analysis of DESBS against the standard differential evolution algorithm (SDE) and other state-of-the-art DE variants (CoDE, EPSDE).
DESBS is implemented in MATLAB 7.9A. The unimodal and multimodal problems of the CEC 2005 test suite are used in Phase I. DESBS uses the standard differential evolution algorithm (SDE) for evolution. The parameter settings of SDE and the scale-up values of R and M are given in Table 1.

Table 1. Parameter setting of DESBS

Sr. No.   Parameter                              Value
1         Dimension of the problem (D)           30
2         Scaling factor (F)                     0.5
3         Crossover rate (Cr)                    0.9
4         Population size (NP)                   300
5         Max function evaluations (Max_Fev)     3e6
6         R                                      4, 8, 12
7         M                                      1–30
In the first phase, DESBS is executed 20 times. The performance is evaluated on the following criteria (the first two are illustrated in the sketch after this list):

• Success rate (SR) = number of successful runs / total number of runs

• Success performance (SP) = average of the maximum function evaluations of the successful runs / success rate

• Minimum value of the function (MV)

• Maximum function evaluations for the minimum value (MFE)
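For reference, the first two criteria can be computed as below. This is a small illustration with hypothetical run counts, not results from the paper.

```python
def success_rate(successes, total_runs):
    """SR = number of successful runs / total number of runs."""
    return successes / total_runs

def success_performance(fevs_of_successful_runs, sr):
    """SP = mean function evaluations of the successful runs / SR."""
    return sum(fevs_of_successful_runs) / len(fevs_of_successful_runs) / sr

# Illustrative values only: 14 successful runs out of 20, each using 1.2e5 evaluations
sr = success_rate(14, 20)                       # 0.7
sp = success_performance([1.2e5] * 14, sr)      # average FEs scaled up by 1/SR
```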



In the second phase, the evaluation measure is the error value, i.e., the difference between the obtained minimum value and the expected minimum value. The DESBS algorithm is executed 50 times.

5.1. Scale-Up Study of the Decision Parameters

The experimental results show that the performance of DESBS depends on the values of 'R' and 'M'. BDM uses 'R' and 'M' to decide the best members of the population. Different values of 'R' are used to check the effect of selection pressure, which gives every individual of the population a chance to prove itself as a best member. A small value of 'R' gives individuals the maximum chance, whereas a high value of 'R' gives them the minimum chance. If individuals get the maximum chance, then the maximum number of function evaluations is required to find the best individuals; if only the minimum chance is provided, then the solution gets stuck at local optima.

The parameter 'M' calculates the threshold for the best members of the population. The results show that a small threshold yields more best members, and hence more species with a few individuals each, whereas a higher threshold yields fewer best members and a smaller number of species with a large number of individuals each.
Here, the value of 'M' is varied from 1 to 30, so that the first (NP/M) best-performing individuals become the best individuals of the population, around which the species are formed. It was found that the number of function evaluations is minimal when, for unimodal problems, the first two best-performing individuals decide the best individuals of the population, and, for multimodal functions, the first best-performing individual decides the best individuals of the population. The value of M is set to 15 and R to 4 when comparing DESBS with the other state-of-the-art algorithms and standard DE.

5.2. Comparison with SDE and Other State-of-the-Art Algorithms

DESBS is compared with SDE and other state-of-the-art variants of SDE, namely CoDE and EPSDE. The null hypothesis in each test is that no difference exists between the original SDE (or SDE variant) and DESBS. Cases are marked with "+" when the null hypothesis is rejected and DESBS outperforms the other algorithm in a statistically significant way, with "−" when the null hypothesis is rejected and the original SDE or SDE variant is significantly better than DESBS, and with "=" when the null hypothesis is accepted and no performance difference is significant. The non-parametric two-tailed Wilcoxon rank-sum test with significance level 0.05 is used to check the null hypothesis.
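The +/−/= labelling used in Tables 2-4 can be reproduced with a standard rank-sum test. The sketch below uses Python and SciPy rather than the author's MATLAB code, and the error arrays in the usage line are hypothetical per-run values, not the paper's data.

```python
import numpy as np
from scipy.stats import ranksums

def compare(desbs_errors, other_errors, alpha=0.05):
    """Return '+', '-' or '=' in the sense used in Tables 2-4."""
    stat, p = ranksums(desbs_errors, other_errors)  # two-tailed Wilcoxon rank-sum test
    if p >= alpha:
        return "="                                  # null hypothesis accepted
    # smaller error means better performance on these minimisation problems
    return "+" if np.mean(desbs_errors) < np.mean(other_errors) else "-"

# Hypothetical 50-run error samples for one benchmark function
sign = compare(np.random.rand(50) * 1e-3, np.random.rand(50) * 1e-2)
```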
The performance of DESBS and EPSDE is statistically the same on problems 8, 12, 22, 23 and 25; hence the two algorithms cannot be distinguished on these 5 problems. On problems 5, 7, 18, 19, 20, 21 and 24 DESBS performs better than EPSDE, whereas on the other problems EPSDE performs better than DESBS. So, out of 25 problems, EPSDE outperforms DESBS on 13 problems, whereas DESBS is better than EPSDE on 7 problems.

DESBS and CoDE perform statistically the same on 3 problems (1, 3, 15). DESBS performs better on problems 2, 4, 5, 6, 7, 18, 19, 20, 22 and 25; on the other problems CoDE performs better than DESBS. So, out of 25 problems, DESBS outperforms CoDE on 10 problems, and CoDE is better on 12 problems.

Comparing with SDE, it is found that, except for 5 problems, the results on the remaining 20 problems are statistically the same. On problems 3, 12, 15 and 20 DESBS performs better than SDE, whereas on problem 9 SDE is better than DESBS.

Table 2. DESBS vs. EPSDE

P.N    DESBS Mean    DESBS S.Dev    EPSDE Mean    EPSDE S.Dev    Sign
1 0 0 -0.000001 0 -
2 0 0 -0.000001 0 -
3 50915.054 53812.754 2695.6057 3251.8111 -
4 0 0 -0.000001 0 -
5 26.007016 23.987682 236.48249 179.30372 +

6 -0.00096 0.000791 -0.01 0 -


7 -0.001041 0.000827 1.130142 0.516809 +
8 20.843912 0.040982 20.845565 0.044809 =
9 6.437497 5.394415 -0.01 0 -
10 139.53118 49.000737 26.666991 4.210192 -
11 37.106783 2.820991 29.09254 1.768606 -
12 107492.72 192036.92 18548.362 2938.4635 =
13 9.055265 3.747261 1.363809 0.096251 -
14 13.684802 0.113451 12.9299 0.151972 -
15 380.63189 115.25263 202.91252 2.008797 -
16 176.8334 49.635101 99.452783 106.13368 -
17 207.87491 27.101474 125.62366 98.787843 -
18 815.88837 0.007268 816.15367 0.156606 +
19 815.887 0.006138 816.1648 0.200726 +
20 815.88611 0.010443 816.1395 0.152328 +
21 856.42703 0.314059 857.12859 0.606353 +
22 499.90008 0.000192 500.05744 0.64505 =
23 864.08009 0.532835 864.2592 0.870808 =
24 208.48755 0.092283 210.11687 0.684087 +
25 208.46749 0.071848 215.95424 40.987819 =

Table 3. DESBS vs. CoDE

P.N    DESBS Mean    DESBS S.Dev    CoDE Mean    CoDE S.Dev    Sign
1 0 0 0 0 =
2 0 0 0.000059 0.000025 +
3 50915.054 53812.754 0.37775 0.167989 -
4 0 0 0.005737 0.002795 +
5 26.007016 23.987682 31.63219 5.116828 +
6 -0.00096 0.000791 2.081529 0.66076 +
7 -0.001041 0.000827 4696.2786 0 +
8 20.843912 0.040982 20.847048 0.039935 =
9 6.437497 5.394415 -0.001039 0.000985 -
10 139.53118 49.000737 131.67664 9.442539 -
11 37.106783 2.820991 29.62228 1.285893 -
12 107492.72 192036.92 50175.562 8579.8321 -
13 9.055265 3.747261 5.052238 0.339835 -
14 13.684802 0.113451 12.832055 0.178376 -
15 380.63189 115.25263 399.99 0 =
16 176.8334 49.635101 147.59949 10.481023 -
17 207.87491 27.101474 187.0862 11.605374 -

18 815.88837 0.007268 905.02361 0.192267 +


19 815.887 0.006138 904.98389 0.15445 +
20 815.88611 0.010443 904.97264 0.179755 +
21 856.42703 0.314059 499.90005 0 -
22 499.90008 0.000192 881.36662 4.455461 +
23 864.08009 0.532835 534.06421 0.000063 -
24 208.48755 0.092283 199.9 0 -
25 208.46749 0.071848 1637.2719 3.266229 +

Table 4. DESBS vs. SDE

P.N    DESBS Mean    DESBS S.Dev    SDE Mean    SDE S.Dev    Sign
1 0 0 0 0 =
2 0 0 0 0 =
3 50915.054 53812.754 68959.116 58044.282 +
4 0 0 0 0 =
5 26.007016 23.987682 26.864916 23.948554 =
6 -0.00096 0.000791 -0.001065 0.000761 =
7 -0.001041 0.000827 -0.0012 0.001082 =
8 20.843912 0.040982 20.853617 0.037192 =
9 6.437497 5.394415 4.48877 3.79476 -
10 139.53118 49.000737 159.38361 16.468136 =
11 37.106783 2.820991 37.574664 0.898674 =
12 107492.72 192036.92 380916.43 224893.69 +
13 9.055265 3.747261 9.022686 4.058667 =
14 13.684802 0.113451 13.654421 0.147852 =
15 380.63189 115.25263 462.6798 76.73745 +
16 176.8334 49.635101 188.25525 57.398517 =
17 207.87491 27.101474 217.10232 33.178754 =
18 815.88837 0.007268 815.88796 0.007231 =
19 815.887 0.006138 815.88485 0.0046 =
20 815.88611 0.010443 815.88463 0.004477 +
21 856.42703 0.314059 856.44366 0.221705 =
22 499.90008 0.000192 499.90006 0.000079 =
23 864.08009 0.532835 864.16302 0.554197 =
24 208.48755 0.092283 208.48448 0.077819 =
25 208.46749 0.071848 208.45784 0.061743 =

6. Conclusion

A new distributed differential evolution algorithm with best vector selection and species evolution (DESBS) is proposed. It uses the best vector method to find the best-performing individuals in the population; the species evolution phase evolves the species that form around these best individuals, and non-performing species are merged into nearby species. DESBS is not a fully distributed algorithm but a partially distributed one, because in the end all species merge into a single population. The built-in nature of DESBS provides better exploration at the start of evolution and exploitation at the end of evolution. DESBS has two parameters beyond those of SDE, and a scale-up study is conducted to find their setting. The performance of DESBS is compared with SDE, CoDE and EPSDE using the CEC 2005 test suite. The null hypothesis is checked with the help of the non-parametric two-tailed Wilcoxon rank-sum test. The results show that DESBS performs better than SDE on multimodal optimization problems and better than the EPSDE and CoDE algorithms on hybrid composite functions.

References

[1] T. Weise, M. Zapf, R. Chiong, A. Nebro, "Why Is Optimization Difficult?", Nature-Inspired Algorithms for Optimization, SCI 193, pp. 1–50, Springer, 2009.
[1] X. Yu, M. Gen, “Introduction to evolutionary algorithm” ISBN 978-1-84996-128-8 . Springer, 2010.
[2] R. Storn, K. Price “Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces”. Journal of
Global Optimization, pp, 341-359, 11: 341–359, 1997.
[3] R. Storn , K. Price “ Differential Evolution - A simple and efficient adaptive scheme for global optimization over continuous spaces”,TR-95-
012. March, 1995.
[4] R. Storn “ System Design by Constraint Adaptation and Differential Evolution”,TR-96-039,, 1996.
[5] K. Price, “Differential Evolution: A Fast and Simple Numerical Optimizer”,IEEE,1996.
[6] K. P. Wong, Z. Y. Dong, “Differential Evolution, an Alternative Approach to Evolutionary Algorithm”, ISAP, 2005.
[7] D. Karaboga, S. Okdem ,” A Simple and Global Optimization Algorithm for Engineering Problems: Differential Evolution Algorithm “,Turk
J Elec Engin,, VOL.12, NO.1, 2004.
[8] QI Feng, L. Ping,” A differential evolutionary based algorithm for multiuser OFDMA system adaptive resource allocation” Journal of
Communication and Computer, ISSN1548-7709, USA, Volume 5, No.12 (Serial No.49). Dec. 2008.
[9] M. Maximiano, M. A.V. Rodríguez, J.A. Gomez “Hybrid Differential Evolution Algorithm to Solve a Real-World Frequency Assignment
Problem” ,IEEE, 2008.
[10] A. Souham, M, H. Talbi, M. Batouche “A Quantum-Inspired Differential Evolution Algorithm for Solving the N-Queens Problem” The
International Arab Journal of Information Technology, Vol. 7, No. 1, January 2010.
[11] C. Song, X. Chu, L. Li, J. Wang “An improved differential evolution algorithm for the slope stability analysis” IEEE, 2011.
[12] M. Vasile, E. Minisci, M. Locatelli” An Inflationary Differential Evolution Algorithm for Space Trajectory Optimization” IEEE Transactions
On Evolutionary Computation, Vol. 15, No. 2, April 2011.
[13] M. F. Tasgetiren, P.N. Suganthan, T. Chua , A. Hajri “Differential Evolution Algorithms for the Generalized Assignment Problem”,IEEE
2009.
[14] F. Wang, W. Wang, H. Yang, Q. Pan “A Novel Discrete Differential Evolution Algorithm for Computer-aided Test-sheet Composition
Problems” IEEE.2009.
[15] S. Ding” Logistics Network Design Optimization Based on Differential Evolution Algorithm” IEEE, 2010.
[16] A. Slowik “Application of Adaptive Differential Evolution Algorithm with Multiple Trial Vectors to Artificial Neural Networks Training”,
IEEE, 2010.
[17] A. R. Yildiz , ”An effective hybrid immune-hill climbing optimization approach for solving design and manufacturing optimization problems
in industry”, journal of materials processing technology “,209,2773–2780,2009.
[18] Ali R. Yildiz , “Hybrid immune-simulated annealing algorithm for optimal design and manufacturing”, Int. J. Materials and Product
Technology, Vol. 34, No. 3, 2009
[19] Ali R. Yildiz , “A new hybrid differential evolution algorithm for the selection of optimal machining parameters in milling operations”,
Applied Soft Computing 13 , 1561–1566, 2013.
[20] M. Kiani, A.R. Yildiz,” A Comparative Study of Non-traditional Methods for Vehicle Crashworthiness and NVH Optimization”, Archives of
Computational Methods in Engineering, 23,723–734,2016.
[21] R. Gamperle, S. Müller, P. Koumoutsakos, "A Parameter Study for Differential Evolution", Proceedings WSEAS International Conference on Advances in Intelligent Systems, Fuzzy Systems, Evolutionary Computation, pp. 293–298, 2002.
[22] J. Lampinen, I. Zelinka. ,”On Stagnation Of The Differential Evolution Algorithm,”, Ošmera, P. (ed.) Proceedings of MENDEL 2000, 6th
International Mendel Conference on Soft Computing, Brno, Czech Republic, 7–9 June 2000.
[23] S. Rahnamayan, H. R. Tizhoosh, M. M. A. Salama, "A novel population initialization method for accelerating evolutionary algorithms", Computers & Mathematics with Applications, Elsevier, 2007.
[24] R. Mallipeddi, P. N. Suganthan, “Empirical Study on the Effect of Population Size on Differential Evolution Algorithm”,IEEE,2008.
[25] D. Zaharie “ Statistical properties of DE and related random search algorithms.”
[26] D. Zaharie,” Differential Evolution From Theoretical Analysis To Practical Insights”
[27] J. Tvrdık.”Differential Evolution: Competitive Setting of Control Parameters “, Proceedings of the International Multiconference on Computer
Science and Information Technology pp. 207–213,2007.
[28] R. C. Silva, R. Lopes, A. Freitas, F. Guimaraes “Performance Comparison of Parameter Variation Operators in Self-Adaptive Differential
Evolution Algorithms”, Brazilian Symposium on Neural Networks,2012.
[29] G. Jeyakumar, C. Shanmugavelayutham ” Convergence Analysis Of Differential Evolution Variants On Unconstrained Global Optimization
Functions “International Journal of Artificial Intelligence & Applications (IJAIA), Vol.2, No.2, April 2011
[30] K. V. Price , J. Rönkkönen “Comparing the Uni-Modal Scaling Performance of Global and Local Selection in a Mutation-Only Differential
Evolution Algorithm” 2006 IEEE Congress on Evolutionary Computation

[31] A. M. Sutton, M. Lunacek, L. D. Whitley ” Differential Evolution and Non-separability: Using selective pressure to focus search” GECCO
2007.
[32] E. Montes, Reyes, Carlos A. Coello Coello “A Comparative Study of Differential Evolution Variants for Global Optimization” GECCO 2006.
[33] Y. Ao, H. Chi, "Experimental Study on Differential Evolution Strategies", IEEE, 2009.
[34] W. Gong, Z. Cai” An Empirical Study on Differential Evolution for Optimal Power Allocation in WSNs “ ICNC 2012
[35] S. Chattopadhyay, S. Sanyal, A. Chandra ” Comparison of Various Mutation Schemes of Differential Evolution Algorithm for the Design of
Lowpass FIR Filter” SEISCON 2011
[36] M.G. Epitropakis, V.P. Plagianakos, M.N. Vrahatis “Balancing the exploration and exploitation capabilities of the Differential Evolution
Algorithm”IEEE 2008.
[37] C. Deng, B. Zhao, A. Deng , R. Hu “New Differential Evolution Algorithm with a Second Enhanced Mutation Operator”,IEEE 2009.
[38] Sk. M.l Islam, S. Das, S. Ghosh, S. Roy, P. N. Suganthan, “An Adaptive Differential Evolution Algorithm With Novel Mutation and Crossover
Strategies for Global Numerical Optimization” IEEE transactions on systems, man, and cybernetics—part b: cybernetics, vol. 42, no. 2, April
2012
[39] X. Miao, P. Fan, J. Wang , C. Li “Differential Evolution Based on Adaptive Mutation”, IEEE 2010.
[40] E. Shi, F. Leung, J. Lai “An Adaptive Differential Evolution with Unsymmetrical Mutation”, IEEE, 2011
[41] R. Zhou, J. Hao, H. Cao, H. Fan “An Empirical Study on Differential Evolution Algorithm and Its Several Variants”, IEEE 2011.
[42] L. Chen, L. Ding “ An improved crowding-based differential evolution for multimodal optimization” IEEE, 2011.
[43] A. Nobakhti , H. Wang “Co-evolutionary Self-Adaptive Differential Evolution with a Uniform-distribution Update Rule”,IEEE,2006.
[44] M.G. Epitropakis, D. K. Tasoulis, N. G. Pavlidis, V. P. Plagianakos, M N. Vrahatis “Enhancing Differential Evolution Utilizing Proximity-
Based Mutation Operators” IEEE transactions on evolutionary computation, vol. 15, no. 1, february 2011
[45] W. Gong , Z. Cai “ Differential Evolution With Ranking-Based Mutation Operators”, IEEE transactions on cybernetics, 2013.
[46] P. Bhowmik, S. Das, A. Konar, S.Das , A. K. Nagar “A New Differential Evolution with Improved Mutation Strategy”,IEEE,2010.
[47] P. Kumar, M. Pant” Enhanced Mutation Strategy for Differential Evolution” IEEE, 2012.
[48] M. Ali, M. Pant, A. Nagar, "Two new approaches incorporating centroid based mutation operators for Differential Evolution", World Journal of Modelling and Simulation, 2010.
[49] Sk. M. Islam, S. Das, , S. Ghosh, S. Roy, P.N. Suganthan ” An Adaptive Differential Evolution Algorithm With Novel Mutation and Crossover
Strategies for Global Numerical Optimization” ieee transactions on systems, man, and cybernetics—part b: cybernetics, vol. 42, no. 2, april
2012
[50] Y. Wang, Z. Cai, Q. Zhang “Differential Evolution with Composite Trial Vector Generation Strategies and Control Parameters”, IEEE
Transactions on Evolutionary Computation, 2010.
[51] S. M. Elsayed, R. Sarker, D. L. Essam “An Improved Self-Adaptive Differential Evolution Algorithm for Optimization Problems” IEEE
transactions on industrial informatics, vol. 9, no. 2013.
[52] A. Iorio ,X. Li ”Improving the Performance and Scalability of Differential Evolution” SEAL 2008.
[53] K. Qin, V. L. Huang, , P. N. Suganthan “ Differential Evolution Algorithm With Strategy Adaptation for Global Numerical Optimization”,
IEEE transactions on evolutionary computation, vol. 13, no. 2, April 2009
[54] B. Y. Qu, P. N. Suganthan, J. J. Liang, "Differential Evolution with Neighborhood Mutation for Multimodal Optimization".
[55] S. Das, A. Abraham, , U. K. Chakraborty, A. Konar” Differential Evolution Using a Neighborhood-Based Mutation Operator” IEEE
transactions on evolutionary computation, vol. 13, no. 3, june 2009
[56] R. Mallipeddi, G. Iacca, P. N. Suganthan, F. Neri, E. Mininno “Ensemble Strategies in Compact Differential Evolution”, IEEE, 2011.
[57] Ali R. Yildiz, “Hybrid Taguchi-differential evolution algorithm for optimization of multi-pass turning operations”, Applied Soft Computing,
Vol: 13, Issue:3,pages 1433–1439, 2013.
[58] A. R. Khaparde, M. M. Raghuwanshi, L.G. Malik, ” A New Distributed Differential Evolution Algorithm”, IEEE, 2015.
[59] Ali R. Yildiz, “Optimization of multi-pass turning operations using hybrid teaching learning-based approach”, The International Journal of
Advanced Manufacturing Technology, 66, 9-12, 1319-1326, 2013
[60] Ali R. Yildiz, “A comparative study of population-based optimization algorithms for turning operations”, Information Sciences, 210 , 81–88,
2012.
[61] Ali R. Yildiz ”Comparison of evolutionary-based optimization algorithms for structural design optimization” , Engineering Applications of
Artificial Intelligence 26 ,327–333,2013.
[62] Ali R. Yildiz , “Optimization of cutting parameters in multi-pass turning using artificial bee colony-based approach”, Information Sciences
220 , 399–407,2013.
