PII: S1568-4946(17)30434-9
DOI: http://dx.doi.org/doi:10.1016/j.asoc.2017.07.023
Reference: ASOC 4351
Please cite this article as: F. Javidrad, M. Nazari, A New Hybrid Particle Swarm and Simulated Annealing Stochastic Optimization Method, Applied Soft Computing Journal, http://dx.doi.org/10.1016/j.asoc.2017.07.023
This is a PDF file of an unedited manuscript that has been accepted for publication.
As a service to our customers we are providing this early version of the manuscript.
The manuscript will undergo copyediting, typesetting, and review of the resulting proof
before it is published in its final form. Please note that during the production process
errors may be discovered which could affect the content, and all legal disclaimers that
apply to the journal pertain.
A New Hybrid Particle Swarm and
Simulated Annealing Stochastic
Optimization Method
F. Javidrad1
Prof. of Aerospace Eng.
Center for Postgraduate Studies, Aeronautical University of Science and Technology
South Mehrabad, Shamshiri St., 13846-73411 Tehran, IRAN
M. Nazari
Postgraduate Student,
Center for Postgraduate Studies, Aeronautical University of Science and Technology,
South Mehrabad, Shamshiri St., 13846-73411 Tehran, IRAN
1 To whom all correspondence should be addressed (f.javidrad@gmail.com).
Tel. +98 921 6186485
Graphical abstract
Highlights:
Development of a new hybrid PSO-SA optimization method.
Numerical validation of the proposed method using a number of benchmark functions.
Using three criteria for comparative work.
Finding near optimum parameters of the proposed method.
Application of the proposed algorithm in two engineering problems.
Abstract
A novel hybrid particle swarm and simulated annealing stochastic optimization method is
proposed. The proposed hybrid method uses both PSO and SA in sequence and integrates the
merits of good exploration capability of PSO and good local search properties of SA.
Numerical simulation has been performed for selection of near optimum parameters of the
method. The performance of this hybrid optimization technique was evaluated by comparing
optimization results of thirty benchmark functions of different dimensions with those obtained
by other numerical methods considering three criteria. These criteria were stability, average
trial function evaluations for successful runs and the total average trial function evaluations
considering both successful and failed runs. Design of laminated composite materials with
required effective stiffness properties and minimum weight design of a three-bar truss are
addressed as typical applications of the proposed algorithm in various types of optimization
problems. In general, the proposed hybrid PSO-SA algorithm demonstrates improved
performance in solving these problems compared to other evolutionary methods. The results
of this research show that the proposed algorithm can reliably and effectively be used for
various optimization problems.
1. Introduction
Recently, new design techniques based on stochastic search optimization methods have
emerged in various engineering disciplines such as aeronautical, structural and mechanical
engineering. Commonly, stochastic search techniques make use of ideas inspired by nature in a heuristic manner to find the global optimum of a problem, and they do not suffer from some of the shortcomings encountered in most deterministic optimization methods.
Deterministic optimization methods usually require preconditions such as continuity and
differentiability of the objective function which are not met in many practical problems [1]. On
the other hand, stochastic search optimization methods can successfully be applied to problems
including non-smooth or discontinuous functions of continuous, discrete and integer variables.
These methods are easy to implement because they require only function values to find an
optimum. Moreover, stochastic methods are usually capable of jumping out of the local minima
to approach the global or near-global optimum solution [2]. However, heuristic methods have a number of drawbacks. They may be inefficient in terms of computational effort because they need a thorough exploration of the feasible area. In addition, some factors must be determined experimentally to ensure that the stochastic method converges to the global optimum point.
The stochastic search methods are usually classified into three groups: Monte Carlo based probabilistic search, evolutionary algorithms and swarm intelligence. To date, a number of stochastic heuristic or meta-heuristic search methods from all three groups have been
developed. Simulated Annealing (SA) and Tabu Search (TS) from the probabilistic search
category, Genetic Algorithm (GA) from the category of evolutionary algorithms and Particle
Swarm Optimization (PSO) and Ant Colony Optimization (ACO) from the swarm intelligence
category are the methods that have been widely used to address optimization issues [3].
Amongst all these methods, PSO and SA have rapidly gained attention and applications by scholars and researchers because of their simple concepts, good speed and ease of implementation. However, PSO is prone to getting stuck in local minima when applied to complex and multi-modal optimization functions. A great deal of effort has been made to develop altered
PSO approaches to improve its solution quality. The main aim of these approaches is to balance
exploration (global search across the entire search domain) and exploitation (local search)
during the searching process. In contrast to PSO, SA is known as a stable method with a built-in mechanism for escaping local minima. In fact, SA is guaranteed to reach the global optimum if the controlling parameter is decreased sufficiently slowly and equilibrium is maintained at each cooling cycle.
In this article, we attempt to introduce a novel hybrid optimization method based on PSO
and SA to further increase PSO solution quality. The proposed hybrid method primarily uses
both PSO and SA in sequence with communication. The performance of this hybrid
optimization technique was evaluated through thirty different benchmark functions. Design of
laminated composite materials with required effective stiffness constants and design of a three-
bar truss under tensile load constraints were examined as two applications of the proposed
algorithm in engineering problems. The results of this research generally show that the
proposed PSO-SA hybrid algorithm can effectively be used for optimization of problems
including continuous and discrete variables.
The remainder of this article is structured as follows: Section 2 gives a background and a
brief review of the PSO and SA. In Section 3 the proposed hybrid PSO-SA algorithm is
described. The evaluation of the proposed algorithm is given in Section 4. Section 5 presents
two applications of the proposed algorithm in design of composite laminates and design of a
three-bar truss. Finally, Section 6 presents conclusions.
2. Background
Consider a search space S ⊂ R^n and let X_k^i = {X_1^i, X_2^i, …, X_n^i}^T be the position of particle k in iteration i, P_k^b = {P_1^b, P_2^b, …, P_n^b}^T the best position of particle k in the search space, and P^g = {P_1^g, P_2^g, …, P_n^g}^T the global best position visited by all particles so far. At each iteration step i, the particle velocity and position are updated successively according to the following equations:
v_k^{i+1} = v_k^i + c1 R1 (P_k^b − X_k^i) + c2 R2 (P^g − X_k^i)    (1)
X_k^{i+1} = X_k^i + v_k^{i+1}    (2)
where v_k denotes the velocity of the kth particle, and R1 and R2 are two random numbers between 0 and 1. c1 and c2 are real-valued parameters called the cognitive parameter and the social parameter, respectively. These parameters adjust the magnitudes of the steps taken by a particle in the direction of its best position and of the global best position of the swarm.
A new factor ω, called the inertia weight, has been suggested to obtain better control over the scope of the search [12]. Eq. 1 is then changed to:
v_k^{i+1} = ω v_k^i + c1 R1 (P_k^b − X_k^i) + c2 R2 (P^g − X_k^i)    (3)
The ω function controls exploration across the search space and provides a stochastic mechanism for jumping out of local minimum points. A relatively large value of ω supports global exploration, while a small value enables exploitation. The role of exploration is to allow the particles to move across the entire search space, whereas exploitation confines the search to a localized region. There are many suggestions for this function, such as the decreasing linear step function defined as:
ω_i = ω_max − ((ω_max − ω_min) / i_max) · i    (4)
where ω_max and ω_min are two defined limits and i_max is the maximum allowable iteration number.
There are also adaptive procedures for the selection of ω based on its performance and distance from its best position [13].
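A minimal sketch of the linear schedule of Eq. 4; the limits 0.9 and 0.4 used as defaults here are common illustrative choices, not values taken from this paper:

```python
def inertia_weight(i, i_max, omega_max=0.9, omega_min=0.4):
    """Linearly decreasing inertia weight (Eq. 4) for iteration i in [0, i_max]."""
    return omega_max - (omega_max - omega_min) * i / i_max
```

At i = 0 the weight equals omega_max (favoring exploration) and decays linearly to omega_min at i = i_max (favoring exploitation).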
The PSO works in a very simple way. First, a population of particles with random positions
within the search space is generated. Then, PSO enters a loop to search for optimal solutions
by updating the particles' velocities and positions using Eq. 1 (or Eq. 3) and Eq. 2. The loop
terminates when the adopted convergence criteria are satisfied.
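The loop described above can be sketched as follows. This is a minimal illustrative implementation (not the exact procedure of the paper): it adds box-bound clipping to keep particles feasible, and all function and parameter names are assumptions.

```python
import random

def pso(f, bounds, m=30, i_max=200, c1=2.0, c2=2.0,
        omega_max=0.9, omega_min=0.4):
    """Minimal PSO for minimizing f over box bounds [(lo, hi), ...]."""
    n = len(bounds)
    # Random initial positions, zero initial velocities.
    X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(m)]
    V = [[0.0] * n for _ in range(m)]
    Pb = [x[:] for x in X]                      # personal best positions
    fb = [f(x) for x in X]                      # personal best values
    g = min(range(m), key=lambda k: fb[k])
    Pg, fg = Pb[g][:], fb[g]                    # global best
    for i in range(i_max):
        w = omega_max - (omega_max - omega_min) * i / i_max   # Eq. 4
        for k in range(m):
            for d in range(n):
                r1, r2 = random.random(), random.random()
                V[k][d] = (w * V[k][d]
                           + c1 * r1 * (Pb[k][d] - X[k][d])   # cognitive term
                           + c2 * r2 * (Pg[d] - X[k][d]))     # social term (Eq. 3)
                lo, hi = bounds[d]
                X[k][d] = min(max(X[k][d] + V[k][d], lo), hi)  # Eq. 2, clipped
            fx = f(X[k])
            if fx < fb[k]:                       # update personal best
                Pb[k], fb[k] = X[k][:], fx
                if fx < fg:                      # update global best
                    Pg, fg = X[k][:], fx
    return Pg, fg
```

For example, minimizing the 2-D sphere function `lambda x: sum(t*t for t in x)` over [−5, 5]² drives the best value close to zero within a few thousand evaluations.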
Since the method always seeks a better solution using Pb and Pg data throughout the iterative
process, it can have a fast convergence rate. However, the tendency of all particles to move
toward the best experienced point and then search around a localized region may cause the
premature convergence problem. In other words, although PSO converges to an optimum point
much faster than many other evolutionary methods, it may have a poor solution quality,
specifically in high dimensional multi-modal problems [14].
There are many PSO-based algorithms introduced in literature such as: Vector Evaluated
PSO [15], Quantum behaved PSO [16], Improved Accelerated PSO [17] and Particle
Evolutionary Swarm Optimization [18]. The PSO method has been widely used for the optimization of single-objective and multi-objective problems of continuous, discrete and mixed integer
variables [19-21]. There are also numerous reports in the literature referring to the applications of PSO in engineering problems (see, for example, [22]).
Simulated Annealing is a well-known heuristic method for global optimization inspired by the physical process of annealing of solids. The foundation of the SA algorithm is the Metropolis
Monte Carlo procedure, which was used to describe a collection of atoms in equilibrium at a specified temperature. The standard Monte Carlo algorithm only accepts movement to a state with lower energy, while the Metropolis procedure also allows a state with higher energy to be accepted with some probability [23]. The acceptance probability of a trial solution point Xi+1
from the current solution point Xi is stated as:
P = 1 if f(X_{i+1}) ≤ f(X_i), and P = e^{−Δf/T} otherwise    (5)
where f is the objective function, Δf = f(X_{i+1}) − f(X_i), and T is the controlling parameter which, by analogy, is called temperature. Decreasing the temperature limits the acceptance probability.
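The Metropolis criterion of Eq. 5 can be sketched directly (function and argument names are illustrative):

```python
import math, random

def accept(f_new, f_old, T):
    """Metropolis criterion (Eq. 5): always accept improvements; accept
    uphill moves with probability exp(-delta_f / T)."""
    delta = f_new - f_old
    if delta <= 0:
        return True                       # downhill: always accepted
    return random.random() < math.exp(-delta / T)
```

As the temperature T shrinks, the probability of accepting an uphill move vanishes and the procedure degenerates to pure descent.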
The step length vector in the SA procedure is usually adjusted such that at least 50% of trials are accepted [24]. If p_j is the ratio of the number of accepted moves for each variable to the number of trials, the perturbation can be written as:
X_{i+1} = X_i + R(−1,1) · V    (6)
where R(−1,1) is a random number between −1 and 1 and V is the step length vector. Introducing V_b as half of the difference between the upper and lower limits for each variable, the components of the step length vector are calculated as:
T_{i+1} = RT · T_i    (8)
where T_0 is the initial temperature and RT is a positive constant usually taken in the interval 0.8–0.999.
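A minimal SA loop built from Eqs. 5, 6 and 8 can be sketched as follows. This sketch assumes a fixed step-length vector V (the adaptive step-length adjustment discussed above is omitted), and all names and defaults are illustrative:

```python
import math, random

def simulated_annealing(f, x0, V, T0=1.0, RT=0.95, chain_len=50, T_min=1e-4):
    """Minimal SA: perturb via x + R(-1,1)*V (Eq. 6), accept via the
    Metropolis criterion (Eq. 5), and cool geometrically (Eq. 8)."""
    x, fx = list(x0), f(x0)
    best, f_best = x[:], fx
    T = T0
    while T > T_min:
        for _ in range(chain_len):        # Markov chain at fixed temperature
            trial = [xi + random.uniform(-1, 1) * vi for xi, vi in zip(x, V)]
            ft = f(trial)
            if ft <= fx or random.random() < math.exp(-(ft - fx) / T):
                x, fx = trial, ft         # move accepted (possibly uphill)
                if fx < f_best:
                    best, f_best = x[:], fx
        T *= RT                           # geometric cooling (Eq. 8)
    return best, f_best
```

With a slow cooling rate (RT close to 1) the chain spends more evaluations near each temperature, which trades computation time for solution stability.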
The SA method has a very wide variety of applications in various fields of science and engineering, concerning single-objective, multi-objective [28], constrained [29] or unconstrained optimization problems. For example, in the case of composite materials design, SA has been used to minimize the weight of laminated composite plates subjected to in-plane and out-of-plane loads [30], maximize the buckling load capacity of composite laminates [31], design a laminate with required stiffness properties [32] and optimize the natural frequency and buckling load of laminated hybrid composite plates [33].
Combining SA into the cycles of PSO is another approach adopted by researchers to address
optimization problems. The key idea behind this approach is using a local search strategy to
dynamically improve the global best point determined in each PSO cycle. In this approach,
the current global best position (or sometimes the local best position of each particle) is
improved using the temperature equilibrium loop of the SA while the temperature is updated
using the adopted cooling schedule in the main loop of the PSO [41-43]. It is argued that the
performance of this approach is dependent, to some extent, upon the proper selection of SA
parameters [44]. A hybrid PSO-SA is suggested in [45], where in each PSO iteration each particle is simulated-annealed independently and a group of new individuals is generated. In another approach, the global best position of the swarm is improved using the simulated annealing method in each PSO iteration [46]. It is stated in that reference that the best position of each particle can also be simulated-annealed in every PSO iteration to generate a swarm with the best possible particle positions. However, the efficiency of these procedures in terms of computational effort has not been discussed in detail.
a. the inertia term (ω v_k^i), which keeps track of the previous velocity vector;
b. the cognitive component (c1 R1 (P_k^b − X_k^i)), which accounts for the best position visited by the particle.
The main idea behind our PSO and SA hybridization is to improve the social behavior of the swarm by updating the global best position of the swarm. When no improvement is observed in a cycle of PSO, the old global best position is replaced by a new one calculated using the SA method. This new best position actually gives a signal to the social leader of the swarm, i.e. the particle to which the previous global best solution belonged, to update its direction. The proposed hybrid PSO-SA method, which belongs to the third category discussed in Section 2.3, reciprocates between PSO and SA and integrates the merits of the good exploration capability of PSO and the good local search properties of SA. To our knowledge, this type of hybridization has not been addressed in the literature before. Most existing PSO-SA methods in this category use SA to update the global best solution (or the local best positions) every cycle, or only once during the solution process. In another approach, when PSO is trapped in a local minimum, SA is used for the rest of the process [47].
The main difference between our proposed algorithm and the existing ones is that SA contributes to the solution process intelligently: SA updates the global best solution only when PSO shows no improvement in the global best point, which may happen several times during the solution procedure. The conceptual model of the proposed
algorithm is depicted in Fig. 1.
The algorithm starts with PSO by generation of a population and introduction of an initial
temperature parameter. Temperature drops continuously after each cycle in PSO and each
temperature equilibrium cycle (Markov chain loop) in SA according to the given cooling
schedule. If no improvement occurs during a cycle in the current global best solution, then this
current best solution will be introduced to the SA algorithm for calculation of a better position.
The SA solution procedure goes on until a rejection takes place. The best solution in the recent
Markov chain loop then goes back to PSO as the new global best solution. Then, searching is
repeated with PSO using the current global best solution and the previous best locations of
particles. This sharing process is maintained until convergence criteria are satisfied. The steps
of the proposed hybrid PSO-SA method are presented in Algorithm 1.
Step 1: Initialization:
m (number of swarm particles),
imax (maximum allowable iterations),
ω_max and ω_min (limits of the inertia weight),
c1 and c2 (cognitive and social parameters),
vmax (velocity limit), T0 (initial temperature) and
RT (temperature decrement parameter)
Step 7: SA iterations
calculate a trial solution: P_trial^g = P^g + R(−1,1) · V
If f(P_trial^g) < f(P^g) then P^g = P_trial^g
Step 8: Convergence
Repeat the procedure until convergence criteria are met
a. Improving the social component of the velocity vector causes the swarm to be attracted to the portion of the search space where the global best is located, which can accelerate convergence.
b. The mechanism of uphill move is integrated into this hybrid algorithm to improve quality
and stability of the solution.
c. A balance between exploration and exploitation is maintained by lowering temperature
continuously in the process cycles.
d. The progress achieved so far in cognitive component of PSO velocity equation is preserved
during SA iterations.
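The hybrid loop of Algorithm 1 can be sketched as follows. This is an illustrative reading of the procedure, not the authors' exact implementation: the SA step vector (a tenth of each variable's range), the cap on the SA chain length and the parameter defaults are assumptions.

```python
import math, random

def hybrid_pso_sa(f, bounds, m=30, i_max=300, c1=0.8, c2=1.7,
                  omega_max=0.9, omega_min=0.4, T0=1.0, RT=0.95):
    """Hybrid PSO-SA sketch: PSO cycles, with an SA refinement of the
    global best whenever a cycle brings no improvement (Step 7)."""
    n = len(bounds)
    V_step = [(hi - lo) / 10 for lo, hi in bounds]  # assumed SA step vector
    X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(m)]
    Vel = [[0.0] * n for _ in range(m)]
    Pb = [x[:] for x in X]
    fb = [f(x) for x in X]
    g = min(range(m), key=lambda k: fb[k])
    Pg, fg = Pb[g][:], fb[g]
    T = T0
    for i in range(i_max):
        w = omega_max - (omega_max - omega_min) * i / i_max
        improved = False
        for k in range(m):                           # one PSO cycle
            for d in range(n):
                r1, r2 = random.random(), random.random()
                Vel[k][d] = (w * Vel[k][d]
                             + c1 * r1 * (Pb[k][d] - X[k][d])
                             + c2 * r2 * (Pg[d] - X[k][d]))
                lo, hi = bounds[d]
                X[k][d] = min(max(X[k][d] + Vel[k][d], lo), hi)
            fx = f(X[k])
            if fx < fb[k]:
                Pb[k], fb[k] = X[k][:], fx
                if fx < fg:
                    Pg, fg, improved = X[k][:], fx, True
        if not improved:
            # SA refinement of the global best until the first rejection.
            x, fx = Pg[:], fg
            for _ in range(1000):                    # assumed cap on the chain
                trial = [p + random.uniform(-1, 1) * v
                         for p, v in zip(x, V_step)]
                ft = f(trial)
                if ft <= fx or random.random() < math.exp(-(ft - fx) / T):
                    x, fx = trial, ft                # accepted (maybe uphill)
                    if fx < fg:
                        Pg, fg = x[:], fx            # best of the SA chain
                else:
                    break                            # first rejection ends SA
        T *= RT                                      # cooling after each cycle
    return Pg, fg
```

Note how the design choice described above appears in the code: SA is invoked only when a full PSO cycle fails to improve the global best, and only the best point of the SA chain is handed back to the swarm.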
There are some factors that may have an impact on the performance of the algorithm.
These factors are discussed briefly as follows:
(1) Positions and velocities initialization: The choice of the particles' initial positions and velocities can have a considerable effect on the probability that the particles move outside the feasible region. It is stated that initializing the particles' positions uniformly across the search space together with zero initial velocities limits the initial diversity of the swarm and subsequently keeps the particles from moving out of the boundaries [48]. In this work, zero initial velocities and a uniform distribution of particles across the search space, together with limitations on the particles' movement, were implemented to avoid movement outside the boundaries.
(2) Cognitive and social parameters: These two parameters (the c1 and c2 coefficients in Eq. 3) govern the extent to which the particles move towards their own best positions and the global best position, respectively. The selection of these parameters is still a subject of debate. Although many researchers have used the constant values c1 = c2 = 2, we believe that these parameters should be adjusted experimentally for maximum efficiency of the algorithm. In this study we used four functions with different dimensions to set these parameters (see Appendix B), and found c1 = 0.8 and c2 = 1.7 to be near optimum values. These values are used throughout the optimization process in this work.
(3) Inertia weight: The inertia weight is a function introduced to avoid swarm divergence by modulating the contribution of the previous velocity to the current velocity vector. As discussed in Section 2.1, the linear step function (Eq. 4) is the most popular function used to apply inertia to the particles. The limits of this function should be determined experimentally.
(4) Velocity limits: Because of the stochastic nature of the velocity equation, the calculated velocities may become too large and move the particle off the search space. Therefore, velocities are commonly limited to a defined value [11], that is:
|v_k^{i+1}| ≤ v_max    (9)
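The velocity limiting discussed above can be sketched as a component-wise clamp (names are illustrative; v_max is typically set per variable as a fraction of that variable's range):

```python
def clamp_velocity(v, v_max):
    """Clamp each velocity component to [-v_max_d, v_max_d]."""
    return [min(max(vd, -vm), vm) for vd, vm in zip(v, v_max)]
```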
4. Numerical verification
Numerical experiments have been performed to validate the proposed PSO-SA algorithm.
Thirty well-known optimization mathematical functions with different dimensions have been
utilized as benchmarks for global minimization by the proposed PSO-SA algorithm. These
benchmark functions are listed in Table A.1, Appendix A.
For comparison purposes, the standard PSO and the Shieh’s hybrid method [4], a hybrid
PSO-SA method from the second category as described in Section 2.3, have also been applied
to these test functions. Because of the stochastic nature of all three optimization methods, the solution for each function was independently repeated 100 times. The PSO population size, the initial temperature and the Markov chain length were set to 50, 1 and 100, respectively. The successful runs are defined as the computer runs in which the cost function reaches values less than a tolerance value, say 2×10-3. The failed runs, however, are those in which at least one of the following criteria is satisfied:
The effectiveness and efficiency of the solution in stochastic procedures can generally be
defined in terms of the percentage of successful runs, convergence speed and the maximum
effective trial function evaluations. The latter is, in fact, a true measure of efficiency
considering both successful and failed runs in a prescribed number of runs. In this regard, three parameters have been used in this study to compare the performance of the proposed hybrid PSO-SA method. These parameters are:
● Stability of solution: the percentage of successful solutions.
● Average trial: the average number of function evaluations in successful runs, calculated by averaging the number of function evaluations over the successful solutions achieved in a limited number of independent runs, say 100.
● Total average trial: the average number of function evaluations considering all successful
and failed runs, as determined according to Eq. 10. In fact, the combined effect of function
evaluations for successful runs and stability is included in the total average function
evaluations. This equation has been suggested in [32]:
(Ave)_T = (Ave)_s + (Fail / Success) · (Ave)_f    (10)
where (Ave)_T, (Ave)_s and (Ave)_f denote the total average trial function evaluations, the average trial function evaluations for successful runs and the average number of function evaluations used up for failed runs, respectively, and Fail and Success are the numbers of failed and successful trials in a total of n runs. The computational cost actually consists of function evaluations used
up for both successful and failed runs. Therefore, the fraction of function evaluations of failed
runs attributed to every successful run is added to the average function evaluations for
successful runs to give the actual function evaluations for a limited number of runs. The term
Fail.(Ave)f /Success is the fraction of total failed function evaluations allocated to every
successful run. It is obvious that when stability percentage is 100, the term Fail.(Ave)f /Success
is zero and therefore, the average trial and the total average trial are equal.
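Eq. 10 is a direct computation; a small helper (names are illustrative):

```python
def total_average_trial(ave_s, ave_f, n_success, n_fail):
    """Total average trial function evaluations (Eq. 10): the average for
    successful runs plus the share of failed-run evaluations attributed
    to each successful run."""
    return ave_s + (n_fail / n_success) * ave_f
```

When all runs succeed (n_fail = 0) the second term vanishes and the total average trial equals the average trial, as noted above.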
To make a better comparison between the results obtained by the proposed hybrid PSO-SA
method and two other adopted methods, the parameters of these methods have been optimized
first by a numerical method employed by Shieh et al. [4]. These parameters were ω_max, ω_min, c1 and c2 for all three methods, and RT for the two hybrid PSO-SA methods. Through parameter optimization, near optimum parameter values were calculated by varying one parameter in each simulation. When an appropriate value was found, the process was repeated for another parameter, and this continued until all parameters were determined. The details of the simulation results are given in Appendix B. The calculated near optimum parameters
for all three methods are given in Table 1. These near optimum parameters have been used to
compare optimization results for benchmark functions in this section.
4.2 Performance study
The percentages of successful solutions obtained from the three algorithms in testing the thirty test functions are given in Table 2, showing that the proposed hybrid PSO-SA has the highest rate of success. In more than 90 percent of cases, the proposed method showed equal or superior stability behavior compared to the standard PSO and Shieh's PSO-SA methods. Our experiments show that this excellent stability is primarily due to the SA contribution, which locally improves the global best position of PSO when it is close to converging to a local optimum solution.
Table 3 and Table 4 present, respectively, the average trials and the total average trials obtained from this numerical verification. For more than 60% of the test functions the proposed method requires fewer average trials than the standard PSO, and for 73% fewer than Shieh's method; for the total average trials both percentages are 80%. The comparison shows an acceptable improvement in the number of objective function evaluations, demonstrating the effectiveness of the proposed algorithm.
4.2.1 Effect of the solution tolerance on the performance of the hybrid PSO-SA method
In most stochastic methods, the solution tolerance has a considerable effect on the performance. In all data given in Tables 2 to 4, the solution tolerance was 2×10-3. To study the effect of this parameter on the performance of the proposed PSO-SA method, two other solution tolerances have been assumed for the optimization of the benchmark functions. All other input data were the same as in the previous study. Again, the results are presented in Tables 5 to 7 using the three performance criteria discussed above. The results show that stability and average trials do not change much with the solution tolerance. However, as expected, solution stability tends to increase, and average trials and total average trials tend to decrease, as the solution tolerance value increases.
where −5.12 ≤ x_i ≤ 5.12, i = 1, …, 4 for the 4-dimensional function. This function is highly multimodal; however, the locations of the local minima are regularly distributed. The function has a global minimum value of 0 at x_i = 0, i = 1, …, 4. The average performance for 100 independent
runs was exhibited in terms of stability and speed of the proposed hybrid PSO-SA and
compared with the results of standard PSO and the Shieh’s hybrid solution procedures. The
proposed method was able to find the global minimum point in 100% of runs, while this
percentage for the standard PSO and the Shieh’s hybrid methods were 69% and 52%,
respectively. The average function evaluations used for optimization of this function using the
three methods were respectively 3094.7, 4194.1 and 3268.9. The total average trial function
evaluations were 3094.7, 8526.0 and 7284.8 demonstrating very good efficiency of the
proposed algorithm as compared to other methods. The convergence curve for this function is
given in Fig. 3 where the SA and PSO part of the solution procedure have been exhibited by
different line types. We believe the good convergence behavior of the proposed hybrid method
is related to the appropriate contribution of SA, as seen from Fig. 3. When the PSO does not improve the process, the SA fills the gap and accelerates convergence to the global optimum point.
The stability of the proposed solution procedure together with those of the Shieh’s hybrid
and standard PSO methods for Rastrigin function with different dimensions (n=2 to 10) are
given in Fig. 4. It is seen that the stability of the Shieh's hybrid and standard PSO methods is very sensitive to the dimension of the Rastrigin function, approaching zero for n = 7 and higher.
Average time complexities for this function are given in Fig. 5 and Fig. 6. In Fig. 5, the average computation time spent on successful runs over 100 independent runs is shown. It is seen that the average computation times of the proposed method are competitive for the Rastrigin function up to six dimensions. In higher dimensions, however, the standard PSO and the Shieh's method do not reach a solution. The total average time complexities are given
in Fig. 6, in which an equation similar to Eq. 10 was used for time of computations considering
both successful and failed runs. We believe that this time is a true measure of the computational cost. This figure clearly indicates the improved performance of the proposed algorithm compared to the two other optimization methods.
Stability calculations for the three optimization methods with respect to the dimension of the PSF function, shown in Fig. 8, demonstrate the superior stability performance of the proposed hybrid algorithm over the two other optimization methods. Average time complexities and total average time complexities for this function appear in Fig. 9 and Fig. 10, respectively. It is noted that although the average computation times for successful runs determined from our proposed method show some deficiencies, the total average times reveal the enhanced performance of the proposed algorithm for all considered dimensions.
f(x) = 1 + Σ_{j=1}^{10} x_j^2 / 4000 − Π_{j=1}^{10} cos(x_j / √j)    (13)
where −600 ≤ x_i ≤ 600, i = 1, …, 10. This function has a global minimum of 0 located at x_i = 0,
i=1,…,10. The function has a large number of local minima; for a 2-D case, the number of local minima is about 500 [52]. This problem has been solved by several stochastic methods of either the SA type or the population-set-based type in [52]. The results are presented in Fig. 11 for a prescribed maximum number of function evaluations.
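As a quick illustration, the 10-D Griewank function of Eq. 13 can be evaluated directly (`math.prod` requires Python 3.8+):

```python
import math

def griewank(x):
    """Griewank function (Eq. 13); global minimum 0 at x = 0."""
    s = sum(xj * xj for xj in x) / 4000.0
    p = math.prod(math.cos(xj / math.sqrt(j)) for j, xj in enumerate(x, 1))
    return 1.0 + s - p
```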
The convergence behavior of the proposed PSO-SA method is shown in Fig. 12. It is seen that the proposed method converges to the global optimum point much faster than the other stochastic methods. The number of function evaluations needed to reach a function value of 0.1 is about 4200, which is about 60% of the function evaluations required by the best of the other methods (see Fig. 11). As depicted in Fig. 12, the SA contribution is extensive and helps the convergence of the method.
A comparison between the proposed hybrid PSO-SA method with the Genetic Chaos
Optimization (GCO) [53] and the Wang's SAPSO hybrid method as suggested in [45] has been
performed using six test functions (see Table 8). The results, as presented in Table 9, show that
the proposed PSO-SA method has a better stability compared to GCO and Wang's SAPSO
methods with competitive optimum function values.
The stability percentages of the proposed hybrid PSO-SA algorithm, the Shieh's hybrid and the standard PSO methods were determined for the Griewank function with different dimensions. Fig. 13 shows the results. Again, it is seen that the proposed method is more stable than the other two optimization methods, specifically in higher dimensions. The average computation time for successful runs and the total average time were calculated for the optimization of this function with various dimensions and are depicted in Fig. 14 and Fig. 15, respectively. It is observed that both the average computation times and the total average times of our algorithm are less than those of the other two methods for all considered dimensions.
f(X_i) = Σ_{j=1}^{NP} λ_j [(P_j^r − P_j^c) / P_j^r]^2 + β [(W^r − W^c) / W^r]^2    (14)
where W, P_j, i and NP denote the weight, the effective stiffness properties, the total number of variables and the number of stiffness properties included in the optimization problem, respectively. λ_j and β are user-defined weighting factors reflecting the emphasis placed on the parameters. These factors are less than one with the following condition:
Σ_{j=1}^{NP} λ_j = 1    (15)
where ρ is the mass density of the laminate and t_k is the thickness of the kth ply group. It is assumed that the laminate under consideration has a constant number of ply groups, NL. For each individual ply group, two discrete design variables, the ply group thickness and the fiber orientation angle, were taken into consideration. The step sizes of the ply group thickness and fiber orientation angle variables were set to the thickness of one layer and 5°, respectively. The boundary limits of the design variables are:
t_k^l ≤ t_k ≤ t_k^u,    k = 1, …, NL
θ_k^l ≤ θ_k ≤ θ_k^u,    k = 1, …, NL    (17)
t^l ≤ Σ_{k=1}^{NL} t_k ≤ t^u
where θ_k is the fiber orientation angle of the kth ply group. Superscripts u and l denote the upper and lower limits, respectively.
For an orthotropic symmetric laminate, eight effective stiffness moduli are defined as [54]:
E_x^0 = |A| / ((A_22 A_66 − A_26^2) t)    (18-a)
E_y^0 = |A| / ((A_11 A_66 − A_16^2) t)    (18-b)
G_xy^0 = |A| / ((A_11 A_22 − A_12^2) t)    (18-c)
E_x^f = 12 |D| / ((D_22 D_66 − D_26^2) t^3)    (18-e)
E_y^f = 12 |D| / ((D_11 D_66 − D_16^2) t^3)    (18-f)
G_xy^f = 12 |D| / ((D_11 D_22 − D_12^2) t^3)    (18-g)
where |A| and |D| are the determinants of the A and D matrices and t is the laminate thickness.
The superscript 0 stands for the effective in-plane engineering properties in the fiber direction, transverse direction, shear and major Poisson ratio, respectively. Superscript f stands for the effective engineering bending properties. A_ij and D_ij are components of the laminate ABD stiffness matrix calculated by the classical lamination theory (CLT) as
A_ij = Σ_{k=1}^{n} (Q̄_ij)_k (Z_k − Z_{k−1}),    i, j = 1, 2, 6
D_ij = (1/3) Σ_{k=1}^{n} (Q̄_ij)_k (Z_k^3 − Z_{k−1}^3),    i, j = 1, 2, 6    (19)
where (Q̄ij)k is the transformed reduced stiffness matrix of layer group k in the global directions [55], and Zk and Zk−1 are, respectively, the distances of the upper and lower surfaces of the kth layer group from the laminate mid-plane, as shown in Fig. 16.
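Equations (18) and (19) together define a small computational pipeline from the ply stack to the effective engineering constants. A minimal Python sketch follows; the function names are our own, and the stiffness components indexed 1, 2, 6 are mapped to rows/columns 0, 1, 2 of a 3×3 array:

```python
import numpy as np

def abd_stiffness(Qbar_list, z):
    """Assemble the in-plane (A) and bending (D) stiffness matrices,
    Eq. (19). Qbar_list holds one 3x3 transformed reduced stiffness
    matrix per layer group; z holds the n+1 interface coordinates
    z[0]..z[n] measured from the laminate mid-plane (Fig. 16)."""
    A = np.zeros((3, 3))
    D = np.zeros((3, 3))
    for k, Qb in enumerate(Qbar_list):
        A += Qb * (z[k + 1] - z[k])
        D += Qb * (z[k + 1] ** 3 - z[k] ** 3) / 3.0
    return A, D

def effective_moduli(A, D, t):
    """In-plane and flexural engineering constants, Eqs. (18-a)..(18-g);
    indices (1,2,6) of the paper map to (0,1,2) here."""
    detA, detD = np.linalg.det(A), np.linalg.det(D)
    Ex0 = detA / ((A[1, 1] * A[2, 2] - A[1, 2] ** 2) * t)
    Ey0 = detA / ((A[0, 0] * A[2, 2] - A[0, 2] ** 2) * t)
    Gxy0 = detA / ((A[0, 0] * A[1, 1] - A[0, 1] ** 2) * t)
    Exf = 12 * detD / ((D[1, 1] * D[2, 2] - D[1, 2] ** 2) * t ** 3)
    Eyf = 12 * detD / ((D[0, 0] * D[2, 2] - D[0, 2] ** 2) * t ** 3)
    Gxyf = 12 * detD / ((D[0, 0] * D[1, 1] - D[0, 1] ** 2) * t ** 3)
    return Ex0, Ey0, Gxy0, Exf, Eyf, Gxyf
```

A quick sanity check: for a single homogeneous layer, the in-plane and flexural moduli coincide, since D is then proportional to A.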
As a numerical example, a laminate with eight ply groups is designed for the specified
engineering stiffness constants. The problem input values are given in Table 10.
For comparative work, three optimization methods, i.e. the standard PSO, the standard SA
and the proposed hybrid PSO-SA methods have been used to resolve this problem. Prior to
solution, the near optimum parameters of each method have been determined by the technique
similar to that explained in Section 4.1. Again, the population size and initial temperature were
set to 50 and 1, respectively. The maximum velocity for each variable was considered to be
10% of its range. The stop criteria in all three methods were
● the cost function value reaching 2×10⁻³,
● the number of function evaluations exceeding 1×10⁶,
● the temperature dropping below 1×10⁻⁴, and
● the cost function value remaining unchanged for 400 successive iterations.
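The combined stop test is simple to implement: any single criterion triggers termination. A sketch with the threshold values quoted above (the function name is ours):

```python
def should_stop(cost, n_evals, temperature, unchanged_iters):
    """Return True when any of the four stop criteria is met."""
    return (cost <= 2e-3                 # cost target reached
            or n_evals > 1_000_000       # evaluation budget exhausted
            or temperature < 1e-4        # annealing temperature floor
            or unchanged_iters >= 400)   # stagnation of the cost value
```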
The resulting parameters are given in Table 11.
The global solution of this problem is (0/45₂/0)s with a cost function value of 0. One hundred independent computer runs have been performed to find the stability, average trial and total average trial of the solution. The results are given in Table 12. Typical variations of the cost
function with respect to the function evaluation for all three methods are exhibited in Fig. 17.
Examining Table 12 and Fig. 17, it can be seen that the proposed hybrid PSO-SA method has
improved performance and better convergence behavior compared to the standard PSO and SA
methods.
To further show the advantages of the proposed approach over the standard SA, the percentages of failed runs with respect to the temperature decrement parameter are plotted in Fig. 18. It is noted that the loss of stability at relatively small values of RT is a real weakness of the SA method, one which is much less severe in the proposed approach.
The average trial and the total average trial values with respect to the temperature decrement
parameter are also given in Fig. 19 and Fig. 20, respectively. The graphs clearly indicate better
performance of the proposed hybrid PSO-SA method as compared to the standard SA.
Minimize   f(x1, x2) = (2√2 x1 + x2) l

Subject to
g1(x1, x2) = [(√2 x1 + x2) / (√2 x1² + 2 x1 x2)] P − σ ≤ 0
g2(x1, x2) = [x2 / (√2 x1² + 2 x1 x2)] P − σ ≤ 0        (20)
g3(x1, x2) = [1 / (x1 + √2 x2)] P − σ ≤ 0
0 ≤ x1 ≤ 1,   0 ≤ x2 ≤ 1
where l = 100 cm, P = 2 kN and σ = 2 kN/cm², and x1 and x2 are continuous variables used to define the cross sections of the bars.
A feasibility-based rule has been described in [41] in which infeasible particles, i.e., particles which violate the constraints, are allowed to remain in the PSO and SA iterations. By defining a violation function, the infeasible particle best solution and the infeasible global best solution are guided towards the feasible region. We used a similar procedure in this study for imposing the constraints. The violation function is stated as
v(x1, x2) = Σ_{k=1}^{3} max[g_k(x1, x2), 0]        (21)
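For the three-bar truss, the constraints of Eq. (20) and the violation function of Eq. (21) can be sketched as follows (illustrative code; the names are ours):

```python
import math

P, SIGMA = 2.0, 2.0  # load (kN) and allowable stress (kN/cm^2), Eq. (20)

def constraints(x1, x2):
    """Stress constraints g1..g3 of the three-bar truss, Eq. (20);
    a point is feasible when every g_k <= 0."""
    r2 = math.sqrt(2.0)
    g1 = (r2 * x1 + x2) / (r2 * x1 ** 2 + 2.0 * x1 * x2) * P - SIGMA
    g2 = x2 / (r2 * x1 ** 2 + 2.0 * x1 * x2) * P - SIGMA
    g3 = 1.0 / (x1 + r2 * x2) * P - SIGMA
    return g1, g2, g3

def violation(x1, x2):
    """Total constraint violation, Eq. (21); zero at feasible points."""
    return sum(max(g, 0.0) for g in constraints(x1, x2))
```

The violation is exactly zero inside the feasible region and grows continuously outside it, which is what lets infeasible particles be ranked and pulled back toward feasibility.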
Let Xk^{i+1} be the position of the kth particle at iteration i+1 and Pk^b the best position this particle has visited so far. Pk^b is replaced by Xk^{i+1} under any of the following conditions:
a. Both Pk^b and Xk^{i+1} are feasible and f(Xk^{i+1}) < f(Pk^b).
b. Pk^b is infeasible and Xk^{i+1} is feasible.
c. Both Pk^b and Xk^{i+1} are infeasible and v(Xk^{i+1}) < v(Pk^b).
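Conditions a-c amount to a small decision rule. A sketch, assuming a point is "feasible" exactly when its violation v is zero (function names are ours):

```python
def update_personal_best(X_new, P_best, f, v):
    """Feasibility-based replacement rule (conditions a-c).
    f is the cost function, v the violation function."""
    new_feas = v(X_new) == 0
    best_feas = v(P_best) == 0
    if new_feas and best_feas:     # (a) both feasible: compare costs
        return X_new if f(X_new) < f(P_best) else P_best
    if new_feas:                   # (b) feasible newcomer replaces
        return X_new               #     an infeasible best
    if not best_feas:              # (c) both infeasible: compare violations
        return X_new if v(X_new) < v(P_best) else P_best
    return P_best                  # infeasible newcomer never replaces
                                   # a feasible best
```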
The global best position, Pg, is also updated in a similar way. The acceptance probability in
SA for updating the global best position is determined as:
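The acceptance equation itself did not survive the conversion of this manuscript; a standard Metropolis-type rule, which SA-based hybrids of this kind typically use, would look like the following sketch (an assumption on our part, not the paper's verbatim formula):

```python
import math
import random

def accept_as_global_best(f_new, f_best, T, rng=random):
    """Metropolis-type acceptance (assumed form): always accept an
    improvement; accept a worse candidate with probability
    exp(-(f_new - f_best) / T), which shrinks as T cools."""
    if f_new <= f_best:
        return True
    return rng.random() < math.exp(-(f_new - f_best) / T)
```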
This problem has been solved by a hybrid method based on PSO and differential evolution
(PSO-DE) in [56] and by a swarm optimization algorithm in [57]. Our proposed algorithm was
applied to this problem with c1 = 1.4, c2 = 1.6, ωmin = 0.25, ωmax = 0.55, T0 = 1 and RT = 0.95. The
obtained results together with the reference solutions are given in Table 13 showing that the
results of the proposed algorithm are very competitive with less function evaluation numbers.
It is noted that although the PSO-DE method has marginally better objective function values than the proposed PSO-SA algorithm, the advantage of our method lies in the smaller number of function evaluations required. The stability of the solution was 100%. The overall results confirm that the hybrid PSO-SA has substantial capability in handling constrained optimization problems.
6. Conclusions
A new hybrid PSO-SA algorithm is presented. The algorithm uses the PSO and SA algorithms in sequence through a reciprocating mechanism. The new
hybridization technique improves the social behavior of a swarm by using SA to find a better
global best position of the swarm. The hybrid algorithm avoids the sensitivity of the solution performance to the selection of the initial point while maintaining the merits of SA in global exploration and exploitation. The proposed method has been validated by optimization of thirty
different benchmark functions. The results show that, in general, the proposed method has good
stability together with reasonable speed in finding global optimum point. Moreover, it is found
that the proposed method has better stability as compared to the standard PSO and Shieh’s
hybrid PSO-SA algorithms in optimization of continuous functions with high dimensions. Two
engineering problems (i) a laminated composite design problem and (ii) design optimization of
a three-bar truss have also been included to show the capability of the proposed method in
solving various types of optimization problems. Overall, it is concluded that the method can
effectively and reliably be used for combinatorial and non-linear optimization purposes.
Appendix A
Domain: x1, x2 ∈ [−10, 10]
Modality: multi-modal
Global optimum solution: -186.730
Table B.2. The effect of ωmin on the performance of Shieh's method.
Table B.5. The effect of the cooling parameter (RT) on the performance of Shieh's method.
Table B.7. The effect of ωmin on the performance of the standard PSO.
Table B.10. The effect of ωmax on the performance of the proposed hybrid PSO-SA algorithm.
Table B.11. The effect of ωmin on the performance of the proposed hybrid PSO-SA algorithm.
Table B.12. The effect of c1 on the performance of the proposed hybrid PSO-SA algorithm.
Table B.14. The effect of the cooling rate (RT) on the performance of the proposed hybrid PSO-SA algorithm.
References
[1] M. Bartholomew-Biggs, Nonlinear optimization with engineering applications, Springer US,
2009.
[2] J.S. Arora, Global optimization concepts and methods (Chapter 16), in: Introduction to
optimum design (fourth edition), Academic Press, 2017, pp. 707-738.
[3] T. Rassias, C.A. Floudas, S. Butenko, Optimization in science and engineering, Springer
US, 2014.
[4] H.-L. Shieh, C.-C. Kuo, C.-M. Chiang, Modified particle swarm optimization algorithm
with simulated annealing behavior and its numerical verification, App. Math. Comput. 218
(2011) 4365-4383.
[5] A. Meng, Z. Li, H. Yin, S. Chen, Z. Guo, Accelerating particle swarm optimization using
crisscross search, Info. Sci. 329 (2016) 52-72.
[6] M.S. Kiran, M. Gunduz, O.K. Baykan, A novel algorithm based on particle swarm and ant
colony optimization for finding the global minimum, App. Math. Comput. 219 (2012) 1515-
1521.
[7] A.A. Mousa, M.A. El-Shorbayy, W.F. Abd-El-Wahed, Local search based hybrid particle
swarm optimization algorithm for multiobjective optimization, Swarm Evol. Comput. 3 (2012)
1-14.
[8] W. Deng, R. Chen, B. He, Y. Liu, L. Yin, J. Guo, A novel two-stage hybrid swarm intelligence optimization algorithm and application, Soft Comput. 16 (2012) 1707-1722.
[9] J. Kennedy, R.C. Eberhart, Particle swarm optimization, in: Proc. IEEE Int. Conf. Neural
Net. IV, 1995, pp. 1942-1948.
[10] R.C. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Proc. Sixth
Int. Symp. Micro Mach. and hum. Sci., Nagoya, Japan (1995) 39-43.
[11] J. Kennedy, R.C. Eberhart, Y. Shi, Swarm Intelligence, Morgan Kaufmann Pub. (2001)
USA.
[12] Y. Shi, R.C. Eberhart, Parameter selection in particle swarm optimization, in: Proc. IEEE Int. Conf. Evol. Prog. (1998) 591-600.
[13] M. Taherkhani, R. Safabakhsh, A novel stability-based adaptive inertia weight for particle
swarm optimization, Appl. Soft Comput. 38 (2016) 281-295.
[14] P.J. Angeline, Using selection to improve particle swarm optimization, in: Proc. IEEE Int. Conf. Evol. Comput., 1998, pp. 84-89.
[15] S.N. Omkar, D. Mudigere, G. Narayana Naik, S. Gopalakrishnan, Vector evaluated
particle swarm optimization (VEPSO) for multi-objective design optimization of composite
structures, Comput. Struct. 36 (1-2) (2008) 1-14.
[16] S.N. Omkar, R. Khandelwal, T.V.S. Ananth, G. Narayana, S. Gopalakrishnan, Quantum
behaved Particle Swarm Optimization (QPSO) for multi-objective design optimization of
composite structures, Expert Sys. Appl. 36 (2009) 11312-11322.
[17] N. Ben Guedria, Improved Accelerated PSO Algorithm for Mechanical Engineering
Optimization Problems, Appl. Soft Comput. 40 (2016) 455-467.
[18] A.E.M. Zavala, A.H. Aguirre, E.R. Villa Diharce, Particle evolutionary swarm
optimization Algorithm (PESO), in: Proc. Mex. Int. Conf. Comput. Sci. 2005 (2005) 282-289.
[19] A. Banks, J. Vincent, C. Anyakoha, A review of particle swarm optimization. Part I:
background and development, Nat. Comput. 6 (2007) 467-484.
[20] A. Rezaee Jordehi, J. Jasni, Particle swarm optimization for discrete optimization
problems, a review, Artif. Intell. Rev. 43 (2015) 243-258.
[21] K.E. Parsopoulos, M.N. Vrahatis, Particle swarm optimization and intelligence: advances and applications, IGI Global, USA, 2010.
[22] R. Kathiravan, R. Ganguli, Strength design of composite beam using gradient and particle
swarm optimization, Comp. Struct. 81(4) (2007) 471-479.
[23] N. Metropolis, A.W. Rosenbluth, M.N. Rosenbluth, A.H. Teller, E. Teller, Equations of
state calculations by fast computing machines, J. Chem. Phys. 21 (1953) 1087-1092.
[24] K. Genovese, L. Lamberti, C. Pappalettere, Improved global-local simulated annealing
formulation for solving non-smooth engineering optimization problems, Int. J. Solids Struct.
42 (2005) 203-237.
[25] W. Michiels, E. Aarts, J. Korst, Theoretical aspect of local search, Springer-Verlag Berlin
Heidelberg, 2007.
[26] J.J. Schneider, M. Puchta, Investigation of acceptance simulated annealing - a simplified
approach to adaptive cooling schedule, Physica A 389 (2010) 5822-5831.
[27] B. Hajek, Cooling schedules for optimal annealing, Math. Oper. Res. 13 (1988) 311-329.
[28] L. Liu, H. Mu, J. Yang, X. Li, F. Wu, A simulated annealing for multi-criteria optimization
problem: DBMOSA, Swarm Evol. Comput., 14 (2014) 48-65.
[29] M. Di Sciuva, M. Gherlone, D. Lombario, Multiconstrained optimization of laminated
and sandwich plates using evolutionary algorithms and higher-order plate theories, Comp.
Struct. 59 (2003) 149-154.
[30] M. Akbulut, F.O. Sonmez, Design optimization of laminated composites using a new
variant of simulated annealing, Comput. Struct. 89(17-18) (2011) 1712-1724.
[31] O. Erdal, F.O. Sonmez, Optimum design of composite laminates for maximum buckling
load capacity using simulated annealing, Comp. Struct. 71 (2005) 45-52.
[32] F. Javidrad, R. Nouri, A simulated annealing method for design of laminates with required
stiffness properties, Compos. Struct. 93 (2011) 1127-1135.
[33] S. Karakaya, O. Soykasap, Natural frequency and buckling optimization of laminated
hybrid composite plates using genetic algorithm and simulated annealing, Struct. Multidisc.
Optim. 43 (2011) 61-72.
[34] W.-J. Xia, Z.-M. Wu, A hybrid particle swarm optimization approach for the job-shop
scheduling problem, Int. J. Adv. Manuf. Technol. 29 (2006) 360-366.
[35] J. Behnamian, S.M.T. Fatemi Ghomi, Development of a PSO-SA hybrid metaheuristic for
a new regression model to time series forecasting, Exp. Sys. Appl. 37 (2010) 974-984.
[36] F. Zhao, Q. Zhang, D. Yu, X. Chen, Y. Yang, A hybrid algorithm based on PSO and
simulated annealing and its application for partner selection in virtual enterprise, In:
Proceedings Advances in Intelligent Computing: Int. Conf. Intelligent Comput. ICIC 2005
China, Part I, LNCS 3644, D.S. Huang, X.-P. Zhang, G.-B. Huang (Eds.), Springer-Verlag
Berlin Heidelberg (2005) 380-389.
[37] A. Hadidi, A. Kaveh, B. Farahmand Azar, S. Talataheri, C. Farahmandpour, An efficient
hybrid algorithm based on particle swarm and simulated annealing for optimal design of space
trusses, Int. J. Optim. Civil Eng. 3 (2011) 377-395.
[38] X. Deng, Z. Wen, Y. Wang, P. Xiang, An improved PSO algorithm based on mutation
operator and simulated annealing, Int. J. Multimedia Ubiquitous Eng. 10 (10) (2015) 369-380.
[39] J. Zhong, T. Xu, K. Guan, B. Zou, Determination of ductile damage parameters using
hybrid particle swarm optimization, Experimental Mech. 56(6) (2016) 945-955.
[40] G. Wang, Z. Ma, Hybrid particle swarm optimization for first-order reliability method,
Comput. Geotech. 81 (2017) 49-58.
[41] Q. He, L. Wang, A hybrid particle swarm optimization with a feasibility-based rule for
constrained optimization, App. Math. Comput. 86(2) (2007) 1407-1422.
[42] A. Costa, G. Celano, S. Fichera, Optimization of multi-pass turning economies through a
hybrid particle swarm optimization technique, Int. J. Adv. Manuf. Technol. 53 (2011) 421-433.
[43] N. Sadati, T. Amraee, A.M. Ranjbar, A global particle swarm-based-simulated annealing
optimization technique for under-voltage load shedding problem, Appl. Soft Comput. 9 (2009)
652-657.
[44] N. Safaei, R. Tavakkoli-Moghaddam, C. Kiassat, Annealing-based particle swarm
optimization to solve the redundant reliability problem with multiple component choices, Appl.
Soft Comput. 12 (2012) 3462-3471.
[45] X.-H. Wang, J.-J. Li, Hybrid particle swarm optimization with simulated annealing, in: Proc. Third Conf. Machine Learning Cybernetics, Shanghai (2004) 26-29.
[46] N. Sadati, M. Zamani, Hybrid particle swarm-based-simulated annealing optimization
techniques, IEEE Conference, 1-4244-0136-4/06 (2006).
[47] L. Idoumghar, M. Melkemi, R. Schott, M.I. Aouad, Hybrid PSO-SA type algorithm for
multimodal function optimization and reducing energy consumption in embedded systems,
Appl Computational Intell Soft Comput. (2011) doi:10.1155/2011/138078.
[48] F. Marini, B. Walczak, Particle swarm optimization (PSO). A tutorial, Chemometrics
Intell Lab Sys. Accepted Manuscript (2015), doi:10.1016/j.chemolab.2015.08.020.
[49] R.C. Eberhart, Y. Shi. J. Kennedy, Swarm intelligence, Morgan Kaufmann Burlington,
MA, 2001.
[50] L. Ozdamar, M. Demirhan, Experiments with new stochastic global optimization search
techniques, Comput. Oper. Res. 27(9) (2000) 841-865.
[51] T. Steihaug, S. Suleiman, Global convergence and the Powell singular function, J. Glob.
Optim. 56 (3) (2013) 845-853.
[52] M. Montaz Ali, C. Khompatraporn, Z.B. Zabinsky, A numerical evaluation of several
stochastic algorithms on selected continuous test problems, J. Glob. Optim. 31 (2005) 635-672.
[53] Y.D. Li, S.Y. Li, A new genetic chaos optimization combination method, Contr. Theo.
Appl. 19(1) (2002) 143-145.
[54] W. Zhang, K.E. Evans, A FORTRAN program for the design of laminates with required
mechanical properties, Comput. Struct. 45(5-6) (1992) 919-939.
[55] M.W. Hyer, Stress analysis of fiber-reinforced composite materials, McGraw Hill; 1998.
[56] H. Liu, Z. Cai, Y. Wang, Hybridizing particle swarm optimization with differential
evolution for constrained numerical and engineering optimization, Appl. Soft Comput. 10
(2010) 629-640.
[57] T. Ray, K.M. Liew, Society and civilization: an optimization algorithm based on the
simulation of social behavior, IEEE Trans. on Evol Comput. 7 (4) 2003 386-396.
The first author’s biography
Farhad Javidrad is currently a professor of Aeronautical Engineering. Dr. Javidrad received his Ph.D. in 1995
from Imperial College of Science, Technology and Medicine of London, England, and his M.Sc. in mechanical
engineering from Tehran University, Iran in 1991. His research interests include optimization methods, fracture
mechanics and composite materials. He has published over 50 research papers in national and international
journals and 3 books in these areas.
Fig.2. Typical convergence behavior of the three methods used to find the global optimum of the Schubert's function. [Ordinate: objective function values (log scale); abscissa: function evaluations (log scale).]
Fig.3. SA and PSO contributions to the optimization of Rastrigin's function by the proposed hybrid PSO-SA method. [Ordinate: objective function values (log scale); abscissa: function evaluations (log scale).]
Fig.4. Percentage of successful runs for three optimization methods (proposed hybrid PSO-SA, standard PSO, Shieh's hybrid PSO-SA) with respect to dimension of the Rastrigin function.
Fig.5. Average time complexities (s) for successful runs of the Rastrigin function by three optimization methods, for dimensions 2D-10D.
Fig.6. Total average time (s) of computation for the Rastrigin function by three optimization methods, for dimensions 2D-10D.
Fig.7. SA and PSO contributions to the optimization of the PSF function by the proposed hybrid PSO-SA method. [Ordinate: objective function values (log scale); abscissa: function evaluations.]
Fig.8. Percentage of successful runs for three optimization methods with respect to dimension of the PSF function (8D-24D).
Fig.9. Average time complexities (s) for successful runs of the PSF function by three optimization methods (8D-24D).
Fig.10. Total average time (s) of computation for the PSF function by three optimization methods (proposed hybrid PSO-SA, standard PSO, Shieh's hybrid PSO-SA; 8D-24D).
Fig.11. Optimization of the 10-D Griewank function by several stochastic methods [52]. (IHR: Improving Hit-and-Run, HNS: Hide-and-Seek, CRS: Controlled Random Search, GA: Genetic Algorithm, DE: Differential Evolution) [Ordinate: function values; abscissa: function evaluations.]
Fig.12. Convergence behavior of the 10-D Griewank function by the proposed hybrid PSO-SA method. [Ordinate: objective function values (log scale); abscissa: function evaluations.]
Fig.13. Percentage of successful runs for three optimization methods with respect to dimension of the Griewank function.
Fig.14. Average time complexities (s) for successful runs of the Griewank function by three optimization methods (2D-6D).
Fig.15. Total average time (s) of computation for the Griewank function by three optimization methods (2D-6D).
Fig.16. [Sketch of the laminate cross-section: coordinate axes x, y, z; layer groups k = 1, 2, …, n; Zk and Zk−1 mark the distances of the kth layer group's surfaces from the mid-plane.]
Fig.17. [Typical variations of the cost function (log scale) with respect to function evaluations (log scale) for the three methods.]
Fig. 18. Percentage of failed runs with respect to the temperature decrement parameter, RT (0.6-1).
Fig. 19. Average trial with respect to the temperature decrement parameter.
Fig. 20. The total average trial with respect to the temperature decrement parameter.
Method | RT | ωmax | ωmin | c1 | c2
Standard PSO | N/A | 0.6 | 0.15 | 1.8 | 1
Shieh's hybrid PSO-SA | 0.6 | 0.6 | 0.2 | 1.9 | 1.3
The proposed hybrid PSO-SA | 0.95 | 0.55 | 0.1 | 0.8 | 1.7
Table 2. Stability of solution determined using three optimization methods.
Method | T0 | RT | ωmax | ωmin | c1 | c2
Standard PSO | N/A | N/A | 0.65 | 0.05 | 1.9 | 2
Standard SA | 1 | 0.98 | N/A | N/A | N/A | N/A
The proposed hybrid PSO-SA | 1 | 0.96 | 0.85 | 0.15 | 0.8 | 1.7
Table 12. Performance of the optimization methods in finding global optimum solution of the
laminate stiffness design problem.
Table 13. Comparison of the three-bar truss design problem results of the hybrid PSO-SA method with other state-of-the-art algorithms.

Method | Best | Mean | Worst | SD | Function evaluations
PSO-DE [56] | 263.89584338 | 263.89584338 | 263.89584338 | 4.5E-10 | 17,600
Ray and Liew [57] | 263.89584654 | 263.90335672 | 263.96975638 | 1.3E-02 | 17,610
Proposed hybrid PSO-SA | 263.89591830 | 263.89656630 | 263.89699970 | 9.8E-08 | 12,530
Table A.1. Test functions used in the numerical experiments.

ID 1 (Dim 2), Bukin: f(x1, x2) = 100 √|x2 − 0.01 x1²| + 0.01 |x1 + 10|; global optimum 0; x1 ∈ [−15, −5], x2 ∈ [−3, 3].
ID 2 (Dim 2), Chichinadze: f(x1, x2) = x1² − 12 x1 + 11 + 10 cos(π x1/2) + 8 sin(5π x1) − (1/5)^(1/2) exp(−0.5 (x2 − 0.5)²); global optimum −43.3159; x1, x2 ∈ [−30, 30].
ID 3 (Dim 2), Drop-Wave: f(x1, x2) = −[1 + cos(12 √(x1² + x2²))] / [0.5 (x1² + x2²) + 2]; global optimum −1; x1, x2 ∈ [−5.12, 5.12].
ID 4 (Dim 2), Egg-Holder: f(x1, x2) = −(x2 + 47) sin √|x2 + 0.5 x1 + 47| − x1 sin √|x1 − (x2 + 47)|; global optimum −959.641; x1, x2 ∈ [−512, 512].
ID 9 (Dim 2), Sphere Model: f(x1, x2) = 100 (x2 − x1²)² + (x1 − 1)²; global optimum 0; x1, x2 ∈ [−512, 512].
ID 10 (Dim 2), Styblinski-Tang: f(x1, x2) = 0.5 Σ_{j=1}^{2} (x_j⁴ − 16 x_j² + 5 x_j); global optimum −78.332; x1, x2 ∈ [−5, 5].
ID 11 (Dim 2), W/Wavy: f(x1, x2) = 1 − (1/2) Σ_{j=1}^{2} cos(10 x_j) exp(−x_j²/2); global optimum 0; x1, x2 ∈ [−π, π].
ID 12 (Dim 3), Box-Betts Quadratic Exponential: f(x1, x2, x3) = Σ_{j=1}^{10} (exp(−0.1 j x1) − exp(−0.1 j x2) − [exp(−0.1 j) − exp(−j)] x3)²; global optimum 0; x1, x3 ∈ [0.9, 1.2], x2 ∈ [9, 11.2].
ID 13 (Dim 3), De Jong: f(x1, x2, x3) = x1² + x2² + x3²; global optimum 0; x1, x2, x3 ∈ [−5.12, 5.12].
ID 14 (Dim 4), Csendes: f = Σ_{j=1}^{4} x_j⁶ (2 + sin(1/x_j)); global optimum 0; x_j ∈ [−1, 1], j = 1, …, 4.
ID 15 (Dim 4), Griewank: f = 1 + Σ_{j=1}^{4} x_j²/4000 − Π_{j=1}^{4} cos(x_j/√j); global optimum 0; x_j ∈ [−100, 100], j = 1, …, 4.
ID 16 (Dim 4), Rastrigin: f = Σ_{j=1}^{4} (x_j² − 10 cos(2π x_j) + 10); global optimum 0; x_j ∈ [−5.12, 5.12], j = 1, …, 4.
ID 17 (Dim 4), Schwefel: f = 4 × 418.9829 − Σ_{j=1}^{4} x_j sin √|x_j|; global optimum 0; x_j ∈ [−500, 500], j = 1, …, 4.
ID 18 (Dim 4), Stretched V Sine Wave: f = Σ_{j=1}^{3} (x_{j+1}² + x_j²)^0.25 [sin²(50 (x_{j+1}² + x_j²)^0.1) + 0.1]; global optimum 0; x_j ∈ [−10, 10], j = 1, …, 4.
ID 19 (Dim 5), Whitley: f = Σ_{j=1}^{5} Σ_{i=1}^{5} {[100 (x_j² − x_i)² + (1 − x_i)²]² / 4000 − cos[100 (x_j² − x_i)² + (1 − x_i)²] + 1}; global optimum 0; x_j ∈ [−2, 2], j = 1, …, 5.
ID 20 (Dim 6), Pintér: f = Σ_{j=1}^{6} j x_j² + Σ_{j=2}^{5} 20 j sin²(x_{j−1} sin x_j − x_j + sin x_{j+1}) + Σ_{j=2}^{5} j log10(1 + j (x_{j−1}² − 2 x_j + 3 x_{j+1} − cos x_j + 1)²); global optimum 0; x_j ∈ [−10, 10], j = 1, …, 6.
ID 21 (Dim 6), Step 2: f = Σ_{j=1}^{6} (⌊x_j + 0.5⌋)²; global optimum 0; x_j ∈ [−100, 100], j = 1, …, 6.
ID 22 (Dim 8), Rosenbrock: f = Σ_{j=1}^{7} [100 (x_j² − x_{j+1})² + (x_j − 1)²]; global optimum 0; x_j ∈ [−5, 10], j = 1, …, 8.
ID 23 (Dim 8), Zakharov: f = Σ_{j=1}^{8} x_j² + (Σ_{j=1}^{8} 0.5 j x_j)² + (Σ_{j=1}^{8} 0.5 j x_j)⁴; global optimum 0; x_j ∈ [−5, 10], j = 1, …, 8.
ID 24 (Dim 10), Trigonometric 2: f = 1 + Σ_{i=1}^{10} {8 sin²[7 (x_i − 0.9)²] + 6 sin²[14 (x_i − 0.9)²] + (x_i − 0.9)²}; global optimum 1; x_j ∈ [−500, 500], j = 1, …, 10.
ID 25 (Dim 10), Trid 10: f = Σ_{j=1}^{10} (x_j − 1)² − Σ_{j=2}^{10} x_j x_{j−1}; global optimum −200; x_j ∈ [−100, 100], j = 1, …, 10.
ID 26 (Dim 10), Variably Dimensioned: f = Σ_{j=1}^{10} (x_j − 1)² + [Σ_{j=1}^{10} j (x_j − 1)]² + [Σ_{j=1}^{10} j (x_j − 1)]⁴; global optimum 0; x_j ∈ [−100, 100], j = 1, …, 10.
ID 28 (Dim 16), Powell Singular: f = Σ_{j=1}^{4} [(x_{4j−3} + 10 x_{4j−2})² + 5 (x_{4j−1} − x_{4j})² + (x_{4j−2} − 2 x_{4j−1})⁴ + 10 (x_{4j−3} − x_{4j})⁴]; global optimum 0; x_j ∈ [−4, 5], j = 1, …, 16.
ID 29 (Dim 16), Pintér: same form as ID 20 with the sums taken over j = 1, …, 16 and j = 2, …, 15; global optimum 0; x_j ∈ [−10, 10], j = 1, …, 16.
ID 30 (Dim 32), Brown: f = Σ_{j=1}^{31} [(x_j²)^(x_{j+1}² + 1) + (x_{j+1}²)^(x_j² + 1)]; global optimum 0; x_j ∈ [−1, 4], j = 1, …, 32.
Table B.1. The effect of ωmax on the performance of Shieh's method.
Average trial Total average trial Stability
ωmax ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.9 5247.9 6520.8 4299.4 4437.4 5247.9 20359.5 47193.9 28683.1 100 35 10 16
0.85 4634.4 5832.5 3819.7 4119.7 4634.4 17313.3 17380.8 9660.4 100 37 25 43
0.8 3957.2 5177.1 3434.1 3679.8 3957.2 13792.0 10411.4 5469.1 100 41 37 68
0.75 3641.8 4693.6 3210.5 3299.1 3641.8 12373.6 10279.8 4013.3 100 42 37 83
0.7 3287.1 4098.1 3044.1 2954.0 3287.1 11651.8 8159.4 3392.3 100 41 43 88
0.65 3161.1 3556.2 2675.8 2683.1 3161.1 12756.3 7383.4 3054.1 100 34 45 89
0.6 3059.8 3329.5 2632.2 2480.1 3059.8 9078.9 5537.2 2899.3 100 44 55 87
0.55 2958.8 3093.4 2474.6 2354.3 2958.8 8643 6134.6 2587.9 100 43 48 92
Table B.2. The effect of ωmin on the performance of Shieh's method.
Average trial Total average trial Stability
ωmin ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.5 3087.8 3675.77 2511.27 2607.93 3087.8 10453.5 7900.1 2783.21 100 40 41 94
0.45 3059.78 3329.45 2632.24 2480.14 3059.78 9078.86 5537.2 2899.25 100 44 55 87
0.4 2974.03 3201.31 2461.92 2466.69 2974.03 9289 6295.81 2796.49 100 42 48 89
0.35 2896.34 3234.39 2442.46 2345.32 2896.34 10286.9 5073.07 2535.42 100 38 57 93
0.3 3027.39 3051.11 2348.98 2260.52 3027.39 9726.74 4325.56 2453.18 100 38 64 93
0.25 2917.07 3309.57 2365.2 2250.62 2917.07 8005.76 5090.57 2434.42 100 46 56 93
0.2 2893.31 2975.44 2276.76 2203.65 2893.31 8164.65 4603.44 2353.89 100 43 59 94
0.15 2838.77 3239.1 2232.55 2115.74 2838.77 8616.15 4756.54 2322.75 100 41 56 92
Table B.3. The effect of c1 on the performance of Shieh's method.
Average trial Total average trial Stability
c1
ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.2 1340.9 1182.7 1707.8 1680.0 1340.9 78419.7 4002.2 1742.3 100 3 49 97
0.4 1603.2 1491.0 1736.2 1631.4 1603.2 59288.5 4661.8 1714.2 100 4 43 96
0.6 1815.9 1253.6 1759.1 1622.1 1815.9 34404.9 3666.2 1725.8 100 7 54 95
0.8 2009.9 1501.7 1793.4 1621.2 2009.9 36107.1 4043.7 1621.2 100 7 50 100
1 2185.1 1804.3 1866.0 1711.4 2185.1 38768.7 4775.7 1795.1 100 7 46 96
1.1 2286.5 1998.3 1877.5 1674.9 2286.5 39551.0 4360.2 1715.3 100 7 50 98
1.2 2417.1 2183.8 1936.9 1758.9 2417.1 18951.5 4209.0 1780.0 100 15 53 99
1.3 2522.9 2371.0 1905.0 1803.3 2522.9 13016.2 3501.0 1867.8 100 23 63 97
1.4 2635.1 2528.0 2087.2 1870.6 2635.1 9272.7 4526.8 1984.9 100 32 53 95
1.5 2628.7 2452.5 2094.0 1938.2 2628.7 13187.8 5176.6 1959.3 100 24 48 99
1.65 2825.6 2807.7 2159.1 2032.6 2825.6 10094.6 5005.8 2205.5 100 33 52 93
1.7 2881.4 2993.3 2190.3 2110.4 2881.4 8718.7 4741.7 2159.2 100 40 56 98
1.8 2893.3 2975.4 2276.8 2203.7 2893.3 8164.7 4603.4 2353.9 100 43 59 94
1.9 3025.1 3425.5 2416.2 2299.0 3025.1 7225.3 5376.9 2376.1 100 51 54 97
2.0 3045.9 3502.2 2672.0 2464.5 3045.9 7423.8 7355.7 2757.7 100 52 44 97
2.1 3095.1 3621.5 2633.4 2611.0 3095.1 6781.9 3106.4 3106.4 100 59 55 85
Table B.4. The effect of c2 on the performance of Shieh's method.
Average trial Total average trial Stability
c2
ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.8 3883.3 4893.6 2585.2 14051.5 3883.3 6388.6 23447.5 1727.6 100 82 17 97
1 3464.7 4056.1 2492.1 4028.8 3464.7 5789.2 9713.0 1826.7 100 76 34 96
1.1 3180.9 3942.5 2385.7 3007.6 3180.9 4971.3 7862.3 1899.4 100 82 40 98
1.2 3204.4 3722.4 2503.1 2459.0 3204.4 5679.1 8032.0 1939.3 100 70 39 99
1.3 3012.1 3657.5 2459.3 1977.3 3012.1 5445.1 6127.8 2140.7 100 71 52 97
1.4 2508.3 2810.7 1981.9 1856.3 2508.3 16478.6 4742.4 1733.2 100 19 50 95
1.5 2554.9 2163.2 2019.6 1775.8 2554.9 12056.0 4185.6 1755.0 100 24 55 99
1.65 2471.8 2288.9 2020.7 1861.6 2471.8 16752.6 4417.2 1838.8 100 18 53 93
1.7 2472.4 2045.7 2039.3 1925.7 2547.3 19601.0 3963.2 1860.0 99 15 58 98
1.8 2425.7 2234.1 1937.5 1943.3 2425.7 20002.2 4824.5 1899.0 100 15 48 94
1.9 2472.8 2158.3 2019.8 2024.1 2472.8 30994.1 4543.1 2002.2 100 10 53 97
Table B.5. The effect of the cooling parameter (RT) on the performance of Shieh's method.
Average trial Total average trial Stability
RT
ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.99 3087.7 18942.2 4753.9 2016.0 3087.7 2730.7 2216.0 2016.0 100 34 77 100
0.95 3019.9 21587.5 4936.0 1996.1 3019.9 2744.7 2317.3 1996.1 100 31 76 100
0.9 2901.2 14095.8 5697.4 2029.1 2901.2 3262.2 2220.0 2029.1 100 44 72 100
0.85 2876.5 13326.1 4394.8 2039.9 2876.5 2858.0 2300.8 2039.9 100 40 76 100
0.8 2779.7 9985.8 4362.7 2095.0 2779.7 3173.0 2332.1 2095.0 100 46 73 100
0.75 2753.0 10784.2 4246.8 2137.1 2753.0 3239.5 2383.8 2137.1 100 40 73 100
0.7 2653.6 10164.0 4045.2 2174.8 2653.6 3402.1 2366.3 2174.8 100 41 74 100
0.65 2555.1 7896.9 3883.0 2203.0 2555.1 3239.9 2331.2 2203.0 100 49 74 100
0.6 2498.7 7284.8 3908.4 2183.6 2498.7 3268.9 2293.1 2183.6 100 52 71 100
0.55 2519.2 8971.4 3715.4 2219.8 2519.2 3073.1 2276.3 2219.8 100 41 71 100
0.5 2438.3 7805.4 4175.3 2207.5 2438.3 3353.8 2120.3 2207.5 100 47 61 100
Table B.6. The effect of ωmax on the performance of the standard PSO.
Average trial Total average trial Stability
ωmax ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.9 2751.6 3923.1 5568.8 3834.1 2751.6 7066.2 16280.6 3834.1 100 83 51 100
0.85 2406.1 3428.5 5021.8 3445.5 2406.1 6201.4 15807.8 3445.5 100 84 50 100
0.8 2253.8 3327.0 5038.5 3182.0 2253.8 5553.2 16710.9 3182.0 100 83 47 100
0.75 1954.1 3057.2 4344.1 2870.5 1954.1 5636.7 17290.6 2870.5 100 85 44 100
0.7 1783.8 2827.3 4234.4 2618.7 1783.8 4820.2 20298.9 2618.7 100 84 38 100
0.65 1621.5 2831.4 3960.0 2425.5 1621.5 5023.1 12013.7 2425.5 100 85 54 100
0.6 1474.7 2425.2 3552.6 2268.2 1474.7 4336.9 13476.1 2268.2 100 81 49 100
0.55 1383.7 2661.8 3235.5 2114.2 1383.7 5453.8 17120.5 2114.2 100 77 40 100
0.5 1272.6 2126.7 3490.0 2024.4 1272.6 5410.5 16348.6 2024.4 100 83 41 100
Table B.7. The effect of ωmin on the performance of the standard PSO.
Average trial Total average trial Stability
ωmin ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.5 1623.1 4360.8 3110.0 2155.4 1623.1 9341.1 4386.4 2155.4 100 66 91 100
0.45 1568.4 4025.4 2982.7 2094.9 1568.4 8948.6 4824.1 2094.9 100 67 87 100
0.4 1550.2 4068.7 2615.3 2070.8 1550.2 8870.9 5808.5 2070.8 100 67 81 100
0.35 1550.9 3627.1 2933.0 2041.2 1550.9 6795.3 5470.3 2041.2 100 77 86 100
0.3 1511.4 3691.3 2800.5 2022.0 1511.4 8355.5 4897.0 2022.0 100 67 86 100
0.25 1464.8 4006.8 2530.4 1996.5 1464.8 7171.5 5711.0 1996.5 100 75 82 100
0.2 1454.5 3777.5 2564.5 1996.6 1454.5 6438.4 5454.9 1996.6 100 78 83 100
0.15 1420.0 3885.4 2588.3 1952.0 1420.0 7256.6 5458.8 1952.0 100 73 83 100
0.1 1414.5 3946.2 2635.8 1928.0 1414.5 7654.8 7348.7 1928.0 100 70 78 100
0.05 1424.3 3660.0 2810.0 1909.2 1424.3 7048.8 7490.9 1909.2 100 75 77 100
Table B.8. The effect of c1 on the performance of the standard PSO.
Average trial Total average trial Stability
c1 ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.6 1063.4 1250.3 1622.3 1565.7 1063.4 243017.0 12149.5 1565.7 100 3 44 100
0.7 1078.7 1235.0 1699.8 1568.4 1078.7 240085.0 8628.5 1568.4 100 3 54 100
0.8 1088.2 1596.8 1605.2 1562.2 1088.2 181334.0 8826.5 1562.2 100 4 54 100
0.9 1100.2 2221.3 1595.8 1583.9 1100.2 123705.0 9456.7 1583.9 100 6 55 100
1 1144.5 2811.8 1633.5 1637.5 1144.5 74756.8 7943.7 1637.5 100 10 59 100
1.1 1164.5 2558.4 1829.0 1655.9 1164.5 74313.4 7218.1 1655.9 100 10 64 100
1.2 1201.5 2855.4 1750.6 1699.0 1201.5 41308.2 6519.7 1699.0 100 18 68 100
1.3 1235.7 2871.4 1938.7 1707.4 1235.7 31277.6 6616.3 1707.4 100 24 67 100
1.4 1289.7 3532.8 2003.1 1764.5 1289.7 21485.7 5776.0 1764.5 100 34 74 100
1.5 1287.0 3502.6 2139.4 1809.5 1287.0 16711.3 5514.1 1809.5 100 40 79 100
1.6 1310.4 3073.4 2082.8 1864.2 1310.4 14690.8 5134.6 1864.2 100 43 80 100
1.7 1373.1 3465.0 1957.4 1958.5 1373.1 13185.4 5994.8 1958.5 100 49 75 100
1.8 1428.7 3803.9 2090.6 1999.7 1428.7 10045.6 3276.8 1999.7 100 60 90 100
1.9 1405.8 3835.3 2078.0 2098.6 1405.8 10843.1 3157.3 2098.6 100 58 92 100
Table B.9. The effect of c2 on the performance of the standard PSO.
Average trial Total average trial Stability
c2 ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.6 1560.5 2960.0 3174.0 2158.9 1560.5 37115.6 34420.5 19656.4 100 27 38 40
0.7 1443.7 3395.2 2150.2 2087.3 1443.7 12682.5 16032.0 3201.4 100 59 62 92
0.8 1433.2 3627.5 2307.7 1973.6 1433.2 9538.7 8116.1 1973.6 100 67 78 100
0.9 1452.0 4374.2 2201.3 1923.9 1452.0 9402.9 5546.1 1923.9 100 68 85 100
1.0 1408.0 4194.1 2268.5 1969.8 1408.0 8526.0 5717.4 1969.8 100 69 83 100
1.1 1428.7 3803.9 2090.6 1999.7 1428.7 10045.6 3276.8 1999.7 100 60 90 100
1.2 1436.6 3482.6 2089.5 2056.8 1436.6 14730.4 4668.9 2056.8 100 45 80 100
1.3 1436.7 3702.1 2237.0 2145.7 1436.7 19981.3 4988.3 2145.7 100 36 78 100
1.4 1417.1 3324.3 2325.1 2226.4 1417.1 19772.9 4665.8 2226.4 100 35 81 100
1.5 1489.6 3005.7 2445.9 2362.4 1489.6 27616.8 5561.1 2362.4 100 27 76 100
1.6 1515.7 3345.0 2421.7 2524.9 1515.7 37259.2 5540.8 2524.9 100 21 76 100
1.7 1490.8 3660.2 2449.6 2682.7 1490.8 24656.9 6888.0 2682.7 100 30 69 100
1.8 1600.0 3219.2 2616.9 2815.4 1600.0 42527.1 6152.0 2815.4 100 19 74 100
1.9 1638.2 3307.2 2942.0 3035.1 1638.2 30701.2 7274.4 3035.1 100 25 71 100
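The sweeps in Tables B.6 through B.9 vary the inertia-weight bounds (ωmax, ωmin) and the acceleration coefficients (c1, c2) of the standard PSO update. As a point of reference for these parameters, the following is a minimal sketch of the standard velocity/position update with the common linearly decreasing inertia schedule; the default coefficient values are illustrative placeholders, not the values recommended by the study.

```python
import random

def inertia(t, t_max, w_max=0.9, w_min=0.4):
    """Linearly decreasing inertia weight from w_max to w_min over t_max
    iterations -- the schedule whose endpoints the tables sweep."""
    return w_max - (w_max - w_min) * t / t_max

def pso_step(x, v, pbest, gbest, w, c1=1.0, c2=1.0):
    """One standard PSO update for a single particle.

    w  : inertia weight (from inertia() above)
    c1 : cognitive (personal-best) coefficient
    c2 : social (global-best) coefficient
    The defaults c1 = c2 = 1.0 are illustrative only.
    """
    r1, r2 = random.random(), random.random()
    v_new = [w * vi + c1 * r1 * (pb - xi) + c2 * r2 * (gb - xi)
             for xi, vi, pb, gb in zip(x, v, pbest, gbest)]
    x_new = [xi + vi for xi, vi in zip(x, v_new)]
    return x_new, v_new
```

With this form, ωmax/ωmin control how quickly the swarm shifts from exploration (large w) to exploitation (small w), while c1 and c2 weight the pull toward the personal and global bests, which is why the tables report their effects separately.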
Table B.10. The effect of ωmax on the performance of the proposed hybrid PSO-SA algorithm.
Average trial Total average trial Stability
ωmax ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.9 1845.2 7285.1 8669.8 9229.3 1845.2 7285.1 10188.6 9229.3 100 100 94 100
0.85 1839.2 6587.0 7833.1 8236.1 1839.2 6737.7 9956.5 8236.1 100 99 91 100
0.8 1706.6 6236.3 7290.8 7369.9 1706.6 6236.3 10206.3 7369.9 100 100 88 100
0.75 1883.3 5513.9 6624.2 6579.6 1883.3 5513.9 9883.0 6579.6 100 100 86 100
0.7 1742.8 5054.9 5874.9 5854.5 1742.8 5054.9 8061.9 5854.5 100 100 90 100
0.65 1749.1 4738.7 5633.1 5308.1 1749.1 4738.7 8657.9 5308.1 100 100 86 100
0.6 1642.6 4181.7 5597.3 4717.3 1642.6 4181.7 9204.5 4717.3 100 100 84 100
0.55 1599.8 3990.1 5099.1 4307.4 1599.8 3990.1 7312.6 4307.4 100 100 89 100
0.5 1519.2 3533.0 4413.8 3883.6 1519.2 3533.0 6921.2 3883.6 100 100 87 100
Table B.11. The effect of ωmin on the performance of the proposed hybrid PSO-SA algorithm.
Average trial Total average trial Stability
ωmin ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.5 1661.7 4272.4 5227.9 4215.0 1661.7 4272.4 6109.2 4215.0 100 100 95 100
0.45 1614.6 3916.5 4870.1 4176.2 1614.6 3916.5 6332.7 4176.2 100 100 93 100
0.4 1649.9 3896.0 4725.7 4026.0 1649.9 3896.0 5993.9 4026.0 100 100 93 100
0.35 1526.7 3852.1 4740.2 3919.4 1526.7 3852.1 6270.5 3919.4 100 100 92 100
0.3 1583.1 3635.9 4707.9 3815.6 1583.1 3635.9 6281.7 3815.6 100 100 92 100
0.25 1524.5 3464.7 4460.6 3756.4 1524.5 3464.7 5930.0 3756.4 100 100 92 100
0.2 1538.3 3369.7 4339.3 3745.1 1538.3 3369.7 6015.8 3745.1 100 100 91 100
0.15 1487.0 3361.9 4524.0 3624.4 1487.0 3361.9 6011.9 3624.4 100 100 92 100
0.1 1494.4 3248.4 4157.8 3542.2 1494.4 3248.4 5504.1 3542.2 100 100 92 100
0.05 1498.9 3156.6 4184.0 3473.0 1498.9 3270.2 6066.6 3473.0 100 99 91 100
Table B.12. The effect of c1 on the performance of the proposed hybrid PSO-SA algorithm.
Average trial Total average trial Stability
c1 ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.6 1408.0 2993.3 3511.9 3067.4 1408.0 2993.3 4425.5 3067.4 100 100 94 100
0.7 1351.4 2985.3 3395.2 3074.7 1351.4 3090.6 4174.8 3074.7 100 99 95 100
0.8 1326.4 2974.5 3405.2 3025.3 1326.4 2974.5 4074.6 3025.3 100 100 96 100
0.9 1381.9 3030.2 3487.5 3175.4 1381.9 3135.2 4091.9 3175.4 100 99 96 100
1 1375.5 3099.2 3620.7 3213.1 1375.5 3099.2 4258.1 3213.1 100 100 96 100
1.1 1452.9 3246.2 3720.8 3402.7 1452.9 3359.0 4361.5 3402.7 100 99 96 100
1.2 1407.8 3255.4 3779.5 3569.3 1407.8 3255.4 4282.4 3569.3 100 100 97 100
1.3 1485.2 3340.8 3870.0 3918.6 1485.2 3340.8 4514.7 3918.6 100 100 96 100
1.4 1461.8 3410.3 4130.2 4034.0 1461.8 3410.3 4270.4 4034.0 100 100 99 100
1.5 1504.5 3433.1 4109.4 4401.1 1504.5 3561.4 4643.3 4401.1 100 99 97 100
1.6 1470.4 3520.8 4188.4 4559.7 1470.4 3770.7 4368.1 4559.7 100 98 99 100
1.7 1484.4 3704.1 4427.3 4870.6 1484.4 3704.1 4845.5 4870.6 100 100 98 100
1.8 1537.5 3796.2 4834.5 5253.4 1537.5 3796.2 4834.5 5253.4 100 100 100 100
1.9 1547.0 3935.1 4753.5 5426.6 1547.0 3935.1 4932.6 5426.6 100 100 99 100
Table B.13. The effect of c2 on the performance of the proposed hybrid PSO-SA algorithm.
Average trial Total average trial Stability
c2 ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.6 1231.89 3769.71 4662.69 6168.78 1231.89 3769.71 5200.84 6168.78 100 100 97 100
0.7 1281.94 3575.97 4830.27 4598.77 1281.94 3575.97 5585.48 4598.77 100 100 96 100
0.8 1277.19 3089.28 3801.14 2671.37 1277.19 3183.93 4871.25 2671.37 100 99 95 100
0.9 1325.21 3076.45 3249.69 2229.49 1325.21 3403.36 3809.6 2229.49 100 97 97 100
1 1236.59 2722.72 3188.88 2074.66 1236.59 2722.72 4648.89 2074.66 100 100 92 100
1.1 1360.33 2620.45 3075.73 2227.9 1360.33 2818.82 4600.05 2227.9 100 98 93 100
1.2 1294.75 2763.01 3112.44 2237.07 1294.75 3304.38 4050.97 2237.07 100 95 95 100
1.3 1322.85 2762.24 2965.16 2365.96 1322.85 2864.46 4189.51 2365.96 100 99 92 100
1.4 1372.2 2820.39 3099.87 2483.87 1372.2 3035.8 3925.45 2483.87 100 98 95 100
1.5 1393.61 2921.35 3363.94 2592.24 1393.61 3017.52 3920.08 2592.24 100 99 96 100
1.6 1374.56 2991.77 3351.33 2860.67 1374.56 3090.56 4602.52 2860.67 100 99 92 100
1.7 1326.35 2974.48 3405.23 3027.2 1326.35 2974.48 4074.6 3027.2 100 100 96 100
1.8 1447.51 3143.99 3581.02 3295.47 1447.51 3360.52 4172.58 3295.47 100 98 96 100
1.9 1426.04 3156.08 3583.65 3425.54 1426.04 3156.08 4875.93 3425.54 100 100 92 100
Table B.14. The effect of the cooling rate (RT) on the performance of the proposed hybrid PSO-SA algorithm.
Average trial Total average trial Stability
RT ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23) ID(8) ID(15) ID(20) ID(23)
0.99 1516.1 2970.6 3595.8 3237.0 1516.1 2970.6 4282.1 3237.0 100 100 96 100
0.95 1326.4 2974.5 3405.2 3025.3 1326.4 2974.5 4074.6 3025.3 100 100 96 100
0.9 1346.7 3032.7 3477.5 3072.6 1346.7 3144.8 4518.3 3072.6 100 99 94 100
0.85 1223.3 2923.0 3591.2 3131.4 1223.3 2923.0 4325.1 3131.4 100 100 96 100
0.8 1158.0 2937.5 3515.9 3065.0 1158.0 3143.2 4080.7 3065.0 100 98 96 100
0.75 1055.0 2862.7 3459.2 3064.5 1055.0 3106.2 3905.5 3064.5 100 97 96 100
0.7 1004.3 2758.1 3269.8 3157.2 1004.3 3471.4 3886.2 3157.2 100 91 94 100
0.65 952.8 2793.7 2983.3 3244.9 952.8 4266.9 3992.0 3244.9 100 82 89 100
0.6 936.4 2407.3 2621.6 3035.6 936.4 5798.3 3946.1 3035.6 100 60 82 100
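Table B.14 sweeps the cooling rate RT of the SA stage. As context for that parameter, the sketch below shows geometric cooling (T multiplied by RT each cooling step) together with the standard Metropolis acceptance rule; this is the generic SA mechanism, assumed here for illustration rather than copied from the paper.

```python
import math
import random

def cool(t0, rt, n_steps):
    """Geometric cooling schedule: T_{k+1} = RT * T_k.
    Returns the list of temperatures [T_0, T_1, ..., T_n]."""
    temps = [t0]
    for _ in range(n_steps):
        temps.append(temps[-1] * rt)
    return temps

def accept(delta_f, temperature):
    """Metropolis criterion (minimization): always accept an improving
    move (delta_f <= 0); accept a worsening move with probability
    exp(-delta_f / T)."""
    return delta_f <= 0 or random.random() < math.exp(-delta_f / temperature)
```

A smaller RT cools faster, so fewer worsening moves are accepted late in the run; this matches the trend in Table B.14, where aggressive cooling (RT near 0.6) lowers average trials on some problems but degrades stability on ID(15) and ID(20).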