
Engineering with Computers

https://doi.org/10.1007/s00366-020-01234-1

ORIGINAL ARTICLE

SGOA: annealing‑behaved grasshopper optimizer for global tasks


Caiyang Yu1 · Mengxiang Chen2 · Kai Cheng3 · Xuehua Zhao4 · Chao Ma4 · Fangjun Kuang5 · Huiling Chen1 

Received: 26 October 2020 / Accepted: 2 December 2020


© The Author(s), under exclusive licence to Springer-Verlag London Ltd. part of Springer Nature 2021

Abstract
An improved grasshopper optimization algorithm (GOA) is proposed in this paper, termed SGOA, which combines the simulated annealing (SA) mechanism with the original GOA, a nature-inspired optimizer widely used in finance, medicine and other fields, and obtains more promising results based on grasshopper behavior. To compare the performance of SGOA with other algorithms, the CEC2017 benchmark functions were selected as the test set. The Friedman test was also performed to check the significance of the proposed method against its counterparts. In comparison with ten meta-heuristic algorithms such as differential evolution (DE), the proposed SGOA ranks first on the CEC2017 suite, and it also ranks first in comparison with ten advanced algorithms. The simulation results reveal that the SA strategy notably improves the exploration and exploitation capacity of GOA. Moreover, SGOA is also applied to engineering problems and to the parameter optimization of the kernel extreme learning machine (KELM). After optimizing the parameters of KELM with SGOA, the resulting model was applied to two datasets, the Cleveland Heart Dataset and the Japanese Bankruptcy Dataset, achieving accuracies of 79.2% and 83.5%, respectively, which are better than those of the KELM models obtained with other algorithms. These practical applications indicate that the proposed SGOA can provide effective assistance in settling complex optimization problems with impressive results.

Keywords  Grasshopper optimization algorithm · Swarm intelligence · Kernel extreme learning machine · Engineering design

Fangjun Kuang and Huiling Chen have contributed equally to this work.

* Fangjun Kuang, kfj@wzbc.edu.cn
* Huiling Chen, chenhuiling.jlu@gmail.com
Caiyang Yu, Jerome0324@163.com
Mengxiang Chen, chenmengxiang2016@163.com
Kai Cheng, smileck@126.com
Xuehua Zhao, lcrlc@sina.com
Chao Ma, billmach@163.com

1 Department of Computer Science and Artificial Intelligence, Wenzhou University, Wenzhou 325035, China
2 Department of Information Technology, Wenzhou Vocational College of Science and Technology, Wenzhou 325006, Zhejiang, China
3 State Grid Hebei Information & Telecommunication Branch, Shijiazhuang 050021, China
4 School of Digital Media, Shenzhen Institute of Information Technology, Shenzhen 518172, China
5 School of Information Engineering, Wenzhou Business College, Wenzhou 325035, China

1 Introduction

1.1 Background and literature review

Along with social progress, people have always been looking for a measure to judge whether affairs are in the best state when making problem-solving decisions. In many fields, such as industrial manufacturing, scientific research, transportation, engineering research, energy management, resource utilization, engineering design, social finance, communication network optimization and so on, people have persistently aimed to find the best processing method, albeit under certain restrictions [1–5]. Therefore, optimization algorithms have emerged as the times require. Optimization refers to the process of finding the optimal feasible solution under specific restrictions, such as within a reasonable time or in a specified place, in which the working solution of the optimization problem can be evaluated quantitatively. The so-called optimization problem refers to setting parameter values so that some optimality criteria of a function (or system) are met under certain conditions, striving to maximize or minimize its performance indexes [6–11]. Using an optimization algorithm to solve an optimization problem is currently the most common and effective method, and it can be applied to a broad range of social activities [12–23].

Since optimization algorithms not only bring convenience to people's activities but are also beneficial to many aspects of national development, they have clear theoretical significance and great implementation prospects. With the sustained growth of society, people are facing increasingly complex problems. At the same time, with the penetration and utilization of computer technology, more optimization algorithms emerge. Especially after the flourishing of artificial intelligence, genetics and other disciplines, many researchers have integrated ideas from those fields into optimization. Under such a background, swarm intelligence optimization algorithms came into being. Swarm intelligence algorithms belong to the family of random optimization algorithms; they explore the solution space randomly through the iterative evolution of search agents. Many scholars are committed to the research of this kind of algorithm [24–28]. Most of these algorithms are inspired by the movement and reproduction of animals. For example, the genetic algorithm (GA) [29] is inspired by the phenomenon of genetic crossover and mutation; the ant colony algorithm (ACO) [30] is stimulated by ant colony foraging; the particle swarm algorithm (PSO) [31] is developed from the flight behavior of birds. Recently, many powerful swarm intelligence algorithms have been proposed, such as the dragonfly algorithm (DA) [32], the gray wolf optimization (GWO) [33] algorithm, the gravity search algorithm (GSA) [34], the whale optimization algorithm (WOA) [35], the grasshopper optimization algorithm (GOA) [36], etc. The GOA involved in this paper was proposed by Saremi et al. Because of its simple and efficient implementation, GOA has been used in various fields [37, 38]. Tumuluru et al. [39] updated the weights of a deep belief neural network using GOA and gradient descent, and then used the network to classify cancer-diagnosis images. Zhao et al. [40] proposed a multi-stage intelligent wind power prediction algorithm by integrating the Beveridge–Nelson decomposition method, the support vector machine and GOA. Sultana et al. [41] used GOA to optimize energy and stabilize the coefficient of the system in an original way, combining the optimal distribution of Distributed Generation (DG) and BBS. Liang et al. [42] modified the GOA with the Levy flight algorithm, and then applied the improved GOA to the multi-level threshold method to search for the optimal threshold. Omar et al. [43] employed GOA to adjust the gain values of the proportional-integral-derivative (PID) controller. Zhang et al. [44] utilized GOA to optimize the parameters of variational mode decomposition (VMD); compared with the traditional fixed parameters, the efficiency was greatly improved. Mafarja et al. [45] proposed an efficient optimizer that combines the grasshopper optimization algorithm, a selection operation and evolutionary population dynamics (EPD) to reduce the premature convergence of the simple grasshopper optimization algorithm, with good results in feature selection. Jumani et al. [46] highlighted the grasshopper optimization algorithm to obtain the optimal proportional-integral (PI) controller gains, so as to enhance the dynamic response of an islanded microgrid system. A GOA approach was introduced by Ibrahim et al. [47] to optimize the parameters affecting the support vector machine and to select features on a real biological data set. Liu et al. [48] combined linear weighting with the grasshopper optimization algorithm, established an accurate Internet energy hub model, and the results also proved the scalability and other advantages of the strategy. Wu et al. [49] made use of multiple strategies, such as natural selection and the original GOA, in the proposed adaptive GOA (AGOA), and then used the algorithm as the solver of distributed model predictive control (DMPC). Saxena et al. [50] incorporated different chaotic sequences into GOA. Three mechanisms, including Gaussian mutation, the Levy-flight strategy and opposition-based learning, were integrated into GOA by Luo et al. [51] and successfully applied to the financial stress prediction problem. Jia et al. [52] combined the adaptive differential evolution strategy with the grasshopper optimization algorithm and proposed the GOA-jDE algorithm, which improved the convergence speed and calculation accuracy of the original algorithm; at the same time, the new algorithm also plays an indispensable role in engineering design. Zakeri et al. [53] improved the grasshopper optimization algorithm into grasshopper optimization feature selection (GOFS), to make it more suitable for solving the feature selection problem. Saxena et al. [54] introduced a two-stage bridging mechanism into the grasshopper optimization algorithm and used ten different chaotic maps to form enhanced chaotic grasshopper optimization algorithms (ECGOAs). Another combination of the grasshopper optimization algorithm and adaptive differential evolution (jDE), the GOA-jDE developed by Jia et al. [55], was verified for its robustness in processing natural images by testing and evaluating on different satellite images. Ewees et al. [56] proposed a grasshopper optimization algorithm based on opposition-based learning, which was tested on benchmark functions and was useful in engineering problems. Yue et al. [57] introduced a principal component analysis strategy (PCA-GOA) and then added an inertia weight operation to enhance the overall performance of the original algorithm. Arora et al. [58] introduced chaos theory into the optimization process of GOA to speed up global convergence and improve the performance of the original GOA. Ghulanavar et al. [59] proposed an improved GOA (IGOA) for selecting the hidden units present in the bidirectional LSTM (long short-term memory) layer of AlexNet, resulting in improved accuracy of AlexNet in the classification of various gear signals.

Hence, the GOA has attracted wide attention in various fields because of its simplicity, flexibility and efficiency. However, on multimodal and other complex problems, GOA shows its shortcomings, such as a weak ability to jump out of local optimal solutions and an over-fast convergence trend. For different optimization problems, GOA generates solutions in different ways; therefore, when dealing with different optimization problems, new solution generation methods are needed. When using GOA to deal with diverse optimization problems, although a corresponding strategy can be found, it does not go far enough. Therefore, we set out to modify the core mechanism of GOA to overcome its weaknesses through more effective strategies. Compared with other algorithms, GOA has some excellent characteristics, which are formulated in the work of the original GOA and the work based on GOA; meanwhile, the improvement strategy proposed in this paper also confirms this point. GOA has a simple structure and low computational complexity, so it wins the favor of researchers more easily than other mature algorithms. In this paper, the simulated annealing (SA) [60] mechanism used to improve the algorithm performance is described in detail and the simulated annealing GOA (SGOA) is proposed, aiming at boosting the basic local search ability of GOA. The core of the SA mechanism is to jump out of a certain range of optimal solutions with a set probability to find other optimal solutions; if there is a better solution than the previous one, it replaces the current one, otherwise the current one is kept. The new SGOA is tested on 30 functions to evaluate its effectiveness. These 30 functions are selected from the well-regarded IEEE CEC2017 suite [61]. We conducted the Friedman test [62] to verify the significant differences between the SGOA and other counterparts. Finally, the proposed SGOA is applied to settle mathematical model problems and the parameter optimization of the kernel extreme learning machine (KELM) [63] model.

1.2 Novelty and contributions

The original GOA has the same defects as traditional swarm intelligence algorithms. As the population continues to evolve, the diversity of the population becomes weaker, which means that the distribution of solutions in the entire feasible region concentrates in a small area, further weakening the search ability of the algorithm. The chaotic map and Levy flight strategy are therefore introduced into the original SA. The uncertainty of the chaotic map makes it possible to explore other potential solutions in the space, ensuring the diversity of the population during the entire iteration process; at the same time, Levy flight enhances the search ability of the algorithm and improves the ability of the population to jump out of local optima. Accordingly, the SA based on chaotic mapping and Levy flight is introduced into GOA, which effectively avoids premature convergence and improves the convergence accuracy. Moreover, from the experimental results, it is not difficult to find that the SGOA proposed in this work has the fastest convergence speed in most cases, and its optimization accuracy is also satisfactory compared with other GOAs that have been published.

In general, the following contributions are made in this work.

• To effectively address the disadvantages of GOA, we introduced the SA strategy into GOA.
• The proposed SGOA is applied to engineering problems, and it has achieved promising results.
• The proposed SGOA successfully solves the parameter optimization of KELM, and the final model performs well in the classification of diseases and in financial stress prediction.

1.3 Paper organization

The rest of the paper is structured as follows. Section 2 presents a concise description of GOA. Section 3 explains the conception of SA and SGOA in detail. Section 4 introduces and analyzes the simulation results of SGOA on benchmark functions, engineering design problems and the parameter optimization of the KELM model. Section 5 summarizes the whole study and points out future work.


2 Grasshopper optimization algorithm (GOA)

GOA was proposed by Saremi et al. [36], who solved optimization problems by simulating the behavior of grasshoppers in nature. The life of a grasshopper can be divided into two stages: larval and adult. The behavior of grasshoppers in the two stages is entirely different. Grasshoppers in the larval phase move slowly; if they want to move over a wide range, they can only rely on the wind. In contrast, adult grasshoppers can move and search for objects over long distances by jumping and flying. In the GOA, the corresponding mathematical model is obtained by imitating the grasshopper's survival behavior. In the algorithm, the grasshopper's position is taken as a candidate solution. Combined with the two stages of the grasshopper's life, its movement is mainly affected by three factors. The formulation is as follows:

$$X_i = S_i + G_i + A_i \quad (1)$$

where $X_i$ defines the position of the $i$th grasshopper, $S_i$ is the social interaction, $G_i$ is the gravity force on the $i$th grasshopper, and $A_i$ shows the wind advection.

$$S_i = \sum_{\substack{j=1 \\ j \ne i}}^{N} s\left(d_{ij}\right)\hat{d}_{ij} \quad (2)$$

$$d_{ij} = \left|x_j - x_i\right| \quad (3)$$

$$\hat{d}_{ij} = \left(x_j - x_i\right)/d_{ij} \quad (4)$$

$$s(r) = f e^{-r/l} - e^{-r} \quad (5)$$

The above four formulas are used to calculate the social interaction, where $d_{ij}$ is the distance between the $i$th and $j$th grasshoppers and $\hat{d}_{ij}$ is a unit vector from the $i$th to the $j$th grasshopper. The function $s$ represents the social forces, where $f$ and $l$ represent the intensity and scale of attraction.

The composition of variable $G$ in Eq. (1) is as follows:

$$G_i = -g\,\hat{e}_g \quad (6)$$

where $g$ is the acceleration of gravity and $\hat{e}_g$ refers to a unit vector in the direction of gravity.

The composition of variable $A$ in Eq. (1) is as follows:

$$A_i = u\,\hat{e}_w \quad (7)$$

where $u$ is a constant drift and $\hat{e}_w$ is a unit vector of the wind direction.

In combination with the above formulas, we can convert the original Eq. (1) into the following form:

$$X_i = \sum_{\substack{j=1 \\ j \ne i}}^{N} s\left(\left|x_j - x_i\right|\right)\frac{x_j - x_i}{d_{ij}} - g\,\hat{e}_g + u\,\hat{e}_w \quad (8)$$

where $N$ refers to the number of grasshoppers.

To be more conducive to the establishment of the mathematical model, parameters are added on the basis of Eq. (8), so that the formula can adjust the exploitation and exploration abilities more effectively. In addition, among the three influencing factors, the influence of the gravity force is the weakest and can be ignored. At the same time, it is assumed that the wind always blows in the direction of the optimal solution. Finally, the above formula can be transformed into:

$$X_i^d = c\left(\sum_{\substack{j=1 \\ j \ne i}}^{N} c\,\frac{ub_d - lb_d}{2}\, s\left(\left|x_j^d - x_i^d\right|\right)\frac{x_j - x_i}{d_{ij}}\right) + \hat{T}_d \quad (9)$$

where $ub_d$ and $lb_d$ are the upper and lower bounds of the $d$th dimension and $\hat{T}_d$ is the value of the $d$th dimension in the current optimal solution. The parameter $c$ is updated as follows:

$$c = c_{\max} - l\,\frac{c_{\max} - c_{\min}}{L} \quad (10)$$

where $c_{\max}$ and $c_{\min}$ are the maximum and minimum values of $c$, $l$ is the current iteration, and $L$ is the maximum number of iterations.

The overall framework of GOA is as follows:
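To make the position-update rule concrete, the following is a minimal Python sketch of the core GOA step of Eqs. (9) and (10). It is an illustration only: the social-force constants f and l, the bound handling via clipping, and the assumption that the wind term points towards the best solution found so far are choices made here for the sketch, not details prescribed by the paper.

```python
import numpy as np

def s_func(r, f=0.5, l=1.5):
    # Social force of Eq. (5): attraction/repulsion as a function of distance.
    return f * np.exp(-r / l) - np.exp(-r)

def c_coeff(t, T, c_max=1.0, c_min=0.00004):
    # Decreasing coefficient of Eq. (10); cMax/cMin values taken from Table 2.
    return c_max - t * (c_max - c_min) / T

def goa_step(X, target, lb, ub, c):
    # One GOA update of Eq. (9). X is (N, D); target is the best solution so far.
    N, D = X.shape
    X_new = np.empty_like(X)
    for i in range(N):
        social = np.zeros(D)
        for j in range(N):
            if i == j:
                continue
            dist = np.linalg.norm(X[j] - X[i]) + 1e-12        # d_ij of Eq. (3)
            unit = (X[j] - X[i]) / dist                        # unit vector of Eq. (4)
            social += c * (ub - lb) / 2.0 * s_func(np.abs(X[j] - X[i])) * unit
        X_new[i] = np.clip(c * social + target, lb, ub)        # wind assumed towards target
    return X_new
```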

3 Proposed method

3.1 Simulated annealing (SA)

In 1983, the American physicist S. Kirkpatrick et al. [64] developed a set of algorithms based on the Metropolis method and used them to seek the optimal solution in combinatorial optimization. Later, Kirkpatrick et al. named the method "simulated annealing". The simulated solid annealing process from thermodynamics is employed in SA. After a solid is heated, the high-temperature environment causes the molecules to take on a turbulent, irregular arrangement. However, as the temperature is slowly reduced, the molecules gradually reach a stable solidification state, and the solid then also reaches a stable state. The annealing process is shown in Fig. 1. Starting from the initial temperature T0, the whole energy is at its maximum, and the quantum state (reflected in the algorithm as the initial state of the population) is also relatively active. After that, the quantum state reaches a balance through constant cooling, the thermal equilibrium process and the Metropolis process. At this time, the quantum has its most basic (lowest) energy.

Fig. 1  Simulated annealing in solid state

The Metropolis criterion means that even if the solution obtained in the next iteration is worse than the previous one, it will be accepted with a certain probability, abandoning the current solution. The acceptance probability formula is as follows:

$$P = \begin{cases} 1, & s_{\mathrm{new}} \le s \\ \exp\left(\dfrac{s - s_{\mathrm{new}}}{T}\right), & s_{\mathrm{new}} > s \end{cases} \quad (11)$$

where $s$ and $s_{\mathrm{new}}$ are the solutions obtained by the current iteration and the next iteration, respectively, $T$ denotes the corresponding temperature, and $P$ denotes the probability. The SA operation continues until the temperature drops to a set minimum.
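For a minimization task, the acceptance rule of Eq. (11) can be written in a few lines. The sketch below is illustrative; the geometric cooling schedule is an assumption made here for completeness, not the exact schedule used in the paper.

```python
import math
import random

def metropolis_accept(s_curr, s_new, T):
    # Acceptance rule of Eq. (11): always accept improvements,
    # accept worse solutions with probability exp((s_curr - s_new) / T).
    if s_new <= s_curr:
        return True
    return random.random() < math.exp((s_curr - s_new) / T)

# Example of an (assumed) geometric cooling schedule.
T, T_min, alpha = 1.0, 1e-6, 0.98
while T > T_min:
    # ... generate a candidate, evaluate s_new, call metropolis_accept(s, s_new, T) ...
    T *= alpha
```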

3.2 Enhanced SA

It can be seen from the previous section that the concept of SA is derived from physics. Therefore, to make the SA mechanism more consistent with GOA, SA is enhanced further. In the proposed algorithm, two operations are introduced into SA to increase population diversity: the chaotic map and Levy flight [65].

The implementation of the chaotic mapping is as follows:

$$\beta_{i+1} = \mu \beta_i \times \left(1 - \beta_i\right), \quad i = 1, 2, \ldots, s-1 \quad (12)$$

where $\mu$ is the control parameter and is always set to 4; when $\mu = 4$, the particle moves chaotically. $\beta_i$ is a random value in [0, 1], and $s$ is the number of population particles. Chaos is very sensitive to initial conditions and can be understood as a kind of dynamic motion with randomness. To increase the diversity of a premature population, chaos is embedded into SA and mapped into the grasshopper population.

The introduction of Levy flight into the SA mechanism can enhance the searching capacity of the population and ensure that the population can reach a better solution. The Levy flight operation is performed according to the following formula:

$$x_i' = x_i \times \left(1 + L(\beta)\right) \quad (13)$$

where $x_i$ is the current individual, $x_i'$ is the individual after the Levy operation, and $L(\beta)$ is a random number drawn from the Levy distribution.

Based on the above analysis, the flow chart of the improved simulated annealing is obtained, as shown in Fig. 2.

Fig. 2  Flowchart of simulated annealing
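A compact sketch of the two enhancement operators follows. The logistic map implements Eq. (12) with μ = 4 and the perturbation implements Eq. (13); the Mantegna-style generator used for the Levy-distributed step is a common choice and is an assumption here, since the paper does not spell out the generator.

```python
import numpy as np
from math import gamma, sin, pi

def logistic_map(beta0, s, mu=4.0):
    # Chaotic sequence of Eq. (12): beta_{i+1} = mu * beta_i * (1 - beta_i).
    betas = np.empty(s)
    betas[0] = beta0
    for i in range(s - 1):
        betas[i + 1] = mu * betas[i] * (1.0 - betas[i])
    return betas

def levy_step(dim, beta=1.5):
    # Levy-distributed random numbers via Mantegna's algorithm (assumed generator).
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def levy_perturb(x):
    # Eq. (13): x' = x * (1 + L(beta)).
    return x * (1.0 + levy_step(x.shape[0]))
```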

3.3 SA-based GOA

Although the original GOA has some advantages in solving optimization problems, such as a fast search for feasible solutions in the space, the balance of GOA between the exploration and exploitation phases is not better than that of other algorithms. First of all, the original GOA has a weak ability to jump out of a local optimal solution. Usually, after finding a local minimum in the solution space (taking the search for the minimum as an example), it is difficult to jump out of it and keep searching for the minimum value. Secondly, the population solution is onefold, which leads to a smaller population search range and a lower probability of encountering the optimal solution. Therefore, based on the above discussion, the first thing to solve is to help the algorithm escape from the current solution and reach the optimal one, and then to increase the diversity of the population solutions. Accordingly, the SA mechanism is incorporated into the original GOA, and Levy flight and chaotic mapping are embedded in the annealing process to increase population diversity. The general framework of SGOA is shown in Fig. 3.

Fig. 3  Flow chart of SGOA

Table 2  Parameter settings for the MAs
MA | Population size | Maximum generation | Parameter values
SGOA | 30 | 1000 | cMax = 1; cMin = 0.00004
DE | 30 | 1000 | Scaling factor = [0.2, 0.8]; crossover probability = 0.2
GWO | 30 | 1000 | a ∈ [2, 0]
WOA | 30 | 1000 | a1 ∈ [2, 0]; a2 ∈ [−2, −1]; b = 1
MFO | 30 | 1000 | b = 1; t ∈ [−1, 1]; a ∈ [−1, −2]
GOA | 30 | 1000 | cMax = 1; cMin = 0.00004
SSA | 30 | 1000 | c1 ∈ [0, 1]; c2 ∈ [0, 1]
BA | 30 | 1000 | A = 0.5; r = 0.5
SCA | 30 | 1000 | A = 2
MVO | 30 | 1000 | Existence probability ∈ [0.2, 1]; travelling distance rate ∈ [0.6, 1]
ALO | 30 | 1000 | k = 500

3.4 Motivation

As described above, in a traditional GOA, the population concentrates towards an optimal individual position as the individual positions are constantly updated. The concentration of individuals in the population will, to some extent, lead to premature convergence when facing multi-modal problems. Moreover, the imbalance between exploitation and exploration in GOA also causes poor search ability of the algorithm. Therefore, there is reason to believe that it is crucial to design an optimizer that can ameliorate the loss of population diversity, the poor search ability and the insufficient ability to jump out of local optimality in the original GOA. The chaotic map and Levy flight strategy are introduced into the original SA: the uncertainty of the chaotic map makes it possible to explore other potential solutions in the space, ensuring the diversity of the population during the entire iteration process, while Levy flight enhances the search ability of the algorithm and improves the ability of the population to jump out of the local optimum. Based on this consideration, we design SGOA to improve the performance of the basic GOA when dealing with engineering problems and the parameter optimization of KELM.
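To make the framework of Fig. 3 concrete, the following sketch shows one way the pieces can be assembled: a standard GOA update followed by an SA-style acceptance of each new position, with the Levy perturbation applied inside the annealing step. It reuses the helper functions sketched in the previous sections and is an illustration of the workflow under an assumed geometric cooling schedule, not the authors' reference implementation.

```python
import numpy as np

def sgoa(obj, lb, ub, dim, n=30, iters=1000, T0=1.0, alpha=0.98):
    # Minimal SGOA-style loop (illustrative): GOA update + SA acceptance.
    X = lb + (ub - lb) * np.random.rand(n, dim)
    fit = np.array([obj(x) for x in X])
    best = X[fit.argmin()].copy()
    T = T0
    for t in range(1, iters + 1):
        c = c_coeff(t, iters)                          # Eq. (10)
        X_new = goa_step(X, best, lb, ub, c)           # Eq. (9)
        for i in range(n):
            cand = np.clip(levy_perturb(X_new[i]), lb, ub)   # Eq. (13)
            f_cand = obj(cand)
            if metropolis_accept(fit[i], f_cand, T):   # Eq. (11)
                X[i], fit[i] = cand, f_cand
            elif obj(X_new[i]) < fit[i]:
                X[i], fit[i] = X_new[i], obj(X_new[i])
        best = X[fit.argmin()].copy()
        T *= alpha                                     # cooling (assumed geometric)
    return best, fit.min()
```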

Table 1  Detail of the IEEE CEC2017 test functions (No., Function, Fi* = Fi(x*))
Unimodal function 1 Shifted and rotated bent cigar function 100
2 Shifted and rotated sum of different power function 200
3 Shifted and rotated zakharov function 300
Simple multimodal function 4 Shifted and rotated Rosenbrock’s function 400
5 Shifted and rotated Rastrigin’s function 500
6 Shifted and rotated expanded Scaffer’s F6 function 600
7 Shifted and rotated lunacek Bi_Rastrigin function 700
8 Shifted and rotated non-continuous Rastrigin’s function 800
9 Shifted and rotated Levy function 900
10 Shifted and rotated Schwefel’s function 1000
Hybrid function (HF) 11 HF 1 (N = 3) 1100
12 HF 2 (N = 3) 1200
13 HF 3 (N = 3) 1300
14 HF 4 (N = 4) 1400
15 HF 5 (N = 4) 1500
16 HF 6 (N = 4) 1600
17 HF 6 (N = 5) 1700
18 HF 6 (N = 5) 1800
19 HF 6 (N = 5) 1900
20 HF 6 (N = 6) 2000
Composition function (CF) 21 CF 1 (N = 3) 2100
22 CF 2 (N = 3) 2200
23 CF 3 (N = 4) 2300
24 CF 4 (N = 4) 2400
25 CF 5 (N = 5) 2500
26 CF 6 (N = 5) 2600
27 CF 7 (N = 6) 2700
28 CF 8 (N = 6) 2800
29 CF 9(N = 3) 2900
30 CF 10 (N = 3) 3000
Search range:[−100,100]D


4 Experimental results

In this chapter, SGOA is compared with well-known algorithms and with advanced algorithms, and it is then applied to engineering problems and to the parameter optimization of the KELM [63]. Firstly, SGOA is compared with other meta-heuristics [66] on the 30 IEEE CEC2017 [61] benchmark problems; the purpose is to analyze whether the proposed algorithm is well improved or not. Secondly, SGOA is compared with the advanced algorithms on the IEEE CEC2017 test set to further verify its effectiveness. Finally, the performance of SGOA is validated using engineering problems and the parameter optimization of KELM.

The average result (Avg.) and standard deviation (Std.) were used to evaluate the performance of SGOA, where the optimal result for each problem is shown in bold. The symbol "+/=/−" indicates whether the algorithm of this paper was superior to, equal to, or inferior to the other algorithms, respectively. In addition, the Friedman test [67] was used to assess the average performance of the algorithms in a statistical sense, and the average ranking value (ARV) was used to represent the results.

All experiments were carried out under the Windows Server 2012 R2 OS using MATLAB R2014a software, and the hardware platform was configured with an Intel(R) Xeon(R) Silver 4110 CPU (2.10 GHz) and 16 GB RAM.

4.1 Comparison with MAs

In this section, SGOA is compared with some classical and novel algorithms on the IEEE CEC2017 benchmark test set. The test set can be divided into four parts. The details of these functions are shown in Table 1.

To ensure the fairness of the competition, all algorithms were tested in the same environment, and the external parameters were set to the same values, as shown in Table 2. The MAs involved in the comparison are as follows:

• Differential Evolution Algorithm (DE) [68]
• GWO [33]
• WOA [35]
• Moth-Flame Optimization Algorithm (MFO) [69]
• GOA [36]
• Salp Swarm Algorithm (SSA) [70]
• Bat Algorithm (BA) [71]
• Sine Cosine Algorithm (SCA) [72]
• Multi-Verse Optimization Algorithm (MVO) [73]
• Ant Lion Optimization Algorithm (ALO) [74]

In comparison with the other MAs, including DE, GWO, WOA, MFO, GOA, SSA, BA, SCA, MVO and ALO, the SGOA testing results are shown in Table 3. The figures in Table 3 indicate the Avg. and Stdv. values of each algorithm after multiple runs on each function. Moreover, the population size was 30, the dimensionality of the variable-dimension functions was set to 30, and the number of iterations was set to 1000.

From the results in Table 3, it can be seen that the best value obtained by SGOA is better than that obtained by the other algorithms on almost all test functions, and it ranks first on more than half of them. Therefore, on this test set, SGOA finds the optimal solution more easily than the other MAs.

For the F3 function in Fig. 4, the optimal value obtained by SGOA is the closest of all to the global optimum. Compared with GOA, the search ability of the proposed algorithm on unimodal functions is greatly improved. Also, at the beginning of the F3 function, the value obtained is far smaller than those of the other algorithms. There is no local optimal solution for the F1–F3 functions, so these three functions are well suited to evaluating the basic performance of the MAs. The Levy flight operation in the simulated annealing mechanism helps to reach new search areas. To sum up, SGOA adds detailed processing on top of the original algorithm, so its performance is greatly improved. On the F4 function, all algorithms come close to the optimal solution; SGOA is even weaker than BA at the beginning, but as the iterations increase, SGOA finally surpasses all other algorithms and reaches the global optimal solution. For the F7 function, SGOA holds the best solution among all algorithms from the beginning to the end of the iterations. According to the figure, its result is also better than those of the other algorithms on the F10 problem.

In the hybrid functions, we take the most representative functions (F14 and F16) as examples. For these two functions, it can be observed that the curve corresponding to SGOA is far superior to the other MAs in finding the minimum value. Such a phenomenon can also be found in the composition functions (F26 and F29). Therefore, from the results of these functions, it is evident that SGOA can find the best value even on complex functions. The addition of the simulated annealing mechanism enables GOA to effectively jump out of the local area and expand the solution.
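As a small illustration of how the summary statistics in the following tables can be produced, the sketch below computes the mean, standard deviation, the "+/=/−" tallies against SGOA and a Friedman-style average ranking value (ARV) from a results array of shape (functions × algorithms × runs); the array shape and the reference-column convention are assumptions made for this illustration.

```python
import numpy as np
from scipy.stats import rankdata

def summarize(results, ref=0):
    # results: array of shape (n_funcs, n_algos, n_runs) with best objective values.
    avg = results.mean(axis=2)                  # Avg. per function and algorithm
    std = results.std(axis=2)                   # Stdv. per function and algorithm
    # Friedman-style average rank (ARV): rank algorithms on each function (lower is better).
    ranks = np.vstack([rankdata(row) for row in avg])
    arv = ranks.mean(axis=0)
    # "+/=/-" counts of the reference algorithm (column `ref`, e.g. SGOA) vs. the others.
    wins = (avg[:, ref][:, None] < avg).sum(axis=0)
    ties = np.isclose(avg[:, ref][:, None], avg).sum(axis=0)
    losses = (avg[:, ref][:, None] > avg).sum(axis=0)
    return avg, std, arv, wins, ties, losses
```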

Table 3  Comparison of SGOA and MAs on IEEE CEC2017 problem


F1 F2 F3
Avg Stdv Avg Stdv Avg Stdv

SGOA 1.0909E+05 7.3218E+04 6.6092E+10 3.6200E+11 3.0299E+02 1.1233E+00


DE 9.9217E+03 5.1297E+03 3.0208E+29 8.6638E+29 1.3183E+05 2.4015E+04
GWO 1.9491E+09 1.3758E+09 1.8386E+32 7.5874E+32 4.4971E+04 1.0322E+04
WOA 7.6612E+08 4.5195E+08 6.6990E+35 3.6671E+36 2.6085E+05 8.8282E+04
MFO 9.8470E+09 8.7011E+09 3.1395E+36 1.6396E+37 1.4567E+05 6.1480E+04
GOA 2.1054E+06 2.0731E+06 1.8518E+30 7.3839E+30 1.2143E+04 5.4996E+03
SSA 1.4456E+09 9.5863E+08 3.4833E+32 9.4337E+32 6.6863E+04 1.9861E+04
BA 1.8823E+07 2.4196E+06 6.3259E+06 1.3451E+07 2.0256E+03 1.2585E+03
SCA 1.7266E+10 2.4537E+09 3.1025E+36 7.8779E+36 6.2966E+04 1.4026E+04
MVO 3.6387E+05 1.1016E+05 4.2048E+12 9.1309E+12 5.7555E+02 1.9199E+02
ALO 2.8349E+03 3.5340E+03 1.1963E+15 4.7639E+15 1.0840E+05 3.9564E+04
F4 F5 F6
Avg Stdv Avg Stdv Avg Stdv

SGOA 4.7840E+02 1.2990E+01 6.1066E+02 3.0559E+01 6.0191E+02 1.7746E+00


DE 4.9954E+02 1.0124E+01 6.7200E+02 9.3080E+00 6.0000E+02 1.3242E−04
GWO 6.0121E+02 9.8582E+01 6.1794E+02 4.2326E+01 6.0925E+02 3.7050E+00
WOA 7.6928E+02 9.9817E+01 8.3304E+02 5.9629E+01 6.7371E+02 1.1533E+01
MFO 1.0464E+03 6.3484E+02 6.9754E+02 4.0354E+01 6.3623E+02 1.1774E+01
GOA 5.0875E+02 2.8501E+01 6.6537E+02 4.4716E+01 6.4452E+02 1.2953E+01
SSA 7.1988E+02 1.1507E+02 7.2661E+02 4.7878E+01 6.5898E+02 1.0147E+01
BA 5.0282E+02 2.1952E+01 8.5060E+02 5.1291E+01 6.7554E+02 1.1325E+01
SCA 2.1687E+03 6.0091E+02 8.0951E+02 2.4320E+01 6.5982E+02 7.0528E+00
MVO 4.9366E+02 1.2306E+01 6.1023E+02 3.2389E+01 6.2060E+02 1.0261E+01
ALO 5.0859E+02 2.7560E+01 6.5962E+02 4.4968E+01 6.4321E+02 7.0389E+00
F7 F8 F9
Avg Stdv Avg Stdv Avg Stdv

SGOA 7.8338E+02 1.7416E+01 8.9919E+02 1.9425E+01 1.4481E+03 1.4397E+03


DE 9.0147E+02 1.2654E+01 9.6694E+02 9.6876E+00 1.0438E+03 6.6648E+01
GWO 8.8381E+02 3.9598E+01 8.9275E+02 2.4610E+01 1.8949E+03 4.7825E+02
WOA 1.2635E+03 9.7676E+01 1.0389E+03 5.1200E+01 1.0381E+04 3.4449E+03
MFO 1.0947E+03 1.6031E+02 1.0082E+03 4.6796E+01 6.9387E+03 2.4471E+03
GOA 9.1439E+02 4.8456E+01 9.5777E+02 3.2105E+01 4.6087E+03 2.1770E+03
SSA 1.1509E+03 8.4487E+01 9.9382E+02 3.3715E+01 6.2417E+03 1.0301E+03
BA 1.6875E+03 2.0282E+02 1.0600E+03 6.5353E+01 1.4981E+04 4.9802E+03
SCA 1.1918E+03 4.7537E+01 1.0889E+03 2.5906E+01 7.0160E+03 2.2006E+03
MVO 8.5910E+02 4.4689E+01 9.0808E+02 2.9720E+01 4.1803E+03 2.6114E+03
ALO 1.0107E+03 8.6785E+01 9.3558E+02 3.4653E+01 4.1020E+03 1.6141E+03
F10 F11 F12
Avg Stdv Avg Stdv Avg Stdv

SGOA 4.3892E+03 4.5564E+02 1.2046E+03 3.5952E+01 1.8089E+06 9.3270E+05


DE 7.3040E+03 3.1636E+02 1.4196E+03 1.2132E+02 1.8162E+07 7.1432E+06
GWO 4.9258E+03 1.5412E+03 1.8041E+03 6.2651E+02 9.4107E+07 1.0112E+08
WOA 6.8194E+03 7.5771E+02 6.1401E+03 2.7933E+03 2.0819E+08 1.6561E+08
MFO 5.7588E+03 7.1356E+02 4.6815E+03 5.8366E+03 3.6059E+08 6.7565E+08
GOA 5.2751E+03 6.7523E+02 1.4436E+03 9.3542E+01 2.1488E+07 1.7990E+07


Table 3  (continued)
F10 F11 F12
Avg Stdv Avg Stdv Avg Stdv

SSA 5.9056E+03 9.4176E+02 1.8875E+03 3.2402E+02 1.8868E+08 1.9882E+08


BA 5.9527E+03 7.9907E+02 1.3515E+03 7.1931E+01 1.3057E+07 8.8147E+06
SCA 8.7064E+03 2.7051E+02 3.1119E+03 8.9605E+02 2.2176E+09 7.1150E+08
MVO 4.5595E+03 7.9896E+02 1.3346E+03 5.8508E+01 1.0947E+07 9.0849E+06
ALO 5.3454E+03 7.7260E+02 1.3145E+03 6.2251E+01 1.2519E+07 1.0617E+07
F13 F14 F15
Avg Stdv Avg Stdv Avg Stdv

SGOA 1.0328E+05 4.5182E+04 1.8505E+03 2.1415E+02 4.0634E+04 2.2098E+04


DE 1.1495E+06 9.4677E+05 1.5882E+05 1.2802E+05 2.0716E+05 1.3850E+05
GWO 2.0240E+07 5.5754E+07 3.6295E+05 5.4873E+05 4.0317E+05 8.2706E+05
WOA 7.7476E+05 4.6743E+05 1.3741E+06 1.5083E+06 5.5078E+05 9.2587E+05
MFO 9.0827E+07 3.0736E+08 2.2071E+05 3.3476E+05 5.5702E+04 5.2725E+04
GOA 1.5920E+05 1.0527E+05 5.3312E+04 4.9527E+04 9.6675E+04 9.8683E+04
SSA 1.1232E+05 8.3653E+04 4.8864E+05 4.4598E+05 5.0704E+04 2.4838E+04
BA 1.0124E+06 2.4312E+05 2.9080E+04 2.1369E+04 1.8504E+05 8.7922E+04
SCA 8.4833E+08 5.3106E+08 4.2234E+05 3.0933E+05 3.2687E+07 3.5636E+07
MVO 1.5685E+05 8.2138E+04 2.8304E+04 2.8403E+04 5.8881E+04 3.2535E+04
ALO 1.0202E+05 5.0193E+04 8.2807E+04 7.9163E+04 4.6588E+04 2.7599E+04
F16 F17 F18
Avg Stdv Avg Stdv Avg Stdv

SGOA 2.0773E+03 1.4591E+02 1.9739E+03 1.1315E+02 3.7190E+04 1.6669E+04


DE 2.6569E+03 1.7882E+02 2.0148E+03 7.6880E+01 1.8774E+06 7.9535E+05
GWO 2.5528E+03 3.5484E+02 2.0362E+03 1.6594E+02 9.1543E+05 9.9421E+05
WOA 3.9695E+03 5.4216E+02 2.6642E+03 3.3160E+02 7.9745E+06 9.8469E+06
MFO 3.2741E+03 3.2270E+02 2.4717E+03 2.7426E+02 2.5930E+06 4.4778E+06
GOA 2.8803E+03 2.7965E+02 2.1249E+03 1.7612E+02 6.6881E+05 7.6066E+05
SSA 3.2335E+03 3.4685E+02 2.4259E+03 2.4102E+02 3.0932E+06 3.9875E+06
BA 3.6547E+03 5.2084E+02 2.9345E+03 3.2153E+02 4.2234E+05 3.1187E+05
SCA 3.9210E+03 2.6760E+02 2.6817E+03 1.6586E+02 8.7261E+06 5.3615E+06
MVO 2.6666E+03 2.5900E+02 2.0840E+03 1.8181E+02 3.1776E+05 1.6227E+05
ALO 3.1409E+03 3.5665E+02 2.4570E+03 3.3723E+02 7.0092E+05 6.6152E+05
F19 F20 F21
Avg Stdv Avg Stdv Avg Stdv

SGOA 1.5554E+04 8.1547E+03 2.3085E+03 1.6035E+02 2.4148E+03 2.7654E+01


DE 1.8680E+05 1.4784E+05 2.3518E+03 1.0551E+02 2.4708E+03 9.7146E+00
GWO 1.9768E+06 5.6904E+06 2.4115E+03 1.3477E+02 2.4034E+03 3.9499E+01
WOA 1.3928E+07 1.2845E+07 2.8090E+03 2.0621E+02 2.6173E+03 5.3086E+01
MFO 7.7392E+06 2.2891E+07 2.6752E+03 2.1635E+02 2.4930E+03 5.2956E+01
GOA 3.7529E+06 2.6730E+06 2.6203E+03 2.1467E+02 2.4384E+03 2.7630E+01
SSA 8.5255E+06 8.7303E+06 2.6731E+03 2.1613E+02 2.4956E+03 4.2385E+01
BA 1.9470E+06 1.1317E+06 3.0227E+03 2.7270E+02 2.6216E+03 5.8388E+01
SCA 5.9137E+07 2.6317E+07 2.8336E+03 1.3412E+02 2.5880E+03 1.9941E+01
MVO 8.3889E+05 8.3421E+05 2.5262E+03 1.8599E+02 2.4191E+03 3.6255E+01
ALO 2.2592E+06 1.5462E+06 2.6489E+03 2.1696E+02 2.4359E+03 2.8381E+01


Table 3  (continued)
F22 F23 F24
Avg Stdv Avg Stdv Avg Stdv

SGOA 5.3328E+03 1.4869E+03 2.7524E+03 3.1377E+01 2.9768E+03 4.6694E+01


DE 5.0321E+03 1.9366E+03 2.8187E+03 1.1974E+01 3.0175E+03 1.2923E+01
GWO 5.2859E+03 2.1674E+03 2.7640E+03 4.5715E+01 2.9393E+03 5.8608E+01
WOA 7.5543E+03 2.0904E+03 3.1224E+03 9.9169E+01 3.1945E+03 6.8036E+01
MFO 6.1975E+03 1.6195E+03 2.8349E+03 3.6607E+01 2.9835E+03 2.5516E+01
GOA 5.0148E+03 2.0202E+03 2.8047E+03 6.1774E+01 2.9517E+03 4.0027E+01
SSA 4.2040E+03 2.2641E+03 2.8918E+03 5.6725E+01 3.0333E+03 6.2542E+01
BA 6.9922E+03 1.7641E+03 3.2936E+03 1.3976E+02 3.3356E+03 1.4130E+02
SCA 9.1291E+03 2.0575E+03 3.0413E+03 3.9160E+01 3.2232E+03 3.9388E+01
MVO 5.5680E+03 1.3996E+03 2.7519E+03 2.4392E+01 2.9133E+03 3.8163E+01
ALO 4.8713E+03 2.1974E+03 2.8294E+03 5.3537E+01 3.0000E+03 5.0163E+01
F25 F26 F27
Avg Stdv Avg Stdv Avg Stdv

SGOA 2.8881E+03 4.4208E+00 3.7315E+03 1.0694E+03 3.2000E+03 1.0914E−04


DE 2.8878E+03 3.3665E−01 5.3235E+03 1.1316E+02 3.2202E+03 3.3471E+00
GWO 2.9901E+03 4.2882E+01 4.7206E+03 4.9107E+02 3.2521E+03 2.1136E+01
WOA 3.0769E+03 4.5400E+01 8.0572E+03 1.3245E+03 3.4478E+03 1.2444E+02
MFO 3.2774E+03 3.7079E+02 5.7245E+03 5.2158E+02 3.2514E+03 2.4002E+01
GOA 2.9217E+03 2.6409E+01 4.9009E+03 8.6740E+02 3.2442E+03 2.7524E+01
SSA 3.1074E+03 8.2225E+01 5.9909E+03 1.2898E+03 3.4003E+03 1.0294E+02
BA 2.9076E+03 2.1979E+01 9.0574E+03 1.6314E+03 3.4606E+03 1.3705E+02
SCA 3.4148E+03 1.2966E+02 7.6230E+03 4.2891E+02 3.4910E+03 6.4992E+01
MVO 2.8900E+03 7.7805E+00 4.7867E+03 2.8272E+02 3.2214E+03 1.4676E+01
ALO 2.9330E+03 2.2401E+01 5.5023E+03 1.0504E+03 3.3802E+03 6.5984E+01
F28 F29 F30
Avg Stdv Avg Stdv Avg Stdv

SGOA 3.3000E+03 5.8248E−02 3.5051E+03 1.2350E+02 1.9577E+05 1.4415E+05


DE 3.2623E+03 1.9202E+01 3.9506E+03 1.2864E+02 1.5269E+05 9.1849E+04
GWO 3.4169E+03 5.3640E+01 3.8474E+03 2.1762E+02 8.0088E+06 7.4436E+06
WOA 3.4813E+03 8.0907E+01 5.3006E+03 4.9248E+02 3.3516E+07 2.9399E+07
MFO 4.3509E+03 9.8239E+02 4.2794E+03 2.6576E+02 4.3462E+05 6.1409E+05
GOA 3.2890E+03 4.4151E+01 4.1203E+03 2.4949E+02 8.9026E+06 6.4330E+06
SSA 3.5467E+03 1.9238E+02 4.7677E+03 3.4469E+02 2.5391E+07 2.2268E+07
BA 3.2353E+03 3.4671E+01 4.8959E+03 5.4016E+02 4.4409E+06 2.0990E+06
SCA 4.1625E+03 2.4197E+02 5.0211E+03 2.6901E+02 1.5983E+08 6.1573E+07
MVO 3.2362E+03 2.4579E+01 3.9528E+03 2.1312E+02 3.1011E+06 2.7001E+06
ALO 3.2689E+03 2.9664E+01 4.5572E+03 3.2584E+02 5.1035E+06 3.0344E+06
Overall rank
Rank  + / = /- ARV

SGOA 1  ~  1.833E+00


DE 4 21/3/6 4.400E+00
GWO 3 22/1/7 4.833E+00
WOA 10 30/0/0 9.500E+00
MFO 7 27/0/3 7.733E+00
GOA 5 27/0/3 5.067E+00


Table 3  (continued)
Overall rank
Rank  + / = /- ARV

SSA 8 27/1/2 7.400E+00


BA 9 28/2/0 7.367E+00
SCA 11 30/0/0 1.000E+01
MVO 2 21/2/7 3.067E+00
ALO 6 24/2/4 4.800E+00

In Fig. 4, the curve corresponding to SGOA converges to a good result within the first few iterations. Moreover, the quality (i.e., the minimum value) of the solution it achieves is much higher than that of the other algorithms, with a faster convergence speed in the early phase. One of the reasons for this performance is that, under the effect of SA, the algorithm can discard the current solution and find a better one. In the later stage of the iterations, most algorithms make little progress in searching for better solutions. At this point, it is much more difficult to find a better solution than in the early stage, but once the algorithm breaks through the constraint, the solution obtained is often better than that obtained by the other algorithms. In addition, judging from the decline rate of SGOA's convergence curve, it can find a very good solution in a short time.

Also, it is clear from the symbol "+/=/−" that the optimal function value of SGOA is superior to those of the other ten algorithms on most test functions. According to the Friedman assessment results in Table 3, SGOA has the lowest ARV value over the 30 benchmarks. Therefore, in this case, the proposed variant of GOA is the best performer among the competitors.

4.2 Comparison with advanced MAs

In this section, the CEC2017 test set is used to compare the advanced algorithms with SGOA. The external parameters (such as the number of iterations) and the running environment of all algorithms are set to the same values: the population size was set to 30, the dimensionality of the variable-dimension functions was 30, and the number of iterations was 1000.

To ensure the fairness of the competition, all algorithms were tested in the same environment, and the external parameters were set to the same values. The advanced algorithms involved in the comparison are as follows:

• Improved variant of GOA (IGOA) [51]
• Global and local real-coded genetic algorithms (GL25) [75]
• Comprehensive learning particle swarm optimizer (CLPSO) [76]
• Particle swarm optimization with an aging leader and challengers (ALCPSO) [77]
• Chaotic local search and Gaussian mutation-enhanced MFO (CLSGMFO) [78]
• Lévy mutation, Gaussian mutation and Cauchy mutation MFO (LGCMFO) [79]
• Bat algorithm with random wind power (RCBA) [80]
• Chaotic bat algorithm (CBA) [81]
• Lévy flight trajectory-based whale optimization algorithm (LWOA) [82]
• An enhanced associative learning-based exploratory whale optimizer (BMWOA) [83]

The specific comparison results of SGOA and the advanced algorithms are presented in Table 4. The table shows the average value and Stdv of the best solution of each algorithm over 30 independent runs, with 1000 iterations in each run. According to the table, SGOA achieves good results on most test functions and ranks first on 18 of the 30 CEC2017 functions. Therefore, SGOA outperforms the other algorithms in finding a better solution on most benchmark functions.

In the unimodal functions, taking F2 and F3 as representatives, it is clear that SGOA can achieve better solutions than the other algorithms. For F2, the whole process of SGOA shows a powerful search ability. For F3, the solution obtained by SGOA at the beginning is better than those of GL25, CLPSO and several other algorithms. Although the curves coincide with IGOA in the later stage, according to the detailed diagram, SGOA obtains the better performance. In conclusion, the superiority of SGOA in dealing with unimodal functions is demonstrated.

In the simple multimodal functions, from F4 to F10, all algorithms come very close to the optimal solution, while on F4, F6, F7, F8, F9 and F10, SGOA achieves the best result of all algorithms. Therefore, SGOA makes full use of the SA mechanism on multimodal functions, which pushes the process of obtaining the optimal solution further.

In the hybrid functions, taking the test functions F14 and F18 in Fig. 5 as examples, the curve corresponding to SGOA achieves satisfactory results in both convergence speed and quality.

Fig. 4  Convergence curves of SGOA compared with MAs on CEC 2017 (first column: F3, F7, F14, F26; second column: F4, F10, F16, F29)

Table 4  Comparison of SGOA and advanced algorithm on IEEE CEC2017 problem


F1 F2 F3
Avg Stdv Avg Stdv Avg Stdv

SGOA 1.0886E+05 7.5860E+04 2.2550E+02 2.2454E+01 3.0321E+02 1.0530E+00


IGOA 1.9336E+06 1.1750E+06 1.6899E+10 5.1091E+10 3.2155E+02 8.9989E+00
GL25 6.4036E+09 2.1722E+09 1.5889E+35 4.3918E+35 1.6206E+05 3.0944E+04
CLPSO 1.1726E+09 2.9448E+08 8.7422E+30 1.9209E+31 1.2144E+05 2.1993E+04
ALCPSO 7.6730E+03 6.8896E+03 1.0645E+18 5.1568E+18 6.1099E+04 1.2016E+04
CLSGMFO 7.1644E+03 6.4320E+03 7.4000E+17 3.4800E+18 1.2769E+04 4.7203E+03
LGCMFO 6.6377E+03 7.0669E+03 2.4551E+17 1.2026E+18 1.4083E+04 4.5703E+03
RCBA 2.1802E+05 6.5653E+04 4.9983E+06 1.8491E+07 2.0075E+04 1.9060E+04
CBA 1.1078E+04 3.8105E+04 1.8716E+11 6.4791E+11 2.6687E+04 1.7405E+04
LWOA 2.0596E+06 6.1086E+05 9.2725E+11 1.7237E+12 5.5764E+04 3.1946E+04
BMWOA 3.3558E+07 1.7084E+07 1.2240E+24 5.7975E+24 5.0275E+04 8.1490E+03
F4 F5 F6
Avg Stdv Avg Stdv Avg Stdv

SGOA 4.7197E+02 2.2602E+01 6.1265E+02 2.9647E+01 6.0229E+02 2.4922E+00


IGOA 4.9515E+02 1.3564E+01 6.1912E+02 3.2708E+01 6.2154E+02 1.0467E+01
GL25 1.6582E+03 5.5487E+02 8.0828E+02 2.4817E+01 6.4758E+02 8.0398E+00
CLPSO 8.4887E+02 7.3653E+01 7.1705E+02 1.8759E+01 6.2258E+02 3.1644E+00
ALCPSO 5.0653E+02 3.5541E+01 6.0778E+02 3.6475E+01 6.0603E+02 6.1665E+00
CLSGMFO 4.9724E+02 2.5409E+01 6.4087E+02 2.7249E+01 6.1975E+02 1.4393E+01
LGCMFO 4.9107E+02 2.8327E+01 6.3473E+02 2.7027E+01 6.0906E+02 9.8198E+00
RCBA 4.9091E+02 1.8399E+01 7.9422E+02 4.4376E+01 6.6679E+02 1.0764E+01
CBA 5.1135E+02 2.1593E+01 7.8340E+02 6.6066E+01 6.7528E+02 1.2416E+01
LWOA 5.1490E+02 2.6105E+01 7.9081E+02 6.8386E+01 6.6024E+02 1.0136E+01
BMWOA 5.4705E+02 3.1166E+01 7.7420E+02 4.3651E+01 6.5914E+02 8.2103E+00
F7 F8 F9
Avg Stdv Avg Stdv Avg Stdv

SGOA 7.8436E+02 1.6124E+01 8.9853E+02 1.5249E+01 1.0948E+03 8.7579E+02


IGOA 8.6707E+02 3.7065E+01 9.2370E+02 3.3153E+01 7.0332E+03 3.0557E+03
GL25 1.1636E+03 3.3282E+01 1.0879E+03 3.8440E+01 6.4440E+03 1.5138E+03
CLPSO 1.0153E+03 2.0736E+01 1.0157E+03 1.9099E+01 6.0520E+03 1.0690E+03
ALCPSO 8.5797E+02 4.0822E+01 9.0941E+02 3.5416E+01 1.7930E+03 8.8197E+02
CLSGMFO 8.8042E+02 4.7233E+01 9.2206E+02 2.7292E+01 3.3751E+03 1.2584E+03
LGCMFO 8.8818E+02 3.8440E+01 9.1653E+02 2.7481E+01 2.4703E+03 8.5621E+02
RCBA 1.7679E+03 2.6086E+02 1.0235E+03 4.3900E+01 7.8081E+03 2.4001E+03
CBA 1.7394E+03 3.1387E+02 1.0230E+03 4.9410E+01 8.4256E+03 3.0016E+03
LWOA 1.1437E+03 7.1383E+01 1.0109E+03 5.0848E+01 8.0108E+03 2.5668E+03
F10 F11 F12
Avg Stdv Avg Stdv Avg Stdv

SGOA 4.1159E+03 5.5250E+02 1.2109E+03 4.9043E+01 2.2525E+06 9.1601E+05


IGOA 4.5146E+03 7.3305E+02 1.3245E+03 8.3265E+01 7.2387E+06 4.7167E+06
GL25 9.4858E+03 4.7379E+02 9.7614E+03 3.3853E+03 3.5162E+08 2.7622E+08
CLPSO 7.1099E+03 3.3180E+02 2.5146E+03 5.6226E+02 1.5930E+08 7.2538E+07
ALCPSO 4.3598E+03 5.7326E+02 1.2452E+03 5.4170E+01 6.0473E+05 7.0085E+05
CLSGMFO 5.1066E+03 6.0080E+02 1.2615E+03 7.5518E+01 9.8337E+05 7.5788E+05
LGCMFO 4.8476E+03 6.4854E+02 1.2393E+03 4.5374E+01 1.1666E+06 1.0082E+06


Table 4  (continued)
F10 F11 F12
Avg Stdv Avg Stdv Avg Stdv

RCBA 5.7425E+03 6.5462E+02 1.2765E+03 5.0244E+01 6.8210E+06 4.6884E+06


CBA 5.9109E+03 6.6087E+02 1.3314E+03 7.2761E+01 1.6513E+07 1.0923E+07
LWOA 5.5541E+03 5.0546E+02 1.2841E+03 4.8565E+01 1.0424E+07 6.2725E+06
BMWOA 6.0034E+03 6.5255E+02 1.4276E+03 1.1577E+02 3.2350E+07 1.9343E+07
F13 F14 F15
Avg Stdv Avg Stdv Avg Stdv

SGOA 7.8745E+04 3.8010E+04 1.8653E+03 2.1534E+02 3.4395E+04 2.0294E+04


IGOA 2.0128E+05 1.0913E+05 4.1558E+04 3.3200E+04 1.3014E+05 8.8577E+04
GL25 3.5337E+07 3.5589E+07 1.1037E+06 8.0297E+05 4.1534E+05 9.2844E+05
CLPSO 7.8263E+07 4.5011E+07 2.4186E+05 1.8266E+05 2.8033E+06 2.5247E+06
ALCPSO 2.1187E+04 2.0800E+04 4.5066E+04 6.5849E+04 1.6828E+04 1.3511E+04
CLSGMFO 5.6457E+04 7.8994E+04 4.8712E+04 5.7661E+04 7.3654E+03 6.2909E+03
LGCMFO 3.5382E+04 2.6749E+04 3.4027E+04 3.0659E+04 7.3463E+03 9.1445E+03
RCBA 1.9223E+05 1.3680E+05 2.7955E+04 2.3847E+04 6.3793E+04 5.3505E+04
CBA 1.7146E+05 2.4881E+05 3.3604E+04 3.0076E+04 8.0206E+04 8.6373E+04
LWOA 2.2137E+05 1.2511E+05 4.2526E+04 3.4301E+04 8.6225E+04 5.5143E+04
BMWOA 1.3874E+05 8.0197E+04 4.2255E+05 3.7646E+05 3.7519E+04 2.9068E+04
F16 F17 F18
Avg Stdv Avg Stdv Avg Stdv

SGOA 2.1850E+03 1.1723E+02 1.9549E+03 8.6665E+01 3.6851E+04 1.4430E+04


IGOA 2.6549E+03 3.1771E+02 2.2161E+03 1.8858E+02 5.0763E+05 3.4086E+05
GL25 4.2399E+03 2.2295E+02 2.7778E+03 1.8009E+02 1.5917E+07 1.3887E+07
CLPSO 3.1696E+03 1.7776E+02 2.2878E+03 1.2307E+02 1.4549E+06 6.3556E+05
ALCPSO 2.6756E+03 2.9777E+02 2.2394E+03 2.2470E+02 7.1774E+05 1.1014E+06
CLSGMFO 2.7346E+03 3.0295E+02 2.2180E+03 1.8618E+02 2.5932E+05 3.3953E+05
LGCMFO 2.7821E+03 2.7805E+02 2.1858E+03 2.0384E+02 2.4095E+05 1.5698E+05
RCBA 3.4597E+03 4.4583E+02 2.7342E+03 3.3084E+02 2.9864E+05 1.9149E+05
CBA 3.6870E+03 4.6745E+02 2.9741E+03 3.4044E+02 7.8308E+05 6.1251E+05
LWOA 3.2254E+03 3.8387E+02 2.4943E+03 2.8563E+02 1.0876E+06 9.0789E+05
BMWOA 3.3318E+03 3.8876E+02 2.3973E+03 2.2775E+02 2.0598E+06 2.2572E+06
F19 F20 F21
Avg Stdv Avg Stdv Avg Stdv

SGOA 2.4128E+04 1.5461E+04 2.3829E+03 1.6346E+02 2.4387E+03 3.1091E+01


IGOA 1.6019E+05 1.2360E+05 2.5535E+03 1.9676E+02 2.4146E+03 3.2871E+01
GL25 1.1483E+06 1.9570E+06 3.1010E+03 2.7080E+02 2.5963E+03 1.3712E+01
CLPSO 4.5905E+06 3.9427E+06 2.5415E+03 1.1805E+02 2.5098E+03 3.9839E+01
ALCPSO 1.5995E+04 1.8305E+04 2.4555E+03 1.8094E+02 2.4058E+03 2.4025E+01
CLSGMFO 7.8875E+03 9.7063E+03 2.4879E+03 1.6892E+02 2.4137E+03 3.8055E+01
LGCMFO 6.3893E+03 4.0522E+03 2.3909E+03 1.7331E+02 2.4036E+03 1.8523E+01
RCBA 2.0541E+05 1.0869E+05 2.9946E+03 2.5450E+02 2.6242E+03 7.4872E+01
CBA 1.7846E+06 1.2613E+06 3.0461E+03 2.9142E+02 2.6280E+03 7.3054E+01
LWOA 4.3049E+05 2.1401E+05 2.8343E+03 2.1230E+02 2.5744E+03 5.7393E+01
BMWOA 2.8980E+05 2.9148E+05 2.7105E+03 1.9962E+02 2.5230E+03 5.9434E+01


Table 4  (continued)
F22 F23 F24
Avg Stdv Avg Stdv Avg Stdv

SGOA 5.5527E+03 1.1967E+03 2.7660E+03 3.7475E+01 2.9988E+03 4.5097E+01


IGOA 4.8142E+03 1.7980E+03 2.7681E+03 3.4341E+01 2.9344E+03 3.1409E+01
GL25 3.2462E+03 1.1536E+03 3.0113E+03 4.3357E+01 3.1659E+03 3.5812E+01
CLPSO 4.3162E+03 1.5588E+03 2.9107E+03 1.9722E+01 3.0865E+03 2.4430E+01
ALCPSO 4.2461E+03 2.0273E+03 2.8171E+03 6.8624E+01 2.9822E+03 7.5844E+01
CLSGMFO 2.3010E+03 1.7649E+00 2.7969E+03 5.4860E+01 2.9567E+03 4.6308E+01
LGCMFO 2.3007E+03 1.3471E+00 2.7799E+03 2.6299E+01 2.9460E+03 3.4020E+01
RCBA 7.0353E+03 1.2367E+03 3.3576E+03 1.4322E+02 3.5628E+03 1.5565E+02
CBA 7.2500E+03 1.3445E+03 3.3436E+03 1.4322E+02 3.3723E+03 1.5063E+02
LWOA 6.5751E+03 1.5271E+03 3.0190E+03 1.1692E+02 3.2084E+03 1.0149E+02
BMWOA 4.9511E+03 2.6762E+03 2.9732E+03 6.8543E+01 3.1101E+03 7.4460E+01
F25 F26 F27
Avg Stdv Avg Stdv Avg Stdv

SGOA 2.8897E+03 4.7140E+00 3.5913E+03 9.2651E+02 3.2000E+03 1.5319E−04


IGOA 2.8893E+03 8.6953E+00 4.9098E+03 5.9458E+02 3.2195E+03 1.6983E+01
GL25 3.2791E+03 1.1495E+02 7.1433E+03 1.0453E+03 3.4886E+03 5.5477E+01
CLPSO 3.0666E+03 3.7060E+01 6.1742E+03 4.2177E+02 3.3273E+03 2.6420E+01
ALCPSO 2.8950E+03 1.3224E+01 5.1322E+03 7.0158E+02 3.2457E+03 2.3706E+01
CLSGMFO 2.8959E+03 1.6789E+01 4.0841E+03 1.1542E+03 3.2979E+03 5.4881E+01
LGCMFO 2.8948E+03 1.5920E+01 4.1086E+03 1.2985E+03 3.2910E+03 4.0571E+01
RCBA 2.9033E+03 1.9173E+01 9.0901E+03 2.3321E+03 3.5419E+03 2.4291E+02
CBA 2.9035E+03 2.2417E+01 9.4179E+03 2.3114E+03 3.3964E+03 1.1877E+02
LWOA 2.9032E+03 1.7357E+01 6.5173E+03 1.3539E+03 3.3020E+03 6.8783E+01
BMWOA 2.9673E+03 3.6625E+01 6.5437E+03 9.2823E+02 3.3193E+03 6.5321E+01
F28 F29 F30
Avg Stdv Avg Stdv Avg Stdv

SGOA 3.3000E+03 4.7854E−02 3.5510E+03 1.4408E+02 1.8081E+05 1.6499E+05


IGOA 3.2385E+03 2.1175E+01 3.7613E+03 1.3874E+02 7.1721E+05 3.4101E+05
GL25 4.0303E+03 3.4757E+02 5.1626E+03 3.0750E+02 1.3200E+07 7.5106E+06
CLPSO 3.7533E+03 1.0798E+02 4.2933E+03 1.6779E+02 6.4261E+06 5.3927E+06
ALCPSO 3.2422E+03 3.6841E+01 3.8582E+03 2.1809E+02 1.5162E+04 5.1469E+03
CLSGMFO 3.2182E+03 1.7723E+01 3.9222E+03 2.5586E+02 6.9507E+04 1.4473E+05
LGCMFO 3.2411E+03 2.8303E+01 3.7660E+03 1.9587E+02 2.4719E+04 2.3653E+04
RCBA 3.2104E+03 3.3990E+01 5.0239E+03 4.6390E+02 2.0176E+06 1.1941E+06
CBA 3.2320E+03 2.3505E+01 5.4464E+03 5.6091E+02 5.1633E+06 2.8902E+06
LWOA 3.2521E+03 2.2175E+01 4.3144E+03 3.0547E+02 1.9227E+06 9.6660E+05
BMWOA 3.3406E+03 3.4280E+01 4.6788E+03 3.1856E+02 4.4185E+06 2.5737E+06
Overall rank
Rank +/=/− ARV

SGOA 1 ~ 2.433E+00
IGOA 5 22/4/4 4.333E+00
GL25 11 29/1/0 9.800E+00
CLPSO 9 29/1/0 8.067E+00
ALCPSO 3 15/8/7 3.667E+00


Table 4  (continued)
Overall rank
Rank +/=/− ARV
CLSGMFO 4 17/10/3 3.767E+00
LGCMFO 2 16/10/4 2.900E+00
RCBA 6 29/1/0 7.367E+00
CBA 10 27/2/1 8.367E+00
LWOA 7 29/1/0 7.533E+00
BMWOA 8 28/0/2 7.767E+00

Moreover, SGOA ranks first on F11, F14, F16, F17, F18 and F20.

In the last ten composition functions, SGOA's advantages are not particularly obvious. However, it can still occupy first rank on F23, F26, F27 and F29. At the same time, LGCMFO and ALCPSO are the most competitive algorithms against SGOA. Moreover, SGOA converges well in the early stage and slowly in the later stage. This is because the algorithm is more powerful in exploring the unexplored solution space in the early stage, where it can obtain the optimal solution more easily. This is related to the added SA mechanism, because within the annealing schedule the probability of discarding the current solution and finding other solutions becomes smaller and smaller as the iterations proceed.

According to the symbol "+/=/−", the proposed SGOA is superior to IGOA, GL25, CLPSO, ALCPSO, CLSGMFO, LGCMFO, RCBA, CBA, LWOA and BMWOA on 22, 29, 29, 15, 17, 16, 29, 27, 29 and 28 out of 30 cases, equal to them on 4, 1, 1, 8, 10, 10, 1, 2, 1 and 0 out of 30 problems, and inferior to them on 4, 0, 0, 7, 3, 4, 0, 1, 0 and 2 out of 30 problems, respectively.

To sum up, SGOA is superior to the listed comparison algorithms over all functions taken together. In addition, from the last part of the table, the average ranking value of SGOA is only 2.43, which is about 0.5 lower than that of LGCMFO in second place.

4.3 Engineering problems

In this part, SGOA was applied to three kinds of constrained mathematical models: the welded beam design (WBD), pressure vessel design (PVD) and speed reducer design (SRD) problems. In many mathematical models, there are often many constraints to satisfy, so it is imperative to find a way to deal with these constraints. Some penalty functions are introduced in [84], such as co-evolutionary and death penalty functions. Among them, the death penalty is a straightforward mechanism for handling a variety of related models: the MAs automatically eliminate any solution that does not satisfy the constraints. Therefore, SGOA, equipped with the death penalty, was adopted to solve three critical problems in mathematical modeling.

4.3.1 Welded beam design problem

The WBD is a kind of composite beam. The commonly used welded steel beam consists of an I-section and a box section composed of upper and lower flange plates and webs. The latter is more expensive in materials and more complicated in the fabrication process, but it has larger bending and torsional rigidity, which is appropriate for situations with high lateral load and torsional requirements or a limited beam height. The WBD is transformed into a mathematical model. Its goal is to minimize the cost of the welded beam subject to constraints on four quantities: shear stress ($\tau$), bending stress ($\theta$), buckling load ($P_c$) and deflection ($\delta$). The problem involves the following four variables: welding seam thickness ($h$), welding joint length ($l$), beam width ($t$) and beam thickness ($b$). The mathematical model is as follows:

$$\text{Consider } \vec{x} = \left[x_1, x_2, x_3, x_4\right] = \left[h, l, t, b\right]$$
$$\text{Minimize } f(\vec{x}) = 1.10471\, x_1^2 x_2 + 0.04811\, x_3 x_4 \left(14.0 + x_2\right)$$
Subject to
$$g_1(\vec{x}) = \tau(\vec{x}) - \tau_{\max} \le 0$$
$$g_2(\vec{x}) = \sigma(\vec{x}) - \sigma_{\max} \le 0$$
$$g_3(\vec{x}) = \delta(\vec{x}) - \delta_{\max} \le 0$$
$$g_4(\vec{x}) = x_1 - x_4 \le 0$$
$$g_5(\vec{x}) = P - P_C(\vec{x}) \le 0$$
$$g_6(\vec{x}) = 0.125 - x_1 \le 0$$
$$g_7(\vec{x}) = 1.10471\, x_1^2 x_2 + 0.04811\, x_3 x_4 \left(14.0 + x_2\right) - 5.0 \le 0$$
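The death-penalty scheme mentioned above can be expressed very simply: any candidate that violates a constraint is assigned a prohibitively bad fitness so that the optimizer discards it automatically. The wrapper below is a generic sketch (the example objective, constraint list and penalty value are illustrative), not the exact routine used by the authors.

```python
def death_penalty(obj, constraints, big=1e20):
    # Wrap an objective so that any solution with g_i(x) > 0 is effectively eliminated.
    def penalized(x):
        if any(g(x) > 0 for g in constraints):
            return big
        return obj(x)
    return penalized

# Illustrative usage: minimize f subject to g1(x) <= 0 and g2(x) <= 0.
f  = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
g1 = lambda x: x[0] + x[1] - 3.0
g2 = lambda x: -x[0]
fitness = death_penalty(f, [g1, g2])
```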

Fig. 5  Convergence curves of SGOA compared with advanced algorithms on CEC 2017 (first column: F2, F9, F14, F26; second column: F3, F10, F18, F29)

Table 5  The comparison results of the WBD problem
Algorithm  h  l  t  b  Best cost

SGOA 0.202382 3.314602 9.036272 0.205754 1.698741


BA 2 0.1 3.174303 2 1.818138
RO 0.203687 3.528467 9.004233 0.207241 1.735344
HS 0.2442 6.2231 8.2915 0.2433 2.3807
IHS 0.20573 3.47049 9.03662 0.20573 1.7248
Random method [88] 0.45750 4.73130 5.08530 0.66000 4.11850
Simple method [88] 0.27920 5.62560 7.75120 0.27960 2.53070
David method [88] 0.24340 6.25520 8.29150 0.24440 2.38410

Variable range: $0.1 \le x_1 \le 2$, $0.1 \le x_2 \le 10$, $0.1 \le x_3 \le 10$, $0.1 \le x_4 \le 2$,

where
$$\tau(\vec{x}) = \sqrt{(\tau')^2 + 2\tau'\tau''\frac{x_2}{2R} + (\tau'')^2}, \quad \tau' = \frac{P}{\sqrt{2}\, x_1 x_2}, \quad \tau'' = \frac{MR}{J}, \quad M = P\left(L + \frac{x_2}{2}\right),$$
$$R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2}, \quad J = 2\left\{\sqrt{2}\, x_1 x_2\left[\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2\right]\right\},$$
$$\sigma(\vec{x}) = \frac{6PL}{x_4 x_3^2}, \quad \delta(\vec{x}) = \frac{6PL^3}{E x_4 x_3^2}, \quad P_C(\vec{x}) = \frac{4.013\,E\sqrt{x_3^2 x_4^6/36}}{L^2}\left(1 - \frac{x_3}{2L}\sqrt{\frac{E}{4G}}\right),$$
$$P = 6000\ \text{lb}, \quad L = 14\ \text{in.}, \quad \delta_{\max} = 0.25\ \text{in.}, \quad E = 30 \times 10^6\ \text{psi}, \quad G = 12 \times 10^6\ \text{psi},$$
$$\tau_{\max} = 13600\ \text{psi}, \quad \sigma_{\max} = 30000\ \text{psi}.$$
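Putting the expressions above together, the sketch below evaluates the WBD cost and constraints so that they can be fed to a constrained optimizer (for example, through the death-penalty wrapper). It follows the formulas as reconstructed here, so any ambiguity in the extracted deflection term carries over; it is meant only as an illustration.

```python
import numpy as np

P, L, E, G = 6000.0, 14.0, 30e6, 12e6
TAU_MAX, SIGMA_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def wbd_cost(x):
    h, l, t, b = x
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

def wbd_constraints(x):
    # Returns the list [g1, ..., g7]; every entry must be <= 0 for feasibility.
    h, l, t, b = x
    tau_p = P / (np.sqrt(2.0) * h * l)
    M = P * (L + l / 2.0)
    R = np.sqrt(l**2 / 4.0 + ((h + t) / 2.0) ** 2)
    J = 2.0 * (np.sqrt(2.0) * h * l * (l**2 / 4.0 + ((h + t) / 2.0) ** 2))
    tau_pp = M * R / J
    tau = np.sqrt(tau_p**2 + 2.0 * tau_p * tau_pp * l / (2.0 * R) + tau_pp**2)
    sigma = 6.0 * P * L / (b * t**2)
    delta = 6.0 * P * L**3 / (E * b * t**2)        # as reconstructed from the text
    pc = 4.013 * E * np.sqrt(t**2 * b**6 / 36.0) / L**2 * (1.0 - t / (2.0 * L) * np.sqrt(E / (4.0 * G)))
    return [tau - TAU_MAX, sigma - SIGMA_MAX, delta - DELTA_MAX,
            h - b, P - pc, 0.125 - h, wbd_cost(x) - 5.0]
```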

This optimization problem has also been well solved by other algorithms. Kaveh et al. [85] adopted RO to solve this problem. Lee et al. [86] proposed HS and applied it to the WBD problem; the final optimal result is 2.3807. The improved HS (IHS) [87] was also tested.

The value obtained by SGOA in Table 5 is the best of all the values. Specifically, compared with some methods, the cost has been reduced several times over. This shows that when the parameters are set to h = 0.2024, l = 3.3146, t = 9.0363 and b = 0.2058, the manufacturing cost of the WBD can reach 1.698741. Therefore, the improved SGOA has good potential in solving the WBD problem.

Table 6  Comparison results for the PVD problem
Algorithm  Ts  Th  r  l  Best cost

SGOA 0.816637 0.403014 42.209181 175.779778 5977.495118


BA 97.8015 98.10897 10.98606 200 7258.564
IHS 1.125000 0.625000 58.29015 43.69268 7197.7300
PSO 0.812500 0.437500 42.091266 176.746500 6061.0777
GA 0.937500 0.500000 48.329000 112.679000 6410.3811
ES 0.812500 0.437500 42.098087 176.640518 6059.7456
Lagrangian multiplier [91] 1.125000 0.625000 58.291000 43.690000 7198.0428
Branch-and-bound 1.125000 0.625000 47.700000 117.71000 8129.1036


Table 7  Comparison results for the SRD problem

Algorithm  x1        x2        x3         x4        x5        x6        x7        Best cost
SGOA       6.866813  0.605828  12.653421  7.941985  9.736686  3.628515  5.337909  2708.331033
WCA        3.500000  0.700000  17.000000  7.300000  7.715319  3.350214  5.286654  2994.471066
DEDS       3.5E+09   0.7E+09   17.000000  7.3E+09   7.715319  3.350214  5.286654  2994.471066
DELC       3.5E+09   0.7E+09   17.000000  7.3E+09   7.715319  3.350214  5.286654  2994.471066
HEAA       3.500022  0.700000  17.000012  7.300427  7.715377  3.350230  5.286663  2994.499107
MDE        3.500010  0.700000  17.000000  7.300156  7.800027  3.350221  5.286685  2996.356689
PSO-DE     3.500000  0.700000  17.000000  7.300000  7.800000  3.350214  5.286683  2996.348167

Because of the typicality of vessel pressure measurement, this problem is often used as an application of structural design. Vessel pressure is closely related to materials, structures, and welding [89]. In the PVD problem, the variables that need to be optimized are the thickness of the shell (Ts), the thickness of the head (Th), the inner radius (r) and the length of the cylindrical section without the head (l), respectively. The constraints are as follows:

Consider \vec{x} = [x_1, x_2, x_3, x_4] = [T_s, T_h, R, L]

Objective: min f(\vec{x}) = 0.6224 x_1 x_3 x_4 + 1.7781 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3

Subject to
g_1(\vec{x}) = -x_1 + 0.0193 x_3 \le 0
g_2(\vec{x}) = -x_2 + 0.00954 x_3 \le 0
g_3(\vec{x}) = -\pi x_4 x_3^2 - \frac{4}{3}\pi x_3^3 + 1296000 \le 0
g_4(\vec{x}) = x_4 - 240 \le 0

Variable ranges: 0 \le x_1 \le 99, 0 \le x_2 \le 99, 10 \le x_3 \le 200, 10 \le x_4 \le 200
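A similar helper, again only an illustrative sketch with our own naming, can be used to evaluate the PVD objective and to check whether a design such as the SGOA solution in Table 6 satisfies the four constraints.

```python
import math

def pvd_cost(x):
    """Objective of the PVD problem for x = (Ts, Th, R, L)."""
    x1, x2, x3, x4 = x
    return (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
            + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)

def pvd_constraints(x):
    """Constraint values g1..g4; all must be <= 0 for a feasible design."""
    x1, x2, x3, x4 = x
    return [
        -x1 + 0.0193 * x3,
        -x2 + 0.00954 * x3,
        -math.pi * x4 * x3**2 - (4.0 / 3.0) * math.pi * x3**3 + 1296000.0,
        x4 - 240.0,
    ]

# Check the SGOA design reported in Table 6.
x_best = (0.816637, 0.403014, 42.209181, 175.779778)
print(pvd_cost(x_best))                               # manufacturing cost
print(all(g <= 0 for g in pvd_constraints(x_best)))   # True when all constraints hold
```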
The PVD problem has been studied by many scholars. He et al. [90] proposed using the PSO algorithm to settle this problem, and the minimum value reached 6061.0777. Deb [91] used GA to tackle it and obtained a minimum cost of 6410.3811. Beyond that, evolution strategies (ES) [92] and IHS [87] were also used to deal with this work.

Table 6 reflects the comparison results of SGOA and some algorithms on PVD. From the table, the result of SGOA is the best among all algorithms, and the cost is the smallest. According to the results of SGOA, we can see that when Ts = 0.8166, Th = 0.4030, r = 42.2092, l = 175.7798, we can get the best value 5977.495118. In summary, SGOA has a strong ability to deal with PVD problems.

4.3.3 Speed reducer design problem

The SRD problem is designed to minimize the weight of the reducer under relevant constraints [92]. The variables x1 to x7 represent the face width (b), module of teeth (m), number of teeth in the pinion (z), length of the first shaft between bearings (l1), length of the second shaft between bearings (l2), and the diameters of the first (d1) and second (d2) shafts, respectively. The mathematical model is shown as follows:


Objective: f(x) = 0.7854 x_1 x_2^2 (3.3333 x_3^2 + 14.9334 x_3 - 43.0934) - 1.508 x_1 (x_6^2 + x_7^2) + 7.4777 (x_6^3 + x_7^3) + 0.7854 (x_4 x_6^2 + x_5 x_7^2)

Subject to
g_1(x) = \frac{27}{x_1 x_2^2 x_3} - 1 \le 0
g_2(x) = \frac{397.5}{x_1 x_2^2 x_3^2} - 1 \le 0
g_3(x) = \frac{1.93 x_4^3}{x_2 x_6^4 x_3} - 1 \le 0
g_4(x) = \frac{1.93 x_5^3}{x_2 x_7^4 x_3} - 1 \le 0
g_5(x) = \frac{\left[(745 x_4 / (x_2 x_3))^2 + 16.9 \times 10^6\right]^{1/2}}{110 x_6^3} - 1 \le 0
g_6(x) = \frac{\left[(745 x_5 / (x_2 x_3))^2 + 157.5 \times 10^6\right]^{1/2}}{85 x_7^3} - 1 \le 0
g_7(x) = \frac{x_2 x_3}{40} - 1 \le 0
g_8(x) = \frac{5 x_2}{x_1} - 1 \le 0
g_9(x) = \frac{x_1}{12 x_2} - 1 \le 0
g_{10}(x) = \frac{1.5 x_6 + 1.9}{x_4} - 1 \le 0
g_{11}(x) = \frac{1.1 x_7 + 1.9}{x_5} - 1 \le 0

Fig. 6  Flowchart of the proposed SGOA-KELM model


where 2.6 \le x_1 \le 3.6, 0.7 \le x_2 \le 0.8, 17 \le x_3 \le 28, 7.3 \le x_4 \le 8.3, 7.3 \le x_5 \le 8.3, 2.9 \le x_6 \le 3.9, 5.0 \le x_7 \le 5.5.
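As with the previous two problems, the SRD model can be evaluated with a small helper. The sketch below includes the 0.7854(x4 x6^2 + x5 x7^2) term of the commonly used SRD formulation, which is consistent with the best-cost values reported in Table 7; the function names and the example evaluation are our own and do not come from the paper.

```python
import math

def srd_cost(x):
    """Weight of the speed reducer for x = (x1, ..., x7)."""
    x1, x2, x3, x4, x5, x6, x7 = x
    return (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6**2 + x7**2)
            + 7.4777 * (x6**3 + x7**3)
            + 0.7854 * (x4 * x6**2 + x5 * x7**2))

def srd_constraints(x):
    """Constraint values g1..g11; the design is feasible when all are <= 0."""
    x1, x2, x3, x4, x5, x6, x7 = x
    return [
        27.0 / (x1 * x2**2 * x3) - 1.0,
        397.5 / (x1 * x2**2 * x3**2) - 1.0,
        1.93 * x4**3 / (x2 * x6**4 * x3) - 1.0,
        1.93 * x5**3 / (x2 * x7**4 * x3) - 1.0,
        math.sqrt((745.0 * x4 / (x2 * x3))**2 + 16.9e6) / (110.0 * x6**3) - 1.0,
        math.sqrt((745.0 * x5 / (x2 * x3))**2 + 157.5e6) / (85.0 * x7**3) - 1.0,
        x2 * x3 / 40.0 - 1.0,
        5.0 * x2 / x1 - 1.0,
        x1 / (12.0 * x2) - 1.0,
        (1.5 * x6 + 1.9) / x4 - 1.0,
        (1.1 * x7 + 1.9) / x5 - 1.0,
    ]

# Example: evaluate the WCA design from Table 7 (cost and worst constraint value).
x_wca = (3.5, 0.7, 17.0, 7.3, 7.715319, 3.350214, 5.286654)
print(srd_cost(x_wca), max(srd_constraints(x_wca)))
```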

In this problem, SGOA is compared with WCA [93], DEDS [94], DELC [95], HEAA [96], MDE [97] and PSO-DE [98]. From the results in Table 7, we can see that when x1 = 6.8668, x2 = 0.6058, x3 = 12.6534, x4 = 7.9420, x5 = 9.7367, x6 = 3.6285, x7 = 5.3379, SGOA finds the minimum cost of all the algorithms. The minimum consumption value is 2708.331033. The main reason why SGOA can obtain the best value is that the algorithm pays attention to both the diversity and the quality of solutions in the limited feature space.

Fig. 7  The classification performance obtained by the five models in terms of four criteria on the Cleveland heart dataset
4.4 Parameter optimization of kernel extreme learning machine

Extreme learning machine (ELM) is a kind of machine learning system or method based on the Feedforward Neural Network (FNN) proposed by Huang et al. [63] in 2004, which is suitable for supervised learning and unsupervised learning. The advantages of ELM compared with FNN are:

The weights and thresholds involved in the network can be set randomly, and after setting, there is no need to adjust them. This is not the same as the BP neural network, which needs to adjust the weights and thresholds continuously in the reverse direction. So we can reduce the computation by half here.

The connection weight β between the hidden layer and the output layer does not need to be adjusted iteratively, but is determined at one time by solving equations.

The output function of ELM is:

f(x) = \sum_{i=1}^{M} \beta_i g(w_i x_i + b_i) = g(x)\beta    (14)

where \beta = [\beta_1, \ldots, \beta_M] is the output weight vector representing the nodes connecting the hidden layer and the network output layer, [w_1, \ldots, w_M] is the input weight vector between the input layer nodes and the hidden layer nodes, and g(x) = [g_1(x), \ldots, g_M(x)] is the output vector of the hidden layer with input x. ELM has a faster speed in classification operation because of its fast calculation of the parameter β. The calculation formula of the parameter β is as follows:

\beta = H^T \left(\frac{1}{C} + HH^T\right)^{-1} T    (15)
where H is the output matrix of the hidden layer and T is the target matrix. The coefficient C in the formula is a positive constant used to ensure the stability of the algorithm. So, combining Eq. (14) with Eq. (15), we can get

f(x) = g(x)\beta = g(x) H^T \left(\frac{1}{C} + HH^T\right)^{-1} T    (16)

KELM is constructed based on the above ELM by adding kernel functions. The kernel matrix can be expressed in the following form:

W_{KELM} = HH^T: \quad W_{KELM_{i,j}} = h(x_i) \cdot h(x_j) = K(x_i, x_j)    (17)

Combining Eq. (16) with Eq. (17), the calculation formula of KELM is obtained.

Table 8  The results obtained by SGOA-KELM on the Cleveland heart dataset

Fold  ACC     SENS    SPES    MCC
1     0.7333  0.7143  0.7500  0.4643
2     0.8387  0.7500  0.9333  0.6920
3     0.8065  0.7647  0.8571  0.6193
4     0.8667  0.8000  0.9333  0.7399
5     0.8000  0.9231  0.7059  0.6290
6     0.9032  0.8182  0.9500  0.7863
7     0.8000  0.8000  0.8000  0.5774
8     0.6333  0.5833  0.6667  0.2472
9     0.7667  0.7273  0.7895  0.5083
10    0.7667  0.7500  0.8000  0.5232
Avg   0.7915  0.7631  0.8186  0.5787
Std   0.0748  0.0867  0.0984  0.1553


Table 9  Financial ratios

Feature  Financial ratio                       Feature  Financial ratio
X1       Net profit/total assets               X6       Sales/total assets
X2       Current assets/current liabilities    X7       Market value of equity/total assets
X3       Retained earnings/total assets        X8       Cash flow/total debt
X4       Working capital/total assets          X9       Current assets/total assets
X5       EBIT/total assets                     X10      Cash/current liabilities

* EBIT: earnings before interest and taxes

Table 10  The results obtained by SGOA-KELM on the JPNBDS

Fold  ACC     SENS    SPES    MCC
#1    0.8667  1.0000  0.7778  0.7638
#2    0.8667  0.9091  0.7500  0.6591
#3    0.8667  1.0000  0.7778  0.7638
#4    0.8667  1.0000  0.7143  0.7559
#5    0.9375  0.8571  1.0000  0.8783
#6    0.8667  0.8000  0.9000  0.7000
#7    0.8667  0.8889  0.8333  0.7222
#8    0.7333  0.7143  0.7500  0.4643
#9    0.6667  0.5714  0.7500  0.3273
#10   0.8125  0.7000  1.0000  0.6831
Avg   0.8350  0.8441  0.8253  0.6718
Std   0.0787  0.1465  0.1057  0.1606

f(x) = g(x) H^T \left(\frac{1}{C} + HH^T\right)^{-1} T = \begin{bmatrix} K(x, x_1) \\ \vdots \\ K(x, x_N) \end{bmatrix}^T \left(\frac{1}{C} + W_{KELM}\right)^{-1} T    (18)

In the above formula, we replace the original h(x) with K(u, v). In the actual operation of the algorithm, the most commonly used kernel function is the Gaussian kernel, as shown in the following formula:

K(u, v) = \exp(-\gamma \|u - v\|^2)    (19)

where u and v are input values and the coefficient γ is used to control the Gaussian function distribution.

It is reported that the two key parameters in KELM have a great effect on the performance of KELM [99–104]. In this section, we apply SGOA to the optimization of the two key parameters in KELM, C and γ.

C indicates the penalty coefficient, which is the tolerance of error. The higher C, the lower the tolerance of error, which is prone to overfitting; conversely, a low C is prone to underfitting. Therefore, either a too large or a too small C will lead to poor generalization ability.

γ is an inherent parameter that comes with the Gaussian kernel function and implicitly determines the distribution of the data when mapped to the new feature space. The larger the γ, the fewer support vectors there are, and conversely, the more.
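The KELM classifier defined by Eqs. (15)–(19) can be written in a few lines of numpy. The sketch below is our own minimal implementation, not the authors' code; the toy data, the one-hot target encoding and the chosen C and gamma values are only for illustration.

```python
import numpy as np

def gaussian_kernel(A, B, gamma):
    """Pairwise Gaussian kernel K(a, b) = exp(-gamma * ||a - b||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_train(X, T, C, gamma):
    """Solve (I/C + K(X, X)) alpha = T, following Eqs. (15)-(18)."""
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(np.eye(len(X)) / C + K, T)

def kelm_predict(X_train, alpha, X_new, gamma):
    """f(x) = K(x, X_train) @ alpha, i.e. Eq. (18) with the kernel substituted for h(x)."""
    return gaussian_kernel(X_new, X_train, gamma) @ alpha

# Toy usage on synthetic binary data with one-hot targets.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 10))
y = (X[:, 0] > 0).astype(int)
T = np.eye(2)[y]                       # one-hot target matrix
alpha = kelm_train(X, T, C=32.0, gamma=0.1)
pred = kelm_predict(X, alpha, X, gamma=0.1).argmax(axis=1)
print((pred == y).mean())              # training accuracy of the toy model
```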

Fig. 8  The classification performance obtained by the five models in terms of four criteria on the JPNBDS

In this study, we set the dimension of the individuals generated by the SGOA initialization as 2, which is used to represent C and γ, respectively. SGOA then iterates and optimizes to generate the parameter values for the optimal KELM model.

The resultant SGOA-KELM model was used for dealing with the Cleveland heart and financial stress prediction problems.

Figure 6 shows the framework flow chart of SGOA-KELM. As shown in the figure, the whole framework is divided into two parts: parameter optimization and KELM classification. When optimizing the parameters, we use the proposed SGOA and divide the whole data set into five parts, four of which are training sets. First, through fivefold cross-validation, we select the optimal parameter pair, and then we use this parameter pair for training to get the optimized model. After that, in the KELM classification stage, we use tenfold cross-validation to perform the prediction tasks.
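To make the two-stage framework concrete, the sketch below shows what the fitness function of the parameter-optimization stage could look like: each two-dimensional individual (C, gamma) is scored by the mean KELM accuracy over a k-fold split of the training data. SGOA itself is not reproduced here, and the fold construction, random seed and helper names are our own assumptions rather than the paper's implementation.

```python
import numpy as np

def gaussian_kernel(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_accuracy(X_tr, y_tr, X_te, y_te, C, gamma):
    """Train a KELM on (X_tr, y_tr) and return its accuracy on (X_te, y_te)."""
    T = np.eye(int(y_tr.max()) + 1)[y_tr]          # y must be integer class labels
    alpha = np.linalg.solve(np.eye(len(X_tr)) / C + gaussian_kernel(X_tr, X_tr, gamma), T)
    pred = (gaussian_kernel(X_te, X_tr, gamma) @ alpha).argmax(axis=1)
    return (pred == y_te).mean()

def fitness(individual, X, y, n_folds=5, seed=0):
    """Score a (C, gamma) individual by mean accuracy over n_folds CV splits."""
    C, gamma = individual
    idx = np.random.default_rng(seed).permutation(len(X))
    folds = np.array_split(idx, n_folds)
    scores = []
    for k in range(n_folds):
        te = folds[k]
        tr = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        scores.append(kelm_accuracy(X[tr], y[tr], X[te], y[te], C, gamma))
    return np.mean(scores)   # the optimizer would maximize this value

# Toy usage with synthetic data.
X = np.random.default_rng(1).normal(size=(60, 13))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
print(fitness((32.0, 0.05), X, y))
```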


To test the effectiveness of SGOA-KELM in parameter optimization, it competes with the following algorithms: KNN [105], Classification Tree, BP neural network [106], and GOA-KELM. Here are the four criteria: classification accuracy (ACC), sensitivity (SENS), specificity (SPES) and the Matthews correlation coefficient (MCC).

ACC = \frac{TP + TN}{TP + FP + FN + TN} \times 100\%    (20)

SENS = \frac{TP}{TP + FN} \times 100\%    (21)

SPES = \frac{TN}{FP + TN} \times 100\%    (22)

MCC = \frac{TP \times TN - FP \times FN}{\sqrt{(TP + FP)(TP + FN)(TN + FP)(TN + FN)}} \times 100\%    (23)

where TP and TN are the true positive and true negative samples, and FP and FN are the false positive and false negative samples.
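Eqs. (20)–(23) translate directly into code. The small helper below is a sketch with our own naming; it assumes non-degenerate confusion-matrix counts, so no denominator is zero.

```python
def classification_metrics(tp, tn, fp, fn):
    """ACC, SENS, SPES and MCC from confusion-matrix counts, as in Eqs. (20)-(23)."""
    acc = (tp + tn) / (tp + fp + fn + tn)
    sens = tp / (tp + fn)
    spes = tn / (fp + tn)
    mcc = (tp * tn - fp * fn) / (((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)) ** 0.5)
    return acc, sens, spes, mcc

# Example with an arbitrary confusion matrix.
print(classification_metrics(tp=18, tn=10, fp=2, fn=4))
```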
4.4.1 Cleveland heart dataset

The Cleveland heart data set is obtained from the famous UCI database. It is composed of 303 instances. Among all the attributes, 14 of them are mainly used; these 14 attributes are also used by all published experiments. The specific attributes are as follows:

Num (the predicted attribute)
Age
Sex
Cp (chest pain type)
Trestbps (resting blood pressure, in mm Hg on admission to the hospital)
Chol (serum cholesterol in mg/dl)
Fbs (fasting blood sugar > 120 mg/dl; 1 = true, 0 = false)
Restecg (resting electrocardiographic results)
Thalach (maximum heart rate achieved)
Exang (exercise-induced angina; 1 = yes, 0 = no)
Oldpeak (depression induced by exercise relative to rest)
Slope (the slope of the peak exercise ST segment)
Ca (number of major vessels (0–3) colored by fluoroscopy)
Thal (3 = normal; 6 = fixed defect; 7 = reversible defect)
From Table 8, it can be seen that SGOA-KELM achieves the average results of 79.156% ACC, 76.31% SENS, 81.86% SPES, and 57.81% MCC.

The comparison results between SGOA-KELM and the other four models can be observed in Fig. 7. The algorithm models that took part in the comparison all worked well, but SGOA-KELM still stood out. It can be seen from the ACC index that the ACC of SGOA-KELM is the highest after tenfold cross-validation, which is nearly 10% higher than that of BP with the lowest accuracy. Sensitivity represents the proportion of patients that are correctly detected, and it measures the ability to identify sick cases. SGOA-KELM is also ahead of the other models in this index. Specificity is defined as the proportion of negative cases that are correctly recognized among all negative cases, and it measures the classifier's recognition ability on negative cases. In this index, GOA-KELM is slightly better than SGOA-KELM. Finally, the MCC measure is used. The models built with swarm intelligence algorithms are better than the traditional machine learning methods in effect, and SGOA-KELM is a little better than GOA-KELM.

4.4.2 Japanese bankruptcy dataset (JPNBDS)

The financial ratios of the JPNBDS are abstracted from real financial statements of Japan from 1995 to 2009. The dataset contains 152 instances, of which the bankrupt samples and the non-bankrupt samples each account for half, and it can be used to predict financial stress. In terms of financial stress prediction, according to the literature [107], ten important financial ratios that can reflect the financial situation of a company are selected for training the prediction models. The financial ratios are shown in Table 9.

From Table 10, it can be seen that SGOA-KELM achieves the average results of 83.5% ACC, 84.41% SENS, 82.53% SPES, and 67.18% MCC.

The comparison in Fig. 8 between SGOA-KELM and the other four models reports the average results and standard deviations on the four rating standards. In the figure, it is obvious that the effect of an individual model is quite different under different indicators. This is because KELM belongs to the neural network models, which have a great dependence on the parameter setting, and different algorithms have various effects on optimizing the parameters, so the classification effect of KELM is also diverse. As can be seen from the figure, SGOA-KELM is weaker than KNN and TREE in the SPES index, but still has certain advantages in the other indicators. In addition, compared with GOA-KELM in SPES, SGOA-KELM has made great progress. SPES refers to the percentage of correct judgments of actual negative cases. The SGOA-KELM model is more advantageous than the other excellent models in the most important evaluation index, ACC, and also improves the accuracy compared with the original GOA-KELM model. It is deduced that SGOA can find more suitable values in the parameter optimization of KELM to improve the accuracy of classification.


5 Conclusions and future work

In this study, an improved algorithm, SGOA, was proposed, which integrates the SA strategy into GOA and enhances the performance of the original algorithm. The proposed SGOA was compared with ten MAs and ten advanced MAs. The experimental findings show that SGOA can beat the other algorithms on 17 functions in the comparison with the original meta-heuristic algorithms. When it comes to the advanced algorithms, it shows much better performance than the other peers on 18 functions. In addition to testing the operation of SGOA itself, it is applied to the parameter optimization of KELM and several engineering design problems such as WBD, PVD, and SRD. The experimental results verify that SGOA has achieved quite promising results in these two aspects. What's more, it is not difficult to find that SGOA has excellent performance in solving continuous problems, but its ability to solve binary problems is still insufficient. The application of the proposed algorithm to binary problems [108–110] will also be one of the tasks of the next research.

For the future direction, there are many valuable aspects worthy of exploration. For example, the proposed SGOA can be applied to more areas such as the optimization of the structure and weights of machine learning models [111–120], information fusion [121], social evolution modelling [122], recommender systems [123], and parameter identification of photovoltaic cell models [124–132]. In addition, the extension of SGOA to multi-objective [133] and other complex environments will also be a research direction in the near future.

Acknowledgment  This research is supported by National Natural Science Foundation of China (62076185, 71803136, U1809209), the Ministry of Education of Humanities and Social Science Project of Wenzhou Business College (20YJA790090), the Characteristic Innovation Project of Guangdong Universities in 2020 (2020KTSCX302), Guangdong Natural Science Foundation (2018A030313339), Scientific Research Team Project of Shenzhen Institute of Information Technology (SZIIT2019KJ022).

References

1. Wang G-G, Tan Y (2017) Improving metaheuristic algorithms with information feedback models. IEEE Trans Cybern 49(2):542–555
2. Wang G-G et al (2014) Chaotic krill herd algorithm. Inf Sci 274:17–34
3. Gao D, Wang G-G, Pedrycz W (2020) Solving fuzzy job-shop scheduling problem using DE algorithm improved by a selection mechanism. IEEE Trans Fuzzy Syst. https://doi.org/10.1109/TFUZZ.2020.3003506
4. Wang G-G et al (2019) Monarch butterfly optimization. Neural Comput Appl 31(7):1995–2014
5. Yi J-H et al (2018) An improved NSGA-III algorithm with adaptive mutation operator for big data optimization problems. Future Gener Comput Syst 88:571–585
6. Zhang X et al (2019) Robust low-rank tensor recovery with rectification and alignment. IEEE Trans Pattern Anal Mach Intell. https://doi.org/10.1109/TPAMI.2019.2929043
7. Deng W (2020) An enhanced MSIQDE algorithm with novel multiple strategies for global optimization problems. IEEE Trans Syst Man Cybern Syst. https://doi.org/10.1109/TSMC.2020.3030792
8. Deng W et al (2020) An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application. Int J Bio-Inspir Comput 16(3):158–170
9. Deng W et al (2020) An improved quantum-inspired differential evolution algorithm for deep belief network. IEEE Trans Instrum Meas. https://doi.org/10.1109/TIM.2020.2983233
10. Zhao H et al (2019) Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine. IEEE Trans Instrum Meas. https://doi.org/10.1109/TIM.2019.2948414
11. Song Y et al (2021) MPPCEDE: multi-population parallel co-evolutionary differential evolution for parameter optimization. Energy Convers Manag 228:113661
12. Chen H et al (2020) Multi-population differential evolution-assisted Harris hawks optimization: framework and case studies. Future Gener Comput Syst 111:175–198
13. Wang M, Chen H (2020) Chaotic multi-swarm whale optimizer boosted support vector machine for medical diagnosis. Appl Soft Comput 88:105946. https://doi.org/10.1016/j.asoc.2019.105946
14. Zhao X et al (2019) Chaos enhanced grey wolf optimization wrapped ELM for diagnosis of paraquat-poisoned patients. Comput Biol Chem 78:481–490
15. Wang M et al (2017) Toward an optimal kernel extreme learning machine using a chaotic moth-flame optimization strategy with applications in medical diagnoses. Neurocomputing 267:69–84
16. Shen L et al (2016) Evolving support vector machines using fruit fly optimization for medical data classification. Knowl Based Syst 96:61–75
17. Xu X, Chen HL (2014) Adaptive computational chemotaxis based on field in bacterial foraging optimization. Soft Comput 18(4):797–807
18. Chen H et al (2019) A balanced whale optimization algorithm for constrained engineering design problems. Appl Math Model 71:45–59
19. Luo J et al (2019) Multi-strategy boosted mutative whale-inspired optimization approaches. Appl Math Model 73:109–123
20. Yu H et al (2020) Chaos-enhanced synchronized bat optimizer. Appl Math Model 77:1201–1215
21. Chen H et al (2020) Efficient multi-population outpost fruit fly-driven optimizers: framework and advances in support vector machines. Expert Syst Appl 142:112999
22. Chen H, Wang M, Zhao X (2020) A multi-strategy enhanced sine cosine algorithm for global optimization and constrained practical engineering problems. Appl Math Comput 369:124872. https://doi.org/10.1016/j.amc.2019.124872
23. Zhang X et al (2020) Gaussian mutational chaotic fruit fly-built optimization and feature selection. Expert Syst Appl 141:112976
24. Song S et al (2020) Dimension decided Harris hawks optimization with Gaussian mutation: balance analysis and diversity patterns. Knowl Based Syst. https://doi.org/10.1016/j.knosys.2020.106425
25. Zhao D et al (2020) Ant colony optimization with horizontal and vertical crossover search: fundamental visions for multi-threshold image segmentation. Expert Syst Appl. https://doi.org/10.1016/j.eswa.2020.114122
26. Zhang Y et al (2020) Towards augmented kernel extreme learning models for bankruptcy prediction: algorithmic behavior and comprehensive analysis. Neurocomputing. https://doi.org/10.1016/j.neucom.2020.10.038


27. Wang X et al (2020) Multi-population following behavior-driven fruit fly optimization: a Markov chain convergence proof and comprehensive analysis. Knowl Based Syst 210:106437. https://doi.org/10.1016/j.knosys.2020.106437
28. Zhao D et al (2020) Chaotic random spare ant colony optimization for multi-threshold image segmentation of 2D Kapur entropy. Knowl Based Syst. https://doi.org/10.1016/j.knosys.2020.106510
29. Holland JH (1992) Genetic algorithms. Sci Am 267(1):66–72
30. Dorigo M, Blum C (2005) Ant colony optimization theory: a survey. Theor Comput Sci 344(2–3):243–278
31. James K, Gireesha OB (1995) Particle swarm optimization. In: Proceedings of IEEE international conference on neural networks, vol 4. https://doi.org/10.1109/ICNN.1995.488968
32. Mirjalili S (2016) Dragonfly algorithm: a new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Comput Appl 27(4):1053–1073
33. Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61
34. Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci 179(13):2232–2248
35. Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51–67
36. Saremi S, Mirjalili S, Lewis A (2017) Grasshopper optimisation algorithm: theory and application. Adv Eng Softw 105:30–47
37. Zhou H et al (2020) An improved Grasshopper optimizer for global tasks. Complexity 2020:4873501
38. Xu Z et al (2020) Orthogonally-designed adapted grasshopper optimization: a comprehensive analysis. Expert Syst Appl 150:113282
39. Tumuluru P, Ravi B (2017) GOA-based DBN: Grasshopper optimization algorithm-based deep belief neural networks for cancer classification. Int J Appl Eng Res 12(24):14218–14231
40. Zhao H, Zhao H, Guo S (2018) Short-term wind electric power forecasting using a novel multi-stage intelligent algorithm. Sustainability (Switzerland) 10(3):881
41. Sultana U et al (2018) Placement and sizing of multiple distributed generation and battery swapping stations using grasshopper optimizer algorithm. Energy 165:408–421
42. Liang H et al (2019) Modified grasshopper algorithm-based multilevel thresholding for color image segmentation. IEEE Access 7:11258–11295
43. Omar AI et al (2019) An improved approach for robust control of dynamic voltage restorer and power quality enhancement using grasshopper optimization algorithm. ISA Trans 95:110–129
44. Zhang X et al (2018) A parameter-adaptive VMD method based on grasshopper optimization algorithm to analyze vibration signals from rotating machinery. Mech Syst Signal Process 108:58–72
45. Mafarja M et al (2018) Evolutionary population dynamics and Grasshopper Optimization approaches for feature selection problems. Knowl Based Syst 145:125–145
46. Jumani TA et al (2018) Optimal voltage and frequency control of an islanded microgrid using grasshopper optimization algorithm. Energies 11(11):3191
47. Ibrahim HT et al (2019) A grasshopper optimizer approach for feature selection and optimizing SVM parameters utilizing real biomedical data sets. Neural Comput Appl 31(10):5965–5974
48. Liu J et al (2018) Coordinated operation of multi-integrated energy system based on linear weighted sum and Grasshopper optimization algorithm. IEEE Access 6:42186–42195
49. Wu J et al (2017) Distributed trajectory optimization for multiple solar-powered UAVs target tracking in urban environment by adaptive Grasshopper optimization algorithm. Aerosp Sci Technol 70:497–510
50. Saxena A (2019) A comprehensive study of chaos embedded bridging mechanisms and crossover operators for grasshopper optimisation algorithm. Expert Syst Appl 132:166–188
51. Luo J et al (2018) An improved grasshopper optimization algorithm with application to financial stress prediction. Appl Math Model 64:654–668
52. Jia H et al (2019) Hybrid grasshopper optimization algorithm and differential evolution for global optimization. J Intell Fuzzy Syst 37(5):6899–6910
53. Zakeri A, Hokmabadi A (2019) Efficient feature selection method using real-valued grasshopper optimization algorithm. Expert Syst Appl 119:61–72
54. Saxena A, Shekhawat S, Kumar R (2018) Application and development of enhanced chaotic Grasshopper optimization algorithms. Model Simul Eng 2018:1–14
55. Jia H et al (2019) Hybrid Grasshopper optimization algorithm and differential evolution for multilevel satellite image segmentation. Remote Sens 11(9):1134
56. Ewees AA, Abd Elaziz M, Houssein EH (2018) Improved grasshopper optimization algorithm using opposition-based learning. Expert Syst Appl 112:156–172
57. Yue X, Zhang H (2019) Grasshopper optimization algorithm with principal component analysis for global optimization. J Supercomput 76:5609–5635
58. Arora S, Anand P (2019) Chaotic grasshopper optimization algorithm for global optimization. Neural Comput Appl 31(8):4385–4405
59. Ghulanavar R, Dama KK, Jagadeesh A (2020) Diagnosis of faulty gears by modified AlexNet and improved grasshopper optimization algorithm (IGOA). J Mech Sci Technol 34(10):4173–4182
60. Kirkpatrick S, Gelatt CD Jr, Vecchi MP (1983) Optimization by simulated annealing. Science 220(4598):671–680
61. LaTorre A, Pena JM (2017) A comparison of three large-scale global optimizers on the CEC 2017 single objective real parameter numerical optimization benchmark. In: 2017 IEEE congress on evolutionary computation, CEC 2017—proceedings
62. Alcalá-Fdez J et al (2009) KEEL: a software tool to assess evolutionary algorithms for data mining problems. Soft Comput 13(3):307–318
63. Huang GB et al (2012) Extreme learning machine for regression and multiclass classification. IEEE Trans Syst Man Cybern Part B Cybern 42(2):513–529
64. Kirkpatrick S, Gelatt CD Jr, Vecchi MP (1983) Optimization by simulated annealing. Science (New York NY) 220(4598):671–680
65. Chechkin AV et al (2008) Introduction to the theory of Lévy flights. In: Anomalous transport: foundations and applications, pp 129–162
66. Bäck T, Schwefel H-P (1993) An overview of evolutionary algorithms for parameter optimization. Evol Comput 1(1):1–23
67. Derrac J et al (2011) A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol Comput 1(1):3–18
68. Storn R, Price K (1997) Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11(4):341–359
69. Mirjalili S (2015) Moth-flame optimization algorithm: a novel nature-inspired heuristic paradigm. Knowl Based Syst 89:228–249
70. Mirjalili S et al (2017) Salp swarm algorithm: a bio-inspired optimizer for engineering design problems. Adv Eng Softw 114:163–191
71. Yang XS (2010) A new metaheuristic bat-inspired algorithm. In: Studies in computational intelligence, pp 65–74


72. Mirjalili S (2016) SCA: a sine cosine algorithm for solving optimization problems. Knowl Based Syst 96:120–133
73. Mirjalili S, Mirjalili SM, Hatamlou A (2016) Multi-verse optimizer: a nature-inspired algorithm for global optimization. Neural Comput Appl 27(2):495–513
74. Mirjalili S (2015) The ant lion optimizer. Adv Eng Softw 83:80–98
75. García-Martínez C et al (2008) Global and local real-coded genetic algorithms based on parent-centric crossover operators. Eur J Oper Res 185(3):1088–1113
76. Liang JJ et al (2006) Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE Trans Evol Comput 10(3):281–295
77. Chen WN et al (2013) Particle swarm optimization with an aging leader and challengers. IEEE Trans Evol Comput 17(2):241–258
78. Xu Y et al (2019) An efficient chaotic mutative moth-flame-inspired optimizer for global optimization tasks. Expert Syst Appl 129:135–155
79. Xu Y et al (2019) Enhanced moth-flame optimizer with mutation strategy for global optimization. Inf Sci. https://doi.org/10.1016/j.ins.2019.04.022
80. Liang H et al (2018) A hybrid bat algorithm for economic dispatch with random wind power. IEEE Trans Power Syst 33(5):5052–5261
81. Adarsh BR et al (2016) Economic dispatch using chaotic bat algorithm. Energy 96:666–675
82. Ling Y, Zhou Y, Luo Q (2017) Lévy flight trajectory-based whale optimization algorithm for global optimization. IEEE Access 5:6168–6186
83. Heidari AA et al (2019) An enhanced associative learning-based exploratory whale optimizer for global optimization. Neural Comput Appl 32:5185–5211
84. Coello Coello CA (2002) Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art. Comput Methods Appl Mech Eng 191(11–12):1245–1287
85. Kaveh A, Khayatazad M (2012) A new meta-heuristic method: ray optimization. Comput Struct 112–113:283–294
86. Lee KS, Geem ZW (2005) A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice. Comput Methods Appl Mech Eng 194(36–38):3902–3933
87. Mahdavi M, Fesanghary M, Damangir E (2007) An improved harmony search algorithm for solving optimization problems. Appl Math Comput 188(2):1567–1579
88. Ragsdell KM, Phillips DT (1976) Optimal design of a class of welded structures using geometric programming. J Eng Ind 98(3):97–97
89. Kannan BK, Kramer SN (1994) An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J Mech Des Trans ASME 116(2):405–411
90. He Q, Wang L (2007) An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng Appl Artif Intell 20(1):89–99
91. Deb K (1997) GeneAS: a robust optimal design technique for mechanical component design, vol 185
92. Mezura-Montes E, Coello CAC (2008) An empirical study about the usefulness of evolution strategies to solve constrained optimization problems. Int J Gen Syst 37(4):443–473
93. Eskandar H et al (2012) Water cycle algorithm—a novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput Struct 110–111:151–166
94. Zhang M, Luo W, Wang X (2008) Differential evolution with dynamic stochastic selection for constrained optimization. Inf Sci 178(15):3043–3074
95. Wang L, Li LP (2010) An effective differential evolution with level comparison for constrained engineering design. Struct Multidiscip Optim 41(6):947–963
96. Wang Y et al (2009) Constrained optimization based on hybrid evolutionary algorithm and adaptive constraint-handling technique. Struct Multidiscip Optim 37(4):395–413
97. Mezura-Montes E, Velázquez-Reyes J, Coello Coello CA (2006) Modified differential evolution for constrained optimization. In: 2006 IEEE congress on evolutionary computation, CEC 2006
98. Liu H, Cai Z, Wang Y (2010) Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization. Appl Soft Comput J 10(2):629–640
99. Zhao D et al (2017) An effective computational model for bankruptcy prediction using kernel extreme learning machine approach. Comput Econ 49(2):325–341
100. Chen H et al (2020) An enhanced bacterial foraging optimization and its application for training kernel extreme learning machine. Appl Soft Comput 86:105884
101. Wang M et al (2017) Grey wolf optimization evolving kernel extreme learning machine: application to bankruptcy prediction. Eng Appl Artif Intell 63:54–68
102. Qiang L et al (2017) An enhanced grey wolf optimization based feature selection wrapped kernel extreme learning machine for medical diagnosis. Comput Math Methods Med 2017:1–15
103. Liu T et al (2015) A fast approach for detection of erythemato-squamous diseases based on extreme learning machine with maximum relevance minimum redundancy feature selection. Int J Syst Sci 46(5):919–931
104. Chen H et al (2015) Using blood indexes to predict overweight statuses: an extreme learning machine-based approach. PLoS ONE 10(11):e0143003
105. Cover TM, Hart PE (1967) Nearest neighbor pattern classification. IEEE Trans Inf Theory 13(1):21–27
106. Rumelhart DE, Hinton GE, Williams RJ (1986) Learning representations by back-propagating errors. Nature 323(6088):533–536
107. Kumar PR, Ravi V (2007) Bankruptcy prediction in banks and firms via statistical and intelligent techniques—a review. Eur J Oper Res 180(1):1–28
108. Zhang X et al (2020) Top-k feature selection framework using robust 0–1 integer programming. IEEE Trans Neural Netw Learn Syst. https://doi.org/10.1109/TNNLS.2020.3009209
109. Zhang Y et al (2020) Boosted binary Harris hawks optimizer and feature selection. Eng Comput. https://doi.org/10.1007/s00366-020-01028-5
110. Yang C et al (2018) Superpixel-based unsupervised band selection for classification of hyperspectral images. IEEE Trans Geosci Remote Sens 56(12):7230–7245
111. Chen HL et al (2016) An efficient hybrid kernel extreme learning machine approach for early diagnosis of Parkinson's disease. Neurocomputing 184:131–144
112. Hu L et al (2015) An efficient machine learning approach for diagnosis of paraquat-poisoned patients. Comput Biol Med 59:116–124
113. Xia J et al (2017) Ultrasound-based differentiation of malignant and benign thyroid nodules: an extreme learning machine approach. Comput Methods Programs Biomed 147:37–49
114. Zhang X et al (2020) Pyramid channel-based feature attention network for image dehazing. Comput Vis Image Understand. https://doi.org/10.1016/j.cviu.2020.103003
115. Wang T et al (2020) Video deblurring via spatiotemporal pyramid network and adversarial gradient prior. Comput Vis Image Understand. https://doi.org/10.1016/j.cviu.2020.103135
116. Li Y et al (2019) Epileptic seizure detection in EEG signals using sparse multiscale radial basis function networks and the Fisher vector approach. Knowl Based Syst 164:96–106


117. Li Y et al (2020) Deep spatial-temporal feature fusion from adaptive dynamic functional connectivity for MCI identification. IEEE Trans Med Imaging 39(9):2818–2830
118. Li Y et al (2020) Epileptic seizure detection in EEG signals using a unified temporal-spectral squeeze-and-excitation network. IEEE Trans Neural Syst Rehabil Eng 28(4):782–794
119. Guan R et al (2020) Deep feature-based text clustering and its explanation. IEEE Trans Knowl Data Eng 14:1–1
120. Fei X et al (2020) Projective parameter transfer based sparse multiple empirical kernel learning machine for diagnosis of brain disease. Neurocomputing 413:271–283
121. Chen Z et al (2021) Information synergy entropy based multi-feature information fusion for the operating condition identification in aluminium electrolysis. Inf Sci 548:275–294
122. Xue X et al (2019) Social learning evolution (SLE): computational experiment-based modeling framework of social manufacturing. IEEE Trans Ind Inform 15(6):3343–3355
123. Wang D et al (2018) A content-based recommender system for computer science publications. Knowl Based Syst 157:1–9
124. Ridha HM et al (2021) Multi-objective optimization and multi-criteria decision-making methods for optimal design of standalone photovoltaic system: a comprehensive review. Renew Sustain Energy Rev 135:110202. https://doi.org/10.1016/j.rser.2020.110202
125. Chen H et al (2020) Parameters identification of photovoltaic cells and modules using diversification-enriched Harris hawks optimization with chaotic drifts. J Clean Prod 244:118778. https://doi.org/10.1016/j.jclepro.2019.118778
126. Chen H et al (2019) An opposition-based sine cosine approach with local search for parameter estimation of photovoltaic models. Energy Convers Manag 195:927–942
127. Abbassi A et al (2020) Parameters identification of photovoltaic cell models using enhanced exploratory salp chains-based approach. Energy 198:117333. https://doi.org/10.1016/j.energy.2020.117333
128. Ridha HM et al (2020) Boosted mutation-based Harris hawks optimizer for parameters identification of single-diode solar cell models. Energy Convers Manag 209:112660. https://doi.org/10.1016/j.enconman.2020.112660
129. Zhang H et al (2020) Orthogonal Nelder-Mead moth flame method for parameters identification of photovoltaic modules. Energy Convers Manag 211:112764. https://doi.org/10.1016/j.enconman.2020.112764
130. Jiao S et al (2020) Orthogonally adapted Harris hawks optimization for parameter estimation of photovoltaic models. Energy 203:117804. https://doi.org/10.1016/j.energy.2020.117804
131. Liu Y et al (2020) Horizontal and vertical crossover of Harris hawk optimizer with Nelder-Mead simplex for parameter estimation of photovoltaic models. Energy Convers Manag 223:113211. https://doi.org/10.1016/j.enconman.2020.113211
132. Wang M et al (2020) Evaluation of constraint in photovoltaic models by exploiting an enhanced ant lion optimizer. Sol Energy 211:503–521
133. Sun Y, Yen GG, Yi Z (2019) IGD indicator-based evolutionary algorithm for many-objective optimization problems. IEEE Trans Evol Comput 23(2):173–187

Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
