Acta Geotechnica

https://doi.org/10.1007/s11440-022-01685-4

RESEARCH PAPER

Proposing several hybrid SSA—machine learning techniques for estimating rock cuttability by conical pick with relieved cutting modes

Jian Zhou1 · Yong Dai1 · Shuai Huang1 · Danial Jahed Armaghani2 · Yingui Qiu1

Received: 14 November 2021 / Accepted: 13 August 2022


© The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2022

Abstract
During excavation by roadheaders, specific energy (SE) is a key component of rock cuttability evaluation and cutting head design. Previous studies have shown that the specific energy is simultaneously affected by the physical and mechanical parameters of rock, pick geometry, and pick operation parameters. In this paper, six machine learning (ML) algorithms (back-propagation neural network, Elman neural network, extreme learning machine, kernel extreme learning machine, random forest, support vector regression) optimized by the sparrow search algorithm (SSA) are developed for SE prediction by simultaneously considering two rock mechanical parameters (tensile strength of the rock σt and uniaxial compressive strength of the rock σc), one pick geometry parameter (cone angle θ) and five pick operation parameters (cutting depth d, tool spacing s, rake angle α, attack angle γ, back-clearance angle β). 213 rock samples covering 26 rock types were selected to build the SSA-ML models. Mean absolute error (MAE), mean absolute percentage error (MAPE) and determination coefficient (R²) between the measured and predicted values are assigned as evaluation indicators to compare the prediction performance of the SSA-ML models. The importance of the input variables is calculated internally using the random forest (RF) algorithm. The results indicate that the SSA-RF model, with MAE of (0.7938 and 1.0438), MAPE of (12.76% and 16.98%) and R² of (0.9632 and 0.8943) on the training set and testing set, respectively, has the most potential for SE prediction. The sensitivity analysis shows that d, σc and σt are the most significant input variables for SE prediction.

Keywords Machine learning · Rock cuttability · Sparrow search algorithm · Specific energy

Jian Zhou (corresponding author)
j.zhou@csu.edu.cn; csujzhou@hotmail.com

Yong Dai
205511027@csu.edu.cn

Shuai Huang
205511038@csu.edu.cn

Danial Jahed Armaghani
danialarmaghani@susu.ru

Yingui Qiu
195512085@csu.edu.cn

1 School of Resources and Safety Engineering, Central South University, Changsha 410083, China

2 Department of Urban Planning, Engineering Networks and Systems, Institute of Architecture and Construction, South Ural State University, 76, Lenin Prospect, Chelyabinsk 454080, Russia

1 Introduction

Roadheaders are widely used in tunneling, civil engineering, and the mining industry. Low efficiency in breaking hard rock and large consumption of picks are the dominant challenges in the mechanical operation of roadheader machinery [28, 29, 33, 57]. The specific energy (SE) refers to the energy consumed by rock breaking per unit volume, and it has been demonstrated to be a comprehensive criterion for rock cuttability and the rock breaking efficiency of mechanical excavators [42]. Therefore, predicting specific energy before practical operation, according to the rock parameters and pick configuration, can provide guidance for cutting head design and roadheader selection so as to improve the rock breaking efficiency and reduce pick consumption. So far, SE measurements have mainly been conducted through small-scale and full-scale cutting tests, and a large number of studies have proved that these two methods are effective [4].

For the assessment of roadheader performance, Hughes [16] first formulated secant elasticity modulus and compressive strength of rock to estimate the SE value, which proved to be effective [5]. Balci et al. [4] conducted full-scale rock cutting tests on 23 types of rocks to investigate the relationships between physical and mechanical parameters of rock (i.e., Cerchar abrasivity index, static and dynamic elasticity moduli, density, Schmidt hammer rebound values, Brazilian tensile strength, and uniaxial compressive strength) and optimal SE. The results demonstrated that tensile and uniaxial compressive strength affect SE most, and several performance prediction models were developed in their research. Tiryaki and Dikmen [39] constructed rock cuttability prediction models by correlating SE with several physical and mechanical parameters of rock, indicating that Poisson's ratio had the highest effect on SE. Research carried out by Ozturk et al. [27] showed that SE is linearly related to the texture coefficient. Through laboratory rock cutting tests, Goktan and Gunes [13] first made an attempt to investigate the relationship between the cutting efficiency of drag picks and rock brittleness. Their research indicates that rock brittleness exhibits a high correlation with SE. To sum up, the abovementioned research findings mainly investigated the relationship between rock physical and mechanical parameters and specific energy. However, Singhal [36] pointed out that the factors affecting the performance of a roadheader can be divided into physical and mechanical parameters of rock, pick geometry, and pick operation parameters. Previous studies mainly aimed at estimating rock cuttability, whereas for the performance prediction of roadheaders, pick geometry and pick operation parameters should also be considered. Because few researchers had studied the effect of pick geometry and pick operation parameters on the performance of roadheaders, Dogruoz et al. [9, 10] focused on the relationship between the degree of chisel-type pick wear and the SE value, based on the fact that the cutting head is prone to wear during operation. The formulas proposed in their studies can be used to forecast the SE values of cutting low-strength and medium-strength rocks. Wang et al. [47] comprehensively considered mechanical parameters of rock (uniaxial compressive strength, UCS), pick geometry (cone angle), and pick operation parameters (rake angle, clearance angle, and attack angle), and both principal component regression and multiple nonlinear regression were adopted in their study for SE prediction. For the purpose of minimizing specific energy, Park et al. [30] selected skew angle, attack angle, cut spacing, and depth to study their influence on SE through a full-scale cutting test. It has been reported that changing the attack angle from 45° to 60°, an s/d ratio of 2–3, and a positive skew angle are favorable to the structural stability and cutting efficiency of the cutting machine. In general, the above studies can play a key role in rock cuttability estimation, but only a few variables and a small number of test samples with specific rock properties and pick parameters are considered in their empirical formulas, suggesting that previously proposed empirical models lack generalization ability and are difficult to apply in other engineering practice. Therefore, it is pertinent to develop a comprehensive, powerful and practical model that can be used in other conditions, considering the physical and mechanical parameters of rock as well as pick geometry and pick operation parameters.

During the past few years, soft computing techniques have been widely used in tunneling, mining, and civil engineering, proving their potential to explore complex relationships between multiple parameters and to build multiple nonlinear relationships [1, 2, 17, 18, 20, 21, 23, 25, 26, 34, 44, 53–55, 60–65]. In terms of predicting SE in rock cutting processes, Yurdakul et al. [59] selected the depth of cut, bending strength, unconfined compressive strength, feed rate, and point load strength as input parameters to estimate SE values in saw cutting through principal component analysis and established an ANFIS model that outperformed other proposed prediction models. Tiryaki [40, 41] pointed out that SE prediction of the rock-cutting interaction is a complex nonlinear problem, so artificial neural network (ANN) and regression tree methods were adopted to develop SE prediction models. The results revealed that predicting SE values by ANN is promising. Although the above scholars tried to establish machine learning models for SE prediction, they did not consider the influence of pick geometry and pick operation parameters on SE, and only considered a few samples. Thus, this study attempts to predict SE values by comprehensively considering the physical and mechanical parameters of rock, pick geometry and pick operation parameters with soft computing techniques, which is significant for cutting head design, roadheader selection, rock cuttability, and cutting efficiency estimation.

To estimate and predict SE values, several reliable and popular machine learning (ML) algorithms, namely back-propagation neural network (BPNN) [22], Elman neural network [24], extreme learning machine (ELM) [52], support vector regression (SVR) [20, 21, 66] and random forest (RF) [64], integrated with the sparrow search algorithm (SSA), are used to estimate SE. Random hyper-parameter selection leads to unstable accuracy and poor performance of machine learning models, so this paper selects SSA to optimize the hyper-parameters of each machine learning model. The classical ML algorithms mentioned above have been successfully and broadly used in other engineering fields [1, 20, 25, 32, 63–70], and the newly proposed SSA has been proven to have better optimization ability than traditional optimization algorithms in some cases. Therefore, the new SSA-based ML models proposed in this paper have a good capacity for SE prediction.

2 Methodology

2.1 Sparrow search algorithm (SSA)

In the sparrow population, there are two types of sparrows: the producers and the scroungers. Producers fervently hunt for food, and scroungers get food by following the producers. Based on the special behavior and feeding strategies of scroungers and producers, Xue and Shen [51] proposed a novel swarm intelligence algorithm called the sparrow search algorithm. The mathematical model of the SSA is established as below:

x_i = (x_{i1}, \ldots, x_{ij}, \ldots, x_{iP})    (1)

where i = 1, 2, …, l; j = 1, 2, …, P; l and P represent the number of sparrows and the dimension of the optimized variables, respectively. In order to evaluate the sparrow's ability to obtain food, the fitness of each sparrow is described by Eq. (2), and the producer with the highest fitness has priority in obtaining food.

F_{x_i} = f(x_{i1}, \ldots, x_{ij}, \ldots, x_{iP})    (2)

In order to guide the whole sparrow population to get food, 70% of the sparrows are assigned as producers in this study. The position of the producer is updated by:

X_{i,j}^{t+1} = \begin{cases} X_{i,j}^{t} \cdot \exp\left( \dfrac{-i}{\alpha \cdot t_{\max}} \right), & \text{if } R_2 < ST \\ X_{i,j}^{t} + Q \cdot L, & \text{if } R_2 \ge ST \end{cases}    (3)

where t refers to the present iteration; X_{i,j}^{t} indicates the position of the ith sparrow in the jth dimension during iteration t; α is a random number in (0, 1]; t_max represents the maximum number of iterations; the safety threshold ST is a random number in [0.5, 1], set as 0.6 here; the alarm value R_2 is a random number in [0, 1]; the random number Q obeys a Gaussian distribution; L is a matrix [1, 1, …, 1]_{1×d}. When R_2 ≥ ST, the producer is aware of danger and all sparrows move to other safe places. If R_2 < ST, the producers have not discovered predators and switch to the extensive search mode.

The scroungers update their positions using the following formula:

X_{i,j}^{t+1} = \begin{cases} X_{b}^{t+1} + \left| X_{i,j}^{t} - X_{b}^{t+1} \right| \cdot A^{+} \cdot L, & \text{if } i \le l/2 \\ Q \cdot \exp\left( \dfrac{X_{w}^{t} - X_{i,j}^{t}}{i^{2}} \right), & \text{otherwise} \end{cases}    (4)

where X_b and X_w refer to the global optimal location searched by the producer and the global worst position, respectively; A denotes a matrix of size 1 × d in which each element is randomly set to −1 or 1, and A⁺ = Aᵀ(AAᵀ)⁻¹. If i ≤ l/2, the scroungers compete for food with the producer that has found the optimal position; otherwise, the scrounger in the worst position might be starving.

In this paper, 20% of the sparrows are aware of the danger, and their locations are changed as below:

X_{i,j}^{t+1} = \begin{cases} X_{best}^{t} + \beta \cdot \left| X_{i,j}^{t} - X_{best}^{t} \right|, & \text{if } f_i > f_g \\ X_{i,j}^{t} + K \cdot \left( \dfrac{\left| X_{i,j}^{t} - X_{worst}^{t} \right|}{(f_i - f_w) + \varepsilon} \right), & \text{if } f_i = f_g \end{cases}    (5)

where X_best is the present global optimal position; K is a random number in [−1, 1]; the step control parameter β is a random number obeying a normal distribution with mean 0 and variance 1; f_i, f_g and f_w are the fitness value of the current sparrow, the global best fitness value, and the global worst fitness value, respectively; the small constant ε is set as 10⁻⁸ to avoid division by zero.
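As a minimal illustration of how the update rules in Eqs. (3)–(5) operate together, the sketch below performs one SSA iteration in plain NumPy. It is a simplified reading of the algorithm described above: bound handling, the exact producer/scrounger bookkeeping and the re-evaluation schedule are assumptions of this sketch, not details taken from the paper.

```python
import numpy as np

def ssa_iteration(X, fitness, t_max, PD=0.7, SD=0.2, ST=0.6, rng=None):
    """One simplified sparrow search iteration following Eqs. (3)-(5).

    X       : (l, P) array of sparrow positions
    fitness : callable mapping a position vector to a scalar to be minimized
    """
    rng = rng or np.random.default_rng(0)
    l, P = X.shape
    f = np.array([fitness(x) for x in X])
    order = np.argsort(f)                        # best (smallest fitness) first
    X, f = X[order].copy(), f[order]

    n_prod = max(1, int(PD * l))                 # 70% producers, as in the paper
    R2 = rng.random()                            # alarm value in [0, 1]

    for i in range(n_prod):                      # Eq. (3): producer update
        if R2 < ST:
            X[i] = X[i] * np.exp(-(i + 1) / (rng.random() * t_max + 1e-12))
        else:
            X[i] = X[i] + rng.normal() * np.ones(P)

    Xb, Xw = X[0].copy(), X[-1].copy()           # best producer / worst position
    for i in range(n_prod, l):                   # Eq. (4): scrounger update
        if i <= l / 2:
            A = rng.choice([-1.0, 1.0], size=P)
            X[i] = Xb + np.abs(X[i] - Xb) * (A / (A @ A))   # A+ = A^T (A A^T)^-1
        else:
            X[i] = rng.normal() * np.exp((Xw - X[i]) / (i + 1) ** 2)

    f = np.array([fitness(x) for x in X])        # Eq. (5): danger-aware sparrows (20%)
    fg, fw = f.min(), f.max()
    Xbest = X[np.argmin(f)].copy()
    for i in rng.choice(l, size=max(1, int(SD * l)), replace=False):
        if f[i] > fg:
            X[i] = Xbest + rng.normal() * np.abs(X[i] - Xbest)
        else:
            X[i] = X[i] + rng.uniform(-1, 1) * np.abs(X[i] - Xw) / ((f[i] - fw) + 1e-8)
    return X
```

In the hybrid models of Sect. 3, the `fitness` callable would be the MSE of an ML model rebuilt from the candidate internal parameters encoded in each sparrow position.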

Fig. 1 Principles of ML algorithms: a BPNN; b ElmanNN; c ELM, KELM; d RF; e SVR


2.2 Back-propagation neural network (BPNN)

BPNN has become the most widely used ANN so far for its excellent nonlinear mapping ability, generalization ability, and fault tolerance [7]. BPNN is composed of an input layer, one or more hidden layers, and an output layer, as shown in Fig. 1a. The layers are fully interconnected, while the nodes within each layer are not connected. The number of nodes in the input layer is usually taken as the dimension of the input vector, and the number of nodes in the output layer is usually taken as the dimension of the output vector. At present, there is no definite standard for the number of nodes in the hidden layer, and the result should be obtained through repeated trials; it is set as 10 in this study. Suppose the inputs of BPNN are x = (x_1, x_2, …, x_l)ᵀ, the output of the hidden layer is x' = (x'_1, x'_2, …, x'_m)ᵀ, and the result of the output layer is x'' = (x''_1, x''_2, …, x''_n)ᵀ. The neuron output of each layer is:

\begin{cases} x'_j = f\left( \sum_{i=1}^{l} w_{ij} x_i - \theta_j \right), & j = 1, 2, \ldots, m \\ x''_k = f\left( \sum_{j=1}^{m} w'_{jk} x'_j - \theta'_k \right), & k = 1, 2, \ldots, n \end{cases}    (6)

where l, m and n refer to the number of neurons in the input, hidden and output layers, respectively; w_{ij} and w'_{jk} represent the weights connecting the input and hidden layers, and the hidden and output layers, respectively; θ_j and θ'_k denote the thresholds of the hidden and output layers, respectively; f(t) is the activation function, which is set as the Sigmoid function:

f(t) = \dfrac{1}{1 + e^{-t}}    (7)

The gradient descent method is used in BPNN to constantly adjust the weights and thresholds of the network through back propagation, so as to minimize the sum of squared errors of the network:

E_r = \dfrac{1}{2} \sum_{k=1}^{n} (d_{rk} - y_{rk})^2    (8)

Assuming there are R samples, the total error is:

E_A = \sum_{r=1}^{R} E_r = \dfrac{1}{2} \sum_{r=1}^{R} \sum_{k=1}^{n} (d_{rk} - y_{rk})^2    (9)

where E_r is the error of the rth sample and E_A is the total error of the R samples, respectively; d_{rk} is the kth desired output value of the rth sample; y_{rk} denotes the kth network output value of the rth sample.

2.3 Elman neural network

The Elman neural network is a common global feedforward, locally recurrent network, which can be regarded as a recursive neural network with local memory units and local feedback connections [12]. On the basis of the BP network, the Elman network adds a context layer as a one-step delay operator, which memorizes the most recent output of the hidden layer, gives the system the ability to adapt to time-varying characteristics, and enhances the global stability of the network, as shown in Fig. 1b. Compared with the feedforward neural network, it has stronger computing power, can also be used to solve fast optimization problems, and has a short-term memory function. An Elman neural network can be mathematically modeled as:

x'_j(r) = f\left( w''_{jj} x''_C(r) + w_{ij} x(r-1) \right)    (10)

x''_C(r) = b \, x''_C(r-1) + x'(r-1)    (11)

x''_k(r) = g\left( w'_{jk} x'_j(r) \right)    (12)

g(·) is the activation function, set as a linear function in this study, so that

x''_k(r) = w'_{jk} x'_j(r)    (13)

where w''_{jj} represents the weights connecting the context and hidden layers; x''_C refers to the output value of the context layer; the feedback gain of the self-connections b is in the range [0, 1). The other symbols are partially identical to those of the BP neural network.

2.4 Extreme learning machine (ELM)

In order to improve on the back-propagation algorithm, to simplify the setting of learning parameters and to overcome its low learning efficiency, the ELM was developed by Huang et al. [15]. As can be seen in Fig. 1c, the network structure of ELM is the same as that of single-hidden-layer feedforward neural networks (SLFNs), but in the training stage the weights connecting the input and hidden layers and the thresholds of the hidden layer are randomly selected, so the algorithm has a simple structure and a fast learning rate. The optimal ELM is then determined by calculating the weights connecting the hidden and output layers, which are solved by minimizing the approximate squared deviation. The objective function is as follows:

\min \left\| H w' - D \right\|^2    (14)

where H is the output matrix of the hidden layer and D is the desired output matrix:

H = \begin{bmatrix} h(x_1) \\ \vdots \\ h(x_R) \end{bmatrix} = \begin{bmatrix} g(w_1, \theta_1, x_1) & \cdots & g(w_m, \theta_m, x_1) \\ \vdots & \ddots & \vdots \\ g(w_1, \theta_1, x_R) & \cdots & g(w_m, \theta_m, x_R) \end{bmatrix}_{R \times L}, \quad D = \begin{bmatrix} d_1^{T} \\ \vdots \\ d_R^{T} \end{bmatrix}    (15)

where x refers to the training data vector. The other symbols are partially identical to those in the BPNN network. A more detailed description of the ELM algorithm is presented in the research of Huang et al. [15].

2.5 Kernel extreme learning machine (KELM)

There are some disadvantages to ELM, such as over-fitting, random initialization of input weights and difficulty in determining the number of hidden layer nodes. In order to enhance the stability and generalization ability of ELM, Huang et al. [14] introduced the kernel function as the hidden layer node mapping of an ELM to construct KELM. Note that the structure of KELM is the same as that of ELM, as shown in Fig. 1c. The objective function of KELM can be expressed as:

t(x) = \begin{bmatrix} K(x, x_1) \\ \vdots \\ K(x, x_N) \end{bmatrix}^{T} \left( I/C + \Omega_{ELM} \right)^{-1} T    (16)

where T refers to the training dataset; Ω_{ELM} represents the kernel matrix; C denotes the penalty factor; I refers to the unit matrix; (x, x_i) is the training sample and K(x, x_i) is the kernel function, for which the Gaussian kernel function is used in this study:

K(x, x_i) = \exp\left( -\dfrac{\| x - x_i \|^2}{2\sigma^2} \right)    (17)

where σ denotes the kernel parameter.

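To make Eqs. (16)–(17) concrete, the following sketch trains and applies a KELM regressor with a Gaussian kernel. The NumPy implementation, the class name and the toy data are illustrative assumptions; only the kernel form and the (I/C + Ω)⁻¹T solution follow the formulation above.

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    """K(x, x') = exp(-||x - x'||^2 / (2 sigma^2)) for all pairs (Eq. 17)."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

class KELM:
    def __init__(self, C=10.0, sigma=1.0):
        self.C, self.sigma = C, sigma

    def fit(self, X, y):
        self.X_train = np.asarray(X, dtype=float)
        omega = gaussian_kernel(self.X_train, self.X_train, self.sigma)   # kernel matrix Ω_ELM
        n = omega.shape[0]
        # beta = (I/C + Ω)^-1 T, so that t(x) = [K(x, x_1) ... K(x, x_N)] beta  (Eq. 16)
        self.beta = np.linalg.solve(np.eye(n) / self.C + omega, np.asarray(y, dtype=float))
        return self

    def predict(self, X):
        k = gaussian_kernel(np.asarray(X, dtype=float), self.X_train, self.sigma)
        return k @ self.beta

# toy usage: fit a noisy 1-D function (placeholder data, not the SE database)
rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)
model = KELM(C=50.0, sigma=0.8).fit(X, y)
print(model.predict(np.array([[0.0], [1.5]])))
```

The two quantities tuned by SSA for KELM in Table 1, C and σ, appear here as the constructor arguments.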

2.6 Random forest (RF)

Breiman [6] proposed a powerful ensemble learning algorithm called RF. Firstly, multiple training sets are extracted from the original training set by the Bootstrap technique; then, decision trees are established using the generated training sets, and the RF is constructed by aggregating the decision trees. As shown in Fig. 1d, each node (orange and green) tests one input parameter, and the leaf nodes denote the output values. The final output value is obtained by aggregating the values of the leaf nodes in the whole forest. The calculation formula is as follows:

\text{Output} = \dfrac{1}{n_{tree}} \sum_{i=1}^{n_{tree}} \text{Output}_i(x)    (18)

where Output is the final output value and Output_i denotes the output value of the ith decision tree.

2.7 Support vector regression (SVR)

The support vector machine (SVM) was proposed for dichotomies, and SVR is an important part of SVM [37]. The optimal hyperplane sought by SVR is not one that separates two or more types of sample points, as in SVM, but one that minimizes the total deviation of all sample points from the hyperplane, as shown in Fig. 1e. A hyperplane, f(x) = ωx + b, can be used to fit the training set linearly. SVR is essentially a problem of finding the optimal hyperplane, which can be obtained from the following formulas:

\text{Minimize:} \quad \dfrac{1}{2} \| \omega \|^2    (19)

\text{Subject to:} \quad \left| \sum_{i=1}^{M} \left( \omega_i K_i(x) + b - y_i \right) \right| \le \varepsilon    (20)

where K(x) is adopted as the Gaussian kernel function, see Eq. (17); ω denotes the weight vector and b refers to the threshold.

3 SSA-ML model construction and data description

3.1 Hybrid SSA-ML models

Figure 2 shows the procedure for constructing the SSA-ML models. The accuracy of ML algorithms is greatly dependent on their internal parameters. Thus, hybrid models for SE prediction combining SSA and the six ML algorithms are proposed to determine their optimal internal parameters, which are summarized in Table 1. Firstly, SSA-ML models are constructed and tested using 213 samples that consider the physical and mechanical parameters of rock, pick geometry, and pick operation parameters. To improve the iteration speed and prediction accuracy, the input and output parameters are normalized to [−1, 1] by the formula below:

x_{normalized} = \dfrac{x - x_{\min}}{x_{\max} - x_{\min}} \left( x'_{\max} - x'_{\min} \right) + x'_{\min}    (21)

where x_{normalized} denotes the normalized value; x_max and x_min represent the maximum and minimum values of the variable, respectively; x'_max and x'_min refer to 1 and −1, respectively. Note that the predicted SE values must be mapped back to the original vector space.

For effectively establishing and estimating the SSA-ML models, 80% of the original database is randomly selected as the training set (172 samples), and the remainder is assigned as the testing set (41 samples). In addition, the population size and maximum iteration of SSA are set as 100 and 500, respectively, which are sufficient to determine the optimal parameters. The fitness function is defined as the mean square error (MSE):

MSE = \dfrac{1}{n} \sum_{i=1}^{n} (A_i - P_i)^2    (22)

where A and P refer to the actual and predicted SE values. After the maximum iteration is reached, each optimal SSA-ML model is developed with a minimum fitness value (MSE). Eventually, the six optimal SSA-ML models are comprehensively compared by evaluating the accuracy on the training set and testing set with evaluation indicators.
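The sketch below mirrors this construction procedure for one of the hybrid models, SSA-RF: min-max scaling to [−1, 1] (Eq. 21), an 80/20 split, and an MSE fitness (Eq. 22) evaluated for candidate (mtry, ntree) pairs. scikit-learn's RandomForestRegressor stands in for the RF implementation, a plain random search stands in for the SSA loop, and the placeholder data, the mtry range of [1, 8] (eight features) and the use of training-set MSE as fitness are assumptions of this sketch rather than details stated in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

def scale_to_unit(x, x_min, x_max, lo=-1.0, hi=1.0):
    """Eq. (21): map x from [x_min, x_max] to [lo, hi]."""
    return (x - x_min) / (x_max - x_min) * (hi - lo) + lo

# X: 213 x 8 feature matrix (sigma_t, sigma_c, theta, d, s, alpha, gamma, beta), y: SE values
rng = np.random.default_rng(0)
X = rng.random((213, 8))            # placeholder data; the real database comes from cutting tests
y = rng.random(213) * 30

X_min, X_max = X.min(axis=0), X.max(axis=0)
Xn = scale_to_unit(X, X_min, X_max)
X_tr, X_te, y_tr, y_te = train_test_split(Xn, y, test_size=0.2, random_state=1)

def fitness(mtry, ntree):
    """Eq. (22): MSE of an RF trained with candidate internal parameters."""
    rf = RandomForestRegressor(n_estimators=int(ntree), max_features=int(mtry), random_state=1)
    rf.fit(X_tr, y_tr)
    return mean_squared_error(y_tr, rf.predict(X_tr))

# stand-in for the SSA search over mtry in [1, 8] and ntree in [1, 100]
candidates = [(rng.integers(1, 9), rng.integers(1, 101)) for _ in range(20)]
best = min(candidates, key=lambda c: fitness(*c))
print("selected (mtry, ntree):", best)
```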

Fig. 2 The procedure of constructing SSA-ML models for SE prediction

Table 1 Internal parameters in ML algorithms

Algorithm | Optimized parameters | Xmin | Xmax
SVR | kernel parameter σ; penalty factor C | 0 | 20
RF | number of features for splitting at each node, mtry; number of trees in the forest, ntree | 1 | 100
KELM | kernel parameter σ; penalty factor C | 0.5 | 50
ElmanNN | weights connecting the input and hidden layer, w; weights connecting the hidden and output layer, w'; weights connecting the context and hidden layer, w''; thresholds of the hidden layer, θ; thresholds of the output layer, θ' | −3 | 3
ELM | weights connecting the input and hidden layer, w; thresholds of the hidden layer, θ | −1 | 1
BPNN | weights connecting the input and hidden layer, w; weights connecting the hidden and output layer, w'; thresholds of the hidden layer, θ; thresholds of the output layer, θ' | −5 | 5

3.2 Data source

Previous research [45, 46] has suggested that the SE is not only affected by the physical and mechanical parameters of rock, but is also influenced by pick geometry and pick operation parameters. Research has pointed out that the optimal or minimum SE is obtained at an optimal spacing (s) to depth (d) ratio, i.e., a cut ratio (s/d) of 2–3, demonstrating that there is a significant statistical relationship among d, s and SE [4, 43, 49]. Balci et al. [3, 4] described a linear relationship between the tensile strength of the rock (σt), the uniaxial compressive strength of the rock (σc) and SE; a similar relationship was also found by Dogruoz and Bolukbasi [9]. It should be noted that σc and σt have also been widely used to build ML models for SE prediction [40, 41, 59]. Recently, many scholars have tried to study the influence of various pick operation angles on SE based on the consideration of rock physical and mechanical properties, cutting spacing and depth. Kang et al. [19] found that SE decreases with the increase in attack angle γ. Wang et al. [47, 48] pointed out that the attack angle (γ) and the back-clearance angle (β) correlate significantly with SE, and that the cone angle and rake angle α also potentially affect SE. Thus, two rock physical and mechanical parameters (tensile strength of the rock σt and uniaxial compressive strength of the rock σc), one pick geometry parameter (cone angle θ), and five pick operation parameters (cutting depth d, tool spacing s, rake angle α, attack angle γ, back-clearance angle β), which have been considered by many researchers, were chosen as input features in this study to ensure the diversity and quantity of the data. The data were all collected from full-scale linear rock cutting tests, and the geometrical features of conical pick cutting are presented in Fig. 3. Therefore, 213 experimental samples performed in the relieved cutting mode (interaction between grooves exists) and involving 8 influential parameters were collected in this study for SE prediction model construction [31, 43, 45, 48, 50, 56]. It should be noted that 26 kinds of rock specimens, including limestone, beige marble, sandstone, travertine, etc., were taken into consideration to improve the generalization ability of the SSA-ML models.

Fig. 3 Geometrical features of conical pick cutting

Fig. 4 Correlation matrix of SE database

3.3 Data analysis

The distribution and Pearson correlation coefficient (R) values for the relationships between the eight inputs and the one output are shown in Fig. 4. It can be observed that the distributions of the pick operation parameters d and s and the physical and mechanical parameters of rock σc and σt are relatively dispersed, indicating that these parameters were studied over wide ranges in the experiments, whereas the other parameters are mainly concentrated on certain values, revealing that these parameters are rarely studied/investigated. In addition, the Pearson correlation coefficient (R) values for the relationships between the eight inputs and the one output indicate that the pick operation parameters d and s and the physical and mechanical parameters of rock σc and σt have a relatively high correlation with the output parameter compared with the other parameters. This demonstrates that it is reasonable that previous studies focused on the influence of these parameters on SE [3, 11]. However, all eight input parameters correlate only weakly with SE, which implies that the relationship between the input parameters and SE is not a simple one. Moreover, the distribution of the eight parameters in the training set and the testing set is presented in Fig. 5, which suggests that the parameter distributions of the training set and the testing set are roughly the same, so the results obtained in this study will be reliable.

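For readers reproducing this analysis, the pairwise Pearson R values of Fig. 4 can be computed directly from the assembled database; the snippet below uses NumPy with placeholder data and hypothetical column names for the eight inputs and SE.

```python
import numpy as np

# columns: sigma_t, sigma_c, theta, d, s, alpha, gamma, beta, SE (names are illustrative)
names = ["sigma_t", "sigma_c", "theta", "d", "s", "alpha", "gamma", "beta", "SE"]
data = np.random.default_rng(3).random((213, 9))   # placeholder for the 213-sample database

R = np.corrcoef(data, rowvar=False)                 # 9 x 9 Pearson correlation matrix
for name, r in zip(names[:-1], R[-1, :-1]):
    print(f"R({name}, SE) = {r:+.2f}")              # last row/column holds each input vs. SE
```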

Fig. 5 Distributions of training set and testing set for SSA-ML models

3.4 Evaluation indicators

Performance evaluation is a key component of ML model design and construction [32, 58, 63, 65]. In this study, the scale-dependent indicator of mean absolute error (MAE) and the scale-independent indicators of mean absolute percentage error (MAPE) and determination coefficient (R²) are adopted to assess the accuracy of the prediction models. The calculation formulas for the evaluation indicators are described as below:

MAE = \dfrac{1}{n} \sum_{i=1}^{n} \left| A_i - P_i \right|    (23)

MAPE = \dfrac{1}{n} \sum_{i=1}^{n} \left| \dfrac{A_i - P_i}{A_i} \right| \times 100\%    (24)

R^2 = 1 - \dfrac{\sum_{i=1}^{n} (A_i - P_i)^2}{\sum_{i=1}^{n} (A_i - \bar{A})^2}    (25)

where A, P and Ā refer to the actual SE values, the predicted SE values and the average of the actual SE values, respectively.
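A direct transcription of Eqs. (23)–(25) into code is given below; this is a NumPy-based sketch rather than the authors' implementation, and the example numbers are arbitrary.

```python
import numpy as np

def mae(a, p):
    """Eq. (23): mean absolute error."""
    a, p = np.asarray(a, float), np.asarray(p, float)
    return np.mean(np.abs(a - p))

def mape(a, p):
    """Eq. (24): mean absolute percentage error, in %."""
    a, p = np.asarray(a, float), np.asarray(p, float)
    return np.mean(np.abs((a - p) / a)) * 100.0

def r2(a, p):
    """Eq. (25): coefficient of determination."""
    a, p = np.asarray(a, float), np.asarray(p, float)
    return 1.0 - np.sum((a - p) ** 2) / np.sum((a - a.mean()) ** 2)

# example: measured vs. predicted SE values (arbitrary numbers)
measured = [5.2, 7.9, 3.4, 10.1]
predicted = [5.0, 8.4, 3.1, 9.6]
print(mae(measured, predicted), mape(measured, predicted), r2(measured, predicted))
```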

Fig. 7 Prediction performance of SSA-ML models

4 Results

4.1 Determination of optimal SSA-ML models

For optimizing the proposed ML algorithms, the SSA was introduced to determine their optimal internal parameters, and the MSE was used as the fitness function for performance evaluation. Therefore, six SSA-ML models with a population of 100 and 500 iterations were established, which is time-saving and sufficient to reach the optimal parameters of the ML algorithms [63, 66]. The iteration process of the six SSA-ML models is presented in Fig. 6, which indicates that the convergent fitness of SSA-ElmanNN and SSA-BPNN is obviously larger than that of the other SSA-ML models. This finding reveals that SSA is relatively unpromising for optimizing the internal parameters of ElmanNN and BPNN, while the other models have bright prospects. Moreover, Table 2 gives the fitness and running time of each model, indicating that the alterable parameters C and σ in the SVM algorithm are perfectly optimized. It is noteworthy that SSA-KELM has the advantage of consuming very little time, whereas SSA-ElmanNN and SSA-BPNN are the opposite. To sum up, the optimal internal parameters of the six ML algorithms are determined by SSA, and their prediction performance is compared in the next section.

Fig. 6 The fitness curve of SSA-ML models

Table 2 Running results of SSA-ML models

Algorithm | Fitness (kWh/m³) | Running time (s)
RF | 0.00615 | 402
SVM | 0.00001 | 116
ELM | 0.03397 | 75
ElmanNN | 1.75339 | 4981
BPNN | 1.75012 | 4084
KELM | 0.03659 | 11

4.2 Prediction performance comparison

The prediction results of the six SSA-ML models are presented in Fig. 7. The findings reveal that the measured and predicted SE values in the training set and testing set are roughly identical for the SSA-RF model (MAE = 0.7938, MAPE = 12.76%, R² = 0.9632 on the training set; MAE = 1.0438, MAPE = 16.98%, R² = 0.8943 on the testing set) and the SSA-KELM model (MAE = 0.3656, MAPE = 6.84%, R² = 0.9913 on the training set; MAE = 1.6763, MAPE = 29.91%, R² = 0.7912 on the testing set), suggesting a distinguished SE prediction ability. It is worth mentioning that the SSA-SVR model (MAE = 0.1334, MAPE = 3.57%, R² = 0.9998 on the training set; MAE = 2.4408, MAPE = 43.96%, R² = 0.5322 on the testing set) performs very well on the training set but unpromisingly on the testing set, which signifies that the SSA-SVR model is severely over-fitted and unsuitable for SE prediction. It can be seen from Fig. 7 that the scatter points are relatively dispersive for the SSA-ELM model (MAE = 1.9338, MAPE = 35.29%, R² = 0.8197 on the training set; MAE = 2.4134, MAPE = 43.44%, R² = 0.6602 on the testing set), the SSA-BPNN model (MAE = 1.2502, MAPE = 32.06%, R² = 0.8907 on the training set; MAE = 2.4704, MAPE = 176.91%, R² = 0.7623 on the testing set), and the SSA-ElmanNN model (MAE = 3.3159, MAPE = 69.15%, R² = 0.5157 on the training set; MAE = 2.9047, MAPE = 62.03%, R² = 0.4765 on the testing set), suggesting that they have little potential for predicting SE.

To further validate and compare the prediction ability of the proposed SSA-ML models, the evaluation indicators of the six hybrid models on the training set and testing set are ranked and shown in Fig. 8. The width of the line connecting an indicator and a model is the score of the model on that indicator. As can be seen from Fig. 8, both the SSA-KELM model and the SSA-RF model scored 30 (the SSA-KELM model scored 15 on the training set and 15 on the testing set; the SSA-RF model scored 12 on the training set and 18 on the testing set), implying excellent potential for SE prediction. In fact, the SSA-KELM model performed slightly better on the training set, while the SSA-RF model performed notably better on the testing set. In addition, a Taylor graph [38, 65], which contains the Pearson correlation coefficient, root mean square deviation and standard deviation between the measured and predicted SE values, is introduced for assessing the prediction performance of the optimal SSA-ML models on the testing set. It can be observed in Fig. 9 that the point representing the SSA-RF model is closest to the reference point, indicating its excellent generalization ability.

Fig. 8 Comprehensive ranking of a training set and b testing set

Fig. 9 Taylor graph of SSA-ML models

To sum up, two SSA-ML models are promising for SE prediction, namely SSA-RF and SSA-KELM. However, given that SSA-RF showed better prediction accuracy for both the training set and the testing set, and no noticeable over-fitting occurred, it is recommended for SE prediction.

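The quantities summarized by the Taylor graph of Fig. 9 (standard deviations, Pearson correlation and centered root-mean-square difference between measured and predicted SE) can be computed as below; this is a generic sketch with arbitrary example numbers, not the plotting code used for the figure.

```python
import numpy as np

def taylor_stats(measured, predicted):
    m, p = np.asarray(measured, float), np.asarray(predicted, float)
    corr = np.corrcoef(m, p)[0, 1]                       # Pearson correlation coefficient
    std_m, std_p = m.std(), p.std()                      # standard deviations
    crmsd = np.sqrt(np.mean(((p - p.mean()) - (m - m.mean())) ** 2))  # centered RMS difference
    return corr, std_m, std_p, crmsd

print(taylor_stats([5.2, 7.9, 3.4, 10.1], [5.0, 8.4, 3.1, 9.6]))
```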

4.3 Parametric investigation

A robust ML model should roughly exhibit smooth functions that describe the correlations between input and output variables, with physical explanations for those correlations [35]. Hence, the correlations between the eight input variables and SE at three representative levels in the dataset are surveyed. Note that the values of seven input variables are fixed at each round in the optimal SSA-RF model, while the remaining variable increases from 0.6t to 1.5t at an interval of 0.1t (with t referring to the original value of the changed variable).

Figure 10 shows the correlations between input and output variables in the optimal SSA-RF model. It should be noted that ideally smooth correlations between input and output variables do not exist in the SSA-RF model, because no smooth correlation occurs in the measured data used to construct the ML models. Previous studies have found that the relation between s/d and SE roughly conforms to a parabola; when s/d is 2 or 3, there is a minimum or optimal SE value [4, 30, 43]. It can be seen from Fig. 10a that, when s = 10, d shows a general downward trend within the value range; when s = 25, d reaches the optimal SE value at about 12; when s = 36, SE reaches the minimum value at d = 11. These results are consistent with previous studies. Note that the predicted SE invariably remains stable when the input variables exceed the range of the measured values, which is a limitation of the RF model. In Fig. 10b, the predicted SE decreases monotonically with the increase in s, approaching the minimum SE value; such a trend is as Balci et al. [4], Park et al. [30] and Tuncdemir et al. [43] have discovered. In Fig. 10c, d and f, the correlations among θ, γ, β and SE show a similar trend, namely that SE decreases with the increase in the input variables. When γ and β are small, there is severe friction between the pick and the rock, so the cutting force increases and the specific energy consumption is large [47, 48]. In Fig. 10e, there is no obvious linear relation between α and SE. In Fig. 10g and h, SE increases with the increase in σc and σt, which agrees with the results obtained in previous studies [3, 4, 9, 13] and by Bilgin et al. [5]. To sum up, the correlations shown in Fig. 10 are in accord with physical explanation, demonstrating that the RF model is a reasonable and robust tool for SE prediction.

4.4 Sensitivity analysis

In the previous section, it was concluded that the SSA-RF model performed best in predicting SE. In the RF algorithm, the importance of each variable can be determined by randomly rearranging its order in the OOB data, and the difference in MSE after and before the rearrangement is deemed the importance value [8]. Obviously, for inessential variables the rearrangement will not have a great impact on the accuracy of the model, but for important variables the rearrangement will reduce the accuracy of the model. The mean increase in MSE can be calculated as below:

InMSE_i = \dfrac{1}{N} \sum_{n=1}^{N} \left( mse_{i1n} - mse_{i2n} \right)    (26)

where N denotes the number of trees; mse_{i1n} and mse_{i2n} refer to the MSE of the OOB data after and before random rearrangement, respectively.

Eventually, the importance of each input variable is computed and shown in Fig. 11. The finding implies that d, σc and σt contribute the most to the SE prediction, which is consistent with the results of Wang et al. [47] and supports the parameters selected for SE and cuttability prediction. Additionally, variable parameters such as the pick geometry (θ) and the pick operation parameters (s, γ, α, β) also have an impact on SE prediction, even if it is small, suggesting that it is promising to reduce the SE value in practical engineering by tuning these variable parameters. In the future, more parameters and data samples will be considered for constructing SE prediction models and for guiding cutting head design and roadheader selection.

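Equation (26) is the familiar permutation-based importance of random forests. A minimal variant is sketched below; it permutes one column of a held-out set rather than the per-tree OOB data, which is a simplifying assumption of this sketch, and it uses placeholder data in place of the SE database.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

def mean_increase_in_mse(model, X, y, n_repeats=10, rng=None):
    """Per-feature importance: mean rise in MSE after permuting that feature (cf. Eq. 26)."""
    rng = rng or np.random.default_rng(0)
    base = mean_squared_error(y, model.predict(X))
    importance = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])       # destroy the information in feature j
            importance[j] += mean_squared_error(y, model.predict(Xp)) - base
    return importance / n_repeats

# usage with placeholder data standing in for the 8-feature SE database
rng = np.random.default_rng(1)
X, y = rng.random((213, 8)), rng.random(213) * 30
rf = RandomForestRegressor(n_estimators=100, random_state=1).fit(X, y)
print(mean_increase_in_mse(rf, X, y))
```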

Fig. 10 Predicted SE using SSA-RF model against: a d; b s; c θ; d γ; e α; f β; g σc; and h σt


Fig. 11 Mean increase in MSE of eight input variables

5 Conclusions

In this paper, six strong ML algorithms, namely BPNN, ElmanNN, ELM, KELM, RF, and SVR, were optimized by SSA for predicting the SE of conical picks. Firstly, a database containing eight input parameters (tensile strength of the rock σt, uniaxial compressive strength of the rock σc, cone angle θ, cutting depth d, tool spacing s, rake angle α, attack angle γ, back-clearance angle β) and one output parameter (SE) was established and used for modeling purposes. The low level of R between the input parameters and SE suggested that a simple linear correlation does not exist. The data distributions of the training set and testing set are almost the same, indicating that the model construction phase is reliable and scientific.

To optimize the internal parameters of the ML algorithms, a novel swarm intelligence optimization approach was adopted, and the fitness function was defined as the MSE. In addition, the novel SSA-ML models were evaluated and examined by the evaluation indicators MAE, MAPE and R². The results demonstrated that the SSA-RF model outperformed the other five models for SE prediction, generating low values of MAE (0.7938 on the training set, 1.0438 on the testing set) and MAPE (12.76% on the training set, 16.98% on the testing set) and a large value of R² (0.9632 on the training set, 0.8943 on the testing set).

Finally, the importance of each variable was determined internally by the RF algorithm. The mean increase in MSE in the RF algorithm was obtained as 19.39 (σt), 16.58 (σc), 1.05 (θ), 24.75 (d), 6.31 (s), 3.63 (α), 2.48 (γ) and 1.54 (β), indicating that σt, σc and d contribute most to SE prediction. In the future, additional working conditions and rock types will be considered to enhance the model generalization ability. More pick operation parameters and rock physical and mechanical properties should be considered to comprehensively evaluate rock cuttability, providing a basis for roadheader machine selection and performance evaluation in hard rock tunneling and reducing cutting energy consumption. In addition, the novel SSA-ML models proposed in this study can be applied in other engineering practices.

Acknowledgements This research was funded by the National Science Foundation of China (42177164), the Distinguished Youth Science Foundation of Hunan Province of China (2022JJ10073), the Innovation-Driven Project of Central South University (No. 2020CX040) and the Fundamental Research Funds for the Central Universities of Central South University (2022ZZTS0480).

Data availability Data will be made available on reasonable request.

Declarations

Conflict of interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

1. Armaghani DJ, Yagiz S, Mohamad ET, Zhou J (2021) Prediction of TBM performance in fresh through weathered granite using empirical and statistical approaches. Tunn Undergr Space Technol 118:104183
2. Armaghani DJ, Harandizadeh H, Momeni E, Maizir H, Zhou J (2022) An optimized system of GMDH-ANFIS predictive model by ICA for estimating pile bearing capacity. Artif Intell Rev 55(3):2313–2350
3. Balci C, Bilgin N (2007) Correlative study of linear small and full-scale rock cutting tests to select mechanized excavation machines. Int J Rock Mech Min Sci 44:468–476
4. Balci C, Demircin MA, Copur H, Tuncdemir H (2004) Estimation of optimum specific energy based on rock properties for assessment of roadheader performance. J South Afr Inst Min Metall 104:633–641
5. Bilgin N, Demircin MA, Copur H, Balci C, Tuncdemir H, Akcin N (2006) Dominant rock properties affecting the performance of conical picks and the comparison of some experimental and theoretical results. Int J Rock Mech Min Sci 43:139–156
6. Breiman L (2001) Random forests. Mach Learn 45:5–32. https://doi.org/10.1023/a:1010933404324
7. Ceryan N, Okkan U, Kesimal A (2012) Application of generalized regression neural networks in predicting the unconfined compressive strength of carbonate rocks. Rock Mech Rock Eng 45:1055–1072. https://doi.org/10.1007/s00603-012-0239-9
8. de Castro Galizia LF, Rodrigues M (2019) Modeling the influence of eucalypt plantation on wildfire occurrence in the Brazilian savanna biome. Forests. https://doi.org/10.3390/f10100844
9. Dogruoz C, Bolukbasi N (2014) Effect of cutting tool blunting on the performances of various mechanical excavators used in low- and medium-strength rocks. Bull Eng Geol Env 73:781–789. https://doi.org/10.1007/s10064-013-0551-y
10. Dogruoz C, Bolukbasi N, Rostami J, Acar C (2016) An experimental study of cutting performances of worn picks. Rock Mech Rock Eng 49:213–224
11. Dursun AE, Kemal Gokay M (2016) Cuttability assessment of selected rocks through different brittleness values. Rock Mech Rock Eng 49:1173–1190
12. Elman JL (1990) Finding structure in time. Cogn Sci 14:179–211
13. Goktan RM, Gunes Yilmaz N (2005) A new methodology for the analysis of the relationship between rock brittleness index and drag pick cutting efficiency. J South Afr Inst Min Metall 105:727–733
14. Huang G-B, Zhou H, Ding X, Zhang R (2012) Extreme learning machine for regression and multiclass classification. IEEE Trans Syst Man Cybern Part B-Cybern 42:513–529. https://doi.org/10.1109/tsmcb.2011.2168604
15. Huang G-B, Zhu Q-Y, Siew C-K (2006) Extreme learning machine: theory and applications. Neurocomputing 70:489–501
16. Hughes HM (1972) Some aspects of rock machining. Int J Rock Mech Mining Sci Geomech Abs 9:205–211
17. Khandelwal M, Singh TN (2009) Prediction of blast-induced ground vibration using artificial neural network. Int J Rock Mech Min Sci 46(7):1214–1222
18. Khandelwal M, Armaghani DJ (2016) Prediction of drillability of rocks with strength properties using a hybrid GA-ANN technique. Geotech Geol Eng 34:605–620. https://doi.org/10.1007/s10706-015-9970-9
19. Kang KX, He B, Wang SJ (2020) Experimental study on the breaking ability of the cutting angle of conical picks. In: MATEC Web of Conferences, 04002. EDP Sciences
20. Li E, Yang F, Ren M, Zhang X, Zhou J, Khandelwal M (2021) Prediction of blasting mean fragment size using support vector regression combined with five optimization algorithms. J Rock Mech Geotech Eng. https://doi.org/10.1016/j.jrmge.2021.07.013
21. Li E, Zhou J, Shi X, Jahed Armaghani D, Yu Z, Chen X, Huang P (2021) Developing a hybrid model of salp swarm algorithm-based support vector machine to predict the strength of fiber-reinforced cemented paste backfill. Eng Comput 37(4):3519–3540

22. Liu B, Wang R, Zhao G, Guo X, Wang Y, Li J, Wang S (2020) Prediction of rock mass parameters in the TBM tunnel based on BP neural network integrated simulated annealing algorithm. Tunn Undergr Space Technol. https://doi.org/10.1016/j.tust.2019.103103
23. Mahdevari S, Shahriar K, Yagiz S, Shirazi MA (2014) A support vector regression model for predicting tunnel boring machine penetration rates. Int J Rock Mech Min Sci 72:214–229. https://doi.org/10.1016/j.ijrmms.2014.09.012
24. Moazenzadeh R, Mohammadi B (2019) Assessment of bio-inspired metaheuristic optimisation algorithms for estimating soil temperature. Geoderma 353:152–171
25. Li C, Zhou J, Tao M, Du K, Wang S, Armaghani DJ, Mohamad ET (2022) Developing hybrid ELM-ALO, ELM-LSO and ELM-SOA models for predicting advance rate of TBM. Transp Geotech 100819
26. Li C, Zhou J, Khandelwal M, Zhang X, Monjezi M, Qiu Y (2022) Six novel hybrid extreme learning machine–swarm intelligence optimization (ELM–SIO) models for predicting backbreak in open-pit blasting. Nat Resour Res 1–23. https://doi.org/10.1007/s11053-022-10082-3
27. Ozturk CA, Nasuf E, Bilgin N (2004) The assessment of rock cutability, and physical and mechanical rock properties from a texture coefficient. J S Afr Inst Min Metall 104:397–402
28. Pan Y, Liu Q, Kong X, Liu J, Peng X, Liu Q (2019) Full-scale linear cutting test in Chongqing Sandstone and the comparison with field TBM excavation performance. Acta Geotech 14(4):1249–1268
29. Pan Y, Liu Q, Liu Q, Liu J, Peng X, Huang X, Wei M (2020) Full-scale linear cutting tests to check and modify a widely used semi-theoretical model for disc cutter cutting force prediction. Acta Geotech 15(6):1481–1500
30. Park J-Y, Kang H, Lee J-W, Kim J-H, Oh J-Y, Cho J-W, Rostami J, Kim HD (2018) A study on rock cutting efficiency and structural stability of a point attack pick cutter by lab-scale linear cutting machine testing and finite element analysis. Int J Rock Mech Min Sci 103:215–229
31. Polat C (2015) Roadheader performance prediction using portable linear cutting machine. PhD Thesis, Fen Bilimleri Enstitüsü, Turkey
32. Qiu Y, Zhou J, Khandelwal M, Yang H, Yang P, Li C (2021) Performance evaluation of hybrid WOA-XGBoost, GWO-XGBoost and BO-XGBoost models to predict blast-induced ground vibration. https://doi.org/10.1007/s00366-021-01393-9
33. Rostami K, Hamidi JK, Nejati HR (2020) Use of rock microscale properties for introducing a cuttability index in rock cutting with a chisel pick. Arab J Geosci 13(18):1–12
34. Shang L, Nguyen H, Bui XN, Vu TH, Costache R, Hanh LTM (2022) Toward state-of-the-art techniques in predicting and controlling slope stability in open-pit mines based on limit equilibrium analysis, radial basis function neural network, and brainstorm optimization. Acta Geotech 17(4):1295–1314
35. Shahin MA, Maier HR, Jaksa MB (2005) Investigation into the robustness of artificial neural networks for a case study in civil engineering. In: Zerger A, Argent RM (eds) MODSIM 2005 International Congress on Modelling and Simulation. Citeseer, New Jersey, pp 79–83
36. Singhal RK (2014) Mechanical excavation in mining and civil industries. Taylor & Francis, Oxford
37. Soualhi A, Medjaher K, Zerhouni N (2015) Bearing health monitoring based on Hilbert–Huang transform, support vector machine, and regression. IEEE Trans Instrum Meas 64:52–62. https://doi.org/10.1109/tim.2014.2330494
38. Taylor KE (2001) Summarizing multiple aspects of model performance in a single diagram. J Geophys Res 106:7183–7192
39. Tiryaki B, Cagatay Dikmen A (2006) Effects of rock properties on specific cutting energy in linear cutting of sandstones by picks. Rock Mech Rock Eng 39:89–120. https://doi.org/10.1007/s00603-005-0062-7
40. Tiryaki B (2008) Application of artificial neural networks for predicting the cuttability of rocks by drag tools. Tunn Undergr Space Technol 23(3):273–280
41. Tiryaki B (2009) Estimating rock cuttability using regression trees and artificial neural networks. Rock Mech Rock Eng 42:939–946
42. Tumac D, Bilgin N, Feridunoglu C, Ergin H (2007) Estimation of rock cuttability from shore hardness and compressive strength properties. Rock Mech Rock Eng 40:477–490. https://doi.org/10.1007/s00603-006-0108-5
43. Tuncdemir H, Bilgin N, Copur H, Balci C (2008) Control of rock cutting efficiency by muck size. Int J Rock Mech Min Sci 45:278–288. https://doi.org/10.1016/j.ijrmms.2007.04.010
44. Wang SM, Zhou J, Li CQ, Armaghani DJ, Li XB, Mitri HS (2021) Rockburst prediction in hard rock mines developing bagging and boosting tree-based ensemble techniques. J Central South Univ 28(2):527–542
45. Wang X, Liang Y, Wang Q, Zhang Z (2017) Empirical models for tool forces prediction of drag-typed picks based on principal component regression and ridge regression methods. Tunn Undergr Space Technol 62:75–95
46. Wang X, Su O, Wang Q-F, Liang Y-P (2017) Effect of cutting depth and line spacing on the cuttability behavior of sandstones by conical picks. Arab J Geosci. https://doi.org/10.1007/s12517-017-3307-3
47. Wang X, Wang Q-F, Liang Y-P, Su O, Yang L (2018) Dominant cutting parameters affecting the specific energy of selected sandstones when using conical picks and the development of empirical prediction models. Rock Mech Rock Eng 51:3111–3128
48. Wang X, Wang Q, Liang Y (2018) Effects of cutting parameters affecting on specific cutting energy of conical picks. J China Coal Soc 2:563–570
49. Wang X, Su O (2019) Specific energy analysis of rock cutting based on fracture mechanics: a case study using a conical pick on sandstone. Eng Fract Mech 213:197–205
50. Wicaksana Y (2020) Prediction of rock cutting performance and abrasiveness considering dynamic properties at intermediate strain rate. PhD Thesis, Seoul National University
51. Xue JK, Shen B (2020) A novel swarm intelligence optimization approach: sparrow search algorithm. Syst Sci Control Eng 8:22–34. https://doi.org/10.1080/21642583.2019.1708830
52. Xue Y, Bai C, Qiu D, Kong F, Li Z (2020) Predicting rockburst with database using particle swarm optimization and extreme learning machine. Tunn Undergr Space Technol 98:103287
53. Yagiz S, Karahan H (2011) Prediction of hard rock TBM penetration rate using particle swarm optimization. Int J Rock Mech Min Sci 48(3):427–433
54. Yagiz S, Sezer EA, Gokceoglu C (2012) Artificial neural networks and nonlinear regression techniques to assess the influence of slake durability cycles on the prediction of uniaxial compressive strength and modulus of elasticity for carbonate rocks. Int J Numer Anal Meth Geomech 36(14):1636–1650
55. Yang J, Yagiz S, Liu YJ, Laouafa F (2021) A comprehensive evaluation of machine learning algorithms on application to predict TBM performance. Undergr Space. https://doi.org/10.1016/j.undsp.2021.04.003


56. Yasar S (2019) Determination of optimum rock cutting data through single pick cutting tests. Geotechnique Letters 9:8–14. https://doi.org/10.1680/jgele.18.00124
57. Yilmaz N, Gunes DT, Goktan RM (2015) Rock cuttability assessment using the concept of hybrid dynamic hardness (HDH). Bull Eng Geol Env 74:1363–1374
58. Yu Z, Shi X, Miao X, Zhou J, Khandelwal M, Chen X, Qiu Y (2021) Intelligent modeling of blast-induced rock movement prediction using dimensional analysis and optimized artificial neural network technique. Int J Rock Mech Min Sci 143:104794
59. Yurdakul M, Gopalakrishnan K, Akdas H (2014) Prediction of specific cutting energy in natural stone cutting processes using the neuro-fuzzy methodology. Int J Rock Mech Min Sci 67:127–135. https://doi.org/10.1016/j.ijrmms.2014.01.015
60. Zhang WG, Li HR, Wu CZ, Li YQ, Liu ZQ, Liu HL (2021) Soft computing approach for prediction of surface settlement induced by earth pressure balance shield tunneling. Undergr Space 6(4):353–363
61. Zhang W, Li H, Li Y, Liu H, Chen Y, Ding X (2021) Application of deep learning algorithms in geotechnical engineering: a short critical review. Artif Intell Rev. https://doi.org/10.1007/s10462-021-09967-1
62. Zhang W, Zhang R, Wu C, Goh ATC, Lacasse S, Liu Z, Liu H (2020) State-of-the-art review of soft computing applications in underground excavations. Geosci Front 11(4):1095–1106
63. Zhou J, Qiu Y, Armaghani DJ, Zhang W, Li C, Zhu S, Tarinejad R (2021) Predicting TBM penetration rate in hard rock condition: a comparative study among six XGB-based metaheuristic techniques. Geosci Front. https://doi.org/10.1016/j.gsf.2020.09.020
64. Zhou J, Dai Y, Khandelwal M, Monjezi M, Yu Z, Qiu Y (2021) Performance of hybrid SCA-RF and HHO-RF models for predicting backbreak in open-pit mine blasting operations. Nat Resour Res 30:4753–4771
65. Zhou J, Qiu Y, Khandelwal M, Zhu S, Zhang X (2021) Developing a hybrid model of Jaya algorithm-based extreme gradient boosting machine to estimate blast-induced ground vibrations. Int J Rock Mech Min Sci 145:104856
66. Zhou J, Qiu Y, Zhu S, Armaghani DJ, Li C, Nguyen H, Yagiz S (2021) Optimization of support vector machine through the use of metaheuristic algorithms in forecasting TBM advance rate. Eng Appl Artif Intell 97:104015
67. Zhou J, Dai Y, Du K, Khandelwal M, Li C, Qiu Y (2022) COSMA-RF: new intelligent model based on chaos optimized slime mould algorithm and random forest for estimating the peak cutting force of conical picks. Transp Geotech 36:100806
68. Zhou J, Huang S, Qiu YG (2022) Optimization of random forest through the use of MVO, GWO and MFO in evaluating the stability of underground entry-type excavations. Tunn Undergr Space Technol 124:104494
69. Zhou J, Shen X, Qiu Y, Shi X, Khandelwal M (2022) Cross-correlation stacking-based microseismic source location using three metaheuristic optimization algorithms. Tunn Undergr Space Technol 126:104570
70. Zhou J, Huang S, Zhou T, Armaghani DJ, Qiu YG (2022) Employing a genetic algorithm and grey wolf optimizer for optimizing RF models to evaluate soil liquefaction potential. Artif Intell Rev. https://doi.org/10.1007/s10462-022-10140-5

Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
