
sensors

Article
Predicting Blast-Induced Ground Vibration in
Open-Pit Mines Using Vibration Sensors and Support
Vector Regression-Based Optimization Algorithms
Hoang Nguyen 1 , Yosoon Choi 2, * , Xuan-Nam Bui 3,4 and Trung Nguyen-Thoi 5,6, *
1 Institute of Research and Development, Duy Tan University, Da Nang 550000, Vietnam;
nguyenhoang23@duytan.edu.vn
2 Department of Energy Resources Engineering, Pukyong National University, Busan 48513, Korea
3 Department of Surface Mining, Mining Faculty, Hanoi University of Mining and Geology, 18 Pho Vien,
Duc Thang ward, Bac Tu Liem district, Hanoi 100000, Vietnam; buixuannam@humg.edu.vn
4 Center for Mining, Electro-Mechanical Research, Hanoi University of Mining and Geology, 18 Pho Vien,
Duc Thang ward, Bac Tu Liem district, Hanoi 100000, Vietnam
5 Division of Computational Mathematics and Engineering, Institute for Computational Science,
Ton Duc Thang University, Ho Chi Minh City 700000, Vietnam
6 Faculty of Civil Engineering, Ton Duc Thang University, Ho Chi Minh City 700000, Vietnam
* Correspondence: energy@pknu.ac.kr (Y.C.); nguyenthoitrung@tdtu.edu.vn (T.N.-T.)

Received: 4 November 2019; Accepted: 20 December 2019; Published: 24 December 2019 

Abstract: In this study, vibration sensors were used to measure blast-induced ground vibration (PPV).
Different evolutionary algorithms were assessed for predicting PPV, including the particle swarm
optimization (PSO) algorithm, genetic algorithm (GA), imperialist competitive algorithm (ICA), and
artificial bee colony (ABC). These evolutionary algorithms were used to optimize the support vector
regression (SVR) model. They were abbreviated as the PSO-SVR, GA-SVR, ICA-SVR, and ABC-SVR
models. For each evolutionary algorithm, three forms of kernel function, linear (L), radial basis
function (RBF), and polynomial (P), were investigated and developed. In total, 12 new hybrid models
were developed for predicting PPV in this study, named ABC-SVR-P, ABC-SVR-L, ABC-SVR-RBF,
PSO-SVR-P, PSO-SVR-L, PSO-SVR-RBF, ICA-SVR-P, ICA-SVR-L, ICA-SVR-RBF, GA-SVR-P, GA-SVR-L
and GA-SVR-RBF. There were 125 blasting results gathered and analyzed at a limestone quarry
in Vietnam. Statistical criteria like R2 , RMSE, and MAE were used to compare and evaluate the
developed models. Ranking and color intensity methods were also applied to enable a more complete
evaluation. The results revealed that GA was the most dominant evolutionary algorithm for the
current problem when combined with the SVR model. The RBF was confirmed as the best kernel
function for the GA-SVR model. The GA-SVR-RBF model was proposed as the best technique for
PPV estimation.

Keywords: peak particle velocity; vibration sensor; soft computing; evolutionary algorithm; hybrid
model; open-pit mine

1. Introduction
Construction materials and energy are in great demand in every country, especially developing
ones. As a result of this demand, opencast mines and quarries are flourishing and displaying high
levels of productivity to meet market requirements. Owing to this, ultimate pit boundaries are
reached quickly, and this significantly affects neighboring residential areas and surrounding structures.
Among the activities conducted in opencast mines, blasting is a necessary step which leads to serious
environmental impacts, such as air and ground vibrations, fly-rock, noise pollution, back-break, and release of gases [1]. Of these harmful effects, ground vibration is considered the most dangerous: it can cause vibration of buildings, instability of slopes and benches, and distress among residents living around the mine.
To measure the ground vibration induced by blasting operations in open-pit mines, the peak particle velocity (PPV) is considered a standard criterion for evaluating the intensity of ground vibration. More than 80% of the energy of explosives is released by generating ground vibrations that spread through the rock and soil to surrounding structures [2,3]. With a large oscillation amplitude over a short period, blast-induced PPV significantly affects these surrounding structures. More specifically, when blast-induced PPV oscillations coincide with the natural vibrations of a building, resonance occurs, which can cause substantial structural damage, collapse or cracking, and instability of benches and slopes in opencast mines [4,5]. Many households around blasting areas are disturbed by the effects of blast-induced ground vibrations. Complaints and litigation related to blast-induced PPV are serious issues, causing many opencast mines to stop production. Therefore, accurate prediction of blast-induced PPV and control of the harmful effects caused by blasting operations are significant challenges for open-pit mines. Precise forecast models of blast-induced PPV are necessary to minimize unwanted damage to the surrounding environment.
In this regard, artificial intelligence (AI) applications are considered useful, not only as robust techniques in the mining field but also in many other areas (e.g., civil engineering, fuel and energy, and the environment) [2,4,6–24]. An overview of the literature related to PPV prediction shows that many AI models have been developed and proposed, as listed in Table 1. The reports in Table 1 indicate that the artificial neural network (ANN) is the most common approach for predicting ground vibration. Enhanced training algorithms for the ANN (e.g., the Levenberg–Marquardt algorithm) have also been applied to improve the accuracy of ANN models in predicting ground vibration. In addition, hybrids of the ANN and optimization algorithms (e.g., particle swarm optimization (PSO), the imperialist competitive algorithm (ICA), and the artificial bee colony (ABC) algorithm) have been studied and proposed, aiming to provide high reliability in predicting blast-induced ground vibration. Although the models proposed by previous researchers performed well, they have not been applied in other areas, regions, or countries. Furthermore, new hybrid models with better accuracy are always the goal of researchers, especially approaches based on benchmark models. Among these, the support vector machine for regression (SVR) is considered one of the common benchmark models and has been applied in many fields [25–31]. Our review of the related works showed that hybrid models based on SVR and evolutionary algorithms are rare; the SVR model does not seem to have been optimized by evolutionary algorithms for predicting PPV. Only the PSO-SVR model has been developed for a blasting problem, although its target is air over-pressure [32]. Moreover, kernel functions have been shown to significantly affect the performance of SVR models, as reviewed by Nguyen [33]; however, they have not been considered and evaluated in combination with evolutionary algorithms. Therefore, this study aims to assess the overall performance of different evolutionary algorithms, i.e., the PSO algorithm, GA, ICA, and ABC, when combined with the SVR model using linear (L), radial basis function (RBF), and polynomial (P) kernel functions. Twelve new hybrid models were developed, named the ABC-SVR-P, ABC-SVR-L, ABC-SVR-RBF, PSO-SVR-P, PSO-SVR-L, PSO-SVR-RBF, ICA-SVR-P, ICA-SVR-L, ICA-SVR-RBF, GA-SVR-P, GA-SVR-L, and GA-SVR-RBF models. A comprehensive comparison and assessment of these models is presented in this study.

Table 1. Some works and their results in predicting blast-induced peak particle velocity (PPV).

Reference Method Results


Khandelwal, Singh [34] ANN R2 = 0.986; MAE = 0.196
Khandelwal et al. [35] SVM R2 = 0.955; MAE = 0.226
Saadat et al. [36] ANN-LM R2 = 0.957; MSE = 0.000722
Hajihassani et al. [37] ICA-ANN R2 = 0.976
Hajihassani et al. [38] PSO-ANN R2 = 0.89; MSE = 0.038
Amiri et al. [39] ANN-KNN R2 = 0.88; RMSE = 0.54; VAF = 87.84
Hasanipanah et al. [40] CART R2 = 0.95; RMSE = 0.17; NS = 0.948
Hasanipanah et al. [41] PSO-power R2 = 0.938; RMSE = 0.24; VARE = 0.13; NS = 0.94
Taheri et al. [42] ABC-ANN R2 = 0.92; RMSE = 0.22; MAPE = 4.26
Faradonbeh, Monjezi [43] GEP-COA R2 = 0.874; RMSE = 6.732; MAE = 5.164
Behzadafshar et al. [44] ICA-linear R2 = 0.939; RMSE = 0.320; VAF = 92.18%; MBE = 0.22; MAPE = 0.038
Tian et al. [45] GA-power R2 = 0.977; RMSE = 0.285
Hasanipanah et al. [46] FS-ICA R2 = 0.942; RMSE = 0.22; VAF = 94.2%
Nguyen et al. [12] HKM-ANN R2 = 0.983; RMSE = 0.554; VAF = 97.488%
Nguyen et al. [11] HKM-CA R2 = 0.995; RMSE = 0.475; MAE = 0.373
Zhang et al. [8] PSO-XGBoost R2 = 0.968; RMSE = 0.583; MAE = 0.346, VAF = 96.083

2. Proposing the Framework of SVR-Based Evolutionary Algorithms

As introduced above, this study aims to investigate and develop several hybrid models based on SVR and evolutionary algorithms, i.e., PSO, ICA, GA, and ABC. Since the details of PSO, ICA, GA, ABC, and SVR have been introduced in many previous papers [47–60], they are not repeated in this study. Instead, this study focuses on proposing a novel hybrid framework of evolutionary algorithms (EAs) and SVR, called the EAs-SVR framework. For the development of the SVR model herein, three forms of kernel function were applied: the linear function (L), the polynomial function (P), and the radial basis function (RBF). They can be formulated as follows:
- Linear:
G(x, xi ) = x · xi (1)

- Polynomial:
G(x, xi ) = [(x · xi ) + 1]d ; d = (1, 2, . . .) (2)

- Radial basis function:

G(x, xi ) = exp(−‖x − xi‖2 / σ2 ) (3)
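As an illustration of Equations (1)–(3), the short sketch below (written in Python with NumPy; the function and variable names are ours, not part of the original study) evaluates the three kernels for a pair of sample input vectors:

```python
import numpy as np

def linear_kernel(x, xi):
    # Equation (1): G(x, xi) = x . xi
    return np.dot(x, xi)

def polynomial_kernel(x, xi, d=2):
    # Equation (2): G(x, xi) = [(x . xi) + 1]^d, with d = 1, 2, ...
    return (np.dot(x, xi) + 1.0) ** d

def rbf_kernel(x, xi, sigma=1.0):
    # Equation (3): G(x, xi) = exp(-||x - xi||^2 / sigma^2)
    return np.exp(-np.linalg.norm(x - xi) ** 2 / sigma ** 2)

# Example with two 4-dimensional input vectors (e.g., scaled W, R, B, S values)
x = np.array([0.2, -0.5, 0.1, 0.3])
xi = np.array([0.1, -0.4, 0.0, 0.2])
print(linear_kernel(x, xi), polynomial_kernel(x, xi), rbf_kernel(x, xi, sigma=0.5))
```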
For each kernel function, one or more hyper-parameters are adjusted to obtain the highest performance of the models. In this study, the hyper-parameters of the SVR models were searched and selected by the EAs (i.e., PSO, ICA, ABC, and GA) with the aim of establishing optimal performance. In this part, the framework of the EAs-SVR models is proposed and described. As stated above, PSO, ICA, ABC, and GA were applied to optimize the SVR parameters, so the EAs-SVR models consist of the PSO-SVR, ABC-SVR, ICA-SVR, and GA-SVR models. During the development of the EAs-SVR models, the L, P, and RBF kernel functions were applied. The SVR parameters associated with the different kernel functions are shown in Table 2. Based on the initial parameters in Table 2, the ABC, ICA, PSO, and GA algorithms perform a search procedure for the optimal values of the SVR parameters. The root-mean-squared error (RMSE) was used as the fitness function for all EAs-SVR models during their development, calculated using Equation (4). For each best hybrid model obtained, the RMSE value is the lowest achievable with the corresponding SVR parameters. The optimization loop is repeated to find the optimal hyper-parameters, and eventually the final EAs-SVR models (i.e., ABC-SVR, ICA-SVR, PSO-SVR, and GA-SVR) are defined. The framework of the EAs-SVR models for predicting PPV is shown in Figure 1.
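To make the workflow of Figure 1 concrete, the sketch below wires a deliberately small genetic algorithm to an RBF-SVR whose fitness is the cross-validated RMSE of Equation (4). This is our own minimal reconstruction, not the authors' implementation: the crossover and mutation probabilities and the population size follow the settings reported later in Section 5.4, the number of generations and the search bounds are arbitrary, and scikit-learn's gamma = 1/σ² is assumed as the mapping of the σ in Equation (3).

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(42)
BOUNDS = np.array([[0.1, 100.0],   # C
                   [0.01, 10.0]])  # sigma

def fitness(ind, X, y):
    """RMSE fitness (Equation (4)) of one candidate (C, sigma), via 10-fold CV."""
    C, sigma = ind
    model = SVR(kernel="rbf", C=C, gamma=1.0 / sigma ** 2)
    cv = KFold(n_splits=10, shuffle=True, random_state=0)
    score = cross_val_score(model, X, y, cv=cv, scoring="neg_root_mean_squared_error")
    return -score.mean()

def ga_svr(X, y, pop_size=100, generations=50, pc=0.9, pm=0.1):
    """Tiny GA: tournament selection, arithmetic crossover, uniform mutation."""
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, 2))
    fit = np.array([fitness(ind, X, y) for ind in pop])
    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            # Tournament selection of two parents (lower RMSE wins).
            i, j, k, l = rng.integers(0, pop_size, 4)
            p1 = pop[i] if fit[i] < fit[j] else pop[j]
            p2 = pop[k] if fit[k] < fit[l] else pop[l]
            child = p1.copy()
            if rng.random() < pc:                 # arithmetic crossover
                a = rng.random()
                child = a * p1 + (1 - a) * p2
            if rng.random() < pm:                 # uniform mutation of one gene
                g = rng.integers(0, 2)
                child[g] = rng.uniform(BOUNDS[g, 0], BOUNDS[g, 1])
            children.append(np.clip(child, BOUNDS[:, 0], BOUNDS[:, 1]))
        pop = np.array(children)
        fit = np.array([fitness(ind, X, y) for ind in pop])
    best = np.argmin(fit)
    return pop[best], fit[best]   # best (C, sigma) and its cross-validated RMSE
```

The other EAs-SVR variants (ABC-SVR, ICA-SVR, PSO-SVR) follow the same pattern; only the proposal mechanism that generates new candidate hyper-parameter vectors changes.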

Figure 1. Framework of the support vector regression (SVR)-optimized by the four evolutionary algorithms for predicting peak particle velocity (PPV).


Table 2. Support vector regression (SVR) parameters based on the kernel functions used.

Kernel Function C µ κ σ
L x - - -
P x x x -
RBF x - - x
3. Statistical Criteria

For evaluating the quality, as well as the reliability level, of the mentioned models, three indicators of performance were used, including RMSE, MAE, and R2:

RMSE = sqrt( (1/n) Σi=1..n (yi − ŷi)^2 ) (4)

R2 = 1 − [ Σi (yi − ŷi)^2 ] / [ Σi (yi − ȳ)^2 ] (5)

MAE = (1/n) Σi=1..n |yi − ŷi| (6)

where n is the total number of data, yi is the measured PPV, ŷi is the predicted PPV, and ȳ is the mean of the measured PPVs.
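The three criteria of Equations (4)–(6) can be computed directly; the helper functions below (our own illustration with NumPy) mirror those definitions:

```python
import numpy as np

def rmse(y_true, y_pred):
    # Equation (4): root-mean-squared error
    return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

def r2(y_true, y_pred):
    # Equation (5): coefficient of determination
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def mae(y_true, y_pred):
    # Equation (6): mean absolute error
    return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))
```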
4. Vibration Sensors and Experimental Datasets

A quarry in Vietnam was selected as a typical case. It is located within latitudes 20°25′40″ N–20°26′20″ N and longitudes 105°53′10″ E–105°54′00″ E (Figure 2). The area of the mine is ~0.86 km2, with a production of ~6 million tons/year. The geological conditions of this site were presented in [11]. Figure 3 shows the geological structure of the study site.

Figure 2. View of the study site.

Figure 3. Geological structure of the study site.
For measuring the intensity of vibration at this mine, Micromate geophone sensors were utilized. The sensor's structure consists of two parts: a magnet and a coil of wire; the magnet is suspended by the wire coil, as shown in Figure 4. A blast-induced ground vibration reading is shown in Figure 5. In total, 125 PPV events were recorded by a Micromate vibration sensor (Instatel, Canada). Figure 6 illustrates the process of data collection and the vibration sensor used in this study.

Figure 4. Structure of the geophone sensor for measuring vibration.

Figure 5. Vibration sensor and a result of the blast-induced ground vibration. (a) Micromate instrument (Instatel, Canada); (b) a result of vibration in an open-pit mine.

Figure 6. Data collection and a result of the PPV.

A review of the literature showed that many factors affect PPV; however, they can be divided into two main groups: controllable parameters and uncontrollable parameters [61]. The controllable parameters are widely used by scientists for predicting PPV owing to the ability to collect the data directly and accurately; they include W (explosive charge per delay), R (monitoring distance), T (stemming), P (powder factor), the number of boreholes, B (burden), H (bench height), S (spacing), and time delay [4]. In this study, four controllable parameters were used as the input variables, namely W, R, B, and S, whereas PPV was considered the output. A hand-held GPS receiver (X91B) was used to measure R, as shown in Figure 7; it can achieve centimeter-level accuracy based on landmark coordinates built and adjusted locally [2]. The other parameters (W, B, S) were exported from 125 blast patterns. Table 3 summarizes the database used in this research. The structure and distribution of the data are illustrated through the histograms of the dataset shown in Figure 8.
Table 3. Summary of the PPV database in this study.

Parameter Min. Mean Max. Standard Deviation
W 39.200 54.620 77.900 6.846
R 100.000 202.800 380.000 55.751
B 2.400 3.312 4.500 0.437
S 3.000 3.302 3.600 0.208
PPV 0.300 4.804 15.170 2.928
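The summary statistics in Table 3 can be reproduced from the raw blasting records; a minimal sketch is given below, assuming (as our own illustration, since the file layout is not published with the paper) that the 125 records are available as a CSV file with columns W, R, B, S, and PPV:

```python
import pandas as pd

# Hypothetical file name; the actual data source is not distributed with the paper.
df = pd.read_csv("blasting_records.csv")  # columns: W, R, B, S, PPV

summary = df[["W", "R", "B", "S", "PPV"]].agg(["min", "mean", "max", "std"]).T
summary.columns = ["Min.", "Mean", "Max.", "Standard Deviation"]
print(summary.round(3))
```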
Figure 7. X91B GPS receiver used for measuring the distance in this study.

Figure 8. Histograms of the dataset used in this study.
5. Results and Discussion

Before developing the models, the dataset needed to be divided into two phases: a training phase and a testing phase. Accordingly, 80% of the whole dataset (101 blasting operations) was selected randomly as the training set for the models' development. The remaining 20% (24 blasting operations) was used to check the accuracy, as well as the reliability, of the models developed based on the training dataset. To avoid over-fitting or under-fitting of the models, the dataset was scaled into the range of [−1, 1]. Besides, the Box-Cox transformation method [62] and the 10-fold cross-validation technique [63] were also applied to transform the data and improve the accuracy of the models.
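A minimal sketch of this data preparation, using scikit-learn and SciPy, is given below. It is our illustration only: the split seed and the exact scaling order are not reported in the paper, and placeholder arrays (drawn from the ranges of Table 3) stand in for the real records.

```python
import numpy as np
from scipy.stats import boxcox
from sklearn.model_selection import train_test_split, KFold
from sklearn.preprocessing import MinMaxScaler

# Placeholder arrays standing in for the 125 records (W, R, B, S -> PPV).
rng = np.random.default_rng(0)
X = rng.uniform([39.2, 100.0, 2.4, 3.0], [77.9, 380.0, 4.5, 3.6], size=(125, 4))
y = rng.uniform(0.3, 15.17, size=125)

# 80/20 random split into training and testing phases.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Scale inputs into [-1, 1]; the scaler is fitted on the training data only.
scaler = MinMaxScaler(feature_range=(-1, 1)).fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

# Box-Cox transform of the strictly positive target, and a 10-fold CV splitter.
y_train_bc, lam = boxcox(y_train)
cv = KFold(n_splits=10, shuffle=True, random_state=42)
```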
5.1. ABC-SVR Models

For the ABC-SVR models, the training dataset was used to develop the PPV predictive models. Three kernel functions were involved in developing the ABC-SVR models with various hyper-parameters, as shown in Table 2. The ABC algorithm implemented a global search procedure to define the optimal values of the SVR models through the framework in Figure 1. In the ABC algorithm, a trial-and-error procedure was conducted for the number of bees, which was set equal to 100, 200, 300, 400, and 500, respectively; RMSE was used as the fitness function to be minimized; the number of food sources to exploit was set to 50 and the limit of a food source to 100; and the boundary of the parameters to be optimized was set as [−10, 10]. The optimization process was repeated 1000 times to find the optimal RMSE value. After setting the parameters of the ABC algorithm, the bees search for the optimal values of the SVR models with the corresponding parameters. The performance of the optimization process is illustrated in Figures 9–11. The optimal results for the ABC-SVR models are shown in Table 4.

Figure 9. RMSE of the proposed artificial bee colony (ABC)-SVR-L model.

Figure 10. RMSE of the proposed ABC-SVR-P model.

Figure 11. RMSE of the proposed ABC-SVR-RBF model.
Table 4. Optimal values of the artificial bee colony (ABC)-SVR model for predicting PPV.

Model C µ κ σ
ABC-SVR-L 0.544 - - -
ABC-SVR-P 0.146 0.729 2 -
ABC-SVR-RBF 70.067 - - 0.016

5.2. PSO-SVR Models
As in the ABC-SVR models, an initial SVR model was generated with different kernel functions. Subsequently, the SVR parameters were searched and optimized by the PSO algorithm. The best values of the SVR parameters (after discovery by the PSO algorithm) were evaluated through the RMSE, as for the ABC-SVR models; the lowest RMSE identifies the optimal PSO-SVR model. To employ the global search procedure of the PSO algorithm, its parameters needed to be set up first, including:
- The number of particles in the swarm (p);
- The maximum particle velocity (Vmax);
- The individual cognitive coefficient (φ1);
- The group cognitive coefficient (φ2);
- The inertia weight (w);
- The maximum number of iterations (mi).
For this study, the PSO parameters were set as follows: p = 100, 150, 200, 250, 300; Vmax = 1.9; φ1 = φ2 = 1.6; w = 1.8; and mi = 1000. Once the PSO parameters were established, the particles searched in a constrained space to find the best position (i.e., the lowest RMSE). From the lowest RMSE, the optimal PSO-SVR model was defined, and its hyper-parameters were extracted to build the optimal model. Figures 12–14 show the performance of the PSO-SVR models while searching for the optimal values. Table 5 lists the optimal parameters of the PSO-SVR models.
Figure 12. RMSE of the proposed particle swarm optimization (PSO)-SVR-L model.

Figure 13. RMSE of the proposed PSO-SVR-P model.

Figure 14. RMSE of the proposed PSO-SVR-RBF model.

Table 5. Optimal parameters of the PSO-SVR models.

Model C µ κ σ
PSO-SVR-L 0.119 - - -
PSO-SVR-P 4.995 0.022 2 -
PSO-SVR-RBF 40.901 - - 0.036
5.3. ICA-SVR Models

For optimization of the SVR model by the ICA, the ICA parameters were established as the first step. A trial-and-error procedure for the number of initial countries (Ncountry) and initial imperialists (Nimper) was applied, with Ncountry set equal to 100, 150, 200, 250, and 300, respectively, and Nimper set equal to 10, 20, and 30, respectively. The maximum number of iterations (Ni) was set equal to 1000 to ensure the stability of the model; the lower and upper limits of the optimization region (L) were placed in the range of −10 to 10; the assimilation coefficient (As) was set equal to 3; and the revolution rate of each country (r) was set equal to 0.5. After the parameters of the ICA were established, the empires perform a competition to find the most substantial empire, for which the SVR parameters are optimized. The performance of the ICA-SVR models is shown in Figures 15–17. Finally, the ICA-SVR parameters are listed in Table 6.

Figure 15. RMSE of the proposed imperialist competitive algorithm (ICA)-SVR-L model.

Figure 16. RMSE of the proposed ICA-SVR-P model.

Figure 17. RMSE of the proposed ICA-SVR-RBF model.

Table 6. Obtained values of the SVR models by imperialist competitive algorithm (ICA) optimization.

Model C µ κ σ
ICA-SVR-L 0.101 - - -
ICA-SVR-P 128.596 0.002 3 -
ICA-SVR-RBF 2.461 - - 0.079
5.4. GA-SVR Models

Concerning the GA-SVR models, similar steps to those conducted for the previous models were applied. An initial SVR model with the kernel functions was established as the first step; then, the parameters of the GA were set up as the second step (e.g., the mutation probability (Pm), the crossover probability (Pc), the number of variables (n), and the population size (p)). In this study, Pm was set equal to 0.1; Pc was set equal to 0.9; n = 4; and p was set equal to 100, 150, 200, 250, and 300, respectively. RMSE was used as the fitness function according to Equation (4). The optimization was repeated for a maximum of 1000 iterations to ensure finding the best values of the GA-SVR models. Figures 18–20 report the performance of the GA-SVR models through the RMSE. Eventually, the optimal GA-SVR parameters were extracted, as listed in Table 7.
Figure 18. RMSE of the proposed genetic algorithm (GA)-SVR-L model.

Figure 19. RMSE of the proposed GA-SVR-P model.

Figure 20. RMSE of the proposed GA-SVR-RBF model.
Table 7. Genetic algorithm (GA)-SVR models with the optimal parameters.

Model C µ κ σ
GA-SVR-L 0.178 - - -
GA-SVR-P 12.730 0.018 2 -
GA-SVR-RBF 7.938 - - 0.030

5.5. Evaluating the Developed Models

Once the optimal parameters of the 12 hybrid models were achieved, the 24 blasting events in the testing dataset were used to confirm the developed models' accuracy. Table 8 presents the results of the 12 hybrid models on both sets of data. Observing the statistical criteria on both the training and testing datasets, the performance of the proposed hybrid models was very good. The developed models seem to be very stable, and the differences between the training dataset and the testing dataset are limited. However, if one only looks at the numbers in Table 8, it is difficult to evaluate which model is the best in the present study. Therefore, a ranking method and color intensity were used for assessing the efficiency of the developed models, where the green color represents the highest performance and the white color represents the lowest performance. The highlighting of the GA-SVR-RBF model indicated that it was the best model. Next are the ABC-SVR-RBF, ABC-SVR-P, ICA-SVR-RBF, PSO-SVR-RBF, GA-SVR-P, PSO-SVR-P, ICA-SVR-P, GA-SVR-L, ABC-SVR-L, PSO-SVR-L, and the last is ICA-SVR-L. It is of interest to consider the accuracy of the ABC-SVR-RBF, ABC-SVR-P, and ICA-SVR-RBF models. Although the
ABC-SVR-RBF model provided lower accuracy than the ABC-SVR-P and ICA-SVR-RBF models on the testing dataset, the total ranking of the ABC-SVR-RBF model was higher than that of these two models. The main reason is that the ABC-SVR-RBF model performed better than the ABC-SVR-P and ICA-SVR-RBF models on the training dataset; in other words, the ABC-SVR-RBF model yielded more stable results than the ABC-SVR-P and ICA-SVR-RBF models for PPV prediction in the present study. Remarkably, the RBF kernel function seems to bring higher levels of accuracy than the linear and polynomial kernel functions, whereas the linear kernel function provided the lowest performance in predicting PPV. These results confirm the non-linear relationship between the variables. Figure 21 presents the accuracy of the 12 proposed hybrid models; note that the intensity of the green color represents the accuracy of the models, with greater intensity indicating better performance.
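The ranking scheme behind the "Total Rank" column of Table 8 can be reproduced in a few lines. The sketch below is our own illustration with pandas, shown for three of the twelve models using the values reported in Table 8; higher rank numbers are assigned to better metric values, and the ranks are summed over both datasets:

```python
import pandas as pd

# Subset of the Table 8 metrics (three of the twelve models) to illustrate the scheme.
train = pd.DataFrame(
    {"RMSE": [0.362, 0.411, 0.351], "R2": [0.981, 0.978, 0.983], "MAE": [0.227, 0.256, 0.238]},
    index=["ABC-SVR-RBF", "PSO-SVR-RBF", "GA-SVR-RBF"],
)
test = pd.DataFrame(
    {"RMSE": [0.345, 0.410, 0.267], "R2": [0.986, 0.979, 0.991], "MAE": [0.222, 0.245, 0.182]},
    index=train.index,
)

def ranks(df):
    # Higher rank number = better: lower RMSE/MAE and higher R2 get larger ranks.
    return pd.DataFrame({
        "RMSE": df["RMSE"].rank(ascending=False),
        "R2": df["R2"].rank(ascending=True),
        "MAE": df["MAE"].rank(ascending=False),
    })

total_rank = ranks(train).sum(axis=1) + ranks(test).sum(axis=1)
print(total_rank.sort_values(ascending=False))  # GA-SVR-RBF ranks highest in this subset
```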

Table 8. Statistical criteria of the PSO-SVR, ICA-SVR, ABC-SVR, and GA-SVR models.

              Training Dataset                                                   Testing Dataset
Model   RMSE   R2   MAE   Rank(RMSE)   Rank(R2)   Rank(MAE)   RMSE   R2   MAE   Rank(RMSE)   Rank(R2)   Rank(MAE)   Total Rank
ABC-SVR-P - 0.977 0.265 7 9 8 0.317 0.989 0.226 11 11 9 55
ABC-SVR-L 0.754 0.943 0.493 4 3 4 1.107 0.907 0.578 1 3 2 17
ABC-SVR-RBF 0.362 0.981 0.227 11 11 12 0.345 0.986 0.222 9 9 10 62
PSO-SVR-P 0.417 0.976 0.276 8 7 6 0.413 0.980 0.286 6 7 5 39
PSO-SVR-L 0.833 0.941 0.526 2 2 2 1.044 0.904 0.574 4 2 4 16
PSO-SVR-RBF 0.411 0.978 0.256 9 10 10 0.410 0.979 0.245 7 6 8 50
ICA-SVR-P 0.434 0.974 0.286 5 5 5 0.471 0.977 0.272 5 5 6 31
ICA-SVR-L 0.843 0.940 0.530 1 1 1 1.045 0.901 0.580 3 1 1 8
ICA-SVR-RBF 0.428 0.975 0.274 6 6 7 0.335 0.987 0.205 10 10 11 50
GA-SVR-P 0.403 0.976 0.264 10 7 9 0.379 0.983 0.263 8 8 7 49
GA-SVR-L 0.789 0.945 0.510 3 4 3 1.090 0.907 0.577 2 3 3 18
GA-SVR-RBF 0.351 0.983 0.238 12 12 11 0.267 0.991 0.182 12 12 12 71

Figure 21. Correlation of measured and estimated PPVs by the 12 proposed hybrid models.

5.6. Sensitivity Analysis

In this study, four evolutionary algorithms (i.e., ABC, ICA, PSO, and GA) played an essential role in optimizing the SVR models. Of the developed hybrid models, the GA-SVR-RBF model provided the highest level of accuracy and performance. As stated above, the RBF was considered the most useful kernel function for the development of the SVR-based models using the four evolutionary algorithms. However, determining which input factor(s) are the most important for the GA-SVR-RBF model is a complicated problem; therefore, a procedure to assess the importance of the input variables was carried out in this study based on the selected GA-SVR-RBF model. The Hilbert-Schmidt Independence Criterion (HSIC) method was employed to analyze the importance of the input variables [64,65]. The results showed that R and W are the most important factors for estimating PPV, as illustrated in Figure 22.
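The empirical HSIC can be computed directly from kernel Gram matrices. The following is a minimal sketch of that estimator (our own illustration with a Gaussian kernel and a median-distance bandwidth; the kernel choice and bandwidth used by the authors are not specified in the paper):

```python
import numpy as np

def _gaussian_gram(a, sigma=None):
    """Gaussian kernel Gram matrix; sigma defaults to the median pairwise distance."""
    a = np.asarray(a, dtype=float).reshape(len(a), -1)
    sq = np.sum((a[:, None, :] - a[None, :, :]) ** 2, axis=-1)
    if sigma is None:
        med = np.median(np.sqrt(sq[sq > 0])) if np.any(sq > 0) else 1.0
        sigma = med if med > 0 else 1.0
    return np.exp(-sq / (2 * sigma ** 2))

def hsic(x, y):
    """Empirical HSIC between one input variable x and the target y."""
    n = len(x)
    K, L = _gaussian_gram(x), _gaussian_gram(y)
    H = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Relative importance of each input for PPV (columns assumed in the order W, R, B, S):
# scores = {name: hsic(X[:, j], y) for j, name in enumerate(["W", "R", "B", "S"])}
```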
Figure 22. Sensitivity analyses of the influence parameters in this study.
6. Conclusions

AI has become more common in all fields, especially technical ones, and its applications have helped to solve technical issues, particularly in the mining industry. In this study, 12 hybrid models based on four evolutionary algorithms and the SVR model were developed and comprehensively assessed. Some conclusions were drawn as follows:
(1) Evolutionary algorithms are of great value in improving the accuracy of traditional models for PPV estimation, in particular the SVR model.
(2) Kernel functions have a great effect on the SVR's accuracy, especially the RBF. By means of evolutionary algorithms, the kernel functions can reach their optimal parameter values for the SVR model.
(3) GA is the most dominant evolutionary algorithm when combined with the SVR model and the RBF kernel (i.e., the GA-SVR-RBF model) for predicting PPV. It can be considered a robust technique to accurately predict PPV.
(4) Monitoring distance and explosive charge per delay are the most critical factors in predicting PPV. They should be given special attention and collected carefully to improve the models' accuracy in practice.
Author Contributions: Conceptualization, H.N. and T.N.-T.; Data curation, H.N. and X.-N.B.; Formal analysis, H.N. and Y.C.; Funding acquisition, T.N.-T.; Methodology, H.N., Y.C. and T.N.-T.; Project administration, X.-N.B.; Software, T.N.-T.; Validation, Y.C.; Writing—review & editing, H.N., Y.C., X.-N.B. and T.N.-T. All authors have read and agreed to the published version of the manuscript.

Funding: This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2018R1D1A1A09083947).

Acknowledgments: The authors would like to thank Hanoi University of Mining and Geology (HUMG), Hanoi, Vietnam; Duy Tan University, Da Nang, Vietnam; the Center for Excellence in Analysis and Experiment and the Center for Mining, Electro-Mechanical Research of HUMG; and Ton Duc Thang University, Ho Chi Minh City, Vietnam.

Conflicts of Interest: The authors declare no conflict of interest.

References
1. Fang, Q.; Nguyen, H.; Bui, X.-N.; Nguyen, T.-T. Prediction of Blast-Induced Ground Vibration in Open-Pit Mines Using a New Technique Based on Imperialist Competitive Algorithm and M5Rules. Nat. Resour. Res. 2019, 1–16. [CrossRef]
2. Nguyen, H.; Bui, X.-N.; Bui, H.-B.; Mai, N.-L. A comparative study of artificial neural networks in predicting blast-induced air-blast overpressure at Deo Nai open-pit coal mine, Vietnam. Neural Comput. Appl. 2018, 1–17. [CrossRef]

3. Nguyen, H.; Bui, X.-N.; Nguyen-Thoi, T.; Ragam, P.; Moayedi, H. Toward a State-of-the-Art of Fly-Rock
Prediction Technology in Open-Pit Mines Using EANNs Model. Appl. Sci. 2019, 9, 4554. [CrossRef]
4. Nguyen, H.; Bui, X.-N.; Tran, Q.-H.; Le, T.-Q.; Do, N.-H.; Hoa, L.T.T. Evaluating and predicting blast-induced
ground vibration in open-cast mine using ANN: A case study in Vietnam. SN Appl. Sci. 2018, 1, 125.
[CrossRef]
5. Zhou, J.; Shi, X.; Li, X. Utilizing gradient boosted machine for the prediction of damage to residential
structures owing to blasting vibrations of open pit mining. J. Vib. Control. 2016, 22, 3986–3997. [CrossRef]
6. Bui, X.N.; Nguyen, H.; Le, H.A.; Bui, H.B.; Do, N.H. Prediction of Blast-induced Air Over-pressure in
Open-Pit Mine: Assessment of Different Artificial Intelligence Techniques. Nat. Resour. Res. 2019. [CrossRef]
7. Bui, X.N.; Muazu, M.A.; Nguyen, H. Optimizing Levenberg–Marquardt backpropagation technique in
predicting factor of safety of slopes after two-dimensional OptumG2 analysis. Eng. Comput. 2019. [CrossRef]
8. Zhang, X.; Nguyen, H.; Bui, X.-N.; Tran, Q.-H.; Nguyen, D.-A.; Bui, D.T.; Moayedi, H. Novel Soft
Computing Model for Predicting Blast-Induced Ground Vibration in Open-Pit Mines Based on Particle
Swarm Optimization and XGBoost. Nat. Resour. Res. 2019. [CrossRef]
9. Nguyen, H.; Moayedi, H.; Jusoh, W.A.W.; Sharifi, A. Proposing a novel predictive technique using
M5Rules-PSO model estimating cooling load in energy-efficient building system. Eng. Comput. 2019.
[CrossRef]
10. Nguyen, H.; Moayedi, H.; Foong, L.K.; Al Najjar, H.H.A.; Jusoh, W.W.A.; Rashid, A.S.A.; Jamali, J. Optimizing,
ANN models with PSO for predicting short building seismic response. Eng. Comput. 2019. [CrossRef]
11. Nguyen, H.; Drebenstedt, C.; Bui, X.-N.; Bui, D.T. Prediction of Blast-Induced Ground Vibration in an
Open-Pit Mine by a Novel Hybrid Model Based on Clustering and Artificial Neural Network. Nat. Resour.
Res. 2019. [CrossRef]
12. Nguyen, H.; Bui, X.-N.; Tran, Q.-H.; Mai, N.-L. A new soft computing model for estimating and controlling
blast-produced ground vibration based on hierarchical K-means clustering and cubist algorithms. Appl. Soft
Comput. 2019, 77, 376–386. [CrossRef]
13. Nguyen, H.; Bui, X.-N.; Bui, H.-B.; Cuong, D.T. Developing an XGBoost model to predict blast-induced peak
particle velocity in an open-pit mine: A case study. Acta Geophys. 2019, 67, 477–490. [CrossRef]
14. Nguyen, H.; Bui, X.-N. Predicting Blast-Induced Air Overpressure: A Robust Artificial Intelligence System
Based on Artificial Neural Networks and Random Forest. Nat. Resour. Res. 2019, 28, 893–907. [CrossRef]
15. Moayedi, H.; Armaghani, D.J. Optimizing an ANN model with ICA for estimating bearing capacity of driven
pile in cohesionless soil. Eng. Comput. 2018, 34, 347–356. [CrossRef]
16. Moayedi, H.; Hayati, S. Modelling and optimization of ultimate bearing capacity of strip footing near a slope
by soft computing methods. Appl. Soft Comput. 2018, 66, 208–219. [CrossRef]
17. Moayedi, H.; Hayati, S. Applicability of a CPT-Based Neural Network Solution in Predicting Load-Settlement
Responses of Bored Pile. Int. J. Geomech. 2018, 18, 06018009. [CrossRef]
18. Moayedi, H.; Hayati, S. Artificial intelligence design charts for predicting friction capacity of driven pile in
clay. Neural Comput. Appl. 2018. [CrossRef]
19. Moayedi, H.; Raftari, M.; Sharifi, A.; Jusoh, W.A.W.; Rashid, A.S.A. Optimization of ANFIS with GA and
PSO estimating α ratio in driven piles. Eng. Comput. 2019. [CrossRef]
20. Moayedi, H.; Rezaei, A. An artificial neural network approach for under-reamed piles subjected to uplift
forces in dry sand. Neural Comput. Appl. 2017. [CrossRef]
21. Wu, S.; Song, H.; Cheng, G.; Zhong, X. Civil engineering supervision video retrieval method optimization
based on spectral clustering and R-tree. Neural Comput. Appl. 2018. [CrossRef]
22. Cao, M.S.; Pan, L.X.; Gao, Y.F.; Novák, D.; Ding, Z.C.; Lehký, D.; Li, X.L. Neural network ensemble-based
parameter sensitivity analysis in civil engineering systems. Neural Comput. Appl. 2017, 28, 1583–1590.
[CrossRef]
23. Rahimdel, M.J.; Bagherpour, R. Haulage system selection for open pit mines using fuzzy MCDM and the
view on energy saving. Neural Comput. Appl. 2018, 29, 187–199. [CrossRef]
24. Ghasemi, E. Particle swarm optimization approach for forecasting backbreak induced by bench blasting.
Neural Comput. Appl. 2017, 28, 1855–1862. [CrossRef]
25. Aich, U.; Banerjee, S. Modeling of EDM responses by support vector machine regression with parameters
selected by particle swarm optimization. Appl. Math. Model. 2014, 38, 2800–2818. [CrossRef]

26. Amini, H.; Gholami, R.; Monjezi, M.; Torabi, S.R.; Zadhesh, J. Evaluation of flyrock phenomenon due to
blasting operation by support vector machine. Neural Comput. Appl. 2012, 21, 2077–2085. [CrossRef]
27. de Almeida, B.J.; Neves, R.F.; Horta, N. Combining Support Vector Machine with Genetic Algorithms to
optimize investments in Forex markets with high leverage. Appl. Soft Comput. 2018, 64, 596–613. [CrossRef]
28. Deka, P.C. Support vector machine applications in the field of hydrology: A review. Appl. Soft Comput. 2014,
19, 372–386.
29. Feng, Q.; Zhang, J.; Zhang, X.; Wen, S. Proximate analysis based prediction of gross calorific value of coals:
A comparison of support vector machine, alternating conditional expectation and artificial neural network.
Fuel Process. Technol. 2015, 129, 120–129. [CrossRef]
30. Bui, X.-N.; Lee, C.W.; Nguyen, H.; Bui, H.-B.; Long, N.Q.; Le, Q.-T.; Nguyen, V.-D.; Nguyen, N.-B.; Moayedi, H.
Estimating PM10 Concentration from Drilling Operations in Open-Pit Mines Using an Assembly of SVR and
PSO. Appl. Sci. 2019, 9, 2806. [CrossRef]
31. Bui, H.-B.; Nguyen, H.; Choi, Y.; Bui, X.-N.; Nguyen-Thoi, T.; Zandi, Y. A Novel Artificial Intelligence
Technique to Estimate the Gross Calorific Value of Coal Based on Meta-Heuristic and Support Vector
Regression Algorithms. Appl. Sci. 2019, 9, 4868. [CrossRef]
32. Hasanipanah, M.; Shahnazar, A.; Amnieh, H.B.; Armaghani, D.J. Prediction of air-overpressure caused by
mine blasting using a new hybrid PSO-SVR model. Eng. Comput. 2017, 33, 23–31. [CrossRef]
33. Nguyen, H. Support vector regression approach with different kernel functions for predicting blast-induced
ground vibration: A case study in an open-pit coal mine of Vietnam. SN Appl. Sci. 2019, 1, 283. [CrossRef]
34. Khandelwal, M.; Singh, T. Prediction of blast-induced ground vibration using artificial neural network. Int. J.
Rock Mech. Min. Sci. 2009, 46, 1214–1222. [CrossRef]
35. Khandelwal, M.; Kankar, P.; Harsha, S. Evaluation and prediction of blast induced ground vibration using
support vector machine. Min. Sci. Technol. (China) 2010, 20, 64–70. [CrossRef]
36. Saadat, M.; Khandelwal, M.; Monjezi, M. An ANN-based approach to predict blast-induced ground vibration
of Gol-E-Gohar iron ore mine, Iran. J. Rock Mech. Geotech. Eng. 2014, 6, 67–76. [CrossRef]
37. Hajihassani, M.; Armaghani, D.J.; Marto, A.; Mohamad, E.T. Ground vibration prediction in quarry blasting
through an artificial neural network optimized by imperialist competitive algorithm. Bull. Eng. Geol. Environ.
2015, 74, 873–886. [CrossRef]
38. Hajihassani, M.; Armaghani, D.J.; Monjezi, M.; Mohamad, E.T.; Marto, A. Blast-induced air and ground
vibration prediction: A particle swarm optimization-based artificial neural network approach. Environ. Earth
Sci. 2015, 74, 2799–2817. [CrossRef]
39. Amiri, M.; Amnieh, H.B.; Hasanipanah, M.; Khanli, L.M. A new combination of artificial neural network and
K-nearest neighbors models to predict blast-induced ground vibration and air-overpressure. Eng. Comput.
2016, 32, 631–644. [CrossRef]
40. Hasanipanah, M.; Faradonbeh, R.S.; Amnieh, H.B.; Armaghani, D.J.; Monjezi, M. Forecasting blast-induced
ground vibration developing a CART model. Eng. Comput. 2017, 33, 307–316. [CrossRef]
41. Hasanipanah, M.; Naderi, R.; Kashir, J.; Noorani, S.A.; Qaleh, A.Z.A. Prediction of blast-produced ground
vibration using particle swarm optimization. Eng. Comput. 2017, 33, 173–179. [CrossRef]
42. Taheri, K.; Hasanipanah, M.; Golzar, S.B.; Majid, M.Z.A. A hybrid artificial bee colony algorithm-artificial
neural network for forecasting the blast-produced ground vibration. Eng. Comput. 2017, 33, 689–700.
[CrossRef]
43. Faradonbeh, R.S.; Monjezi, M. Prediction and minimization of blast-induced ground vibration using two
robust meta-heuristic algorithms. Eng. Comput. 2017, 33, 835–851. [CrossRef]
44. Behzadafshar, K.; Mohebbi, F.; Soltani Tehrani, M.; Hasanipanah, M.; Tabrizi, O. Predicting the ground
vibration induced by mine blasting using imperialist competitive algorithm. Eng. Comput. 2018, 35,
1774–1787. [CrossRef]
45. Tian, E.; Zhang, J.; Tehrani, M.S.; Surendar, A.; Ibatova, A.Z. Development of GA-based models for simulating
the ground vibration in mine blasting. Eng. Comput. 2018, 55, 849–855. [CrossRef]
46. Hasanipanah, M.; Amnieh, H.B.; Khamesi, H.; Armaghani, D.J.; Golzar, S.B.; Shahnazar, A. Prediction of an
environmental issue of mine blasting: An imperialistic competitive algorithm-based fuzzy system. Int. J.
Environ. Sci. Technol. 2018, 15, 551–560. [CrossRef]

47. Eberhart, R.; Kennedy, J. A new optimizer using particle swarm theory. In Proceedings of the Sixth
International Symposium on Micro Machine and Human Science, Nagoya, Japan, 4–6 October 1995;
pp. 39–43.
48. Fang, Q.; Nguyen, H.; Bui, X.-N.; Tran, Q.-H. Optimizing ANN models with PSO for predicting short
building seismic response. Nat. Resour. Res. 2019, 1–15. [CrossRef]
49. Tang, B.; Xiang, K.; Pang, M. An integrated particle swarm optimization approach hybridizing a new
self-adaptive particle swarm optimization with a modified differential evolution. Neural Comput. Appl. 2018.
[CrossRef]
50. Kennedy, J. Particle swarm optimization. In Encyclopedia of Machine Learning; Springer: Berlin/Heidelberg,
Germany, 2011; pp. 760–766.
51. Atashpaz-Gargari, E.; Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired
by imperialistic competition. In Evolutionary Computation, CEC 2007; IEEE: Piscataway, NJ, USA, 2007;
pp. 4661–4667.
52. Hosseini, S.; Al Khaled, A. A survey on the imperialist competitive algorithm metaheuristic: Implementation
in engineering domain and directions for future research. Appl. Soft Comput. 2014, 24, 1078–1094. [CrossRef]
53. Elsisi, M. Design of neural network predictive controller based on imperialist competitive algorithm for
automatic voltage regulator. Neural Comput. Appl. 2019. [CrossRef]
54. Zadeh Shirazi, A.; Mohammadi, Z. A hybrid intelligent model combining ANN and imperialist competitive
algorithm for prediction of corrosion rate in 3C steel under seawater environment. Neural Comput. Appl.
2017, 28, 3455–3464. [CrossRef]
55. Mitchell, M. An Introduction to Genetic Algorithms; MIT press: Cambridge, MA, USA, 1998.
56. Carr, J. An introduction to genetic algorithms. Sr. Proj. 2014, 1, 40.
57. Karaboga, D. An Idea based on Honey bee Swarm for Numerical Optimization; Technical Report-tr06; Erciyes
University, Engineering Faculty, Computer Engineering Department: Kayseri, Turkey, October 2005.
58. Vapnik, V. Pattern recognition using generalized portrait method. Autom. Remote. Control. 1963, 24, 774–780.
59. Yu, Z.; Shi, X.; Zhou, J.; Rao, D.; Chen, X.; Dong, W.; Miao, X.; Ipangelwa, T. Feasibility of the indirect
determination of blast-induced rock movement based on three new hybrid intelligent models. Eng. Comput.
2019. [CrossRef]
60. Ding, Z.; Nguyen, H.; Bui, X.-N.; Zhou, J.; Moayedi, H. Computational Intelligence Model for Estimating
Intensity of Blast-Induced Ground Vibration in a Mine Based on Imperialist Competitive and Extreme
Gradient Boosting Algorithms. Nat. Resour. Res. 2019. [CrossRef]
61. Verma, A.; Singh, T. Comparative study of cognitive systems for ground vibration measurements. Neural
Comput. Appl. 2013, 22, 341–350. [CrossRef]
62. Box, G.E.; Cox, D.R. An analysis of transformations. J. R. Stat. Soc. Ser. B (Methodol.) 1964, 26, 211–243.
[CrossRef]
63. Ling, H.; Qian, C.; Kang, W.; Liang, C.; Chen, H. Combination of Support Vector Machine and K-Fold cross
validation to predict compressive strength of concrete in marine environment. Constr. Build. Mater. 2019,
206, 355–363. [CrossRef]
64. Da Veiga, S. Global sensitivity analysis with dependence measures. J. Stat. Comput. Simul. 2015, 85,
1283–1305. [CrossRef]
65. Gretton, A.; Bousquet, O.; Smola, A.; Schölkopf, B. Measuring statistical dependence with Hilbert-Schmidt
norms. In International Conference on Algorithmic Learning Theory; Springer: Berlin/Heidelberg, Germany, 2005;
pp. 63–77.

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
