
Neural Processing Letters (2020) 52:241–266

https://doi.org/10.1007/s11063-020-10247-2

GM-CPSO: A New Viewpoint to Chaotic Particle Swarm Optimization via Gauss Map

Hasan Koyuncu1

Published online: 2 May 2020


© Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract
The chaos concept has been employed in recent optimization methods to achieve a convenient tradeoff between exploration and exploitation, and different chaotic maps have been examined to find the one best suited to the system dynamics. For particle swarm optimization (PSO), however, these maps have not been studied extensively, and the best-fitting one is not yet known. In this paper, ten chaotic maps are examined to reveal the one best suited to PSO and to explore whether chaotic maps are necessary for PSO at all. In the first experiment, thirteen benchmark functions are used for a detailed evaluation: chaotic PSO (CPSO) variants built on the different maps are tested on global function optimization. Here, Gauss map based CPSO (GM-CPSO) comes to the forefront by achieving promising fitness values in all function evaluations and in comparison with state-of-the-art methods. To test the efficiency of GM-CPSO on a different task, it is hybridized with a neural network (NN) in the second experiment, which addresses epileptic seizure recognition. Discrete wavelet transform (DWT) based features, GM-CPSO and the NN are combined to design an efficient framework and to specify the type of electroencephalography signals. GM-CPSO-NN is compared with hybrid NNs based on two state-of-the-art optimization methods in order to examine the efficiency of GM-CPSO. For an accurate performance assessment, twofold cross validation is performed on 11,500 instances, and four metrics [accuracy, area under the ROC curve (AUC), sensitivity, specificity] are reported alongside a computational complexity analysis. In the experiments, GM-CPSO, built on the appropriate map, provides remarkable fitness scores compared with state-of-the-art optimization methods on various functions defined in different dimensions. Moreover, the proposed framework based on GM-CPSO-NN achieves remarkable performance by obtaining reliable accuracy (97.24%), AUC (95.67%), sensitivity (93.04%) and specificity (98.29%) scores with lower computational complexity than the other algorithms. According to the results, GM-CPSO arises as the most convenient optimization method for the formation of hybrid NNs. In addition to the optimization and classification results, the detail sub-bands of the DWT are found to comprise the information necessary for seizure recognition. Consequently, GM-CPSO can be preferred for reliable convergence in global function optimization, and its use can be extended to disciplines such as signal classification, pattern recognition and hybrid system design.

Extended author information available on the last page of the article


Graphical Abstract

Keywords Chaotic · Epileptic seizure · Gauss map · Hybrid classifier · Optimization · Pattern recognition

1 Introduction

Optimization is divided into different subjects according to the type of problem handled; it can be defined as constrained [1, 2], unconstrained [3, 4], quadratic [5], multi-objective [6], nonlinear [7], single-objective [8], etc. Single- and multi-objective optimization constitute a larger-scale classification covering many of the other types. Single-objective optimization is frequently used in the literature even for multi-objective tasks by utilizing a linearization process [8]. Among the recent optimization methods developed for single-objective tasks, particle swarm optimization (PSO) and its derivatives have remained efficient in various disciplines [9–13]. As a chaotic derivative, chaotic PSO (CPSO) has been applied in many research areas, including artificial intelligence [14], visible light communication (VLC) [15] and shape analysis [16], and it has yielded promising performance. In chaotic optimization techniques such as CPSO, chaotic maps are added into the optimization method to improve convergence [13, 17, 18].
Chen et al. [13] updated the inertia weight of PSO with the Sine map and added dynamic weights into PSO. Sine map based chaotic PSO (SM-CPSO), dynamic weight PSO (DW-PSO) and chaotic dynamic weight PSO (CDW-PSO) were suggested in [13]. These methods were compared with global PSO (GPSO), which uses a linearly decreasing inertia weight, and with other chaotic methods on global function optimization. However, only the Sine map was handled in PSO, and this choice was inspired by [17, 19]. Wang et al. [17] generated the chaotic krill herd (CKH) algorithm and evaluated different chaotic maps to reveal the best fit. All chaotic maps were tested on benchmark function optimization; the Singer map emerged as the most efficient map for the CKH algorithm, and the second best was the Sine map. Saremi et al. [18] suggested biogeography-based optimization with chaos (CBBO), which adds chaotic behavior to the immigration, emigration and mutation behaviors of different habitats. The proposed method was evaluated on the optimization of ten benchmark functions, and ten chaotic maps were examined to pick out the appropriate one. As a result, the Gauss/Mouse map was found to be the most efficient one, upgrading the performance.
In the first part of this study, CPSO is handled, and chaotic maps are utilized in the update of the inertia weight as in [13]. As advised in [18], ten chaotic maps are evaluated to reveal the best fit for the system dynamics of PSO. Thirteen well-known benchmark functions are used to test the performance of the CPSOs built on these 10 chaotic maps. The CPSOs are compared with each other and with GPSO to determine whether chaotic maps are necessary at all. According to the experiments, Gauss map based CPSO (GM-CPSO) comes to the forefront, and it is further compared with the SM-CPSO, DW-PSO and CDW-PSO algorithms. As a result, the most proper CPSO, built on the appropriate map, is presented to the literature.
In the second part of the study, epileptic seizure recognition is performed on electroencephalogram (EEG) signals using a GM-CPSO based hybrid framework.
EEG signals are utilized to detect various abnormalities such as freezing of gait, Parkinson's disease, Alzheimer's disease, epileptic seizures, etc. [20–23]. In epileptic seizure detection, the evaluation of EEG plays an important role in separating the seizure signals from the others. Different handicaps can arise here, since signals can be recorded under different conditions (open-eye/closed-eye recordings) or from various regions of the brain (tumor area/non-tumor area). Hence a robust classification framework is needed to attain the classes as binary (seizure vs. non-seizure).
In the literature, several studies address epileptic seizure recognition using EEG. Tawfik et al. [24] performed binary classification on EEG data including five sets: tumor area (Set B), non-tumor area (Set C), closed-eye (Set D) and open-eye (Set E) recordings with no seizure activity, and seizure activity (Set A). For this purpose, a framework including weighted permutation entropy (WPE) and support vector machines (SVM) was used, and it achieved 93.75% accuracy on binary classification of the five sets. Osman and Alzahrani [25] utilized self-organizing maps (SOM) with radial basis function (RBF) networks to classify seizure versus non-seizure patterns. In trials, SOM-RBF obtained 97.47% accuracy on the UCI epileptic seizure recognition dataset. Gupta et al. [26] designed a system involving a discrete cosine transform (DCT) filterbank, the Hurst exponent, an autoregressive moving average (ARMA) model and an SVM classifier to designate the seizure set among the five sets; the proposed system attained 97.79% accuracy. Al-Sharhan and Bimba [27] formed a model containing adaptive multi-parent crossover (MPC) and a genetic algorithm (GA) for feature selection, and SVM for classification. The proposed model achieved 98.01% accuracy in trials with 4800 patterns randomly chosen from the UCI dataset.
To upgrade the classification performance in seizure detection, different feature selection and classifier methods have been handled in the literature. However, as an efficient hybrid technique, optimized neural networks (optimized NNs) have not been considered in detail. Hybrid classifiers are considered in different areas, and optimized NNs keep their efficiency in different disciplines [10, 28]. In these methods, the optimization dynamics are accepted as the key point that yields better statistics.
In the literature, particle swarm optimization (PSO) and its derivatives have proven their effectiveness in different tasks such as global function optimization and hybrid classifiers [10, 11]. Concerning this, we focus on the recent PSO based derivatives to compare with GM-CPSO on epileptic seizure recognition. In [13], CDW-PSO was the best on benchmark function optimization, and the results of DW-PSO were remarkable alongside CDW-PSO. CDW-PSO

generally overcame 11 PSO based derivatives and 9 well-known optimization methods such as artificial bee colony (ABC), chaotic krill herd (CKH) and the chaotic gravitational search algorithm (CGSA).
In this paper, in the second part of the experiments, seizure (2300 patterns, Set A) versus non-seizure (9200 patterns, Sets B, C, D, E) classification is performed as a binary task considering the 5 different folders, as in [24–27]. Three different datasets formed using the discrete wavelet transform (DWT) are compared and used in hybrid classifiers to detect the best framework. The DW-PSO, CDW-PSO and GM-CPSO algorithms are handled to design the classifier part of the framework. Moreover, these algorithms are compared with each other in terms of their compatibility with the NN on seizure detection (pattern classification).
This study differs from previous work by presenting the most extensive analysis of the chaotic behavior of the PSO algorithm. Besides, epileptic seizure recognition is comprehensively considered using an efficient framework. The remainder of this paper is organized as follows:
Section 2 explains the dynamics behind the DW-PSO, CDW-PSO and GM-CPSO algorithms and describes the formation of a hybrid classifier in detail. Section 3 provides the experimental results and analysis, interprets the results, and presents the literature comparisons. In the second part of that section, the DWT based datasets are designed according to recommendations in the literature. The DWT based features and recommendations are described in the third section so as not to break the cause-effect flow. Section 4 concludes the paper.

2 Methods

In this section, the DW-PSO, CDW-PSO and GM-CPSO algorithms are explained with the same flowchart, since these methods originate from PSO through well-chosen additions. Besides, the formation of the hybrid NN is presented in detail.

2.1 PSO Based Derivatives

The DW-PSO, CDW-PSO and GM-CPSO methods are designed on the basis of the PSO algorithm. To achieve optimum convergence, PSO processes two phenomena named velocity and position, presented in Eqs. (1) and (2), respectively [9].
   
$$V_i(t+1) = \omega V_i(t) + c_1 r_1(t)\left[X_{pbest(i)}(t) - X_i(t)\right] + c_2 r_2(t)\left[X_{gbest}(t) - X_i(t)\right] \qquad (1)$$

$$X_i(t+1) = X_i(t) + V_i(t+1) \qquad (2)$$

In Eq. (1), ω symbolizes the inertia weight limiting the step size, c1 and c2 are acceleration constants moving the particles toward the local and global best locations, and r1 and r2 stand for random numbers generated within (0,1). X_pbest(i)(t), X_gbest(t) and X_i(t) denote the individual best position of the ith particle, the global best position of the whole swarm, and the position of the ith particle, respectively [9].
In Eq. (2), V_i(t+1) and X_i(t+1) are the new velocity and position values, respectively. For the inertia weight, a linearly decreasing scheme is generally applied to upgrade the performance of the method. This movement is formulated as in Eq. (3) [13].
$$\omega = \omega_{max} - \left(\omega_{max} - \omega_{min}\right)\frac{M_j}{M_{max}} \qquad (3)$$


In Eq. (3), M_j and M_max are the current and maximum iteration numbers, respectively. In the literature, it is accepted that PSO achieves higher performance if this movement is chosen as 0.9 → 0.4, i.e., ω_max = 0.9 and ω_min = 0.4 [10].
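To make the baseline concrete for the derivatives that follow, a minimal sketch of the GPSO loop built from Eqs. (1)-(3) is given below, using the sphere function (f1 of Table 1) as an illustrative objective. The velocity clamp, the default parameter values and all variable names are assumptions of this sketch rather than code taken from [9] or [13].

```python
import numpy as np

def sphere(x):
    # f1 in Table 1
    return float(np.sum(x ** 2))

def gpso(obj=sphere, dim=30, pop=40, max_iter=500, x_bound=100.0,
         w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, seed=0):
    rng = np.random.default_rng(seed)
    v_bound = 0.2 * x_bound                                   # velocity clamp (Sect. 2.1)
    X = rng.uniform(-x_bound, x_bound, (pop, dim))
    V = rng.uniform(-v_bound, v_bound, (pop, dim))
    pbest = X.copy()
    pbest_f = np.array([obj(x) for x in X])
    g = int(np.argmin(pbest_f))
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    for t in range(max_iter):
        w = w_max - (w_max - w_min) * t / max_iter            # Eq. (3), linear 0.9 -> 0.4
        r1 = rng.random((pop, dim))
        r2 = rng.random((pop, dim))
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)   # Eq. (1)
        V = np.clip(V, -v_bound, v_bound)
        X = np.clip(X + V, -x_bound, x_bound)                       # Eq. (2)
        f = np.array([obj(x) for x in X])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = X[improved], f[improved]
        g = int(np.argmin(pbest_f))
        if pbest_f[g] < gbest_f:
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    return gbest, gbest_f
```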
Dynamic weight particle swarm optimization (DW-PSO): The DW-PSO algorithm uses Eq. (1) as in PSO to update the new velocities. The inertia weight is arranged according to the linear descent scheme of PSO. However, dynamic weights are added to the position update, as seen in Eq. (4) [13].
$$X_i(t+1) = X_i(t)\,w_i + V_i(t+1)\,w_i' + \rho\, X_{gbest}(t)\,\psi \qquad (4)$$
In Eq. (4), DW-PSO introduces three major changes, namely the dynamic weights (w_i and w_i'), the acceleration constant (ψ) and the X_gbest(t) term, to improve the search capacity. Here, w_i (ψ) and w_i' are given in Eqs. (5) and (6), respectively, and ρ symbolizes a random number within the range (0,1) [13]. In DW-PSO, w_i (ψ) adjusts the contributions of the position and X_gbest(t) values to the new position, while w_i' arranges the contribution of the velocity.
$$w_i = \psi = \frac{\exp\left(f(i)/u\right)}{\left(1 + \exp\left(-f(i)/u\right)\right)^{iter}} \qquad (5)$$

$$w_i' = 1 - w_i \qquad (6)$$
According to Eq. (6), w_i and w_i' are inversely correlated. In Eq. (5), f(i) and u are the fitness value of the ith particle and the mean fitness of the particles in the first iteration, respectively. The dynamic weights are adjusted during the process according to the particle fitness; this mechanism is chosen to balance the tradeoff between exploration and exploitation [13]. In the search space, if the best particle fitness is large, the particle is assumed to stand far from the global optimal position, which necessitates larger w_i (ψ) values. Conversely, if the best particle fitness stays small, the particle is assumed to be close to the global optimal position, which causes w_i to decrease [13].
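The dynamic-weight machinery of Eqs. (4)-(6) can be sketched as follows; here f_vals holds the current particle fitness values, u is the mean fitness of the first iteration, it is the current iteration index, and all names and array shapes are assumptions of this sketch.

```python
import numpy as np

def dynamic_weights(f_vals, u, it):
    # Eq. (5): w_i = psi, grows with particle fitness f(i); Eq. (6): w_i' = 1 - w_i
    w = np.exp(f_vals / u) / (1.0 + np.exp(-f_vals / u)) ** it
    return w, 1.0 - w

def dwpso_position_update(X, V_new, gbest, f_vals, u, it, rng):
    # Eq. (4): X_i(t+1) = X_i(t) w_i + V_i(t+1) w_i' + rho X_gbest(t) psi
    w, w_prime = dynamic_weights(f_vals, u, it)
    rho = rng.random(X.shape[0])               # random number in (0,1) per particle
    psi = w                                    # psi equals w_i in Eq. (5)
    return (X * w[:, None] + V_new * w_prime[:, None]
            + (rho * psi)[:, None] * gbest[None, :])
```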
Chaotic dynamic weight particle swarm optimization (CDW-PSO): The CDW-PSO algorithm is based on DW-PSO, and it utilizes Eqs. (1) and (4) to produce the new positions. In addition, it uses the Sine chaotic map instead of the linear descent scheme to arrange the inertia weight. The Sine map is defined as in Eq. (7) [13].
$$\omega = x_k = A \sin\left(\pi x_{k-1}\right), \quad x_k \in (0,1),\ 0 < A \le 1,\ k = 1, 2, \ldots, M_{max} \qquad (7)$$
In Eq. (7), k and M_max symbolize the current and maximum iteration numbers, respectively. Here, the change of the inertia weight is tied to the Sine function and varies chaotically to enhance the performance of the algorithm [13].
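For illustration, the Sine-map inertia-weight sequence of Eq. (7) could be generated as below; the starting value x0 = 0.7 and A = 1 (i.e., a/4 with a = 4 as in Table 2) are assumptions of this sketch, since [13] only requires 0 < A ≤ 1.

```python
import math

def sine_map_weights(m_max, x0=0.7, A=1.0):
    # Eq. (7): omega = x_k = A sin(pi x_{k-1}), one value per iteration
    x, seq = x0, []
    for _ in range(m_max):
        x = A * math.sin(math.pi * x)
        seq.append(x)
    return seq
```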
Gauss map based chaotic particle swarm optimization (GM-CPSO): The GM-CPSO algorithm utilizes Eqs. (1) and (2) to attain the new position values. Additionally, it arranges the inertia weight using the Gauss/Mouse map to better balance the tradeoff between exploration and exploitation across iterations. The Gauss map is formed as in Eq. (8).

$$\omega(i+1) = \begin{cases} 1, & \omega(i) = 0 \\ \dfrac{1}{\operatorname{mod}\left(\omega(i), 1\right)}, & \text{otherwise} \end{cases}, \qquad \omega(0) = 0.7 \qquad (8)$$

In Eq. (8), the range of the Gauss map is (0,1), which can positively influence the velocity movements toward global points. In other words, this process adds flexibility to the particle movements and enhances the efficiency of the algorithm. Following the advice of [18], ω(0) is fixed at 0.7 to achieve a robust chaotic characteristic.
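A small sketch of the inertia-weight sequence of Eq. (8) follows. The map is read here as the fractional part of 1/ω, the usual Gauss/Mouse-map interpretation, which keeps the values inside (0,1) as stated above; this reading is an assumption of the sketch.

```python
def gauss_map_weights(m_max, w0=0.7):
    # Eq. (8) with omega(0) = 0.7; the first generated value is frac(1/0.7) ~ 0.4286
    w, seq = w0, []
    for _ in range(m_max):
        w = 1.0 if w == 0.0 else (1.0 / w) % 1.0
        seq.append(w)
    return seq
```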
In the literature, PSO derivatives mostly restrict the velocity boundaries to 0.2 times the position boundaries (V_max = 0.2 X_max and V_min = 0.2 X_min) to obtain higher convergence.


Fig. 1 The formation of PSO based derivatives

Besides, the acceleration constants are generally chosen as equal (c1 = c2 = 2). These choices are declared as the parameter settings of the PSO derivatives in [13]. Therefore, GM-CPSO follows the aforementioned restrictions (except in hybrid NNs). In hybrid classifiers, it is necessary to change the acceleration constants to obtain better results [10]; concerning this, we tune c1 and c2 to achieve reliable performance in pattern recognition. Figure 1 presents the formation of the PSO based derivatives.

2.2 Formation of Hybrid Classifier

NN usually utilizes backpropagation type methods to decrease the error rate, and the mean squared error (MSE) is frequently adopted as the error measure [29]. Optimization methods are frequently employed in place of backpropagation type approaches to provide better convergence to a minimal MSE. The resulting structure is named a hybrid classifier or optimized classifier [29]. The formation of these classifiers is presented in Fig. 2.

Fig. 2 The formation of hybrid NNs

The design of a hybrid NN may seem a little complicated in terms of designing the system dynamics accordingly. For this purpose, two routes can be taken that yield the same results through similar processes:
1. Directly implementing the integrated flowchart as in [29]
2. Designing the NN as the "calculation of fitness" part of Fig. 1
In this study, we choose the second option, since it eases the formation of the hybrid architecture and saves computation time. Designing the NN as the "calculation of fitness" simply condenses the sections inside the hybrid architecture; it only contains the computation of the error using the weight-bias values generated by the optimization method. Hybrid NNs should obey the rules stated below, and these rules provide the same results as the architecture in [29] (the first option):
1. Consider each particle of the optimization algorithm as a vector containing weight and bias values
2. Evaluate the MSE for every particle, each representing a different network


3. For all particles, X_i(t) symbolizes the ith weight-bias vector (particle) to be processed, X_pbest(i)(t) is the weight-bias vector achieving the individual best MSE of the ith vector, and X_gbest(t) is the weight-bias vector achieving the minimum MSE among all vectors
4. The update process should follow the system dynamics (formulations) of the utilized optimization method

The designed networks differ according to the optimization method. Concerning this, the proposed networks are named DW-PSO-NN, CDW-PSO-NN and GM-CPSO-NN after the managing optimization methods.
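A minimal sketch of rules 1-4 is given below: each particle is a flat weight-bias vector of a one-hidden-layer network, and its fitness is the training MSE computed in the "calculation of fitness" block of Fig. 1. The layer sizes, sigmoid activations and all names are assumptions of this sketch, not the exact network used in the experiments.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(particle, n_in, n_hidden, n_out=1):
    # split the flat particle vector into weight matrices and bias vectors (rule 1)
    i = 0
    W1 = particle[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = particle[i:i + n_hidden]; i += n_hidden
    W2 = particle[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = particle[i:i + n_out]
    return W1, b1, W2, b2

def mse_fitness(particle, X, y, n_hidden=16):
    # forward pass of the network encoded by this particle, then MSE (rule 2)
    W1, b1, W2, b2 = unpack(particle, X.shape[1], n_hidden)
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel()
    return float(np.mean((out - y) ** 2))
```

The optimizer (for instance the gpso sketch of Sect. 2.1, or GM-CPSO) then evolves particles of length n_in*n_hidden + n_hidden + n_hidden*n_out + n_out, and pbest/gbest track the weight-bias vectors with the lowest MSE (rules 3 and 4).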

3 Experimental Results

The first part of this section presents the utilized benchmark functions and chaotic maps. In the second part, the datasets, features and metrics used in seizure recognition are explained. In the third part, the CPSOs built on different chaotic maps are compared using the 13 benchmark functions; GM-CPSO is then compared with the SM-CPSO, DW-PSO and CDW-PSO algorithms in the literature comparison. In the fourth part, the DW-PSO-NN, CDW-PSO-NN and GM-CPSO-NN methods are tested on epileptic pattern recognition, and the GM-CPSO-NN based framework is compared with the state-of-the-art studies in the literature.

3.1 Benchmark Functions and Chaotic Maps

In the experiments, the CPSOs built on ten different chaotic maps are compared as advised in [18], and all methods are tested on 13 well-known benchmark functions [13, 30, 31]. Here, the outputs of the Chebyshev and Iterative maps are normalized into the range (0,1) using min-max normalization; this choice is advised in [18] to prevent unnecessary velocity behaviour, and the outputs of the other maps already lie in this range. Tables 1 and 2 present the utilized benchmark functions and the chaotic maps with their features, respectively [13, 18, 30, 31].
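As a sketch of the normalization step mentioned above: the Chebyshev and Iterative maps of Table 2 produce values in (−1,1), so their trajectories are min-max scaled into (0,1) before being used as inertia weights. The starting value and the function names below are assumptions of this sketch.

```python
import numpy as np

def chebyshev_sequence(n, x0=0.7):
    # Table 2: x_{i+1} = cos(i * arccos(x_i)), output in (-1, 1)
    x, seq = x0, []
    for i in range(1, n + 1):
        x = float(np.cos(i * np.arccos(np.clip(x, -1.0, 1.0))))
        seq.append(x)
    return np.array(seq)

def minmax_to_unit(seq):
    # min-max normalization into (0,1), as advised in [18]
    lo, hi = seq.min(), seq.max()
    return (seq - lo) / (hi - lo) if hi > lo else np.full_like(seq, 0.5)
```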

3.2 Dataset, Features and Metrics

To obtain the data, 100 patient files from each of 5 folders are used, yielding 500 individual recordings of 4097 data points [32, 33]. Each file contains the brain activity of a patient for 23.5 s. To finalize the dataset, the 4097 data points are divided into 23 groups, giving patterns of about 1 s containing 178 data points each. Multiplying the 500 individuals by 23 groups (1 s of data each) yields 11,500 patterns with 178 features [32, 33]. The dataset consists of 5 folders: tumor area (Set B), non-tumor area (Set C), closed-eye (Set D) and open-eye (Set E) recordings with no seizure activity, and seizure activity (Set A). Every category comprises 2300 patterns, and binary classification is realized by labeling Set A as 1 and the others as 0 [32, 33].
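The following sketch shows how the 11,500 x 178 pattern matrix described above could be derived from the 500 recordings of 4097 samples. Since 23 x 178 = 4094, the last few samples of each recording are assumed to be dropped, and the set names and function signature are assumptions of this sketch.

```python
import numpy as np

def segment_recordings(recordings, set_labels, seg_len=178, n_seg=23):
    # recordings: iterable of 4097-sample signals; set_labels: "A".."E" per recording
    X, y = [], []
    for signal, set_name in zip(recordings, set_labels):
        for k in range(n_seg):
            X.append(signal[k * seg_len:(k + 1) * seg_len])
            y.append(1 if set_name == "A" else 0)   # seizure (Set A) = 1, others = 0
    return np.asarray(X), np.asarray(y)             # shapes (11500, 178) and (11500,)
```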
In this study, binary classification is performed as Set A versus the others (Set B, Set C, Set D, Set E), i.e., 2300 positive patterns against 9200 negative patterns. To visually present the different folders (sets), a sample from each of the five folders is shown in Fig. 3.
For an impartial and detailed comparison, four metrics (accuracy, sensitivity, specificity, AUC) are utilized, presented in Eqs. (9)-(12), respectively [34].

Table 1 The utilized benchmark functions

Function name | Function | Search space | x* | f(x*)
Sphere | $f_1 = \sum_{i=1}^{n} x_i^2$ | [−100, 100] | $[0]^n$ | 0
Schwefel's 2.22 | $f_2 = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | [−10, 10] | $[0]^n$ | 0
Schwefel's 1.2 | $f_3 = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2$ | [−100, 100] | $[0]^n$ | 0
Schwefel's 2.21 | $f_4 = \max_i |x_i|,\ 1 \le i \le n$ | [−100, 100] | $[0]^n$ | 0
Step | $f_5 = \sum_{i=1}^{n} |x_i + 0.5|^2$ | [−100, 100] | $[-0.5]^n$ | 0
Noise | $f_6 = \sum_{i=1}^{n} i x_i^4 + \mathrm{rand}[0,1)$ | [−1.28, 1.28] | $[0]^n$ | 0
Rastrigin | $f_7 = \sum_{i=1}^{n} \left[ x_i^2 - 10\cos(2\pi x_i) + 10 \right]$ | [−5.12, 5.12] | $[0]^n$ | 0
Ackley | $f_8 = -20\exp\!\left(-0.2\sqrt{\tfrac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\!\left(\tfrac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e$ | [−32, 32] | $[0]^n$ | 0
Griewank | $f_9 = \tfrac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n}\cos\!\left(\tfrac{x_i}{\sqrt{i}}\right) + 1$ | [−600, 600] | $[0]^n$ | 0
Weierstrass | $f_{10} = \sum_{i=1}^{n}\sum_{k=0}^{k_{max}} a^k \cos\!\left(2\pi b^k (x_i + 0.5)\right) - n\sum_{k=0}^{k_{max}} a^k \cos\!\left(2\pi b^k \cdot 0.5\right)$, with $a = 0.5$, $b = 3$, $k_{max} = 20$ | [−0.5, 0.5] | $[0]^n$ | 0
Shifted and rotated Ackley | $f_{11} = -20\exp\!\left(-0.2\sqrt{\tfrac{1}{n}\sum_{i=1}^{n} z_i^2}\right) - \exp\!\left(\tfrac{1}{n}\sum_{i=1}^{n}\cos(2\pi z_i)\right) + 20 + e - 140$ | [−32, 32] | $[o]^n$ | −140
Shifted rotated Weierstrass | $f_{12} = \sum_{i=1}^{n}\sum_{k=0}^{k_{max}} a^k \cos\!\left(2\pi b^k (z_i + 0.5)\right) - n\sum_{k=0}^{k_{max}} a^k \cos\!\left(2\pi b^k \cdot 0.5\right) + 90$, with $a = 0.5$, $b = 3$, $k_{max} = 20$ | [−0.5, 0.5] | $[o]^n$ | 90
Shifted rotated expanded Schaffer's F6 | $F(x, y) = 0.5 + \frac{\sin^2\!\left(\sqrt{x^2 + y^2}\right) - 0.5}{\left(1 + 0.001\left(x^2 + y^2\right)\right)^2}$, $f_{13} = EF(z_1, z_2, \ldots, z_D) = F(z_1, z_2) + F(z_2, z_3) + \cdots + F(z_{D-1}, z_D) + F(z_D, z_1) - 300$ | [−100, 100] | $[o]^n$ | −300

Table 2 Chaotic maps

Chaotic map | Function | Range
Chebyshev | $x_{i+1} = \cos\!\left(i \cos^{-1}(x_i)\right)$ | (−1, 1)
Circle | $x_{i+1} = \operatorname{mod}\!\left(x_i + b - \frac{a}{2\pi}\sin(2\pi x_i),\ 1\right)$, $a = 0.5$ and $b = 0.2$ | (0, 1)
Gauss/Mouse | $x_{i+1} = \begin{cases} 1, & x_i = 0 \\ \frac{1}{\operatorname{mod}(x_i, 1)}, & \text{otherwise} \end{cases}$ | (0, 1)
Iterative | $x_{i+1} = \sin\!\left(\frac{a\pi}{x_i}\right)$, $a = 0.7$ | (−1, 1)
Logistic | $x_{i+1} = a x_i (1 - x_i)$, $a = 4$ | (0, 1)
Piecewise | $x_{i+1} = \begin{cases} \frac{x_i}{P}, & 0 \le x_i < P \\ \frac{x_i - P}{0.5 - P}, & P \le x_i < 0.5 \\ \frac{1 - P - x_i}{0.5 - P}, & 0.5 \le x_i < 1 - P \\ \frac{1 - x_i}{P}, & 1 - P \le x_i < 1 \end{cases}$, $P = 0.4$ | (0, 1)
Sine | $x_{i+1} = \frac{a}{4}\sin(\pi x_i)$, $a = 4$ | (0, 1)
Singer | $x_{i+1} = \mu\left(7.86 x_i - 23.31 x_i^2 + 28.75 x_i^3 - 13.302875 x_i^4\right)$, $\mu = 1.07$ | (0, 1)
Sinusoidal | $x_{i+1} = a x_i^2 \sin(\pi x_i)$, $a = 2.3$ | (0, 1)
Tent | $x_{i+1} = \begin{cases} \frac{x_i}{0.7}, & x_i < 0.7 \\ \frac{10}{3}(1 - x_i), & x_i \ge 0.7 \end{cases}$ | (0, 1)

Fig. 3 A sample for all folders

In Eqs. (9)-(11), true positives (TP) stands for the patterns that are positive and correctly classified as positive, true negatives (TN) means the patterns that are negative and correctly classified as negative, false positives (FP) symbolizes the patterns that are negative and incorrectly classified as positive, and false negatives (FN) are the patterns that are positive and incorrectly classified as negative [28]. In Eq. (12), i scans the m positively labeled data points, while j scans the n negatively labeled data points. Here, p_i and p_j are the probabilities assigned by the classifier to data points i and j, respectively [35].

$$\mathrm{Accuracy} = \frac{TP + TN}{TP + FP + TN + FN} \qquad (9)$$

$$\mathrm{Sensitivity} = \frac{TP}{TP + FN} \qquad (10)$$

$$\mathrm{Specificity} = \frac{TN}{TN + FP} \qquad (11)$$

$$AUC = \frac{1}{mn}\sum_{i=1}^{m}\sum_{j=1}^{n} \mathbf{1}\left(p_i > p_j\right) \qquad (12)$$
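Eq. (12) can be computed directly as the fraction of positive/negative pairs in which the positive pattern receives the higher classifier probability, as in the short sketch below; scoring ties as 0 follows the strict inequality of Eq. (12) and is an assumption of this sketch.

```python
import numpy as np

def auc_pairwise(p_pos, p_neg):
    # Eq. (12): (1/mn) * sum_i sum_j 1(p_i > p_j)
    p_pos = np.asarray(p_pos, dtype=float)   # probabilities of the m positive patterns
    p_neg = np.asarray(p_neg, dtype=float)   # probabilities of the n negative patterns
    return float(np.mean(p_pos[:, None] > p_neg[None, :]))
```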

To prepare the features, multi-resolution analysis (MRA) is performed using the DWT method. Gandhi et al. [36] showed that the direct use of the detail (Di(t)) and approximation (Ai(t)) coefficients can decrease system performance. Instead, the energy (EDA) and standard deviation (SD) of these coefficients were shown to carry meaningful information about EEG signals. Besides, Coif1 was found to be the best wavelet type to apply to EEG signals. Concerning this, we utilize these features and the Coif1 wavelet according to the advice in [36]. At each decomposition level, the EDA is calculated via Eqs. (13) and (14), and the SD is designated via Eqs. (15)-(18) [36].

$$ED_i = \sum_{j=1}^{N} \left|D_{ij}\right|^2, \quad i = 1, 2, \ldots, l \qquad (13)$$

$$EA_i = \sum_{j=1}^{N} \left|A_{ij}\right|^2, \quad i = 1, 2, \ldots, l \qquad (14)$$

$$\sigma d_i = \left(\frac{1}{N-1}\sum_{j=1}^{N}\left(D_{ij} - \mu d_i\right)^2\right)^{1/2}, \quad i = 1, 2, \ldots, l \qquad (15)$$

$$\sigma a_i = \left(\frac{1}{N-1}\sum_{j=1}^{N}\left(A_{ij} - \mu a_i\right)^2\right)^{1/2}, \quad i = 1, 2, \ldots, l \qquad (16)$$

$$\mu d_i = \frac{1}{N}\sum_{j=1}^{N} D_{ij}, \quad i = 1, 2, \ldots, l \qquad (17)$$

$$\mu a_i = \frac{1}{N}\sum_{j=1}^{N} A_{ij}, \quad i = 1, 2, \ldots, l \qquad (18)$$

In Eqs. (13)-(18), D_ij and A_ij are the jth detail and approximation coefficients at the ith decomposition level, respectively, N stands for the number of components in each decomposition level, and l symbolizes the decomposition level. ED_i and EA_i are the energy values belonging to the ith decomposition level. σd_i and σa_i are the standard deviation values at the ith decomposition level, and these features are computed using the means of the detail (μd_i) and approximation (μa_i) components at the ith level [36].
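A sketch of Eqs. (13)-(18) for one EEG segment is given below, assuming the PyWavelets library for the Coif1 DWT; the level-by-level loop and the feature naming are choices of this sketch.

```python
import numpy as np
import pywt  # PyWavelets, assumed here for the DWT

def eda_sd_per_level(signal, wavelet="coif1", levels=5):
    # one analysis step per level: approximation A_i and detail D_i coefficients
    feats, a = {}, np.asarray(signal, dtype=float)
    for i in range(1, levels + 1):
        a, d = pywt.dwt(a, wavelet)
        feats[f"E_D{i}"] = float(np.sum(d ** 2))       # Eq. (13)
        feats[f"E_A{i}"] = float(np.sum(a ** 2))       # Eq. (14)
        feats[f"SD_D{i}"] = float(np.std(d, ddof=1))   # Eqs. (15), (17)
        feats[f"SD_A{i}"] = float(np.std(a, ddof=1))   # Eqs. (16), (18)
    return feats
```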
To obtain the features applied to the classifiers, three datasets are formed according to recommendations in the literature; they are presented in Table 3. The first dataset includes the energy and standard deviation values of all approximation and detail sub-bands, as in [36]. The second and third datasets are designed according to the approach of [37], in which the last three detail sub-bands are evaluated during feature selection; there, the detail sub-bands are identified as the necessary ones containing meaningful information about seizures. Therefore, data_2 includes the EDA and SD features of all detail sub-bands, while data_3 involves only the EDA and SD features procured from the last three detail sub-bands.


Table 3 The features applied to hybrid classifiers (5th level decomposition is referenced)

Dataset | Features | Feature number
data_1 | 1. Energy values among all approximation sub-bands; 2. Energy values among all detail sub-bands; 3. Standard deviation values among all approximation sub-bands; 4. Standard deviation values among all detail sub-bands | 20
data_2 | 1. Energy values among all detail sub-bands; 2. Standard deviation values among all detail sub-bands | 10
data_3 | 1. Energy values among the last three detail sub-bands; 2. Standard deviation values among the last three detail sub-bands | 6
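Using the per-level features of the previous sketch, the three feature sets of Table 3 could be assembled as follows; taking the "last three detail sub-bands" to be D3-D5 is an assumption of this sketch.

```python
def build_datasets(feats, levels=5):
    # data_1: all 20 EDA/SD features of the approximation and detail sub-bands
    data_1 = [feats[f"{p}_{b}{i}"] for p in ("E", "SD") for b in ("A", "D")
              for i in range(1, levels + 1)]
    # data_2: EDA/SD of all detail sub-bands (10 features)
    data_2 = [feats[f"{p}_D{i}"] for p in ("E", "SD") for i in range(1, levels + 1)]
    # data_3: EDA/SD of the last three detail sub-bands (6 features)
    data_3 = [feats[f"{p}_D{i}"] for p in ("E", "SD") for i in (3, 4, 5)]
    return data_1, data_2, data_3
```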

3.3 Results and Evaluation on Global Optimization

The performance of the 10 CPSOs built on different chaotic maps and of GPSO is evaluated to reveal whether chaotic maps are necessary and to find out which one is the most convenient to use. The population size, maximum iteration number and number of independent runs are chosen as 40, 500 and 20, respectively, according to [13]. The dimension is set to 30, 50 and 100 so as to observe the performance on both low and high dimensional problems. The mean fitness values for the different dimensions are presented in Tables 4, 5 and 6, respectively.
According to Tables 4, 5 and 6:
• Gauss map based CPSO (GM-CPSO) outperforms the other methods in all function evaluations and in all dimensions.
• GM-CPSO is more convenient than the other techniques in both low and high dimensional problems.
• When the dimension is chosen as 100, the performance gap between GM-CPSO and the other methods generally widens. This again underlines the necessity of GM-CPSO, especially in high dimensional problems.
• Except for GM-CPSO, the CPSOs cannot overcome GPSO in all trials, which makes it questionable whether the other maps are necessary for the system dynamics at all.
• GM-CPSO can be efficiently preferred in unimodal (f1-f5), multimodal (f6-f10) and shifted-rotated (f11-f13) conditions instead of the other CPSO and GPSO algorithms.
To observe the convergence performance, Figs. 4, 5 and 6 plot the mean of the best function value versus the iteration number over 20 trials. Different functions and dimensions are handled to accurately present the convergence behaviour. According to the results, GM-CPSO attains better fitness values and converges to the best function value faster than the other algorithms. Moreover, GM-CPSO generally tends to converge to its best fitness within just 50-100 iterations, i.e., it needs very little time to complete convergence.
Table 7 presents the first literature comparison of GM-CPSO with recent, efficient algorithms. In Table 7, all algorithms except DW-PSO involve a chaotic process, and CDW-PSO also comprises dynamic weights in addition to the Sine map. The function dimension is chosen as 50 in parallel with [13], and the parameter settings of these algorithms can be found in [13].
SM-CPSO shows fitness values different from our Sine map results of Table 4, since it restricts the velocity boundaries to 0.1 times the position boundaries; still, both sets of results are similar, and each outperforms the other on some functions.

Table 4 Performance comparison of CPSOs according to chaotic maps (dimension = 30)
Functions No chaos Chebyshev Circle Gauss Iterative Logistic Piecewise Sine Singer Sinusoidal Tent

f1 3.58e+01 1.51e+02 1.18e+03 0 3.94e+02 9.36e+01 2.73e+02 1.64e+02 1.65e+01 1.99e+01 3.09e+03
f2 5.03e+00 1.13e+01 1.56e+01 0 1.66e+01 9.86e+00 1.40e+01 1.27e+01 5.38e+00 5.35e+00 2.39e+01
f3 0 0 0 0 0 0 0 0 0 2.68e−24 0
f4 1.22e+01 1.67e+01 2.07e+01 0 1.88e+01 1.86e+01 1.88e+01 1.91e+01 1.38e+01 1.34e+01 2.63e+01
f5 3.27e+01 1.58e+02 9.97e+02 6.54e+00 4.77e+02 7.89e+01 2.13e+02 1.31e+02 1.64e+01 2.05e+01 3.27e+03
f6 1.03e+01 1.01e+01 1.03e+01 1.00e+01 1.04e+01 1.02e+01 1.03e+01 1.04e+01 1.05e+01 1.03e+01 1.08e+01
f7 5.84e+01 7.20e+01 9.36e+01 0 8.84e+01 6.63e+01 8.48e+01 6.84e+01 5.06e+01 4.90e+01 1.34e+02
f8 4.25e+00 9.05e+00 1.02e+01 7.99e−15 1.03e+01 8.98e+00 9.46e+00 9.83e+00 6.39e+00 6.40e+00 1.10e+01
f9 1.03e+00 1.11e+00 2.02e+00 0 1.31e+00 1.08e+00 1.26e+00 1.17e+00 1.01e+00 1.02e+00 3.79e+00
f 10 1.50e+01 1.83e+01 1.95e+01 0 2.04e+01 1.91e+01 1.78e+01 1.97e+01 1.66e+01 1.55e+01 2.06e+01
f11 −118.60 −118.65 −118.61 −118.97 −118.63 −118.64 −118.62 −118.64 −118.62 −118.65 −118.63
f12 139.71 139.03 139.14 138.63 139.55 139.44 139.97 139.50 140.04 140.03 139.92
f13 −285.48 −285.50 −285.41 −285.52 −285.50 −285.52 −285.49 −285.49 −285.49 −285.42 −285.46
The best results are bolded in tables to signify the best method
(No chaos = GPSO)
Table 5 Performance comparison of CPSOs according to chaotic maps (dimension = 50)
Functions No Chaos Chebyshev Circle Gauss Iterative Logistic Piecewise Sine Singer Sinusoidal Tent

f1 3.75e+02 8.70e+02 4.34e+03 0 1.93e+03 6.71e+02 1.42e+03 8.49e+02 1.89e+02 2.12e+02 8.21e+03
f2 2.03e+01 2.93e+01 3.95e+01 0 4.03e+01 2.95e+01 3.58e+01 3.59e+01 2.02e+01 1.90e+01 5.11e+01
f3 0 0 0 0 0 0 0 0 3.94e−32 3.55e−29 0
f4 2.05e+01 2.55e+01 2.78e+01 0 2.64e+01 2.56e+01 2.64e+01 2.70e+01 2.36e+01 2.25e+01 3.04e+01
f5 3.96e+02 8.71e+02 3.86e+03 1.18e+01 1.91e+03 5.36e+02 1.37e+03 8.48e+02 1.60e+02 2.44e+02 7.98e+03
f6 2.03e+01 2.11e+01 2.13e+01 1.87e+01 2.14e+01 2.15e+01 2.13e+01 2.13e+01 2.07e+01 2.04e+01 2.26e+01
f7 1.42e+02 1.73e+02 2.36e+02 0 2.25e+02 1.56e+02 1.96e+02 1.84e+02 1.12e+02 1.29e+02 2.93e+02
f8 7.57e+00 1.14e+01 1.24e+01 7.99e−15 1.30e+01 1.17e+01 1.18e+01 1.18e+01 9.99e+00 9.44e+00 1.29e+01
f9 1.34e+00 1.79e+00 4.91e+00 0 2.74e+00 1.60e+00 2.28e+00 1.76e+00 1.17e+00 1.19e+00 8.39e+00
f 10 3.32e+01 3.69e+01 3.90e+01 2.00e−01 3.86e+01 3.72e+01 3.81e+01 3.82e+01 3.53e+01 3.47e+01 3.93e+01
f11 −118.53 −118.56 −118.56 −118.79 −118.55 −118.55 −118.55 −118.55 −118.57 −118.54 −118.54
f12 176.41 176.19 176.87 175.59 175.83 177.93 176.22 176.38 175.70 175.72 176.61
f13 −275.60 −275.52 −275.56 −275.64 −275.61 −275.57 −275.55 −275.63 −275.59 −275.63 −275.59
The best results are bolded in tables to signify the best method
(No chaos = GPSO)

Table 6 Performance comparison of CPSOs according to chaotic maps (dimension = 100)
Functions No Chaos Chebyshev Circle Gauss Iterative Logistic Piecewise Sine Singer Sinusoidal Tent

f1 4.41e+03 6.03e+03 1.75e+04 0 1.21e+04 5.51e+03 9.39e+03 6.83e+03 2.29e+03 2.85e+03 2.38e+04
f2 7.68e+01 9.85e+01 1.16e+02 1.50e+00 1.10e+02 9.16e+01 1.04e+02 9.50e+01 7.31e+01 7.60e+01 1.23e+02
f3 0 0 0 0 0 0 0 0 1.04e−31 3.19e−29 0
f4 3.25e+01 3.43e+01 3.60e+01 0 3.67e+01 3.44e+01 3.60e+01 3.44e+01 3.22e+01 3.29e+01 3.89e+01
f5 4.53e+03 6.95e+03 1.75e+04 2.42e+01 1.16e+04 5.41e+03 8.83e+03 7.26e+03 2.31e+03 2.69e+03 2.39e+04
f6 5.53e+01 6.09e+01 6.16e+01 4.12e+01 6.34e+01 6.10e+01 6.37e+01 6.25e+01 5.90e+01 5.84e+01 6.73e+01
f7 4.77e+02 5.29e+02 6.66e+02 0 6.29e+02 5.05e+02 5.83e+02 5.68e+02 4.01e+02 4.22e+02 7.53e+02
f8 1.15e+01 1.40e+01 1.44e+01 4.73e−01 1.47e+01 1.41e+01 1.44e+01 1.40e+01 1.36e+01 1.33e+01 1.44e+01
f9 4.97e+00 6.38e+00 1.67e+01 0 1.19e+01 5.83e+00 9.45e+00 7.28e+00 3.11e+00 3.55e+00 2.24e+01
f 10 8.08e+01 8.80e+01 8.93e+01 6.00e−01 8.86e+01 8.73e+01 8.92e+01 8.77e+01 8.63e+01 8.47e+01 9.05e+01
f11 −118.45 −118.44 −118.44 −118.46 −118.45 −118.45 −118.45 −118.44 −118.45 −118.44 −118.44
f12 273.33 272.50 271.94 270.94 273.71 271.11 271.87 272.51 272.58 273.50 272.93
f13 −250.65 −250.77 −250.70 −250.78 −250.66 −250.67 −250.73 −250.72 −250.62 −250.73 −250.73
The best results are bolded in tables to signify the best method
(No chaos = GPSO)

Fig. 4 Performance comparison for minimization of f4 with dimension = 30

Fig. 5 Performance comparison for minimization of f8 with dimension = 50

Fig. 6 Performance comparison for minimization of f7 with dimension = 100


Table 7 Literature comparison of recent-efficient algorithms (dimension = 50)


Functions SM-CPSO [13] CBBO [18] CKH [17] DW-PSO [13] CDW-PSO [13] GM-CPSO

f1 1.71e+02 1.77e+00 7.89e−01 6.64e−39 0 0


f2 5.14e+00 1.20e−01 2.31e+02 1.37e−20 0 0
f3 3.46e+03 7.55e+03 4.31e+02 9.99e−38 0 0
f4 1.11e+01 1.71e+01 9.71e+00 0 0 0
f5 2.12e+02 1.45e+00 5.15e−01 0 0 6.54e+00
f6 3.40e−03 9.01e−02 9.44e−02 0 0 1.00e+01
f7 6.05e+01 3.14e+02 6.43e+01 0 0 0
f8 4.37e+00 6.45e−01 8.07e+00 8.88e−16 8.88e−16 7.99e−15
f9 3.07e+00 8.25e−01 1.65e−01 0 0 0
f11 −119.08 −118.92 −119.03 −118.95 −118.95 −118.97
The best results are bolded in tables to signify the best method

As seen in Table 7:
• GM-CPSO obtains more reliable fitness scores than SM-CPSO in 8/10 function evaluations (all except f6 and f11). This again demonstrates the necessity of the Gauss map instead of the Sine map.
• GM-CPSO generally converges better than CBBO (except on f5 and f6).
• The CKH algorithm exhibits remarkable performance, but it scores better than GM-CPSO in only three function evaluations (f5, f6, f11).
• GM-CPSO and DW-PSO each attain the best fitness values in 6/10 trials. However, GM-CPSO outperforms DW-PSO on four problems (f1, f2, f3, f11), while DW-PSO overcomes GM-CPSO on three of its best scores (f5, f6, f8). Hence GM-CPSO stays one step ahead of DW-PSO.
• CDW-PSO obtains the best performance in 9/10 evaluations. Its performance is better than GM-CPSO's on three functions (f5, f6, f8), whereas GM-CPSO is better on one function (f11) when the dimension is kept at 50.
According to Table 7, GM-CPSO exhibits reliable performance among the peer approaches in a high dimensional search space. CDW-PSO contains a dynamic weights part in addition to its chaotic behaviour, whereas GM-CPSO only adds the Gauss map to GPSO. The performance of GM-CPSO is therefore promising in comparison with CDW-PSO, since it constitutes only a part of the CDW-PSO algorithm.
Table 8 presents the second literature comparison of the GM-CPSO algorithm, with the dimension chosen as 30. As seen in Table 8:
• GM-CPSO outperforms SM-CPSO in 9 function trials (all except f6).
• DW-PSO obtains the best fitness values on four functions, whereas GM-CPSO and CDW-PSO obtain the best scores in 7/10 of the function evaluations.
• It can be inferred that GM-CPSO can be utilized instead of CDW-PSO in low dimensional problems, since their performances compete with each other in low dimensional space.
According to the results in Sect. 3.3, the Gauss map is the only map coherent with the PSO algorithm, since the other choices cannot consistently outperform GPSO on global function optimization. Moreover, Gauss map based PSO (GM-CPSO) provides better performance than the other CPSOs in all trials and in all function evaluations. In [13],


Table 8 Literature comparison of recent-efficient algorithms (dimension = 30)


Functions SM-CPSO [13] DW-PSO [13] CDW-PSO [13] GM-CPSO

f1 1.21e+03 2.13e−27 0 0
f2 1.93e+00 9.84e−15 8.47e−37 0
f3 1.32e+04 3.46e−21 0 0
f4 1.81e+01 0 5.26e−43 0
f5 1.04e+03 0 0 1.18e+01
f6 9.61e−02 8.99e−49 0 1.87e+01
f7 1.86e+02 0 0 0
f8 6.82e+00 7.64e−15 8.88e−16 7.99e−15
f9 1.41e+01 0 0 0
f11 −118.77 −118.74 −118.76 −118.79
The best results are bolded in tables to signify the best method

the CDW-PSO algorithm outperformed many basic, improved and chaotic optimization algorithms, and the DW-PSO algorithm obtained promising results behind CDW-PSO. Accordingly, Tables 7 and 8 compare GM-CPSO with DW-PSO, CDW-PSO and chaotic optimization methods. The results reveal that GM-CPSO exhibits remarkable performance by achieving reliable fitness values across the trials. It also performs competitively considering that it constitutes only a part of the CDW-PSO algorithm, and its performance is especially notable on low dimensional function optimization. To better reveal the performances of the DW-PSO, CDW-PSO and GM-CPSO algorithms, hybrid NNs are designed to compare these algorithms on a different task involving hybrid system design, signal processing and pattern classification.

3.4 Results and Evaluation on Epileptic Seizure Recognition

The proposed frameworks are built by feeding data_1, data_2 and data_3 into the DW-PSO-NN, CDW-PSO-NN and GM-CPSO-NN algorithms. In the trials, the acceleration constants are changed subject to the rule c1 + c2 ≤ 4, as in [29]. For the hybrid classifiers, three parameters (hidden node number, population size and maximum iteration number) are examined in detail.
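A hedged sketch of the evaluation protocol of this section, twofold cross validation scored with the metrics of Eqs. (9)-(12), is given below; train_fn and predict_proba stand in for the hybrid classifier of Sect. 2.2, and the 0.5 decision threshold and the shuffling are assumptions of this sketch.

```python
import numpy as np

def twofold_cv(X, y, train_fn, predict_proba, threshold=0.5, seed=0):
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), 2)
    scores = []
    for k in range(2):
        test, train = folds[k], folds[1 - k]
        model = train_fn(X[train], y[train])
        p = predict_proba(model, X[test])
        pred = (p >= threshold).astype(int)
        tp = np.sum((pred == 1) & (y[test] == 1)); tn = np.sum((pred == 0) & (y[test] == 0))
        fp = np.sum((pred == 1) & (y[test] == 0)); fn = np.sum((pred == 0) & (y[test] == 1))
        acc = (tp + tn) / len(test)                                          # Eq. (9)
        sens, spec = tp / (tp + fn), tn / (tn + fp)                          # Eqs. (10)-(11)
        auc = np.mean(p[y[test] == 1][:, None] > p[y[test] == 0][None, :])   # Eq. (12)
        scores.append((acc, auc, sens, spec))
    return np.mean(scores, axis=0)   # averaged accuracy, AUC, sensitivity, specificity
```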
In the analysis of data_1, the appropriate decomposition level to be used in all trials is determined first. In the literature, the 4th, 5th and 6th decomposition levels are frequently preferred for better performance [24, 27, 36, 37]. Hence, the EDA and SD features attained at the 4th, 5th and 6th level decompositions are formed and fed into GM-CPSO-NN to find the convenient level; Fig. 7 shows the comparison of the decomposition levels.
In Fig. 7, the features of the 5th level decomposition yield better accuracy and specificity rates, while the AUC rates of the 5th and 6th levels stay nearly the same. Concerning this, the features formed by the 5th level decomposition are the most appropriate to use.
Table 9 compares the hybrid classifiers using data_1 obtained by the 5th level decomposition, together with the optimum parameter values at which each algorithm records its best scores. In the experiments, GM-CPSO-NN outperforms the other algorithms on all metrics and attains remarkable accuracy on the classification of 11,500 patterns.


Fig. 7 Comparison of the 4th, 5th and 6th level decompositions in terms of accuracy, AUC, sensitivity and specificity (bar chart)

Table 9 Optimum parameter values and the best results of hybrid classifiers on data_1
Methods c1 c2 Hidden node number Population size Maximum iteration number

Optimum parameter values


DW-PSO-NN 2.03 1.97 38 16 4000
CDW-PSO-NN 1.99 2.01 30 17 4500
GM-CPSO-NN 2.04 1.96 10 9 3500
Methods Accuracy AUC Sensitivity Specificity

The best results of hybrid classifiers


DW-PSO-NN 96.51 93.79 89.26 98.33
CDW-PSO-NN 96.51 94.15 90.22 98.09
GM-CPSO-NN 97.28 95.30 92.00 98.60
The best results are bolded in tables to signify the best method

Even though the accuracy scores of DW-PSO-NN and CDW-PSO-NN are equal, CDW-PSO-NN is revealed as the 2nd best method by obtaining higher AUC and sensitivity scores.
Table 10 presents the results obtained using data_2, i.e., the 5th level decomposition restricted to the detail sub-bands, together with the optimum parameter values. As seen in Table 10, GM-CPSO-NN outperforms the other approaches by distinct margins on the accuracy, AUC and sensitivity metrics. The 2nd best technique is again CDW-PSO-NN, which obtains higher accuracy, AUC and sensitivity rates than DW-PSO-NN.
Table 11 presents the results obtained using data_3, i.e., the 5th level decomposition restricted to the last three detail sub-bands, together with the optimum parameter values. As seen in Table 11, GM-CPSO-NN achieves higher performance with respect to the accuracy, AUC and sensitivity rates. For data_3, the 2nd best algorithm is DW-PSO-


Table 10 Optimum parameter values and the best results of hybrid classifiers on data_2
Methods c1 c2 Hidden node number Population size Maximum iteration number

Optimum parameter values


DW-PSO-NN 2.02 1.98 10 5 500
CDW-PSO-NN 2.02 1.98 18 17 500
GM-CPSO-NN 2.03 1.97 16 9 4500
Methods Accuracy AUC Sensitivity Specificity

The best results of hybrid classifiers


DW-PSO-NN 95.32 90.08 81.35 98.82
CDW-PSO-NN 95.97 92.32 86.22 98.41
GM-CPSO-NN 97.24 95.67 93.04 98.29
The best results are bolded in tables to signify the best method

Table 11 Optimum parameter values and the best results of hybrid classifiers on data_3
Methods c1 c2 Hidden node number Population size Maximum iteration number

Optimum parameter values


DW-PSO-NN 1.9 2.1 11 11 500
CDW-PSO-NN 2.08 1.92 12 18 500
GM-CPSO-NN 2.09 1.91 4 5 1500
Methods Accuracy AUC Sensitivity Specificity

The best results of hybrid classifiers


DW-PSO-NN 96.12 94.71 92.35 97.07
CDW-PSO-NN 95.88 92.42 86.65 98.18
GM-CPSO-NN 96.90 95.08 92.04 98.12
The best results are bolded in tables to signify the best method

NN, since its accuracy, AUC and sensitivity scores come one step ahead of those of the CDW-PSO-NN algorithm.
According to Tables 9, 10 and 11, the GM-CPSO-NN algorithm can be preferred as the best technique among the hybrid approaches; accordingly, the proposed framework should include GM-CPSO-NN. This also proves that GM-CPSO is more compatible with the NN and can be preferred over the DW-PSO and CDW-PSO algorithms for forming a hybrid architecture. To determine the features to be used in the framework, Table 12 compares the datasets using GM-CPSO-NN as the classifier unit.
In Table 12, GM-CPSO-NN achieves the best accuracy (97.28%) and specificity (98.60%) on data_1, exceeding the corresponding data_2 results by 0.04% and 0.31%, respectively. On data_2, the sensitivity is 1% higher and the AUC is 0.37% higher than on data_1. In other words, the accuracy scores of data_1 and data_2 are very similar (a 0.04% difference), while the AUC and sensitivity rates on data_1 come one

Table 12 Performance comparison of datasets


Dataset Accuracy AUC Sensitivity Specificity

data_1 97.28 95.30 92.00 98.60


data_2 97.24 95.67 93.04 98.29
data_3 96.90 95.08 92.04 98.12
The best results are bolded in tables to signify the best method

step behind the results obtained on data_2. Concerning this, data_2 seems the more proper choice for our framework.
Besides the four metric based evaluations, the three hybrid classifiers are compared on data_2 in terms of computational complexity (consumed time) under the same parameter settings. The consumed times of the DW-PSO-NN and CDW-PSO-NN methods are found to be 78.83 s and 75.36 s, respectively, whereas GM-CPSO-NN completes the entire (training + test) process of twofold cross validation with 500 iterations in less time (72.30 s).
According to the deductions of this section, GM-CPSO-NN outperforms DW-PSO-NN and CDW-PSO-NN by achieving higher success rates (especially better accuracy and AUC scores) and lower computational complexity on epileptic seizure recognition. It can thus be inferred that GM-CPSO is more appropriate than the DW-PSO and CDW-PSO methods for designing hybrid NNs. Therefore, GM-CPSO is not only effective on global function optimization; it also exhibits reliable performance directly on hybrid system design and indirectly on pattern classification and signal processing. The second deduction of this section concerns the usefulness of the detail sub-bands. In this study, the information in the detail sub-bands comes to the forefront for better performance, as in [37]. However, using all detail sub-bands ensures better performance than using only the last three, whereas the approximation sub-bands appear unsuitable for seizure detection. Therefore, the proposed framework should involve the EDA and SD features attained at the 5th level decomposition among all detail sub-bands, and it should employ the GM-CPSO-NN algorithm as the classifier unit.
Given the deductions stated above, the proposed framework is shown in Fig. 8. The first part applies the DWT to the EEG signal. For feature extraction, the EDA and SD are computed over all detail sub-bands of the 5th level decomposition, and all values are combined into a vector for every sample. The resulting dataset constitutes the input of the GM-CPSO-NN classifier.
In this paper, epileptic seizure recognition is performed on the UCI machine learning repository data. This dataset stands out from other datasets (Bonn, etc.) in that it handles 1 s EEG analysis. Epileptic seizure detection from a one second signal is more challenging, since it always carries less information than longer EEG signals; consequently, seizure recognition remains complicated with this data.
In Table 13, the GM-CPSO-NN based framework is compared with the state-of-the-art studies. As mentioned before, one second EEG analysis is challenging, and the task becomes even more challenging as the pattern number increases. As seen in Table 13, we use all 11,500 patterns of the UCI dataset, and despite this, the proposed framework achieves remarkable classification performance in comparison with


Fig. 8 The formation of proposed framework

Table 13 Literature comparison on epileptic pattern recognition

Study | Year | Methods | Dataset | Pattern number | Accuracy
Tawfik et al. [24] | 2016 | Framework including WPE and SVM | Bonn | 500 | 93.75
Osman and Alzahrani [25] | 2019 | SOM-RBF | UCI | Not directly declared | 97.47
Gupta et al. [26] | 2018 | System involving DCT based filterbank, Hurst exponent, ARMA and SVM | Bonn | 500 | 98
Al-Sharhan and Bimba [27] | 2019 | Model containing MPC-GA and SVM | UCI | 4800 | 98.01
This study | 2019 | EDA and SD among all detail sub-bands + GM-CPSO-NN | UCI | 11,500 | 97.24
The best results are bolded in tables to signify the best method

other studies that use fewer patterns and EEG segments carrying more detailed information over longer durations (in seconds).

4 Conclusions

In this paper, the chaotic behavior of PSO is extensively examined, and Gauss map based PSO (GM-CPSO) is found to be superior to the other CPSOs and to GPSO in all function evaluations and in all dimensions. Concerning this, the choice of the Gauss map for the inertia weight in PSO is revealed to be more efficient than the Sine map and the linearly decreasing scheme (0.9 → 0.4). Besides, the proposed method converges in a very short time (50-100 iterations) in comparison with the CPSO and GPSO techniques. In comparison with other chaotic methods, GM-CPSO achieves remarkable results even though it comprises fewer


components in its algorithm, and it ordinarily achieves better performance than the CBBO, CKH and DW-PSO methods. In comparison with CDW-PSO, GM-CPSO seems promising, since it forms only one part of CDW-PSO. Consequently, GM-CPSO can be efficiently preferred over CDW-PSO in low dimensional optimization.
A detailed analysis of EEG signals has been performed to recognize the epileptic ones among 5 folders. As seen in the experiments, feature extraction using the EDA and SD features of all detail sub-bands of the 5th level decomposition yields better scores and can be used for the EEG analysis of other diseases in future evaluations. As a hybrid classifier, GM-CPSO-NN achieves promising performance against other classifiers based on two state-of-the-art optimization methods (DW-PSO and CDW-PSO); therefore, GM-CPSO-NN can be preferred for the pattern classification of different datasets. The proposed framework obtains remarkable performance while processing 11,500 patterns of one second EEG analysis. Furthermore, GM-CPSO seems more reliable than the DW-PSO and CDW-PSO algorithms for use in hybrid NNs, since it minimizes the MSE better than these approaches.
For future work, GM-CPSO will be applied in the machine learning area to improve the performance of different classifiers. Besides, we plan to utilize the proposed framework, including EDA, SD and GM-CPSO-NN, to detect other medical conditions in EEG signals.

Acknowledgements This work is supported by the Coordinatorship of Konya Technical University’s Scientific
Research Projects.

References
1. Xie Z, Jin L, Du X, Xiao X, Li H, Li S (2019) On generalized RMP scheme for redundant robot manip-
ulators aided with dynamic neural networks and nonconvex bound constraints. IEEE Trans Ind Inform
15(9):5172–5181
2. Dai Z (2014) Extension of modified Polak–Ribiere–Polyak conjugate gradient method to linear equality
constraints minimization problems. Abstr Appl Anal 2014:1–9
3. Zhou W, Wang F (2015) A PRP-based residual method for large-scale monotone nonlinear equations.
Appl Math Comput 261:1–7
4. Dong XL, Han DR, Ghanbari R, Li XL, Dai ZF (2017) Some new three-term Hestenes–Stiefel conjugate
gradient methods with affine combination. Optimization 66(5):759–776
5. Chu HH, Qian WM, Chu YM, Song YQ (2016) Optimal bounds for a Toader-type mean in terms of
one-parameter quadratic and contraharmonic means. J Nonlinear Sci Appl 9(5):3424–3432
6. Liu C, Gong Z, Teo KL, Sun J, Caccetta L (2017) Robust multi-objective optimal switching control arising
in 1,3-propanediol microbial fed-batch process. Nonlinear Anal Hybrid 25:1–20
7. Wei L, Jin L, Yang C, Chen K, Li W (2019) New noise-tolerant neural algorithms for future dynamic
nonlinear optimization with estimation on hessian matrix inversion. IEEE T Syst Man Cyb Syst. https://
doi.org/10.1109/TSMC.2019.2916892
8. Yang XS (2014) Nature-inspired optimization algorithms. Elsevier, New York
9. Shi Y, Eberhart R (1998) A modified particle swarm optimizer. In: Proceedings of 1998 IEEE international
conference on evolutionary computation, pp 69–73
10. Koyuncu H, Ceylan R (2018) A PSO based approach: scout particle swarm algorithm for continuous
global optimization problems. J Comput Des Eng 6(2):129–142
11. Koyuncu H, Ceylan R, Asoglu S, Cebeci H, Koplay M (2019) An extensive study for binary characteri-
sation of adrenal tumours. Med Biol Eng Comput 57(4):849–862
12. Xu X, Rong H, Trovati M, Liptrott M, Bessis N (2018) CS-PSO: chaotic particle swarm optimization
algorithm for solving combinatorial optimization problems. Soft Comput 22(3):783–795
13. Chen K, Zhou F, Liu A (2018) Chaotic dynamic weight particle swarm optimization for numerical function
optimization. Knowl Based Syst 139:23–40


14. Daneshyari M (2010) Chaotic neural network controlled by particle swarm with decaying chaotic inertia
weight for pattern recognition. Neural Comput Appl 19(4):637–645
15. Zhang M, Li F, Guan W, Wu Y, Xie C, Peng Q, Liu X (2018) A three-dimensional indoor positioning
technique based on visible light communication using chaotic particle swarm optimization algorithm.
Optik 165:54–73
16. Wang C, Yu T, Shao G, Nguyen TT, Bui TQ (2019) Shape optimization of structures with cutouts by
an efficient approach based on XIGA and chaotic particle swarm optimization. Eur J Mech A Solid.
74:176–187
17. Wang GG, Guo L, Gandomi AH, Hao GS, Wang H (2014) Chaotic krill herd algorithm. Inf Sci 274:17–34
18. Saremi S, Mirjalili S, Lewis A (2014) Biogeography-based optimisation with chaos. Neural Comput Appl
25(5):1077–1097
19. Niu P, Chen K, Ma Y, Li X, Liu A, Li G (2017) Model turbine heat rate by fast learning network with
tuning based on ameliorated krill herd algorithm. Knowl Based Syst 118:80–92
20. Handojoseno AA, Shine JM, Nguyen TN, Tran Y, Lewis SJ, Nguyen HT (2015) Analysis and prediction
of the freezing of gait using EEG brain dynamics. IEEE Trans Neural Syst Rehabilit Eng 23(5):887–896
21. Sushkova OS, Morozov AA, Gabova AV (2016) A method of analysis of EEG wave trains in early stages
of Parkinson’s disease. In: 2016 International conference on bioinformatics and systems biology (BSB),
pp 1–4
22. Tahaei MS, Jalili M, Knyazeva MG (2012) Synchronizability of EEG-based functional networks in early
Alzheimer’s disease. IEEE Trans Neural Syst Rehabilit Eng 20(5):636–641
23. Mahmoodian N, Boese A, Friebe M, Haddadnia J (2019) Epileptic seizure detection using cross-
bispectrum of electroencephalogram signal. Seizure 66:4–11
24. Tawfik NS, Youssef SM, Kholief M (2016) A hybrid automated detection of epileptic seizures in EEG
records. Comput Electr Eng 53:177–190
25. Osman AH, Alzahrani AA (2019) New approach for automated epileptic disease diagnosis using an integrated self-organization map and radial basis function neural network algorithm. IEEE Access 7:4741–4747
26. Gupta A, Singh P, Karlekar M (2018) A novel signal modeling approach for classification of seizure and
seizure-free EEG signals. IEEE Trans Neural Syst Rehabilit Eng 26(5):925–935
27. Al-Sharhan S, Bimba A (2019) Adaptive multi-parent crossover GA for feature optimization in epileptic
seizure identification. Appl Soft Comput 75:575–587
28. Ceylan R, Koyuncu H (2019) A novel rotation forest modality based on hybrid NNs: RF (ScPSO-NN). J
King Saud Univ Comput Inf Sci 31(2):235–251
29. Ceylan R, Koyuncu H (2016) A new breakpoint in hybrid particle swarm-neural network architecture:
individual boundary adjustment. Int J Inf Technol Decis Mak 15(6):1313–1343
30. Feng D, Wenkang S, Liangzhou C, Yong D, Zhenfu Z (2005) Infrared image segmentation with 2-D max-
imum entropy method based on particle swarm optimization (PSO). Pattern Recognit Lett 26(5):597–603
31. Suganthan PN, Hansen N, Liang JJ, Deb K, Chen YP, Auger A, Tiwari S (2005) Problem definitions and
evaluation criteria for the CEC 2005 special session on real-parameter optimization. KanGAL report,
2005005
32. https://archive.ics.uci.edu/ml/datasets/Epileptic+Seizure+Recognition
33. Andrzejak RG, Lehnertz K, Mormann F, Rieke C, David P, Elger CE (2001) Indications of nonlinear
deterministic and finite-dimensional structures in time series of brain electrical activity: dependence on
recording region and brain state. Phys Rev E 64(6):061907
34. Koyuncu H, Ceylan R (2015) Scout particle swarm optimization. In: 6th European conference of the
international federation for medical and biological engineering, pp 82–85
35. Fawcett T (2006) An introduction to ROC analysis. Pattern Recognit Lett 27(8):861–874
36. Gandhi T, Panigrahi BK, Anand S (2011) A comparative study of wavelet families for EEG signal
classification. Neurocomputing 74(17):3051–3057
37. Zhou W, Liu Y, Yuan Q, Li X (2013) Epileptic seizure detection using lacunarity and Bayesian linear
discriminant analysis in intracranial EEG. IEEE Trans Bio-med Eng 60(12):3375–3381

Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and
institutional affiliations.


Affiliations

Hasan Koyuncu1

Correspondence: Hasan Koyuncu, hkoyuncu@ktun.edu.tr
1 Electrical and Electronics Engineering Department, Faculty of Engineering and Natural Sciences,
Konya Technical University, 42250 Konya, Turkey

