
Computers in Biology and Medicine 63 (2015) 196–207


A Hybrid Swarm Algorithm for optimizing glaucoma diagnosis


Chandrasekaran Raja a,*, Narayanan Gangatharan b

a Department of ECE, Anjalai Ammal Mahalingam Engineering College, Kovilvenni 614403, India
b Department of ECE, R.M.K. College of Engineering and Technology, Puduvoyal 601206, India

ARTICLE INFO

Article history:
Received 30 December 2014
Accepted 21 May 2015

Keywords:
Glaucoma
Hyper analytic wavelet transform
Hybrid PSO–GSO
Feature extraction
Support Vector Machines

ABSTRACT

Glaucoma is among the most common causes of permanent blindness in humans. Because the initial symptoms are not evident, mass screening would assist early diagnosis in the vast population. Such mass screening requires an automated diagnosis technique. Our proposed automation consists of pre-processing, optimal wavelet transformation, feature extraction, and classification modules. Hyper-analytic wavelet transformation (HWT) based statistical features are extracted from fundus images. Because HWT preserves phase information, it is appropriate for feature extraction. The features are then classified by a Support Vector Machine (SVM) with a radial basis function (RBF) kernel. The filter coefficients of the wavelet transformation process and the SVM-RBF width parameter are simultaneously tailored to best fit the diagnosis by the hybrid Particle Swarm algorithm. To overcome premature convergence, the Group Search Optimizer (GSO) random searching (ranging) and area scanning behavior (around the optima) are embedded within the Particle Swarm Optimization (PSO) framework. We also embed a novel potential-area scanning as a preventive mechanism against premature convergence, rather than diagnosis and cure. This embedding does not compromise the generality and utility of PSO. In two 10-fold cross-validated test runs, the diagnostic accuracy of the proposed hybrid PSO exceeded that of conventional PSO. Furthermore, the hybrid PSO maintained the ability to explore even at later iterations, ensuring maturity in fitness.

© 2015 Elsevier Ltd. All rights reserved.

1. Introduction

Glaucoma is among the leading causes of blindness worldwide. Furthermore, the loss of vision caused by glaucoma is irreversible [1]. The most common type, primary open angle glaucoma, now affects 60.5 million people worldwide. This number is expected to rise to 79.6 million in 2020, as the global population ages [2]. In the US, the projected number of patients with primary open angle glaucoma is 7.32 million in 2050 [3]. In a 2012 study, the affected population above 40 years of age in India was reported as 11.2 million [4]. Glaucoma is a widespread optic neuropathy characterized by optic disc damage and visual loss [5]. Glaucoma is conventionally diagnosed by intraocular pressure (>22 mmHg without medication), glaucomatous cupping of the optic disc, and glaucomatous visual field defects [6]. The factors that influence optic disc size, such as age, gender, race, refractive error, axial length (AL), and central corneal thickness (CCT), have been extensively investigated [7]. The importance of optic disc size as an independent risk factor for glaucomatous optic neuropathy remains controversial [7] and presents a serious drawback to structural feature extraction for automated diagnosis.

Early diagnosis of glaucoma is crucial for preventing permanent structural damage and irreversible vision loss [8]. Currently available imaging techniques for examining the retinal nerve fiber layer (RNFL) thickness and the optic disc in glaucoma include confocal scanning laser ophthalmoscopy (CSLO), optical coherence tomography (OCT), and scanning laser polarimetry (SLP). Each of these techniques uses different technologies and light sources to characterize the distribution of the RNFL and/or the optic disc topography. CSLO builds a three-dimensional representation of the retinal surface height from multiple image sections acquired by confocal technology. OCT determines the thickness of the circumpapillary RNFL by interferometry and a reflection-based edge-detection algorithm. SLP uses light to penetrate the birefringent RNFL and estimates the RNFL thickness from the linear relationship between RNFL birefringence and the retardation of reflected light [9]. A portable retinal camera is a cost-effective method of screening for diabetic retinopathy in isolated communities of at-risk individuals [10]. Bock and Meier highlighted that, unlike structural features, statistical methodologies require no precise measurements of geometric structures because they perform statistical data mining on the image patterns themselves. Statistical methodologies can be transferred to other domains and might extract additional parameters, providing new insights into other ophthalmic questions [11].

* Corresponding author.
E-mail address: rajachandru82@yahoo.co.in (C. Raja).

http://dx.doi.org/10.1016/j.compbiomed.2015.05.018
0010-4825/© 2015 Elsevier Ltd. All rights reserved.

Mass screening would help identify glaucoma disease among the vast population but requires an automated glaucoma diagnosis technique [12]. Diseased regions in digital fundus images are diagnosed by structural and statistical computations. Statistical analysis has become the recent trend in automated glaucoma diagnosis and is commonly performed by wavelet analysis. Wavelets are mathematical functions that decompose data into their frequency components. Each component is then investigated at the resolution that matches its scale [13]. The approximate coefficients decompose the image into dyadic resolutions, while the detailed coefficients provide significant edge information. The discrete wavelet transform (DWT) is less suitable for pattern recognition tasks because it lacks shift invariance and has poor directional selectivity [14]. Kingsbury [14] introduced quadrature phase-shifted filter coefficients, which simultaneously operate on the original data to preserve the phase information. Because they incorporate the phase spectrum along with the conventional magnitude spectrum, the wavelet coefficients significantly improve feature extraction [15]. The statistical parameters extracted from the wavelet coefficients are energy and entropy, both of which are suitable for feature extraction from retinal images [16].

The HWT is another wavelet transformation that inherits the classical mother wavelets [17] because the input data are phase-shifted instead of the coefficients, which are left unmodified by the phase switching. Moreover, HWT admits application-specific optimization because the coefficients can be generated from randomly generated angular parameters.

The features obtained from the wavelet transformation are classifiable by a wide range of available classifiers. The SVM is a supervised learning method used for classification and regression [18], which is extensively applied to biomedical signal classification [19]. Basically, SVM transforms the feature space into a hypothesis space of linearly separable targets. The kernel functions of the SVM introduce non-linearity to the hypothesis space without explicitly requiring a nonlinear algorithm [18]. In SVM, the selection of an appropriate kernel function and its parameter values [19] is of primary importance.

The statistical parameters (for feature extraction) and the classification parameters (for decision making) are simultaneously optimized to attain the best execution. Particle Swarm Optimization (PSO) is an evolutionary optimization algorithm inspired by the navigation of social animals, proposed by Kennedy and Eberhart [20,21]. Biological phenomena such as swarming and flocking give rise to collective intelligence as an innovative distributed optimization. PSO is recognized for its global search ability [22], ease of implementation, few constraints and fast convergence [23]. The swarms are initialized by random encoding. Like any other optimization, the initial swarms should be constraint-free, randomly producible quantities. For example, in this work, the angular parameters are initiated and tailored to modulate their derived wavelet coefficients. In the execution stage, the swarms browse the universe for the ultimate solution (optima) by iteratively updating their position through velocity modulations. In turn, the velocity is modulated by the inertia weight factor and is influenced by the local optima (self-memory) and global optima (social experience). The inertia weight factor trades off the local optimum, which promotes exploitation of a zone, against the global optimum, which encourages the swarm to explore the universe. The velocity is also modulated by the previous velocity [24]. In PSO, a particular member's own best-fit score over all iterations is the fitness of the local best, the p-best. The best-fit member in the population over all iterations is called the g-best.

Another bio-inspired optimization algorithm is Group Search Optimization (GSO). This algorithm relies on direct fitness information rather than collateral information [25]. The chance of finding the ultimate fitness is proportional to the available resources, the efficiency of mining and locating resources, and adaptability to the environment [26]. The GSO framework is primarily grounded on the producer–scrounger model, which assumes that group members search by finding (producers) or by joining (scroungers). The finding-other (ranger) role is a peculiar, privileged searching strategy that discloses potential zones.

The population is initialized, and the fitness of all members is evaluated (accuracy of diagnosis). The fittest member is assigned the producer role. The producer scans a certain distance in some initial direction, searching for fitter nearby solutions. In this manner, the producer ensures that it remains optimal in the zone, a strategy that is unavailable in PSO. Scroungers are less prominent members with lower fitness. They join the producers in searching the same optimum. Rangers are also less prominent members that navigate the space in any orientation, preventing entrapment in local optima [26]. All members can change their roles depending on their fitness in a given iteration.

In any optimization technique (Genetic Algorithm (GA), PSO, GSO), wrapping the parameters that belong to different steps of optimization leads to more honest outcomes as the algorithm proceeds. The importance of simultaneously wrapping the parameters at different stages of the operation is highlighted in [27]. Liu et al. [27] simultaneously optimized the SVM kernel and feature selection, and referred to the methodology of integrating the optimization of different parameters as "Improved". In the works of Liu et al. [27] and Huang et al. [28], the SVM kernel parameter, integrated with other parameters, is optimized by PSO. Huang [29] optimized the SVM-RBF kernel width along with the features using a binary-coded GA. The chromosomes in GA resemble the swarm members in PSO.

2. Premature convergence problem in PSO

The standard PSO converges over time and loses diversity [27]. Nakisa et al. [30] noted that the conventional PSO algorithm satisfactorily finds optimal solutions but often fails to find the universally applicable solution. Therefore, whether PSO reaches the global or a local optimum cannot be known. If PSO fails to find the global optimum, the swarm becomes trapped in premature convergence [30].

Nakisa et al. [30] state that premature convergence will always prevail in conventional PSO because the entire search space must be matched to obtain the optimal resolution. As the goals of maintaining high diversity and achieving fast convergence are partially antagonistic, it makes perfect sense to reduce the number of sub-optimal solutions found in the optimization.

To remedy premature convergence and other problems in PSO, researchers have modified the PSO parameters, incorporated mutations within the PSO framework, hybridized PSO and GA, or adopted other evolutionary techniques [30].

The inertia weight balances exploration (large inertia weight) and exploitation (small inertia weight) [31–33]. A linearly decreasing inertia weight favors exploitation as the iterations proceed [34,35]. Eventually, the algorithm is expected to converge to a profitable zone. In another interesting modification of conventional PSO, swarms are induced to attract when far away and to repel when they approach too closely [36]. Tang and Zhou [37] applied a mutation fix when premature convergence was identified. Chuang et al. [38] reset the global optimum after a long period of no update.

2.1. Diagnosis of premature convergence

In most research, premature convergence is diagnosed by monitoring the variation in the fitness function. For example, Tiang [32] assessed the normalized difference between the

variance of the fitness of a member and the average fitness. Premature convergence is also detected when the fitness variance falls below a specified threshold and the fittest member does not reach the expected fitness [37]. In [27], duplicated particles with very similar fitness functions are identified and subsequently withdrawn. Zhan et al. [39] quantified premature convergence by monitoring the evolutionary status of the swarm throughout the iterations.

2.2. Premature convergence therapy and prevention

In most works, the universe comprises members identified by their fitness values. For instance, Zeng et al. [40] described the universe as the search space of objective functions. Conversely, Jie et al. [41] defined the premature state as the convergence of particles around one stage. These authors state that to avoid premature convergence and maintain an active particle population, we should mainly adjust the distance between the particles and the optimal position.

Nakisa et al. [30] addressed the problem of non-scanning of potential areas when the swarm becomes trapped in a sub-optimal zone. In this case, the swarm cannot explore other promising areas. In [42], premature convergence is diagnosed by a consistent g-best throughout the iterations and is corrected by a leap.

Other researchers use the Euclidean distance as a diagnostic parameter of PSO convergence. Jie et al.'s [41] algorithm pushes a member aside if its Euclidean distance becomes too close to an optimum, to avoid ambush by premature convergence. Zhang et al. [43] adopted a hybrid Differential Evolution–PSO (DE-PSO), which updates the particle positions by DE if the Euclidean distance between the p-best and the current best neighbor becomes minimal.

Our work identifies a particle by its location and quantifies premature convergence in terms of the Euclidean distance between the particles' positions.

2.3. Hybrid PSO

Various efforts have been made to embed the unique operators of a particular technique into another methodology. For example, the mutation and crossover operators of GA can be adopted in PSO [44]. Specifically, hybridization improves the local searching ability of PSO [44]. In the hybrid PSO–GA approach of Yang et al. [45], a genetic operation is executed on selected members following a PSO step. Zeng et al. switched between PSO and GSO modes [40]. They combined the step search mechanism of PSO with the angle search mechanism of GSO, securing the advantages of both algorithms.

2.3.1. Rangers in the PSO framework

Rangers undertake random walks, considered the most efficient way to seek randomly located resources. In PSO, members always remain clustered in a group; none of the members fly away to search other spaces [46,47]. By contrast, GSO finds new clues leading toward the prey (a better location) by the random-walk style of the rangers.

Rather than diagnosis and cure, our work prevents premature convergence by a ranging action, which is executed on an equal scale and in parallel to the PSO search. The rangers assist in identifying potential zones. Potential members are defined as members with fitness comparable to the g-best but located far from the g-best (in terms of Euclidean distance).

The potential zone is scanned for better solutions. If a better member is found within the potential zone, the swarm escapes the local optimum. In addition, as a preventative measure against premature convergence, a ranger may directly become the ultimate member. Since the rangers are embedded at the stage of finding new optima, our algorithm preserves the generality and utility of conventional PSO.

2.3.2. Scanning around the optimal member in the PSO framework

The poor local search ability of PSO has rarely been addressed. Ant Colony Optimization (ACO) has been adopted for local searching in the PSO framework [48], and the solution obtained at each step has been refined by sequential quadratic programming (SQP) [49]. In our work, we embed the "scanning around the optima" strategy of GSO into the PSO framework.

3. Methods

The images used in this study are taken from the "RIM-ONE" database. Out of 158 extracted images, 84 and 74 belong to the normal and diseased classes, respectively. The operation of the

[Fig. 1 shows the optimization loop: the input images are pre-processed, transformed by the hyper-analytic wavelet transform, and classified by the SVM; the fitness is evaluated and, until the termination criteria are met, the hybrid PSO updates the population of wavelet angular parameters (θ1, θ2, …, θN) and the SVM σ parameter, finally yielding the fittest wavelet coefficients and the fittest σ.]

Fig. 1. Block diagram of the proposed algorithm.
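The potential members introduced in Section 2.3.1 (fitness comparable to the g-best but located far from it in Euclidean distance) could be flagged as in the following sketch; the closeness tolerance and minimum distance are hypothetical illustration values, not taken from the paper.

```python
import numpy as np

def potential_members(pos, fitness, gbest_idx, fit_tol=0.02, min_dist=1.0):
    """Indices of potential members: fitness within fit_tol of the g-best
    while lying at least min_dist away from it (Euclidean). Both
    thresholds are hypothetical illustration values, not the paper's."""
    dist = np.linalg.norm(pos - pos[gbest_idx], axis=1)
    mask = (fitness[gbest_idx] - fitness <= fit_tol) & (dist >= min_dist)
    mask[gbest_idx] = False               # the g-best itself is excluded
    return np.flatnonzero(mask)

# Member 2 is nearly as fit as the g-best (member 0) but sits far away:
flagged = potential_members(np.array([[0.0, 0.0], [0.1, 0.0], [3.0, 4.0]]),
                            np.array([0.90, 0.895, 0.89]), gbest_idx=0)
```

Scanning then proceeds around the flagged members' zones, so a far-away, nearly-as-fit member can pull the swarm out of a local optimum.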



hybrid PSO is shown in Fig. 1. The input images are pre-processed before the wavelet transformation. The hyper-analytic wavelet transform coefficients and the SVM-RBF σ parameter are simultaneously optimized as detailed in Section 3.6. The images are transformed by the optimized wavelet functions and the statistical features are extracted.

The SVM classifier learns the features in a supervised fashion and tests whether a new set of features matches the learned features of any class.

3.1. Pre-processing

Prior to feature extraction, the fundus images are pre-processed by grayscale conversion and histogram equalization.

3.1.1. Grayscale conversion

The fundus images in tri-planar RGB format are reformatted to a uniplanar grayscale. The intensity varies from 0 (absolute black) to 255 (absolute white). The 512 × 512 pixel image is represented in a matrix of the same size. A single intensity value from each pixel is appropriate for the statistical operations.

3.1.2. Histogram equalization

Generally, the raw grayscale images do not cover the entire range of gray levels (from 0 to 255) but converge in a certain region of the scale. This convergence localizes the brightness and contrast visualization of the images. For statistical processing, the histogram is equalized such that the gray levels span 0–255. Representative fundus images of normal and glaucomatous eyes are shown in Fig. 2.

3.2. Hyper-analytic wavelet transformation (HWT)

The wavelet transform conveys the information content of an image as a hierarchy of its frequency content, subsequently providing the significant statistics of the data. In HWT, the input data (not the wavelet coefficients) are Hilbert transformed. The Hilbert transform is defined such that if F1(ω) and F2(ω) are Hilbert transform pairs, then F1(ω) = ±jF2(ω), where j is negative (positive) for positive (negative) frequencies. The two low-pass filters then form orthogonal bases [50]. The hyper-analytic wavelet associated with an image f(x, y) and its 2D-DWT Ψ(x, y) is denoted Ψ_h(x, y):

Ψ_h(x, y) = Ψ(x, y) + i·H_x{Ψ(x, y)} + j·H_y{Ψ(x, y)} + k·H_x H_y{Ψ(x, y)}.  (1)

In Eq. (1), the terms i, j, and k denote imaginary numbers in two dimensions. Ψ represents the wavelet transformation of the corresponding orientation (enclosed in braces). The complex terms i, j, and k refer to the wavelet transformations applied to the Hilbert-transformed rows, columns, and both dimensions of the image, respectively. Each of these terms preserves the phase in the corresponding direction(s).

In our work, features are extracted by adding and subtracting the first term of (1) and the k-term, thereby obtaining the spectra in π/4 and −π/4, respectively. Similarly, the i- and k-terms are added and subtracted to obtain the spectra in different orientations. The outcome is four operations.

Finally, for each of the four sub-bands of the first-level wavelet decomposition (A: approximate, H: horizontal, V: vertical and D: diagonal), we obtain four manipulated outcomes. Hence 16 outputs are available from the four sub-bands.

3.2.1. Feature extraction

The mean, energy and entropy were extracted from the sub-band outcomes of the HWT. In previous studies, the mean gray-level intensity, as well as the entropy and energy, has provided significant statistical quantification of the sub-bands [18]. Raja et al. [16] quantified the uniqueness and discriminatory potential of the energy and entropy measures in quantifying wavelet sub-bands. According to Huang et al. [51], the energy measures the response of the image to the specific scale and orientation of the filters, while the measured randomness of the grey levels in a sub-band subjectively quantifies the texture.

The mean intensity, energy and entropy of an m × n sub-band (ω) are respectively calculated as follows:

mean(ω) = (Σ_{i=1}^{m} Σ_{j=1}^{n} ω(i, j)) / (m·n)  (2)

energy(ω) = (Σ_{i=1}^{m} Σ_{j=1}^{n} {ω(i, j)}²) / (m² + n²)  (3)

entropy(ω) = −Σ_{i=1}^{m} p_i log₂ p_i.  (4)

3.3. Support Vector Machines

The problem is to categorize a set of n training vectors which are linearly non-separable. We denote the training vectors by T = [T_1, T_2, T_3, …, T_n] and the targets by out_i ∈ {+1, −1}. The classes are distinguished by the following linear discrimination formula:

g(x) = Wᵗ·T + b  (5)

where W is the weight vector normal to the hyperplane of class separation, and b is the bias of the plane corresponding to the classes. Positive and negative conditions can be expressed in terms of the class assignment out:

out_i (Wᵗ T_i + b) ≥ 1.  (6)

W·T + |b|/‖w‖ is the perpendicular distance of the hyperplane to the origin [52]. To properly bias the classes, this distance should exceed a certain amount ν:

W·T + |b|/‖w‖ ≥ ν.  (7)

The bias is optimized by simultaneously minimizing ‖w‖ and maximizing |b|.

The quadratic optimization problem is solved by Lagrange multipliers, minimizing the Lagrangian with respect to w and b. Applying the Karush–Kuhn–Tucker conditions, the problem transforms to a dual Lagrange problem [18].

In a nonlinear SVM, the linearly non-separable data are translated into a linearly separable hypothetical space by a kernel function. The dual Lagrangian to be optimized becomes [52]

max L_D(θ) = Σ_{i=1}^{n} θ_i − (1/2) Σ_{i=1}^{n} Σ_{j=1}^{n} θ_i θ_j out_i out_j k,  (8)

where θ represents the dual variables and k denotes the kernel function.

As the kernel function, we selected a radial basis function given by

k = exp(−‖T_i − T_j‖² / (2σ²)),  (9)

where σ² is the variance.
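The sub-band statistics of Eqs. (2)–(4) can be computed directly, as in the sketch below; the paper's normalization of the energy by m² + n² is kept as printed, and the histogram used to estimate the probabilities p_i for the entropy is an assumption, since the paper does not state how p_i is obtained.

```python
import numpy as np

def subband_features(w, bins=8):
    """Mean, energy and entropy of an m-by-n sub-band w, per Eqs. (2)-(4).

    The probabilities p_i for the entropy are estimated from a histogram
    of the coefficient values; the bin count is an assumed choice.
    """
    m, n = w.shape
    mean = w.sum() / (m * n)                      # Eq. (2)
    energy = (w ** 2).sum() / (m ** 2 + n ** 2)   # Eq. (3), paper's normalization
    hist, _ = np.histogram(w, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                                  # skip empty bins: 0*log2(0) -> 0
    entropy = -(p * np.log2(p)).sum()             # Eq. (4)
    return mean, energy, entropy

mean, energy, entropy = subband_features(np.arange(16.0).reshape(4, 4))
```

With 16 sub-band outcomes and three statistics each, every image yields a 48-dimensional feature vector for the SVM.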

Fig. 2. Samples of raw and preprocessed images. Gray scale image (left) and histogram equalized image (right) of a normal (top) and glaucomatous (bottom) eye.
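The pre-processing shown in Fig. 2 amounts to a luminance conversion followed by a CDF-based histogram remapping; a standard sketch (the BT.601 luminance weights and the remapping formula are assumed standard choices, not stated in the paper):

```python
import numpy as np

def to_gray(rgb):
    """Tri-planar RGB to uniplanar grayscale (Section 3.1.1). The BT.601
    luminance weights are an assumed, standard choice."""
    return (rgb @ np.array([0.299, 0.587, 0.114])).astype(np.uint8)

def equalize(gray):
    """Histogram equalization so the gray levels span 0-255 (Section 3.1.2).
    A standard CDF-remapping sketch, not the authors' exact implementation."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                    # first occupied gray level
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[gray]

# Levels 100-103 are stretched over the full 0-255 range:
sample = (np.arange(16, dtype=np.uint8) // 4 + 100).reshape(4, 4)
stretched = equalize(sample)
```

Equalization spreads the narrow brightness region of the raw fundus image over the full scale, which stabilizes the histogram-based statistics extracted later.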

3.4. Particle Swarm Optimization

The initial population generated in PSO contains n × m members, where m is the number of swarms, mem = {1, 2, …, m}, and n is the length of each swarm, dim = {1, 2, …, n}. The position and velocity of a member are represented by the matrices pos_{dim,mem} = {pos_{1,1}, …, pos_{n,m}} and vel_{dim,mem} = {vel_{1,1}, …, vel_{n,m}}, respectively.

In PSO, each member updates its own velocity and position by Eqs. (10) and (14):

vel_{dim,mem}^{iter} = W·vel_{dim,mem}^{iter−1} + c1·r1·(best_pos(mem)_{dim}^{iter} − pos_{dim,mem}^{iter}) + c2·r2·(best_pos_{dim}^{iter} − pos_{dim,mem}^{iter}).  (10)

vel and pos represent the velocity and position, and mem and dim represent the member and dimension, respectively. iter represents the iteration number, and W denotes the inertia weight that controls the impact of the previous velocity on the current velocity.

The fitness of the mem-th member in the iter-th iteration is calculated as follows:

Fit_{mem}^{iter} = accuracy^{iter}(mem),  (11)

where accuracy means the accuracy of the SVM. The term best_pos(mem)_{dim}^{∀iter} represents the fittest position (or p-best) of a given member (the mem-th) over all iterations:

pbest = best_pos(mem)_{dim}^{∀iter} = max(Fit^{∀iter}(mem)).  (12)

The term best_pos_{dim}^{∀iter} represents the fittest position (or g-best) of all members in the population (∀mem) over all iterations:

gbest = best_pos_{dim}^{∀iter} = max(Fit^{∀iter}(∀mem)).  (13)

The second and third terms in (10) describe the orientations of a member towards the p-best and the g-best, and are multiplied by the cognitive acceleration constant c1 and the social acceleration constant c2, respectively. Based on its current velocity, each member updates its position by Eq. (14). The velocity is restricted to a predefined range [−vmax, vmax]:

pos_{dim,mem}^{iter+1} = pos_{dim,mem}^{iter} + vel_{dim,mem}^{iter}.  (14)

3.5. Group Search Optimizer

The conventional GSO proceeds through several steps: population initialization, fitness evaluation of the initial population, declaration and execution of the members' roles in each iteration (as producers, scroungers or rangers), and modification of these roles based on the current fitness in the next iteration.

3.6. PSO–GSO hybridization

Fig. 3 is a flowchart of the hybridized PSO–GSO. In PSO, non-optimal members are oriented towards the optima. This action is governed by the loop (the non-optimal members towards the optima) in Fig. 3, which successively regenerates the population if a better member is not found in the current iteration. This curling action limits the search space, leading to the premature

[Fig. 3 shows one iteration of the hybrid population: the PSO half performs its search (non-optimal members are oriented towards the optima, and the population is regenerated when no better member is found), while the ranger half performs a parallel random search with a local scan; when a better member is found around the optimal or potential members, it replaces the optima, yielding the next iteration's population.]

Fig. 3. Flowchart of the hybrid-PSO population behavior during one iteration.
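One PSO iteration, per Eqs. (10) and (14), can be sketched in vectorized form; the values of W, c1, c2 and vmax below are illustrative constants, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_step(pos, vel, pbest_pos, gbest_pos, W=0.7, c1=1.5, c2=1.5, vmax=0.5):
    """One PSO iteration: velocity update (Eq. (10)), clamped to
    [-vmax, vmax], then position update (Eq. (14)).

    pos, vel, pbest_pos: (members, dims) arrays; gbest_pos: (dims,).
    W, c1, c2 and vmax are illustrative constants, not the paper's values.
    """
    r1 = rng.random(pos.shape)
    r2 = rng.random(pos.shape)
    vel = (W * vel
           + c1 * r1 * (pbest_pos - pos)    # cognitive pull toward each p-best
           + c2 * r2 * (gbest_pos - pos))   # social pull toward the g-best
    vel = np.clip(vel, -vmax, vmax)         # restrict velocity range
    return pos + vel, vel

pos, vel = pso_step(np.zeros((4, 3)), np.zeros((4, 3)),
                    pbest_pos=np.ones((4, 3)), gbest_pos=np.ones(3))
```

In the hybrid scheme this update is applied only to the PSO half of the population, while the ranger half moves by the random walk of Eq. (22).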

convergence problem commonly encountered in PSO. To prevent premature convergence, we execute a large-scale ranging action in parallel to the PSO search. This action, programmed in an additional loop, increases the chance of exploration. The rangers identify the potential zones that might otherwise be neglected, which are instead scanned to improve the result. The probability of identifying potential zones is kept uncompromised by specifying a fixed proportion of rangers (50% of the hybrid population). The escape from premature convergence evolves as a ranger transforms into a producer or as potential zones are identified. The PSO–GSO executes by a fusion of the processes described in Sections 3.4 and 3.5. The PSO population and its rangers are initialized in the PSO framework, and the optimal member (g-best) is identified. The global optima and potential members are scanned to identify better members.

3.6.1. PSO initialization

The hybrid algorithm is initialized in the PSO framework. The entire process runs through maxiter iterations (specified as iter = 1 to maxiter in Fig. 3).

The hybrid population is generated in the first iteration (50% PSO population and 50% ranger population). The processes that follow are the execution of the member roles, fitness evaluation, identification of local and global optima, and population regeneration in subsequent iterations. In the initial population, half of the members are assigned as rangers.

3.6.2. Evolution of the PSO population and rangers

Generation of the wavelet coefficients from the angular parameters: the population is randomly generated with angular parameters ranging from 0 to 2π radians. To generate the wavelet coefficients from the angular parameters, the poly-phase factorization method was proposed by Vaidyanathan et al. [53]. Their algorithm allows deriving an orthonormal perfect-reconstruction finite impulse response (FIR) filter of arbitrary length as

H_p(Z) = [H_00(Z)  H_01(Z); H_10(Z)  H_11(Z)]
       = [C_0  S_0; −S_0  C_0] · ∏_{i=1}^{N−1} [1  0; 0  Z^{−1}] · [C_i  S_i; −S_i  C_i]  (15)

In Eq. (15), the first and second columns correspond to the even and odd taps of the filters; the first and second rows correspond to the low-pass and high-pass coefficients. C_i and S_i are the cosine and sine of the input angular parameters, respectively.

In this context, an elegant way to determine the coefficients of a filter bank has been developed by Sherlock and Monro [54]. The form leads to the following recursive formulae for the even-numbered coefficients:

h_0^{(k+1)} = c_k h_0^{(k)}  (16)

h_{2i}^{(k+1)} = c_k h_{2i}^{(k)} − s_k h_{2i−1}^{(k)}  for i = 1, 2, …, k−1  (17)

h_{2k}^{(k+1)} = −s_k h_{2k−1}^{(k)}  (18)

and the following recursive formulae for the odd-numbered coefficients:

h_1^{(k+1)} = s_k h_0^{(k)}  (19)

h_{2i+1}^{(k+1)} = s_k h_{2i}^{(k)} + c_k h_{2i−1}^{(k)}  for i = 1, 2, …, k−1  (20)

h_{2k+1}^{(k+1)} = c_k h_{2k−1}^{(k)}  (21)

• The first even (h_0^{(k+1)}) and odd (h_1^{(k+1)}) coefficients are obtained by modulating the previous stage's first even coefficient (h_0^{(k)}) with the cosine and sine of the angular parameter, respectively, in Eqs. (16) and (19).

For comparison purposes, we execute both conventional and


hybrid PSO algorithms. The PSO population is appended once to
generate the hybrid population. The PSO population is initially
regenerated as described in Section 3.4. During the regeneration,
the non-optimal members of the PSO population are oriented
towards the global and local optima.
The ranger society avoids local entrapment by random walk-
ing and encountering new clues leading to the prey (better
position) [47].  In each
 iteration, the rangers search in random
kþ1
directions Dki ϕ over a random distance for better solutions
[26]:
 
kþ1
xki þ 1 ¼ xki þ li Dki ϕ : ð22Þ

After all iterations, the optimal wavelet coefficient is selected as


the g-best.

3.6.3. Local area scan


The steps that belong to the producer in the GSO framework are
executed for the PSO's “g-best”.
Specifically, the maximum pursuit distance lmax is calculated as
the norm of the upper bound minus the lower bound of each
dimension of the population. The initial distance is an n-dimen-
sional vector containing the lmax values for all dimensions. The
initial angle vector is a random ðn  1Þ dimensional vector,
ϕi A Rn  1 . The initial direction is an n-dimensional vector derived
from the initial angle vector as follows [26]:

1. The first element of the initial direction vector is the product of


the cosines of all angular elements.
2. The second to penultimate element of the initial direction (jth
element) is the product of the cosines from the jth to the final
angular element; the product is then multiplied by the sine of
the ðj  1Þth angular element
3. The last element of the initial direction vector is the product of
the sines of all the angular elements. Thus the initial direction
is an (n-dimensional) column vector derived from ðn  1Þ
dimensional initial angle vector.

The initial distance and direction are utilized for g-best scanning at
the initial distance and direction as detailed in Section 3.6.5.

3.6.4. g-Best scanning from the initial distance and direction


Fig. 4. Scaling (a, c) and wavelet functions (b, d) of Test Runs 1 and 2. The horizontal axis represents the sample number of the filter taps; the vertical axes of the left and right panels represent the scaling and wavelet coefficients, respectively. (a) Scaling function of Test Run 1, (b) wavelet function of Test Run 1, (c) scaling function of Test Run 2, and (d) wavelet function of Test Run 2.

Once the population has been generated and regenerated, its fitness is evaluated. Let the fittest member (g-best) be the pth member of the population. Here, the fitness is directly equated to the diagnostic accuracy. Scanning from the initial angle and direction is implemented as follows [26]:

X_z = X_p^k + r_1 l_max D_p^k(φ^k),   (23)

where X_p^k denotes the g-best in the kth iteration, r_1 is a random number, and l_max and D_p^k(φ^k) denote the initial distance and direction, respectively.
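The straight-ahead scan of Eq. (23), together with the left and right scans of Eqs. (24) and (25) below, can be sketched as follows; `direction` stands for the polar-to-Cartesian mapping of Section 3.6.3, and the names are illustrative:

```python
import math
import random

def gbest_scan(x, angles, direction, l_max, theta_max=math.pi / 2):
    """Generate the three scan points of Eqs. (23)-(25): straight ahead,
    and to the right/left of the current heading of the g-best."""
    n = len(x)
    r1 = random.random()                     # scalar random number
    r2 = [random.random() for _ in angles]   # (n-1) random vector
    half = [r * theta_max / 2.0 for r in r2]
    points = []
    for offset in ([0.0] * len(angles),      # Eq. (23): initial angle
                   half,                     # Eq. (24): right scan
                   [-h for h in half]):      # Eq. (25): left scan
        phi = [a + o for a, o in zip(angles, offset)]
        d = direction(phi)                   # polar-to-Cartesian mapping
        points.append([x[i] + r1 * l_max * d[i] for i in range(n)])
    return points
```

The fitness of each of the three points is then compared against the current g-best.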
• The last even (h_{2k}^{k+1}) and odd (h_{2k+1}^{k+1}) coefficients are obtained by modulating the previous stage's last odd coefficients (h_{2k−1}^{k}) with the negative sine and the cosine of the angular parameters, respectively, in Eqs. (18) and (21).
• The intermediate even terms are evaluated from the last stage's intermediate-even terms multiplied by the cosine of the angular parameter, plus the last stage's intermediate-odd terms multiplied by the negative sine of the angular parameter, in Eq. (17).
• The intermediate odd terms are evaluated from the last stage's intermediate-even terms multiplied by the sine of the angular parameter, plus the last stage's intermediate-odd terms multiplied by the cosine of the angular parameter, in Eq. (20).

The g-best scans to the left and right (over its local field). The left and right scanning limits are obtained by adding a positive and a negative constant to the initial angle:

X_z = X_p^k + r_1 l_max D_p^k(φ^k + r_2 θ_max/2),   (24)

X_z = X_p^k + r_1 l_max D_p^k(φ^k − r_2 θ_max/2),   (25)

where r_2 is an (n−1)-dimensional column vector of random numbers and θ_max denotes the maximum pursuit angle (π/2).
The fitness functions of these three new points are evaluated. If one of these points is better than the current g-best,

Table 1
Sample sigma values of the SVM classifier.

Test Run   Wavelet function                 SVM-RBF parameter (σ)   Accuracy attained (%)
1          Wavelet function un-optimized    5.2456                  88.89
                                            7.3385                  93.3
           Wavelet function-1               7.3386                  98.2
2          Wavelet function un-optimized    6.1146 (18)             85.18
                                            6.2018 (19)             91
           Wavelet function-2               6.2572 (20)             95

Fig. 5. Metrics of the test runs. (a) Metrics of Test Run 1, and (b) metrics of Test Run 2.
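For reference, the σ values listed in Table 1 set the width of the SVM's RBF kernel. A hypothetical evaluation, assuming the standard Gaussian form K(x, y) = exp(−‖x − y‖² / (2σ²)) (the paper's exact normalization is not restated here):

```python
import math

def rbf_kernel(x, y, sigma):
    """Gaussian RBF kernel; sigma is the width parameter tuned by the
    hybrid PSO alongside the wavelet filter coefficients."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq / (2.0 * sigma ** 2))
```

A larger σ flattens the kernel, so nearby and distant feature vectors receive more similar weights in the SVM decision function.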

the g-best will fly to that point; otherwise, it focuses on a new random angle.

3.6.5. Potential area scanning

In each iteration, potential members are searched from both the PSO and ranger populations. A potential member (denoted the poten-th member) satisfies the following condition:

Fit_iter(poten) > α · Fit_iter(gbest),   (26)

where α is less than one. Since the accuracy is nominated as the fitness function, we desire that the fitness function be maximized, not minimized; that is, the higher the fitness, the better the solution. The second condition is that the Euclidean distance (ED) given in Eq. (27) between the poten-th member and the g-best should be greater than a pre-set threshold:

E.D. = √( Σ_{dim=1}^{n} (poten_dim − gbest_dim)² ).   (27)

Collectively, these two conditions increase the distance of seeking a potential member with a fitness comparable to the current g-best. The potential member is then scanned around, similar to g-best scanning.

3.6.6. Quantifying the exploration capability of the hybrid population

The rangers in our PSO population are expected to maintain exploration until the fitness matures. The rangers' goals are twofold: (i) to become the g-best, and (ii) to disclose one or more potential zones.
Zhan et al. [39] quantified premature convergence by evaluating the evolutionary status of the swarm in each iteration. They calculated the average distance between the particles by the Euclidean metric:

d_i = (1/(m − 1)) Σ_{j=1, j≠i}^{m} √( Σ_{k=1}^{dim} (x_i^k − x_j^k)² ).   (28)

Again, m denotes the number of members in the swarm and dim is the dimension of each member (see Section 3.4). After assessing the d_i, Zhan et al. [39] measured the evolutionary factor as the divergence of the mean distance between the g-best and its closest member, relative to the divergence of the closest and furthest members from the g-best. The evolutionary factor f was then defined as

f = (d_g − d_min)/(d_max − d_min).   (29)

In Eq. (29), the distance of the globally best particle is given

Fig. 6. Fitness vs iterations. (a) Best, mean, and worst fitness of PSO and Hybrid PSO (Test Run 1) and (b) Best, mean, and worst fitness of PSO and Hybrid PSO (Test Run 2).

Table 2
Alarm of premature convergence in the Test Runs (for 100 iterations).

Test Run-1
  Iterations 1–24: no alarm for either algorithm.
  Iteration 25 (PSO), alarm of premature convergence: evolutionary factor 0.3544, fitness 88.89%.
  Iteration 43 (Hybrid PSO), alarm of premature convergence: evolutionary factor 0.3379, fitness 93.3% (PSO already trapped at the 25th iteration).
  Iteration 54 (Hybrid PSO), leap-out from premature convergence: evolutionary factor 0.50, fitness 98.2%.

Test Run-2
  Iterations 1–73: no alarm for either algorithm.
  Iteration 74 (PSO), alarm of premature convergence: evolutionary factor 0.3512, fitness 88.89%.

as d_g; d_max and d_min are the maximal and minimal distances, respectively. Based on this evolutionary factor, Zhan et al. [39] classified the evolutionary status in a fuzzy manner as convergence, exploitation, exploration, or leap-out.
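The mean-distance and evolutionary-factor computations of Eqs. (28) and (29) can be sketched as follows (the function names are illustrative):

```python
import math

def mean_distances(swarm):
    """Eq. (28): mean Euclidean distance from each member to all others."""
    m = len(swarm)
    d = []
    for i, xi in enumerate(swarm):
        total = sum(math.dist(xi, xj)
                    for j, xj in enumerate(swarm) if j != i)
        d.append(total / (m - 1))
    return d

def evolutionary_factor(swarm, gbest_index):
    """Eq. (29): f = (d_g - d_min) / (d_max - d_min)."""
    d = mean_distances(swarm)
    d_min, d_max = min(d), max(d)
    if d_max == d_min:  # degenerate swarm with no spread
        return 0.0
    return (d[gbest_index] - d_min) / (d_max - d_min)
```

An f near zero means the g-best sits in the densest part of the swarm; combined with an immature fitness, this is the condition that triggers the premature-convergence alarm discussed below.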

Out of these four criteria, we adopt the exploitation criterion. We say that an evolutionary factor below 0.4 signifies exploitation. We also specify a required accuracy of 95% in advance. Accordingly, we fix the conditions of premature convergence as follows: if the accuracy is below 95% and the evolutionary factor is less than 0.4, premature convergence is suspected. Thus, convergence (in terms of inter-member distances) is assured only when the fitness has matured.

4. Results

The PSO and ranger populations were evaluated in two 10-fold cross-validation test runs. The population size of the rangers was twice that of conventional PSO, and the initial populations of both subgroups were the same. This setup is appropriate for comparison with conventional PSO.
The wavelet coefficients were optimized by fitting the accuracy of diagnosis in the two test runs, as shown in Fig. 4. The SVM-RBF width parameter (the sigma parameter) is optimized around 7.33 and 6.25 in Runs 1 and 2, respectively. Table 1 tabulates the final optimized values of the sigma parameter. Sample values obtained in the intermediate iterations are also listed.
The accuracy, sensitivity and specificity metrics were evaluated from the numbers of true positives, false positives, false negatives, and true negatives, which are defined as follows:

True positive (TP): A diseased image is correctly identified as diseased.
False positive (FP): A normal image is incorrectly identified as diseased.
False negative (FN): A diseased image is incorrectly identified as normal.
True negative (TN): A normal image is correctly identified as normal.

The three metrics are then evaluated as

Accuracy = (TN + TP)/(TN + TP + FN + FP)
Sensitivity = TP/(TP + FN)
Specificity = TN/(TN + FP)

Fig. 5 shows bar graphs of the accuracy, sensitivity, and specificity metrics obtained in the test runs. The accuracy, sensitivity and specificity of PSO (Hybrid PSO) were 88.88% (98.2%), 80% (97.5%), and 92.37% (98.31%), respectively, in Test Run 1; and 88.88% (95%), 35.48% (95%), and 87.4% (94.92%), respectively, in Test Run 2. Our hybrid algorithm achieved an increase of 6% in accuracy, 38.51% in sensitivity, and 6.73% in specificity, relative to conventional PSO.

5. Discussion

Panels (a) and (b) of Fig. 6 show how the fitness evolves over the generations in Test Runs 1 and 2. The best, mean, and worst fitness of PSO (hybrid algorithm) in Test Run 1 were 88.8% (98.2%), 88.5% (94.5%), and 81.4% (83.5%), respectively.
In Test Run 2, the best, mean, and worst fitness of PSO (hybrid algorithm) were 88.8% (95%), 86.6% (89.7%), and 77.7% (79.3%), respectively.
Table 2 lists the alarms of premature convergence triggered in the test runs by the PSO and hybrid algorithms, both initialised with the same population. The evolutionary factor is calculated as per Eq. (29) and has a lower threshold of 0.4; that is, an evolutionary factor below 0.4 indicates decay of the exploration of the search space. Further, we coin the term mature convergence: convergence in the exploration space (quantified by an evolutionary factor below 0.4) should happen only after attaining maturity in fitness (>95%); otherwise, the alarm is raised.
As shown in Fig. 6(a), conventional PSO converged prematurely around the 25th iteration in Test Run-1. The trap is also evident from Table 2, which indicates that the alarm was raised with an evolutionary factor of 0.3544 (<0.4) and a fitness of 88.89%. In Test Run-2, the conventional PSO managed to escape the trap until the 73rd generation, as shown in Fig. 6(b) and Table 2. At the 74th iteration, as shown in Table 2, the exploration decayed with an evolutionary factor of 0.3512 (<0.4) when the fitness (88.89%) had not matured. In both runs, once trapped, the algorithm had no avenue for recovery. Permanent trapping is evident from Fig. 6(a) and (b), as the fitness does not improve after the 25th and 74th iterations.
As shown in Fig. 6(a), the hybrid population also gets trapped, at the 43rd iteration in Test Run-1 (see Table 2: an evolutionary factor of 0.3379 with an immature fitness of 93.3%). Still, the hybrid algorithm escapes the situation at the 54th iteration: the evolutionary factor becomes 0.50, indicating enhanced exploration of the search space with a mature fitness of 98.2%. Throughout the remaining iterations, the fitness remained constant while exploration of the search space continued. In Test Run 2, the hybrid PSO avoided the convergence problem throughout the whole evolution. The ultimate fitness was 95%, with an evolutionary factor of 0.41.
Table A1 summarizes previous attempts to solve the premature convergence problem of conventional PSO. Solutions include hybridization with GSO, Ant Colony Optimization (ACO), and Tabu Search (TS). Zeng and Li [40] and Grosan et al. [47] adopted PSO–GSO switching instead of embedding. The inertia weight is an extensively used parameter that tunes the exploration–exploitation balance. Premature convergence is generally estimated by the Euclidean distance or the fitness function.

6. Conclusion

Once trapped by premature convergence, the conventional PSO could not recover in any of the test runs. However, the hybrid algorithm incorporating a ranger population always escaped such situations.
Relative to the conventional PSO, the hybrid algorithm increased the fitness by 9.4% and 6.1% in the test runs. Thus, the hybrid PSO consistently outperformed conventional PSO.
This improvement was achieved by complementing the superior search ability of PSO with the entropy of the rangers. That the rangers can disclose unexplored potential solutions is evident from their leaps over premature convergence in all test runs. Moreover, the algorithm maintains the identity of the PSO population, as the ranger population introduces no noise besides updating the g-best. To further enhance the algorithm, we could inhibit or accelerate the rangers depending on the evolutionary factor of the swarm.

Table A1
Previously proposed solutions to the PSO convergence problem.

1. [40] PSO–GSO hybrid. Objective: accelerate appropriate global convergence of PSO. Parameters optimized: special truss structural design. Mechanism: initialized with GSO; the step-search mechanism of PSO and the area-search mechanism of GSO are fused. Merits: PSO–GSO converges quicker than conventional PSO.
2. [47] PSO–GSO hybrid. Objective: achieve global convergence for high-dimensional problems. Parameters optimized: 4 benchmark functions. Mechanism: PSO to explore and GSO to exploit; rangers revise the local space. Merits: better fitness than PSO and GSO.
3. [55] PSO–Tabu Search (TS) hybrid; modified PSO in which the particle position is modified with respect to velocity thresholds. Objective: overleap local optima. Parameters optimized: gene selection for tumor classification. Mechanism: Tabu Search is incorporated as a local improvement. Merits: the cross-validated accuracy is higher for the hybrid algorithm than for pure PSO and pure TS.
4. [56] PSO–Differential Evolution (DE). Objective: maintain the population diversity by bell-shaped mutations. Parameters optimized: medical image processing; multimodal image registration. Mechanism: differential evolution is executed for the best from PSO; to maintain diversity, the differential PSO introduces random mutations. Merits: the local-minima problem is overcome; efficient in handling multimodal image registration.
5. [48] PSO–Ant Colony Optimization (ACO). Objective: improve the performance of PSO for the optimization of multimodal functions. Parameters optimized: 17 benchmark problems. Mechanism: ACO works as a local search procedure, with a pheromone-guided mechanism to improve the performance of PSO. Merits: for higher dimensions (>4), PSACO converges efficiently.
6. [39] Adaptive PSO. Objective: parameter adaptation to achieve global optimality. Parameters optimized: 17 benchmark problems. Mechanism: search efficiency (for global optima) is achieved by evolutionary state estimation and appropriate parameter tuning. Merits: adaptive PSO offers the highest accuracy for five benchmark functions out of 12.
7. [57] Modification of inertia weights. Objective: improve PSO's performance using adaptive inertia weights. Parameters optimized: 4 benchmark functions. Mechanism: linearly decreasing inertia weights with respect to iterations. Merits: mean best fitness is zero to about 4 decimal places for the sphere function and relatively low for the other functions.
8. [41] The repulsive force is inhibited if the distance between the particle and the optima is small. Objective: overcome the precocious defect of the PSO algorithm. Parameters optimized: 5 benchmark functions. Mechanism: if the radius between a particle and the optima is smaller, the particle tends to move away. Merits: diversity of the particles is high enough to find an optimal solution.
9. [58] Particles are reinitiated when the Euclidean distance to the global best is less than a threshold. Objective: overcome premature convergence. Parameters optimized: 7 benchmark functions. Mechanism: particles are reinitiated with random velocity when they are stuck in local minima; adapts a reactive determination of step size based on feedback from the last iterations. Merits: fast convergence rate; except for the Schaffer and Griewank functions, G-PSO outperforms Basic-PSO and HPSO-TVAC on all benchmark functions.
10. [37] Hybridized with GA; adaptive mutation with respect to the variance of the fitness. Objective: overcome premature convergence. Parameters optimized: 4 benchmark functions. Mechanism: if the variance is small and the fitness of the g-best is low, the g-best is mutated. Merits: adaptive mutation PSO attains the theoretical fitness.

Conflict of interest statement

None declared.

Appendix A

See Table A1.

References

[1] Y. Mallikarjun, Two molecular mechanisms causing glaucoma found, The Hindu, July 11, 2013, p. 15.
[2] H.A. Quigley, A.T. Broman, The number of people with glaucoma worldwide in 2010 and 2020, Br. J. Ophthalmol. 90 (3) (2006) 262–267.
[3] Thasarat S. Vajaranant, Shuang Wu, A 40-year forecast of the demographic shift in primary open-angle glaucoma in the United States, Investig. Ophthalmol. Vis. Sci. 53 (5) (2012) 2464–2466.
[4] R. George, S. Ve Ramesh, et al., Glaucoma in India: estimated burden of disease, J. Glaucoma 19 (6) (2010) 391–397.
[5] M.B. Shields, Optic Nerve, Retina, and Choroid, Shields Textbook of Glaucoma, 6th ed., 2005, pp. 216–217.
[6] Ryusuke Futa, Tsutomu Shimizu, et al., Clinical features of capsular glaucoma in comparison with primary open-angle glaucoma in Japan, Acta Ophthalmol. 70 (1992) 214–219.
[7] Na Hee Kang, Roo Min Jun, Clinical features and glaucoma according to optic disc size in a South Korean population: the Namil study, Jpn. J. Ophthalmol. 58 (2) (2014) 205–211.
[8] Pooja Sharma, Pamela A. Sample, et al., Diagnostic tools for glaucoma detection and management, Surv. Ophthalmol. 53 (1) (2008) S17–S32.
[9] Christopher Bowd, Linda M. Zangwill, Structure–function relationships using confocal scanning laser ophthalmoscopy, optical coherence tomography, and scanning laser polarimetry, Investig. Ophthalmol. Vis. Sci. 47 (7) (2006) 2889–2895.
[10] David Maberley, Hugh Walker, Screening for diabetic retinopathy in James Bay, Ontario: a cost-effectiveness analysis, Can. Med. Assoc. J. 168 (2) (2003) 160–164.
[11] Rüdiger Bock, Jörg Meier, Glaucoma risk index: automated glaucoma detection from color fundus images, Med. Image Anal. 14 (2010) 471–481.
[12] M. Muthu Rama Krishnan, Oliver Faust, Automated glaucoma detection using hybrid feature extraction in retinal fundus images, J. Mech. Med. Biol. 13 (1) (2013) 1–21.
[13] C. Raja, N. Gangatharan, Glaucoma detection in fundal retinal images using trispectrum and complex wavelet-based features, Eur. J. Sci. Res. 97 (1) (2013) 159–171.
[14] Nick Kingsbury, Complex wavelet transform for shift invariant analysis and filtering of signals, J. Appl. Comput. Harmon. Anal. 10 (3) (2001) 234–253.
[15] C. Raja, N. Gangatharan, Incorporating phase information for efficient glaucoma diagnosis through hyper analytic wavelet transform, in: Proceedings of the Fourth International Conference on Soft Computing for Problem Solving, Advances in Intelligent Systems and Computing, vol. 2, 2014, pp. 325–339.
[16] C. Raja, N. Gangatharan, Appropriate sub-band selection in wavelet packet decomposition for automated glaucoma diagnosis, Int. J. Autom. Comput. (2015), http://dx.doi.org/10.1007/s11633-014-0858-6, in press.
[17] I. Firoiu, C. Nafornita, et al., Bayesian hyperanalytic denoising of sonar images, IEEE Geosci. Remote Sens. Lett. 8 (6) (2011) 1065–1069.
[18] M. Muthu Rama Krishnan, U. Rajendra Acharya, et al., Data mining technique for automated diagnosis of glaucoma using higher order spectra and wavelet energy features, Knowl. Based Syst. 33 (2012) 73–82.
[19] Abdulhamit Subasi, Classification of EMG signals using PSO optimized SVM for diagnosis of neuromuscular disorders, Comput. Biol. Med. 43 (2013) 576–586.

[20] J. Kennedy, R. Eberhart, Particle swarm optimization, in: Proceedings of the IEEE International Conference on Neural Networks, vol. 4, 1995, pp. 1942–1948.
[21] R. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Proceedings of the Sixth International Symposium on Micro Machine and Human Science, IEEE, Nagoya, 1995, pp. 39–43.
[22] Bing Xue, Mengjie Zhang, Will N. Browne, Multi-objective particle swarm optimisation (PSO) for feature selection, in: GECCO'12, Philadelphia, Pennsylvania, USA, 2012.
[23] J. Kennedy, W. Spears, Matching algorithms to problems: an experimental test of the particle swarm and some genetic algorithms on the multimodal problem generator, in: IEEE Congress on Evolutionary Computation, 1998, pp. 226–233.
[24] M.A. Esseghir, Gilles Goncalves, Yahya Slimani, Adaptive particle swarm optimizer for feature selection, in: Intelligent Data Engineering and Automated Learning, vol. 6283, 2010.
[25] Hai Shen, Yunlong Zhu, et al., An improved group search optimizer for mechanical design optimization problems, Prog. Nat. Sci. 19 (1) (2009) 91–97.
[26] Q.H. Wu, J.R. Saunders, Group search optimizer: an optimization algorithm inspired by animal searching behavior, IEEE Trans. Evol. Comput. 13 (5) (2009) 973–990.
[27] Yuanning Liu, Gang Wang, et al., An improved particle swarm optimization for feature selection, J. Bionic Eng. 8 (2) (2011) 191–200.
[28] C.L. Huang, J.F. Dun, A distributed PSO–SVM hybrid system with feature selection and parameter optimization, Appl. Soft Comput. 8 (2008) 1381–1391.
[29] Cheng-Lung Huang, Chieh-Jen Wang, A GA-based feature selection and parameters optimization for support vector machines, Expert Syst. Appl. 31 (2005) 231–240.
[30] Bahareh Nakisa, Mohd Zakree Ahmad Nazri, et al., A survey: particle swarm optimization based algorithms to solve premature convergence problem, J. Comput. Sci. 10 (9) (2014) 1758–1765.
[31] Ahmad Nickabadi, Mohammad Mehdi Ebadzadeh, et al., A novel particle swarm optimization algorithm with adaptive inertia weight, Appl. Soft Comput. 11 (2011) 3658–3670.
[32] Dongping Tian, A review of convergence analysis of particle swarm optimization, Int. J. Grid Distrib. Comput. 6 (6) (2013) 117–128.
[33] C.S. Yang, L.Y. Chuang, et al., Chaotic maps in binary particle swarm optimization for feature selection, in: Proceedings of the IEEE Conference on Soft Computing in Industrial Applications, 2008, pp. 107–112.
[34] R.C. Eberhart, Y.H. Shi, Comparing inertia weights and constriction factors in particle swarm optimization, IEEE Congress Evol. Comput. (2000) 84–88.
[35] Y.H. Shi, R.C. Eberhart, Experimental study of particle swarm optimization, in: SCI2000 Conference, Orlando, 2000.
[36] J. Riget, J.S. Vesterstrøm, A diversity-guided particle swarm optimizer (the ARPSO), 2002.
[37] J. Tang, X. Zhao, Particle swarm optimization with adaptive mutation, in: Proceedings of the International Conference on Information Engineering, 2009, pp. 234–237.
[38] L.Y. Chuang, S.W. Tsai, C.H. Yang, Improved binary particle swarm optimization using catfish effect for feature selection, Expert Syst. Appl. 38 (10) (2011) 699–707.
[39] Zhi-Hui Zhan, Jun Zhang, et al., Adaptive particle swarm optimization, IEEE Trans. Syst. Man Cybern. Part B: Cybern. 39 (6) (2009) 1362–1381.
[40] S.K. Zeng, L.J. Li, Particle swarm-group search algorithm and its application to spatial structural design with discrete variables, Int. J. Optim. Civil Eng. 2 (4) (2012) 443–458.
[41] Z. Jie, F. Chaozan, L. Bo, et al., An improved particle swarm optimization based on repulsion factor, Open J. Appl. Sci. 2 (4B) (2012) 112–115.
[42] R. Cheng, M. Yao, Particle swarm optimizer with time-varying parameters based on a novel operator, Int. J. Appl. Math. Inf. Sci. 5 (2) (2011) 33–38.
[43] C. Zhang, J. Ning, et al., A novel hybrid differential evolution and particle swarm optimization algorithm for unconstrained optimization, Oper. Res. Lett. 37 (2009) 117–122.
[44] Radha Thangaraj, Millie Pant, Particle swarm optimization: hybridization perspectives and experimental illustrations, Appl. Math. Comput. 217 (12) (2011) 5208–5226.
[45] B. Yang, Y. Chen, et al., A hybrid evolutionary algorithm by combination of PSO and GA for unconstrained and constrained optimization problems, in: Proceedings of the IEEE International Conference on Control and Automation, 2007, pp. 166–170.
[46] G.M. Viswanathan, S.V. Buldyrev, et al., Optimizing the success of random searches, Nature 401 (6756) (1999) 911–914.
[47] C. Grosan, A. Abraham, et al., A hybrid algorithm based on particle swarm optimization and group search optimization, in: Seventh International Conference on Natural Computation, Shanghai, 2011.
[48] P.S. Shelokar, P. Siarry, et al., Particle swarm and ant colony algorithms hybridized for improved continuous optimization, Appl. Math. Comput. 188 (2007) 129–142.
[49] P.T. Boggs, J.W. Tolle, Sequential quadratic programming, Acta Numer. 4 (1995) 1–52.
[50] Ivan W. Selesnick, Hilbert transform pairs of wavelet bases, IEEE Signal Process. Lett. 8 (6) (2001) 170–173.
[51] Ke Huang, Selin Aviyente, Statistical partitioning of wavelet subbands for texture classification, in: IEEE International Conference on Image Processing, vol. 1, 2005, pp. I-441–I-444.
[52] Christopher J.C. Burges, A tutorial on support vector machines for pattern recognition, Data Min. Knowl. Discov. 2 (2) (1998) 121–167.
[53] P.P. Vaidyanathan, Multirate Systems and Filter Banks, Prentice-Hall, Englewood Cliffs, NJ, 1993.
[54] B.G. Sherlock, D.M. Monro, On the space of orthonormal wavelets, IEEE Trans. Signal Process. 46 (6) (1998) 1716–1720.
[55] Qi Shen, Wei-Min Shi, et al., Hybrid particle swarm optimization and tabu search approach for selecting genes for tumor classification using gene expression data, Comput. Biol. Chem. 32 (2007) 53–60.
[56] Hassiba Talbi, Mohamed Batouche, Hybrid particle swarm with differential evolution for multimodal image registration, in: IEEE International Conference on Industrial Technology, vol. 3, 2004, pp. 1562–1572.
[57] Y. Shi, R.C. Eberhart, Empirical study of particle swarm optimization, in: Proceedings of the Congress on Evolutionary Computation, vol. 3, 1999.
[58] S. Pasupuleti, R. Battiti, The gregarious particle swarm optimizer, in: Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, New York, 2006, pp. 67–74.
