
3rd Conference on Swarm Intelligence and Evolutionary Computation (CSIEC2018), Higher Education Complex of Bam, Iran, 2018

MWWO: Modified Water Wave Optimization


Amin Soltanian , Fatemeh Derakhshan, Mohadeseh Soleimanpour- Moghadam
Department of Computer Engineering, Higher Education Complex of Bam
Bam, Iran

Abstract— Water Wave Optimization (WWO) is a swarm intelligence optimization algorithm that shares many similarities with evolutionary computation techniques. WWO uses three wave-inspired operators to search the high-dimensional solution space of an optimization problem. Like some other evolutionary optimization techniques, however, WWO suffers from premature convergence. In this paper, we propose a new kind of exploration parameter to increase the exploration ability of the algorithm. As the results show, it helps the agents escape local optima more easily. The Modified Water Wave Optimization (MWWO) is evaluated on benchmark functions and the results are reported.

Keywords: Metaheuristic, Water Wave Optimization (WWO), Propagation, Exploration, Exploitation.

I. INTRODUCTION

Nowadays, heuristic search algorithms are widely employed to solve global optimization problems. Heuristic search algorithms are stochastic algorithms that mimic natural phenomena such as natural selection, natural evolution, and self-organization. These algorithms maintain a collection of potential solutions to a problem. Some of these candidate solutions are used to create new potential solutions through the use of specific operators. The operators act on the population and produce collections of new potential solutions at each iteration. This process is repeated until the stopping criterion is met [1]. Population-based heuristic search algorithms have proven to be effective and flexible tools for solving problems in a wide range of applications [2]. Since the late 1990s, many heuristic approaches have been adopted by researchers, such as the genetic algorithm (GA) [3], ant colony optimization (ACO) [4], particle swarm optimization (PSO) [5], the gravitational search algorithm (GSA) [6], and water wave optimization (WWO).

Very recently, inspired by wave motions controlled by wave-current-bottom interactions [7], Zheng proposed a novel optimization algorithm called water wave optimization (WWO), which derives effective mechanisms from water wave phenomena such as propagation, refraction, and breaking for solving high-dimensional global optimization problems.

Heuristic algorithms must strike a good balance between exploration and exploitation to achieve both efficient global and local searches; only then can they solve optimization problems efficiently. Exploration is the ability to investigate the search space to find new and better solutions, and exploitation is the ability to look for the optimal solution near a good one. In every heuristic algorithm, exploration and exploitation are realized through specific operators. Since each operator has its own exploration and exploitation characteristics, the operators should be artfully hybridized to obtain a good trade-off between the two. Hence, new operators are designed, or available operators are redesigned, to add specific capabilities to heuristic algorithms.

In the MWWO algorithm, an adaptive value is used to increase the exploration ability of the algorithm at the beginning of the search; with the lapse of time, this exploration ability is gradually replaced by exploitation.

This paper is organized as follows. Section II provides a brief review of WWO. Section III explains the proposed exploration operator and its characteristics. Section IV reports the simulation results, and the final section concludes the paper.

II. BASIC OF WWO

WWO takes the seabed as an axis: the higher the fitness of a wave, the shorter its distance to the sea level. Waves are solutions searching for the global optimum; each wave has a height (or amplitude) h and a wavelength λ, with initial values λ = 0.5 and h = h_max. In [7], three types of operations on the waves are considered: propagation, refraction, and breaking.

1) Propagation. At each iteration, each wave propagates exactly once. Propagation creates a new wave as follows:

    x'(d) = x(d) + rand(−1, 1) · λ · L(d)    (1)

where rand(−1, 1) is a uniformly distributed random number within the range [−1, 1] and L(d) is the length of the d-th dimension of the search space (1 ≤ d ≤ n), as illustrated in Fig. 1. After propagation, we calculate the fitness of the offspring wave x'. If f(x') > f(x), x' replaces x in the population, and the wave height of x' is reset to h_max. Otherwise, x is kept, but its height h is decreased by one. After each generation, each wave's wavelength is updated so that waves with higher fitness obtain shorter wavelengths:

    λ = λ · α^( −(f(x) − f_min + ε) / (f_max − f_min + ε) )    (2)
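As a concrete illustration of Eqs. (1) and (2), the propagation step and the wavelength update can be sketched in Python. This is a minimal sketch, not the authors' code: the function and variable names are ours, fitness is treated as a value to be maximized as in the text, and out-of-range coordinates are resampled inside the feasible range.

```python
import random

def propagate(x, lam, L, lb, ub):
    """Eq. (1): shift each dimension of wave x by a random step
    proportional to its wavelength lam; L[d] is the range length of
    dimension d, and lb/ub are the per-dimension bounds."""
    x_new = []
    for d in range(len(x)):
        v = x[d] + random.uniform(-1.0, 1.0) * lam * L[d]
        # if a coordinate leaves the feasible range, resample it inside
        if v < lb[d] or v > ub[d]:
            v = random.uniform(lb[d], ub[d])
        x_new.append(v)
    return x_new

def update_wavelength(lam, f_x, f_min, f_max, alpha=1.0026, eps=1e-31):
    """Eq. (2): waves with higher fitness f_x get shorter wavelengths,
    i.e. finer local search; eps avoids division by zero."""
    return lam * alpha ** (-(f_x - f_min + eps) / (f_max - f_min + eps))
```

For example, a wave at the current best fitness (f_x = f_max) has its wavelength multiplied by roughly 1/α, while the worst wave's wavelength is left almost unchanged.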

978-1-5386-4978-7/18/$31.00 ©2018 IEEE


Here f_max and f_min denote the maximum and minimum fitness values in the current population, α is the wavelength reduction coefficient, and ε is a very small positive number that avoids division by zero.

In wave propagation, if the wave ray is not perpendicular to the isobath, its direction is deflected: the rays converge in shallow regions and diverge in deep regions, as illustrated in Fig. 1 [7].

Fig. 1. Different wave shapes in deep and shallow water

2) Refraction. Refraction is performed only on waves whose height has decreased to zero. The position after refraction is calculated as follows:

    x'(d) = N(μ, σ)    (3)

where x* is the best solution found so far and N(μ, σ) is a Gaussian random number with mean μ and standard deviation σ given by:

    μ = (x*(d) + x(d)) / 2    (4)

    σ = |x*(d) − x(d)| / 2    (5)

After refraction, the wave height is also reset to h_max, and its wavelength is set as

    λ' = λ · f(x) / f(x')    (6)

3) Breaking. This operator performs a local search around the best solution x*, where the wave finally breaks into a train of solitary waves. Each selected dimension d generates a solitary wave as

    x'(d) = x(d) + N(0, 1) · β · L(d)    (7)

where β is the breaking coefficient.

On multimodal functions, the results of WWO are not satisfactory: the algorithm often fails to find the global optimum because it falls into local optima, and both the speed of exploration and the accuracy of exploitation decrease. In the next section, we suggest a method that rectifies this problem.

III. THE PROPOSED METHOD

As mentioned before, a good balance between exploration and exploitation is important for achieving both global and local searches, while exploitation requires the search to be restricted locally to the current region. The two are in conflict, because emphasizing one must sacrifice the other. Achieving an appropriate balance between exploration and exploitation is therefore a key and challenging issue for all optimization algorithms.

WWO, nevertheless, has the shortcomings of other intelligent techniques. Firstly, while the local exploitation ability of WWO is considered adequate, its global exploration ability is regarded as weak. Secondly, WWO suffers from premature convergence, where the search process may be trapped in the local optima of a multimodal objective function and lose its diversity.

To solve this issue, in this paper an adaptive α is proposed to improve the exploration ability of WWO; the resulting algorithm is called Modified WWO (MWWO). In other words, we improve the performance of propagation in WWO, which reduces the probability of getting trapped in local optima on multimodal functions. This procedure improves the performance of WWO as it tries to jump out of local optima. The proposed adaptive α strikes an acceptable balance between exploration and exploitation and improves the effect of the propagation operator.

Note that, to avoid being trapped in a local optimum, the algorithm must emphasize exploration during the first iterations. Hence, exploration is an important issue in a population-based heuristic algorithm. As iterations pass, exploration fades out and exploitation fades in, so the algorithm tunes itself toward near-optimal points.

In MWWO, we use an exponential adaptive strategy in which α(t) is updated at each iteration as follows:

    α(t) = α_max · k^t    (8)

    k = (α_min / α_max)^(1/T)    (9)

where α_max and α_min are the maximum and minimum wavelength reduction coefficients, t is the current generation number (or number of fitness evaluations, NFEs), and T is the maximum generation number (or NFEs) of the algorithm. At first, α improves the performance of exploration by taking large values. As the iteration count increases, α decreases, resulting in better exploitation performance, thus creating a good balance between exploration and exploitation.
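Since the symbols of Eqs. (8) and (9) did not survive cleanly in this copy, the following Python sketch encodes one reading of the adaptive strategy: the coefficient decays exponentially from α_max down to α_min over T generations. The function name and the exact decay form are our assumptions, not the authors' code.

```python
def adaptive_alpha(t, T, alpha_max=1.01, alpha_min=1.0026):
    """Eqs. (8)-(9), as reconstructed: alpha(t) = alpha_max * k**t with
    k = (alpha_min / alpha_max) ** (1 / T), so alpha decays exponentially
    from alpha_max (exploration) toward alpha_min (exploitation)."""
    k = (alpha_min / alpha_max) ** (1.0 / T)   # Eq. (9)
    return alpha_max * k ** t                  # Eq. (8)

# decays monotonically: alpha(0) = alpha_max, alpha(T) = alpha_min
```

With the paper's recommended settings (α_max = 1.01, α_min = 1.0026, T = 3000), this yields a large coefficient in the early generations and the original WWO value of 1.0026 at the end of the run.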
Algorithm 1 The MWWO algorithm
1   Randomly initialize a population P of solutions (waves)
2   while the stop criterion is not satisfied do
3     Update α (and k) according to Eqs. (8) and (9)
4     for each wave x ∈ P do
5       Propagate x to a new x' according to Eq. (1)
6       if f(x') > f(x) then
7         if f(x') > f(x*) then
8           Break x' into new waves according to Eq. (7)
9           Update x* with x'
10        Replace x with x'
11      else
12        x.h = x.h − 1
13        if x.h = 0 then
14          Refract x to a new x' according to Eqs. (3) and (6)
15      Update the wavelength λ according to Eq. (2)
16  return the best solution x* found so far

IV. SIMULATION RESULTS

The performance of the proposed method is evaluated on 16 benchmark functions of CEC2014. To obtain a thorough performance evaluation, the proposed MWWO is compared with the original WWO. The parameters are set as follows: dimension = 30, population size (the number of waves) = 100, and maximum iterations = 3000, with 51 runs per function. We use a fixed parameter setting for both algorithms in order to evaluate their overall performance on the whole benchmark suite. The recommended parameters of the MWWO algorithm are set as follows:

• MWWO: α_max = 1.01, α_min = 1.0026 (α exponentially reduces from α_max to α_min), h_max = 12, k_max = 6, β linearly decreases from 0.25 to 0.001, and population size = 100.

Table 1 reports the mean of the minimum (mean) and standard deviation (std) of the function values over the 51 runs for the unimodal and multimodal functions. Values in boldface indicate the best result of the two algorithms.

Table 1. EXPERIMENTAL RESULTS ON BENCHMARK FUNCTIONS

TYPE         ID    Metric   WWO          MWWO
Unimodal     F1    mean     1.0733e+05   8.7762e+05
Functions          std      2.1453e+04   2.0906e+04
             F2    mean     1.3271e+04   8.8534e+05
                   std      2.4281e+03   6.2783e+06
             F3    mean     5.0512e+02   3.0523e+02
                   std      9.8443e+02   3.7144e+01
Multimodal   F4    mean     4.0590e+02   5.3907e+02
Functions          std      1.3733e+01   1.1545e-04
             F5    mean     5.2001e+02   5.1999e+02
                   std      0.1059e+00   2.2964e-13
             F6    mean     6.2860e+02   6.0894e+02
                   std      0.1321e+00   3.8278e-05
             F7    mean     7.0004e+02   7.0002e+02
                   std      0.0060e+00   0.0182e+00
             F8    mean     9.2437e+02   9.0049e+02
                   std      0.4357e+00   2.9983e+00
             F9    mean     9.9253e+02   9.8656e+02
                   std      0.0019e+00   1.4253e+00
             F10   mean     3.1878e+03   3.0458e+03
                   std      3.6164e+01   3.5157e+01
             F11   mean     4.5136e+03   4.0225e+03
                   std      3.7754e+01   6.9931e-12
             F12   mean     1.2004e+03   1.2002e+03
                   std      0.0036e+00   0.0038e+00
             F13   mean     1.3004e+03   1.3002e+03
                   std      9.6037e-13   4.5927e-13
             F14   mean     1.4003e+03   1.4001e+03
                   std      7.0449e-13   1.3778e-12
             F15   mean     1.5042e+03   1.5281e+03
                   std      0.0304e+00   0.6926e+00
             F16   mean     1.6118e+03   1.6110e+03
                   std      0.0340e+00   8.8108e-04

From the numerical results we can see that:

• On the unimodal functions F1 and F2, WWO performs better than MWWO, while on F3 MWWO obtains the best result.
• On the multimodal functions, MWWO obtains the best result for all functions except F4 and F15.

The convergence curves of the algorithms on four test problems are presented in Figs. 2, 3, 4, and 5, where the x-axis denotes the iteration and the y-axis denotes the average fitness value. Functions F1 to F3 are unimodal. For unimodal functions, the convergence rate of a search algorithm is more important than the final results, because other methods exist that are specifically designed to optimize unimodal functions; for the unimodal F1 and F2, WWO performs better than MWWO. Functions F4 to F16 have many local minima and are among the most difficult to optimize. For multimodal functions, the final results are more important, since they reflect the ability of the algorithm to escape poor local optima and locate a near-global optimum. We have carried out experiments on F4 to F16, where the number of local minima increases exponentially with the dimension of the function. As shown in Figs. 2, 3, and 4, MWWO obtains better performance than WWO because MWWO has better exploration capability. In Fig. 5, however, on a unimodal function, WWO performs better than MWWO, indicating better exploitation capability; as mentioned before, this very ability of WWO may lead to premature convergence.
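To make the control flow of Algorithm 1 concrete, here is a compact Python sketch of an MWWO-style loop. It is a simplified sketch under our own assumptions, not the authors' implementation: it is written for minimization instead of fitness maximization, breaking is reduced to a single solitary wave around the best solution, the bounds are a scalar box, and the refraction wavelength update of Eq. (6) is omitted.

```python
import random

def mwwo_minimize(cost, dim, lb, ub, pop=10, T=200,
                  h_max=12, alpha_max=1.01, alpha_min=1.0026, beta=0.01):
    """Simplified MWWO-style loop: propagation (Eq. 1), height bookkeeping,
    refraction (Eqs. 3-5), a single solitary wave for breaking (Eq. 7),
    and the adaptive coefficient of Eqs. (8)-(9)."""
    L = ub - lb                                   # search-range length
    waves = [[random.uniform(lb, ub) for _ in range(dim)]
             for _ in range(pop)]
    lams = [0.5] * pop                            # initial wavelengths
    hs = [h_max] * pop                            # initial wave heights
    best = min(waves, key=cost)
    k = (alpha_min / alpha_max) ** (1.0 / T)      # Eq. (9)
    for t in range(T):
        alpha = alpha_max * k ** t                # Eq. (8)
        costs = [cost(w) for w in waves]
        c_min, c_max = min(costs), max(costs)
        for i, w in enumerate(waves):
            # propagation, Eq. (1), clipped to the box
            w2 = [min(ub, max(lb, w[d] + random.uniform(-1, 1) * lams[i] * L))
                  for d in range(dim)]
            if cost(w2) < cost(w):                # offspring is better
                if cost(w2) < cost(best):
                    best = list(w2)
                    # breaking, Eq. (7): one solitary wave around the best
                    w3 = [min(ub, max(lb,
                                      best[d] + random.gauss(0, 1) * beta * L))
                          for d in range(dim)]
                    if cost(w3) < cost(best):
                        best = w3
                waves[i], hs[i] = w2, h_max
            else:
                hs[i] -= 1
                if hs[i] == 0:                    # refraction, Eqs. (3)-(5)
                    waves[i] = [random.gauss((best[d] + w[d]) / 2,
                                             abs(best[d] - w[d]) / 2 + 1e-12)
                                for d in range(dim)]
                    hs[i] = h_max
            # wavelength update, Eq. (2), rewritten for minimization
            c = cost(waves[i])
            lams[i] *= alpha ** (-(c_max - c + 1e-31) / (c_max - c_min + 1e-31))
    return best
```

Run, for instance, on the 5-dimensional sphere function with bounds [-10, 10], the returned best solution steadily improves over the initial random population, since the best wave is only ever replaced by a strictly better one.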
Fig. 2. Comparison of the performance of MWWO and WWO.

Fig. 5. Comparison of the performance of MWWO and WWO.

Figs. 2, 3, and 4 show that WWO quickly loses its exploration ability, whereas MWWO retains the swarm's exploration ability from the beginning iterations through the final ones. In other words, the powerful exploration capability of the proposed MWWO is preserved throughout the search.

As Table 1 shows, on functions F1, F2, F4, and F15, WWO explores and exploits better than MWWO, but on the twelve remaining functions MWWO provides much better results than WWO. In particular, on multimodal functions WWO has weak exploration, while MWWO has a strong ability to explore and exploit the search space together with an acceptable convergence rate. These characteristics lead to its good results.

Fig. 3. Comparison of the performance of MWWO and WWO.

V. CONCLUSION

This paper modifies WWO and presents a method named MWWO. In MWWO, an adaptive α is proposed that improves the exploration ability of WWO. The proposed MWWO has a lower probability of being trapped in local optima on the benchmark functions, as a result of an acceptable balance between exploration and exploitation compared to the original WWO. The simulation results confirm the superiority of the proposed MWWO algorithm.
Fig. 4. Comparison of the performance of MWWO and WWO.

REFERENCES
[1] H.-G. Beyer, H.-P. Schwefel, "Evolution strategies: a comprehensive introduction", Natural Computing, vol. 1, issue 1, pp. 3–52, March 2002.
[2] A. Rouhi, H. Nezamabadi-pour, "A hybrid method for dimensionality reduction in microarray data based on advanced binary ant colony algorithm", 1st Conference on Swarm Intelligence and Evolutionary Computation (CSIEC), March 2016.
[3] M. Sahin, U. Atav, M. Tomak, "Quantum genetic algorithm method in self-consistent electronic structure calculations of a quantum dot with many electrons", International Journal of Modern Physics C, vol. 16, issue 9, pp. 1379–1393, September 2005.
[4] M. Dorigo, V. Maniezzo, A. Colorni, "The ant system: optimization by a colony of cooperating agents", IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 26, issue 1, pp. 29–41, February 1996.
[5] F. V. D. Bergh, A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories", Information Sciences, vol. 176, issue 8, pp. 937–971, April 2006.
[6] E. Rashedi, H. Nezamabadi-pour, S. Saryazdi, "GSA: a gravitational search algorithm", Information Sciences, vol. 179, issue 13, pp. 2232–2248, June 2009.
[7] Y. J. Zheng, "Water wave optimization: A new nature-inspired metaheuristic", Computers & Operations Research, vol. 55, pp. 1–11, March 2015.
