
Group Discussion

Harmony Search
A unique Music-inspired Algorithm

Created and Presented by:

Aditya Vikram Choudhary, Kishan Sahu, Ritik Ranjan, Shashwat Srivastava


Introduction
● The Harmony Search (HS) method is an emerging metaheuristic optimization algorithm, proposed by Geem et al. in 2001.
● Musicians usually try various possible combinations of the music pitches stored in their memory.
● This can be seen as an optimization process of adjusting the input (pitches) to obtain the optimal output (a perfect harmony).
● This kind of efficient search for a perfect harmony is analogous to the procedure of finding the optimal solutions to engineering problems.
● In the HS algorithm, each musician (= decision variable) plays (= generates) a note (= a value), and all together they search for the best harmony (= global optimum).
● This searching method has the distinguishing features of algorithmic simplicity and search efficiency.
Heuristics
● When trying to solve a given optimization problem, you have some set of input variables that can be evaluated for their quality, and you want to know which inputs produce the best quality (the solution).
● Heuristics are usually employed:
○ to solve a problem more quickly when classic methods are too slow
○ to find an approximate solution when classic methods fail to find any exact solution
● A metaheuristic is a higher-level heuristic designed to generate or select a heuristic that may provide a sufficiently good solution to an optimization problem, especially with incomplete or imperfect information.

● In simple words, it is a repetitive/iterative process used to solve problems that even the toughest algorithms, with the greatest optimization power, fail to solve effectively or efficiently.
● Metaheuristic algorithms try to find the global optimum using some strategy that is better than brute force (but may not guarantee that a globally optimal solution can be found on some classes of problems).
● For problems where it is hard to decipher why changing an input changes the quality (and thus the optimal solution is not obvious), these algorithms are extremely useful.
● Harmony Search does not guarantee that the globally optimal solution will be found, but it often does find it, and it is usually much more efficient than an exhaustive brute-force search of all input combinations.
Main Idea
● The pitch of each musical instrument determines the aesthetic quality, just as the fitness function value determines the quality of the decision variables.
● In the music improvisation process, all players sound pitches within the possible range together to make one harmony.
● If all pitches make a good harmony, each player stores that experience in his memory, and the possibility of making a good harmony is increased next time.
● Analogously, in optimization, the initial solution is randomly generated from decision variables within the possible range.
● If the objective function value of those decision variables is good enough to make a promising solution, then the possibility of making a good solution is increased next time.
The HS methodology
● The HS method is inspired by the working principles of harmony improvisation.
● The figure alongside shows the flowchart of the basic HS method, with 4 principal steps involved.
● A harmony memory (HM) is considered, which holds random initial solutions and is updated as the algorithm proceeds.
How it works?
Step 1:
Initialize the HS Memory (HM). The initial HM consists of a certain number of randomly generated solutions to the optimization problem under consideration.

For an n-dimensional problem, an HM of size HMS can be represented as:

x1^1, x2^1, x3^1, …, xn^1
x1^2, x2^2, x3^2, …, xn^2
.
.
.
x1^HMS, x2^HMS, x3^HMS, …, xn^HMS

Here [x1^i, x2^i, x3^i, …, xn^i] (i = 1, 2, ..., HMS) is a solution candidate. HMS is typically set between 50 and 100.
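This initialization step can be sketched in Python (an illustrative sketch; the bounds and sizes in the usage line are assumptions, not values from the slides):

```python
import random

def init_hm(hms, n, lower, upper):
    """Initialize the harmony memory: HMS random n-dimensional solutions."""
    return [[random.uniform(lower[j], upper[j]) for j in range(n)]
            for _ in range(hms)]

# Example: HMS = 50 solutions for a 2-dimensional problem on [-5, 5]^2.
hm = init_hm(hms=50, n=2, lower=[-5.0, -5.0], upper=[5.0, 5.0])
```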
Step 2:
Improvise a new solution, [x1', x2', x3', …, xn'], from the HM. Each component of this solution is obtained based on the Harmony Memory Considering Rate (HMCR).

The HMCR is defined as the probability of selecting a component from the HM members; 1 - HMCR is, therefore, the probability of generating it randomly.

If xj' comes from the HM, it is chosen from the jth dimension of a random HM member and is further mutated by the distance bw according to the Pitch Adjusting Rate (PAR). The PAR determines the probability that a candidate taken from the HM is mutated.
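The improvisation step can be sketched as follows (an illustrative Python sketch; the default parameter values and the clamping to bounds are assumptions):

```python
import random

def improvise(hm, lower, upper, hmcr=0.95, par=0.3, bw=0.2):
    """Improvise one new solution from the harmony memory (HM)."""
    n = len(hm[0])
    x = []
    for j in range(n):
        if random.random() < hmcr:
            # Memory consideration: take the jth component of a random HM member ...
            xj = random.choice(hm)[j]
            if random.random() < par:
                # ... and, with probability PAR, pitch-adjust it by up to bw.
                xj += random.uniform(-1.0, 1.0) * bw
                xj = min(max(xj, lower[j]), upper[j])
        else:
            # Random selection: generate the component anew within its bounds.
            xj = random.uniform(lower[j], upper[j])
        x.append(xj)
    return x

# Toy usage with a tiny 3-member HM.
hm = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]
new_x = improvise(hm, lower=[-5.0, -5.0], upper=[5.0, 5.0])
```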
Step 3:
Update the HM. The new solution from Step 2 is evaluated. If it yields a better fitness than that of the worst member in the HM, it replaces that member. Otherwise, it is discarded.

Step 4:
Repeat Steps 2 and 3 until a preset termination criterion, for example a maximal number of iterations, is met.
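The update in Step 3 can be sketched like this (an illustrative Python sketch; minimization is assumed, so a lower objective value counts as better fitness):

```python
def update_hm(hm, fitness, new_x, f):
    """Replace the worst HM member if the new solution is better (minimization assumed)."""
    worst = max(range(len(hm)), key=lambda i: fitness[i])
    fx = f(new_x)
    if fx < fitness[worst]:
        hm[worst] = new_x
        fitness[worst] = fx

# Toy usage with the objective f(x) = x1^2 + x2^2.
f = lambda x: x[0] ** 2 + x[1] ** 2
hm = [[3.0, 3.0], [1.0, 1.0]]
fitness = [f(x) for x in hm]
update_hm(hm, fitness, [0.5, 0.5], f)  # replaces the worst member [3.0, 3.0]
```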
Similar to the GA and swarm intelligence algorithms, the HS method is a random search technique. It does not require any prior domain knowledge.

Different from those population-based approaches, it only utilizes a single search memory to evolve. Therefore, the HS method has the distinguishing feature of computational simplicity.
Pseudocode

/* HM initialization */
for (i=1; i<=HMS; i++)
  for (j=1; j<=n; j++)
    Randomly initialize xj^i
  endfor
endfor
/* End of initialization */

Repeat
  /* Construction and evaluation of new solution candidate */
  for (j=1; j<=n; j++)
    if (rand(0, 1) < HMCR)
      Let xj be the jth dimension of a randomly chosen HM member.
      if (rand(0, 1) < PAR)
        Apply pitch adjustment distance bw to mutate xj:
        xj = xj ± rand(0, 1) * bw
      endif
    else
      Let xj in x be a random value.
    endif
  endfor
  Evaluate the fitness of x: f(x)
  /* HM update */
  if (f(x) is better than the fitness of the worst HM member)
    Replace the worst HM member with x.
  else
    Disregard x.
  endif
Until a preset termination criterion is met
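The pseudocode above can be turned into a runnable sketch. This is an illustrative Python implementation (minimization is assumed; the parameter defaults and the sphere objective in the usage lines are placeholders, not values from the slides):

```python
import random

def harmony_search(f, lower, upper, hms=30, hmcr=0.9, par=0.3, bw=0.05, t_max=5000):
    """Basic Harmony Search minimizing f over the box [lower, upper]."""
    n = len(lower)
    # HM initialization: HMS random solutions.
    hm = [[random.uniform(lower[j], upper[j]) for j in range(n)] for _ in range(hms)]
    fitness = [f(x) for x in hm]
    for _ in range(t_max):
        # Construction of a new solution candidate.
        x = []
        for j in range(n):
            if random.random() < hmcr:
                xj = random.choice(hm)[j]                 # memory consideration
                if random.random() < par:
                    xj += random.uniform(-1.0, 1.0) * bw  # pitch adjustment
                    xj = min(max(xj, lower[j]), upper[j])
            else:
                xj = random.uniform(lower[j], upper[j])   # random selection
            x.append(xj)
        # HM update: replace the worst member if the candidate is better.
        fx = f(x)
        worst = max(range(hms), key=lambda i: fitness[i])
        if fx < fitness[worst]:
            hm[worst], fitness[worst] = x, fx
    best = min(range(hms), key=lambda i: fitness[i])
    return hm[best], fitness[best]

# Toy usage: minimize the sphere function x1^2 + x2^2 (minimum 0 at the origin).
best_x, best_f = harmony_search(lambda x: sum(v * v for v in x),
                                lower=[-5.0, -5.0], upper=[5.0, 5.0])
```

Note that, unlike a GA, only one candidate is built per iteration and only the single worst memory slot competes for replacement, which is what keeps the method computationally simple.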
Performance of Harmony Search
● Harmony Search is frequently used to solve complex optimization problems.

● The most well-known optimization functions used for testing and benchmarking the performance of such algorithms are:
○ Rosenbrock's banana function
○ Griewank test function

● We will evaluate Harmony Search on Rosenbrock's banana function.


Rosenbrock's Banana Test Function
For 2 dimensions:

f(x, y) = (1 - x)^2 + 100 (y - x^2)^2

For d dimensions:

f(x) = Σ_{i=1}^{d-1} [ 100 (x_{i+1} - x_i^2)^2 + (1 - x_i)^2 ]
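The d-dimensional form is straightforward to code; a minimal Python sketch:

```python
def rosenbrock(x):
    """d-dimensional Rosenbrock (banana) function; global minimum 0 at (1, ..., 1)."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

rosenbrock([1.0, 1.0, 1.0])  # 0.0 at the global minimum
```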
Performance of HS on Rosenbrock's banana test function
A standard performance test problem for optimization algorithms, evaluated using the HS algorithm.

Parameters: HMCR = 0.95, PAR = 0.8, bw = 0.2, Tmax = 1000

● The global minimum of the given function exists at X* = [1, 1, …, 1].
● For the 2-D Rosenbrock banana test function, the minimum value is 0 and it occurs at (1, 1).

Pictorial Representation
[Figures: the initial positions memorized in the HM, and the positions generated by HS over the first 1000 iterations.]
Applications
● A hybridization of HS and DE was developed for designing a CMOS inverter.

● HS can be adopted to design a classifier or a clustering algorithm. The application of HS in recognizing patterns in image or speech data can be helpful for image/speech processing.

● In telecommunications, HS is used for the optimal design of wireless sensor networks (WSN), antennas, and radars.

It comes in handy while solving the following real-world problems:

● Routine problems
○ Sudoku Puzzle
○ Project Scheduling
○ University Timetabling

● Medical Applications
○ RNA Structure Prediction
○ Medical Imaging
○ Radiation Oncology

● Computer Networks
○ Internet Routing
○ Web-Based Parameter Calibration

● Civil & Mechanical fields
○ Truss Structure Design
○ Water Distribution Network Design
Conclusion
The advantages of the HS algorithm are:
● simple concept
● easy implementation
● fast convergence speed
● fewer parameters to adjust

Owing to these merits, Harmony Search has been used considerably to solve various problems in different fields, such as power systems, communication, software, civil and water engineering, and pattern recognition.

From these results, it can be concluded that the Harmony Search algorithm is a good candidate for solving complex optimization problems.
References - Web
1) Geem, Z. W.: Harmony Search Algorithm for Solving Sudoku. Knowledge-Based Intelligent Information and Engineering Systems. http://dx.doi.org/10.1007/978-3-540-74819-9_46
2) Exam Mark Demo Example. http://harry.me/blog/2011/07/05/neat-algorithms-harmony-search/#searchVis
3) X. Z. Gao, V. Govindasamy, H. Xu, X. Wang, K. Zenger, "Harmony Search Method: Theory and Applications", Computational Intelligence and Neuroscience, vol. 2015, Article ID 258491, 10 pages, 2015. https://doi.org/10.1155/2015/258491
References - Books
1) Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.
2) K. S. Lee and Z. W. Geem, "A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice," Computer Methods in Applied Mechanics and Engineering, vol. 194, no. 36–38, pp. 3902–3933, 2005.
3) K. S. Lee and Z. W. Geem, "A new structural optimization method based on the harmony search algorithm," Computers and Structures, vol. 82, no. 9-10, pp. 781–798, 2004.
4) Z. W. Geem, J. H. Kim, and G. V. Loganathan, "Harmony search optimization: application to pipe network design," International Journal of Modelling and Simulation, vol. 22, no. 2, pp. 125–133, 2002.
5) X. Wang, X.-Z. Gao, and S. J. Ovaska, "Fusion of clonal selection algorithm and harmony search method in optimisation of fuzzy classification systems," International Journal of Bio-Inspired Computation, vol. 1, no. 1-2, pp. 80–88, 2009.
6) R. Poli and W. B. Langdon, Foundations of Genetic Programming, Springer, Berlin, Germany, 2002.
7) A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, West Sussex, UK, 2005.
8) Z. W. Geem, "Novel derivative of harmony search algorithm for discrete design variables," Applied Mathematics and Computation, vol. 199, no. 1, pp. 223–230, 2008.
9) M. G. H. Omran and M. Mahdavi, "Global-best harmony search," Applied Mathematics and Computation, vol. 198, no. 2, pp. 643–656, 2008.
10) J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.
THANK YOU !
