
Comparative analysis

Genetic Algorithm and Evolution Strategy

Evolution Strategies (ESs) and Genetic Algorithms (GAs) are compared in a formal as well as in an experimental way. It is shown that both are identical with respect to their major working scheme, but that they nevertheless exhibit significant differences in the details of the selection scheme, the amount of genetic representation and, especially, the self-adaptation of strategy parameters.

Introduction
Genetic Algorithms (GAs) and Evolution Strategies (ESs) are two strata of consciously
pursued attempts to imitate principles of organic evolution processes as rules for optimum-seeking procedures. Both rely upon the collective learning paradigm gleaned from
natural evolution and implement the principles 'population', 'mutation', 'recombination'
and 'selection'. In addition, ESs try to use a collective self-learning mechanism to adapt
their strategy parameters during the optimum search (adaptive search).
The aim of this paper is to show the similarities and differences between both approaches
in a formal as well as in an experimental way by looking at some test functions from
Schwefel [Sch75] and Rudolph [Rud90]. The paper is only an excerpt of a
larger investigation on the same subject [HB90].

Evolutionary algorithm
The implementation of the generic evolutionary algorithm consists of three important steps: initialization, evaluation and evolution. The initialization step randomly generates the initial population Pi. In the evaluation step we compute the fitness value f(x) for each x included in Pi. Next comes the evolution step, which is divided into six parts: mating pool selection, crossover (recombination), mutation, evaluation, survivor selection and i <- i + 1. This loop is repeated until a certain termination condition holds.
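To make this scheme concrete, the following is a minimal Python sketch of the generic loop; the operator functions passed in as arguments (create_individual, evaluate, select_mating_pool, crossover, mutate, select_survivors) are placeholders for whatever concrete operators a particular GA or ES would supply, not names taken from any specific library.

def evolutionary_algorithm(create_individual, evaluate, select_mating_pool,
                           crossover, mutate, select_survivors,
                           pop_size=50, max_generations=100):
    # Initialization: randomly generate the initial population Pi
    population = [create_individual() for _ in range(pop_size)]
    # Evaluation: compute the fitness value f(x) for each x in Pi
    fitness = [evaluate(x) for x in population]
    i = 0
    while i < max_generations:  # termination condition
        parents = select_mating_pool(population, fitness)
        offspring = [mutate(child)
                     for a, b in zip(parents[::2], parents[1::2])
                     for child in crossover(a, b)]
        offspring_fitness = [evaluate(x) for x in offspring]
        population, fitness = select_survivors(population, fitness,
                                               offspring, offspring_fitness)
        i += 1  # i <- i + 1
    return population, fitness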
Evolutionary algorithms are commonly split into genetic algorithms (GA), the most widely used variant; evolution strategies (ES), which work on a continuous search space and use self-adaptation; evolutionary programming; and genetic programming.

Genetic Algorithm
In computer science and operations research, a genetic algorithm (GA) is
a metaheuristic inspired by the process of natural selection that belongs to the larger class
of evolutionary algorithms (EA). Genetic algorithms are commonly used to generate high-
quality solutions to optimization and search problems by relying on biologically inspired
operators such as mutation, crossover and selection. Some examples of GA applications include
optimizing decision trees for better performance, solving sudoku puzzles, hyperparameter
optimization, etc.
In a genetic algorithm, a population of candidate solutions to an optimization problem is
evolved toward better solutions. Each candidate solution has a set of properties (its
chromosomes or genotype) which can be mutated and altered. The evolution of a population is
an iterative process, with each iteration called a generation. Commonly, the algorithm
terminates when either a maximum number of generations has been produced, or a
satisfactory fitness level has been reached for the population.

Evolutionary Strategy
In computer science, an evolution strategy (ES) is an optimization technique based on ideas
of evolution. It belongs to the general class of evolutionary computation or artificial
evolution methodologies.
Evolution Strategies emerged from Rechenberg's work in the late sixties, which resulted
in a (1+1)-ES with a deterministic step size control [Rec73]. For two model functions
Rechenberg was able to show its convergence and the achieved rate of convergence. Schwefel extended the (1+1)-ES towards a (μ+λ)-ES and a (μ,λ)-ES by applying principles from organic
evolution more rigorously. As a result Schwefel proposed an ES capable of self-adapting some of its strategy parameters [Sch77, Sch81a]. Born also proposed a population-based (μ+1)-ES with the additional concept of a Genetic Load, for which he proved
convergence with probability 1 [Bor78]. Of the different ES variants, Schwefel's ESs
will be presented, since they are closest to organic evolution and best suited for the later
comparison with GAs.

Differences
In evolution strategies, the individuals are coded as vectors of real numbers. On
reproduction, parents are selected randomly, and the fittest offspring are selected and
inserted into the next generation. ES individuals are self-adapting: the step size or
"mutation strength" is encoded in the individual, so good parameters get to the next
generation by selecting good individuals.
In genetic algorithms, the individuals are traditionally coded as bit strings (integers). Selection is done by choosing parents proportionally to their fitness, so individuals must be evaluated before
the first selection is done. Genetic operators work on the bit level (e.g. cutting a bit
string into multiple pieces and interchanging them with the pieces of the other parent, or
flipping single bits).
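As an illustration of such bit-level operators, here is a small Python sketch of one-point crossover and bit-flip mutation on bit strings; the function names and the 0.01 mutation rate are our own illustrative choices.

import random

def one_point_crossover(parent_a, parent_b):
    # Cut both bit strings at the same random point and interchange the pieces
    cut = random.randint(1, len(parent_a) - 1)
    return (parent_a[:cut] + parent_b[cut:],
            parent_b[:cut] + parent_a[cut:])

def bit_flip_mutation(chromosome, rate=0.01):
    # Flip each bit independently with a small probability
    return [1 - bit if random.random() < rate else bit for bit in chromosome]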

That's the theory. In practice, it is sometimes hard to distinguish between both evolutionary algorithms, and you need to create hybrid algorithms (e.g. integer (bit-string) individuals that encode the parameters of the genetic operators).

As Paul noticed before, the encoding is not really the difference here, as this is an
implementation detail of specific algorithms, although it seems more common in ES.

To answer the question, we first need to take a small step back and look at the internals of an
ES algorithm. In ES there is a concept of endogenous and exogenous parameters of the
evolution. Endogenous parameters are associated with individuals and are therefore
evolved together with them; exogenous parameters are provided from "outside" (e.g. set constant
by the developer, or there can be a function/policy which sets their value depending on
the iteration number).

The individual k therefore consists of two parts:

• y(k) - a set of object parameters (e.g. a vector of real/int values) which denotes the individual's genotype
• s(k) - a set of strategy parameters (e.g. again a vector of real/int values) which can, for example, control statistical properties of the mutation

These two vectors are selected, mutated and recombined together.
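A minimal Python sketch of such an individual is given below, assuming per-coordinate step sizes as the strategy parameters and a log-normal update with learning rate tau; the class layout and the value of tau are illustrative assumptions, not part of any standard ES definition.

import numpy as np
from dataclasses import dataclass

@dataclass
class Individual:
    y: np.ndarray  # object parameters (the genotype), endogenous
    s: np.ndarray  # strategy parameters (per-coordinate step sizes), endogenous

def mutate(ind, tau=0.1):
    # Mutate the strategy parameters first, then use them to mutate the object parameters
    s_new = ind.s * np.exp(tau * np.random.randn(ind.s.size))
    y_new = ind.y + s_new * np.random.randn(ind.y.size)
    return Individual(y_new, s_new)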

The main difference between GA and ES is that in classic GA there is no distinction between types of algorithm parameters. In fact all the parameters are set from "outside", so in ES terms they are exogenous.

There are also other, minor differences: e.g. in ES the selection policy is usually fixed, whereas in GA there are multiple different approaches which can be interchanged.

evolutionary algorithms: algorithms that try, inside a given population of candidate solutions, to find a "best fit" solution. The algorithm starts with random candidates and tries to
evolve from there by reproduction, mutation, recombination and selection.
From there, you have several "standard ways" of naming the various families of evolutionary
algorithms. These names have been given by academic research; I believe the subtle
differences in the naming conventions come both from subtle reasons and from a lack of
available vocabulary as time goes on:

• genetic algorithm: the given population is defined as "a string of numbers" (traditionally a vector of 0s and 1s).
• evolution strategy: the given population is a vector of real numbers plus parameters such as the mutation rate, and among the studied candidates the algorithm tries to reach an optimum. Imagine you start with a population of 3-dimensional vectors (c1, c2, c3). At each generation you mutate them by adding a random value to c1, c2 and c3. The random value may be drawn from a Gaussian with standard deviation S. It may be interesting to start with a big value of S, so that mutations produce vectors all over the place, and once you start finding interesting vectors, to decrease S in order to "focus" around these interesting vectors (see the sketch after this list).
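The "shrink S over time" idea from the last bullet can be sketched as follows, assuming the candidate is a NumPy vector and an exponential decay schedule (just one possible choice).

import numpy as np

def mutate_with_decay(vector, generation, s0=5.0, decay=0.99):
    # Gaussian mutation whose standard deviation S shrinks every generation
    s = s0 * decay ** generation
    return vector + np.random.normal(0.0, s, size=vector.size)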
Source code
ES Ackley
Defining the Ackley function:
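The original code listing was not preserved in this copy; what follows is a minimal Python/NumPy sketch of the Ackley test function (the function name and the use of NumPy are assumptions of this sketch).

import numpy as np

def ackley(x):
    # d-dimensional Ackley function; global minimum f(0, ..., 0) = 0
    x = np.asarray(x, dtype=float)
    d = x.size
    term1 = -20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
    term2 = -np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d)
    return term1 + term2 + 20.0 + np.e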

Implementing the algorithm:
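Again in place of the missing listing, a sketch of a (μ,λ)-ES with self-adaptive step sizes, minimizing the ackley function defined above; the population layout (a list of (y, sigma) pairs), the parameter values and the intermediate recombination are illustrative assumptions.

import numpy as np

def es_ackley(initial_population, mu=15, lam=100, generations=200):
    n = initial_population[0][0].size
    tau = 1.0 / np.sqrt(2.0 * np.sqrt(n))   # learning rates for the
    tau_prime = 1.0 / np.sqrt(2.0 * n)      # log-normal sigma update
    population = list(initial_population)   # list of (y, sigma) pairs
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            # recombination: pick two parents, intermediate recombination
            i, j = np.random.randint(len(population), size=2)
            y = 0.5 * (population[i][0] + population[j][0])
            sigma = 0.5 * (population[i][1] + population[j][1])
            # self-adapt the strategy parameters, then mutate the object parameters
            sigma = sigma * np.exp(tau_prime * np.random.randn()
                                   + tau * np.random.randn(n))
            y = y + sigma * np.random.randn(n)
            offspring.append((y, sigma))
        # comma-selection: only the best mu offspring survive
        offspring.sort(key=lambda ind: ackley(ind[0]))
        population = offspring[:mu]
    return population[0]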

Generating the population:
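A sketch of the initial population: each individual carries random object parameters y in [-30, 30] and an initial step size vector; the bounds, dimension and sigma0 value are assumptions, and the last two lines simply run the es_ackley sketch from above.

import numpy as np

def generate_population(mu=15, n=10, bound=30.0, sigma0=3.0):
    # Each ES individual is a pair: object parameters y and strategy parameters sigma
    return [(np.random.uniform(-bound, bound, n), np.full(n, sigma0))
            for _ in range(mu)]

best_y, best_sigma = es_ackley(generate_population())
print("best point:", best_y, "value:", ackley(best_y))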


GA Ackley
Defining the Ackley function:
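The GA listing is also missing; the sketch below reuses the ackley function from the ES part and adds a helper that decodes a bit-string chromosome into real coordinates. The 16-bits-per-variable encoding, the 10 variables and the [-30, 30] range are assumptions.

import numpy as np

BITS_PER_VAR = 16
N_VARS = 10
BOUND = 30.0

def decode(chromosome):
    # Map a bit string to an N_VARS-dimensional point in [-BOUND, BOUND]
    values = []
    for i in range(N_VARS):
        gene = chromosome[i * BITS_PER_VAR:(i + 1) * BITS_PER_VAR]
        as_int = int("".join(map(str, gene)), 2)
        values.append(-BOUND + 2.0 * BOUND * as_int / (2 ** BITS_PER_VAR - 1))
    return np.array(values)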

Fitness function:
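A possible fitness function, assuming the ackley and decode helpers above: since the GA maximizes fitness while Ackley is to be minimized, the objective value is inverted.

def fitness(chromosome):
    # Higher is better: invert the Ackley value, which we want to minimize
    return 1.0 / (1.0 + ackley(decode(chromosome)))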

Generating the initial population:
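A sketch of the initial population of random bit strings, using the N_VARS and BITS_PER_VAR constants assumed above; the population size of 100 is likewise an assumption.

import numpy as np

POP_SIZE = 100

def initial_population():
    # One random bit string per individual
    return [list(np.random.randint(0, 2, N_VARS * BITS_PER_VAR))
            for _ in range(POP_SIZE)]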

Implementing the algorithm:
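Finally, a sketch of the GA main loop built on the helpers above, with fitness-proportional selection, one-point crossover and bit-flip mutation; the generation count and operator probabilities are illustrative assumptions.

import numpy as np

def ga_ackley(generations=200, p_crossover=0.9, p_mutation=0.01):
    population = initial_population()
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in population])
        probs = scores / scores.sum()            # fitness-proportional selection
        next_population = []
        while len(next_population) < POP_SIZE:
            i, j = np.random.choice(len(population), size=2, p=probs)
            a, b = population[i][:], population[j][:]
            if np.random.rand() < p_crossover:   # one-point crossover
                cut = np.random.randint(1, len(a))
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):                 # bit-flip mutation
                for k in range(len(child)):
                    if np.random.rand() < p_mutation:
                        child[k] = 1 - child[k]
                next_population.append(child)
        population = next_population[:POP_SIZE]
    best = max(population, key=fitness)
    return decode(best), ackley(decode(best))

# Example run of the sketch:
best_point, best_value = ga_ackley()
print("best point:", best_point, "value:", best_value)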


References:
https://online.ase.ro/
https://stackoverflow.com/questions/7787232/what-are-the-differences-between-genetic-algorithms-and-evolution-strategies
https://en.wikipedia.org/wiki/Evolution_strategy
James Edward Baker. Adaptive selection methods for genetic algorithms. In J. J. Grefenstette, editor,
Proceedings of the first international conference on genetic algorithms and their applications, pages
101–111, Hillsdale, New Jersey, 1985. Lawrence Erlbaum Associates.

Richard A. Caruana, Larry J. Eshelman, and J. David Schaffer. Representation and hidden bias II: Eliminating defining length bias in genetic search via shuffle crossover. In N. S. Sridharan, editor, Eleventh international joint conference on artificial intelligence, pages 750–755. Morgan Kaufmann Publishers, August 1989.

J. Born. Evolutionsstrategien zur numerischen Lösung von Adaptationsaufgaben. Dissertation A, Humboldt-Universität, Berlin, GDR, 1978.
