
Evolutionary algorithm

In computational intelligence (CI), an evolutionary algorithm (EA) is a subset of evolutionary
computation,[1] a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms
inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate
solutions to the optimization problem play the role of individuals in a population, and the fitness function
determines the quality of the solutions (see also loss function). Evolution of the population then takes place
after the repeated application of the above operators.

Evolutionary algorithms often perform well approximating solutions to many types of problems because they
ideally make no assumptions about the underlying fitness landscape. Techniques from evolutionary
algorithms applied to the modeling of biological evolution are generally limited to explorations of
microevolutionary processes and planning models based upon cellular processes. In most real applications
of EAs, computational complexity is a prohibiting factor;[2] this complexity is mainly due to repeated
fitness function evaluation. Fitness approximation is one way to overcome this difficulty.
However, seemingly simple EAs can often solve complex problems;[3][4][5] therefore, there may be no direct
link between algorithm complexity and problem complexity.

Implementation
The following is an example of a generic single-objective genetic algorithm.

Step One: Generate the initial population of individuals randomly. (First generation)

Step Two: Repeat the following regenerational steps until termination (time limit, sufficient
fitness achieved, etc.):

1. Evaluate the fitness of each individual in the population.
2. Select the fittest individuals for reproduction. (Parents)
3. Breed new individuals through crossover and mutation operations to give birth to offspring.
4. Replace the least-fit individuals of the population with new individuals.
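The regenerational loop above can be sketched in Python. This is a minimal illustrative implementation, not taken from any particular library; the population size, mutation rate, and the OneMax fitness used in the usage line are arbitrary choices for demonstration:

```python
import random

def evolve(fitness, genome_length=20, pop_size=50, generations=100,
           mutation_rate=0.01):
    """Generic single-objective genetic algorithm on binary strings."""
    # Step one: generate the initial population randomly (first generation).
    population = [[random.randint(0, 1) for _ in range(genome_length)]
                  for _ in range(pop_size)]
    for _ in range(generations):  # Step two: repeat until termination.
        # 1. Evaluate the fitness of each individual.
        scored = sorted(population, key=fitness, reverse=True)
        # 2. Select the fittest individuals as parents.
        parents = scored[:pop_size // 2]
        # 3. Breed offspring through crossover and mutation.
        offspring = []
        while len(offspring) < pop_size // 2:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genome_length)   # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mutation_rate) for bit in child]
            offspring.append(child)
        # 4. Replace the least-fit individuals with the new ones.
        population = parents + offspring
    return max(population, key=fitness)

# Usage: maximize the number of ones in the string ("OneMax").
best = evolve(fitness=sum)
```

Keeping the parents alongside the offspring, as done here, makes the best-so-far fitness non-decreasing, which anticipates the elitism discussed under "Convergence" below.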

Types
Similar techniques differ in genetic representation and other implementation details, and the nature of the
particular applied problem.

Genetic algorithm – This is the most popular type of EA. One seeks the solution of a problem
in the form of strings of numbers (traditionally binary, although the best representations are
usually those that reflect something about the problem being solved),[2] by applying
operators such as recombination and mutation (sometimes one, sometimes both). This type
of EA is often used in optimization problems.
Genetic programming – Here the solutions are in the form of computer programs, and their
fitness is determined by their ability to solve a computational problem. There are many
variants of Genetic Programming, including Cartesian genetic programming, gene
expression programming, grammatical evolution, linear genetic programming, multi
expression programming etc.
Evolutionary programming – Similar to genetic programming, but the structure of the
program is fixed and its numerical parameters are allowed to evolve.
Evolution strategy – Works with vectors of real numbers as representations of solutions, and
typically uses self-adaptive mutation rates. The method is mainly used for numerical
optimization, although there are also variants for combinatorial tasks.[6][7]
Differential evolution – Based on vector differences and is therefore primarily suited for
numerical optimization problems.
Coevolutionary algorithm – Similar to genetic algorithms and evolution strategies, but the
created solutions are compared on the basis of their outcomes from interactions with other
solutions. Solutions can either compete or cooperate during the search process.
Coevolutionary algorithms are often used in scenarios where the fitness landscape is
dynamic, complex, or involves competitive interactions.[8][9]
Neuroevolution – Similar to genetic programming but the genomes represent artificial neural
networks by describing structure and connection weights. The genome encoding can be
direct or indirect.
Learning classifier system – Here the solution is a set of classifiers (rules or conditions). A
Michigan-LCS evolves at the level of individual classifiers whereas a Pittsburgh-LCS uses
populations of classifier-sets. Initially, classifiers were only binary, but now include real,
neural net, or S-expression types. Fitness is typically determined with either a strength- or
accuracy-based reinforcement learning or supervised learning approach.
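As an illustration of the vector-difference idea behind differential evolution mentioned above, the classic DE/rand/1/bin scheme might be sketched as follows. The control parameters F and CR use conventional default values, and the sphere function in the usage line is an assumed test problem:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, iters=200):
    """Minimize f over a box via the classic DE/rand/1/bin scheme."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = random.sample([x for j, x in enumerate(pop) if j != i], 3)
            # Mutation: add the scaled difference of two vectors to a third.
            mutant = [a[d] + F * (b[d] - c[d]) for d in range(dim)]
            # Binomial crossover with the current (target) individual.
            j_rand = random.randrange(dim)
            trial = [mutant[d] if (random.random() < CR or d == j_rand)
                     else pop[i][d] for d in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            # Greedy selection: keep the better of trial and target.
            if f(trial) <= f(pop[i]):
                pop[i] = trial
    return min(pop, key=f)

# Usage: minimize the 3-dimensional sphere function.
best = differential_evolution(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)
```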

Theoretical background
The following theoretical principles apply to all or almost all EAs.

No free lunch theorem

The no free lunch theorem of optimization states that all optimization strategies are equally effective when
the set of all optimization problems is considered. Under the same condition, no evolutionary algorithm is
fundamentally better than another. One EA can only outperform another if the set of problems is restricted,
which is exactly what is inevitably done in practice. Therefore, to improve an EA, it must exploit problem
knowledge in some form (e.g. by choosing a certain mutation strength or a problem-adapted coding). Thus,
if two EAs are compared, this constraint is implied. In addition, an EA can use problem specific knowledge
by, for example, not randomly generating the entire start population, but creating some individuals through
heuristics or other procedures.[10][11] Another possibility to tailor an EA to a given problem domain is to
involve suitable heuristics, local search procedures or other problem-related procedures in the process of
generating the offspring. This form of extension of an EA is also known as a memetic algorithm. Both
extensions play a major role in practical applications, as they can speed up the search process and make it
more robust.[10][12]
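A memetic extension of the kind just described can be sketched as a local refinement step applied to each offspring before it enters the population. All names and parameters here are illustrative, and the hill climber stands in for whatever problem-related procedure is actually available:

```python
import random

def hill_climb(x, fitness, steps=10, sigma=0.1):
    """Simple local search: accept small random moves that improve fitness."""
    for _ in range(steps):
        y = [v + random.gauss(0, sigma) for v in x]
        if fitness(y) > fitness(x):
            x = y
    return x

def memetic_offspring(child, fitness):
    """A memetic algorithm refines each EA-generated offspring locally
    before it competes for a place in the next generation."""
    return hill_climb(child, fitness)
```

Because the climber only accepts improvements, the refined offspring is never worse than the raw one; the cost is additional fitness evaluations per offspring.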

Convergence

For EAs in which, in addition to the offspring, at least the best individual of the parent generation is used to
form the subsequent generation (so-called elitist EAs), there is a general proof of convergence under the
condition that an optimum exists. Without loss of generality, a maximum search is assumed for the proof:

From the property of elitist offspring acceptance and the existence of the optimum it follows that per
generation an improvement of the fitness of the respective best individual will occur with a
probability P > 0. Thus:

F(t) ≤ F(t+1) ≤ F_opt for every generation t,

where F(t) denotes the fitness of the best individual of generation t and F_opt that of the optimum.
I.e., the fitness values of the best individuals represent a monotonically non-decreasing sequence, which is
bounded due to the existence of the optimum. From this follows the convergence of the sequence toward
the optimum.

Since the proof makes no statement about the speed of convergence, it is of little help in practical
applications of EAs. But it does justify the recommendation to use elitist EAs. However, when using the
usual panmictic population model, elitist EAs tend to converge prematurely more often than non-elitist ones.[13]
In a panmictic population model, mate selection (step 2 of the section about implementation) is such that
every individual in the entire population is eligible as a mate. In non-panmictic populations, selection is
suitably restricted, so that the dispersal speed of better individuals is reduced compared to panmictic ones.
Thus, the general risk of premature convergence of elitist EAs can be significantly reduced by suitable
population models that restrict mate selection.[14][15]
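The elitist acceptance underlying the proof can be expressed as a small survivor-selection rule. This is an illustrative sketch with made-up function names, not the only way to implement elitism:

```python
def next_generation(parents, offspring, fitness):
    """Elitist survivor selection: the best parent always survives, so the
    best fitness in the population can never decrease between generations."""
    elite = max(parents, key=fitness)
    survivors = sorted(offspring + [elite], key=fitness, reverse=True)
    return survivors[:len(parents)]
```

Dropping `elite` from the candidate pool would turn this into a non-elitist (comma) strategy, in which the best fitness may temporarily decrease.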

Virtual alphabets

With the theory of virtual alphabets, David E. Goldberg showed in 1990 that by using a representation with
real numbers, an EA that uses classical recombination operators (e.g. uniform or n-point crossover) cannot
reach certain areas of the search space, in contrast to a coding with binary numbers.[16] This results in the
recommendation for EAs with real representation to use arithmetic operators for recombination (e.g.
arithmetic mean or intermediate recombination). With suitable operators, real-valued representations are
more effective than binary ones, contrary to earlier opinion.[17][18]
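The recommended arithmetic recombination operators for real-valued representations might look as follows. This is an illustrative sketch; the blend range `d` in the extended variant is a common but assumed choice:

```python
import random

def intermediate_recombination(p1, p2):
    """Componentwise arithmetic mean of two real-valued parents."""
    return [(a + b) / 2 for a, b in zip(p1, p2)]

def extended_intermediate(p1, p2, d=0.25):
    """Each gene is a random affine blend of the parents; d > 0 lets
    offspring fall slightly outside the box spanned by the parents,
    counteracting the contraction of plain averaging."""
    return [a + random.uniform(-d, 1 + d) * (b - a) for a, b in zip(p1, p2)]
```

Unlike n-point or uniform crossover, which can only shuffle existing coordinate values, both operators generate genuinely new values between (or near) the parental ones.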

Comparison to biological processes


A possible limitation of many evolutionary algorithms is their lack of a clear genotype–phenotype
distinction. In nature, the fertilized egg cell undergoes a complex process known as embryogenesis to
become a mature phenotype. This indirect encoding is believed to make the genetic search more robust (i.e.
reduce the probability of fatal mutations), and also may improve the evolvability of the organism.[19][20]
Such indirect (also known as generative or developmental) encodings also enable evolution to exploit the
regularity in the environment.[21] Recent work in the field of artificial embryogeny, or artificial
developmental systems, seeks to address these concerns. Gene expression programming successfully
explores a genotype–phenotype system, where the genotype consists of linear multigenic chromosomes of
fixed length and the phenotype consists of multiple expression trees or computer programs of different sizes
and shapes.[22]

Applications
The areas in which evolutionary algorithms are practically used are almost unlimited[5] and range from
industry,[23][24] engineering,[2][3][25] complex scheduling,[4][26][27] agriculture,[28] robot movement
planning[29] and finance[30][31] to research[32][33] and art. The application of an evolutionary algorithm
requires some rethinking from the inexperienced user, as the approach to a task using an EA is different
from conventional exact methods and this is usually not part of the curriculum of engineers or other
disciplines. For example, the fitness calculation must not only formulate the goal but also support the
evolutionary search process towards it, e.g. by rewarding improvements that do not yet lead to a better
evaluation of the original quality criteria. If, for instance, peak utilisation of resources such as personnel
deployment or energy consumption is to be avoided in a scheduling task, it is not sufficient to assess the
maximum utilisation. Rather, the number and duration of exceedances of a still acceptable level should also
be recorded in order to reward reductions below the actual maximum peak value.[34] There are therefore
some publications aimed at beginners that help to avoid typical mistakes and to lead an application
project to success.[34][35][36] This includes clarifying the fundamental question of
when an EA should be used to solve a problem and when it is better not to.
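The scheduling example could, for instance, be reflected in a fitness term that counts and weights exceedances of the acceptable level. This is a sketch with made-up threshold and weights, intended only to show how such search support can be encoded:

```python
def utilisation_penalty(load, acceptable=100.0, w_count=1.0, w_area=0.1):
    """Penalize not just the maximum load, but every exceedance of the
    still-acceptable level, so that reducing any peak improves fitness
    even before the overall maximum utilisation drops."""
    exceedances = 0   # number of contiguous periods above the level
    area = 0.0        # summed excess over all time steps (duration x height)
    above = False
    for x in load:
        if x > acceptable:
            area += x - acceptable
            if not above:
                exceedances += 1
            above = True
        else:
            above = False
    return w_count * exceedances + w_area * area
```

A fitness function that only assessed `max(load)` would assign the same value to a schedule with one narrow spike and one with many broad ones, giving the search no gradient between them.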

Related techniques
Swarm algorithms include:

Ant colony optimization is based on the ideas of ant foraging by pheromone communication
to form paths.[37] Primarily suited for combinatorial optimization and graph problems.
The runner-root algorithm (RRA) is inspired by the function of runners and roots of plants in
nature.[38]
Artificial bee colony algorithm is based on the honeybee foraging behaviour. Primarily
proposed for numerical optimization and extended to solve combinatorial, constrained and
multi-objective optimization problems.
Bees algorithm is based on the foraging behaviour of honeybees. It has been applied in
many applications such as routing and scheduling.
Cuckoo search is inspired by the brooding parasitism of the cuckoo species. It also uses
Lévy flights, and is thus suited for global optimization problems.
Particle swarm optimization is based on the ideas of animal flocking behaviour.[37] Also
primarily suited for numerical optimization problems.
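For comparison with the evolutionary operators above, the velocity update at the heart of particle swarm optimization can be sketched as follows. The inertia and acceleration coefficients are common textbook defaults, not prescribed values, and the sphere function in the usage line is an assumed test problem:

```python
import random

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimize f with a basic global-best particle swarm."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]        # each particle's best-known position
    gbest = min(pbest, key=f)          # swarm's best-known position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity: inertia plus pulls toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest

# Usage: minimize the 2-dimensional sphere function.
best = pso(lambda x: sum(v * v for v in x), [(-5, 5)] * 2)
```

Note that, unlike an EA, no individual is ever replaced: the same particles persist and only their positions and memories change.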

Other population-based metaheuristic methods


Hunting Search – A method inspired by the group hunting of some animals such as wolves
that organize their position to surround the prey, each of them relative to the position of the
others and especially that of their leader. It is a continuous optimization method[39] adapted
as a combinatorial optimization method.[40]
Adaptive dimensional search – Unlike nature-inspired metaheuristic techniques, an
adaptive dimensional search algorithm does not implement any metaphor as an underlying
principle. Rather it uses a simple performance-oriented method, based on the update of the
search dimensionality ratio (SDR) parameter at each iteration.[41]
Firefly algorithm is inspired by the behavior of fireflies, attracting each other by flashing light.
This is especially useful for multimodal optimization.
Harmony search – Based on the ideas of musicians' behavior in searching for better
harmonies. This algorithm is suitable for combinatorial optimization as well as parameter
optimization.
Gaussian adaptation – Based on information theory. Used for maximization of manufacturing
yield, mean fitness or average information. See for instance Entropy in thermodynamics and
information theory.
Memetic algorithm – A hybrid method, inspired by Richard Dawkins's notion of a meme, it
commonly takes the form of a population-based algorithm coupled with individual learning
procedures capable of performing local refinements. Emphasizes the exploitation of
problem-specific knowledge and tries to orchestrate local and global search in a synergistic
way.

Examples
In 2020, Google stated that their AutoML-Zero can successfully rediscover classic algorithms such as the
concept of neural networks.[42]

The computer simulations Tierra and Avida attempt to model macroevolutionary dynamics.

Gallery
[43][44][45]

Image captions:

A two-population EA search over a constrained Rosenbrock function with bounded global optimum
A two-population EA search over a constrained Rosenbrock function. Global optimum is not bounded.
Estimation of distribution algorithm over Keane's function
A two-population EA search of a bounded optima of Simionescu's function

References
1. Vikhar, P. A. (2016). "Evolutionary algorithms: A critical review and its future prospects".
Proceedings of the 2016 International Conference on Global Trends in Signal Processing,
Information Computing and Communication (ICGTSPICC). Jalgaon: 261–265.
doi:10.1109/ICGTSPICC.2016.7955308 (https://doi.org/10.1109%2FICGTSPICC.2016.7955
308). ISBN 978-1-5090-0467-6. S2CID 22100336 (https://api.semanticscholar.org/CorpusID:
22100336).
2. Cohoon, J; et al. (2002-11-26). Evolutionary algorithms for the physical design of VLSI
circuits (https://www.ifte.de/mitarbeiter/lienig/cohoon.pdf) (PDF). Advances in Evolutionary
Computing: Theory and Applications. Springer, pp. 683-712, 2003. ISBN 978-3-540-43330-
9.
3. Slowik, Adam; Kwasnicka, Halina (2020). "Evolutionary algorithms and their applications to
engineering problems" (https://doi.org/10.1007%2Fs00521-020-04832-8). Neural
Computing and Applications. 32 (16): 12363–12379. doi:10.1007/s00521-020-04832-8 (http
s://doi.org/10.1007%2Fs00521-020-04832-8). ISSN 0941-0643 (https://www.worldcat.org/iss
n/0941-0643). S2CID 212732659 (https://api.semanticscholar.org/CorpusID:212732659).
4. Mika, Marek; Waligóra, Grzegorz; Węglarz, Jan (2011). "Modelling and solving grid resource
allocation problem with network resources for workflow applications" (http://link.springer.co
m/10.1007/s10951-009-0158-0). Journal of Scheduling. 14 (3): 291–306.
doi:10.1007/s10951-009-0158-0 (https://doi.org/10.1007%2Fs10951-009-0158-0).
ISSN 1094-6136 (https://www.worldcat.org/issn/1094-6136). S2CID 31859338 (https://api.se
manticscholar.org/CorpusID:31859338).
5. "International Conference on the Applications of Evolutionary Computation" (https://www.evo
star.org/). The conference is part of the Evo* series. The conference proceedings are
published by Springer. Retrieved 2022-12-23.
6. Nissen, Volker; Krause, Matthias (1994), Reusch, Bernd (ed.), "Constrained Combinatorial
Optimization with an Evolution Strategy" (https://link.springer.com/chapter/10.1007/978-3-64
2-79386-8_5), Fuzzy Logik, Berlin, Heidelberg: Springer: 33–40, doi:10.1007/978-3-642-
79386-8_5 (https://doi.org/10.1007%2F978-3-642-79386-8_5), ISBN 978-3-642-79386-8
7. Coelho, V. N.; Coelho, I. M.; Souza, M. J. F.; Oliveira, T. A.; Cota, L. P.; Haddad, M. N.;
Mladenovic, N.; Silva, R. C. P.; Guimarães, F. G. (2016). "Hybrid Self-Adaptive Evolution
Strategies Guided by Neighborhood Structures for Combinatorial Optimization Problems".
Evol Comput. 24 (4): 637–666. doi:10.1162/EVCO_a_00187 (https://doi.org/10.1162%2FEV
CO_a_00187). PMID 27258842 (https://pubmed.ncbi.nlm.nih.gov/27258842).
S2CID 13582781 (https://api.semanticscholar.org/CorpusID:13582781).
8. Ma, Xiaoliang; Li, Xiaodong; Zhang, Qingfu; Tang, Ke; Liang, Zhengping; Xie, Weixin; Zhu,
Zexuan (2019), "A Survey on Cooperative Co-Evolutionary Algorithms." (https://ieeexplore.ie
ee.org/document/8454482/), IEEE Transactions on Evolutionary Computation, vol. 23, no. 3,
pp. 421–441, doi:10.1109/TEVC.2018.2868770 (https://doi.org/10.1109%2FTEVC.2018.286
8770), retrieved 2023-05-22
9. Popovici, Elena; Bucci, Anthony; Wiegand, R. Paul; De Jong, Edwin D. (2012).
"Coevolutionary Principles". In Rozenberg, Grzegorz; Bäck, Thomas; Kok, Joost N. (eds.).
Handbook of Natural Computing (https://link.springer.com/referenceworkentry/10.1007/978-3
-540-92910-9_31). Berlin, Heidelberg: Springer Berlin Heidelberg. pp. 987–1033.
doi:10.1007/978-3-540-92910-9_31 (https://doi.org/10.1007%2F978-3-540-92910-9_31).
ISBN 978-3-540-92910-9.
10. Davis, Lawrence (1991). Handbook of genetic algorithms (https://www.worldcat.org/oclc/230
81440). New York: Van Nostrand Reinhold. ISBN 0-442-00173-8. OCLC 23081440 (https://
www.worldcat.org/oclc/23081440).
11. Lienig, Jens; Brandt, Holger (1994), Davidor, Yuval; Schwefel, Hans-Paul; Männer, Reinhard
(eds.), "An evolutionary algorithm for the routing of multi-chip modules" (http://link.springer.co
m/10.1007/3-540-58484-6_301), Parallel Problem Solving from Nature — PPSN III, Berlin,
Heidelberg: Springer, vol. 866, pp. 588–597, doi:10.1007/3-540-58484-6_301 (https://doi.or
g/10.1007%2F3-540-58484-6_301), ISBN 978-3-540-58484-1, retrieved 2022-10-18
12. Neri, Ferrante; Cotta, Carlos; Moscato, Pablo, eds. (2012). Handbook of Memetic Algorithms
(http://link.springer.com/10.1007/978-3-642-23247-3). Studies in Computational Intelligence.
Vol. 379. Berlin, Heidelberg: Springer Berlin Heidelberg. doi:10.1007/978-3-642-23247-3 (ht
tps://doi.org/10.1007%2F978-3-642-23247-3). ISBN 978-3-642-23246-6.
13. Leung, Yee; Gao, Yong; Xu, Zong-Ben (1997). "Degree of population diversity - a
perspective on premature convergence in genetic algorithms and its Markov chain analysis"
(https://ieeexplore.ieee.org/document/623217). IEEE Transactions on Neural Networks. 8
(5): 1165–1176. doi:10.1109/72.623217 (https://doi.org/10.1109%2F72.623217). ISSN 1045-
9227 (https://www.worldcat.org/issn/1045-9227). PMID 18255718 (https://pubmed.ncbi.nlm.n
ih.gov/18255718).
14. Gorges-Schleuter, Martina (1998), Eiben, Agoston E.; Bäck, Thomas; Schoenauer, Marc;
Schwefel, Hans-Paul (eds.), "A comparative study of global and local selection in evolution
strategies" (http://link.springer.com/10.1007/BFb0056879), Parallel Problem Solving from
Nature — PPSN V, Lecture Notes in Computer Science, Berlin, Heidelberg: Springer Berlin
Heidelberg, vol. 1498, pp. 367–377, doi:10.1007/bfb0056879 (https://doi.org/10.1007%2Fbfb
0056879), ISBN 978-3-540-65078-2, retrieved 2022-10-21
15. Dorronsoro, Bernabe; Alba, Enrique (2008). Cellular Genetic Algorithms (http://link.springer.c
om/10.1007/978-0-387-77610-1). Operations Research/Computer Science Interfaces
Series. Vol. 42. Boston, MA: Springer US. doi:10.1007/978-0-387-77610-1 (https://doi.org/1
0.1007%2F978-0-387-77610-1). ISBN 978-0-387-77609-5.
16. Goldberg, David E. (1990), Schwefel, Hans-Paul; Männer, Reinhard (eds.), "The theory of
virtual alphabets" (http://link.springer.com/10.1007/BFb0029726), Parallel Problem Solving
from Nature, Lecture Notes in Computer Science, Berlin/Heidelberg: Springer-Verlag
(published 1991), vol. 496, pp. 13–22, doi:10.1007/bfb0029726 (https://doi.org/10.1007%2F
bfb0029726), ISBN 978-3-540-54148-6, retrieved 2022-10-22
17. Stender, J.; Hillebrand, E.; Kingdon, J. (1994). Genetic algorithms in optimisation, simulation,
and modelling (https://www.worldcat.org/oclc/47216370). Amsterdam: IOS Press. ISBN 90-
5199-180-0. OCLC 47216370 (https://www.worldcat.org/oclc/47216370).
18. Michalewicz, Zbigniew (1996). Genetic Algorithms + Data Structures = Evolution Programs
(https://www.worldcat.org/oclc/851375253) (3rd ed.). Berlin Heidelberg: Springer. ISBN 978-
3-662-03315-9. OCLC 851375253 (https://www.worldcat.org/oclc/851375253).
19. G.S. Hornby and J.B. Pollack. "Creating high-level components with a generative
representation for body-brain evolution". Artificial Life, 8(3):223–246, 2002.
20. Jeff Clune, Benjamin Beckmann, Charles Ofria, and Robert Pennock. "Evolving
Coordinated Quadruped Gaits with the HyperNEAT Generative Encoding" (http://www.ofria.c
om/pubs/2009CluneEtAl.pdf) Archived (https://web.archive.org/web/20160603205252/http://
www.ofria.com/pubs/2009CluneEtAl.pdf) 2016-06-03 at the Wayback Machine. Proceedings
of the IEEE Congress on Evolutionary Computing Special Section on Evolutionary
Robotics, 2009. Trondheim, Norway.
21. J. Clune, C. Ofria, and R. T. Pennock, "How a generative encoding fares as problem-
regularity decreases" (http://jeffclune.com/publications/Clune-HyperNEATandRegularity.pdf),
in PPSN (G. Rudolph, T. Jansen, S. M. Lucas, C. Poloni, and N. Beume, eds.), vol. 5199 of
Lecture Notes in Computer Science, pp. 358–367, Springer, 2008.
22. Ferreira, C., 2001. "Gene Expression Programming: A New Adaptive Algorithm for Solving
Problems" (http://www.gene-expression-programming.com/webpapers/GEP.pdf). Complex
Systems, Vol. 13, issue 2: 87–129.
23. Sanchez, Ernesto; Squillero, Giovanni; Tonda, Alberto (2012). Industrial Applications of
Evolutionary Algorithms (http://link.springer.com/10.1007/978-3-642-27467-1). Intelligent
Systems Reference Library. Vol. 34. Berlin, Heidelberg: Springer Berlin Heidelberg.
doi:10.1007/978-3-642-27467-1 (https://doi.org/10.1007%2F978-3-642-27467-1). ISBN 978-
3-642-27466-4.
24. Miettinen, Kaisa (1999). Evolutionary algorithms in engineering and computer science :
recent advances in genetic algorithms, evolution strategies, evolutionary programming,
genetic programming, and industrial applications (https://www.wiley.com/en-us/Evolutionary
+Algorithms+in+Engineering+and+Computer+Science%3A+Recent+Advances+in+Genetic
+Algorithms%2C+Evolution+Strategies%2C+Evolutionary+Programming%2C+Genetic+Pro
gramming+and+Industrial+Applications-p-9780471999027). Chichester: Wiley and Sons.
ISBN 0-585-29445-3. OCLC 45728460 (https://www.worldcat.org/oclc/45728460).
25. Gen, Mitsuo; Cheng, Runwei (1999-12-17). Genetic Algorithms and Engineering
Optimization (http://doi.wiley.com/10.1002/9780470172261). Wiley Series in Engineering
Design and Automation. Hoboken, NJ, USA: John Wiley & Sons, Inc.
doi:10.1002/9780470172261 (https://doi.org/10.1002%2F9780470172261). ISBN 978-0-
470-17226-1.
26. Dahal, Keshav P.; Tan, Kay Chen; Cowling, Peter I. (2007). Evolutionary scheduling (https://
www.worldcat.org/oclc/184984689). Berlin: Springer. doi:10.1007/978-3-540-48584-1 (http
s://doi.org/10.1007%2F978-3-540-48584-1). ISBN 978-3-540-48584-1. OCLC 184984689 (h
ttps://www.worldcat.org/oclc/184984689).
27. Jakob, Wilfried; Strack, Sylvia; Quinte, Alexander; Bengel, Günther; Stucky, Karl-Uwe; Süß,
Wolfgang (2013-04-22). "Fast Rescheduling of Multiple Workflows to Constrained
Heterogeneous Resources Using Multi-Criteria Memetic Computing" (https://doi.org/10.339
0%2Fa6020245). Algorithms. 6 (2): 245–277. doi:10.3390/a6020245 (https://doi.org/10.339
0%2Fa6020245). ISSN 1999-4893 (https://www.worldcat.org/issn/1999-4893).
28. Mayer, David G. (2002). Evolutionary Algorithms and Agricultural Systems (http://link.springe
r.com/10.1007/978-1-4615-1717-7). Boston, MA: Springer US. doi:10.1007/978-1-4615-
1717-7 (https://doi.org/10.1007%2F978-1-4615-1717-7). ISBN 978-1-4613-5693-6.
29. Blume, Christian (2000), Cagnoni, Stefano (ed.), "Optimized Collision Free Robot Move
Statement Generation by the Evolutionary Software GLEAM" (http://link.springer.com/10.100
7/3-540-45561-2_32), Real-World Applications of Evolutionary Computing, LNCS 1803,
Berlin, Heidelberg: Springer, vol. 1803, pp. 330–341, doi:10.1007/3-540-45561-2_32 (https://
doi.org/10.1007%2F3-540-45561-2_32), ISBN 978-3-540-67353-8, retrieved 2022-12-28
30. Aranha, Claus; Iba, Hitoshi (2008), Wobcke, Wayne; Zhang, Mengjie (eds.), "Application of a
Memetic Algorithm to the Portfolio Optimization Problem" (http://link.springer.com/10.1007/97
8-3-540-89378-3_52), AI 2008: Advances in Artificial Intelligence, Berlin, Heidelberg:
Springer Berlin Heidelberg, vol. 5360, pp. 512–521, doi:10.1007/978-3-540-89378-3_52 (htt
ps://doi.org/10.1007%2F978-3-540-89378-3_52), ISBN 978-3-540-89377-6, retrieved
2022-12-23
31. Chen, Shu-Heng, ed. (2002). Evolutionary Computation in Economics and Finance (http://lin
k.springer.com/10.1007/978-3-7908-1784-3). Studies in Fuzziness and Soft Computing.
Vol. 100. Heidelberg: Physica-Verlag HD. doi:10.1007/978-3-7908-1784-3 (https://doi.org/1
0.1007%2F978-3-7908-1784-3). ISBN 978-3-7908-2512-1.
32. Lohn, J.D.; Linden, D.S.; Hornby, G.S.; Kraus, W.F. (June 2004). "Evolutionary design of an
X-band antenna for NASA's Space Technology 5 mission" (https://ieeexplore.ieee.org/docu
ment/1331834). IEEE Antennas and Propagation Society Symposium, 2004. 3: 2313–2316
Vol.3. doi:10.1109/APS.2004.1331834 (https://doi.org/10.1109%2FAPS.2004.1331834).
hdl:2060/20030067398 (https://hdl.handle.net/2060%2F20030067398). ISBN 0-7803-8302-
8.
33. Fogel, Gary; Corne, David (2003). Evolutionary Computation in Bioinformatics (https://linking
hub.elsevier.com/retrieve/pii/B9781558607972X50008). Elsevier. doi:10.1016/b978-1-
55860-797-2.x5000-8 (https://doi.org/10.1016%2Fb978-1-55860-797-2.x5000-8). ISBN 978-
1-55860-797-2.
34. Jakob, Wilfried (2021), Applying Evolutionary Algorithms Successfully - A Guide Gained
from Realworld Applications (https://publikationen.bibliothek.kit.edu/1000135763/12127829
8), KIT Scientific Working Papers, vol. 170, Karlsruhe, FRG: KIT Scientific Publishing,
arXiv:2107.11300 (https://arxiv.org/abs/2107.11300), doi:10.5445/IR/1000135763 (https://do
i.org/10.5445%2FIR%2F1000135763), S2CID 236318422 (https://api.semanticscholar.org/C
orpusID:236318422), retrieved 2022-12-23
35. Whitley, Darrell (2001). "An overview of evolutionary algorithms: practical issues and
common pitfalls" (https://linkinghub.elsevier.com/retrieve/pii/S0950584901001884).
Information and Software Technology. 43 (14): 817–831. doi:10.1016/S0950-
5849(01)00188-4 (https://doi.org/10.1016%2FS0950-5849%2801%2900188-4).
S2CID 18637958 (https://api.semanticscholar.org/CorpusID:18637958).
36. Eiben, A.E.; Smith, J.E. (2015). "Working with Evolutionary Algorithms". Introduction to
Evolutionary Computing (http://link.springer.com/10.1007/978-3-662-44874-8). Natural
Computing Series (2nd ed.). Berlin, Heidelberg: Springer Berlin Heidelberg. pp. 147–163.
doi:10.1007/978-3-662-44874-8 (https://doi.org/10.1007%2F978-3-662-44874-8). ISBN 978-
3-662-44873-1. S2CID 20912932 (https://api.semanticscholar.org/CorpusID:20912932).
37. Slowik, Adam; Kwasnicka, Halina (2018). "Nature Inspired Methods and Their Industry
Applications—Swarm Intelligence Algorithms". IEEE Transactions on Industrial Informatics.
Institute of Electrical and Electronics Engineers (IEEE). 14 (3): 1004–1015.
doi:10.1109/tii.2017.2786782 (https://doi.org/10.1109%2Ftii.2017.2786782). ISSN 1551-
3203 (https://www.worldcat.org/issn/1551-3203). S2CID 3707290 (https://api.semanticschola
r.org/CorpusID:3707290).
38. F. Merrikh-Bayat, "The runner-root algorithm: A metaheuristic for solving unimodal and
multimodal optimization problems inspired by runners and roots of plants in nature", Applied
Soft Computing, Vol. 33, pp. 292–303, 2015
39. Oftadeh, R.; Mahjoob, M.J.; Shariatpanahi, M. (October 2010). "A novel meta-heuristic
optimization algorithm inspired by group hunting of animals: Hunting search" (https://doi.org/
10.1016%2Fj.camwa.2010.07.049). Computers & Mathematics with Applications. 60 (7):
2087–2098. doi:10.1016/j.camwa.2010.07.049 (https://doi.org/10.1016%2Fj.camwa.2010.0
7.049).
40. Amine Agharghor; Mohammed Essaid Riffi (2017). "First Adaptation of Hunting Search
Algorithm for the Quadratic Assignment Problem". Europe and MENA Cooperation
Advances in Information and Communication Technologies. Advances in Intelligent Systems
and Computing. 520: 263–267. doi:10.1007/978-3-319-46568-5_27 (https://doi.org/10.100
7%2F978-3-319-46568-5_27). ISBN 978-3-319-46567-8.
41. Hasançebi, O., Kazemzadeh Azad, S. (2015), "Adaptive Dimensional Search: A New
Metaheuristic Algorithm for Discrete Truss Sizing Optimization", Computers and Structures,
154, 1–16.
42. Gent, Edd (13 April 2020). "Artificial intelligence is evolving all by itself" (https://www.scienc
e.org/content/article/artificial-intelligence-evolving-all-itself). Science | AAAS. Archived (http
s://web.archive.org/web/20200416222954/https://www.sciencemag.org/news/2020/04/artifici
al-intelligence-evolving-all-itself) from the original on 16 April 2020. Retrieved 16 April 2020.
43. Simionescu, P.A.; Dozier, G.V.; Wainwright, R.L. (2006). "A Two-Population Evolutionary
Algorithm for Constrained Optimization Problems" (https://ieeexplore.ieee.org/document/168
8506). 2006 IEEE International Conference on Evolutionary Computation. Vancouver, BC,
Canada: IEEE: 1647–1653. doi:10.1109/CEC.2006.1688506 (https://doi.org/10.1109%2FCE
C.2006.1688506). ISBN 978-0-7803-9487-2. S2CID 1717817 (https://api.semanticscholar.or
g/CorpusID:1717817).
44. Simionescu, P.A.; Dozier, G.V.; Wainwright, R.L. (2006). "A Two-Population Evolutionary
Algorithm for Constrained Optimization Problems" (http://faculty.tamucc.edu/psimionescu/PD
Fs/WCCI2006-Paper7204(1).pdf) (PDF). 2006 IEEE International Conference on
Evolutionary Computation. Proc 2006 IEEE International Conference on Evolutionary
Computation. Vancouver, Canada. pp. 1647–1653. doi:10.1109/CEC.2006.1688506 (https://
doi.org/10.1109%2FCEC.2006.1688506). ISBN 0-7803-9487-9. S2CID 1717817 (https://api.
semanticscholar.org/CorpusID:1717817). Retrieved 7 January 2017.
45. Simionescu, P.A. (2014). Computer Aided Graphing and Simulation Tools for AutoCAD
Users (1st ed.). Boca Raton, FL: CRC Press. ISBN 978-1-4822-5290-3.

External links
An Overview of the History and Flavors of Evolutionary Algorithms (https://www.staracle.co
m/general/evolutionaryAlgorithms.php)

Bibliography
Ashlock, D. (2006), Evolutionary Computation for Modeling and Optimization, Springer, New
York, doi:10.1007/0-387-31909-3 ISBN 0-387-22196-4.
Bäck, T. (1996), Evolutionary Algorithms in Theory and Practice: Evolution Strategies,
Evolutionary Programming, Genetic Algorithms (https://books.google.com/books?id=htJHI1
UrL7IC), Oxford Univ. Press, New York, ISBN 978-0-19-509971-3.
Bäck, T., Fogel, D., Michalewicz, Z. (1999), Evolutionary Computation 1: Basic Algorithms
and Operators, CRC Press, Boca Raton, USA, ISBN 978-0-7503-0664-5.
Bäck, T., Fogel, D., Michalewicz, Z. (2000), Evolutionary Computation 2: Advanced
Algorithms and Operators, CRC Press, Boca Raton, USA, doi:10.1201/9781420034349
ISBN 978-0-3678-0637-8.
Banzhaf, W., Nordin, P., Keller, R., Francone, F. (1998), Genetic Programming - An
Introduction, Morgan Kaufmann, San Francisco, ISBN 978-1-55860-510-7.
Eiben, A.E., Smith, J.E. (2003), Introduction to Evolutionary Computing, Springer,
Heidelberg, New York, doi:10.1007/978-3-662-44874-8 ISBN 978-3-662-44873-1.
Holland, J. H. (1992), Adaptation in Natural and Artificial Systems (https://books.google.com/
books?id=5EgGaBkwvWcC), MIT Press, Cambridge, MA, ISBN 978-0-262-08213-6.
Michalewicz, Z.; Fogel, D.B. (2004), How To Solve It: Modern Heuristics. Springer, Berlin,
Heidelberg, ISBN 978-3-642-06134-9, doi:10.1007/978-3-662-07807-5.
Benko, Attila; Dosa, Gyorgy; Tuza, Zsolt (2010). "Bin Packing/Covering with Delivery, solved
with the evolution of algorithms". 2010 IEEE Fifth International Conference on Bio-Inspired
Computing: Theories and Applications (BIC-TA). pp. 298–302.
doi:10.1109/BICTA.2010.5645312 (https://doi.org/10.1109%2FBICTA.2010.5645312).
ISBN 978-1-4244-6437-1. S2CID 16875144 (https://api.semanticscholar.org/CorpusID:1687
5144).
Poli, R.; Langdon, W. B.; McPhee, N. F. (2008). A Field Guide to Genetic Programming (http
s://web.archive.org/web/20160527142933/http://cswww.essex.ac.uk/staff/rpoli/gp-field-guid
e/). Lulu.com, freely available from the internet. ISBN 978-1-4092-0073-4. Archived from the
original (http://cswww.essex.ac.uk/staff/rpoli/gp-field-guide/) on 2016-05-27. Retrieved
2011-03-05.
Price, K., Storn, R.M., Lampinen, J.A., (2005). Differential Evolution: A Practical Approach to
Global Optimization (https://books.google.com/books?id=hakXI-dEhTkC), Springer, Berlin,
Heidelberg, ISBN 978-3-642-42416-8, doi:10.1007/3-540-31306-0.
Ingo Rechenberg (1971), Evolutionsstrategie - Optimierung technischer Systeme nach
Prinzipien der biologischen Evolution (PhD thesis). Reprinted by Fromman-Holzboog
(1973). ISBN 3-7728-1642-8
Hans-Paul Schwefel (1974), Numerische Optimierung von Computer-Modellen (PhD
thesis). Reprinted by Birkhäuser (1977).
Hans-Paul Schwefel (1995), Evolution and Optimum Seeking. Wiley & Sons, New York.
ISBN 0-471-57148-2
Simon, D. (2013), Evolutionary Optimization Algorithms (http://academic.csuohio.edu/simon
d/EvolutionaryOptimization), Wiley & Sons, ISBN 978-0-470-93741-9
Kruse, Rudolf; Borgelt, Christian; Klawonn, Frank; Moewes, Christian; Steinbrecher,
Matthias; Held, Pascal (2013), Computational Intelligence: A Methodological Introduction (htt
ps://books.google.com/books?id=yQVGAAAAQBAJ). Springer, London. ISBN 978-1-4471-
5012-1, doi:10.1007/978-1-4471-5013-8.
Rahman, Rosshairy Abd.; Kendall, Graham; Ramli, Razamin; Jamari, Zainoddin; Ku-
Mahamud, Ku Ruhana (2017). "Shrimp Feed Formulation via Evolutionary Algorithm with
Power Heuristics for Handling Constraints" (https://doi.org/10.1155%2F2017%2F7053710).
Complexity. 2017: 1–12. doi:10.1155/2017/7053710 (https://doi.org/10.1155%2F2017%2F7
053710).
