Applied Evolutionary Algorithms
for Engineers Using Python
Leonardo Azevedo Scardua
Federal Institute of Technology of Espírito Santo
Vitoria, Brazil
A SCIENCE PUBLISHERS BOOK
First edition published 2021
by CRC Press
6000 Broken Sound Parkway NW, Suite 300, Boca Raton, FL 33487-2742
Reasonable efforts have been made to publish reliable data and information, but the author and
publisher cannot assume responsibility for the validity of all materials or the consequences of
their use. The authors and publishers have attempted to trace the copyright holders of all material
reproduced in this publication and apologize to copyright holders if permission to publish in this
form has not been obtained. If any copyright material has not been acknowledged please write
and let us know so we may rectify in any future reprint.
Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced,
transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or
hereafter invented, including photocopying, microfilming, and recording, or in any information
storage or retrieval system, without written permission from the publishers.
For permission to photocopy or use material electronically from this work, access www.copyright.com or
contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-
750-8400. For works that are not available on CCC please contact mpkbookspermissions@tandf.co.uk
Trademark notice: Product or corporate names may be trademarks or registered trademarks and are
used only for identification and explanation without intent to infringe.
Preface

This book has been written for those who seek to apply evolutionary algorithms
to problems in engineering and science. To this end, it provides the theoretical
background necessary for the understanding of the presented evolutionary algorithms
and their shortcomings, while also discussing themes that are pivotal to the successful
application of evolutionary algorithms to real-world problems.
The theoretical descriptions are illustrated with Python implementations of the
algorithms, which not only allow readers to consolidate their understanding but also
provide a sound starting point for those intending to apply evolutionary algorithms
to optimization problems in their fields of work. Python has been chosen due to its
widespread adoption in the Artificial Intelligence community. Those familiar with
high-level languages such as MATLAB™ would find no difficulty in reading the
provided Python implementations of the evolutionary algorithms.
Instead of attempting to encompass most of the existing evolutionary
algorithms, this book focuses on those algorithms that researchers have recently
applied to difficult engineering optimization problems, such as control problems
with continuous action spaces and the training of high-dimensional convolutional
neural networks. The basic characteristics of real-world optimization problems are
presented, together with advice on how to properly apply evolutionary algorithms
to them. The applied nature of this book is enforced by the presentation of cases of
successful application of evolutionary algorithms to optimization problems which
are closely related to real-world problems. The presentation is complemented by
Python source code, allowing the user to gain insight into the idiosyncrasies of the
practical application of evolutionary algorithms.
All source code presented in this book was developed in Python 3.7.6. The
implementations are meant to be didactic instead of computationally efficient. For
this reason, vectorization and other Python implementation tricks which specifically
aim at reducing execution times, but add extra complexity to the source code, have
been avoided.
Contents

Preface
Glossary
SECTION I: INTRODUCTION
1. Evolutionary Algorithms and Difficult Optimization Problems
1.1 What Makes an Optimization Problem Harder to Solve
1.2 Why Evolutionary Algorithms

2. Introduction to Optimization
2.1 What is Optimization
2.2 Solutions of an Optimization Problem
2.3 Maximization or Minimization
2.4 Basic Mathematical Formulation
2.5 Constraints and Feasible Regions
2.6 Local Solutions and Global Solutions
2.7 Multimodality
2.8 Multi-Objective Optimization
2.9 Combinatorial Optimization
3.4 Population
3.5 Selecting Parents
3.5.1 Selection Probabilities
3.5.2 Sampling
3.5.3 Selection of Individuals
3.6 Crossover (Recombination)
3.6.1 Recombination for Discrete Representations
3.6.2 Recombination for Real-Valued Representations
3.7 Mutation
3.7.1 Mutation for Binary Representations
3.7.2 Mutation for Real-Valued Representations
3.7.3 Mutation for Integer Representations
3.8 Elitism
5. Evolution Strategies
5.1 Recombination Operators
5.2 Mutation Operators
5.3 The (1 + 1) ES
5.4 The (μ + λ) ES
5.5 Natural Evolution Strategies
5.6 Covariance Matrix Adaptation Evolution Strategies

6. Genetic Algorithms
6.1 Real-Valued Genetic Algorithm
6.2 Binary Genetic Algorithm

7. Differential Evolution

8. Multiobjective Evolutionary Algorithm Based on Decomposition
1. Evolutionary Algorithms and Difficult Optimization Problems
The best starting point for solving an optimization problem is seeking to understand
the problem as much as possible. A better understanding of the problem's characteristics
offers two essential benefits. The first is that it allows the selection of the
most suitable tools for tackling the problem. The second is that it allows a
better assessment of candidate solutions, which is of paramount importance when
there are multiple solutions and none is clearly superior to the others.
This holds for problems of all kinds, but it is especially true for difficult
optimization problems. Before diving into the theory and practice of evolutionary
algorithms, it is therefore worth discussing what makes an optimization problem more
difficult to solve, since doing so helps explain why evolutionary algorithms are a
sound approach for solving such problems.
Take for instance a common problem in engineering and science, which is to find
the real roots of a given polynomial function, such as f(x) = x³ + 2x − 1, shown
in Figure 1.1. If the search space is limited to −2 ≤ x ≤ 2, as depicted in plot (a) of
Figure 1.1, even a random search algorithm would be able to rapidly find a very good
approximate solution, because any point in the search space would be already close
to the true solution.
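The random-search idea described above can be sketched in a few lines of Python. The function and the bounds come from the text; the number of evaluations and the seed are illustrative choices, not values from the book:

```python
import random

def f(x):
    """The polynomial from Figure 1.1."""
    return x**3 + 2*x - 1

def random_search(lower, upper, n_evals=1000, seed=0):
    """Uniformly sample the search space, keeping the point whose
    function value is closest to zero (an approximate root)."""
    rng = random.Random(seed)
    best = rng.uniform(lower, upper)
    for _ in range(n_evals - 1):
        x = rng.uniform(lower, upper)
        if abs(f(x)) < abs(f(best)):
            best = x
    return best

# On the small search space [-2, 2], a few hundred samples already
# land very close to the true root (approximately 0.4534).
print(random_search(-2.0, 2.0))
```

Re-running the same sketch with bounds of −100 and 100 illustrates the point made next: the same budget of evaluations yields a much poorer approximation.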
Nonetheless, simply increasing the search space to [−100, 100], as depicted in
plot (b) of Figure 1.1, makes the root finding problem harder, entailing that the ran-
dom search algorithm would likely need many more evaluations of f (x) to find the
root than were needed when the search space was smaller. It is thus not difficult to
guess that a big search space is one of the characteristics that make an optimization
problem harder to solve, and that for problems with bigger search spaces more
sophisticated optimization algorithms should be used.
The size of the search space and other characteristics that contribute to making an
optimization problem harder to solve, as seen in [9], [16] and [59], are described in
what follows. Before proceeding, it is important to note that the characteristics below
may not be all present in a given optimization problem. Nonetheless, when one or
more of them are present, they make the optimization problem harder to solve.
Figure 1.2: Noise can mislead the optimization algorithm. f_o is the objective function and
y = f_o + ε is the information received by the optimization algorithm.
The first characteristic is the presence of noise in the objective function values
received by the optimization algorithm, as illustrated in Figure 1.2. Noise rules out
methods that need perfect information about the loss function (and/or about its
derivatives) to deterministically find the search direction.
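The situation in Figure 1.2 can be sketched as follows. The quadratic objective, the noise level, and the sample point are assumed here purely for illustration:

```python
import random

rng = random.Random(42)

def f_o(x):
    """True (unobservable) objective, with its minimum at x = 1."""
    return (x - 1.0) ** 2

def y(x, sigma=0.5):
    """What the optimizer actually observes: y = f_o + eps."""
    return f_o(x) + rng.gauss(0.0, sigma)

# A single noisy evaluation can easily rank two candidates wrongly,
# but averaging repeated evaluations recovers the underlying value.
samples = [y(0.9) for _ in range(1000)]
estimate = sum(samples) / len(samples)
print(estimate)  # close to f_o(0.9) = 0.01
```

Averaging is only one way of coping with noise, and it multiplies the number of function evaluations; this cost is one reason noisy problems are harder.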
The second characteristic is the size of the problem. Suppose an optimization
problem with two discrete variables, where each variable can assume only 4 different
values. The search space of this two-dimensional optimization problem is comprised
of 4² = 16 different points. If another discrete variable with 4 different possible
values is added to the problem, the search space of the resulting three-dimensional
optimization problem will be comprised of 4³ = 64 different points. Adding a fourth
dimension would yield a search space of 4⁴ = 256 points. For each dimension added,
the number of search points is multiplied by 4. Figure 1.3 depicts the relation
between the number of dimensions and the number of search points in this toy problem.
This explosive growth in the number of search points (the volume of the search
space) as the number of dimensions increases, common to optimization problems in
general, was named the curse of dimensionality in [7].
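The growth in the toy problem above is easy to reproduce; four admissible values per variable comes from the text, and the range of dimensions shown is an arbitrary choice:

```python
# Number of points in the discrete search space as dimensions are added,
# with 4 admissible values per variable, as in the toy problem above.
VALUES_PER_DIM = 4

for dims in range(1, 9):
    print(f"{dims} dimension(s): {VALUES_PER_DIM ** dims} search points")
```

By ten dimensions the same problem already has over a million points, which is why exhaustive enumeration stops being an option so quickly.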
The third characteristic is the existence of constraints. Hardly a real-world prob-
lem will place no constraints on the possible values of its variables. A car engine,
for instance, has an upper limit on the number of rotations per minute it can deliver
under the penalty of sustaining structural damage. Any optimization problem that in-
volves a particular car engine will have to restrict the number of rotations per minute
according to the engine’s maximum capacity. The immediate effect of the existence
of constraints is to invalidate all search points that violate the constraints as possible
solutions, thus making the job of the optimization algorithm much harder.
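One simple way of enforcing such a constraint is to discard infeasible candidates before the objective is evaluated. The sketch below uses the engine example from the text; the 6500 RPM limit and the candidate values are hypothetical, chosen only for illustration:

```python
RPM_MAX = 6500.0  # hypothetical structural limit, for illustration only

def is_feasible(rpm):
    """A candidate engine speed is feasible only if it respects the limit."""
    return 0.0 <= rpm <= RPM_MAX

# Candidate speeds proposed by some search algorithm (made-up values).
candidates = [3000.0, 5200.0, 9000.0]

# Points that violate the constraint are invalid as solutions
# and are discarded before the objective is evaluated.
feasible = [rpm for rpm in candidates if is_feasible(rpm)]
print(feasible)  # the 9000.0 candidate is rejected
```

Rejection is only the most basic constraint-handling strategy; penalty functions and repair operators, discussed in references such as [18], are common alternatives when feasible points are rare.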
The fourth characteristic of a difficult optimization problem is the existence of
multiple and possibly conflicting objectives. Real-world problems usually involve
smaller subproblems that interact with each other. Unfortunately, these subproblems
are usually non-separable. This means that one cannot try to tackle each subproblem
individually, since optimizing one of the subproblems may have a negative impact on
the other subproblems. Solutions to multi-objective optimization problems are a family
of trade-off solutions rather than a single point.
Figure 1.3: Growth in the number of search points as a function of the number of dimensions.
References

[10] Greg Brockman, Vicki Cheung, Ludwig Pettersson, Jonas Schneider, John
Schulman, Jie Tang and Wojciech Zaremba. 2016. OpenAI Gym.
[11] A.E. Bryson. June, 1996. Optimal control-1950 to 1985. IEEE Control Systems
Magazine 16(3): 26–33.
[12] D. Buche, P. Stoll, R. Dornberger and P. Koumoutsakos. 2002. Multiobjective
evolutionary algorithm for the optimization of noisy combustion processes.
IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications
and Reviews) 32(4): 460–473.
[13] Carlos A. Coello Coello, Gary B. Lamont and David A. van Veldhuizen. 2007.
Evolutionary Algorithms for Solving Multi-Objective Problems. Genetic and
Evolutionary Computation. Springer US.
[14] R.A. Caruana and J.D. Schaffer. 1988. Representation and hidden bias: Gray
vs binary coding. In 6th International Conference in Machine Learning,
pp. 153–161.
[15] Shelvin Chand and Markus Wagner. 2015. Evolutionary many-objective
optimization: A quick-start guide. Surveys in Operations Research and
Management Science 20(2): 35–42.
[16] Raymond Chiong, Thomas Weise and Zbigniew Michalewicz. 2011. Variants
of Evolutionary Algorithms for Real-World Applications. Springer Publishing
Company, Incorporated.
[17] Christian Blum, Raymond Chiong, Maurice Clerc, Kenneth De Jong, Zbigniew
Michalewicz, Ferrante Neri and Thomas Weise. 2011. Evolutionary
optimization. pp. 1–29. In: Raymond Chiong, Thomas Weise and Zbigniew
Michalewicz (eds.). Variants of Evolutionary Algorithms for Real-World
Applications, Springer.
[18] Carlos A. Coello Coello. July 2016. Constraint-handling techniques used
with evolutionary algorithms. In: GECCO’16 Companion: Proceedings of
the 2016 on Genetic and Evolutionary Computation Conference Companion,
pp. 563–587.
[19] A. Colorni, M. Dorigo and V. Maniezzo. 1991. Distributed optimization by ant
colonies. pp. 134–142. In: European Conference on Artificial Life, Paris,
France.
[20] D.R. Cox. 2006. Principles of Statistical Inference. Cambridge University
Press.
[21] Daan Wierstra, Tom Schaul, Tobias Glasmachers, Yi Sun, Jan Peters and
Jürgen Schmidhuber. 2014. Natural evolution strategies. Journal of Machine
Learning Research 15(1): 949–980.
[22] Dipankar Dasgupta and Zbigniew Michalewicz. 1997. Evolutionary
Algorithms—An Overview. Springer Berlin Heidelberg, Berlin, Heidelberg,
pp. 3–28.
[23] K. Deb, A. Pratap, S. Agarwal and T. Meyarivan. April, 2002. A fast and elitist
multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary
Computation 6(2): 182–197.
[24] Kalyanmoy Deb. 2011. Multi-objective optimization using evolutionary
algorithms: An introduction. pp. 3–34. In: Lihui Wang, Amos H.C. Ng and
Kalyanmoy Deb (eds.). Multi-objective Evolutionary Optimisation for Product
Design and Manufacturing, Springer Publishing Company, Incorporated.
[25] Kalyanmoy Deb and Samir Agrawal. 1999. A niched-penalty approach for
constraint handling in genetic algorithms. In: Artificial Neural Nets and
Genetic Algorithms. Springer, Vienna.
[26] Kalyanmoy Deb and Debayan Deb. February, 2014. Analysing mutation
schemes for real parameter genetic algorithms. International Journal of
Artificial Intelligence and Soft Computing.
[27] Tobias Wagner, Dimo Brockhoff and Heike Trautmann. July, 2012. On the
properties of the r2 indicator. pp. 465–472. In: Proceedings of the 14th Annual
Conference on Genetic and Evolutionary Computation (GECCO’2012),
Philadelphia, United States, ACM.
[28] Robert W. Doran. 2007. The gray code. J. UCS 13: 1573–1597.
[29] M. Dorigo. 1992. Optimization, Learning and Natural Algorithms. Ph.D.
Thesis, Dipartimento di Elettronica, Politecnico di Milano, Milan.
[30] D. Doufene, S. Bouazabia and A. Haddad. 2020. Shape and electric performance
improvement of an insulator string using particles swarm algorithm. IET
Science, Measurement Technology 14(2): 198–205.
[31] Michael T.M. Emmerich and André H. Deutz. September, 2018. A tutorial on
multiobjective optimization: fundamentals and evolutionary methods. Natural
Computing 17(3): 585–609.
[32] F.M. Dekking, C. Kraaikamp, H.P. Lopuhaä and L.E. Meester. 2005. A Modern
Introduction to Probability and Statistics. Springer Texts in Statistics.
Springer-Verlag London, 1st edition.
[33] A.I.J. Forrester, A.J. Keane and N.W. Bressloff. October, 2006. Design and
analysis of ’noisy’ computer experiments. AIAA Journal 44(10): 2331–2339.
[34] Alexander I.J. Forrester, Andras Sobester and Andy J. Keane. 2008. Engineering
Design via Surrogate Modelling—A Practical Guide. Wiley.
[35] G. Pavai and T.V. Geetha. 2016. A survey on crossover operators. ACM
Computing Surveys (CSUR) 49(4): 1–43.
[36] David E. Goldberg. January, 1989. Genetic Algorithms in Search, Optimization,
and Machine Learning. Addison-Wesley Professional, Boston, MA, USA,
1 Edition.
[49] Diederik P. Kingma and Jimmy Ba. 2014. Adam: A method for stochastic
optimization.
[50] D.E. Kirk. 2004. Optimal Control Theory: An Introduction. Dover Books on
Electrical Engineering Series. Dover Publications.
[51] S. Koziel and Z. Michalewicz. March, 1999. Evolutionary algorithms,
homomorphous mappings, and constrained parameter optimization. Evolutionary
Computation 7(1): 19–44.
[52] H. Kuribayashi, M. Souza, D. Gomes, K. Silva, M. Silva, J. Costa and C.
Francês. 2020. Particle swarm-based cell range expansion for heterogeneous
mobile networks. IEEE Access, pp. 1–1.
[53] A.J. Owens, L.J. Fogel and M.J. Walsh. 2014. From chaos to order: How
ants optimize food search. https://www.pik-potsdam.de/news/pressreleases/
archive/2014/from-chaos-to-order-how-ants-optimize-foodsearch?
searchterm=from chaos to order.
[54] Larry J. Eshelman, Richard A. Caruana and J. David Schaffer. 1989. Biases in
the crossover landscape. pp. 10–19. In: Proceedings of the 3rd International
Conference on Genetic Algorithms, Morgan Kaufman.
[55] K. Li, K. Deb and X. Yao. 2018. R-metric: Evaluating the performance of
preference-based evolutionary multiobjective optimization using reference
points. IEEE Transactions on Evolutionary Computation 22(6): 821–835.
[56] W. Liu. 2020. Route optimization for last-mile distribution of rural e-commerce
logistics based on ant colony optimization. IEEE Access 8: 12179–12187.
[57] Antonio López Jaimes and Carlos Coello. 2015. Many-Objective Problems:
Challenges and Methods, pp. 1033–1046.
[58] M. Méndez, D.A. Rossit, B. González and M. Frutos. 2020. Proposal and
comparative study of evolutionary algorithms for optimum design of a gear
system. IEEE Access 8: 3482–3497.
[59] Zbigniew Michalewicz. November, 2012. Ubiquity symposium: Evolutionary
computation and the processes of life: The emperor is naked: Evolutionary
algorithms for real-world applications. Ubiquity 2012(November): 3:1–3:13.
[60] Kaisa Miettinen. 1998. Nonlinear Multiobjective Optimization, volume 12
of International Series in Operations Research & Management Science.
Springer US.
[61] M. Mitchell. 1999. An Introduction to Genetic Algorithms. The MIT Press,
Cambridge, MA.
[62] H. Mühlenbein and D. Schlierkamp-Voosen. 1993. Predictive models for the
breeder genetic algorithm. Evolutionary Computation 1(1): 25–49.
[63] Andrew Nelson. 2019. Evolutionary Robotics (accessed November 10, 2019).
[64] Dirk Arnold, Nikolaus Hansen and Anne Auger. 2015. Evolution strategies.
In: Janusz Kacprzyk and Witold Pedrycz (eds.). Handbook of Computational
Intelligence.
[65] Michael A. Osborne, Roman Garnett and Stephen J. Roberts. 2009. Gaussian
processes for global optimization. 3rd International Conference on Learning
and Intelligent Optimization (LION3), pp. 1–15.
[66] Victor Picheny, Tobias Wagner and David Ginsbourger. 2013. A benchmark
of kriging-based infill criteria for noisy optimization. Structural and
Multidisciplinary Optimization 48(3): 607–626.
[67] Kenneth Price and Rainer Storn. 2020. Differential evolution (DE) for
continuous function optimization (an algorithm by Kenneth Price and Rainer
Storn). Retrieved April 04, 2020, from https://www1.icsi.berkeley.edu/~storn/
code.html.
[68] N. Radcliff. 1991. Forma analysis and random respectful recombination. In:
Morgan Kauffman (ed.). Proc. 4th Int. Conf. on Genetic Algorithms. San
Mateo, CA.
[69] Pratyusha Rakshit, Amit Konar and Swagatam Das. 2017. Noisy evolutionary
optimization algorithms—a comprehensive survey. Swarm and Evolutionary
Computation 33: 18–45.
[70] I. Rechenberg. 1973. Evolutionsstrategie: Optimierung technischer Systeme
nach Prinzipien der biologischen Evolution. Number 15 in Problemata.
Frommann-Holzboog, Stuttgart-Bad Cannstatt.
[71] David E. Rumelhart and James L. McClelland. 1987. Parallel Distributed
Processing: Explorations in the Microstructure of Cognition: Foundations,
volume 1. MIT Press.
[72] Tim Salimans, Jonathan Ho, Xi Chen and Ilya Sutskever. 2017. Evolution
strategies as a scalable alternative to reinforcement learning. CoRR,
abs/1703.03864.
[73] E. Sandgren. 1990. Nonlinear integer and discrete programming in mechanical
design optimization. Journal of Mechanical Design 112(2): 223–229.
[74] H. Sato, H.E. Aguirre and K. Tanaka. 2004. Local dominance using polar
coordinates to enhance multiobjective evolutionary algorithms. In: Proceedings
of the 2004 Congress on Evolutionary Computation (IEEE Cat. No.04TH8753)
1: 188–195.
[75] Hans-Paul Schwefel. 1977. Numerische Optimierung von Computer-Modellen
mittels der Evolutionsstrategie. Birkhäuser.
[76] Y. Shi and R. Eberhart. May, 1998. A modified particle swarm optimizer. In:
1998 IEEE International Conference on Evolutionary Computation Proceedings,
IEEE World Congress on Computational Intelligence (Cat. No.98TH8360),
pp. 69–73.
[91] A.H. Wright. 1991. Genetic algorithms for real parameter optimization.
pp. 205–218. In: J.E. Rawlins (ed.). Foundations of Genetic Algorithms,
Morgan Kaufmann.
[92] Q. Zhang and H. Li. 2007. Moea/d: A multiobjective evolutionary algorithm
based on decomposition. IEEE Transactions on Evolutionary Computation
11(6): 712–731.
[93] R. Zhang, S. Song and C. Wu. April, 2020. Robust scheduling of hot rolling
production by local search enhanced ant colony optimization algorithm. IEEE
Transactions on Industrial Informatics 16(4): 2809–2819.
[94] Eckart Zitzler, Kalyanmoy Deb and Lothar Thiele. 2000. Comparison of
multiobjective evolutionary algorithms: Empirical results. Evolutionary
Computation 8: 173–195.