Engineering Optimization
Theory and Practice

Fifth Edition

Singiresu S Rao
University of Miami
Coral Gables, Florida
This edition first published 2020
© 2020 John Wiley & Sons, Inc.

Edition History
John Wiley & Sons Ltd (4e, 2009)

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or
transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise,
except as permitted by law. Advice on how to obtain permission to reuse material from this title is available
at http://www.wiley.com/go/permissions.

The right of Singiresu S Rao to be identified as the author of this work has been asserted in accordance
with law.

Registered Offices
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA

Editorial Office
The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK

For details of our global editorial offices, customer services, and more information about Wiley products
visit us at www.wiley.com.

Wiley also publishes its books in a variety of electronic formats and by print-on-demand. Some content that
appears in standard print versions of this book may not be available in other formats.

Limit of Liability/Disclaimer of Warranty


While the publisher and authors have used their best efforts in preparing this work, they make no
representations or warranties with respect to the accuracy or completeness of the contents of this work and
specifically disclaim all warranties, including without limitation any implied warranties of merchantability or
fitness for a particular purpose. No warranty may be created or extended by sales representatives, written
sales materials or promotional statements for this work. The fact that an organization, website, or product is
referred to in this work as a citation and/or potential source of further information does not mean that the
publisher and authors endorse the information or services the organization, website, or product may provide
or recommendations it may make. This work is sold with the understanding that the publisher is not engaged
in rendering professional services. The advice and strategies contained herein may not be suitable for your
situation. You should consult with a specialist where appropriate. Further, readers should be aware that
websites listed in this work may have changed or disappeared between when this work was written and when
it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial
damages, including but not limited to special, incidental, consequential, or other damages.

Library of Congress Cataloging-in-Publication Data


Names: Rao, Singiresu S., 1944- author.
Title: Engineering optimization : theory and practice / Singiresu S Rao,
University of Miami, Coral Gables, FL, US.
Description: Fifth edition. | Hoboken, NJ, USA : John Wiley & Sons, Inc.,
2020. | Includes bibliographical references and index. |
Identifiers: LCCN 2019011088 (print) | LCCN 2019011277 (ebook) | ISBN
9781119454762 (Adobe PDF) | ISBN 9781119454793 (ePub) | ISBN 9781119454717
(hardback)
Subjects: LCSH: Engineering–Mathematical models. | Mathematical optimization.
Classification: LCC TA342 (ebook) | LCC TA342 .R36 2019 (print) | DDC
620.001/5196–dc23
LC record available at https://lccn.loc.gov/2019011088

Cover Design: Wiley


Cover Image: © Nayanba Jadeja/Getty Images

Set in 10.25/12pt TimesLTStd by SPi Global, Chennai, India

Printed in the United States of America

10 9 8 7 6 5 4 3 2 1
Contents

Preface xvii
Acknowledgment xxi
About the Author xxiii
About the Companion Website xxv

1 Introduction to Optimization 1
1.1 Introduction 1
1.2 Historical Development 3
1.2.1 Modern Methods of Optimization 4
1.3 Engineering Applications of Optimization 5
1.4 Statement of An Optimization Problem 6
1.4.1 Design Vector 6
1.4.2 Design Constraints 7
1.4.3 Constraint Surface 7
1.4.4 Objective Function 8
1.4.5 Objective Function Surfaces 9
1.5 Classification of Optimization Problems 14
1.5.1 Classification Based on the Existence of Constraints 14
1.5.2 Classification Based on the Nature of the Design Variables 14
1.5.3 Classification Based on the Physical Structure of the Problem 15
1.5.4 Classification Based on the Nature of the Equations Involved 18
1.5.5 Classification Based on the Permissible Values of the Design Variables 27
1.5.6 Classification Based on the Deterministic Nature of the Variables 28
1.5.7 Classification Based on the Separability of the Functions 29
1.5.8 Classification Based on the Number of Objective Functions 31
1.6 Optimization Techniques 33
1.7 Engineering Optimization Literature 34
1.8 Solutions Using MATLAB 34
References and Bibliography 34
Review Questions 40
Problems 41

2 Classical Optimization Techniques 57


2.1 Introduction 57
2.2 Single-Variable Optimization 57
2.3 Multivariable Optimization with no Constraints 62
2.3.1 Definition: rth Differential of f 62
2.3.2 Semidefinite Case 67
2.3.3 Saddle Point 67
2.4 Multivariable Optimization with Equality Constraints 69
2.4.1 Solution by Direct Substitution 69
2.4.2 Solution by the Method of Constrained Variation 71
2.4.3 Solution By the Method of Lagrange Multipliers 77


2.5 Multivariable Optimization with Inequality Constraints 85


2.5.1 Kuhn–Tucker Conditions 90
2.5.2 Constraint Qualification 90
2.6 Convex Programming Problem 96
References and Bibliography 96
Review Questions 97
Problems 98

3 Linear Programming I: Simplex Method 109


3.1 Introduction 109
3.2 Applications of Linear Programming 110
3.3 Standard form of a Linear Programming Problem 112
3.3.1 Scalar Form 112
3.3.2 Matrix Form 112
3.4 Geometry of Linear Programming Problems 114
3.5 Definitions and Theorems 117
3.5.1 Definitions 117
3.5.2 Theorems 120
3.6 Solution of a System of Linear Simultaneous Equations 122
3.7 Pivotal Reduction of a General System of Equations 123
3.8 Motivation of the Simplex Method 127
3.9 Simplex Algorithm 128
3.9.1 Identifying an Optimal Point 128
3.9.2 Improving a Nonoptimal Basic Feasible Solution 129
3.10 Two Phases of the Simplex Method 137
3.11 Solutions Using MATLAB 143
References and Bibliography 143
Review Questions 143
Problems 145

4 Linear Programming II: Additional Topics and Extensions 159


4.1 Introduction 159
4.2 Revised Simplex Method 159
4.3 Duality in Linear Programming 173
4.3.1 Symmetric Primal–Dual Relations 173
4.3.2 General Primal–Dual Relations 174
4.3.3 Primal–Dual Relations when the Primal Is in Standard Form 175
4.3.4 Duality Theorems 176
4.3.5 Dual Simplex Method 176
4.4 Decomposition Principle 180
4.5 Sensitivity or Postoptimality Analysis 187
4.5.1 Changes in the Right-Hand-Side Constants bi 188
4.5.2 Changes in the Cost Coefficients cj 192
4.5.3 Addition of New Variables 194
4.5.4 Changes in the Constraint Coefficients aij 195
4.5.5 Addition of Constraints 197
4.6 Transportation Problem 199

4.7 Karmarkar’s Interior Method 202


4.7.1 Statement of the Problem 203
4.7.2 Conversion of an LP Problem into the Required Form 203
4.7.3 Algorithm 205
4.8 Quadratic Programming 208
4.9 Solutions Using MATLAB 214
References and Bibliography 214
Review Questions 215
Problems 216

5 Nonlinear Programming I: One-Dimensional Minimization Methods 225


5.1 Introduction 225
5.2 Unimodal Function 230

ELIMINATION METHODS 231

5.3 Unrestricted Search 231


5.3.1 Search with Fixed Step Size 231
5.3.2 Search with Accelerated Step Size 232
5.4 Exhaustive Search 232
5.5 Dichotomous Search 234
5.6 Interval Halving Method 236
5.7 Fibonacci Method 238
5.8 Golden Section Method 243
5.9 Comparison of Elimination Methods 246

INTERPOLATION METHODS 247

5.10 Quadratic Interpolation Method 248


5.11 Cubic Interpolation Method 253
5.12 Direct Root Methods 259
5.12.1 Newton Method 259
5.12.2 Quasi-Newton Method 261
5.12.3 Secant Method 263
5.13 Practical Considerations 265
5.13.1 How to Make the Methods Efficient and More Reliable 265
5.13.2 Implementation in Multivariable Optimization Problems 266
5.13.3 Comparison of Methods 266
5.14 Solutions Using MATLAB 267
References and Bibliography 267
Review Questions 267
Problems 268

6 Nonlinear Programming II: Unconstrained Optimization Techniques 273


6.1 Introduction 273
6.1.1 Classification of Unconstrained Minimization Methods 276
6.1.2 General Approach 276
6.1.3 Rate of Convergence 276
6.1.4 Scaling of Design Variables 277

DIRECT SEARCH METHODS 280

6.2 Random Search Methods 280


6.2.1 Random Jumping Method 280
6.2.2 Random Walk Method 282
6.2.3 Random Walk Method with Direction Exploitation 283
6.2.4 Advantages of Random Search Methods 284
6.3 Grid Search Method 285
6.4 Univariate Method 285
6.5 Pattern Directions 288
6.6 Powell’s Method 289
6.6.1 Conjugate Directions 289
6.6.2 Algorithm 293
6.7 Simplex Method 298
6.7.1 Reflection 298
6.7.2 Expansion 301
6.7.3 Contraction 301

INDIRECT SEARCH (DESCENT) METHODS 304

6.8 Gradient of a Function 304


6.8.1 Evaluation of the Gradient 306
6.8.2 Rate of Change of a Function Along a Direction 307
6.9 Steepest Descent (Cauchy) Method 308
6.10 Conjugate Gradient (Fletcher–Reeves) Method 310
6.10.1 Development of the Fletcher–Reeves Method 310
6.10.2 Fletcher–Reeves Method 311
6.11 Newton’s Method 313
6.12 Marquardt Method 316
6.13 Quasi-Newton Methods 317
6.13.1 Computation of [Bi ] 318
6.13.2 Rank 1 Updates 319
6.13.3 Rank 2 Updates 320
6.14 Davidon–Fletcher–Powell Method 321
6.15 Broyden–Fletcher–Goldfarb–Shanno Method 327
6.16 Test Functions 330
6.17 Solutions Using MATLAB 332
References and Bibliography 333
Review Questions 334
Problems 336

7 Nonlinear Programming III: Constrained Optimization Techniques 347


7.1 Introduction 347
7.2 Characteristics of a Constrained Problem 347

DIRECT METHODS 350

7.3 Random Search Methods 350


7.4 Complex Method 351
7.5 Sequential Linear Programming 353

7.6 Basic Approach in the Methods of Feasible Directions 360


7.7 Zoutendijk’s Method of Feasible Directions 360
7.7.1 Direction-Finding Problem 362
7.7.2 Determination of Step Length 364
7.7.3 Termination Criteria 367
7.8 Rosen’s Gradient Projection Method 369
7.8.1 Determination of Step Length 372
7.9 Generalized Reduced Gradient Method 377
7.10 Sequential Quadratic Programming 386
7.10.1 Derivation 386
7.10.2 Solution Procedure 389

INDIRECT METHODS 392

7.11 Transformation Techniques 392


7.12 Basic Approach of the Penalty Function Method 394
7.13 Interior Penalty Function Method 396
7.14 Convex Programming Problem 405
7.15 Exterior Penalty Function Method 406
7.16 Extrapolation Techniques in the Interior Penalty Function Method 410
7.16.1 Extrapolation of the Design Vector X 410
7.16.2 Extrapolation of the Function f 412
7.17 Extended Interior Penalty Function Methods 414
7.17.1 Linear Extended Penalty Function Method 414
7.17.2 Quadratic Extended Penalty Function Method 415
7.18 Penalty Function Method for Problems with Mixed Equality and Inequality
Constraints 416
7.18.1 Interior Penalty Function Method 416
7.18.2 Exterior Penalty Function Method 418
7.19 Penalty Function Method for Parametric Constraints 418
7.19.1 Parametric Constraint 418
7.19.2 Handling Parametric Constraints 420
7.20 Augmented Lagrange Multiplier Method 422
7.20.1 Equality-Constrained Problems 422
7.20.2 Inequality-Constrained Problems 423
7.20.3 Mixed Equality–Inequality-Constrained Problems 425
7.21 Checking the Convergence of Constrained Optimization Problems 426
7.21.1 Perturbing the Design Vector 427
7.21.2 Testing the Kuhn–Tucker Conditions 427
7.22 Test Problems 428
7.22.1 Design of a Three-Bar Truss 429
7.22.2 Design of a Twenty-Five-Bar Space Truss 430
7.22.3 Welded Beam Design 431
7.22.4 Speed Reducer (Gear Train) Design 433
7.22.5 Heat Exchanger Design [7.42] 435
7.23 Solutions Using MATLAB 435
References and Bibliography 435
Review Questions 437
Problems 439

8 Geometric Programming 449


8.1 Introduction 449
8.2 Posynomial 449
8.3 Unconstrained Minimization Problem 450
8.4 Solution of an Unconstrained Geometric Programming Problem Using Differential
Calculus 450
8.4.1 Degree of Difficulty 453
8.4.2 Sufficiency Condition 453
8.4.3 Finding the Optimal Values of Design Variables 453
8.5 Solution of an Unconstrained Geometric Programming Problem Using
Arithmetic–Geometric Inequality 457
8.6 Primal–dual Relationship and Sufficiency Conditions in the Unconstrained
Case 458
8.6.1 Primal and Dual Problems 461
8.6.2 Computational Procedure 461
8.7 Constrained Minimization 464
8.8 Solution of a Constrained Geometric Programming Problem 465
8.8.1 Optimum Design Variables 466
8.9 Primal and Dual Programs in the Case of Less-than Inequalities 466
8.10 Geometric Programming with Mixed Inequality Constraints 473
8.11 Complementary Geometric Programming 475
8.11.1 Solution Procedure 477
8.11.2 Degree of Difficulty 478
8.12 Applications of Geometric Programming 480
References and Bibliography 491
Review Questions 493
Problems 493

9 Dynamic Programming 497


9.1 Introduction 497
9.2 Multistage Decision Processes 498
9.2.1 Definition and Examples 498
9.2.2 Representation of a Multistage Decision Process 499
9.2.3 Conversion of a Nonserial System to a Serial System 500
9.2.4 Types of Multistage Decision Problems 501
9.3 Concept of Suboptimization and Principle of Optimality 501
9.4 Computational Procedure in Dynamic Programming 505
9.5 Example Illustrating the Calculus Method of Solution 507
9.6 Example Illustrating the Tabular Method of Solution 512
9.6.1 Suboptimization of Stage 1 (Component 1) 514
9.6.2 Suboptimization of Stages 2 and 1 (Components 2 and 1) 514
9.6.3 Suboptimization of Stages 3, 2, and 1 (Components 3, 2, and 1) 515
9.7 Conversion of a Final Value Problem into an Initial Value Problem 517
9.8 Linear Programming as a Case of Dynamic Programming 519
9.9 Continuous Dynamic Programming 523
9.10 Additional Applications 526
9.10.1 Design of Continuous Beams 526
9.10.2 Optimal Layout (Geometry) of a Truss 527

9.10.3 Optimal Design of a Gear Train 528


9.10.4 Design of a Minimum-Cost Drainage System 529
References and Bibliography 530
Review Questions 531
Problems 532

10 Integer Programming 537


10.1 Introduction 537

INTEGER LINEAR PROGRAMMING 538

10.2 Graphical Representation 538


10.3 Gomory’s Cutting Plane Method 540
10.3.1 Concept of a Cutting Plane 540
10.3.2 Gomory’s Method for All-Integer Programming Problems 541
10.3.3 Gomory’s Method for Mixed-Integer Programming Problems 547
10.4 Balas’ Algorithm for Zero–One Programming Problems 551

INTEGER NONLINEAR PROGRAMMING 553

10.5 Integer Polynomial Programming 553


10.5.1 Representation of an Integer Variable by an Equivalent System of Binary
Variables 553
10.5.2 Conversion of a Zero–One Polynomial Programming Problem into a
Zero–One LP Problem 555
10.6 Branch-and-Bound Method 556
10.7 Sequential Linear Discrete Programming 561
10.8 Generalized Penalty Function Method 564
10.9 Solutions Using MATLAB 569
References and Bibliography 569
Review Questions 570
Problems 571

11 Stochastic Programming 575


11.1 Introduction 575
11.2 Basic Concepts of Probability Theory 575
11.2.1 Definition of Probability 575
11.2.2 Random Variables and Probability Density Functions 576
11.2.3 Mean and Standard Deviation 578
11.2.4 Function of a Random Variable 580
11.2.5 Jointly Distributed Random Variables 581
11.2.6 Covariance and Correlation 583
11.2.7 Functions of Several Random Variables 583
11.2.8 Probability Distributions 585
11.2.9 Central Limit Theorem 589
11.3 Stochastic Linear Programming 589
11.4 Stochastic Nonlinear Programming 594
11.4.1 Objective Function 594
11.4.2 Constraints 595
11.5 Stochastic Geometric Programming 600

References and Bibliography 602


Review Questions 603
Problems 604

12 Optimal Control and Optimality Criteria Methods 609


12.1 Introduction 609
12.2 Calculus of Variations 609
12.2.1 Introduction 609
12.2.2 Problem of Calculus of Variations 610
12.2.3 Lagrange Multipliers and Constraints 615
12.2.4 Generalization 618
12.3 Optimal Control Theory 619
12.3.1 Necessary Conditions for Optimal Control 619
12.3.2 Necessary Conditions for a General Problem 621
12.4 Optimality Criteria Methods 622
12.4.1 Optimality Criteria with a Single Displacement Constraint 623
12.4.2 Optimality Criteria with Multiple Displacement Constraints 624
12.4.3 Reciprocal Approximations 625
References and Bibliography 628
Review Questions 628
Problems 629

13 Modern Methods of Optimization 633


13.1 Introduction 633
13.2 Genetic Algorithms 633
13.2.1 Introduction 633
13.2.2 Representation of Design Variables 634
13.2.3 Representation of Objective Function and Constraints 635
13.2.4 Genetic Operators 636
13.2.5 Algorithm 640
13.2.6 Numerical Results 641
13.3 Simulated Annealing 641
13.3.1 Introduction 641
13.3.2 Procedure 642
13.3.3 Algorithm 643
13.3.4 Features of the Method 644
13.3.5 Numerical Results 644
13.4 Particle Swarm Optimization 647
13.4.1 Introduction 647
13.4.2 Computational Implementation of PSO 648
13.4.3 Improvement to the Particle Swarm Optimization Method 649
13.4.4 Solution of the Constrained Optimization Problem 649
13.5 Ant Colony Optimization 652
13.5.1 Basic Concept 652
13.5.2 Ant Searching Behavior 653
13.5.3 Path Retracing and Pheromone Updating 654
13.5.4 Pheromone Trail Evaporation 654
13.5.5 Algorithm 655

13.6 Optimization of Fuzzy Systems 660


13.6.1 Fuzzy Set Theory 660
13.6.2 Optimization of Fuzzy Systems 662
13.6.3 Computational Procedure 663
13.6.4 Numerical Results 664
13.7 Neural-Network-Based Optimization 665
References and Bibliography 667
Review Questions 669
Problems 671

14 Metaheuristic Optimization Methods 673


14.1 Definitions 673
14.2 Metaphors Associated with Metaheuristic Optimization Methods 673
14.3 Details of Representative Metaheuristic Algorithms 680
14.3.1 Crow Search Algorithm 680
14.3.2 Firefly Optimization Algorithm (FA) 681
14.3.3 Harmony Search Algorithm 684
14.3.4 Teaching-Learning-Based Optimization (TLBO) 687
14.3.5 Honey Bee Swarm Optimization Algorithm 689
References and Bibliography 692
Review Questions 694

15 Practical Aspects of Optimization 697


15.1 Introduction 697
15.2 Reduction of Size of an Optimization Problem 697
15.2.1 Reduced Basis Technique 697
15.2.2 Design Variable Linking Technique 698
15.3 Fast Reanalysis Techniques 700
15.3.1 Incremental Response Approach 700
15.3.2 Basis Vector Approach 704
15.4 Derivatives of Static Displacements and Stresses 705
15.5 Derivatives of Eigenvalues and Eigenvectors 707
15.5.1 Derivatives of 𝜆i 707
15.5.2 Derivatives of Yi 708
15.6 Derivatives of Transient Response 709
15.7 Sensitivity of Optimum Solution to Problem Parameters 712
15.7.1 Sensitivity Equations Using Kuhn–Tucker Conditions 712
15.7.2 Sensitivity Equations Using the Concept of Feasible Direction 714
References and Bibliography 715
Review Questions 716
Problems 716

16 Multilevel and Multiobjective Optimization 721


16.1 Introduction 721
16.2 Multilevel Optimization 721
16.2.1 Basic Idea 721
16.2.2 Method 722
16.3 Parallel Processing 726

16.4 Multiobjective Optimization 729


16.4.1 Utility Function Method 730
16.4.2 Inverted Utility Function Method 730
16.4.3 Global Criterion Method 730
16.4.4 Bounded Objective Function Method 730
16.4.5 Lexicographic Method 731
16.4.6 Goal Programming Method 732
16.4.7 Goal Attainment Method 732
16.4.8 Game Theory Approach 733
16.5 Solutions Using MATLAB 735
References and Bibliography 735
Review Questions 736
Problems 737

17 Solution of Optimization Problems Using MATLAB 739


17.1 Introduction 739
17.2 Solution of General Nonlinear Programming Problems 740
17.3 Solution of Linear Programming Problems 742
17.4 Solution of LP Problems Using Interior Point Method 743
17.5 Solution of Quadratic Programming Problems 745
17.6 Solution of One-Dimensional Minimization Problems 746
17.7 Solution of Unconstrained Optimization Problems 746
17.8 Solution of Constrained Optimization Problems 747
17.9 Solution of Binary Programming Problems 750
17.10 Solution of Multiobjective Problems 751
References and Bibliography 755
Problems 755

A Convex and Concave Functions 761

B Some Computational Aspects of Optimization 767


B.1 Choice of Method 767
B.2 Comparison of Unconstrained Methods 767
B.3 Comparison of Constrained Methods 768
B.4 Availability of Computer Programs 769
B.5 Scaling of Design Variables and Constraints 770
B.6 Computer Programs for Modern Methods of Optimization 771
References and Bibliography 772

C Introduction to MATLAB® 773


C.1 Features and Special Characters 773
C.2 Defining Matrices in MATLAB 774
C.3 Creating m-Files 775
C.4 Optimization Toolbox 775

Answers to Selected Problems 777

Index 787
Preface
The ever-increasing demand on engineers to lower production costs to withstand
global competition has prompted engineers to look for rigorous methods of decision
making, such as optimization methods, to design and produce products and systems
both economically and efficiently. Optimization techniques, having reached a degree
of maturity by now, are being used in a wide spectrum of industries, including
aerospace, automotive, chemical, electrical, construction, and manufacturing indus-
tries. With rapidly advancing computer technology, computers are becoming more
powerful, and correspondingly, the size and the complexity of the problems that
can be solved using optimization techniques are also increasing. Optimization
methods, coupled with modern tools of computer-aided design, are also being used
to enhance the creative process of conceptual and detailed design of engineering
systems.
The purpose of this textbook is to present the techniques and applications of
engineering optimization in a comprehensive manner. The style of prior editions has
been retained, with the theory, computational aspects, and applications of engineering
optimization presented with detailed explanations. As in previous editions, essential
proofs and developments of the various techniques are given in a simple manner
without sacrificing accuracy. New concepts are illustrated with the help of numerical
examples. Although most engineering design problems can be solved using non-
linear programming techniques, there are a variety of engineering applications for
which other optimization methods, such as linear, geometric, dynamic, integer, and
stochastic programming techniques, are most suitable. The theory and applications
of all these techniques are also presented in the book. Some of the recently developed
optimization methods, such as genetic algorithms, simulated annealing, particle
swarm optimization, ant colony optimization, neural-network-based methods, and
fuzzy optimization, do not belong to the traditional mathematical programming
approaches. These methods are presented as modern methods of optimization. More
recently, a class of optimization methods, termed metaheuristic optimization methods,
has been evolving in the literature. The metaheuristic methods are also included in
this edition. Favorable reactions and encouragement from professors,
students, and other users of the book have provided me with the impetus to prepare
this fifth edition of the book. The following changes have been made from the previous
edition:
• Some less-important sections were condensed or deleted.
• Some sections were rewritten for better clarity.
• Some sections were expanded.
• Some of the recently developed methods are reorganized in the form of a new
chapter titled Modern Methods of Optimization.
• A new chapter titled Metaheuristic Optimization Methods is added, covering the
crow search, firefly, harmony search, teaching-learning, and honey bee swarm
optimization algorithms.
• A new chapter titled Solution of Optimization Problems Using MATLAB is added
to illustrate the use of MATLAB for the solution of different types of
optimization problems.


Features
Each topic in Engineering Optimization: Theory and Practice is self-contained, with
all concepts explained fully and the derivations presented with complete details. The
computational aspects are emphasized throughout with design examples and prob-
lems taken from several fields of engineering to make the subject appealing to all
branches of engineering. A large number of solved examples, review questions, prob-
lems, project-type problems, figures, and references are included to enhance the pre-
sentation of the material.
Specific features of the book include:

• More than 155 illustrative examples accompanying most topics.


• More than 540 references to the literature of engineering optimization theory
and applications.
• More than 485 review questions to help students in reviewing and testing their
understanding of the text material.
• More than 600 problems, with solutions to most problems in the instructor’s
manual.
• More than 12 examples to illustrate the use of MATLAB for the numerical solution
of optimization problems.
• Answers to review questions at the web site of the book, http://www.wiley.com/
rao.
• Answers to selected problems are given at the end of the book.

I used different parts of the book to teach optimum design and engineering opti-
mization courses at the junior/senior level as well as first-year-graduate-level at Indian
Institute of Technology, Kanpur, India; Purdue University, West Lafayette, Indiana;
and University of Miami, Coral Gables, Florida. At University of Miami, I cover
Chapter 1 and parts of Chapters 2, 3, 5, 6, 7, and 13 in a dual-level course entitled
Optimization in Design. In this course, a design project is also assigned to each student
in which the student identifies, formulates, and solves a practical engineering problem
of his/her interest by applying or modifying an optimization technique. This design
project gives the student a feeling for ways that optimization methods work in practice.
In addition, I teach a graduate level course titled Mechanical System Optimization in
which I cover Chapters 1–7, and parts of Chapters 9, 10, 11, 13, and 17. The book can
also be used, with some supplementary material, for courses with different emphasis
such as Structural Optimization, System Optimization and Optimization Theory and
Practice. The relative simplicity with which the various topics are presented makes the
book useful both to students and to practicing engineers for purposes of self-study. The
book also serves as a reference source for different engineering optimization applica-
tions. Although the emphasis of the book is on engineering applications, it would also
be useful to other areas, such as operations research and economics. A knowledge of
matrix theory and differential calculus is assumed on the part of the reader.

Contents
The book consists of 17 chapters and 3 appendixes. Chapter 1 provides an introduc-
tion to engineering optimization and optimum design and an overview of optimization
methods. The concepts of design space, constraint surfaces, and contours of objective
function are introduced here. In addition, the formulation of various types of optimiza-
tion problems is illustrated through a variety of examples taken from various fields of
engineering. Chapter 2 reviews the essentials of differential calculus useful in finding
the maxima and minima of functions of several variables. The methods of constrained
variation and Lagrange multipliers are presented for solving problems with equal-
ity constraints. The Kuhn–Tucker conditions for inequality-constrained problems are
given along with a discussion of convex programming problems.
Chapters 3 and 4 deal with the solution of linear programming problems. The
characteristics of a general linear programming problem and the development of the
simplex method of solution are given in Chapter 3. Some advanced topics in linear
programming, such as the revised simplex method, duality theory, the decomposition
principle, and post-optimality analysis, are discussed in Chapter 4. The extension of
linear programming to solve quadratic programming problems is also considered in
Chapter 4.
Chapters 5–7 deal with the solution of nonlinear programming problems. In
Chapter 5, numerical methods of finding the optimum solution of a function of
a single variable are given. Chapter 6 deals with the methods of unconstrained
optimization. The algorithms for various zeroth-, first-, and second-order techniques
are discussed along with their computational aspects. Chapter 7 is concerned with the
solution of nonlinear optimization problems in the presence of inequality and equality
constraints. Both the direct and indirect methods of optimization are discussed. The
methods presented in this chapter can be treated as the most general techniques for
the solution of any optimization problem.
Chapter 8 presents the techniques of geometric programming. The solution tech-
niques for problems of mixed inequality constraints and complementary geometric
programming are also considered. In Chapter 9, computational procedures for solving
discrete and continuous dynamic programming problems are presented. The prob-
lem of dimensionality is also discussed. Chapter 10 introduces integer programming
and gives several algorithms for solving integer and discrete linear and nonlinear
optimization problems. Chapter 11 reviews the basic probability theory and presents
techniques of stochastic linear, nonlinear, and geometric programming. The theory and
applications of calculus of variations, optimal control theory, and optimality criteria
methods are discussed briefly in Chapter 12. Chapter 13 presents several modern meth-
ods of optimization including genetic algorithms, simulated annealing, particle swarm
optimization, ant colony optimization, neural-network-based methods, and fuzzy sys-
tem optimization. Chapter 14 deals with metaheuristic optimization algorithms and
introduces nearly 20 algorithms with emphasis on Crow search, Firefly, Harmony
search, Teaching-Learning and Honey bee swarm optimization algorithms. The prac-
tical aspects of optimization, including reduction of size of problem, fast reanalysis
techniques and sensitivity of optimum solutions are discussed in Chapter 15. The mul-
tilevel and multiobjective optimization methods are covered in Chapter 16. Finally,
Chapter 17 presents the solution of different types of optimization problems using the
MATLAB software.
Appendix A presents the definitions and properties of convex and concave func-
tions. A brief discussion of the computational aspects and some of the commercial
optimization programs is given in Appendix B. Finally, Appendix C presents a brief
introduction to MATLAB, the Optimization Toolbox, and the use of MATLAB programs for
the solution of optimization problems. Answers to selected problems are given after
Appendix C.
Acknowledgment
I would like to acknowledge that the thousands of hours spent in writing/revising this
book were actually the hours that would otherwise have been spent with my wife
Kamala and other family members. My heartfelt gratitude is for Kamala’s patience
and sacrifices which could never be measured or quantified.

S.S. Rao
srao@miami.edu
April 2019

About the Author
Dr. S.S. Rao is a Professor in the Department of Mechanical and
Aerospace Engineering at University of Miami, Coral Gables, Florida.
He was the Chairman of the Mechanical and Aerospace Engineering
Department during 1998–2011 at University of Miami. Prior to that,
he was a Professor in the School of Mechanical Engineering at Purdue
University, West Lafayette, Indiana; Professor of Mechanical Engineering
at San Diego State University, San Diego, California; and Indian Institute
of Technology, Kanpur, India. He was a visiting research scientist for two
years at NASA Langley Research Center, Hampton, Virginia.

Professor Rao is the author of eight textbooks: The Finite Element Method in
Engineering, Engineering Optimization, Mechanical Vibrations, Reliability-Based
Design, Vibration of Continuous Systems, Reliability Engineering, Applied Numerical
Methods for Engineers and Scientists, Optimization Methods: Theory and Applica-
tions. He coedited a three-volume Encyclopedia of Vibration. He edited four volumes
of Proceedings of the ASME Design Automation and Vibration Conferences. He
has published over 200 journal papers in the areas of multiobjective optimization,
structural dynamics and vibration, structural control, uncertainty modeling, analysis,
design and optimization using probability, fuzzy, interval, evidence and grey system
theories. Under his supervision, 34 PhD students have received their degrees. In
addition, 12 Post-Doctoral researchers and scholars have conducted research under
the guidance of Dr. Rao.
Professor Rao has received numerous awards for academic and research achieve-
ments. He was awarded the Vepa Krishnamurti Gold Medal for University First Rank
in all the five years of the BE (Bachelor of Engineering) program among students
of all branches of engineering in all the Engineering Colleges of Andhra Univer-
sity. He was awarded the Lazarus Prize for University First Rank among students
of Mechanical Engineering in all the Engineering Colleges of Andhra University. He
received the First Prize in James F. Lincoln Design Contest open for all MS and PhD
students in USA and Canada for a paper he wrote on Automated Optimization of Air-
craft Wing Structures from his PhD dissertation. He received the Eliahu I. Jury Award
for Excellence in Research from the College of Engineering, University of Miami in
2002; was awarded the Distinguished Probabilistic Methods Educator Award from the
Society of Automotive Engineers (SAE) International for Demonstrated Excellence
in Research Contributions in the Application of Probabilistic Methods to Diversified
Fields, Including Aircraft Structures, Building Structures, Machine Tools, Aircondi-
tioning and Refrigeration Systems, and Mechanisms in 1999; received the American
Society of Mechanical Engineers (ASME) Design Automation Award for Pioneer-
ing Contributions to Design Automation, particularly in Multiobjective Optimization,
and Uncertainty Modeling, Analysis and Design Using Probability, Fuzzy, Interval,
and Evidence Theories in 2012; and was awarded the ASME Worcester Reed Warner
Medal in 2013 for Outstanding Contributions to the Permanent Literature of Engineering, particularly for his Many Highly Popular Books and Numerous Trendsetting
Research Papers. Dr. Rao received the Albert Nelson Marquis Lifetime Achievement
Award for demonstrated unwavering excellence in the field of Mechanical Engineer-
ing in 2018. In 2019, the American Society of Mechanical Engineers presented him
the J.P. Den Hartog Award for his Lifetime Achievements in research, teaching and
practice of Vibration Engineering.
About the Companion Website
This book is accompanied by a companion website:

www.wiley.com/go/Rao/engineering_optimization

The website includes:

• Answers to review questions


• Solution to selected problems

1

Introduction to Optimization

1.1 INTRODUCTION
Optimization is the act of obtaining the best result under given circumstances. In
design, construction, and maintenance of any engineering system, engineers have to
take many technological and managerial decisions at several stages. The ultimate goal
of all such decisions is either to minimize the effort required or to maximize the desired
benefit. Since the effort required or the benefit desired in any practical situation can
be expressed as a function of certain decision variables, optimization can be defined
as the process of finding the conditions that give the maximum or minimum value of
a function. It can be seen from Figure 1.1 that if a point x* corresponds to the mini-
mum value of function f (x), the same point also corresponds to the maximum value
of the negative of the function, −f (x). Thus, without loss of generality, optimization
can be taken to mean minimization, since the maximum of a function can be found by
seeking the minimum of the negative of the same function.
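This point is easy to verify numerically. The short MATLAB sketch below is illustrative only: the quadratic test function and the use of fminsearch (MATLAB's derivative-free minimizer) are assumptions made for the example and do not come from the text.

```matlab
% Minimal sketch (illustrative): find the maximum of f(x) by minimizing -f(x).
f = @(x) -(x - 2).^2 + 3;             % example function with its maximum at x* = 2
x_star = fminsearch(@(x) -f(x), 0);   % minimize the negative of f, starting from x = 0
fprintf('x* = %.4f, f(x*) = %.4f\n', x_star, f(x_star));   % prints approximately 2 and 3
```

The point returned by minimizing −f (x) is, as stated above, exactly the maximizer of f (x).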
In addition, the following operations on the objective function will not change the
optimum solution x* (see Figure 1.2):

1. Multiplication (or division) of f (x) by a positive constant c.


2. Addition (or subtraction) of a positive constant c to (or from) f (x).
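Both operations can be checked with the same kind of small numerical experiment. The sketch below is illustrative (the example objective and the constant c are assumed values, not from the text): the minimizer of f (x), of c·f (x), and of c + f (x) coincide, while only the minimum value changes.

```matlab
% Illustrative check: the optimum point x* is unchanged when f(x) is multiplied
% by, or shifted by, a positive constant c (only the optimum value changes).
f  = @(x) (x - 3).^2 + 1;              % example objective with minimum at x* = 3
c  = 5;                                % any positive constant
x1 = fminsearch(f, 0);                 % minimizer of f(x)
x2 = fminsearch(@(x) c*f(x), 0);       % minimizer of c*f(x)
x3 = fminsearch(@(x) c + f(x), 0);     % minimizer of c + f(x)
fprintf('%.4f  %.4f  %.4f\n', x1, x2, x3);   % all three are approximately 3
```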

There is no single method available for solving all optimization problems effi-
ciently. Hence a number of optimization methods have been developed for solving dif-
ferent types of optimization problems. The optimum seeking methods are also known
as mathematical programming techniques and are generally studied as a part of oper-
ations research. Operations research is a branch of mathematics concerned with the
application of scientific methods and techniques to decision-making problems and
with establishing the best or optimal solutions. The beginnings of the subject of oper-
ations research can be traced to the early period of World War II. During the war, the
British military faced the problem of allocating very scarce and limited resources (such
as fighter airplanes, radar, and submarines) to several activities (deployment to numer-
ous targets and destinations). Because there were no systematic methods available to
solve resource allocation problems, the military called upon a team of mathemati-
cians to develop methods for solving the problem in a scientific manner. The methods
developed by the team were instrumental in the winning of the Air Battle by Britain.
These methods, such as linear programming (LP), which were developed as a result
of research on (military) operations, subsequently became known as the methods of
operations research.
In recent years several new optimization methods that do not fall in the area of
traditional mathematical programming have been and are being developed. Most of


Figure 1.1 Minimum of f (x) is same as maximum of −f (x).

Figure 1.2 Optimum solution of cf (x) or c + f (x) same as that of f (x).

these new methods can be labeled as metaheuristic optimization methods. All the
metaheuristic optimization methods have the following features: (i) they use stochas-
tic or probabilistic ideas in various steps; (ii) they are intuitive or trial and error based,
or heuristic in nature; (iii) they all use strategies that imitate the behavior or character-
istics of some species such as bees, bats, birds, cuckoos, and fireflies; (iv) they all tend
to find the global optimum solution; and (v) they are most likely to find an optimum
solution, but not necessarily all the time.
Table 1.1 lists various mathematical programming techniques together with
other well-defined areas of operations research, including the new class of methods
termed metaheuristic optimization methods. The classification given in Table 1.1 is
not unique; it is given mainly for convenience.
Mathematical programming techniques are useful in finding the minimum of a
function of several variables under a prescribed set of constraints. Stochastic process
techniques can be used to analyze problems described by a set of random variables

Table 1.1 Methods of Operations Research.

Mathematical programming or optimization techniques: Calculus methods; Calculus of variations; Nonlinear programming; Geometric programming; Quadratic programming; Linear programming; Dynamic programming; Integer programming; Stochastic programming; Separable programming; Multiobjective programming; Network methods: Critical Path Method (CPM) and Program (Project) Evaluation and Review Technique (PERT); Game theory.

Stochastic process techniques: Statistical decision theory; Markov processes; Queueing theory; Renewal theory; Simulation methods; Reliability theory.

Statistical methods: Regression analysis; Cluster analysis, pattern recognition; Design of experiments; Discriminant analysis (factor analysis).

Modern or nontraditional optimization techniques (including metaheuristic optimization methods): Genetic algorithms; Simulated annealing; Ant colony optimization; Particle swarm optimization; Tabu search method; Teaching-learning algorithm; Fruitfly algorithm; Fuzzy optimization; Bat algorithm; Honey bee algorithm; Crow search algorithm; Firefly algorithm; Harmony search algorithm; Artificial immune system algorithm; Neural network-based optimization; Salp swarm algorithm; Cuckoo algorithm; Water evaporation algorithm; Passing vehicle search algorithm; Runner-root algorithm.

having known probability distributions. Statistical methods enable one to analyze the
experimental data and build empirical models to obtain the most accurate represen-
tation of the physical situation. This book deals with the theory and application of
mathematical programming techniques suitable for the solution of engineering design
problems. A separate chapter is devoted to the metaheuristic optimization methods.

1.2 HISTORICAL DEVELOPMENT


The existence of optimization methods can be traced to the days of Newton, Lagrange,
and Cauchy. The development of differential calculus methods of optimization was
possible because of the contributions of Newton and Leibnitz to calculus. The
foundations of calculus of variations, which deals with the minimization of func-
tionals, were laid by Bernoulli, Euler, Lagrange, and Weierstrass. The method of
optimization for constrained problems, which involves the addition of unknown
multipliers, became known by the name of its inventor, Lagrange. Cauchy made the
first application of the steepest descent method to solve unconstrained minimization
problems. Despite these early contributions, very little progress was made until the
middle of the twentieth century, when high-speed digital computers made imple-
mentation of the optimization procedures possible and stimulated further research
on new methods. Spectacular advances followed, producing a massive literature on
optimization techniques. This advancement also resulted in the emergence of several
well-defined new areas in optimization theory.
It is interesting to note that the major developments in the area of numerical meth-
ods of unconstrained optimization have been made in the United Kingdom only in the
1960s. The development of the simplex method by Dantzig in 1947 for linear pro-
gramming problems and the annunciation of the principle of optimality in 1957 by
Bellman for dynamic programming problems paved the way for development of the
methods of constrained optimization. Work by Kuhn and Tucker in 1951 on the necessary
and sufficient conditions for the optimal solution of programming problems
laid the foundations for a great deal of later research in nonlinear programming (NLP).
The contributions of Zoutendijk and Rosen to NLP during the early 1960s have been
significant. Although no single technique has been found to be universally applicable
for NLP problems, work of Carroll and Fiacco and McCormick allowed many difficult
problems to be solved by using the well-known techniques of unconstrained optimiza-
tion. Geometric programming (GMP) was developed in the 1960s by Duffin, Zener,
and Peterson. Gomory did pioneering work in integer programming, one of the most
exciting and rapidly developing areas of optimization. The reason for this is that most
real-world applications fall under this category of problems. Dantzig and Charnes and
Cooper developed stochastic programming techniques and solved problems by assum-
ing design parameters to be independent and normally distributed.
The desire to optimize more than one objective or goal while satisfying the phys-
ical limitations led to the development of multiobjective programming methods. Goal
programming is a well-known technique for solving specific types of multiobjec-
tive optimization problems. The goal programming was originally proposed for linear
problems by Charnes and Cooper in 1961. The foundations of game theory were laid
by von Neumann in 1928, and since then the technique has been applied to solve sev-
eral mathematical economics and military problems. Only during the last few years
has game theory been applied to solve engineering design problems.

1.2.1 Modern Methods of Optimization


The modern optimization methods, also sometimes called nontraditional optimization
methods, have emerged as powerful and popular methods for solving complex engi-
neering optimization problems in recent years. These methods include genetic algo-
rithms, simulated annealing, particle swarm optimization, ant colony optimization,
neural network-based optimization, and fuzzy optimization. The genetic algorithms
are computerized search and optimization algorithms based on the mechanics of natu-
ral genetics and natural selection. The genetic algorithms were originally proposed by
John Holland in 1975. The simulated annealing method is based on the mechanics of
the cooling process of molten metals through annealing. The method was originally
developed by Kirkpatrick, Gelatt, and Vecchi.
The particle swarm optimization algorithm mimics the behavior of social organ-
isms such as a colony or swarm of insects (for example, ants, termites, bees, and
wasps), a flock of birds, and a school of fish. The algorithm was originally proposed
by Kennedy and Eberhart in 1995. The ant colony optimization is based on the coop-
erative behavior of ant colonies, which are able to find the shortest path from their
nest to a food source. The method was first developed by Marco Dorigo in 1992. In
recent years, a class of methods, termed metaheuristic algorithms, are being devel-
oped for the solution of optimization problems. These include techniques such as the
firefly, harmony search, bee, cuckoo, bat, crow, teaching-learning, passing-vehicle,
and salp swarm algorithms. The neural network methods are based on the immense
computational power of the nervous system to solve perceptional problems in the pres-
ence of a massive amount of sensory data through its parallel processing capability.
The method was originally used for optimization by Hopfield and Tank in 1985. The
fuzzy optimization methods were developed to solve optimization problems involv-
ing design data, objective function, and constraints stated in imprecise form involving
vague and linguistic descriptions. The fuzzy approaches for single and multiobjective
optimization in engineering design were first presented by Rao in 1986.

1.3 ENGINEERING APPLICATIONS OF OPTIMIZATION


Optimization, in its broadest sense, can be applied to solve any engineering prob-
lem. Some typical applications from different engineering disciplines indicate the wide
scope of the subject:

1. Design of aircraft and aerospace structures for minimum weight


2. Finding the optimal trajectories of space vehicles
3. Design of civil engineering structures such as frames, foundations, bridges,
towers, chimneys, and dams for minimum cost
4. Minimum-weight design of structures for earthquake, wind, and other types
of random loading
5. Design of water resources systems for maximum benefit
6. Optimal plastic design of structures
7. Optimum design of linkages, cams, gears, machine tools, and other mechan-
ical components
8. Selection of machining conditions in metal-cutting processes for minimum
production cost
9. Design of material handling equipment, such as conveyors, trucks, and cranes,
for minimum cost
10. Design of pumps, turbines, and heat transfer equipment for maximum effi-
ciency
11. Optimum design of electrical machinery such as motors, generators, and
transformers
12. Optimum design of electrical networks
13. Shortest route taken by a salesperson visiting various cities during one tour
14. Optimal production planning, controlling, and scheduling
15. Analysis of statistical data and building empirical models from experimental
results to obtain the most accurate representation of the physical phenomenon
16. Optimum design of chemical processing equipment and plants
17. Design of optimum pipeline networks for process industries
18. Selection of a site for an industry
19. Planning of maintenance and replacement of equipment to reduce operating
costs
20. Inventory control
21. Allocation of resources or services among several activities to maximize the
benefit
22. Controlling the waiting and idle times and queueing in production lines to
reduce the costs
23. Planning the best strategy to obtain maximum profit in the presence of a com-
petitor
24. Optimum design of control systems

1.4 STATEMENT OF AN OPTIMIZATION PROBLEM


An optimization or a mathematical programming problem can be stated as follows.

$$
\begin{aligned}
&\text{Find } \mathbf{X} = \begin{Bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{Bmatrix} \text{ which minimizes } f(\mathbf{X}) \\
&\text{subject to the constraints} \\
&\qquad g_j(\mathbf{X}) \le 0, \quad j = 1, 2, \ldots, m \\
&\qquad l_j(\mathbf{X}) = 0, \quad j = 1, 2, \ldots, p
\end{aligned}
\tag{1.1}
$$

where X is an n-dimensional vector called the design vector, f (X) is termed the objec-
tive function, and gj (X) and lj (X) are known as inequality and equality constraints,
respectively. The number of variables n and the number of constraints m and/or p need
not be related in any way. The problem stated in Eq. (1.1) is called a constrained opti-
mization problem.1 Some optimization problems do not involve any constraints and
can be stated as
$$
\text{Find } \mathbf{X} = \begin{Bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{Bmatrix} \text{ which minimizes } f(\mathbf{X})
\tag{1.2}
$$

Such problems are called unconstrained optimization problems.
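To make the abstract statement concrete, the following sketch poses a small two-variable problem in the form of Eq. (1.1) and solves it with fmincon from MATLAB's Optimization Toolbox. The particular objective and constraint functions are invented for illustration; they are not taken from the text.

```matlab
% Illustrative sketch of Eq. (1.1) with n = 2, m = 1, p = 1:
%   minimize   f(X)  = (x1 - 2)^2 + (x2 - 1)^2
%   subject to g1(X) = x1^2 - x2 <= 0   and   l1(X) = x1 + x2 - 2 = 0.
% fmincon expects nonlinear constraints as a function returning [c, ceq],
% with c(X) <= 0 and ceq(X) = 0.
f       = @(X) (X(1) - 2)^2 + (X(2) - 1)^2;            % objective function f(X)
nonlcon = @(X) deal(X(1)^2 - X(2), X(1) + X(2) - 2);   % returns [g1(X), l1(X)]
X0      = [0; 0];                                      % starting design vector
Xopt    = fmincon(f, X0, [], [], [], [], [], [], nonlcon);
```

With all constraint arguments left empty, the same problem reduces to the unconstrained form of Eq. (1.2), for which fminunc or fminsearch would normally be used.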

1.4.1 Design Vector


Any engineering system or component is defined by a set of quantities, some of which
are viewed as variables during the design process. In general, certain quantities are
usually fixed at the outset and these are called preassigned parameters. All the other
quantities are treated as variables in the design process and are called design or deci-
sion variables xi , i = 1, 2, . . . , n. The design variables are collectively represented as
a design vector X = {x1 , x2 , . . . , xn }T . As an example, consider the design of the gear
pair shown in Figure 1.3, characterized by its face width b, number of teeth T1 and T2 ,
center distance d, pressure angle 𝜓, tooth profile, and material. If center distance d,
pressure angle 𝜓, tooth profile, and material of the gears are fixed in advance, these
quantities can be called preassigned parameters. The remaining quantities can be col-
lectively represented by a design vector X = {x1 , x2 , x3 }T = {b, T1 , T2 }T . If there are
no restrictions on the choice of b, T1 , and T2 , any set of three numbers will constitute a
design for the gear pair. If an n-dimensional Cartesian space with each coordinate axis
representing a design variable xi (i = 1, 2, . . . , n) is considered, the space is called the
design variable space or simply design space. Each point in the n-dimensional design

1 In the mathematical programming literature, the equality constraints lj (X) = 0, j = 1, 2, . . . , p are often neglected, for simplicity, in the statement of a constrained optimization problem, although several methods are available for handling problems with equality constraints.

Figure 1.3 Gear pair in mesh.

space is called a design point and represents either a possible or an impossible solution
to the design problem. In the case of the design of a gear pair, the design point {1.0,
20, 40}T , for example, represents a possible solution, whereas the design point {1.0,
−20, 40.5}T represents an impossible solution since it is not possible to have either a
negative value or a fractional value for the number of teeth.
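A minimal sketch of this screening step is shown below; the vector layout and the particular checks are illustrative assumptions rather than a procedure prescribed in the text. It flags the second design point above as impossible because of the negative and fractional numbers of teeth.

```matlab
% Illustrative sketch: screen a candidate gear-pair design point X = {b, T1, T2}'
% for obviously impossible values (non-positive face width, negative or
% fractional numbers of teeth).
X  = [1.0; -20; 40.5];                 % candidate design point {b, T1, T2}'
b  = X(1);  T1 = X(2);  T2 = X(3);
teeth_ok = (T1 > 0) && (T2 > 0) && (T1 == round(T1)) && (T2 == round(T2));
if (b > 0) && teeth_ok
    disp('possible design point');
else
    disp('impossible design point');   % here: T1 < 0 and T2 is fractional
end
```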

1.4.2 Design Constraints


In many practical problems, the design variables cannot be chosen arbitrarily; rather,
they have to satisfy certain specified functional and other requirements. The restric-
tions that must be satisfied to produce an acceptable design are collectively called
design constraints. Constraints that represent limitations on the behavior or perfor-
mance of the system are termed behavior or functional constraints. Constraints that
represent physical limitations on design variables, such as availability, fabricability,
and transportability, are known as geometric or side constraints. For example, for the
gear pair shown in Figure 1.3, the face width b cannot be taken smaller than a cer-
tain value, due to strength requirements. Similarly, the ratio of the numbers of teeth,
T1 /T2 , is dictated by the speeds of the input and output shafts, N1 and N2 . Since these
constraints depend on the performance of the gear pair, they are called behavior con-
straints. The values of T1 and T2 cannot be any real numbers but can only be integers.
Further, there can be upper and lower bounds on T1 and T2 due to manufacturing limi-
tations. Since these constraints depend on the physical limitations, they are called side
constraints.

1.4.3 Constraint Surface


For illustration, consider an optimization problem with only inequality constraints
gj (X) ≤ 0. The set of values of X that satisfy the equation gj (X) = 0 forms a hyper-
surface in the design space and is called a constraint surface. Note that this is an
(n − 1)-dimensional subspace, where n is the number of design variables. The constraint
surface divides the design space into two regions: one in which gj (X) < 0 and
the other in which gj (X) > 0. Thus, the points lying on the hypersurface will satisfy the
constraint gj (X) critically, whereas the points lying in the region where gj (X) > 0 are
infeasible or unacceptable, and the points lying in the region where gj (X) < 0 are fea-
sible or acceptable. The collection of all the constraint surfaces gj (X) = 0, j = 1, 2, . . . ,
m, which separates the acceptable region is called the composite constraint surface.
Figure 1.4 shows a hypothetical two-dimensional design space where the infeasi-
ble region is indicated by hatched lines. A design point that lies on one or more than
one constraint surface is called a bound point, and the associated constraint is called an
active constraint. Design points that do not lie on any constraint surface are known as
free points. Depending on whether a particular design point belongs to the acceptable
or unacceptable region, it can be identified as one of the following four types:

1. Free and acceptable point


2. Free and unacceptable point
3. Bound and acceptable point
4. Bound and unacceptable point

All four types of points are shown in Figure 1.4.
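
A compact sketch of this classification, assuming each constraint is available as a Python callable returning gj(X) and using an arbitrary tolerance to decide when a point lies on a constraint surface, might look like this:

```python
def classify(X, constraints, tol=1e-8):
    """Classify a design point for constraints of the form g_j(X) <= 0."""
    values = [g(X) for g in constraints]
    bound = any(abs(v) <= tol for v in values)      # on at least one constraint surface
    acceptable = all(v <= tol for v in values)      # no constraint violated
    return ("bound" if bound else "free") + " and " + \
           ("acceptable" if acceptable else "unacceptable")

# Two hypothetical constraints in a two-dimensional design space
g1 = lambda X: X[0] + X[1] - 2.0   # g1(X) <= 0
g2 = lambda X: -X[0]               # g2(X) <= 0

print(classify([0.5, 0.5], [g1, g2]))   # free and acceptable
print(classify([0.0, 1.0], [g1, g2]))   # bound and acceptable
print(classify([3.0, 3.0], [g1, g2]))   # free and unacceptable
print(classify([0.0, 3.0], [g1, g2]))   # bound and unacceptable
```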

1.4.4 Objective Function


The conventional design procedures aim at finding an acceptable or adequate design that merely satisfies the functional and other requirements of the problem. In general, there will be more than one acceptable design, and the purpose of optimization is to choose the best one of the many acceptable designs available. Thus, a criterion has to be chosen for comparing the different alternative acceptable designs and for selecting the best one.
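
A small sketch of this idea follows, in which the candidate designs, the claim that they are all acceptable, and the merit function used to rank them (a crude surrogate for weight) are all assumptions made only for illustration.

```python
# Hypothetical acceptable gear-pair designs, each stored as [b, T1, T2]
candidates = {
    "design A": [1.0, 20, 40],
    "design B": [0.8, 18, 36],
    "design C": [1.2, 24, 48],
}

def objective(X):
    """Illustrative criterion to be minimized (a rough surrogate for weight)."""
    b, T1, T2 = X
    return b * (T1 + T2)

best = min(candidates, key=lambda name: objective(candidates[name]))
print(best)   # the candidate with the smallest objective value, here design B
```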

Figure 1.4 Constraint surfaces in a hypothetical two-dimensional design space (axes x1 and x2; behavior constraints g1 = 0, g2 = 0, g4 = 0; side constraints g3 = 0, g5 = 0; free and bound, acceptable and unacceptable points shown in the feasible and infeasible regions).

