Neural Network Design

This book provides a clear and detailed survey of basic neural network architectures and learning rules. In it, the authors emphasize mathematical analysis of networks, methods for training networks, and application of networks to practical engineering problems in pattern recognition, signal processing, and control systems.

Features:
- Extensive coverage of performance learning, including the Widrow-Hoff rule, backpropagation, and several enhancements of backpropagation (e.g., conjugate gradient, Levenberg-Marquardt variations)
- Discussion of recurrent associative memory networks (e.g., the Hopfield network)
- Detailed examples and numerous solved problems
- Associative and competitive networks (including feature maps, learning vector quantization, and adaptive resonance theory) are explained using simple building blocks
- Neural Network Design Demonstrations on bound-in disk using MATLAB 4.0 (student and professional versions)

"I feel very strongly that this is an excellent book. I have rarely reviewed a book that is this well written. The illustrations and examples are [...]; the demonstrations really support one's intuition and add greatly to the text."
— Professor Stan Ahalt, Ohio State University

Neural Network Design / Martin T. Hagan, Howard B. Demuth, Mark Beale
China Machine Press (www.china-pub.com)
ISBN 7-111-10841-8/TP · 2583    Price: 69.00 yuan
Martin T. Hagan, Howard B. Demuth: Neural Network Design

Original copyright © 1996 by PWS Publishing Company. All rights reserved. First published by PWS Publishing Company, a division of Thomson Learning, United States of America. Reprinted for the People's Republic of China by Thomson Asia Pte Ltd and China Machine Press and CITIC Publishing House under the authorization of Thomson Learning. No part of this book may be reproduced in any form without the prior written permission of Thomson Learning and China Machine Press.

[Chinese rights and cataloging-in-publication notices; legible details: Beijing copyright contract registration no. 01-2001-5321; published by China Machine Press, Beijing; first edition, first printing, August 2002; classification TP183; format 787mm × 1092mm 1/16; ISBN 7-111-10841-8; price 69.00 yuan.]

Preface

This book gives an introduction to basic neural network architectures and learning rules. Emphasis is placed on the mathematical analysis of these networks, on methods of training them and on their application to practical engineering problems in such areas as pattern recognition, signal processing and control systems. Every effort has been made to present material in a clear and consistent manner so that it can be read and applied with ease. We have included many solved problems to illustrate each topic of discussion.

Since this is a book on the design of neural networks, our choice of topics was guided by two principles.
First, we wanted to present the most useful and practical neural network architectures, learning rules and training techniques. Second, we wanted the book to be complete in itself and to flow easily from one chapter to the next. For this reason, various introductory materials and chapters on applied mathematics are included just before they are needed for a particular subject. In summary, we have chosen some topics because of their practical importance in the application of neural networks, and other topics because of their importance in explaining how neural networks operate.

We have omitted many topics that might have been included. We have not, for instance, made this book a catalog or compendium of all known neural network architectures and learning rules, but have instead concentrated on the fundamental concepts. Second, we have not discussed neural network implementation technologies, such as VLSI, optical devices and parallel computers. Finally, we do not present the biological and psychological foundations of neural networks in any depth. These are all important topics, but we hope that we have done the reader a service by focusing on those topics that we consider to be most useful in the design of neural networks and by treating these topics in some depth.

This book has been organized for a one-semester introductory course in neural networks at the senior or first-year graduate level. (It is also suitable for short courses, self-study and reference.) The reader is expected to have some background in linear algebra, probability and differential equations.

Each chapter of the book is divided into the following sections: Objectives, Theory and Examples, Summary of Results, Solved Problems, Epilogue, Further Reading and Exercises. The Theory and Examples section comprises the main body of each chapter. It includes the development of fundamental ideas as well as worked examples (indicated by the icon shown here in the left margin).
The Summary of Results section provides a convenient listing of important equations and concepts and facilitates the use of the book as an industrial reference. About a third of each chapter is devoted to the Solved Problems section, which provides detailed examples for all key concepts.

The following figure illustrates the dependencies among the chapters.

[Figure: chapter dependency diagram. Legible chapter titles: 1 Introduction, 2 Architectures, 3 Illustrative Example, 4 Perceptron Learning Rule, 5 Signal and Weight Vector Spaces, 6 Linear Transformations for Neural Networks, 7 Supervised Hebb Learning, 8 Performance Surfaces, 9 Performance Optimization, 10 Widrow-Hoff, 11 Backpropagation, 12 Variations on Backpropagation, 13 Associative Learning, 14 Competitive Learning, 15 Grossberg, 17 Stability, 18 Hopfield, 19 Epilogue.]

Chapters 1 through 6 cover basic concepts that are required for all of the remaining chapters. Chapter 1 is an introduction to the text, with a brief historical background and some basic biology. Chapter 2 describes the basic neural network architectures. The notation that is introduced in this chapter is used throughout the book. In Chapter 3 we present a simple pattern recognition problem and show how it can be solved using three different types of neural networks. These three networks are representative of the types of networks that are presented in the remainder of the text. In addition, the pattern recognition problem presented here provides a common thread of experience throughout the book.

Much of the focus of this book will be on methods for training neural networks to perform various tasks. In Chapter 4 we introduce learning algorithms and present the first practical algorithm: the perceptron learning rule. The perceptron network has fundamental limitations, but it is important for historical reasons and is also a useful tool for introducing key concepts that will be applied to more powerful networks in later chapters.

One of the main objectives of this book is to explain how neural networks operate.
For this reason we will weave together neural network topics with important introductory material. For example, linear algebra, which is the core of the mathematics required for understanding neural networks, is reviewed in Chapters 5 and 6. The concepts discussed in these chapters will be used extensively throughout the remainder of the book.

Chapters 7 and 13-16 describe networks and learning rules that are heavily inspired by biology and psychology. They fall into two categories: associative networks and competitive networks. Chapters 7 and 13 introduce basic concepts, while Chapters 14-16 describe more advanced networks.

Chapters 8-12 develop a class of learning called performance learning, in which a network is trained to optimize its performance. Chapters 8 and 9 introduce the basic concepts of performance learning. Chapters 10-12 apply these concepts to feedforward neural networks of increasing power and complexity.

Chapters 17 and 18 discuss recurrent networks. These networks, which have feedback connections, are dynamical systems. Chapter 17 investigates the stability of these systems. Chapter 18 presents the Hopfield network, which has been one of the most influential recurrent networks.

In Chapter 19 we summarize the networks presented in this book and discuss their relationships to other networks that we do not cover. We also point the reader to other sources for further study. If you want to know "Where do I go from here?" look to Chapter 19.
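To give a taste of the kind of algorithm developed in Chapter 4, the perceptron learning rule mentioned above can be sketched in a few lines. This is an illustrative sketch, not the book's own code (the book's bound-in demonstrations use MATLAB 4.0); the function name `perceptron_train` and the AND-gate training set are our choices. The rule updates the weights by W ← W + e·p and the bias by b ← b + e, where e = t − a is the error between the target t and the hard-limit output a.

```python
import numpy as np

def perceptron_train(X, t, epochs=10):
    """Single-neuron perceptron trained with the perceptron learning rule:
    a = hardlim(W p + b), e = t - a, W <- W + e*p, b <- b + e."""
    W = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for p, target in zip(X, t):
            a = 1 if W @ p + b >= 0 else 0  # hardlim transfer function
            e = target - a                  # error drives the update
            W = W + e * p
            b = b + e
    return W, b

# Linearly separable example: the logical AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([0, 0, 0, 1])
W, b = perceptron_train(X, t)
preds = [1 if W @ p + b >= 0 else 0 for p in X]  # -> [0, 0, 0, 1]
```

Because the AND patterns are linearly separable, the perceptron convergence theorem guarantees the rule stops changing the weights after finitely many updates; the fundamental limitation noted above is that no such boundary exists for problems like XOR.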
