Artificial Neural Networks

Colin Fyfe
Department of Computing and Information Systems
The University of Paisley
Room L117, Phone 848 3305

Edition 1.1, 1996

Contents

1 Introduction
  1.1 Objectives
  1.2 Intelligence
  1.3 Artificial Neural Networks
  1.4 Biological and Silicon Neurons
  1.5 Learning in Artificial Neural Networks
  1.6 Typical Problem Areas
  1.7 A short history of ANNs
  1.8 Genetic Algorithms
  1.9 Tutorial
    1.9.1 Worked Example
    1.9.2 Exercises

2 Associative Memory
  2.1 Objectives
  Heteroassociation
  Creating the Memory Matrix
    2.6.2 Lyapunov's Method
  Conclusion
  Exercises

3 Simple Supervised Learning
  3.1 Objectives
  3.2 Introduction
  3.3 The perceptron
  3.4 The perceptron learning rule
  3.5 An example problem solvable by a perceptron
    3.5.1 And now the bad news
  3.6 The Delta Rule
    3.6.1 The learning rule: error descent
    3.6.2 Nonlinear Neurons
  Multiple Discrimination
  Exercises

4 The Multilayer Perceptron: Backprop
  4.1 Objectives
  4.2 Introduction
  4.3 The Backpropagation Algorithm
    4.3.1 The XOR problem
  4.4 Backpropagation Derivation
  4.5 Issues in Backpropagation
    4.5.1 Batch vs On-line Learning
    4.5.2 Activation Functions
    4.5.3 Initialisation of the weights
    4.5.4 Momentum and Speed of Convergence
    4.5.5 Stopping Criteria
    4.5.6 Local Minima
    4.5.7 Weight Decay and Generalisation
    4.5.8 Adaptive parameters
    4.5.9 The number of hidden neurons
  4.6 Application - A classification problem
    4.6.1 Theory
    4.6.2 An Example
  4.7 Linear Discrimination
    4.7.1 Multilayered Perceptrons
  4.8 Function Approximation
    4.8.1 A Prediction Problem
    4.8.2 Practical Issues
  4.9 Data Compression
  4.10 Exercises

5 Unsupervised Learning
  5.1 Objectives
  5.2 Unsupervised learning
  5.3 Hebbian learning
    5.3.1 The InfoMax Principle in Linsker's Model
    5.3.2 The Stability of the Hebbian Learning Rule
  5.4 Hebbian Learning and Information Theory
    5.4.1 Quantification of Information
    5.4.2 Principal Component Analysis
    5.4.3 Weight Decay in Hebbian Learning
    5.4.4 Oja's One Neuron Model
    5.4.5 Oja's Subspace Algorithm
    5.4.6 Oja's Weighted Subspace Algorithm
    5.4.7 Sanger's Generalized Hebbian Algorithm
    5.4.8 Summary of Hebbian Learning
    5.4.9 Applications
  5.5 Anti-Hebbian Learning
    5.5.1 The Novelty Filter
  5.6 Competitive learning
    Simple Competitive Learning
    Learning Vector Quantisation
    ART Models
    The Kohonen Feature Map
