Genetic Algorithms and Neural Networks
Milestone 3, 2009

William Sayers (05025397)
Supervised by: Colin W. Morris (BSc, MSc)
Backpropagation neural networks are usually trained using an iterative method derived from a mathematical analysis of the operation of the network. This technique is time-consuming and requires an understanding of the mathematical principles involved. This project will investigate the application of genetic algorithms to replace the "normal" training of the network. This involves setting up a population of candidate networks and then picking the best of these. The best of the population are then combined in some way to produce a new population of candidate networks. This procedure is continued until a satisfactory network is obtained.


Contents

1 - Introduction
2 - Research
  2.1 - Neural Networks
    2.1.1 - The Biological Neuron
    2.1.2 - The Artificial Neuron
  2.2 - The History of Neural Networks
  2.3 - Advantages and Disadvantages of Neural Networks
    2.3.1 - Advantages
    2.3.2 - Disadvantages
  2.4 - Current Applications of Neural Networks
    2.4.1 - Neural Networks in Medicine
    2.4.2 - Neural Networks in Business
    2.4.3 - Object Trajectories
    2.4.4 - Robot Control
  2.5 - Back-propagation
    2.5.1 - Back-propagation overview
    2.5.2 - Back-propagation in depth
    2.5.3 - Back-propagation library
  2.6 - The FANN neural network library (Nissen, FANN)
  2.7 - Genetic Algorithms
    2.7.1 - History of Genetic Algorithms
    2.7.2 - Advantages and Disadvantages of Genetic Algorithms
  2.8 - Training Neural Networks with Genetic Algorithms
    2.8.1 - Determining Weight Values with Genetic Algorithms
    2.8.2 - Representation of a Neural Network within a Genetic Algorithm
    2.8.3 - Using Genetic Algorithms to Determine Neural Network Structure
  2.9 - Cascade Correlation
  2.10 - C# User Interface Programming
  2.11 - Writing C++/CLI Wrappers for Unmanaged C++ code
  2.12 - Application of Research
3 - Design
  3.1 - Program Requirements
  3.2 - Design of the Class Structure
  3.3 - Linking C# code to managed dll's
  3.4 - Design of the User Interface
    3.4.1 - Main Form
    3.4.2 - New Network Wizard
    3.4.3 - Working Form
    3.4.4 - About Form
4 - Implementation
  4.1 - User Interface Implementation
    4.1.1 - Main Form
    4.1.2 - New Neural Network Wizard
    4.1.3 - Working Form
    4.1.4 - About Form
  4.2 - Passing Information between Forms
    4.2.1 - Keeping the User Interface Active whilst Processing is Occurring
  4.3 - Neural Network Implementation
  4.4 - Genetic Algorithm Implementation
  4.5 - Back-propagation and Cascade Correlation Implementation
  4.6 - Data Parser Implementation
5 - Testing Data
  5.1 - XOR
    5.1.1 - Genetic Algorithms
    5.1.2 - Back-propagation
    5.1.3 - Cascade Correlation
    5.1.4 - Graphs
  5.2 - Fishers Iris Data
    5.2.1 - Genetic Algorithms
    5.2.2 - Back-propagation
    5.2.3 - Cascade Correlation
    5.2.4 - Graphs
  5.3 - Virus Classification
    5.3.1 - Genetic Algorithms
    5.3.2 - Back-propagation
    5.3.3 - Cascade Correlation
    5.3.4 - Graphs
6 - Comparisons
  6.1 - XOR
    6.1.1 - Back-propagation and Genetic Algorithms
    6.1.2 - Cascade Correlation and Genetic Algorithms
  6.2 - Fishers Iris data
  6.3 - Virus Classification
  6.4 - Testing conclusions
7 - Evaluation
  7.1 - How many of the original objectives were achieved?
8 - Possible Improvements
  8.1 - Possible Training Algorithm Improvements
  8.2 - Possible User Interface Improvements
9 - Appendices
  9.1 - Libraries Researched
  9.2 - Datasets
    9.2.1 - XOR
    9.2.2 - Fishers Iris Data
    9.2.3 - Virus Classification
  9.3 - Source code
    9.3.1 - Model
    9.3.2 - View
    9.3.3 - Controller
  9.4 - UML Class Diagram
  9.5 - DLL Wrapper for FANN functions
  9.6 - Objectives
10 - Works Cited
11 - Bibliography
12 - Tables
  12.1 - Table of Figures
  12.2 - Table of Equations
  12.3 - Table of tables
13 - Diary

STATEMENT OF ORIGINALITY

SCHOOL OF COMPUTING
DEGREE SCHEME IN COMPUTING
LEVEL THREE PROJECT

This is to certify that, except where specific reference is made, the work described within this project is the result of the investigation carried out by myself, and that neither this project, nor any part of it, has been submitted in candidature for any other award other than this being presently studied. Any material taken from published texts or computerised sources has been fully referenced, and I fully realise the consequences of plagiarising any of these sources.

Student Name (Printed): ……………………………….
Student Signature: ……………………………….
Registered Scheme of Study: ……………………………….
Date of Signing: ……………………………….

1 - Introduction

The purpose of this report is to bring to a conclusion my final year project on neural networks and genetic algorithms, undertaken on the course "Computer Games Development". This report encompasses the design, development and final prototyping of the program developed for this project. In addition, the report will investigate, compare and contrast three common learning algorithms that are applicable to neural networks.

2 - Research

2.1 - Neural Networks

Neural networks are networks of small, specialised units that sum the products of their inputs and the weights on those inputs. Although they are simple in concept, they can produce immensely complex results.

2.1.1 - The Biological Neuron

The human brain is composed of many millions of nerve cells called "neurons". Neurons do not make decisions alone, but in combination with a network of other neurons within the brain.

Figure 1 - A human neuron

A neuron is composed of a cell body, with extensions called "dendrites", which are the neuronal inputs, and a single extension called an "axon", which is the output for that neuron. The axon and dendrites both split before terminating into "axon branches" and "synapses". Axon branches connect to synapses to form the neural networks within the brain. Axon branches can also connect to other types of cells, in order to allow the brain to control cells external to the brain. In the case of axon branches connecting to synapses that are part of a second neuron, a synaptic gap occurs between the axon branch and the synapse; in this gap between neurons, chemical reactions take place that will inhibit or excite the signal.

Concerning inputs from the dendrites, the nucleus of the neuron cell processes them and produces an output. If the inputs from the dendrites to the neuron nucleus, when summed, are past a certain threshold, the nucleus will generate an "action potential". This action potential extends down the axon to the cells at the end of the axon branches (often synapses connecting to other neuronal cells, as mentioned above). (Bourg & Seeman, 2004); (Barber, AI: Neural Network for Beginners pt. 1, 2006)

2.1.2 - The Artificial Neuron

In order to construct an artificial neuron, we need a way of representing the same model in data structures and algorithms. The usual method for approaching this is to have a number of inputs, a number of weights associated with those inputs, a summing function, an activation function and a method for representing an output – either to another artificial neuron in the next layer, or to another part of the program.

Figure 2 - An artificial neuron

The system devised with these components is as follows. A specific input accepts a data item (usually a binary digit, real number, or integer). The product of the data item and its associated weighting is then stored, before the summation of all the data items and associated weights occurs:

output = \sum_{i} x_i w_i

Equation 1 - Summation of the inputs and weights

The next stage depends on the activation function chosen, which accepts this stored value. A stepping function can be used, which will output zero (non-activated) if the summed products are below a certain value, or one (activated) if they are above it. Alternatively, a differentiable function of some kind (such as a sigmoid function), which will output values between zero and one depending on the value of the inputs, is another possibility.

Figure 3 - The sigmoid function shape

y = \frac{1}{1 + e^{-t}}

Equation 2 - A sigmoid function (Bourg & Seeman, 2004). In this example, "y" is the function output and "t" is the function input.
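The following is a minimal C# sketch of such an artificial neuron, illustrating the summation of Equation 1 and the sigmoid activation of Equation 2. The class and member names are illustrative assumptions, not the names used in the project's own source code.

    using System;

    // A minimal artificial neuron: weighted sum of inputs, then an
    // activation function. Names are illustrative only.
    class ArtificialNeuron
    {
        private readonly double[] weights;

        public ArtificialNeuron(double[] weights)
        {
            this.weights = weights;
        }

        // Equation 1: sum the products of the inputs and the weights.
        private double Sum(double[] inputs)
        {
            double total = 0.0;
            for (int i = 0; i < inputs.Length; i++)
                total += inputs[i] * weights[i];
            return total;
        }

        // Equation 2: sigmoid activation, mapping the sum into (0, 1).
        private static double Sigmoid(double t)
        {
            return 1.0 / (1.0 + Math.Exp(-t));
        }

        public double Activate(double[] inputs)
        {
            return Sigmoid(Sum(inputs));
        }
    }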

2.2 - The History of Neural Networks

The foundations of neural network research are in psychology: the work of 19th century psychologists such as Freud, William James and others contributed to the ideas that underpinned early neural network research. McCulloch and Pitts formulated the first neural network model, which featured digital neurons, but those neurons had no capability to learn (McCulloch & Pitts, 1943). Frank Rosenblatt then developed the perceptron model in 1958 (Rosenblatt, 1958); however, he was unable to come up with a reliable, mathematically accurate mechanism for allowing multilayer perceptrons to learn.

Two-layer perceptrons were the subject of several experiments to determine their usage and effectiveness. The next major advancement in neural networks was not until 1974, when Werbos (Werbos, 1974) discovered the back-propagation algorithm (independently rediscovered by Parker in 1982 (Parker, 1982)). Werbos discovered the algorithm whilst working on his doctoral thesis in statistics and called it "dynamic feedback"; Parker discovered it independently whilst performing graduate work at Stanford University in the United States of America and called it "learning logic". Back-propagation allows for the training of multilayer perceptrons and, in particular, perceptrons with "n" hidden layers of "n" nodes, where "n" is an undefined number (Blum, 1992).

2.3 - Advantages and Disadvantages of Neural Networks

2.3.1 - Advantages

Neural networks make possible or practical many things that would be extremely difficult with conventional solutions. One of the benefits of neural networks when compared with other problem-solving techniques is that they are inherently parallel and can thus run very effectively and efficiently on parallel hardware. The individual neurons in a neural network have no time dependencies on each other, and each can therefore be run on a separate processor (or as separate threads on a single processor) if desired, without causing the problems often associated with such parallelism. A good example of this is that neural networks have been implemented to run on architecture originally designed for processing three-dimensional computer graphics, taking advantage of the massively parallel architecture of this hardware and the programmable sections of the graphics pipeline (vertex and fragment shaders) to produce extremely fast neural networks (Davis, 2001). Used on this architecture, they become an extremely fast as well as flexible tool.

Another advantage is that, when using a neural network, there is no need to establish before attempting problem solving which data is relevant. Neural networks can excel at determining what data in a particular set is relevant, as irrelevant data will simply end up weighted so that it has zero, or close to zero, actual effect on the solution produced by the network. We can simply present the neural network with all the data, as opposed to following the circuitous and overly complex (by comparison) statistical approach demonstrated in "Table 1 - The statistical approach to deciding relevant data":

1. Decide on relevant data.
2. Formulate a statistical model.
3. Run the formulated model.
4. Analyze the results.
5. Build a system that incorporates what we have learned.

Table 1 - The statistical approach to deciding relevant data

A further advantage of neural networks is that noisy data is not a real problem for them to learn to interpret; in fact, noisy data is utilised during training upon occasion, to attempt to produce a more robust post-training network.

2.3.2 - Disadvantages

One disadvantage of neural networks can be that it is very difficult for a human being to analyse a trained neural network and describe why it may come up with one answer over another. This is a large disadvantage in areas such as medical diagnosis, where explaining why you have arrived at a particular diagnosis is an important part of the process.

Once a neural network's training is complete, it is hard to tell why it is coming up with a particular solution; it can also be hard to tell how well (or how badly) trained a neural network is, or even to verify the quality of the training.

A classic example is that in the 1980s, the US military wanted to use a neural network to analyse images and determine whether they contained a concealed tank. They collected one hundred images with a concealed tank, then a further one hundred images without. Fifty pictures of each set were set aside, and the neural network was trained with the remaining one hundred images, with and without tanks, asked to differentiate between the two sets of pictures. After the neural network's training was complete, they tested the network with the remaining fifty images of each kind, which the neural network had not seen before, and the neural network correctly identified most of the tanks. The US military then tested the neural network with a further set of one hundred images, only to find that on this round of testing, the network came up with apparently random answers. Eventually the original data set was re-examined: all the photographs with tanks were from a cloudy day, and all the images without, a sunny day. The network had trained itself to recognise whether the sky was cloudy or not.

The training times on neural networks can also be a disadvantage, although the length of time the problem would take to solve without employing a neural network must be taken into account when deciding how much of a disadvantage this is.

A further issue when using neural networks is that there is no formal methodology with which to choose the architecture or train the neural network. A tool used to solve heuristic problems, the neural network itself requires largely heuristic construction and preparation.

Another possible criticism is that neural networks are extremely dependent on the quality and content of the original training data set. This drawback is, however, one that is generally universal to computing (and many other areas), often referred to as "GIGO", or "Garbage in, Garbage out". Neural networks make it more apparent, however, because (as in the previous US military example) it can sometimes be difficult to decide what constitutes good data. Fortunately, since neural networks are (as has been mentioned above) good at ignoring irrelevant data, the usual best approach is to simply feed them as much data as can be obtained, covering as many relevant situations as possible. This has to be done within reason, however, or it exacerbates the problem of the time spent training the neural network. The more time spent training the neural network on quality data sets, the more accurate and in line with expectations and desires the eventual result will be.

Data sets for training neural networks generally separate into two sub-sets: "training" and "testing". The network training first takes place using the appropriate "training" set. Testing may then take place with the "testing" set to determine accuracy with unseen data.

The real strength of neural networks, in the opinion of the author, lies in their ability for generalisation. This is far more useful than it at first appears, and is similar, from a very high-level perspective, to how the human brain, and brains in general, function.
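As a small illustration of the training/testing split described above, the following C# sketch divides a data set into the two sub-sets. The proportions, seed and helper names are illustrative assumptions only.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    static class DatasetSplitter
    {
        // Shuffle the samples, then hold back a fraction (here 25%) as the
        // unseen "testing" set; the remainder becomes the "training" set.
        public static (List<double[]> Training, List<double[]> Testing)
            Split(IList<double[]> samples, double testFraction = 0.25, int seed = 42)
        {
            var rng = new Random(seed);
            var shuffled = samples.OrderBy(_ => rng.Next()).ToList();
            int testCount = (int)(shuffled.Count * testFraction);
            return (shuffled.Skip(testCount).ToList(),
                    shuffled.Take(testCount).ToList());
        }
    }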

2.4 - Current Applications of Neural Networks

2.4.1 - Neural Networks in Medicine

There is a large amount of research now covering the applications of neural networks in medicine. One of the main research areas is using neural networks to recognise diseases/conditions via scans, and in the 1980s, research on using neural networks to diagnose diseases took place. This is particularly suitable work for neural networks because you do not need perfect examples of how to recognise the diseases and conditions, merely a variety of scans covering all possible permutations of the disease or condition. However, a clear problem with this approach is that, when using a neural network, it would be difficult to tell how the network reached its conclusion.

You can also use the technology to model parts of the human biological system in order to better understand them, for example the human cardio-vascular system. The main reasons for using a neural network to model the human cardio-vascular system are that a neural network is capable of adjusting itself to be relevant to a particular individual, and also of adapting by itself to changes in that individual (Stergiou & Dimitrios, 1997).

2.4.2 - Neural Networks in Business

Neural networks, being good at analysing patterns and predicting future trends, can fit very well into most business situations. Credit scoring, mortgage screening and assessing borrowers have all incorporated neural networks as an integral part of the system. Businesses have used neural networks in the past for applications such as assisting the marketing control of airline seat allocations in the AMT (Airline Marketing Tactician): a back-propagation neural net was integrated with the airline marketing tactician, which monitored and recommended on booking for each flight, thus supplying information more or less directly linked to the airline's main income (Stergiou & Dimitrios, 1997).

2.4.3 - Object Trajectories

Predicting the trajectory of an object in real time is another area where neural networks have been utilised, the position, velocity and acceleration of the object being estimated in those implementations by several neural networks, using several of the most recent measurements of the object coordinates (Marshall & Srikanth, 1999).

2.4.4 - Robot Control

Controlling manipulators (such as robotic arms in a car manufacturing plant) is usually how robotics and neural networks connect. When controlling manipulators, a neural net must deal with four major problems (Robot Control, 2008):

• Forward Kinematics
• Inverse Kinematics
• Dynamics
• Trajectory generation

The ultimate goal, of course, is to position the "effecter" (i.e. a robotic hand or any other similar manipulator) in the appropriate position to grasp or otherwise manipulate an object or part of an object. When a robot arm is one hundred percent accurate, that is a simple task (relatively speaking), involving the following steps (Robot Control, 2008):

• Determine the target coordinates relative to the robot.
• Calculate the joint angles required for the arm to be in the appropriate position.
• Adjust the arm to the appropriate joint angles and close the effecter.

A neural network allows the robotic arms to be very flexible in their operation and to perform self-adjustments as time goes by, compensating automatically for wear and tear on themselves. This is far preferable to having to perform lengthy and expensive re-calibrations on any regular basis (Robot Control, 2008).

2.5 - Back-propagation

2.5.1 - Back-propagation overview

In essence, training by back-propagation involves the following steps:

• Present a set of training data.
• Compare the network's output to the desired output at each output neuron and calculate the error.
• Adjust the weight of the output neurons to lessen the error.
• Assign responsibility to each of the neurons in the previous level, based on the strength of the weights connecting them to the output neurons.
• Adjust the weight of the neurons to minimise the responsibility.

In order to use a back-propagation training algorithm, you must have a non-linear activation function for your artificial neuron. A commonly used function is the sigmoid function, as described in section 2.1.2 - The Artificial Neuron (the sigmoid function is in the source code at 9.3.1 - "Network.cs", lines 130-133). The advantage of using a function like this is that it allows us to differentiate how close we came to the correct result, allowing us to apply our learning algorithm; the disadvantage of a differentiable function is that it is by the use of these functions that local optima become apparent in the search space.

2.5.2 - Back-propagation in depth

In a single layer neural network (or with a single node), the perceptron rule is applicable. This rule essentially describes adjusting the weight using the difference between the expected and actual outputs:

\Delta w_i = \eta (d - y) x_i

Equation 3 – The delta rule (Matthews, Back-propagation for the Uninitiated, 2002)

In this equation, "w" and "x" are the weight and input vectors, and "η" is the learning rate of the neural network. The other two values, "d" and "y", are the desired and actual outputs, respectively. However, this rule has no positive effect in a multi-layer neural network, as the effect that the change in the weight will have on the rest of the network is a missing factor.

Provided you are using a differentiable function, such as the sigmoid function (see 2.1.2 - The Artificial Neuron), an alteration can be made to the perceptron learning rule allowing this difficulty to be overcome. The sigmoid function differentiates very neatly:

\frac{dy}{dt} = y(1 - y)

Equation 4 - Differentiated sigmoid function (Matthews, Back-propagation for the Uninitiated, 2002)

Therefore, the following alteration to the perceptron learning rule would serve our purposes (Matthews, Back-propagation for the Uninitiated, 2002):

\Delta w_i = \eta \delta x_i

Equation 5 - Altered perceptron learning rule (Matthews, Back-propagation for the Uninitiated, 2002)

Assuming the sigmoid activation function is in use, the output layer nodes' training function can be written as:

\delta_o = y_o (1 - y_o)(d_o - y_o)

Equation 6 - Altered delta calculation rule for the output layer (Matthews, Back-propagation for the Uninitiated, 2002)

Calculating the hidden layer deltas is a little more complicated, as the effect that the change will have on the following neurons in the network is an extra factor:

\delta_h = y_h (1 - y_h) \sum_{k} \delta_k w_{hk}

Equation 7 - Altered delta calculation rule for the hidden layer(s) (Matthews, Back-propagation for the Uninitiated, 2002)

As demonstrated above, calculating the delta for a hidden layer requires the deltas for the following layers. Therefore, in order to apply this learning mechanism, you start at the output of the network and work your way towards the inputs.
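As a concrete illustration of Equations 5 to 7, the following C# fragment sketches one backward pass for a small network with a single hidden layer. All names here are illustrative assumptions, not the project's actual implementation (which uses the FANN library, as described below).

    // Minimal sketch of one back-propagation pass (Equations 5-7).
    // weightsIH[i, h] connects input i to hidden node h;
    // weightsHO[h, o] connects hidden node h to output node o.
    static class BackPropagationSketch
    {
        static void BackPropagate(
            double[] inputs, double[] hidden, double[] outputs, double[] desired,
            double[,] weightsIH, double[,] weightsHO, double learningRate)
        {
            var deltaOut = new double[outputs.Length];
            var deltaHidden = new double[hidden.Length];

            // Output layer deltas (Equation 6).
            for (int o = 0; o < outputs.Length; o++)
                deltaOut[o] = outputs[o] * (1 - outputs[o]) * (desired[o] - outputs[o]);

            // Hidden layer deltas (Equation 7): each hidden node's delta
            // depends on the deltas of the layer that follows it.
            for (int h = 0; h < hidden.Length; h++)
            {
                double sum = 0.0;
                for (int o = 0; o < outputs.Length; o++)
                    sum += deltaOut[o] * weightsHO[h, o];
                deltaHidden[h] = hidden[h] * (1 - hidden[h]) * sum;
            }

            // Weight updates (Equation 5), working from the output
            // of the network back towards the inputs.
            for (int h = 0; h < hidden.Length; h++)
                for (int o = 0; o < outputs.Length; o++)
                    weightsHO[h, o] += learningRate * deltaOut[o] * hidden[h];

            for (int i = 0; i < inputs.Length; i++)
                for (int h = 0; h < hidden.Length; h++)
                    weightsIH[i, h] += learningRate * deltaHidden[h] * inputs[i];
        }
    }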

2.5.3 - Back-propagation library

The FANN library (see 9.1.3 - Libraries Researched for the other libraries investigated) implements a back-propagation solution, and this library will be used in the project to implement a feed-forward back-propagation neural network, against which to compare the genetic algorithm trained neural network.

2.6 - The FANN neural network library (Nissen, FANN)

The FANN library was decided upon for several reasons: it is fast; it is open-source (allowing me to alter it if I needed to, although as it turned out I did not); it is easy to implement, in a language that can be linked to C# fairly easily; and it also supports cascade neural networks, saving me finding a separate library.

2.7 - Genetic Algorithms

Genetic algorithms attempt to copy natural laws to a certain extent in order to apply a random search pattern to a defined search space. Most organisms evolve by means of sexual reproduction and natural selection (Darwin, 1859). Natural selection very neatly avoids one of the larger problems involved in software design: how to specify in advance all the possible permutations of the original problem and how the program should react to those permutations. By using similar techniques to natural selection, it is possible to "breed" solutions to problems.

Sexual reproduction ensures mixing and recombination of genes: during this process, chromosomes line up together and then cross over partway along their length, swapping genetic material. This mixing leads to much faster evolution than if offspring simply copied the genetic material as it is in the parents. Mutation plays a part in this as well, occasionally causing the genetic information to alter slightly in a random way. This could result in either an effective or a defective solution, but this variety helps to prevent stagnation in the gene pool.
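To make the analogy concrete, the following C# sketch shows the skeleton of a simple generational genetic algorithm, with selection, crossover and mutation. It is a minimal illustration under assumed names and parameters, not the project's actual implementation.

    using System;
    using System.Linq;

    // Skeleton of a generational genetic algorithm over real-valued genomes.
    // The fitness function, rates and sizes are illustrative assumptions.
    class SimpleGeneticAlgorithm
    {
        static readonly Random Rng = new Random();

        static double[] Evolve(Func<double[], double> fitness,
                               int genomeLength, int populationSize, int generations)
        {
            // Random initial population.
            var population = Enumerable.Range(0, populationSize)
                .Select(_ => RandomGenome(genomeLength)).ToArray();

            for (int g = 0; g < generations; g++)
            {
                // Selection: rank by fitness, keep the better half as parents.
                var parents = population.OrderByDescending(fitness)
                                        .Take(populationSize / 2).ToArray();

                // Crossover and mutation produce the next generation.
                population = Enumerable.Range(0, populationSize).Select(_ =>
                    Mutate(Crossover(parents[Rng.Next(parents.Length)],
                                     parents[Rng.Next(parents.Length)]))).ToArray();
            }
            return population.OrderByDescending(fitness).First();
        }

        static double[] RandomGenome(int length) =>
            Enumerable.Range(0, length).Select(_ => Rng.NextDouble() * 2 - 1).ToArray();

        // Single-point crossover: swap genetic material partway along the genome.
        static double[] Crossover(double[] a, double[] b)
        {
            int point = Rng.Next(a.Length);
            return a.Take(point).Concat(b.Skip(point)).ToArray();
        }

        // Mutation: occasionally nudge a gene by a small random amount.
        static double[] Mutate(double[] genome, double rate = 0.05) =>
            genome.Select(gene =>
                Rng.NextDouble() < rate ? gene + (Rng.NextDouble() - 0.5) : gene).ToArray();
    }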

2.7.1 - History of Genetic Algorithms

Genetic algorithms first appeared on computers in the late 1950s and early 1960s, in simulations mostly designed to help evolutionary biologists model aspects of natural evolution; at first, this application was largely theoretical. By 1962, many researchers had independently developed evolution-inspired algorithms for function optimisation and machine learning, but did not get much follow-up to their work. In 1965, Ingo Rechenberg, then of the Technical University of Berlin, introduced a technique he named "evolution strategy", although this technique involved no crossover.

In 1975, the book "Adaptation in Natural and Artificial Systems" was published by the author John Holland, built on papers both by himself and by other researchers. This book presents the concepts of adaptive digital systems using mutation, selection and crossover, simulating the processes of biological evolution. Also in 1975, a dissertation by Kenneth De Jong established the potential of genetic algorithms by showing that they perform well on a wide variety of data, including noisy and discontinuous data. By the early 1980s, genetic algorithms were becoming more widely used, across a range of subjects, and soon moved into commercial territory; nowadays they help solve many problems in many different areas (Marczyk, 2004).

2.7.2 - Advantages and Disadvantages of Genetic Algorithms

2.7.2.1 - Advantages

The primary advantage of genetic algorithms is their inherent parallelism. Many other algorithms are largely serial and can only explore a search-tree so far in any one direction before abandoning all progress and starting again from the beginning. Genetic algorithms effectively explore many different branches of the tree at once; when a certain branch turns out to be non-optimal, they abandon that search and proceed with other, more likely candidates. Thanks to this parallelism, genetic algorithms are very well suited as a means of exploring search spaces too large to be effectively searched in a reasonable amount of time by other, more conventional, methods.

The real advantage of parallelism, however, is that by evaluating the relatively small search-space that it does, a genetic algorithm is implicitly evaluating a much larger group of individuals, in the same way that the average response of a relatively small percentage of the population of any given country can be used to fairly accurately predict national trends. This is the "Schema Theorem", and it allows a genetic algorithm to move towards the search-space with the most promising individuals in a timely fashion, and then select the best member of that group (Marczyk, 2004). This leads to the exploration of a large proportion of the search space, with processing power exponentially dedicated to better areas as the exploration progresses.

A second advantage concerns "non-linear" problems. In a linear problem, each component's fitness is individual, and any improvement to an individual component's fitness is necessarily an improvement to the whole. Many real life problems are not like this: they tend towards non-linearity, where altering one component's fitness positively may make other components less fit and cause a ripple effect across the whole system. This non-linearity results in a huge increase in the search space, making genetic algorithms an effective way to search such spaces, due to their strength in navigating large search spaces (Marczyk, 2004).

One of the largest strengths of genetic algorithms can at first glance appear to be their largest weakness. Genetic algorithms have no prior knowledge about the problem they are trying to solve: they produce random changes to their candidates and then use the objective function to determine whether those changes are positive or negative overall. This allows them to discover solutions that other algorithms may have over-looked. Without crossover, selection and mutation, a genetic algorithm is metaphorically similar to a large number of parallel depth-first searches, each searching its own space for the best solution. Selection allows the pruning of the least promising searches, crossover allows promising solutions to share their success, and mutation allows random changes in the local search space of a given solution.

A third advantage of genetic algorithms is that they do not tend to be easily trapped by local optima, due to the random nature of the various starting points of the initial population and the other methods they employ, such as crossover and mutation. In infinite or very large search spaces, it is hard to know whether we have reached the global optimum, or merely a very good local optimum; due again to the parallelism of their approach, genetic algorithms tend to give good results compared to other search strategies. A good example of this is the concept of negative feedback, rediscovered by genetic algorithms, but denied a patent for several years because it ran counter to established beliefs (Marczyk, 2004).

2.7.2.2 - Disadvantages

One of the disadvantages of a genetic algorithm is the necessity of representing the problem the genetic algorithm is trying to solve in a form that the genetic algorithm can use. Generally, genetic algorithms use strings of binary, integer, or real-valued numbers, with each number representing some distinct part of the solution (Marczyk, 2004). Another method is to use genetic programming, where the actual code of the program forms the genomes in the genetic algorithm, and it can swap portions in or out, or adjust them.

A second disadvantage of genetic algorithms is that the fitness function is crucial to the development of a suitable solution for the problem. If you have a poorly written fitness function, the genetic algorithm may end up solving an entirely different problem from the originally intended one (Marczyk, 2004). A further problem is that fitness functions can be deceptive: if the solutions to various problems are in areas of the search space that the fitness function would not consider likely, then genetic algorithms (and most other search techniques) are no better than a random search for finding the solution. For example, in the diagram below, the global optimum is at the far right; however, that is only one point, and there is no slope leading to it, therefore no gradual increase can lead you to it.

A third issue is setting the correct mutation rate, population size, and so on. If the mutation rate is too high, the system will never converge towards a suitable solution; however, if it is too low, a comparatively smaller segment of the search space will be covered, and the eventual solution may take longer to reach, or never be reached as a result. Likewise, if the size of the population is too low, the investigation may cover too little of the search space to find the optimum solution (Marczyk, 2004).

With such a fitness landscape, you either have the solution or you don't: unless by random chance it discovered the single optimal point, a properly constructed genetic algorithm would in this example be likely to settle on the marginally less optimal, but altogether easier to find, local optimum in the centre of the diagram.

Figure 4 - Local optimum (units are undefined in this case)

A further problem is premature convergence: this occurs when a mutation early in the process produces a large leap in the fitness of that particular genome, which will then reproduce abundantly, lowering the diversity of the population and resulting in the genetic algorithm possibly falling into the local optimum that the mutation represents. There are various methods for solving this, including sigma scaling and Boltzmann selection (Mitchell, 1996).

Finally, where analytical solutions exist, they should take precedence over genetic algorithms. This is because analytical solutions usually produce more accurate results faster than a genetic algorithm (or any heuristic method) is capable of (Marczyk, 2004).

2.8 - Training Neural Networks with Genetic Algorithms

Training neural networks is a form of problem solving, involving finding the optimum values for a set of real numbers (connection weights) which will produce the least error in the results from the neural network. The error surfaces associated with these problems tend to be highly variable and contain many local optima.

2.8.1 - Determining Weight Values with Genetic Algorithms

Genetic algorithms (as described in 2.7 - Genetic Algorithms) are good at avoiding being stuck on local optima, due to their searching several regions of search space simultaneously. However, the method of choice for training neural networks is usually back-propagation (see 2.5 - Back-propagation), which is a gradient descent algorithm and as such can be easily trapped in local optima (Branke, 1995). There are methods to try to avoid this problem when using back-propagation, such as adding a "momentum" to the algorithm to try to help it move beyond local optima; however, these are not a complete solution (Blum, 1992).

In addition, the only information genetic algorithms require to train a neural network is an objective function (as described in 2.7 - Genetic Algorithms) with which to test the appropriateness of a given genome. Genetic algorithms also have the benefit that they do not place any restrictions whatsoever on the architecture of the neural network, since you are merely increasing the size of the genomes being worked with.

2.8.2 - Representation of a Neural Network within a Genetic Algorithm

As a rule, a neural network representation within a genetic algorithm is a concatenation of all the weights in the neural network, not altering them besides that in any way.
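As an illustration of this representation, the sketch below flattens a small feed-forward network's weights into a single real-valued genome and back. The layout chosen (layer by layer, with each node's incoming weights kept adjacent) is an assumption for illustration.

    using System.Collections.Generic;

    // Illustrative encoding of a feed-forward network's weights as one genome.
    // weights[layer][node][input] holds each node's incoming weights, so a
    // node's weights stay adjacent in the genome (helpful under crossover).
    static class WeightGenome
    {
        public static double[] Encode(double[][][] weights)
        {
            var genome = new List<double>();
            foreach (var layer in weights)
                foreach (var node in layer)
                    genome.AddRange(node);      // concatenate all weights
            return genome.ToArray();
        }

        public static void Decode(double[] genome, double[][][] weights)
        {
            int i = 0;
            foreach (var layer in weights)
                foreach (var node in layer)
                    for (int w = 0; w < node.Length; w++)
                        node[w] = genome[i++];  // write weights back in order
        }
    }

A fitness function for the genetic algorithm sketched earlier could then decode a genome into a network of this kind, run the training data through it, and return, for example, the negative of the mean squared error.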

One of the most important decisions when deciding how to represent a neural network within a genome is whether to use binary strings or real-number strings. Standard genetic algorithms use binary strings; however, representing a large number of real-valued numbers as a binary string can lead to very large strings (thousands of bits) (Branke, 1995). Crossover can disrupt genes far apart, but is less likely to do so for genes close together. Because of this, it is helpful to keep functional units close together (i.e. place a neuron's input and output weights, a node's input weights, and the nodes of a layer side by side).

2.8.3 - Using Genetic Algorithms to Determine Neural Network Structure

As well as using a genetic algorithm to evolve the weight set for a fixed-structure neural network, it is also possible to use a genetic algorithm to evolve a structure for a neural network. The problem at hand dictates the input and output layer structures. The average neural network has an input layer, one or two hidden layers of an indeterminate number of nodes, and one output layer. These are all connected, outputs to inputs, to form a feed-forward network.

Figure 5 - A feed-forward neural network

This architecture is then altered (number of nodes in each hidden layer, number of hidden layers) by trial and error and intuition, to establish a neural network that performs well for the problem at hand. However, this is not necessarily the most efficient form of network for any given problem, and the trial and error method is an inefficient way of determining the correct architecture for an optimally performing neural network for the problem at hand.

If the neural network architecture is too simplistic for the problem at hand, the neural network will never learn to solve the problem (or come close enough to suggest a solution). On the other hand, if the neural network architecture is too complex, the neural network will learn fast but will generate too specific a solution and will not generalise between similar inputs. As mentioned above, a neural network's generalisation is its main strength, and therefore you do not wish to lose it.

Although network architecture plays such a large role in network performance, there is currently no method by which to establish, given a specific problem, a neural network topology to deal with that problem. There is also no method to check how optimal your solution is (Branke, 1995).

2.8.3.1 - Representing neural network structure

Representing the structure of a neural network within a genetic algorithm is not as straightforward as representing the weight values of a neural network within a genetic algorithm. There are two methods for representing neural network topologies within a genetic algorithm: the strong, direct, or low-level encoding, and the weak, indirect, or high-level encoding (Branke, 1995). Ideally, a representation is required which can accommodate all networks that will work for the problem but none that will not, thus eliminating the searching of meaningless space and, by definition, being able to contain the most optimum solution for the problem at hand.

Low-level encodings specify each connection individually, whereas high-level encodings group together connections and/or nodes. In direct encoding, the genetic algorithm represents the neural network by means of its connections, thus meaning that in order to remove a node completely, you must remove all connections to or from that node. In cases where a genetic algorithm determines the number of hidden layers and the number of nodes per hidden layer, and the network itself is fully interconnected, an exception is possible.

In one version of high-level encoding of a neural network, the network is divided into areas, and for each area the number of nodes and the density of connections to other nodes is determined (Branke, 1995). The back-propagation learning rate is also a parameter for one of these areas.
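As a small illustration of the difference, the following sketch contrasts the two encodings as they might appear in a genetic algorithm's genome. Both type layouts are assumptions for illustration, not the project's own representation.

    // Illustrative contrast of the two topology encodings (assumed layouts).

    // Direct (low-level) encoding: every possible connection is listed
    // individually, here as an adjacency matrix of weights (0 = absent).
    class DirectEncoding
    {
        public double[,] Connections;       // [fromNode, toNode]
    }

    // Indirect (high-level) encoding: only summary parameters are evolved,
    // and the concrete network is grown from them when a genome is evaluated.
    class IndirectEncoding
    {
        public int HiddenLayers;            // e.g. one or two
        public int[] NodesPerLayer;         // size of each hidden layer
        public double ConnectionDensity;    // fraction of connections present
        public double LearningRate;         // back-propagation parameter
    }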

One of the reasons why alternatives to low-level encoding arose is the exponential rise in the number of connections as the size of the neural network increases. These mapping methods are therefore more suitable for small networks, or networks with a small number of connections. A problem with high-level encoding is that regular networks are favoured, but not every network is equally probable. This can lead to human bias entering the system and affecting the outcome. This could be useful, by limiting the potential size of the networks evolved and preventing the generation of overly large and complex networks; but on the other hand, it is impossible to know whether these large and complex networks may be the most suitable way to solve the problem at hand.

As mentioned previously, neural networks that are too large will solve a problem fast but will not generalise well, and therefore their usefulness is limited severely; neural networks that are too small, on the other hand, will not ever succeed in learning the problem. Therefore, a balance between speed of learning and generalisation exhibited in the trained network is necessary. This section is largely drawn from information in "Curved trajectory prediction using a self-organizing neural network" (Branke, 1995).

2.9 - Cascade Correlation

The cascade learning architecture is a method by which the training algorithm builds the neural network as it proceeds. The algorithm attempts to solve the issues associated with back-propagation that produce slow learning. These are the step-size problem and the moving target problem (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

The system begins with a basic neural network of inputs and outputs (the numbers of each dictated by the task, as usual) and no hidden layer. There is also a bias input, with a value of one. The connection between each input and output is via an adjustable weight, and the output units may use either a linear or a differentiable activation function. The algorithm trains the adjustable weights using the perceptron rule (as described in 2.5.2 - Back-propagation in depth) or sometimes the quickprop rule (Fahlman, An Empirical Study of Learning Speed in Back-Propagation Networks, 1988).

After a number of training cycles pass with no substantial improvement (the precise number being determined by a user-supplied "patience" parameter), either an extra node is added, or, if the network's output is of sufficient quality, the training is ended. If a trigger occurs for the addition of a unit, a number of "candidate" units are initiated and trained using either the perceptron rule or the quickprop rule, with the aim of getting as close to a solution as possible. The "candidate" nodes take the neural net inputs, plus the outputs of any previously created units, and attempt to maximise "s", the sum over all output units "o" of the magnitude of the correlation between "v", the candidate unit's value, and "Eo", the residual output error at "o" (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

s = \sum_{o} \left| \sum_{p} (v_p - \bar{v})(E_{p,o} - \bar{E}_o) \right|

Equation 8 - The definition of "s", given that "o" is the network output at which the error is measured, "p" is the training pattern, and "v" is the candidate unit's value; the quantities \bar{v} and \bar{E}_o are the values of v and Eo averaged over all patterns (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

In order to maximise "s", the partial derivative of "s" with respect to each of the candidate unit's incoming weights "wi" is a necessity.

∂s/∂wi = ∂/∂wi [ Σo | Σp (vp − v̄)(Ep,o − Ēo) | ]

Equation 9 - The partial derivative of 's' (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

Expansion and differentiation of this calculation can then take place, giving:

∂s/∂wi = Σp,o σo (Ep,o − Ēo) f′p Ii,p

Equation 10 - The partial derivative of 's' with respect to each of the candidate's incoming weights 'wi', where 'σo' is the sign of the correlation between the candidate's value and output 'o', 'f′p' is the derivative for pattern 'p' of the candidate unit's activation function with respect to the sum of its inputs, and 'Ii,p' is the input the candidate receives from unit 'i' for pattern 'p' (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

After these computations are complete, a gradient ascent to maximise "s" can take place. Once the correlation on a candidate has reached an acceptable level, or the completed training epochs reach a maximum level, the best of the candidates is connected to the neural network as it stands and the inputs for that node are frozen. Adjustable weights link the output of this node to all the output nodes' inputs. The training iterations for the adjustable weights in the main network then continue.
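As an illustration of the candidate-training step, the following is a minimal C# sketch of computing 's' for one candidate unit over a batch of patterns. It is illustrative only (the project itself uses the FANN library's cascade implementation), and all names are assumptions:

using System;
using System.Linq;

static class CandidateCorrelation
{
    // s = sum over outputs o of |sum over patterns p of (v_p - vMean)(E_po - EoMean)|
    public static double Correlation(double[] v, double[][] residualError)
    {
        int patterns = v.Length;
        int outputs = residualError[0].Length;
        double vMean = v.Average();

        double s = 0.0;
        for (int o = 0; o < outputs; o++)
        {
            // average residual error at this output over all patterns
            double eMean = 0.0;
            for (int p = 0; p < patterns; p++)
                eMean += residualError[p][o];
            eMean /= patterns;

            // correlation between candidate value and residual error
            double corr = 0.0;
            for (int p = 0; p < patterns; p++)
                corr += (v[p] - vMean) * (residualError[p][o] - eMean);

            s += Math.Abs(corr);
        }
        return s;
    }
}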

Figure 6 - The initial state of a cascade correlation neural network. The circular connections are adjustable weights (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

Figure 7 - The second state of a cascade correlation neural network. A first node has been added. The square connections are locked weights (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

Figure 8 - The third state of a cascade correlation neural network, with two nodes added (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).


2.10 - C# User Interface Programming
C# user interfaces are built with the WinForms API, which allows you to "draw" objects on a form in a visual manner to design your layout, and then access and alter the properties and sub-properties of those objects in your code. A separate thread exists for each window in WinForms, in order to facilitate performance on modern multi-core machines, although this makes it more difficult to move data between forms; the method I will use for this is "delegate functions" (similar to "function pointers" in C++) and "events" (Lysle, 2007). Another challenge is making the neural network training independent of the user interface, in order to avoid the interface freezing whilst a neural network is training. My planned solution is the BackgroundWorker object provided with C#, which allows you to assign a delegate function to be your background task and triggers events upon completion, for which event handlers are executed (Microsoft, BackgroundWorker Class, 2009). Making a C# user interface interact with C++/CLI DLLs simply involves adding a reference to the DLL in your project, via the Visual Studio® user interface.
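As a concrete illustration of this pattern, here is a minimal sketch of a BackgroundWorker wired to delegate-based handlers (the class and method names are illustrative, not the project's actual code):

using System.ComponentModel;

class TrainerHost
{
    private readonly BackgroundWorker bw = new BackgroundWorker();

    public TrainerHost()
    {
        // assign the delegate functions for the background task and completion
        bw.DoWork += new DoWorkEventHandler(bw_DoWork);
        bw.RunWorkerCompleted += new RunWorkerCompletedEventHandler(bw_Completed);
    }

    public void StartTraining()
    {
        if (!bw.IsBusy) bw.RunWorkerAsync();
    }

    private void bw_DoWork(object sender, DoWorkEventArgs e)
    {
        // long-running work (e.g. network training) happens off the UI thread
        e.Result = true;
    }

    private void bw_Completed(object sender, RunWorkerCompletedEventArgs e)
    {
        // raised back on the UI thread: safe to update controls here
    }
}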

2.11 - Writing C++/CLI Wrappers for Unmanaged C++ code
The cleanest method of connecting a native C++ or C DLL (or piece of code) to a managed C# interface is via a C++/CLI bridge class. The VC++ compiler can switch between managed and unmanaged code on the fly whilst compiling, and allows you to have both managed and unmanaged code (in C++) as part of the same project. Thus, a managed class can communicate with the unmanaged class (performing appropriate marshalling of the non-primitive data types (Microsoft, Using C++ Interop (Implicit PInvoke), 2009)) and the C# code can then access the managed class directly. Although it is possible for a C# program to interact with an unmanaged DLL directly, the bridge method is less error-prone and more likely to be robust and stable, as well as making for cleaner code.

2.12 - Application of Research
The research presented here will help to produce the deliverables for milestones two and three of my final year project. I plan to combine C# for the user interface and C++ for the neural network back-ends in order to create an efficient, powerful and fast solution. Section 2.11 - Writing C++/CLI Wrappers for Unmanaged C++ code covers the combination of C# and C++, so it is not repeated here; I plan to use the DLL method, with one DLL combining managed and unmanaged C++, linked into my C# user interface code. The user interface will have methods to adjust parameters such as the genetic algorithm's population size, number of generations, and so on, in order to test what kind of genetic algorithm learns and develops fastest. It will also allow the user to specify the number of back-propagation iterations. The application will be used to evaluate the difference in performance between a genetic algorithm, a back-propagation algorithm and a cascade correlation algorithm as neural network training algorithms.

3 - Design
The purpose of this application will be to allow the user to run a back-propagation trained neural network, or a genetic algorithm trained neural network and present the results of the network in such a fashion as to allow analysis and comparison of the training methods and their suitability in different situations.



3.1 - Program Requirements
The program must be able to:
• Run a genetic algorithm on a feed-forward, fully interconnected neural network to train the network.
• Run a back-propagation algorithm on a similar network to train the network.
• Run a cascade algorithm on a similar network to train the network.
• Allow the user to set appropriate variables (see 3.4.2 - New Network Wizard) to adjust the execution of the learning algorithm selected for the neural network.
• Allow the user to select which learning algorithm they wish to use.
• Display the results of running that algorithm on a neural network in a meaningful fashion.

3.2 - Design of the Class Structure
Before arriving at a simple and effective design, I tried and discarded two earlier designs; it is the third design that is documented here. The two prior designs represented neural networks via objects and compositions of objects held in vectors, which upon experimentation led to flexible but slow and overly complex code. The final solution is a far simpler one in terms of the neural network representation, although it is far more detailed in the other areas than the initial designs were. It loosely follows a model-view-controller architecture, and enables easy replacement of sections of code as long as the interfaces and outputs remain the same. For more information on the class structure, see section 9.2 - UML Class Diagram.
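That replaceability rests on the training algorithms sharing one contract, so the view never depends on a particular implementation. A minimal sketch of the idea (names simplified; the real hierarchy is documented in section 9.2):

// Any training algorithm the forms can drive exposes the same surface;
// swapping implementations then requires no changes to the view code.
public interface ITrainingAlgorithm
{
    bool Train();            // run the learning algorithm to completion
    bool Test();             // evaluate the trained network on the test set
    bool Trained { get; }    // state flag the UI can poll
    string ReportData { get; }
}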


3.3 - Linking C# code to managed dll’s
Linking C# code and managed DLLs is a simple process (in Visual Studio), as explained in 2.11 - Writing C++/CLI Wrappers for Unmanaged C++ code: select the references section of the C# project in question and add the DLL you wish to refer to. You can then simply add a "using" directive for the DLL's namespace and create, destroy, and use the classes within it as if they were native C# classes. This can be seen in 9.1.2.6 - "Algorithms\BackProp.cs" and 9.1.2.4 - "Algorithms\CascadeCorrelation.cs", with the "using FANN_Wrapper;" directive. This is superior to the direct C#-to-unmanaged-code linking that was used in milestone two, since it makes for cleaner, and thus more easily maintainable, code, and the code is more robust in this fashion. The robustness comes from the fact that C++/CLI is far better at interacting with unmanaged C++ code than C# is; indeed, they can even be in the same source file and compiled into one object. Therefore, by using a C++/CLI wrapper to interface to the unmanaged C++, and then interacting only with that wrapper, errors are less likely to arise.
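In outline, the consuming side looks like the following sketch. Only the "using FANN_Wrapper;" directive is taken from the actual code; the wrapped type name and its members are assumptions for illustration:

using FANN_Wrapper;   // namespace exported by the referenced C++/CLI DLL

class TrainingRunner
{
    public void Run()
    {
        // Hypothetical wrapped class: constructed, used and released
        // exactly as if it were a native C# type.
        var trainer = new FANNBackPropWrapper();
        trainer.Train();
    }
}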

3.4 - Design of the User Interface
The construction of the user interface utilised the WinForms-based tools in Visual Studio 2008 (see 2.10 - C# User Interface Programming). I also used the ZedGraph control (ZedGraph) to display the graph of my results.

3.4.1 - Main Form
The main form is the crux of the application (as is default in C# WinForms programming): if the main form is closed, all other forms and classes are disposed and the application terminates.


3.4.1.1 - Network and Output tab

This tab needs a method by which to initialise the adding of a new network (3.4.2 - New Network Wizard), a method to train the current network, and a method to test that training. Three buttons will therefore be present, along with two displays: one display will show network details and one will show the training and testing output.

Figure 9 - Network and output tab design (Main form)

3.4.1.2 - Mean Squared Error Graph tab

The mean squared error graph tab is a simple tab, which needs only to display the graph of the mean squared error over the current epoch/generation.

Figure 10 - Mean squared error tab design (Main form)

3.4.1.3 - Dataset tab

This tab needs two display sections: one to show the training data and one to show the testing data.

Figure 11 - Dataset tab design (Main form)

3.4.2 - New Network Wizard

The new network wizard is the form that appears when the user selects the creation of a new network. It allows the user to move in a logical progression through the set-up process of a new neural network and its associated training algorithm. The user can progress through the steps via the "previous" and "next" buttons, or via the tabs at the top, in whatever order they choose. Verification of all data takes place before the form passes control back to the main form and triggers a main form event whose parameter is a specially constructed class that transfers the data from the wizard.

3.4.2.1 - Learning Algorithm tab

On the learning algorithm tab the user selects the learning algorithm for the neural network being created, out of the three choices previously mentioned (2.5 - Back-propagation, 2.7 - Genetic Algorithms, 2.9 - Cascade Correlation); if no algorithm has yet been selected, the default is the genetic algorithm.

Figure 12 - Design of the training algorithm tab (Wizard form)

3.4.2.2 - Network Settings tab

This tab allows the user to set appropriate settings for the network and the training algorithm selected in 3.4.2.1 - Learning Algorithm tab.

Figure 13 - Design of the network settings tab (Wizard form)

3.4.2.3 - Dataset tab

The dataset tab allows the user either to select from two pre-set-up datasets, or to choose custom data sets from the hard disk for training and testing data (appropriately formatted in the FANN style).

Figure 14 - Dataset selection tab design (Wizard form)

3.4.3 - Working Form

The working form is merely a small display shown as a dialog (locking out access to the displaying form) whilst a neural network is training, both to protect the main form from data inputs that may cause training problems and to provide a visual indicator of work taking place.

Figure 15 - Working form design (Working form)

3.4.4 - About Form

The about form displays information about the application (author, project supervisor, project description), displays licensing information, and displays information on the pre-selected datasets. Clicking the datasets button or the licenses button displays the appropriate information; clicking the same button again hides that information.

Figure 16 - About form design (About form)

3.4.4.1 - Datasets display

Clicking the datasets button displays a tabbed interface: one tab with XOR information and one tab with Fishers Iris Data information.

Figure 17 - Dataset display design (About form)

3.4.4.2 - Licenses display

Clicking the licenses button displays a small information panel with selection methods; depending on the selection, it will display the appropriate license.

Figure 18 - Licenses display design (About form)

4 - Implementation

The implementation is largely in C#, with small portions in C++/CLI, and the back-end FANN (Nissen, FANN) library written in C and linked as a static library into the managed FANN_Wrapper dynamic link library.

4.1 - User Interface Implementation

The implementation of the user interface is mainly in C#, mostly via the visual design tools found in Visual Studio 2008.

4.1.1 - Main Form

The main form's code is in section 9.1.1.2 - "FrmMain.cs".

4.1.1.1 - Network and Output tab

Figure 19 - Network and Output tab

4.1.1.2 - Mean Squared Error Graph tab

Figure 20 - Mean squared error graph tab

4.1.1.3 - Dataset tab

Figure 21 - The dataset tab


4.1.2 - New Neural Network Wizard
4.1.2.1 - Learning Algorithm tab

Figure 22 - Learning Algorithm tab



4.1.2.2 - Network Settings tab

Figure 23 - Network Settings tab

4.1.2.3 - Dataset tab

Figure 24 - Dataset tab



4.1.3 - About Form

Figure 25 - About form, main view



4.1.3.1 - Datasets View

Figure 26 - About form, Datasets view


4.1.3.2 - Licenses View

Figure 27 - About form, Licenses view

4.1.4 - Working Form

Figure 28 - Working form

4.1.5 - Passing Information between Forms

Passing the data between the wizard form and the main form takes place as previously described (2.10 - C# User Interface Programming), via events and delegate functions.

An event is a method of notifying a listening object that something has occurred. Events, when instantiated, are associated with a delegate function that responds to the event (Lysle, 2007). The declaration and association of the delegate and the event take place in the form that needs to generate the event (9.1.1.3 - "Wiz1.cs", lines 29-31), and the "NetworkUpdatedEventArgs" class is a storage class for passing data (see 9.1.1.1 - "NetworkUpdatedEventHandler"). The declaration of a new event handler in the main form takes place prior to displaying the new neural network wizard form (see 9.1.1.2 - "FrmMain.cs", lines 120-129). The trigger of a new event takes place in the wizard form when the user is on the last tab and clicks "next" (see 9.1.1.3 - "Wiz1.cs", lines 145-163). In the event handler in the main form, the processing of the class sent from the wizard form takes place (see 9.1.1.2 - "FrmMain.cs", lines 70-119).

4.1.6 - Keeping the User Interface Active whilst Processing is Occurring

The user interface delegates the processing to a background worker class. Firstly, the background worker class is instantiated and supplied with references to the event handler functions (see 9.1.1.2 - "FrmMain.cs", lines 29-44); the two event-handling functions are at 9.1.1.2 - "FrmMain.cs", lines 306-353. The RunWorkerAsync function is then executed and the working form is shown (9.1.1.2 - "FrmMain.cs", lines 266-281). This leads to a user interface that is much more accessible (Microsoft, BackgroundWorker Class, 2009).
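Distilled from the appendix listings, the event flow between the two forms follows this shape (a simplified sketch; see 9.1.1.2 and 9.1.1.3 for the full code):

using System;
using System.Windows.Forms;

// Storage class for passing the wizard's verified settings (simplified).
public class NetworkUpdateEventArgs : EventArgs { /* settings fields */ }

public partial class Wiz1 : Form
{
    // Declared in the form that generates the event:
    public delegate void NetworkUpdateHandler(object sender, NetworkUpdateEventArgs e);
    public event NetworkUpdateHandler NetworkUpdated;

    private void btnNext_Click(object sender, EventArgs e)
    {
        // On the last tab: package the verified settings and raise the event.
        NetworkUpdated(this, new NetworkUpdateEventArgs());
        this.Dispose();
    }
}

// In the main form, before showing the wizard:
//     Wiz1 wizard = new Wiz1();
//     wizard.NetworkUpdated += new Wiz1.NetworkUpdateHandler(updatenetwork);
//     wizard.ShowDialog();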

4.2 - Genetic Algorithm Implementation

Although the original design was to implement the genetic algorithm in C++, after having used C# extensively in the UI I felt more confident about the use of the language itself and about the likely performance loss. The C++ implementation of genetic algorithms that I wrote functioned, but had issues with memory leaks and was slow, so I decided to try re-implementing it in C#. Once this implementation was complete, it was both faster than the C++ implementation (due mostly to my use of the built-in .NET sort methods in the sorting part of the genetic algorithm) and did not suffer from the memory leaks (being managed code), so that is the version that exists in the application today (see 9.1.2.5 - "Algorithms\GeneticAlgorithm.cs" for the C# implementation).

The main points to note in the code as it stands are:
• The crossover (single-point) and mutation (random increment or decrement) implementations (see 9.1.2.5 - "Algorithms\GeneticAlgorithm.cs", lines 176-204); a sketch of these operators follows below.
• The neural network specific functionality added to the network model in the network specialisation class GANetwork (see 9.1.3.2 - "GANetwork.cs", lines 32-44).
• The sorting implementation in GANetwork.cs that links into the built-in .NET functionality (see 9.1.3.2 - "GANetwork.cs", lines 51-126).

4.3 - Back-propagation and Cascade Correlation Implementation

The FANN library (Nissen, FANN), a C-based neural network library, implements both cascade correlation and back-propagation. The FANN library is compiled statically, and the DLL is linked to this to allow access to the functionality.
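To make the two operators concrete, here is a minimal C# sketch of single-point crossover and random increment/decrement mutation over an array of weights. It is illustrative only (see 9.1.2.5 for the real implementation); the parameter names and rates are assumptions:

using System;

static class GeneticOperators
{
    static readonly Random rng = new Random();

    // Single-point crossover: the children swap tails at a random cut point.
    public static void Crossover(double[] parentA, double[] parentB,
                                 double[] childA, double[] childB)
    {
        int cut = rng.Next(1, parentA.Length);
        for (int i = 0; i < parentA.Length; i++)
        {
            childA[i] = (i < cut) ? parentA[i] : parentB[i];
            childB[i] = (i < cut) ? parentB[i] : parentA[i];
        }
    }

    // Mutation: each weight has a small chance of a random increment or decrement.
    public static void Mutate(double[] genome, double rate, double step)
    {
        for (int i = 0; i < genome.Length; i++)
        {
            if (rng.NextDouble() < rate)
            {
                genome[i] += (rng.Next(2) == 0) ? step : -step;
            }
        }
    }
}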

There is an unmanaged C++ wrapper (9.1.4.3 - "UnManaged_FANN_BackProp.h", 9.1.4.4 - "UnManaged_FANN_BackProp.cpp", 9.1.4.5 - "UnManaged_FANN_CasC.h", 9.1.4.6 - "UnManaged_FANN_CasC.cpp") for the C-based FANN functions in the DLL, and a C++/CLI wrapper (9.1.4.1 - "FANN_Wrapper.h", 9.1.4.2 - "FANN_Wrapper.cpp") for the unmanaged C++ classes. C# accesses the C++/CLI class directly, which allows it to manipulate the FANN functions in the static library through this class.

4.4 - Neural Network Implementation

Although two previous attempts at programming a neural network followed an object-based structure (network, layers, and nodes), the eventual implementation turned out to be basic yet effective (9.1.3.1 - "Network.cs"). Three jagged arrays are utilised, which store, respectively, the weights, the inputs, and the outputs; these arrays form parameters for the other functions. The inputs and weights are structured so that the coordinates of an input are also the coordinates of its corresponding weight (lines 17-19). In the constructor (lines 41-77), all the arrays are initialised to their correct sizes and the weights are then initialised to random doubles between zero and one. The main functionality of the neural network class can be found in the "internal" region (lines 126-235), where there are three functions (to run the input layer, the hidden layer, and the output layer), all called by the public "run()" function (lines 241-270). All nodes determine their outputs using the sigmoid function (lines 130-133). I tested the neural network implementation by creating a neural network with structure 2, 3, 1 and creating a neural network with a similar structure on an Excel spreadsheet; I then entered the weights from the neural network program into the spreadsheet (by means of using the Visual Studio debugger to check variable values) and checked that the outputs coincided.
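In outline, the run pass over such jagged arrays looks like the following sketch (illustrative only; the real code is at 9.1.3.1 - "Network.cs"):

using System;

static class FeedForwardSketch
{
    static double Sigmoid(double x)
    {
        return 1.0 / (1.0 + Math.Exp(-x));
    }

    // w1[h][i]: weight from input i to hidden node h
    // w2[o][h]: weight from hidden node h to output o
    public static double[] Run(double[] input, double[][] w1, double[][] w2)
    {
        double[] hidden = new double[w1.Length];
        for (int h = 0; h < w1.Length; h++)
        {
            double sum = 0.0;
            for (int i = 0; i < input.Length; i++)
                sum += input[i] * w1[h][i];
            hidden[h] = Sigmoid(sum);
        }

        double[] output = new double[w2.Length];
        for (int o = 0; o < w2.Length; o++)
        {
            double sum = 0.0;
            for (int h = 0; h < hidden.Length; h++)
                sum += hidden[h] * w2[o][h];
            output[o] = Sigmoid(sum);
        }
        return output;
    }
}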

4.5 - Data Parser Implementation

The data parser (9.1.2.2 - "DatasetParser.cs") parses data directly from the rich text boxes in which the data is displayed on the main form. In the initialisation routines (lines 28-124), the textual input is separated, via a number of designated separator characters, into arrays of strings. From there, using the data from the heading (number of epochs of data, number of inputs, number of outputs), the parser separates the data into two-dimensional arrays suitable to be passed to other functions.
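A minimal sketch of this parsing scheme, assuming the FANN text format (a header of "sets inputs outputs" followed by alternating input and output values); the names are illustrative:

using System;

static class DatasetParserSketch
{
    public static void Parse(string raw,
                             out double[][] inputs, out double[][] outputs)
    {
        char[] seps = { ' ', '\t', '\r', '\n' };
        string[] tok = raw.Split(seps, StringSplitOptions.RemoveEmptyEntries);

        int sets = int.Parse(tok[0]);    // header: number of data pairs,
        int nIn = int.Parse(tok[1]);     // inputs per pair,
        int nOut = int.Parse(tok[2]);    // outputs per pair

        inputs = new double[sets][];
        outputs = new double[sets][];

        int t = 3;
        for (int s = 0; s < sets; s++)
        {
            inputs[s] = new double[nIn];
            outputs[s] = new double[nOut];
            for (int i = 0; i < nIn; i++) inputs[s][i] = double.Parse(tok[t++]);
            for (int o = 0; o < nOut; o++) outputs[s][o] = double.Parse(tok[t++]);
        }
    }
}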

5 - Testing Data

Testing took place several times for each algorithm and dataset (see 9.4 - Datasets): ten times for XOR, and five times for virus classification and for Fishers iris data. As the program presented the data in each case, it was recorded for analysis. The target error was 0.01 for XOR and 0.02 for the Fishers iris data and the virus classification data. The genetic algorithm and back-propagation neural networks each had four hidden nodes for each dataset.

5.1 - XOR

In the XOR dataset, the dataset is too small to split, so repetition is instead used.

5.1.1 - Genetic Algorithms

Table 2 - Genetic algorithm testing data (XOR)
Iterations (GA): 591, 407, 500, 466, 499, 534, 675, 557, 357, 939; average 552.5
Result MSqE (GA): 0.00881867, 0.006150786, 0.009566951, 0.009962559, 0.009162436, 0.009375132, 0.009248076, 0.009829335, 0.009888967, 0.00957518; average 0.009157809
Testing MSqE (GA): 0.00881867, 0.006150786, 0.009566951, 0.009962559, 0.009162436, 0.009375132, 0.009248076, 0.009829335, 0.009888967, 0.00957518; average 0.009157809

5.1.2 - Back-propagation

In the back-propagation testing data, the mean squared error from the network result and the testing mean squared error are different despite the data being the same, because of floating-point inaccuracies in converting between the double-precision floating-point numbers in my code and the single precision used in the FANN code.

Table 3 - Back-propagation testing data (XOR)
Iterations (BP): 345, 287, 319, 327, 301, 323, 421, 315, 362, 325; average 332.5
Result MSqE (BP): 0.00992291, 0.00999069, 0.00997699, 0.00998027, 0.00995716, 0.00999799, 0.0099765, 0.00998959, 0.00998032, 0.00992081; average 0.009969323
Testing MSqE (BP): 0.00991907, 0.00990161, 0.00989974, 0.0098402, 0.0099113, 0.00988649, 0.00990403, 0.00983181, 0.00988863, 0.00987211; average 0.009885499

5.1.3 - Cascade Correlation

In the cascade correlation data, the testing result differs quite widely from the network result. At each node addition, the callback function triggers, which is when the program records the mean squared error; but in the case of cascade training, the network is trained further after that point (output weight training).

Table 4 - Cascade correlation testing data (XOR)
Nodes (CC): 2, 2, 2, 2, 2, 2, 2, 2, 2, 2; average 2
Result MSqE (CC): 0.00889787, 0.00960884, 0.00656719, 0.00798691, 0.00863005, 0.00855044, 0.00992534, 0.00585159, 0.00573252, 0.0086449; average 0.008039565
Testing MSqE (CC): 2.14E-14, 2.02E-13, 1.02E-13, 2.22E-13, 1.18E-13, 7.86E-14, 1.32E-13, 2.83E-13, 2.99E-13, 1.78E-14; average 1.43545E-13

5.1.4 - Graphs

Figure 29 - Number of iterations to achieve target (XOR)

Figure 30 - Mean squared error upon last training test (XOR)

Figure 31 - The mean squared error at testing (XOR)

5.2 - Fishers Iris data

5.2.1 - Genetic Algorithms

Table 5 - Genetic algorithm testing data (Iris data)
Iterations (GA): 1928, 1834, 1277, 2835, 8716; average 3318
Mean squared error (GA): 0.019763877, 0.019719802, 0.018789438, 0.019751257, 0.019775896; average 0.01956255614
Testing mean squared error (GA): 0.019763877, 0.019719802, 0.018789438, 0.019751257, 0.019775896; average 0.01956255614

5.2.2 - Back-propagation

Table 6 - Back-propagation testing data (Iris data)
Iterations (BP): 434, 529, 474, 391, 527; average 471
Mean squared error (BP): 0.0199892, 0.0199972, 0.0199916, 0.0199999, 0.0199719; average 0.01998996
Testing mean squared error (BP): 0.0196297, 0.0198416, 0.0195853, 0.0193595, 0.0189456; average 0.01950238

5.2.3 - Cascade Correlation

Table 7 - Cascade correlation testing data (Iris data)
Nodes (CC): 2, 2, 2, 2, 2; average 2
Mean squared error (CC): 0.0143806, 0.0139982, 0.0141406, 0.014144, 0.0140669; average 0.0141935
Testing mean squared error (CC): 0.014116, 0.0158679, 0.0143778, 0.0164964, 0.0160925

5.2.4 - Graphs

Figure 32 - Number of iterations to achieve target (Iris data)

Figure 33 - Mean squared error upon last training test (Iris data)

Figure 34 - The mean squared error at testing (Iris data)

5.3 - Virus Classification

This dataset covers 61 viruses affecting several crops, including tobacco, tomato and cucumber, among others. There are 18 measurements on each virus, which are the number of amino acid residues per molecule of coat protein. The dataset is in order:
• Hordeviruses x3
• Tobraviruses x6
• Tobamoviruses x39
• Furoviruses x13

5.3.1 - Genetic Algorithms

Table 8 - Genetic algorithm testing data (Viruses)
Iterations (GA): 1818, 1421, 896, 11676, 1068; average 3375.8
Mean Squared Error training (GA): 0.019092293, 0.019861371, 0.019927361, 0.01870197, 0.017723052; average 0.0190612094
Mean Squared Error testing (GA): 0.019092293, 0.019861371, 0.019927361, 0.01870197, 0.017723052; average 0.0190612094

5.3.2 - Back-propagation

Table 9 - Back-propagation testing data (Viruses)
Iterations (BP): 3275, 10922, 2800, 3744, 6295; average 5407.2
Mean Squared Error training (BP): 0.0199621, 0.0199916, 0.0197935, 0.0196959, 0.019743; average 0.01983722
Mean Squared Error testing (BP): 0.0347797, 0.0393106, 0.0434423, 0.0350318, 0.0391179; average 0.03833646

5.3.3 - Cascade Correlation

Table 10 - Cascade correlation testing data (Viruses)
Nodes (CC): 1, 2, 1, 2, 1; average 1.4
Mean Squared Error training (CC): 0.0195173, 0.0194383, 0.0185152, 0.017043, 0.00640291; average 0.016183342
Mean Squared Error testing (CC): 0.0105845, 0.0111416, 0.00980208, 0.00979968, 0.0165979; average 0.011585152

5.3.4 - Graphs

Figure 35 - Number of iterations to achieve target (Viruses)

Figure 36 - Mean squared error upon last training test (Viruses)

Figure 37 - Mean squared error at testing (Viruses)

6 - Comparisons

6.1 - Back-propagation and Genetic Algorithms

6.1.1 - XOR

As regards the XOR problem, genetic algorithms and back-propagation offer similar performance in solving the problem. Genetic algorithms take around five hundred iterations on average to produce a workable solution when aiming for a mean squared error of below 0.01, versus back-propagation's 333 iterations on average. Genetic algorithm solutions are slightly more accurate, but to a negligible degree in this scenario. The back-propagation implementation seems to be slightly faster in real time, but this is possibly attributable to managed-code inefficiencies versus unmanaged code, or to possible inefficiencies in my programming.

6.1.2 - Fishers Iris Data

Genetic algorithms clearly found this a difficult problem to solve as opposed to back-propagation: with an average of 3318 generations for genetic algorithms versus 471 epochs for back-propagation, this became clear. The back-propagation-generated networks were also more accurate overall (in fact, on this particular problem, they analysed the testing data more effectively than the training data).

6.1.3 - Virus Classification

The solution of this problem was more effective by genetic algorithms than by back-propagation, with an average of 3375.8 generations for the genetic algorithm versus 5407.2 epochs for back-propagation. The genetic algorithm solutions also held their effectiveness through to the testing data with extreme consistency, whereas the back-propagation solutions lost effectiveness on the testing data.

6.2 - Cascade Correlation and Genetic Algorithms

6.2.1 - XOR

This problem was solved more effectively by the cascade algorithm than by genetic algorithms: genetic algorithms took 550 generations on average to solve the problem to a suitable accuracy, whilst the cascade algorithm used two nodes to solve it. At a maximum of 150 epochs per node added (the default (Nissen, FANN)), this is around 300 epochs to solve the problem. The cascade algorithm also achieves a higher degree of accuracy.

6.2.2 - Fishers Iris Data

Again, on this data set cascade training came to a satisfactory solution faster and more accurately than genetic algorithms.

6.2.3 - Virus Classification

With this data set, cascade training again came to a satisfactory solution faster and more accurately than genetic algorithms did on the same data.

6.3 - Testing conclusions

Genetic algorithms are a valuable and effective means of training neural networks. They are roughly as effective as back-propagation training and, although back-propagation is faster for the most part, solutions developed using genetic algorithms tend to hold their effectiveness through to the testing data. Compared to cascade correlation, however, genetic algorithms fall down as a solution. With more time, and more in-depth testing using more varied datasets, a clearer comparison, and possibly drawbacks associated with cascade training, may become evident; however, the unavoidable conclusion is that cascade training is the most effective of the three training methods examined.

7 - Evaluation

Overall, with the information presented and discussed here, I feel the project has been a success, although there are further investigations I would like to continue with given more time (and may continue with in my own time).

7.1 - How many of the original objectives were achieved?

The original objectives were as follows (9.5 - Objectives):
• Research the fields of genetic algorithms and neural networks.
• Research current information on training neural networks using genetic algorithms.
• Build an application that is capable of training a neural network with a genetic algorithm and with a back-propagation system, and that allows manual adjustment of relevant variables.
• Build a user interface for the aforementioned application allowing for easy alteration of appropriate variables.
• Using the above application, evaluate training methods for neural networks.
• Identify the most effective ways of training neural networks using genetic algorithms, and in which cases these methods are suitable.

The only objective that I feel could be improved upon is the final one: "Identify the most effective ways of training neural networks using genetic algorithms, and in which cases these methods are suitable." With more time, more advanced tests could be undertaken using larger data sets, as well as previous research, leading to a more solid conclusion and more information on the various strengths and weaknesses of the learning algorithms examined.

8 - Possible Improvements

More advanced testing would be the biggest improvement that more time would make possible. Some forms of testing and other improvements that would be possible are:
• Testing with datasets with larger numbers of inputs.
• Testing with large datasets that attempt to train the network to distinguish between small differences in data.
• Testing with datasets that would take large amounts of time to train.
• Testing both with and without scaling, utilising different scaling methods.
• Improved memory management. Due to small conflicts between managed and unmanaged memory, the program occasionally causes an exception (most specifically when creating cascade or back-propagation networks in quick succession). These could be resolved fairly simply (I believe) by implementing the IDisposable interface in my managed code, to allow it to be destroyed on demand instead of waiting for the garbage collector; a sketch follows below.
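A minimal sketch of that fix, assuming a managed wrapper class holding unmanaged resources (the names are illustrative):

using System;

public class ManagedWrapper : IDisposable
{
    private bool disposed;

    public void Dispose()
    {
        if (!disposed)
        {
            // Release the unmanaged resources held by the wrapper here,
            // deterministically, rather than waiting for finalisation.
            disposed = true;
            GC.SuppressFinalize(this);
        }
    }

    // Safety net if Dispose is never called.
    ~ManagedWrapper() { Dispose(); }
}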

8.1 - Possible Training Algorithm Improvements

With more time, I would experiment with the following training algorithm improvements:
• Improving the genetic algorithm's crossover and mutation algorithms, experimenting with multiple-point crossover and other crossover forms, as well as implementing roulette-wheel selection (a sketch of which closes this section).
• Attempting to improve the speed of the genetic algorithm.
• Implementing the back-propagation algorithm in C# instead of C++. Contrary to my opinion before undertaking extensive usage of C#, I now believe the reduced development time is worth the possibility of reduced performance. Having the back-propagation algorithm in C#, like the genetic algorithm, would also allow a more "apples to apples" comparison.
• Implementing the cascade training algorithm in C# instead of C++, for the reasons stated above.

8.2 - Possible User Interface Improvements

The main improvements to the user interface that I would make if I had more time are:
• An increased number of variable options presented to the user (with appropriate defaults set).
• Making the forms shown as dialogs (the new network wizard and the working form), which currently lock out the rest of the program, still minimise instead of blocking other computer programs.
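To illustrate the roulette-wheel selection proposed in 8.1, a minimal C# sketch (it assumes non-negative fitness values; the names are illustrative):

using System;

static class Selection
{
    // Pick an index with probability proportional to its fitness.
    public static int RouletteSelect(double[] fitness, Random rng)
    {
        double total = 0.0;
        for (int i = 0; i < fitness.Length; i++)
            total += fitness[i];

        double spin = rng.NextDouble() * total;
        double running = 0.0;
        for (int i = 0; i < fitness.Length; i++)
        {
            running += fitness[i];
            if (spin <= running)
                return i;
        }
        return fitness.Length - 1; // guard against rounding error
    }
}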

9 - Appendices

9.1 - Source code

The source code in this section is organised first by the part of the application it represents, and then by the source file in which it originally resided. This section contains only the final draft of the code: the earlier attempts at genetic algorithm and neural network implementations are not included, and neither is the code for the prototype constructed in milestone two. Although there was another section of the program compiled from source (the FANN library), which this code is partially based upon, I have not included it in the source code listing, so that this listing contains only code I have personally written for the project. The code contains line numbering for pinpointing the precise lines in question within the body text when cross-referencing.

9.1.1 - View

9.1.1.1 - "NetworkUpdatedEventHandler"

00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.Linq;
00000004 using System.Text;
00000005 
00000006 namespace _05025397.Controller.Classes
00000007 {
00000008     public class NetworkUpdateEventArgs : System.EventArgs
00000009     {
00000010         #region data
00000011         /********************\
00000012         |*       DATA       *|
00000013         \********************/
00000014         //Training Algorithm Identifier
00000015         private int trainingalg;
00000016 
00000017         //Dataset identifier
00000018         private int _TrDataSet;
00000019 
00000020         //Cascade Correlation
00000021         private double _CC_Terr;
00000022         private int _CC_MaxNeurons;
00000023         private double _CC_LearningRate;
00000024         private bool _CC_ITerr;
00000025         private int _CC_Reports;
00000026 

00000027         //Genetic Algorithm
00000028         private int _GA_PopSize;
00000029         private int _GA_GenLimit;
00000030         private int _GA_Crossover;
00000031         private int _GA_Mutation;
00000032         private int _GA_HiddenL;
00000033         private int _GA_Reports;
00000034         private double _GA_Terr;
00000035         private bool _GA_ITerr;
00000036 
00000037         //BackProp
00000038         private int _BP_EpochLimit;
00000039         private int _BP_HiddenL;
00000040         private double _BP_LearningRate;
00000041         private double _BP_Terr;
00000042         private bool _BP_ITerr;
00000043         private int _BP_Reports;
00000044 
00000045         #endregion
00000046 
00000047         #region constructor
00000048         /********************\
00000049         |*   CONSTRUCTOR    *|
00000050         \********************/
00000051 
00000052         //Constructor to accept all necessary data
00000053         public NetworkUpdateEventArgs
00000054             (int Algorithm,
00000055              int GAPopSize, int GAGenLimit, int GACrossOver,
00000056              int GAMutation, int GAHiddenL, int GAReports,

00000057              double GATerr, bool GAITerr,
00000058              double CCTerr, int CCMaxNeurons, double CCLearningRate,
00000059              bool CCITerr, int CCReports,
00000060              int BPEpochLimit, int BPHiddenL, double BPLearningRate,
00000061              double BPTerr, bool BPITerr, int BPReports,
00000062              int TrData_Set)
00000063         {
00000064             //Algorithm
00000065             trainingalg = Algorithm;
00000066 
00000067             //Genetic Algorithm
00000068             _GA_PopSize = GAPopSize;
00000069             _GA_GenLimit = GAGenLimit;
00000070             _GA_Crossover = GACrossOver;
00000071             _GA_Mutation = GAMutation;
00000072             _GA_HiddenL = GAHiddenL;
00000073             _GA_Reports = GAReports;
00000074             _GA_Terr = GATerr;
00000075             _GA_ITerr = GAITerr;
00000076 
00000077             //Cascade Correlation
00000078             _CC_Terr = CCTerr;
00000079             _CC_MaxNeurons = CCMaxNeurons;
00000080             _CC_LearningRate = CCLearningRate;
00000081             _CC_ITerr = CCITerr;
00000082             _CC_Reports = CCReports;
00000083 
00000084             //Backpropagation Algorithm
00000085             _BP_EpochLimit = BPEpochLimit;
00000086             _BP_HiddenL = BPHiddenL;

00000087             _BP_LearningRate = BPLearningRate;
00000088             _BP_Terr = BPTerr;
00000089             _BP_ITerr = BPITerr;
00000090             _BP_Reports = BPReports;
00000091 
00000092             _TrDataSet = TrData_Set;
00000093         }
00000094         #endregion
00000095 
00000096         #region getters/setters
00000097         #region cascade_correlation
00000098         //Cascade Correlation
00000099         public double CC_Terr
00000100         { get { return _CC_Terr; } }
00000101         public int CC_MaxNeurons
00000102         { get { return _CC_MaxNeurons; } }
00000103         public double CC_LearningRate
00000104         { get { return _CC_LearningRate; } }
00000105         public bool CC_ITerr
00000106         { get { return _CC_ITerr; } }
00000107         public int CC_Reports
00000108         { get { return _CC_Reports; } }
00000109         #endregion
00000110 
00000111         #region genetic_algorithms
00000112         //Genetic Algorithm
00000113         public int GA_PopSize
00000114         { get { return _GA_PopSize; } }
00000115 
00000116         public int GA_GenLimit

00000117         { get { return _GA_GenLimit; } }
00000118         public int GA_Crossover
00000119         { get { return _GA_Crossover; } }
00000120         public int GA_Mutation
00000121         { get { return _GA_Mutation; } }
00000122         public int GA_HiddenL
00000123         { get { return _GA_HiddenL; } }
00000124         public int GA_Reports
00000125         { get { return _GA_Reports; } }
00000126         public double GA_Terr
00000127         { get { return _GA_Terr; } }
00000128         public bool GA_ITerr
00000129         { get { return _GA_ITerr; } }
00000130         #endregion
00000131 
00000132         #region back_propagation
00000133         //Back propagation
00000134         public int BP_EpochLimit
00000135         { get { return _BP_EpochLimit; } }
00000136         public int BP_HiddenL
00000137         { get { return _BP_HiddenL; } }
00000138         public double BP_LearningRate
00000139         { get { return _BP_LearningRate; } }
00000140         public double BP_Terr
00000141         { get { return _BP_Terr; } }
00000142         public bool BP_ITerr
00000143         { get { return _BP_ITerr; } }
00000144         public int BP_Reports
00000145         { get { return _BP_Reports; } }
00000146         #endregion

00000147 
00000148         #region external
00000149         //General
00000150         public int TrDataSet
00000151         { get { return _TrDataSet; } }
00000152         public int TrainingAlgorithm
00000153         { get { return trainingalg; } }
00000154         #endregion
00000155         #endregion
00000156     }
00000157 }

9.1.1.2 - "FrmMain.cs"

00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.ComponentModel;
00000004 using System.Data;
00000005 using System.Drawing;
00000006 using System.Linq;
00000007 using System.Text;
00000008 using System.Windows.Forms;
00000009 using System.IO;
00000010 
00000011 using ZedGraph;
00000012 using _05025397;
00000013 
00000014 namespace _05025397.Controller
00000015 {
00000016 
00000017     public partial class FrmMain : Form

00000018     {
00000019         #region data
00000020         /********************\
00000021         |*       DATA       *|
00000022         \********************/
00000023 
00000024         //The neuralnetwork
00000025         Controller.Algorithms.TrainingAlgorithm network;
00000026 
00000027         //Working dialog
00000028         Controller.Working workingdialog = new Controller.Working();
00000029 
00000030         //Background thread
00000031         static BackgroundWorker bw = new BackgroundWorker();
00000032 
00000033         #endregion
00000034 
00000035         #region constructor
00000036         /********************\
00000037         |*   CONSTRUCTOR    *|
00000038         \********************/
00000039         public FrmMain()
00000040         {
00000041             InitializeComponent();
00000042             this.Icon = Properties.Resources.icon;
00000043 
00000044             bw.DoWork += new DoWorkEventHandler(bw_DoWork);
00000045             bw.RunWorkerCompleted += new RunWorkerCompletedEventHandler(bw_RunWorkerCompleted);
00000046         }
00000047         #endregion

00000048 
00000049         #region internal_functions
00000050         /********************\
00000051         |*     INTERNAL     *|
00000052         \********************/
00000053         private void plotgraphsMSE(PointPairList MSE)
00000054         {
00000055             GraphPane MSEpane = this.zedGraphControl1.GraphPane;
00000056             this.zedGraphControl1.GraphPane.CurveList.Clear();
00000057 
00000058             //Titles
00000059             MSEpane.Title.Text = "Mean Squared Error";
00000060             MSEpane.XAxis.Title.Text = "Iteration";
00000061             MSEpane.YAxis.Title.Text = "MSE";
00000062 
00000063             MSEpane.AddCurve("", MSE, Color.Red, SymbolType.Circle);
00000064 
00000065             // Hide the legend
00000066             MSEpane.Legend.IsVisible = false;
00000067 
00000068             this.zedGraphControl1.AxisChange();
00000069         }
00000070 
00000071         private void updatenetwork(object sender, Classes.NetworkUpdateEventArgs e)
00000072         {
00000073             LoadDataset(e.TrDataSet);
00000074 
00000075             try
00000076             {
00000077                 if (e.TrainingAlgorithm == 0)

00000078                 {
00000079                     network = new Controller.Algorithms.GeneticAlgorithm
00000080                         (e.GA_PopSize, e.GA_GenLimit,
00000081                          e.GA_Crossover, e.GA_Mutation,
00000082                          e.GA_Terr, e.GA_ITerr,
00000083                          e.GA_Reports, e.GA_HiddenL,
00000084                          txtTrainData, txtTestData);
00000085                 }
00000086 
00000087                 if (e.TrainingAlgorithm == 1)
00000088                 {
00000089                     network = new Controller.Algorithms.BackProp
00000090                         (e.BP_EpochLimit, e.BP_HiddenL,
00000091                          e.BP_LearningRate, e.BP_Terr,
00000092                          e.BP_ITerr, e.BP_Reports,
00000093                          txtTrainData, txtTestData);
00000094                 }
00000095 
00000096                 if (e.TrainingAlgorithm == 2)
00000097                 {
00000098                     network = new Controller.Algorithms.CascadeCorrelation
00000099                         (e.CC_Terr, e.CC_MaxNeurons,
00000100                          e.CC_LearningRate, e.CC_ITerr,
00000101                          e.CC_Reports,
00000102                          txtTrainData, txtTestData);
00000103                 }
00000104             }
00000105             catch
00000106             {
00000107 

00000108                 MessageBox.Show("Error creating new network", "Error",
00000109                                 MessageBoxButtons.OK, MessageBoxIcon.Error);
00000110                 Application.Exit();
00000111             }
00000112 
00000113             if (network != null)
00000114             {
00000115                 txtNetSettings.Text = network.network_details();
00000116             }
00000117         }
00000118 
00000119 
00000120         private void new_network()
00000121         {
00000122             Controller.Wiz1 wizard = new Controller.Wiz1();
00000123             wizard.NetworkUpdated +=
00000124                 new Controller.Wiz1.NetworkUpdateHandler(updatenetwork);
00000125             wizard.ShowDialog();
00000126         }
00000127 
00000128         private void LoadDataset(int Data_set)
00000129         {
00000130             switch (Data_set)
00000131             {
00000132                 case (int)Dataset.XOR:
00000133                     txtTrainData.Text = Properties.Resources.XORtrain;
00000134                     txtTestData.Text = Properties.Resources.XORtest;
00000135                     break;
00000136 
00000137                 case (int)Dataset.FISHER:

00000138                     txtTrainData.Text = Properties.Resources.FISHERtrain;
00000139                     txtTestData.Text = Properties.Resources.FISHERtest;
00000140                     break;
00000141 
00000142                 case (int)Dataset.CUSTOM:
00000143                     LoadCustomDataSet();
00000144                     break;
00000145             }
00000146         }
00000147 
00000148         private void LoadCustomDataSet()
00000149         {
00000150             string trainpath = SelectTextFile(".\\", "Select Training Data");
00000151             string testpath = SelectTextFile(".\\", "Select Testing Data");
00000152             string train = "";
00000153             string test = "";
00000154 
00000155             try
00000156             {
00000157                 train = LoadTxtFile(trainpath);
00000158             }
00000159             catch
00000160             {
00000161                 MessageBox.Show("Error: Problem loading training data",
00000162                                 "Error!", MessageBoxButtons.OK,
00000163                                 MessageBoxIcon.Error);
00000164             }
00000165 
00000166             try
00000167             {

00000168                 test = LoadTxtFile(testpath);
00000169             }
00000170             catch
00000171             {
00000172                 MessageBox.Show("Error: Problem loading testing data",
00000173                                 "Error!", MessageBoxButtons.OK,
00000174                                 MessageBoxIcon.Error);
00000175             }
00000176 
00000177             txtTrainData.Text = train;
00000178             txtTestData.Text = test;
00000179         }
00000180 
00000181         private string LoadTxtFile(string path)
00000182         {
00000183             string data = "";
00000184             FileStream File = null;
00000185             StreamReader Reader = null;
00000186 
00000187             try
00000188             {
00000189                 File = new FileStream(path, FileMode.Open, FileAccess.Read);
00000190                 Reader = new StreamReader(File);
00000191                 data = Reader.ReadToEnd();
00000192             }
00000193             catch
00000194             {
00000195                 MessageBox.Show("Error reading selected file: " + path,
00000196                                 "Error!", MessageBoxButtons.OK,
00000197                                 MessageBoxIcon.Error);

00000198                 data = "";
00000199             }
00000200             finally
00000201             {
00000202                 if (Reader != null) Reader.Close();
00000203                 if (File != null) File.Close();
00000204             }
00000205 
00000206             return data;
00000207         }
00000208 
00000209         private string SelectTextFile(string initialDirectory, string title)
00000210         {
00000211             OpenFileDialog dialog = new OpenFileDialog();
00000212             dialog.Filter = "txt files (*.txt)|*.txt|All files (*.*)|*.*";
00000213             dialog.InitialDirectory = initialDirectory;
00000214             dialog.Title = title;
00000215 
00000216             return (dialog.ShowDialog() == DialogResult.OK)
00000217                 ? dialog.FileName : null;
00000218         }
00000219         #endregion
00000220 
00000221         #region menu_items
00000222         /********************\
00000223         |*    MENU ITEMS    *|
00000224         \********************/
00000225 
00000226 
00000227 

00000228         private void newNetworkToolStripMenuItem_Click(object sender, EventArgs e)
00000229         {
00000230             new_network();
00000231         }
00000232 
00000233         private void runToolStripMenuItem_Click(object sender, EventArgs e)
00000234         {
00000235             if (network != null)
00000236             {
00000237                 if (network.Trained != false)
00000238                 {
00000239                     network.Test();
00000240                 }
00000241             }
00000242         }
00000243 
00000244         private void aboutToolStripMenuItem1_Click(object sender, EventArgs e)
00000245         {
00000246             FrmAbout About = new FrmAbout();
00000247             About.Show();
00000248         }
00000249 
00000250         private void exitToolStripMenuItem_Click(object sender, EventArgs e)
00000251         {
00000252             this.Close();
00000253         }
00000254         #endregion
00000255 
00000256         #region buttons
00000257         /********************\

00000258         |*     BUTTONS      *|
00000259         \********************/
00000260         private void btnNew_Click(object sender, EventArgs e)
00000261         {
00000262             new_network();
00000263         }
00000264 
00000265         private void btnTrain_Click(object sender, EventArgs e)
00000266         {
00000267             //Shouldn't be busy if the user managed to click this,
00000268             //but just make 100% certain.
00000269             if (!bw.IsBusy)
00000270             {
00000271                 if (network != null)
00000272                 {
00000273                     if (!network.Trained)
00000274                     {
00000275                         bw.RunWorkerAsync();
00000276                         workingdialog.ShowDialog();
00000277                     }
00000278                 }
00000279             }
00000280         }
00000281 
00000282         private void btnTest_Click(object sender, EventArgs e)
00000283         {
00000284             if (network != null)
00000285             {
00000286                 if (network.Trained)
00000287                 {

00000288                     if (!network.Tested)
00000289                     {
00000290                         bool success = network.Test();
00000291 
00000292                         if (success)
00000293                         {
00000294                             txtOutput.Clear();
00000295                             txtOutput.Text = network.ReportData;
00000296                             txtOutput.SelectionStart = txtOutput.Text.Length;
00000297                             txtOutput.ScrollToCaret();
00000298                         }
00000299                     }
00000300                 }
00000301             }
00000302         }
00000303         #endregion
00000304 
00000305         #region bg_worker
00000306         /********************\
00000307         |*    BG WORKER     *|
00000308         \********************/
00000309         private void bw_DoWork(object sender, DoWorkEventArgs e)
00000310         {
00000311             bool success = true;
00000312 
00000313             if (network != null)
00000314             {
00000315                 success = network.Train();
00000316             }
00000317 

00000318             e.Result = success;
00000319         }
00000320 
00000321         private void bw_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
00000322         {
00000323             bool success;
00000324 
00000325             //Check for errors
00000326             if (e.Error != null)
00000327             {
00000328                 MessageBox.Show("Error during network train: " + e.Error.ToString(),
00000329                                 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000330             }
00000331 
00000332             success = (bool)e.Result;
00000333 
00000334             if (success)
00000335             {
00000336                 txtOutput.Clear();
00000337                 txtOutput.Text = network.ReportData;
00000338                 txtOutput.SelectionStart = txtOutput.Text.Length;
00000339                 txtOutput.ScrollToCaret();
00000340                 plotgraphsMSE(network.getGraphData());
00000341             }
00000342             else
00000343             {
00000344                 MessageBox.Show("Error training network!",
00000345                                 "Error training network!",
00000346                                 MessageBoxButtons.OK, MessageBoxIcon.Error);
00000347             }

00000348 
00000349             //Remove the dialog locking out the main form
00000350             //and showing that we're currently working
00000351             workingdialog.Hide();
00000352         }
00000353         #endregion
00000354 
00000355         #region misc
00000356         /********************\
00000357         |*       MISC       *|
00000358         \********************/
00000359         private void FrmMain_Shown(object sender, EventArgs e)
00000360         {
00000361             //Run the new neural net wizard
00000362             new_network();
00000363         }
00000364         #endregion
00000365     }
00000366 }

9.1.1.3 - "Wiz1.cs"

00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.ComponentModel;
00000004 using System.Data;
00000005 using System.Drawing;
00000006 using System.Linq;

00000007 using System.Text;
00000008 using System.Windows.Forms;
00000009 
00000010 using _05025397;
00000011 
00000012 namespace _05025397.Controller
00000013 {
00000014     #region enums
00000015     /********************\
00000016     |*      ENUMS       *|
00000017     \********************/
00000018     enum Training { GA, BP, CC };
00000019     enum Dataset { XOR, FISHER, CUSTOM };
00000020     #endregion
00000021 
00000022     public partial class Wiz1 : Form
00000023     {
00000024         #region data
00000025         /********************\
00000026         |*       DATA       *|
00000027         \********************/
00000028         public delegate void NetworkUpdateHandler
00000029             (object sender, Classes.NetworkUpdateEventArgs e);
00000030 
00000031         public event NetworkUpdateHandler NetworkUpdated;
00000032 
00000033         private int algorithm;
00000034         private int Trainset;
00000035 
00000036         #endregion data

00000037 
00000038         #region constructor
00000039         /********************\
00000040         |*   CONSTRUCTOR    *|
00000041         \********************/
00000042         public Wiz1()
00000043         {
00000044             InitializeComponent();
00000045             this.Icon = Properties.Resources.icon;
00000046 
00000047             algorithm = (int)Training.GA;
00000048             Trainset = (int)Dataset.XOR;
00000049         }
00000050         #endregion
00000051 
00000052         #region control_events
00000053         /********************\
00000054         |*  CONTROL EVENTS  *|
00000055         \********************/
00000056 
00000057         #region buttons
00000058         /********************\
00000059         |*     BUTTONS      *|
00000060         \********************/
00000061         private void btnCancel_Click(object sender, EventArgs e)
00000062         {
00000063             this.Close();
00000064         }
00000065 
00000066         private void rBtnBackprop_CheckedChanged(object sender, EventArgs e)

00000067         {
00000068             if (rBtnBackprop.Checked == true)
00000069             {
00000070                 picAlg.Image = _05025397.Properties.Resources.IMAGEBackProp;
00000071 
00000072                 algorithm = (int)Training.BP;
00000073 
00000074                 panBackprop.Visible = true;
00000075                 panGeneticAlgorithm.Visible = false;
00000076                 panCC.Visible = false;
00000077 
00000078                 lblBP.Visible = true;
00000079                 lblGA.Visible = false;
00000080                 lblCC.Visible = false;
00000081             }
00000082         }
00000083 
00000084         private void rBtnCascadeCorrelation_CheckedChanged(object sender, EventArgs e)
00000085         {
00000086             if (rBtnCascadeCorrelation.Checked == true)
00000087             {
00000088                 picAlg.Image = _05025397.Properties.Resources.IMAGECascadeCorrelation;
00000089 
00000090                 algorithm = (int)Training.CC;
00000091 
00000092                 panBackprop.Visible = false;
00000093                 panGeneticAlgorithm.Visible = false;
00000094                 panCC.Visible = true;
00000095 
00000096                 lblBP.Visible = false;

00000097                 lblGA.Visible = false;
00000098                 lblCC.Visible = true;
00000099             }
00000100         }
00000101 
00000102         private void rBtnGeneticAlgorithm_CheckedChanged(object sender, EventArgs e)
00000103         {
00000104             if (rBtnGeneticAlgorithm.Checked == true)
00000105             {
00000106                 picAlg.Image = _05025397.Properties.Resources.IMAGEGeneticAlgorithm;
00000107 
00000108                 algorithm = (int)Training.GA;
00000109 
00000110                 panBackprop.Visible = false;
00000111                 panGeneticAlgorithm.Visible = true;
00000112                 panCC.Visible = false;
00000113 
00000114                 lblBP.Visible = false;
00000115                 lblGA.Visible = true;
00000116                 lblCC.Visible = false;
00000117             }
00000118         }
00000119 
00000120         private void btnBack_Click(object sender, EventArgs e)
00000121         {
00000122             if (tabControl1.SelectedIndex > 0)
00000123             {
00000124                 tabControl1.SelectTab(tabControl1.SelectedIndex - 1);
00000125             }
00000126         }

NetworkUpdateEventArgs args = new Classes.SelectedIndex < tabControl1. } else { VerifyData check = new VerifyData (algorithm.Text.Text.Text.1) { tabControl1.GA_Crossover. check.GA_Terr.Verified) { Classes.CC_MaxNeurons. Trainset). txtGA_PopSize.NetworkUpdateEventArgs (check.CC_Terr. if (check. CC_ITerr. chkGA_IgnoreTarget.Text.TabCount .William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 00000127 00000128 00000129 00000130 00000131 00000132 00000133 00000134 00000135 00000136 00000137 00000138 00000139 00000140 00000141 00000142 00000143 00000144 00000145 00000146 00000147 00000148 00000149 00000150 00000151 00000152 00000153 00000154 00000155 00000156 if (tabControl1. txtBP_LearnRate.SelectTab(tabControl1. txtGA_MaxGen.Checked.Text.Text. check.GA_PopSize.BP_EpochLimit. txtGA_Terr.Text. txtBP_Report. txtCC_LearningRate.BP_HiddenL. check. txtBP_EpochLimit.Checked. check.Text.GA_Reports.TrainingAlgorithm.Text. check. txtGA_HiddenNodes.SelectedIndex + 1).Text.Text. check.CC_LearningRate. check.Text.Checked. txtCC_TargetError. check. check. check. txtGA_Report.Text. txtBP_HiddenNodes. txtBP_Terr. check. txtCC_Report.GA_GenLimit.CC_Reports.Text.GA_Mutation.GA_HiddenL. check. txtCC_MaxNeurons. Page 97 of 225 . txtGA_ProbM.CC_ITerr.Text. check. txtGA_ProbX.Text. check. chkBP_IgnoreTarget.GA_ITerr. check.

} else { MessageBox. } } } #endregion #region checkboxes /********************\ |* CHECKBOXES *| \********************/ private void chkDatasetFisher_CheckedChanged(object sender.FISHER.Properties. "Error Checking Data".Image = _05025397. NetworkUpdated(this.BP_ITerr. check. args).TrDataSet). MessageBoxButtons.BP_Terr. MessageBoxIcon. } } private void chkDatasetXOR_CheckedChanged(object sender. check.BP_Reports. check.OK.IMAGEIrisFlowers.Dispose(). EventArgs e) { if (chkDatasetFisher. Trainset = (int)Dataset. EventArgs e) Page 98 of 225 . check.William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 00000157 00000158 00000159 00000160 00000161 00000162 00000163 00000164 00000165 00000166 00000167 00000168 00000169 00000170 00000171 00000172 00000173 00000174 00000175 00000176 00000177 00000178 00000179 00000180 00000181 00000182 00000183 00000184 00000185 00000186 check.Resources. this.Exclamation).BP_LearningRate.Show("Error Checking Data: Data could not be verified".Checked) { picDataset.

Checked) { picDataset.Image = _05025397. EventArgs e) { if (chkDatasetCustom. } } #endregion #region misc /********************\ |* MISC *| \********************/ private void tabControl1_SelectedIndexChanged(object sender.Properties. Trainset = (int)Dataset.XOR.Focus().Visible == true) { txtCC_TargetError.William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 00000187 00000188 00000189 00000190 00000191 00000192 00000193 00000194 00000195 00000196 00000197 00000198 00000199 00000200 00000201 00000202 00000203 00000204 00000205 00000206 00000207 00000208 00000209 00000210 00000211 00000212 00000213 00000214 00000215 00000216 { if (chkDatasetXOR.CUSTOM.Checked) { picDataset. Trainset = (int)Dataset.SelectedIndex == 1) { if (panCC.Image = _05025397.Properties.Resources.Resources.IMAGEXOR. } } private void chkDatasetCustom_CheckedChanged(object sender. } Page 99 of 225 .IMAGEQuestionMark. EventArgs e) { if (tabControl1.

Focus().Visible == true) { txtBP_EpochLimit.icon.SelectTab(0).William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 00000217 00000218 00000219 00000220 00000221 00000222 00000223 00000224 00000225 00000226 00000227 00000228 00000229 00000230 00000231 00000232 00000233 00000234 00000235 00000236 } if (panBackprop. } #endregion #endregion } Page 100 of 225 .Focus().Visible == true) { txtGA_PopSize. } } } private void Wiz1_Load(object sender.Resources. } if (panGeneticAlgorithm. tabControl1.Icon = Properties. EventArgs e) { this.
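The wizard communicates its results to the rest of the GUI through the NetworkUpdated event declared above. The fragment below is a minimal sketch (not part of the project source) of how a parent form might subscribe to that event; FrmMain and the handler name are illustrative assumptions, while Wiz1, NetworkUpdateHandler and NetworkUpdateEventArgs are as listed.

//Sketch only: a hypothetical parent form consuming the wizard.
public partial class FrmMain : Form
{
    private void launchWizard()
    {
        Wiz1 wizard = new Wiz1();
        wizard.NetworkUpdated +=
            new Wiz1.NetworkUpdateHandler(wizard_NetworkUpdated);
        wizard.Show();
    }

    private void wizard_NetworkUpdated(object sender,
                                       Classes.NetworkUpdateEventArgs e)
    {
        //e carries the verified GA/BP/CC parameters and the dataset
        //selection; the main form would build the chosen trainer from them.
    }
}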

9.1.1.4 - “FrmAbout.cs”

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace _05025397.Controller
{
    public partial class FrmAbout : Form
    {
        #region constructor
        /********************\
        |*   CONSTRUCTOR    *|
        \********************/
        public FrmAbout()
        {
            InitializeComponent();
        }
        #endregion

        #region form
        /********************\
        |*       FORM       *|
        \********************/
        private void FrmAbout_Load(object sender, EventArgs e)
        {
            this.Icon = Properties.Resources.icon;
            label5.Text = Application.ProductVersion;
            textBox5.Text = Properties.Resources.LICENCE_LGPL21;
        }
        #endregion

        #region linklabels
        /********************\
        |*    LINKLABELS    *|
        \********************/
        private void linkLabel1_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
        {
            // Specify that the link was visited.
            linkLabel1.LinkVisited = true;
            try
            {
                // Navigate to a URL.
                System.Diagnostics.Process.Start("mailto:" + linkLabel1.Text);
            }
            catch
            {
                MessageBox.Show("No e-mail program defined.",
                                "Undefined Application",
                                MessageBoxButtons.OK, MessageBoxIcon.Information);
            }
        }

        private void linkLabel4_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
        {
            try
            {
                System.Diagnostics.Process.Start(linkLabel4.Text);
            }
            catch
            {
                MessageBox.Show("Error opening link.", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
            }
        }

        private void linkLabel5_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
        {
            try
            {
                System.Diagnostics.Process.Start(linkLabel5.Text);
            }
            catch
            {
                MessageBox.Show("Error opening link.", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
            }
        }

        private void linkLabel6_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
        {
            try
            {
                System.Diagnostics.Process.Start(linkLabel6.Text);
            }
            catch
            {
                MessageBox.Show("Error opening link.", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
            }
        }

        private void linkLabel2_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
        {
            try
            {
                System.Diagnostics.Process.Start("http://www.snikks.co.uk");
            }
            catch
            {
                MessageBox.Show("Error opening link.", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
            }
        }

        private void linkLabel7_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
        {
            try
            {
                System.Diagnostics.Process.Start
                    ("http://www.gnu.org/licenses/old-licenses/lgpl-2.1.html");
            }
            catch
            {
                MessageBox.Show("Error opening link.", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
            }
        }
        #endregion

        #region buttons
        /********************\
        |*     BUTTONS      *|
        \********************/
        private void btnDatasets_Click(object sender, EventArgs e)
        {
            if (tabControl1.Visible != true)
            {
                groupBox1.Visible = false;
                groupBox2.Visible = false;
                pictureBox1.Visible = false;
                tabControl1.Visible = true;
            }
            else
            {
                groupBox1.Visible = true;
                groupBox2.Visible = false;
                pictureBox1.Visible = true;
                tabControl1.Visible = false;
            }
        }

        private void btnLicences_Click(object sender, EventArgs e)
        {
            if (groupBox2.Visible != true)
            {
                groupBox1.Visible = false;
                groupBox2.Visible = true;
                pictureBox1.Visible = false;
                tabControl1.Visible = false;
            }
            else
            {
                groupBox1.Visible = true;
                groupBox2.Visible = false;
                pictureBox1.Visible = true;
                tabControl1.Visible = false;
            }
        }

        private void btnOkay_Click(object sender, EventArgs e)
        {
            this.Dispose();
        }
        #endregion

        #region radiobuttons
        /********************\
        |*   RADIOBUTTONS   *|
        \********************/
        private void radioButton1_CheckedChanged(object sender, EventArgs e)
        {
            if (radioButton1.Checked == true)
                textBox5.Text = Properties.Resources.LICENCE_LGPL21;
        }

        private void radioButton2_CheckedChanged(object sender, EventArgs e)
        {
            if (radioButton2.Checked == true)
                textBox5.Text = Properties.Resources.LICENCE_LGPL30;
        }

        private void radioButton3_CheckedChanged(object sender, EventArgs e)
        {
            if (radioButton3.Checked == true)
                textBox5.Text = Properties.Resources.LICENCE_GPL30;
        }
        #endregion
    }
}

9.1.1.5 - “Working.cs”

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace _05025397.Controller
{
    public partial class Working : Form
    {
        #region functions
        /********************\
        |*    FUNCTIONS     *|
        \********************/
        public Working()
        {
            InitializeComponent();
            this.Icon = Properties.Resources.icon;
        }

        private void Working_Click(object sender, EventArgs e)
        {
            System.Media.SystemSounds.Beep.Play();
        }
        #endregion
    }
}

9.1.2 - Controller

9.1.2.1 - “VerifyData.cs”

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace _05025397.Controller
{
    public class VerifyData
    {
        #region data
        /********************\
        |*       DATA       *|
        \********************/
        //Data-verified flag
        private bool verified;

        //Training algorithm identifier
        //0 = genetic algorithm
        //1 = back-propagation
        //2 = cascade correlation
        private int trainingalg;

        //Dataset identifier
        //0 = XOR
        //1 = FISHERS IRIS DATA
        //2 = CUSTOM
        private int _TDataSet;

        //Genetic Algorithm
        private int _GA_PopSize;
        private int _GA_GenLimit;
        private int _GA_Crossover;
        private int _GA_Mutation;
        private double _GA_Terr;
        private int _GA_Reports;
        private int _GA_HiddenL;
        private bool _GA_ITerr;

        //BackProp
        private int _BP_EpochLimit;
        private double _BP_LearningRate;
        private double _BP_Terr;
        private int _BP_Reports;
        private int _BP_HiddenL;
        private bool _BP_ITerr;

        //Cascade Correlation
        private double _CC_Terr;
        private double _CC_LearningRate;
        private int _CC_MaxNeurons;
        private int _CC_Reports;
        private bool _CC_ITerr;
        #endregion

        #region constructor
        /********************\
        |*   CONSTRUCTOR    *|
        \********************/
        public VerifyData
            (int Algorithm,
             string GAPopSize, string GAGenLimit, string GACrossOver,
             string GAMutation, string GATerr, string GAReports,
             string GAHiddenL, bool GAITerr,
             string BPEpochLimit, string BPLearningRate, string BPTerr,
             string BPReports, string BPHiddenL, bool BPITerr,
             string CCTerr, string CCLearningRate, string CCMaxNeurons,
             string CCReports, bool CCITerr,
             int TData_Set)
        {
            //Initially set to true, will be changed to
            //false if any data verification errors occur
            verified = true;

            //Algorithm
            trainingalg = Algorithm;

            //DataSet
            _TDataSet = TData_Set;
            if (_TDataSet != (int)Dataset.XOR &&
                _TDataSet != (int)Dataset.FISHER &&
                _TDataSet != (int)Dataset.CUSTOM)
                verified = false;

            if (verified)
            {
                if (trainingalg == 2)
                {
                    try
                    {
                        //Cascade correlation
                        _CC_Terr = Convert.ToSingle(CCTerr);
                        _CC_LearningRate = Convert.ToSingle(CCLearningRate);
                        _CC_MaxNeurons = Convert.ToInt32(CCMaxNeurons);
                        _CC_Reports = Convert.ToInt32(CCReports);
                        _CC_ITerr = CCITerr;
                    }
                    catch (FormatException)
                    {
                        MessageBox.Show
                            ("Parsing Error: Please check supplied cascade " +
                             "correlation data values", "Parsing Error",
                             MessageBoxButtons.OK, MessageBoxIcon.Error);
                        verified = false;
                    }
                }
            }

            if (verified)
            {
                if (trainingalg == 0)
                {
                    try
                    {
                        //Genetic algorithm
                        _GA_PopSize = Convert.ToInt32(GAPopSize);
                        _GA_GenLimit = Convert.ToInt32(GAGenLimit);
                        _GA_Crossover = Convert.ToInt32(GACrossOver);
                        _GA_Mutation = Convert.ToInt32(GAMutation);
                        _GA_Terr = Convert.ToSingle(GATerr);
                        _GA_Reports = Convert.ToInt32(GAReports);
                        _GA_HiddenL = Convert.ToInt32(GAHiddenL);
                        _GA_ITerr = GAITerr;
                    }
                    catch (FormatException)
                    {
                        MessageBox.Show
                            ("Parsing Error: Please check supplied" +
                             " genetic algorithm data values", "Parsing Error",
                             MessageBoxButtons.OK, MessageBoxIcon.Error);
                        verified = false;
                    }
                }
            }

            if (verified)
            {
                if (trainingalg == 1)
                {
                    try
                    {
                        //Back-propagation
                        _BP_EpochLimit = Convert.ToInt32(BPEpochLimit);
                        _BP_LearningRate = Convert.ToDouble(BPLearningRate);
                        _BP_Terr = Convert.ToDouble(BPTerr);
                        _BP_Reports = Convert.ToInt32(BPReports);
                        _BP_HiddenL = Convert.ToInt32(BPHiddenL);
                        _BP_ITerr = BPITerr;
                    }
                    catch (FormatException)
                    {
                        MessageBox.Show
                            ("Parsing Error: Please check " +
                             "supplied back propagation data values",
                             "Parsing Error",
                             MessageBoxButtons.OK, MessageBoxIcon.Error);
                        verified = false;
                    }
                }
            }

            if (verified)
            {
                verified = checkforsanity();
            }
        }
        #endregion

        #region internal
        /********************\
        |*     INTERNAL     *|
        \********************/
        private bool checkGAsanity()
        {
            bool sanity = true;

            if (_GA_PopSize < 3)
                sanity = false;

            if (_GA_GenLimit < 1)
                sanity = false;

            if (_GA_Crossover < 0 || _GA_Crossover > 100)
            {
                sanity = false;
            }

            if (_GA_Mutation < 0 || _GA_Mutation > 100)
            {
                sanity = false;
            }

            if (_GA_Terr < 0 || _GA_Terr > 1)
                sanity = false;

            if (_GA_Reports < 1 || _GA_Reports > _GA_GenLimit)
                sanity = false;

            if (_GA_HiddenL < 1)
                sanity = false;

            if (!sanity)
                MessageBox.Show
                    ("Sanity Check Fail: Please check supplied " +
                     "genetic algorithm data values",
                     "Sanity Check Fail",
                     MessageBoxButtons.OK, MessageBoxIcon.Error);

            return sanity;
        }

        private bool checkBPsanity()
        {
            bool sanity = true;

            if (_BP_EpochLimit < 1)
                sanity = false;

            if (_BP_HiddenL < 1)
                sanity = false;

            if (_BP_Reports < 1 || _BP_Reports > _BP_EpochLimit)
                sanity = false;

            if (_BP_Terr < 0 || _BP_Terr > 1)
                sanity = false;

            if (_BP_LearningRate < 0 || _BP_LearningRate > 1)
                sanity = false;

            if (!sanity)
                MessageBox.Show
                    ("Sanity Check Fail: Please check supplied " +
                     "back propagation data values",
                     "Sanity Check Fail",
                     MessageBoxButtons.OK, MessageBoxIcon.Error);

            return sanity;
        }

        private bool checkCCsanity()
        {
            bool sanity = true;

            if (_CC_MaxNeurons < 1)
                sanity = false;

            if (_CC_Terr < 0 || _CC_Terr > 1)
                sanity = false;

            if (_CC_Reports < 1 || _CC_Reports > _CC_MaxNeurons)
                sanity = false;

            if (_CC_LearningRate < 0 || _CC_LearningRate > 1)
                sanity = false;

            if (!sanity)
                MessageBox.Show
                    ("Sanity Check Fail: Please check supplied " +
                     "cascade correlation data values",
                     "Sanity Check Fail",
                     MessageBoxButtons.OK, MessageBoxIcon.Error);

            return sanity;
        }

        private bool checkforsanity()
        {
            //Some of the values that are allowed through here would not
            //produce good results - however I don't want to prevent any
            //flexibility; only halt any values from proceeding that are so
            //wrong they could cause serious errors later down the line.
            //Basically we're making sure the data is clean before it goes
            //anywhere near a DLL. We're also doing some very minor
            //processing of the probability values: if they're greater than
            //1 but less than 100, scale them appropriately.
            bool sanity = true;

            if (trainingalg < 0 || trainingalg > 2)
            {
                sanity = false;
                MessageBox.Show
                    ("Sanity Check Fail: Training Algorithm not " +
                     "selected or selected incorrectly",
                     "Sanity Check Fail",
                     MessageBoxButtons.OK, MessageBoxIcon.Error);
            }

            if (sanity)
            {
                if (trainingalg == 0)
                {
                    sanity = checkGAsanity();
                }
                if (trainingalg == 1)
                {
                    sanity = checkBPsanity();
                }
                if (trainingalg == 2)
                {
                    sanity = checkCCsanity();
                }
            }

            return sanity;
        }
        #endregion

        #region getters/setters
        /********************\
        |*     EXTERNAL     *|
        \********************/
        #region cascade_correlation
        //Cascade Correlation
        public double CC_Terr { get { return _CC_Terr; } }
        public double CC_LearningRate { get { return _CC_LearningRate; } }
        public int CC_MaxNeurons { get { return _CC_MaxNeurons; } }
        public int CC_Reports { get { return _CC_Reports; } }
        public bool CC_ITerr { get { return _CC_ITerr; } }
        #endregion

        #region genetic_algorithm
        //Genetic Algorithm
        public int GA_PopSize { get { return _GA_PopSize; } }
        public int GA_GenLimit { get { return _GA_GenLimit; } }
        public int GA_Crossover { get { return _GA_Crossover; } }
        public int GA_Mutation { get { return _GA_Mutation; } }
        public double GA_Terr { get { return _GA_Terr; } }
        public int GA_Reports { get { return _GA_Reports; } }
        public int GA_HiddenL { get { return _GA_HiddenL; } }
        public bool GA_ITerr { get { return _GA_ITerr; } }
        #endregion

        #region back_propagation
        //Back propagation
        public int BP_EpochLimit { get { return _BP_EpochLimit; } }
        public double BP_LearningRate { get { return _BP_LearningRate; } }
        public double BP_Terr { get { return _BP_Terr; } }
        public int BP_Reports { get { return _BP_Reports; } }
        public int BP_HiddenL { get { return _BP_HiddenL; } }
        public bool BP_ITerr { get { return _BP_ITerr; } }
        #endregion
        #endregion

        #region external
        //Verification
        public bool Verified { get { return verified; } }

        //TrainingAlgorithm
        public int TrainingAlgorithm { get { return trainingalg; } }

        //Dataset
        public int TrDataSet { get { return _TDataSet; } }
        #endregion
    }
}
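VerifyData deliberately validates in two stages: a loose parse (any FormatException rejects the input) followed by sanity checks on the ranges. The snippet below is a standalone sketch of that pattern using the same Convert calls; the method name is illustrative and does not appear in the project source.

//Sketch only: parse loosely, then bound-check, as VerifyData does for
//the crossover and mutation percentages.
private static bool tryParsePercentage(string text, out int value)
{
    value = 0;
    try
    {
        value = Convert.ToInt32(text);
    }
    catch (FormatException)
    {
        return false;                        //parse failure (cf. the catch blocks above)
    }
    return (value >= 0 && value <= 100);     //range check (cf. checkGAsanity)
}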

9.1.2.2 - “DatasetParser.cs”

using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace _05025397.Controller
{
    public class DatasetParser
    {
        #region data
        /********************\
        |*       DATA       *|
        \********************/
        private bool _Verified;

        private string[] data;
        private string[] test;

        private ArrayList aldata;
        private ArrayList altest;

        private ArrayList trinputs;
        private ArrayList troutputs;
        private ArrayList teinputs;
        private ArrayList teoutputs;

        //private Scaler scale;

        private double Niterations;
        private double Ninputs;
        private double Noutputs;
        #endregion

        #region constructor
        /********************\
        |*   CONSTRUCTOR    *|
        \********************/
        public DatasetParser(RichTextBox txtData, RichTextBox txtTest)
        {
            //Data init
            bool Verified = true;
            char[] delimiterChars = { '\n', '\r', '\t', ' ', ',' };

            aldata = new ArrayList();
            altest = new ArrayList();
            trinputs = new ArrayList();
            troutputs = new ArrayList();
            teinputs = new ArrayList();
            teoutputs = new ArrayList();

            //Strip carriage returns and trim
            txtData.Text = txtData.Text.Replace("\r", "");
            txtTest.Text = txtTest.Text.Replace("\r", "");
            txtData.Text = txtData.Text.Trim();
            txtTest.Text = txtTest.Text.Trim();

            //Split at the chars specified above
            data = txtData.Text.Split(delimiterChars);
            test = txtTest.Text.Split(delimiterChars);

            //Get the number of iterations, number of inputs
            //and number of outputs
            Niterations = System.Convert.ToDouble(data[0]);
            Ninputs = System.Convert.ToDouble(data[1]);
            Noutputs = System.Convert.ToDouble(data[2]);

            //parse the data into appropriately
            //structured arrays
            Verified = parseTrainingData();
            Verified = parseTestingData();

            _Verified = Verified;
        }
        #endregion

        #region internal_functions
        /********************\
        |*     INTERNAL     *|
        \********************/
        private bool parseTrainingData()
        {
            bool success = true;
            try
            {
                for (int i = 3; i < data.Length; i++)
                {
                    aldata.Add(System.Convert.ToDouble(data[i]));
                }

                while (aldata.Count > 0)
                {
                    for (int i = 0; i < Ninputs; i++)
                    {
                        trinputs.Add(aldata[0]);
                        aldata.RemoveAt(0);
                    }
                    for (int j = 0; j < Noutputs; j++)
                    {
                        troutputs.Add(aldata[0]);
                        aldata.RemoveAt(0);
                    }
                }
            }
            catch
            {
                MessageBox.Show("Error parsing training data", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }
            return success;
        }

        private bool parseTestingData()
        {
            bool success = true;
            try
            {
                //Tokens from the testing box this time
                for (int i = 3; i < test.Length; i++)
                {
                    altest.Add(System.Convert.ToDouble(test[i]));
                }

                while (altest.Count > 0)
                {
                    for (int i = 0; i < Ninputs; i++)
                    {
                        teinputs.Add(altest[0]);
                        altest.RemoveAt(0);
                    }
                    for (int j = 0; j < Noutputs; j++)
                    {
                        teoutputs.Add(altest[0]);
                        altest.RemoveAt(0);
                    }
                }
            }
            catch
            {
                MessageBox.Show("Error parsing testing data", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }
            return success;
        }
        #endregion

        #region external_functions
        /********************\
        |*     EXTERNAL     *|
        \********************/
        public void GetStructure(out double iter,
                                 out double Ninp, out double Nout)
        {
            iter = Niterations;
            Ninp = Ninputs;
            Nout = Noutputs;
        }

        public void GetTrainingData
            (out double[][] trinputdata, out double[][] troutputdata)
        {
            trinputdata = new double[trinputs.Count / (int)Ninputs][];
            for (int i = 0; i < (trinputs.Count / (int)Ninputs); i++)
                trinputdata[i] = new double[(int)Ninputs];

            troutputdata = new double[troutputs.Count / (int)Noutputs][];
            for (int i = 0; i < (troutputs.Count / (int)Noutputs); i++)
                troutputdata[i] = new double[(int)Noutputs];

            for (int j = 0, l = 0; j < trinputs.Count; j += (int)Ninputs, l++)
            {
                for (int k = 0; k < Ninputs; k++)
                {
                    trinputdata[l][k] = (double)trinputs[j + k];
                }
            }

            for (int j = 0, l = 0; j < troutputs.Count; j += (int)Noutputs, l++)
            {
                for (int k = 0; k < Noutputs; k++)
                {
                    troutputdata[l][k] = (double)troutputs[j + k];
                }
            }
        }

        public void GetTestingData
            (out double[][] teinputdata, out double[][] teoutputdata)
        {
            teinputdata = new double[teinputs.Count / (int)Ninputs][];
            for (int i = 0; i < (teinputs.Count / (int)Ninputs); i++)
                teinputdata[i] = new double[(int)Ninputs];

            teoutputdata = new double[teoutputs.Count / (int)Noutputs][];
            for (int i = 0; i < (teoutputs.Count / (int)Noutputs); i++)
                teoutputdata[i] = new double[(int)Noutputs];

            for (int j = 0, l = 0; j < teinputs.Count; j += (int)Ninputs, l++)
            {
                for (int k = 0; k < Ninputs; k++)
                {
                    teinputdata[l][k] = (double)teinputs[j + k];
                }
            }

            for (int j = 0, l = 0; j < teoutputs.Count; j += (int)Noutputs, l++)
            {
                for (int k = 0; k < Noutputs; k++)
                {
                    teoutputdata[l][k] = (double)teoutputs[j + k];
                }
            }
        }

        public void GetTestingDataAL
            (out ArrayList teinputdata, out ArrayList teoutputdata)
        {
            teinputdata = teinputs;
            teoutputdata = teoutputs;
        }

        public bool Verified { get { return _Verified; } }
        #endregion
    }
}
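DatasetParser expects the raw text to begin with three header values (the iteration/pair count, the number of inputs and the number of outputs), followed by the samples, inputs first and then outputs. As an illustration, a plausible XOR training set in this layout would be:

4 2 1
0 0   0
0 1   1
1 0   1
1 1   0

Here data[0] = 4, data[1] = 2 and data[2] = 1, and each subsequent row is consumed as two inputs followed by one output. Any whitespace or comma layout is equivalent, since the text is tokenised with the delimiter set shown in the constructor.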

9.1.2.3 - “Algorithms\TrainingAlgorithm.cs”

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using _05025397.Controller;
using ZedGraph;

namespace _05025397.Controller.Algorithms
{
    public abstract class TrainingAlgorithm
    {
        #region data
        /********************\
        |*       DATA       *|
        \********************/
        private double _TargetError;
        private int _IterMax;
        private int _ReportInterval;
        private string _ReportData;

        protected PointPairList BestErrList;
        protected int CurIt;

        protected bool _Trained;
        protected bool _Tested;
        protected bool _ITerr;

        protected Random Rand = new Random();
        protected DatasetParser getdata;
        #endregion

        #region constructor
        /********************\
        |*   CONSTRUCTOR    *|
        \********************/
        public TrainingAlgorithm(double Target_Error, int Reports,
                                 int Iter_Max,
                                 RichTextBox txtData, RichTextBox txtTest,
                                 bool ITerr)
        {
            _TargetError = Target_Error;
            _ReportInterval = Reports;
            _IterMax = Iter_Max;
            _ITerr = ITerr;

            BestErrList = new PointPairList();
            getdata = new DatasetParser(txtData, txtTest);

            //If the ignore target error is true
            //then set the target error to -1.00;
            //this is impossible to reach, so the
            //network will loop to completion.
            if (_ITerr)
                _TargetError = -1.00;
        }
        #endregion

        #region internal
        /********************\
        |*     INTERNAL     *|
        \********************/
        protected void Report(string output)
        {
            _ReportData += output;
        }
        #endregion

        #region external
        /********************\
        |*     EXTERNAL     *|
        \********************/
        public abstract bool Train();
        public abstract bool Test();
        public abstract string network_details();

        public PointPairList getGraphData()
        {
            return BestErrList;
        }

        public double TargetError
        {
            set { _TargetError = value; }
            get { return _TargetError; }
        }

        public int ReportInterval
        {
            get { return _ReportInterval; }
        }

        public int IterMax
        {
            get { return _IterMax; }
        }

        public bool Trained
        {
            get { return _Trained; }
            set { _Trained = value; }
        }

        public bool Tested
        {
            get { return _Tested; }
        }

        public string ReportData
        {
            get { return _ReportData; }
        }
        #endregion
    }
}
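TrainingAlgorithm is a template for the three concrete trainers that follow: a subclass supplies Train, Test and network_details, and inherits the report buffer, the best-error list used for graphing and the shared DatasetParser. A minimal subclass would look like the sketch below; NullTrainer is a hypothetical name used purely for illustration.

//Sketch only: the smallest possible trainer plugging into the
//hierarchy above. Only the base-class API is taken from the listing.
public class NullTrainer : TrainingAlgorithm
{
    public NullTrainer(double targetError, int reports, int iterMax,
                       RichTextBox txtData, RichTextBox txtTest, bool iTerr)
        : base(targetError, reports, iterMax, txtData, txtTest, iTerr)
    {
    }

    public override bool Train()
    {
        Report("NullTrainer: no training performed.\n");
        _Trained = true;            //protected flag from the base class
        return true;
    }

    public override bool Test()
    {
        _Tested = true;
        return true;
    }

    public override string network_details()
    {
        return "NullTrainer (does nothing)";
    }
}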

9.1.2.4 - “Algorithms\CascadeCorrelation.cs”

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using _05025397.Controller;
using FANN_Wrapper;
using ZedGraph;

namespace _05025397.Controller.Algorithms
{
    public class CascadeCorrelation : TrainingAlgorithm
    {
        #region data
        /********************\
        |*       DATA       *|
        \********************/
        private double _InputL;
        private double _OutputL;
        private double _LearnRate;

        FANN_Cascade CCNet;
        #endregion

        #region constructor
        /********************\
        |*   CONSTRUCTOR    *|
        \********************/
        public CascadeCorrelation(double LearnRate, double TargetError,
                                  int Reports, int Iter_Max, bool ITerr,
                                  RichTextBox txtData, RichTextBox txtTest)
            : base(TargetError, Reports, Iter_Max, txtData, txtTest, ITerr)
        {
            _LearnRate = LearnRate;

            double iterations;
            getdata.GetStructure(out iterations, out _InputL, out _OutputL);

            CCNet = new FANN_Cascade((int)_InputL, (int)_OutputL,
                                     _LearnRate, base.TargetError,
                                     base.ReportInterval, base.IterMax);

            //Copy the training and testing data to the dat files
            //ready for the DLLs to access.
            saveDllData(txtData, txtTest);
        }
        #endregion

        #region internal_functions
        /********************\
        |*     INTERNAL     *|
        \********************/
        private void saveDllData(RichTextBox tr, RichTextBox te)
        {
            TextWriter tw = new StreamWriter("dlltrdata.dat");
            tw.Write(tr.Text);
            tw.Close();

            tw = new StreamWriter("dlltedata.dat");
            tw.Write(te.Text);
            tw.Close();
        }
        #endregion

        #region external_functions
        /********************\
        |*     EXTERNAL     *|
        \********************/
        public override string network_details()
        {
            StringBuilder output = new StringBuilder();

            output.Append(
                string.Format
                ("Network Details: \n Input Layer: {0:d}", (int)_InputL));
            output.Append(
                string.Format
                ("\t\t\t\t\tOutput Layer: {0:d}", (int)_OutputL));
            output.Append(
                string.Format
                ("\nLearning Rate: {0:0.00000}", _LearnRate));
            output.Append(
                string.Format
                ("\t\t\tTarget Error: {0:0.00000}", base.TargetError));
            output.Append(
                string.Format
                ("\t\tMaximum Nodes: {0:d}", base.IterMax));
            output.Append(
                string.Format
                ("\nReport every {0:d} generations.", base.ReportInterval));
            output.Append(
                string.Format
                ("\t\t\t\t\t\tIgnore target error " +
                 "(process all iterations): {0}", _ITerr));

            return output.ToString();
        }

        public override bool Train()
        {
            bool success = true;
            try
            {
                success = CCNet.Train();
                Report(CCNet._tr_output);
            }
            catch
            {
                MessageBox.Show("Error running Cascade training!", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }

            success = getGraphDataFromDll();
            base._Trained = success;
            return success;
        }

        public bool getGraphDataFromDll()
        {
            bool success = true;
            try
            {
                double[] besterrlist = CCNet.besterrorlist;
                for (int i = 0; i < besterrlist.Length; i++)
                {
                    base.BestErrList.Add((double)i, besterrlist[i]);
                }
            }
            catch
            {
                MessageBox.Show("Error getting graph data from DLL", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }
            return success;
        }

        public override bool Test()
        {
            bool success = true;
            try
            {
                success = CCNet.Test();
                Report(CCNet._te_output);
            }
            catch
            {
                MessageBox.Show("Error running Cascade testing!", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }

            base._Tested = success;
            return success;
        }
        #endregion
    }
}
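saveDllData hands the training and testing text to the unmanaged FANN code by writing it to dlltrdata.dat and dlltedata.dat, which the wrapper DLL reads back from disk. A slightly more defensive version of the same method (a sketch, not the project source) would wrap the writers in using-blocks so the files are closed even if Write throws:

//Sketch only: equivalent in effect to saveDllData above, but the
//StreamWriters are disposed automatically on any exit path.
private void saveDllData(RichTextBox tr, RichTextBox te)
{
    using (TextWriter tw = new StreamWriter("dlltrdata.dat"))
    {
        tw.Write(tr.Text);
    }
    using (TextWriter tw = new StreamWriter("dlltedata.dat"))
    {
        tw.Write(te.Text);
    }
}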

9.1.2.5 - “Algorithms\GeneticAlgorithm.cs”

using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using _05025397.Controller;
using ZedGraph;

namespace _05025397.Controller.Algorithms
{
    public class GeneticAlgorithm : TrainingAlgorithm
    {
        #region data
        /********************\
        |*       DATA       *|
        \********************/
        //GA Data
        private int _PopSize;
        private int _Crossover;
        private int _Mutation;
        private int _HiddenL;

        //Network Data
        private double _InputL;
        private double _OutputL;

        //Population
        ArrayList Population;
        #endregion

        #region constructor
        /********************\
        |*   CONSTRUCTOR    *|
        \********************/
        public GeneticAlgorithm(int PopSize, int Crossover, int Mutation,
                                int HiddenL, double TargetError, int Reports,
                                int Iter_Max, bool ITerr,
                                RichTextBox txtData, RichTextBox txtTest)
            : base(TargetError, Reports, Iter_Max, txtData, txtTest, ITerr)
        {
            _PopSize = PopSize;
            _Crossover = Crossover;
            _Mutation = Mutation;
            _HiddenL = HiddenL;

            //Init Network params
            double iterations;
            getdata.GetStructure(out iterations, out _InputL, out _OutputL);

            //Create new population references
            Population = new ArrayList(_PopSize);
            for (int i = 0; i < _PopSize; i++)
            {
                Population.Add
                    (new Model.GANetwork(
                        (int)_InputL, _HiddenL, (int)_OutputL,
                        Rand.Next()));
            }
        }
        #endregion

        #region internal_functions
        /********************\
        |*     INTERNAL     *|
        \********************/
        private bool run()
        {
            bool success = true;
            double[][] inputs;
            double[][] results;
            getdata.GetScaledTrainingData(out inputs, out results);

            for (int i = 0; i < base.IterMax; i++)
            {
                //Get the msqe
                for (int j = 0; j < _PopSize; j++)
                {
                    ((Model.GANetwork)Population[j]).getMsQE(inputs, results);
                }

                //Sort the functions according to fitness now.
                success = sort_fitnesses();

                //Report if we're at a report iteration
                //Also update the besterrlist (for graphing)
                if (base.CurIt == ReportInterval)
                {
                    base.Report("Best error at iteration [ " + i + " ] was "
                                + ((Model.GANetwork)Population[0]).MSqE + ".\n");
                    base.CurIt = 0;
                    base.BestErrList.Add
                        ((double)i, ((Model.GANetwork)Population[0]).MSqE);
                }
                base.CurIt++;

                //Check for having reached the target
                if (((Model.GANetwork)Population[0]).MSqE <= base.TargetError)
                {
                    Report("\nNetwork matching or improving upon target error " +
                           "found at iteration [ " + i + " ].");
                    break;
                }

                success = build_generation();
            }

            return success;
        }

        private bool sort_fitnesses()
        {
            bool success = true;
            try
            {
                Population.Sort();
            }
            catch
            {
                MessageBox.Show("Error sorting population", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }
            return success;
        }

        private bool build_generation()
        {
            bool success = true;
            int selection1, selection2;
            double[] weightarray1;
            double[] weightarray2;
            double[] weightarray_f;

            for (int i = (_PopSize - 1); i > (int)(_PopSize / 2); i--)
            {
                //This algorithm is altered from previous versions.
                //It selects two points from the start of the population,
                //then applies crossover if necessary (based on probability),
                //applies mutation if necessary (same again),
                //then if neither mutation or Xover have been applied
                //chooses position 1 or 2.
                //Then it replaces the last position in the population
                //(ie. the least fit) with this new string.
                //Next iteration it replaces the next to last, and so on,
                //all the way to the middle of the population.
                selection1 = Rand.Next((int)_PopSize / 2);
                selection2 = Rand.Next((int)_PopSize / 2);

                weightarray1 =
                    ((Model.GANetwork)Population[selection1]).getWeights();
                weightarray2 =
                    ((Model.GANetwork)Population[selection2]).getWeights();

                //Just a quick error check
                if (weightarray1.Length != weightarray2.Length)
                {
                    MessageBox.Show("Error: 2D weight array length unequal.",
                                    "Error",
                                    MessageBoxButtons.OK, MessageBoxIcon.Error);
                    success = false;
                }

                try
                {
                    weightarray_f = new double[weightarray1.Length];

                    if (Rand.Next(100) < _Crossover)
                    {
                        //Choose a random position in the weight array
                        //to be our crossover point
                        int XPos = Rand.Next(1, (weightarray1.Length - 1));

                        //Simple single point crossover of the weights
                        for (int j = 0; j < weightarray_f.Length; j++)
                        {
                            if (j < XPos)
                            {
                                weightarray_f[j] = weightarray1[j];
                            }
                            else
                            {
                                weightarray_f[j] = weightarray2[j];
                            }
                        }
                    }

                    if (Rand.Next(100) < _Mutation)
                    {
                        //Jiggle the weights in the array
                        //up or down by 0.5
                        for (int j = 0; j < weightarray1.Length; j++)
                        {
                            weightarray_f[j] += Rand.NextDouble() - 0.5;
                        }
                    }

                    //Set the weights of the current member of the
                    //population we're on
                    ((Model.GANetwork)Population[i]).setWeights(weightarray_f);
                }
                catch
                {
                    MessageBox.Show("Error building new generation", "Error",
                                    MessageBoxButtons.OK, MessageBoxIcon.Error);
                    success = false;
                }
            }
            return success;
        }
        #endregion

        #region external_functions
        /********************\
        |*     EXTERNAL     *|
        \********************/
        public override string network_details()
        {
            //Add stringbuilder for efficiency here
            StringBuilder output = new StringBuilder();

            output.Append(
                string.Format("Network Details: \n Population: {0:d}", _PopSize));
            output.Append(
                string.Format("\t\t\tCrossover Probability: {0:d}", _Crossover));
            output.Append(
                string.Format("\t\t\tMutation Probability: {0:d}", _Mutation));
            output.Append(
                string.Format("\nHidden layer: {0:d}", _HiddenL));
            output.Append(
                string.Format("\t\t\tTarget Error: {0:0.00000}", base.TargetError));
            output.Append(
                string.Format("\t\t\tMaximum Generation: {0:d}", base.IterMax));
            output.Append(
                string.Format("\nReport every {0:d} generations.",
                              base.ReportInterval));
            output.Append(
                string.Format("\t\t\t\t\t\t\tIgnore target error " +
                              "(process all iterations): {0}", _ITerr));

            return output.ToString();
        }

        public override bool Train()
        {
            bool success = true;
            success = run();
            base._Trained = success;
            return success;
        }

        public override bool Test()
        {
            bool success = true;
            double[][] tstinput;
            double[][] tstoutput;
            double[] netoutput;
            StringBuilder output = new StringBuilder();

            getdata.GetScaledTestingData(out tstinput, out tstoutput);

            try
            {
                output.Append("\n\n\t\t\t~~Network Testing~~");
                for (int i = 0; i < tstinput.Length; i++)
                {
                    ((Model.GANetwork)Population[0]).Run(tstinput[i]);
                    netoutput = ((Model.GANetwork)Population[0]).GetOutput();

                    output.Append("\nWith inputs:");
                    for (int j = 0; j < tstinput[i].Length; j++)
                    {
                        output.Append(string.Format(" [{0:g}] ", tstinput[i][j]));
                    }

                    output.Append("\nOutput achieved was:");
                    for (int j = 0; j < netoutput.Length; j++)
                    {
                        output.Append(string.Format(" [{0:g}] ", netoutput[j]));
                    }

                    output.Append("\nOutput Desired was:");
                    for (int j = 0; j < tstoutput[i].Length; j++)
                    {
                        output.Append(string.Format(" [{0:g}] ", tstoutput[i][j]));
                    }
                }

                output.Append("\nMean Squared Error with these inputs and outputs is:");
                ((Model.GANetwork)Population[0]).getMsQE(tstinput, tstoutput);
                output.Append(
                    string.Format
                    ("{0:g}", (double)((Model.GANetwork)Population[0]).MSqE));

                base.Report(output.ToString());
            }
            catch
            {
                MessageBox.Show("Error Running Test.", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }

            base._Tested = success;
            return success;
        }
        #endregion
    }
}
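Stripped of the population bookkeeping, build_generation applies two operators, each gated by its percentage probability: single-point crossover of the parents' weight arrays and a mutation that jiggles every weight by up to ±0.5. The standalone program below (illustrative, not project source) shows the two operators in isolation, applied unconditionally for brevity:

using System;

class CrossoverDemo
{
    static void Main()
    {
        Random rand = new Random();
        double[] parent1 = { 0.1, 0.2, 0.3, 0.4, 0.5 };
        double[] parent2 = { 0.9, 0.8, 0.7, 0.6, 0.5 };
        double[] child = new double[parent1.Length];

        //Single point crossover: genes before xPos come from parent1,
        //the rest from parent2 (cf. build_generation above).
        int xPos = rand.Next(1, parent1.Length - 1);
        for (int j = 0; j < child.Length; j++)
            child[j] = (j < xPos) ? parent1[j] : parent2[j];

        //Mutation: shift every weight up or down by at most 0.5.
        for (int j = 0; j < child.Length; j++)
            child[j] += rand.NextDouble() - 0.5;

        foreach (double w in child)
            Console.Write("{0:0.000} ", w);
        Console.WriteLine();
    }
}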

9.1.2.6 - “Algorithms\BackProp.cs”

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using _05025397.Controller;
using FANN_Wrapper;
using ZedGraph;

namespace _05025397.Controller.Algorithms
{
    public class BackProp : TrainingAlgorithm
    {
        #region data
        /********************\
        |*       DATA       *|
        \********************/
        private double _InputL;
        private double _OutputL;
        private int _HiddenL;
        private double _LearnRate;

        FANN_BackProp BPnet;
        #endregion

        #region constructor
        /********************\
        |*   CONSTRUCTOR    *|
        \********************/
        public BackProp(int HiddenL, double LearnRate, double TargetError,
                        int Reports, int Iter_Max, bool ITerr,
                        RichTextBox txtData, RichTextBox txtTest)
            : base(TargetError, Reports, Iter_Max, txtData, txtTest, ITerr)
        {
            _HiddenL = HiddenL;
            _LearnRate = LearnRate;

            double iterations;
            getdata.GetStructure(out iterations, out _InputL, out _OutputL);

            BPnet = new FANN_BackProp((int)_InputL, HiddenL, (int)_OutputL,
                                      _LearnRate, base.TargetError,
                                      base.ReportInterval, base.IterMax);

            //Copy the training and testing data to the dat files
            //ready for the DLL's to access.
            saveDllData(txtData, txtTest);
        }
        #endregion

        #region internal_functions
        /********************\
        |*     INTERNAL     *|
        \********************/
        private void saveDllData(RichTextBox tr, RichTextBox te)
        {
            TextWriter tw = new StreamWriter("dlltrdata.dat");
            tw.Write(tr.Text);
            tw.Close();

            tw = new StreamWriter("dlltedata.dat");
            tw.Write(te.Text);
            tw.Close();
        }
        #endregion

        #region external_functions
        /********************\
        |*     EXTERNAL     *|
        \********************/
        public override string network_details()
        {
            StringBuilder output = new StringBuilder();

            output.Append(
                string.Format
                ("Network Details: \n Input Layer: {0:d}", (int)_InputL));
            output.Append(
                string.Format
                ("\t\t\tOutput Layer: {0:d}", (int)_OutputL));
            output.Append(
                string.Format
                ("\t\t\tHidden Layer: {0:d}", (int)_HiddenL));
            output.Append(
                string.Format
                ("\nLearning Rate: {0:0.00000}", _LearnRate));
            output.Append(
                string.Format
                ("\t\t\tTarget Error: {0:0.00000}", base.TargetError));
            output.Append(
                string.Format
                ("\t\tMaximum Epoch: {0:d}", base.IterMax));
            output.Append(
                string.Format
                ("\nReport every {0:d} generations.", base.ReportInterval));
            output.Append(
                string.Format
                ("\t\t\t\t\t\tIgnore target error " +
                 "(process all iterations): {0}", _ITerr));

            return output.ToString();
        }

        public override bool Train()
        {
            bool success = true;
            try
            {
                success = BPnet.Train();
                Report(BPnet._tr_output);
            }
            catch
            {
                MessageBox.Show("Error running Backprop training!", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }

            success = getGraphDataFromDll();
            base._Trained = success;
            return success;
        }

        public bool getGraphDataFromDll()
        {
            bool success = true;
            try
            {
                double[] besterrlist = BPnet.besterrorlist;
                for (int i = 0; i < besterrlist.Length; i++)
                {
                    base.BestErrList.Add((double)i, besterrlist[i]);
                }
            }
            catch
            {
                MessageBox.Show("Error getting graph data from DLL", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }
            return success;
        }

        public override bool Test()
        {
            bool success = true;
            try
            {
                success = BPnet.Test();
                Report(BPnet._te_output);
            }
            catch
            {
                MessageBox.Show("Error running Backprop testing!", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }

            base._Tested = success;
            return success;
        }
        #endregion
    }
}
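With BackProp in place, all three trainers (GeneticAlgorithm, BackProp and CascadeCorrelation) share the TrainingAlgorithm base type, so the GUI can drive whichever one the wizard configured without knowing its concrete class. The fragment below is a sketch of that polymorphic use; the helper method is illustrative and not part of the project source.

//Sketch only: drive any trainer through the abstract base interface.
private static string runTrainer(TrainingAlgorithm trainer)
{
    if (trainer.Train())    //fills the report text and best-error list
    {
        trainer.Test();     //appends the testing results to the report
    }
    //Accumulated Report() text; trainer.getGraphData() exposes the
    //ZedGraph PointPairList for plotting the error curve.
    return trainer.ReportData;
}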

9.1.3 - Model

9.1.3.1 - “Network.cs”

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace _05025397.Model
{
    public class Network
    {
        #region data
        /********************\
        |*       DATA       *|
        \********************/
        //[layer][node][input]
        protected double[][][] _inputs;
        protected double[][][] _weights;
        protected double[][] _outputs;

        protected int _numinputs;
        protected int _hiddennodes;
        protected int _numoutputs;

        protected Random Rand;
        #endregion

        #region init
        /********************\
        |*       INIT       *|
        \********************/
        private void init_inputlayer()
        {
            _inputs[0] = new double[_numinputs][];
            _outputs[0] = new double[_numinputs];
            _weights[0] = new double[_numinputs][];
            for (int i = 0; i < _numinputs; i++)
            {
                _inputs[0][i] = new double[2];
                _weights[0][i] = new double[2];
            }
        }

        private void init_hiddenlayer()
        {
            _inputs[1] = new double[_hiddennodes][];
            _outputs[1] = new double[_hiddennodes];
            _weights[1] = new double[_hiddennodes][];
            for (int i = 0; i < _hiddennodes; i++)
            {
                _inputs[1][i] = new double[_numinputs + 1];
                _weights[1][i] = new double[_numinputs + 1];
            }
        }

        private void init_outputlayer()
        {
            _inputs[2] = new double[_numoutputs][];
            _outputs[2] = new double[_numoutputs];
            _weights[2] = new double[_numoutputs][];
            for (int i = 0; i < _numoutputs; i++)
            {
                _inputs[2][i] = new double[_hiddennodes + 1];
                _weights[2][i] = new double[_hiddennodes + 1];
            }
        }

        private void init_bias()
        {
            //Input bias
            for (int i = 0; i < _numinputs; i++)
            {
                _inputs[0][i][1] = 1.0;
                _weights[0][i][1] = 1.0;
            }

            //Hidden layer bias
            for (int i = 0; i < _hiddennodes; i++)
            {
                _inputs[1][i][_numinputs] = 1.0;
                _weights[1][i][_numinputs] = 1.0;
            }

            //Output layer bias
            for (int i = 0; i < _numoutputs; i++)
            {
                _inputs[2][i][_hiddennodes] = 1.0;
                _weights[2][i][_hiddennodes] = 1.0;
            }
        }

        private void init_inputweights()
        {
            for (int i = 0; i < _numinputs; i++)
            {
                _weights[0][i][0] = Rand.NextDouble();
            }
        }

        private void init_hiddenweights()
        {
            for (int i = 0; i < _hiddennodes; i++)
            {
                for (int j = 0; j < _numinputs; j++)
                {
                    _weights[1][i][j] = Rand.NextDouble();
                }
            }
        }

        private void init_outputweights()
        {
            for (int i = 0; i < _numoutputs; i++)
            {
                for (int j = 0; j < _hiddennodes; j++)
                {
                    _weights[2][i][j] = Rand.NextDouble();
                }
            }
        }
        #endregion

        #region internal
        /********************\
        |*     INTERNAL     *|
        \********************/
        private double sigmoid(double input)
        {
            return (1 / (1 + Math.Pow(Math.E, -input)));
        }

        private bool Runinput()
        {
            bool success = true;
            double sum = 0.0;
            try
            {
                //Calculate results for this layer
                for (int i = 0; i < _numinputs; i++)
                {
                    for (int j = 0; j < 2; j++)
                    {
                        sum += (_inputs[0][i][j] * _weights[0][i][j]);
                    }
                    _outputs[0][i] = sigmoid(sum);
                    sum = 0.0;
                }
            }
            catch
            {
                MessageBox.Show("Error processing input layer", "Error",
                                MessageBoxButtons.OK, MessageBoxIcon.Error);
                success = false;
            }
            return success;
        }

        private bool Runhidden()
        {
            bool success = true;
            double sum = 0.0;
            try
            {
                //Feed forward the results from input layer
                for (int i = 0; i < _hiddennodes; i++)
                {
                    for (int j = 0; j < _numinputs; j++)
                    {
                        _inputs[1][i][j] = _outputs[0][j];

sum = 0. "Error".OK.0. j++) { sum += (_inputs[1][i][j] * _weights[1][i][j]). } } catch { MessageBox. } return success. j < _numinputs + 1. MessageBoxButtons.Show("Error processing hidden layer".Error).0. } private bool Runoutput() { bool success = true. MessageBoxIcon. i++) { for (int j = 0. } _outputs[1][i] = sigmoid(sum). success = false. try Page 164 of 225 . double sum = 0. i < _hiddennodes.William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 00000175 00000176 00000177 00000178 00000179 00000180 00000181 00000182 00000183 00000184 00000185 00000186 00000187 00000188 00000189 00000190 00000191 00000192 00000193 00000194 00000195 00000196 00000197 00000198 00000199 00000200 00000201 00000202 00000203 00000204 } } //Calculate results for this layer for (int i = 0.

j < _hiddennodes + 1. i++) { for (int j = 0. i < _numoutputs.William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 00000205 00000206 00000207 00000208 00000209 00000210 00000211 00000212 00000213 00000214 00000215 00000216 00000217 00000218 00000219 00000220 00000221 00000222 00000223 00000224 00000225 00000226 00000227 00000228 00000229 00000230 00000231 00000232 00000233 00000234 { //Feed forward the results from hidden layer for (int i = 0. } } //Calculate results for this layer for (int i = 0. sum = 0. success = false. i < _numoutputs. } _outputs[2][i] = sigmoid(sum). } Page 165 of 225 .Show("Error processing output layer". } return success. MessageBoxButtons. } } catch { MessageBox. i++) { for (int j = 0.OK. j++) { sum += (_inputs[2][i][j] * _weights[2][i][j]). j < _hiddennodes. MessageBoxIcon. "Error".0.Error). j++) { _inputs[2][i][j] = _outputs[1][j].

MessageBoxIcon. } if (success) for (int i = 0. Page 166 of 225 .William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 00000235 00000236 00000237 00000238 00000239 00000240 00000241 00000242 00000243 00000244 00000245 00000246 00000247 00000248 00000249 00000250 00000251 00000252 00000253 00000254 00000255 00000256 00000257 00000258 00000259 00000260 00000261 00000262 00000263 00000264 #endregion #region external /********************\ |* EXTERNAL *| \********************/ public bool Run(double[] inputs) { bool success = true. "Incorrect number of inputs". if(success) success = Runinput(). //The numbers of inputs must match up if (inputs. if(success) success = Runhidden().Length != _numinputs) { MessageBox. _inputs[0][i][0] = inputs[i].OK.the number of nodes corresponds //to the number of inputs accepted.Show("Error: Incorrect number of inputs supplied to NN". success = false.Error). i < _numinputs. MessageBoxButtons. i++) //Each input node has only one real input //and a bias .

_hiddennodes = hiddennodes. } #endregion #region constructor /********************\ |* CONSTRUCTOR *| \********************/ public Network(int numinputs. int numoutputs. int SetRandom) { //Set random number generator Rand = new Random(SetRandom). int hiddennodes. //Set network structure descriptors _numinputs = numinputs. _numoutputs = numoutputs. return success. } public double[] GetOutput() { //Return the outputs from the //output layer return _outputs[2].William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 00000265 00000266 00000267 00000268 00000269 00000270 00000271 00000272 00000273 00000274 00000275 00000276 00000277 00000278 00000279 00000280 00000281 00000282 00000283 00000284 00000285 00000286 00000287 00000288 00000289 00000290 00000291 00000292 00000293 00000294 if(success) success = Runoutput(). //We'll always have 3 layers Page 167 of 225 .

William Sayers

05025397

Genetic Algorithms and Neural Networks
Milestone 3
00000295 00000296 00000297 00000298 00000299 00000300 00000301 00000302 00000303 00000304 00000305 00000306 00000307 00000308 00000309 00000310 } //Input, Hidden, and Output _inputs = new double[3][][]; _weights = new double[3][][]; _outputs = new double[3][]; init_inputlayer(); init_hiddenlayer(); init_outputlayer(); init_bias(); init_inputweights(); init_hiddenweights(); init_outputweights(); } #endregion }
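To illustrate how this class is driven, a minimal sketch follows. It is not part of the project source: the constructor arguments and the input pattern are arbitrary example values, but the Network constructor, Run, and GetOutput calls are exactly those defined in the listing above.

    //Illustrative sketch only - example values, not project code.
    //A network with 2 inputs, 3 hidden nodes and 1 output,
    //seeded with an arbitrary random seed.
    Network net = new Network(2, 3, 1, 12345);

    //Present one XOR pattern and read back the single output.
    double[] pattern = new double[] { 1.0, 0.0 };
    if (net.Run(pattern))
    {
        double[] result = net.GetOutput();
        Console.WriteLine("Output: {0:0.00000}", result[0]);
    }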

9.1.3.2 - “GANetwork.cs”
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using _05025397;
using System.Collections;

//This class contains Genetic Algorithm specific functions
//in addition to the basic feed-forward neural net functionality.
namespace _05025397.Model
{
    public class GANetwork : Network, IComparable
    {
        #region data
        /********************\
        |* DATA *|
        \********************/
        double _MSqE;
        #endregion

        #region .NET
        /********************\
        |* .NET *|
        \********************/
        //Interface_Implementation
        //Return Value Meanings:
        //-<Zero: x < y
        //-Zero: x == y
        //->Zero: x > y
        public int CompareTo(object a)
        {
            GANetwork b;

            if (a is GANetwork)
            {
                b = a as GANetwork;
                return _MSqE.CompareTo(b.MSqE);
            }

            return 0;
        }
        #endregion

        #region external
        /********************\
        |* EXTERNAL *|
        \********************/
        public void getMsQE(double[][] inputs, double[][] outputs)
        {
            double sum = 0.0;
            int counter = 0;
            double[] netoutputs = new double[_numoutputs];

            for (int i = 0; i < inputs.Length; i++)
            {
                base.Run(inputs[i]);
                netoutputs = base.GetOutput();

                for (int j = 0; j < netoutputs.Length; j++)
                {
                    sum += (outputs[i][j] - netoutputs[j]) *
                           (outputs[i][j] - netoutputs[j]);
                    counter++;
                }
            }

            _MSqE = (sum / counter);
        }

        public double[] getWeights()
        {
            ArrayList collater = new ArrayList();

            //Collect all the weights in an array list
            //then spit them out as a 1D array of doubles
            try
            {
                for (int i = 0; i < _weights.Length; i++)
                {
                    for (int j = 0; j < _weights[i].Length; j++)
                    {
                        for (int k = 0; k < _weights[i][j].Length; k++)
                        {
                            collater.Add(_weights[i][j][k]);
                        }
                    }
                }
            }
            catch
            {
                MessageBox.Show("Fatal Error collating weights to 1D array",
                                "Error", MessageBoxButtons.OK,
                                MessageBoxIcon.Error);
                Application.Exit();
            }

            return (double[]) collater.ToArray(typeof(double));
        }

        public void setWeights(double[] weights)
        {
            //Take a 1D array of doubles and apply
            //to the correct positions in our weights
            //array
            try
            {
                int wc = 0;
                for (int i = 0; i < _weights.Length; i++)
                {
                    for (int j = 0; j < _weights[i].Length; j++)
                    {
                        for (int k = 0; k < _weights[i][j].Length; k++, wc++)
                        {
                            _weights[i][j][k] = weights[wc];
                        }
                    }
                }
            }
            catch
            {
                MessageBox.Show("Fatal Error adding 1D weight array to 3D weight array",
                                "Error", MessageBoxButtons.OK,
                                MessageBoxIcon.Error);
                Application.Exit();
            }
        }

        /********************\
        |* GETTERS/SETTERS *|
        \********************/
        public double MSqE
        {
            get { return _MSqE; }
            set { _MSqE = value; }
        }
        #endregion

        #region constructor
        /********************\
        |* CONSTRUCTOR *|
        \********************/
        public GANetwork(int Ninputs, int Nhnodes, int Noutputs, int SetRandom)
            : base(Ninputs, Nhnodes, Noutputs, SetRandom)
        {
            MSqE = 0.0;
        }
        #endregion
    }
}
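Since GANetwork implements IComparable by comparing mean squared error, a candidate population can be ranked with an ordinary sort. The following sketch is illustrative only: the population size and network dimensions are assumed values, and the recombination step is left out, but getMsQE, Sort/CompareTo, and getWeights are used exactly as defined above.

    //Illustrative sketch only - assumed population size and
    //network shape; the XOR patterns are the project's dataset.
    double[][] trainIn  = { new double[] { 0, 0 }, new double[] { 0, 1 },
                            new double[] { 1, 0 }, new double[] { 1, 1 } };
    double[][] trainOut = { new double[] { 0 }, new double[] { 1 },
                            new double[] { 1 }, new double[] { 0 } };

    List<GANetwork> population = new List<GANetwork>();
    for (int i = 0; i < 50; i++)
        population.Add(new GANetwork(2, 3, 1, i));

    //Score every candidate against the training set.
    foreach (GANetwork candidate in population)
        candidate.getMsQE(trainIn, trainOut);

    //List<T>.Sort falls back on CompareTo, so the lowest mean
    //squared error (the fittest network) ends up first.
    population.Sort();

    //The two best weight sets could now be recombined and written
    //into offspring networks via setWeights.
    double[] parentA = population[0].getWeights();
    double[] parentB = population[1].getWeights();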

9.1.4 - DLL Wrapper for FANN functions

9.1.4.1 - “FANN_Wrapper.h”

#pragma once
#include "UnManaged_FANN_BackProp.h"
#include "UnManaged_FANN_CasC.h"

using namespace System;
using namespace System::Runtime::InteropServices;

namespace FANN_Wrapper
{
    public ref class FANN_Cascade
    //Managed wrapper for UnManaged_FANN_CasC
    //which is a C++ interface to the C programmed
    //FANN dll.
    {
    public:
        /********************\
        |* DATA *|
        \********************/
        UnManaged_FANN_CasC* UMWrapper;

    public:
        /********************\
        |* EXTERNAL *|
        \********************/
        FANN_Cascade(int InputL, int OutputL,
                     double LearnRate, double TargetErr,
                     int ReportInterval, int MaxNeurons);
        ~FANN_Cascade(void);
        !FANN_Cascade(void);

        bool Train(void);
        bool Test(void);

    public:
        /********************\
        |* GETTERS/SETTERS *|
        \********************/
        property String^ _te_output
        {
            String^ get()
            {
                return gcnew String((wchar_t *) UMWrapper->get_report_test());
            }
        }

        property String^ _tr_output
        {
            String^ get()
            {
                return gcnew String((wchar_t *) UMWrapper->get_report_train());
            }
        }

        property array<double>^ besterrorlist
        {
            array<double>^ get()
            {
                int size;
                const double *ar = UMWrapper->get_besterrs(size);
                array<double>^ arout = gcnew array<double>(size);

                for (int i = 0; i < size; i++)
                {
                    arout[i] = ar[i];
                }

                return arout;
            }
        }
    };

    public ref class FANN_BackProp
    //Managed wrapper for UnManaged_FANN_Backprop
    //which is a C++ interface to the C programmed
    //FANN dll.
    {
    public:
        /********************\
        |* DATA *|
        \********************/
        UnManaged_FANN_Backprop* UMWrapper;

    public:
        /********************\
        |* EXTERNAL *|
        \********************/
        FANN_BackProp(int InputL, int HiddenL, int OutputL,
                      double LearnRate, double TargetErr,
                      int ReportInterval, int MaximumIteration);
        ~FANN_BackProp(void);
        !FANN_BackProp(void);

        bool Train(void);
        bool Test(void);

    public:
        /********************\
        |* GETTERS/SETTERS *|
        \********************/
        property String^ _te_output
        {
            String^ get()
            {
                return gcnew String((wchar_t *) UMWrapper->get_report_test());
            }
        }

        property String^ _tr_output
        {
            String^ get()
            {
                return gcnew String((wchar_t *) UMWrapper->get_report_train());
            }
        }

        property array<double>^ besterrorlist
        {
            array<double>^ get()
            {
                int size;
                const double *ar = UMWrapper->get_besterrs(size);
                array<double>^ arout = gcnew array<double>(size);

                for (int i = 0; i < size; i++)
                {
                    arout[i] = ar[i];
                }

                return arout;
            }
        }
    };
}
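Once the mixed-mode DLL is referenced, the ref classes above are consumed from C# like any managed type. A minimal sketch follows; the parameter values here are illustrative only, as in the application they come from the user interface.

    //Illustrative values: 2 inputs, 3 hidden nodes, 1 output,
    //learning rate 0.5, target error 0.01, report every 10
    //epochs, at most 1000 epochs.
    FANN_Wrapper.FANN_BackProp bp =
        new FANN_Wrapper.FANN_BackProp(2, 3, 1, 0.5, 0.01, 10, 1000);

    if (bp.Train())
    {
        string report = bp._tr_output;       //training report text
        double[] errors = bp.besterrorlist;  //best-error history
    }

    bp.Test();
    string testReport = bp._te_output;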


9.1.4.2 - “FANN_Wrapper.cpp”
#include "stdafx.h"
#include "FANN_Wrapper.h"

namespace FANN_Wrapper
{
    /********************\
    |* FANN_BackProp *|
    \********************/

    //Constructor
    FANN_BackProp::FANN_BackProp(int InputL, int HiddenL, int OutputL,
                                 double LearnRate, double TargetErr,
                                 int ReportInterval, int MaximumIteration)
    {
        UMWrapper = new UnManaged_FANN_Backprop(
                            InputL, HiddenL, OutputL,
                            LearnRate, TargetErr,
                            ReportInterval, MaximumIteration);
    }

    //Destructor
    FANN_BackProp::~FANN_BackProp(void)
    {
        if (UMWrapper)
        {
            delete UMWrapper;
            UMWrapper = 0;
        }
    }

    //Finalizer
    FANN_BackProp::!FANN_BackProp(void)
    {
        if (UMWrapper)
        {
            delete UMWrapper;
            UMWrapper = 0;
        }
    }

    //Train
    bool FANN_BackProp::Train(void)
    {
        return UMWrapper->Train();
    }

    //Test
    bool FANN_BackProp::Test(void)
    {
        return UMWrapper->Test();
    }

    /********************\
    |* FANN_Cascade *|
    \********************/

    //Constructor
    FANN_Cascade::FANN_Cascade(int InputL, int OutputL,
                               double LearnRate, double TargetErr,
                               int ReportInterval, int MaxNeurons)
    {
        UMWrapper = new UnManaged_FANN_CasC(
                            InputL, OutputL,
                            LearnRate, TargetErr,
                            ReportInterval, MaxNeurons);
    }

    //Destructor
    FANN_Cascade::~FANN_Cascade(void)
    {
        if (UMWrapper)
        {
            delete UMWrapper;
            UMWrapper = 0;
        }
    }

    //Finalizer
    FANN_Cascade::!FANN_Cascade(void)
    {
        if (UMWrapper)
        {
            delete UMWrapper;
            UMWrapper = 0;
        }
    }

    //Train
    bool FANN_Cascade::Train(void)
    {
        return UMWrapper->Train();
    }

    //Test
    bool FANN_Cascade::Test(void)
    {
        return UMWrapper->Test();
    }
}

9.1.4.3 - “UnManaged_FANN_BackProp.h”

#pragma once
#include <iostream>
#include <string>
#include <sstream>
#include <vector>
#include <doublefann.h>
#include <fann_cpp.h>

class UnManaged_FANN_Backprop
//An interface to the C programmed
//FANN dll.
{
private:
    /********************\
    |* DATA *|
    \********************/
    //Network Structure
    int _InputL;
    int _HiddenL;
    int _OutputL;

    //Training Parameters
    double _LearnRate;
    double _TargetErr;
    int _ReportInterval;
    int _MaximumIteration;
    double _Momentum;

    //Output for C#
    std::wstringstream tr_output;
    std::wstringstream te_output;
    wchar_t *wchoutput;

    std::vector<double> vecbsterr;
    double *bsterr;

    //FANN Data
    struct fann *ann;
    struct fann_train_data *data;

public:
    /********************\
    |* EXTERNAL *|
    \********************/
    //Constructor
    UnManaged_FANN_Backprop(int InputL, int HiddenL, int OutputL,
                            double LearnRate, double TargetErr,
                            int ReportInterval, int MaximumIteration);

    //Destructor
    ~UnManaged_FANN_Backprop(void);

    //Interface functions
    //accessed from C#
    bool Train(void);
    bool Test(void);

public:
    /********************\
    |* GETTERS/SETTERS *|
    \********************/
    const wchar_t* get_report_train(void);
    const wchar_t* get_report_test(void);
    const double* get_besterrs(int &size);
};

9.1.4.4 - “UnManaged_FANN_BackProp.cpp”

#include "StdAfx.h"
#include "UnManaged_FANN_BackProp.h"

//Constructor
UnManaged_FANN_Backprop::UnManaged_FANN_Backprop
    (int InputL, int HiddenL, int OutputL,
     double LearnRate, double TargetErr,
     int ReportInterval, int MaximumIteration)
{
    _InputL = InputL;
    _HiddenL = HiddenL;
    _OutputL = OutputL;
    _LearnRate = LearnRate;
    _TargetErr = TargetErr;
    _ReportInterval = ReportInterval;
    _MaximumIteration = MaximumIteration;

    ann = NULL;
    data = NULL;
    wchoutput = NULL;
    bsterr = NULL;
}

//Destructor
UnManaged_FANN_Backprop::~UnManaged_FANN_Backprop()
{
    fann_destroy_train(data);

    delete bsterr;
    delete wchoutput;
    bsterr = NULL;
    wchoutput = NULL;
}

//Train
bool UnManaged_FANN_Backprop::Train(void)
{
    static int firstrun = false;
    double error;
    int reportcounter = 0;
    bool success = true;

    ann = fann_create_standard(3, _InputL, _HiddenL, _OutputL);

    try
    {
        data = fann_read_train_from_file("dlltrdata.dat");
    }
    catch(...)
    {
        throw("");
        success = false;
    }

    if (fann_num_input_train_data(data) != _InputL)
    {
        throw("");
        success = false;
    }

    if (fann_num_output_train_data(data) != _OutputL)
    {
        throw("");
        success = false;
    }

    //Sigmoid Activation Functions (the same one
    //the GA uses).
    fann_set_activation_function_hidden(ann, FANN_SIGMOID);
    fann_set_activation_function_output(ann, FANN_SIGMOID);

    fann_set_activation_steepness_hidden(ann, 1);
    fann_set_activation_steepness_output(ann, 1);

    //Standard backprop
    fann_set_training_algorithm(ann, FANN_TRAIN_BATCH);

    //Set the learning rate
    fann_set_learning_rate(ann, (float) _LearnRate);

    //Same range the GA's weights are
    //initialised to
    fann_randomize_weights(ann, 0.0, 1.0);

    //Training Loop
    for (int i = 0; i < _MaximumIteration; i++)
    {
        //Train one epoch then check the
        //mean squared error
        error = (double) fann_train_epoch(ann, data);
        vecbsterr.push_back(error);

        if (error < _TargetErr)
        {
            tr_output << "\nNetwork matching or improving upon target error"
                      << "found at iteration [ " << i << " ].\n";
            i = _MaximumIteration + 1;
        }

        reportcounter++;
        if (reportcounter == _ReportInterval)
        {
            tr_output << "Best error at iteration [ " << i << " ] was "
                      << error << ".";
            reportcounter = 0;
        }
    }

    fann_destroy_train(data);
    firstrun = true;

    return true;
}

//Test
bool UnManaged_FANN_Backprop::Test(void)
{
    fann_type *calc_out = NULL;
    double *error = NULL;

    data = fann_read_train_from_file("dlltedata.dat");
    fann_scale_train_data(data, 0, 1);
    error = new double[fann_length_train_data(data)];

    te_output << "\n\n\t\t\t~~Network Testing~~";

    for (unsigned int i = 0; i < fann_length_train_data(data); i++)
    {
        calc_out = fann_run(ann, data->input[i]);

        te_output << "\nWith inputs";
        for (int j = 0; j < _InputL; j++)
        {
            te_output << " [" << data->input[i][j] << "] ";
        }

        te_output << "\nOutput achieved was";
        for (int j = 0; j < _OutputL; j++)
        {
            te_output << " [" << calc_out[j] << "] ";
        }

        te_output << "\nOutput Desired was";
        for (int j = 0; j < _OutputL; j++)
        {
            te_output << " [" << data->output[i][j] << "] ";
        }
    }

    te_output << "\nMean Squared Error with these inputs and outputs is:";
    te_output << fann_test_data(ann, data);

    delete error;
    delete calc_out;

    return true;
}

//get_report
const wchar_t* UnManaged_FANN_Backprop::get_report_train(void)
{
    if (wchoutput != NULL)
    {
        delete wchoutput;
        wchoutput = NULL;
    }

    wchoutput = new wchar_t[tr_output.str().length() + 1];
    wcscpy(wchoutput, tr_output.str().c_str());

    return wchoutput;
}

const wchar_t* UnManaged_FANN_Backprop::get_report_test(void)
{
    if (wchoutput != NULL)
    {
        delete wchoutput;
        wchoutput = NULL;
    }

    wchoutput = new wchar_t[te_output.str().length() + 1];
    wcscpy(wchoutput, te_output.str().c_str());

    return wchoutput;
}

//get_besterrs
const double* UnManaged_FANN_Backprop::get_besterrs(int &size)
{
    if (bsterr != NULL)
    {
        delete bsterr;
        bsterr = NULL;
    }

    bsterr = new double[vecbsterr.size()];
    size = vecbsterr.size();

    for (unsigned int i = 0; i < vecbsterr.size(); i++)
    {
        bsterr[i] = vecbsterr[i];
    }
    vecbsterr.clear();

    return bsterr;
}

9.1.4.5 - “UnManaged_FANN_CasC.h”

#pragma once
#include <iostream>
#include <string>
#include <sstream>
#include <vector>
#include <doublefann.h>

class UnManaged_FANN_CasC
{
public:
    /********************\
    |* DATA *|
    \********************/
    //Network Structure
    int _InputL;
    int _OutputL;

    //Training Parameters
    double _LearnRate;
    double _TargetErr;
    int _ReportInterval;
    int _MaxNeurons;

    //Output for C#
    static std::wstringstream tr_output;
    std::wstringstream te_output;
    wchar_t *wchoutput;

    static std::vector<double> vecbsterr;
    double *bsterr;

    //FANN Data
    struct fann *ann;
    struct fann_train_data *data;

public:
    /********************\
    |* EXTERNAL *|
    \********************/

    //Destructor
    ~UnManaged_FANN_CasC(void);

    //Constructor
    UnManaged_FANN_CasC
        (int InputL, int OutputL,
         double LearnRate, double TargetErr,
         int ReportInterval, int MaxNeurons);

    //Interface functions
    //accessed from C#
    bool Train(void);
    bool Test(void);

    static int FANN_API Report_Callback
        (struct fann *ann, struct fann_train_data *train,
         unsigned int max_epochs, unsigned int epochs_between_reports,
         float desired_error, unsigned int epochs);

public:
    /********************\
    |* GETTERS/SETTERS *|
    \********************/
    const wchar_t* get_report_train(void);
    const wchar_t* get_report_test(void);

    const double* get_besterrs(int &size);
};

9.1.4.6 - “UnManaged_FANN_CasC.cpp”

#include "StdAfx.h"
#include "UnManaged_FANN_CasC.h"

//Static variable declarations
std::wstringstream UnManaged_FANN_CasC::tr_output;
std::vector<double> UnManaged_FANN_CasC::vecbsterr;

//Constructor
UnManaged_FANN_CasC::UnManaged_FANN_CasC
    (int InputL, int OutputL,
     double LearnRate, double TargetErr,
     int ReportInterval, int MaxNeurons)
{
    _InputL = InputL;
    _OutputL = OutputL;
    _LearnRate = LearnRate;
    _TargetErr = TargetErr;
    _ReportInterval = ReportInterval;
    _MaxNeurons = MaxNeurons;

    ann = NULL;
    data = NULL;
    wchoutput = NULL;
    bsterr = NULL;
}

//Destructor
UnManaged_FANN_CasC::~UnManaged_FANN_CasC(void)
{
    delete bsterr;
    delete wchoutput;
    bsterr = NULL;
    wchoutput = NULL;

    fann_destroy_train(data);
}

//get_report
const wchar_t* UnManaged_FANN_CasC::get_report_train(void)
{
    if (wchoutput != NULL)
    {
        delete wchoutput;
        wchoutput = NULL;
    }

    wchoutput = new wchar_t[tr_output.str().length() + 1];
    wcscpy(wchoutput, tr_output.str().c_str());

    return wchoutput;
}

const wchar_t* UnManaged_FANN_CasC::get_report_test(void)
{
    if (wchoutput != NULL)
    {
        delete wchoutput;
        wchoutput = NULL;
    }

    wchoutput = new wchar_t[te_output.str().length() + 1];
    wcscpy(wchoutput, te_output.str().c_str());

    return wchoutput;
}

//get_besterrs
const double* UnManaged_FANN_CasC::get_besterrs(int &size)
{
    if (bsterr != NULL)
    {
        delete bsterr;
        bsterr = NULL;
    }

    bsterr = new double[vecbsterr.size()];
    size = vecbsterr.size();

    for (unsigned int i = 0; i < vecbsterr.size(); i++)
    {
        bsterr[i] = vecbsterr[i];
    }
    vecbsterr.clear();

    return bsterr;
}

//Train
bool UnManaged_FANN_CasC::Train()
{
    int reportcounter = 0;

    ann = fann_create_shortcut(2, _InputL, _OutputL);

    try
    {
        data = fann_read_train_from_file("dlltrdata.dat");
    }
    catch(...)
    {
        throw("");
    }

    if (fann_num_input_train_data(data) != _InputL)
        throw("");

    if (fann_num_output_train_data(data) != _OutputL)
        throw("");

    fann_set_learning_rate(ann, (float) _LearnRate);

    //Some more network customisation here
    //might be nice in future.

    fann_set_quickprop_decay(ann, 0.0);
    fann_set_quickprop_mu(ann, 2.0);

    fann_set_cascade_weight_multiplier(ann, 1);
    fann_set_cascade_max_out_epochs(ann, 150);

    fann_set_activation_steepness_output(ann, 1);
    fann_set_activation_steepness_hidden(ann, 1);

    fann_set_training_algorithm(ann, FANN_TRAIN_QUICKPROP);

    fann_set_activation_function_hidden(ann, FANN_SIGMOID);
    fann_set_activation_function_output(ann, FANN_SIGMOID);

    fann_randomize_weights(ann, 0.0, 1.0);

    fann_set_cascade_output_change_fraction(ann, 0.01f);
    fann_set_cascade_candidate_change_fraction(ann, 0.01f);
    fann_set_train_stop_function(ann, FANN_STOPFUNC_MSE);

    fann_set_callback(ann, UnManaged_FANN_CasC::Report_Callback);

    fann_cascadetrain_on_data(ann, data, _MaxNeurons,
                              _ReportInterval, (float) _TargetErr);

    return true;
}

//Callback for reporting
int FANN_API UnManaged_FANN_CasC::Report_Callback
    (struct fann *ann, struct fann_train_data *train,
     unsigned int max_epochs, unsigned int epochs_between_reports,
     float desired_error, unsigned int epochs)
{
    static int node = 1;

    double MSqE = fann_get_MSE(ann);
    vecbsterr.push_back(MSqE);

    tr_output << "Best error at node [ " << node << " ] was "
              << MSqE << ".";

    if (MSqE <= desired_error)
    {
        tr_output << "\nNetwork matching or improving upon target error"
                  << "found at node [ " << node << " ].\n";
    }

    node++;
    return 0;
}

//Test
bool UnManaged_FANN_CasC::Test()
{
    fann_type *calc_out = NULL;
    double *error = NULL;

    data = fann_read_train_from_file("dlltedata.dat");
    fann_scale_train_data(data, 0, 1);

    fann_set_callback(ann, NULL);
    error = new double[fann_length_train_data(data)];

    te_output << "\n\n\t\t\t~~Network Testing~~";

    for (unsigned int i = 0; i < fann_length_train_data(data); i++)
    {
        calc_out = fann_run(ann, data->input[i]);

        te_output << "\nWith inputs:";
        for (int j = 0; j < _InputL; j++)
        {
            te_output << " [" << data->input[i][j] << "] ";
        }

        te_output << "\nOutput achieved was:";
        for (int j = 0; j < _OutputL; j++)
        {
            te_output << " [" << calc_out[j] << "] ";
        }

        te_output << "\nOutput Desired was:";
        for (int j = 0; j < _OutputL; j++)
        {
            te_output << " [" << data->output[i][j] << "] ";
        }
    }

    te_output << "\nMean Squared Error with these inputs and outputs is:";
    te_output << fann_test_data(ann, data);

    delete error;
    delete calc_out;

    return true;
}

9.2 - UML Class Diagram

The class diagram on the following pages represents the structure of the various classes within the program and how they are combined to create the application as a whole.


[Class diagram: the managed wrappers FANN_Cascade and FANN_BackProp each hold a UMWrapper pointer to UnManaged_FANN_CasC and UnManaged_FANN_Backprop respectively; all four classes expose Train() and Test(), and the unmanaged classes carry the network-structure and training-parameter fields shown in the headers above.]

9.3 - Libraries Researched

• GALib
  o Matthew Wall
  o http://lancet.mit.edu/ga/
  o Accessed: 31/10/2008
• Libneural
  o D. Franklin
  o http://ieee.uow.edu.au/~daniel/software/libneural/
  o Accessed: 31/10/2008
• Libann
  o Anonymous
  o http://www.nongnu.org/libann/index.html
  o Accessed: 31/10/2008
• Flood
  o Roberto Lopez
  o http://www.cimne.com/flood/
  o Accessed: 31/10/2008
• Annie
  o Asim Shankar
  o http://annie.sourceforge.net/
  o Accessed: 31/10/2008
• FANN (Fast Artificial Neural Network Library)
  o Steffen Nissen
  o http://leenissen.dk/fann/index.php
  o Accessed: 31/10/2008

9.4 - Datasets

• XOR (each row gives the two inputs followed by the target output)
  1 0 1
  0 1 1
  1 1 0
  0 0 0
• Fishers Iris Data
  o http://www.math.uah.edu/stat/data/Fisher.xhtml
• Virus Classification
  o http://www.stats.ox.ac.uk/pub/PRNN/README.html
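The wrapper classes above load these sets from "dlltrdata.dat" and "dlltedata.dat" via fann_read_train_from_file, which expects FANN's plain-text training format: a header line giving the number of pairs, the number of inputs, and the number of outputs, followed by alternating input and output lines. As an illustration (assuming that documented format), the XOR set would be laid out as:

    4 2 1
    1 0
    1
    0 1
    1
    1 1
    0
    0 0
    0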

9.5 - Objectives

FINAL YEAR DEGREE PROJECT
Project Objectives & Deliverables

• Research the fields of genetic algorithms and neural networks.
• Research current information on training neural networks using genetic algorithms, as well as previous research.
• Identify the most effective ways of training neural networks using genetic algorithms, and in which cases these methods are suitable.
• Build an application that is capable of training a neural network with a genetic algorithm and with a backpropagation system, and that allows manual adjustment of relevant variables.
• Build a user-interface for the aforementioned application allowing for easy alteration of appropriate variables.
• Using the above application, evaluate training methods for neural networks.

• Deliverable one: A research document describing and detailing the research done for objectives one and two.
• Deliverable two: An application meeting the requirements of objectives three and four, and a document describing its implementation and usage.
• Deliverable three: A document detailing and explaining the evaluation of different training techniques, and the conclusions derived from that evaluation.

10 - Works Cited

Barber, S. (2006). AI: Neural Network for Beginners pt. 1. Retrieved April 13, 2009, from CodeProject: www.codeproject.com/KB/recipes/NeuralNetwork_1.aspx

Blum, A. (1992). Neural Network in C++. Wiley Professional Computing.

Bourg, D. M., & Seeman, G. (2004). Chapter 14: Neural Networks. In AI for Game Developers (pp. 356-358). O'Reilly.

Branke, J. (1995). Evolutionary algorithms for neural network design and training. Retrieved April 13, 2009, from http://www.aifb.uni-karlsruhe.de/~jbr/Papers/GaNNOverview.ps.gz

Darwin, C. (1859). On the origin of the species by means of natural selection, or the preservation of favoured races in the struggle for life. London: John Murray.

Davis, C. E. (2001). Graphics processing unit computation of neural networks. Retrieved April 16, 2009, from http://www.cs.unm.edu/~chris2d/papers/CED_thesis.pdf

Fahlman, S. E. (1988). An Empirical Study of Learning Speed in Back-Propagation Networks. Retrieved April 16, 2009, from School of Computer Science, Carnegie Mellon: http://www-2.cs.cmu.edu/afs/cs.cmu.edu/user/sef/www/publications/qp-tr.ps

Fahlman, S. E., & Lebiere, C. (1991). The Cascade-Correlation Learning Architecture. Retrieved April 16, 2009, from School of Computer Science, Carnegie Mellon: http://www-2.cs.cmu.edu/afs/cs.cmu.edu/user/sef/www/publications/cascor-tr.ps.gz

Lysle, S. (2007, January 16). Passing Data between Windows Forms. Retrieved April 13, 2009, from C# Corner: http://www.csharpcorner.com/UploadFile/scottlysle/PassData01142007013005AM/PassData.aspx

Marczyk, A. (2004). Genetic algorithms and evolutionary computation. Retrieved April 13, 2009, from http://www.talkorigins.org/faqs/genalg/genalg.html

Marshall, J., & Srikanth, V. (1999). Curved trajectory prediction using a self-organizing neural network. Retrieved April 15, 2009, from Computer Science, UNC-Chapel Hill: http://www.cs.unc.edu/~marshall/WWW/PAPERS/curve9912.ps.gz

Matthews, J. (2002). Back-propagation for the Uninitiated. Retrieved April 15, 2009, from generation5: http://www.generation5.org/content/2002/bp.asp

McCulloch, W., & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 5, 115-133.

Microsoft. (2009). BackgroundWorker Class. Retrieved April 15, 2009, from MSDN: http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx

Microsoft. (2009). Using C++ Interop (Implicit PInvoke). Retrieved April 16, 2009, from MSDN: http://msdn.microsoft.com/en-us/library/2x8kf7zx.aspx

Mitchell, M. (1996). An introduction to genetic algorithms. Massachusetts: MIT Press.

Nissen, S. (n.d.). FANN. Retrieved January 21, 2009, from FANN: http://leenissen.dk/fann/

Parker, D. (1982). Learning Logic. Invention Report S81-64, Office of Technology Licensing, Stanford University.

Robot Control. (n.d.). Retrieved April 16, 2009, from Artificial Neural Networks: http://www.learnartificialneuralnetworks.com/robotcontrol.html

Rosenblatt, F. (1958). The perceptron: A probabilistic theory for information storage and organization in the brain. Psychological Review, 65, 386-408.

Stergiou, C., & Dimitrios, S. (1997). Neural Networks. Retrieved April 15, 2009, from Imperial College London: http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html#Contents

Werbos, P. (1974). Beyond Regression: New tools for prediction and analysis in the behavioral sciences. PhD thesis, Harvard University.

ZedGraph. (n.d.). Retrieved January 21, 2009, from http://zedgraph.org/wiki/index.php?title=Main_Page

On the origin of the species by means of natural selection.de/~jbr/Papers/GaNNOverview. A.ps. from MSDN: http://msdn. (2001). (2006). (2004).unm.com/KB/recipes/Backprop_ANN.. Retrieved April 13. (1995). AI: Neural Network for Beginners pt.com/enus/magazine/cc301501. Retrieved April 13. from Codeproject: www. Chapter 14: Neural Networks. Killing Processes Cleanly. London: John Murray. 3. from Codeproject: http://www. Bourg. AI: Neural Network for Beginners pt. Graphics processing unit computation of neural networks. In D.unikarlsruhe.cs.codeproject.codeproject.Bibliography Barber. D. (2006). (1859). or the preservation of favoured races in the struggle for life. 2009.php/c9855/ Page 214 of 225 . Bourg. August 6). (2006). from Codeproject: http://www. (1992). & G. Retrieved April 13. 2009. S. (2007. 2009.aifb. 2009. 356-358). C. (2002). from http://www. Retrieved April 13.com/Cpp/Cpp/cpp_mfc/tutorials/article.William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 11 . 2009.aspx Barber. Branke. O'Reilly. S.pdf DiLascia. C. P. M.gz Darwin. Retrieved April 13. Seeman. G. from Codeguru: http://www.microsoft. J.aspx Blum. E.aspx DLL Tutorial for Beginners.codeguru.codeproject.com/KB/cs/GA_ANN_XOR.com/KB/recipes/NeuralNetwork_1. from http://www. Neural Network in C++. AI for Game Developers (pp. AI: Neural Network for Beginners pt. & Seeman. Wiley Professional Computing. 1. Retrieved April 13. 2. Retrieved April 13. Curved trajectory prediction using a self-organizing neural network.aspx Barber. Call Unmanaged DLLs from C#. S. 2009. 2009.edu/~chris2d/papers/CED_thesis. M. Davis.

cmu. 2009.. 2009.edu/user/sef/www/publications/cascor-tr. (n. S. (2002). Passing Data between Windows Forms. An Empirical Study of Learning Speed in Back-Propagation Networks.ps Fahlman.org/content/2000/cbpnet. 2009. 2009. Genetic algorithms and evolutionary computation.edu/afs/cs. 2009.html Marshall. E. Page 215 of 225 .edu/afs/cs. The Cascade-Correlation Learning Architecture. & Srikanth.cmu. (2007. 2009. & Pitts. from generation5: http://www. from School of Computer Science. Bulletin of Mathematical Biophysics . from http://code.asp McCulloch. from C# Corner: http://www. & Lebiere.).). (n. January 16). E. (1999). (1943). (1991). A logical calculus of the ideas immanent in nervous activity. (2000).gz Matthews. from XORGA: http://www. (1988). V. J. Retrieved April 13.talkorigins. 2009.com/faqs/genalg/genalg.edu/~marshall/WWW/PAPERS/curve9912.William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 Fahlman. Retrieved April 16.ps GoogleChartSharp.asp Matthews. A. Retrieved 21 01. Carnegie Mellon: http://www-2.generation5. Retrieved April 15. from generation5: http://www. from School of Computer Science. Retrieved 26 01. S. 2009.d.asp Matthews.d.generation5.cs. Retrieved April 16. (2004).google. S.cs. Curved trajectory prediction using a selforganizing neural network. 5. J. J.generation5.edu/user/sef/www/publications/qp-tr. generation5. Back-Propagation: CBPNet. Retrieved April 15. Retrieved April 15.org/content/2000/xorga. Retrieved April 16.ps. A.cs. from http://www.unc.cmu. W.. 2009. 115-133..cmu.com/p/googlechartsharp/ Lysle.org/content/2002/bp.aspx Marczyk. Back-propagation for the Uninitiated. Carnegie Mellon: http://www2.com/UploadFile/scottlysle/PassData01142007013005AM/PassData. C.UNC-Chapel Hill: http://www.csharpcorner. W. J. from Computer Science .

learnartificialneuralnetworks. (2009).dk/fann/html/files/fann_data-h. Retrieved April 16.aspx Microsoft. Massachusetts: MIT Press. 2009.). 2009. Retrieved 21 01. D. An introduction to genetic algorithms.NET.html#fann_train_enum Nissen. B. from FANN: http://leenissen. 2009. from MSDN: http://msdn.).microsoft.).com/en-us/library/2x8kf7zx. Retrieved April 16. 2009. S. Retrieved 21 01.dk/fann/fann_1_2_0/r1597. from MSDN: http://msdn.htm Page 216 of 225 .html Nplot Charting Library for . (1996).. Using C++ Interop (Implicit PInvoke).).aspx Mitchell. from FANN: http://leenissen. 2009. struct fann_train_data. Invention Report S81-64. (n. M. from FANN: http://leenissen. from FANN: http://leenissen. Retrieved 21 01.d. Robot Control. Neural Networks. Nguyen. (n. 1990 IJCNN International Joint Conference on . 2009. 2009. Nissen.William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 Microsoft.componentmodel. (n. struct fann.dk/fann/ Nissen.html Nissen. (2009).). S.com/robotcontrol. Learning Logic.org/nplot/wiki/ Parker. Improving the learning speed of 2-layer neural networks by choosinginitial values of the adaptive weights. (1990).com/en-us/library/system. (1982).dk/fann/html/files/fann-h. B.d.microsoft.d.d.html Nissen. 1990. Retrieved 21 01. 21-26. S. BackgroundWorker Class. 2009.). (2008). (n. (n. Retrieved April 15. Retrieved 26 01. (n.d. Reference Manual. D. S. from Artificial Neural Networks: http://www.dk/fann/fann_1_2_0/r1837.backgroundworker. S. Stanford University. from http://netcontrols. 2009. & Widrow. Office of Technology Licensing. from Datatypes: http://leenissen. Retrieved 21 01..d.

..... Retrieved April 15. The Cascade-Correlation Learning Architecture..A feed-forward neural network .................................... 65...........php?title=Main_Page 12 ........local optimum (units are undefined in this case)........ (1974)...................ic...Tables 12.............Table of Figures Figure 1 ........The initial state of a cascade correlation neural network.............. ....... (1958)...... Harvard University............. 10 Figure 2 ................. Psychological Review ......A human neuron ..... S.The sigmoid function shape .........org/wiki/index.html#Contents Werbos............. (1997)........An artificial neuron ....... P...... F............... C.... Stergiou.................... 35 Page 217 of 225 ................1 ................ 30 Figure 6 ..... 12 Figure 3 .. Beyond Regression: New tools for prediction and analysis in the behavioral sciences......doc.... 2009.. (n...................... PhD...................................).. ZedGraph.............................. 1991).....ac.. 2009....William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 Rosenblatt... ............. & Dimitrios......... 13 Figure 4 ...............uk/~nd/surprise_96/journal/vol4/cs11/report....... The circular connections are adjustable weights (Fahlman & Lebiere................. Neural Networks.d.... Retrieved 21 01..... 27 Figure 5 ......... The perceptron: A probabilistic theory for information storage and organization in the brain........... from http://zedgraph.......... from Imperial College London: http://www. 386-408.......

.......................................................................................................................The dataset tab ..................................Design of the training algorithm tab (Wizard form) .. 50 Figure 23 ........................................................................................................................Mean squared error graph tab ........................................................ 42 Figure 12 ........... 45 Figure 16 ........................ 46 Figure 18 ...About form design (About form) ....................... with two nodes added (Fahlman & Lebiere.........Network and Output tab.......................... .... 43 Figure 13 ..............................Licenses display design (About form) ....................... 1991)...... 48 Figure 21 ............................ 1991).......Design of the network settings tab (Wizard form) ................. 41 Figure 10 .......................William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 Figure 7 . 47 Figure 20 ........................................................................ 35 Figure 8 .........Dataset selection tab design (Wizard form)...............................................................Dataset display design (About form) ..............Working form design (Working form) ................................................... 46 Figure 19 ......................................Learning Algorithm tab ............. 51 Page 218 of 225 ..........Network Settings tab.... 45 Figure 17 ............. 44 Figure 15 .......................................... The CascadeCorrelation Learning Architecture..The third state of a cascade correlation neural network......................The second state of a cascade correlation neural network..................Dataset tab design (Main form) .... 42 Figure 11 ..................... 44 Figure 14 ............. 36 Figure 9 – Network and output tab design (Main form) .Mean squared error tab design (Main form) .......... The square connections are locked weights(Fahlman & Lebiere............ 49 Figure 22 ........ A first node has been added................................ The Cascade-Correlation Learning Architecture................

.................................................................... Datasets view ...................................... 67 Figure 36 ... 65 Figure 35 . Licenses view .................................................... 54 Figure 29 ..Number of iterations to achieve target (Viruses) ... as defined in the graph above......... 2004)......... 21 Page 219 of 225 ......................Working form ................ 52 Figure 26 ...2 ....A sigmoid function(Bourg & Seeman...........................................Mean squared error upon last training test (XOR) ........ main view .. 12 Equation 2 ......................................................................Dataset tab ................ 67 Figure 37 ........Mean squared error at testing (Viruses)............. 68 12................Number of iterations to achieve target (Iris data) .................Mean squared error upon last training test (Iris data) ........ 53 Figure 27 ......................Table of Equations Equation 1 .................................. 61 Figure 30 .............. 64 Figure 33........... 61 Figure 31 ...................................................Number of iterations to achieve target (XOR) ......... Back-propagation for the Uninitiated................................... 64 Figure 34...... 13 Equation 3 – The delta rule(Matthews.The mean squared error at testing (XOR).............The mean squared error at testing (Iris data) ...............................................About form............. In this example “y” is the function output and “t” is the function input............................................................. .....................................................William Sayers 05025397 Genetic Algorithms and Neural Networks Milestone 3 Figure 24 ................ 54 Figure 28 . 2002) .............. 51 Figure 25 ....About form...............Mean squared error upon last training test (Viruses).......................... 62 Figure 32 ..............Summation of the inputs and weights .............About form.................

Equation 4 - Altered delta calculation rule for the output layer (Matthews, Back-propagation for the Uninitiated, 2002) ............................................ 21
Equation 5 - Altered delta calculation rule for the hidden layer(s) (Matthews, Back-propagation for the Uninitiated, 2002) ........................................ 21
Equation 6 - Altered perceptron learning rule (Matthews, Back-propagation for the Uninitiated, 2002) ................................................................. 21
Equation 7 - Differentiated sigmoid function (Matthews, Back-propagation for the Uninitiated, 2002) ................................................................. 22
Equation 8 - The definition of 's', given that 'o' is the network output at which the error is measured, 'v' is the candidate unit's value, 'p' is the training pattern, 'σo' is the sign of the correlation between the candidate's value and output 'o', and the quantities 'v̄' and 'Ēo' are the values of 'v' and 'Eo' averaged over all patterns (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991) .......... 33
Equation 9 - The partial derivative of 's' (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991) ........................................... 34
Equation 10 - The partial derivative of 's' with respect to each of the candidate's incoming weights 'wi', where 'f'p' is the derivative for pattern 'p' of the candidate unit's activation function with respect to the sum of its inputs, and 'Ii,p' is the input the candidate receives from unit 'i' for pattern 'p' (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991) .................................. 34

12.3 - Table of Tables
Table 1 - The statistical approach to deciding relevant data ....................... 15
Table 2 - Back-propagation testing data (XOR) ...................................... 59
Table 3 - Genetic Algorithm testing data (XOR) ..................................... 59
Table 4 - Cascade Correlation testing data (XOR) ................................... 60
Table 5 - Back-propagation testing data (Iris data) ................................ 63
Table 6 - Genetic Algorithm testing data (Iris data) ............................... 63
Table 7 - Cascade Correlation testing data (Iris data) ............................. 63
Table 8 - Back-propagation testing data (Viruses) .................................. 66
Table 9 - Genetic Algorithms testing data (Viruses) ................................ 66
Table 10 - Cascade Correlation testing data (Viruses) .............................. 66
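The captions for Equations 8 and 10 above express the cascade-correlation definitions only in words. For reference, a sketch of the two formulas they describe, reconstructed from the cited source (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991) rather than copied from the body of this report, is:

$$ s \;=\; \sum_{o}\left|\,\sum_{p}\bigl(v_{p}-\bar{v}\bigr)\bigl(E_{p,o}-\bar{E}_{o}\bigr)\right| $$

$$ \frac{\partial s}{\partial w_{i}} \;=\; \sum_{p,\,o} \sigma_{o}\,\bigl(E_{p,o}-\bar{E}_{o}\bigr)\, f'_{p}\, I_{i,p} $$

Here 'vp' is the candidate unit's value for pattern 'p' and 'Ep,o' is the residual error observed at output 'o' for pattern 'p'; the remaining symbols are as defined in the captions above.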

13 - Diary
• 01/10/2008
o Research progress satisfactory.
• 10/10/2008
o GALib decided on as a genetic algorithm starting point.
o Further non-web references set as a goal.
• 16/10/2008
o Investigation into various neural network libraries taking place.
• 17/10/2008
o Investigating more deeply into journals based on research already conducted.
o Decided on XOR as a starting point.
• 24/10/2008
o University computers were down, so a code demo was impossible; however, the back-propagation network was running and learning XOR.
o Agreed on further investigation into GALib and genetic algorithms.
o Decision made to adjust the weights of the neural network during training, as opposed to using a genetic algorithm to establish network structure.
o Submitted objectives.
• 31/10/2008
o First presentation complete and ready.
o References building up.
o Project plan complete.
• 10/11/2008
o Presentation went well.
o Progress satisfactory.
• 14/11/2008
o Discussed document formatting and layout.

o Progress satisfactory.
• 19/11/2008
o Milestone one submitted successfully.
• 21/11/2008
o Progress satisfactory.
• 28/11/2008
o Progress satisfactory.
• 05/12/2008
o Progress satisfactory.
• 12/12/2008
o Progress satisfactory.
• 09/01/2009
o Progress satisfactory.
• 16/01/2009
o Progress satisfactory.
• 23/01/2009
o Progress satisfactory.
• 30/01/2009
o Milestone two has been successfully submitted.
o Started investigating other areas to explore outside of the project requirements.
• 06/02/2009
o Still exploring extra areas for the project to cover.
• 13/02/2009
o Have decided to extend the project by including the cascade correlation learning algorithm and a comparison to genetic algorithms.
o Progress satisfactory.
• 20/02/2009
o Progress satisfactory.

• 27/02/2009
o Progress satisfactory.
• 06/03/2009
o Project progress delayed by other coursework.
• 13/03/2009
o Project progress delayed by other coursework.
• 20/03/2009
o Project progressing; catch-up time for the previous two weeks will be used over the Easter break.
o Have started the report.
• 02/04/2009
o Main problems experienced with the genetic algorithm seem related to memory leaks; have decided to implement it in a managed language (C#) to try to solve this.
o Report continues satisfactorily.
• 04/04/2009
o To do: About box; add dataset notes.
o Decided on virus classification as a third dataset, via the custom dataset functionality.
o Report satisfactory.
• 05/04/2009
o Further work on the user interface needed.
o Finish the C# genetic algorithm code (the C++ code does work, but is very slow).
o Report satisfactory.
• 06/04/2009
o C# genetic algorithm code is complete, is much faster (fixed several bugs along the way), and retains none of the problematic memory leaks.

o Report satisfactory.
• 10/04/2009
o Fully functional project – commencing report.
o Report satisfactory.
