
2009

William Sayers 05025397

Genetic Algorithms and Neural Networks


Milestone 3

Genetic Algorithms
and Neural Networks
William Sayers (05025397)
Supervised by: Colin W. Morris (BSc, MSc)
Backpropagation neural networks are usually trained using some
iterative method derived from a mathematical analysis of the operation of the
network. This technique is time consuming and requires understanding of the
mathematical principles involved.

This project will investigate the application of genetic algorithms to replace the
"normal" training of the network. This involves setting up some candidate
networks and then picking the best of these. The best of the population are
then combined in some way to produce a new population of candidate
networks. This procedure is continued until a satisfactory network is obtained.

William Keith Paul Sayers
Faculty of Advanced Technology
22/04/2009

Contents
Contents ............................................................................................................................ 2

1 - Introduction ............................................................................................................... 10

2 - Research .................................................................................................................... 10

2.1 - Neural Networks ................................................................................................. 10

2.1.1 - The Biological Neuron................................................................................. 10

Figure 1 - A human neuron ............................................................................................ 10

2.1.2 - The Artificial Neuron................................................................................... 11

Figure 2 - An artificial neuron ........................................................................................ 12

Equation 1 - Summation of the inputs and weights ........................................................ 12

Figure 3 - The sigmoid function shape ........................................................................... 13

2.2 - The History of Neural Networks ........................................................................ 13

2.3 - Advantages and Disadvantages of Neural Networks ......................................... 14

2.3.1 - Advantages................................................................................................... 14

2.3.2 - Disadvantages .............................................................................................. 15

2.4 - Current Applications of Neural Networks .......................................................... 17

2.4.1 - Neural Networks in Medicine ...................................................................... 17

2.4.2 - Neural Networks in Business ....................................................................... 18

2.4.3 - Object Trajectories....................................................................................... 19

2.4.4 - Robot Control .............................................................................................. 19

2.5 - Back-propagation ............................................................................................... 20

2.5.1 - Back-propagation overview ......................................................................... 20


2.5.2 - Back-propagation in depth ........................................................................... 20

2.5.3 - Back-propagation library ............................................................................. 22

2.6 - The FANN neural network library (Nissen, FANN) .......................................... 22

2.7 - Genetic Algorithms ............................................................................................ 22

2.7.1 - History of Genetic Algorithms .................................................................... 23

2.7.2 - Advantages and Disadvantages of Genetic Algorithms .............................. 24

Figure 4 - local optimum (units are undefined in this case). .......................................... 27

2.8 - Training Neural Networks with Genetic Algorithms ......................................... 28

2.8.1 - Determining Weight Values with Genetic Algorithms................................ 28

2.8.2 - Representation of a Neural Network within a Genetic Algorithm ................ 28

2.8.3 - Using Genetic Algorithms to Determine Neural Network Structure ........... 29

Figure 5 - A feed-forward neural network ..................................................................... 30

2.9 - Cascade Correlation............................................................................................ 32

2.10 - C# User Interface Programming ....................................................................... 37

2.11 - Writing C++/CLI Wrappers for Unmanaged C++ code................................... 37

2.12 - Application of Research ................................................................................... 38

3 - Design ....................................................................................................................... 38

3.1 - Program Requirements ....................................................................................... 39

3.2 - Design of the Class Structure ............................................................................. 39

3.3 - Linking C# code to managed dll’s...................................................................... 40

3.4 - Design of the User Interface ............................................................................... 40


3.4.1 - Main Form ................................................................................................... 40

Figure 9 – Network and output tab design (Main form) ................................................ 41

Figure 10 - Mean squared error tab design (Main form) ................................................ 42

Figure 11 - Dataset tab design (Main form) ................................................................... 42

3.4.2 - New Network Wizard .................................................................................. 43

Figure 12 - Design of the training algorithm tab (Wizard form) .................................... 43

Figure 13 - Design of the network settings tab (Wizard form) ...................................... 44

Figure 14 - Dataset selection tab design (Wizard form)................................................. 44

3.4.3 - Working Form ............................................................................................. 44

Figure 15 - Working form design (Working form) ........................................................ 45

3.4.4 - About Form .................................................................................................. 45

Figure 16 - About form design (About form) ................................................................. 45

Figure 17 - Dataset display design (About form) ........................................................... 46

Figure 18 - Licenses display design (About form) ......................................................... 46

4 - Implementation ......................................................................................................... 46

4.1 - User Interface Implementation ........................................................................... 47

4.1.1 - Main Form ................................................................................................... 47

Figure 19 - Network and Output tab............................................................................... 47

Figure 20 - Mean squared error graph tab ..................................................................... 48

Figure 21 - The dataset tab ............................................................................................. 49

4.1.2 - New Neural Network Wizard ...................................................................... 50

Figure 22 - Learning Algorithm tab ............................................................................... 50

Figure 23 - Network Settings tab.................................................................................... 51


Figure 24 - Dataset tab ................................................................................................... 51

4.1.3 - About Form .................................................................................................. 52

Figure 25 - About form, main view ................................................................................ 52

Figure 26 - About form, Datasets view .......................................................................... 53

Figure 27 - About form, Licenses view .......................................................................... 54

4.1.4 - Working Form ............................................................................................. 54

Figure 28 - Working form .............................................................................................. 54

4.1.5 - Passing Information between Forms ........................................................... 54

4.1.6 - Keeping the User Interface Active whilst Processing is Occurring............. 55

4.2 - Genetic Algorithm Implementation .................................................................... 56

4.2.1 - Back-propagation and Cascade Correlation Implementation ..................... 56

4.3 - Data Parser Implementation ............................................................................... 57

4.4 - Neural Network Implementation ........................................................................ 57

5 - Testing Data .............................................................................................................. 58

5.1 - XOR .................................................................................................................... 58

5.1.1 - Genetic Algorithms ...................................................................................... 59

5.1.2 - Back-propagation ......................................................................................... 59

5.1.3 - Cascade Correlation ..................................................................................... 60

5.1.4 - Graphs .......................................................................................................... 61

Figure 29 - Number of iterations to achieve target (XOR) ............................................ 61

Figure 30 - Mean squared error upon last training test (XOR) ...................................... 61

Figure 31 - The mean squared error at testing (XOR).................................................... 62


5.2 - Fishers Iris data................................................................................................... 63

5.2.1 - Genetic Algorithms ...................................................................................... 63

5.2.2 - Back-propagation ......................................................................................... 63

5.2.3 - Cascade Correlation ..................................................................................... 63

5.2.4 - Graphs .......................................................................................................... 64

Figure 32 - Number of iterations to achieve target (Iris data) ........................................ 64

Figure 33- Mean squared error upon last training test (Iris data) ................................... 64

Figure 34- The mean squared error at testing (Iris data) ................................................ 65

5.3 - Virus Classification ............................................................................................ 65

5.3.1 - Genetic Algorithms ...................................................................................... 66

5.3.2 - Back-propagation ......................................................................................... 66

5.3.3 - Cascade Correlation ..................................................................................... 66

5.3.4 - Graphs .......................................................................................................... 67

Figure 35 - Number of iterations to achieve target (Viruses) ......................................... 67

Figure 36 - Mean squared error upon last training test (Viruses)................................... 67

Figure 37 - Mean squared error at testing (Viruses)....................................................... 68

6 - Comparisons .............................................................................................................. 68

6.1 - Back-propagation and Genetic Algorithms ........................................................ 68

6.1.1 - XOR ............................................................................................................. 68

6.1.2 - Fishers Iris Data ........................................................................................... 69

6.1.3 - Virus Classification...................................................................................... 69

6.2 - Cascade Correlation and Genetic Algorithms .................................................... 69


6.2.1 - XOR ............................................................................................................. 69

6.2.2 - Fishers Iris Data ........................................................................................... 69

6.2.3 - Virus Classification...................................................................................... 70

6.3 - Testing conclusions ............................................................................................ 70

7 - Evaluation ................................................................................................................. 70

7.1 - How many of the original objectives were achieved? ........................................ 71

8 - Possible Improvements ............................................................................................. 72

8.1 - Possible User Interface Improvements ............................................................... 73

8.2 - Possible Training Algorithm Improvements ...................................................... 73

9 - Appendices ................................................................................................................ 74

9.1 - Source code ........................................................................................................ 74

9.1.1 - View ............................................................................................................. 75

9.1.2 - Controller ................................................................................................... 109

9.1.3 - Model ......................................................................................................... 158

9.1.4 - DLL Wrapper for FANN functions ........................................................... 174

9.2 - UML Class Diagram......................................................................................... 204

9.3 - Libraries Researched ........................................................................................ 207

9.4 - Datasets............................................................................................................. 209

9.5 - Objectives ......................................................................................................... 210

10 - Works Cited .......................................................................................................... 211

11 - Bibliography.......................................................................................................... 214


12 - Tables .................................................................................................................... 217

12.1 - Table of Figures .............................................................................................. 217

12.2 - Table of Equations .......................................................................................... 219

12.3 - Table of tables ................................................................................................ 220

13 - Diary...................................................................................................................... 222


STATEMENT OF ORIGINALITY

SCHOOL OF COMPUTING

DEGREE SCHEME IN COMPUTING

LEVEL THREE PROJECT

This is to certify that, except where specific reference is made, the work described
within this project is the result of the investigation carried out by myself, and that neither this
project, nor any part of it, has been submitted in candidature for any other award other than
this being presently studied.

Any material taken from published texts or computerised sources has been fully
referenced, and I fully realise the consequences of plagiarising any of these sources.

Student Name (Printed) ………………………………..................................

Student Signature ………………………………..................................

Registered Scheme of Study ………………………………..................................

Date of Signing ………………………………..................................


1 - Introduction
The purpose of this report is to bring to a conclusion my final year project on neural
networks and genetic algorithms, undertaken as part of the “Computer Games Development” course.

This report encompasses design, development and final prototyping of the program
developed for this project. In addition, the report will investigate, compare and contrast three
common learning algorithms that are applicable to neural networks.

2 - Research
2.1 - Neural Networks
Neural networks are networks of small, specialised units that sum the products of their
inputs and the weights on those inputs. Although they are simple in concept, they can
produce immensely complex results.

2.1.1 - The Biological Neuron

The human brain is composed of many millions of nerve cells called “Neurons”.
Neurons do not make decisions alone, but in combination with a network of other neurons
within the brain.

Figure 1 - A human neuron


A neuron is composed of a cell body, with extensions called “Dendrites” which are the
neuronal inputs and a single extension called an “Axon” which is the output for that neuron.
The axon and dendrites both split before terminating into “Axon branches” and “Synapses”.
Axon branches connect to synapses to form the neural networks within the brain. Axon
branches can connect to other types of cells also, in order to allow the brain to control cells
external to the brain.

The nucleus of the neuron cell processes the inputs arriving from the dendrites
and produces an output. If the summed inputs from the dendrites exceed a certain threshold,
the nucleus generates an “action potential”. This action potential extends down the axon
to the cells at the end of the axon branches (often synapses connecting to other neuronal
cells, as mentioned above). Where an axon branch connects to a synapse that is part of a
second neuron, a synaptic gap exists between the axon branch and the synapse; in this gap
between neurons, chemical reactions take place that will inhibit or excite the signal.

(Bourg & Seeman, 2004), (Barber, AI: Neural Network for Beginners pt. 1, 2006)

2.1.2 - The Artificial Neuron

In order to construct an artificial neuron we need a way of representing the same model
in data structures and algorithms. The usual method for approaching this is to have a number
of inputs, a number of weights associated with those inputs, a summing function, an
activation function and a method for representing an output – either to another artificial
neuron in the next layer, or to another part of the program.


Figure 2 - An artificial neuron

The system devised with these components operates as follows. Each input accepts a data
item (usually a binary digit, real number, or integer). Each data item is multiplied by its
associated weight, these products are summed, and the activation function accepts the
resulting value.

\( \sum_{x=0}^{n} S_x \quad \text{where } S_x = \mathrm{Input}_x \times \mathrm{Weight}_x \)

Equation 1 - Summation of the inputs and weights

The next stage depends on the activation function chosen. A stepping function can be
used, which will output zero (non-activated) if the summed products are below a certain
value, or one (activated) if they are above it. Alternatively, a differentiable function of
some kind (such as a sigmoid function), which outputs values between zero and one depending
on the value of the inputs, can be used.



Figure 3 - The sigmoid function shape

\( y = \frac{1}{1 + e^{-t}} \)

Equation 2 - A sigmoid function (Bourg & Seeman, 2004). In this example “y” is the function output and “t” is
the function input, as defined in the graph above.

(Bourg & Seeman, 2004), (Barber, AI: Neural Network for Beginners pt. 1, 2006)

2.2 - The History of Neural Networks


The foundations of neural network research lie in psychology: the work of 19th-century
psychologists such as Freud, William James and others contributed to the ideas that
underpinned early neural network research.

McCulloch and Pitts formulated the first neural network model, which featured digital
neurons, but those neurons had no capability to learn (McCulloch & Pitts, 1943).

Frank Rosenblatt then developed the perceptron model in 1958; however, he was
unable to devise a reliable, mathematically sound mechanism for allowing multi-
layer perceptrons to learn (Rosenblatt, 1958).


Two layer perceptrons were the subject of several experiments to determine their usage
and effectiveness. The next major advancement in neural networks was not until 1974, when
Werbos (Werbos, 1974) discovered the back-propagation algorithm (independently re-
discovered by Parker in 1982 (Parker, 1982)).

Back-propagation allows for the training of multilayer perceptrons, and in particular
perceptrons with any number of hidden layers, each containing any number of nodes.

Werbos discovered the algorithm whilst working on his doctoral thesis in statistics and
called it “dynamic feedback”; Parker discovered it independently in 1982, whilst performing
graduate work at Stanford University in the United States of America, and called it “learning
logic”.

(Blum, 1992).

2.3 - Advantages and Disadvantages of Neural Networks


2.3.1 - Advantages

Neural networks make possible, or at least practical, many things that would be
extremely difficult to achieve with conventional solutions.

One of the benefits of neural networks, when compared with other problem-solving
techniques, is that they are inherently parallel and thus can run very effectively and
efficiently on parallel hardware; used on such architectures they become an extremely fast
as well as flexible tool.

The individual neurons in a neural network have no time dependencies on each other
and each can therefore be run on a separate processor (or as separate threads on a single
processor) if desired without causing the problems often associated with such parallelism.

A good example of this is that neural networks have been implemented to run on
architecture originally designed for processing three dimensional computer graphics, taking
advantage of the massively parallel architecture of this hardware and the programmable
sections of the graphics pipeline (vector and fragment shaders) to produce extremely fast
neural networks (Davis, 2001).

Neural networks can excel at determining what data in a particular set is relevant, as
irrelevant data will simply end up weighted so that it has zero or close to zero actual effect on
the solution produced by the network.

When using a neural network there's no need to establish before attempting problem
solving which data is relevant. We can simply present the neural network with all the data as
opposed to following the circuitous and overly complex (by comparison) statistical approach
demonstrated in “Table 1 - The statistical approach to deciding relevant data”.

1 Decide on relevant data.

2 Formulate a statistical model.

3 Run the formulated model.

4 Analyze the results.

5 Build a system that incorporates what we have learned.


Table 1 - The statistical approach to deciding relevant data

Another advantage of neural networks is that noisy data is not a real problem for them
to learn to interpret; in fact, noisy data is utilised during training upon occasion, to attempt to
provide a more robust post-training network.

2.3.2 - Disadvantages

One disadvantage to neural networks can be that it is very difficult for a human being to
analyse a trained neural network and describe why it may come up with one answer over
another. This is a large disadvantage in areas such as medical diagnosis, where explaining
why you have arrived at a particular diagnosis is an important part of the process.


Once a neural network’s training is complete, it is hard to tell why it comes up with a
particular solution; it can also be hard to tell how well (or how badly) trained a neural
network is.

A classic example is that in the 1980s, the US military wanted to use a neural network
to analyse images and determine whether they contained a concealed tank, for obvious
purposes.

They collected one hundred images with a concealed tank, then a further one hundred
images without. Fifty pictures of each set were set aside and the neural network trained with
the remaining one hundred images with and without tanks. After the neural network’s
training was complete, they tested the network with the remaining fifty images that the neural
network had not seen before and the neural network correctly identified most of the tanks.

The US military then tested the neural network with a further set of one hundred
images, with and without tanks, only to find that on this round of testing, the network came
up with apparently random answers.

Eventually the original data set was re-examined. All the photographs with tanks were
from a cloudy day and all the images without, a sunny day. Therefore, asked to differentiate
between the two sets of pictures, the network had trained itself to recognise whether the sky
was cloudy or not.

A further issue when using neural networks is that there is no formal methodology with which to
choose the architecture, train the neural network, or even verify the quality of the training.
A tool used to solve heuristic problems requires largely heuristic construction and
preparation.

The training times of neural networks can also be a disadvantage, although the length
of time the problem would take to solve without employing a neural network must be taken
into account when deciding how much of a disadvantage this really is.


Another possible criticism is that neural networks are extremely dependent on the
quality and content of the original training data set. The more time spent training the neural
network on quality data sets, the more accurate and in line with expectations and desires
the eventual result will be.

This drawback is, however, generally universal to computing (and many other
areas), and is often referred to as “GIGO” or “Garbage in, Garbage out”. Neural networks make it
more apparent, however, because (as in the previous US military example) it can sometimes
be difficult to decide what constitutes good data.

Fortunately since neural networks are (as has been mentioned above) good at ignoring
irrelevant data, the usual best approach is to simply feed them as much data as can be
obtained, covering as many relevant situations as possible; this has to be done within reason
however, or it does exacerbate the problem with the time spent training the neural network.

The real strength of neural networks, in the opinion of the author, lies in their ability for
generalisation.

Data sets for training neural networks generally separate into two sub-sets, “training”
and “testing”. The network training first takes place using the appropriate “training” set.
Testing may then take place with the “testing” set to determine accuracy with unseen data.

This is generalisation; it is far more useful than it at first appears, and it is similar,
from a very high-level perspective, to how the human brain and brains in general function.

2.4 - Current Applications of Neural Networks


2.4.1 - Neural Networks in Medicine

There is a large amount of research now covering the applications of neural networks in
medicine. One of the main research areas is using neural networks to recognise
diseases/conditions via scans.


This is particularly suitable work for neural networks because you do not need perfect
examples of how to recognise the diseases and conditions, merely a variety of scans covering
all possible permutations of the disease or condition.

You can also use the technology to model parts of the human biological system in order
to better understand it, for example the human cardio-vascular system.

The main reasons for using a neural network to model the human cardio-vascular
system are that a neural network is capable of adjusting itself to be relevant to a particular
individual and also to adapt by itself to changes in that individual.

In the 1980s, research on using neural networks to diagnose diseases took place.
However, a clear problem with this approach is that when using a neural network it would be
difficult to tell how a neural network reached its conclusion.

(Stergiou & Dimitrios, 1997)

2.4.2 - Neural Networks in Business

Neural networks, being good at analysing patterns and predicting future trends, can fit
very well into most business situations.

Businesses have used neural networks in the past for applications such as assisting the
marketing control of airline seat allocations in the AMT (Airline Marketing Tactician).

A back-propagation neural net was integrated with the airline marketing tactician, which
monitored and made recommendations on bookings for each flight, thus supplying information more or
less directly linked to the airline’s main income.

Credit scoring, mortgage screening, and assessing borrowers have all incorporated
neural networks as an integral part of the system.

(Stergiou & Dimitrios, 1997)


2.4.3 - Object Trajectories

Predicting the trajectory of an object in real time is another area where neural networks
have been utilised. In those implementations, the position, velocity and acceleration of the
object are estimated by several neural networks using several of the most recent
measurements of the object’s coordinates.

(Marshall & Srikanth, 1999)

2.4.4 - Robot Control

Controlling manipulators (such as robotic arms in a car manufacturing plant) is usually
how robotics and neural networks connect.

When controlling manipulators a neural net must deal with four major problems (Robot
Control, 2008):

• Forward Kinematics
• Inverse Kinematics
• Dynamics
• Trajectory generation

The ultimate goal, of course, is to position the “effector” (i.e. a robotic hand or any other
similar manipulator) in the appropriate position to grasp or otherwise manipulate an object or
part of an object. When a robot arm is one hundred percent accurate that is a (relatively
speaking) simple task, involving the following steps (Robot Control, 2008):

• Determine the target coordinates relative to the robot.


• Calculate the joint angles required for the arm to be in the appropriate position.
• Adjust the arm to the appropriate joint angles and close the effector.

A neural network allows the robotic arms to be very flexible in their operation and
perform self-adjustments as time goes by, compensating automatically for wear and tear on
themselves. This is far preferable to having to perform lengthy and expensive re-calibrations
on any regular basis (Robot Control, 2008).

2.5 - Back-propagation
2.5.1 - Back-propagation overview

In essence, training by back-propagation involves the following steps:

• Present a set of training data


• Compare the network’s output to the desired output at each output neuron and
calculate the error.
• Adjust the weight of the output neurons to lessen the error.
• Assign responsibility to each of the neurons in the previous level, based on the
strength of the weights connecting them to the output neurons.
• Adjust the weight of the neurons to minimise the responsibility.

In order to use a back-propagation training algorithm you must have a non-linear
activation function for your artificial neuron. A commonly used function is the sigmoid
function as described in section 2.1.2 -The Artificial Neuron. The sigmoid function is in the
source code at 9.1.3.1 -“Network.cs” lines 130-133.

The advantage of using a function like this is that it allows us to measure how close
we came to the correct result, allowing us to apply our learning algorithm; the disadvantage
of a differentiable function is that it is through the use of such functions that local optima
become apparent in the search space.

2.5.2 - Back-propagation in depth

In a single layer neural network (or with a single node) the perceptron rule is
applicable. This rule essentially describes adjusting the weight using the difference between
the expected and actual outputs.


\( \Delta w_i = x_i \, \delta \), where \( \delta = (\text{desired output}) - (\text{actual output}) \)

Equation 3 – The delta rule (Matthews, Back-propagation for the Uninitiated, 2002)

However, this rule has no positive effect in a multi-layer neural network, as the effect
that the change in the weight will have on the rest of the network is a missing factor.

Provided you are using a differentiable function, such as the sigmoid function (see 2.1.2
-The Artificial Neuron) an alteration can be made to the perceptron learning rule allowing
this difficulty to be overcome. The sigmoid function differentiates very neatly.

\( f'(x) = x \, (1 - x) \)

Equation 4 - differentiated sigmoid function (Matthews, Back-propagation for the Uninitiated, 2002)

Therefore, the following alteration to the perceptron learning rule would serve our
purposes (Matthews, Back-propagation for the Uninitiated, 2002):

\( \Delta w_i = n \, x_i \, \delta_i \), where \( \delta_i = f'(\mathrm{net}_i)\,(d_i - y_i) \)

Equation 5 - Altered perceptron learning rule (Matthews, Back-propagation for the Uninitiated, 2002)

In this equation “w” and “x” are as before, and “n” is the learning rate of the neural
network. The other two values, “di” and “yi”, are the desired and actual outputs, respectively.

Assuming the sigmoid activation function is in use, the training function for the output
layer nodes can be written as:

\( \delta_i = y_i \, (1 - y_i)\,(d_i - y_i) \)

Equation 6 - Altered delta calculation rule for the output layer (Matthews, Back-propagation for the
Uninitiated, 2002)

Calculating the hidden layer deltas is a little more complicated. The effect that the
change will have on the following neurons in the network is an extra factor.

For neuron \( q \) in hidden layer \( p \): \( \delta_p(q) = x_p(q)\left[1 - x_p(q)\right] \sum_{i} w_{p+1}(q, i)\, \delta_{p+1}(i) \)


Equation 7 - Altered delta calculation rule for the hidden layer(s) (Matthews, Back-propagation for the
Uninitiated, 2002)

As demonstrated above, to calculate the delta for a hidden layer requires the deltas for
the following layers. Therefore, in order to apply this learning mechanism you start at the
output of the network and work your way towards the inputs.
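
To illustrate how Equations 5, 6 and 7 fit together, the sketch below shows one back-propagation step for a single training pattern, assuming a fully connected feed-forward network whose sigmoid outputs are stored per layer. The data layout and names (outputs, weights, desired) are assumptions made for the purpose of the example; this is not the project's actual implementation.

// Illustrative back-propagation step for one training pattern (hypothetical
// helper, not the project's actual implementation).
// outputs[layer][node]     : each node's sigmoid output (layer 0 holds the inputs)
// weights[layer][from, to] : weight from node 'from' in 'layer' to node 'to' in 'layer + 1'
// desired                  : target values for the output layer
static void BackPropagate(double[][] outputs, double[][,] weights,
                          double[] desired, double learningRate)
{
    int last = outputs.Length - 1;
    double[][] deltas = new double[outputs.Length][];

    // Output layer deltas (Equation 6): delta = y(1 - y)(d - y).
    deltas[last] = new double[outputs[last].Length];
    for (int i = 0; i < outputs[last].Length; i++)
    {
        double y = outputs[last][i];
        deltas[last][i] = y * (1 - y) * (desired[i] - y);
    }

    // Hidden layer deltas (Equation 7), working backwards towards the inputs.
    for (int layer = last - 1; layer > 0; layer--)
    {
        deltas[layer] = new double[outputs[layer].Length];
        for (int q = 0; q < outputs[layer].Length; q++)
        {
            double sum = 0.0;
            for (int i = 0; i < deltas[layer + 1].Length; i++)
                sum += weights[layer][q, i] * deltas[layer + 1][i];
            double x = outputs[layer][q];
            deltas[layer][q] = x * (1 - x) * sum;
        }
    }

    // Weight updates (Equation 5): delta-w = n * x * delta.
    for (int layer = 0; layer < last; layer++)
        for (int from = 0; from < outputs[layer].Length; from++)
            for (int to = 0; to < outputs[layer + 1].Length; to++)
                weights[layer][from, to] +=
                    learningRate * outputs[layer][from] * deltas[layer + 1][to];
}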

2.5.3 - Back-propagation library

2.6 - The FANN neural network library (Nissen, FANN)


The FANN library (see (9.3 -Libraries Researched) for other libraries investigated)
implements a back-propagation solution and this library will be used in the project to
implement a feed-forward back-propagation neural network, against which to compare the
genetic algorithm trained neural network.

The FANN library was decided upon for several reasons: it is fast, easy to implement,
open-source (allowing me to alter it if I needed to, although as it turned out I did not),
written in a language that can be linked to C# fairly easily, and it also supports cascade
neural networks, saving me from finding a separate library.

2.7 - Genetic Algorithms


Genetic algorithms attempt to copy natural laws to a certain extent in order to apply a
random search pattern to a defined search space.

Natural selection (Darwin, 1859) very neatly avoids one of the larger problems
involved in software design: how to specify in advance all the possible permutations of the
original problem and how the program should react to those permutations.

By using similar techniques to natural selection, it is possible to “breed” solutions to
problems.

Most organisms evolve by means of sexual reproduction and natural selection (Darwin,
1859).


Sexual reproduction ensures mixing and recombination of genes. During the process,
chromosomes line up together and then cross over partway along their length, swapping
genetic material; this mixing leads to much faster evolution than if offspring simply copied
the genetic material as it is in the parents.

Mutation plays a part in this as well, occasionally causing the genetic information,
during this process, to alter slightly in a random way. This could result in either an effective
or defective solution, but this variety helps to prevent stagnation in the gene pool.
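
As an illustration of these two operators, the following sketch performs single-point crossover between two real-valued parent genomes and applies a small random mutation to a genome. The method names and parameters are hypothetical, not the project's actual code.

// Illustrative single-point crossover and mutation for real-valued genomes
// (hypothetical helpers, not the project's actual implementation).
static readonly Random rng = new Random();

static double[] Crossover(double[] parentA, double[] parentB)
{
    // Pick a crossover point; the child takes genes from A before it and B after it.
    int point = rng.Next(1, parentA.Length);
    double[] child = new double[parentA.Length];
    for (int i = 0; i < child.Length; i++)
        child[i] = i < point ? parentA[i] : parentB[i];
    return child;
}

static void Mutate(double[] genome, double mutationRate, double mutationSize)
{
    // With a small probability per gene, perturb it by a small random amount.
    for (int i = 0; i < genome.Length; i++)
        if (rng.NextDouble() < mutationRate)
            genome[i] += (rng.NextDouble() * 2.0 - 1.0) * mutationSize;
}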

2.7.1 - History of Genetic Algorithms

Genetic algorithms first appeared on computers in the late 1950s and early 1960s,
mostly designed to help evolutionary biologists model aspects of natural evolution. By 1962,
many researchers had independently developed evolution-inspired algorithms for function
optimisation and machine learning, but their work received little follow-up.

In 1965, Ingo Rechenberg, then of the Technical University of Berlin, introduced a
technique he named “evolution strategy”, although this technique involved no crossover, or
indeed multiple genomes in a population.

In 1975, John Holland published the book “Adaptation in Natural and Artificial Systems”,
building on papers both by himself and by other researchers. This book presents the concepts
of adaptive digital systems using mutation, selection and crossover, simulating the processes
of biological evolution.

Also in 1975, a dissertation by Kenneth De Jong established the potential of genetic
algorithms by showing that they perform well on a wide variety of data, including noisy and
discontinuous data.

By the early 1980s, genetic algorithms were becoming more widely used, across a
range of subjects. At first, this application was largely theoretical; however, genetic
algorithms soon moved into commercial territory and nowadays help solve many problems in
many different areas.

(Marczyk, 2004)

2.7.2 - Advantages and Disadvantages of Genetic Algorithms


2.7.2.1 - Advantages

The primary advantage of genetic algorithms is their inherent parallelism. Many other
algorithms are largely serial and can only explore a search-tree so far in any one direction
before abandoning all progress and starting again from the beginning, or further up the
hierarchy.

Genetic algorithms effectively explore many different branches of the tree at once and
when a certain branch turns out to be non-optimal, abandon that search, proceeding with
other more likely candidates.

This leads to the exploration of a large proportion of the search space, with processing
power exponentially dedicated to better areas as the exploration progresses.

The real advantage of parallelism, however, is that by evaluating the relatively small
search-space that it does, a genetic algorithm is implicitly evaluating a much larger group of
individuals, in the same way that the average response of a relatively small percentage of the
population of any given country can be used to fairly accurately predict national trends.

This is the “Schema Theorem”, which allows a genetic algorithm to move towards the search-
space with the most promising individuals in a timely fashion and then select the best
member of that group (Marczyk, 2004).

Thanks to the parallelism that is a genetic algorithm’s main advantage, genetic algorithms are very
well suited as a means of exploring search spaces too large to be effectively searched in a
reasonable amount of time by other, more conventional, methods. These are “non-linear”
problems. In a linear problem, each component’s fitness is individual and any improvement
to an individual component’s fitness is necessarily an improvement to the whole. Many real-life
problems are not like this; they tend towards non-linearity, where altering one component’s
fitness positively may make other components less fit and cause a ripple effect across the
whole system. This non-linearity results in a huge increase in the search space, making
genetic algorithms an effective way to search them, due to their strength in navigating large
search spaces (Marczyk, 2004).

A third advantage of genetic algorithms is that they do not tend to be easily trapped by
local optima, due again to the parallelism of their approach, the random nature of the various
starting points of the initial population and the other methods they employ, such as crossover
and mutation.

Without crossover, selection and mutation a genetic algorithm is metaphorically similar
to a large number of parallel depth first algorithms, each searching their own space for the
best solution. Selection allows the pruning of the least promising searches, crossover allows
promising solutions to share their success and mutation allows random changes in the local
search space of a given solution.

In infinite or very large search spaces, it is hard to know whether we have reached the
global optimum, or merely very good local optima. However, genetic algorithms tend to give
good results compared to other search strategies.

One of the largest strengths of genetic algorithms can at first glance appear to be their
largest weakness. Genetic algorithms have no prior knowledge about the problem they are
trying to solve; they produce random changes to their candidates and then use the objective
function to determine whether those changes are positive or negative overall.

This allows them to discover solutions that other algorithms may have over-looked, or
never contained in their search space in the first place. A good example of this is the concept
of negative feedback, rediscovered by genetic algorithms, but denied a patent for several
years because it ran counter to established beliefs (Marczyk, 2004).

2.7.2.2 - Disadvantages

One of the disadvantages of a genetic algorithm is the necessity of representing the
problem the genetic algorithm is trying to solve, in a form that the genetic algorithm can use.

Generally, genetic algorithms use strings of binary, integer, or real-valued numbers,
with each number representing some distinct part of the solution (Marczyk, 2004).

Another method is to use genetic programming, where the actual code of the program
forms the genomes in the genetic algorithm and it can swap portions in or out, or adjust them.

A second disadvantage of genetic algorithms is that the fitness function is crucial to the
development of a suitable solution for the problem. If you have a poorly written fitness
function, the genetic algorithm may end up solving an entirely different problem from the
originally intended one.

A third issue is setting the correct mutation rate, population size, etc. If the mutation rate is
too high, the system will never converge towards a suitable solution; however, if it is too low,
a comparatively smaller segment of the search space will be covered and the eventual
solution may take longer to reach, or never be reached as a result. Likewise, if the size of the
population is too low, the investigation may cover too little of the search space to find the
optimum solution (Marczyk, 2004).

A further problem is that fitness functions can be deceptive, if the solutions to various
problems are in areas of the search space that the fitness function would not consider likely,
then genetic algorithms (and most other search techniques) are no better than a random search
for finding the solution.

For example, in the diagram below, the global optimum is at the far right; however, that is
only one point, there is no slope leading to it, and therefore no gradual increase can lead you to it:
you either have the solution or you don’t. In this example, a properly constructed genetic
algorithm would be likely to settle on the marginally less optimal, but altogether easier to
find, local optimum in the centre of the diagram, unless by random chance it discovered the
point with the most optimal solution.


Figure 4 - local optimum (units are undefined in this case).

A further problem is premature convergence: this occurs when a mutation early in the
process produces a large leap in the fitness of that particular genome, which will then
reproduce abundantly, lowering the diversity of the population and resulting in genetic
algorithms possibly falling into the local optima that the mutation represents. There are
various methods for solving this, including sigma scaling and Boltzmann selection (Mitchell,
1996).

Finally, where analytical solutions exist, they should take precedence over genetic
algorithms. This is because analytical solutions usually produce more accurate results faster
than a genetic algorithm (or any heuristic method) is capable of (Marczyk, 2004).


2.8 - Training Neural Networks with Genetic Algorithms


Training neural networks is a form of problem solving, involving finding the optimum
values for a set of real numbers (connection weights) which will produce the least error in the
results from the neural network.

The error surfaces associated with these problems tend to be highly variable and
contain many local optima; however, the method of choice for training neural networks is
usually back-propagation (see 2.5 -Back-propagation), which is a gradient descent algorithm
and as such can be easily trapped in local optima (Branke, 1995). There are methods to try to
avoid this problem when using back-propagation, such as adding a “momentum” term to the
algorithm to help it move beyond local optima; however, these are not a complete
solution (Blum, 1992).
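
As a rough preview of the process described in the following subsections, the sketch below outlines one generation of a genetic algorithm evolving weight genomes for a fixed-structure network: rank by fitness, keep the better half as parents, then breed and mutate to form the next population. The helper methods it calls (MeanSquaredError, Crossover, Mutate) and the NeuralNetwork type are hypothetical and are sketched elsewhere in this section; the selection scheme shown is one simple possibility, not the project's actual algorithm.

using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative outline of one generation of a genetic algorithm evolving the
// weight genomes of a fixed-structure network (hypothetical names; Crossover,
// Mutate and MeanSquaredError are sketched elsewhere in this section).
static double[][] NextGeneration(double[][] population, NeuralNetwork network,
                                 double[][] inputs, double[][] targets)
{
    var rng = new Random();

    // Rank genomes by fitness: a lower mean squared error is better.
    double[][] ranked = population
        .OrderBy(genome => MeanSquaredError(network, genome, inputs, targets))
        .ToArray();

    int parents = Math.Max(2, ranked.Length / 2);   // keep the better half as parents
    var next = new List<double[]>();

    while (next.Count < population.Length)
    {
        // Breed two of the better genomes and mutate the child slightly.
        double[] a = ranked[rng.Next(parents)];
        double[] b = ranked[rng.Next(parents)];
        double[] child = Crossover(a, b);
        Mutate(child, 0.05, 0.2);
        next.Add(child);
    }
    return next.ToArray();
}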

2.8.1 - Determining Weight Values with Genetic Algorithms

Genetic algorithms (as described in 2.7 -Genetic Algorithms) are good at avoiding
being stuck on local optima due to their searching several regions of search space
simultaneously. Genetic algorithms also have the benefit that they do not place any
restrictions whatsoever on the architecture of the neural network, since you are merely
increasing the size of the genomes being worked with, not altering them besides that in any
way.

In addition, the only information genetic algorithms require to train a neural network is
an objective function (as described in 2.7 -Genetic Algorithms), to test the appropriateness of
a given genome.
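
A minimal sketch of such an objective function is given below: the fitness of a genome is taken to be the mean squared error that the decoded network produces over the training set, with lower values being fitter. The NeuralNetwork type and its SetWeights and Run methods are assumed purely for illustration and are not the project's actual classes.

// Illustrative objective (fitness) function: decode the genome into network
// weights, run the network over the training set and return the mean squared
// error. The NeuralNetwork type is assumed for illustration only.
static double MeanSquaredError(NeuralNetwork network, double[] genome,
                               double[][] trainingInputs, double[][] trainingTargets)
{
    network.SetWeights(genome);   // the genome is the concatenated weight list

    double total = 0.0;
    int count = 0;
    for (int p = 0; p < trainingInputs.Length; p++)
    {
        double[] actual = network.Run(trainingInputs[p]);
        for (int o = 0; o < actual.Length; o++)
        {
            double error = trainingTargets[p][o] - actual[o];
            total += error * error;
            count++;
        }
    }
    return total / count;         // lower is fitter
}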

2.8.2 - Representation of a Neural Network within a Genetic Algorithm

As a rule, a neural network representation within a genetic algorithm is a concatenation
of all the weights in the neural network.


Crossover can disrupt genes that are far apart, but is less likely to do so for genes close
together. Because of this it is helpful to keep functional units close together (i.e. place a
neuron’s input and output weights, a node’s input weights and the nodes of a layer, side by
side).

One of the most important decisions when deciding how to represent a neural network
within a genome is whether to use binary strings or real number strings.

Standard genetic algorithms use binary strings; however, representing a large number of
real-valued numbers as a binary string can lead to very large strings (thousands of bits)
(Branke, 1995).
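
The sketch below illustrates this kind of concatenation, assuming the weights are held per layer as a matrix indexed by source and destination node; the genome is built layer by layer and node by node, so that each node's incoming weights sit side by side in the string. The helper and its data layout are hypothetical, not the project's actual code.

using System.Collections.Generic;

// Illustrative flattening of a network's weights into a single real-valued genome,
// keeping each node's incoming weights contiguous (hypothetical helper;
// weights[layer][from, to] is the weight into node 'to' of the next layer).
static double[] WeightsToGenome(double[][,] weights)
{
    var genome = new List<double>();
    foreach (var layer in weights)
        for (int to = 0; to < layer.GetLength(1); to++)        // one node at a time...
            for (int from = 0; from < layer.GetLength(0); from++)
                genome.Add(layer[from, to]);                   // ...so its inputs stay adjacent
    return genome.ToArray();
}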

2.8.3 - Using Genetic Algorithms to Determine Neural Network Structure

As well as using a genetic algorithm to evolve the weight set for a fixed-structure
neural network, it is also possible to use a genetic algorithm to evolve a structure for a neural
network.

The average neural network has an input layer, one or two hidden layers of an
indeterminate number of nodes and one output layer. The problem at hand dictates the input
and output layer structures.

These are all connected, outputs to inputs, to form a feed forward network.


Figure 5 - A feed-forward neural network

This architecture is then altered (number of nodes in each hidden layer, number of
hidden layers) by trial and error and intuition to establish a neural network that performs well
for the problem at hand.

However, this is not necessarily the most efficient form for a network for any given
problem and the trial and error method is an inefficient method for determining the correct
architecture for an optimally performing neural network for the problem at hand.

If neural network architecture is too simplistic for the problem at hand, the neural
network will never learn to solve the problem (or come close enough to suggest a solution).

On the other hand, if the neural network architecture is too complex, the neural network
will learn fast but will generate too specific a solution and will not generalise between similar
inputs. As mentioned previously, a neural network’s generalisation is its main strength and
therefore you do not wish to lose it.

Although network architecture plays such a large role in network performance, there is
currently no method by which to establish, given a specific problem, a neural network
topology to deal with that problem. There is also no method to check how optimal your
solution is (Branke, 1995).


2.8.3.1 - Representing neural network structure

Representing the structure of a neural network within a genetic algorithm is not as
straightforward as representing the weight values of a neural network within a genetic
algorithm.

Ideally, a representation is required which can accommodate all networks that will
work for the problem but none that will not, thus eliminating the searching of meaningless
space and by definition, being able to contain the most optimum solution for the problem at
hand.

There are two methods for representing neural network topologies within a genetic
algorithm, the strong, direct, or low-level encoding and the weak, indirect, or high-level
encoding (Branke, 1995).

Low-level encodings specify each connection individually, whereas high-level
encodings group together connections and/or nodes.

In direct encoding, the genetic algorithm represents the neural network by means of its
connections, thus meaning that in order to remove a node completely, you must remove all
connections to or from that node completely.

In cases where a genetic algorithm determines the number of hidden layers and the
number of nodes per hidden layer and the network itself is interconnected fully, an exception
is possible.

In one version of high level encoding of a neural network, the network is divided into
areas and for each area the number of nodes and the density of connections to other nodes is
determined (Branke, 1995).

Back propagation learning rate is also a parameter for one of these areas.


One of the reasons why alternatives to low-level encoding arose is the exponential rise
in the number of connections, as the size of the neural network increases. These mapping
methods are therefore more suitable for small networks or networks with a small number of
connections.

This could be useful, by limiting the potential size of the networks evolved and
preventing the generation of overly large and complex networks but on the other hand, it is
impossible to know whether these large and complex networks may be the most suitable way
to solve the problem at hand.

As mentioned above, neural networks that are too large will learn a problem fast
but will not generalise well, and therefore their usefulness is severely limited. Neural
networks that are too small, on the other hand, will never succeed in learning the problem.
Therefore, a balance between speed of learning and the generalisation exhibited in the trained
network is necessary.

A problem with high level encoding is that regular networks are favoured but not every
network is equally probable. This can lead to human bias entering the system and affecting
the outcome, which is obviously undesirable.

This section is largely drawn from information in “Curved trajectory prediction using a
self-organizing neural network” (Branke, 1995).

2.9 - Cascade Correlation


The cascade learning architecture is a method by which the training algorithm builds
the neural network as it proceeds. The algorithm attempts to solve the issues associated with
back-propagation that produce slow learning: the step-size problem and the
moving-target problem (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture,
1991).


The system begins with a basic neural network of inputs and outputs (numbers of each
dictated by the task as usual) and no hidden layer.

The connection between each input and output is via an adjustable weight. There is also
a bias input, with a value of one. The output units may use either a linear or a differentiable
activation function. The algorithm then trains the adjustable weights using the perceptron rule (as
described in 2.5.2 -Back-propagation in depth), or sometimes the quickprop rule (Fahlman,
An Empirical Study of Learning Speed in Back-Propagation Networks, 1988), with the aim
of getting as close to a solution as possible.

After a number of training cycles pass with no substantial improvement (the precise
number being determined by a user-supplied “patience” parameter), either an extra node is
added or, if the network’s output is of sufficient quality, the training is ended.

If the addition of a unit is triggered, a number of “candidate” units are initialised and then trained using either the perceptron rule (2.5.2 -Back-propagation in depth) or the quickprop rule (Fahlman, An Empirical Study of Learning Speed in Back-Propagation Networks, 1988).

The “candidate” nodes take the neural net inputs, plus the outputs of any previously created units, and attempt to maximise “s”, the sum over all output units “o” of the magnitude of the correlation between “v”, the candidate unit’s value, and “Eo”, the residual output error at “o” (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

s = \sum_{o} \left| \sum_{p} \left( v_{p} - \bar{v} \right) \left( E_{p,o} - \bar{E}_{o} \right) \right|

Equation 8 - The definition of 's', given that 'o' is the network output at which the error is measured, 'p' is the training pattern, 'v' is the candidate unit's value, and the quantities '\bar{v}' and '\bar{E}_o' are the values of v and E_o averaged over all patterns (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).
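
To make Equation 8 concrete, the following is a minimal illustrative sketch in C# (not the project's code; the array names are assumptions for the example) that computes s for one candidate unit, given its value for every training pattern and the residual error at every output for every pattern:

using System;

static class CascadeSketch
{
    // s = sum over outputs o of | sum over patterns p of (v_p - vMean) * (E_po - eMean_o) |
    public static double CandidateCorrelation(double[] v, double[,] E)
    {
        int patterns = v.Length;
        int outputs = E.GetLength(1);

        // Candidate value averaged over all patterns
        double vMean = 0.0;
        for (int p = 0; p < patterns; p++) vMean += v[p];
        vMean /= patterns;

        double s = 0.0;
        for (int o = 0; o < outputs; o++)
        {
            // Residual error at this output averaged over all patterns
            double eMean = 0.0;
            for (int p = 0; p < patterns; p++) eMean += E[p, o];
            eMean /= patterns;

            // Correlation-like sum for this output
            double sum = 0.0;
            for (int p = 0; p < patterns; p++)
                sum += (v[p] - vMean) * (E[p, o] - eMean);

            s += Math.Abs(sum);
        }
        return s;
    }
}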

In order to maximise "s", the partial derivative of "s" with respect to each of the candidate unit's incoming weights "wi" must be calculated.

\frac{\partial s}{\partial w_{i}}

Equation 9 - The partial derivative of 's' (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

Expansion and differentiation of this calculation can then take place.

\frac{\partial s}{\partial w_{i}} = \sum_{p,o} \sigma_{o} \left( E_{p,o} - \bar{E}_{o} \right) f'_{p} \, I_{i,p}

Equation 10 - The partial derivative of 's' with respect to each of the candidate's incoming weights 'wi', where 'σo' is the sign of the correlation between the candidate's value and output 'o', 'f'p' is the derivative for pattern 'p' of the candidate unit's activation function with respect to the sum of its inputs, and 'Ii,p' is the input the candidate receives from unit 'i' for pattern 'p'.

After these computations are complete, a gradient ascent to maximise "s" can take place.
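
As a minimal illustrative sketch of that step (again not the project code, and with array names that are assumptions for the example), a single ascent update for one incoming weight could be computed from the quantities defined in Equation 10:

using System;

static class CandidateTrainingSketch
{
    // Returns the amount added to weight w_i for one gradient-ascent step.
    // sigma[o]  : sign of the correlation between the candidate's value and output o
    // E[p, o]   : residual error at output o for pattern p (eMean[o] is its mean over patterns)
    // fPrime[p] : derivative of the candidate's activation function for pattern p
    // I[p, i]   : input i received by the candidate for pattern p
    public static double WeightStep(int i, double[] sigma, double[,] E, double[] eMean,
                                    double[] fPrime, double[,] I, double learningRate)
    {
        double gradient = 0.0;
        for (int p = 0; p < fPrime.Length; p++)
            for (int o = 0; o < sigma.Length; o++)
                gradient += sigma[o] * (E[p, o] - eMean[o]) * fPrime[p] * I[p, i];

        // Ascent: move in the direction that increases s
        return learningRate * gradient;
    }
}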

Once the correlation on a candidate has reached an acceptable level, or the number of completed training epochs reaches its maximum, the best of the candidates is connected to the neural network as it stands, and the inputs to that node are frozen.

The output of this node is linked by adjustable weights to the inputs of all the output nodes. The training iterations for the adjustable weights in the main network then continue.

Figure 6 - The initial state of a cascade correlation neural network. The circular connections are adjustable weights (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

Figure 7 - The second state of a cascade correlation neural network. A first node has been added. The square connections are locked weights (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991).

Figure 8 - The third state of a cascade correlation neural network, with two nodes added (Fahlman & Lebiere,
The Cascade-Correlation Learning Architecture, 1991).

2.10 - C# User Interface Programming


The C# user interface uses the Windows Forms (WinForms) API. WinForms allows you to “draw” objects on a form in a visual manner to design your layout, and then access and alter properties and sub-properties of those objects in your programming code. WinForms can host each window on its own thread to take advantage of modern multi-core machines, although this makes it more difficult to move data between the forms. The method I will use is based on “delegate functions” (similar to “function pointers” in C++) and “events” (Lysle, 2007).
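
As a minimal illustrative sketch of that approach (the form, event, and class names here are assumptions for the example, not the names used later in the project), one form declares a delegate type and an event, and the receiving form subscribes a handler before showing it:

using System;
using System.Windows.Forms;

// Storage class carrying the data passed between forms
public class SettingsEventArgs : EventArgs
{
    public int PopulationSize { get; set; }
}

public class WizardForm : Form
{
    // Delegate type and event used to notify a listener that settings were chosen
    public delegate void SettingsChosenHandler(object sender, SettingsEventArgs e);
    public event SettingsChosenHandler SettingsChosen;

    private void Finish()
    {
        // Raise the event if anything has subscribed to it
        if (SettingsChosen != null)
            SettingsChosen(this, new SettingsEventArgs { PopulationSize = 250 });
        Close();
    }
}

public class MainForm : Form
{
    public void ShowWizard()
    {
        WizardForm wizard = new WizardForm();
        wizard.SettingsChosen += OnSettingsChosen;   // subscribe the handler
        wizard.ShowDialog();
    }

    private void OnSettingsChosen(object sender, SettingsEventArgs e)
    {
        // Use the data passed back from the wizard
        MessageBox.Show("Population size: " + e.PopulationSize);
    }
}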

Another challenge is making the neural network training independent of the user interface, in order to avoid the interface freezing whilst a network trains. The solution I plan to use is the BackgroundWorker class provided with the .NET Framework, which allows you to assign a delegate function as the background task and triggers events upon completion, for which event handlers are executed (Microsoft, BackgroundWorker Class, 2009).
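
A minimal sketch of that pattern follows (the handler names are illustrative only): the BackgroundWorker runs the DoWork delegate on a worker thread and raises RunWorkerCompleted when it finishes, so the user interface thread stays responsive.

using System.ComponentModel;
using System.Windows.Forms;

public class TrainingForm : Form
{
    private readonly BackgroundWorker worker = new BackgroundWorker();

    public TrainingForm()
    {
        // Wire up the background task and its completion handler
        worker.DoWork += Worker_DoWork;
        worker.RunWorkerCompleted += Worker_RunWorkerCompleted;
    }

    private void StartTraining()
    {
        if (!worker.IsBusy)
            worker.RunWorkerAsync();   // the work runs off the UI thread
    }

    private void Worker_DoWork(object sender, DoWorkEventArgs e)
    {
        // Long-running work (e.g. training a network) goes here
        e.Result = true;
    }

    private void Worker_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
    {
        // Raised back on the UI thread, so it is safe to update controls here
        bool success = (bool)e.Result;
        Text = success ? "Training complete" : "Training failed";
    }
}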

Making a C# user interface interact with C++/CLI DLLs simply involves adding a reference to the DLL in your project, via the Visual Studio® user interface.

2.11 - Writing C++/CLI Wrappers for Unmanaged C++ code


The cleanest method of connecting a native C++ or C DLL (or piece of code) to a managed C# interface is via a C++/CLI bridge class.

The VC++ compiler can swap between managed and unmanaged code on the fly whilst compiling, and allows you to have both managed and unmanaged code (in C++) as part of the same project. Thus, a managed class can communicate with the unmanaged class (performing appropriate marshalling of the non-primitive data types (Microsoft, Using C++ Interop (Implicit PInvoke), 2009)) and the C# code can then access it directly.

Although it is possible for a C# program to interact with an unmanaged DLL directly, this method is less error-prone and more likely to be robust and stable, as well as making for cleaner code.

2.12 - Application of Research


The research presented here will help to produce the deliverables for milestone two and
milestone three of my final year project: I plan to combine C# for the user interface and C++
for the neural network back-ends in order to create an efficient, powerful and fast solution.

Section 2.11 -Writing C++/CLI Wrappers for Unmanaged C++ code covers the combination of C# and C++, so I will not cover it again here. I plan to use the DLL method, with one DLL combining managed and unmanaged C++ linked into my C# user interface code.

The user interface will then have methods to adjust parameters such as the genetic algorithm's population size, number of generations, and so on, in order to test which kind of genetic algorithm learns and develops fastest. It will also allow you to specify the number of back-propagation iterations.

The application will be used to evaluate the difference in performance between the genetic algorithm, back-propagation and cascade correlation algorithms as neural network training methods.

3 - Design
The purpose of this application will be to allow the user to run a back-propagation
trained neural network, or a genetic algorithm trained neural network and present the results
of the network in such a fashion as to allow analysis and comparison of the training methods
and their suitability in different situations.

3.1 - Program Requirements


The program must be able to:

• Run a genetic algorithm on a feed-forward, fully interconnected neural network to train the network.
• Run a back propagation algorithm on a similar network to train the network.
• Run a cascade algorithm on a similar network to train the network.
• Allow the user to set appropriate variables (see 3.4.2 -New Network Wizard) to adjust
the execution of the learning algorithm selected for the neural network.
• Allow the user to select which learning algorithm they wish to use.
• Display the results of running that algorithm on a neural network in a meaningful
fashion.

3.2 - Design of the Class Structure


Before I arrived at a design that was both effective and simple to program, I tried and discarded two designs. I eventually used the third design, and it is that third design that is documented here.

The two prior designs involved representing neural networks via objects and compositions of objects in vectors, which led to flexible, but slow and overly complex, code upon experimentation.

The final solution, the one documented here, is far simpler in terms of the neural network representation (it is far more detailed in the other areas than the initial designs were).

This solution loosely follows a model-view-controller architecture. The solution enables easy replacement of sections of code as long as the interfaces and outputs remain the same.

For more information on the class structure, see section 9.2 -UML Class Diagram.

3.3 - Linking C# code to managed dll’s


Linking C# code to managed DLLs is a simple process in Visual Studio, as explained in (2.11 -Writing C++/CLI Wrappers for Unmanaged C++ code): select the references section of the C# project in question and add the DLL you wish to refer to. You can then add a “using” directive for the DLL namespace and create, destroy, and use the classes within as if they were native C# classes. This can be seen in (9.1.2.6 -“Algorithms\BackProp.cs”) and (9.1.2.4 -“Algorithms\CascadeCorrelation.cs”) with the “using FANN_Wrapper;” directive.

This is superior to the direct C# to unmanaged code linking that was used in milestone two, since it makes for cleaner and thus more easily maintainable code, and the code is more robust in this fashion.

The robustness comes from the fact that C++/CLI is far better at interacting with unmanaged C++ code than C# is. Indeed, managed and unmanaged C++ can even be in the same source file and compiled into one object. Therefore, by using a C++/CLI wrapper to interface to the unmanaged C++, and then interacting with the C++/CLI wrapper(s) from C#, errors are less likely to arise.

3.4 - Design of the User Interface


The construction of the user interface utilised the WinForms-based tools in Visual Studio 2008 (see 2.10 -C# User Interface Programming).

I also used the ZedGraph control (ZedGraph) to display the graph of my results.

3.4.1 - Main Form

The main form is the crux of the application (as is default in C# Winforms
programming) and if the main form is closed, all other forms and classes are disposed and the
application terminates.

3.4.1.1 - Network and Output tab

This tab needs a method by which to initialise the adding of a new network (3.4.2 -New Network Wizard), a method to train the current network, and a method to test that training.

Three buttons will therefore be present, along with two displays. One display will show
network details and one display will show the training and testing output.

Figure 9 – Network and output tab design (Main form)

3.4.1.2 - Mean Squared Error Graph tab

The mean squared error graph tab simply needs to display the graph of the mean squared error over the current epoch/generation.

Figure 10 - Mean squared error tab design (Main form)

3.4.1.3 - Dataset tab

This tab needs two display sections, to show the training data and to show the testing
data.

Figure 11 - Dataset tab design (Main form)

3.4.2 - New Network Wizard

The new network wizard is the form that appears when the user selects the creation of a new network. It allows the user to move in a logical progression through the set-up process for a new neural network and its associated training algorithm. The user can progress through the steps via the “previous” and “next” buttons, or via the tabs at the top in whatever order they choose. Verification of all data takes place before the form passes control back to the main form and triggers a main form event, with a parameter that takes a specially constructed class to transfer the data from the wizard.

3.4.2.1 - Learning Algorithm tab

On the learning algorithm tab you select the learning algorithm for the neural network
you are creating, out of three choices previously mentioned (2.5 -Back-propagation, 2.7 -
Genetic Algorithms, 2.9 -Cascade Correlation).

Figure 12 - Design of the training algorithm tab (Wizard form)

3.4.2.2 - Network Settings tab

This tab allows you to set appropriate settings for the network and training algorithm
you have selected in 3.4.2.1 -Learning Algorithm tab (if no algorithm has yet been selected,
the default is genetic algorithm).

Figure 13 - Design of the network settings tab (Wizard form)

3.4.2.3 - Dataset tab

The dataset tab allows you either to select from two pre-setup datasets, or to choose
custom data sets from your hard disk for training and testing data (appropriately formatted in
the FANN style).

Figure 14 - Dataset selection tab design (Wizard form)

3.4.3 - Working Form

The working form is a small display shown as a dialog (locking out access to the form that launched it) while a neural network is training, both to protect the main form from data inputs that might cause training problems and to provide a visual indication that work is taking place.

Figure 15 - Working form design (Working form)

3.4.4 - About Form

The about form displays information about the application (author, project supervisor, project description), licensing information, and information on the pre-selected datasets. Clicking the datasets button or the licenses button displays the appropriate information; clicking the same button again hides that information.

Figure 16 - About form design (About form)

3.4.4.1 - Datasets display

Clicking the data sets button displays a tabbed interface, one tab with XOR information
and one tab with Fishers Iris Data information.

Figure 17 - Dataset display design (About form)

3.4.4.2 - Licenses display

Clicking the licenses button displays a small information panel with selection methods;
depending on your selection, it will display the appropriate license.

Figure 18 - Licenses display design (About form)

4 - Implementation
The implementation is largely in C#, with small portions in C++/CLI and the back-end
FANN (Nissen, FANN) library written in C and linked as a static library to the managed
FANN_Wrapper dynamic link library.

4.1 - User Interface Implementation


The implementation of the user interface is mainly in C#, mostly via the visual design tools found in Visual Studio 2008.

4.1.1 - Main Form

The main form's code is in section 9.1.1.2 -“FrmMain.cs”.

4.1.1.1 - Network and Output tab

Figure 19 - Network and Output tab

4.1.1.2 - Mean Squared Error Graph tab

Figure 20 - Mean squared error graph tab

4.1.1.3 - Dataset tab

Figure 21 - The dataset tab

4.1.2 - New Neural Network Wizard


4.1.2.1 - Learning Algorithm tab

Figure 22 - Learning Algorithm tab

4.1.2.2 - Network Settings tab

Figure 23 - Network Settings tab

4.1.2.3 - Dataset tab

Figure 24 - Dataset tab

4.1.3 - About Form

Figure 25 - About form, main view

4.1.3.1 - Datasets View

Figure 26 - About form, Datasets view

4.1.3.2 - Licenses View

Figure 27 - About form, Licenses view

4.1.4 - Working Form

Figure 28 - Working form

4.1.5 - Passing Information between Forms

Passing the data between the wizard form and the main form takes place as previously
described (2.10 -C# User Interface Programming) via events and delegate functions.

An event is a method of notifying a listening object that something has occurred.


Events, when instantiated, are associated with a delegate function to respond to that event
(Lysle, 2007).

The declaration and association of a delegate and an event takes place in the form that
needs to generate an event (9.1.1.3 -“Wiz1.cs”, lines 29 – 31).

The “NetworkUpdatedEventArgs” class is a storage class for passing data (see 9.1.1.1
-“NetworkUpdatedEventHandler”).

The declaration of a new event handler in the main form takes place prior to displaying
the new neural network wizard form (see 9.1.1.2 -“FrmMain.cs” lines 120-129).

A new event is triggered in the Wizard form when the user is on the last tab and clicks “next” (see 9.1.1.3 -“Wiz1.cs” lines 145-163).

In the event handler in the main form, the processing of the class sent from the Wizard
form takes place (see 9.1.1.2 -“FrmMain.cs” lines 70-119).

4.1.6 - Keeping the User Interface Active whilst Processing is Occurring

The user interface delegates the processing to a background worker class. This leads to a user
interface that is much more accessible (Microsoft, BackgroundWorker Class, 2009).

Firstly, the background worker class is instantiated and supplied with references to the event handler functions (see 9.1.1.2 -“FrmMain.cs”, lines 29-44).

The RunWorkerAsync function is then executed, and the working form is shown (see 9.1.1.2 -“FrmMain.cs”, lines 266-281).

The two event-handling functions are at 9.1.1.2 -“FrmMain.cs”, lines 306-353.

4.2 - Genetic Algorithm Implementation


Although the original design was to implement the genetic algorithm in C++, after having used C# extensively in the UI I felt more comfortable both about the potential performance loss and about using the language itself.

Therefore, when the C++ implementation of genetic algorithms that I wrote functioned,
but had issues with memory leaks and was slow, I decided to try re-implementing it in C#.

Once this implementation was complete, it was both faster than the C++ implementation (due mostly to my use of the built-in .NET sort methods in the sorting part of the genetic algorithm) and did not suffer from the memory leaks (being managed code), so that is the version that exists in the application today (see 9.1.2.5 -“Algorithms\GeneticAlgorithm.cs” for the C# implementation).

The main points to note in the code as it stands are:

• The crossover (single-point) and mutation (random increment or decrement) implementations (see 9.1.2.5 -“Algorithms\GeneticAlgorithm.cs” lines 176-204); an illustrative sketch of these operators follows this list.
• The sorting implementation in GANetwork.cs that links into the .NET built-in functionality (see 9.1.3.2 -“GANetwork.cs” lines 32-44).
• The neural network specific functionality added to the network model in the network specialisation class GANetwork (see 9.1.3.2 -“GANetwork.cs” lines 51-126).
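
The following is a minimal illustrative sketch of the two operators named above, under the assumption that each chromosome is simply an array of connection weights; it is a simplified example rather than a copy of the code in the appendix.

using System;

static class GeneticOperatorsSketch
{
    static readonly Random rng = new Random();

    // Single-point crossover: the child takes the left part of one parent and the right part of the other.
    // Both parents are assumed to be the same length.
    public static double[] Crossover(double[] parentA, double[] parentB)
    {
        int point = rng.Next(1, parentA.Length);   // never 0, so both parents contribute
        double[] child = new double[parentA.Length];
        for (int i = 0; i < child.Length; i++)
            child[i] = (i < point) ? parentA[i] : parentB[i];
        return child;
    }

    // Mutation as a random increment or decrement applied to randomly chosen weights
    public static void Mutate(double[] weights, double mutationProbability, double stepSize)
    {
        for (int i = 0; i < weights.Length; i++)
        {
            if (rng.NextDouble() < mutationProbability)
                weights[i] += (rng.NextDouble() < 0.5) ? stepSize : -stepSize;
        }
    }
}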

4.2.1 - Back-propagation and Cascade Correlation Implementation

The FANN library (Nissen, FANN), a neural network library written in C, implements both cascade correlation and back-propagation.

The FANN library is compiled as a static library, and the DLL is linked against it to allow access to the functionality. There is an unmanaged C++ wrapper (9.1.4.3 -“UnManaged_FANN_BackProp.h”; 9.1.4.5 -“UnManaged_FANN_CasC.h”; 9.1.4.4 -“UnManaged_FANN_BackProp.cpp”; 9.1.4.6 -“UnManaged_FANN_CasC.cpp”) for the C-based FANN functions in the DLL, and a C++/CLI wrapper (9.1.4.1 -“FANN_Wrapper.h”; 9.1.4.2 -“FANN_Wrapper.cpp”) for the unmanaged C++ classes.

C# accesses the C++/CLI class directly, which allows it to manipulate the FANN
functions in the static library through this class.

4.3 - Data Parser Implementation


The data parser (9.1.2.2 -“DatasetParser.cs”) parses data directly from the rich text boxes in which the data is displayed on the main form, via references to these objects passed down through the hierarchy (references are passed to avoid the necessity of passing entire strings down the hierarchy).

In the constructor (lines 41 – 77), the textual input is separated, via a number of designated separator characters, into arrays of strings.

From there, these arrays form parameters for the other functions, which, using the data from the heading (number of epochs of data, number of inputs, number of outputs), separate the data into two-dimensional arrays suitable to be passed to other functions.
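
As a minimal illustrative sketch of this idea (the names are assumptions for the example, and it handles only a basic FANN-style layout of a heading followed by alternating input and output lines), the parsing could look like this:

using System;

static class DatasetParsingSketch
{
    // Splits the text into a heading plus two jagged arrays: inputs[pair][i] and outputs[pair][o]
    public static void Parse(string text, out double[][] inputs, out double[][] outputs)
    {
        char[] separators = { ' ', '\t', '\r', '\n' };
        string[] tokens = text.Split(separators, StringSplitOptions.RemoveEmptyEntries);

        int pairs = int.Parse(tokens[0]);    // heading: number of data pairs,
        int numIn = int.Parse(tokens[1]);    // inputs per pair,
        int numOut = int.Parse(tokens[2]);   // outputs per pair

        inputs = new double[pairs][];
        outputs = new double[pairs][];

        int t = 3;   // first token after the heading
        for (int p = 0; p < pairs; p++)
        {
            inputs[p] = new double[numIn];
            for (int i = 0; i < numIn; i++) inputs[p][i] = double.Parse(tokens[t++]);

            outputs[p] = new double[numOut];
            for (int o = 0; o < numOut; o++) outputs[p][o] = double.Parse(tokens[t++]);
        }
    }
}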

4.4 - Neural Network Implementation


Although two previous attempts at programming a neural network followed an object-based structure (network, layers, and nodes), the eventual implementation turned out to be basic yet effective (9.1.3.1 -“Network.cs”).

Three jagged arrays are utilised, which store, respectively, the inputs, the weights, and the outputs. The inputs and weights are structured so that the coordinates of an input are also the coordinates of its corresponding weight (lines 17-19). In the initialisation routines (lines 28-124), all of the arrays are initialised to their correct sizes and the weights are then initialised to random doubles between zero and one.

The main functionality of the neural network class can be found in the "internal" region (lines 126-235). There are three functions, to run the output layer, input layer, and hidden layer, which are all called by the public "run()" function (lines 241-270). All nodes determine their outputs using the sigmoid function (lines 130-133).
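
To make the structure described above concrete, here is a minimal illustrative sketch of a forward pass through fully connected layers using the sigmoid activation; it mirrors the idea of the weight arrays but is a simplified example rather than the code in the appendix.

using System;

static class FeedForwardSketch
{
    // Standard sigmoid activation used by every node
    static double Sigmoid(double x)
    {
        return 1.0 / (1.0 + Math.Exp(-x));
    }

    // Runs one layer: weights[node][input] pairs up with inputs[input]
    static double[] RunLayer(double[] inputs, double[][] weights)
    {
        double[] outputs = new double[weights.Length];
        for (int node = 0; node < weights.Length; node++)
        {
            double sum = 0.0;
            for (int i = 0; i < inputs.Length; i++)
                sum += inputs[i] * weights[node][i];
            outputs[node] = Sigmoid(sum);
        }
        return outputs;
    }

    // A full network run is then simply one layer after another
    public static double[] Run(double[] inputs, double[][] hiddenWeights, double[][] outputWeights)
    {
        double[] hidden = RunLayer(inputs, hiddenWeights);
        return RunLayer(hidden, outputWeights);
    }
}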

I tested the neural network implementation by creating a neural network with structure 2, 3, 1, and creating a network with the same structure in an Excel spreadsheet. I then entered the weights from the neural network program (obtained using the Visual Studio debugger to check variable values) into the spreadsheet and checked that the outputs coincided.

5 - Testing Data
Testing took place several times for each algorithm and dataset (see 9.4 -Datasets): ten times for XOR, and five times each for the virus classification and Fishers iris data. The data presented by the program in each case was recorded for analysis.

The genetic algorithm and back-propagation neural networks each had four hidden
nodes for each dataset. The target error was 0.01 for XOR and 0.02 for Fishers iris data and
the virus classification data.

5.1 - XOR
In the XOR case, the dataset is too small to split, so repetition is used instead.

5.1.1 - Genetic Algorithms


Iterations (GA) Result MSqE (GA) Testing MSqE (GA)
591 0.009888967 0.009888967
407 0.006150786 0.006150786
500 0.009375132 0.009375132
466 0.009829335 0.009829335
499 0.009248076 0.009248076
534 0.009962559 0.009962559
675 0.00957518 0.00957518
557 0.009162436 0.009162436
357 0.009566951 0.009566951
939 0.00881867 0.00881867

552.5 0.009157809 0.009157809


Table 2 - Genetic Algorithm testing data (XOR)

5.1.2 - Back-propagation

In the back-propagation testing data, the mean squared error from the network result
and the testing mean squared error are different despite the data being the same, because of
floating point inaccuracies converting between double precision floating-point numbers in my
code and single precision in the FANN code.

Iterations (BP) Result MSqE (BP) Testing MSqE (BP)


345 0.00998027 0.00988863
287 0.0099765 0.00989974
319 0.00998959 0.00991907
327 0.00998032 0.0099113
301 0.00999069 0.00990403
323 0.00997699 0.00988649
421 0.00999799 0.00990161
315 0.00995716 0.00987211
362 0.00992291 0.00983181
325 0.00992081 0.0098402

332.5 0.009969323 0.009885499


Table 3 - Back-propagation testing data (XOR)

5.1.3 - Cascade Correlation

In the cascade correlation data, the testing result differs quite widely from the network result. At each node addition the callback function triggers, which is when the program records the mean squared error; but in the case of cascade training, the network is trained further after that point (output weight training).

Nodes (CC) Result MSqE (CC) Testing MSqE (CC)


2 0.00992534 7.86E-14
2 0.00585159 1.32E-13
2 0.00863005 2.02E-13
2 0.00798691 1.83E-13
2 0.0086449 2.02E-13
2 0.00889787 2.22E-13
2 0.00573252 1.18E-13
2 0.00960884 7.14E-14
2 0.00656719 2.78E-14
2 0.00855044 1.99E-13

2 0.008039565 1.43545E-13
Table 4 - Cascade Correlation testing data (XOR)

5.1.4 - Graphs

[Chart plotting Iterations (GA), Iterations (BP) and Nodes (CC) for each test run]

Figure 29 - Number of iterations to achieve target (XOR)

[Chart plotting Result MSqE (GA), Result MSqE (BP) and Result MSqE (CC) for each test run]

Figure 30 - Mean squared error upon last training test (XOR)

[Chart plotting Testing MSqE (GA), Testing MSqE (BP) and Testing MSqE (CC) for each test run]

Figure 31 - The mean squared error at testing (XOR)

5.2 - Fishers Iris data


5.2.1 - Genetic Algorithms
Iterations (GA) Mean squared error (GA) Testing mean squared error (GA)
1928 0.019775896 0.019775896
1834 0.018789438 0.018789438
1277 0.019751257 0.019751257
2835 0.019719802 0.019719802
8716 0.019763877 0.019763877

3318 0.01956255614 0.01956255614


Table 5 - Genetic algorithm testing data (Iris data)

5.2.2 - Back-propagation
Iterations (BP) Mean squared error (BP) Testing mean squared error (BP)
434 0.0199999 0.014144
529 0.0199719 0.0140669
474 0.0199892 0.0143778
391 0.0199916 0.0139982
527 0.0199972 0.0143806

471 0.01998996 0.0141935


Table 6 - Back-propagation testing data (Iris data)

5.2.3 - Cascade Correlation


Nodes (CC) Mean squared error (CC) Testing mean squared error (CC)
2 0.0195853 0.0141406
2 0.0199918 0.0198416
2 0.0196297 0.014116
2 0.0193595 0.0158679
2 0.0189456 0.0164964

2 0.01950238 0.0160925
Table 7 - Cascade correlation testing data (Iris data)

5.2.4 - Graphs

[Chart plotting Iterations (GA), Iterations (BP) and Nodes (CC) for each test run]

Figure 32 - Number of iterations to achieve target (Iris data)

[Chart plotting Mean squared error (GA), Mean squared error (BP) and Mean squared error (CC) for each test run]

Figure 33 - Mean squared error upon last training test (Iris data)

[Chart plotting Testing mean squared error (GA), Testing mean squared error (BP) and Testing mean squared error (CC) for each test run]

Figure 34 - The mean squared error at testing (Iris data)

5.3 - Virus Classification


This dataset covers 61 viruses affecting several crops including tobacco, tomato,
cucumber, and others. There are 18 measurements on each virus, which are the number of
amino acid residues per molecule of coat protein.

The dataset is in order:

• Hordeviruses x3
• Tobraviruses x6
• Tobamoviruses x39
• Furoviruses x13

5.3.1 - Genetic Algorithms


Iterations (GA) Mean Squared Error training (GA) Mean Squared Error testing (GA)
1818 0.017723052 0.017723052
1421 0.019861371 0.019861371
896 0.019927361 0.019927361
11676 0.019092293 0.019092293
1068 0.01870197 0.01870197

3375.8 0.0190612094 0.0190612094


Table 8 - Genetic Algorithms testing data (Viruses)

5.3.2 - Back-propagation
Iterations (BP) Mean Squared Error training (BP) Mean Squared Error testing (BP)
3275 0.0197935 0.0391179
10922 0.0196959 0.0434423
2800 0.019743 0.0393106
3744 0.0199621 0.0350318
6295 0.0199916 0.0347797

5407.2 0.01983722 0.03833646


Table 9 - Back-propagation testing data (Viruses)

5.3.3 - Cascade Correlation


Nodes (CC) Mean Squared Error training (CC) Mean Squared Error testing (CC)
1 0.0185152 0.00979968
2 0.0195173 0.0105845
1 0.017043 0.00980208
2 0.00640291 0.0165979
1 0.0194383 0.0111416

1.4 0.016183342 0.011585152


Table 10 - Cascade Correlation testing data (Viruses)

5.3.4 - Graphs

[Chart plotting Iterations (GA), Iterations (BP) and Nodes (CC) for each test run]

Figure 35 - Number of iterations to achieve target (Viruses)

[Chart plotting Mean Squared Error training (GA), Mean Squared Error training (BP) and Mean Squared Error training (CC) for each test run]

Figure 36 - Mean squared error upon last training test (Viruses)

[Chart plotting Mean Squared Error testing (GA), Mean Squared Error testing (BP) and Mean Squared Error testing (CC) for each test run]

Figure 37 - Mean squared error at testing (Viruses)

6 - Comparisons
6.1 - Back-propagation and Genetic Algorithms
6.1.1 - XOR

As regards the XOR problem, genetic algorithms and back-propagation offer similar performance. The back-propagation implementation seems to be slightly faster in real time, but this is possibly attributable to managed code inefficiencies versus unmanaged code, or to possible inefficiencies in my programming.

Genetic algorithms take around 550 iterations on average to produce a workable solution when aiming for a mean squared error of below 0.01, versus back-propagation's 333 iterations on average. Genetic algorithm solutions are slightly more accurate, but to a negligible degree in this scenario.

6.1.2 - Fishers Iris Data

Genetic algorithms clearly found this a more difficult problem to solve than back-propagation did, as the averages of 3318 generations for genetic algorithms versus 471 epochs for back-propagation make clear. The back-propagation-generated networks were also more accurate overall (in fact, on this particular problem they analysed the testing data more effectively than the training data).

6.1.3 - Virus Classification

This problem was solved more effectively by genetic algorithms than by back-propagation, with an average of 3375.8 generations versus 5407.2 epochs for back-propagation.

The genetic algorithm solutions also held their effectiveness through to the testing data with extreme consistency, whereas the back-propagation solutions lost effectiveness on the testing data.

6.2 - Cascade Correlation and Genetic Algorithms


6.2.1 - XOR

This problem was solved more effectively by the cascade algorithm than by genetic algorithms: genetic algorithms took around 550 generations on average to solve the problem to a suitable accuracy, whilst the cascade algorithm used two nodes to solve the problem. At a maximum of 150 epochs per node added (the default (Nissen, FANN)), this is around 300 epochs to solve the problem. The cascade algorithm also achieves a higher degree of accuracy.

6.2.2 - Fishers Iris Data

Again, on this data set cascade training came to a satisfactory solution faster and more
accurately than genetic algorithms.

6.2.3 - Virus Classification

With this data set cascade training again came to a satisfactory solution faster and more
accurately than genetic algorithms did on the same data.

6.3 - Testing conclusions


Genetic algorithms are a valuable and effective means of training neural networks. They are roughly as effective as back-propagation training and, although back-propagation is faster for the most part, solutions developed using genetic algorithms tend to hold their effectiveness through to the testing data more consistently.

Compared to cascade correlation, however, genetic algorithms fall down as a solution.

With more time and more in-depth testing using more varied datasets, a clearer comparison (and possibly drawbacks associated with cascade training) might become evident; however, with the information presented and discussed here, the unavoidable conclusion is that cascade training is the most effective of the three training methods examined.

7 - Evaluation
Overall, I feel the project has been a success, although there are further investigations I
would like to continue with given more time (and may continue with in my own time).

7.1 - How many of the original objectives were achieved?


The original objectives were as follows (9.5 -Objectives):

• Research the fields of genetic algorithms and neural networks.
• Research current information on training neural networks using genetic algorithms.
• Build an application that is capable of training a neural network with a genetic algorithm and with a back-propagation system, and that allows manual adjustment of relevant variables.
• Build a user-interface for the aforementioned application allowing for easy alteration of appropriate variables.
• Using the above application, as well as previous research, evaluate training methods for neural networks.
• Identify the most effective ways of training neural networks using genetic algorithms, and in which cases these methods are suitable.

The only objective that I feel could be improved upon is the final objective.

“Identify the most effective ways of training neural networks using genetic algorithms,
and in which cases these methods are suitable.”

With more time, more advanced tests could be undertaken using larger data sets, leading to a more solid conclusion and more information on the various strengths and weaknesses of the learning algorithms examined.

8 - Possible Improvements
More advanced testing would be the biggest improvement that more time would make
possible. Some forms of testing that would be possible are:

• Testing with datasets with larger numbers of inputs.
• Testing with large datasets that attempt to train the network to distinguish between small differences in data.
• Testing both with and without scaling and utilising different scaling methods.
• Testing with datasets that would take large amounts of time to train.
• Improve memory management. Due to small conflicts between managed and unmanaged memory, the program occasionally causes an exception (most specifically when creating cascade or back-propagation networks in quick succession). These could be resolved fairly simply (I believe) by implementing the IDisposable interface in my managed code to allow objects to be destroyed on demand instead of waiting for the garbage collector.

8.1 - Possible User Interface Improvements


The main improvements to the user interface that I would make if I had more time are:

• Increased number of variable options presented to the user (with appropriate defaults set).
• Making the forms shown as dialogs (the new network wizard and the working form) lock out the rest of the program, but still minimise instead of blocking other computer programs.

8.2 - Possible Training Algorithm Improvements


With more time, I would experiment with the following training algorithm
improvements:

• Improve the genetic algorithm's crossover and mutation operators, experimenting with multiple-point crossover and other crossover forms, as well as implementing roulette wheel selection.
• Attempt to improve the speed of the genetic algorithm.
• Implement the back-propagation algorithm in C# instead of C++. Contrary to
my opinion before undertaking extensive usage of C# I now believe the reduced
development time, allowing more to be undertaken successfully, is worth the
possibility of reduced performance. Having the back-propagation algorithm in
C# similar to the genetic algorithm would also allow a more “apples to apples”
comparison.
• Implement the cascade training algorithm in C# instead of C++, for the reasons
stated above.

9 - Appendices
9.1 - Source code
The source code in this section is organised first by which part of the application it
represents and then by what source file it originally resided in. The code contains line
numbering for pinpointing the precise lines in question within the body-text when cross-
referencing.

Although there was another section of the program compiled from source (the FANN
library), I have not included it in the source code listing, so that this listing contains only code
I have personally written for the project.

This section also contains only the final draft of the code. The earlier attempts at
genetic algorithm and neural network implementations are not included, and neither is the
code for the prototype constructed in milestone two, which this code is partially based upon.

9.1.1 - View
9.1.1.1 - “NetworkUpdatedEventHandler”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.Linq;
00000004 using System.Text;
00000005
00000006 namespace _05025397.Controller.Classes
00000007 {
00000008 public class NetworkUpdateEventArgs : System.EventArgs
00000009 {
00000010 #region data
00000011 /********************\
00000012 |* DATA *|
00000013 \********************/
00000014
00000015 //Training Algorithm Identifier
00000016 private int trainingalg;
00000017
00000018 //Dataset identifier
00000019 private int _TrDataSet;
00000020
00000021 //Cascade Correlation
00000022 private double _CC_Terr;
00000023 private int _CC_MaxNeurons;
00000024 private int _CC_Reports;
00000025 private double _CC_LearningRate;
00000026 private bool _CC_ITerr;

00000027
00000028 //Genetic Algorithm
00000029 private int _GA_PopSize;
00000030 private int _GA_GenLimit;
00000031 private int _GA_Crossover;
00000032 private int _GA_Mutation;
00000033 private double _GA_Terr;
00000034 private int _GA_Reports;
00000035 private int _GA_HiddenL;
00000036 private bool _GA_ITerr;
00000037
00000038 //BackProp
00000039 private int _BP_EpochLimit;
00000040 private int _BP_HiddenL;
00000041 private int _BP_Reports;
00000042 private double _BP_Terr;
00000043 private double _BP_LearningRate;
00000044 private bool _BP_ITerr;
00000045 #endregion
00000046
00000047 #region constructor
00000048 /********************\
00000049 |* CONSTRUCTOR *|
00000050 \********************/
00000051 //Constructor to accept all necessary data
00000052 public NetworkUpdateEventArgs
00000053 (int Algorithm,
00000054 double CCTerr, int CCMaxNeurons,
00000055 int CCReports, double CCLearningRate,
00000056 bool CCITerr, int GAPopSize,

00000057 int GAGenLimit, int GACrossOver,


00000058 int GAMutation, double GATerr,
00000059 int GAReports, int GAHiddenL,
00000060 bool GAITerr, int BPEpochLimit,
00000061 int BPHiddenL, int BPReports,
00000062 double BPTerr, double BPLearningRate,
00000063 bool BPITerr, int TrData_Set)
00000064 {
00000065 //Algorithm
00000066 trainingalg = Algorithm;
00000067
00000068 //Cascade Correlation
00000069 _CC_Terr = CCTerr;
00000070 _CC_MaxNeurons = CCMaxNeurons;
00000071 _CC_Reports = CCReports;
00000072 _CC_LearningRate = CCLearningRate;
00000073 _CC_ITerr = CCITerr;
00000074
00000075 //Genetic Algorithm
00000076 _GA_PopSize = GAPopSize;
00000077 _GA_GenLimit = GAGenLimit;
00000078 _GA_Crossover = GACrossOver;
00000079 _GA_Mutation = GAMutation;
00000080 _GA_Terr = GATerr;
00000081 _GA_Reports = GAReports;
00000082 _GA_HiddenL = GAHiddenL;
00000083 _GA_ITerr = GAITerr;
00000084
00000085 //Backpropagation Algorithm
00000086 _BP_EpochLimit = BPEpochLimit;

00000087 _BP_HiddenL = BPHiddenL;


00000088 _BP_Reports = BPReports;
00000089 _BP_Terr = BPTerr;
00000090 _BP_LearningRate = BPLearningRate;
00000091 _BP_ITerr = BPITerr;
00000092
00000093 _TrDataSet = TrData_Set;
00000094 }
00000095 #endregion
00000096
00000097 #region getters/setters
00000098 #region cascade_correlation
00000099 //Cascade Correlation
00000100 public double CC_Terr
00000101 { get { return _CC_Terr; } }
00000102 public int CC_MaxNeurons
00000103 { get { return _CC_MaxNeurons; } }
00000104 public int CC_Reports
00000105 { get { return _CC_Reports; } }
00000106 public double CC_LearningRate
00000107 { get { return _CC_LearningRate; } }
00000108 public bool CC_ITerr
00000109 { get { return _CC_ITerr; } }
00000110 #endregion
00000111
00000112 #region genetic_algorithms
00000113 //Genetic Algorithm
00000114 public int GA_PopSize
00000115 { get { return _GA_PopSize; } }
00000116 public int GA_GenLimit

00000117 { get { return _GA_GenLimit; } }


00000118 public int GA_Crossover
00000119 { get { return _GA_Crossover; } }
00000120 public int GA_Mutation
00000121 { get { return _GA_Mutation; } }
00000122 public double GA_Terr
00000123 { get { return _GA_Terr; } }
00000124 public int GA_Reports
00000125 { get { return _GA_Reports; } }
00000126 public int GA_HiddenL
00000127 { get { return _GA_HiddenL; } }
00000128 public bool GA_ITerr
00000129 { get { return _GA_ITerr; } }
00000130 #endregion
00000131
00000132 #region back_propagation
00000133 //Back propagation
00000134 public int BP_EpochLimit
00000135 { get { return _BP_EpochLimit; } }
00000136 public int BP_HiddenL
00000137 { get { return _BP_HiddenL; } }
00000138 public int BP_Reports
00000139 { get { return _BP_Reports; } }
00000140 public double BP_Terr
00000141 { get { return _BP_Terr; } }
00000142 public double BP_LearningRate
00000143 { get { return _BP_LearningRate; } }
00000144 public bool BP_ITerr
00000145 { get { return _BP_ITerr; } }
00000146 #endregion

00000147 #endregion
00000148
00000149 #region external
00000150 //General
00000151 public int TrDataSet
00000152 { get { return _TrDataSet; } }
00000153 public int TrainingAlgorithm
00000154 { get { return trainingalg; } }
00000155 #endregion
00000156 }
00000157 }

9.1.1.2 - “FrmMain.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.ComponentModel;
00000004 using System.Data;
00000005 using System.Drawing;
00000006 using System.Linq;
00000007 using System.Text;
00000008 using System.IO;
00000009 using System.Windows.Forms;
00000010
00000011 using ZedGraph;
00000012 using _05025397;
00000013
00000014
00000015 namespace _05025397.Controller
00000016 {
00000017 public partial class FrmMain : Form

00000018 {
00000019 #region data
00000020 /********************\
00000021 |* DATA *|
00000022 \********************/
00000023 //The neuralnetwork
00000024 Controller.Algorithms.TrainingAlgorithm network;
00000025
00000026 //Working dialog
00000027 Controller.Working workingdialog = new Controller.Working();
00000028
00000029 //Background thread
00000030 static BackgroundWorker bw = new BackgroundWorker();
00000031 #endregion
00000032
00000033 #region constructor
00000034 /********************\
00000035 |* CONSTRUCTOR *|
00000036 \********************/
00000037 public FrmMain()
00000038 {
00000039 InitializeComponent();
00000040 this.Icon = Properties.Resources.icon;
00000041 bw.DoWork += new DoWorkEventHandler(bw_DoWork);
00000042 bw.RunWorkerCompleted += new RunWorkerCompletedEventHandler(bw_RunWorkerCompleted);
00000043 }
00000044 #endregion
00000045
00000046 #region internal_functions
00000047 /********************\

00000048 |* INTERNAL *|
00000049 \********************/
00000050 private void plotgraphsMSE(PointPairList MSE)
00000051 {
00000052 this.zedGraphControl1.GraphPane.CurveList.Clear();
00000053
00000054 GraphPane MSEpane = this.zedGraphControl1.GraphPane;
00000055
00000056 //Titles
00000057 MSEpane.Title.Text = "Mean Squared Error";
00000058 MSEpane.YAxis.Title.Text = "MSE";
00000059 MSEpane.XAxis.Title.Text = "Iteration";
00000060
00000061 // Hide the legend
00000062 MSEpane.Legend.IsVisible = false;
00000063
00000064 MSEpane.AddCurve("", MSE, Color.Red, SymbolType.Circle);
00000065
00000066 this.zedGraphControl1.AxisChange();
00000067
00000068 }
00000069
00000070 private void updatenetwork(object sender, Classes.NetworkUpdateEventArgs e)
00000071 {
00000072 LoadDataset(e.TrDataSet);
00000073
00000074 try
00000075 {
00000076 if (e.TrainingAlgorithm == 0)
00000077 {

00000078 network =
00000079 new Controller.Algorithms.GeneticAlgorithm
00000080 (e.GA_PopSize, e.GA_Crossover,
00000081 e.GA_Mutation, e.GA_HiddenL,
00000082 e.GA_ITerr, e.GA_Terr,
00000083 e.GA_Reports, e.GA_GenLimit,
00000084 txtTrainData, txtTestData);
00000085 }
00000086
00000087 if (e.TrainingAlgorithm == 1)
00000088 {
00000089 network =
00000090 new Controller.Algorithms.BackProp
00000091 (e.BP_HiddenL, e.BP_LearningRate,
00000092 e.BP_ITerr, e.BP_Terr,
00000093 e.BP_Reports, e.BP_EpochLimit,
00000094 txtTrainData, txtTestData);
00000095 }
00000096
00000097 if (e.TrainingAlgorithm == 2)
00000098 {
00000099 network =
00000100 new Controller.Algorithms.CascadeCorrelation
00000101 (e.CC_LearningRate, e.CC_Terr,
00000102 e.CC_Reports, e.CC_MaxNeurons, e.CC_ITerr,
00000103 txtTrainData, txtTestData);
00000104 }
00000105 }
00000106 catch
00000107 {

00000108 MessageBox.Show("Error creating new network",


00000109 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000110 Application.Exit();
00000111 }
00000112
00000113 if (network != null)
00000114 {
00000115 txtNetSettings.Text =
00000116 network.network_details();
00000117 }
00000118 }
00000119
00000120 private void new_network()
00000121 {
00000122 Controller.Wiz1 wizard =
00000123 new Controller.Wiz1();
00000124
00000125 wizard.NetworkUpdated +=
00000126 new Controller.Wiz1.NetworkUpdateHandler(updatenetwork);
00000127
00000128 wizard.ShowDialog();
00000129 }
00000130
00000131 private void LoadDataset(int Data_set)
00000132 {
00000133 switch (Data_set)
00000134 {
00000135 case (int)Dataset.XOR:
00000136 txtTrainData.Text = Properties.Resources.XORtrain;
00000137 txtTestData.Text = Properties.Resources.XORtest;

00000138 break;
00000139 case (int)Dataset.FISHER:
00000140 txtTrainData.Text = Properties.Resources.FISHERtrain;
00000141 txtTestData.Text = Properties.Resources.FISHERtest;
00000142 break;
00000143 case (int)Dataset.CUSTOM:
00000144 LoadCustomDataSet();
00000145 break;
00000146 }
00000147 }
00000148
00000149 private void LoadCustomDataSet()
00000150 {
00000151
00000152 string trainpath = SelectTextFile(".\\", "Select Training Data");
00000153 string testpath = SelectTextFile(".\\", "Select Testing Data");
00000154
00000155 string train = "";
00000156 string test = "";
00000157
00000158 try
00000159 {
00000160 train = LoadTxtFile(trainpath);
00000161 }
00000162 catch
00000163 {
00000164 MessageBox.Show("Error: Problem loading training data",
00000165 "Error!", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000166 }
00000167

00000168 try
00000169 {
00000170 test = LoadTxtFile(testpath);
00000171 }
00000172 catch
00000173 {
00000174 MessageBox.Show("Error: Problem loading testing data",
00000175 "Error!", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000176 }
00000177
00000178 txtTestData.Text = test;
00000179 txtTrainData.Text = train;
00000180 }
00000181
00000182 private string LoadTxtFile(string path)
00000183 {
00000184 string data = "";
00000185 FileStream File = null;
00000186 StreamReader Reader = null;
00000187
00000188 try
00000189 {
00000190 File = new FileStream(path, FileMode.Open, FileAccess.Read);
00000191 Reader = new StreamReader(File);
00000192 data = Reader.ReadToEnd();
00000193 }
00000194 catch
00000195 {
00000196 MessageBox.Show("Error reading selected file: " + path,
00000197 "Error!", MessageBoxButtons.OK, MessageBoxIcon.Error);

00000198 data = "";


00000199 }
00000200 finally
00000201 {
00000202 if (Reader != null)
00000203 Reader.Close();
00000204 if (File != null)
00000205 File.Close();
00000206 }
00000207
00000208 return data;
00000209 }
00000210
00000211 private string SelectTextFile(string initialDirectory, string title)
00000212 {
00000213 OpenFileDialog dialog = new OpenFileDialog();
00000214 dialog.Filter =
00000215 "txt files (*.txt)|*.txt|All files (*.*)|*.*";
00000216 dialog.InitialDirectory = initialDirectory;
00000217 dialog.Title = title;
00000218 return (dialog.ShowDialog() == DialogResult.OK)
00000219 ? dialog.FileName : null;
00000220 }
00000221
00000222 #endregion
00000223
00000224 #region menu_items
00000225 /********************\
00000226 |* MENU ITEMS *|
00000227 \********************/

00000228 private void newNetworkToolStripMenuItem_Click(object sender, EventArgs e)


00000229 {
00000230 new_network();
00000231 }
00000232
00000233 private void exitToolStripMenuItem_Click(object sender, EventArgs e)
00000234 {
00000235 this.Close();
00000236 }
00000237
00000238 private void runToolStripMenuItem_Click(object sender, EventArgs e)
00000239 {
00000240 if (network != null)
00000241 {
00000242 if (network.Trained != false)
00000243 {
00000244 network.Test();
00000245 }
00000246 }
00000247 }
00000248
00000249 private void aboutToolStripMenuItem1_Click(object sender, EventArgs e)
00000250 {
00000251 FrmAbout About = new FrmAbout();
00000252 About.Show();
00000253 }
00000254
00000255 #endregion
00000256
00000257 #region buttons

00000258 /********************\
00000259 |* BUTTONS *|
00000260 \********************/
00000261 private void btnNew_Click(object sender, EventArgs e)
00000262 {
00000263 new_network();
00000264 }
00000265
00000266 private void btnTrain_Click(object sender, EventArgs e)
00000267 {
00000268 //Shouldn't be busy if the user managed to click this
00000269 //but just make 100% certain.
00000270 if (!bw.IsBusy)
00000271 {
00000272 if (network != null)
00000273 {
00000274 if (!network.Trained)
00000275 {
00000276 bw.RunWorkerAsync();
00000277 workingdialog.ShowDialog();
00000278 }
00000279 }
00000280 }
00000281 }
00000282
00000283 private void btnTest_Click(object sender, EventArgs e)
00000284 {
00000285 if (network != null)
00000286 {
00000287 if (network.Trained)

00000288 {
00000289 if (!network.Tested)
00000290 {
00000291 bool success = network.Test();
00000292 if (success)
00000293 {
00000294 txtOutput.Clear();
00000295 txtOutput.Text = network.ReportData;
00000296 txtOutput.SelectionStart = txtOutput.Text.Length;
00000297 txtOutput.ScrollToCaret();
00000298 txtOutput.Refresh();
00000299 }
00000300 }
00000301 }
00000302 }
00000303 }
00000304 #endregion
00000305
00000306 #region bg_worker
00000307 /********************\
00000308 |* BG WORKER *|
00000309 \********************/
00000310 private void bw_DoWork(object sender, DoWorkEventArgs e)
00000311 {
00000312 bool success = true;
00000313
00000314 if (network != null)
00000315 {
00000316 success = network.Train();
00000317 }

00000318
00000319 e.Result = success;
00000320 }
00000321
00000322 private void bw_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
00000323 {
00000324 bool success;
00000325 //Check for errors
00000326 if (e.Error != null)
00000327 {
00000328 MessageBox.Show("Error during network train: " + e.Error.ToString(),
00000329 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000330
00000331 }
00000332
00000333 success = (bool) e.Result;
00000334
00000335 if (success)
00000336 {
00000337 txtOutput.Clear();
00000338 txtOutput.Text = network.ReportData;
00000339 txtOutput.SelectionStart = txtOutput.Text.Length;
00000340 txtOutput.ScrollToCaret();
00000341 plotgraphsMSE(network.getGraphData());
00000342 }
00000343 else
00000344 {
00000345 MessageBox.Show("Error training network!", "Error training network!",
00000346 MessageBoxButtons.OK, MessageBoxIcon.Error);
00000347 }

00000348
00000349 //Remove the dialog locking out the main form
00000350 //and showing that we're currently working
00000351 workingdialog.Hide();
00000352 }
00000353 #endregion
00000354
00000355 #region misc
00000356 /********************\
00000357 |* MISC *|
00000358 \********************/
00000359 private void FrmMain_Shown(object sender, EventArgs e)
00000360 {
00000361 //Run the new neural net wizard
00000362 new_network();
00000363 }
00000364 #endregion
00000365 }
00000366 }

9.1.1.3 - “Wiz1.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.ComponentModel;
00000004 using System.Data;
00000005 using System.Drawing;
00000006 using System.Linq;

00000007 using System.Text;


00000008 using System.Windows.Forms;
00000009
00000010 using _05025397;
00000011
00000012 namespace _05025397.Controller
00000013 {
00000014 #region enums
00000015 /********************\
00000016 |* ENUMS *|
00000017 \********************/
00000018 enum Training { GA, BP, CC };
00000019 enum Dataset { XOR, FISHER, CUSTOM };
00000020 #endregion
00000021
00000022 public partial class Wiz1 : Form
00000023 {
00000024
00000025 #region data
00000026 /********************\
00000027 |* DATA *|
00000028 \********************/
00000029 public delegate void NetworkUpdateHandler(object sender, Classes.NetworkUpdateEventArgs e);
00000030
00000031 public event NetworkUpdateHandler NetworkUpdated;
00000032
00000033 private int algorithm;
00000034
00000035 private int Trainset;
00000036

00000037 #endregion data


00000038
00000039 #region constructor
00000040 /********************\
00000041 |* CONSTRUCTOR *|
00000042 \********************/
00000043 public Wiz1()
00000044 {
00000045 InitializeComponent();
00000046 this.Icon = Properties.Resources.icon;
00000047 algorithm = (int)Training.GA;
00000048 Trainset = (int)Dataset.XOR;
00000049 }
00000050 #endregion
00000051
00000052 #region control_events
00000053 /********************\
00000054 |* CONTROL EVENTS *|
00000055 \********************/
00000056
00000057 #region buttons
00000058 /********************\
00000059 |* BUTTONS *|
00000060 \********************/
00000061 private void btnCancel_Click(object sender, EventArgs e)
00000062 {
00000063 this.Close();
00000064 }
00000065
00000066 private void rBtnBackprop_CheckedChanged(object sender, EventArgs e)

00000067 {
00000068 if (rBtnBackprop.Checked == true)
00000069 {
00000070 picAlg.Image = _05025397.Properties.Resources.IMAGEBackProp;
00000071 lblGA.Visible = false;
00000072 lblBP.Visible = true;
00000073 lblCC.Visible = false;
00000074
00000075 panBackprop.Visible = true;
00000076 panGeneticAlgorithm.Visible = false;
00000077 panCC.Visible = false;
00000078
00000079 algorithm = (int)Training.BP;
00000080 }
00000081 }
00000082
00000083 private void rBtnCascadeCorrelation_CheckedChanged(object sender, EventArgs e)
00000084 {
00000085 if (rBtnCascadeCorrelation.Checked == true)
00000086 {
00000087 picAlg.Image = _05025397.Properties.Resources.IMAGECascadeCorrelation;
00000088 lblGA.Visible = false;
00000089 lblBP.Visible = false;
00000090 lblCC.Visible = true;
00000091
00000092 panBackprop.Visible = false;
00000093 panGeneticAlgorithm.Visible = false;
00000094 panCC.Visible = true;
00000095
00000096 algorithm = (int)Training.CC;

00000097 }
00000098 }
00000099
00000100 private void rBtnGeneticAlgorithm_CheckedChanged(object sender, EventArgs e)
00000101 {
00000102 if (rBtnGeneticAlgorithm.Checked == true)
00000103 {
00000104 picAlg.Image = _05025397.Properties.Resources.IMAGEGeneticAlgorithm;
00000105 lblGA.Visible = true;
00000106 lblBP.Visible = false;
00000107 lblCC.Visible = false;
00000108
00000109 panBackprop.Visible = false;
00000110 panGeneticAlgorithm.Visible = true;
00000111 panCC.Visible = false;
00000112
00000113 algorithm = (int)Training.GA;
00000114 }
00000115 }
00000116
00000117 private void btnBack_Click(object sender, EventArgs e)
00000118 {
00000119 if (tabControl1.SelectedIndex > 0)
00000120 {
00000121 tabControl1.SelectTab(tabControl1.SelectedIndex - 1);
00000122 }
00000123 }
00000124
00000125 private void btnNext_Click(object sender, EventArgs e)
00000126 {

00000127 if (tabControl1.SelectedIndex < tabControl1.TabCount - 1)


00000128 {
00000129 tabControl1.SelectTab(tabControl1.SelectedIndex + 1);
00000130 }
00000131 else
00000132 {
00000133 VerifyData check = new VerifyData
00000134 (algorithm, txtCC_TargetError.Text, txtCC_MaxNeurons.Text,
00000135 txtCC_Report.Text, txtCC_LearningRate.Text, CC_ITerr.Checked,
00000136 txtGA_PopSize.Text, txtGA_MaxGen.Text,
00000137 txtGA_ProbX.Text, txtGA_ProbM.Text,
00000138 txtGA_Terr.Text, txtGA_Report.Text,
00000139 txtGA_HiddenNodes.Text, chkGA_IgnoreTarget.Checked,
00000140 txtBP_EpochLimit.Text, txtBP_HiddenNodes.Text,
00000141 txtBP_Report.Text, txtBP_Terr.Text,
00000142 txtBP_LearnRate.Text, chkBP_IgnoreTarget.Checked,
00000143 Trainset);
00000144
00000145 if (check.Verified)
00000146 {
00000147 Classes.NetworkUpdateEventArgs args =
00000148 new Classes.NetworkUpdateEventArgs
00000149 (check.TrainingAlgorithm, check.CC_Terr,
00000150 check.CC_MaxNeurons, check.CC_Reports,
00000151 check.CC_LearningRate, check.CC_ITerr,
00000152 check.GA_PopSize, check.GA_GenLimit,
00000153 check.GA_Crossover, check.GA_Mutation,
00000154 check.GA_Terr, check.GA_Reports,
00000155 check.GA_HiddenL, check.GA_ITerr,
00000156 check.BP_EpochLimit, check.BP_HiddenL,

00000157 check.BP_Reports, check.BP_Terr,


00000158 check.BP_LearningRate, check.BP_ITerr,
00000159 check.TrDataSet);
00000160
00000161 NetworkUpdated(this, args);
00000162 this.Dispose();
00000163 }
00000164 else
00000165 {
00000166 MessageBox.Show("Error Checking Data: Data could not be verified",
00000167 "Error Checking Data", MessageBoxButtons.OK, MessageBoxIcon.Exclamation);
00000168 }
00000169 }
00000170 }
00000171 #endregion
00000172
00000173 #region checkboxes
00000174 /********************\
00000175 |* CHECKBOXES *|
00000176 \********************/
00000177 private void chkDatasetFisher_CheckedChanged(object sender, EventArgs e)
00000178 {
00000179 if (chkDatasetFisher.Checked)
00000180 {
00000181 picDataset.Image = _05025397.Properties.Resources.IMAGEIrisFlowers;
00000182 Trainset = (int)Dataset.FISHER;
00000183 }
00000184 }
00000185
00000186 private void chkDatasetXOR_CheckedChanged(object sender, EventArgs e)

00000187 {
00000188 if (chkDatasetXOR.Checked)
00000189 {
00000190 picDataset.Image = _05025397.Properties.Resources.IMAGEXOR;
00000191 Trainset = (int)Dataset.XOR;
00000192 }
00000193 }
00000194
00000195 private void chkDatasetCustom_CheckedChanged(object sender, EventArgs e)
00000196 {
00000197 if (chkDatasetCustom.Checked)
00000198 {
00000199 picDataset.Image = _05025397.Properties.Resources.IMAGEQuestionMark;
00000200 Trainset = (int)Dataset.CUSTOM;
00000201 }
00000202 }
00000203 #endregion
00000204
00000205 #region misc
00000206 /********************\
00000207 |* MISC *|
00000208 \********************/
00000209 private void tabControl1_SelectedIndexChanged(object sender, EventArgs e)
00000210 {
00000211 if (tabControl1.SelectedIndex == 1)
00000212 {
00000213 if (panCC.Visible == true)
00000214 {
00000215 txtCC_TargetError.Focus();
00000216 }

00000217 if (panBackprop.Visible == true)


00000218 {
00000219 txtBP_EpochLimit.Focus();
00000220 }
00000221 if (panGeneticAlgorithm.Visible == true)
00000222 {
00000223 txtGA_PopSize.Focus();
00000224 }
00000225 }
00000226 }
00000227
00000228 private void Wiz1_Load(object sender, EventArgs e)
00000229 {
00000230 this.Icon = Properties.Resources.icon;
00000231 tabControl1.SelectTab(0);
00000232 }
00000233 #endregion
00000234 #endregion
00000235 }
00000236 }
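
The wizard passes its verified settings back to the caller through the NetworkUpdated event rather than returning them. A minimal sketch of hooking that event from calling code (the real subscription is assumed to happen when the main form creates the wizard in new_network(); the handler body below is purely illustrative):

    Wiz1 wizard = new Wiz1();
    wizard.NetworkUpdated += (sender, args) =>
    {
        // args is a Classes.NetworkUpdateEventArgs carrying the verified
        // algorithm choice, dataset choice and parameter values.
    };
    wizard.ShowDialog();

Because the event only fires once the VerifyData checks have passed, the handler can use the supplied values without re-validating them.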

9.1.1.4 - “FrmAbout.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.ComponentModel;
00000004 using System.Data;
00000005 using System.Drawing;
00000006 using System.Linq;
00000007 using System.Text;
00000008 using System.Windows.Forms;
00000009
00000010 namespace _05025397.Controller
00000011 {
00000012 public partial class FrmAbout : Form
00000013 {
00000014 #region constructor
00000015 /********************\
00000016 |* CONSTRUCTOR *|
00000017 \********************/
00000018 public FrmAbout()
00000019 {
00000020 InitializeComponent();
00000021 }
00000022 #endregion
00000023
00000024 #region form
00000025 /********************\
00000026 |* FORM *|
00000027 \********************/
00000028 private void FrmAbout_Load(object sender, EventArgs e)
00000029 {

00000030 this.Icon = Properties.Resources.icon;


00000031 label5.Text = Application.ProductVersion;
00000032 textBox5.Text = Properties.Resources.LICENCE_LGPL21;
00000033 }
00000034 #endregion
00000035
00000036 #region linklabels
00000037 /********************\
00000038 |* LINKLABELS *|
00000039 \********************/
00000040 private void linkLabel1_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
00000041 {
00000042 // Specify that the link was visited.
00000043 this.linkLabel1.LinkVisited = true;
00000044
00000045 try
00000046 {
00000047 // Navigate to a URL.
00000048 System.Diagnostics.Process.Start("mailto:" +
00000049 linkLabel1.Text);
00000050 }
00000051 catch
00000052 {
00000053 MessageBox.Show("No e-mail program defined.",
00000054 "Undefined Application", MessageBoxButtons.OK,
00000055 MessageBoxIcon.Information);
00000056 }
00000057 }
00000058
00000059 private void linkLabel4_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)

00000060 {
00000061 try
00000062 {
00000063 System.Diagnostics.Process.Start
00000064 (linkLabel4.Text);
00000065 }
00000066 catch
00000067 {
00000068 MessageBox.Show("Error opening link.",
00000069 "Error", MessageBoxButtons.OK,
00000070 MessageBoxIcon.Error);
00000071 }
00000072 }
00000073
00000074 private void linkLabel5_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
00000075 {
00000076 try
00000077 {
00000078 System.Diagnostics.Process.Start
00000079 (linkLabel5.Text);
00000080 }
00000081 catch
00000082 {
00000083 MessageBox.Show("Error opening link.",
00000084 "Error", MessageBoxButtons.OK,
00000085 MessageBoxIcon.Error);
00000086 }
00000087 }
00000088
00000089 private void linkLabel6_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)

00000090 {
00000091 try
00000092 {
00000093 System.Diagnostics.Process.Start
00000094 (linkLabel6.Text);
00000095 }
00000096 catch
00000097 {
00000098 MessageBox.Show("Error opening link.",
00000099 "Error", MessageBoxButtons.OK,
00000100 MessageBoxIcon.Error);
00000101 }
00000102 }
00000103
00000104 private void linkLabel2_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
00000105 {
00000106 try
00000107 {
00000108 System.Diagnostics.Process.Start
00000109 ("http://www.snikks.co.uk");
00000110 }
00000111 catch
00000112 {
00000113 MessageBox.Show("Error opening link.",
00000114 "Error", MessageBoxButtons.OK,
00000115 MessageBoxIcon.Error);
00000116 }
00000117 }
00000118
00000119 private void linkLabel7_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)

00000120 {
00000121 try
00000122 {
00000123 System.Diagnostics.Process.Start
00000124 ("http://www.gnu.org/licenses/old-licenses/lgpl-2.1.html");
00000125 }
00000126 catch
00000127 {
00000128 MessageBox.Show("Error opening link.",
00000129 "Error", MessageBoxButtons.OK,
00000130 MessageBoxIcon.Error);
00000131 }
00000132 }
00000133
00000134 #endregion
00000135
00000136 #region buttons
00000137 /********************\
00000138 |* BUTTONS *|
00000139 \********************/
00000140 private void btnDatasets_Click(object sender, EventArgs e)
00000141 {
00000142 if (tabControl1.Visible != true)
00000143 {
00000144 groupBox1.Visible = false;
00000145 groupBox2.Visible = false;
00000146 tabControl1.Visible = true;
00000147 pictureBox1.Visible = false;
00000148 }
00000149 else

00000150 {
00000151 groupBox1.Visible = true;
00000152 groupBox2.Visible = false;
00000153 tabControl1.Visible = false;
00000154 pictureBox1.Visible = true;
00000155 }
00000156 }
00000157
00000158 private void btnLicences_Click(object sender, EventArgs e)
00000159 {
00000160 if (groupBox2.Visible != true)
00000161 {
00000162 groupBox2.Visible = true;
00000163 groupBox1.Visible = false;
00000164 tabControl1.Visible = false;
00000165 pictureBox1.Visible = false;
00000166 }
00000167 else
00000168 {
00000169 groupBox1.Visible = true;
00000170 groupBox2.Visible = false;
00000171 tabControl1.Visible = false;
00000172 pictureBox1.Visible = true;
00000173 }
00000174 }
00000175
00000176 private void btnOkay_Click(object sender, EventArgs e)
00000177 {
00000178 this.Dispose();
00000179 }

00000180 #endregion
00000181
00000182 #region radiobuttons
00000183 /********************\
00000184 |* RADIOBUTTONS *|
00000185 \********************/
00000186 private void radioButton1_CheckedChanged(object sender, EventArgs e)
00000187 {
00000188 if (radioButton1.Checked == true)
00000189 textBox5.Text = Properties.Resources.LICENCE_LGPL21;
00000190 }
00000191
00000192 private void radioButton2_CheckedChanged(object sender, EventArgs e)
00000193 {
00000194 if (radioButton2.Checked == true)
00000195 textBox5.Text = Properties.Resources.LICENCE_LGPL30;
00000196 }
00000197
00000198 private void radioButton3_CheckedChanged(object sender, EventArgs e)
00000199 {
00000200 if (radioButton3.Checked == true)
00000201 textBox5.Text = Properties.Resources.LICENCE_GPL30;
00000202 }
00000203 #endregion
00000204 }
00000205 }

9.1.1.5 - “Working.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.ComponentModel;
00000004 using System.Data;
00000005 using System.Drawing;
00000006 using System.Linq;
00000007 using System.Text;
00000008 using System.Windows.Forms;
00000009
00000010 namespace _05025397.Controller
00000011 {
00000012 public partial class Working : Form
00000013 {
00000014 #region functions
00000015 /********************\
00000016 |* FUNCTIONS *|
00000017 \********************/
00000018 public Working()
00000019 {
00000020 InitializeComponent();
00000021 this.Icon = Properties.Resources.icon;
00000022 }
00000023
00000024 private void Working_Click(object sender, EventArgs e)
00000025 {
00000026 System.Media.SystemSounds.Beep.Play();
00000027 }
00000028 #endregion
00000029 }

00000030 }

9.1.2 - Controller
9.1.2.1 - “VerifyData.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.Linq;
00000004 using System.Text;
00000005 using System.Windows.Forms;
00000006
00000007 namespace _05025397.Controller
00000008 {
00000009 public class VerifyData
00000010 {
00000011 #region data
00000012 /********************\
00000013 |* DATA *|
00000014 \********************/
00000015 //Flag indicating whether the supplied data has been verified
00000016 private bool verified;
00000017
00000018 //Training Algorithm Identifier
00000019 //0 = genetic algorithm
00000020 //1 = backprop
00000021 //2 = cascade correlation
00000022 private int trainingalg;
00000023
00000024 //Dataset identifier
00000025 //0 = XOR

00000026 //1 = FISHERS IRIS DATA


00000027 //2 = CUSTOM
00000028 private int _TDataSet;
00000029
00000030 //Cascade Correlation
00000031 private double _CC_Terr;
00000032 private int _CC_MaxNeurons;
00000033 private int _CC_Reports;
00000034 private double _CC_LearningRate;
00000035 private bool _CC_ITerr;
00000036
00000037 //Genetic Algorithm
00000038 private int _GA_PopSize;
00000039 private int _GA_GenLimit;
00000040 private int _GA_Crossover;
00000041 private int _GA_Mutation;
00000042 private double _GA_Terr;
00000043 private int _GA_Reports;
00000044 private int _GA_HiddenL;
00000045 private bool _GA_ITerr;
00000046
00000047 //BackProp
00000048 private int _BP_EpochLimit;
00000049 private int _BP_HiddenL;
00000050 private int _BP_Reports;
00000051 private double _BP_Terr;
00000052 private double _BP_LearningRate;
00000053 private bool _BP_ITerr;
00000054
00000055 #endregion

00000056
00000057 #region constructor
00000058 /********************\
00000059 |* CONSTRUCTOR *|
00000060 \********************/
00000061 public VerifyData
00000062 (int Algorithm,
00000063 string CCTerr, string CCMaxNeurons,
00000064 string CCReports, string CCLearningRate,
00000065 bool CCITerr, string GAPopSize,
00000066 string GAGenLimit, string GACrossOver,
00000067 string GAMutation, string GATerr,
00000068 string GAReports, string GAHiddenL,
00000069 bool GAITerr, string BPEpochLimit,
00000070 string BPHiddenL, string BPReports,
00000071 string BPTerr, string BPLearningRate,
00000072 bool BPITerr, int TData_Set)
00000073 {
00000074 //Initially set to true, will be changed to
00000075 //false if any data verification errors occur
00000076 verified = true;
00000077
00000078 //Algorithm
00000079 trainingalg = Algorithm;
00000080
00000081 //DataSet
00000082 _TDataSet = TData_Set;
00000083
00000084 if (_TDataSet !=
00000085 (int)Dataset.XOR && _TDataSet !=

00000086 (int)Dataset.FISHER && _TDataSet !=


00000087 (int)Dataset.CUSTOM)
00000088 verified = false;
00000089
00000090 if (verified)
00000091 {
00000092 if (trainingalg == 2)
00000093 {
00000094 try
00000095 {
00000096 //Cascade correlation
00000097 _CC_Terr = Convert.ToSingle(CCTerr);
00000098 _CC_MaxNeurons = Convert.ToInt32(CCMaxNeurons);
00000099 _CC_Reports = Convert.ToInt32(CCReports);
00000100 _CC_LearningRate = Convert.ToSingle(CCLearningRate);
00000101 _CC_ITerr = CCITerr;
00000102 }
00000103 catch (FormatException)
00000104 {
00000105 MessageBox.Show
00000106 ("Parsing Error: Please check supplied cascade" +
00000107 "correlation data values",
00000108 "Parsing Error",
00000109 MessageBoxButtons.OK,
00000110 MessageBoxIcon.Error);
00000111 verified = false;
00000112 }
00000113 }
00000114 }
00000115

00000116 if (verified)
00000117 {
00000118 if (trainingalg == 0)
00000119 {
00000120 try
00000121 {
00000122 //Genetic algorithm
00000123 _GA_PopSize = Convert.ToInt32(GAPopSize);
00000124 _GA_GenLimit = Convert.ToInt32(GAGenLimit);
00000125 _GA_Crossover = Convert.ToInt32(GACrossOver);
00000126 _GA_Mutation = Convert.ToInt32(GAMutation);
00000127 _GA_Terr = Convert.ToSingle(GATerr);
00000128 _GA_Reports = Convert.ToInt32(GAReports);
00000129 _GA_HiddenL = Convert.ToInt32(GAHiddenL);
00000130 _GA_ITerr = GAITerr;
00000131 }
00000132 catch (FormatException)
00000133 {
00000134 MessageBox.Show
00000135 ("Parsing Error: Please check supplied"
00000136 +" genetic algorithm data values",
00000137 "Parsing Error",
00000138 MessageBoxButtons.OK,
00000139 MessageBoxIcon.Error);
00000140 verified = false;
00000141 }
00000142 }
00000143 }
00000144
00000145 if (verified)

00000146 {
00000147 if (trainingalg == 1)
00000148 {
00000149 try
00000150 {
00000151 //Back-propagation
00000152 _BP_EpochLimit = Convert.ToInt32(BPEpochLimit);
00000153 _BP_HiddenL = Convert.ToInt32(BPHiddenL);
00000154 _BP_Reports = Convert.ToInt32(BPReports);
00000155 _BP_Terr = Convert.ToDouble(BPTerr);
00000156 _BP_LearningRate = Convert.ToDouble(BPLearningRate);
00000157 _BP_ITerr = BPITerr;
00000158 }
00000159 catch (FormatException)
00000160 {
00000161 MessageBox.Show
00000162 ("Parsing Error: Please check " +
00000163 "supplied back propagation data values",
00000164 "Parsing Error",
00000165 MessageBoxButtons.OK,
00000166 MessageBoxIcon.Error);
00000167 verified = false;
00000168 }
00000169 }
00000170 }
00000171
00000172 if (verified)
00000173 {
00000174 verified = checkforsanity();
00000175 }

00000176 }
00000177
00000178 #endregion
00000179
00000180 #region internal
00000181 /********************\
00000182 |* INTERNAL *|
00000183 \********************/
00000184 private bool checkGAsanity()
00000185 {
00000186 bool sanity = true;
00000187
00000188 if (_GA_PopSize < 3)
00000189 sanity = false;
00000190
00000191 if (_GA_GenLimit < 1)
00000192 sanity = false;
00000193
00000194 if (_GA_Crossover < 0 || _GA_Crossover > 100)
00000195 {
00000196 sanity = false;
00000197 }
00000198
00000199 if (_GA_Mutation < 0 || _GA_Mutation > 100)
00000200 {
00000201 sanity = false;
00000202 }
00000203
00000204 if (_GA_Terr < 0 || _GA_Terr > 1)
00000205 sanity = false;

00000206
00000207 if (_GA_Reports < 1 || _GA_Reports > _GA_GenLimit)
00000208 sanity = false;
00000209
00000210 if (_GA_HiddenL < 1)
00000211 sanity = false;
00000212
00000213 if (!sanity)
00000214 MessageBox.Show
00000215 ("Sanity Check Fail: Please check supplied " +
00000216 "genetic algorithm data values",
00000217 "Sanity Check Fail",
00000218 MessageBoxButtons.OK,
00000219 MessageBoxIcon.Error);
00000220
00000221 return sanity;
00000222 }
00000223
00000224 private bool checkBPsanity()
00000225 {
00000226 bool sanity = true;
00000227
00000228 if (_BP_EpochLimit < 1)
00000229 sanity = false;
00000230
00000231 if (_BP_HiddenL < 1)
00000232 sanity = false;
00000233
00000234 if (_BP_Reports < 1 || _BP_Reports > _BP_EpochLimit)
00000235 sanity = false;

00000236
00000237 if (_BP_Terr < 0 || _BP_Terr > 1)
00000238 sanity = false;
00000239
00000240 if (_BP_LearningRate < 0 || _BP_LearningRate > 1)
00000241 sanity = false;
00000242
00000243 if (!sanity)
00000244 MessageBox.Show
00000245 ("Sanity Check Fail: Please check supplied " +
00000246 "back propagation data values",
00000247 "Sanity Check Fail",
00000248 MessageBoxButtons.OK,
00000249 MessageBoxIcon.Error);
00000250
00000251 return sanity;
00000252 }
00000253
00000254 private bool checkCCsanity()
00000255 {
00000256 bool sanity = true;
00000257
00000258 if (_CC_Terr < 0 || _CC_Terr > 1)
00000259 sanity = false;
00000260
00000261 if (_CC_MaxNeurons < 1)
00000262 sanity = false;
00000263
00000264 if (_CC_Reports < 1 || _CC_Reports > _CC_MaxNeurons)
00000265 sanity = false;

00000266
00000267 if (_CC_LearningRate < 0 || _CC_LearningRate > 1)
00000268 sanity = false;
00000269
00000270 if (!sanity)
00000271 MessageBox.Show
00000272 ("Sanity Check Fail: Please check supplied " +
00000273 "cascade correlation data values",
00000274 "Sanity Check Fail",
00000275 MessageBoxButtons.OK,
00000276 MessageBoxIcon.Error);
00000277
00000278 return sanity;
00000279 }
00000280
00000281 private bool checkforsanity()
00000282 {
00000283 //Some of the values that are allowed through here would not produce
00000284 //good results - however I don't want to prevent any flexibility, only
00000285 //halt any values from proceeding that are so wrong they could cause
00000286 //serious errors later down the line.
00000287
00000288 // - Basically we're making sure the data is clean before it goes
00000289 // - anywhere near a DLL.
00000290 // - The crossover and mutation probabilities are treated as
00000291 // - percentages, so they are range-checked against 0-100
00000292 // - rather than against 0-1.
00000293 bool sanity = true;
00000294
00000295 if (trainingalg < 0 || trainingalg > 2)

00000296 {
00000297 sanity = false;
00000298 MessageBox.Show
00000299 ("Sanity Check Fail: Training Algorithm not " +
00000300 "selected or selected incorrectly",
00000301 "Sanity Check Fail",
00000302 MessageBoxButtons.OK,
00000303 MessageBoxIcon.Error);
00000304 }
00000305
00000306 if (sanity)
00000307 {
00000308 if (trainingalg == 0)
00000309 {
00000310 sanity = checkGAsanity();
00000311 }
00000312
00000313 if (trainingalg == 1)
00000314 {
00000315 sanity = checkBPsanity();
00000316 }
00000317
00000318 if (trainingalg == 2)
00000319 {
00000320 sanity = checkCCsanity();
00000321 }
00000322 }
00000323
00000324 return sanity;
00000325 }

00000326 #endregion
00000327
00000328 #region getters/setters
00000329 /********************\
00000330 |* EXTERNAL *|
00000331 \********************/
00000332
00000333 #region cascade_correlation
00000334 //Cascade Correlation
00000335 public double CC_Terr
00000336 { get { return _CC_Terr; } }
00000337 public int CC_MaxNeurons
00000338 { get { return _CC_MaxNeurons; } }
00000339 public int CC_Reports
00000340 { get { return _CC_Reports; } }
00000341 public double CC_LearningRate
00000342 { get { return _CC_LearningRate; } }
00000343 public bool CC_ITerr
00000344 { get { return _CC_ITerr; } }
00000345 #endregion
00000346
00000347 #region genetic_algorithm
00000348 //Genetic Algorithm
00000349 public int GA_PopSize
00000350 { get { return _GA_PopSize; } }
00000351 public int GA_GenLimit
00000352 { get { return _GA_GenLimit; } }
00000353 public int GA_Crossover
00000354 { get { return _GA_Crossover; } }
00000355 public int GA_Mutation

00000356 { get { return _GA_Mutation; } }


00000357 public double GA_Terr
00000358 { get { return _GA_Terr; } }
00000359 public int GA_Reports
00000360 { get { return _GA_Reports; } }
00000361 public int GA_HiddenL
00000362 { get { return _GA_HiddenL; } }
00000363 public bool GA_ITerr
00000364 { get { return _GA_ITerr; } }
00000365 #endregion
00000366
00000367 #region back_propagation
00000368 //Back propagation
00000369 public int BP_EpochLimit
00000370 { get { return _BP_EpochLimit; } }
00000371 public int BP_HiddenL
00000372 { get { return _BP_HiddenL; } }
00000373 public int BP_Reports
00000374 { get { return _BP_Reports; } }
00000375 public double BP_Terr
00000376 { get { return _BP_Terr; } }
00000377 public double BP_LearningRate
00000378 { get { return _BP_LearningRate; } }
00000379 public bool BP_ITerr
00000380 { get { return _BP_ITerr; } }
00000381 #endregion
00000382 #endregion
00000383
00000384 #region external
00000385 //Verification

00000386 public bool Verified


00000387 { get { return verified; } }
00000388
00000389 //TrainingAlgorithm
00000390 public int TrainingAlgorithm
00000391 { get { return trainingalg; } }
00000392
00000393 //Dataset
00000394 public int TrDataSet
00000395 { get { return _TDataSet; } }
00000396 #endregion
00000397 }
00000398 }

9.1.2.2 - “DatasetParser.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.Linq;
00000004 using System.Text;
00000005 using System.Windows.Forms;
00000006 using System.Collections;
00000007
00000008 namespace _05025397.Controller
00000009 {
00000010 public class DatasetParser
00000011 {
00000012 #region data
00000013 /********************\
00000014 |* DATA *|
00000015 \********************/
00000016 private bool _Verified;
00000017
00000018 private ArrayList trinputs;
00000019 private ArrayList troutputs;
00000020
00000021 private ArrayList teinputs;
00000022 private ArrayList teoutputs;
00000023
00000024 private ArrayList aldata;
00000025 private ArrayList altest;
00000026
00000027 private string[] data;
00000028 private string[] test;
00000029

00000030 //private Scaler scale;


00000031
00000032 private double Ninputs;
00000033 private double Noutputs;
00000034 private double Niterations;
00000035 #endregion
00000036
00000037 #region constructor
00000038 /********************\
00000039 |* CONSTRUCTOR *|
00000040 \********************/
00000041 public DatasetParser(RichTextBox txtData, RichTextBox txtTest)
00000042 {
00000043 //Data init
00000044 bool Verified = true;
00000045
00000046 char[] delimiterChars = { '\n', ' ', '\t', '\r', ',', ';' };
00000047
00000048 trinputs = new ArrayList();
00000049 troutputs = new ArrayList();
00000050 teinputs = new ArrayList();
00000051 teoutputs = new ArrayList();
00000052 aldata = new ArrayList();
00000053 altest = new ArrayList();
00000054
00000055 //Strip carriage returns and trim
00000056 txtData.Text = txtData.Text.Trim();
00000057 txtTest.Text = txtTest.Text.Trim();
00000058 txtData.Text = txtData.Text.Replace("\r", "");
00000059 txtTest.Text = txtTest.Text.Replace("\r", "");

00000060
00000061 //Split at the chars specified above
00000062 data = txtData.Text.Split(delimiterChars);
00000063 test = txtTest.Text.Split(delimiterChars);
00000064
00000065 //Get the number of iterations, number of inputs
00000066 //and number of outputs
00000067 Niterations = System.Convert.ToDouble(data[0]);
00000068 Ninputs = System.Convert.ToDouble(data[1]);
00000069 Noutputs = System.Convert.ToDouble(data[2]);
00000070
00000071 //parse the data into appropriately
00000072 //structured arrays
00000073 Verified = parseTrainingData();
00000074 Verified &= parseTestingData();
00000075
00000076 _Verified = Verified;
00000077 }
00000078 #endregion
00000079
00000080 #region internal_functions
00000081 /********************\
00000082 |* INTERNAL *|
00000083 \********************/
00000084 private bool parseTrainingData()
00000085 {
00000086 bool success = true;
00000087 try
00000088 {
00000089 for (int i = 3; i < data.Length; i++)

00000090 {
00000091 aldata.Add(System.Convert.ToDouble(data[i]));
00000092 }
00000093
00000094 while (aldata.Count > 0)
00000095 {
00000096 for (int i = 0; i < Ninputs; i++)
00000097 {
00000098 trinputs.Add(aldata[0]);
00000099 aldata.RemoveAt(0);
00000100 }
00000101 for (int j = 0; j < Noutputs; j++)
00000102 {
00000103 troutputs.Add(aldata[0]);
00000104 aldata.RemoveAt(0);
00000105 }
00000106 }
00000107 }
00000108 catch
00000109 {
00000110 MessageBox.Show("Error parsing training data",
00000111 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000112 success = false;
00000113 }
00000114 return success;
00000115 }
00000116
00000117 private bool parseTestingData()
00000118 {
00000119 bool success = true;

00000120
00000121 try
00000122 {
00000123 for (int i = 3; i < test.Length; i++)
00000124 {
00000125 altest.Add(System.Convert.ToDouble(test[i]));
00000126 }
00000127
00000128 while (altest.Count > 0)
00000129 {
00000130 for (int i = 0; i < Ninputs; i++)
00000131 {
00000132 teinputs.Add(altest[0]);
00000133 altest.RemoveAt(0);
00000134 }
00000135 for (int j = 0; j < Noutputs; j++)
00000136 {
00000137 teoutputs.Add(altest[0]);
00000138 altest.RemoveAt(0);
00000139 }
00000140 }
00000141 }
00000142 catch
00000143 {
00000144 MessageBox.Show("Error parsing testing data",
00000145 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000146 success = false;
00000147 }
00000148 return success;
00000149 }

00000150 #endregion
00000151
00000152 #region external_functions
00000153 /********************\
00000154 |* EXTERNAL *|
00000155 \********************/
00000156 public void GetStructure(out double iter,
00000157 out double Ninp, out double Nout)
00000158 {
00000159 Ninp = Ninputs;
00000160 iter = Niterations;
00000161 Nout = Noutputs;
00000162 }
00000163
00000164 public void GetTrainingData
00000165 (out double[][] trinputdata, out double[][] troutputdata)
00000166 {
00000167 trinputdata = new double[trinputs.Count / (int) Ninputs][];
00000168 troutputdata = new double[troutputs.Count / (int) Noutputs][];
00000169
00000170 for (int i = 0; i < (trinputs.Count / (int)Ninputs); i++)
00000171 trinputdata[i] = new double[(int) Ninputs];
00000172
00000173 for (int i = 0; i < (troutputs.Count / (int)Noutputs); i++)
00000174 troutputdata[i] = new double[(int) Noutputs];
00000175
00000176 for (int j = 0, l = 0; j < trinputs.Count; j+=(int) Ninputs, l++)
00000177 {
00000178 for (int k = 0; k < Ninputs; k++)
00000179 {

00000180 trinputdata[l][k] = (double) trinputs[j + k];


00000181 }
00000182 }
00000183
00000184 for (int j = 0, l = 0; j < troutputs.Count; j += (int)Noutputs, l++)
00000185 {
00000186 for (int k = 0; k < Noutputs; k++)
00000187 {
00000188 troutputdata[l][k] = (double)troutputs[j + k];
00000189 }
00000190 }
00000191 }
00000192
00000193 public void GetTestingData
00000194 (out double[][] teinputdata, out double[][] teoutputdata)
00000195 {
00000196 teinputdata = new double[teinputs.Count / (int)Ninputs][];
00000197 teoutputdata = new double[teoutputs.Count / (int)Noutputs][];
00000198
00000199 for (int i = 0; i < (teinputs.Count / (int)Ninputs); i++)
00000200 teinputdata[i] = new double[(int)Ninputs];
00000201
00000202 for (int i = 0; i < (teoutputs.Count / (int)Noutputs); i++)
00000203 teoutputdata[i] = new double[(int)Noutputs];
00000204
00000205 for (int j = 0, l = 0; j < teinputs.Count; j += (int)Ninputs, l++)
00000206 {
00000207 for (int k = 0; k < Ninputs; k++)
00000208 {
00000209 teinputdata[l][k] = (double)teinputs[j + k];

00000210 }
00000211 }
00000212
00000213 for (int j = 0, l = 0; j < teoutputs.Count; j += (int)Noutputs, l++)
00000214 {
00000215 for (int k = 0; k < Noutputs; k++)
00000216 {
00000217 teoutputdata[l][k] = (double)teoutputs[j + k];
00000218 }
00000219 }
00000220 }
00000221
00000222 public void GetTestingDataAL
00000223 (out ArrayList teinputdata, out ArrayList teoutputdata)
00000224 {
00000225 teinputdata = teinputs;
00000226 teoutputdata = teoutputs;
00000227 }
00000228
00000229 public bool Verified
00000230 {
00000231 get { return _Verified; }
00000232 }
00000233
00000234 #endregion
00000235 }
00000236 }
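
The parser above expects the first three delimited tokens to be the iteration count, the number of inputs and the number of outputs, with every following group of tokens forming one row of inputs followed by its outputs. A minimal sketch feeding it the XOR set in that format (the iteration count of 1000 is an arbitrary illustrative value, and the test box simply reuses the training rows):

    var txtData = new System.Windows.Forms.RichTextBox();
    var txtTest = new System.Windows.Forms.RichTextBox();

    //Header: iterations, inputs, outputs - then one pattern per row
    txtData.Text = "1000 2 1\n" +
                   "0 0 0\n" +
                   "0 1 1\n" +
                   "1 0 1\n" +
                   "1 1 0";
    txtTest.Text = txtData.Text;

    var parser = new _05025397.Controller.DatasetParser(txtData, txtTest);

    double iterations, inputs, outputs;
    parser.GetStructure(out iterations, out inputs, out outputs);   //1000, 2, 1

    double[][] trainIn, trainOut;
    parser.GetTrainingData(out trainIn, out trainOut);   //trainIn[1] is {0, 1}, trainOut[1] is {1}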

9.1.2.3 - “Algorithms\TrainingAlgorithm.cs”
00000001 using System;

00000002 using System.Collections.Generic;


00000003 using System.Linq;
00000004 using System.Text;
00000005 using System.Windows.Forms;
00000006 using ZedGraph;
00000007 using _05025397;
00000008
00000009 namespace _05025397.Controller.Algorithms
00000010 {
00000011 public abstract class TrainingAlgorithm
00000012 {
00000013 #region data
00000014 /********************\
00000015 |* DATA *|
00000016 \********************/
00000017 private double _TargetError;
00000018 private int _ReportInterval;
00000019 private int _IterMax;
00000020
00000021 protected bool _ITerr;
00000022
00000023 protected bool _Tested;
00000024 protected bool _Trained;
00000025
00000026 private string _ReportData;
00000027
00000028 protected int CurIt;
00000029
00000030 protected PointPairList BestErrList;
00000031

00000032 protected Random Rand = new Random();


00000033
00000034 protected DatasetParser getdata;
00000035
00000036 #endregion
00000037
00000038 #region constructor
00000039 /********************\
00000040 |* CONSTRUCTOR *|
00000041 \********************/
00000042 public TrainingAlgorithm(double Target_Error,
00000043 int Reports, int Iter_Max, RichTextBox txtData,
00000044 RichTextBox txtTest, bool ITerr)
00000045 {
00000046 _TargetError = Target_Error;
00000047 _ReportInterval = Reports;
00000048 _IterMax = Iter_Max;
00000049 _ITerr = ITerr;
00000050
00000051 //If the ignore target error is true
00000052 //then set the target error to -1.0
00000053 //this is impossible to reach, so the
00000054 //network will loop to completion.
00000055 if (_ITerr)
00000056 _TargetError = -1.00;
00000057
00000058 BestErrList = new PointPairList();
00000059 getdata = new DatasetParser(txtData, txtTest);
00000060 }
00000061 #endregion

00000062
00000063 #region internal
00000064 /********************\
00000065 |* INTERNAL *|
00000066 \********************/
00000067 protected void Report(string output)
00000068 {
00000069 _ReportData += output;
00000070 }
00000071 #endregion
00000072
00000073 #region external
00000074 /********************\
00000075 |* EXTERNAL *|
00000076 \********************/
00000077 public abstract bool Train();
00000078 public abstract bool Test();
00000079 public abstract string network_details();
00000080
00000081 public PointPairList getGraphData()
00000082 {
00000083 return BestErrList;
00000084 }
00000085
00000086 public double TargetError
00000087 {
00000088 set { _TargetError = value; }
00000089 get { return _TargetError; }
00000090 }
00000091

00000092 public int ReportInterval


00000093 {
00000094 get { return _ReportInterval; }
00000095 }
00000096
00000097 public int IterMax
00000098 {
00000099 get { return _IterMax; }
00000100 }
00000101
00000102 public bool Trained
00000103 {
00000104 get { return _Trained; }
00000105 set { _Trained = value; }
00000106 }
00000107
00000108 public bool Tested
00000109 {
00000110 get { return _Tested; }
00000111 }
00000112
00000113 public string ReportData
00000114 {
00000115 get { return _ReportData; }
00000116 }
00000117 #endregion
00000118 }
00000119 }
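
Each of the concrete trainers listed in the following sections derives from this class, so a new algorithm only has to provide the three abstract members. A bare-bones sketch of such a subclass (the class name and the trivial method bodies are illustrative only, and it is assumed to sit in the same namespace with the same using directives as the trainers below):

    public class NullTrainer : TrainingAlgorithm
    {
        public NullTrainer(double targetError, int reports, int iterMax,
            RichTextBox txtData, RichTextBox txtTest, bool iTerr)
            : base(targetError, reports, iterMax, txtData, txtTest, iTerr) { }

        //No real training - just record a report line and mark the network trained
        public override bool Train()
        {
            Report("NullTrainer: no training performed.\n");
            Trained = true;
            return true;
        }

        public override bool Test()
        {
            Report("NullTrainer: no testing performed.\n");
            return true;
        }

        public override string network_details()
        {
            return "NullTrainer - no parameters.";
        }
    }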

9.1.2.4 - “Algorithms\CascadeCorrelation.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.Linq;
00000004 using System.Text;
00000005 using System.Windows.Forms;
00000006 using System.Collections;
00000007 using System.IO;
00000008 using ZedGraph;
00000009 using _05025397;
00000010 using FANN_Wrapper;
00000011
00000012
00000013 namespace _05025397.Controller.Algorithms
00000014 {
00000015 public class CascadeCorrelation : TrainingAlgorithm
00000016 {
00000017 #region data
00000018 /********************\
00000019 |* DATA *|
00000020 \********************/
00000021 private double _InputL;
00000022 private double _OutputL;
00000023 private double _LearnRate;
00000024
00000025 FANN_Cascade CCNet;
00000026
00000027 #endregion
00000028
00000029 #region constructor

00000030 /********************\
00000031 |* CONSTRUCTOR *|
00000032 \********************/
00000033 public CascadeCorrelation(double LearnRate,
00000034 double TargetError, int Reports,
00000035 int Iter_Max, bool ITerr,
00000036 RichTextBox txtData, RichTextBox txtTest)
00000037 : base(TargetError, Reports, Iter_Max,
00000038 txtData, txtTest, ITerr)
00000039 {
00000040 _LearnRate = LearnRate;
00000041
00000042 double iterations;
00000043 getdata.GetStructure(out iterations, out _InputL, out _OutputL);
00000044
00000045 CCNet =
00000046 new FANN_Cascade((int)_InputL, (int)_OutputL,
00000047 _LearnRate, base.TargetError, base.ReportInterval,
00000048 base.IterMax);
00000049
00000050 saveDllData(txtData, txtTest);
00000051 }
00000052 #endregion
00000053
00000054 #region internal_functions
00000055 /********************\
00000056 |* INTERNAL *|
00000057 \********************/
00000058 private void saveDllData(RichTextBox tr, RichTextBox te)
00000059 {

00000060 TextWriter tw = new StreamWriter("dlltrdata.dat");


00000061 tw.Write(tr.Text);
00000062 tw.Close();
00000063
00000064 tw = new StreamWriter("dlltedata.dat");
00000065 tw.Write(te.Text);
00000066 tw.Close();
00000067 }
00000068 #endregion
00000069
00000070 #region external_functions
00000071 /********************\
00000072 |* EXTERNAL *|
00000073 \********************/
00000074 public override string network_details()
00000075 {
00000076 StringBuilder output = new StringBuilder();
00000077 output.Append(
00000078 string.Format
00000079 ("Network Details: \n Input Layer: {0:d}",
00000080 (int)_InputL));
00000081 output.Append(
00000082 string.Format
00000083 ("\t\t\t\t\tOutput Layer: {0:d}",
00000084 (int)_OutputL));
00000085 output.Append(
00000086 string.Format
00000087 ("\nLearning Rate: {0:0.00000}",
00000088 _LearnRate));
00000089 output.Append(

00000090 string.Format
00000091 ("\t\t\tTarget Error: {0:0.00000}",
00000092 base.TargetError));
00000093 output.Append(
00000094 string.Format
00000095 ("\t\tMaximum Nodes: {0:d}",
00000096 base.IterMax));
00000097 output.Append(
00000098 string.Format
00000099 ("\nReport every {0:d} generations.",
00000100 base.ReportInterval));
00000101 output.Append(
00000102 string.Format
00000103 ("\t\t\t\t\t\tIgnore target error " +
00000104 "(process all iterations): {0}",
00000105 _ITerr));
00000106
00000107 return output.ToString();
00000108 }
00000109
00000110 public override bool Train()
00000111 {
00000112 bool success = true;
00000113 try
00000114 {
00000115 success = CCNet.Train();
00000116 Report(CCNet._tr_output);
00000117 }
00000118 catch
00000119 {

00000120 MessageBox.Show("Error running Cascade training!",


00000121 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000122 success = false;
00000123 }
00000124
00000125 success = getGraphDataFromDll();
00000126
00000127 base._Trained = success;
00000128 return success;
00000129 }
00000130
00000131 public bool getGraphDataFromDll()
00000132 {
00000133 bool success = true;
00000134
00000135 try
00000136 {
00000137 double[] besterrlist = CCNet.besterrorlist;
00000138
00000139 for (int i = 0; i < besterrlist.Length; i++)
00000140 {
00000141 base.BestErrList.Add((double)i, besterrlist[i]);
00000142 }
00000143 }
00000144 catch
00000145 {
00000146 MessageBox.Show("Error getting graph data from DLL",
00000147 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000148 success = false;
00000149 }

00000150
00000151 return success;
00000152 }
00000153
00000154 public override bool Test()
00000155 {
00000156 bool success = true;
00000157 try
00000158 {
00000159 success = CCNet.Test();
00000160 Report(CCNet._te_output);
00000161 }
00000162 catch
00000163 {
00000164 MessageBox.Show("Error running Cascade testing!",
00000165 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000166 success = false;
00000167 }
00000168 base._Tested = success;
00000169 return success;
00000170 }
00000171 #endregion
00000172 }
00000173 }

9.1.2.5 - “Algorithms\GeneticAlgorithm.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.Linq;
00000004 using System.Text;
00000005 using System.Windows.Forms;
00000006 using System.Collections;
00000007 using ZedGraph;
00000008 using _05025397;
00000009
00000010 namespace _05025397.Controller.Algorithms
00000011 {
00000012 public class GeneticAlgorithm : TrainingAlgorithm
00000013 {
00000014
00000015 #region data
00000016 /********************\
00000017 |* DATA *|
00000018 \********************/
00000019 //GA Data
00000020 private int _PopSize;
00000021 private int _Crossover;
00000022 private int _Mutation;
00000023 private int _HiddenL;
00000024
00000025 //Network Data
00000026 private double _InputL;
00000027 private double _OutputL;
00000028
00000029 //Population

00000030 ArrayList Population;


00000031
00000032 #endregion
00000033
00000034 #region constructor
00000035 /********************\
00000036 |* CONSTRUCTOR *|
00000037 \********************/
00000038 public GeneticAlgorithm(int PopSize, int Crossover,
00000039 int Mutation, int HiddenL, bool ITerr,
00000040 double TargetError, int Reports, int Iter_Max,
00000041 RichTextBox txtData, RichTextBox txtTest)
00000042 : base(TargetError, Reports, Iter_Max,
00000043 txtData, txtTest, ITerr)
00000044 {
00000045 _PopSize = PopSize;
00000046 _Crossover = Crossover;
00000047 _Mutation = Mutation;
00000048 _HiddenL = HiddenL;
00000049
00000050 //Init Network params
00000051 double iterations;
00000052
00000053 getdata.GetStructure(out iterations, out _InputL, out _OutputL);
00000054
00000055 //Create new population references
00000056 Population = new ArrayList(_PopSize);
00000057 for (int i = 0; i < _PopSize; i++)
00000058 {
00000059 Population.Add

00000060 (new Model.GANetwork(


00000061 (int)_InputL, _HiddenL,
00000062 (int)_OutputL, Rand.Next()));
00000063 }
00000064 }
00000065 #endregion
00000066
00000067 #region internal_functions
00000068 /********************\
00000069 |* INTERNAL *|
00000070 \********************/
00000071 private bool run()
00000072 {
00000073 bool success = true;
00000074
00000075 double[][] results;
00000076 double[][] inputs;
00000077
00000078 getdata.GetScaledTrainingData(out inputs, out results);
00000079
00000080 for (int i = 0; i < base.IterMax; i++)
00000081 {
00000082 //Get the msqe
00000083 for (int j = 0; j < _PopSize; j++)
00000084 {
00000085 ((Model.GANetwork) Population[j]).getMsQE(inputs, results);
00000086 }
00000087
00000088 //Sort the functions according to fitness now.
00000089 success = sort_fitnesses();

00000090
00000091 //Report if we're at a report iteration
00000092 //Also update the besterrlist (for graphing)
00000093 if (base.CurIt == ReportInterval)
00000094 {
00000095 base.BestErrList.Add
00000096 ((double)i, ((Model.GANetwork)Population[0]).MSqE);
00000097
00000098 base.Report("Best error at iteration [ "
00000099 + i + " ] was " + ((Model.GANetwork)Population[0]).MSqE + ".\n");
00000100
00000101 base.CurIt = 0;
00000102 }
00000103 base.CurIt++;
00000104
00000105 //Check for having reached the target
00000106 if (((Model.GANetwork)Population[0]).MSqE <= base.TargetError)
00000107 {
00000108 Report("\nNetwork matching or improving upon target error "+
00000109 "found at iteration [ " + i + " ].");
00000110 break;
00000111 }
00000112
00000113 success = build_generation();
00000114 }
00000115 return success;
00000116 }
00000117
00000118 private bool sort_fitnesses()
00000119 {

00000120 bool success = true;


00000121 try
00000122 {
00000123 Population.Sort();
00000124 }
00000125 catch
00000126 {
00000127 MessageBox.Show("Error sorting population",
00000128 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000129 success = false;
00000130 }
00000131 return success;
00000132 }
00000133
00000134 private bool build_generation()
00000135 {
00000136 bool success = true;
00000137
00000138 int selection1, selection2;
00000139
00000140 double[] weightarray1;
00000141 double[] weightarray2;
00000142
00000143 double[] weightarray_f;
00000144
00000145 for (int i = (_PopSize - 1); i > (int)(_PopSize / 2); i--)
00000146 {
00000147 //This algorithm is altered from previous versions.
00000148 //It selects two members from the fitter (sorted) half of the population
00000149 //Then applies crossover if necessary (based on probability)

00000150 //applies mutation if necessary (same again)


00000151 //then if neither mutation nor crossover has been applied
00000152 //chooses position 1 or 2.
00000153
00000154 //Then it replaces the last position in the population
00000155 //(ie. the least fit) with this new string.
00000156 //Next iteration it replaces the next to last, and
00000157 //so on, all the way to the middle of the population.
00000158 selection1 = Rand.Next((int)_PopSize / 2);
00000159 selection2 = Rand.Next((int)_PopSize / 2);
00000160
00000161 weightarray1 = ((Model.GANetwork)Population[selection1]).getWeights();
00000162 weightarray2 = ((Model.GANetwork)Population[selection2]).getWeights();
00000163
00000164 //Just a quick error check
00000165 if (weightarray1.Length != weightarray2.Length)
00000166 {
00000167 MessageBox.Show("Error: 2D weight array length unequal.",
00000168 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000169 success = false;
00000170 }
00000171
00000172 try
00000173 {
00000174 weightarray_f = (double[])((Rand.Next(2) == 0) ? weightarray1 : weightarray2).Clone();
00000175
00000176 if (Rand.Next(100) < _Crossover)
00000177 {
00000178 //Choose a random position in the weight array
00000179 //to be our crossover point

00000180 int XPos = Rand.Next(1, (weightarray1.Length - 1));


00000181
00000182 //Simple single point crossover of the weights
00000183 for (int j = 0; j < weightarray1.Length; j++)
00000184 {
00000185 if (j < XPos)
00000186 {
00000187 weightarray_f[j] = weightarray1[j];
00000188 }
00000189 else
00000190 {
00000191 weightarray_f[j] = weightarray2[j];
00000192 }
00000193 }
00000194 }
00000195
00000196 if (Rand.Next(100) < _Mutation)
00000197 {
00000198 //Jiggle the weights in the array
00000199 //up or down by 0.5
00000200 for (int j = 0; j < weightarray_f.Length; j++)
00000201 {
00000202 weightarray_f[j] += Rand.NextDouble() - 0.5;
00000203 }
00000204 }
00000205
00000206 //Set the weights of the current member of the
00000207 //population we're on
00000208 ((Model.GANetwork)Population[i]).setWeights(weightarray_f);
00000209 }

00000210 catch
00000211 {
00000212 MessageBox.Show("Error building new generation",
00000213 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000214 success = false;
00000215 }
00000216 }
00000217
00000218 return success;
00000219 }
00000220 #endregion
00000221
00000222 #region external_functions
00000223 /********************\
00000224 |* EXTERNAL *|
00000225 \********************/
00000226 public override string network_details()
00000227 {
00000228 //Add stringbuilder for efficiency here
00000229 StringBuilder output = new StringBuilder();
00000230 output.Append(
00000231 string.Format("Network Details: \n Population: {0:d}",
00000232 _PopSize));
00000233 output.Append(
00000234 string.Format("\t\t\tCrossover Probability: {0:d}",
00000235 _Crossover));
00000236 output.Append(
00000237 string.Format("\t\t\tMutation Probability: {0:d}",
00000238 _Mutation));
00000239 output.Append(

00000240 string.Format("\nHidden layer: {0:d}",


00000241 _HiddenL));
00000242 output.Append(
00000243 string.Format("\t\t\tTarget Error: {0:0.00000}",
00000244 base.TargetError));
00000245 output.Append(
00000246 string.Format("\t\t\tMaximum Generation: {0:d}",
00000247 base.IterMax));
00000248 output.Append(
00000249 string.Format("\nReport every {0:d} generations.",
00000250 base.ReportInterval));
00000251 output.Append(
00000252 string.Format("\t\t\t\t\t\t\tIgnore target error (process all iterations): {0}",
00000253 _ITerr));
00000254
00000255 return output.ToString();
00000256 }
00000257
00000258 public override bool Train()
00000259 {
00000260 bool success = true;
00000261
00000262 success = run();
00000263
00000264 base._Trained = success;
00000265
00000266 return success;
00000267 }
00000268
00000269 public override bool Test()

00000270 {
00000271 bool success = true;
00000272
00000273 StringBuilder output = new StringBuilder();
00000274
00000275 double[][] tstinput;
00000276 double[][] tstoutput;
00000277 double[] netoutput;
00000278
00000279 getdata.GetScaledTestingData(out tstinput, out tstoutput);
00000280
00000281 output.Append("\n\n\t\t\t~~Network Testing~~");
00000282
00000283 try
00000284 {
00000285 for (int i = 0; i < tstinput.Length; i++)
00000286 {
00000287 ((Model.GANetwork)Population[0]).Run(tstinput[i]);
00000288
00000289 output.Append("\nWith inputs:");
00000290 for (int j = 0; j < tstinput[i].Length; j++)
00000291 {
00000292 output.Append(string.Format(" [{0:g}] ", tstinput[i][j]));
00000293 }
00000294
00000295 output.Append("\nOutput achieved was:");
00000296 netoutput = ((Model.GANetwork)Population[0]).GetOutput();
00000297 for (int j = 0; j < netoutput.Length; j++)
00000298 {
00000299 output.Append(string.Format(" [{0:g}] ", netoutput[j]));

00000300 }
00000301
00000302 output.Append("\nOutput Desired was:");
00000303 for (int j = 0; j < tstoutput[i].Length; j++)
00000304 {
00000305 output.Append(string.Format(" [{0:g}] ", tstoutput[i][j]));
00000306 }
00000307 }
00000308 output.Append("\nMean Squared Error with these inputs and outputs is:");
00000309 ((Model.GANetwork)Population[0]).getMsQE(tstinput,tstoutput);
00000310 output.Append(
00000311 string.Format
00000312 ("{0:g}", (double) ((Model.GANetwork)Population[0]).MSqE));
00000313
00000314 base.Report(output.ToString());
00000315 }
00000316 catch
00000317 {
00000318 MessageBox.Show("Error Running Test.",
00000319 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000320 success = false;
00000321 }
00000322 base._Tested = success;
00000323 return success;
00000324 }
00000325 #endregion
00000326
00000327 }
00000328 }
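
The crossover and mutation steps in build_generation above operate directly on the flattened weight arrays returned by getWeights(). A small worked sketch of the same single-point crossover and mutation with hand-picked values (in the listing the parents come from the fitter half of the sorted population and the crossover point is chosen at random):

    double[] parentA = { 0.10, 0.20, 0.30, 0.40 };
    double[] parentB = { 0.90, 0.80, 0.70, 0.60 };

    int xPos = 2;   //fixed here for illustration; the GA uses Rand.Next(1, length - 1)
    double[] child = new double[parentA.Length];
    for (int j = 0; j < parentA.Length; j++)
    {
        child[j] = (j < xPos) ? parentA[j] : parentB[j];
    }
    //child is now { 0.10, 0.20, 0.70, 0.60 }

    //Mutation then nudges every weight by up to +/-0.5, as in the listing
    Random rand = new Random();
    for (int j = 0; j < child.Length; j++)
    {
        child[j] += rand.NextDouble() - 0.5;
    }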

9.1.2.6 - “Algorithms\BackProp.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.Linq;
00000004 using System.Text;
00000005 using System.Windows.Forms;
00000006 using System.Collections;
00000007 using System.IO;
00000008 using ZedGraph;
00000009 using _05025397;
00000010 using FANN_Wrapper;
00000011
00000012 namespace _05025397.Controller.Algorithms
00000013 {
00000014 public class BackProp : TrainingAlgorithm
00000015 {
00000016 #region data
00000017 /********************\
00000018 |* DATA *|
00000019 \********************/
00000020 private double _InputL;
00000021 private double _OutputL;
00000022 private int _HiddenL;
00000023 private double _LearnRate;
00000024
00000025 FANN_BackProp BPnet;
00000026
00000027 #endregion
00000028
00000029 #region constructor

00000030 /********************\
00000031 |* CONSTRUCTOR *|
00000032 \********************/
00000033 public BackProp
00000034 (int HiddenL, double LearnRate, bool ITerr,
00000035 double TargetError, int Reports,
00000036 int Iter_Max, RichTextBox txtData, RichTextBox txtTest)
00000037 : base(TargetError, Reports, Iter_Max,
00000038 txtData, txtTest, ITerr)
00000039 {
00000040 _HiddenL = HiddenL;
00000041 _LearnRate = LearnRate;
00000042
00000043 double iterations;
00000044 getdata.GetStructure(out iterations, out _InputL, out _OutputL);
00000045
00000046 BPnet =
00000047 new FANN_BackProp((int)_InputL, HiddenL, (int)_OutputL,
00000048 _LearnRate, base.TargetError, base.ReportInterval,
00000049 base.IterMax);
00000050
00000051 //Copy the training and testing data to the dat files
00000052 //ready for the DLL's to access.
00000053
00000054 saveDllData(txtData, txtTest);
00000055 }
00000056 #endregion
00000057
00000058 #region internal_functions
00000059 /********************\

00000060 |* INTERNAL *|
00000061 \********************/
00000062 private void saveDllData(RichTextBox tr, RichTextBox te)
00000063 {
00000064 TextWriter tw = new StreamWriter("dlltrdata.dat");
00000065 tw.Write(tr.Text);
00000066 tw.Close();
00000067
00000068 tw = new StreamWriter("dlltedata.dat");
00000069 tw.Write(te.Text);
00000070 tw.Close();
00000071 }
00000072 #endregion
00000073
00000074 #region external_functions
00000075 /********************\
00000076 |* EXTERNAL *|
00000077 \********************/
00000078 public override string network_details()
00000079 {
00000080 StringBuilder output = new StringBuilder();
00000081 output.Append(
00000082 string.Format
00000083 ("Network Details: \n Input Layer: {0:d}",
00000084 (int) _InputL));
00000085 output.Append(
00000086 string.Format
00000087 ("\t\t\tOutput Layer: {0:d}",
00000088 (int) _OutputL));
00000089 output.Append(

00000090 string.Format
00000091 ("\t\t\tHidden Layer: {0:d}",
00000092 (int) _HiddenL));
00000093 output.Append(
00000094 string.Format
00000095 ("\nLearning Rate: {0:0.00000}",
00000096 _LearnRate));
00000097 output.Append(
00000098 string.Format
00000099 ("\t\t\tTarget Error: {0:0.00000}",
00000100 base.TargetError));
00000101 output.Append(
00000102 string.Format
00000103 ("\t\tMaximum Epoch: {0:d}",
00000104 base.IterMax));
00000105 output.Append(
00000106 string.Format
00000107 ("\nReport every {0:d} generations.",
00000108 base.ReportInterval));
00000109 output.Append(
00000110 string.Format
00000111 ("\t\t\t\t\t\tIgnore target error " +
00000112 "(process all iterations): {0}",
00000113 _ITerr));
00000114
00000115 return output.ToString();
00000116 }
00000117
00000118 public override bool Train()
00000119 {


00000120 bool success = true;


00000121 try
00000122 {
00000123 success = BPnet.Train();
00000124 Report(BPnet._tr_output);
00000125 }
00000126 catch
00000127 {
00000128 MessageBox.Show("Error running Backprop training!",
00000129 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000130 success = false;
00000131 }
00000132
00000133 if (success) success = getGraphDataFromDll();
00000134
00000135 base._Trained = success;
00000136 return success;
00000137 }
00000138
00000139 public bool getGraphDataFromDll()
00000140 {
00000141 bool success = true;
00000142
00000143 try
00000144 {
00000145 double[] besterrlist = BPnet.besterrorlist;
00000146
00000147 for (int i = 0; i < besterrlist.Length; i++)
00000148 {
00000149 base.BestErrList.Add((double)i, besterrlist[i]);


00000150 }
00000151 }
00000152 catch
00000153 {
00000154 MessageBox.Show("Error getting graph data from DLL",
00000155 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000156 success = false;
00000157 }
00000158
00000159 return success;
00000160 }
00000161
00000162 public override bool Test()
00000163 {
00000164 bool success = true;
00000165 try
00000166 {
00000167 success = BPnet.Test();
00000168 Report(BPnet._te_output);
00000169 }
00000170 catch
00000171 {
00000172 MessageBox.Show("Error running Backprop testing!",
00000173 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000174 success = false;
00000175 }
00000176 base._Tested = success;
00000177 return success;
00000178 }
00000179 #endregion


00000180 }
00000181 }

9.1.3 - Model
9.1.3.1 - “Network.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.Linq;
00000004 using System.Text;
00000005 using System.Windows.Forms;
00000006 using _05025397;
00000007
00000008 namespace _05025397.Model
00000009 {
00000010 public class Network
00000011 {
00000012 #region data
00000013 /********************\
00000014 |* DATA *|
00000015 \********************/
00000016 //[layer][node][input]
00000017 protected double[][][] _inputs;
00000018 protected double[][][] _weights;
00000019 protected double[][] _outputs;
00000020
00000021 protected int _numinputs;
00000022 protected int _hiddennodes;
00000023 protected int _numoutputs;
00000024


00000025 protected Random Rand;


00000026 #endregion
00000027
00000028 #region init
00000029 /********************\
00000030 |* INIT *|
00000031 \********************/
00000032 private void init_inputlayer()
00000033 {
00000034 _inputs[0] = new double[_numinputs][];
00000035 _weights[0] = new double[_numinputs][];
00000036 _outputs[0] = new double[_numinputs];
00000037
00000038 for (int i = 0; i < _numinputs; i++)
00000039 {
00000040 _inputs[0][i] = new double[2];
00000041 _weights[0][i] = new double[2];
00000042 }
00000043 }
00000044
00000045 private void init_hiddenlayer()
00000046 {
00000047 _inputs[1] = new double[_hiddennodes][];
00000048 _weights[1] = new double[_hiddennodes][];
00000049 _outputs[1] = new double[_hiddennodes];
00000050
00000051 for (int i = 0; i < _hiddennodes; i++)
00000052 {
00000053 _inputs[1][i] = new double[_numinputs + 1];
00000054 _weights[1][i] = new double[_numinputs + 1];


00000055 }
00000056 }
00000057
00000058 private void init_outputlayer()
00000059 {
00000060 _inputs[2] = new double[_numoutputs][];
00000061 _weights[2] = new double[_numoutputs][];
00000062 _outputs[2] = new double[_numoutputs];
00000063
00000064 for (int i = 0; i < _numoutputs; i++)
00000065 {
00000066 _inputs[2][i] = new double[_hiddennodes + 1];
00000067 _weights[2][i] = new double[_hiddennodes + 1];
00000068 }
00000069 }
00000070
00000071 private void init_bias()
00000072 {
00000073 //Input bias
00000074 for (int i = 0; i < _numinputs; i++)
00000075 {
00000076 _inputs[0][i][1] = 1.0;
00000077 _weights[0][i][1] = 1.0;
00000078 }
00000079
00000080 //Hidden layer bias
00000081 for (int i = 0; i < _hiddennodes; i++)
00000082 {
00000083 _inputs[1][i][_numinputs] = 1.0;
00000084 _weights[1][i][_numinputs] = 1.0;


00000085 }
00000086
00000087 //Output layer bias
00000088 for (int i = 0; i < _numoutputs; i++)
00000089 {
00000090 _inputs[2][i][_hiddennodes] = 1.0;
00000091 _weights[2][i][_hiddennodes] = 1.0;
00000092 }
00000093 }
00000094
00000095 private void init_inputweights()
00000096 {
00000097 for (int i = 0; i < _numinputs; i++)
00000098 {
00000099 _weights[0][i][0] = Rand.NextDouble();
00000100 }
00000101 }
00000102
00000103 private void init_hiddenweights()
00000104 {
00000105 for (int i = 0; i < _hiddennodes; i++)
00000106 {
00000107 for (int j = 0; j < _numinputs; j++)
00000108 {
00000109 _weights[1][i][j] = Rand.NextDouble();
00000110 }
00000111 }
00000112 }
00000113
00000114 private void init_outputweights()


00000115 {
00000116 for (int i = 0; i < _numoutputs; i++)
00000117 {
00000118 for (int j = 0; j < _hiddennodes; j++)
00000119 {
00000120 _weights[2][i][j] = Rand.NextDouble();
00000121 }
00000122 }
00000123 }
00000124 #endregion
00000125
00000126 #region internal
00000127 /********************\
00000128 |* INTERNAL *|
00000129 \********************/
00000130 private double sigmoid(double input)
00000131 {
00000132 return (1 / (1 + Math.Pow(Math.E, -input)));
00000133 }
00000134
00000135 private bool Runinput()
00000136 {
00000137 bool success = true;
00000138 double sum = 0.0;
00000139
00000140 try
00000141 {
00000142 //Calculate results for this layer
00000143 for (int i = 0; i < _numinputs; i++)
00000144 {


00000145 for (int j = 0; j < 2; j++)


00000146 {
00000147 sum += (_inputs[0][i][j] * _weights[0][i][j]);
00000148 }
00000149 _outputs[0][i] = sigmoid(sum);
00000150 sum = 0.0;
00000151 }
00000152 }
00000153 catch
00000154 {
00000155 MessageBox.Show("Error processing input layer",
00000156 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000157 success = false;
00000158 }
00000159 return success;
00000160 }
00000161
00000162 private bool Runhidden()
00000163 {
00000164 bool success = true;
00000165 double sum = 0.0;
00000166
00000167 try
00000168 {
00000169 //Feed forward the results from input layer
00000170 for (int i = 0; i < _hiddennodes; i++)
00000171 {
00000172 for (int j = 0; j < _numinputs; j++)
00000173 {
00000174 _inputs[1][i][j] = _outputs[0][j];


00000175 }
00000176 }
00000177
00000178 //Calculate results for this layer
00000179 for (int i = 0; i < _hiddennodes; i++)
00000180 {
00000181 for (int j = 0; j < _numinputs + 1; j++)
00000182 {
00000183 sum += (_inputs[1][i][j] * _weights[1][i][j]);
00000184 }
00000185 _outputs[1][i] = sigmoid(sum);
00000186 sum = 0.0;
00000187 }
00000188 }
00000189 catch
00000190 {
00000191 MessageBox.Show("Error processing hidden layer",
00000192 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000193 success = false;
00000194 }
00000195
00000196 return success;
00000197 }
00000198
00000199 private bool Runoutput()
00000200 {
00000201 bool success = true;
00000202 double sum = 0.0;
00000203
00000204 try


00000205 {
00000206 //Feed forward the results from hidden layer
00000207 for (int i = 0; i < _numoutputs; i++)
00000208 {
00000209 for (int j = 0; j < _hiddennodes; j++)
00000210 {
00000211 _inputs[2][i][j] = _outputs[1][j];
00000212 }
00000213 }
00000214
00000215 //Calculate results for this layer
00000216 for (int i = 0; i < _numoutputs; i++)
00000217 {
00000218 for (int j = 0; j < _hiddennodes + 1; j++)
00000219 {
00000220 sum += (_inputs[2][i][j] * _weights[2][i][j]);
00000221 }
00000222 _outputs[2][i] = sigmoid(sum);
00000223 sum = 0.0;
00000224 }
00000225 }
00000226 catch
00000227 {
00000228 MessageBox.Show("Error processing output layer",
00000229 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000230 success = false;
00000231 }
00000232
00000233 return success;
00000234 }


00000235 #endregion
00000236
00000237 #region external
00000238 /********************\
00000239 |* EXTERNAL *|
00000240 \********************/
00000241 public bool Run(double[] inputs)
00000242 {
00000243 bool success = true;
00000244
00000245 //The numbers of inputs must match up
00000246 if (inputs.Length != _numinputs)
00000247 {
00000248 MessageBox.Show("Error: Incorrect number of inputs supplied to NN",
00000249 "Incorrect number of inputs", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000250 success = false;
00000251 }
00000252
00000253 if (success)
00000254 for (int i = 0; i < _numinputs; i++)
00000255 //Each input node has only one real input
00000256 //and a bias - the number of nodes corresponds
00000257 //to the number of inputs accepted.
00000258 _inputs[0][i][0] = inputs[i];
00000259
00000260 if(success)
00000261 success = Runinput();
00000262
00000263 if(success)
00000264 success = Runhidden();


00000265
00000266 if(success)
00000267 success = Runoutput();
00000268
00000269 return success;
00000270 }
00000271
00000272 public double[] GetOutput()
00000273 {
00000274 //Return the outputs from the
00000275 //output layer
00000276 return _outputs[2];
00000277 }
00000278 #endregion
00000279
00000280 #region constructor
00000281 /********************\
00000282 |* CONSTRUCTOR *|
00000283 \********************/
00000284 public Network(int numinputs, int hiddennodes, int numoutputs, int SetRandom)
00000285 {
00000286 //Set random number generator
00000287 Rand = new Random(SetRandom);
00000288
00000289 //Set network structure descriptors
00000290 _numinputs = numinputs;
00000291 _hiddennodes = hiddennodes;
00000292 _numoutputs = numoutputs;
00000293
00000294 //We'll always have 3 layers


00000295 //Input, Hidden, and Output


00000296 _inputs = new double[3][][];
00000297 _weights = new double[3][][];
00000298 _outputs = new double[3][];
00000299
00000300 init_inputlayer();
00000301 init_hiddenlayer();
00000302 init_outputlayer();
00000303 init_bias();
00000304 init_inputweights();
00000305 init_hiddenweights();
00000306 init_outputweights();
00000307 }
00000308 #endregion
00000309 }
00000310 }
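The Network class above is the plain feed-forward network that the genetic algorithm manipulates. As an illustration only (this fragment is not part of the submitted program), a 2-3-1 network could be constructed and run as follows; the class, constructor and methods are those listed above, while the seed and input values are arbitrary examples:

using System;
using _05025397.Model;

class NetworkExample
{
    static void Main()
    {
        //Create a 2-3-1 network with an arbitrary random seed.
        Network net = new Network(2, 3, 1, 12345);

        //Present one two-input pattern and, if the run succeeds,
        //read back the single output value. Because every layer
        //applies the sigmoid function, the output lies between 0 and 1.
        if (net.Run(new double[] { 1.0, 0.0 }))
        {
            double[] result = net.GetOutput();
            Console.WriteLine("Network output: {0:0.00000}", result[0]);
        }
    }
}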

9.1.3.2 - “GANetwork.cs”
00000001 using System;
00000002 using System.Collections.Generic;
00000003 using System.Linq;
00000004 using System.Text;
00000005 using System.Windows.Forms;
00000006 using _05025397;
00000007 using System.Collections;
00000008
00000009 //This class contains Genetic Algorithm specific functions
00000010 //in addition to the basic feed forward neural net functionality.
00000011
00000012 namespace _05025397.Model


00000013 {
00000014 public class GANetwork : Network, IComparable
00000015 {
00000016 #region data
00000017 /********************\
00000018 |* DATA *|
00000019 \********************/
00000020 double _MSqE;
00000021 #endregion
00000022
00000023 #region .NET
00000024 /********************\
00000025 |* .NET *|
00000026 \********************/
00000027 //Interface_Implementation
00000028 //Return Value Meanings:
00000029 //-<Zero: x < y
00000030 //-Zero: x == y
00000031 //->Zero: x > y
00000032 public int CompareTo(object a)
00000033 {
00000034 GANetwork b;
00000035
00000036 if (a is GANetwork)
00000037 {
00000038 b = a as GANetwork;
00000039
00000040 return _MSqE.CompareTo(b.MSqE);
00000041 }
00000042


00000043 return 0;
00000044 }
00000045 #endregion
00000046
00000047 #region external
00000048 /********************\
00000049 |* EXTERNAL *|
00000050 \********************/
00000051 public void getMsQE(double[][] inputs, double[][] outputs)
00000052 {
00000053 double sum = 0.0;
00000054 int counter = 0;
00000055 double[] netoutputs = new double[_numoutputs];
00000056
00000057 for (int i = 0; i < inputs.Length; i++)
00000058 {
00000059 base.Run(inputs[i]);
00000060 netoutputs = base.GetOutput();
00000061
00000062 for (int j = 0; j < netoutputs.Length; j++)
00000063 {
00000064 sum += (outputs[i][j] - netoutputs[j]) *
00000065 (outputs[i][j] - netoutputs[j]);
00000066 counter++;
00000067 }
00000068 }
00000069
00000070 _MSqE = (sum / counter);
00000071 }
00000072


00000073 public double[] getWeights()


00000074 {
00000075 ArrayList collater = new ArrayList();
00000076
00000077 //Collect all the weights in an array list
00000078 //then spit them out as a 1D array of doubles
00000079 try
00000080 {
00000081 for (int i = 0; i < _weights.Length; i++)
00000082 {
00000083 for (int j = 0; j < _weights[i].Length; j++)
00000084 {
00000085 for (int k = 0; k < _weights[i][j].Length; k++)
00000086 {
00000087 collater.Add(_weights[i][j][k]);
00000088 }
00000089 }
00000090 }
00000091 }
00000092 catch
00000093 {
00000094 MessageBox.Show("Fatal Error collating weights to 1D array",
00000095 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000096 Application.Exit();
00000097 }
00000098
00000099 return (double[])collater.ToArray(typeof(double));
00000100 }
00000101
00000102 public void setWeights(double[] weights)


00000103 {
00000104 //Take a 1D array of doubles and apply
00000105 //to the correct positions in our weights
00000106 //array
00000107 try
00000108 {
00000109 for (int i = 0, wc = 0; i < _weights.Length; i++)
00000110 {
00000111 for (int j = 0; j < _weights[i].Length; j++)
00000112 {
00000113 for (int k = 0; k < _weights[i][j].Length; k++, wc++)
00000114 {
00000115 _weights[i][j][k] = weights[wc];
00000116 }
00000117 }
00000118 }
00000119 }
00000120 catch
00000121 {
00000122 MessageBox.Show("Fatal Error adding 1D weight array to 3D weight array",
00000123 "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
00000124 Application.Exit();
00000125 }
00000126 }
00000127
00000128 /********************\
00000129 |* GETTERS/SETTERS *|
00000130 \********************/
00000131 public double MSqE
00000132 {


00000133 get { return _MSqE; }


00000134 set { _MSqE = value; }
00000135 }
00000136
00000137 #endregion
00000138
00000139 #region constructor
00000140 /********************\
00000141 |* CONSTRUCTOR *|
00000142 \********************/
00000143 public GANetwork(int Ninputs, int Nhnodes, int Noutputs, int SetRandom)
00000144 : base(Ninputs, Nhnodes, Noutputs, SetRandom)
00000145 {
00000146 MSqE = 0.0;
00000147 }
00000148 #endregion
00000149
00000150 }
00000151 }
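As the comment at the top of this listing notes, GANetwork adds the GA-specific operations (fitness evaluation via the mean squared error, ordering through CompareTo, and weight exchange through getWeights/setWeights) to the basic network. The following is a simplified, hypothetical sketch of how those members can be combined in one evaluate-and-recombine step; it is not the project's genetic algorithm controller, and the uniform crossover and replace-the-worst strategy shown here are illustrative assumptions only:

using System;
using System.Collections.Generic;
using _05025397.Model;

class GANetworkExample
{
    //Hypothetical single generation step: evaluate, rank and recombine.
    static void EvaluateAndRecombine(List<GANetwork> population,
                                     double[][] inputs, double[][] outputs)
    {
        //Each network's fitness is its mean squared error over the dataset.
        foreach (GANetwork net in population)
            net.getMsQE(inputs, outputs);

        //CompareTo() orders by MSqE, so the fittest network comes first.
        population.Sort();

        //Uniform crossover of the two fittest networks' weight arrays
        //(the real operators live in the GA controller, not shown here).
        double[] parentA = population[0].getWeights();
        double[] parentB = population[1].getWeights();
        double[] child = new double[parentA.Length];
        Random rand = new Random();

        for (int i = 0; i < child.Length; i++)
            child[i] = (rand.NextDouble() < 0.5) ? parentA[i] : parentB[i];

        //Overwrite the weakest individual with the new child.
        population[population.Count - 1].setWeights(child);
    }
}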


9.1.4 - DLL Wrapper for FANN functions


9.1.4.1 - “FANN_Wrapper.h”
00000001 #pragma once
00000002 #include "UnManaged_FANN_BackProp.h"
00000003 #include "UnManaged_FANN_CasC.h"
00000004
00000005 using namespace System;
00000006 using namespace System::Runtime::InteropServices;
00000007
00000008 namespace FANN_Wrapper
00000009 {
00000010 public ref class FANN_Cascade
00000011 //Managed wrapper for UnManaged_FANN_CasC
00000012 //which is a C++ interface to the C programmed
00000013 //FANN dll.
00000014 {
00000015 public:
00000016 /********************\
00000017 |* DATA *|
00000018 \********************/
00000019 UnManaged_FANN_CasC* UMWrapper;
00000020
00000021 public:
00000022 /********************\
00000023 |* EXTERNAL *|
00000024 \********************/
00000025 FANN_Cascade(int InputL, int OutputL,
00000026 double LearnRate, double TargetErr,


00000027 int ReportInterval, int MaxNeurons);


00000028 ~FANN_Cascade(void);
00000029 !FANN_Cascade(void);
00000030
00000031 bool Train(void);
00000032 bool Test(void);
00000033
00000034 public:
00000035 /********************\
00000036 |* GETTERS/SETTERS *|
00000037 \********************/
00000038 property String^ _te_output
00000039 {
00000040 String^ get()
00000041 {
00000042 return gcnew String((wchar_t *) UMWrapper->get_report_test());
00000043 }
00000044 }
00000045
00000046 property String^ _tr_output
00000047 {
00000048 String^ get()
00000049 {
00000050 return gcnew String((wchar_t *) UMWrapper->get_report_train());
00000051 }
00000052 }
00000053
00000054 property array<double>^ besterrorlist
00000055 {
00000056 array<double>^ get()


00000057 {
00000058 int size;
00000059 const double *ar = UMWrapper->get_besterrs(size);
00000060
00000061 array<double>^ arout =
00000062 gcnew array<double>(size);
00000063
00000064
00000065 for(int i = 0; i < size; i++)
00000066 {
00000067 arout[i] = ar[i];
00000068 }
00000069
00000070 return arout;
00000071 }
00000072 }
00000073 };
00000074
00000075 public ref class FANN_BackProp
00000076 //Managed wrapper for UnManaged_FANN_Backprop
00000077 //which is a C++ interface to the C programmed
00000078 //FANN dll.
00000079 {
00000080 public:
00000081 /********************\
00000082 |* DATA *|
00000083 \********************/
00000084 UnManaged_FANN_Backprop* UMWrapper;
00000085
00000086 public:


00000087 /********************\
00000088 |* EXTERNAL *|
00000089 \********************/
00000090 FANN_BackProp(int InputL, int HiddenL, int OutputL,
00000091 double LearnRate, double TargetErr,
00000092 int ReportInterval, int MaximumIteration);
00000093 ~FANN_BackProp(void);
00000094 !FANN_BackProp(void);
00000095
00000096 bool Train(void);
00000097 bool Test(void);
00000098
00000099 public:
00000100 /********************\
00000101 |* GETTERS/SETTERS *|
00000102 \********************/
00000103 property String^ _te_output
00000104 {
00000105 String^ get()
00000106 {
00000107 return gcnew String((wchar_t *) UMWrapper->get_report_test());
00000108 }
00000109 }
00000110
00000111 property String^ _tr_output
00000112 {
00000113 String^ get()
00000114 {
00000115 return gcnew String((wchar_t *) UMWrapper->get_report_train());
00000116 }


00000117 }
00000118
00000119 property array<double>^ besterrorlist
00000120 {
00000121 array<double>^ get()
00000122 {
00000123 int size;
00000124 const double *ar = UMWrapper->get_besterrs(size);
00000125
00000126 array<double>^ arout =
00000127 gcnew array<double>(size);
00000128
00000129
00000130 for(int i = 0; i < size; i++)
00000131 {
00000132 arout[i] = ar[i];
00000133 }
00000134
00000135 return arout;
00000136 }
00000137 }
00000138 };
00000139 }


9.1.4.2 - “FANN_Wrapper.cpp”
00000001 #include "stdafx.h"
00000002 #include "FANN_Wrapper.h"
00000003
00000004 namespace FANN_Wrapper
00000005 {
00000006 /********************\
00000007 |* FANN_BackProp *|
00000008 \********************/
00000009
00000010 //Constructor
00000011 FANN_BackProp::FANN_BackProp(int InputL, int HiddenL, int OutputL,
00000012 double LearnRate, double TargetErr,
00000013 int ReportInterval, int MaximumIteration)
00000014 {
00000015 UMWrapper = new UnManaged_FANN_Backprop(
00000016 InputL, HiddenL, OutputL,
00000017 LearnRate, TargetErr,
00000018 ReportInterval, MaximumIteration);
00000019 }
00000020
00000021 //Destructor
00000022 FANN_BackProp::~FANN_BackProp(void)
00000023 {
00000024 if (UMWrapper)
00000025 {
00000026 delete UMWrapper;
00000027 UMWrapper = 0;
00000028 }
00000029 }


00000030
00000031 //Finalizer
00000032 FANN_BackProp::!FANN_BackProp(void)
00000033 {
00000034 if (UMWrapper)
00000035 {
00000036 delete UMWrapper;
00000037 UMWrapper = 0;
00000038 }
00000039 }
00000040
00000041
00000042 //Train
00000043 bool FANN_BackProp::Train(void)
00000044 {
00000045 return UMWrapper->Train();
00000046 }
00000047
00000048 //Test
00000049 bool FANN_BackProp::Test(void)
00000050 {
00000051 return UMWrapper->Test();
00000052 }
00000053
00000054 /********************\
00000055 |* FANN_Cascade *|
00000056 \********************/
00000057
00000058 //Constructor
00000059 FANN_Cascade::FANN_Cascade(int InputL, int OutputL,


00000060 double LearnRate, double TargetErr,


00000061 int ReportInterval, int MaxNeurons)
00000062 {
00000063 UMWrapper = new UnManaged_FANN_CasC(
00000064 InputL, OutputL,
00000065 LearnRate, TargetErr,
00000066 ReportInterval, MaxNeurons);
00000067 }
00000068
00000069 //Destructor
00000070 FANN_Cascade::~FANN_Cascade(void)
00000071 {
00000072 if (UMWrapper)
00000073 {
00000074 delete UMWrapper;
00000075 UMWrapper = 0;
00000076 }
00000077 }
00000078
00000079 //Finalizer
00000080 FANN_Cascade::!FANN_Cascade(void)
00000081 {
00000082 if (UMWrapper)
00000083 {
00000084 delete UMWrapper;
00000085 UMWrapper = 0;
00000086 }
00000087 }
00000088
00000089 //Train


00000090 bool FANN_Cascade::Train(void)


00000091 {
00000092 return UMWrapper->Train();
00000093 }
00000094
00000095 //Test
00000096 bool FANN_Cascade::Test(void)
00000097 {
00000098 return UMWrapper->Test();
00000099 }
00000100 }


9.1.4.3 - “UnManaged_FANN_BackProp.h”
00000001 #pragma once
00000002 #include <iostream>
00000003 #include <string>
00000004 #include <sstream>
00000005 #include <vector>
00000006 #include <doublefann.h>
00000007 #include <fann_cpp.h>
00000008
00000009 class UnManaged_FANN_Backprop
00000010 //An interface to the C programmed
00000011 //FANN dll.
00000012 {
00000013 private:
00000014 /********************\
00000015 |* DATA *|
00000016 \********************/
00000017 //Network Structure
00000018 int _InputL;
00000019 int _HiddenL;
00000020 int _OutputL;
00000021
00000022 //Training Parameters
00000023 double _LearnRate;
00000024 double _TargetErr;
00000025 int _ReportInterval;
00000026 int _MaximumIteration;
00000027 double _Momentum;
00000028
00000029 //Output for C#


00000030 std::wstringstream tr_output;


00000031 std::wstringstream te_output;
00000032 wchar_t *wchoutput;
00000033
00000034 std::vector<double> vecbsterr;
00000035 double *bsterr;
00000036
00000037 //FANN Data
00000038 struct fann *ann;
00000039 struct fann_train_data *data;
00000040
00000041 public:
00000042 /********************\
00000043 |* EXTERNAL *|
00000044 \********************/
00000045 //Constructor
00000046 UnManaged_FANN_Backprop(int InputL, int HiddenL, int OutputL,
00000047 double LearnRate, double TargetErr,
00000048 int ReportInterval, int MaximumIteration);
00000049
00000050 //Destructor
00000051 ~UnManaged_FANN_Backprop(void);
00000052
00000053 //Interface functions
00000054 //accessed from C#
00000055 bool Train(void);
00000056 bool Test(void);
00000057
00000058 public:
00000059 /********************\


00000060 |* GETTERS/SETTERS *|
00000061 \********************/
00000062 const wchar_t* get_report_train(void);
00000063 const wchar_t* get_report_test(void);
00000064
00000065 const double* get_besterrs(int &size);
00000066 };


9.1.4.4 - “UnManaged_FANN_BackProp.cpp”
00000001 #include "StdAfx.h"
00000002 #include "UnManaged_FANN_BackProp.h"
00000003
00000004 //Constructor
00000005 UnManaged_FANN_Backprop::UnManaged_FANN_Backprop
00000006 (int InputL, int HiddenL, int OutputL,
00000007 double LearnRate, double TargetErr,
00000008 int ReportInterval, int MaximumIteration)
00000009 {
00000010 _InputL = InputL;
00000011 _HiddenL = HiddenL;
00000012 _OutputL = OutputL;
00000013
00000014 _LearnRate = LearnRate;
00000015 _TargetErr = TargetErr;
00000016 _ReportInterval = ReportInterval;
00000017 _MaximumIteration = MaximumIteration;
00000018
00000019 wchoutput = NULL;
00000020 bsterr = NULL;
00000021 data = NULL;
00000022 ann = NULL;
00000023 }
00000024
00000025 //Destructor
00000026 UnManaged_FANN_Backprop::~UnManaged_FANN_Backprop()
00000027 {
00000028
00000029 if (data) { fann_destroy_train(data); data = NULL; }


00000030 if (ann) { fann_destroy(ann); ann = NULL; }
00000031 delete[] bsterr;
00000032 bsterr = NULL;
00000033
00000034 delete[] wchoutput;
00000035 wchoutput = NULL;
00000036 }
00000037
00000038 //Train
00000039 bool UnManaged_FANN_Backprop::Train(void)
00000040 {
00000041 static int firstrun = false;
00000042
00000043 bool success = true;
00000044 int reportcounter = 0;
00000045 double error;
00000046
00000047 ann = fann_create_standard(3, _InputL, _HiddenL, _OutputL);
00000048
00000049 try
00000050 {
00000051 data = fann_read_train_from_file("dlltrdata.dat");
00000052 }
00000053 catch(...)
00000054 {
00000055 throw("");
00000056 success = false;
00000057 }
00000058
00000059 if (fann_num_input_train_data(data) != _InputL)


00000060 {
00000061 throw("");
00000062 success = false;
00000063 }
00000064
00000065
00000066 if (fann_num_output_train_data(data) != _OutputL)
00000067 {
00000068 throw("");
00000069 success = false;
00000070 }
00000071
00000072 fann_set_activation_steepness_hidden(ann, 1);
00000073 fann_set_activation_steepness_output(ann, 1);
00000074
00000075 //Sigmoid Activation Functions (the same one
00000076 //the GA uses).
00000077 fann_set_activation_function_hidden
00000078 (ann, FANN_SIGMOID);
00000079 fann_set_activation_function_output
00000080 (ann, FANN_SIGMOID);
00000081
00000082 //Standard backprop
00000083 fann_set_training_algorithm(ann, FANN_TRAIN_BATCH);
00000084
00000085 //Set the learning rate
00000086 fann_set_learning_rate(ann, (float) _LearnRate);
00000087
00000088 //Same range the GA's weights are
00000089 //initialised too


00000090 fann_randomize_weights(ann, 0.0, 1.0);


00000091
00000092 //Training Loop
00000093 for (int i = 0; i < _MaximumIteration; i++)
00000094 {
00000095 //Train one epoch then check the
00000096 //mean squared error
00000097 error = (double) fann_train_epoch(ann, data);
00000098
00000099 if (reportcounter == _ReportInterval)
00000100 {
00000101 tr_output << "Best error at iteration [ "
00000102 << i << " ] was " << error << ".\n";
00000103
00000104 vecbsterr.push_back(error);
00000105
00000106 reportcounter = 0;
00000107 }
00000108 reportcounter++;
00000109
00000110 if (error < _TargetErr)
00000111 {
00000112 tr_output << "\nNetwork matching or improving upon target error" <<
00000113 "found at iteration [ " << i << " ].";
00000114 i = _MaximumIteration + 1;
00000115 }
00000116 }
00000117
00000118 fann_destroy_train(data); data = NULL;
00000119


00000120 firstrun = true;


00000121 return true;
00000122 }
00000123
00000124 //Test
00000125 bool UnManaged_FANN_Backprop::Test(void)
00000126 {
00000127 fann_type *calc_out = NULL;
00000128 double *error = NULL;
00000129
00000130 data = fann_read_train_from_file("dlltedata.dat");
00000131
00000132 fann_scale_train_data(data, 0, 1);
00000133
00000134 error = new double[fann_length_train_data(data)];
00000135
00000136 te_output << "\n\n\t\t\t~~Network Testing~~";
00000137 for (unsigned int i = 0; i < fann_length_train_data(data); i++)
00000138 {
00000139 calc_out = fann_run(ann, data->input[i]);
00000140
00000141 te_output << "\nWith inputs";
00000142 for (int j = 0; j < _InputL; j++)
00000143 {
00000144 te_output << " [" << data->input[i][j] << "] ";
00000145 }
00000146
00000147 te_output << "\nOutput achieved was";
00000148 for (int j = 0; j < _OutputL; j++)
00000149 {


00000150 te_output << " [" << calc_out[j] << "] ";
00000151 }
00000152
00000153 te_output << "\nOutput Desired was";
00000154 for (int j = 0; j < _OutputL; j++)
00000155 {
00000156 te_output << " [" << data->output[i][j] << "] ";
00000157 }
00000158 }
00000159
00000160 te_output << "\nMean Squared Error with these inputs and outputs is:";
00000161 te_output << fann_test_data(ann, data);
00000162
00000163 delete[] error;
00000164 //calc_out points at memory owned by the FANN library, so it is not deleted here
00000165
00000166 return true;
00000167 }
00000168
00000169 //get_report
00000170 const wchar_t* UnManaged_FANN_Backprop::get_report_train(void)
00000171 {
00000172 if (wchoutput != NULL)
00000173 {
00000174 delete[] wchoutput;
00000175 wchoutput = NULL;
00000176 }
00000177
00000178 wchoutput = new wchar_t[tr_output.str().length() + 1];
00000179 wcscpy(wchoutput, tr_output.str().c_str());


00000180
00000181 return wchoutput;
00000182 }
00000183
00000184 const wchar_t* UnManaged_FANN_Backprop::get_report_test(void)
00000185 {
00000186 if (wchoutput != NULL)
00000187 {
00000188 delete[] wchoutput;
00000189 wchoutput = NULL;
00000190 }
00000191
00000192 wchoutput = new wchar_t[te_output.str().length() + 1];
00000193 wcscpy(wchoutput, te_output.str().c_str());
00000194
00000195 return wchoutput;
00000196 }
00000197
00000198
00000199 //get_besterrs
00000200 const double* UnManaged_FANN_Backprop::get_besterrs(int &size)
00000201 {
00000202 if (bsterr != NULL)
00000203 {
00000204 delete[] bsterr;
00000205 bsterr = NULL;
00000206 }
00000207
00000208 bsterr = new double[vecbsterr.size()];
00000209 size = vecbsterr.size();


00000210
00000211 for (unsigned int i = 0; i < vecbsterr.size(); i++)
00000212 {
00000213 bsterr[i] = vecbsterr[i];
00000214 }
00000215
00000216 vecbsterr.clear();
00000217
00000218 return bsterr;
00000219 }

9.1.4.5 - “UnManaged_FANN_CasC.h”
00000001 #pragma once
00000002 #include <iostream>
00000003 #include <string>
00000004 #include <sstream>
00000005 #include <vector>
00000006 #include <doublefann.h>
00000007
00000008 class UnManaged_FANN_CasC
00000009 {
00000010 public:
00000011 /********************\
00000012 |* DATA *|
00000013 \********************/
00000014 //Network Structure
00000015 int _InputL;
00000016 int _OutputL;
00000017
00000018 //Training Parameters


00000019 double _LearnRate;


00000020 double _TargetErr;
00000021 int _ReportInterval;
00000022 int _MaxNeurons;
00000023
00000024 //Output for C#
00000025 static std::wstringstream tr_output;
00000026 std::wstringstream te_output;
00000027 wchar_t *wchoutput;
00000028
00000029 static std::vector<double> vecbsterr;
00000030 double *bsterr;
00000031
00000032 //FANN Data
00000033 struct fann *ann;
00000034 struct fann_train_data *data;
00000035
00000036 public:
00000037 /********************\
00000038 |* EXTERNAL *|
00000039 \********************/
00000040
00000041 //Destructor
00000042 ~UnManaged_FANN_CasC(void);
00000043
00000044 //Constructor
00000045 UnManaged_FANN_CasC
00000046 (int InputL, int OutputL,
00000047 double LearnRate, double TargetErr,
00000048 int ReportInterval, int MaxNeurons);


00000049
00000050 //Interface functions
00000051 //accessed from C#
00000052 bool Train(void);
00000053 bool Test(void);
00000054
00000055 static int FANN_API Report_Callback
00000056 (struct fann *ann, struct fann_train_data *train,
00000057 unsigned int max_epochs, unsigned int epochs_between_reports,
00000058 float desired_error, unsigned int epochs);
00000059
00000060 public:
00000061 /********************\
00000062 |* GETTERS/SETTERS *|
00000063 \********************/
00000064 const wchar_t* get_report_train(void);
00000065 const wchar_t* get_report_test(void);
00000066
00000067 const double* get_besterrs(int &size);
00000068 };


9.1.4.6 - “UnManaged_FANN_CasC.cpp”
00000001 #include "StdAfx.h"
00000002 #include "UnManaged_FANN_CasC.h"
00000003
00000004 //Static variable declarations
00000005 std::wstringstream UnManaged_FANN_CasC::tr_output;
00000006 std::vector<double> UnManaged_FANN_CasC::vecbsterr;
00000007
00000008 //Constructor
00000009 UnManaged_FANN_CasC::UnManaged_FANN_CasC
00000010 (int InputL, int OutputL,
00000011 double LearnRate, double TargetErr,
00000012 int ReportInterval, int MaxNeurons)
00000013 {
00000014 _InputL = InputL;
00000015 _OutputL = OutputL;
00000016
00000017 _LearnRate = LearnRate;
00000018 _TargetErr = TargetErr;
00000019 _ReportInterval = ReportInterval;
00000020 _MaxNeurons = MaxNeurons;
00000021
00000022 wchoutput = NULL;
00000023 bsterr = NULL;
00000024 ann = NULL;
00000025 data = NULL;
00000026 }
00000027
00000028 //Destructor
00000029 UnManaged_FANN_CasC::~UnManaged_FANN_CasC(void)


00000030 {
00000031 delete[] bsterr;
00000032 bsterr = NULL;
00000033
00000034 if (data) { fann_destroy_train(data); data = NULL; }
00000035 if (ann) { fann_destroy(ann); ann = NULL; }
00000036 delete[] wchoutput;
00000037 wchoutput = NULL;
00000038 }
00000039
00000040
00000041 //get_report
00000042 const wchar_t* UnManaged_FANN_CasC::get_report_train(void)
00000043 {
00000044 if (wchoutput != NULL)
00000045 {
00000046 delete[] wchoutput;
00000047 wchoutput = NULL;
00000048 }
00000049
00000050 wchoutput = new wchar_t[tr_output.str().length() + 1];
00000051 wcscpy(wchoutput, tr_output.str().c_str());
00000052 return wchoutput;
00000053 }
00000054
00000055 const wchar_t* UnManaged_FANN_CasC::get_report_test(void)
00000056 {
00000057 if (wchoutput != NULL)
00000058 {
00000059 delete[] wchoutput;


00000060 wchoutput = NULL;


00000061 }
00000062
00000063 wchoutput = new wchar_t[te_output.str().length() + 1];
00000064 wcscpy(wchoutput, te_output.str().c_str());
00000065 return wchoutput;
00000066 }
00000067
00000068 //get_besterrs
00000069 const double* UnManaged_FANN_CasC::get_besterrs(int &size)
00000070 {
00000071 if (bsterr != NULL)
00000072 {
00000073 delete[] bsterr;
00000074 bsterr = NULL;
00000075 }
00000076
00000077 bsterr = new double[vecbsterr.size()];
00000078 size = vecbsterr.size();
00000079
00000080 for (unsigned int i = 0; i < vecbsterr.size(); i++)
00000081 {
00000082 bsterr[i] = vecbsterr[i];
00000083 }
00000084
00000085 vecbsterr.clear();
00000086
00000087 return bsterr;
00000088 }
00000089


00000090 //Train
00000091 bool UnManaged_FANN_CasC::Train()
00000092 {
00000093 int reportcounter = 0;
00000094
00000095 ann = fann_create_shortcut(2, _InputL, _OutputL);
00000096
00000097 try
00000098 {
00000099 data = fann_read_train_from_file("dlltrdata.dat");
00000100 }
00000101 catch(...)
00000102 {
00000103 throw("");
00000104 }
00000105
00000106 if (fann_num_input_train_data(data) != _InputL)
00000107 throw("");
00000108
00000109 if (fann_num_output_train_data(data) != _OutputL)
00000110 throw("");
00000111
00000112 fann_set_learning_rate(ann, (float)_LearnRate);
00000113
00000114 //Some more network customisation here
00000115 //might be nice in future.
00000116
00000117 fann_set_quickprop_decay(ann, 0.0);
00000118
00000119 fann_set_quickprop_mu(ann, 2.0);


00000120
00000121 fann_set_cascade_weight_multiplier(ann, 1);
00000122
00000123 fann_set_cascade_max_out_epochs(ann, 150);
00000124
00000125 fann_set_activation_steepness_output(ann, 1);
00000126 fann_set_activation_steepness_hidden(ann, 1);
00000127
00000128 fann_set_learning_rate(ann, (float) _LearnRate);
00000129
00000130 fann_set_training_algorithm(ann, FANN_TRAIN_QUICKPROP);
00000131
00000132 fann_set_activation_function_hidden(ann, FANN_SIGMOID);
00000133
00000134 fann_set_activation_function_output(ann, FANN_SIGMOID);
00000135
00000136 fann_randomize_weights(ann, 0.0, 1.0);
00000137 fann_set_cascade_output_change_fraction(ann, 0.01f);
00000138 fann_set_cascade_candidate_change_fraction(ann, 0.01f);
00000139 fann_set_train_stop_function(ann, FANN_STOPFUNC_MSE);
00000140
00000141 fann_set_callback(ann, UnManaged_FANN_CasC::Report_Callback);
00000142
00000143 fann_cascadetrain_on_data(ann, data, _MaxNeurons, _ReportInterval, (float) _TargetErr);
00000144
00000145 return true;
00000146 }
00000147
00000148 //Callback for reporting
00000149 int FANN_API UnManaged_FANN_CasC::Report_Callback


00000150 (struct fann *ann, struct fann_train_data *train,


00000151 unsigned int max_epochs, unsigned int epochs_between_reports,
00000152 float desired_error, unsigned int epochs)
00000153 {
00000154 double MSqE = fann_get_MSE(ann);
00000155 static int node = 1;
00000156
00000157 tr_output << "Best error at node [ "
00000158 << node << " ] was " << MSqE << ".\n";
00000159
00000160 vecbsterr.push_back(MSqE);
00000161
00000162 if (MSqE <= desired_error)
00000163 {
00000164 tr_output <<
00000165 "\nNetwork matching or improving upon target error" <<
00000166 "found at node [ " << node << " ].";
00000167 }
00000168 node++;
00000169 return 0;
00000170 }
00000171
00000172 //Test
00000173 bool UnManaged_FANN_CasC::Test()
00000174 {
00000175 fann_type *calc_out = NULL;
00000176 double *error = NULL;
00000177
00000178 data = fann_read_train_from_file("dlltedata.dat");
00000179 fann_scale_train_data(data, 0, 1);


00000180 fann_set_callback(ann, NULL);


00000181
00000182 error = new double[fann_length_train_data(data)];
00000183
00000184 te_output << "\n\n\t\t\t~~Network Testing~~";
00000185 for (unsigned int i = 0; i < fann_length_train_data(data); i++)
00000186 {
00000187 calc_out = fann_run(ann, data->input[i]);
00000188
00000189 te_output << "\nWith inputs:";
00000190 for (int j = 0; j < _InputL; j++)
00000191 {
00000192 te_output << " [" << data->input[i][j] << "] ";
00000193 }
00000194
00000195 te_output << "\nOutput achieved was:";
00000196 for (int j = 0; j < _OutputL; j++)
00000197 {
00000198 te_output << " [" << calc_out[j] << "] ";
00000199 }
00000200
00000201 te_output << "\nOutput Desired was:";
00000202 for (int j = 0; j < _OutputL; j++)
00000203 {
00000204 te_output << " [" << data->output[i][j] << "] ";
00000205 }
00000206 }
00000207
00000208 te_output << "\nMean Squared Error with these inputs and outputs is:";
00000209 te_output << fann_test_data(ann, data);


00000210
00000211 delete[] error;
00000212 //calc_out points at memory owned by the FANN library, so it is not deleted here
00000213
00000214 return true;
00000215 }


9.2 - UML Class Diagram


The class diagram on the following pages represents the structure of the various classes
within the program and how they combine to form the complete application.


[UML class diagram: the managed wrapper classes FANN_Cascade and FANN_BackProp (used by the CascadeCorrelation and BackProp controller classes respectively) each expose a constructor taking the data needed to fill their attributes, plus Train() : bool and Test() : bool, and each holds a one-to-one UMWrapper reference to UnManaged_FANN_CasC and UnManaged_FANN_Backprop. Both unmanaged classes carry the network structure and training parameters (_InputL, _OutputL, _LearnRate, _TargetErr, _ReportInterval, _MaximumIteration or _MaxNeurons), the report buffers (tr_output, te_output, wchoutput, vecbsterr, bsterr) and the FANN structures (ann, data), and both depend on the FANN library.]

9.3 - Libraries Researched


• GALib

o Matthew Wall

o http://lancet.mit.edu/ga/

o Accessed: 31/10/2008


• Libneural

o D. Franklin

o http://ieee.uow.edu/au/~daniel/software/libneural/

o Accessed: 31/10/2008

• Libann

o Anonymous

o http://www.nongnu.org/libann/index.html

o Accessed: 31/10/2008

• Flood

o Roberto Lopez

o http://www.cimne.com/flood/

o Accessed: 31/10/2008

• Annie

o Asim Shankar

o http://annie.sourceforge.net/

o Accessed: 31/10/2008

• FANN (Fast Artificial Neural Network Library)

o Steffen Nissen


o http://leenissen.dk/fann/index.php

o Accessed: 31/10/2008

9.4 - Datasets
• XOR (a sketch of this data in FANN's training-file format is shown after this list)

  Input 1   Input 2   Output
     1         0         1
     1         1         0
     0         1         1
     0         0         0

• Fishers Iris Data


o http://www.math.uah.edu/stat/data/Fisher.xhtml
• Virus Classification
o http://www.stats.ox.ac.uk/pub/PRNN/README.html
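As an illustration of how a dataset such as XOR is presented to the FANN-based trainers (which read their data from files such as "dlltrdata.dat"), the XOR table above would look roughly as follows in FANN's documented training-file layout: a header line giving the number of training pairs, the number of inputs and the number of outputs, followed by alternating input and output lines. The exact file the application writes depends on the data loaded into its text boxes, so this is only a sketch:

4 2 1
1 0
1
1 1
0
0 1
1
0 0
0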


9.5 - Objectives
FINAL YEAR DEGREE PROJECT

Project Objectives & Deliverables

• Research the fields of genetic algorithms and neural networks.

• Research current information on training neural networks using genetic algorithms.

• Build an application that is capable of training a neural network with a genetic algorithm and with a back-
propagation system, and that allows manual adjustment of relevant variables.

• Build a user-interface for the aforementioned application allowing for easy alteration of appropriate
variables.

• Using the above application, as well as previous research, evaluate training methods for neural networks.

• Identify the most effective ways of training neural networks using genetic algorithms, and in which cases
these methods are suitable.

• Deliverable one: A research document describing and detailing the research done for objectives one and
two.

• Deliverable two: An application meeting the requirements of objectives three and four, and a document
describing its implementation and usage.

• Deliverable three: A document detailing and explaining the evaluation of different training techniques, and
the conclusions derived from that evaluation.


10 - Works Cited
Barber, S. (2006). AI: Neural Network for Beginners pt. 1. Retrieved April 13, 2009, from Codeproject: www.codeproject.com/KB/recipes/NeuralNetwork_1.aspx

Blum, A. (1992). Neural Network in C++. Wiley Professional Computing.

Bourg, D. M., & Seeman, G. (2004). Chapter 14: Neural Networks. In D. M. Bourg, & G. Seeman, AI for Game Developers (pp. 356-358). O'Reilly.

Branke, J. (1995). Curved trajectory prediction using a self-organizing neural network. Retrieved April 13, 2009, from http://www.aifb.uni-karlsruhe.de/~jbr/Papers/GaNNOverview.ps.gz

Darwin, C. (1859). On the origin of the species by means of natural selection, or the preservation of favoured races in the struggle for life. London: John Murray.

Davis, C. E. (2001). Graphics processing unit computation of neural networks. Retrieved April 13, 2009, from http://www.cs.unm.edu/~chris2d/papers/CED_thesis.pdf

Fahlman, S. E. (1988). An Empirical Study of Learning Speed in Back-Propagation Networks. Retrieved April 16, 2009, from School of Computer Science, Carnegie Mellon: http://www-2.cs.cmu.edu/afs/cs.cmu.edu/user/sef/www/publications/qp-tr.ps

Fahlman, S. E., & Lebiere, C. (1991). The Cascade-Correlation Learning Architecture. Retrieved April 16, 2009, from School of Computer Science, Carnegie Mellon: http://www-2.cs.cmu.edu/afs/cs.cmu.edu/user/sef/www/publications/cascor-tr.ps

Lysle, S. (2007, January 16). Passing Data between Windows Forms. Retrieved April 16, 2009, from C# Corner: http://www.c-sharpcorner.com/UploadFile/scottlysle/PassData01142007013005AM/PassData.aspx

Marczyk, A. (2004). Genetic algorithms and evolutionary computation. Retrieved April 13, 2009, from http://www.talkorigins.com/faqs/genalg/genalg.html

Marshall, J. A., & Srikanth, V. (1999). Curved trajectory prediction using a self-organizing neural network. Retrieved April 15, 2009, from Computer Science - UNC-Chapel Hill: http://www.cs.unc.edu/~marshall/WWW/PAPERS/curve9912.ps.gz

Matthews, J. (2002). Back-propagation for the Uninitiated. Retrieved April 15, 2009, from generation5: http://www.generation5.org/content/2002/bp.asp

McCulloch, W., & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 5, 115-133.

Microsoft. (2009). BackgroundWorker Class. Retrieved April 16, 2009, from MSDN: http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx

Microsoft. (2009). Using C++ Interop (Implicit PInvoke). Retrieved April 16, 2009, from MSDN: http://msdn.microsoft.com/en-us/library/2x8kf7zx.aspx

Mitchell, M. (1996). An introduction to genetic algorithms. Massachusetts: MIT Press.

Nissen, S. (n.d.). Retrieved January 21, 2009, from FANN: http://leenissen.dk/fann/

Parker, D. B. (1982). Learning Logic. Invention Report S81-64, Stanford University, Office of Technology Licensing.

Robot Control. (2008). Retrieved April 15, 2009, from Artificial Neural Networks: http://www.learnartificialneuralnetworks.com/robotcontrol.htm

Rosenblatt, F. (1958). The perceptron: A probabilistic theory for information storage and organization in the brain. Psychological Review, 65, 386-408.

Stergiou, C., & Dimitrios, S. (1997). Neural Networks. Retrieved April 15, 2009, from Imperial College London: http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html#Contents

Werbos, P. (1974). Beyond Regression: New tools for prediction and analysis in the behavioral sciences. PhD, Harvard University.

ZedGraph. (n.d.). Retrieved January 21, 2009, from http://zedgraph.org/wiki/index.php?title=Main_Page


11 - Bibliography
Barber, S. (2006). AI: Neural Network for Beginners pt. 1. Retrieved April 13, 2009, from Codeproject: www.codeproject.com/KB/recipes/NeuralNetwork_1.aspx

Barber, S. (2006). AI: Neural Network for Beginners pt. 2. Retrieved April 13, 2009, from Codeproject: http://www.codeproject.com/KB/recipes/Backprop_ANN.aspx

Barber, S. (2006). AI: Neural Network for Beginners pt. 3. Retrieved April 13, 2009, from Codeproject: http://www.codeproject.com/KB/cs/GA_ANN_XOR.aspx

Blum, A. (1992). Neural Network in C++. Wiley Professional Computing.

Bourg, D. M., & Seeman, G. (2004). Chapter 14: Neural Networks. In D. M. Bourg, & G. Seeman, AI for Game Developers (pp. 356-358). O'Reilly.

Branke, J. (1995). Curved trajectory prediction using a self-organizing neural network. Retrieved April 13, 2009, from http://www.aifb.uni-karlsruhe.de/~jbr/Papers/GaNNOverview.ps.gz

Darwin, C. (1859). On the origin of the species by means of natural selection, or the preservation of favoured races in the struggle for life. London: John Murray.

Davis, C. E. (2001). Graphics processing unit computation of neural networks. Retrieved April 13, 2009, from http://www.cs.unm.edu/~chris2d/papers/CED_thesis.pdf

DiLascia, P. (2002). Call Unmanaged DLLs from C#, Killing Processes Cleanly. Retrieved April 13, 2009, from MSDN: http://msdn.microsoft.com/en-us/magazine/cc301501.aspx

DLL Tutorial for Beginners. (2007, August 6). Retrieved April 13, 2009, from Codeguru: http://www.codeguru.com/Cpp/Cpp/cpp_mfc/tutorials/article.php/c9855/

Fahlman, S. E. (1988). An Empirical Study of Learning Speed in Back-Propagation Networks. Retrieved April 16, 2009, from School of Computer Science, Carnegie Mellon: http://www-2.cs.cmu.edu/afs/cs.cmu.edu/user/sef/www/publications/qp-tr.ps

Fahlman, S. E., & Lebiere, C. (1991). The Cascade-Correlation Learning Architecture. Retrieved April 16, 2009, from School of Computer Science, Carnegie Mellon: http://www-2.cs.cmu.edu/afs/cs.cmu.edu/user/sef/www/publications/cascor-tr.ps

GoogleChartSharp. (n.d.). Retrieved January 21, 2009, from http://code.google.com/p/googlechartsharp/

Lysle, S. (2007, January 16). Passing Data between Windows Forms. Retrieved April 16, 2009, from C# Corner: http://www.c-sharpcorner.com/UploadFile/scottlysle/PassData01142007013005AM/PassData.aspx

Marczyk, A. (2004). Genetic algorithms and evolutionary computation. Retrieved April 13, 2009, from http://www.talkorigins.com/faqs/genalg/genalg.html

Marshall, J. A., & Srikanth, V. (1999). Curved trajectory prediction using a self-organizing neural network. Retrieved April 15, 2009, from Computer Science - UNC-Chapel Hill: http://www.cs.unc.edu/~marshall/WWW/PAPERS/curve9912.ps.gz

Matthews, J. (2002). Back-propagation for the Uninitiated. Retrieved April 15, 2009, from generation5: http://www.generation5.org/content/2002/bp.asp

Matthews, J. (2000). Back-Propagation: CBPNet. Retrieved April 15, 2009, from generation5: http://www.generation5.org/content/2000/cbpnet.asp

Matthews, J. (n.d.). generation5. Retrieved January 26, 2009, from XORGA: http://www.generation5.org/content/2000/xorga.asp

McCulloch, W., & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 5, 115-133.

Microsoft. (2009). BackgroundWorker Class. Retrieved April 16, 2009, from MSDN: http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx

Microsoft. (2009). Using C++ Interop (Implicit PInvoke). Retrieved April 16, 2009, from MSDN: http://msdn.microsoft.com/en-us/library/2x8kf7zx.aspx

Mitchell, M. (1996). An introduction to genetic algorithms. Massachusetts: MIT Press.

Nguyen, D., & Widrow, B. (1990). Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights. Neural Networks, 1990., 1990 IJCNN International Joint Conference on, 21-26.

Nissen, S. (n.d.). Retrieved January 21, 2009, from FANN: http://leenissen.dk/fann/

Nissen, S. (n.d.). Retrieved January 26, 2009, from Datatypes: http://leenissen.dk/fann/html/files/fann_data-h.html#fann_train_enum

Nissen, S. (n.d.). Reference Manual. Retrieved January 21, 2009, from FANN: http://leenissen.dk/fann/html/files/fann-h.html

Nissen, S. (n.d.). struct fann. Retrieved January 21, 2009, from FANN: http://leenissen.dk/fann/fann_1_2_0/r1597.html

Nissen, S. (n.d.). struct fann_train_data. Retrieved January 21, 2009, from FANN: http://leenissen.dk/fann/fann_1_2_0/r1837.html

Nplot Charting Library for .NET. (n.d.). Retrieved January 21, 2009, from http://netcontrols.org/nplot/wiki/

Parker, D. B. (1982). Learning Logic. Invention Report S81-64, Stanford University, Office of Technology Licensing.

Robot Control. (2008). Retrieved April 15, 2009, from Artificial Neural Networks: http://www.learnartificialneuralnetworks.com/robotcontrol.htm

Rosenblatt, F. (1958). The perceptron: A probabilistic theory for information storage and organization in the brain. Psychological Review, 65, 386-408.

Stergiou, C., & Dimitrios, S. (1997). Neural Networks. Retrieved April 15, 2009, from Imperial College London: http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html#Contents

Werbos, P. (1974). Beyond Regression: New tools for prediction and analysis in the behavioral sciences. PhD, Harvard University.

ZedGraph. (n.d.). Retrieved January 21, 2009, from http://zedgraph.org/wiki/index.php?title=Main_Page

12 - Tables
12.1 - Table of Figures
Figure 1 - A human neuron ............................................................................................ 10

Figure 2 - An artificial neuron ........................................................................................ 12

Figure 3 - The sigmoid function shape ........................................................................... 13

Figure 4 - local optimum (units are undefined in this case). .......................................... 27

Figure 5 - A feed-forward neural network ..................................................................... 30

Figure 6 - The initial state of a cascade correlation neural network. The circular
connections are adjustable weights (Fahlman & Lebiere, The Cascade-Correlation Learning
Architecture, 1991). ................................................................................................................. 35


Figure 7 - The second state of a cascade correlation neural network. A first node has
been added. The square connections are locked weights(Fahlman & Lebiere, The Cascade-
Correlation Learning Architecture, 1991)................................................................................ 35

Figure 8 - The third state of a cascade correlation neural network, with two nodes added
(Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991). ...................... 36

Figure 9 – Network and output tab design (Main form) ................................................ 41

Figure 10 - Mean squared error tab design (Main form) ................................................ 42

Figure 11 - Dataset tab design (Main form) ................................................................... 42

Figure 12 - Design of the training algorithm tab (Wizard form) .................................... 43

Figure 13 - Design of the network settings tab (Wizard form) ...................................... 44

Figure 14 - Dataset selection tab design (Wizard form)................................................. 44

Figure 15 - Working form design (Working form) ........................................................ 45

Figure 16 - About form design (About form) ................................................................. 45

Figure 17 - Dataset display design (About form) ........................................................... 46

Figure 18 - Licenses display design (About form) ......................................................... 46

Figure 19 - Network and Output tab............................................................................... 47

Figure 20 - Mean squared error graph tab ..................................................................... 48

Figure 21 - The dataset tab ............................................................................................. 49

Figure 22 - Learning Algorithm tab ............................................................................... 50

Figure 23 - Network Settings tab.................................................................................... 51


Figure 24 - Dataset tab ................................................................................................... 51

Figure 25 - About form, main view ................................................................................ 52

Figure 26 - About form, Datasets view .......................................................................... 53

Figure 27 - About form, Licenses view .......................................................................... 54

Figure 28 - Working form .............................................................................................. 54

Figure 29 - Number of iterations to achieve target (XOR) ............................................ 61

Figure 30 - Mean squared error upon last training test (XOR) ...................................... 61

Figure 31 - The mean squared error at testing (XOR).................................................... 62

Figure 32 - Number of iterations to achieve target (Iris data) ........................................ 64

Figure 33 - Mean squared error upon last training test (Iris data) ................................... 64

Figure 34 - The mean squared error at testing (Iris data) ................................................ 65

Figure 35 - Number of iterations to achieve target (Viruses) ......................................... 67

Figure 36 - Mean squared error upon last training test (Viruses)................................... 67

Figure 37 - Mean squared error at testing (Viruses)....................................................... 68

12.2 - Table of Equations


Equation 1 - Summation of the inputs and weights ........................................................ 12

Equation 2 - A sigmoid function (Bourg & Seeman, 2004). In this example “y” is the function output and “t” is the function input, as defined in the graph above. ......................... 13

Equation 3 – The delta rule (Matthews, Back-propagation for the Uninitiated, 2002) ... 21


Equation 4 - Differentiated sigmoid function (Matthews, Back-propagation for the Uninitiated, 2002) .................................................................................................................... 21

Equation 5 - Altered perceptron learning rule (Matthews, Back-propagation for the Uninitiated, 2002) .................................................................................................................... 21

Equation 6 - Altered delta calculation rule for the output layer (Matthews, Back-propagation for the Uninitiated, 2002)..................................................................................... 21

Equation 7 - Altered delta calculation rule for the hidden layer(s) (Matthews, Back-
propagation for the Uninitiated, 2002)..................................................................................... 22

Equation 8 - The definition of 's', given that 'o' is the network output at which the error is measured, 'p' is the training pattern, 'v' is the candidate unit's value, and the quantities 'v̄' and 'Ēo' are the values of 'v' and 'Eo' averaged over all patterns (Fahlman & Lebiere, The Cascade-Correlation Learning Architecture, 1991)................................................................................ 33

Equation 9 - The partial derivative of ’s’ (Fahlman & Lebiere, The Cascade-Correlation
Learning Architecture, 1991). .................................................................................................. 34

Equation 10 - The partial derivative of 's' with respect to each of the candidate's incoming weights 'wi', where 'σo' is the sign of the correlation between the candidate's value and output 'o', 'f′p' is the derivative for pattern 'p' of the candidate unit's activation function with respect to the sum of its inputs, and 'Ii,p' is the input the candidate receives from unit 'i' for pattern 'p' (both quantities are written out in full immediately after this table). ..................... 34
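
The two cascade-correlation quantities indexed above (Equations 8 and 10) can be written out explicitly. The LaTeX below is a reconstruction following the notation of Fahlman & Lebiere (1991), not a copy of the equations as they are rendered in the body of this report, so the symbols may differ slightly from those renderings:

% Equation 8: the correlation measure 'S' that a candidate unit is trained to maximise.
% v_p is the candidate's value for training pattern p, E_{p,o} is the residual error at
% output o for pattern p, and the barred quantities are averages over all patterns.
\[
S = \sum_{o} \left| \sum_{p} \left( v_{p} - \bar{v} \right) \left( E_{p,o} - \bar{E}_{o} \right) \right|
\]

% Equation 10: the partial derivative of S with respect to an incoming candidate weight w_i.
% \sigma_o is the sign of the correlation between the candidate's value and output o,
% f'_p is the derivative of the candidate's activation function for pattern p, and
% I_{i,p} is the input the candidate receives from unit i for pattern p.
\[
\frac{\partial S}{\partial w_{i}} = \sum_{p,o} \sigma_{o} \left( E_{p,o} - \bar{E}_{o} \right) f'_{p} \, I_{i,p}
\]

Taking the sign σo of each correlation is what removes the absolute value from the derivative, which in turn allows the simple gradient-ascent update used when training candidate units.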

12.3 - Table of tables


Table 1 - The statistical approach to deciding relevant data .......................................... 15

Table 2 - Genetic Algorithm testing data (XOR) ........................................................... 59

Table 3 - Back-propagation testing data (XOR)............................................................. 59


Table 4 - Cascade Correlation testing data (XOR) ......................................................... 60

Table 5 - Genetic algorithm testing data (Iris data) ........................................................ 63

Table 6 - Back-propagation testing data (Iris data) ........................................................ 63

Table 7 - Cascade correlation testing data (Iris data) ..................................................... 63

Table 8 - Genetic Algorithms testing data (Viruses) ...................................................... 66

Table 9 - Back-propagation testing data (Viruses) ......................................................... 66

Table 10 - Cascade Correlation testing data (Viruses) ................................................... 66


13 - Diary
• 01/10/2008 – Research progress satisfactory.
o Decision made to adjust the weights of the neural network during
training as opposed to using a genetic algorithm to establish network
structure.
o Decided on XOR as a starting point.
• 10/10/2008
o GALib decided as a genetic algorithm starting point
o Further non-web references as a goal
• 16/10/2008
o Investigation into various neural network libraries taking place.
• 17/10/2008
o Investigating more deeply into journals based on research already
conducted.
o Submitted objectives.
• 24/10/2008
o University computers were down, so a code demo was impossible; however, the back-propagation network was running and learning XOR.
o Agreed on further investigation into GALib and genetic algorithms.
• 31/10/2008
o First presentation complete and ready
o References building up
o Project plan complete
• 10/11/2008
o Presentation went well
o Progress satisfactory
• 14/11/2008
o Discussed document formatting and layout


o Progress satisfactory.
• 19/11/2008
o Milestone one submitted successfully.
• 21/11/2008
o Progress satisfactory
• 28/11/2008
o Progress satisfactory
• 05/12/2008
o Progress satisfactory
• 12/12/2008
o Progress satisfactory
• 09/01/2009
o Progress satisfactory
• 16/01/2009
o Progress satisfactory
• 23/01/2009
o Progress satisfactory
• 30/01/2009
o Milestone two has been successfully submitted
o Started investigating other areas to explore beyond the project requirements.
• 06/02/2009
o Still exploring extra areas for the project to cover.
• 13/02/2009
o Have decided to extend the project by including the cascade correlation learning algorithm and a comparison with genetic algorithms.
o Progress satisfactory
• 20/02/2009
o Progress satisfactory


• 27/02/2009
o Progress satisfactory
• 06/03/2009
o Project progress delayed by other coursework
• 13/03/2009
o Project progress delayed by other coursework
• 20/03/2009
o Project progressing; catch-up time for the previous two weeks will be used over the Easter break.
o Have started report
• 02/04/2009
o Main problems experienced with the genetic algorithm seem related to memory leaks; have decided to implement it in a managed language (C#) to try to solve this.
o Report continues satisfactorily
• 04/04/2009
o To do:
▪ About box
▪ Add dataset notes
▪ Decided on virus classification as a third dataset via the custom dataset functionality.
o Report satisfactory
• 05/04/2009
o Further work on user interface needed
o Finish C# genetic algorithm code (C++ code does work but is very slow)
o Report satisfactory
• 06/04/2009
o C# genetic algorithm code is complete; it is much faster (several bugs were fixed along the way) and retains none of the problematic memory leaks.


o Report satisfactory
• 10/04/2009
o Fully functional project – commencing report.
o Report satisfactory.
