
Logical Operation XOR

The inputs and the targeted outputs are as shown in the XOR truth table below.

Input 1  Input 2  Output
0        0        0
0        1        1
1        0        1
1        1        0
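In Python, assuming the standard XOR truth table, the input patterns and target outputs can be written as small matrices (one row per pattern):

```python
import numpy as np

# XOR truth table: two binary inputs, one binary target output.
X = np.array([[0, 0],
              [0, 1],
              [1, 0],
              [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

print(X.shape, T.shape)  # (4, 2) (4, 1)
```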

First, the number of neurons in the hidden layer is varied, starting from 1 neuron, to find the least
mean square error (MSE). The network with the least MSE is chosen and tested.
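This selection procedure can be sketched as a small experiment. The sketch below trains a 2-H-1 sigmoid network on XOR for several hidden-layer sizes H and compares the final MSE; the learning rate, epoch count, and weight initialization are illustrative assumptions, not taken from the report, so the exact MSE values will differ from those quoted.

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(hidden, epochs=5000, lr=1.0, seed=0):
    """Train a 2-hidden-1 sigmoid network on XOR; return the final MSE."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(size=(2, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass.
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)
        # Backpropagate the squared-error gradient through both layers.
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ dY
        b2 -= lr * dY.sum(axis=0)
        W1 -= lr * X.T @ dH
        b1 -= lr * dH.sum(axis=0)
    # Recompute outputs with the final weights and report the MSE.
    Y = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((Y - T) ** 2))

for h in (1, 2, 3, 4):
    print(h, train_xor(h))
```

Whatever the hyperparameters, the 1-neuron network cannot fit XOR well, since its output is a monotone function of a single linear projection of the inputs.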

The graph above shows the training result of the network with 1 neuron in the hidden layer, which
has an MSE of 0.20206. This error is relatively high, which is expected: a single hidden neuron produces a monotone function of one linear projection of the inputs, so it cannot represent the XOR function accurately.
The graph above shows the training result of the network with 2 neurons in the hidden layer, which
has an MSE of 0.00094387. The results of this network are significantly better than those of the
network with 1 neuron in the hidden layer.

The graph above shows the training result of the network with 3 neurons in the hidden layer, which
has an MSE of 0.00056224. The error of this network is roughly half that of the network with 2
neurons, and the training process was much smoother and more stable than for the other networks.
With these results in mind, an additional neuron is added to see whether even better results can be
obtained.

The graph above shows the result for the network with 4 neurons in the hidden layer. The results
were significantly worse than those of the network with 3 neurons in the hidden layer. Therefore, the
network with 3 neurons in the hidden layer is chosen and tested.

The above shows the test results of the network. The inputs are shuffled, but the network has no
problem identifying the outputs, and the errors of the results are close to zero.
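The testing step above can be sketched as follows. Since the report's trained weights are not available, the trained 2-3-1 network is represented here by the exact XOR function; only the shuffle-and-check procedure is illustrated.

```python
import random

# The four XOR patterns as (inputs, target) pairs.
patterns = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def predict(x):
    # Stand-in for the trained network's output (assumption: the real
    # network would return values close to these targets).
    return x[0] ^ x[1]

# Present the patterns in a shuffled order, as in the report's test.
random.shuffle(patterns)
errors = [abs(predict(x) - t) for x, t in patterns]
print(errors)  # → [0, 0, 0, 0]
```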
We can conclude that the network with 3 neurons in the hidden layer is the most suitable network
to classify the XOR logical operation.

Sarveswaran a/l Vinayagan 139492
