Neural networks can be used to determine relationships and patterns between inputs and outputs.

A simple single layer feed forward neural network which has the ability to learn and differentiate data sets is known as a perceptron.

Single layer feed forward perceptron

By iteratively learning the weights, it is possible for the perceptron to find a solution to linearly separable data (data that can be separated by a hyperplane). In this example, we will run a simple perceptron to determine the solution to a 2-input OR. X1 OR X2 can be defined as follows:

X1  X2  Out
0   0   0
1   0   1
0   1   1
1   1   1

If you want to verify this yourself, run the following code in Matlab. Your code can further be modified to fit your personal needs. We first initialize our variables of interest, including the input, desired output, bias, learning coefficient and weights.

input = [0 0; 0 1; 1 0; 1 1];
numIn = 4;
desired_out = [0;1;1;1];
bias = -1;
coeff = 0.7;
rand('state',sum(100*clock));
weights = -1*2.*rand(3,1);

The input and desired_out are self explanatory, with the bias initialized to a constant. This value can be set to any non-zero number between -1 and 1. The coeff represents the learning rate, which specifies how large of an adjustment is made to the network weights after each iteration: the closer the coefficient is to 1, the larger the weight adjustments, while smaller values modify the weights more conservatively. Finally, the weights are randomly assigned.

A perceptron is defined by the equation

    out = f(w1*x1 + w2*x2 + ... + wn*xn + b)

where f is the activation function. Therefore, in our example, we have

    w1*x1 + w2*x2 + b = out

We will assume that weights(1,1) is for the bias and weights(2:3,1) are for X1 and X2, respectively. One more variable we will set is the iterations, specifying how many times to train or go through and modify the weights.

iterations = 10;
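To make the equation concrete before training, the short sketch below (not part of the original listing, and reusing the variables already defined above) computes the weighted sum and sigmoid output for a single input row; this is the same forward computation the training loop below performs for every sample before updating the weights.

% Forward pass for one sample, here the first row of input (X1 = 0, X2 = 0)
j = 1;
y = bias*weights(1,1) + input(j,1)*weights(2,1) + input(j,2)*weights(3,1);
out1 = 1/(1+exp(-y))   % sigmoid squashes the weighted sum into the (0,1) range

With the random initial weights this value is essentially arbitrary; training adjusts the weights so that each output approaches the corresponding desired_out.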
Now the feed forward perceptron code:

for i = 1:iterations
   out = zeros(4,1);
   for j = 1:numIn
      y = bias*weights(1,1)+...
          input(j,1)*weights(2,1)+input(j,2)*weights(3,1);
      out(j) = 1/(1+exp(-y));
      delta = desired_out(j)-out(j);
      weights(1,1) = weights(1,1)+coeff*bias*delta;
      weights(2,1) = weights(2,1)+coeff*input(j,1)*delta;
      weights(3,1) = weights(3,1)+coeff*input(j,2)*delta;
   end
end

A little explanation of the code: first, the equation solving for out is determined as mentioned above, and the result is then run through a sigmoid function to ensure values are squashed within a [0 1] limit. The weights are then modified iteratively based on the delta rule. When running the perceptron over 10 iterations, the outputs begin to converge, but are still not precisely as expected; as the iterations approach 1000, the output converges towards the desired output.

As the OR logic condition is linearly separable, a solution will be reached after a finite number of loops. Convergence time can also change based on the initial weights, the learning rate, the transfer function (sigmoid, linear, etc.) and the learning rule (in this case the delta rule is used, but other algorithms like the Levenberg-Marquardt also exist). If you are interested, try to run the same code for other logical conditions like AND or NAND to see what you get.
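As a starting point for that experiment, here is a minimal sketch (assuming the variable names from the listing above) showing the one change needed for AND or NAND, plus a quick check of the trained weights on all four inputs:

% To train the same perceptron on a 2-input AND, only the targets change:
desired_out = [0;0;0;1];          % AND truth table (use [1;1;1;0] for NAND)

% After re-running the training loop, check the learned mapping:
for j = 1:numIn
    y = bias*weights(1,1) + input(j,1)*weights(2,1) + input(j,2)*weights(3,1);
    fprintf('%d %d -> %.4f\n', input(j,1), input(j,2), 1/(1+exp(-y)));
end

Both AND and NAND are linearly separable, so the perceptron should converge on them just as it does for OR.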
