
ML Assignment

NAME: VIKRAM K USN: 1BM17IS089

1. Question 4.1. What are the values of weights w0, w1, and w2 for the
perceptron whose decision surface is illustrated in the figure? Assume the
surface crosses the x1 axis at -1 and the x2 axis at 2.

Output of perceptron = sgn(w0+w1x1+w2x2)


Equation of the decision surface: w0 + w1x1 + w2x2 = 0
Given: the points (-1, 0) and (0, 2) lie on the line (the x1- and x2-intercepts).
Writing x for x1 and y for x2:
Slope of the line m = (y2 - y1)/(x2 - x1) = (2 - 0)/(0 - (-1)) = 2 and y-intercept c = 2
Therefore, the equation of the line is y = mx + c
=> y = 2x + 2
=> 2 + 2x - y = 0
This gives the values of the weights w0, w1, w2 as 2, 2, -1.
For the origin (0, 0) the output of the perceptron should be negative according to the
figure, but the weights calculated above give sgn(2 + 0 - 0) = +1, a positive output.
Negating every weight leaves the decision surface unchanged while flipping which side is
positive, therefore the actual values of the weights w0, w1, w2 are -2, -2, 1.
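
A quick numerical check of these weights (a minimal sketch in Python; the requirement
that the origin be classified negative is taken from the figure as stated above):

    def perceptron(x1, x2, w0, w1, w2):
        """Threshold unit: +1 if w0 + w1*x1 + w2*x2 > 0, otherwise -1."""
        return 1 if w0 + w1 * x1 + w2 * x2 > 0 else -1

    # Final weights from the derivation above
    w0, w1, w2 = -2, -2, 1

    # The origin must be classified negative
    print(perceptron(0, 0, w0, w1, w2))     # -1

    # The intercepts (-1, 0) and (0, 2) lie exactly on the decision surface
    print(w0 + w1 * (-1) + w2 * 0)          # 0
    print(w0 + w1 * 0 + w2 * 2)             # 0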

2. Question 4.2. (a) Design a two-input perceptron that implements the Boolean
function A∧¬B.
Truth table (inputs and outputs encoded as -1/+1):
A    B    A∧¬B
-1   -1   -1
-1    1   -1
 1   -1    1
 1    1   -1
The line on the decision surface passing through (0, -1) and (1, 0) separates the
positive and negative points.
Slope m = (0 - (-1))/(1 - 0) = 1 and y-intercept c = -1
Equation of the line: y = mx + c
=> y = x - 1
=> -1 + x - y = 0
Therefore, w0 = -1, w1 = 1, w2 = -1
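
A short check of this perceptron against the truth table (a minimal sketch in Python;
the -1/+1 encoding follows the table above):

    def a_and_not_b(a, b, w0=-1, w1=1, w2=-1):
        """Two-input threshold unit with the weights derived above."""
        return 1 if w0 + w1 * a + w2 * b > 0 else -1

    # Verify all four rows of the truth table
    for a in (-1, 1):
        for b in (-1, 1):
            expected = 1 if (a == 1 and b == -1) else -1   # A AND (NOT B)
            assert a_and_not_b(a, b) == expected
            print(a, b, a_and_not_b(a, b))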

(b) Design the two-layer network of perceptrons that implements A XOR B.


A XOR B cannot be computed by a single perceptron because it is not linearly separable,
so we need to build a two-layer network. Since A XOR B = (A∧¬B) ∨ (¬A∧B), we can use one
first-layer perceptron for each of these terms and a second-layer perceptron that
computes the OR of their outputs, as in the sketch below.
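
A minimal sketch of this two-layer network (the weights for the ¬A∧B unit and for the
OR unit are my own illustrative choices, built the same way as part (a)):

    def unit(w0, w1, w2):
        """Return a two-input threshold perceptron with fixed weights."""
        return lambda x1, x2: 1 if w0 + w1 * x1 + w2 * x2 > 0 else -1

    # First layer: one perceptron per term of (A∧¬B) ∨ (¬A∧B)
    a_and_not_b = unit(-1,  1, -1)   # from part (a)
    not_a_and_b = unit(-1, -1,  1)   # part (a) with the roles of A and B swapped
    # Second layer: OR of the two hidden outputs (+1 if at least one input is +1)
    or_unit = unit(1, 1, 1)

    def xor(a, b):
        return or_unit(a_and_not_b(a, b), not_a_and_b(a, b))

    for a in (-1, 1):
        for b in (-1, 1):
            print(a, b, xor(a, b))   # +1 exactly when a != b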

3. Question 4.3. Consider two perceptrons A and B defined by the threshold
expression w0 + w1x1 + w2x2 > 0. Perceptron A has weight values w0 = 1, w1 = 2,
w2 = 1 and perceptron B has weight values w0 = 0, w1 = 2, w2 = 1. Is perceptron A
more_general_than perceptron B? A is more_general_than B if and only if, for every
instance x, B(x) = 1 implies A(x) = 1.
Yes, perceptron A is more_general_than perceptron B. For every input
instance x = (x1, x2), if x is satisfied by B, which means 2x1 + x2 > 0, then we
also have 1 + 2x1 + x2 > 0. Hence, x is also satisfied by A.
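
A small numerical spot-check of this implication (illustrative only, not a proof; the
sampling range is my own choice):

    import random

    def A(x1, x2):
        return 1 if 1 + 2 * x1 + 1 * x2 > 0 else 0

    def B(x1, x2):
        return 1 if 0 + 2 * x1 + 1 * x2 > 0 else 0

    # Whenever B classifies an instance positive, A should as well
    random.seed(0)
    for _ in range(100000):
        x1, x2 = random.uniform(-10, 10), random.uniform(-10, 10)
        if B(x1, x2) == 1:
            assert A(x1, x2) == 1
    print("No counterexample found: B(x) = 1 implies A(x) = 1 on all samples.")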

4. Question 4.5. Derive a gradient descent training rule for a single unit with
output o, where o = w0 + w1x1 + w1x1² + ... + wnxn + wnxn²
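
A minimal sketch of what such a training rule looks like in code, assuming the usual
squared-error criterion E = ½ Σd (td - od)²; since ∂o/∂wi = xi + xi², batch gradient
descent gives Δw0 = η Σd (td - od) and Δwi = η Σd (td - od)(xid + xid²):

    import numpy as np

    def train_unit(X, t, eta=0.01, epochs=100):
        """Batch gradient descent for o = w0 + sum_i wi * (x_i + x_i^2)."""
        w0 = 0.0
        w = np.zeros(X.shape[1])
        phi = X + X ** 2                 # each wi multiplies (x_i + x_i^2)
        for _ in range(epochs):
            o = w0 + phi @ w             # unit outputs for every training example
            err = t - o                  # (t_d - o_d) for every training example
            w0 += eta * err.sum()        # delta w0 = eta * sum_d (t_d - o_d)
            w  += eta * phi.T @ err      # delta wi = eta * sum_d (t_d - o_d)(x_id + x_id^2)
        return w0, w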
