
ASSIGNMENT 1

1. Discuss the limitations of a single perceptron in handling linearly inseparable
datasets. How can these limitations be overcome using multilayer perceptrons (MLPs)?
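As a starting point for the discussion, the sketch below trains a single perceptron on XOR with the classic perceptron learning rule; all values (initialization, learning rate, epoch count) are chosen here only for illustration. Because XOR is not linearly separable, the perceptron can never classify all four points correctly.

```python
# A single perceptron cannot separate XOR: no one line splits the classes.
def step(z):
    return 1 if z >= 0 else 0

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]  # XOR labels

w1 = w2 = b = 0.0  # assumed initialization
lr = 0.1           # assumed learning rate
for _ in range(100):  # 100 epochs is plenty to show non-convergence
    for (x1, x2), t in zip(X, y):
        out = step(w1 * x1 + w2 * x2 + b)
        err = t - out
        w1 += lr * err * x1
        w2 += lr * err * x2
        b += lr * err

acc = sum(step(w1 * x1 + w2 * x2 + b) == t for (x1, x2), t in zip(X, y)) / 4
print(acc)  # stays below 1.0 no matter how long training runs
```

An MLP overcomes this by letting hidden units carve the plane into regions that a final unit can then combine.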

2. Consider the case of the XOR function in which the two points {(0, 0), (1, 1)} belong to
one class, and the other two points {(1, 0), (0, 1)} belong to the other class. Design a
multilayer perceptron for this binary classification problem.
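One valid hand-designed solution (among many) is a 2-2-1 network with step activations: one hidden unit computes OR, the other AND, and the output fires when OR holds but AND does not. The specific weights and thresholds below are an illustrative choice, not the unique answer.

```python
# 2-2-1 MLP computing XOR with step activations.
def step(z):
    return 1 if z >= 0 else 0

def xor_mlp(x1, x2):
    h1 = step(x1 + x2 - 0.5)        # hidden unit 1: OR gate
    h2 = step(x1 + x2 - 1.5)        # hidden unit 2: AND gate
    return step(h1 - 2 * h2 - 0.5)  # output: OR AND (NOT AND) = XOR

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), xor_mlp(x1, x2))  # -> 0, 1, 1, 0
```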

3. Given the weights and bias of a perceptron (w1 = 0.5, w2 = -0.3, bias = 0.2), calculate the
output of the perceptron for input values (x1 = 0.8, x2 = -0.5) using the step activation
function.
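The computation can be checked directly: z = w1·x1 + w2·x2 + bias = 0.4 + 0.15 + 0.2 = 0.75, and the step function maps any non-negative z to 1.

```python
# Worked check with the given numbers.
def step(z):
    return 1 if z >= 0 else 0

w1, w2, bias = 0.5, -0.3, 0.2
x1, x2 = 0.8, -0.5

z = w1 * x1 + w2 * x2 + bias  # 0.4 + 0.15 + 0.2 = 0.75
print(step(z))                # 1
```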

4. Consider a simple feed-forward neural network with a single hidden layer, as shown
below: Input Layer (2 neurons) --> Hidden Layer (3 neurons) --> Output Layer (1
neuron). Perform one iteration of the backpropagation algorithm for the given sample to
update the weights and biases using the sigmoid activation function and mean squared
error (MSE) loss. Given w1 = 0.3, w2 = 0.5, w3 = 0.2, w4 = -0.4, w5 = -0.1, w6 = 0.2,
w7 = 0.4, w8 = -0.3, w9 = -0.2, and a learning rate (α) of 0.1.
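The mechanics of one iteration can be sketched as below. Note the question does not state the training sample, the target, the initial biases, or which weight sits on which connection, so the sample x = (1.0, 0.5), target t = 1.0, zero biases, and the w1–w9 layout in the comments are all assumptions made purely for illustration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Assumed layout: rows of w_ih are hidden neurons h1..h3 with
# weights (w1,w2), (w3,w4), (w5,w6); w_ho holds (w7, w8, w9).
w_ih = [[0.3, 0.5], [0.2, -0.4], [-0.1, 0.2]]
w_ho = [0.4, -0.3, -0.2]
b_h = [0.0, 0.0, 0.0]  # biases not given in the question; assumed zero
b_o = 0.0              # assumed zero
lr = 0.1

x = [1.0, 0.5]  # assumed sample
t = 1.0         # assumed target

# Forward pass
h = [sigmoid(w[0] * x[0] + w[1] * x[1] + b) for w, b in zip(w_ih, b_h)]
y = sigmoid(sum(wi * hi for wi, hi in zip(w_ho, h)) + b_o)
loss = 0.5 * (t - y) ** 2  # MSE for one sample

# Backward pass: delta = dLoss/d(pre-activation) at each unit
delta_o = (y - t) * y * (1 - y)
delta_h = [delta_o * w_ho[j] * h[j] * (1 - h[j]) for j in range(3)]

# Gradient-descent updates
w_ho = [w_ho[j] - lr * delta_o * h[j] for j in range(3)]
b_o -= lr * delta_o
for j in range(3):
    w_ih[j][0] -= lr * delta_h[j] * x[0]
    w_ih[j][1] -= lr * delta_h[j] * x[1]
    b_h[j] -= lr * delta_h[j]

print(round(loss, 4))
```

When answering on paper, substitute the actual sample and biases from class; the update equations themselves are unchanged.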
