ASSIGNMENT
Submitted by:
MOHAMED AHMED
2K20/CO/270
➢ Implementation of a neural network with at least two layers, using the sigmoid
activation function, written in Python with NumPy.
❖ The Code:
import numpy as np

def sigmoid(x):
    """Sigmoid activation function"""
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    """Derivative of sigmoid, expressed in terms of the sigmoid output x"""
    return x * (1 - x)

# XOR training data
inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
target = np.array([[0], [1], [1], [0]])
# Network shape: 2 inputs -> 2 hidden units -> 1 output
np.random.seed(0)
input_to_hidden_weights = np.random.uniform(-1, 1, size=(2, 2))
hidden_biases = np.zeros((1, 2))
hidden_to_output_weights = np.random.uniform(-1, 1, size=(2, 1))
output_biases = np.zeros((1, 1))
lr = 0.5

def forward_pass(x):
    """Compute the forward pass of the network"""
    hidden_output = sigmoid(np.dot(x, input_to_hidden_weights) + hidden_biases)
    final_output = sigmoid(np.dot(hidden_output, hidden_to_output_weights) + output_biases)
    return hidden_output, final_output

for epoch in range(10000):
    hidden_output, final_output = forward_pass(inputs)
    # Backward pass: propagate the error from the output layer to the hidden layer
    output_delta = (target - final_output) * sigmoid_derivative(final_output)
    hidden_layer_error = output_delta.dot(hidden_to_output_weights.T)
    hidden_layer_delta = hidden_layer_error * sigmoid_derivative(hidden_output)
    # Update weights and biases
    hidden_to_output_weights += lr * hidden_output.T.dot(output_delta)
    output_biases += lr * np.sum(output_delta, axis=0, keepdims=True)
    input_to_hidden_weights += lr * inputs.T.dot(hidden_layer_delta)
    hidden_biases += lr * np.sum(hidden_layer_delta, axis=0, keepdims=True)
    if epoch % 1000 == 0:
        print(f"Epoch {epoch}, Loss: {np.mean(np.square(target - final_output))}")

# Test
hidden_output, final_output = forward_pass(np.array([[0, 1]]))
print(final_output)
➢ The code above builds a simple feedforward neural network (one hidden layer with
sigmoid activations) and trains it with backpropagation on the XOR dataset for
demonstration purposes.
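Note that sigmoid_derivative takes the sigmoid *output* rather than the raw input, relying on the identity σ'(x) = σ(x)(1 − σ(x)). This is why the backward pass can reuse the activations saved from the forward pass instead of recomputing anything. A minimal standalone check of that identity against a numerical derivative (variable names here are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# The analytic derivative needs only the activation s, not the raw input x.
x = np.linspace(-5, 5, 101)
s = sigmoid(x)
analytic = s * (1 - s)

# Compare against a central finite difference of sigmoid itself.
h = 1e-5
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(np.max(np.abs(analytic - numeric)))  # close to 0
```

Because the error of a central difference shrinks like h², the two curves agree to many decimal places, confirming that caching the activations is enough for the weight updates.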