
Homework

Deadline: 17th of March (we will not accept solutions after the deadline)
Solution format:
- Part #1 — photo / PDF file with your calculations
- Part #2 — link to the Google Colaboratory
How to send the solution: direct message to Yuriy Makarov via Teams

Part #1
Let’s assume that we have a regression task and the following neural network.

We have only one hidden layer with two neurons, each with a ReLU activation
function.

Questions:
1. Write down the prediction function (y_hat = ...)
2. What loss function L(y_true, y_hat) would you like to use here? Write it down.
3. Insert your prediction function into your loss function
4. Take the following derivatives and write them down (a general-form sketch follows after question 5)
a. ∂L/∂w7
b. ∂L/∂w1
c. ∂L/∂w6

5. Advanced question (optional): calculate the same derivatives for the case where the
neurons in the hidden layer use a sigmoid activation function instead of ReLU
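
For questions 1–4, here is a minimal general-form sketch (in LaTeX notation) for a one-hidden-layer ReLU network with a squared-error loss. The generic symbols (inputs x_i, hidden weights w_{ij}, biases b_j, output weights v_j, output bias c) are placeholders rather than the w1–w7 labels of the diagram, and the squared-error loss is only one reasonable answer to question 2; map everything onto the actual diagram yourself.

% generic one-hidden-layer ReLU network (placeholder symbols, not the diagram's w1-w7)
z_j = \sum_i w_{ij} x_i + b_j, \qquad
h_j = \mathrm{ReLU}(z_j) = \max(0, z_j), \qquad
\hat{y} = \sum_j v_j h_j + c

% one reasonable regression loss
L(y, \hat{y}) = \tfrac{1}{2} (\hat{y} - y)^2

% chain rule for an output weight v_j and a hidden weight w_{ij}
\frac{\partial L}{\partial v_j} = (\hat{y} - y)\, h_j, \qquad
\frac{\partial L}{\partial w_{ij}} = (\hat{y} - y)\, v_j\, \mathbf{1}[z_j > 0]\, x_i

Here \mathbf{1}[z_j > 0] is the derivative of ReLU at z_j (taken as 0 at z_j = 0).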
Part #2
Take this notebook as a starting point:
https://colab.research.google.com/drive/1Bau1ARCi8RrJ1cKZQ_xQcLmzJEep-BNJ#scrollTo=9Po7cjO7C7_1

You should accomplish the following steps:


1. Try different neural network architectures and find the architecture with the highest
score (the choice of the score is up to you) on the test data (see the first sketch after this list)
2. Try simpler models (see the second sketch after this list)
a. Linear regression (with L1 and L2 regularization)
b. Decision Tree
c. Random Forest
3. As a result, please create a table (pandas DataFrame) comparing the different models.
Report the score on both the train and the test data (the second sketch below also builds such a table).
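
A minimal sketch for step 1: it loops over a few hidden-layer configurations of scikit-learn's MLPRegressor and prints train/test R². The synthetic make_regression data and the X_train / X_test / y_train / y_test names are placeholders that keep the snippet runnable; replace them with the notebook's actual dataset. A Keras or PyTorch model would work just as well.

# Step 1 sketch: compare a few hidden-layer configurations.
# The synthetic data below is only a placeholder for the notebook's dataset.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

architectures = [(8,), (32,), (32, 16), (64, 32, 16)]  # hidden layer sizes to try

for hidden in architectures:
    model = make_pipeline(
        StandardScaler(),  # MLPs are sensitive to feature scale
        MLPRegressor(hidden_layer_sizes=hidden, max_iter=2000, random_state=42),
    )
    model.fit(X_train, y_train)
    print(hidden,
          "train R^2:", round(model.score(X_train, y_train), 3),
          "test R^2:", round(model.score(X_test, y_test), 3))

R² is used here simply because .score() returns it by default; MAE or RMSE are equally valid score choices.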
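
A minimal sketch for steps 2 and 3 under the same placeholder-data assumption: fit the simpler baselines, collect train and test R² for each, and put the results into a pandas DataFrame. Add a row for your best neural network from step 1 in the same way.

# Steps 2-3 sketch: simpler baselines plus a comparison table.
import pandas as pd
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso, LinearRegression, Ridge
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Placeholder data; replace with the notebook's X_train / X_test / y_train / y_test.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "Linear regression": LinearRegression(),
    "Lasso (L1)": Lasso(alpha=1.0),
    "Ridge (L2)": Ridge(alpha=1.0),
    "Decision tree": DecisionTreeRegressor(random_state=0),
    "Random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}

rows = []
for name, model in models.items():
    model.fit(X_train, y_train)
    rows.append({
        "model": name,
        "train R^2": model.score(X_train, y_train),
        "test R^2": model.score(X_test, y_test),
    })

comparison = pd.DataFrame(rows)  # the table requested in step 3
print(comparison)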
