The dataset has two features ('feature1' and 'feature2') and one target variable.
The target variable (named 'class') maps each record to either 0 or 1.
Some of the packages required for reading the file and for data visualization have been imported for you.
Using pandas, read the csv file and assign the resulting dataframe to the variable 'data'.
The following code extracts the features and the target variable and assigns them to the variables X and y respectively.
Run the piece of code below to visualize the data in the x-y plane. The green and blue dots correspond to class 0 and class 1 respectively.
You can see that the data is not linearly separable, i.e. you cannot draw a single straight boundary to classify the data.
In [24]: colors = ['green', 'blue']
cmap = matplotlib.colors.ListedColormap(colors)
# Plot the figure
plt.figure()
plt.title('Non-linearly separable classes')
plt.scatter(X[:, 0], X[:, 1], c=y, marker='o', s=50, cmap=cmap, alpha=0.5)
plt.show()
Before diving into deep neural networks, let's try to classify the data using simple logistic regression.
The code for logistic regression has been written for you.
Run the cell below to build a simple logistic regression model.
Run the cell below to define the method that plots the decision boundary. The code for visualization has
been written for you.
Run the cell below to plot the decision boundary predicted by the logistic regression model.
In [27]: plot_decision_boundary(X.T,y,lambda x: lr_model.predict(x))
From the above plot you can see that simple logistic regression performs poorly in classifying the
data, since a linear decision boundary is not able to separate the two classes effectively.
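As a side illustration (not part of the exercise code), the same limitation shows up on a tiny XOR-style dataset, the classic example of non-linearly separable data. Here sklearn's LogisticRegression is assumed in place of the pre-written lr_model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# XOR-style data: no single straight line can separate the classes
X_toy = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_toy = np.array([0, 1, 1, 0])

clf = LogisticRegression().fit(X_toy, y_toy)
acc = clf.score(X_toy, y_toy)
print(acc)
```

Any linear boundary classifies at most 3 of the 4 XOR points correctly, so the training accuracy here can never exceed 0.75.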
Now build a deep neural network to classify the same data.
Define the layer dimensions as an array called 'layer_dims' with one input layer whose size equals the number of
features, two hidden layers with nine nodes each, and one final output layer with **one node**.
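Following that description, layer_dims would look like the sketch below (two input features are assumed from the dataset above):

```python
# [input layer, hidden layer 1, hidden layer 2, output layer]
layer_dims = [2, 9, 9, 1]
```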
Define a function named placeholders that returns two placeholders, one for the input data named A_0 and one for
the output data named Y.
Define a function named initialize_parameters_deep() to initialize the weights and biases for each layer.
Use tf.random_normal() to initialise the weights and tf.zeros() to initialise the biases. Set the datatype as float64.
Parameters: layer_dims
Returns: a dictionary of weights and biases (the function ends with `return parameters`)
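The shapes involved can be sketched with a NumPy analogue (np.random.randn standing in for tf.random_normal, np.zeros for tf.zeros; this is not the TensorFlow code the exercise asks for):

```python
import numpy as np

def initialize_parameters_deep(layer_dims):
    """NumPy sketch: W_l has shape (n_l, n_{l-1}), b_l has shape (n_l, 1)."""
    parameters = {}
    for l in range(1, len(layer_dims)):
        parameters['W' + str(l)] = np.random.randn(
            layer_dims[l], layer_dims[l - 1]).astype(np.float64)
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1), dtype=np.float64)
    return parameters

params = initialize_parameters_deep([2, 9, 9, 1])
print(params['W1'].shape, params['b3'].shape)  # (9, 2) (1, 1)
```

Each weight matrix maps the previous layer's activations to the current layer's units, which is why W_l is (n_l, n_{l-1}).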
Define a function named linear_forward_prop() to define the forward propagation for a given layer.
Parameters: A_prev (output from the previous layer), W (weight matrix of the current layer), b (bias vector of the
current layer), activation (type of activation to be used for the output of the current layer)
Returns: A (output from the current layer)
Use relu activation for the hidden layers; for the final output layer, i.e. when the activation is sigmoid,
return the output unactivated.
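The computation for one layer can be sketched in NumPy (again an analogue of, not a substitute for, the TensorFlow version). The output layer returns the raw pre-activation because the sigmoid is typically folded into the loss (e.g. sigmoid cross-entropy on logits):

```python
import numpy as np

def linear_forward_prop(A_prev, W, b, activation):
    """NumPy sketch of one layer: Z = W @ A_prev + b, then optional ReLU."""
    Z = np.dot(W, A_prev) + b
    if activation == 'relu':
        return np.maximum(0, Z)  # hidden layers
    # 'sigmoid' (output layer): return the logits unactivated
    return Z

A_prev = np.array([[1.0], [-2.0]])       # 2 features, 1 example
W = np.array([[1.0, 1.0], [0.5, -0.5]])  # layer with 2 units
b = np.zeros((2, 1))
print(linear_forward_prop(A_prev, W, b, 'relu'))  # ReLU zeroes the negative unit
```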
Train the deep neural network with a learning rate of 0.3 and the number of iterations set to 10000.
Use X_data and Y_data to train the network.
6.931892819495226
0.29380057389147135
0.28654399752605125
0.2827171909803601
0.2794727368687029
0.2763785174902902
0.27361417862396276
0.2707828611444487
0.26923518509569005
0.26726412288801044
Run the cell below to define the method that predicts the output of the model for a given input and parameters. The
code has been written for you.
Run the cell below to plot the decision boundary predicted by the deep neural network.
In [38]: plot_decision_boundary(X_data,y,lambda x: predict(x.T,parameters))
In [ ]: