
DEEP LEARNING

1.) Binary classification problem

# Create your first MLP in Keras
from keras.models import Sequential
from keras.layers import Dense
import numpy

# fix random seed for reproducibility
numpy.random.seed(7)

# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")

# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]

# create model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# Fit the model
model.fit(X, Y, epochs=150, batch_size=10)

# evaluate the model
scores = model.evaluate(X, Y)
print("\n%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

NOTE: If you try running this example in an IPython or Jupyter notebook you may get an error, caused by the output progress bars during training. You can easily turn these off by setting verbose=0 in the call to model.fit(), as shown below.
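
For example, the same fit call as above with progress output suppressed:

model.fit(X, Y, epochs=150, batch_size=10, verbose=0)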

------------------------------------------------------------------------------------------------------

2.) 5 Step Life-Cycle for Neural Network Models in Keras

Define Network.

Compile Network.

Fit Network.

Evaluate Network.

Make Predictions.

1.) Neural networks are defined in Keras as a sequence of layers. The container for
these layers is the Sequential class.

The first step is to create an instance of the Sequential class. Then you can create
your layers and add them in the order that they should be connected.

model = Sequential()
model.add(Dense(2))

or, equivalently:

layers = [Dense(2)]
model = Sequential(layers)

$ Activation functions

Regression: Linear activation function or ‘linear’ and the number of neurons
matching the number of outputs.

Binary Classification (2 class): Logistic activation function or ‘sigmoid’ and one
neuron in the output layer.

Multiclass Classification (>2 class): Softmax activation function or ‘softmax’ and
one output neuron per class value, assuming a one-hot encoded output pattern.
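
As a rough sketch, these output-layer choices look as follows in Keras; a given model uses only one of them as its final layer (num_outputs and num_classes are placeholder names, not from the tutorial):

model.add(Dense(num_outputs, activation='linear'))   # regression
model.add(Dense(1, activation='sigmoid'))            # binary classification
model.add(Dense(num_classes, activation='softmax'))  # multiclass classification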

$ Loss functions

Regression: Mean Squared Error or ‘mse’.

Binary Classification (2 class): Logarithmic Loss, also called cross entropy, or
‘binary_crossentropy’.

Multiclass Classification (>2 class): Multiclass Logarithmic Loss or
‘categorical_crossentropy’.
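
Each loss is passed by name to compile(); a sketch of the call for each problem type, with only one compile call used per model, chosen to match the output layer:

model.compile(loss='mse', optimizer='adam')                                             # regression
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])       # binary classification
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])  # multiclass classification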

$ Optimizers

Stochastic Gradient Descent or ‘sgd’, which requires tuning of a learning rate and
momentum.

Adam or ‘adam’, which requires tuning of a learning rate.

RMSprop or ‘rmsprop’, which requires tuning of a learning rate.

A sketch of configuring these explicitly follows the compile step below.

2.) Compile

model.compile(optimizer='sgd', loss='mse', metrics=['accuracy'])
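
The string shortcuts (‘sgd’, ‘adam’, ‘rmsprop’) use default hyperparameters. A minimal sketch of passing a configured optimizer object instead, assuming the standalone Keras imports used above (newer tf.keras versions use learning_rate= rather than lr=):

from keras.optimizers import SGD, Adam, RMSprop

opt = SGD(lr=0.01, momentum=0.9)   # learning rate and momentum to tune
# opt = Adam(lr=0.001)             # learning rate to tune
# opt = RMSprop(lr=0.001)          # learning rate to tune
model.compile(optimizer=opt, loss='mse', metrics=['accuracy'])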

3.) Fit

history = model.fit(X, y, batch_size=10, epochs=100)
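
The returned history object records the per-epoch metrics, which can be inspected after training, e.g.:

print(history.history['loss'])
print(history.history['accuracy'])   # key may be 'acc' in older Keras versions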

4.) Evaluate

loss, accuracy = model.evaluate(X, y)

where X and y are new, held-out validation data.

5.) Predict
predictions = model.predict(x)
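
model.predict() returns probabilities for the output layers described above. A minimal sketch of turning them into class labels (assuming numpy is imported as in the first example):

classes = (predictions > 0.5).astype(int)      # sigmoid output: threshold the probabilities
# classes = numpy.argmax(predictions, axis=1)  # softmax output: take the most probable class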

-------------------------------------------------------------------------------------------------

3.) The iris flowers dataset is well studied and is a good problem for practicing with
neural networks because all four input variables are numeric and have the same
scale in centimeters. Each instance describes the measurements of an observed
flower, and the output variable is the specific iris species.

This is a multi-class classification problem, meaning that there are more than two
classes to be predicted; in fact, there are three flower species. This is an important
type of problem for practicing with neural networks because the three class values
require specialized handling.

You can download the iris flowers dataset from the UCI Machine Learning
repository and place it in your current working directory with the filename
“iris.csv“.
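
A minimal sketch of a multi-class MLP for this dataset, assuming iris.csv has no header row, four numeric feature columns, and the species name in the fifth column (the scikit-learn LabelEncoder and the layer sizes here are illustrative choices, not from the notes above):

import pandas
from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical
from sklearn.preprocessing import LabelEncoder

# load the dataset and split into inputs and the class column
dataframe = pandas.read_csv("iris.csv", header=None)
dataset = dataframe.values
X = dataset[:, 0:4].astype(float)
Y = dataset[:, 4]

# encode the species names as integers, then one-hot encode them
encoder = LabelEncoder()
encoded_Y = encoder.fit_transform(Y)
dummy_y = to_categorical(encoded_Y)

# define, compile, and fit a small softmax classifier
model = Sequential()
model.add(Dense(8, input_dim=4, activation='relu'))
model.add(Dense(3, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, dummy_y, epochs=200, batch_size=5, verbose=0)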

REFER: https://machinelearningmastery.com/tutorial-first-neural-network-python-keras/
https://github.com/fchollet/keras-resources
