Vihar Kurama
May 30, 2019
Updated: March 11, 2022
The main idea behind deep learning is that artificial intelligence should draw
inspiration from the brain. This perspective gave rise to the "neural network"
terminology. The brain contains billions of neurons with tens of thousands of
connections between them. Deep learning algorithms resemble the brain in many
ways: both the brain and deep learning models involve a vast number of
computation units (neurons) that are not extraordinarily intelligent in isolation but
become intelligent when they interact with each other.
I think people need to understand that deep learning is making a lot of things,
behind the scenes, much better. Deep learning is already working in Google
search, and in image search; it allows you to image search a term like "hug."
— Geoffrey Hinton
Neurons
https://builtin.com/data-science/deep-learning-python 1/21
3/24/22, 10:24 PM Deep Learning With Python: A Guide | Built In
The basic building block for neural networks is artificial neurons, which imitate
human brain neurons. These are simple, powerful computational units that have
weighted input signals and produce an output signal using an activation function.
These neurons are spread across several layers in the neural network.
Below is an image of how a neuron is imitated in a neural network. The neuron takes
in inputs, each arriving over a connection with a particular weight to other
neurons. The activation function introduces nonlinearity, mapping the weighted sum
into a particular range from which the output is estimated.
Deep learning consists of artificial neural networks that are modeled on similar
networks present in the human brain. As data travels through this artificial mesh,
each layer processes an aspect of the data, filters outliers, spots familiar entities, and
produces the final output.
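To make the single neuron concrete, here is a minimal Python sketch; the weights, bias, and the choice of a sigmoid activation are illustrative, not taken from the article's figure.

```python
import math

def neuron(inputs, weights, bias):
    # weighted sum of the input signals, plus a bias term
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # sigmoid activation squashes the sum into (0, 1)
    return 1 / (1 + math.exp(-z))

# two inputs, each arriving over a weighted connection
print(round(neuron([2.0, 3.0], [0.5, -0.25], bias=0.1), 4))  # 0.5866
```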
Input layer: This layer consists of the neurons that do nothing other than receive the
inputs and pass them on to the other layers. The number of neurons in the input layer
should be equal to the number of attributes or features in the dataset.
Output layer: The output layer is the predicted feature; its form depends on the
type of model you're building.
Hidden layer: In between the input and output layers there will be hidden layers, based on
the type of model. Hidden layers contain a vast number of neurons, which apply
transformations to the inputs before passing them on. As the network is trained,
the weights are updated to make the network more predictive.
NEURON WEIGHTS
FEEDFORWARD DEEP NETWORKS
Feedforward supervised neural networks were among the first and most successful
learning algorithms. They are also called deep networks, multi-layer perceptrons
(MLPs), or simply neural networks; the vanilla architecture has a single hidden
layer. Each neuron is connected to the neurons in the next layer with some weight;
the image below depicts how data passes through the series of layers.
Mathematically,
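a standard way to write a neuron's computation, with activation function φ, weights wᵢ, inputs xᵢ, and bias b, is:

```latex
y = \varphi\left(\sum_{i=1}^{n} w_i x_i + b\right)
```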
There are several activation functions, used for different use cases. The most
commonly used are ReLU, tanh, and softmax. A cheat sheet for
activation functions is given below.
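As a minimal NumPy sketch of the three functions named above (standard definitions; the sample scores are invented for illustration):

```python
import numpy as np

def relu(x):
    # zero out negatives, pass positives through unchanged
    return np.maximum(0, x)

def tanh(x):
    # squashes any real value into (-1, 1)
    return np.tanh(x)

def softmax(x):
    # exponentiate (shifted by the max for numerical stability), then normalize
    e = np.exp(x - np.max(x))
    return e / e.sum()

scores = np.array([-1.0, 0.0, 2.0])
print(relu(scores))     # [0. 0. 2.]
print(softmax(scores))  # three probabilities that sum to 1
```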
BACKPROPAGATION
The predicted value of the network is compared to the expected output, and an error
is calculated using a loss function. This error is then propagated back through the whole
network, one layer at a time, and the weights are updated according to how much
they contributed to the error. This clever bit of math is called the backpropagation
algorithm. The process is repeated for all of the examples in your training data. One
round of updating the network for the entire training dataset is called an epoch. A
network may be trained for tens, hundreds or many thousands of epochs.
The cost function is the measure of “how good” a neural network did for its given
training input and the expected output. It also may depend on attributes such as
weights and biases.
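For example, mean squared error is one common cost function; the predicted and expected values below are invented for illustration.

```python
import numpy as np

def mse(predicted, expected):
    # average of squared differences: a single number summarizing the error
    return np.mean((predicted - expected) ** 2)

y_pred = np.array([2.5, 0.0, 2.0])
y_true = np.array([3.0, -0.5, 2.0])
print(mse(y_pred, y_true))  # about 0.1667
```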
A cost function is single-valued, not a vector, because it rates how well the neural
network performed as a whole. Using the gradient descent optimization algorithm,
the weights are updated incrementally after each epoch.
The magnitude and direction of the weight update are computed by taking a step in
the opposite direction of the cost gradient.
where Δw is a vector that contains the weight updates of each weight coefficient w,
which are computed as follows:
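In standard notation, with learning rate η and cost C, the update is:

```latex
w := w + \Delta w, \qquad
\Delta w = -\eta \nabla C, \qquad
\Delta w_i = -\eta \frac{\partial C}{\partial w_i}
```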
We repeat the gradient descent steps until the cost reaches its minimum,
and the size of each step is determined by the steepness of the slope (the gradient).
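The loop below is a minimal sketch of gradient descent on a one-variable cost C(w) = (w - 3)², whose minimum is at w = 3; the cost, learning rate, and starting point are all illustrative choices.

```python
def grad(w):
    # derivative of the cost C(w) = (w - 3) ** 2
    return 2 * (w - 3)

w = 0.0      # illustrative starting weight
eta = 0.1    # learning rate (step size)
for _ in range(100):
    w = w - eta * grad(w)   # step in the opposite direction of the gradient

print(round(w, 4))  # 3.0, the minimum of the cost
```

Each step shrinks the distance to the minimum by a constant factor here, so the iterate converges geometrically toward w = 3.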
Now consider a problem: find the number of transactions, given accounts and
family members as input. To solve this, we first need to create a forward
propagation neural network.
Our input layer will be the number of family members and accounts, the number of
hidden layers is one, and the output layer will be the number of transactions. The
weights from the input layer to the hidden layer are given as shown in the figure, with
the number of family members (2) and number of accounts (3) as inputs. Now the values of
the hidden layer (i, j) and output layer (k) will be calculated using forward propagation,
by the following steps.
Process
1. Multiply-add process.
2. Dot product (inputs * weights).
3. Forward propagation for one data point at a time.
4. Output is the prediction for that data point.
The value of i is calculated from the input values and the weights of the
connections into the corresponding neuron.
i = (2 * 1) + (3 * 1)
→ i = 5
Similarly,
j = (2 * -1) + (3 * 1)
→ j = 1
and, using the hidden-layer values and the weights into the output neuron,
→ k = 9
Now that we have seen how the inputs are passed through the layers of the neural
network, let's implement a neural network completely from scratch using the
Python library NumPy.
```
import numpy as np

print('a = ')
a = int(input())
# 2
print('b = ')
b = int(input())
# 3
input_data = np.array([a, b])

# Input-to-hidden weights as in the figure; the hidden-to-output
# weights were not shown, and [2, -1] is reconstructed to match
# the article's sample outputs (9 and 13).
weights = {'node_0': np.array([1, 1]),
           'node_1': np.array([-1, 1]),
           'output': np.array([2, -1])}

node_0_value = (input_data * weights['node_0']).sum()
# 2 * 1 + 3 * 1 = 5
node_1_value = (input_data * weights['node_1']).sum()
# 2 * -1 + 3 * 1 = 1
print('node_0_hidden: {}'.format(node_0_value))
print('node_1_hidden: {}'.format(node_1_value))

hidden_layer_values = np.array([node_0_value, node_1_value])
output_value = (hidden_layer_values * weights['output']).sum()
print('output layer: {}'.format(output_value))
```
$ python dl_multilayer_perceptron.py
a =
b =
node_0_hidden: 7
node_1_hidden: 1
output layer: 13
For a neural network to achieve its maximum predictive power, we need to apply an
activation function in the hidden layers. It is used to capture the nonlinearities; we
apply it to the weighted sums computed at the hidden layers.
In the previous code snippet, we saw how the output is generated using a
simple feedforward neural network. In the code snippet below, we add an
activation function: the sum of the products of inputs and weights is passed
into the activation function.
```
import numpy as np

print('a = ')
a = int(input())
# 2
print('b = ')
b = int(input())
# 3
input_data = np.array([a, b])

# same weights as in the previous snippet
weights = {'node_0': np.array([1, 1]),
           'node_1': np.array([-1, 1]),
           'output': np.array([2, -1])}

def relu(input):
    # ReLU: pass positive values through, clip negatives to 0
    output = max(0, input)
    return(output)

node_0_input = (input_data * weights['node_0']).sum()
node_0_output = relu(node_0_input)
node_1_input = (input_data * weights['node_1']).sum()
node_1_output = relu(node_1_input)

hidden_layer_outputs = np.array([node_0_output, node_1_output])
model_output = relu((hidden_layer_outputs * weights['output']).sum())
print(model_output)
```
$ python dl_fp_activation.py
a =
b =
44
About Keras:
Keras is a high-level neural networks API, written in Python and capable of running
on top of TensorFlow, CNTK, or Theano.
It is one of the most popular frameworks for coding neural networks. Recently, Keras
was merged into the TensorFlow repository, adding more APIs and support for
running on multiple systems.
To install Keras on your machine using pip, run the following command:
pip install keras
Building a model in Keras follows these steps:
1. Load Data.
2. Define Model.
3. Compile Model.
4. Fit Model.
5. Evaluate Model.
6. Tie It All Together.
Fully connected layers are described using the Dense class. We can specify the
number of neurons in the layer as the first argument, the initialisation method via the
init argument, and the activation function via the activation
argument. Now that the model is defined, we can compile it. Compiling the model
uses the efficient numerical libraries under the covers (the so-called backend), such
as Theano or TensorFlow. So far we have defined our model and compiled it, ready for
efficient computation. Now it is time to run the model on the Pima Indians diabetes
data. We can train, or fit, our model on our data by calling the fit() function on the model.
```
import numpy
from keras.models import Sequential
from keras.layers import Dense

# fix the random seed for reproducibility
seed = 7
numpy.random.seed(seed)

# load the Pima Indians diabetes dataset (CSV filename assumed)
dataset = numpy.loadtxt('pima-indians-diabetes.csv', delimiter=',')
X = dataset[:, 0:8]
Y = dataset[:, 8]

# define the model: 8 inputs -> 12 -> 8 -> 1 (layer sizes are a
# standard choice for this dataset, reconstructed here)
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='adam', metrics=['accuracy'])
model.fit(X, Y, epochs=150, batch_size=10)
scores = model.evaluate(X, Y)
```
$ python keras_pima.py
Epoch 2/150
Epoch 3/150
...
Epoch 149/150
Epoch 150/150
The neural network trains for 150 epochs and returns the accuracy value. The
model can then be used for predictions, which can be achieved with the model.predict() method.
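For this binary classification task, model.predict() returns probabilities; the snippet below is a minimal sketch of turning such probabilities into class labels (the probability values are invented, standing in for real predictions).

```python
import numpy as np

# stand-in for the probabilities returned by model.predict(X)
probabilities = np.array([0.12, 0.87, 0.45, 0.61])

# threshold at 0.5 to turn probabilities into 0/1 class labels
predictions = (probabilities > 0.5).astype(int)
print(predictions)  # [0 1 0 1]
```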
ENDING NOTES
A few other architectures, like recurrent neural networks, are widely applied for
text and voice processing use cases. These neural networks, when applied to large
datasets, need huge computation power and hardware acceleration, achieved by
configuring graphics processing units (GPUs).
If you are new to using GPUs, you can find free preconfigured environments online
through Kaggle Notebooks or Google Colab notebooks.
To achieve an efficient model, one must iterate over network architectures, which
needs a lot of experimenting and experience. Therefore, a lot of coding practice is
strongly recommended.