
AI/ML Workshop

Presented by DappLogix Software Pvt. Ltd.

Guest Speakers:
Praful Vinayak
Dolly Vaishnav
Jaideep Sandhu
Why Is AI the Next Big Thing?

Just as electricity brought a significant change to the world, AI is doing the same today. Though we may not be aware of it, AI is everywhere.

AI has made an impact not only on the technology sector but also on other sectors such as retail, healthcare, and manufacturing.

From self-driving cars to playing chess, AI has matched or even outperformed humans in a growing range of tasks.
Machine Learning

Machine Learning is one of the most popular ways of implementing AI.

It teaches a computer to learn from experience, without the need to write explicit commands for every case.

It is the art of creating statistical models that recognize patterns in data, which enables a computer system to predict outcomes, learn from its mistakes, and improve.

It has some fascinating applications, ranging from simple tasks like classifying cat and dog images or predicting house prices, to building recommender systems and intelligent voice assistants like Siri.
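
To make this concrete, here is a minimal sketch of the house-price example, assuming scikit-learn (not mentioned on the slides); the training numbers are made up for illustration:

    # Fit a model that "learns from experience" (house sizes and prices),
    # then use it to predict the price of a house it has never seen.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Toy training data: size in square feet -> price in dollars.
    X_train = np.array([[600], [800], [1000], [1200], [1500]])
    y_train = np.array([150_000, 190_000, 230_000, 270_000, 330_000])

    model = LinearRegression()
    model.fit(X_train, y_train)        # learn the pattern from the data

    print(model.predict([[1100]]))     # predict for an unseen house
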
Comparison Between a Biological and an Artificial Neural Network (NN)

[Figure: similarity of signal flow between a human NN and an artificial NN]


Why Deep Neural Net?
Types of Neural Networks

Single-Layer Perceptron vs. Deep Neural Network

Binary Classification
General Methodology

1. Initialize parameters and define hyperparameters.
2. Repeat for n iterations:
   ● Forward propagation
   ● Compute the cost function
   ● Backward propagation
   ● Update parameters
3. Use the trained parameters to predict labels.

● Forward propagation: calculating the activations of all the layers of the NN.
● Cost: the difference between the actual and predicted output.
● Backward propagation: calculating the gradients with respect to the loss.
● Update: updating the parameters W and b.
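
This loop can be written out directly. Below is a minimal sketch for a one-hidden-layer binary classifier in NumPy; the data, layer sizes, and hyperparameter values are illustrative, not from the slides:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy data: X has shape (features, examples); Y holds 0/1 labels.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((2, 200))
    Y = (X[0] * X[1] > 0).astype(float).reshape(1, -1)

    # Initialize parameters and define hyperparameters.
    n_hidden, learning_rate, n_iterations = 4, 0.5, 1000
    W1 = rng.standard_normal((n_hidden, 2)) * 0.01; b1 = np.zeros((n_hidden, 1))
    W2 = rng.standard_normal((1, n_hidden)) * 0.01; b2 = np.zeros((1, 1))

    m = X.shape[1]
    for _ in range(n_iterations):
        # Forward propagation: activations of all layers.
        Z1 = W1 @ X + b1;   A1 = np.tanh(Z1)
        Z2 = W2 @ A1 + b2;  A2 = sigmoid(Z2)
        # Cost: binary cross-entropy between actual and predicted output.
        cost = -np.mean(Y * np.log(A2) + (1 - Y) * np.log(1 - A2))
        # Backward propagation: gradients with respect to the loss.
        dZ2 = A2 - Y
        dW2 = dZ2 @ A1.T / m; db2 = dZ2.mean(axis=1, keepdims=True)
        dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)
        dW1 = dZ1 @ X.T / m;  db1 = dZ1.mean(axis=1, keepdims=True)
        # Update the parameters W and b.
        W1 -= learning_rate * dW1; b1 -= learning_rate * db1
        W2 -= learning_rate * dW2; b2 -= learning_rate * db2

    # Use the trained parameters to predict labels.
    predictions = sigmoid(W2 @ np.tanh(W1 @ X + b1) + b2) > 0.5
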
Forward Propagation

Activation Functions
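
The slide's plots are not recoverable here, but a sketch of the usual activation choices (assumed, since the slide does not say which ones it showed) looks like this in NumPy:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))   # squashes to (0, 1); common at the output

    def tanh(z):
        return np.tanh(z)                 # squashes to (-1, 1); zero-centered

    def relu(z):
        return np.maximum(0.0, z)         # cheap; avoids vanishing gradients for z > 0
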
Softmax Classifier
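
A sketch of a softmax classifier's output step, which turns raw class scores into probabilities (the max-subtraction is a standard numerical-stability trick, not necessarily what the slide showed):

    import numpy as np

    def softmax(z):
        # Subtract the max before exponentiating so large scores don't overflow;
        # the outputs sum to 1 and can be read as class probabilities.
        e = np.exp(z - np.max(z, axis=0, keepdims=True))
        return e / e.sum(axis=0, keepdims=True)

    scores = np.array([[2.0], [1.0], [0.1]])   # raw scores for 3 classes
    print(softmax(scores))                     # probabilities summing to 1
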
Backward Propagation

After finding the loss, we compute the gradients of the loss with respect to the a, z, W, and b parameters, which are later used to update the parameters W and b.
Visualization of Forward and Backward Propagation
Forward and Backward Functions for an L-layer NN

● Forward propagation
● Backward propagation
● Loss
● Update parameters
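
The equations themselves appeared as images on the original slide. A sketch of the standard L-layer formulas they most likely showed, in the notation of Andrew Ng's course (which this deck recommends), where $g^{[l]}$ is layer $l$'s activation function, $m$ the number of examples, and $\alpha$ the learning rate:

    Forward propagation:
        $Z^{[l]} = W^{[l]} A^{[l-1]} + b^{[l]}, \qquad A^{[l]} = g^{[l]}(Z^{[l]})$

    Loss (binary cross-entropy):
        $\mathcal{L} = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log \hat{y}^{(i)} + (1 - y^{(i)}) \log\!\left(1 - \hat{y}^{(i)}\right) \right]$

    Backward propagation:
        $dZ^{[l]} = dA^{[l]} * g^{[l]\prime}(Z^{[l]}), \quad dW^{[l]} = \tfrac{1}{m}\, dZ^{[l]} A^{[l-1]\top}, \quad db^{[l]} = \tfrac{1}{m} \sum_{i=1}^{m} dZ^{[l](i)}, \quad dA^{[l-1]} = W^{[l]\top} dZ^{[l]}$

    Update parameters:
        $W^{[l]} := W^{[l]} - \alpha\, dW^{[l]}, \qquad b^{[l]} := b^{[l]} - \alpha\, db^{[l]}$
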
Optimization Algorithms
Gradient Descent

[Figure: gradient descent 3D visualization]

● Batch gradient descent: considers the complete set of training examples at each step.
● Stochastic gradient descent: takes one training example at a time.
● Mini-batch gradient descent: takes one mini-batch of training examples at a time.
Comparison of Different Gradient Descents
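
A minimal sketch of the three variants on a linear least-squares model, assuming NumPy; the batch size, learning rate, and data below are illustrative:

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 3))          # 200 examples, 3 features
    y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.standard_normal(200)

    def gradient(w, Xb, yb):
        # Gradient of the mean squared error for a linear model on (Xb, yb).
        return 2 * Xb.T @ (Xb @ w - yb) / len(yb)

    w = np.zeros(3); lr = 0.1

    # Batch GD: the complete training set per update.
    for _ in range(100):
        w -= lr * gradient(w, X, y)

    # Stochastic GD: one training example per update.
    for i in range(len(y)):
        w -= lr * gradient(w, X[i:i+1], y[i:i+1])

    # Mini-batch GD: a small batch (here 32 examples) per update.
    for start in range(0, len(y), 32):
        w -= lr * gradient(w, X[start:start+32], y[start:start+32])
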
Train/Test Dataset Split
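
A common way to make this split, assuming scikit-learn; the 80/20 ratio is illustrative:

    from sklearn.model_selection import train_test_split

    # Hold out 20% of the data to measure how well the model generalizes
    # to examples it has never seen (X, y as in the previous sketch).
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)
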
Overfitting and Underfitting Explained

● High bias (underfitting): train set accuracy is low.
● Just the right amount: the model fits the data well.
● High variance (overfitting): test set accuracy is low.
How To Remove Overfitting?

Steps for reducing overfitting:

● Add more data.
● Use data augmentation.
● Use architectures that generalize well.
● Add regularization (mostly dropout; L1/L2 regularization is also possible; see the dropout sketch after the next slide).
● Use early stopping.
● Reduce architecture complexity.
Dropout Regularization (Example)

Using dropout to randomly drop nodes in a NN (switching off some of the nodes).
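
A sketch of inverted dropout in NumPy (one common formulation; the slide does not specify which variant it used). Surviving activations are scaled by 1/keep_prob so their expected value is unchanged, which lets the network run without dropout at test time:

    import numpy as np

    def dropout_forward(a, keep_prob=0.8, training=True):
        # During training, randomly switch off nodes and scale the survivors.
        if not training:
            return a                              # no dropout at test time
        mask = np.random.rand(*a.shape) < keep_prob
        return a * mask / keep_prob

    activations = np.ones((4, 3))
    print(dropout_forward(activations, keep_prob=0.8))
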
Hyperparameters
These are the settings that control how the parameters W and b are learned, and they directly affect the performance of a neural network.

Some of the commonly used hyperparameters are:

● Number of hidden units
● Learning rate
● Number of hidden layers
● Choice of activation function
● Number of epochs
● Batch size
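
A minimal sketch of collecting these in one place so they can be tuned without touching the model code; all values below are illustrative, not recommendations from the slides:

    # Hypothetical hyperparameter configuration for a training run.
    hyperparams = {
        "n_hidden_units": 64,
        "n_hidden_layers": 2,
        "learning_rate": 0.01,
        "activation": "relu",
        "n_epochs": 20,
        "batch_size": 32,
    }
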
Resources For Further Reading
Machine Learning and Deep Learning courses by Andrew Ng on Coursera.

Medium articles covering Machine Learning in detail.

Kaggle competitions to boost your AI skills.

Analytics Vidhya - for data science tutorials.

Follow Siraj Raval on YouTube.
