
Standard Deviation - how much the data points deviate from the mean (average); used to spot outliers.

Variance - the average of the squared differences from the mean; the standard deviation is the square root of the variance.
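
A quick sketch of these two definitions in plain Python (the data values are made up; this uses the population formulas, which divide by n rather than n - 1):

```python
# Mean, variance and standard deviation (population formulas).
data = [2, 4, 4, 4, 5, 5, 7, 9]

mean = sum(data) / len(data)                                # average
variance = sum((x - mean) ** 2 for x in data) / len(data)   # mean squared difference from the mean
std_dev = variance ** 0.5                                   # std dev = square root of variance

print(mean, variance, std_dev)   # 5.0 4.0 2.0
```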

Overfitting - the model learns the noise and other details to such an extent that its predictions on new data are much worse.

Underfitting - the model is not sufficiently complex to predict well, i.e., it is too simple.

Cross Validation Error - the error on the held-out (test/validation) data.

Cost Function - measures how far the model's predictions are from the actual values; training tries to minimize it.

Bias - low bias vs. high bias - roughly, the error on the training data; bias is the difference between the expected value of the estimator and the true value being predicted.

Variance - low variance vs. high variance - roughly, the error on the test data; how much the predicted values vary across different training sets.

Underfitting - the data are not sufficient for training, or the model is very simple.

Mean - Average in statistics

Gradient Descent Algorithm - an optimization algorithm; it tries to find the global optimum of the model's cost function.

Local Optimum
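
A minimal sketch of gradient descent on a one-variable function, assuming f(x) = (x - 3)^2 and a fixed learning rate (both are just illustrative choices):

```python
# Minimise f(x) = (x - 3)**2 with plain gradient descent.
def grad(x):
    return 2 * (x - 3)                  # derivative of (x - 3)**2

x = 0.0                                 # arbitrary starting point
learning_rate = 0.1
for _ in range(100):
    x = x - learning_rate * grad(x)     # step against the gradient

print(x)                                # ends up close to the minimum at x = 3
```

On a convex function like this the update reaches the global optimum; on a non-convex cost function the same rule can get stuck in a local optimum, which is why the starting point and learning rate matter.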

Mid Point
Drop Out - randomly dropping units (in the hidden and visible layers) of a neural network during training, to reduce overfitting.
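
A rough sketch of what dropout does to one layer's activations during training (the 0.5 drop rate and the activation values are just illustrative):

```python
import numpy as np

def dropout(activations, drop_rate=0.5):
    # Randomly zero out units and scale the survivors so the expected
    # activation stays the same ("inverted dropout"); at test time the
    # layer is used unchanged.
    mask = (np.random.rand(*activations.shape) >= drop_rate).astype(float)
    return activations * mask / (1.0 - drop_rate)

hidden = np.array([0.2, 0.7, 1.5, 0.9])   # made-up hidden-layer activations
print(dropout(hidden))
```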

Cost function - to minimize the operation cost

SVM is used for Multidimensional Data
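
As an illustration, scikit-learn's SVC can be fitted on multidimensional feature vectors in a few lines (the 4-dimensional toy data below is made up):

```python
from sklearn.svm import SVC

# Toy 4-dimensional feature vectors with two classes (made-up data).
X = [[0, 0, 1, 2], [1, 1, 0, 3], [8, 9, 7, 6], [9, 8, 8, 7]]
y = [0, 0, 1, 1]

model = SVC(kernel="rbf")        # RBF kernel can draw non-linear boundaries
model.fit(X, y)
print(model.predict([[1, 0, 1, 2], [9, 9, 7, 7]]))   # predictions for two new points
```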


Normalization - rescaling values of different ranges so they all fall between 0 and 1.
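
A minimal min-max normalization sketch (the values are made up):

```python
values = [10, 20, 35, 50]

lo, hi = min(values), max(values)
normalized = [(v - lo) / (hi - lo) for v in values]   # rescale to the 0-1 range
print(normalized)   # [0.0, 0.25, 0.625, 1.0]
```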

Neural Networks - mimic the neurons and connections of the human brain in order to learn things.
It is a process - a neural network is a series of algorithms that tries to learn the relationships in the given data through this process.

Back Propagation - propagates the error backwards from the output layer to update the weights.

Neural Network parameters - Bias, Activation, Neurons (features)

Outlier

Vanishing Gradient

Deep Learning - deep learning is a part of machine learning; it mimics the human brain to learn the way humans do. It has networks of neurons, like the human brain.

Perceptron - a simple neural network; it is a single-layer neural network.
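
A minimal single-layer perceptron sketch, trained on the logical AND function with the classic perceptron update rule (the learning rate and number of passes are arbitrary choices):

```python
# Single-layer perceptron learning logical AND.
inputs  = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 0, 0, 1]

w = [0.0, 0.0]
b = 0.0
lr = 0.1

for _ in range(20):                       # a few passes over the data
    for (x1, x2), t in zip(inputs, targets):
        z = w[0] * x1 + w[1] * x2 + b     # weighted sum plus bias
        y = 1 if z > 0 else 0             # step activation
        error = t - y
        w[0] += lr * error * x1           # perceptron update rule
        w[1] += lr * error * x2
        b    += lr * error

print(w, b)
print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for x1, x2 in inputs])  # [0, 0, 0, 1]
```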


In general, a neural network has:
Input Layer
Hidden Layer
Output Layer
Neuron
Weight
Features

How does a neural network work?


In this process, the features are sent to the input layer; each feature goes to a separate node in the input layer
(x1, x2, x3).
The first layer is the input layer, the second is the hidden layer (which may be several layers), and the third is the
output layer.
Lines (weights) connect the neurons between layers; they transfer data from one end
to the other (w1, w2, w3).
From the input layer to the hidden layer, every feature is connected to every neuron in the hidden
layer; the hidden-layer neurons are in turn connected to the output-layer neurons.
Each neuron in the hidden layer does two things: a summation of the weights times the features, and an
activation of that sum.
Sigmoid is an example of an activation function.
Once each weight and feature are multiplied and everything is summed, a bias is added (the bias is a learnable
offset of the neuron, not the training error), and the result is passed through the activation.
The sigmoid activation squashes its input to a value between 0 and 1 (see the sketch below).
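
A minimal sketch of that forward pass with three input features, one hidden layer and sigmoid activations (all weight, bias and input values below are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))     # squashes any value into the 0-1 range

# Three input features x1, x2, x3 (made-up values).
x = np.array([0.5, 0.2, 0.1])

# Input -> hidden: every feature connects to every hidden neuron (weights are illustrative).
W1 = np.array([[0.4, -0.3],
               [0.1,  0.8],
               [0.6,  0.2]])
b1 = np.array([0.05, -0.1])

# Hidden -> output.
W2 = np.array([[ 0.7],
               [-0.5]])
b2 = np.array([0.1])

hidden = sigmoid(x @ W1 + b1)    # weighted sum + bias, then activation
output = sigmoid(hidden @ W2 + b2)
print(hidden, output)
```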

Multidimensional data - SVM


Normalization -
Back Propagation
Gradient Descent Algorithm
Hyper Parameter
Vanishing Gradient
Exploding Gradient
Drop out
Overfitting - not generalized --> to avoid it: regularization, dropout, data augmentation
Underfitting
What layers do we include in a CNN?

Sigmoid function
Softmax - probability
Pooling layer -> pooling, anti-aliasing, max pooling, kernel (3x3 matrix)
Fully Connected Layer
Convolution layer - image processing
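
A hedged sketch of a small CNN built from those layers using tf.keras (the layer sizes, 3x3 kernels and the 28x28x1 input shape are assumptions for illustration, not a prescribed architecture):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Convolution -> pooling -> fully connected -> softmax, for a 28x28 grayscale image.
model = tf.keras.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),  # convolution layer, 3x3 kernels
    layers.MaxPooling2D((2, 2)),                                            # max pooling layer
    layers.Flatten(),
    layers.Dense(64, activation="relu"),                                    # fully connected layer
    layers.Dense(10, activation="softmax"),                                 # softmax gives class probabilities
])
model.summary()
```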

Cost function (loss function) - measures how well the model fits the data; minimizing it gives the best fit.
