Seminar
Presented By
Anisha N Rao & Aditi Goel
Table of Contents
1 Introduction
2 Support Vector Classification
3 Support Vector Regression
4 Case Study
5 SVC Implementation
6 Conclusion
Introduction
Support Vector Classification
Basic Concepts
• Support Vectors: The data points that lie closest to the hyperplane are called
support vectors. The separating line is defined with the help of these data
points.
• Hyperplane: As shown in the diagram, it is the decision plane or boundary
that divides the space between sets of objects belonging to different classes.
• Margin: The gap between the two lines drawn through the closest data
points of the different classes. It is calculated as the perpendicular distance
from the line to the support vectors. A large margin is considered a good
margin and a small margin is considered a bad margin. (A short code sketch
illustrating these quantities follows this list.)
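To make these quantities concrete, here is a minimal sketch assuming scikit-learn and a small hypothetical data set (neither appears on this slide); it fits a linear SVC, then reads back the support vectors and the margin width 2/||w||.

# Minimal sketch: fit a linear SVC, then inspect its support vectors and margin.
# scikit-learn and the toy points below are assumptions for illustration only.
import numpy as np
from sklearn.svm import SVC

# Two small, linearly separable clouds (classes -1 and +1).
X = np.array([[1.0, 1.0], [1.5, 2.0], [2.0, 1.5],
              [4.0, 4.0], [4.5, 5.0], [5.0, 4.5]])
y = np.array([-1, -1, -1, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)        # very large C ~ hard margin

print("Support vectors:\n", clf.support_vectors_)  # points closest to the hyperplane
w = clf.coef_[0]                                   # normal vector of the hyperplane
print("Margin width:", 2.0 / np.linalg.norm(w))    # gap between the two margin lines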
Figure: Classification Visualization Using SVC
Hyperplane
Figure: Hyperplane in 3D
Hyperplane Visualization
Figure: Hyperplane in 2D
Figure: Hyperplane in 1D
Geometry
• The distance between a point (x₀, y₀) and a line ax + by + c = 0 is equal to
|ax₀ + by₀ + c| / √(a² + b²). (A worked example follows this list.)
• Linear SVC: Linear SVC is used for linearly separable data. If a data set can
be classified into two classes using a single straight line, the data is termed
linearly separable, and the classifier used is called a linear SVM classifier.
• Non-linear SVC: Non-linear SVC is used for non-linearly separable data. If a
data set cannot be classified using a straight line, the data is termed
non-linear, and the classifier used is called a non-linear SVM classifier.
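The worked example mentioned above: a short plain-Python sketch (the line coefficients and the point are hypothetical values) that evaluates the point-to-line distance formula.

# Sketch: perpendicular distance from a point (x0, y0) to the line ax + by + c = 0.
# The coefficients and the point below are hypothetical example values.
import math

def point_line_distance(a, b, c, x0, y0):
    """Return |a*x0 + b*y0 + c| / sqrt(a^2 + b^2)."""
    return abs(a * x0 + b * y0 + c) / math.sqrt(a ** 2 + b ** 2)

# Line 3x + 4y - 5 = 0 and point (1, 1): |3 + 4 - 5| / 5 = 0.4
print(point_line_distance(3, 4, -5, 1, 1))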
Maximum Margin Classifier - Hard Margin
• The maximum margin classifier (MMC) is the hyperplane that, among all
separating hyperplanes, leaves the biggest gap (margin) between the two classes.
• The core idea of the hard margin is to maximize the margin under the
constraint that the classifier does not make any mistake.
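For reference, the hard-margin problem described above is usually written as the following optimization (a standard formulation, not shown explicitly on the slide); maximizing the margin 2/||w|| is equivalent to minimizing ½||w||².

% Hard-margin SVM: widest margin with no training mistakes allowed.
\min_{w,\,b}\ \tfrac{1}{2}\lVert w \rVert^{2}
\quad \text{subject to} \quad
y_i \,(w \cdot x_i + b) \ge 1, \qquad i = 1, \dots, n.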
Support Vector Classifier - Soft Margin
Figure: Soft Margin
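The soft margin in the figure is conventionally written with slack variables ξᵢ and a penalty parameter C (a standard formulation added here for completeness). A large C approaches the hard margin, while a small C tolerates more violations in exchange for a wider margin.

% Soft-margin SVM: margin violations xi_i are allowed but penalized by C.
\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\lVert w \rVert^{2} + C \sum_{i=1}^{n} \xi_i
\quad \text{subject to} \quad
y_i \,(w \cdot x_i + b) \ge 1 - \xi_i, \qquad \xi_i \ge 0.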
The Kernel Trick
Sometimes a linear boundary simply won't work, no matter what value of C is used.
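A minimal sketch of that situation, assuming scikit-learn and its make_circles toy data (neither appears on this slide): the linear kernel stays near chance level no matter how C is tuned, while an RBF kernel separates the two rings.

# Sketch of the kernel trick: data no straight line separates, handled by RBF.
# scikit-learn and make_circles are assumptions for illustration only.
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_circles(n_samples=400, factor=0.3, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for kernel in ("linear", "rbf"):
    clf = SVC(kernel=kernel, C=1.0).fit(X_train, y_train)
    print(kernel, "test accuracy:", clf.score(X_test, y_test))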
Types Of Kernel
Figure: Kernels
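For reference, the kernels usually compared in such a figure have the following standard forms (these definitions follow the usual scikit-learn conventions, not the figure itself):

% Commonly used SVM kernel functions (standard definitions).
\begin{aligned}
\text{Linear:}     &\quad K(x, x') = x \cdot x' \\
\text{Polynomial:} &\quad K(x, x') = (\gamma \, x \cdot x' + r)^{d} \\
\text{RBF:}        &\quad K(x, x') = \exp\!\left(-\gamma \lVert x - x' \rVert^{2}\right) \\
\text{Sigmoid:}    &\quad K(x, x') = \tanh\!\left(\gamma \, x \cdot x' + r\right)
\end{aligned}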
Comparing Kernels
SVR Overview
• A regression model estimates a continuous-valued multivariate function.
• SVMs formulate binary classification as a convex optimization problem.
Optimization problem goals:
• Find the maximum-margin separating hyperplane.
• At the same time, correctly classify as many training points as possible.
• SVMs represent this optimal hyperplane with support vectors.
SVR: Concepts, Mathematical Model, and Graphical Representation
f(x) = wx + b
represents the hyperplane, and the two dotted lines around it,
y = f(x) + ϵ
and
y = f(x) − ϵ,
represent the decision boundaries, where ϵ is the distance from the hyperplane.
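A minimal SVR sketch, assuming scikit-learn and a synthetic 1-D data set (both are illustrative assumptions): points whose residual stays inside the ±ϵ tube contribute no loss, so only points on or outside the tube end up as support vectors.

# Sketch: epsilon-insensitive SVR on a synthetic 1-D problem.
# scikit-learn and the generated data are assumptions for illustration only.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 5, size=(80, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(80)

svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
print("Support vectors used:", len(svr.support_))  # points outside the epsilon tube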
Graphical Presentation
• The main aim of the support vector regression model is to place a decision
boundary at a distance ϵ from the original hyperplane such that the data points
closest to the hyperplane, the support vectors, lie within that boundary. Any
hyperplane that satisfies SVR should therefore satisfy
−ϵ ≤ y − (wx + b) ≤ +ϵ.
(The full soft formulation with slack variables is written out below.)
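Written out in full, the corresponding soft ϵ-insensitive SVR problem (a standard formulation with slack variables ξᵢ, ξᵢ*, added here for completeness) is:

% epsilon-insensitive SVR with slack variables xi_i and xi_i^*.
\min_{w,\,b,\,\xi,\,\xi^{*}}\ \tfrac{1}{2}\lVert w \rVert^{2}
  + C \sum_{i=1}^{n} \left( \xi_i + \xi_i^{*} \right)
\quad \text{subject to} \quad
\begin{cases}
  y_i - (w \cdot x_i + b) \le \epsilon + \xi_i \\
  (w \cdot x_i + b) - y_i \le \epsilon + \xi_i^{*} \\
  \xi_i,\ \xi_i^{*} \ge 0
\end{cases}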
SVR generalization to SVC
Basic details on Loss Function
A few examples of loss functions
Figure: (a) Linear loss function, (b) quadratic loss function, (c) Huber loss function
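A small sketch of these three loss shapes as functions of the residual r = y − f(x) (NumPy assumed; the Huber threshold δ = 1 is a hypothetical choice):

# Sketch: linear, quadratic, and Huber losses of the residual r = y - f(x).
# NumPy is assumed; delta is a hypothetical Huber threshold for illustration.
import numpy as np

def linear_loss(r):
    return np.abs(r)

def quadratic_loss(r):
    return r ** 2

def huber_loss(r, delta=1.0):
    # Quadratic near zero, linear beyond |r| = delta (less sensitive to outliers).
    return np.where(np.abs(r) <= delta,
                    0.5 * r ** 2,
                    delta * (np.abs(r) - 0.5 * delta))

r = np.linspace(-3.0, 3.0, 7)
print(linear_loss(r), quadratic_loss(r), huber_loss(r), sep="\n")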
Figure: Solutions For SVR with various orders of polynomial
This graph visualizes how the magnitude of the weights can be interpreted
as a measure of flatness.
• The horizontal line is a 0th-order polynomial solution; it deviates greatly
from the desired outputs, so it has a large error.
• The linear function produces better approximations for a portion of the data
but still underfits the training data.
• The 4th-order solution produces the best tradeoff between function flatness
and prediction error.
• The higher-order solution has zero training error but high complexity, and
it overfits, performing poorly on yet-unseen data.
(A small sketch of this degree/flatness tradeoff follows this list.)
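A rough sketch of that tradeoff, assuming scikit-learn, polynomial features, and a synthetic data set (all illustrative assumptions; the 0th-order horizontal line from the figure is omitted). The trend generally mirrors the figure: higher degrees can drive the training error down at the cost of larger weights, i.e. less flatness.

# Sketch: polynomial degree vs. flatness (weight norm) and training error in SVR.
# scikit-learn and the synthetic data are assumptions for illustration only.
import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.svm import LinearSVR

rng = np.random.default_rng(0)
X = np.linspace(-1, 1, 40).reshape(-1, 1)
y = np.sin(3 * X).ravel() + 0.1 * rng.standard_normal(40)

for degree in (1, 4, 12):
    model = make_pipeline(PolynomialFeatures(degree), LinearSVR(C=10.0, max_iter=50_000))
    model.fit(X, y)
    w = model.named_steps["linearsvr"].coef_
    mse = mean_squared_error(y, model.predict(X))
    print(f"degree={degree:2d}  ||w||={np.linalg.norm(w):8.3f}  train MSE={mse:.4f}")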
Case Study of SVM for Handwriting Recognition
Preprocessing
Feature Extraction
Hierarchical, Three-Stage SVM
Clustering for the upper-case and lower-case letters of the alphabet.
3-stage Hierarchical SVM Block Diagram
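The block diagram itself is not reproduced here. Purely to illustrate the hierarchical idea, the following sketch wires two SVM stages together (the grouping, labels, and routing are hypothetical and not the authors' exact three-stage design):

# Illustrative sketch of a hierarchical SVM: stage 1 picks a coarse group
# (e.g. digit vs. letter), stage 2 classifies within that group.
# The stages and labels are hypothetical, not the case study's exact design.
import numpy as np
from sklearn.svm import SVC

class TwoStageSVM:
    def __init__(self):
        self.stage1 = SVC(kernel="rbf")   # coarse group classifier
        self.stage2 = {}                  # one fine-grained classifier per group

    def fit(self, X, y_group, y_fine):
        self.stage1.fit(X, y_group)
        for g in np.unique(y_group):
            mask = (y_group == g)
            self.stage2[g] = SVC(kernel="rbf").fit(X[mask], y_fine[mask])
        return self

    def predict(self, X):
        groups = self.stage1.predict(X)
        return np.array([self.stage2[g].predict(x.reshape(1, -1))[0]
                         for x, g in zip(X, groups)])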
Confusion plot for classified label versus true label
Table Of Experimental Results
SVC Implementation
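The implementation slides that followed appear to have been code screenshots and are not reproduced here. As a stand-in, here is a minimal end-to-end SVC pipeline sketch, assuming scikit-learn and the classic Iris data set purely for illustration (not the code from the original slides):

# Minimal SVC pipeline sketch; scikit-learn and the Iris data set are
# assumptions for illustration, not the code from the original slides.
from sklearn.datasets import load_iris
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

# Feature scaling matters for SVMs because kernels depend on distances.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))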
Conclusion
SVM Advantages
• Support vector machines are very effective even with high-dimensional data.
• When the number of features exceeds the number of rows of data, SVM can
still perform well.
• When the classes in the data are well separated, SVM works really well.
• SVM can be used for both regression and classification problems. Last but
not least, SVM also works well with image data.
SVM Disadvantages
• When the classes in the data are not well separated, that is, when classes
overlap, SVM does not perform well.
• An optimal kernel must be chosen for SVM, and this task is difficult.
• SVM takes comparatively more time to train on large data sets.
• SVM is not a probabilistic model, so we cannot explain the classification in
terms of probability.
• An SVM model is more difficult to understand and interpret than a decision
tree, as SVM is more complex.
THANK YOU