
Support Vector Machines (SVMs) and Kernel Methods
• SVMs are a class of ML models used for classification and
regression tasks. They work by finding a hyperplane that best
separates data into different classes while maximizing the margin
between those classes.
• SVMs can be extended to handle various types of data and non-linear
problems using different SVM variants and kernel methods.
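As a minimal sketch of these two points (assuming scikit-learn; the dataset is an illustrative toy example), a linear SVM fails on data that is not linearly separable, while the same model with an RBF kernel separates it well:

```python
# Linear vs. RBF-kernel SVM on concentric circles, a classic
# non-linearly separable dataset (assumes scikit-learn is installed).
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric rings: no straight hyperplane can separate them.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear_clf = SVC(kernel="linear").fit(X, y)
rbf_clf = SVC(kernel="rbf").fit(X, y)

print("linear accuracy:", linear_clf.score(X, y))  # near chance level
print("rbf accuracy:", rbf_clf.score(X, y))        # near perfect
```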
Kernel methods are a class of techniques used in ML and data
analysis to solve complex problems by transforming data into a
higher-dimensional space.
They are particularly popular in the context of SVMs and other
algorithms for various tasks like classification, regression, and
dimensionality reduction.
Various types of support vector methods:
• Linear SVM: Basic SVM for linearly separable data, finds a hyperplane that maximizes the margin between classes.
• Non-Linear SVM: Uses kernel functions to handle non-linearly separable data, mapping it to a higher-dimensional space for better separation.
• Multi-Class SVM: Extension of SVM for multi-class classification, using techniques like one-vs-one or one-vs-all (one-vs-rest) to handle multiple classes.
• Support Vector Regression (SVR): Applies SVM principles to regression tasks, finding a hyperplane that best fits the data while maintaining a margin of tolerance.
• Weighted SVM: Allows assigning different weights to data points, giving more importance to certain samples during training.
• Semi-Supervised SVM: Combines labeled and unlabeled data to make predictions, useful when labeled data is limited but unlabeled data is abundant.
• Structured SVM: Applied to structured output spaces, suitable for problems like sequence labeling, natural language processing, and structured classification.
• Kernel Methods: General technique for transforming data into a higher-dimensional space using kernel functions, commonly used in conjunction with SVMs for non-linear problems.
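To make the SVR entry concrete, here is a hedged sketch (assuming scikit-learn; the noisy sine data and the `C`/`epsilon` values are illustrative choices, not prescribed settings). The `epsilon` parameter sets the margin of tolerance within which errors are not penalized:

```python
# Support Vector Regression on noisy sine data (assumes scikit-learn).
import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(80, 1), axis=0)      # inputs in [0, 5)
y = np.sin(X).ravel() + 0.1 * rng.randn(80)   # sine curve plus noise

# epsilon defines the tolerance tube around the fitted function;
# points inside it contribute no loss.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
pred = svr.predict(X)
```

Points lying outside the epsilon tube become the support vectors that define the regression function.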
Different types of kernel methods used in data analysis:
• Linear Kernel: Represents the dot product of data points in the original feature space. Suitable for linearly separable data.
• Polynomial Kernel: Maps data into a higher-dimensional space using polynomial functions, capturing non-linear relationships.
• Radial Basis Function (RBF) Kernel: Widely used for non-linear classification, mapping data into an infinite-dimensional space using a Gaussian function.
• Sigmoid Kernel: Useful for problems where the decision boundary is a combination of sigmoid functions.
• Laplacian Kernel: Similar to the RBF kernel but with a different shape, providing a different way to capture non-linear patterns.
• Exponential Kernel: Another alternative to the RBF kernel, using the exponential function for mapping data into a higher-dimensional space.
• String Kernel: Specifically designed for text data, measuring the similarity between sequences of characters or words.
• Custom Kernel: Allows users to define their own kernel functions tailored to the specific characteristics of the data.
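As a sketch of the last two rows (assuming scikit-learn, whose `SVC` accepts a callable that returns the Gram matrix; the "two moons" dataset and `gamma` value are illustrative), a Laplacian kernel can be plugged in as a custom kernel:

```python
# Custom kernel example (assumes scikit-learn): SVC accepts a callable
# that computes the Gram matrix between two sets of samples.
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import laplacian_kernel
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.1, random_state=0)

def laplacian(A, B, gamma=1.0):
    # Gram matrix of k(x, z) = exp(-gamma * ||x - z||_1)
    # between the rows of A and the rows of B.
    return laplacian_kernel(A, B, gamma=gamma)

clf = SVC(kernel=laplacian).fit(X, y)
print("accuracy:", clf.score(X, y))
```

Any symmetric, positive semi-definite function of two samples can be used the same way.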
