TYPES OF KERNELS IN SVM

1. Linear Kernel:
The linear kernel computes the dot product between two data points in the original feature
space:
K_linear(x, z) = x · z
Here, "x" and "z" are the input data points, and "·" represents the dot product.
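The linear kernel can be sketched in a few lines of NumPy (the function name is illustrative, not part of any library):

```python
import numpy as np

def linear_kernel(x, z):
    """Linear kernel: K(x, z) = x . z, the dot product in the original feature space."""
    return np.dot(x, z)

x = np.array([1.0, 2.0])
z = np.array([3.0, 4.0])
print(linear_kernel(x, z))  # 1*3 + 2*4 = 11.0
```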

2. Polynomial Kernel:
The polynomial kernel raises the dot product of two data points to a specified power "d" and
adds an optional constant "c":
K_polynomial(x, z) = (x · z + c)ᵈ
In this formula:
"x" and "z" are the input data points.
"d" is the degree of the polynomial kernel, which determines the order of the polynomial.
"c" is an optional constant added to the dot product. When "c" is 0, the kernel is homogeneous, i.e., a pure polynomial of the dot product.
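A minimal NumPy sketch of this formula (the function name and default values are illustrative):

```python
import numpy as np

def polynomial_kernel(x, z, d=2, c=1.0):
    """Polynomial kernel: K(x, z) = (x . z + c)^d.

    d is the degree of the polynomial; c shifts the dot product
    (c = 0 gives a homogeneous polynomial kernel).
    """
    return (np.dot(x, z) + c) ** d

x = np.array([1.0, 2.0])
z = np.array([3.0, 4.0])
print(polynomial_kernel(x, z, d=2, c=1.0))  # (11 + 1)^2 = 144.0
```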

3. Radial Basis Function (RBF) Kernel (Gaussian Kernel):
The RBF kernel, also known as the Gaussian kernel, measures the similarity between two
data points using a Gaussian function of the Euclidean distance between them:
K_rbf(x, z) = exp(-γ * ||x - z||²)

In this formula:
"x" and "z" are the input data points.
"γ" (gamma) is a hyperparameter that controls the width of the Gaussian: a larger γ makes the similarity fall off more quickly with distance, so each training example influences only a smaller neighborhood.
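The RBF formula can be sketched the same way (function name and the choice of γ are illustrative):

```python
import numpy as np

def rbf_kernel(x, z, gamma=0.5):
    """RBF (Gaussian) kernel: K(x, z) = exp(-gamma * ||x - z||^2).

    Equals 1 when x == z and decays toward 0 as the points move apart.
    """
    return np.exp(-gamma * np.sum((x - z) ** 2))

x = np.array([1.0, 2.0])
z = np.array([3.0, 4.0])
print(rbf_kernel(x, x))  # identical points -> 1.0
print(rbf_kernel(x, z))  # distant points -> a value between 0 and 1
```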
These are the basic formulas for these three types of kernels. SVMs use these kernels to
implicitly map data points into higher-dimensional spaces, allowing for the construction of
nonlinear decision boundaries in the feature space. The choice of kernel depends on the
nature of the data and the problem you are trying to solve.
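In practice, an SVM never works with individual kernel values in isolation; it evaluates the kernel on every pair of training points to form a Gram (kernel) matrix. A small sketch of that idea, using the RBF kernel defined above (all names are illustrative, assuming only NumPy):

```python
import numpy as np

def rbf_kernel(x, z, gamma=0.5):
    # Gaussian similarity: 1 when x == z, decaying toward 0 with distance
    return np.exp(-gamma * np.sum((x - z) ** 2))

def gram_matrix(X, kernel):
    """Pairwise kernel evaluations: K[i, j] = kernel(X[i], X[j])."""
    n = len(X)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = kernel(X[i], X[j])
    return K

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
K = gram_matrix(X, rbf_kernel)
# For a valid kernel, K is symmetric with ones on the diagonal (for RBF)
# and positive semi-definite, which is what lets the SVM optimization
# treat kernel values as inner products in an implicit feature space.
```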
