1. Linear Kernel:
The linear kernel computes the dot product between two data points in the original feature
space:
K_linear(x, z) = x · z
Here, "x" and "z" are the input data points, and "·" represents the dot product.
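As a sketch, the linear kernel is just a dot product, which can be written in a few lines of Python (the function name is illustrative):

```python
def linear_kernel(x, z):
    """Linear kernel: the dot product of two equal-length vectors."""
    return sum(xi * zi for xi, zi in zip(x, z))

print(linear_kernel([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```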
2. Polynomial Kernel:
The polynomial kernel raises the dot product of two data points to a specified power "d" and
adds an optional constant "c":
K_polynomial(x, z) = (x · z + c)ᵈ
In this formula:
"x" and "z" are the input data points.
"d" is the degree of the polynomial kernel, which determines the order of the polynomial.
"c" is an optional constant added to the dot product. When "c" is 0, the kernel is a pure
(homogeneous) polynomial kernel.
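A minimal sketch of the polynomial kernel, following the formula above (the function name and default values of "d" and "c" are illustrative):

```python
def polynomial_kernel(x, z, d=2, c=1.0):
    """Polynomial kernel: (x . z + c)^d."""
    dot = sum(xi * zi for xi, zi in zip(x, z))
    return (dot + c) ** d

print(polynomial_kernel([1, 0], [0.5, 1], d=2, c=1.0))  # (0.5 + 1.0)^2 = 2.25
```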
3. Gaussian (RBF) Kernel:
The Gaussian radial basis function (RBF) kernel measures similarity as a Gaussian of the
squared Euclidean distance between two data points:
K_RBF(x, z) = exp(−γ ‖x − z‖²)
In this formula:
"x" and "z" are the input data points.
"γ" (gamma) is a hyperparameter that controls the width of the Gaussian function. It
determines the influence of each training example on the similarity.
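A sketch of this Gaussian kernel in Python, assuming the standard RBF form K(x, z) = exp(−γ ‖x − z‖²); the function name and default gamma are illustrative:

```python
import math

def rbf_kernel(x, z, gamma=0.5):
    """Gaussian (RBF) kernel: exp(-gamma * ||x - z||^2)."""
    sq_dist = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
    return math.exp(-gamma * sq_dist)

print(rbf_kernel([1, 2], [1, 2]))  # identical points -> 1.0
print(rbf_kernel([0, 0], [3, 4], gamma=0.1))  # decays with distance
```

Note how larger gamma makes the similarity fall off faster with distance, so each training example influences a smaller neighborhood.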
These are the basic formulas for these three types of kernels. SVMs use these kernels to
implicitly map data points into higher-dimensional spaces, allowing for the construction of
nonlinear decision boundaries in the feature space. The choice of kernel depends on the
nature of the data and the problem you are trying to solve.
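The "implicit mapping" claim can be made concrete for the degree-2 homogeneous polynomial kernel on 2-D inputs, where the explicit feature map φ(x) = (x₁², √2·x₁x₂, x₂²) is known in closed form. The sketch below (names are illustrative) checks that the kernel value equals the dot product in the mapped space:

```python
import math

def poly2_kernel(x, z):
    """Homogeneous polynomial kernel with d=2, c=0 on 2-D vectors."""
    return (x[0] * z[0] + x[1] * z[1]) ** 2

def phi(x):
    """Explicit degree-2 feature map for a 2-D input."""
    return (x[0] ** 2, math.sqrt(2) * x[0] * x[1], x[1] ** 2)

x, z = (1.0, 2.0), (3.0, 4.0)
lhs = poly2_kernel(x, z)                          # (1*3 + 2*4)^2 = 121.0
rhs = sum(a * b for a, b in zip(phi(x), phi(z)))  # dot product in feature space
print(lhs, rhs)  # both 121.0 (up to floating-point rounding)
```

This is why SVMs never need to compute φ(x) explicitly: evaluating the kernel on the original inputs gives the same result at far lower cost, especially when the mapped space is high- or infinite-dimensional (as with the RBF kernel).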