
Support Vector Machines

Constrained Optimization
• Recall the gradient descent method for finding the minimum of a cost function.
• Optimization is finding the min or max of a function, e.g. finding the minimum of a cost function through gradient descent.
• Constrained optimization is optimization with certain constraints on the values of the variables/parameters.
• Example: adding a regularization term to the cost function penalizes large parameter values.
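The regularization idea above can be sketched as gradient descent on a ridge-penalized least-squares cost. The data, learning rate, and penalty weight below are illustrative choices, not from the slides:

```python
import numpy as np

# Gradient descent on J(w) = ||Xw - y||^2 + lam * ||w||^2.
# The lam * ||w||^2 term is a soft penalty that keeps parameter values small.
def gradient_descent(X, y, lam=0.1, lr=0.01, steps=1000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        # gradient of the data-fit term plus gradient of the penalty term
        grad = 2 * X.T @ (X @ w - y) + 2 * lam * w
        w -= lr * grad
    return w

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # toy inputs (assumed)
y = np.array([1.0, 2.0, 3.0])                        # toy targets (assumed)
w = gradient_descent(X, y)
```

With lam = 0 this reduces to ordinary gradient descent; increasing lam shrinks w toward zero.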

Constrained Optimization
• The methods of constrained optimization can be divided into two types:
• Direct Methods: These methods directly incorporate the constraints
into the optimization algorithm.
• For example, the penalty function method, as we saw with regularization.

• Indirect Methods: These methods convert the constrained optimization problem into an unconstrained one with the help of Lagrange multipliers.
Lagrange Multipliers
Lagrange Multipliers
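A small worked sketch of the Lagrange-multiplier idea (the example problem is our choice, not from the slides): minimize f(x, y) = x² + y² subject to x + y = 1. Setting the gradient of the Lagrangian L = f − λ(x + y − 1) to zero gives a linear system we can solve directly:

```python
import numpy as np

# Stationarity of L = x^2 + y^2 - lam*(x + y - 1) gives:
#   2x - lam = 0
#   2y - lam = 0
#   x + y    = 1   (the constraint itself)
A = np.array([[2.0, 0.0, -1.0],
              [0.0, 2.0, -1.0],
              [1.0, 1.0,  0.0]])
b = np.array([0.0, 0.0, 1.0])
x, y, lam = np.linalg.solve(A, b)  # constrained minimum: x = y = 0.5, lam = 1
```

The multiplier λ measures how much the optimal cost would change if the constraint were relaxed.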
Binary Classification Problem
Which one of the two hyperplanes should we choose as the decision boundary?

The solid-line one, as it leaves more room (margin) on both sides.


Maximum Margin Classifier
• Margin: the distance from the hyperplane to the closest point in X.
• Every hyperplane is characterized by its direction (determined by w) and its exact position in space (determined by w0).
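The signed distance of a point x from the hyperplane w·x + w0 = 0 is (w·x + w0)/||w||, and the margin of a dataset is the smallest absolute distance over all points. A minimal sketch (the hyperplane and points below are illustrative, not from the slides):

```python
import numpy as np

def distance(x, w, w0):
    # signed distance of point x from the hyperplane w . x + w0 = 0
    return (w @ x + w0) / np.linalg.norm(w)

w = np.array([1.0, 1.0])   # direction of the hyperplane (assumed)
w0 = -1.0                  # position: the line x1 + x2 - 1 = 0 (assumed)
points = np.array([[2.0, 2.0], [0.0, 0.0], [1.0, 1.0]])

# margin: smallest absolute distance over all points
margin = min(abs(distance(x, w, w0)) for x in points)
```

The maximum margin classifier chooses w and w0 so that this smallest distance is as large as possible.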
Nonseparable Classes
Non-linear Decision Boundary

[Figure: data with a non-linear decision boundary in the (x1, x2) plane]

Is there a different / better choice of features?
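One answer: transform the inputs so the boundary becomes linear in the new features. A sketch with an assumed circular boundary x1² + x2² = 1 (our illustrative choice), which is non-linear in (x1, x2) but linear in the mapped features (x1², x2²):

```python
import numpy as np

def phi(x):
    # map (x1, x2) -> (x1^2, x2^2); a circle becomes a straight line here
    return np.array([x[0] ** 2, x[1] ** 2])

def predict(x):
    # in the new feature space the boundary is the line z1 + z2 = 1
    return 1 if phi(x).sum() > 1.0 else 0

inside = np.array([0.5, 0.5])   # inside the unit circle
outside = np.array([1.5, 0.0])  # outside the unit circle
```

Kernels, introduced next, generalize this idea without constructing the mapped features explicitly.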


Kernel
Given x, compute new features depending on proximity to landmarks.
[Figure: landmarks in the (x1, x2) plane]
Kernels and Similarity
f1 will be at its maximum, i.e. 1, when x1 = 3 and x2 = 5.
Example: for points near l(1), l(2) (since theta1 = theta2 = 1),
we predict y = 1; otherwise we predict y = 0.
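This similarity feature is the Gaussian (RBF) kernel, f1 = exp(−||x − l(1)||² / (2σ²)), which equals 1 exactly when x sits on the landmark. A sketch using the slide's landmark l(1) = (3, 5) and theta1 = theta2 = 1; the second landmark, sigma, and theta0 are our assumptions:

```python
import numpy as np

def similarity(x, l, sigma=1.0):
    # Gaussian kernel: 1 when x == l, decaying toward 0 as x moves away
    return np.exp(-np.sum((x - l) ** 2) / (2 * sigma ** 2))

l1 = np.array([3.0, 5.0])   # landmark from the slide
l2 = np.array([0.0, 0.0])   # second landmark (assumed)

x = np.array([3.0, 5.0])    # query point sitting exactly on l1
f1 = similarity(x, l1)      # 1.0
f2 = similarity(x, l2)      # near 0: x is far from l2

# predict y = 1 when theta0 + theta1*f1 + theta2*f2 >= 0 (theta0 assumed)
theta0, theta1, theta2 = -0.5, 1.0, 1.0
y_pred = 1 if theta0 + theta1 * f1 + theta2 * f2 >= 0 else 0
```

Points near a landmark get a feature value near 1 and fall on the y = 1 side of the decision rule.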

[Figure: a point x plotted in the (x1, x2) plane]

x is close to l(1) and far away from l(2) and l(3).
