
1 Hard Margin SVM:

\[
\min_{w, w_0} \; \frac{1}{2} w^T w
\qquad \text{s.t.} \quad \forall i : y_i \left( w^T x_i + w_0 \right) \ge 1
\tag{1}
\]
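As a quick numerical illustration of the constraint in (1), the sketch below checks that a candidate hyperplane separates a toy dataset with margin at least 1. Both the two-point dataset and the candidate (w, w_0) are assumptions chosen for illustration, not taken from the text.

```python
import numpy as np

# Toy linearly separable data (assumed for illustration): one point per class.
X = np.array([[0.0, 0.0], [2.0, 2.0]])
y = np.array([-1.0, 1.0])

# Candidate maximum-margin solution for this data (assumed, not derived here).
w = np.array([0.5, 0.5])
w0 = -1.0

# The hard-margin constraint: y_i (w^T x_i + w_0) >= 1 for all i.
margins = y * (X @ w + w0)
print(np.all(margins >= 1))  # → True
```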

2 Deriving the Dual Problem for the Hard Margin SVM


The Lagrangian function can be written as
\[
L(w, b, \alpha) = \frac{1}{2} w^T w - \sum_{i=1}^{m} \alpha^{(i)} \left[ t^{(i)} \left( w^T x^{(i)} + b \right) - 1 \right]
\tag{2}
\]
with \(\forall i : \alpha^{(i)} \ge 0\). Here \(t^{(i)} \in \{-1, +1\}\) denotes the label \(y_i\) and \(b\) the bias \(w_0\) of the primal problem (1).
Taking the partial derivatives of the Lagrangian with respect to \(w\) and \(b\) gives
\[
\nabla_w L(w, b, \alpha) = w - \sum_{i=1}^{m} \alpha^{(i)} t^{(i)} x^{(i)},
\qquad
\frac{\partial}{\partial b} L(w, b, \alpha) = - \sum_{i=1}^{m} \alpha^{(i)} t^{(i)}
\tag{3}
\]

When these partial derivatives are set equal to zero, we have
\[
\hat{w} = \sum_{i=1}^{m} \hat{\alpha}^{(i)} t^{(i)} x^{(i)},
\qquad
\sum_{i=1}^{m} \hat{\alpha}^{(i)} t^{(i)} = 0
\tag{4}
\]

If the obtained stationary points are substituted into the Lagrangian,
\[
\frac{1}{2} w^T w
= \frac{1}{2} \left( \sum_{i=1}^{m} \alpha^{(i)} t^{(i)} x^{(i)} \right)^T \left( \sum_{j=1}^{m} \alpha^{(j)} t^{(j)} x^{(j)} \right)
= \frac{1}{2} \sum_{i} \sum_{j} \alpha^{(i)} \alpha^{(j)} t^{(i)} t^{(j)} \langle x^{(i)}, x^{(j)} \rangle
\tag{5}
\]
For the second term of (2),
\[
\sum_{i=1}^{m} \alpha^{(i)} \left[ t^{(i)} \left( w^T x^{(i)} + b \right) - 1 \right]
= \sum_{i} \alpha^{(i)} t^{(i)} \left( \sum_{j} \alpha^{(j)} t^{(j)} x^{(j)} \right)^T x^{(i)}
+ b \sum_{i} \alpha^{(i)} t^{(i)}
- \sum_{i} \alpha^{(i)}
\]
where the term \(b \sum_i \alpha^{(i)} t^{(i)}\) vanishes by the second condition in (4), and
\[
\sum_{i} \alpha^{(i)} t^{(i)} \left( \sum_{j} \alpha^{(j)} t^{(j)} x^{(j)} \right)^T x^{(i)}
= \sum_{i} \sum_{j} \alpha^{(i)} \alpha^{(j)} t^{(i)} t^{(j)} \langle x^{(i)}, x^{(j)} \rangle
\tag{6}
\]
Substituting (5) and (6) into (2) gives us
\[
L(w, b, \alpha) = \sum_{i} \alpha^{(i)} - \frac{1}{2} \sum_{i} \sum_{j} \alpha^{(i)} \alpha^{(j)} t^{(i)} t^{(j)} \langle x^{(i)}, x^{(j)} \rangle
\tag{7}
\]
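The algebraic identity in (5) can be sanity-checked numerically: the sketch below, using randomly generated data (an assumption purely for illustration), verifies that \(\frac{1}{2} w^T w\) with \(w\) built from (4) equals the double sum over inner products.

```python
import numpy as np

rng = np.random.default_rng(0)
m, d = 5, 3
X = rng.normal(size=(m, d))          # rows are the x^(i)
t = rng.choice([-1.0, 1.0], size=m)  # labels t^(i)
alpha = rng.uniform(size=m)          # arbitrary nonnegative multipliers

w = (alpha * t) @ X                  # eq. (4): w = sum_i alpha^(i) t^(i) x^(i)
lhs = 0.5 * w @ w                    # (1/2) w^T w

G = X @ X.T                          # Gram matrix of <x^(i), x^(j)>
rhs = 0.5 * (alpha * t) @ G @ (alpha * t)  # double sum in eq. (5)

print(np.isclose(lhs, rhs))  # → True
```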

The optimization for the dual problem:
\[
\max_{\alpha} \; \sum_{i} \alpha^{(i)} - \frac{1}{2} \sum_{i} \sum_{j} \alpha^{(i)} \alpha^{(j)} t^{(i)} t^{(j)} \langle x^{(i)}, x^{(j)} \rangle
\]
\[
\text{s.t.} \quad \forall i : \alpha^{(i)} \ge 0,
\qquad
\sum_{i} \alpha^{(i)} t^{(i)} = 0
\tag{8}
\]
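The dual (8) is a quadratic program and can be solved with a generic constrained optimizer. The sketch below does this on a toy two-point dataset using SciPy's SLSQP; both the dataset and the choice of solver are illustrative assumptions, not the text's method. It then recovers \(\hat{w}\) via (4) and \(b\) from a support vector, where the complementary-slackness condition \(t^{(i)}(w^T x^{(i)} + b) = 1\) holds.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data (assumed for illustration): two points, one per class.
X = np.array([[0.0, 0.0], [2.0, 2.0]])
t = np.array([-1.0, 1.0])

G = X @ X.T               # Gram matrix of <x^(i), x^(j)>
Q = np.outer(t, t) * G    # Q[i, j] = t^(i) t^(j) <x^(i), x^(j)>

def neg_dual(a):
    # Negate the dual objective (8) because scipy minimizes.
    return -(a.sum() - 0.5 * a @ Q @ a)

res = minimize(
    neg_dual,
    x0=np.zeros(len(t)),
    method="SLSQP",
    bounds=[(0, None)] * len(t),                          # alpha^(i) >= 0
    constraints={"type": "eq", "fun": lambda a: a @ t},   # sum_i alpha^(i) t^(i) = 0
)
alpha = res.x
w = (alpha * t) @ X                  # eq. (4): w = sum_i alpha^(i) t^(i) x^(i)
sv = np.argmax(alpha)                # a support vector has alpha^(i) > 0
b = t[sv] - w @ X[sv]                # from t^(sv) (w^T x^(sv) + b) = 1

# Analytic solution for this dataset: alpha = [0.25, 0.25], w = [0.5, 0.5], b = -1.
print(alpha, w, b)
```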
