Theory of constrained optimization
Nur Aini Masruroh
Scope
Lagrange function
Necessary and sufficient conditions for equality constraints
Necessary and sufficient conditions for inequality constraints
Necessary and sufficient conditions for equality and inequality constraints
A general form for a constrained optimization problem
Minimize $f(x)$
Subject to $h_i(x) = 0,\quad i = 1, \dots, m$
$g_j(x) \le 0,\quad j = 1, \dots, r$
$x^T = [x_1\ x_2\ x_3\ \dots\ x_n] \in \mathbb{R}^n$
Question: does the theory of unconstrained optimization apply to constrained problems?
Lagrange function
In general form:
minimize $f(x)$
subject to $h_i(x) = 0,\quad i = 1, \dots, m$
$g_j(x) \le 0,\quad j = 1, \dots, r$
$x = [x_1\ x_2\ \dots\ x_n]^T$
It can be converted into:
Minimize
$$L(x, w, u) = f(x) + \sum_{i=1}^{m} w_i h_i(x) + \sum_{j=1}^{r} u_j g_j(x)$$
$L(x, w, u)$ is called the Lagrange function, and $w_i$ and $u_j$ are called Lagrange multipliers
Lagrange function: remarks
The function $L$ transforms a constrained optimization problem into an unconstrained one
The function $L$ can be used to determine the first-order conditions (FOC), the necessary conditions for optimal solutions
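To make the construction concrete, here is a small example of our own (it does not appear in the original slides): minimize $f(x) = x_1^2 + x_2^2$ subject to $h(x) = x_1 + x_2 - 1 = 0$. The Lagrange function is

$$L(x, w) = x_1^2 + x_2^2 + w\,(x_1 + x_2 - 1)$$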
Necessary and sufficient conditions for equality constraints
In general, a problem with equality constraints can be expressed as
Minimize $f(x)$
Subject to $h_i(x) = 0,\quad i = 1, \dots, m$
$x^T = [x_1\ x_2\ x_3\ \dots\ x_n]$
Define the Lagrange function as
$$L(x, w) = f(x) + \sum_{i=1}^{m} w_i h_i(x)$$
Necessary conditions of optimality:
$$\nabla_x L = \nabla f(x) + \sum_{i=1}^{m} w_i \nabla h_i(x) = 0$$
$$\frac{\partial L}{\partial w_i} = h_i(x) = 0,\qquad i = 1, 2, \dots, m$$
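Continuing the small example introduced earlier (ours, not the slides'), the necessary conditions give

$$\frac{\partial L}{\partial x_1} = 2x_1 + w = 0,\qquad
\frac{\partial L}{\partial x_2} = 2x_2 + w = 0,\qquad
\frac{\partial L}{\partial w} = x_1 + x_2 - 1 = 0,$$

so $x_1^* = x_2^* = 1/2$ with $w^* = -1$.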
Remarks:
1. If $x^*$ satisfies the conditions, then $L(x^*, w^*) = f(x^*)$ (since every $h_i(x^*) = 0$), so $L$ gives the same optimal solutions
2. $x^*$ is a stationary point of $L$. It is generally different from the stationary point of $f(x)$
3. $w_i$ can be positive or negative
4. As the necessary conditions require first-order derivatives, they are also called first-order conditions (FOC)
Sufficient conditions of optimality
$H_L(x^*, w^*)$ is positive definite $\Rightarrow$ $x^*$ is a local minimum
$H_L(x^*, w^*)$ is negative definite $\Rightarrow$ $x^*$ is a local maximum
where $H_L(x^*, w^*)$ is the matrix of second-order derivatives of $L$ with respect to $x$ and $w$. (Because the bordered matrix below contains a zero block, the definiteness test is applied in practice on directions $d$ satisfying $\nabla h_i(x^*)^T d = 0$.)
$$H_L(x^*, w^*) =
\begin{bmatrix}
\left[\nabla^2\!\left(f + \sum_i w_i h_i\right)\right]_{n \times n} &
\left[\dfrac{\partial h_j}{\partial x_i}\right]_{n \times m} \\[2ex]
\left[\dfrac{\partial h_j}{\partial x_i}\right]^T_{m \times n} &
0_{m \times m}
\end{bmatrix}_{(n+m) \times (n+m)},
\qquad i = 1, 2, \dots, n,\quad j = 1, 2, \dots, m$$
Note: as the sufficient conditions require second-order derivatives, they are also called second-order conditions (SOC)
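For our running example, the Hessian of $L$ with respect to $x$ is

$$\nabla^2_{xx} L = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix},$$

which is positive definite (in particular on directions with $d_1 + d_2 = 0$), so $x^* = (1/2,\, 1/2)$ is a local minimum; by convexity it is also global.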
Try the following examples:
1. Min $f(x) = 4x_1^2 + 5x_2^2$
Subject to $h(x) = 2x_1 + 3x_2 - 6 = 0$
2. Min $f(x_1, x_2) = 4x_1^2 + 5x_2^2$
Subject to $h(x_1, x_2) = 2x_1 + 3x_2 - 7 = 0$
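These can be checked by machine by solving the FOC symbolically. The sketch below is ours (not part of the slides); it uses SymPy to build the Lagrange function for example 1 and solve $\nabla L = 0$. Changing the constant 6 to 7 handles example 2.

```python
# A minimal sketch: solve example 1 via the Lagrange FOC with SymPy.
import sympy as sp

x1, x2, w = sp.symbols('x1 x2 w', real=True)

f = 4*x1**2 + 5*x2**2        # objective
h = 2*x1 + 3*x2 - 6          # equality constraint h(x) = 0

L = f + w*h                  # Lagrange function L(x, w) = f(x) + w h(x)

# FOC: all first-order partial derivatives of L vanish
foc = [sp.diff(L, v) for v in (x1, x2, w)]
print(sp.solve(foc, [x1, x2, w], dict=True))
# -> [{x1: 15/14, x2: 9/7, w: -30/7}]
```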
Necessary and sufficient conditions for inequality constraints
In general, a problem with inequality constraints can be expressed as
Minimize $f(x)$
Subject to $g_i(x) \le 0,\quad i = 1, \dots, r$
$x^T = [x_1\ x_2\ x_3\ \dots\ x_n]$
What are the necessary conditions for this problem?
To convert the inequality constraints into equalities, let's introduce slack variables $s_i$ ($i = 1, \dots, r$) as follows:
$$g_i(x) + s_i^2 = 0$$
Let
$$L = f(x) + \sum_{i=1}^{r} u_i \left( g_i(x) + s_i^2 \right)$$
Apply the previous FOC to $L$:
$$\nabla_x L = \nabla f(x) + \sum_{i=1}^{r} u_i \nabla g_i(x) = 0$$
$$\frac{\partial L}{\partial s_i} = 2 u_i s_i = 0$$
$$\frac{\partial L}{\partial u_i} = g_i(x) + s_i^2 = 0,\qquad i = 1, 2, \dots, r$$
Relation between $u_i$ and $g_i(x)$
Consider $g_i(x) + s_i^2 = 0$:
If $s_i = 0$, then $g_i(x) = 0$ $\Rightarrow$ $g_i(x)$ is a binding (active) constraint
If $s_i \ne 0$, then $g_i(x) < 0$ $\Rightarrow$ $g_i(x)$ is not binding (inactive)
Since $u_i s_i = 0$ at a stationary point, and $L(x^*, u^*, s^*) = f(x^*) + \sum_i u_i [g_i(x^*) + s_i^2] = f(x^*)$, two cases arise:
When $s_i \ne 0$:
$u_i = 0$
$g_i(x) < 0$
$u_i g_i(x) = 0$
When $s_i = 0$:
$g_i(x) = 0$
$u_i \ge 0$
In both cases the complementarity condition $u_i g_i(x) = 0$ holds
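A small illustration of the two cases (our own example, not in the slides): minimize $f(x) = x^2$ subject to $g(x) = 1 - x \le 0$. With a slack variable,

$$L = x^2 + u\,(1 - x + s^2),\qquad 2x - u = 0,\quad 2us = 0,\quad 1 - x + s^2 = 0.$$

Trying the inactive case $u = 0$ gives $x = 0$ and $s^2 = x - 1 = -1 < 0$, which is impossible; so the constraint must be active: $s = 0$, $x^* = 1$, $u^* = 2 \ge 0$.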
Necessary and sufficient conditions of optimality for general constrained problems
General problem:
Minimize $f(x)$
Subject to $h_i(x) = 0,\quad i = 1, \dots, m$
$g_j(x) \le 0,\quad j = 1, \dots, r$
$x^T = [x_1\ x_2\ x_3\ \dots\ x_n]$
$$L(x, w, u) = f(x) + \sum_{i=1}^{m} w_i h_i(x) + \sum_{j=1}^{r} u_j g_j(x)$$
General Karush-Kuhn-Tucker (KKT) conditions (necessary)
1. Linear dependence of the gradients of $f$, $h$, and $g$:
$$\nabla f(x) + \sum_{i=1}^{m} w_i \nabla h_i(x) + \sum_{j=1}^{r} u_j \nabla g_j(x) = 0$$
2. Complementarity conditions:
$u_j g_j(x) = 0,\quad u_j \ge 0,\qquad j = 1, \dots, r$
3. Constraint feasibility:
$h_i(x) = 0,\quad i = 1, \dots, m$
$g_j(x) \le 0,\quad j = 1, \dots, r$
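These conditions are easy to verify numerically at a candidate point. The sketch below is our own (not from the slides): it approximates gradients by central finite differences and reports the residuals of stationarity, feasibility, and complementarity. The helper name kkt_residuals and the test problem are assumptions for illustration only.

```python
# A minimal sketch: numerically check KKT conditions at a candidate point.
import numpy as np

def kkt_residuals(f, hs, gs, x, w, u, eps=1e-6):
    """Residuals of stationarity, h-feasibility, g-feasibility, complementarity."""
    def grad(fun, x):
        g = np.zeros_like(x)
        for k in range(len(x)):
            d = np.zeros_like(x)
            d[k] = eps
            g[k] = (fun(x + d) - fun(x - d)) / (2 * eps)  # central difference
        return g

    stat = grad(f, x)                                          # grad f
    stat = stat + sum(wi * grad(hi, x) for wi, hi in zip(w, hs))
    stat = stat + sum(uj * grad(gj, x) for uj, gj in zip(u, gs))
    return (stat,                                   # should be ~0
            [hi(x) for hi in hs],                   # should be 0
            [gj(x) for gj in gs],                   # should be <= 0
            [uj * gj(x) for uj, gj in zip(u, gs)])  # should be 0, with u >= 0

# Hypothetical test: min (x1-2)^2 + x2^2 s.t. x1 - 1 <= 0; KKT point x*=(1,0), u*=2
f = lambda x: (x[0] - 2)**2 + x[1]**2
gs = [lambda x: x[0] - 1]
print(kkt_residuals(f, [], gs, np.array([1.0, 0.0]), [], [2.0]))
```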
Sufficient conditions of optimality (SOC)
$H_L(x^*, w^*, u^*)$ is positive definite $\Rightarrow$ $x^*$ is a local minimum
$H_L(x^*, w^*, u^*)$ is negative definite $\Rightarrow$ $x^*$ is a local maximum
where $H_L(x^*, w^*, u^*)$ is the Hessian matrix of $L$ with respect to $x^*$, $w^*$, and $u^*$
Sufficient conditions for the uniqueness of a local optimum
If $f(x)$ is convex (concave), the feasible region is convex, and a local minimum (maximum) exists at $x^*$, then:
1. $x^*$ is the global minimum (maximum)
2. The KKT conditions are both necessary and sufficient
The procedure for solving the KKT equations
1. Assume no inequality is active: set $u_j = 0$.
2. Solve the equations for $x$, the multipliers $w_i$ of the equalities, and the multipliers $u_j$ of the active inequalities (for the first iteration there are none).
3. If the solution satisfies all the constraints, STOP: solution found. Otherwise go to step 4.
4. Set $u_j \ne 0$ for a violated constraint (treat it as active), then go to step 2.
Example
Minimize $f = x_1^2 + x_2^2$
Subject to $g_1 = 1 - x_1 \le 0$
$g_2 = x_1 - 3 \le 0$
$g_3 = 2 - x_2 \le 0$
$g_4 = x_2 - 4 \le 0$
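The procedure above can be automated by enumerating candidate active sets. The sketch below is ours (not part of the slides): for each candidate set it uses SymPy to solve stationarity together with the assumed-active constraints, then keeps solutions that are feasible with $u_j \ge 0$. Constraint indices in the code are 0-based, so index 0 corresponds to $g_1$.

```python
# A sketch: brute-force active-set KKT solve for the example above.
from itertools import combinations
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
xs = [x1, x2]
f = x1**2 + x2**2
gs = [1 - x1, x1 - 3, 2 - x2, x2 - 4]        # g_j(x) <= 0
us = sp.symbols(f'u0:{len(gs)}', real=True)  # one multiplier per constraint

for k in range(len(gs) + 1):
    for active in combinations(range(len(gs)), k):
        # Stationarity: grad f + sum over active j of u_j * grad g_j = 0
        eqs = [sp.diff(f, v) + sum(us[j] * sp.diff(gs[j], v) for j in active)
               for v in xs]
        eqs += [gs[j] for j in active]       # active constraints hold as equalities
        for sol in sp.solve(eqs, xs + [us[j] for j in active], dict=True):
            point = {v: sol[v] for v in xs}
            feasible = all(g.subs(point) <= 0 for g in gs)
            dual_ok = all(sol[us[j]] >= 0 for j in active)
            if feasible and dual_ok:
                print(active, point, [sol[us[j]] for j in active])
# Prints the single KKT point: active set (0, 2), i.e. g1 and g3,
# with x* = (1, 2), u1 = 2, u3 = 4, and f* = 5.
```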