Problem Sheet: 3
Lecturer: D. Ghosh Scribes: Ramsurat & Jauny
(a) Write down the KKT condition for the problem, and find all points that satisfy the condition. Check
whether or not each point is regular.
(b) Determine whether or not the point(s) in part (a) satisfy the second-order necessary condition.
(c) Determine whether or not the point(s) in part (b) satisfy the second-order sufficient condition.
4. Consider the optimization problem
Problem Sheet 2 (Problem Sheet for Constrained Optimization and Duality Theory): October 18, 2021
minimize x1 x2
subject to x_2 − (x_1 − 1)^3 = 0,
Show that the matrix ∇²L(x*; ν*) is indefinite, but that the second-order sufficient conditions are nevertheless satisfied.
(a) Find all KKT points, and test for the second order optimality conditions.
(b) Show that in fact no minimum exists.
minimize x_1^2 + x_2^2
subject to −x_1 − x_2 + 4 ≤ 0,
x1 ≥ 0, x2 ≥ 0.
(b) Find the optimal solutions to both the primal and the dual problem, and compare their objective values.
11. Consider the nonconvex optimization problem
minimize x2 + x3
subject to x1 + x2 + x3 = 1,
x_1^2 + x_2^2 + x_3^2 = 1.
Find all the points at which both the constraints are active. One of these points is x_0 = (0.618, 0.786)^t to three decimal places. Working to this accuracy, find the range of values of α for which x_0 satisfies the KKT conditions.
14. Determine the shortest distance from the origin to the set defined by:
4 − x1 − x2 ≤ 0, 5 − 2x1 − x2 ≤ 0.
15. Locate all of the KKT points for the following problem. Are these points local solutions? Are they global
solutions?
minimize − x1
subject to x_2 ≤ −(x_1 − 4)^3,
x2 ≥ 0.
19. (a) Solve the problem graphically and determine if the solution satisfies the KKT conditions.
(b) Find the dual of the optimization problem and solve it.
(c) Compare the solution of the dual and the Lagrange multipliers of the primal problem.
minimize − x1 − x2
subject to x1 + 3x2 ≤ 9,
2x1 + x2 ≤ 8,
x1 ≥ 0, x2 ≥ 0.
20. Solve the problem and see if the solution satisfies the KKT conditions.
minimize x1
subject to (x_1 + 4)^2 − 2 ≤ x_2,
x1 − x2 + 4 = 0,
x1 ≥ −10.
21. Using the Karush-Kuhn-Tucker conditions, check whether the point x = (2, −1) is the optimum of the problem:
22. Using the Karush-Kuhn-Tucker conditions, check whether the points x = (2, 4) or x = (6, 2) are local optima of the problem:
minimize e^{−x} − y
subject to e^x + e^y ≤ 6,
y ≥ x.
The optimal value of the problem is −4, attained at x = (0, 2). The Lagrange function is given by

L(x, y) = x_1^2 − 2x_2 + y(x_1^2 + x_2^2 − 4), where y ∈ R,

and then

θ(y) = inf_{x ∈ R^2} L(x, y).

We have

inf_{x_1} {(1 + y)x_1^2} = 0 for y ≥ −1, and −∞ for y < −1,

inf_{x_2} {yx_2^2 − 2x_2} = −1/y for y > 0, and −∞ for y ≤ 0.

Hence, the Lagrange dual is

(LD)  sup_{y > 0}  −1/y − 4y,

which is a convex problem, and the optimal value is −4, attained at y = 1/2. The duality gap is 0.
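As a numerical sanity check (the Python below is our addition, not part of the sheet), we can confirm that the dual objective −1/y − 4y peaks near y = 1/2 with value −4:

```python
import numpy as np

def theta(y):
    # Lagrange dual function derived above; only valid for y > 0
    return -1.0 / y - 4.0 * y

ys = np.linspace(0.01, 5.0, 100000)
vals = theta(ys)
y_star = ys[np.argmax(vals)]
print(y_star, vals.max())  # maximum near y = 0.5 with value near -4
```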
First we write the problem in the standard form required for the application of the KKT theory:
minimize f (x)
subject to f_i(x) ≤ 0, i = 1, 2, …, s,
g_j(x) = 0, j = s + 1, s + 2, …, m.
In this problem there are no equality constraints, so s = m = 2 and we have
f(x) = x_1^2 + x_2^2 − 4x_1 − 4x_2 = (x_1 − 2)^2 + (x_2 − 2)^2 − 8,
f_1(x) = x_1^2 − x_2,
f_2(x) = x_1 + x_2 − 2.
Note that we can ignore the constant term in the objective function since it does not affect the optimal solution, so henceforth we take f(x) = (x_1 − 2)^2 + (x_2 − 2)^2. At this point it is often helpful to graph the feasible set if possible, as it is in this case: it is a slice of a parabola.
Since all of these functions are convex, this is an example of a convex programming problem, and so the KKT conditions are both necessary and sufficient for global optimality. Hence, if we locate a KKT point, we know that it is necessarily a globally optimal solution.
The Lagrangian for this problem is
L((x_1, x_2), (u_1, u_2)) = (x_1 − 2)^2 + (x_2 − 2)^2 + u_1(x_1^2 − x_2) + u_2(x_1 + x_2 − 2).
Let us now write the KKT conditions for this problem.
The stationarity conditions are

2(x_1 − 2) + 2u_1x_1 + u_2 = 0,
2(x_2 − 2) − u_1 + u_2 = 0,

or equivalently,

4 = 2x_1 + 2u_1x_1 + u_2,
4 = 2x_2 − u_1 + u_2.
Next observe that the global minimizer for the objective function is (x1 , x2 ) = (2, 2). Thus, if this point is feasible,
it would be the global solution and the multipliers would both be zero. But it is not feasible. Indeed, both constraints
are violated by this point. Hence, we conjecture that both constraints are active at the solution. In this case, the KKT
pair ((x1 , x2 ), (u1 , u2 )) must satisfy the following 4 key equations
x_2 = x_1^2,
2 = x_1 + x_2,
4 = 2x_1 + 2u_1x_1 + u_2,
4 = 2x_2 − u_1 + u_2.
These are 4 equations in 4 unknowns that we can try to solve by elimination. Using the first equation to eliminate x_2 from the second equation, we see that x_1 must satisfy

x_1^2 + x_1 − 2 = 0,

so x_1 = −2 or x_1 = 1. Thus, either (x_1, x_2) = (−2, 4) or (x_1, x_2) = (1, 1). Since (1, 1) is closer to the global minimizer of the objective f, let us first investigate (x_1, x_2) = (1, 1) to see if it is a KKT point. For this we must find the KKT multipliers (u_1, u_2). Plugging (x_1, x_2) = (1, 1) into the last two of the key equations given above, we get

2 = 2u_1 + u_2,
2 = −u_1 + u_2.

By subtracting these two equations, we get 0 = 3u_1, so u_1 = 0 and u_2 = 2. Since both of these values are nonnegative, we have found a KKT pair for the original problem. Hence, by convexity, we know that (x_1, x_2) = (1, 1) is the global solution to the problem.
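The algebra above can be double-checked mechanically. The following short Python sketch (our own verification, not part of the original sheet) plugs the KKT pair ((1, 1), (0, 2)) back into the four key equations:

```python
# Candidate KKT pair found above
x1, x2 = 1.0, 1.0
u1, u2 = 0.0, 2.0

# Feasibility / activity of both constraints
assert x2 == x1**2        # x2 = x1^2
assert x1 + x2 == 2       # x1 + x2 = 2

# Stationarity of the Lagrangian
assert 2*x1 + 2*u1*x1 + u2 == 4
assert 2*x2 - u1 + u2 == 4

# Multiplier sign conditions
assert u1 >= 0 and u2 >= 0
print("KKT pair verified")
```

All four equations and the sign conditions hold exactly, confirming the hand computation.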
We have the optimal value −12 with x = (−2, 4). The Lagrange function of the problem is

L(x, y) = x_1^2 − 2x_2 − 8 + y(x_1 + x_2 − 2), y ≥ 0,

with the box constraints kept in the set X = {x : −2 ≤ x_1 ≤ 4, −2 ≤ x_2 ≤ 4}. Now, x_1^2 + yx_1 is a parabola which has its minimum at x_1 = −y/2. This minimum lies within X when y ≤ 4; when y ≥ 4, the minimum over X is attained at the boundary, at x_1 = −2. The term (y − 2)x_2 is linear in x_2, so its minimum over X is attained at x_2 = −2 when y ≥ 2 and at x_2 = 4 when y ≤ 2. Hence, we have

θ(y) = −y^2/4 + 2y − 16 for y ≤ 2,
θ(y) = −y^2/4 − 4y − 4 for 2 ≤ y ≤ 4,
θ(y) = −6y for y ≥ 4.

Maximizing θ(y) for y ≥ 0 gives

sup_{0 ≤ y ≤ 2} θ(y) = −13,
sup_{2 ≤ y ≤ 4} θ(y) = −13.

Hence, the optimal value of the Lagrange dual is −13, and we have a nonzero duality gap equal to 1.
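A quick grid check (Python is our addition; `theta` simply encodes the three branches of the dual function above) confirms the dual optimum −13 at y = 2:

```python
import numpy as np

def theta(y):
    # Piecewise dual function from the text
    if y <= 2:
        return -y**2 / 4 + 2*y - 16
    elif y <= 4:
        return -y**2 / 4 - 4*y - 4
    return -6 * y

ys = np.linspace(0.0, 10.0, 100001)   # step 1e-4, so y = 2 lies on the grid
vals = np.array([theta(y) for y in ys])
y_star = ys[np.argmax(vals)]
print(y_star, vals.max())  # maximum near y = 2 with value near -13
```

The three branches agree at the breakpoints y = 2 and y = 4, as a concave dual function must.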
Sol 18

[Figure: the feasible set in the (x_1, x_2)-plane, showing the constraints (x_1 + 4)^2 − 2 ≤ x_2 and x_1 ≥ −10, and the line x_1 − x_2 + 4 = 0.]
From the picture we see that the optimum is at (−5, −1). Let's determine whether it satisfies the KKT conditions. The constraint values at this point are:

g_1(−5, −1) = 0,
g_2(−5, −1) = −5,
h_1(−5, −1) = 0.

Constraint 1 is active, but the other inequality is not. This means that the Lagrange multiplier of the second inequality constraint is zero, i.e., λ_2 = 0. Next, we check whether the gradient of the Lagrangian vanishes at the point. The gradients of the objective and constraint functions are:

∇f(x) = (1, 0), ∇g_1(x) = (2(x_1 + 4), −1), ∇g_2(x) = (−1, 0), ∇h_1(x) = (−1, 1).

Setting the gradient of the Lagrange function to zero:

∇L((−5, −1), λ_1, µ, λ_2) = (1 − 2λ_1 − µ, −λ_1 + µ) = (0, 0).

From the second equation we get λ_1 = µ, which can be substituted into the first equation. This yields 1 − 3µ = 0, i.e., µ = λ_1 = 1/3. The problem is a minimization problem, and the KKT conditions require the Lagrange multipliers of the inequality constraints to be nonnegative. This is the case here, so the point satisfies the KKT conditions.
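Substituting the multipliers back in is a one-line check. In the Python sketch below (our own addition), `lam1` and `mu` stand for the multipliers λ_1 and µ found above:

```python
lam1 = 1/3   # multiplier of the active inequality constraint
mu = 1/3     # multiplier of the equality constraint; lam2 = 0 (inactive)

# Components of the Lagrangian gradient at (-5, -1)
dx1 = 1 - 2*lam1 - mu
dx2 = -lam1 + mu
assert abs(dx1) < 1e-12 and abs(dx2) < 1e-12
print("gradient of the Lagrangian vanishes at (-5, -1)")
```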
g_1(x) = x_2^2 + 1 − x_1,
g_2(x) = x_1 − 2.

g_1(2, −1) = 0,
g_2(2, −1) = 0.
Both constraints are active. We check whether the gradient of the Lagrange function vanishes at the point. The gradients of the objective function and constraint functions are:

∇f(x) = (−2x_1, −2x_2), ∇g_1(x) = (−1, 2x_2), ∇g_2(x) = (1, 0).

We write the gradient of the Lagrange function at the point under examination, (2, −1):

∇L((2, −1), λ_1, λ_2) = ∇f(2, −1) + λ_1∇g_1(2, −1) + λ_2∇g_2(2, −1) = (−4 − λ_1 + λ_2, 2 − 2λ_1) = (0, 0).

The second equation gives λ_1 = 1, and then the first gives λ_2 = 4 + λ_1 = 5. Both multipliers are nonnegative, so x = (2, −1) satisfies the KKT conditions.
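The 2×2 stationarity system at (2, −1) can also be solved numerically; this is a sketch we add, not part of the original sheet:

```python
import numpy as np

# Stationarity at (2, -1):  -4 - lam1 + lam2 = 0  and  2 - 2*lam1 = 0,
# rewritten as a linear system A @ [lam1, lam2] = b
A = np.array([[-1.0, 1.0],
              [-2.0, 0.0]])
b = np.array([4.0, -2.0])
lam1, lam2 = np.linalg.solve(A, b)
print(lam1, lam2)  # both multipliers nonnegative -> KKT point
```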
g_1(x) = −x_1^2 + x_2,
g_2(x) = x_1 + 2x_2 − 10,
g_3(x) = x_1 − 3x_2.
The Lagrange function is

L(x, λ) = f(x) + λ_1g_1(x) + λ_2g_2(x) + λ_3g_3(x).

The KKT conditions are:

∇L = 0,
λ_ig_i(x) = 0, i = 1, 2, 3,
λ_i ≥ 0, i = 1, 2, 3.
g_1(2, 4) = −2^2 + 4 = 0,
g_2(2, 4) = 2 + 2·4 − 10 = 0,
g_3(2, 4) = 2 − 3·4 = −10.
Constraints g_1 and g_2 are active at the current point, i.e., they have the value 0. The third constraint, on the other hand, is not active. By the complementarity condition λ_ig_i(x) = 0, the Lagrange multiplier of an inactive constraint is zero, so here λ_3 = 0.
The gradients of the objective function and constraint functions are:

∇f(x) = (−3x_1^2, −3), ∇g_1(x) = (−2x_1, 3), ∇g_2(x) = (1, 2), ∇g_3(x) = (1, −3).

The gradients at the point under examination:

∇L((2, 4), λ_1, λ_2, λ_3) = ∇f(2, 4) + λ_1∇g_1(2, 4) + λ_2∇g_2(2, 4) + λ_3∇g_3(2, 4)
= (−3·2^2, −3) + λ_1(−2·2, 3) + λ_2(1, 2) + λ_3(1, −3)
= (−12 − 4λ_1 + λ_2 + λ_3, −3 + 3λ_1 + 2λ_2 − 3λ_3).
We remember that λ_3 = 0. Now the first KKT condition (∇L = 0) yields:

−12 − 4λ_1 + λ_2 = 0,
−3 + 3λ_1 + 2λ_2 = 0.

Solving this pair of equations gives a negative λ_1, which violates the condition λ_i ≥ 0; hence (2, 4) is not a KKT point.
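Solving the 2×2 system above numerically (our own addition) shows the sign violation directly:

```python
import numpy as np

# -12 - 4*lam1 + lam2 = 0  and  -3 + 3*lam1 + 2*lam2 = 0, with lam3 = 0,
# rewritten as A @ [lam1, lam2] = b
A = np.array([[-4.0, 1.0],
              [ 3.0, 2.0]])
b = np.array([12.0, 3.0])
lam1, lam2 = np.linalg.solve(A, b)
print(lam1, lam2)  # lam1 is negative, so (2, 4) is not a KKT point
```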
subject to

g_1(x, y) := e^x + e^y − 6 ≤ 0,
g_2(x, y) := x − y ≤ 0.

Define the Lagrangian function with λ_1, λ_2 ≥ 0:

L(x, y, λ_1, λ_2) = e^{−x} − y + λ_1(e^x + e^y − 6) + λ_2(x − y).

The KKT conditions read:

(i) −e^{−x} + λ_1e^x + λ_2 = 0,
(ii) −1 + λ_1e^y − λ_2 = 0,
(iii) λ_1(e^x + e^y − 6) = 0,
(iv) λ_2(x − y) = 0.
From (ii),

λ_2 + 1 = λ_1e^y ⟹ λ_1 > 0,

and then by (iii),

e^x + e^y = 6.
Suppose in (iv) that x = y; then e^x = e^y = 3. From (i) and (ii),

1/3 − 3λ_1 − λ_2 = 0,
1 − 3λ_1 + λ_2 = 0,

⟹ λ_1 = 2/9, λ_2 = −1/3.

Since λ_2 < 0 violates λ_2 ≥ 0, this case is impossible.
Hence x < y and λ_2 = 0, as well as e^x + e^y = 6 and λ_1 > 0. Then

(i) ⟹ λ_1 = e^{−2x},
(ii) ⟹ λ_1 = e^{−y},

so y = 2x, and

e^{2x} + e^x = 6 ⟹ e^x = 2 or e^x = −3 (impossible!).

So,

x* = ln 2, y* = 2x* = ln 4,
λ_1* = 1/4, λ_2* = 0.
At (x*, y*) only g_1 is active, and

∂g_1/∂x(x*, y*) = e^{x*} ≠ 0 or ∂g_1/∂y(x*, y*) = e^{y*} ≠ 0,

so the constraint qualification (CQ) is satisfied.
Actually, (CQ) holds at all points (x, y) ∈ R^2. Namely,

Dg(x, y) = [ e^x  e^y ; 1  −1 ],

with det Dg(x, y) = −(e^x + e^y) < 0 and ∇g_1(x, y) ≠ 0, ∇g_2(x, y) ≠ 0 for all (x, y) ∈ R^2.
As we will see from Theorem 4.6, (ln 2, ln 4) is the global minimum point we need to find.
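The KKT point (ln 2, ln 4) with λ* = (1/4, 0) can be verified numerically; the Python below is our own check, not part of the sheet:

```python
import math

x, y = math.log(2), math.log(4)
lam1, lam2 = 0.25, 0.0

# g1 is active, g2 is strictly satisfied
assert abs(math.exp(x) + math.exp(y) - 6) < 1e-12
assert x - y < 0

# Stationarity of L = e^{-x} - y + lam1*(e^x + e^y - 6) + lam2*(x - y)
dx = -math.exp(-x) + lam1 * math.exp(x) + lam2
dy = -1 + lam1 * math.exp(y) - lam2
assert abs(dx) < 1e-12 and abs(dy) < 1e-12
print("KKT conditions hold at (ln 2, ln 4)")
```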
The optimal value of the primal problem is −1 at x = (±1/√2, ±1/√2). Computing the dual function, we have

θ(y) = −∞ for y < 1,
θ(y) = −y for y ≥ 1.

Hence y* = 1 is the dual optimum, and the dual optimal value is −1.
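Finally, this last dual can also be checked on a grid (Python is our addition; `theta` encodes the dual function derived above):

```python
import numpy as np

def theta(y):
    # Dual function from the text: -inf for y < 1, -y for y >= 1
    return -y if y >= 1 else -np.inf

ys = np.linspace(0.0, 5.0, 50001)   # step 1e-4, so y = 1 lies on the grid
vals = np.array([theta(y) for y in ys])
y_star = ys[np.argmax(vals)]
print(y_star, vals.max())  # maximum near y = 1 with value near -1
```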