Optimization Methods: Optimization using Calculus – Kuhn-Tucker Conditions

In the previous lecture, the optimization of functions of multiple variables subject to equality constraints was dealt with using the method of constrained variation and the method of Lagrange multipliers. In this lecture the Kuhn-Tucker conditions are discussed, with examples, for a point to be a local optimum of a function subject to inequality constraints.

Kuhn-Tucker Conditions

It was previously established that, for both an unconstrained optimization problem and an optimization problem with an equality constraint, the first-order conditions are sufficient for a global optimum when the objective and constraint functions satisfy appropriate concavity/convexity conditions. The same is true for an optimization problem with inequality constraints. The Kuhn-Tucker conditions are both necessary and sufficient if the objective function is convex and each constraint is linear or each constraint function is convex, i.e. the problem belongs to the class called convex programming problems. Consider the following optimization problem:

Minimize f(X) subject to gj(X) ≤ 0 for j = 1, 2, …, m; where X = [x1 x2 . . . xn]

Then the Kuhn-Tucker conditions for X* = [x1* x2* . . . xn*] to be a local minimum are

∂f/∂xi + Σj=1..m λj ∂gj/∂xi = 0,   i = 1, 2, …, n
λj gj = 0,   j = 1, 2, …, m
gj ≤ 0,   j = 1, 2, …, m
λj ≥ 0,   j = 1, 2, …, m                    (1)

D Nagesh Kumar, IISc, Bangalore

M2L5
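The four conditions in (1) can also be checked numerically at a candidate point. The sketch below is my own illustration, not part of the lecture; the function name `kkt_satisfied` and the toy problem at the bottom are assumptions introduced for demonstration. It tests stationarity, complementary slackness, primal feasibility and dual feasibility for a problem of the form: minimize f(X) subject to gj(X) ≤ 0.

```python
import numpy as np

def kkt_satisfied(grad_f, grad_gs, gs, x, lambdas, tol=1e-8):
    """Check the Kuhn-Tucker conditions (1) at candidate point x with
    multipliers lambdas, for: minimize f(X) s.t. g_j(X) <= 0."""
    x = np.asarray(x, dtype=float)
    lam = np.asarray(lambdas, dtype=float)
    # Stationarity: grad f + sum_j lambda_j * grad g_j = 0
    stat = grad_f(x) + sum(l * gg(x) for l, gg in zip(lam, grad_gs))
    g_vals = np.array([g(x) for g in gs])
    return bool(np.all(np.abs(stat) <= tol)            # stationarity
                and np.all(np.abs(lam * g_vals) <= tol)  # complementary slackness
                and np.all(g_vals <= tol)                # primal feasibility g_j <= 0
                and np.all(lam >= -tol))                 # dual feasibility lambda_j >= 0

# Toy problem (assumed for illustration): minimize x1^2 + x2^2
# subject to g1 = 1 - x1 <= 0. Minimizer is (1, 0) with lambda1 = 2.
grad_f = lambda x: np.array([2 * x[0], 2 * x[1]])
grad_g1 = lambda x: np.array([-1.0, 0.0])
g1 = lambda x: 1.0 - x[0]
print(kkt_satisfied(grad_f, [grad_g1], [g1], [1.0, 0.0], [2.0]))  # True
```

At (1, 0) with λ1 = 2 the stationarity residual is (2, 0) + 2·(−1, 0) = (0, 0), the constraint is active (g1 = 0), and λ1 ≥ 0, so all four conditions hold.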

In case of minimization problems, if the constraints are of the form gj(X) ≥ 0, then the λj have to be nonpositive in (1). On the other hand, if the problem is one of maximization with the constraints in the form gj(X) ≥ 0, then the λj have to be nonnegative.

Example (1)

Minimize f = x1² + 2x2² + 3x3²

subject to the constraints

g1 = x1 − x2 − 2x3 ≤ 12
g2 = x1 + 2x2 − 3x3 ≤ 8

using Kuhn-Tucker conditions.

Solution: The Kuhn-Tucker conditions are given by

a) ∂f/∂xi + λ1 ∂g1/∂xi + λ2 ∂g2/∂xi = 0, i.e.

2x1 + λ1 + λ2 = 0                    (2)
4x2 − λ1 + 2λ2 = 0                   (3)
6x3 − 2λ1 − 3λ2 = 0                  (4)

b) λj gj = 0, i.e.

λ1 (x1 − x2 − 2x3 − 12) = 0          (5)
λ2 (x1 + 2x2 − 3x3 − 8) = 0          (6)

c) gj ≤ 0, i.e.

x1 − x2 − 2x3 − 12 ≤ 0               (7)
x1 + 2x2 − 3x3 − 8 ≤ 0               (8)

d) λj ≥ 0, i.e.

λ1 ≥ 0                               (9)
λ2 ≥ 0                               (10)

From (5), either λ1 = 0 or x1 − x2 − 2x3 − 12 = 0.

Case 1: λ1 = 0

From (2), (3) and (4) we have x1 = x2 = −λ2/2 and x3 = λ2/2. Using these in (6) we get λ2 (−3λ2 − 8) = 0, i.e. 3λ2² + 8λ2 = 0. ∴ λ2 = 0 or −8/3. From (10), λ2 ≥ 0; therefore λ2 = 0, giving X* = [0, 0, 0]. This solution set satisfies all of (2) to (10).

Case 2: x1 − x2 − 2x3 − 12 = 0

Using (2), (3) and (4) we have x1 = −(λ1 + λ2)/2, x2 = (λ1 − 2λ2)/4 and x3 = (2λ1 + 3λ2)/6. Substituting these into x1 − x2 − 2x3 − 12 = 0:

−(λ1 + λ2)/2 − (λ1 − 2λ2)/4 − (2λ1 + 3λ2)/3 − 12 = 0, or 17λ1 + 12λ2 = −144.

But conditions (9) and (10) give us λ1 ≥ 0 and λ2 ≥ 0 simultaneously, which cannot be possible with 17λ1 + 12λ2 = −144.

Hence the solution set for this optimization problem is X* = [0 0 0].
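As a quick sanity check (my own addition, not part of the original lecture), the solution X* = [0 0 0] with λ1 = λ2 = 0 can be substituted back into conditions (2)–(10):

```python
# Verify X* = [0, 0, 0], lambda1 = lambda2 = 0 against (2)-(10) of Example (1).
x1 = x2 = x3 = 0.0
lam1 = lam2 = 0.0
assert 2*x1 + lam1 + lam2 == 0              # (2)
assert 4*x2 - lam1 + 2*lam2 == 0            # (3)
assert 6*x3 - 2*lam1 - 3*lam2 == 0          # (4)
assert lam1 * (x1 - x2 - 2*x3 - 12) == 0    # (5)
assert lam2 * (x1 + 2*x2 - 3*x3 - 8) == 0   # (6)
assert x1 - x2 - 2*x3 - 12 <= 0             # (7)
assert x1 + 2*x2 - 3*x3 - 8 <= 0            # (8)
assert lam1 >= 0 and lam2 >= 0              # (9), (10)
print("X* = [0 0 0] satisfies (2)-(10)")
```

All eight assertions pass, with both constraints inactive (g1 = −12 and g2 = −8 are strictly negative), which is consistent with λ1 = λ2 = 0 under complementary slackness.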

Example (2)

Minimize f = x1² + x2² + 60x1

subject to the constraints

g1 = x1 − 80 ≥ 0
g2 = x1 + x2 − 120 ≥ 0

using Kuhn-Tucker conditions.

Solution: Since this is a minimization problem with constraints of the form gj(X) ≥ 0, the λj have to be nonpositive. The Kuhn-Tucker conditions are given by

a) ∂f/∂xi + λ1 ∂g1/∂xi + λ2 ∂g2/∂xi = 0, i.e.

2x1 + 60 + λ1 + λ2 = 0               (11)
2x2 + λ2 = 0                         (12)

b) λj gj = 0, i.e.

λ1 (x1 − 80) = 0                     (13)
λ2 (x1 + x2 − 120) = 0               (14)

c) gj ≥ 0, i.e.

x1 − 80 ≥ 0                          (15)
x1 + x2 − 120 ≥ 0                    (16)

d) λj ≤ 0, i.e.

λ1 ≤ 0                               (17)
λ2 ≤ 0                               (18)

From (13), either λ1 = 0 or x1 − 80 = 0.

Case 1: λ1 = 0

From (11) and (12) we have x1 = −λ2/2 − 30 and x2 = −λ2/2. Using these in (14) we get λ2 (−λ2 − 150) = 0. ∴ λ2 = 0 or −150.

Considering λ2 = 0, X* = [−30, 0]. But this solution set violates (15) and (16).

For λ2 = −150, X* = [45, 75]. But this solution set violates (15).

Case 2: x1 − 80 = 0

Using x1 = 80 in (11) and (12), we have

λ2 = −2x2, λ1 = 2x2 − 220            (19)

Substituting (19) in (14), we have −2x2 (80 + x2 − 120) = 0, i.e. −2x2 (x2 − 40) = 0. For this to be true, either x2 = 0 or x2 − 40 = 0.

For x2 = 0, λ1 = −220 and λ2 = 0, giving X* = [80, 0]. This solution set violates (16).

For x2 − 40 = 0, λ1 = −140 and λ2 = −80, giving X* = [80, 40]. This solution set satisfies all of (11) to (18) and is hence the desired solution. Therefore, the solution set for this optimization problem is X* = [80 40].
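As with Example (1), the result can be verified by direct substitution (my own addition, not part of the lecture). The solution X* = [80, 40] with λ1 = −140, λ2 = −80 is checked against conditions (11)–(18):

```python
# Verify X* = [80, 40], lambda1 = -140, lambda2 = -80 against (11)-(18).
x1, x2 = 80.0, 40.0
lam1, lam2 = -140.0, -80.0
assert 2*x1 + 60 + lam1 + lam2 == 0    # (11): 160 + 60 - 140 - 80 = 0
assert 2*x2 + lam2 == 0                # (12): 80 - 80 = 0
assert lam1 * (x1 - 80) == 0           # (13): both constraints are active
assert lam2 * (x1 + x2 - 120) == 0     # (14)
assert x1 - 80 >= 0                    # (15)
assert x1 + x2 - 120 >= 0              # (16)
assert lam1 <= 0 and lam2 <= 0         # (17), (18)
print("f(X*) =", x1**2 + x2**2 + 60*x1)  # 12800.0
```

Both constraints are active at the optimum, so complementary slackness holds with nonzero multipliers, and the minimum value is f(X*) = 6400 + 1600 + 4800 = 12800.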
