
Coursework of Optimization

Moh. Kamalul Wafi and Prof. Alessandro Astolfi

Due date: Friday, March 20, 2020, 23:59 WIB

This coursework is proposed to deepen the understanding of constrained and
unconstrained optimization delivered in class. Please form a group of 5 people
to solve this coursework. You may use any report model or framework, Word or LaTeX, Bahasa Indonesia or
English! The suggested software is Matlab.
Submit the coursework as: (1) a soft file of the Matlab code (.m); and (2) a hard copy outlining the report and code

Problem 1 - Constrained Optimization.


Maximize: p(x) = 5x1 + 4x2 ;
Subject to: 3x1 + 5x2 ≤ 78, 4x1 + x2 ≤ 36, and x1 , x2 ≥ 0

     x1  x2  s1  s2  p    v
I = [ 3   5   1   0  0 | 78 ]  g1
    [ 4   1   0   1  0 | 36 ]  g2
    [−5  −4   0   0  1 |  0 ]  f

→

     x1  x2      s1       s2  p    v
O = [ 0   1   0.2353  −0.1765  0 | 12 ]
    [ 1   0  −0.0588   0.2941  0 |  6 ]                                     (1)
    [ 0   0   0.6471   0.7647  1 | 78 ]

Maximize: p(x) = 7x1 + 8x2 + 10x3 ;

Subject to: 2x1 + 3x2 + 2x3 ≤ 1000, x1 + x2 + 2x3 ≤ 800, and x1 , x2 , x3 ≥ 0

     x1  x2   x3  s1  s2  p     v
I = [ 2   3    2   1   0  0 | 1000 ]  g1
    [ 1   1    2   0   1  0 |  800 ]  g2
    [−7  −8  −10   0   0  1 |    0 ]  f

→

     x1    x2  x3    s1  s2  p     v
O = [ 1     2   0     1  −1  0 |  200 ]
    [ 0  −1/2   1  −1/2   1  0 |  300 ]                                     (2)
    [ 0     1   0     2   3  1 | 4400 ]

Maximize: p(x) = 6x1 + 8x2 + 4x3 ;

Subject to: 2x1 + x2 + 3x3 ≤ 1, x1 + 2x2 + x3 ≤ 1, 3x1 + 4x2 − 2x3 ≤ 3, and x1 , x2 , x3 ≥ 0

     x1  x2  x3  s1  s2  s3  p   v
I = [ 2   1   3   1   0   0  0 | 1 ]  g1
    [ 1   2   1   0   1   0  0 | 1 ]  g2
    [ 3   4  −2   0   0   1  0 | 3 ]  g3
    [−6  −8  −4   0   0   0  1 | 0 ]  f

→

     x1  x2     x3    s1    s2  s3  p     v
O = [ 1   0    5/3   2/3  −1/3   0  0 |  1/3 ]
    [ 0   1   −1/3  −1/3   2/3   0  0 |  1/3 ]                              (3)
    [ 0   0  −17/3  −2/3  −5/3   1  0 |  2/3 ]
    [ 0   0   10/3   4/3  10/3   0  1 | 14/3 ]

A1: Solve at least question (3) analytically!

A2: Implement the Simplex method algorithm and verify that the three initial tableaux (I) yield
the desired output tableaux (O). Determine the values of x and the optimum value v for each question!
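For A2, a minimal Matlab sketch of the tableau form of the Simplex method is given below. The function name simplex_tableau and all variable names are our own choices, and the sketch assumes a maximization problem with ≤ constraints and b ≥ 0, as in questions (1)-(3); it performs no anti-cycling or unboundedness checks.

% Minimal tableau Simplex (save as simplex_tableau.m):
% maximize c'x subject to A x <= b, x >= 0, with b >= 0.
% Builds the tableau [A I 0 b; -c' 0 1 0] and returns the final tableau.
function T = simplex_tableau(A, b, c)
    [m, n] = size(A);
    T = [A, eye(m), zeros(m,1), b;      % constraint rows g1..gm
        -c', zeros(1,m), 1, 0];         % objective row f
    while any(T(end,1:n+m) < -1e-9)     % a negative reduced cost remains
        [~, j] = min(T(end,1:n+m));     % entering column
        col = T(1:m,j);
        r = T(1:m,end) ./ col;          % ratio test
        r(col <= 1e-9) = inf;           % ignore non-positive pivots
        [~, i] = min(r);                % leaving row
        T(i,:) = T(i,:) / T(i,j);       % normalize the pivot row
        for k = [1:i-1, i+1:m+1]        % eliminate column j elsewhere
            T(k,:) = T(k,:) - T(k,j)*T(i,:);
        end
    end
end

For question (1), simplex_tableau([3 5; 4 1], [78; 36], [5; 4]) should reproduce the output tableau O of Eq. (1), giving x = (6, 12) and v = 78.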

Problem 2 - Unconstrained Optimization.
To contrast the performance, in terms of behaviour and speed of convergence, of various uncon-
strained optimization algorithms, it is customary to construct test functions. The function
to be examined here is the well-known Rosenbrock function,

f (x, y) = 100 (y − x²)² + (1 − x)²                                         (4)

B1: Compute analytically the stationary points of the function in Eq. (4) and verify whether they
are maxima, minima, or saddle points

B2: Plot, using Matlab, the level sets of the function f (x, y) (a starting sketch is given after B5)

B3: Implement the procedures for minimization of the function f (x, y) using (1) the Gradient method;
(2) the Newton method; and (3) the Quasi-Newton (DFP) method, applying the Armijo line search with
initial point (x0 , y0 ) = (−0.75, 1)

B4: Plot, on the (x, y)-plane, the sequences (xk , yk ) of points generated by each algorithm. Are
these sequences converging to the stationary points of f (x, y) computed in B1?

B5: For the sequences (xk , yk ), consider the cost

Jk = log[(xk − 1)² + (yk − 1)²]                                             (5)

and plot, for each algorithm, the cost Jk as a function of k. Use this plot to
analyse the speed of convergence of the algorithms
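As a starting point for B2 and B3, the following Matlab sketch defines f, its gradient, and its Hessian, and draws the level sets; the grid limits, the level values, and the handle names f, g, H are our own choices.

% Rosenbrock function of Eq. (4), with its gradient and Hessian.
f = @(x,y) 100*(y - x.^2).^2 + (1 - x).^2;
g = @(x,y) [-400*x*(y - x^2) - 2*(1 - x);   % df/dx (scalar x, y)
             200*(y - x^2)];                % df/dy
H = @(x,y) [1200*x^2 - 400*y + 2, -400*x;   % Hessian of f
            -400*x,               200];

% B2: level sets of f on a grid around the initial point (-0.75, 1).
[X, Y] = meshgrid(-2:0.01:2, -1:0.01:3);
contour(X, Y, f(X, Y), logspace(-1, 3, 25)) % log-spaced level values
xlabel('x'); ylabel('y'); title('Level sets of the Rosenbrock function')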

Theorem 1 - The Gradient, Newton, and Quasi-Newton (DFP) Algorithms.
T1) Gradient Algorithm
S0 : Given x0 ∈ Rn
S1 : Set k = 0
S2 : Compute ∇f (xk ). If ∇f (xk ) = 0, Stop. Else set dk = −∇f (xk )
S3 : Compute a step αk along the direction dk with Armijo, such that

f (xk + αk dk ) ≤ f (xk ) (6)

S4 : Set xk+1 = xk + αk dk , k = k + 1 and go to S2
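A minimal Matlab sketch of T1 follows, reusing the handles f and g from the Problem 2 sketch and the armijo helper sketched after Theorem 2; the iteration cap and stopping tolerance are our own choices.

% Gradient method (T1): steepest descent with Armijo line search.
x = [-0.75; 1];                     % S0: initial point (x0, y0)
for k = 0:5000                      % S1: iteration counter
    gk = g(x(1), x(2));             % S2: gradient at xk
    if norm(gk) < 1e-8, break, end  %     stop near a stationary point
    dk = -gk;                       %     steepest-descent direction
    ak = armijo(f, gk, x, dk, 1);   % S3: Armijo step along dk
    x = x + ak*dk;                  % S4: update and go to S2
end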


T2) Newton Algorithm
S0 : Given x0 ∈ Rn and ε > 0 sufficiently small (ε ≤ 0.01)
S1 : Set k = 0
S2 : Compute ∇f (xk ). If ∇f (xk ) = 0, Stop. Else compute ∇2 f (xk ). If ∇2 f (xk ) is singular,
set dk = −∇f (xk ) and go to S6
S3 : Compute the Newton direction z by solving the linear system

∇2 f (xk ) z = −∇f (xk )

S4 : If |∇f (xk )⊤ z| < ε ‖∇f (xk )‖ ‖z‖, set dk = −∇f (xk ) and go to S6
S5 : If ∇f (xk )⊤ z < 0, set dk = z; If ∇f (xk )⊤ z > 0, set dk = −z
S6 : Compute a step αk along the direction dk with Armijo so that Eq. (6) is satisfied with
α0 = 1. Set xk+1 = xk + αk dk , k = k + 1 and go to S2
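A corresponding sketch of T2, under the same assumptions (handles f, g, H, and the armijo helper from the other sketches); the singularity test via rcond and its threshold are our own choices.

% Newton method (T2) with the safeguards of S2, S4 and S5.
x = [-0.75; 1]; eps_ = 0.01;                % S0: initial point and epsilon
for k = 0:500                               % S1
    gk = g(x(1), x(2));                     % S2: gradient
    if norm(gk) < 1e-8, break, end
    Hk = H(x(1), x(2));
    if rcond(Hk) < 1e-12                    %     (near-)singular Hessian
        dk = -gk;
    else
        z = -Hk \ gk;                       % S3: Newton direction
        if abs(gk'*z) < eps_*norm(gk)*norm(z)
            dk = -gk;                       % S4: z nearly orthogonal to gk
        elseif gk'*z < 0
            dk = z;                         % S5: z is a descent direction
        else
            dk = -z;
        end
    end
    ak = armijo(f, gk, x, dk, 1);           % S6: Armijo with alpha0 = 1
    x = x + ak*dk;
end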
T3) Quasi-Newton (DFP) Algorithm
S0 : Given x0 ∈ Rn
S1 : Set k = 0
S2 : Compute ∇f (xk ). If ∇f (xk ) = 0, Stop. Else compute Hk with the DFP update, such that

H0 = I
Hk+1 = Hk + (δk δk⊤)/(δk⊤ γk ) − (Hk γk γk⊤ Hk )/(γk⊤ Hk γk )

where γk = ∇f (xk+1 ) − ∇f (xk ) and δk = xk+1 − xk . Set dk = −Hk ∇f (xk )
S3 : Compute a step αk along the direction dk with Armijo so that Eq. (6) is satisfied
S4 : Set xk+1 = xk + αk dk , k = k + 1 and go to S2
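And a sketch of T3, reading S2 as updating H after each step; again f, g, and the armijo helper are assumed from the other sketches.

% Quasi-Newton (DFP) method (T3).
x = [-0.75; 1]; Hq = eye(2);            % S0: initial point, H0 = I
gk = g(x(1), x(2));
for k = 0:5000                          % S1
    if norm(gk) < 1e-8, break, end      % S2: stationarity test
    dk = -Hq*gk;                        %     quasi-Newton direction
    ak = armijo(f, gk, x, dk, 1);       % S3: Armijo step
    xn = x + ak*dk;                     % S4: new iterate
    gn = g(xn(1), xn(2));
    dl = xn - x; gm = gn - gk;          % delta_k and gamma_k
    Hq = Hq + (dl*dl')/(dl'*gm) ...     % DFP update of Hk
            - (Hq*(gm*gm')*Hq)/(gm'*Hq*gm);
    x = xn; gk = gn;
end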
Theorem 2 - The Armijo Line Search Algorithm.
The Armijo method was the first inexact line search method. Let ϕ > 0, σ ∈ (0, 1), and γ ∈ (0, 1/2)
S1 : Set α = ϕ
S2 : If f (xk + α dk ) − f (xk ) ≤ γα ∇f (xk )⊤ dk , set αk = α and Stop. Else go to S3
S3 : Set α = σα and go to S2
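A matching Matlab sketch of the Armijo search used in the sketches above (save as armijo.m); the values of σ and γ are our own choices within the stated ranges, and f is a handle of two arguments as in the Problem 2 sketch.

% Armijo line search (Theorem 2): backtracking from alpha = phi.
% gk is the gradient at x and d the search direction (gk'*d < 0).
function a = armijo(f, gk, x, d, phi)
    sigma = 0.5; gamma = 1e-4;          % sigma in (0,1), gamma in (0,1/2)
    a = phi;                            % S1: initial trial step
    while f(x(1)+a*d(1), x(2)+a*d(2)) - f(x(1), x(2)) > gamma*a*(gk'*d)
        a = sigma*a;                    % S3: shrink alpha and retest (S2)
    end
end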
