
Matrix Computations: Theory and Numerical Methods

Optimization: Basic Principles

John Leth
jjl@es.aau.dk
Department of Electronic Systems,
Aalborg University,
Denmark
Agenda

Optimization
  Continuous differentiable
  Gradient and Hessian
  Taylor series
  Example
  Extrema
  Example
  The nonlinear optimization problem
  Necessary and sufficient conditions
  Necessary conditions
  Example
  Sufficient conditions
  Example (revised)

Convex Analysis
  Convex sets
  Convex combination, hull and cone
  Convex functions
  Properties
  Properties (continued)
  Optimization: Local and global minimizers
  Convex optimization: Existence of global minimizers
  Convex optimization: Maximizers
Optimization
Preliminaries

In this lecture (and the sequel) let

    f : R → R;  x ↦ f(x)

with domain R ⊂ R^n, which we call the feasible region.

Example (matlab)

    f(x) = x_1^2 + x_2^2 + 1,
    R = {x ∈ R^2 | x_2 ≥ x_1^2 + 1, x_2 ≤ 1.5}

Note that if we write

    c_1(x) = −x_2 + x_1^2 + 1,  and  c_2(x) = −x_2 + 1.5,

then we may write the feasible region R as

    R = {x ∈ R^2 | 0 ≥ c_1(x), 0 ≤ c_2(x)}
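A minimal MATLAB sketch (illustrative only; grid and plotting choices are not from the slides) that shades this feasible region and overlays level curves of f:

    % Sketch: shade R = {x | x2 >= x1^2 + 1, x2 <= 1.5} and draw level curves of f
    [x1, x2] = meshgrid(linspace(-1, 1, 400), linspace(0.5, 2, 400));
    fval     = x1.^2 + x2.^2 + 1;                 % cost function values on the grid
    feasible = (x2 >= x1.^2 + 1) & (x2 <= 1.5);   % membership test for R

    figure; hold on;
    contourf(x1, x2, double(feasible), [0.5 0.5]);   % shaded area: the feasible region R
    contour(x1, x2, fval, 20);                       % level curves of f
    xlabel('x_1'); ylabel('x_2');
    title('Feasible region R and level sets of f');
    hold off;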
Optimization
Continuous differentiable

    f : R ⊂ R^n → R;  x ↦ f(x)

Definition
We write f ∈ C^1 if it has continuous first order partial derivatives, that is,

    ∂f/∂x_i (x) exists for all i = 1, ..., n, and all x ∈ R,

and

    x ↦ ∂f/∂x_i (x) is continuous on R for all i = 1, ..., n.

We write f ∈ C^2 if it has continuous second order partial derivatives, that is, f ∈ C^1 and

    ∂^2f/(∂x_i ∂x_j) (x) exists for all i, j = 1, ..., n, and all x ∈ R,

and

    x ↦ ∂^2f/(∂x_i ∂x_j) (x) is continuous on R for all i, j = 1, ..., n.

Example (matlab)

    f(x) = x^2 sin(1/x) (with f(0) := 0) is differentiable but not C^1
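The example can be checked by plotting the derivative f'(x) = 2x sin(1/x) − cos(1/x); a minimal MATLAB sketch (plot range chosen for illustration):

    % Sketch: f(x) = x^2*sin(1/x), f(0) = 0, is differentiable, but f' is not
    % continuous at 0, so f is not C^1.
    x  = linspace(1e-3, 0.2, 5000);
    df = 2*x.*sin(1./x) - cos(1./x);   % f'(x) for x ~= 0; f'(0) = 0 by the limit definition
    plot(x, df);
    xlabel('x'); ylabel('f''(x)');
    title('f''(x) keeps oscillating between roughly -1 and 1 as x approaches 0');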
Optimization
Gradient and Hessian

    f : R ⊂ R^n → R;  x ↦ f(x)

Definition
If f ∈ C^1 we define the gradient (of f at x) as the n-dimensional vector

    ∇f(x) = ( ∂f/∂x_1 (x), ..., ∂f/∂x_n (x) ) ∈ R^n

If f ∈ C^2 we define the Hessian (of f at x) as the n × n matrix

    H(x) = [ ∂^2f/(∂x_j ∂x_i) (x) ] ∈ R^{n×n}

Recall (or note) that
▶ The gradient ∇f(x) is just the (total) derivative f'(x) (or equivalently, the 1 × n Jacobian matrix Jf(x)).
▶ The Hessian H(x) is symmetric (since f ∈ C^2).
▶ If we let g_i(x) = ∂f/∂x_i (x), then H(x) = [ ∇g_1(x) ··· ∇g_n(x) ]^T, where each ∇g_i(x) is considered a column vector.
▶ The Hessian H(x) is just the (total) derivative f''(x) (or equivalently, the symmetric n × n Jacobian matrix J∇f(x)^T = J∇f(x)).
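For the example f(x) = x_1^2 + x_2^2 + 1 the gradient, the Hessian, and the relation between the Hessian and the Jacobian of the gradient can be verified symbolically; a minimal MATLAB sketch, assuming the Symbolic Math Toolbox is available:

    % Sketch: gradient and Hessian of the running example, and H = J(grad f)
    syms x1 x2 real
    f = x1^2 + x2^2 + 1;
    g = gradient(f, [x1; x2])            % column vector [2*x1; 2*x2]
    H = hessian(f, [x1; x2])             % constant matrix [2 0; 0 2]
    isequal(H, jacobian(g, [x1; x2]))    % true: the Hessian is the Jacobian of the gradient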
Optimization
Taylor series

    f : R ⊂ R^n → R;  x ↦ f(x)

The gradient and Hessian can be used to approximate f via the Taylor series.

Theorem
Let x ∈ R, d ∈ R^n and assume that

    {x + αd | 0 ≤ α ≤ α̂} ⊂ R

for some 0 < α̂ ≤ 1.
▶ If f ∈ C^1 then there exists 0 ≤ α ≤ α̂ such that

    f(x + d) = f(x) + ∇f(x + αd)^T d.

▶ If f ∈ C^2 then there exists 0 ≤ α ≤ α̂ such that

    f(x + d) = f(x) + ∇f(x)^T d + (1/2) d^T H(x + αd) d.

By continuity of ∇f and H we obtain (for ||d|| small) a:
▶ Linear approximation: f(x + d) = f(x) + ∇f(x)^T d + O(||d||^2).
▶ Quadratic approximation: f(x + d) = f(x) + ∇f(x)^T d + (1/2) d^T H(x) d + O(||d||^3).
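A quick numeric sanity check of these approximations for the example f(x) = x_1^2 + x_2^2 + 1 (a sketch; since f is itself quadratic, the quadratic model is exact and only the linear error shrinks like ||d||^2):

    % Sketch: approximation errors at x = (1, 1) for f(x) = x1^2 + x2^2 + 1
    f = @(x) x(1)^2 + x(2)^2 + 1;
    x = [1; 1];  g = [2; 2];  H = [2 0; 0 2];   % gradient and Hessian at x
    d0 = [1; -2];
    for t = [1e-1 1e-2 1e-3]
        d = t*d0;
        errLin  = abs(f(x+d) - (f(x) + g'*d));               % shrinks like ||d||^2
        errQuad = abs(f(x+d) - (f(x) + g'*d + 0.5*d'*H*d));  % zero here: f is quadratic
        fprintf('||d|| = %.1e   linear error = %.2e   quadratic error = %.2e\n', ...
                norm(d), errLin, errQuad);
    end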
Optimization
Example

Example (matlab)
Optimization
Extrema

    f : R ⊂ R^n → R;  x ↦ f(x)

Definition
A point x' ∈ R is called
▶ A (weak) [strong] local minimizer of f if there exists ϵ > 0 such that

    (f(x) ≥ f(x'))  [f(x) > f(x')]

for all x ∈ R with ||x − x'|| < ϵ [and x ≠ x'].
▶ A (weak) [strong] global minimizer of f if

    (f(x) ≥ f(x'))  [f(x) > f(x')]

for all x ∈ R [with x ≠ x'].

Points x' as above are collectively called minimizers, and the values f(x') are collectively called minima.
The definition of maximizers and maxima is obtained by reversing the inequalities above.
The minimizers and maximizers are collectively called extremum points, and the minima and maxima are collectively called extrema.
Optimization
Example
Optimization
The nonlinear optimization problem

    f : R ⊂ R^n → R;  x ↦ f(x)

The nonlinear optimization problem:

    minimize    f(x)
    subject to  x ∈ R.

Note the relations

    min f(x) = −max −f(x)
    argmin f(x) = argmax −f(x)

The function f in the above problem is sometimes called a cost function (or functional).
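As an illustration, the example problem from the preliminaries (minimize x_1^2 + x_2^2 + 1 over R = {x | x_2 ≥ x_1^2 + 1, x_2 ≤ 1.5}) can be passed to a generic NLP solver; a minimal MATLAB sketch, assuming the Optimization Toolbox is available (starting point chosen for illustration):

    % Sketch: solve  minimize f(x) subject to x in R  with fmincon
    fun     = @(x) x(1)^2 + x(2)^2 + 1;             % cost function
    A = [0 1];  b = 1.5;                            % linear constraint  x2 <= 1.5
    nonlcon = @(x) deal(-x(2) + x(1)^2 + 1, []);    % c1(x) <= 0, no equality constraints
    x0      = [0.5; 1.4];                           % a feasible starting point
    [xopt, fopt] = fmincon(fun, x0, A, b, [], [], [], [], nonlcon)
    % expected: xopt close to (0, 1) with fopt = 2, on the boundary x2 = x1^2 + 1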
Optimization
Necessary and sufficient conditions

    f : R ⊂ R^n → R;  x ↦ f(x)

There are two sets of conditions for the nonlinear optimization problem:
▶ Necessary conditions, which are satisfied at a minimizer x'.
▶ Sufficient conditions, which guarantee that x' is a minimizer.

We will need the following concept to describe the necessary and sufficient conditions.

Definition
A vector d ∈ R^n is called a feasible direction (w.r.t. R) at a point x ∈ R^n if there exists α̂ > 0 such that

    x + αd ∈ R,

for all 0 ≤ α ≤ α̂.

In the sequel let F(x) denote the set of feasible directions at x, and int(R) denote the interior of the feasible region R.
Optimization
Necessary conditions

    f : R ⊂ R^n → R;  x ↦ f(x)

Theorem (First order necessary conditions)
Let f ∈ C^1. If x' is a local minimizer, then

    ∇f(x')^T d ≥ 0  for all d ∈ F(x'),
    ∇f(x') = 0      if x' ∈ int(R).

Theorem (Second order necessary conditions)
Let f ∈ C^2. If x' is a local minimizer, then
▶ For all d ∈ F(x'):

    ∇f(x')^T d ≥ 0, and
    if ∇f(x')^T d = 0 then d^T H(x') d ≥ 0.

▶ If x' ∈ int(R):

    ∇f(x') = 0,
    d^T H(x') d ≥ 0  for all d ≠ 0.
Optimization
Example

Example (matlab)
See if x' = (0, 0) is a local minimizer for

    f1(x) = x_1^2 + x_2^2,  R = R^2,
    f2(x) = x_1^2 − x_2^2,  R = R^2,
    f3(x) = x_1^3 + x_2^3,  R = R^2.

We compute

    ∇f1(x') = (0, 0),  H1(x') = [2 0; 0 2],
    ∇f2(x') = (0, 0),  H2(x') = [2 0; 0 −2],
    ∇f3(x') = (0, 0),  H3(x') = [0 0; 0 0].

Hence x' is a local minimizer candidate for f1 and f3, but not for f2. Note that x' would be a minimizer candidate for f2 if, e.g., R is the cone spanned by (1, 1) and (1, −1).
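The second order check amounts to inspecting the eigenvalues of the Hessians; a minimal MATLAB sketch restating the computation above:

    % Sketch: second order necessary condition at x' = (0,0) via eigenvalues
    H1 = [2 0; 0 2];   H2 = [2 0; 0 -2];   H3 = [0 0; 0 0];
    eig(H1)   % 2, 2    -> H1 >= 0: x' remains a candidate for f1
    eig(H2)   % -2, 2   -> H2 indefinite: x' is not a local minimizer of f2 on R^2
    eig(H3)   % 0, 0    -> H3 >= 0: the test is inconclusive, x' remains a candidate for f3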
Optimization
Sufficient conditions

    f : R ⊂ R^n → R;  x ↦ f(x)

Theorem (Second order sufficient conditions)
Let f ∈ C^2. If x' ∈ int(R) and

    ∇f(x') = 0,
    d^T H(x') d > 0  for all d ≠ 0,

then x' is a strong local minimizer.
Optimization
Example (revised)

Example (revised)
See if x' = (0, 0) is a local minimizer for

    f1(x) = x_1^2 + x_2^2,  R = R^2,
    f2(x) = x_1^2 − x_2^2,  R = R^2,
    f3(x) = x_1^3 + x_2^3,  R = R^2.

We compute

    ∇f1(x') = (0, 0),  H1(x') = [2 0; 0 2],
    ∇f2(x') = (0, 0),  H2(x') = [2 0; 0 −2],
    ∇f3(x') = (0, 0),  H3(x') = [0 0; 0 0].

Hence x' is a strong local minimizer for f1.
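The positive definiteness required by the sufficient condition can also be checked numerically, for instance with a Cholesky test; a minimal MATLAB sketch (an eigenvalue check would do equally well):

    % Sketch: verify the second order sufficient condition for f1 at x' = (0,0)
    H1 = [2 0; 0 2];
    [~, p] = chol(H1);   % p == 0 exactly when H1 is (numerically) positive definite
    p                    % 0, so d'*H1*d > 0 for all d ~= 0: x' is a strong local minimizer of f1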
Convex Analysis
Introduction

In the sequel we will study a particularly nice class of real valued functions, referred to as convex (and concave) functions, whose extremum points are guaranteed to be global. Moreover, the first order necessary conditions presented above also become sufficient conditions for this class.
Convex Analysis
Convex sets

Definition
A set Rc ⊂ R^n is said to be convex if for every two points x_1, x_2 ∈ Rc

    αx_1 + (1 − α)x_2 ∈ Rc,

for all 0 < α < 1. A set which is not convex is called non-convex.
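The definition can be probed numerically on the feasible region R from the first example, which is convex (it is the intersection of the region above the parabola x_2 = x_1^2 + 1 with a half-plane); a minimal MATLAB sketch of such a random test (sampling ranges chosen for illustration):

    % Sketch: test convexity of R = {x | x2 >= x1^2 + 1, x2 <= 1.5} on random pairs of points
    inR = @(x) (x(2) >= x(1)^2 + 1) && (x(2) <= 1.5);
    rng(0); violations = 0;
    for k = 1:1e4
        x = [2*rand - 1; 1 + 0.5*rand];  if ~inR(x), continue; end   % rejection sampling in R
        y = [2*rand - 1; 1 + 0.5*rand];  if ~inR(y), continue; end
        a = rand;
        violations = violations + ~inR(a*x + (1 - a)*y);
    end
    violations   % 0: no convex combination of sampled points of R left R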
Convex Analysis
Convex combination, hull and cone

Here are two “examples” of convex sets.

Definition
A convex combination of the points x_1, ..., x_m in R^n is a point x of the form

    x = Σ α_i x_i   with α_i ≥ 0 and Σ α_i = 1.

The convex hull of a subset R ⊂ R^n is the (convex) set of all convex combinations of points in R.
A subset C ⊂ R^n is a convex cone if

    αx + βy ∈ C,

for all α, β > 0 and x, y ∈ C.
Convex Analysis
Convex functions

Definition
A function

    f : Rc → R;  x ↦ f(x)

defined on a convex set Rc ⊂ R^n is said to be convex if for every two points x_1, x_2 ∈ Rc

    f(αx_1 + (1 − α)x_2) ≤ αf(x_1) + (1 − α)f(x_2),

for all 0 < α < 1. If

    f(αx_1 + (1 − α)x_2) < αf(x_1) + (1 − α)f(x_2),   x_1 ≠ x_2,

for all 0 < α < 1, then f is said to be strictly convex.
A (strictly) concave function f is one with the property that −f is (strictly) convex.
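The defining inequality is easy to spot-check numerically for the cost function f(x) = x_1^2 + x_2^2 + 1 from the first example (which is in fact strictly convex); a minimal MATLAB sketch (sample size chosen for illustration):

    % Sketch: spot-check the convexity inequality for f(x) = x1^2 + x2^2 + 1
    f = @(x) x(1)^2 + x(2)^2 + 1;
    rng(1); worst = -inf;
    for k = 1:1e4
        x = randn(2,1);  y = randn(2,1);  a = rand;
        gap   = f(a*x + (1 - a)*y) - (a*f(x) + (1 - a)*f(y));   % should be <= 0
        worst = max(worst, gap);
    end
    worst   % <= 0 (up to rounding), consistent with f being convex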
Convex Analysis
Properties

In the sequel we let Rc denote a convex set.

Theorem
▶ If f_1 and f_2 are convex functions on Rc and a, b ≥ 0, then af_1 + bf_2 is convex on Rc.
▶ If f is a (strictly) convex function on Rc and K ∈ R, then the sublevel set

    {x ∈ Rc | f(x) ≤ K}   ( {x ∈ Rc | f(x) < K} )

is a convex set.
▶ A function f is convex on Rc iff its epigraph {(x, µ) | x ∈ Rc, µ ≥ f(x)} is convex.
Convex Analysis
Properties (continued)

In the sequel we write A > 0 (resp. A ≥ 0) to indicate that the n × n matrix A is positive (semi-)definite.

Theorem
▶ Let f ∈ C^1. Then f is a (strictly) convex function on Rc iff for all x, x' ∈ Rc

    f(x) ≥ f(x') + ∇f(x')^T (x − x'),
    ( f(x) > f(x') + ∇f(x')^T (x − x') for x ≠ x' ).

▶ Let f ∈ C^2. Then f is a convex function on Rc iff H(x) ≥ 0 for all x ∈ Rc; if H(x) > 0 for all x ∈ Rc, then f is strictly convex.
Convex Analysis
Optimization: Local and global minimizers

Theorem (Local and global minimizers)
If f is a convex function on Rc, then
▶ The set of minimizers of f is convex.
▶ Any local minimum of f is a global minimum of f.
Convex Analysis
Convex optimization: Existence of global minimizers

Theorem (Existence of global minimizers)
Let f ∈ C^1 be a convex function on Rc. If x' ∈ Rc is such that

    ∇f(x')^T (x − x') ≥ 0,

for all x ∈ Rc, then x' is a (global) minimizer of f.

Note that this says that the first order necessary conditions are also sufficient conditions.
Convex Analysis
Convex optimization: Maximizers

Theorem (Maximizers)
Let f be a convex function on a bounded and closed (convex) set Rc. Then the maximum of f is attained on the boundary of Rc.

Example (matlab)
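A possible illustration of the theorem (a sketch, not necessarily the demo used in the lecture): evaluate the convex function f(x) = x_1^2 + x_2^2 + 1 on a grid over the closed convex set Rc = [−1, 1] × [−1, 1]; the maximum is attained at a corner, i.e. on the boundary:

    % Sketch: the maximum of a convex function over a closed bounded convex set
    % is attained on the boundary (here: a corner of the square).
    [x1, x2] = meshgrid(linspace(-1, 1, 201));
    fval = x1.^2 + x2.^2 + 1;
    [fmax, k] = max(fval(:));
    [x1(k), x2(k), fmax]   % returns a corner of the square, e.g. (-1, -1), with f = 3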
