
University of Zagreb, Croatia

Lecture 1
Heuristic Optimization Methods:
Introduction to Optimization
Slides prepared by Nina Skorin-Kapov

Academic year 2020/2021


Graduate Studies Programme
Outline

 Introduction to optimization and heuristics

 General steps in solving optimization problems

 Optimization models

 Introduction to combinatorial optimization

2
Why optimization?

 Countless applications – science, engineering, business, economics
 Some form of optimization is used in every company
 Every process can potentially be optimized
  Minimize: time, cost, risk, …
  Maximize: profit, quality, efficiency, …

 Example: Design a network to optimize the cost and Quality of Service

3
Why heuristics?

 Many problems are too complex (intractable) to solve exactly in reasonable time
 In practice, we are satisfied with ‘good’ solutions, not necessarily optimal, that we can find quickly
 “Heuristics”
  Comes from the old Greek word ‘heuriskein’ = the art of discovering new strategies (rules) to solve problems
  Algorithms which do not guarantee optimality but usually find ‘good enough’ solutions in reasonable time

4
Solving optimization problems

 General decision-making process

   Identify (formulate) the problem → Model the problem → Optimize the problem

 If the solution is not good enough, try a new model and/or optimization approach

5
Solving optimization problems

1. Identify the problem
  Identify the decision problem (objectives, inputs (parameters), constraints)
  May be imprecise
2. Model the problem
  Model: simplification of reality
  Build an abstract mathematical model or reduce to a well-studied optimization problem
3. Optimize the problem
  Generate a ‘good’ (sub)optimal solution using a solving procedure/algorithm
6
Identifying the problem

 Identify the decision problem
  objectives
  inputs (parameters)
  constraints
 May be imprecise
 Difficult to identify the relevant information

7
Modeling the problem

 Model – simplification of reality
 The quality of the solution depends on the quality of the model
  NOTE: we are finding a solution for the abstract model and not the original
 2 common approaches:
  Simplified model, exact approach
    Problem → Model_simpl → Solution_exact(Model_simpl)
  Precise model, approximate approach
    Problem → Model_precise → Solution_approx(Model_precise)
8
Optimizing the problem

 Optimizing the problem described by the abstract model
 Can create problem-specific algorithms or apply state-of-the-art (SOA) algorithms used for similar problems
 Many standard optimization techniques available
 Identifying the best solution approach for a specific problem is difficult

9
Optimization problems

 Can be defined with (S, f)
  S is the set of candidate solutions (solution space, search space)
  f : S → R is the objective function to optimize
  S is defined by the input parameters and constraints

 Main components of an optimization problem


necessary to identify/model
1. Objective function
2. Variables
3. (Constraints)

10
Global optimum

 Global optimum:
  A solution s* ∈ S is a global optimum if
   ∀s ∈ S, f(s*) ≤ f(s) (minimization problem)
   ∀s ∈ S, f(s*) ≥ f(s) (maximization problem)

 The main goal in solving an optimization problem is finding the global optimum (or one of them if there are multiple global optima)

11
Local vs. Global Optimum

Minimization problem

[Figure: plot of f(s) over the solution space S, showing several local optima and the global optimum s*]
12
Optimization models

Most commonly used:


 Mathematical programming
 Constraint programming

 Other:
 Modeling uncertainty
 Modeling dynamicity
 Modeling robustness

13
Mathematical program

minimize/maximize f(x1, …, xn)        ← objective function (Rn → R)
subject to:
  g1(x1, …, xn) ≤ b1
   ⋮                                  ← inequality constraints (Rn → R)
  gm(x1, …, xn) ≤ bm
  h1(x1, …, xn) = c1
   ⋮                                  ← equality constraints (Rn → R)
  hp(x1, …, xn) = cp
  x = (x1, …, xn) ∈ Rn                ← vector of decision variables

Find x = (x1, …, xn) that minimizes/maximizes f subject to the constraints
14
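The components above map directly onto code. A minimal sketch (the objective and both constraints are an invented example, not part of the slides) that checks whether a point x lies in the solution space:

```python
# A mathematical program as plain Python functions (illustrative example):
#   minimize  f(x1, x2) = x1**2 + x2**2
#   subject to  g1: x1 + x2 <= 4   and   h1: x1 - x2 = 0

f = lambda x: x[0] ** 2 + x[1] ** 2

ineq = [(lambda x: x[0] + x[1], 4)]   # pairs (g_i, b_i), meaning g_i(x) <= b_i
eq   = [(lambda x: x[0] - x[1], 0)]   # pairs (h_j, c_j), meaning h_j(x) == c_j

def feasible(x, tol=1e-9):
    """x belongs to the solution space S iff every constraint holds."""
    return (all(g(x) <= b + tol for g, b in ineq)
            and all(abs(h(x) - c) <= tol for h, c in eq))

print(feasible((1, 1)), feasible((3, 2)))   # True False
```

The tolerance `tol` matters in practice: equality constraints are rarely met exactly in floating point.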
Solution space

Solution space:

  S = { x ∈ Rn such that x fulfils the constraints },  S ⊆ Rn

[Figure: the feasible region S drawn as a subset of Rn]

 OBJECTIVE: Find the global optimum, i.e., x in S that minimizes/maximizes the objective function

 - S empty: no solution exists; the problem is too constrained
 - S non-empty: one or more (even infinitely many) optimal solutions can exist (with the same value of f)

15
Some classifications

 Optimization problems modeled using mathematical programming can be classified according to the nature of:
  the objective function f and constraint functions gi and hj
  the decision variables x
  the type of input parameters

16
Some classifications

Classification                            Refers to…
Linear vs. nonlinear programming          …the type of objective function and/or constraints
Discrete vs. continuous programming       …the type of variables
Stochastic vs. deterministic programming  …the nature of the parameters in the constraints or objective function
Finite vs. non-finite programming         …the number of decision variables
Constrained vs. non-constrained           …whether the constraints are defined or not

17
Linear programming (LP)

Standard form of an LP problem:

  min c1x1 + … + cnxn
  subject to:
    a11x1 + … + a1nxn = b1
     ⋮
    am1x1 + … + amnxn = bm
    x1 ≥ 0, …, xn ≥ 0

In matrix form:

  min cT·x
  subject to:
    Ax = b
    x ≥ 0

where xT = (x1, …, xn) is the vector of continuous decision variables, cT = (c1, …, cn) and bT = (b1, …, bm) are constant vectors, and A = [aij] is the constant m × n matrix of coefficients.
18
Linear programming (LP)

 
Canonical form of an LP problem:

  min c1x1 + … + cnxn
  subject to:
    a11x1 + … + a1nxn ≤ b1
     ⋮
    am1x1 + … + amnxn ≤ bm
    x1 ≥ 0, …, xn ≥ 0

In matrix form:

  min cT·x
  subject to:
    Ax ≤ b
    x ≥ 0

where xT = (x1, …, xn) is the vector of continuous decision variables, cT = (c1, …, cn) and bT = (b1, …, bm) are constant vectors, and A = [aij] is the constant m × n matrix of coefficients.

19
Linear programming (LP)
Each inequality constraint can be transformed into:
- one equality constraint with a new decision variable (slack variable), AND
- one inequality constraint: the slack variable must be non-negative

  min 5x1 + 3x2 − 6x3                       min 5x1 + 3x2 − 6x3
  subject to:                               subject to:
    x2 + 10x3 = 9         equivalent to       x2 + 10x3 = 9
    x1 + 3x2 ≤ 1                              x1 + 3x2 + x4 = 1
    x1 ≥ 0, x2 ≥ 0, x3 ≥ 0                    x1 ≥ 0, x2 ≥ 0, x3 ≥ 0, x4 ≥ 0

Slack variable:
- one per inequality constraint
- does not appear in the objective function
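This transformation can be mechanized: each inequality row of the constraint matrix gains one slack column. A sketch (the helper name and the list-of-lists data layout are assumptions for illustration, not standard library code):

```python
def add_slacks(A_eq, b_eq, A_ub, b_ub):
    """Turn A_ub x <= b_ub into equalities by appending one slack
    variable (column) per inequality row; each slack must be >= 0."""
    m = len(A_ub)                      # number of inequality rows
    rows, rhs = [], []
    for r in A_eq:                     # existing equalities: pad with zeros
        rows.append(list(r) + [0] * m)
    rhs.extend(b_eq)
    for i, r in enumerate(A_ub):       # each inequality gets its own slack
        slack = [0] * m
        slack[i] = 1
        rows.append(list(r) + slack)
        rhs.append(b_ub[i])
    return rows, rhs

# The example from this slide: x2 + 10*x3 = 9 and x1 + 3*x2 <= 1
A, b = add_slacks([[0, 1, 10]], [9], [[1, 3, 0]], [1])
print(A)   # [[0, 1, 10, 0], [1, 3, 0, 1]]  i.e. x1 + 3*x2 + x4 = 1
print(b)   # [9, 1]
```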
20
Linear programming (LP)

 Each equality constraint can be transformed into two inequality constraints

  min 5x1 + 3x2 − 6x3                       min 5x1 + 3x2 − 6x3
  subject to:                               subject to:
    x2 + 10x3 = 9         equivalent to       x2 + 10x3 ≤ 9
    x1 + 3x2 ≤ 1                              x2 + 10x3 ≥ 9
    x1 ≥ 0, x2 ≥ 0, x3 ≥ 0                    x1 + 3x2 ≤ 1
                                              x1 ≥ 0, x2 ≥ 0, x3 ≥ 0

21
LP Example

 A company makes 2 types of ice cream: A and B. To make ice cream A, they need 1 bag of sugar, one bottle of milk and 1 bag of flavoring per kg of ice cream. To make ice cream B, they need 2 bottles of milk and 1 bag of flavoring per kg (it is sugar-free). The company buys 7 bags of sugar, 10 bottles of milk and 8 bags of flavoring daily. The profit per kg of ice cream A is 2.25 Euros and for ice cream B is 4 Euros. How much of each type should the company produce daily to maximize their profit?

22
LP Example – LP formulation

  Max  2.25x1 + 4x2           ← objective function
  s.t.
       x1 ≤ 7
       x1 + 2x2 ≤ 10          ← problem-specific constraints
       x1 + x2 ≤ 8
       x1 ≥ 0 and x2 ≥ 0      ← non-negativity constraints

Efficient exact algorithms have been developed to solve LPs:
• Simplex method
• Interior point methods
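Without an LP solver at hand, this small instance can be sanity-checked by brute force over a grid. This only works because the optimal vertex of this particular LP happens to lie on the grid; in general one would run simplex or an interior point method.

```python
# Brute-force grid check for:  max 2.25*x1 + 4*x2
# s.t.  x1 <= 7,  x1 + 2*x2 <= 10,  x1 + x2 <= 8,  x1, x2 >= 0
best, best_x = None, None
for i in range(33):               # x1 = 0.00, 0.25, ..., 8.00
    for j in range(33):           # x2 likewise
        x1, x2 = i / 4, j / 4
        if x1 <= 7 and x1 + 2 * x2 <= 10 and x1 + x2 <= 8:
            profit = 2.25 * x1 + 4 * x2
            if best is None or profit > best:
                best, best_x = profit, (x1, x2)

print(best_x, best)   # (6.0, 2.0) 21.5
```

The optimum sits at the vertex where x1 + 2x2 = 10 meets x1 + x2 = 8: make 6 kg of A and 2 kg of B for a daily profit of 21.5 Euros.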
23
Integer linear programming (ILP or IP)

 Many real-world problems have discrete variables
 ILP: discrete decision variables, linear objective function and constraints
 BIP: binary decision variables
 Mixed ILP (MILP): discrete and continuous decision variables
 Harder to solve than LP
 Exact techniques: branch and bound, …
 Approximate: relaxations, decomposition approaches, cutting plane algorithms
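To get a feel for BIP, a tiny 0/1 knapsack (the values, weights, and capacity are invented) can be solved by enumerating all 2^n binary decision vectors; branch and bound explores the same search tree but prunes most of it.

```python
from itertools import product

# Tiny 0/1 knapsack as a BIP (illustrative data):
#   maximize total value  s.t.  total weight <= capacity,  x_i in {0, 1}
values   = [10, 7, 4, 3]
weights  = [5, 4, 3, 2]
capacity = 9

best, best_x = 0, None
for x in product((0, 1), repeat=len(values)):    # all 2**n binary vectors
    if sum(w * xi for w, xi in zip(weights, x)) <= capacity:
        v = sum(vi * xi for vi, xi in zip(values, x))
        if v > best:
            best, best_x = v, x

print(best_x, best)   # (1, 1, 0, 0) 17
```

With n = 4 this is 16 candidates; with n = 60 it is over 10^18, which is why exact ILP solvers rely on pruning and relaxations.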
24
Nonlinear programming

 The objective function and/or constraints are


not linear
 Difficult to solve – often use linearization techniques:
  Linearizing a product of the variables
  Logical conditions
  Ordered sets of variables

 For quadratic and convex continuous problems, some exact methods exist for small instances
 Heuristics are good candidates

25
Constraint programming (CP)

 Uses richer modeling tools than (MI)LP models
 Model: composed of a set of variables, each with a finite domain of values
 Symbolic or mathematical constraints can be expressed
 Global constraints can refer to a set of variables

26
CP vs. IP Example

 General assignment problem: assign objects o1, …, on to locations l1, …, ln

 CP model: n variables
   yi = the index of the location where object oi is assigned
   alldifferent(y1, …, yn)

 (B)IP model: n² variables
   xij = 1 if oi is assigned to lj, 0 otherwise
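The modeling difference can be mimicked in plain Python: iterating over permutations enforces alldifferent by construction, so only n variables (one tuple y) are needed rather than n² binaries (the 3×3 cost matrix is invented for illustration):

```python
from itertools import permutations

# cost[i][j] = cost of assigning object o_i to location l_j (illustrative data)
cost = [[9, 2, 8],
        [4, 3, 7],
        [3, 1, 9]]
n = len(cost)

# CP-style model: n variables y_0..y_{n-1}; alldifferent(y) holds for free
# because we only ever enumerate permutations of the location indices.
best_y = min(permutations(range(n)),
             key=lambda y: sum(cost[i][y[i]] for i in range(n)))

print(best_y)   # (1, 2, 0): o_0 -> l_1, o_1 -> l_2, o_2 -> l_0, total cost 12
```

A BIP formulation of the same instance would introduce 9 binary variables xij plus row and column constraints to express what the permutation encodes implicitly.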
27
CP vs IP

 However, solving the problem with CP is not necessarily faster or easier
 (M)IP solvers – use techniques to prune the search tree
 CP solvers – use constraint programming
techniques to reduce the variable domains
 Good for “tight” (very constrained) problems
 e.g. scheduling

28
Other models

 Uncertainty
  Input data is subject to noise
  Ex: variable processing times in scheduling problems
  Use stochastic programming
 Dynamicity
  Input data changes over time (may or may not be known in advance)
  Ex: dynamic routing
  The objective function is deterministic but changes over time
 Robustness
  Decision variables may change after the solution has been implemented
  The solution should be acceptable in the presence of small changes

29
Combinatorial optimization

 Combinatorial optimization problems:
  Discrete decision variables
  Finite search space
  Objective function can be anything ((non)linear, non-analytic, …)

 Many real-world problems fall into this category

30
Examples of classical combinatorial
optimization problems

 Travelling salesman problem (TSP)


 Satisfiability problem (SAT)
 Knapsack problem
 Graph coloring (GC)
 Steiner tree problem (STP)
 Bin packing (BP)
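The TSP illustrates why heuristics matter: the exhaustive search below (distance matrix invented) is exact, but it examines (n−1)! tours, which becomes hopeless beyond roughly a dozen cities.

```python
from itertools import permutations

# Symmetric distance matrix for 4 cities (illustrative data).
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
n = len(d)

def tour_length(tour):
    """Total length of the closed tour starting and ending at city 0."""
    path = (0,) + tour + (0,)
    return sum(d[path[i]][path[i + 1]] for i in range(len(path) - 1))

# Fix city 0 as the start; try every ordering of the remaining cities.
best = min(permutations(range(1, n)), key=tour_length)
print(best, tour_length(best))   # (1, 3, 2) 18
```

Each of the heuristic methods covered later in the course trades this guarantee of optimality for the ability to handle thousands of cities.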

31
