Linear Programming and Applications

Generic Solution Methods for LP


ISE: 503

Syed N Mujahid
mnusyed@gmail.com

February 12, 2023



Outline of the Course

① Preliminaries

② Introduction to Linear Programming

③ Generic Solution Methods for LP

④ Simplex Based LP Solution Methods

⑤ Enhanced Solution Methods via Duality

⑥ Decomposition Principle for LP

⑦ Network Simplex Method & Applications



Chapter Outline
1 Graphical Search Methods
2D Graphical Method
Requirement Space Method
2 Algebraic Search Tools
Improving Directions
Feasible Directions
General Algorithms
3 Improving Search Illustrated
Boundary Point
Interior Point
Exterior Point
4 Preprocessing LPs
Basic Analysis
Advanced Analysis
5 Summary
2D Graphical Method: Example-1
Exercise-1: Solve the LP graphically.

maximize :
100 aw + 30 ac
subject to :
ac + aw ≤ 7
4 ac + 10 aw ≤ 40
10 ac ≥ 30
ac ≥ 0
aw ≥ 0

How many variables? Two (ac and aw), so the problem can be drawn in the (ac, aw) plane.


Constructing the feasible region one constraint at a time (figures omitted; all plots are in the (ac, aw) plane with the origin [0,0]T at the lower-left corner):
☞ aw, ac ≥ 0: restriction on the domain (the non-negative quadrant).
☞ 10 ac ≥ 30: first draw the equation 10 ac = 30, then select the appropriate half space.
☞ ac + aw ≤ 7: draw ac + aw = 7, then select the half space containing the origin.
☞ 4 ac + 10 aw ≤ 40: draw 4 ac + 10 aw = 40, then select the half space containing the origin.

The corner points of the resulting region are:
a1 = [3, 0]T
a2 = [3, 2.8]T
a3 = [5, 2]T
a4 = [7, 0]T


Observation-1:
☞ The shaded region is called the 'Feasible Set'.
☞ It is the intersection of half spaces bounded by linear equations (hyperplanes in higher dimensions).
☞ Here it is bounded. (Typically, when the feasible set is bounded we call it a 'Polytope'; when it is unbounded, we call it a 'Polyhedron'.)


Observation-2:
☞ The feasible set divides the variable space into two sets.
☞ Generally, a solution can be represented by a VECTOR.
☞ A vector inside the feasible set is a 'Feasible Solution'; one outside it is an 'Infeasible Solution'.


Feasible or Infeasible Solution?
a5 = [2, 2]T is infeasible (it violates 10 ac ≥ 30).
a6 = [5, 1]T is feasible (it satisfies all five constraints).


Let us plot the objective function:
maximize :
100 aw + 30 ac
It can be plotted by converting it to an equation, z = 100 aw + 30 ac, but this adds one dimension to the number of variables. In order to remove the extra dimension, we can plot its contours instead: the line 100 aw + 30 ac = 100 is the contour z = 100, and the contours z = 200, 300, ..., 800 form a family of parallel lines that move away from the origin as z increases.


Overlaying the contours on the feasible region:
maximize :
100 aw + 30 ac
subject to :
ac + aw ≤ 7
4 ac + 10 aw ≤ 40
10 ac ≥ 30
ac ≥ 0
aw ≥ 0
The optimal solution will be a feasible solution that has the maximum objective function value. Sliding the contour in the direction of increasing z, the last feasible point it touches is the corner a2:
optimal solution: a2 = a⋆ = [3, 2.8]T
objective value: z⋆ = 370
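
A quick numerical cross-check of this result, as a sketch (scipy is an assumption here; any LP solver would do). linprog minimizes, so the maximization objective is negated; variables are ordered [ac, aw]:

from scipy.optimize import linprog

c = [-30, -100]           # maximize 100 aw + 30 ac -> minimize -(30 ac + 100 aw)
A_ub = [[1, 1],           # ac + aw <= 7
        [4, 10],          # 4 ac + 10 aw <= 40
        [-10, 0]]         # 10 ac >= 30, rewritten as -10 ac <= -30
b_ub = [7, 40, -30]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print(res.x, -res.fun)    # expected: [3.0, 2.8] and 370.0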


2D Graphical Method: Example-2

Exercise-2: Solve the LP graphically.

maximize :
5 x1 + 5 x2
subject to :
2x1 + x2 ≤ 8
x1 + x2 ≤ 6
x1 + 3x2 ≥ 3
x1 , x2 ≥ 0

How many variables? Again two, so the same graphical construction applies in the (x1, x2) plane.


Constructing the feasible region (figures omitted):
☞ x1, x2 ≥ 0: restriction on the domain.
☞ 2x1 + x2 ≤ 8: half space containing the origin.
☞ x1 + x2 ≤ 6: half space containing the origin.
☞ x1 + 3x2 ≥ 3: half space not containing the origin.

The corner points of the resulting region are:
a1 = [0, 1]T
a2 = [0, 6]T
a3 = [2, 4]T
a4 = [4, 0]T
a5 = [3, 0]T


The objective function contours:
z = 5 x1 + 5 x2
For example, the line 5 x1 + 5 x2 = 1 is the contour z = 1; the contours z = 2, 3, ..., 8 are parallel lines moving away from the origin (figure omitted).


Overlaying the contours on the feasible region:
maximize :
5 x1 + 5 x2
subject to :
2x1 + x2 ≤ 8
x1 + x2 ≤ 6
x1 + 3x2 ≥ 3
x1 , x2 ≥ 0
The optimal solution will be a feasible solution that has the maximum objective function value. Here the contours are parallel to the edge x1 + x2 = 6, so the maximum is attained along an entire edge:
optimal solution: all points that lie on the segment joining a2 and a3 are optimal.
objective value: z⋆ = 30
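
The alternate optima can be checked numerically as well (sketch; scipy is an assumption):

from scipy.optimize import linprog

c = [-5, -5]                          # maximize 5 x1 + 5 x2
A_ub = [[2, 1],                       # 2 x1 + x2 <= 8
        [1, 1],                       #   x1 + x2 <= 6
        [-1, -3]]                     # x1 + 3 x2 >= 3, rewritten as -x1 - 3 x2 <= -3
b_ub = [8, 6, -3]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print(res.x, -res.fun)                # some point on the segment a2-a3, value 30.0
# Both corners a2 = (0, 6) and a3 = (2, 4) attain 30, so every convex
# combination of them is optimal too.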


2D Graphical Method: Summary
Solving a linear program will result in one of the following cases:

✪ Infeasible Solution
✪ Unbounded Solution
✪ Optimal Solution
★ Unique Optimal
★ Alternate Optimal


Unbounded Solution (figure omitted: an unbounded feasible set with contours z = 10, 20, 30, 40, 50 increasing without limit):
For the case of maximization, the problem is unbounded. If the objective is to minimize, then we have a solution.
Note: an unbounded feasible set does not imply an unbounded LP.


Unique Optimal (figure omitted: maximization examples with contours z = 100, 200, 300, the best contour touching the feasible region at a single point). The objective function type is to maximize.


Alternate Optimal (figure omitted: a maximization example with contours z = 15, 20, 33, 50, 100, 150, the best contour containing an entire face of the feasible region). The objective function type is to maximize.


Requirement Space: Key Idea (Equality Constraints)
Consider the LP in standard form:

minimize: cT x, (1)
subject to: Ax = b, (2)
x ≥ 0. (3)

Requirement Space: Equations (2) & (3) enforce that b, the RHS requirement vector, should belong to the conic combinations of the columns of A.
☞ Feasibility: the problem is feasible if and only if b belongs to the conic combinations of the columns of A.
☞ Optimality: the objective function can be seen as another equation with a new variable z, i.e., the row cT x = z stacked on top of Ax = b:

[cT ; A] x = [z ; b]

Thus, finding an optimal solution is similar to finding the optimal value of z in the above expanded space (the dimension increases by 1).
☞ Unboundedness: using the vector [z, bT]T in the expanded space, unboundedness can be detected.
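The feasibility statement can be tested mechanically: b lies in the cone of the columns of A exactly when Ax = b, x ≥ 0 has a solution. A minimal sketch (scipy and the data below are assumptions):

import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])
b = np.array([4.0, 3.0])

# A zero objective turns linprog into a pure feasibility check of
# Ax = b, x >= 0 (x >= 0 is linprog's default bound).
res = linprog(np.zeros(A.shape[1]), A_eq=A, b_eq=b)
print(res.success)    # True here: b = 4 a1 + 3 a2 is a conic combination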


Requirement Space: Key Idea (Inequality Constraints)

The requirement space idea can be extended to analyze an LP in canonical form: with constraints Ax ≤ b, x ≥ 0, feasibility means b can be written as Ax + s with s ≥ 0, i.e., b must be reachable from the cone generated by the columns a1, a2, a3 after adding the slack directions (figure omitted: the cone of a1, a2, a3 and the region marked ≤ b).


Requirement Space: Farkas' Theorem
Farkas' Lemma
Let A be an m × n matrix, and c be an n-vector. Exactly one of the following two systems has a solution:

System 1 : Ax ≤ 0 and cT x > 0 for some x ∈ Rn .

System 2 : AT y = c and y ≥ 0 for some y ∈ Rm .
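Farkas' lemma can be exercised computationally (sketch; scipy and the data are assumptions): since exactly one system is solvable, checking System 2 by an LP feasibility problem also decides System 1.

import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
c = np.array([1.0, 1.0])

# System 2: A^T y = c, y >= 0 (y >= 0 is linprog's default bound).
res = linprog(np.zeros(A.shape[0]), A_eq=A.T, b_eq=c)
print("System 2 has a solution" if res.success
      else "System 1 has a solution")   # here: System 2, with y = [0.4, 0.2]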


Move

Moving along a direction

Let x1 ∈ Rn be a point, and suppose that we want to move along a non-zero direction d ∈ Rn from x1 . Then the ray emanating from x1 can be expressed as:

x = x1 + λ d,

where λ > 0 is the step size along direction d. (Figure omitted: the ray from x1 in the direction d.)


Gradients of a Function

The gradient is defined as:

∇f(x) = [ ∂f(x)/∂x1 , ∂f(x)/∂x2 , . . . , ∂f(x)/∂xn ]T

☞ The gradient of f(x) = cT x is c.
☞ The gradient at a point indicates a possible direction which increases the function's value.


Improving Directions

Improving direction
Let x1 ∈ Rn be a feasible point. A nonzero direction d ∈ Rn is an improving direction if f(x1 + λd) is better than f(x1), where f : Rn → R is the objective function.

For linear programs in the standard form, an improving direction satisfies:

maximization: cT (x + d) > cT x, i.e., cT d > 0
minimization: cT (x + d) < cT x, i.e., cT d < 0

☞ What can you say about the directions c and −c?
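
The answer can be checked numerically: d = c is always improving for a maximization and d = −c for a minimization (whenever c ≠ 0), since cT c > 0. A small sketch:

import numpy as np

c = np.array([100.0, 30.0])   # the Example-1 objective coefficients
d = c
print(c @ d > 0)              # True: c is improving for maximization
print(c @ (-d) < 0)           # True: -c is improving for minimization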


Feasible Directions

Feasible direction
Let x1 ∈ Rn be a feasible point. A nonzero direction d ∈ Rn is a feasible direction if there exists a scalar λ > 0 such that x1 + λd is feasible.

For linear programs in the standard form, the properties of a feasible direction are (since Ax1 = b, the requirement A(x1 + λd) = b reduces to Ad = 0):
Ad = 0, (1)
x1 + λd ≥ 0, (2)
λ > 0, (3)
d ≠ 0. (4)

☞ What can you infer when a d ≥ 0 satisfies Equation (1)? (The ray x1 + λd stays feasible for every λ > 0.)
☞ Suppose some of the elements of d are negative. The maximum step length that keeps x1 + λd ≥ 0 is

λmax = min { x1r / |dr| : dr < 0, Ad = 0 }
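The ratio test above translates directly into code (a sketch; the example vectors are assumptions):

import numpy as np

def max_step(x, d):
    """Largest lambda keeping x + lambda*d >= 0 (assumes A @ d = 0 holds)."""
    neg = d < 0
    if not neg.any():
        return np.inf                  # d >= 0: the ray stays feasible forever
    return np.min(x[neg] / -d[neg])

x = np.array([2.0, 1.0, 3.0])
d = np.array([1.0, -2.0, -1.0])
print(max_step(x, d))                  # min(1/2, 3/1) = 0.5
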
An Idea

☞ Let x(0) be an initial feasible solution.
☞ When can x(0) be optimal? (When no nonzero direction at x(0) is both improving and feasible.)
☞ Let d(0) be a nonzero direction at x(0) which is both improving and feasible. Having the knowledge of x(0) and d(0), an improved solution is obtained by the move

x(1) = x(0) + λ(0) d(0)

☞ Questions: How to get λ(0) and d(0) ?
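
Put together, these pieces suggest the following generic improving-search scheme (a sketch only; find_direction and step_length are hypothetical oracles standing in for the machinery developed in later chapters):

def improving_search(x, find_direction, step_length, max_iters=1000):
    # x is a numpy vector holding the current feasible solution.
    for _ in range(max_iters):
        d = find_direction(x)      # an improving AND feasible direction, or None
        if d is None:
            return x               # no such direction: x is optimal
        lam = step_length(x, d)    # e.g. the lambda_max ratio test above
        x = x + lam * d
    return x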


Improving Search: Boundary Point

(Figure omitted: in the (x1, x2) plane, a sequence of improving, feasible moves taken along the boundary of the feasible region, from one boundary point to the next.)


Improving Search: Interior Point

(Figure omitted: a sequence of improving, feasible moves taken through the interior of the feasible region.)


Improving Search: Exterior Point

(Figure omitted: a sequence of improving moves taken from outside the feasible region; the iterates are infeasible until the search reaches the feasible set.)


Preprocessing LPs: Goals

minimize: cT x,
subject to:
Ax ⋇ b,
x ≥ 0,

where ⋇ stands for any mix of ≤, =, and ≥ relations. The main goals of LP preprocessing are:

✎ Fix as many variables as possible;
✎ Reduce the number of variables and constraints by eliminations;
✎ Eliminate as many redundant constraints as possible;
✎ Estimate new implied bounds on as many variables as possible.


Preprocessing: Basic Analysis
The basic reductions for an LP can be stated as follows; each is detailed in turn below:
☞ All-Zero Rows
☞ All-Zero Columns
☞ Singleton Rows
☞ Singleton Columns
☞ Grained Rows
☞ Grained Columns


☞ All-Zero Rows: Redundant or Infeasible Constraint
If the LP at any time has any of the following constraint structures, then that constraint can be removed from the LP:

0T x = 0,
0T x ≤ +ve, or
0T x ≥ −ve.

Otherwise, if the LP at any time has any of the following constraint structures, then the LP is infeasible:

0T x ≤ −ve or
0T x ≥ +ve.

☞ All-Zero Columns: Fixing or Unbounded Variable
If the LP at any time has a variable xj whose coefficient is zero in every constraint, so that it appears only in the objective term cj xj , then xj can be fixed and the column removed from the LP. For a minimization with xj ≥ 0, the following cases arise:
✎ If cj > 0, then xj = 0.
✎ If cj < 0, then the primal problem is unbounded.
Remember, the updated bounds on a variable should be feasible w.r.t. the existing bounds. In addition to the above cases, the objective function is compensated accordingly.

☞ Singleton Rows: New Bounds or Fixing Variable
If the LP at any time has a constraint in which exactly one variable has a nonzero coefficient, i.e., one of the structures

ai,j xj = bi ,
ai,j xj ≥ bi , or
ai,j xj ≤ bi ,

then the bounds on that primal variable can be improved. The following cases arise:
✎ If ai,j > 0, then correspondingly the bounds on the primal variable can be defined as:
xj = bi /ai,j , xj ≥ bi /ai,j , or xj ≤ bi /ai,j .
✎ If ai,j < 0, then correspondingly the bounds on the primal variable can be defined as:
xj = bi /ai,j , xj ≤ bi /ai,j , or xj ≥ bi /ai,j .
If the primal variable's new bounds are tight, then some of the primal constraints may become redundant. If the bounds conflict with the existing bounds, then the primal problem is infeasible. If the primal variable's bounds do not contain zero, then the corresponding dual constraint is active at optimality. If the primal variable's value is fixed, then the variable can be removed from the LP and the objective function is compensated accordingly.

☞ Singleton Columns: New Bounds or Fixing Dual Variable
If the LP at any time has a variable xj that appears in the objective with cost cj and has exactly one nonzero constraint coefficient, ai,j in constraint i, then the bounds on the dual variable yi can be improved. Assuming that xj is a non-negative variable, the following cases arise:
✎ If ai,j > 0, then yi ≤ cj /ai,j is a bound on the dual variable.
✎ If ai,j < 0, then yi ≥ cj /ai,j is a bound on the dual variable.
If the dual variable's new bounds are tight, then some of the dual constraints may become redundant. If the bounds conflict with the existing bounds, then the dual problem is infeasible. If the dual variable's bounds do not contain zero, then the corresponding primal constraint is active at optimality. If the dual variable's value is fixed, then the variable can be removed from the LP and the objective function is compensated accordingly.

☞ Grained Rows: Eliminating Constraint or Infeasible Problem
The idea of singleton rows can be further extended to grained rows. A grained row is a constraint that is obviously redundant or infeasible. For example, say in a grained row all the coefficients are non-negative (and all variables are non-negative). The following two cases arise:
✎ If bi ≤ 0 and the grained constraint is of the following type:

ai,1 x1 + · · · + ai,j xj + · · · + ai,n xn ≥ bi ,

then the constraint is redundant and can be eliminated.
✎ If bi < 0 and the grained constraint is of the following type:

ai,1 x1 + · · · + ai,j xj + · · · + ai,n xn ≤ bi ,

then the primal problem is infeasible. If bi = 0, then all the primal variables xj with ai,j > 0 can be fixed to zero.



☞ Grained Columns: Fixing Variable or Unbounded Problem
Similarly, singleton columns can be further extended to grained columns. A grained column is a column of the LP whose corresponding dual constraint is a grained constraint. For example, say in a grained column all the coefficients are non-negative and the constraints are of a specific type. The following two cases arise:
✎ If cj > 0 and the column of xj has coefficients a1,j , . . . , am,j ≥ 0 appearing only in ≤-type constraints, then xj = 0.
✎ If cj < 0 and the column of xj has coefficients a1,j , . . . , am,j ≥ 0 appearing only in ≥-type constraints, then the primal problem is unbounded. If cj = 0, then all the dual variables corresponding to the constraints containing a non-zero coefficient for xj can be fixed.
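
A minimal sketch of the two simplest reductions (all-zero rows and singleton rows) on data for constraints A x ≤ b with x ≥ 0; a real presolver would also update bounds, fix variables, and compensate the objective:

import numpy as np

def basic_presolve(A, b):
    keep = []
    for i in range(A.shape[0]):
        nz = np.flatnonzero(A[i])
        if nz.size == 0:                   # all-zero row
            if b[i] < 0:
                raise ValueError(f"row {i}: 0 <= {b[i]} is infeasible")
            continue                       # redundant: drop the row
        if nz.size == 1:                   # singleton row: implied bound
            j = nz[0]
            sense = "<=" if A[i, j] > 0 else ">="
            print(f"row {i}: x[{j}] {sense} {b[i] / A[i, j]}")
        keep.append(i)
    return A[keep], b[keep]

A = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.0]])
b = np.array([5.0, 6.0, 4.0])
print(basic_presolve(A, b))   # drops row 0, reports x[0] <= 3.0 from row 1
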
Preprocessing: Duplicate Rows
If the LP contains two rows i and k whose coefficients are proportional, i.e.,

ai,1 x1 + · · · + ai,n xn ⅁i bi
α ai,1 x1 + · · · + α ai,n xn ⅁k bk

where α > 0, then these rows are duplicates. Now, the following cases arise:

✎ If ⅁i ≡ ⅁k , then one of the two constraints (whichever is weaker after scaling) is redundant and can be removed from the LP.
✎ If ⅁i ≢ ⅁k , then the LP may become primal infeasible.
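
Proportional rows are easy to detect in code (sketch; the example matrix is an assumption):

import numpy as np

def duplicate_rows(A, tol=1e-12):
    pairs = []
    for i in range(A.shape[0]):
        nz = np.flatnonzero(A[i])
        if nz.size == 0:
            continue
        for k in range(i + 1, A.shape[0]):
            alpha = A[k, nz[0]] / A[i, nz[0]]
            if alpha > 0 and np.allclose(A[k], alpha * A[i], atol=tol):
                pairs.append((i, k, alpha))
    return pairs

A = np.array([[1.0, 1.0, 2.0],
              [3.0, 3.0, 6.0]])
print(duplicate_rows(A))   # [(0, 1, 3.0)]; Example-2 below contains this pair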


Preprocessing: Duplicate Columns
If the LP contains two columns j and r whose coefficients are proportional, i.e., cj xj and cr xr in the objective with ai,r = α ai,j in every constraint i, where α ≠ 0, then these columns are duplicates. Now, the following cases arise:
✎ If cr = α cj and the primal variable types are the same, then one of the columns can be removed from the problem.
✎ If cr ≠ α cj , then one of the primal variables can be fixed (dual redundancy), or dual infeasibility can be detected.


Preprocessing: Primal Constraints Analysis
Upper and lower bounds on a constraint's activity can be computed from the bounds lpj ≤ xj ≤ upj on the variables. For example, let the constraint be:

Σj ai,j xj ⅁ bi

where ⅁ can be equality or inequality. Consider the following:

UPi = Σ{j : ai,j > 0} ai,j upj + Σ{j : ai,j < 0} ai,j lpj (1)
LPi = Σ{j : ai,j > 0} ai,j lpj + Σ{j : ai,j < 0} ai,j upj (2)

Now, the following cases arise:

✎ If ⅁ ≡ =: if LPi ≤ bi ≤ UPi then the constraint is feasible, and if LPi = bi or bi = UPi , then the variables can be fixed at the corresponding bounds. Otherwise, if LPi > bi or UPi < bi , then the constraint is primal infeasible.
✎ If ⅁ ≡ ≤: if LPi ≤ bi ≤ UPi then the constraint is feasible, and if LPi = bi , then the variables can be fixed at the corresponding bounds. Otherwise, if LPi > bi , then the constraint is primal infeasible. If UPi ≤ bi then the constraint is redundant.
✎ If ⅁ ≡ ≥: if LPi ≤ bi ≤ UPi then the constraint is feasible, and if UPi = bi , then the variables can be fixed at the corresponding bounds. Otherwise, if UPi < bi , then the constraint is primal infeasible. If LPi ≥ bi then the constraint is redundant.
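
Equations (1) and (2) in code (sketch; finite bounds are assumed, and the sample row anticipates the redundancy found in Example-1 below, where x2's upper bound of 10 is an arbitrary stand-in for +infinity):

import numpy as np

def activity_bounds(a, lp, up):
    pos, neg = a > 0, a < 0
    UP = a[pos] @ up[pos] + a[neg] @ lp[neg]
    LP = a[pos] @ lp[pos] + a[neg] @ up[neg]
    return LP, UP

a  = np.array([-1.0, 2.0])           # the row -x2 + 2 x3 <= 12
lp = np.array([1.0, 0.0])            # x2 >= 1
up = np.array([10.0, 8.0 / 3.0])     # x3 <= 8/3
print(activity_bounds(a, lp, up))    # UP = -1 + 16/3 = 13/3 < 12: redundant
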
Preprocessing: Dual Constraints Analysis
Upper and lower bounds on a dual constraint's activity can be computed from the bounds ldi ≤ yi ≤ udi on the dual variables. For example, let the dual constraint be:

Σi ai,j yi ⅁ cj

where ⅁ can be equality or inequality. Consider the following:

UDj = Σ{i : ai,j > 0} ai,j udi + Σ{i : ai,j < 0} ai,j ldi (3)
LDj = Σ{i : ai,j > 0} ai,j ldi + Σ{i : ai,j < 0} ai,j udi (4)

The checks are the same as for the primal constraints, applied to the dual. The following things can happen:
✎ A redundant constraint in the dual implies the corresponding primal variable is undefined (typically fixed to zero, when the bounds are treated separately from the constraints);
✎ An infeasible constraint in the dual implies infeasibility or unboundedness in the primal;
✎ A variable-fixing constraint in the dual implies (when the dual variable is not fixed to zero) that the corresponding primal constraints are active.


Preprocessing: Primal Implied Bounds
For any variable j and constraint i, bounds on the variable can be obtained as follows:

ai,j xj + Σ{r ≠ j} ai,r xr ⅁ bi

where ⅁ can be equality or inequality, and ai,j > 0. Consider the activity bounds of the remaining terms:

UIi,j = Σ{r ≠ j : ai,r > 0} ai,r upr + Σ{r ≠ j : ai,r < 0} ai,r lpr = UPi − ai,j upj (5)
LIi,j = Σ{r ≠ j : ai,r > 0} ai,r lpr + Σ{r ≠ j : ai,r < 0} ai,r upr = LPi − ai,j lpj (6)

Now the following cases arise:

✎ If ⅁ ≡ ≥, then the constraint is written as:

ai,j xj ≥ bi − Σ{r ≠ j} ai,r xr

and since the right-hand side is always at least bi − UIi,j , a new implied lower bound on xj can be defined as:

xj ≥ (bi − UIi,j) / ai,j .

The cases ⅁ ≡ ≤ and ⅁ ≡ = are analogous, yielding an implied upper bound and a two-sided implied bound, respectively.

To sum up, the following things can happen:
✎ Bounds on the variables can be improved (tightening the initial bounds);
✎ Infeasibility can be detected from the variable bounds;
✎ Variables can be fixed.


Preprocessing: Dual Implied Bounds
The same analysis applies to the dual. The following things can happen:
✎ Bounds on the dual variables can be improved (tightening the initial bounds). If a dual variable's bounds do not contain zero, then the corresponding primal constraint is active;
✎ Infeasibility can be detected from the dual variable bounds, which implies the primal is infeasible or unbounded;
✎ Dual variables can be fixed. If they are fixed to a nonzero value, then the corresponding primal constraint is active.


Preprocessing: Free Variable Elimination
If an LP contains a free variable in an equality-type constraint, then the variable and the constraint can be removed from the LP by substitution. For example, let variable j be unrestricted in constraint i:

ai,j xj + Σ{r ≠ j} ai,r xr = bi

Then substituting the following in place of xj will eliminate variable j and constraint i:

xj = (1/ai,j) (bi − Σ{r ≠ j} ai,r xr)

However, the above process may not be time-efficient when there are too many substitutions containing too many terms. Thus, the following formula can be used to decide whether the substitution is efficient or not:

(dj − 1)(ri − 1) ≤ β(dj + ri) (7)

where dj is the number of nonzeros in column j, ri is the number of nonzeros in row i, and β is a fill-in tolerance. The smaller the value of β, the more conservative the substitution test. Typically, the value can be set to 4 for practical scenarios.
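
Equation (7) as a one-line test (sketch; the sample counts are assumptions):

def substitution_is_efficient(d_j, r_i, beta=4):
    """d_j: nonzeros in column j; r_i: nonzeros in row i; beta: fill-in tolerance."""
    return (d_j - 1) * (r_i - 1) <= beta * (d_j + r_i)

print(substitution_is_efficient(3, 5))     # True:  2*4 = 8 <= 4*8 = 32
print(substitution_is_efficient(40, 50))   # False: 39*49 = 1911 > 4*90 = 360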


Preprocessing: Example - 1
Q. Solve the following LP:

min. : 3x1 + 4x2 − 3x3 + 10x4

s.t. : 2x1 − x2 + 2x3 + 10x4 ≤ 12
2x1 + 3x3 + x4 ≤ 8
−4x1 + 3x2 − 10x4 ≥ 3
x1 , x2 , x3 , x4 ≥ 0


Sol: Writing the third constraint in ≤ form (4x1 − 3x2 + 10x4 ≤ −3) and forming the dual:

min. : −12y1 − 8y2 + 3y3

s.t. : 2y1 + 2y2 + 4y3 ≤ 3
−y1 − 3y3 ≤ 4
2y1 + 3y2 ≤ −3
10y1 + y2 + 10y3 ≤ 10
y1 , y2 , y3 ≤ 0

Applying the presolving ideas (all-zero rows/columns, singleton rows/columns, grained rows/columns, duplicate rows & columns, primal & dual constraints analysis, primal & dual implied bounds, free variable elimination) in turn:

☞ The column corresponding to x4 is grained: its dual constraint, 10y1 + y2 + 10y3 ≤ 10, implies at best y2 ≤ 10.
☞ Since y2 ≤ 10 is redundant (the dual already requires y2 ≤ 0), that dual constraint is never active, and hence x4 = 0. Removing x4:

min. : 3x1 + 4x2 − 3x3
s.t. : 2x1 − x2 + 2x3 ≤ 12
2x1 + 3x3 ≤ 8
4x1 − 3x2 ≤ −3
x1 , x2 , x3 ≥ 0

☞ The dual constraint corresponding to x1 , 2y1 + 2y2 + 4y3 ≤ 3, is redundant (its left-hand side is never positive), which implies x1 = 0 at optimality (or: the column corresponding to x1 is grained, which implies x1 = 0). Removing x1:

min. : 4x2 − 3x3
s.t. : −x2 + 2x3 ≤ 12
3x3 ≤ 8
−3x2 ≤ −3
x2 , x3 ≥ 0

☞ There are now two singleton rows, which imply x2 ≥ 1 and x3 ≤ 8/3. The bounds on x2 do not contain zero, which implies the corresponding dual constraint is tight at optimality: −y1 − 3y3 = 4.
☞ Notice that with the updated bounds, the first constraint becomes redundant: its largest possible activity is −(1) + 2(8/3) = 13/3 < 12.
☞ Thus, the LP boils down to a variable-separable problem:

min. : 4x2 − 3x3
s.t. : x2 ≥ 1, 0 ≤ x3 ≤ 8/3

The optimal solution is x1 = 0, x2 = 1, x3 = 8/3 and x4 = 0.
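
A quick solver cross-check of the presolved answer (sketch; scipy is an assumption):

from scipy.optimize import linprog

c = [3, 4, -3, 10]
A_ub = [[2, -1, 2, 10],    # 2 x1 - x2 + 2 x3 + 10 x4 <= 12
        [2, 0, 3, 1],      # 2 x1 + 3 x3 + x4 <= 8
        [4, -3, 0, 10]]    # -4 x1 + 3 x2 - 10 x4 >= 3, in <= form
b_ub = [12, 8, -3]

res = linprog(c, A_ub=A_ub, b_ub=b_ub)   # x >= 0 is the default bound
print(res.x, res.fun)      # expected: [0, 1, 2.666..., 0] and -4.0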


Preprocessing: Example - 2
Q. Solve the following LP:

max. : 5x1 + 3x2 + x3


s.t. : x1 + x2 + 2x3 ≤ 6
3x1 + 3x2 + 6x3 ≤ 18
x1 , x 2 , x 3 ≥ 0



Preprocessing: Example - 2
Q. Solve the following LP:

max. : 5x1 + 3x2 + x3 min. : 6y1 + 18y2


s.t. : x1 + x2 + 2x3 ≤ 6 s.t. : y1 + 3y2 ≥ 5
3x1 + 3x2 + 6x3 ≤ 18 y1 + 3y2 ≥ 3
x1 , x 2 , x 3 ≥ 0 2y1 + 6y2 ≥ 1
y1 , y 2 ≥ 0

Sol: Applying the presolving ideas:

All-Zero Rows
All-Zero Columns
Singleton Rows
Singleton Columns
Grained Rows
Grained Columns
Duplicate Rows & Columns
Primal & Dual Constraints Analysis
Primal & Dual Implied Bounds
Free Variable Elimination

Syed N Mujahid Linear Programming February 12, 2023 33 / 38


Preprocessing: Example - 2
Q. Solve the following LP:

max. : 5x1 + 3x2 + x3 min. : 6y1 + 18y2


s.t. : x1 + x2 + 2x3 ≤ 6 s.t. : y1 + 3y2 ≥ 5
3x1 + 3x2 + 6x3 ≤ 18 y1 + 3y2 ≥ 3
x1 , x 2 , x 3 ≥ 0 2y1 + 6y2 ≥ 1
y1 , y 2 ≥ 0

Sol: Applying the presolving ideas:

All-Zero Rows
One of the two constraints is duplicate, and can be removed.
All-Zero Columns
Singleton Rows
Singleton Columns
Grained Rows
Grained Columns
Duplicate Rows & Columns
Primal & Dual Constraints Analysis
Primal & Dual Implied Bounds
Free Variable Elimination

Syed N Mujahid Linear Programming February 12, 2023 33 / 38


Preprocessing: Example - 2
Q. Solve the following LP:

max. : 5x1 + 3x2 + x3 min. : 6y1


s.t. : x1 + x2 + 2x3 ≤ 6 s.t. : y1 ≥ 5
x1 , x 2 , x 3 ≥ 0 y1 ≥ 3
2y1 ≥ 1
y1 ≥ 0

Sol: Applying the presolving ideas:

All-Zero Rows
One of the two constraints is duplicate, and can be removed.
All-Zero Columns
Now, all the columns are singleton. This implies dual bounds can be analyzed. Singleton Rows
Singleton Columns
Grained Rows
Grained Columns
Duplicate Rows & Columns
Primal & Dual Constraints Analysis
Primal & Dual Implied Bounds
Free Variable Elimination

Syed N Mujahid Linear Programming February 12, 2023 33 / 38


Preprocessing: Example - 2
Q. Solve the following LP:

max. : 5x1 + 3x2 + x3 min. : 6y1


s.t. : x1 + x2 + 2x3 ≤ 6 s.t. : y1 ≥ 5
x1 , x 2 , x 3 ≥ 0 y1 ≥ 3
2y1 ≥ 1
y1 ≥ 0

Sol: Applying the presolving ideas:

All-Zero Rows
One of the two constraints is duplicate, and can be removed.
All-Zero Columns
Now, all the columns are singleton. This implies dual bounds can be analyzed. Singleton Rows
We get the following: y1 ≥ 5, y1 ≥ 3, y1 ≥ 0.5, from the three columns. Singleton Columns
Grained Rows
Grained Columns
Duplicate Rows & Columns
Primal & Dual Constraints Analysis
Primal & Dual Implied Bounds
Free Variable Elimination

Syed N Mujahid Linear Programming February 12, 2023 33 / 38


Preprocessing: Example - 2
Q. Solve the following LP:

max. : 5x1 + 3x2 + x3 min. : 6y1


s.t. : x1 + x2 + 2x3 ≤ 6 s.t. : y1 ≥ 5
x1 , x 2 , x 3 ≥ 0 y1 ≥ 3
2y1 ≥ 1
y1 ≥ 0

Sol: Applying the presolving ideas:

All-Zero Rows
One of the two constraints is duplicate, and can be removed.
All-Zero Columns
Now, all the columns are singleton. This implies dual bounds can be analyzed. Singleton Rows
We get the following: y1 ≥ 5, y1 ≥ 3, y1 ≥ 0.5, from the three columns. Singleton Columns
Grained Rows
Thus, the updated bound on the dual variable is 5 ≤ y1 ≤ ∞. The constraints corresponding to Grained Columns
x2 and x3 are redundant. This implies x2 = x3 = 0. Duplicate Rows & Columns
Primal & Dual Constraints Analysis
Primal & Dual Implied Bounds
Free Variable Elimination

Syed N Mujahid Linear Programming February 12, 2023 33 / 38


Preprocessing: Example - 2
Q. Solve the following LP:

max. : x3 min. : 6y1


s.t. : 2x3 ≤ 6 s.t. : y1 ≥ 5
x3 ≥ 0 y1 ≥ 0

Sol: Applying the presolving ideas:

All-Zero Rows
One of the two constraints is duplicate, and can be removed.
All-Zero Columns
Now, all the columns are singleton. This implies dual bounds can be analyzed. Singleton Rows
We get the following: y1 ≥ 5, y1 ≥ 3, y1 ≥ 0.5, from the three columns. Singleton Columns
Grained Rows
Thus, the updated bound on the dual variable is 5 ≤ y1 ≤ ∞. The constraints corresponding to Grained Columns
x2 and x3 are redundant. This implies x2 = x3 = 0. Duplicate Rows & Columns
Clearly, the bounds on the dual variable does not contain zero. Primal & Dual Constraints Analysis
Primal & Dual Implied Bounds
Free Variable Elimination

Syed N Mujahid Linear Programming February 12, 2023 33 / 38


Preprocessing: Example - 2
Q. Solve the following LP (primal on the left, dual on the right):

max. : 5x1 + 3x2 + x3                min. : 6y1 + 18y2
s.t. : x1 + x2 + 2x3 ≤ 6             s.t. : y1 + 3y2 ≥ 5
       3x1 + 3x2 + 6x3 ≤ 18                 y1 + 3y2 ≥ 3
       x1 , x2 , x3 ≥ 0                     2y1 + 6y2 ≥ 1
                                            y1 , y2 ≥ 0
Sol: Applying the presolving ideas:
One of the two primal constraints is a duplicate (the second row is three times the first) and can be removed; the dual then involves y1 alone (see the sketch after this slide).
Now all the columns are singletons, so the dual bounds can be analyzed: the three columns give y1 ≥ 5, y1 ≥ 3 and y1 ≥ 0.5.
Thus, the updated bound on the dual variable is 5 ≤ y1 < ∞. The dual constraints corresponding to x2 and x3 are then redundant, which implies x2 = x3 = 0.
Clearly, the bounds on the dual variable do not contain zero, so the remaining primal constraint x1 ≤ 6 is active. This implies x1 = 6.
Thus, the optimal solution is: x1 = 6, x2 = 0 and x3 = 0.
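Proportional rows like this pair can be caught with a simple scan; a minimal sketch assuming dense rows (production presolvers typically hash sparsity patterns instead):

```python
import numpy as np

def duplicate_rows(A, b, tol=1e-9):
    """Return pairs (i, k) where row k of [A | b] is a positive multiple
    of row i; such <=-rows describe the same half-space, so one can go."""
    M = np.column_stack([A, b])
    pairs = []
    for i in range(len(M)):
        nz = np.nonzero(M[i])[0]
        if len(nz) == 0:
            continue                      # all-zero row: a different rule
        for k in range(i + 1, len(M)):
            r = M[k, nz[0]] / M[i, nz[0]]
            if r > 0 and np.allclose(M[k], r * M[i], atol=tol):
                pairs.append((i, k))
    return pairs

# Example 2: row (3, 3, 6 | 18) is 3 times row (1, 1, 2 | 6)
A = np.array([[1.0, 1.0, 2.0],
              [3.0, 3.0, 6.0]])
b = np.array([6.0, 18.0])
print(duplicate_rows(A, b))   # [(0, 1)]
```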
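A quick sanity check of the presolved answer with scipy.optimize.linprog (a sketch; linprog minimizes, so the maximization objective is negated):

```python
from scipy.optimize import linprog

# Example 2: max 5x1 + 3x2 + x3  <=>  min -5x1 - 3x2 - x3
res = linprog(c=[-5, -3, -1],
              A_ub=[[1, 1, 2], [3, 3, 6]],
              b_ub=[6, 18],
              bounds=[(0, None)] * 3)
print(res.x, -res.fun)   # expected: x = (6, 0, 0), objective value 30
```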


Preprocessing: Example - 3
Q. Solve the following LP:

max. : 2x1 + 3x2 − x3 − x4
s.t. : x1 + x2 + x3 − 2x4 ≤ 4
       −x1 − x2 + x3 − x4 ≤ 1
       x1 + x4 ≤ 3
       x1 , x2 , x3 , x4 ≥ 0

Sol: Convert the objective to minimization, and track for each constraint its activity limits [LP, UP] and dual-variable bounds [ld, ud], and for each variable its primal bounds [lp, up] and dual-constraint activity limits [LD, UD]. Initially only the dual-variable bounds are known (y ≤ 0, since all rows are ≤-constraints of a minimization problem):

min. : −2x1 − 3x2 + x3 + x4                      ld    ud
s.t. : x1 + x2 + x3 − 2x4 ≤ 4                    −∞    0
       −x1 − x2 + x3 − x4 ≤ 1                    −∞    0
       x1 + x4 ≤ 3                               −∞    0
       x1 , x2 , x3 , x4 ≥ 0
lp :    0    0    0    0
up :    ∞    ∞    ∞    ∞

Applying the presolving ideas:
The column for x3 is a grained column: its cost (+1) and all of its constraint coefficients are nonnegative, so increasing x3 only worsens the objective and tightens the rows. ⇒ x3 = 0, and the column is removed (a detection sketch follows).
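One common reading of the deck's grained-column test, sketched as a hypothetical helper for a minimization LP min c⊤x, Ax ≤ b, x ≥ 0, shown on the Example-3 data:

```python
import numpy as np

def grained_columns(A, c):
    """For min c'x s.t. A x <= b, x >= 0: a column j with c[j] >= 0 and
    A[:, j] >= 0 (not all zero) only worsens the objective and tightens
    the rows as x_j grows, so x_j = 0 holds at some optimum."""
    return [j for j in range(A.shape[1])
            if c[j] >= 0 and np.all(A[:, j] >= 0) and np.any(A[:, j] != 0)]

# Example 3 in minimization form; x3 is column index 2
A = np.array([[1.0, 1.0, 1.0, -2.0],
              [-1.0, -1.0, 1.0, -1.0],
              [1.0, 0.0, 0.0, 1.0]])
c = np.array([-2.0, -3.0, 1.0, 1.0])
print(grained_columns(A, c))   # [2]  ->  fix x3 = 0
```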
Primal constraint analysis now gives activity limits: with the current variable bounds, the three rows have [LP, UP] = (−∞, ∞), (−∞, 0] and [0, ∞), respectively. The second row has UP = 0 ≤ 1, so the second constraint is redundant and is removed.
With that row gone, the column for x2 is a singleton column: x2 appears only in the first row, with coefficient +1 and cost −3, so its dual constraint gives y1 ≤ −3 (a sketch of this step follows below).
Since a dual bound improved, analyze the dual constraints: the dual constraint corresponding to x1 reads y1 + y2 ≤ −2, and with y1 ≤ −3 and y2 ≤ 0 it can never be tight, i.e., it is redundant. ⇒ x1 = 0, and its column is removed.
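The singleton-column step is the dual mirror of the singleton-row step. A sketch under the same assumptions (min c⊤x, Ax ≤ b, x ≥ 0, so the dual constraints are A⊤y ≤ c with y ≤ 0):

```python
import numpy as np

def singleton_column_dual_bounds(A, c):
    """For min c'x s.t. A x <= b, x >= 0, the dual constraints are
    A'y <= c with y <= 0. A column j whose only nonzero is a = A[i, j]
    gives a * y_i <= c[j]: an upper bound y_i <= c[j] / a when a > 0,
    and a lower bound y_i >= c[j] / a when a < 0."""
    ld, ud = {}, {}
    for j in range(A.shape[1]):
        (rows,) = np.nonzero(A[:, j])
        if len(rows) != 1:
            continue
        i = rows[0]
        a = A[i, j]
        if a > 0:
            ud[i] = min(ud.get(i, np.inf), c[j] / a)
        else:
            ld[i] = max(ld.get(i, -np.inf), c[j] / a)
    return ld, ud

# Example 3 after fixing x3 and dropping row 2; variables (x1, x2, x4)
A = np.array([[1.0, 1.0, -2.0],
              [1.0, 0.0, 1.0]])
c = np.array([-2.0, -3.0, 1.0])
print(singleton_column_dual_bounds(A, c))   # ({}, {0: -3.0})  ->  y1 <= -3
```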
After removing x1, the last constraint is a singleton row: x4 ≤ 3, so the primal bound improves to up(x4) = 3.
Since a primal bound improved, analyze the primal constraints again: the first row's activity x2 − 2x4 now lies in [−6, ∞) and the second row's activity x4 lies in [0, 3].
Constraint analysis also updates the dual limits: the dual-constraint activities become [LD, UD] = (−∞, −3] for the column of x2 and (−∞, ∞) for the column of x4.
Implied bounds analysis on the first row tightens the primal limit on x2: isolating x2 in x2 − 2x4 ≤ 4 gives x2 ≤ 4 + 2x4 ≤ 4 + 2(3), i.e., x2 ≤ 10 (a generic sketch follows below).
The same analysis on the dual side, applied to the column of x4 (−2y1 + y2 ≤ 1), tightens the dual limit: y2 ≤ 1 + 2y1 ≤ 1 + 2(−3), i.e., y2 ≤ −5.
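Implied-bound analysis isolates one variable in a row and bounds the remaining terms by their current limits. A sketch of the primal update (my own helper; the dual update y2 ≤ 1 + 2y1 follows the same pattern on the column system):

```python
import numpy as np

def implied_upper_bound(a, b, j, lo, up):
    """From a row a.x <= b with a[j] > 0, isolate x_j:
    x_j <= (b - sum over i != j of the smallest value of a[i] * x_i) / a[j],
    taking lo[i] when a[i] > 0 and up[i] when a[i] < 0."""
    rest = sum(a[i] * (lo[i] if a[i] > 0 else up[i])
               for i in range(len(a)) if i != j and a[i] != 0)
    return (b - rest) / a[j]

# Row x2 - 2*x4 <= 4 of Example 3, with 0 <= x4 <= 3; variables (x2, x4)
a = np.array([1.0, -2.0])
lo = np.array([0.0, 0.0])
up = np.array([np.inf, 3.0])
print(implied_upper_bound(a, 4.0, 0, lo, up))   # 10.0  ->  x2 <= 10
```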
The reduced LP and its bookkeeping now read:

min. : −3x2 + x4                     LP    UP    ld    ud
s.t. : x2 − 2x4 ≤ 4                  −6    ∞     −∞    −3
       x4 ≤ 3                        0     3     −∞    −5
       x2 , x4 ≥ 0
lp :    0     0
up :    10    3
LD :    −∞    −∞
UD :    −3    ∞

The bounds on both dual variables exclude zero, so both primal constraints must be active: x4 = 3 and x2 − 2x4 = 4, which gives x2 = 10.
Thus, the optimal solution is: x1 = 0, x2 = 10, x3 = 0 and x4 = 3.
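As a final sanity check, a quick solve of the original problem with scipy.optimize.linprog (a sketch; linprog minimizes, so the maximization objective is negated):

```python
from scipy.optimize import linprog

# Example 3: max 2x1 + 3x2 - x3 - x4  <=>  min -2x1 - 3x2 + x3 + x4
res = linprog(c=[-2, -3, 1, 1],
              A_ub=[[1, 1, 1, -2], [-1, -1, 1, -1], [1, 0, 0, 1]],
              b_ub=[4, 1, 3],
              bounds=[(0, None)] * 4)
print(res.x, -res.fun)   # expected: x = (0, 10, 0, 3), objective value 27
```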
Generic Solution Methods for LP
1 Graphical Search Methods
2D Graphical Method
Requirement Space Method
2 Algebraic Search Tools
Improving Directions
Feasible Directions
General Algorithms
3 Improving Search Illustrated
Boundary Point
Interior Point
Exterior Point
4 Preprocessing LPs
Basic Analysis
Advanced Analysis
5 Summary
Conclusions

We have seen ...


★ Solving LPs using graphical methods.
★ Identifying feasibility in requirement space.
★ Farkas' Lemma to detect feasibility.
★ Concept of moving along a direction.
★ Concept of gradient and function maximization.
★ Concept of improving directions.
★ Concept of feasible directions.
★ General structure of improving search algorithms in OR.
★ Illustration of boundary point methods.
★ Illustration of interior point methods.
★ Illustration of exterior point methods.
★ Basic presolving techniques.
★ Advanced presolving techniques.



References

Chapter 1.3, 1.4, 5.3: Bazaraa, M. S., Jarvis, J. J., & Sherali, H. D. (2011). Linear programming and network flows. John Wiley & Sons.
Bixby, R. E. (2002). Solving real-world linear programs: A decade and more of progress. Operations Research, 50(1), 3-15.
Brearley, A. L., Mitra, G., & Williams, H. P. (1975). Analysis of mathematical programming problems prior to applying the simplex algorithm. Mathematical Programming, 8(1), 54-83.
Andersen, E. D., & Andersen, K. D. (1995). Presolving in linear programming. Mathematical Programming, 71(2), 221-245.
Mészáros, C., & Suhl, U. H. (2003). Advanced preprocessing techniques for linear and quadratic programming. OR Spectrum, 25(4), 575-595.



Discussion

Any questions?

