
Algorithm Design and Analysis

LECTURE 13
Dynamic Programming
• Fibonacci
• Weighted Interval
Scheduling

Adam Smith
9/10/10
A. Smith; based on slides by E. Demaine, C. Leiserson, S. Raskhodnikova, K. Wayne
Design Techniques So Far
• Greedy. Build up a solution incrementally,
myopically optimizing some local criterion.
• Recursion / divide & conquer. Break up a
problem into subproblems, solve subproblems,
and combine solutions.
• Dynamic programming. Break problem into
overlapping subproblems, and build up solutions
to larger and larger sub-problems.

Dynamic Programming History

Bellman. [1950s] Pioneered the systematic study of dynamic programming.

Etymology.
 Dynamic programming = planning over time.
 Secretary of Defense was hostile to mathematical research.
 Bellman sought an impressive name to avoid confrontation.

"it's impossible to use dynamic in a pejorative sense"


"something not even a Congressman could object to"

Reference: Bellman, R. E. Eye of the Hurricane, An Autobiography.


Dynamic Programming Applications

Areas.
 Bioinformatics.
 Control theory.
 Information theory.
 Operations research.
 Computer science: theory, graphics, AI, compilers, systems, ….

Some famous dynamic programming algorithms.


Unix diff for comparing two files.
Viterbi for hidden Markov models / decoding convolutional codes.
Smith-Waterman for genetic sequence alignment.
Bellman-Ford for shortest path routing in networks.
Cocke-Kasami-Younger for parsing context free grammars.
Fibonacci Sequence

Sequence defined by
• a1 = 1
• a2 = 1
• an = an-1 + an-2

1, 1, 2, 3, 5, 8, 13, 21, 34, …

How should you compute the Fibonacci sequence?

Recursive algorithm: Fib(n)


1. If n = 1 or n = 2, then
2.    return 1
3. Else
4.    a = Fib(n-1)
5.    b = Fib(n-2)
6.    return a + b

Running Time?
Computing Fibonacci Sequence Faster

Observation: Lots of redundancy! The recursive algorithm only solves n−1 different sub-problems.

“Memoization”: Store the values returned by recursive calls in a table.

Resulting Algorithm:
Fib(n)
1. If n = 1 or n = 2, then
2.    return 1
3. Else
4.    f[1] = 1; f[2] = 1
5.    For i = 3 to n
6.       f[i] = f[i-1] + f[i-2]
7.    return f[n]

Linear time, if integer operations take constant time
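The bottom-up loop above can be sketched in Python (function name and list layout are my own):

```python
def fib(n):
    """n-th Fibonacci number via the bottom-up table fill: O(n) additions."""
    if n == 1 or n == 2:
        return 1
    f = [0] * (n + 1)          # f[i] will hold the i-th Fibonacci number
    f[1] = f[2] = 1
    for i in range(3, n + 1):
        f[i] = f[i - 1] + f[i - 2]
    return f[n]
```

Each value f[i] is computed exactly once, so the redundancy of the naive recursion disappears.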


Computing Fibonacci Sequence Faster

Observation: The Fibonacci recurrence is linear: with A = [[1, 1], [1, 0]],

   (an, an-1)ᵀ = A · (an-1, an-2)ᵀ, so (an, an-1)ᵀ = Aⁿ⁻² · (a2, a1)ᵀ.

Can compute Aⁿ using only O(log n) matrix multiplications (repeated squaring); each one takes O(1) integer multiplications and additions.

Total running time? O(log n) integer operations. Exponential improvement!

Exercise: how big an improvement if we count bit operations?


Multiplying k-bit numbers takes O(k log k) time.
How many bits needed to write down an?
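A Python sketch of the O(log n) approach, using the matrix A = [[1, 1], [1, 0]] and repeated squaring (function names are my own):

```python
def mat_mult(A, B):
    """Multiply two 2x2 integer matrices."""
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def mat_pow(M, k):
    """M^k by repeated squaring: O(log k) 2x2 multiplications."""
    R = [[1, 0], [0, 1]]       # identity matrix
    while k:
        if k & 1:
            R = mat_mult(R, M)
        M = mat_mult(M, M)
        k >>= 1
    return R

def fib_fast(n):
    """a_n is the top-left entry of [[1,1],[1,0]]^(n-1)."""
    if n <= 2:
        return 1
    return mat_pow([[1, 1], [1, 0]], n - 1)[0][0]
```

Note this counts integer operations; the bit-operation exercise above still applies, since the entries of Aⁿ have Θ(n) bits.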
Weighted Interval Scheduling

Weighted interval scheduling problem.


Job j starts at sj, finishes at fj, and has weight or value vj .
Two jobs compatible if they don't overlap.
Goal: find maximum weight subset of mutually compatible jobs.

[Figure: jobs a–h on a timeline from time 0 to 10.]
Unweighted Interval Scheduling Review

Recall. Greedy algorithm works if all weights are 1.


Consider jobs in ascending order of finish time.
Add job to subset if it is compatible with previously chosen jobs.

Observation. Greedy algorithm can fail spectacularly with weights.

[Figure: job b (weight 999) overlaps job a (weight 1) on a timeline from 0 to 11; greedy by finish time picks a and blocks b.]
Weighted Interval Scheduling

Notation. Label jobs by finishing time: f1 ≤ f2 ≤ . . . ≤ fn.

Def. p(j) = largest index i < j such that job i is compatible with j
          = largest i such that fi ≤ sj.

Ex: p(8) = 5, p(7) = 3, p(2) = 0.

[Figure: jobs 1–8 on a timeline from 0 to 11.]
Dynamic Programming: Binary Choice

Notation. OPT(j) = value of optimal solution to the problem consisting


of job requests 1, 2, ..., j.

 Case 1: OPT selects job j.


– collect profit vj
– can't use incompatible jobs { p(j) + 1, p(j) + 2, ..., j - 1 }
– must include optimal solution to problem consisting of remaining
compatible jobs 1, 2, ..., p(j)
optimal substructure

 Case 2: OPT does not select job j.


– must include optimal solution to problem consisting of remaining
compatible jobs 1, 2, ..., j-1

OPT(j) = { 0                                    if j = 0
         { max{ vj + OPT(p(j)), OPT(j−1) }      otherwise
Weighted Interval Scheduling: Brute Force

Brute force algorithm.

Input: n, s1,…,sn , f1,…,fn , v1,…,vn

Sort jobs by finish times so that f1 ≤ f2 ≤ ... ≤ fn.

Compute p(1), p(2), …, p(n)

Compute-Opt(j) {
if (j = 0)
return 0
else
return max(vj + Compute-Opt(p(j)), Compute-Opt(j-1))
}

Worst case:
T(n) = T(n-1) + T(n-2) + O(1)  ⇒  exponential, like the naive Fibonacci recursion.
Weighted Interval Scheduling: Memoization

Memoization. Store results of each sub-problem in a table;


lookup as needed.

Input: n, s1,…,sn , f1,…,fn , v1,…,vn

Sort jobs by finish times so that f1 ≤ f2 ≤ ... ≤ fn.


Compute p(1), p(2), …, p(n)

M[0] = 0
for j = 1 to n
   M[j] = empty        (M is a global array)

M-Compute-Opt(j) {
if (M[j] is empty)
M[j] = max(vj + M-Compute-Opt(p(j)), M-Compute-Opt(j-1))
return M[j]
}
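The whole pipeline (sort, compute p() by binary search, memoized recursion) can be sketched in Python; the function names and the 0-indexed (start, finish, value) tuples are my own conventions:

```python
import bisect

def max_weight(jobs):
    """jobs: list of (start, finish, value). Returns the optimal value OPT(n)."""
    jobs = sorted(jobs, key=lambda j: j[1])           # sort by finish time
    finishes = [f for _, f, _ in jobs]
    # p[j] = number of jobs with finish time <= start of the (j+1)-st job,
    # i.e. the slides' p(j+1) under 1-indexing
    p = [bisect.bisect_right(finishes, s) for s, _, _ in jobs]
    n = len(jobs)
    M = [None] * (n + 1)
    M[0] = 0
    def opt(j):                                       # M-Compute-Opt(j)
        if M[j] is None:
            M[j] = max(jobs[j - 1][2] + opt(p[j - 1]), opt(j - 1))
        return M[j]
    return opt(n)
```

On the two-job example above, max_weight([(0, 3, 1), (1, 10, 999)]) correctly prefers the weight-999 job.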
Weighted Interval Scheduling: Running Time

Claim. Memoized version of algorithm takes O(n log n) time.


 Sort by finish time: O(n log n).
 Computing p() : O(n log n) via repeated binary search.

 M-Compute-Opt(j): each invocation takes O(1) time and either


– (i) returns an existing value M[j]
– (ii) fills in one new entry M[j] and makes two recursive calls

 Case (ii) occurs at most n times ⇒ at most 2n recursive calls overall

 Overall running time of M-Compute-Opt(n) is O(n). ▪

Remark. O(n) if jobs are pre-sorted by start and finish times.


Equivalent algorithm: Bottom-Up

Bottom-up dynamic programming. Unwind recursion.

Input: n, s1,…,sn , f1,…,fn , v1,…,vn

Sort jobs by finish times so that f1 ≤ f2 ≤ ... ≤ fn.

Compute p(1), p(2), …, p(n) how fast?

Iterative-Compute-Opt {
M[0] = 0
for j = 1 to n
M[j] = max(vj + M[p(j)], M[j-1])
}

Total time = (sorting) + O(n) = O(n log n)


Weighted Interval Scheduling: Finding a Solution

Q. The dynamic programming algorithm computes the optimal value.
What if we want the solution itself?
A. Do some post-processing.

Run M-Compute-Opt(n)
Run Find-Solution(n)

Find-Solution(j) {
if (j = 0)
output nothing
else if (vj + M[p(j)] > M[j-1])
print j
Find-Solution(p(j))
else
Find-Solution(j-1)
}

 # of recursive calls ≤ n ⇒ O(n).
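The bottom-up fill and the Find-Solution traceback combine into one Python sketch (names and tuple layout are my own):

```python
import bisect

def schedule(jobs):
    """jobs: list of (start, finish, value). Returns (OPT value, chosen jobs)."""
    jobs = sorted(jobs, key=lambda j: j[1])           # sort by finish time
    finishes = [f for _, f, _ in jobs]
    p = [bisect.bisect_right(finishes, s) for s, _, _ in jobs]
    n = len(jobs)
    M = [0] * (n + 1)
    for j in range(1, n + 1):                         # Iterative-Compute-Opt
        M[j] = max(jobs[j - 1][2] + M[p[j - 1]], M[j - 1])
    chosen, j = [], n                                 # Find-Solution(n)
    while j > 0:
        if jobs[j - 1][2] + M[p[j - 1]] > M[j - 1]:   # taking job j beats skipping it
            chosen.append(jobs[j - 1])
            j = p[j - 1]
        else:
            j -= 1
    return M[n], chosen[::-1]
```

The traceback re-applies the binary choice: job j belongs to some optimum exactly when vj + M[p(j)] > M[j-1].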


Knapsack Problem

• Given n objects and a "knapsack."


- Item i weighs wi > 0 kilograms and has value vi > 0.
- Knapsack has capacity of W kilograms.
- Goal: fill knapsack so as to maximize total value.
• Ex: { 3, 4 } has value 40.  (W = 11)

• Many “packing” problems fit this model
  – Assigning production jobs to factories
  – Deciding which jobs to do on a single processor with bounded time
  – Deciding which problems to do on an exam

  #   value   weight
  1     1       1
  2     6       2
  3    18       5
  4    22       6
  5    28       7
10/8/2008
A. Smith; based on slides by E. Demaine, C. Leiserson, S. Raskhodnikova, K. Wayne
Knapsack Problem

• Given n objects and a "knapsack."


- Item i weighs wi > 0 kilograms and has value vi > 0.
- Knapsack has capacity of W kilograms.
- Goal: fill knapsack so as to maximize total value.
• Ex: { 3, 4 } has value 40.  (W = 11)

• Greedy: repeatedly add item with maximum ratio vi / wi.

• Example: { 5, 2, 1 } achieves only value = 35 ⇒ greedy not optimal.

  #   value   weight
  1     1       1
  2     6       2
  3    18       5
  4    22       6
  5    28       7
Dynamic programming: attempt 1

•Definition: OPT(i) =
maximum profit subset of items 1, …, i.
– Case 1: OPT does not select item i.
• OPT selects best of { 1, 2, …, i-1 }
– Case 2: OPT selects item i.
• without knowing what other items were selected before i,
we don't even know if we have enough room for i
•Conclusion. Need more sub-problems!

Adding a new variable

•Definition: OPT(i, w) = max profit subset of


items 1, …, i with weight limit w.
– Case 1: OPT does not select item i.
• OPT selects best of { 1, 2, …, i-1 } with weight limit w
– Case 2: OPT selects item i.
• new weight limit = w – wi
• OPT selects best of { 1, 2, …, i–1 } with new weight limit
OPT(i, w) = { 0                                                if i = 0
            { OPT(i−1, w)                                      if wi > w
            { max{ OPT(i−1, w), vi + OPT(i−1, w − wi) }        otherwise

Bottom-up algorithm
•Fill up an n-by-W array.

Input: n, W, w1,…,wn, v1,…,vn

for w = 0 to W
M[0, w] = 0

for i = 1 to n
for w = 1 to W
if (wi > w)
M[i, w] = M[i-1, w]
else
M[i, w] = max {M[i-1, w], vi + M[i-1, w-wi ]}

return M[n, W]
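The same table fill in Python (a sketch; 0-indexed lists and the function name are my own), which can be checked against the 5-item example from the previous slides:

```python
def knapsack(values, weights, W):
    """M[i][w] = max value achievable using items 1..i with capacity w."""
    n = len(values)
    M = [[0] * (W + 1) for _ in range(n + 1)]        # row i = 0 stays 0 (no items)
    for i in range(1, n + 1):
        for w in range(1, W + 1):
            if weights[i - 1] > w:                   # item i does not fit
                M[i][w] = M[i - 1][w]
            else:                                    # skip item i, or take it
                M[i][w] = max(M[i - 1][w],
                              values[i - 1] + M[i - 1][w - weights[i - 1]])
    return M[n][W]
```

With the slide's items, knapsack([1, 6, 18, 22, 28], [1, 2, 5, 6, 7], 11) returns 40, matching OPT = { 3, 4 }.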

Knapsack table

                   w:  0   1   2   3   4   5   6   7   8   9  10  11
  ∅                    0   0   0   0   0   0   0   0   0   0   0   0
  { 1 }                0   1   1   1   1   1   1   1   1   1   1   1
  { 1, 2 }             0   1   6   7   7   7   7   7   7   7   7   7
  { 1, 2, 3 }          0   1   6   7   7  18  19  24  25  25  25  25
  { 1, 2, 3, 4 }       0   1   6   7   7  18  22  24  28  29  29  40
  { 1, 2, 3, 4, 5 }    0   1   6   7   7  18  22  28  29  34  34  40

W = 11

  Item   Value   Weight
    1      1       1
    2      6       2
    3     18       5
    4     22       6
    5     28       7

OPT: { 4, 3 }
value = 22 + 18 = 40
How do we turn this into a proof of correctness?
• Dynamic programming (and divide and conquer)
lends itself directly to induction.
– Base cases: OPT(i,0)=0, OPT(0,w)=0 (no items!).
– Inductive step: explaining the correctness of recursive
formula
• If the following values are correctly computed:
– OPT(0,w-1),OPT(1,w-1),…,OPT(n,w-1)
– OPT(0,w),…,OPT(i-1,w)
• Then the recursive formula computes OPT(i,w) correctly
– Case 1: …, Case 2: … (from previous slide).

Time and Space Complexity

• Given n objects and a "knapsack."


- Item i weighs wi > 0 kilograms and has value vi > 0.
- Knapsack has capacity of W kilograms.
- Goal: fill knapsack so as to maximize total value.

W = 11

What is the input size?
• In words?
• In bits?

  #   value   weight
  1     1       1
  2     6       2
  3    18       5
  4    22       6
  5    28       7
Time and space complexity
•Time and space: Θ(n·W).
– Not polynomial in input size!
– Space can be made O(W) (exercise!)
– "Pseudo-polynomial."
– Decision version of Knapsack is NP-complete.
[KT, chapter 8]
•Knapsack approximation algorithm. There is a
poly-time algorithm that produces a solution with
value within 0.01% of optimum. [KT, section 11.8]

