
CS5613- Advanced Analysis of Algorithms MS(CS) - MAJU

Lecture 5

DYNAMIC PROGRAMMING

Dynamic Programming is a general algorithm design technique for solving problems defined by, or formulated as, recurrences with overlapping subinstances.

 Invented by the American mathematician Richard Bellman in the 1950s to solve optimization problems, and later adopted by computer science

 “Programming” here means “planning”

 Main idea:
o Set up a recurrence relating a solution to a larger instance to solutions of some
smaller instances
o Solve smaller instances once
o Record solutions in a table
o Extract solution to the initial instance from that table
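The steps above can be sketched with the Fibonacci numbers (an illustrative example, not taken from the lecture): the recurrence F(n) = F(n-1) + F(n-2) is set up, each smaller instance is solved once, results are recorded in a table, and the answer is extracted from it.

```python
def fib(n):
    """Bottom-up DP: solve smaller instances once and record them in a table."""
    if n < 2:
        return n
    table = [0] * (n + 1)   # table[i] will hold F(i)
    table[1] = 1
    for i in range(2, n + 1):
        # each entry is built from already-recorded smaller instances
        table[i] = table[i - 1] + table[i - 2]
    return table[n]         # extract the solution to the initial instance

print(fib(10))  # 55
```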

 Dynamic Programming is an algorithm design technique for optimization problems: often minimizing or maximizing some quantity.

 Dynamic Programming and memoization work together.

 Like divide and conquer, DP solves problems by combining solutions to subproblems.

 Unlike divide and conquer, the subproblems are not independent.

» Subproblems may share sub-subproblems,
» However, the solution to one subproblem does not affect the solutions to other subproblems of the same problem.

Major components of DP

 Recursion: solves the problem recursively.

 Memoization: stores the results of already-solved subproblems so they are not recomputed.

In general, DP = Recursion + Memoization
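This recipe can be sketched on the binomial-coefficient recurrence C(n, k) = C(n-1, k-1) + C(n-1, k) — a minimal illustration chosen here for brevity, not an example from the lecture:

```python
def binomial(n, k, memo=None):
    """Recursion + memoization: C(n, k) = C(n-1, k-1) + C(n-1, k)."""
    if memo is None:
        memo = {}
    if k == 0 or k == n:
        return 1
    if (n, k) not in memo:
        # store the result of each already-solved subproblem
        memo[(n, k)] = binomial(n - 1, k - 1, memo) + binomial(n - 1, k, memo)
    return memo[(n, k)]

print(binomial(10, 5))  # 252
```

Without the memo dictionary this is plain recursion; with it, each pair (n, k) is computed exactly once.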

Prepared by: Engr. M. Nadeem Page 1



Comparison with divide-and-conquer Algorithm


 Divide-and-conquer algorithms split a problem into separate subproblems, solve the
subproblems, and combine the results for a solution to the original problem
o Example: Quicksort
o Example: Mergesort
o Example: Binary search
 Divide-and-conquer algorithms can be thought of as top-down algorithms
 In contrast, a dynamic programming algorithm proceeds by solving small problems, then
combining them to find the solution to larger problems
 Dynamic programming can be thought of as bottom-up

Comparison with greedy Algorithm

The greedy technique expands a partially constructed solution until the complete problem is solved; at each step it makes "the best local choice among all feasible choices available at that step".

Optimal substructure means that an optimal solution to the problem contains optimal solutions to its subproblems. The difference between dynamic programming and greedy algorithms is that dynamic programming applies when subproblems overlap, and those overlapping subproblems are solved only once using memoization. "Memoization" is the technique of storing the solution to each subproblem the first time it is computed, so it can be reused rather than recomputed.
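The contrast can be seen on the coin-changing problem, listed later in these notes (the specific denominations below are a standard illustration, not taken from the lecture): with coins {1, 3, 4} and amount 6, the greedy "best local choice" is suboptimal, while DP finds the minimum.

```python
def greedy_coins(coins, amount):
    """Greedy: always take the largest coin that still fits."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c
        amount %= c
    return count

def dp_coins(coins, amount):
    """DP: dp[a] = fewest coins summing to a, built from smaller amounts."""
    dp = [0] + [float("inf")] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a:
                dp[a] = min(dp[a], dp[a - c] + 1)
    return dp[amount]

# Greedy picks 4 + 1 + 1 (3 coins); DP finds the optimal 3 + 3 (2 coins).
print(greedy_coins([1, 3, 4], 6))  # 3
print(dp_coins([1, 3, 4], 6))      # 2
```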


Example:


Dynamic Programming Algorithms:

1. Longest Common Subsequence Problem
2. 0/1 Knapsack Problem
3. Coin Changing Problem
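As a sketch of the first of these (an illustrative implementation, not code from the lecture), the LCS length satisfies the recurrence L[i][j] = L[i-1][j-1] + 1 when x[i-1] == y[j-1], and max(L[i-1][j], L[i][j-1]) otherwise, filled bottom-up into a table:

```python
def lcs_length(x, y):
    """Length of the longest common subsequence of x and y, bottom-up DP."""
    m, n = len(x), len(y)
    # L[i][j] = LCS length of the prefixes x[:i] and y[:j]
    L = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                L[i][j] = L[i - 1][j - 1] + 1
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    return L[m][n]

print(lcs_length("ABCBDAB", "BDCABA"))  # 4, e.g. "BCBA"
```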

Advantages of Dynamic Programming:

 A plain recursive procedure has no memory: it recomputes the same subinstances again and again
 Dynamic programming stores previously computed values to avoid these repeated calculations
