Outline
• Introduction to Dynamic Programming
  – General method with Examples
  – Multistage Graphs
• Transitive Closure:
  – Warshall's Algorithm
• All Pairs Shortest Paths:
  – Floyd's Algorithm
• Optimal Binary Search Trees
• Knapsack problem
• Bellman-Ford Algorithm
• Travelling Salesperson problem
• Reliability design
Dynamic Programming
• Dynamic programming is a technique for solving problems with overlapping subproblems.
  – Subproblems arise from a recurrence relating a given problem's solution to solutions of its smaller subproblems.
  – DP suggests solving each of the smaller subproblems only once and recording the results in a table from which a solution to the original problem can then be obtained.
• Dynamic programming can be used when the solution to a problem can be viewed as the result of a sequence of decisions.
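As a minimal illustration of overlapping subproblems (not from the slides), the Fibonacci numbers can be computed by recording each subproblem's result in a table:

```python
def fib(n, memo={}):
    """Fibonacci via DP: each subproblem is solved once and its result
    is recorded in a table (memo), as the bullets above suggest."""
    if n in memo:
        return memo[n]            # reuse a recorded result instead of recomputing
    result = n if n < 2 else fib(n - 1, memo) + fib(n - 2, memo)
    memo[n] = result
    return result

print(fib(50))   # 12586269025; naive recursion would make billions of calls
```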
Example-1: Knapsack Problem
Example-2: File merging problem
Principle of Optimality
• An optimal sequence of decisions has the property that, whatever the initial state and decision are, the remaining decisions must constitute an optimal decision sequence with regard to the state resulting from the first decision.
Multistage Graphs
DP Solution using forward approach
• cost(i, j): cost of node j at stage i to reach the terminal node t, where i is the stage number and j is the node.
cost(5,12) = 0
cost(4,9) = c(9,12) = 4
cost(4,10) = c(10,12) = 2
cost(4,11) = c(11,12) = 5
cost(4,I) = c(I,L) = 7
cost(4,J) = c(J,L) = 8
cost(4,K) = c(K,L) = 11
cost(3,D) = c(D,T) = 18
cost(3,E) = c(E,T) = 13
cost(3,F) = c(F,T) = 2
When j=11, only one candidate for r, r=12: cost(11) = c(11,12) + cost(12) = 5 + 0 = 5; d(11) = 12
When j=10, only one candidate for r, r=12: cost(10) = c(10,12) + cost(12) = 2 + 0 = 2; d(10) = 12
When j=9, only one candidate for r, r=12: cost(9) = c(9,12) + cost(12) = 4 + 0 = 4; d(9) = 12
When j=8, two candidates for r, r=10,11: cost(8) = min { c(8,10)+cost(10) = 5+2 = 7; c(8,11)+cost(11) = 6+5 = 11 } = 7; d(8) = 10
When j=7, two candidates for r, r=9,10: cost(7) = min { c(7,9)+cost(9) = 4+4 = 8; c(7,10)+cost(10) = 3+2 = 5 } = 5; d(7) = 10
When j=6, two candidates for r, r=9,10: cost(6) = min { c(6,9)+cost(9) = 6+4 = 10; c(6,10)+cost(10) = 5+2 = 7 } = 7; d(6) = 10
When j=11, cost(11) = 5; d(11) = 12
When j=10, cost(10) = 2; d(10) = 12
When j=9, cost(9) = 4; d(9) = 12
When j=8, cost(8) = 7; d(8) = 10
When j=7, cost(7) = 5; d(7) = 10
When j=6, cost(6) = 7; d(6) = 10
Time complexity: Θ(|V| + |E|)
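The forward computation can be sketched in code. The stage-3 to stage-5 edge costs below match the cost values worked out on these slides; the stage-1 and stage-2 edges are assumed from the standard 12-vertex, 5-stage textbook instance (Horowitz–Sahni) that those values come from, since they are not listed here:

```python
INF = float('inf')
# edge costs c(u, r); stages 1-2 edges are assumed from the textbook instance
c = {(1,2):9, (1,3):7, (1,4):3, (1,5):2,
     (2,6):4, (2,7):2, (2,8):1, (3,6):2, (3,7):7,
     (4,8):11, (5,7):11, (5,8):8,
     (6,9):6, (6,10):5, (7,9):4, (7,10):3, (8,10):5, (8,11):6,
     (9,12):4, (10,12):2, (11,12):5}
n = 12
cost = [INF] * (n + 1)
d = [0] * (n + 1)
cost[n] = 0                              # cost of the terminal node is 0
for j in range(n - 1, 0, -1):            # vertices in reverse topological order
    # cost(j) = min over successors r of c(j, r) + cost(r); d(j) records r
    cost[j], d[j] = min((w + cost[r], r) for (u, r), w in c.items() if u == j)

path, v = [1], 1                         # follow d() pointers from the source
while v != n:
    v = d[v]
    path.append(v)
print(cost[1], path)   # 16 [1, 2, 7, 10, 12]
```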
Backward Approach
• bcost(i, j): cost of node j at stage i from the source node s.
bcost(2,2) = 9
bcost(2,3)=7
bcost(2,4)=3
bcost(2,5)=2
Example
bcost(2,B) = c(A,B) = 7
bcost(2,C) = c(A,C) = 6
bcost(2,D) = c(A,D) = 5
bcost(2,E) = c(A,E) = 9.
bcost(3,F) = min { c(B,F) + bcost(2,B) | c(C,F) + bcost(2,C) }
bcost(3,F) = min { 4 + 7 | 10 + 6 } = 11
bcost(3,G) = min { c(B,G) + bcost(2,B) | c(C,G) + bcost(2,C) | c(E,G)
+ bcost(2,E) }
bcost(3,G) = min { 8 + 7 | 3 + 6 | 6 + 9 } = 9
bcost(3,H) = min { c(B,H) + bcost(2,B) | c(D,H) + bcost(2,D) | c(E,H)
+ bcost(2,E) }
bcost(3,H) = min { 11 + 7 | 9 + 5 | 12 + 9 } = 14
bcost(4,I) = min { c(F,I) + bcost(3,F) | c(G,I) + bcost(3,G) }
bcost(4,I) = min { 12 + 11 | 5 + 9 } = 14
bcost(4,J) = min { c(F,J) + bcost(3,F) | c(G,J) + bcost(3,G) | c(H,J)
+ bcost(3,H) }
bcost(4,J) = min { 9 + 11 | 7 + 9 | 10 + 14 } = 16
bcost(4,K) = min { c(H,K) + bcost(3,H) } = min { 8 + 14 } = 22
bcost(5,L) = min { c(I,L) + bcost(4,I) | c(J,L) + bcost(4,J) | c(K,L) + bcost(4,K) }
bcost(5,L) = min { 7 + 14 | 8 + 16 | 11 + 22 } = 21
Path: A-C-G-I-L
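The backward approach on the A–L example above can be sketched as follows, with the edge costs read off the bcost computations:

```python
# Backward-approach DP on the A..L example; edge costs taken from the slides.
edges = {('A','B'):7, ('A','C'):6, ('A','D'):5, ('A','E'):9,
         ('B','F'):4, ('C','F'):10,
         ('B','G'):8, ('C','G'):3, ('E','G'):6,
         ('B','H'):11, ('D','H'):9, ('E','H'):12,
         ('F','I'):12, ('G','I'):5,
         ('F','J'):9, ('G','J'):7, ('H','J'):10,
         ('H','K'):8,
         ('I','L'):7, ('J','L'):8, ('K','L'):11}
stages = [['A'], ['B','C','D','E'], ['F','G','H'], ['I','J','K'], ['L']]

bcost = {'A': 0}
pred = {}
for stage in stages[1:]:                 # stages in order: bcost reuses earlier stages
    for v in stage:
        # best predecessor among vertices with an edge into v
        bcost[v], pred[v] = min((bcost[u] + w, u)
                                for (u, x), w in edges.items() if x == v)

path, v = ['L'], 'L'                     # recover the path from the predecessors
while v != 'A':
    v = pred[v]
    path.append(v)
path.reverse()
print(bcost['L'], path)   # 21 ['A', 'C', 'G', 'I', 'L']
```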
Transitive Closure
• We can generate the transitive closure of a digraph with the help of depth-first search or breadth-first search.
• Since that method traverses the same digraph several times, a better algorithm, Warshall's algorithm, can be used instead.
• Warshall's algorithm constructs the transitive closure through a series of n × n boolean matrices R(0), . . . , R(k), . . . , R(n).
Warshall's Algorithm
• The element in the ith row and jth column of matrix R(k) is equal to 1 if and only if
  – there exists a directed path of positive length from the ith vertex to the jth vertex with each intermediate vertex, if any, numbered not higher than k.
• This means that there exists a path from the ith vertex vi to the jth vertex vj with each intermediate vertex numbered not higher than k:
  vi, [a list of intermediate vertices each numbered not higher than k], vj
Warshall's Algorithm
Two situations regarding this path are possible.
1. In the first, the list of its intermediate vertices does not contain the kth vertex; such a path already has all intermediate vertices numbered not higher than k − 1, so R(k)[i, j] = R(k−1)[i, j].
2. In the second, the list does contain the kth vertex; the path then splits into a part from vi to vk and a part from vk to vj, each with intermediate vertices numbered not higher than k − 1, so R(k−1)[i, k] = 1 and R(k−1)[k, j] = 1.
Hence R(k)[i, j] = R(k−1)[i, j] or (R(k−1)[i, k] and R(k−1)[k, j]).
Algorithm
Analysis
• Its time efficiency is Θ(n³).
• We can make the algorithm run faster by treating matrix rows as bit strings and employing the bitwise or operation available in most modern computer languages.
• Space efficiency:
  – Although separate matrices for recording intermediate results of the algorithm are used, that can be avoided.
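A straightforward sketch of Warshall's algorithm, using a single matrix in place (the space saving mentioned above; the bit-string speedup is omitted for clarity). The sample digraph is made up for illustration:

```python
def warshall(adj):
    """Transitive closure of a digraph given as a 0/1 adjacency matrix.

    A single matrix is updated in place for k = 1..n, so no separate
    R(k) matrices need to be stored."""
    n = len(adj)
    r = [row[:] for row in adj]          # copy so the input is not modified
    for k in range(n):
        for i in range(n):
            if r[i][k]:                  # only rows that can reach k can change
                for j in range(n):
                    if r[k][j]:
                        r[i][j] = 1
    return r

# sample digraph: 0 -> 1, 1 -> 2, 3 -> 0
closure = warshall([[0, 1, 0, 0],
                    [0, 0, 1, 0],
                    [0, 0, 0, 0],
                    [1, 0, 0, 0]])
print(closure)   # vertex 3 now reaches 0, 1 and 2; vertex 0 reaches 1 and 2
```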
All Pairs Shortest Paths
• Problem definition: Given a weighted connected graph (undirected or directed), the all-pairs shortest paths problem asks to find the distances (i.e., the lengths of the shortest paths) from each vertex to all other vertices.
• Applications:
  – Communications, transportation networks, and operations research.
  – Pre-computing distances for motion planning in computer games.
All Pairs Shortest Paths
(a) Digraph. (b) Its weight matrix. (c) Its distance matrix.
Floyd's Algorithm
Analysis
• Its time efficiency is Θ(n³),
  – the same as Warshall's algorithm.
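Floyd's algorithm has the same triple-loop shape as Warshall's, carrying distances instead of booleans. A sketch on a small 4-vertex instance (a common textbook one; ∞ marks a missing edge):

```python
INF = float('inf')

def floyd(w):
    """All-pairs shortest path distances from a weight matrix (INF = no edge)."""
    n = len(w)
    d = [row[:] for row in w]            # copy so the weight matrix survives
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # keep the better of: current path, or a path routed through k
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

dist = floyd([[0,   INF, 3,   INF],
              [2,   0,   INF, INF],
              [INF, 7,   0,   1],
              [6,   INF, INF, 0]])
print(dist)   # every INF is resolved to a finite shortest distance
```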
Example
Optimal Binary Search Tree
• A binary search tree is one of the most important data structures in computer science.
• One of its principal applications is to implement a dictionary, a set of elements with the operations of searching, insertion, and deletion.
• The binary search tree for which the average number of comparisons in a search is as small as possible is called an optimal binary search tree.
• If the probabilities of searching for the elements of a set are known, an optimal binary search tree can be constructed.
Optimal Binary Search Tree – Example
• Consider four keys A, B, C, and D to be searched for with probabilities 0.1, 0.2, 0.4, and 0.3, respectively.
• 14 different binary search trees are possible.
• Two of them are shown here.
Optimal BST
Problem definition:
• Given a sorted array a1, . . . , an of search keys and an array p1, . . . , pn of probabilities, where pi is the probability of a search for ai.
• Construct a binary search tree of all the keys such that the average number of comparisons made in a successful search is smallest.
Optimal BST – Solution using DP
• Let a1, . . . , an be distinct keys ordered from the smallest to the largest and let p1, . . . , pn be the probabilities of searching for them.
• Let C(i, j) be the smallest average number of comparisons made in a successful search in a binary search tree Tij made up of keys ai, . . . , aj, where i, j are some integer indices, 1 ≤ i ≤ j ≤ n.
• We are interested just in C(1, n).
Optimal BST – Solution using DP
• To derive a recurrence, we consider all possible ways to choose a root ak among the keys ai, . . . , aj.
• Choosing root ak leaves the keys ai, . . . , ak−1 in the left subtree and ak+1, . . . , aj in the right subtree, which gives the recurrence
  C(i, j) = min over i ≤ k ≤ j of { C(i, k−1) + C(k+1, j) } + Σ ps for s = i to j,
  with C(i, i−1) = 0 and C(i, i) = pi.
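A sketch of the table computation for this recurrence, run on the four-key example from the earlier slide (A, B, C, D with probabilities 0.1, 0.2, 0.4, 0.3):

```python
def optimal_bst(p):
    """Fill C[i][j] (smallest average comparisons for keys i..j, 1-based)
    and R[i][j] (index of the optimal root) diagonal by diagonal."""
    n = len(p)
    p = [0.0] + p                         # shift to 1-based indexing
    C = [[0.0] * (n + 2) for _ in range(n + 2)]   # C[i][i-1] = 0 by default
    R = [[0] * (n + 2) for _ in range(n + 2)]
    for i in range(1, n + 1):
        C[i][i] = p[i]
        R[i][i] = i
    for length in range(2, n + 1):        # subproblems in order of size
        for i in range(1, n - length + 2):
            j = i + length - 1
            # try every key k in i..j as the root of the subtree for keys i..j
            best, root = min((C[i][k - 1] + C[k + 1][j], k)
                             for k in range(i, j + 1))
            C[i][j] = best + sum(p[i:j + 1])
            R[i][j] = root
    return C[1][n], R[1][n]

avg, root = optimal_bst([0.1, 0.2, 0.4, 0.3])   # keys A, B, C, D
print(avg, root)   # average cost 1.7, with key 3 (C) as the root
```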
Example: the C(i, j) and root tables are filled in, diagonal by diagonal, for the four keys indexed 1 to 4.
Analysis
• The algorithm requires O(n²) time and O(n²) storage.
• Therefore, as n increases, it will run out of storage even before it runs out of time.
• The storage needed can be reduced by almost half by implementing the two-dimensional arrays as one-dimensional arrays.
Home Work
• Find the optimal binary search tree for the keys A, B, C, D, E with search probabilities 0.1, 0.1, 0.2, 0.2, 0.4 respectively.
Knapsack problem using DP
• Given n items of known weights w1, . . . , wn and values v1, . . . , vn and a knapsack of capacity W, find the most valuable subset of the items that fits into the knapsack.
• To design a DP algorithm, we need to derive a recurrence relation that expresses a solution in terms of its smaller subinstances.
• Let us consider an instance defined by the first i items, 1 ≤ i ≤ n, with weights w1, . . . , wi, values v1, . . . , vi, and knapsack capacity j, 1 ≤ j ≤ W.
Knapsack problem using DP
• Let F(i, j) be the value of an optimal solution to this instance.
• We can divide all the subsets of the first i items that fit into the knapsack of capacity j into two categories:
  – those that do not include the ith item,
  – and those that do.
• Among the subsets that do not include the ith item,
  – the value of an optimal subset is, by definition, F(i, j) = F(i − 1, j).
Knapsack problem using DP
• Among the subsets that do include the ith item (hence, j − wi ≥ 0),
  – an optimal subset is made up of this item and
  – an optimal subset of the first i − 1 items that fits into the knapsack of capacity j − wi.
  – The value of such an optimal subset is F(i, j) = vi + F(i − 1, j − wi).
• Thus, the optimal solution among all feasible subsets of the first i items is the maximum of these two values.
Knapsack problem using DP
• Thus we obtain the recurrence
  F(i, j) = max { F(i − 1, j), vi + F(i − 1, j − wi) } if j − wi ≥ 0,
  F(i, j) = F(i − 1, j) if j − wi < 0.
• It is convenient to define the initial conditions as follows: F(0, j) = 0 for j ≥ 0 and F(i, 0) = 0 for i ≥ 0.
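The recurrence and initial conditions translate directly into a bottom-up table. The instance below is a common textbook one, not necessarily the slides' Example-1:

```python
def knapsack(weights, values, W):
    """Bottom-up table for F(i, j); returns F(n, W)."""
    n = len(weights)
    F = [[0] * (W + 1) for _ in range(n + 1)]   # F(0, j) = F(i, 0) = 0
    for i in range(1, n + 1):
        wi, vi = weights[i - 1], values[i - 1]
        for j in range(1, W + 1):
            F[i][j] = F[i - 1][j]               # i-th item left out
            if j >= wi:                         # i-th item taken, if it fits
                F[i][j] = max(F[i][j], vi + F[i - 1][j - wi])
    return F[n][W]

best = knapsack([2, 1, 3, 2], [12, 10, 20, 15], 5)
print(best)   # 37, achieved by taking items 1, 2 and 4
```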
Algorithm
Example-1
Algorithm MFKnapsack(i, j)
• Implements the memory function method for the knapsack problem.
• Input: A nonnegative integer i indicating the number of the first items being considered and a nonnegative integer j indicating the knapsack capacity.
• Output: The value of an optimal feasible subset of the first i items.
• Note: Uses as global variables input arrays Weights[1..n], Values[1..n], and table F[0..n, 0..W] whose entries are initialized with −1's except for row 0 and column 0, which are initialized with 0's.
Algorithm MFKnapsack(i, j)
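A Python rendering of MFKnapsack under the conventions stated above (global Weights, Values, and a table F initialized to −1 except for row 0 and column 0):

```python
def mf_knapsack(i, j):
    """Memory-function version: computes F(i, j) top-down, filling in
    only the table entries that are actually needed."""
    if F[i][j] < 0:                              # not computed yet
        if j < Weights[i - 1]:                   # item i cannot fit
            value = mf_knapsack(i - 1, j)
        else:
            value = max(mf_knapsack(i - 1, j),
                        Values[i - 1] + mf_knapsack(i - 1, j - Weights[i - 1]))
        F[i][j] = value
    return F[i][j]

Weights, Values, W = [2, 1, 3, 2], [12, 10, 20, 15], 5
n = len(Weights)
# row 0 and column 0 hold 0; every other entry starts at -1 ("unknown")
F = [[0 if i == 0 or j == 0 else -1 for j in range(W + 1)]
     for i in range(n + 1)]
best = mf_knapsack(n, W)
print(best)   # 37, same as the bottom-up version
```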
Example
Single Source Shortest Paths with Negative Weights
Problem definition:
• Single source shortest paths: given a graph and a source vertex s in the graph, find the shortest paths from s to all vertices in the given graph.
• The graph may contain negative weight edges.
Bellman-Ford Algorithm
• Like other dynamic programming problems, the algorithm calculates the shortest paths in a bottom-up manner.
• It first calculates the shortest distances
  – for the shortest paths which have at most one edge in the path.
  – Then, it calculates the shortest paths with at most 2 edges, and so on.
k = 1:
dist¹(1) = 0
dist¹(2) = 6
dist¹(3) = 5
dist¹(4) = 5
dist¹(5) = ∞
dist¹(6) = ∞
dist¹(7) = ∞
For every vertex (except the source), look at the incoming edges (except those from the source):
k = 2:
dist²(1) = 0
dist²(2) = min { dist¹(2); dist¹(3) + cost(3,2) } = min { 6; 5 − 2 } = 3
dist²(3) = min { dist¹(3); dist¹(4) + cost(4,3) } = min { 5; 5 − 2 } = 3
dist²(4) = 5
dist²(5) = min { dist¹(5); dist¹(2) + cost(2,5); dist¹(3) + cost(3,5) } = min { ∞; 6 − 1; 5 + 1 } = 5
dist²(6) = min { dist¹(6); dist¹(4) + cost(4,6) } = min { ∞; 5 − 1 } = 4
dist²(7) = min { dist¹(7); dist¹(5) + cost(5,7); dist¹(6) + cost(6,7) } = min { ∞; ∞ + 3; ∞ + 3 } = ∞
For every vertex (except the source), look at the incoming edges (except those from the source):
k = 3:
dist³(1) = 0
dist³(2) = min { dist²(2); dist²(3) + cost(3,2) } = min { 3; 3 − 2 } = 1
dist³(3) = min { dist²(3); dist²(4) + cost(4,3) } = min { 3; 5 − 2 } = 3
dist³(4) = 5
dist³(5) = min { dist²(5); dist²(2) + cost(2,5); dist²(3) + cost(3,5) } = min { 5; 3 − 1; 3 + 1 } = 2
dist³(6) = min { dist²(6); dist²(4) + cost(4,6) } = min { 4; 5 − 1 } = 4
dist³(7) = min { dist²(7); dist²(5) + cost(5,7); dist²(6) + cost(6,7) } = min { ∞; 5 + 3; 4 + 3 } = 7
For every vertex (except the source), look at the incoming edges (except those from the source):
k = 4:
dist⁴(1) = 0
dist⁴(2) = min { dist³(2); dist³(3) + cost(3,2) } = min { 1; 3 − 2 } = 1
dist⁴(3) = min { dist³(3); dist³(4) + cost(4,3) } = min { 3; 5 − 2 } = 3
dist⁴(4) = 5
dist⁴(5) = min { dist³(5); dist³(2) + cost(2,5); dist³(3) + cost(3,5) } = min { 2; 1 − 1; 3 + 1 } = 0
dist⁴(6) = min { dist³(6); dist³(4) + cost(4,6) } = min { 4; 5 − 1 } = 4
dist⁴(7) = min { dist³(7); dist³(5) + cost(5,7); dist³(6) + cost(6,7) } = min { 7; 2 + 3; 4 + 3 } = 5
For every vertex (except the source), look at the incoming edges (except those from the source):
k = 5:
dist⁵(1) = 0
dist⁵(2) = min { dist⁴(2); dist⁴(3) + cost(3,2) } = min { 1; 3 − 2 } = 1
dist⁵(3) = min { dist⁴(3); dist⁴(4) + cost(4,3) } = min { 3; 5 − 2 } = 3
dist⁵(4) = 5
dist⁵(5) = min { dist⁴(5); dist⁴(2) + cost(2,5); dist⁴(3) + cost(3,5) } = min { 0; 1 − 1; 3 + 1 } = 0
dist⁵(6) = min { dist⁴(6); dist⁴(4) + cost(4,6) } = min { 4; 5 − 1 } = 4
dist⁵(7) = min { dist⁴(7); dist⁴(5) + cost(5,7); dist⁴(6) + cost(6,7) } = min { 5; 0 + 3; 4 + 3 } = 3
For every vertex (except the source), look at the incoming edges (except those from the source):
k = 6:
dist⁶(1) = 0
dist⁶(2) = min { dist⁵(2); dist⁵(3) + cost(3,2) } = min { 1; 3 − 2 } = 1
dist⁶(3) = min { dist⁵(3); dist⁵(4) + cost(4,3) } = min { 3; 5 − 2 } = 3
dist⁶(4) = 5
dist⁶(5) = min { dist⁵(5); dist⁵(2) + cost(2,5); dist⁵(3) + cost(3,5) } = min { 0; 1 − 1; 3 + 1 } = 0
dist⁶(6) = min { dist⁵(6); dist⁵(4) + cost(4,6) } = min { 4; 5 − 1 } = 4
dist⁶(7) = min { dist⁵(7); dist⁵(5) + cost(5,7); dist⁵(6) + cost(6,7) } = min { 3; 0 + 3; 4 + 3 } = 3
Algorithm
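The per-round relaxation can be sketched as below. The edge list is the 7-vertex example worked through above, and the final distances match the converged dist⁶ values:

```python
INF = float('inf')

def bellman_ford(n, edges, src):
    """After round k, dist[v] holds the shortest distance from src to v
    using at most k edges (the dist^k values worked out above)."""
    dist = [INF] * (n + 1)               # 1-based vertices; index 0 unused
    dist[src] = 0
    for _ in range(n - 1):               # rounds k = 1 .. n-1
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # one more pass: any further improvement means a negative cycle
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            raise ValueError("graph contains a negative-weight cycle")
    return dist

edges = [(1, 2, 6), (1, 3, 5), (1, 4, 5), (3, 2, -2), (4, 3, -2),
         (2, 5, -1), (3, 5, 1), (4, 6, -1), (5, 7, 3), (6, 7, 3)]
dist = bellman_ford(7, edges, 1)
print(dist[1:])   # [0, 1, 3, 5, 0, 4, 3], the converged dist values
```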
Analysis
• Bellman-Ford runs in O(|V| · |E|) time.
Example-2: Home work
Find the shortest paths from S to all other vertices using the Bellman-Ford algorithm.
Example-2
Travelling Salesman Problem
Problem Statement:
• The travelling salesman problem consists of a salesman and a set of cities.
• The salesman has to visit each one of the cities exactly once, starting from a certain one (e.g. the hometown) and returning to the same city.
• The challenge of the problem is that the travelling salesman wants to minimize the total length of the trip.
• It is assumed that all cities are connected to all other cities.
Travelling Salesman Problem
Naive Solution:
1. Consider city 1 as the starting and ending point.
2. Generate all (n−1)! permutations of the cities.
3. Calculate the cost of every permutation and keep track of the minimum cost permutation.
4. Return the permutation with minimum cost.
Time Complexity: O(n!)
TSP using DP
• Assume the tour to be a simple path that starts and ends at 1.
• Every tour consists of
  – an edge <1, k> for some k ∈ V − {1} and
  – a path from vertex k to vertex 1.
• The path from k to 1 goes through each vertex in V − {1, k} exactly once.
• If the tour is optimal, then the path from k to 1 must be a shortest path going through all vertices in V − {1, k}.
• Let g(i, S) be the length of a shortest path starting at vertex i, going through all vertices in S, and terminating at vertex 1.
• g(1, V − {1}) is the length of an optimal salesperson tour.
TSP using DP
• From the principle of optimality:
  g(1, V − {1}) = min over 2 ≤ k ≤ n of { c1k + g(k, V − {1, k}) }
• Generalizing:
  g(i, S) = min over j ∈ S of { cij + g(j, S − {j}) }, with g(i, ∅) = ci1
• Example:
  g(1, {2,3,4,5}) = min {
    k = 2: c12 + g(2, {3,4,5});
    k = 3: c13 + g(3, {2,4,5});
    k = 4: c14 + g(4, {2,3,5});
    k = 5: c15 + g(5, {2,3,4}) }
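The recurrence g(i, S) can be implemented directly with a dictionary keyed on (vertex, subset). The 4-city cost matrix below is a classic textbook instance, not necessarily the slides' Example-1:

```python
from itertools import combinations

def tsp(c):
    """g(i, S) = min over j in S of c[i][j] + g(j, S - {j}),
    with g(i, {}) = c[i][0]. Vertices are 0..n-1; tours start and end at 0."""
    n = len(c)
    g = {}
    for i in range(1, n):
        g[(i, frozenset())] = c[i][0]            # base case: go straight home
    for size in range(1, n - 1):                 # subsets in order of size
        for S in combinations(range(1, n), size):
            S = frozenset(S)
            for i in range(1, n):
                if i in S:
                    continue
                g[(i, S)] = min(c[i][j] + g[(j, S - {j})] for j in S)
    full = frozenset(range(1, n))
    return min(c[0][k] + g[(k, full - {k})] for k in full)

tour_len = tsp([[0, 10, 15, 20],
                [5,  0,  9, 10],
                [6, 13,  0, 12],
                [8,  8,  9,  0]])
print(tour_len)   # 35, the optimal tour length for this instance
```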
Example-1
Analysis
• The DP solution runs in Θ(n²2ⁿ) time and uses Θ(n2ⁿ) space, a considerable improvement over the O(n!) brute force.
Reliability Design
• If ri is the reliability of the device at stage i, the reliability of the whole system (stages connected in series) is the product r1 · r2 · . . . · rn.
Reliability Design

No. of devices in parallel | Device A (reliability, cost) | Device B (reliability, cost) | Device C (reliability, cost)
1 | 0.9, 30 (m1 = 1) | 0.8, 15 (m2 = 1) | 0.5, 20 (m3 = 1)
2 | 0.99, 60 (m1 = 2) | 0.96, 30 (m2 = 2) | 0.75, 40 (m3 = 2)
3 | – | 0.992, 45 (m2 = 3) | 0.875, 60 (m3 = 3)

• Sⁱ: the set of all tuples of the form (f, x) generated from the various sequences m1, m2, . . . , mi considering i stages, where f is the reliability and x is the cost.
• Sⁱⱼ: i is the stage number, j is the number of devices used at stage i.
• Example:
  – S¹₁ means 1 device of type A at stage 1.
  – S¹₂ means 2 devices of type A at stage 1.
S¹₁ = { (0.9, 30) }    S¹₂ = { (0.99, 60) }
Combining: S¹ = { (0.9, 30), (0.99, 60) }

S²₁ = { (0.72, 45), (0.792, 75) }    (0.9 × 0.8 and 0.99 × 0.8)
S²₂ = { (0.864, 60) }    (0.9 × 0.96; for (0.9504, 90) = 0.99 × 0.96, refer Note #1)
S²₃ = { (0.8928, 75) }    (0.9 × 0.992; for (0.982, 105) = 0.99 × 0.992, refer Note #1)

Note #1: This configuration will not leave adequate funds to complete the system.
Note #2: For the same cost, retain the tuple with the higher reliability.

Combining: S² = { (0.72, 45), (0.792, 75), (0.864, 60), (0.8928, 75) }
After purging by Note #2: S² = { (0.72, 45), (0.864, 60), (0.8928, 75) }
  with (m1 = 1, m2 = 1), (m1 = 1, m2 = 2), (m1 = 1, m2 = 3) respectively.

S³₁, S³₂, and S³₃ are obtained in the same way by combining S² with 1, 2, or 3 copies of device C, and S³ is their purged union.

Best reliability: 0.648 with cost = 100, where A has 1 unit, B has 2 units, and C has 2 units in parallel.
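Because each stage offers at most three copy-counts, the instance above can also be checked by brute force. The total-cost budget of 105 is an assumption inferred from the pruning notes (it reproduces them); the slides do not state it explicitly:

```python
from itertools import product

# stage tables from the slides: {copies m: (reliability, cost)}
A = {1: (0.9, 30), 2: (0.99, 60)}
B = {1: (0.8, 15), 2: (0.96, 30), 3: (0.992, 45)}
C = {1: (0.5, 20), 2: (0.75, 40), 3: (0.875, 60)}
BUDGET = 105   # ASSUMED cost ceiling; chosen to reproduce the slide's pruning

best = (0.0, 0, None)
for m1, m2, m3 in product(A, B, C):      # iterate over all copy-count choices
    rel = A[m1][0] * B[m2][0] * C[m3][0]  # series system: reliabilities multiply
    cost = A[m1][1] + B[m2][1] + C[m3][1]
    if cost <= BUDGET and rel > best[0]:
        best = (rel, cost, (m1, m2, m3))

rel, total, counts = best
print(counts, total)   # (1, 2, 2) 100 -- one A, two B, two C, as derived above
```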
Assignment-4
1. Write the multistage graph algorithm using the backward approach.
2. Define the transitive closure of a directed graph. Find the transitive closure matrix for the graph whose adjacency matrix is given.
Assignment-4
4. For the given graph, obtain the optimal cost tour using dynamic programming.