Contents of Chapter 4

• Chapter 4 The Greedy Method
– 4.1 The general method
– 4.2 Knapsack problem
– 4.3 Tree vertex splitting
– 4.4 Job sequencing with deadlines
– 4.5 Minimum-cost spanning trees
– 4.6 Optimal storage on tapes
– 4.7 Optimal merge patterns
– 4.8 Single-source shortest paths
– 4.9 References and readings
– 4.10 Additional exercises

4.1 General Method
• Greedy method control abstraction for the subset paradigm (Program 4.1)

    SolType Greedy(Type a[], int n)
    // a[1:n] contains the n inputs.
    {
        SolType solution = EMPTY; // Initialize the solution.
        for (int i = 1; i <= n; i++) {
            Type x = Select(a);
            if (Feasible(solution, x))
                solution = Union(solution, x);
        }
        return solution;
    }

– Terminologies: feasible solution, objective function, optimal solution
– Subset paradigm vs. ordering paradigm
• Subset paradigm: selection of an optimal subset (Sec. 4.2 – 4.5)
• Ordering paradigm: finding an optimal ordering (Sec. 4.6 – 4.8)

4.2 Knapsack Problem
• Problem definition
– Given n objects and a knapsack, where object i has a weight wi and the knapsack has a capacity m
– If a fraction xi of object i is placed into the knapsack, a profit pi xi is earned
– The objective is to obtain a filling of the knapsack that maximizes the total profit

• Problem formulation (Formulas 4.1 – 4.3)

    maximize    ∑1≤i≤n pi xi                (4.1)

    subject to  ∑1≤i≤n wi xi ≤ m            (4.2)

    and         0 ≤ xi ≤ 1,  1 ≤ i ≤ n      (4.3)

• A feasible solution is any set (x1, …, xn) satisfying (4.2) and (4.3)
• An optimal solution is a feasible solution for which (4.1) is maximized

4.2 Knapsack Problem

• Example 4.1
– n=3, m=20, (p1,p2,p3)=(25,24,15), (w1,w2,w3)=(18,15,10)

       (x1, x2, x3)       ∑ wi xi   ∑ pi xi
    1. (1/2, 1/3, 1/4)    16.5      24.25
    2. (1, 2/15, 0)       20        28.2
    3. (0, 2/3, 1)        20        31
    4. (0, 1, 1/2)        20        31.5

• Lemma 4.1 In case the sum of all the weights is ≤ m, then xi = 1, 1 ≤ i ≤ n, is an optimal solution.
• Lemma 4.2 All optimal solutions will fill the knapsack exactly.
• Knapsack problem fits the subset paradigm

4.2 Knapsack Problem

• Greedy strategy using total profit as optimization function
– Solution 2 in Example 4.1 – Suboptimal
• Greedy strategy using weight (capacity used) as optimization function
– Solution 3 in Example 4.1 – Suboptimal
• Greedy strategy using the ratio of profit to weight (pi/wi) as optimization function
– Solution 4 in Example 4.1 – Optimal

4.2 Knapsack Problem

• Algorithm for the greedy strategies (Program 4.2)
– Assuming the objects are already sorted into nonincreasing order of pi/wi

    void GreedyKnapsack(float m, int n)
    // p[1:n] and w[1:n] contain the profits and weights
    // respectively of the n objects ordered such that
    // p[i]/w[i] >= p[i+1]/w[i+1]. m is the knapsack
    // size and x[1:n] is the solution vector.
    {
        for (int i=1; i<=n; i++) x[i] = 0.0; // Initialize x.
        float U = m;
        for (i=1; i<=n; i++) {
            if (w[i] > U) break;
            x[i] = 1.0;
            U -= w[i];
        }
        if (i <= n) x[i] = U/w[i];
    }

4.2 Knapsack Problem

– Time complexity
• Sorting: O(n log n) using a fast sorting algorithm like merge sort
• GreedyKnapsack: O(n)
• So, total time is O(n log n)
• Theorem 4.1 If p1/w1 ≥ p2/w2 ≥ … ≥ pn/wn, then GreedyKnapsack generates an optimal solution to the given instance of the knapsack problem.
• Proving technique
– Compare the greedy solution with any optimal solution. If the two solutions differ, then find the first xi at which they differ. Next, it is shown how to make the xi in the optimal solution equal to that in the greedy solution without any loss in total value. Repeated use of this transformation shows that the greedy solution is optimal.

4.4 Job Sequencing with Deadlines

• Example 4.2
– n=4, (p1,p2,p3,p4)=(100,10,15,27), (d1,d2,d3,d4)=(2,1,2,1)

       Feasible solution   Processing sequence   Value
    1. (1, 2)              2, 1                  110
    2. (1, 3)              1, 3 or 3, 1          115
    3. (1, 4)              4, 1                  127
    4. (2, 3)              2, 3                  25
    5. (3, 4)              4, 3                  42
    6. (1)                 1                     100
    7. (2)                 2                     10
    8. (3)                 3                     15
    9. (4)                 4                     27

4.4 Job Sequencing with Deadlines

• Greedy strategy using total profit as optimization function
– Applying to Example 4.2
• Begin with J=φ
• Job 1 considered, and added to J ⇒ J={1}
• Job 4 considered, and added to J ⇒ J={1,4}
• Job 3 considered, but discarded because not feasible ⇒ J={1,4}
• Job 2 considered, but discarded because not feasible ⇒ J={1,4}
• Final solution is J={1,4} with total profit 127
– It is optimal
• How to determine the feasibility of J?
– Trying out all the permutations
• Computational explosion since there are n! permutations
– Possible by checking only one permutation
• By Theorem 4.3
• Theorem 4.3 Let J be a set of k jobs and σ = i1, i2, …, ik a permutation of jobs in J such that di1 ≤ di2 ≤ … ≤ dik. Then J is a feasible solution iff the jobs in J can be processed in the order σ without violating any deadline.

4.4 Job Sequencing with Deadlines

• Theorem 4.4 The greedy method described above always obtains an optimal solution to the job sequencing problem.
• High-level description of the job sequencing algorithm (Program 4.5)
– Assuming the jobs are ordered such that p[1] ≥ p[2] ≥ … ≥ p[n]

    GreedyJob(int a[], set J, int n)
    // J is a set of jobs that can be
    // completed by their deadlines.
    {
        J = {1};
        for (int i=2; i<=n; i++) {
            if (all jobs in J ∪ {i} can be completed
                by their deadlines) J = J ∪ {i};
        }
    }

4.4 Job Sequencing with Deadlines

• How to implement Program 4.5?
– How to represent J to avoid sorting the jobs in J each time?
• 1-D array J[1:k] such that J[r], 1 ≤ r ≤ k, are the jobs in J and d[J[1]] ≤ d[J[2]] ≤ … ≤ d[J[k]]
• To test whether J ∪ {i} is feasible, just insert i into J preserving the deadline ordering and then verify that d[J[r]] ≤ r, 1 ≤ r ≤ k+1

4.4 Job Sequencing with Deadlines

• C++ description of the job sequencing algorithm (Program 4.6)

    int JS(int d[], int J[], int n)
    // d[i]>=1, 1<=i<=n are the deadlines, n>=1. The jobs
    // are ordered such that p[1]>=p[2]>= ... >=p[n]. J[i]
    // is the ith job in the optimal solution, 1<=i<=k.
    // Also, at termination d[J[i]]<=d[J[i+1]], 1<=i<k.
    {
        d[0] = J[0] = 0; // Initialize.
        J[1] = 1; int k = 1; // Include job 1.
        for (int i=2; i<=n; i++) {
            // Consider jobs in nonincreasing order of p[i].
            // Find position for i and check feasibility of insertion.
            int r = k;
            while ((d[J[r]] > d[i]) && (d[J[r]] != r)) r--;
            if ((d[J[r]] <= d[i]) && (d[i] > r)) {
                // Insert i into J[].
                for (int q=k; q>=(r+1); q--) J[q+1] = J[q];
                J[r+1] = i; k++;
            }
        }
        return (k);
    }

4.5 Minimum-cost Spanning Trees

• Definition 4.1 Let G=(V,E) be an undirected connected graph. A subgraph t=(V,E’) of G is a spanning tree of G iff t is a tree.
• Example 4.5 Spanning trees
• Applications
– Obtaining an independent set of circuit equations for an electric network
– etc.

4.5 Minimum-cost Spanning Trees

• Example of MCST (Figure 4.6)
– Finding a spanning tree of G with minimum cost
(Figure 4.6: (a) a connected graph on vertices 1–7, (b) its minimum-cost spanning tree)

4.5.1 Prim’s Algorithm

• Example 4.6 (Figure 4.7)
(Figure 4.7: stages (a)–(f) of Prim’s algorithm on the graph of Figure 4.6)

4.5.1 Prim’s Algorithm

• Implementation of Prim’s algorithm
– How to determine the next edge to be added?
• Associate with each vertex j not yet included in the tree a value near(j)
• near(j): a vertex in the tree such that cost(j, near(j)) is minimum among all choices for near(j)
• The next edge is defined by the vertex j such that near(j) ≠ 0 (j not already in the tree) and cost(j, near(j)) is minimum
– e.g., Figure 4.7(b)

    near(1)=0 // already in the tree
    near(2)=1, cost(2, near(2))=28
    near(3)=1 (or 5 or 6), cost(3, near(3))=∞ // no edge to the tree
    near(4)=5, cost(4, near(4))=22
    near(5)=0 // already in the tree
    near(6)=0 // already in the tree
    near(7)=5, cost(7, near(7))=24

So, the next vertex is 4

4.5.1 Prim’s Algorithm

• Prim’s MCST algorithm (Program 4.8)

     1  float Prim(int E[][SIZE], float cost[][SIZE],
                   int n, int t[][2])
    11  {
    12    int near[SIZE], j, k, L;
    13    let (k, L) be an edge of minimum cost in E;
    14    float mincost = cost[k][L];
    15    t[1][1] = k; t[1][2] = L;
    16    for (int i=1; i<=n; i++) // Initialize near.
    17      if (cost[i][L] < cost[i][k]) near[i] = L;
    18      else near[i] = k;
    19    near[k] = near[L] = 0;
    20    for (i=2; i <= n-1; i++) { // Find n-2 additional
    21      // edges for t.
    22      let j be an index such that near[j]!=0 and
    23      cost[j][near[j]] is minimum;
    24      t[i][1] = j; t[i][2] = near[j];
    25      mincost = mincost + cost[j][near[j]];
    26      near[j] = 0;
    27      for (k=1; k<=n; k++) // Update near[].
    28        if ((near[k]!=0) &&
    29            (cost[k][near[k]] > cost[k][j]))
    30          near[k] = j;
    31    }
    32    return (mincost);
    33  }

4.5.1 Prim’s Algorithm

• Time complexity
– Line 13: O(|E|)
– Line 14: Θ(1)
– for loop of line 16: Θ(n)
– Total of for loop of line 20: O(n²)
• n iterations
• Each iteration
– Lines 22 & 23: O(n)
– for loop of line 27: O(n)
– So, Prim’s algorithm: O(n²)
• More efficient implementation using a red-black tree
– Lines 22 and 23 take O(log n)
– Line 27: O(|E|)
– So, total time: O((n + |E|) log n)

4.5.2 Kruskal’s Algorithm

• Example 4.7 (Figure 4.8)
(Figure 4.8: stages (a)–(f) of Kruskal’s algorithm on the graph of Figure 4.6)

4.5.2 Kruskal’s Algorithm

• Pseudocode of Kruskal’s algorithm (Program 4.9)

    t = EMPTY;
    while ((t has fewer than n-1 edges) && (E != EMPTY)) {
        choose an edge (v, w) from E of lowest cost;
        delete (v, w) from E;
        if ((v, w) does not create a cycle in t)
            add (v, w) to t;
        else discard (v, w);
    }

• How to implement?
– Two functions should be considered
• Determining an edge with minimum cost (line 3)
• Deleting this edge (line 4)
– Using a minheap
• Construction of the minheap: O(|E|)
• Next-edge processing: O(log |E|)
– Using Union/Find set operations to maintain the intermediate forest

4.5.2 Kruskal’s Algorithm

• Kruskal’s algorithm (Program 4.10)

    float Kruskal(int E[][SIZE], float cost[][SIZE],
                  int n, int t[][2])
    {
        int parent[SIZE];
        construct a heap out of the edge costs using Heapify;
        for (int i=1; i<=n; i++) parent[i] = -1;
        // Each vertex is in a different set.
        i = 0; float mincost = 0.0;
        while ((i < n-1) && (heap not empty)) {
            delete a minimum cost edge (u, v) from the
            heap and reheapify using Adjust;
            int j = Find(u); int k = Find(v);
            if (j != k) {
                i++;
                t[i][1] = u; t[i][2] = v;
                mincost += cost[u][v];
                Union(j, k);
            }
        }
        if (i != n-1) cout << "No spanning tree" << endl;
        else return (mincost);
    }

4.7 Optimal Merge Patterns

• Problem
– Given n sorted files, find an optimal way (i.e., one requiring the fewest comparisons or record moves) to pairwise merge them into one sorted file
– It fits the ordering paradigm
• Example 4.9
– Three sorted files (x1, x2, x3) with lengths (30, 20, 10)
– Solution 1: merging x1 and x2 (50 record moves), then merging the result with x3 (60 moves) ⇒ total 110 moves
– Solution 2: merging x2 and x3 (30 moves), then merging the result with x1 (60 moves) ⇒ total 90 moves
– Solution 2 is better

4.7 Optimal Merge Patterns

• A greedy method (for the 2-way merge problem)
– At each step, merge the two smallest files
– e.g., five files with lengths (20, 30, 10, 5, 30) (Figure 4.11)
(Figure 4.11: binary merge tree with external nodes x4=5, x3=10, x1=20, x5=30, x2=30 and internal nodes z1=15, z2=35, z3=60, z4=95)
– Total number of record moves = weighted external path length ∑i=1..n di qi
– The optimal 2-way merge pattern = a binary merge tree with minimum weighted external path length

4.7 Optimal Merge Patterns

• Algorithm (Program 4.13)

    struct treenode {
        struct treenode *lchild, *rchild;
        int weight;
    };
    typedef struct treenode Type;

    Type *Tree(int n)
    // list is a global list of n single node
    // binary trees as described above.
    {
        for (int i=1; i<n; i++) {
            Type *pt = new Type;         // Get a new tree node.
            pt->lchild = Least(list);    // Merge two trees with
            pt->rchild = Least(list);    // smallest lengths.
            pt->weight = (pt->lchild)->weight
                       + (pt->rchild)->weight;
            Insert(list, *pt);
        }
        return (Least(list));            // Tree left in list is the merge tree.
    }

4.7 Optimal Merge Patterns

• Example 4.10 (Figure 4.12)

4.7 Optimal Merge Patterns

• Time
– If list is kept in nondecreasing order: O(n²)
– If list is represented as a minheap: O(n log n)

4.7 Optimal Merge Patterns

• Huffman codes
– Obtaining an optimal set of codes for messages M1, M2, …, Mn+1
– e.g., 4 messages (Figure 4.14)
(Figure 4.14: code tree with edges labeled 0/1 and leaves M1, M2, M3, M4)
• M1=000
• M2=001
• M3=01
• M4=1

4.7 Optimal Merge Patterns

• Huffman codes (Continued)
– When qi is the relative frequency for Mi
• The expected decode time (also the expected message length): ∑1≤i≤n+1 qi di
• The minimum decode time (also the minimum message length) is obtained by choosing the code words resulting in a binary tree with minimum weighted external path length (that is, the same algorithm as for the 2-way merge problem is applicable)
– Example using Exercise 4
• (q1, q2, …, q7) = (4, 5, 7, 8, 10, 12, 20)

4.8 Single-source Shortest Paths

• Example 4.11 (Figure 4.15)

4.8 Single-source Shortest Paths

• Design of greedy algorithm
– Building the shortest paths one by one, in nondecreasing order of path lengths
• e.g., in Figure 4.15
– 1,4: 10
– 1,4,5: 25
– …
• We need to determine 1) the next vertex to which a shortest path must be generated and 2) a shortest path to this vertex
– Notations
• S = set of vertices (including v0) to which the shortest paths have already been generated
• dist(w) = length of the shortest path starting from v0, going through only those vertices that are in S, and ending at w

4.8 Single-source Shortest Paths

• Design of greedy algorithm (Continued)
– Three observations
• If the next shortest path is to vertex u, then the path begins at v0, ends at u, and goes through only those vertices that are in S.
• The destination of the next path generated must be that vertex u which has the minimum distance, dist(u), among all vertices not in S.
• Having selected a vertex u as in observation 2 and generated the shortest v0 to u path, vertex u becomes a member of S.

4.8 Single-source Shortest Paths

• Greedy algorithm (Program 4.14): Dijkstra’s algorithm
– Time: O(n²)

    ShortestPaths(int v, float cost[][SIZE],
                  float dist[], int n)
    {
        int u; bool S[SIZE];
        for (int i=1; i<=n; i++) { // Initialize S.
            S[i] = false; dist[i] = cost[v][i];
        }
        S[v] = true; dist[v] = 0.0; // Put v in S.
        for (int num = 2; num < n; num++) {
            // Determine n-1 paths from v.
            choose u from among those vertices not
            in S such that dist[u] is minimum;
            S[u] = true; // Put u in S.
            for (int w=1; w<=n; w++) // Update distances.
                if ((S[w] == false) &&
                    (dist[w] > dist[u] + cost[u][w]))
                    dist[w] = dist[u] + cost[u][w];
        }
    }

4.8 Single-source Shortest Paths

• Example 4.12 (Figure 4.16)

4.8 Single-source Shortest Paths

• Example 4.12 (Figure 4.17)
