
# Algorithm Design Techniques

- Brute force
- Divide-and-conquer
- Decrease-and-conquer
- Transform-and-conquer
- Space-and-time tradeoffs
- Dynamic programming
- Greedy techniques

CSC 3102, B.B. Karki, LSU

# Greedy Techniques


## Basics

- A greedy technique constructs a solution to an optimization problem through a sequence of steps, each expanding the partially constructed solution obtained so far, until a complete solution to the problem is reached.
- On each step, the choice made must be feasible, locally optimal, and irrevocable.
- Examples:
  - Constructing a minimum spanning tree (MST) of a weighted connected graph
    - Grows an MST through greedy inclusion of the nearest vertex or shortest edge into the tree under construction
    - Prim's and Kruskal's algorithms
  - Solving the single-source shortest-path problem
    - Finds shortest paths from a given vertex (the source) to all other vertices
    - Dijkstra's algorithm
  - Huffman tree and code
    - A binary tree that minimizes the weighted path length from the root to the leaves containing a set of predefined weights
    - An optimal prefix-free variable-length encoding scheme

# Minimum Spanning Tree

## MST: Problem Statement

- Given n points, connect them in the cheapest possible way so that there is a path between every pair of points.
  - Graph representation, e.g., a network
  - Solve the minimum spanning tree problem
- A spanning tree of a connected graph is a connected acyclic subgraph (a tree) that contains all of its vertices.
- A minimum spanning tree of a weighted connected graph is its spanning tree of the smallest weight.
  - The weight of a tree is the sum of the weights on all its edges (the sum of the lengths of all edges).
- If the edge weights are unique, there is only one minimum spanning tree; otherwise more than one MST may exist.

(Figure: a complete graph on four vertices; many spanning trees are possible, e.g., with W = 9 and W = 17; a minimum spanning tree with W = 6.)

## Constructing an MST

- Exhaustive-search approach: list all spanning trees and find the one with the minimum weight from the list.
  - The number of spanning trees grows exponentially with the graph size.
- Efficient algorithms exist for finding an MST of a connected weighted graph:
  - Prim's algorithm (R.C. Prim, 1957): constructs an MST one vertex at a time by including the nearest vertex to the vertices already in the tree.
  - Kruskal's algorithm (J.B. Kruskal, 1956): constructs an MST one edge at a time by selecting edges in increasing order of their weights, provided that the inclusion does not create a cycle.

## Prim's Algorithm

- Constructs an MST through a sequence of expanding subtrees.
- The initial subtree (VT) in such a sequence consists of a single vertex selected arbitrarily from the set V of the n vertices of the graph.
- On each iteration, we expand the current tree VT by simply attaching to it the nearest vertex not in the tree.
  - Find the smallest edge connecting VT to V - VT.
- The algorithm stops after all the graph's vertices have been included in the tree being constructed.
- The total number of iterations is n - 1, since exactly one vertex is added to the tree at each iteration.
- The MST is then defined by the set of edges used for the tree expansions.

## Prim's Algorithm: Pseudocode

- Each vertex u not in the current tree VT needs information about the shortest edge connecting it to a tree vertex v.
- Attach two labels to each non-tree vertex u: the name of the nearest tree vertex, and the weight of the corresponding edge.
  - For vertices that are not adjacent to any tree vertex, the name label is null and the weight is infinity.
- Split the non-tree vertices u into two sets:
  - Fringe: vertices adjacent to at least one tree vertex.
  - Unseen: vertices yet to be affected by the algorithm.

```
Algorithm Prim(G)
// Input: weighted connected graph G = 〈V, E〉
// Output: ET, the set of edges composing a MST of G
VT ← {v0}
ET ← Ø
for i ← 1 to |V| - 1 do
    find a minimum-weight edge e* = (v*, u*) among all edges (v, u)
        such that v is in VT and u is in V - VT
    VT ← VT ∪ {u*}
    ET ← ET ∪ {e*}
return ET
```

- Two operations are performed after finding the vertex u* to be added to the tree:
  - Move u* from the set V - VT to the minimum spanning tree VT.
  - For each remaining vertex u in V - VT that is connected to u* by a shorter edge than u's current distance label, update its labels to u* and the weight of the edge between u* and u.
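As a concrete illustration, the label-based procedure above can be sketched in Python. The adjacency-dict representation, the function name, and the demo graph's weights (read off the example figure) are assumptions of this sketch, not part of the slides:

```python
import math

def prim(graph, start):
    """Prim's algorithm with the two labels described above: for each
    non-tree vertex u, nearest[u] = (nearest tree vertex, edge weight),
    with (None, inf) for unseen vertices.
    graph: {vertex: {neighbor: weight}} for an undirected graph."""
    nearest = {u: (None, math.inf) for u in graph}
    nearest[start] = (None, 0)
    in_tree, mst_edges = set(), []
    while len(in_tree) < len(graph):
        # attach the nearest non-tree vertex (smallest weight label)
        u = min((x for x in graph if x not in in_tree),
                key=lambda x: nearest[x][1])
        v, w = nearest[u]
        in_tree.add(u)
        if v is not None:
            mst_edges.append((v, u, w))
        # update labels of non-tree vertices adjacent to u
        for x, wx in graph[u].items():
            if x not in in_tree and wx < nearest[x][1]:
                nearest[x] = (u, wx)
    return mst_edges

# six-vertex graph with weights reconstructed from the example slide
g = {'a': {'b': 3, 'e': 6, 'f': 5},
     'b': {'a': 3, 'c': 1, 'f': 4},
     'c': {'b': 1, 'd': 6, 'f': 4},
     'd': {'c': 6, 'e': 8, 'f': 5},
     'e': {'a': 6, 'd': 8, 'f': 2},
     'f': {'a': 5, 'b': 4, 'c': 4, 'd': 5, 'e': 2}}
mst = prim(g, 'a')  # five edges of total weight 15
```

The linear scan for the minimum label makes this version run in Θ(|V|²), matching the unordered-array analysis on the efficiency slide.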

## Prim's Algorithm: Example

Tree vertices and remaining vertices on each iteration. The labels indicate the nearest tree vertex and edge weight; the vertex selected on each iteration is listed in the first column (shown in bold in the original slide):

| Tree vertex | Remaining vertices |
|---|---|
| a(-, -) | b(a, 3) c(-, ∞) d(-, ∞) e(a, 6) f(a, 5) |
| b(a, 3) | c(b, 1) d(-, ∞) e(a, 6) f(b, 4) |
| c(b, 1) | d(c, 6) e(a, 6) f(b, 4) |
| f(b, 4) | d(f, 5) e(f, 2) |
| e(f, 2) | d(f, 5) |
| d(f, 5) | |

(Figure: the six-vertex weighted graph and the resulting MST with the minimum total weight of 15.)

## Prim's Algorithm: Correctness

- Correctness can be proved by induction:
  - T0, consisting of a single vertex, must be a part of any MST.
  - For the ith inductive step, assume that Ti-1 is part of some MST T, and then prove that Ti, generated from Ti-1, is also part of an MST.
- Proof by contradiction:
  - Let e = (v, u) be the smallest edge connecting a vertex in Ti-1 to G - Ti-1, used to expand Ti-1 to Ti.
  - Suppose there is an MST T not containing e; then show that T is not minimum.
  - Because T is a spanning tree, it contains a unique path from v to u, which together with edge e forms a cycle in G. This path must include another edge f = (v′, u′) connecting Ti-1 to G - Ti-1.
  - T + e - f is another spanning tree with a smaller weight than T, as e has smaller weight than f. So T was not minimum, which is what we wanted to prove.

## Prim's Algorithm: Efficiency

- Efficiency depends on the data structures chosen for the graph itself and for the priority queue of the set V - VT, whose vertex priorities are the distances (edge weights) to the nearest tree vertices.
- For a graph represented by its weight (adjacency) matrix and the priority queue implemented as an unordered array, the running time is Θ(|V|²).
- The priority queue can be implemented with a min-heap data structure:
  - A complete binary tree in which every element is less than or equal to its children; the root contains the smallest element.
  - Deletion of the smallest element and insertion of a new element in a min-heap of size n are O(log n) operations, and so is the operation of changing an element's priority.
- For a graph represented by its adjacency linked lists and the priority queue implemented as a min-heap, the running time is O(|E| log |V|).

## Prim's Algorithm: Pseudocode with Min-Heap

- Use a min-heap to remember, for each vertex, the smallest edge connecting VT with that vertex.
- Perform |V| - 1 steps in which we remove the smallest element from the heap, and at most 2|E| steps in which we examine an edge e = (v, u). For each of these steps we might replace a value on the heap, reducing its weight.

```
Algorithm PrimWithHeaps(G)
VT ← {v0}
ET ← Ø
make a heap of values (vertex, edge, wt(edge))
for i ← 1 to |V| - 1 do
    let (u*, e*, wt(e*)) have the smallest weight in the heap
    remove (u*, e*, wt(e*)) from the heap
    add u* and e* to VT
    for each edge e = (u*, u) do
        if u is not already in VT
            find the value (u, f, wt(f)) in the heap
            if wt(e) < wt(f)
                replace (u, f, wt(f)) with (u, e, wt(e))
return ET
```
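A Python sketch of the min-heap variant using the standard `heapq` module. Since `heapq` offers no replace/decrease operation, this sketch pushes new entries and skips stale ones when popped; that lazy-deletion trick, along with the demo graph, is an implementation assumption rather than a literal rendering of the pseudocode:

```python
import heapq

def prim_heap(graph, start):
    """Min-heap Prim: heap entries are (weight, vertex, tree_vertex);
    stale entries for vertices already in the tree are skipped.
    graph: {vertex: {neighbor: weight}} for an undirected graph."""
    in_tree, mst_edges = {start}, []
    heap = [(w, u, start) for u, w in graph[start].items()]
    heapq.heapify(heap)
    while heap and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(heap)
        if u in in_tree:      # stale: u joined the tree via a cheaper edge
            continue
        in_tree.add(u)
        mst_edges.append((v, u, w))
        for x, wx in graph[u].items():
            if x not in in_tree:
                heapq.heappush(heap, (wx, x, u))
    return mst_edges

# small demo graph (hypothetical weights)
g = {'a': {'b': 1, 'c': 4}, 'b': {'a': 1, 'c': 2, 'd': 6},
     'c': {'a': 4, 'b': 2, 'd': 3}, 'd': {'b': 6, 'c': 3}}
mst = prim_heap(g, 'a')  # edges ab, bc, cd of total weight 6
```

Each edge is pushed at most twice and each pop costs O(log |V|)-ish heap work, consistent with the O(|E| log |V|) bound stated above.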

## Kruskal's Algorithm

- A greedy algorithm for constructing a minimum spanning tree (MST) of a weighted connected graph.
- Finds an acyclic subgraph with |V| - 1 edges for which the sum of the edge weights is the smallest.
- Constructs an MST as an expanding sequence of subgraphs, which are always acyclic but are not necessarily connected until the final stage.
- The algorithm begins by sorting the graph's edges in non-decreasing order of their weights, and then scans the list, adding the next edge on the list to the current subgraph provided that the inclusion does not create a cycle.

```
Algorithm Kruskal(G)
// Input: weighted connected graph G = 〈V, E〉
// Output: ET, the set of edges composing a MST of G
sort E in non-decreasing order of the edge weights
ET ← Ø; ecounter ← 0
k ← 0
while ecounter < |V| - 1 do
    k ← k + 1
    if ET ∪ {e_ik} is acyclic
        ET ← ET ∪ {e_ik}; ecounter ← ecounter + 1
return ET
```
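A Python sketch of Kruskal's scan. The edge-list input format and the minimal union-find used for the acyclicity check are assumptions of this sketch, and the demo weights are read off the example figure:

```python
def kruskal(vertices, edges):
    """Kruskal's algorithm: scan edges in non-decreasing weight order
    and keep an edge only if its endpoints lie in different trees.
    vertices: iterable of vertex names; edges: list of (weight, u, v)."""
    parent = {v: v for v in vertices}

    def find(x):                      # root of the tree containing x
        while parent[x] != x:
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                  # different trees: no cycle is created
            parent[ru] = rv           # unite the two trees
            mst.append((w, u, v))
            if len(mst) == len(parent) - 1:
                break                 # |V| - 1 edges collected
    return mst

# edge list with weights reconstructed from the example slide
edges = [(3, 'a', 'b'), (6, 'a', 'e'), (5, 'a', 'f'), (1, 'b', 'c'),
         (4, 'b', 'f'), (4, 'c', 'f'), (6, 'c', 'd'), (5, 'd', 'f'),
         (8, 'd', 'e'), (2, 'e', 'f')]
mst = kruskal('abcdef', edges)  # five edges of total weight 15
```

The sort dominates, giving the O(|E| log |E|) running time noted later for this algorithm.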

## Kruskal's Algorithm: Example

Sorted list of edges; the selected edges (shown in red on the original slide) are marked:

| edge | bc | ef | ab | bf | cf | af | df | ae | cd | de |
|---|---|---|---|---|---|---|---|---|---|---|
| weight | 1 | 2 | 3 | 4 | 4 | 5 | 5 | 6 | 6 | 8 |
| selected | ✓ | ✓ | ✓ | ✓ | | | ✓ | | | |

For a graph of 6 vertices, only five edges need to be picked; the total weight of the resulting MST is 15. Picking any of the remaining edges (cf, af, ae, cd, de) would create a cycle.

## Kruskal's Algorithm: A Different View

- A progression through a series of forests containing all the vertices of a given graph and some of its edges.
  - The initial forest consists of |V| trivial trees, each comprising a single vertex of the graph.
  - The final forest consists of a single tree: the MST.
- On each iteration, the algorithm takes the next edge (u, v) from the sorted list of the graph's edges, finds the trees containing the vertices u and v, and, if these trees are not the same, unites them in a larger tree by adding the edge. This avoids creating a cycle.
- Checking whether two vertices belong to two different trees requires an application of the so-called union-find algorithm.
- The time efficiency of Kruskal's algorithm is in O(|E| log |E|).

## Union-Find Algorithm

- Kruskal's algorithm requires a dynamic partition of some n-element set S into a collection of disjoint subsets S1, S2, …, Sk.
  - Initialization: each disjoint subset is a one-element subset containing a different element of S.
  - The union-find operations act on the collection of n one-element subsets to build larger subsets.
- Abstract data type for the finite set:
  - makeset(x): creates a one-element set {x}
  - find(x): returns the subset containing x
  - union(x, y): constructs the union of the disjoint subsets containing x and y
- Subset's representative: one element from each of the disjoint subsets in the collection.
- Two principal implementations:
  - Quick find: uses an array indexed by the elements of the set; the array's values indicate the representatives of the subsets containing those elements. Each subset is implemented as a linked list.
  - Quick union: represents each subset by a rooted tree, with one element per node and the root's element as the subset's representative.
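The quick-union scheme can be sketched as a small Python class. The class name is an assumption, and the size-based union rule is a common refinement that goes beyond the basic scheme described on this slide:

```python
class DisjointSets:
    """Quick union: each subset is a rooted tree stored via parent
    links; the root's element serves as the subset's representative."""

    def __init__(self):
        self.parent = {}
        self.size = {}

    def makeset(self, x):          # create the one-element set {x}
        self.parent[x] = x
        self.size[x] = 1

    def find(self, x):             # representative (root) of x's subset
        while self.parent[x] != x:
            x = self.parent[x]
        return x

    def union(self, x, y):         # unite the subsets containing x and y
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.size[rx] < self.size[ry]:   # attach the smaller tree
            rx, ry = ry, rx                 # under the larger root
        self.parent[ry] = rx
        self.size[rx] += self.size[ry]

ds = DisjointSets()
for ch in 'abcd':
    ds.makeset(ch)
ds.union('a', 'b')
ds.union('c', 'd')
```

Union by size keeps every tree's height logarithmic, so each find used by Kruskal's cycle check stays cheap.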

# Single-Source Shortest-Paths Problem

## Problem Statement

- For a given vertex, called the source, in a weighted connected graph, find the shortest paths to all its other vertices.
  - Find a family of paths, each leading from the source to a different vertex in the graph.
  - The resulting tree is a spanning tree; if the source is different, a different tree results.
- A variety of applications exist, e.g., finding the shortest route between two cities; the traveling salesman problem.
- Dijkstra's algorithm finds the shortest paths to the graph's vertices in order of their distance from a given source.
  - Works for a graph with nonnegative edge weights.
- Different versions of the problem:
  - Single-pair shortest-path problem
  - Single-destination shortest-paths problem
  - All-pairs shortest-paths problem

(Figure: a tree representing all possible shortest paths to the four vertices b, d, c, and e from the source a, with path lengths 3, 5, 7, and 9, respectively.)

## Dijkstra's Algorithm

- Dijkstra's algorithm works in the same way as Prim's algorithm does.
  - Both construct an expanding subtree of vertices by selecting the next vertex from the priority queue of the remaining vertices and using similar labeling.
  - However, the priorities are computed differently: Dijkstra's algorithm compares path lengths (by adding edge weights), while Prim's algorithm compares the edge weights as given.
- The algorithm works by first finding the shortest path from the source to the vertex nearest to it, then to a second nearest, and so on.
- In general, before its ith iteration commences, the algorithm has already identified the shortest paths to the i - 1 vertices nearest to the source.
  - These vertices, the source, and the edges of the shortest paths leading to them from the source form a subtree Ti of the given graph.
- The next vertex nearest to the source can be found among the vertices adjacent to the vertices of Ti.
  - These adjacent vertices are referred to as "fringe vertices"; they are the candidates from which the algorithm selects the next vertex nearest to the source.

## Labeling

- Each vertex has two labels:
  - The numeric label d indicates the length of the shortest path from the source to this vertex found by the algorithm so far. When a vertex is added to the tree, d indicates the length of the shortest path from the source to that vertex.
  - The other label indicates the name of the next-to-last vertex on such a path: the parent of the vertex in the tree being constructed.
- For every fringe vertex u, the algorithm computes the sum of the distance to the nearest tree vertex v and the length dv of the shortest path from the source to v, and selects the vertex with the smallest such sum.
- With such labeling, finding the next nearest vertex u* becomes the simple task of finding a fringe vertex with the smallest d value.
- After the vertex u* to be added to the tree is identified, perform two operations:
  - Move u* from the fringe to the set of tree vertices.
  - For each remaining fringe vertex u that is connected to u* by an edge of weight w(u*, u) such that du* + w(u*, u) < du, update the labels of u to u* and du* + w(u*, u).

v.  Initialize: initialize vertex priority queue to empty. pu ← u* Decrease (Q.  The priority queue Q of the fringe vertices. u) < du du ← du* + w(u*. Karki. u. u). and its penultimate vertex pv // for every vertex v in V (pv is the list // of predecessors for each v ) Initialize (Q) for every vertex v in V do dv ← ∞. LSU . du ) B.Pseudocode  Shows explicit operations on two sets of labeled vertices:  The set VT of vertices for which a shortest path has already been found.  Insert: initialize vertex priority in the priority queue.21 Algorithm Dijkstra(G) // Input: weighted connected graph // G = 〈V.1 do u* ← DeleteMin(Q) VT ← VT ∪ {u*} for every vertex u in V .  DeleteMin: delete the minimum priority element.VT that is adjacent to u* do if du* + w(u*. CSC 3102 0. s. dv ) dv ← 0.B. Decrease (Q. ds ) VT ← Ø for i ← 0 to |V | .  Decrease: update priority of s with ds. E〉 and its vertex s // Output: The length dv of a shortest path from // s to v. pv ← Ø Insert (Q.

## Dijkstra's Algorithm: Example

Tree vertices and remaining vertices on each iteration. The labels indicate the nearest tree vertex and path length; the vertex selected on each iteration is listed in the first column (shown in bold in the original slide):

| Tree vertex | Remaining vertices |
|---|---|
| a(-, 0) | b(a, 3) c(-, ∞) d(a, 7) e(-, ∞) |
| b(a, 3) | c(b, 3+4) d(b, 3+2) e(-, ∞) |
| d(b, 5) | c(b, 7) e(d, 5+4) |
| c(b, 7) | e(d, 9) |
| e(d, 9) | |

The shortest paths of the four vertices from the source vertex a:

- b: a-b of length 3
- d: a-b-d of length 5
- c: a-b-c of length 7
- e: a-b-d-e of length 9

## Dijkstra's Algorithm: Correctness

- Correctness can be proved by induction:
  - For i = 1, the assertion is true for the trivial path from the source to itself.
  - For the general step, assume that it is true for the algorithm's tree Ti with i vertices. Let vi+1 be the vertex to be added next to the tree by the algorithm.
  - All vertices on a shortest path from s to vi+1 must be in Ti, because they are closer to s than vi+1.
  - dv is the length of the shortest path from s to v (contained in Ti) by the induction hypothesis.
  - Hence, the (i+1)st closest vertex can be selected as the algorithm does: by minimizing the sum of dv and the length of the edge from v to an adjacent vertex not in the tree.

## Dijkstra's Algorithm: Efficiency

- The time efficiency of Dijkstra's algorithm depends on the data structures chosen for the graph itself and for the priority queue.
- For a graph represented by its weight matrix and the priority queue implemented as an unordered array, the running time is Θ(|V|²).
- For a graph represented by its adjacency linked lists and the priority queue implemented as a min-heap, the running time is O(|E| log |V|).

# Huffman Tree and Code

- Huffman trees allow us to encode a text that comprises characters from some n-character alphabet.
- A Huffman code is an optimal prefix-free variable-length scheme that assigns bit strings to characters based on their frequencies in a given text.
  - Assigns shorter bit strings to high-frequency characters and longer ones to low-frequency characters.
- Uses a greedy construction of a binary tree whose leaves represent the alphabet characters and whose left and right edges are labeled with 0's and 1's.

## Huffman's Algorithm

- Initialize n one-node trees labeled with the characters of the alphabet, with the frequency of each character recorded as the weight in its tree's root.
- Repeatedly find the two trees with the smallest weights, make them the left and right subtrees of a new tree, and record the sum of their weights in the root of the new tree as its weight.
- The resulting binary tree is called a Huffman tree.
- Obtain the codeword of a character by recording the labels (0 or 1) on the simple path from the root to the character's leaf.
  - This is the Huffman code; it provides an optimal encoding.
- Dynamic Huffman encoding: the coding tree is updated each time a new character is read from the source text.
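The construction steps can be sketched in Python with a heap of partial trees. Carrying each subtree's codeword table directly in the heap entry, and the tie-break counter that keeps heap comparisons well-defined, are implementation assumptions of this sketch:

```python
import heapq

def huffman_codes(freq):
    """Huffman's algorithm: start from one-node trees weighted by
    character frequency, repeatedly merge the two lightest trees, and
    prepend 0/1 for the left/right subtree of each merge.
    freq: {character: weight}; returns {character: codeword}."""
    heap = [(w, i, {ch: ''}) for i, (ch, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)    # two smallest-weight trees
        w2, _, right = heapq.heappop(heap)
        merged = {ch: '0' + code for ch, code in left.items()}
        merged.update({ch: '1' + code for ch, code in right.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# the five-character example discussed on the next slide
codes = huffman_codes({'A': 0.35, 'B': 0.1, 'C': 0.2, 'D': 0.2, '-': 0.15})
```

The exact bit patterns depend on how ties between equal weights are broken, but the codeword lengths, and hence the expected bits per character, are the same for any valid Huffman tree.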

## Constructing a Huffman Coding Tree

See the textbook for the following five-character alphabet {A, B, C, D, -} example:

| character | A | B | C | D | - |
|---|---|---|---|---|---|
| probability | 0.35 | 0.1 | 0.2 | 0.2 | 0.15 |
| codeword | 11 | 100 | 00 | 01 | 101 |

Expected number of bits per character:

l̄ = Σ li pi = 2.25 (summing over the five characters)

In the fixed-length scheme, each codeword would contain three bits, so the Huffman code results in compression by (3 - 2.25)/3 = 0.25, which is 25%.

Variance: Var = Σ (li - l̄)² pi ≈ 0.19
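The arithmetic above can be verified with a few lines of Python, using the probabilities and codeword lengths from the table:

```python
# codeword lengths and probabilities from the table above
lengths = {'A': 2, 'B': 3, 'C': 2, 'D': 2, '-': 3}
probs = {'A': 0.35, 'B': 0.1, 'C': 0.2, 'D': 0.2, '-': 0.15}

# expected number of bits per character
mean = sum(lengths[ch] * p for ch, p in probs.items())
# variance of the codeword length
var = sum((lengths[ch] - mean) ** 2 * p for ch, p in probs.items())
# compression relative to a 3-bit fixed-length code
compression = (3 - mean) / 3
```

This reproduces l̄ = 2.25, Var = 0.1875 ≈ 0.19, and the 25% compression figure.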