
Definition:

The greedy technique is a general algorithm design strategy built on the following elements:

• configurations: the different choices or values to be found

• objective function: a function of the configuration that is to be either maximized or minimized

The method:

• **Applicable to optimization problems ONLY.**

• Constructs a solution through a sequence of steps.

• Each step expands the partially constructed solution built so far, until a complete solution to the problem is reached.

• On each step, the choice made must be:

  o Feasible: it has to satisfy the problem's constraints.

  o Locally optimal: it has to be the best local choice among all feasible choices available on that step.

  o Irrevocable: once made, it cannot be changed on subsequent steps of the algorithm.
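As a small illustration of these three requirements (a sketch, not from the notes), consider greedy change making with a canonical coin system such as 25/10/5/1, for which the greedy choice is known to give the minimum number of coins:

```python
def greedy_change(amount, coins=(25, 10, 5, 1)):
    """Return the list of coins chosen greedily for `amount`."""
    chosen = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:      # feasible: never overshoot the amount
            chosen.append(coin)    # locally optimal: largest coin that still fits
            amount -= coin         # irrevocable: a taken coin is never given back
    return chosen

print(greedy_change(48))  # [25, 10, 10, 1, 1, 1]
```

For a non-canonical coin system (e.g. coins 1, 3, 4 and amount 6) the same greedy choice is still feasible and irrevocable, but no longer globally optimal, which is exactly why the greedy-choice property below matters.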

NOTE:

• The greedy method works best when applied to problems with the greedy-choice property.

• Greedy-choice property: a globally optimal solution can always be arrived at by a series of locally optimal choices starting from an initial configuration.

**Greedy method vs. Dynamic programming method:**

• LIKE dynamic programming, the greedy method solves optimization problems.

• LIKE dynamic programming, problems suited to the greedy method exhibit optimal substructure.

• UNLIKE dynamic programming, problems suited to the greedy method also exhibit the greedy-choice property, which avoids backtracking.

**Applications of the Greedy Strategy:**

• Optimal solutions:

  o Change making

  o Minimum Spanning Tree (MST)

  o Single-source shortest paths

  o Huffman codes

• Approximations:

  o Traveling Salesman Problem (TSP)

  o Fractional Knapsack problem


Spanning Tree

Definition: A spanning tree of a given graph G is a connected acyclic sub-graph (tree) of G that includes all of G's vertices.

Example: Consider the graph with vertices a, b, c, d and weighted edges ab = 1, ac = 2, cd = 3, ad = 5. For this graph there are three possible spanning trees:

• T1 = {ab, ad, cd}, Weight (T1) = 1 + 5 + 3 = 9

• T2 = {ab, ac, ad}, Weight (T2) = 1 + 2 + 5 = 8

• T3 = {ab, ac, cd}, Weight (T3) = 1 + 2 + 3 = 6

Minimum Spanning Tree (MST)

Definition: The MST of a weighted, connected graph G is defined as: a spanning tree of G with minimum total weight. Among the spanning trees above, the one with the minimum weight, 6, is the MST for the given graph.

Question: Why can't we use the BRUTE FORCE method in constructing the MST?

Answer: With brute force, the exhaustive search approach has to be applied, and two serious obstacles are faced:

1. The number of spanning trees grows exponentially with graph size.

2. Generating all spanning trees for a given graph is not easy.

MST Applications:

• Network design: telephone, electrical, hydraulic, TV cable, computer, and road networks.

• Approximation algorithms for NP-hard problems: traveling salesperson problem, Steiner tree.

• Cluster analysis.

• Reducing data storage in sequencing amino acids in a protein.

• Learning salient features for real-time face verification.

• Auto-configuration protocol for Ethernet bridging, to avoid cycles in a network.

Prim's Algorithm - to find a minimum spanning tree

Some useful definitions:

• Fringe edge: an edge which has one vertex in the partially constructed tree Ti and the other not.

• Unseen edge: an edge with both vertices not in Ti.

Algorithm:

ALGORITHM Prim(G)
//Prim's algorithm for constructing a MST
//Input: A weighted connected graph G = {V, E}
//Output: ET, the set of edges composing a MST of G
// the set of tree vertices can be initialized with any vertex
VT ← {v0}
ET ← Ø
for i ← 1 to |V| - 1 do
    find a minimum-weight edge e* = (v*, u*) among all the edges (v, u)
        such that v is in VT and u is in V - VT
    VT ← VT ∪ {u*}
    ET ← ET ∪ {e*}
return ET

The method:

STEP 1: Start with a tree, T0, consisting of one vertex.

STEP 2: "Grow" the tree one vertex/edge at a time:

• Construct a series of expanding sub-trees T1, T2, ..., Tn-1.

• At each stage, construct Ti+1 from Ti by adding the minimum-weight edge connecting a vertex in the tree (Ti) to a vertex not yet in the tree, i.e. choose from the "fringe" edges (this is the "greedy" step!).

The algorithm stops when all vertices are included.

Example: Apply Prim's algorithm to the following graph to find the MST: vertices a, b, c, d, e, f with edge weights ab = 3, ae = 6, af = 5, bc = 1, bf = 4, cd = 6, cf = 4, de = 8, df = 5, ef = 2.

Solution:

Tree vertices    Remaining vertices
a(-, -)          b(a, 3)  c(-, ∞)  d(-, ∞)  e(a, 6)  f(a, 5)
b(a, 3)          c(b, 1)  d(-, ∞)  e(a, 6)  f(b, 4)
c(b, 1)          d(c, 6)  e(a, 6)  f(b, 4)
f(b, 4)          d(f, 5)  e(f, 2)
e(f, 2)          d(f, 5)
d(f, 5)          -

The algorithm stops since all vertices are included. The MST consists of the edges ab, bc, bf, ef, and df, so the weight of the minimum spanning tree is 3 + 1 + 4 + 2 + 5 = 15.

Efficiency:

The efficiency of Prim's algorithm depends on the data structure used to store the priority queue:

• Unordered array: Θ(n²)

• Binary heap: Θ(m log n)

• Min-heap, for a graph with n nodes and m edges: O((n + m) log n)

Conclusion:

• Prim's algorithm is a "vertex based algorithm".

• Prim's algorithm needs a priority queue for locating the nearest vertex, and the choice of priority queue matters in a Prim implementation:

  o Array - optimal for dense graphs

  o Binary heap - better for sparse graphs

  o Fibonacci heap - best in theory, but not in practice
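The method above can be sketched in Python (a sketch using the standard-library `heapq` binary heap, with lazy deletion of stale fringe entries; the adjacency list re-encodes the lecture's example graph):

```python
import heapq

def prim(graph, start):
    """Prim's MST sketch. graph: dict vertex -> list of (weight, neighbor).
    Returns the chosen (weight, new_vertex) pairs."""
    in_tree = {start}
    mst_edges = []
    fringe = list(graph[start])          # fringe edges leaving the tree
    heapq.heapify(fringe)
    while fringe and len(in_tree) < len(graph):
        w, u = heapq.heappop(fringe)     # greedy step: cheapest fringe edge
        if u in in_tree:
            continue                     # stale entry (u joined earlier), skip
        in_tree.add(u)
        mst_edges.append((w, u))
        for edge in graph[u]:            # new fringe edges out of u
            if edge[1] not in in_tree:
                heapq.heappush(fringe, edge)
    return mst_edges

# The lecture's example graph:
g = {
    'a': [(3, 'b'), (5, 'f'), (6, 'e')],
    'b': [(3, 'a'), (1, 'c'), (4, 'f')],
    'c': [(1, 'b'), (4, 'f'), (6, 'd')],
    'd': [(6, 'c'), (5, 'f'), (8, 'e')],
    'e': [(6, 'a'), (2, 'f'), (8, 'd')],
    'f': [(5, 'a'), (4, 'b'), (4, 'c'), (5, 'd'), (2, 'e')],
}
print(sum(w for w, _ in prim(g, 'a')))  # 15
```

The lazy-deletion heap is one of the priority-queue choices listed above; an adjacency-matrix scan would give the Θ(n²) unordered-array behaviour instead.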

Kruskal's Algorithm - to find a minimum spanning tree

Algorithm:

ALGORITHM Kruskal(G)
//Kruskal's algorithm for constructing a MST
//Input: A weighted connected graph G = {V, E}
//Output: ET, the set of edges composing a MST of G
sort E in ascending order of the edge weights
// initialize the set of tree edges and its size
ET ← Ø
edge_counter ← 0
// initialize the number of processed edges
k ← 0
while edge_counter < |V| - 1 do
    k ← k + 1
    if ET ∪ {e_ik} is acyclic
        ET ← ET ∪ {e_ik}
        edge_counter ← edge_counter + 1
return ET

The method:

STEP 1: Sort the edges by increasing weight.

STEP 2: Start with a forest having |V| trees (one per vertex).

STEP 3: The number of trees is reduced by ONE at every inclusion of an edge. At each stage:

• Among the edges which are not yet included, select the one with minimum weight AND which does not form a cycle.

• The selected edge reduces the number of trees by one by combining two trees of the forest.

The algorithm stops when |V| - 1 edges are included in the MST, i.e. when the number of trees in the forest is reduced to ONE.

Example: Apply Kruskal's algorithm to the same graph to find the MST (edge weights ab = 3, ae = 6, af = 5, bc = 1, bf = 4, cd = 6, cf = 4, de = 8, df = 5, ef = 2).

Solution: Sort the edges in ascending order of weight, then process them in turn:

Edge    Weight    Insertion status       Insertion order
bc      1         YES                    1
ef      2         YES                    2
ab      3         YES                    3
bf      4         YES                    4
cf      4         NO (forms a cycle)     -
af      5         NO (forms a cycle)     -
df      5         YES                    5

The algorithm stops as |V| - 1 = 5 edges are included in the MST, giving the same tree of weight 15 as Prim's algorithm.
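The same example can be run through a Kruskal sketch in Python; the acyclicity test here uses a union-find structure, an implementation detail the pseudocode leaves abstract:

```python
def kruskal(n_vertices, edges):
    """Kruskal's MST sketch. edges: list of (weight, u, v)."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    mst = []
    for w, u, v in sorted(edges):           # STEP 1: sort by weight
        ru, rv = find(u), find(v)
        if ru != rv:                        # acyclic: endpoints in different trees
            parent[ru] = rv                 # union: merge the two trees
            mst.append((w, u, v))
            if len(mst) == n_vertices - 1:  # |V| - 1 edges included -> stop
                break
    return mst

edges = [(3, 'a', 'b'), (5, 'a', 'f'), (6, 'a', 'e'), (1, 'b', 'c'),
         (4, 'b', 'f'), (4, 'c', 'f'), (6, 'c', 'd'), (5, 'd', 'f'),
         (8, 'd', 'e'), (2, 'e', 'f')]
print(sum(w for w, _, _ in kruskal(6, edges)))  # 15
```

With union-find, the cycle checks are nearly free, so the sort dominates the running time, matching the Θ(|E| log |E|) efficiency stated below.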

Efficiency:

The efficiency of Kruskal's algorithm is determined by the time needed for sorting the edge weights of the given graph. With an efficient sorting algorithm: Θ(|E| log |E|).

Conclusion:

• Kruskal's algorithm is an "edge based algorithm".

• Prim's algorithm with a heap is faster than Kruskal's algorithm.

Dijkstra's Algorithm - to find Single Source Shortest Paths

Some useful definitions:

• Shortest Path Problem: Given a connected directed graph G with non-negative weights on the edges and a root vertex r, find for each vertex x a directed path P(x) from r to x so that the sum of the weights on the edges in the path P(x) is as small as possible.

Algorithm:

• Devised by Dutch computer scientist Edsger Dijkstra in 1959.

• Solves the single-source shortest path problem for a graph with non-negative edge weights.

• This algorithm is often used in routing; e.g. Dijkstra's algorithm is usually the working principle behind link-state routing protocols.

ALGORITHM Dijkstra(G, s)
//Input: Weighted connected graph G = {V, E} and source vertex s
//Output: The length Dv of a shortest path from s to v and its
//        penultimate vertex Pv for every vertex v in V
Initialize(Q)                 //initialize the priority queue
for every vertex v in V do
    Dv ← ∞; Pv ← null         // Pv: the parent of v
    Insert(Q, v, Dv)          //initialize vertex priority in the priority queue
Ds ← 0; Decrease(Q, s, Ds)    //update priority of s with Ds, making Ds the minimum
VT ← Ø
for i ← 0 to |V| - 1 do
    u* ← DeleteMin(Q)         //expanding the tree, choosing the locally best vertex
    VT ← VT ∪ {u*}
    for every vertex u in V - VT that is adjacent to u* do
        if Du* + w(u*, u) < Du
            Du ← Du* + w(u*, u); Pu ← u*
            Decrease(Q, u, Du)

The method:

Dijkstra's algorithm solves the single source shortest path problem in 2 stages:

Stage 1: A greedy algorithm computes the shortest distance from the source to all other nodes in the graph and saves it in a data structure.

Stage 2: Uses that data structure for finding a shortest path from the source to any vertex v.

• Scan first from the root and take initial paths P(x) = (r, x) with D(x) = w(rx) when rx is an edge, and D(x) = ∞ when rx is not an edge.

• For each vertex x, keep track of a "distance" D(x) and a directed path P(x) from the root to vertex x of length D(x).

• At each step, for each temporary vertex y distinct from x, set D(y) = min{ D(y), D(x) + w(xy) }.

Example: Apply Dijkstra's algorithm to find single source shortest paths with vertex a as the source, on the same graph (ab = 3, ae = 6, af = 5, bc = 1, bf = 4, cd = 6, cf = 4, de = 8, df = 5, ef = 2).

Solution: The length Dv of the shortest path from the source a to each vertex v, and the penultimate vertex Pv, evolve as follows:

Tree vertices    Remaining vertices
a(-, 0)          b(a, 3)  c(-, ∞)  d(-, ∞)  e(a, 6)  f(a, 5)
b(a, 3)          c(b, 3 + 1)  d(-, ∞)  e(a, 6)  f(a, 5)
c(b, 4)          d(c, 4 + 6)  e(a, 6)  f(a, 5)
f(a, 5)          d(c, 10)  e(a, 6)
e(a, 6)          d(c, 10)
d(c, 10)         -

The algorithm stops since there are no more edges to scan. The final distances and paths are:

Da = 0,  Pa = a
Db = 3,  Pb = [a, b]
Dc = 4,  Pc = [a, b, c]
Dd = 10, Pd = [a, b, c, d]
De = 6,  Pe = [a, e]
Df = 5,  Pf = [a, f]
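The trace above can be reproduced with a Python sketch (again using the standard-library `heapq`; the dictionaries `D` and `P` play the roles of Dv and Pv):

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's single-source shortest paths (non-negative weights).
    graph: dict vertex -> list of (weight, neighbor).
    Returns (D, P): distance and penultimate vertex for each vertex."""
    D = {source: 0}
    P = {source: None}
    done = set()
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)        # greedy step: closest unfinished vertex
        if u in done:
            continue                    # stale entry, skip
        done.add(u)
        for w, v in graph[u]:
            if v not in done and d + w < D.get(v, float('inf')):
                D[v] = d + w            # relax the edge (u, v)
                P[v] = u                # penultimate vertex on the path to v
                heapq.heappush(pq, (D[v], v))
    return D, P

g = {
    'a': [(3, 'b'), (5, 'f'), (6, 'e')],
    'b': [(3, 'a'), (1, 'c'), (4, 'f')],
    'c': [(1, 'b'), (4, 'f'), (6, 'd')],
    'd': [(6, 'c'), (5, 'f'), (8, 'e')],
    'e': [(6, 'a'), (2, 'f'), (8, 'd')],
    'f': [(5, 'a'), (4, 'b'), (4, 'c'), (5, 'd'), (2, 'e')],
}
D, P = dijkstra(g, 'a')
print(sorted(D.items()))
# [('a', 0), ('b', 3), ('c', 4), ('d', 10), ('e', 6), ('f', 5)]
```

Following `P` back from any vertex to `a` reconstructs the path, e.g. d → c → b → a gives the path [a, b, c, d] from the trace.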

Conclusion:

• Doesn't work with negative weights.

• Applicable to both undirected and directed graphs.

• Using an unordered array to store the priority queue: Efficiency = Θ(n²).

• Using a min-heap to store the priority queue: Efficiency = O(m log n).

Huffman Trees

Some useful definitions:

• Code word: Encoding a text that comprises n characters from some alphabet by assigning to each of the text's characters some sequence of bits. This bit sequence is called a code word.

• Fixed-length encoding: assigns to each character a bit string of the same length.

• Variable-length encoding: assigns code words of different lengths to different characters.

Problem: How can we tell how many bits of an encoded text represent the i-th character? We can use prefix-free codes.

• Prefix-free code: In a prefix-free code, no codeword can be a prefix of another codeword.

• Binary prefix code: The characters are associated with the leaves of a binary tree.

  o All left edges are labeled 0.

  o All right edges are labeled 1.

  o The codeword of a character is obtained by recording the labels on the simple path from the root to the character's leaf.

  o Since there is no simple path to a leaf that continues to another leaf, no codeword is a prefix of a codeword of another character.

Huffman algorithm:

• Devised by David A. Huffman in 1951; constructs a binary prefix code tree.

• Huffman's algorithm achieves data compression by finding the best variable-length binary encoding scheme for the symbols that occur in the file to be compressed.

• Huffman coding uses the frequencies of the symbols in the string to build a variable rate prefix code:

  o Each symbol is mapped to a binary string.

  o More frequent symbols have shorter codes.

  o No code is a prefix of another code.

• Huffman codes for data compression achieve 20-90% compression.

Construction:

Step 1: Initialize n one-node trees and label them with the characters of the alphabet. Record the frequency of each character in its tree's root to indicate the tree's weight. (More generally, the weight of a tree will be equal to the sum of the frequencies in the tree's leaves.)

Step 2: Repeat the following operation until a single tree is obtained: "Find the two trees with the smallest weights. Make them the left and right sub-trees of a new tree and record the sum of their weights in the root of the new tree as its weight."

Example: Construct a Huffman code for the following data:

Character      A     B     C     D     -
Probability    0.4   0.1   0.2   0.15  0.15

• Encode the text ABACABAD using the code.

• Decode the text whose encoding is 100010111001010.

Solution: Repeatedly merge the two smallest-weight trees:

1. Merge B (0.1) and D (0.15) into a tree of weight 0.25. Remaining weights: A 0.4, C 0.2, - 0.15, {B, D} 0.25.

2. Merge - (0.15) and C (0.2) into a tree of weight 0.35. Remaining weights: A 0.4, {B, D} 0.25, {-, C} 0.35.

3. Merge {B, D} (0.25) and {-, C} (0.35) into a tree of weight 0.6. Remaining weights: A 0.4, {B, D, -, C} 0.6.

4. Merge A (0.4) and {B, D, -, C} (0.6) into a tree of weight 1.0.

The algorithm STOPS as a SINGLE TREE is obtained.

The code words are:

Character    Probability    Code word
A            0.4            0
B            0.1            100
C            0.2            111
D            0.15           101
-            0.15           110

Encoded text for ABACABAD using the code words: 0100011101000101

Decoded text for the encoded text 100010111001010 is: BAD-ADA

Compute the compression ratio:

Bits per character = Σ (codeword length × frequency)
                   = (1 × 0.4) + (3 × 0.1) + (3 × 0.2) + (3 × 0.15) + (3 × 0.15)
                   = 2.20

Compared with a 3-bit fixed-length code for the five symbols, the compression ratio is (3 - 2.20) / 3 × 100% = 26.6%.
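The two-step construction can be sketched in Python with `heapq` (ties between equal weights are broken arbitrarily here, so individual codeword bits may differ from the hand-built tree above, but the codeword lengths and average length come out the same):

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build Huffman codewords from {symbol: frequency}."""
    tiebreak = count()  # unique counter so the heap never compares dicts
    # Step 1: one one-node tree per character, weighted by its frequency.
    heap = [(f, next(tiebreak), {s: ''}) for s, f in freqs.items()]
    heapq.heapify(heap)
    # Step 2: repeatedly merge the two smallest-weight trees.
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {s: '0' + c for s, c in left.items()}   # left edges get 0
        merged.update({s: '1' + c for s, c in right.items()})  # right get 1
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]

freqs = {'A': 0.4, 'B': 0.1, 'C': 0.2, 'D': 0.15, '-': 0.15}
codes = huffman_codes(freqs)
avg = sum(freqs[s] * len(codes[s]) for s in freqs)
print(round(avg, 2))  # 2.2
```

Encoding is then a table lookup per symbol, and decoding walks the bits through the tree (or, equivalently, matches prefixes against the codeword table), as in the BAD-ADA example above.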
