
Greedy Method

Greedy Approach
● A greedy algorithm always makes the choice that looks best at the moment.
● It makes a locally optimal choice in the hope that this choice will lead to a
globally optimal solution.
● Greedy algorithms do not always yield optimal solutions, but for many
problems they do.
Optimization Problems
● An optimization problem is one in which you want to find not just a
solution, but the best solution.
● A “greedy algorithm” sometimes works well for optimization problems
○ best for problems with the greedy-choice property
■ a globally-optimal solution can always be found by a series of local
improvements from a starting configuration.
● A greedy algorithm works in phases. At each phase:
○ You take the best you can get right now, without regard for future
consequences
○ You hope that by choosing a local optimum at each step, you will
end up at a global optimum
Example - Making coin-change
❖ Problem: Given an amount N, find the least number of coins that make
change for N.
❖ Objective function: Minimize number of coins returned.
❖ Greedy solution: Always return the largest coin you can
❖ Example 1: Coins are valued $.32, $.08, $.01
➢ Has the greedy-choice property, since no amount over $.32 can be made with a
minimum number of coins by omitting a $.32 coin (similarly for amounts over
$.08, but under $.32).
❖ Example 2: Coins are valued $.30, $.20, $.05, $.01
➢ Does not have the greedy-choice property, since $.40 is best made with two
$.20’s, but the greedy solution picks three coins ($.30, $.05, $.05), as the
sketch below shows.
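A minimal Python sketch of the greedy change-maker (the function name greedy_change and the cents-based amounts are illustrative choices, not from the slides):

def greedy_change(amount, coins):
    # Greedy change-making: always take the largest coin that still fits.
    used = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            used.append(c)
    return used

# Example 2's coin system (values in cents): greedy is suboptimal for 40.
print(greedy_change(40, [30, 20, 5, 1]))   # [30, 5, 5] -> three coins
# The optimal answer uses two coins: [20, 20].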
Example Problems Solved using Greedy Methods
● Huffman Code
● Fractional Knapsack
● Scheduling Jobs with Deadlines
● Optimal Binary Merge Pattern
Huffman Coding - Intro

❖ Fixed Length Code:
➢ If only six different characters are used in a text, then 3 bits are
enough to represent each character.
■ a = 000; b = 001; … f = 101.
■ This method thus requires 300,000 bits to encode an entire file of
100,000 characters.
Huffman Coding - Intro
❖ Variable Length Code:
➢ Encode frequent characters with short codewords and infrequent
characters with longer codewords.
➢ With character frequencies (in thousands) of 45, 13, 12, 16, 9, and 5,
and codeword lengths of 1, 3, 3, 3, 4, and 4, the variable-length code
requires
(45·1 + 13·3 + 12·3 + 16·3 + 9·4 + 5·4) × 1,000 = 224,000 bits,
saving approximately 25% over the 300,000-bit fixed-length encoding.
Huffman Coding - Intro
❖ Prefix-free Code:
➢ No codeword is a prefix of any other codeword.
➢ Prefix codes are desirable because they simplify decoding.
➢ E.g., 001011101 can be parsed uniquely as 0-0-101-1101, which decodes
as aabe (with a = 0, b = 101, e = 1101); see the decoding sketch below.
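A minimal Python sketch of prefix-code decoding, using the codewords from the parse above (the decode helper and its dictionary-based table are illustrative, not from the slides):

def decode(bits, codes):
    # Scan left to right; prefix-freeness guarantees that the first
    # codeword match is the only possible parse.
    inv = {codeword: symbol for symbol, codeword in codes.items()}
    out, word = [], ""
    for bit in bits:
        word += bit
        if word in inv:
            out.append(inv[word])
            word = ""
    return "".join(out)

# Codewords from the example above (a = 0, b = 101, e = 1101):
codes = {'a': '0', 'b': '101', 'e': '1101'}
print(decode("001011101", codes))   # 'aabe'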
Huffman Coding
❖ Compression:
➢ Find the frequency of occurrence of
each character
➢ Encode frequent characters with short bit strings
➢ Encode rarer characters with longer bit strings
❖ Encoding
➢ Use a tree
➢ Encode by following the tree from root to leaf
➢ E.g., E is 00 and S is 011
➢ Frequent characters (E, T): 2-bit encodings
➢ Other characters (A, S, N, O): 3-bit encodings
Huffman Coding
Huffman Coding - Time Complexity
With a min-priority queue (binary heap), building the Huffman tree for n
distinct characters takes O(n log n) time: there are n - 1 merge steps, and
each involves O(log n) heap operations.
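A minimal Python sketch of the heap-based construction, run on the frequency table from the earlier slides; the huffman_codes function and the left-edge-0 / right-edge-1 convention are illustrative assumptions, not from the slides:

import heapq

def huffman_codes(freq):
    # Each heap entry is (frequency, tie_breaker, tree); the tie_breaker
    # keeps comparisons well-defined when frequencies are equal.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # two least-frequent trees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1
    # Walk the tree: left edge = '0', right edge = '1'.
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: a symbol
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

# Frequencies (in thousands) from the running example:
freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
codes = huffman_codes(freq)
total_bits = 1000 * sum(freq[s] * len(codes[s]) for s in freq)
print(codes)
print(total_bits)   # 224000, matching the slide's calculation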
The Fractional Knapsack Problem (FKP)
Given: A set S of n items, where each item i has a positive benefit b_i and a
positive weight w_i.
Goal: Choose items with maximum total benefit but with total weight at most W.
If we are allowed to take fractional amounts of items, this is the fractional
knapsack problem. In this case, we let x_i denote the amount we take of item i,
with 0 ≤ x_i ≤ w_i.

Objective: maximize Σ_i b_i (x_i / w_i)

Constraint: Σ_i x_i ≤ W
FKP - Example
FKP - Greedy Solution
Greedy choice: Repeatedly take the item with the highest value density
(benefit-to-weight ratio).
Running time: O(n log n), dominated by sorting the items by ratio.
FKP - Greedy Algorithm
Greedy-Fractional-Knapsack (w[0..n-1], v[0..n-1], W) {
    for (i = 0 to n-1) {
        x[i] = 0;
        r[i] = v[i] / w[i];            // value density of item i
    }
    sort the items in descending order with respect to r[0..n-1];
    remaining = W; profit = 0.0;       // remaining knapsack capacity
    for (i = 0 to n-1) {
        if (w[i] <= remaining) {       // item i fits entirely
            x[i] = 1;
            remaining = remaining - w[i];
            profit = profit + v[i];
        }
        else {                         // take only the fraction that fits
            x[i] = remaining / w[i];
            profit = profit + v[i] * x[i];
            break;
        }
    }
    return x and profit;
}
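The same procedure as a runnable Python sketch; the item data in the usage line is hypothetical, chosen only to exercise the fractional case:

def fractional_knapsack(values, weights, capacity):
    # Greedy fractional knapsack: take items by decreasing value density.
    items = sorted(zip(values, weights),
                   key=lambda vw: vw[0] / vw[1], reverse=True)
    profit, remaining, fractions = 0.0, capacity, []
    for v, w in items:
        if w <= remaining:            # item fits entirely
            fractions.append(1.0)
            remaining -= w
            profit += v
        else:                         # take the fraction that fits, then stop
            frac = remaining / w
            fractions.append(frac)
            profit += v * frac
            break
    return profit, fractions

# Hypothetical instance: three items, capacity 50.
print(fractional_knapsack([60, 100, 120], [10, 20, 30], 50))
# (240.0, [1.0, 1.0, 0.666...]) -> take items 1 and 2 whole, 2/3 of item 3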
JOB SEQUENCING WITH DEADLINES

The problem statement:
• There are n jobs to be processed on a machine.
• Each job i has a deadline d_i ≥ 0 and profit p_i ≥ 0.
• The profit p_i is earned iff job i is completed by its deadline.
• A job is completed if it is processed on the machine for one unit of time.
• Only one machine is available for processing jobs.
• Only one job is processed at a time on the machine.
JOB SEQUENCING WITH DEADLINES

Objective:
To find the subset of jobs that can all be completed by their deadlines
and yields the maximum total profit.

JOB SEQUENCING WITH DEADLINES
- Contd..
• A feasible solution is a subset of jobs J such that each
job is completed by its deadline.
• An optimal solution is a feasible solution with
maximum profit value.
Example: Let n = 4, (p_1, p_2, p_3, p_4) = (100, 10, 15, 27),
(d_1, d_2, d_3, d_4) = (2, 1, 2, 1)

JOB SEQUENCING WITH DEADLINES
- Contd..
Sr. No.   Feasible Solution   Processing Sequence   Profit Value
(i)       (1, 2)              (2, 1)                110
(ii)      (1, 3)              (1, 3) or (3, 1)      115
(iii)     (1, 4)              (4, 1)                127 (optimal)
(iv)      (2, 3)              (2, 3)                 25
(v)       (3, 4)              (4, 3)                 42
(vi)      (1)                 (1)                   100
(vii)     (2)                 (2)                    10
(viii)    (3)                 (3)                    15
(ix)      (4)                 (4)                    27

JOB SEQUENCING WITH DEADLINES
- Contd..
J = {1} is a feasible solution.
J = {1, 4} is a feasible solution with processing sequence (4, 1).
J = {1, 3, 4} is not feasible.
J = {1, 2, 4} is not feasible.
J = {1, 4} is optimal, with profit 100 + 27 = 127.

Another instance (used in the Python sketch after the algorithm below):

Job        a     b     c     d     e
Deadline   2     1     3     2     1
Profit     60   100    20    40    20

Algorithm: Job-Sequencing-With-Deadline (id[0..n-1], p[0..n-1], d[0..n-1])
# Assume jobs are ordered such that p[0] ≥ p[1] ≥ … ≥ p[n-1]
t = max(d[])                    # number of available time slots
result = [False] * t            # result[j] is True if slot j is filled
job = ['-1'] * t                # to store the sequence of job ids
# Iterate through all given jobs, most profitable first
for i = 0 to n-1:
    # Search for a free slot, starting from the latest slot
    # on or before job i's deadline
    for j = min(t, d[i]) - 1 down to 0:
        # Free slot found
        if result[j] is False:
            result[j] = True
            job[j] = id[i]
            break
print(job[])

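A runnable Python sketch of the algorithm above, applied to the five-job instance from the earlier slide; the function name job_sequencing and the (id, deadline, profit) tuple layout are illustrative choices:

def job_sequencing(jobs):
    # jobs: list of (id, deadline, profit); schedule most profitable
    # jobs first, each as late as possible before its deadline.
    jobs = sorted(jobs, key=lambda j: j[2], reverse=True)
    t = max(d for _, d, _ in jobs)     # number of unit time slots
    slot = [None] * t                  # slot[j] is the job run in slot j
    for jid, deadline, _ in jobs:
        # Latest free slot on or before the deadline.
        for j in range(min(t, deadline) - 1, -1, -1):
            if slot[j] is None:
                slot[j] = jid
                break
    return slot

jobs = [('a', 2, 60), ('b', 1, 100), ('c', 3, 20), ('d', 2, 40), ('e', 1, 20)]
print(job_sequencing(jobs))   # ['b', 'a', 'c'] -> total profit 180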
Optimal Binary Merge Pattern

● Problem Statement: Merge a set of sorted files of different lengths
into a single sorted file.
● Objective: Find the merge order that minimizes the total number of
record moves.
● Merging a file of p records and a file of q records requires p + q
record moves.
● Greedy Solution: At each step, merge the two smallest files together.
Optimal Binary Merge Pattern - Example

Merge 3 sorted lists L1, L2, and L3, of sizes 30, 20, and 10,
respectively, into a single file by merging only two at a time.
We intend to find an optimal merge pattern which minimizes the
total number of record moves.
Case 1: Merge L1 and L2, which uses 30 + 20 = 50 moves and produces a
list of size 50; then merge this list with L3, using another
50 + 10 = 60 moves. Total number of moves: 50 + 60 = 110.
Case 2: Merge L2 and L3, using 20 + 10 = 30 moves; then merge the
result with L1, for another 30 + 30 = 60 moves.
Total number of moves: 30 + 60 = 90.
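A minimal heap-based Python sketch that applies the greedy rule to the lists above (optimal_merge_cost is an illustrative name):

import heapq

def optimal_merge_cost(sizes):
    # Repeatedly merge the two smallest files; return total record moves.
    heap = list(sizes)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)      # two smallest files
        b = heapq.heappop(heap)
        total += a + b               # merging them costs a + b moves
        heapq.heappush(heap, a + b)
    return total

print(optimal_merge_cost([30, 20, 10]))   # 90, matching Case 2 above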
Optimal Binary Merge Pattern - Example

merge cost = sum of all weighted external path lengths, i.e.,
Σ (file size × depth of that file's leaf in the merge tree)