Greedy Method
Greedy Approach
● A greedy algorithm always makes the choice that looks best at the moment.
● It makes a locally optimal choice in the hope that this choice will lead to a
globally optimal solution.
● Greedy algorithms do not always yield optimal solutions, but for many
problems they do.
Optimization Problems
● An optimization problem is one in which you want to find, not just a
solution, but the best solution
● A “greedy algorithm” sometimes works well for optimization problems
○ best for problems with the greedy-choice property
■ a globally-optimal solution can always be found by a series of local
improvements from a starting configuration.
● A greedy algorithm works in phases. At each phase:
○ You take the best you can get right now, without regard for future
consequences
○ You hope that by choosing a local optimum at each step, you will
end up at a global optimum
Example - Making coin-change
❖ Problem: Given an amount N, find the least number of coins that make
change for N
❖ Objective function: Minimize number of coins returned.
❖ Greedy solution: Always return the largest coin you can
❖ Example 1: Coins are valued $.32, $.08, $.01
➢ Has the greedy-choice property, since no amount over $.32 can be made with a
minimum number of coins by omitting a $.32 coin (similarly for amounts over
$.08, but under $.32).
❖ Example 2: Coins are valued $.30, $.20, $.05, $.01
➢ Does not have the greedy-choice property, since $.40 is best made with two
$.20 coins, but the greedy solution picks three coins ($.30, $.05, $.05)
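Both coin examples can be checked directly. The sketch below (the `greedy_change` helper is my own, and amounts are in integer cents to avoid floating-point issues) reproduces the failure in Example 2:

```python
# Greedy coin change: repeatedly take the largest coin that still fits.
def greedy_change(coins, amount):
    """Return the coins the greedy strategy picks for `amount` (integer cents).

    `coins` must be sorted in descending order.
    """
    picked = []
    for coin in coins:
        while amount >= coin:       # take this coin as long as it fits
            amount -= coin
            picked.append(coin)
    return picked

# Example 2 above: greedy is NOT optimal for the system {30, 20, 5, 1}.
print(greedy_change([30, 20, 5, 1], 40))   # -> [30, 5, 5], but [20, 20] uses only 2 coins
```

The same function on Example 1's system {32, 8, 1} always returns a minimal coin count, since that system has the greedy-choice property.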
Example Problems Solved using Greedy Methods
● Huffman Code
● Fractional Knapsack
● Scheduling Jobs with Deadlines
● Optimal Binary Merge Pattern
Huffman Coding - Intro
Huffman coding builds a minimum-cost prefix code: the greedy choice is to
repeatedly merge the two least-frequent symbols into a single node, so the
rarest symbols receive the longest codewords.
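The greedy merge step can be sketched with a heap (the `huffman_codes` helper and the symbol frequencies below are illustrative, not from the slides):

```python
# Huffman coding sketch: a heap of (frequency, tie-breaker, {symbol: code}) entries.
import heapq

def huffman_codes(freqs):
    """Return {symbol: codeword} for a dict mapping symbols to frequencies."""
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    counter = len(heap)                      # unique tie-breaker for heap entries
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two least-frequent subtrees
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}       # left branch gets '0'
        merged.update({s: "1" + c for s, c in right.items()})  # right branch gets '1'
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_codes({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5})
print(codes["a"])   # the most frequent symbol gets the shortest codeword
```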
FKP (Fractional Knapsack Problem) - Intro
Objective: maximize the total value carried, sum of v[i] * x[i]
Constraint: the total weight must not exceed the capacity, sum of w[i] * x[i] <= W,
where 0 <= x[i] <= 1 is the fraction of item i taken
FKP - Example
FKP - Greedy Solution
Greedy choice: keep taking the item with the highest value-to-weight
(benefit/weight) ratio
Run time: O(n log n), dominated by sorting
FKP - Greedy Algorithm
Greedy-Fractional-Knapsack (w[0..n-1], v[0..n-1], W) {
    for (i = 0 to n-1) {
        x[i] = 0;
        r[i] = v[i] / w[i];
    }
    sort the items in descending order with respect to r[0..n-1]
        (reordering w[], v[], and x[] accordingly)
    weight = W; profit = 0.0;          // weight = remaining capacity
    for (i = 0 to n-1) {
        if (weight >= w[i]) {          // item i fits entirely
            x[i] = 1;
            weight = weight - w[i];
            profit = profit + v[i];
        }
        else {                         // take only the fraction that fits
            x[i] = weight / w[i];
            profit = profit + v[i] * x[i];
            break;
        }
    }
    return x and profit
}
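The pseudocode above translates directly into runnable Python; this is a sketch (the function name and the test instance are my own), returning fractions in the items' original order:

```python
# Greedy fractional knapsack: take items in decreasing value/weight order.
from typing import List, Tuple

def fractional_knapsack(w: List[float], v: List[float], W: float) -> Tuple[List[float], float]:
    """Return (fraction taken of each item, total profit)."""
    # Sort item indices by value-to-weight ratio, descending.
    order = sorted(range(len(w)), key=lambda i: v[i] / w[i], reverse=True)
    x = [0.0] * len(w)
    remaining, profit = W, 0.0
    for i in order:
        if remaining >= w[i]:          # item fits entirely
            x[i] = 1.0
            remaining -= w[i]
            profit += v[i]
        else:                          # take the fraction that fits, then stop
            x[i] = remaining / w[i]
            profit += v[i] * x[i]
            break
    return x, profit

# A common textbook instance: capacity 50, optimal profit 240.
x, profit = fractional_knapsack([10, 20, 30], [60, 100, 120], 50)
print(profit)   # ~240 for this instance
```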
JOB SEQUENCING WITH DEADLINES
Objective:
To find the subset of jobs that can each be completed
by their deadline and whose total profit is maximum.
JOB SEQUENCING WITH DEADLINES - Contd..
• A feasible solution is a subset of jobs J such that each
job is completed by its deadline.
• An optimal solution is a feasible solution with
maximum profit value.
Example : Let n = 4, (p1,p2,p3,p4) = (100,10,15,27),
(d1,d2,d3,d4) = (2,1,2,1)
JOB SEQUENCING WITH DEADLINES - Contd..
Sr.No.   Feasible Solution   Processing Sequence   Profit value
(i)      (1, 2)              (2, 1)                110
(ii)     (1, 3)              (1, 3) or (3, 1)      115
(iii)    (1, 4)              (4, 1)                127 (optimal)
(iv)     (2, 3)              (2, 3)                25
(v)      (3, 4)              (4, 3)                42
(vi)     (1)                 (1)                   100
(vii)    (2)                 (2)                   10
(viii)   (3)                 (3)                   15
(ix)     (4)                 (4)                   27
JOB SEQUENCING WITH DEADLINES - Contd..
J = { 1} is a feasible one
J = { 1, 4} is a feasible one with processing
sequence ( 4,1)
J = { 1, 3, 4} is not feasible
J = { 1, 2, 4} is not feasible
J = { 1, 4} is optimal
Another example:
Job       a    b    c    d    e
Deadline  2    1    3    2    1
Profit    60   100  20   40   20
Algorithm: Job-Sequencing-With-Deadline (id[0..n-1], p[0..n-1], d[0..n-1])
# Assume jobs are sorted so that p[0] >= p[1] >= ... >= p[n-1]
t = max(d[])                      # number of time slots
result = [False] * t              # result[j] is True if slot j is occupied
job = ['-'] * t                   # job[j] stores the id of the job in slot j
# Iterate through all given jobs, most profitable first
for i = 0 to n-1:
    # Search for a free slot, starting at the job's deadline and moving backwards
    # (slots are 0-indexed, deadlines are 1-indexed, hence d[i]-1)
    for j = min(t-1, d[i]-1) down to 0:
        if result[j] is False:    # free slot found
            result[j] = True
            job[j] = id[i]
            break
print(job[])
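The algorithm above can be run as Python directly; this sketch (the `job_sequencing` name and tuple format are my own) reproduces the earlier example with n = 4:

```python
# Greedy job sequencing: schedule each job, most profitable first, in the
# latest free slot at or before its deadline.
def job_sequencing(jobs):
    """jobs: list of (id, profit, deadline). Returns (schedule, total profit).

    schedule[j] is the job id run in time slot j (None if the slot is idle).
    """
    jobs = sorted(jobs, key=lambda job: job[1], reverse=True)  # greedy order
    t = max(d for _, _, d in jobs)          # number of available time slots
    schedule = [None] * t
    profit = 0
    for job_id, p, d in jobs:
        # Search backwards from the job's deadline for a free slot.
        for j in range(min(t, d) - 1, -1, -1):
            if schedule[j] is None:
                schedule[j] = job_id
                profit += p
                break
    return schedule, profit

# Slide example: n = 4, profits (100,10,15,27), deadlines (2,1,2,1)
print(job_sequencing([(1, 100, 2), (2, 10, 1), (3, 15, 2), (4, 27, 1)]))
# -> ([4, 1], 127), matching the optimal solution J = {1, 4}
```

On the five-job table above (jobs a-e), the same function schedules b, a, c for a total profit of 180.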
Optimal Binary Merge Pattern
Merge 3 sorted lists L1, L2, and L3, of sizes 30, 20, and 10,
respectively, into a single file by merging only two at a time.
We want to find an optimal merge pattern that minimizes the
total number of comparisons.
Case 1: merge L1 and L2, using 30 + 20 = 50 comparisons and
producing a list of size 50; then merge this list with L3, using
another 50 + 10 = 60 comparisons. Total number of comparisons:
50 + 60 = 110.
Case 2: merge L2 and L3, using 20 + 10 = 30 comparisons; then
merge the result with L1, using another 30 + 30 = 60 comparisons.
Total number of comparisons: 30 + 60 = 90.
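The optimal pattern is found greedily, Huffman-style: always merge the two smallest lists first. A minimal sketch of the total-cost computation (the `optimal_merge_cost` name is my own):

```python
# Optimal binary merge: repeatedly merge the two smallest lists using a min-heap.
import heapq

def optimal_merge_cost(sizes):
    """Return the minimum total number of comparisons to merge all lists."""
    heap = list(sizes)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)   # the two smallest remaining lists
        b = heapq.heappop(heap)
        total += a + b            # cost of merging them
        heapq.heappush(heap, a + b)
    return total

print(optimal_merge_cost([30, 20, 10]))   # -> 90, matching Case 2 above
```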
Optimal Binary Merge Pattern - Example