CS 332: Algorithms
Quicksort
David Luebke 2 1/31/2012
Homework 2
Assigned today, due next Wednesday
Will be on web page shortly after class
Go over now
Review: Quicksort
Sorts in place
Sorts O(n lg n) in the average case
Sorts O(n^2) in the worst case
But in practice, it's quick
And the worst case doesn't happen often (but more on this later…)
Quicksort
Another divide-and-conquer algorithm
The array A[p..r] is partitioned into two nonempty subarrays A[p..q] and A[q+1..r]
Invariant: All elements in A[p..q] are less than or equal to all elements in A[q+1..r]
The subarrays are recursively sorted by calls to quicksort
Unlike merge sort, no combining step: the two sorted subarrays form an already-sorted array
Quicksort Code
Quicksort(A, p, r)
{
    if (p < r)
    {
        q = Partition(A, p, r);
        Quicksort(A, p, q);
        Quicksort(A, q+1, r);
    }
}
Partition
Clearly, all the action takes place in the partition() function
Rearranges the subarray in place
End result:
Two subarrays
All values in the first subarray ≤ all values in the second
Returns the index of the "pivot" element separating the two subarrays
How do you suppose we implement this?
Partition In Words
Partition(A, p, r):
Select an element to act as the "pivot" (which?)
Grow two regions, A[p..i] and A[j..r]
All elements in A[p..i] <= pivot
All elements in A[j..r] >= pivot
Increment i until A[i] >= pivot
Decrement j until A[j] <= pivot
Swap A[i] and A[j]
Repeat until i >= j
Return j
Note: slightly different from the book's partition()
Partition Code
Partition(A, p, r)
    x = A[p];
    i = p - 1;
    j = r + 1;
    while (TRUE)
        repeat
            j--;
        until A[j] <= x;
        repeat
            i++;
        until A[i] >= x;
        if (i < j)
            Swap(A, i, j);
        else
            return j;
Illustrate on A = {5, 3, 2, 6, 4, 1, 3, 7};
What is the running time of partition()?
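The pseudocode above can be checked by running it; here is a minimal Python sketch of this Hoare-style partition and the quicksort that drives it (the function names and the driver at the bottom are ours, not from the slides):

```python
def partition(A, p, r):
    # Pivot on the first element, exactly as in the pseudocode above.
    x = A[p]
    i = p - 1
    j = r + 1
    while True:
        # repeat j-- until A[j] <= x
        j -= 1
        while A[j] > x:
            j -= 1
        # repeat i++ until A[i] >= x
        i += 1
        while A[i] < x:
            i += 1
        if i < j:
            A[i], A[j] = A[j], A[i]   # Swap(A, i, j)
        else:
            return j

def quicksort(A, p, r):
    if p < r:
        q = partition(A, p, r)
        quicksort(A, p, q)       # note: recurse on (p, q), not (p, q-1), with this partition
        quicksort(A, q + 1, r)

A = [5, 3, 2, 6, 4, 1, 3, 7]     # the slide's example array
quicksort(A, 0, len(A) - 1)
print(A)                         # → [1, 2, 3, 3, 4, 5, 6, 7]
```

Tracing partition(A, 0, 7) on this array by hand reproduces the in-place swaps the slide asks you to illustrate.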
Partition Code
Partition(A, p, r)
    x = A[p];
    i = p - 1;
    j = r + 1;
    while (TRUE)
        repeat
            j--;
        until A[j] <= x;
        repeat
            i++;
        until A[i] >= x;
        if (i < j)
            Swap(A, i, j);
        else
            return j;
partition() runs in O(n) time
Analyzing Quicksort
What will be the worst case for the algorithm?
Partition is always unbalanced
What will be the best case for the algorithm?
Partition is perfectly balanced
Which is more likely?
The latter, by far, except...
Will any particular input elicit the worst case?
Yes: Already-sorted input
Analyzing Quicksort
In the worst case:
T(1) = Θ(1)
T(n) = T(n - 1) + Θ(n)
Works out to
T(n) = Θ(n^2)
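To see where the quadratic bound comes from, unroll the recurrence (a standard expansion, not shown on the slide):

```latex
T(n) = T(n-1) + \Theta(n)
     = T(n-2) + \Theta(n-1) + \Theta(n)
     = \cdots
     = T(1) + \sum_{k=2}^{n} \Theta(k)
     = \Theta\!\Big(\sum_{k=1}^{n} k\Big)
     = \Theta\big(n(n+1)/2\big) = \Theta(n^2)
```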
Analyzing Quicksort
In the best case:
T(n) = 2T(n/2) + Θ(n)
What does this work out to?
T(n) = Θ(n lg n)
Improving Quicksort
The real liability of quicksort is that it runs in O(n^2) on already-sorted input
Book discusses two solutions:
Randomize the input array, OR
Pick a random pivot element
How will these solve the problem?
By ensuring that no particular input can be chosen to make quicksort run in O(n^2) time
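The second fix is a one-line change to partition: swap a uniformly random element into position p before pivoting. A self-contained Python sketch (all names are ours), assuming the partition body matches the slides' pseudocode:

```python
import random

def randomized_quicksort(A, p, r):
    """Quicksort with a random pivot: no fixed input (such as an
    already-sorted array) can reliably force the O(n^2) worst case."""
    if p < r:
        k = random.randint(p, r)   # choose the pivot position uniformly
        A[p], A[k] = A[k], A[p]    # move it to the front...
        x = A[p]                   # ...then partition exactly as before
        i, j = p - 1, r + 1
        while True:
            j -= 1
            while A[j] > x:
                j -= 1
            i += 1
            while A[i] < x:
                i += 1
            if i < j:
                A[i], A[j] = A[j], A[i]
            else:
                break
        randomized_quicksort(A, p, j)
        randomized_quicksort(A, j + 1, r)

A = list(range(100, 0, -1))       # adversarial input for a fixed first-element pivot
randomized_quicksort(A, 0, len(A) - 1)
print(A == list(range(1, 101)))   # → True
```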
Analyzing Quicksort: Average Case
Assuming random input, average-case running time is much closer to O(n lg n) than O(n^2)
First, a more intuitive explanation/example:
Suppose that partition() always produces a 9-to-1 split. This looks quite unbalanced!
The recurrence is thus:
T(n) = T(9n/10) + T(n/10) + n
How deep will the recursion go? (draw it)
Use n instead of O(n)
for convenience (how?)
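The 9-to-1 recursion is deepest along the 9n/10 branch, so it goes about log_{10/9} n ≈ 6.6 lg n levels down — still O(lg n) levels of O(n) work each. A quick numeric check of the recurrence (hypothetical helper T of our own, with sizes truncated to integers and T(1) = 1) shows T(n)/(n lg n) staying bounded:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(9n/10) + T(n/10) + n, with T(1) = 1
    if n <= 1:
        return 1
    return T(int(9 * n / 10)) + T(max(1, int(n / 10))) + n

for n in (10**3, 10**4, 10**5):
    # the ratio stays bounded by a small constant, as an O(n lg n) bound predicts
    print(n, round(T(n) / (n * math.log2(n)), 2))
```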
Analyzing Quicksort: Average Case
Intuitively, a real-life run of quicksort will produce a mix of "bad" and "good" splits
Randomly distributed among the recursion tree
Pretend for intuition that they alternate between best-case (n/2 : n/2) and worst-case (n-1 : 1)
What happens if we bad-split the root node, then good-split the resulting size (n-1) node?
Analyzing Quicksort: Average Case
Intuitively, a real-life run of quicksort will produce a mix of "bad" and "good" splits
Randomly distributed among the recursion tree
Pretend for intuition that they alternate between best-case (n/2 : n/2) and worst-case (n-1 : 1)
What happens if we bad-split the root node, then good-split the resulting size (n-1) node?
We fail English
Analyzing Quicksort: Average Case
Intuitively, a real-life run of quicksort will produce a mix of "bad" and "good" splits
Randomly distributed among the recursion tree
Pretend for intuition that they alternate between best-case (n/2 : n/2) and worst-case (n-1 : 1)
What happens if we bad-split the root node, then good-split the resulting size (n-1) node?
We end up with three subarrays, size 1, (n-1)/2, (n-1)/2
Combined cost of splits = n + n - 1 = 2n - 1 = O(n)
No worse than if we had good-split the root node!
Analyzing Quicksort: Average Case
Intuitively, the O(n) cost of a bad split (or 2 or 3 bad splits) can be absorbed into the O(n) cost of each good split
Thus running time of alternating bad and good splits is still O(n lg n), with slightly higher constants
How can we be more rigorous?
Analyzing Quicksort: Average Case
For simplicity, assume:
All inputs distinct (no repeats)
Slightly different partition() procedure
partition around a random element, which is not included in subarrays
all splits (0:n-1, 1:n-2, 2:n-3, …, n-1:0) equally likely
What is the probability of a particular split happening?
Answer: 1/n
Analyzing Quicksort: Average Case
So partition generates splits (0:n-1, 1:n-2, 2:n-3, …, n-2:1, n-1:0), each with probability 1/n
If T(n) is the expected running time,
T(n) = (1/n) Σ_{k=0}^{n-1} [ T(k) + T(n - 1 - k) ] + Θ(n)
What is each term under the summation for?
What is the Θ(n) term for?
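The expected-time recurrence can also be evaluated exactly by dynamic programming, which makes the O(n lg n) claim visible numerically. A sketch of ours (taking the Θ(n) term as exactly n and T(0) = 0 — our simplification, not the slides'):

```python
import math

# T(n) = (1/n) * sum_{k=0}^{n-1} (T(k) + T(n-1-k)) + n
#      = (2/n) * sum_{k=0}^{n-1} T(k)              + n
N = 2000
T = [0.0] * (N + 1)
prefix = 0.0                    # running sum of T(0) + ... + T(n-1)
for n in range(1, N + 1):
    T[n] = 2.0 * prefix / n + n
    prefix += T[n]

for n in (100, 1000, 2000):
    # T(n) grows like 2n ln n, so this ratio creeps toward 2 ln 2 ≈ 1.39
    print(n, round(T[n] / (n * math.log2(n)), 3))
```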
Analyzing Quicksort: Average Case
So…
T(n) = (1/n) Σ_{k=0}^{n-1} [ T(k) + T(n - 1 - k) ] + Θ(n)
     = (2/n) Σ_{k=0}^{n-1} T(k) + Θ(n)
Note: this is just like the book's recurrence (p. 166), except that the summation starts with k = 0
We'll take care of that in a second
Write it on
the board
Analyzing Quicksort: Average Case
We can solve this recurrence using the dreaded
substitution method
Guess the answer
Assume that the inductive hypothesis holds
Substitute it in for some value < n
Prove that it follows for n
Analyzing Quicksort: Average Case
We can solve this recurrence using the dreaded
substitution method
Guess the answer
What's the answer?
Assume that the inductive hypothesis holds
Substitute it in for some value < n
Prove that it follows for n
Analyzing Quicksort: Average Case
We can solve this recurrence using the dreaded
substitution method
Guess the answer
T(n) = O(n lg n)
Assume that the inductive hypothesis holds
Substitute it in for some value < n
Prove that it follows for n
Analyzing Quicksort: Average Case
We can solve this recurrence using the dreaded
substitution method
Guess the answer
T(n) = O(n lg n)
Assume that the inductive hypothesis holds
What's the inductive hypothesis?
Substitute it in for some value < n
Prove that it follows for n
Analyzing Quicksort: Average Case
We can solve this recurrence using the dreaded
substitution method
Guess the answer
T(n) = O(n lg n)
Assume that the inductive hypothesis holds
T(n) ≤ an lg n + b for some constants a and b
Substitute it in for some value < n
Prove that it follows for n
Analyzing Quicksort: Average Case
We can solve this recurrence using the dreaded
substitution method
Guess the answer
T(n) = O(n lg n)
Assume that the inductive hypothesis holds
T(n) ≤ an lg n + b for some constants a and b
Substitute it in for some value < n
What value?
Prove that it follows for n
Analyzing Quicksort: Average Case
We can solve this recurrence using the dreaded
substitution method
Guess the answer
T(n) = O(n lg n)
Assume that the inductive hypothesis holds
T(n) ≤ an lg n + b for some constants a and b
Substitute it in for some value < n
The value k in the recurrence
Prove that it follows for n
Analyzing Quicksort: Average Case
We can solve this recurrence using the dreaded
substitution method
Guess the answer
T(n) = O(n lg n)
Assume that the inductive hypothesis holds
T(n) ≤ an lg n + b for some constants a and b
Substitute it in for some value < n
The value k in the recurrence
Prove that it follows for n
Grind through it…
Analyzing Quicksort: Average Case

The recurrence to be solved:
T(n) = (2/n) Σ_{k=0}^{n-1} T(k) + Θ(n)
     ≤ (2/n) Σ_{k=0}^{n-1} (ak lg k + b) + Θ(n)           [plug in the inductive hypothesis]
     = (2/n) [ b + Σ_{k=1}^{n-1} (ak lg k + b) ] + Θ(n)   [expand out the k = 0 case]
     = (2/n) Σ_{k=1}^{n-1} (ak lg k + b) + 2b/n + Θ(n)
     = (2/n) Σ_{k=1}^{n-1} (ak lg k + b) + Θ(n)           [2b/n is just a constant, so fold it into Θ(n)]
Note: leaving the same recurrence as the book
David Luebke 31 1/31/2012
Analyzing Quicksort: Average Case

The recurrence to be solved:
T(n) = (2/n) Σ_{k=1}^{n-1} (ak lg k + b) + Θ(n)
     = (2/n) Σ_{k=1}^{n-1} ak lg k + (2/n) Σ_{k=1}^{n-1} b + Θ(n)   [distribute the summation]
     = (2a/n) Σ_{k=1}^{n-1} k lg k + (2b/n)(n - 1) + Θ(n)           [evaluate the summation: b + b + … + b = b(n - 1)]
     ≤ (2a/n) Σ_{k=1}^{n-1} k lg k + 2b + Θ(n)                      [since n - 1 < n, 2b(n - 1)/n < 2b]
This summation gets its own set of slides later
Analyzing Quicksort: Average Case

The recurrence to be solved:
T(n) ≤ (2a/n) Σ_{k=1}^{n-1} k lg k + 2b + Θ(n)
     ≤ (2a/n) [ (1/2) n^2 lg n - (1/8) n^2 ] + 2b + Θ(n)   [what the hell? We'll prove this later]
     = an lg n - (a/4) n + 2b + Θ(n)                       [distribute the (2a/n) term]
     = an lg n + b + (Θ(n) + b - (a/4) n)                  [remember, our goal is to get T(n) ≤ an lg n + b]
     ≤ an lg n + b                                         [pick a large enough that (a/4) n dominates Θ(n) + b]
Analyzing Quicksort: Average Case
So T(n) ≤ an lg n + b for certain a and b
Thus the induction holds
Thus T(n) = O(n lg n)
Thus quicksort runs in O(n lg n) time on average (phew!)
Oh yeah, the summation…
Tightly Bounding The Key Summation

Σ_{k=1}^{n-1} k lg k = Σ_{k=1}^{⌈n/2⌉-1} k lg k + Σ_{k=⌈n/2⌉}^{n-1} k lg k   [split the summation for a tighter bound]
  ≤ Σ_{k=1}^{⌈n/2⌉-1} k lg k + Σ_{k=⌈n/2⌉}^{n-1} k lg n                      [the lg k in the second term is bounded by lg n]
  = Σ_{k=1}^{⌈n/2⌉-1} k lg k + lg n Σ_{k=⌈n/2⌉}^{n-1} k                      [move the lg n outside the summation]
David Luebke 35 1/31/2012
The summation bound so far
Tightly Bounding
The Key Summation
¦
¦
¸ )
¦
¦
¸ )
¦
¦
¸ )
¦
¦
¿ ¿
¿ ¿
¿ ¿
¿ ¿ ¿
=
=
=
=
=
=
=
=
=
+ =
+ =
+ ·
+ ·
1
2
1 2
1
1
2
1 2
1
1
2
1 2
1
1
2
1 2
1
1
1
lg 1 lg
lg 1 lg
lg 2 lg
lg lg lg
n
n k
n
k
n
n k
n
k
n
n k
n
k
n
n k
n
k
n
k
k n k n
k n n k
k n n k
k n k k k k
What are we doing here?
The lg k in the first term is
bounded by lg n/2
What are we doing here? lg n/2 = lg n  1
What are we doing here?
Move (lg n  1) outside the
summation
David Luebke 36 1/31/2012
The summation bound so far
Tightly Bounding
The Key Summation
¸ )
¦
¦
¦ ¦
¦
¦
¸ )
¦
¿
¿ ¿
¿ ¿ ¿
¿ ¿ ¿
=
=
=
=
=
=
=
=
=
¦
'
+
'
=
=
+ =
+ ·
1 2
1
1 2
1
1
1
1
2
1 2
1
1 2
1
1
2
1 2
1
1
1
2
) ( 1
lg
lg
lg lg
lg 1 lg lg
n
k
n
k
n
k
n
n k
n
k
n
k
n
n k
n
k
n
k
k
n n
n
k k n
k n k k n
k n k n k k
What are we doing here? Distribute the (lg n  1)
What are we doing here?
The summations overlap in
range; combine them
What are we doing here? The Guassian series
David Luebke 37 1/31/2012
The summation bound so far
Tightly Bounding
The Key Summation
¸ )
¦
¸ ) . J
¸ ) . J
¸ )
4 8
1
lg lg
2
1
1
2 2 2
1
lg 1
2
1
lg 1
2
1
lg
2
) ( 1
lg
2 2
1 2
1
1 2
1
1
1
n
n n n n n
n n
n n n
k n n n
k n
n n
k k
n
k
n
k
n
k
+ ·
¦
'
+
'
¦
'
+
'
·
·
¦
'
+
'
·
¿
¿ ¿
=
=
=
What are we doing here?
Rearrange first term, place
upper bound on second
What are we doing? X Guassian series
What are we doing?
Multiply it
all out
Tightly Bounding
The Key Summation
Σ_{k=1}^{n-1} k lg k ≤ (1/2) n^2 lg n - (1/2) n lg n - (1/8) n^2 + n/4
  ≤ (1/2) n^2 lg n - (1/8) n^2   when n ≥ 2   [since (1/2) n lg n ≥ n/4 for n ≥ 2]
Done!!!
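The key-summation bound just proved can be sanity-checked numerically; this short sketch (ours, not the slides') verifies Σ_{k=1}^{n-1} k lg k ≤ (1/2) n^2 lg n - (1/8) n^2 for every n from 2 to 5000:

```python
import math

s = 0.0                                  # running value of sum_{k=1}^{n-1} k lg k
for n in range(2, 5001):
    s += (n - 1) * math.log2(n - 1)      # add the k = n-1 term (note lg 1 = 0)
    bound = 0.5 * n * n * math.log2(n) - n * n / 8.0
    assert s <= bound, n
print("bound holds for 2 <= n <= 5000")
```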