1-Divide and Conquer Algorithms
7n − 3 is O(n)
8n² log n + 5n² + n is O(n² log n)
Asymptotic Notation (terminology)
Special classes of algorithms:
constant: O(1)
logarithmic: O(log n)
linear: O(n)
quadratic: O(n²)
polynomial: O(nᵏ), k ≥ 1
exponential: O(aⁿ), a > 1
NOTE: One algorithm is more efficient than another if its worst-case running time has a lower order of growth.
Comparing the asymptotic running time
- an algorithm that runs in O(n) time is better than one that runs in O(n²) time
- similarly, O(log n) is better than O(n)
- hierarchy of functions: log n << n << n² << n³ << 2ⁿ
Categories of algorithm efficiency
Efficiency           Big O
Constant             O(1)
Logarithmic          O(log n)
Linear               O(n)
Linear logarithmic   O(n log n)
Quadratic            O(n²)
Polynomial           O(nᵏ)
Exponential          O(cⁿ)
Factorial            O(n!)
Designing Algorithms
Example: sort the array 10 4 20 8 15 2 1 -5
DIVIDE:                    10 4 | 20 8 | 15 2 | 1 -5
CONQUER (sort each pair):  4 10 | 8 20 | 2 15 | -5 1
COMBINE:                   4 8 10 20 | -5 1 2 15
COMBINE:                   -5 1 2 4 8 10 15 20
Brute force: n², merge sort: n log n
What did we do?
We are breaking the larger problem down into smaller subproblems (DIVIDE).
1 if p < r
2    then q ← ⌊(p + r)/2⌋
3         Merge_sort(A, p, q)
4         Merge_sort(A, q+1, r)
5         MERGE(A, p, q, r)
Merging Two Sequences
Merge_Sort(A, p, r) {                  T(n)
    if (p < r) {                       Θ(1)
        q = floor((p + r) / 2);        Θ(1)
        Merge_Sort(A, p, q);           T(n/2)
        Merge_Sort(A, q+1, r);         T(n/2)
        Merge(A, p, q, r);             Θ(n)
    }
}
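The pseudocode above can be sketched as runnable C++ (a minimal illustration; the buffer-based Merge below is one common way to implement the routine the slides only name):

```cpp
#include <vector>

// Merge the two sorted runs A[p..q] and A[q+1..r] back into A (Theta(n) work).
void Merge(std::vector<int>& A, int p, int q, int r) {
    std::vector<int> tmp;
    tmp.reserve(r - p + 1);
    int i = p, j = q + 1;
    while (i <= q && j <= r)
        tmp.push_back(A[i] <= A[j] ? A[i++] : A[j++]);
    while (i <= q) tmp.push_back(A[i++]);
    while (j <= r) tmp.push_back(A[j++]);
    for (int k = p; k <= r; ++k)
        A[k] = tmp[k - p];
}

// Sort A[p..r]: divide at the midpoint, conquer each half, combine by merging.
void MergeSort(std::vector<int>& A, int p, int r) {
    if (p < r) {
        int q = (p + r) / 2;      // floor((p + r) / 2)
        MergeSort(A, p, q);       // T(n/2)
        MergeSort(A, q + 1, r);   // T(n/2)
        Merge(A, p, q, r);        // Theta(n)
    }
}
```

Sorting the slide's example array {10, 4, 20, 8, 15, 2, 1, -5} yields {-5, 1, 2, 4, 8, 10, 15, 20}.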
Worst-case running time for merge sort, after adding D(n) and C(n):
So T(n) = Θ(1)             when n = 1, and
   T(n) = 2T(n/2) + Θ(n)   when n > 1.
So what (more succinctly) is T(n)?
Recurrences
The expression:
T(n) = c              if n = 1
T(n) = 2T(n/2) + cn   if n > 1
is a recurrence.
Recurrence: an equation that describes a function in terms of its value on smaller inputs.
Recurrence Examples
s(n) = 0              if n = 0        s(n) = 0              if n = 0
s(n) = c + s(n − 1)   if n > 0        s(n) = n + s(n − 1)   if n > 0

T(n) = c              if n = 1        T(n) = c              if n = 1
T(n) = 2T(n/2) + c    if n > 1        T(n) = aT(n/b) + cn   if n > 1
Growth of Functions
When the input size is very large, the better way to analyze an algorithm is asymptotic analysis.
Asymptotic efficiency: we are concerned with how the running time of an algorithm increases with the size of the input in the limit, as the size of the input increases without bound.
To simplify asymptotic analysis, certain standard asymptotic notations have been defined.
ASYMPTOTIC NOTATIONS
Θ-notation
For a given function g(n), Θ(g(n)) is given by:
Θ(g(n)) = {f(n): there exist positive constants c₁, c₂ and n₀ such that 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀}
Ex: 100n + 5 ≠ Θ(n²)
[Figure: growth of f(n) = log n, n, n log n, n², n³, 2ⁿ plotted for n = 1 … 20]
Relations Between Θ, Ω, O
Theorem: For any two functions g(n) and f(n),
f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
[Figure: a function f(n) = Θ(n²) sandwiched between a lower bound in Ω(n²) and an upper bound in O(n²)]
o-notation
For a given function g(n), the set little-o:
o(g(n)) = {f(n): for any constant c > 0, there exists n₀ > 0 such that 0 ≤ f(n) < cg(n) for all n ≥ n₀}
Ex: 2n = o(n²) but 2n² ≠ o(n²)
Running time:
T(n) = O(1)             if n ≤ 1
T(n) = 2T(n/2) + O(n)   if n > 1
Solving Recurrence Relations
Substitution method:
Make a guess
Verify the guess using induction
Recursion trees:
Visualize how the recurrence unfolds
May lead to a guess to be verified using substitution
If done carefully, may lead to an exact solution
Master theorem:
“Cook-book” solution to a common class of
recurrence relations
Substitution Method
Three steps:
1. Guess the form of the solution.
2. Prove that the guess is correct, assuming that it can be verified for all n less than some n₀ (inductive step).
3. Verify the guess for all n ≤ n₀ (base case).
Why do we switch the two parts of the inductive proof? Because our guess is vague: the constants are not fixed until the inductive step succeeds, and only then can the base case be checked for a suitable n₀.
Example:
T(n) ≤ cn log n (that is, T(n) = O(n log n))
T(n) = 2T(n/2) + bn log n
     ≤ 2(c(n/2) log(n/2)) + bn log n
     = cn(log n − log 2) + bn log n
     = cn log n − cn + bn log n
Wrong: we cannot make this last line be less than cn log n.
Guess-and-Test Method
Recall the recurrence equation:
T(n) = b                    if n < 2
T(n) = 2T(n/2) + bn log n   if n ≥ 2
Guess #2: T(n) ≤ cn log² n.
T(n) = 2T(n/2) + bn log n
     ≤ 2(c(n/2) log²(n/2)) + bn log n
     = cn(log n − log 2)² + bn log n
     = cn log² n − 2cn log n + cn + bn log n
     ≤ cn log² n, if c > b.
So, T(n) is O(n log² n).
In general, to use this method, you need to have a good guess and you
need to be good at induction proofs.
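As a numeric spot-check of Guess #2 (my own sketch, not part of the slides; the concrete values b = 1 and c = 2 > b are assumptions), the recurrence can be evaluated for powers of two and compared against cn log² n:

```cpp
#include <cmath>

// T(n) = 2T(n/2) + b*n*log2(n) with T(n) = b for n < 2, taking b = 1.
double Trec(double n) {
    if (n < 2) return 1.0;
    return 2.0 * Trec(n / 2.0) + n * std::log2(n);
}

// Check the guess T(n) <= c * n * log2(n)^2 with c = 2 for n = 2^1 .. 2^maxPow.
bool boundHolds(int maxPow) {
    for (int p = 1; p <= maxPow; ++p) {
        double n = std::pow(2.0, p);
        double lg = std::log2(n);
        if (Trec(n) > 2.0 * n * lg * lg) return false;
    }
    return true;
}
```

This only checks finitely many points, of course; the induction proof above is what establishes the bound for all n.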
Master Method
Let a ≥ 1 and b > 1, let f(n) be a function over the positive integers, and let
T(n) be given by the following recurrence:
T(n) = aT(n/b) + f(n)
It falls into one of the following three cases, with n^(log_b a) as the benchmark:
Case 1: if f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)).
Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) log n).
Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and af(n/b) ≤ cf(n) for some c < 1 (regularity), then T(n) = Θ(f(n)).
Example: T(n) = 2T(n/2) + n log n
Solution: here a = 2, b = 2, log_b a = 1, f(n) = n log n, k = 1.
f(n) = n log n = Θ(n^(log_b a) log n), so the extended case 2 (with the extra logᵏ n factor) says T(n) is Θ(n log² n).
Example 3
T(n) = 3T(n/4) + n lg n
a = 3, b = 4; n^(log₄ 3) ≈ n^0.793
f(n) = n lg n; f(n) = Ω(n^(log₄ 3 + ε)) with ε ≈ 0.2
Case 3:
Regularity condition:
af(n/b) = 3(n/4) lg(n/4) ≤ (3/4) n lg n = cf(n) for c = 3/4
T(n) = Θ(n lg n)
T(n) = 2T(n/2) + n lg n
a = 2, b = 2; n^(log₂ 2) = n¹
f(n) = n lg n; is f(n) = Ω(n^(1+ε)) for some ε > 0?
No: n lg n / n¹ = lg n, which grows more slowly than n^ε for every ε > 0.
Neither case 3 nor case 2 applies!
Master Method, Example
4. T(n) = T(n/3) + n log n
Solution:
b = 3, a = 1, log_b a = 0, f(n) = n log n = Ω(n^(log_b a + ε)) for ε = 1,
so case 3 says T(n) is Θ(n log n).
Regularity condition: af(n/b) ≤ cf(n) for some c < 1.
For this case a = 1, b = 3:
1 · (n/3) log(n/3) = (1/3) n log(n/3) ≤ (1/3) n log n = cf(n) for c = 1/3.
Therefore the solution is T(n) = Θ(n log n).
Example 5: T(n) = 8T(n/2) + n²
a = 8, b = 2; n^(log₂ 8) = n³; f(n) = n² = O(n^(3−ε)) with ε = 1, so case 1 gives T(n) = Θ(n³).
Example: T(n) = T(n/3) + n
n^(log_b a) = n^(log₃ 1) = n⁰ = 1
f(n) = n = Ω(n^(0+ε)) for ε = 1
We still need to check the regularity condition: af(n/b) = n/3 = (1/3)f(n), so c = 1/3 < 1.
T(n) = Θ(f(n)) = Θ(n)   (case 3)
Example 9
T(n) = 9T(n/3) + n^2.5
a = 9, b = 3, and f(n) = n^2.5,
so n^(log_b a) = n^(log₃ 9) = n².
f(n) = Ω(n^(2+ε)) with ε = 1/2.
Case 3 applies if af(n/b) ≤ cf(n) for some c < 1:
af(n/b) = 9(n/3)^2.5 = (1/3)^0.5 f(n).
Using c = (1/3)^0.5, case 3 applies and
T(n) = Θ(n^2.5).
Suppose T(n) = aT(n/b) + cnᵏ for n > 1 and n a power of b,
T(1) = d,
where b ≥ 2 and k ≥ 0 are integers, a > 0, c > 0, d ≥ 0.
Then:
T(n) = Θ(nᵏ)            if a < bᵏ
T(n) = Θ(nᵏ lg n)       if a = bᵏ
T(n) = Θ(n^(log_b a))   if a > bᵏ
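As a sanity check of the first case (a < bᵏ), the sketch below (names and constants are mine, not from the notes) evaluates T(n) = 2T(n/2) + n² with T(1) = 1, where a = 2, b = 2, k = 2:

```cpp
#include <cstdint>

// T(n) = 2T(n/2) + n*n, T(1) = 1, for n a power of two.
// Here a = 2, b = 2, k = 2, so a < b^k and the theorem gives Theta(n^k) = Theta(n^2).
std::int64_t T(std::int64_t n) {
    if (n == 1) return 1;
    return 2 * T(n / 2) + n * n;
}
```

Unrolling the recurrence gives the exact value 2n² − n for powers of two, which is Θ(n²) as the theorem predicts.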
Iteration method
In iterative substitution we repeatedly apply the recurrence equation to itself and see if we can find a pattern.
The idea is to expand (iterate) the recurrence and express it as a summation of terms depending only on n and the initial conditions.
Ex:
T(n) = k + T(n/2)   if n > 1
     = c            if n = 1
T(n) = k + T(n/2)
T(n/2) = k + T(n/4)
T(n) = k + k + T(n/4)
Repeating this process we get
T(n) = k + k + k + T(n/8)
And repeating over and over we get
T(n) = k + k + … + k + T(1)
     = k + k + … + k + c
How many k's are there? The number of times we can divide n by 2 to get down to 1, that is, log n. Thus
T(n) = (log n)·k + c, where k and c are both constants.
Thus
T(n) = O(log n)
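The result of the iteration can be checked directly for powers of two (a small sketch of mine; the function name iterT is not from the notes):

```cpp
// T(n) = k + T(n/2) with T(1) = c; the iteration above gives
// T(n) = k * log2(n) + c for n a power of two.
long iterT(long n, long k, long c) {
    if (n == 1) return c;
    return k + iterT(n / 2, k, c);
}
```

For example, with k = 5 and c = 7, iterT(16, 5, 7) equals 5·log₂(16) + 7 = 27.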
Repeated Substitution Method
• Let's find the running time of merge sort (let's assume that n = 2^k, for some k).
T(n) = 1             if n = 1
T(n) = 2T(n/2) + n   if n > 1
T(n) = 2T(n/2) + n                  substitute
     = 2(2T(n/4) + n/2) + n         expand
     = 2²T(n/4) + 2n                substitute
     = 2²(2T(n/8) + n/4) + 2n       expand
     = 2³T(n/8) + 3n                observe the pattern
T(n) = 2ⁱT(n/2ⁱ) + in
     = 2^(lg n) T(n/n) + n lg n = n + n lg n
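The closed form n lg n + n obtained by substitution can be confirmed numerically (a sketch of mine; assumes n is a power of two):

```cpp
#include <cstdint>

// Merge sort recurrence: T(n) = 2T(n/2) + n, T(1) = 1.
// Repeated substitution gives T(n) = n*lg(n) + n for n = 2^k.
std::int64_t msT(std::int64_t n) {
    if (n == 1) return 1;
    return 2 * msT(n / 2) + n;
}
```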
Tower of Hanoi Problem
n discs are stacked on pole A. We should move them to pole B, keeping the following constraints:
- We can move a single disc at a time.
- We can move only discs that are placed on the top of their pole.
- A disc may be placed only on top of a larger disc, or on an empty pole.
- The third pole, C, can be used to temporarily hold discs.
Analyze the given solution for the Towers of Hanoi problem: how many moves are needed to complete the task?
Recursive Solution
Recursive Algorithm
#include <iostream>
#include <string>
using namespace std;

void Move( const string& from, const string& to ) /* report one disc move */
{ cout << from << " -> " << to << "\n"; }

void Hanoi( int n, string a, string b, string c )
{
    if (n == 1)                  /* base case: move the single disc directly */
        Move( a, b );
    else {                       /* recursion */
        Hanoi( n-1, a, c, b );   /* move n-1 discs from a to c, using b as spare */
        Move( a, b );            /* move the largest disc from a to b */
        Hanoi( n-1, c, b, a );   /* move n-1 discs from c to b, using a as spare */
    }
}
T(n) = the number of moves needed in order to move n disks
from tower A to tower B.
T(n-1) = Number of moves required in order to move n-1 disks
from tower A to tower C
1 = One move is needed in order to put the largest disk in tower
B
T(n-1) = Number of moves required in order to move n-1 disks
from tower C to tower B
Show that the number of moves T(n) required by the algorithm
to solve the n-disk problem satisfies the recurrence relation:
T(1) = 1
T(n) = 2T(n-1) + 1
Guess and Prove
Calculate M(n) for small n and look for a pattern.
n    M(n)
1    1
2    3
3    7
4    15
5    31
(Pattern: M(n) = 2ⁿ − 1.)
Analysis of the Recursive Towers of Hanoi Algorithm
By the iterative substitution method
Expanding:
T(1) = a                            (1)
T(n) = 2T(n − 1) + b   if n > 1     (2)
     = 2[2T(n − 2) + b] + b = 2²T(n − 2) + 2b + b                                by substituting T(n − 1) in (2)
     = 2²[2T(n − 3) + b] + 2b + b = 2³T(n − 3) + 2²b + 2b + b                    by substituting T(n − 2) in (2)
     = 2³[2T(n − 4) + b] + 2²b + 2b + b = 2⁴T(n − 4) + 2³b + 2²b + 2¹b + 2⁰b     by substituting T(n − 3) in (2)
     = …
     = 2ᵏT(n − k) + b[2^(k−1) + 2^(k−2) + … + 2¹ + 2⁰]
Setting k = n − 1 gives T(n) = 2^(n−1)·a + b(2^(n−1) − 1), so T(n) = Θ(2ⁿ).
In particular, with a = b = 1 this is the move count M(n) = 2M(n − 1) + 1, M(1) = 1, whose solution is M(n) = 2ⁿ − 1.
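The move-count recurrence M(1) = 1, M(n) = 2M(n − 1) + 1 and its closed form 2ⁿ − 1 can be verified with a few lines (function name is my own):

```cpp
// Number of moves made by the recursive Hanoi algorithm:
// M(1) = 1, M(n) = 2M(n-1) + 1, which solves to 2^n - 1.
long long hanoiMoves(int n) {
    if (n == 1) return 1;
    return 2 * hanoiMoves(n - 1) + 1;
}
```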
Pivot element
A: [ elements y ≤ x | pivot x | elements y ≥ x ]
QUICKSORT(A, p, r)
    if p < r
        then q ← PARTITION(A, p, r)
             QUICKSORT(A, p, q − 1)
             QUICKSORT(A, q + 1, r)
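The slides call PARTITION without defining it; a Lomuto-style partition (last element as pivot) is one common choice and is sketched here, together with QUICKSORT, in C++:

```cpp
#include <vector>
#include <utility>

// Lomuto-style partition around the last element as pivot (one common
// choice; the slides leave the PARTITION scheme unspecified).
int Partition(std::vector<int>& A, int p, int r) {
    int x = A[r];                    // pivot
    int i = p - 1;                   // boundary of the "<= pivot" region
    for (int j = p; j < r; ++j)
        if (A[j] <= x)
            std::swap(A[++i], A[j]); // grow the "<= pivot" region
    std::swap(A[i + 1], A[r]);       // place the pivot in its final position
    return i + 1;                    // pivot's final index q
}

void QuickSort(std::vector<int>& A, int p, int r) {
    if (p < r) {
        int q = Partition(A, p, r);
        QuickSort(A, p, q - 1);
        QuickSort(A, q + 1, r);
    }
}
```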
Partition Algorithm
[Figure: partition trace — 1. the initial array to be sorted; 2. swap elements that lie on the wrong side of the pivot]
Quick Sort
1. Scan
• If i and j both stop at elements equal to the pivot, and the array is made up of all duplicates, then every comparison ends in a swap. In this case all subsets in the recursion split evenly, and the complexity is O(N log N).
Analysis of Quick Sort