
26-4-21

PCC-CS404
Design and Analysis of Algorithms

Asymptotic Notations
Recurrence Relation
Asymptotic Notations

Commonly used asymptotic notations to express the running time complexity of an algorithm:
• O (Big-O) Notation
• Ω (Big-Omega) Notation
• Θ (Theta) Notation

2
Big-O
f(n) = O(g(n)): there exist positive constants c and n0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n0

What does it mean?


– If f(n) = O(n²), then:
• f(n) can be larger than n² sometimes, but…
• We can choose some constant c and some value n0 such that for every value of n larger than n0: f(n) < cn²
• That is, for values larger than n0, f(n) is never more than a constant multiplier greater than n²
• Or, in other words, f(n) does not grow more than a constant factor faster than n².
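
An extra illustration (added here, not from the original slide; the function f(n) = 3n² + 10 is chosen only as an example):
3n² + 10 ≤ 3n² + n² = 4n²  for all n ≥ 4 (since n² ≥ 16 > 10 there),
so 3n² + 10 = O(n²) with c = 4 and n0 = 4.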

3
Big-O(cont.)
Visualization of O(g(n)) [figure]: for all n ≥ n0, f(n) lies on or below cg(n).

4
Big-O(cont.)

Example:

Consider f(n) = n².

n² = O(n²): n² ≤ c·n² holds for any c ≥ 1, so take c = 1 and n0 = 1.

5
Example
Prove that 5n + 3 = O(n)
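
One standard way to argue this (a sketch added here, not reproduced from the original slide):
5n + 3 ≤ 5n + 3n = 8n  for all n ≥ 1,
so 0 ≤ 5n + 3 ≤ 8n for all n ≥ 1, i.e. 5n + 3 = O(n) with c = 8 and n0 = 1.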

6
Example

7
Big-O(cont.)

8
Big-O(cont.)

Example:

• Prove that 20n² + 2n + 5 = O(n²)
• Let c = 21 and n0 = 4
• 21n² > 20n² + 2n + 5 for all n > 4
  ⇔ n² > 2n + 5 for all n > 4, which is TRUE

9
Big Omega – Notation
• () – A lower bound
f  n    g  n   : there exist positive constants c and n 0 such that
0  f  n   cg  n  for all n  n0

Consider F(n) = n2
– n2 = (n)
– Let c = 1, n0 = 2
– For all n  2, n2 > 1  n
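
An additional worked example (added here; the same function 5n + 3 is used again later for the Θ example):
5n + 3 ≥ 5n ≥ 1·n  for all n ≥ 1,
so 5n + 3 = Ω(n) with c = 1 and n0 = 1 (in fact any c ≤ 5 works).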

10
Big Omega – Notation(cont.)
Visualization of Ω(g(n)) [figure]: for all n ≥ n0, f(n) lies on or above cg(n).

11
Example

12
Θ-Notation

Θ provides a tight bound

f(n) = Θ(g(n)): there exist positive constants c1, c2, and n0 such that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0

f(n) = Θ(g(n)) ⟺ f(n) = O(g(n)) AND f(n) = Ω(g(n))

13
Θ-Notation (cont.)
Visualization of Θ(g(n)) [figure]: for all n ≥ n0, f(n) lies between c1g(n) and c2g(n).

14
Example
Prove that 5n + 3 = Θ(n)
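
A sketch of the proof (added here, combining the O and Ω bounds worked out earlier):
1·n ≤ 5n + 3 ≤ 8n  for all n ≥ 1,
so c1 = 1, c2 = 8 and n0 = 1 satisfy c1·n ≤ 5n + 3 ≤ c2·n, hence 5n + 3 = Θ(n).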

15
Classifying functions by their
Asymptotic Growth Rates
• O(g(n)), Big-Oh of g of n, the Asymptotic Upper Bound;
• Θ(g(n)), Theta of g of n, the Asymptotic Tight Bound; and
• Ω(g(n)), Omega of g of n, the Asymptotic Lower Bound.
Growth of Functions
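
Common functions, listed from slowest to fastest asymptotic growth (a standard ordering, added here for reference):
1 < log n < √n < n < n log n < n² < n³ < 2ⁿ < n!
(each grows asymptotically slower than the next).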

17
Recursion
Recursion is a particularly powerful kind of reduction, which can be described loosely as
follows:
• If the given instance of the problem is small or simple enough, just solve it.
• Otherwise, reduce the problem to one or more simpler instances of the same problem.

Recursion is generally expressed in terms of recurrences. In other words, when an algorithm calls itself, we can often describe its running time by a recurrence equation, which expresses the overall running time for a problem of size n in terms of the running time on smaller inputs.
For example, the worst-case running time T(n) of the merge sort procedure can be expressed by the recurrence
T(n) = Θ(1)               if n = 1
T(n) = 2T(n/2) + Θ(n)     if n > 1
whose solution is T(n) = Θ(n log n).
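
A minimal Python sketch of merge sort (added for illustration; the name merge_sort and the list-based merging are choices made here, not taken from the slides). It follows the recurrence above: the base case does Θ(1) work, the two recursive calls contribute 2T(n/2), and the merge loop does Θ(n) work.

def merge_sort(a):
    # Base case: a list of 0 or 1 elements is already sorted -> Θ(1) work
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    # Two recursive calls on the halves of the input -> 2T(n/2)
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merge the two sorted halves -> Θ(n) work
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged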

18
There are various techniques to solve recurrences.

1. Substitution Method
2. Iteration Method
3. Recursion Tree Method
4. Master Method

19
Substitution Method
The substitution method consists of two main steps:

1. Guess the solution.

2. Use mathematical induction to find the boundary condition and show that the guess is correct.
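
A worked sketch of the method on the merge-sort recurrence T(n) = 2T(n/2) + n (added here; it follows the usual textbook treatment, with lg denoting log base 2):
Guess: T(n) = O(n lg n), i.e. T(n) ≤ c·n·lg n for some constant c > 0.
Inductive step: assume T(n/2) ≤ c·(n/2)·lg(n/2). Then
T(n) ≤ 2·c·(n/2)·lg(n/2) + n
     = c·n·(lg n − 1) + n
     = c·n·lg n − c·n + n
     ≤ c·n·lg n        for any c ≥ 1.
Finally, check small values of n (the boundary condition) to choose a suitable n0.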

20
Substitution Method

21
Substitution Method

22
END

23
