
ASYMPTOTIC ANALYSIS

Growth of Functions
Changing the hardware/ software environment
•Affects T(n) by a constant factor
•Does not alter the growth rate of T(n)
Although we can sometimes determine the exact running time
of an algorithm, the extra precision is not usually worth the effort
of computing it.

For large inputs, the multiplicative constants and lower order


terms of an exact running time are dominated by the effects of
the input size itself.
n      log n   n       n log n   n^2         n^3             2^n
4      2       4       8         16          64              16
8      3       8       24        64          512             256
16     4       16      64        256         4,096           65,536
32     5       32      160       1,024       32,768          4,294,967,296
64     6       64      384       4,096       262,144         1.84 * 10^19
128    7       128     896       16,384      2,097,152       3.40 * 10^38
256    8       256     2,048     65,536      16,777,216      1.15 * 10^77
512    9       512     4,608     262,144     134,217,728     1.34 * 10^154
1024   10      1,024   10,240    1,048,576   1,073,741,824   1.79 * 10^308

The Growth Rate of the Six Popular Functions
Asymptotic Complexity

•Running time (order of growth or rate of growth) of an algorithm
as a function of the input size n, for large n.

•Expressed using only the highest-order term in the
expression for the exact running time.
-Instead of the exact running time, we say Θ(n²).

•Describes the behavior of the function in the limit.

•Written using asymptotic notation.


Asymptotic Notation

•Θ, O, Ω, o, ω

•Defined for functions over the natural numbers.
Ex: f(n) = Θ(n²)
describes how f(n) grows in comparison to n².

•Each notation defines a set of functions; in practice it is used to
compare the sizes of two functions.

•The notations describe different rate-of-growth relations between
the defining function and the defined set of functions.
Θ-notation

For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:

Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0 such that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0}

g(n) is an asymptotically tight bound for f(n).

A function f(n) belongs to the set Θ(g(n)) if there exist positive
constants c1 and c2 such that it can be "sandwiched" between
c1·g(n) and c2·g(n) for sufficiently large n.

f(n) ∈ Θ(g(n)) means that there exist constants c1 and c2 such that
c1·g(n) ≤ f(n) ≤ c2·g(n) for large enough n.
Example: show that (1/2)n² − 3n = Θ(n²).

•Determine positive constants c1, c2, and n0 such that
c1·n² ≤ (1/2)n² − 3n ≤ c2·n² for all n ≥ n0.

•Dividing by n²:
c1 ≤ 1/2 − 3/n ≤ c2

•The right-hand inequality can be made to hold for any value
of n ≥ 1 by choosing c2 ≥ 1/2.

•The left-hand inequality can be made to hold for any value
of n ≥ 7 by choosing c1 ≤ 1/14 (at n = 7, 1/2 − 3/7 = 1/14).

Thus, by choosing c1 = 1/14, c2 = 1/2, and n0 = 7,
we can verify that (1/2)n² − 3n = Θ(n²).
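As a sanity check, the constants derived above can be verified numerically; this sketch uses exact rational arithmetic (Python's `fractions`) so that the boundary case n = 7, where the lower bound holds with equality, is not disturbed by rounding.

```python
from fractions import Fraction

# Constants from the derivation above: c1 = 1/14, c2 = 1/2, n0 = 7.
c1, c2, n0 = Fraction(1, 14), Fraction(1, 2), 7

def f(n):
    # f(n) = (1/2)n^2 - 3n, kept exact with Fraction
    return Fraction(n * n, 2) - 3 * n

# Check the Θ(n^2) sandwich c1*n^2 <= f(n) <= c2*n^2 for n >= n0.
for n in range(n0, 10_000):
    assert c1 * n * n <= f(n) <= c2 * n * n
```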


O-notation

For a function g(n), we define O(g(n)), big-O of g(n), as the set:

O(g(n)) = {f(n) : ∃ positive constants c and n0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n0}

•Set of all functions whose
rate of growth is the same
as or lower than that of g(n).

•g(n) is an asymptotic upper
bound for f(n).

f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)).  Θ(g(n)) ⊂ O(g(n)).


Examples

O(g(n)) = {f(n) : ∃ positive constants c and n0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n0}

1. 7n − 2 = O(n)
need c > 0 and n0 ≥ 1 such that 7n − 2 ≤ c·n for n ≥ n0;
this is true for c = 7 and n0 = 1.

2. 3 log n + log log n = O(log n)
need c > 0 and n0 ≥ 1 such that 3 log n + log log n ≤ c·log n for n ≥ n0;
this is true for c = 4 and n0 = 2.
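Both witness pairs (c, n0) above can be spot-checked numerically; a small sketch, assuming base-2 logarithms for the second example:

```python
import math

# Example 1: 7n - 2 <= 7n for all n >= 1 (c = 7, n0 = 1).
for n in range(1, 1000):
    assert 7 * n - 2 <= 7 * n

# Example 2: 3 log n + log log n <= 4 log n for all n >= 2 (c = 4, n0 = 2),
# since log log n <= log n on this range.
for n in range(2, 1000):
    assert 3 * math.log2(n) + math.log2(math.log2(n)) <= 4 * math.log2(n)
```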
Algorithm prefixAverages2(X, n)
  Input: array X of n integers
  Output: array A of prefix averages of X      # operations
  A ← new array of n integers                  n
  s ← 0                                        1
  for i ← 0 to n − 1 do                        n
    s ← s + X[i]                               n
    A[i] ← s / (i + 1)                         n
  return A                                     1

Algorithm prefixAverages2 runs in O(n) time.
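Rendered as runnable Python, the same algorithm looks like this; it makes a single pass with a running sum, matching the O(n) operation counts above.

```python
def prefix_averages(X):
    # Prefix averages in O(n): A[i] is the average of X[0..i].
    n = len(X)
    A = [0.0] * n          # new array of n entries
    s = 0                  # running sum of the first i+1 elements
    for i in range(n):
        s += X[i]
        A[i] = s / (i + 1)
    return A

print(prefix_averages([1, 2, 3, 4]))  # → [1.0, 1.5, 2.0, 2.5]
```

Keeping the running sum is what avoids the nested loop of the naive O(n²) version that recomputes each prefix sum from scratch.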


Ω-notation

For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:

Ω(g(n)) = {f(n) : ∃ positive constants c and n0 such that
0 ≤ c·g(n) ≤ f(n) for all n ≥ n0}

•Set of all functions whose rate of
growth is the same as or higher
than that of g(n).

•g(n) is an asymptotic lower
bound for f(n).

f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)).  Θ(g(n)) ⊂ Ω(g(n)).

Example: 100n + 5 ≠ Ω(n²)

To find c, n0 such that: 0 ≤ c·n² ≤ 100n + 5 for all n ≥ n0.

100n + 5 ≤ 100n + 5n = 105n (for n ≥ 1)

So c·n² ≤ 105n. Since n is positive, dividing by n gives c·n ≤ 105,
i.e. n ≤ 105/c.

⇒ contradiction: n cannot be bounded above by a constant, so no
such c and n0 exist.
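The contradiction can be illustrated numerically: for any sample c (here an assumed c = 0.01), the inequality c·n² ≤ 100n + 5 fails as soon as n exceeds 105/c.

```python
# 100n + 5 vs n^2: whatever c > 0 we pick, c*n^2 eventually overtakes
# 100n + 5, so 100n + 5 cannot be Omega(n^2). Sample check with an
# assumed c = 0.01: the bound fails just past n = 105/c.
c = 0.01
n = int(105 / c) + 1       # first n beyond the 105/c threshold derived above
assert c * n * n > 100 * n + 5
```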


Relations Between Θ, Ω, O

Theorem: For any two functions g(n) and f(n),
f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).

•I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n))

RR
O( f ) ( f )
•f
( f )
o-notation

For a given function g(n), the set little-o:

o(g(n)) = {f(n) : for every c > 0, ∃ n0 > 0 such that
0 ≤ f(n) < c·g(n) for all n ≥ n0}

f(n) becomes insignificant relative to g(n) as n approaches
infinity:
lim (n→∞) [f(n) / g(n)] = 0

g(n) is an upper bound for f(n) that is not asymptotically tight.
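To see the limit behavior concretely, take an assumed example pair f(n) = 100n and g(n) = n², for which f = o(g): the ratio f(n)/g(n) shrinks toward 0 as n grows.

```python
# Little-o intuition: for f(n) = 100n and g(n) = n^2 (f = o(g)),
# the ratio f(n)/g(n) = 100/n tends to 0 as n grows.
def ratio(n):
    return (100 * n) / (n * n)

for n in [10, 100, 1_000, 10_000]:
    print(n, ratio(n))   # ratio decreases by 10x each step
```

So for any c > 0, however small, f(n) eventually drops below c·g(n), which is exactly the ∀c quantifier in the definition.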


ω-notation

For a given function g(n), the set little-omega:

ω(g(n)) = {f(n) : for every c > 0, ∃ n0 > 0 such that
0 ≤ c·g(n) < f(n) for all n ≥ n0}

f(n) becomes arbitrarily large relative to g(n) as n approaches
infinity:
lim (n→∞) [f(n) / g(n)] = ∞

g(n) is a lower bound for f(n) that is not asymptotically tight.


Intuitions about asymptotic notations

big-Oh
f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)
big-Omega
f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)
big-Theta
f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
little-oh
f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)
little-omega
f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)
