
CS 203

Data Structure & Algorithms

Performance Analysis
Time Complexity
// Loop 1: statement executes N times
for (i = 0; i < N; i++) {
    statement;
}

// Loop 2: statement executes N * N times
for (i = 0; i < N; i++) {
    for (j = 0; j < N; j++) {
        statement;
    }
}
• On a server machine, one statement requires 0.1 ns
• On a desktop machine, one statement requires 0.2 ns

Which one is better?

When we calculate the time complexity of an algorithm, we consider only how the running time grows with the input size and ignore everything else (processor speed, compiler, and so on), as those factors are machine dependent
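
To make this concrete, here is a minimal C++ sketch (the names stepsSingleLoop and stepsNestedLoop are illustrative, not from the slides) that counts executed statements as a function of N, which is the machine-independent quantity we actually analyze:

#include <cstdio>

// Count basic operations as a function of N (illustrative sketch):
// the count is the same on any machine, unlike wall-clock time.
long long stepsSingleLoop(long long N) {
    long long steps = 0;
    for (long long i = 0; i < N; i++)
        steps++;                    // statement runs N times -> O(N)
    return steps;
}

long long stepsNestedLoop(long long N) {
    long long steps = 0;
    for (long long i = 0; i < N; i++)
        for (long long j = 0; j < N; j++)
            steps++;                // statement runs N * N times -> O(N^2)
    return steps;
}

int main() {
    printf("%lld %lld\n", stepsSingleLoop(1000), stepsNestedLoop(1000));
    // prints: 1000 1000000
}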
Asymptotic Notation
• Asymptotic notation is a mathematical representation of an algorithm's complexity
– To represent the complexity of an algorithm, keep only the most significant term and ignore the less significant terms

• Algorithm 1 : 5n^2 + 2n + 1

• Algorithm 2 : 10n^2 + 8n + 3

In both, the n^2 term dominates for large n, so each is represented simply as n^2.

• We mainly use THREE types of asymptotic notation, as follows...
– Big - Oh (O)
– Big - Omega (Ω)
– Big - Theta (Θ)
Asymptotic Notation: Big - Oh Notation (O)
• Big - Oh notation is used to define the upper bound of an
algorithm in terms of Time Complexity.
• Let f(n) be the time complexity of an algorithm and let g(n) be its most significant term. If f(n) <= C·g(n) for all n >= n0, for some constants C > 0 and n0 >= 1, then we can represent f(n) as O(g(n)).

f(n) = O(g(n))
Asymptotic Notation: Big - Oh Notation (O)
• Example
– Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as O(g(n)), then we must find constants C > 0 and n0 >= 1 such that f(n) <= C·g(n) for all n >= n0
– f(n) <= C·g(n)
3n + 2 <= C·n

The condition holds for C = 4 and all n >= 2 (since 3n + 2 <= 4n exactly when n >= 2).

Using Big - Oh notation, we can represent the time complexity as follows...
3n + 2 = O(n)
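
As a quick sanity check (a finite-range test, not a proof, and not from the original slides), the witnesses C = 4 and n0 = 2 can be verified mechanically:

#include <cassert>

int main() {
    // Check f(n) = 3n + 2 <= 4 * g(n) = 4n for n = 2 .. 10^6.
    // A finite check only illustrates the bound; the real argument
    // is the algebra above: 3n + 2 <= 4n  <=>  n >= 2.
    for (long long n = 2; n <= 1000000; n++)
        assert(3 * n + 2 <= 4 * n);
    return 0;
}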
Asymptotic Notation: Big - Omega Notation (Ω)
• Big - Omega notation is used to define the lower bound of
an algorithm in terms of Time Complexity.
• Let f(n) be the time complexity of an algorithm and let g(n) be its most significant term. If f(n) >= C·g(n) for all n >= n0, for some constants C > 0 and n0 >= 1, then we can represent f(n) as Ω(g(n)).

f(n) = Ω(g(n))
Asymptotic Notation: Big - Omega Notation (Ω)
• Example
– Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as Ω(g(n)), then we must find constants C > 0 and n0 >= 1 such that f(n) >= C·g(n) for all n >= n0
– f(n) >= C·g(n)
3n + 2 >= C·n
• The condition holds for C = 1 and all n >= 1 (since 3n + 2 >= n for every n >= 1).
Using Big - Omega notation, we can represent the time complexity as follows...
3n + 2 = Ω(n)
Asymptotic Notation: Big - Theta Notation (Θ)
• Big - Theta notation is used to define the tight bound of an algorithm in terms of Time Complexity: it bounds the running time from above and below.
• Let f(n) be the time complexity of an algorithm and let g(n) be its most significant term. If C1·g(n) <= f(n) <= C2·g(n) for all n >= n0, for some constants C1, C2 > 0 and n0 >= 1, then we can represent f(n) as Θ(g(n)).

f(n) = Θ(g(n))
Asymptotic Notation: Big - Theta Notation (Θ)
• Example
– Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as Θ(g(n)), then we must find constants C1, C2 > 0 and n0 >= 1 such that C1·g(n) <= f(n) <= C2·g(n) for all n >= n0
– C1·g(n) <= f(n) <= C2·g(n)
C1·n <= 3n + 2 <= C2·n

• The condition holds for C1 = 1, C2 = 4 and all n >= 2.

Using Big - Theta notation, we can represent the time complexity as follows...
3n + 2 = Θ(n)
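
The same kind of finite-range check (again an illustrative sketch, not a proof) confirms both Theta bounds at once:

#include <cassert>

int main() {
    // Check C1 * n <= 3n + 2 <= C2 * n with C1 = 1, C2 = 4 for n = 2 .. 10^6.
    for (long long n = 2; n <= 1000000; n++) {
        assert(1 * n <= 3 * n + 2);   // lower bound, holds for all n >= 1
        assert(3 * n + 2 <= 4 * n);   // upper bound, holds for all n >= 2
    }
    return 0;
}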
o-notation
For a given function g(n), the set little-o:

o(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that for all n >= n0, we have 0 <= f(n) < c·g(n) }

f(n) becomes insignificant relative to g(n) as n approaches infinity:

lim (n → ∞) [f(n) / g(n)] = 0

g(n) is an upper bound for f(n) that is not asymptotically tight.

Observe the difference in this definition from the previous ones. Why?
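
The difference is that the inequality must hold for every c > 0, not just for some c. For example, with f(n) = 3n + 2 and g(n) = n^2, the ratio (3n + 2)/n^2 tends to 0, so 3n + 2 = o(n^2); but 3n + 2 is not o(n), since (3n + 2)/n tends to 3, not 0, even though 3n + 2 = O(n).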
ω-notation
For a given function g(n), the set little-omega:

ω(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that for all n >= n0, we have 0 <= c·g(n) < f(n) }

f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:

lim (n → ∞) [f(n) / g(n)] = ∞

g(n) is a lower bound for f(n) that is not asymptotically tight.
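
For example, n^2 = ω(n), since n^2 / n = n tends to infinity; but n^2 is not ω(n^2), since that ratio tends to 1.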


Big-O Comparisons

Function A              vs.   Function B
n^3 + 2n^2                    100n^2 + 1000
n^0.1                         log n
n + 100n^0.1                  2n + 10 log n
5n^5                          n!
n^(-15) · 2^(n/100)           1000n^15
8^(2 log n)                   3n^7 + 7n
Race 1

n^3 + 2n^2 vs. 100n^2 + 1000


Race 2

n^0.1 vs. log n

In this one, the crossover point comes very late! So, which algorithm is really better???
Race C

n + 100n^0.1 vs. 2n + 10 log n

Is the “better” algorithm asymptotically better???


Race 4

5n^5 vs. n!
Race 5

n^(-15) · 2^(n/100) vs. 1000n^15


Race VI

8^(2 log n) vs. 3n^7 + 7n


Big-O Winners (i.e. losers)

Function A              vs.   Function B          Winner
n^3 + 2n^2                    100n^2 + 1000       O(n^2)
n^0.1                         log n               O(log n)
n + 100n^0.1                  2n + 10 log n       O(n) TIE
5n^5                          n!                  O(n^5)
n^(-15) · 2^(n/100)           1000n^15            O(n^15)
8^(2 log n)                   3n^7 + 7n           O(n^6) why???
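
The last row follows because, taking log base 2, 8^(2 log n) = (2^3)^(2 log n) = 2^(6 log n) = n^6, which is asymptotically smaller than 3n^7 + 7n.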


Properties
• Theorem:
f(n) = Θ(g(n)) ⇔ f(n) = O(g(n)) and f(n) = Ω(g(n))
• Transitivity:
– f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
– Same for O and Ω
• Reflexivity:
– f(n) = Θ(f(n))
– Same for O and Ω
• Symmetry:
– f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
• Transpose symmetry:
– f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
Properties
• Transitivity
f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n))
f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n))
f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))

• Reflexivity
f(n) = Θ(f(n))
f(n) = O(f(n))
f(n) = Ω(f(n))
Properties
• Symmetry
f(n) = Θ(g(n)) iff g(n) = Θ(f(n))

• Complementarity
f(n) = O(g(n)) iff g(n) = Ω(f(n))
f(n) = o(g(n)) iff g(n) = ω(f(n))
Kinds of Analysis

Running time may depend on the actual input, not just the length of the input

Distinguish
– Worst case
• Your worst enemy chooses the input
– Average case
• Assume a probability distribution over inputs
– Amortized
• Average cost per operation over a sequence of operations
– Best case (not too useful)
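
A minimal linear-search sketch (illustrative, not from the slides) shows how one piece of code has different best-, average-, and worst-case costs:

#include <cstdio>
#include <vector>

// Linear search: return the index of key in a, or -1 if absent.
int linearSearch(const std::vector<int>& a, int key) {
    for (int i = 0; i < (int)a.size(); i++)
        if (a[i] == key) return i;
    return -1;
}
// Best case:    key at index 0                  -> 1 comparison,     O(1)
// Worst case:   key last or absent              -> n comparisons,    O(n)
// Average case: key equally likely at any index -> ~n/2 comparisons, O(n)

int main() {
    std::vector<int> a = {4, 8, 15, 16, 23, 42};
    printf("%d %d\n", linearSearch(a, 4), linearSearch(a, 42)); // 0 5
}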
Analyzing Code

C++ operations        constant time
Consecutive stmts     sum of the individual times
Conditionals          time of the test plus the larger branch
Loops                 sum of the per-iteration costs
Function calls        cost of the function body
Recursive functions   solve a recurrence
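
Applying these rules to a small fragment (an illustrative sketch, not from the slides):

#include <cstdio>
#include <vector>

// Sum a[i] + a[j] over all pairs (i, j); each rule is noted inline.
long long sumPairs(const std::vector<int>& a) {
    long long total = 0;                  // constant-time statement: O(1)
    int n = (int)a.size();                // O(1)
    for (int i = 0; i < n; i++)           // loop: n iterations
        for (int j = 0; j < n; j++)       // inner loop: n iterations per i
            total += a[i] + a[j];         // constant-time body: O(1)
    return total;                         // O(1)
}
// Consecutive statements sum: O(1) + O(1) + n * (n * O(1)) + O(1) = O(n^2)

int main() {
    std::vector<int> a = {1, 2, 3};
    printf("%lld\n", sumPairs(a));        // prints 36
}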
End of Lecture 2
