
Algorithm continues…
Asymptotic analysis
• Time Complexity (T(n)) - count of "elementary steps" / "instructions" executed
• There is a problem! The exact count of elementary steps T(n) depends on the distribution of the input
• Consider the linear search algorithm for input size n = 100000; check the table:

  'key' found in position    T(n)
  Nowhere                    = n
  100000                     = n
  99999                      = n - 1
  ⁞                          ⁞
  1                          = 1

• The comparison "if(A[i] == key)" inside the for loop will run (see the code sketch below):
  • n times in the worst possible situation – linear in terms of input size n
    – Element found in the last position
    – Element NOT found
  • A constant number of times in the best possible situation
    – Element found in the first position
  • n/2 times on average – linear in terms of input size n
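Below is a minimal C sketch of the linear search being counted here. The array name A and the variable key match the comparison quoted above; the sample data and the driver in main are illustrative only.

#include <stdio.h>

/* Returns the index of key in A[0..n-1], or -1 if it is absent.
   The comparison A[i] == key executes once per loop iteration:
   once in the best case (key at position 0), n times in the
   worst case (key in the last position, or not present at all). */
int linear_search(const int A[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (A[i] == key)
            return i;
    }
    return -1;
}

int main(void) {
    int A[] = {7, 3, 9, 4, 12, 5};
    int n = sizeof A / sizeof A[0];
    printf("%d\n", linear_search(A, n, 12));   /* prints 4 */
    printf("%d\n", linear_search(A, n, 100));  /* prints -1 (not found) */
    return 0;
}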

• Exact count of elementary steps is NOT feasible


• Complexity is generally expressed in terms of its relationship to some known function - a "proportionality" approach
• This type of analysis is known as asymptotic analysis: Big-Oh, Big-Omega, Big-Theta
Big – Oh notation – O( )
• Look at the following graph – what can you see?

An example
• For linear search, T(n) takes one of the values n, n-1, n-2, …, n-(n-1) = 1, depending on where the key is found
• Every one of these values is at most c·n (take c = 1, n₀ = 1)
• Therefore, f(n) = O(n)


• O(g(n)) groups together a set of functions under big-Oh notation; writing f(n) = O(n) places f(n) in that set
• This is the case for the linear search algorithm

Quick note
• The choice of n₀ and c is not unique
• There can be many (actually, infinitely many) different combinations of n₀ and c that would make the proof work (a worked instance follows below)
• One combination of n₀ and c satisfying the inequality suffices to establish the upper/lower bound
• One may choose n₀ to be as large as needed
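As a worked instance of the note above (the function f(n) = 3n + 5 is chosen here purely for illustration):
Take f(n) = 3n + 5 and g(n) = n.
• With c = 8 and n₀ = 1: 3n + 5 ≤ 8n whenever 5 ≤ 5n, i.e. for all n ≥ 1.
• With c = 4 and n₀ = 5: 3n + 5 ≤ 4n whenever 5 ≤ n, i.e. for all n ≥ 5.
Either pair establishes f(n) = O(n).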

Following the same line
• If f(n) = n² + n + 5, then
  – f(n) is O(n²)
  – f(n) is O(n³)
  – f(n) is not O(n)
• If f(n) = 3ⁿ, then
  – f(n) is O(4ⁿ)
  – f(n) is not O(2ⁿ)
• Prove 5n log₂ n + 8n − 200 = O(n log₂ n)
  Proof: 5n log₂ n + 8n − 200 ≤ 5n log₂ n + 8n
         ≤ 5n log₂ n + 8n log₂ n, for n ≥ 2 (as log₂ n ≥ 1)
         = 13n log₂ n (Hence proved.)
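The inequality just proved can also be sanity-checked numerically. The small C program below is illustrative only: the constants c = 13 and n₀ = 2 come from the proof, and the test limit is arbitrary; a spot check like this is not a substitute for the proof.

#include <stdio.h>
#include <math.h>

/* Spot-checks 5n*log2(n) + 8n - 200 <= 13*n*log2(n) for n0 <= n <= LIMIT.
   c = 13 and n0 = 2 are taken from the proof above; LIMIT is arbitrary. */
int main(void) {
    const double c = 13.0;
    const int n0 = 2, LIMIT = 1000000;
    for (int n = n0; n <= LIMIT; n++) {
        double f = 5.0 * n * log2((double)n) + 8.0 * n - 200.0;
        double g = (double)n * log2((double)n);
        if (f > c * g) {
            printf("inequality fails at n = %d\n", n);
            return 1;
        }
    }
    printf("f(n) <= 13 n log2 n holds for all n in [%d, %d]\n", n0, LIMIT);
    return 0;
}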
• Definition: f(n) is O(g(n)) if and only if there exist a real, positive constant c and a positive integer n₀ such that
  f(n) ≤ c·g(n) ∀ n ≥ n₀
• Note that O(g(n)) is a class of functions, NOT a single function; f(n) = O(g(n)) means f(n) ∈ O(g(n))
• The "Oh" notation specifies asymptotic upper bounds
Most common complexity classes (fastest to slowest)
• O(1), pronounced 'Oh of one', or constant complexity;
• O(log₂ log₂ n), 'Oh of log log en';
• O(log₂ n), 'Oh of log en', or logarithmic complexity;
• O(n), 'Oh of en', or linear complexity;
• O(n log₂ n), 'Oh of en log en';
• O(n²), 'Oh of en squared', or quadratic complexity;
• O(n³), 'Oh of en cubed', or cubic complexity;
• O(2ⁿ), 'Oh of two to the en', or exponential complexity.
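To make the ordering concrete, the short C program below (purely illustrative; the sample size n = 1024 is arbitrary) evaluates each class at one input size. 2ⁿ is evaluated at a smaller n because it explodes so quickly.

#include <stdio.h>
#include <math.h>

/* Prints each complexity class evaluated at a sample input size,
   to make the fastest-to-slowest ordering concrete. */
int main(void) {
    double n = 1024.0;                     /* sample input size */
    printf("O(1)            : %.0f\n", 1.0);
    printf("O(log2 log2 n)  : %.2f\n", log2(log2(n)));
    printf("O(log2 n)       : %.0f\n", log2(n));
    printf("O(n)            : %.0f\n", n);
    printf("O(n log2 n)     : %.0f\n", n * log2(n));
    printf("O(n^2)          : %.0f\n", n * n);
    printf("O(n^3)          : %.0f\n", n * n * n);
    printf("O(2^n), n = 30  : %.0f\n", pow(2.0, 30.0));
    return 0;
}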
Big Omega and Big Theta Notations
The Ω notation specifies asymptotic lower bounds
Definition: f(n) is said to be Ω(g(n)) if ∃ a positive real constant c and a positive integer n₀ such that
  f(n) ≥ c·g(n) ∀ n ≥ n₀

An example: Prove that 2n³ - 7n + 1 is in Ω(n³)

Here, f(n) = 2n³ - 7n + 1 and g(n) = n³
To prove f(n) = Ω(g(n)), one has to prove f(n) ≥ c·g(n) ∀ n ≥ n₀
i.e., 2n³ - 7n + 1 ≥ c·n³
Trick: choose c to be a value smaller than the coefficient of the highest power of n in f(n)
Choose n₀ = 3, c = 1; then for all n ≥ n₀,
2n³ - 7n + 1 ≥ n³ (because 7n - 1 ≤ n³ when n ≥ 3; a quick check follows below)
Therefore, by the definition of big-Omega, 2n³ - 7n + 1 is in Ω(n³)
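A quick check of the chosen constants: for n ≥ 3 we have n³ ≥ 9n ≥ 7n - 1, so 2n³ - 7n + 1 ≥ 2n³ - n³ = n³. At the boundary n = n₀ = 3 this reads f(3) = 54 - 21 + 1 = 34 ≥ 27 = 3³.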

The Θ notation describes asymptotic tight bounds
Definition 1: f(n) is Θ(g(n)) iff ∃ positive real constants c₁ and c₂ and a positive integer n₀, such that
  c₁·g(n) ≤ f(n) ≤ c₂·g(n) ∀ n ≥ n₀
Definition 2: f(n) is Θ(g(n)) iff f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
• g(n) is a tight bound on f(n)
Examples:
• 4n = Θ(n) [c₁ = 1, c₂ = 4, n ≥ 1]
• 2n² + n + 2 = Θ(n²) [c₁ = 1, c₂ = 5, n ≥ 1] (a short derivation of these constants follows below)
• A polynomial is big-Θ of its largest term.
✔ For any fixed integer k ≥ 1, 1ᵏ + 2ᵏ + ... + nᵏ is Θ(nᵏ⁺¹).
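A brief justification of the constants quoted in the second example, in the same style as the earlier big-Oh proof:
For n ≥ 1 we have n ≤ n² and 2 ≤ 2n², so
  2n² + n + 2 ≤ 2n² + n² + 2n² = 5n² (take c₂ = 5),
and clearly 2n² + n + 2 ≥ n² (take c₁ = 1).
Hence n² ≤ 2n² + n + 2 ≤ 5n² for all n ≥ 1, i.e. 2n² + n + 2 = Θ(n²).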
A few rules
• c·nᵐ = O(nᵏ) for any constant c and any m ≤ k
• O(f(n)) + O(g(n)) = O(f(n) + g(n)) = O(max(f(n), g(n))) (see the sketch below)
• O(f(n))·O(g(n)) = O(f(n)·g(n))
• O(c·f(n)) = O(f(n)) for any constant c
• c is O(1) for any constant c
• logb(n) = O(log n) for any base b
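The sum rule above can be seen in code. The sketch below uses a hypothetical two_phases function, written only to illustrate the rule: an O(n²) phase followed by an O(n) phase, so the total work is O(n² + n) = O(n²).

#include <stdio.h>

/* Illustrates O(f(n)) + O(g(n)) = O(max(f(n), g(n))):
   phase 1 performs about n*n steps, phase 2 about n steps,
   so the whole function is O(n^2 + n) = O(n^2). */
long two_phases(int n) {
    long steps = 0;
    for (int i = 0; i < n; i++)        /* phase 1: O(n^2) */
        for (int j = 0; j < n; j++)
            steps++;
    for (int i = 0; i < n; i++)        /* phase 2: O(n)   */
        steps++;
    return steps;
}

int main(void) {
    /* For n = 1000 the count is 1000*1000 + 1000 = 1001000,
       dominated by the quadratic phase. */
    printf("%ld\n", two_phases(1000));
    return 0;
}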
• INFORMAL summary
• f(n) = O(g(n)) roughly means f(n) ≤ g(n)
• f(n) = Ω(g(n)) roughly means f(n) ≥ g(n)
• f(n) = Θ(g(n)) roughly means f(n) = g(n)
Practice questions:
Question 1: Consider the following three claims
1. (n + k)ᵐ = O(nᵐ), where k and m are constants
2. 2ⁿ⁺¹ = O(2ⁿ)
3. 2²ⁿ⁺¹ = O(2ⁿ)
Which of these claims are correct?
(A) 1 and 2   (B) 1 and 3   (C) 2 and 3   (D) 1, 2, and 3
Practice questions:
Question 1: Consider the following three claims
1. (n + k)ᵐ = O(nᵐ), where k and m are constants
2. 2ⁿ⁺¹ = O(2ⁿ)
3. 2²ⁿ⁺¹ = O(2ⁿ)
Which of these claims are correct?
Answer: (A) 1 and 2

Explanation: Claim 3 is incorrect because 2²ⁿ⁺¹ = 2ⁿ × 2ⁿ × 2, and the extra factor 2ⁿ is not bounded by any constant, so it is not O(2ⁿ). Claims 1 and 2 hold: (n + k)ᵐ ≤ (2n)ᵐ = 2ᵐ·nᵐ once n ≥ |k|, and 2ⁿ⁺¹ = 2·2ⁿ.


Do it yourself…
• If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then h(n) = Θ(f(n)). (TRUE / FALSE)
• n/100 = Ω(n). (TRUE / FALSE)
• Mention the bigger function. If they are the same, mention “equal.”
  (a) O(n⁴²) or O(42ⁿ) or equal
  (b) Θ(log₂ n) or Θ(5 ∗ 10³⁰) or equal
  (c) Ω(log₂ n) or Ω(log₃ n) or equal
Check…
• If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then h(n) = Θ(f(n)). (TRUE / FALSE)
  Solution: TRUE. Θ is transitive (so f(n) = Θ(h(n))) and symmetric (so h(n) = Θ(f(n))).
• n/100 = Ω(n). (TRUE / FALSE)
  Solution: TRUE. n/100 ≥ c ∗ n for c = 1/200.
• Mention the bigger function. If they are the same, mention “equal.”
  (a) O(n⁴²) or O(42ⁿ) or equal: 42ⁿ is bigger, since an exponential eventually outgrows any polynomial
  (b) Θ(log₂ n) or Θ(5 ∗ 10³⁰) or equal: Θ(log₂ n) is bigger, since 5 ∗ 10³⁰ is just a constant, i.e. Θ(1)
  (c) Ω(log₂ n) or Ω(log₃ n) or equal: equal
• Changing the base of a logarithm is the same as multiplying it by a constant, and Big-O does not care about constants:
  loga(b) = logc(b) / logc(a), so log₂(n) = log₃(n) / log₃(2).
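As a quick numeric illustration of that base-change fact (the loop bounds below are arbitrary), the ratio log₃(n) / log₂(n) is the same constant, ln 2 / ln 3 ≈ 0.631, for every n:

#include <stdio.h>
#include <math.h>

/* Shows that log3(n) / log2(n) is a constant (ln 2 / ln 3 ~ 0.631),
   so changing the logarithm base only changes the hidden constant in Big-O. */
int main(void) {
    for (int n = 10; n <= 1000000; n *= 10) {
        double log3n = log((double)n) / log(3.0);  /* log base 3 via change of base */
        double ratio = log3n / log2((double)n);
        printf("n = %7d   log3(n)/log2(n) = %.6f\n", n, ratio);
    }
    return 0;
}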
There are two other notations…


Questions, please…
