
Algorithm Complexity Guide

This PDF contains many examples to help you understand time complexity better. Happy Learning :-)

github.com/himanshuprojects

Time Complexity
The complexity of an algorithm is analyzed from two perspectives: time and space.

Asymptotic notation/behaviour
• How f(n) behaves as n grows large.
• Notation-
• O − Big Oh
• Ω − Big omega
• θ − Big theta
• o − Little Oh
• ω − Little omega

Big Oh Notation, Ο
• The notation Ο is the formal way to express
the upper bound of an algorithm's running
time. It measures the worst-case time
complexity – the longest amount of time an
algorithm can possibly take to complete.
• Formally, for a function f(n):
• Ο(f(n)) = { g(n) : there exist c > 0 and n0 such that g(n) ≤
c·f(n) for all n > n0 }
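As a sketch of how this definition is used (the constants c = 4 and n0 = 2 are chosen here just for illustration):

```latex
g(n) = 3n + 2, \qquad f(n) = n
% choose c = 4, n_0 = 2:
3n + 2 \le 4n \iff n \ge 2
% so g(n) \le c \cdot f(n) for all n > n_0, hence
3n + 2 = O(n)
```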

Omega Notation, Ω
• The notation Ω is the formal way to express
the lower bound of an algorithm's running
time. It measures the best-case time
complexity – the shortest amount of time an
algorithm can possibly take to complete.
• Formally, for a function f(n):
• Ω(f(n)) = { g(n) : there exist c > 0 and n0 such
that g(n) ≥ c·f(n) for all n > n0 }
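A matching sketch for the lower bound (again with constants chosen just for illustration):

```latex
g(n) = 3n + 2, \qquad f(n) = n
% choose c = 3, n_0 = 1:
3n + 2 \ge 3n \quad \text{for all } n \ge 1
% so g(n) \ge c \cdot f(n) for all n > n_0, hence
3n + 2 = \Omega(n)
```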

Theta Notation, θ
• The notation θ is the formal way to express
both the lower bound and the upper bound
(a tight bound) of an algorithm's running time.
It is represented as follows −
• θ(f(n)) = { g(n) : g(n) = Ο(f(n)) and
g(n) = Ω(f(n)) }
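Continuing the same illustrative example, the two bounds together give a tight bound:

```latex
3n \le 3n + 2 \le 4n \quad \text{for all } n \ge 2
% bounded above and below by constant multiples of n, hence
3n + 2 = \Theta(n)
```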

Common Asymptotic Notations

In sorting, algorithms fall into two classes by time complexity:

• Naïve – O(n²)
• Best comparison-based – O(n log n)

Typical functions
• We are interested in the order of magnitude (ignore
constants).
• Is T(n) proportional to log n, n, n², …, nᵏ, 2ⁿ, n! ?
• Logarithmic – log n, n log n
• Polynomial – n, n², …, nᵏ
• Exponential – 2ⁿ, n!

[Chart: input size vs. running time in seconds for the typical functions above]

Big-O notes
• One important point most people forget when talking about Big-O, so I feel the
need to mention it:
• You cannot use Big-O alone to compare the absolute speed of two algorithms. Big-O
only says how the running time grows – roughly, how much slower an algorithm
gets if you double the number of items processed, or how much faster if you halve it.
• So if you have two entirely different algorithms and one (A) is O(n²) while
the other (B) is O(log n), it does not follow that A is slower than B. With
100 items, A might be ten times faster than B. Big-O only says that A's running
time grows like n² and B's grows like log n. If you benchmark both on
100 items and A is faster, you can calculate the input size at which B will
overtake A – because B's running time grows much more slowly than A's, B will
overtake A sooner or later.
• When simplifying, we drop all the terms that grow slowly and keep the one that
grows fastest.
• We also ignore constant factors: 4n → n.

Running time complexity

Common cases:
1. Consecutive statements
2. Subroutines (loops)
3. Nested subroutines (nested loops)
4. Logarithmic
5. Square root
6. Recursive

Consecutive statements
• We add the time complexities of consecutive
statements, or equivalently
• we count the statement with the maximum
complexity.
int m = 0;   -> executed in constant time c1
m = m + 1;   -> executed in constant time c2
m = m + 2;   -> executed in constant time c3
Total time = c1 + c2 + c3
           = max(c1, c2, c3), asymptotically
           = O(1)

Conditional
• The running time is never more than the running time of the test(s) plus
the running time of the branch with maximum complexity.
• Note: if the test itself contains a function call, e.g. fun(), and fun() in turn
contains nested loops, include the time complexity of those as well.
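A minimal sketch of this rule; the function names (`contains`, `conditional_demo`) and the array are invented for illustration:

```c
/* O(n) linear scan: this is the "function call in the test". */
int contains(const int *a, int n, int x) {
    for (int i = 0; i < n; i++)
        if (a[i] == x) return 1;
    return 0;
}

/* Worst case = cost of the test + cost of the most expensive branch. */
int conditional_demo(const int *a, int n, int x) {
    if (n > 0 && contains(a, n, x))  /* the test itself costs O(n) */
        return 1;                    /* O(1) branch */
    else
        return 0;                    /* O(1) branch */
    /* total worst case: O(n) + max(O(1), O(1)) = O(n) */
}
```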

Subroutine
Consecutive Subroutines
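The slide's worked example did not survive extraction, so here is a minimal sketch of the rule for consecutive subroutines, using an invented step counter (`consecutive_steps`): the complexities add.

```c
/* Two loops, one after the other: O(n) + O(m) = O(n + m),
 * which is O(n) when both loops run over the same input. */
long consecutive_steps(int n, int m) {
    long count = 0;
    for (int i = 0; i < n; i++)   /* first subroutine: O(n) */
        count++;
    for (int j = 0; j < m; j++)   /* second subroutine: O(m) */
        count++;
    return count;                 /* n + m basic steps in total */
}
```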

• The total running time of nested loops is the running time of the outer loop
multiplied by the running time of the inner loop(s).
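A minimal sketch of the multiplication rule, again with an invented step counter (`nested_steps`):

```c
/* Nested loops: the outer loop runs n times and, for each outer
 * iteration, the inner loop runs n times: O(n) * O(n) = O(n^2). */
long nested_steps(int n) {
    long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            count++;   /* executed n * n times */
    return count;
}
```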
Combination
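The slide's example is not in the text; a sketch of one common combination, assuming it means a sequential loop followed by a nested one (the name `combined_steps` is invented):

```c
/* A sequential O(n) loop followed by a nested O(n^2) loop:
 * O(n) + O(n^2) = O(n^2) -- the fastest-growing term dominates. */
long combined_steps(int n) {
    long count = 0;
    for (int i = 0; i < n; i++)        /* O(n) part */
        count++;
    for (int i = 0; i < n; i++)        /* O(n^2) part */
        for (int j = 0; j < n; j++)
            count++;
    return count;                       /* n + n*n steps */
}
```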

Logarithms
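The slide's example is not in the text; the classic logarithmic pattern is a loop whose counter doubles each iteration (the name `log_steps` is invented):

```c
/* The loop variable doubles each iteration, so the body runs
 * until 2^k >= n, i.e. about log2(n) times: O(log n). */
int log_steps(int n) {
    int count = 0;
    for (int i = 1; i < n; i *= 2)
        count++;   /* runs ceil(log2(n)) times */
    return count;
}
```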
Square root complexity
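The slide's example is not in the text; the usual square-root pattern is a loop whose condition compares i² against n (the name `sqrt_steps` is invented):

```c
/* The loop stops once i*i exceeds n, so it runs
 * floor(sqrt(n)) times: O(sqrt(n)). */
int sqrt_steps(int n) {
    int count = 0;
    for (long i = 1; i * i <= n; i++)
        count++;
    return count;
}
```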

To find the complexity of a recursive function, set up its recurrence and solve it first.
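As a minimal sketch (the function `rec_sum` is invented for illustration): write down how the cost of a call relates to the cost of its recursive calls, then solve that recurrence.

```c
/* Recursive sum of 1..n. Each call does O(1) work plus one
 * recursive call, so the recurrence is
 *     T(n) = T(n-1) + c,
 * which unrolls to T(n) = c*n + T(0) = O(n). */
long rec_sum(int n) {
    if (n <= 0) return 0;        /* base case: O(1) */
    return n + rec_sum(n - 1);   /* one call per level, n levels deep */
}
```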


