
Complexity of Algorithms

Nitin Mishra
M.Tech NIT Allahabad
Asst. Professor, ITM-Bhilwara
Complexity of Algorithms
• Efficiency or complexity of an algorithm
is stated as a function relating the input
length to the number of steps (time
complexity) or storage locations (space
complexity).
Time Complexity
• Running time of a program as a function of the size of its input.
Space Complexity
• Amount of computer memory required during program execution, as a function of input size.
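As an illustrative sketch (ours, not from the slides), the function below runs in time proportional to n while using a constant amount of extra storage, i.e. O(n) time and O(1) auxiliary space:

```python
def total(values):
    """Sum a list of n numbers."""
    s = 0                  # one extra storage location -> O(1) space
    for v in values:       # loop body executes n times -> O(n) time
        s += v
    return s

print(total([3, 1, 4, 1, 5]))  # 14
```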
Asymptotic Notation
• Helps to compare algorithms
• Suppose we are considering two algorithms, A
and B, for solving a given problem. Furthermore,
let us say that we have done a careful analysis
of the running times of each of the algorithms
and determined them to be Ta(n) and Tb(n),
respectively, where n is a measure of the
problem size. Then it should be a fairly simple
matter to compare the two functions and
determine which algorithm is the better of the two.
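For instance (hypothetical running times, not from the slides), let Ta(n) = 100n and Tb(n) = n². The sketch below tabulates both: B wins for small n, but A dominates for all n > 100:

```python
# Hypothetical running times for algorithms A and B.
Ta = lambda n: 100 * n   # algorithm A: linear, large constant factor
Tb = lambda n: n * n     # algorithm B: quadratic, small constant factor

for n in (10, 100, 1000, 10000):
    if Ta(n) < Tb(n):
        note = "A is faster"
    elif Ta(n) > Tb(n):
        note = "B is faster"
    else:
        note = "tie"
    print(f"n={n:>6}: Ta={Ta(n):>10} Tb={Tb(n):>10} -> {note}")
```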
An Asymptotic Upper Bound-Big Oh
• In 1892, P. Bachmann invented a notation for characterizing the asymptotic behavior of functions. His invention has come to be known as big oh notation:
• Definition (Big Oh)     Consider a function f(n) which is non-negative for all integers n ≥ 0. We say that ``f(n) is big oh g(n),'' which we write f(n) = O(g(n)), if there exists an integer n0 and a constant c > 0 such that for all integers n ≥ n0, f(n) ≤ c·g(n).
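As a concrete check (our example, not from the slides), take f(n) = 3n² + 5n and g(n) = n². The witnesses c = 4 and n0 = 5 satisfy the definition, since 3n² + 5n ≤ 4n² exactly when 5n ≤ n², i.e. when n ≥ 5. The sketch verifies this numerically:

```python
f = lambda n: 3 * n * n + 5 * n   # f(n) = 3n^2 + 5n
g = lambda n: n * n               # g(n) = n^2
c, n0 = 4, 5                      # witnesses for f(n) = O(g(n))

# The definition requires f(n) <= c*g(n) for every n >= n0.
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
print("3n^2 + 5n = O(n^2) with c = 4, n0 = 5")
```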
An Asymptotic Lower Bound-Omega
• The big oh notation introduced in the preceding section is an asymptotic upper bound. In this section, we introduce a similar notation for characterizing the asymptotic behavior of a function, but in this case it is a lower bound.
• Definition (Omega)     Consider a function f(n) which is non-negative for all integers n ≥ 0. We say that ``f(n) is omega g(n),'' which we write f(n) = Ω(g(n)), if there exists an integer n0 and a constant c > 0 such that for all integers n ≥ n0, f(n) ≥ c·g(n).
• The definition of omega is almost identical to that of big oh. The only difference is in the comparison: for big oh it is f(n) ≤ c·g(n); for omega, it is f(n) ≥ c·g(n). All of the same conventions and caveats apply to omega as they do to big oh.
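Mirroring the big oh check (again our example), f(n) = 3n² + 5n is also Ω(n²): with c = 3 and n0 = 0, f(n) ≥ 3n² holds for every n ≥ 0:

```python
f = lambda n: 3 * n * n + 5 * n   # f(n) = 3n^2 + 5n
g = lambda n: n * n               # g(n) = n^2
c, n0 = 3, 0                      # witnesses for f(n) = Omega(g(n))

# The definition requires f(n) >= c*g(n) for every n >= n0.
assert all(f(n) >= c * g(n) for n in range(n0, 10_000))
print("3n^2 + 5n = Omega(n^2) with c = 3, n0 = 0")
```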
Theta and Little Oh
• Definition (Theta)     Consider a function f(n) which is non-negative for all integers n ≥ 0. We say that ``f(n) is theta g(n),'' which we write f(n) = Θ(g(n)), if and only if f(n) is O(g(n)) and f(n) is Ω(g(n)).
• Definition (Little Oh)     Consider a function f(n) which is non-negative for all integers n ≥ 0. We say that ``f(n) is little oh g(n),'' which we write f(n) = o(g(n)), if and only if f(n) is O(g(n)) but f(n) is not Θ(g(n)).
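Combining the two previous checks (our running example), 3n² + 5n is sandwiched between 3n² and 4n² for all n ≥ 5, so 3n² + 5n = Θ(n²):

```python
f = lambda n: 3 * n * n + 5 * n
g = lambda n: n * n

# Theta: c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
c1, c2, n0 = 3, 4, 5
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("3n^2 + 5n = Theta(n^2) with c1 = 3, c2 = 4, n0 = 5")
```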
Types of Analysis
• Worst case running time
• Average case running time
• Best case running time
• Amortized running time
Worst case Running Time
• The behavior of the algorithm with respect to the worst possible case of the input instance. The worst-case running time of an algorithm is an upper bound on the running time for any input. Knowing it gives us a guarantee that the algorithm will never take any longer. There is no need to make an educated guess about the running time.
• For expressing the worst case running time of an algorithm, Big-Oh notation is used.
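For example (a standard illustration, not from the slides), linear search hits its worst case when the target is absent, since every one of the n elements must be examined, giving O(n) time:

```python
def linear_search(values, target):
    """Return the index of target in values, or -1 if absent."""
    for i, v in enumerate(values):  # worst case: loop runs all n times
        if v == target:
            return i
    return -1                       # target absent: n comparisons made

# Worst case: 5 is not in the list, so all 4 elements are compared.
print(linear_search([2, 7, 1, 8], 5))  # -1
```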
Average case Running Time
• The expected behavior when the input is randomly drawn from a given distribution. The average-case running time of an algorithm is an estimate of the running time for an "average" input.
• Computation of average-case running time entails knowing all possible input sequences, the probability distribution of occurrence of these sequences, and the running times for the individual sequences. Often it is assumed that all inputs of a given size are equally likely. For expressing the average case running time of an algorithm, Θ (Theta) notation is used.
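For example (our sketch), if linear search looks for a target that is present and equally likely to sit at any of the n positions, the expected number of comparisons is (1 + 2 + ... + n)/n = (n + 1)/2, which is Θ(n):

```python
from random import randrange, seed

def comparisons(values, target):
    """Number of comparisons linear search makes to find target."""
    for i, v in enumerate(values):
        if v == target:
            return i + 1
    return len(values)

seed(42)
n, trials = 1000, 10_000
values = list(range(n))
# Target chosen uniformly at random among the n stored values.
avg = sum(comparisons(values, randrange(n)) for _ in range(trials)) / trials
print(f"observed average = {avg:.1f}, predicted (n+1)/2 = {(n + 1) / 2}")
```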
Best case Running time
• The behavior of the algorithm when the input is already in order.
• For example in sorting, if the elements are already sorted for a specific algorithm.
• The best case running time rarely occurs in practice, compared with the worst and average cases.
• For expressing the best case running time of an algorithm, Big-Omega (Ω) notation is used.
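For example (a common textbook case, sketched here with a hypothetical comparison counter), insertion sort on an already-sorted array performs only n - 1 comparisons and no element moves, so its best case is linear:

```python
def insertion_sort(a):
    """Sort a in place and return the number of comparisons made."""
    count = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            count += 1        # one comparison
            if a[j] <= key:
                break         # already in place: inner loop stops at once
            a[j + 1] = a[j]   # shift larger element right
            j -= 1
        a[j + 1] = key
    return count

# Best case: input already sorted -> exactly n - 1 = 4 comparisons.
print(insertion_sort([1, 2, 3, 4, 5]))  # 4
```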
Amortized Running Time
• Here the time required to perform a sequence of
(related) operations is averaged over all the
operations performed.
• Amortized analysis can be used to show that the average cost of an operation is small, if one averages over a sequence of operations, even though a single operation might be expensive.
• Amortized analysis guarantees the average
performance of each operation in the worst
case.
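For example (a classic illustration, sketched here), appending to a dynamic array that doubles its capacity when full occasionally costs O(n) for the copy, yet the total cost of n appends stays O(n), so each append is amortized O(1):

```python
def append_cost(n):
    """Total element writes for n appends to a doubling dynamic array."""
    capacity, size, cost = 1, 0, 0
    for _ in range(n):
        if size == capacity:  # array full: copy everything, double capacity
            cost += size      # the occasional expensive operation: O(size)
            capacity *= 2
        cost += 1             # the append itself: O(1)
        size += 1
    return cost

n = 1_000_000
print(append_cost(n) / n)  # stays below 3 writes per append -> amortized O(1)
```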
