
Asymptotic Notation & Analysis


CCC121 – Data Structures and Algorithms
Discussions

01  Asymptotic Analysis
    • Definition
    • Types of Time Requirements

02  Asymptotic Notations
    • Big Oh Notation, Ο
    • Omega Notation, Ω
    • Theta Notation, θ
    • Common Asymptotic Notations
Recap

Data Structure

• The organization of data in a way that it can be used efficiently.
• In order to use data efficiently, we have to store it efficiently.
• The efficiency of a data structure is always measured in terms of time and space.
• An ideal data structure is one that takes the least possible time for
  all its operations and consumes the least memory space.
Recap

Question

How can we compare the time complexity of data structures?

Answer

On the basis of the operations performed on them.
Example

arr: 1 2 5 7 9 11 15

Target: Add 3 at the beginning of the list

Every existing element must first shift one position to the right,
then 3 is placed in the first slot:

arr: 3 1 2 5 7 9 11 15
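A minimal C sketch of this shift-then-insert step (the array capacity and
function name are assumptions for illustration, not from the slides):

#include <stdio.h>

#define CAPACITY 16

/* Insert value at index 0: every existing element shifts right once,
   so the cost grows with the number of elements n. */
void insert_at_beginning(int arr[], int *n, int value) {
    for (int i = *n; i > 0; i--)
        arr[i] = arr[i - 1];    /* one shift per existing element */
    arr[0] = value;
    (*n)++;
}

int main(void) {
    int arr[CAPACITY] = {1, 2, 5, 7, 9, 11, 15};
    int n = 7;

    insert_at_beginning(arr, &n, 3);

    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);  /* prints: 3 1 2 5 7 9 11 15 */
    printf("\n");
    return 0;
}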
Example

arr: 3 1 2 5 7 9 11 15

Linked List (nodes shown as [data | next address], with each node's own
address underneath):

Head → [1 | 2000] → [2 | 3000] → [6 | Null] ← Tail
        (1000)       (2000)       (3000)

A new node [4 | Null] is created at address 4000. To add 4 at the
beginning, its next pointer is set to 1000 and Head is pointed to 4000:

Head → [4 | 1000] → [1 | 2000] → [2 | 3000] → [6 | Null] ← Tail
        (4000)       (1000)       (2000)       (3000)

Inserting an element at the beginning of the list is way faster in the
linked list than in arrays.
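A minimal C sketch of the same head insertion (struct and function names
are assumptions for illustration):

#include <stdio.h>
#include <stdlib.h>

struct node {
    int data;
    struct node *next;
};

/* Insert value at the beginning: allocate one node and repoint head.
   No shifting is required, so the cost does not depend on list length. */
void insert_at_head(struct node **head, int value) {
    struct node *n = malloc(sizeof *n);
    n->data = value;
    n->next = *head;
    *head = n;
}

int main(void) {
    struct node *head = NULL;

    /* Build the list 1 -> 2 -> 6 by inserting in reverse order. */
    insert_at_head(&head, 6);
    insert_at_head(&head, 2);
    insert_at_head(&head, 1);

    insert_at_head(&head, 4);   /* the example insertion: 4 -> 1 -> 2 -> 6 */

    for (struct node *p = head; p != NULL; p = p->next)
        printf("%d ", p->data);
    printf("\n");
    return 0;
}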
Time Complexity

Q. How to find the time complexity or running time of an operation
performed on data structures?
How to find the time complexity?

Method #1: Measure the exact running time

• Run the operation on a machine and check the running time.
• Run the operation for different inputs on the data structures you want
  to compare, one by one.

Problems with this approach:

• For some input sizes, the first data structure may give the best
  performance, while for other input sizes the second one performs best.
• On one machine a data structure may perform well for some input size,
  while on another machine a different one performs well with a
  different size.
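A minimal sketch of Method #1, assuming we time the earlier array
insertion (N and the operation are chosen here for illustration) with the
standard C clock() function:

#include <stdio.h>
#include <time.h>

#define N 1000000

int arr[N + 1];

int main(void) {
    /* The operation under test: shift N elements right, insert at index 0. */
    clock_t start = clock();
    for (int i = N; i > 0; i--)
        arr[i] = arr[i - 1];
    arr[0] = 3;
    clock_t end = clock();

    printf("elapsed: %f seconds\n", (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}

The measured time will differ from machine to machine and run to run,
which is exactly the problem noted above.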
Points to be noted:

• Measuring the actual running time is not practical at all.
• The running time generally depends on the size of the input.

arr: 1 2 5 7 9 11 15

How many shifts do we need to insert 3 at the beginning?

In this case, we have to make 7 shifts, which will take 7 units of time.

This clearly shows that the running time generally depends on the size
of the input.
Points to be noted:

• Therefore, if the size of the input is n, then f(n), a function of n,
  denotes the time complexity.
• In other words, f(n) represents the number of instructions executed for
  the input value n.
For example:

for (i = 0; i < n; i++) {
    printf("Hello World");   /* assume this instruction takes 1 unit of time */
}

The printf instruction is executed n times.

Therefore, f(n) = n
But how do we find f(n)?

We can compare two data structures for a particular operation by comparing
their f(n) values.

We are interested in the growth rate of f(n) with respect to n, because
for a smaller input size one data structure may seem better than the
other, while for a larger input size it may not.

This concept is applicable in comparing two algorithms as well, as the
sketch below shows.
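A minimal sketch of that crossover (the cost functions f1(n) = 100n and
f2(n) = n² are assumptions chosen for illustration, not from the slides):

#include <stdio.h>

int main(void) {
    /* Hypothetical cost functions for two data structures:
       f1(n) = 100n (larger constant, slower growth) and
       f2(n) = n*n  (smaller constant, faster growth).   */
    long ns[] = {10, 50, 100, 200, 1000};
    for (int i = 0; i < 5; i++) {
        long n = ns[i];
        long f1 = 100 * n;
        long f2 = n * n;
        printf("n=%5ld  f1=%8ld  f2=%8ld  -> %s is cheaper\n",
               n, f1, f2, f1 < f2 ? "f1" : "f2");
    }
    return 0;
}

For small n the quadratic f2 looks cheaper; past n = 100 the linear f1
wins, and the gap keeps widening.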


Example:

f(n) = 5n² + 6n + 12

for n = 1

% of running time due to 5n²: (5 / (5 + 6 + 12)) × 100 = 21.74%
% of running time due to 6n:  (6 / (5 + 6 + 12)) × 100 = 26.09%
% of running time due to 12:  (12 / (5 + 6 + 12)) × 100 = 52.17%


Example:

f(n) = 5n² + 6n + 12

n       5n²       6n        12
1       21.74%    26.09%    52.17%
10      87.41%    10.49%    2.09%
100     98.79%    1.19%     0.02%
1000    99.88%    0.12%     0.0002%

It is a clear observation that for larger values of n, the squared term
consumes almost 99% of the time.
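A small C program that reproduces the table's arithmetic (written here as
an illustration of the computation):

#include <stdio.h>

/* Share of f(n) = 5n^2 + 6n + 12 contributed by each term as n grows. */
int main(void) {
    double ns[] = {1, 10, 100, 1000};
    for (int i = 0; i < 4; i++) {
        double n = ns[i];
        double sq = 5 * n * n, lin = 6 * n, cst = 12;
        double total = sq + lin + cst;
        printf("n=%6.0f  5n^2: %6.2f%%  6n: %6.2f%%  12: %7.4f%%\n",
               n, 100 * sq / total, 100 * lin / total, 100 * cst / total);
    }
    return 0;
}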
Growth Rate
So there is no harm in eliminating the rest of the terms, as they
contribute very little to the running time.
Asymptotic Complexity

f(n) ≈ 5n²

We get an approximate time complexity, and we are satisfied with this
result because it is very close to the actual result.

This approximate measure of time complexity is called Asymptotic
Complexity.
Growth Rate of Standard Functions g(n)

n     log₂n   n     n·log₂n   n²      n³       2ⁿ
1     0       1     0         1       1        2
2     1       2     2         4       8        4
4     2       4     8         16      64       16
8     3       8     24        64      512      256
16    4       16    64        256     4096     65536
32    5       32    160       1024    32768    4294967296
Big O Notation

Big O notation is used to measure the performance of any algorithm by
describing the order of growth of its running-time function.

It gives an upper bound on a function, which guarantees that the function
will never grow faster than this bound.

We want the approximate runtime of the operations performed on data
structures; we are not interested in the exact time.
Asymptotic Analysis

Definition

• refers to computing the running time of any operation in mathematical
  units of computation
• defines the mathematical boundaries/framing of an algorithm's run-time
  performance
  (using asymptotic analysis, we can very well conclude the best case,
  average case, and worst case scenarios of an algorithm)
• asymptotic analysis is input bound
  (i.e., if there is no input to the algorithm, it is concluded to work in
  constant time; other than the "input", all other factors are considered
  constant)
Asymptotic Analysis

Types of Time Requirements

• Best Case − Minimum time required for program execution.

• Average Case − Average time required for program execution.

• Worst Case − Maximum time required for program execution.
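A minimal sketch illustrating the three cases with linear search (an
example chosen here for illustration, not from the slides):

#include <stdio.h>

/* Linear search: returns the index of target in arr, or -1 if absent. */
int linear_search(const int arr[], int n, int target) {
    for (int i = 0; i < n; i++)
        if (arr[i] == target)
            return i;
    return -1;
}

int main(void) {
    int arr[] = {3, 1, 2, 5, 7, 9, 11, 15};
    int n = 8;

    linear_search(arr, n, 3);   /* best case:  target is first, 1 comparison */
    linear_search(arr, n, 7);   /* average case: target somewhere in between */
    linear_search(arr, n, 42);  /* worst case: target absent, n comparisons  */
    return 0;
}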


Asymptotic Notations

Big Oh Notation, Ο

• The notation Ο(n) is the formal way to express the upper bound of an
  algorithm's running time.
• It measures the worst case time complexity, or the longest amount of
  time an algorithm can possibly take to complete.

For example, for a function f(n):

Ο(f(n)) = { g(n) : there exist c > 0 and n₀ such that g(n) ≤ c·f(n)
            for all n > n₀ }
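As a worked illustration (the constants are chosen here for the example,
not given on the slide), take the earlier f(n) = 5n² + 6n + 12. For all
n ≥ 1,

5n² + 6n + 12 ≤ 5n² + 6n² + 12n² = 23n²

so with c = 23 and n₀ = 1 the definition is satisfied and
5n² + 6n + 12 = Ο(n²).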
Asymptotic Notations

Omega Notation, Ω

• The notation Ω(n) is the formal way to express the lower bound of an
  algorithm's running time.
• It measures the best case time complexity, or the minimum amount of
  time an algorithm can possibly take to complete.

For example, for a function f(n):

Ω(f(n)) = { g(n) : there exist c > 0 and n₀ such that g(n) ≥ c·f(n)
            for all n > n₀ }
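Continuing the same illustration: 5n² + 6n + 12 ≥ 5n² for all n ≥ 1, so
with c = 5 and n₀ = 1 we also have 5n² + 6n + 12 = Ω(n²).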
Asymptotic Notations

Theta Notation, θ

• The notation θ(n) is the formal way to express both the lower bound and
  the upper bound of an algorithm's running time.

θ(f(n)) = { g(n) : g(n) = Ο(f(n)) and g(n) = Ω(f(n)) for all n > n₀ }
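Tying the two previous illustrations together: since 5n² + 6n + 12 is
both Ο(n²) (with c = 23) and Ω(n²) (with c = 5), it is θ(n²).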
Asymptotic Notations

Common Asymptotic Notations

• constant − Ο(1)
• logarithmic − Ο(log n)
• linear − Ο(n)
• n log n − Ο(n log n)
• quadratic − Ο(n²)
• cubic − Ο(n³)
• polynomial − n^Ο(1)
• exponential − 2^Ο(n)
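A hedged sketch (function name and loop bodies invented for illustration)
of the C loop shapes that commonly give rise to several of these
notations:

#include <stdio.h>

void examples(int arr[], int n) {
    /* constant, O(1): one step regardless of n */
    int acc = arr[0];

    /* logarithmic, O(log n): the range halves each iteration */
    for (int i = n; i > 1; i /= 2)
        acc++;

    /* linear, O(n): one pass over the input */
    for (int i = 0; i < n; i++)
        acc += arr[i];

    /* quadratic, O(n^2): a full pass inside a full pass */
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            acc += arr[i] + arr[j];

    printf("%d\n", acc);
}

int main(void) {
    int arr[] = {1, 2, 5, 7};
    examples(arr, 4);
    return 0;
}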
END OF PRESENTATION
