
Algorithms and Problem Solving (15B17CI411)

EVEN 2021

Module 1: Lecture 2

Jaypee Institute of Information Technology (JIIT)


A-10, Sector 62, Noida
Asymptotic Analysis

• The main idea of asymptotic analysis is to have a measure of the
  efficiency of algorithms that does not depend on machine-specific
  constants and does not require the algorithms to be implemented or the
  running times of programs to be compared.
• Asymptotic notations are mathematical tools to represent the time
  complexity of algorithms for asymptotic analysis.
Growth of Functions - Asymptotic Notations
• It is a way to describe the characteristics of a function in the limit.
• It describes the rate of growth of functions.
• It is a way to compare the "sizes" of functions.
Growth of Functions - Asymptotic Notations

Big-O notation represents an upper bound on the running time of an algorithm.
Therefore, it gives the worst-case complexity of an algorithm.
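The standard formal definition (not stated on the slide, but implied by it):

  f(n) = O(g(n)) if there exist constants c > 0 and n0 ≥ 0 such that
  0 ≤ f(n) ≤ c·g(n) for all n ≥ n0.

For example, 3n + 2 = O(n), since 3n + 2 ≤ 4n for all n ≥ 2 (take c = 4, n0 = 2).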
Growth of Functions - Asymptotic Notations

Omega notation represents a lower bound on the running time of an algorithm.
Thus, it provides the best-case complexity of an algorithm.
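The corresponding formal definition (standard, not shown on the slide):

  f(n) = Ω(g(n)) if there exist constants c > 0 and n0 ≥ 0 such that
  0 ≤ c·g(n) ≤ f(n) for all n ≥ n0.

For example, 3n + 2 = Ω(n), since 3n + 2 ≥ 3n for all n ≥ 1 (take c = 3, n0 = 1).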
Growth of Functions - Asymptotic Notations

Theta notation encloses the function from above and below.
Since it represents both an upper and a lower bound on the running time of an
algorithm, it gives a tight bound, and it is commonly used for analyzing the
average-case complexity of an algorithm.
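The formal definition (standard textbook form, not shown on the slide) combines the previous two:

  f(n) = Θ(g(n)) if there exist constants c1, c2 > 0 and n0 ≥ 0 such that
  0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0;
  equivalently, f(n) = O(g(n)) and f(n) = Ω(g(n)).

For example, 3n + 2 = Θ(n), with c1 = 3, c2 = 4, n0 = 2.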
Growth of Functions - Asymptotic Notations

"Little-o", o(), denotes a strict (non-tight) upper bound on f(n): f(n) = o(g(n))
means that f(n) grows strictly slower than g(n).
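Formally (the standard definition, not shown on the slide):

  f(n) = o(g(n)) if for every constant c > 0 there exists n0 ≥ 0 such that
  0 ≤ f(n) < c·g(n) for all n ≥ n0;
  equivalently, lim_{n→∞} f(n)/g(n) = 0.

For example, n = o(n^2), but n ≠ o(n) (the bound must hold for every c, however small).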


Growth of Functions - Asymptotic Notations
How to analyze an algorithm??
Why consider worst-case running time???
• The worst-case running time gives a guaranteed upper bound on the running
time for any input.
• For some algorithms, the worst case occurs often. For example, when searching,
the worst case often occurs when the item being searched for is not present,
and searches for absent items may be frequent.

Why not analyze the average case?
Because it's often about as bad as the worst case.
Algorithms: let Ci be the cost of the ith line.

• Linear search algorithm

                                      Cost   No. of times executed
  for i = 0 to (n - 1)                C1     t1
      if (A[i] == item)               C2     t2
          loc = i                     C3     1 or 0
          Exit                        C4     1 or 0
  set loc = -1                        C5     1 or 0

Total time taken, e.g. A = {1, 2, 3, 4, 5}:
  Search item 1 → T(5) = C1 + C2 + C3 + C4      = O(1)   (best case)
  Search item 3 → T(5) = 3C1 + 3C2 + C3 + C4    = O(n)   (avg. case)
  Search item 5 → T(5) = 5C1 + 5C2 + C3 + C4    = O(n)   (worst case)
  Search item 6 → T(5) = 5C1 + 5C2 + C5         = O(n)   (worst case: item absent)
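As a sketch, the pseudocode above translates directly to Python (the function name `linear_search` and the use of a return value in place of `loc`/`Exit` are ours):

```python
def linear_search(A, item):
    """Return the index of item in A, or -1 if absent (mirrors the pseudocode above)."""
    loc = -1
    for i in range(len(A)):      # C1: loop header, executed up to n times
        if A[i] == item:         # C2: one comparison per iteration
            loc = i              # C3: record the position
            return loc           # C4: early exit on the first match
    return loc                   # C5: item not found

A = [1, 2, 3, 4, 5]
print(linear_search(A, 1))   # best case: found at index 0, O(1)
print(linear_search(A, 6))   # worst case: absent, all n comparisons made, prints -1
```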
Algorithms: let Ci be the cost of the ith line.

• Max and Min algorithm

                                      Cost   No. of times executed
  max = A[0]                          C1     1
  min = A[0]                          C2     1
  for (i = 1 to n-1)                  C3     n-1
      if (A[i] > max)                 C4     n-1
          max = A[i]                  C5     t1
      else if (A[i] < min)            C6     t2
          min = A[i]                  C7     t3

Total time taken:
  e.g. A = {1,2,3,4,5} → T(5) = C1 + C2 + 4C3 + 4C4 + 4C5       = O(n)  (avg. case, (n-1) comparisons)
  e.g. A = {5,4,3,2,1} → T(5) = C1 + C2 + 4C3 + 4C4 + 4C6 + 4C7 = O(n)  (worst case, 2(n-1) comparisons)
  e.g. A = {1}         → T(1) = C1 + C2                          = O(1)  (best case)
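A minimal Python sketch of the same algorithm, with a comparison counter added (not in the original pseudocode) to make the (n-1) vs. 2(n-1) comparison counts visible:

```python
def max_min(A):
    """Single-pass max and min, matching the pseudocode above.

    Returns (max, min, comparisons); the counter tallies element comparisons only.
    """
    max_v = A[0]                      # C1
    min_v = A[0]                      # C2
    comparisons = 0
    for i in range(1, len(A)):        # C3
        comparisons += 1
        if A[i] > max_v:              # C4: always executed
            max_v = A[i]              # C5
        else:
            comparisons += 1
            if A[i] < min_v:          # C6: only when the first test fails
                min_v = A[i]          # C7
    return max_v, min_v, comparisons

print(max_min([1, 2, 3, 4, 5]))  # ascending: (5, 1, 4)  -> n-1 comparisons
print(max_min([5, 4, 3, 2, 1]))  # descending: (5, 1, 8) -> 2(n-1) comparisons
```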
How to analyze an algorithm??
Example: Insertion Sort
Pseudo code:
  for j = 2 to A.length                              C1
      key = A[j]                                     C2
      // Insert A[j] into sorted array A[1 .. j-1]   C3
      i = j - 1                                      C4
      while i > 0 and A[i] > key                     C5
          A[i+1] = A[i]                              C6
          i = i - 1                                  C7
      A[i+1] = key                                   C8
How to analyze an algorithm??
Example: Insertion Sort

Let Ci be the cost of the ith line. Since comment lines do not incur any cost, C3 = 0.
Let t_j be the number of times the while-loop test (line C5) is executed for a given j.

  Pseudo code                                        Cost   No. of times executed
  for j = 2 to A.length                              C1     n
      key = A[j]                                     C2     n-1
      // Insert A[j] into sorted array A[1 .. j-1]   C3     n-1
      i = j - 1                                      C4     n-1
      while i > 0 and A[i] > key                     C5     Σ_{j=2}^{n} t_j
          A[i+1] = A[i]                              C6     Σ_{j=2}^{n} (t_j - 1)
          i = i - 1                                  C7     Σ_{j=2}^{n} (t_j - 1)
      A[i+1] = key                                   C8     n-1
How to analyze an algorithm??
Example: Insertion Sort

Run time:
  T(n) = C1·n + C2·(n-1) + 0·(n-1) + C4·(n-1) + C5·Σ_{j=2}^{n} t_j
       + C6·Σ_{j=2}^{n} (t_j - 1) + C7·Σ_{j=2}^{n} (t_j - 1) + C8·(n-1)

Best case: the array is already sorted.
  (All t_j values are 1)  →  T(n) = O(n)

Worst case: the array is reverse sorted.
  (t_j = j)  →  T(n) = O(n^2)
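The pseudocode above, converted to 0-based Python as a sketch (indices shift by one relative to the slide's 1-based array):

```python
def insertion_sort(A):
    """In-place insertion sort, following the pseudocode above."""
    for j in range(1, len(A)):        # C1: pseudocode's j = 2 .. A.length
        key = A[j]                    # C2: element to insert into A[0 .. j-1]
        i = j - 1                     # C4
        while i >= 0 and A[i] > key:  # C5: runs t_j times for this j
            A[i + 1] = A[i]           # C6: shift larger elements right
            i -= 1                    # C7
        A[i + 1] = key                # C8: drop key into its slot
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # → [1, 2, 3, 4, 5, 6]
```

On already-sorted input the while-loop test fails immediately every time (all t_j = 1), giving the O(n) best case; on reverse-sorted input every element is shifted past all its predecessors (t_j = j), giving the O(n^2) worst case.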
