
Lecture 2

Analysis of Algorithms
(Part 2)
Big O Notation
• Big-O is an asymptotic notation for the ceiling of growth of a given
function. It provides an asymptotic upper bound on the growth
rate of the running time of an algorithm.

• f(n) = O(g(n)), read as “f of n is Big-O of g of n”, if there exist
positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.

• It means that, for large inputs, f(n) grows no faster than a
constant factor times g(n).

Big O Notation
• Example: Show that 30n+8 is O(n).

• Here we have f(n) = 30n + 8 and g(n) = n.

• We need to find a constant c such that 30n + 8 ≤ cn for all sufficiently large n.
• Let c = 31. Then we have
  30n + 8 ≤ 31n for all n ≥ 8.
• Therefore,
  30n + 8 = O(n) with c = 31 and n0 = 8 (Proved)
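As a quick numeric sanity check (a small program added here, not a replacement for the proof), the witness constants c = 31 and n0 = 8 can be tested over a range of n:

#include <stdio.h>

int main(void)
{
    const long c = 31, n0 = 8;
    long n;
    for (n = n0; n <= 1000000; n++)      /* spot-check n >= n0 */
        if (30 * n + 8 > c * n) {
            printf("inequality fails at n = %ld\n", n);
            return 1;
        }
    printf("30n + 8 <= %ldn holds for every tested n >= %ld\n", c, n0);
    return 0;
}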

Big O Notation
• Note 30n+8 isn’t less than n anywhere (n > 0).

• It isn’t even less than 31n everywhere.

• But it is less than 31n everywhere to the right of n = 8.

[Figure: plot of cn = 31n, 30n+8, and n (value of function vs. increasing n), showing 30n+8 below 31n for all n > n0 = 8]

Big O Notation
• Example: Is 3n + 2 = O(n)?

• Here we have f(n) = 3n + 2 and g(n) = n.

• We need to find a constant c such that 3n + 2 ≤ cn for all sufficiently large n.

• We notice that when c = 4, we have 3n + 2 ≤ 4n for all n ≥ 2.

• Therefore,
  f(n) = O(g(n))
  or, 3n + 2 = O(n)
• with c = 4 and n0 = 2

Big O Notation
• Example: Is n² + n = O(n³)?

• Here we have f(n) = n² + n and g(n) = n³.

• Notice that if n ≥ 1, we have n ≤ n³.
• Also, notice that if n ≥ 1, we have n² ≤ n³.
• Therefore,
  n² + n ≤ n³ + n³ = 2n³
• We have just shown that
  n² + n ≤ 2n³ for all n ≥ 1
• Thus, we have shown that n² + n = O(n³) with c = 2 and n0 = 1

Big O Notation
• Big O visualization: O(g(n)) is the set of functions with the same or smaller order of growth as g(n).

[Figure: g(n) shown together with members of the set O(g(n))]

Big Ω Notation
― The asymptotic lower bound.
― The function f(n) = Ω(g(n)) if and only if there exist positive constants c
and n0 such that f(n) ≥ c·g(n) for all n > n0.
― Read as “f of n is omega of g of n”.

Big Ω Notation
• Example: Is 3n + 2 = Ω(n)?

• Here we have f(n) = 3n + 2 and g(n) = n.

• Notice that for any n ≥ 1, we have 3n + 2 ≥ n.

• Therefore,
  f(n) = Ω(g(n))
  or, 3n + 2 = Ω(n)
• with c = 1 and n0 = 1

Big θ Notation
― The asymptotic tight bound.
― The function f(n) = θ(g(n)) if and only if there exist positive
constants c1, c2 and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n > n0.
― Read as “f of n is theta of g of n”.

Big θ Notation
• Example: Is n² + 5n + 7 = θ(n²)?

• Here we have f(n) = n² + 5n + 7 and g(n) = n².

• When n ≥ 1, we have
  n² + 5n + 7 ≤ n² + 5n² + 7n²
              ≤ 13n²
• Again, when n ≥ 1,
  n² ≤ n² + 5n + 7
• Thus, f(n) = θ(g(n)), or n² + 5n + 7 = θ(n²), with n0 = 1, c1 = 1 and c2 = 13

Big θ Notation
• Subset relations between order-of-growth sets.

[Figure: the sets O(f), Ω(f), and Θ(f) as subsets of the functions R→R, with f as a point inside Θ(f)]
Other Notations (Self-Study)
• Little o Notation
• Little Omega Notation (ω)

Common Rates of Growth
According to the Big O notation, we have five different categories
of algorithms. Their running time complexities are given as:
Faster algorithms

• Constant : O(1)

• Logarithmic : O(logn)

• Linear : O(n)

• Log-linear : O(n logn)

• Polynomial : O(𝒏𝒌 ) where k > 1

• Exponential : O(𝒌𝒏 ) where k > 1


Slower algorithms
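To see how far apart these categories are in practice, the following sketch (added here; the input sizes are arbitrary, and it needs -lm when linking) tabulates approximate operation counts:

#include <stdio.h>
#include <math.h>

int main(void)
{
    long sizes[] = {10, 20, 100, 1000};
    int i;
    printf("%6s %10s %12s %14s %18s\n", "n", "log2 n", "n log2 n", "n^2", "2^n");
    for (i = 0; i < 4; i++) {
        double n = (double)sizes[i];
        printf("%6.0f %10.1f %12.0f %14.0f ", n, log2(n), n * log2(n), n * n);
        if (n <= 30)
            printf("%18.0f\n", pow(2.0, n));
        else
            printf("%18s\n", "> 10^30");   /* far beyond anything practical */
    }
    return 0;
}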

Common Rates of Growth

[Figure: comparison of the common rates of growth]
Properties of Asymptotic Notations
General Properties:
• If f(n) is O(g(n)), then a*f(n) is also O(g(n)), where a is a constant.

• Example:
  f(n) = 2n²+5 is O(n²)
  then 7*f(n) = 7(2n²+5)
              = 14n²+35 is also O(n²)

• Similarly, this property holds for both the Θ and Ω notations. We can say:

  If f(n) is Θ(g(n)), then a*f(n) is also Θ(g(n)), where a is a constant.
  If f(n) is Ω(g(n)), then a*f(n) is also Ω(g(n)), where a is a constant.
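Why this holds (a one-line sketch added here, taking a > 0): if f(n) ≤ c·g(n) for all n ≥ n0, then a*f(n) ≤ (a·c)·g(n) for all n ≥ n0, so a*f(n) = O(g(n)) with witness constant a·c and the same n0.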

Properties of Asymptotic Notations
Reflexive Properties:
• If f(n) is given then f(n) is O(f(n)).

• Example: f(n) = n²
Then f(n) = O(n²) i.e O(f(n))

• Similarly, this property holds for both the Θ and Ω notations. We can say:

  If f(n) is given, then f(n) is Θ(f(n)).
  If f(n) is given, then f(n) is Ω(f(n)).

Properties of Asymptotic Notations
Transitive Properties :
• If f(n) is O(g(n)) and g(n) is O(h(n)) then f(n) = O(h(n)) .

• Example: if f(n) = n , g(n) = n² and h(n)=n³


n is O(n²) and n² is O(n³) then n is O(n³)

• Similarly, this property holds for both the Θ and Ω notations. We can say:
  If f(n) is Θ(g(n)) and g(n) is Θ(h(n)), then f(n) = Θ(h(n)).
  If f(n) is Ω(g(n)) and g(n) is Ω(h(n)), then f(n) = Ω(h(n)).

Properties of Asymptotic Notations
Symmetric Properties :
• If f(n) is Θ(g(n)) then g(n) is Θ(f(n)) .

• Example: f(n) = n² and g(n) = n², then f(n) = Θ(n²) and g(n) = Θ(n²).

• This property holds only for the Θ notation.

Transpose Symmetric Properties :


• If f(n) is O(g(n)) then g(n) is Ω (f(n)).

• Example: f(n) = n, g(n) = n², then n is O(n²) and n² is Ω(n).

• This property holds only for the O and Ω notations.

Properties of Asymptotic Notations
Some More Properties :
1. If f(n) = O(g(n)) and f(n) = Ω(g(n)) then f(n) = Θ(g(n))
2. If f(n) = O(g(n)) and d(n) = O(e(n))
   then f(n) + d(n) = O( max( g(n), e(n) ) )

   Example: f(n) = n, i.e. O(n)
            d(n) = n², i.e. O(n²)
   then f(n) + d(n) = n + n², i.e. O(n²)

3. If f(n) = O(g(n)) and d(n) = O(e(n))
   then f(n) * d(n) = O( g(n) * e(n) )

   Example: f(n) = n, i.e. O(n)
            d(n) = n², i.e. O(n²)
   then f(n) * d(n) = n * n² = n³, i.e. O(n³)
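Why properties 2 and 3 hold (a short sketch added here): if f(n) ≤ c1·g(n) and d(n) ≤ c2·e(n) for all sufficiently large n, then f(n) + d(n) ≤ (c1 + c2)·max( g(n), e(n) ) and f(n)·d(n) ≤ (c1·c2)·g(n)·e(n) for the same n.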

General Rules for (Iterative) Algorithm Analysis
RULE 1- FOR LOOPS:
• The running time of a for loop is at most the running time of the
statements inside the for loop (including tests) times the number of
iterations.
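For example (a minimal fragment added for illustration, with i, n, sum, and a assumed to be declared as in the later fragments), a single loop that does constant work per iteration is O(n):

for( i=0; i<n; i++ )
    sum += a[i];          /* constant work per iteration, n iterations: O(n) */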

(Taken from Chapter 2 of ‘Data Structures & Algorithm Analysis in C’ by Mark A. Weiss.)

General Rules for (Iterative) Algorithm Analysis
RULE 2- NESTED FOR LOOPS:
• Analyze these inside out. The total running time of a statement inside a
group of nested for loops is the running time of the statement
multiplied by the product of the sizes of all the for loops.
• As an example, the following program fragment is O(n²):

for( i=0; i<n; i++ )
    for( j=0; j<n; j++ )
        k++;

(Taken from Chapter 2 of ‘Data Structures & Algorithm Analysis in C’ by Mark A. Weiss.)

General Rules for (Iterative) Algorithm Analysis
RULE 3- CONSECUTIVE STATEMENTS:
• These just add (which means that the maximum is the one that
counts). As an example, the following program fragment, which has
O(n) work followed by O(n²) work, is also O(n²):

for( i=0; i<n; i++ )
    a[i] = 0;
for( i=0; i<n; i++ )
    for( j=0; j<n; j++ )
        a[i] += a[j] + i + j;

(Taken from Chapter 2 of ‘Data Structures & Algorithm Analysis in C’ by Mark A. Weiss.)

General Rules for (Iterative) Algorithm Analysis
RULE 4- lF/ELSE:
• For the fragment

if( cond )
    S1
else
    S2

• the running time of an if/else statement is never more than the
running time of the test plus the larger of the running times of S1 and
S2.
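As a concrete illustration (added here, not from the book), suppose the test is O(1), S1 is a single loop, and S2 is a doubly nested loop over the same array:

if( a[0] == 0 )                  /* O(1) test  */
    for( i=0; i<n; i++ )         /* S1: O(n)   */
        a[i] = 0;
else
    for( i=0; i<n; i++ )         /* S2: O(n²)  */
        for( j=0; j<n; j++ )
            a[i] += a[j];
/* bound: O(1) + max( O(n), O(n²) ) = O(n²) */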

(Taken from Chapter 2 of ‘Data Structures & Algorithm Analysis in C’ by Mark A. Weiss.)

Worst-case, Average-case, Best-case Analysis
• Worst-case running time
✓ This denotes the behaviour of an algorithm with respect to
the worst possible case of the input instance.

✓ The worst-case running time of an algorithm is an upper
bound on the running time for any input.

✓ Therefore, knowing the worst-case running time gives us an
assurance that the algorithm will never go beyond this time limit.

Worst-case, Average-case, Best-case Analysis
• Average-case running time
✓ The average-case running time of an algorithm is an estimate
of the running time for an ‘average’ input.

✓ It specifies the expected behaviour of the algorithm when the
input is randomly drawn from a given distribution.

✓ Average-case analysis typically assumes that all inputs of a given
size are equally likely.

Worst-case, Average-case, Best-case Analysis
• Best-case running time
✓ The term ‘best-case performance’ is used to analyse an algorithm
under optimal conditions.

✓ For example, the best case for a simple linear search on an array
occurs when the desired element is the first in the list.

✓ However, while developing and choosing an algorithm to solve a
problem, we hardly ever base our decision on the best-case
performance.

✓ It is always recommended to improve the average-case and the
worst-case performance of an algorithm instead.
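For instance, the linear search mentioned above, written as a small sketch (the int array and key are assumptions for illustration), shows where each case comes from:

/* Returns the index of key in a[0..n-1], or -1 if it is absent. */
int linear_search( const int a[], int n, int key )
{
    int i;
    for( i = 0; i < n; i++ )     /* best case: key at a[0]              -> O(1) */
        if( a[i] == key )        /* average case: about n/2 comparisons -> O(n) */
            return i;
    return -1;                   /* worst case: key absent or at a[n-1] -> O(n) */
}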

Related Readings
➢ Introduction to Algorithms (CLRS)
• Chapter 3 (3.1)

➢ Fundamentals of Computer Algorithms (Horowitz, Sahni, Rajasekaran)
  • Chapter 1 (1.3.4)

• https://yourbasic.org/algorithms/big-o-notation-explained/

• Asymptotic Notations: https://drive.google.com/file/d/1PaF_NqJeqykBVT63RcSJRtm_2iUFpzMB/view?usp=sharing

Try Yourself!
Q1. Show that,
  a) 3n² + 4n − 2 = Θ(n²)
  b) 3 log n + 100 = O(log n)

Q2. Consider the following three claims:
  1) (n + 5)² = Θ(n²)
  2) 2^(n+1) = O(2^n)
  3) 2^(2n+1) = O(2^n)
Which of these claims are correct?
  (A) 1 and 2
  (B) 1 and 3
  (C) 2 and 3
  (D) 1, 2, and 3

Q3. Consider the following two functions:
  g1(n) = n³ for 0 ≤ n ≤ 10000,  n² for n ≥ 10000
  g2(n) = n  for 0 ≤ n ≤ 100,    n³ for n ≥ 100
Which of the following statements are true?
  (A) g1(n) is O(g2(n))
  (B) g2(n) is O(n)
  (C) g2(n) is O(g1(n))
  (D) g1(n) is O(n³)

Q4. Show that,
  Σ (k = 1 to p) k·n^k = Θ(n^p)
  [p is a positive integer]

Iterative Algorithm Analysis

➢ Analyze the following algorithms.

Or,

➢ Find out the order of growth / time complexity / Big-O of the following
algorithms.

Iterative Algorithm Analysis
Alg()
{
    int i;
    for (i = 1; i <= n; i++)
    {
        printf("Hello!");
    }
}

Iterative Algorithm Analysis
Solution: For this algorithm, the running time mainly depends on the
number of times the loop is executed.

Since the loop will be executed n times, this algorithm is O(n).

Iterative Algorithm Analysis
Alg()
{
    int i = 0, s = 0;
    while (s < n)
    {
        i++;
        s = s + i;
        printf("Hello!");
    }
}

Iterative Algorithm Analysis
Solution: For this algorithm, the running time mainly depends on the number
of times the loop is executed.
Here, i is incremented by 1 and s is incremented by i in each iteration.
Let the loop be executed k times (i.e. after k iterations, s ≥ n).

Now,

    i =  1   2   3   4    5    6   …   k
    s =  1   3   6   10   15   21  …   n

From the table above, we get the following equation:

    1 + 2 + 3 + 4 + ⋯ + k = n
    Or, k(k + 1)/2 = n
    Or, (1/2)k² + (1/2)k = n
    Or, k² ≈ 2n

    (When n is large, k is also large. In that case, we can keep the most
    dominant term k² and ignore the other smaller terms and constants.)

Iterative Algorithm Analysis
    Or, k ≈ √2 · √n

Therefore, this algorithm is O(√n).
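A quick empirical check of this result (a small program added here, not part of the original slides) counts the iterations and compares them with √(2n):

#include <stdio.h>
#include <math.h>

int main(void)
{
    long n = 1000000;                /* try any large n */
    long i = 0, s = 0, count = 0;
    while (s < n) {                  /* same loop as Alg() above */
        i++;
        s = s + i;
        count++;
    }
    printf("n = %ld, iterations = %ld, sqrt(2n) = %.1f\n",
           n, count, sqrt(2.0 * (double)n));
    return 0;
}

For n = 1000000 the loop runs 1414 times, which matches √(2n) ≈ 1414.2.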

Iterative Algorithm Analysis
Alg()
{
    int i;
    for (i = 1; i*i < n; i++)
    {
        printf("Hello!");
    }
}

Iterative Algorithm Analysis
Solution: For this algorithm, the running time mainly depends on the
number of times the loop is executed.

Here, we can rewrite the condition check i*i < n (i.e. i² < n) as i < √n.

So the loop will be executed about √n times, and this algorithm is O(√n).

Iterative Algorithm Analysis
Alg()
{
    int i, j, k;
    for (i = 1; i <= n; i++) {
        for (j = 1; j <= n; j++) {
            for (k = 1; k <= 100; k++) {
                printf("Hello!");
            }
        }
    }
}
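By Rule 2 (nested loops), the innermost printf here runs 100 times for each of the n·n combinations of i and j, i.e. 100n² times, so this algorithm is O(n²). A small instrumented version (added here as a sketch) confirms the count:

#include <stdio.h>

int main(void)
{
    long n = 500, count = 0, i, j, k;
    for (i = 1; i <= n; i++)
        for (j = 1; j <= n; j++)
            for (k = 1; k <= 100; k++)
                count++;             /* stands in for the printf */
    printf("n = %ld, executions = %ld, 100*n*n = %ld\n", n, count, 100 * n * n);
    return 0;
}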
Iterative Algorithm Analysis
Alg()
{
    int i, j, k;
    for (i = 1; i <= n; i++) {
        for (j = 1; j <= i; j++) {
            for (k = 1; k <= 100; k++) {
                printf("Hello!");
            }
        }
    }
}
Iterative Algorithm Analysis
Solution: For this algorithm, the running time mainly depends on the
number of times the innermost loop (loop k) is executed.
Here,

    When i =          1            2            3            4            …    n
    Loop j executes   1 time       2 times      3 times      4 times      …    n times
    Loop k executes   1×100 times  2×100 times  3×100 times  4×100 times  …    n×100 times

From the table above, we get,

    Running time = (1 × 100) + (2 × 100) + (3 × 100) + … + (n × 100)
                 = 100 × (1 + 2 + 3 + ⋯ + n)
                 = 100 × n(n + 1)/2

Iterative Algorithm Analysis
    = 50n² + 50n
    = O(n²)

Therefore, this algorithm is O(n²).

Related Materials
➢ Time Complexity Analysis Of Iterative Programs:
https://www.youtube.com/watch?v=FEnwM-iDb2g

➢ Data Structures & Algorithm Analysis in C (Mark A. Weiss)
  • Chapter 2

➢ Introduction to Algorithms (CLRS)
  • Chapter 2

• Asymptotic Notations: https://drive.google.com/file/d/1PaF_NqJeqykBVT63RcSJRtm_2iUFpzMB/view?usp=sharing

