
20-4-21

PCC-CS404
Design and Analysis of Algorithms

Introduction Class
And
Revision Class
Algorithm
An algorithm is a step-by-step procedure that
defines a set of instructions to be executed in a
certain order to obtain the desired output.
Characteristics of an Algorithm
• Unambiguous
• Input
• Output
• Finiteness
• Feasibility
• Independent
Need of Algorithm
• To understand the basic idea of the problem.

• To find an approach to solve the problem.

• To improve the efficiency of existing techniques.

• To understand the basic principles of designing the algorithms.

• To compare the performance of the algorithm with respect to other techniques.

• It is the best way to describe a solution without going into implementation details.

• An algorithm gives the designer a clear description of the requirements and the goal of
the problem.

Need of Algorithm
• A good design can produce a good solution.

• To understand the flow of the problem.

• To measure the behavior (or performance) of the methods in all cases (best case,
worst case, average case).

• With the help of an algorithm, we can also identify the resources (memory, input/output
cycles) it requires.

• With the help of algorithms, we convert an art into a science.

• To understand the principles of design.

• We can measure and analyze the time and space complexity of a problem with respect
to input size without implementing and running the algorithm, which reduces the cost of
design.
Algorithm Design Techniques
• Divide and Conquer Approach

•  Greedy Technique

• Dynamic Programming

• Branch and Bound

• Backtracking Algorithm

•  Randomized Algorithms

Analysis of algorithm
• The analysis is a process of estimating the efficiency of an algorithm. There
are two fundamental parameters on which we can analyze an
algorithm:

• Space Complexity: The space complexity can be understood as the amount
of space required by an algorithm to run to completion.

• Time Complexity: Time complexity is a function of the input size n that refers to
the amount of time needed by an algorithm to run to completion.
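As a small illustrative sketch (not from the slides; the function name and example are assumptions), summing an array shows both measures at once: the loop touches each of the n elements once, so the time grows linearly with n, while only a single accumulator is kept, so the extra space stays constant.

def array_sum(a):
    # Time: one addition per element, so running time grows linearly with n = len(a).
    # Space: only the accumulator 'total' is kept, so extra space is constant in n.
    total = 0
    for x in a:
        total += x
    return total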

Analysis of algorithm
• Generally, we make three types of analysis, which are as follows:

• Worst-case time complexity: For 'n' input size, the worst-case time complexity can be defined as
the maximum amount of time needed by an algorithm to complete its execution. Thus, it is
nothing but a function defined by the maximum number of steps performed on an instance
having an input size of n.

• Average case time complexity: For 'n' input size, the average-case time complexity can be
defined as the average amount of time needed by an algorithm to complete its execution. Thus,
it is nothing but a function defined by the average number of steps performed on an instance
having an input size of n.

• Best case time complexity: For 'n' input size, the best-case time complexity can be defined as the
minimum amount of time needed by an algorithm to complete its execution. Thus, it is nothing
but a function defined by the minimum number of steps performed on an instance having an
input size of n.
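As a hedged illustration (not part of the original slides; the function below is assumed for demonstration), linear search makes the three cases concrete: the best case finds the key at the first position, the worst case scans all n elements, and the average case scans roughly half of them.

def linear_search(a, key):
    # Returns the index of key in a, or -1 if key is absent.
    for i in range(len(a)):   # best case: key == a[0], one comparison
        if a[i] == key:       # worst case: key is last or absent, n comparisons
            return i          # average case (key present, positions equally likely): about n/2 comparisons
    return -1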

Analyzing Algorithms
• Predict the amount of resources required:
• memory: how much space is needed?
• computational time: how fast does the algorithm run?

• FACT: running time grows with the size of the input

Def: Running time = the number of primitive operations (steps)
executed before termination
– Arithmetic operations (+, -, *), data movement, control, decision making (if,
while), comparisons, etc.

Algorithm Analysis: Example
• Algo: MIN(a[1], …, a[n])
    m ← a[1];
    for i ← 2 to n
        if a[i] < m
            then m ← a[i];
• Running time:
– the number of primitive operations (steps) executed
before termination
T(n) = 1 [first step] + n [for loop] + (n-1) [if condition] +
(n-1) [the assignment in then, executed every iteration in the worst case] = 3n - 1
• Order (rate) of growth:
– The leading term of the formula
– Expresses the asymptotic behavior of the algorithm
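A minimal Python sketch of the MIN pseudocode above (the step-counting comments are an interpretation of the slide's tally, not a standard convention): charging one step for the initial assignment, n steps of loop control, n-1 comparisons, and n-1 assignments in the worst case reproduces T(n) = 3n - 1.

def find_min(a):
    m = a[0]                    # 1 step
    for i in range(1, len(a)):  # loop control: n steps in the slide's count
        if a[i] < m:            # n - 1 comparisons
            m = a[i]            # at most n - 1 assignments; all execute when a is strictly decreasing
    return m                    # worst-case total: 1 + n + (n - 1) + (n - 1) = 3n - 1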

Asymptotic Analysis

Asymptotic analysis of an algorithm refers to defining the
mathematical bounds/framing of its run-time performance.

Note: Asymptotic analysis is input bound, i.e., if there is no
input to the algorithm, it is concluded to work in constant time.

The time required by an algorithm falls under three types:

• Best Case
• Average Case
• Worst Case

Asymptotic Analysis(cont.)
• Best Case Complexity:
– the function defined by the minimum number of steps
taken on any instance of size n
• Average Case Complexity:
– the function defined by the average number of steps
taken on any instance of size n

• Worst Case Complexity:
– the function defined by the maximum number of
steps taken on any instance of size n
Best, Worst, and Average Case Complexity
[Figure: number of steps vs. N (input size); the worst-case curve lies above the average-case curve, which lies above the best-case curve.]
Asymptotic Notations

Commonly used asymptotic notations to express the
running time complexity of an algorithm:
• Ο Notation
• Ω Notation
• θ Notation

Big-O
f(n) = O(g(n)): there exist positive constants c and n0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n0

What does it mean?
– If f(n) = O(n²), then:
• f(n) can be larger than n² sometimes, but…
• We can choose some constant c and some value n0 such that for
every value of n larger than n0: f(n) < c·n²
• That is, for values larger than n0, f(n) is never more than a constant
multiplier greater than n²
• Or, in other words, f(n) does not grow more than a constant factor
faster than n².
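A small numerical sketch of the definition (the sample function and constants are assumptions chosen for illustration): with f(n) = 3n² + 10n, the witnesses c = 4 and n0 = 10 satisfy f(n) ≤ c·n² for every n ≥ n0, so f(n) = O(n²).

def f(n):
    return 3 * n * n + 10 * n          # sample function, assumed for illustration

c, n0 = 4, 10                          # candidate witnesses for f(n) = O(n^2)
# 3n^2 + 10n <= 4n^2 is equivalent to 10n <= n^2, i.e. n >= 10
assert all(f(n) <= c * n * n for n in range(n0, 10000))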

Big-O(cont.)
[Figure: Visualization of O(g(n)); for n ≥ n0, f(n) stays below c·g(n).]

Big-O(cont.)

Example: show that n² = O(n²).

Consider f(n) = n². Then n² ≤ c·n² holds whenever c ≥ 1, so choose c = 1 and n0 = 1.

Big-O(cont.)

Example:

• Prove that 20n² + 2n + 5 = O(n²).

• Let c = 21 and n0 = 4.
• 21n² > 20n² + 2n + 5 for all n ≥ 4
⇔ n² > 2n + 5 for all n ≥ 4,
which is TRUE.
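The inequality in the proof can also be spot-checked numerically; this small sketch (a quick check, not part of the slides) confirms 20n² + 2n + 5 ≤ 21n² for a range of n starting at n0 = 4.

f = lambda n: 20 * n * n + 2 * n + 5   # the function from the proof
c, n0 = 21, 4                          # witnesses used above
assert all(f(n) <= c * n * n for n in range(n0, 10000))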

Big Omega – Notation
• () – A lower bound
f  n    g  n   : there exist positive constants c and n 0 such that
0  f  n   cg  n  for all n  n0

Consider F(n) = n2
– n2 = (n)
– Let c = 1, n0 = 2
– For all n  2, n2 > 1  n

Big Omega – Notation(cont.)
Visualization of (g(n))
f(n)

cg(n)

n0

Θ-Notation

Θ provides a tight bound
f(n) = Θ(g(n)): there exist positive constants c1, c2, and n0 such that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0

f(n) = Θ(g(n)) ⇔ f(n) = O(g(n)) AND f(n) = Ω(g(n))
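As a hedged sketch tying the two bounds together (the function and constants are assumptions reused from the Big-O example): f(n) = 20n² + 2n + 5 satisfies 20n² ≤ f(n) ≤ 21n² for all n ≥ 4, so f(n) = Θ(n²) with c1 = 20, c2 = 21, n0 = 4.

f = lambda n: 20 * n * n + 2 * n + 5
c1, c2, n0 = 20, 21, 4                 # witnesses for the Theta bound
assert all(c1 * n * n <= f(n) <= c2 * n * n for n in range(n0, 10000))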

Θ-Notation(cont.)
[Figure: Visualization of Θ(g(n)); for n ≥ n0, f(n) lies between c1·g(n) and c2·g(n).]

Classifying functions by their
Asymptotic Growth Rates
• O(g(n)), Big-Oh of g of n, the Asymptotic Upper Bound;
• Θ(g(n)), Theta of g of n, the Asymptotic Tight Bound; and
• Ω(g(n)), Omega of g of n, the Asymptotic Lower Bound.
Growth of Functions

END

