• Most algorithms transform input objects into output objects.
• The running time of an algorithm typically grows with the input size.
• Average-case time is often difficult to determine.
[Figure: running time versus input size, with curves for the best case, average case, and worst case]
Performance Analysis of Algorithms:
Performance analysis of an algorithm is the process of estimating the amount of
computer memory and time required to run the algorithm.
1. Time Complexity: the time required to complete the task of the algorithm.
2. Space Complexity: the memory space required to complete the task of the algorithm.
Space and Time Calculation
By inspecting the pseudocode, we can determine the data space needed and
the maximum number of primitive operations executed by an algorithm.
Primitive operations, for example:
• Assigning a value to a variable
• Evaluating an expression
• Evaluating a condition
• Indexing into an array
• Calling a method
• Returning from a method
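The primitive operations listed above can be pointed out in a small sketch (Python is used here for illustration; the function and its name are hypothetical):

```python
def find_first_even(a):
    n = len(a)                 # assigning a value to a variable
    for i in range(n):
        if a[i] % 2 == 0:      # indexing into an array + evaluating a condition
            return a[i]        # returning from a method
    return None                # returning from a method

result = find_first_even([3, 5, 8, 9])   # calling a method
```

Counting how often each such operation executes is the basis of all the analyses that follow.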
PROBLEM: Exchanging (swapping) the contents of two variables
Algorithm: swap
Input: a, b
Output: contents of a and b exchanged
Version 1 (using a temporary variable):
{
t := a;
a := b;
b := t;
}
Version 2 (without a temporary variable):
{
a := a+b;
b := a-b;
a := a-b;
}
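Both swap versions translate directly into code; a minimal sketch:

```python
def swap_with_temp(a, b):
    # Version 1: uses a temporary variable t
    t = a
    a = b
    b = t
    return a, b

def swap_arithmetic(a, b):
    # Version 2: no temporary variable; relies on addition and subtraction
    a = a + b
    b = a - b   # b now holds the original a
    a = a - b   # a now holds the original b
    return a, b
```

Version 2 saves the temporary variable, but in languages with fixed-width integers the intermediate a+b can overflow (Python integers are unbounded, so the sketch is safe here).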
Algorithm: Sum of two numbers
Input: a, b
Output: sum
Algorithm Sum(a,b)
{
c := a+b;
}
Memory <-- 3 DS (data space)
Time <-- 1 assignment + 1 addition

Algorithm: Max of two numbers
Input: a, b
Output: max
Algorithm Maximum(a,b)
{
if a > b then
max := a;
else
max := b;
}
Memory <-- 3 DS (data space)
Time <-- 1 condition + 1 assignment
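The operation counts quoted for Maximum can be checked by tallying the primitive operations explicitly (a sketch; the counter dictionary is illustrative, not part of the original algorithm):

```python
def maximum(a, b):
    ops = {"conditions": 0, "assignments": 0}
    ops["conditions"] += 1          # the single a > b test
    if a > b:
        ops["assignments"] += 1     # max := a
        result = a
    else:
        ops["assignments"] += 1     # max := b
        result = b
    return result, ops
```

Whichever branch is taken, exactly one condition and one assignment execute, matching the slide's count.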
Let us assume that the multiplication operator (*) is not available, and we must
compute the product of two integers m and n.
Algorithm: * in terms of repeated + operations
Version 1:
p := 0;
for i := 1 to m do
p := p + n;
Version 2:
p := 0;
for i := 1 to n do
p := p + m;
For example, let m = 12 and n = 6:
Version 1: p := p + n will be executed 12 times (m times)
Version 2: p := p + m will be executed 6 times (n times)
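The two versions can be compared directly; a Python sketch that also counts how many additions each performs:

```python
def multiply_v1(m, n):
    """Version 1: p := p + n, executed m times."""
    p, additions = 0, 0
    for _ in range(m):
        p += n
        additions += 1
    return p, additions

def multiply_v2(m, n):
    """Version 2: p := p + m, executed n times."""
    p, additions = 0, 0
    for _ in range(n):
        p += m
        additions += 1
    return p, additions
```

For m = 12 and n = 6, version 1 performs 12 additions while version 2 performs only 6: both compute the same product, but looping over the smaller operand is cheaper.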
Space Complexity:
• The space complexity of an algorithm is the amount of memory it needs to run to completion.
• Fixed part: space needed for instructions, simple variables, and constants, independent of the
input and output sizes.
1. Simple variables such as int a (2 bytes), float b (4 bytes), etc.
2. Constants
Denoted by C.
• Variable part: space needed for instance variables, whose size depends on the particular
problem instance.
1. Dynamically sized arrays, i.e. a[]
2. Recursion stack space
Denoted by Sp.
The memory space S(P) needed by a problem or algorithm P is:
S(P) = C + Sp
Space Complexity Calculations:
Example 1:
Algorithm Fun1(a, b, c)
{
return a+b+b*c+a*(a+b);
}
What is S(P) = C + Sp? Here a, b, and c each occupy a fixed amount of space and no
space depends on the input size, so Sp = 0 and S(P) = C, a constant.
Time Complexity
• Time complexity is the time required to complete the task of the algorithm.
• Time complexity is the sum of two components:
• Fixed part: compile time, independent of the problem instance.
- Once compiled, the program can be run several times without recompilation.
Note: denoted by C.
• Variable part: run time, dependent on the particular problem instance.
Note: denoted by Tp.
Time Complexity (contd.)
The time complexity T(P) needed by a problem or algorithm P consists of these two components:
T(P) = C + Tp
Time Complexity Calculation:
1. Using the Tabular Method:
We compute a step count for each statement (steps per execution × frequency) and build a table.
Example 1: Iterative function for the sum of array elements:
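The table for the iterative sum (missing from the extracted slide) can be reconstructed for the standard version of the algorithm; a sketch with the tabular counts as comments:

```python
def array_sum(a):
    # Statement            s/e   frequency   total
    # s := 0                1        1         1
    # for i := 1 to n do    1      n + 1     n + 1   (the last test exits the loop)
    # s := s + a[i]         1        n         n
    # return s              1        1         1
    #                              total = 2n + 3 steps
    s = 0
    for x in a:
        s = s + x
    return s

def tabular_step_count(n):
    # closed form read off the table above
    return 2 * n + 3
```

The frequency column is the key idea: the loop header runs one more time than the body, because the final (false) test is still a step.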
Example 2: Matrix Addition
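For matrix addition of two m×n matrices, the tabular counts come out as 2mn + 2m + 1 under the usual convention that each loop-header test is one step. A sketch that counts the steps as it goes (the explicit header tests mimic the pseudocode's for-loop conditions):

```python
def matrix_add(A, B):
    m, n = len(A), len(A[0])
    steps = 0
    C = [[0] * n for _ in range(m)]
    for i in range(m + 1):          # outer header tested m + 1 times
        steps += 1
        if i == m:
            break
        for j in range(n + 1):      # inner header tested n + 1 times per row
            steps += 1
            if j == n:
                break
            C[i][j] = A[i][j] + B[i][j]
            steps += 1              # the assignment executes m*n times
    return C, steps                 # steps = (m+1) + m(n+1) + mn = 2mn + 2m + 1
```

Note the asymmetry: the outer loop variable contributes the extra 2m term, so nesting the loops the other way around would give 2mn + 2n + 1 instead.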
2. Using the Count Variable Method:
We determine the step count of a program by introducing a new global variable, count, into
the program and incrementing it wherever statements execute.
2. Count Variable Method example
Algorithm sum(a,n)
{
s := 0;              // 1
for i := 1 to n do   // n + 1
s := s + a[i];       // n
return s;            // 1
}                    // total: 2n + 3 steps
Space: S(P) = n + 3 (n words for the array a, plus one each for s, i, and n)
2. Count Variable Method example (count = 0 initially):
Algorithm sum(a,n)
{
s := 0;
count := count + 1;  // for the s := 0 assignment
for i := 1 to n do
{
count := count + 1;  // for the for-loop condition (true)
s := s + a[i];
count := count + 1;  // for the assignment
}
count := count + 1;  // for the last for-loop condition test (false)
count := count + 1;  // for the return
return s;
}
On termination, count = 2n + 3.
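The instrumented algorithm above runs as written; a Python sketch confirming that count comes out to 2n + 3:

```python
def sum_with_count(a):
    count = 0
    s = 0
    count += 1              # for the s := 0 assignment
    for x in a:
        count += 1          # for the loop condition (true)
        s = s + x
        count += 1          # for the assignment
    count += 1              # for the final loop condition test (false)
    count += 1              # for the return
    return s, count
```

For a 3-element array this yields count = 9 = 2·3 + 3, agreeing with the tabular method.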
Example:
t_best  : the time for the best case, i.e. the earliest exit
t_worst : the time for the worst case, i.e. the most delayed exit
Asymptotic Notations:
• Asymptotic notation deals with the behavior of functions in the limit, i.e. for sufficiently
large values of the parameters.
There are mainly 3 types of notations:
1. Big-Oh (or Order) Notation (O)
2. Omega Notation (Ω)
3. Theta Notation (Ɵ)
1. Big-Oh Notation:
• O-notation provides an asymptotic upper bound on a function f(n). The upper bound on
f(n) corresponds to the worst case: the function does not consume more than this much
computing time.
• The function f(n) = O(g(n)) ["read as f of n is order of g of n"]
if and only if there exist positive constants c and n0 such that
f(n) <= c*g(n) for all n >= n0.
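As a concrete instance of the definition, take f(n) = 3n + 2. Choosing c = 4 and n0 = 2 gives 3n + 2 <= 4n for all n >= 2, so 3n + 2 = O(n). A quick spot-check of the defining inequality:

```python
def f(n):
    return 3 * n + 2

c, n0 = 4, 2
# f(n) <= c * g(n) with g(n) = n must hold for every n >= n0
big_oh_holds = all(f(n) <= c * n for n in range(n0, 1000))
```

The inequality 3n + 2 <= 4n rearranges to n >= 2, which is exactly why n0 = 2 works.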
2. Omega Notation:
• Ω-notation provides an asymptotic lower bound on a function f(n). The lower bound on
f(n) corresponds to the best case: the function does not consume less than this much
computing time.
• The function f(n) = Ω(g(n)) ["read as f of n is omega of g of n"]
if and only if there exist positive constants c and n0 such that
f(n) >= c*g(n) for all n >= n0.
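Similarly for Ω: with f(n) = 3n + 2, choosing c = 3 and n0 = 1 gives f(n) >= 3n for all n >= 1, so 3n + 2 = Ω(n):

```python
def f(n):
    return 3 * n + 2

c, n0 = 3, 1
# f(n) >= c * g(n) with g(n) = n must hold for every n >= n0
omega_holds = all(f(n) >= c * n for n in range(n0, 1000))
```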
3. Theta Notation:
• For some functions, the lower and upper bounds coincide, i.e. Ω and O hold with the same
function g(n).
• In that case f(n) is bounded tightly from both sides; the slides describe this as the
average case.
• The function f(n) = Ɵ(g(n)) ["read as f of n is theta of g of n"]
if and only if there exist positive constants c1, c2, and n0 such that
c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
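Combining the two previous bounds: for f(n) = 3n + 2, the constants c1 = 3, c2 = 4, and n0 = 2 satisfy c1*n <= f(n) <= c2*n for all n >= n0, so 3n + 2 = Ɵ(n):

```python
def f(n):
    return 3 * n + 2

c1, c2, n0 = 3, 4, 2
# the two-sided inequality c1*g(n) <= f(n) <= c2*g(n) with g(n) = n
theta_holds = all(c1 * n <= f(n) <= c2 * n for n in range(n0, 1000))
```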
THANK YOU