
Design & Analysis of Algorithm

Design & Analysis of Algorithm 1


Example of an Algorithm



A Few Classical Examples

Classical Multiplication Algorithms

English

American

A la russe

Divide and Conquer



Classic Multiplication



Multiplication





The Selection Problem



Which algorithm is better?
The algorithms are correct, but which is the best?
• Measure the running time (number of operations needed).
• Measure the amount of memory used.
• Note that the running time of an algorithm increases as the size of the input increases.



Running Time

• Most algorithms transform input objects into output objects.
• The running time of an algorithm typically grows with the input size.
• Average-case time is often difficult to determine.
• We focus on the worst-case running time.
  – Easier to analyze
  – Crucial to applications such as games, etc.

[Figure: running time vs. input size (1000–4000), with best-case, average-case, and worst-case curves]
Experimental Studies

• Write a program implementing the algorithm.
• Run the program with inputs of varying size and composition.
• Use a method like System.currentTimeMillis() to get an accurate measure of the actual running time.
• Plot the results.

[Figure: measured running time in ms (0–9000) plotted against input size (0–100)]


Limitations of Experiments

• It is necessary to implement the algorithm, which may be difficult.
• Results may not be indicative of the running time on other inputs not included in the experiment.
• In order to compare two algorithms, the same hardware and software environments must be used.



Theoretical Analysis
• Uses a high-level description of the
algorithm instead of an implementation
• Characterizes running time as a function of
the input size, n.
• Takes into account all possible inputs
• Allows us to evaluate the speed of an
algorithm independent of the
hardware/software environment
RAM model
• has one processor
• executes one instruction at a time
• each instruction takes "unit time"
• has fixed-size operands, and
• has fixed-size storage (RAM and disk).



Counting Primitive Operations (§3.4)

• By inspecting the pseudocode, we can determine the maximum number of primitive/basic operations executed by an algorithm, as a function of the input size.

Algorithm arrayMax(A, n)                        # operations
  currentMax ← A[0]                             2
  for (i = 1; i < n; i++)                       2n
    (i = 1 once, i < n n times, i++ (n − 1) times, i.e. 1 + n + (n − 1) = 2n)
    if A[i] ≥ currentMax then                   2(n − 1)
      currentMax ← A[i]                         2(n − 1)
  return currentMax                             1
  Total                                         6n − 1
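The pseudocode above translates directly into C; the sketch below (the name array_max and the comments are ours) annotates each line with the operation count tallied in the table:

```c
/* arrayMax from the slide, rendered in C. The comments repeat the
   operation counts from the table above. */
int array_max(const int A[], int n)
{
    int currentMax = A[0];          /* 2 ops: index + assign                      */
    for (int i = 1; i < n; i++)     /* 2n ops: 1 init + n tests + n-1 increments  */
        if (A[i] >= currentMax)     /* 2(n-1) ops: index + compare                */
            currentMax = A[i];      /* 2(n-1) ops, executed in the worst case     */
    return currentMax;              /* 1 op; worst-case total 6n - 1              */
}
```

For example, array_max on {3, 9, 2, 7} with n = 4 returns 9.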



Estimating Running Time
• Algorithm arrayMax executes 6n − 1 primitive operations in the worst case.
• Define:
  a = time taken by the fastest primitive operation
  b = time taken by the slowest primitive operation
• Let T(n) be the worst-case time of arrayMax. Then
    a(6n − 1) ≤ T(n) ≤ b(6n − 1)
• Hence, the running time T(n) is bounded by two linear functions.



Growth Rate of Running Time

• Changing the hardware/software environment
  – affects T(n) by a constant factor, but
  – does not alter the growth rate of T(n).
• The linear growth rate of the running time T(n) is an essential property of algorithm arrayMax.



Functions of Growth Rate



Asymptotic Notations

• Asymptotic notations are a language that allows us to analyze an algorithm's running time by identifying its behavior as the input size for the algorithm increases. This is also known as an algorithm's growth rate.



Asymptotic Notations
• Asymptotic analysis of an algorithm refers to
defining the mathematical boundary or framing of
its run-time performance
• Refers to computing the running time of any operation in mathematical units of computation
• Evaluates the performance of an algorithm in terms of input size
• Calculates how the time (or space) taken by an algorithm increases with the input size
Input Size

• Input size (number of elements in the input)


– size of an array
– polynomial degree
– # of elements in a matrix

– # of bits in the binary representation of the input


– vertices and edges in a graph

Example
• Suppose there are two algorithms:
  – Algorithm 1 takes f(n) = n time and Algorithm 2 takes g(n) = n^2 time.
• As n increases, the running time grows linearly for the first algorithm, while for the second one it grows quadratically.
• When can the running time be the same for both algorithms?
  – If n is sufficiently small, the running times of both will be nearly the same.
3-Cases: Algorithm Analysis
• The time required by an algorithm falls under
three types
– Best Case − Minimum time required for program
execution
– Average Case − Average time required for program
execution
– Worst Case − Maximum time required for program
execution



Asymptotic notation types
• The main idea of asymptotic analysis is to have a measure of an algorithm's efficiency that doesn't depend on machine-specific constants and doesn't require the algorithm to be implemented and the times taken by programs to be compared. Commonly used notations are:
– Ο Notation
– Ω Notation
– θ Notation



Types of Analysis
• Worst case (at most BIG O)
– Provides an upper bound on running time
– An absolute guarantee that the algorithm would not run longer, no
matter what the inputs are
• Best case (at least Omega Ω )
– Provides a lower bound on running time
– Input is the one for which the algorithm runs the fastest

Lower Bound ≤ Running Time ≤ Upper Bound


• Average case (Theta Θ)
– Provides a prediction about the running time

Big Oh Notation - O
• Express the upper bound of an algorithm's running
time
• Describes the worst-case scenario
• It's a measure of the longest amount of time it
could possibly take for the algorithm to complete.
• Formally, for non-negative functions, f(n) and g(n),
if there exists an integer n0 and a constant c > 0
such that for all integers n > n0, f(n) ≤ cg(n), then
f(n) is Big O of g(n).
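As a concrete instance of the definition: f(n) = 3n + 2 is Big O of n, witnessed by c = 4 and n0 = 1, since 3n + 2 ≤ 4n for all n ≥ 2. A small C sketch (function names are ours, purely illustrative) checks the inequality over a finite range:

```c
/* f(n) = 3n + 2: a candidate running-time function. */
long f(long n) { return 3 * n + 2; }

/* Check the Big O condition f(n) <= c * g(n), with g(n) = n, for all
   n in (n0, limit]. A finite range can only support, not prove, the
   asymptotic claim. */
int holds_big_o(long c, long n0, long limit)
{
    for (long n = n0 + 1; n <= limit; n++)
        if (f(n) > c * n)
            return 0;   /* inequality fails at this n   */
    return 1;           /* bound held on the whole range */
}
```

With c = 4 and n0 = 1 the check passes; with c = 3 it fails immediately, since 3n + 2 > 3n for every n.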
How analysis can be done?
• Ideal Solution
• Express running time as a growth function of
the input size n (i.e., f(n)).
• Compare different functions corresponding to
running times.
• Such an analysis is independent of machine
time, programming style, etc.

Big Oh Notation - O



Big O- Constant time
#include <stdio.h>
int main(void)
{
    printf("Hello World!!!");
    return 0;
}
• In the above code, "Hello World!!!" prints only once on the screen.
• So, the time complexity is constant: O(1), i.e.
  – every time, a constant amount of time is required to execute the code, no matter which operating system or machine configuration you are using.
Big O- Constant time
• O(1)

• Function runs in O(1) time relative to its input.


The size of input is 1 or 1,000 but the function
would still require just one step.



Big O- Linear time

• The code for this slide prints "Hello World!!!" N times, so its time complexity is O(N).
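The slide's code did not survive extraction; it is presumably a simple counted loop along these lines (a sketch, with the function name print_n_times our own):

```c
#include <stdio.h>

/* Print the message n times. The body executes exactly n times, so
   the running time grows linearly with n, i.e. O(n). The function
   returns the iteration count to make that claim checkable. */
int print_n_times(int n)
{
    int count = 0;
    for (int i = 0; i < n; i++) {
        printf("Hello World!!!\n");
        count++;
    }
    return count;
}
```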
Big O- Linear time
• O(n)

• Function runs in O(n) time relative to its input.

  If the size of the input is 10, it would print 10 values, and so on.



Big O- Quadratic time
• O(n^2)

• Nested loops: each loop runs n times.

• The function runs in O(n^2) time relative to its input.
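A minimal C sketch of the nested-loop pattern (the function name is ours); counting the executions of the innermost statement shows the n × n growth:

```c
/* Two nested loops of n iterations each: the innermost statement
   runs n * n times, so the running time is O(n^2). */
long nested_steps(int n)
{
    long steps = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            steps++;    /* stands in for a constant-time body */
    return steps;
}
```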
Asymptotic Notation
• Big-Omega Notation Ω
• This is almost the same definition as Big O, except that "f(n) ≥ cg(n)".
• This makes g(n) a lower bound function, instead of an upper bound
function.
• It describes the best that can happen for a given data size.

• For non-negative functions, f(n) and g(n), if there exists an integer n0 and a
constant c > 0 such that for all integers n > n0, f(n) ≥ cg(n), then f(n) is
omega of g(n). This is denoted as "f(n) = Ω(g(n))".
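Mirroring the Big O case: f(n) = n^2 is omega of n with c = 1 and n0 = 0, since n^2 ≥ n for every n ≥ 1. A finite-range C sanity check in the same spirit (names are ours):

```c
/* f(n) = n^2: a quadratic running-time function. */
long f_quad(long n) { return n * n; }

/* Check the Big Omega condition f(n) >= c * g(n), with g(n) = n, for
   all n in (n0, limit]; a finite-range check only, not a proof. */
int holds_big_omega(long c, long n0, long limit)
{
    for (long n = n0 + 1; n <= limit; n++)
        if (f_quad(n) < c * n)
            return 0;   /* lower bound fails at this n */
    return 1;
}
```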

Asymptotic notations (cont.)
• Ω-notation

  Ω(g(n)) is the set of functions with a larger or the same order of growth as g(n).

Asymptotic Notation

• Theta Notation Θ
• Theta Notation For non-negative functions, f(n) and g(n), f(n) is theta of
g(n) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)). This is denoted as "f(n) =
Θ(g(n))".

This is basically saying that the function, f(n) is bounded both from the top
and bottom by the same function, g(n).

Asymptotic notations (cont.)
• Θ-notation

  Θ(g(n)) is the set of functions with the same order of growth as g(n).

Asymptotic notations (cont.)
• Little-O Notation
• For non-negative functions, f(n) and g(n), f(n) is little o of g(n) if and only if
f(n) = O(g(n)), but f(n) ≠ Θ(g(n)). This is denoted as "f(n) = o(g(n))".
• This represents a loose bounding version of Big O. g(n) bounds from the
top, but it does not bound the bottom.

• Little Omega Notation


• For non-negative functions, f(n) and g(n), f(n) is little omega of g(n) if and
only if f(n) = Ω(g(n)), but f(n) ≠ Θ(g(n)). This is denoted as "f(n) = ω(g(n))".

• It bounds from the bottom, but not from the top.



Example
• Associate a "cost" with each statement.
• Find the "total cost" by finding the total number of times each statement is executed.

Algorithm 1                      Cost
arr[0] = 0;                      c1
arr[1] = 0;                      c1
arr[2] = 0;                      c1
...
arr[N-1] = 0;                    c1
-----------
c1 + c1 + ... + c1 = c1 x N

Algorithm 2                      Cost
for (i = 0; i < N; i++)          c2
    arr[i] = 0;                  c1
-------------
(N+1) x c2 + N x c1 = (c2 + c1) x N + c2
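Algorithm 2 above, as runnable C (a sketch; the name zero_array is ours): the loop test executes N + 1 times and the body N times, which is exactly the (N+1) × c2 + N × c1 tally.

```c
/* Algorithm 2 from the table: zero the first n entries. The loop
   test runs n + 1 times (cost c2 each) and the assignment n times
   (cost c1 each), giving (n+1)*c2 + n*c1 in total. */
void zero_array(int arr[], int n)
{
    for (int i = 0; i < n; i++)
        arr[i] = 0;
}
```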

Another Example
• Algorithm 3                     Cost
  sum = 0;                        c1
  for (i = 0; i < N; i++)         c2
      for (j = 0; j < N; j++)     c2
          sum += arr[i][j];       c3
  ------------
  c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N^2
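Algorithm 3 as runnable C (a sketch; the name matrix_sum is ours). The cost tally above corresponds to the loop tests plus the N^2 additions in the inner body:

```c
/* Algorithm 3 from the slide: sum every entry of an n-by-n matrix.
   The inner statement runs n * n times (cost c3 each), on top of
   the loop-test costs counted above. */
int matrix_sum(int n, int arr[n][n])
{
    int sum = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            sum += arr[i][j];
    return sum;
}
```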

