
DATA STRUCTURES

(CS3401)

Dr. Somaraju Suvvari,


Asst. Prof, Dept. of CSE, NIT Patna.
somaraju@nitp.ac.in; soma2402@gmail.com;

Dr Somaraju Suvvari NITP -- CS3401 1


The Course

DATA STRUCTURES

Lecture – 2
Algorithms

What is an algorithm?
• An algorithm is a sequence of well-defined, simple, unambiguous, and effective statements which, when executed sequentially, produce the desired result in a finite amount of time.

Basic Issues Related to Algorithms

1. How to design algorithms?

2. How to express algorithms?

3. How to prove correctness?

4. How to analyze efficiency (complexity)?

Algorithm design strategies
1. Brute force - follow the definition / try all possibilities

2. Divide and conquer - break the problem into distinct subproblems // discussed later in the course (Unit - 6)

3. Greedy approach - repeatedly do what is best now // discussed later in the course (Unit - 5)

4. Dynamic programming - break the problem into overlapping subproblems

5. Backtracking - try out different sequences of decisions until one is found that "works"

6. Branch-and-bound - a systematic method for solving optimization problems

PROPERTIES OF AN ALGORITHM
1. An algorithm takes zero or more inputs - Input.

2. An algorithm produces one or more outputs - Output.

3. All operations can be carried out in a finite amount of time.

4. An algorithm should be efficient and flexible.

5. An algorithm must terminate after a finite number of steps - Finiteness.

6. Every instruction must be clear and unambiguous - Definiteness.

7. Every instruction must be sufficiently basic that it can, in principle, be carried out by a person using only pencil and paper. It is not enough that each operation be definite; it must also be feasible - Effectiveness.
Sample Example Algorithms
Sequence
Algorithm for adding two numbers
1) A=10;
2) B=20;
3) Sum=A+B;

Decision
Algorithm to test two numbers for equality

Repetition
Algorithm to print the first 10 natural numbers

Example Algorithms

Analysis of Algorithms

Performance of a Program
The performance of a program is the amount of resources, such as computer memory and time, needed to run it.

The performance of most algorithms is measured using two metrics:

1. Space Complexity

2. Time Complexity

Analysis of Algorithms
The analysis of programs derived from two algorithms for solving the same problem should be:
▪ Machine independent (assumes the RAM model)

▪ Language independent

▪ Environment independent (load on the system, ...)

▪ Realistic

Space Complexity
Space Complexity - The space complexity of a program is the amount of memory it needs to run to completion.

The space needed by a program has the following components:

1. Instruction space: the space needed to store the compiled version of the program instructions.

2. Data space: the space needed to store all constant and variable values.

3. Environment stack space: the environment stack is used to save information needed to resume execution of partially completed functions.

Time Complexity
• Time Complexity - The time needed by an algorithm expressed as a function of the
size of a problem is called the TIME COMPLEXITY of the algorithm.

• The time complexity of a program is the amount of computer time it needs to run to
completion.

• The behavior of the complexity as size increases is called the asymptotic time
complexity.

Time Complexity
The time complexity of an algorithm is basically the running time of a program as a function of the input size.

It is the number of operations an algorithm performs to complete its task with respect to the input size.

It depends on how instructions execute:

– Instructions are either dependent on or independent of the input size.

– An instruction that is independent of the input size is said to take constant time.

  For example: int i; // does not depend on the input size

Time Complexity
An instruction that depends on the input size has an execution time that grows with the input size.

Ex: i < n // may depend on the input size (n).

T(P) = C + tp = compile time + running time

T(P) - the total time taken by program P.

Fixed part: C - the compile time, independent of the problem instance.

Variable part: tp - the running time, which depends on the problem instance.

Faster Algorithm vs. Faster CPU
• A faster algorithm running on a slower machine will always win for large enough instances.

• Suppose algorithm S1 sorts n keys in 2n^2 instructions, and computer C1 executes 1 billion (10^9) instructions/sec.

  – When n = 1 million, S1 on C1 takes 2000 sec.

• Suppose algorithm S2 sorts n keys in 50n·log_2 n instructions, and computer C2 executes 10 million (10^7) instructions/sec.

  – When n = 1 million, S2 on C2 takes 100 sec.

How to find time complexity?

Algorithm for adding two numbers:
A=10;                      // 1 time
B=20;                      // 1 time
SUM=A+B;                   // 1 time
printf("sum = %d", SUM);   // 1 time
                           -------------------
                           4 - constant

Algorithm to print the first N natural numbers (while loop):
N=10;                      // 1 time
i=1;                       // 1 time
While(i<=N)                // N+1 times
{
    printf(" %d\t", i);    // N times
    i=i+1;                 // N times
}
                           -------------------------------
                           3N+3

The same with a for loop:
N=10;                      // 1 time
for(i=1; i<=N; i++)        // (1+(N+1)+N) = 2N+2 times
{
    printf(" %d\t", i);    // N times
}
                           -------------------------------
                           3N+3
Linear Loop
for(i=0; i<100; i++)
    statement block;
• Executes 100 times.
• If N = 100, the time complexity is N.

for(i=0; i<100; i=i+2)
    statement block;
• Executes 50 times.
• If N = 100, the time complexity is N/2.

Logarithmic loop
for(i=1; i<1000; i=i*2)
    statement block;
• Executes 10 times.
• If N = 1000, the time complexity is log(N).
Nested Loops

M=10, N=5;
for(i=0; i<M; i++)
    for(j=0; j<N; j++)
        printf("hai");
• The inner loop executes N times for each i; the outer loop executes M times (i = 0 to M-1).
• Hence the time complexity is M*N.

M=10, N=10;
for(i=0; i<M; i++)
    for(j=1; j<N; j=j*2)
        printf("hai");
• The inner loop executes about log(N) times for each i; the outer loop executes M times.
• Hence the time complexity is M*log(N).

M=10;
for(i=0; i<M; i++)
    for(j=0; j<M; j++)
        printf("hai");
• Time complexity is O(M^2) - quadratic.

M=10;
for(i=0; i<M; i++)
    for(j=0; j<=i; j++)
        printf("hai");
i=0: j=0           1 time
i=1: j=0,1         2 times
i=2: j=0,1,2       3 times
...                ...
i=9: j=0,1,...,9   10 times
                   ------------
                   55 times = M(M+1)/2
• Time complexity is O(M^2) - quadratic.
Complexity of Algorithms
The field of computer science that studies the efficiency of algorithms is known as analysis of algorithms.

Algorithms can be evaluated by a variety of criteria.

Most often we shall be interested in the rate of growth of the time or space required to solve
larger and larger instances of a problem.

We will associate with the problem an integer, called the size of the problem, which is a
measure of the quantity of input data.
Complexity of Algorithms
The complexity of an algorithm M is the function f(n) which gives the running time and/or
storage space requirement of the algorithm in terms of the size ‘n’ of the input data.

Complexity of Algorithms
The function f(n), which gives the running time of an algorithm, depends not only on the size ‘n’ of the input data but also on the particular data.

The cases of the complexity function f(n) are:

1. Best Case : The minimum possible value of f(n) is called the best case.

2. Average Case : The expected value of f(n).

3. Worst Case : The maximum value of f(n) for any possible input.

ASYMPTOTIC ANALYSIS

Asymptotic Analysis

Ignore machine-dependent constants; otherwise it is impossible to verify and to compare algorithms.

Look at growth of T(n) as n → ∞ .

T(n) – Time taken by the algorithm for an input size n.

Asymptotic Analysis

The various asymptotic notations are:

1. O ( Big Oh notation )
2. Ω ( Big Omega notation )
3. θ ( Theta notation )

O ( Big-Oh notation )
It is used to define the upper bound of an algorithm in terms of time complexity.

It indicates an asymptotic upper bound of an algorithm for all sufficiently large input values.

Definition:

– Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.

O ( Big Oh notation )

The growth rate of f(n) is less than or equal to the growth rate of g(n).

g(n) is an upper bound on f(n).

Rules for finding Big-Oh
1. If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.,
   – drop lower-order terms
   – drop constant factors

2. Use the simplest possible class of function:
   – say "2n is O(n)" instead of "2n is O(n^2)"

3. Use the simplest expression of the class:
   – say "3n+5 is O(n)" instead of "3n+5 is O(3n)"

4. If T1(n) = O(f(n)) and T2(n) = O(g(n)), then
   – T1(n) + T2(n) = max(O(f(n)), O(g(n)))
   – T1(n) * T2(n) = O(f(n) * g(n))

Big-Oh Examples
• Consider the following f(n) and g(n)...
  f(n) = 3n + 2
  g(n) = n

• To represent f(n) as O(g(n)), there must exist constants c > 0 and n0 ≥ 1 such that f(n) ≤ c·g(n) for all n ≥ n0:
  ⇒ 3n + 2 ≤ c·n

• The condition holds for c = 4 and all n ≥ 2.

• Using Big-Oh notation, the time complexity can therefore be written as:
  3n + 2 = O(n)

• 5n^2 = O(n^2), with c = 5, n0 ≥ 1


Big-Oh Examples
int searchK(int arr[], int n, int k)
{   // iterate over each element in the array
    for (int i = 0; i < n; ++i)
    {   // check whether the ith element is equal to "k"
        if (arr[i] == k)
            return 1;   // return 1 if you find "k"
    }
    return 0;           // return 0 if you didn't find "k"
}

Operation counts:
i = 0            --> 1
i < n            --> n+1 times
i++              --> n times
if(arr[i] == k)  --> n times
return 1         --> 1 (if "k" is in the array)
return 0         --> 1 (if "k" is not in the array)
------------
f(n) = 3n+4, g(n) = n
3n+4 <= c·n, with c = 5 and n >= 2:
n = 2: 3*2+4 = 10 <= 5*2 = 10
n = 3: 3*3+4 = 13 <= 5*3 = 15
Time complexity is O(n).
Big-Oh Examples
• Let f(n) = 2n^2. Then f(n) = O(n^4); f(n) = O(n^3); f(n) = O(n^2) (best answer, asymptotically tight)

• n^2/2 − 3n = O(n^2)

• 1 + 4n = O(n)

• 7n^2 + 10n + 3 = O(n^2)

• log_10 n = log_2 n / log_2 10 = O(log_2 n) = O(log n)

• 10 = O(1), 10^10 = O(1)

• n = O(2^n), but 2^n is not O(n); 2^(10n) is not O(2^n)

Big-Oh Examples
1. 2n^2 + 5n − 6 = O(2^n)

2. 2n^2 + 5n − 6 = O(n^3)

3. 2n^2 + 5n − 6 = O(n^2) // tight

4. 2n^2 + 5n − 6 ≠ O(n)

Ω ( Big Omega notation )
Big-Omega notation is used to define the lower bound of an algorithm in terms of time complexity.

It indicates the minimum time required by an algorithm for all input values.

Definition:

• Given functions f(n) and g(n), we say that f(n) is Ω(g(n)) if there exist positive constants c and n0 such that f(n) ≥ c·g(n) for all n ≥ n0.

Big-Omega Examples
Example : f(n) = 10n^2 + 4n + 2

Let us take g(n) = n^2

c = 10 & n0 = 0

Let us check the above condition:

10n^2 + 4n + 2 ≥ 10n^2 for all n ≥ 0

The condition is satisfied. Hence f(n) = Ω(n^2).

Big-Omega Examples
• 5n2 is (n2)

f(n) is (g(n)) if there is a constant c > 0 and an integer constant n0  1 such that
f(n)  c•g(n) for n  n0

let c = 5 and n0 = 1

• 5n2 is (n)

f(n) is (g(n)) if there is a constant c > 0 and an integer constant n0  1 such that
f(n)  c•g(n) for n  n0

let c = 1 and n0 = 1

Big-Omega Examples
1. 2n^2 + 5n − 6 ≠ Ω(2^n)

2. 2n^2 + 5n − 6 ≠ Ω(n^3)

3. 2n^2 + 5n − 6 = Ω(n^2)

4. 2n^2 + 5n − 6 = Ω(n)

θ ( Theta notation )
• The theta notation is used when the function f(n) can be bounded both from above and below by the same function g(n).

Definition: f(n) = Θ(g(n)) if there exist positive constants c1, c2, and n0 such that

0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0

Theta (θ) Examples
• 5n2 is (n2)

f(n) = 5n2, g(n) = n2

It is true for the values: c1 = 5, c2 = 6 and n0 >= 1

Note: f(n) is (g(n)) if it is (n2) and O(n2).

Theta (θ) Examples
1. 2n^2 + 5n − 6 ≠ Θ(2^n)

2. 2n^2 + 5n − 6 ≠ Θ(n^3)

3. 2n^2 + 5n − 6 = Θ(n^2)

4. 2n^2 + 5n − 6 ≠ Θ(n)

Exercise

Master Theorem
The Master Theorem applies to recurrences of the following form:

    T(n) = a·T(n/b) + f(n)

where a ≥ 1 and b > 1 are constants and f(n) is an asymptotically positive function.
There are 3 cases:
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).

2. If f(n) = Θ(n^(log_b a) · log^k n) with k ≥ 0, then T(n) = Θ(n^(log_b a) · log^(k+1) n).

3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and f(n) satisfies the regularity condition, then T(n) = Θ(f(n)).

Regularity condition: a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n.

Master Theorem (Examples)
1. T(n) = 3T(n/2) + n^2
   Solution:
   f(n) = n^2, a = 3, b = 2, n^(log_b a) = n^(log_2 3)
   Case 3
   T(n) = Θ(n^2)
2. T(n) = 4T(n/2) + n^2
   Solution:
   Case 2
   T(n) = Θ(n^2 log n)
3. T(n) = T(n/2) + 2^n
   Solution:
   Case 3
   T(n) = Θ(2^n)
Master Theorem (Examples)
4. T(n) = 2^n·T(n/2) + n^n
   Solution
   Does not apply, since a = 2^n is not constant

5. T(n) = 64T(n/8) − n^2 log n
   Solution
   Does not apply, since f(n) is not positive

6. T(n) = 16T(n/4) + n
   Solution
   Case 1
   T(n) = Θ(n^2)
Master Theorem (Examples)
7. T(n) = 2T(n/2) + n log n
   Solution
   Case 2
   T(n) = Θ(n log^2 n)

8. T(n) = 2T(n/2) + n/log n
   Solution
   Does not apply (non-polynomial difference between f(n) and n^(log_b a))

9. T(n) = 4T(n/2) + n/log n
   Solution
   Case 1
   T(n) = Θ(n^2)
Master Theorem (Examples)
10. T(n) = 2T(n/4) + n^0.51
    Solution
    Case 3
    T(n) = Θ(n^0.51)

11. T(n) = √2·T(n/2) + log n
    Solution
    Case 1
    T(n) = Θ(√n)

12. T(n) = 3T(n/2) + n
    Solution
    Case 1
    T(n) = Θ(n^(log_2 3))
Numerical Comparison of Different Algorithms

Comparison of Different Algorithms

Thank You

