
Data Structures and Algorithms

Introduction

• Data Structures
– Arrays
– Lists
– Trees
– Graphs

• Algorithms
Algorithms

The word "algorithm" comes from the name of the ninth-century Persian mathematician Abu Ja'far Mohammed Ibn Musa al-Khowarizmi.
Example

Algorithm Max(A, n)
// A is an array of size n
{
    Result := A[1];
    for i := 2 to n do
        if A[i] > Result then Result := A[i];
    return Result;
}
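The Max pseudocode above translates directly into Python. This is a minimal sketch; the name `find_max` and the 0-based indexing are adaptations for Python, not part of the slides.

```python
def find_max(a):
    # Pseudocode starts at A[1]; Python lists are 0-based.
    result = a[0]
    for i in range(1, len(a)):
        if a[i] > result:
            result = a[i]
    return result

print(find_max([3, 7, 2, 9, 4]))  # → 9
```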
What Are Algorithms?

An algorithm must satisfy the following criteria:
1. Input
2. Output
3. Definiteness
4. Finiteness
5. Effectiveness
6. Generality
Write an algorithm to find the speed at which the
electrons travel.
Areas of Study
• How to Devise

• How to Validate

• How to Analyze

• How to Test
Recursion

When a function calls itself, either directly or indirectly.

If this isn't clear, refer to the definition of recursion.


Recursive Algorithms

• Direct recursion: Program A calls A.

• Indirect recursion: Program A calls B, and Program B calls A.
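Both forms can be sketched in Python. The countdown example and the function names here are illustrative, not from the slides.

```python
def a_direct(n):
    # Direct recursion: a_direct calls itself.
    if n <= 0:
        return
    a_direct(n - 1)

def a(n):
    # Indirect recursion: a calls b ...
    if n <= 0:
        return
    b(n - 1)

def b(n):
    # ... and b calls a, completing the cycle.
    a(n)

a_direct(3)
a(3)
```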
Whose Code Is Better?

Whoever punches harder...


Performance Analysis

• Space Complexity: how much memory the program needs.

• Time Complexity: how much CPU time the program needs.
Space Complexity

Space Complexity: the number of storage locations needed by the algorithm.

S(P) = c + Sp
c  = fixed part
Sp = variable part
Example

Algorithm ex1(a, b, c)
{
    return a + b + c;
}

One storage location is needed for each of the variables, hence the space complexity is 3. This is the fixed part.
Example

Algorithm ex2(a, n)
{
    s := 0.0;
    for i := 0 to n do
        s := s + a[i];
    return s;
}

One location each for s, n, and i, plus n locations for a.
Therefore the total number of locations needed is S(P) >= n + 3.
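A Python rendering of ex2 makes the space analysis concrete: the array itself accounts for the n variable locations, while s and the loop variable are the fixed part. The function name `ex2` follows the slide; passing the list instead of a separate n is a Python adaptation.

```python
def ex2(a):
    # The list a occupies n locations; s and the loop variable
    # are a constant number of extra locations, so S(P) >= n + 3.
    s = 0.0
    for x in a:
        s = s + x
    return s

print(ex2([1.0, 2.0, 3.0]))  # → 6.0
```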
Example

Algorithm ex3(a, n)
{
    if (n <= 0) then return 0.0;
    else return ex3(a, n-1) + a[n];
}

Each function call needs:
- one location for n
- one location for a[n]
- one location for the return address

Since the recursion goes n+1 calls deep, the stack space grows linearly with n.
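The recursive sum can be sketched in Python as follows; the index shift to `a[n - 1]` is an adaptation of the 1-based pseudocode.

```python
def ex3(a, n):
    # Each active call holds n, the array reference, and a return
    # address on the stack, so space is proportional to the depth n.
    if n <= 0:
        return 0.0
    return ex3(a, n - 1) + a[n - 1]  # a[n] in 1-based pseudocode

print(ex3([1.0, 2.0, 3.0], 3))  # → 6.0
```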
Time Complexity

Time Complexity: the number of steps executed by the algorithm.

T(P) = compile time + run time

Which is more important? Run time: a program is compiled once but may be run many times.


Example

Algorithm ex1(a, b)
{
    return a + b;
}

There is only one step, so the time complexity is 1.
Counting Steps

Increment a counter after every statement:

Algorithm ex2(a, n)
{
    s := 0.0;
    for i := 0 to n do
    {
        s := s + a[i];
    }
    return s;
}
Example

Count the number of statements, and the number of times the for loop is executed:

Algorithm ex2(a, n)
{
    s := 0.0;
    count := count + 1;
    for i := 0 to n do
    {
        count := count + 1;   // counts the loop test
        s := s + a[i];
        count := count + 1;
    }
    count := count + 1;
    count := count + 1;
    return s;
}
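The counting scheme above can be run directly in Python. This sketch instruments a sum over an n-element list; with two counted statements per iteration plus three outside the loop, the total comes to 2n + 3 steps.

```python
def ex2_counted(a):
    count = 0
    s = 0.0
    count += 1               # assignment to s
    for x in a:
        count += 1           # loop test (once per iteration)
        s = s + x
        count += 1           # loop-body assignment
    count += 1               # final, failing loop test
    count += 1               # return statement
    return s, count

s, steps = ex2_counted([1.0, 2.0, 3.0, 4.0])
print(steps)  # → 2*4 + 3 = 11
```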
Example

Each function call has 2 counted statements. How many times is the recursion called?

Algorithm ex3(a, n)
{
    count := count + 1;
    if (n <= 0) then
    {
        count := count + 1;
        return 0.0;
    }
    else
    {
        count := count + 1;
        return ex3(a, n-1) + a[n];
    }
}
Try It Out

Find the time and space complexity of the bubble sort algorithm.
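One way to approach the exercise is to write the sort out and count its work, as in the earlier examples. This is a straightforward bubble sort sketch: the nested loops perform about n(n-1)/2 comparisons, suggesting O(n^2) time, and the sort is in place, so the extra space is constant.

```python
def bubble_sort(a):
    n = len(a)
    for i in range(n - 1):
        # After pass i, the last i+1 elements are in final position.
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

print(bubble_sort([5, 1, 4, 2, 8]))  # → [1, 2, 4, 5, 8]
```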
Which Looks Better?

16n^3 + 10n^2 + 14n + 1

O(n^3)
Asymptotic Notation

• Big Oh

• Omega
Big Oh

Upper bound.

f(n) is O(g(n)) iff there exist positive constants c and n0 such that
f(n) <= c * g(n) for all n >= n0
Graphical Representation

[Figure: running time versus N; for all N >= n0 the curve c*g(N) lies above f(N), illustrating f(N) = O(g(N)).]
Example

3n + 2 = O(n), since 3n + 2 <= 4n for all n >= 2

1000n^2 + 100n - 6 = O(n^2), since
1000n^2 + 100n - 6 <= 1001n^2 for all n >= 100
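Both bounds are easy to sanity-check numerically; the constants c and thresholds n0 below are taken straight from the examples above, and the upper limit of 1000 is just an arbitrary test range.

```python
# Check 3n + 2 <= 4n for all n >= 2 (c = 4, n0 = 2).
assert all(3 * n + 2 <= 4 * n for n in range(2, 1000))

# Check 1000n^2 + 100n - 6 <= 1001n^2 for all n >= 100 (c = 1001, n0 = 100).
assert all(1000 * n**2 + 100 * n - 6 <= 1001 * n**2 for n in range(100, 1000))

print("both bounds hold")
```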
Big Oh

O(1) means constant computing time.

O(n) means linearly increasing computing time.
Omega

Lower bound.

f(n) is Ω(g(n)) iff there exist positive constants c and n0 such that
f(n) >= c * g(n) for all n >= n0
Graphical Representation

[Figure: running time versus N; for all N >= n0 the curve f(N) lies above c*g(N), illustrating f(N) = Ω(g(N)).]
Example

3n + 2 = Ω(n), since 3n + 2 >= 3n for all n >= 1