
Harcourt Butler Technical University, Kanpur
Session (2022-2023)

Design & Analysis of Algorithm
(ECS-355)

Unit-I Notes

Submitted to: Dr. Imran Khan, Assistant Professor
Submitted by: Pranjal Dhar Dwivedi, 200104037

INDEX
1. Course Outcomes
2. Algorithm
3. Analysis of Algorithm
4. Time Complexity Analysis and Calculation
5. Master's Theorem
6. Searching Algorithms
7. Sorting Algorithms

ALGORITHM
A finite set of instructions that specifies a sequence of operations to be carried out in order to solve a specific problem or class of problems is called an Algorithm.

Example:
Q. Write an algorithm for travelling to New Delhi.
Algorithm:
1. Go to an appropriate portal for booking a train ticket, i.e. a mobile application or a website.
2. Input the necessary details such as the boarding and destination stations, the date of boarding and the desired class.
3. Select a train from the given options.
4. Enter the number of passengers boarding the train and their respective details.
5. Select the mode of payment and pay the required amount through the payment portal.
6. Collect the now-confirmed tickets.
7. Reach the railway station at the required time and board the train from the appropriate platform.
8. Finally, leave the train at the destination.

Analysis of Algorithms
Analysis of an algorithm is the process of evaluating its problem-solving capability in terms of the time and space required (the size of memory needed for storage during execution). However, the main concern of analysis of algorithms is the required time, i.e. performance.

Time Complexity: Time complexity is defined as the process of determining a formula for the total time required for the execution of an algorithm. This calculation is entirely independent of the implementation and the programming language.

Space Complexity: Space complexity is defined as the process of determining a formula that predicts how much memory space is required for the successful execution of the algorithm. The memory space considered is generally the primary memory.

Types of Algorithm Analysis:

1. Best case
2. Worst case
3. Average case

• Best case: Defines the input for which the algorithm takes the minimum time. The best case gives the lower bound of an algorithm. Example: in linear search, the best case occurs when the searched element is present at the first location of a large data set.
• Worst case: Defines the input for which the algorithm takes the maximum time. The worst case gives the upper bound of an algorithm. Example: in linear search, the worst case occurs when the searched element is not present at all.
• Average case: In the average case we take all random inputs, calculate the computation time for all of them, and then divide by the total number of inputs.

Average case = sum of the times over all random cases / total number of cases
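The three cases above can be illustrated by counting comparisons in a linear search. This is a minimal sketch; the helper name `linear_search_comparisons` is hypothetical, not part of any library.

```python
# Hypothetical sketch: count the comparisons linear search makes,
# to illustrate best, worst and average case behaviour.
def linear_search_comparisons(arr, target):
    count = 0
    for value in arr:
        count += 1
        if value == target:
            return count
    return count  # target absent: every element was compared

data = list(range(1, 101))  # 100 elements: 1..100

best = linear_search_comparisons(data, 1)    # target at first position
worst = linear_search_comparisons(data, -1)  # target absent
# Average over all possible positions of the target:
average = sum(linear_search_comparisons(data, t) for t in data) / len(data)

print(best, worst, average)  # 1 100 50.5
```

The best case touches one element, the worst case touches all n, and the average over all positions is (n + 1)/2 comparisons, matching the three cases above.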

Asymptotic Analysis

It is a technique of representing limiting behaviour. The methodology has applications across science. It can be used to analyze the performance of an algorithm for some large data set.

1. Big-Oh Notation: Big-oh is the formal method of expressing the upper bound of an algorithm's running time. It is a measure of the longest amount of time the algorithm can take. The function f(n) = O(g(n)) [read as "f of n is big-oh of g of n"] if and only if there exist positive constants k and n0 such that

f(n) ≤ k·g(n) for all n > n0

Hence, the function g(n) is an upper bound for the function f(n), as g(n) grows at least as fast as f(n).

Hence, the complexity of f(n) can be represented as O(g(n)).
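The definition can be checked numerically. As a sketch, take f(n) = 3n + 2, which is O(n); the witnesses k = 4 and n0 = 2 are one valid choice (chosen here for illustration).

```python
# Numeric sanity check of the Big-O definition: f(n) = 3n + 2 is O(n),
# witnessed by the constants k = 4 and n0 = 2.
f = lambda n: 3 * n + 2
g = lambda n: n
k, n0 = 4, 2

# f(n) <= k * g(n) must hold for every n >= n0
assert all(f(n) <= k * g(n) for n in range(n0, 10000))
print("f(n) = 3n + 2 is O(n) with k = 4, n0 = 2")
```

Algebraically, 3n + 2 ≤ 4n holds exactly when n ≥ 2, which is why this pair of constants works.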

2. Omega (Ω) Notation: The function f(n) = Ω(g(n)) [read as "f of n is omega of g of n"] if and only if there exist positive constants k and n0 such that

f(n) ≥ k·g(n) for all n ≥ n0

Hence, the complexity of f(n) can be represented as Ω(g(n)).

3. Theta (θ) Notation: The function f(n) = θ(g(n)) [read as "f of n is theta of g of n"] if and only if there exist positive constants k1, k2 and n0 such that

k1·g(n) ≤ f(n) ≤ k2·g(n) for all n ≥ n0

Hence, the complexity of f(n) can be represented as θ(g(n)).

Time Complexity Analysis and Calculation
Q1. T(n) = T(n-1) + 2

T(n-1) = T(n-2) + 2
T(n-2) = T(n-3) + 2
...
Substituting back, after k steps:
T(n) = T(n-k) + 2 + 2 + ... + 2 (k times)
Taking n - k = 0 => n = k, with T(0) = 1:
T(n) = T(0) + 2n
     = 1 + 2n

=> O(n)
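The closed form derived above can be verified directly by evaluating the recurrence, as a quick sketch:

```python
# Check the unrolled solution of Q1: T(n) = T(n-1) + 2 with T(0) = 1
# should equal the closed form T(n) = 2n + 1, which is O(n).
def T(n):
    return 1 if n == 0 else T(n - 1) + 2

for n in range(50):
    assert T(n) == 2 * n + 1
print("T(n) = 2n + 1 for all tested n")
```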

Q2. T(n) = 2T(n-1) + 1

T(n-1) = 2T(n-2) + 1
T(n-2) = 2T(n-3) + 1
...
T(n) = 2[2T(n-2) + 1] + 1
     = 4T(n-2) + 2 + 1
     = 4[2T(n-3) + 1] + 2 + 1
     = 8T(n-3) + 4 + 2 + 1
After k substitutions:
T(n) = 2^k T(n-k) + 2^k - 1
Taking n - k = 0 => k = n, with T(0) = 1:
T(n) = 2^n + 2^n - 1
     = 2^(n+1) - 1

=> O(2^n)
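As with Q1, the closed form can be verified by evaluating the recurrence directly, a minimal sketch:

```python
# Check the unrolled solution of Q2: T(n) = 2T(n-1) + 1 with T(0) = 1
# should equal the closed form T(n) = 2^(n+1) - 1, which is O(2^n).
def T(n):
    return 1 if n == 0 else 2 * T(n - 1) + 1

for n in range(20):
    assert T(n) == 2 ** (n + 1) - 1
print("T(n) = 2^(n+1) - 1 for all tested n")
```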

Master's Theorem
For recurrences of the form

T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1,

T(n) has the following asymptotic bounds:

1. If f(n) = O(n^(log_b a - ε)), then T(n) = Θ(n^(log_b a)).

2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n).

3. If f(n) = Ω(n^(log_b a + ε)), and a·f(n/b) ≤ c·f(n) for some c < 1, then T(n) = Θ(f(n)).

Here ε > 0 is a constant.

Example 1:
T(n) = 3T(n/2) + n^2

Here, a = 3, b = 2, f(n) = n^2

log_b a = log_2 3 ≈ 1.58 < 2

i.e. f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0.

Case 3 applies here.

Thus, T(n) = Θ(f(n)) = Θ(n^2)
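The Θ(n^2) result can be checked numerically: if T(n) = Θ(n^2), the ratio T(n)/n^2 must level off at a constant as n grows. A sketch, assuming the base case T(1) = 1 (chosen for illustration):

```python
# Numeric check of the Master Theorem result for T(n) = 3T(n/2) + n^2:
# the ratio T(n)/n^2 should approach a constant, confirming Theta(n^2).
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 3 * T(n // 2) + n * n

# Evaluate at powers of 2 so n/2 stays exact
ratios = [T(2 ** k) / (2 ** k) ** 2 for k in range(5, 20)]
print(ratios)  # the ratios level off near a constant
```

Solving a·n^2 = 3a·n^2/4 + n^2 for the particular solution gives a = 4, so the ratios converge to 4 from below.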

Master's Theorem Limitations:

The master theorem cannot be used if:
• T(n) is not monotone, e.g. T(n) = sin n
• f(n) is not a polynomial, e.g. f(n) = 2^n
• a is not a constant, e.g. a = 2n
• a < 1

Searching Algorithms
Linear Search
Algorithm:
Linear Search (Array Arr, Value a) // Arr is the name of the array, and a is the searched element.
Step 1: Set i to 0 // i is the index of the array, which starts from 0
Step 2: If i ≥ n, then go to Step 7 // n is the number of elements in the array
Step 3: If Arr[i] = a, then go to Step 6
Step 4: Set i to i + 1
Step 5: Go to Step 2
Step 6: Print "element a found at index i" and go to Step 8
Step 7: Print "element not found"
Step 8: Exit
C++ Code:
// C++ code to linearly search x in arr[]. If x
// is present then return its location, otherwise
// return -1

#include <iostream>
using namespace std;

int search(int arr[], int N, int x)
{
    int i;
    for (i = 0; i < N; i++)
        if (arr[i] == x)
            return i;
    return -1;
}

// Driver's code
int main(void)
{
    int arr[] = { 2, 3, 4, 10, 40 };
    int x = 10;
    int N = sizeof(arr) / sizeof(arr[0]);

    // Function call
    int result = search(arr, N, x);
    (result == -1)
        ? cout << "Element is not present in array"
        : cout << "Element is present at index " << result;
    return 0;
}

Time Complexity Analysis:

• Best Case Complexity: In linear search, the best case occurs when the element we are searching for is at the first position of the array. The best-case time complexity of linear search is O(1).
• Average Case Complexity: The average-case time complexity of linear search is O(n).
• Worst Case Complexity: In linear search, the worst case occurs when the element we are looking for is at the end of the array, or is not present at all, so that we have to traverse the entire array. The worst-case time complexity of linear search is O(n).
Binary Search Algorithm:
binarySearch(arr, x, low, high) // arr is the name of the (sorted) array, and x is the searched element.
Step 1: Repeat while low ≤ high
    mid = (low + high)/2
    if (x == arr[mid])
        return mid
    else if (x > arr[mid]) // x is on the right side
        low = mid + 1
    else // x is on the left side
        high = mid - 1
C++ Code:
// C++ program to implement iterative Binary Search
#include <bits/stdc++.h>
using namespace std;

// An iterative binary search function. It returns the
// location of x in the given array arr[l..r] if present,
// otherwise -1
int binarySearch(int arr[], int l, int r, int x)
{
    while (l <= r) {
        int m = l + (r - l) / 2;

        // Check if x is present at mid
        if (arr[m] == x)
            return m;

        // If x greater, ignore left half
        if (arr[m] < x)
            l = m + 1;

        // If x is smaller, ignore right half
        else
            r = m - 1;
    }

    // if we reach here, then element was
    // not present
    return -1;
}

int main(void)
{
    int arr[] = { 2, 3, 4, 10, 40 };
    int x = 10;
    int n = sizeof(arr) / sizeof(arr[0]);
    int result = binarySearch(arr, 0, n - 1, x);
    (result == -1)
        ? cout << "Element is not present in array"
        : cout << "Element is present at index " << result;
    return 0;
}

Time Complexity Analysis:

o Best Case Complexity - In binary search, the best case occurs when the element we are searching for is at the mid position of the array. The best-case time complexity of binary search is O(1).
o Average Case Complexity - The average-case time complexity of binary search is O(log n).
o Worst Case Complexity - The worst case in binary search occurs when the target element is not present in the given array, so the search space keeps halving until it becomes empty. The worst-case time complexity of binary search is O(log n).
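The O(log n) bound can be observed by counting loop iterations. A minimal sketch (the helper `binary_search_steps` is hypothetical, written for this note): for n elements the loop runs at most about log2(n) + 1 times.

```python
# Count loop iterations of binary search to see the O(log n) behaviour.
import math

def binary_search_steps(arr, x):
    low, high, steps = 0, len(arr) - 1, 0
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        if arr[mid] == x:
            return steps
        elif arr[mid] < x:
            low = mid + 1
        else:
            high = mid - 1
    return steps  # x absent: the search space shrank to empty

for n in (10, 1000, 1_000_000):
    arr = list(range(n))
    worst = binary_search_steps(arr, -1)  # absent element: worst case
    print(n, worst)  # worst stays below about log2(n) + 1
```

Even for a million elements the loop runs only about 20 times, while linear search would need up to a million comparisons.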

Sorting Algorithms
Insertion Sort

Algorithm
INSERTION-SORT(A)
1. for i = 2 to A.length
2.     key = A[i]
3.     j = i - 1
4.     while j > 0 and A[j] > key
5.         A[j + 1] = A[j]
6.         j = j - 1
7.     A[j + 1] = key

Time Complexity: O(n^2)

Program

def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j = j - 1
        arr[j + 1] = key

arr = [12, 13, 7, 3, 11]
insertion_sort(arr)
for i in range(len(arr)):
    print("%d" % arr[i], end=" ")

Output:

3 7 11 12 13

Merge Sort

Algorithm
Step 1: Start
Step 2: Declare the array and the left, right and mid variables
Step 3: Perform the mergesort function:
    if left >= right, return
    mid = (left + right)/2
    mergesort(array, left, mid)
    mergesort(array, mid + 1, right)
    merge(array, left, mid, right)
Step 4: Stop

Time Complexity: O(n log n)


Program
// C++ program for Merge Sort
#include <iostream>
using namespace std;

// Merges two subarrays of array[].
// First subarray is arr[begin..mid]
// Second subarray is arr[mid+1..end]
void merge(int array[], int const left, int const mid,
           int const right)
{
    auto const subArrayOne = mid - left + 1;
    auto const subArrayTwo = right - mid;

    // Create temp arrays
    auto *leftArray = new int[subArrayOne],
         *rightArray = new int[subArrayTwo];

    // Copy data to temp arrays leftArray[] and rightArray[]
    for (auto i = 0; i < subArrayOne; i++)
        leftArray[i] = array[left + i];
    for (auto j = 0; j < subArrayTwo; j++)
        rightArray[j] = array[mid + 1 + j];

    auto indexOfSubArrayOne = 0,   // Initial index of first sub-array
         indexOfSubArrayTwo = 0;   // Initial index of second sub-array
    int indexOfMergedArray = left; // Initial index of merged array

    // Merge the temp arrays back into array[left..right]
    while (indexOfSubArrayOne < subArrayOne
           && indexOfSubArrayTwo < subArrayTwo) {
        if (leftArray[indexOfSubArrayOne]
            <= rightArray[indexOfSubArrayTwo]) {
            array[indexOfMergedArray]
                = leftArray[indexOfSubArrayOne];
            indexOfSubArrayOne++;
        }
        else {
            array[indexOfMergedArray]
                = rightArray[indexOfSubArrayTwo];
            indexOfSubArrayTwo++;
        }
        indexOfMergedArray++;
    }

    // Copy the remaining elements of
    // left[], if there are any
    while (indexOfSubArrayOne < subArrayOne) {
        array[indexOfMergedArray] = leftArray[indexOfSubArrayOne];
        indexOfSubArrayOne++;
        indexOfMergedArray++;
    }

    // Copy the remaining elements of
    // right[], if there are any
    while (indexOfSubArrayTwo < subArrayTwo) {
        array[indexOfMergedArray] = rightArray[indexOfSubArrayTwo];
        indexOfSubArrayTwo++;
        indexOfMergedArray++;
    }
    delete[] leftArray;
    delete[] rightArray;
}

// begin is for left index and end is
// right index of the sub-array
// of arr to be sorted
void mergeSort(int array[], int const begin, int const end)
{
    if (begin >= end)
        return; // Returns recursively

    auto mid = begin + (end - begin) / 2;
    mergeSort(array, begin, mid);
    mergeSort(array, mid + 1, end);
    merge(array, begin, mid, end);
}

void printArray(int A[], int size)
{
    for (auto i = 0; i < size; i++)
        cout << A[i] << " ";
}

// Driver code
int main()
{
    int arr[] = { 12, 11, 13, 5, 6, 7 };
    auto arr_size = sizeof(arr) / sizeof(arr[0]);

    cout << "Given array is \n";
    printArray(arr, arr_size);

    mergeSort(arr, 0, arr_size - 1);

    cout << "\nSorted array is \n";
    printArray(arr, arr_size);
    return 0;
}

Bubble Sort

Algorithm
begin BubbleSort(list)
    for all elements of list
        if list[i] > list[i + 1]
            swap(list[i], list[i + 1])
        end if
    end for
    return list
end BubbleSort

Time Complexity: O(n^2)

Program

def bubblesort(elements):

swapped = False

for n in range(len(elements) - 1, 0, -1): for i in range(n):

if elements[i] > elements[i + 1]:

swapped = True

19
elements[i], elements[i + 1] = elements[i + 1], elements[i]

if not swapped:

return

elements = [39, 12, 18, 85, 72, 10, 2, 18]

print("Unsorted list is,")

print(elements)

bubblesort(elements)

print("Sorted Array is, ")

print(elements)

Output:

Unsorted list is,
[39, 12, 18, 85, 72, 10, 2, 18]
Sorted Array is,
[2, 10, 12, 18, 18, 39, 72, 85]
Selection Sort

Algorithm

Step 1 − Set MIN to location 0

Step 2 − Search the minimum element in the list

Step 3 − Swap with value at location MIN

Step 4 − Increment MIN to point to next element

Step 5 − Repeat until list is sorted

Time Complexity: O(n^2)

Program

def selectionSort(array, size):
    for ind in range(size):
        min_index = ind
        for j in range(ind + 1, size):
            if array[j] < array[min_index]:
                min_index = j
        (array[ind], array[min_index]) = (array[min_index], array[ind])

arr = [-2, 45, 0, 11, -9, 88, -97, -202, 747]
size = len(arr)
selectionSort(arr, size)
print("The array after sorting in Ascending Order by selection sort is:")
print(arr)

Output

The array after sorting in Ascending Order by selection sort is:
[-202, -97, -9, -2, 0, 11, 45, 88, 747]

Quick Sort

Algorithm

QUICKSORT(array A, start, end)
{
    if (start < end)
    {
        p = partition(A, start, end)
        QUICKSORT(A, start, p - 1)
        QUICKSORT(A, p + 1, end)
    }
}

PARTITION(array A, start, end)
{
    pivot ← A[end]
    i ← start - 1
    for j ← start to end - 1 {
        if (A[j] < pivot) {
            i ← i + 1
            swap A[i] with A[j]
        }
    }
    swap A[i + 1] with A[end]
    return i + 1
}

Program

/* C++ implementation of QuickSort */
#include <bits/stdc++.h>
using namespace std;

// A utility function to swap two elements
void swap(int* a, int* b)
{
    int t = *a;
    *a = *b;
    *b = t;
}

/* This function takes the last element as pivot, places the
pivot element at its correct position in the sorted array, and
places all smaller elements (smaller than pivot) to the left of
the pivot and all greater elements to the right of the pivot */
int partition(int arr[], int low, int high)
{
    int pivot = arr[high]; // pivot
    int i = (low - 1); // Index of smaller element; indicates the
                       // right position of the pivot found so far

    for (int j = low; j <= high - 1; j++)
    {
        // If current element is smaller than the pivot
        if (arr[j] < pivot)
        {
            i++; // increment index of smaller element
            swap(&arr[i], &arr[j]);
        }
    }
    swap(&arr[i + 1], &arr[high]);
    return (i + 1);
}

/* The main function that implements QuickSort
arr[] --> Array to be sorted,
low --> Starting index,
high --> Ending index */
void quickSort(int arr[], int low, int high)
{
    if (low < high)
    {
        /* pi is the partitioning index, arr[pi] is now at the right place */
        int pi = partition(arr, low, high);

        // Separately sort elements before
        // partition and after partition
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}

/* Function to print an array */
void printArray(int arr[], int size)
{
    int i;
    for (i = 0; i < size; i++)
        cout << arr[i] << " ";
    cout << endl;
}

// Driver Code
int main()
{
    int arr[] = {10, 7, 8, 9, 1, 5};
    int n = sizeof(arr) / sizeof(arr[0]);
    quickSort(arr, 0, n - 1);
    cout << "Sorted array: \n";
    printArray(arr, n);
    return 0;
}

Time Complexity: O(n log n) on average; O(n^2) in the worst case
