Week 5 Lecture
by
Dr Gaurav Kumar
Asst. Prof, Bennett University
Quick Recap of Last Week's Learnings

TC of Bubble Sort
TC of Selection Sort
Total Time = Total Comparisons + Total Swapping

Worst Case Analysis — example list: 1 2 3 4 5 6 (list is already sorted)

Pass | Comparisons | Swaps
  1  |      5      |   0
  2  |      4      |   0
  3  |      3      |   0
  4  |      2      |   0
  5  |      1      |   0

TC = O(n²)

Can TC improve further?
Hence, by introducing the flag concept (similar to bubble sort) into the selection sort algorithm, we can stop early when the list is already sorted and avoid the unnecessary remaining passes.
Note: Kindly keep this in mind while reading the (old) Week-3 lecture slides.
Some Important Points
a) f(n) = O(n log n)
b) f(n) = O(n²)
Correct Answer is c
General Assessment Time
What is the Time Complexity of this algorithm? Which one is the correct answer?

Algorithm print(n)
{
    for (i = 1; i <= n; i = i * 2)
    {
        printf("This is number: %d", i);
    }
}

a) f(n) = O(n)
b) f(n) = O(n²)
c) f(n) = O(log n)
d) I am confused

Correct Answer is c
What is the Worst Case Time Complexity of the Shell Sort Algorithm (gap interval N/2, …)?

a) f(n) = O(n log n)
b) f(n) = O(n²)
d) I am confused

Which one is the correct answer?

Correct Answer is c
Analyzing Time Complexity of Shell Sort
Example array: 5 4 3 2 1

Understanding Merge Sort Algorithm
Step 1: Divide the unsorted list into n sub-lists, each comprising 1 element (a list of 1 element is sorted).
Step 2: Repeatedly merge sub-lists to produce newly sorted sub-lists until there is only 1 sub-list remaining. This will be the sorted list.
In merge sort we follow these steps:

1. We take a variable p and point it to the starting index of the array, and we take another variable r and point it to the last index of the array.
2. We find the middle of the array using the formula (p + r)/2, mark the middle index as q, and break the array into two subarrays: from p to q, and from q + 1 to r.
3. We then divide these 2 subarrays again, just like we divided the main array, and this process continues.
4. Once we have divided the main array into subarrays with a single element, we start merging the subarrays.

void mergeSort(int a[], int p, int r)
{
    if (p < r)
    {
        int q = (p + r) / 2;    // division
        mergeSort(a, p, q);
        mergeSort(a, q + 1, r);
        merge(a, p, q, r);
    }
}
Recursion trace for a 4-element array:

mergeSort(a, 0, 3)
├── mergeSort(a, 0, 1)
│   ├── mergeSort(a, 0, 0)   (p == r, will not execute)
│   ├── mergeSort(a, 1, 1)   (will not execute)
│   └── merge(a, 0, 0, 1)
├── mergeSort(a, 2, 3)
│   ├── mergeSort(a, 2, 2)   (will not execute)
│   ├── mergeSort(a, 3, 3)   (will not execute)
│   └── merge(a, 2, 2, 3)
└── merge(a, 0, 1, 3)
Understanding Merge Concept in Merge Sort Algorithm

void merge(int a[], int low, int mid, int high)
{
    int temp[100];        // temporary array
    int i = low;          // index into the left subarray a[low..mid]
    int j = mid + 1;      // index into the right subarray a[mid+1..high]
    int k = low;          // index into temp

    while (i <= mid && j <= high)
    {
        if (a[i] <= a[j])
        {
            temp[k] = a[i];
            i++; k++;
        }
        else
        {
            temp[k] = a[j];
            j++; k++;
        }
    }

    // Copy the remaining elements of the left list, if there are any
    while (i <= mid)
    {
        temp[k] = a[i];
        i++; k++;
    }

    // Copy the remaining elements of the right list, if there are any
    while (j <= high)
    {
        temp[k] = a[j];
        j++; k++;
    }

    for (k = low; k <= high; k++)
    {
        a[k] = temp[k];
    }
}
Cost of each statement:

void mergeSort(int a[], int p, int r)
{
    if (p < r)                  // 1
    {
        q = (p + r) / 2;        // 1
        mergeSort(a, p, q);     // T(n/2)
        mergeSort(a, q + 1, r); // T(n/2)
        merge(a, p, q, r);      // n
    }
}

T(n) = 2T(n/2) + n + 2
T(n) ≈ 2T(n/2) + n
Understanding TC of Merge Sort Algorithms
Recurrence Relation:
T(n) = 2T(n/2) + n   for n > 1
T(n) = 1             for n = 1
Time complexity of Merge Sort is O(n log n) in all 3 cases (worst, average and best).
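The recurrence can be unrolled to see where n log n comes from (a standard expansion, assuming n is a power of 2):

```latex
T(n) = 2T(n/2) + n
     = 2\bigl(2T(n/4) + n/2\bigr) + n = 4T(n/4) + 2n
     = 8T(n/8) + 3n
     \;\;\vdots
     = 2^k\,T(n/2^k) + kn
```

After k = log₂ n steps the subproblem size reaches 1, so T(n) = n·T(1) + n log₂ n = n + n log₂ n = O(n log n).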
*The merge operation takes n steps to sort and merge n elements at each level. For example, with 8 elements there are log₂ 8 = 3 levels of merging:
Total Time = merge() + merge() + merge() (3 times) = 8 + 8 + 8

In general there are log n levels, and the number of operations for merging n elements at each level is n:
Total Time = merge() + merge() + … + merge() (log n times)
           = n + n + … + n (log n times)
TC = n log n
Time and Space Complexity of Merge Sort
Time Complexity: O(n log n) in all cases
Space Complexity: O(n)*
Sorting In Place: No
Stable: Yes
*This is the auxiliary Space Complexity
Count Sort: A Non-Comparison-Based Sorting Algorithm

Count Sort (Linear Sorting Algorithm)
Step 1: Find the maximum element of the given array.
Step 2: Create an auxiliary (temp/count) array of length max + 1 and initialize all its elements to 0. This array is used for storing the count of the elements in the array.
Step 3: Store the count of each element at its respective index in the count array.
For example: if the count of element 3 is 2, then 2 is stored at index 3 of the count array. If element 5 is not present in the original array, 0 is stored at index 5.
Step 4: Store the cumulative sum of the elements of the count array. It helps in placing the elements at the correct index of the sorted array.
(Cumulative Sum: Count[i] = Count[i] + Count[i-1])
Step 5: Find the index of each element of the original array in the count array; this gives the cumulative count. (The cumulative count helps in placing the elements at the correct index of the sorted array.) Place each element at the correct index in the output array.
There are four main loops:

countingSort(array, size)
{
    max <- find largest element in array

    initialize count array with all zeros              // 1st loop: O(max)

    for j <- 1 to size                                 // 2nd loop: O(size)
        find the total count of each unique element
        and store the count at the jth index in the count array

    for i <- 1 to max                                  // 3rd loop: O(max)
        find the cumulative sum and store it in the count array itself

    for j <- size down to 1                            // 4th loop: O(size)
        restore the elements to the array,
        decreasing the count of each restored element by 1
}

Overall complexity = O(max) + O(size) + O(max) + O(size) = O(max + size)
• Worst Case Complexity: O(n+k)
• Best Case Complexity: O(n+k)
• Average Case Complexity: O(n+k)
Note: Counting sort is a stable algorithm because the relative order of equal elements is not changed after sorting.
Any Queries?
Office MCub311
Discussion Time: 3-5 PM
Mob: +91-8586968801