
Sorting Algorithms

• Sorting is an arrangement of data in a particular order.

• Sorting is used to present data in a more readable format. A real-life example of sorting is the contact list in your mobile phone, which is arranged alphabetically.

• There are two important applications of sorting:


 As an aid to searching, and
 for matching entries in lists.

• Sorting is also used in the solution of many other more complex


problems.

• In fact, estimates suggest that over 25 percent of all computing time


is spent on sorting, with some organizations spending more than 50
percent of their computing time sorting lists.
• So, the problem of finding efficient sorting algorithms is immensely
important.

• There are a number of sorting algorithms; the best algorithm is the one that can solve a problem in the minimum time and with the minimum space required to do so.

Some types of sorting algorithms are:

Bubble sort
Bubble Sort is a simple algorithm which is used to sort a given set of n elements provided in the form of an array.

Bubble Sort compares the elements one by one and sorts them based on their values.
Steps
1. If the given array has to be sorted in ascending order, then bubble sort starts by comparing the first element of the array with the second element; if the first element is greater than the second element, it swaps them, then moves on to compare the second and the third element, and so on.

2. If we have a total of n elements, then we need to repeat this process n-1 times.

3. It is known as bubble sort because with every complete iteration the largest element in the given array bubbles up towards the last place (the highest index), just like an air bubble rises up to the water surface.

4. Sorting takes place by stepping through all the elements one by one, comparing each with the adjacent element and swapping them if required.
• Bubble Sort is also called Sinking Sort.
• Procedure: This is the simplest and easiest sorting technique.
In this technique, the two successive items A[i] and A[i+1] are exchanged whenever A[i] >= A[i+1].
• For example, consider the elements shown: 40, 50, 30, 20, 10
• The elements can be sorted as shown in figure.
• In the first pass 40 is compared with 50 and they are in order. So, no
exchange has been done.
• Next 50 is compared with 30 and they are exchanged since 50 is
greater than 30.
• If we proceed in the same manner, at the end of the first pass the
largest item occupies the last position.
• On each successive pass, the item with the next largest value will be moved to the bottom and thus the elements are arranged in ascending order.
• Note: Observe that after each pass, the larger values sink to the
bottom of the array and hence it is called sinking sort.
Algorithm for Bubble Sort

Step 1 :Start
Step 2 : Initialize an array arr[] of size n and enter unsorted
elements
Step 3 : Set i=n-1 and j=0
Step 4 : Repeat the following steps while(i>=0)
Step 5: Set j=0
Step 6: Repeat following steps while(j<i)
if arr[j] > arr [j+1] then
swap arr[j] with arr[j+1]
set j =j + 1
end of Step 6 while loop
set i = i – 1
End of Step 4 while loop
Step 7 : Print the sorted array arr[]
Step 8: End
//Bubble sort
#include<stdio.h>
#include<conio.h>
void main()
{
    int n, i, j, temp, a[20];
    clrscr();
    printf("Enter the no. of elements to be sorted: \n");
    scanf("%d", &n);
    printf("Enter the %d values you want to sort\n", n);
    for(i = 0; i < n; i++)
        scanf("%d", &a[i]);
    for(i = 1; i < n; i++)
    {
        for(j = 0; j < n - i; j++)
        {
            if(a[j] >= a[j+1])
            {
                /* swap adjacent elements that are out of order */
                temp = a[j];
                a[j] = a[j+1];
                a[j+1] = temp;
            }
        }
    }
    /* printing the sorted elements */
    printf("Array after sorting\n");
    for(i = 0; i < n; i++)
        printf("%d\t", a[i]);
    getch();
}
Advantages
• Very simple and easy to program
• Straight forward approach

Disadvantages
• It runs slowly and hence it is not efficient; more efficient sorting techniques exist.
• Even if the elements are already sorted, n-1 passes are required to sort.
Advantages of Bubble Sort over other sorting algorithms:

1. The best-case time complexity is O(N) and it is very easy to implement.

2. It generates output comparatively faster than other algorithms if the array is already sorted.

The space complexity of Bubble Sort is O(1), because only a single additional memory location is required, i.e. for the temp variable.

Worst and Average Case Time Complexity: O(N²). The worst case occurs when the array is reverse sorted.

Best Case Time Complexity: O(N). The best case occurs when the array is already sorted; achieving it requires the early-exit optimization sketched below.
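Note that the O(N) best case assumes an optimized version of bubble sort that stops as soon as a complete pass makes no swap; the program shown above always performs n-1 passes. A minimal sketch of that optimization (the swapped flag is an addition, not part of the program above):

//Optimized bubble sort: stop early when a full pass makes no swap
void bubbleSortOptimized(int a[], int n)
{
    int i, j, temp, swapped;
    for(i = 0; i < n - 1; i++)
    {
        swapped = 0;                  /* no swap made in this pass yet */
        for(j = 0; j < n - i - 1; j++)
        {
            if(a[j] > a[j+1])
            {
                temp = a[j];
                a[j] = a[j+1];
                a[j+1] = temp;
                swapped = 1;
            }
        }
        if(swapped == 0)              /* already sorted: O(N) best case */
            break;
    }
}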
Selection Sort

• Procedure: As the name indicates, we first find the smallest item in


the list and we exchange it with the first item.

• Obtain the second smallest in the list and exchange it with the second
element and so on. Finally, all the items will be arranged in ascending
order.

• Since the next least item is selected and exchanged appropriately so that the elements are finally sorted, this technique is called Selection Sort.
For example, let us take the following elements, which are sorted using selection sort as shown below: 45, 20, 40, 5, 15
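Pass by pass, the smallest remaining element is swapped into place and the array changes as follows:

Initial:  45, 20, 40, 5, 15
Pass 1:   5, 20, 40, 45, 15   (5 exchanged with 45)
Pass 2:   5, 15, 40, 45, 20   (15 exchanged with 20)
Pass 3:   5, 15, 20, 45, 40   (20 exchanged with 40)
Pass 4:   5, 15, 20, 40, 45   (40 exchanged with 45; sorted)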
Design: The position of the smallest element from the ith position onwards can be obtained using the following code:

pos = i;
for (j = i + 1; j < n; j++)
{
    if (a[j] < a[pos])
        pos = j;
}

After finding the position of the smallest number, it should be


exchanged with ith position. The equivalent statements are shown
below:

temp = a[pos];
a[pos] = a[i];
a[i] = temp;
Advantages

• The main advantage of the selection sort is that it performs well on


a small list.

Furthermore, because it is an in-place sorting algorithm, no additional


temporary storage is required beyond what is needed to hold the
original list.

Disadvantage

• The primary disadvantage of the selection sort is its poor efficiency


when dealing with a huge list of items
//Selection Sort
#include<stdio.h>
#include<conio.h>
void main()
{
    int n, i, j, temp, a[20], pos;
    clrscr();
    printf("Enter total elements:\n");
    scanf("%d", &n);
    printf("Enter %d elements:\n", n);
    for(i = 0; i < n; i++)
    {
        scanf("%d", &a[i]);
    }
    for(i = 0; i < n - 1; i++)
    {
        pos = i;
        /* find the position of the smallest element from i onwards */
        for(j = i + 1; j < n; j++)
        {
            if(a[j] < a[pos])
            {
                pos = j;
            }
        }
        /* exchange the smallest element with the ith element */
        temp = a[pos];
        a[pos] = a[i];
        a[i] = temp;
    }
    for(i = 0; i < n; i++)
    {
        printf(" %d", a[i]);
    }
    getch();
}
Algorithm for Selection Sort

Step 1 :Start
Step 2 : Initialize an array arr[] of size n and enter unsorted elements
Step 3 : Set i=0
Step 4 : Repeat the following steps while(i<n-1)
Step 5: Set j= i + 1
Step 6: Repeat following steps while(j<n)
if arr[i] > arr[j] then
swap arr[i] with arr[j]
set j = j + 1
end of Step 6 while loop
set i = i + 1
End of Step 4 while loop
Step 7 : Print the sorted array arr[]
Step 8: End
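The steps above describe the simple exchange variant, which swaps immediately whenever arr[i] and arr[j] are out of order, rather than remembering the position of the minimum as the program shown earlier does. A minimal sketch of that variant (the function name is illustrative):

void selectionSortExchange(int arr[], int n)
{
    int i, j, temp;
    for(i = 0; i < n - 1; i++)
    {
        for(j = i + 1; j < n; j++)
        {
            if(arr[i] > arr[j])       /* arr[i] should hold the smaller value */
            {
                temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
            }
        }
    }
}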
Complexity Analysis of Selection Sort
Time Complexity: The time complexity of Selection Sort is O(N²) as there are two nested loops:
• One loop to select an element of the array one by one = O(N)
• Another loop to compare that element with every other array element = O(N)

Therefore the overall complexity = O(N) * O(N) = O(N²)

The space complexity of Selection Sort is O(1), as only a single additional memory location is required for the temp variable.

• Worst Case Time Complexity [Big-O]: O(n²)
• Best Case Time Complexity [Big-omega]: O(n²)
• Average Time Complexity [Big-theta]: O(n²)
• Space Complexity: O(1)
Quick Sort

• This sorting technique works very well on large sets of data. The first step in this technique is to partition the given table into two sub-tables such that the elements towards the left of the key element (pivot element) are less than the key element and the elements towards the right of the key element are greater than the key element.

• After this step, the array is partitioned into two sub-tables. The elements in the left sub-table are less than the key element and the elements in the right sub-table are greater than it.
1. Selecting the Pivot: The process starts by selecting one element (known as the pivot) from the list; this can be any element chosen at random, or simply the first or last element.

2. Rearranging the Array: The goal here is to rearrange the list such that all the elements less than the pivot are towards the left of it, and all the elements greater than the pivot are towards the right of it.

3. The pivot element is compared with the items one by one, starting from the first index. A pointer index (pindex) keeps track of the boundary where elements greater than the pivot begin.

4. When an element smaller than the pivot is found, it is swapped with the larger element identified before, so smaller elements accumulate on the left. By the end of the first pass the pivot element finds its right place in the array (a sketch of this rearrangement follows below).
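One common way to implement the rearrangement described above is sketched below; it assumes the last element is chosen as the pivot, and the names partitionPindex and pindex are illustrative:

int partitionPindex(int Array[], int low, int high)
{
    int pivot = Array[high];          /* last element taken as pivot          */
    int pindex = low, k, temp;        /* pindex: boundary of the smaller part */
    for(k = low; k < high; k++)
    {
        if(Array[k] < pivot)          /* smaller element found: move it left  */
        {
            temp = Array[k];
            Array[k] = Array[pindex];
            Array[pindex] = temp;
            pindex++;
        }
    }
    temp = Array[high];               /* put the pivot into its final place   */
    Array[high] = Array[pindex];
    Array[pindex] = temp;
    return pindex;
}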
The total array becomes sorted by applying the same process recursively to the left and right sub-arrays.
Partitioning Algorithm
1. Select the first element as the pivot
2. Take two integers i and j that point to the low and high ends of the array respectively
3. Increment i while Array[i] is less than or equal to the pivot (i.e. until an element greater than the pivot is found)
4. Decrement j while Array[j] is greater than the pivot (i.e. until an element not greater than the pivot is found)
5. Swap Array[i] and Array[j]
6. If i and j pass each other, that is, i > j, swap the pivot and Array[j]
7. Now, index j is the correct place for the pivot element in the array (see the sketch below)
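A minimal sketch of this partitioning procedure, using the first element as the pivot (the function name is illustrative; it returns the pivot's final index):

int partition(int Array[], int low, int high)
{
    int pivot = Array[low];                      /* step 1 */
    int i = low, j = high, temp;
    while(i < j)
    {
        while(i < high && Array[i] <= pivot)     /* step 3 */
            i++;
        while(Array[j] > pivot)                  /* step 4 */
            j--;
        if(i < j)                                /* step 5 */
        {
            temp = Array[i];
            Array[i] = Array[j];
            Array[j] = temp;
        }
    }
    temp = Array[low];                           /* step 6: place the pivot */
    Array[low] = Array[j];
    Array[j] = temp;
    return j;                                    /* step 7 */
}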

Quick Sort Algorithm Steps

1. Call the partition method: partition(Array, low, high); it returns the pivot's final index
2. Make a recursive call on the left sub-array: quickSort(Array, low, partition-1)
3. Make a recursive call on the right sub-array: quickSort(Array, partition+1, high)
4. The recursion continues as long as low is less than high (a sketch follows below).
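A minimal sketch of the recursive driver corresponding to these steps, assuming a partition function like the one sketched above that returns the pivot's final index:

void quickSort(int Array[], int low, int high)
{
    if(low < high)                               /* recursion stops when low >= high */
    {
        int p = partition(Array, low, high);     /* step 1 */
        quickSort(Array, low, p - 1);            /* step 2: left sub-array  */
        quickSort(Array, p + 1, high);           /* step 3: right sub-array */
    }
}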
Quick Sort Complexity
The time taken by QuickSort depends upon the input array and the partition strategy.

Partitioning the elements takes O(n) time, and in the best and average case quicksort divides the problem into two roughly equal halves, giving the recurrence T(n) = 2T(n/2) + O(n), whose solution is O(n log n).

However, the worst case occurs when the partition process always picks the greatest or smallest element as the pivot, which happens when the array is already sorted in increasing or decreasing order.
The complexity in that case is O(n²).
1. Best Time Complexity: O(n log n)
2. Average Time Complexity: O(n log n)
3. Worst Time Complexity: O(n²)
4. Space Complexity: O(n), since no container other than the given array is created but the recursion stack can grow to order n in the worst case.
Advantages
• Quick sort is regarded as one of the best sorting algorithms.
• It is able to deal well with a huge list of items.
• Because it sorts in place, no additional storage is required.

Disadvantages
• Its worst-case performance is similar to the average performance of bubble, insertion or selection sort.
• If the list is already sorted, bubble sort is much more efficient than quick sort.
• If the elements to sort are integers, radix sort is more efficient than quick sort.
What is Insertion Sort?
• Insertion Sort is a sorting algorithm where the array is sorted by
taking one element at a time.

• The principle behind insertion sort is to take one element, iterate


through the sorted array & find its correct position in the sorted
array.

• Insertion Sort works in a similar manner as we arrange a deck of


cards.
Insertion Sort
• Insertion sort is a simple sorting algorithm that works similar to the
way you sort playing cards in your hands.

• The array is virtually split into a sorted and an unsorted part.

• Values from the unsorted part are picked and placed at the correct
position in the sorted part.

Insertion Sort Algorithm


• To sort an array of size N in ascending order, iterate over the array and compare the current element (key) to its predecessor; if the key element is smaller than its predecessor, compare it to the elements before it as well.

• Move the greater elements one position up to make space for the inserted element.
Working of Insertion Sort algorithm
Consider an example: arr[]: {12, 11, 13, 5, 6}

12 11 13 5 6

First Pass:

Initially, the first two elements of the array are compared in insertion
sort.
12 11 13 5 6

Here, 12 is greater than 11 hence they are not in the ascending order
and 12 is not at its correct position. Thus, swap 11 and 12.

So, for now 11 is stored in a sorted sub-array.


11 12 13 5 6
Second Pass:

Now, move to the next two elements and compare them


11 12 13 5 6
Here, 13 is greater than 12, thus both elements are in ascending order; hence, no swapping will occur.

12 is also stored in the sorted sub-array along with 11.


Third Pass:

Now, two elements are present in the sorted sub-array which are 11
and 12

Moving forward to the next two elements which are 13 and 5


11 12 13 5 6

Both 5 and 13 are not present at their correct place so swap them
11 12 5 13 6

After swapping, elements 12 and 5 are not sorted, thus swap again
11 5 12 13 6

Here, again 11 and 5 are not sorted, hence swap again


5 11 12 13 6

Here, 5 is at its correct position


Fourth Pass:

Now, the elements which are present in the sorted sub-array are 5, 11
and 12

Moving to the next two elements 13 and 6


5 11 12 13 6

Clearly, they are not sorted, thus perform swap between both
5 11 12 6 13

Now, 6 is smaller than 12, hence, swap again


5 11 6 12 13

Here, also swapping makes 11 and 6 unsorted hence, swap again


5 6 11 12 13

Finally, the array is completely sorted.


Line 1: We don't process the first element, as it has nothing to compare against.

Line 2: Loop from i=1 till the end, to process each element.

Line 3: Extract the element at position i, i.e. array[i]. Let it be called E.

Line 4: To compare E with the elements to its left, loop j from i-1 down to 0.

Line 5: Compare E with the element on the left; if E is smaller, move array[j] one position to the right.

Line 6: Once we have found the position for E, place it there.
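A minimal sketch of the loop these line references describe (the names array, E, i and j are as used above):

void insertionSort(int array[], int n)
{
    int i, j, E;
    for(i = 1; i < n; i++)                            /* Line 2: process each element    */
    {
        E = array[i];                                 /* Line 3: extract the element     */
        for(j = i - 1; j >= 0 && array[j] > E; j--)   /* Lines 4-5                       */
            array[j+1] = array[j];                    /* move the greater element right  */
        array[j+1] = E;                               /* Line 6: place E at its position */
    }
}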


Working of Insertion Sort
Suppose we need to sort the following array.

Initial array

1. The first element in the array is assumed to be sorted. Take the


second element and store it separately in key.

Compare key with the first element. If the first element is greater than key, then key is placed in front of the first element.
2. Now, the first two elements are sorted.

Take the third element and compare it with the elements on its left. Place it just behind the element smaller than it. If there is no element smaller than it, then place it at the beginning of the array.
Place 1 at the beginning
3. Similarly, place every unsorted element at its correct position.

Place 4 behind 1
Place 3 behind 1 and the array is sorted
Example
1. Time Complexity

Case             Time Complexity
Best Case        O(n)
Average Case     O(n²)
Worst Case       O(n²)

Best Case Complexity - It occurs when no sorting is required, i.e. the array is already sorted. The best-case time complexity of insertion sort is O(n).
Average Case Complexity - It occurs when the array elements are in jumbled order, neither properly ascending nor properly descending. The average-case time complexity of insertion sort is O(n²).
Worst Case Complexity - It occurs when the array elements have to be sorted in reverse order; that is, you have to sort the array elements in ascending order, but the elements are in descending order.

The worst-case time complexity of insertion sort is O(n²).

2. Space Complexity

Space Complexity: O(1)

The space complexity of insertion sort is O(1), because only one extra variable is required for swapping.
#include<stdio.h>
int main()
{
    int arra[10], i, j, n, array_key;
    printf("Input no. of values in the array: \n");
    scanf("%d", &n);
    printf("Input %d array value(s): \n", n);
    for(i = 0; i < n; i++)
        scanf("%d", &arra[i]);

    /* Insertion Sort */
    for (i = 1; i < n; i++)
    {
        array_key = arra[i];           /* element to be inserted          */
        j = i - 1;
        while (j >= 0 && arra[j] > array_key)
        {
            arra[j+1] = arra[j];       /* shift greater elements right    */
            j = j - 1;
        }
        arra[j+1] = array_key;         /* insert the key at its position  */
    }
    printf("Sorted Array: \n");
    for (i = 0; i < n; i++)
        printf("%d \n", arra[i]);
    return 0;
}
Advantages of the Insertion Sort
• Like other quadratic sorting algorithms, it is efficient for small data sets.
• It requires only a constant amount, O(1), of extra memory space.
• It works well with data sets that are already sorted to a significant extent.
• It does not affect the relative order of elements with the same key (it is stable).

Disadvantages of the Insertion Sort

• Insertion sort is inefficient for larger data sets.
• Insertion sort exhibits a worst-case time complexity of O(n²).
• It does not perform as well as other, more advanced sorting algorithms.
Merge Sort Algorithm

• Merge Sort is one of the most popular sorting algorithms; it is based on the principle of Divide and Conquer.

• Here, a problem is divided into multiple sub-problems.

• Each sub-problem is solved individually.

• Finally, sub-problems are combined to form the final solution.


Divide and Conquer Strategy

• Using the Divide and Conquer technique, we divide a problem into


subproblems.

• When the solution to each subproblem is ready, we 'combine' the


results from the subproblems to solve the main problem.

• Suppose we had to sort an array A.

• A subproblem would be to sort a sub-section of this array starting at


index p and ending at index r, denoted as A[p..r].

Divide

• If q is the half-way point between p and r, then we can split the subarray A[p..r] into two subarrays A[p..q] and A[q+1..r].
Conquer

In the conquer step, we try to sort both the subarrays A[p..q] and A[q+1..r]. If we haven't yet reached the base case, we again divide both these subarrays and try to sort them.

Combine

When the conquer step reaches the base step and we get two sorted subarrays A[p..q] and A[q+1..r] for array A[p..r], we combine the results by creating a sorted array A[p..r] from the two sorted subarrays A[p..q] and A[q+1..r].
#include<stdio.h>
/* merge two sorted halves arr[min..mid] and arr[mid+1..max] */
void merge(int arr[], int min, int mid, int max)
{
    int tmp[30];
    int i, j, k, m;
    j = min;
    m = mid + 1;
    for(i = min; j <= mid && m <= max; i++)
    {
        if(arr[j] <= arr[m])
        {
            tmp[i] = arr[j];
            j++;
        }
        else
        {
            tmp[i] = arr[m];
            m++;
        }
    }
    if(j > mid)                 /* left half exhausted: copy rest of right half */
    {
        for(k = m; k <= max; k++)
        {
            tmp[i] = arr[k];
            i++;
        }
    }
    else                        /* right half exhausted: copy rest of left half */
    {
        for(k = j; k <= mid; k++)
        {
            tmp[i] = arr[k];
            i++;
        }
    }
    for(k = min; k <= max; k++) /* copy merged result back into arr */
        arr[k] = tmp[k];
}
/* recursively divide the array and merge the sorted halves */
void sortm(int arr[], int min, int max)
{
    int mid;
    if(min < max)
    {
        mid = (min + max) / 2;
        sortm(arr, min, mid);
        sortm(arr, mid + 1, max);
        merge(arr, min, mid, max);
    }
}
int main()
{
    int arr[30];
    int i, size;
    printf("\tMerge sort\n");
    printf("-----------------------------------\n");
    printf(" How many numbers do you want to sort?: ");
    scanf("%d", &size);
    printf("\n Enter %d elements :\n ", size);
    for(i = 0; i < size; i++)
    {
        scanf("%d", &arr[i]);
    }
    sortm(arr, 0, size - 1);
    printf("\n Sorted elements after using merge sort:\n\n");
    for(i = 0; i < size; i++)
        printf(" %d ", arr[i]);
    return 0;
}
Our task is to merge two subarrays A[p..q] and A[q+1..r] to create a sorted array A[p..r]. So the inputs to the function are A, p, q and r.
The merge function works as follows:
1. Create copies of the subarrays L <- A[p..q] and M <- A[q+1..r].
2. Create three pointers i, j and k:
   a. i maintains the current index of L, starting at its first element
   b. j maintains the current index of M, starting at its first element
   c. k maintains the current index of A[p..r], starting at p.
3. Until we reach the end of either L or M, pick the smaller of the elements from L and M and place it in the correct position in A[p..r].
4. When we run out of elements in either L or M, pick up the remaining elements and put them in A[p..r] (see the sketch below).
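A sketch of the merge step exactly as described above, where copies L and M are made first and then merged back into A[p..r] (the program shown earlier merges into a temporary array instead, but the idea is the same; the fixed size 30 matches the array size used in that program, and the function name is illustrative):

void mergeCopies(int A[], int p, int q, int r)
{
    int n1 = q - p + 1, n2 = r - q;
    int L[30], M[30];                            /* copies of the two halves */
    int i, j, k;
    for(i = 0; i < n1; i++) L[i] = A[p + i];     /* L <- A[p..q]             */
    for(j = 0; j < n2; j++) M[j] = A[q + 1 + j]; /* M <- A[q+1..r]           */
    i = 0; j = 0; k = p;
    while(i < n1 && j < n2)                      /* pick the smaller element */
    {
        if(L[i] <= M[j])
            A[k++] = L[i++];
        else
            A[k++] = M[j++];
    }
    while(i < n1) A[k++] = L[i++];               /* leftovers from L         */
    while(j < n2) A[k++] = M[j++];               /* leftovers from M         */
}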
Merge Sort Algorithm
MergeSort(arr[], l, r), where l is the index of the first element & r is
the index of the last element.
If r > l
1. Find the middle index of the array to divide it in two halves:
m = (l+r)/2
2. Call MergeSort for first half:
mergeSort(array, l, m)
3. Call mergeSort for second half:
mergeSort(array, m+1, r)
4. Recursively, merge the two halves in a sorted manner, so that only
one sorted array is left:
merge(array, l, m, r)
CASE            TIME COMPLEXITY    # COMPARISONS    WHEN?
Worst Case      O(N logN)          N logN           Specific distribution
Average Case    O(N logN)          N logN           Average input
Best Case       O(N logN)          N logN           Already sorted

Space Complexity: O(N)
