
SORTING

A sorting algorithm is stable if it preserves the original order of elements with equal key values (where the key
is the value the algorithm sorts by). For example, consider a hand of playing cards that contains two 5s of
different suits. When the cards are sorted by value with a stable sort, the two 5s must remain in the same order
in the sorted output that they were in originally. When they are sorted with a non-stable sort, the 5s may end
up in the opposite order in the sorted output.
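To make the example concrete, here is a small sketch (not part of the original notes) that sorts a hand of cards by value with insertion sort, which is stable; the Card struct, the suits, and the sample hand are made up for illustration.

#include <stdio.h>

/* hypothetical record type for the card example above */
struct Card {
    int value;      /* the key the sort compares             */
    char suit;      /* payload that distinguishes equal keys */
};

/* Insertion sort is stable: an element is only shifted past entries that are
   strictly greater, so equal keys never change their relative order. */
void sort_by_value(struct Card a[], int n)
{
    int i, j;
    struct Card key;

    for (i = 1; i < n; i++)
    {
        key = a[i];
        j = i - 1;
        while (j >= 0 && a[j].value > key.value)   /* strict >, not >= */
        {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}

int main(void)
{
    struct Card hand[] = { {7, 'C'}, {5, 'S'}, {3, 'D'}, {5, 'H'} };
    int i;

    sort_by_value(hand, 4);
    for (i = 0; i < 4; i++)
        printf("%d%c ", hand[i].value, hand[i].suit);
    printf("\n");   /* prints 3D 5S 5H 7C: the 5 of spades stays ahead of the 5 of hearts */
    return 0;
}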
Bubblesort

1. compare 1st and 2nd elements


2. if 1st larger than 2nd, swap
3. compare 2nd and 3rd, and swap if necessary
4. continue until you compare the last two elements
5. the largest element is now the last element in the array
6. repeat starting from the beginning until no swaps are performed (i.e.,
the array is sorted)
7. each time you go through the elements, the largest remaining element bubbles up to the end
8. no need to check the last i elements on the ith pass since the end
elements are already sorted

Program

#include <stdio.h>

int main()
{
    int a[50], n, i, j, temp;

    printf("Enter the size of array: ");
    scanf("%d", &n);

    printf("Enter the array elements:\n");
    for (i = 0; i < n; ++i)
    {
        printf("a[%d]=", i + 1);
        scanf("%d", &a[i]);
    }

    /* After the i-th pass, the largest i elements are in their final places. */
    for (i = 1; i < n; ++i)
        for (j = 0; j < n - i; ++j)
            if (a[j] > a[j + 1])      /* swap adjacent elements that are out of order */
            {
                temp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = temp;
            }

    printf("\nArray after sorting:\n");
    for (i = 0; i < n; ++i)
        printf("a[%d]=%d\n", i + 1, a[i]);

    return 0;
}
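The program above always makes n - 1 passes. As a hedged sketch of step 6 in the list above (stop once a whole pass makes no swaps), the following variant keeps an explicit swapped flag; the function name bubble_sort is my own, not from the notes.

#include <stdio.h>

void bubble_sort(int a[], int n)
{
    int i, j, temp, swapped;

    for (i = 1; i < n; ++i)
    {
        swapped = 0;
        for (j = 0; j < n - i; ++j)       /* the last i - 1 elements are already in place */
            if (a[j] > a[j + 1])
            {
                temp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = temp;
                swapped = 1;
            }
        if (!swapped)                     /* a pass with no swaps means the array is sorted */
            break;
    }
}

int main(void)
{
    int a[] = {5, 1, 4, 2, 8};
    int i;

    bubble_sort(a, 5);
    for (i = 0; i < 5; i++)
        printf("%d ", a[i]);   /* prints 1 2 4 5 8 */
    printf("\n");
    return 0;
}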

Step-by-step example
Let us take the array of numbers "5 1 4 2 8" and sort it from lowest to greatest using bubble sort. Each line
below shows the array before and after one comparison. Three passes will be required.
First Pass

( 5 1 4 2 8 ) → ( 1 5 4 2 8 ), Here, the algorithm compares the first two elements and swaps them since 5 > 1.

( 1 5 4 2 8 ) → ( 1 4 5 2 8 ), Swap since 5 > 4

( 1 4 5 2 8 ) → ( 1 4 2 5 8 ), Swap since 5 > 2

( 1 4 2 5 8 ) → ( 1 4 2 5 8 ), Now, since these elements are already in order (8 > 5), the algorithm does not swap
them.
Second Pass

( 1 4 2 5 8 ) → ( 1 4 2 5 8 )

( 1 4 2 5 8 ) → ( 1 2 4 5 8 ), Swap since 4 > 2

( 1 2 4 5 8 ) → ( 1 2 4 5 8 )

( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
Now, the array is already sorted, but the algorithm does not know if it is completed. The algorithm needs
one whole pass without any swap to know it is sorted.
Third Pass

( 1 2 4 5 8 ) → ( 1 2 4 5 8 )

( 1 2 4 5 8 ) → ( 1 2 4 5 8 )

( 1 2 4 5 8 ) → ( 1 2 4 5 8 )

( 1 2 4 5 8 ) → ( 1 2 4 5 8 )

Insertion Sort

1. algorithm
o passes through each element
o everything before element is sorted
o puts element in appropriate place in sorted half of array by checking
each element starting from the back of the sorted part of the array
2. Code Methods: insertionsort
3. Worst Case O(N²) – when array is reverse sorted
4. Best Case O(N) – when array is already sorted
5. Swap Number
o = number of inversions I (i.e., pairs with i < j but a[i] > a[j]); a counting sketch follows this list
o Running Time = O(I + N)
o Notice that for a sorted array I = 0
o For a reverse-sorted array I = O(N²)
o Average number of inversions in an array of N distinct elements is N *
(N – 1) / 4 = O(N²)
6. Average Case O(N²)
7. Any algorithm that sorts by exchanging adjacent elements requires
Ω(N²) time on average
o Including Bubblesort and Selection Sort
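As a small illustration of item 5 (the counting sketch promised above), the following fragment counts inversions directly by checking every pair; the function name count_inversions and the sample arrays are assumptions for this example only.

#include <stdio.h>

/* Counts pairs (i, j) with i < j but a[i] > a[j], i.e. the inversions I. */
int count_inversions(const int a[], int n)
{
    int i, j, inv = 0;

    for (i = 0; i < n; i++)
        for (j = i + 1; j < n; j++)
            if (a[i] > a[j])
                inv++;
    return inv;
}

int main(void)
{
    int sorted[]   = {1, 2, 4, 5, 8};
    int reversed[] = {8, 5, 4, 2, 1};
    int sample[]   = {5, 1, 4, 2, 8};

    printf("%d\n", count_inversions(sorted, 5));    /* 0: a sorted array has no inversions    */
    printf("%d\n", count_inversions(reversed, 5));  /* 10 = 5 * 4 / 2: every pair is inverted */
    printf("%d\n", count_inversions(sample, 5));    /* 4 */
    return 0;
}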

Insertion sort iterates, consuming one input element each repetition, and growing a sorted output list. At each
iteration, insertion sort removes one element from the input data, finds the location it belongs within the sorted list,
and inserts it there. It repeats until no input elements remain.
Sorting is typically done in-place, by iterating up the array, growing the sorted list behind it. At each array-position, it
checks the value there against the largest value in the sorted list (which happens to be next to it, in the previous
array-position checked). If larger, it leaves the element in place and moves to the next. If smaller, it finds the correct
position within the sorted list, shifts all the larger values up to make a space, and inserts into that correct position.
The resulting array after k iterations has the property that the first k + 1 entries are sorted ("+1" because the first
entry is skipped). In each iteration, the first remaining entry of the input is removed and inserted into the result at the
correct position, thus extending the result: the sorted prefix grows by one element, with each element greater than x
copied one place to the right as it is compared against x (where x is the element being inserted).
The most common variant of insertion sort, which operates on arrays, can be described as follows:

1. Suppose there exists a function called Insert designed to insert a value into a sorted sequence at the
beginning of an array. It operates by beginning at the end of the sequence and shifting each element one
place to the right until a suitable position is found for the new element. The function has the side effect of
overwriting the value stored immediately after the sorted sequence in the array.
2. To perform an insertion sort, begin at the left-most element of the array and invoke Insert to insert each
element encountered into its correct position. The ordered sequence into which the element is inserted is
stored at the beginning of the array in the set of indices already examined. Each insertion overwrites a
single value: the value being inserted.
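A minimal sketch of that Insert-based formulation (the helper name insert_sorted and its exact signature are assumptions, not from the notes):

#include <stdio.h>

/* Inserts value into the sorted run a[0..len-1], overwriting a[len], by
   shifting each larger element one place to the right until a gap opens. */
void insert_sorted(int a[], int len, int value)
{
    int j = len - 1;

    while (j >= 0 && a[j] > value)
    {
        a[j + 1] = a[j];   /* shift right to make room */
        j--;
    }
    a[j + 1] = value;      /* drop the value into the gap */
}

void insertion_sort(int a[], int n)
{
    int i;

    for (i = 1; i < n; i++)        /* a[0..i-1] is already sorted          */
        insert_sorted(a, i, a[i]); /* the single overwritten value is a[i] */
}

int main(void)
{
    int a[] = {5, 1, 4, 2, 8};
    int i;

    insertion_sort(a, 5);
    for (i = 0; i < 5; i++)
        printf("%d ", a[i]);   /* prints 1 2 4 5 8 */
    printf("\n");
    return 0;
}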

Program
#include <stdio.h>

int main()
{
    int i, j, n, temp, a[30];

    printf("\nEnter the number of elements: ");
    scanf("%d", &n);

    printf("\nEnter the elements\n");
    for (i = 0; i < n; i++)
    {
        printf("Enter a[%d] = ", i + 1);
        scanf("%d", &a[i]);
    }

    for (i = 1; i <= n - 1; i++)
    {
        temp = a[i];                      /* element to be inserted into the sorted prefix a[0..i-1] */
        j = i - 1;
        while (j >= 0 && temp < a[j])     /* check j >= 0 before reading a[j] */
        {
            a[j + 1] = a[j];              /* move larger element one place forward */
            j = j - 1;
        }
        a[j + 1] = temp;                  /* insert element in its proper place */
    }

    printf("\nSorted list is as follows\n");
    for (i = 0; i < n; i++)
        printf("a[%d] = %d\n", i + 1, a[i]);

    return 0;
}

Mergesort

1. Code Methods: mergesort (2), merge


2. Worst Case: O(NlogN)
3. Recursively merges two sorted lists
4. Time to merge two sorted lists is linear (N-1 comparisons)
5. 1 13 24 26 merge 2 15 27 38 gives 1 2 13 15 24 26 27 38
6. Classic Divide and Conquer Strategy
7. If more than one element, divide and merge sort the first and second
half
8. Analysis
1. Recurrence Relation
2. T(1) = 1
3. T(N) = 2T(N/2) + N
4. T(N)/N = T(N/2)/(N/2) + 1
5. T(N/2)/(N/2) = T(N/4)/(N/4) + 1
6. ... down to T(2)/2 = T(1)/1 + 1
7. Sum up all of the equations; the intermediate terms cancel (telescope)
8. T(N)/N = T(1)/1 + logN <- this logN is the sum of the 1’s, one per equation
9. T(N) = NlogN + N = O(NlogN)
9. Uses extra memory to merge and copy back into array
10. Merging is cornerstone of external sorting routines
11. Not often used for main memory sorts
12. Can also do it non-recursively (see the bottom-up sketch after this list)
13. Or can use less memory – much more complex algorithm (impractical)
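As noted in item 12, merge sort can also be written without recursion. The following is a hedged sketch of a bottom-up version: runs of width 1, 2, 4, ... are merged on each pass. The function names and the fixed MAX buffer size are assumptions for illustration, not the notes' own code.

#include <stdio.h>
#include <string.h>

#define MAX 50

/* Merges the sorted runs a[lo..mid-1] and a[mid..hi-1] into temp[lo..hi-1]. */
static void merge_runs(const int a[], int temp[], int lo, int mid, int hi)
{
    int i = lo, j = mid, k = lo;

    while (i < mid && j < hi)
        temp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid)
        temp[k++] = a[i++];
    while (j < hi)
        temp[k++] = a[j++];
}

void mergesort_iterative(int a[], int n)
{
    int temp[MAX];
    int width, lo, mid, hi;

    for (width = 1; width < n; width *= 2)       /* run length doubles each pass     */
    {
        for (lo = 0; lo < n; lo += 2 * width)    /* merge each pair of adjacent runs */
        {
            mid = lo + width;
            hi  = lo + 2 * width;
            if (mid > n) mid = n;
            if (hi > n)  hi = n;
            merge_runs(a, temp, lo, mid, hi);
        }
        memcpy(a, temp, n * sizeof(int));        /* copy the merged runs back */
    }
}

int main(void)
{
    int a[] = {38, 27, 43, 3, 9, 82, 10};
    int i;

    mergesort_iterative(a, 7);
    for (i = 0; i < 7; i++)
        printf("%d ", a[i]);   /* prints 3 9 10 27 38 43 82 */
    printf("\n");
    return 0;
}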

Program

#include <stdio.h>

void mergesort(int a[], int i, int j);
void merge(int a[], int i1, int j1, int i2, int j2);

int main()
{
    int a[30], n, i;

    printf("Enter no of elements: ");
    scanf("%d", &n);

    printf("Enter array elements: ");
    for (i = 0; i < n; i++)
        scanf("%d", &a[i]);

    mergesort(a, 0, n - 1);

    printf("\nSorted array is: ");
    for (i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");

    return 0;
}

/* Recursively sorts a[i..j] */
void mergesort(int a[], int i, int j)
{
    int mid;

    if (i < j)
    {
        mid = (i + j) / 2;
        mergesort(a, i, mid);         /* sort the first half          */
        mergesort(a, mid + 1, j);     /* sort the second half         */
        merge(a, i, mid, mid + 1, j); /* merge the two sorted halves  */
    }
}

/* Merges the sorted runs a[i1..j1] and a[i2..j2] into one sorted run */
void merge(int a[], int i1, int j1, int i2, int j2)
{
    int temp[50];
    int i, j, k;

    i = i1;
    j = i2;
    k = 0;

    while (i <= j1 && j <= j2)        /* pick the smaller head element */
    {
        if (a[i] < a[j])
            temp[k++] = a[i++];
        else
            temp[k++] = a[j++];
    }
    while (i <= j1)                   /* copy any leftovers from the first run  */
        temp[k++] = a[i++];
    while (j <= j2)                   /* copy any leftovers from the second run */
        temp[k++] = a[j++];

    for (i = i1, j = 0; i <= j2; i++, j++)
        a[i] = temp[j];               /* copy the merged run back into a[] */
}

Like QuickSort, Merge Sort is a Divide and Conquer algorithm. It divides the input array into two
halves, calls itself for the two halves, and then merges the two sorted halves. The merge()
function is used for merging the two halves. The call merge(arr, l, m, r) is the key process: it
assumes that arr[l..m] and arr[m+1..r] are sorted and merges the two sorted sub-arrays
into one. See the C implementation above for details.
MergeSort(arr[], l, r)
If r > l
1. Find the middle point to divide the array into two halves:
middle m = (l+r)/2
2. Call mergeSort for first half:
Call mergeSort(arr, l, m)
3. Call mergeSort for second half:
Call mergeSort(arr, m+1, r)
4. Merge the two halves sorted in step 2 and 3:
Call merge(arr, l, m, r)

As an example, take the array {38, 27, 43, 3, 9, 82, 10}. The array is recursively divided into two
halves until the size becomes 1. Once the size becomes 1, the merge process comes into action
and starts merging the pieces back together until the complete array is merged.
Quicksort

1. Code Methods: quicksort (2), median3


2. Fastest known sorting algorithm in practice
3. Worst Case: O(N2)
4. Average Case: O(NlogN)
5. Worst Case can be made very unlikely with little effort
6. Divide and Conquer Algorithm
7. Algorithm
1. If the number of elements in S is 0 or 1, return
2. Pick any element v in S. This is called the pivot.
3. Partition the remaining elements of S into two groups, S1 with those below the pivot
(numerically) and S2 with those above the pivot.
4. Return quicksort of S1 followed by v followed by quicksort of S2
8. Hope that half the keys are greater than the pivot and the other half are less
9. Subproblems are not guaranteed to be of equal size, which is potentially
bad
10. Faster than mergesort because partitioning step can be performed in
place and efficiently
11. Picking the Pivot
1. Pick first element
1. Awful performance if list is sorted or reverse sorted
2. All elements will go in S1 or S2
3. Will take quadratic time and do nothing at all
4. Presorted input or mostly sorted input is quite frequent
5. Horrible pivot point
2. Pick larger of the first two elements
1. bad for the same reasons as above
3. Pick pivot randomly
1. Good solution
2. Unless random number generator has a flaw (which is common)
3. Random number generation expensive
4. Median of Three
1. Best choice would be the median of the array
2. Would take too long
3. Take the median of the first, last, and middle elements (see the sketch after this list)
4. eliminates the sorted case
5. reduces the running time of quicksort by about 5%
12. Partitioning Strategy
1. Swap pivot into last element
2. Assuming all distinct elements
3. while i less than j
4. Move i to right while elements less than pivot
5. Move j to left while elements greater than pivot
6. When i and j have stopped – swap elements and continue
7. Once i and j have crossed, stop
8. i is at lowest element greater than pivot
9. swap i and pivot
10. Duplicates equal to the pivot
1. both i and j go past the pivot
2. Otherwise, if only i does, all duplicates will be put in S1
3. if both go past and all elements are the same – could create very uneven
subarrays
4. giving O(N²) results
5. best to make both stop and swap, since the extra swaps are better
than uneven subarrays
6. Why would we sort identical elements?
7. Think of sorting 100,000 elements where 5,000 are identical
8. Eventually quicksort will be called on the 5,000 identical elements (since quicksort is
recursive)
9. Will want these to be sorted efficiently
13. Small Arrays
1. N <= 20
2. Insertion sort is faster
3. Use insertion sort inside of quicksort for arrays smaller than 20
4. Faster by 15% in running time
5. Good cutoff is N = 10, although any number between 5 and 20 is fine
6. Avoids nasty cases – such as taking the median of three elements when there are only
two
14. Routines
1. Driver calls quicksort method that specifies the start and stop of
array to be sorted
2. Median of Three Pivot selection
1. returns the pivot
2. places the smallest of left, middle, and right at the left
3. places the largest at the right
4. places the median in the middle
5. then swaps the pivot into the position right - 1
3. Heart of Quicksort
1. Calls insertion sort on small arrays
2. Skips the outer elements since they were handled in median3, using the ++ and --
3. restores the pivot into the center from right - 1
4. Need to increase i and j in each loop iteration (i.e., not inside the while condition) because
otherwise i and j will never change and will always equal the pivot
15. Analysis
1. Worst Case: O(N²)
2. Best Case: O(NlogN), using the same proof as merge sort
3. Average Case: O(NlogN)
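As referenced in item 11.4, here is a sketch of median-of-three pivot selection. The name median3 follows the notes, but the exact bookkeeping (sorting left, center, right in place and then stashing the pivot at right - 1) is one common convention and assumes the range holds at least three elements.

#include <stdio.h>

static void swap(int *x, int *y)
{
    int t = *x;
    *x = *y;
    *y = t;
}

/* Orders a[left], a[center], a[right] so the smallest is at left and the
   largest at right, then hides the median (the pivot) at right - 1. */
int median3(int a[], int left, int right)
{
    int center = (left + right) / 2;

    if (a[center] < a[left])   swap(&a[left],   &a[center]);
    if (a[right]  < a[left])   swap(&a[left],   &a[right]);
    if (a[right]  < a[center]) swap(&a[center], &a[right]);

    swap(&a[center], &a[right - 1]);   /* stash the pivot just before the end */
    return a[right - 1];               /* return the pivot value              */
}

int main(void)
{
    int a[] = {8, 1, 4, 9, 6, 3, 5, 2, 7, 0};
    int pivot = median3(a, 0, 9);

    printf("pivot = %d\n", pivot);     /* median of 8, 6, 0 is 6 */
    return 0;
}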
Program

#include <stdio.h>

void quick_sort(int[], int, int);
int partition(int[], int, int);

int main()
{
    int a[50], n, i;

    printf("How many elements? ");
    scanf("%d", &n);

    printf("\nEnter array elements: ");
    for (i = 0; i < n; i++)
        scanf("%d", &a[i]);

    quick_sort(a, 0, n - 1);

    printf("\nArray after sorting: ");
    for (i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");

    return 0;
}

/* Recursively sorts a[l..u] */
void quick_sort(int a[], int l, int u)
{
    int j;

    if (l < u)
    {
        j = partition(a, l, u);    /* a[j] is now in its final position */
        quick_sort(a, l, j - 1);
        quick_sort(a, j + 1, u);
    }
}

/* Partitions a[l..u] around the pivot a[l] and returns the pivot's final index */
int partition(int a[], int l, int u)
{
    int v, i, j, temp;

    v = a[l];          /* pivot: first element of the range */
    i = l;
    j = u + 1;

    do
    {
        do
            i++;                       /* move i right while elements are less than the pivot */
        while (i <= u && a[i] < v);

        do
            j--;                       /* move j left while elements are greater than the pivot */
        while (v < a[j]);

        if (i < j)                     /* i and j have stopped: swap the out-of-place pair */
        {
            temp = a[i];
            a[i] = a[j];
            a[j] = temp;
        }
    } while (i < j);

    a[l] = a[j];                       /* put the pivot into its final position */
    a[j] = v;

    return j;
}

Quicksort (sometimes called partition-exchange sort) is an efficient sorting algorithm, serving as a systematic
method for placing the elements of an array in order. Developed by Tony Hoare in 1959[1] and published in 1961,[2] it is
still a commonly used algorithm for sorting. When implemented well, it can be about two or three times faster than its
main competitors, merge sort and heapsort.[3]
Quicksort is a comparison sort, meaning that it can sort items of any type for which a "less-than" relation (formally,
a total order) is defined. In efficient implementations it is not a stable sort, meaning that the relative order of equal sort
items is not preserved. Quicksort can operate in-place on an array, requiring small additional amounts of memory to
perform the sorting. It is very similar to selection sort, except that it does not always choose worst-case partition.
Mathematical analysis of quicksort shows that, on average, the algorithm takes O(n log n) comparisons to
sort n items. In the worst case, it makes O(n²) comparisons, though this behavior is rare.
