
Name: Rochak Bhattarai

Roll No: Cs190175

CASE STUDY (Group 6)
1. Q. Write about 20-30 different sorting algorithms, give their time and space
complexities, and provide a comparison view (G-6).
1.SELECTION SORT
 Step 1: Repeat Steps 2 and 3 for K = 1 to N-1
 Step 2: CALL SMALLEST (ARR, K, N, POS)
 Step 3: SWAP ARR[K] with ARR[POS]
[END OF LOOP]
 Step 4: EXIT

SMALLEST (ARR, K, N, POS)

 Step 1: [INITIALIZE] SET SMALL = ARR[K]


 Step 2: [INITIALIZE] SET POS = K
 Step 3: Repeat for J = K+1 to N -1
IF SMALL > ARR[J]
SET SMALL = ARR[J]
SET POS = J
[END OF IF]
[END OF LOOP]
 Step 4: RETURN POS

Time complexity and space complexity

Complexity      Best Case      Average Case      Worst Case
Time            Ω(n²)          θ(n²)             O(n²)
Space           O(1)

2.HEAP SORT

HEAP_SORT (ARR, N)
 Step 1: [Build Heap H]
Repeat for i = 0 to N-1
CALL INSERT_HEAP (ARR, N, ARR[i])
[END OF LOOP]
 Step 2: Repeatedly delete the root element
Repeat while N > 0
CALL DELETE_HEAP (ARR, N, VAL)
SET N = N - 1
[END OF LOOP]
 Step 3: END

Time complexity and space complexity

Complexity Best Case Average Case Worst case


Time Complexity Ω (n log (n)) θ (n log (n)) O (n log (n))
Space Complexity O (1)

3.COUNTING SORT

 STEP 1 START
 STEP 2 Store the input array
 STEP 3 Count the number of occurrences of each key value
 STEP 4 Update the count array by adding the previous counts (prefix sums), so each
count gives the final position of that key
 STEP 5 Build the sorted output by placing each object at its counted position and
decrementing the count (key = key - 1)
 STEP 6 STOP
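As a concrete illustration of these steps, below is a minimal C sketch of counting sort for non-negative integer keys; the function name counting_sort and the max_key parameter are assumptions of this sketch, not part of the original description.

#include <stdio.h>
#include <string.h>

/* Counting sort for non-negative integers no larger than max_key. */
void counting_sort(int arr[], int n, int max_key)
{
    int count[max_key + 1];
    int output[n];
    memset(count, 0, sizeof count);
    for (int i = 0; i < n; i++)                       /* STEP 3: count occurrences     */
        count[arr[i]]++;
    for (int k = 1; k <= max_key; k++)                /* STEP 4: prefix sums           */
        count[k] += count[k - 1];
    for (int i = n - 1; i >= 0; i--)                  /* STEP 5: stable placement      */
        output[--count[arr[i]]] = arr[i];
    memcpy(arr, output, n * sizeof arr[0]);
}

int main(void)
{
    int a[] = {4, 2, 2, 8, 3, 3, 1};
    counting_sort(a, 7, 8);
    for (int i = 0; i < 7; i++) printf("%d ", a[i]);  /* prints 1 2 2 3 3 4 8 */
    printf("\n");
    return 0;
}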

Time complexity and space complexity

Complexity Best Case Average Case Worst Case


Time Complexity Ω(n+k) θ(n+k) O(n+k)
Space Complexity O(k)

4.QUICK SORT

PARTITION (ARR, BEG, END, LOC)

 Step 1: [INITIALIZE] SET LEFT = BEG, RIGHT = END, LOC = BEG, FLAG = 0
 Step 2: Repeat Steps 3 to 6 while FLAG = 0
 Step 3: Repeat while ARR[LOC] <=ARR[RIGHT]
AND LOC!= RIGHT
SET RIGHT = RIGHT - 1
[END OF LOOP]
 Step 4: IF LOC = RIGHT
SET FLAG = 1
ELSE IF ARR[LOC] > ARR[RIGHT]
SWAP ARR[LOC] with ARR[RIGHT]
SET LOC = RIGHT
[END OF IF]
 Step 5: IF FLAG = 0
Repeat while ARR[LOC] >= ARR[LEFT] AND LOC != LEFT
SET LEFT = LEFT + 1
[END OF LOOP]
 Step 6:IF LOC = LEFT
SET FLAG = 1
ELSE IF ARR[LOC] < ARR[LEFT]
SWAP ARR[LOC] with ARR[LEFT]
SET LOC = LEFT
[END OF IF]
[END OF IF]
 Step 7: [END OF LOOP]
 Step 8: END

QUICK_SORT (ARR, BEG, END)

 Step 1: IF (BEG < END)


CALL PARTITION (ARR, BEG, END, LOC)
CALL QUICK_SORT (ARR, BEG, LOC - 1)
CALL QUICK_SORT (ARR, LOC + 1, END)
[END OF IF]
 Step 2: END

Time complexity and space complexity

Complexity          Best Case                               Average Case     Worst Case
Time Complexity     O(n) with 3-way partition, or           O(n log n)       O(n²)
                    O(n log n) with simple partition
Space Complexity    O(log n)
5.BUBBLE SORT

 Step 1: Repeat Step 2 For i = 0 to N-1
 Step 2: Repeat For J = 0 to N - i - 2
 Step 3: IF A[J] > A[J + 1]
SWAP A[J] and A[J + 1]
[END OF INNER LOOP]
[END OF OUTER LOOP]
 Step 4: EXIT

Time complexity and space complexity

Space                        O(1)
Worst case running time      O(n²)
Average case running time    O(n²)
Best case running time       O(n)

6.MERGE SORT

MERGE (ARR, BEG, MID, END)

 Step 1: [INITIALIZE] SET I = BEG, J = MID + 1, INDEX = 0


 Step 2: Repeat while (I <= MID) AND (J<=END)
IF ARR[I] < ARR[J]
SET TEMP[INDEX] = ARR[I]
SET I = I + 1
ELSE
SET TEMP[INDEX] = ARR[J]
SET J = J + 1
[END OF IF]
SET INDEX = INDEX + 1
[END OF LOOP]
 Step 3: [Copy the remaining elements of the right sub-array, if any]
IF I > MID
Repeat while J <= END
SET TEMP[INDEX] = ARR[J]
SET INDEX = INDEX + 1, SET J = J + 1
[END OF LOOP]
[Copy the remaining elements of the left sub-array, if any]
ELSE
Repeat while I <= MID
SET TEMP[INDEX] = ARR[I]
SET INDEX = INDEX + 1, SET I = I + 1
[END OF LOOP]
[END OF IF]
 Step 4: [Copy the contents of TEMP back to ARR] SET K = 0
 Step 5: Repeat while K < INDEX
SET ARR[K] = TEMP[K]
SET K = K + 1
[END OF LOOP]
 Step 6: Exit

MERGE_SORT(ARR, BEG, END)


 Step 1: IF BEG < END


SET MID = (BEG + END)/2
CALL MERGE_SORT (ARR, BEG, MID)
CALL MERGE_SORT (ARR, MID + 1, END)
MERGE (ARR, BEG, MID, END)
[END OF IF]
 Step 2: END

Time complexity and space complexity

Complexity Best case Average Case Worst Case


Time Complexity O (n log n) O (n log n) O (n log n)
Space Complexity O(n)

7.CYCLE SORT

cycleSort(array, n)
Begin
    for start := 0 to n - 2 do
        key := array[start]
        location := start
        for i := start + 1 to n - 1 do
            if array[i] < key then
                location := location + 1
        done
        if location = start then
            continue with the next iteration (item already in place)
        while key = array[location] do
            location := location + 1
        done
        if location ≠ start then
            swap array[location] with key
        while location ≠ start do
            location := start
            for i := start + 1 to n - 1 do
                if array[i] < key then
                    location := location + 1
            done
            while key = array[location] do
                location := location + 1
            done
            if key ≠ array[location] then
                swap array[location] and key
        done
    done
End

Output − The sorted Array

The complexity of Cycle Sort Technique


 Time Complexity: O(n^2)
 Space Complexity: O(1)

8.INSERTION SORT

 Step 1: Repeat Steps 2 to 5 for K = 1 to N-1
 Step 2: SET TEMP = ARR[K]
 Step 3: SET J = K - 1
 Step 4: Repeat while J >= 0 AND TEMP < ARR[J]
SET ARR[J + 1] = ARR[J]
SET J = J - 1
[END OF INNER LOOP]
 Step 5: SET ARR[J + 1] = TEMP
[END OF LOOP]
 Step 6: EXIT

Time complexity and space complexity

Complexity      Best Case      Average Case      Worst Case
Time            Ω(n)           θ(n²)             O(n²)
Space           O(1)

9.RADIX SORT

 Step 1:Find the largest number in ARR as LARGE


 Step 2: [INITIALIZE] SET NOP = Number of digits
in LARGE
 Step 3: SET PASS =0
 Step 4: Repeat Steps 5 to 10 while PASS <= NOP-1
 Step 5: SET I = 0 and INITIALIZE buckets
 Step 6: Repeat Steps 7 to 9 while I < N
 Step 7: SET DIGIT = digit at PASSth place in ARR[I]
 Step 8: Add ARR[I] to the bucket numbered DIGIT
 Step 9: INCREMENT bucket count for bucket
numbered DIGIT
[END OF LOOP]
 Step 10: Collect the numbers in the bucket
[END OF LOOP]
 Step 11: END

TIME AND SPACE COMPLEXITY

Complexity Best Case Average Case Worst Case


Time Complexity Ω(n+k) θ(nk) O(nk)
Space Complexity O(n+k)

10.PIGEONHOLE SORT

1. for i=1 to s

2. U[i] =0

3. for j=1 to length[T]

4. U [T[ j ]]= U [T[ j ]]+1

5. q=1

6. for j =1 to s

7. while U[ j ]>0

8. T[q]=j

9. U[ j ]=U[ j ]-1

10.q=q+1

TIME AND SPACE COMPLEXITY

Pigeonhole

Time complexity= O(n+2^k)

Space Complexity= O(2^k)

11.TREE SORT
insert (Node node, Key value): Inserts a new node in the BST
Data: node: The input node of the object
value: The value of the node key
Result: Returns the inserted node object
if node == null then
return Node(value);
end
else if value <= node.value then
node.left <- insert(node.left, value);
end
else
node.right <- insert(node.right, value);
end
return node;

Space Complexity: O(n)


Time Complexity:
Best Case: O(n log n)
Average Case: O(n log n)
Worst Case: O(n²)

12.SHELL SORT
Shell Sort(Arr, n)

 Step 1: SET FLAG = 1, GAP_SIZE = N


 Step 2: Repeat Steps 3 to 6 while FLAG = 1 OR GAP_SIZE > 1
 Step 3:SET FLAG = 0
 Step 4:SET GAP_SIZE = (GAP_SIZE + 1) / 2
 Step 5:Repeat Step 6 for I = 0 to I < (N -GAP_SIZE)
 Step 6: IF Arr[I] > Arr[I + GAP_SIZE]
SWAP Arr[I], Arr[I + GAP_SIZE]
SET FLAG = 1
 Step 7: END

TIME COMPLEXITY AND SPACE COMPLEXITY


Complexity          Best Case        Average Case        Worst Case
Time Complexity     Ω(n log(n))      θ(n (log(n))²)      O(n (log(n))²)
Space Complexity    O(1)

13.COMB SORT
 STEP 1 START
 STEP 2 Calculate the gap value; if the gap value == 1 go to step 5, else go to step 3
 STEP 3 Iterate over the data set and compare each item with the item one gap away, then
go to step 4
 STEP 4 Swap the elements if required, then go back to step 2
 STEP 5 Print the sorted array
 STEP 6 STOP
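The loop structure above can be sketched in C roughly as follows; the shrink factor of 1.3 and the function name comb_sort are common conventions assumed here, not given in the original steps.

#include <stdio.h>
#include <stdbool.h>

/* Comb sort: bubble-sort-style passes with a gap that shrinks by ~1.3 each pass. */
void comb_sort(int a[], int n)
{
    int gap = n;
    bool swapped = true;
    while (gap > 1 || swapped) {
        gap = (gap * 10) / 13;                    /* shrink the gap (STEP 2)              */
        if (gap < 1) gap = 1;
        swapped = false;
        for (int i = 0; i + gap < n; i++) {       /* compare items one gap apart (STEP 3) */
            if (a[i] > a[i + gap]) {
                int t = a[i]; a[i] = a[i + gap]; a[i + gap] = t;   /* STEP 4 */
                swapped = true;
            }
        }
    }
}

int main(void)
{
    int a[] = {8, 4, 1, 56, 3, -44, 23, -6, 28, 0};
    comb_sort(a, 10);
    for (int i = 0; i < 10; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}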

TIME COMPLEXITY AND SPACE COMPLEXITY

Algorithm Complexity
Worst Case Complexity           O(n²)
Best Case Complexity            Ω(n log n)
Average Case Complexity         O(n²/2^p), where p is the number of increments
Worst Case Space Complexity     O(1)

14.BITONIC SORT
Step 1 − Create a bitonic sequence.
Step 2 − Now we have a bitonic sequence, with one half in increasing order and the
other half in decreasing order.
Step 3 − Compare and swap the first elements of both halves, then the second,
third, fourth elements, and so on.
Step 4 − Compare and swap every second element of the sequence.
Step 5 − Finally, compare and swap adjacent elements of the sequence.
Step 6 − After all swaps, we get the sorted array.
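As a rough illustration of these steps, here is a minimal recursive bitonic sort in C; it assumes the array length is a power of two, and the names bitonic_sort, bitonic_merge and the dir flag are assumptions of this sketch rather than part of the original description.

#include <stdio.h>

/* Merge a bitonic sequence a[lo..lo+cnt-1] into ascending (dir=1) or descending (dir=0) order. */
static void bitonic_merge(int a[], int lo, int cnt, int dir)
{
    if (cnt > 1) {
        int k = cnt / 2;
        for (int i = lo; i < lo + k; i++)           /* compare/swap elements k apart (Steps 3-5) */
            if ((a[i] > a[i + k]) == dir) {
                int t = a[i]; a[i] = a[i + k]; a[i + k] = t;
            }
        bitonic_merge(a, lo, k, dir);
        bitonic_merge(a, lo + k, k, dir);
    }
}

/* Build a bitonic sequence (ascending half + descending half), then merge it. */
void bitonic_sort(int a[], int lo, int cnt, int dir)
{
    if (cnt > 1) {
        int k = cnt / 2;
        bitonic_sort(a, lo, k, 1);        /* first half ascending  (Steps 1-2) */
        bitonic_sort(a, lo + k, k, 0);    /* second half descending            */
        bitonic_merge(a, lo, cnt, dir);
    }
}

int main(void)
{
    int a[] = {3, 7, 4, 8, 6, 2, 1, 5};   /* length must be a power of two */
    bitonic_sort(a, 0, 8, 1);
    for (int i = 0; i < 8; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}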
TIME AND SPACE COMPLEXITY

In order to form a sorted sequence of length n from two sorted sequences of length n/2, log(n)
comparator stages are required (e.g. log(8) = 3 stages for n = 8). The number of comparator
stages T(n) of the entire sorting network is given by:

T(n) = log(n) + T(n/2)


The solution of this recurrence equation is

T(n) = log(n) + log(n)-1 + log(n)-2 + ... + 1 = log(n) · (log(n)+1) / 2


Each stage of the sorting network consists of n/2 comparators. On the whole, these are Θ(n·log(n)2)
comparators.

15.COCKTAIL SORT

1. The first stage loops through the array from left to right, like bubble sort. Adjacent
elements are compared, and if the left element is greater than the right element,
they are swapped. The largest element of the list ends up at the end of the array
in this forward pass.
2. The second stage loops through the array from the rightmost unsorted element to
the left. Adjacent elements are compared, and if the right element is smaller than
the left element, they are swapped. The smallest element of the list ends up at the
beginning of the array in this backward pass.
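A minimal C sketch of the two passes described above; the function name cocktail_sort and the shrinking start/end bounds are assumptions of this sketch.

#include <stdio.h>

void cocktail_sort(int a[], int n)
{
    int start = 0, end = n - 1, swapped = 1;
    while (swapped) {
        swapped = 0;
        for (int i = start; i < end; i++)          /* forward pass: push max right */
            if (a[i] > a[i + 1]) {
                int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                swapped = 1;
            }
        if (!swapped) break;
        end--;
        swapped = 0;
        for (int i = end - 1; i >= start; i--)     /* backward pass: push min left */
            if (a[i] > a[i + 1]) {
                int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                swapped = 1;
            }
        start++;
    }
}

int main(void)
{
    int a[] = {5, 1, 4, 2, 8, 0, 2};
    cocktail_sort(a, 7);
    for (int i = 0; i < 7; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}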

TIME AND SPACE COMPLEXITY

Complexity          Best Case      Average Case      Worst Case
Time Complexity     O(n)           O(n²)             O(n²)
Space Complexity    O(1)

16.TIM SORT
Consider an array of n elements which needs to be sorted. In Tim sort, the array is
divided into several parts, each of which is called a run. These runs are sorted one
by one using insertion sort, and the sorted runs are then combined using the merge
step of merge sort. The idea behind Tim sort is that insertion sort works more
efficiently on short lists than on large lists.

1. Divide the array into blocks known as runs.
2. Consider the size of a run to be either 32 or 64 (in the implementation below, the
run size is 32).
3. Sort the elements of every run using insertion sort.
4. Merge the sorted runs one by one using the merge function of merge sort.
5. Double the size of the merged sub-arrays after every iteration.
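A simplified Tim sort following these five steps is sketched below in C; real Tim sort additionally detects natural runs and uses galloping merges, so treat this as an illustrative bottom-up version with a fixed run size of 32 (the names tim_sort, insertion_sort and merge are assumptions of this sketch).

#include <stdio.h>
#include <stdlib.h>

#define RUN 32

static int min_int(int a, int b) { return a < b ? a : b; }

/* Insertion sort on a[left..right], inclusive. */
static void insertion_sort(int a[], int left, int right)
{
    for (int i = left + 1; i <= right; i++) {
        int key = a[i], j = i - 1;
        while (j >= left && a[j] > key) { a[j + 1] = a[j]; j--; }
        a[j + 1] = key;
    }
}

/* Merge the two sorted halves a[l..m] and a[m+1..r]. */
static void merge(int a[], int l, int m, int r)
{
    int n1 = m - l + 1, n2 = r - m;
    int *L = malloc(n1 * sizeof *L), *R = malloc(n2 * sizeof *R);
    for (int i = 0; i < n1; i++) L[i] = a[l + i];
    for (int j = 0; j < n2; j++) R[j] = a[m + 1 + j];
    int i = 0, j = 0, k = l;
    while (i < n1 && j < n2) a[k++] = (L[i] <= R[j]) ? L[i++] : R[j++];
    while (i < n1) a[k++] = L[i++];
    while (j < n2) a[k++] = R[j++];
    free(L); free(R);
}

/* Steps 1-5: insertion-sort fixed-size runs, then merge runs of doubling size. */
void tim_sort(int a[], int n)
{
    for (int i = 0; i < n; i += RUN)
        insertion_sort(a, i, min_int(i + RUN - 1, n - 1));
    for (int size = RUN; size < n; size *= 2)
        for (int left = 0; left < n; left += 2 * size) {
            int mid = left + size - 1;
            int right = min_int(left + 2 * size - 1, n - 1);
            if (mid < right) merge(a, left, mid, right);
        }
}

int main(void)
{
    int a[] = {5, 21, 7, 23, 19, 10, 12, 1, 4, 8};
    tim_sort(a, 10);
    for (int i = 0; i < 10; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}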

TIME COMPLEXITY AND SPACE COMPLEXITY

Complexity Best Case Average Case Worst Case


Time Complexity     O(n)           O(n log n)        O(n log n)
Space Complexity    O(n)
17.STRAND SORT

Below are simple steps used in the algorithm.


1. Let ip[] be the input list and op[] be the output list.
2. Create an empty sublist and move the first item of ip[] to it.
3. Traverse the remaining items of ip[]. For every item x, check if x is greater than the
last item inserted into the sublist. If yes, remove x from ip[] and add it at the end of
the sublist. If no, ignore x (keep it in ip[]).
4. Merge the sublist into op[] (the output list).
5. Recur for the remaining items in ip[] and the current items in op[].
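An array-based C sketch of these steps; strand sort is often written with linked lists, and the names strand_sort and merge_into below are assumptions of this sketch.

#include <stdio.h>

/* Merge the sorted strand[] (length s) into the sorted out[] (length m); returns the new length. */
static int merge_into(int out[], int m, const int strand[], int s, int tmp[])
{
    int i = 0, j = 0, k = 0;
    while (i < m && j < s) tmp[k++] = (out[i] <= strand[j]) ? out[i++] : strand[j++];
    while (i < m) tmp[k++] = out[i++];
    while (j < s) tmp[k++] = strand[j++];
    for (i = 0; i < k; i++) out[i] = tmp[i];
    return k;
}

void strand_sort(int a[], int n)
{
    int in[n], strand[n], out[n], tmp[n];
    int in_len = n, out_len = 0;
    for (int i = 0; i < n; i++) in[i] = a[i];
    while (in_len > 0) {
        int s = 0, rest = 0;
        strand[s++] = in[0];                                   /* step 2: start a new strand    */
        for (int i = 1; i < in_len; i++) {
            if (in[i] >= strand[s - 1]) strand[s++] = in[i];   /* step 3: extend the strand     */
            else in[rest++] = in[i];                           /* otherwise keep it in ip       */
        }
        in_len = rest;
        out_len = merge_into(out, out_len, strand, s, tmp);    /* step 4: merge strand into op  */
    }
    for (int i = 0; i < n; i++) a[i] = out[i];
}

int main(void)
{
    int a[] = {10, 5, 30, 40, 2, 4, 9};
    strand_sort(a, 7);
    for (int i = 0; i < 7; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}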
TIME AND SPACE COMPLEXITY
Time: best case O(n), worst case O(n²)

18.PANCAKE SORTING
Let the given array be arr[] and its size be n.
 Start from a current size equal to n and reduce the current size by one while it
is greater than 1. Let the current size be curr_size. Do the following for every
curr_size:
o Find the index of the maximum element in arr[0..curr_size-1]. Let this
index be 'mi'.
o Call flip(arr, mi) to bring the maximum to the front.
o Call flip(arr, curr_size-1) to move it to its final position.
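A minimal C sketch of this procedure, where flip() reverses the prefix arr[0..k]; the function names follow the steps above, the rest is an assumption of this sketch.

#include <stdio.h>

/* Reverse the prefix arr[0..k]. */
static void flip(int arr[], int k)
{
    for (int i = 0; i < k; i++, k--) {
        int t = arr[i]; arr[i] = arr[k]; arr[k] = t;
    }
}

void pancake_sort(int arr[], int n)
{
    for (int curr_size = n; curr_size > 1; curr_size--) {
        int mi = 0;                                    /* index of max in arr[0..curr_size-1] */
        for (int i = 1; i < curr_size; i++)
            if (arr[i] > arr[mi]) mi = i;
        if (mi != curr_size - 1) {
            flip(arr, mi);                             /* bring the max to the front   */
            flip(arr, curr_size - 1);                  /* flip it into its final place */
        }
    }
}

int main(void)
{
    int a[] = {23, 10, 20, 11, 12, 6, 7};
    pancake_sort(a, 7);
    for (int i = 0; i < 7; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}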

TIME COMPLEXITY AND SPACE COMPLEXITY

Time complexity = O(n²)

Space complexity = O(n)

19.BOGO SORTING
while not Sorted(list) do
    shuffle(list)
done

To generate a permutation (shuffle):
1. for i < n
2.     swap a[i] with a randomly chosen element
3. while not Sorted(list) do
    shuffle(list)
done
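A C sketch of bogo sort using a Fisher-Yates shuffle; the shuffle implementation is an assumption here, since any random permutation generator works.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static int is_sorted(const int a[], int n)
{
    for (int i = 1; i < n; i++)
        if (a[i - 1] > a[i]) return 0;
    return 1;
}

/* Fisher-Yates shuffle: swap each element with a random earlier-or-equal position. */
static void shuffle(int a[], int n)
{
    for (int i = n - 1; i > 0; i--) {
        int j = rand() % (i + 1);
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }
}

void bogo_sort(int a[], int n)
{
    while (!is_sorted(a, n))
        shuffle(a, n);
}

int main(void)
{
    int a[] = {3, 1, 2};                  /* keep it tiny: expected runtime grows like (n+1)! */
    srand((unsigned)time(NULL));
    bogo_sort(a, 3);
    for (int i = 0; i < 3; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}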

TIME COMPLEXITY AND SPACE COMPLEXITY

Worst-case performance: unbounded (randomized version), O((n+1)!) (deterministic version)
Best-case performance: O(n)[1]
Average performance: O((n+1)!)[1]
Worst-case space complexity: O(1)

20.GNOME SORTING
Gnome sort, also called stupid sort, is based on the idea of a garden gnome sorting
his flower pots.
1. If you are at the start of the array, move to the right element (from arr[0] to arr[1]).
2. If the current array element is larger than or equal to the previous array element,
go one step right:

if (arr[i] >= arr[i-1]) i++;

3. If the current array element is smaller than the previous array element, swap these
two elements and go one step backwards:

if (arr[i] < arr[i-1])
{ swap(arr[i], arr[i-1]);
  i--; }

4. Repeat steps 2 and 3 till 'i' reaches the end of the array (i.e. 'n-1').
5. When the end of the array is reached, stop; the array is sorted.
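Putting the fragments above together, a minimal complete gnome sort in C might look like this; the handling of i == 0 is folded into a single loop.

#include <stdio.h>

void gnome_sort(int arr[], int n)
{
    int i = 0;
    while (i < n) {
        if (i == 0 || arr[i] >= arr[i - 1]) {
            i++;                                   /* in order: step right          */
        } else {
            int t = arr[i]; arr[i] = arr[i - 1]; arr[i - 1] = t;
            i--;                                   /* out of order: swap, step left */
        }
    }
}

int main(void)
{
    int a[] = {34, 2, 10, -9};
    gnome_sort(a, 4);
    for (int i = 0; i < 4; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}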
TIME COMPLEXITY AND SPACE COMPLEXITY

Time: best case O(n), average and worst case O(n²)

Space: O(1)

21. SLEEP SORTING


First, we create a routine using _beginthread() and
WaitForMultipleObjects() (Windows threading API):
1. sleepSort(int arr[], int n)
2. HANDLE threads[n]
3. for (i = 0; i < n; i++)
4.     threads[i] = (HANDLE)_beginthread(&routine, 0, &arr[i]);
5. wait for all threads to finish:
   WaitForMultipleObjects(n, threads, TRUE, INFINITE)
6. return
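For a version that compiles outside Windows, here is a hedged sketch using POSIX threads instead of _beginthread(): each thread sleeps in proportion to its value and then prints it, so smaller values print first (compile with -lpthread; the 10 ms-per-unit scale is an assumption of this sketch).

#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

/* Each thread sleeps for value * 10 ms, then prints its value. */
static void *routine(void *arg)
{
    int v = *(int *)arg;
    usleep((useconds_t)v * 10000);
    printf("%d ", v);
    return NULL;
}

int main(void)
{
    int arr[] = {5, 1, 4, 2, 3};
    int n = sizeof arr / sizeof arr[0];
    pthread_t threads[n];
    for (int i = 0; i < n; i++)
        pthread_create(&threads[i], NULL, routine, &arr[i]);
    for (int i = 0; i < n; i++)                  /* wait for all threads, like WaitForMultipleObjects */
        pthread_join(threads[i], NULL);
    printf("\n");
    return 0;
}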
TIME COMPLEXITY AND SPACE COMPLEXITY

Time: - O (n log(n))
Space: - O(n)

22. STOOGE SORTING


The Stooge sort is a recursive sorting algorithm. It is defined as below (for
ascending order sorting).
Step 1: If value at index 0 is greater than value at last
index, swap them.
Step 2: Recursively,
a) Stooge sort the initial 2/3rd of the array.
b) Stooge sort the last 2/3rd of the array.
c) Stooge sort the initial 2/3rd again to confirm.
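These three steps translate directly into C; the recursion bounds l and h below are inclusive, and the function name is an assumption of this sketch.

#include <stdio.h>

void stooge_sort(int a[], int l, int h)
{
    if (a[l] > a[h]) {                       /* Step 1: swap first and last if out of order */
        int t = a[l]; a[l] = a[h]; a[h] = t;
    }
    if (h - l + 1 > 2) {
        int third = (h - l + 1) / 3;
        stooge_sort(a, l, h - third);        /* Step 2a: first 2/3       */
        stooge_sort(a, l + third, h);        /* Step 2b: last 2/3        */
        stooge_sort(a, l, h - third);        /* Step 2c: first 2/3 again */
    }
}

int main(void)
{
    int a[] = {2, 4, 5, 3, 1};
    stooge_sort(a, 0, 4);
    for (int i = 0; i < 5; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}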
TIME COMPLEXITY AND SPACE COMPLEXITY

The running time of stooge sort satisfies the recurrence

T(n) = 3T(2n/3) + Θ(1)

The solution of this recurrence is O(n^(log 3 / log 1.5)) = O(n^2.709),
hence it is slower than even bubble sort (O(n²)).

23.TAG SORTING

FIRST, WE CREATE A TAG ARRAY


 Every Person object is tagged to one element in the tag array; instead of swapping
the Person objects to sort by salary, we swap the tag[] integers.
 While printing the sorted array, we follow the tag array to print the Persons in
sorted order.
 This way we avoid swapping large Person objects.
 class Person
 {
 private int id;
 private float salary;
 private Object someBigObject = new Object();
 public Person(int id, float salary) { }
 public float getSalary() { }
 public String toString() { }
 }
 We use this class-oriented pseudo code to get our desired output.
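The same idea in C, with a struct in place of the Person class: we sort an index ("tag") array by salary and leave the records untouched. The field names and the insertion sort over the tags are assumptions of this sketch.

#include <stdio.h>

struct Person { int id; float salary; /* imagine a large payload here */ };

/* Fill tag[] with 0..n-1 and sort the tags by the persons' salaries. */
void tag_sort(const struct Person people[], int tag[], int n)
{
    for (int i = 0; i < n; i++) tag[i] = i;
    for (int i = 1; i < n; i++) {                  /* insertion sort on the tag integers */
        int t = tag[i], j = i - 1;
        while (j >= 0 && people[tag[j]].salary > people[t].salary) {
            tag[j + 1] = tag[j];
            j--;
        }
        tag[j + 1] = t;
    }
}

int main(void)
{
    struct Person people[] = {{1, 55000.0f}, {2, 42000.0f}, {3, 61000.0f}};
    int tag[3];
    tag_sort(people, tag, 3);
    for (int i = 0; i < 3; i++)                    /* print Persons in salary order via the tags */
        printf("id=%d salary=%.0f\n", people[tag[i]].id, people[tag[i]].salary);
    return 0;
}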

TIME COMPLEXITY AND SPACE COMPLEXITY

Time: -O (n log n)
Space:-O(n)
24.BRICK SORTING OR ODD/EVEN SORTING
function oddEvenSort(list) {
    function swap(list, i, j) {
        var temp = list[i];
        list[i] = list[j];
        list[j] = temp;
    }
    var sorted = false;
    while (!sorted) {
        sorted = true;
        for (var i = 1; i < list.length - 1; i += 2) {   // odd-indexed pairs
            if (list[i] > list[i + 1]) {
                swap(list, i, i + 1);
                sorted = false;
            }
        }
        for (var i = 0; i < list.length - 1; i += 2) {   // even-indexed pairs
            if (list[i] > list[i + 1]) {
                swap(list, i, i + 1);
                sorted = false;
            }
        }
    }
}

TIME COMPLEXITY AND SPACE COMPLEXITY

WORST-CASE=O(n^2)
BEST-CASE=O(n)
SPACE COMPLEXITY=O (1)
25.RECURSIVE INSERTION SORTING
Step 1 − Loop from i = 1 to n-1 and do:
Step 2.1 − Select the element at position i, array[i].
Step 2.2 − Insert the element into its position in the sorted sub-array array[0..i-1].
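Expressed recursively, each call sorts the first n-1 elements and then inserts array[n-1] into that sorted prefix; a minimal C sketch (the function name is an assumption):

#include <stdio.h>

void recursive_insertion_sort(int a[], int n)
{
    if (n <= 1) return;                     /* base case: 0 or 1 elements are sorted */
    recursive_insertion_sort(a, n - 1);     /* sort the first n-1 elements           */
    int last = a[n - 1], j = n - 2;         /* insert a[n-1] into the sorted prefix  */
    while (j >= 0 && a[j] > last) {
        a[j + 1] = a[j];
        j--;
    }
    a[j + 1] = last;
}

int main(void)
{
    int a[] = {12, 11, 13, 5, 6};
    recursive_insertion_sort(a, 5);
    for (int i = 0; i < 5; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}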
TIME COMPLEXITY AND SPACE COMPLEXITY

Time: best case Ω(n), average and worst case O(n²)

Space: O(1)
26.MERGE SORT FOR LINKED LIST
MergeSort(headRef)
1) If the head is NULL or there is only one element in the linked
list, then return.
2) Else divide the linked list into two halves:
FrontBackSplit(head, &a, &b); /* a and b are the two halves */
3) Sort the two halves a and b:
MergeSort(a);
MergeSort(b);
4) Merge the sorted halves a and b (using SortedMerge())
and update the head pointer using headRef:
*headRef = SortedMerge(a, b);
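A self-contained C sketch of this pseudocode; FrontBackSplit uses the slow/fast pointer technique and SortedMerge merges the halves recursively. The helper names follow the pseudocode, while the struct layout and the push() test helper are assumptions of this sketch.

#include <stdio.h>
#include <stdlib.h>

struct Node { int data; struct Node *next; };

/* Split the list into two halves using the slow/fast pointer technique. */
static void FrontBackSplit(struct Node *head, struct Node **a, struct Node **b)
{
    struct Node *slow = head, *fast = head->next;
    while (fast && fast->next) { slow = slow->next; fast = fast->next->next; }
    *a = head;
    *b = slow->next;
    slow->next = NULL;
}

/* Merge two sorted lists into one sorted list. */
static struct Node *SortedMerge(struct Node *a, struct Node *b)
{
    if (!a) return b;
    if (!b) return a;
    if (a->data <= b->data) { a->next = SortedMerge(a->next, b); return a; }
    b->next = SortedMerge(a, b->next);
    return b;
}

void MergeSort(struct Node **headRef)
{
    struct Node *head = *headRef, *a, *b;
    if (!head || !head->next) return;        /* 0 or 1 nodes: already sorted */
    FrontBackSplit(head, &a, &b);
    MergeSort(&a);
    MergeSort(&b);
    *headRef = SortedMerge(a, b);
}

static struct Node *push(struct Node *head, int data)
{
    struct Node *n = malloc(sizeof *n);
    n->data = data;
    n->next = head;
    return n;
}

int main(void)
{
    struct Node *head = NULL;
    int vals[] = {15, 10, 5, 20, 3, 2};
    for (int i = 0; i < 6; i++) head = push(head, vals[i]);
    MergeSort(&head);
    for (struct Node *p = head; p; p = p->next) printf("%d ", p->data);
    printf("\n");
    return 0;
}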

TIME COMPLEXITY AND SPACE COMPLEXITY

Complexity Best case Average Case Worst Case


Time Complexity O(n log n) O(n log n) O(n log n)
Space Complexity O(n)

THE END
