
Spring 2019

Data Structures II
#2
Heap Sort
What is a Heap?
The (binary) heap data structure is an array object that we can view as a nearly complete binary tree. The tree is completely filled on all levels except possibly the lowest, which is filled from the left up to a point.

Relationships between the indices of parents and children (1-based indexing):

- Leftchild(i) = 2*i
- Rightchild(i) = 2*i + 1
- Parent(i) = i/2 (integer division)
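These index formulas can be expressed as a minimal sketch (function names are my own; the formulas assume the 1-based array layout of the notes):

```python
def parent(i):
    return i // 2        # integer division, as in the notes

def left_child(i):
    return 2 * i

def right_child(i):
    return 2 * i + 1

# Node 5 has parent 2 and children 10 and 11.
```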

There are two kinds of binary heaps: max-heaps and min-heaps. In both kinds, the values in the nodes satisfy a heap property.

- Max-heap: for every node i other than the root, A[PARENT(i)] ≥ A[i]. The root has the largest key.
- Min-heap: for every node i other than the root, A[PARENT(i)] ≤ A[i]. The root has the smallest key.

Max Heap Example / Min Heap Example

Build Max Heap algorithm

Put everything in the array and then heapify/fix the trees in a bottom-up way, starting from the last parent.

Max Heapify Algorithm

Complexity of MAX-HEAPIFY

- Inserting one new element into a heap with n-1 nodes requires no more comparisons than the heap's height.
- So the MAX-HEAPIFY complexity is O(log n).
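The MAX-HEAPIFY pseudocode itself is not reproduced in these notes; a minimal Python sketch might look like the following (it uses 0-based list indices, so the children sit at 2*i+1 and 2*i+2 rather than the 1-based formulas above):

```python
def max_heapify(A, i, heap_size):
    """Sift A[i] down until the subtree rooted at i satisfies the max-heap property.

    Assumes both subtrees of i are already max-heaps (0-based indexing)."""
    largest = i
    l, r = 2 * i + 1, 2 * i + 2
    if l < heap_size and A[l] > A[largest]:
        largest = l
    if r < heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)  # recurse into the affected subtree
```

Each recursive step descends one level, so the work is bounded by the heap's height, matching the O(log n) bound above.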

Example: Calling MAX-HEAPIFY(A, 2)

Build Max Heap algorithm
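The BUILD-MAX-HEAP figure is missing from these notes; a self-contained Python sketch of the bottom-up construction (0-based indexing, so the last parent is at index n//2 - 1) might look like:

```python
def max_heapify(A, i, n):
    # Sift A[i] down within A[0:n] (0-based indexing).
    largest, l, r = i, 2 * i + 1, 2 * i + 2
    if l < n and A[l] > A[largest]:
        largest = l
    if r < n and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, n)

def build_max_heap(A):
    # Fix the trees bottom-up, starting from the last parent.
    for i in range(len(A) // 2 - 1, -1, -1):
        max_heapify(A, i, len(A))
```

Leaves are trivially max-heaps already, which is why the loop can skip the second half of the array.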

Complexity of BUILD-MAX-HEAP

We can compute a simple upper bound on the running time of BUILD-MAX-HEAP as follows:

- Each call to MAX-HEAPIFY costs O(lg n) time.
- BUILD-MAX-HEAP makes O(n) such calls.
- Thus, the running time is O(n lg n).

This upper bound, though correct, is not asymptotically tight. We can derive a tighter bound by observing that the time for MAX-HEAPIFY to run at a node varies with the height of the node in the tree; the tighter bound is O(n).

Example for Build Max Heap

The heapsort algorithm

It starts by using BUILD-MAX-HEAP to build a max-heap on the input array A[1..n], where n = A.length.

Since the maximum element of the array is stored at the root A[1], we can put it into its correct final position by exchanging it with A[n].

If we now discard node n from the heap (which we can do by simply decrementing A.heap-size), we observe that the children of the root remain max-heaps, but the new root element might violate the max-heap property.

All we need to do to restore the max-heap property, however, is call MAX-HEAPIFY(A, 1), which leaves a max-heap in A[1..n-1].

The heapsort algorithm then repeats this process for the max-heap of size n-1 down to a heap of size 2.
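The steps above can be sketched as a self-contained Python function (0-based indexing, with the heapify helper nested so the sketch stands alone):

```python
def heapsort(A):
    def max_heapify(i, n):
        # Sift A[i] down within the heap A[0:n].
        largest, l, r = i, 2 * i + 1, 2 * i + 2
        if l < n and A[l] > A[largest]:
            largest = l
        if r < n and A[r] > A[largest]:
            largest = r
        if largest != i:
            A[i], A[largest] = A[largest], A[i]
            max_heapify(largest, n)

    # Build the max-heap bottom-up, starting from the last parent.
    for i in range(len(A) // 2 - 1, -1, -1):
        max_heapify(i, len(A))

    # Repeatedly move the root (the maximum) into its final position.
    for end in range(len(A) - 1, 0, -1):
        A[0], A[end] = A[end], A[0]  # exchange root with last heap element
        max_heapify(0, end)          # restore the max-heap on A[0:end]
```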

Complexity of HEAPSORT

The HEAPSORT procedure takes time O(n lg n), since the call to BUILD-MAX-HEAP takes time O(n) and each of the n-1 calls to MAX-HEAPIFY takes time O(lg n),
so the total running time = O(n) + O(n lg n) = O(n lg n).
Heap Sort Example

Merge Sort
A sorting algorithm based on divide and conquer. Because we are dealing with subproblems, we state the subproblem as sorting a subarray A[p..r]. Initially p = 1 and r = n, but these values change as we recurse through subproblems to sort A[p..r].

Merge algorithm that merges two sorted lists into one sorted list.
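The merge procedure, together with the recursive merge sort driver, can be sketched in Python as follows (function names are my own; this version returns a new list rather than sorting in place):

```python
def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])   # at most one of these two
    out.extend(right[j:])  # extends is non-empty
    return out

def merge_sort(A):
    if len(A) <= 1:          # base case: a list of length 0 or 1 is sorted
        return A
    mid = len(A) // 2        # "divide": compute the midpoint of the range
    return merge(merge_sort(A[:mid]), merge_sort(A[mid:]))
```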

Example for merge sort

Analyzing merge sort running time
For simplicity, assume n is a power of 2; the base case occurs when n = 1.
When n ≥ 2, the time for the merge sort steps is:

- Divide: just compute q as the average of p and r → O(1)
- Conquer: recursively solve 2 subproblems, each of size n/2
- Combine: merging an n-element subarray takes O(n)

So we write the merge sort recurrence as follows:

T(n) = 2T(n/2) + n
T(1) = 1

Solving this recurrence shows that the merge sort algorithm is O(n log n).

Insertion Sort

Example:

Sort the array A = {5, 2, 4, 6, 1, 3}

The operation of INSERTION-SORT on the array A = {5, 2, 4, 6, 1, 3}:

- Array indices appear above the rectangles, and values stored in the array positions appear within the rectangles.
- (a)-(e): the iterations of the for loop of lines 1-8.
- In each iteration, the black rectangle holds the key taken from A[j], which is compared with the values in the shaded rectangles to its left in the test of line 5.
- Shaded arrows show array values moved one position to the right in line 6, and black arrows indicate where the key moves to in line 8.
- (f): The final sorted array.

Complexity of insertion sort is O(n²).
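The INSERTION-SORT pseudocode the figure refers to is not reproduced here; a minimal Python sketch of the same procedure (0-based indexing) might look like:

```python
def insertion_sort(A):
    # Grow a sorted prefix A[0:j]; insert each new key by shifting right.
    for j in range(1, len(A)):
        key = A[j]          # the value being inserted this iteration
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]  # move the larger value one position right
            i -= 1
        A[i + 1] = key       # drop the key into its slot
    return A
```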

Selection Sort
Selection-Sort(A)
  for i = 1 to A.length - 1
    // Find the minimum value among A[i], A[i+1], ..., A[n]
    min_j ← i
    for j = i + 1 to n
      if A[j] < A[min_j]
        min_j ← j
    exchange A[i] ↔ A[min_j]

Complexity of selection sort: O(n²)

Bubble Sort

Complexity of bubble sort: O(n²)
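The bubble sort pseudocode is missing from these notes; a minimal Python sketch of the standard procedure (repeatedly swapping out-of-order neighbors so the largest remaining value "bubbles" to the end) might look like:

```python
def bubble_sort(A):
    n = len(A)
    for i in range(n - 1):
        # After pass i, the last i elements are in their final positions.
        for j in range(n - 1 - i):
            if A[j] > A[j + 1]:
                A[j], A[j + 1] = A[j + 1], A[j]  # swap out-of-order neighbors
    return A
```

The two nested passes over the array give the O(n²) bound stated above.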

Quick Sort

Description of Quicksort

Quicksort, like merge sort, applies the divide-and-conquer paradigm.

- Divide: Partition (rearrange) the array A[p..r] into two (possibly empty) subarrays A[p..q-1] and A[q+1..r] such that each element of A[p..q-1] is less than or equal to A[q], which is, in turn, less than or equal to each element of A[q+1..r]. Compute the index q as part of this partitioning procedure.

- Conquer: Sort the two subarrays A[p..q-1] and A[q+1..r] by recursive calls to quicksort.

- Combine: Because the subarrays are already sorted, no work is needed to combine them: the entire array A[p..r] is now sorted.
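The three steps above can be sketched in Python using the Lomuto-style partition, which picks A[r] as the pivot (0-based indexing; function names are my own):

```python
def partition(A, p, r):
    """Rearrange A[p..r] around the pivot A[r]; return the pivot's final index q."""
    pivot = A[r]
    i = p - 1  # boundary of the "<= pivot" region
    for j in range(p, r):
        if A[j] <= pivot:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]  # place the pivot between the two regions
    return i + 1

def quicksort(A, p=0, r=None):
    if r is None:
        r = len(A) - 1
    if p < r:
        q = partition(A, p, r)  # A[p..q-1] <= A[q] <= A[q+1..r]
        quicksort(A, p, q - 1)  # conquer: sort both sides recursively
        quicksort(A, q + 1, r)
    return A                    # combine: nothing left to do
```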

Example: Pivot element is 5

Complexity of quick sort
- Average-case running time: occurs when the array is divided into two nearly equal parts.
  So we write the quicksort recurrence as follows:
  T(n) = 2T(n/2) + n
  T(1) = 1
  Solving this recurrence shows that the quicksort algorithm is O(n log n).

- Worst-case running time: the worst-case behavior for quicksort occurs when the partitioning routine produces one subproblem with n-1 elements and one with 0 elements.
  So we write the quicksort recurrence as follows:
  T(n) = T(n-1) + T(0) + n
  T(n) = T(n-1) + n
  Solving this recurrence shows that the quicksort algorithm is O(n²).
