
# Divide and Conquer

 



- divide the problem into a number of subproblems
- conquer the subproblems (solve them)
- combine the subproblem solutions to get the solution to the original problem

Note: often the "conquer" step is done recursively
9/6/2010 2:54:52 AM http://www.nurul.com

Divide and Conquer

A common approach to solving a problem is to partition the problem into smaller parts, find solutions for the parts, and then combine the solutions for the parts into a solution for the whole. This approach, especially when used recursively, often yields efficient solutions to problems in which the sub-problems are smaller versions of the original problem. We illustrate the technique with some examples followed by an analysis of the resulting recurrence equations.


Recursive algorithms

- to solve a given problem, they call themselves recursively one or more times to deal with closely related subproblems
- usually the subproblems are smaller in size than the 'parent' problem
- divide-and-conquer algorithms are often recursive

Algorithm for General Divide and Conquer Sorting

    Begin Algorithm StartSort(L)
        If L has length greater than 1 then
        Begin
            Partition the list into two lists, high and low
            StartSort(high)
            StartSort(low)
            Combine high and low
        End
    End Algorithm
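The skeleton above can be sketched in Python. The slide leaves the partition and combine steps abstract; the split-in-half partition and the sorted-merge combine below are one possible instantiation (our choice), not the only one:

```python
import heapq

def start_sort(lst):
    """General divide-and-conquer sort: partition, recurse, combine."""
    if len(lst) <= 1:              # a list of length 0 or 1 is already sorted
        return lst
    mid = len(lst) // 2            # partition the list into two lists, low and high
    low, high = lst[:mid], lst[mid:]
    # sort each part recursively, then combine the sorted parts
    return combine(start_sort(low), start_sort(high))

def combine(a, b):
    # one possible combine step: merge two sorted lists (heapq.merge is stable)
    return list(heapq.merge(a, b))
```

With this choice of partition and combine the skeleton is exactly merge sort; other choices of the two steps yield the other sorts discussed later.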

Analyzing Divide-and-Conquer Algorithms

When an algorithm contains a recursive call to itself, its running time can often be described by a recurrence equation, which describes the overall running time on a problem of size n in terms of the running time on smaller inputs. For divide-and-conquer algorithms, we get recurrences that look like:

    T(n) = Θ(1)                      if n < c
    T(n) = aT(n/b) + D(n) + C(n)     otherwise

Example: Towers of Hanoi Again!!

Problem: Move n disks from peg A to B using peg C as a temporary peg.

Divide and Conquer approach: two subproblems of size n-1:

(1) Move the n-1 smallest disks from A to C
(*) Move the nth smallest disk from A to B (easy)
(2) Move the n-1 smallest disks from C to B

Moving the n-1 smallest disks is done by recursive application of the method.
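A minimal Python sketch of this decomposition (the peg names and the `moves` list used to record the solution are illustrative choices, not part of the slide):

```python
def hanoi(n, src, dst, tmp, moves):
    """Move n disks from peg src to peg dst, using tmp as a temporary peg."""
    if n == 0:
        return
    hanoi(n - 1, src, tmp, dst, moves)  # (1) move the n-1 smallest disks out of the way
    moves.append((src, dst))            # (*) move the nth smallest disk (easy)
    hanoi(n - 1, tmp, dst, src, moves)  # (2) move the n-1 smallest disks onto it

moves = []
hanoi(3, 'A', 'B', 'C', moves)
```

The recurrence T(n) = 2T(n-1) + 1 gives 2^n - 1 moves, so three disks take seven moves.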

Example: Merge Sort

- divide: divide the n-element sequence to be sorted into two n/2-element sequences
- conquer: sort the subproblems recursively using merge sort
- combine: merge the resulting two sorted n/2-element sequences

Example: Merge Sort Analysis

    T(n) = Θ(1)             if n < c
    T(n) = 2T(n/2) + Θ(n)   otherwise

- a = 2 (two subproblems)
- n/b = n/2 (each subproblem has size approx n/2)
- D(n) = Θ(1) (just compute the midpoint of the array)
- C(n) = Θ(n) (merging can be done by scanning the sorted subarrays)

Finding The Maximum and Minimum of N Elements

Consider the problem of finding both the maximum and the minimum elements of a set S containing n elements. For simplicity we shall assume that n is a power of 2. One obvious way to find the maximum and minimum elements would be to find each separately.

Algorithm: find the maximum element of S in n-1 comparisons between elements of S, then find the minimum of the remaining n-1 elements with n-2 comparisons, giving a total of 2n-3 comparisons to find the maximum and minimum, assuming n ≥ 2.

Algorithm (cont.)

The divide-and-conquer approach would divide the set S into two subsets S1 and S2, each with n/2 elements. The algorithm would then find the maximum and minimum elements of each of the two halves by recursive applications of the algorithm.

Pseudocode

    Procedure MAXMIN(S):
    /* This procedure returns the max and min elements of S. */
    begin
        case S = {a}    : return (a, a)
        case S = {a, b} : return (MAX(a, b), MIN(a, b))
        else            : divide S into two subsets S1 and S2, each with half the elements
                          (max1, min1) = MAXMIN(S1)
                          (max2, min2) = MAXMIN(S2)
                          return (MAX(max1, max2), MIN(min1, min2))
    end
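A direct Python transcription of MAXMIN on a list (indices lo..hi inclusive stand in for the set S, and splitting at the midpoint is our reading of "half the elements"):

```python
def maxmin(s, lo, hi):
    """Return (max, min) of s[lo..hi] by divide and conquer."""
    if lo == hi:                                    # case S = {a}
        return s[lo], s[lo]
    if hi == lo + 1:                                # case S = {a, b}: one comparison
        return (s[lo], s[hi]) if s[lo] > s[hi] else (s[hi], s[lo])
    mid = (lo + hi) // 2                            # divide S into S1 and S2
    max1, min1 = maxmin(s, lo, mid)
    max2, min2 = maxmin(s, mid + 1, hi)
    return max(max1, max2), min(min1, min2)         # combine with two comparisons
```

For n a power of 2 this recurrence uses 3n/2 - 2 comparisons, an improvement over the 2n - 3 of the obvious method.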

Merge Sort

- Apply divide-and-conquer to the sorting problem
- Problem: Given n elements, sort the elements into non-decreasing order
- Divide-and-Conquer:
  - If n = 1, terminate (every one-element list is already sorted)
  - If n > 1, partition the elements into two or more sub-collections, sort each, and combine them into a single sorted list
- How do we partition?

Partitioning - Choice 1

- First n-1 elements into set A, the last element into set B
- Sort A using this partitioning scheme recursively; B is already sorted
- Combine A and B using method Insert() (= insertion into a sorted array)
- Leads to a recursive version of InsertionSort()
- Number of comparisons: O(n²)
  - Best case: n-1
  - Worst case: sum of i-1 for i = 2 to n, which is n(n-1)/2
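Choice 1 can be sketched in Python: sort the first n-1 elements recursively, then insert the last one into place. This is a hedged sketch of the recursive InsertionSort() the slide mentions; the in-place shifting loop is our choice of how to implement Insert():

```python
def insertion_sort(lst, n=None):
    """Sort lst[0..n-1] in place: sort A = first n-1 elements, then insert B."""
    if n is None:
        n = len(lst)
    if n <= 1:                           # a one-element (or empty) list is sorted
        return
    insertion_sort(lst, n - 1)           # sort A recursively
    key = lst[n - 1]                     # B = the last element
    i = n - 2
    while i >= 0 and lst[i] > key:       # Insert(): shift larger elements right
        lst[i + 1] = lst[i]
        i -= 1
    lst[i + 1] = key
```

Note the comparison counts match the slide: a sorted input costs n-1 comparisons, a reverse-sorted one n(n-1)/2.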

Partitioning - Choice 2

- Put the element with the largest key in B, the remaining elements in A
- Sort A recursively
- To combine sorted A and B, append B to sorted A
  - Use Max() to find the largest element: gives a recursive SelectionSort()
  - Use a bubbling process to find and move the largest element to the right-most position: gives a recursive BubbleSort()
- All O(n²)

Partitioning - Choice 3

- Let's try to achieve balanced partitioning
- A gets n/k elements, B gets the rest
- Sort A and B recursively
- Combine sorted A and B using a process called merge, which combines two sorted lists into one
  - How? Just like merging sorted runs when doing an external sort

Merge Sort

Idea:
- Take the array you would like to sort and divide it in half to create 2 unsorted subarrays.
- Next, sort each of the 2 subarrays.
- Finally, merge the 2 sorted subarrays into 1 sorted array.

Efficiency: O(n log₂ n)

nurul.com 18 .Merge Sort 9/6/2010 2:54:53 AM http://www.

Merge Sort

Although the merge step produces a sorted array, we have overlooked a very important step. How did we sort the 2 halves before performing the merge step? We used merge sort!

Merge Sort

By continually calling the merge sort algorithm, we eventually get a subarray of size 1. Since an array with only 1 element is clearly sorted, we can back out and merge 2 arrays of size 1.


Merge Sort
The basic merging algorithm consists of:
- 2 input arrays (arrayA and arrayB)
- An output array (arrayC)
- 3 position holders (indexA, indexB, indexC), which are initially set to the beginning of their respective arrays


Merge Sort
The smaller of arrayA[indexA] and arrayB[indexB] is copied into arrayC[indexC] and the appropriate position holders are advanced. When either input list is exhausted, the remainder of the other list is copied into arrayC.
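The merging rule just described, sketched in Python on two already-sorted input lists (the names follow the slide's arrayA/arrayB/arrayC):

```python
def merge(array_a, array_b):
    """Merge two sorted lists into one sorted output list."""
    array_c = []
    index_a = index_b = 0
    # copy the smaller of the two front elements, advancing its position holder
    while index_a < len(array_a) and index_b < len(array_b):
        if array_a[index_a] < array_b[index_b]:
            array_c.append(array_a[index_a])
            index_a += 1
        else:
            array_c.append(array_b[index_b])
            index_b += 1
    # one input is exhausted: copy the remainder of the other
    array_c.extend(array_a[index_a:])
    array_c.extend(array_b[index_b:])
    return array_c
```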

Merge Sort

Example: merging arrayA = [1, 13, 24, 26] and arrayB = [2, 15, 27, 38] into arrayC. At each step we compare arrayA[indexA] with arrayB[indexB]; whichever value is smaller is placed into arrayC[indexC].

- 1 < 2, so we insert arrayA[indexA] into arrayC: arrayC = [1]
- 2 < 13, so we insert arrayB[indexB] into arrayC: arrayC = [1, 2]
- 13 < 15, so we insert arrayA[indexA] into arrayC: arrayC = [1, 2, 13]
- 15 < 24, so we insert arrayB[indexB] into arrayC: arrayC = [1, 2, 13, 15]
- 24 < 27, so we insert arrayA[indexA] into arrayC: arrayC = [1, 2, 13, 15, 24]
- 26 < 27, so we insert arrayA[indexA] into arrayC: arrayC = [1, 2, 13, 15, 24, 26]
- Since we have exhausted one of the arrays, arrayA, we simply copy the remaining items from the other array, arrayB, into arrayC: arrayC = [1, 2, 13, 15, 24, 26, 27, 38]

Merge Sort Pseudocode

    mergesort(list, first, last) {
        if( first < last )
            mid = (first + last)/2
            // Sort the 1st half of the list
            mergesort(list, first, mid)
            // Sort the 2nd half of the list
            mergesort(list, mid+1, last)
            // Merge the 2 sorted halves
            merge(list, first, mid, last)
        end if
    }

Merge Sort Pseudocode (cont)

    merge(list, first, mid, last) {
        // Initialize the first and last indices of our subarrays
        firstA = first
        lastA = mid
        firstB = mid+1
        lastB = last
        index = firstA   // Index into our temp array

Merge Sort Pseudocode (cont)

        // Start the merging
        loop( firstA <= lastA AND firstB <= lastB )
            if( list[firstA] < list[firstB] )
                tempArray[index] = list[firstA]
                firstA = firstA + 1
            else
                tempArray[index] = list[firstB]
                firstB = firstB + 1
            end if
            index = index + 1
        end loop

Merge Sort Pseudocode (cont)

        // At this point, one of our subarrays is empty
        // Now go through and copy any remaining items
        // from the non-empty array into our temp array
        loop (firstA <= lastA)
            tempArray[index] = list[firstA]
            firstA = firstA + 1
            index = index + 1
        end loop
        loop (firstB <= lastB)
            tempArray[index] = list[firstB]
            firstB = firstB + 1
            index = index + 1
        end loop

Merge Sort Pseudocode (cont)

        // Finally, we copy our temp array back into
        // our original array
        index = first
        loop (index <= last)
            list[index] = tempArray[index]
            index = index + 1
        end loop
    }
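The pseudocode fragments above translate to Python as follows. This is a sketch: `temp` plays the role of tempArray, and the final slice assignment replaces the explicit copy-back loop while having the same effect:

```python
def merge(lst, first, mid, last):
    """Merge the sorted halves lst[first..mid] and lst[mid+1..last]."""
    temp = []
    first_a, first_b = first, mid + 1
    while first_a <= mid and first_b <= last:
        if lst[first_a] < lst[first_b]:
            temp.append(lst[first_a])
            first_a += 1
        else:
            temp.append(lst[first_b])
            first_b += 1
    temp.extend(lst[first_a:mid + 1])    # copy any remaining items
    temp.extend(lst[first_b:last + 1])
    lst[first:last + 1] = temp           # copy temp back into the original array

def mergesort(lst, first, last):
    if first < last:
        mid = (first + last) // 2
        mergesort(lst, first, mid)       # sort the 1st half
        mergesort(lst, mid + 1, last)    # sort the 2nd half
        merge(lst, first, mid, last)     # merge the 2 sorted halves
```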

Evaluation

Recurrence equation (assume n is a power of 2, n = 2^k):

    T(n) = c1                if n = 1
    T(n) = 2T(n/2) + c2·n    if n > 1

Solution By Substitution:

    T(n)   = 2T(n/2) + c2·n
    T(n/2) = 2T(n/4) + c2·n/2
    T(n)   = 4T(n/4) + 2·c2·n
    T(n)   = 8T(n/8) + 3·c2·n
    ...
    T(n)   = 2^i·T(n/2^i) + i·c2·n

Solution (cont.)

Assuming n = 2^k, the expansion halts when we get T(1) on the right side; this happens when i = k:

    T(n) = 2^k·T(1) + k·c2·n

Since 2^k = n, we know k = log n. Since T(1) = c1, we get T(n) = c1·n + c2·n·log n; thus an upper bound for TmergeSort(n) is O(n log n).

Quick Sort

- Most efficient (general) internal sorting routine
- In essence: sort array A by picking some key value v in the array as a pivot element
- Three segments: left, pivot, right
- All elements in left are smaller than the pivot; all elements in right are larger than or equal to the pivot
- Sort the elements in left and right; no merge is required to combine
- Hope that the pivot is near the median, so that left and right are the same size

Quick Sort  In the bubble sort.  Quick sort is more efficient than bubble sort because a typical exchange involves elements that are far apart.  This means that many exchanges may be needed to move an element to its correct position. so fewer exchanges are required to correctly position an element. 9/6/2010 2:54:55 AM http://www.com 41 . consecutive items are compared and possibly exchanged on each pass through the list.nurul.

Quick Sort

Each iteration of the quick sort selects an element, known as the pivot, and divides the list into 3 groups:
- Elements whose keys are less than (or equal to) the pivot's key
- The pivot element
- Elements whose keys are greater than (or equal to) the pivot's key

Quick Sort

The sorting then continues by quick sorting the left partition followed by quick sorting the right partition. The basic algorithm is as follows:

Quick Sort

- Partitioning Step: Take an element in the unsorted array and determine its final location in the sorted array. This occurs when all values to the left of the element in the array are less than (or equal to) the element, and all values to the right of the element are greater than (or equal to) the element. We now have 1 element in its proper location and two unsorted subarrays.
- Recursive Step: Perform step 1 on each unsorted subarray.

Quick Sort

Each time step 1 is performed on a subarray, another element is placed in its final location of the sorted array, and two unsorted subarrays are created. When a subarray consists of one element, that subarray is sorted; therefore that element is in its final location.

Quick Sort

There are several partitioning strategies used in practice (i.e. several "versions" of quick sort), but the one we are about to describe is known to work well. For simplicity we will choose the last element to be the pivot element. We could also choose a different pivot element and swap it with the last element in the array.

Quick Sort  Below is the array we would like to sort: 1 4 8 9 0 11 5 10 7 6 9/6/2010 2:54:56 AM http://www.com 47 .nurul.

Quick Sort  The index left starts at the first element and right starts at the next-to-last element. 9/6/2010 2:54:56 AM http://www.com 48 .nurul. 1 left 4 8 9 0 11 5 10 7 right 6  We want to move all the elements smaller than the pivot to the left part of the array and all the elements larger than the pivot to the right part.

com 49 . 1 4 8 left 9 0 11 5 10 7 right 6 9/6/2010 2:54:56 AM http://www. skipping over elements that are smaller than the pivot.nurul.Quick Sort  We move left to the right.

9/6/2010 2:54:56 AM http://www. left is on an element greater than (or equal to) the pivot and right is on an element smaller than (or equal to) the pivot.Quick Sort  We then move right to the left.nurul. skipping over elements that are greater than the pivot. 1 4 8 left 9 0 11 5 right 10 7 6  When left and right have stopped.com 50 .

1 4 8 left 9 0 11 5 right 10 7 6 1 4 5 left 9 0 11 8 right 10 7 6 .Quick Sort  If left is to the left of right (or if left = right). those elements are swapped.

nurul.  We then repeat the process until left and right cross. 9/6/2010 2:54:57 AM http://www.Quick Sort  The effect is to push a large element to the right and a small element to the left.com 52 .

Quick Sort 1 4 5 left 9 0 11 8 right 9 0 11 8 10 7 6 10 7 6 1 4 5 left right 1 4 5 0 9 11 8 10 7 6 left right .

Quick Sort 1 4 5 0 9 11 8 10 7 6 left right 1 4 5 0 9 11 8 10 7 6 right left 9/6/2010 2:54:57 AM http://www.nurul.com 54 .

Quick Sort 1 4 5 0 9 11 8 10 7 6 right left  At this point. 1 4 5 0 6 11 8 10 7 9 right left . left and right have crossed so no swap is performed.  The final part of the partitioning is to swap the pivot element with left.

1 4 5 0 6 11 8 10 7 9 right left .Quick Sort  Note that all elements to the left of the pivot are less than (or equal to) the pivot and all elements to the right of the pivot are greater than (or equal to) the pivot.  Hence. the pivot element has been placed in its final sorted position.

nurul.Quick Sort  We now repeat the process using the subarrays to the left and right of the pivot. 1 4 5 0 6 11 8 10 7 9 1 4 5 0 11 8 10 7 9 57 9/6/2010 2:54:58 AM http://www.com .

Quick Sort Pseudocode

    quicksort(list, leftMostIndex, rightMostIndex) {
        if( leftMostIndex < rightMostIndex )
            pivot = list[rightMostIndex]
            left = leftMostIndex
            right = rightMostIndex - 1
            loop (left <= right)
                // Find key on left that belongs on right
                // (make sure not to run off the array)
                loop (list[left] < pivot)
                    left = left + 1
                end loop
                // Find key on right that belongs on left
                loop (right >= left AND list[right] > pivot)
                    right = right - 1
                end loop
                // Swap out-of-order elements
                if (left <= right)
                    // Why swap if left = right? Must account for the
                    // special case of list[left] = list[right] = pivot
                    swap(list, left, right)
                    left = left + 1
                    right = right - 1
                end if
            end loop
            // Move the pivot element to its correct location
            swap(list, left, rightMostIndex)
            // Continue splitting list and sorting
            quicksort(list, leftMostIndex, right)
            quicksort(list, left+1, rightMostIndex)
        end if
    }
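The pseudocode translates to Python as follows. This is a sketch: explicit bounds checks are added to the inner scans so they cannot run off the array, which the slide only notes in passing:

```python
def quicksort(lst, lo, hi):
    """Sort lst[lo..hi] in place using the last element as the pivot."""
    if lo >= hi:
        return
    pivot = lst[hi]
    left, right = lo, hi - 1
    while left <= right:
        while left <= right and lst[left] < pivot:    # key on left that belongs on right
            left += 1
        while right >= left and lst[right] > pivot:   # key on right that belongs on left
            right -= 1
        if left <= right:                             # swap out-of-order elements
            lst[left], lst[right] = lst[right], lst[left]
            left += 1
            right -= 1
    lst[left], lst[hi] = lst[hi], lst[left]           # move pivot to its final location
    quicksort(lst, lo, right)                         # sort the left partition
    quicksort(lst, left + 1, hi)                      # sort the right partition
```

Running this on the walkthrough's array, `[1, 4, 8, 9, 0, 11, 5, 10, 7, 6]`, produces the same first partition shown in the trace before recursing on the two subarrays.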

nurul. when the subarrays get small.Quick Sort  A couple of notes about quick sort:  There are more optimal ways to choose the pivot value (such as the median-of-three method). 9/6/2010 2:54:58 AM http://www.  Also.com 60 . it becomes more efficient to use the insertion sort as opposed to continued use of quick sort.

Choosing The Pivot

- The pivot can be any value in the domain; it does not need to be in S
  - it can be the average of selected elements in S
  - it can be a random selection, but rand() is expensive and adds to the running time of the algorithm
- The pivot is usually taken as the median of at least 3 elements, or as the middle element of S

Choosing The Pivot (cont.)

- The median of 3 is less likely to be nearly the smallest or largest element in the list, so the median is more likely to partition the list into 2 nearly equal halves
  - median = (n/2)th largest element
  - or pick 3 elements randomly and choose their median; this reduces running time by about 5%
- The center element is in the location that divides the keys into halves
- The choice of pivot determines performance time

Analysis of Quicksort

- Based on the choice of pivot
- Many versions of the analysis have been done, but all lead to the same results
- Running time = time to do two recursive calls + linear time in the partition

Worst Case Analysis  occurs when pivot is smallest or largest element (one sublist has n .1 + C(n .1 entries and the other is empty) also if list is pre-sorted or is in reverse order: all elements go into S1 or S2   since C(0) = 0. we have C(n) = n .1) we can solve this recurrence relation by starting at the bottom:  .

Worst Case Analysis (cont.)

    C(1) = 0
    C(2) = 1 + C(1) = 1
    C(3) = 2 + C(2) = 2 + 1
    C(4) = 3 + C(3) = 3 + 2 + 1
    ...
    C(n) = (n-1) + (n-2) + ... + 2 + 1   (sum of integers from 1 to n-1)
         = (n-1)n/2
         = 1/2·n² - 1/2·n
         = O(n²)

Worst Case Analysis (cont.)

- Selection sort makes approximately 1/2·n² - 1/2·n key comparisons in its worst case, so in its worst case quicksort is as bad as the worst case of selection sort, or O(n²).
- The number of swaps made by quicksort in its worst case is about 3 times as many as the worst case of insertion sort.

Average Case Analysis

- Average behavior of quicksort, when applied to lists in random order
- Assume all possible orderings of the list are equally likely; the pivot is equally likely to be any element, hence each has probability 1/n
- The number of key comparisons made in the partition is n-1, as in the worst case

Average Case Analysis (cont.)

Let C(n) be the average number of comparisons done by quicksort on a list of length n, and C(n,p) the average number of comparisons on a list of length n where the pivot for the first partition is p. Recursion makes C(p-1) comparisons for the sublist of entries less than the pivot and C(n-p) comparisons for the sublist of entries greater than the pivot. Thus, for n > 2:

    C(n,p) = n - 1 + C(p-1) + C(n-p)

Average Case Analysis (cont.)

To find C(n), take the average of these expressions over the random choice of p, adding them from p = 1 to p = n and dividing by n. This yields the recurrence relation for the number of comparisons done in the average case:

    C(n) = n - 1 + (2/n)(C(0) + C(1) + ... + C(n-1))

Average Case Analysis (cont.)

To solve: note that if sorting a list of length n-1, we would have the same expression with n replaced by n-1, as long as n > 2:

    C(n-1) = n - 2 + (2/(n-1))(C(0) + C(1) + ... + C(n-2))

Multiply the first expression by n, the second by n-1, and subtract:

    nC(n) - (n-1)C(n-1) = n(n-1) - (n-1)(n-2) + 2C(n-1)

Average Case Analysis (cont.)

Rearrange:

    (C(n)-2)/(n+1) = (C(n-1)-2)/n + 2/(n+1)

Now let S(n) be defined as (C(n)-2)/(n+1). Then:

    S(n) = S(n-1) + 2/(n+1)

Average Case Analysis (cont.)

Solve by working down from n, each time replacing the expression by a smaller case until we reach n = 1:

    S(n)   = 2/(n+1) + S(n-1)
    S(n-1) = 2/n     + S(n-2)
    S(n-2) = 2/(n-1) + S(n-3)
    ...

    S(n) = 2/(n+1) + 2/n + 2/(n-1) + ... + 2/2
         = 2(1/2 + ... + 1/n + 1/(n+1))
         = 2·H(n+1) - 2

where H(n+1) = 1 + 1/2 + ... + 1/(n+1) is the (n+1)st harmonic number.

Average Case Analysis (cont.)

Now we can write:

    (C(n)-2)/(n+1) = 2·H(n+1) - 2 = 2 ln n + O(1)
    C(n) = (n+1)·2 ln n + (n+1)·O(1) = 2n ln n + O(n)

In its average case, quicksort performs C(n) = 2n ln n + O(n) comparisons of keys in sorting a list of n entries. Since ln n = (ln 2)(lg n) and ln 2 ≈ 0.693, this is C(n) ≈ 1.386·n lg n + O(n) = O(n log n).

Best Case Analysis

The pivot is the middle element, so the two subfiles are each half the size of the original:

    T(n) = 2T(n/2) + cn
    T(n)/n = T(n/2)/(n/2) + c      (divide the equation by n)

Solve using telescoping sums:

    T(n/2)/(n/2) = T(n/4)/(n/4) + c
    T(n/4)/(n/4) = T(n/8)/(n/8) + c
    T(n/8)/(n/8) = T(n/16)/(n/16) + c
    ...

Best Case Analysis (cont.)

    T(2)/2 = T(1)/1 + c

Add up all the equations and cancel the matching terms, leaving a single equation. Since there are log n equations:

    T(n)/n = T(1)/1 + c·log n

This yields T(n) = c·n·log n + n = O(n log n). This analysis gives an overestimate of the time complexity, but is acceptable to get O( ) and not Θ( ).