# Divide-and-Conquer

5. Divide-and-Conquer

Divide-and-conquer.
- Break up problem into several parts.
- Solve each part recursively.
- Combine solutions to sub-problems into overall solution.

*Divide et impera. Veni, vidi, vici.* - Julius Caesar

Most common usage.
- Break up problem of size n into two equal parts of size n/2.
- Solve two parts recursively.
- Combine two solutions into overall solution in linear time.

Consequence.
- Brute force: n².
- Divide-and-conquer: n log n.

Algorithm Design by Éva Tardos and Jon Kleinberg • Slides by Kevin Wayne • Copyright © 2004 Addison Wesley

## 5.1 Mergesort

Sorting. Given n elements, rearrange in ascending order.

Obvious sorting applications.
- List files in a directory.
- Organize an MP3 library.
- List names in a phone book.
- Display Google PageRank results.

Non-obvious sorting applications.
- Data compression.
- Computer graphics.
- Interval scheduling.
- Computational biology.
- Minimum spanning tree.
- Supply chain management.
- Simulate a system of particles.
- Book recommendations on Amazon.
- Load balancing on a parallel computer.
- ...

Problems become easier once sorted.
- Find the median.
- Find the closest pair.
- Binary search in a database.
- Identify statistical outliers.
- Find duplicates in a mailing list.


## Mergesort

Mergesort. [Jon von Neumann, 1945]
- Divide array into two halves.
- Recursively sort each half.
- Merge two halves to make sorted whole.

Example: A L G O R I T H M S → divide into A L G O R and I T H M S → sort each half to A G L O R and H I M S T → merge to A G H I L M O R S T.

## Merging

Merging. Combine two pre-sorted lists into a sorted whole.

How to merge efficiently? Linear number of comparisons. Use temporary array.

Challenge for the bored. In-place merge. [Kronrod, 1969] Merge using only a constant amount of extra storage.

## A Useful Recurrence Relation

Def. T(n) = number of comparisons to mergesort an input of size n.

Mergesort recurrence.

    T(n) ≤ 0                                  if n = 1
    T(n) ≤ T(⌈n/2⌉) + T(⌊n/2⌋) + n            otherwise
           (left half) (right half) (merging)

Solution. T(n) = O(n log₂ n).

Assorted proofs. We describe several ways to prove this recurrence. Initially we assume n is a power of 2 and replace ≤ with =.

## Proof by Recursion Tree

Claim. If T(n) satisfies T(1) = 0 and T(n) = 2T(n/2) + n (n a power of 2), then T(n) = n log₂ n.

Pf. The recursion tree has log₂ n levels. Level k has 2ᵏ subproblems of size n/2ᵏ; the merging cost at that level is 2ᵏ · (n/2ᵏ) = n. Summing over the log₂ n levels gives n log₂ n. ▪
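The divide / sort / merge steps can be sketched in a few lines of Python (a minimal version using a temporary array, not the constant-extra-storage variant mentioned above; function names are mine, not from the slides):

```python
def merge(left, right):
    """Merge two pre-sorted lists into a sorted whole with a linear number of comparisons."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])   # at most one of these tails is non-empty
    result.extend(right[j:])
    return result

def mergesort(a):
    """Divide array into two halves, sort each recursively, merge."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    return merge(mergesort(a[:mid]), mergesort(a[mid:]))
```

For example, `mergesort(list("ALGORITHMS"))` produces the letters in sorted order, matching the divide/sort/merge picture on the slide.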

## Proof by Telescoping

Claim. If T(n) satisfies T(1) = 0 and T(n) = 2T(n/2) + n (assumes n is a power of 2), then T(n) = n log₂ n.

Pf. For n > 1:

    T(n)/n = 2T(n/2)/n + 1
           = T(n/2)/(n/2) + 1
           = T(n/4)/(n/4) + 1 + 1
           = ...
           = T(n/n)/(n/n) + 1 + ... + 1   (log₂ n ones)
           = log₂ n.  ▪

## Proof by Induction

Claim. If T(n) satisfies the same recurrence, then T(n) = n log₂ n.

Pf. (by induction on n)
- Base case: n = 1.
- Inductive hypothesis: T(n) = n log₂ n.
- Goal: show that T(2n) = 2n log₂ (2n).

    T(2n) = 2T(n) + 2n
          = 2n log₂ n + 2n
          = 2n (log₂ (2n) - 1) + 2n
          = 2n log₂ (2n).  ▪

## Analysis of Mergesort Recurrence

Claim. If T(n) satisfies T(1) = 0 and T(n) ≤ T(⌈n/2⌉) + T(⌊n/2⌋) + n, then T(n) ≤ n ⌈lg n⌉.

Pf. (by induction on n)
- Base case: n = 1.
- Define n₁ = ⌊n/2⌋, n₂ = ⌈n/2⌉.
- Induction step: assume true for 1, 2, ..., n-1.

    T(n) ≤ T(n₁) + T(n₂) + n
         ≤ n₁ ⌈lg n₁⌉ + n₂ ⌈lg n₂⌉ + n
         ≤ n₁ ⌈lg n₂⌉ + n₂ ⌈lg n₂⌉ + n
         = n ⌈lg n₂⌉ + n
         ≤ n (⌈lg n⌉ - 1) + n
         = n ⌈lg n⌉,

using n₂ = ⌈n/2⌉ ≤ 2^⌈lg n⌉ / 2, hence ⌈lg n₂⌉ ≤ ⌈lg n⌉ - 1.  ▪
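The bound T(n) ≤ n ⌈lg n⌉ can be sanity-checked numerically by evaluating the recurrence directly (a quick check I added, not part of the slides; the `≤` of the recurrence is taken as `=` to get the worst case):

```python
from math import ceil, log2
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Worst-case mergesort comparison recurrence: T(n) = T(ceil(n/2)) + T(floor(n/2)) + n."""
    if n == 1:
        return 0
    return T((n + 1) // 2) + T(n // 2) + n

# Equality T(n) = n * log2(n) holds when n is a power of 2;
# the ceiling bound holds for every n.
assert T(1024) == 1024 * 10
assert all(T(n) <= n * ceil(log2(n)) for n in range(2, 1000))
```

This does not replace the induction proof, but it catches transcription errors in the bound quickly.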

## 5.3 Counting Inversions

Music site tries to match your song preferences with others.
- You rank n songs.
- Music site consults database to find people with similar tastes.

Similarity metric: number of inversions between two rankings.
- My rank: 1, 2, ..., n.
- Your rank: a₁, a₂, ..., aₙ.
- Songs i and j inverted if i < j, but aᵢ > aⱼ.

| Songs | A | B | C | D | E |
|-------|---|---|---|---|---|
| Me    | 1 | 2 | 3 | 4 | 5 |
| You   | 1 | 3 | 4 | 2 | 5 |

Inversions: 3-2, 4-2.

Brute force: check all Θ(n²) pairs i and j.

Applications.
- Voting theory.
- Collaborative filtering.
- Measuring the "sortedness" of an array.
- Sensitivity analysis of Google's ranking function.
- Rank aggregation for meta-searching on the Web.
- Nonparametric statistics (e.g., Kendall's tau distance).

## Counting Inversions: Divide-and-Conquer

Divide-and-conquer.
- Divide: separate list into two pieces. O(1)
- Conquer: recursively count inversions in each half. 2T(n/2)
- Combine: count inversions where aᵢ and aⱼ are in different halves; return sum of the three quantities.

Example: 1 5 4 8 10 2 | 6 9 12 11 3 7
- 5 blue-blue inversions: 5-4, 5-2, 4-2, 8-2, 10-2.
- 8 green-green inversions: 6-3, 9-3, 9-7, 12-3, 12-7, 12-11, 11-3, 11-7.
- 9 blue-green inversions: 5-3, 4-3, 8-6, 8-3, 8-7, 10-6, 10-9, 10-3, 10-7.
- Total = 5 + 8 + 9 = 22.

## Counting Inversions: Combine

Combine: count blue-green inversions.
- Assume each half is sorted.
- Count inversions where aᵢ and aⱼ are in different halves.
- Merge two sorted halves into sorted whole (to maintain sorted invariant).

Example: merging 3 7 10 14 18 19 with 2 11 16 17 23 25 gives blue-green inversion counts 6 + 3 + 2 + 2 + 0 + 0 = 13, and the sorted whole 2 3 7 10 11 14 16 17 18 19 23 25.

Count: O(n). Merge: O(n).

    T(n) ≤ T(⌈n/2⌉) + T(⌊n/2⌋) + O(n)  ⇒  T(n) = O(n log n)

## Counting Inversions: Implementation

Pre-condition. [Merge-and-Count] A and B are sorted.
Post-condition. [Sort-and-Count] L is sorted.

    Sort-and-Count(L) {
       if list L has one element
          return 0 and the list L

       Divide the list into two halves A and B
       (rA, A) ← Sort-and-Count(A)
       (rB, B) ← Sort-and-Count(B)
       (r, L)  ← Merge-and-Count(A, B)

       return rA + rB + r and the sorted list L
    }
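The Sort-and-Count / Merge-and-Count pseudocode maps directly to Python (a minimal sketch; function and variable names are mine):

```python
def sort_and_count(L):
    """Return (number of inversions in L, sorted copy of L)."""
    if len(L) <= 1:
        return 0, list(L)
    mid = len(L) // 2
    ra, A = sort_and_count(L[:mid])
    rb, B = sort_and_count(L[mid:])
    r, merged = merge_and_count(A, B)
    return ra + rb + r, merged

def merge_and_count(A, B):
    """A and B sorted. Count pairs (a, b) with a in A, b in B, a > b, while merging."""
    result, count = [], 0
    i = j = 0
    while i < len(A) and j < len(B):
        if A[i] <= B[j]:
            result.append(A[i])
            i += 1
        else:
            # B[j] jumps ahead of every remaining element of A: len(A) - i inversions
            count += len(A) - i
            result.append(B[j])
            j += 1
    result.extend(A[i:])
    result.extend(B[j:])
    return count, result
```

On the slide's example list, `sort_and_count([1, 5, 4, 8, 10, 2, 6, 9, 12, 11, 3, 7])` reports 22 inversions, matching the 5 + 8 + 9 tally above.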

## 5.4 Closest Pair of Points

Closest pair. Given n points in the plane, find a pair with smallest Euclidean distance between them.

Fundamental geometric primitive.
- Graphics, computer vision, geographic information systems, molecular modeling, air traffic control.
- Special case of nearest neighbor, Euclidean MST, Voronoi. (Fast closest pair inspired fast algorithms for these problems.)

Brute force. Check all pairs of points p and q with Θ(n²) comparisons.

1-D version. O(n log n) easy if points are on a line.

Assumption. No two points have same x coordinate. (To make presentation cleaner.)

## Closest Pair of Points: First Attempt

Divide. Sub-divide region into 4 quadrants.

Obstacle. Impossible to ensure n/4 points in each piece.

## Closest Pair of Points

Algorithm.
- Divide: draw vertical line L so that roughly n/2 points on each side.
- Conquer: find closest pair in each side recursively.
- Combine: find closest pair with one point in each side. (Seems like Θ(n²).)
- Return best of 3 solutions.

Example: left half yields 12, right half yields 21, so δ = min(12, 21) = 12.

Find closest pair with one point in each side, assuming that distance < δ.
- Observation: only need to consider points within δ of line L.
- Sort points in 2δ-strip by their y coordinate.
- Only check distances of those within 11 positions in sorted list!

Def. Let sᵢ be the point in the 2δ-strip with the ith smallest y-coordinate.

Claim. If |i - j| ≥ 12, then the distance between sᵢ and sⱼ is at least δ.

Pf.
- No two points lie in the same ½δ-by-½δ box.
- Two points at least 2 rows apart have distance ≥ 2(½δ) = δ. ▪

Fact. The claim remains true if we replace 12 with 7.
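Under the distinct-x assumption above, the recursion plus the 2δ-strip check can be sketched in Python. This is the simpler variant that re-sorts the strip at every level (so O(n log² n), not the optimized O(n log n) version); names are mine:

```python
from math import hypot, inf

def closest_pair(points):
    """Divide-and-conquer closest-pair distance for points in the plane."""
    pts = sorted(points)                      # sort by x coordinate once

    def dist(p, q):
        return hypot(p[0] - q[0], p[1] - q[1])

    def rec(P):
        if len(P) <= 3:                       # brute force tiny instances
            return min((dist(p, q) for i, p in enumerate(P) for q in P[i + 1:]),
                       default=inf)
        mid = len(P) // 2
        x_mid = P[mid][0]                     # vertical line L
        d = min(rec(P[:mid]), rec(P[mid:]))   # best of the two sides
        # 2*delta-strip: points within d of line L, sorted by y coordinate
        strip = sorted((p for p in P if abs(p[0] - x_mid) < d),
                       key=lambda p: p[1])
        for i, p in enumerate(strip):         # check next 11 positions only
            for q in strip[i + 1:i + 12]:
                d = min(d, dist(p, q))
        return d

    return rec(pts)
```

The inner loop bound of 11 neighbors is exactly the claim proved above: any closer pair in the strip must sit within 11 positions in y-sorted order.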

## Closest Pair Algorithm

    Closest-Pair(p₁, ..., pₙ) {
       Compute separation line L such that half the points      O(n log n)
       are on one side and half on the other side.

       δ₁ = Closest-Pair(left half)                             2T(n/2)
       δ₂ = Closest-Pair(right half)
       δ  = min(δ₁, δ₂)

       Delete all points further than δ from separation line L  O(n)

       Sort remaining points by y-coordinate.                   O(n log n)

       Scan points in y-order and compare distance between      O(n)
       each point and next 11 neighbors. If any of these
       distances is less than δ, update δ.

       return δ.
    }

## Closest Pair of Points: Analysis

Running time.

    T(n) ≤ 2T(n/2) + O(n log n)  ⇒  T(n) = O(n log² n)

Q. Can we achieve O(n log n)?

A. Yes. Don't sort points in strip from scratch each time.
- Each recursive call returns two lists: all points sorted by y coordinate, and all points sorted by x coordinate.
- Sort by merging two pre-sorted lists.

    T(n) ≤ 2T(n/2) + O(n)  ⇒  T(n) = O(n log n)

## 5.5 Integer Multiplication

Integer arithmetic.

Add. Given two n-digit integers a and b, compute a + b. O(n) bit operations.

Multiply. Given two n-digit integers a and b, compute a × b. Brute force (grade-school) solution: Θ(n²) bit operations.

## Divide-and-Conquer Multiplication: Warmup

To multiply two n-digit integers:
- Multiply four ½n-digit integers.
- Add two ½n-digit integers, and shift to obtain result.

    x = 2^(n/2) x₁ + x₀
    y = 2^(n/2) y₁ + y₀
    xy = (2^(n/2) x₁ + x₀)(2^(n/2) y₁ + y₀)
       = 2ⁿ x₁y₁ + 2^(n/2) (x₁y₀ + x₀y₁) + x₀y₀

    T(n) = 4T(n/2) + Θ(n)  ⇒  T(n) = Θ(n²)
           (recursive calls)  (add, shift)

## Karatsuba Multiplication

To multiply two n-digit integers:
- Add two ½n-digit integers.
- Multiply three ½n-digit integers.
- Add, subtract, and shift ½n-digit integers to obtain result.

    xy = 2ⁿ x₁y₁ + 2^(n/2) (x₁y₀ + x₀y₁) + x₀y₀
       = 2ⁿ x₁y₁ + 2^(n/2) ((x₁ + x₀)(y₁ + y₀) - x₁y₁ - x₀y₀) + x₀y₀

Only three distinct products are needed: A = x₁y₁, B = x₀y₀, and C = (x₁ + x₀)(y₁ + y₀); the middle term is C - A - B.

Theorem. [Karatsuba-Ofman, 1962] Can multiply two n-digit integers in O(n^1.585) bit operations.

    T(n) ≤ T(⌊n/2⌋) + T(⌈n/2⌉) + T(1 + ⌈n/2⌉) + Θ(n)
           (recursive calls)                     (add, subtract, shift)

    ⇒  T(n) = O(n^(log₂ 3)) = O(n^1.585)

## Karatsuba: Recursion Tree

Assuming n is a power of 2:

    T(n) = 0              if n = 1
    T(n) = 3T(n/2) + n    otherwise

Level k of the recursion tree contributes 3ᵏ (n/2ᵏ) = n (3/2)ᵏ, so

    T(n) = Σ_{k=0}^{log₂ n} n (3/2)ᵏ
         = n · ((3/2)^(1 + log₂ n) - 1) / ((3/2) - 1)
         = 3 n^(log₂ 3) - 2n.
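The three-multiplication trick translates into a short Python sketch for nonnegative integers, splitting on bits rather than decimal digits (names and the base-case threshold are mine):

```python
def karatsuba(x, y):
    """Multiply nonnegative integers with 3 recursive half-size multiplications."""
    if x < 16 or y < 16:                 # base case: small enough for hardware multiply
        return x * y
    n = max(x.bit_length(), y.bit_length())
    half = n // 2
    x1, x0 = x >> half, x & ((1 << half) - 1)   # x = 2^half * x1 + x0
    y1, y0 = y >> half, y & ((1 << half) - 1)   # y = 2^half * y1 + y0
    a = karatsuba(x1, y1)                       # A = x1*y1
    b = karatsuba(x0, y0)                       # B = x0*y0
    c = karatsuba(x1 + x0, y1 + y0)             # C = (x1+x0)(y1+y0)
    # middle term x1*y0 + x0*y1 = C - A - B
    return (a << (2 * half)) + ((c - a - b) << half) + b
```

Note the recurrence above has three slightly different subproblem sizes because x₁ + x₀ can carry into one extra digit; that is why the theorem's bound is stated with the extra T(1 + ⌈n/2⌉) term.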

## Matrix Multiplication

Matrix multiplication. Given two n-by-n matrices A and B, compute C = AB.

    cᵢⱼ = Σ_{k=1}^{n} aᵢₖ bₖⱼ

Brute force. Θ(n³) arithmetic operations.

Fundamental question. Can we improve upon brute force?

## Matrix Multiplication: Warmup

Divide-and-conquer.
- Divide: partition A and B into ½n-by-½n blocks.
- Conquer: multiply 8 ½n-by-½n blocks recursively.
- Combine: add appropriate products using 4 matrix additions.

    [C₁₁ C₁₂]   [A₁₁ A₁₂]   [B₁₁ B₁₂]
    [C₂₁ C₂₂] = [A₂₁ A₂₂] × [B₂₁ B₂₂]

    C₁₁ = (A₁₁ × B₁₁) + (A₁₂ × B₂₁)
    C₁₂ = (A₁₁ × B₁₂) + (A₁₂ × B₂₂)
    C₂₁ = (A₂₁ × B₁₁) + (A₂₂ × B₂₁)
    C₂₂ = (A₂₁ × B₁₂) + (A₂₂ × B₂₂)

    T(n) = 8T(n/2) + Θ(n²)  ⇒  T(n) = Θ(n³)
           (recursive calls)  (add, form submatrices)

## Matrix Multiplication: Key Idea

Key idea. Multiply 2-by-2 block matrices with only 7 multiplications.

    P₁ = A₁₁ × (B₁₂ - B₂₂)
    P₂ = (A₁₁ + A₁₂) × B₂₂
    P₃ = (A₂₁ + A₂₂) × B₁₁
    P₄ = A₂₂ × (B₂₁ - B₁₁)
    P₅ = (A₁₁ + A₂₂) × (B₁₁ + B₂₂)
    P₆ = (A₁₂ - A₂₂) × (B₂₁ + B₂₂)
    P₇ = (A₁₁ - A₂₁) × (B₁₁ + B₁₂)

    C₁₁ = P₅ + P₄ - P₂ + P₆
    C₁₂ = P₁ + P₂
    C₂₁ = P₃ + P₄
    C₂₂ = P₅ + P₁ - P₃ - P₇

- 7 multiplications.
- 18 = 10 + 8 additions (or subtractions).

## Fast Matrix Multiplication

Fast matrix multiplication. (Strassen, 1969)
- Divide: partition A and B into ½n-by-½n blocks.
- Compute: 14 ½n-by-½n matrices via 10 matrix additions.
- Conquer: multiply 7 ½n-by-½n matrices recursively.
- Combine: 7 products into 4 terms using 8 matrix additions.

Analysis.
- Assume n is a power of 2.
- T(n) = # arithmetic operations.

    T(n) = 7T(n/2) + Θ(n²)  ⇒  T(n) = Θ(n^(log₂ 7)) = O(n^2.81)
           (recursive calls)  (add, subtract)
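Strassen's seven products can be sketched in pure Python over lists of lists (helper names `madd`/`msub` are mine; a practical implementation would cross over to the classical algorithm for small blocks instead of recursing to 1-by-1):

```python
def madd(X, Y):
    """Entrywise matrix addition."""
    return [[a + b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

def msub(X, Y):
    """Entrywise matrix subtraction."""
    return [[a - b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

def strassen(A, B):
    """Strassen multiply for n-by-n matrices, n a power of 2."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    def split(M):  # partition into four h-by-h blocks
        return ([r[:h] for r in M[:h]], [r[h:] for r in M[:h]],
                [r[:h] for r in M[h:]], [r[h:] for r in M[h:]])
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    P1 = strassen(A11, msub(B12, B22))
    P2 = strassen(madd(A11, A12), B22)
    P3 = strassen(madd(A21, A22), B11)
    P4 = strassen(A22, msub(B21, B11))
    P5 = strassen(madd(A11, A22), madd(B11, B22))
    P6 = strassen(msub(A12, A22), madd(B21, B22))
    P7 = strassen(msub(A11, A21), madd(B11, B12))
    C11 = madd(msub(madd(P5, P4), P2), P6)      # P5 + P4 - P2 + P6
    C12 = madd(P1, P2)                          # P1 + P2
    C21 = madd(P3, P4)                          # P3 + P4
    C22 = msub(msub(madd(P5, P1), P3), P7)      # P5 + P1 - P3 - P7
    return ([r1 + r2 for r1, r2 in zip(C11, C12)] +
            [r1 + r2 for r1, r2 in zip(C21, C22)])
```

For the 2-by-2 case the seven products reduce exactly to the scalar identities above, which is also the easiest way to check them by hand.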

## Fast Matrix Multiplication in Practice

Implementation issues.
- Sparsity.
- Caching effects.
- Numerical stability.
- Odd matrix dimensions.
- Crossover to classical algorithm around n = 128.

Common misperception: "Strassen is only a theoretical curiosity."
- Advanced Computation Group at Apple Computer reports 8x speedup on G4 Velocity Engine when n ≈ 2,500.
- Range of instances where it's useful is a subject of controversy.

Remark. Can "Strassenize" Ax = b, determinant, eigenvalues, and other matrix ops.

## Fast Matrix Multiplication in Theory

Q. Multiply two 2-by-2 matrices with only 7 scalar multiplications?
A. Yes! [Strassen, 1969] Θ(n^(log₂ 7)) = O(n^2.81).

Q. Multiply two 2-by-2 matrices with only 6 scalar multiplications?
A. Impossible. [Hopcroft and Kerr, 1971] Θ(n^(log₂ 6)) = O(n^2.59).

Q. Two 3-by-3 matrices with only 21 scalar multiplications?
A. Also impossible. Θ(n^(log₃ 21)) = O(n^2.77).

Q. Two 70-by-70 matrices with only 143,640 scalar multiplications?
A. Yes! [Pan, 1980] Θ(n^(log₇₀ 143640)) = O(n^2.80).

Decimal wars.
- December, 1979: O(n^2.521813).
- January, 1980: O(n^2.521801).

Best known. O(n^2.376). [Coppersmith-Winograd, 1987]

Conjecture. O(n^(2+ε)) for any ε > 0.

Caveat. Theoretical improvements to Strassen are progressively less practical.
