General Method
Divide and conquer is a design strategy that is well known for breaking down
efficiency barriers. When the method applies, it often leads to a large
improvement in time complexity; for example, sorting improves from O(n^2) to
O(n log n). The divide and conquer strategy is as follows: divide the problem
instance into two or more smaller instances of the same problem, solve the
smaller instances recursively, and assemble the solutions to form a solution of
the original instance. The recursion stops when an instance is reached that is
too small to divide. When dividing the instance, one can either use whatever
division comes most easily to hand or invest time in making the division
carefully so that the assembly is simplified.
Base Case: When the instance I of the problem P is sufficiently small, return
the answer P(I) directly, or resort to a different, usually simpler, algorithm
that is well suited for small instances.
Inductive Step:
1. Divide into some number of smaller instances of the same problem P.
2. Conquer each of the smaller instances recursively to obtain their answers.
3. Combine the answers to produce an answer for the original instance I.
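The three-step template above can be sketched with merge sort, the classic
divide and conquer sorting algorithm (this is an illustrative sketch, not part
of the original text):

```python
def merge_sort(items):
    # Base case: an instance too small to divide is already sorted.
    if len(items) <= 1:
        return items
    # 1. Divide into two smaller instances of the same problem.
    mid = len(items) // 2
    # 2. Conquer each of the smaller instances recursively.
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # 3. Combine the answers by merging the two sorted halves.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

The combine step does linear work per level and the divide step halves the
instance, which is exactly the T(n) = 2T(n/2) + n recurrence analyzed below.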
For most recursive algorithms, you will be able to find the time complexity
using the master theorem, but there are some cases in which the master
theorem is not applicable: when T(n) is not monotone, for example
T(n) = sin n, or when the driving function f(n) is not a polynomial.
As the master theorem is not efficient to apply in these cases, an advanced
master theorem for such recurrences was designed. It is designed to handle
recurrences of the form:

T(n) = aT(n/b) + Θ(n^k log^p n)
Where,
n is the size of the problem,
a is the number of subproblems in the recursion, a > 0,
n/b is the size of each subproblem, b > 1,
and k >= 0, p is a real number.
To solve the recurrence relation using the advanced master theorem, the
following conditions are checked and then evaluated:

1. If a > b^k (that is, log_b a > k), then T(n) = Θ(n^(log_b a)).
2. If a = b^k (that is, log_b a = k), then
   (a) if p > -1, T(n) = Θ(n^k log^(p+1) n);
   (b) if p = -1, T(n) = Θ(n^k log log n);
   (c) if p < -1, T(n) = Θ(n^k).
3. If a < b^k (that is, log_b a < k), then
   (a) if p >= 0, T(n) = Θ(n^k log^p n);
   (b) if p < 0, T(n) = Θ(n^k).
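The case analysis of the advanced master theorem can be sketched as a small
classifier (an illustrative sketch following the standard statement of the
theorem; `advanced_master` is a name chosen here, not a library function):

```python
import math

def advanced_master(a, b, k, p):
    """Classify T(n) = a*T(n/b) + Θ(n^k * log^p n) into its Θ bound."""
    c = math.log(a, b)  # log_b a, compared against k
    if c > k:
        # Condition 1: recursion dominates.
        return f"Θ(n^{c:g})"
    if c == k:
        # Condition 2: recursion and driving function balance.
        if p > -1:
            return f"Θ(n^{k:g} log^{p + 1:g} n)"
        if p == -1:
            return f"Θ(n^{k:g} log log n)"
        return f"Θ(n^{k:g})"
    # Condition 3: the driving function dominates.
    if p >= 0:
        return f"Θ(n^{k:g} log^{p:g} n)"
    return f"Θ(n^{k:g})"

print(advanced_master(2, 2, 0, 0))  # condition 1
print(advanced_master(2, 2, 1, 0))  # condition 2
print(advanced_master(1, 2, 2, 0))  # condition 3
```

The three calls correspond to the three worked examples below.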
Examples:
1. T(n) = 2T(n/2) + 1
here,
a = 2
b = 2
k = 0 (n^k = 1)
p = 0 (log^p n = 1)
also, log_b a = 1 and log_b a > k. [condition 1]
Therefore, T(n) = Θ(n^(log_b a))
= Θ(n)
2. T(n) = 2T(n/2) + n
here,
a = 2
b = 2
k = 1 (n^k = n)
p = 0 (log^p n = 1)
also, log_b a = 1 and log_b a = k. [condition 2]
Therefore, T(n) = Θ(n^(log_b a) log^(p+1) n)
= Θ(n log n)
3. T(n) = T(n/2) + n2
here,
a = 1
b = 2
k = 2 (n^k = n^2)
p = 0 (log^p n = 1)
also, log_b a = 0 and log_b a < k. [condition 3]
Therefore, T(n) = Θ(n^k log^p n)
= Θ(n^2)
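The three examples can be sanity-checked numerically by evaluating each
recurrence directly (with T(1) = 1 assumed as the base value) and dividing by
the predicted growth rate; each ratio settles near a constant:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def t1(n):
    # Example 1: T(n) = 2T(n/2) + 1, predicted Θ(n)
    return 1 if n <= 1 else 2 * t1(n // 2) + 1

@lru_cache(maxsize=None)
def t2(n):
    # Example 2: T(n) = 2T(n/2) + n, predicted Θ(n log n)
    return 1 if n <= 1 else 2 * t2(n // 2) + n

@lru_cache(maxsize=None)
def t3(n):
    # Example 3: T(n) = T(n/2) + n^2, predicted Θ(n^2)
    return 1 if n <= 1 else t3(n // 2) + n * n

for n in (2**10, 2**20):
    # Each ratio approaches a constant, confirming the Θ bound.
    print(t1(n) / n,
          t2(n) / (n * math.log2(n)),
          t3(n) / (n * n))
```

At powers of two the recurrences solve exactly: t1 gives 2n - 1, t2 gives
n(log_2 n + 1), and t3 approaches (4/3)n^2.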
Fact: If all the names in the world were written down together in order and
you wanted to search for the position of a specific name, binary search would
accomplish this in a maximum of 35 iterations.
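That bound follows from the logarithmic behavior of binary search: each probe
halves the remaining range, so about ceil(log2(N)) probes suffice for N items.
Assuming roughly 8 billion names (the population figure is an assumption
here), the count lands comfortably within the stated 35:

```python
import math

def max_probes(n_items):
    # Each binary-search probe halves the remaining range,
    # so about ceil(log2(N)) probes are enough for N sorted items.
    return math.ceil(math.log2(n_items))

print(max_probes(8_000_000_000))  # 33, within the stated bound of 35
```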