What is Recurrence?

A recurrence is an equation or formula that defines each value of a sequence in terms of one or more earlier values in the sequence. In computer science, recurrences are often used to describe the running time of recursive algorithms: the time to solve a problem of size n is expressed in terms of the time to solve smaller instances of the same problem.
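For example, recursive binary search does a constant amount of work and then recurses on half the input, so its running time satisfies T(n) = T(n/2) + O(1), which solves to O(log n). A minimal sketch (function and helper names are my own):

```python
def binary_search(xs, target):
    """Recursive binary search over a sorted list.

    Each call does O(1) work and recurses on one half, giving the
    recurrence T(n) = T(n/2) + O(1), whose solution is O(log n).
    """
    def go(lo, hi):                      # search within xs[lo:hi]
        if lo >= hi:
            return -1                    # empty range: not found
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        if xs[mid] < target:
            return go(mid + 1, hi)       # T(n/2): right half
        return go(lo, mid)               # T(n/2): left half
    return go(0, len(xs))
```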

Master Method for Solving Recurrence

The master method is a technique for solving recurrences of the form T(n) = aT(n/b) + f(n), where T(n) represents the running time of an algorithm on an input of size n, a >= 1 is the number of subproblems the algorithm breaks the original problem into, n/b (with b > 1) is the size of each subproblem, and f(n) is the time required to divide the problem and combine the subproblem solutions. The master method provides a way to determine the time complexity of such an algorithm directly from its recurrence relation.

The master method has three cases based on the values of a, b, and f(n):

  1. If f(n) = O(n^c) for some constant c with a > b^c (equivalently, c < log_b(a)), then T(n) = Θ(n^(log_b(a))).
  2. If f(n) = Θ(n^c) for some constant c with a = b^c (equivalently, c = log_b(a)), then T(n) = Θ(n^c log n).
  3. If f(n) = Ω(n^c) for some constant c with a < b^c (equivalently, c > log_b(a)), and if a*f(n/b) <= k*f(n) for some constant k < 1 and all sufficiently large n (the regularity condition), then T(n) = Θ(f(n)).

In the first case, the time required for dividing the problem and combining the solutions is small compared to the time required to solve the subproblems. The recursion tree has Θ(n^(log_b(a))) leaves, and their total cost dominates the running time, so the time complexity of the algorithm is Θ(n^(log_b(a))).

In the second case, the time required for dividing the problem and combining the solutions is roughly equal to the total cost of each level of recursion below it. The work is spread evenly across the Θ(log n) levels of the recursion tree, each contributing Θ(n^c), so the time complexity of the algorithm is Θ(n^c log n).

In the third case, the time required for dividing the problem and combining the solutions is large compared to the time required to solve the subproblems. The cost f(n) at the root dominates the geometrically decreasing costs of the levels below it, so the time complexity of the algorithm is Θ(f(n)).
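These behaviours can be checked numerically. The sketch below (the helper name T is my own) evaluates the recurrence exactly; since every subproblem at a given level has the same size n // b, a single recursive call per level suffices:

```python
import math

def T(n, a, b, f):
    """Evaluate T(n) = a*T(n // b) + f(n) exactly, with T(1) = 1.

    Every subproblem at a given level has the same size n // b, so one
    recursive call per level is enough to evaluate the whole tree.
    """
    if n <= 1:
        return 1
    return a * T(n // b, a, b, f) + f(n)

n = 2 ** 18

# Root dominates (a < b^c): T(n) = 2T(n/2) + n^2 is Θ(n^2), so the
# ratio T(n) / n^2 settles near a constant (here 2 - 1/n, about 2).
print(T(n, 2, 2, lambda m: m * m) / n ** 2)

# Leaves dominate (a > b^c): T(n) = 3T(n/2) + n is Θ(n^log_2(3)), so
# T(n) / n^log_2(3) settles near a constant (about 3).
print(T(n, 3, 2, lambda m: m) / n ** math.log2(3))
```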

In conclusion, the master method provides a simple and efficient way to determine the time complexity of an algorithm in terms of its recurrence relation. By analyzing the values of a, b, and f(n), we can determine the appropriate case for the recurrence and obtain an asymptotic bound for the time complexity of the algorithm.

How to Use the Master Theorem?

To apply the master theorem to an algorithm, we need to identify the values of a, b, and f(n) from its recurrence relation. Once we have these values, we can determine which case of the master theorem applies and compute the time complexity of the algorithm.
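For polynomial f(n) = Θ(n^c), this case analysis can be packaged into a small helper (the function and its output format are my own; for polynomial f the regularity condition of the third case holds automatically):

```python
import math

def master_theorem(a: int, b: int, c: float) -> str:
    """Asymptotic bound for T(n) = a*T(n/b) + Θ(n^c).

    Assumes a >= 1, b > 1, and a plain polynomial driving function,
    for which the regularity condition holds automatically.
    """
    crit = math.log(a) / math.log(b)     # critical exponent log_b(a)
    if math.isclose(c, crit):            # a == b^c: work even per level
        return f"Θ(n^{c:g} log n)"
    if c < crit:                         # a > b^c: leaves dominate
        return f"Θ(n^{crit:g})"
    return f"Θ(n^{c:g})"                 # a < b^c: root dominates

# e.g. merge sort, T(n) = 2T(n/2) + n:
# master_theorem(2, 2, 1) -> "Θ(n^1 log n)", i.e. Θ(n log n)
```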

Example: Master Theorem for Merge Sort

T(n) = 2*T(n/2) + n

Here, a = 2, b = 2, and f(n) = n. The critical exponent is log_2(2) = 1, so n^(log_b(a)) = n. Since f(n) = n = Θ(n^1), the second case of the master theorem applies: f(n) grows at the same rate as n^(log_b(a)), so T(n) = Θ(n^(log_b(a)) log n) = Θ(n log n).
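As a concrete sketch, here is a minimal Merge Sort with the two terms of the recurrence marked in comments (implementation details are my own):

```python
def merge_sort(xs):
    """Merge sort: T(n) = 2*T(n/2) + Θ(n), which is Θ(n log n)."""
    if len(xs) <= 1:
        return xs                              # base case
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])                # T(n/2)
    right = merge_sort(xs[mid:])               # T(n/2)
    merged, i, j = [], 0, 0                    # merge: the Θ(n) term
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```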

Master Theorem’s Limitations

The master theorem is a powerful tool for analyzing the time complexity of algorithms that follow a specific form of recurrence relation. However, it has some limitations. For instance, the master theorem assumes that all subproblems have equal sizes, which may not be the case in some algorithms. Furthermore, it only works for a limited class of recurrence relations: it says nothing, for example, about recurrences such as T(n) = T(n-1) + n, where the problem size shrinks by subtraction rather than by a constant factor.

Extended Master Theorem

To address some of the limitations of the master theorem, researchers have developed extended versions, such as the Akra–Bazzi method, that apply to more general recurrence relations, including those whose subproblems have unequal sizes. The extended master theorem provides a way to compute the time complexity of algorithms that follow recurrence relations of the form:

T(n) = a*T(n/b) + g(n)

Here, g(n) represents the time required to divide the problem and combine the subproblem solutions, playing the same role as f(n) above.

Example: Extended Master Theorem for Quick Sort

Let us apply the extended master theorem to the Quick Sort algorithm, which has the recurrence relation:

T(n) = T(k) + T(n-k-1) + n

Here, k is the size of one partition and n-k-1 is the size of the other; in general the two are unequal, so the standard master theorem does not apply directly. To proceed, we define g(n) = n for the partitioning cost and write h(n) = k for the size of the first partition. The recurrence relation then becomes:

T(n) = T(h(n)) + T(n-h(n)-1) + g(n)

The analysis is simplest if we assume the partitions are balanced, i.e. h(n) = n/2. The recurrence then reduces to T(n) = 2*T(n/2) + g(n), which is the standard master-theorem form with a = 2 and b = 2.

Since g(n) = n = Θ(n^(log_2(2))) = Θ(n), the second case applies, and T(n) = Θ(n log n). This matches Quick Sort's average-case behaviour. When the partitions are consistently unbalanced (h(n) close to 0 or to n), the recurrence degrades to T(n) = T(n-1) + Θ(n), which gives Quick Sort its well-known Θ(n^2) worst case.
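For concreteness, here is a minimal out-of-place Quick Sort (this list-comprehension variant is my own sketch, not the in-place textbook version), annotated with the recurrence:

```python
import random

def quick_sort(xs):
    """Quick sort via partitioning: T(n) = T(k) + T(n-k-1) + Θ(n).

    With balanced partitions (k close to n/2) the recurrence reduces to
    T(n) = 2*T(n/2) + Θ(n), i.e. Θ(n log n); consistently unbalanced
    partitions degrade it to Θ(n^2). A random pivot makes roughly
    balanced splits the expected behaviour.
    """
    if len(xs) <= 1:
        return xs                        # base case: already sorted
    pivot = random.choice(xs)            # random pivot: expected balance
    less = [x for x in xs if x < pivot]          # one partition, size k
    equal = [x for x in xs if x == pivot]        # pivot and duplicates
    greater = [x for x in xs if x > pivot]       # the other partition
    return quick_sort(less) + equal + quick_sort(greater)
```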

