Recurrence relations are a fundamental concept in computer science and mathematics, often used to analyze the time complexity of algorithms and recursive functions. They provide a way to express the relationship between a problem’s size and the time it takes to solve it recursively. Solving recurrence relations is crucial for understanding and predicting the efficiency of algorithms. In this article, we will explore recurrence relations and three methods for solving them: the Substitution Method, the Recursion Tree Method, and the Master Theorem.

Understanding Recurrence Relations

A recurrence relation is a mathematical equation or inequality that defines a sequence of values, where each value is expressed in terms of one or more previous values in the sequence. In the context of algorithm analysis, recurrence relations are often used to describe the time complexity of recursive algorithms.

The general form of a recurrence relation is:

T(n) = f(n),    for n < n₀
T(n) = g(n) + T(n_sub),    for n ≥ n₀

Here:

  • T(n) represents the time or resource complexity of a problem of size n.
  • f(n) and g(n) are functions that describe the work done at each level of recursion.
  • n₀ is the base case or smallest problem size.
  • n_sub represents the size of the subproblems within the recursion.
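For example, binary search fits this template: the base case does constant work (f(n) = Θ(1)), each step does a constant amount of work outside the recursion (g(n) = Θ(1)), and there is a single subproblem of half the size (n_sub = n/2), giving T(n) = T(n/2) + Θ(1).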

Now, let’s dive into the methods for solving these recurrence relations.

1. Substitution Method

The Substitution Method is an intuitive way to solve recurrence relations. It involves guessing a solution and then proving its correctness through mathematical induction. Here are the steps for using the Substitution Method:

Step 1: Guess the Solution

Guess a form for the solution based on patterns you observe in the recurrence relation. This often involves guessing a function of n that represents the time complexity.

Step 2: Prove the Guess

Use mathematical induction to prove that the guessed solution is correct. This involves three parts:

  • Base Case: Prove that the guess holds for the base case (usually the smallest problem size n₀).
  • Inductive Hypothesis: Assume that the guess is true for all problem sizes up to a certain value k.
  • Inductive Step: Prove that if the guess is true for k, then it must also be true for k + 1.

Step 3: Solve for Constants

After proving the correctness of the guess, solve for any unknown constants introduced in the guessed solution by using initial conditions or base cases.

Example: Solving a Simple Recurrence Relation

Let’s consider the recurrence relation for the time complexity of a simple recursive function:

T(n) = T(n - 1) + 1
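As a concrete illustration, here is a minimal Python sketch of a function whose running time follows this recurrence (the function and its name are ours, purely for illustration): one recursive call on a problem of size n − 1, plus constant work per call.

def count_down(n):
    # Base case: constant work for n <= 1.
    if n <= 1:
        return 0
    # One recursive call on a problem of size n - 1,
    # plus a constant amount of additional work:
    # T(n) = T(n - 1) + 1.
    return 1 + count_down(n - 1)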

Step 1 (Guess): Guess that T(n) = O(n); concretely, guess that T(n) ≤ c·n for some constant c > 0.

Step 2 (Prove): Prove by induction that T(n) ≤ c·n for all n ≥ 1. (Note that the induction must be carried out with an explicit constant; O-notation cannot be manipulated directly inside an inductive step.)

  • Base Case: For n = 1, T(1) = 1 ≤ c·1 whenever c ≥ 1.
  • Inductive Hypothesis: Assume that T(k) ≤ c·k for some k ≥ 1.
  • Inductive Step: Now, we need to prove that T(k + 1) ≤ c·(k + 1).
T(k + 1) = T(k) + 1     (from the recurrence relation)
         ≤ c·k + 1      (by the inductive hypothesis)
         ≤ c·(k + 1)    (since c ≥ 1)

Therefore, by mathematical induction, T(n) ≤ c·n for all n ≥ 1, and so T(n) = O(n).

Step 3 (Solve for Constants): The induction goes through for any constant c ≥ 1. In fact, unrolling the recurrence with T(1) = 1 gives exactly T(n) = n.
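As a quick sanity check (a sketch of ours, not part of the formal proof), we can unroll the recurrence numerically and confirm the linear growth:

def T(n):
    # Unroll T(n) = T(n - 1) + 1 iteratively, with T(1) = 1.
    t = 1
    for _ in range(2, n + 1):
        t += 1
    return t

for n in (1, 10, 100, 1000):
    print(n, T(n))  # prints T(n) == n, consistent with O(n)

Each run prints T(n) equal to n exactly, matching the bound we just proved.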

2. Recursion Tree Method

The Recursion Tree Method is a visual way to solve recurrence relations. It involves constructing a tree to represent the recursive calls and their dependencies. This method is particularly useful for recurrence relations with multiple levels of recursion. Here are the steps for using the Recursion Tree Method:

Step 1: Create the Recursion Tree

  • Start with the initial problem (root node of the tree).
  • For each recursive call in the recurrence relation, create child nodes representing the subproblems.

Step 2: Analyze the Tree

  • Analyze the depth and branching factor of the tree.
  • Determine how many levels the tree has and how many subproblems are generated at each level.

Step 3: Summing Up Work

  • Calculate the work done at each level of the tree (usually represented as a sum).
  • Sum up the work done at all levels to find the total time complexity.

Example: Solving a Recurrence Relation with Recursion Tree

Let’s consider the recurrence relation for the time complexity of the merge sort algorithm:

T(n) = 2T(n/2) + Θ(n)
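For reference, here is a standard top-down merge sort in Python; the two recursive calls produce the 2T(n/2) term, and the linear-time merge loop produces the Θ(n) term.

def merge_sort(a):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    # Two recursive calls on halves of the input: the 2T(n/2) term.
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Linear-time merge of the two sorted halves: the Θ(n) term.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged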

Step 1 (Create the Recursion Tree):

          T(n)
         /    \
     T(n/2)  T(n/2)
    /   \     /   \
 T(n/4) T(n/4) T(n/4) T(n/4)
   ...   ...   ...   ...

Step 2 (Analyze the Tree):

  • The depth of the tree is log₂(n) (because the problem size is halved at each level).
  • At level i, there are 2ⁱ nodes (where i is the level number, with the root at level 0).

Step 3 (Summing Up Work):

  • The work done at each level is Θ(n) (due to the Θ(n) work in the recurrence relation).

The total work is the sum of the work at each level:

Total Work = Θ(n) + Θ(n) + Θ(n) + ... + Θ(n) (log₂(n) times)
           = Θ(n log₂(n))

So, the time complexity of merge sort is Θ(n log₂(n)).
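We can also check this numerically (a rough sketch of ours): evaluate the recurrence directly for powers of two and compare against n log₂(n).

import math

def T(n):
    # Evaluate T(n) = 2·T(n/2) + n with T(1) = 1, for n a power of two.
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

for n in (2**4, 2**8, 2**12):
    ratio = T(n) / (n * math.log2(n))
    print(n, T(n), round(ratio, 3))
    # The ratio approaches a constant as n grows,
    # consistent with T(n) = Θ(n log₂(n)).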

3. Master Theorem

The Master Theorem is a general framework for solving recurrence relations of a specific form. It provides a convenient way to determine the time complexity of divide-and-conquer algorithms. The recurrence relation must have the following form:

T(n) = aT(n/b) + f(n)

Where:

  • a is the number of subproblems.
  • b is the factor by which the problem size is reduced.
  • f(n) is the work done outside the recursive calls.

The Master Theorem provides three cases (denoted as Case 1, Case 2, and Case 3) with corresponding time complexity bounds. To apply the Master Theorem, you compare f(n) to the critical function n^log_b(a) and determine which case applies. Here are the cases:

Case 1:

If f(n) is O(n^c) for some c < log_b(a), then the time complexity is Θ(n^log_b(a)).

Case 2:

If f(n) is Θ(n^log_b(a)), then the time complexity is Θ(n^log_b(a) * log(n)).

Case 3:

If f(n) is Ω(n^c) for some c > log_b(a), and if a · f(n/b) ≤ k · f(n) for some constant k < 1 and all sufficiently large n (the regularity condition), then the time complexity is Θ(f(n)).
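For driving functions of the simple polynomial form f(n) = Θ(n^c), the case analysis can be automated. Below is a small helper of our own (not a standard library routine); for polynomial f(n), the regularity condition of Case 3 holds automatically, so the sketch does not check it.

import math

def master_theorem(a, b, c):
    # Classify T(n) = a·T(n/b) + Θ(n^c) by comparing c
    # with the critical exponent log_b(a).
    crit = math.log(a, b)
    if math.isclose(c, crit):
        return f"Case 2: Θ(n^{c:g} · log n)"
    if c < crit:
        return f"Case 1: Θ(n^{crit:g})"
    return f"Case 3: Θ(n^{c:g})"

print(master_theorem(2, 2, 1))  # merge sort -> Case 2: Θ(n^1 · log n)
print(master_theorem(8, 2, 2))  # -> Case 1: Θ(n^3)
print(master_theorem(1, 2, 1))  # -> Case 3: Θ(n^1)

For the general Master Theorem, where f(n) need not be a polynomial, Case 3’s regularity condition has to be verified separately.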

Example: Solving a Recurrence Relation with the Master Theorem

Let’s apply the Master Theorem to the recurrence relation of the merge sort algorithm:

T(n) = 2T(n/2) + Θ(n)

In this case:

  • a = 2 (two subproblems are created at each level).
  • b = 2 (the problem size is halved at each level).
  • f(n) = Θ(n) (work done outside recursive calls).

We need to compare f(n) to n^log_b(a):

n^log_b(a) = n^log_2(2) = n¹

Now, let’s determine which case of the Master Theorem applies:

  • f(n) = Θ(n) is not O(n^c) for any c < 1, so Case 1 does not apply.
  • f(n) = Θ(n) is exactly Θ(n^log_b(a)) = Θ(n¹), so Case 2 applies.

Since we are in Case 2, the time complexity is Θ(n^log_b(a) · log(n)) = Θ(n log(n)).

Therefore, the time complexity of merge sort, as confirmed by the Master Theorem, is Θ(n log n), which matches the result from the Recursion Tree Method.

Conclusion

Solving recurrence relations is a crucial skill for analyzing and understanding the time complexity of recursive algorithms. The Substitution Method, Recursion Tree Method, and Master Theorem are valuable tools for this purpose, each with its strengths and applications. By mastering these methods, you can gain insight into the efficiency of algorithms and make informed decisions when designing or optimizing recursive algorithms.

