# Master’s Theorem and Its Use in Calculating Time Complexity

Author: Kanika Joshi

Master’s theorem is a formula used to analyze the time complexity of recurrence relations. Now, the question arises: what are recursive relations?

In this article, we will look at what exactly Master’s theorem is, and how we can calculate the time complexity of any recursive program using it.

### Table of Contents

- What is Master’s Theorem?
- Application of Master’s Theorem for:
  - Dividing functions
  - Decreasing functions

Recursive relations or functions are functions that call themselves until a base condition is reached. A recursive relation can lead to an infinite loop if there is no terminating (base) condition for the function. Master’s theorem is a formula that helps identify the time complexity of a recursive relation in a quicker and easier manner.
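For instance, a recursive function with a base condition might look like the following (a minimal Python sketch; the factorial function is used here only as an illustration, it is not part of the article):

```python
def factorial(n):
    # Base condition: stops the recursion
    if n <= 1:
        return 1
    # Recursive call on a smaller input
    return n * factorial(n - 1)

print(factorial(5))  # 120
```

Without the `if n <= 1` check, the calls would never terminate.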

**What is Master’s Theorem?**

Recursive functions call themselves repeatedly, which makes them harder to analyze; as the algorithm grows more complex, so does the task of calculating its time complexity. Master’s theorem is the easiest and fastest way to calculate the time complexity of many recursive functions.

Master’s Theorem can be applied on:

- Dividing function
- Decreasing function

**1. For Dividing Functions**

This formula is usually used to calculate time complexity of “Dividing” recursive relations, which are of the following form:

**T(n) = aT(n/b) + f(n)**

where,

f(n) = θ(n^k log^p n)

n is the input size

a should be greater than or equal to 1, i.e. a >= 1, where a is the number of subproblems in the recursive function

b should be greater than 1, i.e. b > 1, where n/b is the size of each subproblem

Master’s theorem for dividing functions can only be applied when the relation is of the form

**T(n) = aT(n/b) + f(n), where f(n) = θ(n^k log^p n)**.

Like,

T(n) = 3T(n/4) + n log n

T(n) = 2T(n/3) + n^2

So, the master’s theorem for dividing functions is stated as:

**T(n) = aT(n/b) + θ(n^k log^p n)**

Case I: If log_b a > k, then

- T(n) = θ(n^(log_b a))

Case II: If log_b a = k, then

- If p > -1, then T(n) = θ(n^k log^(p+1) n)
- If p = -1, then T(n) = θ(n^k log log n)
- If p < -1, then T(n) = θ(n^k)

Case III: If log_b a < k, then

- If p >= 0, then T(n) = θ(n^k log^p n)
- If p < 0, then T(n) = θ(n^k)
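The case analysis above can be sketched as a small Python helper (an illustrative sketch of my own; the function name `master_dividing` and its string outputs are not standard):

```python
import math

def master_dividing(a, b, k, p):
    """Return the asymptotic bound of T(n) = a*T(n/b) + theta(n^k * log^p n)."""
    log_ba = math.log(a, b)
    if math.isclose(log_ba, k):          # Case II: log_b(a) = k
        if p > -1:
            return f"theta(n^{k} * log^{p + 1} n)"
        if p == -1:
            return f"theta(n^{k} * loglog n)"
        return f"theta(n^{k})"
    if log_ba > k:                       # Case I
        return f"theta(n^{log_ba:.2f})"
    # Case III: log_b(a) < k
    if p >= 0:
        return f"theta(n^{k} * log^{p} n)"
    return f"theta(n^{k})"
```

For example, merge sort has a = 2, b = 2, k = 1, p = 0, so `master_dividing(2, 2, 1, 0)` falls into Case II and returns a θ(n log n)-style bound.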

**Example:** Calculate the time complexity of the following recursive relation:

T(n) = 2T(n/3) + n^2

**Solution:**

We can compare the recursive relation with the Master’s theorem form and identify the values of a, b, k, and p.

After comparing the above recursive relation with **T(n) = aT(n/b) + θ(n^k log^p n)**, the values are a = 2, b = 3, k = 2, p = 0.

Now compare log_b a with k:

log_3 2 ≈ 0.63 < 2, i.e. a < b^k (since 2 < 3^2 = 9)

From here, we can see that Case III will be applied.

Since p = 0 (so p >= 0), T(n) = θ(n^k log^p n):

T(n) = θ(n^2 log^0 n)

T(n) = θ(n^2)

**Therefore, the time complexity of the above relation is T(n) = θ(n^2).**
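We can sanity-check this result numerically (a rough Python sketch; the base case T(1) = 1 is an assumption, and n/b is rounded down so the input stays an integer):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/3) + n^2, with an assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 2 * T(n // 3) + n * n

# If T(n) = theta(n^2), the ratio T(n)/n^2 should stay bounded by a constant
for n in (100, 1000, 10000):
    print(n, T(n) / (n * n))  # settles near 9/7 ~ 1.29
```

The ratio stabilizing at a constant (rather than growing) is consistent with the θ(n^2) bound.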


**2. For Decreasing Functions**

This formula is usually used to calculate time complexity of “Decreasing” recursive relations, which are of the following form:

**T(n) = aT(n-b) + f(n)**

where,

f(n) = θ(n^k)

a is the number of subproblems

n is the input size

n-b is the size of each subproblem

a > 0, b > 0, k >= 0

Here, the value of a (the number of subproblems) decides the time complexity of the relation.

Master’s theorem for decreasing functions can only be applied when the relation is of the form

**T(n) = aT(n-b) + f(n), where f(n) = θ(n^k)**.

Like,

T(n) = 2T(n-4) + n

T(n) = T(n-1) + n^2

So, the master’s theorem for decreasing functions is stated as:

**T(n) = aT(n-b) + θ(n^k)**

Case I: If a < 1, then T(n) = θ(n^k)

Case II: If a = 1, then T(n) = θ(n^(k+1))

Case III: If a > 1, then T(n) = θ(a^(n/b) · f(n))
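These three cases can also be sketched in Python (an illustrative helper of my own; the name `master_decreasing` and its string outputs are not standard):

```python
def master_decreasing(a, k):
    """Return the asymptotic bound of T(n) = a*T(n-b) + theta(n^k)."""
    if a < 1:        # Case I: the work shrinks geometrically
        return f"theta(n^{k})"
    if a == 1:       # Case II: about n/b levels, each doing theta(n^k) work
        return f"theta(n^{k + 1})"
    # Case III: a > 1, the number of subproblems grows exponentially with depth
    return f"theta(a^(n/b) * n^{k})"
```

For example, `master_decreasing(1, 2)` (one subproblem, f(n) = θ(n^2)) falls into Case II and returns a θ(n^3)-style bound.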

**Example:** Calculate the time complexity of the following recursive relation:

T(n) = T(n-3) + θ(n^2)

**Solution:**

First check that a > 0, b > 0, and k >= 0. Here, a = 1, b = 3, and k = 2, so this relation can be solved using Master’s theorem for decreasing functions.

Since a = 1, Case II applies:

T(n) = θ(n^(k+1))

T(n) = θ(n^(2+1))

T(n) = θ(n^3)

**Therefore, the time complexity of the above relation is T(n) = θ(n^3).**
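As before, the result can be sanity-checked numerically (a rough Python sketch computed iteratively to avoid deep recursion; the base case T(n) = 0 for n <= 0 is an assumption):

```python
def T(n):
    # T(n) = T(n-3) + n^2, unrolled into a loop; assumed base case T(n) = 0 for n <= 0
    total = 0
    while n > 0:
        total += n * n
        n -= 3
    return total

# If T(n) = theta(n^3), the ratio T(n)/n^3 should approach a constant (about 1/9 here)
for n in (300, 3000, 30000):
    print(n, T(n) / n**3)
```

The ratio converging to roughly 1/9 matches the θ(n^3) bound, since the sum n^2 + (n-3)^2 + (n-6)^2 + … behaves like n^3/9.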

**Applications:**

- Master’s theorem can be used to analyze divide-and-conquer algorithms such as merge sort (T(n) = 2T(n/2) + n, giving θ(n log n)) and binary search (T(n) = T(n/2) + 1, giving θ(log n)).

**Limitations:**

- Master’s theorem can’t be applied to every recurrence relation of the form **T(n) = aT(n/b) + f(n)**.
- It can’t be applied when a is less than 1 or when a is not a constant (e.g. a = n).
- It can’t be applied when f(n) is not of the required polynomial-logarithmic form or when T(n) is not monotonic.

