Recurrence Relation For Complexity Analysis of Algorithms
Analysis of Algorithms:
Many algorithms are recursive. When we analyze them, we get a recurrence relation for
time complexity: the running time on an input of size n is expressed as a function of n and
the running time on inputs of smaller sizes. For example, in Merge Sort, to sort a given
array we divide it into two halves and recursively repeat the process for the two halves;
finally, we merge the results. The time complexity of Merge Sort can therefore be written
as T(n) = 2T(n/2) + cn. Many other algorithms, such as Binary Search and Tower of Hanoi,
lead to similar recurrences.
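As a sketch of where the Merge Sort recurrence comes from, here is a minimal Python
implementation: the two recursive calls contribute the 2T(n/2) term, and the linear-time
merge contributes the cn term.

```python
def merge_sort(a):
    # Base case: lists of length <= 1 are already sorted (constant work).
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    # Two recursive calls on halves: the 2T(n/2) term of the recurrence.
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merging two sorted halves takes linear time: the cn term.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```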
There are mainly three ways of solving recurrences:
Substitution Method:
We make a guess for the solution and then use mathematical induction to prove that the
guess is correct; if the proof fails, we refine the guess and try again.
For example, consider the recurrence T(n) = 2T(n/2) + n.
We guess the solution as T(n) = O(nLogn). Now we use induction to prove our guess.
We need to prove that T(n) <= cnLogn for some constant c. We assume the bound holds
for all values smaller than n. Then
T(n) = 2T(n/2) + n
     <= 2(c(n/2)Log(n/2)) + n
     = cnLogn – cnLog2 + n
     = cnLogn – cn + n
     <= cnLogn (for c >= 1, taking Log to base 2)
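The guess can also be sanity-checked numerically. A small sketch, assuming the base case
T(1) = 1 and n a power of two, confirms that the bound cnLogn with c = 2 holds for all
n >= 2:

```python
from math import log2

def T(n):
    # The recurrence T(n) = 2T(n/2) + n with base case T(1) = 1,
    # evaluated for n a power of two.
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# Check the induction guess T(n) <= c*n*Log(n) with c = 2 for n = 2 .. 2^15.
# (At n = 1 the bound c*n*Log(n) is 0, so the check starts at n = 2.)
c = 2
for k in range(1, 16):
    n = 2 ** k
    assert T(n) <= c * n * log2(n)
```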
Master Method:
The Master Method is a direct way to get the solution. It works only for the following
type of recurrences, or for recurrences that can be transformed into this type:
T(n) = aT(n/b) + f(n), where a >= 1 and b > 1
There are the following three cases:
1. If f(n) = O(n^c) where c < Log_b(a), then T(n) = Θ(n^(Log_b(a)))
2. If f(n) = Θ(n^c) where c = Log_b(a), then T(n) = Θ(n^c Logn)
3. If f(n) = Ω(n^c) where c > Log_b(a), and f satisfies the regularity condition
   a·f(n/b) <= k·f(n) for some constant k < 1 and sufficiently large n, then T(n) = Θ(f(n))
These cases can be understood through the recurrence tree method, in which we calculate
the total work done at each level of the tree. If the work done at the leaves is
polynomially more, then the leaves are the dominant part, and the result is the work done
at the leaves (Case 1). If the work done at the leaves and at the root is asymptotically
the same, then the result is the height multiplied by the work done at any level (Case 2).
If the work done at the root is asymptotically more, then the result is the work done at
the root (Case 3).
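The recurrence-tree view can be made concrete with a short sketch (the helper name
level_work is my own, not a standard function) that lists the work done at each level of
the tree for a recurrence of the form T(n) = aT(n/b) + n^c:

```python
def level_work(a, b, c, n):
    # Work done at each level of the recurrence tree for T(n) = a*T(n/b) + n^c.
    # Level i has a^i subproblems, each of size n/b^i, each doing (n/b^i)^c work.
    levels = []
    i = 0
    size = float(n)
    while size >= 1:
        levels.append((a ** i) * (size ** c))
        size /= b
        i += 1
    return levels
```

For Merge Sort (a = 2, b = 2, c = 1) every level does the same work, which is the Case 2
pattern; for a = 8, b = 2, c = 2 the per-level work grows toward the leaves, which is the
Case 1 pattern.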
Examples of some standard algorithms whose time complexity can be evaluated
using the Master Method
Merge Sort: T(n) = 2T(n/2) + Θ(n). It falls in Case 2, as c is 1 and Log_b(a) is also 1.
So the solution is Θ(nLogn).
Binary Search: T(n) = T(n/2) + Θ(1). It also falls in Case 2, as c is 0 and Log_b(a) is
also 0. So the solution is Θ(Logn).
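In both examples the work is a mechanical comparison of c with Log_b(a). A hypothetical
helper (master is my own name, not a library function) sketches this classification for
polynomial f(n) = Θ(n^c); Case 3 additionally assumes the regularity condition holds.

```python
from math import isclose, log

def master(a, b, c):
    # Classify T(n) = a*T(n/b) + Theta(n^c) by comparing c with Log_b(a).
    # A sketch for polynomial f(n) only; the log-factor extension is not handled,
    # and Case 3 is reported assuming regularity holds.
    crit = log(a, b)  # the critical exponent Log_b(a)
    if isclose(c, crit):
        return "Case 2: Theta(n^%g Logn)" % c
    if c < crit:
        return "Case 1: Theta(n^%g)" % round(crit, 6)
    return "Case 3: Theta(n^%g)" % c
```

For example, master(2, 2, 1) reports Case 2 for Merge Sort, and master(1, 2, 0) reports
Case 2 for Binary Search.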
Examples:
Example 1: Say you have derived the recurrence relation T(n) = 8T(n/2) + cn^2, where c is
some positive constant. This has the appropriate form for applying the master method,
with a = 8, b = 2, and f(n) = cn^2. Since cn^2 is O(n^(Log_2(8) − ε)) = O(n^(3 − ε)) for
any 0 < ε <= 1, this falls into Case 1. Therefore, T(n) is Θ(n^3).
Example 2: Say you have derived the recurrence relation T(n) = T(n/2) + cn, where c is
some positive constant. This has the appropriate form for applying the master method,
with a = 1, b = 2, and f(n) = cn. Since cn is Ω(n^(Log_2(1) + ε)) = Ω(n^ε) for any
0 < ε <= 1, this falls into Case 3. The regularity condition also holds:
a·f(n/b) = cn/2 = (1/2)f(n). Therefore, T(n) is Θ(n).
Example 3: Say you have derived the recurrence relation T(n) = 8T(n/4) + cn^(3/2), where
c is some positive constant. This has the appropriate form for applying the master
method, with a = 8, b = 4, and f(n) = cn^(3/2). Since cn^(3/2) is Θ(n^(Log_4(8))) =
Θ(n^(3/2)), this falls into Case 2. Therefore, T(n) is Θ(n^(3/2) Logn).
Notes:
It is not necessary that every recurrence of the form T(n) = aT(n/b) + f(n) can be solved
using the Master Theorem; the three cases have gaps between them. For example, the
recurrence T(n) = 2T(n/2) + n/Logn cannot be solved using the master method.
Case 2 can be extended to f(n) = Θ(n^c Log^k(n)):
If f(n) = Θ(n^c Log^k(n)) for some constant k >= 0 and c = Log_b(a), then
T(n) = Θ(n^c Log^(k+1)(n)).
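A small numeric sketch, assuming the base case T(1) = 0, illustrates the extension for
T(n) = 2T(n/2) + nLogn (a = 2, b = 2, c = 1, k = 1): the ratio T(n)/(nLog^2(n)) settles
near a constant, consistent with the predicted T(n) = Θ(nLog^2(n)).

```python
from math import log2

def T(n):
    # T(n) = 2T(n/2) + n*Log(n) with T(1) = 0, matching the extended Case 2
    # with a = 2, b = 2, c = 1, k = 1, evaluated for n a power of two.
    if n <= 1:
        return 0
    return 2 * T(n // 2) + n * log2(n)

# For n = 2^k the sum telescopes to T(n) = n*k*(k + 1)/2, so the ratio
# T(n) / (n * k^2) equals (k + 1)/(2k), which tends to the constant 1/2.
ratios = [T(2 ** k) / (2 ** k * k ** 2) for k in range(5, 15)]
```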