Design and Analysis of Algorithm
Topperworld.in
Recurrence Relation
• A recurrence relation is a mathematical equation that describes a
sequence of values, where each term in the sequence is defined in terms
of previous terms.
• In the context of Design and Analysis of Algorithms (DAA), recurrence
relations are often used to analyze the time complexity of algorithms by
expressing the time complexity of a problem of size n in terms of the time
complexity of smaller instances of the same problem.
The general form of a recurrence relation is often expressed as:
T(n)=f(n)+T(g(n))
where:
• T(n) is the time complexity of the problem of size n,
• f(n) is the time complexity of the basic operations at the current level,
• g(n) represents the size of subproblems at the next level.
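As an illustration (an assumed example, not from the original notes), the sketch below shows a recursive binary search in Python and the recurrence it gives rise to: each call does constant work and then recurses on half of the input, so T(n) = T(n/2) + c.

```python
# Hypothetical illustration: recursive binary search on a sorted list.
# Each call performs O(1) work (one comparison) and then recurses on
# half of the input, so its running time satisfies T(n) = T(n/2) + c.

def binary_search(arr, target, lo=0, hi=None):
    """Return an index of target in the sorted list arr, or -1 if absent."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:                                   # base case: empty range
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:                       # search the right half
        return binary_search(arr, target, mid + 1, hi)
    else:                                         # search the left half
        return binary_search(arr, target, lo, mid - 1)

print(binary_search([1, 3, 5, 7, 9, 11], 7))      # prints 3
```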
There are four methods commonly used to solve recurrence relations:
Substitution Method
Recursion Tree
Iterations Method
Master Theorem
❖ Substitution Method
• Guess the form of the solution.
• Use mathematical induction to prove that the guess is correct.
For example, consider the recurrence
T(n) = T(n/2) + 1
We have to show that it is asymptotically bounded by O(log n).
Solution:
For T(n) = O(log n), we have to show that for some constant c,
T(n) ≤ c log n.
Put this guess into the given recurrence equation.
Now,
T(n) ≤ c log(n/2) + 1
     = c log n - c log 2 + 1
     ≤ c log n for c ≥ 1 (taking logarithms base 2, so log 2 = 1)
Thus T(n) = O(log n).
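The following is a quick numerical sanity check (not a proof, and not part of the original notes); it assumes T(1) = 1 and integer halving, and compares T(n) with c log n for a sample constant c.

```python
import math

# Sanity check for Example 1: T(n) = T(n/2) + 1 with T(1) = 1.
# We verify numerically that T(n) <= c * log2(n) for a sample constant c.

def T(n):
    if n <= 1:
        return 1
    return T(n // 2) + 1          # one unit of work plus a half-size subproblem

c = 2
for n in (2, 16, 1024, 10**6):
    bound = c * math.log2(n)
    print(n, T(n), round(bound, 2), T(n) <= bound)
```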
Example 2: Consider the recurrence
T(n) = 2T(n/2) + n for n > 1
Find an asymptotic bound on T(n).
Solution:
We guess that the solution is O(n log n). Thus, for some constant c, we assume
T(n) ≤ c n log n.
Put this guess into the given recurrence equation.
Now,
T(n) ≤ 2c(n/2) log(n/2) + n
     = c n log n - c n log 2 + n
     = c n log n - n(c log 2 - 1)
     ≤ c n log n for c ≥ 1
Thus T(n) = O(n log n).
For this recurrence, the substitution method yields a time complexity of O(n log n); the
space complexity of the corresponding recursive algorithm is analysed separately and
often needs to account for the recursion stack.
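As a concrete illustration (assumed, not from the notes), the sketch below shows a standard merge sort in Python: it makes two half-size recursive calls and does linear work to merge, so its running time follows the recurrence T(n) = 2T(n/2) + cn of Example 2.

```python
# Merge sort: two recursive calls on halves plus a linear-time merge,
# so T(n) = 2T(n/2) + cn, which solves to O(n log n).

def merge_sort(arr):
    if len(arr) <= 1:                     # base case: constant work
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])          # T(n/2)
    right = merge_sort(arr[mid:])         # T(n/2)
    merged, i, j = [], 0, 0               # merge step: O(n) work at this level
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))        # [1, 2, 5, 7, 9]
```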
❖ Recursion Tree Method:
• Represent the recurrence relation as a tree.
• Analyze the total cost by summing up the costs of all levels of the tree.
• In this method, we draw a recurrence tree and calculate the time taken
by every level of the tree.
• Finally, we sum the work done at all levels.
• To draw the recurrence tree, we start from the given recurrence and
keep drawing till we find a pattern among levels.
• The pattern is typically an arithmetic or geometric series.
For example, consider the recurrence relation
T(n) = T(n/4) + T(n/2) + cn²

              cn²
            /     \
       T(n/4)     T(n/2)

If we further break down the expressions T(n/4) and T(n/2),
we get the following recursion tree.
Solution:

              cn²
            /      \
       cn²/16      cn²/4
       /     \      /     \
  T(n/16)  T(n/8) T(n/8)  T(n/4)

Breaking down further gives us the following:

                     cn²
                /           \
           cn²/16           cn²/4
           /     \           /     \
     cn²/256   cn²/64   cn²/64   cn²/16
      /   \     /   \    /   \     /   \
✓ To know the value of T(n), we need to calculate the sum of the tree nodes level by level.
✓ If we sum the above tree level by level, we get the following series:
  T(n) = c(n² + 5n²/16 + 25n²/256 + ...)
✓ The above series is a geometric progression with a ratio of 5/16.
✓ To get an upper bound, we can sum the infinite series.
✓ We get the sum as cn²/(1 - 5/16) = (16/11)cn², which is O(n²).
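A quick numerical check (an assumed sketch, not part of the original notes) sums the per-level costs of this tree, treating the tree as if it continued uniformly as the infinite-series bound does, and compares the total with the geometric-series bound (16/11)cn².

```python
# Level-by-level cost for T(n) = T(n/4) + T(n/2) + cn^2.
# Level k of the (idealised) tree costs (5/16)^k * c * n^2 in total, so the
# sum over all levels is at most c*n^2 / (1 - 5/16) = (16/11) * c * n^2.

c, n, levels = 1.0, 1024, 40

total = sum(c * n**2 * (5 / 16) ** k for k in range(levels))
bound = c * n**2 / (1 - 5 / 16)

print(f"sum of first {levels} levels: {total:.2f}")
print(f"geometric-series bound:      {bound:.2f}")   # total stays below the bound
```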
Example: QuickSort Algorithm
Let's apply the recursion tree method to analyze the time and space
complexity of the QuickSort algorithm.
Time Complexity:
The partitioning step in QuickSort takes O(n) time, and the algorithm splits
the array into two subarrays of approximately equal size.
The recursion tree for QuickSort has a height of log n (assuming balanced
partitions at each level).
T(n) = Number of levels × Cost per level
     = log n × O(n)
T(n) = O(n log n)
Space Complexity:
The space complexity of QuickSort is dominated by the recursion stack. With
balanced partitions, the recursion is log n levels deep and each stack frame
uses only constant space.
S(n) = Number of levels × Space per level
     = log n × O(1)
S(n) = O(log n)
The dominant term in the space complexity is log n, indicating logarithmic
growth in space usage.
So,
Time complexity is O(n log n).
Space complexity is O(log n).
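For reference, here is a minimal sketch of QuickSort in Python (an illustrative in-place version using the Lomuto partition scheme, not taken from the notes); its recursion matches the analysis above when the partitions are balanced.

```python
# QuickSort with Lomuto partitioning. Partitioning costs O(n) per level;
# with balanced splits the recursion is about log n levels deep, giving
# T(n) = O(n log n) time and O(log n) stack space.

def quicksort(arr, lo=0, hi=None):
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:                          # zero or one element: nothing to do
        return
    pivot = arr[hi]                       # choose the last element as pivot
    i = lo
    for j in range(lo, hi):               # O(n) partitioning work at this level
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]     # place the pivot in its final position
    quicksort(arr, lo, i - 1)             # recurse on the left part
    quicksort(arr, i + 1, hi)             # recurse on the right part

data = [5, 2, 9, 1, 7, 3]
quicksort(data)
print(data)                               # [1, 2, 3, 5, 7, 9]
```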
❖ Iteration Method:
• It means to expand the recurrence and express it as a summation of terms
depending on n and the initial condition.
Example 1:
Consider the recurrence
T(n) = 1          if n = 1
T(n) = 2T(n - 1)  if n > 1
Solution:
T(n) = 2T(n - 1)
     = 2[2T(n - 2)] = 2²T(n - 2)
     = 2²[2T(n - 3)] = 2³T(n - 3)
     = 2³[2T(n - 4)] = 2⁴T(n - 4)   ...(Eq. 1)
Repeating the procedure i times gives
T(n) = 2^i T(n - i)
Put n - i = 1, i.e. i = n - 1, in (Eq. 1):
T(n) = 2^(n-1) T(1)
     = 2^(n-1) · 1   {T(1) = 1, given}
     = 2^(n-1)
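A small, assumed Python check of this result: it evaluates the recurrence directly and confirms that it matches the closed form 2^(n-1) for a few values of n.

```python
# Verify the iteration-method result T(n) = 2^(n-1) for the recurrence
# T(1) = 1, T(n) = 2*T(n-1), by computing the recurrence directly.

def T(n):
    return 1 if n == 1 else 2 * T(n - 1)

for n in range(1, 11):
    assert T(n) == 2 ** (n - 1)
print("T(n) matches 2^(n-1) for n = 1..10")
```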
❖ Master Theorem:
• The Master Theorem is a tool used in the analysis of algorithms,
particularly those that follow a divide-and-conquer approach.
• It provides a straightforward way to determine the time complexity of
recursive algorithms by expressing their time recurrence relations in a
specific form.
The general form of a recurrence relation that can be solved using the
Master Theorem is: T(n) = aT(n/b) + f(n)
Here:
o T(n) is the time complexity of the algorithm for a problem of size n,
o a is the number of subproblems at each recursion level,
o b is the factor by which the problem size is reduced in each recursive
call,
o f(n) is the cost of dividing the problem and combining the results.
The Master Theorem has three cases, each of which provides a solution for the
time complexity T(n):
Case 1: If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then
T(n) = Θ(n^(log_b a)).
Case 2: If f(n) = Θ(n^(log_b a)), then
T(n) = Θ(n^(log_b a) · log n).
Case 3: If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and
a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then the
time complexity is:
T(n) = Θ(f(n)).
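As a hedged illustration (not part of the original notes), the helper below applies a simplified version of the theorem to recurrences of the restricted form T(n) = aT(n/b) + Θ(n^k), classifying them by comparing k with log_b a; the example calls cover merge sort (a = 2, b = 2, k = 1) and binary search (a = 1, b = 2, k = 0).

```python
import math

# Simplified Master Theorem for f(n) = Theta(n^k): compare k with log_b(a).
# (For polynomial f(n), the regularity condition of Case 3 holds automatically.)

def master_theorem(a, b, k):
    """Describe Theta(T(n)) for T(n) = a*T(n/b) + Theta(n^k)."""
    log_ba = math.log(a, b)
    if math.isclose(k, log_ba):          # Case 2: every level costs about the same
        return f"Theta(n^{k:g} * log n)"
    elif k < log_ba:                     # Case 1: the leaves dominate
        return f"Theta(n^{log_ba:g})"
    else:                                # Case 3: the root dominates
        return f"Theta(n^{k:g})"

print(master_theorem(2, 2, 1))   # merge sort:    Theta(n^1 * log n)
print(master_theorem(1, 2, 0))   # binary search: Theta(n^0 * log n) = Theta(log n)
print(master_theorem(8, 2, 2))   # T(n) = 8T(n/2) + n^2: Theta(n^3)
```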
The Master Theorem provides a convenient way to analyze the time
complexity of many divide-and-conquer algorithms without going through
the process of constructing a recursion tree or using the substitution method.
However, it applies only to recurrences of this specific form, and not all
recurrence relations fit the Master Theorem's framework. In such cases, other
methods such as the recursion tree, iteration, or substitution method can be used.