# GATE2004-83, ISRO2015-40


The time complexity of the following C function is (assume $n > 0$)

```c
int recursive(int n) {
    if (n == 1)
        return (1);
    else
        return (recursive(n - 1) + recursive(n - 1));
}
```
1. $O(n)$
2. $O(n \log n)$
3. $O(n^2)$
4. $O(2^n)$


The answer is (D) $O(2^n)$.

```c
int recursive(int n) {
    if (n == 1)       // takes constant time, say 'a'
        return (1);   // takes constant time, say 'a'
    else
        // takes T(n-1) + T(n-1) time
        return (recursive(n - 1) + recursive(n - 1));
}
```


$T(n) = 2T(n - 1) + a$ is the recurrence relation obtained from the code above. Note: $a$ is the constant $O(1)$ cost of the non-recursive part of the function.

Solving the recurrence by back substitution:

\begin{align} T(n) &= 2T(n - 1) + a \\[1em] T(n - 1) &= 2T(n - 2) + a \\[1em] T(n - 2) &= 2T(n - 3) + a \\[1em] &\vdots \end{align}

Thus, we can re-write the equation for $T(n)$ as follows

\begin{align*} T(n) &= 2 \Bigl [ 2T(n - 2) + a \Bigr ] + a = 4T(n - 2) + 2a + a \\[1em] &= 4 \Bigl [ 2T(n - 3) + a \Bigr ] + 2a + a = 8T(n - 3) + 4a + 2a + a \\[1em] &\;\;\vdots \\[1em] &= 2^k\, T(n - k) + \bigl( 2^k - 1 \bigr) a \end{align*}

On substituting the limiting condition: the recursion stops at the base case $T(1) = 1$, so

$$n - k = 1 \implies k = n - 1$$

Therefore, substituting $k = n - 1$, our solution becomes

$$T(n) = 2^{n - 1}\, T(1) + \Bigl ( 2^{n - 1} - 1 \Bigr ) a = 2^{n - 1} + \Bigl ( 2^{n - 1} - 1 \Bigr ) a = O(2^n)$$

You explain well - all points included :)

HTML spacing for alignment is bad: even if we align the spaces properly, it breaks when the screen resolution changes. LaTeX is better :)
Thank you sir :) . I will try to improve next time.
It will be $(2^k) - 1$.

It's similar to the Tower of Hanoi problem:

$$T(n)=2T(n-1) + 1$$
$T(1) = 1$

$T(2) = 2 \cdot 1 + 1 = 3$

$T(3) = 2 \cdot 3 + 1 = 7$

$T(4) = 2 \cdot 7 + 1 = 15$

$\vdots$

We can see a pattern forming: $T(n) = 2^n - 1$, so the answer is (D), $O(2^n)$.
@Bhagirathi: why is there a $+1$ after $2T(n-1)$?

@Gate Mm:

$T(n) = 2T(n-1) + 1$. Here the $+1$ is for the base condition `if (n == 1) return 1;`, which takes constant time.

No, it's not for that. The $+1$ in the recurrence signifies the work done to divide the problem into two subproblems.

The base condition `if (n == 1) return 1;` is used only at the end.
@amitatgateoverflow is correct.

The $1$, or some constant term, denotes the work done to divide the original problem and to combine the solutions of the subproblems into the final answer.

The recurrence relation from the code is :

$$T(n) = 2T(n-1) + 1$$

The above recurrence can be solved easily with the subtract-and-conquer master theorem.

Here $a = 2$, $b = 1$, $d = 0$.

Since $a > 1$, the third case applies:

$$O(n^d \cdot a^{n/b}) = O(n^0 \cdot 2^{n/1}) = O(2^n)$$

So the answer is option (D).
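For reference, the subtract-and-conquer master theorem as commonly stated (this statement is our addition, not part of the original answer): for $T(n) = a\,T(n-b) + f(n)$ with constants $a > 0$, $b > 0$, and $f(n) = O(n^d)$ with $d \geq 0$,

$$T(n) = \begin{cases} O(n^d) & \text{if } a < 1 \\ O(n^{d+1}) & \text{if } a = 1 \\ O\!\left(n^d\, a^{n/b}\right) & \text{if } a > 1 \end{cases}$$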

How do we know the value of $d$?
Read the theorem statement again and you will see: $f(n)$ is in $O(n^d)$. Here $f(n)$ is a constant, i.e. $O(n^0)$, so $d = 0$.

Sir, can we solve it by finding the homogeneous and particular solutions?

The answer by that method is $2^n - 1$. Is that correct?
Yes, you can do it that way too, but it would be a lengthier way to solve it. A single-term recurrence relation is best solved by substitution or the recursion-tree method.

Another way to visualize this problem.

$F(2) = 2$? How?
@Puja Mishra You can find that easily by putting $n = 2$ in the code. The `return` in the `else` block gives $recursive(1) + recursive(1) = 1 + 1 = 2$.
Another approach is to solve the recurrence via its homogeneous and particular solutions:

$T(n)=2T(n-1)+1$

$a_{n}=2a_{n-1}+1$

First solve the homogeneous part:

$a_{n}=2a_{n-1}$

$r^n=2r^{n-1}$

$\dfrac{r^n}{r^{n-1}}=2$

$r=2$

$a_{n}^{(h)}=d(2)^n...........(1)$

Now find a particular solution. Since the non-homogeneous term is a constant, try

$a_{n}^{(p)}=p_{0}$

$a_{n}=2a_{n-1}+1$

$a_{n}-2a_{n-1}=1$

$p_{0}-2(p_{0})=1$

$p_{0}=-1$

$a_{n}^{(p)}=-1...........(2)$

Adding $(1)$ and $(2)$:

$a_{n}=d(2^n)-1............(3)$

Given $a_{1} = 1$, substituting in $(3)$:

$1 = 2d - 1 \implies d = 1$

$a_{n} = 1 \cdot 2^n - 1 = 2^n - 1$

$T(n)=O(2^n)$

