
What is the time complexity of the following recursive function?

int DoSomething(int n) {
    if (n <= 2)
        return 1;
    else
        return (DoSomething(floor(sqrt(n))) + n);
}

1. $\Theta(n^2)$

2. $\Theta(n \log_2n)$

3. $\Theta(\log_2n)$

4. $\Theta(\log_2\log_2n)$

0

We are asked for the time complexity, which here is just the number of recursive calls, since each call performs only a constant number of operations besides the recursive call. The recurrence relation for this (taking the constant time $c$ as $1$) is

$T(n) = T(\sqrt n) + 1$

$\qquad= T\left(n^{1/4}\right) + 2$
$\qquad= T\left(n^{1/8}\right) + 3$

Going like this, after $k$ steps we have $T(n) = T\left(n^{1/2^k}\right) + k$, and we eventually reach $T(3)$ or $T(2)$. For the asymptotic case this doesn't matter, and we can assume we reach $T(2)$ and in the next step reach $T(1)$. So, all we want to know is how many steps it takes to reach $T(1)$, which is $1 + {}$the number of steps to reach $T(2)$.

From the recurrence relation we know that $T(2)$ happens when $n^{\left(\frac{1}{2^k}\right)} = 2$.

Taking $\log$ and equating,

$\frac{1}{2^k} \log n = 1$
$\implies 2^k = \log n$
$\implies k = \log \log n$.

So, $T(1)$ happens in $\log \log n + 1$ calls, and for asymptotic complexity we can write this as $\Theta \left( \log \log n\right)$.

Alternatively, substituting values:

$T(1) = 1$
$T(2) = 1$
$T(3) = T(1) + 1 = 2$
$\vdots$
$T(8) = T(2) + 1 = 2$
$T(9) = T(3) + 1 = 3$
$\vdots$

$T\left(\left({\left({2^2}\right)^2}\right)^2\right) = T\left(\left({2^2}\right)^2\right) + 1$
$\quad= T(2^2)+ 2$
$\quad= T(2) + 3 = 1 + 3 = 4,$
$\log \log n = 3 \text{ as } n = 256$.

$T\left(\left({\left({\left({\left({2^2}\right)^2}\right)^2}\right)^2}\right)^2\right) = 6,$
$\log \log n = 5 \text{ as } n = 65536 \times 65536 = 2^{32}$

$T\left(2^{(2^{10})}\right) = T\left(2^{512}\right) + 1$
$\quad= T(2^{256}) + 2$
$\quad= T(2^{128}) + 3$
$\quad= T(2^{64}) + 4$
$\quad= T(2^{32}) + 5$
$\quad= T(2^{16}) + 6$
$\quad= T(2^8) + 7$
$\quad= T(2^4) + 8$
$\quad= T(2^2) + 9$
$\quad= T(2) + 10 = 11,$
$\log \log n = 10$

http://stackoverflow.com/questions/16472012/what-would-cause-an-algorithm-to-have-olog-log-n-complexity
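As a quick sanity check, here is a minimal instrumented sketch of the function (the call counter, test values, and printing are my additions, not part of the original question) that counts the recursive calls and compares them against $\log_2 \log_2 n$:

#include <math.h>
#include <stdio.h>

static int calls; /* number of recursive calls during one top-level invocation */

int DoSomething(int n) {
    calls++;
    if (n <= 2)
        return 1;
    else
        return (DoSomething((int)floor(sqrt((double)n))) + n);
}

int main(void) {
    int tests[] = {4, 16, 256, 65536};
    for (int i = 0; i < 4; i++) {
        calls = 0;
        DoSomething(tests[i]);
        /* the derivation above predicts calls = log2(log2(n)) + 1 */
        printf("n = %5d: calls = %d, log log n = %g\n",
               tests[i], calls, log2(log2((double)tests[i])));
    }
    return 0;
}

Every test prints one more call than $\log_2 \log_2 n$, matching the $\log \log n + 1$ count derived above.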

+3

Doubt: why is the recurrence $T(n) = T(\sqrt n) + 1$ and not $T(n) = T(\sqrt n) + n$?

+31
We are asked to find the time complexity, not the return value of the function. Time complexity depends on what happens inside the code with respect to $n$. For $n > 2$, we have a recursive call on $\sqrt n$ and then an addition plus some assignments and conditional checks, all independent of $n$. So these all constitute $O(1)$. Ideally I should have used $c$ there instead of $1$, but using $1$ makes the calculation easy and is fine for asymptotic complexity.

If the question were about the return value, we would have to use $n$ instead of $1$.
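
For contrast, a quick sketch of that return-value recurrence (the name $R(n)$ is my notation, not from the thread):

$R(n) = R\left(\lfloor \sqrt n \rfloor\right) + n \text{ for } n > 2, \qquad R(n) = 1 \text{ otherwise,}$
$\text{which gives } R(n) = n + n^{1/2} + n^{1/4} + \dots = \Theta(n).$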
0
But the function is also returning a value via the recursion. So why are we not using $n$?
0
Great explanation, sir.
+2
Because $n$ is a value, not time.
0

@Arjun Sir

After the first paragraph of your answer, how do you write "$+1$" at the end of the first line, $T(n) = T(\sqrt n) + 1$?

The question says "$+n$":

return (DoSomething (floor (sqrt(n))) + n);
0
Shouldn't the equation be $T(n) = T(\sqrt n) + n$, as in the return statement?
0
I read your explanation but I am still not clear with it...

What does that single $n$ stand for?
+1
We are using $1$ instead of $n$ because there is no computation associated with $n$ in $DoSomething(\lfloor\sqrt n\rfloor) + n$; i.e., we can obtain the value of $n$ in $O(1)$ time.

The recurrence relation for DoSomething() is

$T(n) = T(\sqrt n) + C_1 \text{ if } n > 2$

We have ignored the floor() part, as it doesn't matter here whether it's a floor or a ceiling.

Let $n = 2^m$, so $T(n) = T(2^m)$, and let $T(2^m) = S(m)$.

From the above two, $T(n) = S(m)$. Since $\sqrt{2^m} = 2^{m/2}$,

$S(m) = S(m/2) + C_1$ (this is simply the binary search recurrence)
$S(m) = O(\log m) = O(\log \log n)$ (since $n = 2^m$, i.e., $m = \log n$)

Now, let us go back to the original recursive function $T(n)$:

$T(n) = S(m) = O(\log \log n)$
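
To see where the $O(\log m)$ comes from, the recurrence can be unrolled directly (a standard expansion; each step halves $m$):

$S(m) = S(m/2) + C_1 = S(m/4) + 2C_1 = \dots = S(1) + C_1 \log_2 m = O(\log m).$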

0

Isn't the recurrence

$T(n) = T(\sqrt n) + n$ for $n > 2$, as there is a "$+\,n$" in the else part?
0
Why did you take $C_1$ and not $n$?
+2
Because $C_1$ is some constant: the addition will take constant time compared to $T(n)$.
0
Thanks, got it.

$T(n) = T(\sqrt n) + c$

Let $n = 2^m$, hence $m = \log n$.
$T(2^m) = T\left(\sqrt{2^m}\right) + c = T(2^{m/2}) + c$
Let $T(2^m) = S(m)$.

Now $S(m) = S(m/2) + c$.
Apply the Master theorem:
$a = 1$, $b = 2$, $k = 0$, $p = 0$
Since $a = b^k$ and $p > -1$,
$S(m) = \Theta(\log m)$.
Hence it is $\Theta(\log \log n)$ (since $m = \log n$).
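
For reference, this is the extended Master theorem case for recurrences of the form $S(m) = a\,S(m/b) + \Theta\left(m^{k}\log^{p} m\right)$ (my paraphrase of the standard statement, not part of the original answer):

$\text{If } a = b^{k} \text{ and } p > -1, \text{ then } S(m) = \Theta\left(m^{k}\log^{p+1} m\right).$
$\text{Here } a = 1,\, b = 2,\, k = 0,\, p = 0, \text{ so } S(m) = \Theta(\log m).$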

0
@Ahwan, why did you take "$c$" instead of "$n$"?
+1
It is calling itself and doing constant work. Apart from calling itself, is it doing anything of $O(n)$?
The program is not going to end when started with n > 2. A suitable base condition is required for the program to stop.

Time complexity for this program should be infinity.
+1 vote
The recurrence relation for this is the following:

$T(n)=T(n^{1/2})+n \qquad (1)$

Putting $n^{1/2}$ in place of $n$,

we get,

$T(n^{1/2})=T(n^{1/4})+n^{1/2}$

Putting this in (1), we get

$T(n)=T(n^{1/4})+n^{1/2}+n$

similarly,

$T(n)=T\left(n^{1/2^{k}}\right)+n^{1/2^{k-1}}+\dots+n$

This will stop when  $n^{1/2^{k}}=2$

Taking $\log$ base $2$ on both sides, we get

$\frac{1}{2^{k}} \log n = \log 2$
$\implies \log n = 2^{k}$
$\implies \log \log n = k$

$T(n) = O(\log \log n)$

This is my solution. Please let me know if there is any mistake.

0
$4 = 2\log 2 = \log\log 2$ ???
0
Applying $\log\log n + n$ for $T(16)$, the answer comes to $18$, not $21$... please can you explain?
