+22 votes
4.1k views

What is the $\text{time complexity}$ of the following recursive function?

int  DoSomething (int n) {
    if (n <= 2)
        return 1;
    else 
        return (DoSomething (floor (sqrt(n))) + n);
}
  1. $\Theta(n^2)$

  2. $\Theta(n \log_2n)$

  3. $\Theta(\log_2n)$

  4. $\Theta(\log_2\log_2n)$

asked in Algorithms by Veteran (59.4k points)
edited by
0
Please help me to understand this concept!!!

5 Answers

+36 votes
Best answer

We are asked for the time complexity, which here is proportional to the number of recursive calls, since each call performs only a constant number of operations besides the recursive call. The recurrence relation (taking the constant time "$c$" as $1$) is

$T(n) = T(\sqrt n) + 1 \\ = T\left(n^{1/4}\right) + 2 \\ = T\left(n^{1/8}\right) + 3 \\ = \dots$

Going like this we will eventually reach $T(3)$ or $T(2)$. For the asymptotic case this doesn't matter, and we can assume we reach $T(2)$ and in the next step reach $T(1)$. So, all we want to know is how many steps it takes to reach $T(1)$, which will be $1 + $ the number of steps to reach $T(2)$.

From the recurrence relation we know that $T(2)$ happens when $n^{\left(\frac{1}{2^k}\right)} = 2$.

Taking $\log $ and equating,

$\frac{1}{2^k} \log n = 1 \\2^k = \log n \\k = \log \log n$. 

So, $T(1)$ happens in $\log \log n + 1$ calls, but for asymptotic complexity we can write this as $\Theta \left( \log \log n\right)$.
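As a quick sanity check, here is a small standalone C program (a sketch; the call counter and the main() driver are additions for illustration, not part of the original question) that counts the recursive calls and prints them next to $\log_2 \log_2 n$:

#include <stdio.h>
#include <math.h>

/* compile with: cc demo.c -lm */

static int calls;   /* number of times DoSomething is entered */

int DoSomething (int n) {
    calls++;
    if (n <= 2)
        return 1;
    else
        return (DoSomething (floor (sqrt (n))) + n);
}

int main (void) {
    int tests[] = {16, 256, 65536, 2000000000};
    for (int i = 0; i < 4; i++) {
        calls = 0;
        DoSomething (tests[i]);
        printf ("n = %10d  calls = %2d  log2(log2(n)) = %.2f\n",
                tests[i], calls, log2 (log2 ((double) tests[i])));
    }
    return 0;
}

The call count grows by one each time $n$ is squared, matching the $\Theta(\log \log n)$ bound.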


Alternatively,

Substituting values

$T(1) = 1$
$T(2) = 1$
$T(3) = T(1) + 1 = 2$
$\dots$
$T(8) = T(2) + 1 = 2$
$T(9) = T(3) + 1 = 3$
$\dots$


$T\left(\left({\left({2^2}\right)^2}\right)^2\right) = T\left(\left({2^2}\right)^2\right) + 1 \\= T(2^2)+ 2 \\= T(2) + 3 = 1 + 3 = 4, \\ \log \log n = 3 \text{ as } n = 256$.


$T\left(\left({\left({\left({\left({2^2}\right)^2}\right)^2}\right)^2}\right)^2\right) = 6,\\ \log \log n = 5 \text{ as } n = 65536 \times 65536 = 2^{32}$

$T\left(2^{(2^{10})}\right) = T\left(2^{512}\right) + 1 \\= T(2^{256}) + 2 \\= T(2^{128}) + 3\\ = T(2^{64}) + 4 \\= T(2^{32}) + 5 \\= T(2^{16}) + 6 \\= T(2^8)+7 \\= T(2^4) + 8 \\= T(2^2) + 9 \\= T(2) + 10 = 11,\\ \log \log n = 10$
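The pattern in these substitutions can be stated in one line: taking a square root turns $2^{2^k}$ into $2^{2^{k-1}}$, so starting from $n = 2^{2^k}$ we reach $T(2)$ after $k$ steps, giving

$T\left(2^{2^k}\right) = k + 1 = \log \log n + 1$.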

So, the answer is (D) $\Theta(\log_2\log_2 n)$.

http://stackoverflow.com/questions/16472012/what-would-cause-an-algorithm-to-have-olog-log-n-complexity

answered by Veteran (342k points)
edited by
doubt
+3

Why is the recurrence not $T(n) = T(\sqrt n) + n$?

+27
We are asked to find the time complexity, not the return value of the function. Time complexity depends on what happens inside the code w.r.t. $n$. For $n > 2$, we have a recursive call on $\sqrt n$ and then an addition, some assignments and conditional checking, all independent of $n$. So, these all constitute $O(1)$. Ideally I should have used $c$ there instead of $1$, but using $1$ makes the calculation easy and is fine for asymptotic complexity.

If the question were about the return value, we would have to use $n$ instead of $1$.
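For comparison, a quick sketch of that other recurrence (the name $V$ is introduced here just for illustration): tracking the return value instead of the time would give

$V(n) = V(\sqrt n) + n = n + \sqrt n + n^{1/4} + \dots = \Theta(n)$,

which describes the value the function computes, not the number of steps it takes.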
0
But the function is also returning a value through the recurrence. So why are we not using $n$?
0
great explanation sir
+2
Because $n$ is a value, not time.
0

@Arjun Sir

After the first paragraph of your answer, how did you get the "$+1$" at the end? The first line is $T(n) = T(\sqrt n) + 1$,

but in the question it says "$+n$":

return (DoSomething (floor (sqrt(n))) + n);

 

0
Shouldn't the equation be $T(n) = T(\sqrt n) + n$, as they have in the return statement?
+1
Please read the above comments.
0
I read your explanation but I am still not clear with it...

What does that single $n$ stand for?
0
We are using $1$ instead of $n$ because there is no computation associated with $n$ in $DoSomething(f(n)) + n$; i.e., the addition of $n$ takes $O(1)$ time.
+11 votes

The recurrence relation for DoSomething() is

  T(n) = T(√n) + C1, if n > 2  

We have ignored the floor() part as it doesn't matter here whether it is a floor or a ceiling.

  Let n = 2^m, so T(n) = T(2^m)
  Let S(m) = T(2^m)

  From the above two, T(n) = S(m)

  S(m) = S(m/2) + C1   /* This is simply the binary search recurrence */
  S(m) = O(log m)
       = O(log log n)  /* Since m = log n */

  Now, let us go back to the original recursive function T(n):
  T(n) = S(m)
       = O(log log n)
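For reference, this recurrence unrolls as follows (a quick sketch, assuming S(1) is a constant):

  S(m) = S(m/2) + C1
       = S(m/4) + 2*C1
       = ...
       = S(1) + C1 * log2(m)
       = O(log m)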
answered by Active (4.1k points)
0

Isn't the recurrence

T(n) = T(√n) + n, if n > 2

as there is a '+n' in the else part?
0
Sorry, I got the answer from Arjun Sir's comments. Thanks.
0
The addition operation takes constant time.
0
Why did you take C1 and not n?
+1
Because C1 is some constant; the addition takes constant time compared to the recursive call.
0
Thanks, got it.
+7 votes

T(n) = T(√n) + c

Let n = 2^m, hence m = log n.
T(2^m) = T(√(2^m)) + c = T(2^(m/2)) + c
Let S(m) = T(2^m).

Now S(m) = S(m/2) + c.
Apply the Master theorem (extended form):
a = 1, b = 2, k = 0, p = 0
Since a = b^k and p > -1,
S(m) = Θ(log m).
Hence it is Θ(log log n) (since m = log n).
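For reference, the case being applied is the extended Master theorem for S(m) = a·S(m/b) + Θ(m^k · log^p m): when a = b^k and p > -1, S(m) = Θ(m^k · log^(p+1) m), which here gives Θ(m^0 · log^1 m) = Θ(log m).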

answered by Boss (10.3k points)
0
@Ahwan, why did you take "c" instead of "n"?
0
It is calling itself and doing constant work. Apart from calling itself, is it doing anything that takes O(n)?
+1 vote
The program is not going to end when started with n > 2. A suitable base condition is required for the program to stop.

Time complexity for this program should be infinity.
answered by (305 points)
0 votes

This is my solution. Please let me know if there is any mistake.

answered by Active (1.1k points)
0
4=2log2= loglog2 ???

