in Algorithms edited by
22,603 views
45 votes

What is the $\text{time complexity}$ of the following recursive function?

int  DoSomething (int n) {
    if (n <= 2)
        return 1;
    else 
        return (DoSomething (floor (sqrt(n))) + n);
}
  1. $\Theta(n^2)$

  2. $\Theta(n \log_2n)$

  3. $\Theta(\log_2n)$

  4. $\Theta(\log_2\log_2n)$


4 Comments

int  DoSomething (int n) {
    if (n <= 2)
        return 1;
    else 
        return (DoSomething (floor (sqrt(n))) + n);
}

If we want our recurrence relation to be

T(n) = T(sqrt(n)) + n

can anyone suggest what manipulation in the else branch gives me the recurrence I wanted?

0
@ASNR1010

The recurrence relation is:

T(n) = T(sqrt(n)) + 1

because in a recurrence the time complexity depends on the total number of calls, not on the value added to the result.

Suppose the code were:

int DoSomething (int n) {
    if (n <= 2)
        return 1;
    for (int i = 0; i < n; i++)   /* placed before the return so it is reachable */
        printf ("*");
    return (DoSomething (floor (sqrt(n))) + n);
}

In that case the recurrence is:

T(n) = T(sqrt(n)) + n   (n because in each call the loop executes n times)
6
It is a pretty easy question considering the level of GATE; more suitable for exams like ISRO and UGC.
0

8 Answers

82 votes
Best answer

We are asked for the time complexity, which here is proportional to the number of recursive calls, since in each call we perform a constant number of operations plus one recursive call. The recurrence relation for this is (taking the constant time "$c$" as $1$)

$T(n) = T(\sqrt n) + 1$

$\qquad= T\left(n^{1/4}\right) + 2 $
$\\\qquad=T\left(n^{1/8}\right) + 3 $

Going on like this we will eventually reach $T(3)$ or $T(2)$. For the asymptotic case this doesn't matter, and we can assume we reach $T(2)$ and in the next step reach $T(1)$. So, all we want to know is how many steps it takes to reach $T(1)$, which will be $1$ plus the number of steps to reach $T(2)$.

From the recurrence relation we know that $T(2)$ happens when $n^{\left(\frac{1}{2^k}\right)} = 2$.

Taking $\log $ and equating,

$\frac{1}{2^k} \log n = 1$
$ \\\implies 2^k = \log n$
$ \\\implies k = \log \log n$. 

So, $T(1)$ happens in $\log \log n + 1$ calls, but for asymptotic complexity we can write this as $\Theta \left( \log \log n\right)$.


Alternatively,

Substituting values

$T(1) = 1$
$T(2) = 1$
$T(3) = T(1) + 1 = 2$
$\vdots$
$T(8) = T(2) + 1 = 2$
$T(9) = T(3) + 1 = 3$
$\vdots$

$T\left(\left({\left({2^2}\right)^2}\right)^2\right) = T\left(\left({2^2}\right)^2\right) + 1$
$ \\\quad= T(2^2)+ 2$
$ \\\quad= T(2) + 3 = 1 + 3 = 4, $
$\\ \log \log n = 3 \text{ as } n = 256$.


$T\left(\left({\left({\left({\left({2^2}\right)^2}\right)^2}\right)^2}\right)^2\right) = 6,$
$\\\quad \log \log n = 5 \text{ as } n = 65536 \times 65536 = 2^{32}$

$T\left(2^{(2^{10})}\right) = T\left(2^{512}\right) + 1 $
$\\\quad= T(2^{256}) + 2 $
$\\\quad= T(2^{128}) + 3$
$\\\quad = T(2^{64}) + 4 $
$\\\quad= T(2^{32}) + 5 $
$\\\quad= T(2^{16}) + 6 $
$\\\quad= T(2^8)+7 $
$\\\quad= T(2^4) + 8 $
$\\\quad= T(2^2) + 9$
$\\\quad= T(2) + 10 = 11,$
$\\\quad \log \log n = 10$

So, answer is D.

http://stackoverflow.com/questions/16472012/what-would-cause-an-algorithm-to-have-olog-log-n-complexity


15 Comments

Doubt: the answer uses $T(n)=T(\sqrt n)+1$. Why is the recurrence not $T(n)=T(\sqrt n)+n$?

11
We are asked to find the time complexity- not the return value of function. Time complexity depends on what happens inside the code w.r.t. $n$. For $n > 2$, we have a recursive call on $\sqrt n$ and then an addition and some assignments and conditional checking - all independent of $n$. So, these all constitute $O(1)$. Ideally I should have used $c$ there instead of $1$ but using $1$ makes calculation easy and is fine for asymptotic complexity.

If the question were about the return value, we would have to use $n$ instead of $1$.
61
But the function is also returning a value via the recurrence. So why are we not using $n$?
0
great explanation sir
0
Because $n$ is a value, not time.
6

@Arjun Sir

after the first paragraph of your answer, how do you write "$+1$" at the end? The first line is $T(n)=T(\sqrt n)+1$,

but in the question it says "$+n$":

return (DoSomething (floor (sqrt(n))) + n);
0
Shouldn't the equation be $T(n) = T(\sqrt n) + n$, as they have in the return statement?
0
Please read the above comments.
2
I read your explanation but I am still not clear on it...

What does that single $n$ stand for?
0
We are using $1$ instead of $n$ because there is no computation that depends on $n$ in $DoSomething(\sqrt n) + n$; the addition of $n$ takes $O(1)$ time.
2

@Arjun sir, why have we not included the time to compute sqrt(n) in the recurrence relation?

0
I guess a lot of people like me didn't notice that the 'n' in the return statement is outside the DoSomething recursive call, and hence the confusion.
5
@arjun sir thanks for this beautiful explanation
0

@aspi_r_osa

I made the same mistake of taking the recurrence relation for time as T(n) = T(sqrt(n)) + n, which would give the answer as theta(n).

But this is wrong: the addition takes only constant time, and that n is there only to create confusion. Treat it as a constant; it is not related to the function that is being called recursively.

So the correct recurrence relation for time is T(n) = T(sqrt(n)) + c, by solving which we get

theta(log log n).

Hope it is clear now!

0
31 votes

The recurrence relation for DoSomething() is

  T(n) = T(√n) + C1, if n > 2  

We have ignored the floor() part, as it doesn't matter here whether it's a floor or a ceiling.

  Let n = 2^m, so T(n) = T(2^m)
  Let T(2^m) = S(m)

  From the above two, T(n) = S(m)

  S(m) = S(m/2) + C1  /* This is simply the binary-search recurrence */
  S(m) = O(log m)
       = O(log log n)  /* Since n = 2^m, i.e., m = log n */

  Now, let us go back to the original recursive function T(n):
  T(n) = S(m) = O(log log n)

4 Comments

thanks got it
0

If it had been

T(n) = T(√n) + n

it would mean that to solve a problem of size n we solve one of size √n and additionally do O(n) work, i.e., iterate over n items, which is not the case here. So the non-recursive part requires only c1 time (simple statements take O(1)).
0
What is the n here? How is it 2^m? Can someone please explain?
0
20 votes

$T(n) = T(\sqrt n) + c$

Let $n = 2^m$, hence $m = \log n$.
$T(2^m) = T(\sqrt{2^m}) + c = T(2^{m/2}) + c$
Let $T(2^m) = S(m)$.

Now $S(m) = S(m/2) + c$.
Apply the extended Master theorem:
$a = 1, b = 2, k = 0, p = 0$.
Since $a = b^k$ and $p > -1$,
$S(m) = \Theta(\log m)$.
Hence it is $\Theta(\log \log n)$ (since $m = \log n$).

by

4 Comments

@Ahwan why did you take "c" instead of "n"?
0
0
It calls itself and does constant work. Apart from calling itself, is it doing anything that is O(n)?
1

  T(2^m) = S(m)

  S(m) = S(m/2) + c

How did this come about? I don't want to memorize it.

0
Here we have $p > -1$, and the case for that is $T(n) = \Theta(n^{\log_b a} \log^{p+1} n)$.
0
4 votes

return (DoSomething (floor (sqrt(n))) + n);

Here the "+ n" takes constant time, so

$T(n) = T(\sqrt n) + 1$

$T(n) = T(n^{1/2}) + 1$ ... (A)

Putting $n^{1/2}$ in place of $n$, we get

$T(n^{1/2}) = T(n^{1/4}) + 1$ ... (1)

Now put (1) into (A):

$T(n) = T(n^{1/4}) + 1 + 1$

By doing more of this back-substitution, we get the generalized equation

$T(n) = T\left(n^{1/2^i}\right) + \sum_{1}^{i+1}1$ ... (F)

The recursion stops when $n^{1/2^i} = 2$. Taking $\log$ on both sides:

$\frac{1}{2^i}\log n = 1 \implies \log n = 2^i$

Taking $\log$ again on both sides:

$\log\log n = i$

$\sum_{1}^{i+1}1 = i+1$

Putting the value of $i$ into (F), we get time complexity $= O(\log\log n)$.

2 Comments

Thanks
0
How can you tell that n is constant, when it varies in every iteration?

I am not able to understand that point. Can you please elaborate?
1