388 views
0 votes

What is the time complexity of this recurrence relation?

$T(n) = T(\sqrt{n}) + 1$

  1. $O(n \log n)$
  2. $O(\log \log n)$
  3. $O(n^{2})$
  4. $n^{2} \log n$

 


1 Answer

1 vote
The answer is option 2: $O(\log \log n)$.

You can solve this by back substitution: $T(n) = T(n^{1/2}) + 1 = (T(n^{1/4}) + 1) + 1 = ((T(n^{1/8}) + 1) + 1) + 1$

In general, after $k$ substitutions we get $T(n) = T(n^{\frac{1}{2^{k}}}) + k$.

We keep substituting until the recursive term is eliminated, assuming $T(2)$ to be a constant.

That happens when $n^{\frac{1}{2^{k}}} = 2$. Take $\log_2$ on both sides: $\frac{1}{2^{k}} \log n = 1$, i.e. $\log n = 2^{k}$. Take $\log_2$ once more and you get $k = \log \log n$.

Thus $T(n) = O(\log \log n)$.
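
As a quick sanity check (not part of the original answer), here is a small Python sketch that counts how many square-root steps it takes for $n$ to reach the assumed base case $n \le 2$ and compares that count with $\log_2 \log_2 n$; the function name `recursion_depth` and the test values are just illustrative.

```python
import math

# Count how many times we can take a square root before n drops to the
# assumed base case n <= 2. This mirrors the recursion T(n) = T(sqrt(n)) + 1.
def recursion_depth(n):
    steps = 0
    while n > 2:
        n = math.sqrt(n)
        steps += 1
    return steps

# Compare the measured depth with log2(log2(n)) for a few powers of two.
for e in [4, 16, 64, 256]:
    n = 2 ** e
    print(f"n = 2^{e}: depth = {recursion_depth(n)}, log log n = {math.log2(math.log2(n)):.0f}")
```

For example, with $n = 2^{256}$ the loop runs 8 times, matching $\log_2 \log_2 2^{256} = \log_2 256 = 8$.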

Related questions

1. admin asked Jan 5, 2019 (298 views, 0 votes, 1 answer)
The recurrence equation $T(n) = T(\sqrt{n}) + O(1)$ has the following asymptotic solution: $T(n) = O(\sqrt{n})$; $T(n) = O(\log n)$; $T(n) = O(n^{1/\log n})$; $T(n) = O(\log \lo...
2. admin asked Jan 5, 2019 (404 views, 0 votes, 1 answer)
An unordered list contains $n$ distinct elements. The number of comparisons to find an element in the list that is larger than the second minimum in the list is $\Theta(n ...
3. admin asked Jan 5, 2019 (461 views, 0 votes, 1 answer)
Which of the given options provides the increasing order of asymptotic complexity of functions $\text{f}_{1}, \text{f}_{2}, \text{f}_{3}$ and $\text{f}_{4}$? $\text{f}_...
4. (0 votes, 1 answer)