We are asked for the time complexity, which is proportional to the number of recursive calls, since each call performs a constant number of operations plus one recursive call. The recurrence relation for this is (taking the constant time $c$ as $1$)
$T(n) = T(\sqrt n) + 1$
$\qquad= T\left(n^{1/4}\right) + 2 $
$\qquad=T\left(n^{1/8}\right) + 3 $
Continuing like this, after $k$ steps the argument becomes $n^{1/2^k}$, and we will eventually reach $T(3)$ or $T(2)$. Asymptotically this doesn't matter, so we can assume we reach $T(2)$ and, in one more step, $T(1)$. All we need to know, then, is how many steps it takes to reach $T(1)$, which is $1$ plus the number of steps needed to reach $T(2)$.
From the recurrence relation we know that $T(2)$ happens when $n^{\left(\frac{1}{2^k}\right)} = 2$.
Taking $\log$ (base $2$) of both sides and equating,
$\frac{1}{2^k} \log n = 1$
$\implies 2^k = \log n$
$\implies k = \log \log n$.
So, $T(1)$ is reached in $\log \log n + 1$ calls; asymptotically, the complexity is $\Theta \left( \log \log n\right)$.
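This closed form can be sanity-checked with a short loop that mirrors the unrolled recurrence (a sketch; the function name is mine, and I use the integer square root so the argument stays a whole number):

```python
import math

def calls_to_T1(n):
    # Count steps in the unrolled recurrence: take square roots
    # until we reach 2, then one final step from T(2) to T(1).
    steps = 0
    while n > 2:
        n = math.isqrt(n)
        steps += 1
    return steps + 1

# For n = 2^(2^k) we have log log n = k, so the count should be k + 1.
for k in range(1, 11):
    assert calls_to_T1(2 ** (2 ** k)) == k + 1
```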
Alternatively, substituting values:
$T(1) = 1$
$T(2) = 1$
$T(3) = T(1) + 1 = 2$
$\vdots$
$T(8) = T(2) + 1 = 2$
$T(9) = T(3) + 1 = 3$
$\vdots$
$T\left(\left({\left({2^2}\right)^2}\right)^2\right) = T\left(\left({2^2}\right)^2\right) + 1$
$\quad= T(2^2) + 2$
$\quad= T(2) + 3 = 1 + 3 = 4,$
$\log \log n = 3 \text{ as } n = 256$.
$T\left(\left({\left({\left({\left({2^2}\right)^2}\right)^2}\right)^2}\right)^2\right) = 6,$
$\quad \log \log n = 5 \text{ as } n = 65536 \times 65536 = 2^{32}$
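These substituted values can be reproduced with a direct model of the recurrence (a sketch; the function name is mine, and `math.isqrt` gives the floor of the square root, matching the integer arguments above):

```python
import math

def T(n):
    # Model of T(n) = T(sqrt(n)) + 1 with base cases T(1) = T(2) = 1,
    # using the integer (floor) square root.
    if n <= 2:
        return 1
    return T(math.isqrt(n)) + 1

for n in (1, 2, 3, 8, 9, 256, 2 ** 32):
    print(n, T(n))   # matches the values above: 1, 1, 2, 2, 3, 4, 6
```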
$T\left(2^{(2^{10})}\right) = T\left(2^{512}\right) + 1 $
$\quad= T(2^{256}) + 2$
$\quad= T(2^{128}) + 3$
$\quad= T(2^{64}) + 4$
$\quad= T(2^{32}) + 5$
$\quad= T(2^{16}) + 6$
$\quad= T(2^8) + 7$
$\quad= T(2^4) + 8$
$\quad= T(2^2) + 9$
$\quad= T(2) + 10 = 11,$
$\quad \log \log n = 10$
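The same chain for $n = 2^{2^{10}}$ can be walked through step by step (a sketch; Python's arbitrary-precision integers handle $2^{1024}$ without trouble):

```python
import math

# Walk the square-root chain down from n = 2^(2^10) = 2^1024.
n = 2 ** (2 ** 10)
steps = 0
while n > 2:
    n = math.isqrt(n)   # 2^1024 -> 2^512 -> ... -> 2^2 -> 2
    steps += 1
print(steps + 1)         # 11, i.e. log log n + 1 with log log n = 10
```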
So, the answer is D.
http://stackoverflow.com/questions/16472012/what-would-cause-an-algorithm-to-have-olog-log-n-complexity