
The running time of the following algorithm

Procedure $A(n)$

If $n \leqslant 2$ return ($1$) else return $(A( \lceil \sqrt{n} \rceil))$;

is best described by

1. $O(n)$
2. $O(\log n)$
3. $O(\log \log n)$
4. $O(1)$
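The procedure can be translated into a minimal Python sketch (my own translation; `math.isqrt` gives the floor square root, so the ceiling is taken as `isqrt(n - 1) + 1`):

```python
import math

def A(n):
    # Base case: stop once n has shrunk to 2 or below.
    if n <= 2:
        return 1
    # Recursive case: recurse on ceil(sqrt(n)).
    return A(math.isqrt(n - 1) + 1)
```

The return value is always $1$; the interesting quantity is how many recursive calls are made before the base case is reached.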
How to solve this question?

The complexity is the number of times the recursion happens, which equals the number of times we can take the square root of $n$ repeatedly until $n$ becomes $2$.

$T(n) = T\left(\lceil \sqrt n \rceil \right) + 1$

$T(2) = 1$
$T\left(2^2\right) = T(2) + 1 = 2$
$T\left(2^{2^2}\right) = T(4) + 1 = 3$
$T\left(2^{2^3}\right) = T(16) + 1 = 4$

So, $T(n) = \lg\lg n + 1 = O\left(\log \log n\right)$
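The table above can be checked empirically by counting how many calls the recursion makes (a small sketch, not part of the original answer):

```python
import math

def depth(n):
    # Number of calls made by T(n) = T(ceil(sqrt(n))) + 1, with T(2) = 1.
    count = 1  # the initial call
    while n > 2:
        n = math.isqrt(n - 1) + 1  # ceil(sqrt(n))
        count += 1
    return count

# Matches the table: depth(2) = 1, depth(4) = 2, depth(16) = 3, depth(256) = 4.
```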
Why is it $+1$ and not some constant $c$ in the recurrence relation?
Why are we adding a constant in the recurrence relation?
Because you do a constant amount of work at every level of the recursion; any constant $c$ would give the same asymptotic bound.

Read this article once; after that you may never need to work this out again, as it becomes very intuitive:

https://stackoverflow.com/questions/16472012/what-would-cause-an-algorithm-to-have-olog-log-n-complexity

$O(\log \log n)$.

Substitute $n = 2^m$ and proceed:

$T(n) = T(\sqrt{n}) + 1$

Substituting $n = 2^m$: $T(2^m) = T\left(2^{m/2}\right) + 1$

Now substitute $S(m) = T(2^m)$:

$S(m) = S(m/2) + 1$

Applying the Master theorem gives $S(m) = O(\log m)$. Since $m = \log n$,

$T(n) = O(\log \log n)$
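The substitution step can also be checked numerically: unrolling $T(n) = T(\sqrt n) + 1$ for $n = 2^m$ takes the same number of steps as unrolling $S(m) = S(m/2) + 1$ (a sketch using floor rounding throughout; when $m$ is an exact power of two the step counts coincide exactly, otherwise rounding can shift the count by one):

```python
import math

def T_steps(n):
    # Unroll T(n) = T(sqrt(n)) + 1 down to the base case n <= 2 (floor sqrt).
    steps = 1
    while n > 2:
        n = math.isqrt(n)
        steps += 1
    return steps

def S_steps(m):
    # Unroll S(m) = S(m/2) + 1 down to the base case m <= 1 (floor halving).
    steps = 1
    while m > 1:
        m //= 2
        steps += 1
    return steps

# For n = 2^m with m a power of two, T_steps(n) == S_steps(m),
# i.e. T(n) takes O(log m) = O(log log n) steps.
```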
answered by Junior (511 points)