
The running time of the following algorithm

Procedure $A(n)$

If $n \leqslant 2$ return ($1$) else return $(A( \lceil \sqrt{n} \rceil))$;

is best described by

1. $O(n)$
2. $O(\log n)$
3. $O(\log \log n)$
4. $O(1)$
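A direct way to get intuition is to run the procedure and count the recursive calls (a minimal Python sketch; the `depth` counter is instrumentation I added, not part of the question):

```python
import math

def A(n, depth=0):
    """The procedure from the question, instrumented to count recursive calls."""
    if n <= 2:
        return 1, depth
    # ceil(sqrt(n)), as in the question
    return A(math.ceil(math.sqrt(n)), depth + 1)

# For n = 2^16 the square-root chain is 65536 -> 256 -> 16 -> 4 -> 2,
# i.e. 4 recursive calls, matching log2(log2(2^16)) = log2(16) = 4.
```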
How do we solve this question?

The complexity is determined by the number of recursive calls, which equals the number of times we can take the square root of $n$ repeatedly until $n$ becomes $2$.

$T(n) = T\left(\lceil \sqrt n \rceil \right) + 1$

$T(2) = 1$
$T\left(2^2\right) = T(2) + 1 = 2$
$T\left(2^{2^2}\right) = T(4) + 1 = 3$
$T\left(2^{2^3}\right) = T(16) + 1 = 4$

So, $T(n) = \lg\lg n + 1 = O\left(\log \log n\right)$

Why is it $+1$ and not some constant $c$ in the recurrence relation?

Why are we adding a constant in the recurrence relation at all?

Because a constant amount of work is done at every recursive call; whether it is written as $1$ or $c$ does not change the asymptotic bound.

Read this article once and you may never need to solve this type of question again; it becomes very intuitive.

https://stackoverflow.com/questions/16472012/what-would-cause-an-algorithm-to-have-olog-log-n-complexity

@Hemant Parihar Why doesn't this follow $O(\log n \log n)$ complexity, given that the problem is divided to $\sqrt{n}$ each time?

https://gateoverflow.in/197521/recurrence-relation

$\log \log n$. Substitute $n = 2^m$, then proceed:

$T(n) = T(\sqrt{n}) + 1$

With $n = 2^m$: $T(2^m) = T(2^{m/2}) + 1$

Let $S(m) = T(2^m)$. Then

$S(m) = S(m/2) + 1$

Applying the Master theorem gives $S(m) = O(\log m)$. Since $m = \log n$,

$T(n) = O(\log \log n)$
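The substituted recurrence can be checked numerically; in this sketch I take $S(1) = 0$ as the base case (my choice, the comment above leaves it implicit):

```python
def S(m):
    """S(m) = S(m/2) + 1 with S(1) = 0: number of halvings of m down to 1."""
    return 0 if m <= 1 else S(m // 2) + 1

# For n = 2^m we have T(n) = S(m) up to constants, and S(m) grows like log m.
# Example: n = 2^16 gives m = 16, and S(16) = 4 = log2(16) = log2(log2(n)).
```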
Another way to solve this is by unrolling the recurrence:

$T(n) = T(n^{1/2}) + c$

$T(n) = T(n^{1/4}) + 2c$

$T(n) = T(n^{1/8}) + 3c$

$T(n) = T(n^{1/16}) + 4c$

$\vdots$

$T(n) = T(n^{1/2^k}) + kc$

The recursion stops when $n^{1/2^k} = 2$.

Taking logs: $\frac{\log n}{2^k} = 1$, so $\log n = 2^k$; taking logs again gives $k = \log \log n$.

$T(n) = T(2) + c \log \log n$

So the answer is (C) $O(\log \log n)$.
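The stopping condition $n^{1/2^k} = 2$ can also be checked directly (a small sketch; `unroll_k` is a name I made up for illustration):

```python
def unroll_k(n):
    """Smallest k with n**(1/2**k) <= 2: repeatedly take the square root until n <= 2."""
    k = 0
    while n > 2:
        n = n ** 0.5   # taking a square root halves the exponent of n
        k += 1
    return k

# For n = 2^32: log2(log2(2^32)) = log2(32) = 5 square-root steps.
```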