
Consider the following algorithm. Assume procedure $A$ and procedure $B$ take $O(1)$ and $O(1/n)$ units of time respectively. Derive the time complexity of the algorithm in $O$-notation.

algorithm what(n)
begin
    if n = 1 then call A
    else
    begin
        what(n - 1);
        call B(n)
    end
end
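For concreteness, here is a direct Python transcription of the pseudocode (the call counter is an illustrative addition, not part of the original algorithm; `A` and `B` are stand-ins for the two procedures):

```python
# Direct transcription of the pseudocode.
# A models the O(1) procedure, B(n) the procedure stated to take O(1/n) time.
calls = {"A": 0, "B": 0}

def A():
    calls["A"] += 1  # constant-time work

def B(n):
    calls["B"] += 1  # work modeled as O(1/n)

def what(n):
    if n == 1:
        A()
    else:
        what(n - 1)
        B(n)

what(5)
print(calls)  # {'A': 1, 'B': 4}: one call to A, then B(2), B(3), B(4), B(5)
```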

Let's suppose $n = 5$. For that input the time complexity would be:

$T(5) = O(1/5) + O(1/4) + O(1/3) + O(1/2) + O(1)$

$= c(1/5 + 1/4 + 1/3 + 1/2 + 1/1)$, where $c$ is a constant

$\approx c$

So the complexity should be $O(1)$.
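As a side note, the "approximately equal to $c$" step can be sanity-checked numerically: the partial harmonic sums $H_n = 1 + 1/2 + \dots + 1/n$ do not level off at a constant but keep growing, roughly like $\ln n$. A quick sketch (`H` is just a helper name used here):

```python
# Partial harmonic sums H_n = 1 + 1/2 + ... + 1/n.
# If the sum were really bounded by a constant, these values would level off;
# instead they keep growing slowly, roughly like ln n.
def H(n):
    return sum(1.0 / k for k in range(1, n + 1))

for n in [5, 100, 10_000, 1_000_000]:
    print(n, round(H(n), 3))  # H(5) is about 2.283; later values keep climbing
```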

But the answer is $O(n)$. What am I doing wrong?

0

You have to generalize the running time so that it works for every input size. Suppose you have a large input size $k$: then you have to compute the running time of the given algorithm, and to do that you have to set up a recurrence for the running time.

You can check the answer here: https://gateoverflow.in/1510/gate1999-11a

0
If I generalise with the help of my example, then I still get $O(1)$.

For $n$:

$c(1/n + 1/(n-1) + 1/(n-2) + \dots + 1/3 + 1/2 + 1/1)$

$\approx c$

@ankitgupta
+1

Have you considered this line: what(n-1)?

If you had, you would have got a recurrence like this:

$T(n) = T(n-1) + 1$

which gives us $O(n)$.

Now for A and B we have $O(1)$ and $O(1/n)$,

so in total: $O(n) + O(1) + O(1/n) = O(n)$
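The recurrence $T(n) = T(n-1) + 1$ can be checked by counting one unit of work per recursive step (a sketch; `what_steps` is an illustrative helper, not part of the original algorithm):

```python
def what_steps(n):
    # Number of work units: T(n) = T(n-1) + 1, with base case T(1) = 1.
    if n == 1:
        return 1                      # the call to A
    return what_steps(n - 1) + 1      # recursive call plus the call to B(n)

for n in [1, 5, 10, 100]:
    assert what_steps(n) == n         # linear: T(n) = n
print(what_steps(5), what_steps(100))  # 5 100
```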

0

Thanks, now I've got my answer. 😁

@Hirak

Since what() is called recursively, the final call to what() occurs when n = 1, after which A() is called, taking $O(1)$ time. As that final what() returns, B(n - (n - 2)) = B(2) gets called, which takes $O(1/2)$ time under the stated model. After B(2) finishes, what(2) returns, and the process repeats until B(n), taking $O(1/n)$ time, is the last function called. Therefore, the time complexity is:

$T(n) = O(1) + O(1/2) + O(1/3) + \dots + O(1/n)$

But here's the thing: $O(1)$ represents the constant time required to perform a task, and that is the smallest amount of time in which a conventional computer can do anything. Since the time to perform a task cannot drop below a constant, steps costing $O(1/2)$, $O(1/3)$, and so on are practically impossible; each of those operations really takes $O(1)$ time. Thus, the total time complexity of the function becomes:

$T(n) = O(1) + O(1) + O(1) + \dots$ ($n$ terms)

which is,

$T(n) = n \cdot O(1) = O(n)$
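The argument above, charging every call (whether to A or to B) one constant-time unit, can be sketched as follows (`total_units` is a hypothetical counter added for illustration):

```python
def total_units(n):
    # Charge each call to A or B one constant-time unit, since no real
    # operation can take less than constant time on a conventional machine.
    units = 0

    def what(n):
        nonlocal units
        if n == 1:
            units += 1    # call A: O(1)
        else:
            what(n - 1)
            units += 1    # call B(n): also charged O(1)

    what(n)
    return units

print(total_units(5), total_units(50))  # 5 50, i.e. T(n) = n * O(1) = O(n)
```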