
Let $A[1 \dots n]$ be an array storing a bit ($1$ or $0$) at each location, and let $f(m)$ be a function whose time complexity is $\Theta(m)$. Consider the following program fragment written in a C-like language:

counter = 0;
for (i = 1; i <= n; i++) {
    if (a[i] == 1) counter++;
    else { f(counter); counter = 0; }
}


The complexity of this program fragment is

1. $\Omega(n^2)$

2. $\Omega (n\log n) \text{ and } O(n^2)$

3. $\Theta(n)$

4. $o(n)$

Confused between options C and D. Which one is correct?

Please note that inside the else condition, f() is called first, then counter is set to 0.

Consider the following cases:

a) All 1s in A[]: time taken is Θ(n), as only counter++ is executed, n times.

b) All 0s in A[]: time taken is Θ(n), as only f(0) is called, n times (each call is constant time).

c) Half 1s, then half 0s: time taken is Θ(n), as f(n/2) is called only once (costing Θ(n/2)); each remaining 0 calls f(0), which is constant time.
Can you please explain case (c)?

The key part in the code is "$counter = 0$" in the else part as we can see below.

Let's take the best case. This happens when $a[i] = 1$ for all $i$: the loop then takes $\Theta(1)$ per iteration, for an overall time complexity of $\Theta(n)$. Moreover, the loop always runs $n$ times whatever the input, so the running time of the code fragment is $\Omega(n)$, and hence options A and B are false.

Now consider the worst case, i.e. when some $a[i] = 0$ and the else part is executed. Such an iteration takes $\Theta(counter)$ time, and after each else, counter is reset to $0$. Suppose $k$ iterations go to the else part. Then the worst-case time is $\Theta(x_1) + \Theta(x_2) + \dots + \Theta(x_k) + \Theta(n-k)$, where $x_i$ is the value of counter when the $i$-th call $f(x_i)$ is made, and $\Theta(n-k)$ accounts for the remaining iterations. Because counter is reset to $0$ after each call to $f()$, every $1$ in the array is counted in at most one $x_i$, so $x_1 + x_2 + \dots + x_k \le n$. Hence $\Theta(x_1) + \Theta(x_2) + \dots + \Theta(x_k) + \Theta(n-k) = O(n)$.

Since the time complexity is $\Omega(n)$ and $O(n)$, it is $\Theta(n)$: option (C). (Option D is false because little-$o$ requires the growth rate to be STRICTLY lower, not "lower or equal" as with big-$O$.)

If $counter = 0$ were not there in the else part, the time complexity would be $\Omega(n)$ and $O(n^2)$: in the worst case the array alternates equal numbers of $1$s and $0$s, and since counter is never reset, the calls cost $\Theta(1) + \Theta(2) + \dots + \Theta(n/2)$, which is $O(n^2)$.

Sir, your conversion of $\Theta(n)$ to $\Omega(n)$ is not clear to me; I mean "hence overall time complexity of $\Theta(n)$ and we can say time complexity of the code fragment is $\Omega(n)$".

You mean the lower bound cannot be less than $n$, though the upper bound could be $n^2$, $n^3$, ... right?

Sir,

(in the last paragraph, if there is no counter = 0) in the worst case I think every else iteration sees an ever-growing counter;

then the complexity would be $1+2+3+4+\dots+n = O(n^2)$.

Please correct me if I am wrong.

$\Theta(n/2)+\Theta(n/2)=O(n)$;

how is it $O(n^{2})$?
@Arjun sir, what if the array has all 1s and only the last cell has a 0? Then counter will be incremented to n-1, and in the final iteration, when A[n] is 0, f(counter), i.e. f(n-1), will be called, which leads to a time complexity of n^2. I don't know where I am thinking wrong. Please help :(

Got it where I was going wrong. :)
@sushmita how would it be $\Theta(n)$? In the worst case (if the array has all 1s and only the last cell has a 0), shouldn't it be $O(n^2)$?
@Arjun sir, if counter = 0 were not there in the else part, then would small $o(n)$ be correct?

I mean, would option D be correct?

If counter = 0 is not there, then in the worst case, when equal numbers of 0s and 1s are present,

$TC = O(n/2) + (1+2+3+\dots+n/2) = O(n^2)$

"But due to counter = 0 after each call to f(), we have $x_1+x_2+\dots+x_k = n$."

I could not understand how the sum becomes $n$. Can anyone help?

$\Theta(n/2)+\Theta(n/2)=O(n)$, yes!

But here it is $\Theta(1)+\Theta(2)+\dots+\Theta(n/2)$,

and we know $1+2+3+\dots+n/2 = \frac{(n/2)((n/2)+1)}{2}$, which makes its order $n^2$.


I think the best case is when all of the array elements contain 1; then the time complexity is $\Omega(n)$.

In the worst case, when all of the array elements contain 0, it is $\Theta(0)+\Theta(0)+\dots+\Theta(0)$ (it totally depends upon $\Theta(m)$).

Or if the array is like 111...1 (m ones) followed by 000..., then the time complexity is $1+1+\dots+1+\Theta(m)+\Theta(m)+\dots+\Theta(m)$ (then also it totally depends upon $\Theta(m)$).

I think your worst-case analysis is wrong, because each time f(m) is invoked, counter becomes 0, and as the time complexity of f(m) is O(m), when counter is zero the call would take O(0) time.
The worst-case time would be O(n).

No: when the function is called, counter keeps its value, because counter = 0 comes after the call to f(counter), so f receives the current counter.
The answer should be C, which is $\Theta(n)$,

because in the best case the time complexity is $\Omega(n)$: when the array contains all 1s, only the if branch executes, n times.

In the worst case the time complexity is $O(n)$:

suppose the if branch executes for the first n/2 iterations, so counter reaches n/2; in iteration (n/2)+1 the else branch executes, so f(n/2) runs, taking n/2 time, and sets counter back to 0; the remaining iterations run the if branch in constant time each. Total time: (n/2)+(n/2)+(n/2), so the worst-case time complexity is $O(n)$.

The answer to this is always $\Theta(n)$.

The best case is $O(n)$, and the worst case is also $O(n)$.

If all 1s: $O(n)$, because we never enter f(counter); every iteration takes constant time.

If all 0s: $O(n)$, because every iteration takes constant time even though the function is called each time (with counter = 0).

If half 1s and then half 0s: the n/2 1s take n/2 time, constant per iteration.

After the n/2 1s, a 0 comes; counter is now n/2, so f(n/2) takes n/2 time, and counter immediately becomes 0.

That is, whatever the counter accumulated until now is discarded. The remaining n/2 0s take constant time each, i.e. n/2.

So the total time is n/2 + n/2 + n/2 = 3n/2, which is $O(n)$.

So "C" is the answer.

If there were no counter = 0, then each of the n/2 0s would call f(n/2), because counter never decreases,

so the time would be $(n/2)(n/2) = n^2/4$.


Consider the best case for this code snippet, whose input is a string of all 1s:
best-case time complexity = $\mathcal{O}(n)$.

For the worst case, this snippet has a variable time complexity, since it depends on the function $f(m)$, whose time complexity is $\Theta(m)$.
To show it has a variable time complexity, consider two cases:

Case 1: $0\ 1 \ 0 \ 11 \ 0 \ 111 \ 0 \ 1111 \ 0 \dots$
In such a case the snippet behaves as code with a linear increment, i.e. a time complexity of $\mathcal{O}(n)$.

Case 2: $0\ 1 \ 0 \ 11 \ 0 \ 1111 \ 0 \ 11111111 \ 0 \dots$
In such a case the snippet behaves as code with a time complexity of $\mathcal{O}(n^2)$.

But in case there is a constant number of 1s between the 0s, it may also behave with time complexity $\mathcal{O}(1)$ per call.

Hence, for this code snippet we can say that $o(n)$ is the stronger option to choose; it covers all except the best case.

The key here is to see that counter can be incremented at most n times, and the complexity of f depends upon counter.

Since counter can hold at most n, it is not possible for f to do more than n total work.

f can have one call taking n time, or multiple calls whose combined cost is n.

Suppose we first get x 1s and then a 0: f(x) runs for x steps. Then we get y 1s followed by a 0: f(y) runs for y steps.

Total elements: x + y = n, so the loop goes up to n,

and the total time spent in f is x + y.

At minimum f does 0 work, and at most n.
--------------------------------------------------------------------------------------------------------------------

Also, note that if the array contains all 1s then the time would be $\Theta(n)$, and based on this we can reject A, B, and D.
If counter is 0, then f(counter) takes O(0), i.e. negligible, time.

Therefore the worst-case input is all 1s with a single 0 at the end, that is, (n-1) 1s followed by one 0.

In this case counter will reach n-1; when the 0 comes, the else branch calls f(n-1), which is O(n-1) = O(n).

I think the function is called but not defined;

I mean, the function definition is missing.