
Let $A[1,\ldots,n]$ be an array storing a bit ($1$ or $0$) at each location, and let $f(m)$ be a function whose time complexity is $\Theta(m)$. Consider the following program fragment written in a C-like language:

counter = 0;
for (i = 1; i <= n; i++)
{
    if (a[i] == 1) counter++;
    else { f(counter); counter = 0; }
}

The complexity of this program fragment is

1. $\Omega(n^2)$

2. $\Omega (n\log n) \text{ and } O(n^2)$

3. $\Theta(n)$

4. $o(n)$

edited | 6.2k views
0
Confused between option C and D? Which is correct?
+12

Please note that inside the else condition, f() is called first, then counter is set to 0.

Consider the following cases:

a) All 1s in A[]: Time taken is Θ(n), as only counter++ is executed, n times.

b) All 0s in A[]: Time taken is Θ(n), as f(0) is called n times and each call costs only Θ(1).

c) Half 1s, then half 0s: Time taken is Θ(n), as f(n/2) is called only once (costing Θ(n/2)); every later 0 calls f(0), which costs Θ(1).
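These three cases can be checked with a small simulation. This is a hypothetical cost model (not part of the question): each loop iteration is charged 1 step, and a call f(m) is charged m + 1 steps to reflect its Θ(m) bound.

```python
def fragment_cost(a):
    """Simulate the fragment, charging 1 step per iteration and
    m + 1 steps for a call f(m), since f(m) is Theta(m)."""
    steps = 0
    counter = 0
    for bit in a:
        steps += 1                # loop iteration / comparison
        if bit == 1:
            counter += 1
        else:
            steps += counter + 1  # f(counter) costs Theta(counter)
            counter = 0
    return steps

n = 1000
print(fragment_cost([1] * n))                          # case a: all 1s -> n steps
print(fragment_cost([0] * n))                          # case b: all 0s -> 2n steps
print(fragment_cost([1] * (n // 2) + [0] * (n // 2)))  # case c: half/half -> 2n steps
```

All three counts stay within a small constant factor of n, matching Θ(n).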
0
Can you please explain the option c??
0

@Ankit Chourasiya Agree with your explanation.

The key point in the else part is to understand how the complexity adds up: across the whole run, the calls to f() do at most n units of work in total, even in the worst case.

0

Duplicate of which question? Can you include the link as well? @

0
is it D?
0
yes d
0
why is it d instead of c
0
Sorry, it's C

The key part in the code is "$counter = 0$" in the else part as we can see below.

Let's take the best case. This happens when $a[i] = 1$ for all $i$: each iteration then takes $\Theta(1)$, giving an overall time of $\Theta(n)$. Moreover, the loop always executes $n$ iterations, so the running time on every input is $\Omega(n)$.

Now, consider the worst case, i.e., iterations where the else part is executed. Such an iteration takes $\Theta(counter + 1)$ time, and after each call to $f$, counter is reset to $0$. Let $k$ iterations go to the else part, and let $x_i$ be the value of counter when the $i$-th call $f(x_i)$ is made. The total time is then $\Theta(x_1 + 1) + \Theta(x_2 + 1) + \dots + \Theta(x_k + 1) + \Theta(n-k)$, where the last term accounts for the $n-k$ iterations that take the if branch.

Because counter is reset to $0$ after each call to $f()$, every unit counted in some $x_i$ corresponds to a distinct earlier iteration that incremented counter. Hence $x_1 + x_2 + \dots + x_k \leq n$, and the total time is $\Theta(x_1 + \dots + x_k) + \Theta(k) + \Theta(n-k) = O(n)$.

Since the time complexity is $\Omega(n)$ and $O(n)$, it is $\Theta(n)$ - Option (C). (Option D is false because little $o$ requires the growth rate to be STRICTLY lower, not lower than or equal to as with big $O$.)

If $counter = 0$ were not there in the else part, then the time complexity would be $\Omega(n)$ and $O(n^2)$: in the worst case the array $a$ can have equal numbers of $0$'s and $1$'s, alternating, so that counter never resets and the calls cost $\Theta(1) + \Theta(2) + \dots + \Theta(n/2)$, which gives $O(n^2)$.
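The last paragraph can be illustrated numerically. Below is a sketch under a hypothetical cost model (1 step per iteration, m + 1 steps per call f(m)), with the `counter = 0` line removed; on an alternating array, doubling n roughly quadruples the cost, as expected for quadratic growth.

```python
def cost_no_reset(a):
    """Cost of the fragment with the `counter = 0` statement removed:
    1 step per iteration plus m + 1 steps for each call f(m)."""
    steps = 0
    counter = 0
    for bit in a:
        steps += 1
        if bit == 1:
            counter += 1
        else:
            steps += counter + 1  # f(counter); counter is NOT reset
    return steps

def alternating(n):
    return [1, 0] * (n // 2)  # 1,0,1,0,... so counter only ever grows

s1 = cost_no_reset(alternating(1000))
s2 = cost_no_reset(alternating(2000))
print(s1, s2, round(s2 / s1, 2))  # ratio close to 4 -> quadratic growth
```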

Correct Answer: $C$

by Veteran (424k points)
edited
+1
Sir, your conversion of $\Theta(n)$ to $\Omega(n)$ is not clear to me. I mean: "hence overall time complexity of $\Theta(n)$ and we can say time complexity of the code fragment is $\Omega(n)$".

You mean the lower bound cannot be less than $n$, though the upper bound could be $n^2$, $n^3$, ... right?
0

Sir

(In the last paragraph, if there is no counter = 0:) in the worst case, I think each iteration will go to the else part, right?

Then the complexity would be 1 + 2 + 3 + 4 + … + n = O(n²).

Please correct me if I am wrong.

0
$\Theta (n/2)+\Theta (n/2)=O(n)$

how $O(n^{2})$?
+1
@Arjun sir, what if the array has all 1s and only the last cell has 0? Then counter will be incremented to n−1, and at the final iteration, when A[n] is 0, f(counter), that is f(n−1), will be called, which (I thought) leads to a time complexity of n^2. I don't know where I am thinking wrong. Please help :(

Got it where i was going wrong. :)
+1
@sushmita how would it be Θ(n)? Because in the worst case (if the array has all 1s and only the last cell has 0) it should be O(n^2), shouldn't it?
0
@arjun sir, if counter=0 was not there in else part then small o(n) is correct right?

i mean the option D is correct
0

If counter = 0 is not there, then in the worst case, when equal numbers of 0's and 1's are present,

TC = O(n/2) + (1 + 2 + 3 + … + n/2) = O(n²)

+2

But due to counter=0 after each call to f(), we have, x1+x2+⋯+xk=n.

I could not understand how the sum value becomes n. Can anyone help?

+1

Θ(n/2) + Θ(n/2) = O(n), yes!

but here it is Θ(1) + Θ(2) + ⋯ + Θ(n/2) + Θ(n/2)

and we know 1 + 2 + 3 + 4 + … + n/2 = ((n/2)·((n/2)+1))/2, which makes its order n²

0
Yes I also could not understand that x1+x2+x3+...+xk = n ..sir @Arjun .
0

@Rupendra Choudhary

I have understood the worst case when we have equal numbers of zeroes and ones and counter = 0 is not there, but why is the total time complexity written as Θ(1) + Θ(2) + ⋯ + Θ(n/2) + Θ(n/2), and not Θ(1) + Θ(2) + ⋯ + Θ(n/2) + Θ(n)? I know it won't make any difference, but the last Θ(n/2) is for the loop, right? So why not Θ(n)?

+3

The worst case may arise when the array is like:
10 110 1110 11110 111110 and so on…

Then the time complexity would be:
Θ(1)+f(1) + 2·Θ(1)+f(2) + 3·Θ(1)+f(3) + … + k·Θ(1)+f(k)
where k is the number of 1's in the final group (each group of consecutive 1's is followed by one zero).
f(m) = Θ(m) (given), so this is
Θ(1+1) + Θ(2+2) + Θ(3+3) + … + Θ(k+k)
= Θ(2·(1 + 2 + 3 + … + k))
= Θ(2·k(k+1)/2)
= Θ(k²)

Calculating the relation between k and n:
No. of 1's: 1+2+3+…+k = k(k+1)/2
No. of 0's: k (one occurs after every group of consecutive 1's)
So, n = k(k+1)/2 + k
n ≈ k² (approx.)
Then k ≈ n^0.5

So, putting this k into the time complexity, we get: Θ(k²) = Θ((n^0.5)²) = Θ(n)
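The staircase pattern above can be checked with a quick simulation (hypothetical cost model: 1 step per iteration, m + 1 steps per call f(m)); with the reset in place, the total cost comes out to exactly 2n for this input family, i.e., linear.

```python
def fragment_cost(a):
    """1 step per iteration, plus m + 1 steps for each call f(m)."""
    steps, counter = 0, 0
    for bit in a:
        steps += 1
        if bit == 1:
            counter += 1
        else:
            steps += counter + 1  # f(counter) costs Theta(counter)
            counter = 0
    return steps

def staircase(k):
    """1 0 11 0 111 0 ... : a group of i ones then a zero, for i = 1..k."""
    a = []
    for i in range(1, k + 1):
        a += [1] * i + [0]
    return a

a = staircase(100)               # n = k(k+1)/2 + k = 5150, with k ~ sqrt(n)
print(len(a), fragment_cost(a))  # cost = k^2 + 3k = 2n, linear in n
```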

+1

0
yes,there is a small typing mistake in the answer... i updated it !
0

"If counter = 0 was not there in the else part, then the time complexity would be Ω(n) and O(n²), as in the worst case we can have equal numbers of 0's and 1's in array a, giving time complexity Θ(1) + Θ(2) + ⋯ + Θ(n/2) + Θ(n/2), which would give O(n²)." @Shaik Masthan

+4

there are n numbers in the array ===> it takes n iterations

let the worst case be: the first half of the numbers are 1, and the remaining half are 0.

the if part executes $\frac{n}{2}$ times, each time taking Θ(1)

===> Θ(1) · $\frac{n}{2}$ = Θ($\frac{n}{2}$)

after these $\frac{n}{2}$ iterations, counter has increased to $\frac{n}{2}$.

now the else part executes the remaining $\frac{n}{2}$ times, each time taking Θ($\frac{n}{2}$), since counter = 0 does not exist.

===> Θ($\frac{n}{2}$) · $\frac{n}{2}$ = Θ($\frac{n^2}{4}$)

∴ Total = Θ($\frac{n}{2}$) + Θ($\frac{n^2}{4}$) = Θ($n^2$)
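This n²/4 figure can be reproduced with a small sketch (hypothetical cost model as before: 1 step per iteration, m + 1 steps per call f(m)), again with `counter = 0` removed:

```python
def cost_no_reset(a):
    """Fragment cost with `counter = 0` removed: 1 step per iteration
    plus m + 1 steps for each call f(m)."""
    steps, counter = 0, 0
    for bit in a:
        steps += 1
        if bit == 1:
            counter += 1
        else:
            steps += counter + 1  # counter stays at its current value
    return steps

n = 1000
half_half = [1] * (n // 2) + [0] * (n // 2)
print(cost_no_reset(half_half))  # 251500, dominated by the n^2/4 term
```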

0
But the first time it enters the else part, after executing f(counter), the counter is set to 0,

so from the next iteration it will take Θ(1).

So shouldn't it be Θ(n/2) [first half, a[i] = 1] + Θ(n/2) [first else iteration] + Θ(1)·(n/2 − 1) [remaining else iterations] = Θ(n)?
0

@OneZero

Brother, the above comment is for once_2019's question (when the counter = 0 statement is removed from the else part).

0
OOPS, sorry. By the way, what is the answer for the actual question? Is the worst case O(n^2) or O(n)?
0
It is clearly written in the answer... i.e., option C
0

@Arjun Sir

@Shaik Masthan I understood your solution... but in Arjun sir's explanation I didn't get how x1 + x2 + x3 + ... + xk sums to n. Please explain how this comes about.

0
Need explanation for the equation x1+x2+....+xk = n
0
Sir, I got why it's (C), and I also got how we eliminated options (A) and (B), but please tell me how we eliminated option (D): since in the big-O sense n·log(n) is greater than n, why not (D)?

Please note that inside the else condition, f() is called first, then counter is set to 0.

Consider the following cases:

a) All 1s in A[]: Time taken is Θ(n), as only counter++ is executed, n times.

b) All 0s in A[]: Time taken is Θ(n), as f(0) is called n times and each call costs only Θ(1).

c) Half 1s, then half 0s: Time taken is Θ(n), as f(n/2) is called only once (costing Θ(n/2)); every later 0 calls f(0), which costs Θ(1).
by Junior (585 points)
0
One more case: when (n−1) positions are 1 and one position (the nth/last position) is 0.

In this case, the if condition executes n−1 times, making the value of counter = n−1.

When i = n, the else condition executes, calling f(n−1) once.

Therefore

T(n) = (n−1)·O(1) + 1·Θ(n−1) = O(n)

Now, since both the best case and the worst case are linear, Option C is correct.
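This case can also be checked numerically with the same hypothetical cost model used elsewhere in the thread (1 step per iteration, m + 1 steps per call f(m)):

```python
def fragment_cost(a):
    """1 step per iteration plus m + 1 steps for each call f(m)."""
    steps, counter = 0, 0
    for bit in a:
        steps += 1
        if bit == 1:
            counter += 1
        else:
            steps += counter + 1  # f(counter)
            counter = 0
    return steps

n = 1000
a = [1] * (n - 1) + [0]  # (n-1) ones, then a single zero
print(fragment_cost(a))  # (n-1) + 1 + n = 2n: one Theta(n) call, still linear
```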

I think the best-case time complexity is when all of the array elements contain 1; then it is Ω(n).

In the worst case, when all of the array elements contain 0, the time is f(0) + f(0) + ... + f(0), which depends entirely on f(m), and each f(0) costs only a constant.

Or, if the array is like 111...1 (m ones) followed by 000..., the time is 1 + 1 + ... + 1 + f(m) + f(0) + ... + f(0), which again depends only on f(m) = Θ(m) and stays linear.

by Boss (10.4k points)
edited
0
I think your analysis of the worst-case time complexity is wrong, because each time f(m) is invoked, counter becomes 0, and as the time complexity of f(m) is O(m), when counter is zero the function takes only constant time.
The worst case would be O(n).
+1
No, when the function is called counter remains the same, because counter = 0 comes after the call to f(counter), so counter is unchanged during the call.
The answer should be C, which is Θ(n),

because in the best case the time complexity is Ω(n): when the array contains all 1's, only the if condition executes, n times.

In the worst case the time complexity is O(n):

say the if condition executes n/2 times, so counter's value is n/2; on the "(n/2)+1"-th iteration, the else condition executes, so f(n/2) is called once, taking n/2 time, and counter becomes 0. The remaining n/2 iterations each take constant time, so the total time is (n/2) + (n/2) + (n/2), and the worst-case time complexity is O(n).
by Active (2.3k points)
edited by
+1 vote

I think it actually depends upon the f(counter) function. What if f(0) is the terminating condition and takes constant time, but a call like f(1) takes Θ(m)?

1.) Consider an Array A[5] = { 1,1,1,1,1 }

at index 0 : A[0] =1,then counter is incremented , counter = 1.

at index 1 : A[1] =1, then again counter is incremented, counter = 2.

..similarly counter = 5 at the end. Total Time complexity = constant for every element in this array, if there are N elements in the array and all are 1 then Time Complexity for such case - O(N).

2.) Consider an Array A[5] = { 0,0,0,0,0 } (you might think this is the worst case, but it's not)

at index 0 : A[0] = 0, then the else part calls f(0), which takes constant time, because counter is already 0 initially and f(0) is the terminating condition as assumed above.

at index 1 : A[1] =0, then again else part f(0) is called which again takes constant time.

... and this continues till the end. Total time complexity  = constant for every element in this array, if there are N elements in the array and all are 0 then Time complexity for such case - O(N).

3.) Consider an Array A[5] = {1,1,0,0,0}

at index 0: A[0] = 1, counter =1 takes constant time 'c'

at index 1: A[1] = 1, counter =2 takes constant time 'c'

at index 2: A[2] = 0, f(2) is called, which takes Θ(m) time; then counter is set to 0, which takes only constant time

at index 3: A[3] = 0, f(0) is called which takes constant time 'c'

at index 4: A[4] = 0, f(0) is called which again takes constant time 'c'

Total time complexity = c + c + Θ(m) + c + c = 4c + Θ(m). In case there are N elements where the first few elements are 1's followed by continuous 0's: the elements from the start up to the last 1 take constant time, then the first 0 takes Θ(m) time, and the remaining 0's take constant time. Total time complexity = c + c + ... + c (all 1's) + Θ(m) (first 0 encountered) + c + c + ... + c (remaining 0's) = (N−1)·c + Θ(m) = O(N) + Θ(m) = Max(O(N), O(m))

4.) Consider an Array A[5] = { 1,0,1,0,1 }

at index 0: A[0] = 1, counter =1 takes constant time 'c'

at index 1: A[1] = 0, f(1) is called, which takes Θ(m) time, and then counter is set to 0.

at index 2: A[2] = 1, counter = 1 takes constant time 'c'

at index 3: A[3] = 0, f(1) is called, which takes Θ(m) time, and then counter is set to 0.

at index 4: A[4] = 1, counter =1 takes constant time 'c'

Total time complexity = c + Θ(m) + c + Θ(m) + c = 3c + 2·Θ(m) = ⌈5/2⌉·c + ⌊5/2⌋·Θ(m)

If there are N elements and the array consists of alternating 1's and 0's, then about half of the elements take constant time and the other half call f(1), which takes Θ(m). Total = (N/2)·c + (N/2)·Θ(m) = O(N·m).

Hope it helps :)

please correct me if wrong :)

by Active (1.8k points)
0
May I know why you have assumed O(m) time for the function "f" always?

I think f(m) takes O(m) time, where m is the argument that is passed to it. Let O(m) be a placeholder for a running time, say am + b. Then in the above case, if f(2) is called, it will take time 2a + b.

So I think that in all cases the time complexity is Θ(n).

Please correct me if I am wrong.
0

Vishal Rana, you're right. To state my interpretation:

Best case: if all are 1's: TC = n·Θ(1) = Ω(n)

OR if all are zeroes: TC = n·Θ(m); m in this case is 0, hence n·Θ(0) = O(n)

Worst case (assume): a sequence like [1,1,1,1,1,1,0,1,1,1,1,1,1]: TC = (n/2)·O(1) + 1·Θ(n/2) + (n/2)·O(1) = O(n) + O(n) + O(n) = O(n)

As the best and worst cases all have the same time complexity, answer = Θ(n).

Please correct me if this is wrong anywhere.

0
f(count) is inside the for loop. Take the case {1,1,1,1,1,1,1,1,...,1,0}:

then count's value is (n−1), and then f(n−1) = Θ(n−1).

f(n) = (n−1)(n−1) = n^2
+1 vote

The answer to this is always Θ(n).

The best case is O(n).

The worst case is also O(n).

If all 1's, then O(n), because we never enter f(c); every iteration is constant.

If all 0's, then O(n), because every iteration is constant, even though the function is called every time (counter stays 0).

If half 1's and half 0's: suppose the first n/2 elements are 1's; that part takes n/2 time, every iteration constant.

After the n/2 1's, when a 0 comes, counter is now n/2, so f(n/2) takes n/2 time and counter immediately becomes 0:

whatever work was accumulated until now is thrown away. Now the n/2 0's each take constant time, so that is n/2 more.

So the total time is n/2 + n/2 + n/2 = 3n/2, which is O(n).

So "C" is the answer.

If there is no counter = 0, then the last n/2 iterations each take n/2 time, because counter stays at n/2 and never decreases,

so the time = (n^2)/4.

by Active (4.5k points)
edited

Consider the best case for this code snippet, for which the test case is a string of all 1's;
so, best case time complexity = $\mathcal{O}(n)$

For the worst case, this code snippet has a variable time complexity, because it depends on the function $f(m)$, whose time complexity is $\Theta(m)$.
To show that it has a variable time complexity, consider two cases to get the idea:

Case 1: $0\ 1 \ 0 \ 11 \ 0 \ 111 \ 0 \ 1111 \ 0 \dots$
in such a case the function behaves as some part of code snippet which has a linear increment i.e. time complexity of $\mathcal{O}(n)$

Case 2: $0\ 1 \ 0 \ 11 \ 0 \ 1111 \ 0 \ 11111111 \ 0 \dots$
in such a case the function behaves as code snippet which has a time complexity of $\mathcal{O}(n^2)$

but in case there are only a constant number of 1's between the zeros, it may also behave with time complexity = $\mathcal{O}(1)$

Hence, for this code snippet, we can say that $o(n)$ is the stronger option to choose. It covers all except the best case.

by Boss (30.7k points)
The key here is to see that counter can be incremented at most n times in total, and the complexity of f depends upon counter.

Since counter can hold at most n, it is not possible for f to do more than n work in total.

f can have one call taking n time, or multiple calls whose combined time is n.

Suppose we first get x 1's and then a 0: f(x) runs for x steps. Then we get y 1's followed by a 0: f(y) runs for y steps.

With x + y + 2 = n elements in total, the loop runs n times,

and the total time spent inside f is x + y < n.

At minimum, f does 0 work, and at maximum, n.
--------------------------------------------------------------------------------------------------------------------

Also, note that if the array contains all 1's, then the time would be Θ(n), and based on this we can reject A, B and D.
by Boss (25.3k points)
edited
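The claim above, that the combined work inside f never exceeds n, can be spot-checked with a short sketch. This is an illustration, assuming f's cost is proportional to its argument: record the argument of every call to f and verify that the arguments sum to at most n on random bit arrays.

```python
import random

def f_arguments(a):
    """Run the fragment's loop, recording the argument of every call f(counter)."""
    args, counter = [], 0
    for bit in a:
        if bit == 1:
            counter += 1
        else:
            args.append(counter)  # f(counter) is called here
            counter = 0           # then counter is reset
    return args

random.seed(1)
for _ in range(200):
    n = random.randint(1, 100)
    a = [random.randint(0, 1) for _ in range(n)]
    assert sum(f_arguments(a)) <= n  # total work inside f never exceeds n
print("ok")
```

Every increment of counter is "paid for" by exactly one call to f (or never billed at all, if the array ends in 1's), which is why the sum can never exceed n.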