+33 votes

Let $A[1, \dots, n]$ be an array storing a bit ($1$ or $0$) at each location, and let $f(m)$ be a function whose time complexity is $\Theta(m)$. Consider the following program fragment written in a C-like language:

counter = 0;
for (i = 1; i <= n; i++)
{
    if (a[i] == 1) counter++;
    else { f(counter); counter = 0; }
}

The complexity of this program fragment is

  1. $\Omega(n^2)$

  2. $\Omega (n\log n) \text{ and } O(n^2)$

  3. $\Theta(n)$

  4. $o(n)$

asked in Algorithms by Veteran (59.5k points)
edited by | 3.6k views
0
Confused between options C and D. Which one is correct?
+6

 Please note that inside the else condition, f() is called first, then counter is set to 0.

Consider the following cases:

a) All 1s in A[]: Time taken is Θ(n), as only counter++ is executed n times.

b) All 0s in A[]: Time taken is Θ(n), as only f(0) is called n times.

c) Half 1s, then half 0s: Time taken is Θ(n), as f(n/2) is called only once.
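(A rough breakdown of case (c), assuming the first $n/2$ entries are $1$ and the rest are $0$: the first $n/2$ iterations only do counter++, the next iteration calls $f(n/2)$ at cost $\Theta(n/2)$, and each remaining iteration calls $f(0)$ at constant cost, so the total is roughly $\Theta(n/2) + \Theta(n/2) + (n/2 - 1)\cdot\Theta(1) = \Theta(n)$.)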
0
Can you please explain case (c)?

10 Answers

+34 votes
Best answer

The key part in the code is "$counter = 0$" in the else part, as we can see below.

Let's take the best case. This happens when $a[i] = 1$ for all $i$: the loop then takes $\Theta(1)$ per iteration, giving an overall time of $\Theta(n)$. Since the best case is only $\Theta(n)$, the running time of the fragment cannot be $\Omega(n \log n)$ or $\Omega(n^2)$, so options A and B are false. At the same time, every input takes $\Omega(n)$ time, since the loop always runs $n$ iterations.

Now, consider the worst case, i.e., when the else part is executed. An iteration with $a[i] = 0$ costs $\Theta(1 + counter)$, and after each else the counter is reset to $0$. Let $k$ iterations go to the else part in the worst case. Then the worst-case time is $\Theta(k + x_1 + x_2 + \dots + x_k) + \Theta(n-k)$, where $x_i$ is the value of the counter when the $i$-th call to $f(counter)$ is made and $\Theta(n-k)$ covers the remaining constant-time iterations. Because the counter is reset to $0$ after each call to $f()$, every $1$ in the array contributes to at most one $x_i$, so $x_1 + x_2 + \dots + x_k \le n$. Therefore $\Theta(k + x_1 + \dots + x_k) + \Theta(n-k) = O(n) + O(n) = O(n)$, and since the loop alone takes $\Omega(n)$, the worst case is $\Theta(n)$.
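As a sanity check of this counting argument, here is a minimal C sketch (not part of the original question): it models $f(m)$ as costing exactly $m$ work units, runs the fragment on an arbitrary pattern of $1$-runs each ended by a $0$, and checks that the total work never exceeds $2n$, which is what $x_1 + x_2 + \dots + x_k \le n$ implies. The names work, f and run are illustrative only.

#include <assert.h>
#include <stdio.h>

static long work = 0;                    /* total work units accumulated        */

static void f(int m) { work += m; }      /* stand-in: assume f(m) costs m units */

static long run(const int *a, int n) {   /* the fragment, instrumented          */
    work = 0;
    int counter = 0;
    for (int i = 0; i < n; i++) {        /* 0-based indexing in plain C         */
        work++;                          /* constant cost of one iteration      */
        if (a[i] == 1) counter++;
        else { f(counter); counter = 0; }
    }
    return work;
}

int main(void) {
    enum { N = 1000 };
    int a[N];
    for (int i = 0; i < N; i++)          /* runs of nine 1s, each ended by a 0  */
        a[i] = (i % 10 != 9);
    long w = run(a, N);
    assert(w <= 2L * N);                 /* x1 + ... + xk <= n, so work <= 2n   */
    printf("n = %d, total work = %ld\n", N, w);
    return 0;
}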

Since the running time is $\Omega(n)$ on every input and $O(n)$ even in the worst case, it is $\Theta(n)$ - Option (C). (Option D is false because little-$o$ requires the growth rate to be strictly lower, not "lower or equal to" as with big-$O$.)

If $counter = 0$ were not there in the else part, then the time complexity would be $\Omega(n)$ and $O(n^2)$: in the worst case the array can alternate $1$s and $0$s (equal numbers of each), so the counter is never reset and the successive calls to $f$ cost $\Theta(1) + \Theta(2) + \dots + \Theta(n/2)$, which together with the $\Theta(n/2)$ loop overhead gives $O(n^2)$.
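(As a quick check of that last sum, under the assumption of an alternating $1,0,1,0,\dots$ array with no reset: $\Theta(1)+\Theta(2)+\dots+\Theta(n/2) = \Theta\!\left(\tfrac{(n/2)(n/2+1)}{2}\right) = \Theta(n^2)$.)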

answered by Veteran (355k points)
edited by
+2
Sir, your conversion of $\Theta(n)$ to $\Omega(n)$ is not clear to me, i.e., "hence overall time complexity of $\Theta(n)$ and we can say time complexity of the code fragment is $\Omega(n)$".

Does it mean the lower bound cannot be less than $n$, though the upper bound could be $n^2$, $n^3$, ... right?
0

Sir

(In the last paragraph, if there is no counter = 0:) In the worst case I think there will be all 1's, right? I mean each iteration will go to the else part,

then the complexity would be $1+2+3+4+\dots+n = O(n^2)$.

Please correct me if I am wrong.

0
$\Theta (n/2)+\Theta (n/2)=O(n)$

so how is it $O(n^{2})$?
+1
@Arjun sir, what if the array has all 1s and only the last cell has 0? Then the counter will be incremented to n-1, and at the final iteration, when A[n] is 0, f(counter), that is f(n-1), will be called, which leads to a time complexity of n^2. I don't know where I am going wrong. Please help :(

Got it where I was going wrong. :)
+1
@sushmita, how would it be $\Theta(n)$? Because in the worst case (if the array has all 1s and only the last cell has 0) it should be $O(n^2)$.
0
@Arjun sir, if counter = 0 was not there in the else part, then small $o(n)$ is correct, right?

I mean, option D would be correct.
0

If counter = 0 is not there, then in the worst case, when equal numbers of 0's and 1's are present (alternating),

TC $= O(n/2) + (1+2+3+\dots+n/2) = O(n^2)$

+2

"But due to counter = 0 after each call to f(), we have, $x_1+x_2+\dots+x_k=n$."

I could not understand how the sum becomes $n$. Can anyone help?

0

$\Theta(n/2)+\Theta(n/2)=O(n)$, yes!

but here it is $\Theta(1)+\Theta(2)+\dots+\Theta(n/2)+\Theta(n/2)$,

and we know $1+2+3+4+\dots+n/2 = \frac{(n/2)((n/2)+1)}{2}$, which makes its order $n^2$.

0
Yes, I also could not understand that $x_1+x_2+x_3+\dots+x_k = n$, sir @Arjun.
0

@Rupendra Choudhary 

I have understood the worst case when we have equal numbers of zeroes and ones and counter = 0 is not there, but why is the total time complexity written as $\Theta(1)+\Theta(2)+\dots+\Theta(n/2)+\Theta(n/2)$, and why not $\Theta(1)+\Theta(2)+\dots+\Theta(n/2)+\Theta(n)$? I know that won't make any difference, but the last $\Theta(n/2)$ is for the loop, right? So why not $\Theta(n)$?

 

+9 votes

 Please note that inside the else condition, f() is called first, then counter is set to 0.

Consider the following cases:

a) All 1s in A[]: Time taken is Θ(n), as only counter++ is executed n times.

b) All 0s in A[]: Time taken is Θ(n), as only f(0) is called n times.

c) Half 1s, then half 0s: Time taken is Θ(n), as f(n/2) is called only once.
answered by Junior (637 points)
+3 votes

I think the best case time complexity is when all of the array elements contain 1; then it will be $\Omega(N)$.

In the worst case, when all of the array elements contain 0, it will be $\Theta(0)+\Theta(0)+\dots+\Theta(0)$ (it totally depends upon $\Theta(m)$).

Or, if the array elements are like 11111 (m ones) followed by 000..., then the time complexity is $1+1+1+\dots+\Theta(m)+\Theta(m)+\dots+\Theta(m)$ (then also it totally depends upon $\Theta(m)$).

answered by Boss (10.2k points)
edited by
0
I think your analysis of the worst-case time complexity is wrong, because each time f(m) is invoked the counter becomes 0, and as the time complexity of f(m) is O(m), when the counter is zero the function takes O(0) time. The worst-case time would be O(n).
+1
No, when the function is called the counter remains the same, because counter = 0 comes after calling f(counter), so during the call the counter keeps its value.
+1 vote
The answer should be C, which is $\Theta(n)$,

because in the best case the time complexity is $\Omega(n)$: when the array contains all 1's, only the if condition executes, n times.

In the worst case the time complexity is $O(n)$:

say the if condition executes n/2 times, so the counter's value becomes n/2; in the "(n/2)+1"-th iteration the else condition executes, so f(n/2) runs once, taking n/2 time, and the counter becomes 0; the remaining n/2 iterations run the if condition n/2 times. So the total time is (n/2)+(n/2)+(n/2), and the worst-case time complexity is O(n).
answered by Active (2.3k points)
edited by
+1 vote

I think it actually depends upon the f(counter) function. What if f(0) is the terminating condition and takes constant time, but if f(1) is called then it takes $\Theta(m)$ time?

1.) Consider an array A[5] = { 1,1,1,1,1 }

at index 0: A[0] = 1, then counter is incremented, counter = 1.

at index 1: A[1] = 1, then again counter is incremented, counter = 2.

...similarly, counter = 5 at the end. Total time complexity = constant for every element in this array; if there are N elements in the array and all are 1, then the time complexity for such a case is O(N).

2.) Consider an array A[5] = { 0,0,0,0,0 } (you might think this is the worst case, but it's not)

at index 0: A[0] = 0, then the else part f(0) is called, which takes constant time, because counter is already 0 initially and f(0) is the terminating condition as assumed above.

at index 1: A[1] = 0, then again the else part f(0) is called, which again takes constant time.

...and this continues till the end. Total time complexity = constant for every element in this array; if there are N elements in the array and all are 0, then the time complexity for such a case is O(N).

3.) Consider an array A[5] = {1,1,0,0,0}

at index 0: A[0] = 1, counter = 1, takes constant time 'c'

at index 1: A[1] = 1, counter = 2, takes constant time 'c'

at index 2: A[2] = 0, f(2) is called, which takes $\Theta(m)$ time, and then counter is set to 0 (constant time)

at index 3: A[3] = 0, f(0) is called, which takes constant time 'c'

at index 4: A[4] = 0, f(0) is called, which again takes constant time 'c'

Total time complexity = c + c + $\Theta(m)$ + c + c = 4c + $\Theta(m)$. In case there are N elements where the first few elements are 1's followed by continuous 0's: the elements from the start up to the last 1 take constant time each, the first 0 takes $\Theta(m)$ time, and the remaining 0's take constant time each. Total time complexity = c + c + ... + c (all 1's) + $\Theta(m)$ (first 0 encountered) + c + c + ... + c (remaining 0's) = (N-1)·c + $\Theta(m)$, which is O(N) + O(m) = max(O(N), O(m)).

4.) Consider an array A[5] = { 1,0,1,0,1 }

at index 0: A[0] = 1, counter = 1, takes constant time 'c'

at index 1: A[1] = 0, f(1) is called, which takes $\Theta(m)$ time, and then counter is set to 0.

at index 2: A[2] = 1, counter = 1, takes constant time 'c'

at index 3: A[3] = 0, f(1) is called, which takes $\Theta(m)$ time, and then counter is set to 0.

at index 4: A[4] = 1, counter = 1, takes constant time 'c'

Total time complexity = c + $\Theta(m)$ + c + $\Theta(m)$ + c = 3c + 2·$\Theta(m)$ = ceil(5/2)·c + floor(5/2)·$\Theta(m)$

If there are N elements and the array consists of alternating 1's and 0's, then about half of the elements take constant time and the other half call f(1), which takes $\Theta(m)$. Total = (N/2)·c + (N/2)·$\Theta(m)$ = O(N·m).

Hope it helps :)

please correct me if wrong :)

answered by Active (1.7k points)
0
May I know why you have assumed O(m) time for the function "f" always?

I think f(m) takes time O(m), where m is the argument that is passed to it. Let O(m) be a placeholder for a running time, say am + b. Then in the above case, if f(2) is called it will take time 2a + b.

So I think that in all cases the time complexity is Θ(n).

Please correct me if I am wrong.
0

Vishal Rana, you're right. To state my interpretation,

Best case: if all are 1's: TC $= n \cdot O(1) = \Omega(n)$

OR if all are zeroes: TC $= n \cdot \Theta(m)$; m in this case is 0, hence $n \cdot \Theta(0) = O(n)$

Worst case (assume): a sequence like [1,1,1,1,1,1,0,1,1,1,1,1,1]: TC $= (n/2) \cdot O(1) + 1 \cdot \Theta(n/2) + (n/2) \cdot O(1) = O(n) + O(n) + O(n) = O(n)$

As the best and worst cases all have the same time complexity, the answer $= \Theta(n)$

Please correct if this is wrong anywhere.

+1 vote

The answer to this is always $\Theta(n)$.

The best case is also O(n).

The worst case is also O(n).

If all 1's, then O(n), because we never enter f(c); every iteration is constant.

If all 0's, then O(n), because every iteration is constant even though the function is called every time.

If half the entries are 1's and half are 0's: suppose the first n/2 are 1's, so the time is n/2, because every iteration is constant.

After the n/2 1's, if a 0 comes, the count is now n/2, so f(n/2) takes n/2 time and the count becomes 0 immediately.

That means whatever work was accumulated until now is wasted. Now the remaining n/2 0's each take constant time, so the time is n/2.

So the total time becomes n/2 + n/2 + n/2 = 3n/2, which is O(n).

So "C" is the answer.

If there is no counter = 0, then it is n/2 + (n/2 times n/2), because the counter's value stays at n/2 every time; the counter never decreases,

so the time = (n^2)/4.

answered by Active (4.3k points)
edited by
0 votes

answer = option D

Consider the best case for this code snippet, for which the test case is a string of all 1's;
so, the best case time complexity = $\mathcal{O}(n)$

When considering the worst case for this code snippet, it has a variable time complexity, because it depends on the function $f(m)$, whose time complexity is $\Theta(m)$.
To show it has a variable time complexity, consider two cases to get the idea:

Case 1: $0\ 1 \ 0 \ 11 \ 0 \ 111 \ 0 \ 1111 \ 0 \dots$
In such a case the function behaves as a part of the code snippet with a linear increment, i.e., a time complexity of $\mathcal{O}(n)$.

Case 2: $0\ 1 \ 0 \ 11 \ 0 \ 1111 \ 0 \ 11111111 \ 0 \dots$
In such a case the function behaves as a code snippet with a time complexity of $\mathcal{O}(n^2)$.

But in case there are only a constant number of 1's in between the zeros, it may also behave with a time complexity of $\mathcal{O}(1)$.

Hence, for this code snippet we can say that $o(n)$ is a stronger option to choose. It covers all except the best case.

answered by Boss (30.7k points)
0 votes
The key here is to see that the counter can be incremented at most n times, and the complexity of f depends upon the counter.

Now, if the counter can hold at most n, then it is not possible for f to have more than n complexity.

f can have one call taking n complexity, or multiple calls whose combined complexity is n.

Consider that first we get x 1's and then a 0; now f(x) runs for x. Then we get y 1's followed by a 0; now f(y) runs for y.

The total number of elements is x + y = n, so the loop goes till n,

and the total time f executes is x + y.

At minimum f executes 0, and at maximum n.
--------------------------------------------------------------------------------------------------------------------

Also, note that if the array contains all 1's then the time would be $\Theta(n)$, and based on this we can reject A, B, D.
answered by Boss (24.5k points)
edited by
0 votes
If the counter is 0, then f(counter) will take O(0) time, i.e., constant time.

Therefore, the worst-case input will be all 1's with a single 0 at the end, that is, (n-1) 1's followed by one 0.

In this case, the counter will have the value n-1; when the 0 comes, the else statement will call f(n-1) = O(n-1) = O(n).

Hence the answer is C, $\Theta(n)$.
answered by Active (1.4k points)
–5 votes
I think the function is called but not defined,

meaning the function definition is missing.
answered by (425 points)

