
Consider the following C function:

```c
#include <stdio.h>

int fun(int n) {
    int i, j;
    for (i = 1; i <= n; i++) {
        for (j = 1; j < n; j += i) {
            printf("%d %d", i, j);
        }
    }
    return 0;
}
```

The time complexity of $fun$ in terms of $\Theta$ notation is:

1. $\Theta(n \sqrt{n})$
2. $\Theta(n^2)$
3. $\Theta(n \: \log n)$
4. $\Theta(n^2 \log n)$

### 1 comment

Consider the inner for loop.

If the loop had been:

for (j = 1; j < n; j += 2)

then it would have iterated about n/2 times.

Now, in place of the '2' we have 'i', so the inner loop iterates about n/i times.

This count varies with the value of i, so summing over the outer loop gives the series: n + n/2 + n/3 + n/4 + ...

=> n(1 + 1/2 + 1/3 + ...)

=> n log n, since the harmonic series 1 + 1/2 + ... + 1/n grows as log n.

The outer loop is independent of j, so it runs exactly n times. The inner loop depends on the outer variable: for each i it runs about n/i times, which averages out to a $\Theta(\log n)$ factor across all values of i.

So the total is $\Theta(n \log n)$.

Option 3 is correct.
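To make this precise: for a fixed $i$, the inner loop runs $\lceil (n-1)/i \rceil$ times, so the total number of iterations is bounded by (a quick derivation, using the harmonic number $H_n$):

$$\sum_{i=1}^{n} \left\lceil \frac{n-1}{i} \right\rceil \;\le\; \sum_{i=1}^{n} \left( \frac{n}{i} + 1 \right) \;=\; n H_n + n \;=\; \Theta(n \log n),$$

since $H_n = 1 + \frac{1}{2} + \cdots + \frac{1}{n} = \ln n + O(1)$. A matching lower bound follows from $\lceil (n-1)/i \rceil \ge (n-1)/i$, so the total is indeed $\Theta(n \log n)$.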
The beauty of the question lies in how j is updated.

For every value of i, j's increment is i itself.

When i is 1, j is incremented by 1: 1, 2, 3, 4, 5, 6, and so on.

When i is 2, j is incremented by 2: 1, 3, 5, 7, 9, and so on.

When i is 3, j is incremented by 3: 1, 4, 7, 10, 13, 16, and so on.

When i is 4, j is incremented by 4: 1, 5, 9, 13, 17, and so on.

and likewise for the other values of i.

So the total time complexity works out to $\Theta(n \log n)$:

n: the outer loop runs once for each value of i from 1 to n.

log n: for a given i, the inner loop runs about n/i times, and summing n/i over all i gives the harmonic series n(1 + 1/2 + ... + 1/n) = $\Theta(n \log n)$.