The Gateway to Computer Science Excellence
+15 votes
3.4k views

Let $f(n)= \Omega(n), g(n)= O(n)$ and $h(n)= \Theta(n)$. Then $[f(n) \cdot g(n)] + h(n)$ is:

  1. Ω (n)
  2. O (n)
  3. Θ (n)
  4. None of these
in Algorithms by Boss (10.4k points) | 3.4k views
0

Sir, how can

f(n) · g(n)  =  Ω(n) ?

0
Sorry, that's true only for non-decreasing functions. But the answer won't change here.
0

$f(n) = \Theta(n)$, so we can write $c_1 n \leq f(n) \leq c_2 n$        [equation 1]

$g(n) = \Omega(n)$, so we can write $g(n) \geq c_3 n$                     [equation 2]

$h(n) = O(n)$, so we can write $h(n) \leq c_4 n$, or $-h(n) \geq -c_4 n$   [equation 3]

Now using equations 2 and 3:

$-[g(n) \cdot h(n)] \geq -(c_3 \cdot c_4) n$

$-[g(n) \cdot h(n)] \geq -c_5 n$    where $c_5 = c_3 \cdot c_4$

$g(n) \cdot h(n) \leq c_5 n$

Add equation 1, i.e. $f(n) \leq c_2 n$, to the above equation:

$f(n) + [g(n) \cdot h(n)] \leq (c_2 + c_5) n$

$f(n) + g(n) - h(n) \leq c_6 n$   where $c_6 = c_2 + c_5$

Here all $c_i$ are constants.

$f(n) + g(n) - h(n) = O(n)$. Hence Option A.

0
It's not minus, it's a dot. Please check?
0
Oh sorry.
0
I am not getting it; please give a proper explanation.
0
Option B
0
My answer is also $B$ but $A$ is provided as answer.
0
Assume:

$f(n)=n^2$

$g(n)=\log n$

$h(n)=n$

$f(n) \cdot g(n)=n^2\log n$

$[f(n) \cdot g(n)] + h(n) \Rightarrow n^2\log n + n \Rightarrow \Omega(n)$
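A quick numeric check of this example (the functions are the ones chosen above; the constant $c = 1$ is my own pick):

```python
import math

# Example functions from the comment above
f = lambda n: n ** 2        # f(n) = n^2    is Omega(n)
g = lambda n: math.log(n)   # g(n) = log n  is O(n)
h = lambda n: n             # h(n) = n      is Theta(n)

# f(n)*g(n) + h(n) should stay >= c*n; c = 1 works here
for n in [2, 10, 1000, 10 ** 6]:
    total = f(n) * g(n) + h(n)
    assert total >= n       # consistent with Omega(n)
```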
0
Any process without any assumptions?

4 Answers

+22 votes
Best answer
$f(n) . g(n)$ - here the individual bounds are ≥n and ≤n respectively. So, for their product we can just say it is ≥ n or Ω(n) provided $g$ is a non-decreasing function. But $(h = \Theta(n))  \implies \left[h = O(n) \text{ AND } h = \Omega(n)\right]$.

So,

$$[f(n). g(n)] + h(n) = \Omega(n).$$

(whatever be the complexity of $f(n).g(n)$, $h(n)$ grows at least at the rate of $n$ and hence the whole expression too, due to '+')
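A small sketch of the parenthetical point above (the functions are my own illustrative picks, not from the answer): even when a decreasing $g(n)$ keeps the product $f(n) \cdot g(n)$ bounded, the $h(n) = \Theta(n)$ term alone pins the sum to $\Omega(n)$.

```python
# Illustrative choices (assumptions, not from the question):
f = lambda n: n          # Omega(n)
g = lambda n: 1.0 / n    # O(n), but decreasing
h = lambda n: 2 * n      # Theta(n)

for n in [1, 10, 1000, 10 ** 6]:
    product = f(n) * g(n)             # ~1.0 for every n: f*g is NOT Omega(n)
    assert abs(product - 1.0) < 1e-9
    assert product + h(n) >= 2 * n    # yet the '+ h(n)' term forces Omega(n)
```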
by Veteran (422k points)
edited by
0
None of these... the answer is D.

We cannot say anything about functions which may increase or decrease.
0
Which function here may increase or decrease?
0
Sir, if $f(n)= \Theta(n)$ and $g(n)= O(n)$, then $f(n) \cdot g(n)= O(n)$, right or wrong?
+4

No.

$(= n) \cdot (\leq n) = (\geq n)$ and $\leq (n^2)$

i.e., $\Omega(n)$ and $O(n^2)$
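The two bounds above, sketched with concrete instances (my own picks of $f$ and the two extremes of $g$, keeping $g \geq 1$):

```python
# f(n) = Theta(n); two admissible g(n) = O(n) extremes (my assumptions):
f = lambda n: 3 * n      # Theta(n)
g_low = lambda n: 1      # O(n): smallest admissible (constant)
g_high = lambda n: n     # O(n): as large as O(n) allows

for n in [1, 10, 1000]:
    assert f(n) * g_low(n) >= n             # product never below c*n:   Omega(n)
    assert f(n) * g_high(n) <= 3 * n ** 2   # product never above c*n^2: O(n^2)
```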

0
Sir, actually I am having trouble with such computations; I am not getting the intuition for finding the result of the product. If we multiply $\Theta(n) \cdot \Omega(n)$, then it equals $\Omega(n)$ — acceptable — but I couldn't get $\Theta(n) \cdot O(n) = O(n^2)$. Sorry sir, but I am stuck on such computations.
+11
Solve as in numbers. I say x = 5 and y ≤ 25. Now, x * y = ? (Assume x, y ≥ 1)

We can say it is ≥ 5 and ≤ 125, right?
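The same analogy in code (the numbers are taken straight from the comment):

```python
x = 5                        # x = 5 exactly
ys = [1, 7, 25]              # a few values with 1 <= y <= 25
products = [x * y for y in ys]

assert min(products) >= 5    # x*y is at least 5*1  = 5
assert max(products) <= 125  # x*y is at most 5*25 = 125
```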
0

Ω(n) + Θ(n) = Ω(n)

Explain this part please, @Arjun Sir.

+4
x > 5. y = 100. Now, x + y? Surely > 5, right? Same analogy.
0
Got it thank you Sir
+2

Sir, about the original question, how is Ω(n) · O(n) = Ω(n)?

We can't say anything, right?

Let's say n = 5; then x = 6 = Ω(n)

and y = −6 = O(n).

Then x · y = −36, which is not Ω(n), right? Am I correct?

0

$\Omega(n) = \{n, n\log n, \ldots, n^2, \ldots, n^3, \ldots\}$ and $O(n) = \{1, \ldots, \log n, \ldots, n\}$, so multiplying can give us $n$, so $\Omega(n)$.

+1
@Mahesha You are right. Here, I should have considered decreasing functions like $\frac{1}{n}$ which would decrease the value of functions on multiplication. Corrected now.
0

$\Omega(n) + O(n)=\Omega(n)$ .. right or wrong??

0
right.
0

@Arjun Sir

I have not got this line:

$h = \Theta(n)  \implies h = O(n) \wedge h = \Omega(n)$

Please explain.

Do you mean, for every non-decreasing $\Theta(n)$ function, $\Theta \left ( n \right )=\Omega \left ( n \right )$?

How is it possible??

+1
Isn't it the definition of Theta notation?
0
How??

$\Theta$ is a stronger notation than $O$, and $\Theta$ is an asymptotically tight bound. Then how can we say

$\Theta = O \wedge \Omega$ ??
+1
got it :)
+5 votes
Another approach (by taking an example):

Let $f(n) = An^2+Bn$, so $f(n)=\Omega(n)$ is satisfied; $g(n) = Cn+D$ and $h(n) = En$.

Now $f(n) \cdot g(n) = (An^2+Bn)(Cn+D) = Fn^3 + Gn^2+Hn \Rightarrow \Omega(n)$

$f(n) \cdot g(n) + h(n) = Fn^3 + Gn^2+Hn + En \Rightarrow \Omega(n), O(n^3)$

OPTION A
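A numeric check of the polynomial example above (the constants A..E are arbitrary positive picks of mine):

```python
A, B, C, D, E = 2, 3, 1, 4, 5      # arbitrary positive constants
f = lambda n: A * n ** 2 + B * n   # Omega(n)
g = lambda n: C * n + D            # O(n) is NOT satisfied here; it is the comment's example shape
h = lambda n: E * n                # Theta(n)

for n in [1, 10, 100, 1000]:
    total = f(n) * g(n) + h(n)
    assert total >= n              # Omega(n) holds (c = 1)
    assert total <= 100 * n ** 3   # and it is O(n^3) on these inputs
```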
by Active (2.2k points)
+3 votes

Let us think about it a bit practically.

We know:

if we have modules with time complexities T1 and T2, then the time complexity of the entire program = max(T1, T2).

Keeping this in mind,

let T1 correspond to f(n) and T2 correspond to g(n) · h(n).

So f(n) is guaranteed to be linear, as f(n) = θ(n).

And if g(n) · h(n) becomes larger than f(n), then the entire complexity becomes greater than a linear function of n; it will have higher powers of n. So in that case it gives ω(n) (small omega of n).

However, if this product is less than a linear function of n, then we need not bother, as f(n) is there to make up for it. So in that case the complexity becomes θ(n).

So considering the above two cases, we can say that the overall complexity, in other words the function f(n) + g(n) · h(n), is Ω(n). Hence A) is the correct answer.

by Veteran (101k points)
0

You say that θ(n) + ω(n) = Ω(n),

but here maximum(θ(n), ω(n)) = ω(n)?? Please check.

And I do not get how g(n) · h(n) = ω(n)?

0
By that I mean that big omega is nothing but a combination of small omega and theta: theta gives the tightest lower bound and small omega gives a strict (not tight) lower bound.

But big omega simply gives a lower bound; it may be a tight or a not-tight bound. That is what I meant by that statement.
0 votes
[f(n) · g(n)] + h(n)

[Ω(n) · O(n)] + Θ(n)       //* since b/w Ω(n) and O(n) ......... O(n) is the dominating one *//

O(n) + Θ(n)

O(n)
by (407 points)
0
It will be O(n²) and Ω(n).

Arjun sir explained it very well.