17 votes

Let $f(n) = \Omega(n)$, $g(n) = O(n)$ and $h(n) = \Theta(n)$. Then $[f(n) \cdot g(n)] + h(n)$ is:

  1. $\Omega(n)$
  2. $O(n)$
  3. $\Theta(n)$
  4. None of these
in Algorithms 4.8k views
0

Sir, how can

$f(n) \cdot g(n) = \Omega(n)$ ?

0
Sorry, that's true only for non-decreasing functions. But the answer won't change here.
0

$f(n) = \Theta(n)$, so we can write $c_1 n \leq f(n) \leq c_2 n$        [equation 1]

$g(n) = \Omega(n)$, so we can write $g(n) \geq c_3 n$                     [equation 2]

$h(n) = O(n)$, so we can write $h(n) \leq c_4 n$, i.e. $-h(n) \geq -c_4 n$   [equation 3]

Now using equations 2 and 3:

$g(n) \cdot h(n) \geq (c_3 \cdot c_4) n$

$g(n) \cdot h(n) \geq c_5 n$    where $c_5 = c_3 \cdot c_4$

$g(n) \cdot h(n) \leq c_5 n$

Add equation 1, i.e. $f(n) \leq c_2 n$, to the above:

$f(n) + [g(n) \cdot h(n)] \leq (c_2 + c_5) n$

$f(n) + g(n) - h(n) \leq c_6 n$   where $c_6 = c_2 + c_5$

Here all $c_i$ are constants.

$f(n) + g(n) - h(n) = O(n)$. Hence option A.

0
It's not minus, it's a dot. Please check?
0
Oh sorry.
0
I am not getting it. I need a good explanation, please explain properly.
0
it is $\Omega (n)$
0
yes..
0
Utkarsh, share your approach :P
0
Got it, I was making a mistake.
0

Why not $\Theta(n^2)$? Please explain.

0
What can be the $\Omega$ of this expression?

$f(n)$ can be $\Omega(1)$, $g(n)$ can be $\Omega(n)$ and $h(n)$ can be $\Omega(n)$,

so the time complexity will be $\Omega(n)$.

We cannot determine the time complexity in $\Theta$ because we don't know the $\Theta$ of $f(n)$ and $g(n)$.
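For example, two admissible instances (chosen only for illustration, within the bounds given in the question) give different $\Theta$ classes:

$$f(n) = n,\ g(n) = 1,\ h(n) = n \implies [f(n) \cdot g(n)] + h(n) = 2n = \Theta(n)$$

$$f(n) = n,\ g(n) = n,\ h(n) = n \implies [f(n) \cdot g(n)] + h(n) = n^2 + n = \Theta(n^2)$$

Since the $\Theta$ class changes with the choice of functions, only the lower bound $\Omega(n)$ can be concluded.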
0
it will be $O(n^{2})$
0
Option B
0
My answer is also $B$, but $A$ is provided as the answer.
1
Assume:

$f(n) = n^2$

$g(n) = \log n$

$h(n) = n$

$f(n) \cdot g(n) = n^2 \log n$

$[f(n) \cdot g(n)] + h(n) = n^2 \log n + n = \Omega(n)$
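A quick numeric sketch of this example (the functions are the assumed instances above; the script only checks that the expression grows at least linearly):

```python
import math

# Assumed example functions from the comment above:
# f(n) = n^2 is Omega(n), g(n) = log n is O(n), h(n) = n is Theta(n).
def f(n): return n * n
def g(n): return math.log(n)
def h(n): return n

for n in (10, 100, 1000, 10000):
    expr = f(n) * g(n) + h(n)
    # The ratio to n stays bounded away from zero (here it even grows),
    # which is consistent with the Omega(n) conclusion.
    print(n, expr / n)
```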
0
Is there any method without assumptions?
0
How did you assume $f(n)$ is $\Omega(1)$ and $h(n)$ is $\Omega(n)$?

7 Answers

25 votes
 
Best answer
$f(n) \cdot g(n)$ - here the individual bounds are $\geq n$ and $\leq n$ respectively. So, for their product we can only say it is $\geq n$, i.e. $\Omega(n)$, provided $g$ is a non-decreasing function. But $(h = \Theta(n))  \implies \left[h = O(n) \text{ AND } h = \Omega(n)\right]$.

So,$$[f(n). g(n)] + h(n) = \Omega(n).$$
(whatever be the complexity of $f(n).g(n)$, $h(n)$ grows at least at the rate of $n$ and hence the whole expression too, due to '+')
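A minimal formal sketch of this step (assuming $f(n)\,g(n) \geq 0$ for sufficiently large $n$, as is usual for running times):

$$h(n) = \Theta(n) \implies \exists\, c_1 > 0,\ n_0 \text{ such that } h(n) \geq c_1 n \text{ for all } n \geq n_0$$

$$f(n)\,g(n) + h(n) \;\geq\; h(n) \;\geq\; c_1 n \implies f(n)\,g(n) + h(n) = \Omega(n)$$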

0
None of these... the answer is D.

We cannot say anything about functions which may increase or decrease.
0
Which function here may increase or decrease?
0
Sir, if $f(n) = \Theta(n)$ and $g(n) = O(n)$, then $f(n) \cdot g(n) = O(n)$, right or wrong?
5

No.

$(= n) \cdot (\leq n)$ gives $(\geq n)$ and $(\leq n^2)$,

i.e., $\Omega(n)$ and $O(n^2)$.
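A concrete pair (chosen only for illustration) that shows both bounds can be met:

$$f(n) = n = \Theta(n),\quad g(n) = \log n = O(n) \implies f(n) \cdot g(n) = n \log n,$$

which is $\Omega(n)$ and $O(n^2)$, but neither $\Theta(n)$ nor $\Theta(n^2)$.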

0
Sir, actually I am having confusion in solving such computations. I am not getting the intuition for finding the result of the product. I mean, if we multiply $\Theta(n) \cdot \Omega(n)$ then it equals $\Omega(n)$, acceptable, but $\Theta(n) \cdot O(n) = O(n^2)$? I couldn't get this. Sorry sir, but I am stuck on such computations.
12
Solve as with numbers. Say $x = 5$ and $y \leq 25$. Now, $x \cdot y = {}$? (Assume $x, y \geq 1$.)

We can say it is $\geq 5$ and $\leq 125$, right?
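A tiny brute-force sketch of this analogy (the ranges $x = 5$, $1 \leq y \leq 25$ are the assumed ones above):

```python
# x is fixed at 5; y can be any integer from 1 to 25 (assumed ranges).
products = [5 * y for y in range(1, 26)]
print(min(products), max(products))  # prints: 5 125, i.e. 5 <= x*y <= 125
```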
0

$\Omega(n) + \Theta(n) = \Omega(n)$

Please explain this part, @Arjun Sir.

4
$x > 5$, $y = 100$. Now, $x + y$? Surely $> 5$, right? Same analogy.
0
Got it thank you Sir
2

Sir, about the original question, how is $\Omega(n) \cdot O(n) = \Omega(n)$?

We can't say anything, right?

Let's say $n = 5$; then $x = 6 = \Omega(n)$

and $y = -6 = O(n)$,

then $x \cdot y = -36$, which is not $\Omega(n)$, right? Am I correct?

0

$\Omega(n) = \{n, n\log n, \ldots, n^2, \ldots, n^3, \ldots\}$ and $O(n) = \{1, \ldots, \log n, \ldots, n\}$, so the multiplication can give us $n$, hence $\Omega(n)$.

1
@Mahesha You are right. Here, I should have considered decreasing functions like $\frac{1}{n}$ which would decrease the value of functions on multiplication. Corrected now.
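A minimal counterexample along these lines (functions assumed only for illustration):

$$f(n) = n \in \Omega(n),\quad g(n) = \frac{1}{n} \in O(n) \implies f(n) \cdot g(n) = 1 \notin \Omega(n)$$

Yet $f(n) \cdot g(n) + h(n) \geq h(n) = \Theta(n)$, so the full expression is still $\Omega(n)$.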
0

$\Omega(n) + O(n) = \Omega(n)$ ... right or wrong?

0
right.
0

@Arjun Sir

I have not got this line:

$h = \Theta(n)  \implies h = O(n) \wedge h = \Omega(n)$

Please explain.

Do you mean that every non-decreasing $\Theta(n)$ function is also $\Omega(n)$?

How is it possible?

1
Isn't it the definition of Theta notation?
0
How??

$\Theta$ is a stronger notation than $O$, and $\Theta$ is an asymptotically tight bound. Then how can we say

$O \wedge \Theta = \Omega$ ??
1
got it :)
8 votes
Another approach (by taking an example):

Let $f(n) = An^2 + Bn$, so $f(n) = \Omega(n)$ is satisfied, $g(n) = Cn + D$ and $h(n) = En$.

Now $f(n) \cdot g(n) = (An^2 + Bn)(Cn + D) = Fn^3 + Gn^2 + Hn \Rightarrow \Omega(n)$

$f(n) \cdot g(n) + h(n) = Fn^3 + Gn^2 + Hn + En \Rightarrow \Omega(n),\ O(n^3)$

OPTION A
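A quick numeric check of this example, with assumed coefficients $A = B = C = D = E = 1$ (any positive constants would do):

```python
# Hypothetical coefficients A = B = C = D = E = 1 for the example above.
def f(n): return n * n + n   # An^2 + Bn
def g(n): return n + 1       # Cn + D
def h(n): return n           # En

for n in (10, 100, 1000):
    expr = f(n) * g(n) + h(n)
    # expr / n grows (consistent with Omega(n));
    # expr / n^3 stays bounded (consistent with O(n^3)).
    print(n, expr / n, expr / n ** 3)
```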
0
nice approach sir
3 votes

Let us think about it a bit practically.

We know that

if we have modules with time complexities $T_1$ and $T_2$, then the time complexity of the entire program is $\max(T_1, T_2)$.

Keeping this in mind,

let $T_1$ correspond to $f(n)$ and $T_2$ correspond to $g(n) \cdot h(n)$.

So $f(n)$ is guaranteed to be linear, as $f(n) = \Theta(n)$.

And if $g(n) \cdot h(n)$ becomes larger than $f(n)$, then the entire complexity becomes more than a linear function of $n$; it will have higher powers of $n$, so in that case it gives $\omega(n)$ (small omega of $n$).

However, if this product is less than a linear function of $n$, then we need not bother, as $f(n)$ is there to make up for it. So in that case the complexity becomes $\Theta(n)$.

Considering the above two cases, we can say that the overall complexity, in other words the function $f(n) + g(n) \cdot h(n)$, is $\Omega(n)$. Hence option A is the correct answer.
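As a side note, the "max rule" used above can be justified in one line (assuming non-negative running times $T_1$ and $T_2$):

$$\max(T_1, T_2) \;\leq\; T_1 + T_2 \;\leq\; 2\max(T_1, T_2) \implies T_1 + T_2 = \Theta\big(\max(T_1, T_2)\big)$$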

0

You say that $\Theta(n) + \omega(n) = \Omega(n)$,

but here $\max(\Theta(n), \omega(n)) = \omega(n)$?? Please check.

And I don't get how $g(n) \cdot h(n) = \omega(n)$?

0
By that I mean to say that big omega is nothing but the combination of small omega and theta: theta gives the tight lower bound and small omega gives the non-tight (strict) lower bound.

But big omega simply gives a lower bound; it may be a tight or a non-tight bound. That is what I meant by that statement.
1 vote

..............

0
@abhishekmehta4u

We can't say for sure that the complexity is $O(n^2)$,

as $g(n)$ is $\Omega(n)$, not $O(n)$.

When we don't know the upper bound of $g(n)$, we can't find the upper bound of $g(n) \cdot h(n)$.

However, we can say for sure that it is at least $\Omega(n)$.
0
Can you explain the steps? I am not able to understand how you got that.

If any links are available, that would also be nice!

@abhishekmehta4u
1
@vijju

What are the max and min values for the term $[g(n) \cdot h(n)]$? The min would be $n$, as $g(n)$ can't go below $n$ and $h(n)$ can become constant; that case would give the minimum value.

Now think about what its max value can be: you can't predict it.

So for the term $[g(n) \cdot h(n)]$, what you can surely say is that its minimum value would be $n$, i.e. $\Omega(n)$.

Now the total expression is $f(n) + [g(n) \cdot h(n)]$. $f(n)$ is $\Theta(n)$; it can't go beyond $n$ or below $n$. So from the total expression you can say that its value will always be greater than or equal to $n$, i.e. $\Omega(n)$.
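For illustration, two concrete instances in this comment's labelling (chosen arbitrarily, only to show the spread):

$$g(n) = n,\ h(n) = 1,\ f(n) = n \implies f(n) + g(n) \cdot h(n) = 2n = \Theta(n)$$

$$g(n) = n^2,\ h(n) = n,\ f(n) = n \implies f(n) + g(n) \cdot h(n) = n^3 + n = \Theta(n^3)$$

So only the lower bound $\Omega(n)$ is guaranteed in general.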
0 votes
$[f(n) \cdot g(n)] + h(n)$

$[O(n) \cdot \Omega(n)] + \Theta(n)$       (since between $O(n)$ and $\Omega(n)$, $O(n)$ is the dominating one)

$O(n) + \Theta(n)$

$O(n)$
0
It will be O(n²) and Ω(n).

Arjun sir explained it very well.
0 votes

check this solution.

0 votes
