
Consider two statements below -

• $\text{S1}:$ For all positive $f(n), f(n) + o(f(n)) = \Theta(f(n)).$
• $\text{S2}:$ For all positive $f(n), g(n)$ and $h(n),$ if $f(n) = O(g(n))$ and $f(n) = \Omega(h(n)),$ then $g(n) + h(n) = \Omega(f(n)).$

Which of the following is the correct option?

1. $\text{S1}$ is True but $\text{S2}$ is False.
2. $\text{S2}$ is True but $\text{S1}$ is False.
3. Both are True.
4. Both are False.

Tbh everything that has been taught in class is enough to solve this.

When you write $f(n) + o(f(n)),$ this $o(f(n))$ cannot be a set of functions (common sense), so it must be an element of the set $o(f(n)).$

For formal definitions and terms, read "Asymptotic notation in equations and inequalities" from CLRS; it's in Chapter 3, as mentioned by {ankitgupta}.
If that's the case, then writing

"$o(f(n))$ is strictly less than $f(n)$" would be a valid statement,

but ankitgupta said we can't write it like that.

This is the issue.

@jiren, I wanted to get things right. It's up to you (or anyone) whether to trust a good book or not. And this book is written by a Turing Award winner, and the Turing Award is considered the Nobel Prize of computer science.

You are assuming that asymptotic analysis only means $n \geq n_0,$ i.e., that it is only about sufficiently large $n,$ which is not completely true.

There are entire books written on this single topic. Anyone who gets free time can read about how these asymptotic notations came into the picture and why we use $n \geq n_0$ or $n \leq n_0,$ etc.

The answer is option 3, i.e., both statements are true.
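For completeness, here is a quick proof sketch. (This uses the standard "sufficiently large $n$" definitions from CLRS Chapter 3, so it assumes that reading of the notation.)

$\text{S1}:$ Let $g(n) \in o(f(n)).$ Taking $c = 1$ in the definition of little-$o,$ there is an $n_0$ such that $0 \le g(n) \le f(n)$ for all $n \ge n_0.$ Then $f(n) \le f(n) + g(n) \le 2f(n)$ for all $n \ge n_0,$ so $f(n) + g(n) = \Theta(f(n)).$

$\text{S2}:$ From $f(n) = O(g(n)),$ there are $c_1 > 0$ and $n_1$ such that $f(n) \le c_1\, g(n)$ for all $n \ge n_1.$ Since $h(n) > 0,$ we get $g(n) + h(n) \ge g(n) \ge \frac{1}{c_1} f(n)$ for all $n \ge n_1,$ so $g(n) + h(n) = \Omega(f(n)).$ (The hypothesis $f(n) = \Omega(h(n))$ is not even needed.)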

Yeah, we are discussing that comment only.

What do you want to convey?

There are two things I wanted to convey. Summarising below (most of this is copied from the above comments):

Assuming $f$ and $g$ are positive functions,

$1)$ If we write: $f(n) + o(f(n)) = \Theta (f(n))$

It means for “any” function $g(n) \in o(f(n)),$ there is “some” function $h(n) \in \Theta(f(n))$ such that

$f(n) + g(n) = h(n)$ for all $n.$

(Remember this equation should be true for all $n,$ not only for large $n.$)

(Now, based on this, we should have to write things for first statement.)
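Here is a small numeric sanity check of statement $\text{S1}$ on one concrete pair (the pair $f(n) = n^2,$ $g(n) = 10n$ is my own choice for illustration, not from the statement itself). For $n \ge 10$ we have $g(n) \le f(n),$ so the sum is sandwiched between $1 \cdot f(n)$ and $2 \cdot f(n),$ exactly what $\Theta(f(n))$ asks for from some $n_0$ onward:

```python
# Hypothetical example pair for S1: f(n) = n^2, g(n) = 10n, so g is in o(f).
def f(n):
    return n * n

def g(n):
    return 10 * n

# For n >= 10 we have g(n) <= f(n), hence
#   f(n) <= f(n) + g(n) <= 2 * f(n),
# i.e. f + g is sandwiched between constant multiples of f.
for n in range(10, 100_000):
    s = f(n) + g(n)
    assert f(n) <= s <= 2 * f(n)

print("f + g lies within [f, 2f] for every tested n >= 10")
```

Of course a loop over finitely many $n$ proves nothing by itself; it just illustrates what the constants $c_1 = 1,$ $c_2 = 2,$ $n_0 = 10$ look like for this pair.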

$2)$ If we write $o(f(n)) < f(n),$ it means that for any function $g(n) \in o(f(n)),$ we have $g(n) < f(n)$ for all $n.$

(As in case of equation, this inequality should also be true for all $n.$)

If we write $g(n) = o(f(n)),$ then for some $n$ you can still have $g(n) > f(n).$ But the interpretation above requires $g(n) < f(n)$ for all $n,$ so the statement might not be correct for all positive functions $f(n),$ because the inequality may not hold for every $n.$

For example, take $f(n) = n^2$ and $g(n) = 10n.$ Here we have $g(n) = o(f(n)),$ but $g(2) > f(2).$

So $10n \in o(n^2),$ yet the inequality $10n < n^2,$ which would have to hold for all $n,$ fails at $n = 2.$

We might loosely say $o(f) < f.$
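The counterexample above can be checked numerically; the point is that $g = o(f)$ is a statement about the limit of $g(n)/f(n),$ not a pointwise inequality:

```python
# Same pair as in the example above: f(n) = n^2, g(n) = 10n.
def f(n):
    return n * n

def g(n):
    return 10 * n

# The ratio g(n)/f(n) = 10/n tends to 0, consistent with g = o(f) ...
assert g(10**6) / f(10**6) < 1e-4
# ... yet the pointwise inequality g(n) < f(n) fails at n = 2:
assert g(2) > f(2)  # 20 > 4
print("g/f -> 0, but g(2) > f(2)")
```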

section “Asymptotic notation in equations and inequalities”  in Chapter 3

section 2.5

We are saying $f(n) < g(n)$ asymptotically, not exactly. So even if $f(n) > g(n)$ for some constant values, this doesn't matter, as we are speaking asymptotically.

Also, when we say $f(n) > g(n)$ asymptotically, it means $f(n)$ grows faster than $g(n)$ for all values $n \geq n_0.$

Notice that in the above we only consider values greater than or equal to $n_0,$ not every value of $n.$
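To make the $n \geq n_0$ idea concrete, here is a small sketch that finds such a threshold for the running example $f(n) = n^2,$ $g(n) = 10n$ (the test horizon `N` is an arbitrary choice; the threshold is specific to this pair of functions):

```python
# Sketch: find the smallest n0 such that g(n) < f(n) for every tested n >= n0.
def f(n):
    return n * n

def g(n):
    return 10 * n

N = 1000  # test horizon, an arbitrary choice for this sketch
n0 = next(n for n in range(1, N)
          if all(g(m) < f(m) for m in range(n, N)))
print(n0)  # 11, since 10n < n^2 exactly when n > 10
```

So the inequality fails for small $n$ (e.g. $n = 2$) but holds for all $n \geq 11,$ which is all the asymptotic statement requires.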

What I really don't understand is this:

$f(n) + o(f(n)) = \Theta(f(n))$

Here you said this equality is valid because we take $o(f(n))$ as an anonymous function and not as a set.

But then you write that $g(n)$ belongs to $o(f(n)),$ which contradicts your statement above, because there you took $o(f(n))$ as a function and not as a set.

What exactly is the convention that you are following?

$o(f(n))$ in an equality or inequality is a:

1. set
2. anonymous function

Please upvote if you like the explanation :)
