We can extend our notation to the case of two parameters $n$ and $m$ that can go to infinity independently at different rates. For a given function $g(n,m)$, we denote by $O(g(n,m))$ the set of functions

$O(g(n,m)) = \{f(n,m) :$ there exist positive constants $c$, $n_0$, and $m_0$ such that $0 \leq f(n,m) \leq cg(n,m)$ for all $n \geq n_0$ or $m \geq m_0\}$.

Give corresponding definitions for $\Omega(g(n,m))$ and $\Theta(g(n,m))$.
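
One possible sketch of the corresponding definitions, obtained by mirroring the structure of the $O$-definition above and keeping the same "for all $n \geq n_0$ or $m \geq m_0$" condition:

$\Omega(g(n,m)) = \{f(n,m) :$ there exist positive constants $c$, $n_0$, and $m_0$ such that $0 \leq cg(n,m) \leq f(n,m)$ for all $n \geq n_0$ or $m \geq m_0\}$

$\Theta(g(n,m)) = \{f(n,m) :$ there exist positive constants $c_1$, $c_2$, $n_0$, and $m_0$ such that $0 \leq c_1 g(n,m) \leq f(n,m) \leq c_2 g(n,m)$ for all $n \geq n_0$ or $m \geq m_0\}$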


Related questions

Prove that the running time of an algorithm is $\Theta(g(n))$ if and only if its worst-case running time is $O(g(n))$ and its best-case running time is $\Omega(g(n))$.

Explain why the statement, “The running time of algorithm A is at least $O(n^2)$,” is meaningless.