Let $f(n)$ be a positive, increasing function. Consider the two statements below:

  • S1: if an algorithm is $\Theta(f(n))$ in the average case, then it is $\Omega(f(n))$ in the worst case.
  • S2: if an algorithm is $\Theta(f(n))$ in the average case, then it is $O(f(n))$ in the best case. Here $O$ is Big-oh.

Let $g(n)$ denote the number of comparisons the algorithm performs. Does the statement "the algorithm is $\Theta(f(n))$ in the average case" mean $g(n) = \Theta(f(n))$, or $\Theta(g(n)) = \Theta(f(n))$, or $g(n) = \Theta(\Theta(f(n)))$?

Can anyone explain this using the standard definitions of the asymptotic notations? I know both statements are true. The only explanation I've received so far, for S1, is: "in the worst case the algorithm takes at least $f(n)$ time." But I'd really appreciate help understanding it mathematically.
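Here is how far I've gotten on my own with the standard definitions (this is just my attempt, so please correct me if I'm misusing them). Writing $\text{best}(n)$, $\text{avg}(n)$, $\text{worst}(n)$ for the cost on the best, average, and worst inputs of size $n$, I believe that for every $n$,
\[
\text{best}(n) \;\le\; \text{avg}(n) \;\le\; \text{worst}(n).
\]
If $\text{avg}(n) = \Theta(f(n))$, i.e. there exist $c_1, c_2 > 0$ and $n_0$ with $c_1 f(n) \le \text{avg}(n) \le c_2 f(n)$ for all $n \ge n_0$, then it seems to follow that
\[
\text{worst}(n) \;\ge\; \text{avg}(n) \;\ge\; c_1 f(n)
\quad\implies\quad \text{worst}(n) = \Omega(f(n)),
\]
\[
\text{best}(n) \;\le\; \text{avg}(n) \;\le\; c_2 f(n)
\quad\implies\quad \text{best}(n) = O(f(n)).
\]
Is this chain of inequalities the right way to justify S1 and S2 formally?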

Also, $g(n) = \Theta(f(n))$ actually means $g(n)$ belongs to the set $\Theta(f(n))$, right?
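For reference, the set-based definition I have in mind (which I believe is the standard one) is:
\[
\Theta(f(n)) = \bigl\{\, g(n) \;:\; \exists\, c_1, c_2 > 0,\; n_0 \in \mathbb{N} \text{ such that } 0 \le c_1 f(n) \le g(n) \le c_2 f(n) \ \ \forall\, n \ge n_0 \,\bigr\},
\]
so "$g(n) = \Theta(f(n))$" would be an abuse of notation for $g(n) \in \Theta(f(n))$.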
