
The tightest lower bound on the number of comparisons, in the worst case, for comparison-based sorting is of the order of

1. $n$
2. $n^2$
3. $n \log n$
4. $n \log^2n$
In the above question, if we take the tightest upper bound instead, what will be the answer?

On a light note: with infinite computation and memory power available, God would keep an archive of the sorted order of all possible objects and simply use hashing to look up the sorted permutation for any given input.

Yeah, Thug Life.

For comparison-based sorting, the asymptotically tight bound for the worst case is $\Theta(n \log n)$, which means it is both the tightest upper bound (big $O$) and the tightest lower bound (big $\Omega$). So, the answer is $n \log n$.

The tightest lower bound of sorting (say $S(n)$) being $n \log n$ means there is no function $f$ with an order of growth larger than $n \log n$ for which $S(n) = \Omega(f(n))$ holds, i.e., no asymptotically larger function is also a valid lower bound.

A usual mistake is to think the worst case changes with lower and upper bounds, but that is not so. The worst case is defined for the algorithm: it is the input that causes the algorithm its maximum running time.

Correct Answer: $C$
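To make the $\Theta(n \log n)$ claim concrete, here is a small Python sketch (the function names and structure are my own, not from the answer) that counts the key comparisons merge sort makes over every permutation of $n = 8$ elements, and compares the worst case against the decision-tree minimum $\lceil \log_2 n! \rceil$:

```python
import math
from itertools import permutations

def merge_sort(a, count):
    """Sort a, accumulating the number of key comparisons in count[0]."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], count)
    right = merge_sort(a[mid:], count)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        count[0] += 1  # one key comparison
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

n = 8
info_lower = math.ceil(math.log2(math.factorial(n)))  # decision-tree lower bound
worst = 0
for p in permutations(range(n)):
    c = [0]
    assert merge_sort(list(p), c) == list(range(n))
    worst = max(worst, c[0])
print(info_lower, worst)  # 16 17
```

For $n = 8$ the information-theoretic minimum is $\lceil \log_2 8! \rceil = 16$ comparisons, while merge sort's worst case (per the recurrence $W(n) = W(\lceil n/2 \rceil) + W(\lfloor n/2 \rfloor) + n - 1$) is 17, so merge sort is within one comparison of optimal at this size.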

@Arjun Sir, the first statement of the answer says:

For comparison-based sorting the asymptotically tight bound for worst case is given by $\Theta(n \log n)$

This is for optimal comparison-based sorts (heap sort, merge sort), right?

Because the worst case of every comparison-based sort is not $n \log n$. Reading this line, it seems to be true for all of them, but algorithms like insertion sort and bubble sort do not fit this. Please clear up this point.


@Rajendra Dangwal, I have the same doubt. If you have understood it now, please help me.

For the worst case, the tightest lower bound over all comparison-based sorting algorithms means: in the worst case, which sorting algorithm takes the least time to sort? So it is $\Omega(n \log n)$, because in the worst case we cannot sort in less time than this; if it were an upper bound, it could be arbitrarily large.

For the best case, the tightest lower bound means: in the best case, which sorting algorithm takes the least time to sort? So it is $\Omega(n)$, because we cannot sort in less time than this; again, if it were an upper bound, I think it could be arbitrarily large.

Please correct me if I am wrong.
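The best-case claim above can be checked with a quick sketch (a hypothetical helper of my own, assuming a standard insertion sort): on an already-sorted input, insertion sort performs exactly $n - 1$ comparisons, matching the $\Omega(n)$ best-case bound, while a reversed input forces the full $n(n-1)/2$:

```python
def insertion_sort_comparisons(values):
    """Return the number of key comparisons insertion sort makes."""
    a = list(values)
    comps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comps += 1  # compare a[j-1] with a[j]
            if a[j - 1] <= a[j]:
                break   # already in order: stop scanning left
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return comps

print(insertion_sort_comparisons(range(10)))         # sorted input: n-1 = 9
print(insertion_sort_comparisons(range(9, -1, -1)))  # reversed input: n(n-1)/2 = 45
```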
@Arjun Sir, none of the comparison-based sorts takes $n \log n$ time for comparisons. Merge sort takes $n$ time for comparisons even in the worst case. So how are you using $\Omega(n \log n)$? I am in serious confusion regarding this.

@Arjun Sir, please respond to @rahul sharma 5's doubt. I am confused about why it is $\Theta$ and not just $\Omega$ in your answer. Also, sir, if it were the upper bound, would the answer be $O(n^2)$?



@rahul sharma 5, you are right!

@Arjun sir,

If we are considering general comparison-based sorting, and not specific algorithms in their worst cases, then saying the tight bound is $n \log n$ means both big-O and big-$\Omega$ are $n \log n$ for any sorting algorithm.

However, there exist sorting algorithms like insertion sort which in the worst case take $O(n^2)$. So how can we claim that a tight bound, implying big-O, for sorting is $n \log n$?

Thank you.
I think you are saying the opposite of what you should say?

Hi Sir, I didn't get the line:

Tightest lower bound of sorting (say $S(n)$) is $n \log n$ means there is no function $f$ which has an order of growth larger than $n \log n$ for which the lower-bound relation holds.

My question is: if $f$ has an order of growth larger than $n \log n$, isn't $f$ bounded below by $S(n)$? Have I interpreted your statement wrongly?


If $f$ has an order of growth larger than $n \log n$, for example $f(n) = n^2$ with $S(n) = n \log n$, then indeed $f(n) = \Omega(S(n))$ holds: $n \log n$ lies below $n^2$. But the reverse relation, $S(n) = \Omega(f(n))$, cannot hold.

What the statement intends is that no function growing strictly faster than $n \log n$ can itself be a lower bound for sorting. The two conditions, "$f$ grows strictly faster than $n \log n$" and "$f$ is a valid lower bound for $S$", cannot occur simultaneously, and that is exactly what makes $n \log n$ the tightest lower bound.

There is no comparison-based sorting algorithm which can sort in less than $n \log n$ comparisons in the worst case. For a proof of this statement, please refer to https://www.youtube.com/watch?v=voftwFuqH4I or check CLRS. So this is the tightest lower bound for comparison-based sorting in the worst case.
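A tiny illustration of the pigeonhole idea behind that proof (a sketch of my own, using one fixed non-adaptive strategy on $n = 3$): two comparisons give at most $2^2 = 4$ outcome patterns, which cannot distinguish all $3! = 6$ orderings, so some orderings look identical to the algorithm. An adaptive decision tree of depth $d$ likewise has at most $2^d$ leaves.

```python
from itertools import permutations

# With only 2 comparisons there are at most 2^2 = 4 outcome patterns,
# but 3! = 6 input orderings, so at least two distinct orderings
# must produce the same pattern (pigeonhole).
outcomes = {}
for p in permutations([1, 2, 3]):
    pattern = (p[0] < p[1], p[1] < p[2])  # a fixed pair of comparisons
    outcomes.setdefault(pattern, []).append(p)

collisions = [group for group in outcomes.values() if len(group) > 1]
print(len(outcomes), len(collisions))  # 4 patterns, 2 colliding groups
```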

Hi @Chhotu,

I have a doubt regarding tight upper and tight lower bounds.

If we specify the tightest lower bound for an algorithm, does it mean that both big-O and big-$\Omega$ are the same, which is itself the tight bound? Am I correct?

Thank you

An n log n lower bound for sorting

Sorting algorithms can be depicted as trees. The depth of the tree, i.e., the number of comparisons on the longest path from root to leaf, is exactly the worst-case time complexity of the algorithm. This way of looking at sorting algorithms is useful because it allows one to argue that mergesort is optimal, in the sense that $\Omega(n \log n)$ comparisons are necessary for sorting $n$ elements.

Here is the argument: consider any such tree that sorts an array of $n$ elements. Each of its leaves is labeled by a permutation of $\{1, 2, \ldots, n\}$. In fact, every permutation must appear as the label of some leaf. The reason is simple: if a particular permutation were missing, what would happen if we fed the algorithm an input ordered according to that permutation? And since there are $n!$ permutations of $n$ elements, it follows that the tree has at least $n!$ leaves.

We are almost done: this is a binary tree, and we argued that it has at least $n!$ leaves. Recall now that a binary tree of depth $d$ has at most $2^d$ leaves (proof: an easy induction on $d$). So the depth of our tree, and hence the complexity of our algorithm, must be at least $\log(n!)$.
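The bound $\lceil \log_2 n! \rceil$ is easy to tabulate; a quick sketch comparing it with $n \log_2 n$ for a few sizes:

```python
import math

# Minimum worst-case comparisons from the leaf-counting argument:
# a correct comparison sort's decision tree needs depth >= log2(n!).
lower_bounds = []
for n in [4, 8, 16, 32]:
    lb = math.ceil(math.log2(math.factorial(n)))
    lower_bounds.append(lb)
    print(n, lb, round(n * math.log2(n)))  # n, ceil(log2 n!), n*log2(n)
```

The lower bound ($5, 16, 45, 118$ for these sizes) stays within a constant factor of $n \log_2 n$, as the $\log(n!) \geq c \cdot n \log n$ inequality below guarantees.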

And it is well known that $\log(n!) \geq c \cdot n \log n$ for some $c > 0$. There are many ways to see
this. The easiest is to notice that $n! \geq (n/2)^{n/2}$, because $n! = 1 \cdot 2 \cdot 3 \cdots n$ contains at least $n/2$ factors larger than $n/2$; then take logs of both sides. Another is to recall Stirling's formula:
$n! \approx \sqrt{\pi \left(2n + \tfrac{1}{3}\right)} \cdot n^{n} \cdot e^{-n}$
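The first inequality can be sanity-checked numerically; a small sketch of my own (for even $n$):

```python
import math

# Check n! >= (n/2)^(n/2) for even n; taking log2 of both sides gives
# log2(n!) >= (n/2) * log2(n/2), i.e. log(n!) = Omega(n log n).
for n in [2, 8, 32, 128]:
    half = n // 2
    assert math.factorial(n) >= half ** half
    print(n, math.log2(math.factorial(n)), half * math.log2(half) if half > 1 else 0)
```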

Either way, we have established that any comparison tree that sorts n elements must make, in the worst case,  Ω(n log n) comparisons, and hence mergesort is optimal!

Source: Algorithms by Dasgupta, Papadimitriou, Vazirani, p. 59

This question asks for the number of comparisons needed in the worst case to sort the elements, so why are we taking references to merge sort or any other algorithm? (In merge sort, the number of comparisons in a single merge is $O(n)$.) If we go by the traditional counting approach: we have $N$ elements to sort, and there are $N!$ ways to permute them, out of which exactly one sequence has the correct sorted order.

If the algorithm always completes after at most $f(N)$ comparisons, it cannot distinguish more than $2^{f(N)}$ cases, because the keys are distinct and each comparison has only two possible outcomes. Therefore,

$2^{f(N)} \geq N!$, or equivalently $f(N) \geq \log(N!)$.

Since $\log(N!)$ is $\Omega(N \log N)$, the answer is $N \log N$.
