An n log n lower bound for sorting
Sorting algorithms can be depicted as trees. The depth of the tree (the number of comparisons on the longest path from root to leaf) is exactly the worst-case time complexity of the algorithm. This way of looking at sorting algorithms is useful because it allows one to argue that mergesort is optimal, in the sense that Ω(n log n) comparisons are necessary for sorting n elements.
Here is the argument: Consider any such tree that sorts an array of n elements. Each of its leaves is labeled by a permutation of {1, 2, ..., n}. In fact, every permutation must appear as the label of some leaf. The reason is simple: if a particular permutation were missing, what would happen if we fed the algorithm an input ordered according to that permutation? The tree would end at a leaf labeled with some other permutation, and the algorithm would give the wrong answer. And since there are n! permutations of n elements, it follows that the tree has at least n! leaves.
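As a quick sanity check (not from the book), we can enumerate the possible leaf labels for a small n and confirm there are exactly n! of them:

```python
from itertools import permutations
from math import factorial

# Each leaf of the comparison tree is labeled by one permutation of
# {1, ..., n}; enumerating them for n = 4 confirms the n! count.
n = 4
leaf_labels = list(permutations(range(1, n + 1)))
print(len(leaf_labels))  # 24 = 4!
```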
We are almost done: this is a binary tree, and we argued that it has at least n! leaves. Recall now that a binary tree of depth d has at most $2^d$ leaves (proof: an easy induction on d). So the depth of our tree, and hence the worst-case complexity of our algorithm, must be at least $\log(n!)$.
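The depth bound is easy to evaluate numerically; here is a small sketch computing the minimum depth $\lceil \log_2(n!) \rceil$ that any comparison tree on n elements must have:

```python
from math import ceil, factorial, log2

# A tree with at least n! leaves and depth d satisfies n! <= 2**d,
# so d >= ceil(log2(n!)).
for n in (4, 8, 16):
    print(n, ceil(log2(factorial(n))))
```

For instance, any comparison sort of 16 elements needs at least 45 comparisons in the worst case.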
And it is well known that $\log(n!) \geq c \cdot n \log n$ for some constant c > 0. There are many ways to see this. The easiest is to notice that $n! \geq (n/2)^{n/2}$, because $n! = 1 \cdot 2 \cdot 3 \cdots n$ contains at least n/2 factors larger than n/2, and then to take logs of both sides. Another is to recall Stirling's formula:
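A brief numerical check of the first inequality, for a few even values of n:

```python
from math import factorial

# n! has n/2 factors larger than n/2 (namely n/2 + 1, ..., n), and the
# remaining factors are at least 1, so n! >= (n/2)**(n/2).
for n in (2, 4, 8, 16, 32):
    assert factorial(n) >= (n / 2) ** (n / 2)
print("bound holds")
```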
$n! \approx \sqrt{\pi \left(2n + \frac{1}{3}\right)} \cdot n^{n} \cdot e^{-n}$
Either way, we have established that any comparison tree that sorts n elements must make, in the worst case, Ω(n log n) comparisons, and hence mergesort is optimal!
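To see the two sides of the argument meet, here is a minimal comparison-counting mergesort (a sketch for illustration, not the book's code) run against the $\lceil \log_2(n!) \rceil$ bound:

```python
from math import ceil, factorial, log2

# Mergesort that counts comparisons via a one-element list `count`
# (a hypothetical helper, used only to tally left/right comparisons).
def merge_sort(a, count):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], count)
    right = merge_sort(a[mid:], count)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        count[0] += 1  # one element-to-element comparison
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out += left[i:] + right[j:]
    return out

n = 16
count = [0]
result = merge_sort(list(range(n, 0, -1)), count)
print(count[0], ceil(log2(factorial(n))))
```

Note that the lower bound constrains the worst case over all inputs; any particular input (such as the reversed array above) may be sorted with fewer comparisons.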
Source: Algorithms by Dasgupta, Papadimitriou, Vazirani, p. 59