[Math] What does asymptotically optimal mean for an algorithm

algorithms, asymptotics

What does it mean to say that

heap sort and merge sort are asymptotically optimal comparison sorts.

I know what the Big O, Big Omega ($\Omega$), and Theta ($\Theta$) notations are, and I also know why these two sorts are called comparison sorts. But I am not able to understand why they are called asymptotically optimal. Why doesn't this asymptotically optimal property also apply to quick sort?

P.S.: I had a look at this question, but I am still not clear on the definition of asymptotically optimal.

Best Answer

Computing a simple function that exactly characterizes the behavior of an algorithm on every input is difficult. So upper and lower bounds are used instead, along with probabilistic (average-case) analysis.

The term 'asymptotic' is used because the bound is typically only valid for large $n$, as an accurate formula for small $n$ is more complicated, and not really interesting from a performance perspective.

To answer your question: one can prove that any correct comparison sort requires $\Omega(n \lg n)$ comparisons in the worst case, so its running time has a lower bound of $\Omega(n \lg n)$. Consequently, if you can show that a sorting algorithm (such as heapsort or merge sort) has an upper bound of $O(n \lg n)$ on running time, then in fact you have a running time of $\Theta(n \lg n)$, which is, by definition, an asymptotically tight bound for the running time (see Cormen, Leiserson & Rivest's "Introduction to Algorithms", Chapter 2, Theorem 2.1, and the section on sorting).
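To make the $O(n \lg n)$ upper bound concrete, here is a minimal merge sort sketch in Python that counts the comparisons it performs (the function and variable names are illustrative, not from any library). On an input of length $n$, the comparison count never exceeds $n \lg n$:

```python
import math

def merge_sort(a):
    """Sort a list; return (sorted_list, number_of_comparisons)."""
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, c_left = merge_sort(a[:mid])
    right, c_right = merge_sort(a[mid:])
    merged, c_merge = [], 0
    i = j = 0
    while i < len(left) and j < len(right):
        c_merge += 1                 # exactly one comparison per iteration
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])          # leftovers need no further comparisons
    merged.extend(right[j:])
    return merged, c_left + c_right + c_merge

n = 1024
data = list(range(n, 0, -1))         # reversed input
sorted_data, comps = merge_sort(data)
print(comps, n * math.log2(n))       # comparison count stays below n*lg(n) = 10240
```

Each level of the recursion performs at most $n$ comparisons and there are $\lceil \lg n \rceil$ levels, which is where the $O(n \lg n)$ bound comes from.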

Quicksort, by contrast, has a worst-case running time of $\Theta(n^2)$, which exceeds the $\Theta(n \lg n)$ bound, so it is not asymptotically optimal.

Note, however, that quicksort has an average running time of $\Theta(n \lg n)$. (This, of course, assumes that all inputs of length $n$ are equally likely.)
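The gap between the worst case and the average case can be seen empirically. Below is a hedged sketch (a deliberately naive quicksort with a first-element pivot, not a production implementation) that counts how many times elements are compared with the pivot. On already-sorted input it performs the full $n(n-1)/2 = \Theta(n^2)$ comparisons, while a shuffled input typically needs only about $2n\ln n$:

```python
import random

def quicksort(a):
    """Return (sorted_list, partitioning_comparison_count).

    Pivot choice: first element -- deliberately naive, to expose
    the quadratic worst case on sorted input.
    """
    if len(a) <= 1:
        return a, 0
    pivot, rest = a[0], a[1:]
    less = [x for x in rest if x < pivot]
    greater = [x for x in rest if x >= pivot]
    comps = len(rest)                # each element of `rest` vs. the pivot
    sorted_less, c_less = quicksort(less)
    sorted_greater, c_greater = quicksort(greater)
    return sorted_less + [pivot] + sorted_greater, comps + c_less + c_greater

n = 500
_, worst = quicksort(list(range(n)))     # already sorted: the worst case
print(worst)                             # n*(n-1)/2 = 124750

shuffled = list(range(n))
random.shuffle(shuffled)
_, typical = quicksort(shuffled)
print(typical)                           # roughly 2*n*ln(n), far below the worst case
```

On sorted input every partition is maximally unbalanced (`less` is empty), so the recursion degrades to $ (n-1) + (n-2) + \cdots + 1 = n(n-1)/2$ comparisons; a random permutation gives balanced partitions with high probability, which is exactly the $\Theta(n \lg n)$ average-case claim above.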
