[Math] Derive Time from Sorting Method/Time Complexity

algorithms, analysis-of-algorithms, asymptotics, computer-science

A sorting method with “Big-Oh” complexity O(n log n) takes exactly 1
millisecond to sort 1,000 data items. Assuming that the time T(n) to sort
n items is directly proportional to n log n, that is, T(n) = cn log n, derive
a formula for T(n) given the time T(N) for sorting N items, and estimate
how long this method will take to sort 1,000,000 items.

I know that you can use the formal definition of Big O, namely T(n) = c*f(n) for a given O(f(n)), to solve for c and then use that value to solve the problem. But I don't understand how or why this works.

Best Answer

I don't think the big O notation is needed or useful for this problem at all.

Setting $ n = N $ in

$$ T(n) = c \, n \log n $$

yields

$$ T(N) = c \, N \log N $$

and solving for $ c $ yields the following.

$$ c = \frac{T(N)}{N \log N} $$

Now plugging this expression for $ c $ back into the first equation, and introducing $ N $ and $ T(N) $ (which I will just call $ R $) as new parameters, yields the following function.

$$ T(n, N, R) = \frac{R}{N \log N} n \log n$$

This function gives the runtime of the algorithm for input size $ n $, given the runtime $ R $ for input size $ N $. Note that the equation is independent of the base of the logarithm, which was not explicitly stated in the question.
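To see why, write the logarithm in an arbitrary base $ b $ as $ \log_b x = \ln x / \ln b $; the factor $ 1/\ln b $ cancels in the ratio:

$$ \frac{n \log_b n}{N \log_b N} = \frac{n \, \ln n / \ln b}{N \, \ln N / \ln b} = \frac{n \ln n}{N \ln N} $$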

Finally, using $ n = 1\,000\,000 $, $ N = 1\,000 $, and $ R = 1 \,\mathrm{ms} $, we obtain a runtime of $ 2\,000 \,\mathrm{ms} $ for sorting $ 1\,000\,000 $ items.
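Spelled out with base-10 logarithms (any other base gives the same factor, as noted above):

$$ T(10^6) = \frac{1\,\mathrm{ms}}{1\,000 \cdot \log_{10} 1\,000} \cdot 10^6 \cdot \log_{10} 10^6 = 1\,\mathrm{ms} \cdot 1\,000 \cdot \frac{6}{3} = 2\,000\,\mathrm{ms} $$

That is, about 2 seconds.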
