[Math] Simple Algorithm Running Time Analysis

Tags: algorithms, analysis-of-algorithms

A sorting algorithm takes $1$ second to sort $1,000$ items on your local machine. How long will it take to sort $10,000$ items

  1. if you believe that the algorithm takes time proportional to $n^2$, and
  2. if you believe that the algorithm takes time roughly proportional to $n\log n$?

If the algorithm takes time proportional to $n^2$, then $1,000^2=1,000,000$, and $10,000^2=100,000,000$. Dividing the latter by the former yields $100$, so the running time scales by a factor of $100$: from $1$ second to $100$ seconds. Therefore, the sorting algorithm would take $1$ minute and $40$ seconds to sort $10,000$ items.
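The same arithmetic can be written as one scaling step, assuming the constant of proportionality is the same at both input sizes so that it cancels:

$$T(10{,}000) \approx T(1{,}000)\cdot\left(\frac{10{,}000}{1{,}000}\right)^{2} = 1\text{ s}\cdot 10^{2} = 100\text{ s}.$$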

If the algorithm takes time proportional to $n\log n$, then, taking logarithms in base $10$, $1,000\log 1,000=3,000$, and $10,000\log 10,000=40,000$. Dividing the latter by the former yields $\frac{40}{3}=13\frac13$, and this ratio is the same whichever base of logarithm you use. Therefore, the sorting algorithm would take $13.\bar3$ seconds to sort $10,000$ items.
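Again as a single ratio, under the same cancelling-constant assumption; the base of the logarithm drops out:

$$\frac{T(10{,}000)}{T(1{,}000)} \approx \frac{10{,}000\,\log 10{,}000}{1{,}000\,\log 1{,}000} = 10\cdot\frac{\log 10^{4}}{\log 10^{3}} = 10\cdot\frac{4}{3} = \frac{40}{3} \approx 13.33.$$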

Does my thought process seem correct?

Best Answer

Yes, it seems correct. -Servaes
