[Math] Difference between asymptotic bound and running time

algorithms, analysis-of-algorithms, recurrence-relations, recursive-algorithms

This question might be trivial, but I really don't see the fine line here. I want to understand when I should say "asymptotic bound" of an algorithm versus "running time" of an algorithm. I understand that running time refers to the time an algorithm takes to run on a given input, which we often express as something like O(n). However, I don't know when to say "asymptotic bound"! I keep using the two interchangeably, but I feel there is a difference. Could someone please clarify these terms?

Thank you

Best Answer

There is in fact an answer on SO: https://stackoverflow.com/questions/4915842/difference-between-time-complexity-and-running-time.

The difference is that an asymptotic bound describes how the running time grows as the input size tends to infinity: it keeps only the dominant term and drops constant factors, instead of giving a concrete time for a concrete input.

Let's take an example:

Suppose the runtime of your program is given by $f(x)=1000x$ seconds, where $x$ is the input size.
For small inputs this can still take a long time in absolute terms: an input of size 1 takes 1000 seconds.

Suppose another program runs in $g(x)=x^2$ seconds. For the same input of size 1, the runtime is 1 second, so it seems faster (and it is, for that input). But what happens when the input gets large? The two runtimes cross at $x=1000$, where $1000x = x^2$, and beyond that point the quadratic program falls further and further behind; the sketch below makes this concrete.
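A minimal sketch (the function names and the sample sizes here are my own illustrative choices, not from the original answer) that tabulates both hypothetical runtimes and shows the crossover at $x = 1000$:

```python
# Hypothetical runtimes, in seconds, of the two programs above,
# as functions of the input size x.
def runtime_linear(x):
    return 1000 * x       # f(x) = 1000x

def runtime_quadratic(x):
    return x * x          # g(x) = x^2

# Compare the two at a few input sizes: the quadratic program wins for
# small x, ties at x = 1000 (where 1000x = x^2), and loses after that.
for x in [1, 10, 100, 1000, 10_000, 100_000]:
    f, g = runtime_linear(x), runtime_quadratic(x)
    faster = "quadratic" if g < f else ("linear" if f < g else "tie")
    print(f"x = {x:>7}: 1000x = {f:>13,}  x^2 = {g:>15,}  faster: {faster}")
```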

The main point is that when you talk about runtime, you have a specific input size in mind, and the runtime is a concrete value measured in seconds. When you talk about asymptotic bounds, you are describing hypothetical performance as the input grows large, which is when you really start to care about how your algorithm scales.
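Written out for the example above (a small worked note under the same assumptions, not part of the original answer), the contrast looks like this:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Concrete runtimes at one input size versus asymptotic bounds:
% the seconds disappear, only the growth rates remain.
\[
  f(1) = 1000 \text{ s}, \qquad g(1) = 1 \text{ s}
  \qquad \text{(runtime: a concrete value for one input)}
\]
\[
  f(x) = 1000x = \Theta(x), \qquad g(x) = x^2 = \Theta(x^2),
  \qquad \text{so } g(x) > f(x) \text{ for all } x > 1000.
\]
\end{document}
```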
