Is there a strict maximum or minimum bound (a tight bound) for Big O and Big Omega notation?

asymptotics, discrete mathematics

If Big O notation describes the upper bound of a function, can't most functions have an upper bound of $O(n!)$?

Example:

$f(x) = x^2+6x+9$

If we let $g(x) = x!$, then (taking $c = 1$):

$\forall x\ge 5$, $f(x) \le c\,g(x)$

I have effectively proved that $f(x) = O(x!)$. But this doesn't seem right, since I also know $f(x) = O(x^2)$: with $g(x) = x^2$ and $c = 5$, the inequality $f(x) \le c\,g(x)$ holds for all $x \ge 3$.
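(As a sanity check, here is a small Python snippet, written just for this post, that verifies both inequalities on a finite range; of course this only illustrates the algebra above rather than proving it.)

```python
from math import factorial

def f(x):
    # The polynomial from the question: f(x) = x^2 + 6x + 9
    return x**2 + 6*x + 9

# The loose bound: f(x) <= 1 * x! for all x >= 5
assert all(f(x) <= factorial(x) for x in range(5, 50))

# The tighter bound: f(x) <= 5 * x^2 for all x >= 3
assert all(f(x) <= 5 * x**2 for x in range(3, 50))
```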

So my question is: what defines the "tight bound" for Big O/Omega notation? For the above example, why do we almost always describe $f(x)$ as $O(x^2)$? What makes my reasoning wrong?

Edit: It isn't possible to prove $f(x) = \Omega(x!)$, so I removed that part of my question.

Best Answer

Lots of functions are $O(n!)$. Every polynomial, for instance. But something like $2^{2^n}$ is not $O(n!)$ (do you see why?). Notice also that $n^2 + 6n + 9$ is not $\Omega(n!)$, as your question originally claimed.
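To make these growth rates concrete, here is a rough numerical illustration (the table is only suggestive, of course; the actual claims are statements about all sufficiently large $n$):

```python
from math import factorial

# Compare a polynomial, n!, and 2^(2^n) at small n. The double
# exponential overtakes n! almost immediately, which is the intuition
# behind 2^(2^n) not being O(n!).
for n in range(1, 8):
    poly = n**2 + 6*n + 9
    print(f"n={n}: n^2+6n+9 = {poly}, n! = {factorial(n)}, 2^(2^n) = {2**(2**n)}")
```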

When I was an undergrad, we were very careful on homework to write "give a tight big O bound for the runtime of this algorithm" in order to prevent smart-aleck students from making exactly this argument :P. After all, they could just write $O(n!)$ for everything and never be wrong.

This raises two natural follow-up questions:

  1. Why allow this ambiguity?

The answer is "because it's useful". Sometimes we don't know for sure what the perfect upper bound is, so it's nice to be able to say $O(n^2 \log n)$ in the preliminary paper, and be correct, even if we're pretty sure the $\log n$ factor can be removed with more work. People in the real world aren't trying to game the system in this way, so we don't worry about it.

  2. What if we do know the bound is tight? Can we notate that?

The answer is "yes", and there are two choices.

The first is "big Theta", where $f = \Theta(g)$ if and only if $f = O(g)$ and $f = \Omega(g)$. That is, if $f = \Theta(g)$, we know that (eventually) $\frac{1}{C} g \leq f \leq C g$ for some constant $C$. So we know the dominant behavior of $f$ precisely, modulo the exact constant out front.
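As a worked instance (with deliberately crude constants chosen for convenience), for all $n \ge 1$ we have $n^3 \le 4n^3 + 2n + 7 \le 4n^3 + 2n^3 + 7n^3 = 13n^3$, so $4n^3 + 2n + 7 = \Theta(n^3)$ holds with $C = 13$ in the definition above.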

The second doesn't have a great name (it's often just called asymptotic equivalence), but we write $f \sim g$ to mean that $\lim_{n \to \infty} \frac{f(n)}{g(n)} = 1$. That is, $f$ and $g$ get more and more similar for larger and larger inputs. It might not be as clear from the definition, but $f \sim g$ tells us that we know the dominant asymptotics and the leading constant!

For instance, we have (see the numeric sketch after this list):

  • $4n^3 + 2n + 7 = \Theta(n^3)$
  • $4n^3 + 2n + 7 \neq \Theta(n^4)$
  • $4n^3 + 2n + 7 \not \sim n^3$
  • $4n^3 + 2n + 7 \sim 4n^3$
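Here's a small numeric sketch of the last two bullets (illustrative only): the ratio against $n^3$ settles near $4$, so it never approaches $1$, while the ratio against $4n^3$ does tend to $1$.

```python
# Ratios f(n)/g(n) for f(n) = 4n^3 + 2n + 7 against two candidate g's.
# f ~ g needs the ratio to tend to 1; f = Theta(g) only needs it to
# stay between two positive constants eventually.
def f(n):
    return 4 * n**3 + 2 * n + 7

for n in (10, 100, 1000, 10_000):
    print(f"n={n}: f/n^3 = {f(n)/n**3:.6f}, f/(4n^3) = {f(n)/(4*n**3):.6f}")
```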

As a quick exercise, can you show that $f \sim g$ implies $f = \Theta(g)$?
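(One possible sketch, if you want to check your answer afterwards: since $\frac{f(n)}{g(n)} \to 1$, there is some $N$ with $\frac{1}{2} \le \frac{f(n)}{g(n)} \le 2$ for all $n \ge N$, i.e. $\frac{1}{2}\,g(n) \le f(n) \le 2\,g(n)$, which is exactly the $\Theta$ condition with $C = 2$.)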


I hope this helps ^_^
