[Math] Big $O$, little $o$ Notation


So I've gone through the typical undergraduate math sequence (two semesters of real analysis, two semesters of abstract algebra, some measure theory, but I haven't taken discrete math), and in various posts online I keep seeing things such as
$$O(x^2) $$
which I've never encountered in my formal education. What in the world does big $O$ notation mean, explained to someone who's completely new to it, so that I understand how to use it? (It seems to me like a way to write "some polynomial stuff.")

I've also encountered "little" $o$ notation in studying for actuarial exams.

Also, why do we care about $O$ and $o$?

Best Answer

Big $O$ notation:

$f(n)=O(g(n))$ means:

$$\exists \text{ a constant } c>0 \text{ and an } n_0 \geq 1 \text{ such that } 0 \leq f(n) \leq c\, g(n) \text{ for all } n \geq n_0.$$
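
For a concrete (purely illustrative) example: $3n^2 + 5n = O(n^2)$, because for all $n \geq 5$ we have $5n \leq n^2$, hence
$$0 \leq 3n^2 + 5n \leq 3n^2 + n^2 = 4n^2,$$
so the definition is satisfied with $c = 4$ and $n_0 = 5$.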


Little $o$ notation:

$f(n)=o(g(n))$ means:

$$\text{for every constant } c>0 \text{ there is an } n_0 \geq 1 \text{ such that } 0 \leq f(n) < c\, g(n) \text{ for all } n \geq n_0.$$

In other words, $\displaystyle{\lim_{n \rightarrow \infty} \frac{f(n)}{g(n)}=0}$ (assuming $g(n) > 0$ for all large $n$, so the ratio is defined). Note the quantifiers: big $O$ asks for *some* constant $c$ that works, while little $o$ demands the bound for *every* $c > 0$. Intuitively, $f(n)=O(g(n))$ says $f$ grows no faster than a constant multiple of $g$, and $f(n)=o(g(n))$ says $f$ grows strictly slower than $g$; this is why the notation is useful for comparing growth rates while ignoring constant factors.
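
For a concrete (again, illustrative) example: $n = o(n^2)$, since for any $c > 0$ we have $n < c\, n^2$ whenever $n > 1/c$; equivalently, $\lim_{n \rightarrow \infty} n/n^2 = \lim_{n \rightarrow \infty} 1/n = 0$. By contrast, $3n^2 + 5n$ from the example above is $O(n^2)$ but not $o(n^2)$: the ratio tends to $3$, not $0$.

If it helps to see this numerically, here is a minimal Python sketch of the ratio check (the functions `f` and `g` are just the illustrative example from above, not anything canonical):

```python
# Numerical sanity check: for big O the ratio f(n)/g(n) should stay
# bounded as n grows; for little o it would have to tend to 0.

def f(n):
    return 3 * n**2 + 5 * n  # illustrative example function

def g(n):
    return n**2

for n in (10, 100, 1000, 10**6):
    print(n, f(n) / g(n))

# The ratio approaches 3: it stays bounded (consistent with f = O(n^2))
# but does not tend to 0 (so f is not o(n^2)).
```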