[Math] Big O-notation and series

notation, sequences-and-series

While there are many questions regarding Big O-notation and, in particular, its usage when it comes to series, none fits my question perfectly, possibly because I am so unfamiliar with the notation.

Consider an algorithm where the number of operations needed grows as $29n^3 + 48n + 1932$. If I have understood it correctly, one simply writes that the algorithm runs in $O(n^3)$. While my explanation here might be sloppy, it's not that relevant: the main point is that $n^3$ is the dominant term.
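(For what it's worth, this can be made precise by exhibiting one possible witness constant, for instance

$$29n^3 + 48n + 1932 \;\le\; 29n^3 + 48n^3 + 1932n^3 \;=\; 2009n^3 \quad\text{for all } n \ge 1,$$

so the operation count is $O(n^3)$ with, say, $C = 2009$.)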

Consider the series $\sum_{n=1}^{\infty}\frac{1}{999n}$. We can confirm the divergence of this series by one of the many tests taught in Calculus 2. However, is it ever sufficient to point out that "the sequence decreases approximately as $\frac{1}{n}$" or "decreases at a rate of $O\left(\frac{1}{n}\right)$"?

Best Answer

The series does not decrease (its partial sums increase, to infinity); but I reckon what you are trying to say is that its general term $a_n$ is $\Omega\left(\frac{1}{n}\right)$, meaning that there exists a constant $\alpha > 0$ such that, for all $n$ big enough, $a_n \geq \alpha\cdot\frac{1}{n}$.
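In your example this is immediate: taking, for instance, $\alpha = \frac{1}{999}$,

$$a_n = \frac{1}{999n} \;\ge\; \frac{1}{999}\cdot\frac{1}{n} \quad\text{for all } n \ge 1,$$

so $\sum_{n\ge 1} a_n \ge \frac{1}{999}\sum_{n\ge 1}\frac{1}{n} = \infty$ by comparison with the harmonic series.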

This is what captures the divergence of the series; saying that the general term is $O\left(\frac{1}{n}\right)$ is not what you want, as it does not imply the series diverges (for instance, $\frac{1}{n^2}$ or even $0$ are also $O\left(\frac{1}{n}\right)$).
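To see the counterexample concretely:

$$\frac{1}{n^2} \le \frac{1}{n} \ \text{for } n \ge 1, \ \text{so } \frac{1}{n^2} = O\!\left(\frac{1}{n}\right), \quad\text{yet}\quad \sum_{n=1}^{\infty}\frac{1}{n^2} = \frac{\pi^2}{6} < \infty.$$

An upper bound of $O\left(\frac{1}{n}\right)$ on the terms is thus compatible with convergence; it is the lower bound $\Omega\left(\frac{1}{n}\right)$ that forces divergence.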