[Math] Testing if a geometric series converges by taking limit to infinity

calculus, sequences-and-series

If the limit of a series' terms as n approaches infinity is not zero, then the series diverges. This makes intuitive sense to me: it is an infinite series, and if we keep adding terms that do not shrink to zero, the sum will grow without bound.

However, if the limit as n approaches infinity does equal zero, that is not enough information to tell whether the series converges or diverges.

I would think that if the limit of the terms as n approaches infinity is zero, and the terms come from a continuous function, then the series would converge to some real number. Why is this not the case? The only counter-example I can think of is a series with terms like $a_n = (-1)^n$, but that is not a continuous function.

Best Answer

$\lim_{n \rightarrow \infty} \frac{1}{n} = 0$, but $\sum_{n = 1}^\infty \frac{1}{n}$ does not converge. The associated function $\frac{1}{x}$ is continuous on $[1, \infty)$.
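
To see how slowly that divergence happens, here is a minimal numerical sketch (plain Python, no libraries assumed): the terms $\frac{1}{n}$ shrink toward zero, yet the partial sums keep growing past any bound.

```python
# Minimal sketch: the terms 1/n tend to 0, yet the partial sums of the
# harmonic series keep growing (roughly like ln N) and never settle down.
for N in (10, 10**3, 10**6):
    partial_sum = sum(1.0 / n for n in range(1, N + 1))
    print(f"N = {N:>7}: last term = {1.0 / N:.1e}, partial sum = {partial_sum:.4f}")
```

The last term becomes tiny long before the partial sums stop noticeably growing, which is exactly the gap between "terms go to zero" and "series converges."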

Also, a geometric series is the series associated with a sequence of the form $a_n = p r^n$ for some constants $p$ and $r$. A geometric series (with $p \neq 0$) converges if and only if $|r| < 1$.
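
For completeness, the standard partial-sum computation shows where that condition comes from (starting the sum at $n = 0$ and taking $r \neq 1$):

$$S_N = \sum_{n=0}^{N} p r^n = p \cdot \frac{1 - r^{N+1}}{1 - r}.$$

When $|r| < 1$, $r^{N+1} \to 0$ as $N \to \infty$, so $S_N \to \frac{p}{1 - r}$. When $|r| \ge 1$ and $p \neq 0$, the terms $p r^n$ do not tend to zero, so the series diverges by the very test described in the question.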
