Interval of convergence for power series obtained by integration

calculus, power series, sequences-and-series

Let the function $f$ be defined as follows:
$$f(x) = \sum_{n=0}^\infty a_n x^n$$

To my understanding, if $a_n$ is a (nonzero) constant, not depending on $n$, then this power series is also a geometric series, which converges only if $|x| < 1$. If that is true, the interval of convergence for this series is $-1 < x < 1$.
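For concreteness, writing the constant value as $c$ (with $c \neq 0$), the series sums to
$$\sum_{n=0}^\infty c\,x^n = \frac{c}{1-x}, \qquad |x| < 1,$$
and it diverges at both endpoints $x = \pm 1$, since its terms do not tend to $0$ there.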

According to my AP calculus textbook, the series obtained by integrating the above power series term by term converges to $\int_0^x f(t) \,dt$ for each $x$ within the interval of convergence of the original series.

However, I seem to have found a contradiction to this statement. Integrating the power series for $f$ term by term yields
$$\int_0^x f(t) \,dt = \sum_{n=0}^\infty a_n\frac{x^{n+1}}{n+1}.$$
Using the ratio test and testing the endpoints (again with $a_n$ constant), it can be shown that the interval of convergence of this series is $-1 \le x < 1$, which differs from that of the original geometric power series. This is not what the textbook states.
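To spell out the endpoint check in the constant case (taking $a_n = 1$ for simplicity): at $x = -1$ the integrated series is
$$\sum_{n=0}^\infty \frac{(-1)^{n+1}}{n+1} = -1 + \frac12 - \frac13 + \cdots,$$
which converges by the alternating series test, while at $x = 1$ it is the harmonic series $\sum_{n=0}^\infty \frac{1}{n+1}$, which diverges.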

What am I misunderstanding?

Best Answer

Read a little more carefully: the book correctly says that the integrated series converges wherever the original one does.

You have discovered that it might also converge at an endpoint of the interval of convergence even if the original series diverges there.
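If it helps, here is a quick numerical sanity check (my own sketch, with $a_n = 1$ as in the geometric example in the question): the partial sums of the integrated series settle toward $-\ln 2$ at $x = -1$ but keep growing at $x = 1$.

```python
import math

def integrated_partial_sum(x, terms, a=lambda n: 1.0):
    """Partial sum of sum_{n=0}^{terms-1} a(n) * x**(n+1) / (n+1),
    i.e. the series obtained by integrating sum a_n x^n term by term.
    The coefficient sequence a defaults to the constant 1 from the question."""
    return sum(a(n) * x ** (n + 1) / (n + 1) for n in range(terms))

# x = -1: the partial sums settle toward -ln 2, so this endpoint is included.
print(integrated_partial_sum(-1, 10_000), -math.log(2))

# x = 1: the partial sums are harmonic numbers and keep growing (slowly),
# so this endpoint is excluded, matching the interval -1 <= x < 1.
print(integrated_partial_sum(1, 10_000), integrated_partial_sum(1, 1_000_000))
```

This is only an illustration of the constant-coefficient case, of course; the general guarantee is exactly the one quoted from the textbook, plus the possibility of picking up an endpoint.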