Uniform convergence of the series $\sum_{n=1}^\infty\frac{x}{(1+x)^n}$

analysis, sequence-of-function, sequences-and-series, uniform-convergence

I have to determine whether or not the series
$$\sum_{n=1}^\infty\frac{x}{(1+x)^n}$$
converges uniformly on $[0,1]$.

I attempted to use the Weierstrass M-test by finding the maximum of each $a_n(x)=\dfrac{x}{(1+x)^n}$, which must exist by the extreme value theorem. For $n\ge 2$, the maximum turned out to be at $x=\dfrac{1}{n-1}$ and to equal $\dfrac{(n-1)^{n-1}}{n^n}$. The sum of these terms diverges, and hence I can't use the Weierstrass M-test.
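
For reference, a sketch of that computation (just verifying the claim above): for $n\ge 2$,
$$a_n'(x)=\frac{(1+x)^n-nx(1+x)^{n-1}}{(1+x)^{2n}}=\frac{1+x-nx}{(1+x)^{n+1}},$$
which vanishes at $x=\frac{1}{n-1}$, where
$$a_n\!\left(\tfrac{1}{n-1}\right)=\frac{1/(n-1)}{\left(\frac{n}{n-1}\right)^{n}}=\frac{(n-1)^{n-1}}{n^{n}}=\frac{1}{n}\left(1-\frac{1}{n}\right)^{n-1}\sim\frac{1}{en},$$
so $\sum_n\frac{(n-1)^{n-1}}{n^n}$ diverges by comparison with the harmonic series.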

Now I'm thinking of using Dini's theorem, since $[0,1]$ is compact, and setting $f_n(x)$ to be the $n$-th partial sum gives a monotonically increasing sequence (the terms are nonnegative on $[0,1]$). So if I can find the pointwise limit of this sequence and verify that it is continuous, I will be done.

Is this the right track? If it is, how would I compute the pointwise limit of the series?

Best Answer

Note that, as $N\to \infty$, $$\sum_{n=1}^{N}\dfrac{x}{(1+x)^n}=1-\frac{1}{(1+x)^{N}}\to \begin{cases}0&\text{if $x=0$,}\\ 1&\text{if $x>0$.}\end{cases}$$ So the pointwise limit function is not continuous on $[0,a)$ for any $a>0$. What may we conclude about uniform convergence on such an interval? What happens on $[a,b]$ with $0<a<b$?
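
To spell out the partial-sum identity used above (a straightforward geometric-series computation): for $x>0$, with ratio $r=\frac{1}{1+x}$ we have $1-r=\frac{x}{1+x}$, so
$$\sum_{n=1}^{N}\frac{x}{(1+x)^n}=x\cdot\frac{r\left(1-r^{N}\right)}{1-r}=x\cdot\frac{1}{x}\left(1-\frac{1}{(1+x)^{N}}\right)=1-\frac{1}{(1+x)^{N}},$$
while at $x=0$ every term vanishes. From here, comparing $\sup_{x\in[0,1]}\lvert S_N(x)-f(x)\rvert$ with $\sup_{x\in[a,b]}\frac{1}{(1+x)^{N}}=\frac{1}{(1+a)^{N}}$ should let you answer both questions.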