[Math] How to use Lagrange's remainder to prove that log(1+x) = sum(…)

analysis, power series, taylor expansion

Using Lagrange's remainder, I have to prove that:

$\log(1+x) = \sum\limits_{n=1}^\infty (-1)^{n+1} \cdot \frac{x^n}{n}, \; \forall |x| < 1$

I am not quite sure how to do this. I started with the Taylor series for $x_0 = 0$:

$f(x_0) = \sum\limits_{n=0}^\infty \frac{f^{(n)}(x_0)}{n!} \cdot x^n + r_n$, where $r_n$ is the remainder. Then I used induction to prove that the $n$-th derivative of $\log(1+x)$ can be written as:

$f^{(n)}(x) = (-1)^{n+1} \cdot \frac{(n-1)!}{(1+x)^n}, \; \forall n \in \mathbb{N}$
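For completeness, a quick sketch of that induction: the base case is $f'(x) = \frac{1}{1+x} = (-1)^{2} \cdot \frac{0!}{(1+x)^{1}}$, and differentiating the hypothesis once gives

$$f^{(n+1)}(x) = \frac{d}{dx}\left[(-1)^{n+1} \cdot \frac{(n-1)!}{(1+x)^n}\right] = (-1)^{n+1}(n-1)! \cdot \frac{-n}{(1+x)^{n+1}} = (-1)^{n+2} \cdot \frac{n!}{(1+x)^{n+1}},$$

which is the claimed formula with $n$ replaced by $n+1$.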

I plugged this formula into the Taylor series for $\log(1+x)$ and ended up with:

$f(x_0) = \sum\limits_{n=1}^\infty (-1)^{n+1} \cdot \frac{x^n}{n} + r_n$, which already looked quite promising.
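(The simplification uses $f^{(n)}(0) = (-1)^{n+1}(n-1)!$, so each coefficient reduces as

$$\frac{f^{(n)}(0)}{n!} \cdot x^n = \frac{(-1)^{n+1}(n-1)!}{n!} \cdot x^n = (-1)^{n+1} \cdot \frac{x^n}{n},$$

and the $n=0$ term drops out because $f(0) = \log 1 = 0$.)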

As the formula which I have to prove doesn't have that remainder $r_n$, I tried to show that $\lim_{n \to \infty} r_n = 0$, using Lagrange's remainder formula (for $x_0 = 0$ and $|x| < 1$).

So now I basically showed that the formula was valid for $x \to x_0 = 0$. I also showed that the radius of convergence of this power series is $r = 1$, that is to say the power series converges $\forall |x| < 1$.
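(For the radius of convergence, the ratio test on the coefficients $a_n = \frac{(-1)^{n+1}}{n}$ gives

$$\lim_{n \to \infty} \left|\frac{a_{n+1}}{a_n}\right| = \lim_{n \to \infty} \frac{n}{n+1} = 1,$$

so $r = 1$.)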

What is bugging me is that, in my opinion, the formula is only valid for $x \to 0$. Sure, the radius of convergence is $1$, but does that actually tell me that the formula is valid on $(-1,1)$? I've never done anything like this before, hence the insecurity. I'd be delighted if someone could help me out and tell me whether the things I've shown are already sufficient or whether I still need to prove something.

Best Answer

$f(x_0) = \sum\limits_{n=1}^\infty (-1)^{n+1} \cdot \frac{x^n}{n} + r_n$

That should say

$$f(x)=\sum_{n=1}^k (-1)^{n+1} \cdot \frac{x^n}{n} + r_k(x),$$

where $r_k$ is the error term of the $k^\text{th}$ partial sum. You want to use estimates to show that the error term goes to $0$ as $k$ goes to $\infty$, which will justify convergence of the series to $f(x)=\log(1+x)$.
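Recall that Lagrange's form of the remainder, for the expansion at $x_0 = 0$, says that for each $x$ there is some $\xi$ between $0$ and $x$ with

$$r_k(x) = \frac{f^{(k+1)}(\xi)}{(k+1)!}\,x^{k+1},$$

so bounding $f^{(k+1)}$ on the interval between $0$ and $x$ bounds the error term.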


Edit: I've struck through the last paragraph of my answer, which relied on a wrong estimate of the derivatives, as pointed out by Robert Pollack. With the missing $k!$ term, the estimate only works on $[-\frac{1}{2},1)$.

Added: To make this answer a little more useful, I decided to look up a correct method. Spivak, in his book Calculus (3rd Edition, page 423), uses the formula $$\frac{1}{1+t}=1-t+t^2-\cdots+(-1)^{n-1}t^{n-1}+\frac{(-1)^nt^n}{1+t}$$ in order to write the remainder as $r_n(x)=(-1)^n\int_0^x\frac{t^n}{1+t}\,dt$. The estimate $\int_0^x\frac{t^n}{1+t}\,dt\leq\int_0^xt^n\,dt=\frac{x^{n+1}}{n+1}$ holds when $x\geq0$, and the harder estimate $\left|\int_0^x\frac{t^n}{1+t}\,dt\right|\leq\frac{|x|^{n+1}}{(1+x)(n+1)}$, when $-1\lt x\leq0$, is given as Problem 11 on page 430. Combining these, you can show that the sequence of remainders converges uniformly to $0$ on $[-r,1]$ for each $r\in(0,1)$.
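To spell out where that remainder comes from: integrating Spivak's identity from $0$ to $x$ term by term gives

$$\log(1+x) = \sum_{j=1}^{n} (-1)^{j-1}\frac{x^j}{j} + (-1)^n\int_0^x\frac{t^n}{1+t}\,dt,$$

so $r_n(x)=(-1)^n\int_0^x\frac{t^n}{1+t}\,dt$. For the harder estimate, substituting $s = -t$ when $-1\lt x\leq0$ yields

$$\left|\int_0^x\frac{t^n}{1+t}\,dt\right| = \int_0^{|x|}\frac{s^n}{1-s}\,ds \leq \frac{1}{1-|x|}\int_0^{|x|}s^n\,ds = \frac{|x|^{n+1}}{(1+x)(n+1)},$$

since $1-|x| = 1+x$ for $x\leq0$.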

~~Lagrange's form of the error term can be used to do this. The estimates, which follow from Taylor's theorem, are also found on Wikipedia. In this case, if $0\lt r\lt 1$, then $|f^{(k+1)}(x)|\leq \frac{1}{(1-r)^{k+1}}$ whenever $x\geq-r$, so you have the estimate $|r_k(x)|\leq \frac{r^{k+1}}{(1-r)^{k+1}}\frac{1}{(k+1)!}$ for all $x$ in $(-r,r)$, which you can show goes to $0$ (because $(k+1)!$ grows faster than the exponential $\left(\frac{r}{1-r}\right)^{k+1}$), thus showing that the series converges uniformly to $\log(1+x)$ on $(-r,r)$. Since $r$ was arbitrary, this shows that the series converges on $(-1,1)$, and the convergence is uniform on compact subintervals.~~
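With the missing $k!$ restored, the correct derivative bound is $|f^{(k+1)}(\xi)| = \frac{k!}{(1+\xi)^{k+1}}$, so Lagrange's form gives, for $\xi$ between $0$ and $x$,

$$|r_k(x)| = \frac{|f^{(k+1)}(\xi)|}{(k+1)!}\,|x|^{k+1} = \frac{1}{k+1}\left(\frac{|x|}{1+\xi}\right)^{k+1}.$$

For $0\leq x\lt1$ we have $1+\xi\geq1$, so the bound is at most $\frac{x^{k+1}}{k+1}\to0$; for $-\frac{1}{2}\leq x\lt0$ we have $1+\xi\geq1+x$ and $\frac{|x|}{1+x}\leq1$, so the bound again tends to $0$. For $x\lt-\frac{1}{2}$ the ratio $\frac{|x|}{1+x}$ exceeds $1$ and the bound is useless, which is why this route only works on $[-\frac{1}{2},1)$.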
