Alternatively, try integrating both sides of the algebraic identity (valid for $x\neq -1$):
$$\sum_{k=0}^{n-1} (-1)^kx^k=\frac{1}{1+x}+\frac{(-1)^{n-1}x^n}{1+x}$$
over a suitable interval and using elementary inequalities to bound the remainder term.
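As a quick numerical sanity check (an illustration only, not part of the argument; the sample point $x=-\tfrac12$ and the helper name `partial_sum` are ad hoc), the partial sums obtained by integrating the identity term by term do approach $\ln(1+x)$:

```python
# Illustration only: the Maclaurin partial sums of ln(1+x), obtained by
# integrating the finite geometric identity term by term, approach ln(1+x).
import math

def partial_sum(x, n):
    # sum_{k=1}^{n} (-1)^(k-1) x^k / k
    return sum((-1) ** (k - 1) * x**k / k for k in range(1, n + 1))

x = -0.5
for n in (5, 10, 20):
    print(n, abs(math.log(1 + x) - partial_sum(x, n)))
```

The printed errors shrink geometrically, as the remainder bound below predicts.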
The only proof of this that I know of uses the Cauchy form of the remainder. If $f^{(n+1)}(x)$ exists for all $x$ between $0$ and $h$ and is continuous on this interval$^\dagger$, then this form of the remainder can be shown to hold by using the integral form of the remainder:
$$R_{n+1}(x)=
{1\over n!} \int_0^x f^{(n+1)}(t)(x-t)^n\,dt
$$
and applying the Second Mean Value Theorem for Integrals to this expression (with $g(t)=1$): the theorem pulls $f^{(n+1)}(t)(x-t)^n$ out of the integral at some point $c$ between $0$ and $x$, leaving $\int_0^x dt=x$ behind.
So, let's use the Cauchy form of the remainder to show that the Taylor (Maclaurin) series of $\ln(1+x)$ converges to $\ln(1+x)$ for $-1<x<0$:
Let $f(x)=\ln(1+x)$. Using the Cauchy form of the remainder, one has
$$
\ln(1+x) =x-{x^2\over2}+\cdots+{(-1)^{n-1} x^n\over n} + R_{n+1}(x),
$$
where
$$
R_{n+1}(x)={f^{(n+1)}(c)\over n!}(x-c)^n x
$$
for some $c$ between $0$ and $x$.
Writing $c=\theta x$ for some $0\le\theta\le1$, this becomes
$$
R_{n+1}(x)= {f^{(n+1)}(\theta x)\over n!}(1-\theta)^n x^{n+1}.
$$
Evaluating the required derivative, $f^{(n+1)}(t)=\dfrac{(-1)^n\, n!}{(1+t)^{n+1}}$, we have
$$
R_{n+1}(x)=(-1)^n{x^{n+1}(1-\theta)^n\over (1+\theta x)^{n+1}}.
$$
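This can be checked numerically (an illustration, not a proof; the sample values $x=-\tfrac12$, $n=4$ and the bisection loop are ad hoc choices): for these values the expression at $\theta=0$ and $\theta=1$ brackets the true remainder, so continuity locates a valid $\theta$:

```python
# Illustration only (not a proof): for x = -1/2 and n = 4, locate a theta in
# [0, 1] at which the Cauchy-form expression equals the true remainder.
import math

x, n = -0.5, 4
actual = math.log(1 + x) - sum((-1) ** (k - 1) * x**k / k for k in range(1, n + 1))

def cauchy_form(theta):
    # (-1)^n x^(n+1) (1-theta)^n / (1 + theta x)^(n+1)
    return (-1) ** n * x ** (n + 1) * (1 - theta) ** n / (1 + theta * x) ** (n + 1)

# cauchy_form(0) and cauchy_form(1) = 0 straddle `actual` here, so bisect.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if (cauchy_form(lo) - actual) * (cauchy_form(mid) - actual) <= 0:
        hi = mid
    else:
        lo = mid
theta = (lo + hi) / 2
print(theta, cauchy_form(theta) - actual)
```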
Now, for $-1<x<0$, note that
$$
1+\theta x\ge 1+x
$$
and, since $\theta(1+x)\ge0$ gives $1-\theta\le 1+\theta x$,
$$
0\le{1-\theta\over 1+\theta x}\le 1;
$$
whence
$$
| R_{n+1}(x)|
=\biggl|{x^{n+1}(1-\theta)^n\over (1+\theta x)^{n+1}} \biggr|
=\biggl|{1-\theta\over 1+\theta x} \biggr|^n \cdot{|x|^{n+1}\over |1+\theta x|}
\le{|x|^{n+1}\over 1+x}\ \xrightarrow{\;n\to\infty\;}\ 0.
$$
$^\dagger$ The continuity of $f^{(n+1)}$ is not required here; but in this case, a different proof is required.
Best Answer
Consider:
$$f:\mathbb{R}\to\mathbb{R},\quad x\mapsto\begin{cases}0&\text{if }x\le0\\\exp\left(-\frac1x\right)&\text{otherwise}\end{cases}$$
Its Taylor series at $0$ is the null series, which therefore converges everywhere but agrees with $f$ only for $x\le0$.
Proof sketch ...
There exists a sequence $(P_n)_{n\ge0}$ of polynomials such that:
$$\forall n\in\mathbb{N},\,\forall x>0,\,f^{(n)}(x)=P_n\left(\frac1x\right)\exp\left(-\frac1x\right)$$
(by induction on $n$, with $P_0=1$ and $P_{n+1}(u)=u^2\left(P_n(u)-P_n'(u)\right)$).
Since exponential decay dominates any polynomial growth in $\frac1x$, this can be used to show that:
$$\forall n\in\mathbb{N},\,\lim_{x\to0^+}f^{(n)}(x)=0$$
A final induction then shows that, for all $n\ge0$, $f^{(n)}$ has a right derivative at $x=0$ equal to zero. Since the left derivatives at zero are obviously all zero, this proves that $\forall n\in\mathbb{N},\,f^{(n)}(0)=0$, which in turn establishes the claim.
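For readers who want to experiment, a short check (assuming SymPy is installed; the variable names are ad hoc) confirms both the $P_n(1/x)\exp(-1/x)$ shape of the derivatives and the vanishing right-hand limits for small $n$:

```python
# Illustration only (assumes SymPy is installed): the first few derivatives of
# exp(-1/x) on x > 0 have the form P_n(1/x) * exp(-1/x), and each has
# right-hand limit 0 at x = 0.
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.exp(-1 / x)
for n in range(4):
    d = sp.diff(f, x, n)
    poly_part = sp.expand(sp.simplify(d / f))  # this is P_n(1/x)
    print(n, poly_part, sp.limit(d, x, 0, '+'))
```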