[Math] Why the existence of a Taylor series doesn’t imply it converges to the original function

calculus, taylor expansion

Please note that I've read this question and it did not address mine.

I've been presented with the following argument regarding Taylor series:

We have a function $f(x)$; now assume that there exists a power series that's equal to it:

$$f(x)=a_0 + a_1 x + a_2 x^2 +\dots$$

One can quickly show, using differentiation, that

$$f(x) = f(0) + f'(0)x + \dfrac{f''(0)}{2!}x^2 + \dots$$
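Explicitly, differentiating the series term by term $n$ times and evaluating at $x = 0$ kills every term except the $n$-th, leaving

$$f^{(n)}(0) = n!\, a_n \quad\Longrightarrow\quad a_n = \dfrac{f^{(n)}(0)}{n!},$$

which is exactly the coefficient appearing above.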

It seems that this argument implies two things:

  1. For the Taylor series of a function around a point to exist, the
    function has to be infinitely differentiable and defined at that
    point.

  2. If the Taylor series for a function exists, then it's equal to
    the function; that is, it converges to the original function at
    every point $x$.

Now I know very well that point 2 is false (not every smooth function is analytic).

But point 2 seems to be implied by the argument above, which assumes that if a power series exists that equals the function, in other words converges to the function for all $x$, then it is given by the Taylor series. So what's wrong with this argument?

Best Answer

> now assume that there exists a power series that's equal to it:

This is where the problem lies. If a function is expressible by a power series at a point, then that power series is the Taylor series. But not all functions are so expressible.
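For concreteness, the standard example behind "not every smooth function is analytic" is

$$f(x) = \begin{cases} e^{-1/x^2}, & x \neq 0, \\ 0, & x = 0. \end{cases}$$

Every derivative of $f$ vanishes at $0$, so its Taylor series at $0$ is the zero series. That series converges for all $x$, but it equals $f$ only at $x = 0$: the Taylor series exists and converges, just not to $f$. Here is a quick symbolic check of the first few derivatives (a small sketch, assuming SymPy is available):

```python
import sympy as sp

x = sp.symbols('x', real=True)
f = sp.exp(-1 / x**2)  # formula for x != 0; the value at 0 is defined to be 0

# Each derivative tends to 0 as x -> 0 (one-sided limit shown; the other side
# agrees), so every Taylor coefficient f^(n)(0)/n! vanishes even though
# f(x) > 0 for all x != 0.
for n in range(4):
    print(n, sp.limit(sp.diff(f, x, n), x, 0))  # each limit comes out 0
```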