[Math] Why don't Taylor series represent the entire function?

calculus, intuition, sequences-and-series, taylor-expansion

Say I have a continuous function that is infinitely differentiable on the interval $I$.

It can then be written as a Taylor series. However, a Taylor series isn't always equal to the function everywhere – in other words, it doesn't necessarily converge to the function for all $x$ in $I$.

Why? The way I think of a Taylor series is that if you know the position, velocity, acceleration, jolt, etc. of a particle at one moment in time, you can calculate its position at any time. A Taylor series not converging for all $x$ suggests there's a limitation on this analogy.

So why do Taylor series "not" work for some $x$?

Using the particle analogy described above, shouldn't Taylor series allow you to find the "location" of the function at any "time"?

Please note, I am not looking for a proof – I'm looking for an intuitive explanation of why Taylor series don't always converge for all $x$.

Best Answer

My professor used to say:

You might want to do calculus in $\Bbb{R}$, but the functions themselves naturally live in $\Bbb{C}$. Euler was the first to discover that if you don't look at what they do everywhere in the complex plane, you don't really understand their habits.

This is as subjective as it gets, but it has always helped my intuition. In particular, you might think that some function does nothing wrong, so it should be analytic. Well, if it does nothing wrong in $\Bbb{R}$, look at what it does in $\Bbb{C}$! If it also does nothing wrong in $\Bbb{C}$, then it is analytic. If it makes a mess somewhere in $\Bbb{C}$, then you have to be careful in $\Bbb{R}$ as well. To quote my professor again:

Even in $\Bbb{R}$, and in the most practical and applied problems, you can hear distant echoes of the complex behavior of the functions. It's their nature, you can't change it.
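To make this concrete with the standard example (not from the answer itself): $f(x) = \frac{1}{1+x^2}$ is smooth on all of $\Bbb{R}$, yet its Taylor series about $0$, namely $\sum_{n=0}^{\infty} (-1)^n x^{2n}$, converges only for $|x| < 1$. The "distant echo" here is the pair of poles at $x = \pm i$ in $\Bbb{C}$, which sit at distance $1$ from the origin and cap the radius of convergence. A minimal numerical sketch:

```python
def partial_sum(x, n_terms):
    # Partial sum of the Taylor series of 1/(1+x^2) about 0:
    # sum_{n=0}^{n_terms-1} (-1)^n * x^(2n)
    return sum((-1) ** n * x ** (2 * n) for n in range(n_terms))

def f(x):
    return 1.0 / (1.0 + x ** 2)

# Inside the radius of convergence (|x| < 1): partial sums approach f(x).
print(partial_sum(0.5, 50), f(0.5))  # both close to 0.8

# Outside (|x| > 1): partial sums blow up, even though f itself
# is perfectly smooth at x = 2 on the real line.
print(partial_sum(2.0, 10))
print(partial_sum(2.0, 20))  # even larger in magnitude
```

Nothing about $f$ looks pathological on the real line; only the poles at $\pm i$ explain why the series fails beyond $|x| = 1$.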