Are there alternative proofs of the general Taylor-series expansion theorem for real functions?

calculus, taylor expansion

With a view to better understanding real Taylor series,
I have examined some books on basic Calculus, with an eye for the proofs of the Taylor series theorem and the possible
authors' comments on its derivation. (My reaction when I first saw a proof of it, many years ago, was a mixture of great surprise
and anxiety. And still, while I understand the individual steps, the way they all combine
to produce e.g. the series for $\sin x$ strikes me as little short of miraculous.)

Up to now, from the books I have seen, I get the same impression: that this theorem is a technical exercise
in repeated applications of the mean-value theorem, and that we are lucky that some useful functions happen to have
all their derivatives bounded, so that the remainder tends to zero and a nice series emerges, with nothing more to be said. But some authors do make comments close to what I feel, albeit not very encouraging, e.g.:

from Calculus, by Karl Menger: "Taylor's formula (…) is one of the great marvels of mathematics. (…) This is
something like a mathematical action at a distance (…)"

from Real Analysis, by Laczkovich & Sós: "The statement of Theorem (…) is actually quite surprising (…) the
derivatives of f at a alone determine the values of the function at every other point (…)"

from Introduction to the Calculus, by Osgood: "(…) Since it took the race two centuries to develop
this formula after the Calculus was invented, the student will not be surprised that the reasons which
underlie it cannot be given him in a few words. Let him accept it as a deus ex machina."

Now all this inquiry may be overly romantic and obsessive on my part, and Taylor series may be a perfect example of the
"cold and austere beauty of mathematics", as Russell put it. But I think that sharing mental experiences helps
the mind to widen its turns and horizons, so may I ask:
What was your reaction when you first saw this theorem?
And has your general understanding of it changed since, through some other way of looking at it or proving it?

Best Answer

It's simple to discover Taylor series. Let's start with
$$ \tag{1} f(x) = f(a) + \int_a^x f'(s) \, ds, $$
which of course is just the fundamental theorem of calculus. Now if we are feeling playful we might note (again by FTC) that $f'(s) = f'(a) + \int_a^s f''(t) \, dt$. Plugging this into (1), we find that
\begin{align} f(x) &= f(a) + \int_a^x \left( f'(a) + \int_a^s f''(t) \, dt \right) ds \\ \tag{2}&= f(a) + f'(a)(x - a) + \underbrace{\int_a^x \int_a^s f''(t) \, dt \, ds}_{\text{remainder}}. \end{align}

We can keep going like this for as long as we want. The next step is to note that $f''(t) = f''(a) + \int_a^t f'''(u) \, du$. Plugging this into (2), we find that
\begin{align} f(x) &= f(a) + f'(a) (x - a) + \int_a^x \int_a^s \left( f''(a) + \int_a^t f'''(u) \, du \right) dt \, ds \\ &= f(a) + f'(a)(x - a) + \int_a^x \left( f''(a)(s - a) + \int_a^s \int_a^t f'''(u) \, du \, dt \right) ds \\ &= f(a) + f'(a)(x - a) + f''(a) \frac{(x-a)^2}{2} + \underbrace{\int_a^x \int_a^s \int_a^t f'''(u) \, du \, dt \, ds}_{\text{remainder}}. \end{align}

You see the pattern. So we have discovered the Taylor polynomial approximation to $f(x)$, and we have a formula for the remainder.
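The general pattern can be written out explicitly (a sketch, assuming $f$ is $(n+1)$-times continuously differentiable on the interval):

```latex
% After n iterations of the FTC substitution:
\begin{align}
f(x) = \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!} (x-a)^k
     + \underbrace{\int_a^x \int_a^{s_1} \cdots \int_a^{s_n}
       f^{(n+1)}(s_{n+1}) \, ds_{n+1} \cdots \, ds_1}_{\text{remainder } R_n(x)}
\end{align}
```

Swapping the order of integration (Fubini) collapses the iterated integral into the familiar single-integral form of the remainder, $R_n(x) = \int_a^x \frac{(x-t)^n}{n!} f^{(n+1)}(t) \, dt$.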


By the way, if $| f'''(u) | \leq M$ for all $u \in [a,x]$, then the remainder $R(x)$ satisfies \begin{align} | R(x) | &\leq \int_a^x \int_a^s \int_a^t | f'''(u) | \, du \, dt \, ds \\ &\leq \int_a^x \int_a^s \int_a^t M \, du \, dt \, ds \\ &= M \frac{(x-a)^3}{3!}. \end{align} The same argument bounds the remainder of the $n$th-order approximation by $M \frac{|x-a|^{n+1}}{(n+1)!}$ whenever $|f^{(n+1)}| \leq M$ on $[a,x]$. So we see that the remainder will be small if $x$ is close to $a$.

(If $f$ is sine or cosine, we can take $M = 1$. If $f$ is the exponential function, we can take $M = e^x$.)
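Here is a quick numerical check of the bound above (a minimal sketch; the function name `p2_sin` is my own), taking $f = \sin$, $a = 0$, and $M = 1$:

```python
import math

def p2_sin(a, x):
    # Second-order Taylor polynomial of sin about a:
    # P2(x) = sin(a) + cos(a)(x-a) - sin(a)(x-a)^2 / 2
    return math.sin(a) + math.cos(a) * (x - a) - math.sin(a) * (x - a) ** 2 / 2

a = 0.0
for x in [0.1, 0.5, 1.0, 2.0]:
    remainder = math.sin(x) - p2_sin(a, x)
    # M = 1 works for sine, since |f'''(u)| = |cos(u)| <= 1
    bound = 1.0 * abs(x - a) ** 3 / math.factorial(3)
    assert abs(remainder) <= bound
```

For small $x$ the bound is remarkably tight: at $x = 0.1$ the actual remainder is about $1.67 \times 10^{-4}$, essentially the bound $0.1^3/3! \approx 1.67 \times 10^{-4}$, which reflects the fact that the next series term is exactly $-x^3/3!$.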
