[Math] Settle a classroom argument – do there exist any functions that satisfy this property involving Taylor polynomials

calculus, convergence-divergence, sequences-and-series, taylor-expansion, uniform-convergence

I'm going to apologize in advance; I might at some points say Taylor series instead of Maclaurin series.

OK, so backstory: my calculus class recently went over Taylor series and Taylor polynomials. It seemed basic enough. Using the ratio test, we were also able to determine the radius of convergence of these series. For example, we derived that:

$$
e^x = \sum_{n=0}^\infty\dfrac{x^n}{n!}
$$

Using the ratio test, we can show that the series converges $\forall x$.
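As a numerical sanity check (a sketch, not part of the class derivation; the sample point $x = 3$ is arbitrary), the consecutive-term ratio $|a_{n+1}/a_n| = |x|/(n+1)$ shrinks to $0$ for any fixed $x$, and the partial sums do approach $e^x$:

```python
import math

# Ratio test for sum_{n>=0} x^n / n!: the consecutive-term ratio is
# |x| / (n + 1), which tends to 0 for any fixed x, so the series
# converges for every x (infinite radius of convergence).
x = 3.0  # arbitrary sample point
ratios = [abs(x) / (n + 1) for n in range(20)]
assert all(r1 > r2 for r1, r2 in zip(ratios, ratios[1:]))  # strictly decreasing

# The partial sums of the series approach e^x numerically:
partial = sum(x**n / math.factorial(n) for n in range(30))
assert abs(partial - math.exp(x)) < 1e-9
```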

However, today we had a substitute who talked about Taylor's theorem and Taylor's formula, defined as the sum of an $n$th-order Taylor polynomial plus a remainder term:

$$
f(x) = P_n(x) + R_n(x)
$$
$$
R_n(x) = \dfrac{f^{(n+1)}(c)\,(x-a)^{n+1}}{(n+1)!}
$$

The substitute teacher then told us that in order to prove that the Taylor polynomial converges to the original function, you must show that
$$
\lim_{n\rightarrow\infty}R_n(x)=0
$$
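For $e^x$ this condition is easy to verify: every derivative of $e^x$ is $e^c$ for some $c$ between $0$ and $x$, so the remainder is bounded by $e^{|x|}\,|x|^{n+1}/(n+1)!$, and the factorial wins. A small numerical sketch (the sample point $x = 3$ is arbitrary):

```python
import math

# Lagrange bound for f(x) = e^x about a = 0: |f^(n+1)(c)| = e^c <= e^{|x|}
# for c between 0 and x, so |R_n(x)| <= e^{|x|} * |x|^(n+1) / (n+1)!.
x = 3.0  # arbitrary sample point
bounds = [math.exp(abs(x)) * abs(x) ** (n + 1) / math.factorial(n + 1)
          for n in range(1, 40)]

# The factorial in the denominator eventually crushes the power of x,
# so the remainder bound goes to 0 and the polynomials converge to e^x:
assert bounds[-1] < 1e-25
```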

Well, after this statement the floodgates opened, with a few students asking why you can't just use the ratio test to show the Taylor series converges $\forall x$, as we did for $e^x$.

The substitute said that the ratio test only proves convergence, while this proves convergence to the actual function. The students then asked: if we already proved that the Taylor series equals the function at infinitely many points, and the series converges, doesn't that mean it converges to the function?

We had already done an example previously in class where:
$$
f(x)=\begin{cases}
0,&\text{ if }x=0;\\
e^{-\frac{1}{x^2}},&\text{ if }x\neq 0.
\end{cases}
$$

This function's Maclaurin series converges to $0$ at every point. However, it converges to the function only at $x=0$.
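The mechanism can be checked numerically (a sketch based on the definition above, not part of the original class example). The point is that $e^{-1/h^2}$ vanishes faster than any power of $h$ as $h \to 0$, which is why every derivative of $f$ at $0$, and hence every Maclaurin coefficient, is $0$:

```python
import math

def f(t: float) -> float:
    # The flat function from the example: f(0) = 0, f(t) = e^{-1/t^2} otherwise.
    return 0.0 if t == 0 else math.exp(-1.0 / t**2)

# e^{-1/h^2} decays faster than any power of h near 0, so each difference
# quotient f(h) / h^k -> 0; by induction every derivative f^(k)(0) = 0.
h = 0.1
for k in range(1, 6):
    assert f(h) / h**k < 1e-38

# Every Maclaurin polynomial of f is therefore identically 0, yet f is not:
assert abs(f(1.0) - math.exp(-1.0)) < 1e-15  # f(1) = e^{-1}, far from 0
```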

My classmates said this was a cop-out and "didn't count" because it was a piecewise function. So: is there an example of a function whose Taylor series converges on some interval, but does not converge to the function on that entire interval?

Also, a proof would be cool, along with an explanation of whether the students or the teacher were wrong.

Best Answer

First, yes, in practical terms, it is very hard to define (indefinitely differentiable) functions that are non-analytic except by doing so piecewise. That's basically because all the usual "pieces" are themselves analytic in the interior of the region where they converge. This itself is an artifact of our history on this subject. In fact, some more-exotic (but standard for 100+ years) functions are not analytic... but their very definition depends on more complicated procedures, so would probably not be very satisfying, either.

A secondary but important point is that proving that the Taylor-Maclaurin series of a function at some point has infinite radius of convergence (or any other radius of convergence $>0$) does not in itself prove that the thing converges to the function whose Taylor-Maclaurin series it is. Rather, the error terms must go to zero. Of course, from our viewpoint, it takes considerable effort to arrange infinite radius of convergence but error terms not going to zero ... again because we must produce an exotic/un-natural function (from our artifactual viewpoint) one way or another, since "natural" (indefinitely differentiable) functions seem to be analytic.

Historically, indeed, many people (Euler, Lagrange) counted piecewise-defined functions as artificial, and not real functions. Sometimes, the very definition of "function" (in those days) was that the thing was representable by a power series. And, since most of the elementary functions we encounter do have such representations (which often requires proof... but sometimes ignorance is bliss), naively presuming that "all" functions have such expansions does not immediately lead to disaster... and, in fact is marvelously effective, because that is a correct assumption in many contexts.
