What’s the difference between the “remainder” and “radius of convergence” for Taylor series that converge for all $x$

calculus, taylor-expansion

From what I can understand, the remainder is the difference between the function itself and the polynomial approximation. And the radius of convergence is related to the series representation of the polynomial approximation, and tells you where its convergence can be established, e.g. by the ratio test.

But what's the difference between them when they seem to tell you the same thing?

For example, since $\sin(x)$ is $$ \sum_{n=0}^\infty(-1)^n \frac{x^{2n+1}}{(2n+1)!}, $$ it seems that we can find that it converges for all values of $x$ by either

  1. showing that the remainder goes to $0$ (by letting $n$ approach $\infty$), or
  2. showing that the series satisfies the ratio test's criterion for convergence for every $x$.

My other example would be $e^x$, but for that one the ratio test seems easier, since the $f^{(n+1)}(c)$ term in the remainder isn't bounded.

It seems that doing the ratio test for $\sin(x)$ gives me the same result as showing the remainder goes to $0$, and showing the remainder goes to $0$ for $e^x$ gives the same result as the ratio test.
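To illustrate what I mean, here is a minimal numerical sketch (plain Python, standard library only; the choices of $x = 10$ and $30$ terms are arbitrary) showing both phenomena for $\sin$ at once: the partial sums approach $\sin(x)$, i.e. the remainder goes to $0$, and the consecutive-term ratios eventually drop below $1$:

```python
import math

x = 10.0  # deliberately large input; any x works

partial = 0.0
for n in range(30):
    # n-th term of sum_{n>=0} (-1)^n x^(2n+1) / (2n+1)!
    term = (-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
    partial += term
    # ratio of consecutive term magnitudes: x^2 / ((2n+2)(2n+3)) -> 0 < 1
    ratio = x ** 2 / ((2 * n + 2) * (2 * n + 3))
    if n % 5 == 0:
        print(f"n={n:2d}  partial={partial:+.6f}  "
              f"remainder={math.sin(x) - partial:+.3e}  ratio={ratio:.4f}")
```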

Best Answer

Note that "Taylor series converges for all $x$" is a completely different statement than "Taylor series equals the original function" (or more commonly phrased as "Taylor series converges to the original function"), and it is this difference which I think you haven't understood

Let $f:\Bbb{R} \to \Bbb{R}$ be a given infinitely differentiable function, and let $a\in \Bbb{R}$ be given. Then, we can consider three different functions:

  • For each integer $n\geq 0$, we can consider the $n^{th}$ Taylor polynomial for $f$ about the point $a$, $T_{n,a,f}:\Bbb{R} \to \Bbb{R}$ defined by \begin{align} T_{n,a,f}(x) := \sum_{k=0}^n \dfrac{f^{(k)}(a)}{k!}(x-a)^k \end{align}
  • Accordingly, we can consider the $n^{th}$ order remainder function for $f$ about the point $a$, $R_{n,a,f}:\Bbb{R} \to \Bbb{R}$ and this is defined by $R_{n,a,f}:= f- T_{n,a,f}$.
  • Finally, we can consider the Taylor series of $f$ about the point $a$. To define this, we first consider the formal power series $S(X) := \sum\limits_{k=0}^{\infty}\frac{f^{(k)}(a)}{k!}X^k$. This has a certain radius of convergence $0 \leq \rho \leq \infty$ (the Cauchy-Hadamard formula gives an explicit formula for $\rho$ in terms of the coefficients of the series). Now, we define the Taylor series $S_{a,f}$, of the function $f$ about the point $a$, as follows: if $\rho = 0$, we define $S_{a,f}: \{a\} \to \Bbb{R}$, by $S_{a,f}(a) := f(a)$. If $\rho >0$ then we define $S_{a,f}: (a-\rho,a+\rho) \to \Bbb{R}$ by \begin{align} S_{a,f}(x) := \sum_{k=0}^{\infty}\dfrac{f^{(k)}(a)}{k!}(x-a)^k = \lim_{n\to \infty}T_{n,a,f}(x) \end{align} (with the understanding that if $\rho = \infty$, then the domain is $\Bbb{R}$)
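To make these three objects concrete, here is a small worked instance, using $f = \sin$ and $a = 0$ (the example from your question): \begin{align} T_{3,0,\sin}(x) = x - \frac{x^3}{3!}, \qquad R_{3,0,\sin}(x) = \sin(x) - x + \frac{x^3}{3!}, \qquad S_{0,\sin}(x) = \sum_{k=0}^{\infty}(-1)^k\frac{x^{2k+1}}{(2k+1)!}, \end{align} and for this series the radius of convergence is $\rho = \infty$.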

You seem to be interested in the case where $\rho = \infty$, so that $S_{a,f}$ has its domain equal to all of $\Bbb{R}$, so let's focus on this case. Now, there is a very natural question to ask, namely: does the function equal its Taylor series? i.e. is it true that $f = S_{a,f}$ (or more explicitly, is it true that for every $x\in \Bbb{R}$, $f(x) = S_{a,f}(x)$)?

The answer is NOT NECESSARILY, even if we assume $\rho = \infty$. The typical counter-example is given by $f:\Bbb{R}\to \Bbb{R}$ defined as \begin{align} f(x) &:= \begin{cases} e^{-\frac{1}{x^2}} & \text{if $x\neq 0$} \\ 0 & \text{if $x=0$} \end{cases} \end{align} Then, you can check that $f$ is infinitely-differentiable, and that for every $k$, $f^{(k)}(0) = 0$. So, the radius of convergence is $\rho = \infty$, and the Taylor series of $f$ about the origin is $S_{0,f}:\Bbb{R} \to \Bbb{R}$, $S_{0,f}(x) = 0$ for all $x$. Now, clearly $f$ is not the constant zero-function, so $f\neq S_{0,f}$.
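To see why all the derivatives vanish at the origin, here is the first step of the standard computation (the higher derivatives follow the same pattern); substituting $t = 1/h$, \begin{align} f'(0) = \lim_{h\to 0}\frac{f(h)-f(0)}{h} = \lim_{h\to 0}\frac{e^{-1/h^2}}{h} = \lim_{t\to\pm\infty}\frac{t}{e^{t^2}} = 0, \end{align} since $e^{t^2}$ grows faster than any polynomial in $t$.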

Given this result, the next natural question to ask is "under what conditions (if any) is the function equal to its Taylor series?" The answer to this is pretty simple. Fix an $x \in \Bbb{R}$. Then, by definition of the Taylor polynomial and remainder, we have for every integer $n\geq 0$: \begin{align} f(x) &= T_{n,a,f}(x) + R_{n,a,f}(x) \end{align} Since this is true for all $n\geq 0$, we can also take the limit as $n \to \infty$ on both sides to get: \begin{align} f(x) &= \lim_{n\to \infty} \bigg(T_{n,a,f}(x) + R_{n,a,f}(x)\bigg) \\ &= S_{a,f}(x) + \lim_{n\to \infty} R_{n,a,f}(x) \end{align} Therefore, $f(x) = S_{a,f}(x)$ if and only if $\lim\limits_{n\to \infty}R_{n,a,f}(x) = 0$.
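For example, applying this criterion to $\sin$ (about $a = 0$): the Lagrange form of the remainder gives, for some $c$ between $0$ and $x$, \begin{align} |R_{n,0,\sin}(x)| = \left|\frac{\sin^{(n+1)}(c)}{(n+1)!}\,x^{n+1}\right| \leq \frac{|x|^{n+1}}{(n+1)!} \xrightarrow{n\to\infty} 0, \end{align} since $|\sin^{(n+1)}(c)| \leq 1$; hence $\sin(x) = S_{0,\sin}(x)$ for every $x \in \Bbb{R}$.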

With the counter-example and the result above in mind, we can understand the difference between radius of convergence and remainder:

  • The radius of convergence of the Taylor series is simply a number $\rho$. All it tells you is for which values of $x$ the series even converges (because recall that the Taylor series is defined as the limit $\lim_{n\to \infty}T_{n,a,f}(x)$, provided the limit exists, so we are asking when this limit exists in $\Bbb{R}$). Things like the ratio test/root test/alternating series test or any other "series test" you may have learnt are merely techniques/tools for helping you find out what the radius of convergence $\rho$ is (sure, there is an explicit formula given by the Cauchy-Hadamard formula, but sometimes that's very difficult to calculate with, so we look for simpler alternatives); see the worked ratio-test example after this list. BUT, the radius of convergence tells you NOTHING about whether or not (within the interval of convergence) the Taylor series $S_{a,f}$ is actually equal to the function $f$.

  • The remainder $R_{n,a,f}$ is by definition the difference between $f$ (the actual function) and $T_{n,a,f}$ (the approximation). It gives a quantitative measure of how good your approximation is. Also, if the Taylor series converges at a point $x$, then the limit $\lim_{n\to \infty}R_{n,a,f}(x)$ will exist. This limit may or may not be zero, and as shown above, we have $f(x) = S_{a,f}(x)$ if and only if this limit is $0$. So, the (limit of the) remainder allows you to answer the question "is my function equal to its Taylor series everywhere?"
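As promised, here is the worked ratio-test example, using the $\sin$ series from your question. With $a_n = (-1)^n \frac{x^{2n+1}}{(2n+1)!}$, \begin{align} \left|\frac{a_{n+1}}{a_n}\right| = \frac{|x|^{2n+3}/(2n+3)!}{|x|^{2n+1}/(2n+1)!} = \frac{x^2}{(2n+2)(2n+3)} \xrightarrow{n\to\infty} 0 < 1, \end{align} so $\rho = \infty$ and the series converges for every $x$. Note that this computation, by itself, says nothing about whether the sum equals $\sin(x)$; that is exactly what the remainder estimate above settles.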
