Converse of Taylor’s Theorem

approximation-theory, functional-equations, numerical-methods, real-analysis, taylor-expansion

Let $n$ be a nonnegative integer and $a,b\in\mathbb{R}$ such that $a<b$. From Taylor's Theorem, we know that any $n$-times differentiable function $f:(a,b)\to \mathbb{R}$ satisfies the condition that
$$f(x+h)=\sum_{k=0}^n\,\frac{f_k(x)}{k!}\,h^k+R_n(x,h)\text{ for all $x\in(a,b)$ and $h\in(a-x,b-x)$}\,,\tag{*}$$
where $f_k:(a,b)\to\mathbb{R}$ is the $k$-th derivative of $f$ for each $k=0,1,2,\ldots,n$ (in particular, $f_0=f$), and the $n$-th remainder term $R_n(x,h)$ satisfies
$$R_n(x,h)=o\left(h^n\right)\text{ as $h\to 0$, for each $x\in(a,b)$}\,.\tag{**}$$
(In other words, $\lim\limits_{h\to 0}\,\dfrac{R_n(x,h)}{h^n}=0$ for all $x\in (a,b)$.)
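
As a concrete sanity check of (**), one can watch the normalized remainder shrink numerically. The sketch below is only an illustration; the choices $f=\exp$, $n=2$, and $x=0.5$ are arbitrary.

```python
import math

# Numerical illustration of (**): for a smooth f (here exp, whose
# derivatives all equal exp), the remainder
#   R_n(x, h) = f(x+h) - sum_{k=0}^{n} f_k(x) h^k / k!
# should satisfy R_n(x, h) / h^n -> 0 as h -> 0.
f = math.exp
n, x = 2, 0.5  # arbitrary choices for this illustration

for h in [10.0 ** (-j) for j in range(1, 5)]:
    taylor = sum(f(x) * h**k / math.factorial(k) for k in range(n + 1))
    remainder = f(x + h) - taylor
    print(f"h = {h:.0e}   R_n(x,h)/h^n = {remainder / h**n:.3e}")
```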

I would like to know whether the converse of Taylor's Theorem is true. In other words, is the following conjecture correct?

Conjecture. Suppose that the functions $f,f_0,f_1,f_2,\ldots,f_n:(a,b)\to\mathbb{R}$ satisfy (*) and (**). Then $f$ is $n$-times differentiable, with $k$-th derivative $f_k$ for each $k=0,1,2,\ldots,n$ (in particular, $f_0=f$).

From this link, it appears that some continuity or boundedness constraints on the $f_k$'s or on the remainder term $R_n$ must be assumed for the converse to hold. If the converse does not hold in general (i.e., without these continuity or boundedness constraints), could anybody give a counterexample? If it is true, could you please give me a proof or a reference? What I know is that the converse holds for $n=0$ (trivially) and $n=1$ (with a small amount of work; see the sketch below).
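
For the record, here is the small argument I have in mind for $n=1$, granting the identification $f_0=f$ that the conjecture asserts. With $n=1$, conditions (*) and (**) say that
$$f(x+h)=f(x)+f_1(x)\,h+R_1(x,h)\qquad\text{with}\qquad\lim_{h\to 0}\frac{R_1(x,h)}{h}=0\,,$$
so
$$\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}=\lim_{h\to 0}\left(f_1(x)+\frac{R_1(x,h)}{h}\right)=f_1(x)\,,$$
which is exactly the statement that $f$ is differentiable on $(a,b)$ with $f'=f_1$.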

Best Answer

For $n > 1$ we need some regularity assumptions on the $f_k$ (or on $R_n$). Without such assumptions, consider $$f(x) = \begin{cases} 0 &\text{if } x = 0 \\ x^{n+1}\sin (x^{-n}) &\text{if } x \neq 0\end{cases}$$ and $$f_k(x) = \begin{cases} 0 &\text{if } x = 0 \\ f^{(k)}(x) &\text{if } x \neq 0\end{cases}$$ on any interval $(a,b)$ containing $0$. Away from the origin $f$ is analytic, hence the remainder term $R_n(x,h)$ is $o(h^n)$ for every $x \in (a,b) \setminus \{0\}$ (though the rate depends on $x$, of course). For $x = 0$, all the $f_k(0)$ vanish, so $$\frac{R_n(0,h)}{h^n} = \frac{f(h)}{h^n} = h\sin (h^{-n})\,,$$ and thus $R_n(x,h) = o(h^n)$ for every $x \in (a,b)$. But $f$ is differentiable only once, since $$f_1(x) = f'(x) = \begin{cases} 0 &\text{if } x = 0 \\ (n+1)x^n\sin(x^{-n}) - n \cos (x^{-n}) &\text{if } x \neq 0\end{cases}$$ isn't even continuous at $0$, let alone differentiable there.
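
A quick numerical illustration of this counterexample (my own sketch; the choice $n=2$ is arbitrary): the normalized remainder $R_n(0,h)/h^n=h\sin(h^{-n})$ shrinks with $h$, while $f'$ keeps oscillating with amplitude roughly $n$ near the origin, so it cannot be continuous there.

```python
import math

n = 2  # any n > 1 exhibits the same behaviour; n = 2 chosen for illustration

def f(x):
    # f(x) = x^{n+1} sin(x^{-n}) for x != 0, and f(0) = 0
    return 0.0 if x == 0 else x ** (n + 1) * math.sin(x ** (-n))

def f_prime(x):
    # f'(x) = (n+1) x^n sin(x^{-n}) - n cos(x^{-n}) for x != 0, and f'(0) = 0
    return 0.0 if x == 0 else (n + 1) * x**n * math.sin(x ** (-n)) - n * math.cos(x ** (-n))

# The normalized remainder at the origin, R_n(0,h)/h^n = f(h)/h^n = h sin(h^{-n}), tends to 0 ...
for h in [10.0 ** (-j) for j in range(1, 6)]:
    print(f"h = {h:.0e}   R_n(0,h)/h^n = {f(h) / h**n:+.3e}")

# ... while f' keeps oscillating between about -n and +n, so it has no limit at 0.
for x in [10.0 ** (-j) for j in range(1, 6)]:
    print(f"x = {x:.0e}   f'(x) = {f_prime(x):+.3f}")
```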
