If the Taylor expansion of $f$ converges to $f$, prove that there are constants $C,R$ such that $|f^{(k)}(x)| \le C \cdot \frac{k!}{R^k}$

Tags: convergence-divergence, real-analysis, taylor-expansion

Q: Let $f$ be an infinitely differentiable function on an open interval $I$ centered at $a$. Assume that the Taylor expansion of $f$ about $a$ converges to $f$ at every point of $I$. Prove that there are constants $C,R$ and a (possibly smaller) interval $J$ centered at $a$ such that, for each $x \in J$ and every integer $k \ge 0$, it holds that:
\begin{align*}
|f^{(k)}(x)| \le C \cdot \frac{k!}{R^k} \\
\end{align*}


My Work:

Since the Taylor series about $a$ converges to $f$ at each $x \in I$, the (integral form of the) Taylor remainder must converge to zero:

\begin{align*}
R_{k,a}(x) &= \int_a^x f^{(k+1)}(t) \frac{(x-t)^k}{k!} \, dt \\
R_{k,a}(x) &\to 0 \quad \text{as } k \to \infty \\
\end{align*}
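
As a concrete sanity check (not part of the problem itself), take $f(x) = e^x$ and $a = 0$: bounding the integrand crudely already shows the remainder vanishing,

\begin{align*}
|R_{k,0}(x)| = \left| \int_0^x e^t \, \frac{(x-t)^k}{k!} \, dt \right| \le e^{|x|} \cdot \frac{|x|^{k+1}}{(k+1)!} \to 0 \quad \text{as } k \to \infty.
\end{align*}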

This convergence means that, for a fixed $x \in I$ and any $\epsilon > 0$, there must exist some $N$ such that:

\begin{align*}
k > N &\implies |R_{k,a}(x) - 0| < \epsilon \\
\end{align*}

So if $k > N$, we have:

\begin{align*}
\left| \int_a^x f^{(k+1)}(t) \frac{(x-t)^k}{k!} \, dt \right| < \epsilon \\
\end{align*}

From here, I'm stuck. I can prove the reverse direction, for what that's worth.

Best Answer

You know that $$ f(x) = \sum_{n=0}^\infty a_n (x-a)^n $$ converges on $I$, and so by the $n$th-root test $$ |a_n| \le c_r / r^n$$ for any positive $r$ less than the radius of convergence, where $c_r>0$ depends upon $r$.
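
One way to unpack that root-test step: writing $\rho$ for the radius of convergence and fixing $0 < r < \rho$, the Cauchy–Hadamard formula gives $$ \limsup_{n\to\infty} |a_n|^{1/n} = \frac{1}{\rho} < \frac{1}{r}, $$ so $|a_n|\, r^n \le 1$ for all $n$ past some $N$, and taking $c_r = \max\{1,\ |a_0|,\ |a_1| r,\ \dots,\ |a_{N-1}| r^{N-1}\}$ yields $|a_n| \le c_r / r^n$ for every $n$.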

Also $$ f^{(k)}(x) = \sum_{n=k}^\infty n(n-1)\dots (n-k+1)a_n (x-a)^{n-k}.$$ If you plug in the inequality for $a_n$ and naively upper bound the sum when $|x-a| < r$, and use $$ \sum_{n=k}^\infty n(n-1)\dots (n-k+1) y^{n-k} = \frac{d^k}{dy^k} \left(\sum_{n=0}^\infty y^n \right) = \frac{d^k}{dy^k}\left(\frac1{1-y}\right) = \frac{k!}{(1-y)^{k+1}}, \qquad (|y|<1)$$ you get $$ |f^{(k)}(x)| \le c_r \frac{r k!}{(r-|x-a|)^{k+1}} . $$ Now choose $0 < R < r$ and set $J = \{x:|x-a|<r-R\}$.
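
To spell out the last step: for $x \in J$ we have $r - |x-a| > R$, hence $$ |f^{(k)}(x)| \le c_r \frac{r\, k!}{(r - |x-a|)^{k+1}} \le \frac{c_r\, r}{R} \cdot \frac{k!}{R^{k}}, $$ which is the desired bound with $C = c_r\, r / R$ (taking $r$ smaller than the half-length of $I$ also ensures $J \subseteq I$).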