Power series and differentiation; proof-explanation

Tags: power-series, proof-explanation, real-analysis

I am reading a proof in my no-name lecture notes of the theorem stating that a power series may be differentiated term by term inside its interval of convergence, and that the differentiated series has the same radius of convergence as the original one. The proof is a bit lengthy but quite elementary, and I have only one question.

The proof uses the following Lemma:

Lemma. Assume that the power series $\sum_{k=0}^{\infty} a_k x^k$ has a positive radius of convergence $R$. Then its sum $s(x)$ is differentiable at the origin and $s'(0)=a_1$.

Theorem. Assume that the power series $\sum_{k=0}^{\infty} a_k x^k$ has radius of convergence $R>0$. Then its sum $s(x)$ is a continuous function of $x$ when $|x|<R$. Furthermore, $s(x)$ is differentiable when $|x|<R$ and its derivative is
$$
s^{\prime}(x)=\sum_{k=1}^{\infty} k a_k x^{k-1}, \tag{1}
$$

which is a power series with the same radius of convergence $R$.
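
To see numerically what the theorem asserts before wading into the proof, here is a minimal sketch (my own illustration, not from the notes) with the geometric coefficients $a_k=1$, for which $s(x)=1/(1-x)$, $R=1$, and (1) predicts $s'(x)=1/(1-x)^2$; the truncation order $N$ and sample point $x$ are arbitrary choices of this sketch.

```python
# Term-wise differentiation of the geometric series: a_k = 1, so
# s(x) = 1/(1-x) with R = 1, and (1) predicts s'(x) = 1/(1-x)^2.

N = 200   # truncation order (a choice of this sketch)
x = 0.5   # a sample point with |x| < R = 1

s_prime_series = sum(k * x**(k - 1) for k in range(1, N + 1))
s_prime_exact = 1 / (1 - x)**2

print(s_prime_series)  # approximately 4.0
print(s_prime_exact)   # 4.0
```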

Proof. The continuity follows from the differentiability.
Fix an $x$ such that $|x|<R$ and choose $h$ so that $|x|+|h|<R$. Then
$$
\sum_{k=0}^{\infty}\left|a_k\right|(|x|+|h|)^k<\infty . \tag{2}
$$

Now consider the following table.
$$
\begin{matrix}
a_0 & 0 & 0 & \cdots & 0 & 0 & \cdots \\
a_1 x & a_1 h & 0 & \cdots & 0 & 0 & \cdots \\
a_2 x^2 & 2 a_2 x h & a_2 h^2 & \cdots & 0 & 0 & \cdots \\
\vdots & \vdots & \vdots & & \vdots & \vdots & \\
a_n x^n & n a_n x^{n-1} h & \binom{n}{2} a_n x^{n-2} h^2 & \cdots & a_n h^n & 0 & \cdots \\
\vdots & \vdots & \vdots & & \vdots & \vdots &
\end{matrix}
$$

The sum of the elements in row $n+1$ is, according to the binomial theorem, equal to $a_n(x+h)^n$. Hence the sum of all the row sums is
$$
\sum_{n=0}^{\infty} a_n(x+h)^n=s(x+h) .
$$

If we instead form the sum of all the column sums, then we get an expression
$$
A_0(x)+A_1(x) h+A_2(x) h^2+\cdots
$$

where $A_0(x)=s(x)$,
$$
A_1(x)=a_1+2 a_2 x+\cdots+n a_n x^{n-1}+\cdots=\sum_{k=1}^{\infty} k a_k x^{k-1},
$$

and so forth. The double series formed by all the terms in the table is absolutely convergent, for if we replace each term with its absolute value and sum row-wise in the same way as above, then we get the series (2). (The sum of a positive series is independent of the order of summation.) From the absolute convergence it follows that row-wise and column-wise summations in the table give the same result. Hence
$$
s(x+h)=A_0(x)+A_1(x) h+A_2(x) h^2+\cdots .\tag{3}
$$
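
As a sanity check on this rearrangement (not part of the proof), the sketch below builds a truncated copy of the table with the geometric coefficients $a_k=1$ (an assumption of the sketch, as is the truncation order $N$), sums it by rows and by columns, and compares both totals with $s(x+h)=1/(1-(x+h))$.

```python
from math import comb

# Truncated table: entry (n, j) is C(n, j) * a_n * x^(n-j) * h^j.
# Row n sums to a_n (x+h)^n; column j sums to A_j(x) h^j.  With a_k = 1
# the total is s(x+h) = 1/(1 - (x+h)), valid since |x| + |h| < R = 1.

x, h, N = 0.3, 0.2, 400   # |x| + |h| = 0.5 < 1 (choices of this sketch)

table = [[comb(n, j) * x**(n - j) * h**j for j in range(n + 1)]
         for n in range(N + 1)]

row_wise = sum(sum(row) for row in table)
col_wise = sum(sum(table[n][j] for n in range(j, N + 1))
               for j in range(N + 1))

print(row_wise, col_wise, 1 / (1 - (x + h)))  # all approximately 2.0
```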

Recall that $x$ is fixed and that the variable is $h$. In (3), we have written $s(x+h)$ as the sum of a power series in $h$, which converges when $|h|<R-|x|$. The Lemma now yields that $s(x+h)$ is differentiable at $h=0$ with derivative $A_{1}(x)$. This shows that $s(x)$ is differentiable at $x$ and that Formula (1) is valid.

It remains to show that the radius of convergence $R^{\prime}$ of the differentiated series (1) is equal to $R$. It is plain that $R^{\prime} \geq R$ because the series $A_{1}(x)$ is convergent when $|x|<R$. The reverse inequality $R \geq R^{\prime}$ follows from the comparison theorem because the inequality
$$
\left|a_{k} x^{k}\right| \leq|x|\left|k a_{k} x^{k-1}\right|, \quad k \geq 1
$$

shows that the given power series is convergent if the differentiated power series is (absolutely) convergent. (This also shows that if $R=0$, then $R^{\prime}=0$.) The proof of the Theorem is complete.
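
For what it is worth, there is also a standard one-line way to see $R'=R$ that these notes do not use: by the Cauchy-Hadamard formula, $1/R=\limsup_k |a_k|^{1/k}$ and $1/R'=\limsup_k \left(k|a_k|\right)^{1/k}$, and the two agree because $k^{1/k}\to 1$. The snippet below merely illustrates that last limit numerically.

```python
# Illustration of k**(1/k) -> 1, the fact behind the Cauchy-Hadamard
# argument that differentiating a power series preserves its radius.

for k in (10, 100, 1000, 10**6):
    print(k, k**(1 / k))
# 10 1.2589..., 100 1.0471..., 1000 1.0069..., 1000000 1.0000138...
```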

  1. Why does $A_1$ converge for $|x|< R$?

It is plain that $R^{\prime} \geq R$ because the series $A_{1}(x)$ is convergent when $|x|<R$.

Best Answer

There are many well-known ways to prove this theorem, and certainly many posts about it on this site. But just above, the author gave you the simplest justification for the words "It is plain that [...] $A_{1}(x)$ is convergent when $|x|<R$":

Choose any $h\ne0$ such that $|x|+|h|<R$, and remember that $$|h|\sum_{k=1}^{\infty} k |a_k| |x|^{k-1}\le\sum_{k=0}^{\infty}\left|a_k\right|\left(|x|+|h|\right)^k<\infty,$$ where the first inequality holds term by term because $(|x|+|h|)^k\ge\binom{k}{1}|x|^{k-1}|h|=k|x|^{k-1}|h|$ by the binomial theorem.
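
If it helps to make this domination concrete, here is a quick numerical check (my own sketch; the coefficients $a_k=1$, so $R=1$, and the truncation order $N$ are assumptions of the sketch).

```python
# Check |h| * sum k|a_k||x|^(k-1)  <=  sum |a_k|(|x|+|h|)^k with a_k = 1.

x, h, N = 0.3, 0.2, 400   # |x| + |h| = 0.5 < R = 1

lhs = abs(h) * sum(k * abs(x)**(k - 1) for k in range(1, N + 1))
rhs = sum((abs(x) + abs(h))**k for k in range(N + 1))

print(lhs, rhs, lhs <= rhs)  # approximately 0.408, 2.0, True
```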
