[Math] Radius of convergence of a Taylor series.

real-analysis

I am looking for the shortest possible way to find the radius of convergence of the Taylor series expansion about $x = a \in \mathbb{R}$ of the function

$$f(x) = \frac{1}{1 + x^2}$$

The Taylor series expansion of $f(x)$ about $a$ is $f(x) = \sum_{n = 0}^{\infty} a_n (x - a)^n$, where $a_n = \frac{f^{(n)}(a)}{n!}$. So one possible way is to compute $f^{(n)}(a)$ for each $n$, form the coefficients $a_n$, and then apply the root test, the ratio test, etc. to find the radius of convergence of the power series.
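For concreteness, here is a minimal sketch (SymPy assumed available, expansion point $a = 1$ chosen only for illustration) of that direct approach — computing $a_n = f^{(n)}(a)/n!$ and estimating the radius via the root test $r = 1/\limsup_n \lvert a_n\rvert^{1/n}$:

```python
import sympy as sp

x = sp.symbols('x', real=True)
f = 1 / (1 + x**2)
a = 1      # expansion point, chosen only for illustration
N = 40     # number of coefficients to examine

# Taylor coefficients a_n = f^(n)(a) / n!, exactly as defined above.
coeffs = [sp.diff(f, x, n).subs(x, a) / sp.factorial(n) for n in range(1, N)]

# Finite-sample stand-in for limsup |a_n|^(1/n): take the largest value of
# |a_n|^(1/n) over the later, nonzero coefficients.
roots = [float(abs(c)) ** (1.0 / n)
         for n, c in enumerate(coeffs, start=1) if n >= N // 2 and c != 0]
print(1 / max(roots))             # approaches sqrt(2) ~ 1.414 as N grows (about 1.43 here)
print(float(sp.sqrt(1 + a**2)))   # the closed form given in the answer below
```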

However, I want to reduce the amount of calculation, so the above process is not what I am after.

Thank you for your help.

Best Answer

The function

$$f(z) = \frac{1}{1+z^2}$$

is meromorphic in the entire plane. Therefore, the Taylor series about any point $a$ converges in the largest open disk centred at $a$ that contains no pole of $f$. Since $f$ has only two poles, at $i$ and $-i$, the radius of convergence of the Taylor series is $\min \{ \lvert a-i\rvert, \lvert a+i\rvert\}$. For real $a$, the two distances are equal, and the radius of convergence is $\sqrt{1+a^2}$.
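To make the last step explicit: for real $a$ both pole distances are

$$\lvert a - i\rvert = \lvert a + i\rvert = \sqrt{a^2 + 1},$$

so the largest pole-free disk centred at $a$ has radius $\sqrt{1+a^2}$. As a sanity check, $a = 0$ gives radius $1$, matching the geometric series $\frac{1}{1+x^2} = \sum_{n=0}^{\infty} (-1)^n x^{2n}$, which converges exactly for $\lvert x\rvert < 1$.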
