[Math] Understanding power series and their representation of functions

calculus, complex-analysis, power-series, real-analysis, sequences-and-series

Below is my current understanding of power series, and I want to know whether it is correct (it could be completely wrong, in which case please correct me).

I feel that power series are treated very poorly in most (introductory) textbooks. It seems as though authors keep dodging the central ideas of power series, and their relation to functions, for reasons that are never explained.


My Understanding of Power Series

Let's say we have a power series, call it $p(x)$ (as power series are functions themselves). If it converges to a finite value for each $x$ in its interval of convergence (with radius $R$), then it can be used to represent $f(x)$ within that interval:

$$\underbrace{f(x)}_\text{Some analytic function} = \underbrace{p(x)}_\text{A power series representation} \ \ \underbrace{\forall\ |x| < R}_\text{within the power series' radius of convergence}$$

An analytic function is equal to its power series representation within the power series' radius of convergence
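This equality can be checked numerically: inside the radius of convergence, the partial sums of the series get as close to the function as we like. A minimal sketch using $\sin(x)$, whose Maclaurin series converges for all $x$ (the helper name is my own):

```python
import math

# Partial sums of the Maclaurin series sin(x) = x - x^3/3! + x^5/5! - ...
# (radius of convergence: infinite)
def sin_partial_sum(x, n_terms):
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(n_terms))

x = 1.3
# Ten terms already agree with math.sin to (well below) 1e-12:
assert abs(sin_partial_sum(x, 10) - math.sin(x)) < 1e-12
```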


An Example: The Geometric Series

Take the famous geometric series (note that the LHS is $f(x)$ and the RHS is $p(x)$):

$$\frac{1}{1-x} = 1 + x + x^2 + x^3 + \ …$$

It has a radius of convergence $R = 1$, which means that only for $x \in (-1, 1)$ can it actually be used as a representation of $f(x) = \frac{1}{1-x}$. Outside this interval of convergence, equality breaks down and we can no longer use the series as a representation of $f$. Therefore, if we let $f(x) = \frac{1}{1-x}$ and $p(x) = 1+ x + x^2 + x^3 + \ …$, the following two statements are true:

$$f(x) = p(x) \ \ \ \forall \ |x| <1$$
$$f(x) \neq p(x) \ \ \ \forall\ |x| >1$$

(Indeed, for $|x| > 1$ the series diverges, so $p(x)$ is not even defined there.)

This is the reason why we talk about convergence of power series, and why we need a power series to converge: if it didn't converge, our power series representation $p(x)$ would never equal $f(x)$, and we could never use it as a way to evaluate $f(x)$.
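The contrast between the two statements above is easy to see numerically: inside $(-1, 1)$ the partial sums settle down to $\frac{1}{1-x}$, while outside they blow up. A small sketch (the helper name is my own):

```python
def geometric_partial_sum(x, n):
    # s_n(x) = 1 + x + x^2 + ... + x^n
    return sum(x**k for k in range(n + 1))

# Inside the interval of convergence, partial sums approach 1/(1-x):
x = 0.5
assert abs(geometric_partial_sum(x, 60) - 1/(1 - x)) < 1e-12

# Outside (e.g. x = 2), the partial sums blow up instead of
# approaching 1/(1-2) = -1:
assert geometric_partial_sum(2, 60) > 1e18
```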


Another example: $e^x$

In the case of $e^x$, the power series representation can actually be used as one of its definitions, because the power series representation of $e^x$ is valid for all $x \in \mathbb{C}$:

$$e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \frac{x^4}{4!} + \ … \ \ \forall x \in \mathbb{C}$$

Why are Power Series important?

Take the example of $e^x$ above: if we have a power series representation $p(x)$ of some function $f$, then we can use $p(x)$ to define $f(x)$ for all $x$ inside the radius of convergence of the power series!


Power Series/Taylor Series and Polynomial Approximations (The Big Picture)

A polynomial approximation of an analytic function (i.e., a partial sum of its power series) approaches the actual function as the number of terms tends to infinity, at which point it is equal to the analytic function.
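This "better and better as the degree grows" picture can be made concrete by watching the approximation error shrink. A minimal sketch for $e^x$ at $x = 1$ (the helper name is my own):

```python
import math

# Error of the degree-n partial sum of the exponential series at a point x.
def taylor_error(x, n):
    p_n = sum(x**k / math.factorial(k) for k in range(n + 1))
    return abs(p_n - math.exp(x))

errors = [taylor_error(1.0, n) for n in (2, 5, 10, 15)]
# Each extra batch of terms shrinks the error:
assert errors[0] > errors[1] > errors[2] > errors[3]
```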

This is the reason why transcendental functions, like $e^x$, $\sin(x)$, etc. are transcendental: they need a power series to represent/define them. They can't be defined by a finite number of terms. Thus the only way to define transcendental functions is via power series.

Furthermore, power series provide us with a deep way to express non-polynomial functions, such as trigonometric functions, in polynomial form (via a power series). It's one of the neat shortcuts that an infinite number of terms provides: the ability to represent non-polynomial functions as "infinite polynomials".


Questions:

  • Is my understanding correct?
  • Is there anything that you can add to what I've written above that would make Power Series clearer to those learning about them?
  • Furthermore, are there higher-level understandings of power series from complex analysis, real analysis, etc.?

Best Answer

It's not that your understanding is wrong, it's more that the wording of the question suggests that the definitions, properties, applications, and "cultural context" of power series are a bit tangled.

For concreteness and definiteness, let's work over the real numbers. If $x_{0}$ is a real number, and if $(a_{k})_{k=0}^{\infty}$ is an arbitrary sequence of real numbers, the associated power series with coefficients $(a_{k})$ and center $x_{0}$ is the expression $$ \sum_{k=0}^{\infty} a_{k} (x - x_{0})^{k}. \tag{1} $$

  1. Power series make sense as purely algebraic entities. For example, two power series (with the same center) can be added termwise and multiplied using the "Cauchy product": \begin{gather*} \sum_{k=0}^{\infty} a_{k} (x - x_{0})^{k} + \sum_{k=0}^{\infty} b_{k} (x - x_{0})^{k} = \sum_{k=0}^{\infty} (a_{k} + b_{k}) (x - x_{0})^{k}, \\ \left(\sum_{k=0}^{\infty} a_{k} (x - x_{0})^{k}\right) \left(\sum_{k=0}^{\infty} b_{k} (x - x_{0})^{k}\right) = \sum_{k=0}^{\infty} \left(\sum_{j=0}^{k} a_{j} b_{k - j}\right) (x - x_{0})^{k}. \end{gather*} These definitions make sense whether or not the individual series converge (next item). The crucial point is, each coefficient of the sum or product is a finite algebraic expression in the coefficients of the "operands".
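Since the termwise sum and Cauchy product involve only finite algebraic expressions in the coefficients, they can be sketched directly on truncated coefficient lists, with no convergence questions in sight (the function names are my own):

```python
def add_series(a, b):
    # Termwise sum of two (truncated) coefficient sequences.
    return [x + y for x, y in zip(a, b)]

def cauchy_product(a, b):
    # c_k = sum_{j=0}^{k} a_j * b_{k-j}
    n = min(len(a), len(b))
    return [sum(a[j] * b[k - j] for j in range(k + 1)) for k in range(n)]

# (1 + x + x^2 + ...)^2 = 1 + 2x + 3x^2 + ... : squaring the all-ones
# coefficient sequence yields the coefficients 1, 2, 3, ...
ones = [1] * 6
assert cauchy_product(ones, ones) == [1, 2, 3, 4, 5, 6]
assert add_series(ones, ones) == [2, 2, 2, 2, 2, 2]
```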

  2. For each real number $x$, the power series (1) is an infinite series of real numbers, which may converge (the sequence of partial sums $$ s_{n}(x) = \sum_{k=0}^{n} a_{k} (x - x_{0})^{k} $$ converges to a finite limit) or diverge (otherwise). Clearly, (1) converges for $x = x_{0}$. It's not difficult to show that:

    a. If (1) converges for some $x$ with $|x - x_{0}| = r$, then (1) converges for every $x$ with $|x - x_{0}| < r$;

    b. If (1) diverges for some $x$ with $|x - x_{0}| = r$, then (1) diverges for every $x$ with $|x - x_{0}| > r$.

    It follows that for every power series (1), there exists an extended non-negative real number $R$ (i.e., $0 \leq R \leq \infty$) such that (1) converges for $|x - x_{0}| < R$ and diverges for $|x - x_{0}| > R$. (If $R = 0$, the former condition is empty; if $R = \infty$, the latter is empty.) This $R$ is called the radius of the power series (1). The power series $$ \sum_{k=0}^{\infty} k!\, x^{k},\qquad \sum_{k=0}^{\infty} \frac{x^{k}}{R^{k}},\qquad \sum_{k=0}^{\infty} \frac{x^{k}}{k!} $$ have radii $0$, $R$, and $\infty$, respectively.
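A quick numerical way to see these three radii is the Cauchy–Hadamard formula $1/R = \limsup_{k} |a_{k}|^{1/k}$ (a standard result, not proved above). A rough sketch, using a single large $k$ as a stand-in for the limsup, which works when the limit exists (the helper name and the default $k$ are my own choices):

```python
import math

# Crude Cauchy-Hadamard estimate: R is approximately 1 / |a_k|^(1/k)
# for one large k (valid when the limit of |a_k|^(1/k) exists).
def estimate_radius(coeff, k=150):
    a_k = coeff(k)
    if a_k == 0:
        return math.inf
    return 1.0 / abs(a_k) ** (1.0 / k)

# a_k = k!      =>  radius 0 (estimate is tiny)
assert estimate_radius(lambda k: math.factorial(k)) < 0.1
# a_k = 1/2^k   =>  radius 2
assert abs(estimate_radius(lambda k: 0.5**k) - 2.0) < 1e-9
# a_k = 1/k!    =>  radius infinite (estimate grows without bound in k)
assert estimate_radius(lambda k: 1 / math.factorial(k)) > 50
```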

  3. If (1) has positive radius (i.e., $0 < R$), the sum of the series defines a function $$ p(x) = \sum_{k=0}^{\infty} a_{k} (x - x_{0})^{k} $$ whose domain is the set of $x$ with $|x - x_{0}| < R$. In this region, $p$ is infinitely differentiable, and its derivatives are found by termwise differentiation, e.g., $$ p'(x) = \sum_{k=1}^{\infty} ka_{k} (x - x_{0})^{k-1} = \sum_{k=0}^{\infty} (k + 1) a_{k+1} (x - x_{0})^{k}. $$ Each derivative is itself a power series, convergent in the same region.
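Termwise differentiation is again a purely coefficient-level operation: $a_{k} \mapsto (k+1)a_{k+1}$. A sketch checking it on the (truncated) exponential series, whose derivative should reproduce the same coefficients since $\exp' = \exp$ (function names are my own):

```python
import math

def derivative_coeffs(a):
    # If p(x) = sum a_k x^k, then p'(x) = sum (k+1) a_{k+1} x^k.
    return [(k + 1) * a[k + 1] for k in range(len(a) - 1)]

def evaluate(a, x):
    # Evaluate the truncated power series with coefficients a at x.
    return sum(c * x**k for k, c in enumerate(a))

a = [1 / math.factorial(k) for k in range(20)]   # exponential coefficients
da = derivative_coeffs(a)
x = 0.7
# p' agrees with p (to the available truncation order):
assert abs(evaluate(da, x) - evaluate(a[:-1], x)) < 1e-12
```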

    A function $f$ is analytic at $x_{0}$ if $f$ is represented by a convergent power series in some open interval about $x_{0}$, and is analytic if $f$ is analytic at $x_{0}$ for every interior point $x_{0}$ of the domain of $f$.

  4. If $f$ is infinitely-differentiable at some point $x_{0}$, the Taylor series of $f$ at $x_{0}$ is the power series $$ \sum_{k=0}^{\infty} \frac{f^{(k)}(x_{0})}{k!} (x - x_{0})^{k}. $$ As is well-known, a function can be infinitely differentiable at $x_{0}$ without the Taylor series converging to $f$ in any open interval about $x_{0}$. The standard example is $f(x) = e^{-1/x^{2}}$ if $x \neq 0$, and $f(0) = 0$, whose Taylor series at $0$ is identically zero.
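The point of this standard example is easy to exhibit numerically: the function is strictly positive away from $0$, while its Taylor series at $0$ (all coefficients zero) returns $0$ everywhere, so the two never agree off the origin:

```python
import math

# The classic smooth-but-not-analytic function:
# f(x) = exp(-1/x^2) for x != 0, and f(0) = 0.
# Every derivative of f at 0 vanishes, so its Taylor series at 0 is 0.
def f(x):
    return 0.0 if x == 0 else math.exp(-1.0 / x**2)

def taylor_series_at_0(x):
    return 0.0  # all Taylor coefficients at 0 are zero

x = 0.5
assert f(x) > 0.0                    # the function is nonzero here...
assert taylor_series_at_0(x) == 0.0  # ...but its Taylor series is not f
```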

  5. Being transcendental is not, by definition, a matter of analyticity or of power series at all. As Paramanand Singh notes, the defining property of a transcendental function is not "requires an infinite series" (i.e., "not a polynomial"), but "does not satisfy a polynomial equation in two variables" (i.e., "is not an algebraic function").

  6. Power series are important for many reasons. The most common rationale for introducing them in calculus is to obtain algebraic/analytic expressions for the exponential function (and the closely related circular and hyperbolic functions, and power functions with non-integer exponents), etc., etc. For example, from the exponential power series, one obtains $$ e = \exp(1) = \sum_{k=0}^{\infty} \frac{1}{k!}, $$ from which one can easily show $e$ is irrational.
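Evaluating that series with exact rational arithmetic gives a very accurate value of $e$ with only a handful of terms (a sketch; `Fraction` keeps the partial sum exact until the final conversion to a float):

```python
import math
from fractions import Fraction

# e as the value of the exponential series at x = 1, summed in exact
# rational arithmetic, then compared with the float constant math.e.
e_approx = sum(Fraction(1, math.factorial(k)) for k in range(25))
assert abs(float(e_approx) - math.e) < 1e-15
```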

    Another application is to obtain power series solutions of linear ordinary differential equations with non-constant coefficients. Power series are by no means the only infinite series useful for studying functions; Fourier series and wavelets come immediately to mind.
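The power-series method for ODEs can be sketched in the simplest possible case, $y' = y$ with $y(0) = 1$ (my choice of example, not one from the text): substituting $p(x) = \sum a_{k} x^{k}$ and matching coefficients gives the recurrence $(k+1)\,a_{k+1} = a_{k}$, which of course rebuilds the exponential series.

```python
import math

# Solve y' = y, y(0) = 1 by coefficient recurrence: (k+1) a_{k+1} = a_k.
def series_solution(n_terms):
    a = [1.0]                      # a_0 = y(0) = 1
    for k in range(n_terms - 1):
        a.append(a[k] / (k + 1))   # a_{k+1} = a_k / (k+1)
    return a

a = series_solution(20)
x = 0.9
p = sum(c * x**k for k, c in enumerate(a))
assert abs(p - math.exp(x)) < 1e-12   # the recurrence recovers e^x
```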

  7. As for generalizations: Power series with complex coefficients (and complex centers) make perfect sense; the wordings and notation above are chosen to minimize the modifications required to consider complex power series. There are useful notions of matrix-valued power series, operator-valued power series, etc.

    Complex power series exhibit interesting behavior (such as monodromy) not encountered over the reals, because a connected open set in the plane need not be simply-connected. Further, every holomorphic (i.e., complex-differentiable) function of one complex variable is automatically analytic. The logical strength of being "once complex-differentiable" gives complex analysis a completely different flavor from real analysis.