[Math] Conditional convergence and Riemann’s series theorem

convergence-divergence · improper-integrals · real-analysis · sequences-and-series

  • There are tests to determine whether an integral or sum is convergent.

  • There are tests to determine whether an integral or sum is absolutely convergent.

An integral or series is said to be $\mathbf {conditionally \; convergent}$ if it converges, but does not converge absolutely.
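
For concreteness, the standard example of a conditionally convergent series is the alternating harmonic series:
$$ \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = 1 - \frac{1}{2} + \frac{1}{3} - \ldots = \ln 2, $$
which converges by the alternating series test, while the corresponding series of absolute values is the harmonic series $\sum_{n=1}^{\infty} \frac{1}{n}$, which diverges.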

$(1) \;$First of all I don't really understand the need to introduce this extra terminology. When you are presented with a certain series or integral, there should be no problem in stating that it is convergent, but not absolutely convergent. In fact that would seem to me the clearest qualification.

$(2) \;$ Secondly, I find the qualification "conditionally convergent" vague and suggestive. It seems to imply that only under certain (extra?) conditions can the series or integral be considered convergent; if these (extra?) conditions are not met, the series or integral should in fact be considered non-convergent. However, this is clearly nonsense, because at this stage we have already established (under generally accepted criteria) that the series or integral is convergent.

[It sounds to me like the verdict of a court where a defendant is "acquitted" of an alleged crime, but also "conditionally acquitted" of exactly the same crime. What would be the meaning?]

$(3) \;$Thirdly, I have difficulty with the connection between the conditional convergence of a series and Riemann's series theorem. In this theorem the assumption is made that the terms in the series can be shuffled around at liberty. By clever reordering of the terms any result can be obtained. This leads to the conclusion that if an (alternating) series is not absolutely convergent, then the value of the sum is undefined. This is a most curious result, because we have already determined that the series is convergent and its sum has a unique value.

It seems to me that the fallacy resides in the assumption that all the (infinitely many) terms are available for reordering. But that is not the proper way to look at an infinite sum. One should consider the partial sum of the first $L$ terms, and take the limit as $L$ goes to infinity -- the Cauchy approach. Since each partial sum is finite, reordering its terms is a pointless exercise, since it cannot affect the sum.

I would appreciate it very much if you could comment on the three points above.

EDIT: I accept John Hughes' detailed explanation and I thank him for his efforts.

Best Answer

You don't like the term "conditionally convergent" because it's redundant -- it can be defined in terms of other things we already have -- and misleading, because it sounds as if we're saying something might be convergent when we know that it IS convergent.

The first is (I think) a not very good reason to dislike a definition. Almost everywhere in mathematics, we can already write out the words of any definition in place of the thing defined, but it turns out to be nice to be able to say "convergent" for a series or "continuous" for a function without writing out lots of deltas and epsilons. The same goes for "compact", and "connected" and lots of other good words. The real reason to quibble with a definition (I believe) is when it doesn't really have any purpose. If you define a function to be "q-nice" if it exactly equals $\cos(x) / x$ on the irrationals, no one else will ever have occasion to use your new term. But "absolutely" and "conditionally" convergent turn out to come up a lot, so giving them names is a good idea.

The second complaint is perfectly legitimate -- you may not like the word chosen to denote the thing. (I personally think that "compact" is a stupid word for the thing it denotes, but I'm in a small minority, alas.) I share your distaste for "conditionally convergent", but I can't think of a better replacement, and even if I did, it'd be tough to get much traction for it. (Things can change, though: my sense is that the word "codomain" has gained some traction over "range" in the last couple of decades, esp. since "range" was used in various books to denote the codomain, but used to denote the image in other books, so it was pretty confusing already.)

As for the third thing: A series isn't an algebraic expression (although it's often written to resemble one), so we shouldn't expect it to have all the properties that an algebraic expression has (like satisfying the commutative law for addition). Instead, a series is based on a function from the positive integers to the reals (with $i$ being sent to the thing we call $a_i$); such a function is called a "sequence". Associated to the sequence is another sequence, the "partial sums," $s_k = \sum_{i = 1}^k a_i$. That sequence may or may not have a limit.
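
To make this picture concrete, here is a minimal sketch in Python (my own illustration, not part of the original answer), building the partial-sum sequence from the term sequence of the alternating harmonic series; whether the partial-sum sequence has a limit is, by definition, whether the series converges:

```python
from itertools import accumulate, count, islice

def a(i: int) -> float:
    # The term sequence: a_1 = 1, a_2 = -1/2, a_3 = 1/3, ...
    return (-1) ** (i + 1) / i

def partial_sums(term, n):
    # The first n partial sums s_k = a_1 + ... + a_k.
    return list(islice(accumulate(term(i) for i in count(1)), n))

print(partial_sums(a, 5))  # [1.0, 0.5, 0.8333..., 0.5833..., 0.7833...]
```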

If you alter a sequence by permuting its terms (i.e., you build a different function $b = a \circ s$ for some permutation $s$ of the positive integers, so that $b$ runs through exactly the same values as $a$), the associated sequence of partial sums will be different, and hence might have a different limit.

Riemann's theorem tells you that in certain circumstances -- namely, absolute convergence -- it will not have a different limit: the sum of the series with terms $a_i$ then turns out to depend solely on the values taken by the function $a$ rather than on the function itself. Without absolute convergence, there is no such guarantee.
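
For reference, here is the standard two-part statement being invoked:
$$ \sum_{i=1}^{\infty} |a_i| < \infty \quad \Longrightarrow \quad \sum_{i=1}^{\infty} a_{s(i)} = \sum_{i=1}^{\infty} a_i \ \text{ for every permutation } s, $$
while if $\sum_i a_i$ converges but $\sum_i |a_i|$ diverges, then for every $L \in \mathbb{R} \cup \{-\infty, +\infty\}$ there is a permutation $s$ with $\sum_{i=1}^{\infty} a_{s(i)} = L$.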

Added post-comments: The claim in the original question, "Since each partial sum is finite, reordering its terms is a pointless exercise, since it cannot affect the sum" is mistaken, as is "Under Cauchy the Riemann shuffle is ineffective. So the trick only works if you drag terms from infinity upfront and/or shift terms towards infinity." (Perhaps I should say of the latter not that it's mistaken, but rather that it makes no sense to me.)

Let me give an example of a shuffled series whose partial-sum sequence is different from that of the original series, which will show that your first statement is false. Let's look at the alternating harmonic series:

$$ 1 - \frac{1}{2} + \frac{1}{3} - \ldots. $$

The partial sums are $$ 1, 1 - \frac{1}{2}, 1 - \frac{1}{2} + \frac{1}{3}, $$ and so on.

Now I'm going to shuffle the part of the series after the first element by the following rule: for $k \ge 2$, move the term $-\frac{1}{2^k}$ so that it appears just before the term $\frac{1}{2^{k-1} - 1}$. I could write this out algebraically, but let me instead write out the first few partial sums, and you'll see the pattern: \begin{align} &- \frac{1}{4}\\ &- \frac{1}{4} + 1\\ &- \frac{1}{4} + 1 - \frac{1}{2}\\ &- \frac{1}{4} + 1 - \frac{1}{2} - \frac{1}{8}\\ &- \frac{1}{4} + 1 - \frac{1}{2} - \frac{1}{8} + \frac{1}{3}\\ &- \frac{1}{4} + 1 - \frac{1}{2} - \frac{1}{8} + \frac{1}{3} + \frac{1}{5}\\ &- \frac{1}{4} + 1 - \frac{1}{2} - \frac{1}{8} + \frac{1}{3} + \frac{1}{5} - \frac{1}{6}\\ &- \frac{1}{4} + 1 - \frac{1}{2} - \frac{1}{8} + \frac{1}{3} + \frac{1}{5} - \frac{1}{6} - \frac{1}{16}\\ &- \frac{1}{4} + 1 - \frac{1}{2} - \frac{1}{8} + \frac{1}{3} + \frac{1}{5} - \frac{1}{6} - \frac{1}{16} + \frac{1}{7} \end{align} and so on.

If we call the original partial sums $s_k$ and these new sums $t_k$, it's easy to check that the two partial-sum sequences are genuinely different: for instance $t_1 = -\frac{1}{4}$ while $s_1 = 1$. There's no reason, a priori, to claim that $\lim t_k = \lim s_k$. In this particular case, it happens that the limits are indeed equal, because after $2^p$ terms the difference between $t_k$ and $s_k$ is less than $\frac{1}{2^{p}}$ -- but this actually requires a little proof.
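
As a sanity check, here is a numerical sketch of my own (not part of the original answer), implementing the shuffle described above in Python; the two partial-sum sequences visibly differ, yet both approach $\ln 2$:

```python
import math
from itertools import accumulate

M = 2 ** 15  # how many positions of the original series to process

def is_pow2(n: int) -> bool:
    return n & (n - 1) == 0

orig = [(-1) ** (m + 1) / m for m in range(1, M + 1)]

shuffled = []
for m in range(1, M + 1):
    if is_pow2(m + 1):
        # m = 2^(k-1) - 1 for some k >= 2: the moved term -1/2^k
        # is inserted just before the term 1/m.
        shuffled.append(-1.0 / (2 * (m + 1)))
    if is_pow2(m) and m >= 4:
        continue  # this term, -1/2^k with k >= 2, was inserted earlier
    shuffled.append(orig[m - 1])

s = list(accumulate(orig))
t = list(accumulate(shuffled))
print(s[1], t[1])                 # 0.5 vs. 0.75: the sequences differ...
print(s[-1], t[-1], math.log(2))  # ...but both approach ln 2
```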

On the other hand, it also guides us to a re-ordering that does not have this property, and for which the limits are not equal.

Observe that \begin{align} &\frac{1}{3} > \frac{1}{4}\\ &\frac{1}{5} + \frac{1}{7} > \frac{1}{4} \\ &\frac{1}{9} + \frac{1}{11} + \frac{1}{13} + \frac{1}{15} > \frac{1}{4} \\ \end{align} and so on: each successive block of odd reciprocals sums to more than $\frac{1}{4}$. So if we reorder the series so that each negative term is preceded by the next such block, giving the partial sums \begin{align} &\frac{1}{3}\\ &\frac{1}{3} + 1 \\ &\frac{1}{3} + 1 - \frac{1}{2} \\ &\frac{1}{3} + 1 - \frac{1}{2} + \frac{1}{5}\\ &\frac{1}{3} + 1 - \frac{1}{2} + \frac{1}{5} + \frac{1}{7}\\ &\frac{1}{3} + 1 - \frac{1}{2} + \frac{1}{5} + \frac{1}{7} - \frac{1}{4} \end{align} and so on, then every negative term $-\frac{1}{2n}$ is more than cancelled by the block summing to more than $\frac{1}{4}$ placed just before it. The partial sums of this rearrangement therefore eventually exceed those of the original by at least $\frac{1}{4}$, so that $\lim t_k \ge \frac{1}{4} + \lim s_k$ (in fact, since each block adds more than $\frac{1}{4}$ while the negative terms shrink to $0$, these partial sums grow without bound).

Note that every term of the original series, up to term $k$, is included in the new series (at least by term $2^k$, say), so the two series use exactly the same terms. But their partial-sum sequences are nonetheless clearly distinct and have distinct limits. Cauchy doesn't help here because, intuitively speaking, the number of terms being moved grows as you go further out in the series, and that swamps the Cauchy bounds you might have hoped would let you prove the limits are the same.
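
And here is a numerical sketch of this second rearrangement (again my own illustration, with block sizes matching the pattern above); its partial sums climb far past $\ln 2$ and keep growing as more blocks are added:

```python
import math
from itertools import accumulate

def rearranged(num_blocks: int) -> list:
    # Terms in the order described above: 1/3, 1, -1/2, then for
    # j = 2, 3, ...: a block of 2^(j-1) odd reciprocals (which sums
    # to more than 1/4), followed by the single term -1/(2j).
    terms = [1 / 3, 1.0, -1 / 2]
    odd = 5  # next unused odd denominator
    for j in range(2, num_blocks + 2):
        for _ in range(2 ** (j - 1)):
            terms.append(1 / odd)
            odd += 2
        terms.append(-1 / (2 * j))
    return terms

t = list(accumulate(rearranged(12)))
print(t[-1], math.log(2))  # about 3.9, far above ln 2 = 0.6931...
```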

Further post-comment addition

The OP criticizes my answer above as "dubious," which I'll take not as an insult but as a critique of the definition of "absolutely convergent" as being too generous --- OP seems unhappy about its allowing you to shift forward terms from farther and farther out.

So here's an alternative, which might make a good exercise for an intro analysis course:

We say a permutation $s$ of the positive integers is modest if there's an integer $N$ such that $| s(i) - i | < N$ for every $i$. (Informally, $s$ moves each integer "only a modest distance.")

We say that a convergent series with term sequence $a$ is modestly absolutely convergent if for every modest permutation, $s$, the series with term sequence $b(i) = a(s(i))$ converges to the same sum as does the original series.

Then there's a small theorem: Every convergent series is modestly absolutely convergent.

I leave the proof to the reader, but here's a hint: try to place a bound on the difference between the $i$th partial sums for the two sequences, and show that this difference goes to $0$ as $i$ grows.
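
To fill in the hint a little (my own elaboration, using the notation above): write $A_i = \sum_{j \le i} a(j)$ and $B_i = \sum_{j \le i} b(j)$ for the two $i$th partial sums. Because $|s(j) - j| < N$, a term $a(m)$ can be counted in one of $A_i, B_i$ but not the other only when $|m - i| < N$, and there are fewer than $2N$ such terms. Hence
$$ |A_i - B_i| \;\le\; 2N \cdot \max_{|m - i| < N} |a(m)| \;\longrightarrow\; 0 \quad \text{as } i \to \infty, $$
since the terms of a convergent series tend to $0$. So $\lim B_i = \lim A_i$.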

This seems to me to capture the thing the OP wanted to say.

The bad news: since "modestly absolutely convergent" turns out to be a synonym for "convergent", we don't really need this definition.

More bad news: even if we did need this definition, showing that a permutation is modest would add considerable complexity to every proof about convergence.

The good news: the OP's intuition, at least when restricted to "modest" permutations, is correct.
