If a series is absolutely convergent, then $\sum \limits_{n\in I}a_n=\sum \limits_{k=1}^{\infty}\sum \limits_{n\in I_k}a_n.$

real-analysis, sequences-and-series

Suppose that the series $\sum \limits_{n=1}^{\infty}a_n$ is absolutely convergent and let $I\subseteq \mathbb{N}$ be such that $I=\bigsqcup\limits_{k=1}^{\infty}I_k$, a disjoint union of sets $I_k$. Show that $$\sum \limits_{n\in I}a_n=\sum \limits_{k=1}^{\infty}\sum \limits_{n\in I_k}a_n. \qquad (*)$$

I don't have any idea how to solve it.

I do know that in an absolutely convergent series a permutation of the terms does not change the sum, and I guess this should be used somehow to prove equality $(*)$.

Can anyone show a rigorous proof of equality $(*)$, please?
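
(For what it's worth, a quick numerical check with a hypothetical example of my own does suggest that $(*)$ holds; this is of course not a proof. The example takes $a_n=(-1)^{n+1}/n^2$, truncates $I$ to $\{1,\dots,N\}$, and uses the partition by 2-adic valuation.)

```python
# Numerical sanity check of (*), not a proof.  Hypothetical example:
# a_n = (-1)^(n+1) / n^2 (absolutely convergent), I truncated to {1, ..., N},
# and I_k = {n : n = 2^(k-1) * (odd number)}, i.e. the blocks of the partition
# of the positive integers by 2-adic valuation.
N = 10_000
a = lambda n: (-1) ** (n + 1) / n ** 2

def block_index(n):
    """Return k such that n lies in I_k, i.e. 1 + (exponent of 2 in n)."""
    k = 1
    while n % 2 == 0:
        n //= 2
        k += 1
    return k

lhs = sum(a(n) for n in range(1, N + 1))                      # sum over I

blocks = {}
for n in range(1, N + 1):
    blocks.setdefault(block_index(n), []).append(n)           # split I into the I_k
rhs = sum(sum(a(n) for n in I_k) for I_k in blocks.values())  # sum of the block sums

print(lhs, rhs)   # the two values agree up to floating-point rounding
```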

Best Answer

First assume that $a_n \ge 0$ and define $\sum_{n \in I} a_n = \sup_{J \subset I, J \text{ finite}} \sum_{n \in J} a_n$. Note that it follows that if $I \subset I'$ then $\sum_{n \in I} a_n \le \sum_{n \in I'} a_n$.
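
(For intuition, here is a minimal numerical sketch of this supremum definition, under an assumed toy example $a_n = 1/n^2$ with $I$ the even indices, whose exact sum is $\pi^2/24$; it is illustration only, not part of the proof.)

```python
# A minimal sketch of the supremum definition (toy example, not part of the
# proof): a_n = 1/n^2 >= 0 and I = even indices, whose exact sum is pi^2/24.
# For nonnegative a_n the sums over finite J increase as J grows, so sums over
# an exhausting sequence of finite subsets approach sup_J sum_{n in J} a_n.
import math

a = lambda n: 1 / n ** 2
approx = 0.0
for n in range(2, 10**6, 2):      # finite subsets J = {2, 4, ..., n} exhaust I
    approx += a(n)

print(approx, math.pi ** 2 / 24)  # approx is close to (and below) the supremum
```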

From https://math.stackexchange.com/a/3680889/27978 we see that if $K = K_1 \cup \cdots \cup K_m$, a disjoint union, then $\sum_{n \in K} a_n = \sum_{n \in K_1} a_n + \cdots + \sum_{n \in K_m} a_n$.

Since $I'=I_1 \cup \cdots \cup I_m \subset I$ we see that $\sum_{n \in I} a_n \ge \sum_{n \in I'} a_n = \sum_{k=1}^m \sum_{n \in I_k} a_n$. It follows that $\sum_{n \in I} a_n \ge \sum_{k=1}^\infty \sum_{n \in I_k} a_n$. This is the 'easy' direction.

Let $\epsilon>0$; then there is some finite $J \subset I$ such that $\sum_{n\in J} a_n > \sum_{n \in I} a_n -\epsilon$. Since $J$ is finite and $J \subset I = \bigcup_k I_k$, we have $J \subset I'=I_1 \cup \cdots \cup I_m$ for some $m$ (take $m$ to be the largest index $k$ with $J \cap I_k \neq \emptyset$), and so, using that the $I_k$ are pairwise disjoint, $\sum_{k=1}^\infty \sum_{n \in I_k} a_n \ge \sum_{k=1}^m\sum_{n \in I_k} a_n \ge \sum_{k=1}^m\sum_{n \in J \cap I_k} a_n = \sum_{n\in J} a_n > \sum_{n \in I} a_n -\epsilon$. Since $\epsilon>0$ was arbitrary, $\sum_{k=1}^\infty \sum_{n \in I_k} a_n \ge \sum_{n \in I} a_n$, which together with the 'easy' direction proves $(*)$ for nonnegative $a_n$.
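
(A small numerical illustration of this step, again with an assumed toy example: $a_n = 1/n^2$, $I$ = all positive integers, $I_k$ = indices of 2-adic valuation $k-1$. For a given $\epsilon$ one finds a finite $J$ whose sum is within $\epsilon$ of $\sum_{n\in I}a_n$, and since $J$ is finite it meets only finitely many blocks $I_k$.)

```python
# Numerical illustration of the epsilon step (toy example only): a_n = 1/n^2,
# I = all positive integers (total pi^2/6), I_k = indices of 2-adic valuation k-1.
import math

a = lambda n: 1 / n ** 2
total = math.pi ** 2 / 6
epsilon = 1e-3

# Enlarge the finite set J = {1, ..., j} until sum_{n in J} a_n > total - epsilon.
J, partial, n = [], 0.0, 1
while partial <= total - epsilon:
    J.append(n)
    partial += a(n)
    n += 1

def block_index(n):
    """Return k with n in I_k, i.e. 1 + (exponent of 2 in n)."""
    k = 1
    while n % 2 == 0:
        n //= 2
        k += 1
    return k

m = max(block_index(n) for n in J)     # J is contained in I_1 ∪ ... ∪ I_m
print(len(J), m, partial, total)
```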

(It is not relevant here, but a small proof tweak shows that the result holds true even if the $a_n$ do not have a finite sum.)

Now suppose we have $a_n \in \mathbb{R}$ and $\sum_{n=1}^\infty |a_n|$ is finite, so that $\sum_{n \in I} |a_n| \le \sum_{n=1}^\infty |a_n| < \infty$. We need to define what we mean by $\sum_{n \in I} a_n$. Note that $(a_n)_+=\max(0,a_n) \ge 0$ and $(a_n)_-=\max(0,-a_n) \ge 0$. Since $0 \le (a_n)_+ \le |a_n|$ and $0 \le (a_n)_- \le |a_n|$, the nonnegative case above applies and we see that $\sum_{n \in I} (a_n)_+ = \sum_{k=1}^\infty \sum_{n \in I_k} (a_n)_+$, and similarly for $(a_n)_-$, with both sums finite.

This suggests the definition (cf. Lebesgue integral) $\sum_{n \in I} a_n = \sum_{n \in I} (a_n)_+ - \sum_{n \in I} (a_n)_-$.
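
(As a quick sanity check of this decomposition, here is a numerical sketch with the assumed example $a_n = (-1)^{n+1}/n^2$ truncated at $N$ terms; it only illustrates that the signed sum equals the difference of the positive-part and negative-part sums.)

```python
# Sketch of the positive/negative-part decomposition (toy example only):
# a_n = (-1)^(n+1) / n^2 truncated at N terms, (a_n)_+ = max(0, a_n),
# (a_n)_- = max(0, -a_n), and sum a_n = sum (a_n)_+ - sum (a_n)_-.
N = 10_000
a = [(-1) ** (n + 1) / n ** 2 for n in range(1, N + 1)]
plus  = [max(0.0, x)  for x in a]    # (a_n)_+
minus = [max(0.0, -x) for x in a]    # (a_n)_-

print(sum(a), sum(plus) - sum(minus))   # agree up to floating-point rounding
```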

With this definition, all that remains to be proved is that $\sum_{k=1}^\infty \sum_{n \in I_k} a_n = \sum_{k=1}^\infty \sum_{n \in I_k} (a_n)_+ - \sum_{k=1}^\infty \sum_{n \in I_k} (a_n)_-$ and this follows from summability and the fact that for each $k$ we have $\sum_{n \in I_k} a_n = \sum_{n \in I_k} (a_n)_+ - \sum_{n \in I_k} (a_n)_-$.

Note: To elaborate the last sentence, recall that I defined $\sum_{n \in I_k} a_n$ to be $\sum_{n \in I_k} (a_n)_+ - \sum_{n \in I_k} (a_n)_-$, so all that is happening here is that the definition is applied to $I_k$ rather than to $I$. Then to finish, note that if $b_k,c_k$ are summable and $d_k=b_k-c_k$, then $d_k$ is summable and $\sum_{k=1}^\infty d_k= \sum_{k=1}^\infty b_k- \sum_{k=1}^\infty c_k$; here $d_k = \sum_{n \in I_k} a_n$, $b_k = \sum_{n \in I_k} (a_n)_+$ and $c_k = \sum_{n \in I_k} (a_n)_-$.
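
(The last fact is elementary; a toy numerical check, with assumed sequences $b_k=2^{-k}$, $c_k=3^{-k}$ and $d_k=b_k-c_k$ that are unrelated to the $a_n$ above, is below.)

```python
# Toy check of the last fact (assumed sequences, not tied to the a_n above):
# if b_k, c_k are summable and d_k = b_k - c_k, then sum d_k = sum b_k - sum c_k.
b = [1 / 2 ** k for k in range(1, 60)]
c = [1 / 3 ** k for k in range(1, 60)]
d = [bk - ck for bk, ck in zip(b, c)]

print(sum(d), sum(b) - sum(c))   # agree up to floating-point rounding
```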