[Math] Rules of Double Sums

summation

What are the most important rules of double sums? Below are some rules I have encountered. Are they all correct and complete? Clear intuition, proofs, or other additions are most welcome to remove confusion.

  1. $$\sum_{i=1}^m[x_i] \cdot \sum_{j=1}^n[y_j] = \sum_{i=1}^m\sum_{j=1}^n[x_iy_j]
    \ \text{(general case)}$$
  2. $$\sum_{i=1}^n[x_i] \cdot \sum_{j=1}^n[y_j] = \sum_{i=1}^n\sum_{j=1}^n[x_iy_j] = \sum_{i=1}^n[x_iy_i] + \sum_{i=1}^n\sum_{\substack{j=1 \\ j \neq i}}^n[x_iy_j] \ (m = n)\\ \text{(less general case)}$$
  3. $$\left(\sum_{i=1}^n[x_i]\right)^2 = \sum_{i=1}^n[x_i] \cdot \sum_{j=1}^n[x_j] = \sum_{i=1}^n\sum_{j=1}^n[x_ix_j] = \sum_{i=1}^n[x_i^2] + \sum_{i=1}^n\sum_{\substack{j=1 \\ j \neq i}}^n[x_ix_j] \\ \text{(special case)}$$
    Question related to (3): why does an index $j$ suddenly appear (initially we had only $i$)?
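For what it's worth, rules (1) and (3) are easy to sanity-check numerically. The sketch below uses small random vectors (illustrative values, not from the text) and verifies both identities term by term:

```python
# Numeric sanity check of rules (1) and (3) with small random vectors.
import random

random.seed(0)
m, n = 4, 3
x = [random.random() for _ in range(m)]
y = [random.random() for _ in range(n)]

# Rule (1): product of two sums = rectangular double sum over all pairs.
lhs = sum(x) * sum(y)
rhs = sum(x[i] * y[j] for i in range(m) for j in range(n))
assert abs(lhs - rhs) < 1e-12

# Rule (3): square of a sum = diagonal terms (i = j) + off-diagonal terms.
square = sum(x) ** 2
diag = sum(v * v for v in x)
off = sum(x[i] * x[j] for i in range(m) for j in range(m) if i != j)
assert abs(square - (diag + off)) < 1e-12
```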

Furthermore, in relation to (3) there is a theorem given in my textbook without intuition/proof:

''When we work with double sums, the following theorem is of special interest; it is an immediate consequence of the multinomial expansion of $(x_1 + x_2 + \ldots + x_n)^2$:

$\bf{Theorem}$: $$\sum_{i<j}\sum[x_ix_j] = \frac{1}{2}\left[\left(\sum_{i=1}^n[x_i]\right)^2 - \sum_{i=1}^n[x_i^2]\right], \text{ where } \sum_{i<j}\sum[x_ix_j] = \sum_{i=1}^{n-1}\sum_{j=i+1}^n[x_ix_j]''.$$

What is the special interest/purpose of this theorem (when is it useful?) and what is the relation with (3) above?
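(The theorem itself is easy to check numerically. The sketch below uses an arbitrary illustrative vector `x` and compares the direct pairwise sum against the right-hand side of the identity:)

```python
# Check the textbook theorem: sum_{i<j} x_i x_j
# equals (1/2) * [ (sum_i x_i)^2 - sum_i x_i^2 ].
x = [1.5, -2.0, 3.25, 0.5]  # illustrative values
n = len(x)

# Left-hand side: direct double sum over pairs with i < j.
pair_sum = sum(x[i] * x[j] for i in range(n - 1) for j in range(i + 1, n))

# Right-hand side: the closed-form identity.
identity = 0.5 * (sum(x) ** 2 - sum(v * v for v in x))

assert abs(pair_sum - identity) < 1e-12
```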

Best Answer

The picture for rule 1 looks like this:

$$ \begin{array}{c|ccccc} & x_1 & x_2 & x_3 & x_4 & x_5 \\\hline y_1 & x_1y_1 & x_2y_1 & x_3y_1 & x_4y_1 & x_5y_1 \\ y_2 & x_1y_2 & x_2y_2 & x_3y_2 & x_4y_2 & x_5y_2 \\ y_3 & x_1y_3 & x_2y_3 & x_3y_3 & x_4y_3 & x_5y_3 \end{array} $$

That is, to find $(x_1 + x_2 + x_3 + x_4 + x_5)(y_1 + y_2 + y_3)$ we can look at all possible products $x_iy_j$ and sum them. The products can be arranged in a rectangle as above.
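The rectangle picture can be reproduced directly in code. The values below are placeholders chosen to make the check concrete; the point is only that summing every cell of the table gives the product of the two sums:

```python
# Build the rectangle of products x_i * y_j and confirm that its
# total equals (x_1 + ... + x_5)(y_1 + y_2 + y_3).
x = [1, 2, 3, 4, 5]   # placeholder values for x_1..x_5
y = [10, 20, 30]      # placeholder values for y_1..y_3

table = [[xi * yj for xi in x] for yj in y]   # one row per y_j
total = sum(sum(row) for row in table)

assert total == sum(x) * sum(y)
```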

When the $y$'s equal the $x$'s we get

$$ \begin{array}{c|ccccc} & x_1 & x_2 & x_3 & x_4 & x_5 \\\hline x_1 & x_1^2 & \color{red}{x_2x_1} & \color{red}{x_3x_1} & \color{red}{x_4x_1} & \color{red}{x_5x_1} \\ x_2 & \color{purple}{x_1x_2} & x_2^2 & \color{red}{x_3x_2} & \color{red}{x_4x_2} & \color{red}{x_5x_2} \\ x_3 & \color{purple}{x_1x_3} & \color{purple}{x_2x_3} & x_3^2 & \color{red}{x_4x_3} & \color{red}{x_5x_3} \\ x_4 & \color{purple}{x_1x_4} & \color{purple}{x_2x_4} & \color{purple}{x_3x_4} & x_4^2 & \color{red}{x_5x_4} \\ x_5 & \color{purple}{x_1x_5} & \color{purple}{x_2x_5} & \color{purple}{x_3x_5} & \color{purple}{x_4x_5} & x_5^2 \\ \end{array} $$

So what we see is that

$$ \color{red}{\mathsf{RED}} = \color{purple}{\mathsf{PURPLE}} \text{ and } \color{red}{\mathsf{RED}} + \color{purple}{\mathsf{PURPLE}} + \mathsf{BLACK} = \left( \sum_{i = 1}^m x_i \right)^2. $$

Hence

$$ 2 \color{purple}{\mathsf{PURPLE}} + \mathsf{BLACK} = \left( \sum_{i = 1}^m x_i \right)^2 \tag{1} $$

where

$$ \color{purple}{\mathsf{PURPLE}} = \sum_{i < j} x_ix_j \text{ and } \mathsf{BLACK} = \sum_{i = 1}^m x_i^2. \tag{2}$$

$(1)$ and $(2)$ imply

$$ \sum_{i < j} x_ix_j = \frac12 \left[ \left( \sum_{i = 1}^m x_i \right)^2 - \left( \sum_{i = 1}^m x_i^2 \right) \right]. $$
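The RED = PURPLE step rests on the symmetry $x_ix_j = x_jx_i$, which a short sketch makes explicit (the vector `x` is illustrative):

```python
# The matrix of products x_i * x_j is symmetric, so the strict upper
# triangle (PURPLE, i < j) equals the strict lower triangle (RED, i > j),
# and together with the diagonal (BLACK) they make up the squared sum.
x = [2.0, -1.0, 0.5, 3.0, 1.5]  # illustrative values
m = len(x)

purple = sum(x[i] * x[j] for i in range(m) for j in range(i + 1, m))  # i < j
red    = sum(x[i] * x[j] for i in range(m) for j in range(i))         # i > j
black  = sum(v * v for v in x)                                        # i = j

assert abs(purple - red) < 1e-12
assert abs(2 * purple + black - sum(x) ** 2) < 1e-12
```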

This identity shows up whenever you want to evaluate a double sum over index pairs where one index is $<$ the other, for instance in probability and combinatorics.
