[Math] Confusion about covariance

probability · random-variables · statistics

$\newcommand{\Cov}{\operatorname{Cov}}$

Let $X_i$, for $i=1,\ldots,n$, be a set of random variables.

The following confuses me:

$\Cov\left(\sum_{i=1}^n X_i, \sum_{j=1}^n X_j\right) =$ the sum of all possible covariance pairs (so $n\cdot n$ terms); the source is the book I'm currently reading.

To my mind, the expansion could just as well be

$$\Cov(X_1, X_1)+\Cov(X_2, X_2)+\cdots+\Cov(X_n, X_n)$$

Or

$$\Cov(X_1 + X_2 + \cdots +X_n, X_1 + X_2 + \cdots +X_n)$$

Best Answer

The other comments and answers are absolutely correct. But rather than just stating covariance identities, it may be helpful to expand the expression using the definition of covariance.

$$\begin{align}
\Cov\left[\sum_{i=1}^{n}X_i,\sum_{j=1}^{n}X_j\right]
&=\operatorname{E}\left[\left(\sum_{i=1}^{n}X_i-\operatorname{E}\left[\sum_{i=1}^{n}X_i\right]\right)\left(\sum_{j=1}^{n}X_j-\operatorname{E}\left[\sum_{j=1}^{n}X_j\right]\right)\right]\\
&= \operatorname{E}\left[\left(\sum_{i=1}^{n}\left(X_i-\operatorname{E}\left[X_i\right]\right)\right)\left(\sum_{j=1}^{n}\left(X_j-\operatorname{E}\left[X_j\right]\right)\right)\right]\\
&= \operatorname{E}\left[\sum_{i=1}^{n}\left(\left(X_i-\operatorname{E}\left[X_i\right]\right)\sum_{j=1}^{n}\left(X_j-\operatorname{E}\left[X_j\right]\right)\right)\right]\\
&= \operatorname{E}\left[\sum_{i=1}^{n}\sum_{j=1}^{n}\left(X_i-\operatorname{E}\left[X_i\right]\right)\left(X_j-\operatorname{E}\left[X_j\right]\right)\right]\\
&= \sum_{i=1}^{n}\sum_{j=1}^{n}\operatorname{E}\left[\left(X_i-\operatorname{E}\left[X_i\right]\right)\left(X_j-\operatorname{E}\left[X_j\right]\right)\right]\\
&= \sum_{i=1}^{n}\sum_{j=1}^{n}\Cov\left[X_i,X_j\right]
\end{align}$$
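The identity is also easy to check numerically. The sketch below (my own illustrative setup, using an arbitrary $4\times 4$ covariance matrix) draws samples of $n=4$ correlated variables with NumPy and compares the sample variance of the sum against the sum of all $n\cdot n$ entries of the sample covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# Arbitrary symmetric positive-definite covariance matrix for n = 4 variables.
true_cov = np.array([[2.0, 0.5, 0.3, 0.1],
                     [0.5, 1.0, 0.2, 0.0],
                     [0.3, 0.2, 1.5, 0.4],
                     [0.1, 0.0, 0.4, 1.0]])

# Draw many samples; transpose to shape (n, num_samples) for np.cov.
samples = rng.multivariate_normal(mean=np.zeros(4), cov=true_cov,
                                  size=100_000).T

# Left-hand side: Cov(sum_i X_i, sum_j X_j), i.e. the variance of the sum.
lhs = np.cov(samples.sum(axis=0))

# Right-hand side: the sum of all n*n pairwise sample covariances.
rhs = np.cov(samples).sum()

# Equal up to floating-point rounding, by bilinearity of the sample covariance.
print(lhs, rhs)
```

Because the sample covariance is bilinear in the same way as the population covariance, `lhs` and `rhs` agree to floating-point precision, not just approximately for large samples.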
