The variance of a sum of random variables is the sum of the elements of the covariance matrix; does this extend to higher-order central moments?

covariance, probability theory, random variables, reference-request

In general, the variance of a sum of $n$ not-necessarily-independent random variables is the sum of all entries of their covariance matrix; that is, writing $Y = \sum_{i=1}^n X_i$,

$\operatorname{Var}\left(Y\right) = \mathbb{E}[(Y - \mathbb{E}[Y])^2] =\sum_{i=1}^n \sum_{j=1}^n \operatorname{Cov}\left(X_i, X_j\right) = \sum_{i=1}^n \sum_{j=1}^n \mathbb{E}[(X_i-\mathbb{E}[X_i])(X_j-\mathbb{E}[X_j])]$

Does this extend to higher-order central moments for sums of not-necessarily-independent random variables? For example, is the following also true (and so on for higher orders)?

$\mathbb{E}[(Y - \mathbb{E}[Y])^3] \overset{?}{=} \sum_{i=1}^n \sum_{j=1}^n \sum_{k=1}^n \mathbb{E}[(X_i-\mathbb{E}[X_i])(X_j-\mathbb{E}[X_j])(X_k-\mathbb{E}[X_k])]$

Pretty much every Google result I can find regarding sums of random variables assumes mutual independence. I've found it difficult to find anything for the case where this assumption doesn't necessarily hold.

From some numerical trials that I've run it appears to be true, but I have struggled to prove it. A reference for this result (if true) would also be appreciated.
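A numerical trial along these lines is easy to run. The sketch below (with arbitrarily chosen dependent variables) compares the empirical third central moment of the sum against the triple sum of cross central moments; since the underlying expansion is an algebraic identity, the two agree even on the empirical distribution, up to floating-point rounding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three deliberately dependent variables: X2 and X3 are built from X1,
# so the mixed terms E[(Xi-E[Xi])(Xj-E[Xj])(Xk-E[Xk])] are nonzero.
n_samples = 200_000
x1 = rng.exponential(size=n_samples)
x2 = x1 ** 2 + rng.normal(size=n_samples)
x3 = -x1 + rng.uniform(size=n_samples)
X = np.stack([x1, x2, x3])              # shape (3, n_samples)

Y = X.sum(axis=0)
lhs = np.mean((Y - Y.mean()) ** 3)      # third central moment of the sum

C = X - X.mean(axis=1, keepdims=True)   # centred variables
# Triple sum of E[(Xi-E[Xi])(Xj-E[Xj])(Xk-E[Xk])] over all (i, j, k)
rhs = sum(np.mean(C[i] * C[j] * C[k])
          for i in range(3) for j in range(3) for k in range(3))

print(lhs, rhs)  # identical up to floating-point rounding
```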

Best Answer

For reals $z_1,\dots,z_n$ we have $\left(\sum_{i=1}^nz_i\right)^3=\sum_{i=1}^n\sum_{j=1}^n\sum_{k=1}^nz_iz_jz_k$. Apply this with $z_i = X_i - \mathbb{E}[X_i]$ and use the linearity of the expected value.
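Spelled out: since expectation is linear, $Y - \mathbb{E}[Y] = \sum_{i=1}^n \left(X_i - \mathbb{E}[X_i]\right)$, and therefore

```latex
\mathbb{E}\bigl[(Y-\mathbb{E}[Y])^3\bigr]
  = \mathbb{E}\Bigl[\Bigl(\sum_{i=1}^n \bigl(X_i-\mathbb{E}[X_i]\bigr)\Bigr)^{3}\Bigr]
  = \sum_{i=1}^n\sum_{j=1}^n\sum_{k=1}^n
      \mathbb{E}\bigl[(X_i-\mathbb{E}[X_i])(X_j-\mathbb{E}[X_j])(X_k-\mathbb{E}[X_k])\bigr],
```

where the last step expands the cube and moves the expectation inside the triple sum. The same argument works for any order $p$: expand $\left(\sum_i z_i\right)^p$ into a $p$-fold sum and apply linearity term by term.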