Independence – Are Linear Combinations of Independent Random Variables Again Independent?

Tags: expected value, independence, moment-generating-function, random variable

Let $X_1,X_2,\ldots,X_n$ be iid random variables and define $Y_n:=\sum_{j=1}^n a_jX_j$ with $a_j\in \mathbb{R}$. Can we then say that the $a_jX_j$ are independent as well? And can we then express the MGF in the following way: $$M_{Y_n}(t)=\mathbb{E}\left(e^{t(a_1X_1+\cdots+a_nX_n)}\right)=\mathbb{E}\left(e^{ta_1X_1}\cdots e^{t a_nX_n}\right)=M_{X_1}(a_1t)\cdots M_{X_n}(a_nt)\,?$$

Best Answer

Yes to the question in the body of your post, and no, in general, to the question in the title.

Yes:

Your $a_1,\dots, a_n$ are just constants, so the independence of $X_1,\dots, X_n$ implies that the $a_iX_i$ are independent as well. In fact, for any (measurable) functions $g_1,\dots, g_n$, the random variables $g_1(X_1), g_2(X_2),\dots, g_n(X_n)$ are independent.

From independence it follows that $E\prod_i g_i(X_i) = \prod_i E\, g_i(X_i)$, whenever the expectations exist. Your calculation is correct.
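As a quick numerical sanity check (my own sketch, not part of the answer), the following compares an empirical estimate of $M_{Y_n}(t)$ against the product $\prod_j M_{X_j}(a_j t)$ for iid standard normals, where $M_X(t)=e^{t^2/2}$; the coefficients $a$ and the point $t$ are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 1_000_000             # n variables, N Monte Carlo samples
a = np.array([0.5, -1.0, 2.0])  # arbitrary fixed coefficients
t = 0.3

# X_j iid standard normal; Y = sum_j a_j X_j
X = rng.standard_normal((N, n))
Y = X @ a

# empirical MGF of Y at t
mgf_empirical = np.mean(np.exp(t * Y))

# product of the individual MGFs: M_{X_j}(a_j t) = exp((a_j t)^2 / 2)
mgf_product = np.prod(np.exp((a * t) ** 2 / 2))

print(mgf_empirical, mgf_product)  # the two values should be close
```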

No:

Now look at $Y = \sum_{i=1}^n a_iX_i$ and $Z = \sum_{i=1}^n b_iX_i$. The linear combinations $Y$ and $Z$ of the $X_i$ are in general not independent. They are certainly independent when they involve disjoint sets of the $X_i$, i.e. when $b_i=0$ for all $i$ with $a_i\neq 0$ (equivalently, $a_j=0$ for all $j$ with $b_j\neq 0$). The simplest case where this is not fulfilled is $a_i=b_i$ for all $i$ (not all zero): then $Y=Z$.
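A hedged illustration of the "No" (my own example): take $X_1, X_2$ iid Exponential(1), $Y = X_1+X_2$ and $Z = X_1-X_2$. Then $Y$ and $Z$ are uncorrelated by symmetry, yet dependent, which a simulation can expose by correlating $Y$ with $|Z|$:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000

# X_1, X_2 iid Exponential(1)
X1 = rng.exponential(size=N)
X2 = rng.exponential(size=N)
Y = X1 + X2  # coefficients a = (1, 1)
Z = X1 - X2  # coefficients b = (1, -1), so sum_i a_i b_i = 0

print(np.corrcoef(Y, Z)[0, 1])          # ~0: uncorrelated by construction
print(np.corrcoef(Y, np.abs(Z))[0, 1])  # clearly nonzero: Y and Z are dependent
```

The second correlation is positive because, given $Y=y$, the spread of $Z$ grows with $y$; so zero correlation alone does not give independence outside the normal case discussed next.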

Normally distributed variables $X_i$

When the $X_i$ are iid normally distributed, the condition "$a_i\neq 0 \implies b_i=0$" can be relaxed. In that case the linear combinations $Y$ and $Z$ are independent whenever $\sum_i a_ib_i = 0$, i.e., whenever the vectors $a=(a_1,\dots, a_n)$ and $b=(b_1,\dots,b_n)$ are orthogonal. (The pair $(Y,Z)$ is jointly normal with $\operatorname{Cov}(Y,Z)=\sigma^2\sum_i a_ib_i$, and for jointly normal variables zero covariance implies independence.) This fact contributes to the popularity of the normal distribution in modelling. One of its consequences is that the estimates $\bar x$ for the mean and $s^2$ for the variance of normally distributed r.v.s are independent :-)
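To make that last claim concrete, here is a small simulation sketch (assuming nothing beyond NumPy): draw many samples, compute $\bar x$ and $s^2$ for each, and compare the normal case with a skewed one:

```python
import numpy as np

rng = np.random.default_rng(2)
M, n = 200_000, 10  # M repeated samples, each of size n

for dist in ("normal", "exponential"):
    if dist == "normal":
        samples = rng.standard_normal((M, n))
    else:
        samples = rng.exponential(size=(M, n))
    xbar = samples.mean(axis=1)        # sample means
    s2 = samples.var(axis=1, ddof=1)   # sample variances
    # For normal data both correlations are ~0 (xbar and s2 independent);
    # for exponential data xbar and s2 are visibly dependent.
    print(dist,
          np.corrcoef(xbar, s2)[0, 1],
          np.corrcoef(np.abs(xbar - xbar.mean()), s2)[0, 1])
```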
