Probability Theory – Product of Independent Random Variables

independence, probability theory, random variables

The following is a classic example showing that pairwise independence does not necessarily imply mutual independence:

Let $X_1$ and $X_2$ be independent random variables with distribution $$P(X_i=1)=P(X_i=-1)=\frac{1}{2}\quad\tag{*}$$ for $i=1,2.$ Let $Z=X_1X_2$. Then $X_1,X_2,Z$ are pairwise independent, but they are not mutually independent.
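Since the joint distribution of $(X_1,X_2,Z)$ lives on only four equally likely outcomes, the claim can be verified by exhaustive enumeration. A sketch in Python (the helper `prob` is just a convenience for counting outcomes, not part of the original argument):

```python
from itertools import product

# The four equally likely outcomes of (X1, X2, Z) with Z = X1 * X2.
outcomes = [(x1, x2, x1 * x2) for x1, x2 in product([-1, 1], repeat=2)]

def prob(pred):
    """Probability of an event under the uniform distribution on outcomes."""
    return sum(1 for o in outcomes if pred(o)) / len(outcomes)

# Pairwise independence: P(A=a, B=b) = P(A=a) P(B=b) for each of the three pairs.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product([-1, 1], repeat=2):
        lhs = prob(lambda o: o[i] == a and o[j] == b)
        rhs = prob(lambda o: o[i] == a) * prob(lambda o: o[j] == b)
        assert lhs == rhs

# Not mutually independent: P(X1=1, X2=1, Z=1) = 1/4, but the
# product of the three marginals is 1/8.
joint = prob(lambda o: o == (1, 1, 1))
marginals = (prob(lambda o: o[0] == 1)
             * prob(lambda o: o[1] == 1)
             * prob(lambda o: o[2] == 1))
assert joint == 0.25 and marginals == 0.125
```

The failure of mutual independence is visible already at one triple: knowing $X_1=X_2=1$ forces $Z=1$.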

In this case $X_1$ and $Z=X_1X_2$ are independent. Now let $(X_i)_{i=1}^{\infty}$ be a sequence of independent random variables satisfying $(*)$, and let $Z_i=X_1X_2\cdots X_i$. How can I show that $(Z_i)_{i=1}^{m}$ are independent for every finite $m\in{\Bbb N}$?


The case $m=2$ is done above. If one argues by induction, a key step is to show that the $(n-1)$-dimensional random vector $(Z_1,Z_2,\dots,Z_{n-1})$ and $Z_n$ are independent. This is where I have no idea how to go on after writing down the definition.

Best Answer

Let $\varepsilon_j\in\{-1,1\}$ for $j=1,\dots,n$. Conditioning on the value of $X_n$, $$P\left(\bigcap_{j=1}^n\{Z_j=\varepsilon_j\}\right)=P\left(\bigcap_{j=1}^n\{Z_j=\varepsilon_j\}\,\middle|\, X_n=-1\right)P(X_n=-1)+P\left(\bigcap_{j=1}^n\{Z_j=\varepsilon_j\}\,\middle|\, X_n=1\right)P(X_n=1).$$ Since $Z_n=Z_{n-1}X_n$, on $\{X_n=-1\}$ the event $\{Z_n=\varepsilon_n\}$ equals $\{Z_{n-1}=-\varepsilon_n\}$, and on $\{X_n=1\}$ it equals $\{Z_{n-1}=\varepsilon_n\}$. Moreover $Z_1,\dots,Z_{n-1}$ depend only on $X_1,\dots,X_{n-1}$, hence are independent of $X_n$, so conditioning on $X_n$ does not change their joint distribution. Therefore $$P\left(\bigcap_{j=1}^n\{Z_j=\varepsilon_j\}\right)=\frac 12P\left(\bigcap_{j=1}^{n-1}\{Z_j=\varepsilon_j\}\cap\{Z_{n-1}=-\varepsilon_n\}\right)+\frac 12P\left(\bigcap_{j=1}^{n-1}\{Z_j=\varepsilon_j\}\cap\{Z_{n-1}=\varepsilon_n\}\right).$$ We treat the case $\varepsilon_{n-1}=\varepsilon_n$; the other case is symmetric. Then the first intersection contains $\{Z_{n-1}=\varepsilon_{n-1}\}\cap\{Z_{n-1}=-\varepsilon_{n-1}\}=\emptyset$, while the second reduces to $\bigcap_{j=1}^{n-1}\{Z_j=\varepsilon_j\}$, so by the induction hypothesis $$P\left(\bigcap_{j=1}^n\{Z_j=\varepsilon_j\}\right)=\frac 12P\left(\bigcap_{j=1}^{n-1}\{Z_j=\varepsilon_j\}\right)=\frac 12\prod_{j=1}^{n-1}P\left(Z_j=\varepsilon_j\right)=\prod_{j=1}^{n}P\left(Z_j=\varepsilon_j\right),$$ where the last equality uses $P(Z_n=\varepsilon_n)=\frac 12$. This is what we wanted.
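The conclusion — that the joint probability $P\bigl(\bigcap_{j=1}^m\{Z_j=\varepsilon_j\}\bigr)$ factors as $\prod_{j=1}^m P(Z_j=\varepsilon_j)=2^{-m}$ for every sign pattern — can be checked directly for small $m$ by enumerating all $2^m$ equally likely values of $(X_1,\dots,X_m)$. A sketch in Python (the function name `check_independence` is mine):

```python
from itertools import product

def check_independence(m):
    """Verify mutual independence of (Z_1, ..., Z_m) with Z_i = X_1 * ... * X_i,
    where X_1, ..., X_m are i.i.d. uniform on {-1, 1}, by full enumeration."""
    rows = []
    for xs in product([-1, 1], repeat=m):
        z, row = 1, []
        for x in xs:
            z *= x           # running product Z_i = X_1 * ... * X_i
            row.append(z)
        rows.append(tuple(row))
    n = len(rows)            # 2**m equally likely outcomes

    # For every sign pattern (eps_1, ..., eps_m), the joint probability
    # must equal the product of the marginal probabilities.
    for eps in product([-1, 1], repeat=m):
        joint = sum(1 for row in rows if row == eps) / n
        marg = 1.0
        for j in range(m):
            marg *= sum(1 for row in rows if row[j] == eps[j]) / n
        if joint != marg:
            return False
    return True

assert all(check_independence(m) for m in range(1, 6))
```

The check also makes the mechanism visible: the map $(X_1,\dots,X_m)\mapsto(Z_1,\dots,Z_m)$ is a bijection on $\{-1,1\}^m$ (one can recover $X_j=Z_j/Z_{j-1}$), so every sign pattern of the $Z_j$ occurs with probability exactly $2^{-m}$.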
