Independent Random Variables – Are Products of Independent Random Variables Also Independent?

Tags: independence, probability, proof, random variable

Let $Z_0, Z_1, Z_2,…$ be independent and identically distributed such that

$P(Z_n = 1) = P(Z_n = -1) = 1/2$ for $n = 0, 1, 2, …$

Let $X_0 = Z_0$, $X_1 = X_0 Z_1$, $X_2 = X_1 Z_2$, …

Are $X_0, X_1, X_2, …$ independent?


What I tried:

We must prove that for Borel sets $B_0, B_1, …, B_n$,

$$P(X_0 \in B_0, X_1 \in B_1, …, X_n \in B_n) = \prod_{i=0}^{n} P(X_i \in B_i) \ (*) $$

since $X_0, X_1, X_2, …$ are independent if and only if $X_0, X_1, …, X_n$ are independent for every $n \in \mathbb{N}$.

  1. $\{ X_n \}_{n=0}^{\infty}$ is Markov, i.e.

$$P[X_n \in B \mid X_m] = P[X_n \in B \mid \mathscr{F}_m]$$

for all integers $0 \le m \le n$ and all $B \in \mathscr B$

[see proof in answer below]

This implies that the LHS of (*) equals:

$$P(X_0 \in B_0) P(X_1 \in B_1 | X_0 \in B_0) … P(X_n \in B_n | X_{n-1} \in B_{n-1})$$
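In more detail, the chain rule gives

$$P\Big(\bigcap_{i=0}^{n} \{X_i \in B_i\}\Big) = P(X_0 \in B_0)\prod_{i=1}^{n} P\big(X_i \in B_i \,\big|\, X_0 \in B_0, \dots, X_{i-1} \in B_{i-1}\big),$$

and the Markov property is then invoked to drop all but the most recent conditioning event in each factor.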

  2. $P(X_{n+1} = 1) = P(X_{n+1} = -1) = 1/2$ can be proven by induction, using the recurrence relations:

$$P(X_{n+1} = 1) = P(X_n = 1)P(Z_{n+1} = 1) + P(X_n = -1)P(Z_{n+1} = -1)$$

$$P(X_{n+1} = -1) = P(X_n = 1)P(Z_{n+1} = -1) + P(X_n = -1)P(Z_{n+1} = 1)$$

I made use of the fact that $X_n$ and $Z_{n+1}$ are independent, which follows because $X_n = Z_0 Z_1 \cdots Z_n$ is a function of $Z_0, \dots, Z_n$, and $Z_0, \dots, Z_n, Z_{n+1}$ are independent.
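With the induction hypothesis $P(X_n = 1) = P(X_n = -1) = 1/2$, the first recurrence evaluates to

$$P(X_{n+1} = 1) = \tfrac{1}{2}\cdot\tfrac{1}{2} + \tfrac{1}{2}\cdot\tfrac{1}{2} = \tfrac{1}{2},$$

and the second gives $P(X_{n+1} = -1) = 1/2$ in the same way, completing the induction step.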

This makes the RHS of (*) equal to $(1/2)^{n+1}$ when each $B_i$ is a singleton $\{a_i\}$ with $a_i \in \{-1, +1\}$.

  3. $P(X_i \in B_i \mid X_{i-1} \in B_{i-1}) = 1/2$ when each $B_i$ is the singleton $\{a_i\}$, because for $a_{n+1} \in \{-1, +1\}$,

$$P(X_{n+1} = a_{n+1} \mid X_n) = E[1_{X_{n+1} = a_{n+1}} \mid X_n] = E[1_{X_{n+1} = a_{n+1}} \mid X_n = 1]P(X_n = 1) + E[1_{X_{n+1} = a_{n+1}} \mid X_n = -1]P(X_n = -1)$$

This makes the LHS of (*) equal to $(1/2)^{n+1}$ as well. QED

Any mistakes or missing steps?

Best Answer

You are making this problem a lot harder than it needs to be because the random variables in question are two-valued, and the problem can be treated as one of independence of events rather than independence of random variables. In what follows, I will treat the independence of events even though the events will be stated in terms of random variables.

Let $Z_0,Z_1,Z_2,\cdots$ be independent random variables $\ldots$

I will take this as the assertion that the countably infinite collection of events $A_i = \{Z_i = +1\}$ is a collection of independent events. A countable collection of events is said to be a collection of independent events if each finite subset (of cardinality $2$ or more) is a collection of independent events. Recall that $n\geq 2$ events $B_0, B_1, \cdots, B_{n-1}$ are said to be independent events if $$P(B_0\cap B_1\cap \cdots \cap B_{n-1}) = P(B_0)P(B_1) \cdots P(B_{n-1})$$ and every subset of two or more of these events is itself a collection of independent events.

Alternatively, $B_0, B_1, \cdots, B_{n-1}$ are said to be independent events if the following $2^n$ equations hold: $$P(B_0^*\cap B_1^*\cap \cdots \cap B_{n-1}^*) = P(B_0^*)P(B_1^*)\cdots P(B_{n-1}^*)\tag{1}$$ Here $B_i^*$ stands for either $B_i$ or $B_i^c$ (the same choice on both sides of $(1)$), and the $2^n$ possible choices give the $2^n$ equations.
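For example, with $n = 2$ the $2^2 = 4$ equations in $(1)$ read

$$P(B_0 \cap B_1) = P(B_0)P(B_1), \qquad P(B_0 \cap B_1^c) = P(B_0)P(B_1^c),$$

$$P(B_0^c \cap B_1) = P(B_0^c)P(B_1), \qquad P(B_0^c \cap B_1^c) = P(B_0^c)P(B_1^c).$$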

For our application, $A_i = \{Z_i = +1\}$ and $A_i^c = \{Z_i=-1\}$, so checking whether the $2^n$ equations $$P(A_0^*\cap A_1^*\cap \cdots \cap A_{n-1}^*) = P(A_0^*)P(A_1^*)\cdots P(A_{n-1}^*)\tag{2}$$ hold is equivalent to checking that the joint probability mass function (pmf) of $Z_0, Z_1, \cdots, Z_{n-1}$ factors into the product of the $n$ marginal pmfs at each of the $2^n$ points $(\pm 1, \pm 1, \cdots, \pm 1)$, which is exactly what you would check if you had never heard of independent events and knew only about independent random variables.

Thus, the statement

Let $Z_0,Z_1,Z_2,\cdots$ be independent random variables $\ldots$

does mean, among other things, that for each $n \geq 2$, $Z_0,Z_1,Z_2,\cdots, Z_{n-1}$ is a finite collection of independent random variables. But does the assertion

For all $n \geq 2$, $\{Z_0,Z_1,Z_2,\cdots, Z_{n-1}\}$ is a set of $n$ independent random variables

imply that the countably infinite set $\{Z_0,Z_1,Z_2,\cdots \}$ is a collection of independent random variables?

The answer is yes. By hypothesis, the initial segments $\{Z_0, Z_1, \cdots, Z_{n-1}\}$ are collections of independent random variables, and any other finite subset, say $\{Z_2, Z_5, Z_{313}\}$, is contained in some initial segment, here $\{Z_0, Z_1, \cdots, Z_{313}\}$, whose members are independent by hypothesis; hence the subset is also a set of independent random variables.

In your question, with each $a_i \in \{+1, -1\}$, define $b_0 = a_0$ and $b_i = a_{i-1}a_i$ for $i \geq 1$, each of which is also in $\{+1,-1\}$. Since $X_0 = Z_0$ and $Z_i = X_{i-1}X_i$ for $i \geq 1$ (because $X_{i-1}^2 = 1$), \begin{align} P(X_0 = a_0, X_1 = a_1, \cdots, X_n = a_n) &= P(Z_0 = a_0, Z_1 = a_0a_1, Z_2 = a_1a_2, \cdots, Z_n = a_{n-1}a_n)\\ &= P(Z_0=b_0, Z_1 = b_1, \cdots, Z_n = b_n)\\ &= \prod_{i=0}^n P(Z_i = b_i)\\ &= 2^{-(n+1)}\\ &= \prod_{i=0}^n P(X_i = a_i), \end{align} where the last equality uses $P(X_i = a_i) = 1/2$, shown in your step 2. That is, all $2^{n+1}$ equations of the form $(2)$ hold. Thus, for each $n \geq 1$, $X_0, X_1, \cdots, X_n$ are independent random variables, and therefore the countably infinite collection $\{X_0, X_1, \cdots\}$ of random variables is a collection of independent random variables.
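As a sanity check (not a substitute for the proof), the $2^{n+1}$ equations can be verified exactly by enumeration for a small $n$. The following is a minimal sketch in Python; the choice $n = 3$ is arbitrary, and the variable names are mine rather than anything from the discussion above.

```python
from itertools import product
from collections import Counter
import math

n = 3  # check X_0, ..., X_3; an arbitrary small choice

# Build the exact joint pmf of (X_0, ..., X_n) by pushing forward the
# uniform pmf on the 2^(n+1) sign patterns of (Z_0, ..., Z_n).
joint = Counter()
for z in product([-1, 1], repeat=n + 1):
    x, p = [], 1
    for zi in z:          # X_k = Z_0 * Z_1 * ... * Z_k
        p *= zi
        x.append(p)
    joint[tuple(x)] += 2.0 ** -(n + 1)  # each z pattern has probability 2^-(n+1)

# Exact marginal pmf of each X_i.
marg = [Counter() for _ in range(n + 1)]
for x, p in joint.items():
    for i, xi in enumerate(x):
        marg[i][xi] += p

# Check all 2^(n+1) factorization equations of the form (2).
for a in product([-1, 1], repeat=n + 1):
    lhs = joint[a]
    rhs = math.prod(marg[i][a[i]] for i in range(n + 1))
    assert math.isclose(lhs, rhs), (a, lhs, rhs)
print("all", 2 ** (n + 1), "equations hold")
```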


After reading over my revised answer, perhaps it is I who am making the problem much harder than necessary. My apologies.
