Show that random vectors consisting of independent random variables are independent

independence, probability, random variables

Let $X_1,\cdots,X_n$ be independent random variables ($n\geq2$). For some $1\leq k< n$, show that
$$(X_1,\cdots,X_k),\ (X_{k+1},\cdots,X_n)$$ are independent random vectors, i.e. for all measurable sets $E_1\subset\mathbb R^k,\ E_2\subset\mathbb R^{n-k}$,
$$P[(X_1,\cdots,X_k)\in E_1,\ (X_{k+1},\cdots,X_n)\in E_2]\\=P[(X_1,\cdots,X_k)\in E_1]\cdot P[(X_{k+1},\cdots,X_n)\in E_2]$$
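(As a sanity check, the identity is easy to probe numerically. Here is a minimal Monte Carlo sketch with $n=3$, $k=2$, i.i.d. standard normals, and a deliberately non-rectangular $E_1$; agreement up to sampling error is of course not a proof.)

```python
# Monte Carlo sanity check of the claimed identity (not a proof):
# n = 3, k = 2, independent standard normals, a non-rectangular
# E1 (the unit disc in R^2) and an interval E2 in R.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((1_000_000, 3))  # rows are iid samples of (X1, X2, X3)

in_E1 = X[:, 0]**2 + X[:, 1]**2 < 1.0    # (X1, X2) in the unit disc
in_E2 = X[:, 2] > 0.5                    # X3 in (0.5, inf)

lhs = np.mean(in_E1 & in_E2)             # P[(X1,X2) in E1, X3 in E2]
rhs = np.mean(in_E1) * np.mean(in_E2)    # P[(X1,X2) in E1] * P[X3 in E2]
print(lhs, rhs)                          # agree up to Monte Carlo error
```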

My attempt is to first prove the identity for the case where $E_1$ and $E_2$ are cubes, and then to generalize this to arbitrary open sets. But there is a gap I can't fix: a set of Lebesgue measure zero is not necessarily a null set under the measure induced by the random vectors (for instance, if some $X_i$ is a.s. constant, a single point has Lebesgue measure zero but induced measure one).

Questions:

  1. Is this the right start? How do I continue?

  2. Is there a simple proof?

  3. Do I really need the joint independence of all the $X_j$? If I only require that $X_i$ and $X_j$ be independent for each pair with $i\leq k$ and $j\geq k+1$, can I get the same result?

Best Answer

$X_1,\dots,X_n$ being independent is, by definition, equivalent to $\sigma(X_1),\dots,\sigma(X_n)$ being independent, which is further defined to mean that $$ P(A_1\cap\cdots\cap A_n)=P(A_1)\cdots P(A_n) \quad\text{ for all }A_i\in \sigma(X_i). $$ Now, since $\sigma(X_i)$ consists precisely of sets of the form $(X_i\in B_i)$ where $B_i\in{\cal B}(\mathbb{R})$, the definition of independence can be expressed equivalently as $$ (1) \qquad P(X_1\in B_1,\dots ,X_n\in B_n)=P(X_1\in B_1)\dots P(X_n\in B_n) \quad\text{ for all }B_i\in {\cal B}(\mathbb{R}). $$ Next, in (1), take $B_1=\cdots=B_k=\mathbb{R}$. Then (1) becomes $$ P(X_1\in \mathbb{R},\dots,X_k\in\mathbb{R},X_{k+1}\in B_{k+1},\dots,X_n\in B_n) \\ =P(X_1\in \mathbb{R})\cdots P(X_k\in\mathbb{R})P(X_{k+1}\in B_{k+1})\cdots P(X_n\in B_n) $$ which, since $P(X_i\in\mathbb{R})=1$, simplifies to

$$ (2)\qquad P(X_{k+1}\in B_{k+1},\dots,X_n\in B_n) =P(X_{k+1}\in B_{k+1})\cdots P(X_n\in B_n) $$ A similar argument gives $$ (3)\qquad P(X_1\in B_1,\dots,X_k\in B_k) =P(X_1\in B_1)\cdots P(X_k\in B_k) $$ Hence, substituting (2) and (3) into (1) gives

\begin{eqnarray*} &&P((X_1,\dots,X_k,X_{k+1},\dots,X_n)\in B_1\times\cdots\times B_k\times B_{k+1}\times\cdots\times B_n)\\ &=&P(X_1\in B_1,\dots,X_k\in B_k,X_{k+1}\in B_{k+1},\dots,X_n\in B_n)\\ &=& P(X_1\in B_1)\cdots P(X_k\in B_k)P(X_{k+1}\in B_{k+1})\cdots P(X_n\in B_n)\\ &=& \Big[P(X_1\in B_1)\cdots P(X_k\in B_k)\Big]\Big[P(X_{k+1}\in B_{k+1})\cdots P(X_n\in B_n)\Big]\\ &=& \Big[P(X_1\in B_1,\dots,X_k\in B_k)\Big]\Big[P(X_{k+1}\in B_{k+1},\dots,X_n\in B_n)\Big]\\ &=& \Big[P((X_1,\dots,X_k)\in B_1\times\cdots\times B_k)\Big]\Big[P((X_{k+1},\dots,X_n)\in B_{k+1}\times\cdots\times B_n)\Big]\\ \end{eqnarray*}

which holds for all $B_i\in {\cal B}(\mathbb{R})$.
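To see this product-set factorization in action on a concrete case, here is an exact (no sampling) verification for three independent fair coins, with $n=3$, $k=2$ and one arbitrary choice of the $B_i$; the sets and helper names are of course just for illustration.

```python
# Exact check of the product-set factorization for a small discrete case:
# X1, X2, X3 independent fair coins in {0, 1}, n = 3, k = 2.
from fractions import Fraction
from itertools import product

outcomes = list(product([0, 1], repeat=3))        # 8 equally likely outcomes
p = Fraction(1, 8)

def prob(event):
    """Exact probability of {omega : event(omega)}."""
    return sum(p for w in outcomes if event(w))

B1, B2, B3 = {0}, {0, 1}, {1}                     # one arbitrary choice of sets
lhs = prob(lambda w: w[0] in B1 and w[1] in B2 and w[2] in B3)
rhs = prob(lambda w: w[0] in B1 and w[1] in B2) * prob(lambda w: w[2] in B3)
assert lhs == rhs                                 # (2), (3) substituted into (1)
print(lhs, rhs)                                   # Fraction(1, 4) twice
```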

Now, sets of the form $B_1\times\cdots\times B_k$, where the $B$'s range over ${\cal B}(\mathbb{R})$, form a $\pi$-system (a family closed under finite intersections) that generates ${\cal B}(\mathbb{R}^k)$. Hence, sets of the form $((X_1,\dots,X_k)\in B_1\times\cdots\times B_k)$ form a $\pi$-system generating $\sigma(X_1,\dots,X_k)$. Similarly, sets of the form $B_{k+1}\times\cdots\times B_n$ form a $\pi$-system generating ${\cal B}(\mathbb{R}^{n-k})$, and sets of the form $((X_{k+1},\dots,X_n)\in B_{k+1}\times\cdots\times B_n)$ form a $\pi$-system generating $\sigma(X_{k+1},\dots,X_n)$. By the standard $\pi$-$\lambda$ (Dynkin) argument, two $\sigma$-algebras are independent as soon as the product formula holds on generating $\pi$-systems, and that is exactly what the display above establishes. Hence, for any $M\in{\cal B}(\mathbb{R}^k)$ and $N\in{\cal B}(\mathbb{R}^{n-k})$ we have

\begin{eqnarray*} &&P((X_1,\dots,X_k)\in M, (X_{k+1},\dots,X_n)\in N )\\ &=&P((X_1,\dots,X_k,X_{k+1},\dots,X_n)\in M\times N )\\ &=&\Big[P((X_1,\dots,X_k)\in M)\Big]\Big[P((X_{k+1},\dots,X_n)\in N)\Big]\\ \end{eqnarray*}

which shows that $(X_1,\dots,X_k)$ and $(X_{k+1},\dots,X_n)$ are independent.
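Regarding question 3 in the post: pairwise independence of the cross pairs $X_i, X_j$ ($i\le k < j$) is not enough. The classical counterexample is $X_1, X_2$ i.i.d. uniform on $\{-1,+1\}$ with $X_3 = X_1X_2$ and $k=2$: both cross pairs $(X_1,X_3)$ and $(X_2,X_3)$ are independent, yet $(X_1,X_2)$ is not independent of $X_3$, since $P(X_1=X_2=X_3=1)=\tfrac14 \ne \tfrac14\cdot\tfrac12$. A short exact check of this counterexample:

```python
# Pairwise independence across the split is NOT enough (question 3).
# Counterexample: X1, X2 iid uniform on {-1, +1} and X3 = X1 * X2.
from fractions import Fraction
from itertools import product

outcomes = [(x1, x2, x1 * x2) for x1, x2 in product([-1, 1], repeat=2)]
p = Fraction(1, 4)

def prob(event):
    return sum(p for w in outcomes if event(w))

# each cross pair factorizes ...
assert prob(lambda w: w[0] == 1 and w[2] == 1) == \
       prob(lambda w: w[0] == 1) * prob(lambda w: w[2] == 1)
assert prob(lambda w: w[1] == 1 and w[2] == 1) == \
       prob(lambda w: w[1] == 1) * prob(lambda w: w[2] == 1)

# ... but the vector factorization fails: 1/4 != 1/4 * 1/2
lhs = prob(lambda w: w[0] == 1 and w[1] == 1 and w[2] == 1)
rhs = prob(lambda w: w[0] == 1 and w[1] == 1) * prob(lambda w: w[2] == 1)
assert lhs != rhs
print(lhs, rhs)   # 1/4 vs 1/8
```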
