Mathematical Statistics – Proof: Joint Probability Density Equals Product of Marginal Densities for Independent Variables

independence, joint-distribution, mathematical-statistics

Is it true that if $X_1, X_2, \ldots ,X_n$ are independent random variables, then
\begin{align}
& f_{X_1,X_2,\ldots,X_n}(x_1,x_2,\ldots,x_n) \\
= {} & f_{X_1}(x_1)\times f_{X_2}(x_2) \times \cdots \times f_{X_n}(x_n)
\end{align}

(i.e., that the joint probability density of independent random variables equals the product of their marginal densities)?

If so, what is the proof of this theorem (or should the statement be treated as the definition of independence, rather than a theorem)? This is not homework; I am asking because I am curious what the proof is.

Thank you,

Best Answer

By definition, the random variables $X_1,\dots,X_n$ are independent iff $$ \Pr(X_1\in B_1,\dots,X_n\in B_n) = \Pr(X_1\in B_1)\cdots\Pr(X_n\in B_n) $$ for every choice of Borel sets $B_1,\dots,B_n$. Hence, picking $B_i=(-\infty,t_i]$, we have $$ \Pr(X_1\leq t_1,\dots,X_n\leq t_n) = \Pr(X_1\leq t_1)\cdots\Pr(X_n\leq t_n). \qquad (*) $$

If each $X_i$ has a density $f_{X_i}$, then the right-hand side of $(*)$ equals $$ \left(\int_{-\infty}^{t_1} f_{X_1}(x_1)\,dx_1\right) \cdots \left(\int_{-\infty}^{t_n} f_{X_n}(x_n)\,dx_n\right). $$ By Fubini's theorem, this equals $$ \int_{-\infty}^{t_n}\cdots\int_{-\infty}^{t_1} f_{X_1}(x_1)\cdots f_{X_n}(x_n)\,dx_1\cdots dx_n. $$

Since this holds for every $(t_1,\dots,t_n)$, the joint CDF of $(X_1,\dots,X_n)$ is the integral of the product $f_{X_1}(x_1)\cdots f_{X_n}(x_n)$ over $(-\infty,t_1]\times\cdots\times(-\infty,t_n]$, which is exactly what it means for that product to be a joint density. Hence the random vector $(X_1,\dots,X_n)$ has density $$ f_{X_1,\dots,X_n}(x_1,\dots,x_n) = f_{X_1}(x_1)\cdots f_{X_n}(x_n). $$
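As a sanity check (not part of the proof), both steps above can be verified numerically for two independent standard normals: the joint density with identity covariance factors into the marginals, and a Monte Carlo estimate of the joint CDF in $(*)$ matches the product of the marginal CDFs. This sketch assumes NumPy and SciPy are available; the points and tolerances are arbitrary choices.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

rng = np.random.default_rng(0)

# 1) Density factorization: for independent standard normals X, Y, the
#    joint pdf (bivariate normal, identity covariance) equals the product
#    of the marginal pdfs at every point.
joint = multivariate_normal(mean=[0.0, 0.0], cov=np.eye(2))
for x, y in [(0.0, 0.0), (1.0, -0.5), (2.0, 1.5)]:
    assert np.isclose(joint.pdf([x, y]), norm.pdf(x) * norm.pdf(y))

# 2) CDF factorization (*): the empirical frequency of
#    {X <= t1 and Y <= t2} for independently drawn samples approximates
#    P(X <= t1) * P(Y <= t2).
n = 200_000
X, Y = rng.standard_normal(n), rng.standard_normal(n)
t1, t2 = 0.5, -0.25
empirical = np.mean((X <= t1) & (Y <= t2))
product = norm.cdf(t1) * norm.cdf(t2)
assert abs(empirical - product) < 0.01
```

The Monte Carlo tolerance of 0.01 is loose relative to the standard error ($\approx 0.001$ at $n = 200{,}000$), so the check is robust to the choice of seed.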