[Math] A geometric interpretation of independence

pr.probability

Consider the set of random variables with zero mean and finite second moment. This is a vector space, and $\langle X, Y \rangle = E[XY]$ is a valid inner product on it. Uncorrelated random variables correspond to orthogonal vectors in this space.
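
To make the distinction concrete, here is a minimal Monte Carlo sketch (my own illustration, assuming numpy; the pair $X$, $Y = X^2 - 1$ is an arbitrary choice, not from the question). The two variables are orthogonal under this inner product, i.e. uncorrelated, yet $Y$ is a deterministic function of $X$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X standard normal, Y = X^2 - 1: both have mean zero.
X = rng.standard_normal(n)
Y = X**2 - 1

# Monte Carlo estimate of the inner product <X, Y> = E[XY].
# Here E[XY] = E[X^3] = 0, so X and Y are orthogonal
# (uncorrelated) -- yet Y is a function of X, so they are
# far from independent.
print(np.mean(X * Y))     # ~ 0  (orthogonal)
print(np.mean(X**2 * Y))  # = E[X^4] - E[X^2] = 2, exposing the dependence
```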

Questions:

(i) Does there exist a similar geometric interpretation for independent random variables
in terms of this vector space?

(ii) Jointly Gaussian random variables are uncorrelated if and only if they are
independent. Is it possible to give a geometric interpretation of this fact?

Best Answer

There is a Hilbert space interpretation of independence, which follows from the interpretation of conditional expectation as an orthogonal projection, though it may be more complicated than you had in mind.

Say your underlying probability space is $(\Omega, \mathcal{F}, \mathbb{P})$, and write $L^2(\mathcal{F})$ for the Hilbert space of ($\mathcal{F}$-measurable) random variables with finite variance (with $\Omega$ and $\mathbb{P}$ understood). Denote by $\sigma(X)$ the $\sigma$-algebra generated by the random variable $X$. The conditional expectation $\mathbb{E}[X|Y]$ is then the orthogonal projection in $L^2(\mathcal{F})$ of $X$ onto the subspace $L^2(\sigma(Y))$ of random variables which are $\sigma(Y)$-measurable. Now $X$ and $Y$ are independent if and only if $\mathbb{E}[f(X)|Y]=\mathbb{E}f(X)$ for every bounded measurable function $f$, and the random variables $f(X)$ for such $f$ are dense in $L^2(\sigma(X))$.
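
As a rough numerical illustration of the projection picture (my own sketch, not part of the answer; it approximates the subspace $L^2(\sigma(Y))$ by the span of a few polynomials in $Y$, which is only a heuristic stand-in for the full subspace):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Jointly Gaussian pair, both mean zero: X = 0.7*Y + noise.
Y = rng.standard_normal(n)
X = 0.7 * Y + rng.standard_normal(n)

# Orthogonal projection of X onto (a proxy for) L^2(sigma(Y)):
# least-squares projection onto the span of 1, Y, Y^2, Y^3.
basis = np.vstack([Y**k for k in range(4)]).T
coef, *_ = np.linalg.lstsq(basis, X, rcond=None)
proj = basis @ coef

# In the Gaussian case E[X|Y] = (Cov(X,Y)/Var(Y)) * Y, so the
# projection should pick out (almost) only the linear term.
print(coef)                      # ~ [0, 0.7, 0, 0]
# The residual X - proj is orthogonal to each basis function of Y.
print(basis.T @ (X - proj) / n)  # ~ [0, 0, 0, 0]
```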

So if I now define $L^2_0(\sigma(X))$ to be the space of mean-zero, finite-variance random variables which are $\sigma(X)$-measurable, I can say: $X$ and $Y$ are independent iff $L^2_0(\sigma(X))$ is orthogonal to $L^2(\sigma(Y))$ in $L^2(\mathcal{F})$. Indeed, $\mathbb{E}[f(X)|Y]=\mathbb{E}f(X)$ says exactly that the projection of $f(X)-\mathbb{E}f(X)$ onto $L^2(\sigma(Y))$ is $0$, i.e. that $\mathbb{E}\big[(f(X)-\mathbb{E}f(X))\,g(Y)\big]=0$ for every $g(Y)\in L^2(\sigma(Y))$.
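
One can probe this criterion numerically (again my own sketch; the test functions are arbitrary choices, and a finite family can only give evidence of, not prove, orthogonality of the full subspaces). It cleanly separates an independent pair from the uncorrelated-but-dependent pair above:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

def max_inner_product(X, Y, funcs):
    """Largest |E[(f(X) - E f(X)) * g(Y)]| over a few test functions:
    a crude probe of whether L^2_0(sigma(X)) is orthogonal to
    L^2(sigma(Y))."""
    vals = []
    for f in funcs:
        fX = f(X) - np.mean(f(X))  # element of L^2_0(sigma(X))
        for g in funcs:
            vals.append(abs(np.mean(fX * g(Y))))
    return max(vals)

funcs = [lambda t: t, lambda t: t**2, np.sin, np.cos]

X = rng.standard_normal(n)
print(max_inner_product(X, rng.standard_normal(n), funcs))  # ~ 0: independent
print(max_inner_product(X, X**2 - 1, funcs))                # ~ 2: dependent
```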