Random Variables – What Do "Orthogonal Random Variables" Mean?

orthonormal, random variables

As far as I know, orthogonality is a linear-algebraic concept: in the 2D or 3D case, two vectors are orthogonal if they are perpendicular, and the same idea extends to higher dimensions. But when it comes to random variables, I cannot figure out what orthogonality means. I saw somewhere that if the expectation of the product of two random variables $X$ and $Y$ is zero ($E[XY] = 0$), then the random variables are orthogonal. How is that possible?

Is orthogonality the same concept in linear algebra as in probability and statistics?

Best Answer

Orthogonality comes from the idea of a vanishing inner product. Square-integrable random variables form an inner-product space, with the inner product given by the expectation of the product: $$ \langle X, Y \rangle = \mathbb E\left[XY\right] = \int_{-\infty}^\infty \int_{-\infty}^\infty xy \, d\mu_{X,Y}(x,y), $$ where $\mu_{X,Y}$ is the joint distribution of $(X, Y)$, just as $\mathbb E\left[X\right] = \int_{-\infty}^\infty x \, d\mu_X(x)$ for the marginal distribution $\mu_X$. (Note the joint distribution here: the double integral against $d\mu_X \, d\mu_Y$ would only equal $\mathbb E[XY]$ when $X$ and $Y$ are independent.) Orthogonal random variables are then exactly those with $$ \mathbb E\left[XY\right] = 0, $$ so it is the same notion of orthogonality as in linear algebra; the "vectors" are random variables and the inner product is $\mathbb E[XY]$.
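To make this concrete, here is a small sketch on a hypothetical finite sample space (four equally likely outcomes, with made-up values for $X$ and $Y$), where the inner product $\langle X, Y\rangle = \mathbb E[XY]$ is just a weighted sum:

```python
import numpy as np

# Hypothetical finite probability space: four equally likely outcomes.
p = np.full(4, 0.25)          # probability of each outcome
X = np.array([1.0, 1.0, -1.0, -1.0])   # value of X on each outcome
Y = np.array([1.0, -1.0, 1.0, -1.0])   # value of Y on each outcome

def inner(U, V, p=p):
    """Inner product <U, V> = E[U V] on this finite probability space."""
    return float(np.sum(p * U * V))

print(inner(X, Y))  # 0.0 -> E[XY] = 0, so X and Y are orthogonal
print(inner(X, X))  # 1.0 -> E[X^2], the squared "length" of X
```

Here $E[XY] = \tfrac14(1 - 1 - 1 + 1) = 0$, so $X$ and $Y$ are orthogonal, while $E[X^2] = 1$ gives $X$ unit length, mirroring perpendicular unit vectors in linear algebra.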
