Proving that uncorrelatedness implies independence for bivariate joint normal random variables

bivariate-distributions, density-function, normal-distribution, probability, probability-distributions

Question

Let $X$ and $Y$ be independent standard normal random variables and consider the following linear transformations, $$U = aX + bY$$ and $$V = cX + dY,$$ where $a, b, c, d \in \mathbb{R}$. Find the joint density of $U$ and $V$ and prove that, if $U$ and $V$ are uncorrelated, then $U$ and $V$ are independent.

Hint

The bivariate normal PDF for jointly normal random variables $W$ and $Z$ is given by $$f_{W, Z}(w, z) = \frac 1 {2\pi\sqrt{\lvert \Sigma \rvert}} \exp\left\{-\frac 1 2 \begin{pmatrix} w & z \end{pmatrix} \Sigma^{-1} \begin{pmatrix} w \\ z \end{pmatrix}\right\},$$ where the argument order $(w, z)$ matches the ordering of $\Sigma$.

My working

$\begin{aligned}
U \sim \mathcal{N}(0, a^2 + b^2)\\[1 mm]
V \sim \mathcal{N}(0, c^2 + d^2)
\end{aligned}$

$\begin{aligned}
\operatorname{Cov}(U, V) & = \mathbb{E}(UV) - \mathbb{E}(U)\mathbb{E}(V)\\[1 mm]
& = \mathbb{E}[(aX + bY)(cX + dY)]\\[1 mm]
& = ac\,\mathbb{E}(X^2) + (ad + bc)\,\mathbb{E}(XY) + bd\,\mathbb{E}(Y^2)\\[1 mm]
& = ac + bd,
\end{aligned}$

since $\mathbb{E}(X^2) = \mathbb{E}(Y^2) = 1$ and $\mathbb{E}(XY) = \mathbb{E}(X)\mathbb{E}(Y) = 0$ by independence.
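As a quick sanity check (a sketch with arbitrary illustrative coefficients, not part of the original working): since $(U, V)^\top = A (X, Y)^\top$ with $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ and $\operatorname{Cov}\big((X,Y)^\top\big) = I$, the covariance matrix of $(U, V)$ is $A A^\top$, whose off-diagonal entry should be $ac + bd$:

```python
import numpy as np

# Arbitrary illustrative coefficients (not from the original post)
a, b, c, d = 1.0, 2.0, 3.0, 4.0

A = np.array([[a, b],
              [c, d]])

# Cov of (U, V) = A @ Cov(X, Y) @ A.T = A @ A.T, since Cov(X, Y) = I
Sigma = A @ A.T

expected = np.array([[a**2 + b**2, a*c + b*d],
                     [a*c + b*d,   c**2 + d**2]])

assert np.allclose(Sigma, expected)
```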

$\implies \Sigma =
\begin{pmatrix}
a^2 + b^2 & ac + bd\\[1 mm]
ac + bd & c^2 + d^2
\end{pmatrix}$

$\implies \lvert \Sigma \rvert = (a^2 + b^2)(c^2 + d^2) - (ac + bd)^2 = (ad - bc)^2$

$\implies \Sigma^{-1} = \dfrac 1 {(ad - bc)^2}
\begin{pmatrix}
c^2 + d^2 & -ac - bd\\[1 mm]
-ac - bd & a^2 + b^2
\end{pmatrix}$
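These two steps can be verified numerically (again a sketch with arbitrary coefficients chosen so that $ad - bc \ne 0$):

```python
import numpy as np

# Arbitrary nondegenerate coefficients (illustrative only)
a, b, c, d = 1.0, 2.0, 3.0, 5.0

Sigma = np.array([[a**2 + b**2, a*c + b*d],
                  [a*c + b*d,   c**2 + d**2]])

# The determinant should equal (ad - bc)^2
det = np.linalg.det(Sigma)
assert np.isclose(det, (a*d - b*c)**2)

# The inverse should match the closed form above
inv_closed = np.array([[c**2 + d**2, -(a*c + b*d)],
                       [-(a*c + b*d), a**2 + b**2]]) / (a*d - b*c)**2
assert np.allclose(np.linalg.inv(Sigma), inv_closed)
```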

$\begin{aligned}
\therefore f_{U, V}(u, v) & = \frac 1 {2\pi\lvert ad - bc \rvert} \exp\left\{-\frac 1 {2(ad - bc)^2}
\begin{pmatrix}
u & v
\end{pmatrix}
\begin{pmatrix}
c^2 + d^2 & -ac - bd\\[1 mm]
-ac - bd & a^2 + b^2
\end{pmatrix}
\begin{pmatrix}
u\\[1 mm]
v
\end{pmatrix}
\right\}\\[1 mm]
& = \frac 1 {2\pi\lvert ad - bc \rvert} \exp\left\{-\frac 1 {2(ad - bc)^2} [(c^2 + d^2)u^2 - 2(ac + bd)uv + (a^2 + b^2)v^2]\right\}
\end{aligned}$

(Note the absolute value in the prefactor: $\sqrt{\lvert \Sigma \rvert} = \sqrt{(ad - bc)^2} = \lvert ad - bc \rvert$, since $ad - bc$ may be negative.)
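The expansion of the quadratic form in the exponent can be checked numerically (a sketch with an arbitrary coefficient set and evaluation point):

```python
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 5.0   # arbitrary, with ad - bc != 0
u, v = 0.7, -1.3                  # arbitrary evaluation point

# The matrix appearing in (ad - bc)^2 * Sigma^{-1}
M = np.array([[c**2 + d**2, -(a*c + b*d)],
              [-(a*c + b*d), a**2 + b**2]])

# Matrix form of the quadratic form in the exponent
x = np.array([u, v])
quad_matrix = x @ M @ x

# Expanded scalar form from the last line of the derivation
quad_scalar = (c**2 + d**2)*u**2 - 2*(a*c + b*d)*u*v + (a**2 + b**2)*v**2

assert np.isclose(quad_matrix, quad_scalar)
```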

If $U$ and $V$ are uncorrelated, then $ac + bd = 0$ and the density reduces to

$\begin{aligned}
f_{U, V}(u, v) & = \frac 1 {2\pi\lvert ad - bc \rvert} \exp\left\{-\frac 1 {2(ad - bc)^2} [(c^2 + d^2)u^2 + (a^2 + b^2)v^2]\right\}
\end{aligned}$

However, I am having trouble showing that $$f_{U, V}(u, v) = f_U(u)f_V(v),$$ and any intuitive explanation would be greatly appreciated 🙂

Best Answer

I did not check your calculations but, assuming that

  1. $U, V$ are jointly Gaussian,

  2. EDIT: $ad - bc \ne 0$ (so that $\Sigma$ is invertible and the joint density exists),

your covariance calculation is right; thus $\rho^2=\frac{(ac+bd)^2}{(a^2+b^2)(c^2+d^2)}$ and the joint density is

$$f_{UV}(u,v)=\frac{1}{2\pi\sigma_u\sigma_v\sqrt{1-\rho^2}}\exp\left\{-\frac{1}{2(1-\rho^2)}\left[\frac{(u-\mu_u)^2}{\sigma_u^2}-2\rho\frac{(u-\mu_u)(v-\mu_v)}{\sigma_u\sigma_v} +\frac{(v-\mu_v)^2}{\sigma_v^2} \right] \right\}$$

Your case is even simpler, since $\mu_u = \mu_v = 0$; thus

$$f_{UV}(u,v)=\frac{1}{2\pi\sigma_u\sigma_v\sqrt{1-\rho^2}}\exp\left\{-\frac{1}{2(1-\rho^2)}\left[\frac{u^2}{\sigma_u^2}-2\rho\frac{uv}{\sigma_u\sigma_v} +\frac{v^2}{\sigma_v^2} \right] \right\}$$

So just substitute. This expression, which is equivalent to yours, is easier to work with in the $n=2$ case.

P.S.: in the expressions above I wrote $\mu_u,\sigma_u$ instead of the more correct notation $\mu_U,\sigma_U$, for better formula readability.


> prove that, if $U$ and $V$ are uncorrelated, then $U$ and $V$ are independent.

To answer the question, simply set $\rho=0$ in the above formula, getting

$$f_{UV}(u,v)=\frac{1}{\sqrt{(a^2+b^2)2\pi}}e^{-u^2/(2(a^2+b^2))}\times \frac{1}{\sqrt{(c^2+d^2)2\pi}}e^{-v^2/(2(c^2+d^2))}=f_U(u)\times f_V(v) $$
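This factorization can also be checked numerically against the matrix form derived in the question. The key algebraic fact is the identity $(a^2+b^2)(c^2+d^2) = (ad-bc)^2 + (ac+bd)^2$, so when $ac + bd = 0$ the determinant $(ad-bc)^2$ equals $(a^2+b^2)(c^2+d^2)$ and the prefactor splits into the two marginal normalizing constants. A sketch, with coefficients chosen (arbitrarily) so that $ac + bd = 0$:

```python
import math

# Coefficients chosen so that ac + bd = 0 (uncorrelated case); illustrative only
a, b, c, d = 1.0, 2.0, 2.0, -1.0
assert a*c + b*d == 0

var_u = a**2 + b**2   # Var(U)
var_v = c**2 + d**2   # Var(V)

def f_joint(u, v):
    """Joint density from the question's derivation, with ac + bd = 0."""
    det = (a*d - b*c)**2
    quad = (c**2 + d**2)*u**2 + (a**2 + b**2)*v**2
    return math.exp(-quad / (2*det)) / (2*math.pi*math.sqrt(det))

def f_U(u):
    return math.exp(-u**2 / (2*var_u)) / math.sqrt(2*math.pi*var_u)

def f_V(v):
    return math.exp(-v**2 / (2*var_v)) / math.sqrt(2*math.pi*var_v)

# Key identity: when ac + bd = 0, (ad - bc)^2 = (a^2+b^2)(c^2+d^2)
assert (a*d - b*c)**2 == var_u * var_v

# The joint density factors into the product of the marginals
for u, v in [(0.0, 0.0), (1.0, -0.5), (-2.0, 3.0)]:
    assert math.isclose(f_joint(u, v), f_U(u) * f_V(v))
```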
