Product of two dependent Gaussian random variables

probability

Suppose $(X,Y)$ is Gaussian with mean $\left[\begin{matrix}0 \\ 0\end{matrix}\right]$ and covariance matrix $\left[\begin{matrix}1 & \delta \\ \delta & 1\end{matrix}\right]$.

It is easy to calculate that $\mathbb E[(XY)^2] = 2\delta^2+1$ as follows:

\begin{equation}
\begin{split}
\mathbb E[(XY)^2] &= \frac{1}{2\pi \sqrt{1-\delta^2}}\int\int(xy)^2 \exp\left(-\frac{1}{2(1-\delta^2)}(x^2-2\delta x y + y^2)\right) \mathrm dx\,\mathrm dy \\
& = \frac{1}{\sqrt{2\pi}}\int y^2 \exp \left(-\frac{1}{2}y^2\right) \frac{1}{\sqrt{2 \pi (1-\delta^2)}}\int x^2 \exp \left(-\frac{1}{2(1-\delta^2)} (x-y\delta)^2\right)\mathrm dx\,\mathrm dy \\
& = \mathbb E_{y \sim \mathcal N(0,1)}\left[y^2\left((1-\delta^2) + \delta^2 y^2\right)\right] = (1-\delta^2) + 3\delta^2 = 2\delta^2 + 1.
\end{split}
\end{equation}

Here the inner integral is the second moment of $\mathcal N(\delta y,\, 1-\delta^2)$, which equals $(1-\delta^2) + \delta^2 y^2$, and the last equality uses $\mathbb E[y^2] = 1$ and $\mathbb E[y^4] = 3$ for $y \sim \mathcal N(0,1)$.
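As a quick sanity check (not part of the original derivation), here is a Monte Carlo sketch that samples the correlated pair via $X = Z_1$, $Y = \delta Z_1 + \sqrt{1-\delta^2}\,Z_2$ with independent standard normals $Z_1, Z_2$; the sample mean of $(XY)^2$ should be close to $2\delta^2 + 1$. The value $\delta = 0.7$ is chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
delta = 0.7
n = 1_000_000

# Build a correlated Gaussian pair with unit variances and correlation delta.
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
x = z1
y = delta * z1 + np.sqrt(1 - delta**2) * z2

estimate = np.mean((x * y) ** 2)
print(estimate)            # Monte Carlo estimate of E[(XY)^2]
print(2 * delta**2 + 1)    # exact value: 1.98
```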

Nevertheless, it seems like there should be an easier way to compute this quantity without having to write down a Gaussian integral.

Question 1: Is there a more clever way to compute this expectation (perhaps by using standard facts about the chi-square distribution)?

This question arose when considering the $\ell_2$ norm of Gaussian vectors.

Best Answer

One approach is to use the joint moment generating function $$m(t_1,t_2)=\mathbb E\left[e^{t_1 X + t_2 Y}\right]=\exp\Big(\tfrac{1}{2}\big(t_1^2+2\delta t_1t_2+t_2^2\big)\Big)$$ and differentiate to obtain $$\mathbb E[(XY)^2]=\frac{\partial^4}{\partial t_1^2\, \partial t_2^2}m(t_1,t_2)\Big|_{t_1=t_2=0}=1+2\delta^2.$$
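For completeness, a small symbolic sketch (assuming SymPy is available) that carries out this differentiation and recovers $2\delta^2 + 1$:

```python
import sympy as sp

t1, t2, delta = sp.symbols('t1 t2 delta')

# Joint MGF of a centered bivariate normal with unit variances and correlation delta.
m = sp.exp(sp.Rational(1, 2) * (t1**2 + 2*delta*t1*t2 + t2**2))

# Fourth mixed partial derivative, evaluated at the origin, gives E[(XY)^2].
moment = sp.diff(m, t1, 2, t2, 2).subs({t1: 0, t2: 0})
print(sp.simplify(moment))  # 2*delta**2 + 1
```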
