Expectation of product of jointly Gaussian random variables

Tags: expected-value, normal-distribution, probability, probability-distributions

I am doing a guided project on Gaussian Spaces and I'm getting stuck in the first stages of the construction. I would really appreciate help with this next point:

Let $N$ be an integer and let $\xi_{1},\ldots,\xi_{N}$ be i.i.d. standard real Gaussians (mean zero, unit variance) defined on some joint probability space $\Omega$. For any vector $y:=(y_{1},\ldots,y_{N})\in\mathbb{R}^{N}$, define a random variable $\xi_{y}$ on $\Omega$ by $\xi_{y}:=\sum_{i=1}^{N}y_{i}\xi_{i}$.

Now let $y^{(1)},\ldots,y^{(k)}$ be $k$ vectors in $\mathbb{R}^{N}$. It turns out (take as given) that $\xi_{y^{(1)}},\ldots,\xi_{y^{(k)}}$ have a joint Gaussian distribution. IOW, the density of the $k$-dimensional vector $x:=\left(\xi_{y^{(1)}},\ldots,\xi_{y^{(k)}}\right)$ is given by $$f(x)=\frac{1}{\sqrt{(2\pi)^{k}\det(A)}}\exp\left\{ -\frac{1}{2}x^{T}A^{-1}x\right\},$$ where $A$ is the $k\times k$ covariance matrix of $x$, i.e. $A_{ij}:=\mathbb{E}\left[\xi_{y^{(i)}}\xi_{y^{(j)}}\right]$.

Check/Show that $A_{ij}=\left\langle y^{(i)},y^{(j)}\right\rangle$.

Here's what I have so far:

$\mathbb{E}\left[\xi_{y^{(i)}}\xi_{y^{(j)}}\right]=\mathbb{E}\left[\left(y^{(i)}\cdot\begin{pmatrix}\xi_{1}\\
\vdots\\
\xi_{N}
\end{pmatrix}\right)\left(y^{(j)}\cdot\begin{pmatrix}\xi_{1}\\
\vdots\\
\xi_{N}
\end{pmatrix}\right)\right]\overset{}{=}\\\mathbb{E}\left[y^{(i)}\left(\begin{pmatrix}\xi_{1}\\
\vdots\\
\xi_{N}
\end{pmatrix}\cdot y^{(j)}\right)\begin{pmatrix}\xi_{1}\\
\vdots\\
\xi_{N}
\end{pmatrix}\right]\overset{(a)}{=}\mathbb{E}\left[y^{(i)}\left(\begin{pmatrix}\xi_{1}\\
\vdots\\
\xi_{N}
\end{pmatrix}\cdot y^{(j)}\right)^{T}\begin{pmatrix}\xi_{1}\\
\vdots\\
\xi_{N}
\end{pmatrix}\right]=\\\mathbb{E}\left[\left(y^{(i)}\cdot y^{(j)^{T}}\right)\left(\begin{pmatrix}\xi_{1}\\
\vdots\\
\xi_{N}
\end{pmatrix}^{T}\cdot\begin{pmatrix}\xi_{1}\\
\vdots\\
\xi_{N}
\end{pmatrix}\right)\right]\overset{(b)}{=}\left(y^{(i)}\cdot y^{(j)^{T}}\right)\mathbb{E}\left[\begin{pmatrix}\xi_{1}\\
\vdots\\
\xi_{N}
\end{pmatrix}^{T}\cdot\begin{pmatrix}\xi_{1}\\
\vdots\\
\xi_{N}
\end{pmatrix}\right]=\\\left\langle y^{(i)},y^{(j)}\right\rangle \mathbb{E}\left[\sum_{k=1}^{N}\xi_{k}^{2}\right]\overset{(b)}{=}\left\langle y^{(i)},y^{(j)}\right\rangle \sum_{k=1}^{N}\mathbb{E}\left[\xi_{k}^{2}\right]\overset{(c)}{=}N\cdot\left\langle y^{(i)},y^{(j)}\right\rangle $

(a) – $\xi_{y}=\xi_{y}^{T}$ (a scalar equals its own transpose)

(b) – Linearity of expected value

(c) – $1=Var(\xi_{i})=\mathbb{E}\left[\xi_{i}^{2}\right]-\underset{=0^{2}}{\underline{\mathbb{E}\left[\xi_{i}\right]^{2}}}\implies\mathbb{E}\left[\xi_{i}^{2}\right]=1$

First of all, are my steps even correct? It worries me that I just go for "naive" calculations, but it seems to work… except that I end up with an extra factor of $N$.
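Incidentally, the suspicious factor of $N$ can be checked numerically. Below is a quick Monte Carlo sketch in NumPy (the vectors `y1`, `y2` are made-up examples of $y^{(1)},y^{(2)}$): it estimates $\mathbb{E}\left[\xi_{y^{(1)}}\xi_{y^{(2)}}\right]$ from samples and compares it with $\left\langle y^{(1)},y^{(2)}\right\rangle$.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 3
# hypothetical example vectors y^(1), y^(2) in R^N
y1 = np.array([1.0, 2.0, -1.0])
y2 = np.array([0.5, -1.0, 3.0])

# draw M i.i.d. samples of the standard Gaussian vector (xi_1, ..., xi_N)
M = 1_000_000
xi = rng.standard_normal((M, N))

# xi_y = sum_i y_i * xi_i, computed for every sample at once
xi_y1 = xi @ y1
xi_y2 = xi @ y2

emp = np.mean(xi_y1 * xi_y2)  # empirical estimate of E[xi_{y1} xi_{y2}]
inner = float(y1 @ y2)        # <y1, y2>

print("empirical:", emp)
print("<y1, y2>: ", inner)
print("N*<y1,y2>:", N * inner)
```

The empirical value lands on $\left\langle y^{(1)},y^{(2)}\right\rangle$ rather than $N\cdot\left\langle y^{(1)},y^{(2)}\right\rangle$, which suggests the factor of $N$ is an artifact of the derivation above.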

I would appreciate feedback on my approach + an explanation of how to actually demonstrate what is being asked of me (if possible, as an extension of what I already did).

EDIT: I think my use of the associative property is incorrect, and so is the subsequent step (a). Unfortunately, I have to go to sleep now, but in the morning I'll take another look at those parts.

Best Answer

Ok, this is a little embarrassing, but I just outfancied myself by using vector notation, and as a result I made the silly mistake of pretending that multiplying a $1\times N$ vector by an $N\times 1$ vector is the same as multiplying an $N\times 1$ vector by a $1\times N$ vector (the former is a scalar, the latter an $N\times N$ matrix)… Anyway, here is the solution I worked out by being a little more straightforward:

Assuming all the givens as stated in the question:

$$\begin{aligned}
\mathbb{E}\left[\xi_{y^{(i)}}\xi_{y^{(j)}}\right]
&\overset{\text{def}}{=}\mathbb{E}\left[\sum_{k=1}^{N}y_{k}^{(i)}\xi_{k}\sum_{l=1}^{N}y_{l}^{(j)}\xi_{l}\right]
=\mathbb{E}\left[\sum_{k,l\in[N]}y_{k}^{(i)}y_{l}^{(j)}\xi_{k}\xi_{l}\right]\\
&\overset{(a)}{=}\sum_{k,l\in[N]}y_{k}^{(i)}y_{l}^{(j)}\mathbb{E}\left[\xi_{k}\xi_{l}\right]\\
&\overset{(b)}{=}\left(\sum_{k\in[N]}y_{k}^{(i)}y_{k}^{(j)}\mathbb{E}\left[\xi_{k}^{2}\right]\right)+\left(\sum_{\substack{k,l\in[N]\\k\neq l}}y_{k}^{(i)}y_{l}^{(j)}\mathbb{E}\left[\xi_{k}\xi_{l}\right]\right)\\
&\overset{(c)}{=}\left(\sum_{k\in[N]}y_{k}^{(i)}y_{k}^{(j)}\mathbb{E}\left[\xi_{k}^{2}\right]\right)+\left(\sum_{\substack{k,l\in[N]\\k\neq l}}y_{k}^{(i)}y_{l}^{(j)}\underset{=0}{\underline{\mathbb{E}\left[\xi_{k}\right]}}\,\underset{=0}{\underline{\mathbb{E}\left[\xi_{l}\right]}}\right)\\
&\overset{(d)}{=}\sum_{k\in[N]}y_{k}^{(i)}y_{k}^{(j)}
=\left\langle y^{(i)},y^{(j)}\right\rangle
\end{aligned}$$

(a) - Linearity of expected value

(b) - Splitting up the sum

(c) - Recall that the $\xi_{i}$ are independent, so for $k\neq l$ we have $\mathbb{E}\left[\xi_{k}\xi_{l}\right]=\mathbb{E}\left[\xi_{k}\right]\mathbb{E}\left[\xi_{l}\right]$

(d) - $1=\operatorname{Var}(\xi_{k})=\mathbb{E}\left[\xi_{k}^{2}\right]-\underset{=0^{2}}{\underline{\mathbb{E}\left[\xi_{k}\right]^{2}}}\implies\mathbb{E}\left[\xi_{k}^{2}\right]=1$
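For what it's worth, the vector-notation approach from the question can also be repaired. Here is a sketch, introducing notation of my own: let $Y$ be the $k\times N$ matrix whose $i$-th row is $y^{(i)}$, and let $\boldsymbol{\xi}:=(\xi_{1},\ldots,\xi_{N})^{T}$, so that $x=Y\boldsymbol{\xi}$. Then

$$A=\mathbb{E}\left[x x^{T}\right]=\mathbb{E}\left[Y\boldsymbol{\xi}\,\boldsymbol{\xi}^{T}Y^{T}\right]=Y\,\mathbb{E}\left[\boldsymbol{\xi}\boldsymbol{\xi}^{T}\right]Y^{T}=Y I_{N} Y^{T}=Y Y^{T},$$

whose $(i,j)$ entry is exactly $\left\langle y^{(i)},y^{(j)}\right\rangle$. The key point is that $\mathbb{E}\left[\boldsymbol{\xi}\boldsymbol{\xi}^{T}\right]=I_{N}$ (an $N\times N$ matrix, by the same computations as in (c) and (d)), whereas the original attempt effectively replaced the outer product $\boldsymbol{\xi}\boldsymbol{\xi}^{T}$ with the inner product $\boldsymbol{\xi}^{T}\boldsymbol{\xi}=\sum_{k}\xi_{k}^{2}$, whose expectation is $N$.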