Determine the covariance matrix of a Gaussian random variable by the distributions of its linear transformations

correlation, covariance, normal distribution, probability distributions, probability theory

Let $d\in\mathbb N$ and $X$ be an $\mathbb R^d$-valued random variable on a probability space $(\Omega,\mathcal A,\operatorname P)$. Assume $X$ has a Gaussian distribution, i.e. $\langle\lambda,X\rangle$ is normally distributed for all $\lambda\in\mathbb R^d$. Let $\operatorname{Cov}[X]:=\operatorname E\left[(X-\operatorname E[X])(X-\operatorname E[X])^T\right]$ denote the covariance matrix of $X$.

Are we able to express the $ij$-th element $\langle\operatorname{Cov}[X]e_j,e_i\rangle$ of $\operatorname{Cov}[X]$ in terms of the means and variances of the random variables $\langle\lambda,X\rangle$, $\lambda\in\mathbb R^d$?

By assumption, $\langle\lambda,X\rangle\sim\mathcal N(\mu_\lambda,\sigma_\lambda^2)$ for some $(\mu_\lambda,\sigma_\lambda)\in\mathbb R\times[0,\infty)$ for all $\lambda\in\mathbb R^d$. From this we can express the diagonal entries of $\operatorname{Cov}[X]$ as $$\langle\operatorname{Cov}[X]e_i,e_i\rangle=\sigma_{e_i}^2.\tag1$$ However, unless the components of $X$ are uncorrelated, I don't see how we could determine the off-diagonal entries.

Best Answer

For random variables $Y$ and $Z$, $\operatorname{Var}(Y+Z)=\operatorname{Var}(Y)+\operatorname{Var}(Z)+2\operatorname{Cov}(Y,Z)$. Applying this with $Y=\langle e_i,X\rangle$ and $Z=\langle e_j,X\rangle$, whose sum is $\langle e_i+e_j,X\rangle$, gives $$ \langle\operatorname{Cov}[X]e_j,e_i\rangle=\frac{1}{2}\left(\sigma_{e_i+e_j}^2-\sigma_{e_i}^2-\sigma_{e_j}^2\right). $$
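As a numerical sanity check, here is a minimal sketch (the names `cov`, `sigma2`, and `recovered` are illustrative, not from the question) that recovers every entry of a covariance matrix purely from the variances $\sigma_\lambda^2=\lambda^T\operatorname{Cov}[X]\lambda$ of the projections $\langle\lambda,X\rangle$:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary 3x3 covariance matrix, positive semidefinite by construction.
A = rng.standard_normal((3, 3))
cov = A @ A.T

def sigma2(lam):
    """Variance of <lam, X> when Cov[X] = cov, namely lam^T cov lam."""
    return lam @ cov @ lam

# Recover each entry via the identity
#   Cov(X_i, X_j) = (sigma_{e_i+e_j}^2 - sigma_{e_i}^2 - sigma_{e_j}^2) / 2.
d = cov.shape[0]
I = np.eye(d)
recovered = np.empty((d, d))
for i in range(d):
    for j in range(d):
        recovered[i, j] = 0.5 * (sigma2(I[i] + I[j]) - sigma2(I[i]) - sigma2(I[j]))

assert np.allclose(recovered, cov)
```

Note that the identity also handles the diagonal ($i=j$): $\sigma_{2e_i}^2=4\sigma_{e_i}^2$, so the formula reduces to $\sigma_{e_i}^2$, consistent with $(1)$.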
