Probability – Conditional Expectation of a Joint Normal Distribution

conditional-expectation, normal-distribution, probability, probability-distributions, statistics

Let $X_1, X_2$ be jointly normal $N(\mu, \Sigma)$.

I know that in general, $\mathbb{E}[X_2|X_1]$ can be computed by integrating the conditional density, but in the case of jointly normal variables, it suffices to do a linear projection:

$$\mathbb{E}[X_2 \mid \sigma(X_1)] = \mathbb{E}[X_2 \mid \mathrm{span}(\mathbf{1}, X_1)] = \mu_2 + \frac{\mathrm{cov}(X_2, X_1)}{\mathrm{var}(X_1)} (X_1 - \mu_1)$$

Is there a neat proof of this fact (one that doesn't require doing any integrals)? I'm also looking for references.
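Not a proof, but as a quick numerical sanity check of the formula, one can simulate from a bivariate normal and compare the empirical conditional mean against the linear projection. The mean vector and covariance matrix below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([1.0, -2.0])              # (mu_1, mu_2) -- arbitrary choices
Sigma = np.array([[2.0, 1.2],
                  [1.2, 3.0]])          # covariance matrix -- arbitrary choice

x1, x2 = rng.multivariate_normal(mu, Sigma, size=1_000_000).T

x = 2.0                                  # condition on X_1 being near this value
mask = np.abs(x1 - x) < 0.05             # crude "X_1 = x" slice
empirical = x2[mask].mean()              # empirical E[X_2 | X_1 ~ x]

predicted = mu[1] + Sigma[0, 1] / Sigma[0, 0] * (x - mu[0])

print(empirical, predicted)              # the two agree to roughly two decimals
```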

Best Answer

I've found an answer that I'm happy with (writing $X = X_1$ and $Y = X_2$):

The variable
$$ Y - \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)} X $$
is jointly normal with $X$ (it is a linear function of the jointly normal pair) and is uncorrelated with $X$, since
$$\mathrm{cov}\!\left(Y - \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)} X,\ X\right) = \mathrm{cov}(Y, X) - \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)}\,\mathrm{var}(X) = 0,$$
hence it is independent of $X$.

Therefore

$$\begin{aligned}
\mathbb{E}[Y \mid X] &= \mathbb{E}\!\left[\left. Y - \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)} X + \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)} X \,\right|\, X\right] \\
&= \mathbb{E}\!\left[Y - \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)} X\right] + \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)}\, X \\
&= \mathbb{E}[Y] + \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)}\,\bigl(X - \mathbb{E}[X]\bigr),
\end{aligned}$$

where the second equality uses independence to drop the conditioning on the first term and pulls the $X$-measurable second term out of the conditional expectation.
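A quick simulation sketch of this decomposition (again with an arbitrary illustrative covariance matrix), checking that the residual used above is indeed uncorrelated with $X$:

```python
import numpy as np

rng = np.random.default_rng(1)

mu = np.array([0.5, 2.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 2.0]])            # arbitrary illustrative covariance

x, y = rng.multivariate_normal(mu, Sigma, size=1_000_000).T

C = np.cov(x, y)                          # sample covariance matrix
beta = C[0, 1] / C[0, 0]                  # cov(X, Y) / var(X)
resid = y - beta * x                      # the variable used in the answer

print(np.corrcoef(resid, x)[0, 1])        # ~ 0: resid is uncorrelated with X
# For jointly normal variables, zero correlation implies independence,
# which is what lets E[resid | X] collapse to the constant E[resid].
```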
