Solved – how can I calculate the mutual information between two normal densities using the parameters mu and sigma

density function · mutual information · normal distribution

I have two normally distributed random variables, $X_1$ and $X_2$, for which I know the means and variances ($\mu_1$, $\mu_2$ and $\sigma_1^2$, $\sigma_2^2$, respectively).

I would like to know the mutual information between them, $I(X_1,X_2)$. But all the formulas I can find need either $\rho(X_1,X_2)$ or $H(X_1,X_2)$, which I don't have. Is it possible to calculate $I$ (or $\rho$ or $H$) from the parameter values I have?

Best Answer

The parameters you have only describe the marginal distributions of $X_1$ and $X_2$, so no: you cannot compute a measure of dependence such as mutual information from them alone. Consider, for instance, that given $\mu_1$, $\mu_2$, $\sigma^2_1$, and $\sigma^2_2$, the pair $(X_1, X_2)$ could be perfectly correlated or entirely independent of one another; the marginals are identical in both cases, yet the mutual information differs.
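To make this concrete: if $X_1$ and $X_2$ are additionally assumed to be *jointly* Gaussian with correlation $\rho$, then the mutual information has the closed form $I(X_1, X_2) = -\tfrac{1}{2}\ln(1-\rho^2)$ nats. Note that this depends only on $\rho$, not on the marginal parameters, which is exactly why the marginals alone cannot pin it down. A small sketch (the function name `gaussian_mi` is my own):

```python
import math

def gaussian_mi(rho):
    """Mutual information (in nats) of a bivariate normal with correlation rho.

    I(X1, X2) = -0.5 * ln(1 - rho^2); valid for -1 < rho < 1.
    The marginal means and variances do not appear -- MI is invariant
    to shifting and rescaling each variable separately.
    """
    return -0.5 * math.log(1.0 - rho**2)

# Same marginals in every case below; only the joint dependence changes:
print(gaussian_mi(0.0))   # independent: MI = 0
print(gaussian_mi(0.9))   # strong dependence: MI is large
print(gaussian_mi(0.999)) # near-perfect correlation: MI diverges toward infinity
```

As $\rho \to \pm 1$ the mutual information grows without bound, and at $\rho = 0$ it is exactly zero, so knowing only $\mu_1$, $\mu_2$, $\sigma_1^2$, $\sigma_2^2$ leaves $I(X_1, X_2)$ anywhere in $[0, \infty)$.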