[Math] Divergence between two random variables

Tags: gaussian, it.information-theory, pr.probability

I have two Gaussian random variables $X$ and $Y$, each of which is an estimator of an underlying quantity. I need to measure whether $Y$ is estimating something different from $X$. So if the mean of $Y$ is very different from the mean of $X$, that is good. However, if the variance of $Y$ is also much higher than the variance of $X$, it might just be that $Y$ is a poor estimator and the different mean is not meaningful (if you'll excuse the pun).

So for a fixed $X$, I need a function of $Y$ that increases with $\left|\mu_Y-\mu_X\right|$ but that decreases with $\sigma_Y$ (and is zero if and only if $\sigma_Y = \sigma_X$ and $\mu_Y = \mu_X$). I don't really know whether to call this a measure, a divergence or something else, since it will be negative when $\mu_Y = \mu_X$ and $\sigma_Y > \sigma_X$.

I thought that the Kullback-Leibler divergence might be the way to go but it responds the wrong way to changes in the variance of $Y$. Also, it can't be negative.

Clearly, something like $\left|\mu_Y-\mu_X\right| + \sigma_X - \sigma_Y$ has the above properties, but I'm looking for something a little more grounded in information theory.

Ideally, this would generalize to distributions other than Gaussian.

Best Answer

Let $0<\alpha<1$ (typically $\alpha=0.05$) and choose $\varepsilon>0$ such that

$$\alpha = \mathbb P(\left |X-\mu_X\right | > \varepsilon)$$

Now let

$$f = \mathbb P(\left |Y-\mu_X\right | > \varepsilon) - \alpha$$
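
(The answer doesn't spell this out, but under the Gaussian assumption both quantities have closed forms. With $X \sim \mathcal N(\mu_X,\sigma_X^2)$, $Y \sim \mathcal N(\mu_Y,\sigma_Y^2)$ and $\Phi$ the standard normal CDF,

$$\varepsilon = \sigma_X\,\Phi^{-1}\!\left(1-\tfrac{\alpha}{2}\right), \qquad f = 1 - \Phi\!\left(\frac{\mu_X+\varepsilon-\mu_Y}{\sigma_Y}\right) + \Phi\!\left(\frac{\mu_X-\varepsilon-\mu_Y}{\sigma_Y}\right) - \alpha.)$$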

Then

  1. $f$ is increasing with $|\mu_X-\mu_Y|$,

  2. $f$ is decreasing with $\sigma_Y$, and

  3. $f=0$ when $\mu_X=\mu_Y$ and $\sigma_X=\sigma_Y$.

This approach comes not so much from information theory as from hypothesis testing, in particular the notion of statistical power. Some generalization from the case of the normal distribution seems possible, but note that for (3) we need the distribution of $Y$ to be determined by $\mu_Y$ and $\sigma_Y$.
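
As a quick sanity check, here is a minimal sketch of computing $f$ numerically, assuming SciPy is available and both estimators are Gaussian (the function name `divergence_f` is just an illustrative label, not from the answer):

```python
# Sketch of the statistic f above, assuming X ~ N(mu_x, sigma_x^2)
# and Y ~ N(mu_y, sigma_y^2).
from scipy.stats import norm

def divergence_f(mu_x, sigma_x, mu_y, sigma_y, alpha=0.05):
    # epsilon chosen so that P(|X - mu_X| > epsilon) = alpha
    eps = sigma_x * norm.ppf(1 - alpha / 2)
    # P(|Y - mu_X| > epsilon) for Gaussian Y: mass to the right of
    # mu_X + eps plus mass to the left of mu_X - eps
    p_outside = (norm.sf(mu_x + eps, loc=mu_y, scale=sigma_y)
                 + norm.cdf(mu_x - eps, loc=mu_y, scale=sigma_y))
    return p_outside - alpha

print(divergence_f(0.0, 1.0, 0.0, 1.0))   # identical estimators: 0.0
print(divergence_f(0.0, 1.0, 2.0, 1.0))   # shifted mean: positive
print(divergence_f(0.0, 1.0, 0.0, 0.5))   # same mean, smaller variance: negative
```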