Normal Distribution – Mean Absolute Difference Between Values

mean, normal distribution, standard deviation, variance

I understand that the variance is the mean of the squared deviations from the mean, and that the standard deviation is the square root of the variance.

What, however, is the average difference between values in a normal distribution (ignoring the sign, of course, since with the sign taken into account it would be 0)?

Best Answer

Assume that $X, Y\sim N(\mu,\sigma^2)$ are iid.

Then their difference is $X-Y\sim N(0,2\sigma^2)$: by independence, $\operatorname{Var}(X-Y)=\operatorname{Var}(X)+\operatorname{Var}(Y)=2\sigma^2$. As you write, the expectation of this difference is zero.
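A quick sanity check of this (a sketch I am adding here, with arbitrary assumed values $\mu=5$ and $\sigma=2$) is to simulate the difference and confirm that its mean is near $0$ and its standard deviation is near $\sqrt{2}\,\sigma$:

set.seed(42)
n     <- 1e6
mu    <- 5
sigma <- 2
x <- rnorm(n, mu, sigma)   # X ~ N(mu, sigma^2)
y <- rnorm(n, mu, sigma)   # Y ~ N(mu, sigma^2), independent of X
mean(x - y)                # should be close to 0
sd(x - y)                  # should be close to sqrt(2)*sigma
sqrt(2) * sigma            # about 2.828 for sigma = 2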

And the absolute value of this difference $|X-Y|$ follows a folded normal distribution. Its mean can be found by plugging the mean $0$ and variance $2\sigma^2$ of $X-Y$ into the formula at the Wikipedia page:

$$ \sqrt{2}\sigma\sqrt{\frac{2}{\pi}} = \frac{2\sigma}{\sqrt{\pi}}. $$
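For completeness (this derivation is my addition; the original answer only points to the Wikipedia formula), the folded-normal mean used above follows from a direct integration: for $Z\sim N(0,\tau^2)$,

$$ E|Z| = \int_{-\infty}^{\infty}|z|\,\frac{1}{\tau\sqrt{2\pi}}\,e^{-z^2/(2\tau^2)}\,dz = \frac{2}{\tau\sqrt{2\pi}}\int_0^{\infty} z\,e^{-z^2/(2\tau^2)}\,dz = \frac{2}{\tau\sqrt{2\pi}}\,\tau^2 = \tau\sqrt{\frac{2}{\pi}}, $$

and setting $\tau=\sqrt{2}\,\sigma$ recovers $2\sigma/\sqrt{\pi}$.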

A quick simulation in R is consistent with this:

> nn <- 1e6
> sigma <- 2
> set.seed(1)
> XX <- rnorm(nn,0,sigma)
> YY <- rnorm(nn,0,sigma)
> mean(abs(XX-YY))
[1] 2.257667
> sqrt(2)*sigma*sqrt(2/pi)
[1] 2.256758
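A deterministic cross-check (again my addition, not part of the original answer) is to integrate $|z|$ against the $N(0,2\sigma^2)$ density with R's integrate():

sigma <- 2
# E|X - Y| with X - Y ~ N(0, 2*sigma^2), by numerical integration
integrate(function(z) abs(z) * dnorm(z, mean = 0, sd = sqrt(2) * sigma),
          lower = -Inf, upper = Inf)$value
2 * sigma / sqrt(pi)       # closed form, approximately 2.2568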