[Math] Standardizing A Random Variable That is Normally Distributed

Tags: normal-distribution, probability, random-variables, statistics

To standardize a normally distributed random variable, it makes intuitive sense to subtract the expected value $\mu$ from each value the variable can assume: this shifts the distribution so that its expected value sits at the origin. But what role does dividing by the standard deviation play in the standardization? That part is not as intuitive to me as subtracting $\mu$.

Best Answer

Let $X \sim N(\mu,\sigma^2)$.

Let $Y = \dfrac{X-\mu}{\sigma}$.

Since $\mu$ and $\sigma$ are constants, linearity of expectation gives $E[Y] = \dfrac{E[X] - \mu}{\sigma} = \dfrac{\mu-\mu}{\sigma} = 0$.

And since $\text{Var}(aX+b) = a^2\,\text{Var}(X)$ for constants $a$ and $b$, we get $\text{Var}(Y) = \dfrac{1}{\sigma^2}\text{Var}(X) = \dfrac{1}{\sigma^2}\sigma^2 = 1$.

Because a linear transformation of a normal random variable is again normal, it follows that $Y \sim N(0,1)$.

This is precisely why we subtract the mean and divide by the standard deviation: subtracting $\mu$ re-centers the distribution at $0$, and dividing by $\sigma$ rescales its spread so that the variance becomes $1$.
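
A quick numerical sanity check may help; here is a minimal sketch using NumPy, with arbitrary illustrative values $\mu = 5$ and $\sigma = 2$ (these parameters are assumptions for the example, not from the question). Drawing samples from $N(\mu, \sigma^2)$ and applying $Y = (X - \mu)/\sigma$ should yield a sample mean near $0$ and a sample standard deviation near $1$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) parameters: X ~ N(mu, sigma^2)
mu, sigma = 5.0, 2.0
x = rng.normal(loc=mu, scale=sigma, size=100_000)

# Standardize: Y = (X - mu) / sigma
y = (x - mu) / sigma

print(f"mean of Y: {y.mean():.4f}")  # close to 0
print(f"std  of Y: {y.std():.4f}")   # close to 1
```

With a large sample, the printed values will be approximately $0$ and $1$, matching the derivation above.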