[Math] What does it mean to divide by the standard deviation

image processing, standard deviation, statistics

I'm trying to "variance-normalize" an image. To do so, I subtract the mean from each pixel value (to get a mean of $0$) and divide by the standard deviation (to get unit variance), right? But I've also seen the division by the standard deviation done on its own, without subtracting the mean first. Since that's obviously not the same thing, what does it do?

I'm also a bit confused about the values I get after normalization: some of them are still greater than $1.0$ or less than $-1.0$. I thought unit variance meant the values would range from $-1.0$ to $1.0$.
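For concreteness, here is a minimal NumPy sketch of what I'm doing (the array `img` is just a made-up placeholder image):

```python
import numpy as np

# Made-up grayscale image, only for illustration.
img = np.random.randint(0, 256, size=(64, 64)).astype(np.float64)

# Variance normalization as I understand it:
# subtract the mean, then divide by the standard deviation.
normalized = (img - img.mean()) / img.std()

print(normalized.mean())                   # ~0
print(normalized.std())                    # ~1
print(normalized.min(), normalized.max())  # typically well outside [-1, 1]
```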

Best Answer

If you divide all the values by the standard deviation, then you will have a distribution with a standard deviation equal to $1$ (and so a variance equal to $1^2 = 1$). The difference is that the mean is not $0$, unless it was $0$ originally.
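As a quick illustrative sketch (NumPy, with made-up data), compare dividing by the standard deviation alone with subtracting the mean first:

```python
import numpy as np

# Made-up data with mean 5 and standard deviation 2.
x = np.random.normal(loc=5.0, scale=2.0, size=100_000)

scaled = x / x.std()                     # divide by the standard deviation only
standardized = (x - x.mean()) / x.std()  # subtract the mean, then divide

print(scaled.mean(), scaled.std())              # mean ~2.5 (= 5/2), sd ~1
print(standardized.mean(), standardized.std())  # mean ~0, sd ~1
```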

You seem to be confusing the variance with (half) the range. The range is the difference between the minimum possible value and the maximum possible value. The variance is the expected value of the square of the differences between the actual values and the mean of the distribution, and the standard deviation is the square root of the variance.
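In symbols, these are just the standard definitions: for a random variable $X$ with mean $\mu = E[X]$,

$$\operatorname{Var}(X) = E\big[(X-\mu)^2\big], \qquad \sigma = \sqrt{\operatorname{Var}(X)},$$

whereas the range is $\max(X) - \min(X)$, a different quantity altogether.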

Except in a special case, the range will be more than twice the standard deviation: the special case is where the distribution only takes two values, and takes those with equal probability. So if you have a standardized distribution with mean $0$ and standard deviation $1$ then you will usually find some values outside the interval $[-1,1]$: for a standard normal distribution, there is a probability over 31% that a value will be outside that interval.
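If you want to check that figure, here is a short sketch using SciPy (assuming it is available):

```python
from scipy.stats import norm

# Probability that a standard normal value lies outside [-1, 1]:
# P(|Z| > 1) = 2 * P(Z < -1) by symmetry.
p_outside = 2 * norm.cdf(-1)
print(p_outside)  # ~0.3173, i.e. a bit over 31%
```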
