Solved – the difference between “scale parameter” and the variance

descriptive-statistics, distributions, moments, scale-estimator, variance

I would like to understand the difference between the "scale parameter" and the variance of a distribution. I have found that the "scale parameter", which scales the width of a distribution, is mentioned when speaking about nonparametric methods, while the variance and moments come up when speaking about parametric methods. Sometimes this is also referred to as dispersion. When the dispersion grows, the scale grows, and so does the variance. I understand that dispersion is a general concept, so presumably the scale parameter and the variance are each particular measures of dispersion and scale?

When I read the documentation of statistical methods for comparing variances, I see that scale is somehow equated with the variance. But when I run both a test for equality of variances and a test for equality of scales on the same data, I often get contradictory results, as in the linked example. I am asking in the context of this question: Why do we need F test of two variances if we have the Ansari-Bradley test? Knowing the answer, I could probably answer that linked question.
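For concreteness, here is a minimal sketch (with simulated data, not the asker's actual data) of running both tests on the same two samples in Python. The F test is computed by hand from the sample variances, since it assumes normality; the Ansari-Bradley test is the rank-based scale test available in `scipy.stats.ansari`.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
x = rng.normal(loc=0.0, scale=1.0, size=50)
y = rng.normal(loc=0.0, scale=1.5, size=50)

# F test of equality of two variances (parametric, assumes normality)
f_stat = np.var(x, ddof=1) / np.var(y, ddof=1)
dfx, dfy = len(x) - 1, len(y) - 1
p_f = 2 * min(stats.f.cdf(f_stat, dfx, dfy), stats.f.sf(f_stat, dfx, dfy))

# Ansari-Bradley test of equal scale (nonparametric, rank-based)
ab_stat, p_ab = stats.ansari(x, y)

print(f"F test:          F = {f_stat:.3f}, p = {p_f:.3f}")
print(f"Ansari-Bradley:  AB = {ab_stat:.3f}, p = {p_ab:.3f}")
```

Because the two tests target different quantities (the second central moment vs. a rank-based notion of scale), their p-values can disagree on the same data, which is exactly the discrepancy described above.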

Best Answer

"Variance" has a definite meaning. Variance always means the second central moment, and when we estimate or test the variance, we are estimating or testing this quantity.

"Scale" is more general. It refers to the spread of the data in some way, without committing to the second central moment in particular. After all, the second central moment might not even exist!
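A small illustration of that last point (my own addition, not part of the original answer): the Cauchy distribution has a perfectly well-defined scale parameter, yet it has no finite variance, so the sample variance never settles down as the sample grows.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
for n in (10**2, 10**4, 10**6):
    # Cauchy with location 0 and scale 1: scale exists, variance does not
    sample = stats.cauchy.rvs(loc=0, scale=1, size=n, random_state=rng)
    print(n, np.var(sample, ddof=1))  # keeps jumping around instead of converging
```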

I like the definition I’m seeing on Wikipedia for a scale parameter $s$ (and other parameters $\theta$):

$$F(x; s, \theta)=F(x/s;1,\theta)$$

So if dividing the argument by some $s$ stretches or compresses the CDF into a standardized CDF, we call $s$ a scale parameter. It may be related to the variance, but it need not be.
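As a quick numerical check of that definition (an illustration using scipy's parametrization, not something from the original answer), $F(x; s, \theta) = F(x/s; 1, \theta)$ holds for the gamma distribution with scale $s$ and shape $\theta = a$; there the variance happens to be $a s^2$, whereas for a Cauchy distribution with the same kind of scale parameter no variance exists at all.

```python
import numpy as np
from scipy import stats

x = np.linspace(0.1, 10, 50)
s = 2.5   # candidate scale parameter
a = 3.0   # gamma shape parameter (the "other" parameter theta)

lhs = stats.gamma.cdf(x, a, scale=s)      # F(x; s, theta)
rhs = stats.gamma.cdf(x / s, a, scale=1)  # F(x/s; 1, theta)
print(np.allclose(lhs, rhs))              # True: s is a scale parameter

# For the gamma distribution the variance is a * s**2, so scale and
# variance are related here -- but that link is not guaranteed in general.
print(stats.gamma.var(a, scale=s), a * s**2)
```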

https://en.wikipedia.org/wiki/Scale_parameter