Solved – Is variance a more fundamental concept than standard deviation?


On this psychometrics website I read that

[A]t a deep level variance is a more fundamental concept than the
standard deviation.

The site doesn't really explain further why variance is meant to be more fundamental than standard deviation, but it reminded me that I've read some similar things on this site.

For instance, in this comment @kjetil-b-halvorsen writes that "standard deviation is good for interpretation, reporting. For developing the theory the variance is better".

I sense that these claims are linked, but I don't really understand them. I understand that the square root of the sample variance isn't an unbiased estimator of the population standard deviation, but surely there must be more to it than that.

Maybe the term "fundamental" is too vague for this site. In that case, perhaps we can operationalize my question as asking whether variance is more important than standard deviation from the viewpoint of developing statistical theory. Why/why not?

Best Answer

Robert's and Bey's answers do give part of the story (i.e. moments tend to be regarded as basic properties of distributions, and conventionally standard deviation is defined in terms of the second central moment rather than the other way around), but the extent to which those things are really fundamental depends partly on what we mean by the term.
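For reference, the conventional definitions being discussed are

$$\operatorname{Var}(X) = E\big[(X - \mu)^2\big], \qquad \operatorname{sd}(X) = \sqrt{\operatorname{Var}(X)},$$

i.e. the variance is the second central moment, and the standard deviation is derived from it rather than the other way around.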

There would be no insurmountable problem, for example, if our conventions went the other way -- there's nothing stopping us conventionally defining some other sequence of quantities in place of the usual central moments, say $E[(X-\mu)^p]^{1/p}$ for $p = 1, 2, 3, \ldots$ (note that the $p = 2$ term of this sequence is exactly the population standard deviation) and then defining moments -- and all manner of calculations in relation to moments -- in terms of them. Note that these quantities are all measured in the original units, which is one advantage over moments (which are in $p$-th powers of the original units, and so harder to interpret). This would make the population standard deviation the primitive quantity, with the variance defined in terms of it.
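As a quick numerical illustration (my own sketch, with made-up names and simulated data, not something from the quoted sources), the $p = 2$ term of this alternative sequence is exactly the usual population-style standard deviation:

```python
import numpy as np

# Sketch: estimate E[(X - mu)^p]^(1/p) from a sample and check that
# the p = 2 term coincides with the usual standard deviation.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)

def central_root_moment(x, p):
    """Estimate E[(X - mu)^p]^(1/p). For odd p the central moment
    can be negative, so take the signed p-th root."""
    m = np.mean((x - x.mean()) ** p)
    return np.sign(m) * np.abs(m) ** (1.0 / p)

for p in (2, 3, 4):
    # Each value is in the original units of x, whatever p is.
    print(p, central_root_moment(x, p))

print("np.std(x):", np.std(x))  # matches the p = 2 line above
```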

However, it would make quantities like the moment generating function (or some equivalent relating to the new quantities defined above) rather less "natural", which would make things a little more awkward (but some conventions are a bit like that). There are some convenient properties of the MGF that would not be as convenient cast the other way.
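To give one example of such a property (standard theory, not something taken from the quoted site): for independent $X$ and $Y$ the MGF of a sum factors,

$$M_{X+Y}(t) = E\left[e^{t(X+Y)}\right] = E\left[e^{tX}\right]E\left[e^{tY}\right] = M_X(t)\,M_Y(t),$$

and the moments appear directly as its Taylor coefficients, $M_X(t) = \sum_{p \ge 0} E[X^p]\,t^p/p!$ -- in both statements it is the moments themselves, not their $p$-th roots, that arise naturally.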

More basic, to my mind (but related to the above), is that a number of basic properties of dispersion are more conveniently expressed in terms of variance than in terms of standard deviation (e.g. the variance of a sum of independent random variables is the sum of the variances).
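Written out for independent $X$ and $Y$ (a standard identity, not specific to the sources above):

$$\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y), \qquad \operatorname{sd}(X+Y) = \sqrt{\operatorname{sd}(X)^2 + \operatorname{sd}(Y)^2}.$$

The standard-deviation version forces a detour back through the squares, which is the sense in which variance is the more convenient carrier of the property.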

This additivity is a property that's not shared by other measures of dispersion, and it has a number of important consequences -- it is, for example, what gives the variance of a mean of $n$ independent observations its $\sigma^2/n$ form, and hence the standard error its $\sigma/\sqrt{n}$ form.

[There are similar relationships between the other cumulants, so this is a sense in which we might want to define things in relation to moments more generally.]
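To spell that aside out (again, standard theory rather than anything from the quoted sources): the cumulant generating function is $K_X(t) = \log M_X(t)$, and for independent $X$ and $Y$

$$K_{X+Y}(t) = K_X(t) + K_Y(t),$$

so every cumulant -- the variance is the second -- adds across independent summands, a relation that has no analogue for the $p$-th roots.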

All of these reasons are arguably either convention or convenience, but to some extent it's a matter of viewpoint (e.g. from some points of view moments are pretty important quantities, from others they're not all that important). It may be that the "at a deep level" bit is intended to imply nothing more than kjetil's "when developing the theory".

I would agree with kjetil's point that you raised in your question; to some extent this answer is merely a hand-wavy discussion of it.