What is to the median as the first central moment is to the mean?

probability, probability-distributions, statistics

The question sounds like a riddle, but it isn't intended to be one.

I've been thinking about the Cauchy distribution, which famously has no central moments defined. A very informal justification is that the Cauchy describes the tangent of a uniformly distributed angle: as that angle approaches $\pm90^\circ$, the tangent shoots off to infinity… hence, if we were to attempt to calculate the mean, its value would swing towards $\pm\infty$ very easily. Essentially, rather than summarising the data set as a whole, the mean would identify only whether the samples happened to be biased slightly towards positive or negative values.
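To make that instability concrete, here is a minimal sketch (assuming NumPy; the seed and sample size are arbitrary) comparing the running sample mean with the sample median for standard Cauchy draws. The mean keeps jumping whenever a rare huge value arrives, while the median settles near zero:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_cauchy(100_000)      # draws from the standard Cauchy distribution
n = np.arange(1, x.size + 1)

# Running sample mean: gets yanked around by occasional enormous samples and never settles.
running_mean = np.cumsum(x) / n

for k in (100, 1_000, 10_000, 100_000):
    print(f"n={k:>7}  running mean={running_mean[k - 1]:>10.3f}  median={np.median(x[:k]):>7.3f}")
```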

An obvious way to estimate a 'typical' value would be to calculate the median, which avoids the outlying data points overwhelming the summary. This single scalar summary, analogous to the mean, gives a more reasonable estimate of 'expected' value in such circumstances. Is it common to extend this kind of analysis with measures analogous to variance, skew and kurtosis, to better describe the distribution? If so, how are these concepts commonly defined?

UPDATE: Many thanks for the pointer to MAD… that's definitely relevant. While I wasn't clear about this previously, central moments appealed because they generate a progression of values, each further refining the description of a distribution… and I had really hoped to do something similar for systems where the empirical mean and standard deviation can't be trusted to give a meaningful summary.

Best Answer

There are various measures of variability that fit your description. One that is popular in some fields is the inter-quartile range.

Another one is median absolute deviation. Charmingly, the standard acronym for this is MAD.
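For illustration, here is a small sketch (again assuming NumPy, and using the plain, unscaled definition of MAD rather than any library's rescaled version) computing both measures for a heavy-tailed sample where the empirical standard deviation would be meaningless:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_cauchy(100_000)    # heavy-tailed sample with no finite variance

# Inter-quartile range: distance between the 25th and 75th percentiles.
q25, q75 = np.percentile(x, [25, 75])
iqr = q75 - q25

# Median absolute deviation: median distance of the sample from its own median.
mad = np.median(np.abs(x - np.median(x)))

print(f"IQR ≈ {iqr:.3f}   (population value for the standard Cauchy is 2)")
print(f"MAD ≈ {mad:.3f}   (population value for the standard Cauchy is 1)")
```

Both are finite and stable here, even though the variance does not exist.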
