Solved – Min and max range from standard deviation

Tags: extreme value, range, standard deviation

If I have 2 data points like

16.70 (+/-4.33)
54.70 (+/-16.30)

where 16.70 is the mean and +/- 4.33 is the standard deviation, is it possible to calculate a range from that? I would think that, roughly, the minimum here would be 16.70 - 4.33 and the maximum 54.70 + 16.30, so the range would be:

(12.37 - 71.00)
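
For what it's worth, here is a minimal Python sketch of exactly that calculation, treating each value as mean ± k standard deviations and taking the overall minimum and maximum. The choice k = 1 and the function name `naive_range` are just for illustration, not a standard convention:

```python
def naive_range(points, k=1.0):
    """points: list of (mean, sd) tuples; returns (low, high) as mean -/+ k*sd."""
    lows = [m - k * s for m, s in points]
    highs = [m + k * s for m, s in points]
    return min(lows), max(highs)

# The two values from the question:
print(naive_range([(16.70, 4.33), (54.70, 16.30)]))  # approximately (12.37, 71.00)
```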

But I can't find a source to confirm or refute this. All I can find are examples of how to calculate a standard deviation from a range, whereas I am looking for the opposite.
Is there a "standard" way to get a range from two means and two standard deviations?

Best Answer

Assume the population from which these data points are taken is normal (mean $\mu$, variance $\sigma^2$). I think your question is ill-posed because there is no population parameter that you can call "the" range here, and there is no use in trying to estimate it. Think of it this way: as the number of data points becomes very large, the sample standard deviation will converge to $\sigma$, but the sample range will grow without bound. So it is meaningful to talk about the standard deviation of the population, but not about its range. The "range" is a statistic, a random quantity whose distribution keeps changing with the sample size (shifting to larger and larger values). The standard deviation $\sigma$, on the other hand, is a fixed population parameter that you can estimate from any given sample.
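
To see this numerically, here is a small simulation sketch (assuming a standard normal population, so $\sigma = 1$; the seed and sample sizes are arbitrary). The sample standard deviation settles near 1 while the sample range keeps increasing:

```python
import random
import statistics

random.seed(0)
for n in (10, 100, 1_000, 10_000, 100_000):
    sample = [random.gauss(0, 1) for _ in range(n)]
    sd = statistics.stdev(sample)          # stabilizes near sigma = 1
    rng = max(sample) - min(sample)        # keeps growing with n
    print(f"n={n:>6}  sd={sd:.3f}  range={rng:.3f}")
```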

This is why you will find methods for estimating $\sigma$ from the range of a sample (38.00 in your case, taking the two means as a sample of size 2), but not the other way around. These methods usually assume the population is normal. If it is not, you need to apply results from order statistics (Tippett's integrals, ...).
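
As an illustration of that usual direction (range $\to \sigma$), here is a sketch using the standard tabulated $d_2$ constants for small normal samples. Treating the two reported means as a sample of size 2 is purely illustrative:

```python
# Standard d2 constants (expected range of n standard normal observations).
D2 = {2: 1.128, 3: 1.693, 4: 2.059, 5: 2.326}

def sigma_from_range(sample_range, n):
    """Estimate sigma as R / d2(n); valid for samples from a normal population."""
    return sample_range / D2[n]

# The two means 16.70 and 54.70 span a range of 38.00:
print(sigma_from_range(54.70 - 16.70, n=2))  # roughly 33.7
```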

In quality engineering, for example, Shewhart control charts still customarily use the sample range to estimate $\sigma$, even though this is somewhat less efficient than using the sample standard deviation directly.
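
For concreteness, here is a sketch of how the range enters an X-bar chart: the control limits are built from the average subgroup range $\bar{R}$ via the tabulated $A_2$ factor (equal to $3/(d_2\sqrt{n})$). The subgroups below are made-up toy data:

```python
A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}  # standard tabulated factors

def xbar_chart_limits(subgroups):
    """Lower/upper control limits: grand mean -/+ A2 * average subgroup range."""
    n = len(subgroups[0])
    xbar_bar = sum(sum(g) / n for g in subgroups) / len(subgroups)
    r_bar = sum(max(g) - min(g) for g in subgroups) / len(subgroups)
    return xbar_bar - A2[n] * r_bar, xbar_bar + A2[n] * r_bar

subgroups = [[9.8, 10.1, 10.0, 9.9, 10.2],
             [10.3, 9.7, 10.0, 10.1, 9.9],
             [9.9, 10.0, 10.2, 9.8, 10.1]]
print(xbar_chart_limits(subgroups))
```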