[Math] Can we calculate the range from the mean and standard deviation in a normal distribution


Suppose in a normal distribution the mean is 90 and the standard deviation is 10. Then what is the range?
Would the following be an acceptable way to find the range, where $\sigma$ represents the standard deviation and $\mu$ represents the mean? $$\text{Range} = (\mu + 3\sigma) - (\mu - 3\sigma) = 6\sigma$$

Best Answer

The mean and standard deviation specify a probability density, which is continuous and has unbounded support. That means you can't make discrete claims (like saying "there are exactly $x$ elements greater than some cutoff $n$") on the basis of the mean and standard deviation alone. In particular, you can't say with certainty that there are no values below some lower bound $a$ or above some upper bound $b$, so you can't assign finite values to the interval $[a, b]$ over which your range would be defined.

What the mean and standard deviation of a normal distribution do let you do is make claims about the relative probability of a value falling at any point along the distribution. So you could pick some cutoff in units of standard deviations (say, $3\sigma$) and note that, by the empirical rule, about $99.7\%$ of the probability mass lies within $\mu \pm 3\sigma$.
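As a quick check of that figure, here is a minimal sketch (using only Python's standard library, with the question's values $\mu = 90$, $\sigma = 10$) that computes the probability mass inside $\mu \pm 3\sigma$ from the normal CDF expressed via the error function:

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2), written in terms of the error function."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

mu, sigma = 90, 10

# Mass between mu - 3*sigma and mu + 3*sigma
p = normal_cdf(mu + 3 * sigma, mu, sigma) - normal_cdf(mu - 3 * sigma, mu, sigma)
print(round(p, 4))  # -> 0.9973
```

Note that the answer is the same for any $\mu$ and $\sigma$, since the bounds are expressed in standard deviations; the remaining $0.27\%$ of the mass lies outside the interval, which is why a hard range can never be pinned down.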