[Math] Why is the standard deviation approximated by the range rule of thumb so different from the standard deviation calculated directly from sample data?

statistics

What makes the standard deviation approximated by the range rule of thumb so different from the standard deviation calculated directly from the sample data?
For example, if the approximate standard deviation is 15.36 and the standard deviation calculated directly from the sample is 21, is it possible that the way the data were collected is not valid?

Suppose that the following data are test scores from one physics class in college.
26, 26, 39, 39, 42, 56, 65, 66, 76, 79, 80, 86, 87

The mean is 59 and the sample standard deviation is about 22.4. However, by applying the range rule of thumb, I get (87 - 26)/4 = 15.25.
Why is the difference so large?
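For reference, here is a quick check of both numbers in Python (my own illustration, not part of the original question). It assumes the usual form of the rule, range divided by 4, and uses statistics.stdev, which divides by n - 1.

import statistics

scores = [26, 26, 39, 39, 42, 56, 65, 66, 76, 79, 80, 86, 87]

mean = statistics.mean(scores)                    # arithmetic mean
sample_sd = statistics.stdev(scores)              # sample standard deviation (n - 1 denominator)
range_rule_sd = (max(scores) - min(scores)) / 4   # range rule of thumb: range / 4

print(f"mean      = {mean:.2f}")
print(f"sample sd = {sample_sd:.2f}")
print(f"range/4   = {range_rule_sd:.2f}")

This reproduces the numbers above: a mean of 59, a sample standard deviation of about 22.4, and a range-rule estimate of 15.25.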

Also, in general, do college test scores follow a normal distribution?

Thanks in advance

Best Answer

The range rule of thumb is extremely crude and should only be used when you are doing expert elicitation, where you don't actually have the data, but an expert knows the range pretty well. Since you have data, you should use the sample standard deviation.
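One way to see just how crude it is: the rule implicitly assumes the sample range spans about four standard deviations (for roughly normal data, about 95% of values fall within two standard deviations of the mean), and how many standard deviations the range actually spans depends strongly on the sample size and the shape of the data. A small simulation sketch (my own, not from the answer or the book) makes this visible:

import random
import statistics

random.seed(0)
MU, SIGMA = 60, 20   # arbitrary "true" mean and standard deviation

for n in (10, 30, 100, 1000):
    sample = [random.gauss(MU, SIGMA) for _ in range(n)]
    sd = statistics.stdev(sample)               # direct sample estimate (n - 1 denominator)
    rng4 = (max(sample) - min(sample)) / 4      # range rule of thumb
    print(f"n = {n:4d}   sample sd = {sd:5.1f}   range/4 = {rng4:5.1f}")

Even with normal data and the true standard deviation fixed at 20, range/4 tends to run low for small samples and high for large ones, because the expected range grows with the sample size while the standard deviation does not.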

If you want more information on general rules of thumb, I suggest the book Statistical Rules of Thumb by Gerald van Belle; the free preview of chapter 2 (p. 10 of the PDF) covers your specific issue.

The answer to your second question is NO. In general, they do not have to be normal.
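If you want to check a specific class's scores rather than rely on a general rule, one common option is a normality test such as Shapiro-Wilk (illustrated below with scipy; this is my addition, not part of the answer). With only 13 observations the test has very little power, so treat it as a rough sanity check.

from scipy import stats

scores = [26, 26, 39, 39, 42, 56, 65, 66, 76, 79, 80, 86, 87]

stat, p = stats.shapiro(scores)   # Shapiro-Wilk test of normality
print(f"W = {stat:.3f}, p-value = {p:.3f}")
# A small p-value (say, below 0.05) suggests a departure from normality;
# a large one does not prove normality, especially with so few scores.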