[Math] Estimating the standard deviation by simply looking at a histogram

Tags: estimation, normal distribution, standard deviation

I would like to make a quick, rough estimate of the standard deviation.

Suppose I have the following histogram:
[histogram of the dataset]

By simply looking at it, I can say that the mean is around 10 or 9.8 (the middle value), which, when calculated from my dataset, is actually 9.98.

I understand that the standard deviation is a measure that is used to quantify the amount of variation or dispersion of a set of data values.

How can I estimate the standard deviation by simply looking at the histogram?

Best Answer

It obviously depends on the distribution, but if we assume that the distribution at hand is fairly normal, the full width at half maximum (FWHM) is easy to eyeball, and, as stated in the given link, it relates to the standard deviation $\sigma$ as $$\mathrm{FWHM} \approx 2.36\,\sigma$$ for a normal distribution.
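
For reference, the constant is not arbitrary: it comes from solving for the points where the normal density drops to half its peak value.

$$
\exp\!\left(-\frac{x^2}{2\sigma^2}\right) = \frac{1}{2}
\;\Longrightarrow\;
x = \sigma\sqrt{2\ln 2},
\qquad
\mathrm{FWHM} = 2\sigma\sqrt{2\ln 2} \approx 2.355\,\sigma,
$$

which is usually rounded to $2.35$ or $2.36$.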

Edit: Let's try to apply this to your distribution. I'd say that the full maximum of your distribution is around 0.08, so the half maximum is 0.04. Now all we need to figure out is the width of the distribution at that height, which I'd say is approximately 10. Using the formula above, we find that $$\sigma\approx \frac{10}{2.36}\approx 4.24.$$
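
If you want to check the eyeball estimate numerically, here is a minimal Python sketch of the same FWHM procedure. It uses synthetic normal data (the original dataset isn't available), so the numbers are illustrative only:

```python
import numpy as np

# Hypothetical sample standing in for the original data: mean ~10, sigma = 4.
rng = np.random.default_rng(0)
data = rng.normal(loc=10, scale=4, size=10_000)

# Build a density-normalised histogram, like the one in the question.
counts, edges = np.histogram(data, bins=40, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Full width at half maximum: span of bin centers whose height is
# at least half the peak height.
half_max = counts.max() / 2
above = centers[counts >= half_max]
fwhm = above.max() - above.min()

# FWHM ≈ 2.36 * sigma for a roughly normal distribution.
sigma_est = fwhm / 2.36
print(f"FWHM ≈ {fwhm:.2f}, estimated sigma ≈ {sigma_est:.2f} (true sigma = 4)")
```

The estimate is crude (it depends on the binning), but for a roughly bell-shaped histogram it should land close to the true standard deviation.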