Solved – Confidence intervals vs. standard deviation

Tags: confidence interval, standard deviation

The 95% confidence interval gives you a range.

An interval of ±2 standard deviations (2 sigma) also covers roughly 95% of the data.

Can someone shed some light on how they are different?

Best Answer

There are two things here:

  1. The "2 sigma rule" where sigma refers to standard deviation is a way to construct tolerance intervals for normally distributed data, not confidence intervals (see this link to learn about the difference). Said shortly, tolerance intervals refer to the distribution inside the population, whereas confidence intervals refer to a degree of certainty regarding an estimation.

  2. If you meant standard error rather than standard deviation (which is how I first read the question), then the "2 sigma rule" does give an approximate 95% confidence interval, provided the sampling distribution of the mean is approximately normal (for example, when the conditions of the Central Limit Theorem apply and the sample size is large enough). A simulation sketch below illustrates both points.
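
To make the contrast concrete, here is a minimal simulation sketch in Python with numpy. The setup (a normal population with known mean and standard deviation, the variable names, and the sample sizes) is purely illustrative and not part of the original answer; it just shows that mean ± 2 SD describes the spread of individual observations, while mean ± 2 SE captures the true mean in roughly 95% of repeated samples.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 10.0, 2.0      # assumed "true" population mean and standard deviation
n, reps = 50, 10_000       # sample size and number of simulated samples

ci_covers_mean = 0         # how often mean ± 2*SE captures the true mean mu
frac_inside_2sd = []       # fraction of observations inside mean ± 2*SD, per sample

for _ in range(reps):
    x = rng.normal(mu, sigma, n)
    m, sd = x.mean(), x.std(ddof=1)
    se = sd / np.sqrt(n)

    # Confidence interval: a statement about the estimate of the mean.
    if m - 2 * se <= mu <= m + 2 * se:
        ci_covers_mean += 1

    # "2 sigma" interval: a statement about the spread of individual values.
    frac_inside_2sd.append(np.mean((x > m - 2 * sd) & (x < m + 2 * sd)))

print(f"CI coverage of the true mean:       {ci_covers_mean / reps:.3f}")   # ~0.95
print(f"Data inside mean ± 2 SD (average):  {np.mean(frac_inside_2sd):.3f}") # ~0.95
```

Both numbers come out near 0.95, but they answer different questions: the first is about how often the procedure traps the unknown mean, the second about how much of the population falls inside the ±2 SD band.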