Solved – Can a confidence interval be greater than 1

Tags: accuracy, classification, confidence interval, standard deviation

I am doing a classification task and obtained an accuracy of 97.5%. I then calculated a confidence interval at the 95% confidence level, assuming a normal distribution, as Accuracy +/- 1.96 * Standard Deviation (see Alex A. Freitas, Data Mining and Knowledge Discovery with Evolutionary Algorithms, 2002). This gives 0.975 +/- 0.048. Since the upper bound would be greater than 1, can that be right?

Best Answer

It sounds like you are using the normal approximation interval, which is never optimal and is especially unsuited to proportions close to 0 or 1 (such as 97.5%).
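A minimal sketch of why the reported interval spills past 1. The question does not state the test-set size, so n = 40 below is an assumption, chosen only because it reproduces the reported margin of about +/- 0.048:

```python
import math

def normal_approx_interval(p_hat, n, z=1.96):
    """Wald (normal approximation) interval for a binomial proportion."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# n = 40 is a guess, not from the question; it roughly matches +/- 0.048.
lo, hi = normal_approx_interval(0.975, 40)
print(lo, hi)  # the upper bound exceeds 1 -- an impossible probability
```

The formula happily extends the interval symmetrically in both directions, even when one side runs past the [0, 1] range a probability must stay in.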

Look at the following graph.

[Figure: histograms of samples from a binomial distribution with n = 100 and p = 0.5 and p = 0.975, respectively]

For the first histogram a normal approximation would work fairly well. In the second case the distribution has considerable skew, which makes the normal approximation inappropriate.
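The skew in the second histogram can be reproduced with a small simulation; the counts and sample sizes below are illustrative, not from the question:

```python
import random

random.seed(0)
n, draws = 100, 20_000  # n trials per experiment, `draws` repeated experiments

def sample_skewness(p):
    """Simulate binomial(n, p) counts and return their sample skewness."""
    counts = [sum(random.random() < p for _ in range(n)) for _ in range(draws)]
    mean = sum(counts) / draws
    var = sum((x - mean) ** 2 for x in counts) / draws
    return sum((x - mean) ** 3 for x in counts) / (draws * var ** 1.5)

# Near p = 0.5 the distribution is roughly symmetric (skewness near 0);
# near p = 0.975 it is strongly left-skewed, which is what breaks the
# symmetric normal approximation.
print(f"skewness at p = 0.5:   {sample_skewness(0.5):.2f}")
print(f"skewness at p = 0.975: {sample_skewness(0.975):.2f}")
```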

In either case, there is no need to rely on a normal approximation: for binomial proportions, more exact confidence intervals can be derived (in contrast to some more complex statistics, where a normal approximation is sometimes unavoidable). Better options for constructing confidence intervals for binomial proportions are described in the link above as well.
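One such better option is the Wilson score interval, whose bounds always stay inside [0, 1]. A sketch, again assuming n = 40 (a guess to match the question's margin, since the test-set size is not stated):

```python
import math

def wilson_interval(p_hat, n, z=1.96):
    """Wilson score interval for a binomial proportion.

    Unlike the Wald/normal approximation, the bounds are always in [0, 1]
    and the interval remains sensible for p_hat near 0 or 1.
    """
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# n = 40 is assumed, not stated in the question.
lo, hi = wilson_interval(0.975, 40)
print(lo, hi)  # both bounds lie within [0, 1]
```

Statistics libraries expose such intervals ready-made, e.g. statsmodels' proportion_confint with method='wilson' or the "exact" Clopper-Pearson method='beta'.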
