Solved – the difference between confidence intervals and precision

confidence interval, precision

I'm studying to be an auditor (CIA exam), but I do not have a statistics background, and I'm very confused about the difference between confidence intervals and precision, and how both relate to the confidence level. I know that as the confidence level is increased (e.g., from 90% to 95%), the confidence interval widens. My exam material states, "in terms of a stated confidence level, precision is the range into which an estimate of population characteristic is expected to fall. Based on a random sample, it is estimated that 4%, plus or minus 2%, of a firm's invoices contain errors. The plus or minus 2% is the estimate's precision". This sounds almost exactly like what a confidence interval is to me, yet the terms are used separately and seem to have different relationships with the confidence level: "when a confidence level is changed from 95% to 99% and no change in sample standard deviation takes place, the sample size would be larger but achieved precision would not change".

Thank you so much!

Best Answer

In statistics, precision usually refers to the reciprocal of the variance, though another common definition treats it as the standard error of an estimate. In your exam material's usage, precision is the half-width of the interval: the plus-or-minus 2% part, so the resulting interval runs from 2% to 6%. Confidence intervals are different: they are statistical intervals constructed so that, in repeated sampling, the true parameter falls inside the interval $(1-\alpha)\cdot 100\%$ of the time, where $1-\alpha$ is the confidence level.
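To make the exam's point concrete, here is a minimal sketch (using the standard normal approximation for a proportion, with assumed two-sided z critical values) showing that holding precision fixed at plus-or-minus 2% while raising the confidence level from 95% to 99% forces a larger sample size:

```python
import math

# Assumed two-sided z critical values for the normal approximation
Z_95 = 1.959964
Z_99 = 2.575829

def margin_of_error(p_hat, n, z):
    """Half-width ('precision', the plus-or-minus part) of a
    normal-approximation confidence interval for a proportion."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

def sample_size_for(p_hat, target_moe, z):
    """Smallest n that achieves the target half-width at the given z."""
    return math.ceil(z**2 * p_hat * (1 - p_hat) / target_moe**2)

p_hat = 0.04   # estimated 4% of invoices contain errors
target = 0.02  # desired precision: plus or minus 2%

n95 = sample_size_for(p_hat, target, Z_95)
n99 = sample_size_for(p_hat, target, Z_99)
print(n95, n99)  # the 99% level needs a larger sample for the same precision
```

The half-width formula $z\sqrt{\hat{p}(1-\hat{p})/n}$ shows why: a higher confidence level means a larger $z$, so $n$ must grow (by roughly the square of the ratio of the z values) to keep the plus-or-minus amount unchanged.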
