[Math] What do glassware accuracy and precision (tolerance) mean statistically?

almost-everywhere, probability, statistics

According to Daniel C. Harris (Table 2-5, 9th edition), a 25 mL volumetric pipet has an accuracy of $\pm 0.06$ mL and a precision of $\pm 0.02$ mL.

I'm trying to understand what these values mean from a statistical definition:

1.) a.) Does the "accuracy" value of 0.06 mL indicate a tolerance interval? If so, what are the confidence level and tolerance level (is 99% assumed for both?) of that tolerance interval? In other words, if 100 experiments of 100 trials each were done using that pipet, would 99 of those 100 experiments have at least 99 trials with results between 24.94 and 25.06 mL?

1.) b.) Or is "accuracy" simply the TOLERANCE (the permissible range of variation from the nominal value) of the pipet? If so, how is tolerance for glassware determined? Is it arbitrary, or is it based on the error inherent in the production process (such as the mold used to shape the molten glass)?

2.) Does the "precision" value of 0.02 mL indicate the population standard deviation? In other words, is it saying that about 68% (the fraction within one standard deviation of a normally distributed population) of all measurements would fall between 24.98 and 25.02 mL?

3.) Does the precision of the glassware assume perfect sampling technique? If so, can the precision uncertainty be directly compounded with the sampling technique uncertainty?

For example:

If the pipet were used 10 times and showed a sample standard deviation of $\pm 0.03$ mL, does that mean 0.02 of that interval is due to the glassware and 0.01 of that interval is a consequence of sampling-technique variance?
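A quick simulation may sharpen this example. It assumes the glassware and technique errors are independent and normally distributed, which is an assumption on my part, not something the specification states; under that assumption, variances add rather than standard deviations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical decomposition (an assumption, not established by the spec):
# glassware scatter and technique scatter are independent normal errors.
sigma_glass = 0.02      # mL, the stated "precision" of the pipet
sigma_technique = 0.01  # mL, the value proposed in the question

n = 100_000
volumes = (25.0
           + rng.normal(0, sigma_glass, n)       # glassware variability
           + rng.normal(0, sigma_technique, n))  # technique variability

# For independent sources the combined SD is
# sqrt(0.02**2 + 0.01**2) ≈ 0.0224 mL, noticeably less than 0.03 mL.
print(np.std(volumes, ddof=1))
```

So, under independence, component standard deviations of 0.02 and 0.01 mL combine to only about 0.022 mL; reaching 0.03 mL overall would require a technique component of about $\sqrt{0.03^2 - 0.02^2} \approx 0.022$ mL.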

NOTE: I asked this same question on Physics Stack Exchange but realized it may be more appropriate here.

Best Answer

The statistical meanings of 'accuracy' and 'precision' are as follows:

Precision refers to the variability of the measurements. For example, if repeated measurements using the pipet are normally distributed with mean $\mu$ and standard deviation $\sigma,$ then small $\sigma$ means high precision. How reproducible are the repeated results?

Accuracy refers to whether the mean $\mu$ of the repeated measurements is correct. Are the repeated measurements systematically too small or systematically too large?

In terms of target practice at a rifle range: Precision has to do with whether I can hit approximately the same spot on the target every time. Accuracy has to do with whether that spot is the bull's eye.
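As a minimal simulation sketch of the two notions (the `bias` and `sigma` values below are made up for illustration, not taken from any spec):

```python
import numpy as np

rng = np.random.default_rng(1)
nominal = 25.0  # mL, the pipet's nominal volume

# Illustrative values only: a pipet that is precise but not accurate.
bias = 0.05   # mL, systematic offset -> poor accuracy
sigma = 0.01  # mL, small scatter     -> high precision

draws = rng.normal(nominal + bias, sigma, 10_000)

print(f"mean = {draws.mean():.4f} mL  (accuracy: distance from {nominal})")
print(f"sd   = {draws.std(ddof=1):.4f} mL  (precision: spread of repeats)")
```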

To find out precisely how the manufacturer determines accuracy and precision, you'd have to ask, and then not be terribly surprised if no one knows the exact answer. For example, the precision of $\pm 0.02\, mL$ might mean that $\sigma = 0.01,$ so that an interval of $\pm 0.02$ would contain about 95% of repeated determinations. Or it might mean that $\sigma = 0.02.$ (I'm a statistician, and it has been about 50 years since I've had anything to do with pipettes, but I suspect @JohnBenton is correct that there is no uniform standard. Also, I suspect the information provided is for ideal circumstances and assumes an experienced person is using the pipette.)
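That "95%" reading is easy to check numerically, assuming normally distributed measurements, since $\pm 0.02$ is $\pm 2\sigma$ when $\sigma = 0.01$:

```python
from scipy.stats import norm

sigma = 0.01  # mL, one possible reading of the spec
# Probability that a normal measurement lands within ±0.02 mL of its mean:
print(norm.cdf(0.02, scale=sigma) - norm.cdf(-0.02, scale=sigma))  # ≈ 0.9545
```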

Here is one method that is sometimes used (especially in theories of estimation) to combine accuracy and precision. Suppose the standard deviation is $\sigma,$ so that the variance is $\sigma^2.$ Suppose the bias is $b$ (roughly, the difference between the mean of the measurements and the true value). Then the mean squared error is $MSE = \sigma^2 + b^2,$ which is expressed in squared units. To return to the original units (e.g., mL in your case), take the square root to get $RMSE = \sqrt{\sigma^2 + b^2}.$
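As a quick numerical sketch (taking $\sigma = 0.02$ mL from the stated precision and, purely for illustration, a bias of $b = 0.06$ mL), a simulation reproduces the formula:

```python
import numpy as np

rng = np.random.default_rng(2)

true_volume = 25.0  # mL
sigma = 0.02        # mL, precision (SD of repeated deliveries)
b = 0.06            # mL, bias (illustrative value only)

draws = rng.normal(true_volume + b, sigma, 100_000)

# Empirical mean squared error about the TRUE value, vs. the formula:
mse = np.mean((draws - true_volume) ** 2)
print(mse)                       # ≈ sigma**2 + b**2 = 0.0040 mL^2
print(np.sqrt(mse))              # empirical RMSE ≈ 0.0632 mL
print(np.sqrt(sigma**2 + b**2))  # formula RMSE ≈ 0.0632 mL
```

Note that the RMSE here is dominated by the bias term: once $b$ is a few times larger than $\sigma,$ improving precision alone barely changes the overall error.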
