[Math] Find the $90\%$ confidence interval for a measurement with mean $78.1$ and SD $20$.

descriptive statistics

Here is the full problem:

A calibration laboratory has been measuring a one-kilogram checkweight by the same procedure for several years. They have accumulated several hundred measurements, and the SD of these measurements is $18$ micrograms. Someone now sends in a one-kilogram weight to be calibrated by the same procedure. The lab makes $50$ measurements on the new weight, which average $78.1$ micrograms above a kilogram, and their SD is 20 micrograms.

If possible, find a $90\%$-confidence interval for the number of micrograms this new weight is above $1$ kilogram. (You may assume the Gauss model, with no bias.)

I've been trying to find the confidence interval by using the formula $78.1 \pm 1.64\times\frac{20}{\sqrt{50}}$, where $1.64$ is the $z$-score for a $90\%$ confidence interval and $\frac{20}{\sqrt{50}}$ is the standard error for the mean of the measurements, but the problem entry system keeps telling me that my answer is incorrect.

Best Answer

You are told that several hundred measurements have been made on a known mass and that the resulting SD is $18\ \mu$g. What this tells you is that the lab has estimated the SD of the measurement errors very well, so you can take the true SD to be, in fact, $18\ \mu$g ($\pm$ some small amount). Your standard error should therefore be $\frac{18}{\sqrt{50}}$; the one you used was about $11\%$ too high.
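For concreteness, here is the resulting interval with the numbers plugged in (a quick sketch using the same $z = 1.64$ as in the question; tables that quote $1.645$ change the half-width only in the second decimal):

$$78.1 \pm 1.64 \times \frac{18}{\sqrt{50}} \approx 78.1 \pm 1.64 \times 2.546 \approx 78.1 \pm 4.2,$$

so the $90\%$-confidence interval is roughly $73.9$ to $82.3$ micrograms above one kilogram.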

As a theoretical aside, we can quantify how exact that standard deviation is. Because we are told to assume the errors are Gaussian with no bias, the sampling distribution of the estimated standard deviation is known: for $N$ in the hundreds it is essentially normal, with mean $b(N)\sigma$, where $b(N)$ is the bias factor given in (6) of the link; for large $N$ this is extremely close to $1$ (e.g. it is $0.996$ for $N = 200$). The variance of this estimate is given in (10) of the link; for large $N$, say $N = 200$, it is about $0.0025\sigma^2$, so the standard deviation of the estimate is $0.05\sigma$. What this means is that the estimate of the standard deviation (i.e., $18\ \mu$g) will be within $10\%$ of the true standard deviation $95\%$ of the time! So you know with a high degree of confidence that the true standard deviation is somewhere between $16.2$ and $19.8\ \mu$g, and that is with just $200$ measurements. If you have several hundred, e.g. $N = 600$, the accuracy improves so that you know the standard deviation to within about $6\%$.
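For readers without access to the link, the standard large-$N$ approximation reproduces these figures (this is the usual asymptotic result for Gaussian errors, not a quote from the link):

$$\operatorname{SD}(\hat{\sigma}) \approx \frac{\sigma}{\sqrt{2N}}.$$

For $N = 200$ this gives $\sigma/\sqrt{400} = 0.05\sigma$, so a $\pm 2$-SD ($\approx 95\%$) band is $\pm 10\%$; for $N = 600$ it gives $\sigma/\sqrt{1200} \approx 0.029\sigma$, i.e. about $\pm 6\%$, matching the numbers above.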
