[Physics] Error Analysis involving random errors

error-analysis, homework-and-exercises, measurements

The question goes like this:

In an experiment, the time period of an oscillating object in five successive measurements is found to be $0.52$ s, $0.56$ s, $0.57$ s, $0.54$ s, $0.59$ s. The least count of the watch used for the measurement of the time period is $0.01$ s. What is the percentage error in the measurement of the time period $T$?

My attempt: The maximum error in the measurement of $T$ due to the limited precision of the measuring instrument is the least count, i.e. $0.01$ s. The mean of the measured values is $$\frac { 0.52+0.56+0.57+0.54+0.59 }{ 5 } =0.556,$$ which, rounded off to 2 significant figures, is $0.56$. The standard deviation of the readings, rounded off to the same precision, is $0.02$, which is a good estimate of the random error. Hence the value of $T$ can be written as $0.56\pm (0.01+0.02)=(0.56\pm 0.03)$ s, so the percentage error should be
$$\frac { 0.03 }{ 0.56 } \times 100\approx 5.357$$
i.e. about $5.36\%$.
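Purely as an arithmetic check of the attempt above, here is a short Python sketch (my own, not part of the book's solution; the variable names are arbitrary) that reproduces the mean, the standard deviation, and the resulting percentage error when the least count is added to the random error:

```python
import statistics

# Five successive period readings, in seconds
readings = [0.52, 0.56, 0.57, 0.54, 0.59]
least_count = 0.01  # resolution of the stopwatch, in seconds

mean_T = statistics.fmean(readings)   # 0.556 s, rounds to 0.56 s
std_T = statistics.pstdev(readings)   # ≈ 0.024 s, rounds to 0.02 s

# Combine the instrument (least-count) error with the random-error estimate
total_error = least_count + round(std_T, 2)            # 0.01 + 0.02 = 0.03 s
percent_error = total_error / round(mean_T, 2) * 100   # 0.03 / 0.56 * 100

print(f"mean = {mean_T:.3f} s, std dev = {std_T:.3f} s")
print(f"percentage error = {percent_error:.3f} %")     # ≈ 5.357 %
```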

But the answer given in the book is $3.57\%$. How is this possible? Where did I make a mistake?

Best Answer

I think you are confusing systematic and random errors.
Your experimental results can give you no idea about the systematic error.
For example, it might be that your timing device is calibrated incorrectly: when the correct time is 1.00 seconds it reads 1.10 seconds, and when the correct time is 2.00 seconds it reads 2.20 seconds.
Neither repeated readings nor the smallest subdivision of your scale will give you an indication of what the systematic error is.
You could only find that error by checking the calibration of your timing device against a reliable standard.

So in this example you have found an estimate of the random error by evaluating the standard deviation, and that is the best you can do.
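For what it's worth, the same arithmetic shows that the random-error estimate alone, without the least count added, reproduces the book's figure. A minimal Python sketch of my own (assuming this is what the book did):

```python
import statistics

readings = [0.52, 0.56, 0.57, 0.54, 0.59]

mean_T = round(statistics.fmean(readings), 2)          # 0.56 s
random_error = round(statistics.pstdev(readings), 2)   # 0.02 s

# Percentage error from the random-error estimate alone (no least count added)
print(f"{random_error / mean_T * 100:.2f} %")          # 3.57 %
```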
