Fractional error in time period

Tags: error-analysis, homework-and-exercises, measurements, units

The length of a pendulum is measured using a meter scale that has 2000 divisions. The measured value of L is 50 cm, the accuracy in the determination of g is 1.1%, and the time taken for 100 oscillations is 100 seconds. What should be the resolution of the clock (in milliseconds)?

I am confused about the calculation of the fractional error in the time period. Shouldn't the fractional errors in length and gravity be added and then divided by two to find the fractional error of the time period?

My solution is:

Percent error in length = (100 cm / 2000, i.e. the least count) / 50 cm × 100 = 0.1%.

Percent error in g = 1.1% (given).

Percent error in T = ½ (0.1 + 1.1) = 0.6% (obtained by taking logarithms and differentiating).

Fractional error in T = resolution / total time = 0.6/100, so the resolution is 0.6 seconds, i.e. 600 milliseconds.
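To make the procedure explicit, here is my arithmetic as a short Python check (a minimal sketch; the variable names are my own):

```python
# My procedure: treat T as derived from L and g, halve the sum of the
# fractional errors, then scale by the total measured time of 100 s.
dL_over_L = (100 / 2000) / 50   # least count 0.05 cm on L = 50 cm -> 0.001
dg_over_g = 0.011               # 1.1% accuracy in g (given)

dT_over_T = 0.5 * (dL_over_L + dg_over_g)         # = 0.006, i.e. 0.6%
resolution = dT_over_T * 100                      # total time = 100 s
print(f"fractional error in T: {dT_over_T:.3%}")  # 0.600%
print(f"resolution: {resolution * 1000:.0f} ms")  # 600 ms
```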

However, the given answer is 5 ms.

Can someone give me some clarity about fractional errors in the time period? Is my procedure sound?
Thanks in advance.

Best Answer

The time period $T$ is a function of the length $L$ and the gravitational acceleration $g$:

$$ T\propto\sqrt{\frac Lg}\\ dT=\frac{\partial T}{\partial L}\,dL+\frac{\partial T}{\partial g}\,dg\quad\Rightarrow\quad \frac{dT}{T}=\frac 12\left(\frac {dL}{L}-\frac{dg}{g}\right)$$

With your data, $\frac{dL}{L}=0.001$ (0.1%) and $\frac{dg}{g}=0.011$ (1.1%). Your mistake was to add the two fractional errors: here $g$ is the *derived* quantity, so in the worst case its error magnitude is built up from the others, $\frac{dg}{g}=\frac{dL}{L}+2\,\frac{dT}{T}$, and you must solve for the allowed timing error: $\left|\frac{dT}{T}\right|=\frac 12\,(0.011-0.001)=0.005=0.5\%$. Since 100 oscillations take 100 s, the period is $T=1~\mathrm s$, so the required resolution is $dT=0.005\times 1~\mathrm s=5~\mathrm{ms}$, which is the given answer.
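As a quick numeric check of this (a minimal Python sketch; the variable names are my own):

```python
# g = 4*pi^2 * L / T^2 is the derived quantity, so in the worst case
# dg/g = dL/L + 2*dT/T; solve for the allowed timing error dT/T.
dL_over_L = (100 / 2000) / 50      # least count 0.05 cm on L = 50 cm -> 0.001
dg_over_g = 0.011                  # 1.1% accuracy in g (given)
T = 100 / 100                      # 100 oscillations in 100 s -> T = 1 s

dT_over_T = (dg_over_g - dL_over_L) / 2        # = 0.005, i.e. 0.5%
print(f"dT = {dT_over_T * T * 1000:.0f} ms")   # 5 ms
```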
