[Physics] Which of these 2 methods for calculating the focal length of a concave mirror is more accurate

Tags: geometric-optics, lenses, optics, refraction

I have done an experiment to measure $f$ the focal length of a concave mirror.

I have a list of 8 values for $u$ the object distance and 8 values for $v$ the corresponding image distance.

I calculated the focal length using 2 methods.

Method 1:

I got 8 values for $f$ using the formula $f=\frac{uv}{u+v}$

All values fell within $2\sigma$ so I used them all to find an average value.
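Method 1 can be sketched in a few lines of numpy. The `(u, v)` values below are hypothetical placeholders, not the actual measurements from the experiment (here they are generated to be consistent with a focal length of about 10 cm):

```python
import numpy as np

# Hypothetical (u, v) measurements in cm -- substitute your own eight pairs.
u = np.array([15.0, 20.0, 25.0, 30.0, 35.0, 40.0, 45.0, 50.0])
v = np.array([30.1, 19.9, 16.6, 15.1, 14.0, 13.3, 12.8, 12.6])

f = u * v / (u + v)        # one focal-length estimate per (u, v) pair
f_mean = f.mean()
f_sigma = f.std(ddof=1)    # sample standard deviation of the 8 estimates

# Keep only values within 2 sigma of the mean, then average those
mask = np.abs(f - f_mean) < 2 * f_sigma
f_avg = f[mask].mean()
```

With the fabricated data above, `f_avg` comes out near 10 cm, as expected.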

Method 2:

I graphed $\frac{1}{u}$ against $\frac{1}{v}$ and from the average of the $x$ and $y$ intercepts of the regression line I found the focal length.
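Method 2 amounts to a least-squares line through the $(1/u,\ 1/v)$ points; the mirror formula $\frac{1}{u}+\frac{1}{v}=\frac{1}{f}$ predicts a slope of $-1$ and both intercepts equal to $1/f$. A minimal sketch, again using hypothetical data consistent with $f \approx 10$ cm:

```python
import numpy as np

# Hypothetical measurements in cm -- substitute your own data.
u = np.array([15.0, 20.0, 25.0, 30.0, 35.0, 40.0, 45.0, 50.0])
v = np.array([30.1, 19.9, 16.6, 15.1, 14.0, 13.3, 12.8, 12.6])

x = 1.0 / u
y = 1.0 / v
slope, intercept = np.polyfit(x, y, 1)   # least-squares line y = slope*x + intercept

y_intercept = intercept                  # value of 1/v where 1/u = 0
x_intercept = -intercept / slope         # value of 1/u where 1/v = 0

# f from the average of the two intercepts (each intercept estimates 1/f)
f_fit = 0.5 * (1.0 / y_intercept + 1.0 / x_intercept)
```

For ideal data the fitted slope is close to $-1$; a slope far from $-1$ is itself a sign of systematic error in the measurements.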

My question is:

Which of these 2 methods is more accurate?

Is there some way of quantitatively estimating the accuracy of each method?

NB: I did use a third method, approximating the focal length by focusing a distant object onto a sheet of paper, but let's ignore that method for the purpose of comparing the 2 methods in question.

Best Answer

Generally the second method is preferred. The first method assumes that the relation $f=\frac{uv}{u+v}$ holds exactly for every individual measurement, which requires the mirror to behave like a perfect paraboloid and the light rays to be paraxial; neither is exactly true in a real experiment, so each individual value of $f$ carries a systematic error.

The second method, by contrast, extracts the value from a linear regression over all the data points at once, so no single measurement dominates the result and random errors tend to average out.
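One way to make the comparison quantitative is to attach an uncertainty to each method: the standard error of the mean for method 1, and the intercept uncertainty from the fit's covariance matrix for method 2. This is a sketch under the assumption of purely random, independent errors, again with hypothetical data:

```python
import numpy as np

# Hypothetical measurements in cm -- substitute your own data.
u = np.array([15.0, 20.0, 25.0, 30.0, 35.0, 40.0, 45.0, 50.0])
v = np.array([30.1, 19.9, 16.6, 15.1, 14.0, 13.3, 12.8, 12.6])

# Method 1: standard error of the mean of the eight pairwise estimates.
f1 = u * v / (u + v)
se1 = f1.std(ddof=1) / np.sqrt(len(f1))

# Method 2: uncertainty on f from the fit's intercept covariance,
# propagated through f = 1/intercept (|df/db| = 1/b^2).
coeffs, cov = np.polyfit(1.0 / u, 1.0 / v, 1, cov=True)
intercept, d_intercept = coeffs[1], np.sqrt(cov[1, 1])
f2 = 1.0 / intercept
se2 = d_intercept / intercept**2
```

Whichever method yields the smaller standard error is, on this criterion, the more precise one; systematic errors (non-paraxial rays, mirror shape) are not captured by either number.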

Moreover, for more precise values, an instrument such as a spherometer can be used to measure the mirror's radius of curvature $R$ directly, giving $f = \frac{R}{2}$.