Find the mean frequency and the standard deviation of the mean frequency given a set of period measurements

error-analysis, frequency, measurements

I have a set of $N$ measurements of the period of a signal, i.e. $S = [X_1, X_2, \dots, X_N]$, with a 1-standard-deviation error of the mean $\sigma_{\bar{S}} = \dfrac{\sigma_{S}}{\sqrt{N}}$, where $\sigma_{S}$ is the standard deviation of the set $S$.

My aim is to calculate the mean frequency and the standard deviation of the mean frequency of this set. The frequency is defined as the inverse of the period.

We can follow two paths (a numerical sketch of both follows the list):

  1. We simply calculate the frequencies, forming the set $S' = [1/X_1, 1/X_2, \dots, 1/X_N]$, and from it take the mean frequency $\bar{S}'$ and the standard deviation of the mean frequency $\sigma_{\bar{S}'} = \dfrac{\sigma_{S'}}{\sqrt{N}}$.

  2. Or we calculate the mean frequency, denoted $\bar{S}''$, from the mean period of $S$, i.e. $\bar{S}'' = \dfrac{1}{\bar{S}}$. For the frequency error we propagate the error of the mean period: $\sigma_{\bar{S}''} = \dfrac{\sigma_{\bar{S}}}{\bar{S}^2}$.
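For concreteness, here is a minimal numerical sketch of the two paths in Python with NumPy; the period values below are made up purely for illustration:

```python
import numpy as np

# Hypothetical period measurements in seconds (illustrative values only)
periods = np.array([2.01, 1.98, 2.03, 1.97, 2.02])
N = len(periods)

# Path 1: convert each period to a frequency, then take statistics
freqs = 1.0 / periods
mean_freq_1 = freqs.mean()
sigma_mean_freq_1 = freqs.std(ddof=1) / np.sqrt(N)    # sigma_{S'} / sqrt(N)

# Path 2: take statistics of the periods, then invert and propagate
mean_period = periods.mean()
sigma_mean_period = periods.std(ddof=1) / np.sqrt(N)  # sigma_{S} / sqrt(N)
mean_freq_2 = 1.0 / mean_period
sigma_mean_freq_2 = sigma_mean_period / mean_period**2  # first-order propagation

print(f"Path 1: f = {mean_freq_1:.5f} +/- {sigma_mean_freq_1:.5f} Hz")
print(f"Path 2: f = {mean_freq_2:.5f} +/- {sigma_mean_freq_2:.5f} Hz")
```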

What is the correct procedure to find the mean frequency and its standard deviation?

PS: I have empirically noted that point 2 leads to a higher uncertainty.

Best Answer

First, let us suppose that $\bar{S} \neq 0$, so that the frequency is well defined.

Point 2 is wrong: the standard deviation of the mean of a set $M$ of $N$ measurements is given by $\sigma(\bar{M}) = \dfrac{\sigma(M)}{\sqrt{N}}$. You simply cannot propagate the error of the mean as if it were a measurement, because the mean is a statistic, not a measurement.

In your problem the uncertainty of each measurement is given by the standard deviation of your set. In point 1, going from periods $S$ to frequencies $S'$, you have redefined the set of measurements and their distribution, so the uncertainty is given by the new standard deviation $\sigma_{S'}$. In short, point 1 is correct.
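As a sanity check, here is a small Monte Carlo sketch (with arbitrary, assumed parameters: Gaussian periods around 2.0 s with spread 0.05 s) showing that the point-1 standard error tracks the actual trial-to-trial scatter of the mean frequency:

```python
import numpy as np

# Assumed simulation parameters, chosen only for illustration
rng = np.random.default_rng(0)
true_period, spread, N, trials = 2.0, 0.05, 10, 100_000

periods = rng.normal(true_period, spread, size=(trials, N))
freqs = 1.0 / periods

mean_freqs = freqs.mean(axis=1)                    # mean frequency per trial
stderr_1 = freqs.std(axis=1, ddof=1) / np.sqrt(N)  # point-1 error per trial

# If point 1 is right, its average standard error should match the
# observed scatter of the mean frequency across the repeated experiments.
print(f"observed scatter of the mean frequency: {mean_freqs.std(ddof=1):.6f} Hz")
print(f"average point-1 standard error:         {stderr_1.mean():.6f} Hz")
```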
