Calculate the Mean and Standard Deviation on a Rate

means · standard deviation · statistics

I have a set of data with a Time (s) and a Size (Bytes) for each entry, and for each I calculate a Rate (s/Byte). I'm trying to predict a Time knowing a Size, using the Mean and the Standard Deviation of the Rates, to get something like 11 minutes 48 seconds ± 3σ (2 minutes 30 seconds).

For now, to calculate the Mean, I sum the Times, sum the Sizes, and compute the Mean as the ratio of these two totals. But to calculate the Standard Deviation, I have to perform the calculation on the individual calculated Rates of my data set.

  • Does it make any sense to calculate the Mean from the Time and Size sums but the Standard Deviation from the individual calculated Rates?

  • Or should I calculate the Mean of the individual calculated Rates, given that I'm using a Standard Deviation computed on those same Rates to give the ±3σ estimation?

Are both methods correct, or should I apply the second solution even if the Mean calculation of the first solution is more accurate? (Both options are sketched in code below.)
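To make the two candidate Mean calculations concrete, here is a minimal Python sketch; the `data` values are made up and serve only to illustrate the two computations:

```python
import statistics

# Hypothetical measurements: (time_seconds, size_bytes) per transfer.
data = [(120.0, 1_000_000), (45.0, 500_000), (300.0, 2_200_000)]

times = [t for t, _ in data]
sizes = [s for _, s in data]
rates = [t / s for t, s in data]  # individual Rates, in s/Byte

# Option 1: "pooled" Mean -- sum of Times divided by sum of Sizes.
mean_pooled = sum(times) / sum(sizes)

# Option 2: plain Mean of the individual Rates.
mean_of_rates = statistics.mean(rates)

# Either way, the Standard Deviation comes from the individual Rates.
sd_rates = statistics.stdev(rates)

print(mean_pooled, mean_of_rates, sd_rates)
```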

Best Answer

The Mean and Standard Deviation you are after are taken on a ratio, and you cannot simply average the individual Ratios: a plain mean of the Rates gives every measurement equal weight regardless of its Size. So to calculate the Mean, the only way is to sum the Times, sum the Sizes, and take the Ratio of these two totals. To calculate an estimator of the Standard Deviation, however, the only way is to use all the individual calculated Ratios, since a single pooled Ratio carries no information about spread. I know of no other way to accomplish these two tasks.
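As a rough sketch of how the prediction could then be assembled in Python — the sample data are again made up, and scaling the ±3σ band linearly with Size assumes the Rate does not itself depend on Size:

```python
import statistics

# Hypothetical measurements: (time_seconds, size_bytes).
data = [(120.0, 1_000_000), (45.0, 500_000), (300.0, 2_200_000)]

rates = [t / s for t, s in data]  # individual Ratios (s/Byte)

# Mean Ratio from the pooled sums, as described above.
mean_rate = sum(t for t, _ in data) / sum(s for _, s in data)

# Spread estimated from the individual Ratios.
sd_rate = statistics.stdev(rates)

def predict(size_bytes: float) -> tuple[float, float]:
    """Return (expected_time, half_width) in seconds for a ±3σ band."""
    return size_bytes * mean_rate, 3 * sd_rate * size_bytes

expected, band = predict(5_000_000)
print(f"{expected:.0f} s ± {band:.0f} s")
```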