[Math] How to calculate standard deviation for a series containing both positive and negative numbers

standard-deviation, statistics

I have a stream of numbers in one of my apps that represents an electrical signal. I've observed that the signal ranges from -100 to +100; other than that, it is fairly random and frequently crosses 0. I'm trying to confirm that I'm calculating the standard deviation of this stream of numbers correctly.

  • I collect 1000 data points.
  • I add all the data points and divide by 1000 to calculate the mean. This is the step I'm not sure of: do I need to add absolute values to calculate the mean when there are both positive and negative numbers?
  • I calculate the sum of squared differences, (value - mean)^2, over all the values.
  • I take the square root of (the sum of squared differences divided by 1000) to get the standard deviation (these steps are sketched in code below).
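
For concreteness, here is a minimal Python sketch of the steps above; the data is randomly generated as a stand-in for the actual signal, and all variable names are illustrative rather than from the original app:

```python
import math
import random

# Stand-in sample: 1000 readings between -100 and +100.
data = [random.uniform(-100, 100) for _ in range(1000)]

n = len(data)

# Mean: sum the raw (signed) values -- no absolute values needed.
mean = sum(data) / n

# Sum of squared differences from the mean; squaring makes every
# term non-negative, so negative readings are handled automatically.
sum_sq_diff = sum((x - mean) ** 2 for x in data)

# Population standard deviation: divide by n, as in the steps above.
std_dev = math.sqrt(sum_sq_diff / n)
print(std_dev)
```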

The result I get is something like 46.6. I'm trying to understand whether this number looks right given the steps above, or whether I need to adjust how I calculate the standard deviation to account for having negative numbers mixed with positive ones.

Thank you!

Best Answer

If your data is a sample, you should use the sample variance

$s^2=\frac{1}{n-1}\sum_{i=1}^n(x_i-\bar{x})^2$

as it is an unbiased estimator (note the $n-1$ in the denominator). The standard deviation is then $s=\sqrt{s^2}$. With $n=1000$ the difference between dividing by $n$ and by $n-1$ is negligible in practice, but the sample formula is the principled choice. As Alyosha mentioned, do not use absolute values; use the actual values of $x_i$ given by the experiment.
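
As a quick illustration of the two denominators, NumPy's `ddof` parameter switches between them (the data array here is just a placeholder, not the asker's signal):

```python
import numpy as np

data = np.random.uniform(-100, 100, size=1000)  # placeholder signal data

# Population standard deviation: divide by n (ddof=0, NumPy's default).
pop_std = np.std(data, ddof=0)

# Sample standard deviation: divide by n - 1 (ddof=1), matching the
# unbiased variance estimator described above.
sample_std = np.std(data, ddof=1)

print(pop_std, sample_std)  # nearly identical for n = 1000
```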