Solved – How to calculate the standard deviation of a sample given the sample size, mean, and sum of the data’s squares

Tags: mean, sample, standard deviation

For example, given $\sum_{i=1}^n x_i^2 = 10$, $n=10$, and $\bar{x} = 10$, how would I go about calculating the standard deviation? The formula for the standard deviation requires knowing $\sum_{i=1}^n (x_i-\bar{x})^2$, but that quantity is not provided in this problem.

Best Answer

The sample standard deviation is given by

$$\begin{align*} s &=\sqrt{\frac{\sum\left(x_i-\bar{x}\right)^2}{n-1}}\\\\ &=\sqrt{\frac{\sum\left(x_i^2-2x_i\bar{x}+\bar{x}^2\right)}{n-1}}\\\\ &=\sqrt{\frac{n\bar{x}^2+\sum x_i^2-2\bar{x}\sum x_i}{n-1}}\\\\ &=\sqrt{\frac{n\bar{x}^2+\sum x_i^2-2n\bar{x}^2}{n-1}}\\\\ &=\sqrt{\frac{\sum x_i^2-n\bar{x}^2}{n-1}}\\\\ \end{align*}$$

so knowing $\bar{x}$, $n$, and $\sum x_i^2$ suffices for calculating the sample standard deviation.
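The final line of the derivation translates directly into code. Here is a minimal sketch (the function name `sample_std` is my own choice, not from the question). Note that the question's example values ($\sum x_i^2 = 10$, $n = 10$, $\bar{x} = 10$) would make $\sum x_i^2 - n\bar{x}^2$ negative, which is impossible for real data, so the demo below uses a small consistent data set instead:

```python
import math

def sample_std(n, mean, sum_sq):
    """Sample standard deviation from n, the sample mean, and sum(x_i^2),
    using s = sqrt((sum(x_i^2) - n*mean^2) / (n - 1))."""
    variance = (sum_sq - n * mean ** 2) / (n - 1)
    return math.sqrt(variance)

# Toy data: x = [1, 2, 3, 4, 5] gives n = 5, mean = 3, sum of squares = 55.
# Then s = sqrt((55 - 5*9) / 4) = sqrt(2.5).
print(sample_std(5, 3.0, 55.0))
```

The result agrees with `statistics.stdev([1, 2, 3, 4, 5])`, which computes the same quantity directly from $\sum (x_i - \bar{x})^2$.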