[Math] Calculating missing data points from standard deviation and mean

standard deviation, statistics

I have to figure out what the two missing data points are from a set of 10 scores. The mean of the 10 scores is 20.0 and the standard deviation is 6.0. The observed scores are listed below, with the two missing scores shown as blanks. What formula/method do I use to determine these scores? We learned basic standard deviation calculations in class, but I can't figure out how to reverse the process to find these two scores.

{13, 11, 20, 24, 29, 27, 16, 20, _, _ }

Best Answer

Hint: The mean is the sum of the scores divided by the number of scores, so if the mean is $20$ the total is $200$. Let the two missing scores be $x$ and $y$; this gives you $x+y$. Then the standard deviation is the square root of the variance, and the variance is the sum of the squared differences of each score from the mean, divided by the number of scores (or by one less than that, if your class uses the sample variance). So you have $(13-20)^2+(11-20)^2+\ldots+(x-20)^2+(y-20)^2=$ what?
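
If it helps to check your algebra, here is a minimal sketch that sets up the two equations from the hint and solves them symbolically. The variable names (`observed`, `n`, `mean`, `sd`) and the use of `sympy` are my own illustration, not part of the original answer; note that the second equation assumes the population standard deviation (dividing by $n$), and the result changes if the sample convention (dividing by $n-1$) was intended.

```python
import sympy as sp

observed = [13, 11, 20, 24, 29, 27, 16, 20]
n = 10          # total number of scores
mean = 20.0     # given mean of all ten scores
sd = 6.0        # given standard deviation

x, y = sp.symbols("x y", real=True)

# Equation 1: all ten scores must sum to n * mean.
eq_sum = sp.Eq(sum(observed) + x + y, n * mean)

# Equation 2: the squared deviations must sum to n * variance
# (population convention; use (n - 1) * sd**2 instead for the
# sample standard deviation).
eq_var = sp.Eq(
    sum((s - mean) ** 2 for s in observed)
    + (x - mean) ** 2 + (y - mean) ** 2,
    n * sd ** 2,
)

solutions = sp.solve([eq_sum, eq_var], [x, y])
print(solutions)  # the two missing scores, in either order
```

Whichever convention you use, the structure is the same: one linear equation in $x+y$ from the mean, and one quadratic equation in the squared deviations from the standard deviation, which together pin down the two missing values.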