[Math] Simple question on standard deviation and mean.

Tags: standard-deviation, statistics

This question has kinda stumped me:

For a data set, the mean is 5 and the standard deviation is 1. There
are 10 values in the data set. Is the range of the data set bigger
than, smaller than, or equal to 10?

I don't want the answer, just the technique. Here's what I've tried (admittedly, not much):

$$\frac{1}{10}\sum_{i=0}^{9} a_i = 5$$
$$\sigma = 1$$
Is there a relation between the standard deviation and the mean that I can use here? Any help is appreciated. This is not homework; it's a practice question for the GRE.

Best Answer

If the mean is 5, then you know that

$$\frac{1}{10} \sum_i a_i = 5 \implies \sum_ia_i = 50$$

and if the standard deviation is 1 (and hence the variance is 1) then

$$\frac{1}{10} \sum_i(a_i-5)^2 = 1 \implies \sum_i(a_i-5)^2 = 10$$

To make the range as big as possible, we choose two values $a^+$ and $a^-$ to be as far from the mean as possible (symmetrically) and set the rest of the values to the mean, i.e. $a^+=5+x$ and $a^-=5-x$. What does that tell you about $x$?
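
As a quick numerical sanity check of this construction (a minimal sketch in Python, assuming NumPy is available; it is not part of the derivation above), you can plug the two extreme values back in, keep the other eight values at the mean, and confirm that both constraints hold while reading off the range:

```python
import numpy as np

# Extremal configuration from the hint: two values symmetric about the
# mean, the remaining eight values equal to the mean.
x = np.sqrt(5)                      # positive solution of 2*x**2 = 10
data = np.array([5 + x, 5 - x] + [5.0] * 8)

print(np.mean(data))                # ≈ 5.0
print(np.std(data, ddof=0))         # ≈ 1.0 (population standard deviation)
print(np.ptp(data))                 # range = 2*sqrt(5) ≈ 4.47
```

Note that `np.std` uses the population convention (`ddof=0`) by default, which matches the variance formula above.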