[Math] How to calculate the sample mean's standard deviation

statistics

A simulation was conducted using 10 fair six-sided dice, with faces numbered 1 through 6. All 10 dice were rolled, and the average of the 10 numbers appearing faceup was recorded. The process was repeated 20 times. Which of the following best describes the distribution being simulated?

The answer is: A sampling distribution of a sample mean with $n=10$, $\mu_{\bar{x}}=3.5$ and $\sigma_{\bar{x}}\approx 0.54$

I know that $\mu_{\bar{x}}=\mu$ and $\sigma_{\bar{x}}=\frac{\sigma}{\sqrt{n}}$

$\mu=\frac{1+2+3+4+5+6}{6}=3.5$ so $\mu_{\bar{x}}=3.5$

But how do I calculate the standard deviation?

Best Answer

Hint:

The variance of a single roll of a die is

$$\sigma_x^2=\sum_{i=1}^6 \frac16 \left(i-3.5 \right)^2=\frac{35}{12}$$
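As a quick sanity check of that sum, the single-die variance can be computed directly in Python (plain standard library, no extra packages assumed):

```python
# Variance of one fair die: E[(X - mu)^2] with each face equally likely.
faces = [1, 2, 3, 4, 5, 6]
mu = sum(faces) / 6                         # 3.5
var = sum((i - mu) ** 2 for i in faces) / 6 # 35/12 ≈ 2.9167
print(var)
```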

And the variance of the mean of $n$ i.i.d. random variables is $\operatorname{Var}(\bar{X})=\frac{\sigma_x^2}{n}$

Therefore the standard deviation is $\sigma_{\bar{x}}=\sqrt{\dfrac{\sigma_x^2}{10}}=\sqrt{\dfrac{35}{120}}\approx 0.54$
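You can also verify this by running the simulation itself. The sketch below (standard library only; the number of trials is raised far above the problem's 20 repeats so the empirical standard deviation settles near the theoretical value) rolls 10 dice per trial and compares the spread of the recorded means to $\sigma_x/\sqrt{10}$:

```python
import random
import statistics

random.seed(0)        # seed chosen arbitrarily, for reproducibility

n = 10                # dice per trial, as in the problem
trials = 100_000      # many more repeats than 20, so the theory is visible

# Record the mean of 10 faceup values, repeated `trials` times.
means = [statistics.mean(random.randint(1, 6) for _ in range(n))
         for _ in range(trials)]

sigma_x = (35 / 12) ** 0.5      # sd of a single die roll
theory = sigma_x / n ** 0.5     # sd of the sample mean, ≈ 0.5401

print(f"simulated: {statistics.stdev(means):.4f}")
print(f"theory:    {theory:.4f}")
```

With 100,000 trials the simulated value typically agrees with 0.5401 to about two decimal places; with only 20 repeats, as in the original problem, the match would be much rougher.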
