[Math] Confused about Standard Deviation

probability, standard-deviation, statistics

The given was:

A large playlist consists of songs with times which have mean 2 minutes and standard deviation 10 seconds.

The question is: If 36 songs are randomly selected from this playlist, what is the standard deviation of the total length of the songs, in minutes? (Give a decimal answer to two places past the decimal.)

The solution given was: $V [T] = 36(1/6)^2 = 1$

My question is where did 1/6 come from?

Best Answer

If $X_1, X_2, \dots, X_{36}$ is a random sample from $\mathsf{Norm}(\mu = 120,\sigma = 10)$ sec. then $\bar X \sim \mathsf{Norm}(120, \sigma=10/\sqrt{36})$ sec. and $$T = \sum_{i=1}^{36}X_i \sim \mathsf{Norm}\left(36(120), \sqrt{36(10^2)} =60 \right)\text{ sec.}$$

I'll leave it to you to convert to minutes. As for the $1/6$: that solution works in minutes from the start, and $10$ seconds is $10/60 = 1/6$ of a minute, so $V[T] = 36\,(1/6)^2 = 1$ square minute and the standard deviation is $1$ minute.
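A quick Monte Carlo check of this result is possible (a sketch, assuming normally distributed song lengths as in the answer above; the variance formula $V[T] = 36\sigma^2$ itself only requires independence, not normality):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 100,000 playlists of 36 songs each.
# Song times in seconds: mean 120, standard deviation 10.
totals = rng.normal(loc=120, scale=10, size=(100_000, 36)).sum(axis=1)

sd_sec = totals.std()   # should be close to sqrt(36 * 10**2) = 60 seconds
sd_min = sd_sec / 60    # should be close to 1 minute

print(f"SD of total: {sd_sec:.1f} sec = {sd_min:.2f} min")
```

The sample standard deviation of the 100,000 simulated totals lands very close to 60 seconds, i.e. 1.00 minutes, matching $V[T] = 36(1/6)^2 = 1$.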