[Math] sample standard deviation given population standard deviation

Tags: standard-deviation, statistics

How do you find the sample standard deviation when given the population standard deviation? What formula do you use? If you can make up an example that would be great.

Best Answer

My answer based on the second version of the original post:

The standard deviation of the sample mean of $n$ data points drawn from a population is $\sigma_{\overline{X}}=\sigma_X/\sqrt{n}$, where $\sigma_X$ is the population standard deviation. (This formula follows from basic properties of variance; the central limit theorem additionally tells you that the sample mean is approximately normal for large $n$.) In your case, $\sigma_{\overline{X}}=40/\sqrt{100}=4$.
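As a quick sanity check, here is a short Python sketch of that computation, using the values $\sigma_X=40$ and $n=100$ from the answer above:

```python
import math

# Values taken from the example in the answer:
sigma = 40   # population standard deviation
n = 100      # sample size

# Standard deviation of the sample mean (the standard error):
# sigma_xbar = sigma / sqrt(n)
sigma_xbar = sigma / math.sqrt(n)
print(sigma_xbar)  # 4.0
```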

My answer based on the first and third versions of the original post:

In order to "get the sample standard deviation," you need to specify a sample (a subset of the population). Without a specified sample there is nothing to compute; once a sample is specified, its standard deviation follows directly from the sample values. In either case, knowledge of the population standard deviation is irrelevant.

For example, consider a population $\{0,1,2,3,\ldots,k\}$ where $k$ is, say, some integer greater than 3. Even if I told you what the population standard deviation was, there is still no way to find the sample standard deviation because no sample was specified.

Now consider a sample $\{0,1,2\}$. The sample mean is $(0+1+2)/3=1$, the sample variance is $s_X^2=\frac{1}{n-1}\sum_{i=1}^n(x_i-\overline{x})^2=\frac{1}{3-1}[(0-1)^2+(1-1)^2+(2-1)^2]=1$, and the sample standard deviation is $s_X=1$, regardless of what the population standard deviation is.
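The calculation above can be reproduced with Python's standard library, whose `statistics.variance` and `statistics.stdev` use the same $n-1$ denominator (Bessel's correction):

```python
import statistics

sample = [0, 1, 2]

# Sample mean: (0 + 1 + 2) / 3 = 1
mean = statistics.mean(sample)

# Sample variance with the n-1 denominator:
# [(0-1)^2 + (1-1)^2 + (2-1)^2] / (3 - 1) = 1
var = statistics.variance(sample)

# Sample standard deviation is the square root of the variance.
sd = statistics.stdev(sample)

print(mean, var, sd)  # 1 1 1.0
```

Note that the population standard deviation never enters the computation: only the sample values do.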