Solved – How to find the standard deviation of the difference between two means

I have two means and two standard deviations:

M1 = 45.11, SD1 = 14.21
M2 = 60.11, SD2 = 14.36

I know that to get the difference in means I can subtract one from the other, but how do I calculate the standard deviation of that difference? Please help!

Edit: I should have mentioned, they are dependent variables!

Best Answer

To calculate the variance of $\bar{X} - \bar{Y}$ you need to know something about the covariance between $X$ and $Y$. If you have the original data, you can estimate the covariance directly, but absent this information you can use the Cauchy-Schwarz inequality to get an upper bound:

\begin{align}
\text{Var} \left( \bar{X} - \bar{Y} \right) &= \sigma_x^2 / n + \sigma_y^2 / n - 2 \sigma_{\bar{x}\bar{y}} \\
&\leq \sigma_x^2 / n + \sigma_y^2 / n + 2 \left| \sigma_{\bar{x}\bar{y}} \right| \\
&\leq \sigma_x^2 / n + \sigma_y^2 / n + 2 \sigma_x \sigma_y / n \\
&= (\sigma_x + \sigma_y)^2 / n,
\end{align}

which can be estimated by plugging in the appropriate point estimates. Here $\sigma_{\bar{x}\bar{y}}$ denotes the covariance of the sample means; for paired samples of size $n$ it equals $\sigma_{xy} / n$, and Cauchy-Schwarz gives $|\sigma_{xy}| \leq \sigma_x \sigma_y$, which yields the second inequality. The bound could be quite a bit larger than the actual variance, since it is attained only when $X$ and $Y$ are perfectly negatively correlated, but it is the best you can do without more information.
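Plugging the numbers from the question into the last line (the sample size $n$ was not given, so it stays symbolic), the estimated upper bound on the standard deviation of the difference is

\begin{align} \widehat{\text{SD}} \left( \bar{X} - \bar{Y} \right) \leq (s_x + s_y) / \sqrt{n} = (14.21 + 14.36) / \sqrt{n} = 28.57 / \sqrt{n}. \end{align}

Below is a minimal Python sketch of both routes: the worst-case bound from the summary statistics alone, and the direct estimate when the paired observations are available. The sample size `n = 30` and the simulated `x`, `y` arrays are purely hypothetical stand-ins, since the question supplies only the means and standard deviations.

```python
import numpy as np

# Summary statistics from the question.
m_x, s_x = 45.11, 14.21
m_y, s_y = 60.11, 14.36

# HYPOTHETICAL: the question does not give a sample size; 30 is a
# placeholder so the sketch runs end to end.
n = 30

# Route 1: Cauchy-Schwarz worst-case bound, (s_x + s_y) / sqrt(n).
sd_diff_bound = (s_x + s_y) / np.sqrt(n)
print(f"Difference in means:         {m_x - m_y:.2f}")
print(f"Upper bound on SD of diff:   {sd_diff_bound:.2f}")

# Route 2: if the original paired data are available, estimate the
# covariance directly. The arrays below are simulated placeholders;
# substitute your real paired samples.
rng = np.random.default_rng(0)
x = rng.normal(m_x, s_x, size=n)
y = 0.5 * x + rng.normal(0.0, 10.0, size=n)

s_xy = np.cov(x, y)[0, 1]  # sample covariance (np.cov uses n-1 by default)
var_diff = (np.var(x, ddof=1) + np.var(y, ddof=1) - 2.0 * s_xy) / n
print(f"SD of diff from paired data: {np.sqrt(var_diff):.2f}")
```

The direct estimate implements the first line of the derivation, $\widehat{\text{Var}}(\bar{X} - \bar{Y}) = (s_x^2 + s_y^2 - 2 s_{xy}) / n$, and will generally be much tighter than the worst-case bound unless the pairs really are strongly negatively correlated.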