Solved – How to calculate the standard deviation of a function of two variables

I have the following:

  • x is the mean of X={4,6} and equals 5; the standard deviation of X is 1.41
  • y is the mean of Y={3,9} and equals 6; the standard deviation of Y is 4.24

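Here 1.41 and 4.24 are the sample standard deviations (dividing by n-1):

$$ \sqrt{\frac{(4-5)^2+(6-5)^2}{2-1}} = \sqrt{2} \approx 1.41, \qquad \sqrt{\frac{(3-6)^2+(9-6)^2}{2-1}} = \sqrt{18} \approx 4.24 $$
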
How can I calculate the standard deviation of z which is equal to:

$$ z=\frac{y-x}{x}\cdot 100 = \frac{6-5}{5}\cdot 100=20$$

Best Answer

Suppose you have independent quantities $a,b$ with standard deviations $\Delta a, \Delta b$. If $s=a+b$ or $s=a-b$, then $\Delta s = \sqrt{(\Delta a)^2+(\Delta b)^2}$.

And if $q = a/b$ then $\dfrac{\Delta q}{q} = \sqrt{\left(\dfrac{\Delta a}{a}\right)^2+\left(\dfrac{\Delta b}{b}\right)^2}$
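Applied to $z = \dfrac{y-x}{x}\cdot 100$, a sketch that chains these two rules (treating $x$ and $y$ as independent, and also treating the numerator $y-x$ and the denominator $x$ as independent, which ignores that both contain $x$) gives

$$\frac{\Delta z}{|z|} = \sqrt{\frac{(\Delta x)^2+(\Delta y)^2}{(y-x)^2} + \left(\frac{\Delta x}{x}\right)^2}$$

Multiplying by the constant 100 does not change the relative uncertainty.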

Your problem involves only a subtraction, a division, and multiplication by the constant 100 (which simply scales the absolute uncertainty by 100), so you should be able to apply these rules directly. Look up "propagation of uncertainty" for more information.
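
A minimal numerical sketch in Python, applying the same steps to the numbers in the question (and, as above, treating the inputs of each step as independent):

```python
import math

# Values quoted in the question
x, dx = 5.0, 1.41   # mean and (sample) standard deviation of X = {4, 6}
y, dy = 6.0, 4.24   # mean and (sample) standard deviation of Y = {3, 9}

# Difference rule: for n = y - x, absolute uncertainties add in quadrature
n = y - x
dn = math.sqrt(dx**2 + dy**2)

# Quotient rule: for q = n / x, relative uncertainties add in quadrature
q = n / x
dq = abs(q) * math.sqrt((dn / n)**2 + (dx / x)**2)

# Multiplying by the constant 100 scales the absolute uncertainty by 100
z = 100 * q
dz = 100 * dq

print(f"z  = {z:.1f}")   # 20.0
print(f"dz = {dz:.1f}")  # roughly 90
```

The propagated uncertainty comes out much larger than $z$ itself, which is not surprising given that each data set contains only two points.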