If I multiply the averages of two sets of data, what can I conclude about the standard deviation of the product?
That is, given averages X1 and X2 from two different but related data sets, I calculate a derived average X3 = X1 * X2. X1 and X2 each have their own standard deviations, D1 and D2. How can I calculate the standard deviation D3?
My concrete sample problem to illustrate the use-case
I have sample performance measurements of a computer operation, measured in milliseconds / invocation (the first data set) and invocations / day (the second data set). Each of these has a mean and a standard deviation. I would like another measure in milliseconds / day, so I multiplied the first mean by the second:
X1 * X2 = X3
milliseconds / invocation * invocations / day = milliseconds / day
The question is: how do I get the standard deviation of X3?
Thanks in advance.
Best Answer
Assuming $X_1$ and $X_2$ are independent, $$\text{Var}(X_1 X_2) = \text{Var}(X_1) \text{Var}(X_2) + E[X_1]^2 \text{Var}(X_2) + E[X_2]^2 \text{Var}(X_1)$$ The standard deviation you want is then $D_3 = \sqrt{\text{Var}(X_1 X_2)}$.
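In code, under that independence assumption, this is a one-liner; a minimal sketch (the function name and the example numbers are illustrative, not from the question):

```python
import math

def product_sd(mean1, sd1, mean2, sd2):
    """Standard deviation of the product of two independent variables,
    via Var(X1*X2) = Var1*Var2 + E[X1]^2*Var2 + E[X2]^2*Var1."""
    var_product = (sd1 ** 2) * (sd2 ** 2) + mean1 ** 2 * sd2 ** 2 + mean2 ** 2 * sd1 ** 2
    return math.sqrt(var_product)

# Hypothetical figures: 5 ms/invocation (SD 1), 1000 invocations/day (SD 100).
d3 = product_sd(5.0, 1.0, 1000.0, 100.0)  # in ms/day
```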
EDIT: Without that assumption, $$\text{Var}(X_1 X_2) = \text{Var}(X_1) \text{Var}(X_2) + E[X_1]^2 \text{Var}(X_2) + E[X_2]^2 \text{Var}(X_1) + \text{Cov}(X_1^2, X_2^2) - \text{Cov}(X_1,X_2)^2 - 2 \text{Cov}(X_1,X_2) E[X_1] E[X_2]$$
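The general formula is an algebraic identity in the moments, so it can be checked numerically on any sample, even a correlated one. A quick sanity check, where the Gaussian sample and its parameters are arbitrary choices for illustration:

```python
import random

random.seed(1)
n = 100_000
# Deliberately correlated pair, so the covariance terms are nonzero.
x1 = [random.gauss(2.0, 1.0) for _ in range(n)]
x2 = [0.5 * a + random.gauss(1.0, 0.5) for a in x1]

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    """Population covariance of two equal-length samples."""
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

e1, e2 = mean(x1), mean(x2)
v1, v2 = cov(x1, x1), cov(x2, x2)
prod = [a * b for a, b in zip(x1, x2)]

direct = cov(prod, prod)  # sample variance of X1*X2, computed directly
formula = (v1 * v2 + e1 ** 2 * v2 + e2 ** 2 * v1
           + cov([a * a for a in x1], [b * b for b in x2])
           - cov(x1, x2) ** 2 - 2 * cov(x1, x2) * e1 * e2)
```

Because every term is computed from the same sample moments, `direct` and `formula` agree up to floating-point error, not just approximately in large samples.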
At one extreme, if $X_2 = X_1$, $\text{Var}(X_1 X_2) = \text{Var}(X_1^2) = E[X_1^4] - E[X_1^2]^2$. There is no way to estimate this given just $E[X_1]$ and $\text{Var}(X_1)$.
At the other extreme, if $X_2 = 1/X_1$, the product is identically $1$, so $\text{Var}(X_1 X_2) = 0$.