Covariance, and the Taylor expansion for the expected value of a linear function of random variables

covariance, expected value, random variables, random functions, taylor expansion

Suppose I have two correlated random variables $X$ and $Y$ and am interested in the quantity
$$
\theta = U(X) - U(Y)\ ,
$$

where $U$ is a smooth function. I am trying to determine the correct Taylor expansion of $\mathbb{E}[\theta]$ to second order, i.e. keeping terms up to order $\sigma^2$. Using the linearity of the expected value, I can write
\begin{align}
\mathbb{E}[\theta] &= \mathbb{E}[U(X)] - \mathbb{E}[U(Y)]\\
&\approx \left( U(\mu_X) + \frac{U''(\mu_X)}{2}\sigma_X^2 \right) - \left( U(\mu_Y) + \frac{U''(\mu_Y)}{2}\sigma_Y^2 \right)\ .
\end{align}
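
Each bracketed term comes from expanding $U$ to second order about the corresponding mean and taking the expectation; for $X$,
\begin{align}
\mathbb{E}[U(X)] &\approx \mathbb{E}\!\left[ U(\mu_X) + U'(\mu_X)(X-\mu_X) + \frac{U''(\mu_X)}{2}(X-\mu_X)^2 \right]\\
&= U(\mu_X) + \frac{U''(\mu_X)}{2}\sigma_X^2\ ,
\end{align}
since $\mathbb{E}[X-\mu_X]=0$ and $\mathbb{E}[(X-\mu_X)^2]=\sigma_X^2$, and similarly for $Y$.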

Is this correct? If so, how and why does $\operatorname{Cov}[X,Y]$ not enter the picture?
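
As a quick sanity check, here is a minimal Monte Carlo sketch of my own, assuming for concreteness that $U(x) = e^x$ and that $X, Y$ are jointly Gaussian (all numerical values are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)

# Concrete smooth U chosen only for illustration; for exp, U'' = U.
U = np.exp
U2 = np.exp

mu_x, mu_y = 0.3, 0.1        # illustrative means
sig_x, sig_y = 0.2, 0.15     # illustrative standard deviations

# Second-order Taylor approximation of E[theta]; note: no covariance term.
taylor = (U(mu_x) + 0.5 * U2(mu_x) * sig_x**2) - (U(mu_y) + 0.5 * U2(mu_y) * sig_y**2)

for rho in (-0.9, 0.0, 0.9):
    cov = [[sig_x**2, rho * sig_x * sig_y],
           [rho * sig_x * sig_y, sig_y**2]]
    x, y = rng.multivariate_normal([mu_x, mu_y], cov, size=1_000_000).T
    mc = np.mean(U(x) - U(y))
    print(f"rho = {rho:+.1f}   Monte Carlo = {mc:.5f}   Taylor = {taylor:.5f}")
```

The Monte Carlo estimate barely moves as the correlation changes, which already suggests that $\operatorname{Cov}[X,Y]$ should not appear.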

To be concrete, I am comparing this result to the second-order Taylor expansion of the expected value of a general non-linear function of two variables, with the derivatives evaluated at $(\mu_X, \mu_Y)$:
$$
\mathbb{E}[\theta(X, Y)] \approx \theta(\mu_X, \mu_Y) + \frac{1}{2}\left[ \partial_{X}^2\theta\,\sigma_{X}^2 + 2\,\partial_X\partial_Y\theta\,\operatorname{Cov}[X, Y] + \partial_{Y}^2\theta\,\sigma^2_{Y} \right]\ ;
$$

substituting $\theta = U(X) - U(Y)$ into this formula gives the result above if and only if $\partial_X\partial_Y \theta = 0$. But can this be true if $X$ and $Y$ are correlated? The answer seems to be in the affirmative, but I am having trouble justifying this to myself. Any intuition would be appreciated.
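
For what it's worth, a short symbolic check (a sketch using SymPy, with $U$ left as an unspecified smooth function) does give an identically zero mixed partial:

```python
import sympy as sp

x, y = sp.symbols('x y')
U = sp.Function('U')          # generic, unspecified smooth function

theta = U(x) - U(y)

# Mixed partial derivative: differentiate with respect to x, then y.
cross = sp.diff(theta, x, y)
print(cross)                  # prints 0
```

So the cross term in the two-variable formula multiplies a derivative that vanishes, and the covariance never gets a chance to contribute.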

Best Answer

Expectation is linear, and therefore $$ \mathbb{E}[\theta] = \mathbb{E}[U(X)]-\mathbb{E}[U(Y)] $$ regardless of whether $X$ and $Y$ are independent. They could be arbitrarily correlated; this would not change the expectation.

Note that, to answer your last specific point, with your $\theta$ you do have $$ \partial_X\partial_Y \theta = \partial_X\partial_Y U(X) - \partial_X\partial_Y U(Y) = 0 - \partial_X\partial_Y U(Y) = 0\ , $$ since $U(X)$ does not depend on $Y$ and $\partial_Y U(Y)$ does not depend on $X$.
