Covariance of a standard normal variable

Tags: covariance, independence, normal distribution

Let $X$ be a standard normal random variable. Another random variable $Y$ is determined as follows: we flip a fair coin (independent of $X$). In case of Heads, we let $Y = X$; in case of Tails, we let $Y = -X$.

Compute $\mathrm{Cov}(X,Y)$. Are $X$ and $Y$ independent?

Getting Heads or Tails is equally likely, so each has probability $1/2$.

If Heads (with probability $1/2$, fair coin):

$\mathrm{Cov}(X,Y) = E[XY] - E[X]E[Y] = E[XY] = E[X^2] = \mathrm{var}(X) + (E[X])^2 = \mathrm{var}(X) = 1$

If Tails (with probability $1/2$):

$\mathrm{Cov}(X,Y) = E[-X^2] = -E[X^2] = -\mathrm{var}(X) = -1$

$\mathrm{Cov}(X,Y) = \tfrac{1}{2}\cdot 1 + \tfrac{1}{2}\cdot(-1) = 0$.
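
A quick Monte Carlo sketch to sanity-check this value (this uses NumPy; the seed, sample size, and variable names are only illustrative):

```python
# Simulate (X, Y) and estimate Cov(X, Y); it should come out close to 0.
import numpy as np

rng = np.random.default_rng(0)            # arbitrary seed
n = 1_000_000

x = rng.standard_normal(n)                # X ~ N(0, 1)
heads = rng.integers(0, 2, size=n) == 1   # fair coin, independent of X
y = np.where(heads, x, -x)                # Heads: Y = X, Tails: Y = -X

print(np.cov(x, y)[0, 1])                 # close to 0, matching the calculation above
```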

$X$ and $Y$ are independent if $P(Y \mid X) = P(Y)$. If we know $x$, we know $y$ is either $\pm x$, so $X$ gives us some useful information about $Y$. Therefore $X$ and $Y$ are not independent?
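
One way to make this precise: if $X$ and $Y$ were independent, then $X^2$ and $Y^2$ would be independent as well, so we would need
$$E[X^2 Y^2] = E[X^2]\,E[Y^2] = 1,$$
but $X^2 Y^2 = X^4$ regardless of the coin, and $E[X^4] = 3 \neq 1$ for a standard normal.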

On the other hand, I get a different answer with a contrasting approach. Since $X$ is a standard normal r.v., it is symmetric around $0$ and has the following PDF:

$$f_X(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$$

Now, if the coin toss happens to be Heads (with probability $1/2$), then the PDF of $Y$ is exactly the same as above. What happens, however, if the result is Tails?

Given that $X$ is symmetric, $Y=-X$ is exactly the same as $Y=X$, which can be shown as follows:

$$f_Y(-x) = \frac{1}{\sqrt{2\pi}}\, e^{-(-x)^2/2} = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2} = f_Y(x) = f_X(x)$$

The above then implies that $Y=X$ no matter the result of the coin toss, so $Y$ is standard normal and $\mathrm{Cov}(X,Y) = \mathrm{Cov}(X,X) = \mathrm{var}(X) = 1$, as $X$ is standard normal. Finally, since $X$ and $Y$ are identical, $X$ and $Y$ are dependent.

The question, then: is $\mathrm{Cov}(X,Y)$ equal to $1$ or $0$?

Best Answer

Similar to your related question (Probability of a standard normal random variables), you simply need to make your conditioning on the external source of randomness (say, $C$, being the independent, random coin flip) much more explicit. Then you will find that your first set of calculations is correct. Note that without this conditioning, $Y$ does not take on any fixed formula yet. To illustrate, we start with
$$
\mathrm{Cov}(X, Y) = \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y].
$$
This is always true for any two random variables $X$, $Y$. However, we want to condition on $C$ and use the law of total expectation so that we have a concrete formula for what $Y$ is. For example, for the first term,
$$
\begin{align}
\mathbb{E}[XY] &= \frac{1}{2} \mathbb{E}[XY \mid C = \text{"Heads"}] + \frac{1}{2} \mathbb{E}[XY \mid C = \text{"Tails"}] \\
&= \frac{1}{2} \mathbb{E}[X^2] + \frac{1}{2} \mathbb{E}[-X^2] = 0.
\end{align}
$$
The remaining computations proceed identically. Now your intuition and reasoning that $X$ and $Y$ are not independent (even though they are uncorrelated) is perfectly correct.
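
Spelling out the remaining term: conditioning on $C$ in the same way gives
$$
\begin{align}
\mathbb{E}[Y] &= \frac{1}{2} \mathbb{E}[Y \mid C = \text{"Heads"}] + \frac{1}{2} \mathbb{E}[Y \mid C = \text{"Tails"}] \\
&= \frac{1}{2} \mathbb{E}[X] + \frac{1}{2} \mathbb{E}[-X] = 0,
\end{align}
$$
so that, together with $\mathbb{E}[XY] = 0$, we indeed get $\mathrm{Cov}(X, Y) = 0$.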

As for the density of $Y$, the correct computation follows again from performing the conditioning correctly:
$$
\begin{align}
f_Y(x) &= \frac{1}{2} f_{Y \mid C = \text{"Heads"}}(x) + \frac{1}{2} f_{Y \mid C = \text{"Tails"}}(x) \\
&= \frac{1}{2} f_X(x) + \frac{1}{2} f_{-X}(x) = f_X(x).
\end{align}
$$
(In the end we simply use the symmetry of the standard normal random variable $X$.) Note that $f_Y(x) = f_X(x)$ does not mean that $X = Y$; it simply means they have the same density, but as our previous computations reveal, they are intricately linked by the external coin toss.
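
A short simulation sketch illustrates the distinction (again NumPy, with illustrative names and an arbitrary seed): $Y$'s empirical marginal matches the standard normal, yet $Y$ coincides with $X$ on only about half of the draws.

```python
# Y has the same marginal distribution as X, but Y == X only when the coin is Heads.
import numpy as np

rng = np.random.default_rng(1)            # arbitrary seed
n = 1_000_000

x = rng.standard_normal(n)                # X ~ N(0, 1)
heads = rng.integers(0, 2, size=n) == 1   # fair coin, independent of X
y = np.where(heads, x, -x)                # Heads: Y = X, Tails: Y = -X

print(y.mean(), y.std())                  # ~0 and ~1: Y's marginal looks standard normal
print(np.mean(y == x))                    # ~0.5: Y is not identically equal to X
print(np.allclose(np.abs(y), np.abs(x)))  # True: |Y| always equals |X|, so X and Y are dependent
```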
