Exam question: continuous random variables

covariance, probability, random variables

Let $U$ be uniformly distributed on $[0, 1]$, and let $X$ and $Y$ be arbitrary non-negative continuous random variables with finite mean and variance. $U$, $X$ and $Y$ are assumed to be independent. We define
\begin{equation}
Z \sim
\begin{cases}
X \text{ with probability } U\\
Y \text{ with probability } 1-U,
\end{cases}
\end{equation}

meaning that, conditionally on $U = u$, $Z$ is equal to $X$, where $X$ has some distribution function $F_X$, with probability $u$, or $Z$ is equal to $Y$, where $Y$ has some distribution function $F_Y$, with probability $1-u$.

a) Find $\mathbb{E}[Z]$ in terms of $\mathbb{E}[X]$ and $\mathbb{E}[Y]$.

b) Given $Z = Y$, find the probability that $U > \frac{1}{2}$.

Assume from now on that $X \sim \mathcal{N}(1,1)$ and $Y \sim \mathcal{N}(-1,1)$.

c) Find Cov$(Z, X)$. Are $Z$ and $X$ independent?

d) Suppose that, instead of $U$ being uniform, we take $U = 1$ with probability $p$ and $U = 0$ with probability $1 - p$, for some $p \in [0, 1]$. Compute Var$(Z)$.

The main issue I have with this problem is that I do not completely understand the wording. From what I understand, $Z=X$ with probability $U$ and $Z=Y$ with probability $1-U$. So then for a) I get
\begin{equation}
\mathbb{E}[Z] \sim
\begin{cases}
\mathbb{E}[X] \text{ with probability } U\\
\mathbb{E}[Y] \text { with probability } 1-U
\end{cases}
\end{equation}

But to me this solution seems too simple. Similarly, for b) I interpret the question as asking to find $\mathbb{P}(U>\frac{1}{2} \mid Y=Z)$. For c) I used my simple result from a) to determine that Cov$(Z,X)=0$. But since this does not imply independence, and my understanding of the question is so limited, I do not know how to continue.

Best Answer

We have $Z = X$ with probability $U$ and $Z = Y$ with probability $1-U$.

We would like to compute the expectation of $Z$.

For this, we define the variable $Z$ more precisely. Any ambiguity in finding its expectation comes from the fact that the probabilities with which $Z$ equals $X$ or $Y$ are themselves random rather than fixed values.

To do this, set up the following spaces: $\Omega_1 = [0,1]$ and $\Omega_2 = [0,1] \times [0,1]$, each equipped with the Borel sigma-algebra and the Lebesgue measure.

Define the map $U : \Omega_1 \to \mathbb R$, $U(\omega) = \omega$. Then $U$ is a standard uniform random variable.

Define the map $\Gamma : \Omega_2 \to \mathbb R$ as follows: $\Gamma(\omega_1,\omega_2) = 0$ if $U(\omega_1) < \omega_2$, and $\Gamma(\omega_1,\omega_2) = 1$ if $U(\omega_1) \geq \omega_2$.

Consider the "component" maps of $\Gamma$. For each fixed $\omega_1$, the map $\omega_2 \to \Gamma(\omega_1,\omega_2)$ is a Bernoulli random variable i.e. $1$ with probability $\omega_1$ and $0$ otherwise. On the other hand, the first component of $\Gamma$ is uniform.

Now, with $X, Y$ independent and defined on some other probability space $\Omega_3$, I define, on the product space $\Omega_2 \times \Omega_3$, $Z(\omega_1,\omega_2,\omega_3) = X(\omega_3)\Gamma(\omega_1,\omega_2) + Y(\omega_3)(1-\Gamma(\omega_1,\omega_2))$.

Check that $Z = X$ whenever $\Gamma(\omega_1,\omega_2) = 1$, i.e. (given $\omega_1$) with probability $\omega_1 = U(\omega_1)$, which is exactly what "with probability $U$" means. Otherwise $Z = Y$.

Thus, $Z$ is the random variable we want.
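As a quick sanity check that this construction matches the distribution described in the problem (with $F_X$ and $F_Y$ the distribution functions of $X$ and $Y$): conditionally on $U = u$, the variable $\Gamma$ is Bernoulli($u$) and independent of $X, Y$, so

$$ P(Z \le z \mid U = u) = u\,F_X(z) + (1-u)\,F_Y(z), \qquad F_Z(z) = \int_0^1 \bigl(u\,F_X(z) + (1-u)\,F_Y(z)\bigr)\,du = \frac{F_X(z) + F_Y(z)}{2}. $$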


Now things become easy. For the first part, use the law of total expectation, conditioning on $\Gamma$. Note that $E[Z \mid \Gamma = 1] = E[X]$ and $E[Z \mid \Gamma = 0] = E[Y]$, since $X$ and $Y$ are independent of $\Gamma$. Thus, we simply have: $$ E[Z] = E[Z \mid \Gamma = 1] P(\Gamma = 1) + E[Z \mid \Gamma = 0]P(\Gamma = 0) = \frac{E[X] + E[Y]}{2} $$

once you note that $P(\Gamma = 0) = P(\Gamma = 1) = \frac 12$, which follows fairly easily from the definition.
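To spell that last claim out (a short check using the product structure of $\Omega_2$): $\Gamma = 1$ exactly when $\omega_2 \le \omega_1$, so

$$ P(\Gamma = 1) = \int_0^1 \int_0^1 \mathbf{1}\{\omega_2 \le \omega_1\}\, d\omega_2\, d\omega_1 = \int_0^1 \omega_1\, d\omega_1 = \frac 12, $$

and likewise $P(\Gamma = 0) = \frac 12$.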

For b), note that the event $\{Z = Y\}$ equals the event $\{\Gamma = 0\}$ (up to a null set, since $P(X = Y) = 0$ for independent continuous variables). Therefore, all you must do is find $P(U > \frac 12 \mid \Gamma = 0)$. By definition, this equals $\frac{P(U > \frac 12 , \Gamma = 0)}{P(\Gamma = 0)}$. The denominator is $\frac 12$.

For the numerator, note that if $U = u$ then $\Gamma = 0$ with probability $1-u$. Therefore, we have: $$ P\left(U > \tfrac 12 , \Gamma = 0\right) = \int_{1/2}^1 (1-u)\,du $$

which you can calculate.
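Carrying the calculation through:

$$ \int_{1/2}^1 (1-u)\,du = \frac 18, \qquad \text{so} \qquad P\left(U > \tfrac 12 \,\middle|\, Z = Y\right) = \frac{1/8}{1/2} = \frac 14. $$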


For c), note that $E[Z] = 0$ from the formula in a). Now, note that $XZ = X^2$ with probability $U$ and $XZ = XY$ with probability $1-U$, so running the analysis we did with $Z$, with $XZ$ instead (because $X^2$ and $XY$ are both independent of $\Gamma$), tells you that $E[XZ] = \frac{E[X^2] + E[XY]}{2} = \frac{2 + (-1)}{2} = \frac 12$, since $E[X^2] = \operatorname{Var}(X) + E[X]^2 = 2$ and $E[XY] = E[X]E[Y] = -1$.

Therefore, Cov$(Z,X) = E[XZ] - E[X]E[Z] = \frac 12 - 1 \cdot 0 = \frac 12$. Since the covariance is nonzero, $Z$ and $X$ cannot be independent.
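The same computation in symbols, not relying on the specific normal distributions (only on $U$ being uniform), may be worth recording:

$$ \operatorname{Cov}(Z, X) = \frac{E[X^2] + E[X]E[Y]}{2} - E[X]\cdot\frac{E[X] + E[Y]}{2} = \frac{E[X^2] - E[X]^2}{2} = \frac{\operatorname{Var}(X)}{2}, $$

which equals $\frac 12$ here since $\operatorname{Var}(X) = 1$.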

The dependence can also be seen directly. Consider Borel sets $A, B \subset \mathbb R$. Then, $P(Z \in A) = \frac 12\bigl(P(X \in A) + P(Y \in A)\bigr)$ after conditioning on $\Gamma$. Similarly, $P(Z \in A \mid X \in B) = \frac 12 \bigl(P(Y \in A) + P(X \in A \mid X \in B)\bigr)$ (again after conditioning on $\Gamma$), which in general differs from $P(Z \in A)$. So indeed the two random variables are not independent.
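For a concrete illustration (writing $\Phi$ for the standard normal distribution function), take $A = B = [0, \infty)$: then $P(X \ge 0) = \Phi(1)$ and $P(Y \ge 0) = \Phi(-1)$, so

$$ P(Z \ge 0) = \frac{\Phi(1) + \Phi(-1)}{2} = \frac 12, \qquad P(Z \ge 0 \mid X \ge 0) = \frac{1 + \Phi(-1)}{2} \approx 0.58 \neq \frac 12. $$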

Part d) is much easier: we don't need the $\Gamma$ machinery. Given $X, Y$ independent on some $\Omega_3$, we just have the map $Z : \Omega_3 \times [0,1] \to \mathbb R$ given by $Z(\omega_1,\omega_2) = X(\omega_1)$ if $\omega_2 \leq p$, and $Y(\omega_1)$ otherwise. This definition matches the requirements exactly: $Z = X$ with probability $p$ and $Z = Y$ with probability $1-p$. Note that $Z^2$ has an analogous description, equalling $X^2$ with probability $p$ and $Y^2$ with probability $1-p$.

Find $E[Z]$, then similarly find $E[Z^2]$, and finish with $\operatorname{Var}(Z) = E[Z^2] - E[Z]^2$.
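In case it helps, a sketch of the final computation with $X \sim \mathcal N(1,1)$ and $Y \sim \mathcal N(-1,1)$, so that $E[X^2] = E[Y^2] = 2$:

$$ E[Z] = p \cdot 1 + (1-p)(-1) = 2p - 1, \qquad E[Z^2] = p\,E[X^2] + (1-p)\,E[Y^2] = 2, $$

$$ \operatorname{Var}(Z) = E[Z^2] - E[Z]^2 = 2 - (2p-1)^2 = 1 + 4p(1-p). $$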
