Expected value of $X-Y$, both uniformly distributed

Tags: expected value, probability theory, random variables, uniform distribution

Let $X,Y$ be two independent random variables, each uniformly distributed on $[0,1]$. Calculate the expected value $\mathbb{E}(X-Y)$.


Actually, it seems pretty easy as we can simply use the properties of the expected value and get
$$
\mathbb{E}(X-Y)=\mathbb{E}(X)-\mathbb{E}(Y)=0
$$

because both random variables obey the same distribution. If I define the new random variable $Z:=X-Y$ and compute its probability density function (pdf) by the convolution formula it should yield the same result. We know that the pdfs $f_X(x)=f_Y(y)=1$ for all $0\leq x,y\leq 1$, so

$$
f_Z(z)=\int\limits_{-\infty}^{\infty}f_X(x)f_Y(z-x)~dx=\int\limits_{0}^z 1\cdot 1 ~dx=z.
$$

The range of $Z$ is $[-1,1]$, so we get
$$
\mathbb{E}(Z)=\int\limits_{-1}^1z\cdot z ~dz=\frac{z^3}{3}\Big|_{-1}^1=\frac{1}{3}+\frac{1}{3}=\frac{2}{3}\neq 0.
$$

This contradicts the result from the beginning. Where is my mistake?
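As a quick sanity check (not part of the original question), a Monte Carlo simulation estimates $\mathbb{E}(X-Y)$ directly; the sample size and seed are arbitrary choices:

```python
import random

# Monte Carlo estimate of E[X - Y] for X, Y independent Uniform(0, 1).
# Sample size and seed are illustrative choices.
random.seed(0)
n = 10**6
estimate = sum(random.random() - random.random() for _ in range(n)) / n
print(estimate)  # close to 0, supporting the linearity argument
```

The estimate comes out near $0$, so the error must be in the convolution computation, not in the linearity argument.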

Best Answer

You are convolving $X$ and $-Y$, but you are using the formula for $X+Y$ instead of $X-Y$. In particular, you should use the pdf of $-Y$, which is supported on $[-1,0]$, instead of that of $Y$. So in fact, the range of integration will be from $z$ to $1$, giving $1-z$, when $z$ is positive, and from $0$ to $1+z$, giving $1+z$, when $z$ is negative. Hence
$$
f_Z(z)=\begin{cases} 1+z, & -1\leq z\leq 0,\\ 1-z, & 0\leq z\leq 1.\end{cases}
$$
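A minimal numerical check of this triangular density (a sketch, not from the original answer; the grid size is an arbitrary choice) confirms that it integrates to $1$ and that $\int z\,f_Z(z)\,dz$ vanishes:

```python
# Triangular density of Z = X - Y on [-1, 1], as derived above.
def f_Z(z):
    return 1 + z if z <= 0 else 1 - z

# Midpoint rule on [-1, 1]; n is an arbitrary grid size.
n = 100_000
h = 2.0 / n
mids = [-1 + (i + 0.5) * h for i in range(n)]
mass = sum(f_Z(z) for z in mids) * h        # total probability, should be ~1
mean = sum(z * f_Z(z) for z in mids) * h    # E[Z], should be ~0
print(mass, mean)
```

Since $z\,f_Z(z)$ is an odd function on a symmetric interval, the expectation is exactly $0$.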

Now compute the expectation and you'll get $0$.
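Explicitly, splitting the integral at $0$:
$$
\mathbb{E}(Z)=\int\limits_{-1}^{0} z(1+z)~dz+\int\limits_{0}^{1} z(1-z)~dz=\left(-\frac{1}{2}+\frac{1}{3}\right)+\left(\frac{1}{2}-\frac{1}{3}\right)=0,
$$
in agreement with the linearity argument.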
