[Math] Given a function, calculate MMSE and LMMSE

conditional probability, mean square error, probability, probability distributions

Let $X = \frac{1}{1+U}$ where $U$ is uniformly distributed over $[0,1]$. I need to evaluate $E[X\mid U]$ and $\hat{E}[X\mid U]$ and then calculate the MSEs, $E[(X-E[X\mid U])^2]$ and $E[(X-\hat{E}[X\mid U])^2]$.


I know that, in general, the pdf of a uniform distribution is $\frac{1}{b-a}$ for values in $[a,b]$ (and zero elsewhere), and the mean is $\frac{a+b}{2}$.

In general, the minimum mean square error estimator
is simply the conditional mean,
\begin{align}
E[X\mid Y=y] &= \int x f_{X\mid Y}(x\mid y) \, dx \\
f_{X\mid Y}(x\mid y) &:= \frac{f_{XY}(x,y)}{f_Y(y)}\\
f_Y(y) &= \int_{-\infty}^\infty f_{XY}(x,y) \, dx
\end{align}

In general, the linear
minimum mean square error (LMMSE) estimator is defined as
\begin{align}
\hat{E}[X\mid Y=y] &= \mathbb E[X] + \operatorname{Cov}(X,Y)\operatorname{Cov}(Y)^{-1}(y-E[Y])
\end{align}


I am having problems formulating the problem function, $X = \frac{1}{1+U}$, in terms of the joint and conditional pdf.

Best Answer

Since $X = \displaystyle \frac{1}{1+U}$, the conditional expectation $E[X\mid U = \alpha]$, the expected value of $X$ given that $U = \alpha$, is the expected value of $\displaystyle \frac{1}{1+U}$ given that $U = \alpha$, and is thus just $\displaystyle \frac{1}{1+\alpha}$. Thus, $$E[X \mid U] = \frac{1}{1+U}$$ is the MMSE estimator for $X$ given $U$. This varies from $1$ when $U = 0$ to $\frac{1}{2}$ when $U = 1$.
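Since $X$ is a deterministic function of $U$, the MMSE estimator reproduces $X$ exactly, so its MSE is zero. A quick Monte Carlo sketch in Python (illustrative only, not part of the derivation) confirms this:

```python
import random

# X = 1/(1+U) with U ~ Uniform[0, 1]; the MMSE estimator E[X | U] = 1/(1+U)
# is the same function of U, so the squared error is identically zero.
random.seed(0)
errors = []
for _ in range(10_000):
    u = random.random()          # draw U ~ Uniform[0, 1]
    x = 1.0 / (1.0 + u)          # realize X = 1/(1+U)
    estimate = 1.0 / (1.0 + u)   # MMSE estimate E[X | U] at U = u
    errors.append((x - estimate) ** 2)

mse = sum(errors) / len(errors)
print(mse)  # 0.0 exactly: estimating a function of U from U is error-free
```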

For the linear minimum-mean-square-error (LMMSE) estimator, you need to find $E[X]$ which is just $$E[X] = E[E[X \mid U]] = E\left[\frac{1}{1+U}\right] = \int_{-\infty}^\infty \frac{1}{1+u}f_U(u)\,\mathrm du = \int_0^1 \frac{\mathrm du}{1+u}$$ whose value you should work out for yourself.

Write down $\displaystyle E[X] = \int_0^1 \frac{\mathrm du}{1+u} = \cdots \quad$ after computing the integral shown above and putting its value where I have written $\cdots$. Draw a box around this so you can find the numerical value of $E[X]$ again easily. You will need it in the future.

Next, $$\operatorname{cov}(X,U) = E[XU] - E[X]E[U] = E\left[\frac{U}{1+U}\right] - E[X]E[U]$$ where all the quantities on the right are readily computed.

Repeat slowly three times:

  1. I can compute $E\left[\frac{U}{1+U}\right]$ using the law of the unconscious statistician as $$E\left[\frac{U}{1+U}\right] = \int_{-\infty}^{+\infty} \frac{u}{1+u}f_U(u)\,\mathrm du = \int_0^1 \frac{u}{1+u}\,\mathrm du = \Bigl[u - \ln(1+u)\Bigr]_0^1 = 1 - \ln(2).$$

  2. I do not need to compute $E[X]$ again because I already found its value and I have saved it for future use.

  3. I will not write $E[X] = \frac{1}{1+U}$ (as I did in the comments) and needlessly confuse myself because of #2 above. I already know the numerical value of $E[X]$, and I also understand that this real constant cannot possibly equal $\frac{1}{1+U}$ which is a random variable.

  4. I already know that $E[U] = \frac{1}{2}$ and so I don't need to find it again.
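The integral in item 1 can be double-checked numerically; here is a short midpoint-rule sketch (the grid size is an arbitrary choice):

```python
import math

# Numerically approximate E[U/(1+U)] = ∫_0^1 u/(1+u) du and compare
# against the closed form 1 - ln(2) obtained by hand above.
n = 100_000
total = 0.0
for i in range(n):
    u = (i + 0.5) / n            # midpoint of the i-th subinterval of [0, 1]
    total += u / (1.0 + u)
numeric = total / n              # midpoint-rule estimate of the integral

print(abs(numeric - (1 - math.log(2))))  # tiny: the hand computation checks out
```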

Now, compute $\operatorname{cov}(X,U) = E\left[\frac{U}{1+U}\right] - E[X]E[U]$ where the three expectations on the right have known numerical values that you have just computed. Still doesn't work? Carry out the instructions in the highlighted text above one more time.

In order to compute the LMMSE estimator, you will also need $\operatorname{var}(U)$ which I hope you can also compute easily (or use a standard formula) to arrive at the answer $\frac{1}{12}$.
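For completeness, the standard-formula route is $\operatorname{var}(U) = E[U^2] - (E[U])^2$ with $E[U^2] = \int_0^1 u^2\,\mathrm du = \frac13$. In code:

```python
# var(U) = E[U^2] - (E[U])^2 for U ~ Uniform[0, 1].
e_u = 1 / 2                      # E[U]
e_u2 = 1 / 3                     # E[U^2] = ∫_0^1 u^2 du
var_u = e_u2 - e_u ** 2          # 1/3 - 1/4 = 1/12
print(abs(var_u - 1 / 12))       # 0 up to float rounding
```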

Now put it all together. You should get that the LMMSE estimator is a straight line $au+b$ with negative slope that intersects the hyperbola $\frac{1}{1+u}$ (the MMSE estimator) in two places.
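If you want to check this final picture numerically, the sketch below assembles the pieces. It uses the closed forms of the integrals computed above, so treat it as a cross-check on your hand computation, not a replacement for it:

```python
import math

# Assemble the LMMSE line a*u + b from the moments derived above.
e_x = math.log(2)                    # E[X] = ∫_0^1 du/(1+u) = ln 2
e_u = 0.5                            # E[U]
e_xu = 1 - math.log(2)               # E[XU] = E[U/(1+U)] = 1 - ln 2
var_u = 1 / 12                       # var(U)

cov_xu = e_xu - e_x * e_u            # cov(X, U) = E[XU] - E[X]E[U]
a = cov_xu / var_u                   # slope of the LMMSE line
b = e_x - a * e_u                    # intercept (line passes through (E[U], E[X]))

# Count sign changes of (a*u + b) - 1/(1+u) on a grid over [0, 1]:
# each sign change marks an intersection of the line with the hyperbola.
crossings = 0
prev = b - 1.0                       # value of the difference at u = 0
for i in range(1, 1001):
    u = i / 1000
    cur = a * u + b - 1.0 / (1.0 + u)
    if prev * cur < 0:
        crossings += 1
    prev = cur

print(a < 0)      # True: the slope is negative, as claimed
print(crossings)  # 2: the line meets the hyperbola twice inside [0, 1]
```

Since $\frac{1}{1+u}$ is convex, the difference between the line and the hyperbola is concave, so two intersections is the most that can occur.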
