[Math] Conditional expectation, max, min of random variables

conditional-expectation, probability, probability-distributions, probability-theory, uniform-distribution

We are given two independent random variables $A, B$ with uniform distribution on $[0,1]$. We define new random variables $X = \max (A,B)$ and $Y = \min (A,B)$.

Find $\mathbb{E}(X\mid Y)$

(defined as the random variable, unique up to a set of measure zero, which satisfies $\sigma (\mathbb{E}(X\mid Y)) \subset \sigma(Y)$ and $\forall B \in \sigma(Y) : \int_B X \, dP = \int_B \mathbb{E}(X\mid Y) \, dP$).

I've done this by finding distribution function of $(X, Y)$ and then joint density function $f_{XY}(x,y)$ and then using the formula $$\mathbb{E}(X\mid Y) = \frac{1}{f_Y(Y)}\int_{\mathbb{R}}x f_{XY}(x,Y) \, dx$$

I wonder if there is a clearer, shorter, less time-consuming way of dealing with this problem.

Best Answer

When $A$ and $B$ are independent, you can do this in your head. In the general dependent case, instead of finding $f_{XY}(x,y)$, you can compute $E[X\mid Y=y]$ by integrating $f_{AB}(a,b)$ over the two line segments on which $\min(a,b)=y$:
$$\{(a, y) : a\in [y,1]\}\cup \{(y,b) : b \in [y,1]\}$$

So: $$ E[X\mid Y=y] = \frac{\int_{y}^1 a f_{AB}(a,y)\,da + \int_{y}^1 b f_{AB}(y,b)\,db}{\int_{y}^1 f_{AB}(a,y)\,da + \int_{y}^1 f_{AB}(y,b)\,db} $$

If $A$ and $B$ are independent then $f_{AB}(a,b)=1$ for all $a,b \in [0,1]$ and the above integrals give $E[X|Y=y]=(1+y)/2$.
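Spelling out the integrals in the independent case (where $f_{AB}\equiv 1$ on the unit square, so the two segments contribute symmetrically):

$$ E[X\mid Y=y] = \frac{\int_y^1 a\,da + \int_y^1 b\,db}{\int_y^1 da + \int_y^1 db} = \frac{1-y^2}{2(1-y)} = \frac{1+y}{2}. $$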


Of course, a more intuitive way in the independent case is to just observe that, given the min is $y$, the max is uniformly distributed over $[y,1]$, so its mean is the midpoint $(1+y)/2$.
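As a sanity check, one can estimate $E[X\mid Y=y]$ by simulation: draw many independent uniform pairs, keep those whose minimum falls in a narrow window around $y$, and average the maxima. A minimal sketch (the helper name `conditional_mean_max` and the window width are my own choices, not from the answer):

```python
import random

def conditional_mean_max(y, width=0.01, n=200_000, seed=0):
    """Estimate E[max(A,B) | min(A,B) ~ y] for A, B iid Uniform[0,1]
    by averaging max(A,B) over samples with min(A,B) in [y, y + width)."""
    rng = random.Random(seed)
    total, count = 0.0, 0
    for _ in range(n):
        a, b = rng.random(), rng.random()
        lo, hi = (a, b) if a < b else (b, a)
        if y <= lo < y + width:
            total += hi
            count += 1
    return total / count

# Should be close to (1 + 0.4) / 2 = 0.7
print(conditional_mean_max(0.4))
```

The estimate carries Monte Carlo noise of order a few thousandths here, since only the samples landing in the window (roughly $2\,\text{width}\,(1-y)$ of them) are used.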