[Math] Joint distribution of $(X,\min(X,Y))$ for $X$ and $Y$ i.i.d. uniform on $(0,1)$

Tags: probability, probability-distributions

Take two independent random variables $X_1, X_2$, identically uniformly distributed over $[0,1]$, and let $U = \min(X_1, X_2)$. There is a standard procedure to find the distribution of $U$: $p_U(u) = 2(1-u)$. What I would like to find is the joint distribution of $X_1$ and $U$. It is probably a standard question, and a reference is fine for me: I just couldn't find it by myself. What I tried is something informal: $P(X_1 = x, U = u) = P(X_1 = x \mid U = u)\,2(1-u)$. But this is something strange, since, by symmetry, I would say that $P(X_1 = u \mid U = u) = 0.5$, and so my guess for $P(X_1 = x \mid U = u)$ is $\frac{1}{2} 1_{\{u\}}(x) + \frac{1}{2(1-u)} 1_{(u,1]}(x)$, which looks somehow wrong.

Best Answer

A measure-theoretic solution:

Let $F$ denote the distribution function of $X$. (In the question's notation, $X = X_1$, $Y = X_2$, and $U = \min(X, Y)$.) Intuition suggests that the conditional probability of the event $[U > u]$ given the $\sigma$-field $\sigma(X)$ is given by: $$P(U > u \mid X)(\omega) = I_{[X > u]}(\omega)(1 - F(u)) \tag{$*$}.$$ (A rigorous proof of $(*)$ is given at the end of this answer.) Intuitively, for fixed $u$, given the observation of $X(\omega)$: if $X(\omega) \leq u$, there is no hope that $U(\omega)$, which is at most $X(\omega)$, can exceed $u$; this explains the indicator $I_{[X > u]}$ in $(*)$. If $X(\omega) > u$, then $U(\omega) > u$ if and only if $Y(\omega) > u$, which by the independence of $X$ and $Y$ has probability $1 - F(u)$.
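As a quick sanity check (my addition, not part of the original argument), taking expectations on both sides of $(*)$ recovers the marginal distribution of $U$ quoted in the question: $$P(U > u) = E\bigl[P(U > u \mid X)\bigr] = (1 - F(u))\,E\bigl[I_{[X > u]}\bigr] = (1 - F(u))^2.$$ In the uniform case $F(u) = u$ on $[0, 1]$, this gives $P(U \leq u) = 1 - (1 - u)^2$, whose derivative is exactly the density $p_U(u) = 2(1 - u)$ mentioned in the question.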

Now for any $x \in \mathbb{R}^1$, since $[X \leq x] \in \sigma(X)$, it follows by $(*)$ that \begin{align} & P(U > u, X \leq x) \\ = & \int_{X \leq x} P(U > u|X)(\omega) dP \qquad \text{by the definition of conditional probability} \\ = & \int_{X \leq x} (1 - F(u))I_{[X > u]}(\omega) dP \qquad \text{by $(*)$} \\ = & \int_{(-\infty, x]}(1 - F(u))I_{(u, +\infty)}(v) dF(v) \qquad \text{by the change of variable formula}\\ = & \begin{cases} 0 & \text{if $x \leq u$}; \\ (1 - F(u))(F(x) - F(u)) & \text{if $x > u$}. \end{cases} \end{align} Combining this with $P(X \leq x) = F(x)$, it follows that \begin{align} & P(U \leq u, X \leq x) \\ = & P(X \leq x) - P(U > u, X \leq x) \\ = & \begin{cases} F(x) & \text{if $x \leq u$}; \\ F(x) - (1 - F(u))(F(x) - F(u)) & \text{if $x > u$}. \end{cases} \tag{$**$} \end{align}
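Specializing $(**)$ to the question's setting, where $F(t) = t$ for $t \in [0, 1]$ (this specialization is my addition, but it follows immediately from $(**)$), gives the explicit joint CDF $$P(U \leq u, X \leq x) = \begin{cases} x & \text{if $0 \leq x \leq u \leq 1$}; \\ x - (1 - u)(x - u) = u(1 + x - u) & \text{if $0 \leq u < x \leq 1$}. \end{cases}$$ Setting $x = 1$ recovers $P(U \leq u) = u(2 - u) = 1 - (1 - u)^2$, as expected.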


Proof of $(*)$: Clearly, the right-hand side of $(*)$ is $\sigma(X)$-measurable, in view of $[X > u] \in \sigma(X)$. In addition, for every $H \in \mathscr{R}^1$, by the change of variable formula, \begin{align*} & \int_{X \in H} I_{[X > u]}(\omega)(1 - F(u))dP = (1 - F(u))\int_H I_{(u, +\infty)}(x)dF(x) = P(Y > u)P([X \in H] \cap [X > u]) \\ = & P([Y > u] \cap [X > u] \cap [X \in H]) = P([U > u] \cap [X \in H]), \end{align*} where the independence of $X$ and $Y$ is used in the first equality of the second line. Since the sets $[X \in H]$, $H \in \mathscr{R}^1$, exhaust $\sigma(X)$, this proves $(*)$.
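To see the chain of equalities at work in a concrete instance (a supplementary check of mine, not part of the original proof), take the uniform case and $H = (a, b]$ with $u < a < b \leq 1$: $$\int_{X \in H} I_{[X > u]}(\omega)(1 - F(u))\, dP = (1 - u)(b - a) = P(Y > u)P(a < X \leq b) = P(U > u,\ a < X \leq b),$$ the last equality holding because $X \in (a, b] \subset (u, 1]$ already forces $X > u$.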


Some poster was curious why I reported the joint CDF rather than a joint density (i.e., a pdf). Well, the fact is: this is an example of a random vector that does not have a joint density!

Let $A = \{(x, u): x = u\}$ and let $\mu$ be the probability measure on $\mathbb{R}^2$ induced by $(X, U)$. It can be seen that $$\mu(A) = P(X = U) = P(X \leq Y) = \frac{1}{2}.$$ Therefore a positive probability mass is concentrated on a set of Lebesgue measure $0$, so it is impossible for $(X, U)$ to have a joint density $f$ with respect to the planar Lebesgue measure $\lambda_2$ (otherwise, $\mu(A) = \iint_A f(x, u)\, dx\, du = 0$, a contradiction). Therefore the clearest way to describe the joint distribution of $(X, U)$ is still through $(**)$.
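If a numerical confirmation helps, here is a small Monte Carlo sketch (my own illustration in NumPy; the check point $(u_0, x_0) = (0.3, 0.7)$ is arbitrary) that verifies both the mass $1/2$ on the diagonal and the formula $(**)$ in the uniform case:

```python
import numpy as np

# Monte Carlo check of the joint distribution of (X, U) with U = min(X, Y),
# where X, Y are i.i.d. Uniform(0, 1).  (Illustrative sketch, not a proof.)
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(size=n)
y = rng.uniform(size=n)
u = np.minimum(x, y)

# Mass on the diagonal {x = u}: X equals U exactly when X <= Y, so this
# should be close to 1/2 -- the singular part that rules out a joint density.
print("P(X = U) ~", np.mean(x == u))

# Check (**) with F(t) = t at an arbitrary point (u0, x0) with x0 > u0.
u0, x0 = 0.3, 0.7
empirical = np.mean((u <= u0) & (x <= x0))
theoretical = x0 - (1 - u0) * (x0 - u0)
print("empirical:", empirical, "theoretical:", theoretical)
```

The diagonal mass can be detected by exact equality `x == u` because `np.minimum` returns a copy of one of the two coordinates, so no floating-point tolerance is needed.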
