Combinatorial proof of the expected minimum of IID uniform random variables (first order statistic)

combinatorial-proofs, order-statistics, probability-theory, uniform-distribution

Question: Suppose that $X_1, \ldots, X_n$ are IID uniform random variables in $[0, 1]$. Prove that $\mathbb{E}[ \min(X_1, \ldots, X_n) ] = \frac{1}{n+1}$. Can you prove it with a "combinatorial proof" (e.g., using a symmetry argument or similar)?

Context: I am aware of a simple proof that first derives the CDF of $\min(X_1, \ldots, X_n)$ and then integrates to obtain its expectation. While simple, I would like to find a proof of the fact without using any calculus (or, if possible, any calculation at all), as such proofs tend to generalize more easily.

Best Answer

The points $X_1,\dots,X_n$ split $[0,1]$ into $n+1$ pieces (almost surely); their lengths are called spacings. These spacings are identically distributed$^*$ and add up to $1$, so by linearity of expectation each of them has expectation $1/(n+1)$. Since $\min(X_1,\dots,X_n)$ is exactly the length of the leftmost spacing, the claim follows; the short computation is written out below.
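
To spell out the linearity-of-expectation step (the spacing notation $S_0,\dots,S_n$ below is mine, not part of the original argument): write $X_{(1)}<\dots<X_{(n)}$ for the ordered values and set
$$S_0 = X_{(1)}, \qquad S_k = X_{(k+1)}-X_{(k)} \ \ (1\le k\le n-1), \qquad S_n = 1-X_{(n)}.$$
Then $S_0+\dots+S_n=1$, so
$$1 = \mathbb{E}\Big[\sum_{k=0}^{n} S_k\Big] = \sum_{k=0}^{n}\mathbb{E}[S_k] = (n+1)\,\mathbb{E}[S_0] = (n+1)\,\mathbb{E}[\min(X_1,\dots,X_n)],$$
using that the $S_k$ are identically distributed. Hence $\mathbb{E}[\min(X_1,\dots,X_n)] = \frac{1}{n+1}$.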


$^*$This is hard to argue rigorously without computation. But maybe the following argument is convincing.

Let $X_{(1)}<\dots<X_{(n)}$ be the ordered values. Conditionally on $X_{(1)}$, the numbers $X_{(2)},\dots,X_{(n)}$ are $n-1$ points chosen uniformly at random (and then ordered) from $[X_{(1)},1]$; similarly, conditionally on $X_{(n)}$, the numbers $X_{(1)},\dots,X_{(n-1)}$ are $n-1$ points chosen uniformly at random (and then ordered) from $[0,X_{(n)}]$. The lengths of these two intervals, $1-X_{(1)}$ and $X_{(n)}$, have the same distribution thanks to the symmetry $x\mapsto 1-x$. So the spacings to the right of $X_{(1)}$ (namely $S_1,\dots,S_n$) and the spacings to the left of $X_{(n)}$ (namely $S_0,\dots,S_{n-1}$) are, position by position, the spacings of $n-1$ uniform points in an interval whose length has the same distribution in both cases; hence $S_{k+1}$ and $S_k$ have the same distribution for every $k$. In other words, any two consecutive spacings, and therefore all $n+1$ of them, have the same distribution.
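
For readers who want a quick numerical sanity check of both claims (the spacings are identically distributed, and the expected minimum is $1/(n+1)$), here is a small Monte Carlo sketch; it assumes NumPy, and the values of `n` and `trials` are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5             # number of uniform points
trials = 200_000  # Monte Carlo sample size (arbitrary)

# Each row is one realization of X_1, ..., X_n.
x = rng.random((trials, n))
x_sorted = np.sort(x, axis=1)

# Spacings: lengths of the n+1 pieces into which the points cut [0, 1].
edges = np.hstack([np.zeros((trials, 1)), x_sorted, np.ones((trials, 1))])
spacings = np.diff(edges, axis=1)   # shape (trials, n+1)

print("empirical mean of each spacing:", spacings.mean(axis=0))
print("empirical E[min]:", x.min(axis=1).mean())
print("theoretical 1/(n+1):", 1 / (n + 1))
```

For $n=5$, every empirical spacing mean should land close to $1/6 \approx 0.1667$, matching the claim.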