Probability – Convergence in Lp Norm of Minimum of Uniform Random Variables

expected-value, probability, uniform-distribution, uniform-convergence

${X_i}$ are i.i.d. with $X_i\sim U\left(0,1\right)$. Prove that $Y_1=\min\left\{X_1,X_2,\ldots,X_n\right\}$ converges to $Y=0$ in $p$-th mean ($L^p$) for every $p\ge1$.

My try and where I got stuck:
$$\lim _{n\to \infty } E\left[\left|Y_1-Y\right|^p\right]=\lim _{n\to \infty } E\left[\left|Y_1\right|^p\right]=\lim _{n\to \infty } E\left[\left|\min\left\{X_1,X_2,\ldots,X_n\right\}\right|^p\right]\\ \le \lim _{n\to \infty } E\left[\left|X_i\right|^p\right]\;\forall i\in \left\{1,2,\ldots,n\right\}=\lim _{n\to \infty }\int _0^1 x^p\,dx=\frac{1}{p+1}\ne 0$$
That is my problem: I don't know how to obtain $0$ here.

Best Answer

If $X_i\sim U\left(0,1\right), i=1, \dots, n$ and are independent, we have the following property:

$$Y_1=\min\left\{X_1,X_2,\ldots,X_n\right\} \sim \text{Beta}(a=1,b=n).$$
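This distributional fact follows in one line from the survival function of the minimum: by independence,

$$P\left(Y_1>y\right)=\prod_{i=1}^{n}P\left(X_i>y\right)=(1-y)^n, \qquad 0\le y\le 1,$$

so $Y_1$ has density $f_{Y_1}(y)=n(1-y)^{n-1}$ on $(0,1)$, which is exactly the $\text{Beta}(1,n)$ density.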

Hence, as the moments of a beta distribution have a known closed form, we have

$$ \mathbb E\left[\left|Y_1-0\right|^p\right]=\mathbb E\left[Y_1^p\right]=\frac{\Gamma(a+b)\,\Gamma(a+p)}{\Gamma(a)\,\Gamma(a+b+p)}= \frac{\Gamma(1+n)\,\Gamma(1+p)}{\Gamma(1)\,\Gamma(1+n+p)}= \\ \frac{n!\,\Gamma(1+p)}{(n+p)\,((n-1)+p)\cdots(1+p)\,\Gamma(1+p)}=\color{blue}{\frac{n!}{(n+p)\,((n-1)+p)\cdots(1+p)}}. \tag {1} $$
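As a quick sanity check of the closed form in $(1)$, one can compare it against a Monte Carlo estimate of $\mathbb E[Y_1^p]$ (a sketch using NumPy; the particular $n$, $p$, sample size, and seed are arbitrary choices of mine):

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(0)
n, p = 10, 2.5

# Empirical E[min(X_1,...,X_n)^p]: simulate many rows of n uniforms,
# take the row-wise minimum, raise to the p-th power, and average.
mins = rng.uniform(size=(200_000, n)).min(axis=1)
empirical = (mins ** p).mean()

# Closed form (1): Gamma(1+n) Gamma(1+p) / Gamma(1+n+p).
exact = gamma(1 + n) * gamma(1 + p) / gamma(1 + n + p)

print(empirical, exact)  # the two should agree to ~3 decimal places
```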

For $p=1$, it becomes $\frac{1}{n+1}$, which tends to $0$ as $n \to \infty$.

For $p>1$, write the product in $(1)$ as

$$\frac{n!}{(n+p)\,((n-1)+p)\cdots(1+p)} = \prod_{k=1}^{n}\frac{k}{k+p} \le \prod_{k=1}^{n}\frac{k}{k+1} = \frac{1}{n+1},$$

using $\frac{k}{k+p}\le\frac{k}{k+1}$ for $p\ge 1$. Hence the last term in $(1)$ tends to zero as $n \to \infty$. This yields the desired result:

$$\lim_{n \to \infty } \mathbb E\left[\left|Y_1-0\right|^p\right]=0$$

for any $p \ge 1$.
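The convergence can also be seen numerically by evaluating the product form of $(1)$ for a fixed $p>1$ and growing $n$ (a small sketch; the values of $p$ and $n$ are arbitrary choices of mine):

```python
from math import prod

# E[Y_1^p] = prod_{k=1}^{n} k/(k+p), the closed form in (1).
# Since k/(k+p) <= k/(k+1) for p >= 1, the product is at most
# 1/(n+1), so it shrinks toward 0 as n grows.
p = 3.0
for n in (10, 100, 1000):
    moment = prod(k / (k + p) for k in range(1, n + 1))
    assert moment <= 1 / (n + 1)
    print(n, moment)
```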
