Convergence in distribution for $Y_n=\min(X_1,X_2,\ldots,X_n)$ with i.i.d. Uniform$(0,1)$ random variables

Tags: convergence-divergence, probability theory, random variables, solution-verification

I have this exercise and would like to check my solutions:

Let $X_1, X_2, X_3, \ldots$ be a sequence of i.i.d. Uniform$(0,1)$ random variables. Define the sequence $Y_n$ as $Y_n=\min(X_1,X_2,\ldots,X_n)$.

Prove the following convergence results independently (i.e., do not deduce the weaker convergence modes from the stronger ones).

$$Y_n \overset{d}{\rightarrow} 0$$

My solutions:

We can write

\begin{align}
F_{Y_n}(y)&=P(Y_n\le y)
\\&=1-P(Y_n>y)
\\&=1-P(X_1>y,X_2>y,\ldots,X_n>y)
\\&=1-P(X_1>y)P(X_2>y)\cdots P(X_n>y)
\end{align}

(since the $X_i$'s are independent)

$=1-(1-F_{X_1}(y))(1-F_{X_2}(y))\cdots(1-F_{X_n}(y))=1-(1-y)^n$.

Therefore, we conclude $\lim_{n\to\infty}F_{Y_n}(y)=\begin{cases} 0 & y \leq 0 \\ 1 & y>0 \end{cases}$.
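Not part of the proof, but here is a quick Monte Carlo sanity check of the closed form $F_{Y_n}(y)=1-(1-y)^n$ derived above (just a sketch, assuming NumPy is available; the helper name `empirical_cdf_of_min` is only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_cdf_of_min(n, y, trials=100_000):
    """Estimate P(min(X_1, ..., X_n) <= y) by simulation, with X_i ~ Uniform(0,1)."""
    mins = rng.uniform(0.0, 1.0, size=(trials, n)).min(axis=1)
    return (mins <= y).mean()

for n in (1, 5, 20):
    for y in (0.05, 0.3, 0.7):
        exact = 1 - (1 - y) ** n  # closed form derived above, valid for y in [0, 1]
        approx = empirical_cdf_of_min(n, y)
        print(f"n={n:2d}, y={y:.2f}: exact={exact:.4f}, simulated={approx:.4f}")
```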

Here I cannot see the convergence in distribution: the CDF of the constant $0$ takes the value $1$ at $y=0$, whereas my limit there is $0$.

Where is my error?

Best Answer

Your solution is almost correct: the equality $\mathbb P(X_1>y)=1-y$ is valid only for $y\in [0,1]$, so your computation of $F_{Y_n}$ holds only there. However, the cases $y<0$ and $y>1$ are trivial.
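Written out in full (combining the trivial cases with your computation), the CDF is

$$F_{Y_n}(y)=\begin{cases} 0, & y<0,\\ 1-(1-y)^n, & 0\le y\le 1,\\ 1, & y>1. \end{cases}$$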

Let $F$ be the cumulative distribution function of the constant random variable equal to $0$ (thus, $F(t)=0$ if $t<0$ and $F(t)=1$ if $t\geqslant 0$). You showed that $F_{Y_n}(y)$ converges to $F(y)$ at each continuity point of $F$, that is, for every $y\neq 0$, since the only discontinuity point of $F$ is at $0$. This is precisely what convergence in distribution requires. As you noticed, the pointwise convergence does fail at $y=0$, but that does not matter, because $0$ is not a continuity point of $F$.
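If a numerical illustration helps, here is a small sketch (assuming Python; the helper `F_Yn` is hypothetical and just implements the piecewise CDF above): for every fixed $y>0$, however small, the values tend to $1$, while for $y\le 0$ they stay at $0$, matching $F$ everywhere except at its discontinuity point $y=0$.

```python
def F_Yn(y, n):
    """CDF of Y_n = min(X_1, ..., X_n) for i.i.d. Uniform(0,1): the piecewise form above."""
    if y < 0:
        return 0.0
    if y > 1:
        return 1.0
    return 1.0 - (1.0 - y) ** n

for y in (-0.1, 0.0, 0.01, 0.5):
    print(y, [round(F_Yn(y, n), 4) for n in (1, 10, 100, 1000)])
# y = -0.1 and y = 0.0 stay at 0.0; every y > 0, however small, approaches 1.0,
# which agrees with F except at the discontinuity point y = 0.
```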
