[Math] Convergence in probability: min and max of the uniform distribution on (0,1)

convergence-divergence, limits, probability, probability-theory

Let $X_1,\dots,X_n \sim \text{Uniform}(0,1)$ be independent random variables.

$Y_n = \min \{X_1,\dots,X_n\}$

$Z_n=\max\{X_1,\dots,X_n\}$

I want to show that $Y_n$ and $Z_n$ converge in probability to $0$ and $1$, respectively.

That is, $$Y_n \to_p 0$$

$$Z_n \to_p 1$$

By the definition of convergence in probability, $Y_n \to_p 0$ means that for every $\epsilon>0$,

$$\lim_{n\to \infty}P(|Y_n-0|< \epsilon)=1,$$
or equivalently
$$\lim_{n\to \infty}P(|Y_n-0|> \epsilon)=0.$$

For the minimum:

Equation 1

$$\lim_{n\to \infty}P(|Y_n-0|< \epsilon)=\lim_{n\to \infty}P(-\epsilon <Y_n< \epsilon)=\lim_{n\to \infty}P(0<Y_n< \epsilon),$$

where the last equality holds because $Y_n$ takes values in $(0,1)$.

For the maximum:

Equation 2

$$\lim_{n\to \infty}P(|Z_n-1|< \epsilon)=\lim_{n\to \infty}P(1-\epsilon <Z_n<1+ \epsilon)=\lim_{n\to \infty}P(1-\epsilon<Z_n< 1),$$

where the last equality holds because $Z_n$ takes values in $(0,1)$.

I need to show that Equations 1 and 2 both equal one.

To do that, I need the distribution functions of the minimum and the maximum of uniform random variables. This is where I'm stuck.

Best Answer

For Equation-1 $$ \lim_{n\to \infty}P(|Y_n|< \epsilon)=\lim_{n\to \infty}P(-\epsilon <Y_n< \epsilon)=\lim_{n\to \infty}P(0<Y_n< \epsilon). $$ By independence, $P(Y_n>y)=P(X_1>y)\cdots P(X_n>y)=(1-y)^n$ for $y\in(0,1)$, so the CDF of $Y_n$ is $F_{Y_n}(y)=1-(1-y)^n$. Therefore $$ \lim_{n\to \infty}P(0<Y_n< \epsilon)=\lim_{n\to \infty}\left(1-(1-\epsilon)^n\right)=1, $$ since $0<1-\epsilon<1$.
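
Not part of the original answer: a minimal Monte Carlo sketch in Python to sanity-check this limit, comparing the empirical value of $P(Y_n<\epsilon)$ against the closed form $1-(1-\epsilon)^n$. The choices of $\epsilon$, the sample sizes `n`, and the number of trials are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)   # fixed seed for reproducibility
eps = 0.05                       # arbitrary epsilon
trials = 100_000                 # Monte Carlo repetitions per n

for n in (1, 10, 50, 200):
    samples = rng.uniform(0.0, 1.0, size=(trials, n))
    y_n = samples.min(axis=1)            # Y_n = min of n Uniform(0,1) draws
    empirical = np.mean(y_n < eps)       # estimate of P(Y_n < eps)
    exact = 1 - (1 - eps) ** n           # F_{Y_n}(eps) = 1 - (1 - eps)^n
    print(f"n={n:4d}  empirical={empirical:.4f}  exact={exact:.4f}")
```

Both columns approach $1$ as $n$ grows, matching the limit above.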

For Equation-2 $$ \lim_{n\to \infty}P(|Z_n-1|< \epsilon)=\lim_{n\to \infty}P(1-\epsilon <Z_n<1+ \epsilon)=\lim_{n\to \infty}P(1-\epsilon<Z_n< 1). $$ By independence, $P(Z_n\le z)=P(X_1\le z)\cdots P(X_n\le z)=z^n$ for $z\in(0,1)$, so the CDF of $Z_n$ is $F_{Z_n}(z)=z^n$. Therefore $$ \lim_{n\to \infty}P(1-\epsilon<Z_n< 1)=\lim_{n\to \infty}\left(1-(1-\epsilon)^n\right)=1, $$ since $0<1-\epsilon<1$.
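
The same kind of check (again, not from the original answer, with arbitrary $\epsilon$ and sample sizes) works for the maximum, comparing the empirical $P(Z_n>1-\epsilon)$ against $1-(1-\epsilon)^n$:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.05
trials = 100_000

for n in (1, 10, 50, 200):
    samples = rng.uniform(0.0, 1.0, size=(trials, n))
    z_n = samples.max(axis=1)            # Z_n = max of n Uniform(0,1) draws
    empirical = np.mean(z_n > 1 - eps)   # estimate of P(Z_n > 1 - eps)
    exact = 1 - (1 - eps) ** n           # 1 - F_{Z_n}(1-eps) = 1 - (1-eps)^n
    print(f"n={n:4d}  empirical={empirical:.4f}  exact={exact:.4f}")
```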
