I.i.d. random variables from a uniform distribution

probability-distributions, probability-theory, random-variables, uniform-distribution, uniform-convergence

Suppose I have i.i.d. random variables $X_i$ from a uniform distribution on $[0,1]$.

How would I find scaling sequences $a_n$, $b_n$ such that $a_n(M_n - b_n)$ converges in distribution to a non-degenerate limit distribution $G$, where

$Y_i = X_i \quad\text{and}\quad M_n = \max(Y_1, \dots, Y_n)?$

Best Answer

  1. Answer to the earlier version of the question which mentioned $M_n$ is distributed as $ \max\{ X_1, X_2, \dots, X_n\}$:

Note that, for $0 \le b_n + x/a_n \le 1$, $$\Pr(a_n(M_n - b_n)\le x) = \Pr(M_n\le b_n +x/a_n) = \prod_{i=1}^n \Pr(Y_i\le b_n +x/a_n) = \left(b_n +x/a_n \right)^n.$$ So, one possible choice is $b_n=1$ and $a_n=n$. In that case $\Pr(n(M_n - 1)\le x) = (1+x/n)^n \to e^x$ for $x\le 0$, so $n(M_n - 1)\Rightarrow Z$ where $-Z$ follows a standard exponential distribution.
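As a quick numerical sanity check (a sketch, not part of the original answer), we can compare the empirical CDF of $n(M_n - 1)$ against the limit $e^x$, $x \le 0$, using NumPy. The sample sizes below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 200, 50_000  # arbitrary simulation sizes

# Simulate Z_n = n(M_n - 1), where M_n = max of n Uniform(0,1) draws.
X = rng.random((trials, n))
Z = n * (X.max(axis=1) - 1.0)

# Limit CDF: Pr(Z <= x) = e^x for x <= 0 (so -Z is standard exponential).
for x in (-3.0, -1.0, -0.3):
    print(f"x={x:5.1f}  empirical={(Z <= x).mean():.4f}  limit={np.exp(x):.4f}")
```

The empirical and limiting probabilities should agree to roughly two decimal places already at moderate $n$.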

  2. Answer to the question which says $M_n$ is distributed as $ \max\{ 1/X_1, 1/X_2, \dots, 1/X_n\}$:

Let $Y_i = 1/X_i.$ Note that, for $b_n + x/a_n \ge 1$, $$\Pr(a_n(M_n - b_n)\le x) = \prod_{i=1}^n \Pr(Y_i\le b_n +x/a_n) = \prod_{i=1}^n \Pr\left(X_i \ge \left(b_n +x/a_n\right)^{-1}\right) = \left(1-\frac{a_n}{a_n b_n + x}\right)^n.$$ Now, it is well known that if $\displaystyle\lim_{n\to\infty} x_n = x,$ then $\displaystyle\lim_{n\to\infty} (1+x_n/n)^n = e^x.$ Using this fact, we can write $$\lim_{n\to\infty} \left(1-\frac{a_n}{a_n b_n + x}\right)^n = \exp\left(\displaystyle\lim_{n\to\infty}-\frac{na_n}{a_n b_n + x}\right).$$ Thus, we can take $a_n = 1/n$ and $b_n=0,$ for which the last limit equals $\exp(-1/x)$ for $x>0$. Thus, $n^{-1}M_n \Rightarrow Z$ where $Z$ has CDF $F(x) = \exp(-1/x) \mathbf{1}(x\ge 0).$
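This limit (a standard Fréchet distribution with shape parameter 1) can also be checked numerically. The following sketch simulates $n^{-1}\max(1/X_1, \dots, 1/X_n)$ and compares its empirical CDF with $\exp(-1/x)$; the simulation sizes are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 200, 50_000  # arbitrary simulation sizes

# X_i ~ Uniform(0,1]; using 1 - random() avoids an exact zero under 1/X_i.
X = 1.0 - rng.random((trials, n))
# W_n = n^{-1} M_n, with M_n = max(1/X_1, ..., 1/X_n).
W = (1.0 / X).max(axis=1) / n

# Limit CDF (standard Frechet with shape 1): F(x) = exp(-1/x) for x > 0.
for x in (0.5, 1.0, 3.0):
    print(f"x={x:4.1f}  empirical={(W <= x).mean():.4f}  limit={np.exp(-1.0/x):.4f}")
```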

Note that if $b_n$ is any sequence such that $b_n/n$ converges, say to $b,$ then $n^{-1}b_n\to b$ and $n^{-1} M_n \Rightarrow Z$ together imply $n^{-1} (M_n - b_n) \Rightarrow Z - b,$ whose CDF is $F(x+b).$
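The shifted limit can be illustrated the same way. Below, $b_n = bn$ with a hypothetical choice $b = 2$, so the limit CDF should be $F(x+b) = \exp(-1/(x+b))$ for $x + b > 0$ (again, all simulation sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 200, 50_000  # arbitrary simulation sizes
b = 2.0  # hypothetical example: b_n = b*n, so b_n/n -> b

X = 1.0 - rng.random((trials, n))  # Uniform(0,1], avoids 1/0
M = (1.0 / X).max(axis=1)          # M_n = max(1/X_1, ..., 1/X_n)
V = (M - b * n) / n                # n^{-1}(M_n - b_n)

# Shifted limit CDF: F(x + b) = exp(-1/(x + b)) for x + b > 0.
for x in (-1.0, 0.0, 1.0):
    print(f"x={x:4.1f}  empirical={(V <= x).mean():.4f}  limit={np.exp(-1.0/(x+b)):.4f}")
```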