Convergence Probability (uniform case)

Tags: convergence-divergence, probability, statistics

I understand the concept of convergence in probability (theoretically), but I'm still having trouble with this kind of example:

Suppose $X \sim \mathrm{Uniform}[0, 1]$, and let $Y_n = \frac{n-1}{n} \cdot X$.

Prove that $Y_n \xrightarrow{P} X$.

I started off by finding:

$E[X]=\frac{1}{2}, \quad Var(X)=\frac{1}{12}$

$E[Y_n]=\frac{n-1}{n}\cdot E[X]=\frac{n-1}{2n}$

$Var(Y_n)=\frac{(n-1)^2}{n^2}\cdot \frac{1}{12}=\frac{(n-1)^2}{12n^2}$

So then I figure I should straight away plug into:

$\lim_{n\to\infty}P(|Y_n - X|> \epsilon) = \lim_{n\to\infty}\frac{(n-1)^2}{12n^2\epsilon^2} = \frac{1}{12\epsilon^2} \neq 0$

I'm pretty sure I did it wrong, but I don't know how to go about from here.

Best Answer

$ X $ is uniformly distributed on $[0,1]$, $Y_n = \frac{n-1}{n}X$

We want to use Chebyshev's inequality: $\Bbb P(|Y_n-X| > \epsilon) \leq \frac{D^2(Y_n-X)}{\epsilon^2}$, where $D^2$ denotes the variance.

So we need to compute that variance.

Firstly $E[Y_n-X] = E[Y_n] - E[X] = \frac{n-1}{2n} - \frac{n}{2n} = \frac{-1}{2n}$

$E[(Y_n - X)^2] = E[Y_n^2] - 2E[Y_nX]+E[X^2]= \frac{(n-1)^2}{n^2}E[X^2] - \frac{2(n-1)}{n}E[X^2] + E[X^2] = $

$=\frac{(n-1)^2}{3n^2} -\frac{2(n-1)}{3n} + \frac{1}{3} = \frac{n^2-2n+1-2n^2+2n+n^2}{3n^2} = \frac{1}{3n^2} $

So, using $D^2[Z] = E[Z^2] - (E[Z])^2$: $ D^2[Y_n-X] = \frac{1}{3n^2} - \frac{1}{4n^2} = \frac{1}{12n^2}$
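If you want to sanity-check that variance numerically, here is a small Monte Carlo sketch (not part of the proof; the sample size and seed are arbitrary choices). Note that $Y_n - X = -X/n$, so its sample variance should land near $\frac{1}{12n^2}$:

```python
import random

random.seed(0)
n, trials = 10, 100_000

diffs = []
for _ in range(trials):
    x = random.random()          # draw X ~ Uniform[0, 1]
    y_n = (n - 1) / n * x        # Y_n = (n-1)/n * X
    diffs.append(y_n - x)        # equals -x/n

mean = sum(diffs) / trials
var = sum((d - mean) ** 2 for d in diffs) / trials

print(mean)                      # should be close to -1/(2n) = -0.05
print(var, 1 / (12 * n**2))     # the two numbers should roughly agree
```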

And we're done, since:

$$\Bbb P(|Y_n-X| > \epsilon) \leq \frac{D^2(Y_n-X)}{\epsilon^2} = \frac{1}{12n^2\epsilon^2} \to 0 \quad \text{as } n \to \infty$$
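You can also watch the convergence happen empirically. In this sketch (the choices of $\epsilon$, sample size, and seed are mine, not from the problem), note $|Y_n - X| = X/n \le 1/n$, so the estimated probability should shrink to exactly $0$ once $n > 1/\epsilon$:

```python
import random

random.seed(1)
eps, trials = 0.05, 10_000

probs = {}
for n in (2, 5, 10, 20, 50):
    # |Y_n - X| = |(n-1)/n * x - x| = x/n, so count how often x/n > eps
    hits = sum(1 for _ in range(trials) if random.random() / n > eps)
    probs[n] = hits / trials
    print(n, probs[n])
```

For $\epsilon = 0.05$, the exact value is $P(X > n\epsilon) = \max(0, 1 - n\epsilon)$, so the estimates should step down from about $0.9$ at $n = 2$ to exactly $0$ for $n \geq 20$.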