[Math] Convergence in Probability and in Quadratic Mean for a sequence of random variables

convergence-divergence, probability, probability-theory, self-learning

I have been trying to determine whether a sequence of random variables $X_1, X_2, \ldots$ such that

$$P\left(X_n= \frac{1}{n}\right)=1-\frac{1}{n^2} \quad\text{and}\quad P\left(X_n=n\right)=\frac{1}{n^2}$$

converges in probability and in quadratic mean. I know that convergence in quadratic mean implies convergence in probability, but what would be a candidate limit in this case? Also, how can I prove or disprove convergence in probability independently of convergence in quadratic mean?

Thank you.

Best Answer

  1. The natural candidate limit is $0$, since $X_n = \frac{1}{n} \to 0$ with probability $1 - \frac{1}{n^2} \to 1$. For convergence in mean square, compute $E\left[|X_n - 0|^2\right] = n^2 \cdot \frac{1}{n^2} + \frac{1}{n^2}\left(1 - \frac{1}{n^2}\right)$, which tends to $1$ as $n \rightarrow \infty$. Therefore $X_n$ does not converge to $0$ in mean square.

  2. For convergence in probability, note that for any $\epsilon > 0$ and all $n > \frac{1}{\epsilon}$, $\Pr(|X_n - 0| > \epsilon) = \frac{1}{n^2}$, which goes to $0$ as $n \rightarrow \infty$. Therefore $X_n \rightarrow 0$ in probability. A quick numerical check of both computations is sketched below.
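
If you want to sanity-check both claims numerically, here is a minimal Monte Carlo sketch (not part of the original answer; the choices $\epsilon = 0.1$, the sample size, and the values of $n$ are arbitrary). It estimates $E[X_n^2]$ and $P(|X_n| > \epsilon)$ for a few values of $n$:

```python
import numpy as np

# Sanity check: X_n = 1/n with probability 1 - 1/n^2, and X_n = n with
# probability 1/n^2. Estimate E[X_n^2] and P(|X_n| > eps) by simulation.
rng = np.random.default_rng(0)
eps = 0.1            # arbitrary epsilon for the tail probability
trials = 1_000_000   # arbitrary number of samples per n

for n in (10, 100, 1000):
    p_big = 1.0 / n**2                        # P(X_n = n)
    takes_n = rng.random(trials) < p_big      # Bernoulli(1/n^2) indicator
    x = np.where(takes_n, float(n), 1.0 / n)  # samples of X_n
    mean_sq = np.mean(x**2)                   # estimate of E[X_n^2]
    tail = np.mean(np.abs(x) > eps)           # estimate of P(|X_n| > eps)
    print(f"n={n:5d}  E[X_n^2] ~ {mean_sq:.3f}  P(|X_n| > {eps}) ~ {tail:.2e}")
```

With these settings the estimated $E[X_n^2]$ stays close to $1$ for every $n$, while the tail probability drops like $1/n^2$, matching the two computations above.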
