Convergence in Probability question.


Let $\lambda_n = 1/n$ for $n=1,2,\ldots$. Let $X_n \sim Poi(\lambda_n)$. Show that a) $X_n \rightarrow_P 0$. b) Let $Y_n = nX_n$. Show that $Y_n \rightarrow_P 0$ (where $\rightarrow_{P}$ denotes convergence in Probability)

Part a) is relatively straightforward:
We know that $\mathbb{E}(X_n) = \lambda_n = 1/n$ and $\mathrm{var}(X_n) = \lambda_n = 1/n$. Since $X_n \geq 0$, we have $|X_n| = X_n$, so it suffices to bound $P(X_n > \epsilon)$. By Markov's inequality, $$P(X_n > \epsilon) \leq \frac{\mathbb{E}(X_n)}{\epsilon} = \frac{1}{n\epsilon} \rightarrow 0\;\mathrm{as\;}n \rightarrow \infty. $$
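As a quick numerical sanity check (a sketch, not part of the proof), one can estimate $P(X_n > \epsilon)$ by Monte Carlo for a few values of $n$; since $X_n$ is integer-valued, the exact value for $0 < \epsilon < 1$ is $P(X_n \geq 1) = 1 - e^{-1/n}$, and the estimates should shrink roughly like $1/n$:

```python
import numpy as np

# Monte Carlo estimate of P(X_n > eps) for X_n ~ Poisson(1/n).
# For 0 < eps < 1 the exact value is P(X_n >= 1) = 1 - exp(-1/n) ~ 1/n.
rng = np.random.default_rng(0)
eps = 0.5
probs = {}
for n in (10, 100, 1000):
    samples = rng.poisson(1.0 / n, size=200_000)
    probs[n] = np.mean(samples > eps)
    print(f"n={n}: estimated P(X_n > {eps}) = {probs[n]:.5f}")
```

The estimates decrease toward $0$ as $n$ grows, consistent with $X_n \rightarrow_P 0$.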

For part b) I cannot use this same approach, as $\mathrm{var}(Y_n) = n^2 \cdot 1/n = n$. I also know that convergence in quadratic mean implies convergence in probability, but $\mathbb{E}(Y_n - 0)^2 = n^2\,\mathbb{E}(X_n^2) = n^2(1/n + 1/n^2) = n + 1 \rightarrow \infty$ as $n \rightarrow \infty$, so $Y_n$ does not converge to $0$ in quadratic mean. Any ideas?

Best Answer

One way of proving that $nX_n \to 0$ in probability is to use characteristic functions. Recall that $Ee^{itX_n}=e^{-(\frac 1 n) (1-e^{it})}$. Hence $Ee^{itnX_n}=e^{-(\frac 1 n) (1-e^{int})} \to 1$ as $n \to \infty$, since $|1-e^{int}| \leq 2$ and so the exponent tends to $0$. This implies that $nX_n \to 0$ in distribution, which is equivalent to convergence in probability because the limit is a constant.
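A numerical illustration (a sketch, not a substitute for the argument above): because $X_n$ takes integer values, for fixed $\epsilon > 0$ and $n > \epsilon$ the event $\{nX_n > \epsilon\}$ coincides with $\{X_n \geq 1\}$, whose probability is $1 - e^{-1/n} \to 0$. A short simulation confirms this:

```python
import numpy as np

# Monte Carlo estimate of P(n*X_n > eps) for X_n ~ Poisson(1/n).
# Since X_n is integer-valued and n > eps here, this event equals
# {X_n >= 1}, with exact probability 1 - exp(-1/n) -> 0.
rng = np.random.default_rng(0)
eps = 0.5
probs = {}
for n in (10, 100, 1000):
    y = n * rng.poisson(1.0 / n, size=200_000)
    probs[n] = np.mean(y > eps)
    print(f"n={n}: estimated P(n*X_n > {eps}) = {probs[n]:.5f}")
```

Note that the scaling by $n$ does not change the probability of the event at all; it only affects how large $Y_n$ is *when* it is nonzero, which is why the quadratic-mean approach in the question fails while convergence in probability still holds.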