Probability – Completeness of Metric Induced by Convergence in Probability

Tags: convergence-divergence, probability

My question is the following:

Let $X_1,X_2,\dots$ be independent random variables and $S_n=\sum\limits_{k=1}^n X_k$.
Suppose that $\sum\limits_{k=n}^m X_k$ converges in probability to $0$ as $n,m\to\infty$; in other words, $(S_n)$ is Cauchy in probability. Does $S_n$ then converge in probability to some limit?

It is known that convergence in probability defines a topology on the space of random variables over a fixed probability space. This topology is metrizable by the Ky Fan metric, which is characterized by:

$d(X,Y)=\inf\{\epsilon>0: P(|X-Y|>\epsilon)\le\epsilon\}$,
or, equivalently (it induces the same topology), $d'(X,Y)=E[\min(|X-Y|,1)]$.
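As a sanity check on these two formulas, here is a small Monte Carlo sketch (the grid search, sample sizes, and noise scale are illustrative choices, not part of the question): it estimates both metrics from paired samples and shows they are small when $Y$ is a tiny perturbation of $X$.

```python
import numpy as np

rng = np.random.default_rng(0)

def ky_fan(x, y):
    """Estimate d(X,Y) = inf{eps > 0 : P(|X-Y| > eps) <= eps} from
    paired samples, scanning candidate eps on a fine grid."""
    diffs = np.abs(x - y)
    # P(|X-Y| > eps) is nonincreasing in eps, so the first eps on the
    # grid satisfying the condition approximates the infimum.
    for eps in np.linspace(0, 1, 1001)[1:]:
        if np.mean(diffs > eps) <= eps:
            return eps
    return 1.0

def equiv_metric(x, y):
    """Estimate the equivalent metric E[min(|X-Y|, 1)] by Monte Carlo."""
    return np.mean(np.minimum(np.abs(x - y), 1.0))

n = 100_000
x = rng.normal(size=n)
y = x + rng.normal(scale=0.01, size=n)  # Y close to X in probability

print(ky_fan(x, y), equiv_metric(x, y))  # both small
```

Both estimates shrink to $0$ as the perturbation does, consistent with the two metrics inducing the same topology.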

If the Ky Fan metric is complete, then $S_n$, being Cauchy in this metric, would converge to a limit. So: is the Ky Fan metric complete?

Best Answer

We need to use the following facts:

  • If $\{X_n\}$ is Cauchy in probability, we can extract a subsequence $\{Y_k=X_{n_k}\}$ such that $P(|Y_{k+1}-Y_k|>2^{-k})\leq 2^{-k}$.
  • By the first Borel–Cantelli lemma (the classical version, which does not require independence), almost surely $|Y_{k+1}-Y_k|\le 2^{-k}$ for all but finitely many $k$, so the telescoping series $\sum_k (Y_{k+1}-Y_k)$ converges absolutely almost everywhere. Since $Y_k=Y_1+\sum_{j<k}(Y_{j+1}-Y_j)$, the subsequence $\{X_{n_k}\}$ converges almost everywhere; call the limit $Y$.
  • Now use the metric to show that the whole sequence $\{X_n\}$ converges in probability to $Y$: almost everywhere convergence implies convergence in probability, so $d(X_{n_k},Y)\to 0$, and the triangle inequality $d(X_n,Y)\le d(X_n,X_{n_k})+d(X_{n_k},Y)$ together with the Cauchy property finishes the argument.
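The completeness argument above can be illustrated numerically. In this sketch the summands $X_k=\pm 2^{-k}$ with independent random signs are a hypothetical choice (not from the question): the tails satisfy $|\sum_{k=n}^m X_k|\le 2^{-(n-1)}$, so $(S_n)$ is Cauchy in probability, and the distance from $S_n$ to the limit in the equivalent metric $E[\min(|\cdot|,1)]$ shrinks to $0$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example: X_k = sign_k * 2^{-k}, independent random signs.
# The tails are uniformly bounded by 2^{-(n-1)}, so (S_n) is Cauchy
# in probability and converges to some limit Y.
n_samples, n_terms = 50_000, 40
signs = rng.choice([-1.0, 1.0], size=(n_samples, n_terms))
terms = signs * 2.0 ** -np.arange(1, n_terms + 1)
S = np.cumsum(terms, axis=1)  # S[:, n-1] holds samples of S_n
Y = S[:, -1]  # proxy for the limit (the remaining tail is < 2^{-40})

# Distance from S_n to Y in the equivalent metric E[min(|.|, 1)];
# it is bounded by the deterministic tail bound 2^{-n}.
dists = [np.mean(np.minimum(np.abs(S[:, n] - Y), 1.0))
         for n in range(n_terms)]
print(dists[0], dists[20])
```

Here the limit exists almost surely by the same Borel–Cantelli argument as in the answer; the simulation only visualizes the rate at which the metric distance decays.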

Note that the independence of the $X_n$ was not used.