[Math] Alternative proof or verification of given proof of convergence in probability

alternative-proof, convergence-divergence, probability-theory, proof-verification

For a statistics homework problem, in a section titled "Limit Theorems", I am asked to show that if $X_n \rightarrow c$ in probability and $g$ is a continuous function, then $g(X_n) \rightarrow g(c)$ in probability. I am not quite sure how the "in probability" part of the question affects my proof, but here is my solution:

Let $\delta > 0$. As $X_n \rightarrow c$, there exists $N$ such that for $n > N$, $|X_n - c| < \delta$. Let $\varepsilon > 0$. As there is a $\delta > 0$ such that $|X_n - c| < \delta$ and $g$ is continuous, then by definition of continuity $|g(X_n) - g(c)| < \varepsilon$, and so $g(X_n) \rightarrow g(c)$.

My primary concern with this solution is that I am not considering the "in probability" notion; on top of that, we have not learned about convergence of sequences or continuity of functions in this course, nor is real analysis a prerequisite for the course, so I don't believe this is the type of solution expected. I am more or less looking for the following input:

  • Verification of correctness or notification of flaws in the solution
  • Input on how to incorporate the fact that $X_n$ converges in probability, so that the conclusion $g(X_n) \rightarrow g(c)$ also holds in probability
  • Alternative method of proving the result without the $\varepsilon$–$\delta$ definition of continuity, or a hint at such a solution (because, without having taken real analysis before this statistics class, I would not have known the definition of convergence of a sequence or of continuity of a function)

EXTRA INFO

For the alternative solution, the following definition and theorem provided in the text are the only things I should know about continuity in this course at the moment:

Definition

Let $X_1,X_2,\dots$ be a sequence of random variables with cumulative distribution functions $F_1,F_2,\dots$, and let $X$ be a random variable with cumulative distribution function $F$. We say that $X_n$ converges in distribution to $X$ if \begin{equation}\lim_{n\rightarrow\infty}F_n(x) = F(x)\end{equation} at every point at which $F$ is continuous.

Theorem A – Continuity Theorem

Let $F_n$ be a sequence of cumulative distribution functions with corresponding moment-generating functions $M_n$. Let $F$ be a cumulative distribution function with moment-generating function $M$. If $M_n(t) \rightarrow M(t)$ for all $t$ in an open interval containing zero, then $F_n(x) \rightarrow F(x)$ at all continuity points of $F$.

Best Answer

"Let $\delta > 0$. As $X_n \rightarrow c$, there exists $N$ such that for $n > N$, $|X_n - c| < \delta$."

This is not what convergence in probability says (even with almost sure convergence, the $N$ would depend on the $\omega$ considered).
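
For reference, convergence in probability of $X_n$ to $c$ means that for every $a>0$, $$\lim_{n\rightarrow\infty}\mathbb P\{|X_n - c| > a\} = 0,$$ with no claim that any particular realization of $X_n$ eventually stays within $\delta$ of $c$.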

Here is a way to get a proof.

  1. Assume first that $g$ is uniformly continuous on the real line. Let $\varepsilon>0$ and choose $\delta>0$ such that $|g(x)-g(y)|<\varepsilon$ whenever $|x-y|<\delta$. Fix $0<a<\delta$. Then $$\mathbb P\{|X_n-c|>a\}\geqslant \mathbb P\{|X_n-c|\geqslant\delta\}\geqslant \mathbb P\{|g(X_n)-g(c)|>\varepsilon\},$$ and the left-hand side tends to $0$ because $X_n \rightarrow c$ in probability, so $\mathbb P\{|g(X_n)-g(c)|>\varepsilon\} \rightarrow 0$ as well (a numerical illustration of this convergence follows the list).

  2. To reduce to the case of a uniformly continuous $g$, take $A$ such that $\mathbb P\{|X_n|>A\}$ is small for all large $n$ (for instance $A=|c|+1$ works, since $\mathbb P\{|X_n|>A\}\leqslant\mathbb P\{|X_n-c|>1\}\rightarrow 0$), and notice that $g$ is uniformly continuous on $[-A,A]$, being continuous on a compact interval.
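
Below is a minimal Monte Carlo sketch of the statement being proved, not part of the argument above; it assumes NumPy is available and takes $X_n = c + Z/\sqrt{n}$ with $Z$ standard normal purely as an illustrative sequence converging to $c$ in probability.

```python
import numpy as np

# Illustration only: X_n = c + Z/sqrt(n) converges to c in probability,
# and we check empirically that g(X_n) concentrates around g(c).
rng = np.random.default_rng(0)

c = 2.0
g = np.square          # any continuous g; here g(x) = x^2, so g(c) = 4
eps = 0.1              # tolerance for |g(X_n) - g(c)|
reps = 100_000         # Monte Carlo replications per n

for n in (10, 100, 1_000, 10_000):
    x_n = c + rng.standard_normal(reps) / np.sqrt(n)
    p_bad = np.mean(np.abs(g(x_n) - g(c)) > eps)
    print(f"n = {n:6d}   P(|g(X_n) - g(c)| > {eps}) ~ {p_bad:.4f}")
```

The estimated probabilities shrink toward $0$ as $n$ grows, which is exactly the convergence $\mathbb P\{|g(X_n)-g(c)|>\varepsilon\} \rightarrow 0$ established in step 1.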


An alternative way is the following: a sequence $\{Y_n\}$ of random variables converges in probability to $Y$ if and only if every subsequence $\{Y_{n_k}\}$ admits a further subsequence that converges almost surely to $Y$.
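
A sketch of how this characterization yields the claim: given any subsequence $\{g(X_{n_k})\}$, extract a further subsequence $\{X_{n_{k_j}}\}$ with $X_{n_{k_j}} \rightarrow c$ almost surely; by continuity of $g$, $$X_{n_{k_j}} \xrightarrow{\text{a.s.}} c \quad\Longrightarrow\quad g(X_{n_{k_j}}) \xrightarrow{\text{a.s.}} g(c),$$ so every subsequence of $\{g(X_n)\}$ has a further subsequence converging almost surely to $g(c)$, and therefore $g(X_n) \rightarrow g(c)$ in probability.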