[Math] Proving properties of Convergence in Probability.

probability theory

  1. Show that if $X_n$ converges in probability to $a$ and $Y_n$ converges in probability to $b$, then $X_n +Y_n$ converges
    in probability to $a+b$.

  2. Show that, if $g : \mathbb{R} \to \mathbb{R}$ is continuous at $c \in \mathbb{R}$ and $X_n$ converges in probability to $c$, then $g(X_n)$ converges in probability to $g(c)$.

  3. Show that, if $X_1, X_2$ ,… are identically distributed, then the sequence $Y_n$ defined by $Y_n$ = $X_n/n$ converges in
    probability to 0.

Attempt:

  1. $\lim_{n\rightarrow\infty} P(|(X_n + Y_n)-(a+b)|>\epsilon)$
    $\leq \lim_{n\rightarrow\infty} P(|X_n -a|+|Y_n -b|>\epsilon)$ by the triangle inequality. Can I separate $|X_n-a|$ and $|Y_n-b|$ out from the probability? I might have forgotten some fundamental rule.

  2. Since $g$ is continuous at $c$, for every $\epsilon>0$ there exists a $\delta>0$ s.t. $|X_n - c|<\delta \rightarrow |g(X_n)-g(c)|<\epsilon$. Since $X_n$ converges in probability to $c$, $X_n$ gets arbitrarily close to $c$ as $n \rightarrow \infty$. Hence, for $n>N$, $|g(X_n)-g(c)|<\epsilon$ with high probability, so
    $\lim_{n\rightarrow\infty} P(|g(X_n)-g(c)|>\epsilon) = 0$. This feels more like intuitive reasoning than a proof. If so, how do I write a proper proof?

  3. $\lim_{n\rightarrow\infty} X_n/n = 0$. Doesn't that already imply that it converges in probability to $0$?

Thanks!

Best Answer

1) For convenience let $a=b=0$. This is WLOG since you can just work with $X_n-a$ and $Y_n-b$.

Then: $$|X_n+Y_n|>\epsilon\implies|X_n|+|Y_n|>\epsilon\implies |X_n|>\frac12\epsilon\vee|Y_n|>\frac12\epsilon$$ so that: $$P(|X_n+Y_n|>\epsilon)\leq P(|X_n|>\frac12\epsilon\vee|Y_n|>\frac12\epsilon)\leq P(|X_n|>\frac12\epsilon)+P(|Y_n|>\frac12\epsilon)$$ This inequality can be used to prove that $X_n+Y_n\stackrel{P}\to0$ whenever $X_n\stackrel{P}\to0$ and $Y_n\stackrel{P}\to0$.
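Not part of the proof, but here is a quick numerical sanity check of that inequality; the Uniform noise shrinking like $1/\sqrt n$ is just an illustrative choice, not anything imposed by the problem:

```python
import random

def tail_prob(n, eps=0.5, trials=10000, seed=0):
    """Estimate P(|X_n + Y_n| > eps) for the centered case a = b = 0,
    taking X_n, Y_n ~ Uniform(-1/sqrt(n), 1/sqrt(n)) as an illustrative example."""
    rng = random.Random(seed)
    h = n ** -0.5
    hits = sum(abs(rng.uniform(-h, h) + rng.uniform(-h, h)) > eps
               for _ in range(trials))
    return hits / trials

# The estimated tail probability of the sum shrinks to 0 as n grows,
# exactly as the union-bound inequality predicts.
print(tail_prob(1), tail_prob(100))
```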

2) Again WLOG let $c=g(c)=0$.

Since $g$ is continuous at $0$, for every $\epsilon>0$ there is a $\delta>0$ such that $\{|X_n|\leq\delta\}\subseteq\{|g(X_n)|\leq\epsilon\}$ and consequently: $$P(|g(X_n)|>\epsilon)\leq P(|X_n|>\delta)$$ This inequality can be used to prove that $g(X_n)\stackrel{P}\to0$ whenever $X_n\stackrel{P}\to0$.
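Again only a sanity check of the inequality above, with $g(x)=x^2$ and $c=2$ picked arbitrarily for illustration:

```python
import random

def cmt_tail(n, eps=0.1, trials=10000, seed=1):
    """Estimate P(|g(X_n) - g(c)| > eps) for g(x) = x^2, c = 2,
    with X_n = 2 + Uniform(-1/n, 1/n) (an illustrative choice)."""
    rng = random.Random(seed)
    g = lambda x: x * x
    hits = sum(abs(g(2 + rng.uniform(-1 / n, 1 / n)) - g(2)) > eps
               for _ in range(trials))
    return hits / trials

# Once 1/n is below the delta supplied by continuity, the tail probability is 0.
print(cmt_tail(1), cmt_tail(100))
```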

3) For $\epsilon>0$ we have $P(|Y_n|>\epsilon)=P(|X_n|>n\epsilon)=P(|X_1|>n\epsilon)$ and evidently $\lim_{n\to\infty} P(|X_1|>n\epsilon)=0$.

This proves that $Y_n\stackrel{P}\to0$.
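A simulation makes the "identically distributed is enough" point vivid: the standard Cauchy below (an arbitrary heavy-tailed choice) has no mean at all, yet $X_n/n$ still collapses to $0$:

```python
import math
import random

def yn_tail(n, eps=0.1, trials=20000, seed=2):
    """Estimate P(|X_n / n| > eps) with X_n standard Cauchy, sampled via
    the inverse CDF tan(pi * (U - 1/2)). Since the X_i are identically
    distributed, this equals P(|X_1| > n * eps), which -> 0 as n grows."""
    rng = random.Random(seed)
    hits = sum(abs(math.tan(math.pi * (rng.random() - 0.5))) / n > eps
               for _ in range(trials))
    return hits / trials

# Tail probability drops from near 1 to near 0 as n grows.
print(yn_tail(1), yn_tail(1000))
```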
