[Math] Why does this tight sequence of random variables also converge in probability

convergence-divergence, probability, probability-theory

Definition of convergence in probability

We say that $X_n$ converges in probability to zero, written $X_n = o_p(1)$, if for every $\epsilon> 0$,

$$ P(| X_n | > \epsilon) \rightarrow 0, \qquad n \rightarrow \infty$$
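As a quick Monte Carlo sketch of what the definition says (my own illustration; the specific choice $X_n = Z/n$ with $Z \sim N(0,1)$ is arbitrary), the tail probability $P(|X_n| > \epsilon)$ can be estimated empirically and seen to vanish:

```python
import random

def tail_prob(n, eps=0.1, trials=20000, seed=0):
    """Estimate P(|X_n| > eps) for X_n = Z/n with Z ~ N(0, 1), so X_n = o_p(1)."""
    rng = random.Random(seed)
    return sum(1 for _ in range(trials) if abs(rng.gauss(0.0, 1.0)) / n > eps) / trials

probs = [tail_prob(n) for n in (1, 10, 100)]
print(probs)  # the estimated tail probabilities decrease toward 0 as n grows
```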

Definition of boundedness in probability (tightness)

We say that $X_n$ is tight, written $X_n = O_p(1)$, if for every $\epsilon> 0$ there exists $\delta(\epsilon) \in (0, \infty)$ such that

$$ P(| X_n | > \delta(\epsilon)) < \epsilon \qquad \forall n$$

Given a sequence of positive reals $\{ a_n \}_{n \in \mathbb{N}}$, we also write $X_n = O_p(a_n) \iff a_n^{-1} X_n = O_p(1)$.
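As a concrete illustration (my own, with arbitrary choices of the family and of $\epsilon$): when the $X_n$ are centered with uniformly bounded variance, Chebyshev's inequality supplies an explicit $\delta(\epsilon)$ that works for all $n$ at once, which is exactly the tightness requirement:

```python
import math
import random

def chebyshev_delta(var_bound, eps):
    """Chebyshev: P(|X_n| > delta) <= var_bound / delta^2, so any delta > sqrt(var_bound / eps) works."""
    return 1.01 * math.sqrt(var_bound / eps)  # small slack to make the bound strict

# Example family: X_n ~ N(0, 1 + 1/n) has variances bounded by 2, hence is tight.
rng = random.Random(1)
eps = 0.05
delta = chebyshev_delta(2.0, eps)
for n in (1, 5, 50):
    sigma = math.sqrt(1.0 + 1.0 / n)
    p = sum(1 for _ in range(20000) if abs(rng.gauss(0.0, sigma)) > delta) / 20000
    assert p < eps  # one delta(eps) works for every n simultaneously
print(round(delta, 3))
```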

Assume we have a sequence $X_n$ of random variables such that $X_n = a + O_p(r_n)$, where $0 < r_n \rightarrow 0$ as $n \rightarrow \infty$, and define

$$h(x) := \left[ g(x) - \sum_{j=0}^s \frac{g^{(j)}(a)}{j!} (x-a)^j \right] \bigg/ \left[ \frac{(x-a)^s}{s!} \right], \quad x \ne a,$$
and $h(a) := 0$, where $g \in C^s$ and $s \in \mathbb{N}$.
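Note that the continuity of $h$ at $a$ (with $h(a) = 0$) is exactly Taylor's theorem with the Peano remainder; a short derivation of this step, which the question takes for granted:

$$g(x) = \sum_{j=0}^{s} \frac{g^{(j)}(a)}{j!} (x-a)^j + o\big(|x-a|^s\big), \qquad x \to a,$$

since $g \in C^s$, and therefore

$$h(x) = \frac{s! \left[ g(x) - \sum_{j=0}^{s} \frac{g^{(j)}(a)}{j!} (x-a)^j \right]}{(x-a)^s} = \frac{s! \, o\big(|x-a|^s\big)}{(x-a)^s} \longrightarrow 0 = h(a) \qquad \text{as } x \to a.$$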

Question

Why is it that $h(X_n) = h(a) + o_p(1)$?

I understand that $h$ is a continuous function and that, for a generic continuous function $f$, we have $X_n = a + o_p(1) \implies f(X_n) = f(a) + o_p(1)$; but in our assumptions we only have $X_n = a + O_p(r_n)$.

Best Answer

From Taylor's formula, $$ h\left(X_n\right)-h\left(a\right)=f_s\left(X_n\right), $$ where $f_s\colon\mathbb R\to\mathbb R$ satisfies $\lim_{t\to a}f_s\left(t\right)=0$. Writing $X_n=a+r_nY_n$, where $\left(Y_n\right)_{n\geqslant 1}$ is tight, we have to show that the sequence $\left(f_s\left(a+r_nY_n\right)\right)_{n\geqslant 1}$ converges to $0$ in probability.

Suppose not: then there exist positive $\varepsilon_0,\delta_0$ and an increasing sequence of integers $\left(n_k\right)_{k\geqslant 1}$ such that for all $k\geqslant 1$, $$\Pr\left(\left\lvert f_s\left(a+r_{n_k}Y_{n_k}\right)\right\rvert\gt\varepsilon_0\right)\gt \delta_0.$$

Since the sequence $\left(Y_{n_k}\right)_{k\geqslant 1}$ is tight, we can extract a subsequence $\left(Y_{n_{j_k}}\right)_{k\geqslant 1}$ which converges in distribution to some $Y$. Therefore $a+r_{n_{j_k}}Y_{n_{j_k}}\to a$ in probability. Extract a further subsequence along which this convergence holds almost surely; along it, $f_s\left(X_{n_{j_k}}\right)\to 0$ almost surely, hence in probability, contradicting the displayed inequality.
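To see the claim concretely, here is a small Monte Carlo sketch (my own illustration, not part of the answer) with the arbitrary choices $g(x) = e^x$, $a = 0$, $s = 2$, $r_n = 1/n$, and $Y_n$ standard normal; the estimated $P(|h(X_n)| > \epsilon)$ indeed shrinks as $n$ grows:

```python
import math
import random

def h(x):
    """h for g(x) = exp(x), a = 0, s = 2: h(x) = (e^x - 1 - x - x^2/2) / (x^2/2), h(0) = 0."""
    if abs(x) < 1e-3:
        # series h(x) = x/3 + x^2/12 + ... avoids catastrophic cancellation near 0
        return x / 3.0 + x * x / 12.0
    return (math.exp(x) - 1.0 - x - 0.5 * x * x) / (0.5 * x * x)

rng = random.Random(2)
eps, trials = 0.1, 20000
probs = []
for n in (1, 10, 100):
    # X_n = a + r_n * Y_n with r_n = 1/n and Y_n ~ N(0, 1) (a tight sequence)
    hits = sum(1 for _ in range(trials) if abs(h(rng.gauss(0.0, 1.0) / n)) > eps)
    probs.append(hits / trials)
print(probs)  # estimates of P(|h(X_n)| > eps) shrink toward 0
```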