Convergence in probability + in distribution

Tags: convergence-divergence, normal-distribution, probability-distributions, probability-theory, probability-limit-theorems

Let $\left(X_{n}\right)_{n}$ be a sequence of random variables that converges in probability to $0$ and such that $\sqrt{n}\,X_{n} \longrightarrow \mathcal{N}\left(0, \sigma^{2}\right)$ in distribution.

How can I verify whether the following statements are true?

$$\sqrt{n}\,\sin\left(X_{n}\right) \longrightarrow \mathcal{N}\left(0, \sigma^{2}\right) \quad \text{in distribution}$$

$$n \cos\left(X_{n}\right) \longrightarrow \frac{-\sigma^{2}}{2}\,\chi^{2}(1) \quad \text{in distribution}$$

Since the $X_{n}$ are not necessarily independent, not necessarily identically distributed, and since we are working with the sequence of random variables itself rather than with a sequence of sample averages, we can apply neither the LLN nor the CLT, nor any of their consequences (the delta method, for example).

So I tried to prove the statements using the definition of convergence in distribution. I tried three different characterizations: convergence of the cumulative distribution functions, convergence of the characteristic functions, and convergence of expectations
($E[f(X_n)] \longrightarrow E[f(X)]$ for every continuous and bounded function $f$). None of them really helps. How should I use the fact that $X_n$ converges in probability to $0$?

By the way, for the second statement I've noticed that $E[n \cos(X_n)] = nE[\cos(X_n)] \geq n\cos(E[X_n])$ (by Jensen's inequality), where $E[X_n] \longrightarrow E[0] = 0$ by the third characterization of convergence in distribution and the fact that convergence in probability implies convergence in distribution. So $E[n \cos(X_n)] \longrightarrow \infty \neq E\left[\frac{-\sigma^{2}}{2}\chi^{2}(1)\right] = \frac{-\sigma^{2}}{2}$. Have I proved correctly that the second statement is not true?
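For what it's worth, here is a small numerical sanity check of that expectation claim. It assumes, purely for illustration, that $X_n$ is the sample mean of $n$ i.i.d. $\mathcal N(0,\sigma^2)$ draws, i.e. $X_n \sim \mathcal N(0, \sigma^2/n)$; this is just one concrete sequence satisfying the hypotheses, nothing in the question requires it.

```python
import numpy as np

# Numerical look at E[n cos(X_n)] under one illustrative (assumed) choice of
# X_n satisfying the hypotheses: X_n = mean of n i.i.d. N(0, sigma^2) draws,
# so X_n ~ N(0, sigma^2 / n).
rng = np.random.default_rng(123)
sigma = 2.0
reps = 200_000  # Monte Carlo replications

for n in (100, 1_000, 10_000):
    X_n = rng.normal(0.0, sigma / np.sqrt(n), size=reps)
    # Monte Carlo estimate of E[n cos(X_n)]; it grows roughly like n - sigma^2/2,
    # so it cannot converge to -sigma^2/2.
    print(n, (n * np.cos(X_n)).mean())
```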

But what should I do with the first one?

Best Answer

a) Note that since $X_n \xrightarrow[n \to \infty]{\mathbb P} 0 $, the continuous mapping theorem also gives $\frac{\sin(X_n)}{X_n} \xrightarrow[n \to \infty]{\mathbb P} 1$ (with the convention $\frac{\sin 0}{0} := 1$, which makes $x \mapsto \frac{\sin x}{x}$ continuous at $0$). Having said that, we can rewrite $$ \sqrt{n}\sin(X_n) = \sqrt{n}X_n \cdot \frac{\sin(X_n)}{X_n}. $$ Now, by assumption $\sqrt{n}X_n \xrightarrow[n \to \infty]{\text{distribution}} \mathcal N(0,\sigma^2)$, so by Slutsky's theorem (if $Y_n$ converges in distribution to $Y$ and $Z_n$ converges in probability to a constant $c$, then $Y_nZ_n$ converges in distribution to $cY$) we get that $\sqrt{n}\sin(X_n)$ converges in distribution to $1 \cdot \mathcal N(0,\sigma^2) \sim \mathcal N(0,\sigma^2)$.
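As a sanity check, here is a minimal Monte Carlo sketch of a). It assumes, for illustration only, one concrete sequence satisfying the hypotheses, namely $X_n \sim \mathcal N(0, \sigma^2/n)$ (the sample mean of $n$ i.i.d. $\mathcal N(0,\sigma^2)$ draws); the Slutsky argument of course does not depend on this choice.

```python
import numpy as np

# Monte Carlo sanity check of a). Illustrative choice of sequence:
# X_n = mean of n i.i.d. N(0, sigma^2) draws, so X_n ~ N(0, sigma^2 / n),
# X_n -> 0 in probability and sqrt(n) * X_n ~ N(0, sigma^2) exactly.
rng = np.random.default_rng(0)
sigma = 2.0
n = 10_000        # index in the sequence
reps = 200_000    # Monte Carlo replications

X_n = rng.normal(0.0, sigma / np.sqrt(n), size=reps)  # samples of X_n
T_n = np.sqrt(n) * np.sin(X_n)                         # samples of sqrt(n) sin(X_n)

# If the claimed convergence holds, T_n should look like N(0, sigma^2):
print("mean     :", T_n.mean())   # expected close to 0
print("variance :", T_n.var())    # expected close to sigma^2 = 4.0
```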

b) As before, $\cos(X_n)$ converges to $1$ in probability, so we shouldn't expect any convergence in distribution of $n\cos(X_n)$ (rather some sort of divergence). Indeed, assume for contradiction that $n\cos(X_n) \to X$ in distribution for some (finite almost surely) random variable $X$. Then again, by Slutsky's theorem, we would get $$ n = (n\cos(X_n)) \cdot \frac{1}{\cos(X_n)} \xrightarrow[n \to \infty]{\text{distribution}} X \cdot \frac{1}{1} = X $$ (here Slutsky's theorem is applied with $Y_n=n\cos(X_n)$ and $Z_n=\frac{1}{\cos(X_n)}$), which is clearly wrong, since the deterministic sequence $n$ cannot converge in distribution to a (finite a.s.) random variable $X$.
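And a matching sketch for b), with the same illustrative choice $X_n \sim \mathcal N(0, \sigma^2/n)$, showing the whole distribution of $n\cos(X_n)$ drifting off to infinity:

```python
import numpy as np

# Quick check that n cos(X_n) drifts to infinity rather than settling into a
# limiting distribution, with the illustrative choice X_n ~ N(0, sigma^2 / n).
rng = np.random.default_rng(7)
sigma = 2.0
reps = 200_000

for n in (100, 1_000, 10_000):
    X_n = rng.normal(0.0, sigma / np.sqrt(n), size=reps)
    W_n = n * np.cos(X_n)
    # The distribution of W_n sits near n: its 5%, 50% and 95% quantiles all
    # grow without bound, so no convergence in distribution is possible.
    print(n, np.quantile(W_n, [0.05, 0.5, 0.95]))
```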
