[Math] Convergence in distribution of differences (sums) of random variables

probability theory

Suppose $X_n$ and $Y_n$ are sequences of random variables defined on a common probability space, such that $X_n-Y_n \to C$ converges in distribution to the constant random variable $C\equiv c$ as $n \to \infty$, and also $Y_n \to Y$ in distribution. Does it then hold, without any further assumptions on the random variables, that $X_n \to C+Y$ in distribution?

If not, what are the minimum requirements on $X_n$ and $Y_n$ for this to be true?

What can be said about the case where $C$ is replaced by an arbitrary random variable?

Best Answer

Lemma 1. If $\{A_n\}$ is a sequence of random variables that converges in distribution to a constant $c$, then it converges in probability to $c$.

Fix $\varepsilon>0$ and let $f$ be the piecewise linear function with $f(x)=1$ if $|x-c|\leqslant\varepsilon$, $f(x)=0$ if $|x-c|\geqslant 2\varepsilon$, and linear in between. Since $f$ is bounded and continuous and $f(c)=1$, convergence in distribution gives $$\int f(A_n)\,dP\to f(c)=1.$$ As $f\leqslant 1$ and $f$ vanishes outside $\{|x-c|<2\varepsilon\}$, $$\int f(A_n)\,dP\leqslant P(|A_n-c|<2\varepsilon),$$ hence $$P(|A_n-c|\geqslant 2\varepsilon)\leqslant 1-\int f(A_n)\,dP\to 0.$$ Since $\varepsilon>0$ was arbitrary, this proves convergence in probability.
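As a numerical sanity check of Lemma 1 (a toy Monte Carlo sketch, not part of the proof; the particular sequence $A_n = c + Z/\sqrt{n}$ with $Z$ standard normal is an assumption chosen for illustration), one can estimate $P(|A_n-c|>\varepsilon)$ and watch it shrink as $n$ grows:

```python
import random

# Toy illustration of Lemma 1: A_n = c + Z/sqrt(n), Z standard normal,
# converges in distribution to the constant c, and the tail probability
# P(|A_n - c| > eps) shrinks with n, i.e. convergence in probability.
random.seed(0)
c, eps, trials = 2.0, 0.1, 20_000

def tail_prob(n):
    """Monte Carlo estimate of P(|A_n - c| > eps) for A_n = c + Z/sqrt(n)."""
    hits = sum(abs(random.gauss(0.0, 1.0) / n**0.5) > eps for _ in range(trials))
    return hits / trials

p_small, p_large = tail_prob(10), tail_prob(10_000)
print(p_small, p_large)  # the tail probability decreases as n grows
```

For $n=10$ the standard deviation of $A_n-c$ is about $0.32$, so the tail probability is large; for $n=10{,}000$ it is about $0.01$, and the event $|A_n-c|>0.1$ becomes a $10\sigma$ deviation.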

Lemma 2. If $\{X_n\}$ converges in distribution to $X$, and $\{Y_n\}$ in probability to $c$, where $c$ is constant, then $\{X_n+Y_n\}$ converges in distribution to $X+c$.

Indeed, by the portmanteau theorem it is enough to check that $\int f(X_n+Y_n)\,dP\to \int f(X+c)\,dP$ for all bounded, uniformly continuous $f$. Fix $\varepsilon>0$ and choose $\delta$ as in the definition of uniform continuity, so that $|f(s)-f(t)|\leqslant\varepsilon$ whenever $|s-t|<\delta$. Splitting on the event $\{|Y_n-c|\geqslant\delta\}$, we get $$\left|\int f(X_n+Y_n)\,dP-\int f(X+c)\,dP\right|\leqslant 2\sup |f|\cdot P(|Y_n-c|\geqslant \delta)+\varepsilon+\left|\int \bigl(f(X_n+c)-f(X+c)\bigr)\,dP\right|.$$ As $f(c+\cdot)$ is bounded and continuous and $Y_n\to c$ in probability, the first and last terms vanish as $n\to\infty$, so $$\limsup_{n\to +\infty}\left|\int f(X_n+Y_n)\,dP-\int f(X+c)\,dP\right|\leqslant \varepsilon.$$ Since $\varepsilon>0$ was arbitrary, this proves convergence in law of $\{X_n+Y_n\}$ to $X+c$.

Combining the two lemmas answers the question: by Lemma 1, $X_n-Y_n\to c$ in probability, and since $Y_n\to Y$ in distribution, Lemma 2 (applied with $X_n-Y_n$ in the role of the sequence converging in probability) gives $X_n=(X_n-Y_n)+Y_n\to Y+c$ in distribution. When $C$ is an arbitrary random variable, the conclusion can fail without further assumptions, since convergence in distribution of the two sequences separately does not determine the joint law of $C$ and $Y$, and $C+Y$ is not even well-defined in distribution without it.
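Lemma 2 can also be checked numerically (a toy Monte Carlo sketch under assumed distributions, not a proof: here $X_n = X$ is standard normal and $Y_n = c + U/n$ with $U$ uniform on $[0,1]$, so $Y_n \to c$ in probability). The empirical CDF of $X_n+Y_n$ should then be close to the CDF of $X+c$, i.e. of $\mathrm{Normal}(c,1)$:

```python
import math
import random

# Toy illustration of Lemma 2 (Slutsky): X_n = X ~ N(0,1), Y_n = c + U/n -> c
# in probability. The empirical CDF of X_n + Y_n should approach the CDF of
# X + c, which is Normal(c, 1).
random.seed(1)
c, n, trials = 2.0, 1000, 50_000

def std_normal_cdf(x):
    """Exact standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

samples = [random.gauss(0.0, 1.0) + (c + random.random() / n)
           for _ in range(trials)]

# Largest gap between the empirical CDF of X_n + Y_n and the Normal(c, 1)
# CDF, measured at a few test points around c.
max_gap = max(
    abs(sum(s <= t for s in samples) / trials - std_normal_cdf(t - c))
    for t in (c - 1.0, c, c + 1.0)
)
print(max_gap)  # small: the law of X_n + Y_n is close to Normal(c, 1)
```

With $50{,}000$ samples the Monte Carlo error is on the order of $1/\sqrt{50{,}000}\approx 0.005$, and the bias from the perturbation $U/n\leqslant 10^{-3}$ is even smaller, so the observed gap is a fraction of a percent.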