Does $X_n$ converge in distribution when each random variable $X_1,X_2,\dots$ takes only one of two values?

convergence-divergence, limits, probability-theory, weak-convergence

Let each random variable $X_1,X_2,\dots$ take only two possible values. That is, for every $n$ there are two numbers $a_n<b_n$ with $a_n,b_n\in \mathbb{R}$ such that $X_n\in\{a_n,b_n\}$. Let $a_n$, $b_n$ and $P(X_n=a_n)$ have finite limits. Does $X_n$ converge in distribution?
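
For concreteness, one example of such a sequence (purely illustrative, not part of the problem) is

$$a_n=\tfrac1n\to 0,\qquad b_n=2+\tfrac1n\to 2,\qquad P(X_n=a_n)=\tfrac12+\tfrac1{2n}\to\tfrac12.$$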

What I have gotten so far:

Since $a_n$ and $b_n$ are real sequences with finite limits, there exist $a,b\in\mathbb{R}$ such that

$$\lim_{n\to\infty}a_n=a \quad \text{and} \quad \lim_{n\to\infty}b_n=b.$$

Since $P(X_n=a_n)$ has a finite limit, say $L$, for any $\epsilon >0$ there is $N$ such that for every $n\ge N$,
$$|P(X_n=a_n)-L|<\epsilon.$$

If $X_n$ converged in probability to some random variable $X$, then for every $\epsilon >0$,

$$\lim_{n\to\infty}P(|X_n-X|>\epsilon)=0.$$

Convergence in probability implies convergence in distribution, so it would suffice to prove convergence in probability.

Intuitively it makes sense that $X_n$ converges in probability, but I don't know how to prove it rigorously. How do I use the information about the limits to prove convergence in distribution?

Best Answer

Let $X$ be such that $P(X=a)=p$ and $P(X=b)=1-p$, where $a=\lim_n a_n$, $b=\lim_n b_n$, and $p_n := P(X_n=a_n)\to p\in[0,1]$.

We show that the moment generating functions converge pointwise, i.e. $M_{X_n}(t)\to M_X(t)$ for every $t\in\mathbb{R}$. Since each $X_n$ takes only two values, $M_{X_n}$ is finite everywhere, and pointwise convergence of MGFs on a neighbourhood of $0$ to the MGF of $X$ implies convergence in distribution.

For every fixed $t\in\mathbb{R}$,

$$M_{X_n}(t)=\mathbb{E}\!\left[e^{t X_n}\right]=p_n e^{a_n t}+(1-p_n) e^{b_n t}\;\longrightarrow\; p e^{a t}+(1-p) e^{b t} = \mathbb{E}\!\left[e^{t X}\right]=M_X(t).$$

Hence, $X_n\overset{\mathrm d}{\to} X$.
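
As a quick numerical sanity check (not part of the proof), here is a short Python sketch using one hypothetical admissible choice, $a_n=1/n$, $b_n=2+1/n$, $p_n=\tfrac12+\tfrac1{2n}$; it compares $F_{X_n}$ with $F_X$ at a few continuity points of $F_X$:

```python
# Hypothetical two-point sequence: a_n -> a = 0, b_n -> b = 2,
# and p_n = P(X_n = a_n) -> p = 1/2.
def a_n(n): return 1.0 / n
def b_n(n): return 2.0 + 1.0 / n
def p_n(n): return 0.5 + 0.5 / n   # stays in [0, 1] for every n >= 1

def cdf_Xn(x, n):
    """CDF of X_n, which equals a_n with probability p_n and b_n otherwise."""
    return p_n(n) * (x >= a_n(n)) + (1.0 - p_n(n)) * (x >= b_n(n))

def cdf_X(x, a=0.0, b=2.0, p=0.5):
    """CDF of the limiting two-point variable X."""
    return p * (x >= a) + (1.0 - p) * (x >= b)

# F_{X_n}(x) should approach F_X(x) at every continuity point of F_X
# (i.e. at every x different from a and b).
for x in (-1.0, 0.5, 1.0, 3.0):
    errors = [abs(cdf_Xn(x, n) - cdf_X(x)) for n in (10, 100, 1000, 10000)]
    print(f"x = {x}: |F_Xn(x) - F_X(x)| = {errors}")
```

Note that nothing is claimed (or needed) at $x=a$ and $x=b$: convergence in distribution only requires $F_{X_n}(x)\to F_X(x)$ at continuity points of $F_X$, and at the jump points it can indeed fail (e.g. $F_{X_n}(0)=0$ for every $n$ in this example, while $F_X(0)=\tfrac12$).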