[Math] Two different sequences of random variables each converge in distribution; does their sum converge in distribution?

convergence-divergence, probability

My question is about basic probability.

We have two sequences of random variables, $\{ X_n \}$ and $\{ Y_n \}$, such that each converges in distribution, i.e. there exist random variables $X$ and $Y$ such that:

$X_n \rightarrow X$ as $n \rightarrow \infty$ in law, and $Y_n \rightarrow Y$ as $n \rightarrow \infty$ in law.

What can we say about the sequence $X_n + Y_n$? I would like to say that it converges in law to $X + Y$, but is this true?

Many apologies if this question has already been asked and answered. Thank you so much!!

John

p.s. Feel free to add some assumptions like independence of the $X_n$ and $Y_n$ in the process of answering this question.

Best Answer

In general, this is not true: convergence in law of each coordinate does not pin down the joint distribution of $(X_n, Y_n)$, and the distribution of the sum depends on that joint distribution. However, if one of the two sequences converges in probability (which is the case when $X$ or $Y$ is constant), then the result holds by Slutsky's theorem.
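As a sketch of why the joint behavior matters, here is a standard counterexample, with $Z$ denoting a standard normal random variable (chosen purely for illustration):

$$
X_n = Z, \qquad Y_n = (-1)^n Z \quad \text{for all } n.
$$

Since $-Z$ has the same distribution as $Z$, both sequences converge in law to $Z$. However,

$$
X_n + Y_n =
\begin{cases}
2Z, & n \text{ even},\\[2pt]
0, & n \text{ odd},
\end{cases}
$$

so $X_n + Y_n$ does not converge in distribution at all.

Regarding the p.s.: if $X_n$ and $Y_n$ are independent for every $n$, and $X$ and $Y$ are taken independent, then $X_n + Y_n \to X + Y$ in law, since the characteristic functions satisfy $\varphi_{X_n + Y_n}(t) = \varphi_{X_n}(t)\,\varphi_{Y_n}(t) \to \varphi_X(t)\,\varphi_Y(t) = \varphi_{X+Y}(t)$ and Lévy's continuity theorem applies.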
