[Math] Almost sure convergence + convergence in distribution implies joint convergence in distribution

almost-everywhere, convergence-divergence, probability, probability-distributions

I'm wondering: if I have two sequences of random variables $(X_n)$ and $(Y_n)$, defined on the same probability space, such that $X_n\stackrel{a.s.}{\rightarrow}X$ and $Y_n\stackrel{d}{\rightarrow}Y$, is it possible to conclude that they converge jointly in distribution, i.e. $$ (X_n,Y_n)\stackrel{d}{\rightarrow}(X,Y) $$ as $n\to\infty$?

I believe that this question is closely related to the following: if $Y_n\stackrel{d}{\rightarrow}Y$, is it true that $$ (X,Y_n)\stackrel{d}{\rightarrow}(X,Y) $$ as $n\to \infty$?

Thank you in advance for any thoughts, comments etc.!

Best Answer

For your second question, the answer is no in general. Take $Y_n = X$ for every $n$ (with $X$ non-degenerate), and let $Y$ be independent of $X$ with the same distribution as $X$. Then $Y_n\stackrel{d}{\rightarrow}Y$, since every $Y_n$ has the distribution of $Y$, but $(X,Y_n)=(X,X)$ is concentrated on the diagonal while $(X,Y)$ is a pair of independent copies of $X$, so $(X,Y_n)$ does not converge in distribution to $(X,Y)$. Since the second question is the special case $X_n = X$ of the first (and $X_n = X$ trivially converges almost surely to $X$), the answer to your first question is also no.
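
For concreteness, here is a short sketch of the check; the specific choice $X\sim\mathrm{Bernoulli}(1/2)$ is only illustrative and not part of the original answer.

```latex
% Illustrative check of the counterexample, assuming X ~ Bernoulli(1/2),
% Y_n = X for all n, and Y an independent copy of X.
\begin{align*}
  \Pr(X = 0,\ Y_n = 0) &= \Pr(X = 0) = \tfrac{1}{2}
    && \text{since } Y_n = X, \\
  \Pr(X = 0,\ Y = 0)   &= \Pr(X = 0)\,\Pr(Y = 0) = \tfrac{1}{4}
    && \text{by independence.}
\end{align*}
% The joint law of (X, Y_n) is the same for every n and differs from that
% of (X, Y) at a continuity point of both CDFs, so (X, Y_n) does not
% converge in distribution to (X, Y), even though Y_n -> Y in distribution
% and X_n := X -> X almost surely.
```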