Convergence in distribution of a random vector implies convergence in distribution of the sum of its components.

probability, probability-distributions, probability-theory, stochastic-calculus

If $X_n,X,Y_n,Y$ are real-valued random variables and I assume that $(X_n,Y_n)\Rightarrow (X,Y)$ in distribution, I need to show that $X_n+Y_n\Rightarrow X+Y$ in distribution.

Since $(X_n,Y_n)\Rightarrow (X,Y)$, by definition I have, for all $f\in C_b(\Bbb{R}^2)$ (continuous and bounded functions), $$E(f(X_n,Y_n))\rightarrow E(f(X,Y)).$$ Now I wanted to take $g\in C_b(\Bbb{R})$ and show that $E(g(X_n+Y_n))\rightarrow E(g(X+Y))$, but writing $E(g(X_n+Y_n))$ as an integral does not help me much. So I think either there is another trick to solve the problem using this definition, or I need to use an equivalent statement.

Could someone help me?

Best Answer

Let $g$ be continuous and bounded. Write $g(x+y)=g(h(x,y))$ where $h(x,y)=x+y$. Then $f=g\circ h$ is continuous and bounded. Can you finish?
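
For completeness, here is a sketch of how the hint finishes the argument, using the $f$, $g$, $h$ from the answer above and the definition of convergence in distribution of $(X_n,Y_n)$:
$$
E\big(g(X_n+Y_n)\big)=E\big(g(h(X_n,Y_n))\big)=E\big(f(X_n,Y_n)\big)\longrightarrow E\big(f(X,Y)\big)=E\big(g(h(X,Y))\big)=E\big(g(X+Y)\big).
$$
Since this holds for every $g\in C_b(\Bbb{R})$, we conclude $X_n+Y_n\Rightarrow X+Y$ in distribution.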