If $X_1$ and $X_2$ are independent and each is asymptotically standard normal as $\min(n_1,n_2)\to\infty$, how do I show that $\frac{\sqrt {n_1}X_1+\sqrt {n_2}X_2}{\sqrt{n_1+n_2}}$ also has an asymptotic standard normal distribution?
The problem would be trivial if $n_1$ and $n_2$ were constants and $X_1$ and $X_2$ were asymptotically normal in some other index going to $\infty$: in that case $\sqrt{n_1}X_1$ would converge in distribution to $\mathcal N(0,n_1)$ and $\sqrt{n_2}X_2$ to $\mathcal N(0,n_2)$, so by independence their sum would converge to $\mathcal N(0,n_1+n_2)$, and dividing by the constant $\sqrt{n_1+n_2}$ gives $\mathcal N(0,1)$, which is what we want to show.
The same logic would also work if $n_1/n_2$ converged to a non-zero constant, because we could then divide the numerator and denominator by $\sqrt{n_2}$.
However, in this case $n_1$ and $n_2$ are not fixed, and I don't know what to do.
Best Answer
Write $a_n=\sqrt{n_1}/\sqrt{n_1+n_2}$ and $b_n=\sqrt{n_2}/\sqrt{n_1+n_2}$, and write $Z_n=a_nX_1+b_nX_2$ for your linear combination. Note that $a_n^2+b_n^2=1$ for every $n$.
Both $a_n$ and $b_n$ lie in $[0,1]$, so for every sequence $n_i$ there is a subsequence $n_{i_j}$ along which $(a_{n_{i_j}},b_{n_{i_j}})$ converges to some limit $(a,b)$; since $a_n^2+b_n^2=1$ for all $n$, the limit satisfies $a^2+b^2=1$. Along this subsequence, $Z_{n_{i_j}}\stackrel{d}{\to}aN_1+bN_2\sim\mathcal N(0,a^2+b^2)=\mathcal N(0,1)$, where $N_1$ and $N_2$ are independent standard normals (using Slutsky's theorem and the independence of $X_1$ and $X_2$).
So every sequence $n_i$ has a subsequence $n_{i_j}$ such that $Z_{n_{i_j}}\stackrel{d}{\to}\mathcal N(0,1)$, and that implies $Z_n\stackrel{d}{\to}\mathcal N(0,1)$ (e.g., see this answer).
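As a sanity check (not part of the argument above), here is a quick simulation sketch in Python. It assumes a concrete source of asymptotic normality: $X_1$ and $X_2$ are standardized means of $n_1$ and $n_2$ iid Uniform(0,1) draws, which are approximately $\mathcal N(0,1)$ by the CLT. The sample sizes are deliberately unbalanced so the coefficients $a_n$ and $b_n$ are far from equal.

```python
import numpy as np

rng = np.random.default_rng(0)

def standardized_mean(n, reps, rng):
    # sqrt(n) * (sample mean - 1/2) / sd, for n iid Uniform(0,1) draws;
    # approximately N(0,1) for large n by the central limit theorem
    u = rng.random((reps, n))
    return np.sqrt(n) * (u.mean(axis=1) - 0.5) / np.sqrt(1 / 12)

n1, n2 = 400, 900        # deliberately unbalanced, so a_n != b_n
reps = 200_000           # Monte Carlo replications

X1 = standardized_mean(n1, reps, rng)
X2 = standardized_mean(n2, reps, rng)

# Z_n = (sqrt(n1) X1 + sqrt(n2) X2) / sqrt(n1 + n2)
Z = (np.sqrt(n1) * X1 + np.sqrt(n2) * X2) / np.sqrt(n1 + n2)

# If the claim holds, Z should look standard normal:
print(Z.mean(), Z.var())            # both near 0 and 1
print((Z < 1.96).mean())            # near Phi(1.96) ~ 0.975
```

The mean, variance, and tail probability of the simulated $Z_n$ should all match the standard normal values up to Monte Carlo error.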