Convergence in distribution for joint vector and maximum

convergence-divergence · order-statistics · probability-distributions · weak-convergence

I know that $X_n\xrightarrow{d} X$ and $Y_n\xrightarrow{d} Y$ does not necessarily imply $\max(X_n,Y_n)\xrightarrow{d} \max(X,Y)$, where $\xrightarrow{d}$ is convergence in distribution.
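
For concreteness, here is a minimal numerical sketch of one standard counterexample (my own choice of variables, not part of the original question): take $X_n = Z$ and $Y_n = -Z$ with $Z \sim N(0,1)$ for every $n$, and let the limits $X, Y$ be independent standard normals. Each margin converges in distribution, yet $\max(X_n, Y_n) = |Z|$ has a different distribution from $\max(X, Y)$.

```python
# Illustrative counterexample (my own choice, not from the question):
# X_n = Z and Y_n = -Z, so each margin is N(0,1) for every n,
# but max(X_n, Y_n) = |Z|, which differs in distribution from
# max(X, Y) for independent N(0,1) variables X, Y.
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)

max_n = np.maximum(z, -z)            # = |Z| for every n
x, y = rng.standard_normal((2, 100_000))
max_limit = np.maximum(x, y)         # max of independent N(0,1) limits

# Compare the two distributions at t = 0:
print(np.mean(max_n <= 0))           # ~0.0  (|Z| <= 0 essentially never)
print(np.mean(max_limit <= 0))       # ~0.25 (both X and Y negative)
```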

However, if the joint vector $(X_n,Y_n)$ is such that $(X_n,Y_n)\xrightarrow{d} (X,Y)$, is it then true that $\max(X_n,Y_n)\xrightarrow{d} \max(X,Y)$?


Edit: I think this is true as:

$$(X_n,Y_n)\xrightarrow{d} (X,Y)$$
$$\implies \forall x, y, \lim_{n \rightarrow \infty} F_{(X_n, Y_n) }(x, y) = F_{(X, Y)} (x, y) $$
$$\implies \forall x, \lim_{n \rightarrow \infty} F_{(X_n, Y_n) }(x, x) = F_{(X, Y)} (x,x) $$
$$\implies \forall x, \lim_{n \rightarrow \infty} F_{\max(X_n, Y_n) }(x) = F_{\max(X, Y)} (x) $$
$$\implies\max(X_n,Y_n)\xrightarrow{d} \max(X,Y)$$

where $F_A(a)$ represents the cumulative distribution function of $A$ evaluated at $a$. Please could someone confirm if my reasoning is correct?

Best Answer

Yes, the result is true. I assume the random variables are real-valued.

The function $\mathbb{R}^2\to\mathbb{R}, (x,y) \mapsto \max(x, y)$ is continuous. Hence, applying the continuous mapping theorem to $(X_n, Y_n) \overset{d}{\to} (X,Y)$ yields $$\max(X_n,Y_n) \overset{d}{\to} \max(X,Y) .$$
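
As a quick Monte Carlo sanity check, here is a minimal sketch of this conclusion. The particular sequence $(X_n, Y_n) = (Z_1 + 1/n, Z_2 + 1/n)$, with $(Z_1, Z_2)$ correlated normals, is an arbitrary illustrative choice (not from the question); it converges almost surely, hence in distribution, to $(Z_1, Z_2)$, and the empirical distribution function of $\max(X_n, Y_n)$ approaches that of $\max(Z_1, Z_2)$ as $n$ grows.

```python
# Monte Carlo sanity check of the continuous mapping argument (a sketch;
# the jointly convergent sequence below is an arbitrary illustrative choice):
# (X_n, Y_n) = (Z1 + 1/n, Z2 + 1/n) with (Z1, Z2) correlated normals.
import numpy as np

rng = np.random.default_rng(0)
cov = [[1.0, 0.7], [0.7, 1.0]]
z1, z2 = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T

def ecdf_of_max(a, b, grid):
    # empirical CDF of max(a, b) evaluated on a grid of points
    m = np.maximum(a, b)
    return np.array([np.mean(m <= t) for t in grid])

grid = np.linspace(-2.0, 3.0, 11)
limit = ecdf_of_max(z1, z2, grid)
for n in (1, 10, 1000):
    approx = ecdf_of_max(z1 + 1/n, z2 + 1/n, grid)
    print(n, np.max(np.abs(approx - limit)))   # discrepancy shrinks as n grows
```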


Regarding your proof, you have to be a bit more careful: convergence in distribution is equivalent to pointwise convergence of the distribution functions only at the continuity points of the limit's distribution function, not at every point, so the first implication in your chain does not hold in general.

For example, consider deterministic random variables $X_n = \frac{1}{n}$, $X = 0$. Then we have $X_n \overset{d}{\to} X$, but the distribution functions do not converge at the point $0$: $$\lim_{n\to\infty} F_{X_n}(0) = \lim_{n\to\infty} 0 \neq 1 = F_{X}(0) $$
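
The same computation in code (a tiny numeric restatement of this example, nothing more):

```python
# X_n = 1/n deterministic: F_{X_n}(0) = P(X_n <= 0) = 0 for every n,
# while the limit X = 0 has F_X(0) = 1; 0 is a discontinuity point of F_X,
# so the CDFs fail to converge there even though X_n -> X in distribution.
for n in (1, 10, 100, 1000):
    x_n = 1.0 / n
    print(n, float(x_n <= 0))        # F_{X_n}(0) = 0 for every n
print("limit:", float(0.0 <= 0.0))   # F_X(0) = 1
```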
