Convergence of diagonal of double sequence of random variables

convergence-divergence, probability-theory, random-variables

I have a double sequence of random variables $X^n_m$, where $n, m \in \mathbb{N}$.

There exist variables $X^n$ and $X$ such that $X_m^n \overset{m\to \infty}{\longrightarrow} X^n$ almost surely and $X^n \overset{n\to \infty}{\longrightarrow} X$ in probability where $X$ is a constant.

There also exist variables $X_m$ such that $X_m^n \overset{n\to \infty}{\longrightarrow} X_m$ in probability and $X_m \overset{m\to \infty}{\longrightarrow} X$ almost surely.

My question: under what conditions can it be concluded that the 'diagonal' sequence $X_n^n \overset{n\to \infty}{\longrightarrow} X$ in probability?

In my setting, the variables $X^n$ are uniformly bounded and nonnegative, i.e. $\exists K : \mathbb{P}(0 \leq X^n \leq K)=1$ for all $n$. But I'm not sure how to use this fact or whether it is helpful.


Here's what I've tried so far:

To prove that $X_n^n \overset{n\to \infty}{\longrightarrow} X$ in probability, it suffices to prove the double limit $X_m^n \overset{n,m\to \infty}{\longrightarrow} X$ in probability, i.e. that
$\forall \epsilon, \delta>0\ \ \exists C$ such that $n, m > C \implies |X - X_m^n| > \epsilon$ with probability $\leq \delta$.

So, let $\epsilon, \delta > 0$ and try to prove existence of such a $C$.
Observe that

$|X - X_m^n| \leq |X - X^n| + |X^n - X_m^n|$

and using the fact that almost sure convergence implies convergence in probability, it follows that

$\exists N(\epsilon, \delta) : n > N(\epsilon, \delta) \implies |X - X^n| > \epsilon$ with probability $\leq \delta$

$\exists M(\epsilon, \delta, n) : m > M(\epsilon, \delta, n) \implies |X^n - X^n_m| > \epsilon$ with probability $\leq \delta$
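For fixed $n$, these two bounds combine via a union bound: for $n > N(\epsilon, \delta)$ and $m > M(\epsilon, \delta, n)$,

$\mathbb{P}\big(|X - X^n_m| > 2\epsilon\big) \leq \mathbb{P}\big(|X - X^n| > \epsilon\big) + \mathbb{P}\big(|X^n - X^n_m| > \epsilon\big) \leq 2\delta$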

The issue here is that $M$ depends on $n$: if $M(\epsilon, \delta, n)$ grows without bound as $n \to \infty$, then (informally) $m$ needs to grow faster than $n$, and no single $C$ of the required kind can exist. I'm not sure how to resolve this problem.

Best Answer

The assumptions given are not sufficient. Consider for example the constant random variables:

$X_m^n = 0$ if $n=m$

$X_m^n = 1$ if $n\not=m$

Then $X^n_m \overset{n\to\infty}{\longrightarrow} X_m = 1$ for any fixed $m$ and $X^n_m \overset{m\to\infty}{\longrightarrow}X^n = 1$ for any fixed $n$, so if we take $X=1$ then these variables satisfy the assumptions as given.

But in this case $X_n^n = 0 $ for all $n$, so $X^n_n \overset{n\to\infty}{\longrightarrow} 0 \not= X$.
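A quick numerical sanity check of this counterexample (a sketch assuming NumPy; since the variables are constants, no sampling is needed and a finite truncation $N$ suffices):

```python
import numpy as np

N = 200  # truncation size (hypothetical choice, any N works)

# X[n, m] = 0 if n == m, else 1
X = np.ones((N, N)) - np.eye(N)

# For fixed m, X[n, m] = 1 once n != m, and symmetrically for fixed n,
# so both iterated limits equal 1.
assert X[N - 1, 0] == 1.0 and X[0, N - 1] == 1.0

# But the diagonal sequence is identically 0, not 1.
assert np.all(np.diag(X) == 0.0)
```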

Convergence of the diagonal sequence does hold if we assume that $X_m^n \overset{m\to\infty}{\longrightarrow}X^n$ in probability uniformly in $n$, in the sense that:

$\forall \epsilon, \delta > 0 \ \ \exists M(\epsilon, \delta) : m > M(\epsilon, \delta) \implies |X^n - X^n_m| > \epsilon$ with probability $\leq \delta$, for every $n$.

If this holds, the attempted proof in the question statement works, taking $C = \max\{ N(\epsilon, \delta), M(\epsilon, \delta) \}$. In fact, it suffices to have convergence of all variables only in probability, and not almost surely.
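To see the uniform condition in action, here is a small simulation sketch with a hypothetical family (not from the question): $X^n_m = X + U/m + 1/n$ with $U \sim \mathrm{Uniform}(-1,1)$, so $|X^n - X^n_m| = |U|/m \leq 1/m$ for every $n$ and the convergence in $m$ is uniform in $n$. The estimated probability $\mathbb{P}(|X^n_n - X| > \epsilon)$ then drops to $0$ along the diagonal:

```python
import numpy as np

rng = np.random.default_rng(0)
X_true = 1.0  # the constant limit X

def X_nm(n, m, size):
    # Hypothetical family: X^n_m = X + U/m + 1/n, U ~ Uniform(-1, 1).
    # |X^n - X^n_m| = |U|/m <= 1/m for every n: uniform convergence in m.
    U = rng.uniform(-1.0, 1.0, size)
    return X_true + U / m + 1.0 / n

eps = 0.05
probs = {}
for n in (10, 100, 1000):
    samples = X_nm(n, n, 10_000)  # the diagonal sequence X^n_n
    probs[n] = np.mean(np.abs(samples - X_true) > eps)
    print(n, probs[n])
```

On the diagonal, $|X^n_n - X| = |U + 1|/n \leq 2/n$, so the estimated probability is exactly $0$ once $2/n < \epsilon$.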
