Maximal inequality of iid random variables $\{X_{ij}\}_{1\leqslant i,j \leqslant n}$

analysis, inequality, law-of-large-numbers, probability, probability theory

Suppose that $\{X_{ij}\}_{1\leqslant i,j\leqslant n}$ are iid random variables with $\mathbb{E}(X_{11})=0$ and $\mathrm{Var}(X_{11})=1$. Does the following convergence hold:
$$
\max_{1\leqslant j\leqslant n}\biggl\{\frac{1}{n^2}\sum_{1\leqslant i\neq i'\leqslant n}(X_{ij}X_{i'j})\biggr\} \to 0 \qquad \text{almost surely}?
$$

Comment: I have also posted this question on MathOverflow, following @D.R.'s suggestion.
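As a quick sanity check (not a proof), a small Monte Carlo sketch in Python, assuming NumPy and standard normal entries, suggests the maximum indeed shrinks; it uses the identity $\sum_{i\neq i'}X_{ij}X_{i'j}=\bigl(\sum_i X_{ij}\bigr)^2-\sum_i X_{ij}^2$ to avoid a double loop:

```python
import numpy as np

rng = np.random.default_rng(0)

def max_cross_term(n):
    """Max over columns j of (1/n^2) * sum_{i != i'} X_{ij} X_{i'j}."""
    X = rng.standard_normal((n, n))     # iid entries, mean 0, variance 1
    col_sum = X.sum(axis=0)             # sum_i X_{ij}, one entry per column j
    col_sq_sum = (X ** 2).sum(axis=0)   # sum_i X_{ij}^2, one entry per column j
    # sum_{i != i'} X_{ij} X_{i'j} = (sum_i X_{ij})^2 - sum_i X_{ij}^2
    return ((col_sum ** 2 - col_sq_sum) / n ** 2).max()

for n in [100, 400, 1600]:
    print(n, max_cross_term(n))
```

The printed maxima decrease toward $0$ as $n$ grows, consistent with the conjectured convergence.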

Background

I am reading the Annals of Probability paper "Limit of the smallest eigenvalue of a large dimensional sample covariance matrix" by Z. Bai and Y. Yin (1993). Their Lemma 2 states a generalization of the well-known Marcinkiewicz-Zygmund strong law of large numbers to double arrays of iid random variables.

[Lemma 2 in Bai and Yin (1993)] Let $\{\xi_{ij},i,j=1,2,\ldots\}$ be a double array of iid random variables and let $\alpha>1/2$, $\beta\geqslant 0$ and $M>0$ be constants. Then as $n\to\infty$,
$$
\max_{j\leqslant Mn^{\beta}} \biggl|n^{-\alpha}\sum_{i=1}^n (\xi_{ij}-c)\biggr|\to0\quad \text{almost surely},
$$

if and only if
$$
(i)\quad \mathbb{E}|\xi_{11}|^{(1+\beta)/\alpha}<\infty
$$

$$
(ii)\quad c = \left\{
\begin{array}{ll}
\mathbb{E} \,\xi_{11},& \text{if }\alpha\leqslant 1, \\
\text{any number}, &\text{if }\alpha>1.
\end{array}
\right.
$$

By our assumptions, taking $\alpha=\beta=M=1$ and $\xi_{ij}=X_{ij}^2$ in this lemma (note that condition (i) then requires $\mathbb{E}\,X_{11}^{4}<\infty$), we have
$$
\max_{j\leqslant n}\biggl|\frac{1}{n}\sum_{i=1}^n X_{ij}^2-1\biggr|\to0\quad \text{almost surely}.
$$

This result is for the square terms. I wonder if there is a similar result for the cross terms:
$$
\max_{1\leqslant j\leqslant n}\biggl\{\frac{1}{n^2}\sum_{i\neq i'}(X_{ij}X_{i'j})\biggr\} \to 0 \qquad \text{almost surely}?
$$


Attempt

I can prove that $(1/n^2)\sum_{i\neq i'}(X_{ij}X_{i'j})\to 0$ almost surely for any fixed $j$, but I do not know how to handle the maximum over $j$.

For fixed $j$, since $\mathbb{E}X_{11}=0$, the products $X_{ij}X_{i'j}$ are pairwise uncorrelated except for the pairs $(i,i')$ and $(i',i)$, so
$$
\mathrm{Var}\Bigl(\sum_{i\neq i'}X_{ij}X_{i'j}\Bigr)=2\sum_{i\neq i'}\mathbb{E}\bigl(X_{ij}^2\bigr)\cdot\mathbb{E}\bigl(X_{i'j}^2\bigr)=2(n^2-n),
$$

then by Chebyshev's inequality, for any $\varepsilon>0$,
$$
\Pr\biggl(\biggl|\frac{1}{n^2}\sum_{i\neq i'}(X_{ij}X_{i'j})\biggr|>\varepsilon\biggr) \leqslant \frac{2(n^2-n)}{n^4\varepsilon^2}=O\Bigl(\frac{1}{n^2}\Bigr),
$$

which is summable. Hence, by the Borel-Cantelli lemma, we have
$$
\frac{1}{n^2}\sum_{i\neq i'}(X_{ij}X_{i'j})\to 0\qquad
\text{almost surely}.$$
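As a sanity check on the variance formula above, one can compare an empirical estimate against $2(n^2-n)$; a minimal sketch, again assuming NumPy and standard normal entries (the column index $j$ is dropped since only one column is involved):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 50, 20000

# reps independent copies of S = sum_{i != i'} X_i X_{i'}
#                              = (sum_i X_i)^2 - sum_i X_i^2
X = rng.standard_normal((reps, n))
S = X.sum(axis=1) ** 2 - (X ** 2).sum(axis=1)

print("empirical variance  :", S.var())           # close to 4900
print("theoretical 2(n^2-n):", 2 * (n ** 2 - n))  # exactly 4900 for n = 50
```

The two numbers agree up to Monte Carlo error.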

If we consider $\max_{1\leqslant j\leqslant n}$ and use the union bound, we have
$$
\Pr\biggl(\max_{1\leqslant j\leqslant n}\biggl\{\frac{1}{n^2}\sum_{i\neq i'}(X_{ij}X_{i'j})\biggr\}> \varepsilon\biggr)
\leqslant n\cdot \Pr\biggl(\frac{1}{n^2}\sum_{i\neq i'}(X_{i1}X_{i'1})>\varepsilon\biggr)=O\Bigl(\frac{1}{n}\Bigr),
$$

which means
$$
\max_{1\leqslant j\leqslant n}\biggl\{\frac{1}{n^2}\sum_{i\neq i'}(X_{ij}X_{i'j})\biggr\}\to 0\qquad
\text{in probability}.\tag{*}
$$

How can we improve the result (*) to "almost surely"?

Best Answer

Applying
$$
\tag{$\star$}
\max_{1\leqslant j\leqslant Mn^{\beta}} \biggl|n^{-\alpha}\sum_{i=1}^n (\xi_{ij}-c)\biggr|\to0\quad \text{almost surely}
$$
to $\xi_{ij}=X_{ij}^2$ with $M=1$, $\beta=1$, $\alpha=2$ and $c=1$ (any $c$ is admissible since $\alpha>1$, and condition (i) reads $\mathbb{E}\,X_{11}^2<\infty$, which holds) gives
$$
\max_{1\leqslant j\leqslant n} \biggl|n^{-2}\sum_{i=1}^n \bigl(X_{ij}^2-1\bigr)\biggr|\to0\quad \text{almost surely},
$$
and since $n^{-2}\sum_{i=1}^n 1=n^{-1}\to0$, this yields
$$
\tag{1}
\max_{1\leqslant j\leqslant n} n^{-2}\sum_{i=1}^n X_{ij}^2 \to0\quad \text{almost surely}.
$$

Applying $(\star)$ to $\xi_{ij}=X_{ij}$ (then $c=\mathbb{E}\,X_{11}=0$, and condition (i) again reads $\mathbb{E}\,X_{11}^2<\infty$) with $M=\alpha=\beta=1$ gives
$$
\max_{1\leqslant j\leqslant n} \biggl|n^{-1}\sum_{i=1}^n X_{ij}\biggr|\to0\quad \text{almost surely},
$$
or in other words,
$$
\tag{2}
\max_{1\leqslant j\leqslant n}n^{-2}\biggl(\sum_{i=1}^n X_{ij}\biggr)^2\to0\quad \text{almost surely}.
$$

The wanted convergence is the combination of (1) and (2): expanding the square gives
$$
\sum_{i\neq i'}X_{ij}X_{i'j}=\biggl(\sum_{i=1}^n X_{ij}\biggr)^2-\sum_{i=1}^n X_{ij}^2,
$$
so for every $j$,
$$
\biggl|\frac{1}{n^2}\sum_{i\neq i'}X_{ij}X_{i'j}\biggr|\leqslant n^{-2}\biggl(\sum_{i=1}^n X_{ij}\biggr)^2+n^{-2}\sum_{i=1}^n X_{ij}^2,
$$
and taking the maximum over $j\leqslant n$ and invoking (1) and (2) gives the wanted almost sure convergence.
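To see the combination concretely, here is a small numerical sketch (assuming NumPy and standard normal entries) printing the maxima in (1), (2), and the cross-term maximum side by side; all three shrink as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(2)

for n in [200, 800, 3200]:
    X = rng.standard_normal((n, n))
    sq_max = ((X ** 2).sum(axis=0) / n ** 2).max()        # max in (1)
    mean_sq_max = ((X.sum(axis=0) / n) ** 2).max()        # max in (2)
    cross_max = ((X.sum(axis=0) ** 2
                  - (X ** 2).sum(axis=0)) / n ** 2).max() # cross terms
    print(n, sq_max, mean_sq_max, cross_max)
```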
