Maximal Inequality – IID Random Variables

Tags: inequalities, limit-theorems, pr.probability

Suppose that $\{X_{ij}\}_{1\leqslant i,j\leqslant n}$ are iid random variables with $\mathbb{E}(X_{11})=0$ and $\mathrm{Var}(X_{11})=1$. Does the following convergence hold:
$$
\max_{1\leqslant j\leqslant n}\biggl\{\frac{1}{n^2}\sum_{1\leqslant i\neq i'\leqslant n}(X_{ij}X_{i'j})\biggr\} \to 0 \qquad \text{almost surely}?
$$
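As a quick numerical illustration (not an argument), the sketch below assumes standard Gaussian entries and a few illustrative matrix sizes; it computes the quantity inside the maximum for every column $j$, via the identity $\sum_{i\neq i'}X_{ij}X_{i'j}=\bigl(\sum_i X_{ij}\bigr)^2-\sum_i X_{ij}^2$, and prints the maximum over $j$.

```python
# Numerical illustration only (not a proof): estimate
#   max_j (1/n^2) * sum_{i != i'} X_ij X_i'j
# for iid standard Gaussian entries (an assumed special case) and a few sizes n.
import numpy as np

rng = np.random.default_rng(0)

for n in (100, 400, 1600):
    X = rng.standard_normal((n, n))      # X[i, j], iid N(0, 1): mean 0, variance 1
    col_sum = X.sum(axis=0)              # sum_i X_ij, one value per column j
    col_sq_sum = (X ** 2).sum(axis=0)    # sum_i X_ij^2, one value per column j
    # cross terms: sum_{i != i'} X_ij X_i'j = (sum_i X_ij)^2 - sum_i X_ij^2
    stat = (col_sum ** 2 - col_sq_sum) / n ** 2
    print(n, stat.max())
```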

Comment: I first posted this question on MSE, and someone suggested that I post it on MathOverflow since it is an open problem.


Background

I am reading the AoP paper "Limit of the smallest eigenvalue of a large dimensional sample covariance matrix" by Z. Bai and Y. Yin (1993). Their Lemma 2 generalizes the well-known Marcinkiewicz-Zygmund strong law of large numbers to double arrays of iid random variables.

[Lemma 2 in Bai and Yin (1993)] Let $\{\xi_{ij},i,j=1,2,\ldots\}$ be a double array of iid random variables and let $\alpha>1/2$, $\beta\geqslant 0$ and $M>0$ be constants. Then as $n\to\infty$,
$$
\max_{j\leqslant Mn^{\beta}} \biggl|n^{-\alpha}\sum_{i=1}^n (\xi_{ij}-c)\biggr|\to0\quad \text{almost surely},
$$

if and only if
$$
(i)\quad \mathbb{E}|\xi_{11}|^{(1+\beta)/\alpha}<\infty
$$

$$
(ii)\quad c = \left\{
\begin{array}{ll}
\mathbb{E} \,\xi_{11},& \text{if }\alpha\leqslant 1, \\
\text{any number}, &\text{if }\alpha>1.
\end{array}
\right.
$$

By our assumptions, taking $\alpha=\beta=M=1$ and $\xi_{ij}=X_{ij}^2$ in this lemma, we have
$$
\max_{j\leqslant n}\biggl|\frac{1}{n}\sum_{i=1}^{n}X_{ij}^2-1\biggr|\to0\quad \text{almost surely}.
$$
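For the record, under this substitution condition (ii) gives $c=\mathbb{E}\,\xi_{11}=\mathbb{E}\,X_{11}^2=1$ (the case $\alpha\leqslant 1$), which is the $-1$ inside the absolute value, while condition (i) reads
$$
\mathbb{E}\,|\xi_{11}|^{(1+\beta)/\alpha}=\mathbb{E}\,X_{11}^{4}<\infty.
$$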

This result is for the square terms. I wonder whether there is a similar result for the cross terms:
$$
\max_{1\leqslant j\leqslant n}\biggl\{\frac{1}{n^2}\sum_{i\neq i'}(X_{ij}X_{i'j})\biggr\} \to 0 \qquad \text{almost surely}?
$$


Attempt

I can prove that $(1/n^2)\sum_{i\neq i'}(X_{ij}X_{i'j})\to 0$ almost surely for any fixed $j$, but I do not know how to handle the maximum over $j$.

For fixed $j$,
$$
\mathrm{Var}\Bigl(\sum_{i\neq i'}X_{ij}X_{i'j}\Bigr)=2\sum_{i\neq i'}\mathbb{E}\bigl(X_{ij}^2\bigr)\cdot\mathbb{E}\bigl(X_{i'j}^2\bigr)=2(n^2-n).
$$
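In more detail, since the entries are independent with mean zero, only the index pairs with $\{k,k'\}=\{i,i'\}$ contribute to the second moment:
$$
\mathrm{Var}\Bigl(\sum_{i\neq i'}X_{ij}X_{i'j}\Bigr)
=\sum_{i\neq i'}\sum_{k\neq k'}\mathbb{E}\bigl[X_{ij}X_{i'j}X_{kj}X_{k'j}\bigr]
=\sum_{i\neq i'}2\,\mathbb{E}\bigl(X_{ij}^{2}\bigr)\,\mathbb{E}\bigl(X_{i'j}^{2}\bigr)=2(n^{2}-n).
$$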

Then, by Chebyshev's inequality, for any $\varepsilon>0$,
$$
\Pr\biggl(\frac{1}{n^2}\sum_{i\neq i'}(X_{ij}X_{i'j})>\varepsilon\biggr) \leqslant \frac{2(n^2-n)}{\varepsilon^{2}n^{4}}=O\Bigl(\frac{1}{n^2}\Bigr),
$$

which is summable. Hence, by the Borel-Cantelli lemma, we have
$$
\frac{1}{n^2}\sum_{i\neq i'}(X_{ij}X_{i'j})\to 0\qquad
\text{almost surely}.$$

If we consider $\max_{1\leqslant j\leqslant n}$ and use a union bound, we have
$$
\Pr\biggl(\max_{1\leqslant j\leqslant n}\biggl\{\frac{1}{n^2}\sum_{i\neq i'}(X_{ij}X_{i'j})\biggr\}> \varepsilon\biggr)
\leqslant n\cdot \Pr\biggl(\frac{1}{n^2}\sum_{i\neq i'}(X_{i1}X_{i'1})>\varepsilon\biggr)=O\Bigl(\frac{1}{n}\Bigr),$$

which means
$$
\max_{1\leqslant j\leqslant n}\biggl\{\frac{1}{n^2}\sum_{i\neq i'}(X_{ij}X_{i'j})\biggr\}\to 0\qquad
\text{in probability}.\tag{*}
$$

How can we improve the result (*) to "almost surely"?

Best Answer

In the Math Stack Exchange post, I gave a proof based on Lemma 2 in Bai and Yin (1993). Here I give an alternative proof.

Writing
$$
\sum_{1\leqslant i\neq i'\leqslant n}X_{i,j}X_{i',j}=\left(\sum_{i=1}^n X_{i,j}\right)^2-\sum_{i=1}^nX_{i,j}^2
$$
and restricting to dyadic indices (the maxima over $\ell$ in (1) take care of the intermediate values of $n$; this is spelled out below), we are reduced to showing that
$$
\tag{1}\max_{1\leqslant j\leqslant 2^n}\frac{1}{2^{n}}\max_{1\leqslant \ell\leqslant 2^n}\left\lvert \sum_{i=1}^\ell X_{i,j}\right\rvert\to 0\quad\text{almost surely},
$$
$$
\tag{2}\max_{1\leqslant j\leqslant 2^n}\frac{1}{2^{2n}} \sum_{i=1}^{2^n} X_{i,j}^2\to 0\quad\text{almost surely}.
$$
By the Borel-Cantelli lemma, in order to prove (1) it suffices to show that
$$
\tag{1'}\forall\varepsilon>0,\quad \sum_{n\geqslant 1}\mathbb P\left(\max_{1\leqslant j\leqslant 2^n}\frac{1}{2^{n}}\max_{1\leqslant \ell\leqslant 2^n}\left\lvert \sum_{i=1}^\ell X_{i,j}\right\rvert>\varepsilon\right) <\infty.
$$
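In more detail, the dyadic reduction works as follows: for $2^m\leqslant n<2^{m+1}$ and $1\leqslant j\leqslant n$,
$$
\frac{1}{n^2}\left(\sum_{i=1}^{n}X_{i,j}\right)^2\leqslant\left(\frac{2}{2^{m+1}}\max_{1\leqslant \ell\leqslant 2^{m+1}}\left\lvert\sum_{i=1}^{\ell}X_{i,j}\right\rvert\right)^2
\qquad\text{and}\qquad
\frac{1}{n^2}\sum_{i=1}^{n}X_{i,j}^2\leqslant\frac{4}{2^{2(m+1)}}\sum_{i=1}^{2^{m+1}}X_{i,j}^2,
$$
and the maximum over $1\leqslant j\leqslant n$ is bounded by the maximum over $1\leqslant j\leqslant 2^{m+1}$, so (1) and (2) (along $m\to\infty$) imply the original claim.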

By a union bound over the (identically distributed) columns $j$, (1') reduces to proving that
$$
\tag{1''}\forall\varepsilon>0,\quad \sum_{n\geqslant 1}2^n\,\mathbb P\left( \frac{1}{2^{n}}\max_{1\leqslant \ell\leqslant 2^n}\left\lvert \sum_{i=1}^\ell X_{i,1}\right\rvert>\varepsilon\right) <\infty.
$$

Let us prove (1''). Using Corollary 1.5 in [1] (applied with $r'=r=2$ and $B=\mathbb R$, hence $C_{r',B}=1$), we know that for each $q>0$ there exist constants $A(q)$ and $B(q)$ such that for each $x>0$ and each finite sequence $(Y_i)_{1\leqslant i\leqslant N}$ of independent centered random variables,
$$
\mathbb P\left(\max_{1\leqslant \ell\leqslant N}\left\lvert \sum_{i=1}^\ell Y_i\right\rvert >x\right)\leqslant A(q)\int_0^1u^{q-1}\,\mathbb P\left(\max_{1\leqslant i\leqslant N}\lvert Y_i\rvert>xB(q)u\right)du+A(q)\,x^{-q}\left(\sum_{i=1}^N\mathbb E\left[Y_i^2\right]\right)^{q/2}.
$$

Applying this inequality with $Y_i=X_{i,1}$, $N=2^n$, $x=2^n\varepsilon$ and $q=3$ (and bounding the probability involving $\max_{1\leqslant i\leqslant 2^n}\lvert X_{i,1}\rvert$ by a further union bound) shows that
$$
\sum_{n\geqslant 1}2^n\,\mathbb P\left( \frac{1}{2^{n}}\max_{1\leqslant \ell\leqslant 2^n}\left\lvert \sum_{i=1}^\ell X_{i,1}\right\rvert>\varepsilon\right)\leqslant A(3)\sum_{n\geqslant 1}2^{2n}\int_0^1u^{2}\,\mathbb P\left( \lvert X_{1,1}\rvert>\varepsilon B(3)2^nu\right)du + A(3)\,\varepsilon^{-3}\sum_{n\geqslant 1}2^{n}\,2^{-3n/2}.
$$
The second series is geometric, and finiteness of the first series follows from the elementary fact that $\sum_{n\geqslant 1}2^{2n}\,\mathbb P\left(Y>2^n\right)\leqslant 4\,\mathbb E\left[Y^2\right]$.
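For completeness, the elementary fact follows by comparison with a geometric series:
$$
\sum_{n\geqslant 1}2^{2n}\mathbf{1}\{Y>2^n\}\leqslant\sum_{n\,:\,2^n<Y}4^{n}\leqslant\frac{4}{3}\,Y^2,
$$
and taking expectations gives $\sum_{n\geqslant 1}2^{2n}\,\mathbb P\left(Y>2^n\right)\leqslant 4\,\mathbb E\left[Y^2\right]$. Applying it with $Y=\lvert X_{1,1}\rvert/(\varepsilon B(3)u)$ for fixed $u\in(0,1)$ and integrating (Tonelli) yields
$$
\sum_{n\geqslant 1}2^{2n}\int_0^1u^{2}\,\mathbb P\left( \lvert X_{1,1}\rvert>\varepsilon B(3)2^nu\right)du
\leqslant\int_0^1u^{2}\cdot\frac{4\,\mathbb E\left[X_{1,1}^2\right]}{\varepsilon^2B(3)^2u^{2}}\,du
=\frac{4\,\mathbb E\left[X_{1,1}^2\right]}{\varepsilon^2B(3)^2}<\infty.
$$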

Statement (2) follows from an argument similar to the one for the usual strong law of large numbers, except that the truncation level changes with $n$. Let $Y_{i,j}=X_{i,j}^2$ and, for a fixed $\varepsilon>0$ and $n$, let
$$
Z_{n,i,j}=Y_{i,j}\mathbf{1} \{Y_{i,j}\leqslant\varepsilon 2^{2n} \}-\mathbb E\left[Y_{i,j}\mathbf{1}\{Y_{i,j}\leqslant\varepsilon 2^{2n}\}\right],
$$
$$
W_{n,i,j}=Y_{i,j}\mathbf{1}\{Y_{i,j}>\varepsilon 2^{2n}\}-\mathbb E\left[Y_{i,j}\mathbf{1}\{Y_{i,j}>\varepsilon 2^{2n}\}\right].
$$
We thus have to show that
$$
\tag{2'}\max_{1\leqslant j\leqslant 2^n}\frac{1}{2^{2n}} \sum_{i=1}^{2^n} Z_{n,i,j}\to 0\quad\text{almost surely},
$$
$$
\tag{2''}\max_{1\leqslant j\leqslant 2^n}\frac{1}{2^{2n}} \sum_{i=1}^{2^n} W_{n,i,j}\to 0\quad\text{almost surely}.
$$
For (2'), it suffices to show (by the Borel-Cantelli lemma) that $\sum_{n\geqslant 1}\mathbb E\left[\left(\max_{1\leqslant j\leqslant 2^n}\frac{1}{2^{2n}} \sum_{i=1}^{2^n} Z_{n,i,j}\right)^2\right]<\infty$. For (2''), the centering term is deterministic and satisfies $\frac{1}{2^{2n}}\sum_{i=1}^{2^n}\mathbb E\left[Y_{i,j}\mathbf{1}\{Y_{i,j}>\varepsilon 2^{2n}\}\right]\leqslant \mathbb E\left[Y_{1,1}\right]/2^{n}\to 0$, so it remains to observe that
$$
\left\{\max_{1\leqslant j\leqslant 2^n}\frac{1}{2^{2n}} \sum_{i=1}^{2^n} Y_{i,j}\mathbf{1}\{Y_{i,j}>\varepsilon 2^{2n}\}\neq 0 \right\}\subset\bigcup_{i,j=1}^{2^n}\left\{Y_{i,j}>\varepsilon 2^{2n}\right\}.
$$
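For completeness, the probabilities of these exceptional events are summable because $\mathbb E\left[Y_{1,1}\right]=\mathbb E\left[X_{1,1}^2\right]=1<\infty$:
$$
\sum_{n\geqslant 1}\mathbb P\left(\bigcup_{i,j=1}^{2^n}\left\{Y_{i,j}>\varepsilon 2^{2n}\right\}\right)
\leqslant\sum_{n\geqslant 1}2^{2n}\,\mathbb P\left(Y_{1,1}>\varepsilon 2^{2n}\right)
\leqslant\frac{4}{3}\cdot\frac{\mathbb E\left[Y_{1,1}\right]}{\varepsilon}<\infty,
$$
by the same geometric-series comparison as above (now with $2^{2n}=4^n$), so the Borel-Cantelli lemma yields (2'').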


[1] Davide Giraudo, "Deviation inequalities for Banach space valued martingales differences sequences and random fields", ESAIM: Probability and Statistics 23 (2019), 922–946.