Principle of Contraction for Random Variables

Tags: convergence-divergence, measure-theory, probability, rademacher-distribution, random-variables

Let $\epsilon_1, \epsilon_2, \epsilon_3, …, \epsilon_n, …$ be a sequence of independent random variables where $P[\epsilon_i=-1]=P[\epsilon_i=1] = 1/2$ for all $i$, i.e. the so-called Rademacher sequence.

Let $u_1, u_2, u_3, …, u_n, …$ be a sequence of reals.

Define a random series $S=\sum_{i=1}^\infty \epsilon_i u_i$.

Let $\lambda_i\in [0,1]$, fixed, for all $i$.

Define another random series $S_\lambda=\sum_{i=1}^\infty \lambda_i \epsilon_i u_i$.

Principle of Contraction: If $S$ converges almost surely, then $S_\lambda$ also converges almost surely.
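As a quick numerical sanity check of the statement (my own illustration, not part of the question): take $u_i = 1/i$, so $\sum u_i^2 < \infty$ and $S$ converges almost surely, together with an arbitrary non-constant choice of $\lambda_i \in [0,1]$. A single sampled path of the partial sums of both $S$ and $S_\lambda$ should settle down as $N$ grows.

```python
import random

# Sketch under assumed choices: u_i = 1/i and a cycling sequence of
# lambda_i values in [0, 1]; neither choice is from the original post.
LAMBDAS = (0.3, 0.9, 0.5)

random.seed(0)
S, S_lam = 0.0, 0.0
checkpoints = {}  # partial sums of S and S_lambda at selected N
for i in range(1, 100_001):
    eps = random.choice((-1.0, 1.0))   # Rademacher sign
    term = eps / i                     # epsilon_i * u_i with u_i = 1/i
    S += term
    S_lam += LAMBDAS[i % 3] * term     # lambda_i * epsilon_i * u_i
    if i in (10_000, 100_000):
        checkpoints[i] = (S, S_lam)

# Both paths should move very little between N = 10^4 and N = 10^5.
print(checkpoints)
```

On this path the fluctuation of either partial sum between $N = 10^4$ and $N = 10^5$ is tiny, consistent with both series being almost surely Cauchy.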


I tried to prove that $\{\sum_{i=1}^N \lambda_i \epsilon_i u_i\}_N$ is almost surely a Cauchy sequence, but I got stuck because $\epsilon_i u_i$ can be positive for some $i$ and negative for others.

To be more specific, for $N_2>N_1$, $$\left|\sum_{i=1}^{N_1} \lambda_i \epsilon_i u_i - \sum_{i=1}^{N_2} \lambda_i \epsilon_i u_i\right|=\left|\sum_{i=N_1+1}^{N_2} \lambda_i \epsilon_i u_i\right|.$$ We want the RHS above to go to zero as $N_1, N_2 \to \infty$.

What we have from the almost sure convergence of $S$ is that $\left|\sum_{i=N_1+1}^{N_2} \epsilon_i u_i\right|$ goes to zero as $N_1, N_2 \to \infty$, but I don't see how to use this to derive the desired convergence above.

Thanks for any help.


Edit: The book mentions that if $S$ converges almost surely, then $\mathbb P[|S|>x]$ goes to zero very rapidly as $x\to\infty$, and that one can use this fact to prove the Principle of Contraction. I didn't see how, so I tried the Cauchy sequence argument above. By the way, I also don't see why $\mathbb P[|S|>x]$ decreases rapidly, though I know it decreases to zero as $x\to\infty$.

Best Answer

When dealing with almost sure convergence of sequences of independent RVs, very useful tools are Kolmogorov's two- and three-series theorems.

In particular, since $\sum_{i=1}^\infty \epsilon_i u_i$ converges almost surely, we have from the three-series theorem that $\sum_{i=1}^\infty u_i^2 = \sum_{i=1}^\infty \mathrm{Var}(\epsilon_i u_i) < \infty$.

Now, we have from the two-series theorem that, since $\sum_{i=1}^\infty \mathbf{E}(\lambda_i \epsilon_i u_i) = 0$ and $\sum_{i=1}^\infty \mathrm{Var}(\lambda_i \epsilon_i u_i) = \sum_{i=1}^\infty \lambda_i^2 u_i^2 \leq \sum_{i=1}^\infty u_i^2 < \infty$, it follows that $\sum_{i=1}^\infty \lambda_i \epsilon_i u_i$ converges almost surely.
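The key comparison here is that the variance series of $S_\lambda$ is dominated term by term by that of $S$. A minimal numerical check of this domination, assuming the sample choice $u_i = 1/i$ (so $\sum u_i^2 = \pi^2/6$) and an arbitrary sequence $\lambda_i \in [0,1]$ of my own choosing:

```python
import math

def lam(i):
    # Arbitrary illustrative values in [0, 1]; not from the original answer.
    return 0.5 + 0.5 * math.cos(i) ** 2

N = 200_000
# Partial sums of the two variance series with u_i = 1/i.
var_S = sum(1.0 / i ** 2 for i in range(1, N + 1))
var_S_lam = sum((lam(i) / i) ** 2 for i in range(1, N + 1))

# Term-by-term domination: lambda_i^2 u_i^2 <= u_i^2, and both partial
# sums stay below the full sum pi^2 / 6.
print(var_S_lam, var_S, math.pi ** 2 / 6)
```

Since each term satisfies $\lambda_i^2 u_i^2 \leq u_i^2$, the finiteness of $\sum u_i^2$ carries over to $\sum \lambda_i^2 u_i^2$, which is exactly the hypothesis the two-series theorem needs.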


As for the tail probability, the $\epsilon_i$ are $1$-sub-Gaussian, from which it follows that $S_\lambda$ is sub-Gaussian with variance proxy $\sum_{i=1}^\infty \lambda_i^2 u_i^2$, and so $$ \Pr(|S_\lambda| > t) \leq 2\exp\Bigl\{-\frac{t^2}{2\sum_{i=1}^\infty \lambda_i^2 u_i^2}\Bigr\}, $$ which indeed falls off very quickly when $\sum_{i=1}^\infty u_i^2$ is finite.
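One can see this bound at work numerically. The sketch below (my own, with assumed parameters: the series truncated at $n$ terms, $u_i = 1/i$, $\lambda_i = 1$) compares the empirical tail frequency of the truncated sum against the sub-Gaussian (Hoeffding-type) bound with variance proxy $\sigma^2 = \sum_{i \le n} 1/i^2$:

```python
import math
import random

random.seed(1)
n, trials, t = 500, 5_000, 2.0

# Variance proxy of the truncated series with u_i = 1/i, lambda_i = 1.
sigma2 = sum(1.0 / i ** 2 for i in range(1, n + 1))
bound = 2.0 * math.exp(-t ** 2 / (2.0 * sigma2))

# Empirical frequency of |S| > t over independent Rademacher samples.
exceed = 0
for _ in range(trials):
    s = sum(random.choice((-1.0, 1.0)) / i for i in range(1, n + 1))
    if abs(s) > t:
        exceed += 1
empirical = exceed / trials

print(empirical, bound)
```

The empirical tail frequency comes out well below the bound, as expected, since the sub-Gaussian inequality is not tight for moderate $t$.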
