Prove that $Z_n = \sum_{k=1}^{n} (X_k - \mu_k)$ converges a.s.

convergence-divergence, probability-theory, random-variables, self-learning, sequences-and-series

Let $\{ X_n, n\in \mathbb{N}\}$ be a sequence of real random variables in $L_2(\Omega, \mathcal{A}, P)$ with $\mu_n = E(X_n)$ and $\sigma_n^2 = \sigma^2(X_n)$.

Part 1:

Suppose that $\sum_{n=1}^{\infty} \sigma_n^2 < \infty$. Prove that $\{ X_n - \mu_n, n\ge 1\}$ converges to $0$ a.s.

Part 2:

Suppose that $\sum_{n=1}^{\infty} n^4 \sigma_n^2 < \infty$. Then prove that $Z_n = \sum_{k=1}^{n} (X_k - \mu_k)$ converges a.s.


My solution for Part 1:

By Chebyshev's inequality,

$P(|X_n - \mu_n| > \varepsilon) \le \sigma_n^2 / \varepsilon^2$ for every $\varepsilon > 0$.

Define $A_n = \{ \omega : |X_n - \mu_n| > \varepsilon \}$, so that

$P(A_n) = P(|X_n - \mu_n| > \varepsilon) \le \sigma_n^2 / \varepsilon^2$,

$\sum_n P(A_n) \le \sum_n \sigma_n^2 / \varepsilon^2 < \infty.$

By the Borel-Cantelli lemma,

since $\sum_n P(A_n) < \infty$, we get $P(\limsup_n A_n) = P(|X_n - \mu_n| > \varepsilon \text{ i.o.}) = 0$.

Since this holds for every $\varepsilon > 0$ (take $\varepsilon = 1/m$, $m \in \mathbb{N}$, and a countable union), $X_n - \mu_n \to 0$ a.s.
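Not part of the proof, but the Part 1 claim is easy to see numerically. The sketch below (my own choice of distribution, not from the question) takes independent $X_n \sim N(0, \sigma_n^2)$ with $\sigma_n = 1/n$, so $\sum \sigma_n^2 = \pi^2/6 < \infty$, and compares the size of early versus late fluctuations along one sample path:

```python
import numpy as np

# Hypothetical concrete example: X_n ~ N(0, sigma_n^2) with sigma_n = 1/n,
# so that sum sigma_n^2 < infinity as in Part 1. mu_n = 0 here, so the
# sequence X_n - mu_n is just X_n.
rng = np.random.default_rng(0)   # fixed seed for reproducibility
n = np.arange(1, 10_001)
sigma = 1.0 / n                  # standard deviations with summable squares
x = rng.normal(0.0, sigma)       # one sample path of X_n - mu_n

# If X_n - mu_n -> 0 a.s., the sup of the tail |X_n| should shrink.
tail_sup_early = np.max(np.abs(x[:100]))
tail_sup_late = np.max(np.abs(x[5000:]))
print(tail_sup_early, tail_sup_late)
```

On a typical run the late supremum is several orders of magnitude smaller than the early one, consistent with almost sure convergence to $0$.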

So far, I hope this is correct (since I am not sure), and I am stuck on Part 2. How can I deal with Part 2?

Best Answer

You have done the first part correctly.

$E|X_k-\mu_k| \leq \sqrt{E(X_k-\mu_k)^{2}} = \sigma_k$. Note that $\sum_k \sigma_k = \sum_k (k^{2}\sigma_k)\left(\frac{1}{k^{2}}\right)$. By the Cauchy-Schwarz inequality, $\sum_k \sigma_k \le \left(\sum_k k^{4}\sigma_k^{2}\right)^{1/2}\left(\sum_k \frac{1}{k^{4}}\right)^{1/2} < \infty$. Hence $E\sum_k |X_k-\mu_k| = \sum_k E|X_k-\mu_k| \le \sum_k \sigma_k < \infty$ (interchanging sum and expectation is justified by Tonelli's theorem), so $\sum_k |X_k-\mu_k| < \infty$ a.s. Hence $\sum_k (X_k-\mu_k)$ converges absolutely with probability $1$.
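The conclusion of Part 2 can also be checked numerically. The sketch below (my own illustrative choice, not from the answer) takes independent $X_k \sim N(0, \sigma_k^2)$ with $\sigma_k = k^{-3}$, so $\sum k^4\sigma_k^2 = \sum k^{-2} < \infty$, and verifies that both $Z_n$ and the absolute partial sums stabilize along a sample path:

```python
import numpy as np

# Hypothetical concrete example for Part 2: sigma_k = 1/k^3, so that
# sum k^4 sigma_k^2 = sum 1/k^2 < infinity. With mu_k = 0, the terms
# X_k - mu_k are independent N(0, sigma_k^2) variables.
rng = np.random.default_rng(42)
k = np.arange(1, 10_001)
sigma = k ** -3.0
x = rng.normal(0.0, sigma)        # one sample path of X_k - mu_k

z = np.cumsum(x)                  # partial sums Z_n
abs_sum = np.cumsum(np.abs(x))    # absolute partial sums

# Absolute convergence: late partial sums barely move.
print(abs(z[-1] - z[99]), abs_sum[-1] - abs_sum[99])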