Convergence in probability for a sequence of random variables

convergence-divergence, covariance, probability

I'm currently stuck on the following problem which involves proving the convergence in probability for a sequence of independent random variables. Here's the full statement:

Let $X_1, X_2, X_3,\dots$ be a sequence of independent random variables such that $\mathbb E[X_i]=\mu$ for all $i$, and suppose there exists a constant $L$ with $0\lt L\lt\infty$ such that $\mathbb V[X_i]\lt L$ for all $i$. Show that $n^{-1}\sum\limits_{k=1}^nX_kX_{k+1} \to \mu^2$ in probability as $n\to\infty$.

My attempt: Let $Y_n={n^{-1}\sum\limits_{k=1}^nX_kX_{k+1}}$. By independence, $\mathbb E[Y_n]=n^{-1}\sum\limits_{k=1}^n\mathbb E[X_k]\,\mathbb E[X_{k+1}]=\mu^2$, so Chebyshev's inequality gives

$\mathbb P \left( \left | {Y_n-\mu^2 } \right | \ge \epsilon \right) \le \frac{\mathbb V[Y_n]}{\epsilon^2}$

$\mathbb V[Y_n]= \mathbb V\!\left[{n^{-1}\sum\limits_{k=1}^nX_kX_{k+1}}\right]=n^{-2}\left\{\sum\limits_{k=1}^n\mathbb V[X_kX_{k+1}]+2\sum\limits_{1\le i\lt j\le n}\mathrm{Cov}(X_iX_{i+1}, X_jX_{j+1})\right\}$

I don't know how to proceed any further than that. I appreciate any help. Thank you in advance.

Best Answer

$X_iX_{i+1}$ and $X_jX_{j+1}$ are independent whenever $|i-j|>1$, so their covariance vanishes in that case. Only the $O(n)$ terms with $j=i+1$ survive in the covariance sum, and each of them is uniformly bounded: $\mathrm{Cov}(X_iX_{i+1}, X_{i+1}X_{i+2})=\mu^2\mathbb V[X_{i+1}]\lt\mu^2 L$. Dividing by $n^2$ therefore makes the covariance contribution $O(n^{-1})$. The variance sum is handled the same way: by independence, $\mathbb V[X_kX_{k+1}]=\mathbb E[X_k^2]\,\mathbb E[X_{k+1}^2]-\mu^4\le(L+\mu^2)^2-\mu^4$, since $\mathbb E[X_k^2]=\mathbb V[X_k]+\mu^2\lt L+\mu^2$. Hence $\mathbb V[Y_n]=O(n^{-1})\to0$, and Chebyshev's inequality finishes the proof.
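As a quick sanity check of the $O(n^{-1})$ variance decay, here is a minimal simulation sketch. The choice of distribution (i.i.d. $\mathrm{Uniform}(0,2)$, so $\mu=1$ and $\mu^2=1$) and the helper name `y_n` are my own assumptions for illustration; the theorem itself needs only independence and a uniform variance bound.

```python
import numpy as np

def y_n(n, rng):
    # Illustrative choice: X_i ~ Uniform(0, 2), so mu = 1 and mu^2 = 1.
    # Draw n + 1 variables so X_{n+1} is available for the last product.
    x = rng.uniform(0.0, 2.0, size=n + 1)
    # (1/n) * sum_{k=1}^n X_k X_{k+1}
    return np.mean(x[:-1] * x[1:])

rng = np.random.default_rng(0)
for n in (100, 10_000, 1_000_000):
    # The printed values should cluster ever more tightly around mu^2 = 1.
    print(n, y_n(n, rng))
```

Since consecutive products overlap in one factor, the samples being averaged are 1-dependent rather than independent, but as the answer notes this only adds $O(n)$ bounded covariance terms, so the empirical average still tightens around $\mu^2$ at the usual $n^{-1/2}$ scale.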