Substituting in the double-sum indices of the covariance formula

covariance, notation, probability, random variables, summation

This question might sound very trivial, but I do have a problem with double sums

Let the $X_n$ be i.i.d. Bernoulli($p$) random variables, and let $Y_n := X_nX_{n+1}$

Let $S_n := \sum^n_{i=1}{X_i}$ and $V_n:=\sum^n_{j=1}{Y_j}$

We are looking for $Cov(S_n,V_n)$

What I thought of was:

$Cov(S_n,V_n) = Cov(X_1 + \dots + X_n, Y_1 + \dots + Y_n)$

Because I have trouble seeing what such a sum means when it is written compactly, I first write it out to see what it looks like.

By bilinearity, this is equal to the sum of the following terms:

$Cov(X_1 , Y_1) + Cov(X_1 , Y_2) + Cov(X_1 , Y_3) + \dots + Cov(X_1 , Y_n) $

$Cov(X_2 , Y_1) + Cov(X_2 , Y_2) + Cov(X_2 , Y_3) + \dots + Cov(X_2 , Y_n)$

$Cov(X_3 , Y_1) + Cov(X_3 , Y_2) + Cov(X_3 , Y_3) + \dots + Cov(X_3 , Y_n)$

$\vdots$

$Cov(X_n , Y_1) + Cov(X_n , Y_2) + Cov(X_n , Y_3) + \dots + Cov(X_n , Y_n)$

In this case, the $X_i$'s are independent,

so $X_i$ and $Y_j = X_{j}X_{j+1}$ are independent as soon as $X_i$ is independent of both $X_j$ and $X_{j+1}$, i.e. as soon as $j \neq i$ and $j+1 \neq i$. The only possibly nonzero terms are therefore the diagonal terms ($j = i$) and the terms of the first diagonal below it ($j = i-1$),

This would give

$$Cov(S_n,V_n) = \sum^n_{i=1} Cov(X_i,Y_i) + \sum^n_{i=2} Cov(X_{i},Y_{i-1})$$
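A quick numerical check also shows this pattern. The sketch below (Python/NumPy; the values $p = 0.3$, $n = 5$ and the sample size are arbitrary choices for illustration, not part of the problem) estimates every $Cov(X_i, Y_j)$ by simulation; only the entries with $j = i$ or $j = i-1$ come out noticeably nonzero, both close to $p^2(1-p)$.

```python
import numpy as np

# Arbitrary illustration values (not from the problem statement).
rng = np.random.default_rng(0)
p, n, m = 0.3, 5, 200_000

X = rng.binomial(1, p, size=(m, n + 1))   # columns are X_1, ..., X_{n+1}
Y = X[:, :-1] * X[:, 1:]                  # columns are Y_j = X_j * X_{j+1}, j = 1..n

Xc = X[:, :n] - X[:, :n].mean(axis=0)     # centre X_1, ..., X_n
Yc = Y - Y.mean(axis=0)                   # centre Y_1, ..., Y_n
cov = Xc.T @ Yc / m                       # cov[i-1, j-1] estimates Cov(X_i, Y_j)

print(np.round(cov, 3))
# Only the main diagonal (j = i) and the first diagonal below it (j = i - 1)
# are clearly nonzero, each close to p^2 * (1 - p) = 0.063.
```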

Now I would like to obtain these terms directly from the general double-sum expression of the covariance, whose notation is

(by bilinearity) $\sum_{1 \leq i,j \leq n} Cov(X_i,Y_j)$

and, if I understood correctly, this means the same thing as $\sum^n_{i=1} \sum^n_{j=1} Cov(X_i,Y_j)$.

My question is: how do we pick out these nonzero terms directly from the double sum $\sum^n_{i=1} \sum^n_{j=1} Cov(X_i,Y_j)$, by working with the indices themselves rather than by writing out the whole array as above?

Best Answer

Bilinearity allows you to say

$$Cov(S_n,V_n) = Cov\left(\sum\limits_i X_i, \sum\limits_j X_jX_{j+1}\right) = \sum\limits_i \sum\limits_j Cov(X_i, X_jX_{j+1})$$

If you drop the terms you call independent, those with $i\not= j$ and $i \not=j+1$, whose covariance is $0$, then this reduces to

$$Cov(S_n,V_n)= \sum\limits_i \left( Cov(X_i, X_i X_{i+1}) +Cov(X_i, X_{i-1} X_{i})\right)$$

with a slight adjustment at $i=1$, where there is no $Y_0 = X_0X_1$ term (the term $Cov(X_n, X_nX_{n+1})$ does appear, since $V_n$ includes $Y_n$): overall you are adding up $2n-1$ covariance terms.

With your i.i.d. Bernoulli($p$) $X_i$, each surviving term equals $p^2-p^3$: since $X_i^2 = X_i$, we get $Cov(X_i, X_i X_{i+1}) = E[X_iX_{i+1}] - E[X_i]\,E[X_iX_{i+1}] = p^2 - p^3$, and the same computation gives $Cov(X_i, X_{i-1} X_{i}) = p^2-p^3$. So overall $$Cov(S_n,V_n)= (2n-1)\,p^2(1-p)$$
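As a sanity check, this closed form can be compared against a simulation (again only a sketch; $p = 0.3$, $n = 5$ and the number of replications are arbitrary choices):

```python
import numpy as np

# Arbitrary illustration values (not from the problem statement).
rng = np.random.default_rng(1)
p, n, m = 0.3, 5, 500_000

X = rng.binomial(1, p, size=(m, n + 1))   # X_1, ..., X_{n+1}
S = X[:, :n].sum(axis=1)                  # S_n = X_1 + ... + X_n
V = (X[:, :-1] * X[:, 1:]).sum(axis=1)    # V_n = Y_1 + ... + Y_n with Y_j = X_j * X_{j+1}

print(np.cov(S, V)[0, 1])                 # simulated Cov(S_n, V_n)
print((2 * n - 1) * p**2 * (1 - p))       # closed form: (2n - 1) p^2 (1 - p)
```

With these values the closed form gives $(2\cdot 5-1)\cdot 0.3^2\cdot 0.7 = 0.567$, and the simulated estimate should land very close to it.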