[Math] Autocovariance function in $AR(1)$ process

self-learning, statistics, stochastic-processes, time-series

Let an autoregressive process $AR(1)$ be given by
$$x_t=\sum_{j=0}^\infty \phi^jw_{t-j}$$
where $|\phi|<1$ and the $w_t$ are i.i.d. white noise with mean $0$ and variance $\sigma^2$.

The autocovariance function is
$$\gamma(h)=cov(x_{t+h},x_t)=E\Big[\Big(\sum_{j=0}^\infty \phi^jw_{t+h-j}\Big)\Big(\sum_{k=0}^\infty \phi^k w_{t-k}\Big)\Big]$$
$$=E\big[(w_{t+h}+\dots+\phi^hw_t+\phi^{h+1}w_{t-1}+\dots)(w_t+\phi w_{t-1}+\phi^2 w_{t-2}+\dots)\big]\qquad\textbf{(1)}$$
$$=\sigma^2\sum_{j=0}^\infty \phi^{h+j}\phi^j=\sigma^2\phi^h\sum_{j=0}^\infty \phi^{2j}=\frac{\sigma^2\phi^h}{1-\phi^2},\qquad h\geq 0$$

I know that the last equality is the geometric series representation of that sum, but I'm having a hard time understanding how they get
$$\sigma^2\sum_{j=0}^\infty \phi^{h+j}\phi^j$$

from $\textbf{(1)}$. I know that they are using $$E[w_jw_k]=0 \text{ for } j\neq k \qquad\text{and}\qquad E[w_t^2]=\operatorname{Var}(w_t)=\sigma^2$$
when taking the expectation, but I don't understand how to get the exponents of $\phi$.

Can anyone help me?

Best Answer

Because $w_i$ is a white noise, $\operatorname{cov}[w_i,w_j]=0$ when $i\ne j$.

It follows that $E[w_iw_j]=0$ when $i\ne j$.

When you expand formula (1), all the cross terms $w_iw_j$ with $i\ne j$ therefore vanish in expectation.

The terms you are left with all have an offset of $h$ between the exponents of $\phi$, i.e. $$(1)=E[(\phi^hw_t)(w_t)+(\phi^{h+1}w_{t-1})(\phi w_{t-1})+\dots]=\sigma^2(\phi^h\cdot 1+\phi^{h+1}\cdot\phi+\dots)=\sigma^2\sum_{j=0}^\infty \phi^{h+j}\phi^j.$$
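As a quick sanity check (not part of the original derivation), here is a small numpy simulation sketch: it generates a long AR(1) path via the recursion $x_t=\phi x_{t-1}+w_t$ (equivalent to the $MA(\infty)$ form above for $|\phi|<1$) and compares the sample autocovariance at a few lags against $\sigma^2\phi^h/(1-\phi^2)$. The parameter values $\phi=0.6$, $\sigma=1$ are arbitrary choices for illustration.

```python
import numpy as np

# Numerical check of gamma(h) = sigma^2 * phi^h / (1 - phi^2) for an AR(1) process.
rng = np.random.default_rng(0)
phi, sigma = 0.6, 1.0   # illustrative values; any |phi| < 1 works
n = 200_000

# Simulate x_t = phi * x_{t-1} + w_t with i.i.d. N(0, sigma^2) noise.
w = rng.normal(0.0, sigma, size=n)
x = np.empty(n)
x[0] = w[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + w[t]

for h in range(4):
    # Sample autocovariance at lag h (the process has mean 0).
    empirical = np.mean(x[h:] * x[: n - h])
    theoretical = sigma**2 * phi**h / (1 - phi**2)
    print(f"h={h}: empirical={empirical:.4f}, theoretical={theoretical:.4f}")
```

With a path this long the empirical and theoretical values should agree to roughly two decimal places, and you can see the geometric decay $\gamma(h)=\phi\,\gamma(h-1)$ directly in the printed column.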