Solved – Writing AR(1) as a MA($\infty$) process

autoregressive, mathematical-statistics, moving-average, time-series

The AR(1) process is

$$
X_t = \phi X_{t-1} + \varepsilon_t
$$

If we apply this formula recursively, we get
$$
X_t = \phi(\phi X_{t-2} + \varepsilon_{t-1}) + \varepsilon_t = \phi^2X_{t-2} + \phi\varepsilon_{t-1} + \varepsilon_t = \cdots = \phi^k X_{t-k} + \sum_{j=0}^{k-1} \phi^j\varepsilon_{t-j}
$$

If we let $k\to\infty$, we get
$$
X_t = \lim_{k\to\infty}\Big(\phi^k X_{t-k} + \sum_{j=0}^{k-1} \phi^j\varepsilon_{t-j}\Big) = \lim_{k\to\infty}\big(\phi^k X_{t-k}\big) + \sum_{j=0}^\infty \phi^j\varepsilon_{t-j}
$$
The duality between AR(1) and MA($\infty$) states that there is an equivalence between the two, and that we can write $X_t$ as

$$
X_t = \sum_{j=0}^\infty \phi^j\varepsilon_{t-j}
$$
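The equivalence is easy to check numerically. The sketch below (my own illustration, not from the post) simulates an AR(1) recursion and compares $X_t$ to a truncated MA sum $\sum_{j=0}^{k}\phi^j\varepsilon_{t-j}$; the discrepancy is the remainder term $\phi^{k+1}X_{t-k-1}$, which is tiny for moderate $k$ when $|\phi|<1$:

```python
import numpy as np

rng = np.random.default_rng(0)
phi = 0.7
n = 10_000
eps = rng.standard_normal(n)

# Simulate the AR(1) recursion X_t = phi * X_{t-1} + eps_t, starting from X_0 = 0
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

# Truncated MA representation: X_t ~ sum_{j=0}^{k} phi^j * eps_{t-j}
k = 50
t = n - 1
ma_approx = sum(phi**j * eps[t - j] for j in range(k + 1))

# The gap equals phi^(k+1) * X_{t-k-1}, of order 0.7^51 here
print(abs(x[t] - ma_approx))
```

With $\phi=0.7$ and $k=50$ the truncation error is of order $0.7^{51}\approx 10^{-8}$, so the two representations agree to many decimal places.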

The difference between the two results is the term $\lim_{k\to\infty}(\phi^k X_{t-k})$, which should be zero, but how do I show this?

Assuming $|\phi| < 1$, we have $\lim_{k\to\infty}\phi^k = 0$ of course, but I don't see why $X_{t-k}$ should stay bounded as $k\to\infty$. Does convergence assume the law of large numbers, or is there another way to show the equivalence?


I know there is a proof which inverts the lag operator $1-\phi B$, but I didn't find any justification for why the operator can even be inverted, so I wanted an alternative proof, like the one above.
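For reference, the lag-operator proof reduces to the same limit argument. A sketch (standard geometric-series reasoning, not from the post): multiplying out a finite geometric sum gives

$$
(1-\phi B)\sum_{j=0}^{k}\phi^j B^j = 1 - \phi^{k+1}B^{k+1},
$$

so applying both sides to $X_t$ yields

$$
\sum_{j=0}^{k}\phi^j B^j (1-\phi B)X_t = X_t - \phi^{k+1}X_{t-k-1}.
$$

The inverse $(1-\phi B)^{-1} = \sum_{j=0}^{\infty}\phi^j B^j$ is therefore justified exactly when the remainder $\phi^{k+1}X_{t-k-1}$ vanishes as $k\to\infty$, in whatever mode of convergence one adopts; for $|\phi|<1$ and a stationary $X_t$ this holds in mean square, as in the answer below.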

Best Answer

The usual sense in which convergence is understood in this case is in mean square:

$$
E\Big[X_t-\big(\varepsilon_t+\phi\varepsilon_{t-1}+\phi^2\varepsilon_{t-2} +\cdots+\phi^j\varepsilon_{t-j}\big)\Big]^2=\phi^{2(j+1)}\, E\big[X_{t-j-1}^2\big]
$$

If $X_t$ is stationary with mean $\mu$ and variance $\gamma_0$, then

$$
E\big[X_{t-j-1}^2\big]=\gamma_0+\mu^2,
$$

a finite constant that does not depend on $j$. Hence, for $|\phi|<1$,

$$
\lim_{j\to\infty}E\Big[X_t-\big(\varepsilon_t+\phi\varepsilon_{t-1}+\phi^2\varepsilon_{t-2} +\cdots+\phi^j\varepsilon_{t-j}\big)\Big]^2=0
$$
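The mean-square rate can also be checked by Monte Carlo. The sketch below (my own check, under the assumptions $\mu=0$, Gaussian noise, so $\gamma_0=\sigma^2/(1-\phi^2)$) draws $X_{t-j-1}$ from the stationary distribution, iterates the recursion forward to $X_t$, and compares the empirical mean-square truncation error to the theoretical $\phi^{2(j+1)}\gamma_0$:

```python
import numpy as np

rng = np.random.default_rng(1)
phi, sigma = 0.8, 1.0
n_paths = 100_000

# Stationary variance of a zero-mean AR(1): gamma_0 = sigma^2 / (1 - phi^2)
gamma0 = sigma**2 / (1 - phi**2)

for j in (2, 5, 10):
    # X_{t-j-1} drawn from the stationary distribution
    x = rng.normal(0, np.sqrt(gamma0), n_paths)
    # Innovations eps_{t-j}, ..., eps_t (rows ordered oldest to newest)
    eps = rng.normal(0, sigma, (j + 1, n_paths))
    # Iterate X forward j+1 steps to obtain X_t
    for e in eps:
        x = phi * x + e
    # Truncated MA sum: eps_t + phi*eps_{t-1} + ... + phi^j * eps_{t-j}
    ma_trunc = sum(phi**i * eps[j - i] for i in range(j + 1))
    mse = np.mean((x - ma_trunc) ** 2)
    print(j, mse, phi ** (2 * (j + 1)) * gamma0)  # empirical vs theoretical
```

The two columns agree to within sampling error, confirming that the mean-square error decays geometrically in $j$.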
