Solved – Proving stationarity with difference equations

self-study, time series

This is exercise 2.9.b from Time Series Analysis – With Applications in R (Cryer, Chan). First of all, for natural $m\ge 1$, let us define the $m$-th difference of a discrete time series $\{A_t\}_{t\in\mathbb Z}$ recursively by
$$\nabla^{m+1}A_t\triangleq\nabla(\nabla^m A_t)\quad,\quad\nabla A_t\triangleq A_t-A_{t-1}\quad.$$
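For instance, unwinding the recursion once gives the second difference
$$\nabla^2A_t=\nabla(A_t-A_{t-1})=A_t-2A_{t-1}+A_{t-2}\quad.$$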

Let $\{X_t\}_{t\in\mathbb Z}$ be a zero-mean stationary series with autocovariance $\gamma_k$ for every integer $k$, $\mu_t$ a polynomial in $t$ of degree $d$, and $Y_t\triangleq X_t + \mu_t$. Show that, for natural $m$, the $m$-th difference of $\{Y_t\}_{t\in\mathbb Z}$ is stationary if and only if $m\ge d$.

I realize that
$$Z_t\triangleq\nabla^mY_t=\sum_{j=0}^m(-1)^j\binom m jY_{t-j}\quad,$$
and that this would be trivial if we were dealing with the continuous derivative, because the derivatives of a polynomial vanish once the order of differentiation exceeds its degree. Nevertheless, I get lost manipulating $E[Z_t]$ and $Cov[Z_t,Z_{t+k}]$. How do I prove that neither of these depends on $t$?
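As a sanity check (not a proof), I also tried this numerically. The sketch below is my own illustration, with an arbitrary cubic trend and AR(1) noise standing in for $\mu_t$ and $X_t$: the drift in the mean of $\nabla^mY_t$ only disappears once $m\ge d$.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(500)

# Stationary part X_t: an AR(1) series (an arbitrary choice of stationary process).
x = np.zeros(len(t))
for i in range(1, len(t)):
    x[i] = 0.5 * x[i - 1] + rng.normal()

mu = 0.002 * t**3 - 0.1 * t**2 + t   # polynomial trend of degree d = 3
y = x + mu

for m in (2, 3, 4):
    z = np.diff(y, n=m)               # m-th difference of Y_t
    half = len(z) // 2
    # For m >= d the mean over the two halves is roughly the same;
    # for m < d it still drifts with t.
    print(m, z[:half].mean(), z[half:].mean())
```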

Best Answer

You don't need to worry about expectations and covariances, since the difference operator kills the polynomial term. In other words, the result turns on what the difference operator does to polynomials. Once you have taken $d$ differences, you are only left with the stochastic part of your model.

So let's prove this. Since the difference operator is linear, you only have to prove the result for $t^d$.
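Concretely, writing $\mu_t=\sum_{j=0}^d c_jt^j$, linearity gives
$$\nabla^m\mu_t=\sum_{j=0}^d c_j\,\nabla^m t^j\quad,$$
so it is enough to track each power of $t$ separately, and the leading power $t^d$ is the one that decides when the trend disappears.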

I would go for a proof by induction on the degree. For $d=1$ the first difference kills the linear term, since you get $(at+b)-\bigl(a(t-1)+b\bigr)=a$: a single difference already gives you a stationary time series, just with a non-zero mean.

For a higher degree, say $d$, the first difference of the leading term gives $t^d-(t-1)^d=d\,t^{d-1}+\text{a polynomial of degree at most}\ d-2$, i.e. a polynomial of degree $d-1$.

See what happens when we take the second difference. We have $d\,t^{d-1}$ (plus lower-order junk) against $d\,(t-1)^{d-1}$ (plus lower-order junk) coming from the first difference taken one step earlier. The induction hypothesis applies here, because the two leading terms have the same coefficient, so they cancel and the degree drops again. Note that we are using the assumption that the time series is sampled at equally spaced moments; otherwise the first difference taken one step earlier would have a different leading term. We usually make this assumption for time series, and you can see here that it matters.
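If you want to watch the degree drop by one with each difference, here is a small symbolic check (a sketch using sympy; the degree $d=4$ is just an example):

```python
import sympy as sp

t = sp.symbols('t')
d = 4
p = t**d

# Apply the first-difference operator repeatedly: the degree drops by one
# each time, reaching a constant after d differences and zero after d + 1.
for m in range(1, d + 2):
    p = sp.expand(p - p.subs(t, t - 1))
    degree = sp.Poly(p, t).degree() if p != 0 else "zero"
    print(f"after {m} difference(s): degree {degree}, {p}")
```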

You can argue inductively for both the sufficient and the necessary condition: each difference lowers the degree of the polynomial mean by exactly one, so after $m\ge d$ differences the mean is constant, while for $m<d$ the mean of $\nabla^mY_t$ is still a non-constant polynomial of degree $d-m$ and therefore depends on $t$.

You still have to deal with $X_t$, the stationary part of the expression. You need to show that the difference of a stationary time series is also stationary, but that's a short computation, and you only need to do it for the first difference, since higher differences are repeated first differences.
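For completeness, here is that last step written out, with $\gamma_k$ the autocovariance of $X_t$ as in the question. The first difference $\nabla X_t=X_t-X_{t-1}$ satisfies
$$E[\nabla X_t]=0\quad,\quad Cov[\nabla X_t,\nabla X_{t+k}]=2\gamma_k-\gamma_{k+1}-\gamma_{k-1}\quad,$$
neither of which depends on $t$, and the same computation iterates for higher differences.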