Linear Process Close to Gaussian Process – Probability Analysis

metric-spaces, pr.probability, stochastic-processes

A linear process $(X_t)_{t \in \mathbb{Z}}$ is usually written as a moving-average process of infinite order:
\begin{equation}\label{linear_process}\tag{Eq. 1.1}
X_{t} = \sum_{j=0}^\infty \psi_{j} \varepsilon_{t-j}, \qquad t \in \mathbb{Z},
\end{equation}

where $(\varepsilon_{t})$ is an i.i.d. white-noise sequence ($E(\varepsilon_{t})=0$, $E|\varepsilon_{t}|^2<\infty$) and $\sum_{j=0}^\infty \psi_{j}^2 < \infty$.
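As a concrete illustration of \eqref{linear_process} (an added sketch, not part of the original question), the following Python snippet simulates a truncated linear process; the truncation order, the coefficient choice $\psi_j = 0.5^j$, and the helper name `linear_process` are all assumptions made for the example.

```python
import numpy as np

def linear_process(n, psi, rng):
    """Simulate n values of X_t = sum_{j>=0} psi_j * eps_{t-j},
    truncating the infinite sum at q = len(psi) terms."""
    q = len(psi)
    eps = rng.standard_normal(n + q - 1)   # i.i.d. white noise, E eps = 0, E eps^2 = 1
    # 'valid' convolution: each X_t uses exactly q consecutive noise values
    return np.convolve(eps, psi, mode="valid")

rng = np.random.default_rng(0)
psi = 0.5 ** np.arange(50)                 # geometrically decaying coefficients (illustrative)
x = linear_process(10_000, psi, rng)
print(x.var(), (psi ** 2).sum())           # sample variance vs. B^2 = sum_j psi_j^2
```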

In this paper, on page 12130, the author says:

Mallows (12) argues that a linear process such as in (\ref{linear_process}) is close to a Gaussian process if $\max_{j\geq 0}|\psi_j|$ is small.

I would like to know if there are any relatively simple examples of this statement. I tried to come up with one myself, but I couldn't. I would rather not go to the Mallows paper before working through it here.

Best Answer

$\newcommand{\R}{\mathbb R}\newcommand{\Z}{\mathbb Z}\newcommand{\ep}{\varepsilon}\newcommand{\de}{\delta}$Let $\psi_j:=0$ for $j=-1,-2,\dots$, and assume without loss of generality that $E\ep_0^2=1$. Then \begin{equation*} X_t=\sum_{j\in\Z}X_{t,j} \end{equation*} for $t\in\Z$, where \begin{equation*} X_{t,j}:=\psi_{t-j}\ep_j. \end{equation*} Let \begin{equation*} B:=\sqrt{\sum_{j\in\Z} \psi_j^2},\quad m:=\max_{j\ge0}|\psi_j|=\max_{j\in\Z}|\psi_j|, \end{equation*} so that $EX_t^2=B^2$. Suppose that $B>0$ and $m$ vary in any manner such that \begin{equation*} m/B\to0. \tag{1}\label{1} \end{equation*} Let us show that then $X_t/B$ converges in distribution to a standard normal random variable, for each $t\in\Z$.


For each real $\de>0$, \begin{equation*} \begin{aligned} L&:=\frac1{B^2}\sum_{j\in\Z}EX_{t,j}^2\,1(|X_{t,j}|\ge\de B) \\ &=\frac1{B^2}\sum_{j\in\Z}E(\psi_{t-j}\ep_j)^2\,1(|\psi_{t-j}\ep_j|\ge\de B) \\ &\le\frac1{B^2}\sum_{j\in\Z}\psi_{t-j}^2\,E\ep_j^2\,1(|\ep_j|\ge\de B/m) \\ &=\frac1{B^2}\sum_{j\in\Z}\psi_{t-j}^2\,E\ep_0^2\,1(|\ep_0|\ge\de B/m) \\ &=E\ep_0^2\,1(|\ep_0|\ge\de B/m)\to0, \end{aligned} \end{equation*} by dominated convergence, since $\de B/m\to\infty$ by \eqref{1} and $E\ep_0^2<\infty$. Hence, \begin{equation*} \frac1{B^2}\sum_{j\in\Z}EX_{t,j}^2\,1(|X_{t,j}|<\de B)=1-L\to1. \end{equation*}

Also, since $EX_{t,j}=0$, \begin{equation*} \begin{aligned} &\frac1{B^2}\sum_{j\in\Z}(EX_{t,j}\,1(|X_{t,j}|<\de B))^2 \\ &=\frac1{B^2}\sum_{j\in\Z}(EX_{t,j}\,1(|X_{t,j}|\ge\de B))^2 \le L\to0, \end{aligned} \end{equation*} \begin{equation*} \begin{aligned} &\Big|\frac1B\sum_{j\in\Z}EX_{t,j}\,1(|X_{t,j}|<\de B)\Big| \\ &=\Big|\frac1B\sum_{j\in\Z}EX_{t,j}\,1(|X_{t,j}|\ge\de B)\Big| \\ &\le\frac1B\sum_{j\in\Z}E|X_{t,j}|\,1(|X_{t,j}|\ge\de B) \le \frac L\de\to0, \end{aligned} \end{equation*} and, by Chebyshev's inequality, \begin{equation*} \sum_{j\in\Z}P(|X_{t,j}|\ge\de B)\le \frac L{\de^2}\to0. \end{equation*} So, by Theorem 18 in Chapter IV (a standard CLT criterion for triangular arrays of independent summands), $X_t/B$ converges in distribution to a standard normal random variable, for each $t\in\Z$.
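For a quick numerical sanity check of this conclusion (an added sketch: the equal-weights design, the centered-exponential noise, and the use of scipy's Kolmogorov-Smirnov test are all choices made for illustration), one can draw many independent copies of $X_t/B$ from markedly non-Gaussian noise:

```python
import numpy as np
from scipy import stats   # scipy assumed available

rng = np.random.default_rng(1)

n_lags, n_copies = 400, 5_000
psi = np.full(n_lags, 1.0 / np.sqrt(n_lags))   # m = 1/sqrt(n_lags), B = 1, so m/B is small
B = np.sqrt((psi ** 2).sum())

# Markedly non-Gaussian i.i.d. noise: centered exponential, E eps = 0, E eps^2 = 1
eps = rng.exponential(1.0, size=(n_copies, n_lags)) - 1.0
x = (eps * psi).sum(axis=1) / B                # independent copies of X_t / B

print(stats.kstest(x, "norm"))                 # normality should not be rejected
```

Here $m/B = 1/\sqrt{400} = 0.05$, and the Kolmogorov-Smirnov test should return a large $p$-value, consistent with \eqref{1}.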

Thus, under condition \eqref{1}, all the one-dimensional distributions of the process $(X_t)$ are asymptotically normal.
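Perhaps the simplest concrete instance (an illustration added here, not from the original answer) is the equal-weights moving average of order $n$ used in the numerical check above: $\psi_j=1/\sqrt n$ for $0\le j<n$ and $\psi_j=0$ otherwise. Then $B=1$ and $m=1/\sqrt n$, so condition \eqref{1} holds as $n\to\infty$, and \begin{equation*} \frac{X_t}B=\frac1{\sqrt n}\sum_{j=0}^{n-1}\ep_{t-j} \end{equation*} converges in distribution to $N(0,1)$ by the classical CLT for i.i.d. summands, in agreement with the general criterion.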


All the finite-dimensional distributions of the process $(X_t)$ -- that is, all the joint distributions of $(X_{t_1},\dots,X_{t_p})$ for integers $t_1<\cdots<t_p$ -- can be treated similarly. This is done by writing \begin{equation*} \sum_{i=1}^p c_i X_{t_i}=\sum_{j\in\Z}Y_j \end{equation*} for any real $c_1,\dots,c_p$, where \begin{equation*} Y_j:=\phi_j\ep_j,\quad\phi_j:=\sum_{i=1}^p c_i \psi_{t_i-j}, \end{equation*} so that $\sum_{j\in\Z}\phi_j^2<\infty$ and $\max_{j\in\Z}|\phi_j|\le m\sum_{i=1}^p |c_i|$.
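To spell out where this leads (a sketch added here, under the same normalization $E\ep_0^2=1$): since the $Y_j$ are independent with $EY_j=0$, \begin{equation*} \mathrm{Var}\Big(\sum_{i=1}^p c_i X_{t_i}\Big)=\sum_{j\in\Z}\phi_j^2 =\sum_{i,k=1}^p c_i c_k\,\gamma(t_i-t_k), \quad\text{where }\gamma(h):=\sum_{j\ge0}\psi_j\psi_{j+|h|} \end{equation*} is the autocovariance function of $(X_t)$. Whenever $\sum_{j\in\Z}\phi_j^2/B^2$ stays bounded away from $0$, the bound $\max_{j\in\Z}|\phi_j|\le m\sum_{i=1}^p|c_i|$ together with \eqref{1} gives $\max_{j\in\Z}|\phi_j|\big/\sqrt{\sum_{j\in\Z}\phi_j^2}\to0$, so the Lindeberg-type argument above applies to $\sum_{j\in\Z}Y_j$ as well, and the Cramér-Wold device then identifies the limit of the finite-dimensional distributions of $(X_t/B)$ as Gaussian with covariances $\gamma(t_i-t_k)/B^2$.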