Convergence of Lévy Measures – Probability and Measure Theory

levy-processes, limits-and-convergence, measure-theory, pr.probability, stochastic-processes

Consider the sequence of stochastic processes $(X_n, n \geq 1)$, where $X_n = (X_{t;n})_{t\in \mathbb Z}$ and:
\begin{equation}\label{I}\tag{SP}
X_{t;n} = \sum_{j=0}^\infty \theta_{jn} \varepsilon_{t-j;n}
\end{equation}

with $\sum_{j=0}^\infty \theta_{jn}^2 < \infty$ and $(\varepsilon_{t;n})_{t\in \mathbb Z} \overset{\text{iid}}{\sim} \mu_n(dx)$ with zero mean and unit variance $\sigma_n^2 =1$, for all $n$.
Suppose:
\begin{equation}\label{uan}\tag{Uan}
\max_{j \geq 0} |\theta_{jn}| \longrightarrow 0\quad (n \to \infty).
\end{equation}
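As a concrete illustration (my addition, not part of the question), here is a minimal numerical sketch of the triangular array in (\ref{I}), assuming standard normal innovations and the simple choice $\theta_{jn}=n^{-1/2}$ for $0\le j<n$ (zero otherwise); both choices are assumptions for the sketch, but they satisfy $\sum_j\theta_{jn}^2=1$ and (\ref{uan}):

```python
import numpy as np

def simulate_X(n, T, rng):
    """Simulate X_{t;n} = sum_j theta_{jn} * eps_{t-j;n} for t = 0, ..., T-1.

    Illustrative assumptions (not forced by the question):
      theta_{jn} = 1/sqrt(n) for 0 <= j < n, zero otherwise,
      eps_{t;n} iid standard normal (mean 0, variance 1).
    """
    theta = np.full(n, 1.0 / np.sqrt(n))       # sum of squares = 1
    eps = rng.standard_normal(T + n - 1)       # innovations eps_{t-j;n}
    # np.convolve computes sum_j theta[j] * eps[t - j] for each t
    return np.convolve(eps, theta, mode="valid")

rng = np.random.default_rng(0)
for n in (10, 100, 1000):
    X = simulate_X(n, T=200_000, rng=rng)
    # (Uan): max_j |theta_{jn}| = 1/sqrt(n) -> 0, while Var X_{t;n} stays 1
    print(n, 1.0 / np.sqrt(n), X.var())
```

With these choices $X_{t;n}$ is exactly $N(0,1)$ for every $n$, so the two conditions below hold trivially, with a Gaussian limit process.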

Note that each $X_n$ is strictly stationary. Suppose $(X_{t})_{t\in \mathbb Z}$ is another strictly stationary process satisfying the following two conditions:

  1. $E[X_{t;n}^2]\longrightarrow E[X_{t}^2]< \infty$, as $n \to \infty$, for all $t$;
  2. $(X_{t_1;n}, X_{t_2;n},\dotsc, X_{t_p;n}) \Longrightarrow (X_{t_1}, X_{t_2},\dotsc, X_{t_p})$ as $n \to \infty$ for all $t_1 <t_2<\cdots < t_p$ (weak convergence of the finite-dimensional distributions).

Now, fix $(t_1 <t_2<\cdots < t_p)$. We can show, starting from condition 2 and using (\ref{I}) and (\ref{uan}), that:
$$\sum_{j=0}^n \left(\theta_{jn} , \theta_{(j+ t_2 - t_1)n }, \dotsc, \theta_{(j+ t_p - t_1)n } \right) \varepsilon_{t_1-j;n} \Longrightarrow (X_{t_1}, X_{t_2},\dotsc, X_{t_p})\quad (n \to \infty).$$
For simplicity, write $X_{jn}:=\left(\theta_{jn} , \theta_{(j+ t_2 - t_1)n }, \dotsc, \theta_{(j+ t_p - t_1)n } \right) \varepsilon_{t_1-j;n}$ and $X:=(X_{t_1}, X_{t_2},\dotsc, X_{t_p})$ (note that $X_{jn}$ and $X$ depend on $t_1, t_2,\dotsc, t_p$). So, the last display means that
$$S_n := \sum_{j=0}^n X_{jn} \Longrightarrow X\quad (n \to \infty).$$
Finally, we can show that there exist a non-negative definite matrix $\Sigma$ and a Lévy measure $\nu$ such that the characteristic function of $X$ is:
$$\varphi_X(u)= \exp\left\{ -\frac{u' \Sigma u}{2} +\int_{\mathbb R^p} \left[e^{iu'x} - 1- i u'x \right] d\nu(x) \right\}.$$
We adopt the following notation:
\begin{equation}
X_{jn}\sim \nu_{jn}(dx), \,\, \nu_n(dx):= \sum_{j=0}^n\nu_{jn}(dx).
\end{equation}

Notice that $\nu_{jn}$ is a probability measure on $\mathbb R^p$ (the law of the random vector $X_{jn}$): it is built from $\mu_{n}(dx)$, the probability measure on the Borel sets of $\mathbb R$ governing the iid sequence $(\varepsilon_{t;n})_{t \in \mathbb Z}$, but it also depends on the vector $\left(\theta_{jn} , \theta_{(j+ t_2 - t_1)n }, \dotsc, \theta_{(j+ t_p - t_1)n } \right)\in \mathbb R^p$.

Moreover, the measure $\nu$ can be characterized as follows: let $\mathcal C_\#$ be the class of continuous and bounded functions vanishing on a neighborhood of $0$. Then:
\begin{equation}\label{M}\tag{M}
\int f(x) \, \nu_n(dx) \to \int f(x) \, \nu(dx),\quad \forall f \in \mathcal C_\# \quad (n \to \infty),
\end{equation}

or equivalently (see Barczy and Pap, "Portmanteau theorem for unbounded measures"):
$$\nu_n(E) \longrightarrow \nu(E) \quad (n \to \infty) \quad \text{for every } \nu\text{-continuity set } E \text{ with } 0 \notin \overline{E}.\label{MI}\tag{M'}$$

Question

Notice that $E[S_n]= \int_{\mathbb R^p} x \nu_n(dx) =0$, for all $n$. I want to show that:
$$\int_{\mathbb R^p} x \nu(dx) =0\label{q1}\tag{I}$$
Can we show (\ref{q1}) or can we give a counterexample?

Attempt

First, I can show that condition 1 and (\ref{uan}) imply:
\begin{equation}\label{ui}\tag{UI}
\int_{\mathbb R^p} |x|^2 \nu_{n}(dx) = \sum_{j=0}^n \int_{\mathbb R^p} |x|^2 \nu_{jn}(dx) \longrightarrow \int_{\mathbb R^p} |x|^2 \nu(dx)< \infty,\quad ( n\to \infty)
\end{equation}

It is worth noting that we can define
$$m_n(B):= \int_{B} |x|^2 \nu_n(dx)< \infty\quad\hbox{ and }\quad m(B):= \int_{B} |x|^2 \nu(dx)< \infty$$
for every Borel set $B$. Since $E[S_n]=0$, we have
$$\int_{\mathbb R^p} \frac{x}{|x|^2} m_n(dx)=0$$
and (\ref{q1}) is equivalent to:
$$\int_{\mathbb R^p} \frac{x}{|x|^2} m(dx)=0.$$
I would venture to say that this is true due to (\ref{ui}), using an argument similar to uniform integrability.

Sorry if I contextualized the issue too much, but I needed to rule out counterexamples like the one in this answer. In that same question, one can also find more technical details about the context given here.

Best Answer

First, we consider an example.

Let
\begin{gather*} f(x)=\frac{I_{\{x>0\}}(x)}{2x^2(1\vee x^2)} =\frac{I_{(0,1)}(x)}{2x^2} + \frac{I_{[1,\infty)}(x)}{2x^4},\\ \nu(\mathrm{d}x)=f(x)\,\mathrm{d}x, \end{gather*}
where $\nu$ is a $\sigma$-finite measure on $(\mathbb{R},\mathscr{B}(\mathbb{R}))$ satisfying
\begin{equation*} \int_0^\infty (1\wedge x^2)\,\nu(\mathrm{d}x)<\infty. \tag{1} \end{equation*}
Also, let
\begin{align*} \psi_f(u)&=\int_{0}^{\infty}(e^{iux}-1-iux)\,\nu(\mathrm{d}x)\\ &=\int_{0}^{\infty}(e^{iux}-1-iux)f(x)\,\mathrm{d}x,\\ \bar{\phi}_f(u)&=\exp(\psi_f(u))=\exp\Big[\int_{0}^{\infty} (e^{iux}-1-iux)f(x)\,\mathrm{d}x\Big]. \tag{2} \end{align*}
Then, by (1), $\bar{\phi}_f(u)$ is an (infinitely divisible) characteristic function and
\begin{equation*} \bar{\phi}_f^\prime(0)=\psi_f^\prime(0)=0, \qquad -\psi_f^{\prime\prime}(0)=\int_{0}^{\infty}x^2f(x)\,\mathrm{d}x=1. \tag{3} \end{equation*}
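As a sanity check (my addition, not part of the original answer), the integrability condition (1) and the normalization in (3) can be verified numerically with a crude trapezoidal rule on a geometric grid:

```python
import numpy as np

def f(x):
    # Density of the example Lévy measure: 1/(2x^2) on (0,1), 1/(2x^4) on [1, oo)
    return np.where(x < 1.0, 0.5 / x**2, 0.5 / x**4)

def trap(g, a, b, m=400_000):
    # Trapezoidal rule on a geometric grid (resolves the singularity near 0)
    x = np.geomspace(a, b, m)
    y = g(x)
    return float(np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0))

# (1): int (1 ^ x^2) f(x) dx is finite; exact value 1/2 + 1/6 = 2/3
# (tail truncated at 10^4, which costs a negligible amount)
print(trap(lambda x: np.minimum(1.0, x**2) * f(x), 1e-8, 1e4))
# (3): int x^2 f(x) dx = 1/2 + 1/2 = 1
print(trap(lambda x: x**2 * f(x), 1e-8, 1e4))
```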

Now we are ready to construct special $(\epsilon_{t;n})$ and $\theta_{jn}$. Let $(\epsilon_{t;n})_{t\in\mathbb{Z}}\stackrel{\mathrm{iid}}{\sim}\mu_n(dx)$, where the characteristic function of $\mu_n$ is
\begin{equation*} \phi_{\epsilon;n}(u)\stackrel{\text{def}}{=}\int_{\mathbb{R}}\exp(iux)\,\mu_n(dx) =\exp\Big[\frac1n \psi_f(\sqrt{n}\,u)\Big]; \end{equation*}
then
\begin{equation*} \mathsf{E}[\epsilon_{t;n}]=0, \qquad \mathsf{var}[\epsilon_{t;n}]=1,\qquad \forall n\ge 1. \tag{4} \end{equation*}
Meanwhile, let
\begin{equation*} \theta_{jn}=\frac{1}{\sqrt{n}}, \qquad 0\le j\le n-1,\quad \forall n\ge 1. \tag{5} \end{equation*}
Hence,
\begin{align*} X(t;n)&\stackrel{\text{def}}{=}X_{t;n}=\sum_{j=0}^{n-1}\theta_{jn}\epsilon_{t-j;n}=\frac{1}{\sqrt{n}} \sum_{j=0}^{n-1}\epsilon_{t-j;n},\qquad \forall n\ge 1,\\ \phi_{X(t;n)}(u)&\stackrel{\text{def}}{=}\mathsf{E}[\exp(iX(t;n)u)] =[\phi_{\epsilon;n}(u/\sqrt{n})]^n \\ &=\exp[\psi_f(u)]=\bar{\phi}_f(u), \qquad \forall n\ge 1.\tag{6} \end{align*}
By (4) and (5), condition (Uan) and conditions 1 and 2 hold. If $\phi_{X(t_1)}(u)$ denotes the characteristic function of the one-dimensional distribution of the limit process $X=\{X_t\}$, then, from (6),
\begin{equation*} \phi_{X(t_1)}(u)=\phi_{X(t;n)}(u)=\exp[\psi_f(u)]=\bar{\phi}_f(u). \end{equation*}

With the above example in hand, we can discuss your questions.

Question 1 In the above example,
\begin{equation*} \int_{\mathbb{R}}|x|\,\nu(\mathrm{d}x)=+\infty \end{equation*}
(indeed, $\int_0^1 x\cdot\frac{1}{2x^2}\,\mathrm{d}x=\int_0^1 \frac{\mathrm{d}x}{2x}$ diverges), so the integral in (\ref{q1}) is not even well defined.
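The logarithmic divergence can also be seen numerically (again a sketch of my own, reusing the example density): the truncated first moment behaves like $\tfrac12\log(1/\epsilon)+\tfrac14$ and blows up as the lower cutoff $\epsilon\downarrow 0$.

```python
import numpy as np

def f(x):
    # Density of the example Lévy measure: 1/(2x^2) on (0,1), 1/(2x^4) on [1, oo)
    return np.where(x < 1.0, 0.5 / x**2, 0.5 / x**4)

def first_moment(eps, m=400_000):
    # int_eps^{10^4} x f(x) dx via a trapezoidal rule on a geometric grid
    x = np.geomspace(eps, 1e4, m)
    y = x * f(x)
    return float(np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0))

# Exact value: (1/2) log(1/eps) + 1/4, up to a negligible tail correction,
# so the truncated first moment diverges as eps -> 0:
for eps in (1e-2, 1e-4, 1e-6, 1e-8):
    print(eps, first_moment(eps), 0.5 * np.log(1.0 / eps) + 0.25)
```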

Question 2 In fact, if
\begin{equation*} \int_{\mathbb{R}^p}|x|^2 \nu(\mathrm{d}x)<\infty, \tag{7} \end{equation*}
then $\bar{\phi}_f(u)$, defined by (2), may serve as $\phi_{X(t_1)}(u)$, the characteristic function of the one-dimensional distribution of the limit process $X=\{X_t\}$. Hence your question transfers to: ``if $\nu$ satisfies (7), under what conditions do we have $\nu(\mathbb{R}^p)<\infty$ or $\nu(\mathbb{R}^p)=\infty$?''