How to Show $\int x \, d\nu = 0$ Using Pseudo-Weak Convergence of Measures

levy-processes, limits-and-convergence, measure-theory, pr.probability

I have a sequence of $p$-dimensional infinitely divisible random vectors $S_n'$, such that $S_n' \Longrightarrow X$, as $n \to \infty$.

Suppose the following assumptions hold:

  1. The characteristic functions are given by:
    $$\varphi_{S_n'}(u)=\exp\left\{ \int_{\mathbb R^p} \left[e^{iu'x} - 1 - i u'x \right] \, d\nu_n \right\}, \quad \varphi_{X}(u) = \exp\left\{ \frac{- u'\sigma u}{2} + \int_{\mathbb R^p} \left[e^{iu'x} - 1 - iu'x \right] d\nu \right\} $$
    ($\nu_n$ and $\nu$ are Lévy measures)

  2. $E[S_n']=\int_{\mathbb R^p}x \,d\nu_n=0$ and
    \begin{equation}\label{0}\tag{0}
    \sup_n \int_{\mathbb R^p} |x|^2 \, d\nu_n(x) \leq C< \infty
    \end{equation}

  3. Let $\mathcal C_\#$ be the class of continuous and bounded functions vanishing on a neighborhood of $0$. Then:
    \begin{equation}\label{I}\tag{I}
    \int f \, d\nu_n \to \int f \, d\nu \quad (n \to \infty),\quad \forall f \in \mathcal C_\#
    \end{equation}

  4. For any $\epsilon>0$, define the symmetric nonnegative-definite matrix $\sigma_{n,\epsilon}$ by:
    \begin{equation}\label{II}\tag{II}
    \langle u, \sigma_{n,\epsilon}u \rangle := \int_{|x|\leq \epsilon} \langle u ,x\rangle^2 \, d\nu_n(x), \quad u \in \mathbb R^p
    \end{equation}

    Then:
    \begin{equation}\label{III}\tag{III}
    \lim_{\epsilon \downarrow 0} \limsup_{n \to \infty} \left| \langle u, \sigma_{n,\epsilon}u \rangle - \langle u, \sigma u \rangle \right|=0
    \end{equation}

    (Here $\sigma$ is the matrix appearing in the characteristic function $\varphi_X$ of hypothesis 1.)

Question:

Since $\int_{\mathbb R^p} x \, d\nu_n =0$ for all $n$, I suspect that

\begin{equation}\label{IV}\tag{IV}
\int_{\mathbb R^p} x \, d\nu = 0
\end{equation}

How to show this?

Remarks:

  • Note that (\ref{I}) is almost a weak convergence of measures. If I could show that (\ref{I}) holds for every continuous and bounded function, I would actually have $\nu_n \Longrightarrow \nu$ ($n \to \infty$). In this case, together with (\ref{0}), I could conclude (\ref{IV}) via a uniform integrability argument (sketched below, after this remark). Thus, defining $\mathcal C$ to be the class of continuous and bounded functions, I think my question boils down to showing that:
    \begin{equation}\label{V}\tag{V}
    \int f \, d\nu_n \to \int f \, d\nu \quad (n \to \infty), \quad \forall f \in \mathcal C
    \end{equation}

    I spent a few hours trying to prove this using hypotheses 1–4, but I couldn't. It may also be that there is another way of showing (\ref{IV}).
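
For reference, here is the uniform-integrability step I have in mind, written as a sketch under the (so far unproven) assumption that (\ref{V}) holds; the cutoff $\chi_R$ below is an auxiliary function introduced only for this computation. Pick a continuous $\chi_R:\mathbb R^p \to [0,1]$ with $\chi_R=1$ on $\{|x|\leq R\}$ and $\chi_R=0$ on $\{|x|\geq 2R\}$. Each component of $x \mapsto x\,\chi_R(x)$ is bounded and continuous, so (\ref{V}) would give $\int x\,\chi_R(x) \, d\nu_n \to \int x\,\chi_R(x) \, d\nu$, while by (\ref{0}),
$$\left| \int_{\mathbb R^p} x \, d\nu_n - \int_{\mathbb R^p} x\,\chi_R(x) \, d\nu_n \right| \leq \int_{|x|>R} |x| \, d\nu_n \leq \frac{1}{R}\int_{\mathbb R^p} |x|^2 \, d\nu_n \leq \frac{C}{R}.$$
The same tail bound would hold for $\nu$: applying (\ref{V}) to the bounded continuous functions $|x|^2 \wedge M$ together with (\ref{0}) gives $\int (|x|^2 \wedge M) \, d\nu \leq C$ for every $M$, hence $\int |x|^2 \, d\nu \leq C$ by monotone convergence. Letting $n \to \infty$ and then $R \to \infty$ would then yield $\int x \, d\nu = \lim_n \int x \, d\nu_n = 0$, i.e. (\ref{IV}).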

I appreciate any help.

Update

So far, Iosif Pinelis has given a counterexample. I apologize for the omission: I forgot to write down some assumptions related to $S_n'$. I will put the updates here.

The $S_n'$ arise as follows: let $(X_{jn})_{1\leq j \leq n}$, $X_{jn} \sim \mu_{jn}$, be a triangular array of $p$-dimensional random vectors (row independent) such that:

  • $E X_{jn}= \int_{\mathbb R^p} x \, d \mu_{jn}=0$
  • $\lim_{n \to \infty} \max_{1\leq j \leq n} P(|X_{jn}|> \epsilon)=0$, for all $\epsilon > 0$
  • Defining $S_n := \sum_{j=1}^n X_{jn}$, we have $var(S_n):=\sum_{j=1}^n \int_{\mathbb R^p} |x|^2 \, d\mu_{jn} \leq C < \infty$, for all $n \in \mathbb N$.
  • $S_n \Longrightarrow X$, as $n \to \infty$. (Here, $X$ is the same limit of the $S_n'$ given above)
  • $\nu_n(E):= \sum_{j=1}^n \int_E d\mu_{jn}$, for every Borel set $E$.

Defining first $Y_{jn}:= [X_{jn}]^{[1]}$ (this is the compound Poisson random variable, i.e. $Y_{jn} \sim CP(\mu_{jn}, 1)$), we define:
$$S_n' := \sum_{j=1}^n Y_{jn}$$
It is easy to show that $E[S_n']=E[S_n]=0$ and $var[S_n']=var[S_n]$ (the computation is spelled out below). By an accompanying-laws argument (Section 3.7 of Varadhan's lecture notes), we have that
$$S_n' \Longrightarrow X, \quad (n \to \infty)$$
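
To spell out the moment claim above (a standard compound Poisson computation, included here for completeness): if $Y \sim CP(\mu,1)$ with $\int |x|^2 \, d\mu < \infty$, then
$$\varphi_{Y}(u)=\exp\left\{ \int_{\mathbb R^p} \left[e^{iu'x} - 1 \right] d\mu \right\}, \qquad E[Y]=\int_{\mathbb R^p} x \, d\mu, \qquad E\left[|Y-E Y|^2\right]=\int_{\mathbb R^p} |x|^2 \, d\mu .$$
Applying this to $Y_{jn} \sim CP(\mu_{jn},1)$, using $\int x \, d\mu_{jn}=0$ and row independence, gives $E[S_n']=E[S_n]=0$ and $var[S_n']=var[S_n]$. It also recovers the first formula in assumption 1: since $\int x \, d\nu_n = \sum_j \int x \, d\mu_{jn} = 0$, the compensator term can be inserted for free,
$$\varphi_{S_n'}(u)=\prod_{j=1}^n \exp\left\{ \int_{\mathbb R^p} \left[e^{iu'x} - 1 \right] d\mu_{jn} \right\} = \exp\left\{ \int_{\mathbb R^p} \left[e^{iu'x} - 1 - iu'x \right] d\nu_n \right\}.$$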

Assumptions 1–4 above hold by Theorem 8.7 (page 41) of Sato's book.
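
As a quick numerical sanity check of the accompanying-law construction (not part of the argument), here is a minimal Python sketch comparing the first two moments of $S_n$ and $S_n'$; the toy array $X_{jn}=\pm n^{-1/2}$ with equal probabilities is my own choice for illustration, in which case $X$ is standard normal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy triangular array chosen for illustration (not from the post):
# X_{jn} = +/- 1/sqrt(n) with probability 1/2 each, so E[X_{jn}] = 0,
# var(S_n) = 1 for every n, and S_n => N(0, 1) by the CLT.
def sample_X(n, size):
    return rng.choice([-1.0, 1.0], size=size) / np.sqrt(n)

def sample_Sn(n, reps):
    # S_n = sum_{j=1}^n X_{jn}
    return sample_X(n, (reps, n)).sum(axis=1)

def sample_Sn_prime(n, reps):
    # Y_{jn} ~ CP(mu_{jn}, 1): a Poisson(1) number of i.i.d. copies of X_{jn};
    # since all mu_{jn} coincide here, S_n' is a Poisson(n)-sized sum of jumps.
    totals = rng.poisson(1.0, size=(reps, n)).sum(axis=1)
    return np.array([sample_X(n, int(t)).sum() for t in totals])

n, reps = 200, 20000
Sn, Snp = sample_Sn(n, reps), sample_Sn_prime(n, reps)
print("means:", Sn.mean(), Snp.mean())   # both close to 0
print("vars :", Sn.var(), Snp.var())     # both close to 1
```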

Best Answer

$\newcommand\de\delta$A counterexample is given by $p=1$, $\nu(dx):=|x|^{-5/2}\,1(0<|x|<1)\,dx$, and $\nu_n(dx):=|x|^{-5/2}\,1(1/n<|x|<1)\,dx$.
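
For completeness, here is one way to check that this family satisfies the hypotheses while (IV) fails:

  • $\int_{\mathbb R} |x|^2 \, d\nu_n = 2\int_{1/n}^1 x^{-1/2}\,dx = 4\left(1-n^{-1/2}\right) \leq 4$, so (0) holds, and $\int x \, d\nu_n = 0$ by symmetry.
  • Any $f \in \mathcal C_\#$ vanishes on $\{|x|<\de\}$ for some $\de>0$, and $\nu_n$ agrees with $\nu$ on $\{|x| \geq \de\}$ once $1/n < \de$, so $\int f \, d\nu_n = \int f \, d\nu$ for all large $n$; this gives (I).
  • Take $\sigma = 0$: for $\epsilon<1$, $\langle u, \sigma_{n,\epsilon} u\rangle = 2u^2 \int_{1/n}^{\epsilon} x^{-1/2}\,dx \leq 4u^2\sqrt{\epsilon}$, so $\limsup_n \langle u, \sigma_{n,\epsilon} u\rangle \leq 4u^2\sqrt{\epsilon} \to 0$ as $\epsilon \downarrow 0$, which gives (III). Moreover, since $|e^{iux}-1-iux| \leq u^2x^2/2$ is $\nu$-integrable, dominated convergence gives $\varphi_{S_n'} \to \varphi_X$ with this $\sigma=0$ and Lévy measure $\nu$, so $S_n' \Longrightarrow X$.
  • Yet $\int_{\mathbb R} |x| \, d\nu = 2\int_0^1 x^{-3/2}\,dx = \infty$, so $\int x \, d\nu$ is not defined (the integral is not absolutely convergent) and (IV) cannot hold.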


The OP has added certain conditions. The only consequence of those additional conditions that matters in this context is that $\int d\nu_n=n$ for all $n$ (so that $\mu_{jn}:=\nu_n/n$ is a probability measure for all $n$ and $j$). This extra condition on $\nu_n$ is easy to satisfy by the following slight modification of the definition of the measure $\nu_n$: $$\nu_n(dx):=|x|^{-5/2}\,1(\de_n<|x|<1)\,dx,$$ where $\de_n:=(1+\frac34\,n)^{-2/3}$ (so that $\de_n\to0$ as $n\to\infty$).
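
Indeed, spelling out the normalization for this choice of $\de_n$:
$$\int_{\mathbb R} d\nu_n = 2\int_{\de_n}^1 x^{-5/2}\,dx = \tfrac43\left(\de_n^{-3/2}-1\right) = \tfrac43\cdot\tfrac34\,n = n,$$
since $\de_n^{-3/2} = 1+\tfrac34\,n$ by construction.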