Proof that the jump measure of a Lévy process is a Poisson random measure

levy-processes, measure-theory, probability-theory, stochastic-processes

Let $(X_t)_{t \geq 0}$ be an $\mathbb{R}^d$-valued Lévy process and consider its associated jump measure $N_t: \Omega \times \mathbb{B}(\mathbb{R}^d \setminus \{0\}) \to \bar{\mathbb{N}}_0$ given by
\begin{equation*}
N_t(\omega,B):=\#\left\{0 \leq s \leq t \mid \Delta X_s(\omega) \in B\right\},
\end{equation*}
where $\Delta X_s := X_s - X_{s-}$ denotes the jump of $X$ at time $s$.

I am looking for a rigorous proof that, for each $t \geq 0$, $N_t$ is in fact a Poisson random measure on the measure space $(\mathbb{R}^d \setminus \{0\}, \mathbb{B}(\mathbb{R}^d \setminus \{0\}), \mu)$, where $\mu$ is the intensity measure $\mu(B):=t\operatorname{\mathbb{E}}(N_1(B))$. That is, I would like to prove that $N_t$ satisfies the following definition:

$\mathbf{Definition}$: Let $(\Omega, \mathbb{F}, P)$ be a probability space and $(\mathcal{X}, \mathbb{E}, \mu)$ a $\sigma$-finite measure space. A Poisson random measure with intensity measure $\mu$ is a mapping $N: \Omega \times \mathbb{E} \rightarrow \bar{\mathbb{N}}_0$ satisfying

(i) For every $\omega \in \Omega$, the map $B \mapsto N(\omega, B)$ is a measure on $(\mathcal{X}, \mathbb{E})$.

(ii) For every $B \in \mathbb{E}$, the map $\omega \mapsto N(\omega, B)$ is a random variable (i.e. measurable) and $N(\cdot, B) \sim \operatorname{Pois}(\mu(B))$.

(iii) If $B_1, \ldots, B_n$ are disjoint, then $N(\cdot, B_1), \ldots, N(\cdot, B_n)$ are mutually independent.
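For orientation, the concrete instance behind the question: if $X$ is a compound Poisson process with jump rate $\lambda$ and jump size distribution $F$, then $\operatorname{\mathbb{E}}(N_1(B)) = \lambda F(B)$, so the claim is that $N_t$ is a Poisson random measure on $\mathcal{X} = \mathbb{R}^d \setminus \{0\}$ with intensity $\mu(B) = \lambda t F(B)$. For a general Lévy process, $\mu = t\nu$, where $\nu$ is the Lévy measure of $X$.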

I am aware that Sato, in his book $\textit{Lévy Processes and Infinitely Divisible Distributions}$, provides a proof. However, that approach seems quite involved, and I was wondering whether a more direct one is available. In particular, I would like to know if there is a simple proof of the measurability in condition (ii).

Thank you!

Best Answer

For now this will be only a partial answer. Suppose it has already been proved that the total number $N_t(\cdot,\mathbb R^d)$ of jumps before time $t$ has a Poisson distribution. (This implicitly assumes $\mu(\mathbb R^d)<\infty$, i.e. that $X$ has only finitely many jumps on $[0,t]$, as for a compound Poisson process; for a general Lévy process one applies the same argument to sets bounded away from the origin, on which $\mu$ is finite.)

Then we can deduce $\text{(i)},$ $\text{(ii)},$ and $\text{(iii)}.$

Condition $\text{(i)}$ just says the jump measure is a measure, which holds because for each fixed $\omega$ the map $B\mapsto N_t(\omega,B)$ is a counting measure and hence countably additive. (Here I wonder if one should say “for almost every $\omega\in\Omega$.”)

Observe that $$ \#\{ 0\le s\le t : \Delta X_s\in B\} \mid N_t(\cdot,\mathbb R^d) \sim\operatorname{Binomial}(N_t(\cdot,\mathbb R^d), p(B)) $$ where $$p(B) = \dfrac{\mu(B)}{\mu(\mathbb R^d)} \tag 1$$ is the probability that any particular jump is in $B.$ Then \begin{align} & \Pr(\#\{0\le s\le t : \Delta X_s\in B\} = m) \\[6pt] = {} & \operatorname E(\Pr(\#\{0\le s\le t : \Delta X_s\in B\} = m \mid N_t(\cdot,\mathbb R^d))) \\[6pt] = {} & \operatorname E\left( \binom {N_t(\cdot,\mathbb R^d)} m p(B)^m (1-p(B))^{N_t(\cdot, \mathbb R^d)-m} \right) \\[6pt] = {} & \sum_{N=0}^\infty \binom N m p(B)^m (1-p(B))^{N-m} \Pr(N_t(\cdot,\mathbb R^d)=N) \\[6pt] = {} & \sum_{N=0}^\infty \binom N m p(B)^m (1-p(B))^{N-m} \frac{\mu(\mathbb R^d)^N e^{-\mu(\mathbb R^d)}}{N!} \\[6pt] = {} & \frac{\mu(B)^m e^{-\mu(B)}}{m!}. \quad(\text{Why? See below.}) \tag 2 \end{align} How do we deduce line $(2)$ from what precedes it?

  • Note that $\dbinom Nm=0$ when $N<m,$ so that we can replace the sum by $\displaystyle \sum_{N=m}^\infty.$
  • Let $M=N-m,$ so we have $\displaystyle \sum_{M=0}^\infty$ and the exponent $N-m$ becomes $M$ and the remaining $N$s become $M+m.$
  • Apply the power series for the exponential function, as carried out below.
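Explicitly: after substituting $M=N-m$ and using $\binom{M+m}{m}\big/(M+m)! = 1/(m!\,M!)$ together with $p(B)\,\mu(\mathbb R^d)=\mu(B)$ from $(1)$, the sum becomes
\begin{align*}
& \sum_{M=0}^\infty \binom{M+m}{m} p(B)^m (1-p(B))^{M} \frac{\mu(\mathbb R^d)^{M+m} e^{-\mu(\mathbb R^d)}}{(M+m)!} \\[6pt]
= {} & \frac{\mu(B)^m e^{-\mu(\mathbb R^d)}}{m!} \sum_{M=0}^\infty \frac{\big((1-p(B))\,\mu(\mathbb R^d)\big)^M}{M!} \\[6pt]
= {} & \frac{\mu(B)^m e^{-\mu(\mathbb R^d)}}{m!}\, e^{(1-p(B))\mu(\mathbb R^d)}
= \frac{\mu(B)^m e^{-\mu(B)}}{m!}.
\end{align*}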

So we conclude that $N_t(\cdot, B)\sim\operatorname{Poisson}(\mu(B)).$

Next, how do we know $N_t(\cdot, B_1),\ldots, N_t(\cdot, B_n)$ are mutually independent when $B_1,\ldots,B_n$ are disjoint?

\begin{align} & \Pr( N_t(\cdot, B_1)=m_1\ \&\ \cdots\ \&\ N_t(\cdot, B_n)=m_n) \\[6pt] = {} & \operatorname E(\Pr( N_t(\cdot, B_1)=m_1\ \&\ \cdots\ \&\ N_t(\cdot, B_n)=m_n\mid N_t(\cdot,B_1\cup\cdots\cup B_n) )) \\[6pt] = {} & \operatorname E\left( \binom{N_t(\cdot,B_1\cup\cdots\cup B_n)}{m_1,\ldots,m_n} q(B_1)^{m_1}\cdots q(B_n)^{m_n} \right) \tag 3 \end{align} where $q(B_i) := \mu(B_i)/\mu(B_1\cup\cdots\cup B_n)$ is the probability that a given jump in $B_1\cup\cdots\cup B_n$ lands in $B_i$ (by disjointness, $q(B_1)+\cdots+q(B_n)=1$), and $$ \binom N {m_1,\ldots,m_n} = \begin{cases} \dfrac{N!}{m_1!\cdots m_n!} & \text{if } m_1+\cdots+m_n=N, \\[6pt] 0 & \text{otherwise.} \end{cases} $$ Line $(3)$ is a sum of infinitely many terms, all but one of which are $0$: the multinomial coefficient vanishes unless $N_t(\cdot,B_1\cup\cdots\cup B_n)=m_1+\cdots+m_n$. Since $N_t(\cdot,B_1\cup\cdots\cup B_n)\sim\operatorname{Poisson}(\mu(B_1\cup\cdots\cup B_n)),$ that one term is \begin{align} & \binom{m_1+\cdots+m_n}{m_1,\ldots,m_n} q(B_1)^{m_1} \cdots q(B_n)^{m_n} \frac{\mu(B_1\cup\cdots\cup B_n)^{m_1+\cdots+m_n} e^{-\mu(B_1\cup\cdots\cup B_n)} }{(m_1+\cdots+m_n)!} \\[6pt] = {} & \binom{m_1+\cdots+m_n}{m_1,\ldots,m_n} \mu(B_1)^{m_1} \cdots \mu(B_n)^{m_n} \frac{ e^{-\mu(B_1\cup\cdots\cup B_n)} }{(m_1+\cdots+m_n)!} \\[6pt] = {} & \prod_{i=1}^n \frac{\mu(B_i)^{m_i} e^{-\mu(B_i)} }{m_i!}, \end{align} using $\mu(B_1\cup\cdots\cup B_n)=\mu(B_1)+\cdots+\mu(B_n)$ in the exponential (disjointness again) and $\binom{m_1+\cdots+m_n}{m_1,\ldots,m_n}\big/(m_1+\cdots+m_n)! = 1/(m_1!\cdots m_n!).$ The joint probability mass function therefore factors into the product of the $\operatorname{Poisson}(\mu(B_i))$ marginals, which is exactly the independence asserted in $\text{(iii)}.$
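As a numerical sanity check (a sketch of my own, not part of the proof), one can simulate a compound Poisson process and verify empirically that the jump counts in disjoint sets have Poisson marginals and are uncorrelated. The rate $\lambda$, horizon $t$, standard normal jump law, and the sets $B_1, B_2$ below are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(seed=42)

# Compound Poisson process on R: jump times at rate lam, i.i.d. N(0,1) jump sizes.
lam, t, n_paths = 2.0, 3.0, 50_000

# Total jump count on [0, t] for each path: N_t(., R) ~ Poisson(lam * t).
total = rng.poisson(lam * t, size=n_paths)

# Two disjoint Borel sets: B1 = (0, oo), B2 = (-oo, -1].
# Under the N(0,1) jump law, p(B1) = 1/2 and p(B2) = Phi(-1).
p1, p2 = 0.5, norm.cdf(-1.0)

counts1 = np.zeros(n_paths, dtype=int)  # N_t(., B1)
counts2 = np.zeros(n_paths, dtype=int)  # N_t(., B2)
for i, n in enumerate(total):
    jumps = rng.standard_normal(n)      # the n jump sizes of path i
    counts1[i] = np.count_nonzero(jumps > 0.0)
    counts2[i] = np.count_nonzero(jumps <= -1.0)

# Poisson(mu(B_i)) has mean == variance == lam * t * p(B_i).
print(counts1.mean(), counts1.var(), lam * t * p1)  # all approx. 3.0
print(counts2.mean(), counts2.var(), lam * t * p2)  # all approx. 0.95
print(np.corrcoef(counts1, counts2)[0, 1])          # approx. 0, consistent with (iii)
```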
