[Math] Strong Markov property for Poisson point process

pr.probability, reference-request, stochastic-calculus, stochastic-processes

The question is essentially contained in the title. I would only like to find a reference for this statement; I have searched in some books, to no avail.

Here is what I mean exactly. Suppose we have a Poisson point process $N$ on $\mathbb{R} _+\times \mathbb{R}^d$, where $\mathbb{R} _+$ is time, whose intensity is the Lebesgue measure on $\mathbb{R} _+\times \mathbb{R}^d$. Let $\mathscr{F} _t$ be the minimal $\sigma$-algebra containing all the random variables
$N(Q \times U)$, where $Q \in \mathscr{B}([0;t])$, $U \in \mathscr{B}(\mathbb{R}^d)$; alternatively, we may take the minimal complete right-continuous filtration with this property. Let $\tau$
be a stopping time with respect to $(\mathscr{F} _t)$. It seems reasonable to conjecture
that the process $\bar N$ defined by

$$\bar N ([0;s] \times U) = N ([\tau;\tau + s] \times U), \ \ \ U \in \mathscr{B}(\mathbb{R}^d)$$

is a Poisson point process with the same intensity measure, independent of $\mathscr{F} _{\tau}$. However, I was not able to find a reference for this statement.

In terms of random sets, it corresponds to the strong Markov property of the set
$[0;\tau] \times \mathbb{R}^d$, which is not compact.
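Not a proof, but perhaps useful for intuition: below is a small Monte Carlo sketch of the conjectured restart property. Everything in it is my own illustrative choice (dimension $d=1$, the particular stopping time, the windows, the variable names): it restarts a unit-rate space-time Poisson process at the stopping time $\tau$ given by the first arrival whose spatial coordinate lies in $[0,1/2]$, and checks that the count in a window after $\tau$ looks Poisson with mean $(b-a)\lambda(U)$ and uncorrelated with $\tau$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo sketch (illustration only, not a proof).  Restart a unit-rate
# space-time Poisson process on [0, T_MAX] x [0, 1] at the stopping time
#   tau = first arrival time whose spatial coordinate lies in [0, 1/2],
# and check that the count in a window after tau looks Poisson with the
# predicted mean (b - a) * lambda(U) and is uncorrelated with tau.

T_MAX = 50.0      # time horizon; tau + b <= T_MAX up to a negligible event
a, b = 0.3, 1.7   # time window (tau + a; tau + b)
U = (0.2, 0.9)    # spatial window, Lebesgue measure 0.7
n_trials = 20_000

counts = np.empty(n_trials)
taus = np.empty(n_trials)
for i in range(n_trials):
    # one realization of a unit-rate Poisson process on [0, T_MAX] x [0, 1]
    n_pts = rng.poisson(T_MAX)            # area of the rectangle is T_MAX
    t = rng.uniform(0.0, T_MAX, n_pts)
    x = rng.uniform(0.0, 1.0, n_pts)
    # stopping time: first arrival with spatial coordinate in [0, 1/2]
    hit_times = t[x <= 0.5]
    tau = hit_times.min() if hit_times.size else T_MAX  # fallback is a.s. irrelevant
    taus[i] = tau
    # \bar N((a; b) x U) = N((tau + a; tau + b) x U)
    in_window = (t > tau + a) & (t < tau + b) & (x > U[0]) & (x < U[1])
    counts[i] = in_window.sum()

mean_pred = (b - a) * (U[1] - U[0])
print("predicted mean     :", mean_pred)
print("empirical mean     :", counts.mean())
print("empirical variance :", counts.var())                     # ~ mean for Poisson
print("corr(count, tau)   :", np.corrcoef(counts, taus)[0, 1])  # ~ 0
```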

In the book by Rozanov, link, the strong Markov property is
considered for compact stopping sets.

I asked this question on math.stackexchange, link.

Update

I would like to add two things.

First, it looks like the strong Markov property for $\bar N$ follows from Theorem 20.9 here:

Theorem 20.9 Suppose $(X_t , P^x)$ is a Markov process with respect to $\{\mathscr{F}_t \}$, that Assumption
20.1 holds, and that $T$ is a finite stopping time. If $Y$ is bounded and measurable with respect
to $\mathscr{F}_\infty$, then
$$
E^x[Y \circ \theta _T \mid \mathscr{F}_T ] = E^{X_T}Y \quad P^x\text{-a.s.}
$$

Here $\{\mathscr{F}_t \}$ is the minimal right-continuous filtration which contains all sets that are $P^x$-null for every $x$ and to which
$(X_t)$ is adapted:
$$
\mathscr{F}_t^{00} = \sigma(X_s : s \leq t),
$$
$$ \mathscr{F}_t^{0} = \sigma \left(\mathscr{F}_t^{00} \cup \{ A : A \text{ is } P^x\text{-null for all } x \in S\} \right) \quad \text{and} \quad \mathscr{F}_t = \mathscr{F}_{t+}^{0} = \bigcap_{\epsilon > 0} \mathscr{F}_{t+\epsilon}^{0},$$
where $S$ is the state space and $(P_t)$ is the semigroup associated with the process. Assumption 20.1:

Assumption 20.1 Suppose $P_t f$ is continuous on $S$ whenever $f$ is bounded and continuous
on $S$.

Assumption 20.1 is satisfied if we consider the Poisson point process as a canonical process in the space $D_S [0;T_1]$, $T_1 >0$, $S =\Gamma $. Here $\Gamma$ is the space of all simple counting measures on $\mathbb{R}^d$, equipped with the vague topology, i.e. the smallest topology such that
for every $f \in C_K(\mathbb{R}^d)$ the mapping

$$
\Gamma \ni \gamma \mapsto \int f \, d \gamma
$$
is continuous. With this topology, $\Gamma$ is a Polish space, link. Then Assumption 20.1 is equivalent to

\begin{equation}
E g(\gamma _n \cup N _t) \to E g(\gamma \cup N_t) \quad \text{whenever} \ \gamma _n \to \gamma, \tag{1}
\end{equation}
where $g: \Gamma \to \mathbb{R}$ is a bounded continuous function. Convergence
$\gamma _n \to \gamma$ in the vague topology implies $\gamma _n \cup N _t \to \gamma \cup N_t$ a.s., and therefore (1) follows from the bounded convergence theorem.
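For completeness, and assuming (as I read it) that $\cup$ denotes superposition of configurations and that a.s. $N_t$ has no points in common with $\gamma$ or with any $\gamma_n$, so that unions of configurations correspond to sums of measures, the a.s. convergence can be spelled out as follows: for every $f \in C_K(\mathbb{R}^d)$,

$$
\int f \, d(\gamma_n \cup N_t) = \int f \, d\gamma_n + \int f \, dN_t \longrightarrow \int f \, d\gamma + \int f \, dN_t = \int f \, d(\gamma \cup N_t),
$$

so $g(\gamma_n \cup N_t) \to g(\gamma \cup N_t)$ a.s. by continuity of $g$, and the expectations converge because $g$ is bounded.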

Also, one should prove that $(N_t)$ is a Markov process with respect to $\{\mathscr{F}_t \}$ (this is an assumption of Theorem 20.9).

Second, I have tried to prove the statement in the original question using the idea suggested by Anthony Quas in his comment. To prove that $N$ is strong Markov, it is enough to show that

  1. for any $b>a>0$ and any bounded open $U \subset \mathbb{R}^d$, $\bar N ((a;b) \times U)$ is a Poisson random variable with mean $(b-a)\lambda (U)$, independent of $\mathscr{F} _\tau$, where $\lambda$ is the Lebesgue measure on
    $\mathbb{R}^d$, and
  2. for any $b_k>a_k>0$, $k=1,\dots,m$, and any bounded open $U_k \subset \mathbb{R}^d$ such that $((a_i;b_i) \times U_i) \cap ((a_j;b_j) \times U_j) = \varnothing$ for $i \ne j$,
    the random variables $\bar N ((a_k;b_k) \times U_k)$
    are independent of each other.

To do so, let $\tau _n$ be a sequence of stopping times taking only countably many values, with $\tau _n \downarrow \tau$ and $\tau _n - \tau \leq \frac{1}{2^n}$. Then $N$ satisfies the strong Markov property for each $\tau _n$, and the processes $\bar N _n$ defined by
$$
\bar N _n ([0;s] \times U) = N ([\tau_n;\tau_n + s] \times U),
$$
are Poisson point processes. To prove 1, note that $\bar N _n ((a;b) \times U) \to \bar N ((a;b) \times U) $ a.s. and all the random variables $\bar N _n ((a;b) \times U)$ have the same distribution; therefore $\bar N ((a;b) \times U)$ is a Poisson random variable with mean
$(b-a)\lambda (U)$. The random variables $\bar N _n ((a;b) \times U)$
are independent of $\mathscr{F}_{\tau}$, hence $\bar N ((a;b) \times U)$ is independent of $\mathscr{F}_{\tau}$ as well. Part 2 follows similarly.
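Again only as an illustration of this approximation step (the dyadic choice $\tau_n = (\lfloor 2^n \tau \rfloor + 1)/2^n$ and all names in the snippet are my own), here is a sketch that, for a single realization, compares the counts taken at $\tau_n$ with the count taken at $\tau$ as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustration of the approximation step only (the dyadic choice of tau_n is
# mine): tau_n = (floor(2^n * tau) + 1) / 2^n takes countably many values,
# tau_n >= tau and tau_n - tau <= 2^{-n}.  For a single realization, the count
# N((tau_n + a; tau_n + b) x U) stabilises at N((tau + a; tau + b) x U)
# once 2^{-n} is small enough.

T_MAX = 50.0
a, b = 0.3, 1.7
U = (0.2, 0.9)

# one realization of a unit-rate Poisson process on [0, T_MAX] x [0, 1]
n_pts = rng.poisson(T_MAX)
t = rng.uniform(0.0, T_MAX, n_pts)
x = rng.uniform(0.0, 1.0, n_pts)

# stopping time: first arrival whose spatial coordinate lies in [0, 1/2]
hit_times = t[x <= 0.5]
tau = hit_times.min() if hit_times.size else 0.0

def dyadic_upper(s, n):
    """Smallest dyadic number of order n strictly greater than s."""
    return (np.floor(s * 2**n) + 1) / 2**n

def count(shift):
    """N((shift + a; shift + b) x U) for this realization."""
    sel = (t > shift + a) & (t < shift + b) & (x > U[0]) & (x < U[1])
    return int(sel.sum())

print("tau =", tau, " count at tau:", count(tau))
for n in range(1, 12):
    tau_n = dyadic_upper(tau, n)
    print(f"n = {n:2d}  tau_n - tau = {tau_n - tau:.4f}  count at tau_n: {count(tau_n)}")
```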

Best Answer

Did you try Applied Probability and Queues by Søren Asmussen? It surely appears, at least in some form, in the book on Lévy processes by Sato or in the one by Bertoin. More general references like Revuz & Yor or Stochastic Integration by Philip Protter might also have it.
