[Math] Independent increments in the Poisson process.

independence, poisson distribution, poisson process, probability theory

Let $(N_t)_{t\geq0}$ be a Poisson process. For me, the definition is the following: $N_t=\max\{n\geq0:T_n\leq t\}$, where $T_0=0$ and $T_n=S_1+\ldots+S_n$ for $n\geq1$, with $(S_n)_{n\geq1}$ i.i.d. exponential random variables of parameter $\lambda$. Hence, $N_t\sim\text{Poisson}(\lambda t)$.
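(As a sanity check of this construction, not part of the question: the following sketch simulates $N_t$ by summing i.i.d. exponential interarrival times and verifies empirically that the mean is close to $\lambda t$. All names here are my own choices for illustration.)

```python
import random

def poisson_count(lam, t, rng):
    """Count arrivals in [0, t]: accumulate i.i.d. Exp(lam) interarrival
    times S_1, S_2, ... and return the largest n with T_n <= t."""
    total, n = 0.0, 0
    while True:
        total += rng.expovariate(lam)  # add S_{n+1} ~ Exp(lam)
        if total > t:
            return n
        n += 1

rng = random.Random(0)
lam, t, trials = 2.0, 3.0, 20_000
samples = [poisson_count(lam, t, rng) for _ in range(trials)]
mean = sum(samples) / trials
# E[N_t] = lam * t = 6.0; the empirical mean should land nearby.
print(mean)
```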

I was able to prove that $N_{t+s}-N_s$ is independent of $N_s$ and that $N_{t+s}-N_s\sim\text{Poisson}(\lambda t)$ for $s,t\geq0$. From this information, is there a relatively easy way to prove that $N_{t+s}-N_s$ is independent of $(N_{s_1},\ldots,N_{s_m})$ for all $0\leq s_1\leq \ldots\leq s_m\leq s$ and $m\geq1$?

Best Answer

It follows from what you have shown, using a similar argument together with the memoryless property characterizing the exponential distribution, that the Poisson process has independent increments. See p. 9 of this document, or even better the remark on p. 4 here, or best of all Theorem 3.3.5 on p. 6 here. Also pp. 3-4 here.

Independent increments imply the Markov property. In particular, what you have shown allows you to show that the Poisson process has the Markov property.

From the Markov property, we know that $N_{t+s}-N_s$ is independent of the $\sigma$-algebra $\mathscr{F}_s$.

Since all of the $s_i \le s$, each $N_{s_i}$ is measurable with respect to $\mathscr{F}_s$ by definition of $\mathscr{F}_s$. Since each of $N_{s_1}, \dots, N_{s_m}$ is $\mathscr{F}_s$-measurable, so is the random vector $(N_{s_1}, \dots, N_{s_m})$.

Since $N_{t+s} - N_s$ is independent of $\mathscr{F}_s$, and $(N_{s_1}, \dots, N_{s_m})$ is measurable with respect to $\mathscr{F}_s$, it follows that $N_{t+s}-N_s$ is independent of $(N_{s_1},\dots,N_{s_m})$, which is what we wanted to show.
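(An optional empirical illustration of this conclusion, under assumptions of my own choosing: simulate the process, pick one event determined by the increment $N_{s+t}-N_s$ and one determined by $N_{s_1}$ with $s_1 \le s$, and check that the joint frequency is close to the product of the marginal frequencies.)

```python
import random

def counts_at(times, lam, rng):
    """Sample one Poisson path and return N_u for each u in `times`
    (assumed sorted), by accumulating Exp(lam) arrival times."""
    arrivals, total = [], 0.0
    while total <= times[-1]:
        total += rng.expovariate(lam)
        if total <= times[-1]:
            arrivals.append(total)
    return [sum(1 for a in arrivals if a <= u) for u in times]

rng = random.Random(1)
lam, s1, s, t = 1.0, 0.5, 1.0, 1.0
trials = 40_000
joint = p_inc = p_past = 0
for _ in range(trials):
    n_s1, n_s, n_st = counts_at([s1, s, s + t], lam, rng)
    a = (n_st - n_s == 1)   # event on the future increment N_{s+t} - N_s
    b = (n_s1 == 0)         # event on the past, F_s-measurable
    joint += a and b
    p_inc += a
    p_past += b
lhs = joint / trials
rhs = (p_inc / trials) * (p_past / trials)
print(abs(lhs - rhs))  # small if the two events are independent
```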

Proof of independent increments #1 (use induction)

Let $0 = s_0 < s_1 < \dots < s_m$ be given. Then by what you have already shown, $N_{s_2}- N_{s_1}$ is independent of $N_{s_1}$. This is the base case of our induction.

Now assume, as induction hypothesis, that the increments $N_{s_1}-N_{s_0}, N_{s_2} -N_{s_1}, \dots, N_{s_n} - N_{s_{n-1}}$ are mutually independent.

By what you have shown, $N_{s_{n+1}}-N_{s_n}$ is independent of $N_{s_n}$.

See here.

Using a telescoping sum representation, we have $$N_{s_n} = (N_{s_n} - N_{s_{n-1}}) + (N_{s_{n-1}} - N_{s_{n-2}}) + \dots + (N_{s_1} - N_{s_0}) \,.$$

Thus $N_{s_{n+1}}-N_{s_n}$ is independent of $N_{s_n}$, which by the telescoping representation equals $$N_{s_n}= (N_{s_1} - N_{s_0}) + (N_{s_2} - N_{s_1}) + \dots + (N_{s_n} - N_{s_{n-1}}) \,.$$ Since $N_{s_n}$ is the sum of the mutually independent random variables $N_{s_1} - N_{s_0}, N_{s_2} -N_{s_1}, \dots, N_{s_n} - N_{s_{n-1}}$, and $N_{s_{n+1}}-N_{s_n}$ is independent of it, it follows that $N_{s_{n+1}}-N_{s_n}$ is independent of all of the random variables in the sum. This completes the induction step and thus one proof of independent increments.

Proof of independent increments #2 (following Durrett's Probability Theory and Examples, pp. 155-156, 4th edition) $\newcommand{\Prob}{\mathbb{P}}$

$N_t = \max \{ n \ge 0: T_n \le t \}$, where $T_0 = 0$ and $T_n = S_1+ \dots + S_n$ with $S_n \sim \text{Exp}(\lambda)$ i.i.d.

Already shown: $N_{t+s} - N_s$ is independent of $N_s$ for all $s,t \ge 0$.

Let $T_1' = T_{N_s +1} - s$ be the amount of time that elapses after time $s$ until the next arrival. The following computation shows that $T_1'$ is independent of $N_s$: $$\Prob(T_1' \ge t \mid N_s = n) = \Prob(T_{N_s+1} \ge s+t \mid N_s = n) \\ = \Prob(T_{n+1} \ge s+t,\ T_n \le s )/ \Prob(N_s = n)\,.$$ The numerator equals $$ \Prob(T_{n+1} \ge s+t,\ T_n \le s ) = \int_0^s f_{T_n}(u)\, \Prob(S_{n+1} \ge (s+t) - u)\,du \\= \int_0^s \frac{\lambda^n u^{n-1}}{(n-1)!}e^{-\lambda u}\, e^{-\lambda( (s+t) -u)}\,du = e^{-\lambda (s+t)}\frac{(\lambda s)^n}{n!}\,. $$ By what you have already shown, the denominator is $\Prob(N_s=n) = e^{-\lambda s}(\lambda s)^n/n!$, so $$\Prob(T_1' \ge t \mid N_s = n) = \frac{e^{- \lambda(s+t)}}{e^{-\lambda s}} = e^{-\lambda t} =\Prob(T_1' \ge t) \,. $$
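(To see this memorylessness numerically, here is an illustrative sketch with parameters of my own choosing: for each simulated path we record $N_s$ and the residual time $T_1' = T_{N_s+1} - s$, then check that $\Prob(T_1' \ge t \mid N_s = n) \approx e^{-\lambda t}$ for several values of $n$.)

```python
import math
import random

rng = random.Random(2)
lam, s, t = 1.5, 2.0, 0.8
trials = 30_000
tail_by_n = {}  # n -> (count of {T_1' >= t}, count of {N_s = n})
for _ in range(trials):
    total, n = 0.0, 0
    while True:
        total += rng.expovariate(lam)  # next arrival time T_{n+1}
        if total > s:
            break
        n += 1
    t1_prime = total - s               # residual wait after time s
    hit, seen = tail_by_n.get(n, (0, 0))
    tail_by_n[n] = (hit + (t1_prime >= t), seen + 1)

target = math.exp(-lam * t)            # memoryless prediction: Exp(lam) tail
for n in (0, 1, 2, 3):
    hit, seen = tail_by_n[n]
    print(n, hit / seen, target)       # conditional tails all near target
```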

For $k \ge 2$, define $T_k' = T_{N_s +k} - T_{N_s +k-1}$, the amount of time elapsing between the $(k-1)$th and the $k$th arrival after time $s$. Observing that $$\Prob(T_n \le s,\ T_{n+1} \ge u,\ T_{n+k} - T_{n+k-1} \ge v_k,\ k=2,\dots,K) = \Prob(T_{n} \le s,\ T_{n+1} \ge u) \prod_{k=2}^{K} \Prob(S_{n+k} \ge v_k) \,, $$ it follows that $T_1', T_2', \dots$ are i.i.d. and independent of $N_s$. In other words, the arrivals after time $s$ are independent of $N_s$ and have the same distribution as the original sequence.

From this follows the desired conclusion of independent increments, namely that if $0 = s_0 < s_1 < \dots < s_m$, then the increments $N_{s_i} - N_{s_{i-1}}$, $i = 1, \dots, m$, are independent.

This is because the vector $( N_{s_2} - N_{s_1}, \dots, N_{s_m} - N_{s_{m-1}} )$ is measurable with respect to $\sigma(T_k',\ k \ge 1)$ (renewing the process at time $s_1$) and thus independent of $N_{s_1}$. It then follows by induction that $$\Prob(N_{s_i} - N_{s_{i- 1}} = k_i,\ i = 1, \dots, m) = \prod_{i=1}^m e^{-\lambda (s_i - s_{i-1})} \frac{(\lambda (s_i - s_{i-1} ))^{k_i}}{k_i!}\\ = \prod_{i=1}^{m} \Prob(N_{s_i} - N_{s_{i-1}} = k_i) \,. \square $$
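(The product formula can also be checked by simulation. The sketch below, with an arbitrary grid and increment pattern of my own choosing, compares the empirical joint frequency of a particular increment vector against the product of the corresponding Poisson pmfs.)

```python
import math
import random

def poisson_pmf(mu, k):
    """P(X = k) for X ~ Poisson(mu)."""
    return math.exp(-mu) * mu ** k / math.factorial(k)

rng = random.Random(3)
lam = 1.0
s = [0.0, 0.7, 1.5, 2.0]   # grid 0 = s_0 < s_1 < s_2 < s_3
target = (1, 0, 1)         # pattern (k_1, k_2, k_3) for the increments
trials = 50_000
hits = 0
for _ in range(trials):
    total, arrivals = 0.0, []
    while total <= s[-1]:
        total += rng.expovariate(lam)
        if total <= s[-1]:
            arrivals.append(total)
    counts = [sum(1 for a in arrivals if a <= u) for u in s]
    incs = tuple(counts[i] - counts[i - 1] for i in range(1, len(s)))
    hits += (incs == target)

empirical = hits / trials
predicted = 1.0
for k, lo, hi in zip(target, s, s[1:]):
    predicted *= poisson_pmf(lam * (hi - lo), k)  # Poisson(lam*(s_i - s_{i-1}))
print(empirical, predicted)  # the two should be close
```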

The fact that $T_1'$ is independent of $N_s$ amounts essentially to the memoryless property of the exponential distribution. Note: I imagine this is similar to the argument that you used to show that $N_{s+t}-N_s$ is independent of $N_s$. The only difference here is that now we are renewing the process at a random time, as opposed to a deterministic time.

Proof that independent increments imply the Markov property (see here)

In order to demonstrate that the process is Markov, it suffices to show, for any bounded measurable function $f$, that $$\mathbb{E}[f(N_{s+t})\mid\mathscr{F}_s] = \mathbb{E}[f(N_{s+t})\mid\sigma(N_s)] \,. $$ This follows from writing $$\mathbb{E}[f(N_{s+t})\mid\mathscr{F}_s] = \mathbb{E}[f(\ [N_{s+t} - N_s]\ + N_s)\mid \mathscr{F}_s] \,.$$ By assumption, $N_{s+t} - N_s$ is independent of $\mathscr{F}_s$, and by definition $N_s$ is $\mathscr{F}_s$-measurable, so the conditional expectation depends on $\mathscr{F}_s$ only through $N_s$. In other words, $$\mathbb{E}[f(\ [N_{s+t} - N_s]\ + N_s)\mid \mathscr{F}_s] = \mathbb{E}[f(\ [N_{s+t} - N_s]\ + N_s)\mid\sigma(N_s)] = \mathbb{E}[f(N_{s+t})\mid\sigma(N_s) ] \,.$$
