Poisson process arrival time.

poisson-process, probability, stochastic-processes

Let $(N_t)_{t\ge 0}$ be a Poisson process with rate $\lambda$, let $S_{n}$ denote the time of the $n$-th event, and define the interarrival times $T_i:=S_i-S_{i-1}$.

Find $E(S_{4}|N(1)=2)$.


On the one hand, using the method in post 1, think of the Poisson process $\tilde{N}$ restarted at time $1$ (with corresponding arrival times $\tilde{S}$):
$$E(S_{4}|N(1)=2)=1+E(S_{4}-1|N(1)=2)=1+E(\tilde{S}_{2}|N(1)=2)=1+E(\tilde{S}_{2})=1+\frac{2}{\lambda}$$
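
As a quick sanity check, here is a minimal Monte Carlo sketch (assuming numpy and a concrete rate, say $\lambda = 2$, chosen only for illustration) that estimates $E(S_4\mid N(1)=2)$ by simulating interarrival times and keeping the paths with exactly two arrivals in $[0,1]$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0            # assumed rate, for illustration only
n_paths = 200_000

s4_samples = []
for _ in range(n_paths):
    # interarrival times; 10 of them comfortably contain the first 4 arrivals
    gaps = rng.exponential(1 / lam, size=10)
    arrivals = np.cumsum(gaps)
    if np.sum(arrivals <= 1.0) == 2:      # condition on the event N(1) = 2
        s4_samples.append(arrivals[3])    # S_4, the time of the 4th event

print("simulated E[S_4 | N(1)=2]:", np.mean(s4_samples))
print("formula   1 + 2/lambda   :", 1 + 2 / lam)
```

The two printed numbers agree up to Monte Carlo error, which supports the restart argument.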


On the other hand, I want to solve the problem directly:
$$E(S_{4}|N(1)=2)=E(S_{2}|N(1)=2)+E(T_3|N(1)=2)+E(T_4|N(1)=2):=A+B+C$$

Using the method in post 2, $f_{(S_1,S_2)|N}(s_1,s_2)=2$ on $0<s_1<s_2<1$, so $f_{S_2|N}(x)=2x$ on $(0,1)$, hence $A=\int_0^1 x\cdot 2x\,\mathrm{d}x=2/3$.

Using independence, $B=C=\frac{1}{\lambda}$, so $E(S_{4}|N(1)=2)=\frac{2}{3}+\frac{2}{\lambda}$.


But the two results are not consistent. What's wrong? I suspect that $T_3$ is not independent of the event $N(1)=2$.

$N(1)=2$ is equivalent to $S_2\le 1,\ S_3>1$, so
$$B=E[T_3|N(1)=2]=E[T_3|T_1+T_2\le 1,\,T_1+T_2+T_3>1]\overset{*}{=}E[T_3|T_3>1-T_1-T_2]\overset{\text{memoryless}}{=}1-T_1-T_2+E[T_3]$$

Does $*$ hold? How can the two results be made consistent?

Best Answer

Does $*$ hold?

The short answer is that $E[T_3 \mid T_1+T_2\le 1,\, T_1+T_2+T_3>1]\overset{*}{=}E[T_3 \mid T_3 > 1-T_1-T_2]$ is true but not helpful in the way you'd like.

You might think that, since $T_3 > 0$ by definition, the conditioning on the right-hand side, $T_3 > 1-T_1-T_2$, automatically takes care of the constraint on $T_1$ and $T_2$.

Unfortunately, while the equal sign $\overset{*}{=}$ is technically true as a piece of symbolic manipulation, when it comes to the actual calculation you still have to make the implicit constraint $T_1 + T_2 \le 1$ explicit.

The next equal sign $E[T_3 \mid T_3>1-T_1-T_2]\overset{\text{memoryless}}{=}1-T_1-T_2+E[T_3]$ is sloppy at best, and it should be interpreted as false most of the time.

If we are talking about $t_1$ and $t_2$ as two constants, then $E[T_3 \mid T_3 > 1 -t_1 - t_2] = 1 - t_1 - t_2 + E[T_3]$ is true.

Alternatively, if we take the expectation with respect to $T_3$ only, then $E_{T_3}[T_3 \mid T_3>1-T_1-T_2] = 1 - T_1 - T_2 + E[T_3]$ is also true, but the right-hand side is still a function of the random variables $T_1$ and $T_2$. One still has to take the outer expectation $E_{T_1,T_2}[\,\cdots\,]$.
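
To make the fixed-cutoff statement of the previous two paragraphs concrete, here is a small simulation sketch (again assuming numpy, $\lambda = 2$, and an arbitrary constant cutoff $c = 0.4$ standing in for $1 - t_1 - t_2$):

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0            # assumed rate, for illustration only
c = 0.4              # a fixed cutoff playing the role of 1 - t_1 - t_2

t3 = rng.exponential(1 / lam, size=1_000_000)
conditioned = t3[t3 > c]                  # keep only samples with T_3 > c

print("simulated E[T_3 | T_3 > c]:", conditioned.mean())
print("memoryless  c + 1/lambda  :", c + 1 / lam)
```

With $c$ fixed the memoryless identity holds on the nose; the subtlety in the original problem is that $c = 1 - T_1 - T_2$ is random and constrained to be positive.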

How can the two results be made consistent?

The decomposition $E(S_{4} \mid N(1)=2) = \ldots =A+B+C$ is correct and one indeed can "solve it directly". It's just that one has to actually do some calculations for $B$ (or quote additional known results like what's done for $A$ and $C$).

Your $A = \frac23$ and $C = \frac1{\lambda}$ are correct, and your hunch that "$T_3$ is not independent of $N(1) = 2$" is also correct.

Following your notation $S_2 = T_1 + T_2$, I shall use the dummy variable $s$ for $S_2$ and $t$ for $T_3$ in their respective marginal densities $f_{S_2}(s)$ and $f_{T_3}(t)$. We know that $f_{S_2}$ is a Gamma density with shape parameter $2$ and rate $\lambda$, namely $f_{S_2}(s) = \lambda^2 s\, e^{-\lambda s}$, and that $f_{T_3}(t) = \lambda e^{-\lambda t}$ is exponential.

For both generality and clarity, I shall use $\tau$ to denote the given cutoff time, as here we have $\tau = 1$ and we are conditioning on $N(\tau) = 2$.

$$B := E\bigl( T_3 ~\big|~ N(\tau) = 2 \bigr) = \frac{ \iint_{\Omega} t \cdot f_{S_2,T_3}(s,t) \,\mathrm{d}t \,\mathrm{d}s }{ \Pr\{ N(\tau) = 2 \} } $$

where the joint density $f_{S_2,T_3}$ is the product of the marginals $f_{S_2}$ and $f_{T_3}$, because the $T_i$ are mutually independent. The two-dimensional integral is over the region $\Omega = \{ (t,s): 0 < s < \tau,~t > \tau - s \}$. Lastly, the denominator $\Pr\{ N(\tau) = 2 \}$ is just the familiar probability mass function of the Poisson distribution

$$\Pr\{ N(\tau) = k \} = e^{ -\lambda \tau } \frac{ (\lambda \tau)^k }{ k! } $$
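
As a quick check of this denominator, here is a simulation sketch (assuming numpy, $\lambda = 2$, $\tau = 1$; both values chosen only for illustration) comparing the empirical frequency of $\{N(\tau) = 2\}$ with the pmf:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(2)
lam, tau, k = 2.0, 1.0, 2                 # assumed values for illustration

# 20 interarrival times per path is far more than N(tau) will reach here
gaps = rng.exponential(1 / lam, size=(200_000, 20))
arrivals = np.cumsum(gaps, axis=1)
counts = (arrivals <= tau).sum(axis=1)    # N(tau) for each simulated path

print("empirical Pr{N(tau)=k} :", np.mean(counts == k))
print("pmf e^{-lt}(lt)^k / k! :", exp(-lam * tau) * (lam * tau) ** k / factorial(k))
```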

Evaluating the integral is just routine work (but by no means allowing any shortcut without actual calculation).
\begin{align} \iint_{\Omega} t \cdot f_{S_2,T_3}(s,t) \,\mathrm{d}t \,\mathrm{d}s &= \int_{s = 0}^{ \tau } \int_{t = \tau - s}^{\infty} t \cdot f_{S_2}(s) \cdot f_{T_3}(t)\,\mathrm{d}t \,\mathrm{d}s \\ &= \int_{s = 0}^{ \tau } \int_{t = \tau - s}^{\infty} t \cdot \frac{ \lambda }{ \Gamma(2) } (\lambda s)^{2-1} e^{-\lambda s} \cdot \lambda e^{-\lambda t} \,\mathrm{d}t \,\mathrm{d}s \\ &= \int_{s = 0}^{ \tau } \frac{ \lambda }{ \Gamma(2) } \lambda s e^{-\lambda s} \left[ \int_{t = \tau - s}^{\infty} t \cdot \lambda e^{-\lambda t} \,\mathrm{d}t \right] \,\mathrm{d}s \\ &= \int_{s = 0}^{ \tau } \frac{ \lambda }{ \Gamma(2) } \lambda s e^{-\lambda s} \left( \tau - s + \frac1{ \lambda} \right) e^{-\lambda (\tau - s)} \,\mathrm{d}s \\ &= e^{ -\lambda \tau } \int_{s = 0}^{ \tau } \frac{ \lambda }{ \Gamma(2) } \lambda s \left( \Bigl(\tau + \frac1{ \lambda} \Bigr) - s\right)\,\mathrm{d}s \\ &= e^{ -\lambda \tau } \left( \Bigl(\tau + \frac1{ \lambda} \Bigr) \frac{ (\lambda \tau)^2 }2 - \frac1{\lambda} \frac{ (\lambda \tau)^3 }3 \right), \qquad \text{then with}~\tau = 1 \\ & = e^{-\lambda} \frac{ \lambda^2}2 \left( 1 + \frac1{ \lambda} - \frac23 \right) \end{align}

Putting things back together,
$$B = \frac{ \iint_{\Omega} t \cdot f_{S_2,T_3}(s,t) \,\mathrm{d}t \,\mathrm{d}s }{ \Pr\{ N(\tau) = 2 \} } = \frac{ e^{-\lambda} \frac{ \lambda^2}2 \left( 1 + \frac1{ \lambda} - \frac23 \right) }{ e^{-\lambda} \frac{ \lambda^2}2 } = 1 + \frac1{ \lambda} - \frac23$$
$$E(S_4 \mid N(1)=2) = A + B + C = \frac23 + \left( 1 + \frac1{ \lambda} - \frac23 \right) + \frac1{ \lambda} = 1 + \frac2{ \lambda}$$

So you see, the decomposition into $S_2 + T_3 + T_4$ is not a "natural cutting". Conditioning on $N(1) = 2$, we have a $T_3$ that really should be split into two parts (as remarked by Julien Berestycki in his answer): one part goes together nicely with $S_2$ and the other with $T_4$.
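
The same numerator and the final value of $B$ can be reproduced symbolically, e.g. with sympy; the sketch below assumes the same densities as above (Gamma with shape $2$ and rate $\lambda$ for $S_2$, exponential with rate $\lambda$ for $T_3$):

```python
import sympy as sp

lam, s, t, tau = sp.symbols("lambda s t tau", positive=True)

# joint density f_{S_2}(s) * f_{T_3}(t): Gamma(shape 2, rate lambda) times Exponential(lambda)
f_joint = lam**2 * s * sp.exp(-lam * s) * lam * sp.exp(-lam * t)

# numerator: integral of t * f_joint over the region 0 < s < tau, t > tau - s
numerator = sp.integrate(
    sp.integrate(t * f_joint, (t, tau - s, sp.oo)),
    (s, 0, tau),
)
denominator = sp.exp(-lam * tau) * (lam * tau) ** 2 / 2   # Pr{N(tau) = 2}

B = sp.simplify(numerator / denominator).subs(tau, 1)
print(sp.simplify(B))                                     # expect 1/3 + 1/lambda
print(sp.simplify(sp.Rational(2, 3) + B + 1 / lam))       # A + B + C = 1 + 2/lambda
```

This reproduces $B = \frac13 + \frac1\lambda$ and the total $1 + \frac2\lambda$, consistent with the restart argument.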
