Defining unordered arrival times in a Poisson process.

Tags: poisson-distribution, poisson-process, probability, probability-theory

I am looking to precisely define the 'unordered' arrival times in a Poisson process.

Say I have a one-dimensional, unit-rate Poisson process $(K_t)_{t\geq 0}$ and let the ordered arrival times be $(T_i)_{i\geq 1}$. Fix a time $t$ and suppose $K_t=k$. Further, let $(U_i)_{i=1}^{k}$ be independent and uniformly distributed on $[0,t]$, and let $\sigma$ be the permutation such that, with $(\sigma_1,\ldots,\sigma_k)=\sigma(1,\ldots,k)$, we have
$$
U_{\sigma_1}<U_{\sigma_2}<\ldots < U_{\sigma_k}.
$$
It is a standard result (see, for example, Conditional law of the arrival times of a Poisson process) that the random vectors $(T_i)_{i=1}^{k}$ and $(U_{\sigma_i})_{i=1}^{k}$ are equal in distribution. I've heard it phrased as 'the unordered arrival times are equal in distribution to $(U_i)_{i=1}^{k}$', but how is this second statement made precise?

My attempt at a definition for unordered arrival time:

Let $\rho$ be a random permutation distributed uniformly over the $k!$ available permutations of $\{1,\ldots, k\}$. Then we define the unordered arrival times as $X_i:=T_{\rho_i}$.

Then we'd like to show that
$$
P(X_1\in dx_1,\ldots , X_k\in dx_k)=\prod_{i=1}^{k}\frac{dx_i}{t}.
$$
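As a sanity check on this display, here is a Monte Carlo sketch (the helper name `unordered_arrival_times` is mine, not from the question): conditional on $K_t=k$, the ordered arrival times are distributed as the order statistics of $k$ i.i.d. uniforms on $[0,t]$, so we can sample them that way, apply an independent uniform permutation $\rho$, and check that a coordinate of the result looks marginally uniform.

```python
import random

def unordered_arrival_times(t, k, rng):
    # Conditional on K_t = k, the ordered arrival times (T_1, ..., T_k)
    # are distributed as sorted i.i.d. Uniform[0, t] draws.
    T = sorted(rng.uniform(0.0, t) for _ in range(k))
    # Apply an independent uniform random permutation rho: X_i := T_{rho_i}.
    rho = list(range(k))
    rng.shuffle(rho)
    return [T[j] for j in rho]

rng = random.Random(0)
t, k, n = 2.0, 5, 20000
samples = [unordered_arrival_times(t, k, rng) for _ in range(n)]

# If the claimed density holds, X_1 is Uniform[0, t], so its mean should be
# near t/2 = 1.0; an *ordered* coordinate such as T_1 would instead have
# mean t/(k+1) ≈ 0.33.
mean_X1 = sum(s[0] for s in samples) / n
print(mean_X1)
```

Of course this only probes the first marginal; the claim in the display is about the joint density, which the proof below addresses.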

I feel that the key is this: if we fix a realization of $(U_i)_{i=1}^{k}$, which then determines $\sigma$ (by ordering the $(U_i)_{i=1}^{k}$), we should choose $\rho$ to be the inverse of $\sigma$. Schematically,
$$
(U_i)_{i=1}^{k}\xrightarrow{\sigma}(T_i)_{i=1}^{k} \mbox{ hence } (T_i)_{i=1}^{k}\xrightarrow{\rho=\sigma^{-1}}(U_i)_{i=1}^{k}
$$
where the arrow means the permutation indicated above the arrow is applied, which results in a random vector equal in distribution to the right hand side of the arrow.
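The schematic above can be checked pathwise (variable names are mine): sorting a realization $(U_1,\ldots,U_k)$ yields the permutation $\sigma$ with $U_{\sigma_1}<\ldots<U_{\sigma_k}$, and applying $\rho=\sigma^{-1}$ to the sorted vector recovers the original unordered realization exactly, not merely in distribution.

```python
import random

rng = random.Random(1)
k, t = 6, 1.0
U = [rng.uniform(0.0, t) for _ in range(k)]

# sigma: the indices that sort U, i.e. U[sigma[0]] < U[sigma[1]] < ...
sigma = sorted(range(k), key=lambda i: U[i])
T = [U[sigma[i]] for i in range(k)]   # plays the role of (T_1, ..., T_k)

# rho = sigma^{-1}: defined by rho[sigma[i]] = i
rho = [0] * k
for i, s in enumerate(sigma):
    rho[s] = i

# T[rho[i]] = U[sigma[sigma^{-1}(i)]] = U[i], so we recover U exactly.
recovered = [T[rho[i]] for i in range(k)]
print(recovered == U)  # → True
```

This identity is deterministic; the probabilistic content of the question is that replacing this data-dependent $\rho$ by an independent uniform permutation gives the same joint law.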

This seems equivalent to: if $f$ is a random function, and $X$ and $Y$ are $k$ dimensional random variables, does
$$
P(f(X)\in A)=P(Y\in A) \implies P(Y\in f^{-1}(A))=P(X\in A)
$$
hold?

Best Answer

I think this suffices. First note that, for a permutation $r$ of $\{1,\ldots, k\}$ and a collection of random variables $(Y_i)_{i=1}^{k}$,
$$
P(Y_{r(i)}\in dy_i,\, i=1,\ldots,k)=P(Y_i\in dy_{r(i)},\, i=1,\ldots,k).
$$
Now to the question, using $X_i$ as defined above. Let the $dx_i$ all be infinitesimal elements of $[0,t]$. Then, conditioning on the random permutation $\rho$,
$$
P(X_i\in dx_i,\, i=1,\ldots,k)=\frac{1}{k!}\sum_{r}P(T_{r(i)}\in dx_i,\, i=1,\ldots,k)
=\frac{1}{k!}\sum_{r}P(T_{i}\in dx_{r(i)},\, i=1,\ldots,k),
$$
where the sums run over all permutations $r$ of $\{1,\ldots, k\}$. Since the $T_i$ are ordered, $P(T_{i}\in dx_{r(i)},\, i=1,\ldots,k)=0$ unless $r=r^*$ is the unique permutation such that
$$
x_{r^*(1)}<x_{r^*(2)}<\ldots<x_{r^*(k)}.
$$
Hence
$$
P(X_i\in dx_i,\, i=1,\ldots,k)=\frac{1}{k!}P(T_{i}\in dx_{r^*(i)},\, i=1,\ldots,k)
=\frac{1}{k!}P(U_{\sigma_i}\in dx_{r^*(i)},\, i=1,\ldots,k)=\frac{\prod_{i=1}^{k}dx_i}{t^k}.
$$
In the last step we used that the ordered uniform random variables have joint density $k!/t^k$ on the set $\{x_1<\ldots<x_k\}$.
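The conclusion of the argument can also be probed numerically with a *joint* statistic (a sketch with names of my choosing): build $X_i=T_{\rho_i}$ as in the question and compare against i.i.d. Uniform$[0,t]$, for which $E[X_1X_2]=(t/2)^2$ and $P(X_1<t/2,\,X_2<t/2)=1/4$; the ordered coordinates $T_1,T_2$ would fail both checks.

```python
import random

rng = random.Random(42)
t, k, n = 1.0, 4, 40000

prod_sum = 0.0
quadrant = 0
for _ in range(n):
    # Ordered arrival times given K_t = k: sorted i.i.d. Uniform[0, t].
    T = sorted(rng.uniform(0.0, t) for _ in range(k))
    rho = list(range(k))
    rng.shuffle(rho)               # independent uniform permutation rho
    X = [T[j] for j in rho]        # X_i = T_{rho_i}
    prod_sum += X[0] * X[1]
    quadrant += (X[0] < t / 2) and (X[1] < t / 2)

mean_prod = prod_sum / n       # should be near (t/2)^2 = 0.25
quadrant_freq = quadrant / n   # should be near 1/4
print(mean_prod, quadrant_freq)
```

Both statistics depend on the joint law of $(X_1, X_2)$, so agreement here is consistent with the product-form density $\prod_i dx_i/t^k$ derived above, beyond mere marginal uniformity.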
