$X_1, X_2, \dots, X_n \sim Exp(\lambda)$: what is the joint distribution of $X_1, X_1+X_2, \dots, X_1+X_2+\dots+X_n$, and is it the joint distribution of an ordered uniform sample?

Tags: exponential distribution, probability distributions, probability theory, statistics, uniform distribution

To elaborate on the title, here is the entire problem:

Let $X_1, X_2, \dots, X_n \thicksim Exp(\lambda)$ be an i.i.d. sample.

What is the joint distribution of $X_1,\ X_1 + X_2,\ \dots,\ X_1 + X_2 + \dots + X_{n-1}$, conditioned on $X_1 + X_2 + \dots + X_n = t$?

And is this joint distribution the same as that of an ordered sample of $n-1$ i.i.d. $Unif[0,t]$ variables? That is:

If $Y_1, Y_2, \dots, Y_{n-1} \thicksim Unif[0,t]$ is an i.i.d. sample and $Y_1^* \le Y_2^* \le \dots \le Y_{n-1}^*$ denotes its order statistics, does the following hold:

$$F_{X}(x_1,\dots,x_{n-1}) = \Bbb{P}(X_1 < x_1,\ X_1 + X_2 < x_2,\ \dots,\ X_1 + X_2 + \dots + X_{n-1} < x_{n-1} \mid X_1 + X_2 + \dots + X_n = t) \stackrel{?}{=} \Bbb{P}(Y_1^* < x_1,\ Y_2^* < x_2,\ \dots,\ Y_{n-1}^* < x_{n-1}) = F_{Y^*}(x_1,\dots,x_{n-1})$$

where $F_X$ is the conditional joint distribution function of $X_1,\ X_1 + X_2,\ \dots,\ X_1 + X_2 + \dots + X_{n-1}$ given $\sum_{i=1}^n X_i = t$, and $F_{Y^*}$ is the joint distribution function of the order statistics $Y_1^*, Y_2^*, \dots, Y_{n-1}^*$.

If so, prove it; if not, disprove it.

The problem is that $X_1,\ X_1 + X_2,\ \dots,\ X_1 + X_2 + \dots + X_n$ are not independent, so computing their joint distribution function is hard, especially under the conditioning.

The order statistics individually follow (scaled) Beta distributions, which are generally tough to work with directly. For $n-1$ i.i.d. $Unif[0,t]$ variables,

$$\forall k \in \{1,\dots,n-1\}: \quad \frac{Y_k^*}{t} \thicksim Beta(k,\, n-k)$$
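As a quick sanity check on this marginal law (a standard fact: the $k$-th order statistic of $m$ i.i.d. $Unif[0,1]$ variables is $Beta(k, m-k+1)$, so with $m = n-1$ and scaling by $t$ we get $Beta(k, n-k)$, whose mean is $k/n$), here is an illustrative Monte Carlo snippet; the values of `n`, `t`, `k` are arbitrary choices for the demo:

```python
import random

# Monte Carlo check (illustrative): for m = n-1 i.i.d. Unif[0, t] samples,
# the k-th order statistic Y_k^* scaled by 1/t follows Beta(k, n-k),
# whose mean is k / n.
random.seed(0)
n, t, k = 5, 2.0, 2          # m = n - 1 = 4 uniforms; check the 2nd order statistic
trials = 200_000

acc = 0.0
for _ in range(trials):
    ys = sorted(random.uniform(0.0, t) for _ in range(n - 1))
    acc += ys[k - 1] / t     # scale the k-th order statistic to [0, 1]

empirical_mean = acc / trials
beta_mean = k / n            # mean of Beta(k, n-k) is k / n = 0.4 here
print(abs(empirical_mean - beta_mean) < 0.01)  # prints True
```

This only checks the mean of one marginal, of course; it says nothing about the joint law, which is the actual question.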

Here is what I've tried so far:

1. Introduce new variables:
$$A_1 = X_1 \\
A_2 = X_1 + X_2 \\
\vdots \\
A_n = X_1 + X_2 + \dots + X_n$$

This way, we can write up the $X$'s like so:
$$X_1 = A_1 \\
X_2 = A_2 - A_1 \\
X_3 = A_3 - A_2 \\
\vdots \\
X_n = A_n - A_{n-1}$$

We can also identify the marginal distributions of the $A_k$: each is a sum of $k$ i.i.d. $Exp(\lambda)$ variables, hence Gamma (Erlang) distributed, with mean $k/\lambda$:

$$\forall k \in \{1,\dots,n\}: \quad A_k \thicksim \Gamma(k, \lambda)$$

But this didn't get me much further: the $A_k$ are not independent, so I still can't write down their joint distribution function directly.

2. I tried thinking outside the box: $X_1 + X_2 + \dots + X_k$ can be read as the $k$-th arrival time of a process with exponential inter-arrival times (a Poisson process), and such arrival times are "expected to be spread uniformly". However, this heuristic about expected values says very little about the joint distribution, and it would not make a mathematically rigorous proof.
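As a sanity check on the change of variables in step 1, the following illustrative simulation confirms that the partial sum $A_k = X_1 + \dots + X_k$ behaves like a $\Gamma(k, \lambda)$ (Erlang) variable rather than an exponential one: both candidates have mean $k/\lambda$, but their variances differ ($k/\lambda^2$ for the Gamma vs. $k^2/\lambda^2$ for $Exp(\lambda/k)$). The parameter values are arbitrary demo choices:

```python
import random

# Illustrative check: A_k = X_1 + ... + X_k with X_i ~ Exp(lambda) i.i.d.
# is Gamma(k, lambda) (Erlang). Both Gamma(k, lambda) and Exp(lambda/k)
# have mean k/lambda, so we also compare variances to tell them apart.
random.seed(1)
lam, k, trials = 1.5, 3, 200_000

samples = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials

print(abs(mean - k / lam) < 0.02)      # Gamma mean k/lambda = 2.0 -> True
print(abs(var - k / lam ** 2) < 0.05)  # Gamma variance k/lambda^2 ~= 1.33 -> True
```

The empirical variance lands near $k/\lambda^2 \approx 1.33$, far from the $k^2/\lambda^2 = 4$ that $Exp(\lambda/k)$ would give.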

Can anyone lead me on the correct path?

Best Answer

Let $S_i = X_1 + \ldots + X_i$. The joint cdf of the partial sums is

$$F_{(S_1, \ldots, S_n)}(x_1, \ldots, x_n) = \int_{-\infty}^{x_1} f_{X_1}(\tau_1) \int_{-\infty}^{x_2 - \tau_1} f_{X_2}(\tau_2) \cdots \int_{-\infty}^{x_n - \tau_1 - \ldots - \tau_{n-1}} f_{X_n}(\tau_n) \, d\tau_n \cdots d\tau_1.$$

Differentiating once in each variable gives the joint pdf:

$$\frac{\partial^n}{\partial x_n \cdots \partial x_1} F_{(S_1, \ldots, S_n)}(x_1, \ldots, x_n) = f_{X_1}(x_1) \frac{\partial^{n-1}}{\partial x_n \cdots \partial x_2} \int_{-\infty}^{x_2 - x_1} f_{X_2}(\tau_2) \cdots \int_{-\infty}^{x_n - x_1 - \tau_2 - \ldots - \tau_{n-1}} f_{X_n}(\tau_n) \, d\tau_n \cdots d\tau_2 = \ldots = f_{X_1}(x_1)\, f_{X_2}(x_2 - x_1) \cdots f_{X_n}(x_n - x_{n-1}).$$

For $f_{X_i}(x) = \lambda e^{-\lambda x}\, [0 < x]$, this gives

$$f_{(S_1, \ldots, S_n)}(x_1, \ldots, x_n) = \lambda^n e^{-\lambda x_n}\, [0 < x_1 < \ldots < x_n].$$

Next, inverting the Laplace transform of the $n$-fold convolution (using the shift property and $\mathcal L^{-1}[p^{-n}] = x^{n-1}/(n-1)!$),

$$f_{S_n}(x) = \mathcal L^{-1}\!\left[ \left( \frac{\lambda}{p + \lambda} \right)^{\!n} \right] = \lambda^n e^{-\lambda x}\, \mathcal L^{-1}[p^{-n}] = \frac{\lambda^n}{(n-1)!}\, x^{n-1} e^{-\lambda x} \, [0 < x],$$

so

$$f_{(S_1, \ldots, S_{n-1}) \mid S_n = t}(x_1, \ldots, x_{n-1}) = \frac{f_{(S_1, \ldots, S_n)}(x_1, \ldots, x_{n-1}, t)}{f_{S_n}(t)} = \frac{(n-1)!}{t^{n-1}} \, [0 < x_1 < \ldots < x_{n-1} < t],$$

which is exactly the pdf of the order statistics $(Y_1^*, \ldots, Y_{n-1}^*)$ of $n-1$ i.i.d. $Unif[0,t]$ variables. So the answer is yes: the two joint distributions coincide.
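The conclusion can also be sanity-checked numerically. Rather than conditioning on the zero-probability event $S_n = t$, the sketch below uses the equivalent (standard) scaling fact that $(S_1/S_n, \ldots, S_{n-1}/S_n)$ is distributed like the order statistics of $n-1$ i.i.d. $Unif[0,1]$ variables, and compares the empirical coordinate means ($k/n$ for the $k$-th ordered uniform) between the two constructions. Matching means is only a plausibility check, not a proof; the parameters are arbitrary demo values:

```python
import random

# Illustrative Monte Carlo check: conditionally on S_n = t, (S_1, ..., S_{n-1})
# is uniform on the simplex {0 < x_1 < ... < x_{n-1} < t}, i.e. distributed
# like ordered Unif[0, t] draws. We use the scaled version (t = 1) via S_i/S_n.
random.seed(2)
n, lam, trials = 5, 2.0, 100_000

def scaled_partial_sums():
    xs = [random.expovariate(lam) for _ in range(n)]
    total = sum(xs)
    s, out = 0.0, []
    for x in xs[:-1]:
        s += x
        out.append(s / total)     # S_k / S_n for k = 1, ..., n-1
    return out

def uniform_order_stats():
    return sorted(random.random() for _ in range(n - 1))

means_s = [0.0] * (n - 1)
means_y = [0.0] * (n - 1)
for _ in range(trials):
    for i, (a, b) in enumerate(zip(scaled_partial_sums(), uniform_order_stats())):
        means_s[i] += a / trials
        means_y[i] += b / trials

# The k-th ordered Unif[0,1] has mean k/n; both constructions should match it.
ok = all(abs(ms - my) < 0.01 and abs(ms - (i + 1) / n) < 0.01
         for i, (ms, my) in enumerate(zip(means_s, means_y)))
print(ok)  # prints True
```

Comparing full empirical cdfs (e.g. per coordinate) would be a stronger check, but the derivation above already settles the question exactly.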
