Sum of Exponentials – Understanding the Distribution

convolution, distributions, exponential distribution, self-study

Let $X_1$ and $X_2$ be independent and identically distributed exponential random variables with rate $\lambda$. Let $S_2 = X_1 + X_2$.

Q: Show that $S_2$ has PDF $f_{S_2}(x) = \lambda^2 x \text{e}^{-\lambda x},\, x\ge 0$.

Note that if events occurred according to a Poisson Process (PP) with rate $\lambda$, $S_2$ would represent the time of the 2nd event.

Alternative approaches are appreciated. The approaches provided below are commonly used when learning queueing theory and stochastic processes.


Recall that the Exponential distribution is a special case of the Gamma distribution (with shape parameter $1$). There is a more general version of this result, for the sum of $n$ i.i.d. exponentials, that can be applied.
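For reference, the general result is that the sum of $n$ i.i.d. Exponential($\lambda$) random variables is Erlang$(n,\lambda)$ distributed (a Gamma distribution with integer shape $n$ and rate $\lambda$):

$$S_n = \sum_{i=1}^{n} X_i \sim \text{Erlang}(n,\lambda), \qquad f_{S_n}(x) = \frac{\lambda^n x^{n-1}\text{e}^{-\lambda x}}{(n-1)!}, \quad x\ge 0,$$

of which the case $n=2$ is exactly the density to be shown.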

Best Answer

Conditioning Approach
Condition on the value of $X_1$. Start with the cumulative distribution function (CDF) for $S_2$.

$\begin{align} F_{S_2}(x) &= P(S_2\le x) \\ &= P(X_1 + X_2 \le x) \\ &= \int_0^\infty P(X_1+X_2\le x|X_1=x_1)f_{X_1}(x_1)dx_1 \\ &= \int_0^x P(X_1+X_2\le x|X_1=x_1)\lambda \text{e}^{-\lambda x_1}dx_1 \\ &= \int_0^x P(X_2 \le x - x_1)\lambda \text{e}^{-\lambda x_1}dx_1 \\ &= \int_0^x\left(1-\text{e}^{-\lambda(x-x_1)}\right)\lambda \text{e}^{-\lambda x_1}dx_1\\ &=(1-e^{-\lambda x}) - \lambda x e^{-\lambda x}\end{align} $
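In the last step the integrand splits as $\lambda \text{e}^{-\lambda x_1} - \lambda \text{e}^{-\lambda x}$, where the second term does not depend on $x_1$:

$$\int_0^x \lambda \text{e}^{-\lambda x_1}dx_1 - \int_0^x \lambda \text{e}^{-\lambda x}dx_1 = \left(1-\text{e}^{-\lambda x}\right) - \lambda x \text{e}^{-\lambda x}$$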

This is the CDF of the distribution. To get the PDF, differentiate with respect to $x$.
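Carrying out the differentiation term by term,

$$\frac{d}{dx}\left[\left(1-\text{e}^{-\lambda x}\right) - \lambda x \text{e}^{-\lambda x}\right] = \lambda \text{e}^{-\lambda x} - \lambda \text{e}^{-\lambda x} + \lambda^2 x \text{e}^{-\lambda x},$$

so the first two terms cancel and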

$$f_{S_2}(x) = \lambda^2 x \text{e}^{-\lambda x} \quad\square$$

This is an Erlang$(2,\lambda)$ distribution, i.e. a Gamma distribution with shape $2$ and rate $\lambda$.
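As a quick numerical sanity check (not part of the proof), here is a minimal simulation sketch using NumPy and SciPy; the rate $\lambda = 1.5$, the seed, and the sample size are arbitrary choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
lam = 1.5          # arbitrary rate for the check
n = 1_000_000      # number of simulated sums

# Simulate S_2 = X_1 + X_2 with X_i ~ Exponential(rate = lam)
x1 = rng.exponential(scale=1 / lam, size=n)
x2 = rng.exponential(scale=1 / lam, size=n)
s2 = x1 + x2

# Compare the empirical density to lambda^2 * x * exp(-lambda * x),
# the Erlang(2, lam) = Gamma(shape=2, rate=lam) density
edges = np.linspace(0, 6 / lam, 50)
hist, _ = np.histogram(s2, bins=edges, density=True)
mid = (edges[:-1] + edges[1:]) / 2
claimed = lam**2 * mid * np.exp(-lam * mid)

print(np.max(np.abs(hist - claimed)))  # small, up to Monte Carlo error
print(stats.kstest(s2, stats.gamma(a=2, scale=1 / lam).cdf).statistic)  # close to 0
```

The histogram values should agree with $\lambda^2 x \text{e}^{-\lambda x}$, and the Kolmogorov–Smirnov statistic against the Erlang$(2,\lambda)$ CDF should be close to $0$, up to Monte Carlo error.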


General Approach
Direct integration relying on the independence of $X_1$ & $X_2$. Again, start with the cumulative distribution function (CDF) for $S_2$.

$\begin{align} F_{S_2}(x) &= P(S_2\le x) \\ &= P(X_1 + X_2 \le x) \\ &= P\left( (X_1,X_2)\in A \right) \quad \quad \text{(See figure below)}\\ &= \int\int_{(x_1,x_2)\in A} f_{X_1,X_2}(x_1,x_2)dx_1 dx_2 \\ &(\text{Joint distribution is the product of marginals by independence}) \\ &= \int_0^{x} \int_0^{x-x_{2}} f_{X_1}(x_1)f_{X_2}(x_2)dx_1 dx_2\\ &= \int_0^{x} \int_0^{x-x_{2}} \lambda \text{e}^{-\lambda x_1}\lambda \text{e}^{-\lambda x_2}dx_1 dx_2\\ \end{align}$
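Evaluating the inner integral and then the outer one recovers the same CDF as in the conditioning approach:

$\begin{align} \int_0^{x-x_2}\lambda \text{e}^{-\lambda x_1}dx_1 &= 1-\text{e}^{-\lambda(x-x_2)} \\ F_{S_2}(x) &= \int_0^x \left(1-\text{e}^{-\lambda(x-x_2)}\right)\lambda \text{e}^{-\lambda x_2}dx_2 \\ &= \int_0^x \lambda \text{e}^{-\lambda x_2}dx_2 - \int_0^x \lambda \text{e}^{-\lambda x}dx_2 \\ &= \left(1-\text{e}^{-\lambda x}\right) - \lambda x \text{e}^{-\lambda x} \end{align}$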

Since this is the CDF, differentiating with respect to $x$ again gives the PDF, $f_{S_2}(x) = \lambda^2 x \text{e}^{-\lambda x} \quad\square$

[Figure: the integration region $A=\{(x_1,x_2): x_1\ge 0,\ x_2\ge 0,\ x_1+x_2\le x\}$ in the $(x_1,x_2)$ plane.]


MGF Approach
This approach uses the moment generating function (MGF).

$\begin{align} M_{S_2}(t) &= \text{E}\left[\text{e}^{t S_2}\right] \\ &= \text{E}\left[\text{e}^{t(X_1 + X_2)}\right] \\ &= \text{E}\left[\text{e}^{t X_1 + t X_2}\right] \\ &= \text{E}\left[\text{e}^{t X_1} \text{e}^{t X_2}\right] \\ &= \text{E}\left[\text{e}^{t X_1}\right]\text{E}\left[\text{e}^{t X_2}\right] \quad \text{(by independence)} \\ &= M_{X_1}(t)M_{X_2}(t) \\ &= \left(\frac{\lambda}{\lambda-t}\right)\left(\frac{\lambda}{\lambda-t}\right) \quad \quad t<\lambda\\ &= \frac{\lambda^2}{(\lambda-t)^2} \quad \quad t<\lambda \end{align}$

While this does not directly yield the PDF, the MGF (where it exists) uniquely determines the distribution, so once the MGF matches that of a known distribution, the PDF is also known. Here $\frac{\lambda^2}{(\lambda-t)^2}$ is the MGF of the Erlang$(2,\lambda)$ distribution, whose PDF is $\lambda^2 x \text{e}^{-\lambda x}$. $\quad\square$
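As an optional symbolic check (not part of the proof), here is a short SymPy sketch verifying that the claimed density $\lambda^2 x \text{e}^{-\lambda x}$ has MGF $\lambda^2/(\lambda-t)^2$; the substitution $u = \lambda - t > 0$ encodes the restriction $t < \lambda$:

```python
import sympy as sp

# u stands for lam - t > 0, i.e. the region t < lam where the MGF exists
x, lam, u = sp.symbols('x lam u', positive=True)

# Claimed PDF of S_2: lambda^2 * x * exp(-lambda * x) on x >= 0
pdf = lam**2 * x * sp.exp(-lam * x)

# E[e^{t S_2}] with t = lam - u; powsimp combines the exponentials into exp(-u*x)
integrand = sp.powsimp(pdf * sp.exp((lam - u) * x))
mgf = sp.integrate(integrand, (x, 0, sp.oo))

print(sp.simplify(mgf))  # prints lam**2/u**2, i.e. lambda^2 / (lambda - t)^2
```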
