Cumulative Distribution Function: Sum of two independent exponentially distributed random variables

convolution, exponential distribution, probability, probability distributions

Given $X \sim \exp(\lambda)$ and $Y\sim\exp(\mu)$ with $\lambda \neq \mu$ I want to derive the cumulative distribution function of $Z := X + Y$.

My text does not use the convolution of the probability density functions (in fact, computing the CDF is a means to get to the PDF of $Z$).

It says that for the CDF $F_Z$ the following holds:
$$
F_Z(a) = P(Z < a) \stackrel{(*)}{=} \int_0^a P(Y < a - x)\, f_X(x)\,dx,
$$
where $f_X(x) = \lambda e^{-\lambda x}$ is the probability density function of $X$.
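
Numerically the identity seems to check out; here is a quick sketch in Python comparing the right-hand side of $(*)$ against a Monte Carlo estimate of $\mathbb{P}[Z < a]$ (the rates $\lambda = 2$, $\mu = 3$ and the point $a = 1.5$ are arbitrary choices of mine, not from the text):

```python
import numpy as np
from scipy import integrate, stats

# Arbitrary rates and evaluation point (not from the text), with lambda != mu
lam, mu, a = 2.0, 3.0, 1.5

# Right-hand side of (*): integrate P(Y < a - x) * f_X(x) over x in [0, a]
rhs, _ = integrate.quad(
    lambda x: stats.expon.cdf(a - x, scale=1 / mu) * stats.expon.pdf(x, scale=1 / lam),
    0, a,
)

# Monte Carlo estimate of P(Z < a) = P(X + Y < a)
rng = np.random.default_rng(0)
x = rng.exponential(scale=1 / lam, size=1_000_000)
y = rng.exponential(scale=1 / mu, size=1_000_000)
mc = (x + y < a).mean()

print(rhs, mc)  # both come out around 0.873
```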

My difficulty is with the step $(*)$. I found a related question for the discrete case, which I can follow, but here it is particularly unclear where this product of a probability and the PDF of $X$ comes from.

Any insight and/or intermediate steps would be appreciated.

Best Answer

You can think of the continuous $f_X(x)$ in terms of its discrete analog: in the discrete world, $f_X(x) = \mathbb{P}[X=x]$, while in the continuous world $$ f_X(x) = \lim_{\epsilon \to 0} \frac{\mathbb{P}[|X-x|<\epsilon]}{2\epsilon}, $$ or, if you like, $$ \mathbb{P}[|X-x|<\epsilon] = \int_{x-\epsilon}^{x+\epsilon} dF_X(u) \approx 2\epsilon\, f_X(x) \quad\text{for small } \epsilon, $$ so $(*)$ holds with the same intuition as its discrete analog.
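
To spell out the continuous counterpart (a sketch, using only that $X$ and $Y$ are independent, non-negative, and that $a > 0$): condition on the value of $X$ and integrate against its density,

$$ F_Z(a) = \mathbb{P}[X+Y<a] = \int_0^\infty \mathbb{P}[Y < a - x \mid X = x]\, f_X(x)\,dx = \int_0^a \mathbb{P}[Y < a - x]\, f_X(x)\,dx, $$

where the conditioning drops out by independence and the upper limit becomes $a$ because $\mathbb{P}[Y < a - x] = 0$ once $x \ge a$.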

REMARK

That said, in the continuous world you cannot literally write $\mathbb{P}[X=x] = f_X(x)$, since the left-hand side is identically zero whenever $X$ is continuous.

UPDATE

To answer your question in the comments: in the discrete world we would write $$ \begin{split} F_Z(a) &= \mathbb{P}[Z<a] = \mathbb{P}[X+Y<a]\\ &= \sum_{k=0}^\infty \mathbb{P}[\{X+Y<a\} \cap \{X=k\}]\\ &= \sum_{k=0}^\infty \mathbb{P}[\{Y<a-k\} \cap \{X=k\}]\\ &= \sum_{k=0}^\infty \mathbb{P}[Y<a-k]\,\mathbb{P}[X=k] \qquad\text{(by independence)}\\ &= \sum_{k=0}^\infty \mathbb{P}[Y<a-k]\, f_X(k), \end{split} $$ and, as we have seen, in the continuous world the sum becomes an integral.
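
For completeness, plugging the exponential laws into $(*)$ and evaluating the integral (assuming $a > 0$ and $\lambda \neq \mu$, as in the question) gives

$$ \begin{split} F_Z(a) &= \int_0^a \left(1 - e^{-\mu(a-x)}\right)\lambda e^{-\lambda x}\,dx\\ &= \left(1 - e^{-\lambda a}\right) - \lambda e^{-\mu a}\,\frac{e^{(\mu-\lambda)a} - 1}{\mu - \lambda}\\ &= 1 - \frac{\mu e^{-\lambda a} - \lambda e^{-\mu a}}{\mu - \lambda}, \end{split} $$

and differentiating in $a$ recovers the density $f_Z(a) = \dfrac{\lambda\mu}{\mu - \lambda}\left(e^{-\lambda a} - e^{-\mu a}\right)$.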
