Find Distribution of Sum of Random Variables given only Joint Distribution

bivariate-distributions, probability-distributions, statistics

Suppose we have two random variables $X, Y$ with unknown distributions, and define a third random variable $Z = X + Y$.

First, how do we find the CDF of $Z$, i.e. $F_Z(z)$, given the joint PDF of $X, Y$:

$$f_{XY}(x,y) \quad \text{for } x, y > 0 \, ?$$

Intuitively, as far as I can tell, the expression should (hopefully) be:

$$F_Z(z) = \int_{0}^{z} \int_{0}^{z-x} f_{XY}(x,y) \ dy \ dx $$

Except I wasn't certain about whether this is the case, so I wanted to prove it. Therefore my real question is how to show this expression is correct?
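One quick sanity check (though not a proof) is to simulate a concrete case and compare against the formula. Here I hypothetically take $X, Y$ to be independent $\mathrm{Exp}(1)$ variables, so $f_{XY}(x,y) = e^{-(x+y)}$ and the double integral above works out in closed form to $1 - e^{-z} - z e^{-z}$:

```python
# Monte Carlo check of the conjectured CDF formula for one assumed case:
# X, Y independent Exp(1) (a hypothetical choice, just for testing), where
# the double integral evaluates in closed form to 1 - e^{-z} - z*e^{-z}.
import math
import random

random.seed(0)
z = 1.5
n = 200_000

# Estimate P(X + Y <= z) by simulation.
hits = sum(1 for _ in range(n)
           if random.expovariate(1.0) + random.expovariate(1.0) <= z)
mc_estimate = hits / n

closed_form = 1.0 - math.exp(-z) - z * math.exp(-z)
assert abs(mc_estimate - closed_form) < 0.01  # within Monte Carlo noise
```

The numbers agree, which suggests the formula is right, but of course this is just one example and I'd like an actual proof.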

Perhaps something along the lines of:

\begin{align}
F_Z(z) & = P(Z \le z) \\
& = P(X + Y \le z) \\
& = \; ??? \\
& = \; ??? \\
& = P(Y \le z - x \ \text{ and } \ X \le z)
\end{align}

Any help would be great!

Best Answer

We have
$$\begin{align*} F_Z(z) &= P(X+Y \leq z) = E[\mathbf{1}_{X+Y \leq z}]\\ &= \int_{[0,\infty) \times [0,\infty)} \mathbf{1}_{x+y \leq z} \; f_{X,Y}(x,y) \, \mathrm{d}x \, \mathrm{d}y, \end{align*}$$
using the following property of the density: $E[g(X,Y)] = \int_{[0,\infty) \times [0,\infty)} g(x,y)\, f_{X,Y}(x,y) \, \mathrm{d}x \, \mathrm{d}y$.

Now notice that $\mathbf{1}_{x+y \leq z} = \mathbf{1}_{x \leq z} \, \mathbf{1}_{y \leq z - x}$; that is, $x+y \leq z$ if and only if $x \leq z$ and $y \leq z - x$ (the forward direction uses $y \geq 0$: if $x + y \leq z$ and $y \geq 0$, then $x \leq z$). So, using Fubini's theorem, we get
$$ \begin{align*} F_Z(z) = \int_{[0,\infty) \times [0,\infty)} \mathbf{1}_{x \leq z} \mathbf{1}_{y \leq z - x} \; f_{X,Y}(x,y) \, \mathrm{d}x \, \mathrm{d}y &= \int_{[0,\infty)} \mathbf{1}_{x \leq z} \left(\int_{[0,\infty)}\mathbf{1}_{y \leq z - x} \; f_{X,Y}(x,y) \, \mathrm{d}y \right)\mathrm{d}x \\ &= \int_0^z \left(\int_0^{z-x} f_{X,Y}(x,y) \, \mathrm{d}y \right)\mathrm{d}x, \end{align*}$$
which proves the result. Normally you would not need to go into this much detail; from the first equation above you could obtain the result directly.
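Not part of the proof, but as a numerical illustration: for a concrete joint density where the answer is known in closed form, the iterated integral above can be evaluated numerically and compared. Here I assume (hypothetically, not from the question) that $X, Y$ are independent $\mathrm{Exp}(1)$, so $f_{X,Y}(x,y) = e^{-(x+y)}$ and $Z \sim \mathrm{Gamma}(2,1)$ with CDF $1 - e^{-z}(1+z)$:

```python
# Numerical check of F_Z(z) = ∫_0^z ∫_0^{z-x} f_{XY}(x,y) dy dx.
# Assumed example (not from the question): X, Y independent Exp(1),
# so f_{XY}(x,y) = e^{-(x+y)} and Z ~ Gamma(2, 1).
import math
from scipy.integrate import dblquad

def cdf_z_numeric(z):
    # dblquad integrates over y (inner) and x (outer); note that it
    # calls the integrand with arguments in the order f(y, x).
    val, _err = dblquad(lambda y, x: math.exp(-(x + y)),
                        0.0, z,           # outer: x from 0 to z
                        lambda x: 0.0,    # inner: y from 0 ...
                        lambda x: z - x)  # ... to z - x
    return val

def cdf_z_exact(z):
    # Gamma(2, 1) CDF: 1 - e^{-z}(1 + z).
    return 1.0 - math.exp(-z) * (1.0 + z)

for z in (0.5, 1.0, 2.0, 5.0):
    assert abs(cdf_z_numeric(z) - cdf_z_exact(z)) < 1e-8
```

The agreement at several values of $z$ matches what the derivation predicts for this particular density.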
