Probability Theory – Convolution of Two Uniform Random Variables

convolution, probability, probability distributions, probability theory

We have $X \sim \mathrm{Unif}[0,2]$ and $Y \sim \mathrm{Unif}[3,4]$. The random variables $X,Y$ are independent. We define a random variable $Z = X + Y$ and want to find the PDF of $Z$ using convolution. Here is my work so far:

The definition of convolution is:

  • $f_Z(z) = \int_{-\infty}^{\infty}f_X(x)f_Y(z-x)\mathrm{d} x$

We know the PDFs of $X$ and $Y$ because they are just uniform distributions: $f_X \equiv \tfrac12$ on $[0,2]$ and $f_Y \equiv 1$ on $[3,4]$. The hard part for me is finding the limits of integration, i.e. working out the constraints on $x$.

The integrand is nonzero when $3 \leq z-x \leq 4$, i.e. $z-4 \leq x \leq z-3$, and when $0 \leq x \leq 2$. Together these constraints imply that $\max \{0, z-4\} \leq x \leq \min \{2, z-3 \}$.

These constraints imply that there are three cases (the integrand common to all three is written out just after the list):

  • Case 1 – $z \leq 4 \implies f_Z(z) = \int_0^{z-3} f_X(x)f_Y(z-x)\,\mathrm{d} x$
  • Case 2 – $4 \leq z \leq 5 \implies f_Z(z) = \int_{z-4}^{z-3} f_X(x)f_Y(z-x)\,\mathrm{d} x$
  • Case 3 – $z \geq 5 \implies f_Z(z) = \int_{z-4}^{2} f_X(x)f_Y(z-x)\,\mathrm{d} x$

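Since $f_X \equiv \tfrac12$ on $[0,2]$ and $f_Y \equiv 1$ on $[3,4]$, the integrand is constant wherever it is nonzero, so each case reduces to $\tfrac12$ times the length of the interval of integration:
$$f_X(x)f_Y(z-x) = \tfrac12 \cdot 1 = \tfrac12 \qquad \text{whenever } \max\{0,z-4\} \leq x \leq \min\{2,z-3\}.$$
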
My question is how to find the bounds of $Z$, i.e. what are the possible values of $Z$? Does $Z$ run from $0$ to $6$, since it is the sum $X+Y$ and that sum takes a value for every point in $[0,6]$?

Best Answer

Instead of trying to find appropriate $z$ values at the very beginning, note that $f_X(x)$ is zero unless $0\le x\le2$. Therefore $$\int_{-\infty}^{\infty}f_X(x)f_Y(z-x)\,\mathrm{d} x =\int_0^2f_X(x)f_Y(z-x)\,\mathrm{d} x =\frac12\int_0^2 f_Y(z-x)\,\mathrm{d} x\ .$$ You can now substitute $t=z-x$ to get $$\frac12\int_{z-2}^z f_Y(t)\,\mathrm{d} t\ ;$$ see if you can take it from here.
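
As a numerical sanity check on the derivation (not a substitute for finishing the integral), here is a minimal sketch that approximates the convolution integral on a grid; the grid ranges, step sizes, and the `1e-9` cutoff are arbitrary choices of mine, not part of the original problem.

```python
import numpy as np

# Densities from the problem statement: X ~ Unif[0, 2], Y ~ Unif[3, 4].
def f_X(x):
    return np.where((x >= 0) & (x <= 2), 0.5, 0.0)

def f_Y(y):
    return np.where((y >= 3) & (y <= 4), 1.0, 0.0)

# Approximate f_Z(z) = \int f_X(x) f_Y(z - x) dx with a Riemann sum on a grid.
x = np.linspace(-1.0, 3.0, 4001)        # covers the support of f_X with margin
dx = x[1] - x[0]
z_grid = np.linspace(0.0, 8.0, 801)     # covers every conceivable value of Z
f_Z = np.array([np.sum(f_X(x) * f_Y(zi - x)) * dx for zi in z_grid])

# Report where the approximate density is actually positive, and its total mass.
support = z_grid[f_Z > 1e-9]
print("f_Z is positive on roughly [%.2f, %.2f]" % (support.min(), support.max()))
print("total mass ~", np.sum(f_Z) * (z_grid[1] - z_grid[0]))   # should be close to 1
```

The printed interval should make it clear that the support of $Z$ is narrower than $[0,6]$, which addresses the question about the possible values of $Z$.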
