Solved – Find the joint distribution of two independent random variables

distributions, independence, joint-distribution, marginal-distribution

Let $X_{1}, X_{2},\ldots,X_{n}$ be iid continuous random variables with pdf $\theta x^{\theta-1}$ for $0<x<1$, and let $Y_{1}, Y_{2},\ldots,Y_{n}$ be iid discrete random variables with the power series distribution $p(Y=y)=\frac{\gamma(y)\theta^y}{c(\theta)}$ for $y=0,1,2,\ldots$. Assume the $X$'s and $Y$'s are independent.

I am trying to find the distribution of $Z_{i}=X_{i}+Y_{i}$.
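For a concrete sanity check, here is a minimal simulation sketch (the Poisson choice for the power series family and the parameter values are purely illustrative, not part of the problem). The $X_i$ can be drawn by inverse transform sampling, since $F_X(x)=x^\theta$ gives $X = U^{1/\theta}$ for uniform $U$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, lam, n = 4.0, 3.0, 100_000  # hypothetical parameter choices

# X has CDF F(x) = x^theta on (0, 1), so inverse transform gives X = U^(1/theta)
x = rng.uniform(size=n) ** (1.0 / theta)

# Poisson is one member of the power series family, used here only as an example
y = rng.poisson(lam, size=n)

z = x + y  # the sum whose distribution we want
```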

Since the $X$'s and $Y$'s are independent, I can write down the joint distribution $f(x,y)$. I can then use a transformation to find the joint distribution $f(z,y)$. Finally, I need to marginalize over $y$ to find the marginal density of $Z$.

My question is how to find the limits for $y$ in that marginalization, since I am dealing with a mix of continuous and discrete random variables.

Thanks in advance.

Best Answer

Find it directly--avoid the middleman!

Finding the distribution of $Z_i$

Because almost surely $0 \lt X_i \lt 1$ and $Y_i$ is one of the natural numbers $\{0,1,2,\ldots\},$ consider any real number $z \ge 0$ and write it as

$$z = y(z) + x(z)$$

where $$y(z) = \lfloor z \rfloor$$ is the greatest integer less than or equal to $z$ and $$x(z) = z - y(z)$$ is the fractional part left over. From these formulas we can reconstruct $X_i$ and $Y_i$ from $Z_i$ as

$$y(Z_i) = y(Y_i + X_i) = Y_i$$

and

$$x(Z_i) = x(Y_i + X_i) = Z_i - Y_i = X_i.$$
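Before proceeding, here is a quick numerical check of this reconstruction, continuing the simulation sketch from the question (again, the Poisson choice there is purely illustrative):

```python
y_rec = np.floor(z)   # y(Z) recovers Y exactly, because 0 < X < 1
x_rec = z - y_rec     # x(Z) recovers X
assert np.all(y_rec == y) and np.allclose(x_rec, x)
```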

Thus, because $Y_i$ and $X_i$ are independent,

$$\eqalign{F_{Z_i}(z) = \Pr(Z_i \le z) &= \Pr(Y_i \lt y(z)\text{ or } (Y_i = y(z) \text{ and } X_i \le x(z))) \\ &= \Pr(Y_i \lt y(z)) + \Pr(Y_i = y(z))\Pr(X_i\le x(z)) \\ &= F_{Y_i}(y(z)-1) + \Pr(Y_i=y(z)) x(z)^\theta. }$$

This is an effective formula for the distribution $F_{Z_i}$ of $Z_i,$ thereby answering the question. I will demonstrate its use by (a) computing its density and (b) integrating the density.
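Here is a sketch of that CDF formula in code, continuing the hypothetical Poisson example and comparing against the empirical CDF of the simulated `z` from the question:

```python
from scipy.stats import poisson

def F_Z(z_val, theta=4.0, lam=3.0):
    """CDF of Z = X + Y: F_Y(y(z) - 1) + Pr(Y = y(z)) * x(z)^theta."""
    if z_val < 0:
        return 0.0
    y_z = np.floor(z_val)
    x_z = z_val - y_z
    return poisson.cdf(y_z - 1, lam) + poisson.pmf(y_z, lam) * x_z ** theta

for z_val in (0.5, 2.25, 3.9, 7.0):
    print(z_val, F_Z(z_val), np.mean(z <= z_val))  # analytic vs. empirical
```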

Computing the density of $Z_i$

When $z$ is not an integer, $F_{Z_i}$ is a differentiable function of $z$: there $y$ is differentiable (it's locally constant, with derivative zero), and so, therefore, is $x$, with constant derivative $1$, because

$$\frac{d}{dz} x(z) = \frac{d}{dz}(z - y(z)) = 1 - 0 = 1.$$

Moreover, the sum $F_{Y_i}(y(z)-1) = \sum_{k=0}^{y(z)-1}\Pr(Y_i=k)$ does not change except when its upper endpoint $y(z)$ changes, which occurs only at the natural numbers. Still assuming $z$ is not a natural number, we compute the density of $Z_i$ simply by differentiating via the sum rule, product rule, and chain rule:

$$f_{Z_i}(z) = \frac{d}{dz}\Pr(Z_i \le z) = \theta x(z)^{\theta-1}\Pr(Y_i = y(z)).$$

We may arbitrarily define $f_{Z_i}$ at the natural numbers: give it any finite values you like there. And, since $Z_i\ge 0,$ $f_{Z_i}(z) = 0$ for all $z\lt 0.$ That completes the determination of the density.
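In code, the density is equally direct (a sketch continuing the hypothetical Poisson example above):

```python
def f_Z(z_val, theta=4.0, lam=3.0):
    """Density of Z for non-integer z; its values at the integers are arbitrary."""
    if z_val < 0:
        return 0.0
    y_z = np.floor(z_val)
    x_z = z_val - y_z
    return theta * x_z ** (theta - 1) * poisson.pmf(y_z, lam)
```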

Figure: Graph of the PDF

This figure depicts the graph of $f_{Z_i}$ where $Y_i$ has a Poisson$(3)$ distribution and $\theta = 4.$ The heights of the spikes in the graph are determined by the Poisson probabilities $\Pr(Y_i=y(z)),$ while the shapes of the graph between the spikes are given by the density of $X_i$ (as scaled by $\Pr(Y_i=y(z))$ and translated by $y(z)$).
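A sketch that reproduces this kind of figure from the `f_Z` above (matplotlib assumed available):

```python
import matplotlib.pyplot as plt

zs = np.linspace(0, 10, 2001)
plt.plot(zs, [f_Z(v) for v in zs])
plt.xlabel("$z$")
plt.ylabel("$f_{Z_i}(z)$")
plt.title(r"Density of $Z_i$ for Poisson(3) $Y_i$ and $\theta = 4$")
plt.show()
```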

Integrating the density

As a check, let's verify that $f_{Z_i}$ is normalized to unit probability by integrating it, which we may do by breaking the integral into a sum of areas over the intervals $[i, i+1)$ for $i=0, 1, 2, \ldots$ and substituting $z = i + t$ on each:

$$\eqalign{ \int_\mathbb{R} f_{Z_i}(z)\, dz &= \int_0^\infty \theta x(z)^{\theta-1}\Pr(Y_i = y(z))\, dz \\ &= \sum_{i=0}^\infty \int_{0}^{1} \theta\, x(i+t)^{\theta-1}\Pr(Y_i = y(i+t))\, dt \\ &= \sum_{i=0}^\infty \int_0^1 \theta t^{\theta-1} \Pr(Y_i = i)\, dt \\ &= \sum_{i=0}^\infty \Pr(Y_i=i) = 1. }$$
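The same check can be run numerically with the hypothetical `f_Z` above (the Poisson tail beyond the truncation point is negligible):

```python
from scipy.integrate import quad

# Integrate piecewise over [i, i+1] to avoid the discontinuities at the integers
total = sum(quad(f_Z, i, i + 1)[0] for i in range(50))
print(total)  # ~ 1.0, up to truncation and quadrature error
```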
