[Math] Using Jacobian Transformation to find the Marginal Density

Tags: functions, probability, statistics

Let $X$ and $Y$ be two independent random variables having identical gamma distributions.

$(a)$ Find the joint probability density of the random variables

$$U= \frac X {X+Y} ,\quad V= X+Y$$

$(b)$ Find and identify the marginal density of $ U.$


$(a)$
Since $X$ and $Y$ are independent, each with the gamma density $f(t)=\dfrac{1}{\beta^{\alpha}\Gamma(\alpha)}\,t^{\alpha-1}e^{-t/\beta}$ for $t>0$, their joint density is the product

$$f(x,y)= \begin{cases} \dfrac{1}{\beta^{2\alpha}\Gamma(\alpha)^{2}} \, x^{\alpha-1}y^{\alpha-1}e^{-\frac{1}{\beta}(x+y)} & \text{for } x,y \ge 0, \\[4pt]
0 & \text{otherwise.}
\end{cases}$$

Using the Jacobian of the inverse transformation $x = uv$, $\; y = v(1-u)$ (equivalently $u = \frac{x}{x+y}$, $v = x+y$),

$$J= \begin{vmatrix} \dfrac{\partial x}{\partial u} & \dfrac{\partial x}{\partial v} \\[4pt] \dfrac{\partial y}{\partial u} & \dfrac{\partial y}{\partial v}
\end{vmatrix}$$

$$J= \begin{vmatrix}v &u \\
-v & (1-u)\end{vmatrix}$$

$J=v(1-u)+uv=v$

The joint probability density of $U$ and $V$ is

$$g(u,v)= f\bigl(uv,\; v(1-u)\bigr) \cdot |J|$$

$$g(u,v)= \begin{cases} \dfrac{1}{\beta^{2\alpha}\Gamma(\alpha)^{2}} \, \bigl[u(1-u)\bigr]^{\alpha-1}\, v^{2\alpha-1}\, e^{-v/\beta} & \text{for } 0 \le u \le 1,\; v \ge 0, \\[4pt]
0 & \text{elsewhere.}
\end{cases}$$
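
As a quick numerical sanity check on this $g(u,v)$ (a sketch, not part of the original post; the values $\alpha = 3$, $\beta = 2$, the sample size, and the grid are illustrative), one can compare the formula with a Monte Carlo histogram of $(U,V)$:

```python
# Sanity check (illustrative, not from the original post): simulate X, Y i.i.d.
# Gamma(shape=alpha, scale=beta), transform to (U, V), and compare a 2-D
# histogram of the samples against the derived joint density g(u, v).
import numpy as np
from math import gamma as Gamma  # the Gamma function, for the normalizing constant

alpha, beta, n = 3.0, 2.0, 1_000_000   # illustrative parameter choices
rng = np.random.default_rng(42)

x = rng.gamma(shape=alpha, scale=beta, size=n)
y = rng.gamma(shape=alpha, scale=beta, size=n)
u, v = x / (x + y), x + y

def g(u, v):
    # Derived joint density: a Beta-type factor in u times a Gamma-type factor in v.
    return ((u * (1 - u)) ** (alpha - 1) * v ** (2 * alpha - 1)
            * np.exp(-v / beta) / (beta ** (2 * alpha) * Gamma(alpha) ** 2))

# Empirical joint density at the bin centres vs. the formula.
hist, u_edges, v_edges = np.histogram2d(u, v, bins=40, density=True)
uc = 0.5 * (u_edges[:-1] + u_edges[1:])
vc = 0.5 * (v_edges[:-1] + v_edges[1:])
max_err = np.max(np.abs(hist - g(uc[:, None], vc[None, :])))
print(max_err)  # should be a small fraction of the density's peak value
```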

$(b)$ The marginal density of $U$ is

$$h(u) = \int^\infty_0 g(u,v) \, dv$$

$$\int^\infty_0 \frac{1}{\beta^{2\alpha}\Gamma(\alpha)^{2}} \, \bigl[u(1-u)\bigr]^{\alpha-1}\, v^{2\alpha-1}\, e^{-v/\beta} \, dv$$

Does one use integration by parts to solve this?

$$\frac{1}{\beta^{2\alpha}\Gamma(\alpha)^{2}} \, \bigl[u(1-u)\bigr]^{\alpha-1}\int^{\infty}_{0}v^{2\alpha-1}e^{-v/\beta} \, dv$$
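
(For reference, a step not in the original post: no integration by parts is needed, since the remaining $v$-integral is a standard gamma integral. With the substitution $t = v/\beta$,

$$\int^{\infty}_{0} v^{2\alpha-1} e^{-v/\beta}\, dv \;=\; \beta^{2\alpha}\int^{\infty}_{0} t^{2\alpha-1} e^{-t}\, dt \;=\; \beta^{2\alpha}\,\Gamma(2\alpha).)$$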

Best Answer

Since the joint density $(u,v)\mapsto f_{U,V}(u,v)$ of $U$ and $V$ factors as a function of $u$ times a function of $v$, one must conclude that $U$ and $V$ are independent and that the marginal density of $U$ is just the factor that depends on $u$ multiplied by a normalizing constant. Thus you have $$ \Big(u^{\alpha-1} (1-u)^{\alpha-1} \cdot\text{constant} \Big) \text{ for } 0\le u\le 1. $$

If you know how to find $\displaystyle \int_0^1 u^{\gamma-1} (1-u)^{\delta-1} \, du,$ that gives you the normalizing constant.
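
(To make the identification explicit, a sketch not in the original answer: that integral is the Beta function,

$$\int_0^1 u^{\gamma-1}(1-u)^{\delta-1}\, du = B(\gamma,\delta)=\frac{\Gamma(\gamma)\,\Gamma(\delta)}{\Gamma(\gamma+\delta)},$$

so with $\gamma=\delta=\alpha$ the normalizing constant is $\Gamma(2\alpha)/\Gamma(\alpha)^{2}$ and

$$h(u)=\frac{\Gamma(2\alpha)}{\Gamma(\alpha)^{2}}\,\bigl[u(1-u)\bigr]^{\alpha-1},\qquad 0\le u\le 1,$$

which is the Beta$(\alpha,\alpha)$ density.)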