Probability Theory – How to Find a Joint Distribution Function from Marginals with Dependence

Tags: probability, probability-distributions, probability-theory

I know one can go from a joint density function $f(x,y)$ to a marginal density function such as $f_X(x)$ by integrating out the other variable, as in $f_X(x) = \int f(x,y)\, dy$. But given $f_X(x)$ and $f_Y(y)$ as densities for *dependent* random variables, how would one go about finding a joint density or distribution function?

Thanks

Best Answer

The marginals alone do not determine the joint density, so there is no unique answer. For example, suppose the marginal densities for $X$ and $Y$ are both 1 on the interval $[0,1]$, 0 otherwise. One family of possibilities for the joint density is $f(x,y) = 1 + g(x) h(y)$ for $0 < x < 1$, $0 < y < 1$, 0 otherwise, for functions $g$ and $h$ such that $\int_0^1 g(x)\, dx = \int_0^1 h(y)\, dy = 0$, $-1 \le g(x) \le 1$ and $-1 \le h(y) \le 1$. And there are infinitely many other possibilities.
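As a sanity check, here is a short numerical sketch of this family using one hypothetical concrete choice, $g(x) = 2x - 1$ and $h(y) = 2y - 1$ (my choice, not from the answer; both integrate to 0 over $[0,1]$ and stay within $[-1,1]$). It confirms that the marginals are uniform while $X$ and $Y$ are dependent, since $\operatorname{Cov}(X,Y) = 1/36 \ne 0$:

```python
import numpy as np

n = 1000
mid = (np.arange(n) + 0.5) / n   # midpoints of an n-cell grid on [0, 1]
dx = 1.0 / n

# One valid choice of g and h: each integrates to 0 and is bounded by 1.
g = lambda x: 2 * x - 1
h = lambda y: 2 * y - 1
f = lambda x, y: 1 + g(x) * h(y)   # joint density on the unit square

# Marginal of X at a few points: integrate out y (midpoint rule).
for x in (0.1, 0.5, 0.9):
    fx = np.sum(f(x, mid)) * dx
    print(f"f_X({x}) = {fx:.6f}")   # each value is 1: the uniform density

# Dependence check: Cov(X, Y) = E[XY] - E[X]E[Y] = 1/36, not 0.
X, Y = np.meshgrid(mid, mid, indexing="ij")
E_xy = np.sum(X * Y * f(X, Y)) * dx * dx
print(f"Cov(X, Y) ~ {E_xy - 0.25:.6f}")   # ~ 1/36 ~ 0.027778
```

The same check works for any admissible $g$ and $h$: the zero-integral condition kills the cross term when integrating out either variable, which is exactly why every member of the family has the same uniform marginals.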
