Assume we have 3 random variables $X_1,X_2,X_3$, and we know the pairwise marginal distributions $P(X_1,X_2)$, $P(X_2,X_3)$, $P(X_3,X_1)$, but we don't know anything else (such as conditional independence relations). Can we get the joint distribution $P(X_1,X_2,X_3)$?
Joint Distribution from Marginal Distribution – How to Get Joint Distribution from Pairwise Marginal Distribution
distributions, probability
Related Solutions
This question and the OP's lecturer's claims seem to indicate a misunderstanding of the notions of independence and conditional independence of random variables. Different sets of distributions for Bernoulli random variables $X$, $Y$, and $Z$ are presented here to illustrate the differences between the various notions.
- Suppose that $X$ and $Y$ are known to be independent Bernoulli random variables with parameter $\frac{1}{2}$. Thus, their probability mass functions (pmfs) are $$p_X(0) = p_X(1) = \frac{1}{2}; ~ p_Y(0) = p_Y(1) = \frac{1}{2}$$ and their joint pmf is the product of the (marginal) pmfs $$p_{X,Y}(i,j) = p_X(i)p_Y(j) = \frac{1}{2}\times\frac{1}{2} = \frac{1}{4} ~\text{for all} ~ i, j \in \{0, 1\}.$$ Are $X$ and $Y$ necessarily conditionally independent given $Z$, where $Z$ is also a Bernoulli random variable with parameter $\frac{1}{2}$? Not necessarily. Consider the conditional distributions $$\begin{align*}p_{X,Y\mid Z}(i,j\mid Z=0) &= \begin{cases}\frac{1}{2}, & i = j\\ 0, & i \neq j, \end{cases}\\ p_{X,Y\mid Z}(i,j\mid Z=1) &= \begin{cases}\frac{1}{2}, & i \neq j\\ 0, & i = j. \end{cases} \end{align*} $$ The law of total probability shows that these conditional distributions combine to give the known joint pmf of $X$ and $Y$. It is also easy to verify that regardless of whether $Z = 0$ or $Z = 1$, both $X$ and $Y$ are conditionally distributed as Bernoulli random variables with parameter $\frac{1}{2}$, but $X$ and $Y$ are not conditionally independent given $Z$, regardless of whether $Z$ has value $0$ or $1$: when $Z = 0$ we have $X = Y$, and when $Z = 1$ we have $X = 1-Y$. Thus we can say the following:
If $X$ and $Y$ are independent random variables with known marginal distributions, then their joint distribution is the product of the marginal distributions. However, $X$ and $Y$ need not be conditionally independent even if the conditional marginal distributions of $X$ and $Y$ are the same as their given unconditional marginal distributions. Thus the conditional joint distribution of unconditionally independent random variables need not be the product of the conditional marginal distributions.
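Both halves of this statement can be confirmed by direct enumeration. Here is a minimal Python sketch of the construction above ($Z \sim \text{Bernoulli}(\frac{1}{2})$; $X = Y$ given $Z = 0$, $X = 1 - Y$ given $Z = 1$):

```python
from itertools import product

# Joint pmf of (X, Y, Z) from the construction above:
# Z ~ Bernoulli(1/2); given Z = 0, X = Y; given Z = 1, X = 1 - Y.
p = {}
for x, y, z in product((0, 1), repeat=3):
    match = (x == y) if z == 0 else (x != y)
    p[(x, y, z)] = 0.25 if match else 0.0   # 1/2 (conditional) * 1/2 (for Z)

# Unconditionally, X and Y are independent: every pair gets probability 1/4.
pxy = {(x, y): p[(x, y, 0)] + p[(x, y, 1)]
       for x, y in product((0, 1), repeat=2)}
print(pxy)

# Conditionally on Z = 0 they are not: p(0,0 | Z=0) = 1/2, p(0,1 | Z=0) = 0,
# while conditional independence would force 1/4 everywhere.
pz0 = sum(p[(x, y, 0)] for x, y in product((0, 1), repeat=2))
print(p[(0, 0, 0)] / pz0, p[(0, 1, 0)] / pz0)
```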
- Suppose that $X$ and $Y$ are conditionally independent given $Z = 0$ and also conditionally independent given $Z = 1$. Are $X$ and $Y$ necessarily unconditionally independent? Not necessarily, not even if $Z$ is a Bernoulli random variable with parameter $\frac{1}{2}$. Suppose that $X$ and $Y$ are conditionally independent Bernoulli random variables with parameter $p$ if $Z = 0$ and parameter $q$ if $Z = 1$. Thus, the conditional joint pmfs are $$\begin{align*} p_{X,Y\mid Z}(0,0\mid Z = 0) &= (1-p)^2; \qquad \quad p_{X,Y\mid Z}(0,0\mid Z = 1) = (1-q)^2;\\ p_{X,Y\mid Z}(0,1\mid Z = 0) &= p(1-p); \qquad \quad p_{X,Y\mid Z}(0,1\mid Z = 1) = q(1-q);\\ p_{X,Y\mid Z}(1,0\mid Z = 0) &= p(1-p); \qquad \quad p_{X,Y\mid Z}(1,0\mid Z = 1) = q(1-q);\\ p_{X,Y\mid Z}(1,1\mid Z = 0) &= p^2; \qquad \quad \, \qquad p_{X,Y\mid Z}(1,1\mid Z = 1) = q^2. \end{align*}$$ Suppose that $p \neq q$. Then $X$ and $Y$ are unconditionally independent only in the trivial cases when $Z$ has parameter $\lambda$ equal to $0$ or $1$ (when one of the above two conditional joint pmfs gets weight $0$ in the total probability formula).
If $X$ and $Y$ are conditionally independent given $Z$, they need not be unconditionally independent.
Is there any instance where conditional independence guarantees unconditional independence? If $X$ and $Y$ are not only conditionally independent given $Z$ but also have the same conditional joint distribution for all values of $Z$, then $X$ and $Y$ are unconditionally independent. But this is also a trivial special case, because the stated condition means that $X$, $Y$, and $Z$ are mutually independent random variables, and so the conditional joint distribution of $X$ and $Y$ does not depend on the value of $Z$.
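The failure of unconditional independence in the mixture example can also be seen numerically. A small sketch with illustrative values $p = 0.2$, $q = 0.8$, $\lambda = \frac{1}{2}$ (any $p \neq q$ and $0 < \lambda < 1$ would do):

```python
from itertools import product

p, q, lam = 0.2, 0.8, 0.5   # illustrative parameters; any p != q, 0 < lam < 1

def bern(theta, k):
    """Bernoulli(theta) pmf at k."""
    return theta if k == 1 else 1 - theta

# Unconditional joint pmf of (X, Y): a mixture of the two product pmfs.
joint = {(i, j): (1 - lam) * bern(p, i) * bern(p, j)
                 + lam * bern(q, i) * bern(q, j)
         for i, j in product((0, 1), repeat=2)}

px = {i: joint[(i, 0)] + joint[(i, 1)] for i in (0, 1)}
py = {j: joint[(0, j)] + joint[(1, j)] for j in (0, 1)}

# Independence would require joint = product of margins; it fails:
print(joint[(1, 1)], px[1] * py[1])   # 0.34 vs 0.25
```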
As indicated in the earlier comments, once you get a sample from the joint distribution of $(X_1,X_2,X_3)$, $$(x_1^1,x_2^1,x_3^1),\ldots,(x_1^t,x_2^t,x_3^t)$$ the marginal sample $$(x_1^1,x_2^1),\ldots,(x_1^t,x_2^t)$$ is indeed a sample from the marginal joint distribution of $(X_1,X_2)$ and you can ignore the simulated $x_3^j$'s. They can, however, be useful in Monte Carlo evaluations through a technique called Rao-Blackwellisation, since the average $$\frac{1}{t}\sum_{i=1}^t h(x_1^i,x_2^i)$$ is improved upon by the average $$\frac{1}{t}\sum_{i=1}^t \mathbb{E}[h(X_1,X_2)\mid x_3^i],$$ as a conditional expectation has the same expectation as the original quantity but a smaller variance. See for instance this discussion on Cross Validated.
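As a concrete (and entirely hypothetical) illustration of the Rao-Blackwellisation idea: take $X_3 \sim N(0,1)$ and, given $X_3 = x_3$, let $X_1, X_2$ be independent $N(x_3, 1)$; with $h(x_1,x_2) = x_1 + x_2$ the conditional expectation $\mathbb{E}[h(X_1,X_2)\mid x_3] = 2x_3$ is available in closed form, and the variance reduction is visible directly:

```python
import random
random.seed(0)

# Hypothetical toy model (not from the original answer): X3 ~ N(0,1);
# given X3 = x3, X1 and X2 are independent N(x3, 1).
# With h(x1, x2) = x1 + x2, we have E[h | x3] = 2 * x3 exactly.
t = 200_000
h_vals, rb_vals = [], []
for _ in range(t):
    x3 = random.gauss(0.0, 1.0)
    x1 = random.gauss(x3, 1.0)
    x2 = random.gauss(x3, 1.0)
    h_vals.append(x1 + x2)      # crude Monte Carlo term: variance 4 + 1 + 1 = 6
    rb_vals.append(2.0 * x3)    # Rao-Blackwellised term: variance 4

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

print(var(h_vals), var(rb_vals))  # roughly 6 vs 4: same mean, smaller variance
```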
Best Answer
No.
Consider a trivariate distribution with bivariate (standard, independent) normal margins, but where half of the eight octants have zero probability and the other half double probability. Specifically, let the octants $(-,-,-)$, $(-,+,+)$, $(+,-,+)$, $(+,+,-)$ have double probability (twice the trivariate standard normal density) and the remaining four zero probability.
Then the bivariate margins are indistinguishable from the ones you'd get with three i.i.d. standard normal variates. Indeed, there are infinitely many trivariate distributions that would produce the same bivariate margins.
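To make this concrete, here is a minimal sampling sketch of the doubled-octant distribution (my own construction, not code from the answer): draw the magnitudes $|X_i|$ as independent half-normals and pick the joint sign pattern uniformly from the four doubled octants. Each *pair* of signs is then uniform on $\{-1,+1\}^2$, so every bivariate margin is exactly that of two independent standard normals, yet the product of the three signs is always $-1$:

```python
import random
random.seed(1)

# The four doubled octants: within each, the density is twice the
# trivariate standard normal density.
octants = [(-1, -1, -1), (-1, 1, 1), (1, -1, 1), (1, 1, -1)]

def draw():
    # Magnitudes are independent half-normals; signs uniform over octants.
    s = random.choice(octants)
    return tuple(si * abs(random.gauss(0.0, 1.0)) for si in s)

samples = [draw() for _ in range(100_000)]

# Three-way dependence is deterministic: the sign product is always -1.
print(all(x1 * x2 * x3 < 0 for x1, x2, x3 in samples))

# Pairwise, the signs are uncorrelated (empirical mean of sign products):
n = len(samples)
sign_corr = {}
for a, b in [(0, 1), (1, 2), (0, 2)]:
    sign_corr[(a, b)] = sum(
        (1 if s[a] > 0 else -1) * (1 if s[b] > 0 else -1) for s in samples
    ) / n
print(sign_corr)   # all three values close to 0
```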
As Dilip Sarwate points out in comments, he has discussed essentially the same example in an answer (but reversing the octants which are doubled and zeroed), and defines it in a more formal way. whuber mentions an example involving Bernoulli variates that (in the trivariate case) looks like this: probability $\frac{1}{4}$ on each of the four outcomes $(0,0,0)$, $(0,1,1)$, $(1,0,1)$, $(1,1,0)$ and probability $0$ on the remaining four, where every bivariate margin would be uniform, placing probability $\frac{1}{4}$ on each of $(0,0)$, $(0,1)$, $(1,0)$, $(1,1)$, and so would be equivalent to the case of three independent variates (or indeed to three with exactly the reverse form of dependence, which puts probability $\frac{1}{4}$ on each of the other four outcomes).
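This Bernoulli example can be checked by direct enumeration. A sketch assuming the even-parity version of the construction (probability $\frac{1}{4}$ on each triple with even coordinate sum; the "reverse" dependence would use the odd-sum triples instead):

```python
from itertools import product

# Probability 1/4 on each triple with even coordinate sum, 0 on the rest.
p = {t: (0.25 if sum(t) % 2 == 0 else 0.0)
     for t in product((0, 1), repeat=3)}

# Every bivariate margin is uniform on {0,1}^2 -- indistinguishable from
# three independent fair coins.
margins = {}
for a, b in [(0, 1), (1, 2), (0, 2)]:
    m = {}
    for t, pr in p.items():
        m[(t[a], t[b])] = m.get((t[a], t[b]), 0.0) + pr
    margins[(a, b)] = m
print(margins)   # every pair of values has probability 1/4 in every margin
```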
A closely related example I initially started to write about involved a trivariate uniform with alternating "slices" in a checkerboard pattern of greater and lower probability (generalizing the usual zero and double).
So, in general, you can't compute the trivariate distribution from its bivariate margins.