How to Get the Joint Distribution from Pairwise Marginal Distributions

Tags: distributions, probability

Assume we have 3 random variables $X_1,X_2,X_3$, and we know the pairwise marginal distributions $P(X_1,X_2)$, $P(X_2,X_3)$, $P(X_3,X_1)$, but we don't know anything else (such as any conditional independence relations). Can we get the joint distribution $P(X_1,X_2,X_3)$?

Best Answer

No.

Consider a trivariate distribution whose bivariate margins are those of independent standard normals, constructed by taking three i.i.d. standard normals and then zeroing the density on half of the octants while doubling it on the other half. Specifically, give double probability to the octants $---$, $-++$, $+-+$ and $++-$ (those whose signs multiply to $-1$), and zero probability to the remaining four.

Then the bivariate margins are indistinguishable from those you'd get with three i.i.d. standard normal variates. Indeed, there are infinitely many trivariate distributions that would produce the same bivariate margins.
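
To make this concrete, here is a small Monte Carlo sketch (Python with NumPy; the code and variable names are my own illustration, not part of the original answer). It samples the octant-modified distribution by rejection and checks that the $(+,+)$ quadrant of every pair has probability about $1/4$, exactly as for i.i.d. standard normals, while the all-positive octant has probability about $0$ rather than $1/8$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw i.i.d. standard normal triples and keep only those that fall in the
# "doubled" octants (sign product negative: ---, -++, +-+, ++-).  Rejection
# sampling like this produces exactly the octant-modified distribution.
z = rng.standard_normal((1_000_000, 3))
x = z[np.prod(np.sign(z), axis=1) < 0]

# The (+,+) quadrant of every pair has probability about 1/4, matching
# independent standard normals (the same holds for the other quadrants) ...
for i, j in [(0, 1), (1, 2), (2, 0)]:
    p = np.mean((x[:, i] > 0) & (x[:, j] > 0))
    print(f"P(X{i+1}>0, X{j+1}>0) ≈ {p:.3f}")      # ≈ 0.25 for every pair

# ... but the trivariate distribution is clearly different: the +++ octant
# has probability 0 instead of 1/8.
print(f"P(X1>0, X2>0, X3>0) ≈ {np.mean(np.all(x > 0, axis=1)):.3f}")   # ≈ 0.0
```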

As Dilip Sarwate points out in comments, he has discussed essentially the same example in an answer (but with the doubled and zeroed octants reversed), and defines it more formally. Whuber mentions an example involving Bernoulli variates that (in the trivariate case) looks like this:

  X3=0      X1                  X3=1      X1
          0    1                        0    1

    0    1/4   0                  0     0   1/4 
 X2                         X2
    1     0   1/4                 1    1/4   0

... where every bivariate margin would be

            Xi         
          0    1       

    0    1/4  1/4      
 Xj                  
    1    1/4  1/4    

and so would be equivalent to the case of three independent variates (or indeed to three with exactly the reverse form of dependence).
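
One quick way to check these tables is to enumerate the joint distribution directly. The sketch below (Python; my own illustration) places probability $1/4$ on each triple with $X_1 \oplus X_2 \oplus X_3 = 0$, which is exactly the joint distribution tabulated above, and then sums out one variable at a time to confirm that every bivariate margin puts $1/4$ in each cell:

```python
from itertools import product

# The joint distribution tabulated above: probability 1/4 on each triple
# (x1, x2, x3) with x1 XOR x2 XOR x3 == 0, and probability 0 otherwise.
joint = {t: (0.25 if (t[0] ^ t[1] ^ t[2]) == 0 else 0.0)
         for t in product((0, 1), repeat=3)}

def margin(joint, i, j):
    """Bivariate margin of (X_{i+1}, X_{j+1}): sum out the remaining variable."""
    m = {}
    for t, p in joint.items():
        m[(t[i], t[j])] = m.get((t[i], t[j]), 0.0) + p
    return m

for i, j in [(0, 1), (1, 2), (2, 0)]:
    # Every cell of every bivariate margin comes out to 1/4, exactly what
    # three independent fair Bernoulli variables would give.
    print(f"margin of (X{i+1}, X{j+1}):", margin(joint, i, j))
```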

A closely related example I initially started to write about involved a trivariate uniform whose density alternates between higher and lower values over "slices" arranged in a checkerboard pattern (generalizing the zero-and-double construction above).
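
Here is one concrete version of that checkerboard construction (a sketch under my own assumptions; the answer does not spell out the details): split $[0,1]^3$ into $2\times2\times2$ cells and give density $1+\epsilon$ to cells whose index sum is even and $1-\epsilon$ to the rest. Integrating out any one coordinate always pairs a high cell with a low cell, so every bivariate margin is exactly uniform:

```python
import itertools

eps = 0.6   # eps = 1 recovers the zero-and-double version above

# Density on [0,1]^3: split the cube into 2x2x2 cells; cells whose index sum
# is even get density 1+eps, cells whose index sum is odd get 1-eps.
def density(x1, x2, x3):
    i, j, k = int(x1 * 2), int(x2 * 2), int(x3 * 2)
    return 1 + eps if (i + j + k) % 2 == 0 else 1 - eps

# Bivariate margin of (X1, X2): integrate x3 out.  The two halves of the
# x3-axis always have opposite parity, so the result is (1+eps)/2 + (1-eps)/2 = 1.
def margin_12(x1, x2):
    return 0.5 * density(x1, x2, 0.25) + 0.5 * density(x1, x2, 0.75)

for x1, x2 in itertools.product((0.25, 0.75), repeat=2):
    print(f"marginal density at ({x1}, {x2}):", margin_12(x1, x2))  # always 1.0
```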

So you can't compute the trivariate from bivariate margins in general.