There is no joint density function since the random vector $(U,V,W,Z)=(f_1(X),f_2(X),f_3(X),f_4(X))$ takes values in the subset $D=\{(f_1(x),f_2(x),f_3(x),f_4(x))\mid x\in\mathbb R\}$ of $\mathbb R^4$, which has Lebesgue measure zero.
Informally, $D$ has co-dimension $3$, hence one can compare $D$ to a line in $\mathbb R^4$.
Formally, for every measurable function $\varphi$ on $\mathbb R^4$,
$$
\mathrm E(\varphi(U,V,W,Z))=\int\varphi(f_1(x),f_2(x),f_3(x),f_4(x))\,g(x)\mathrm dx,
$$
where $g$ is the density of the distribution of $X$; hence $\mathrm E(\varphi(U,V,W,Z))$ is an integral over (a subset of) $\mathbb R$ instead of $\mathbb R^4$.
The simplest analogue is when $U=V=X$ with $X$ uniformly distributed on $[0,1]$. Then $(U,V)$ is uniformly distributed on the diagonal $\Delta=\{(x,x)\mid x\in[0,1]\}$ hence the distribution of $(U,V)$ is
$$
\mathrm dP_{(U,V)}(u,v)=\mathbf 1_{u\in[0,1]}\,\delta_u(\mathrm dv)\,\mathrm du,
$$
where, for every $u$, $\delta_u$ is the Dirac distribution at $u$. One sees that $\mathrm dP_{(U,V)}(u,v)$ has no density with respect to Lebesgue measure $\mathrm du\mathrm dv$.
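The reduction to a one-dimensional integral can be checked numerically. The sketch below (an illustrative assumption, with a hypothetical test function $\varphi(u,v)=uv+u$) samples $X$ once per draw, uses it for both coordinates of $(U,V)=(X,X)$, and compares the Monte Carlo estimate of $\mathrm E(\varphi(U,V))$ with the exact one-dimensional integral $\int_0^1(x^2+x)\,\mathrm dx=5/6$:

```python
import random

# Sketch: (U, V) = (X, X) with X uniform on [0, 1].  The expectation of any
# phi(U, V) reduces to a one-dimensional integral over x, as in the formula
# above with f1 = f2 = identity and g = 1 on [0, 1].
random.seed(0)

def phi(u, v):          # arbitrary (hypothetical) test function
    return u * v + u

n = 200_000
# Monte Carlo estimate of E[phi(U, V)]: one sample of X feeds both coordinates.
mc = sum(phi(x, x) for x in (random.random() for _ in range(n))) / n

# One-dimensional integral: E[phi(X, X)] = int_0^1 (x^2 + x) dx = 5/6.
exact = 5 / 6
print(abs(mc - exact) < 0.01)   # the two agree up to Monte Carlo error
```

Note that no two-dimensional density is needed at any point: the whole computation lives on the diagonal $\Delta$.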
There is much more information in a joint distribution than can
be captured by its marginal distributions.
It is one thing to be told that a joint distribution can't be
constructed from marginals in a unique way. It
is another to have some examples. Here are a few.
Discrete distributions. Consider four different joint distributions
with the same marginals. In all cases, let the marginals have the
distribution Bin(3, 1/2).
Positively Correlated
x/y 0 1 2 3 Tot
-------------------------------------
0 1/8 0 0 0 1/8
1 0 2/8 1/8 0 3/8
2 0 1/8 2/8 0 3/8
3 0 0 0 1/8 1/8
-------------------------------------
Tot 1/8 3/8 3/8 1/8 1
Negatively Correlated
x/y 0 1 2 3 Tot
-------------------------------------
0 0 0 0 1/8 1/8
1 0 1/8 2/8 0 3/8
2 0 2/8 1/8 0 3/8
3 1/8 0 0 0 1/8
-------------------------------------
Tot 1/8 3/8 3/8 1/8 1
A perfectly correlated example arises from putting the
numbers 1/8, 3/8, 3/8, 1/8 down the main diagonal.
And, of course, there is the independent case in which
the cells are filled by multiplying the marginals.
There are many more examples of different joint distributions that share these same
marginal distributions, and you should try to construct one yourself.
Fill in the body of the table any way you like, using
numbers between 0 and 1 such that the marginal totals
remain unchanged.
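All four constructions can be verified in a few lines. The sketch below builds each table (the two tables above, the perfectly correlated diagonal, and the independent product) with exact fractions and checks that every row and column sum reproduces the Bin(3, 1/2) marginal $(1/8, 3/8, 3/8, 1/8)$:

```python
from fractions import Fraction as F

# Four different joint pmfs on {0,1,2,3}^2, all with Bin(3, 1/2) marginals.
m = [F(1, 8), F(3, 8), F(3, 8), F(1, 8)]          # the common marginal

eighth = lambda rows: [[F(c, 8) for c in r] for r in rows]
positive = eighth([[1, 0, 0, 0], [0, 2, 1, 0], [0, 1, 2, 0], [0, 0, 0, 1]])
negative = eighth([[0, 0, 0, 1], [0, 1, 2, 0], [0, 2, 1, 0], [1, 0, 0, 0]])
perfect  = [[m[i] if i == j else F(0) for j in range(4)] for i in range(4)]
indep    = [[m[i] * m[j] for j in range(4)] for i in range(4)]

for table in (positive, negative, perfect, indep):
    rows = [sum(r) for r in table]                 # marginal of x
    cols = [sum(c) for c in zip(*table)]           # marginal of y
    print(rows == m and cols == m)                 # True for every table
```

Exact fractions avoid any floating-point doubt about whether the totals really match.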
Continuous distributions. A slightly more advanced situation comes from the family of
bivariate normal distributions with both means 0 and
both standard deviations 1, in which the correlation $\rho$
can take any value between $-1$ and $+1$. In this example
both marginal distributions are standard normal, no matter
what the value of $\rho$.
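This can be seen empirically using the standard construction $Y=\rho X+\sqrt{1-\rho^2}\,Z$ with $X,Z$ independent standard normals, which yields a bivariate normal pair with correlation $\rho$. The sketch below (sample sizes and tolerances are illustrative choices) checks that the marginal of $Y$ has mean about 0 and standard deviation about 1 for several values of $\rho$:

```python
import random
import statistics

# Sketch: Y = rho*X + sqrt(1 - rho^2)*Z, with X and Z independent standard
# normals, has a standard normal marginal for every rho in (-1, 1).
random.seed(1)

def sample(rho, n=100_000):
    xs, ys = [], []
    for _ in range(n):
        x, z = random.gauss(0, 1), random.gauss(0, 1)
        xs.append(x)
        ys.append(rho * x + (1 - rho ** 2) ** 0.5 * z)
    return xs, ys

for rho in (-0.8, 0.0, 0.8):
    xs, ys = sample(rho)
    # The marginal of Y looks standard normal regardless of rho.
    print(abs(statistics.mean(ys)) < 0.05
          and abs(statistics.stdev(ys) - 1) < 0.05)
```

Only the joint behavior (how often $X$ and $Y$ are large together) changes with $\rho$; each marginal alone is blind to it.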
Best Answer
Going back to the definition of a (discrete) joint probability distribution, you want to find the quantity $$ P(X=x,Y=y). $$ Note that the event $(X=x,Y=y)$ means $(X=x\textrm{ and }Y=y)$. This event has zero probability when $x+y\ne 1$ because $Y=1-X$. When $x+y=1$, you have $$ P(X=x,Y=y)=P(X=x,Y=1-x)=P(X=x),\quad x\in\{0,1\}. $$
Now you should know how to go on.
Notes.
In the calculation above, $P(X=x,Y=1-x)=P(X=x)$ because $(X=x,Y=1-x)$ and $(X=x)$ are the same event since $Y=1-X$.
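The argument can be made concrete by enumerating the joint pmf. The sketch below assumes $X$ is Bernoulli($p$) with $p=1/2$ (an illustrative choice; the reasoning above works for any $p$) and $Y=1-X$, so all the probability mass sits on the two pairs with $x+y=1$:

```python
# Sketch: joint pmf of (X, Y) when Y = 1 - X and X is Bernoulli(p).
# p = 1/2 is an illustrative assumption; the argument holds for any p.
p = 0.5

def joint(x, y):
    if x + y != 1:                 # impossible event, since Y = 1 - X
        return 0.0
    return p if x == 1 else 1 - p  # P(X = x, Y = 1 - x) = P(X = x)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, joint(x, y))   # mass only on (0, 1) and (1, 0)
```

The table this prints is exactly the "perfectly negatively correlated" situation: knowing $X$ determines $Y$.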