The sum of $n$ independent Gamma random variables $\sim \Gamma(t_i, \lambda)$ is a Gamma random variable $\sim \Gamma\left(\sum_i t_i, \lambda\right)$. It does not matter whether the second parameter denotes the scale or its inverse (the rate), as long as all $n$ random variables share the same second parameter. This idea extends readily to $\chi^2$ random variables, which are a special case of Gamma random variables.
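As a quick sanity check, here is a minimal simulation sketch (with arbitrary illustrative parameters) that sums independent Gammas sharing a rate $\lambda$ and compares the result to $\Gamma\left(\sum_i t_i, \lambda\right)$ with a Kolmogorov–Smirnov test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
shapes, lam, n = [1.5, 2.0, 0.5], 3.0, 100_000  # illustrative values

# Sum independent Gamma(t_i, rate=lam) draws; NumPy parametrizes by scale = 1/rate.
total = sum(rng.gamma(shape=t, scale=1.0 / lam, size=n) for t in shapes)

# Kolmogorov-Smirnov test against Gamma(sum(t_i), rate=lam).
stat, pval = stats.kstest(total, "gamma", args=(sum(shapes), 0.0, 1.0 / lam))
print(f"KS statistic = {stat:.4f}, p-value = {pval:.3f}")  # large p: consistent
```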
Assuming $\psi$ has support on the positive real line, we have
$$\xi \,\psi = X,$$
where $X \sim F_n$ and $F_n$ is the empirical distribution of the data.
Taking the log of this equation, we get
$$ \log(\xi) + \log(\psi) = \log(X). $$
Thus, by Lévy's continuity theorem and the independence of $\xi$ and $\psi$, taking characteristic functions gives
$$ \Psi_{\log(\xi)}(t)\,\Psi_{\log(\psi)}(t) = \Psi_{\log(X)}(t).$$
Now $\xi \sim \mathrm{Unif}[0,1]$, therefore $-\log(\xi) \sim \mathrm{Exp}(1)$.
Thus, since the characteristic function of $\mathrm{Exp}(1)$ is $(1 - it)^{-1}$,
$$\Psi_{\log(\xi)}(t) = \left(1 + it\right)^{-1}.$$
Here $\Psi_{\log(X)}$ is the empirical characteristic function, $\Psi_{\log(X)}(t) = \frac{1}{n}\sum_{k=1}^{n}\exp(it\,X_k),$
where $X_1, \dots, X_n$ is the random sample of $\log(X)$ values (with $n = 1000$).
We can now completely specify the distribution of $\log(\psi)$ through its characteristic function:
$$ \left(1 + it\right)^{-1}\,\Psi_{\log(\psi)}(t) = \frac{1}{n}\sum_{k=1}^{n}\exp(it\,X_k).$$
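To make this concrete, here is a small sketch that builds the deconvolved characteristic function $\Psi_{\log(\psi)}(t) = (1 + it)\,\Psi_{\log(X)}(t)$ from simulated data. The choice $\psi \sim \Gamma(2, 1)$ is a hypothetical stand-in so that the answer can be checked; note that sampling noise in the empirical characteristic function is amplified by $|1 + it|$ at large $t$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Synthetic stand-in: X = xi * psi with xi ~ Unif[0,1] and (hypothetically)
# psi ~ Gamma(2, 1), so the deconvolution can be checked against the truth.
psi = rng.gamma(shape=2.0, size=n)
logX = np.log(rng.uniform(0.0, 1.0, size=n) * psi)  # the sample X_1..X_n of log(X)

def ecf(t, sample):
    """Empirical characteristic function (1/n) sum_k exp(i t sample_k)."""
    return np.exp(1j * np.outer(t, sample)).mean(axis=1)

t = np.linspace(-10.0, 10.0, 401)
# Rearranging (1 + it)^{-1} Psi_{log psi}(t) = Psi_{log X}(t):
cf_log_psi = (1 + 1j * t) * ecf(t, logX)

# Sanity check against the (here known) psi draws.
err = np.abs(cf_log_psi - ecf(t, np.log(psi))).max()
print(f"max deviation from empirical CF of log(psi): {err:.3f}")
```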
If we assume that the moment generating function of $\log(\psi)$ exists and that $t < 1$, we can write the above equation in terms of moment generating functions (substituting $it \mapsto -t$):
$$ M_{\log(\psi)}(-t) = \left(1 - t\right)\,\frac{1}{n}\sum_{k=1}^{n}\exp(-t\,X_k).$$
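As a quick numerical check of this identity, reusing the synthetic $\psi \sim \Gamma(2,1)$ stand-in from the sketch above so that $M_{\log(\psi)}$ can also be estimated directly:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
psi = rng.gamma(shape=2.0, size=n)                  # hypothetical known psi
logX = np.log(rng.uniform(0.0, 1.0, size=n) * psi)  # sample of log(X)

# M_{log psi}(-t) = E[psi^{-t}] should match (1 - t) * (1/n) sum_k exp(-t X_k).
for t in (0.25, 0.5, 0.75):
    lhs = np.mean(psi ** (-t))
    rhs = (1 - t) * np.mean(np.exp(-t * logX))
    print(f"t = {t}: direct {lhs:.3f} vs deconvolved {rhs:.3f}")
```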
It is enough, then, to invert the moment generating function to get the distribution of $\log(\psi)$ and thus that of $\psi$.
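One concrete route for that inversion is the Gil–Pelaez formula, $F(x) = \tfrac{1}{2} - \tfrac{1}{\pi}\int_0^\infty \mathrm{Im}\!\left(e^{-itx}\,\Psi(t)\right)/t \, dt$, applied to the characteristic function above. The sketch below truncates the integral at a modest $T$, since the $(1+it)$ factor amplifies sampling noise at high frequencies; the truncation level and the synthetic $\psi$ are illustrative choices, not part of the original derivation:

```python
import numpy as np
from scipy.integrate import trapezoid

rng = np.random.default_rng(1)
n = 1000
psi = rng.gamma(shape=2.0, size=n)                  # hypothetical known psi
logX = np.log(rng.uniform(0.0, 1.0, size=n) * psi)  # sample of log(X)

def cf_log_psi(t):
    """Deconvolved CF of log(psi): (1 + it) times the empirical CF of log(X)."""
    return (1 + 1j * t) * np.exp(1j * np.outer(t, logX)).mean(axis=1)

def cdf_log_psi(x, T=8.0, m=2000):
    """Gil-Pelaez inversion, truncated at t <= T to damp amplified noise."""
    t = np.linspace(1e-6, T, m)
    integrand = np.imag(np.exp(-1j * t * x) * cf_log_psi(t)) / t
    return 0.5 - trapezoid(integrand, t) / np.pi

# Compare the inverted CDF with the empirical CDF of log(psi).
for x in (-1.0, 0.0, 1.0):
    print(f"x = {x:+.1f}: inverted {cdf_log_psi(x):.3f}, "
          f"empirical {np.mean(np.log(psi) <= x):.3f}")
```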
Best Answer
The problem is ill-posed: there is no unique solution if you don't specify $Z$. Think about it. Pick any $Y$ with any distribution you choose; $X + Y$ will then have some distribution that depends on what you chose for $Y$. Two distinct choices for $Y$ will give two different distributions for $Z$, so there are infinitely many solutions to your problem. (If you do specify $Z$, then $Y$ is determined.)
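A two-line simulation makes the point; the specific choices of $X$ and of the two $Y$'s below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
X = rng.normal(size=n)  # fix any distribution for X

# Two distinct choices of an independent Y give visibly different Z = X + Y.
Z1 = X + rng.normal(size=n)       # Y ~ N(0,1):  Z ~ N(0, 2), symmetric
Z2 = X + rng.exponential(size=n)  # Y ~ Exp(1):  Z is skewed

for name, Z in (("Z1", Z1), ("Z2", Z2)):
    third = np.mean((Z - Z.mean()) ** 3)  # third central moment (skew indicator)
    print(f"{name}: mean {Z.mean():+.3f}, var {Z.var():.3f}, 3rd moment {third:+.3f}")
```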
Given that, your second problem is of course also ill-posed: just knowing that $X$ and $Y$ are independent tells you almost nothing about $Z$. Take $X$ and $Y$ normal, and $Z$ will be normal. If $X$ and $Y$ are identically distributed Cauchys, $Z$ will be Cauchy. Two independent chi-squares lead to $Z$ being chi-square. So there are already three solutions to problem 2, and that is only scratching the surface.
In problem 1 you need to know $Z$'s distribution to determine $Y$'s. In problem 2 you can't get $Z$ without specifying the distributions of $X$ and $Y$.