Solved – Product of two independent random variables

distributions, mathematical-statistics, probability

I have a sample of about 1000 values. These data are obtained as the product of two independent random variables, $\xi \cdot \psi$. The first random variable has a uniform distribution, $\xi \sim U(0,1)$. The distribution of the second random variable, $\psi$, is not known. How can I estimate the distribution of $\psi$?

Best Answer

Assuming $\psi$ has support on the positive real line, we have $$\xi\,\psi = X,$$ where the distribution of $X$ is estimated by $F_n$, the empirical distribution of the data.
Taking the logarithm of both sides, we get

$$ \log(\xi) + \log(\psi) = \log(X). $$

Thus, by independence of $\xi$ and $\psi$ (and, for the empirical approximation below, Lévy's continuity theorem), the characteristic functions satisfy:

$$ \Psi_{\log(\xi)}(t)\,\Psi_{\log(\psi)}(t) = \Psi_{\log(X)}(t). $$

Now, $\xi \sim \mathrm{Unif}[0,1]$, therefore $-\log(\xi) \sim \mathrm{Exp}(1)$.
Thus, $$\Psi_{\log(\xi)}(t) = \left(1 + it\right)^{-1}.$$
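As a quick numerical sanity check of this step (a minimal sketch; the simulation and the use of a Kolmogorov–Smirnov test are my own illustration, not part of the original argument):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
xi = rng.uniform(0.0, 1.0, size=100_000)  # xi ~ Unif[0, 1]
y = -np.log(xi)                           # claim: y ~ Exp(1)

# Kolmogorov-Smirnov test against the Exp(1) CDF; a large p-value
# is consistent with -log(xi) being standard exponential.
print(stats.kstest(y, stats.expon(scale=1.0).cdf))
```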

The characteristic function of $\log(X)$ is estimated by its empirical counterpart, $$\Psi_{\log(X)}(t) \approx \frac{1}{n}\sum_{k=1}^{n}\exp\left(it\,\log X_k\right),$$ with $X_1, \dots, X_n$ the observed sample and $n = 1000$.

We can now estimate the distribution of $\log(\psi)$ completely through its characteristic function:

$$ \left(1 + it\right)^{-1}\,\Psi_{\log(\psi)}(t) = \frac{1}{n}\sum_{k=1}^{n}\exp\left(it\,\log X_k\right), $$

so that $\Psi_{\log(\psi)}(t) = (1+it)\,\frac{1}{n}\sum_{k=1}^{n}\exp\left(it\,\log X_k\right)$.
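In code, this deconvolution step is just the empirical characteristic function of $\log X$ multiplied by $(1+it)$ on a grid of $t$ values. A minimal sketch (the lognormal $\psi$ used to fake data and the grid limits are illustrative assumptions, since the true $\psi$ is unknown):

```python
import numpy as np

def ecf(y, t):
    """Empirical characteristic function of sample y evaluated at points t."""
    # Rows index t, columns index observations; average over observations.
    return np.exp(1j * np.outer(t, y)).mean(axis=1)

# Simulated stand-in for the observed sample of xi * psi.
rng = np.random.default_rng(1)
n = 1000
x = rng.uniform(size=n) * rng.lognormal(mean=0.0, sigma=0.5, size=n)

t_grid = np.linspace(-20.0, 20.0, 401)

# Psi_{log(psi)}(t) = (1 + i t) * Psi_{log(X)}(t)
cf_log_psi = (1 + 1j * t_grid) * ecf(np.log(x), t_grid)
```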

If we assume that the moment generating function of $\log(\psi)$ exists, we can write the same identity in terms of moment generating functions (formally replacing $it$ by $t$); since $M_{\log(\xi)}(t) = E[\xi^{t}] = (1+t)^{-1}$ for $t > -1$, this gives

$$ M_{\log(\psi)}(t) = (1+t)\,\frac{1}{n}\sum_{k=1}^{n}\exp\left(t\,\log X_k\right), \qquad t > -1. $$
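Evaluating this identity at integer $t = m$ gives moment estimates directly: $E[\psi^m] = (1+m)\,E[X^m]$, with $E[X^m]$ replaced by the sample mean. A short check against a known $\psi$ (the lognormal choice is an assumption for testing only):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
psi = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # "unknown" psi, for checking
x = rng.uniform(size=n) * psi                     # what we actually observe

# E[psi^m] = (1 + m) * E[X^m], since E[xi^m] = 1 / (1 + m).
for m in (1, 2, 3):
    estimate = (1 + m) * np.mean(x**m)
    exact = np.exp(0.5 * (m * 0.5) ** 2)          # lognormal(0, 0.5) moment
    print(f"m={m}: estimated {estimate:.3f}, exact {exact:.3f}")
```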

It then suffices to invert the moment generating function (or, equivalently, the characteristic function above) to obtain the distribution of $\log(\psi)$, and hence that of $\psi$.
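In practice it is usually easier to invert the characteristic function numerically than the MGF. A sketch of one standard route, not prescribed by the answer above: Fourier inversion with a spectral cutoff $|t| \le t_{\max}$ as regularization. The cutoff value and the simulated data are my assumptions, and the cutoff must be tuned, since multiplying the noisy empirical CF by $(1+it)$ amplifies high-frequency error.

```python
import numpy as np

def log_psi_density(log_x, y_grid, t_max=15.0, n_t=2001):
    """Estimate the density of log(psi) on y_grid by Fourier inversion
    of (1 + i t) * ECF(log X), truncated at |t| <= t_max."""
    t = np.linspace(-t_max, t_max, n_t)
    dt = t[1] - t[0]
    ecf = np.exp(1j * np.outer(t, log_x)).mean(axis=1)
    cf = (1 + 1j * t) * ecf
    # f(y) = (1 / 2 pi) * integral of exp(-i t y) * cf(t) dt  (Riemann sum).
    f = (np.exp(-1j * np.outer(y_grid, t)) * cf).sum(axis=1).real * dt / (2 * np.pi)
    return np.clip(f, 0.0, None)  # clip small negative ripples from truncation

rng = np.random.default_rng(3)
n = 1000
x = rng.uniform(size=n) * rng.lognormal(mean=0.0, sigma=0.5, size=n)

y = np.linspace(-3.0, 3.0, 200)
f_log_psi = log_psi_density(np.log(x), y)
# Density of psi itself by the change of variables s = e^y:
#   f_psi(s) = f_log_psi(log s) / s.
```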