Well, $X_1 = 1$ only when $X_2 = X_3 = X_4 = 1$ and is $0$ otherwise, therefore
$$E(X_1) = P(X_2 = 1, X_3 = 1, X_4 = 1)$$
As @leonbloy mentions, knowledge of the correlations and marginal success probabilities is not sufficient for calculating $E(X_1)$, but it can be written in terms of the conditional probabilities; using the definition of conditional probability,
$$ E(X_1) = P(X_2 = 1, X_3 = 1 | X_4 = 1) \cdot P(X_4 = 1) $$
and $P(X_2 = 1, X_3 = 1 | X_4 = 1)$ can be similarly decomposed as
$$ P(X_2 = 1 | X_3 = 1, X_4 = 1) \cdot P(X_3 = 1 | X_4 = 1) $$
implying
$$E(X_1) = P(X_2 = 1 | X_3 =1, X_4 = 1) \cdot P(X_3 =1 | X_4 = 1) \cdot P(X_4 = 1)$$
Explicit calculation of $E(X_1)$ will require more information about the joint distribution of $(X_2,X_3,X_4)$. The above expression makes sense intuitively: the probability that three dependent Bernoulli trials all succeed is the probability that the first succeeds, times the probability that the second succeeds given the first, times the probability that the third succeeds given the first two. You could equivalently interchange the roles of $X_2, X_3, X_4$.
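To see the factorization numerically, here is a minimal Monte Carlo sketch; the particular joint distribution of $(X_2, X_3, X_4)$ below is made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# A made-up dependent joint distribution for (X2, X3, X4):
# X4 ~ Bernoulli(0.6), X3 | X4 ~ Bernoulli(0.3 + 0.4*X4),
# X2 | X3, X4 ~ Bernoulli(0.2 + 0.3*X3 + 0.2*X4).
x4 = rng.random(n) < 0.6
x3 = rng.random(n) < 0.3 + 0.4 * x4
x2 = rng.random(n) < 0.2 + 0.3 * x3 + 0.2 * x4
x1 = x2 & x3 & x4  # X1 = 1 iff all three succeed

# Left side: E(X1) estimated directly.
lhs = x1.mean()

# Right side: the chain-rule factorization
# P(X2=1 | X3=1, X4=1) * P(X3=1 | X4=1) * P(X4=1).
rhs = x2[x3 & x4].mean() * x3[x4].mean() * x4.mean()

print(lhs, rhs)  # the two agree up to Monte Carlo error
```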
Assuming $\psi$ has support on the positive real line, we have
$$\xi \,\psi = X$$
where $X \sim F_n$ and $F_n$ is the empirical distribution of the data.
Taking the log of this equation, we get
$$ \log(\xi) + \log(\psi) = \log(X). $$
Thus, by the independence of $\xi$ and $\psi$ (with Lévy's continuity theorem justifying the passage to the empirical distribution), taking characteristic functions gives
$$ \Psi_{\log(\xi)}(t)\,\Psi_{\log(\psi)}(t) = \Psi_{\log(X)}(t). $$
Now $\xi \sim \mathrm{Unif}[0,1]$, therefore $-\log(\xi) \sim \mathrm{Exp}(1)$. Thus
$$ \Psi_{\log(\xi)}(t) = \left(1 + it\right)^{-1}. $$
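As a quick sanity check of this formula (a throwaway sketch assuming only $\xi \sim \mathrm{Unif}[0,1]$), one can compare the empirical characteristic function of $\log(\xi)$ with $(1+it)^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(1)
xi = rng.random(500_000)  # xi ~ Unif[0, 1]

for t in (0.5, 1.0, 2.0):
    empirical = np.mean(np.exp(1j * t * np.log(xi)))   # E[exp(i t log(xi))]
    closed_form = 1.0 / (1.0 + 1j * t)                  # (1 + it)^{-1}
    print(t, empirical, closed_form)  # should agree to roughly 1/sqrt(500000)
```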
Given that
$$ \Psi_{\log(X)}(t) = \frac{1}{n}\sum_{k=1}^{n}\exp(i t X_k), \qquad n = 1000, $$
with $X_1, \dots, X_n$ the random sample of $\log(X)$ (the log-transformed observations), we can now completely specify the distribution of $\log(\psi)$ through its characteristic function:
$$ \left(1 + it\right)^{-1}\,\Psi_{\log(\psi)}(t) = \frac{1}{n}\sum_{k=1}^{n}\exp(i t X_k),
\qquad\text{i.e.}\qquad
\Psi_{\log(\psi)}(t) = \left(1 + it\right)\,\frac{1}{n}\sum_{k=1}^{n}\exp(i t X_k). $$
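To make this step concrete, here is a minimal sketch of the deconvolution; the data-generating choices below (a normal $\log(\psi)$, $n = 1000$) are assumptions purely for illustration:

```python
import numpy as np

def ecf(samples, t):
    """Empirical characteristic function (1/n) * sum_k exp(i*t*samples_k)."""
    return np.mean(np.exp(1j * np.outer(t, samples)), axis=1)

rng = np.random.default_rng(2)
n = 1000
log_psi = rng.normal(0.0, 0.5, n)   # assumed "true" log(psi), illustration only
log_xi = np.log(rng.random(n))      # log of Unif[0,1] draws
x_k = log_psi + log_xi              # the X_k: observed sample of log(X)

t = np.linspace(-5, 5, 201)
psi_hat = (1.0 + 1j * t) * ecf(x_k, t)  # solve for Psi_log(psi)(t)

# Compare with the exact cf of the assumed N(0, 0.25) truth, exp(-t^2/8):
print(np.max(np.abs(psi_hat - np.exp(-0.125 * t**2))))
# Agreement up to sampling error; the (1 + it) factor amplifies noise at large |t|.
```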
If we assume that the moment generating function of $\log(\psi)$ exists and that $t > -1$ (so that $M_{\log(\xi)}(t) = E[\xi^{t}] = (1+t)^{-1}$ is finite), we can write the above equation in terms of moment generating functions:
$$ M_{\log(\psi)}(t) = \left(1 + t\right)\,\frac{1}{n}\sum_{k=1}^{n}\exp(t\,X_k). $$
It is then enough to invert the moment generating function to get the distribution of $\log(\psi)$, and thus that of $\psi$.
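The inversion is left abstract above; one practical route (a swapped-in numerical technique, not necessarily what the author intends) is to invert the characteristic function via the Gil-Pelaez formula rather than the MGF. A sketch, sanity-checked on a standard normal where the answer is known:

```python
import numpy as np
from scipy.integrate import quad

def cdf_from_cf(cf, x, upper=60.0):
    """Gil-Pelaez inversion: F(x) = 1/2 - (1/pi) * int_0^inf Im(e^{-itx} cf(t))/t dt."""
    integrand = lambda t: (np.exp(-1j * t * x) * cf(t)).imag / t
    val, _ = quad(integrand, 1e-10, upper, limit=400)
    return 0.5 - val / np.pi

# Known case: standard normal, cf(t) = exp(-t^2 / 2).
print(cdf_from_cf(lambda t: np.exp(-t**2 / 2), 0.0))   # ~0.5
print(cdf_from_cf(lambda t: np.exp(-t**2 / 2), 1.96))  # ~0.975
```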
Assuming that you only observe the variable $y$, the probabilities $p_1$ and $p_2$ are not identifiable in your model; only the product $p = p_1 \cdot p_2$ is identifiable. This means that the data contain no information with which to estimate $p_1$ and $p_2$ separately, however large the sample.
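To illustrate, here is a minimal sketch assuming the model is $y$ = the product of two independent Bernoulli trials (an assumption, since the model itself is not shown here): two parameter pairs with the same product generate indistinguishable data.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_y(p1, p2, n):
    # Hypothetical model: y = Bernoulli(p1) * Bernoulli(p2), so P(y=1) = p1*p2.
    return (rng.random(n) < p1) & (rng.random(n) < p2)

n = 200_000
y_a = simulate_y(0.8, 0.5, n)  # p1*p2 = 0.4
y_b = simulate_y(0.5, 0.8, n)  # same product, different (p1, p2)

print(y_a.mean(), y_b.mean())  # both estimate p = 0.4; (p1, p2) cannot be separated
```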