Solved – Expectation of a product of multiple random (Bernoulli) variables

bernoulli-distribution, correlation, covariance, expected-value, random-variable

I'm doing research and using random variables to model a random process. I'm defining a Bernoulli random variable as a product of several other Bernoulli variables (three or more). So, I have the following expectation:

$E[X_1]=E[X_2X_3X_4]$

I cannot assume the independence of the variables $X_i$ for any $i$. The issue is to find a correct formula for the expectation that accounts for the correlation among the variables $X_2$ to $X_4$. I know that for a product of two variables I can write $E[X_2X_3]=E[X_2]E[X_3] + \operatorname{Cov}(X_2,X_3)$, but this covariance becomes a matrix in the case of three or more variables. How, then, would one compute a value for $E[X_1]$?
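As a quick numerical check of the two-variable identity $E[X_2X_3]=E[X_2]E[X_3] + \operatorname{Cov}(X_2,X_3)$, here is a sketch using one hypothetical joint distribution for a dependent Bernoulli pair (the probabilities below are made up for illustration):

```python
# Hypothetical joint pmf for a dependent Bernoulli pair (X2, X3):
# p[(x2, x3)] = P(X2 = x2, X3 = x3); values chosen arbitrarily, summing to 1.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

E2 = sum(x2 * pr for (x2, x3), pr in p.items())        # E[X2] = 0.5
E3 = sum(x3 * pr for (x2, x3), pr in p.items())        # E[X3] = 0.4
E23 = sum(x2 * x3 * pr for (x2, x3), pr in p.items())  # E[X2 X3] = 0.3

cov = E23 - E2 * E3  # Cov(X2, X3) = 0.1, nonzero: the pair is dependent

# Identity: E[X2 X3] = E[X2] E[X3] + Cov(X2, X3)
assert abs(E23 - (E2 * E3 + cov)) < 1e-12
# Note E[X2] E[X3] = 0.2 alone would be wrong here; the covariance term matters.
```

The same enumeration over the joint pmf is what the answer below relies on: with dependence, marginals alone do not determine the product's expectation.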

I'd appreciate it if somebody could help with this or point me to a good resource (e.g., a book).

Best Answer

Well, $X_1 = 1$ only when $X_2 = X_3 = X_4 = 1$, and $X_1 = 0$ otherwise; therefore

$$E(X_1) = P(X_2 = 1, X_3 = 1, X_4 = 1)$$

As @leonbloy mentions, knowledge of the correlations and marginal success probabilities is not sufficient for calculating $E(X_1)$, but it can be written in terms of the conditional probabilities; using the definition of conditional probability,

$$ E(X_1) = P(X_2 = 1, X_3 = 1 | X_4 = 1) \cdot P(X_4 = 1) $$

and $P(X_2 = 1, X_3 = 1 | X_4 = 1)$ can be similarly decomposed as

$$ P(X_2 = 1 | X_3 = 1, X_4 = 1) \cdot P(X_3 = 1 | X_4 = 1) $$

implying

$$E(X_1) = P(X_2 = 1 | X_3 =1, X_4 = 1) \cdot P(X_3 =1 | X_4 = 1) \cdot P(X_4 = 1)$$

Explicit calculation of $E(X_1)$ will require more information about the joint distribution of $(X_2,X_3,X_4)$. The above expression makes sense intuitively: the probability that three dependent Bernoulli trials are all successes is the probability that the first is a success, times the probability that the second is a success given the first, times the probability that the third is a success given the first two. You could equivalently interchange the roles of $X_2, X_3, X_4$.
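To illustrate, here is a sketch that verifies the chain-rule decomposition against direct enumeration, using one hypothetical joint pmf over $(X_2,X_3,X_4)$ (the eight probabilities are invented for the example):

```python
# Hypothetical joint pmf p[(x2, x3, x4)] = P(X2=x2, X3=x3, X4=x4);
# values chosen arbitrarily, summing to 1, with the variables dependent.
p = {
    (0, 0, 0): 0.15, (0, 0, 1): 0.05,
    (0, 1, 0): 0.10, (0, 1, 1): 0.10,
    (1, 0, 0): 0.05, (1, 0, 1): 0.15,
    (1, 1, 0): 0.10, (1, 1, 1): 0.30,
}

# Direct: E[X1] = E[X2 X3 X4] = P(X2 = 1, X3 = 1, X4 = 1)
E_X1 = sum(x2 * x3 * x4 * pr for (x2, x3, x4), pr in p.items())

# Chain rule: P(X2=1 | X3=1, X4=1) * P(X3=1 | X4=1) * P(X4=1)
P4 = sum(pr for (x2, x3, x4), pr in p.items() if x4 == 1)
P34 = sum(pr for (x2, x3, x4), pr in p.items() if x3 == 1 and x4 == 1)
P234 = p[(1, 1, 1)]
chain = (P234 / P34) * (P34 / P4) * P4

assert abs(E_X1 - chain) < 1e-12  # both routes give the same value
```

The decomposition needs the full joint pmf (or equivalently the conditional probabilities), which is exactly the "more information" the answer refers to; correlations and marginals alone would not pin these numbers down.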