Probability – Analyzing Conjugate Mixture of Beta Distributions and Weights

bayesian, probability

Suppose that my prior for the bias of a coin is split evenly between a $\mathcal{Beta}(20,10)$ distribution and a $\mathcal{Beta}(20,20)$ distribution, where $\theta$ denotes the probability of tossing heads. Hence my prior is:
$$
0.5\times\mathcal{Beta}(20,10)+0.5\times\mathcal{Beta}(20,20)
$$
Suppose I then observe 20 heads in 30 coin tosses. I know that the updated beta components are $\mathcal{Beta}(40,20)$ and $\mathcal{Beta}(40,30)$, and that the new posterior has the form:
$$
\lambda\times\mathcal{Beta}(40,20)+(1-\lambda)\times\mathcal{Beta}(40,30)
$$
However, I was hoping someone could help me understand how to calculate $\lambda$ (i.e., the weights of the updated components).
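For concreteness, here is a minimal sketch of what I mean by the component updates (using SciPy's `beta`; the variable names are just for illustration). The missing piece is the updated mixture weight:

```python
from scipy.stats import beta

# Prior mixture: 0.5 * Beta(20, 10) + 0.5 * Beta(20, 20)
heads, tails = 20, 10  # 20 heads observed in 30 tosses

# Conjugate update of each component: Beta(a, b) -> Beta(a + heads, b + tails)
post_1 = beta(20 + heads, 10 + tails)  # Beta(40, 20)
post_2 = beta(20 + heads, 20 + tails)  # Beta(40, 30)

print(post_1.mean(), post_2.mean())    # posterior means of the two components

# Posterior: lambda * Beta(40, 20) + (1 - lambda) * Beta(40, 30)
# ... but how do I compute lambda?
```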

Best Answer

Assuming you meant a binomial likelihood,

$$
\begin{eqnarray*}
\text{Posterior}(\theta) & \propto & \text{Likelihood}(\theta) \times \text{Prior}(\theta) \\ \\
& = & \text{Binomial}(20 \mid 30, \theta) \times \bigg[ \lambda \times \text{Beta}(\theta \mid 20,10) + (1-\lambda) \times \text{Beta}(\theta \mid 20, 20) \bigg] \\ \\
& = & \lambda \times \text{Binomial}(20 \mid 30, \theta) \times \text{Beta}(\theta \mid 20,10) \\[8pt]
&& + \, (1 - \lambda) \times \text{Binomial}(20 \mid 30, \theta) \times \text{Beta}(\theta \mid 20,20) \\ \\
& = & \lambda { 30 \choose 20} \frac{1}{\text{B}(20, 10)} \theta^{40 - 1} (1-\theta)^{20-1} \\[8pt]
&& + \, (1-\lambda) {30 \choose 20} \frac{1}{\text{B}(20,20)} \theta^{40-1} (1-\theta)^{30-1} \\ \\
& = & \lambda { 30 \choose 20} \frac{\text{B}(40,20)}{\text{B}(20, 10)} \, \frac{\theta^{40 - 1} (1-\theta)^{20-1}}{\text{B}(40,20)} \\[8pt]
&& + \, (1-\lambda) {30 \choose 20} \frac{\text{B}(40,30)}{\text{B}(20,20)} \, \frac{\theta^{40-1} (1-\theta)^{30-1}}{\text{B}(40,30)} \\ \\
& = & \lambda { 30 \choose 20} \frac{ \text{B}(40,20)}{\text{B}(20, 10)} \text{Beta}(\theta \mid 40,20) \\[8pt]
&& + \, (1- \lambda) { 30 \choose 20} \frac{\text{B}(40,30)}{\text{B}(20, 20)} \text{Beta}(\theta \mid 40,30) \\ \\
& \propto & \lambda \frac{ \text{B}(40,20)}{\text{B}(20, 10)} \text{Beta}(\theta \mid 40,20) \\[8pt]
&& + \, (1- \lambda) \frac{\text{B}(40,30)}{\text{B}(20, 20)} \text{Beta}(\theta \mid 40,30).
\end{eqnarray*}
$$

Each unnormalized weight is therefore the prior weight times the marginal likelihood of the data under the corresponding component (up to the common factor ${30 \choose 20}$, which cancels). Normalizing, the new weights $\omega_1, \omega_2$ are

$$
\begin{eqnarray*}
\omega_1 & = & \left( \lambda \frac{ \text{B}(40,20)}{\text{B}(20, 10)} \right) \left( \lambda \frac{ \text{B}(40,20)}{\text{B}(20, 10)} + (1- \lambda) \frac{\text{B}(40,30)}{\text{B}(20, 20)} \right)^{-1} \\
\omega_2 & = & 1 - \omega_1,
\end{eqnarray*}
$$

and

$$
\text{Posterior}(\theta) = \omega_1 \times \text{Beta}(\theta \mid 40,20) + \omega_2 \times \text{Beta}(\theta \mid 40,30).
$$

With the even prior split in your question, $\lambda = 0.5$.
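As a numerical sanity check, here is a minimal sketch (assuming NumPy and SciPy are available; `scipy.special.betaln` returns $\log \text{B}(a,b)$) that evaluates $\omega_1$ and $\omega_2$ in log space so the Beta functions don't underflow:

```python
import numpy as np
from scipy.special import betaln  # betaln(a, b) = log B(a, b)

lam = 0.5                      # prior mixture weight from the question
priors = [(20, 10), (20, 20)]  # (alpha, beta) of the two prior components
heads, tails = 20, 10          # 20 heads in 30 tosses

# Unnormalized posterior weight of each component:
#   prior weight * B(alpha + heads, beta + tails) / B(alpha, beta),
# computed in log space for numerical stability.
log_w = np.array([
    np.log(w) + betaln(a + heads, b + tails) - betaln(a, b)
    for w, (a, b) in zip([lam, 1 - lam], priors)
])
omega = np.exp(log_w - log_w.max())
omega /= omega.sum()

print(omega)  # omega[0] = weight on Beta(40, 20), omega[1] = weight on Beta(40, 30)
```

Since the observed frequency $20/30$ coincides with the prior mean of the $\text{Beta}(20,10)$ component, $\omega_1$ should come out noticeably larger than $0.5$.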