Find Bayes Estimator Under Mixture Normal Distributions

bayes-theorem, bayesian, probability, statistics

Consider the following model:

$$\mathbf{X}=(X_1,…,X_p)|\theta \sim (1-\theta)N_p (\mathbf{0},\mathbf{\Sigma})+\theta N_p (\mathbf{\delta}, \mathbf{\Sigma})$$

$$\theta \sim \textrm{Bernoulli}(1-\pi)$$

for some known $\mathbf{\delta}$, $\mathbf{\Sigma}$ and $\pi$. Show that the Bayes estimator of $\theta$ is given by

$$\frac{(1-\pi) e^{\alpha + \sum_{i=1}^p \beta_i X_i}}{\pi + (1-\pi) e^{\alpha + \sum_{i=1}^p \beta_i X_i}}$$

for some constants $\alpha$ and $\beta_i, i=1,…,p$. Give formulas for $\alpha$ and $\beta_i$.

My idea:

The posterior distribution of $\theta$ given the data $\mathbf{X}$ is proportional to the product of the likelihood and the prior:

$$\pi(\theta | \mathbf{X}) \propto f(\mathbf{X} | \theta) \pi(\theta),$$

where
$$f(\mathbf{x} | \theta) = (1-\theta)\,\phi_p (\mathbf{x},\mathbf{0},\mathbf{\Sigma})+\theta\, \phi_p(\mathbf{x},\mathbf{\delta}, \mathbf{\Sigma})$$
is the likelihood (with $\phi_p$ the multivariate normal density), and
$$\pi(\theta) = (1-\pi)^\theta \pi^{1-\theta}, \qquad \theta\in\{0,1\},$$
is the prior pmf.

Now, I can calculate the posterior distribution:

$$\pi(\theta | \mathbf{x}) \propto \left[(1-\theta)\,\phi_p (\mathbf{x},\mathbf{0},\mathbf{\Sigma})+\theta\, \phi_p (\mathbf{x},\mathbf{\delta}, \mathbf{\Sigma})\right] (1-\pi)^\theta \pi^{1-\theta}.$$

I don't know how to proceed from here.
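For concreteness, this is how I would evaluate the two unnormalized posterior weights numerically (a rough Python sketch with made-up values for $\mathbf{\delta}$, $\mathbf{\Sigma}$, $\pi$ and $\mathbf{x}$, not part of the problem), but I don't see how to turn this into the stated closed form:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical example values, just to make the setup concrete
p = 3
rng = np.random.default_rng(0)
delta = np.array([1.0, -0.5, 2.0])
Sigma = np.eye(p)
pi = 0.3
x = rng.normal(size=p)

# Likelihood f(x | theta) for theta in {0, 1}
f0 = multivariate_normal(mean=np.zeros(p), cov=Sigma).pdf(x)  # theta = 0
f1 = multivariate_normal(mean=delta, cov=Sigma).pdf(x)        # theta = 1

# Unnormalized posterior weights f(x | theta) * pi(theta)
w0 = f0 * pi         # prior P(theta = 0) = pi
w1 = f1 * (1 - pi)   # prior P(theta = 1) = 1 - pi

# Normalizing over theta in {0, 1} gives the posterior probabilities
posterior_theta1 = w1 / (w0 + w1)
print(posterior_theta1)
```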

Best Answer

The conditional characteristic function of $X$ is
$$E[e^{i\xi'X}\mid\theta]=(1-\theta)e^{-\xi'\Sigma \xi/2}+\theta e^{i\xi'\delta-\xi'\Sigma \xi/2},\qquad \xi \in \mathbb{R}^p,$$
so
$$E[e^{i\xi'X}]=(1-E[\theta])e^{-\xi'\Sigma \xi/2}+E[\theta] e^{i\xi'\delta-\xi'\Sigma \xi/2}=\pi e^{-\xi'\Sigma \xi/2}+(1-\pi)e^{i\xi'\delta-\xi'\Sigma \xi/2},$$
and therefore $X\sim \pi \mathcal{N}_p(0,\Sigma)+(1-\pi)\mathcal{N}_p(\delta,\Sigma)$.

Hence
$$P(\theta=t\mid X=x)=\frac{(1-t) \phi_p(x,0,\Sigma)+t\,\phi_p(x,\delta,\Sigma)}{\pi \phi_p(x,0,\Sigma)+(1-\pi)\phi_p(x,\delta,\Sigma)}\,(1-\pi)^t\pi^{1-t},\qquad t\in\{0,1\},$$
where $\phi_p(x,\mu,\Sigma)$ is the multivariate normal pdf. Since $\theta$ is binary, the posterior mean is $E[\theta\mid X=x]=P(\theta=1\mid X=x)$, so
$$E[\theta\mid X=x]=\frac{(1-\pi)\dfrac{\phi_p(x,\delta,\Sigma)}{\phi_p(x,0,\Sigma)}}{\pi +(1-\pi)\dfrac{\phi_p(x,\delta,\Sigma)}{\phi_p(x,0,\Sigma)}}.$$

It remains to compute
$$\begin{aligned}2\ln\frac{\phi_p(x,\delta,\Sigma)}{\phi_p(x,0,\Sigma)}&=-(x-\delta)'\Sigma^{-1}(x-\delta)+x'\Sigma^{-1}x\\ &=-\delta'\Sigma^{-1} \delta+2\delta'\Sigma^{-1}x.\end{aligned}$$

We conclude that
$$E[\theta\mid X=x]=\frac{(1-\pi)e^{-\delta'\Sigma^{-1} \delta/2+\delta'\Sigma^{-1}x}}{\pi +(1-\pi)e^{-\delta'\Sigma^{-1} \delta/2+\delta'\Sigma^{-1}x}},$$
so that $\alpha = -\delta'\Sigma^{-1} \delta/2$ and $\beta'=\delta'\Sigma^{-1}$, i.e. $\beta_i=(\Sigma^{-1}\delta)_i$.
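As a quick numerical sanity check (not part of the derivation), here is a short Python sketch with arbitrary made-up values for $\delta$, $\Sigma$, $\pi$ and $x$, comparing the posterior probability computed directly from the mixture densities with the closed form using $\alpha=-\delta'\Sigma^{-1}\delta/2$ and $\beta=\Sigma^{-1}\delta$:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical parameters just for the check
p = 3
rng = np.random.default_rng(1)
delta = rng.normal(size=p)
A = rng.normal(size=(p, p))
Sigma = A @ A.T + p * np.eye(p)   # a positive definite covariance matrix
pi = 0.4
x = rng.normal(size=p)

# Direct posterior probability P(theta = 1 | X = x) from the densities
f0 = multivariate_normal(mean=np.zeros(p), cov=Sigma).pdf(x)
f1 = multivariate_normal(mean=delta, cov=Sigma).pdf(x)
direct = (1 - pi) * f1 / (pi * f0 + (1 - pi) * f1)

# Closed form with alpha = -delta' Sigma^{-1} delta / 2 and beta = Sigma^{-1} delta
Sigma_inv = np.linalg.inv(Sigma)
alpha = -0.5 * delta @ Sigma_inv @ delta
beta = Sigma_inv @ delta
closed = (1 - pi) * np.exp(alpha + beta @ x) / (pi + (1 - pi) * np.exp(alpha + beta @ x))

print(np.isclose(direct, closed))   # expect True
```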
