[Math] How to derive mean and variance for a Bayes estimator

Tags: bayesian, probability, probability-distributions, statistical-inference, statistics

Let $X_1,\dots,X_n$ be iid $\mathcal{N}\left(\theta, \sigma^2\right)$, where the variance $\sigma^2$ is known. Also, suppose the prior distribution is $\theta \sim \mathcal{N}\left(\theta_0,\frac{\sigma^2}{\kappa_0}\right)$.

The Bayes estimator will be the posterior mean $E[\theta | \mathbf{X}]$, where in this case
$$\pi (\theta | \mathbf{X}) \propto_\theta p(\mathbf{X} | \theta) \pi(\theta)$$

Supposedly $\pi(\theta | \mathbf{X})$ is proportional to the kernel of a normal density with mean

$$\frac{\kappa_0 \theta + n\bar{X}}{\kappa_0 + n}$$ and variance $$\frac{\sigma^2}{\kappa_0+ n}$$

My Question:

How does one derive the above mean and variance?

Best Answer

I get a slightly different value for the posterior mean of $\theta$: the $\kappa_0\theta$ in your numerator should be $\kappa_0\theta_0$.

You have:

$$\begin{aligned}
\pi(\theta \mid \mathbf{X}) &\propto_\theta p(\mathbf{X} \mid \theta)\,\pi(\theta) = \frac{1}{\left(\sigma\sqrt{2\pi}\right)^n}\, e^{-\sum_i \frac{(X_i-\theta)^2}{2\sigma^2}}\;\frac{1}{\sigma\sqrt{2\pi/\kappa_0}}\, e^{-\frac{(\theta-\theta_0)^2}{2\sigma^2/\kappa_0}} \\
&\propto \exp\left(-\frac{\sum_i X_i^2 - 2\sum_i X_i\,\theta + n\theta^2 + \kappa_0\theta^2 - 2\kappa_0\theta_0\theta + \kappa_0\theta_0^2}{2\sigma^2}\right) \\
&\propto \exp\left(-\frac{(\kappa_0+n)\theta^2 - 2\left(\kappa_0\theta_0 + \sum_i X_i\right)\theta}{2\sigma^2}\right) \\
&\propto \exp\left(-\dfrac{\left(\theta - \dfrac{\kappa_0\theta_0 + \sum_i X_i}{\kappa_0+n}\right)^2}{2\dfrac{\sigma^2}{\kappa_0+n}}\right)
\end{aligned}$$
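The step from the third to the fourth line is completing the square in $\theta$:

$$(\kappa_0+n)\theta^2 - 2\left(\kappa_0\theta_0 + \sum_i X_i\right)\theta = (\kappa_0+n)\left(\theta - \frac{\kappa_0\theta_0 + \sum_i X_i}{\kappa_0+n}\right)^2 - \frac{\left(\kappa_0\theta_0 + \sum_i X_i\right)^2}{\kappa_0+n}$$

The last term does not depend on $\theta$, so $e^{\text{(that term)}/2\sigma^2}$ is a constant factor absorbed into the proportionality.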

which, using $\overline{\mathbf{X}} =\frac1n \sum_i X_i$, is proportional to the density of a normal distribution with mean $ \dfrac{ \kappa_0\theta_0+n\overline{\mathbf{X}} }{\kappa_0+n} $ and variance $\dfrac{\sigma^2}{\kappa_0+n}$
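If you want a sanity check, a short numerical experiment can confirm the closed form: evaluate the unnormalized posterior on a dense grid and compare its mean and variance to the formulas above. The parameter values below ($\sigma = 2$, $\theta_0 = 1$, $\kappa_0 = 5$, $n = 50$) are arbitrary choices for illustration.

```python
import numpy as np

# Arbitrary illustrative values for the model parameters.
sigma, theta0, kappa0 = 2.0, 1.0, 5.0
n, theta_true = 50, 3.0
rng = np.random.default_rng(0)
X = rng.normal(theta_true, sigma, size=n)
Xbar = X.mean()

# Closed-form posterior mean and variance from the derivation above.
post_mean = (kappa0 * theta0 + n * Xbar) / (kappa0 + n)
post_var = sigma**2 / (kappa0 + n)

# Numerical check: unnormalized log-posterior on a grid, then
# mean and variance by (Riemann-sum) quadrature.
grid = np.linspace(post_mean - 10, post_mean + 10, 200_001)
log_post = (-((X[:, None] - grid) ** 2).sum(axis=0) / (2 * sigma**2)
            - kappa0 * (grid - theta0) ** 2 / (2 * sigma**2))
w = np.exp(log_post - log_post.max())   # stabilize before exponentiating
w /= w.sum()                            # normalize grid weights
num_mean = (w * grid).sum()
num_var = (w * (grid - num_mean) ** 2).sum()

print(post_mean, num_mean)  # the two should agree closely
print(post_var, num_var)
```

The grid-based moments match the conjugate-normal formulas to several decimal places, which is a quick way to catch algebra slips in derivations like this one.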