Posterior Distribution – When is Using a Loss Function Necessary?

Tags: bayesian, loss-functions, posterior

Let $X_i\sim Poisson(\theta)$ for $i = 1, 2, \dots, n$, and let $\theta \sim Gamma(\alpha, \beta)$ be the conjugate prior (shape $\alpha$, rate $\beta$).

It is easy to show that the posterior is
$$
\theta \mid \textbf{x} \sim Gamma(n\bar{x} + \alpha,\ \beta+n).
$$

If we already know how the posterior distribution behaves, do we need to use a loss function to estimate the parameter?
Can't we use
$$
\hat{\theta} = \frac{n\bar{x} + \alpha}{\beta+n},
$$

the mean of the posterior distribution, as the estimator for $\theta$?
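As a concrete sketch of the setup, the posterior above can be formed directly with `scipy.stats.gamma`. The hyperparameter values and the data below are illustrative assumptions, not part of the question; note that SciPy's gamma takes `scale = 1/rate`.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Assumed prior hyperparameters (illustrative values only)
alpha, beta = 2.0, 1.0

# Simulated Poisson data with an assumed true theta of 3
x = rng.poisson(3.0, size=50)
n, xbar = len(x), x.mean()

# Posterior: Gamma(n*xbar + alpha, beta + n) in (shape, rate) form
post_shape = n * xbar + alpha
post_rate = beta + n

# SciPy parameterizes gamma with scale = 1/rate
posterior = stats.gamma(a=post_shape, scale=1.0 / post_rate)

post_mean = posterior.mean()      # equals (n*xbar + alpha) / (beta + n)
post_median = posterior.median()  # an alternative point summary
```

The whole distribution `posterior` is the Bayesian answer; `post_mean` and `post_median` are just two ways of summarizing it.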

Best Answer

do we need to use a loss function to estimate the parameter?

An estimate is, in essence, a kind of summary. By choosing the expectation of the posterior as your summary, you've implicitly decided to minimize squared error, because the posterior mean is the minimizer of expected squared-error loss.
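This claim can be checked numerically: minimizing the Monte Carlo estimate of the posterior expected squared error recovers the posterior mean. The specific Gamma posterior below is an assumed example, not taken from the question.

```python
import numpy as np
from scipy import stats, optimize

# An assumed posterior for illustration: Gamma(shape=152, rate=51)
posterior = stats.gamma(a=152.0, scale=1.0 / 51.0)
theta = posterior.rvs(size=200_000, random_state=0)

def squared_risk(t):
    # Monte Carlo estimate of E[(theta - t)^2] under the posterior
    return np.mean((theta - t) ** 2)

# Minimize the risk over candidate point estimates t
t_star = optimize.minimize_scalar(
    squared_risk, bounds=(0.0, 10.0), method="bounded"
).x

# t_star coincides with the sample mean of the posterior draws
```

Analytically, $E[(\theta - t)^2] = \operatorname{Var}(\theta) + (E[\theta] - t)^2$, so the minimum is exactly at $t = E[\theta]$.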

But nothing prevented you from using the median instead, which minimizes a different loss function (expected absolute error). The result of a Bayesian analysis is not a number, it is a distribution. How you decide to summarize that distribution depends on what qualities you want your summary to have.
