Derive the Bayes estimator under the loss $L(\theta,\delta)=\frac{(\theta-\delta)^2}{\theta^k}$


Let $X_1,\dots,X_n$ be iid $Poisson(\theta)$ with a $Gamma(a, \frac 1 b)$
prior and a posterior $Gamma\left(a+n\bar x,\frac{1+nb}{b}\right)$.
Derive the Bayes estimator under the loss
$L(\theta,\delta)=\frac{(\theta-\delta)^2}{\theta^k}$ for some $k>0$,
and assume also that $a-k>0$.

I think it comes down to finding a function $\delta^*$ that minimizes $E\left(\frac{(\theta-\delta)^2}{\theta^k}\right)$, but I'm not sure how to go about it. Is $\delta$ distributed as the posterior Gamma, or are those separate things?

I know that when the loss is $L(\theta,\delta)=(\theta-\delta)^2$, the Bayes estimator is the mean of the posterior distribution. But I don't know what to do when the loss is divided by $\theta^k$.

Best Answer

Your loss function IS quadratic in $\delta=\hat{\theta}$; the only difference is that it carries a multiplicative weight:

$$l(\theta;\hat{\theta})=C(\theta)(\theta-\hat{\theta})^2,\qquad C(\theta)=\theta^{-k}$$

Thus, retracing the same steps you surely know for proving that under quadratic loss the Bayes (MMSE) estimator is the posterior mean, you will easily find that in your case the estimator is

$$\hat{\theta}=\frac{\mathbb{E}[\theta C(\theta)|\mathbf{x}]}{\mathbb{E}[C(\theta)|\mathbf{x}]}$$
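For completeness, this follows from differentiating the posterior expected loss with respect to $\delta=\hat{\theta}$ and setting the derivative to zero:

$$\frac{d}{d\delta}\,\mathbb{E}\left[C(\theta)(\theta-\delta)^2\,\middle|\,\mathbf{x}\right]=-2\,\mathbb{E}[\theta\, C(\theta)|\mathbf{x}]+2\,\delta\,\mathbb{E}[C(\theta)|\mathbf{x}]=0,$$

and solving for $\delta$ gives the ratio above.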

With a Gamma posterior the calculations are not difficult. But note that when you state a Gamma distribution without writing its density, you must say explicitly whether the second parameter is the "scale" or the "rate" parameter (or say something about the distribution's expectation); otherwise the calculations cannot be carried out.
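For instance, if the second posterior parameter is read as a rate, so that the posterior is Gamma with shape $\alpha=a+n\bar x$ and rate $\lambda=\frac{1+nb}{b}$ (which is what the Poisson–Gamma conjugate update produces), then using $\mathbb{E}[\theta^{m}|\mathbf{x}]=\frac{\Gamma(\alpha+m)}{\Gamma(\alpha)\,\lambda^{m}}$ you get, for $\alpha>k$,

$$\hat{\theta}=\frac{\mathbb{E}[\theta^{1-k}|\mathbf{x}]}{\mathbb{E}[\theta^{-k}|\mathbf{x}]}=\frac{\Gamma(\alpha+1-k)/\Gamma(\alpha-k)}{\lambda}=\frac{\alpha-k}{\lambda}=\frac{(a+n\bar x-k)\,b}{1+nb},$$

and the assumption $a-k>0$ guarantees $\alpha-k>0$, so all the moments involved exist.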

Observe that if $k=1$ then

$$\hat{\theta}_{\text{MMSE}}=\frac{1}{\mathbb{E}\left[\frac{1}{\theta}|\mathbf{x}\right]}$$

which is the harmonic mean of the posterior distribution.
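As a quick numerical sanity check (a sketch with hypothetical posterior parameters, using `scipy` for the one-dimensional minimization): draw from a Gamma posterior with shape $\alpha$ and rate $\lambda$, minimize the Monte Carlo estimate of the posterior expected loss over $\delta$, and compare the minimizer with the closed form $(\alpha-k)/\lambda$.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical posterior parameters and loss exponent (requires alpha - k > 0).
alpha, rate, k = 5.0, 2.0, 1.0

rng = np.random.default_rng(0)
# numpy's gamma takes a scale parameter, so scale = 1 / rate.
theta = rng.gamma(alpha, 1.0 / rate, size=1_000_000)  # posterior draws

def risk(d):
    """Monte Carlo estimate of E[(theta - d)^2 / theta^k | x]."""
    return np.mean((theta - d) ** 2 / theta ** k)

numeric = minimize_scalar(risk, bounds=(0.01, 10.0), method="bounded").x
closed_form = (alpha - k) / rate  # = E[theta^{1-k}|x] / E[theta^{-k}|x]

print(numeric, closed_form)  # the two should agree to Monte Carlo accuracy
```

With $k=1$ this also checks the harmonic-mean formula, since $(\alpha-1)/\lambda = 1/\mathbb{E}[\theta^{-1}|\mathbf{x}]$ for a Gamma posterior.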
