Solved – Need help calculating a Bayes estimator for a Poisson

bayesian, mathematical-statistics, self-study

My study group and I are stuck on this Bayes' estimator problem.
The question is:

Let $X \sim \mathrm{Pois}(\lambda)$.
Find the Bayes estimator for $\lambda$ with respect to:

(i) The prior distribution: $\lambda \sim \mathrm{Exp}(1)$

(ii) The squared error loss function

So for our posterior distribution we ended up with:

$g(\lambda \mid x_1, \dots, x_n)= \dfrac{\frac{\lambda^{\sum x_i}}{\prod x_i!}\,e^{-\lambda(n+1)}}{\int_0^\infty \frac{\lambda^{\sum x_i}}{\prod x_i!}\,e^{-\lambda(n+1)}\, d\lambda} $

which is a Gamma distribution with shape $\sum x_i + 1$ and rate $n+1$.
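To see why, note that every factor not involving $\lambda$ cancels between numerator and denominator; a minimal sketch of the kernel matching, assuming the $x_i$ are i.i.d. Poisson observations (as the posterior above already presumes):

$$g(\lambda \mid x_1,\dots,x_n) \;\propto\; \underbrace{\lambda^{\sum x_i}\, e^{-n\lambda}}_{\text{likelihood kernel}} \cdot \underbrace{e^{-\lambda}}_{\mathrm{Exp}(1)\ \text{prior}} \;=\; \lambda^{\left(\sum x_i + 1\right) - 1}\, e^{-(n+1)\lambda},$$

which is exactly the kernel of a $\Gamma\!\left(\sum x_i + 1,\; n+1\right)$ density.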

Then our Bayes estimator with respect to this prior under the squared error loss function is the posterior expected value, which gives

$\dfrac{\sum x_i + 1}{n+1}.$

We are just not sure whether this is right.

Could someone please let us know if our thinking is correct on this one?

Thanks!

Best Answer

This is a fairly straightforward example to check, for one simple reason: your prior is the conjugate prior for Poisson data. The prior $\lambda \sim \mathrm{Exp}(1)$ can be written as a Gamma distribution, because $$\lambda \sim \mathrm{Exp}(1) \Rightarrow \lambda \sim \Gamma(1,1).$$ The posterior distribution should then also be a Gamma. I won't go through the math on it, but you can check Wikipedia's Table of Conjugate Priors to verify the distribution. You are correct that the posterior distribution is $\Gamma\!\left(\sum x_i +1,\; n+1\right)$.
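More generally (a standard conjugacy fact, stated here for reference), combining a $\Gamma(\alpha, \beta)$ prior with $n$ i.i.d. Poisson observations gives

$$\lambda \mid \mathbf{x} \sim \Gamma\!\left(\alpha + \sum x_i,\; \beta + n\right)$$

in the shape-rate parameterization, so with $\alpha = \beta = 1$ you recover the $\Gamma\!\left(\sum x_i + 1,\; n+1\right)$ posterior above.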

The Bayes estimator under squared error loss is just the posterior mean; since the mean of a $\Gamma(a, b)$ distribution (shape $a$, rate $b$) is $a/b$, this yields $$E (\lambda \mid \mathbf{x}) = \frac{\sum{x_i} + 1}{n+1},$$ just as you have written.
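If you want an independent numerical check, here is a minimal sketch (the data vector and grid bounds below are purely illustrative) that integrates the unnormalized posterior on a grid and compares its mean to $(\sum x_i + 1)/(n+1)$:

```python
import numpy as np

# Illustrative data; any small vector of Poisson counts will do.
x = np.array([2, 0, 3, 1, 4])
n = len(x)

# Grid over lambda; the upper bound just needs to cover the posterior mass.
lam = np.linspace(1e-6, 30.0, 200_000)
dlam = lam[1] - lam[0]

# Unnormalized posterior: Poisson likelihood kernel times the Exp(1) prior.
unnorm = lam ** x.sum() * np.exp(-(n + 1) * lam)

# Normalize numerically and take the posterior mean, i.e. the Bayes
# estimator under squared error loss.
posterior = unnorm / (unnorm.sum() * dlam)
numeric_mean = (lam * posterior).sum() * dlam

closed_form = (x.sum() + 1) / (n + 1)
print(numeric_mean, closed_form)  # these should agree to several decimals
```

With the example data above ($\sum x_i = 10$, $n = 5$), both numbers should come out around $11/6 \approx 1.83$.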