Solved – Minimax estimator for the mean of a Poisson distribution

bayesian, estimation, poisson-distribution

I recently took a course on Bayesian statistics based on The Bayesian Choice by C. Robert (aka Xi'an). I couldn't solve one of the exercises on minimax estimators and was hoping that someone here can give me a clue as to how it can be solved. This is not homework; this is me wanting to fully understand the material covered in the book.

The exercise is labelled 2.32 (and is found on page 90 in the second edition of the book). It amounts to the following problem.

Problem: Show that when $X\sim \mathrm{Poisson}(\lambda)$ the estimator $\hat{\lambda}=x$ is minimax.

Here $\lambda$ denotes the mean of the Poisson distribution. Presumably the use of the quadratic loss is implicit. A hint is given:

Hint: Notice that $\hat{\lambda}$ is a generalized Bayes estimator for $\pi(\lambda)=1/\lambda$ and use a sequence of $\mathrm{Gamma}(\alpha,\beta)$ priors.

This points me to Lemma 2.4.15 (p. 72), which states that if there exists a sequence $(\pi_n)$ of proper priors such that the generalized Bayes estimator $\delta_0$ satisfies $R(\theta,\delta_0)\leq \lim_{n\rightarrow\infty}r(\pi_n)<+\infty$ for every $\theta\in\Theta$, then $\delta_0$ is minimax. Here $\Theta$ is the parameter space, $\theta$ is the parameter of interest, $R$ is the frequentist risk, which under quadratic loss is $R(\theta,\delta)=E_\theta\big((\delta(X)-\theta)^2\big)$ for an estimator $\delta$, and $r$ is the integrated risk, $r(\pi,\delta)=E^\pi(R(\theta,\delta))$; $r(\pi_n)$ denotes the Bayes risk $r(\pi_n,\delta^{\pi_n})$ of the Bayes estimator associated with $\pi_n$.
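
To make the lemma's ingredients concrete, here is a quick Monte Carlo check (my own illustration, not from the book) that the frequentist risk of $\hat{\lambda}(x)=x$ under quadratic loss is the Poisson variance, $R(\lambda,\hat{\lambda})=\lambda$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Frequentist risk of delta(x) = x under squared-error loss:
# R(lambda, delta) = E[(X - lambda)^2] = Var(X) = lambda for X ~ Poisson(lambda).
for lam in [0.5, 2.0, 10.0]:
    x = rng.poisson(lam, size=1_000_000)
    print(f"lambda = {lam:4.1f}: MC risk = {np.mean((x - lam) ** 2):.3f}")
```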

All other results that I can find deal with Bayes estimators based on proper priors and are thus of no use to me here (?).

Now, what's been giving me a headache for some time is that $R(\lambda,\hat{\lambda})=E_\lambda\big((X-\lambda)^2\big)=\lambda$ for all $\lambda\in\mathbb{R}_+$. Thus the integrated risk of $\hat{\lambda}$ is $r(\pi_n,\hat{\lambda})=E^{\pi_n}\lambda$. I don't see how one can construct a sequence $(\pi_n)$ such that $\lambda\leq \lim r(\pi_n)<\infty$ for all $\lambda$: $\lim r(\pi_n)$ can't be a function of $\lambda$, and $\lambda$ can be arbitrarily large…
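
The obstacle can be seen numerically as well (a sketch of my own; the prior parameters below are arbitrary): drawing $\lambda$ from a proper Gamma prior with shape $a$ and rate $b$ and then $X\sim\mathrm{Poisson}(\lambda)$, the integrated risk of $\hat{\lambda}$ comes out as $E^{\pi}\lambda = a/b$, a fixed finite number:

```python
import numpy as np

rng = np.random.default_rng(1)

# Integrated risk of delta(x) = x under a Gamma(shape=a, rate=b) prior,
# estimated by Monte Carlo: draw lambda ~ Gamma(a, b), then X ~ Poisson(lambda),
# and average the squared error. Theory: r(pi, delta) = E[lambda] = a / b.
for a, b in [(1.0, 1.0), (2.0, 0.5)]:   # arbitrary illustrative priors
    lam = rng.gamma(shape=a, scale=1.0 / b, size=1_000_000)
    x = rng.poisson(lam)
    print(f"Gamma({a}, {b}): MC risk = {np.mean((x - lam) ** 2):.3f}, "
          f"E[lambda] = {a / b}")
```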

I think that the reason I'm stuck is that I've been staring at Lemma 2.4.15 for too long. Is there some other way of showing that this estimator is minimax (or is it not minimax after all)?

Best Answer

Define a sequence of proper prior distributions $\pi_n = \mathrm{Ga}(\lambda\,|\,a_n,b_n)$ with shape $a_n = \alpha/n$ and rate $b_n = \beta/n$.
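
As a sanity check on the conjugate update used below (my own sketch; the values of $a$, $b$, and $x$ are arbitrary), a $\mathrm{Ga}(a,b)$ prior combined with one Poisson observation $x$ should give a $\mathrm{Ga}(a+x,\,b+1)$ posterior, hence posterior mean $(a+x)/(b+1)$:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
a, b, x = 0.5, 0.25, 3                 # arbitrary illustrative values

# Self-normalized importance sampling with the prior as proposal:
# posterior mean = E[w * lambda] / E[w], where w is the Poisson likelihood.
lam = stats.gamma(a, scale=1.0 / b).rvs(size=1_000_000, random_state=rng)
w = stats.poisson(lam).pmf(x)          # likelihood of x at each sampled lambda

print(np.sum(w * lam) / np.sum(w))     # Monte Carlo posterior mean
print((a + x) / (b + 1))               # Ga(a + x, b + 1) mean: 2.8
```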

Under $\pi_n$ the posterior is $\mathrm{Ga}(\lambda\,|\,a_n+x,\,b_n+1)$, since $\mathrm{Poi}(x\,|\,\lambda)\,\mathrm{Ga}(\lambda\,|\,a_n,b_n)\propto \mathrm{Ga}(\lambda\,|\,a_n+x,\,b_n+1)$ as a function of $\lambda$. The Bayes estimator for this sequence is therefore the posterior mean, $\delta_n = (a_n+x)/(b_n+1)$, and its posterior expected loss is \begin{gather} r_n = \int^\infty_0 (\lambda-\delta_n)^2\, \mathrm{Ga}(\lambda\,|\,a_n+x,\,b_n+1)\,d\lambda \end{gather} which reduces to \begin{gather} r_n = \delta_n^2 - 2\delta_n E_n\lambda + E_n\lambda^2 \end{gather} \begin{gather} = \delta_n^2 - 2\delta_n E_n\lambda + V_n\lambda +(E_n\lambda)^2 \end{gather} \begin{gather} = \delta_n^2 - 2\delta_n^2 + \delta_n/(b_n+1) + \delta_n^2 \end{gather} \begin{gather} = \delta_n/(b_n+1), \end{gather} using the posterior moments $E_n\lambda = (a_n+x)/(b_n+1) = \delta_n$ and $V_n\lambda = (a_n+x)/(b_n+1)^2 = \delta_n/(b_n+1)$.
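
The reduction can be checked numerically (again a sketch with arbitrary values): the posterior expected loss of the posterior mean is exactly the posterior variance, $\delta_n/(b_n+1)$:

```python
import numpy as np
from scipy import stats

a_n, b_n, x = 0.2, 0.1, 4              # arbitrary member of the sequence
delta_n = (a_n + x) / (b_n + 1)        # posterior mean / Bayes estimator

rng = np.random.default_rng(3)
lam = stats.gamma(a_n + x, scale=1.0 / (b_n + 1)).rvs(size=1_000_000,
                                                      random_state=rng)

print(np.mean((lam - delta_n) ** 2))   # Monte Carlo posterior expected loss
print(delta_n / (b_n + 1))             # closed form: the posterior variance
```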

The limiting Bayes estimator is \begin{gather} \delta = \lim_{n\rightarrow\infty}\delta_n=\lim_{n\rightarrow\infty}\frac{a_n+x}{b_n+1} = x \end{gather} and the limiting posterior expected loss is \begin{gather} r = \lim_{n\rightarrow\infty}\frac{\delta_n}{b_n+1}=\lim_{n\rightarrow\infty}\frac{a_n+x}{(b_n+1)^2}=x. \end{gather}
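
Numerically (with arbitrary $\alpha$, $\beta$, and $x$), both quantities approach $x$ as the priors flatten out:

```python
# delta_n = (a_n + x)/(b_n + 1) -> x and r_n = delta_n/(b_n + 1) -> x
# as a_n = alpha/n and b_n = beta/n shrink to zero.
alpha, beta, x = 2.0, 1.0, 5           # arbitrary illustrative values

for n in [1, 10, 100, 10_000]:
    a_n, b_n = alpha / n, beta / n
    delta_n = (a_n + x) / (b_n + 1)
    r_n = delta_n / (b_n + 1)
    print(f"n = {n:6d}: delta_n = {delta_n:.4f}, r_n = {r_n:.4f}")
```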

The limiting rule $\delta$ is generalized Bayes with respect to the improper prior $\pi_\infty(\lambda)=1/\lambda$, the limit of the $\mathrm{Ga}(a_n,b_n)$ priors. Because the limiting posterior expected loss $r$ does not depend on $\lambda$, the estimator $\delta = x$ is minimax.
