Solved – Cramér-Rao Lower Bound for Exponential Families

estimators, exponential-family, point-estimation, variance

I am having trouble applying the Cramér-Rao inequality to find the lower bound for the variance of an unbiased estimator, and I hoped you could help me. The problem is the following:

Let $X_{1},\dots,X_{n}$ be a random sample from the density
$f(x \mid \theta)=\frac{1}{\theta}e^{-x/\theta}$ for $x\geq0$, and $0$ otherwise. Find the Cramér-Rao lower bound for the variance of an unbiased estimator of the population variance $\theta^{2}$.

Now, the problem itself isn't too difficult, and finding the MLE for $\theta^{2}$ wasn't a problem either. In deriving the MLE for $\theta^{2}$ (which I found to be $\left(\dfrac{\sum^{n}_{i=1}X_{i}}{n}\right)^{2} = \bar{X}^{2}$, if I am not mistaken), I already obtained $\log f(x \mid \theta)$ and its derivatives. So, since the Cramér-Rao inequality (for iid random variables) is given by

$\mathrm{Var}_{\theta}[W(\boldsymbol{X})] \geq \dfrac{\left(\dfrac{d}{d\theta}E_{\theta}W(\boldsymbol{X})\right)^{2}}{nE_{\theta}\left[\left(\dfrac{\partial}{\partial\theta}\log f(\boldsymbol{X} \mid \theta)\right)^{2} \right]}$

Also, since $f(\boldsymbol{x} \mid \theta)$ belongs to the exponential family,

$E_{\theta}\left[\left(\dfrac{\partial}{\partial\theta}\log f(\boldsymbol{X} \mid \theta)\right)^{2}\right] = -E_{\theta}\left[\dfrac{\partial^{2}}{\partial\theta^{2}}\log f(\boldsymbol{X} \mid \theta)\right]$
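To convince myself that this identity really holds for our density, I ran a quick symbolic check (sympy, which we never used in class, so this is just my own sanity check on a single observation):

```python
import sympy as sp

# Single observation from f(x | theta) = (1/theta) * exp(-x/theta), x >= 0
theta, x = sp.symbols('theta x', positive=True)
f = sp.exp(-x / theta) / theta

score_sq = sp.diff(sp.log(f), theta) ** 2   # (d/dtheta log f)^2
neg_hess = -sp.diff(sp.log(f), theta, 2)    # -(d^2/dtheta^2) log f

lhs = sp.integrate(score_sq * f, (x, 0, sp.oo))  # E[(d/dtheta log f)^2]
rhs = sp.integrate(neg_hess * f, (x, 0, sp.oo))  # -E[(d^2/dtheta^2) log f]

print(sp.simplify(lhs), sp.simplify(rhs))        # both print theta**(-2)
```

Both sides come out to $\frac{1}{\theta^{2}}$ per observation, so the identity checks out here.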

If I am not totally mistaken with the notation here (we discussed this topic in class very briefly), I found the following values:

  • For $E_{\theta}\left[\dfrac{\partial^{2}}{\partial\theta^{2}}\log f(\boldsymbol{X} \mid \theta) \right]$:

$E_{\theta}\left[ \dfrac{n}{\theta^{2}} - \dfrac{2\sum^{n}_{i=1}X_{i}}{\theta^{3}} \right] = \dfrac{n}{\theta^{2}} - \dfrac{2}{\theta^{3}}E_{\theta}\left[ \sum^{n}_{i=1}X_{i} \right] \\
= \dfrac{n}{\theta^{2}} - \dfrac{2}{\theta^{3}}\,nE[X_{1}] = \dfrac{n}{\theta^{2}} - \dfrac{2}{\theta^{3}}\,n\theta = -\dfrac{n}{\theta^{2}}$

Thus, the term in the denominator becomes $\dfrac{n^{2}}{\theta^{2}}$ (a symbolic double-check of the expectation above appears after this list).

  • For $\left(\dfrac{d}{d\theta}E_{\theta}W(\boldsymbol{X})\right)^{2}$:

This is the part where I am really confused. Since we're talking about an unbiased estimator of $\theta^{2}$, we have $E_{\theta}W(\boldsymbol{X}) = \theta^{2}$, so I took the first derivative with respect to $\theta$ and squared it, getting $(2\theta)^{2} = 4\theta^{2}$ in the numerator. This would give

$\mathrm{Var}_{\theta}\left[W(\boldsymbol{X})\right] \geq \dfrac{4\theta^{4}}{n^{2}}$

for my Cramér-Rao bound. It just seems odd, considering the rather simple expressions we usually get as solutions to our problems. Am I on the wrong track here?
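As promised above, here is the symbolic double-check of the expectation from the first bullet (sympy again, with $n$ kept as a symbol; the per-observation expectation is scaled by $n$ for the iid sample):

```python
import sympy as sp

theta, x, n = sp.symbols('theta x n', positive=True)

# Per-observation density and the second derivative of its log
f = sp.exp(-x / theta) / theta
d2_logf = sp.diff(sp.log(f), theta, 2)   # equals 1/theta^2 - 2x/theta^3

# Expectation over one observation, then scaled by n for the iid sample
per_obs = sp.integrate(d2_logf * f, (x, 0, sp.oo))
print(sp.simplify(n * per_obs))          # prints -n/theta**2
```

which agrees with the $-\dfrac{n}{\theta^{2}}$ I computed by hand.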

Also, sorry for asking such a simple question. The whole statistics course was just half a semester, so we treated some of the concepts rather superficially.

Thanks in advance!

Best Answer

While most of your formulas and computations look correct, I am getting a slightly different result for the CRLB of unbiased estimators of $\theta^2$. It can be a little tricky to take derivatives with respect to quantities that do not appear explicitly in the likelihood, so let me show you a shortcut.

The basic ingredient of the CRLB is, of course, the Fisher information. Assume, then, that we have the Fisher information for a parameter $\theta$ and we wish to derive the Fisher information for a function of $\theta$, say $g(\theta)$. In your notation, we wish to compute

$$ I \left( g\left(\theta \right) \right) = E_{\theta}\left\{ \left[ \frac{\partial}{\partial g(\theta)} \log f\left(\mathbf{x};\theta\right) \right]^2 \right\} $$

would you agree? But notice what happens when we apply the chain rule along with the definition of the derivative of the inverse function,

\begin{align} I \left( g\left(\theta \right) \right) = E_{\theta}\left\{ \left[ \frac{\partial}{\partial g(\theta)} \log f\left(\mathbf{x};\theta\right) \right]^2 \right\} & = E_{\theta} \left\{ \left[ \frac{\partial}{\partial \theta} \log f\left(\mathbf{x};\theta\right) \frac{\partial \theta}{\partial g\left( \theta \right)} \right]^2 \right\} \\ & = E_{\theta} \left\{ \left[ \frac{\partial}{\partial \theta} \log f\left(\mathbf{x};\theta\right) \frac{1}{ g ^{\prime}\left( \theta \right)} \right]^2 \right\} \\ & = \frac{I(\theta)}{\left[g ^{\prime}\left( \theta \right) \right]^2} \end{align}

which simplifies matters. With this identity, if we are looking for the information for $\theta^2$, then since $g^{\prime} (\theta) = 2\theta$ and the per-observation information is $I(\theta) = \frac{1}{\theta^2}$, we see that

$$I\left(\theta^2 \right) = \frac{1}{\theta^2} \cdot \frac{1}{4\theta^2} = \frac{1}{4\theta^4} $$

Multiply by $n$ to account for the sample size and take the reciprocal to arrive at the bound, $\frac{4 \theta^4}{n}$. Note the single factor of $n$: the $n^2$ in your denominator arose because you took the expectation using the full-sample log-likelihood, whose information is already $\frac{n}{\theta^2}$, and then multiplied by $n$ again; the $n$ in the CRLB formula assumes the per-observation density.
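If you want a numerical sanity check of that bound, here is a quick Monte Carlo sketch (numpy; the estimator $\frac{n}{n+1}\bar{X}^2$ is just one concrete unbiased estimator of $\theta^2$ that I picked for the demonstration, it is not part of your problem):

```python
import numpy as np

rng = np.random.default_rng(42)
theta, n, reps = 2.0, 10, 200_000

# reps iid samples of size n from the exponential density with mean theta
x = rng.exponential(scale=theta, size=(reps, n))
xbar = x.mean(axis=1)

# T = n/(n+1) * xbar^2 is unbiased for theta^2,
# since E[xbar^2] = Var(xbar) + theta^2 = theta^2 * (n + 1) / n
T = n / (n + 1) * xbar**2

crlb = 4 * theta**4 / n
print(f"mean of T        : {T.mean():.3f} (target theta^2 = {theta**2:.3f})")
print(f"variance of T    : {T.var():.3f}")   # about 6.69 for these settings
print(f"CRLB 4 theta^4/n : {crlb:.3f}")      # 6.400
```

The variance of $T$ works out to $\theta^4 \frac{4n+6}{n(n+1)}$, which sits just above the bound and approaches it as $n$ grows, exactly as the CRLB says it must.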
