[Math] Cramér-Rao (Casella & Berger 7.38) for an exponential family

Tags: estimation, statistics, upper-lower-bounds, variance

The question states: "Let $X_{1}, \dots, X_{n}$ be a random sample from $f(x \mid \theta) = \theta\cdot x^{\theta-1}$, $0 < x < 1$, $\theta > 0$. Is there a function of $\theta$, say $g(\theta)$, for which there exists an unbiased estimator whose variance $\textbf{attains}$ the Cramér-Rao lower bound? If so, find it."

Since this is an exponential family, we can interchange differentiation and integration, so I calculated the Fisher information term, the denominator of the Cramér-Rao lower bound, as

$$nE_{\theta}\left[\left(\frac{\partial}{\partial \theta}\ln(\theta X^{\theta-1})\right)^{2}\right] = -nE_{\theta}\left[\frac{\partial^{2}}{\partial \theta^{2}}\ln(\theta X^{\theta-1})\right] = \frac{n}{\theta^{2}},$$ so taking $\dfrac{1}{I_{n}(\theta)}$ gives the bound $\dfrac{\theta^{2}}{n}$.
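
For completeness, the single-observation derivatives behind this (spelled out by me, not in the original post):
$$\ln f(x \mid \theta) = \ln\theta + (\theta-1)\ln x, \qquad \frac{\partial}{\partial\theta}\ln f = \frac{1}{\theta} + \ln x, \qquad \frac{\partial^{2}}{\partial\theta^{2}}\ln f = -\frac{1}{\theta^{2}},$$
so $I_{1}(\theta) = \frac{1}{\theta^{2}}$ and $I_{n}(\theta) = \frac{n}{\theta^{2}}$.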

Now, just taking a stab, the statistic I've used is $W(X)= \overline{X}$, which I calculate to be an unbiased estimator of $\frac{\theta}{\theta+1}$, since $E[X] = \dfrac{\theta}{\theta + 1}$ and thus $E[\overline{X}] = \frac{\theta}{\theta + 1}$ from my calculations (hopefully right).
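
For the record, the mean comes from a standard power-function integral:
$$E[X] = \int_{0}^{1} x\,\theta x^{\theta-1}\,dx = \theta\int_{0}^{1} x^{\theta}\,dx = \frac{\theta}{\theta+1}.$$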

Similarly, if I calculate the variance I get $\operatorname{Var}(\overline{X}) = \dfrac{\theta}{n(\theta + 1)^{2}(\theta + 2)}$, which lies above the lower bound, as required.
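
Spelling out that comparison (my own check, not in the original post): since $\overline{X}$ is unbiased for $\tau(\theta) = \frac{\theta}{\theta+1}$, the relevant Cramér-Rao bound is
$$\frac{[\tau'(\theta)]^{2}}{I_{n}(\theta)} = \frac{(\theta+1)^{-4}}{n/\theta^{2}} = \frac{\theta^{2}}{n(\theta+1)^{4}},$$
and the ratio $\operatorname{Var}(\overline{X})\big/\text{bound} = \frac{(\theta+1)^{2}}{\theta(\theta+2)} > 1$, so $\overline{X}$ sits strictly above the bound and does not attain it.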

If I examine the MLE I find: $\dfrac{n}{\theta} + \sum_{i}\ln x_{i} = 0 \implies \hat{\theta}_{\mathrm{MLE}} = \dfrac{-n}{\sum_{i} \ln x_{i}}$.
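
A side observation (mine, not in the original post): by invariance of the MLE,
$$\frac{1}{\hat{\theta}_{\mathrm{MLE}}} = -\frac{1}{n}\sum_{i}\ln x_{i},$$
which is exactly the statistic that shows up in the attainment condition below.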

The attainment theorem states
\begin{equation}
\begin{split}
\frac{n}{\theta} + \sum_{i}\ln x_{i} &= a(\theta)\left[W(\vec{x})-\tau(\theta)\right]\\
&= n \left[\frac{\sum_{i} \ln x_{i}}{n} - \left(-\frac{1}{\theta}\right)\right]
\end{split}
\end{equation}
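
Matching terms in this factorization (my own reading of it):
$$a(\theta) = n, \qquad W(\vec{x}) = \frac{1}{n}\sum_{i}\ln x_{i}, \qquad \tau(\theta) = -\frac{1}{\theta},$$
or equivalently, flipping signs, $a(\theta) = -n$, $W(\vec{x}) = -\frac{1}{n}\sum_{i}\ln x_{i}$, $\tau(\theta) = \frac{1}{\theta}$.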

where if $W(\vec{x})$ satisfies the above, then it is the best unbiased estimator of $\tau(\theta)$. We need to use the Rao-Blackwell theorem on $W$ in the above equation to show that $\dfrac{1}{\hat{\theta}_{\mathrm{MLE}}}$ is the best.

Consider $Y = -\ln(X)$; applying the transformation with the Jacobian $\left|\frac{dx}{dy}\right| = e^{-y}$, we have
\begin{equation}
\begin{split}
f_{Y}(y) = f_{X}(e^{-y})\cdot e^{-y} = \theta e^{-y(\theta-1)}\cdot e^{-y} = \theta e^{-\theta y}, \qquad y > 0,
\end{split}
\end{equation}
i.e. $Y \sim \mathrm{Exponential}(\theta)$,

where $E[Y] = \frac{1}{\theta}$ (which I think makes $Y$ an unbiased estimator of $\frac{1}{\theta}$?).

Using the Rao-Blackwell theorem, let $\theta^{\star} = E\left[Y_{1} \mid \sum_{i} Y_{i} = t\right]$; since $f_{Y}(y)$ belongs to an exponential family, $\sum_{i} Y_{i}$ can be shown to be a sufficient statistic.
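
To make the sufficiency claim concrete (my own spelling-out via the Fisher-Neyman factorization theorem): the joint density of $Y_{1},\dots,Y_{n}$ is
$$\prod_{i=1}^{n}\theta e^{-\theta y_{i}} = \theta^{n} e^{-\theta \sum_{i} y_{i}},$$
which depends on the data only through $\sum_{i} y_{i}$, so $\sum_{i} Y_{i}$ is sufficient for $\theta$; being a sum of i.i.d. $\mathrm{Exponential}(\theta)$ variables, it is $\mathrm{Gamma}(n,\theta)$-distributed.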

then
\begin{equation}
\begin{split}
E[\theta^{\star}] &= E \left[ E\left[Y_{1} \mid \textstyle\sum_{i} Y_{i}\right] \right] \\
&= E\left[ Y_{1} \right] \\
&= \frac{1}{\theta}
\end{split}
\end{equation}
by the tower property, so $\theta^{\star}$ remains unbiased for $\frac{1}{\theta}$,

which really fits into the attainment equation written above! So my guess is to let $W(\vec{x}) = \dfrac{1}{\hat{\theta}_{\mathrm{MLE}}} = -\dfrac{1}{n}\sum_{i}\ln x_{i}$ be the best unbiased estimator (?)
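
A quick check that this guess attains the bound (my own verification, using $Y_{i} = -\ln X_{i} \sim \mathrm{Exponential}(\theta)$ from above): with $T = \frac{1}{n}\sum_{i} Y_{i}$,
$$\operatorname{Var}(T) = \frac{\operatorname{Var}(Y_{1})}{n} = \frac{1}{n\theta^{2}}, \qquad \frac{[g'(\theta)]^{2}}{I_{n}(\theta)} = \frac{\theta^{-4}}{n/\theta^{2}} = \frac{1}{n\theta^{2}} \quad \text{for } g(\theta) = \frac{1}{\theta},$$
so the variance matches the Cramér-Rao bound exactly.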

Best Answer

In case you are still interested...

You could have been working with the equality condition of the Cramér-Rao inequality:

$$\frac{\partial}{\partial\theta}\ln f_{\theta}(x_1,\ldots,x_n)=k(\theta)\left(T(x_1,\ldots,x_n)-g(\theta)\right)\tag{*}$$

If $(*)$ holds, then the variance of the statistic $T$ attains the Cramér-Rao lower bound for $g(\theta)$; this is the equality case of the Cauchy-Schwarz inequality from which the bound is derived, and equality holds exactly when the score is an affine function of $T$.

Moreover, if $T$ is unbiased for $g(\theta)$, then $T$ is the UMVUE of $g(\theta)$.

Here the joint density of $(X_1,\ldots,X_n)$ is

\begin{align} f_{\theta}(x_1,\ldots,x_n)&=\theta^n\left(\prod_{i=1}^nx_i\right)^{\theta-1}\mathbf1_{0<x_1,\ldots,x_n<1} \\\implies \ln f_{\theta}(x_1,\ldots,x_n)&=n\ln\theta+(\theta-1)\sum_{i=1}^n\ln x_i+\ln(\mathbf1_{0<x_1,\ldots,x_n<1}) \end{align}

So,

$$\frac{\partial}{\partial\theta}\ln f_{\theta}(x_1,\ldots,x_n)=\frac{n}{\theta}+\sum_{i=1}^n\ln x_i=-n\left(-\frac{1}{n}\sum_{i=1}^n\ln x_i-\frac{1}{\theta}\right)$$

This implies that the variance of $T=-\frac{1}{n}\sum\limits_{i=1}^n\ln X_i$ attains the Cramér-Rao lower bound for $1/\theta$.

Besides,

\begin{align} E_{\theta}(T)&=-\frac{1}{n}\sum_{i=1}^n E_{\theta}(\ln X_i) \\&=-\frac{1}{n}\sum_{i=1}^n \left(-\frac{1}{\theta}\right)=\frac{1}{\theta} \end{align}
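
For completeness, the value $E_{\theta}(\ln X_i) = -\frac{1}{\theta}$ used above follows by integration by parts (or from $-\ln X_i \sim \mathrm{Exponential}(\theta)$):
$$E_{\theta}(\ln X) = \int_{0}^{1}(\ln x)\,\theta x^{\theta-1}\,dx = \left[x^{\theta}\ln x\right]_{0}^{1} - \int_{0}^{1} x^{\theta-1}\,dx = -\frac{1}{\theta}.$$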

So the function you are looking for is indeed $g(\theta)=\theta^{-1}$, but you have worked too hard for the answer.
