[Math] How to find the Fisher Information of a function of the MLE of a Geometric (p) distribution

statistical-inference, statistics

Suppose that we have $X_1,\dots,X_n$ iid observations from a Geometric($p$) distribution.

I found that the MLE of $p$ is $\hat{p} = \frac{n}{n+\sum X_i}$.
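For reference, this follows from the parametrization where $X$ counts failures before the first success, $P(X=k)=p(1-p)^k$ for $k=0,1,2,\dots$, so that

$$\ell(p)=n\log p+\sum X_i\log(1-p),\qquad \ell'(p)=\frac{n}{p}-\frac{\sum X_i}{1-p}=0\ \Longrightarrow\ \hat{p}=\frac{n}{n+\sum X_i}.$$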

I am trying to find the MLE $\hat{\theta}$ of $\theta = \frac{p}{1-p}$, and the Fisher Information of a single observation, $I_1(\theta)$.

Now, by the functional invariance property of MLEs, I found that the MLE of $\theta$ is just:

$\hat{\theta} = \frac{\hat{p}}{1-\hat{p}} = \frac{n}{\sum X_i}$.

However, I have no idea how to find the Fisher Information $I_1(\theta)$.

I know of two ways of doing this. One is to exploit the asymptotic efficiency and lower-bound principle of MLEs: asymptotically, $\operatorname{Var}(\hat{\theta})=\frac{1}{nI_1(\theta)}$.

The other is the reparametrization formula

$${\mathcal I}_\eta(\eta) = {\mathcal I}_\theta(\theta(\eta)) \left( \frac{{\mathrm d} \theta}{{\mathrm d} \eta} \right)^2,$$
where ${\mathcal I}_\eta$ and ${\mathcal I}_\theta$ are the Fisher information measures of $\eta$ and $\theta$, respectively.

Is there another, easier way to do this? I have no idea how to find the variance above. Thank you!

Best Answer

I used the second method you suggested, the reparametrization formula:

Since the log-likelihood is $\log L(p)=n\log p+\left(\sum_{i=1}^n X_i\right)\log(1-p)$, the Fisher information of $p$ is $$I_n(p)=-E\left[\displaystyle\frac{d^2\log L(p)}{dp^2}\right]=\frac{n}{p^2}+\frac{1}{(1-p)^2}\sum_{i=1}^n E[X_i]=\frac{n}{p^2}+\frac{n}{p(1-p)}=\frac{n}{p^2(1-p)},$$ using $E[X_i]=\frac{1-p}{p}$. We can find the Jacobian by:
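As a quick sanity check (just a sketch using sympy; the symbol names are my own), the per-observation information $I_1(p)=\frac{1}{p^2(1-p)}$ can be verified symbolically:

```python
import sympy as sp

p, k = sp.symbols('p k', positive=True)

# log-pmf of a single observation under the failures-before-first-success
# parametrization: P(X = k) = p * (1 - p)**k
log_f = sp.log(p) + k * sp.log(1 - p)

# -E[d^2 log f / dp^2]: the second derivative is linear in k, so
# substituting E[X] = (1 - p)/p yields the expectation exactly
I1 = -sp.diff(log_f, p, 2).subs(k, (1 - p) / p)

print(sp.simplify(I1))  # an expression equivalent to 1/(p**2*(1 - p))
```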

$$p(\theta)=\frac{\theta}{1+\theta}$$ so $$\displaystyle\frac{dp(\theta)}{d\theta}=\frac{1}{(1+\theta)^2}$$

Then

$$I_n(\theta)=J(\theta)^2\,I_n(p(\theta))=\frac{1}{(1+\theta)^4}\cdot\frac{n}{p(\theta)^2\left(1-p(\theta)\right)}=\frac{1}{(1+\theta)^4}\cdot\frac{n(1+\theta)^3}{\theta^2}=\frac{n}{\theta^2(1+\theta)}$$ by plugging in $\frac{\theta}{1+\theta}$ for $p$ (so that $1-p=\frac{1}{1+\theta}$) in $I_n(p)$. In particular, the single-observation information is $I_1(\theta)=\frac{1}{\theta^2(1+\theta)}$.
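The chain-rule step can be checked symbolically as well; here is a minimal sympy sketch (again with my own symbol names), assuming the $I_n(p)$ derived above:

```python
import sympy as sp

theta, n = sp.symbols('theta n', positive=True)

p = theta / (1 + theta)        # p as a function of theta
J = sp.diff(p, theta)          # Jacobian dp/dtheta = 1/(1 + theta)**2

I_n_p = n / (p**2 * (1 - p))   # Fisher information of p, computed above

# reparametrization: I_n(theta) = J(theta)^2 * I_n(p(theta))
print(sp.simplify(J**2 * I_n_p))  # equivalent to n/(theta**2*(1 + theta))
```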

Note that we can only use this reparametrization when the function $\theta(p)$ is one-to-one, which holds here since $\theta(p)=\frac{p}{1-p}$ is strictly increasing on $(0,1)$.
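Finally, to connect this to the first method mentioned in the question: a small Monte Carlo sketch (using numpy, with arbitrary values of $p$ and $n$ chosen purely for illustration) shows the empirical variance of $\hat{\theta}$ agreeing with the asymptotic bound $\frac{1}{nI_1(\theta)}=\frac{\theta^2(1+\theta)}{n}$:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 200, 50_000          # arbitrary illustrative choices
theta = p / (1 - p)

# numpy's geometric counts trials up to and including the first success
# (support 1, 2, ...); subtract 1 for failures before the first success
x = rng.geometric(p, size=(reps, n)) - 1

theta_hat = n / x.sum(axis=1)          # MLE: theta_hat = n / sum(X_i)

print(theta_hat.var())                 # empirical variance of the MLE
print(theta**2 * (1 + theta) / n)      # 1/(n * I_1(theta)); should be close
```

The two printed numbers should be close, differing only by higher-order terms in $1/n$ and Monte Carlo noise.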