[Math] Fisher Information for Geometric Distribution

estimation, statistical-inference

Find the Cramér-Rao lower bound for unbiased estimators of $\theta$, and then give the approximate distribution of $\hat{\theta}$ as $n$ gets large. This is for a Geometric($\theta$) distribution.

I am stuck on calculating the Fisher Information, which is given by $-nE_{\theta}\left(\dfrac{d^{2}}{d\theta^{2}}\log f(X\mid\theta)\right)$. So far, I have the second derivative of the log likelihood as $\dfrac{-n}{\theta^{2}}+\dfrac{\theta(n-\sum x_{i})}{(1-\theta)^{2}}$. I just need some help finding the expectation of this.

Best Answer

I think you miscalculated the log-likelihood:

$$L=\prod_{i=1}^{n}(1-\Theta)^{x_i-1}\Theta=(1-\Theta)^{\sum_{i=1}^{n}x_i-n}\cdot\Theta^n$$ Then take the logarithm: $$\ln L=\left(\sum_{i=1}^{n}x_i-n\right)\ln(1-\Theta)+n\ln\Theta$$ $$\frac{\partial \ln L}{\partial \Theta}=\frac{n}{\Theta}-\frac{\sum_{i=1}^{n}x_i-n}{1-\Theta}$$

The second derivative is the following: $$\frac{\partial^2 \ln L}{\partial \Theta^2}=-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}$$
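As a sanity check (not part of the derivation), here is a short sympy sketch that verifies both derivatives symbolically; the symbol `S` is a stand-in for $\sum_{i=1}^{n}x_i$, which the differentiation treats as a constant:

```python
# Symbolic check of the log-likelihood derivatives with sympy.
# S stands for sum(x_i); it is a constant as far as theta is concerned.
import sympy as sp

theta, n, S = sp.symbols('theta n S', positive=True)

lnL = (S - n) * sp.log(1 - theta) + n * sp.log(theta)

d1 = sp.diff(lnL, theta)     # should equal n/theta - (S - n)/(1 - theta)
d2 = sp.diff(lnL, theta, 2)  # should equal -n/theta**2 - (S - n)/(1 - theta)**2

print(sp.simplify(d1))
print(sp.simplify(d2))
```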

For the Fisher information you need $$-E\left(\frac{\partial^2 \ln L}{\partial \Theta^2}\right)$$

$$E\left(-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}\right)$$

In this expression the only random quantity is $\sum_{i=1}^{n}x_i$; everything else is a constant. Since $E(\text{const})=\text{const}$, you get the following:

$$E\left(-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}\right)=-\frac{n}{\Theta^2}-\frac{1}{(1-\Theta)^2}\left(E\left(\sum_{i=1}^{n}x_i\right)-n\right)$$

It is well known that if $x_i$ is geometric, then $E(x_i)=\frac{1}{\Theta}$.

By linearity of expectation, $E\left(\sum_{i=1}^{n}x_i\right)=n\cdot\frac{1}{\Theta}$.
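If you want to see this numerically, here is a minimal simulation sketch (using numpy's `geometric`, which is supported on $\{1,2,\dots\}$ and so matches the pmf above; the values of `theta` and `n` are arbitrary illustrative choices):

```python
# Simulation check that E(sum x_i) = n/theta for Geometric(theta) samples.
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.3, 50, 100_000          # arbitrary illustrative values

sums = rng.geometric(theta, size=(reps, n)).sum(axis=1)
print(sums.mean())    # simulated E(sum x_i); should be close to...
print(n / theta)      # ...the theoretical value n/theta ~ 166.67
```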

Substituting this in gives $$-\frac{n}{\Theta^2}-\frac{1}{(1-\Theta)^2}\left(\frac{n}{\Theta}-n\right)$$ so finally you get the Fisher information:

$$F_{\Theta}=-E\left(\frac{\partial^2 \ln L}{\partial \Theta^2}\right)=n\left(\frac{1}{\Theta^2}+\frac{1}{(1-\Theta)\Theta}\right)=\frac{n}{\Theta^2(1-\Theta)}$$

Hence the Cramér-Rao lower bound for unbiased estimators of $\Theta$ is $\frac{1}{F_{\Theta}}=\frac{\Theta^2(1-\Theta)}{n}$, and by the asymptotic normality of the MLE, $\hat{\Theta}$ is approximately $N\left(\Theta,\ \frac{\Theta^2(1-\Theta)}{n}\right)$ for large $n$.
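And a quick Monte Carlo sketch (again with arbitrary `theta` and `n`) showing that for large $n$ the variance of the MLE $\hat{\Theta}=1/\bar{x}$ comes close to the Cramér-Rao bound $\Theta^2(1-\Theta)/n$:

```python
# Monte Carlo check: for large n the MLE theta_hat = 1/mean(x) has variance
# close to the Cramer-Rao bound theta^2 * (1 - theta) / n.
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 0.3, 500, 20_000          # arbitrary illustrative values

x = rng.geometric(theta, size=(reps, n))   # reps samples of size n
theta_hat = 1.0 / x.mean(axis=1)           # MLE for each sample

print(theta_hat.var())                     # simulated variance of the MLE
print(theta**2 * (1 - theta) / n)          # CRLB = 1/F_theta
```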