Fisher Information and Expected Information for the Gamma Distribution

fisher-information, gamma-distribution, maximum-likelihood

I would like some help with calculating the observed information $I_o(\beta)$ and the expected (Fisher) information for a gamma distribution defined by

\begin{align*}
f_X(x) = \frac{\beta^\alpha x^{\alpha - 1}e^{-\beta x}}{\Gamma(\alpha)}, \qquad x > 0,\ \alpha > 0,\ \beta > 0
\end{align*}

where $\alpha$ is a known value and $\beta$ is the parameter of interest.
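
As a quick sanity check on this parameterization, the density can be compared against a library implementation. The sketch below assumes SciPy is available; note that scipy.stats.gamma uses a shape/scale parameterization, so the rate $\beta$ enters as scale $= 1/\beta$. The values of $\alpha$ and $\beta$ are arbitrary illustrations.

```python
# Check the rate-parameterized density above against scipy.stats.gamma,
# which uses shape/scale: shape a = alpha, scale = 1/beta.
import numpy as np
from scipy.stats import gamma
from scipy.special import gamma as gamma_fn

alpha, beta = 3.0, 2.0                 # arbitrary illustrative values
x = np.linspace(0.1, 5.0, 50)

# Density exactly as written in the question
pdf_manual = beta**alpha * x**(alpha - 1) * np.exp(-beta * x) / gamma_fn(alpha)

# SciPy's equivalent with scale = 1/beta
pdf_scipy = gamma.pdf(x, a=alpha, scale=1 / beta)

assert np.allclose(pdf_manual, pdf_scipy)
```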

Attempt

I have attempted to calculate the likelihood function as follows:

\begin{align}
L(\beta \mid X_i) &= \prod_{i = 1}^{n} f(x_i \mid \alpha, \beta) \\
&= \prod_{i = 1}^{n}\left( \frac{\beta^{\alpha}}{\Gamma(\alpha)} x_{i}^{\alpha - 1} \exp\{-\beta x_i\}\right) \\
&= \left(\frac{\beta^{\alpha}}{\Gamma(\alpha)}\right)^{n} \prod_{i = 1}^{n} \left(x_{i}^{\alpha - 1}\right) \exp\Big\{-\beta\sum_{i = 1}^{n} x_i\Big\} \\
&= \left(\frac{\beta^{\alpha}}{\Gamma(\alpha)}\right)^{n} \left(\prod_{i = 1}^{n} x_{i}\right)^{\alpha - 1} \exp\Big\{-\beta\sum_{i = 1}^{n} x_i\Big\}
\end{align}
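
A similar sketch (again assuming NumPy/SciPy, with arbitrary sample and parameter values) can confirm numerically that this factored form agrees with the direct product of the $n$ densities:

```python
# Verify that the factored likelihood equals the product of the n densities.
import numpy as np
from scipy.stats import gamma
from scipy.special import gamma as gamma_fn

rng = np.random.default_rng(1)
n, alpha, beta = 10, 3.0, 2.0          # arbitrary illustrative values
x = rng.gamma(shape=alpha, scale=1 / beta, size=n)

# Direct product of the n densities
direct = np.prod(gamma.pdf(x, a=alpha, scale=1 / beta))

# Factored form from the last line of the derivation above
factored = (beta**alpha / gamma_fn(alpha))**n * np.prod(x)**(alpha - 1) * np.exp(-beta * x.sum())

assert np.isclose(direct, factored)
```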

Thus the log-likelihood would be the following:

\begin{align}
\ell(\beta \mid x_i) &= \ln\left(L(\beta \mid X_i)\right) \\
&= n\alpha\ln(\beta) - n\ln(\Gamma(\alpha)) + (\alpha - 1)\sum_{i = 1}^{n}\ln(x_i) - \beta \sum_{i = 1}^{n} x_i
\end{align}

I understand that the observed information is found by taking the negative of the second derivative of the log-likelihood, $I_o(\beta) = -\frac{\mathrm{d}^{2}\ell}{\mathrm{d}\beta^{2}}$.

The derivatives calculated were as follows:
\begin{align}
\ell(\beta \mid x_i) &= n\alpha\ln(\beta) - n\ln(\Gamma(\alpha)) + (\alpha - 1)\sum_{i = 1}^{n}\ln(x_i) - \beta \sum_{i = 1}^{n} x_i \\
\frac{\mathrm{d}\ell}{\mathrm{d}\beta} &= \frac{n\alpha}{\beta} - \beta \\
\frac{\mathrm{d}^{2}\ell}{\mathrm{d}\beta^{2}} &= -\frac{n \alpha}{\beta^2} - 1
\end{align}

Thus the information would be

\begin{align}
I_o(\beta) = \frac{n \alpha}{\beta^2} + 1
\end{align}

I am unsure what to do once I get to the expectation.

\begin{align}
\mathbb{E}\{I_o(\beta)\} &= \mathbb{E}\left(\frac{n \alpha}{\beta^2} + 1\right)
\end{align}

Have I made a mistake in the derivation process? Any insight would be very much appreciated.

Best Answer

You almost got it right! You just made a tiny mistake when computing the derivative of the log-likelihood: the derivative of $-\beta\sum_{i=1}^n x_i$ with respect to $\beta$ is $-\sum_{i=1}^n x_i$, not $-\beta$. You should have had: $$\frac{\partial\ell}{\partial \beta} = \frac{n\alpha}{\beta} -\color{red}{\sum_{i=1}^n x_i} $$

From which it follows that $$\frac{\partial^2\ell}{\partial \beta^2} = -\frac{n\alpha}{\beta^2} $$
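
If you want to double-check this symbolically, here is a small sketch assuming SymPy; $S$ is a stand-in symbol for $\sum_{i=1}^n x_i$, and the $\beta$-free terms of the log-likelihood are dropped since they do not affect the derivatives:

```python
# Symbolic check of the corrected first and second derivatives.
import sympy as sp

n, alpha, beta, S = sp.symbols('n alpha beta S', positive=True)
ell = n * alpha * sp.log(beta) - beta * S   # beta-dependent part of the log-likelihood

print(sp.diff(ell, beta))      # n*alpha/beta - S
print(sp.diff(ell, beta, 2))   # -n*alpha/beta**2
```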

Next, to compute the Fisher information, all you have to do is take the expectation of $-\frac{\partial^2\ell}{\partial \beta^2}$. That expectation is with respect to the distribution of the $x_i$, but there are no $x_i$'s in the expression for $\frac{\partial^2\ell}{\partial \beta^2}$: it is a constant! Its expectation is thus equal to itself: $$ \mathbb E\left[-\frac{\partial^2\ell}{\partial \beta^2}\right] = \mathbb E\left[\frac{n\alpha}{\beta^2}\right] = \frac{n\alpha}{\beta^2}. $$
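
You can also see this numerically: for any simulated sample, a finite-difference second derivative of the log-likelihood lands on the same constant, $n\alpha/\beta^2$. A sketch assuming NumPy/SciPy, with arbitrary sample size and parameters:

```python
# The observed information does not depend on the data: the finite-difference
# second derivative of the log-likelihood matches n*alpha/beta^2 (up to
# discretization error), whatever sample is drawn.
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)
n, alpha, beta = 500, 3.0, 2.0                       # arbitrary illustrative values
x = rng.gamma(shape=alpha, scale=1 / beta, size=n)   # rate beta -> scale 1/beta

def loglik(b):
    return gamma.logpdf(x, a=alpha, scale=1 / b).sum()

h = 1e-4
d2 = (loglik(beta + h) - 2 * loglik(beta) + loglik(beta - h)) / h**2

print(-d2)                  # observed information, ~ 375
print(n * alpha / beta**2)  # Fisher information: 375.0
```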
