[Math] Fisher Information of log-normal distribution

fisher information, normal distribution, probability distributions

I have the pdf of a log-normal distribution

$$ f(y;\theta)= \frac {1}{y\sqrt{2\pi\theta}}\exp \left(-{\frac {(\log y)^2}{ 2\theta}}\right)$$ for $y>0$ and $\theta>0$ and $f(y; \theta) = 0$ otherwise.

and I assume that $E(\log Y_i) = 0$ and $\operatorname{var}(\log Y_i) = \theta$.

I've calculated the log likelihood to be
$$l(\theta) = \log \prod_{i=1}^n f(y_i;\theta) \propto -\sum_{i=1}^n \log y_i - \frac{n}{2} \log \theta - \frac{1}{2\theta}\sum_{i=1}^n (\log y_i)^2$$

first derivative:
$$l'(\theta) = -\frac{n}{2\theta} + \frac{1}{2\theta^2} \sum_{i=1}^n (\log y_i)^2$$

second derivative:

$$l''(\theta) = \frac{n}{2\theta^2} - \frac{1}{\theta^3} \sum_{i=1}^n (\log y_i)^2$$

Fisher information:

$$-E(l''(\theta)) = -\frac{n}{2\theta^2} + \frac{1}{\theta^3} E\left[\sum_{i=1}^n (\log Y_i)^2\right]$$

(can someone please tell me if the derivatives are correct?)
I'm stuck on the Fisher information calculation. Can someone please help me get it right, and show me how to derive the standard error from it?

Best Answer

For a sample of $n$ i.i.d. observations, use the result $$I(\theta)=-nE_{\theta}\left[\frac{\partial^2}{\partial\theta^2}\ln f(Y_1;\theta)\right],$$ where $I(\theta)$ is the information about $\theta$ contained in $Y_1,Y_2,\ldots,Y_n$. This formula is valid here because the $Y_i$ are i.i.d.

It is enough to do the calculation for a single observation $y_1$:

The log-likelihood is $$\ln f(y_1;\theta)=-\frac{1}{2}\ln(2\pi\theta)-\ln y_1-\frac{1}{2\theta}(\ln y_1)^2$$

Therefore, $$\frac{\partial}{\partial\theta}\ln f(y_1;\theta)=-\frac{1}{2\theta}+\frac{1}{2\theta^2}(\ln y_1)^2$$

And

$$\frac{\partial^2}{\partial\theta^2}\ln f(y_1;\theta)=\frac{1}{2\theta^2}-\frac{1}{\theta^3}(\ln y_1)^2$$
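The two derivatives above can be checked numerically against finite differences of $\ln f(y_1;\theta)$. A minimal sketch, assuming numpy; the values of `y1` and `theta` are arbitrary:

```python
import numpy as np

# Sanity check (a sketch): compare the analytic first and second derivatives
# of ln f(y1; theta) with central finite differences.
y1, theta = 1.7, 2.0
L2 = np.log(y1) ** 2  # (ln y1)^2

def lnf(t):
    # ln f(y1; t) = -0.5*ln(2*pi*t) - ln(y1) - (ln y1)^2 / (2t)
    return -0.5 * np.log(2 * np.pi * t) - np.log(y1) - L2 / (2 * t)

d1 = -1 / (2 * theta) + L2 / (2 * theta ** 2)  # analytic first derivative
d2 = 1 / (2 * theta ** 2) - L2 / theta ** 3    # analytic second derivative

h = 1e-4
fd1 = (lnf(theta + h) - lnf(theta - h)) / (2 * h)
fd2 = (lnf(theta + h) - 2 * lnf(theta) + lnf(theta - h)) / h ** 2
print(abs(fd1 - d1), abs(fd2 - d2))  # both should be tiny
```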

So, $$E_{\theta}\left[\frac{\partial^2}{\partial\theta^2}\ln f(Y_1;\theta)\right]=\frac{1}{2\theta^2}-\frac{1}{\theta^3}E_{\theta}(\ln Y_1)^2$$

Since $\ln Y_1\sim N(0,\theta)$, you have $E_{\theta}(\ln Y_1)^2=V_{\theta}(\ln Y_1)+(E_{\theta}(\ln Y_1))^2=\theta$.
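The fact that $E_{\theta}(\ln Y_1)^2=\theta$ is easy to confirm by simulation. A quick sketch, assuming numpy:

```python
import numpy as np

# If ln Y ~ N(0, theta), then E[ln Y] = 0 and E[(ln Y)^2] = Var(ln Y) = theta.
rng = np.random.default_rng(2)
theta = 3.0
lny = rng.normal(0.0, np.sqrt(theta), 1_000_000)
print(lny.mean(), (lny ** 2).mean())  # ~0 and ~theta
```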

Hence the Fisher information is $$I(\theta)=-n\left(\frac{1}{2\theta^2}-\frac{1}{\theta^3}\cdot\theta\right)=\frac{n}{2\theta^2}$$

As for the standard error: setting $l'(\theta)=0$ gives the MLE $\hat{\theta}=\frac{1}{n}\sum_{i=1}^n(\ln y_i)^2$, whose asymptotic variance is $1/I(\theta)$. So $$\operatorname{se}(\hat{\theta})=\sqrt{\frac{1}{I(\hat{\theta})}}=\hat{\theta}\sqrt{\frac{2}{n}}$$
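The result can be verified by Monte Carlo: the MLE $\hat{\theta}=\frac{1}{n}\sum(\ln y_i)^2$ should have sampling variance close to $1/I(\theta)=2\theta^2/n$ for large $n$. A sketch, assuming numpy; `theta`, `n`, and `reps` are arbitrary choices:

```python
import numpy as np

# Monte Carlo check: the empirical variance of theta_hat across replicates
# should match the asymptotic variance 1 / I(theta) = 2 * theta^2 / n.
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 500, 20_000
logy = rng.normal(0.0, np.sqrt(theta), size=(reps, n))  # log Y ~ N(0, theta)
theta_hat = np.mean(logy ** 2, axis=1)                  # MLE in each replicate
emp_var = theta_hat.var()
asym_var = 2 * theta ** 2 / n                           # = 0.016
print(emp_var, asym_var)
```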