[Math] Calculating the Fisher (expected) information

Let $X_1,\dots, X_n$ be a random sample from a distribution with probability density function $f(x;\theta) = \frac{1}{2\theta}\exp\left(-\frac{|x|}{\theta}\right)$ for $-\infty<x<\infty$, where $\theta>0$.

(a) Find the maximum likelihood estimator of $\theta$ and calculate the Fisher (expected) information in the sample.

I've calculated the MLE to be $\sum |X_i|/n$, and I know the definition of the Fisher information, but I'm getting stuck on actually computing it. I think the $|X_i|$ terms are throwing me off.
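For reference, here is the quick numerical check I used to convince myself of that closed form; the simulated data and the use of scipy's bounded minimiser are my own choices, not part of the problem:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
x = rng.laplace(scale=3.0, size=1_000)  # simulated data with true theta = 3

# Negative log-likelihood for f(x; theta) = exp(-|x|/theta) / (2*theta).
def nll(theta):
    return len(x) * np.log(2 * theta) + np.abs(x).sum() / theta

res = minimize_scalar(nll, bounds=(1e-6, 100.0), method="bounded")
print(res.x)             # numerical maximiser of the likelihood
print(np.abs(x).mean())  # closed-form MLE sum|X_i|/n -- the two should agree
```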

Any help in doing this problem would be much appreciated!

Thanks

Best Answer

Recall that, under certain regularity conditions (which hold here), $$I(\theta)=-\mathbb{E}\left[\frac{\partial^2}{\partial \theta^2}\,l(X\,|\,\theta)\right],$$ where $I$ is the Fisher information and $l$ is the log-likelihood function of $X$. The log-likelihood in this case is $$\begin{align} l(X\,|\,\theta) &=\log f(X\,|\,\theta) \\&=\log\left(\frac{1}{2\theta}\exp\left(-\frac{|X|}{\theta}\right)\right) \\ &= -\frac{|X|}{\theta} - \log(2\theta)\,. \end{align}$$

It follows that $$\frac{\partial}{\partial \theta}l(X \,|\,\theta) = \frac{|X|}{\theta^2}-\frac{1}{\theta} \implies \frac{\partial^2}{\partial \theta^2}l(X \,|\,\theta) = -\frac{2|X|}{\theta^3}+\frac{1}{\theta^2}\,.$$ So we have $$I(\theta)=-\mathbb{E}\left[-\frac{2|X|}{\theta^3}+\frac{1}{\theta^2}\right]=\mathbb{E}\left[\frac{2|X|}{\theta^3}-\frac{1}{\theta^2}\right]=\frac{2}{\theta^3}\mathbb{E}(\,|X|\,)-\frac{1}{\theta^2}\,.$$

It remains to compute $\mathbb{E}(\,|X|\,)$. By the definition of the expected value of a transformation of a continuous random variable, and using the symmetry of the integrand about $0$, $$\mathbb{E}(\,|X|\,)=\int_{-\infty}^{\infty}\frac{|x|}{2\theta}\exp\left(-\frac{|x|}{\theta}\right)\,\text{d}x = 2\int_{0}^{\infty}\frac{x}{2\theta}\,e^{-x/\theta}\,\text{d}x = \frac{1}{\theta}\int_{0}^{\infty}x\,e^{-x/\theta}\,\text{d}x = \theta\,,$$ where the last step uses the standard Gamma integral $\int_{0}^{\infty}x\,e^{-x/\theta}\,\text{d}x = \theta^2$.
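If you want to double-check that expectation numerically, here is a minimal sketch; the value of $\theta$ is an arbitrary test choice:

```python
import numpy as np
from scipy.integrate import quad

theta = 1.7  # arbitrary test value; any theta > 0 gives the same conclusion

# Density from the problem: f(x; theta) = exp(-|x|/theta) / (2*theta).
def f(x):
    return np.exp(-abs(x) / theta) / (2 * theta)

# Exploit symmetry: E|X| = 2 * integral over [0, inf) of x * f(x) dx.
expected_abs, _ = quad(lambda x: 2 * x * f(x), 0, np.inf)

print(expected_abs)  # ~1.7, i.e. E|X| = theta
```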

Thus, the Fisher information in a single observation is $$I(\theta)= \frac{2}{\theta^3}\mathbb{E}(\,|X|\,)-\frac{1}{\theta^2} = \frac{2}{\theta^2}-\frac{1}{\theta^2}=\frac{1}{\theta^2}\,.$$ For a sample $X_1,X_2,\dots,X_n$ of size $n$, the Fisher information is then $$I_n(\theta)=nI(\theta)=\frac{n}{\theta^2}\,.$$ Therefore, by the Cramér–Rao inequality, the variance of any unbiased estimator $\hat{\theta}$ of $\theta$ is bounded below by the reciprocal of the sample Fisher information: $$\text{Var}(\hat{\theta}) \geq \frac{1}{nI(\theta)} = \frac{\theta^2}{n}\,.$$ The MLE you have computed attains this bound: it is unbiased, since $\mathbb{E}(\,|X_i|\,)=\theta$, and $\text{Var}(\,|X_i|\,)=\mathbb{E}(X_i^2)-\theta^2=2\theta^2-\theta^2=\theta^2$, so $\text{Var}(\hat{\theta})=\theta^2/n$ exactly. It is therefore an efficient estimator.
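As a side note, the bound (and the fact that the MLE attains it) is easy to verify by simulation. Here is a minimal sketch, with arbitrary test values for $\theta$, $n$, and the number of replications:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 50, 200_000  # arbitrary test values

# numpy's Laplace(loc=0, scale=theta) matches the density in the problem.
samples = rng.laplace(loc=0.0, scale=theta, size=(reps, n))

# MLE for each replication: theta_hat = sum(|X_i|) / n.
mle = np.abs(samples).mean(axis=1)

print(mle.mean())    # ~2.0: the MLE is unbiased here
print(mle.var())     # ~0.08: empirical Var(theta_hat)
print(theta**2 / n)  # 0.08: the Cramér–Rao bound theta^2 / n
```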
