[Math] Cramér-Rao Lower Bound for estimator of mean in Exponential distribution

Tags: exponential distribution, maximum likelihood, parameter estimation, variance

Let $X_{1},\dots,X_{n}$ be a random sample of size $n\geq3$ from the exponential distribution with mean $1/\theta$.

(1) Find a sufficient statistic $T(X)$ for $\theta$ and write down its density.

(2) Obtain the maximum likelihood estimator $\hat{\Theta}_{n}$ and show that it is biased, but that a multiple of it is not.

(3) Calculate the Cramér-Rao Lower Bound for the variance of an unbiased estimator, and explain why you would not expect the bound to be attained in this example. Confirm this by calculating the variance of your unbiased estimator and comment on its behaviour as $n\rightarrow\infty$.

For (1) I found that a sufficient statistic for $\theta$ is $T(X)=\sum_{i=1}^{n}x_{i}$, and the corresponding likelihood function is $L(\theta\mid X)=\theta e^{-\theta T(X)}$.

Next, for (2) I found the maximum likelihood estimator to be $\hat{\Theta}_{n}=\frac{1}{\sum_{i=1}^{n}x_{i}}$, which is biased, since $E[\hat{\Theta}_{n}]=\frac{\theta}{n-1}$. I therefore thought that the estimator $\frac{1}{n-1}\cdot\frac{1}{\sum_{i=1}^{n}x_{i}}$ is unbiased.

Finally, I do not properly understand (3), and I am not sure whether the answers I found for (1) and (2) are correct.

Best Answer

I think you are missing an $n$ in the maximum likelihood estimator: it is the reciprocal of the sample mean, $\hat{\Theta}_{n}=\frac{n}{\sum_{i=1}^{n}x_{i}}$. The density you found in point (1) is also not correct: with $n$ independent observations you need $\theta^{n}$ instead of $\theta$ as the factor multiplying the exponential, i.e. $L(\theta\mid X)=\theta^{n}e^{-\theta T(X)}$. The MLE is biased precisely because it is the reciprocal of a random quantity; note also that the expectation $\frac{\theta}{n-1}$ you computed belongs to $\frac{1}{\sum_{i=1}^{n}x_{i}}$ rather than to the MLE itself, and that in general \begin{equation} E\left[\frac{1}{X} \right ] \neq \frac{1}{E[X]} \end{equation}
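As a quick numerical illustration of that inequality (a sketch assuming numpy; the values of $\theta$, $n$, and the replication count are arbitrary choices), compare $E[1/S]$ with $1/E[S]$ for the sum $S=\sum_{i=1}^{n}x_{i}$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000   # hypothetical values chosen for the check

# reps independent samples of size n from Exp(rate = theta); S is each sample's sum
S = rng.exponential(scale=1/theta, size=(reps, n)).sum(axis=1)

print(np.mean(1 / S))   # Monte Carlo estimate of E[1/S]; matches theta/(n-1) = 0.5 derived below
print(1 / np.mean(S))   # 1/E[S] is approximately theta/n = 0.4, a different number
```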

Instead, one can aim at estimating $\beta := \frac{1}{\theta}$ instead of $\theta$. This amounts to a re-parameterization of the distribution. In this case the MLE of $\beta$ is the sample mean, $\hat{\beta} = \frac{1}{n} \sum_{i=1}^n x_i$. Because $E[x_i] = \frac{1}{\theta} = \beta$ and the expectation operator is linear (i.e. $E[\sum x_i] = \sum E[x_i]$), this MLE is unbiased for $\beta$.
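Spelled out, this is just linearity: \begin{equation} E[\hat{\beta}] = E\left[\frac{1}{n}\sum_{i=1}^{n}x_{i}\right] = \frac{1}{n}\sum_{i=1}^{n}E[x_{i}] = \frac{1}{n}\cdot n\beta = \beta \end{equation}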

Now observe that an exponential distribution is a special case of a Gamma distribution. We have \begin{equation} x_i \sim \text{Gamma}(1, \theta) \end{equation} and, due to the independence of the $x_i$ and the fact that a sum of independent Gamma variables with a common rate is again Gamma, it holds that \begin{equation} \sum_{i=1}^n x_i \sim \text{Gamma}(n, \theta) \end{equation}
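This can be sanity-checked by simulation (a sketch assuming numpy and scipy; the parameter values are arbitrary choices). One caveat on conventions: `scipy.stats.gamma` is parameterized by a shape and a scale, and the scale is the reciprocal of the rate $\theta$ used here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 8, 100_000   # hypothetical values chosen for the check

# Sum of n iid Exp(rate = theta) draws, replicated reps times
S = rng.exponential(scale=1/theta, size=(reps, n)).sum(axis=1)

# Kolmogorov-Smirnov distance to Gamma(shape = n, rate = theta);
# scipy's gamma uses scale = 1/rate
result = stats.kstest(S, stats.gamma(a=n, scale=1/theta).cdf)
print(result.statistic)   # close to 0: the Gamma(n, theta) model fits
```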

Now, the distribution of the inverse of a Gamma random variable is the inverse-gamma distribution.

Therefore, \begin{equation} \frac{1}{\sum_{i=1}^n x_i} \sim \text{Inv-Gamma}(n, \theta) \end{equation} whose expectation is given by \begin{equation} E\left[\frac{1}{\sum_{i=1}^n x_i}\right] = \frac{\theta}{n-1} \end{equation}
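As a cross-check (a sketch assuming scipy, whose `invgamma` takes shape $a=n$ and scale $=\theta$ under this parameterization):

```python
from scipy import stats

theta, n = 2.0, 5   # hypothetical values chosen for the check

# scipy's invgamma has mean scale/(a - 1) for a > 1, which is theta/(n-1)
# under the identification a = n, scale = theta
print(stats.invgamma(a=n, scale=theta).mean())   # 0.5
print(theta / (n - 1))                           # 0.5
```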

Using this result, you can find the bias of the MLE of $\theta$. Since the MLE is $n$ times the reciprocal of the sum, \begin{equation} E\left[\frac{n}{\sum_{i=1}^n x_i} \right] = n \cdot \frac{\theta}{n-1} = \frac{n}{n-1}\theta \end{equation} and hence the bias is \begin{equation} E\left[\frac{n}{\sum_{i=1}^n x_i} \right] - \theta = \frac{n \theta}{n-1} - \theta = \frac{1}{n-1} \theta \end{equation}

Therefore, if you multiply the MLE of $\theta$ by $\frac{n-1}{n}$ it becomes unbiased.
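Explicitly: \begin{equation} E\left[\frac{n-1}{n}\cdot\frac{n}{\sum_{i=1}^n x_i}\right] = \frac{n-1}{n}\cdot\frac{n}{n-1}\,\theta = \theta \end{equation}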

Point (3) asks you to find the distribution of the unbiased estimator, i.e. the distribution of the random variable \begin{equation} (n-1) \frac{1}{\sum_{i=1}^n x_i} \end{equation} which we already computed above (up to the scale factor $n-1$), and to use it to compute the variance of the unbiased estimator and show that it is not equal to the CRLB.
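For completeness (these values are not spelled out above, but follow from standard results): the per-observation Fisher information of the exponential is $I(\theta)=1/\theta^{2}$, so the CRLB for unbiased estimators of $\theta$ is $\theta^{2}/n$, while the inverse-gamma variance gives, for $n\geq3$, \begin{equation} \operatorname{Var}\left[(n-1)\frac{1}{\sum_{i=1}^n x_i}\right] = (n-1)^{2}\,\frac{\theta^{2}}{(n-1)^{2}(n-2)} = \frac{\theta^{2}}{n-2} > \frac{\theta^{2}}{n} \end{equation} The bound is not attained because the CRLB is achieved only by estimators that are affine functions of the score, which for this family happens when estimating the mean $1/\theta$ (via the sample mean), not the rate $\theta$. Still, the ratio $\frac{n}{n-2}\rightarrow1$ as $n\rightarrow\infty$, so the unbiased estimator is asymptotically efficient. A simulation sketch (assuming numpy; the values of $\theta$, $n$, and the replication count are arbitrary choices) confirms the picture:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.0, 10, 400_000   # hypothetical values chosen for the check

S = rng.exponential(scale=1/theta, size=(reps, n)).sum(axis=1)
unbiased = (n - 1) / S              # the bias-corrected estimator

print(np.mean(unbiased))   # ~ theta = 2.0: unbiased
print(np.var(unbiased))    # ~ theta**2/(n-2) = 0.5
print(theta**2 / n)        # CRLB = theta**2/n = 0.4 < 0.5: bound not attained
```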
