Statistics – Bias of Maximum Likelihood Estimator of an Exponential Distribution

statistics

The maximum likelihood estimator of an exponential distribution $f(x, \lambda) = \lambda e^{-\lambda x}$ is $\lambda_\text{MLE} = \frac n {\sum x_i}$; I know how to derive that by taking the derivative of the log likelihood and setting it equal to zero.

I then read in an article that "Unfortunately this estimator is clearly biased since $\left\langle \sum_i x_i\right\rangle$ is indeed $1/\lambda$ but $\langle 1/\sum_i x_i \rangle \neq \lambda$."

Why does $\left\langle\sum_i x_i\right\rangle = 1/\lambda$? If I am correct that the $\langle \cdot\rangle$ operator denotes the expected value, then I thought $E(x_i) = 1/\lambda$; that is, the expected value of a single $x_i$ is $1/\lambda$, not the expected value of the sum of all the $x_i$'s. Can someone also explain the second part of the statement, and how these two statements demonstrate that the MLE is biased?

Best Answer

$\newcommand{\e}{\operatorname{E}}$You have $$ \Pr\left( \sum_{i=1}^n x_i \in A \right) = \int\limits_A \frac 1 {\Gamma(n)} (\lambda u)^{n-1} e^{-\lambda u} (\lambda\,du), $$ i.e. the sum of $n$ i.i.d. $\operatorname{Exponential}(\lambda)$ random variables has a $\operatorname{Gamma}(n,\lambda)$ (rate parametrization) distribution.
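As a quick sanity check (not part of the original answer), here is a short Monte Carlo sketch, assuming NumPy is available: it draws sums of $n$ exponentials and compares their sample mean and variance to the $\operatorname{Gamma}(n,\lambda)$ values $n/\lambda$ and $n/\lambda^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, trials = 5, 2.0, 200_000  # illustrative choices, not from the question

# Draw `trials` independent samples of the sum of n iid Exponential(lam) variables.
# NumPy's `scale` parameter is 1/lambda (the mean), not the rate.
sums = rng.exponential(scale=1 / lam, size=(trials, n)).sum(axis=1)

# A Gamma(n, rate=lam) variable has mean n/lam and variance n/lam**2.
print(sums.mean())  # ≈ n / lam = 2.5
print(sums.var())   # ≈ n / lam**2 = 1.25

# Compare against direct Gamma(shape=n, rate=lam) draws.
gamma_draws = rng.gamma(shape=n, scale=1 / lam, size=trials)
print(gamma_draws.mean())  # ≈ 2.5
```

With 200,000 trials the sample moments agree with the Gamma moments to two decimal places or so, which is consistent with the sum being Gamma distributed.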

So \begin{align} \e\left( \frac n {\sum_{i=1}^n x_i} \right) & = \int_0^\infty \frac n u \cdot \frac 1 {\Gamma(n)} (\lambda u)^{n-1} e^{-\lambda u} (\lambda\,du) \\[10pt] & = \frac {n\lambda} {\Gamma(n)} \int_0^\infty (\lambda u)^{n-2} e^{-\lambda u} (\lambda\,du) \\[10pt] & = \frac{n\lambda}{\Gamma(n)} \cdot\Gamma(n-1) = \frac n {n-1} \lambda. \end{align} Since $\e(\lambda_\text{MLE}) = \frac{n}{n-1}\lambda \neq \lambda$, the estimator is biased: on average it overestimates $\lambda$. Multiplying by $\frac{n-1}{n}$, i.e. using $\frac{n-1}{\sum_i x_i}$, gives an unbiased estimator.
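The conclusion $\e(\lambda_\text{MLE}) = \frac{n}{n-1}\lambda$ can also be checked by simulation. A minimal sketch, assuming NumPy (the sample size, rate, and trial count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, lam, trials = 5, 2.0, 200_000  # illustrative choices

# Each row is one sample of size n from Exponential(lam);
# NumPy's `scale` is the mean 1/lambda.
samples = rng.exponential(scale=1 / lam, size=(trials, n))

# The MLE n / sum(x_i), computed once per sample.
lam_mle = n / samples.sum(axis=1)

# The derivation predicts E(lam_mle) = n/(n-1) * lam = 2.5, not lam = 2.0.
print(lam_mle.mean())  # ≈ 2.5

# The bias-corrected estimator (n-1)/sum(x_i) should average to lam itself.
print(((n - 1) / n * lam_mle).mean())  # ≈ 2.0
```

The gap between the simulated mean of the MLE and the true $\lambda = 2.0$ is exactly the $\frac{n}{n-1}$ factor from the integral above, and it shrinks as $n$ grows, so the MLE is biased but asymptotically unbiased.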