MLE of the Geometric Distribution

Tags: maximum likelihood, means, probability distributions, statistics

Suppose that $X_{1},X_{2},…,X_{n}$ are independently and identically distributed as $Ge(\theta)$.

(i) Find the maximum likelihood estimator of $\theta$.

My solution:

$\hat\theta = \frac{n}{\sum_{i=1}^{n}x_{i}} = \frac{1}{\bar x}$

(This uses the classical geometric distribution on $1,2,\ldots$, for which $E(\bar X) = \frac{1}{\theta}$.)
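As a quick numerical sanity check (not part of the original question), the closed-form estimator $\hat\theta = n/\sum x_i$ for the classical geometric can be compared against a brute-force grid search over the log-likelihood; the seed, sample size, and true $\theta$ below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.4
n = 10_000

# Classical geometric on {1, 2, ...} (numpy's default parameterization).
x = rng.geometric(theta, size=n)

# Closed-form MLE from part (i): theta_hat = n / sum(x) = 1 / x_bar.
theta_hat = n / x.sum()

# Grid-search check: for pmf theta*(1-theta)^(x-1),
# log L(t) = n*log(t) + (sum(x) - n)*log(1 - t).
grid = np.linspace(0.001, 0.999, 9_999)
loglik = n * np.log(grid) + (x.sum() - n) * np.log(1.0 - grid)
theta_grid = grid[np.argmax(loglik)]

print(theta_hat, theta_grid)
```

The grid maximizer should agree with the closed-form value up to the grid resolution.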

(ii) Hence show that the maximum likelihood estimator of $\psi = \frac{(1-\theta)}{\theta}$ is the sample mean $(\bar X)$.

Try as I might, I can't rearrange my answer to part (i) into the form required in part (ii). Could someone please help me?

Best Answer

Regrettably, there are two distributions that are called geometric [1]: the classical one, taking values in $1,2,\ldots$, and the shifted variant that takes values in $0,1,2,\ldots$. The problem you posed was probably referring to the latter, where the likelihood of the sample $X_1,\ldots,X_n$ is $$L(\theta)=\prod_{j=1}^n \theta (1-\theta)^{X_j} \,.$$

By differentiating $\log L$, we see that $L(\theta)$ is maximized at a parameter $\hat{\theta}$ that satisfies $$\sum_{j=1}^n \Bigl(\frac{1}{\hat{\theta}}-\frac{X_j}{1-\hat{\theta}}\Bigr)=0 \,.$$

Dividing by $n$, we infer that the MLE $\hat{\theta}$ satisfies $$\frac{1}{\hat{\theta}}=\frac{\bar{X}}{1-\hat{\theta}} \,,$$ so $$ \hat{\theta} =\frac{1}{1+\bar{X}} \quad \text{and} \quad \frac{1-\hat{\theta}}{\hat{\theta}}=\bar{X} \,.$$
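The derivation above can be checked numerically (a sketch I am adding for illustration; the seed and parameters are arbitrary): simulate from the shifted geometric, confirm that $\hat\theta = 1/(1+\bar X)$ matches a brute-force maximization of the log-likelihood, and confirm that the MLE of $\psi = (1-\theta)/\theta$ (via the invariance of MLEs under reparameterization) equals the sample mean exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.3
n = 10_000

# numpy's geometric samples from {1, 2, ...}; subtract 1 for the
# shifted variant on {0, 1, 2, ...} used in the answer above.
x = rng.geometric(theta, size=n) - 1

# Closed-form MLE for the shifted geometric.
x_bar = x.mean()
theta_hat = 1.0 / (1.0 + x_bar)

# By invariance, the MLE of psi = (1 - theta)/theta plugs in theta_hat,
# which simplifies algebraically to the sample mean.
psi_hat = (1.0 - theta_hat) / theta_hat

# Grid-search check: log L(t) = n*log(t) + sum(x)*log(1 - t).
grid = np.linspace(0.001, 0.999, 9_999)
loglik = n * np.log(grid) + x.sum() * np.log(1.0 - grid)
theta_grid = grid[np.argmax(loglik)]

print(theta_hat, theta_grid, psi_hat, x_bar)
```

Here `psi_hat` equals `x_bar` up to floating-point rounding, which is exactly the claim in part (ii).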

[1] https://en.wikipedia.org/wiki/Geometric_distribution
