Maximum Likelihood Estimation – MLE for Gamma Distribution Function

estimators, gamma distribution, maximum likelihood, self-study

Let $X_{1},X_{2},X_{3},\ldots,X_{n}$ be a random sample from a distribution with pdf
$$f(x;\alpha,\theta)=\frac{e^{-x/\theta}}{\theta^{\alpha}\Gamma(\alpha)}x^{\alpha-1}I_{(0,\infty)}(x),\qquad \alpha,\theta>0.$$

Find the maximum likelihood estimators of $\alpha$ and $\theta$. Let $\Psi(\alpha)=\frac{d\Gamma(\alpha)}{d\alpha}$.

My attempt:
\begin{eqnarray*}
\mathcal{L}(\alpha,\theta)&=&\prod_{i=1}^{n}f(x_i)\\
&=&\prod_{i=1}^{n}\frac{e^{-x_i/\theta}}{\theta^{\alpha}\Gamma(\alpha)}x_i^{\alpha-1}\\
&=&\frac{1}{\Gamma^{n}(\alpha)\cdot \theta^{n \alpha}}(\prod_{i=1}^{n}x_i)^{\alpha-1}\exp(-\sum_{i=1}^{n}\frac{x_i}{\theta})
\end{eqnarray*}
\begin{eqnarray*}
\ell(\alpha,\theta)&=&-n\log(\Gamma(\alpha))-n\alpha\log(\theta)+(\alpha-1)\sum_{i=1}^{n}\log(x_i)-\frac{1}{\theta}\sum_{i=1}^{n}x_i\\
\frac{\partial \ell(\alpha,\theta)}{\partial \theta}&=&-\frac{n\alpha}{\theta}+\frac{1}{\theta^2}\sum_{i=1}^{n}x_i=0\\
\frac{1}{\theta^2}\sum_{i=1}^{n}x_i&=&\frac{n\alpha}{\theta}\\
\hat{\theta}&=&\frac{\sum_{i=1}^{n}x_i}{n\alpha}\\
&=&\frac{1}{\alpha}\bar{x}\\
\end{eqnarray*}
\begin{eqnarray*}
\frac{d \ell(\alpha,\hat{\theta})}{d\alpha}&=&\frac{-n \cdot \Gamma'(\alpha)}{\Gamma(\alpha)}-n\log\left(\frac{1}{\alpha}\bar{x}\right)+\sum_{i=1}^{n}\log(x_i)=0\\
0&=&\frac{-n \cdot \Gamma'(\alpha)}{\Gamma(\alpha)}+n\log(\alpha)-n\log(\bar{x})+\sum_{i=1}^{n}\log(x_i)\\
\log(\alpha)-\frac{\Gamma'(\alpha)}{\Gamma(\alpha)}&=&\log(\bar{x})-\frac{1}{n}\sum_{i=1}^{n}\log(x_i)
\end{eqnarray*}

I couldn't proceed any further in finding $\alpha$. Second, I do not know how to use $\Psi(\alpha)=\frac{d\Gamma(\alpha)}{d\alpha}$ as given in the question. I hope someone can explain it to me.

Thanks in advance.

Best Answer

Let $\psi(\alpha) = \frac{\Gamma'(\alpha)}{\Gamma(\alpha)}$ so $\psi$ is the digamma function (I'm using $\psi$ rather than your $\Psi$).

By the AM-GM inequality $$ \bar x \geq \left(\prod_i x_i\right)^{1/n} $$ so $$ \log \bar x - \overline{\log x} \geq 0 $$ (where $\log \bar x$ and $\log x_i$ are defined almost surely). Furthermore, equality only holds for $x_1=\dots=x_n$ which is a probability $0$ event, so $\log \bar x - \overline{\log x} > 0$ almost surely.

For simplicity, I'll take $y = \log \bar x - \overline{\log x}$.

Consider $f(\alpha) = \log(\alpha) - \psi(\alpha)$ on $(0,\infty)$. This is continuous and $$ \lim_{\alpha\to 0^+} f(\alpha) = \infty, \qquad \lim_{\alpha\to\infty} f(\alpha) = 0, $$ so by the intermediate value theorem $f$ attains every value in $(0,\infty)$. In particular, $$ f^{-1}\left(\left\{y\right\}\right) \neq \emptyset, $$ i.e. there is at least one point in $(0,\infty)$ mapped to $y$, since $y > 0$.

Furthermore, $f$ turns out to be injective on $(0,\infty)$ since $f' < 0$, so there is actually a unique $\hat \alpha$ with $f(\hat\alpha) = y$.
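(To see why $f' < 0$, one can use the series for the trigamma function $\psi'$; this is a standard fact, sketched here only for completeness:)
$$ f'(\alpha) = \frac{1}{\alpha} - \psi'(\alpha), \qquad \psi'(\alpha) = \sum_{k=0}^{\infty}\frac{1}{(\alpha+k)^{2}} > \int_{0}^{\infty}\frac{dt}{(\alpha+t)^{2}} = \frac{1}{\alpha}, $$
where the inequality holds because $t \mapsto (\alpha+t)^{-2}$ is decreasing, so its sum over integer left endpoints dominates the integral. Hence $f'(\alpha) < 0$ on $(0,\infty)$.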

Actually finding this $\hat \alpha$ will require numerical methods though, as @StubbornAtom says.
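If it helps, here is a minimal numerical sketch of that last step in Python (the simulated sample, the bracketing choices, and the variable names below are illustrative assumptions, not part of the original problem): it solves $\log\alpha - \psi(\alpha) = y$ for $\hat\alpha$ and then sets $\hat\theta = \bar x/\hat\alpha$.

```python
import numpy as np
from scipy.special import digamma
from scipy.optimize import brentq

# Illustrative sample only; replace with the observed data x_1, ..., x_n.
rng = np.random.default_rng(0)
x = rng.gamma(2.5, 1.3, size=500)   # shape 2.5, scale 1.3

# y = log(x-bar) - mean(log x); strictly positive a.s. by AM-GM.
y = np.log(x.mean()) - np.log(x).mean()

# f(a) = log(a) - psi(a) - y is continuous, strictly decreasing,
# tends to +inf as a -> 0+ and to -y < 0 as a -> inf, so it has a unique root.
def f(a):
    return np.log(a) - digamma(a) - y

lo, hi = 1e-8, 1.0
while f(hi) > 0:        # grow the upper bracket until the sign changes
    hi *= 2.0

alpha_hat = brentq(f, lo, hi)      # unique root: the MLE of alpha
theta_hat = x.mean() / alpha_hat   # first score equation: theta-hat = x-bar / alpha-hat

print(alpha_hat, theta_hat)
```

Newton's method with the trigamma function ($\psi'$, available as `scipy.special.polygamma(1, a)`) would also work, but a bracketing solver avoids having to pick a good starting value.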
