[Math] Maximum Likelihood Estimation with a Gamma distribution

statistics

I have this problem that I stumbled upon. Suppose the random variable $X$ follows a Gamma distribution with parameters $\alpha$ and $\beta$ with the probability density function for $x>0$ as

$$f(x)= \frac{\beta^\alpha}{\Gamma(\alpha)} x^{\alpha-1} \exp(-\beta x)$$

where $\Gamma(\alpha)$ represents the Gamma function with $\Gamma(\alpha)=(\alpha-1)!$ when $\alpha$ is a natural number.

Further suppose we know that for the random variable $X$, the parameter $\alpha=4$. We record the independent observations $X_1,X_2,\ldots,X_n$ as a random sample from the distribution.

I must find the likelihood function for $\beta$, $L(\beta)$, given $\alpha=4$; the maximum likelihood estimator $\hat\beta$; and show that this is indeed a maximum.

I found that the maximum likelihood estimator is $\hat\beta = 4n/\sum x_i$, but I am not sure whether my reasoning is correct. Any help will be much appreciated.

Best Answer

\begin{align} L(\beta) & = \prod_{i=1}^n \frac{\beta^4}{\Gamma(4)} x_i^{4-1} \exp(-\beta x_i) \\[8pt] & \propto \beta^{4n} \exp\left(-\beta\sum_{i=1}^n x_i\right) \end{align}

(The factor $\prod_{i=1}^n x_i^3$ does not depend on $\beta$ and so is part of the constant of proportionality, as is $(\Gamma(4))^{-n}$.)

The log-likelihood is
$$ \ell(\beta) = \log L(\beta) = C + 4n\log\beta -\beta\sum_{i=1}^n x_i $$
and its derivative is
$$ \ell'(\beta) = \frac{4n} \beta -\sum_{i=1}^n x_i \quad \begin{cases} >0 & \text{if } 0<\beta<\dfrac{4n}{\sum_{i=1}^n x_i}, \\[6pt] = 0 & \text{if } \beta=\dfrac{4n}{\sum_{i=1}^n x_i}, \\[6pt] <0 & \text{if } \beta>\dfrac{4n}{\sum_{i=1}^n x_i}. \end{cases} $$

Since $\ell'$ is positive to the left of $4n\big/\sum_{i=1}^n x_i$ and negative to the right, $\ell$ (and hence $L$) attains its global maximum there, confirming $\hat\beta = 4n\big/\sum_{i=1}^n x_i$.
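A quick numerical sanity check (not part of the original answer, and assuming NumPy and SciPy are available): simulate a Gamma($\alpha=4$, rate $\beta$) sample, compute the closed-form estimator $4n/\sum x_i$, and compare it against a direct numerical maximization of the log-likelihood above. The sample size and true rate below are arbitrary choices for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
alpha, beta_true, n = 4, 2.5, 10_000

# NumPy parameterizes the Gamma by shape and *scale*, so scale = 1/beta (rate).
x = rng.gamma(shape=alpha, scale=1 / beta_true, size=n)

# Closed-form MLE from the derivation: beta_hat = 4n / sum(x_i).
beta_hat = 4 * n / x.sum()

# Numerical check: maximize l(beta) = 4n log(beta) - beta * sum(x_i)
# (the constant C is irrelevant to the argmax).
neg_loglik = lambda b: -(4 * n * np.log(b) - b * x.sum())
res = minimize_scalar(neg_loglik, bounds=(1e-6, 100), method="bounded")

print(beta_hat, res.x)  # both should be close to beta_true = 2.5
```

The two estimates agree to optimizer tolerance, and with $n$ this large both sit close to the true rate.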
