Solved – Maximum Likelihood Estimation for Bernoulli distribution

bernoulli-distribution, maximum-likelihood

Suppose we have $X_1,\ldots, X_n$ iid Bernoulli($p$), and we are asked to find the MLE for $p$. I'm struggling with the second derivative of the log-likelihood function: why is it negative? My second question is: what is the MLE when the maximum is achieved on the boundary of the parameter space, i.e., when $\sum x_i = 0$ or $n$? Looking forward to any feedback and suggestions.

Best Answer

It's often easier to work with the log-likelihood in these situations than with the likelihood itself. Note that because $\log$ is strictly increasing, the log-likelihood attains its maximum at exactly the same $p$ as the likelihood.
$$
\begin{align*}
L(p) &= \prod_{i=1}^n p^{x_i}(1-p)^{1-x_i}\\
\ell(p) &= \log{p}\sum_{i=1}^n x_i + \log{(1-p)}\sum_{i=1}^n (1-x_i)\\
\dfrac{\partial\ell(p)}{\partial p} &= \dfrac{\sum_{i=1}^n x_i}{p} - \dfrac{\sum_{i=1}^n (1-x_i)}{1-p} \overset{\text{set}}{=}0\\
\sum_{i=1}^n x_i - p\sum_{i=1}^n x_i &= p\sum_{i=1}^n (1-x_i)\\
p &= \dfrac{1}{n}\sum_{i=1}^n x_i\\
\dfrac{\partial^2 \ell(p)}{\partial p^2} &= -\dfrac{\sum_{i=1}^n x_i}{p^2} - \dfrac{\sum_{i=1}^n (1-x_i)}{(1-p)^2}
\end{align*}
$$

The penultimate line gives the MLE, $\hat{p} = \frac{1}{n}\sum_{i=1}^n x_i$: the value of $p$ that sets the first derivative of the log-likelihood (also called the score function) to zero.
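As a sanity check, here is a minimal numerical sketch (assuming NumPy and SciPy are available; the simulated sample and the helper `neg_log_lik` are illustrative, not part of the derivation above). It maximizes the log-likelihood numerically and compares the result against the closed-form MLE, the sample mean:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.3, size=200)  # simulated Bernoulli(0.3) sample

def neg_log_lik(p):
    # negative log-likelihood; valid for 0 < p < 1
    return -(np.sum(x) * np.log(p) + np.sum(1 - x) * np.log(1 - p))

res = minimize_scalar(neg_log_lik, bounds=(1e-9, 1 - 1e-9), method="bounded")
print(res.x)     # numerical maximizer of the log-likelihood
print(x.mean())  # closed-form MLE, (1/n) * sum(x_i); should agree
```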

The last equation gives us the second derivative of the log-likelihood. Since $x_i \in \left\{0,1\right\}$, both terms are non-positive for $p \in (0,1)$, and at least one is strictly negative (the sums $\sum x_i$ and $\sum (1-x_i)$ add up to $n$, so they cannot both be zero). Hence the second derivative is strictly negative, the log-likelihood is strictly concave on $(0,1)$, and the critical point is a maximum.

As for the boundary cases: if $\sum_{i=1}^n x_i = 0$, then $\ell(p) = n\log(1-p)$, which is strictly decreasing in $p$, so the likelihood is maximized at $p = 0$; symmetrically, if $\sum_{i=1}^n x_i = n$, it is maximized at $p = 1$. In both cases the formula $\hat{p} = \frac{1}{n}\sum_{i=1}^n x_i$ still returns the maximizer, provided the parameter space is taken to be the closed interval $[0,1]$.
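To make the sign claim concrete, here is a small sketch (again assuming NumPy; the sample and the grid of $p$ values are illustrative) that evaluates the second derivative across the interior of the parameter space:

```python
import numpy as np

def d2_log_lik(p, x):
    # second derivative: -sum(x)/p^2 - sum(1-x)/(1-p)^2
    return -np.sum(x) / p**2 - np.sum(1 - x) / (1 - p)**2

x = np.array([1, 0, 0, 1, 1, 0, 1, 0])  # any sample of 0s and 1s
grid = np.linspace(0.01, 0.99, 99)      # interior points of [0, 1]
print(np.all([d2_log_lik(p, x) < 0 for p in grid]))  # True: strictly concave
```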