Compute the maximum likelihood estimator for $θ$.

statistics

Let $X_1, X_2, \ldots, X_n$ be a random sample from a Bernoulli distribution with parameter $\theta$. Compute the maximum likelihood estimator for $\theta$.

In my opinion this is the correct way to solve it:
$L(\theta)=\theta_1\cdot(1-\theta_1)\cdots\theta_n\cdot(1-\theta_n)=\theta^n\cdot(1-\theta)^n$
$l(\theta)=n\ln(\theta)+n\ln(1-\theta)$
$l'(\theta)=\frac{n}{\theta}-\frac{n}{1-\theta}=0$, so $\theta=\frac{1}{2}$.
The official solution is $\frac{\sum_{i=1}^n X_i}{n}$. Why is my MLE incorrect, and where should the $x_i$ appear, given that they are absent from my solution?

Best Answer

The likelihood function for the Bernoulli distribution can be written as follows: $$ L(\theta)=\begin{cases}\theta, & X_1=1\cr 1-\theta, & X_1=0\end{cases}\times \begin{cases}\theta, & X_2=1\cr 1-\theta, & X_2=0\end{cases}\times \cdots\times\begin{cases}\theta, & X_n=1\cr 1-\theta, & X_n=0\end{cases} $$ Note that $\sum_{i=1}^n X_i=n\overline X$ counts the number of ones in the sample, and $n-n\overline X$ counts the number of zeros. In the above product, $\theta$ appears as a factor once for each one in the sample, and $1-\theta$ appears as a factor once for each zero. So $$ L(\theta) = \theta^{n\overline X}\cdot (1-\theta)^{n-n\overline X}. $$ Starting from here you can calculate the MLE with derivatives.
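Carrying out those derivatives: $l(\theta)=n\overline X\ln\theta+(n-n\overline X)\ln(1-\theta)$, and setting $l'(\theta)=\frac{n\overline X}{\theta}-\frac{n-n\overline X}{1-\theta}=0$ gives $\hat\theta=\overline X$, matching the official solution. A minimal numerical sketch of this (the sample size, seed, and grid are arbitrary choices, not from the original post):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.3, size=1000)  # simulated Bernoulli(0.3) sample

# Log-likelihood l(theta) = s*ln(theta) + (n - s)*ln(1 - theta),
# where s = number of ones in the sample.
n, s = len(x), x.sum()
thetas = np.linspace(0.001, 0.999, 999)
loglik = s * np.log(thetas) + (n - s) * np.log(1 - thetas)

# The grid maximizer should agree with the closed-form MLE, the sample mean.
theta_hat = thetas[np.argmax(loglik)]
print(theta_hat, x.mean())
```

Note that the naive product $\theta^n(1-\theta)^n$ from the question would make the log-likelihood symmetric in $\theta$ and $1-\theta$, which is why it always peaks at $\frac12$ regardless of the data.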
