[Math] Maximum likelihood estimator of $p(1-p)$, where $p$ is the parameter of a Bernoulli distribution

maxima-minima, maximum-likelihood, probability, probability-distributions, statistical-inference

Problem

Let $X_1,\dots,X_n$ be a random sample with Bernoulli distribution of parameter $p$. Consider the new parameter $\theta=p(1-p)$. Find the MLE of $\theta$ and show that it is an asymptotically unbiased estimator but not an unbiased estimator.

So, I know that the likelihood function is $$L(x_1,\dots,x_n;p)=\prod_{i=1}^n p^{x_i}(1-p)^{1-x_i}$$

I don't think that using derivatives is an adequate approach to find the point that maximizes this function in terms of $\theta$; at least, I don't see how to differentiate it with respect to $\theta$. I can't think of another approach, so I would really appreciate suggestions or hints to see if I can go from there. Thanks in advance.

Best Answer

The Maximum Likelihood Estimator has a great property called functional invariance.

This means you may find the MLE of $ p $ and then calculate the MLE of $ p \left( 1 - p \right) $ easily.
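For instance, here is one possible sketch of how that plays out, assuming the standard fact that the MLE of $p$ for a Bernoulli sample is the sample mean $\bar X$:

$$\hat p = \bar X = \frac{1}{n}\sum_{i=1}^n X_i, \qquad \hat\theta = \hat p\,(1-\hat p) = \bar X(1-\bar X) \quad \text{(by invariance).}$$

For the bias, using $\mathbb{E}[\bar X]=p$ and $\operatorname{Var}(\bar X)=\frac{p(1-p)}{n}$,

$$\mathbb{E}[\hat\theta] = \mathbb{E}[\bar X] - \mathbb{E}[\bar X^2] = p - \left(\frac{p(1-p)}{n} + p^2\right) = \left(1-\frac{1}{n}\right)p(1-p),$$

which differs from $\theta$ for every finite $n$ (so $\hat\theta$ is biased) but converges to $\theta$ as $n\to\infty$ (so it is asymptotically unbiased).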