[Math] Method of Maximum Likelihood for Normal Distribution CDF

maximum likelihood, normal distribution, parameter estimation, statistics

Based on a random sample of size $n$ from a normal distribution, $X \sim N(\mu, \sigma^2)$, find the
MLE (maximum likelihood estimator) of the following:

$P[X>c]$ for arbitrary $c$.

This seems to be a strange question, and the provided solution is even more troubling:

*(image of the provided solution)*

Would that not be the method of moments estimate? Surely that solution isn't correct. But if it's not, then how exactly do I calculate the MLE? I'm just downright confused.

Best Answer

The question, as you wrote it, is worded in an unclear way. My interpretation is the following:

Suppose you have an i.i.d. sample $X_1,\dots,X_n$ with $X_i\sim \mathcal{N}(\mu,\sigma)$ (parametrized here by the standard deviation). Find the maximum likelihood estimates of the two parameters, $\mu$ and $\sigma$.

So

$$\mathcal{L}(\mu,\sigma)=\left(\frac{1}{\sqrt{2\pi\sigma^2}}\right)^n\cdot \prod_{i=1}^ne^{-\frac{(X_i-\mu)^2}{2\sigma^2}}$$ which is the likelihood function we seek to maximize. To that end, we take its logarithm; this makes the calculation easier while preserving the location of the maximum, since the logarithm is monotonically increasing. Thus

$$\mathcal{F}=\ln(\mathcal{L}(\mu,\sigma))=-\frac{n}{2}\ln(2\pi)-\frac{n}{2} \ln(\sigma^2)-\sum_{i=1}^n\frac{(X_i-\mu)^2}{2\sigma^2}$$ Now, simply differentiate with respect to the two parameters and equate to zero: $$\frac{\partial}{\partial\mu}\mathcal{F}=0\implies \mu=\sum_{i=1}^n\frac{X_i}{n}$$ $$\frac{\partial}{\partial\sigma}\mathcal{F}=0\implies \sigma^2=\sum_{i=1}^n\frac{(X_i-\mu)^2}{n}$$
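As a quick numeric sanity check of these closed-form solutions, here is a minimal Python sketch (using only the standard library, with synthetic data and function names of my own choosing): it computes the sample mean and the divide-by-$n$ variance, and verifies that perturbing either parameter only lowers the log-likelihood.

```python
import math
import random

def log_likelihood(xs, mu, sigma2):
    """Normal log-likelihood F(mu, sigma^2) as derived above."""
    n = len(xs)
    return (-n / 2 * math.log(2 * math.pi)
            - n / 2 * math.log(sigma2)
            - sum((x - mu) ** 2 for x in xs) / (2 * sigma2))

random.seed(0)
xs = [random.gauss(10.0, 3.0) for _ in range(1000)]  # synthetic N(10, 3^2) data

# Closed-form MLEs: sample mean and the biased (divide-by-n) variance.
mu_hat = sum(xs) / len(xs)
sigma2_hat = sum((x - mu_hat) ** 2 for x in xs) / len(xs)

# Any perturbation of the parameters should strictly lower the log-likelihood.
best = log_likelihood(xs, mu_hat, sigma2_hat)
for dmu, ds2 in [(0.1, 0.0), (-0.1, 0.0), (0.0, 0.5), (0.0, -0.5)]:
    assert log_likelihood(xs, mu_hat + dmu, sigma2_hat + ds2) < best
```

Note that the MLE of the variance divides by $n$, not $n-1$; it is biased, which is one reason it can be mistaken for a method-of-moments estimate (here the two coincide).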

So, with $\bar X=\sum_{i=1}^n\frac{X_i}{n}$, the estimated distribution is

$$X\sim \mathcal{N}\left(\bar X,\sqrt{\sum_{i=1}^n\frac{(X_i-\bar X)^2}{n}}\right)$$ By the invariance property of the MLE, the maximum likelihood estimate of any function of the parameters is that function evaluated at the parameter MLEs, so the solution you were given is indeed the MLE: $$P(X>c)=1-\Phi\left(\frac{c-\bar X}{\sqrt{\sum_{i=1}^n\frac{(X_i-\bar X)^2}{n}}}\right)$$ where $\Phi$ is the standard normal CDF.
