Solved – MLE of Parameters of Bivariate Normal Distribution

Tags: bivariate, maximum-likelihood, normal-distribution, self-study

I am working through finding the maximum likelihood estimators of the parameters of the bivariate normal distribution, without using matrix notation. I have the following density function:

$f(Y_1,Y_2) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho_{12}^2}} \exp \bigg\{ -\frac{1}{2(1-\rho_{12}^2)} \bigg[ \bigg(\frac{Y_1 - \mu_1}{\sigma_1} \bigg)^2 -2\rho_{12} \bigg( \frac{Y_1 - \mu_1}{\sigma_1} \bigg)\bigg( \frac{Y_2 - \mu_2}{\sigma_2} \bigg) + \bigg( \frac{Y_2 - \mu_2}{\sigma_2} \bigg)^2 \bigg] \bigg\}$
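As a quick sanity check on the density (not part of the question itself), the formula can be coded directly and compared against SciPy's matrix-based implementation; the parameter and evaluation-point values below are arbitrary:

```python
import numpy as np
from scipy.stats import multivariate_normal

def bvn_pdf(y1, y2, mu1, mu2, s1, s2, rho):
    """Bivariate normal density, written out term by term as in the question."""
    z1 = (y1 - mu1) / s1
    z2 = (y2 - mu2) / s2
    quad = z1**2 - 2 * rho * z1 * z2 + z2**2
    return np.exp(-quad / (2 * (1 - rho**2))) / (
        2 * np.pi * s1 * s2 * np.sqrt(1 - rho**2))

# Cross-check against SciPy's covariance-matrix parameterization
mu1, mu2, s1, s2, rho = 1.0, -2.0, 1.5, 0.7, 0.4
cov = [[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]]
ref = multivariate_normal(mean=[mu1, mu2], cov=cov).pdf([1.3, -1.8])
print(np.isclose(bvn_pdf(1.3, -1.8, mu1, mu2, s1, s2, rho), ref))  # True
```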

So far, I've obtained the log likelihood:

\begin{equation}
\begin{split}
\ell(\mu_1, \mu_2, \sigma_1, \sigma_2,\rho_{12}) & = -n\log(2\pi) - n\log \sigma_1 - n\log \sigma_2 - \frac{n}{2} \log(1-\rho_{12}^2) \\&- \frac{1}{2(1-\rho_{12}^2)}\sum_{i=1}^n \bigg[ \bigg(\frac{Y_{i1} - \mu_1}{\sigma_1} \bigg)^2 -2\rho_{12} \bigg( \frac{Y_{i1} - \mu_1}{\sigma_1} \bigg)\bigg( \frac{Y_{i2} - \mu_2}{\sigma_2} \bigg) + \bigg( \frac{Y_{i2} - \mu_2}{\sigma_2} \bigg)^2 \bigg]
\end{split}
\end{equation}
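The log-likelihood translates directly into code, which is useful for checking any derivative work numerically; a minimal sketch (function name and argument layout are my own):

```python
import numpy as np

def bvn_loglik(params, y1, y2):
    """Log-likelihood of n paired observations under the bivariate normal.

    params = (mu1, mu2, sigma1, sigma2, rho); y1, y2 are equal-length arrays.
    """
    mu1, mu2, s1, s2, rho = params
    n = y1.size
    z1 = (y1 - mu1) / s1
    z2 = (y2 - mu2) / s2
    quad = np.sum(z1**2 - 2 * rho * z1 * z2 + z2**2)
    return (-n * np.log(2 * np.pi) - n * np.log(s1) - n * np.log(s2)
            - 0.5 * n * np.log(1 - rho**2) - quad / (2 * (1 - rho**2)))
```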

I'm having trouble with the next steps: taking the partial derivative with respect to each parameter, setting it equal to zero, and solving for the parameters.

Is there a better way to write the log likelihood so that I can more easily take each partial derivative?
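For context, the stationary equations for this model are known to have a closed-form solution: the sample means, the $1/n$-version sample standard deviations, and the sample correlation coefficient. A hedged numerical sketch (simulated data with arbitrary true parameters) that maximizes the log-likelihood directly and compares the optimizer's answer with those closed-form candidates:

```python
import numpy as np
from scipy.optimize import minimize

# Simulate a bivariate normal sample (true parameters chosen arbitrarily)
rng = np.random.default_rng(0)
n = 2000
true_mean = [1.0, -2.0]
true_cov = [[2.25, 0.42], [0.42, 0.49]]  # sigma1=1.5, sigma2=0.7, rho=0.4
y1, y2 = rng.multivariate_normal(true_mean, true_cov, size=n).T

def negloglik(params):
    """Negative of the log-likelihood written out above."""
    mu1, mu2, s1, s2, rho = params
    z1 = (y1 - mu1) / s1
    z2 = (y2 - mu2) / s2
    quad = np.sum(z1**2 - 2 * rho * z1 * z2 + z2**2)
    return (n * np.log(2 * np.pi) + n * np.log(s1) + n * np.log(s2)
            + 0.5 * n * np.log(1 - rho**2) + quad / (2 * (1 - rho**2)))

# Closed-form candidates: sample means, 1/n standard deviations (ddof=0),
# and the sample correlation coefficient
closed_form = np.array([y1.mean(), y2.mean(), y1.std(), y2.std(),
                        np.corrcoef(y1, y2)[0, 1]])

opt = minimize(negloglik, x0=[0.0, 0.0, 1.0, 1.0, 0.0],
               bounds=[(None, None), (None, None),
                       (1e-6, None), (1e-6, None), (-0.999, 0.999)])
print(np.round(opt.x, 4))
print(np.round(closed_form, 4))  # the two vectors should agree closely
```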

Best Answer

I guess hgupta has moved on, but for anyone else trying to work through this problem, check out Chapter 9, Section 1, Exercises 1.12-1.14 on pages 294-300 of the book An Introduction to Probability and Statistical Inference, Second Edition, written by George Roussas. He walks you through the whole problem, from deriving the estimators to verifying that they are the MLEs.