Simple linear regression – maximum likelihood estimators and least squares

linear regression, probability, probability theory, regression, statistics

Suppose we have data $\{(X_i,Y_i)\}_{i=1}^n$ along with two simple linear regression models as follows:

$Y_i=\beta_0+\beta_1X_i+\epsilon_i$ and $X_i=\gamma_0+\gamma_1Y_i+u_i$, where $\epsilon_i$ and $u_i$ are normally distributed with mean zero and variances $\sigma_{\epsilon}^2$ and $\sigma_{u}^2$, respectively.

What is the process to find the estimators for $\beta_0$, $\beta_1$, $\gamma_0$ and $\gamma_1$? I need both the MLE and the OLS estimators. It would also be helpful to prove that the product of the MLE estimators of $\beta_1$ and $\gamma_1$ equals the square of the sample correlation coefficient.

Best Answer

Your model is $Y_i=\beta_0 + \beta_1(\gamma_0 + \gamma_1Y_i + u_i) + \epsilon_i$. Collecting the $Y_i$ terms gives $(1-\beta_1\gamma_1)Y_i = \beta_0 + \beta_1\gamma_0 + \beta_1u_i + \epsilon_i$, which is equivalent to \begin{equation} Y_i=\frac{\beta_0 + \beta_1\gamma_0}{1-\beta_1\gamma_1} + \frac{\beta_1u_i + \epsilon_i}{1-\beta_1\gamma_1} \quad \quad (1) \end{equation}

As $u_i$ and $\epsilon_i$ are normally distributed, the combined error $\varepsilon_i = \frac{\beta_1u_i + \epsilon_i}{1-\beta_1\gamma_1}$ is also normally distributed, with mean $\mu_{\varepsilon}=\frac{\beta_1\mathbb{E}(u_i) + \mathbb{E}(\epsilon_i)}{1-\beta_1\gamma_1} = 0$ and variance $\sigma_{\varepsilon}^2$; if $u_i$ and $\epsilon_i$ are independent (an assumption the question does not state), then $\sigma_{\varepsilon}^2 = \frac{\beta_1^2\sigma_u^2 + \sigma_{\epsilon}^2}{(1-\beta_1\gamma_1)^2}$. Notice that $\mathbb{E}(Y_i)=\frac{\beta_0 + \beta_1\gamma_0}{1-\beta_1\gamma_1}$. Therefore, your likelihood function is

$$f(y_1, y_2, \dots, y_n)= \left[2\pi\sigma_{\varepsilon}^2\right]^{-n/2}e^{-\frac{1}{2\sigma_{\varepsilon}^2}\sum_{i=1}^{n}\big[y_i - \frac{\beta_0 + \beta_1\gamma_0}{1-\beta_1\gamma_1}\big]^2} \quad (2)$$
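For reference, the log of (2), writing $\mu = \frac{\beta_0 + \beta_1\gamma_0}{1-\beta_1\gamma_1}$ for the common mean, is

$$\log f(y_1, \dots, y_n) = -\frac{n}{2}\log\!\left(2\pi\sigma_{\varepsilon}^2\right) - \frac{1}{2\sigma_{\varepsilon}^2}\sum_{i=1}^{n}(y_i-\mu)^2,$$

whose first-order conditions in $\mu$ and $\sigma_{\varepsilon}^2$ give $\hat{\mu}=\bar{y}$ and $\hat{\sigma}_{\varepsilon}^2=\frac{1}{n}\sum_{i=1}^{n}(y_i-\bar{y})^2$. Note that (2) depends on the four regression parameters only through $\mu$, so on its own it pins down that combination rather than $\beta_0$, $\beta_1$, $\gamma_0$, $\gamma_1$ individually; the individual estimators come from fitting each regression separately.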

Take the log of the likelihood function as above and, from the first-order conditions of the log-likelihood, find the estimators of your interest. For the OLS estimators, use expression (1).
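As for the last part of the question: under normal errors, maximizing the likelihood of each regression conditional on its regressor is equivalent to minimizing the sum of squared residuals, so the MLE slopes coincide with the OLS slopes,

$$\hat{\beta}_1=\frac{S_{xy}}{S_{xx}}, \qquad \hat{\gamma}_1=\frac{S_{xy}}{S_{yy}}, \qquad \text{where } S_{xy}=\sum_{i=1}^{n}(X_i-\bar{X})(Y_i-\bar{Y}),\; S_{xx}=\sum_{i=1}^{n}(X_i-\bar{X})^2,\; S_{yy}=\sum_{i=1}^{n}(Y_i-\bar{Y})^2.$$

Their product is therefore

$$\hat{\beta}_1\hat{\gamma}_1=\frac{S_{xy}^2}{S_{xx}S_{yy}}=r^2,$$

the square of the sample correlation coefficient, as the question anticipated.

A minimal numerical sanity check in Python (a sketch; the simulated model, sample size, and seed are arbitrary choices, not taken from the question):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate one of the two regressions (illustrative choice of truth).
n = 500
x = rng.normal(0.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(0.0, 1.0, n)

# Centered sums of squares and cross-products.
sxx = np.sum((x - x.mean()) ** 2)
syy = np.sum((y - y.mean()) ** 2)
sxy = np.sum((x - x.mean()) * (y - y.mean()))

beta1_hat = sxy / sxx          # OLS/MLE slope of y regressed on x
gamma1_hat = sxy / syy         # OLS/MLE slope of x regressed on y
r = sxy / np.sqrt(sxx * syy)   # sample correlation coefficient

# The identity holds exactly, up to floating-point error.
assert np.isclose(beta1_hat * gamma1_hat, r ** 2)
print(beta1_hat * gamma1_hat, r ** 2)
```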
