Solved – Method of moments for linear regression

estimation, linear model, method of moments, regression

I have been reading about the method of moments, and now I understand how to obtain the method of moments estimator for a random sample $x_1,…,x_n$ from a distribution $f(x;\theta)$, in the multiparameter case. However, I fail to see how to formulate the method of moments for the linear regression model
$$y_i = x_i^T\beta + e_i,$$
where $E[e_i]=0$ and $\beta, x_i \in \mathbb{R}^p$. There is a moment condition here, but I cannot link it to a system of equations as in least squares estimation.

Is there a reference with this formulation?

Best Answer

The least squares estimator is the solution to the estimating equation:

$$ 0 = \mathbf{X}^T \left( Y - \mathbf{X}\beta \right)$$

where $\mathbf{X} = [\mathbf{1}, x_1, x_2, \ldots, x_{p-1}]$ is the $n \times p$ model matrix of covariates, whose first column is a vector of ones for the intercept.
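To spell out the link the question asks about, here is a short sketch (it assumes the orthogonality condition $E[x_i e_i] = 0$, e.g. from $E[e_i \mid x_i] = 0$, which is the moment condition underlying the regression model). Replacing the population moment by its sample analogue gives

$$ 0 = E\left[x_i\left(y_i - x_i^T\beta\right)\right] \quad\longrightarrow\quad 0 = \frac{1}{n}\sum_{i=1}^n x_i\left(y_i - x_i^T\hat\beta\right) = \frac{1}{n}\mathbf{X}^T\left(Y - \mathbf{X}\hat\beta\right), $$

which is, up to the factor $1/n$, exactly the estimating equation above; solving it yields $\hat\beta = (\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T Y$, the least squares estimator. So the "system of equations" you are looking for is just the normal equations, read as sample moment conditions.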

This is a trivial result, but a more general discussion of estimating equations can be found in Wakefield, "Bayesian and Frequentist Regression Methods". Estimators defined by estimating equations are also called M-estimators. Another reference is Boos and Stefanski, "Essential Statistical Inference", Chapter 7.
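As a minimal numerical sketch (not taken from either book; the simulated data and variable names are purely illustrative), solving the sample moment condition $\mathbf{X}^T(Y - \mathbf{X}\beta) = 0$ reproduces the ordinary least squares fit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a small regression: intercept column plus two covariates (p = 3).
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(size=n)

# Method-of-moments / estimating-equation solution:
# the sample moment condition X^T (y - X beta) = 0 is the linear system (X^T X) beta = X^T y.
beta_mom = np.linalg.solve(X.T @ X, X.T @ y)

# Ordinary least squares via a generic solver, for comparison.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta_mom)
print(beta_ols)
print(np.allclose(beta_mom, beta_ols))  # True: both are the same estimator
```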
