Solved – Similarities and differences between regression and estimation

estimation, regression

What are the similarities and differences between parametric regression analysis and estimation theory?

I notice that they are both about parameter estimation, and both require some model for the estimation.

One difference is that regression requires both independent and dependent variables, while estimation only requires observed variables. Also, regression minimizes the distance between the observed values and the values predicted by the model (least squares), whereas estimation, e.g. the MMSE estimator, minimizes the mean squared error (MSE) of the parameters to be estimated, as written out below.
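To make the contrast concrete (the notation below is mine, not from the question): least squares in regression chooses the coefficients $\beta$ that minimize the discrepancy between the observed responses and the model's predictions,

$$\hat{\beta}_{\mathrm{LS}} = \arg\min_{\beta}\sum_{i=1}^{n}\bigl(y_i - x_i^{\top}\beta\bigr)^2,$$

whereas the MMSE estimator in estimation theory minimizes the expected squared error of the parameter estimate itself,

$$\hat{\theta}_{\mathrm{MMSE}}(y) = \arg\min_{\hat{\theta}}\;\mathbb{E}\bigl[\|\theta - \hat{\theta}(y)\|^{2}\bigr] = \mathbb{E}[\theta \mid y],$$

i.e. one criterion is defined on residuals in the data space, the other on errors in the parameter space.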

For a linear model with Gaussian noise, the maximum likelihood (ML) estimator is identical to the regression solution in the form of (weighted) least squares. In other words, the estimate both achieves maximum likelihood and minimizes the sum of squared residuals.
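A short sketch of why this holds (my own derivation, assuming independent Gaussian noise with known variances $\sigma_i^2$): for $y_i = x_i^{\top}\beta + \varepsilon_i$ with $\varepsilon_i \sim \mathcal{N}(0, \sigma_i^2)$, the log-likelihood is

$$\ell(\beta) = -\frac{1}{2}\sum_{i=1}^{n}\frac{\bigl(y_i - x_i^{\top}\beta\bigr)^{2}}{\sigma_i^{2}} + \text{const},$$

so maximizing $\ell(\beta)$ is the same as minimizing the weighted sum of squared residuals with weights $w_i = 1/\sigma_i^2$; with equal variances this is exactly ordinary least squares.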

Is there any other similarity or difference between these two?

Best Answer

Regression analysis is a form of statistical modelling: a regression is a model.

Estimation methods such as maximum likelihood, the method of moments, or least squares (which is the same as minimum mean squared error here, since minimising the sum of squared residuals is the same as minimising their mean) are ways of estimating the values of the parameters of a statistical model, given the sample of observations available to us.

Hence there are no differences or similarities as such: an estimation method is needed to fit your regression model, so you cannot have a regression model without an "estimation theory" of some sort.

A common method of estimating the parameters in a regression is ordinary least squares, which is also the maximum likelihood method if certain assumptions are met (equal variance, Gaussian error terms, model specified correctly).
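As a quick numerical check of that equivalence (a minimal sketch with simulated data and my own variable names, not part of the original answer), the closed-form least-squares fit and a numerical maximization of the Gaussian likelihood recover the same coefficients:

```python
# Sketch: on simulated data with Gaussian errors, ordinary least squares and
# numerical maximum likelihood give the same coefficient estimates.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one predictor
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)        # Gaussian errors

# Ordinary least squares: closed-form solution
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Maximum likelihood: minimise the negative Gaussian log-likelihood
# over (beta, log sigma) numerically
def neg_log_lik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma2 = np.exp(2 * log_sigma)
    resid = y - X @ beta
    return 0.5 * np.sum(resid**2) / sigma2 + 0.5 * n * np.log(2 * np.pi * sigma2)

res = minimize(neg_log_lik, x0=np.zeros(3), method="BFGS")
beta_ml = res.x[:-1]

print("OLS :", beta_ols)
print("ML  :", beta_ml)   # agrees with OLS up to optimiser tolerance
```

Up to optimiser tolerance the two printed vectors coincide, which is the "OLS is also ML under Gaussian errors" statement above.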
