Solved – Can OLS be considered an optimization technique

least squares, optimization, regression, terminology

Can ordinary least squares estimation be considered an optimization technique? If so, how can I explain this?

Note:

From an AI perspective, supervised learning involves finding a hypothesis function $h_{\vec{w}}(\vec{x})$ that approximates the true relationship between the predictor variables and the predicted variable. Let some set of functions sharing the same model representation define the hypothesis space $\mathbb{H}$ (that is, we hypothesise the true relationship to be, say, a linear function of the inputs, or a quadratic function of the inputs, and so forth). The objective is to find the model $h\in\mathbb{H}$ that optimally maps inputs to outputs. This is done by applying some technique that finds optimal values for the adjustable parameters $\vec{w}$ that define the function $h_{\vec{w}}(\vec{x})$. In AI we call this parameter optimization. An example of such a parameter optimization technique/model inducer/learning algorithm is the backpropagation algorithm.

OLS is used to estimate the $\beta$ parameters that define the linear regression line that optimally maps the predictor variables to the output variable. This corresponds to parameter optimization in the scenario above.
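A minimal sketch of this view in Python (the data, variable names, and true coefficients here are purely illustrative): handing the sum-of-squared-residuals objective to a generic numerical optimizer recovers the same $\beta$ as the closed-form OLS solution, which is exactly the sense in which OLS fitting is parameter optimization.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative toy data (hypothetical values): y is roughly 2 + 3x plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 3.0 * x + rng.normal(scale=1.0, size=50)
X = np.column_stack([np.ones_like(x), x])  # design matrix with an intercept column

# The OLS objective: sum of squared residuals as a function of the parameters beta.
def sse(beta):
    residuals = y - X @ beta
    return residuals @ residuals

# OLS as explicit parameter optimization: minimize the objective numerically.
numerical_fit = minimize(sse, x0=np.zeros(X.shape[1]))

# The same optimization problem solved in closed form via the normal equations.
closed_form_fit = np.linalg.solve(X.T @ X, X.T @ y)

print(numerical_fit.x)   # roughly [2, 3]
print(closed_form_fit)   # agrees with the numerical minimizer up to tolerance
```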

Best Answer

Yes, it is. In OLS, you are looking for the linear model that provides the "best" fit to the data. Implementation requires specifying some notion of what you mean by "best". OLS works by defining the "best" model as the one that minimizes a certain measure of model error -- in this case, the sum of the squares of the model residuals. The residuals are the part of the data that isn't explained by the model: OLS seeks to give the best description of the data by minimizing the "total amount" of unexplained variation in the data.
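To make that concrete (this is just the standard textbook formulation): with response vector $\vec{y}$, design matrix $X$, and rows $\vec{x}_i$, OLS solves the optimization problem

$$\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n}\left(y_i - \vec{x}_i^{\top}\beta\right)^2 = \arg\min_{\beta}\,\lVert \vec{y} - X\beta \rVert^2,$$

and setting the gradient of this objective to zero yields the familiar closed form $\hat{\beta} = (X^{\top}X)^{-1}X^{\top}\vec{y}$ whenever $X^{\top}X$ is invertible.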

Formally, any operation in which you are solving for the minimum or maximum of some function can be interpreted as an optimization problem.
