[Math] Solving Non-Negative Constrained Least Squares by Analogy with Least Squares (MATLAB)

least squares, linear algebra, MATLAB, optimization

There is a least-squares problem $\min_x \|Ax - b\|$. It can be solved with the backslash operator in MATLAB (x = A \ b).
Now assume I have the same problem, but all entries of $x$ must be non-negative ($x \ge 0$). How can I solve this problem by analogy with the previous, unconstrained one? I think it is somehow connected with the active-set method.
I know that the non-negative least-squares problem can easily be solved with the MATLAB Optimization Toolbox, with CVX, or in many other ways. Still, I'm curious about solving it by analogy with the straightforward least-squares method, i.e. somehow reducing non-negative least squares to regular least squares.
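For concreteness, a tiny made-up example of the two solves I have in mind (lsqnonneg is the built-in routine whose inner workings I'd like to understand rather than just call):

```matlab
A = rand(20, 5);           % some overdetermined system, data made up for illustration
b = rand(20, 1);

x_ls = A \ b;              % plain least squares -- entries may come out negative
x_nn = lsqnonneg(A, b);    % non-negative least squares, built into MATLAB
```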
Could anyone help me please?

Best Answer

"Straightforward" least squares cannot include constraints. It has a simple function to optimize, the sum of squared residuals, and that's it.

Exactly because of that, the solution $x$ can be found by means of linear algebra, i.e. it is the result of applying a linear operator to $b$. This linear operator is described by the matrix $(A^T A)^{-1} A^T$ if $A$ has full column rank, so that the solution is unique; if not, the Moore-Penrose pseudo-inverse $A^+$ defines a linear operator that gives the minimum-norm solution. Although the backslash operator is recommended in MATLAB for numerical reasons, that doesn't change this basic idea. As soon as you introduce constraints, what you are doing is no longer "least squares", and the solution is no longer given by a linear operator applied to $b$.
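As a small sketch of that equivalence in MATLAB (random data, assuming $A$ has full column rank so that $(A^T A)^{-1}$ exists):

```matlab
A = randn(50, 4);                   % tall matrix, full column rank (almost surely)
b = randn(50, 1);

x_backslash = A \ b;                % QR-based solve, recommended in practice
x_normal    = (A' * A) \ (A' * b);  % explicit normal equations, (A^T A)^{-1} A^T b
x_pinv      = pinv(A) * b;          % Moore-Penrose pseudo-inverse

norm(x_backslash - x_normal)        % all three agree up to rounding error
norm(x_backslash - x_pinv)
```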

To my knowledge, the task of optimizing a quadratic function under linear (equality or inequality) constraints is the subject of quadratic programming. Your problem fits that mold: expanding $\|Ax - b\|^2$ shows it is equivalent to minimizing $\tfrac{1}{2}x^T A^T A\, x - (A^T b)^T x$ subject to $x \ge 0$.
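To make the connection with your active-set idea concrete, here is a rough sketch (not production code; the variable names and the tolerance are my own choices) of the Lawson-Hanson active-set method: every iteration solves an ordinary, unconstrained least-squares problem with backslash, restricted to the currently "passive" variables, and any variable that would go negative is pushed back onto the boundary.

```matlab
function x = nnls_active_set(A, b)
% Sketch of the Lawson-Hanson active-set method for  min ||A*x - b||  s.t.  x >= 0.
% Each outer iteration solves an ORDINARY least-squares problem (backslash)
% restricted to the passive set P -- that is the "analogy" with plain least squares.
    n = size(A, 2);
    P = false(n, 1);                  % passive set: variables currently allowed to be positive
    x = zeros(n, 1);
    w = A' * (b - A * x);             % negative gradient; w(i) > 0 means x(i) "wants" to grow
    tol = 10 * eps * norm(A, 1) * n;  % same flavour of tolerance as lsqnonneg's default

    while any(~P) && max(w(~P)) > tol
        [~, idx] = max(w .* ~P);      % most promising constrained variable
        P(idx) = true;                % ... joins the passive set

        s = zeros(n, 1);
        s(P) = A(:, P) \ b;           % unconstrained least squares on the passive variables

        while min(s(P)) <= 0          % a passive variable went negative:
            q = P & (s <= 0);
            alpha = min(x(q) ./ (x(q) - s(q)));  % largest feasible step towards s
            x = x + alpha * (s - x);
            P(P & (x <= tol)) = false;           % variables that hit zero become active again
            s = zeros(n, 1);
            s(P) = A(:, P) \ b;                  % re-solve on the shrunken passive set
        end

        x = s;
        w = A' * (b - A * x);
    end
end
```

With this on the path, x = nnls_active_set(A, b) should agree (up to tolerances) with lsqnonneg(A, b), and with the quadratic-programming formulation quadprog(A'*A, -A'*b, [], [], [], [], zeros(size(A,2),1), []) if the Optimization Toolbox is available.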