Using a nonlinear optimizer to solve this is overkill and completely silly when a linear solver does it for you. Besides, with an optimizer you need to worry about convergence issues, starting values, etc.
Append an identity matrix to X as extra columns, and append a matching block of zeros to Y. Then use slash (i.e., /) to solve the problem:
W = [Y,zeros(1,N)]/[X,eye(N)];
This solves the regularized problem, where N is the number of rows of X, with no optimizer needed at all. Since [X,eye(N)] has more columns than rows, slash solves the system W*[X,eye(N)] = [Y,zeros(1,N)] in a least squares sense, which is exactly the penalized problem you are asking to solve.
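To see why the augmented system is the regularized problem, expand the residual it minimizes:

```latex
\min_W \left\| W\,[X,\; I_N] - [Y,\; 0] \right\|_2^2
  \;=\; \min_W \left( \|WX - Y\|_2^2 + \|W\|_2^2 \right)
```

The appended identity contributes exactly a ridge penalty on W, here with unit weight. If you want a tunable penalty, append lambda*eye(N) instead of eye(N); that penalizes lambda^2*norm(W)^2.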
For example, make up some data:
X = rand(10,20);
Y = rand(1,20);
We wish to solve the least squares problem u*X = Y for the row vector u. With no regularization at all, slash does this directly:
u = Y/X
u =
-0.10061 -0.039857 -0.087301 0.14731 0.40622 0.36113 -0.31992 0.32343 0.071708 0.16989
With regularization, i.e., solving the penalized sum of squares, we instead write:
u = [Y,zeros(1,10)]/[X,eye(10)]
u =
-0.013147 0.042605 0.0085151 0.089504 0.24349 0.1602 -0.11616 0.20876 0.14126 0.13649
The solution has been biased towards zero, as we should expect.
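If you want to sanity-check the equivalence outside MATLAB, here is a short NumPy sketch (variable names are mine, not from the original post) comparing the augmented least squares solution against the closed-form ridge solution:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 10, 20                      # N unknowns, M observations
X = rng.random((N, M))
Y = rng.random((1, M))

# Augmented system: solve u @ [X, I] ~= [Y, 0] in the least squares sense.
# lstsq solves A x = b, so transpose both sides.
A = np.hstack([X, np.eye(N)])
b = np.hstack([Y, np.zeros((1, N))])
u_aug = np.linalg.lstsq(A.T, b.T, rcond=None)[0].T

# Ridge closed form with unit penalty: u = Y X^T (X X^T + I)^{-1}
u_ridge = Y @ X.T @ np.linalg.inv(X @ X.T + np.eye(N))

assert np.allclose(u_aug, u_ridge)
```

The two agree to machine precision, confirming that padding with an identity block really does solve the penalized sum of squares.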