Hi there,
I am currently facing the following problem: I want to minimize the L1 norm (the sum of absolute distances between my data points and a straight line). To do this I wrote the following lines, which work fine for my random data:
cvx_setup;

% Definition of random data
a = .9;
x = sort(4*(rand(25,1)-.5));
b = a*x + .1*randn(size(x));

% Minimization
cvx_begin;
    variable aL1;
    minimize( sum(abs(aL1*x - b)) );
cvx_end;

% Visualization
figure;
plot(x, b, '.', 'color', 'b');
hold on;
xgrid = -2:0.1:2;
plot(xgrid, xgrid*aL1);
title('L1 Norm')
However, when I offset my data by a constant (here, adding 10 to b), the optimized line no longer fits:
cvx_setup;

% Definition of random data
a = .9;
x = sort(4*(rand(25,1)-.5));
b = a*x + .1*randn(size(x));
b = b + 10;

% Minimization
cvx_begin;
    variable aL1;
    minimize( sum(abs(aL1*x - b)) );
cvx_end;

% Visualization
figure;
plot(x, b, '.', 'color', 'b');
hold on;
xgrid = -2:0.1:2;
plot(xgrid, xgrid*aL1);
title('L1 Norm')
Can anybody tell me how to adjust my code so that it fits the optimal straight line when the data has a constant axis offset? Shifting the first data point to zero does not work, because the optimization currently only fits a slope and no intercept, so the first point would simply be forced onto the axis…
Thanks!
Best Answer
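The model in the question only has a slope variable, so it can only represent lines through the origin. One common fix, sketched below with the same CVX setup as above, is to declare a second scalar variable for the intercept (called bL1 here, a name chosen for illustration) and include it in the residual:

```matlab
% Minimization with both slope and intercept as variables
cvx_begin;
    variable aL1;   % slope
    variable bL1;   % axis offset (intercept)
    % residuals now account for the constant offset in the data
    minimize( sum(abs(aL1*x + bL1 - b)) );
cvx_end;

% Visualization: plot the fitted line including the offset
figure;
plot(x, b, '.', 'color', 'b');
hold on;
xgrid = -2:0.1:2;
plot(xgrid, xgrid*aL1 + bL1);
title('L1 Norm with intercept')
```

With bL1 in the objective, the solver can absorb the constant shift (e.g. the +10 added to b) into the intercept instead of distorting the slope.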