Total Least Squares fitting

gradient descent, least squares, linear regression, ordinary differential equations

Say I want to fit a straight line using Total Least Squares (as opposed to ordinary Least Squares), i.e. minimize the sum of (y_i - k*x_i - b)^2 / (k^2 + 1) over all data points, where the (x_i, y_i) are the training data coordinates and k and b are the fitting parameters.
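For concreteness, this cost can be written down directly; a minimal sketch in Python/NumPy (the function and variable names here are my own, not from the question):

```python
import numpy as np

def tls_cost(k, b, xs, ys):
    """Total least squares cost: sum of squared orthogonal
    distances from the points (xs, ys) to the line y = k*x + b."""
    r = ys - k * xs - b
    return np.sum(r**2) / (k**2 + 1)
```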

I use gradient descent to solve for k and b, with the above sum as the cost function. But the cost turns out not to be monotonic: it goes down and then back up. That shouldn't happen, since this sum has only one global minimum. What could be going wrong to make the cost non-monotonic?
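For reference, a bare-bones gradient-descent loop on this cost might look like the sketch below; the analytic gradient, the learning rate lr, and the iteration count are my own assumptions, not the asker's actual code:

```python
def tls_gradient(k, b, xs, ys):
    """Analytic gradient of tls_cost with respect to k and b."""
    r = ys - k * xs - b
    s = np.sum(r**2)
    denom = k**2 + 1
    dk = -2 * np.sum(r * xs) / denom - 2 * k * s / denom**2
    db = -2 * np.sum(r) / denom
    return dk, db

def fit_tls_gd(xs, ys, lr=1e-3, steps=10000):
    """Plain gradient descent with a fixed step size (assumed hyperparameters)."""
    k, b = 0.0, 0.0
    for _ in range(steps):
        dk, db = tls_gradient(k, b, xs, ys)
        k -= lr * dk
        b -= lr * db
    return k, b
```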

Best Answer

In applied statistics, total least squares is a type of errors-in-variables regression, a least squares data modeling technique in which observational errors on both dependent and independent variables are taken into account. It is a generalization of Deming regression and also of orthogonal regression, and can be applied to both linear and non-linear models.
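As a concrete check (my own addition, not part of the original answer): for a straight line, the orthogonal-regression solution can also be computed in closed form from the principal direction of the data's covariance matrix, which is handy for verifying a gradient-descent result:

```python
def tls_closed_form(xs, ys):
    """Orthogonal (total least squares) line fit via the principal
    eigenvector of the 2x2 covariance matrix of the data."""
    cov = np.cov(xs, ys)                      # 2x2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    vx, vy = eigvecs[:, -1]                   # principal direction of the data
    k = vy / vx                               # assumes the fitted line is not vertical
    b = ys.mean() - k * xs.mean()             # line passes through the centroid
    return k, b
```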