[Math] Computing least squares error from plane fitting SVD

geometry, linear algebra, svd

I know one can define a least-squares-fit plane as a point and normal using the centroid of a set of points and the singular vector associated with its least singular value.

However, in doing that, is it possible to compute the residual point-to-plane distance error without simply summing all the point-to-plane distances? That is, as SVD can give the least-squares results, is there any part of the process (a singular value, or some matrix product, perhaps) that quickly produces the value of the minimized error?

Best Answer

Let $A$ be a matrix whose columns are the coordinates of the points being fitted, relative to the centroid (that is, every column of $A$ is a point being fitted minus the coordinates of the centroid). From the Eckart–Young theorem, we find that if $A$ has singular value decomposition $A = U \Sigma V^T$ where $$ \Sigma = \pmatrix{\sigma_1\\&\sigma_2 \\ & & \sigma_3}, \qquad \sigma_1 \geq \sigma_2 \geq \sigma_3, $$ then the coordinates of the projections of the columns onto the best-fit plane are the columns of the matrix $\tilde A = U \tilde \Sigma V^T$, where $$ \tilde \Sigma = \pmatrix{\sigma_1\\&\sigma_2 \\ & & 0}. $$ Now, let $\|A\|$ denote the Frobenius norm, which is to say that $\|A\|^2 = \sum_{i,j}|a_{ij}|^2$. We find that the square of the minimized error (the sum of squared point-to-plane distances) is given by $$ \|A- \tilde A\|^2 = \|U(\Sigma - \tilde \Sigma)V^T\|^2 = \|\Sigma - \tilde \Sigma\|^2 = \sigma_3^2 $$
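A quick numerical sketch in NumPy of the claim above (the sample data here is made up purely for illustration): fit the plane via the SVD of the centered points, then check that the directly computed sum of squared point-to-plane distances equals $\sigma_3^2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample: 50 points near the plane z = 0.2x - 0.1y, plus noise.
pts = rng.normal(size=(50, 3))
pts[:, 2] = 0.2 * pts[:, 0] - 0.1 * pts[:, 1] + 0.05 * rng.normal(size=50)

centroid = pts.mean(axis=0)
A = (pts - centroid).T              # columns = centered points, as in the answer

U, s, Vt = np.linalg.svd(A, full_matrices=False)
normal = U[:, 2]                    # left singular vector of the smallest singular value

# Sum of squared point-to-plane distances, computed directly:
# each distance is the component of the centered point along the unit normal.
direct = np.sum(((pts - centroid) @ normal) ** 2)

print(direct, s[2] ** 2)            # the two values agree
```

Note that `np.linalg.svd` returns the singular values in descending order, so `s[2]` is $\sigma_3$ and `U[:, 2]` is the corresponding normal direction.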