Least Squares solution for a symmetric singular matrix

least-squares, matrices

I want to solve this system by the least squares method: $$\begin{pmatrix}1 & 2 & 3\\\ 2 & 3 & 4 \\\ 3 & 4 & 5 \end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} =\begin{pmatrix}1\\5\\-2\end{pmatrix} $$ This symmetric matrix is singular, with one eigenvalue $\lambda_1 = 0$, so $A^t\cdot A$ is also singular, and for this reason I cannot use the normal equation $\hat x = (A^t\cdot A)^{-1}\cdot A^t\cdot b$.
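A quick numerical sanity check of this rank deficiency (a sketch using numpy; the variable names are mine):

```python
import numpy as np

# Coefficient matrix of the system above.
A = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])

# Both A and A^T A have rank 2, so (A^T A)^{-1} does not exist
# and the textbook normal-equation formula cannot be applied directly.
print(np.linalg.matrix_rank(A))        # 2
print(np.linalg.matrix_rank(A.T @ A))  # 2
print(np.linalg.eigvalsh(A))           # one eigenvalue is numerically 0
```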
So I applied Gauss-Jordan elimination to the augmented matrix to arrive at $$\begin{pmatrix}1 & 2 & 3\\\ 0 & 1 & 2 \\\ 0 & 0 & 0 \end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} =\begin{pmatrix}1\\3\\-1\end{pmatrix} $$
Finally I solved the $2\times 2$ system $$\begin{pmatrix}1 & 2\\\ 0 & 1\end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix} =\begin{pmatrix}1\\3\end{pmatrix} $$ taking into account that the best $\hat b$ is $\begin{pmatrix}1\\3\\0\end{pmatrix}$.

The solution is then $\hat x = \begin{pmatrix}-5\\3\\0\end{pmatrix}$.

Is this approach correct?

EDIT

Based on the book *Linear Algebra and Its Applications* by David Lay, I also include the least squares method he proposes: solve $(A^tA)\hat x=A^t b$.

$$A^t b =\begin{pmatrix}5\\9\\13\end{pmatrix}, A^tA = \begin{pmatrix}14 & 20 & 26 \\
20 & 29 & 38 \\
26 & 38 & 50\end{pmatrix}$$

The reduced echelon form of the augmented matrix is:
$$ \begin{pmatrix}14 & 20 & 26 & 5 \\
20 & 29 & 38 & 9 \\
26 & 38 & 50 & 13 \end{pmatrix} \sim \begin{pmatrix}1 & 0 & -1 & -\frac{35}{6} \\
0 & 1 & 2 & \frac{13}{3} \\
0 & 0 & 0 & 0
\end{pmatrix} \Rightarrow \hat x = \begin{pmatrix}-\frac{35}{6} \\ \frac{13}{3} \\ 0 \end{pmatrix}$$
taking the free variable $z=\alpha$ with $\alpha=0$.
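This normal-equation result can be verified numerically. A sketch with numpy (variable names are mine): the particular solution with $z=0$ does satisfy $A^tA\hat x=A^tb$, while `np.linalg.lstsq` returns the minimum-norm least squares solution, which differs only along the null space of $A$ and has the same residual.

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])
b = np.array([1., 5., -2.])

# Particular solution of the normal equations with free variable z = 0.
x_hat = np.array([-35/6, 13/3, 0.])
print(np.allclose(A.T @ A @ x_hat, A.T @ b))  # True: it solves A^T A x = A^T b

# lstsq returns the minimum-norm least squares solution instead.
x_min, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_min)

# Both are least squares solutions: identical residual norms.
print(np.isclose(np.linalg.norm(A @ x_hat - b),
                 np.linalg.norm(A @ x_min - b)))  # True
```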

Best Answer

Your solution is incorrect for the following reason. When you perform the Gauss-Jordan elimination, you transform the original system $$\tag{1} Ax=b $$ into another, $$\tag{2} SAx=Sb. $$ But the least squares solutions of (1) and (2) do not coincide in general. Indeed, the least squares solution of (1) is $A^{+}b$, while the least squares solution of (2) is $(SA)^{+}Sb$. If $A$ is invertible, then $$(SA)^{+}S=(SA)^{-1}S=A^{-1}=A^{+}$$ and everything is OK, but in the general case $(SA)^{+}S\ne A^{+}$.

In your case, in particular, $$ S=\left(\begin{array}{rrr} 1 & 0 & 0 \\ 2 & -1 & 0 \\ 1 & -2 & 1 \\ \end{array}\right),\quad (SA)^{+}S=\left(\begin{array}{rrr} -11/6 & 4/3 & 0 \\ -1/3 & 1/3 & 0 \\ 7/6 & -2/3 & 0 \\ \end{array}\right), $$ $$ A^{+}=\left(\begin{array}{rrr} -13/12 & -1/6 & 3/4 \\ -1/6 & 0 & 1/6 \\ 3/4 & 1/6 & -5/12\\ \end{array}\right).$$
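These matrices can be reproduced numerically; a sketch using numpy's `pinv` (variable names are mine):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])
S = np.array([[1.,  0., 0.],
              [2., -1., 0.],
              [1., -2., 1.]])

# (SA)^+ S and A^+ are genuinely different matrices, so the
# least squares solutions of Ax = b and SAx = Sb differ here.
lhs = np.linalg.pinv(S @ A) @ S
rhs = np.linalg.pinv(A)
print(np.allclose(lhs, rhs))  # False
print(rhs)                    # reproduces the A^+ shown above
```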

You can calculate the pseudoinverse matrix by using the rank factorization: $$ A=BC,\quad B=\left(\begin{array}{rr} 1 & 3\\ 2 & 4\\ 3 & 5\\ \end{array}\right),\quad C=\left(\begin{array}{rrr} 1&1/2&0\\ 0&1/2&1 \end{array}\right) $$ (this decomposition comes from the fact that the second column of $A$ is the arithmetic mean of the remaining columns). It remains only to calculate the pseudoinverse matrix $$ A^{+}=C^{+}B^{+}=C^T(CC^T)^{-1}(B^TB)^{-1}B^T $$ and the least squares solution is $A^{+}b$.
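The rank-factorization recipe above can be carried out numerically; a sketch with numpy (names are mine), cross-checked against `np.linalg.pinv`:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])
b = np.array([1., 5., -2.])

# Rank factorization A = BC from above.
B = np.array([[1., 3.],
              [2., 4.],
              [3., 5.]])
C = np.array([[1., 0.5, 0.],
              [0., 0.5, 1.]])
assert np.allclose(B @ C, A)

# A^+ = C^T (C C^T)^{-1} (B^T B)^{-1} B^T
A_pinv = C.T @ np.linalg.inv(C @ C.T) @ np.linalg.inv(B.T @ B) @ B.T
print(np.allclose(A_pinv, np.linalg.pinv(A)))  # True

# Minimum-norm least squares solution A^+ b = (-41/12, -1/2, 29/12).
print(A_pinv @ b)
```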
