Gauss-Seidel method: iteration matrix has eigenvalue 0, and a claim about Gauss-Seidel convergence

eigenvalues-eigenvectors, matrices, numerical-linear-algebra

Show that the Gauss-Seidel iteration matrix $ M = (D-L)^{-1}R $ has the eigenvalue zero for every regular matrix $A \in \mathbb{R}^{n \times n} $.

For every regular matrix $ A \in \mathbb{R}^{n \times n} $ and every $ b\in \mathbb{R}^n$ there is a value $x_0$ with $ x_0 \neq A^{-1}b $ such that the Gauss-Seidel method converges after one iteration.

First of all, I want to tell you what $D,L,R$ are. You get these matrices by rewriting $A$ as

$A = D - L - R$,

where $L,R$ are strictly lower/upper triangular matrices and $D$ is a diagonal matrix.
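For concreteness, here is a small NumPy sketch of this splitting; the example matrix `A` is an arbitrary regular matrix I picked for illustration, not part of the exercise:

```python
import numpy as np

# Arbitrary regular example matrix (chosen only for illustration).
A = np.array([[4.0, -1.0,  2.0],
              [1.0,  5.0, -1.0],
              [2.0, -1.0,  6.0]])

D = np.diag(np.diag(A))   # diagonal part of A
L = -np.tril(A, k=-1)     # strictly lower triangular part, sign flipped
R = -np.triu(A, k=1)      # strictly upper triangular part, sign flipped

assert np.allclose(A, D - L - R)   # the splitting A = D - L - R
```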

My idea for the second part is to use the first part: maybe we could choose the eigenvector with eigenvalue $0$ for $x_0$? I'm not sure. For the first part we should just consider $(D-L)^{-1}R$, but I don't see the trick.

Best Answer

The matrix $R$ is strictly upper triangular (it has zeros on the diagonal) and hence it is singular; in particular, its first column is zero. Namely, if $v:=[1,0,\ldots,0]^T$, we have $Rv=0$. Therefore $Mv=(D-L)^{-1}Rv=0$, which means that the first column of $M$ is zero, i.e., $v$ is an eigenvector of $M$ with eigenvalue $0$.
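A quick numerical check of this claim (a self-contained sketch with the same arbitrary example matrix as above):

```python
import numpy as np

A = np.array([[4.0, -1.0,  2.0],
              [1.0,  5.0, -1.0],
              [2.0, -1.0,  6.0]])
D = np.diag(np.diag(A))
L = -np.tril(A, k=-1)
R = -np.triu(A, k=1)

M = np.linalg.solve(D - L, R)   # Gauss-Seidel iteration matrix (D - L)^{-1} R
v = np.array([1.0, 0.0, 0.0])   # v = [1, 0, ..., 0]^T

print(np.allclose(M[:, 0], 0))  # True: first column of M is zero
print(np.allclose(M @ v, 0))    # True: M v = 0, so 0 is an eigenvalue of M
print(np.linalg.eigvals(M))     # 0 appears among the eigenvalues
```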

The error $e_k:=x-x_k$ in the Gauss-Seidel method, where $x=A^{-1}b$ is the exact solution, transforms as $$ e_{k+1}=Me_k, $$ so if the initial error $e_0$ is such that $e_0=\beta v$ for some scalar $\beta$, then $$ e_1=Me_0=\beta Mv=0. $$ That is, the method finds the exact solution in one step. The initial guess to achieve this hence needs to be $x_0=x-\beta v$, with $\beta\neq 0$ so that $x_0\neq A^{-1}b$.
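Here is a small sketch of that argument in NumPy; the matrix $A$, the right-hand side $b$ and the value of $\beta$ are arbitrary choices for illustration. One Gauss-Seidel step $x_{k+1}=(D-L)^{-1}(Rx_k+b)$ started from $x_0=x-\beta v$ returns the exact solution:

```python
import numpy as np

A = np.array([[4.0, -1.0,  2.0],
              [1.0,  5.0, -1.0],
              [2.0, -1.0,  6.0]])
b = np.array([1.0, 2.0, 3.0])

D = np.diag(np.diag(A))
L = -np.tril(A, k=-1)
R = -np.triu(A, k=1)

x = np.linalg.solve(A, b)        # exact solution x = A^{-1} b
v = np.array([1.0, 0.0, 0.0])
beta = 2.5                       # any beta != 0, so that x0 != x
x0 = x - beta * v

x1 = np.linalg.solve(D - L, R @ x0 + b)   # one Gauss-Seidel iteration
print(np.allclose(x1, x))                 # True: exact solution after one step
```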
