Linear Algebra – Proving Jacobi Method Convergence for Diagonally-Column Dominant Matrices

Tags: linear algebra, numerical linear algebra, numerical methods

The Jacobi method is an iterative method for approximating the solution of the linear system $Ax=b$, with $A\in\mathbb{C}^{n\times n}$, where we write $A=K-L$, with $K=\mathrm{diag}(a_{11},\ldots,a_{nn})$ and $L=K-A$, and where we use the fixed point iteration $$\alpha_{j+1}=K^{-1}L\alpha_j+K^{-1}b,$$ so that, writing $\alpha$ for the exact solution, we have for every $j\in\mathbb{N}$: $$\alpha-\alpha_{j+1}=K^{-1}L(\alpha-\alpha_{j}).$$
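For concreteness, the iteration above can be sketched in a few lines of plain Python (a hedged sketch, not reference code; the matrix `A` and vector `b` are made-up examples, and the names follow the post's notation):

```python
# Jacobi iteration alpha_{j+1} = K^{-1} L alpha_j + K^{-1} b,
# where A = K - L and K = diag(a_11, ..., a_nn).

def jacobi_step(A, b, alpha):
    """One update: alpha_new[i] = (b[i] - sum_{k != i} A[i][k] * alpha[k]) / A[i][i]."""
    n = len(A)
    return [(b[i] - sum(A[i][k] * alpha[k] for k in range(n) if k != i)) / A[i][i]
            for i in range(n)]

def jacobi(A, b, iters=100):
    alpha = [0.0] * len(A)          # arbitrary starting vector
    for _ in range(iters):
        alpha = jacobi_step(A, b, alpha)
    return alpha

# Hypothetical example: a strictly diagonally dominant matrix, so the iteration converges.
A = [[4.0, 1.0],
     [2.0, 5.0]]
b = [1.0, 2.0]
x = jacobi(A, b)   # approximates A^{-1} b = [1/6, 1/3]
```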

Now my syllabus provides a convergence proof for the case in which $A$ is diagonally-row dominant, but neither I nor my teacher could see how to adapt that proof to the diagonally-column dominant case. The proof for the diagonally-row dominant case uses the $\|\cdot\|_{\infty}$ norm, and I found on the internet that the diagonally-column dominant case can be proved using the $\|\cdot\|_{1}$ norm. The proof strategy would be to show that $$\|\alpha-\alpha_{j+1}\|\leq \rho^j\|\alpha-\alpha_1\|,$$ for some matrix norm $\|\cdot\|$ and some $0\leq\rho<1$, from which convergence would follow immediately.

Does anyone have an idea how to continue? Using the $\|\cdot\|_{1}$ norm as suggested, we want to have something like $$\|\alpha-\alpha_{j+1}\|_1\leq\|(K^{-1}L)^j\|_1\cdot\|\alpha-\alpha_1\|_1,$$ and somehow conclude that $\|K^{-1}L\|_1$ is strictly less than $1$. I know that strict diagonal dominance by columns can be characterized by $r<1$, where $$r:=\max_{i\in\{1,\ldots,n\}}\Bigg\{|a_{ii}|^{-1}\sum_{\substack{j=1\\ j\neq i}}^{n}|a_{ji}|\Bigg\}.$$
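As a small sanity check (a hedged illustration only; the matrix is made up), the quantity $r$ — the largest off-diagonal column sum divided by the corresponding diagonal entry — can be computed directly:

```python
# Compute r = max_i  |a_ii|^{-1} * sum_{j != i} |a_ji|  (maximum column ratio).

def column_dominance_ratio(A):
    n = len(A)
    return max(sum(abs(A[j][i]) for j in range(n) if j != i) / abs(A[i][i])
               for i in range(n))

# Hypothetical matrix, strictly diagonally dominant by columns:
# off-diagonal column sums are 3 < 5, 3 < 6, 1 < 4.
A = [[5.0, 1.0, 0.0],
     [2.0, 6.0, 1.0],
     [1.0, 2.0, 4.0]]
r = column_dominance_ratio(A)   # max(3/5, 3/6, 1/4) = 0.6 < 1
```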

I'd like to prove this without Gershgorin's theorem, since it is not covered in my course. It is fine if the column-dominant case follows from the diagonally-row dominant case.

Best Answer

The Jacobi iteration is an example of a stationary iteration $$ x_{n+1} = G x_n + f.$$ By induction we obtain $$ x_n = G^n x_0 + \sum_{j=0}^{n-1} G^j f.$$ In particular, we have $x_n \rightarrow (I-G)^{-1}f$ for any starting vector $x_0$ when, say, $\|G\|_\infty<1$ or $\|G\|_1 < 1$.
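A quick numerical check of this limit (a hedged sketch; `G` and `f` are arbitrary small examples, chosen so that $\|G\|_1 < 1$):

```python
# Stationary iteration x_{n+1} = G x_n + f converges to the fixed point
# x* = (I - G)^{-1} f when ||G||_1 < 1.

def mat_vec(G, x):
    return [sum(G[i][k] * x[k] for k in range(len(x))) for i in range(len(G))]

G = [[0.2, 0.3],
     [0.1, 0.4]]   # ||G||_1 = max column sum = 0.7 < 1
f = [1.0, 2.0]

x = [0.0, 0.0]
for _ in range(200):
    x = [g + fi for g, fi in zip(mat_vec(G, x), f)]

# x now satisfies x = G x + f up to rounding, i.e. x ≈ (I - G)^{-1} f.
```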

Now, Jacobi's method is often introduced with row diagonal dominance in mind. Accordingly, the linear system $Ax=b$ is rewritten as $Dx = (D-A)x+b$, where $D$ is the main diagonal of $A$. This gives rise to the stationary iteration corresponding to $G = D^{-1}(D-A)$ and $f = D^{-1}b$. If $A$ is strictly diagonally dominant by rows, then $\|G\|_\infty < 1$ is immediate.

However, another construction is possible. Introducing the new variable $y = Dx$ yields the equivalent linear system $y = (D-A)D^{-1} y + b$ and the stationary iteration $$y_{n+1} = (D-A) D^{-1} y_n + b.$$ When $A$ is strictly diagonally dominant by columns, the driving matrix $G = (D-A)D^{-1}$ satisfies $\|G\|_1 < 1$: the $i$th column of $(D-A)D^{-1}$ has entries $-a_{ji}/a_{ii}$ for $j\neq i$, so $\|G\|_1$ is exactly the maximum of the column ratios $|a_{ii}|^{-1}\sum_{j\neq i}|a_{ji}|$, each of which is below $1$ by assumption. In this case $y_n \rightarrow y$, where $y$ solves $y = (D-A)D^{-1} y + b$.

Now, what about Jacobi's method? With the definition $z_n := D^{-1}y_n$ we have $z_{n+1} = D^{-1}(D-A)z_n + D^{-1}b$, i.e., Jacobi's method. The sequence $z_n$ is convergent because the sequence $y_n$ is convergent, and $z_n \rightarrow z$, where $z$ solves the equation $z=D^{-1}(D-A)z+D^{-1}b$, i.e., $z = x = A^{-1}b$.
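The whole argument can be checked numerically (a hedged sketch under made-up data: the matrix `A` and vector `b` are hypothetical examples, strictly diagonally dominant by columns but not by rows in general):

```python
# For A strictly diagonally dominant by columns, ||(D - A) D^{-1}||_1 < 1,
# and the Jacobi iterates z_{n+1} = D^{-1}(D - A) z_n + D^{-1} b converge to A^{-1} b.

n = 3
A = [[4.0, 1.0, 1.0],
     [1.0, 5.0, 2.0],
     [2.0, 1.0, 6.0]]   # off-diagonal column sums 3, 2, 3 are below 4, 5, 6
b = [1.0, 0.0, 2.0]
D = [A[i][i] for i in range(n)]

# ||(D - A) D^{-1}||_1 = max_i  sum_{j != i} |a_ji| / |a_ii|
norm1 = max(sum(abs(A[j][i]) for j in range(n) if j != i) / abs(A[i][i])
            for i in range(n))

# Jacobi iteration in the z variable (equivalently z_n = D^{-1} y_n).
z = [0.0] * n
for _ in range(300):
    z = [(b[i] - sum(A[i][k] * z[k] for k in range(n) if k != i)) / D[i]
         for i in range(n)]

# Residual A z - b should be tiny, i.e. z ≈ A^{-1} b.
residual = [sum(A[i][k] * z[k] for k in range(n)) - b[i] for i in range(n)]
```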
