[Math] Jacobi vs. Gauss-Seidel: convergence

linear-algebra, numerical-linear-algebra

I know that for tridiagonal matrices, the Jacobi and Gauss-Seidel iterative methods for solving linear systems either both converge or both diverge, and that when they converge, Gauss-Seidel converges twice as fast as Jacobi (the spectral radius of its iteration matrix is the square of Jacobi's). Are there theorems relating the convergence rates of these two methods when the system's matrix is not tridiagonal?
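For concreteness, the squared-spectral-radius relationship can be checked numerically on a standard tridiagonal example. A minimal sketch using NumPy (the particular matrix and its size are arbitrary choices for illustration):

```python
import numpy as np

# Standard tridiagonal test matrix: 2 on the diagonal, -1 off it.
n = 6
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

D = np.diag(np.diag(A))            # diagonal part of A
L = -np.tril(A, k=-1)              # strictly lower part, negated
U = -np.triu(A, k=1)               # strictly upper part, negated

T_jac = np.linalg.solve(D, L + U)  # Jacobi iteration matrix D^{-1}(L+U)
T_gs = np.linalg.solve(D - L, U)   # Gauss-Seidel iteration matrix (D-L)^{-1}U

rho = lambda T: max(abs(np.linalg.eigvals(T)))
print(rho(T_jac), rho(T_gs))       # rho(T_gs) should equal rho(T_jac)**2
```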

Best Answer

Yes. From pg. 233 of A Friendly Introduction to Numerical Analysis by Brian Bradie, we have the following:

Theorem. Suppose $A$ is an $n \times n$ matrix. If $ a_{ii} > 0$ for each $i$ and $a_{ij} \leq 0$ whenever $ i \neq j$, then one and only one of the following statements holds:

  1. $0 \leq \rho (T_{gs}) < \rho (T_{jac}) < 1$

  2. $1 < \rho (T_{jac}) < \rho (T_{gs})$

  3. $\rho (T_{jac}) = \rho (T_{gs}) = 0$

  4. $\rho (T_{jac}) =\rho (T_{gs}) = 1$

where $T_{jac}$ and $T_{gs}$ are the iteration matrices of the Jacobi and Gauss-Seidel methods, respectively, and $\rho (T)$ denotes the spectral radius of $T$.
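For reference, with the standard splitting $A = D - L - U$ (where $D$ is the diagonal of $A$, $-L$ its strictly lower part, and $-U$ its strictly upper part), the iteration matrices are

$$T_{jac} = D^{-1}(L + U), \qquad T_{gs} = (D - L)^{-1} U.$$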

Thus, for this larger class of matrices, the two methods converge or diverge together. When both converge, Gauss-Seidel converges faster; when both diverge, Gauss-Seidel diverges faster. This result is known as the Stein-Rosenberg theorem.
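A quick numerical illustration of case 1 in the theorem. This is a sketch assuming NumPy; the matrix below is an arbitrary non-tridiagonal example (a small 2-D Laplacian) with positive diagonal and nonpositive off-diagonal entries, so it satisfies the theorem's hypotheses:

```python
import numpy as np

# Non-tridiagonal matrix with a_ii > 0 and a_ij <= 0 for i != j.
A = np.array([[ 4., -1., -1.,  0.],
              [-1.,  4.,  0., -1.],
              [-1.,  0.,  4., -1.],
              [ 0., -1., -1.,  4.]])

D = np.diag(np.diag(A))            # diagonal part of A
L = -np.tril(A, k=-1)              # strictly lower part, negated
U = -np.triu(A, k=1)               # strictly upper part, negated

T_jac = np.linalg.solve(D, L + U)  # Jacobi iteration matrix
T_gs = np.linalg.solve(D - L, U)   # Gauss-Seidel iteration matrix

rho = lambda T: max(abs(np.linalg.eigvals(T)))
rho_jac, rho_gs = rho(T_jac), rho(T_gs)

# Case 1 of the theorem: 0 <= rho(T_gs) < rho(T_jac) < 1,
# i.e. both methods converge and Gauss-Seidel converges faster.
print(rho_jac, rho_gs)
```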

If you want the proof of this, Bradie cites the following sources:

A. Ralston and P. Rabinowitz, A First Course in Numerical Analysis, 2nd edition, McGraw-Hill, New York, 1978.

D. M. Young, Iterative Solution of Large Linear Systems, Academic Press, New York, 1971.