[Math] Diagonally dominant matrices need no row exchanges in Gaussian elimination

linear algebra, numerical linear algebra, numerical methods

A diagonally dominant matrix is a matrix such that in each row, the absolute value of the diagonal element is greater than the sum of the absolute values of the other elements of that row.

Prove that for a diagonally dominant matrix, no row exchanges are needed in the Gaussian elimination algorithm. Also, conclude that its determinant is $\neq 0$ and that $A$ has an LU decomposition.

I found this proof: https://math.stackexchange.com/a/804128/166180, but I don't understand why the sum there starts at $i=3$ and not $i=1$. Also, how do I prove that the determinant is $\neq 0$ and that the LU decomposition exists?

UPDATE:

Could someone sketch a proof that no row exchange is necessary? I have opened a bounty.

Best Answer

Diagonal dominance can be defined by rows or by columns. In view of what you need, namely to show that during Gaussian elimination, which eliminates column by column, no partial pivoting is necessary, the definition of diagonal dominance by columns is the more helpful one. If diagonal dominance by columns is preserved throughout the elimination process, then at every step the diagonal entry is the largest entry (in absolute value) of its column, so an exchange of rows (i.e. partial pivoting) would never produce a better pivot and can be omitted.
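To make this concrete, here is a minimal NumPy sketch (the function name is my own) that runs Gaussian elimination with partial pivoting on a matrix that is strictly diagonally dominant by columns and records whether the pivot search ever leaves the diagonal:

```python
import numpy as np

def pivot_stays_on_diagonal(A):
    """Run Gaussian elimination with partial pivoting and record, at each
    step, whether the pivot search already points at the diagonal entry."""
    A = A.astype(float)
    n = A.shape[0]
    picks = []
    for k in range(n - 1):
        # Partial pivoting: largest |entry| in column k among rows k..n-1.
        p = k + int(np.argmax(np.abs(A[k:, k])))
        picks.append(p == k)              # True -> no row exchange needed
        A[[k, p]] = A[[p, k]]             # a no-op whenever p == k
        for i in range(k + 1, n):
            A[i, k:] -= A[i, k] / A[k, k] * A[k, k:]
    return picks

# Strictly diagonally dominant *by columns*:
# |a_jj| > sum over i != j of |a_ij|, for every column j.
A = np.array([[ 4.0, 1.0, -1.0],
              [ 1.0, 5.0,  2.0],
              [-2.0, 1.0,  6.0]])
print(pivot_stays_on_diagonal(A))  # [True, True]: the diagonal always wins
```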

Suppose the square matrix $\textbf{A}$, of size $n\times n$, is diagonally dominant by columns, and let us eliminate the first column, i.e. the entries $a_{21},\dots,a_{n1}$. We then show that the resulting submatrix $\textbf{A}'$ is diagonally dominant by columns as well. $$ \textbf{A} ~~\rightarrow~~ \left[\begin{array}{cc} a_{11} & \textbf{a} \\ \textbf{0} & \textbf{A}' \\ \end{array}\right],$$ $$ \left[\begin{array}{cccc} a_{11} & \dots & a_{1j} & \dots \\ a_{21} & \dots & a_{2j} & \dots \\ \vdots & & \vdots & \\ a_{j1} & \dots & a_{jj} & \dots \\ \vdots & & \vdots & \\ a_{n1} & \dots & a_{nj} & \dots \\ \end{array}\right] ~~\rightarrow~~ \left[\begin{array}{cccc} a_{11} & \dots & a_{1j} & \dots \\ 0 & \dots & a_{2j}-\frac{a_{21}}{a_{11}}a_{1j} & \dots \\ \vdots & & \vdots & \\ 0 & \dots & a_{jj}-\frac{a_{j1}}{a_{11}}a_{1j} & \dots \\ \vdots & & \vdots & \\ 0 & \dots & a_{nj}-\frac{a_{n1}}{a_{11}}a_{1j} & \dots \\ \end{array}\right]. $$
First of all, diagonal dominance of the first column of $\textbf{A}$ yields $$ |a_{11}|\ge \sum_{i=2}^n |a_{i1}|~~\implies~~1\ge \sum_{i=2}^n \left|\frac{a_{i1}}{a_{11}}\right|. $$ Then we look at the off-diagonal entries of the $j$-th column of $\textbf{A}'$ and estimate them by the triangle inequality: $$ \sum_{i=2,~i\neq j}^n \left|a_{ij}-a_{1j}\frac{a_{i1}}{a_{11}}\right| \le \sum_{i=2,~i\neq j}^n \left|a_{ij}\right|+\sum_{i=2,~i\neq j}^n \left|a_{1j}\frac{a_{i1}}{a_{11}}\right| = \sum_{i=1,~i\neq j}^n \left|a_{ij}\right|-|a_{1j}|+ \sum_{i=2}^n \left|a_{1j}\frac{a_{i1}}{a_{11}}\right|-\left|a_{1j}\frac{a_{j1}}{a_{11}}\right|. $$ Using the diagonal dominance of the $j$-th column on the first sum, this is $$ \le |a_{jj}|-|a_{1j}|+|a_{1j}|\sum_{i=2}^n \left|\frac{a_{i1}}{a_{11}}\right|-\left|a_{1j}\frac{a_{j1}}{a_{11}}\right|, $$ and using the diagonal dominance of the first column (the implication above), this is $$ \le |a_{jj}|-\left|a_{1j}\frac{a_{j1}}{a_{11}}\right| \le \left|a_{jj}-a_{1j}\frac{a_{j1}}{a_{11}}\right|, $$ where the last step is the reverse triangle inequality and the right-hand side is exactly the diagonal entry of the $j$-th column of $\textbf{A}'$. Thus the diagonal dominance of the $j$-th column is established, and $\textbf{A}'$ is diagonally dominant by columns. Applying this argument inductively, every submatrix at every stage of the Gaussian elimination is diagonally dominant by columns. As a consequence, partial pivoting never finds a better pivot than the diagonal entry and would be useless.
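As a numerical sanity check of this inductive step (not a substitute for the proof), one can generate random matrices that are strictly diagonally dominant by columns and verify that the submatrix left after one elimination step, i.e. the Schur complement, stays column diagonally dominant. A sketch with hypothetical helper names:

```python
import numpy as np

def column_dominance_margins(M):
    """|m_jj| minus the off-diagonal absolute column sum, for each column j.
    All margins >= 0 exactly when M is diagonally dominant by columns."""
    d = np.abs(np.diag(M))
    off = np.abs(M).sum(axis=0) - d
    return d - off

def schur_complement(M):
    """The submatrix left after eliminating the first column:
    A' = A22 - a21 a12 / a11."""
    return M[1:, 1:] - np.outer(M[1:, 0], M[0, 1:]) / M[0, 0]

rng = np.random.default_rng(0)
for _ in range(1000):
    B = rng.standard_normal((5, 5))
    # Inflate each diagonal entry above its column's off-diagonal sum,
    # making B strictly diagonally dominant by columns.
    off = np.abs(B).sum(axis=0) - np.abs(np.diag(B))
    np.fill_diagonal(B, off + 1.0)
    assert (column_dominance_margins(B) > 0).all()
    assert (column_dominance_margins(schur_complement(B)) >= -1e-12).all()
print("column diagonal dominance survived the elimination step every time")
```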

You could of course ask whether full pivoting would be useful. However, first, full pivoting is out of practical use due to its computational demands, and second, it can also be shown that the matrix entries grow during the elimination by at most a factor of $2$, which makes the whole Gaussian elimination procedure numerically safe. This result can be found in J. A. Trangenstein, Scientific Computing, Vol. I: Linear and Nonlinear Equations, Springer, 2018, p. 284.
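Assuming the usual definition of the growth factor (largest entry, in absolute value, of any intermediate matrix divided by the largest entry of the original matrix), a quick experiment along these lines illustrates the factor-$2$ bound empirically:

```python
import numpy as np

def growth_factor(A):
    """Max |entry| over all intermediate elimination matrices, divided by
    the max |entry| of the original matrix (no pivoting performed)."""
    A = A.astype(float)
    n = A.shape[0]
    base = np.abs(A).max()
    peak = base
    for k in range(n - 1):
        for i in range(k + 1, n):
            A[i, k:] -= A[i, k] / A[k, k] * A[k, k:]
        peak = max(peak, np.abs(A).max())
    return peak / base

rng = np.random.default_rng(1)
worst = 0.0
for _ in range(1000):
    B = rng.standard_normal((6, 6))
    off = np.abs(B).sum(axis=0) - np.abs(np.diag(B))
    np.fill_diagonal(B, off + 0.1)   # strictly column diagonally dominant
    worst = max(worst, growth_factor(B))
print(worst)  # observed to stay below 2, matching the factor-2 bound
```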

Let us now focus on the question of regularity. Diagonal dominance alone is not sufficient to ensure regularity (consider the matrix $\left[\begin{smallmatrix}1 & 1\\ 1 & 1\end{smallmatrix}\right]$). However, strict diagonal dominance is sufficient, see https://en.wikipedia.org/w/index.php?title=Diagonally_dominant_matrix#Applications_and_properties. In particular, for a strictly diagonally dominant matrix every pivot produced by the elimination is nonzero, so the elimination runs to completion without row exchanges and yields the factorization $A=LU$, with $L$ the unit lower triangular matrix of multipliers and $U$ the resulting upper triangular matrix; the determinant is then the product of the pivots, $\det A = u_{11}\cdots u_{nn}\neq 0$.
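Putting the pieces together, here is a minimal sketch of Doolittle-style elimination without pivoting; on a strictly diagonally dominant matrix it completes without row exchanges, reproduces $A = LU$, and exposes the determinant as the product of the pivots:

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle factorization A = L U without row exchanges:
    L is unit lower triangular, U is upper triangular."""
    n = A.shape[0]
    L, U = np.eye(n), A.astype(float)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # the elimination multiplier
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

A = np.array([[ 4.0, 1.0, -1.0],
              [ 1.0, 5.0,  2.0],
              [-2.0, 1.0,  6.0]])            # strictly diagonally dominant
L, U = lu_no_pivot(A)
assert np.allclose(L @ U, A)
# det(L) = 1, so det(A) equals the product of the pivots on diag(U):
print(np.prod(np.diag(U)), np.linalg.det(A))  # both nonzero and equal
```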