[Math] Diagonal dominance preserved by row elimination

Tags: gaussian elimination, matrices, numerical methods

I'm trying to complete a proof I've seen in a class, which says that diagonally dominant matrices have an LU decomposition. For this, given a diagonally dominant matrix $A \in \mathbb{R}^{n \times n}$, we know that if the sequence $A^{(0)}, \ \dots \ , \ A^{(n-1)}$ defined by

$$
A^{(0)} = A, \\
a_{ij}^{(k)} = a_{ij}^{(k-1)} - \frac{a_{ik}^{(k-1)} \cdot a_{kj}^{(k-1)}}{a_{kk}^{(k-1)}}
$$

which performs the steps of Gaussian elimination without pivoting, is well defined (i.e., $a_{kk}^{(k-1)} \neq 0$ for each $k$), then there is an LU decomposition. Therefore, it is sufficient to show the stronger statement:

If $A^{(k)}$ is diagonally dominant, $A^{(k+1)}$ is as well.
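
For reference, a quick way to check this claim numerically is the following minimal Python sketch; the helper names and the example matrix are made up purely for illustration. It applies the elimination steps without pivoting and tests strict row diagonal dominance after each step.

```python
import numpy as np

def eliminate_step(A, k):
    """One step of Gaussian elimination without pivoting:
    zero out column k below the pivot A[k, k]."""
    A = A.astype(float).copy()
    for i in range(k + 1, A.shape[0]):
        m = A[i, k] / A[k, k]       # multiplier a_ik / a_kk
        A[i, :] -= m * A[k, :]      # row_i <- row_i - m * row_k
    return A

def is_strictly_diag_dominant(A):
    """Strict row diagonal dominance: sum_{j != i} |a_ij| < |a_ii|."""
    off_diag = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))
    return bool(np.all(off_diag < np.abs(np.diag(A))))

# Illustrative diagonally dominant matrix (made up for the check).
A = np.array([[ 4.0, -1.0,  1.0],
              [ 2.0,  5.0, -2.0],
              [-1.0,  2.0,  6.0]])

Ak = A
for k in range(A.shape[0] - 1):
    Ak = eliminate_step(Ak, k)
    print(f"A^({k + 1}) diagonally dominant: {is_strictly_diag_dominant(Ak)}")
```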

Now, here is where I've lost track of my notes. I know this has to be true, but I have not succeeded in proving it. We want to show that

$$
\sum_{j \neq i}|a_{ij}^{(k)}| < |a_{ii}^{(k)}|
$$

which is equivalent to

$$
\sum_{j \neq i}\left|a_{ij}^{(k-1)} - \frac{a_{ik}^{(k-1)} \cdot a_{kj}^{(k-1)}}{a_{kk}^{(k-1)}}\right| < \left|a_{ii}^{(k-1)} - \frac{a_{ik}^{(k-1)} \cdot a_{ki}^{(k-1)}}{a_{kk}^{(k-1)}}\right|
$$
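
The same kind of numerical check can be run directly on this equivalent form, now row by row via the update formula; again a minimal sketch with a made-up matrix and arbitrarily chosen indices.

```python
import numpy as np

# Illustrative diagonally dominant matrix (picked for the check).
A = np.array([[ 4.0, -1.0,  1.0],
              [ 2.0,  5.0, -2.0],
              [-1.0,  2.0,  6.0]])

k, i = 0, 1  # first elimination step, second row (0-based indices)

# Entries of A^{(k)} in row i, obtained from the update formula.
new_row = A[i, :] - A[i, k] * A[k, :] / A[k, k]

lhs = sum(abs(new_row[j]) for j in range(A.shape[1]) if j != i)
rhs = abs(new_row[i])
print(f"{lhs} < {rhs}: {lhs < rhs}")
```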

Any thoughts?

Best Answer

The basic idea of the proof is as follows. Assume you are eliminating the entry $a_{ik}^{(k-1)}$ in the $k$th step of Gaussian elimination, so that $a_{ik}^{(k)} = 0$ at the end of that step. To show that the matrix is still diagonally dominant after the elimination, we show that the total amount added to the $i$th row is less than the entry that is eliminated.

We start with the diagonal dominance of the $k$th row, i.e., $$ \sum_{j \not= k} | a_{kj}^{(k-1)} | < | a_{kk}^{(k-1)} | \,. $$ Multiplying by $\frac{| a_{ik}^{(k-1)} |}{| a_{kk}^{(k-1)} |}$ gives that $$ \tag{a} \sum_{j \not= k} \frac{| a_{ik}^{(k-1)} | \cdot | a_{kj}^{(k-1)} |} {| a_{kk}^{(k-1)} |} < | a_{ik}^{(k-1)} | \,. $$ Hence, indeed, what we add to the $i$th row is less than the entry that we eliminate. With this knowledge we can start the computation.
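
As a quick aside, here is a minimal numerical sanity check of (a); the matrix and the indices $k$, $i$ are chosen arbitrarily for illustration and are not part of the argument.

```python
import numpy as np

# Illustrative diagonally dominant matrix (chosen for the check).
A = np.array([[ 4.0, -1.0,  1.0],
              [ 2.0,  5.0, -2.0],
              [-1.0,  2.0,  6.0]])

k, i = 0, 2  # step k, row i being updated (0-based indices)

# Left-hand side of (a): total absolute amount added to row i.
lhs = sum(abs(A[i, k]) * abs(A[k, j]) / abs(A[k, k])
          for j in range(A.shape[1]) if j != k)
rhs = abs(A[i, k])  # the entry that gets eliminated

print(f"{lhs} < {rhs}: {lhs < rhs}")  # 0.5 < 1.0: True
```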

We have, because $a_{ik}^{(k)} = 0$, that
\begin{align}
\sum_{j \not= i} | a_{ij}^{(k)} | &= \sum_{\substack{j \not= i \\ j \not= k}} | a_{ij}^{(k)} | \,.
\end{align}
Using the definition of the elimination step and then the triangle inequality gives
\begin{align}
\sum_{j \not= i} | a_{ij}^{(k)} | &= \sum_{\substack{j \not= i \\ j \not=k}} \left| a_{ij}^{(k-1)} - \frac{a_{ik}^{(k-1)} \cdot a_{kj}^{(k-1)}}{a_{kk}^{(k-1)}} \right| \\
&\le \sum_{\substack{j \not= i \\ j \not=k}} | a_{ij}^{(k-1)} | + \sum_{\substack{j \not= i \\ j \not=k}} \left| \frac{a_{ik}^{(k-1)} \cdot a_{kj}^{(k-1)}}{a_{kk}^{(k-1)}} \right| \,.
\end{align}
From inequality (a), together with the fact that our sum leaves out the $j = i$ term, it follows that
\begin{align}
\sum_{j \not= i} | a_{ij}^{(k)} | &< \sum_{\substack{j \not= i \\ j \not=k}} | a_{ij}^{(k-1)} | + | a_{ik}^{(k-1)} | - \left| \frac{a_{ik}^{(k-1)} \cdot a_{ki}^{(k-1)}}{a_{kk}^{(k-1)}} \right| \\
&= \sum_{j \not= i} | a_{ij}^{(k-1)} | - \left| \frac{a_{ik}^{(k-1)} \cdot a_{ki}^{(k-1)}}{a_{kk}^{(k-1)}} \right| \,.
\end{align}
Using that the $i$th row was diagonally dominant before the elimination and then applying the reverse triangle inequality,
\begin{align}
\sum_{j \not= i} | a_{ij}^{(k)} | &< | a_{ii}^{(k-1)} | - \left| \frac{a_{ik}^{(k-1)} \cdot a_{ki}^{(k-1)}}{a_{kk}^{(k-1)}} \right| \\
&\le \left| a_{ii}^{(k-1)} - \frac{a_{ik}^{(k-1)} \cdot a_{ki}^{(k-1)}}{a_{kk}^{(k-1)}} \right| \\
&= | a_{ii}^{(k)} | \,.
\end{align}
This strict inequality is exactly the diagonal dominance of the $i$th row of $A^{(k)}$, which completes the proof.
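
For completeness, here is a minimal sketch of the consequence the question is after: since every pivot stays nonzero, LU factorization without pivoting goes through. The function name and the example matrix are illustrative, not part of the proof.

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorization without pivoting (A = L @ U).
    It succeeds whenever every pivot U[k, k] is nonzero, which the
    argument above guarantees for strictly diagonally dominant A."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]   # store the multiplier in L
            U[i, :] -= L[i, k] * U[k, :]  # elimination step on row i
    return L, U

# Illustrative diagonally dominant matrix (chosen for the demo).
A = np.array([[ 4.0, -1.0,  1.0],
              [ 2.0,  5.0, -2.0],
              [-1.0,  2.0,  6.0]])

L, U = lu_no_pivot(A)
print(np.allclose(L @ U, A))  # True: A has an LU decomposition
```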