Proof: Every irreducibly diagonally dominant matrix $A$ is invertible.

linear-algebra, matrices, proof-explanation

Definition: A matrix $A\in\mathbb{K}^{N,N}$ is irreducibly diagonally dominant if it is irreducible and row- or column-diagonally dominant, where strict inequality has to hold in at least one row/column.
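
For concreteness, a small example (my own illustration, not part of the definition above): the tridiagonal matrix $$ T=\pmatrix{2&-1&0\\ -1&2&-1\\ 0&-1&2} $$ is irreducible (its digraph $1\leftrightarrow 2\leftrightarrow 3$ is strongly connected) and row-diagonally dominant, with strict inequality in rows $1$ and $3$ ($2>1$) and equality in row $2$ ($2=1+1$).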

Let $A$ be irreducibly diagonally dominant (say, row-diagonally dominant). Then there exists $m\in\{1,\dots,N\}$ for which

$\hspace{4cm} |a_{mm}|>\sum\limits_{j\neq m}|a_{mj}|\hspace{1cm} (1)$

Assume that $A$ is not invertible. Then $Ax=0$ has a solution $x\neq 0$. Because of $(1)$, not all components of $x$ can have the same absolute value: if they all had the same absolute value $c>0$, row $m$ of $Ax=0$ and the triangle inequality would give $|a_{mm}|c\le\sum_{j\neq m}|a_{mj}|c$, contradicting $(1)$. Let

$\hspace{4cm} I:=\{k\in\{1,\dots,N\}:|x_k|\geq |x_j|\ \forall j = 1,\dots,N\}$
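
As a quick illustration (my own example, not from the original): for $x=(3,-3,1)^T$ we have $|x_1|=|x_2|=3>1=|x_3|$, so $I=\{1,2\}$: the set of indices where $|x_k|$ attains its maximum.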

Since not all components of $x$ have the same absolute value, not all $N$ indices belong to $I$. Now let $k\in I$; row $k$ of $Ax=0$ gives

$\hspace{4cm} a_{kk}x_k = -\sum\limits_{j\neq k}a_{kj}x_j,$

so, taking absolute values, applying the triangle inequality, and dividing by $|x_k|>0$ (positive because $k\in I$ and $x\neq 0$),

$\hspace{4cm} |a_{kk}|\leq\sum\limits_{j\neq k}|a_{kj}|\dfrac{|x_j|}{|x_k|}$

Because $\dfrac{|x_j|}{|x_k|}\leq 1$ (with strict inequality for $j\notin I$) and because of the diagonal dominance of $A$, we get $a_{kj}=0\ \forall j\notin I$. However, this means that $A$ is reducible, which contradicts our prerequisites.

I can follow this proof right up to the last paragraph, where I don't understand why we get $a_{kj}=0\ \forall j\notin I$, and furthermore why this would imply reducibility.

I'm aware that a matrix $A$ is irreducible iff its digraph is strongly connected, but I don't see how one could apply this theorem here.
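
For comparison, a minimal reducible example (my own illustration): for $A=\pmatrix{1&0\\ 1&1}$ the digraph (edge $i\to j$ iff $a_{ij}\neq 0$) has edges $1\to 1$, $2\to 1$ and $2\to 2$, but no edge $1\to 2$, so node $2$ is not reachable from node $1$ and the digraph is not strongly connected.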

Edit: The answer of @user1551 made things clearer for me, but I think I can elaborate on two more steps in the proof that I didn't understand before. First of all, why $\dfrac{|x_j|}{|x_k|}<1$ for $j\notin I$.

Let $j\notin I$; then there is some $m\in\{1,\dots,N\}$ with $|x_m|>|x_j|$. If we now assume that $|x_j|=|x_k|$, this implies $|x_m|>|x_k|$, which contradicts the fact that $k\in I$; hence $|x_j|<|x_k|$.

In the end this leads to the conclusion that $A$ is indeed reducible, because, since $k$ was an arbitrary element of $I$, we have

$\forall k\in I:\forall j\notin I: a_{kj} = 0$.

One could have thought about taking a path that passes through other nodes first, but by remarking that $I\cup I^C=\{1,\dots,N\}$ we can see that this isn't possible, because there are no edges from $I$ to $I^C$ at all.
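
Concretely (a hypothetical instance of the argument above): if $I=\{1,2\}$ and $I^C=\{3\}$, any path from node $1$ to node $3$ would have to use, at some point, an edge $k\to j$ with $k\in I$ and $j\in I^C$, and all such edges are absent since $a_{kj}=0$.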

Best Answer

$A$ is diagonally dominant. Therefore $\sum_{j\ne k}|a_{kj}|\le|a_{kk}|$. Combining this with the inequality in question, we get $\sum_{j\ne k}|a_{kj}|\le\sum_{j\ne k}|a_{kj}|\frac{|x_j|}{|x_k|}$. However, as $\frac{|x_j|}{|x_k|}\le1$, we must have $a_{kj}=0$ or $\frac{|x_j|}{|x_k|}=1$ for each $j\ne k$. So, for $j\notin I$, the latter case is excluded and we must have $a_{kj}=0$.
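
To spell this step out (a small elaboration not in the original wording): subtracting the left-hand side gives $$0\le\sum_{j\ne k}|a_{kj}|\left(\frac{|x_j|}{|x_k|}-1\right),$$ where every summand is $\le 0$ because $|a_{kj}|\ge 0$ and $\frac{|x_j|}{|x_k|}\le 1$. Hence every summand must be $0$, i.e. for each $j\ne k$ either $a_{kj}=0$ or $|x_j|=|x_k|$.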

Thus $A$ is reducible, because the nodes $j\notin I$ are not reachable from the nodes $k\in I$.

To make it more visual, you may reindex the rows and columns of $A$ so that $I=\{1,2,\ldots,|I|\}$. Let $J=\{|I|+1,\ldots,N\}$. Since $a_{kj}=0$ for all $k\in I$ and $j\not\in I$, the matrix $A$ is block-triangular: $$ A=\pmatrix{A_{II}&0\\ A_{JI}&A_{JJ}}. $$ Now we see that the nodes in $I$ are not joined to the nodes in $J$ by any out-edges.
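
As a concrete toy instance (hypothetical numbers of my own): if $N=3$, $I=\{2\}$ and $a_{21}=a_{23}=0$, then swapping indices $1$ and $2$ gives $$ PAP^T=\pmatrix{a_{22}&0&0\\ a_{12}&a_{11}&a_{13}\\ a_{32}&a_{31}&a_{33}}, $$ which has exactly the block-triangular shape above, with $A_{II}=(a_{22})$ and $P$ the permutation matrix that swaps the first two coordinates.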
