[Math] LU factorization of a nonsingular matrix exists if and only if all leading principal submatrices are nonsingular.

Tags: linear-algebra, matrix-decomposition

I'm struggling to prove this theorem. I can prove that if the $LU$ factorization exists, then the leading principal submatrices are nonsingular.

To do that, I show that the determinant of every leading principal submatrix of $A$ is nonzero: the $k \times k$ leading principal submatrix of $A$ is the product of the corresponding leading principal submatrices of $L$ and $U$, the determinant of each leading principal submatrix of $L$ is $1$, and the determinant of each leading principal submatrix of $U$ is the product of its diagonal entries.
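As a sanity check, this determinant identity can be illustrated numerically. A minimal sketch with a made-up $3 \times 3$ pair $L$, $U$ (the particular entries are arbitrary):

```python
import numpy as np

# Made-up example: unit lower triangular L, upper triangular U.
L = np.array([[1.0, 0.0, 0.0],
              [2.0, 1.0, 0.0],
              [0.5, 3.0, 1.0]])
U = np.array([[4.0, 1.0, 2.0],
              [0.0, 5.0, 1.0],
              [0.0, 0.0, 2.0]])
A = L @ U

# Each leading principal minor of A equals the product of the
# corresponding diagonal entries of U, since det L_k = 1 and
# det U_k = u_11 * ... * u_kk.
for k in range(1, 4):
    minor = np.linalg.det(A[:k, :k])
    assert np.isclose(minor, np.prod(np.diag(U)[:k]))
```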

To prove that if the leading principal submatrices are nonsingular, then $LU$ factorization exists, I believe I should use induction, but I'm getting nowhere. Can anyone help me with the proof?

Best Answer

We show by induction that every $n \times n$ matrix $A$ with nonsingular leading principal minors has a factorization $A = LU$ where $L$ is unit lower triangular (ones on the diagonal), $U$ is upper triangular, and $L$ and $U$ are both nonsingular. (This statement, as you show, is an if-and-only-if.)

The $1\times 1$ base case is just factoring $a = 1 \cdot a$. To induct, write your $n \times n$ matrix $A$ as a leading principal $(n-1) \times (n-1)$ matrix $A'$ and some leftover entries: $$ A = \left[\begin{array}{ccc|c} & & & \\ & A' & & \vec{b} \\ & & & \\ \hline & \vec{c}^{\mathsf T} & & d \\\end{array}\right]. $$ By the inductive hypothesis (since all leading principal minors of $A'$ are also leading principal minors of $A$), $A'$ has an $LU$ factorization as $A' = L' U'$ with nonsingular $L'$, $U'$. We want to use this to make the factorization $$ \left[\begin{array}{ccc|c} & & & \\ & A' & & \vec{b} \\ & & & \\ \hline & \vec{c}^{\mathsf T} & & d \\\end{array}\right] = \left[\begin{array}{ccc|c} & & & \\ & L' & & \vec{0} \\ & & & \\ \hline & \vec{x}^{\mathsf T} & & 1 \\\end{array}\right]\left[\begin{array}{ccc|c} & & & \\ & U' & & \vec{y} \\ & & & \\ \hline & \vec{0}^{\mathsf T} & & z \\\end{array}\right] $$ work, by picking appropriate $\vec x$, $\vec y$, and $z$.

By doing the block multiplication, we get four equations.

  • We have $A' = L'U' + \vec{0}\vec{0}^{\mathsf T}$, which we know is true, so that's done.
  • We have $\vec b = L'\vec y + \vec 0 z$, so we want to set $\vec y = L'^{-1}\vec b$. Fortunately that's possible since $L'$ is invertible.
  • We have $\vec c^{\mathsf T} = \vec{x}^{\mathsf T}U' + \vec{0}^{\mathsf T}$, so we want to set $\vec{x}^{\mathsf T} = \vec{c}^{\mathsf T}U'^{-1}$. This is possible since $U'$ is also invertible.
  • We have $d = \vec{x}^{\mathsf T}\vec y + z$, so we want to set $z = d - \vec{x}^{\mathsf T} \vec y$.
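The inductive step above translates directly into a recursive construction. Here is a minimal Python sketch (the function name `lu_recursive` is my own; it assumes all leading principal minors of the input are nonsingular, so no pivoting is performed):

```python
import numpy as np

def lu_recursive(A):
    """A = L @ U with L unit lower triangular and U upper triangular,
    built by the inductive proof.  Assumes every leading principal
    submatrix of A is nonsingular (no pivoting)."""
    n = A.shape[0]
    if n == 1:
        # Base case: a = 1 * a.
        return np.array([[1.0]]), A.astype(float)
    Ap = A[:-1, :-1]   # the (n-1) x (n-1) block A'
    b = A[:-1, -1]     # last column above the corner
    c = A[-1, :-1]     # last row left of the corner
    d = A[-1, -1]      # bottom-right corner
    Lp, Up = lu_recursive(Ap)          # A' = L'U' by induction
    y = np.linalg.solve(Lp, b)         # y = L'^{-1} b
    x = np.linalg.solve(Up.T, c)       # x^T = c^T U'^{-1}
    z = d - x @ y                      # z = d - x^T y
    L = np.zeros((n, n))
    L[:-1, :-1] = Lp
    L[-1, :-1] = x
    L[-1, -1] = 1.0
    U = np.zeros((n, n))
    U[:-1, :-1] = Up
    U[:-1, -1] = y
    U[-1, -1] = z
    return L, U
```

For example, `lu_recursive(np.array([[2., 1., 1.], [4., 3., 3.], [8., 7., 9.]]))` (whose leading principal minors are $2$, $2$, $4$) returns a unit lower triangular $L$ and upper triangular $U$ with $LU = A$.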

For future inductive steps, we also want to know that the resulting matrices $L$ and $U$ are nonsingular. This is immediate for $L$ since its diagonal is $1$; for $U$, it's not obvious how to check that the value of $z$ we get is nonzero. But once we have $A = LU$ where $A$ and $L$ are nonsingular, we know that $U = L^{-1}A$ is nonsingular.


There are also $LU$ factorizations out there for which $U$ is singular (some of the diagonal entries of $U$ are zero). For these, there is not an if-and-only-if condition this nice.

You can see from the above proof, for instance, that if $A$ is possibly singular but all of its proper leading principal minors are still nonsingular, then we get a factorization $A = LU$ in which the bottom right entry is possibly $0$. (This is because arguing $z\ne 0$ is the only place where we needed $A$ to be nonsingular.)
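To illustrate with a minimal made-up example: take the $2 \times 2$ all-ones matrix. Its only proper leading principal minor is $1 \neq 0$, so the construction goes through, but the corner entry $z$ of $U$ comes out $0$ because $A$ is singular:

```python
import numpy as np

# A is singular, but its proper leading principal minor (the 1x1
# block [1]) is nonsingular, so the construction still succeeds.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

# Following the proof with A' = [1] = L'U', b = c = [1], d = 1:
y = np.array([1.0])      # y = L'^{-1} b
x = np.array([1.0])      # x^T = c^T U'^{-1}
z = A[1, 1] - x @ y      # z = d - x^T y = 0, since A is singular

L = np.array([[1.0, 0.0], [x[0], 1.0]])
U = np.array([[1.0, y[0]], [0.0, z]])
assert np.allclose(L @ U, A)
```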
