[Math] Gauss Transform with $LU$ factorization

linear algebra, matrices, numerical methods

Consider a symmetric matrix $A$, i.e. $A = A^T$.

a.) Consider the use of Gauss transforms to factor $A = LU$ where $L$ is unit lower triangular and $U$ is upper triangular. You may assume that the factorization does not fail. Show that $A = LDL^T$ where $L$ is unit lower triangular and $D$ is a diagonal matrix with nonzero entries on the main diagonal.

b.) For an arbitrary symmetric matrix, the $LDL^T$ factorization will not always exist, due to the possibility of a $0$ in the $(i,i)$ position of the transformed matrix that defines the $i$-th Gauss transform. Suppose, however, that $A$ is a symmetric positive definite matrix, i.e., $x^TAx > 0$ for any vector $x\neq 0$. Show that the diagonal element of the transformed matrix that is used to define the vector $l_i$ determining the Gauss transform on step $i$, $M_i^{-1} = I - l_ie_i^{T}$, is always positive, and therefore the factorization will not fail. Combine this with the existence of the $LDL^T$ factorization to show that, in this case, the nonzero elements of $D$ are in fact positive.

Solution a.) Define $$A = LU = LD\tilde{U}$$ where $L$ is unit lower triangular, $D$ is a diagonal matrix with nonzeros on the main diagonal, and $\tilde{U}$ is unit upper triangular. Since $A$ is symmetric we have $$A = LD\tilde{U} = A^T = \tilde{U}^T D L^T.$$ Multiplying both sides by $L^{-1}$ on the left and $(L^T)^{-1}$ on the right gives $$D\tilde{U}(L^T)^{-1} = L^{-1}\tilde{U}^T D.$$ The left side is upper triangular and the right side is lower triangular, so for them to be equal both must be diagonal. Since $\tilde{U}$ and $(L^T)^{-1}$ are unit triangular, $D\tilde{U}(L^T)^{-1}$ has the same diagonal as $D$, so the common diagonal matrix must be $D$ itself. Therefore $$D = D\tilde{U}(L^T)^{-1},$$ and multiplying by $L^T$ on the right gives $$DL^T = D\tilde{U}.$$ Hence $$A = LD\tilde{U} = LDL^T.$$
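Not part of the original argument, but a quick numerical sanity check of this result: the sketch below (assuming NumPy; the helper name `lu_nopivot` and the random SPD test matrix are illustrative choices) applies the Gauss transforms $M_i^{-1} = I - l_ie_i^T$ without pivoting, sets $D = \operatorname{diag}(U)$, and confirms $A = LDL^T$.

```python
import numpy as np

def lu_nopivot(A):
    """LU via Gauss transforms, no pivoting: returns unit lower L and upper U."""
    U = A.astype(float).copy()
    n = U.shape[0]
    L = np.eye(n)
    for i in range(n - 1):
        l = U[i+1:, i] / U[i, i]              # multipliers l_i for step i
        L[i+1:, i] = l
        U[i+1:, i:] -= np.outer(l, U[i, i:])  # apply M_i^{-1} = I - l_i e_i^T
    return L, np.triu(U)

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)          # random symmetric (in fact SPD) test matrix

L, U = lu_nopivot(A)
D = np.diag(np.diag(U))              # pivots d_1, ..., d_n on the diagonal
print(np.allclose(A, L @ U))         # True: A = LU
print(np.allclose(A, L @ D @ L.T))   # True: A = L D L^T
```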

Solution b.) (As suggested by AJ) Assume $A = LU$ exists and denote by subscript $k$ the $k\times k$ leading principal submatrix. Then $A_k = L_k U_k$ and hence $$\det(A_k) = \det(L_k)\det(U_k) = \det(U_k) = d_1 d_2\cdots d_k,$$ in other words, the product of the first $k$ pivots of Gaussian elimination. Now assume $\det(A_k)\neq 0$ for $1\leq k \leq n-1$; then each $A_k$ is nonsingular. After $k$ steps of Gaussian elimination we have $$M^{-1}_{k}\cdots M^{-1}_{1}A^{(1)} = A^{(k+1)},$$ where $A^{(1)} = A$ and the first $k$ columns of $A^{(k+1)}$ are upper triangular. Not really sure where to go from here…

Best Answer

Solution a) looks right, but can be simplified. Just taking $A=LU$ and pre- and post-multiplying as you did gives $$ L^{-1}AL^{-T}=UL^{-T}. $$ The LHS is symmetric while the RHS is upper triangular; hence it is a diagonal matrix. Call it $D$.

Hint for Solution b): Denote by subscript $k$ the $k\times k$ leading principal submatrix. Assume $A=LU$ exists. Then $A_k=L_kU_k$ and, hence, $\det A_k=\det U_k=d_1d_2\cdots d_k$, i.e. the product of the first $k$ pivots of Gaussian elimination. Use induction to prove that if all $\det A_k$ for $k=1$ to $n-1$ are nonzero, then the $LU$ factorization is possible. Then use the fact that a positive definite matrix $A$ must have all $A_k$ positive definite too, and thus $\det A_k\ne 0$.
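As a numerical illustration of the hint (a sketch assuming NumPy, and using the standard relation between the Cholesky factor and the $LDL^T$ pivots: $G = LD^{1/2}$, so $d_k = G_{kk}^2$): the snippet checks $\det A_k = d_1 d_2\cdots d_k$ and that every pivot of an SPD matrix is positive.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = B @ B.T + 5 * np.eye(5)       # SPD, so every leading A_k is SPD as well

G = np.linalg.cholesky(A)         # A = G G^T with G = L D^{1/2}
pivots = np.diag(G) ** 2          # d_k = G_kk^2, necessarily positive
minors = [np.linalg.det(A[:k, :k]) for k in range(1, 6)]

print(np.allclose(minors, np.cumprod(pivots)))  # det A_k = d_1 d_2 ... d_k
print(np.all(pivots > 0))                       # no zero pivot: LU cannot fail
```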

Addendum:

  1. Lemma: if $\det A_k\ne 0$ for $1\le k\le n-1$ then $A=LU$ exists. Proof: $\det A_1=a_{11}\ne 0$, so it is possible to eliminate the first column. Now assume that you have eliminated the first $k$ columns and look at the matrices you have at step $k$. (You may work out the case $n=4$ first, just to see what happens in general.) From the matrix structure, conclude that after $k$ steps of partial Gaussian elimination it still holds that $A_{k+1}=L_{k+1}U_{k+1}$, which (by the relation above) means the pivot $d_{k+1}$ is nonzero, so the next column elimination is possible. This is mathematical induction.
  2. If $A$ is pos. def. then by definition for all $z=[x\ \ y]^T\ne 0$ $$ z^TAz= \begin{bmatrix} x^T & y^T \end{bmatrix} \begin{bmatrix} A_k & *\\ * & * \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}>0. $$ Now set $y=0$ to obtain the definition of $A_k$ being pos. def. Finally, if $A$ is pos. def. then $\det A\ne 0$. Proof: assume $\det A=0$; then there exists $x\ne 0$ such that $Ax=0$, so $x^TAx=0$, a contradiction! (A numerical check of the zero-padding step is sketched below.)
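A minimal sketch of the zero-padding step in item 2 (assuming NumPy; the size $n=5$, the index $k=3$, and the seed are arbitrary choices): padding $x$ with zeros gives $z^TAz = x^TA_kx$, so positive definiteness passes to every leading principal submatrix.

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((5, 5))
A = B @ B.T + 5 * np.eye(5)                  # SPD test matrix
k = 3
x = rng.standard_normal(k)
z = np.concatenate([x, np.zeros(5 - k)])     # z = [x, 0]^T

print(np.isclose(z @ A @ z, x @ A[:k, :k] @ x))  # z^T A z = x^T A_k x
print(x @ A[:k, :k] @ x > 0)                     # hence A_k is pos. def.
```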