The matrix of a positive definite Hermitian form is nonsingular, and $I+A^*A$ is nonsingular.

bilinear-form, linear-algebra, self-adjoint-operators

I am reviewing some more advanced linear algebra before the semester begins, and I came across these two problems. I was hoping someone could verify that my proofs are correct.

(1) The matrix, relative to any basis, of any positive definite Hermitian form is nonsingular.

(2) For any complex $n\times n$ matrix $A$, the matrix $I+A^*A$ is nonsingular, where $A^*$ is the conjugate transpose of $A$.

Here are my proofs.

(1) Let $\beta = \{v_1,\dots,v_n\}$ be any basis for the $n$-dimensional vector space in question, and let $H$ be any positive definite Hermitian form. Let $A$ be the matrix associated with $H$ with respect to the basis $\beta$; by that I mean $A_{ij} = H(v_i,v_j)$. Suppose that $A$ is singular. Then there is a nonzero vector $x$ such that $Ax=0$, which implies that $x^*Ax=0$. If $v$ is the (nonzero) vector whose coordinate vector relative to $\beta$ is $x$, then $H(v,v)=x^{*}Ax=0$. But this contradicts the fact that $H$ is positive definite.
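As a quick numerical sanity check of (1) (not a substitute for the proof), here is a minimal NumPy sketch; the particular Gram matrix is made up for illustration, built as $B^*B+I$ so that it is guaranteed Hermitian and positive definite:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A positive definite Hermitian matrix standing in for the Gram matrix
# A_ij = H(v_i, v_j): B*B is positive semidefinite for any complex B,
# and adding I pushes every eigenvalue strictly above zero.
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B.conj().T @ B + np.eye(n)

assert np.allclose(A, A.conj().T)        # Hermitian
eigvals = np.linalg.eigvalsh(A)          # real eigenvalues
assert np.all(eigvals > 0)               # positive definite

# Nonsingular: no eigenvalue is zero, so the determinant is nonzero.
print("smallest eigenvalue:", eigvals.min())
print("|det(A)|:", abs(np.linalg.det(A)))
```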

(2) Since $(I+A^*A)^*=I^*+A^*(A^*)^*=I+A^*A$, the operator $I+A^*A$ is self-adjoint. A matrix is self-adjoint if and only if there exists an orthonormal basis in which its representation is diagonal with real entries. Moreover, every eigenvalue of $I+A^*A$ has the form $1+\lambda$, where $\lambda\geq 0$ is an eigenvalue of the positive semidefinite matrix $A^*A$, so every diagonal entry is at least $1$. A diagonal matrix with nonzero diagonal entries is nonsingular, which implies that $I+A^*A$ is nonsingular.
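Here is a hedged NumPy sketch of this diagonalization route; the random $A$ and the dimension are chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
M = np.eye(n) + A.conj().T @ A
assert np.allclose(M, M.conj().T)        # M is self-adjoint

# Unitary diagonalization: M = U diag(w) U* with real eigenvalues w,
# and each eigenvalue is 1 + (a nonnegative eigenvalue of A*A) >= 1.
w, U = np.linalg.eigh(M)
assert np.allclose(M, U @ np.diag(w) @ U.conj().T)
assert np.all(w >= 1 - 1e-12)            # all diagonal entries >= 1
print("eigenvalues of I + A*A:", w)
```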

Is there a more direct way to show (2) without using the fact that the operator is self-adjoint?

Best Answer

If $(I+A^{*}A)x=0$ then $x^{*}(I+A^{*}A)x=0$. This can be written as $x^{*}x+(Ax)^{*}(Ax)=0$. Since both terms are non-negative, this implies that the first term is $0$ and hence $x=0$. Thus the kernel of $I+A^{*}A$ is $\{0\}$, so it is invertible.
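For what it's worth, the identity $x^{*}(I+A^{*}A)x = x^{*}x + (Ax)^{*}(Ax)$ and the resulting invertibility can be checked numerically. This is a minimal sketch assuming NumPy, with a random $A$ chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
M = np.eye(n) + A.conj().T @ A

# The quadratic form splits into two non-negative pieces:
# x*(I + A*A)x = |x|^2 + |Ax|^2, which vanishes only when x = 0.
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
quad = (x.conj() @ (M @ x)).real
assert np.isclose(quad, np.linalg.norm(x)**2 + np.linalg.norm(A @ x)**2)

# Consequently the kernel is trivial and M is invertible: its smallest
# singular value is at least 1.
smin = np.linalg.svd(M, compute_uv=False).min()
print("smallest singular value of I + A*A:", smin)
assert smin >= 1 - 1e-12
```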

Similarly, for the first part: $Ax=0$ implies $x^{*}Ax=0$, which implies $x=0$ by positive definiteness, so $A$ is invertible.