[Math] Show that $I-G$ is invertible when $\|G\|$ is less than one

eigenvalues-eigenvectors, euclidean-algorithm, matrices, normed-spaces, numerical-methods

If $\|G\| < 1$, then show that $I-G$ is invertible.

This can be proved by contradiction: if $I-G$ is singular, then $1$ is an eigenvalue of $G = I-(I-G)$. So if the matrix norm is the induced 2-norm, $\|G\|$ is at least $1$, since the largest singular value of a matrix is not less than the absolute value of any of its eigenvalues.

I have two questions:
1) Why does $1$ have to be an eigenvalue of $I-A$ if $A$ is singular? And why is the largest singular value of a matrix not less than the absolute value of any of its eigenvalues?

2) I am learning iterative methods in class and I don't believe we have come across eigenvalues yet; is there any other way to prove this?

Thanks!

Best Answer

So recall that every vector is an eigenvector of $I$. Since $I-G$ is singular by assumption, it has an eigenvector $v$ with eigenvalue zero; this is equivalent to saying that its kernel is nontrivial, or that its determinant is zero. But then $v$ is also an eigenvector of $G=I-(I-G)$ with $Gv = Iv - (I-G)v = v$, so it has eigenvalue $1$, as asserted.
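To spell that step out as a single chain (just restating the computation above, with $v \neq 0$ a vector in the kernel of $I-G$):
$$(I-G)v = 0,\ v \neq 0 \;\Longrightarrow\; Gv = Iv - (I-G)v = v \;\Longrightarrow\; 1 \text{ is an eigenvalue of } G.$$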

Next, let's look at how the operator norm is defined: we have $\|G\| = \sup_{w \neq 0} \frac{\|Gw\|}{\|w\|}$. In particular, we may take $w=v$, and then $\frac{\|Gv\|}{\|v\|} = 1$ since $Gv = v$. So the supremum, and hence the norm of $G$, is at least $1$, which contradicts the assumption $\|G\| < 1$. Hope this helps!
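As a quick numerical sanity check of the statement (not part of the argument above; the matrix sizes, the rescaling to norm $0.9$, and the trial count are arbitrary illustrative choices), one can draw random matrices with $\|G\|_2 < 1$ and confirm that $I-G$ is nonsingular:

```python
import numpy as np

rng = np.random.default_rng(0)

for trial in range(100):
    n = int(rng.integers(2, 10))            # arbitrary matrix size
    G = rng.standard_normal((n, n))
    G *= 0.9 / np.linalg.norm(G, 2)         # rescale so that ||G||_2 = 0.9 < 1

    A = np.eye(n) - G                       # the matrix I - G from the question
    smallest_sv = np.linalg.svd(A, compute_uv=False).min()

    # If I - G were singular, its smallest singular value would be 0.
    # A standard strengthening of the claim gives sigma_min(I - G) >= 1 - ||G||_2,
    # which is what we check here (with a small tolerance for rounding).
    assert smallest_sv >= 1 - np.linalg.norm(G, 2) - 1e-12

print("All trials passed: I - G was nonsingular whenever ||G||_2 < 1.")
```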
