[Math] Showing that a matrix is invertible and finding its inverse

Tags: inverse, linear algebra, matrices

I'm incredibly rusty at linear algebra, and in preparation for my course I've been doing some review questions. I've been staring at this one for half an hour and still don't know how to approach it:

"Let A be a square matrix such that $A^3 = 0$. Show that the matrix $I + A + 2A^2$ is invertible and find its inverse."

I'm pretty sure I need to find a relationship between $A^3$ and $I + A + 2A^2$, but I'm not sure how. I know a matrix is invertible if its determinant is nonzero, and I know how to compute the inverse of a concrete matrix, but since this is a more theoretical question I'm not entirely certain how to approach it. Any hints would be much appreciated 🙂

Best Answer

We start with an informal deduction: by the Taylor expansion around $0$,
$$ (1+x+2x^2)^{-1}=1-x-x^2+(\text{terms of order 3 or above}). $$
We would like to substitute $A$ for $x$. Since $A^3=0$, all terms of order $3$ or above vanish, which suggests
$$ (I+A+2A^2)^{-1}=I-A-A^2. $$
The solution is then made rigorous by direct verification:
$$ (I+A+2A^2)(I-A-A^2)=I-3A^3-2A^4=I-0-0=I, $$
where $A^4 = A\cdot A^3 = 0$ as well. Since both factors are polynomials in $A$, they commute, so the product in the other order is also $I$; hence $I-A-A^2$ is a two-sided inverse.

While perhaps not elegant, this approach is mechanical, so it can be applied to similar problems. For example, with $I+aA+bA^2$, we have
$$ (1+ax+bx^2)^{-1}=1-ax+(a^2-b)x^2+(\text{terms of order 3 or above}), $$
which suggests
$$ (I+aA+bA^2)^{-1}=I-aA+(a^2-b)A^2. $$
Again, rigor is supplied by verifying the candidate inverse via direct multiplication.
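If you want to sanity-check the algebra, here is a minimal sketch using SymPy (assuming it's installed). The strictly upper triangular example matrix is my own illustration of a matrix with $A^3 = 0$; it is not part of the original problem.

```python
import sympy as sp

# A 3x3 strictly upper triangular matrix is nilpotent with A**3 = 0.
x, y, z = sp.symbols("x y z")
A = sp.Matrix([[0, x, y],
               [0, 0, z],
               [0, 0, 0]])
I = sp.eye(3)

assert (A**3).applyfunc(sp.expand) == sp.zeros(3, 3)  # confirm A**3 = 0

# The specific problem: (I + A + 2A^2)^{-1} = I - A - A^2.
M = I + A + 2*A**2
M_inv = I - A - A**2
assert (M * M_inv).applyfunc(sp.expand) == I  # right inverse
assert (M_inv * M).applyfunc(sp.expand) == I  # left inverse (polynomials in A commute)

# The general recipe: (I + aA + bA^2)^{-1} = I - aA + (a^2 - b)A^2.
a, b = sp.symbols("a b")
N = I + a*A + b*A**2
N_inv = I - a*A + (a**2 - b)*A**2
assert (N * N_inv).applyfunc(sp.expand) == I

print("All identities verified.")
```

This only checks one concrete nilpotent matrix, of course; the direct multiplication above is what proves the identity for every $A$ with $A^3 = 0$.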