Eigenvalues, diagonalization and convergence of matrices

diagonalization, eigenvalues-eigenvectors, linear-algebra, matrices

I am trying to wrap my head around some basic results in Linear Algebra.

I am trying to avoid more abstract concepts like Rank-Nullity, and stay in simple properties at an introductory level. (I've taken more advanced Linear Algebra before, but it's been a long while, just trying to brush up on some basic facts).

Suppose I have an $n \times n$ matrix $A$.

Which of these facts are true?

  1. $\lim_{k \to \infty} A^k$ exists if and only if all eigenvalues are strictly less than $1$ in absolute value.

  2. If there are at least 2 eigenvalues that are equal to $1$, then the limit above does not exist.

  3. If a matrix has all its eigenvalues less than or equal to one in absolute value (not necessarily unique), then the limit above exists.

  4. If a matrix is diagonalizable and all its eigenvalues are less than or equal to one, then the limit above exists.

I think that statement 4 is the only correct one ("easy" to show informally: if $A = PDP^{-1}$ with $D$ diagonal, then $A^k = PD^kP^{-1}$, and the entries of $D^k$ converge), but I've come across notes in Linear Algebra that state 1) and 2) as facts.

Best Answer

To summarize the comments, none of these statements is true. An easy sufficient condition for this limit to exist is that $A$ is diagonalizable and all of its eigenvalues lie in the half-open interval $(-1, 1]$, and a necessary condition is that the eigenvalues of $A$ are either equal to $1$ or strictly less than $1$ in absolute value. It's a bit tricky to say what happens if $A$ isn't diagonalizable.
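A quick numerical sketch with NumPy (the specific matrices below are my own illustrative choices, not from the question) shows concrete counterexamples to statements 1 and 4, an instance of the sufficient condition, and why the non-diagonalizable case is genuinely different:

```python
import numpy as np

# Statement 1 fails: the identity has eigenvalue 1 (not strictly less
# than 1 in absolute value), yet lim I^k = I clearly exists.
I = np.eye(2)
print(np.allclose(np.linalg.matrix_power(I, 50), I))  # True

# Statement 4 fails: A = diag(1, -1) is diagonalizable with all
# eigenvalues <= 1 in absolute value, but its powers oscillate between
# the identity and diag(1, -1), so the limit does not exist.
A = np.diag([1.0, -1.0])
print(np.linalg.matrix_power(A, 50))   # identity
print(np.linalg.matrix_power(A, 51))   # diag(1, -1)

# The sufficient condition in action: B = diag(1, 1/2) is diagonalizable
# with eigenvalues in (-1, 1], and B^k -> diag(1, 0).
B = np.diag([1.0, 0.5])
print(np.linalg.matrix_power(B, 200))  # approximately diag(1, 0)

# Without diagonalizability, eigenvalues alone do not decide the matter:
# the Jordan block J = [[1, 1], [0, 1]] has its only eigenvalue equal
# to 1, yet J^k = [[1, k], [0, 1]] diverges.
J = np.array([[1.0, 1.0], [0.0, 1.0]])
print(np.linalg.matrix_power(J, 5))    # [[1, 5], [0, 1]]
```

The Jordan-block example is exactly what makes the non-diagonalizable case tricky: the eigenvalue condition is satisfied, but the off-diagonal entry grows without bound.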

People write all sorts of things in notes; they are just documents that someone decided to put on the internet, not authoritative references. They can easily be sloppy, especially if they're about math but not written by mathematicians (physicists, for example). There may also be missing hypotheses, or you may have misread them.