[Math] Eigenvectors for an eigenvalue of $0$

linear algebra

Without calculation, find one eigenvalue and two linearly independent eigenvectors of $A = \begin{bmatrix} 2 & 2 & 2 \\ 2 & 2 & 2 \\ 2 & 2 & 2 \end{bmatrix}$.

This matrix is non-invertible because its columns are linearly dependent, so the number $0$ is an eigenvalue of $A$. Eigenvectors for the eigenvalue $0$ are solutions of $Ax=0$, and their entries therefore give a linear dependence relation among the columns of $A$. Any nonzero vector in $\Bbb R^3$ whose entries sum to $0$ will work. Find any two such vectors that are not multiples of each other; for instance,

$\begin{bmatrix} 1 \\ 1 \\ -2 \end{bmatrix} $ and $\begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix} $.
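I can at least verify numerically that both vectors work (a quick illustrative NumPy check, not part of the "without calculation" solution):

```python
import numpy as np

A = np.full((3, 3), 2)                 # the matrix of all 2's from the problem
for v in ([1, 1, -2], [1, -1, 0]):
    print(A @ np.array(v))             # prints [0 0 0] both times, i.e. Av = 0*v
```

So both are indeed eigenvectors for the eigenvalue $0$, and neither is a multiple of the other.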

I do not understand how the first vector is obtained.

So far, I have row reduced $A$:

$A \sim \begin{bmatrix} 1 & 1 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$

$\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} -x_2-x_3 \\ x_2 \\ x_3 \end{bmatrix} = x_2\begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} + x_3\begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix} $
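As a sanity check on my parametrization (an illustrative SymPy snippet, not part of the exercise), the computed null space basis matches:

```python
from sympy import Matrix

A = Matrix([[2, 2, 2], [2, 2, 2], [2, 2, 2]])
print(A.nullspace())   # [Matrix([[-1], [1], [0]]), Matrix([[-1], [0], [1]])]
```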

How do you obtain $\begin{bmatrix} 1 \\ 1 \\ -2 \end{bmatrix} $?

Why is it that any nonzero vector in $\Bbb R^3$ whose entries sum to $0$ will work?

Best Answer

"Why must the entries sum to $0$"

Let $A=\begin{bmatrix}a_{1,1}&a_{1,2}&a_{1,3}\\ a_{2,1}&a_{2,2}&a_{2,3}\\a_{3,1}&a_{3,2}&a_{3,3}\end{bmatrix}$ and $x = \begin{bmatrix} x_1\\x_2\\x_3\end{bmatrix}$.

By the definition of the matrix product, you have

$Ax = \begin{bmatrix}a_{1,1}x_1 + a_{1,2}x_2 + a_{1,3}x_3\\a_{2,1}x_1 + a_{2,2}x_2 + a_{2,3}x_3\\a_{3,1}x_1 + a_{3,2}x_2 + a_{3,3}x_3\end{bmatrix}$

Note what happens when all entries of $A$ are identical, say equal to some nonzero number $a$:

$Ax = \begin{bmatrix} a(x_1+x_2+x_3)\\a(x_1+x_2+x_3)\\a(x_1+x_2+x_3)\end{bmatrix} = a\begin{bmatrix}x_1+x_2+x_3\\x_1+x_2+x_3\\x_1+x_2+x_3\end{bmatrix}$

If this is to equal the zero vector (which it must, since we want eigenvectors for the eigenvalue $0$), then every entry above must be zero. Since $a\neq 0$, that is exactly the same as saying that $x_1+x_2+x_3=0$.

On the other hand, if $x_1+x_2+x_3=k\neq 0$, then $Ax = a\begin{bmatrix}k\\k\\k\end{bmatrix}\neq \vec{0}=0x$, and so $x$ is not an eigenvector for the eigenvalue $0$.
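One can also confirm this computation symbolically with SymPy (purely illustrative; the symbol names are my own):

```python
from sympy import symbols, Matrix, factor

a, x1, x2, x3 = symbols('a x1 x2 x3')
A = Matrix(3, 3, lambda i, j: a)       # all entries identical, equal to a
x = Matrix([x1, x2, x3])
print((A * x).applyfunc(factor))       # every entry is a*(x1 + x2 + x3)
```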

In general, any nonzero multiple of the all-ones matrix, say $a\cdot \mathrm{Ones}_{n\times n}$ (the matrix all of whose entries equal $a\neq 0$), has $0$ as an eigenvalue, and every eigenvector for $0$ satisfies the relation that the sum of its entries is zero. There are no other eigenvectors for $0$ by a rank-nullity argument: the matrix has rank $1$, so its null space has dimension $n-1$. The remaining eigenvalue equals the trace (the sum of the diagonal entries), since all the eigenvalues sum to the trace and the others are $0$.

Furthermore, the other eigenvalue will necessarily be $a\cdot n$ with eigenvector $\begin{bmatrix}1\\1\\\vdots\\1\end{bmatrix}$.
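This also answers the first question directly: taking $x_2 = 1$ and $x_3 = -2$ in the parametrization from the question gives $1\cdot\begin{bmatrix}-1\\1\\0\end{bmatrix} - 2\cdot\begin{bmatrix}-1\\0\\1\end{bmatrix} = \begin{bmatrix}1\\1\\-2\end{bmatrix}$, so the textbook's first vector is just one convenient choice of free variables with entries summing to $0$.

A short NumPy sketch of the general claim (here $a = 2$ and $n = 3$ to match the problem; the script is purely illustrative):

```python
import numpy as np

a, n = 2.0, 3
A = a * np.ones((n, n))                # a * Ones_{n x n}

vals, _ = np.linalg.eig(A)
print(np.sort(vals.real))              # ~ [0, 0, 6] up to rounding: zero twice, a*n once

ones = np.ones(n)
print(A @ ones)                        # [6. 6. 6.] = (a*n) * ones, the all-ones eigenvector
```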
