Prove that a $3\times 3$ matrix has only $2$ linearly independent eigenvectors

Tags: eigenvalues-eigenvectors, linear-algebra, matrices

I am working through a problem in Riley, Hobson and Bence (Mathematical Methods for Physics and Engineering) that revolves around the following matrix:

$$ A=
\begin{pmatrix}
2 & 0 & 0 \\
-6 & 4 & 4 \\
3 & -1 & 0 \\
\end{pmatrix}
$$

I first have to show that the eigenvalues are degenerate (all three eigenvalues are 2) and that any eigenvector takes the form:

$$ \vec{x}=
\begin{pmatrix}
u\\
3u-2v\\
v\\
\end{pmatrix}
$$
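As a quick sanity check of these two claims (a sketch using `sympy`, not part of the book's intended pen-and-paper solution):

```python
import sympy as sp

# The matrix from the problem
A = sp.Matrix([[2, 0, 0],
               [-6, 4, 4],
               [3, -1, 0]])

# All three eigenvalues are 2 (algebraic multiplicity 3)
print(A.eigenvals())  # {2: 3}

# Any vector of the claimed form (u, 3u - 2v, v) is an eigenvector:
# (A - 2I) x should reduce to the zero vector symbolically
u, v = sp.symbols('u v')
x = sp.Matrix([u, 3*u - 2*v, v])
print(sp.simplify((A - 2*sp.eye(3)) * x))  # Matrix([[0], [0], [0]])
```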

Proving these two statements is easy. The interesting part of the question asks to prove the following statement:

If two pairs of values, $u_1, v_1$ and $u_2, v_2$, define two independent eigenvectors $\vec{x_1}$ and $\vec{x_2}$, then any third similarly defined eigenvector $\vec{x_3}$ can be written as a linear combination of $\vec{x_1}$ and $\vec{x_2}$, i.e.

$$\vec{x_3}=a\vec{x_1}+b\vec{x_2}$$

where

$$a=\frac{u_3v_2-u_2v_3}{u_1v_2-u_2v_1} \ \ \ \ \ \ b=\frac{u_1v_3-u_3v_1}{u_1v_2-u_2v_1}$$
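The claimed formulas can be spot-checked numerically. This sketch (with arbitrary sample values for the $u_i, v_i$; any choice with $u_1v_2 - u_2v_1 \neq 0$ works) builds three eigenvectors of the stated form and verifies the linear combination:

```python
import numpy as np

def eigvec(u, v):
    # Eigenvector of the given form (u, 3u - 2v, v)
    return np.array([u, 3*u - 2*v, v], dtype=float)

# Sample parameter pairs; (u1, v1) and (u2, v2) must not be proportional
u1, v1 = 1.0, 2.0
u2, v2 = 3.0, -1.0
u3, v3 = 5.0, 4.0

x1, x2, x3 = eigvec(u1, v1), eigvec(u2, v2), eigvec(u3, v3)

d = u1*v2 - u2*v1          # nonzero exactly when x1, x2 are independent
a = (u3*v2 - u2*v3) / d
b = (u1*v3 - u3*v1) / d

print(np.allclose(x3, a*x1 + b*x2))  # True
```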

I've been struggling with this for a while but I don't know where to start. Any hints would be much appreciated.

Best Answer

The null space of the matrix $A - 2I$ is the eigenspace of your matrix. Three linearly independent solutions of the matrix equation $(A - 2I)\mathbf{x} = \mathbf{0}$ exist only when $A - 2I = \mathbf{0}_{3\times 3}$, the zero $3 \times 3$ matrix; proving this is straightforward. Given your matrix $A$, it is clear that $A - 2I \neq \mathbf{0}_{3\times 3}$, so we can conclude that $A$ has at most $2$ linearly independent eigenvectors, a fact that is given at the beginning of the problem.
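One way to finish from this hint: since the eigenspace is (at most) two-dimensional and $\vec{x_1}, \vec{x_2}$ span it, $\vec{x_3}$ must equal $a\vec{x_1}+b\vec{x_2}$ for some $a, b$. Comparing the first and third components (the middle component $3u - 2v$ then matches automatically, being linear in $u$ and $v$) gives the system

$$u_3 = a u_1 + b u_2, \qquad v_3 = a v_1 + b v_2,$$

and Cramer's rule yields

$$a=\frac{u_3v_2-u_2v_3}{u_1v_2-u_2v_1}, \qquad b=\frac{u_1v_3-u_3v_1}{u_1v_2-u_2v_1},$$

where the denominator $u_1v_2-u_2v_1$ is nonzero precisely because $\vec{x_1}$ and $\vec{x_2}$ are linearly independent.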