Linear Algebra – Finding Eigenvectors with Repeated Eigenvalues

eigenvalues-eigenvectors, linear algebra

I have a matrix $A = \left(\begin{matrix} -5 & -6 & 3\\3 & 4 & -3\\0 & 0 & -2\end{matrix}\right)$ for which I am trying to find the Eigenvalues and Eigenvectors. In this case, I have repeated Eigenvalues of $\lambda_1 = \lambda_2 = -2$ and $\lambda_3 = 1$.

After substituting $\lambda_1 = \lambda_2 = -2$ into $A - \lambda I$ and row-reducing, I get the matrix $\left(\begin{matrix} 1 & 2 & -1\\0 & 0 & 0\\0 & 0 & 0\end{matrix}\right)$.

To solve $\left(\begin{matrix} 1 & 2 & -1\\0 & 0 & 0\\0 & 0 & 0\end{matrix}\right)$ $\left(\begin{matrix} e_1\\e_2\\e_3\end{matrix}\right) = \left(\begin{matrix} 0\\0\\0\end{matrix}\right)$, I set $e_2$ and $e_3$ as free variables $s$ and $t$, respectively, solved for $e_1$, and put the solution into vector form:

$$\left[\begin{matrix} -2s + t\\s\\t\end{matrix}\right] = s \left[\begin{matrix} -2\\1\\0\end{matrix}\right] + t \left[\begin{matrix}1\\0\\1\end{matrix}\right]$$

So for the first two Eigenvectors, I found $v_1 = [-2, 1, 0]^{T}$ and $v_2 = [1, 0, 1]^T$. However, checking my answer on Wolfram Alpha, $v_1$ is assigned to the latter and $v_2$ is assigned to the former. Does this matter?
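As a sanity check, both candidate vectors can be verified numerically; here is a quick sketch using NumPy, with $A$ as above:

```python
import numpy as np

# The matrix from the question and the two candidate eigenvectors.
A = np.array([[-5, -6, 3],
              [3, 4, -3],
              [0, 0, -2]])
v1 = np.array([-2, 1, 0])
v2 = np.array([1, 0, 1])

# Both satisfy A v = -2 v, so each is an eigenvector for lambda = -2,
# regardless of the order in which they are listed.
print(np.allclose(A @ v1, -2 * v1))  # True
print(np.allclose(A @ v2, -2 * v2))  # True
```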

If so, should I instead find the first Eigenvector for $\lambda_1$, and then, using the same reduced matrix, set the system equal to $v_1$ instead of the zero vector for $\lambda_2$? I.e., solving the above matrix for $\lambda_1$:
$$e_1 = -2e_2 + e_3 \rightarrow 1 = -2(0) + 1 $$
$$v_1 = [1, 0, 1]^T$$

And then for $\lambda_2$:

$$\left(\begin{matrix} 1 & 2 & -1\\0 & 0 & 0\\0 & 0 & 0\end{matrix}\right) \left(\begin{matrix} e_1\\e_2\\e_3\end{matrix}\right) = \left(\begin{matrix} 1\\0\\1\end{matrix}\right)$$

However, taking $e_3$ as a free variable and setting $e_3 = 0$, I don't get back the vector I originally found for $v_1$:

$$e_1 + 2e_2 = 1 \rightarrow 2 + 2(-1/2) = 1$$
$$v_2 = [2, -1/2, 0]^T$$

What am I doing wrong here?

Best Answer

It is not a good idea to label your eigenvalues $\lambda_1$, $\lambda_2$, $\lambda_3$: there are not three eigenvalues, only two, namely $\lambda_1=-2$ and $\lambda_2=1$.

Now for the eigenvalue $\lambda_1$, there are infinitely many eigenvectors. If you throw the zero vector into the set of all eigenvectors for $\lambda_1$, then you obtain a vector space, $E_1$, called the eigenspace of the eigenvalue $\lambda_1$. This vector space has dimension at most the multiplicity of $\lambda_1$ in the characteristic polynomial of $A$. In this case, looking at the characteristic polynomial of $A$, we see that the dimension of the eigenspace $E_1$ is at most two.
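The multiplicities can be read off numerically as well; a small sketch with NumPy (assuming the matrix $A$ from the question), using `np.poly` to get the coefficients of the characteristic polynomial:

```python
import numpy as np

A = np.array([[-5, -6, 3],
              [3, 4, -3],
              [0, 0, -2]])

# Characteristic polynomial coefficients, highest degree first.
# np.poly computes them from the eigenvalues of A.
coeffs = np.poly(A)
print(np.round(coeffs, 6))  # approximately [1, 3, 0, -4], i.e. (x+2)^2 (x-1)
```

The factorization $(x+2)^2(x-1)$ shows $-2$ has multiplicity two and $1$ has multiplicity one.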

As you determined, the dimension of $E_1$ is exactly two, since you found two independent eigenvectors for $\lambda_1$. Your eigenvectors $v_1$ and $v_2$ form a basis of $E_1$. It does not matter that WA listed them in the opposite order; they are still two independent eigenvectors for $\lambda_1$, and any eigenvector for $\lambda_1$ is a linear combination of $v_1$ and $v_2$.
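To see concretely that the ordering is irrelevant, here is a short NumPy check that a few sample linear combinations $s v_1 + t v_2$ are all eigenvectors for $-2$:

```python
import numpy as np

A = np.array([[-5, -6, 3],
              [3, 4, -3],
              [0, 0, -2]])
v1 = np.array([-2, 1, 0])   # order as found by hand
v2 = np.array([1, 0, 1])    # order as listed by WA (swapped)

# Any nonzero linear combination s*v1 + t*v2 is again an eigenvector
# for -2, so which vector is labelled "first" makes no difference.
for s, t in [(1, 0), (0, 1), (3, -2), (0.5, 4)]:
    w = s * v1 + t * v2
    assert np.allclose(A @ w, -2 * w)
```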

Now you need to find the eigenvectors for $\lambda_2$. Note that the dimension of this eigenspace must be exactly one, since the multiplicity of $\lambda_2$ in the characteristic polynomial is $1$.
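For instance, solving $(A - I)v = 0$ by hand gives $e_3 = 0$ and $e_1 = -e_2$, so one such eigenvector is $[1, -1, 0]^T$; a quick numerical confirmation:

```python
import numpy as np

A = np.array([[-5, -6, 3],
              [3, 4, -3],
              [0, 0, -2]])

# From (A - I)v = 0: the last row forces e3 = 0, and the first two
# rows both reduce to e1 = -e2, giving a one-dimensional eigenspace.
v3 = np.array([1, -1, 0])
print(np.allclose(A @ v3, 1 * v3))  # True
```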