Linear Algebra – Show Set {v_1, …, v_n} is Linearly Independent

linear-algebra, linear-transformations, matrices

I have the following question:

Suppose an $n \times n$ matrix $A$ has $n-1$ distinct eigenvalues $\{\lambda_1, \lambda_2, \dots, \lambda_{n-1}\}$ and that the eigenvalue $\lambda_{n-1}$ has algebraic multiplicity $2$ and geometric multiplicity $1$. The corresponding eigenvector set is $\{v_1, v_2, \dots, v_{n-1}\}$. Suppose now there is a vector $v_n$ such that $(A - \lambda_{n-1}I)v_n = v_{n-1}$. Show that the set $\{v_1, \dots, v_n\}$ is linearly independent.
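
For concreteness, one small example of the kind of setup described above (my own illustration, not part of the problem) is
$$A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2 & 1 \\ 0 & 0 & 2 \end{bmatrix}, \qquad v_1 = \begin{bmatrix}1\\0\\0\end{bmatrix},\quad v_2 = \begin{bmatrix}0\\1\\0\end{bmatrix},\quad v_3 = \begin{bmatrix}0\\0\\1\end{bmatrix},$$
where $\lambda_1 = 1$ is simple, $\lambda_2 = 2$ has algebraic multiplicity $2$ but geometric multiplicity $1$ (its eigenspace is spanned by $v_2$ alone), and $(A - 2I)v_3 = v_2$, so $v_3$ plays the role of $v_n$.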

I have no idea how to proceed with this. Any suggestions would be helpful. Thanks in advance!

EDIT
I know how to prove that $\{v_1, \dots, v_{n-1}\}$ is linearly independent, but given that, how do I incorporate $v_n$ into the set?

Here's a proof of the fact that $\{v_1, \dots, v_{n-1}\}$ is linearly independent.

All we need to prove is that eigenvectors corresponding to different eigenvalues are linearly independent. Let $P = \begin{bmatrix} v_1 & v_2 & \dots & v_n\end{bmatrix}$ be the $n \times n$ matrix whose columns are the eigenvectors. Let the eigendecomposition of $A$ be $A = P\Sigma P^{-1}$ and assume WLOG that $\lambda_1 > \lambda_2 > \dots > \lambda_{n-1} = \lambda_n$, so that $\Sigma = \operatorname{diag}(\lambda_1, \lambda_2, \dots, \lambda_{n-1}, \lambda_n)$. Let $c = \begin{bmatrix} c_1 & c_2 & \dots & c_{n-1} & c_n \end{bmatrix}^T$. Since $\lambda_{n-1}$ has geometric multiplicity $1$, we have $v_n = \alpha v_{n-1}$ for some $\alpha \in \mathbb{R}\setminus \{0\}$. We need to show that
\begin{align*}
&c_1 v_1 + c_2 v_2 + \dots + c'_{n-1} v_{n-1} + c'_n v_n \\
&= c_1 v_1 + c_2 v_2 + \dots + c'_{n-1} v_{n-1} + c'_n(\alpha v_{n-1}) \\
&= c_1 v_1 + c_2 v_2 + \dots + (c'_{n-1} + \alpha c'_n) v_{n-1} \\
&= c_1 v_1 + c_2 v_2 + \dots + c_{n-1} v_{n-1} = 0 \\
&\Rightarrow c_1 = c_2 = \dots = c_{n-1} = 0.
\end{align*}

Applying $Av_i = \lambda_i v_i$, we get
\begin{equation}
c_1\lambda_1 v_1 + \dots + c_n\lambda_n v_n = 0.
\end{equation}

We can rewrite the above equation in matrix form as
\begin{equation}
P\Sigma c = 0.
\end{equation}

But since $A$ can be diagonalized as $A = P\Sigma P^{-1}$, we know $P\Sigma = AP$, and thus the above equation becomes
\begin{equation}
APc = 0.
\end{equation}

But since $AP \neq 0$, it must be the case that $c = 0$.
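
The opening claim (eigenvectors belonging to distinct eigenvalues are linearly independent) can also be sanity-checked numerically. Here is a minimal sketch assuming NumPy, using a $3 \times 3$ matrix of my own choosing with distinct eigenvalues, so that the eigendecomposition used above exists:

```python
import numpy as np

# Example matrix with distinct eigenvalues 2, 5, -1 (upper triangular,
# so the eigenvalues are the diagonal entries).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 5.0, 3.0],
              [0.0, 0.0, -1.0]])

eigvals, P = np.linalg.eig(A)        # columns of P are eigenvectors
Sigma = np.diag(eigvals)

print(np.allclose(A @ P, P @ Sigma)) # True: A P = P Sigma
print(np.linalg.matrix_rank(P))      # 3 -> the eigenvectors are linearly independent
```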

Best Answer

The set $\{v_1,\ldots, v_{n-1}\}$ is L.I. as you have shown.

Suppose that including $v_n \ (\neq 0)$ in this set makes it L.D., which is possible $\iff$ there exist coefficients $a_i$ (not all zero) such that
$$v_n=\sum_{i=1}^{n-1}a_iv_i.$$
Applying $(A-\lambda_{n-1}I)$ to both sides gives
$$(A-\lambda_{n-1}I)v_n=\sum_{i=1}^{n-1}a_i(A-\lambda_{n-1}I)v_i.$$

By the definition of an eigenvector, $Av_i=\lambda_iv_i$ for each $i$.

We are given that $(A-\lambda_{n-1}I)v_{n}=v_{n-1}$.

Simplify and you have:

$$ v_{n-1}=\sum_{i=1}^{n-2} a_i(\lambda_i-\lambda_{n-1})v_i$$
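
In more detail (an expansion of the step above, using only the two facts already stated):
$$v_{n-1}=(A-\lambda_{n-1}I)v_n=\sum_{i=1}^{n-1}a_i(\lambda_i-\lambda_{n-1})v_i=\sum_{i=1}^{n-2}a_i(\lambda_i-\lambda_{n-1})v_i,$$
where the $i=n-1$ term drops out because its factor $\lambda_{n-1}-\lambda_{n-1}$ is zero.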

This expresses $v_{n-1}$ as a linear combination of $v_1,\ldots,v_{n-2}$, which contradicts the linear independence of $\{v_1,\ldots,v_{n-1}\}$. $\blacksquare$
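
As a final numerical illustration (a sketch assuming NumPy, reusing the $3 \times 3$ example given with the question above), one can check that the eigenvectors together with the generalized vector are indeed linearly independent:

```python
import numpy as np

# lambda_1 = 1 is simple; lambda_2 = 2 has algebraic multiplicity 2
# and geometric multiplicity 1.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])

v1 = np.array([1.0, 0.0, 0.0])   # eigenvector for lambda_1 = 1
v2 = np.array([0.0, 1.0, 0.0])   # eigenvector for lambda_2 = 2
v3 = np.array([0.0, 0.0, 1.0])   # generalized vector: (A - 2I) v3 = v2

assert np.allclose(A @ v1, 1.0 * v1)
assert np.allclose(A @ v2, 2.0 * v2)
assert np.allclose((A - 2.0 * np.eye(3)) @ v3, v2)

P = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(P))  # 3 -> {v1, v2, v3} is linearly independent
```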
