Problem about extending $S\subseteq V$ to a Jordan basis of a nilpotent operator $T:V\to V$

jordan-normal-form, linear-algebra, linear-transformations

Let $V$ be a finite-dimensional vector space over $\mathbb{F}$, and let $T\in\mathcal{L}(V)$ be a nilpotent linear operator on $V$. Assume that $k$ is the smallest positive integer such that $T^k=0$. Show that any linearly independent $S\subseteq V$ such that $L(S)\cap\ker T^{k-1} =\{0\}$ can be extended to a Jordan basis for $T$.
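For concreteness (a small illustrative example, not part of the original problem): let $T$ act on $V=\mathbb{F}^3$ as a single nilpotent Jordan block, so that $k=3$ and
$$ Te_1=e_2,\qquad Te_2=e_3,\qquad Te_3=0,\qquad \ker T^{2}=\operatorname{span}(e_2,e_3). $$
The set $S=\{e_1\}$ satisfies $L(S)\cap\ker T^{2}=\{0\}$ and extends to the Jordan basis $\{e_1,Te_1,T^2e_1\}$, whereas $S=\{e_2\}$ does not satisfy the hypothesis, since $e_2\in\ker T^{2}$.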

I am trying to prove this using induction, but I would appreciate any other ideas (possibly simpler or more direct). I chose to do induction on $\dim V = n$.

If $n=1$ and $T$ is nilpotent, then $T$ must be the zero operator, so $k=1$ and $T^{k-1}=I$. Any non-empty linearly independent subset $S$ of $V$ contains exactly one vector, say $0\neq v\in V$, since $\dim V = 1$. Here $L(S) = V$ and $\ker T^{k-1} = \ker I = \{0\}$, so $L(S)\cap\ker T^{k-1} =\{0\}$ holds and $S$ itself is already a (Jordan) basis for $T$. If $S=\emptyset$, then $L(S)=\{0\}$, the condition $L(S)\cap\ker T^{k-1} =\{0\}$ holds trivially, and we can add any $0\neq v\in V$ to $S$ to obtain a basis of the one-dimensional space $V$. The base case ends here.

Now, assume that the statement holds for all $\dim V \in \{1,2,…,n-1\}$. We must show that it holds for $\dim V=n$.

Consider a linearly independent subset $S\subseteq V$ with $\dim L(S) = r$, where $0\leq r\leq n$, and $L(S)\cap\ker T^{k-1} =\{0\}$. We want to construct a Jordan basis $B$ of $V$ for $T: V\to V$ by adding to $S$ some $n-r$ vectors from $V\setminus L(S)$. How should I choose these vectors so that the resulting basis is a Jordan basis? I am stuck here.

Clearly, I am unable to make use of the condition that $L(S)\cap\ker T^{k-1} =\{0\}$.

I would appreciate any hints on how to take this solution forward, and also ideas for any other ways of approaching this problem. Thank you!

Best Answer

I believe it is simpler to prove this by induction on $k$ rather than on $n$. The main observation is that, for a Jordan basis $B$ of a nilpotent operator, if $v\in B$ then either $Tv\in B$ or $Tv=0$; in particular, if $S \subset B$ then $\{Tv \; | \; v\in S, Tv\ne 0\} \subset B$.
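As a quick sanity check on this observation (a toy example, chosen only for illustration): suppose
$$ B=\{u,\ Tu,\ T^2u\}\ \cup\ \{w\},\qquad T^3u=0,\quad Tw=0. $$
Then $T$ maps $u\mapsto Tu\in B$ and $Tu\mapsto T^2u\in B$, while $T^2u\mapsto 0$ and $w\mapsto 0$; every element of $B$ is sent either into $B$ or to $0$, i.e. $T$ moves each vector one step down its Jordan chain or kills it.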

The induction basis $k=1$ is identical to the solution you presented for $n=1$, so I skip it. Suppose now that $k>1$ and that the assertion has been proved for linear transformations with nilpotency index smaller than $k$. Consider the set $T(S):=\{Tv \; | \; v\in S\} \subset \operatorname{Im}(T)$. I assert it is linearly independent. Indeed, if $\sum_{v\in S}\alpha_v Tv=0$, then $\sum_{v\in S}\alpha_v v \in \ker T \subset \ker T^{k-1}$, so $\sum_{v\in S}\alpha_v v=0$ since $L(S)\cap\ker T^{k-1}=\{0\}$, and hence all $\alpha_v=0$ by the linear independence of $S$. The same computation shows that $L(T(S))\cap\ker T^{k-2}=\{0\}$: if $\sum_{v\in S}\alpha_v Tv\in\ker T^{k-2}$, then $\sum_{v\in S}\alpha_v v\in\ker T^{k-1}$, hence it is $0$. Since $T$ restricted to $\operatorname{Im}(T)$ has nilpotency index $k-1$, the induction hypothesis applies to $T(S)$.
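To see this concretely (an illustrative example, with $V$, $T$ and $S$ chosen only for the sake of illustration): take $V=\mathbb{F}^4$ with basis $e_1,e_2,e_3,f$ and
$$ Te_1=e_2,\quad Te_2=e_3,\quad Te_3=0,\quad Tf=0, $$
so $k=3$ and $\ker T^{2}=\operatorname{span}(e_2,e_3,f)$. The set $S=\{e_1+f\}$ satisfies $L(S)\cap\ker T^{2}=\{0\}$ (indeed $T^2(e_1+f)=e_3\neq 0$), and $T(S)=\{e_2\}$ is linearly independent inside $\operatorname{Im}(T)=\operatorname{span}(e_2,e_3)$, with $L(T(S))\cap\ker T=\{0\}$ as required for the induction hypothesis.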

By the induction hypothesis there exists a Jordan basis $B_0$ of $\operatorname{Im}(T)$ containing $T(S)$; we are left to determine which vectors from $V\setminus \operatorname{Im}(T)$ we need to add to it to get a Jordan basis of $V$. Obviously $S$ has to go in, since that was our original goal. For any other $v\in B_0$ such that $v\notin T(S)$ and $v\notin T(B_0)$ we also need to pick a preimage $w_v\in V$ with $Tw_v=v$ and add it to our Jordan basis (this is always possible since $v\in \operatorname{Im}(T)$). Finally, we take $B_0\cap \ker T$ and add to our Jordan basis its completion to a basis of $\ker T$, to get all the chains of length $1$. So the Jordan basis is $$ B:= B_0 \cup S \cup \{ w_v \; | \; v\in B_0\setminus (T(S)\cup T(B_0)) \} \cup \big(B'\setminus (\ker T\cap B_0)\big) $$ where $B'$ is a basis of $\ker T$ containing $B_0\cap \ker T$.
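Continuing the illustrative example from above: there $\operatorname{Im}(T)=\operatorname{span}(e_2,e_3)$, and one may take $B_0=\{e_2,e_3\}$, a Jordan basis of $\operatorname{Im}(T)$ containing $T(S)=\{e_2\}$. We add $S=\{e_1+f\}$; since $e_2\in T(S)$ and $e_3\in T(B_0)$, no preimages $w_v$ are needed; finally $B_0\cap\ker T=\{e_3\}$ completes to the basis $B'=\{e_3,f\}$ of $\ker T$, contributing $f$. The result
$$ B=\{e_2,\ e_3\}\ \cup\ \{e_1+f\}\ \cup\ \{f\} $$
is indeed a Jordan basis of $V$ containing $S$, with chains $(e_1+f)\mapsto e_2\mapsto e_3\mapsto 0$ and $f\mapsto 0$.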

This completes the construction; let us prove that it works. Once we show that the resulting set is indeed a basis, it is immediate that it is a Jordan basis for $T$, since by construction every $v\in B$ satisfies either $Tv=0$ or $Tv\in B$. Suppose that $\sum_{v\in B}\alpha_v v=0$ is a linear dependency. Applying $T$ to it we get $\sum_{v\in B\setminus \ker T}\alpha_v Tv = 0$. For every such $v$ we have $Tv\in B_0$, and distinct elements of $B\setminus\ker T$ have distinct images: within $B_0$, within $S$ and for the $w_v$ this holds by construction and by the linear independence of $T(S)$, while $Ts=Tv$ with $s\in S$ and $v\in B_0$ would give $s=v+(s-v)\in\operatorname{Im}(T)+\ker T\subseteq\ker T^{k-1}$, contradicting the hypothesis on $S$. Hence $\alpha_v=0$ for all $v\in B\setminus \ker T$. So any linear dependency involves only $B\cap \ker T$; but $B\cap\ker T=B'$ (note that $S\cap\ker T=\emptyset$ since $\ker T\subseteq\ker T^{k-1}$, and $Tw_v=v\neq 0$), which was chosen to be a basis of $\ker T$ and hence admits no non-trivial linear dependency.

We are left with showing that $B$ spans $V$. Let $u\in V$. Since $B$ contains the basis $B_0$ of $\operatorname{Im}(T)$ and every $v\in B_0$ is the image of an element of $B\setminus\ker T$, we can write $T(u)=\sum_{v\in B\setminus \ker T}\alpha_v T(v)$ for suitable scalars $\alpha_v$. Consider now the vector $w = \sum_{v\in B\setminus \ker T}\alpha_v v$. The difference $u-w$ lies in the kernel of $T$, which is spanned by $B'\subseteq B$, and thus $u\in \operatorname{Span}(B)$.
