A nilpotent linear operator can be represented by an upper triangular matrix with diagonal entries zero

linear algebra, triangularization

I am trying to prove the following proposition:

A linear operator $T$ is nilpotent if there exists some positive natural number $k$ such that $T^k=0$. If $T$ is nilpotent, then there is a basis of $V$ such that the matrix of $T$ is upper triangular with diagonal entries all equal to zero.

Here $V$ is assumed to be finite-dimensional, $T\in L(V)$, and $L(V)$ denotes the set of all linear operators on $V$.

I proved this proposition under the additional assumption that $V$ is a complex vector space, by induction on the dimension of $V$. My questions are: (1) is it possible to prove this proposition without assuming that $V$ is over the complex field, and (2) is my proof under this assumption correct?
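As a quick sanity check of the statement (not part of the proof), here is a hypothetical $2\times 2$ example worked out with SymPy, which I assume is available. The matrix $A$ below is my own choice; SymPy's Jordan form does the triangularizing work, which is of course stronger than what the proposition asks for.

```python
import sympy as sp

# Hypothetical 2x2 example: A is nilpotent but not upper triangular
# in the standard basis.
A = sp.Matrix([[ 1,  1],
               [-1, -1]])

print(A**2)                  # Matrix([[0, 0], [0, 0]]) -> A is nilpotent with k = 2
print(A.eigenvals())         # {0: 2} -> the only eigenvalue is 0

# The Jordan form supplies a basis (the columns of P) in which the matrix of
# the operator is upper triangular with zero diagonal, as the proposition claims.
P, J = A.jordan_form()
print(J)                     # Matrix([[0, 1], [0, 0]])
print(P.inv() * A * P == J)  # True
```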

  1. Basis Step: $\dim(V)=1$

This is obvious: if $T$ is nilpotent, then $T=0$, since otherwise $T$ would be onto (hence invertible), so $T^k$ would be onto as well, contradicting $T^k=0$. Thus $T$ is represented by the zero matrix $[0]$.

  2. Inductive Step

Since $V$ is a complex vector space, $T$ has an eigenvalue $\lambda$. This $\lambda$ must be zero because $T$ is nilpotent: if $v$ is an eigenvector of $T$ corresponding to $\lambda$, then $0=T^k(v)=\lambda^k v$, and since $v\neq 0$, it follows that $\lambda=0$.

Now pick an eigenvector $u$ (for the eigenvalue $0$) and extend $(u)$ to a basis $B=(u, v_1, \dots, v_{n-1})$. Since $T(u)=0$, the matrix $A$ representing $T$ has the block form

$A=
\left[
\begin{array}{c|c}
0 & * \\
\hline
0 & D
\end{array}
\right]
$

where $D$ is an $(n-1)\times(n-1)$ matrix ($D\in M_{n-1}(\mathbb{C})$). Since $A^k=0$ and the lower-right block of $A^k$ is $D^k$, the matrix $D$ is itself nilpotent. By the induction hypothesis there is therefore an invertible $Q\in M_{n-1}(\mathbb{C})$ such that $Q^{-1}DQ$ is upper triangular with diagonal entries all equal to zero.
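Here is a small SymPy sketch of this step on a hypothetical $3\times 3$ nilpotent matrix; the matrix, the eigenvector $u$, and the basis extension below are my own choices for illustration, not part of the argument.

```python
import sympy as sp

# Hypothetical 3x3 nilpotent matrix (A**3 == 0, A**2 != 0), chosen only to
# illustrate this step; it is not triangular as given.
A = sp.Matrix([[-1, 1, 0],
               [-1, 0, 1],
               [-1, 0, 1]])

u = sp.Matrix([1, 1, 1])               # A*u == 0, so u is an eigenvector for 0

# Extend (u) to a basis B = (u, e2, e3); in these coordinates the first
# column is zero, giving the block form [[0, *], [0, D]] described above.
B = sp.Matrix.hstack(u, sp.Matrix([0, 1, 0]), sp.Matrix([0, 0, 1]))
Ablock = B.inv() * A * B
print(Ablock)                          # Matrix([[0, 1, 0], [0, -1, 1], [0, -1, 1]])

D = Ablock[1:, 1:]                     # the (n-1) x (n-1) lower-right block
print(D**2)                            # zero matrix: D inherits nilpotency from A
```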

Let

$Q_1=\left[
\begin{array}{c|c}
1 & 0 \\
\hline
0 & Q
\end{array}
\right]
$
.

Then $A'=Q_1^{-1}AQ_1=\left[
\begin{array}{c|c}
0 & * \\
\hline
0 & Q^{-1}DQ
\end{array}
\right]
$

is upper triangular with all diagonal entries equal to zero. $A'$ represents $T$ with respect to the new basis $B' = BQ_1$.
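Continuing the SymPy sketch above, the assembly of $Q_1$ can be checked in the same way; here the Jordan form stands in for the induction hypothesis applied to $D$ (again purely as an illustration).

```python
import sympy as sp

# Continuing the hypothetical example: Ablock is the block form found above.
Ablock = sp.Matrix([[0,  1, 0],
                    [0, -1, 1],
                    [0, -1, 1]])
D = Ablock[1:, 1:]

# The induction hypothesis provides Q with Q**-1 * D * Q strictly upper
# triangular; here SymPy's Jordan form plays that role.
Q, J = D.jordan_form()                   # J == Q.inv() * D * Q

Q1 = sp.diag(1, Q)                       # the block matrix [[1, 0], [0, Q]]
Aprime = Q1.inv() * Ablock * Q1
print(Aprime)                            # upper triangular; exact upper entries depend on Q
print(Aprime.is_upper)                   # True
print([Aprime[i, i] for i in range(3)])  # [0, 0, 0]: zero diagonal
```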

In the proof above, the assumption on the existence of $\lambda$ is crucial, and it follows naturally from the assumption that $V$ is a complex vector space. I want to know whether the proposition holds without this assumption.

Best Answer

Eigenvalue existence is a property of the matrix or linear operator (through its characteristic polynomial), not of the vector space itself. Over $\mathbb{C}$, or any algebraically closed field, every $N \times N$ linear operator has an eigenvalue: its characteristic polynomial is an $N^{\text{th}}$ degree polynomial whose roots are exactly the eigenvalues of the operator. For a nilpotent operator you do not even need algebraic closure, because its characteristic polynomial is $\lambda^N$, so $0$ is an eigenvalue over any field (equivalently, $T^k=0$ forces $\ker T \neq \{0\}$).

Note that this doesn't mean every linear operator has a full basis of "ordinary" eigenvectors (repeated roots cause issues; to get a full set in many repeated-root cases you need to generalize, which leads to generalized eigenvectors and the Jordan form). Regardless, the characteristic polynomial guarantees at least one eigenvalue whenever it has a root in the field, and for a nilpotent operator it always does, since that polynomial is $\lambda^N$.
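A quick SymPy illustration of this point (the specific matrices are my own hypothetical examples): the characteristic polynomial of a nilpotent matrix is $\lambda^N$, so $0$ is a root over any field, while a real rotation shows that a general real operator may have no real eigenvalue at all.

```python
import sympy as sp

lam = sp.symbols('lambda')

# The characteristic polynomial of a nilpotent matrix is lambda**N,
# so 0 is an eigenvalue over any field.
N = sp.Matrix([[-1, 1, 0],
               [-1, 0, 1],
               [-1, 0, 1]])
print(N.charpoly(lam).as_expr())       # lambda**3

# Contrast: a real rotation by 90 degrees has characteristic polynomial
# lambda**2 + 1, with no real roots, hence no real eigenvalue.
R = sp.Matrix([[0, -1],
               [1,  0]])
print(R.charpoly(lam).as_expr())       # lambda**2 + 1
```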

On that note, the rest of the proof looks good.