[Math] Dimension of the space of linear operators on a finite-dimensional vector space

linear-algebra, linear-transformations

In the context of the minimal polynomial of a linear transformation:

What does it mean that the space of linear operators on a finite-dimensional vector space $V$ (of dimension $\dim(V) = n$) has dimension $\dim \mathscr L(V) = n^2$? And why does one consider the list

$$I, T, T^2, \dots, T^{n^2} \in \mathscr L(V)?$$

My understanding is that, once a basis is chosen, linear transformations can be viewed as matrices, and these have to be compatible in size with the vectors in $V$ so that the two can be multiplied.

But I have no idea why powers of the matrix $T$ appear. Are they square matrices (and if so, why not rectangular)? And why do we need the powers to obtain the dimension of the space of transformations (is that even a correct statement)?

Best Answer

The space $\mathscr{L}(V)$ is the space of linear operators, meaning the set of linear functions from $V$ to $V$. You can take powers of them (or indeed multiply them generally) by composition; the result still maps from $V$ to $V$. If you were to represent these linear operators as matrices, they would all be square.
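For a concrete illustration, take the operator on $\mathbb{R}^2$ whose matrix in the standard basis is

$$T = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \qquad \text{so that} \qquad T^2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}.$$

The power $T^2$ is again a square matrix, i.e. again an element of $\mathscr{L}(V)$ (here, the zero operator).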

They form a vector space in their own right, under pointwise addition and scalar multiplication. This takes some proving, but the verification is routine.
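Concretely, the pointwise operations are

$$(S + T)(v) = S(v) + T(v), \qquad (aT)(v) = a\,\bigl(T(v)\bigr) \qquad \text{for all } v \in V,$$

and one checks that $S + T$ and $aT$ are again linear maps from $V$ to $V$.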

The minimal polynomial of an arbitrary element of $\mathscr{L}(V)$ doesn't tell you the dimension of $\mathscr{L}(V)$. Instead, one can fix a basis $$B = (v_1, \ldots, v_n)$$ of $V$, and for $i, j$ between $1$ and $n$, define $T_{ij}$ to map $v_i$ to $v_j$, and map every other vector in $B$ to $0$. In terms of matrices, these correspond to the matrices with a $1$ as the entry in the $i$th column and the $j$th row, and $0$s everywhere else. It's not too difficult to show that these $n^2$ maps are linearly independent and spanning, and hence form a basis.
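For example, with $n = 2$ this basis consists of the four matrices

$$\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \quad \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix},$$

and every $2 \times 2$ matrix is a unique linear combination of them, so $\dim \mathscr{L}(V) = 4 = 2^2$.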

However, the fact that $\operatorname{dim} \mathscr{L}(V) = n^2$ is useful in proving the existence of a minimal polynomial. It means that any list of $n^2 + 1$ vectors from $\mathscr{L}(V)$ is necessarily linearly dependent. So, if we take such a list $$(I, L, L^2, \ldots, L^{n^2}),$$ then there is necessarily a non-trivial linear combination of these vectors that produces the $0$ vector. That is, there exist $a_0, a_1, \ldots, a_{n^2}$, not all equal to $0$, such that $$a_0 I + a_1 L + a_2 L^2 + \ldots + a_{n^2} L^{n^2} = 0.$$ These are equal as operators, i.e. equality holds at each point. So, we have found a polynomial $$p(z) = a_0 + a_1 z + a_2 z^2 + \ldots + a_{n^2} z^{n^2}$$ such that $p(L) = 0$. One can then argue that, since such a polynomial exists, there must be a unique monic polynomial of minimal degree with the same property (and indeed this polynomial divides every other polynomial with the same property). Thus, the minimal polynomial has degree at most $n^2$.
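If you want to see this dependence argument numerically, here is a minimal sketch assuming NumPy (the name `annihilating_poly` is just an illustrative choice, not notation from the answer). It flattens $I, L, L^2, \ldots, L^{n^2}$ into vectors of length $n^2$, stacks them as columns, and reads off a null vector of the resulting matrix, i.e. the coefficients of a polynomial $p$ with $p(L) = 0$:

```python
import numpy as np

def annihilating_poly(L):
    """Return coefficients a_0, ..., a_{n^2} with sum_k a_k L^k = 0."""
    n = L.shape[0]
    powers = [np.eye(n)]
    for _ in range(n * n):
        powers.append(powers[-1] @ L)          # I, L, L^2, ..., L^{n^2}
    # Columns are the flattened powers: shape (n^2, n^2 + 1), so there
    # are more columns than rows and a non-trivial null vector must exist.
    M = np.column_stack([P.flatten() for P in powers])
    _, _, Vt = np.linalg.svd(M)
    return Vt[-1]   # right-singular vector lying in the null space of M

L = np.array([[0.0, 1.0],
              [0.0, 0.0]])                     # nilpotent example: L^2 = 0
a = annihilating_poly(L)
p_of_L = sum(c * np.linalg.matrix_power(L, k) for k, c in enumerate(a))
print(np.allclose(p_of_L, 0))                  # True: p(L) = 0
```

Any null vector works here; uniqueness only enters once one passes to the monic polynomial of minimal degree.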

Of course, you can get a tighter bound with the Cayley-Hamilton theorem: the characteristic polynomial of $L$ has degree $n$ and satisfies $p(L) = 0$, so the minimal polynomial in fact has degree at most $n$!
