Can a polynomial also be a vector?

linear-transformations, matrices, polynomials, vector-spaces

To the best of my knowledge, a vector is defined simply as an element of some vector space. Since we can construct a vector space consisting of polynomials over the real field, I conclude that a polynomial in that space is a vector, simply by that definition.

What I struggle to understand is what happens when we start working with linear transformations. Suppose I have a linear operator $T: \mathbb{R}_n[x]\rightarrow \mathbb{R}_n[x]$; its representing matrix $A$ should satisfy $T(v) = Av$ for $v \in \mathbb{R}_n[x]$. But how do we multiply a matrix by a polynomial, unless we treat the polynomial as a vector?

To help clarify what I'm trying to ask: I've been told that after finding the eigenvectors of $A$ for the above transformation, I must convert those coordinate vectors back into polynomials in $\mathbb{R}_n[x]$. But if a polynomial isn't a vector, it sounds wrong to me to do so.

Best Answer

You say $T(v)=Av$. This is wrong. You should say $$[T(v)]_{B'}=A[v]_{B}$$

where $A$ is the matrix of $T$ with respect to the basis $B$ of the domain and the basis $B'$ of the co-domain. Here $[T(v)]_{B'}$ is the coordinate vector with respect to $B'$ and $[v]_{B}$ is the coordinate vector with respect to $B$.

So instead of multiplying the matrix by a polynomial, the effect of the linear transformation is captured by multiplying the matrix by the coordinate vector (i.e. the column of coordinates) of the polynomial.
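To make this concrete, here is a small worked example (not from the original question): take $T$ to be the differentiation operator $T(p) = p'$ on $\mathbb{R}_2[x]$, with the standard basis $B = B' = \{1, x, x^2\}$.

```latex
% Columns of A are the coordinate vectors of T applied to the basis:
% T(1) = 0,  T(x) = 1,  T(x^2) = 2x.
\[
A = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 2 \\ 0 & 0 & 0 \end{pmatrix}
\]
% For v = 3 + 5x + 7x^2 we have [v]_B = (3, 5, 7)^T, and
\[
A[v]_B
= \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 2 \\ 0 & 0 & 0 \end{pmatrix}
  \begin{pmatrix} 3 \\ 5 \\ 7 \end{pmatrix}
= \begin{pmatrix} 5 \\ 14 \\ 0 \end{pmatrix}
= [\,5 + 14x\,]_B
= [T(v)]_{B'}.
\]
```

The matrix never touches the polynomial itself; it only acts on its coordinate column, and the result is then read back as the coordinates of the polynomial $T(v) = 5 + 14x$.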