Matrix representation and non-standard basis vectors

differential-geometry, linear-algebra

There's a theorem which states that if I have a linear transformation $T: F^n \to F^m$ and the matrix associated with $T$ with respect to the standard bases of $F^n$ and $F^m$, then applying $T$ to any vector in the domain gives the same result as multiplying that vector by the associated matrix. Is there a similar theorem for non-standard bases?
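For a concrete instance of the standard-basis statement (the particular map here is just an illustration): if $T: \mathbb{R}^2 \to \mathbb{R}^2$ is given by $T(x, y) = (x + y,\, 2y)$, then its matrix with respect to the standard bases is $$A = \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix},$$ and indeed $T(3, 5) = (8, 10) = A \begin{pmatrix} 3 \\ 5 \end{pmatrix}$.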

Best Answer

Nothing about that theorem, or its proof, mentions anything about the standard basis, so yes.

Given any ordered bases $B$ and $C$ of $F^n$ and $F^m$ respectively (or of any other pair of finite-dimensional vector spaces over the same field that you care to name, though it doesn't matter, since they're all isomorphic to such things), for any vector $v \in F^n$, define $v_B$ to be the column vector (if you somehow like your operators on the right, substitute "row vector") whose entries are the coefficients of the unique linear combination of elements of $B$ equal to $v$, and similarly in $F^m$.

Then for any linear transformation $T: F^n \to F^m$, define $M_B^C(T)$ to be the matrix whose $(i,j)$-th entry is the coefficient of the $i$th element of $C$ in the unique linear combination of elements of $C$ equal to the image under $T$ of the $j$th element of $B$; in other words, the $j$th column of $M_B^C(T)$ is $(T(b_j))_C$. Then $M_B^C(T)\, v_B = (T(v))_C$.
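For a quick numerical sanity check of that identity, here is a minimal NumPy sketch. The particular map (the $T(x,y) = (x+y,\,2y)$ from the question's example), the bases $B = \{(1,1), (1,-1)\}$ and $C = \{(2,0), (1,1)\}$, and the variable names are illustrative choices, not part of the answer above: the columns of `M` are the $C$-coordinates of the images of the elements of $B$, and `M @ v_B` should agree with the $C$-coordinates of $T(v)$.

```python
import numpy as np

# Columns of P are the vectors of the basis B; columns of Q are the vectors of C.
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])
Q = np.array([[2.0, 1.0],
              [0.0, 1.0]])

# Matrix of T(x, y) = (x + y, 2y) with respect to the standard bases.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# Matrix of T with respect to B and C: M = Q^{-1} A P, so column j holds the
# C-coordinates of T applied to the j-th basis vector of B.
M = np.linalg.solve(Q, A @ P)

# Pick an arbitrary vector v, compute its B-coordinates, and check the identity.
v = np.array([3.0, 5.0])
v_B = np.linalg.solve(P, v)        # coordinates of v relative to B
Tv_C = np.linalg.solve(Q, A @ v)   # coordinates of T(v) relative to C

print(np.allclose(M @ v_B, Tv_C))  # True
```

The only design choice worth noting is using `np.linalg.solve` rather than explicitly inverting $Q$ and $P$; it computes the same coordinate vectors while avoiding an unnecessary matrix inverse.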