[Math] Representing a linear transformation as a tensor

linear-algebra, tensor-products, tensors

I understand that a linear transformation from a vector space $V$ to a vector space $W$ is a rank-$2$ tensor. What I would like some help with is how exactly to represent specific linear transformations in tensor notation. I will write below my attempt to link the two; please let me know if it makes sense, and provide whatever suggestions you think might help me.

So, how much, if any, of the following is correct?

Let $(v_1,\ldots,v_n)$ and $(w_1,\ldots,w_m)$ be bases of $V$ and $W$, respectively. Then $v_i \otimes w_j$ is the linear transformation that maps $v_i$ to $w_j$, and $\{v_i \otimes w_j : i \in \{1,\ldots,n\},\ j \in \{1,\ldots,m\}\}$ is a basis of $V \otimes W$.

If some linear transformation $T$ has the matrix
$$\begin{bmatrix}
t_{1,1} & t_{1,2} & \ldots & t_{1,n}\\
t_{2,1} & t_{2,2} & \ldots & t_{2,n}\\
\vdots & \vdots & \ddots & \vdots \\
t_{m,1} & t_{m,2} & \ldots & t_{m,n}
\end{bmatrix}$$

then $$T= \sum_{i=1}^{n}\sum_{j=1}^{m} t_{j,i}\,(v_i \otimes w_j)$$ is the tensor representation of $T$.

Best Answer

> how exactly to represent specific linear transformations in tensor notation.

Indeed every linear map $A:V\rightarrow W$ has a representation as a tensor $T\in V^{*}\otimes W$ such that $\forall x\in V:Ax=T(x)\in W$. In chosen bases,

$$ T=\sum a_{ij}\;v_j^*\otimes w_{i} $$

where $(a_{ij})$ is the matrix of the linear map and $\{v_j^*\}$ is the basis of $V^*$ dual to $\{v_j\}$. Each term is a map that reads off the $j$-th coordinate of $x$ and scales $w_i$ accordingly: $x\mapsto a_{ij}x^j\,w_i$.
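As a sanity check, the coordinate description above can be verified numerically. The sketch below (my own illustration, not from the answer) takes a hypothetical $3{\times}2$ example with $V=\mathbb{R}^3$, $W=\mathbb{R}^2$, and standard bases, builds $T=\sum_{i,j} a_{ij}\,v_j^*\otimes w_i$ as a sum of rank-one maps, and confirms it agrees with ordinary matrix multiplication:

```python
import numpy as np

# Hypothetical example: V = R^3, W = R^2, standard bases.
n, m = 3, 2
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])  # matrix (a_ij) of the linear map, shape (m, n)

# Basis of W, and the dual basis of V*: v_j^* reads off the j-th coordinate.
w = [np.eye(m)[i] for i in range(m)]
v_star = [lambda x, j=j: x[j] for j in range(n)]

def T(x):
    """Apply the tensor sum_{i,j} a_ij (v_j^* ⊗ w_i) to a vector x in V."""
    return sum(A[i, j] * v_star[j](x) * w[i]
               for i in range(m) for j in range(n))

x = np.array([1.0, -1.0, 2.0])
assert np.allclose(T(x), A @ x)  # the tensor reproduces the matrix action
```

Each summand $a_{ij}\,(v_j^*\otimes w_i)$ is a rank-one map, and summing them recovers exactly $x \mapsto Ax$, which is the content of the identification $\operatorname{Hom}(V,W)\cong V^*\otimes W$.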
