[Math] the connection between a row vector and covariant vector (or column and contravariant)

differential-geometry, tensors, vectors

This YouTube video by Eugene Khutoryansky makes the distinction clear between the covariant coordinates of a vector (the dot product of the vector with each basis vector, i.e. vector projection) and the contravariant coordinates (the parallelogram law).

Is there a way you could explain to a lay person how this underpins the following fact?

Covariant vectors are representable as row vectors. Contravariant vectors are representable as column vectors.


I would like to know, for example, whether the idea carries beyond being able to calculate the squared length of a vector in non-Cartesian coordinates as the dot product of its covariant and contravariant expressions:

$$ \lVert V\rVert ^2=\begin{bmatrix} V_X & V_Y & V_Z\end{bmatrix}\cdot \begin{bmatrix} V^X \\ V^Y\\ V^Z\end{bmatrix}.$$
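This identity can be checked numerically. Below is a small NumPy sketch (the skewed basis matrix `E` and the vector `v` are made-up numbers, not from the question): the contravariant components come from expanding `v` in the basis, the covariant components from dotting `v` with each basis vector, and their pairing recovers $\lVert V\rVert^2$.

```python
import numpy as np

# A made-up non-orthonormal ("skewed") basis for R^3; columns are e1, e2, e3.
E = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 2.0]])

v = np.array([2.0, -1.0, 3.0])    # the vector, in Cartesian components

# Contravariant components V^i: coefficients in v = V^1 e1 + V^2 e2 + V^3 e3,
# i.e. the solution of E @ v_contra = v (parallelogram law).
v_contra = np.linalg.solve(E, v)

# Covariant components V_i: projections onto the basis, V_i = e_i . v.
v_cov = E.T @ v

# Their dot product equals the squared Euclidean length of v.
print(v_cov @ v_contra)   # 14.0
print(v @ v)              # 14.0
```

The cancellation is visible in the algebra: $v_{\text{cov}}\cdot v_{\text{contra}} = (E^\top v)^\top (E^{-1} v) = v^\top E E^{-1} v = v^\top v$, for any invertible basis matrix $E$.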

In a curvilinear system, presumably the covariant basis vectors would be tangential to the coordinate curves, whereas the contravariant basis vectors would be orthogonal to the coordinate surfaces.


Apropos of the first comment, and as a bonus if it can be confirmed: covariant vectors are covectors (dual vectors), while contravariant vectors are just vectors.

Best Answer

You can perfectly well represent "contravariant vectors" as rows and "covariant vectors" as columns if you want.

It's just a convention. The dual space of the space of column vectors can be naturally identified with the space of row vectors, because matrix multiplication then realizes the "pairing" between a "covariant vector" and a "contravariant vector".

Remember that "covariant vectors" are defined as scalar-valued linear maps on the space of "contravariant vectors", so if $\omega$ is a covariant vector and $v$ is a contravariant vector, then $\omega(v)$ is a real number that depends linearly on both $v$ and $\omega$. If you make $v$ correspond to a column vector, and make $\omega$ correspond to a row vector then $$ \omega(v)=\omega v=(\omega_1,...,\omega_n)\left(\begin{matrix}v^1 \\ \vdots \\ v^n\end{matrix}\right)=\omega_1v^1+...+\omega_nv^n. $$
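In matrix terms the pairing is just a $1\times n$ row times an $n\times 1$ column. A minimal NumPy illustration (the components of $\omega$ and $v$ are arbitrary made-up numbers):

```python
import numpy as np

omega = np.array([[1.0, 2.0, 3.0]])        # covariant vector as a 1x3 row
v = np.array([[4.0], [5.0], [6.0]])        # contravariant vector as a 3x1 column

# Matrix multiplication gives the pairing omega(v) = omega_1 v^1 + ... + omega_n v^n.
result = omega @ v                          # 1x1 matrix
print(result[0, 0])                         # 1*4 + 2*5 + 3*6 = 32.0
```

Note that `omega @ v` is a $1\times 1$ matrix, reflecting that the pairing of a covector with a vector is a scalar.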

If $\omega$ were the column instead, the matrix product above would have to be written $\omega(v)=v\omega$, which is less aesthetically pleasing: we are used to displaying the argument of a function to the right of the function, and in this case $v$ is the argument.
