Skew-symmetric covariant tensor

linear-algebra, tensors

I read the following and there is something I don't understand about the notation. Why isn't $S_{ij}$ just zero in what follows?

Show that if $T_i$ are the components of a covariant tensor $T$, then $S_{ij} = T_i T_j - T_j T_i$ are the components of a skew-symmetric covariant tensor $S$.

The skew-symmetry is obvious. From the transformation law for $T$,
$$\overline{T}_i \overline{T}_j - \overline{T}_j \overline{T}_i =
T_r \frac{\partial x^r}{\partial \overline{x}^i} T_s \frac{\partial x^s}{\partial \overline{x}^j} - T_s \frac{\partial x^s}{\partial \overline{x}^j} T_r \frac{\partial x^r}{\partial \overline{x}^i} = T_r T_s \frac{\partial x^r}{\partial \overline{x}^i} \frac{\partial x^s}{\partial \overline{x}^j} - T_s T_r \frac{\partial x^r}{\partial \overline{x}^i} \frac{\partial x^s}{\partial \overline{x}^j} = (T_r T_s - T_s T_r) \frac{\partial x^r}{\partial \overline{x}^i} \frac{\partial x^s}{\partial \overline{x}^j}$$

or
$$\overline{S}_{ij} = S_{rs} \frac{\partial x^r}{\partial \overline{x}^i} \frac{\partial x^s}{\partial \overline{x}^j}$$
which establishes the covariant tensor character of $S$.
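As a quick numerical sanity check of this calculation, here is a minimal NumPy sketch; the matrix `J` standing in for the Jacobian $\partial x^r / \partial \overline{x}^i$ at a point and the components `T` are arbitrary illustrative choices, not part of the text.

```python
import numpy as np

# Minimal sketch: some components T_r in the x coordinates and an arbitrary
# matrix J with J[r, i] playing the role of dx^r / dxbar^i at a fixed point.
rng = np.random.default_rng(0)
n = 3
T = rng.normal(size=n)        # components T_r in the x coordinates
J = rng.normal(size=(n, n))   # J[r, i] ~ partial x^r / partial xbar^i

# Covariant transformation law: Tbar_i = T_r * J[r, i]
Tbar = J.T @ T

# S_rs = T_r T_s - T_s T_r, built in each coordinate system directly
S = np.outer(T, T) - np.outer(T, T).T
Sbar = np.outer(Tbar, Tbar) - np.outer(Tbar, Tbar).T

# Transformation law for a covariant rank-2 tensor: Sbar_ij = S_rs J[r, i] J[s, j]
Sbar_from_law = np.einsum('rs,ri,sj->ij', S, J, J)

print(np.allclose(Sbar, Sbar_from_law))          # True: the law holds
print(np.allclose(S, 0), np.allclose(Sbar, 0))   # True True: S vanishes in both charts
```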

Best Answer

Well, it is zero (the components $T_i$ are just real-valued functions, so $T_i T_j = T_j T_i$), but the point of your text is to prove that it is indeed a tensor.

A consequence of this is that if it is zero in one coordinate system ($x^{i}$, say), then it will be zero in every other coordinate system (because of the last expression you wrote).
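Explicitly, if $S_{rs} = 0$ in the $x^i$ coordinates, the transformation law you derived gives
$$\overline{S}_{ij} = S_{rs} \frac{\partial x^r}{\partial \overline{x}^i} \frac{\partial x^s}{\partial \overline{x}^j} = 0 \cdot \frac{\partial x^r}{\partial \overline{x}^i} \frac{\partial x^s}{\partial \overline{x}^j} = 0,$$
so $\overline{S}_{ij} = 0$ in the $\overline{x}^i$ coordinates as well.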
