[Math] The covariant derivative of the covector field $\alpha_i \mathbf{e}^i$

derivatives, differential-geometry, dual-spaces, riemannian-geometry, tensors

I'm trying to understand why the covariant derivative of a covector $\boldsymbol{\alpha} = \alpha_i \mathbf{e}^i$ is:

$$\nabla_k\boldsymbol{\alpha} = \left(\frac{\partial\alpha_i}{\partial x^k} - \alpha_j \Gamma^j_{ik}\right)\mathbf{e}^i$$

(I'm using the Einstein summation convention here.)

I've seen many sources define the covariant derivative of a vector $\mathbf{v}$ using abstract axioms. However, it can also be defined more concretely, just by taking the ordinary derivative and using the product rule to differentiate both the vector components and the basis vectors:

$$\nabla_k\mathbf{v} = \frac{\partial}{\partial x^k} \left(v^i \mathbf{e}_i\right) = \frac{\partial v^i}{\partial x^k} \mathbf{e}_i + v^i\frac{\partial \mathbf{e}_i}{\partial x^k} = \frac{\partial v^i}{\partial x^k} \mathbf{e}_i + v^i\Gamma^j_{ik} \mathbf{e}_j = \left(\frac{\partial v^j}{\partial x^k} + v^i\Gamma^j_{ik}\right) \mathbf{e}_j$$

where we define $\Gamma^j_{ik} \mathbf{e}_j = \frac{\partial \mathbf{e}_i}{\partial x^k}$.
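To make this concrete, here is a quick sympy check in plane polar coordinates (my own example, not part of the question as such): differentiating the coordinate basis vectors, written out in Cartesian components, recovers the familiar Christoffel symbols.

```python
import sympy as sp

# Plane polar coordinates: x = r cos(theta), y = r sin(theta)
r, th = sp.symbols('r theta', positive=True)
coords = [r, th]

# Coordinate basis vectors e_r, e_theta in Cartesian components
e = [sp.Matrix([sp.cos(th), sp.sin(th)]),
     sp.Matrix([-r * sp.sin(th), r * sp.cos(th)])]

E = sp.Matrix.hstack(*e)   # columns are the basis vectors
Einv = E.inv()

# Solve  d e_i / d x^k = Gamma^j_{ik} e_j  by expressing the derivative
# back in the coordinate basis
Gamma = [[[sp.simplify((Einv * sp.diff(e[i], coords[k]))[j])
           for k in range(2)]
          for i in range(2)]
         for j in range(2)]   # Gamma[j][i][k] = Gamma^j_{ik}

print(Gamma[0][1][1])  # Gamma^r_{theta theta} = -r
print(Gamma[1][0][1])  # Gamma^theta_{r theta} = 1/r
```

All other components come out as $0$ or $1/r$, matching the standard table for polar coordinates.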

We can follow similar reasoning for a covector $\boldsymbol{\alpha} = \alpha_i \mathbf{e}^i$:

$$\nabla_k\boldsymbol{\alpha} = \frac{\partial}{\partial x^k} \left(\alpha_i \mathbf{e}^i\right) = \frac{\partial \alpha_i}{\partial x^k} \mathbf{e}^i + \alpha_i\frac{\partial \mathbf{e}^i}{\partial x^k} = \frac{\partial \alpha_i}{\partial x^k} \mathbf{e}^i + \alpha_i\Lambda^i_{jk} \mathbf{e}^j = \left(\frac{\partial \alpha_j}{\partial x^k} + \alpha_i\Lambda^i_{jk}\right) \mathbf{e}^j$$

where $\Lambda^i_{jk} \mathbf{e}^j = \frac{\partial \mathbf{e}^i}{\partial x^k}$.
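The same concrete computation works for the dual basis. A sketch, again in plane polar coordinates (my own example), treating $\mathbf{e}^r = dr$ and $\mathbf{e}^\theta = d\theta$ as rows of Cartesian components:

```python
import sympy as sp

# Plane polar coordinates again
r, th = sp.symbols('r theta', positive=True)
coords = [r, th]

# Dual basis covectors e^r = dr, e^theta = dtheta as rows of
# Cartesian components (the gradients of r and theta)
f = [sp.Matrix([[sp.cos(th), sp.sin(th)]]),
     sp.Matrix([[-sp.sin(th) / r, sp.cos(th) / r]])]

F = sp.Matrix.vstack(*f)   # rows are the dual basis covectors
Finv = F.inv()

# Solve  d e^i / d x^k = Lambda^i_{jk} e^j  by expressing the derivative
# back in the dual basis
Lam = [[[sp.simplify((sp.diff(f[i], coords[k]) * Finv)[j])
         for k in range(2)]
        for j in range(2)]
       for i in range(2)]   # Lam[i][j][k] = Lambda^i_{jk}

print(Lam[1][1][0])  # Lambda^theta_{theta r} = -1/r
```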

We can also note that:
$$\nabla_k(\boldsymbol{\alpha}(\mathbf{v})) = \frac{\partial}{\partial x^k} (\alpha_i v^i) = \frac{\partial \alpha_i}{\partial x^k} v^i + \alpha_i\frac{\partial v^i}{\partial x^k}$$

Now if we use the standard relations $\boldsymbol{\alpha}(\mathbf{e}_i) = \alpha_i$ and $\mathbf{e}^i(\mathbf{v}) = v^i$, then it can be shown that:

$$\nabla_k(\boldsymbol{\alpha}(\mathbf{v})) - (\nabla_k \boldsymbol{\alpha})(\mathbf{v}) - \boldsymbol{\alpha}(\nabla_k\mathbf{v}) = -\alpha_i v^j (\Gamma^i_{jk} + \Lambda^i_{jk}) $$

To get the correct formula for the covariant derivative of a covector, I need to somehow show that $\Gamma^i_{jk} + \Lambda^i_{jk} = 0$. Many sources show this simply by declaring the left-hand side of the previous equation to be zero without proof, usually citing the "product rule" as justification. I have a hard time accepting this, as $\boldsymbol{\alpha}(\mathbf{v})$ doesn't look like a product.

What am I missing in order to show that $\Gamma^i_{jk} + \Lambda^i_{jk} =0$? Is there a way to prove it without arbitrarily declaring a new property/axiom?
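To convince myself that the component bookkeeping (including the overall minus sign on the defect) is right, here is a symbolic check with sympy; the dimension, the fixed derivative index $k$, and all symbol names are my own choices:

```python
import sympy as sp

n, k = 2, 0   # dimension and a fixed derivative direction
x = sp.symbols(f'x0:{n}')
a = [sp.Function(f'a{i}')(*x) for i in range(n)]   # components alpha_i
v = [sp.Function(f'v{i}')(*x) for i in range(n)]   # components v^i
# G[i][j] stands for Gamma^i_{jk}, L[i][j] for Lambda^i_{jk} (k fixed)
G = [[sp.Symbol(f'G{i}{j}') for j in range(n)] for i in range(n)]
L = [[sp.Symbol(f'L{i}{j}') for j in range(n)] for i in range(n)]

# nabla_k of the scalar alpha(v) = alpha_i v^i is the ordinary derivative
d_pairing = sum(sp.diff(a[i] * v[i], x[k]) for i in range(n))

# (nabla_k alpha)_j = d alpha_j / d x^k + alpha_i Lambda^i_{jk}
na = [sp.diff(a[j], x[k]) + sum(a[i] * L[i][j] for i in range(n))
      for j in range(n)]
# (nabla_k v)^j = d v^j / d x^k + v^i Gamma^j_{ik}
nv = [sp.diff(v[j], x[k]) + sum(v[i] * G[j][i] for i in range(n))
      for j in range(n)]

defect = (d_pairing
          - sum(na[j] * v[j] for j in range(n))
          - sum(a[j] * nv[j] for j in range(n)))
# Expanding leaves exactly -alpha_i v^j (Gamma^i_{jk} + Lambda^i_{jk})
target = -sum(a[i] * v[j] * (G[i][j] + L[i][j])
              for i in range(n) for j in range(n))

print(sp.expand(defect - target))  # 0
```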

Best Answer

Shouldn't you write $\nabla_k e_i$ (and $\nabla_k e^i$) instead of $\frac{\partial e_i}{\partial x^k}$ and $\frac{\partial e^i}{\partial x^k}$? Those notations are confusing to me. Anyway, you just need to remember that the induced connection on $T^*M$ is defined (or characterized) by the Leibniz rule $$\nabla (\alpha(X)) = (\nabla \alpha)(X) + \alpha(\nabla X)$$ for any one-form $\alpha$ and vector field $X$. This defines $\nabla \alpha$. Note that $\alpha(X)$ is just a function (a scalar), so $\nabla(\alpha(X)) = d(\alpha(X))$.

You get what you want by taking $\alpha = e^j$ and $X = e_k$: we get $$\nabla \delta^j_{k} = (\nabla e^j)(e_k) + e^j(\nabla e_k)$$ which means that for all $i$: $$\nabla_i \delta^j_{k} = (\nabla_i e^j)(e_k) + e^j(\nabla_i e_k)$$ Note that $\delta^j_{k}$ is a constant function, so $\nabla_i \delta^j_{k} = 0$. We're left with $$0 = \Lambda^j_{ki} + \Gamma^j_{ki}$$ which, after relabeling the free indices, is exactly $\Gamma^i_{jk} + \Lambda^i_{jk} = 0$.
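As a concrete sanity check, this identity can be verified componentwise in plane polar coordinates (a sketch with sympy; the coordinate choice and names are mine):

```python
import sympy as sp

# Plane polar coordinates
r, th = sp.symbols('r theta', positive=True)
coords = [r, th]

E = sp.Matrix([[sp.cos(th), -r * sp.sin(th)],
               [sp.sin(th), r * sp.cos(th)]])   # columns: e_r, e_theta
Einv = E.inv()                                  # rows: the dual basis e^r, e^theta

ok = True
for i in range(2):
    for j in range(2):
        for k in range(2):
            # Gamma^j_{ik}: j-th dual-basis component of d e_i / d x^k
            Gam = (Einv * sp.diff(E[:, i], coords[k]))[j]
            # Lambda^j_{ik}: i-th component of d e^j / d x^k in the dual basis
            Lam = (sp.diff(Einv[j, :], coords[k]) * E)[i]
            ok = ok and sp.simplify(Gam + Lam) == 0

print(ok)  # True
```

The cancellation is no accident: differentiating $F E = I$ (dual basis rows times basis columns) gives $(\partial_k F)E = -F(\partial_k E)$, which is precisely $\Lambda = -\Gamma$.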