Basic multivector derivatives $∂_X X = n$ and $∂_X X^2 = 2X$, etc…

calculus · clifford-algebras · geometric-algebras

Using geometric algebra, one may define the multivector derivative $∂_X$ with respect to a general multivector $X$ as
$$
∂_X ≔ \sum_J 𝒆^J (𝒆_J * ∂_X)
$$

where each “component” $𝒆_J * ∂_X$ is defined by
$$
(𝒆_J * ∂_X)f(X) ≔ \frac{\mathrm{d}}{\mathrm{d}\tau}f(X + \tau𝒆_J)\big|_{\tau=0}
.$$
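For concreteness, this component derivative is easy to check numerically by finite differences. Below is a minimal sketch, assuming the `clifford` Python package and an orthonormal Euclidean basis; the test function $f$, the test multivector, and the step size `h` are illustrative choices, not part of the definition.

```python
# Finite-difference approximation of the component derivative (e_J * ∂_X)f(X).
import clifford as cf

layout, blades = cf.Cl(2)        # 2D Euclidean geometric algebra, for concreteness
e1, e2 = blades['e1'], blades['e2']

def component_derivative(f, X, e_J, h=1e-6):
    """(e_J * ∂_X)f(X) ≈ [f(X + h e_J) - f(X - h e_J)] / (2h)."""
    return (f(X + h*e_J) - f(X - h*e_J)) / (2*h)

f = lambda X: X*X                          # f(X) = X^2 (geometric product)
X = 1 + 2*e1 - e2 + 3*(e1^e2)              # an arbitrary multivector
print(component_derivative(f, X, e1))      # numerically ≈ e1 X + X e1
print(e1*X + X*e1)                         # exact value, for comparison
```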

Notation

  • $A * B \equiv ⟨AB⟩_0$ denotes the scalar product.
    For any multivector $A$, we have $\sum_J 𝒆^J(𝒆_J * A) = A$.
    In the literature, parentheses are often dropped, with the understanding that $A*B\,C ≡ (A*B)C$.
  • We employ multi-index notation, $𝒆_J = 𝒆_{j_1}∧\cdots∧𝒆_{j_k}$. (If $k = 0$ then $𝒆_J = 1$).
    Reciprocal bases are reversed, $𝒆^J = 𝒆^{j_k}∧\cdots∧𝒆^{j_1}$, so that $𝒆^I * 𝒆_J = δ^I_J$ is always satisfied.
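For instance (an illustrative check, in an orthonormal basis where $𝒆^i = 𝒆_i$), take $J = (1,2)$:
$$
𝒆^J * 𝒆_J = (𝒆^2 ∧ 𝒆^1) * (𝒆_1 ∧ 𝒆_2) = ⟨𝒆_2 𝒆_1 𝒆_1 𝒆_2⟩_0 = 1
,$$
whereas without the reversed ordering one would get $(𝒆_1 ∧ 𝒆_2) * (𝒆_1 ∧ 𝒆_2) = ⟨𝒆_1𝒆_2𝒆_1𝒆_2⟩_0 = -1$.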

Problem

I’m having a humiliating time trying to sanity-check this definition by verifying, e.g., $∂_X X = n$, as stated in eq. (2.29) of [2] or eq. (7.8) of [3].
My computation begins as follows.
$$
∂_X X
= \sum_J 𝒆^J (𝒆_J * ∂_X)X
= \sum_J 𝒆^J \frac{\mathrm{d}}{\mathrm{d}\tau} (X + \tau𝒆_J)\big|_{\tau=0}
= \sum_J 𝒆^J 𝒆_J
.$$

There seems to be no room for confusion here.
But this is not $n$. Indeed,
\begin{align}
\sum_J 𝒆^J 𝒆_J &= \sum_{k=0}^n \sum_{j_1 < \cdots < j_k} \underbrace{𝒆^{j_1\cdots j_k}𝒆_{j_k\cdots j_1}}_1 = \sum_{k=0}^n \binom{n}{k} = 2^n
.\end{align}
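As a quick numerical confirmation of this count (a sketch assuming the `clifford` Python package, with the $2^3$ basis blades of a 3D algebra listed by hand):

```python
# Confirm Σ_J e^J e_J = 2^n (not n) in n = 3 Euclidean dimensions.
# With an orthonormal basis, the reciprocal of a unit blade is its reverse: e^J = ~e_J.
import clifford as cf

layout, blades = cf.Cl(3)
e1, e2, e3 = blades['e1'], blades['e2'], blades['e3']

basis_blades = [1 + 0*e1,                  # grade 0
                e1, e2, e3,                # grade 1
                e1^e2, e1^e3, e2^e3,       # grade 2
                e1^e2^e3]                  # grade 3

print(sum((~B) * B for B in basis_blades))   # prints 8 = 2^3
```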

This contradicts Proof 46 of [3], which includes the step “$\sum_{J_d} δ^J{}_J = d$” (a sum over multi-indices in $d$ dimensions), a step which I fail to see is true!

My failure is easily generalised: in trying to show $∂_X X^2 = 2X$, we have
\begin{align}
∂_X X^2 &= \sum_J 𝒆^J(𝒆_J * ∂_X)X^2 = \sum_J 𝒆^J \frac{\mathrm{d}}{\mathrm{d}\tau} (X + \tau𝒆_J)^2\big|_{\tau=0}
\\ &= \sum_J 𝒆^J(𝒆_J X + X 𝒆_J) = 2^n X + \sum_J 𝒆^J X 𝒆_J
.\end{align}
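The same finite-difference check confirms this identity too (again a sketch assuming the `clifford` package; the test multivector and step size are arbitrary choices):

```python
# Check ∂_X X^2 = 2^n X + Σ_J e^J X e_J numerically in n = 2.
import clifford as cf

layout, blades = cf.Cl(2)
e1, e2 = blades['e1'], blades['e2']
basis_blades = [1 + 0*e1, e1, e2, e1^e2]   # all 2^2 = 4 blades e_J; e^J = ~e_J

def multivector_derivative(f, X, h=1e-6):
    """∂_X f(X) = Σ_J e^J (e_J * ∂_X)f(X), each component by central difference."""
    return sum((~B) * (f(X + h*B) - f(X - h*B)) / (2*h) for B in basis_blades)

X = 1 + 2*e1 - e2 + 3*(e1^e2)
lhs = multivector_derivative(lambda Y: Y*Y, X)
rhs = 4*X + sum((~B) * X * B for B in basis_blades)   # 2^n X + Σ_J e^J X e_J
print(lhs)    # agrees with rhs up to finite-difference error
print(rhs)
```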

Note that it is easy to verify these results with the less general vector derivative, $\vec ∂ ≔ 𝒆^i ∂_i$ where $∂_i = 𝒆_i * ∂_X$ in the notation above.
Then, if $X = X^i𝒆_i$ is the position vector, we have $∂_i X = 𝒆_i$ and thus
$$
\vec ∂ X = 𝒆^i ∂_i X = 𝒆^i 𝒆_i = n
$$

and
$$
\vec ∂ X^2 = 𝒆^i \frac{\mathrm{d}}{\mathrm{d}\tau} (X + \tau𝒆_i)^2\big|_{\tau=0} = 𝒆^i(𝒆_i X + X 𝒆_i) = 2𝒆^i \, 𝒆_i * X = 2X
.$$
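Restricting the same finite-difference scheme to grade-$1$ components reproduces these vector-derivative results (a sketch assuming the `clifford` package):

```python
# Vector derivative: sum over the grade-1 basis only, vec∂ = Σ_i e^i (e_i * ∂_X).
import clifford as cf

layout, blades = cf.Cl(3)
e1, e2, e3 = blades['e1'], blades['e2'], blades['e3']

def vector_derivative(f, X, h=1e-6):
    """Central-difference approximation of Σ_i e^i (e_i * ∂_X)f(X)."""
    return sum(v * (f(X + h*v) - f(X - h*v)) / (2*h) for v in [e1, e2, e3])

X = 2*e1 - e2 + 3*e3                            # a position vector
print(vector_derivative(lambda Y: Y, X))        # vec∂ X   = 3 = n
print(vector_derivative(lambda Y: Y*Y, X))      # vec∂ X^2 = 2X
```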

Clearly I am misinterpreting the way in which the vector derivative $\vec ∂ = 𝒆^i(𝒆_i * ∂_X)$ is ‘generalised’ to have components at all grades, $∂_X = 𝒆^J (𝒆_J * ∂_X)$.
Could someone with fresh eyes help me out?


References

  1. Lasenby and Doran, “Multivector Lagrangian Fields” – Ch. 1.
  2. Hestenes and Sobczyk, “Clifford Algebra to Geometric Calculus” – Ch. 2, §2.
  3. Hitzer, “Multivector Differential Calculus” – page 3.

Best Answer

Your computation is correct, but your understanding of “$d$-dimensional subspace” and of the function meant by $X$ in $\partial_X X$ is not. To avoid confusion, let $f(X)$ be the function whose derivative you're trying to compute.

On page 57 of Hestenes and Sobczyk, just above the list of identities (2.28a)-(2.35), it says $X$ is the identity function on a linear subspace of dimension $d$. By this they mean the orthogonal projection onto a certain $d$-dimensional subspace of the algebra. (E.g., $X ↦ ⟨X⟩_1$.) Explicitly, if the subspace is denoted by $Y$, and we pick an orthonormal basis of multivectors $Y_1, \ldots, Y_d$, then $X ↦ \sum_{k=1}^d(X*Y_k)Y_k$ is the projection function meant by $X$. Note that the $Y_i$ are basis multivectors of the algebra $G(V)$, not basis vectors $𝒆_i$ of a vector subspace of $V$.

In your computation, you take $Y$ to be the whole geometric algebra (on an $n$-dimensional space), so your $X$ is simply the identity function $f(X)=X$, and $Y$ has dimension $d=2^n$. This is the sense in which your computation is correct.

For the more general $X$ corresponding to a projection operator, the computation is $(\mathbf e_J * ∂_X)f(X) = \frac{\mathrm{d}}{\mathrm{d}\tau}f(X + \tau\mathbf e_J)\big|_{\tau=0} = f(\mathbf e_J)$ for linear $f$, which for the above-described projection is explicitly $\sum_{k=1}^d(\mathbf e_J * Y_k)Y_k$.


In summary, if $G(V)$ is a geometric algebra over a vector space of dimension $\dim V = n$, then $\dim G(V) = 2^n$, and the multivector derivative of the identity is indeed $$ ∂_X X = 2^n .$$

To make contact with the vector derivative, you must include the projection onto the grade-$1$ subspace: $$ ∂_X ⟨X⟩_1 = n .$$
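Spelling this out with the linearity rule above, $f(X) = ⟨X⟩_1$ gives $(𝒆_J * ∂_X)⟨X⟩_1 = ⟨𝒆_J⟩_1$, and hence $$ ∂_X ⟨X⟩_1 = \sum_J 𝒆^J ⟨𝒆_J⟩_1 = \sum_{i=1}^n 𝒆^i 𝒆_i = n ,$$ since only the grade-$1$ blades survive the projection.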

Written differently, the vector and multivector derivatives are related by $\vec ∂ ≔ ⟨∂⟩_1$: $$ \vec ∂ = ⟨∂⟩_1 = \sum_J ⟨𝒆^J (𝒆_J * ∂)⟩_1 = \sum_J ⟨𝒆^J⟩_1 (𝒆_J * ∂) = \sum_i 𝒆^i (𝒆_i * ∂) ,$$ where $J$ is a multi-index and $i$ is a single index; the second equality holds because $𝒆_J * ∂$ is scalar-valued, and only the grade-$1$ blades survive $⟨𝒆^J⟩_1$. Then we have $\vec ∂_X X = n$.

Similarly, your ‘unexpected’ result $$ ∂_X X^2 = 2^n X + \sum_J 𝒆^J X 𝒆_J $$ is indeed correct. But by including a projection, you can easily verify the familiar results $$ \vec ∂_X X^2 = ⟨∂_X⟩_1 X^2 = 2X \quad\text{or}\quad ∂_X (⟨X⟩_1)^2 = ∂_X ⟨X^2⟩_0 = 2X $$ which you expected for the vector derivative. (In verifying these, note that $\sum_i 𝒆^i X 𝒆_i = (2 - n)X$ for a vector $X$.)
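Both projected identities, and the contraction $\sum_i 𝒆^i X 𝒆_i = (2-n)X$, can be confirmed with the same numerical scheme as above (a sketch assuming the `clifford` package):

```python
# Check ∂_X⟨X⟩₁ = n and Σ_i e^i X e_i = (2 - n)X for a vector X, in n = 3.
import clifford as cf

layout, blades = cf.Cl(3)
e1, e2, e3 = blades['e1'], blades['e2'], blades['e3']
basis_blades = [1 + 0*e1, e1, e2, e3,
                e1^e2, e1^e3, e2^e3, e1^e2^e3]    # all 2^3 blades; e^J = ~e_J

# ∂_X⟨X⟩₁ = Σ_J e^J ⟨e_J⟩₁; grade projection is B(1) in clifford.
print(sum((~B) * B(1) for B in basis_blades))     # prints 3 = n

X = 2*e1 - e2 + 3*e3                              # a vector
print(sum(v * X * v for v in [e1, e2, e3]))       # (2 - 3)X = -X
```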