Proof Verification – What the Jacobi Identity Imposes on Structure Constants

lie-algebras, proof-verification

I am currently self-studying Lie algebras and want to check my answer to the following question from Erdmann and Wildon:

Let $L$ be a Lie algebra with basis $(x_1, \dots, x_n)$. What condition does the Jacobi identity impose on the structure constants $a_{ij}^k$?

The structure constants are defined as the scalars $a_{ij}^k$ in the underlying field $F$ such that $$[x_i, x_j] = \sum_k a_{ij}^k x_k.$$ I have already established that $a_{ji}^k = -a_{ij}^k$ and that $a_{ii}^k = 0$.

Here's my stab at answering the above question:

Let $x_i, x_j, x_k$ be basis elements. Applying the Jacobi identity:

\begin{align}
[x_i, [x_j, x_k]] + [x_j, [x_k, x_i]] + [x_k, [x_i, x_j]]&=0 \\
\left[x_i, \sum_l a_{jk}^l x_l \right] + \left[ x_j, \sum_l a_{ki}^l x_l \right] + \left[ x_k, \sum_l a_{ij}^l x_l \right] &= 0 \\
\sum_l a_{jk}^l [ x_i, x_l] + \sum_l a_{ki}^l [ x_j, x_l] + \sum_l a_{ij}^l [ x_k, x_l] &= 0 \\
\sum_l a_{jk}^l \sum_m a_{il}^m x_m + \sum_l a_{ki}^l \sum_m a_{jl}^m x_m + \sum_l a_{ij}^l \sum_m a_{kl}^m x_m &= 0 \\
\sum_l \sum_m \left( a_{jk}^l a_{il}^m x_m + a_{ki}^l a_{jl}^m x_m + a_{ij}^l a_{kl}^m x_m \right) &=0 \\
\sum_m x_m \underbrace{\sum_l \left( a_{jk}^l a_{il}^m + a_{ki}^l a_{jl}^m + a_{ij}^l a_{kl}^m \right)} &= 0.
\end{align}
Since $(x_1, \dots, x_n)$ is a basis, its elements are linearly independent, so the braced coefficient of each $x_m$ must vanish. That is, $$\sum_l \left( a_{jk}^l a_{il}^m + a_{ki}^l a_{jl}^m + a_{ij}^l a_{kl}^m \right) = 0 \quad \text{for all } i, j, k, m.$$
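
As a sanity check (separate from the proof), one can verify this condition numerically for a concrete Lie algebra. The sketch below uses $\mathfrak{sl}(2, \Bbb R)$ with the standard basis $e, h, f$ and the commutator bracket; the basis ordering, the `coords` helper, and the array layout `a[i, j, k]` $= a_{ij}^k$ are my own choices for illustration.

```python
# Numerical sanity check of the condition derived above, on sl(2, R).
import itertools

import numpy as np

e = np.array([[0., 1.], [0., 0.]])
h = np.array([[1., 0.], [0., -1.]])
f = np.array([[0., 0.], [1., 0.]])
basis = [e, h, f]
n = len(basis)


def bracket(A, B):
    """Commutator bracket [A, B] = AB - BA."""
    return A @ B - B @ A


def coords(M):
    """Coordinates of a traceless 2x2 matrix M in the basis (e, h, f)."""
    return np.array([M[0, 1], M[0, 0], M[1, 0]])


# Structure constants: [x_i, x_j] = sum_k a[i, j, k] x_k.
a = np.array([[coords(bracket(xi, xj)) for xj in basis] for xi in basis])

# The condition imposed by the Jacobi identity, for all i, j, k, m.
for i, j, k, m in itertools.product(range(n), repeat=4):
    total = sum(
        a[j, k, l] * a[i, l, m] + a[k, i, l] * a[j, l, m] + a[i, j, l] * a[k, l, m]
        for l in range(n)
    )
    assert abs(total) < 1e-12

print("Jacobi condition on the structure constants holds for sl(2, R).")
```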

Questions

  1. Is the above correct? Is there a less computationally intensive approach? Note that I have been particularly explicit to try to catch any possible errors; my scratch work compressed this to three lines.

  2. Is this the strongest result possible?

  3. The conditions that $a_{ji}^k = -a_{ij}^k$ and that $a_{ii}^k = 0$ are easy to visualize by analogy with skew-symmetric matrices. Is there a similar interpretation of this result?

Best Answer

(1) Yes, this is correct. One can cut the amount of work by $2/3$ if one observes that the Jacobi identity is skew-symmetric in $i, j, k$. If we denote the full alternation over these indices by $\textrm{Alt}$, the Jacobi identity is $$0 = \textrm{Alt} \,[x_i, [x_j, x_k]] ,$$ which gives $$0 = \text{Alt} (a^l_{ij} a^m_{kl}) ;$$ optionally, we can write out the full alternation (and use antisymmetry of the lower indices of the structure constants) to recover your formula.
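
For completeness, here is that optional step written out. Summing $\operatorname{sgn}(\sigma)\, a^l_{\sigma(i)\sigma(j)} a^m_{\sigma(k)l}$ over the six permutations $\sigma$ of $\{i, j, k\}$ and using $a^l_{ji} = -a^l_{ij}$, the six terms pair up, and (restoring the explicit sum over $l$) $$0 = \textrm{Alt}\left(a^l_{ij} a^m_{kl}\right) = 2 \sum_l \left( a^l_{ij} a^m_{kl} + a^l_{jk} a^m_{il} + a^l_{ki} a^m_{jl} \right),$$ which is twice the condition you derived.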

(2) This is the strongest result possible, in the sense that (again, given the antisymmetry in the lower indices of the structure constants) this condition is equivalent to the Jacobi identity for any (equivalently, every) choice of basis $(x_i)$.

(3) There isn't exactly an analogue of those statements. In both of those cases, one fixes the upper index $l$ of $a^l_{ij}$ and interprets the entries $a^l_{ij}$ as the components of a matrix $A$, but in the Jacobi identity condition one contracts over this index, so it cannot be held fixed in any sense.

Here are two ways we might see this a little more concretely, however:

(1) Using antisymmetry, we can rearrange the Jacobi identity as $$[x, [y, z]] = [[x, y], z] + [y, [x, z]] .$$ Now, if we define for $x \in L$ the operator $\textrm{ad}_x : L \to L$, $y \mapsto [x, y]$, we can rewrite this as $$\textrm{ad}_x[y, z] = [\textrm{ad}_x(y), z] + [y, \textrm{ad}_x(z)] .$$ In other words, the Jacobi identity is just a Leibniz (product) rule for the operator $\textrm{ad}_x$.
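
A small numerical illustration of this reading, using the matrix Lie algebra $\mathfrak{gl}(3, \Bbb R)$ with the commutator bracket (the helper names `bracket` and `ad` below are ad hoc, not from any library):

```python
# Check that ad_x acts as a derivation of the bracket on gl(3, R).
import numpy as np

rng = np.random.default_rng(0)


def bracket(A, B):
    """Commutator bracket [A, B] = AB - BA."""
    return A @ B - B @ A


def ad(X):
    """The linear operator ad_X : Y -> [X, Y]."""
    return lambda Y: bracket(X, Y)


x, y, z = (rng.standard_normal((3, 3)) for _ in range(3))

# Jacobi identity, rearranged as a Leibniz rule for ad_x.
lhs = ad(x)(bracket(y, z))
rhs = bracket(ad(x)(y), z) + bracket(y, ad(x)(z))
assert np.allclose(lhs, rhs)
print("ad_x[y, z] = [ad_x(y), z] + [y, ad_x(z)] holds numerically.")
```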

(2) In the special case $L = \mathfrak{so}(3, \Bbb R) \cong \mathfrak{su}(2)$, we may identify the Lie bracket with the usual cross product $\times$ on $\Bbb R^3$. Then, the Jacobi identity translates to the possibly familiar identity $${\bf x} \times ({\bf y} \times {\bf z}) + {\bf y} \times ({\bf z} \times {\bf x}) + {\bf z} \times ({\bf x} \times {\bf y}) = 0.$$
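
And the same identity can be checked directly for the cross product (again purely illustrative):

```python
# Verify the cross-product form of the Jacobi identity on random vectors.
import numpy as np

rng = np.random.default_rng(0)
x, y, z = (rng.standard_normal(3) for _ in range(3))

jacobi = (np.cross(x, np.cross(y, z))
          + np.cross(y, np.cross(z, x))
          + np.cross(z, np.cross(x, y)))
assert np.allclose(jacobi, 0)
print("x × (y × z) + y × (z × x) + z × (x × y) = 0 holds numerically.")
```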
