Differential Geometry – Lie Bracket of gl_n(R) is the Matrix Commutator

differential-geometry, lie-algebras, lie-groups, smooth-manifolds

Notation/preliminaries.

  1. Let $\mathfrak{g}$ denote the Lie algebra (of left-invariant vector fields) on the Lie group $G$. Its Lie bracket $[.,.]\colon \mathfrak{g}\times\mathfrak{g}\to\mathfrak{g}$ is defined by
    $$[X,Y]_p(f)=X_p(Y_{\square}(f))-Y_p(X_\square(f))$$
    for any vector fields $X,Y\in C^{\infty}(TG)$, any point $p\in G$ and any smooth function $f\in C^{\infty}(G)$. Here, $X_\square(f)\colon G\to\mathbb{R}$ is the smooth map defined by $q\mapsto X_q(f)$.

  2. Let $T_eG$ denote the tangent space at the identity element $e$, consisting of all linear maps $C^{\infty}(G)\to \mathbb{R}$ which satisfy the product rule. The tangent space is equipped with the Lie bracket $[\![.,.]\!]\colon T_eG\times T_eG\to T_eG$ given by $[\![X_e,Y_e]\!]=[X,Y]_e$.

  3. This gives us a Lie algebra isomorphism $\mathfrak{g}\cong T_eG$. More precisely, one can show that every tangent vector $X_e\in T_eG$ can be extended in a unique way to a left-invariant vector field $X\in \mathfrak{g}$.

  4. For the Lie group $G=\mathrm{GL}_n(\mathbb{R})$, we can use $x\colon \mathrm{GL}_n(\mathbb{R})\to \mathbb{R}^{n\times n}$ defined by $p\mapsto p$ as global coordinates.

  5. We will use $\Big\{\Big(\frac{\partial}{\partial x^{ij}}\Big)_e\Big\}_{i,j=1}^n$ as a basis for $T_e\mathrm{GL}_n(\mathbb{R})$. Here, $\Big(\frac{\partial}{\partial x^{ij}}\Big)_e(f)=\partial_{ij}(f\circ x^{-1})\vert_{x(e)}$ for every smooth function $f\colon \mathrm{GL}_n(\mathbb{R})\to\mathbb{R}$. (A small symbolic sanity check of this basis and of the identification in (6) is sketched right after this list.)

  6. This gives rise to a vector space isomorphism $T_e\mathrm{GL}_n(\mathbb{R})\cong \mathbb{R}^{n\times n}$ via
    $\sum_{i,j} a_{ij}\Big(\frac{\partial}{\partial x^{ij}}\Big)_e\mapsto (a_{ij})$.

  7. For any vector field $X$ on $\mathrm{GL}_n(\mathbb{R})$, we let $M_X$ denote the matrix associated to $X$ via the identifications $\mathfrak{gl}_n(\mathbb{R})\cong T_e\mathrm{GL}_n(\mathbb{R})\cong \mathbb{R}^{n\times n}$.
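
For concreteness, here is a small sympy sketch of (5) and (6) with $n=2$; the test function $f=\det$ and all names in the code are my own choices, nothing standard.

```python
# Sanity check of (5)-(6) for n = 2, using sympy; f = det is an arbitrary test function.
import sympy as sp

n = 2
# Global coordinates x^{ij} on GL_2(R): just the matrix entries.
x = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))
e = sp.eye(n)  # identity element

# A sample smooth function f: GL_2(R) -> R, here f = det.
f = x.det()

# The basis derivation (d/dx^{ij})_e applied to f is the partial derivative
# of f with respect to the (i,j) entry, evaluated at x = e.
subs_e = {x[i, j]: e[i, j] for i in range(n) for j in range(n)}
basis_at_e = sp.Matrix(n, n, lambda i, j: sp.diff(f, x[i, j]).subs(subs_e))
print(basis_at_e)  # for f = det this is the identity matrix (the cofactor matrix at e)

# A tangent vector X_e = sum a_{ij} (d/dx^{ij})_e is identified with the matrix (a_{ij}).
a = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"a{i}{j}"))
X_e_of_f = sum(a[i, j] * sp.diff(f, x[i, j]).subs(subs_e)
               for i in range(n) for j in range(n))
print(sp.simplify(X_e_of_f - a.trace()))  # 0: at e, X_e(det) = tr(a)
```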

Problem. I want to show that under the identifications $\mathfrak{gl}_n(\mathbb{R})\cong T_e\mathrm{GL}_n(\mathbb{R})\cong \mathbb{R}^{n\times n}$, $[.,.]$ corresponds to the matrix commutator on $\mathbb{R}^{n\times n}$. Or more precisely:

For any vector fields $X$ and $Y$ on $\mathrm{GL}_n(\mathbb{R})$, it holds that
$M_{[X,Y]}=M_XM_Y-M_YM_X$.

Own attempt. I have realized that it suffices to show that for two tangent vectors $X_e=\sum_{i,j} a_{ij} \Big(\frac{\partial}{\partial x^{ij}}\Big)_e$ and $Y_e=\sum_{i,j} b_{ij} \Big(\frac{\partial}{\partial x^{ij}}\Big)_e$, it holds that
$$[X,Y]_e=\sum_{i,j,k} \big(a_{ik}b_{kj}-b_{ik}a_{kj}\big)\Big(\frac{\partial}{\partial x^{ij}}\Big)_e.$$

The first step, I guess, is to find the extensions $X$ and $Y$ of $X_e$ and $Y_e$, respectively. I'm more or less convinced that for any $p=(p_{ij})\in \mathrm{GL}_n(\mathbb{R})$, it holds that
$$X_p=\sum_{i,j,k} p_{ik}a_{kj}\Big(\frac{\partial}{\partial x^{ij}}\Big)_p.$$ Can we conclude from this that
$X=\sum_{i,j,k} x_{ik}(\square)a_{kj}\Big(\frac{\partial}{\partial x^{ij}}\Big)_\square$
holds?
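
To convince myself that this is the left-invariant extension, here is a quick sympy check (again $n=2$, with $f=\det$ as an arbitrary test function, and $l_g\colon q\mapsto gq$ denoting left translation) that the candidate $X$ satisfies $X_{gp}(f)=X_p(f\circ l_g)$:

```python
# Check that the guessed extension X_p = sum p_{ik} a_{kj} (d/dx^{ij})_p is left-invariant,
# i.e. X_{gp}(f) = X_p(f o l_g), for n = 2 and the test function f = det.
import sympy as sp

n = 2
x = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))
a = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"a{i}{j}"))
p = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"p{i}{j}"))
g = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"g{i}{j}"))

def X_at(q, func):
    # X_q(func) = sum_{i,j,k} q_{ik} a_{kj} * d(func)/dx^{ij}, evaluated at x = q
    subs_q = {x[i, j]: q[i, j] for i in range(n) for j in range(n)}
    return sum((q * a)[i, j] * sp.diff(func, x[i, j]).subs(subs_q)
               for i in range(n) for j in range(n))

f = x.det()  # an arbitrary test function f: GL_2(R) -> R

# f composed with left translation l_g: replace x by g*x inside f (simultaneously).
f_after_lg = f.subs({x[i, j]: (g * x)[i, j] for i in range(n) for j in range(n)},
                    simultaneous=True)

lhs = X_at(g * p, f)       # X_{gp}(f)
rhs = X_at(p, f_after_lg)  # X_p(f o l_g)
print(sp.simplify(lhs - rhs))  # 0, i.e. the extension is left-invariant
```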

I've then tried to compute $[X,Y]_e(f)$ for an arbitrary $f\in C^\infty(\mathrm{GL}_n(\mathbb{R}))$, using the formula in (1) above, and end up with the scary expression
$$[X,Y]_e(f)=\Bigg(\sum_{i,j} a_{ij} \Big(\frac{\partial}{\partial x^{ij}}\Big)_e\Bigg)\Bigg(\Big(\sum_{i,j,k} x_{ik}(\square)b_{kj}\Big(\frac{\partial}{\partial x^{ij}}\Big)_\square\Big)(f)\Bigg)-\Bigg(\sum_{i,j} b_{ij} \Big(\frac{\partial}{\partial x^{ij}}\Big)_e\Bigg)\Bigg(\Big(\sum_{i,j,k} x_{ik}(\square)a_{kj}\Big(\frac{\partial}{\partial x^{ij}}\Big)_\square\Big)(f)\Bigg)\,,$$
from which I have no idea where to go. Am I at all on the right track here? It feels like my main problem is that I get a little bit lost in all the notation and all identifications we make back and forth. Indeed, proofs of this fact can be found in many text books (e.g. Lee's Introduction to Smooth Manifolds p. 194), but the notation there tends to be too coarse for me to follow what is going on.

Best Answer

The "scary expression" will give you what you want, but you need to be careful with the names of the indices, you have some conflicts there, like $i$ appearing twice in not summed expressions.

Instead, let's streamline things a bit. For $A,B\in \mathfrak g$ we have $$ A=a^{ij}\partial_{ij}|_e,\qquad B=b^{ij}\partial_{ij}|_e, $$ with the components being constants, and summation over repeated indices understood. A generic group element is denoted by $x^{ij}$, and $\partial_{ij}$ are the holonomic frame vectors associated with the canonical coordinate system that maps group elements to their matrix entries.

We also recall that given any manifold $M$ with local chart $(U,\varphi)$ (with $\varphi(x)=(x^1(x),...,x^n(x))$), if two vector fields $X=X^i\partial_i$ and $Y=Y^i\partial_i$ are given, then their commutator is locally given by $$ [X,Y]=\left(X^j\partial_j Y^i-Y^j\partial_j X^i\right)\partial_i. $$
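
(If it helps, here is a throwaway sympy check of this coordinate formula on $\mathbb R^2$; the two polynomial vector fields and the generic symbolic $f$ are just arbitrary choices of mine.)

```python
# Check the coordinate formula [X,Y] = (X^j d_j Y^i - Y^j d_j X^i) d_i on R^2
# against the definition [X,Y]f = X(Yf) - Y(Xf), for a generic smooth f.
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
coords = [x1, x2]
f = sp.Function("f")(x1, x2)  # a generic smooth function

X = [x1 * x2, x2**2]     # components X^1, X^2 (an arbitrary test choice)
Y = [sp.Integer(1), x1]  # components Y^1, Y^2

def apply_vf(V, expr):
    # (V^i d_i) applied to a scalar expression
    return sum(V[i] * sp.diff(expr, coords[i]) for i in range(2))

# Definition as a commutator of derivations: [X,Y]f = X(Yf) - Y(Xf)
lhs = apply_vf(X, apply_vf(Y, f)) - apply_vf(Y, apply_vf(X, f))

# Coordinate formula: [X,Y]^i = X^j d_j Y^i - Y^j d_j X^i
bracket = [apply_vf(X, Y[i]) - apply_vf(Y, X[i]) for i in range(2)]
rhs = apply_vf(bracket, f)

print(sp.simplify(lhs - rhs))  # 0: the second derivatives of f cancel
```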

Left multiplication:

Let $\gamma:(-\epsilon,\epsilon)\rightarrow G$ be a smooth curve such that $\gamma(0)=x$ and $\dot\gamma(0)=X=X^{ij}\partial_{ij}|_x$. Let $g=(g^{ij})\in G$ be a group element. Then $$ (l_g)_\ast X=\frac{d}{dt}\big(g\gamma(t)\big)\Big|_{t=0}\stackrel{!}{=}gX=g^{ik}X^{kj}\partial_{ij}|_{gx}, $$ where at the equality sign with the exclamation mark we use the fact that the group elements are just ordinary matrices embedded into $\mathbb R^{n\times n}$.

So left translation of vectors in $\text{GL}(n,\mathbb R)$ is just ordinary left multiplication of the vector (which is a matrix, remember!).
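
A quick symbolic sanity check of this, for $n=2$ and the particular curve $\gamma(t)=p+tV$ (any curve with $\gamma(0)=p$ and $\dot\gamma(0)=V$ would do):

```python
# Differentiate t -> g*gamma(t) at t = 0 for gamma(t) = p + t*V and compare with g*V.
import sympy as sp

n, t = 2, sp.Symbol("t")
p = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"p{i}{j}"))
V = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"V{i}{j}"))
g = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"g{i}{j}"))

gamma = p + t * V  # gamma(0) = p, gamma'(0) = V
pushforward = (g * gamma).diff(t).subs(t, 0)
print(sp.expand(pushforward - g * V))  # zero matrix: (l_g)_* V = gV
```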

The derivation:

By the previous paragraph, the left-invariant vector fields corresponding to $A,B\in\mathfrak g$ (also denoted by the same letters) are given by $$ A_x=x^{ik}a^{kj}\partial_{ij}|_x,\qquad B_x=x^{ik}b^{kj}\partial_{ij}|_x. $$

If we now reinterpret the $x^{ij}$ from being specific variables to being coordinate functions, we can also write the vector fields without evaluation at a specific point as $$ A=x^{ik}a^{kj}\partial_{ij},\qquad B=x^{ik}b^{kj}\partial_{ij}. $$ The commutator is then $$ [A,B]=(A^{mn}\partial_{mn}B^{ij}-B^{mn}\partial_{mn}A^{ij})\partial_{ij}, $$ where $$ A^{ij}=x^{ik}a^{kj}, $$ and similarly for $B$. This is $$ \begin{aligned} [A,B]&=\left( x^{mr}a^{rn}\partial_{mn}(x^{ik}b^{kj})-x^{mr}b^{rn}\partial_{mn}(x^{ik}a^{kj}) \right)\partial_{ij}\\ &\stackrel{!}{=}\left(x^{mr}a^{rn}\delta^i_m\delta^k_n b^{kj}-x^{mr}b^{rn}\delta^i_m\delta^k_n a^{kj}\right)\partial_{ij}\\ &=\left( x^{ir}a^{rk}b^{kj}-x^{ir}b^{rk}a^{kj} \right)\partial_{ij}. \end{aligned} $$ At the equality with the exclamation mark we have used that $\partial_{mn}x^{ij}=\delta^i_m\delta^j_n$ and that the coefficients $a^{ij},b^{ij}$ are constants.

For the identity element we have $x^{ij}(e)=\delta^{ij}$, so $$ [A,B]_e=\left(a^{ik}b^{kj}-b^{ik}a^{kj}\right)\partial_{ij}|_e=[A_e,B_e], $$ where the last expression is the ordinary matrix commutator of the matrices $A_e,B_e$.
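
For good measure, here is a sympy verification of the whole computation for $n=2$, with all matrix entries kept symbolic; it confirms that the coefficient matrix of $[A,B]$ is $x(ab-ba)$, hence $ab-ba$ at $x=e$:

```python
# Verify [A,B] = x(ab - ba) for the left-invariant fields A = (xa)^{ij} d_{ij},
# B = (xb)^{ij} d_{ij} on GL_2(R), with fully symbolic entries.
import sympy as sp

n = 2
x = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))
a = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"a{i}{j}"))
b = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"b{i}{j}"))

A = x * a  # coefficient functions A^{ij} of the left-invariant field A
B = x * b  # coefficient functions B^{ij}

def apply_vf(C, expr):
    # (C^{ij} d_{ij}) applied to a scalar expression in the coordinates x^{ij}
    return sum(C[i, j] * sp.diff(expr, x[i, j]) for i in range(n) for j in range(n))

# [A,B]^{ij} = A^{mn} d_{mn} B^{ij} - B^{mn} d_{mn} A^{ij}
bracket = sp.Matrix(n, n, lambda i, j: apply_vf(A, B[i, j]) - apply_vf(B, A[i, j]))

print(sp.expand(bracket - x * (a * b - b * a)))  # zero matrix: [A,B] has coefficients x(ab - ba)
e_subs = {x[i, j]: sp.eye(n)[i, j] for i in range(n) for j in range(n)}
print(sp.expand(bracket.subs(e_subs) - (a * b - b * a)))  # zero matrix at x = e
```

(Nothing here depends on $n=2$ except the size of the symbolic matrices.)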
