[Math] In a free group two elements commute if and only if they are powers of a common element

free-groups, group-theory

In other words, $uv = vu$ in the free group $F_n$ if and only if $u=w^p$ and $v=w^q$ for some $w\in F_n$ and some integers $p,q$.

I would like to prove this without making use of Nielsen-Schreier (every subgroup of a free group is free). We can always find reduced words $u=t_1^{\epsilon_1}\cdots t_k^{\epsilon_k}$ and $v=s_1^{\eta_1}\cdots s_l^{\eta_l}$, and the statement $uv = vu$ transforms into $$t_1^{\epsilon_1}\cdots t_k^{\epsilon_k}\cdot s_1^{\eta_1}\cdots s_l^{\eta_l}\cdot t_k^{-\epsilon_k}\cdots t_1^{-\epsilon_1}\cdot s_l^{-\eta_l}\cdots s_1^{-\eta_1}\sim 1$$ where $\sim$ denotes the equivalence relation coming from setting $x\cdot x^{-1}\sim 1$ (the empty word). However, the arbitrariness of $u$ and $v$ makes it hard to go on from here.
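For concreteness, here is a small illustration in $F_2=F(x,y)$ (my own example, not part of the question): with $w=xy$, $u=w^2$ and $v=w^3$, the commutator collapses, $$uvu^{-1}v^{-1}=w^2\,w^3\,w^{-2}\,w^{-3}\sim 1,$$ whereas for $u=xy$ and $v=yx$ the word $$uvu^{-1}v^{-1}=xy\cdot yx\cdot y^{-1}x^{-1}\cdot x^{-1}y^{-1}=xy^2xy^{-1}x^{-2}y^{-1}$$ is already reduced and nontrivial, so $xy$ and $yx$ do not commute.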

Best Answer

I answer the question just to remove it from the list of unanswered questions.

Of course, the simplest solution is probably to use the Nielsen-Schreier theorem: if $a$ and $b$ commute, then $\langle a,b \rangle$ is an abelian subgroup of a free group, hence free and abelian; but the only abelian free groups are the trivial group and $\mathbb{Z}$, so $\langle a,b \rangle$ is cyclic, say generated by some $w$, and $a$ and $b$ are both powers of $w$.

Otherwise, the result can be shown by a combinatorial argument based on the classical normal form in free groups. The proof is by induction on $\mathrm{lg}(a)+\mathrm{lg}(b)$, and the argument follows the hint given by Derek Holt; the same proof can be found in Johnson's book, Presentations of Groups. The case where $\mathrm{lg}(a)$ or $\mathrm{lg}(b)$ belongs to $\{0,1\}$ is obvious, so let us suppose that $\mathrm{lg}(a), \mathrm{lg}(b) \geq 2$.

First, we write $a$ and $b$ as reduced words on a free basis, $$\left\{ \begin{array}{l} a= x_1 \cdots x_m \\ b= y_1 \cdots y_n \end{array} \right.,$$ where each letter $x_i$, $y_j$ is a generator or the inverse of a generator and, without loss of generality, $n \leq m$.

Then, after cancellation, we have the reduced products $$x_1 \cdots x_{m-r}\, y_{r+1} \cdots y_n =ab =ba = y_1 \cdots y_{n-s}\, x_{s+1} \cdots x_m,$$ where $r$ (resp. $s$) denotes the number of letters cancelled from each factor in the product $ab$ (resp. $ba$).

Notice that $$m+n-2r= \mathrm{lg}(ab)= \mathrm{lg}(ba)= m+n-2s$$ implies $r=s$. Moreover, $0 \leq r \leq n$.
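As a small illustration (my own example, not from the original answer): in $F(x,y)$ take $a = xy^2x^{-1}$ and $b = xyx^{-1}$, so $m=4$ and $n=3$; then $$ab = ba = xy^3x^{-1}, \qquad \mathrm{lg}(ab) = 5 = m+n-2r,$$ so $r=s=1$: exactly one letter is cancelled from each factor in both products.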

Case 1: $r=0$, i.e. there is no cancellation in the product. Comparing the first $n$ letters of the two reduced words above gives $y_i=x_i$ for $1 \leq i \leq n$, hence

$$ a = \underset{=y_1 \cdots y_n=b}{\underbrace{ \left( x_1 \cdots x_n \right) }} \cdot \underset{:=u}{\underbrace{ \left( x_{n+1} \cdots x_m \right) }} = bu.$$

Notice that $u$ and $b$ commute: $$bu=a=b^{-1}ab= ub.$$ Since $\mathrm{lg}(u)=m-n<m$, we have $\mathrm{lg}(b)+\mathrm{lg}(u)<\mathrm{lg}(a)+\mathrm{lg}(b)$, so the induction hypothesis applies: $b$ and $u$ are powers of a common element $w$, and therefore so is $a=bu$.
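For instance (illustrative example): $a=xyxy$ and $b=xy$ commute, and the product $ab=xyxyxy$ has no cancellation, so $r=0$; here $u=x_3x_4=xy$, so $b=u=w$ with $w=xy$, and $a=bu=w^2$.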

Case 2: $r=n$, i.e. the number of cancellations is maximal, so all of $b$ cancels against the last $n$ letters of $a$ in the product $ab$. Then $y_i=x_{m-i+1}^{-1}$ for $1 \leq i \leq n$, hence

$$a^{-1} = \underset{=y_1 \cdots y_n=b}{\underbrace{ \left( x_m^{-1} \cdots x_{m-n+1}^{-1} \right) }} \cdot \underset{:=u}{\underbrace{ \left( x_{m-n}^{-1} \cdots x_1^{-1} \right) }} = bu.$$

As in Case 1, $b$ and $u$ commute ($bu=a^{-1}=b^{-1}a^{-1}b=ub$), and the induction hypothesis applies: $b$ and $u$ are powers of a common element, hence so are $a^{-1}=bu$ and $a$ itself.
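Again for illustration (my own example): $a=xyxy$ and $b=y^{-1}x^{-1}$ commute, and in $ab=xyxy\cdot y^{-1}x^{-1}=xy$ all of $b$ cancels, so $r=n=2$; here $$a^{-1}=y^{-1}x^{-1}y^{-1}x^{-1}=b\cdot u \quad \text{with } u=y^{-1}x^{-1}=b,$$ and indeed $a=(xy)^2$ and $b=(xy)^{-1}$ are powers of $w=xy$.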

Case 3: $0 < r <n$. Comparing first and last letters of the reduced words $ab=ba$ and using the cancellations in $ab$ and $ba$, we get $x_1=y_1$, $y_1=x_m^{-1}$, $y_n=x_m$ and $x_1=y_n^{-1}$. Let $z:=x_1$, $a'= x_2 \cdots x_{m-1}$ and $b'= y_2 \cdots y_{n-1}$. Then $a=za'z^{-1}$ and $b=zb'z^{-1}$. Since $a'b'=z^{-1}abz=z^{-1}baz=b'a'$ and $\mathrm{lg}(a')+\mathrm{lg}(b')=(m-2)+(n-2)<\mathrm{lg}(a)+\mathrm{lg}(b)$, the induction hypothesis applies to $a'$ and $b'$: they are powers of a common element $w$, and then $a$ and $b$ are powers of $zwz^{-1}$.
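Continuing the earlier illustration $a=xy^2x^{-1}$, $b=xyx^{-1}$ (my own example): there $r=1$ and $0<r<n=3$, with $z=x_1=x$, $a'=y^2$ and $b'=y$; these commute and are powers of $w=y$, so $$a=(xyx^{-1})^2, \qquad b=xyx^{-1},$$ i.e. both are powers of $zwz^{-1}=xyx^{-1}$.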
