Number of errors of $\sigma$ is the same as the minimum number of adjacent transpositions needed to convert $\sigma$ to the identity permutation

linear-algebra, linear-transformations, permutations, proof-writing

Suppose we have a general permutation $\sigma$:

$$
\sigma =
\begin{pmatrix}
1 & 2 & \dots & n\\
\sigma(1) & \sigma(2) & \dots & \sigma(n)
\end{pmatrix}
$$

How do I show that the minimum number of transpositions of adjacent elements necessary to convert $\sigma$ to the identity permutation is always exactly the number of errors of $\sigma$?

Suppose that $\sigma$ has $m$ $\color{red}{\text{inversion pairs}}$, with $m\leq \binom{n}{2}$ and $m,n\in\mathbb{N}$. Then we have $m$ pairs $(i,j)$ with $i<j$, $1 \leq i \leq n$, $1 \leq j \leq n$, $i,j\in\mathbb{N}$, where $\sigma(i)> \sigma(j)$, and each such pair can be fixed by interchanging $\sigma(i)$ and $\sigma(j)$:

$$\sigma=\begin{pmatrix} i &\dots& j \\ \sigma(i) &\dots &\sigma(j)\end{pmatrix}\implies \sigma'=\begin{pmatrix} i &\dots& j \\ \sigma(j) &\dots &\sigma(i)\end{pmatrix}$$

I don't know how to go on. I'm not sure how to show that this amounts to the same thing as interchanging only adjacent elements. That would lead to the desired conclusion, because after we've fixed all $m$ errors, all elements are sorted, i.e. $\sigma'(1)<\sigma'(2)<\dots<\sigma'(n)$:
$$
\begin{pmatrix}
1 & 2 & \dots & n\\
\sigma'(1) & \sigma'(2) & \dots &\sigma'(n)
\end{pmatrix}
$$

(It's like we're sorting elements with the bubble-sort algorithm.)
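To get a feeling for why the two counts should agree (a sanity check, not a proof), one can compare them on all small permutations. Here is a minimal Python sketch, where a permutation is given in one-line notation and the helper names are my own:

```python
from itertools import permutations

def count_inversions(perm):
    # number of "errors": pairs i < j with perm[i] > perm[j]
    return sum(1 for i in range(len(perm))
                 for j in range(i + 1, len(perm))
                 if perm[i] > perm[j])

def bubble_sort_swaps(perm):
    # count the adjacent transpositions that bubble sort performs
    a, swaps = list(perm), 0
    for _ in range(len(a)):
        for i in range(len(a) - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swaps += 1
    return swaps

# the two counts agree on every permutation of {1, ..., 4}
for p in permutations(range(1, 5)):
    assert count_inversions(p) == bubble_sort_swaps(p)
```

Bubble sort only ever swaps adjacent pairs that are out of order, so each swap removes exactly one error; that is the observation the answer below turns into a proof.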


Can you help me with hints?

Best Answer

I'll call a transposition of two adjacent elements "a swap", and I will call any way of writing a permutation as a product (or, equivalently, composition) of swaps a "swap decomposition". I will also call "an inversion" what you seem to call an "error", namely a pair of elements $i>j$ with $\sigma(i)<\sigma(j)$; I call such elements "out of order". So what we want to show is that for any permutation $\sigma$, the number of inversions is the smallest number of swaps in any of its swap decompositions.
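For a concrete example (just for illustration): in one-line notation the permutation $3\,1\,2$ has exactly two inversions, since $3$ stands before both $1$ and $2$; and indeed two swaps take it to the identity (swap the first two entries to get $1\,3\,2$, then the last two to get $1\,2\,3$), so two swaps suffice, matching the inversion count.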

It is not too bad to see that the number of inversions is not more than the number of swaps in any swap decomposition: thinking of the decomposition as starting from the identity permutation and building $\sigma$, every swap either increases the number of inversions by 1 (if we swap two elements that used to be in order), or decreases the number of inversions by 1 (if we swap two elements that used to be out of order). So after all the swaps are done (i.e. after we get $\sigma$) the number of inversions is at most the number of swaps.
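The fact that a single swap changes the inversion count by exactly $\pm 1$ can also be checked mechanically on small cases; a short Python sketch (the helper `count_inversions` is my own name, not something from the answer):

```python
from itertools import permutations

def count_inversions(perm):
    # number of pairs i < j with perm[i] > perm[j]
    return sum(1 for i in range(len(perm))
                 for j in range(i + 1, len(perm))
                 if perm[i] > perm[j])

# swapping any two adjacent entries changes the count by exactly 1
for p in permutations(range(1, 5)):
    for i in range(len(p) - 1):
        q = list(p)
        q[i], q[i + 1] = q[i + 1], q[i]
        assert abs(count_inversions(q) - count_inversions(p)) == 1
```

The reason is that swapping positions $i$ and $i+1$ leaves the relative order of every other pair untouched, so only the pair sitting at those two positions can change from "in order" to "out of order" or back.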

Now it is also clear what we need to show to see that there exists a swap decomposition where the number of swaps is actually equal to the number of inversions: we need a swap decomposition in which each swap actually increases the number of inversions by 1. To see that this exists it is easier to think "backwards": a swap decomposition of $\sigma$ also tells us how to go from $\sigma$ to the identity (here we are using that the inverse of a swap is a swap). Every swap decreases the number of inversions by at most 1, and we want to show that there is a "path" from $\sigma$ to the identity where every "step" in fact does decrease the number of inversions by 1.

But this is not too hard to see: formally, argue by induction on the number of inversions. The base case is where the number of inversions is $0$, so $\sigma$ is the identity and has the empty swap decomposition (if this makes you uncomfortable, use the case where $\sigma$ is itself a swap as the base case instead). For the induction step, suppose $\sigma$ has $k>0$ inversions. Then $\sigma$ is not the identity, and hence there exists a pair of adjacent elements that are out of order. If we swap them, the number of inversions becomes $k-1$, and the new permutation $\tilde{\sigma}$ has, by the induction hypothesis, a swap decomposition of length $k-1$. Composing this decomposition with the initial swap we get a swap decomposition of $\sigma$ of length $k$. QED
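Read forwards, the induction step is just a greedy procedure: while some adjacent pair is out of order, swap it. A small Python sketch of that path from $\sigma$ to the identity (function names are my own):

```python
from itertools import permutations

def count_inversions(perm):
    # number of pairs i < j with perm[i] > perm[j]
    return sum(1 for i in range(len(perm))
                 for j in range(i + 1, len(perm))
                 if perm[i] > perm[j])

def swaps_to_identity(perm):
    # repeatedly swap some adjacent out-of-order pair, as in the induction
    # step; each swap removes exactly one inversion, so this terminates
    a, swaps = list(perm), 0
    while True:
        bad = [i for i in range(len(a) - 1) if a[i] > a[i + 1]]
        if not bad:
            return swaps
        i = bad[0]                       # any out-of-order adjacent pair works
        a[i], a[i + 1] = a[i + 1], a[i]
        swaps += 1

# the greedy path from sigma to the identity uses exactly inv(sigma) swaps
for p in permutations(range(1, 6)):
    assert swaps_to_identity(p) == count_inversions(p)
```

Reversing the swaps taken along this path gives exactly the swap decomposition of $\sigma$ of length $k$ that the induction produces.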
