[Math] If some series of $n$ terms is deranged, what is the probability that no term stands next to a term it was next to originally?

combinatorics, euler-mascheroni-constant, infinity, probability-theory, sequences-and-series

I stumbled upon this most interesting problem, whose solution has so far escaped me, and I think it would be fascinating to see how it can be dealt with precisely.

It runs as follows:

If some series of $n$ terms is deranged, how can we first calculate the probability that no term shall stand next to a term it was next to originally, and further, if $n$ happens to be infinite, how can we prove that the probability is $e^{-2}$?

Best Answer

The limiting ratio is not hard to derive via the method of moments.

In a random permutation $\sigma$ of $\{1,2,\dots,n\}$ (not necessarily a derangement), let $X_n$ be

  • the total number of integers $i$, $1 \le i \le n$, such that $\sigma(i) = i$, plus
  • the total number of integers $i$, $1 \le i \le n-1$, such that $\sigma(i+1) = \sigma(i)+1$, plus
  • the total number of integers $i$, $1 \le i \le n-1$, such that $\sigma(i) = \sigma(i+1)+1$.

We'll show that for any fixed $k$, we have $\lim_{n \to \infty} \mathbb E[\binom{X_n}{k}] = \frac{3^k}{k!}.$ If a random variable $X$ is Poisson with mean $3$, we also have $\mathbb E[\binom{X}{k}] = \frac{3^k}{k!}$. The Poisson distribution is determined by its moments, so it follows that $X_n$ converges in distribution to $X$, and in particular $\lim_{n \to \infty} \Pr[X_n = 0] = e^{-3}$.
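As a quick concrete check of the $k=1$ case: by linearity, each of the $n$ fixed-point indicators has probability $\frac1n$ and each of the $2(n-1)$ adjacency indicators also has probability $\frac1n$, so $\mathbb E[X_n] = 1 + 2\frac{n-1}{n} \to 3$. A sketch that verifies this exactly for small $n$ (0-indexed positions, function names my own):

```python
from itertools import permutations

def X(p):
    """The statistic X_n: fixed points, plus pairs with sigma(i+1)=sigma(i)+1,
    plus pairs with sigma(i)=sigma(i+1)+1 (all 0-indexed)."""
    n = len(p)
    fixed = sum(p[i] == i for i in range(n))
    up = sum(p[i + 1] == p[i] + 1 for i in range(n - 1))
    down = sum(p[i] == p[i + 1] + 1 for i in range(n - 1))
    return fixed + up + down

def mean_X(n):
    """Exact E[X_n], averaging over all n! permutations."""
    perms = list(permutations(range(n)))
    return sum(X(p) for p in perms) / len(perms)
```

For instance, `mean_X(8)` comes out to $1 + 2\cdot\frac{7}{8} = 2.75$, already close to the Poisson mean $3$.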

To see this, note that $X_n$ is the sum of $3n-2$ indicator variables corresponding to each of the events (listed above) that $X_n$ counts, so $\binom{X_n}{k}$ counts the number of size-$k$ sets of these events that all occur. The calculation is difficult to do exactly but more or less straightforward asymptotically. Out of the $\binom{3n-2}{k}$ choices of $k$ events, $(1-o(1))\binom{3n-2}{k}$ never constrain the same value $\sigma(i)$ more than once, so the probability that all $k$ of them occur is $(1+o(1))n^{-k}$. These contribute $(1+o(1))\frac{3^k}{k!}$ to the expected value $\mathbb E[\binom{X_n}{k}]$, which is exactly the limit we wanted.

So it remains to reassure ourselves that the contribution from all other choices of $k$ events is insignificant in the limit. Muddying the picture, some groups of $j \le k$ overlapping events have a significantly higher probability of all occurring than a group of $j$ nonoverlapping events. For example, the events that $\sigma(3)=3$, that $\sigma(4)=4$, and that $\sigma(4)=\sigma(3)+1$ have this property: the third event is implied by the first two, so all three occur with probability $\frac{1}{n(n-1)}$ rather than on the order of $n^{-3}$.
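This overlap example is easy to verify by brute force for small $n$ (here with 0-indexed positions for convenience): the three events amount to just the two constraints $\sigma(3)=3$ and $\sigma(4)=4$, so together they occur with probability $\frac{1}{n(n-1)}$.

```python
import math
from itertools import permutations

n = 7
# Events (0-indexed): sigma(3)=3, sigma(4)=4, sigma(4)=sigma(3)+1.
# The third is implied by the first two, so only two values are constrained.
hits = sum(
    1
    for p in permutations(range(n))
    if p[3] == 3 and p[4] == 4 and p[4] == p[3] + 1
)
prob = hits / math.factorial(n)
# prob equals 1/(n*(n-1)), much larger than the ~n**-3 one would get
# from three non-overlapping events
```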

However, we can never win this way: there are only $O(n)$ ways to pick such a group of $j$ overlapping events (with a constant factor depending on $k$), but an $O(n^{-2})$ chance that all of the events occur (since at least two values of $\sigma$ are constrained). So for any possible overlapping structure, the contribution to $\mathbb E[\binom{X_n}{k}]$ is $O(n^{-1})$, and there are only a constant number (depending only on $k$) of overlapping structures.

As a result, we can ignore all overlaps, conclude that $\lim_{n\to\infty}\mathbb E[\binom{X_n}{k}] = \frac{3^k}{k!}$, and deduce that $\lim_{n\to\infty} \Pr[X_n = 0] = e^{-3}$. Note that $X_n = 0$ already forces $\sigma$ to be a derangement, since fixed points are counted in $X_n$; combined with $\Pr[\text{$\sigma$ is a derangement}] \to e^{-1}$, this gives $$\lim_{n\to\infty} \Pr[X_n = 0 \mid \text{$\sigma$ is a derangement}] = \lim_{n\to\infty} \frac{\Pr[X_n=0]}{\Pr[\text{$\sigma$ is a derangement}]} = \frac{e^{-3}}{e^{-1}} = e^{-2}.$$
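The whole chain of limits is also easy to check by simulation. A minimal Monte Carlo sketch (0-indexed positions, names and parameters my own): estimate $\Pr[X_n=0]$ and $\Pr[\text{$\sigma$ is a derangement}]$ for a moderately large $n$ and compare with $e^{-3}$, $e^{-1}$, and the ratio $e^{-2}$.

```python
import math
import random

def estimate(n, trials, seed=1):
    """Monte Carlo estimates of Pr[X_n = 0] and Pr[sigma is a derangement]."""
    rng = random.Random(seed)
    base = list(range(n))
    zero = deranged = 0
    for _ in range(trials):
        p = base[:]
        rng.shuffle(p)
        if all(p[i] != i for i in range(n)):  # sigma is a derangement
            deranged += 1
            # X_n = 0 additionally requires that no adjacent pair takes
            # consecutive values in either order
            if all(abs(p[i + 1] - p[i]) != 1 for i in range(n - 1)):
                zero += 1
    return zero / trials, deranged / trials
```

With $n=300$ and a few tens of thousands of trials, the two estimates land near $e^{-3} \approx 0.0498$ and $e^{-1} \approx 0.3679$, and their ratio near $e^{-2} \approx 0.1353$.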