[Math] How many 2-card swaps until a “card deck” is close to true random

random, sequences-and-series

Let's define a "card deck" as a sequence of the first $x$ natural numbers, and a "swap" as the creation of a new sequence in which the $i$th and $j$th members of the original sequence are exchanged, e.g. swapping cards $2$ and $3$ so that $[1, 2, 3, 4] \longmapsto [1, 3, 2, 4]$.

My question is: how many swaps (where $i$ and $j$ are randomly chosen) must be made before a sequence is created that can be pronounced random? Is there a faster way to achieve randomness if $i$ and $j$ are not chosen randomly?
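For concreteness, here is a minimal Python sketch of the procedure being asked about; the function name and the convention that $i \ne j$ are my own assumptions.

```python
import random

def random_swap_shuffle(x, r):
    """Apply r swaps of two distinct, randomly chosen positions to the deck [1, ..., x]."""
    deck = list(range(1, x + 1))
    for _ in range(r):
        i, j = random.sample(range(x), 2)   # two distinct random positions
        deck[i], deck[j] = deck[j], deck[i]
    return deck

print(random_swap_shuffle(4, 1))   # e.g. [1, 3, 2, 4] if positions 2 and 3 were picked
```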

EDIT:
I suppose I don't care how random it is, though let's say the sequence is a real deck of cards; the randomness should be close to what you would get with $7$ good riffles (see Shuffling). Even better would be a way to control the degree of randomness, i.e., estimate $z$, where your odds of predicting the location of a particular card in the deck are $\frac{1}{x} + z$.

Note: It seems that picking $i$ and $j$ randomly is not the best way to achieve randomness. See the Fisher-Yates shuffle, sketched below.
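For comparison, a sketch of the Fisher-Yates shuffle (here in Python; the standard library's `random.shuffle` uses the same algorithm): it produces a uniformly random permutation with only $x - 1$ swaps, because each swap position is drawn from a shrinking range rather than uniformly over the whole deck.

```python
import random

def fisher_yates(deck):
    """Shuffle deck in place so every permutation is equally likely, using len(deck) - 1 swaps."""
    for k in range(len(deck) - 1, 0, -1):
        m = random.randint(0, k)        # random position among the first k + 1 entries
        deck[k], deck[m] = deck[m], deck[k]
    return deck

print(fisher_yates(list(range(1, 53))))
```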

Best Answer

In a truly random shuffle, the probability that, say, the top card remains where it started is $1/x$. So you want to do enough swaps that the probability the top card has moved is about $(x-1)/x$. I don't think you need to do more than that: since the swaps are random, if the top card moves, it moves somewhere random, and there's nothing special about the top card, so every card should wind up somewhere random.

So: on each swap, the probability that you don't move (say) the top card is $(x-2)/x$. After $r$ independently chosen swaps, this probability is $\left(\frac{x-2}{x}\right)^r$. So we want $r$ such that $$\left(\frac{x-2}{x}\right)^r=\frac{1}{x}$$ (roughly). If $x$ is large then $\left(\frac{x-2}{x}\right)^x$ is close to $e^{-2}$, so writing $r=tx$ turns the equation into $e^{-2t}\approx 1/x$, i.e. $t\approx\frac{1}{2}\log x$. This gives the approximate solution $r=\frac{1}{2}x\log x$ for the number of swaps needed to randomize the deck.
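For $x=52$ this works out to $r \approx 26\ln 52 \approx 103$ swaps. As a rough check of the formula (the trial count below is an arbitrary choice of mine), a simulation of how often the top card is never touched should come out close to $\left(\frac{x-2}{x}\right)^r \approx \frac{1}{x}$:

```python
import math
import random

x = 52
r = round(0.5 * x * math.log(x))   # the estimate above: about 103 swaps for x = 52
trials = 100_000

never_touched = 0
for _ in range(trials):
    touched = False
    for _ in range(r):
        i, j = random.sample(range(x), 2)   # two distinct random positions
        if 0 in (i, j):                     # this swap involves the top card
            touched = True
            break
    if not touched:
        never_touched += 1

# Empirical frequency vs. ((x-2)/x)^r vs. the 1/x target
print(never_touched / trials, ((x - 2) / x) ** r, 1 / x)
```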
