[Math] “Surprising” examples of Markov chains

markov-chains, pr.probability

I am looking for examples of Markov chains that are surprising in the following sense: a stochastic process $X_1,X_2,…$ which is "natural" but for which the Markov property is not obvious at first glance. For example, it could be that the natural definition of the process is in terms of some Markovian process $Y_1,Y_2,…$ with $X_i=f(Y_i)$ for some function $f$.

Let me give you an example which is slightly surprising, but not surprising enough for my taste. Suppose we have $n$ bins that are initially empty, and at each time step $t$ we throw a ball into one of the bins selected uniformly at random (and independently of previous time steps). Let $X_t$ be the number of empty bins at time $t$. Then $X_1,X_2,…$ form a Markov chain: conditioned on $X_t=k$, the next ball lands in an empty bin with probability $k/n$, so $X_{t+1}=k-1$ with probability $k/n$ and $X_{t+1}=k$ otherwise, regardless of how the earlier balls are arranged.
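As a quick empirical sanity check (a sketch of my own, not part of the question; the function name and parameters are made up), one can simulate the process and tabulate how often the empty-bin count drops from each observed state $k$ — the frequency should come out near $k/n$, with no dependence on the history:

```python
import random

def empty_bin_transitions(n, steps, runs, seed=0):
    """Throw balls uniformly at random into n bins and tabulate, for each
    observed count k of empty bins, how often the next throw lands in an
    empty bin (dropping the count to k - 1).  If X_t is Markov with the
    transition described above, the frequency should be close to k / n."""
    rng = random.Random(seed)
    hits = {}    # k -> steps from state k that decreased the empty count
    visits = {}  # k -> total steps taken from state k
    for _ in range(runs):
        bins = [0] * n
        k = n  # every bin starts empty
        for _ in range(steps):
            i = rng.randrange(n)
            visits[k] = visits.get(k, 0) + 1
            if bins[i] == 0:
                hits[k] = hits.get(k, 0) + 1
                k -= 1
            bins[i] += 1
    return {k: hits.get(k, 0) / v for k, v in visits.items()}

est = empty_bin_transitions(n=10, steps=30, runs=20000)
# est[k] should be close to k / 10 for every state k that was visited.
```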

Are there good, more surprising examples? I apologize if this question is vague.

Best Answer

I believe that if $(X_n)$ is a biased simple random walk on $\{-N,\dots,N\}$ (steps $+1$ with probability $p$ and $-1$ with probability $q=1-p$), then $(|X_n|)$ is a Markov chain, even though the walk itself is biased.
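This can be checked empirically. Since the answer does not specify the boundary behaviour at $\pm N$, the sketch below (function name and parameters are my own) simulates the walk on all of $\mathbb{Z}$, where the phenomenon is cleanest: conditionally on the trajectory of $|X|$ with $|X_n|=k>0$, the sign of $X_n$ is $+$ with probability $p^k/(p^k+q^k)$, so the chance that $|X|$ increases is $(p^{k+1}+q^{k+1})/(p^k+q^k)$, a function of the current value $k$ alone:

```python
import random

def abs_walk_up_freqs(p, runs, steps, seed=1):
    """Biased simple random walk on Z started at 0 (step +1 w.p. p, else -1).
    For each pair (previous |X|, current |X|), record how often the next
    value of |X| goes up by 1.  If |X_n| is Markov, the frequency depends
    only on the current |X|, not on the previous one."""
    rng = random.Random(seed)
    stats = {}  # (prev_abs, cur_abs) -> (up-moves, total)
    for _ in range(runs):
        x, prev_abs = 0, None
        for _ in range(steps):
            cur_abs = abs(x)
            x += 1 if rng.random() < p else -1
            if prev_abs is not None and cur_abs > 0:
                ups, total = stats.get((prev_abs, cur_abs), (0, 0))
                stats[(prev_abs, cur_abs)] = (ups + (abs(x) == cur_abs + 1),
                                              total + 1)
            prev_abs = cur_abs
    return {key: ups / total for key, (ups, total) in stats.items()}

p, q = 0.7, 0.3
est = abs_walk_up_freqs(p, runs=20000, steps=40)
theory = (p**3 + q**3) / (p**2 + q**2)  # predicted P(|X| goes up | |X| = 2)
# est[(1, 2)] and est[(3, 2)] should both be close to `theory`, despite the
# walk having arrived at |X| = 2 from different directions.
```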
