Toy Model of Heat Death – Probability and Stochastic Processes

pr.probability, stochastic-processes

Motivation:

This is a toy model of the principle that a closed system always evolves towards the distribution of maximal entropy, in which no further transfer of heat/energy is possible.

Problem setup:

Fix a positive integer $N$, and denote by $[N]$ the set $\{1, \dots, N\}$.

Let $\mathcal L := [N] \times [N]$ be a 2D lattice.

We model a flow of heat as follows: initially, at time $0$, all heat is concentrated at $(1, 1)$. At each time step $t \in \mathbb Z_+$, an element of $\mathcal L$ is picked uniformly at random; that site and all of its immediate neighbours to the east, west, north and south then average their heat.

Formally, we have a sequence of iid uniformly distributed $\mathcal L$-valued random variables $\varepsilon_n$, for $n \in \mathbb Z_{\geq 0}$, and a sequence $X_n$ of functions $\Omega \times \mathcal L \to [0, 1]$, defined as follows:

$X_0 = \mathbf 1_{(1, 1)}$, almost surely.

Assume now $X_0, \dots, X_n$ have already been defined.

Write $X_n = \sum_{z \in \mathcal L} \lambda_z \mathbf 1_{z}$ for (random) $\lambda_z \in [0, 1]$ with $\sum_{z \in \mathcal L} \lambda_z = 1$, and write $\mathcal N(\varepsilon_n)$ for the set consisting of $\varepsilon_n$ together with its neighbours (between two and four of them, depending on the location of $\varepsilon_n$).

Then define
$$X_{n+1} := \left[\sum_{x \in \mathcal N(\varepsilon_n)} \sum_{y \in \mathcal N(\varepsilon_n)} \frac{\lambda_y}{|\mathcal N(\varepsilon_n)|}\, \mathbf 1_x \right] + \sum_{z \in \mathcal L \setminus \mathcal N(\varepsilon_n)} \lambda_z \mathbf 1_z \;,$$
where $|S|$ denotes the cardinality of a finite set $S$.
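To make the dynamics concrete, here is a minimal simulation sketch of one update step (not part of the original problem; the names `neighbourhood` and `average_step` are mine, and the lattice is 0-indexed):

```python
import numpy as np

def neighbourhood(x, N):
    """N(x): the site x together with its immediate lattice neighbours (3 to 5 sites)."""
    i, j = x
    candidates = [(i, j), (i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for (a, b) in candidates if 0 <= a < N and 0 <= b < N]

def average_step(X, x):
    """One update X_n -> X_{n+1}: the weights on N(x) are replaced by their average."""
    N = X.shape[0]
    nbrs = neighbourhood(x, N)
    avg = sum(X[z] for z in nbrs) / len(nbrs)
    X = X.copy()
    for z in nbrs:
        X[z] = avg
    return X
```

Note that each step preserves the total heat $\sum_z \lambda_z = 1$, since the neighbourhood's mass is redistributed but not changed.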

Question: Let $Y$ be the “uniform distribution of heat” given by $Y := \underset{z \in \mathcal L}{\sum} \frac{1}{|\mathcal L|} \mathbf 1_z$. Is it true that almost surely, $X_n \to Y$ uniformly?

An affirmative answer would mean that the system almost surely evolves toward the distribution in which no further transfer of heat is possible.
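As a quick sanity check on the question, one can run the chain and track the sup-norm distance to $Y$. This usage sketch reuses `average_step` from the sketch above; the parameters $N = 10$ and $50{,}000$ steps are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10
X = np.zeros((N, N))
X[0, 0] = 1.0                              # X_0 = indicator of the site (1, 1)
uniform = 1.0 / N**2                       # weight of each site under Y

for n in range(1, 50_001):
    x = tuple(rng.integers(0, N, size=2))  # epsilon_n: uniform on the lattice
    X = average_step(X, x)                 # from the sketch above
    if n % 10_000 == 0:
        print(n, np.abs(X - uniform).max())  # sup-norm distance to Y
```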

Best Answer

The Ehrenfest model (in discrete time, for simplicity) is just a Markov chain with the finite state space $\{0,1,\dots, N\}$ and the transition probabilities $$ p(k,k-1)=k/N, \quad p(k,k+1)=1-k/N, $$ described by a single transition (averaging) operator $P$. Its stationary distribution $m$ is the binomial distribution with parameters $N$ and $\frac12$, and $\frac12\,\theta(P^n+P^{n+1}) \to m$ for any initial distribution $\theta$. [Since the operator $P$ has period 2, one has to take the average of $\theta P^n$ and $\theta P^{n+1}$.]
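For illustration, here is a small numerical check of these facts about the Ehrenfest chain (the parameter $N = 20$ and the number of steps are arbitrary choices of mine):

```python
import numpy as np
from math import comb

N = 20
P = np.zeros((N + 1, N + 1))
for k in range(N + 1):
    if k > 0:
        P[k, k - 1] = k / N               # p(k, k-1) = k/N
    if k < N:
        P[k, k + 1] = 1 - k / N           # p(k, k+1) = 1 - k/N

m = np.array([comb(N, k) for k in range(N + 1)]) / 2.0**N   # Binomial(N, 1/2)
assert np.allclose(m @ P, m)              # m is stationary

theta = np.zeros(N + 1)
theta[0] = 1.0                            # initial distribution: all mass at state 0
for n in range(500):
    avg = 0.5 * (theta + theta @ P)       # (1/2) theta (P^n + P^{n+1})
    theta = theta @ P
print(np.abs(avg - m).sum())              # total variation gap, tends to 0
```

The averaging of consecutive powers matters here: because of the period-2 structure, $\theta P^n$ alone oscillates between distributions supported on even and odd states.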

In your situation there is a family of averaging operators $P_x$ indexed by the points of the state space $X$, and these operators have a unique common invariant measure $m$ (the uniform distribution on $X$). You take a sequence $\boldsymbol x=(x_1,x_2,\dots)$ of iid uniformly distributed $X$-valued random variables and ask whether, for a given initial distribution $\theta$ on $X$, the sequence $$ \theta P_{x_1} P_{x_2} \dots P_{x_n} $$ converges to $m$ almost surely. Note that since we are talking about measures on a finite state space, all reasonable kinds of convergence are equivalent (in particular, the $\ell^1$ convergence in the total variation norm $\|\cdot\|$ and the $\ell^\infty$ "uniform" convergence).
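One way to see this reformulation concretely is to build the matrices $P_x$ and verify that the uniform distribution is a common invariant measure. A sketch, under the row-vector convention $\theta \mapsto \theta P_x$ and with a hypothetical $5 \times 5$ lattice:

```python
import numpy as np

N = 5
sites = [(i, j) for i in range(N) for j in range(N)]
idx = {z: k for k, z in enumerate(sites)}

def nbhd(x):
    """The averaging set N(x): x and its lattice neighbours."""
    i, j = x
    cand = [(i, j), (i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [z for z in cand if 0 <= z[0] < N and 0 <= z[1] < N]

def P_op(x):
    """Matrix of the averaging operator P_x acting on row vectors (measures)."""
    P = np.eye(N * N)
    nb = [idx[z] for z in nbhd(x)]
    for r in nb:
        P[r, :] = 0.0
        P[r, nb] = 1.0 / len(nb)          # mass on N(x) is averaged
    return P

m = np.full(N * N, 1.0 / (N * N))         # uniform distribution on X
assert all(np.allclose(m @ P_op(x), m) for x in sites)   # common invariant measure
```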

Let $$ f_n(\boldsymbol x) = \| \theta P_{x_1} P_{x_2} \dots P_{x_n} - m \| \;. $$ The sequence $f_n$ is non-increasing, and therefore convergent. By Kolmogorov's 0-1 law its limit $f_\infty$ is almost surely constant. Let $k$ be the minimal number such that for any $x\in X$ there is a sequence $x_1,x_2,\dots, x_k\in X$ with $$ \text{supp}\,\delta_x P_{x_1} P_{x_2} \dots P_{x_k} = X \;. $$ Then there is $\varepsilon > 0$ such that $$ \mathbf E [ f_{n+k} | f_n ] \le (1-\varepsilon) f_n \qquad \forall\,n\ge 0 \;, $$ whence $f_\infty=0$.
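Continuing the previous sketch (it reuses `P_op`, `sites`, `idx` and `m` from above), one can observe both claims numerically: along a sample path the total-variation distance $f_n$ never increases, and it drifts to $0$:

```python
rng = np.random.default_rng(1)
theta = np.zeros(N * N)
theta[idx[(0, 0)]] = 1.0            # initial distribution: all mass at one site

f_prev = np.inf
for n in range(20_000):
    x = sites[rng.integers(len(sites))]
    theta = theta @ P_op(x)
    f = np.abs(theta - m).sum()     # f_n in the total variation (l^1) norm
    assert f <= f_prev + 1e-12      # monotonicity: f_{n+1} <= f_n
    f_prev = f
print(f_prev)                       # essentially 0 after enough steps
```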

EDIT. The fact that the sequence $f_n$ is non-increasing is a consequence of the following inequality: $$ \frac1d \sum_{i=1}^d |\theta_i - C| \ge \left| \frac1d \sum_i \theta_i - C \right| \;. $$ Here $d$ is the cardinality of the averaging set (i.e., between 3 and 5 in your example), and $C=1/N^2$ is the common value of the weights of the uniform distribution. After removing $C$ and the division by $d$, the above inequality amounts to the well-known $$ \sum_i |\theta_i| \ge \left| \sum_i\theta_i \right| \;. $$ The expectation bound is just a constructive version of this inequality: if $f_n(\boldsymbol x)=F>0$, then there are two points $z_1,z_2\in X$ such that $$ \theta P_{x_1} P_{x_2} \dots P_{x_n}(z_i) - m(z_i)\;, \qquad i=1,2\;, $$ have absolute values comparable with $F$ and opposite signs. Therefore, by the definition of $k$, there is at least one choice of $x_{n+1},\dots, x_{n+k}$ with $$ \|\theta P_{x_1} P_{x_2} \dots P_{x_{n+k}} - m \| < (1-\varepsilon) \cdot \|\theta P_{x_1} P_{x_2} \dots P_{x_n} - m \| \;, $$ where $\varepsilon$ is an appropriate constant (which depends only on $N$).
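The basic inequality is easy to spot-check numerically for arbitrary inputs (a throwaway sketch of mine; the range $3 \le d \le 5$ and the value $C = 1/N^2$ with $N = 5$ are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
C = 1 / 25                          # C = 1/N^2, e.g. with N = 5
for _ in range(1000):
    d = rng.integers(3, 6)          # |N(x)| lies between 3 and 5
    theta = rng.random(d)           # arbitrary nonnegative weights
    lhs = np.abs(theta - C).sum() / d
    rhs = abs(theta.sum() / d - C)
    assert lhs >= rhs - 1e-12       # (1/d) sum |theta_i - C| >= |(1/d) sum theta_i - C|
print("inequality holds on all samples")
```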
