The Ehrenfest model (in discrete time, for simplicity) is just a Markov chain with the finite state space $\{0,1,\dots, N\}$ and the transition probabilities
$$
p(k,k-1)=k/N, \quad p(k,k+1)=1-k/N
$$
described by a single transition (averaging) operator $P$. Its stationary distribution $m$ is the binomial distribution with parameters $N$ and $\frac12$, and $\frac12 (\theta P^n+ \theta P^{n+1}) \to m$ for any initial distribution $\theta$. [Since the chain has period 2, one has to average $\theta P^n$ and $\theta P^{n+1}$.]
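As a quick sanity check, here is a small simulation of the Ehrenfest chain (with a hypothetical choice $N=10$; any $N$ works) confirming that the averaged powers converge to the binomial distribution:

```python
import numpy as np
from math import comb

N = 10  # number of balls (a hypothetical choice; any N works)

# Transition matrix of the discrete-time Ehrenfest chain on {0, ..., N}:
# p(k, k-1) = k/N and p(k, k+1) = 1 - k/N.
P = np.zeros((N + 1, N + 1))
for k in range(N + 1):
    if k > 0:
        P[k, k - 1] = k / N
    if k < N:
        P[k, k + 1] = 1 - k / N

# Stationary distribution m: binomial with parameters N and 1/2.
m = np.array([comb(N, k) for k in range(N + 1)], dtype=float) / 2**N

# Start from a point mass at 0.  Because the chain has period 2,
# average two consecutive powers: (theta P^n + theta P^{n+1}) / 2 -> m.
theta = np.zeros(N + 1)
theta[0] = 1.0
for _ in range(200):
    theta = theta @ P
avg = (theta + theta @ P) / 2
print(np.abs(avg - m).max())  # essentially 0
```

Note that $\theta P^n$ itself does not converge here: started from $0$, it alternates between the even and odd states, which is exactly why the averaging is needed.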
In your situation there is a family of averaging operators $P_x$ indexed by points $x$ of the state space $X$; these operators have a unique common invariant measure $m$ (the uniform distribution on $X$). You take a sequence $\boldsymbol x=(x_1,x_2,\dots)$ of iid uniformly distributed $X$-valued random variables and ask whether, given an initial distribution $\theta$ on $X$, the sequence
$$
\theta P_{x_1} P_{x_2} \dots P_{x_n}
$$
converges to $m$ almost surely. Note that since we are talking about measures on a finite state space, all reasonable kinds of convergence are equivalent (in particular, the $\ell^1$ convergence in the total variation norm $\|\cdot\|$ and the $\ell^\infty$ "uniform" convergence).
Let
$$
f_n(\boldsymbol x) = \| \theta P_{x_1} P_{x_2} \dots P_{x_n} - m \| \;.
$$
The sequence $f_n$ is non-increasing, and therefore convergent. By Kolmogorov's 0-1 law its limit $f_\infty$ is almost surely constant. Let $k$ be the minimal number such that for any $x\in X$ there is a sequence $x_1,x_2,\dots, x_k\in X$ with
$$
\text{supp}\,\delta_x P_{x_1} P_{x_2} \dots P_{x_k} = X \;.
$$
Then there is $\varepsilon > 0$ such that
$$
\mathbf E [ f_{n+k} | f_n ] \le (1-\varepsilon) f_n \qquad \forall\,n\ge 0 \;,
$$
whence $f_\infty=0$.
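The whole argument is easy to observe numerically. The operators $P_x$ are not pinned down above, so as a purely illustrative sketch take $X=\mathbb Z_n$ and let $P_x$ be the doubly stochastic operator that replaces the mass on the set $S(x)=\{x,x+1,x+2\} \pmod n$ by its average (any family of this kind with the uniform distribution as common invariant measure behaves the same way):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 12  # size of the state space X (hypothetical)
d = 3   # size of each averaging set (hypothetical)

def P(x):
    """Averaging operator P_x: replace the mass on S(x) = {x, x+1, x+2}
    (mod n) by its average and leave everything else alone.  Each P_x is
    doubly stochastic, so the uniform distribution is invariant."""
    M = np.eye(n)
    S = [(x + j) % n for j in range(d)]
    M[np.ix_(S, S)] = 1.0 / d
    return M

m = np.full(n, 1.0 / n)  # the common invariant measure
theta = np.zeros(n)
theta[0] = 1.0           # initial point mass

# f_k = ||theta P_{x_1} ... P_{x_k} - m||_1 for iid uniform x_k:
# non-increasing, and it decreases to 0 almost surely.
f = [np.abs(theta - m).sum()]
for _ in range(1000):
    theta = theta @ P(rng.integers(n))
    f.append(np.abs(theta - m).sum())

print(f[0], f[-1])  # f is (weakly) decreasing toward 0
```

The monotonicity of `f` is deterministic (it holds for every realization), while the decay to $0$ is the almost sure statement.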
EDIT. The fact that the sequence $f_n$ is non-increasing is a consequence of the following inequality:
$$
\frac1d \sum_{i=1}^d |\theta_i - C| \ge \left| \frac1d \sum_i \theta_i - C \right| \;.
$$
Here $d$ is the cardinality of the averaging set (i.e., between 3 and 5 in your example), and $C=1/N^2$ is the common value of the weights of the uniform distribution. After removing $C$ and the division by $d$ the above inequality amounts to the well-known
$$
\sum_i |\theta_i| \ge \left| \sum_i\theta_i \right| \;.
$$
The expectation bound is just a constructive version of this inequality: if $f_n(\boldsymbol x)=F>0$, then there are two points $z_1,z_2\in X$ such that
$$
\theta P_{x_1} P_{x_2} \dots P_{x_n}(z_i) - m(z_i) \qquad i=1,2\;,
$$
have absolute values comparable with $F$ and opposite signs. Therefore by the definition of $k$ there is at least one choice of $x_{n+1},\dots, x_{n+k}$ with
$$
\begin{aligned}
&\|\theta P_{x_1} P_{x_2} \dots P_{x_{n+k}} - m \| \\
&< (1-\varepsilon) \cdot \|\theta P_{x_1} P_{x_2} \dots P_{x_n} - m \| \;,
\end{aligned}
$$
where $\varepsilon$ is an appropriate constant (which depends only on $N$).
$\newcommand\Om\Omega\newcommand\om\omega\newcommand\R{\mathbb R}$This is indeed straightforward. Spelling out
$$\sup_{x\in\R}|F_n(x)-F(x)|\to0 \text{ a.s.}, $$
we see that there is a subset $N$ of $\Om$ of outer probability $0$ such that for all $\om\in\Om_0:=\Om\setminus N$ we have
$$\sup_{x\in\R}\Big|\frac1n\sum_{j=1}^n 1(X_j(\om)\le x)-F(x)\Big|\to0$$
and hence
$$\frac1n\sum_{j=1}^n 1(X_j(\om)\le X_0(\om))\to F(X_0(\om)).$$
So, indeed we have
$$F_n(X_0)\to F(X_0)$$
a.s. and hence in law. Also, if $F$ is continuous, then the random variable $F(X_0)$ has the standard uniform distribution.
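A quick numerical illustration (with a hypothetical choice of $F$ as the standard normal cdf; any continuous $F$ works):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
n = 200_000

def F(x):
    """Standard normal cdf (a hypothetical choice of continuous F)."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

x0 = rng.standard_normal()       # X_0
sample = rng.standard_normal(n)  # X_1, ..., X_n, iid with cdf F

# Empirical cdf at X_0: F_n(X_0) = (1/n) #{j <= n : X_j <= X_0}.
Fn_x0 = np.mean(sample <= x0)
print(abs(Fn_x0 - F(x0)))        # O(n^{-1/2}), small

# For continuous F, F(X_0) has the standard uniform distribution:
# the mean of F over a fresh iid sample should be close to 1/2.
u = np.array([F(x) for x in rng.standard_normal(5000)])
print(u.mean())
```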
Best Answer
The answer is yes for $d=1$. Denoting by $F$ the cdf of $X^1$, you have that $\hat{F}_n(t)$ converges uniformly to $F(t)$ by the Glivenko–Cantelli theorem. You want to show that $\hat{F}_n(t/\alpha_n)$ converges uniformly to $F(t/\alpha)$. You can check this on the sets $(-\infty,t)$. You get $$ \left|\int_{-\infty}^{t/\alpha_n} d\hat{P}_n-\int_{-\infty}^{t/\alpha} dP\right|\le \left|\int_{-\infty}^{t/\alpha} (d\hat{P}_n-dP)\right|+\left|\int_{t/\alpha}^{t/\alpha_n} (d\hat{P}_n-dP)\right|+\left|\int_{t/\alpha}^{t/\alpha_n} dP\right|$$
Now use the uniform convergence of $\hat{F}_n$ to $F$ to make the first two terms as small as you want. Then use the convergence $t/\alpha_n\rightarrow t/\alpha$ to make the third term small (assuming that $\alpha\neq 0$).
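Here is a sketch of the argument in code; the standard normal $F$ and the particular sequence $\alpha_n\to\alpha$ are assumptions made purely for illustration:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
n = 200_000

def F(x):
    """Standard normal cdf (hypothetical choice of F)."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

sample = np.sort(rng.standard_normal(n))  # X_1, ..., X_n, sorted
alpha = 2.0
alpha_n = alpha + 1.0 / sqrt(n)           # hypothetical estimates alpha_n -> alpha

# Empirical cdf via binary search: F_n(t) = #{j : X_j <= t} / n.
ts = np.linspace(-10.0, 10.0, 2001)
Fn_scaled = np.searchsorted(sample, ts / alpha_n, side="right") / n
Ftrue = np.array([F(t / alpha) for t in ts])

# sup over the grid of |F_n(t/alpha_n) - F(t/alpha)| is small: both the
# Glivenko-Cantelli terms and the continuity term tend to 0.
print(np.abs(Fn_scaled - Ftrue).max())
```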