The limit is $1/\sqrt6\approx0.40824829$.
To see this, consider i.i.d. random variables $X_i$ and $Y_i$ uniform on $[0,1]$ and write the quantity to be computed as
$I_k=\mathrm E\left(\sqrt{Z_k}\right)$ with
$$
Z_k=\frac1k\sum_{i=1}^k(X_i-Y_i)^2.
$$
By the strong law of large numbers for i.i.d. bounded random variables, $Z_k\to z$ almost surely and in every $L^p$ as $k\to\infty$, with $z=\mathrm E(Z_1)$. Since $Z_k$ is bounded and $t\mapsto\sqrt t$ is continuous, $I_k=\mathrm E\left(\sqrt{Z_k}\right)\to \sqrt{z}$. Numerically,
$$
z=\iint_{[0,1]^2}(x-y)^2\,\mathrm{d}x\,\mathrm{d}y=2\int_0^1x^2\,\mathrm{d}x-2\left(\int_0^1x\,\mathrm{d}x\right)^2=\frac23-2\left(\frac12\right)^2=\frac16.
$$
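A quick Monte Carlo sanity check of this limit, as a sketch using only the standard library (the values of $k$ and the number of trials are arbitrary choices):

```python
import math
import random

random.seed(0)

def sqrt_Z(k: int) -> float:
    """One sample of sqrt(Z_k) = sqrt((1/k) * sum_i (X_i - Y_i)^2)."""
    s = sum((random.random() - random.random()) ** 2 for _ in range(k))
    return math.sqrt(s / k)

k, trials = 1000, 2000
# Estimate I_k = E[sqrt(Z_k)] by averaging over independent trials
I_k = sum(sqrt_Z(k) for _ in range(trials)) / trials
print(I_k, 1 / math.sqrt(6))  # the two values should be close
```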
Edit: this answers a different question, asked by the OP in the comments.
Consider the maximum of $n\gg1$ independent copies of $kZ_k$ with $k\gg1$ and call $M_{n,k}$ its square root. A heuristic to estimate the typical behaviour of $M_{n,k}$ is as follows.
By the central limit theorem (and in a loose sense), $Z_k\approx z+N\sqrt{v/k}$ where $v$ is the variance of $Z_1$ and $N$ is a standard Gaussian random variable. In particular, for every given nonnegative $s$,
$$
\mathrm P\left(Z_k\ge z+s\right)\approx\mathrm P\left(N^2\ge ks^2/v\right).
$$
Furthermore, the typical size of the maximum of $n$ independent copies of $Z_k$ is $z+s$ where $s$ solves $\mathrm P(Z_k\ge z+s)\approx1/n$. Choose $q(n)$ such that $\mathrm P(N\ge q(n))=1/n$, that is, $q(n)$ is a quantile of the standard Gaussian distribution. Then, the typical size of $M_{n,k}^2$ is $k(z+s)$ where $s$ solves $ks^2/v=q(n)^2$, that is, $s=q(n)\sqrt{v/k}$. Finally,
$$
M_{n,k}\approx \sqrt{kz+q(n)\sqrt{kv}}.
$$
Numerically, $z=1/6$, $v=7/180$, and you are interested in $k=1000$. For $n=10{,}000$, $q(n)=3.719$ yields a typical size $M_{n,k}\approx13.78$, and for $n=100{,}000$, $q(n)=4.265$ hence $M_{n,k}\approx13.90$ (these should be compared to the values you observed).
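These numbers can be reproduced directly from the heuristic formula; a sketch, using the standard library's `statistics.NormalDist().inv_cdf` to get the Gaussian quantile $q(n)$:

```python
import math
from statistics import NormalDist

z, v, k = 1 / 6, 7 / 180, 1000

def typical_max(n: int) -> float:
    """Heuristic typical size of M_{n,k}: sqrt(k*z + q(n)*sqrt(k*v))."""
    q = NormalDist().inv_cdf(1 - 1 / n)  # q(n) such that P(N >= q(n)) = 1/n
    return math.sqrt(k * z + q * math.sqrt(k * v))

print(typical_max(10_000))   # approximately 13.78
print(typical_max(100_000))  # approximately 13.90
```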
To make such estimates rigorous, and to understand why, in a way, $M_{n,k}$ concentrates around the typical value computed above, see here.
For any two independent nonnegative random variables $X$ and $Y$, the random variable $Z=\min(X,Y)$ has expectation
$$
E[Z]=\int_0^\infty P[Z\geqslant t]\,\mathrm dt
=\int_0^\infty P[X\geqslant t]\,P[Y\geqslant t]\,\mathrm dt.
$$
In your case, one can decompose the integral on the RHS as the sum of the integrals over the intervals $(0,1)$, $(1,2)$, and $(2,3)$, and compute $P[X\geqslant t]$ and $P[Y\geqslant t]$ separately for $t$ in each of these intervals. This yields:
- If $t$ is in $(0,1)$, then $P[X\geqslant t]=P[Y\geqslant t]=1$
- If $t$ is in $(1,2)$, then $P[X\geqslant t]=\frac{3-t}2$ and $P[Y\geqslant t]=\frac23$
- If $t$ is in $(2,3)$, then $P[X\geqslant t]=\frac{3-t}2$ and $P[Y\geqslant t]=\frac13$
Thus, $E[Z]=$ $1$ $+$ $______$ $+$ $______$ $=$ $______$.
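As a cross-check of the method (without giving away the blanks), one can compare the tail-product integral with a direct simulation. This sketch assumes, as inferred from the tails listed above, that $X$ is uniform on $[1,3]$ and $Y$ is uniform on $\{1,2,3\}$:

```python
import random

random.seed(0)

# Tail probabilities read off from the three intervals above
# (assumption: X ~ Uniform[1, 3], Y uniform on {1, 2, 3})
def tail_X(t: float) -> float:
    return 1.0 if t <= 1 else (3 - t) / 2 if t <= 3 else 0.0

def tail_Y(t: float) -> float:
    return 1.0 if t <= 1 else 2 / 3 if t <= 2 else 1 / 3 if t <= 3 else 0.0

# E[Z] = integral over (0, 3) of P[X >= t] * P[Y >= t] dt, midpoint rule
m = 30_000
integral = sum(tail_X((i + 0.5) * 3 / m) * tail_Y((i + 0.5) * 3 / m)
               for i in range(m)) * 3 / m

# Monte Carlo estimate of E[min(X, Y)] under the same assumptions
mc = sum(min(random.uniform(1, 3), random.randint(1, 3))
         for _ in range(200_000)) / 200_000

print(integral, mc)  # the two estimates should agree
```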
Best Answer
I started with two dice $(n=6)$ and made a table $$\begin{array}{|c|c|c|c|c|c|c|}\hline \text{die 1 / die 2} & 1 & 2 & 3 & 4 & 5 & 6 \\ \hline\hline 1 & 0 & 1 & 2 & 3 & 4 & 5 \\ \hline 2 & 1 & 0 & 1 & 2 & 3 & 4 \\ \hline 3 & 2 & 1 & 0 & 1 & 2 & 3 \\ \hline 4 & 3 & 2 & 1 & 0 & 1 & 2 \\ \hline 5 & 4 & 3 & 2 & 1 & 0 & 1 \\ \hline 6 & 5 & 4 & 3 & 2 & 1 & 0 \\ \hline\end{array}$$
For $|X_1-X_2|$ I got the following pmf
$f_{|X_1-X_2|}(x)=\begin{cases} \frac{6}{36}, & x=0 \\ \frac{2(6-x)}{36}, & x\in\{1,2,3,4,5\} \\ 0, & \text{elsewhere}\end{cases}$
From this smaller example I deduced the case $n=100$ (note the normalization by $100^2$, the total number of equally likely pairs):
$f_{|X_1-X_2|}(x)=\begin{cases} \frac{100}{100^2}, & x=0 \\ \frac{2(100-x)}{100^2}, & x\in\{1,2,\ldots,99\} \\ 0, & \text{elsewhere}\end{cases}$
Thus the expected value is
$$\mathbb E\left(|X_1-X_2| \right)=\sum_{x=1}^{99}x\cdot \frac{2(100-x)}{100^2}=\frac{3333}{100}=33.33$$
To calculate the sum by hand, it is useful to know that $\sum\limits_{x=1}^{n-1} x^2=\frac16\cdot (n-1)\cdot n\cdot (2n-1)$. This is one approach to get the expected value.
In general, for two independent discrete uniform random variables $X_1$ and $X_2$ with outcomes from $1$ to $n$, we have
$$\mathbb E\left(|X_1-X_2| \right)=\frac{n^2-1}{3n}.$$
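This closed form can be checked exactly against brute force over all $n^2$ equally likely pairs; a short sketch using `fractions.Fraction` to keep the arithmetic exact:

```python
from fractions import Fraction

def expected_abs_diff(n: int) -> Fraction:
    """E|X1 - X2| for X1, X2 independent uniform on {1, ..., n}."""
    total = sum(abs(i - j) for i in range(1, n + 1) for j in range(1, n + 1))
    return Fraction(total, n * n)

for n in (6, 100):
    # compare brute force with the closed form (n^2 - 1) / (3n)
    assert expected_abs_diff(n) == Fraction(n * n - 1, 3 * n)

print(expected_abs_diff(6), expected_abs_diff(100))  # 35/18 and 3333/100
```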