The asymptotic value is $1/\sqrt6\approx0.40824829$.
To see this, consider i.i.d. random variables $X_i$ and $Y_i$ uniform on $[0,1]$ and write the quantity to be computed as
$I_k=\mathrm E\left(\sqrt{Z_k}\right)$ with
$$
Z_k=\frac1k\sum_{i=1}^k(X_i-Y_i)^2.
$$
By the strong law of large numbers for i.i.d. bounded random variables, when $k\to\infty$, $Z_k\to z$ almost surely and in every $L^p$, with $z=\mathrm E(Z_1)$. In particular, $I_k\to \sqrt{z}$. Numerically,
$$
z=\iint_{[0,1]^2}(x-y)^2\,\mathrm{d}x\,\mathrm{d}y=2\int_0^1x^2\,\mathrm{d}x-2\left(\int_0^1x\,\mathrm{d}x\right)^2=\frac23-\frac12=\frac16.
$$
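As a sanity check, here is a minimal Monte Carlo sketch of this convergence (the helper name `estimate_I` and the sample sizes are my own choices, not anything from the argument above):

```python
# Minimal Monte Carlo sketch: estimate I_k = E(sqrt(Z_k)) and compare
# with the limit 1/sqrt(6). Sample sizes are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

def estimate_I(k, samples=10_000):
    x = rng.random((samples, k))          # X_1, ..., X_k uniform on [0,1]
    y = rng.random((samples, k))          # Y_1, ..., Y_k uniform on [0,1]
    z = np.mean((x - y) ** 2, axis=1)     # one realisation of Z_k per row
    return np.sqrt(z).mean()              # Monte Carlo estimate of E(sqrt(Z_k))

for k in (1, 10, 100, 1000):
    print(f"k = {k:4d}: I_k = {estimate_I(k):.5f}")
print(f"limit: 1/sqrt(6) = {1 / np.sqrt(6):.8f}")
```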
Edit (This answers a different question asked by the OP in the comments.)
Consider the maximum of $n\gg1$ independent copies of $kZ_k$ with $k\gg1$, and call $M_{n,k}$ its square root. A heuristic to estimate the typical behaviour of $M_{n,k}$ is as follows.
By the central limit theorem (applied here in a loose sense), $Z_k\approx z+N\sqrt{v/k}$, where $v$ is the variance of $Z_1$ and $N$ is a standard Gaussian random variable. In particular, for every given nonnegative $s$,
$$
\mathrm P\left(Z_k\ge z+s\right)\approx\mathrm P\left(N\ge s\sqrt{k/v}\right).
$$
Furthermore, the typical size of the maximum of the $n$ copies of $Z_k$ is $z+s$, where $s$ solves $\mathrm P(Z_k\ge z+s)\approx1/n$. Choose $q(n)$ such that $\mathrm P(N\ge q(n))=1/n$, that is, $q(n)$ is the upper $1/n$ quantile of the standard Gaussian distribution. Then the typical size of $M_{n,k}^2$ is $k(z+s)$, where $s$ solves $ks^2/v=q(n)^2$. Finally,
$$
M_{n,k}\approx \sqrt{kz+q(n)\sqrt{kv}}.
$$
Numerically, $z=1/6$ and $v=\mathrm E\left(Z_1^2\right)-z^2=\frac1{15}-\frac1{36}=\frac7{180}$, and you are interested in $k=1000$. For $n=10\,000$, $q(n)=3.719$ yields a typical size $M_{n,k}\approx13.78$, and for $n=100\,000$, $q(n)=4.265$, hence $M_{n,k}\approx13.90$ (these should be compared to the values you observed).
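If it helps, here is a short sketch evaluating this heuristic numerically; it uses `scipy.stats.norm.isf` for the Gaussian quantile $q(n)$ (the script itself is mine, not part of the argument):

```python
# Sketch: evaluate M_{n,k} = sqrt(k*z + q(n)*sqrt(k*v)) with z = 1/6, v = 7/180.
import numpy as np
from scipy.stats import norm

z, v, k = 1 / 6, 7 / 180, 1000
for n in (10_000, 100_000):
    q = norm.isf(1 / n)                       # q(n) such that P(N >= q(n)) = 1/n
    M = np.sqrt(k * z + q * np.sqrt(k * v))   # heuristic typical size of M_{n,k}
    print(f"n = {n:6d}: q(n) = {q:.3f}, M = {M:.2f}")
```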
To make such estimates rigorous, and to understand why, in a way, $M_{n,k}$ concentrates around the typical value we computed above, see here.
Byron has already answered your question, but I will attempt to provide a detailed solution...
Let $X$ be a random variable uniformly distributed over $[0,L]$, i.e., the probability density function of $X$ is the following
$$f_X (x) = \begin{cases} \frac{1}{L} & \text{if } x \in [0,L]\\ 0 & \text{otherwise}\end{cases}$$
Let us randomly pick two points in $[0,L]$ independently. Let us denote those by $X_1$ and $X_2$, which are random variables distributed according to $f_X$. The distance between the two points is a new random variable
$$Y = |X_1 - X_2|$$
Hence, we would like to find the expected value $\mathbb{E}(Y) = \mathbb{E}( |X_1 - X_2| )$. Let us introduce the function $g$
$$g (x_1,x_2) = |x_1 - x_2| = \begin{cases} x_1 - x_2 & \text{if } x_1 \geq x_2\\ x_2 - x_1 & \text{if } x_2 > x_1\end{cases}$$
Since the two points are picked independently, the joint probability density function is the product of the pdf's of $X_1$ and $X_2$, i.e., $f_{X_1 X_2} (x_1, x_2) = f_{X_1} (x_1) f_{X_2} (x_2) = 1 / L^2$ in $[0,L] \times [0,L]$. Therefore, the expected value $\mathbb{E}(Y) = \mathbb{E}(g(X_1,X_2))$ is given by
$$\begin{align} \mathbb{E}(Y) &= \displaystyle\int_{0}^L\int_{0}^L g(x_1,x_2) \, f_{X_1 X_2} (x_1, x_2) \,d x_1 \, d x_2\\[6pt]
&= \frac{1}{L^2} \int_0^L\int_0^L |x_1 - x_2| \,d x_1 \, d x_2\\[6pt]
&= \frac{1}{L^2} \int_0^L\int_0^{x_1} (x_1 - x_2) \,d x_2 \, d x_1 + \frac{1}{L^2} \int_0^L\int_{x_1}^L (x_2 - x_1) \,d x_2 \, d x_1\\[6pt]
&= \frac{L^3}{6 L^2} + \frac{L^3}{6 L^2} = \frac{L}{3}\end{align}$$
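As a quick numerical sanity check (the value of $L$ below is an arbitrary choice of mine):

```python
# Quick check of E|X_1 - X_2| = L/3 for two independent uniform picks on [0, L].
import numpy as np

rng = np.random.default_rng(0)
L = 5.0                                   # arbitrary interval length
x1 = rng.uniform(0, L, 1_000_000)
x2 = rng.uniform(0, L, 1_000_000)
print(np.abs(x1 - x2).mean(), L / 3)      # both come out close to 1.6667
```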
Regarding the first part of your question: no closed formula is known for this, though approximations for small values of $m$ are given here.
Given that this question does not have a closed-form answer, and that the second question is even less analytically tractable, it is fairly clear that it will not have a closed-form answer either.
A simple argument via Jensen's inequality gives an upper bound; we first note that if $X = (X_1,\ldots, X_m)$ is uniform on $[0,1]^m$, then the marginals $X_i$ are independent and uniform on $[0,1]$. Then
$$ \begin{aligned} \mathbf E \left[ |X - Y|\right] & = \mathbf E \left[ \left( \sum_{i=1}^m(X_i - Y_i)^2 \right)^{1/2} \right] \\ & \leq \mathbf E \left[ \sum_{i=1}^m(X_i - Y_i)^2 \right]^{1/2} \\ & = \sqrt{m} \, \mathbf E[(X_1 - Y_1)^2]^{1/2} \\ & = \sqrt{\frac{m}6} \end{aligned} $$ where the inequality is Jensen's, applied to the concave function $t \mapsto \sqrt t$. For the last step, we note that in one dimension the difference $Z = X - Y$ of two independent uniform $[0,1]$ random variables has a triangular distribution:
$$f_Z(z) = \begin{cases} 1 + z \qquad \text{if $-1 \leq z < 0$}, \\ 1 - z \qquad \text{if $0 \leq z \leq 1$,} \end{cases} $$ from which one deduces $\mathbf E[|Z|] = \frac13$ and $\mathbf E[Z^2] = \frac16$.
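A two-line Monte Carlo check of these moments, if you want to convince yourself (the sample size is arbitrary):

```python
# Check the triangular-density moments E|Z| = 1/3 and E[Z^2] = 1/6.
import numpy as np

rng = np.random.default_rng(0)
z = rng.random(1_000_000) - rng.random(1_000_000)   # Z = X - Y
print(np.abs(z).mean(), (z ** 2).mean())            # roughly 0.3333 and 0.1667
```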
In fact, a more complex argument can be used to show that this upper bound is asymptotically tight, i.e. that the expected distance is $\sqrt{m/6}\,(1+o(1))$ as $m\to\infty$. Details of this are given here.
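To see the tightness numerically, here is a sketch comparing the Monte Carlo mean distance with the bound $\sqrt{m/6}$ (the dimensions and sample size are my own choices):

```python
# Sketch: E|X - Y| in [0,1]^m versus the Jensen bound sqrt(m/6);
# the ratio tending to 1 illustrates the asymptotic tightness.
import numpy as np

rng = np.random.default_rng(0)

def mean_distance(m, samples=10_000):
    x = rng.random((samples, m))
    y = rng.random((samples, m))
    return np.linalg.norm(x - y, axis=1).mean()

for m in (1, 2, 10, 100, 1000):
    d, bound = mean_distance(m), np.sqrt(m / 6)
    print(f"m = {m:4d}: E|X-Y| = {d:.4f}, bound = {bound:.4f}, ratio = {d / bound:.4f}")
```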
Again, linking to your second question: as your small boxes become smaller, the distance will also satisfy this asymptotic relationship, though you need to take care with the order of the limits (box size versus $m$).