Throughout this answer, we will fix $\alpha \in (0, 1]$.
Let $\mathcal{M}$ denote the set of all finite signed Borel measures on $[0, 1]$ and $\mathcal{P} \subset \mathcal{M}$ denote the set of all Borel probability measures on $[0, 1]$. Also, define the pairing $\langle \cdot, \cdot \rangle$ on $\mathcal{M}$ by
$$ \forall \mu, \nu \in \mathcal{M}: \qquad \langle \mu, \nu\rangle = \int_{[0,1]^2} |x - y|^{\alpha} \, \mu(dx)\nu(dy). $$
We also write $I(\mu) = \langle \mu, \mu\rangle$. Then we prove the following claim.
Proposition. If $\mu \in \mathcal{P}$ satisfies $\langle \mu, \delta_{t} \rangle = \langle \mu, \delta_{s} \rangle$ for all $s, t \in [0, 1]$, then $$I(\mu) = \max\{ I(\nu) : \nu \in \mathcal{P}\}.$$
We defer the proof of the proposition to the end and first reap its consequences.
When $\alpha = 1$, it is easy to see that the choice $\mu_1 = \frac{1}{2}(\delta_0 + \delta_1)$ works: indeed, $\langle \mu_1, \delta_t \rangle = \frac{1}{2}\bigl(t + (1-t)\bigr) = \frac{1}{2}$ for every $t \in [0, 1]$.
When $\alpha \in (0, 1)$, we can focus on $\mu_{\alpha}(dx) = f_{\alpha}(x) \, dx$ where $f_{\alpha}$ is given by
$$ f_{\alpha}(x) = \frac{1}{\operatorname{B}(\frac{1-\alpha}{2},\frac{1-\alpha}{2})} \cdot \frac{1}{(x(1-x))^{\frac{1+\alpha}{2}}}. $$
Indeed, for $y \in [0, 1]$, write $k = 2y - 1$ and apply the substitution $x = \frac{1+\sin\theta}{2}$ with $\theta \in (-\frac{\pi}{2}, \frac{\pi}{2})$ to obtain
$$ \langle \mu_{\alpha}, \delta_y \rangle = \int_{0}^{1} |y - x|^{\alpha} f_{\alpha}(x) \, dx = \frac{1}{\operatorname{B}(\frac{1-\alpha}{2},\frac{1-\alpha}{2})}\int_{-\frac{\pi}{2}}^{\frac{\pi}{2}} \left| \frac{\sin\theta - k}{\cos\theta} \right|^{\alpha} \, d\theta. $$
Then letting $\omega(t) = \operatorname{Leb}\left( \theta \in \left(-\frac{\pi}{2}, \frac{\pi}{2}\right) \ : \ \left| \frac{\sin\theta - k}{\cos\theta} \right| > t \right)$, we can check that $\omega(t) = \pi - 2\arctan(t)$, which is independent of $k$ (and hence of $y$). Indeed, $g(\theta) = \frac{\sin\theta - k}{\cos\theta}$ is strictly increasing on $(-\frac{\pi}{2}, \frac{\pi}{2})$ because $g'(\theta) = \frac{1 - k\sin\theta}{\cos^2\theta} > 0$, so the set in question is the complement of an interval $(\theta_-, \theta_+)$ with $g(\theta_{\pm}) = \pm t$; rewriting $\sin\theta \mp t\cos\theta = k$ as $\sqrt{1+t^2}\,\sin(\theta \mp \arctan t) = k$ shows that $\theta_+ - \theta_- = 2\arctan t$. Moreover,
$$ \langle \mu_{\alpha}, \delta_y \rangle = \frac{1}{\operatorname{B}(\frac{1-\alpha}{2},\frac{1-\alpha}{2})} \int_{0}^{\infty} \alpha t^{\alpha-1} \omega(t) \, dt = \frac{1}{\operatorname{B}(\frac{1-\alpha}{2},\frac{1-\alpha}{2})} \int_{0}^{\infty} \frac{2t^{\alpha}}{1+t^2} \, dt = \frac{\pi}{\operatorname{B}(\frac{1-\alpha}{2},\frac{1-\alpha}{2})\cos(\frac{\pi\alpha}{2})}, $$
where the first equality is the layer cake formula and the second follows from integration by parts.
Integrating both sides over $\mu_{\alpha}(dy)$, we see that this common value is also the value of $I(\mu_{\alpha})$, so the Proposition applies to $\mu_{\alpha}$.
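As a sanity check on the claim that $\omega(t) = \pi - 2\arctan(t)$ independently of $k$, here is a small numerical probe (the function names and the tested values of $k$ and $t$ are illustrative choices, not part of the argument):

```python
import math

def omega(k, t, tol=1e-12):
    """Measure of {theta in (-pi/2, pi/2) : |(sin(theta) - k)/cos(theta)| > t},
    found by locating the crossings of g(theta) = (sin(theta) - k)/cos(theta)
    with +t and -t via bisection (g is strictly increasing for |k| < 1)."""
    def g(theta):
        return (math.sin(theta) - k) / math.cos(theta)

    def solve(target):
        lo, hi = -math.pi / 2 + 1e-9, math.pi / 2 - 1e-9
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if g(mid) < target:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    theta_minus = solve(-t)  # g(theta_minus) = -t
    theta_plus = solve(t)    # g(theta_plus)  = +t
    # |g| > t exactly on (-pi/2, theta_minus) union (theta_plus, pi/2)
    return (theta_minus + math.pi / 2) + (math.pi / 2 - theta_plus)

# omega(k, t) should equal pi - 2*arctan(t) for every k in (-1, 1)
for k in (-0.9, -0.3, 0.0, 0.5, 0.99):
    for t in (0.1, 1.0, 7.5):
        assert abs(omega(k, t) - (math.pi - 2 * math.atan(t))) < 1e-6
print("omega(t) = pi - 2*arctan(t), independently of k")
```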
So it follows that
\begin{align*}
&\max \{ \mathbb{E} [ |X - Y|^{\alpha}] : X, Y \text{ i.i.d. and } \mathbb{P}(X \in [0, 1]) = 1 \} \\
&\hspace{1.5em}
= \max_{\mu \in \mathcal{P}} I(\mu)
= I(\mu_{\alpha})
= \frac{\pi}{\operatorname{B}(\frac{1-\alpha}{2},\frac{1-\alpha}{2})\cos(\frac{\pi\alpha}{2})}.
\end{align*}
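This value can also be probed by simulation: $f_{\alpha}$ is exactly the $\operatorname{Beta}(\frac{1-\alpha}{2}, \frac{1-\alpha}{2})$ density, so one can sample from $\mu_{\alpha}$ directly. The sketch below (with the illustrative choice $\alpha = 1/2$ and a sample size picked for speed) checks both that $\langle \mu_{\alpha}, \delta_y \rangle$ is constant in $y$ and that it matches the closed form:

```python
import math
import random

alpha = 0.5                  # illustrative exponent in (0, 1)
a = (1 - alpha) / 2          # f_alpha is the Beta(a, a) density
B = math.gamma(a) ** 2 / math.gamma(2 * a)          # Beta function B(a, a)
closed_form = math.pi / (B * math.cos(math.pi * alpha / 2))

random.seed(0)
samples = [random.betavariate(a, a) for _ in range(200_000)]

# <mu_alpha, delta_y> = E|y - X|^alpha should not depend on y
for y in (0.0, 0.25, 0.5, 0.9, 1.0):
    estimate = sum(abs(y - x) ** alpha for x in samples) / len(samples)
    assert abs(estimate - closed_form) < 2e-2
print("E|y - X|^alpha is constant in y, close to", round(closed_form, 5))
```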
Notice that this also matches the numerical value computed by Kevin Costello, as
$$ I(\mu_{1/2}) = \frac{\sqrt{2}\pi^{3/2}}{\Gamma\left(\frac{1}{4}\right)^2} \approx 0.59907011736779610372\cdots. $$
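For the record, here is a quick arithmetic check that the two closed forms for $I(\mu_{1/2})$ agree, using only `math.gamma` (the decimal literal is the value quoted above):

```python
import math

# pi / (B(1/4, 1/4) * cos(pi/4)), with B(1/4, 1/4) = Gamma(1/4)^2 / Gamma(1/2)
B = math.gamma(0.25) ** 2 / math.gamma(0.5)
via_beta = math.pi / (B * math.cos(math.pi / 4))

# sqrt(2) * pi^(3/2) / Gamma(1/4)^2
via_gamma = math.sqrt(2) * math.pi ** 1.5 / math.gamma(0.25) ** 2

assert abs(via_beta - via_gamma) < 1e-10
assert abs(via_beta - 0.59907011736779610372) < 1e-10
print(via_beta)
```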
[Graph of $\alpha \mapsto I(\mu_{\alpha})$ omitted.]
Proof of Proposition. We first prove the following lemma.
Lemma. If $\mu \in \mathcal{M}$ satisfies $\mu([0,1]) = 0$, then we have $I(\mu) \leq 0$.
To see this, notice that there exists a constant $c > 0$ (depending only on $\alpha$) for which
$$ \forall x \in \mathbb{R}: \qquad |x|^{\alpha} = c\int_{0}^{\infty} \frac{1 - \cos (xt)}{t^{1+\alpha}} \, dt $$
holds: the integral converges because the integrand is $O(t^{1-\alpha})$ as $t \to 0$ and $O(t^{-1-\alpha})$ as $t \to \infty$, and the substitution $u = |x|t$ shows that its value scales as $|x|^{\alpha}$. So by Tonelli's theorem, for any nonnegative $\mu, \nu \in \mathcal{M}$,
\begin{align*}
\langle \mu, \nu \rangle
&= c\int_{0}^{\infty} \int_{[0,1]^2} \frac{1 - \cos ((x - y)t)}{t^{1+\alpha}} \, \mu(dx)\nu(dy)dt \\
&= c\int_{0}^{\infty} \frac{\hat{\mu}(0)\hat{\nu}(0) - \operatorname{Re}( \hat{\mu}(t)\overline{\hat{\nu}(t)} )}{t^{1+\alpha}} \, dt,
\end{align*}
where $\hat{\mu}(t) = \int_{[0,1]} e^{itx} \, \mu(dx)$ is the Fourier transform of $\mu$. In particular, this shows that the integrand on the right-hand side is integrable. So by linearity, this relation extends to all pairs $\mu, \nu \in \mathcal{M}$. Hence, if $\mu \in \mathcal{M}$ satisfies $\mu([0,1]) = 0$, then $\hat{\mu}(0) = 0$ and thus
$$ I(\mu) = -c\int_{0}^{\infty} \frac{|\hat{\mu}(t)|^2}{t^{1+\alpha}} \, dt \leq 0, $$
completing the proof of the Lemma. ////
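The Lemma says the kernel $|x - y|^{\alpha}$ is negative semidefinite on measures of total mass zero. For a discrete measure $\mu = \sum_i c_i \delta_{x_i}$ with $\sum_i c_i = 0$, this is the statement that $\sum_{i,j} c_i c_j |x_i - x_j|^{\alpha} \leq 0$, which is easy to probe numerically (the random points and weights below are illustrative):

```python
import random

alpha = 0.5  # illustrative exponent in (0, 1]
random.seed(1)

for _ in range(1000):
    n = random.randint(2, 8)
    xs = [random.random() for _ in range(n)]
    cs = [random.uniform(-1, 1) for _ in range(n)]
    mean = sum(cs) / n
    cs = [c - mean for c in cs]          # enforce total mass zero
    # I(mu) = sum_{i,j} c_i c_j |x_i - x_j|^alpha for mu = sum_i c_i delta_{x_i}
    quad = sum(ci * cj * abs(xi - xj) ** alpha
               for ci, xi in zip(cs, xs)
               for cj, xj in zip(cs, xs))
    assert quad <= 1e-12                 # tiny slack for rounding error
print("quadratic form is <= 0 on all tested zero-mass measures")
```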
Let us return to the proof of the Proposition. Let $m$ denote the common value of $\langle \mu, \delta_t\rangle$ over $t \in [0, 1]$. Then for any $\nu \in \mathcal{P}$,
$$ \langle \mu, \nu \rangle
= \int \left( \int_{[0,1]} |x - y|^{\alpha} \, \mu(dx) \right) \, \nu(dy)
= \int \langle \mu, \delta_y \rangle \, \nu(dy)
= m. $$
So it follows that
$$ \forall \nu \in \mathcal{P} : \qquad I(\nu)
= I(\mu) + 2\underbrace{\langle \mu, \nu - \mu \rangle}_{=m-m = 0} + \underbrace{I(\nu - \mu)}_{\leq 0} \leq I(\mu) $$
as desired.
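As a final illustration of the maximality, compare with the uniform distribution: a direct computation (stated here only as a check, not taken from the argument above) gives $\mathbb{E}|X - Y|^{\alpha} = \frac{2}{(\alpha+1)(\alpha+2)}$ for i.i.d. uniform $X, Y$ on $[0, 1]$, which is strictly below $I(\mu_{\alpha})$:

```python
import math

alpha = 0.5  # illustrative exponent in (0, 1)

# I(mu_alpha) = pi / (B((1-alpha)/2, (1-alpha)/2) * cos(pi*alpha/2))
a = (1 - alpha) / 2
B = math.gamma(a) ** 2 / math.gamma(2 * a)
I_max = math.pi / (B * math.cos(math.pi * alpha / 2))

# For the uniform distribution on [0, 1]:
#   E|X - Y|^alpha = 2 / ((alpha + 1)(alpha + 2))
I_uniform = 2 / ((alpha + 1) * (alpha + 2))

assert I_uniform < I_max     # uniform is strictly suboptimal
print(I_uniform, "<", I_max)
```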
Best Answer
If $X_k$ is the $k$-th number after the shuffle then you want to compute
$$ \mathbb{E}\left[\sum_{k=1}^{n-1}|X_{k+1}-X_k|\right]=\sum_{k=1}^{n-1} \mathbb{E}[|X_{k+1}-X_k|]=(n-1)\mathbb{E}[|X_2-X_1|] $$ as $\mathbb{E}[|X_{k+1}-X_k|]$ is the same for any $k \in\{1,\ldots,n-1\}$. Finally, observe that
$$ \begin{align*}
\mathbb{E}[|X_2-X_1|]
&=\sum_{1\leqslant j,k\leqslant n}|j-k|\Pr[X_1=j]\Pr[X_2=k \mid X_1=j]\\
&=2\sum_{1\leqslant j<k\leqslant n}(k-j)\frac{1}{n}\cdot \frac{1}{n-1}\\
&=\frac{2}{n(n-1)}\sum_{k=2}^n\sum_{j=1}^{k-1}(k-j)\\
&=\frac{2}{n(n-1)}\left(\sum_{k=2}^n k(k-1)-\sum_{k=2}^n\frac{k(k-1)}{2}\right)\\
&=\frac{2}{n(n-1)}\cdot \frac{(n+1)n(n-1)}{6}\\
&=\frac{n+1}{3}
\end{align*} $$
Therefore
$$ \mathbb{E}\left[\sum_{k=1}^{n-1}|X_{k+1}-X_k|\right]=\frac{(n-1)(n+1)}{3} $$
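The closed form $\frac{n^2-1}{3}$ can be confirmed by brute force over all permutations for small $n$, in exact rational arithmetic (the cutoff $n \leqslant 6$ is just to keep the loop fast):

```python
from fractions import Fraction
from itertools import permutations
from math import factorial

for n in range(2, 7):
    # total of sum |X_{k+1} - X_k| over all n! shuffles of (1, ..., n)
    total = sum(sum(abs(p[k + 1] - p[k]) for k in range(n - 1))
                for p in permutations(range(1, n + 1)))
    expectation = Fraction(total, factorial(n))
    assert expectation == Fraction(n * n - 1, 3)
print("E[sum |X_{k+1} - X_k|] = (n^2 - 1)/3 for n = 2..6")
```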