I have a good way to find $C$ :
As stated in the question, the inequality must hold for all $n \ge 2$, so why not first find $C$ for the case $n=2$.
Writing $x_1 = x,\ x_2 = y$, the inequality becomes:
$$\displaystyle xy(x^2+y^2) \le C(x+y)^4 $$
Assume that $x+y = 1$; then we are left to show that $\displaystyle xy(x^2+y^2) \le C$ for some value of $C$, or in other words, to find the maximum of $\displaystyle xy(x^2+y^2).$
Now, using Lagrange multipliers:
$$
\displaystyle
\mathcal{L}(x,y,\lambda) = f(x,y) - \lambda g(x,y) \\
\text{where} \ f(x,y) = xy(x^2+y^2) \ \text{is the objective function to maximize}\\
\text{and} \ g(x,y) = x + y \ \text{is the constraint function} \\
$$
Hence, solving $\displaystyle \nabla \mathcal{L}(x,y,\lambda) = 0$ we get:
$$
\displaystyle
x^3 + 3xy^2 - \lambda = 0 \qquad (1) \\
y^3 + 3x^2y - \lambda = 0 \qquad (2)
$$
Case $1:$
Subtracting equation $(1)$ from equation $(2)$ gives $(y-x)^3 = 0$, so $x=y$ is the only possible case. The constraint $x+y=1$ then forces $x=y=0.5.$
Thus $\displaystyle f(x,y) = 0.125$, and this point is a maximum: along the constraint, $\displaystyle f(x,1-x) = \frac{1}{8} - 2\left(x-\frac{1}{2}\right)^4 \le \frac{1}{8}.$
Therefore
$$\displaystyle xy(x^2+y^2) \le 0.125 \implies C = 0.125 $$
Case $2:$
Substitute the value of $x$ from $(2)$ into $(1)$ and simplify to get an equation in $y$ and $\lambda$:
$$\displaystyle 100z^3 -120z^2 +48z -1 = 0 \\
\text{where} \ y^3 = \lambda z
$$
Here $z = 0.022024$ is a real solution; using it we can say $ y \approx 0.28 \lambda^{1/3}.$
From this value of $y$ we get $x \approx 0.57 \lambda^{1/3}$, and then the constraint $x+y=1$ gives $ \lambda^{1/3} \approx 1.1765 \implies \lambda \approx 1.628 .$
Hence we get another candidate point $\displaystyle (x, y, \lambda) \approx (0.671, 0.329, 1.628)$, for which $f(x,y) \approx 0.1233.$
Thus,
$$\displaystyle xy(x^2+y^2) \le 0.1233 \implies C = 0.1233 $$
The point to note is that the two values are nearly equal; taking the larger of them, we conclude $C = 0.125$.
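For completeness, a small script comparing the two candidate values (the Case 2 coordinates are taken from the approximate figures above):

```python
# Compare the two candidate maxima from Case 1 and Case 2 (n = 2, x + y = 1).
def f(x, y):
    return x * y * (x**2 + y**2)

case1 = f(0.5, 0.5)        # Case 1: x = y = 1/2, exactly 0.125
case2 = f(0.6706, 0.3294)  # Case 2: x ~ 0.57*lam^(1/3), y ~ 0.28*lam^(1/3), lam^(1/3) ~ 1.17647

print(case1, case2)  # the larger value, 0.125, is the constant C
```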
EDIT:
As pointed out by @TheBestMagician in the comments, I have added a generalized version:
Let us look at the objective as a function of a general $x_k$, where $1 \le k \le n$.
$$\displaystyle f(x_k) = \sum_{i=1}^{k-1} x_k x_i (x_k^2 + x_i^2) + \sum_{i=k+1}^{n} x_k x_i (x_k^2 + x_i^2) = - 2x_k^4 + \sum_{i=1}^{n} x_k x_i (x_k^2 + x_i^2) $$
Now we will define some sums (note that $S^3$ and $S^3_k$ are labels for sums of cubes, not cubes of sums):
$$\displaystyle S = \sum_{i=1}^{n} x_i \ \text{and} \ S_k = S - x_k \\
S^3 = \sum_{i=1}^{n} x^3_i \ \text{and} \ S^3_k = S^3 - x^3_k
$$
We can now rewrite our objective function as:
$$\displaystyle f(x_k) = S_k x^3_k + S^3_k x_k. \\ g(x) = S = \sum_{i=1}^{n} x_i \\ \text{constraint: } g(x) = 1 $$
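The rewriting above can be verified in exact arithmetic; the sample point below is arbitrary (chosen only to satisfy $\sum_i x_i = 1$):

```python
from fractions import Fraction

# Check that f(x_k) = S_k * x_k^3 + S3_k * x_k matches the direct pairwise sum.
x = [Fraction(1, 5), Fraction(3, 10), Fraction(1, 10), Fraction(2, 5)]  # sums to 1
n = len(x)
S = sum(x)                     # S  = sum of x_i
S3 = sum(v**3 for v in x)      # S^3 in the text: the sum of cubes, not S cubed

for k in range(n):
    Sk = S - x[k]
    S3k = S3 - x[k]**3
    direct = sum(x[k] * x[i] * (x[k]**2 + x[i]**2) for i in range(n) if i != k)
    assert direct == Sk * x[k]**3 + S3k * x[k]

print("rewritten objective matches the pairwise sum")
```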
Then, applying Lagrange multipliers,
$\displaystyle f'(x_{k-1})=f'(x_k) = f'(x_{k+1}) = \lambda ,$ and we can subtract these equations pairwise,
where $\displaystyle f'(x_k) = 3S_k x^2_k + S^3_k = 3(S-x_k)x^2_k + S^3- x^3_k .$
$$\displaystyle f'(x_k) - f'(x_{k-1}) = 0 = 3(S-x_k)x^2_k + S^3- x^3_k - \left[3(S-x_{k-1})x^2_{k-1} + S^3- x^3_{k-1}\right] \\ \implies 3S = 4\frac { x^3_k - x^3_{k-1}}{x^2_k - x^2_{k-1}}$$
Assume that $x_k \ne x_{k-1} $, and bring another variable $x_{k+1}$ into the picture:
$$\displaystyle f'(x_{k+1}) - f'(x_k) \implies 3S = 4\frac { x^3_k - x^3_{k+1}}{x^2_k - x^2_{k+1}}$$
Similarly, assume $x_k \ne x_{k+1} $ and equate the two expressions for $3S$:
$$\displaystyle \frac { x^3_k - x^3_{k+1}}{x^2_k - x^2_{k+1}} = \frac { x^3_k - x^3_{k-1}}{x^2_k - x^2_{k-1}} \\ \text{simplify to get} \ x_{k} = -\frac{x_{k+1}x_{k-1}}{x_{k+1}+x_{k-1}} \ \text{assuming} \ x_{k-1} \ne x_{k+1} \ $$
But here $x_{k-1},x_k,x_{k+1} \ge 0 $ as per the domain, which contradicts the above relation (its right-hand side is non-positive), so we must have $x_{k-1} = x_{k+1} \ \forall \ 1 \le k \le n $.
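The simplification above can be checked numerically; the values of $x_{k-1}$ and $x_{k+1}$ below are arbitrary (and distinct):

```python
from fractions import Fraction

# Verify: if x_k = -x_{k-1} x_{k+1} / (x_{k-1} + x_{k+1}), the two expressions
# for 3S agree. a and c are arbitrary distinct sample values.
a = Fraction(1, 2)    # x_{k-1}
c = Fraction(1, 3)    # x_{k+1}
b = -a * c / (a + c)  # x_k from the simplified relation (note: negative!)

lhs = (b**3 - c**3) / (b**2 - c**2)
rhs = (b**3 - a**3) / (b**2 - a**2)
assert lhs == rhs
print("ratios agree, consistent with x_k = -x_{k-1}x_{k+1}/(x_{k-1}+x_{k+1})")
```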
Case A:
Assuming $\displaystyle x_k = x_{k-1} = x_{k+1}$.
Using the constraint $g(x) = 1 $, we get $\displaystyle x_1 = x_2 = \dots = x_n = \frac{1}{n} .$
Hence $\displaystyle F(x_1,\dots,x_n) = \sum_{1\le i < j \le n} x_i x_j (x_i^2 + x_j^2) =\frac{2}{n^4} \sum_{1\le i < j \le n} 1 = \frac{2}{n^4} \times \frac{n(n-1)}{2}.$ Hence
$$ \displaystyle F(x) \le \frac{n-1}{n^3} \implies C = \frac{n-1}{n^3} $$
Here we see that $C$ depends on $n$. We can verify this for $n=2$ against the cases discussed above: $\displaystyle \frac{n-1}{n^3} = \frac{1}{8} = 0.125.$
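A quick exact-arithmetic check of the value $\frac{n-1}{n^3}$ at the equal point:

```python
from fractions import Fraction
from itertools import combinations

# F(x) = sum over pairs of x_i x_j (x_i^2 + x_j^2), evaluated at x_i = 1/n.
def F(xs):
    return sum(a * b * (a**2 + b**2) for a, b in combinations(xs, 2))

for n in range(2, 8):
    assert F([Fraction(1, n)] * n) == Fraction(n - 1, n**3)
print("F(1/n, ..., 1/n) = (n-1)/n^3 for n = 2..7")
```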
Now, on the boundary of the domain (where some of the $x_i$ are zero) the function behaves differently: the constrained maximum keeps increasing as variables are set to zero, until it reaches a global maximum with only two of the $x$'s nonzero (and equal).
We conclude that $F(x)$ is always less than or equal to $\displaystyle \frac{1}{8} ,$ i.e. $\displaystyle C = \frac{1}{8} .$
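The boundary behaviour can be illustrated by comparing the interior critical point with the point having exactly two equal nonzero coordinates (a sanity check of the claim, not a proof):

```python
from fractions import Fraction
from itertools import combinations

def F(xs):
    return sum(a * b * (a**2 + b**2) for a, b in combinations(xs, 2))

for n in range(3, 10):
    boundary = [Fraction(1, 2), Fraction(1, 2)] + [Fraction(0)] * (n - 2)
    interior = [Fraction(1, n)] * n
    assert F(boundary) == Fraction(1, 8)  # two equal nonzero entries give 1/8 ...
    assert F(boundary) > F(interior)      # ... which beats (n-1)/n^3 for n >= 3
print("boundary point (1/2, 1/2, 0, ..., 0) dominates the interior critical point")
```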
Best Answer
Yes, this is true (and there is no need to assume that the $x_j$ are integers). This is Example 3.5 in Wells, Williams, Embeddings and Extensions in Analysis. The proof relies on the integral representation
$$ |x|=\frac{2}{\pi}\int_0^\infty\frac{\sin^2(xu)}{u^2}\,du. $$
With the trigonometric identity
$$ \sin^2((L_j-L_i)u)=\sin^2(L_i u)+\sin^2(L_j u)-2\sin^2(L_i u)\sin^2(L_j u)-\frac 1 2 \sin(2L_i u)\sin(2L_j u) $$
one obtains, using the hypothesis $\sum_j x_j=0$,
\begin{align*} \sum_{i,j}\sin^2((L_i-L_j)u)x_i x_j&=2\sum_i\sin^2(L_i u)x_i\sum_j x_j-2\left(\sum_i\sin^2(L_i u)x_i\right)^2-\frac 12\left(\sum_i \sin(2L_i u)x_i\right)^2\\ &=-2\left(\sum_i\sin^2(L_i u)x_i\right)^2-\frac 12\left(\sum_i \sin(2L_i u)x_i\right)^2. \end{align*}
Thus
\begin{align*} \sum_{i,j}|L_i-L_j|x_ix_j=-\frac 2 \pi\int_0^\infty u^{-2}\left(2\left(\sum_i\sin^2(L_i u)x_i\right)^2+\frac 12\left(\sum_i \sin(2L_i u)x_i\right)^2\right)\,du\leq 0. \end{align*}
Edit: I should add that functions $\Phi\colon E\times E\to\mathbb R$ that are symmetric, vanish on the diagonal and satisfy $$ \sum_{j,k}\Phi(e_j,e_k)x_j x_k\leq 0 $$ whenever $\sum_j x_j=0$ are called conditionally negative definite. They are closely related to the question when a metric space can be isometrically embedded into a Hilbert space: A function $\Phi\colon E\times E\to\mathbb R_+$ is conditionally of negative type if and only if $\Phi^{1/2}$ is a metric and $(E,\Phi^{1/2})$ embeds isometrically into a Hilbert space. This result can be found in the book cited above.
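Both ingredients of the answer above, i.e. the integral representation of $|x|$ and the sign claim, can be sanity-checked numerically (a rough check on randomly generated sample data, not part of the proof):

```python
import math
import random

# Midpoint-rule approximation of (2/pi) * integral_0^inf sin^2(x*u)/u^2 du,
# which should recover |x|. U and steps are arbitrary accuracy knobs; the
# tail beyond U is O(1/U) and is simply dropped.
def abs_via_integral(x, U=1000.0, steps=200000):
    h = U / steps
    total = 0.0
    for i in range(steps):
        u = (i + 0.5) * h
        s = math.sin(x * u)
        total += s * s / (u * u) * h
    return 2.0 / math.pi * total

assert abs(abs_via_integral(1.5) - 1.5) < 1e-2
assert abs(abs_via_integral(-0.7) - 0.7) < 1e-2

# Conditional negativity: sum_{i,j} |L_i - L_j| x_i x_j <= 0 whenever sum(x) = 0.
random.seed(0)
for _ in range(50):
    L = [random.uniform(-5, 5) for _ in range(6)]
    x = [random.uniform(-1, 1) for _ in range(6)]
    mean = sum(x) / len(x)
    x = [v - mean for v in x]  # enforce sum(x) = 0
    q = sum(abs(L[i] - L[j]) * x[i] * x[j] for i in range(6) for j in range(6))
    assert q <= 1e-9
print("integral representation and conditional negativity both check out")
```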