I have a good way to find $C$:
Since the inequality must hold $\forall \ n \ge 2$, why not find $C$ for the case $n=2$ first?
Then our Inequality becomes $\displaystyle (x_1 = x , x_2 =y) $:
$$\displaystyle xy(x^2+y^2) \le C(x+y)^4 $$
Both sides are homogeneous of degree $4$, so we may assume $x+y = 1$. Then we are left to show that $\displaystyle xy(x^2+y^2) \le C$ for some value of $C$, or in other words, to find the maximum of $\displaystyle xy(x^2+y^2)$ subject to this constraint.
Now using Lagrange multipliers:
$$
\displaystyle
\mathcal{L}(x,y,\lambda) = f(x,y) - \lambda g(x,y) \\
\text{where} \ f(x,y) = xy(x^2+y^2) \ \text{is the objective function to maximize}\\
\text{where} \ g(x,y) = x + y \ \text{is the constraint function } \\
$$
Hence, solving $\displaystyle \nabla \mathcal{L}(x,y,\lambda) = 0$ we get:
$$
x^3 + 3xy^2 - \lambda = 0 \qquad (1)\\
y^3 + 3x^2y - \lambda = 0 \qquad (2)
$$
Case $1:$
Subtracting equation $(2)$ from equation $(1)$ gives $(x-y)^3 = 0$, so $x=y$ is the only possibility. Then $x=y=0.5$ to satisfy the constraint $g(x,y) = 1.$
Thus $\displaystyle f(x,y) = 0.125$, and this critical point is a maximum: along the constraint, $\dfrac{d}{dx}f(x,1-x) = (1-2x)^3$, which is positive for $x < 0.5$ and negative for $x > 0.5$.
Therefore
$$\displaystyle xy(x^2+y^2) \le 0.125 \implies C = 0.125 $$
Case $2:$
From $(2)$, $\displaystyle x^2 = \frac{\lambda - y^3}{3y}$. Substitute this into $(1)$ (written as $x(x^2+3y^2) = \lambda$), square, and set $y^3 = \lambda z$ to get an equation in $z$ alone:
$$\displaystyle (1-z)(1+8z)^2 = 27z \iff 64z^3 - 48z^2 + 12z - 1 = (4z-1)^3 = 0$$
Here $z = 0.25$ is the only real solution. Using it, $y^3 = \dfrac{\lambda}{4}$ and
$$\displaystyle x^2 = \frac{\lambda(1-z)}{3y} = \frac{\lambda}{4y} = y^2 \implies x = y,$$
so this case reproduces the solution of Case $1$: the critical point $x = y = 0.5$ is unique. Thus,
$$\displaystyle xy(x^2+y^2) \le 0.125 \implies C = 0.125 $$
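Independently of the multiplier computation, one can confirm numerically that $x = 0.5$ is the only interior stationary point along the constraint: substituting $y = 1-x$ gives the one-variable function $g(x) = x(1-x)\big(x^2+(1-x)^2\big)$, whose derivative simplifies algebraically to $(1-2x)^3$. A short sketch (names ad hoc):

```python
# Verify the closed-form derivative g'(x) = (1 - 2x)^3 against a
# central finite difference; its unique zero in (0, 1) is x = 0.5.
def g(x):
    return x * (1 - x) * (x**2 + (1 - x) ** 2)

def g_prime(x):
    return (1 - 2 * x) ** 3

h = 1e-6
for i in range(1, 100):
    x = i / 100
    fd = (g(x + h) - g(x - h)) / (2 * h)
    assert abs(fd - g_prime(x)) < 1e-6
print("g'(x) = (1 - 2x)^3 confirmed; unique stationary point at x = 0.5")
```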
EDIT:
As pointed out by @TheBestMagician in the comments, I have added a generalized version:
Let us form the objective function associated with a general $x_k, \ \text{where} \ 1 \le k \le n$:
$$\displaystyle f(x_k) = \sum_{i=1}^{k-1} x_k x_i (x_k^2 + x_i^2) + \sum_{i=k+1}^{n} x_k x_i (x_k^2 + x_i^2) = - 2x_k^4 + \sum_{i=1}^{n} x_k x_i (x_k^2 + x_i^2) $$
Now we will define some summations :
$$\displaystyle S = \sum_{i=1}^{n} x_i \ \text{and} \ S_k = S - x_k \\
S^3 = \sum_{i=1}^{n} x^3_i \ \text{and} \ S^3_k = S^3 - x^3_k
$$
Thus now we can modify our objective function into:
$$\displaystyle f(x_k) = S_k x^3_k + S^3_k x_k. \\ g(x) = S = \sum_{i=1}^{n} x_i \\ \text{constraint: } g(x) = 1 $$
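This rewriting is easy to verify on random data; a quick sketch (variable names are ad hoc, not from the derivation above):

```python
import random

# Check that for each k,
#   S_k * x_k^3 + S3_k * x_k  ==  sum over i != k of x_k x_i (x_k^2 + x_i^2)
# where S_k = S - x_k and S3_k = (sum of cubes) - x_k^3.
random.seed(0)
x = [random.random() for _ in range(8)]
S = sum(x)
S3 = sum(t**3 for t in x)

for k in range(len(x)):
    S_k, S3_k = S - x[k], S3 - x[k] ** 3
    closed = S_k * x[k] ** 3 + S3_k * x[k]
    direct = sum(
        x[k] * x[i] * (x[k] ** 2 + x[i] ** 2)
        for i in range(len(x))
        if i != k
    )
    assert abs(closed - direct) < 1e-12
print("decomposition f(x_k) = S_k x_k^3 + S3_k x_k verified")
```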
Then applying Lagrange multipliers,
$\displaystyle f'(x_{k-1}) = f'(x_k) = f'(x_{k+1}) = \lambda.$ We can then subtract these equations.
Where, $\displaystyle f'(x_k) = 3(S_k)x^2_k + S^3_k = 3(S-x_k)x^2_k + S^3- x^3_k $
$$\displaystyle f'(x_k) - f'(x_{k-1}) = 0 = 3(S-x_k)x^2_k + S^3- x^3_k - \left[3(S-x_{k-1})x^2_{k-1} + S^3- x^3_{k-1}\right] \\ \implies 3S = 4\frac { x^3_k - x^3_{k-1}}{x^2_k - x^2_{k-1}}$$
Assume that $x_k \ne x_{k-1}$, and bring another variable $x_{k+1}$ into the picture:
$$\displaystyle f'(x_{k+1}) - f'(x_k) = 0 \implies 3S = 4\frac { x^3_k - x^3_{k+1}}{x^2_k - x^2_{k+1}}$$
Similarly assume $x_k \ne x_{k+1}$, and equate the two expressions for $3S$:
$$\displaystyle \frac { x^3_k - x^3_{k+1}}{x^2_k - x^2_{k+1}} = \frac { x^3_k - x^3_{k-1}}{x^2_k - x^2_{k-1}}$$
Cross-multiplying and cancelling a factor of $(x_{k+1} - x_{k-1})$ (assuming $x_{k-1} \ne x_{k+1}$), the resulting cubic in $x_k$ factors as
$$(x_k - x_{k-1})(x_k - x_{k+1})\big[(x_{k-1}+x_{k+1})x_k + x_{k-1}x_{k+1}\big] = 0,$$
and since $x_k \ne x_{k\pm 1}$ this forces $\displaystyle x_{k} = -\frac{x_{k+1}x_{k-1}}{x_{k+1}+x_{k-1}}.$
But $x_{k-1},x_k,x_{k+1} \ge 0$ by the domain, while this value is negative, a contradiction. Thus for every $k$ at least one of the equalities $x_k = x_{k-1}$, $x_k = x_{k+1}$, $x_{k-1} = x_{k+1}$ must hold.
Case A:
Assuming $\displaystyle x_k = x_{k-1} = x_{k+1}$ for all $k$, and using the constraint $g(x) = 1$, we get $\displaystyle x_1 = x_2 = \dots = x_n = \frac{1}{n}.$
Hence the full objective is $\displaystyle F(x_1,\dots,x_n) = \sum_{1\le i < j \le n} x_i x_j (x_i^2 + x_j^2) = \frac{2}{n^4} \sum_{1\le i < j \le n} 1 = \frac{2}{n^4} \times \frac{n(n-1)}{2}.$ Hence at this critical point
$$ \displaystyle F(x) = \frac{n-1}{n^3}, $$
which would suggest $\displaystyle C = \frac{n-1}{n^3}.$
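A quick numeric check of the value at the symmetric point (a sketch; `F` is an ad hoc name for the objective):

```python
from itertools import combinations

# At the symmetric point x_1 = ... = x_n = 1/n the objective
# F = sum_{i<j} x_i x_j (x_i^2 + x_j^2) should equal (n - 1) / n^3.
def F(xs):
    return sum(a * b * (a**2 + b**2) for a, b in combinations(xs, 2))

for n in range(2, 11):
    assert abs(F([1 / n] * n) - (n - 1) / n**3) < 1e-12
print("F(1/n, ..., 1/n) = (n - 1)/n^3 for n = 2..10")
```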
Note that this candidate $C$ depends on $n$; for $n=2$ it matches the value $0.125$ found above. However, this interior critical point is not the global maximum: moving toward the boundary of the simplex (setting some of the $x_i$ to zero) increases the value, and the global maximum is attained when exactly two of the $x_i$ are nonzero, each equal to $\frac{1}{2}$.
We conclude that $F(x)$ is always less than or equal to $\displaystyle \frac{1}{8}$, i.e. $C = \frac{1}{8}$ for every $n$.
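Random sampling over the simplex illustrates both claims (a numerical illustration, not a proof; `F` is an ad hoc name):

```python
import random
from itertools import combinations

# Illustrate that F never exceeds 1/8 on the simplex sum x_i = 1,
# and that (1/2, 1/2, 0, ..., 0) attains the bound exactly.
def F(xs):
    return sum(a * b * (a**2 + b**2) for a, b in combinations(xs, 2))

random.seed(1)
n = 6
for _ in range(10_000):
    raw = [random.random() for _ in range(n)]
    s = sum(raw)
    assert F([t / s for t in raw]) <= 0.125 + 1e-12  # random simplex point

assert F([0.5, 0.5] + [0.0] * (n - 2)) == 0.125
print("F <= 1/8 on 10000 random simplex points; equality at (1/2, 1/2, 0, ..., 0)")
```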
Best Answer
If you don't care about the constant, the inequality is rather simple. For all practical purposes (i.e., up to an absolute constant factor) $1-(1-x_i)^k\asymp\min(kx_i,1)=y_i$. Also, we trivially have $\sum_i y_i\le k\sum_i x_i=k$.
Now we note that $(1-x_i-x_j)^{k-1}-(1-x_i)^{k-1}(1-x_j)^{k-1}\le 0$ (just because $1-x_i-x_j\le(1-x_i)(1-x_j)$). Thus it suffices to prove that
$$ \sum_{i,j}x_ix_j[1-(1-x_i)(1-x_j)](1-x_i)^{k-1}(1-x_j)^{k-1}\le \frac Ck\sum_i x_iy_i\,. $$
Now, $1-(1-x_i)(1-x_j)\le x_i+x_j$ and $x_i(1-x_i)^{k-1}\le\min(x_i,\frac 1k)=\frac 1ky_i$, so the LHS is bounded by
$$ \frac 1{k^2}\sum_{i,j}(x_i+x_j)y_iy_j=\frac 2{k^2}\sum_{i,j}x_iy_iy_j =\frac 2{k^2}\left[\sum_{i}x_iy_i\right]\left[\sum_{j}y_j\right]\le \frac 2{k^2}\left[\sum_{i}x_iy_i\right]k= \frac 2k\sum_{i}x_iy_i $$
and we are done.
This crude computation doesn't yield $C=2$ though. To get $C=2$ in this way, you just want to define $y_i=1-(1-x_i)^k$ (which is always $\le\min(kx_i,1)$, so the final computation is fine) and show that the inequality $kx_i(1-x_i)^{k-1}\le y_i$ still holds. This is also easy once you realize that $1-(1-x)^k=\int_0^x k(1-t)^{k-1}\,dt$ and you can now rewrite the above argument with this slick definition in the same way, but, of course, that is not how one would guess it in the first place, so I included the crude computation based on the simple idea to replace hard functions with equivalent simple ones :-)
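The two elementary scalar bounds this argument rests on, $kx(1-x)^{k-1}\le 1-(1-x)^k\le\min(kx,1)$ for $x\in[0,1]$, can be spot-checked numerically (a sketch; the actual proof is the integral identity above):

```python
# Spot-check, for x in [0, 1] and k >= 1:
#   k*x*(1-x)^(k-1)  <=  1 - (1-x)^k  <=  min(k*x, 1)
# The left bound is the one needed for C = 2; the right one is the
# crude equivalence y_i = min(k x_i, 1).
for k in range(1, 21):
    for i in range(1001):
        x = i / 1000
        y = 1 - (1 - x) ** k
        assert k * x * (1 - x) ** (k - 1) <= y + 1e-12
        assert y <= min(k * x, 1) + 1e-12
print("both scalar bounds hold on the grid")
```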