Here is an answer for any number $n$ of variables.
For any partition
$\lambda$, we let $s_{\lambda}$ denote the Schur
function
corresponding to $\lambda$. Let $\delta$ be the partition $\left(
n-1,n-2,\ldots,1\right) $ of $n\left( n-1\right) /2$. Then,
\begin{equation}
\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right)
= s_{\delta}\left( x_{1},x_{2},\ldots,x_{n}\right) .
\label{1}
\tag{1}
\end{equation}
This well-known formula (which, I think, is due to Jacobi) can easily be
derived from the definition of Schur polynomials through alternants. Let me explain:
For every $n$-tuple $\alpha=\left( \alpha_{1},\alpha_{2},\ldots,\alpha
_{n}\right) \in\mathbb{N}^{n}$ of nonnegative integers, we define the
alternant $a_{\alpha}$ to be the polynomial
\begin{equation}
\sum_{\sigma\in S_{n}}\left( -1\right) ^{\sigma}x_{1}^{\alpha_{\sigma\left( 1\right) }}x_{2}^{\alpha_{\sigma\left( 2\right) }}\cdots x_{n}^{\alpha_{\sigma\left( n\right) }}
= \det\left( \left( x_i^{\alpha_j}\right) _{1\leq i\leq n,\ 1\leq
j\leq n}\right)
\end{equation}
in $\mathbb{Z}\left[ x_{1},x_{2},\ldots,x_{n}\right] $. If $\alpha=\left(
\alpha_{1},\alpha_{2},\ldots,\alpha_{n}\right) $ and $\beta=\left( \beta
_{1},\beta_{2},\ldots,\beta_{n}\right) $ are two $n$-tuples in $\mathbb{N}
^{n}$, then the $n$-tuple $\alpha+\beta$ is defined to be $\left( \alpha
_{1}+\beta_{1},\alpha_{2}+\beta_{2},\ldots,\alpha_{n}+\beta_{n}\right) $.
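As a sanity check (my addition, not part of the original argument), the equality of the signed sum over $S_n$ and the determinant can be verified with sympy for a small example; the sign of a permutation is computed by counting inversions:

```python
from itertools import permutations

import sympy as sp

n = 3
x = sp.symbols(f"x1:{n+1}")          # (x1, x2, x3)
alpha = (3, 1, 0)                     # a sample n-tuple in N^n

def sgn(p):
    # signature of a permutation, via its inversion count
    inv = sum(1 for a in range(len(p)) for b in range(a + 1, len(p)) if p[a] > p[b])
    return -1 if inv % 2 else 1

# the signed sum over S_n from the definition of the alternant
lhs = sum(sgn(p) * sp.prod(x[i] ** alpha[p[i]] for i in range(n))
          for p in permutations(range(n)))

# the determinant det((x_i^{alpha_j}))
rhs = sp.Matrix(n, n, lambda i, j: x[i] ** alpha[j]).det()

assert sp.expand(lhs - rhs) == 0
```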
Any partition $\lambda=\left( \lambda_{1},\lambda_{2},\ldots,\lambda
_{k}\right) $ of length $k\leq n$ will be identified with the $n$-tuple
$\left( \lambda_{1},\lambda_{2},\ldots,\lambda_{k},\underbrace{0,0,\ldots
,0}_{n-k\text{ times}}\right) \in\mathbb{N}^{n}$. Notice that $\delta$ is
thus identified with the $n$-tuple $\left( n-1,n-2,\ldots,1,0\right) $.
Then, the "alternant formula" for the Schur polynomial $s_{\lambda}\left(
x_{1},x_{2},\ldots,x_{n}\right) $ says that
\begin{equation}
s_{\lambda}\left( x_{1},x_{2},\ldots,x_{n}\right) =\dfrac{a_{\lambda+\delta
}}{a_{\delta}}
\label{2}
\tag{2}
\end{equation}
whenever $\lambda$ is a partition of length $\leq n$. This is the historically
first definition of Schur polynomials, long before the modern definitions via
Young tableaux or the Jacobi-Trudi formulas were discovered; its main
disadvantages are that it only defines $s_{\lambda}$ in finitely many
variables and that it requires proof that $\dfrac{a_{\lambda+\delta}
}{a_{\delta}}$ is indeed a polynomial. But this is fine for us.
(For a proof of the fact that this definition of $s_\lambda$ is equivalent
to the modern combinatorial definition, you can consult Corollary 2.6.6 in Darij Grinberg and Victor Reiner, Hopf algebras in combinatorics, arXiv:1409.8356v5. In the same chapter you will find an exercise proving the Jacobi-Trudi identity, which is yet another popular definition of the Schur polynomials.)
We can now prove \eqref{1}. Indeed, $\delta=\left( n-1,n-2,\ldots,0\right)
\in\mathbb{N}^{n}$, so that the definition of $a_{\delta}$ yields
\begin{equation}
a_{\delta}=\det\left( \left( x_{i}^{n-j}\right) _{1\leq i\leq
n,\ 1\leq j\leq n}\right) =\prod_{1\leq i<j\leq n}\left( x_{i}-x_{j}\right)
\label{3}
\tag{3}
\end{equation}
(by the Vandermonde determinant formula). But $\delta+\delta=\left( 2\left(
n-1\right) ,2\left( n-2\right) ,\ldots,2\cdot0\right) $, so that
\begin{align}
a_{\delta+\delta} & =\det\left( \left( x_{i}^{2\left( n-j\right)
}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) =\det\left( \left(
\left( x_{i}^{2}\right) ^{n-j}\right) _{1\leq i\leq n,\ 1\leq j\leq
n}\right) \nonumber\\
& =\prod_{1\leq i<j\leq n}\left( x_{i}^{2}-x_{j}^{2}\right)
\label{4}
\tag{4}
\end{align}
(again by the Vandermonde determinant formula). Dividing \eqref{4} by
\eqref{3}, we obtain
\begin{equation}
\dfrac{a_{\delta+\delta}}{a_{\delta}}=\dfrac{\prod_{1\leq i<j\leq n}\left(
x_{i}^{2}-x_{j}^{2}\right) }{\prod_{1\leq i<j\leq n}\left( x_{i}
-x_{j}\right) }=\prod_{1\leq i<j\leq n}\underbrace{\dfrac{x_{i}^{2}-x_{j}
^{2}}{x_{i}-x_{j}}}_{=x_{i}+x_{j}}=\prod_{1\leq i<j\leq n}\left( x_{i}
+x_{j}\right) .
\end{equation}
But applying \eqref{2} to $\lambda=\delta$, we obtain $s_{\delta}\left(
x_{1},x_{2},\ldots,x_{n}\right) =\dfrac{a_{\delta+\delta}}{a_{\delta}}
=\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) $. Thus, \eqref{1} is proven.
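The whole chain of equalities can also be double-checked symbolically. This sympy sketch (my addition, not part of the argument) verifies that $a_{\delta+\delta}/a_{\delta}=\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) $ for $n=3$ and $n=4$:

```python
import sympy as sp

for n in (3, 4):
    x = sp.symbols(f"x1:{n+1}")
    # a_delta = det((x_i^{n-j})) and a_{2*delta} = det((x_i^{2(n-j)}))
    a_delta = sp.Matrix(n, n, lambda i, j: x[i] ** (n - 1 - j)).det()
    a_2delta = sp.Matrix(n, n, lambda i, j: x[i] ** (2 * (n - 1 - j))).det()
    # the product prod_{i<j} (x_i + x_j)
    product = sp.prod(x[i] + x[j] for i in range(n) for j in range(i + 1, n))
    assert sp.simplify(sp.cancel(a_2delta / a_delta) - product) == 0
```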
Finally, we can transform \eqref{1} into an explicit (if you allow
determinants) formula for $\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right)
$ in terms of the elementary symmetric polynomials. To that aim, we let
$e_{k}\left( \overrightarrow{x}\right) $ denote the $k$-th elementary
symmetric polynomial in $x_{1},x_{2},\ldots,x_{n}$. Notice that $e_{k}\left(
\overrightarrow{x}\right) =0$ whenever $k<0$ and also whenever $k>n$.
Recall that one of the two Jacobi-Trudi formulas says that if $\lambda$ is a
partition of length $\leq n$, then
\begin{equation}
s_{\lambda^{t}}\left( x_{1},x_{2},\ldots,x_{n}\right) =\det\left( \left(
e_{\lambda_{i}-i+j}\left( \overrightarrow{x}\right) \right) _{1\leq i\leq
n,\ 1\leq j\leq n}\right) ,
\end{equation}
where $\lambda^{t}$ denotes the conjugate partition of $\lambda$ (this is the
partition whose $i$-th entry is the number of entries of $\lambda$ that are
$\geq i$, for each $i\in\left\{ 1,2,3,\ldots\right\} $). Applying this to
$\lambda=\delta$, we obtain
\begin{align*}
s_{\delta^{t}}\left( x_{1},x_{2},\ldots,x_{n}\right) & =\det\left(
\left( e_{\left( n-i\right) -i+j}\left( \overrightarrow{x}\right)
\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) \\
& =\det\left( \left( e_{n-2i+j}\left( \overrightarrow{x}\right) \right)
_{1\leq i\leq n,\ 1\leq j\leq n}\right) \\
& =\det\left(
\begin{array}
[c]{cccc}
e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\
e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\
\vdots & \vdots & \ddots & \vdots\\
e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left(
\overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right)
\end{array}
\right) .
\end{align*}
Since $\delta^{t}=\delta$ (this is easy to check), this simplifies to
\begin{equation}
s_{\delta}\left( x_{1},x_{2},\ldots,x_{n}\right) =\det\left(
\begin{array}
[c]{cccc}
e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\
e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\
\vdots & \vdots & \ddots & \vdots\\
e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left(
\overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right)
\end{array}
\right) .
\end{equation}
Thus, \eqref{1} rewrites as
\begin{equation}
\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right)
=\det\left(
\begin{array}
[c]{cccc}
e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\
e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\
\vdots & \vdots & \ddots & \vdots\\
e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left(
\overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right)
\end{array}
\right) .
\end{equation}
Note that the matrix on the right-hand side is not upper-triangular in general, but it has a lot of zeroes: its $\left( i,j\right) $-th entry $e_{n-2i+j}\left( \overrightarrow{x}\right) $ vanishes whenever $n-2i+j<0$, which covers roughly the lower half of its southwestern triangle. Thus its determinant is somewhat easier to compute than a general $n\times n$ determinant. But I don't think there is
a more explicit formula.
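For the curious, here is a small sympy check (my addition, not part of the answer) of the determinant formula for $n=3$, building the $e_{k}$ by hand with the convention $e_{k}=0$ for $k<0$ or $k>n$:

```python
from itertools import combinations

import sympy as sp

n = 3
x = sp.symbols(f"x1:{n+1}")

def e(k):
    # elementary symmetric polynomial e_k(x1, ..., xn); 0 if k < 0 or k > n
    if k < 0 or k > n:
        return sp.S(0)
    return sum(sp.prod(c) for c in combinations(x, k))

# the matrix (e_{n-2i+j}) with 1-based indices i, j
M = sp.Matrix(n, n, lambda i, j: e(n - 2 * (i + 1) + (j + 1)))
product = sp.prod(x[i] + x[j] for i in range(n) for j in range(i + 1, n))
assert sp.expand(M.det() - product) == 0
```

For $n=3$ this reduces to the classical identity $\left( x_{1}+x_{2}\right) \left( x_{1}+x_{3}\right) \left( x_{2}+x_{3}\right) =e_{1}e_{2}-e_{3}$.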
For $k = 0, 1, \ldots, n-1$, consider the (horizontal) vector $v_k$ whose $i$-th coordinate is the $k$-th elementary symmetric polynomial of the variables other than $x_i$, that is, $$ \sum_{\substack{1 \leq a_1 < a_2 < \cdots < a_k \leq n \\ a_j \neq i}} \ \prod_{j=1}^{k} x_{a_j}.$$
For example, with $n = 3$, we have
$v_0 = (1, 1, 1)$,
$v_1 = (x_2 + x_3, x_3 + x_1, x_1 + x_2)$,
$v_2 = ( x_2x_3, x_3x_1, x_1x_2)$.
With $n = 4$, we have
$v_0 = (1, 1, 1, 1)$,
$v_1 = (x_2 + x_3 + x_4, x_3 + x_4 + x_1, x_4 + x_1 + x_2, x_1 + x_2 + x_3)$,
$v_2 = ( x_2x_3+x_3x_4+x_4x_2, x_3x_4+x_4x_1+x_1x_3, x_4x_1+x_1x_2+x_2x_4, x_1x_2 + x_2x_3 + x_3x_1)$,
$v_3 = (x_2x_3x_4, x_3x_4x_1, x_4x_1x_2, x_1x_2x_3)$.
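Reading the $i$-th coordinate of $v_k$ as the $k$-th elementary symmetric polynomial in the variables other than $x_i$, the $n=3$ examples above can be reproduced with this sympy sketch (my addition):

```python
from itertools import combinations

import sympy as sp

def e(k, xs):
    # k-th elementary symmetric polynomial of the symbols xs
    return sum(sp.prod(c) for c in combinations(xs, k))

n = 3
x = sp.symbols(f"x1:{n+1}")
# v[k][i] = e_k of all variables except x_i
v = [[e(k, [x[j] for j in range(n) if j != i]) for i in range(n)]
     for k in range(n)]

x1, x2, x3 = x
assert v[0] == [1, 1, 1]
assert v[1] == [x2 + x3, x3 + x1, x1 + x2]
assert v[2] == [x2 * x3, x3 * x1, x1 * x2]
```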
Claim: $v_kA = (n-1-k) v_k$.
Proof: Expand the product; a lot of the cross terms cancel out.
For example, with $v_0$, each column sum is $n-1$, so $v_0 A = (n-1) v_0$.
For example, with $v_{n-1}$, the numerators are all $\prod_i x_i$, and by looking at the denominators, the terms cancel out to $0$, so $v_{n-1} A = 0$.
Do you see how we get $v_k A = (n-1-k)v_k$?
Corollary: The eigenvalues are $0, 1, 2, \ldots, n-1 $.
Best Answer
For $k=4$ we have
$$x_1 x_2 x_3 x_4 = \frac{1}{192} ({x_1}+{x_2}+{x_3}+{x_4})^4+\frac{1}{384} \left(({x_1}+{x_2}-{x_3}-{x_4})^4+({x_1}-{x_2}+{x_3}-{x_4})^4+(-{x_1}+{x_2}+{x_3}-{x_4})^4+({x_1}-{x_2}-{x_3}+{x_4})^4+(-{x_1}+{x_2}-{x_3}+{x_4})^4+(-{x_1}-{x_2}+{x_3}+{x_4})^4\right)+\frac{1}{192} \left(-({x_1}+{x_2}+{x_3}-{x_4})^4-({x_1}+{x_2}-{x_3}+{x_4})^4-({x_1}-{x_2}+{x_3}+{x_4})^4-(-{x_1}+{x_2}+{x_3}+{x_4})^4\right) $$
Maybe you recognize a pattern: the denominators factor as, for instance, $192 = 4 \cdot 6 \cdot 8$ and $384 = 2 \cdot 4 \cdot 6 \cdot 8$.
With some combinatorics it should be possible to work out the general rule.
Granular bastard's own answer:
$${\prod_{i=1}^k x_i =\frac{1}{(2k)!!} \sum_{(s_1, s_2, \dots, s_k) \in \{-1,1\}^k}\left[\left(\prod_{i=1}^k s_i \right)\left(\sum_{i=1}^k {s_i x_i}\right)^k\right]},$$ where $\{-1, 1\}^k := \Big\{(s_1, s_2, \dots, s_k) \mid s_i \in \{-1, 1\} \text{ for all positive integers } i \leq k \Big\}$ is the set of all $2^k$ tuples of length $k$ whose entries are either $-1$ or $1$. So, for example, the set $\{-1,1\}^2$ is defined to be $\{(-1,-1),(-1,1),(1,-1),(1,1)\}$.
Sketch of a proof
Some lingo:
The number of different linear combinations arising from the sign choices is $p=2^k$. The number of completely mixed monomials $x_1 x_2 \cdots x_k$ obtained when expanding a single linear combination $(\pm x_1 \pm x_2\pm \dots \pm x_k)^k$ is $k!$. Therefore, the completely mixed monomial occurs $k!\cdot 2^k=(2k)!!$ times when the powers of all the linear combinations are expanded completely. Due to the factor $(-1)^l=\prod_{i=1}^k s_i$ in the above expression, with $l$ being the number of minus signs in each bracket, all completely mixed monomials $x_1 x_2 \cdots x_k$ appear with coefficient $+1$. Therefore, the completely mixed monomials sum up to $(2k)!!\, x_1 x_2 \cdots x_k$.
In each incompletely mixed monomial, at least one $x_j$ is missing. For each missing $x_j$, we can divide the $p=2^k$ (an even number of) signed $k$-th powers into pairs $(\pm x_1 \dots + x_j \dots \pm x_k)^k$ and $-(\pm x_1 \dots - x_j \dots \pm x_k)^k$, where only the sign of $x_j$ differs. When expanding, the monomials that lack $x_j$ cancel each other out within each pair.
Therefore, only the completely mixed monomials are not cancelled out, which leads to the above expression.
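A quick symbolic check of the identity (my addition), here for $k=3$, where $(2k)!! = 2 \cdot 4 \cdot 6 = 48$:

```python
from itertools import product as iproduct

import sympy as sp

k = 3
x = sp.symbols(f"x1:{k+1}")
double_fact = sp.prod(range(2, 2 * k + 1, 2))  # (2k)!! = 2*4*...*(2k)

# sum over all sign tuples (s_1, ..., s_k) in {-1, 1}^k
total = sp.S(0)
for signs in iproduct((-1, 1), repeat=k):
    total += sp.prod(signs) * sum(s * xi for s, xi in zip(signs, x)) ** k

lhs = sp.expand(total / double_fact)
assert lhs == sp.prod(x)  # equals x1*x2*x3
```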