Let $f\in K[X_1,\dots,X_n]$ be a symmetric polynomial. Then $$f\in K(X_1,\dots,X_n)^{S_n}=K(\sigma_1,\dots,\sigma_n).$$ We want to prove that $f\in K[\sigma_1,\dots,\sigma_n]$. The ring extension $K[\sigma_1,\dots,\sigma_n]\subset K[X_1,\dots,X_n]$ is integral, since $X_i$ is integral over $K[\sigma_1,\dots,\sigma_n]$ for all $i=1,\dots,n$. (Note that $X_i$ is a root of the monic polynomial $X^n - \sigma_1X^{n-1} + \cdots + (-1)^n\sigma_n \in K[\sigma_1,\dots,\sigma_n][X]$.) In particular, $f$ is integral over $K[\sigma_1,\dots,\sigma_n]$. Since $f\in K(\sigma_1,\dots,\sigma_n)$ and $K[\sigma_1,\dots,\sigma_n]$ is integrally closed (it is a polynomial ring, hence a UFD, and every UFD is integrally closed in its fraction field), we get $f\in K[\sigma_1,\dots,\sigma_n]$.
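As a quick sanity check of the integrality claim, here is a small sketch using the sympy library (my own addition, not part of the argument) verifying for $n=3$ that each $X_i$ is a root of the monic polynomial $X^3-\sigma_1X^2+\sigma_2X-\sigma_3$:

```python
import sympy as sp

# Verify for n = 3 that each x_i is a root of the monic polynomial
# X^3 - sigma_1 X^2 + sigma_2 X - sigma_3, whose coefficients are the
# elementary symmetric polynomials in x_1, x_2, x_3.
x1, x2, x3, X = sp.symbols('x1 x2 x3 X')
sigma1 = x1 + x2 + x3
sigma2 = x1*x2 + x1*x3 + x2*x3
sigma3 = x1*x2*x3
p = X**3 - sigma1*X**2 + sigma2*X - sigma3
for root in (x1, x2, x3):
    assert sp.expand(p.subs(X, root)) == 0
```

This is just the expansion of $(X-x_1)(X-x_2)(X-x_3)$, which is why each $x_i$ is a root.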
Here is an answer for any number $n$ of variables.
For any partition
$\lambda$, we let $s_{\lambda}$ denote the Schur
function
corresponding to $\lambda$. Let $\delta$ be the partition $\left(
n-1,n-2,\ldots,1\right) $ of $n\left( n-1\right) /2$. Then,
\begin{equation}
\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right)
= s_{\delta}\left( x_{1},x_{2},\ldots,x_{n}\right) .
\label{1}
\tag{1}
\end{equation}
This well-known formula (which, I think, is due to Jacobi) can easily be
derived from the definition of Schur polynomials through alternants. Let me explain:
For every $n$-tuple $\alpha=\left( \alpha_{1},\alpha_{2},\ldots,\alpha
_{n}\right) \in\mathbb{N}^{n}$ of nonnegative integers, we define the
alternant $a_{\alpha}$ to be the polynomial
\begin{equation}
\sum_{\sigma\in S_{n}}\left( -1\right) ^{\sigma}x_{1}^{\alpha_{\sigma\left( 1\right) }}x_{2}^{\alpha_{\sigma\left( 2\right) }}\cdots x_{n}^{\alpha_{\sigma\left( n\right) }}
= \det\left( \left( x_i^{\alpha_j}\right) _{1\leq i\leq n,\ 1\leq
j\leq n}\right)
\end{equation}
in $\mathbb{Z}\left[ x_{1},x_{2},\ldots,x_{n}\right] $. If $\alpha=\left(
\alpha_{1},\alpha_{2},\ldots,\alpha_{n}\right) $ and $\beta=\left( \beta
_{1},\beta_{2},\ldots,\beta_{n}\right) $ are two $n$-tuples in $\mathbb{N}
^{n}$, then the $n$-tuple $\alpha+\beta$ is defined to be $\left( \alpha
_{1}+\beta_{1},\alpha_{2}+\beta_{2},\ldots,\alpha_{n}+\beta_{n}\right) $.
Any partition $\lambda=\left( \lambda_{1},\lambda_{2},\ldots,\lambda
_{k}\right) $ of length $k\leq n$ will be identified with the $n$-tuple
$\left( \lambda_{1},\lambda_{2},\ldots,\lambda_{k},\underbrace{0,0,\ldots
,0}_{n-k\text{ times}}\right) \in\mathbb{N}^{n}$. Notice that $\delta$ is
thus identified with the $n$-tuple $\left( n-1,n-2,\ldots,1,0\right) $.
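As an illustrative check (a sympy sketch of my own, not part of the original text), the determinant form and the signed-sum form of the alternant agree; here for $n=3$ and an arbitrary exponent tuple $\alpha=(3,1,0)$:

```python
import itertools
import math

import sympy as sp
from sympy.combinatorics import Permutation

n = 3
xs = sp.symbols('x1:4')
alpha = (3, 1, 0)  # an arbitrary exponent tuple, chosen for illustration

# Determinant form: det((x_i^{alpha_j}))
det_form = sp.Matrix(n, n, lambda i, j: xs[i]**alpha[j]).det()

# Signed-sum form: sum over S_n of (-1)^sigma * prod_i x_i^{alpha_{sigma(i)}}
sum_form = sum(
    Permutation(list(perm)).signature()
    * math.prod(xs[i]**alpha[perm[i]] for i in range(n))
    for perm in itertools.permutations(range(n))
)

assert sp.expand(det_form - sum_form) == 0
```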
Then, the "alternant formula" for the Schur polynomial $s_{\lambda}\left(
x_{1},x_{2},\ldots,x_{n}\right) $ says that
\begin{equation}
s_{\lambda}\left( x_{1},x_{2},\ldots,x_{n}\right) =\dfrac{a_{\lambda+\delta
}}{a_{\delta}}
\label{2}
\tag{2}
\end{equation}
whenever $\lambda$ is a partition of length $\leq n$. This is the historically
first definition of Schur polynomials, long before the modern definitions via
Young tableaux or the Jacobi-Trudi formulas were discovered; its main
disadvantages are that it only defines $s_{\lambda}$ in finitely many
variables and that it requires proof that $\dfrac{a_{\lambda+\delta}
}{a_{\delta}}$ is indeed a polynomial. But this is fine for us.
(For a proof of the fact that this definition of $s_\lambda$ is equivalent
to the modern combinatorial definition, you can consult Corollary 2.6.6 in Darij Grinberg and Victor Reiner, Hopf algebras in combinatorics, arXiv:1409.8356v5. In the same chapter you will find an exercise proving the Jacobi-Trudi identity, which is yet another popular definition of the Schur polynomials.)
We can now prove \eqref{1}. Indeed, $\delta=\left( n-1,n-2,\ldots,0\right)
\in\mathbb{N}^{n}$, so that the definition of $a_{\delta}$ yields
\begin{equation}
a_{\delta}=\det\left( \left( x_{i}^{n-j}\right) _{1\leq i\leq
n,\ 1\leq j\leq n}\right) =\prod_{1\leq i<j\leq n}\left( x_{i}-x_{j}\right)
\label{3}
\tag{3}
\end{equation}
(by the Vandermonde determinant formula). But $\delta+\delta=\left( 2\left(
n-1\right) ,2\left( n-2\right) ,\ldots,2\cdot0\right) $, so that
\begin{align}
a_{\delta+\delta} & =\det\left( \left( x_{i}^{2\left( n-j\right)
}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) =\det\left( \left(
\left( x_{i}^{2}\right) ^{n-j}\right) _{1\leq i\leq n,\ 1\leq j\leq
n}\right) \nonumber\\
& =\prod_{1\leq i<j\leq n}\left( x_{i}^{2}-x_{j}^{2}\right)
\label{4}
\tag{4}
\end{align}
(again by the Vandermonde determinant formula). Dividing \eqref{4} by
\eqref{3}, we obtain
\begin{equation}
\dfrac{a_{\delta+\delta}}{a_{\delta}}=\dfrac{\prod_{1\leq i<j\leq n}\left(
x_{i}^{2}-x_{j}^{2}\right) }{\prod_{1\leq i<j\leq n}\left( x_{i}
-x_{j}\right) }=\prod_{1\leq i<j\leq n}\underbrace{\dfrac{x_{i}^{2}-x_{j}
^{2}}{x_{i}-x_{j}}}_{=x_{i}+x_{j}}=\prod_{1\leq i<j\leq n}\left( x_{i}
+x_{j}\right) .
\end{equation}
But applying \eqref{2} to $\lambda=\delta$, we obtain $s_{\delta}\left(
x_{1},x_{2},\ldots,x_{n}\right) =\dfrac{a_{\delta+\delta}}{a_{\delta}}
=\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) $. Thus, \eqref{1} is proven.
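The computation above can be replayed mechanically; here is a sympy sketch (my own addition, not part of the answer) carrying out the alternant ratio for $n=3$:

```python
import math
import sympy as sp

n = 3
xs = sp.symbols('x1:4')

def alternant(alpha):
    """a_alpha = det((x_i^{alpha_j})) for an exponent tuple alpha."""
    return sp.Matrix(n, n, lambda i, j: xs[i]**alpha[j]).det()

delta = tuple(n - 1 - k for k in range(n))   # (2, 1, 0)
two_delta = tuple(2*d for d in delta)        # (4, 2, 0)

# s_delta = a_{delta+delta} / a_delta, by the alternant formula (2)
s_delta = sp.cancel(alternant(two_delta) / alternant(delta))

product = math.prod(xs[i] + xs[j] for i in range(n) for j in range(i + 1, n))
assert sp.expand(s_delta - product) == 0
```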
Finally, we can transform \eqref{1} into an explicit (if you allow
determinants) formula for $\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right)
$ in terms of the elementary symmetric polynomials. To that aim, we let
$e_{k}\left( \overrightarrow{x}\right) $ denote the $k$-th elementary
symmetric polynomial in $x_{1},x_{2},\ldots,x_{n}$. Notice that $e_{k}\left(
\overrightarrow{x}\right) =0$ whenever $k<0$ and also whenever $k>n$.
Recall that one of the two Jacobi-Trudi formulas says that if $\lambda$ is a
partition of length $\leq n$, then
\begin{equation}
s_{\lambda^{t}}\left( x_{1},x_{2},\ldots,x_{n}\right) =\det\left( \left(
e_{\lambda_{i}-i+j}\left( \overrightarrow{x}\right) \right) _{1\leq i\leq
n,\ 1\leq j\leq n}\right) ,
\end{equation}
where $\lambda^{t}$ denotes the conjugate partition of $\lambda$ (this is the
partition whose $i$-th entry is the number of entries of $\lambda$ that are
$\geq i$, for each $i\in\left\{ 1,2,3,\ldots\right\} $). Applying this to
$\lambda=\delta$, we obtain
\begin{align*}
s_{\delta^{t}}\left( x_{1},x_{2},\ldots,x_{n}\right) & =\det\left(
\left( e_{\left( n-i\right) -i+j}\left( \overrightarrow{x}\right)
\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) \\
& =\det\left( \left( e_{n-2i+j}\left( \overrightarrow{x}\right) \right)
_{1\leq i\leq n,\ 1\leq j\leq n}\right) \\
& =\det\left(
\begin{array}
[c]{cccc}
e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\
e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\
\vdots & \vdots & \ddots & \vdots\\
e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left(
\overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right)
\end{array}
\right) .
\end{align*}
Since $\delta^{t}=\delta$ (this is easy to check), this simplifies to
\begin{equation}
s_{\delta}\left( x_{1},x_{2},\ldots,x_{n}\right) =\det\left(
\begin{array}
[c]{cccc}
e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\
e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\
\vdots & \vdots & \ddots & \vdots\\
e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left(
\overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right)
\end{array}
\right) .
\end{equation}
Thus, \eqref{1} rewrites as
\begin{equation}
\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right)
=\det\left(
\begin{array}
[c]{cccc}
e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\
e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x}
\right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\
\vdots & \vdots & \ddots & \vdots\\
e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left(
\overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right)
\end{array}
\right) .
\end{equation}
Note that the matrix on the right hand side is not generally triangular, but it has many zeroes (roughly the whole lower-left half consists of zeroes), so its determinant is somewhat easier to compute than a general $n\times n$ determinant. But I don't think there is a more explicit formula.
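To double-check the determinantal formula, here is a small sympy computation for $n=3$ (again my own sketch, not part of the answer); in that case the formula reduces to the classical identity $(x_1+x_2)(x_1+x_3)(x_2+x_3)=e_1e_2-e_3$:

```python
import itertools
import math

import sympy as sp

n = 3
xs = sp.symbols('x1:4')

def e(k):
    """k-th elementary symmetric polynomial in xs; zero outside 0..n."""
    if k < 0 or k > n:
        return sp.Integer(0)
    return sum(math.prod(c) for c in itertools.combinations(xs, k))

# Matrix entries e_{n-2i+j} for 1 <= i, j <= n (indices shifted below,
# since Python is 0-indexed)
M = sp.Matrix(n, n, lambda i, j: e(n - 2*(i + 1) + (j + 1)))
lhs = M.det()
rhs = math.prod(xs[i] + xs[j] for i in range(n) for j in range(i + 1, n))
assert sp.expand(lhs - rhs) == 0
```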
Best Answer
Proof for symmetric polynomials in two variables $X$, $Y$:
Let $f(X,Y)$ be a symmetric polynomial of degree $n$. By definition, $f(X,Y)=f(Y,X)$. Now, $$f(X,Y)= \sum_{0 \le i\le n,\ 0\le j \le n,\ i+j\le n} a_{ij}X^iY^j$$ for some $a_{ij}$, and $a_{ij}=a_{ji}$ for all $0\le i \le n$, $0\le j\le n$. $\;\;\;\;(*)$
If $\alpha,\beta,c \in \mathbb{R}$ are arbitrary constants and $f(X,Y), g(X,Y)$ are two symmetric polynomials, then $\alpha f(X,Y)+\beta g(X,Y)+c$ is also a symmetric polynomial. This obviously extends to any number of symmetric polynomials, i.e.
$$\sum_{j=1}^{p} \alpha_j f_j(X,Y)+c \; \text{is symmetric whenever all } \alpha_j \in \mathbb{R} \text{ and all } f_j \text{ are symmetric.} \;\;\;\;(**)$$
If n is even, then let $n=2m$ $$(X+Y)^n=X^n+\binom n 1 X^{n-1}Y+ \;...\;+\binom n mX^mY^m+ \;...\;+\binom n {n-1} XY^{n-1}+Y^n$$
If n is odd, then let $n=2m+1$ $$(X+Y)^n=X^n+\binom n 1 X^{n-1}Y+ \;...\;+\binom n mX^{m+1}Y^m+\binom n {m+1}X^mY^{m+1}+ \;...\;+\binom n {n-1} XY^{n-1}+Y^n$$
It is clear that $(X+Y)^n$ is symmetric for all $n\in \mathbb{N}$. Moreover, for $1\le k\le n+1$, the $k$-th term has the same coefficient as the $(n+2-k)$-th term, and the two terms are exchanged by swapping $X$ and $Y$.
By $(*)$ and $(**)$ , it suffices to prove that every symmetric polynomial $f_0(X,Y)$ with at most 2 non-constant terms can be generated by $X+Y$ and $XY$, and that all polynomials with at most 2 non-constant terms generated by $X+Y$ and $XY$ are symmetric.
Suppose $\mu X^iY^j$ is symmetric, where $i\ge 0, j\ge 0$ are not both $0$. Then $X^iY^j=X^jY^i,$ which implies $i=j$.
Now, let us consider the polynomial $\eta_1 X^iY^j + \eta_2 X^rY^s$, where $i+j \ne 0$ and $r+s \ne 0$. If it is symmetric, then $$\eta_1 X^iY^j + \eta_2 X^rY^s=\eta_1 X^jY^i + \eta_2 X^sY^r.$$
Matching monomials, either $X^iY^j=X^jY^i$ and $X^rY^s=X^sY^r$, or $\eta_1=\eta_2$, $X^iY^j=X^sY^r$, $X^jY^i=X^rY^s$. In the first case, $i=j$ and $r=s$; in the second case, $i=s$ and $j=r$. Hence, all symmetric polynomials with two non-constant terms are of the form $\eta X^iY^i+\omega X^jY^j+c \;\text{or}\; \gamma (X^iY^j+X^jY^i)+c$.
Now, $\mu X^iY^i$ and $\eta X^iY^i+\omega X^jY^j$ can clearly be generated by $XY$, so we only have to check $\gamma (X^iY^j+X^jY^i)\;,$ where $i\ne j$.
Without loss of generality, suppose that $i\gt j$, so we have $$\gamma (X^iY^j+X^jY^i)=\gamma X^jY^j(X^{i-j}+Y^{i-j}).$$ Now we only have to check whether $X^{i-j}+Y^{i-j}$ can be generated by $X+Y$ and $XY$.
Let $i-j=q$. When $q=1$, $X^q+Y^q$ can trivially be generated by $X+Y$, since it's itself $X+Y$. When $q=2, X^q+Y^q=X^2+Y^2=(X+Y)^2-2XY,$ so $X^q+Y^q$ can be generated by $X+Y$ and $XY$ as well. Notice that
$$X^{q+1}+Y^{q+1}=(X^q+Y^q)(X+Y)-(XY^q+YX^q)=(X^q+Y^q)(X+Y)-XY(X^{q-1}+Y^{q-1})$$
Therefore, $X^{q+1}+Y^{q+1}$ can be generated by $X+Y$, $XY$, $X^{q-1}+Y^{q-1}$, and $X^q+Y^q$. Hence, by induction, $X^q+Y^q$ can be generated by $X+Y$ and $XY$ for all $q\in \mathbb{N}$. This means
$$\gamma(X^iY^j+X^jY^i)$$ can be generated by $X+Y$ and $XY$, hence all symmetric polynomials $f(X,Y)$ can be generated by $X+Y$ and $XY$.
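The recurrence underlying this induction is easy to check mechanically; here is a short sympy sketch (my own addition, not part of the proof) expressing $X^q+Y^q$ as a polynomial in $s=X+Y$ and $t=XY$:

```python
import sympy as sp

X, Y, s, t = sp.symbols('X Y s t')

def power_sum(q):
    """X^q + Y^q written in s = X + Y and t = XY, via the recurrence
    X^{q+1} + Y^{q+1} = (X^q + Y^q)(X + Y) - XY (X^{q-1} + Y^{q-1})."""
    if q == 0:
        return sp.Integer(2)
    if q == 1:
        return s
    return sp.expand(s*power_sum(q - 1) - t*power_sum(q - 2))

# Substituting s -> X+Y and t -> XY must recover X^q + Y^q
for q in range(8):
    back = power_sum(q).subs({s: X + Y, t: X*Y})
    assert sp.expand(back - (X**q + Y**q)) == 0
```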
All polynomials generated by $X+Y$ and $XY$ are of the form $$\sum_{0 \le i'\le n,\ 0\le j' \le n,\ i'+j'\le n} \kappa_{i'j'}(X+Y)^{i'}(XY)^{j'}$$ for some $n\in\mathbb{N}$.
Each term of this sum is symmetric, so the sum is symmetric as well. Hence, every polynomial generated by $X+Y$ and $XY$ is symmetric, and every symmetric polynomial can be generated by $X+Y$ and $XY$.
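Putting the whole argument together, the proof is effectively an algorithm: pair up the monomials $X^iY^j$ and $X^jY^i$, factor out $(XY)^{\min(i,j)}$, and expand the leftover power sum via the recurrence. Here is a sympy sketch of that algorithm (the helper names are mine, not from the text):

```python
import sympy as sp

X, Y, s, t = sp.symbols('X Y s t')

def power_sum(q):
    """X^q + Y^q in terms of s = X + Y and t = XY (recurrence from the proof)."""
    if q == 0:
        return sp.Integer(2)
    if q == 1:
        return s
    return sp.expand(s*power_sum(q - 1) - t*power_sum(q - 2))

def to_elementary(f):
    """Rewrite a symmetric polynomial f(X, Y) as a polynomial in s, t,
    following the monomial-pairing strategy of the proof."""
    result = sp.Integer(0)
    seen = set()
    for (i, j), c in sp.Poly(sp.expand(f), X, Y).terms():
        if (j, i) in seen:
            continue  # the mirror monomial was already handled
        seen.add((i, j))
        if i == j:
            result += c * t**i                     # c X^i Y^i = c t^i
        else:
            a, b = max(i, j), min(i, j)
            result += c * t**b * power_sum(a - b)  # covers both mirror terms
    return sp.expand(result)

f = X**3*Y + X*Y**3 + 5*X**2*Y**2 + X + Y + 7
g = to_elementary(f)
# Substituting back must recover f
assert sp.expand(g.subs({s: X + Y, t: X*Y}) - f) == 0
```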