Let $a,b \in \mathbb{N}_0$ be such that $a+b \leq n$. Since the random variables $X_i$ take only the values $0$ and $1$, the walk is nondecreasing and moves up by at most $1$ per step, so on $\{S_n \geq a+b\}$ the walk must pass through $a$, i.e. there is a first index $k$ with $S_k = a$. Hence
$$\begin{align*} \{S_n \geq a+b\} &= \bigcup_{k=0}^n \left( \{S_0 < a, \ldots, S_{k-1} < a, S_k = a\} \cap \{S_n \geq a+b\} \right) \\ &= \bigcup_{k=0}^n \left( \{S_0 < a, \ldots, S_{k-1} < a, S_k = a\} \cap \{S_n-S_k \geq b\} \right) \end{align*}$$
Thus,
$$\mathbb{P}(S_n \geq a+b) \leq \sum_{k=0}^n \mathbb{P}(S_0<a,\ldots,S_{k-1}<a, S_k = a, S_n-S_k \geq b).$$
As $S_n - S_k = X_{k+1} + \cdots + X_n$ is independent of $S_0,\ldots,S_k$ (which are functions of $X_1,\ldots,X_k$ alone), this implies
$$\mathbb{P}(S_n \geq a+b) \leq \sum_{k=0}^n \mathbb{P}(S_0 < a,\ldots,S_{k-1}<a, S_k = a)\mathbb{P}(S_n-S_k \geq b).$$
Noting that
$$\mathbb{P}(S_n-S_k \geq b) = \mathbb{P}(S_{n-k} \geq b) \leq \mathbb{P}(S_n \geq b)$$
for any $k \leq n$ (the equality holds because the $X_i$ are i.i.d., and the inequality because $X_i \geq 0$ gives $S_{n-k} \leq S_n$), we get
$$\mathbb{P}(S_n \geq a+b) \leq \mathbb{P}(S_n \geq b) \underbrace{\sum_{k=0}^n \mathbb{P}(S_0 < a,\ldots,S_{k-1}<a, S_k = a)}_{=\,\mathbb{P}(S_n \geq a)}.$$
The braced sum equals $\mathbb{P}(S_n \geq a)$ because the events $\{S_0 < a, \ldots, S_{k-1} < a, S_k = a\}$, $k = 0,\ldots,n$, are disjoint and, by the passage argument above, their union is $\{S_n \geq a\}$.
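As a sanity check, for i.i.d. Bernoulli steps $S_n \sim \mathrm{Binomial}(n,p)$, so both sides of the inequality can be computed exactly. A minimal Python sketch (the parameters $n=20$, $p=0.3$ are arbitrary illustrative choices, not from the problem):

```python
from math import comb

def tail(n, p, a):
    """P(S_n >= a) for S_n ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(a, n + 1))

n, p = 20, 0.3  # arbitrary illustrative parameters
for a in range(n + 1):
    for b in range(n + 1 - a):
        # the inequality just proved: P(S_n >= a+b) <= P(S_n >= a) P(S_n >= b)
        assert tail(n, p, a + b) <= tail(n, p, a) * tail(n, p, b) + 1e-12
print("inequality verified for all a, b with a + b <= n")
```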
Remark: Abstractly speaking, the proof uses the strong Markov property of the random walk $(S_n)_{n \in \mathbb{N}}$, i.e. the fact that for any stopping time $\tau$ the process $T_n := S_{n+\tau}-S_{\tau}$, $n \in \mathbb{N}$, is independent of $(S_k)_{k \leq \tau}$ and has the same distribution as $(S_n)_{n \in \mathbb{N}}$.
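A small simulation makes the remark concrete for the walk at hand: take $\tau$ to be the first hitting time of a level $a$ and compare the empirical law of $T_m = S_{\tau+m} - S_\tau$ with that of $S_m$. This is only an illustrative sketch; all parameters below are arbitrary choices.

```python
import random
from itertools import accumulate
from collections import Counter

rng = random.Random(0)
p, a, m, N, trials = 0.4, 3, 5, 80, 50_000   # hypothetical illustrative parameters

fresh, post = Counter(), Counter()
for _ in range(trials):
    # one path of the walk: partial sums of i.i.d. Bernoulli(p) steps
    s = list(accumulate(int(rng.random() < p) for _ in range(N)))
    fresh[s[m - 1]] += 1                                      # law of S_m
    tau = next((k for k, v in enumerate(s) if v == a), None)  # first hit of level a
    if tau is not None and tau + m < N:      # tau is a.s. finite; guard is for safety
        post[s[tau + m] - s[tau]] += 1       # law of T_m = S_{tau+m} - S_tau

for v in sorted(set(fresh) | set(post)):     # the two empirical pmfs should agree
    print(v, fresh[v] / sum(fresh.values()), post[v] / sum(post.values()))
```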
$X_1,\dots,X_n$ being independent is, by definition, equivalent to the $\sigma$-algebras $\sigma(X_1),\dots,\sigma(X_n)$ being independent, which in turn is defined to mean that
$$
P(A_1\cap\cdots\cap A_n)=P(A_1)\cdots P(A_n) \quad\text{ for all }A_i\in \sigma(X_i)
$$
Now, since $\sigma(X_i)$ consists precisely of sets of the form $(X_i\in B_i)$ where $B_i\in{\cal B}(\mathbb{R})$, the definition of independence can be expressed equivalently as
$$
(1) \qquad P(X_1\in B_1,\dots ,X_n\in B_n)=P(X_1\in B_1)\dots P(X_n\in B_n) \quad\text{ for all }B_i\in {\cal B}(\mathbb{R})
$$
Next, in (1), take $B_1=\cdots=B_k=\mathbb{R}$. Then (1) becomes
$$
P(X_1\in \mathbb{R},\dots,X_k\in\mathbb{R},X_{k+1}\in B_{k+1},\dots,X_n\in B_n)
\\
=P(X_1\in \mathbb{R})\cdots P(X_k\in\mathbb{R})P(X_{k+1}\in B_{k+1})\cdots P(X_n\in B_n)
$$
which, since $P(X_i\in\mathbb{R})=1$, simplifies to
$$
(2)\qquad P(X_{k+1}\in B_{k+1},\dots,X_n\in B_n)
=P(X_{k+1}\in B_{k+1})\cdots P(X_n\in B_n)
$$
A similar argument, taking $B_{k+1}=\cdots=B_n=\mathbb{R}$ instead, gives
$$
(3)\qquad P(X_1\in B_1,\dots,X_k\in B_k)
=P(X_1\in B_1)\cdots P(X_k\in B_k)
$$
Hence, substituting (2) and (3) into (1) gives
\begin{eqnarray*}
&&P((X_1,\dots,X_k,X_{k+1},\dots,X_n)\in B_1\times\cdots\times B_k\times B_{k+1}\times\cdots\times B_n)\\
&=&P(X_1\in B_1,\dots,X_k\in B_k,X_{k+1}\in B_{k+1},\dots,X_n\in B_n)\\
&=&
P(X_1\in B_1)\cdots P(X_k\in B_k)P(X_{k+1}\in B_{k+1})\cdots P(X_n\in B_n)\\
&=&
\Big[P(X_1\in B_1)\cdots P(X_k\in B_k)\Big]\Big[P(X_{k+1}\in B_{k+1})\cdots P(X_n\in B_n)\Big]\\
&=&
\Big[P(X_1\in B_1,\dots,X_k\in B_k)\Big]\Big[P(X_{k+1}\in B_{k+1},\dots,X_n\in B_n)\Big]\\
&=&
\Big[P((X_1,\dots,X_k)\in B_1\times\cdots\times B_k)\Big]\Big[P((X_{k+1},\dots,X_n)\in B_{k+1}\times\cdots\times B_n)\Big]\\
\end{eqnarray*}
which holds for all $B_i\in {\cal B}(\mathbb{R})$.
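For discrete random variables this whole chain can be checked numerically: build the joint law as the product of the marginals (which is (1) applied to singletons) and compare both sides. A hypothetical sketch with $n=4$ and $k=2$; the pmfs and the sets $B_i$ are made-up examples, and passing None for a coordinate plays the role of taking $B_i = \mathbb{R}$:

```python
from itertools import product

# Hypothetical example: n = 4 independent variables; the joint pmf is the
# product of the marginals, which is exactly (1) applied to singletons.
pmfs = [{0: .5, 1: .5}, {0: .2, 1: .3, 2: .5}, {-1: .4, 1: .6}, {0: .1, 3: .9}]
joint = {xs: pmfs[0][xs[0]] * pmfs[1][xs[1]] * pmfs[2][xs[2]] * pmfs[3][xs[3]]
         for xs in product(*pmfs)}

def prob(Bs):
    """P(X_1 in Bs[0], ..., X_4 in Bs[3]); an entry None stands for B_i = R."""
    return sum(p for xs, p in joint.items()
               if all(B is None or x in B for x, B in zip(xs, Bs)))

B1, B2, B3, B4 = {0}, {1, 2}, {-1, 1}, {3}   # made-up Borel sets, split at k = 2
full = prob([B1, B2, B3, B4])                # left-hand side of the chain
head = prob([B1, B2, None, None])            # (3): take B_3 = B_4 = R
rest = prob([None, None, B3, B4])            # (2): take B_1 = B_2 = R
print(full, head * rest)                     # equal, as the chain of equalities asserts
```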
Now, sets of the form $B_1\times\cdots\times B_k$, where the $B_i\in{\cal B}(\mathbb{R})$, form a $\pi$-system generating ${\cal B}(\mathbb{R}^k)$. Hence, sets of the form $((X_1,\dots,X_k)\in B_1\times\cdots\times B_k)$ generate $\sigma(X_1,\dots,X_k)$. Similarly, sets of the form $B_{k+1}\times\cdots\times B_n$, where the $B_i\in{\cal B}(\mathbb{R})$, form a $\pi$-system generating ${\cal B}(\mathbb{R}^{n-k})$, and sets of the form $((X_{k+1},\dots,X_n)\in B_{k+1}\times\cdots\times B_n)$ generate $\sigma(X_{k+1},\dots,X_n)$. Since the factorization holds on these generating $\pi$-systems, Dynkin's $\pi$-$\lambda$ theorem extends it to the generated $\sigma$-algebras: for any $M\in{\cal B}(\mathbb{R}^k)$ and $N\in{\cal B}(\mathbb{R}^{n-k})$ we have
\begin{eqnarray*}
&&P((X_1,\dots,X_k)\in M, (X_{k+1},\dots,X_n)\in N )\\
&=&P((X_1,\dots,X_k,X_{k+1},\dots,X_n)\in M\times N )\\
&=&\Big[P((X_1,\dots,X_k)\in M)\Big]\Big[P((X_{k+1},\dots,X_n)\in N)\Big]\\
\end{eqnarray*}
which shows that $(X_1,\dots,X_k)$ and $(X_{k+1},\dots,X_n)$ are independent.
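The point of extending to general $M$ and $N$ is that they need not be product sets. Continuing the sketch above, one can test a "diagonal" $M$ and another non-product $N$ (both made up for illustration):

```python
# Continuing the sketch above: M and N need not be product sets.
M = {(0, 0), (1, 1)}            # a "diagonal" set in B(R^2), not of the form B_1 x B_2
N = {(-1, 0), (-1, 3), (1, 3)}  # another non-product set in B(R^2)
lhs = sum(p for xs, p in joint.items() if xs[:2] in M and xs[2:] in N)
pM  = sum(p for xs, p in joint.items() if xs[:2] in M)
pN  = sum(p for xs, p in joint.items() if xs[2:] in N)
print(lhs, pM * pN)             # equal: the two blocks are independent as random vectors
```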
Best Answer
Everything seems correct.
But yes, there is both a quicker and a better way, based on the fact (which is easy to prove) that a random variable is symmetric if and only if its characteristic function is real-valued. Given this, the symmetry of $S_n$ follows immediately from $$ \varphi_{S_n} (t) = \prod_{k=1}^n \varphi_{X_k}(t): $$ each factor is real-valued because each $X_k$ is symmetric, hence $\varphi_{S_n}$ is real-valued and $S_n$ is symmetric.
This approach is better because it proves the required property for arbitrary random variables, not only discrete ones.
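To illustrate, here is a short Python check using made-up symmetric discrete laws: each factor $\varphi_{X_k}(t)$ is real, so the product $\varphi_{S_n}(t)$ comes out real up to floating-point rounding.

```python
import cmath

def cf(pmf, t):
    """Characteristic function E[exp(itX)] of a discrete law given by a pmf."""
    return sum(p * cmath.exp(1j * t * x) for x, p in pmf.items())

# made-up symmetric laws for X_1, X_2, X_3
laws = [{-1: .5, 1: .5}, {-2: .25, 0: .5, 2: .25}, {-1: .5, 1: .5}]
for t in (0.3, 1.0, 2.7):       # arbitrary test points
    phi_S = 1 + 0j
    for pmf in laws:
        phi_S *= cf(pmf, t)     # phi_{S_n}(t) = prod_k phi_{X_k}(t)
    print(f"t={t}: phi_S = {phi_S.real:.6f} + {phi_S.imag:.1e}j")
```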