This response attempts to address your second question.
I believe the following, a modification of a classical counterexample, works, but it would be helpful to have others check it.
It is well-known that the lognormal distribution is not determined by its moments. The density of the (standard) lognormal is
$$
f_0(x) = \frac{1}{x \sqrt{2\pi}} e^{-(\log x)^2/2} ,
$$
for $x > 0$ and is $0$ otherwise.
We can construct an indexed family of distributions with the same moments, as follows. Let the density parameterized by $a \in [-1,1]$ be
$$
f_a(x) = f_0(x) (1 + a \sin(2\pi\log x)) .
$$
Note that
$$
\sqrt{2\pi} \int_0^\infty x^r f_0(x) \sin(2\pi\log x) \,\mathrm{d}x = \int_{-\infty}^{\infty} e^{yr + r^2} e^{-(y+r)^2/2} \sin(2\pi(y+r)) \,\mathrm{d}y ,
$$
by making the change of variables $\log x = y + r$. Simplifying the exponent in the second integral (note $yr + r^2 - (y+r)^2/2 = r^2/2 - y^2/2$), we get
$$
e^{r^2/2} \int_{-\infty}^{\infty} e^{-y^2/2} \sin(2\pi y + 2\pi r) \mathrm{d}y ,
$$
and, in particular, the integral vanishes whenever $r = k/2$ with $k \in \mathbb Z$: in that case $\sin(2\pi y + \pi k) = \pm \sin(2\pi y)$, which is odd and so integrates to zero against the even factor $e^{-y^2/2}$.
Hence each $f_a$ is a density: it is nonnegative because $|a \sin(2\pi\log x)| \le 1$, and it integrates to $1$ by the case $r = 0$. Moreover, all the $f_a$ have the same moments and "half-moments" (moments of every half-integer order).
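If it helps, here is a quick numerical sanity check (a sketch using `scipy.integrate.quad`; the function name is mine) that the perturbation integral above vanishes at half-integer $r$:

```python
import numpy as np
from scipy.integrate import quad

def perturbation_moment(r):
    # integral of x^r * f_0(x) * sin(2*pi*log x) over (0, inf),
    # rewritten via the substitution y = log x
    integrand = lambda y: (np.exp(r * y - y**2 / 2) / np.sqrt(2 * np.pi)
                           * np.sin(2 * np.pi * y))
    val, _ = quad(integrand, -np.inf, np.inf)
    return val

# should vanish for every half-integer r = k/2
vals = [perturbation_moment(k / 2) for k in range(5)]
```

All of the values come out at numerical zero, consistent with the computation above.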
Counterexample (claimed): Let $\{X_a\}$ be a set of random variables indexed by $a$ having density $f_a$. Let $\epsilon \in \{-1,+1\}$ be a random variable such that $\mathbb P(\epsilon = 1) = 1/2$ and independent of $\{X_a\}$. Set $Y_a = \epsilon X_a^{1/4}$. Then, the $Y_a$ are symmetric and have the same moments, but the absolute moments differ.
Proof: The $Y_a$ are symmetric by construction, so their odd moments all vanish. For the even moments, $\mathbb E Y_a^{2n} = \mathbb E X_a^{n/2}$, which by the half-moment computation above does not depend on $a$; hence the $Y_a$ all share the same moments. But, for odd $n$,
$$
\mathbb E |Y_a|^n = \mathbb E X_a^{n/4} = \mathbb E X_0^{n/4} + a \frac{e^{(n/4)^2/2}}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-y^2/2} \sin(2\pi y + \pi n / 2) \mathrm{d}y.
$$
Taking $n = 1$, we see that
$$
\mathbb E |Y_a| = e^{1/32} + a e^{\frac{1}{32}-2\pi^2} = e^{1/32}(1 + a e^{-2 \pi^2}),
$$
which varies with $a$, so the first absolute moments differ. $\square$
If the $X_i$ are i.i.d. Gaussian with variance $1$, then you have
$$ c_p := \mathbb{E} |X_k|^p = \frac{2^{p/2} \Gamma(\frac{p+1}{2})}{\sqrt{\pi}}.$$
The variable $S_n$ is also Gaussian with variance $n$, therefore you have
$$\mathbb{E} |S_n|^p = c_p n^{p/2}.$$
Hence, $\frac{\sum_{k=1}^n \mathbb{E} |X_k|^p}{\mathbb{E} |S_n|^p} = \frac{n c_p}{c_p n^{p/2}} = n^{1-p/2} \rightarrow \infty$ for $1<p<2$. In particular, no constant $C$ of the kind you hoped for can exist in this range.
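This computation can be sanity-checked numerically (a sketch; the function names are mine): quadrature reproduces the closed form for $c_p$, and the ratio above indeed equals $n^{1-p/2}$:

```python
import numpy as np
from math import gamma, sqrt, pi
from scipy.integrate import quad

def c_p(p):
    # closed form for E|Z|^p with Z ~ N(0, 1)
    return 2 ** (p / 2) * gamma((p + 1) / 2) / sqrt(pi)

def abs_moment(p, var):
    # E|Y|^p for Y ~ N(0, var): integrate on (0, inf) and double by symmetry
    integrand = lambda y: 2 * y ** p * np.exp(-y ** 2 / (2 * var)) / sqrt(2 * pi * var)
    val, _ = quad(integrand, 0, np.inf)
    return val

p, n = 1.5, 100
# sum E|X_k|^p / E|S_n|^p = n * c_p / (c_p * n^{p/2}) = n^{1 - p/2}
ratio = n * c_p(p) / abs_moment(p, var=n)
```

With $p = 1.5$ and $n = 100$ the ratio is $100^{0.25} \approx 3.16$, and it grows without bound as $n \to \infty$.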
Best Answer
Yes, this inequality, with the best possible $C$ ($\le 2$), was proved in this paper; see e.g. inequality (1.11) there.
Indeed, that inequality implies that $$E\Big|\sum_{i=1}^m U_i\Big|^r\le E|U_1|^r+C_r\sum_{i=2}^m E|U_i|^r,$$ where the $U_i$'s are martingale differences and $C_r\in[1,2]$. Taking here $m=2$, $U_1:=Y_{n-1}$, and $U_2:=X_n$, we get $$E|Y_n|^r\le E|Y_{n-1}|^r+C_rE|X_n|^r,$$ as desired.