In the arXiv paper "Schoenberg matrices of radial positive definite functions
and Riesz sequences in $L^2(\mathbb{R}^n)$" by L. Golinskii, M. Malamud and L. Oridoroga, Definition 1.1 states
Let $n \in \mathbb{N}$. A real-valued and continuous function $f$ on $\mathbb{R}_+ = [0,\infty)$ is called a radial positive definite function if for an arbitrary finite set $\{x_1,\ldots,x_m\}$, $x_k \in \mathbb{R}^n$, and $(\xi_1,\ldots,\xi_m) \in \mathbb{C}^m$ $$\sum_{k,j = 1}^{m}f(|x_k-x_j|)\xi_k\overline{\xi}_j \ge 0.$$
Note that this definition is equivalent to saying that the matrix $S \in \mathbb{R}^{m \times m}$ defined by $S(k,j) = f(\|x_k-x_j\|)$ is positive semidefinite for any choice of points $x_1,\ldots,x_m \in \mathbb{R}^n$.
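For concreteness, here is a small numerical sketch (using NumPy, not the authors' code) that builds the matrix $S$ for a sample choice of $f$ and checks its eigenvalues. The Gaussian $f(r) = e^{-r^2}$ is used because it is known to be radial positive definite in every dimension; the point count and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def schoenberg_matrix(points, f):
    """Build S(k, j) = f(||x_k - x_j||) for an array of points (rows)."""
    diffs = points[:, None, :] - points[None, :, :]
    return f(np.linalg.norm(diffs, axis=-1))

# The Gaussian f(r) = exp(-r^2) is radial positive definite in every R^n.
points = rng.standard_normal((8, 3))   # 8 random points in R^3
S = schoenberg_matrix(points, lambda r: np.exp(-r**2))

eigvals = np.linalg.eigvalsh(S)
print(eigvals.min())  # nonnegative up to floating-point round-off
```

Positive semidefiniteness of a symmetric matrix is equivalent to all eigenvalues being nonnegative, so checking `eigvals.min()` (up to round-off) is the natural numerical test.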
Also, Theorem 1.2 states the following:
A function $f$ with $f(0) = 1$ belongs to $\Phi_n$ if and only if there exists a probability measure $\nu$ on $\mathbb{R}_+$ such that $$f(r) = \int_{0}^{\infty}\Omega_n(rt)\nu(dt), ~~~~ r \in \mathbb{R}_+$$ where $$\Omega_n(s) := \Gamma(q+1)\left(\dfrac{2}{s}\right)^qJ_q(s) = \sum_{j = 0}^{\infty}\dfrac{\Gamma(q+1)}{j!\Gamma(j+q+1)}\left(-\dfrac{s^2}{4}\right)^j, ~~~~ q = \dfrac{n}{2}-1,$$ and $J_q$ is the Bessel function of the first kind of order $q$.
(In their paper $\Phi_n$ denotes the set of radial positive definite functions on $\mathbb{R}^n$.)
Theorem 1 in "Metric Spaces and Completely Monotone Functions" by I. J. Schoenberg gives the proof of this result.
By applying this theorem for $n = 3$ and $\nu = \delta_1$, you get that $f(r) = \Omega_3(r) = \dfrac{\sin r}{r}$ is a radial positive definite function in $\mathbb{R}^3$. In other words, for any choice of $m$ points $x_1,\ldots,x_m \in \mathbb{R}^3$, the matrix $S \in \mathbb{R}^{m \times m}$ defined by $S(k,j) = \dfrac{\sin(\|x_k-x_j\|)}{\|x_k-x_j\|}$ is positive semidefinite.
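A quick numerical illustration of this (a NumPy sketch; the seed and the number of points are arbitrary choices): for random points in $\mathbb{R}^3$, the smallest eigenvalue of the sinc distance matrix stays nonnegative up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(1)
m = 10
x = rng.uniform(-1, 1, size=(m, 3))   # m random points in R^3

# Pairwise distances, then S(k, j) = sin(d)/d with S(k, k) = 1.
# np.sinc(t) = sin(pi t)/(pi t), so np.sinc(d/pi) = sin(d)/d.
d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
S = np.sinc(d / np.pi)

print(np.linalg.eigvalsh(S).min())  # >= 0 up to round-off
```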
It should be clear that if you choose $m$ points $x_1,\ldots,x_m$ in $\mathbb{R}$ or $\mathbb{R}^2$, then $S$ is still guaranteed to be positive semidefinite. However, that is not the case if the points are chosen in $\mathbb{R}^n$ where $n \ge 4$. joriki's answer gives a nice counterexample for $n \ge 5$. Here is an ugly counterexample (found by randomly generating points in MATLAB) with $6$ points in $\mathbb{R}^4$:
$x_1 = (-0.7,0.8,-0.3,-0.4)$
$x_2 = (0.8,-0.8,-0.9,0.7)$
$x_3 = (-0.1,0.0,0.1,-0.1)$
$x_4 = (-0.8,-0.7,-0.1,0.2)$
$x_5 = (0.2,-0.2,0.5,-0.9)$
$x_6 = (0.4,0.8,0.8,0.9)$
You can check that the $6 \times 6$ sinc distance matrix $S$ for these points has an eigenvalue of $\approx -0.0024103$, and thus, $S$ is not positive semidefinite.
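The claim can be checked directly. This NumPy sketch builds the sinc distance matrix for the six points above and prints its smallest eigenvalue, which should reproduce the negative value reported above.

```python
import numpy as np

x = np.array([[-0.7,  0.8, -0.3, -0.4],
              [ 0.8, -0.8, -0.9,  0.7],
              [-0.1,  0.0,  0.1, -0.1],
              [-0.8, -0.7, -0.1,  0.2],
              [ 0.2, -0.2,  0.5, -0.9],
              [ 0.4,  0.8,  0.8,  0.9]])

# np.sinc(t) = sin(pi t)/(pi t), so np.sinc(d/pi) = sin(d)/d, with 1 on the diagonal.
d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
S = np.sinc(d / np.pi)

print(np.linalg.eigvalsh(S).min())  # negative, so S is not positive semidefinite
```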
Best Answer
Since $A$ and $B$ are positive semidefinite, $-(A+B)\le A-B\le A+B$ in the Loewner order, and hence $\|A-B\|_2\le\|A+B\|_2$. Combining this with the identity $(A-B)^2+(A+B)^2=2(A^2+B^2)$ and the fact that $\|M^2\|_2=\|M\|_2^2$ for Hermitian $M$, it follows that \begin{aligned} \|A^2+B^2\|_2 &=\frac12\left\|(A-B)^2+(A+B)^2\right\|_2\\ &\le\frac12\left(\left\|(A-B)^2\right\|_2+\left\|(A+B)^2\right\|_2\right)\\ &=\frac12\left(\|A-B\|_2^2+\|A+B\|_2^2\right)\\ &\le\frac12\left(\|A+B\|_2^2+\|A+B\|_2^2\right)\\ &=\|A+B\|_2^2. \end{aligned} By Euler's four-square identity and its generalisation to sums of eight squares, I think the above proof can be generalised to the case where the LHS of the inequality is a sum of eight squares, but I am not sure whether it can be generalised further.
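A numerical spot-check of the inequality $\|A^2+B^2\|_2 \le \|A+B\|_2^2$ for random positive semidefinite $A$, $B$ (a NumPy sketch; `np.linalg.norm(M, 2)` is the spectral norm, and the matrix size and trial count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

def random_psd(n):
    """Random positive semidefinite matrix M = G G^T."""
    g = rng.standard_normal((n, n))
    return g @ g.T

for _ in range(100):
    A, B = random_psd(5), random_psd(5)
    lhs = np.linalg.norm(A @ A + B @ B, 2)
    rhs = np.linalg.norm(A + B, 2) ** 2
    assert lhs <= rhs + 1e-9   # allow a little floating-point slack

print("inequality held in all trials")
```

Of course a random search is no substitute for the proof above; it only guards against a sign or factor slip in the derivation.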