Claim: $D$ has at most one negative eigenvalue, and the absolute value of the negative eigenvalue is less than or equal to the next-smallest eigenvalue.
Proof: Let $E_{ij}$ denote the matrix with a $1$ in the $(i,j)$ entry and zeros elsewhere.
It suffices to show that if the $i$th and $j$th diagonal entries of $D$ have a negative sum, then $D$ cannot satisfy the criterion. To that end, it suffices to note that there exists a skew-symmetric matrix $B$ with $B^2 = -(E_{ii} + E_{jj})$ (take $B = E_{ij} - E_{ji}$, for instance). $\square$
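As a quick sanity check (not part of the proof), the construction can be verified numerically; the dimension $n = 5$ and indices $i = 1$, $j = 3$ below are arbitrary choices:

```python
import numpy as np

# Check the proof's construction: with B = E_ij - E_ji (skew-symmetric),
# we should get B^2 = -(E_ii + E_jj).
n, i, j = 5, 1, 3
B = np.zeros((n, n))
B[i, j], B[j, i] = 1.0, -1.0        # B = E_ij - E_ji

target = np.zeros((n, n))
target[i, i] = target[j, j] = -1.0  # -(E_ii + E_jj)

assert np.allclose(B, -B.T)         # B is skew-symmetric
assert np.allclose(B @ B, target)   # B^2 = -(E_ii + E_jj)
```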
I am not sure whether this condition is equivalent to your inequality.
We can also prove that the condition above is sufficient as follows. Suppose that $D$ has at most one negative eigenvalue, and the absolute value of the negative eigenvalue is less than or equal to the next-smallest eigenvalue.
We first note that every matrix of the form $M = B^2$ for a skew-symmetric $B$ can be written in the form
$$
M = -[a_1 \, (x_1x_1^T + y_1y_1^T) + \cdots + a_k \, (x_kx_k^T + y_ky_k^T)],
$$
where the coefficients $a_i$ are non-negative and, for each $i$, the vectors $x_i,y_i$ are orthonormal. So, it suffices to show that $\langle D,M\rangle \leq 0$ where $M = -(xx^T + yy^T)$ for some orthonormal $x,y$.
Now, let $v_1,\dots,v_n$ be an orthonormal basis for $\Bbb R^n$ such that $x = v_1$ and $y = v_2$. Let $V$ be the orthogonal matrix whose columns are $v_1,\dots,v_n$, and let $A = V^TDV$. We now note that
$$
\langle D, xx^T + yy^T \rangle = x^TDx + y^TDy = a_{11} + a_{22}.
$$
From here, it suffices to apply the $(\implies)$ direction of the Schur-Horn theorem to $-A$ in order to conclude that $a_{11} + a_{22} \geq \lambda_{n}(D) + \lambda_{n-1}(D)$. By hypothesis, the two smallest eigenvalues satisfy $\lambda_{n}(D) + \lambda_{n-1}(D) \geq 0$, so $\langle D, xx^T + yy^T \rangle \geq 0$, i.e. $\langle D, M \rangle \leq 0$, as desired.
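A numerical sanity check of this bound, using a random symmetric $D$ and random orthonormal pairs $x,y$ (the dimension and sample count below are arbitrary):

```python
import numpy as np

# Check: for symmetric D and orthonormal x, y,
#   x^T D x + y^T D y >= lambda_n(D) + lambda_{n-1}(D)
# (the two smallest eigenvalues), as guaranteed by Schur-Horn.
rng = np.random.default_rng(0)
n = 6
G = rng.standard_normal((n, n))
D = (G + G.T) / 2                 # random symmetric matrix
lam = np.linalg.eigvalsh(D)       # eigenvalues in ascending order

for _ in range(1000):
    Q, _ = np.linalg.qr(rng.standard_normal((n, 2)))
    x, y = Q[:, 0], Q[:, 1]       # random orthonormal pair
    val = x @ D @ x + y @ D @ y
    assert val >= lam[0] + lam[1] - 1e-9
```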
About the squares of skew-symmetric matrices: by the spectral theorem, there exists a unitary $U$ with columns $u_1,u_2,\dots,u_n$ such that
$$
B = U \pmatrix{i \lambda_1 \\ & - i\lambda_1 \\ && \ddots \\ &&& i \lambda_k \\ &&&& - i \lambda_k \\ &&&&& 0 } U^* \\
= \lambda_1 i \ [u_1u_1^* - u_2 u_2^*] + \cdots + i\lambda_{k}\ [u_{2k-1}u_{2k-1}^* - u_{2k}u_{2k}^*]
$$
where each $\lambda_i$ is positive. Thus, squaring $B$ yields
$$
B^2 = -(\lambda_1^2 \ [u_1u_1^* + u_2 u_2^*] + \cdots + \lambda_{k}^2\ [u_{2k-1}u_{2k-1}^* + u_{2k}u_{2k}^*]).
$$
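This spectral structure is easy to confirm numerically: for a random real skew-symmetric $B$, the matrix $B^2$ should be symmetric negative semidefinite, with its nonzero eigenvalues $-\lambda_i^2$ occurring in coincident pairs. A quick NumPy check (dimension arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 7
G = rng.standard_normal((n, n))
B = G - G.T                       # random real skew-symmetric matrix
M = B @ B

# B^2 is symmetric negative semidefinite ...
assert np.allclose(M, M.T)
evals = np.linalg.eigvalsh(M)     # ascending order
assert evals.max() <= 1e-10

# ... and its nonzero eigenvalues -lambda_i^2 come in coincident pairs
nonzero = evals[np.abs(evals) > 1e-10]
assert len(nonzero) % 2 == 0
assert np.allclose(nonzero[0::2], nonzero[1::2])
```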
We could equivalently have used the canonical form (with a real, orthogonal $U$)
$$
B = U \pmatrix{0 & -\lambda_1 \\ \lambda_1 & 0 \\ && \ddots \\ &&& 0 & -\lambda_k \\ &&& \lambda_k & 0 \\ &&&&& 0 } U^T \\
= \lambda_1 \ [u_2u_1^T - u_1 u_2^T] + \cdots + \lambda_{k}\ [u_{2k}u_{2k-1}^T - u_{2k-1}u_{2k}^T].
$$
A construction can be found in Lemma 5.2.2, pp. 36-37 of Olga Ruff's master's thesis, *The Jordan canonical forms of complex orthogonal and skew-symmetric matrices: characterization and examples*.
To summarise, let $z=\frac{1-i}{2}$. Since $\pmatrix{z&\overline{z}\\ \overline{z}&z}^2=\pmatrix{0&1\\ 1&0}$, if we set $X$ to the $(2n+1)\times(2n+1)$ matrix
$$
\pmatrix{
z&&&&&&&&&&\overline{z}\\
&iz&&&&&&&&i\overline{z}\\
&&z&&&&&&\overline{z}\\
&&&iz&&&&i\overline{z}\\
&&&&\ddots&&\unicode{x22F0}\\
&&&&&\sqrt{(-1)^n}\\
&&&&\unicode{x22F0}&&\ddots\\
&&&i\overline{z}&&&&iz\\
&&\overline{z}&&&&&&z\\
&i\overline{z}&&&&&&&&iz\\
\overline{z}&&&&&&&&&&z},
$$
then
$$
\begin{aligned}
X^2&=\operatorname{antidiag}(1,-1,1,-1,\ldots,1)=DR=RD,\text{ where}\\
D&=\operatorname{diag}(1,-1,1,-1,\ldots,1),\\
R&=\operatorname{antidiag}(1,1,\ldots,1).
\end{aligned}
$$
Let $J=J_{2n+1}(0)$. Since $X$ is symmetric and $X^4=I$, we have
$$
(XJX^{-1})^T=X(X^2J^TX^2)X^{-1}
=XDRJ^TRDX^{-1}=XDJDX^{-1}=-XJX^{-1},
$$
i.e. $K=XJX^{-1}$ is skew-symmetric and similar to $J$.
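The construction can be verified numerically for a small case. The sketch below builds $X$ for $n=2$ (so $m=2n+1=5$), taking the middle entry to be $i^n$ (a square root of $(-1)^n$), and checks that $K=XJX^{-1}$ is skew-symmetric and nilpotent of index $m$:

```python
import numpy as np

n = 2
m = 2 * n + 1
z = (1 - 1j) / 2

# Build X: 2x2 blocks c * [[z, conj(z)], [conj(z), z]] on the coordinate
# pairs {k, m-1-k}, with coefficients c alternating 1, i, 1, i, ...,
# and middle entry a square root of (-1)^n.
X = np.zeros((m, m), dtype=complex)
for k in range(n):
    c = 1j ** (k % 2)
    X[k, k] = X[m - 1 - k, m - 1 - k] = c * z
    X[k, m - 1 - k] = X[m - 1 - k, k] = c * z.conjugate()
X[n, n] = 1j ** n

J = np.eye(m, k=1)                        # nilpotent Jordan block J_m(0)
antidiag = np.fliplr(np.diag([(-1.0) ** k for k in range(m)]))

assert np.allclose(X, X.T)                # X is complex symmetric
assert np.allclose(X @ X, antidiag)       # X^2 = antidiag(1, -1, ..., 1)
assert np.allclose(np.linalg.matrix_power(X, 4), np.eye(m))  # X^4 = I

K = X @ J @ np.linalg.inv(X)
assert np.allclose(K, -K.T)                                  # skew-symmetric
assert np.allclose(np.linalg.matrix_power(K, m), 0)          # K^m = 0
assert not np.allclose(np.linalg.matrix_power(K, m - 1), 0)  # K^{m-1} != 0
```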
We can prove by a parity argument that nilpotent Jordan blocks of even sizes are not similar to any complex skew-symmetric matrices. First, we need the following result of Horn and Merino (2009) (which is also part of lemma 5.1.2 in Olga Ruff's thesis).
Lemma. A complex square matrix $A$ is similar to a complex skew-symmetric matrix $K$ only if $SA$ is skew-symmetric for some non-singular complex symmetric matrix $S$.
Proof. If $A=P^{-1}KP$ where $K^T=-K$, then $A^T=-(P^TP)A(P^TP)^{-1}$. Hence $P^TPA$ is skew-symmetric. $\square$
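A quick numerical illustration of the lemma and its proof, with random complex $K$ and $P$ (sizes arbitrary; a random $P$ is generically invertible):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
G = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
K = G - G.T                               # complex skew-symmetric
P = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = np.linalg.inv(P) @ K @ P              # A is similar to K

S = P.T @ P                               # the symmetric matrix from the proof
assert np.allclose(S, S.T)                # S is (complex) symmetric
assert np.allclose(S @ A, -(S @ A).T)     # S A is skew-symmetric
assert abs(np.linalg.det(S)) > 1e-12      # and S is non-singular
```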
Now suppose an $m\times m$ nilpotent Jordan block $J=J_m(0)$ is similar to a skew-symmetric matrix. By the above lemma, $SJ$ is skew-symmetric for some non-singular symmetric matrix $S$. Note that the first column of $SJ$ is zero. Therefore
$$
S_{1j}=(SJ)_{1,j+1}=-(SJ)_{j+1,1}=0 \textrm{ for all } j<m.\tag{1}
$$
Moreover, by the symmetry of $S$ and skew-symmetry of $SJ$,
$$
S_{ij}=S_{ji}=(SJ)_{j,i+1}=-(SJ)_{i+1,j}=-S_{i+1,j-1}.\tag{2}
$$
Equality $(1)$ means that all entries in the first row of $S$, except possibly the rightmost one, are zero. Equality $(2)$ means that as we travel down an anti-diagonal of $S$, the entries keep the same absolute value but alternate in sign. It follows from $(1)$ and $(2)$ that all entries of $S$ above the main anti-diagonal are zero and that the main anti-diagonal of $S$ is $\left(s,-s,s,-s,\ldots,(-1)^{m-1}s\right)$ for some $s$. As $S$ is non-singular, $s$ must be nonzero. Yet, as $S$ is symmetric, the first and last entries on the anti-diagonal must be equal. Hence $s=(-1)^{m-1}s$, which forces $m$ to be odd.
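The parity argument can also be probed numerically: "$S$ symmetric and $SJ$ skew-symmetric" is a linear system in the entries of $S$, so we can sample its solution space and test whether a non-singular $S$ exists. For even $m$ every solution should be singular; for odd $m$ a generic solution is not. A sketch (thresholds and sample counts are arbitrary):

```python
import numpy as np

def generic_det(m, trials=20, seed=3):
    """Sample symmetric S with S*J skew-symmetric (J = J_m(0)) from the
    solution space of those linear constraints; return max |det S| found."""
    rng = np.random.default_rng(seed)
    pairs = [(i, j) for i in range(m) for j in range(i, m)]
    idx = {p: k for k, p in enumerate(pairs)}
    # (SJ)[a,b] = S[a,b-1], so skew-symmetry of SJ reads
    # S[a,b-1] + S[b,a-1] = 0 (terms with index -1 are absent).
    rows = []
    for a in range(m):
        for b in range(a, m):
            row = np.zeros(len(pairs))
            if b >= 1:
                row[idx[tuple(sorted((a, b - 1)))]] += 1
            if a >= 1:
                row[idx[tuple(sorted((b, a - 1)))]] += 1
            rows.append(row)
    _, sv, Vt = np.linalg.svd(np.array(rows))
    null = Vt[np.sum(sv > 1e-10):]        # basis of the solution space
    best = 0.0
    for _ in range(trials):
        v = rng.standard_normal(len(null)) @ null
        S = np.zeros((m, m))
        for (i, j), k in idx.items():
            S[i, j] = S[j, i] = v[k]
        best = max(best, abs(np.linalg.det(S)))
    return best

assert generic_det(4) < 1e-8              # even m: every such S is singular
assert generic_det(5) > 1e-6              # odd m: non-singular S exists
```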
Best Answer
Let me work with skew-symmetric operators instead of matrices. If $T,S \colon \mathbb{R}^n \rightarrow \mathbb{R}^n$ are two skew-symmetric operators, then we have
$$ \left< (T^2 + S^2)x, x \right> = \left< T(Tx), x \right> + \left< S(Sx), x \right> = \left< Tx, T^{*}x \right> + \left< Sx, S^{*} x \right> \\= -\left< Tx, Tx \right> - \left< Sx, Sx \right> = -\| Tx \|^2 - \| Sx \|^2 $$
which implies that
$$ \ker(T^2 + S^2) \subseteq \ker(T) \cap \ker(S). $$
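This inclusion is easy to see numerically. The sketch below builds skew-symmetric $T$ and $S$ (for $n=7$) whose common kernel is exactly $\operatorname{Span}\{e_n\}$, and checks that the null space of $T^2+S^2$ is precisely that line:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 7
B1 = rng.standard_normal((n - 1, n - 1))
B2 = rng.standard_normal((n - 1, n - 1))
T = np.zeros((n, n))
T[:n - 1, :n - 1] = B1 - B1.T             # skew-symmetric, T e_n = 0
S = np.zeros((n, n))
S[:n - 1, :n - 1] = B2 - B2.T             # skew-symmetric, S e_n = 0

M = T @ T + S @ S
_, sv, Vt = np.linalg.svd(M)
null = Vt[np.sum(sv > 1e-9):]             # null space of T^2 + S^2

# ker(T^2 + S^2) is exactly ker(T) ∩ ker(S) = Span{e_n}
assert null.shape[0] == 1
assert np.allclose(T @ null[0], 0) and np.allclose(S @ null[0], 0)
```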
Let's assume now that $n$ is odd. In this case, if $T$ is skew-symmetric then $T$ is singular and so is $T^2$. Hence, to construct a counterexample it is enough to find two skew-symmetric operators $T,S$ such that $\ker(T) \cap \ker(S) = \{ 0 \}$ because then $T^2 + S^2$ will be invertible and so it can't be the square of a skew-symmetric operator.
To provide a concrete example, take $n = 3$. Any skew-symmetric operator on $\mathbb{R}^3$ has the form $L_{v} \colon \mathbb{R}^3 \rightarrow \mathbb{R}^3$ where $L_v(x) = v \times x$. Denote by $e_1,e_2,e_3$ the standard basis and consider the skew-symmetric operators $L_{e_i}$. The kernel of $L_{e_i}$ is $\operatorname{Span} \{ e_i \}$ and so $L_{e_1}^2 + L_{e_2}^2$ is invertible and can't be a square of a skew-symmetric operator.
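The concrete example checks out numerically (with the cross-product matrices written out explicitly):

```python
import numpy as np

def cross_matrix(v):
    """Matrix of the skew-symmetric operator L_v(x) = v × x."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

L1 = cross_matrix([1.0, 0.0, 0.0])        # L_{e_1}
L2 = cross_matrix([0.0, 1.0, 0.0])        # L_{e_2}
assert np.allclose(L1, -L1.T) and np.allclose(L2, -L2.T)

M = L1 @ L1 + L2 @ L2
assert np.allclose(M, np.diag([-1.0, -1.0, -2.0]))
# M is invertible, so it cannot be B^2 for a skew-symmetric B (n = 3 is odd)
assert abs(np.linalg.det(M)) > 1e-12
```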