I assume that by a Hurwitz matrix, you mean a stable matrix, i.e. a matrix such that all of its eigenvalues have negative real parts. Please correct me if I am wrong.
I also ran some numerical experiments, and they show that your conjecture is wrong. Here is a counterexample:
$$
J=\begin{pmatrix}
-1.4268 & -0.6777 & 0.7134 & 0.4497\\
-0.6777 & -0.3444 & 0.2280 & 0.1722\\
-3.7490 & 0 & 0 & 0\\
0 & -8.6780 & 0 & 0
\end{pmatrix}.
$$
You may verify that the eigenvalues of $A$ are $-1.7529$ and $-0.0183$, the eigenvalues of $B$ are $0.8620$ and $0.0236$, and the eigenvalues of $J$ are $-0.9401\pm1.8348i$ and $\color{red}{0.0545\pm0.3906i}$.
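The check takes only a few lines of numpy. Judging from the eigenvalues quoted above, $A$ and $B$ are the top-left and top-right $2\times2$ blocks of $J$ (an assumption, since the original question is not reproduced here):

```python
import numpy as np

# The counterexample J; A and B are assumed to be its top-left and
# top-right 2x2 blocks, consistent with the eigenvalues quoted above.
J = np.array([
    [-1.4268, -0.6777,  0.7134,  0.4497],
    [-0.6777, -0.3444,  0.2280,  0.1722],
    [-3.7490,  0.0,     0.0,     0.0   ],
    [ 0.0,    -8.6780,  0.0,     0.0   ],
])
A = J[:2, :2]   # eigenvalues approx. -1.7529, -0.0183
B = J[:2, 2:]   # eigenvalues approx.  0.8620,  0.0236

print("eig(A):", np.linalg.eigvals(A))                     # both negative
print("eig(B):", np.linalg.eigvals(B))                     # both positive
print("max Re eig(J):", np.linalg.eigvals(J).real.max())   # positive, so J is unstable
```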
As to the stability of $$M = \begin{bmatrix}A & B \\ -B^T &0\end{bmatrix}$$
for negative definite $A$, I assume that the $A$ and $B$ here are real matrices. The proof is easy. Suppose $v^T=(x^T,y^T)$ is a unit eigenvector of $M$ corresponding to the eigenvalue $\lambda$. Then
\begin{align}
\lambda = v^\ast Mv
&=(x^\ast,y^\ast)\begin{bmatrix}A & B \\ -B^T &0\end{bmatrix}\begin{bmatrix}x\\ y\end{bmatrix}\\
&=x^\ast Ax + x^\ast By - y^\ast B^Tx\\
&=x^\ast Ax + x^\ast By - (y^\ast B^Tx)^T\\
&=x^\ast Ax + x^\ast By - x^T B\bar{y}\\
&=x^\ast Ax + x^\ast By - \overline{x^\ast By}\\
&=x^\ast Ax + 2i\,\mathrm{Im}(x^\ast By).
\end{align}
Therefore the real part of $\lambda$ equals $x^\ast Ax$ (which is real because $A$ is real symmetric), and this is negative because $A$ is negative definite, provided $x\neq0$. If $x=0$, the eigenvalue equation forces $\lambda y=0$ and $By=0$, i.e. $\lambda=0$; this degenerate case is ruled out whenever $B$ has trivial kernel.
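This is easy to sanity-check numerically. Below is a sketch: the sizes, the random construction of a negative definite $A$, and the choice of a square (hence generically invertible) $B$ are all arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

for _ in range(10):
    G = rng.standard_normal((n, n))
    A = -(G @ G.T) - np.eye(n)            # random symmetric negative definite
    B = rng.standard_normal((n, n))       # arbitrary; generically invertible
    M = np.block([[A, B], [-B.T, np.zeros((n, n))]])
    # every eigenvalue of M should have negative real part
    assert np.linalg.eigvals(M).real.max() < 0

print("all random trials stable")
```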
Edit: In the modified question, you ask whether $J=\begin{bmatrix}A & B \\ -cI &0\end{bmatrix}$ is always stable for sufficiently small $c>0$, given that $A=-(B+B^T)$ is negative definite. After some rough calculations, the answer seemed to be negative. (OK, I was wrong: in the 2-by-2 case, $J$ actually is stable when $c$ is sufficiently small.) Consider $A=\mathrm{diag}(-2a,-2b)$ and $B=\begin{pmatrix}a&-w\\w&b\end{pmatrix}$ with $a,b>0$ and $w\in\mathbb{R}$. One can show that $J$ is nonsingular and that its characteristic equation is $\det(x^2 I - xA + cB)=0$, or equivalently,
$$f(x) := x^4 + 2(a+b)x^3 + (ca+cb+4ab)x^2 + 4cabx + c^2(ab+w^2) = 0.$$
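For concreteness, the characteristic polynomial can be checked numerically (the parameter values below are arbitrary samples):

```python
import numpy as np

a, b, w, c = 1.0, 2.0, 3.0, 0.1     # arbitrary sample parameters
A = np.diag([-2 * a, -2 * b])
B = np.array([[a, -w], [w, b]])
J = np.block([[A, B], [-c * np.eye(2), np.zeros((2, 2))]])

# coefficients of f(x), highest power first
f = [1.0, 2 * (a + b), c * (a + b) + 4 * a * b, 4 * c * a * b, c**2 * (a * b + w**2)]

print(np.poly(J))                   # characteristic polynomial of J
print(np.allclose(np.poly(J), f))   # True
```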
One can further show that $f$ has no purely imaginary root, so we may apply the Routh–Hurwitz theorem to test whether all roots of $f$ lie in the open left half-plane. I have done some calculations, but they are too long to fit here (uh, ... I've started to sound like Fermat). The result seems to show that when $w$ is large and $c>0$ is small, the relevant Cauchy index is always zero (thus $J$ is not stable).
I have also done some computer experiments, and the results apparently point to the same conclusion. For instance, when $B=\begin{bmatrix}1 & -320\\ 320&1\end{bmatrix}$, the resulting $J$ seems to have an eigenvalue with nonnegative real part whenever $c$ is moderately small, e.g. for $10^{-2}\le c\le\frac12$ (though not for arbitrarily small $c$; see the correction above).
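This experiment is easy to reproduce. Note that, consistent with the correction in the edit above, the 2-by-2 example does become stable once $c$ is very small; for this particular $B$, working out the Routh–Hurwitz condition on $f$ appears to give the threshold $c<1/25600$:

```python
import numpy as np

B = np.array([[1.0, -320.0], [320.0, 1.0]])
A = -(B + B.T)                                  # = diag(-2, -2), negative definite

def max_re(c):
    """Largest real part among the eigenvalues of J = [[A, B], [-cI, 0]]."""
    J = np.block([[A, B], [-c * np.eye(2), np.zeros((2, 2))]])
    return np.linalg.eigvals(J).real.max()

# moderate c: an eigenvalue with positive real part shows up every time
print(all(max_re(c) > 0 for c in np.linspace(0.01, 0.5, 50)))   # True

# very small c (below the apparent threshold 1/25600): stable again
print(max_re(1e-5) < 0)                                         # True
```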
Another edit: I may have messed up something, but it is too tedious to double-check the calculations.
Best Answer
Let me call your matrix $S$. The set of eigenvalues of a square matrix $X$ will be denoted by $\sigma(X)$. I will show that $$\lambda\in\sigma(S)\quad\Longleftrightarrow\quad\frac{\lambda^2}{\lambda+1}\in\sigma(A^TA)\cup\sigma(AA^T).$$
Note that $\sigma(AA^T)$ and $\sigma(A^TA)$ coincide up to the eigenvalue zero: if $A$ is not square, zero lies in at least one of them; if $A$ is square, the two sets coincide.
Proof. Set $L := S+I$. For $\lambda\neq 1$ you can easily check that $$ L-\lambda I = \begin{pmatrix}I&0\\(1-\lambda)^{-1}A^T&I\end{pmatrix}\begin{pmatrix}(1-\lambda)I&0\\0&T(\lambda)\end{pmatrix}\begin{pmatrix}I&(1-\lambda)^{-1}A\\0&I\end{pmatrix}, $$ where $$ T(\lambda) = -\frac{\lambda}{1-\lambda}A^TA + (1 - \lambda)I = -\frac{\lambda}{1-\lambda}\left(A^TA - \frac{(1-\lambda)^2}{\lambda}I\right). $$ Since the two triangular factors enclosing the block-diagonal matrix are invertible, $L-\lambda I$ is non-invertible iff $T(\lambda)$ is non-invertible, i.e., iff $\frac{(1-\lambda)^2}{\lambda}\in\sigma(A^TA)$ (for $\lambda=0$ we have $T(0)=I$, which is invertible).

Now consider $\lambda = 1$. Then $L-\lambda I = L - I = S$, which is easily seen to be non-invertible iff $0\in\sigma(A^TA)\cup\sigma(AA^T)$. Hence we get $$ \lambda\in\sigma(L)\,\Longleftrightarrow\,\frac{(1-\lambda)^2}{\lambda}\in\sigma(A^TA)\cup\sigma(AA^T). $$ The claim now follows from $\lambda\in\sigma(S) \Longleftrightarrow \lambda+1\in\sigma(L)$.
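As a numerical sanity check of the spectral correspondence, here is a short numpy sketch. It assumes $S=\begin{pmatrix}0&A\\A^T&A^TA\end{pmatrix}$, which is the block form consistent with the factorization of $L-\lambda I$ above (the original question is not reproduced here); the size of $A$ is an arbitrary, deliberately non-square choice:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 4, 3                           # arbitrary, deliberately non-square
A = rng.standard_normal((m, n))

# S reconstructed from the factorization in the proof above (an assumption)
S = np.block([[np.zeros((m, m)), A], [A.T, A.T @ A]])

# sigma(A^T A) union sigma(A A^T)
spectrum = np.concatenate([np.linalg.eigvals(A.T @ A),
                           np.linalg.eigvals(A @ A.T)])

for lam in np.linalg.eigvals(S):
    mu = lam**2 / (lam + 1)           # the claimed correspondence
    assert np.abs(spectrum - mu).min() < 1e-8
print("spectral correspondence holds")
```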