Yes, the formula holds, but note that there is a premise to fulfill: $C$ and $D$ have to commute, that is, we need $CD=DC$. If this condition is violated, the formula may fail to hold.
More generally, when $A,B,C,D$ are square matrices of equal size with entries in a commutative ring (this includes, but isn't limited to, the cases where $A,B,C,D$ are real or complex matrices), we have
$$
\det \pmatrix{A&B\\ C&D}=
\begin{cases}
\det(AD-BC) & \text{ if } CD=DC,\\
\det(DA-CB) & \text{ if } AB=BA,\\
\det(DA-BC) & \text{ if } BD=DB,\\
\det(AD-CB) & \text{ if } AC=CA.
\end{cases}
$$
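As a quick sanity check (my own illustration, not part of the original answer), the cases above can be verified with exact symbolic arithmetic in sympy. The matrices below are arbitrarily chosen; the required commutation is forced by taking one block to be a polynomial in another, since any matrix commutes with every polynomial in itself.

```python
import sympy as sp

def blockdet(A, B, C, D):
    """Determinant of the 2x2 block matrix [[A, B], [C, D]]."""
    M = sp.Matrix.vstack(sp.Matrix.hstack(A, B), sp.Matrix.hstack(C, D))
    return M.det()

A = sp.Matrix([[1, 2], [3, 4]])
B = sp.Matrix([[0, 1], [5, 2]])
C = sp.Matrix([[2, 7], [1, 3]])

# Case CD = DC: take D to be a polynomial in C, so it commutes with C.
D = C**2 + 3*C + 2*sp.eye(2)
assert C*D == D*C
assert blockdet(A, B, C, D) == (A*D - B*C).det()

# Case AC = CA: take the lower-left block to be a polynomial in A instead.
C2 = A**2 - 5*A
assert A*C2 == C2*A
assert blockdet(A, B, C2, D) == (A*D - C2*B).det()
```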
This was first proved by M. H. Ingraham (see his paper "A note on determinants") and later proved independently by John Silvester (see his paper "Determinants of Block Matrices"). The idea is simple. Suppose $AC=CA$ (the last case above). Then
$$
\pmatrix{I&0\\ -C&A+xI}\pmatrix{A+xI&B\\ C&D}=\pmatrix{A+xI&B\\ AC-CA&(A+xI)D-CB}.
$$
By assumption, $AC-CA=0$, so the RHS is block-triangular and
$$\det(A+xI)\,\det\pmatrix{A+xI&B\\ C&D}=\det(A+xI)\,\det\left((A+xI)D-CB\right).$$
As $\det(A+xI)$ is a monic polynomial in $x$, it is a non-zero-divisor in the polynomial ring, so we may cancel it from both sides and obtain $\det\pmatrix{A+xI&B\\ C&D}=\det\left((A+xI)D-CB\right)$. Putting $x=0$, the assertion follows. The proofs for the other three cases are similar.
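The polynomial trick above can be watched in action symbolically (again a sketch of my own, with sympy and arbitrarily chosen matrices): the two sides agree as polynomials in $x$, and specialising $x=0$ recovers the claimed identity.

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[1, 2], [3, 4]])
C = A**2 - A          # a polynomial in A, so AC = CA
B = sp.Matrix([[5, 0], [1, 2]])
D = sp.Matrix([[0, 3], [2, 1]])

Ax = A + x*sp.eye(2)  # det(A + xI) is monic in x, hence cancellable
M = sp.Matrix.vstack(sp.Matrix.hstack(Ax, B), sp.Matrix.hstack(C, D))

lhs = sp.expand(M.det())
rhs = sp.expand((Ax*D - C*B).det())
assert lhs == rhs                              # equality as polynomials in x
assert lhs.subs(x, 0) == (A*D - C*B).det()     # specialise x = 0
```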
Remark. Those who are familiar with the Schur complement may recognise that the same block-matrix decomposition technique is used in its derivation. In fact, if the matrix entries are taken from a field, we can use the Schur complement to prove the above formula. For example, suppose again that $AC=CA$. Let $x$ be an indeterminate and extend the underlying field $F$ to $F(x)$, the field of all rational functions in $x$. Then $A+xI$ is invertible and it commutes with $C$. Therefore, by using the Schur complement, we get
$$
\begin{aligned}
\det\pmatrix{A+xI&B\\ C&D}
&=\det(A+xI)\det\left(D-C(A+xI)^{-1}B\right)\\
&=\det(A+xI)\det\left(D-(A+xI)^{-1}CB\right)\\
&=\det\left((A+xI)D-CB\right)
\end{aligned}
$$
and the assertion again follows by putting $x=0$.
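The Schur-complement factorisation itself is easy to test numerically. The following sketch (my own, using sympy with arbitrarily chosen matrices; the commuting block is again a polynomial in $A$) checks both the factorisation and how it collapses to $\det(AD-CB)$ when $C$ commutes with $A$.

```python
import sympy as sp

A = sp.Matrix([[2, 1], [1, 3]])   # invertible: det(A) = 5
B = sp.Matrix([[1, 0], [4, 2]])
D = sp.Matrix([[1, 1], [0, 2]])
C = A**2 + sp.eye(2)              # commutes with A, hence with A**-1

M = sp.Matrix.vstack(sp.Matrix.hstack(A, B), sp.Matrix.hstack(C, D))

# Schur-complement factorisation: det M = det(A) * det(D - C A^{-1} B) ...
schur = D - C*A.inv()*B
assert M.det() == A.det() * schur.det()
# ... and since C A^{-1} = A^{-1} C here, it collapses to det(AD - CB):
assert M.det() == (A*D - C*B).det()
```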
Best Answer
You need to require that the matrices $A$ and $B$ commute (i.e., that $AB=BA$). Otherwise, for example, $A=\begin{pmatrix} 1 & 1\\ 0 & 1 \end{pmatrix}$ and $B=\begin{pmatrix} 1 & 0\\ 1 & 1 \end{pmatrix}$ yield a counterexample (since $\det\begin{pmatrix} A & B\\ -B & A \end{pmatrix}=\det\begin{pmatrix} 1 & 1 & 1 & 0\\ 0 & 1 & 1 & 1\\ -1 & 0 & 1 & 1\\ -1 & -1 & 0 & 1 \end{pmatrix}=1\neq0=\det\left( A^2 +B^2 \right) $ in this case).
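This counterexample is small enough to check by machine; the following sketch (my own, using sympy for exact arithmetic) reproduces the two determinants from the answer.

```python
import sympy as sp

A = sp.Matrix([[1, 1], [0, 1]])
B = sp.Matrix([[1, 0], [1, 1]])
assert A*B != B*A                 # A and B do not commute

M = sp.Matrix.vstack(sp.Matrix.hstack(A, B), sp.Matrix.hstack(-B, A))
assert M.det() == 1               # det of the 4x4 block matrix
assert (A**2 + B**2).det() == 0   # while det(A^2 + B^2) vanishes
```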
But if $A$ and $B$ do commute, then your claim holds:
First proof of Theorem 1 (sketched). One fact about block matrices is the following: If $A$, $B$, $C$ and $D$ are four $n\times n$-matrices over $\mathbb{K}$ such that $AB=BA$, then \begin{align} \det\begin{pmatrix} A & B\\ C & D \end{pmatrix}=\det\left( DA-CB\right) . \label{darij1.pf.t1.1st.1} \tag{1} \end{align} (This is mentioned in https://math.stackexchange.com/a/548487/ , and can be proven using the Schur complement in the case when $A$ is invertible. When $A$ is not invertible, replace $A$ by $A+xI_{n}$, where $x$ is a polynomial indeterminate. This argument is probably all over math.stackexchange. For a specific reference, see (16) in John R. Silvester, Determinants of Block Matrices, The Mathematical Gazette, Vol. 84, No. 501 (Nov., 2000), pp. 460--467.)
Applying \eqref{darij1.pf.t1.1st.1} to $C=-B$ and $D=A$, we find \begin{align} \det\begin{pmatrix} A & B\\ -B & A \end{pmatrix}=\det\underbrace{\left( AA-\left( -B\right) B\right) } _{=A^2 +B^2 }=\det\left( A^2 +B^2 \right) . \end{align} This proves Theorem 1. $\blacksquare$
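Theorem 1 can likewise be sanity-checked with exact arithmetic (my own sketch, with sympy; commutation is forced by taking $B$ to be a polynomial in $A$).

```python
import sympy as sp

A = sp.Matrix([[1, 2], [3, 4]])
B = 2*A**2 - A + 3*sp.eye(2)      # a polynomial in A, hence AB = BA
assert A*B == B*A

M = sp.Matrix.vstack(sp.Matrix.hstack(A, B), sp.Matrix.hstack(-B, A))
assert M.det() == (A**2 + B**2).det()
```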
A second proof of Theorem 1 will result from proving a somewhat more general result, which however relies on the existence of an "imaginary unit" in our ring $\mathbb{K}$ (that is, an element $i$ such that $i^2 = -1$):
Theorem 2. Let $\mathbb{K}$ be a commutative ring, and let $i\in\mathbb{K}$ satisfy $i^2=-1$. Let $A$ and $B$ be two $n\times n$-matrices over $\mathbb{K}$ (not necessarily commuting). Then, \begin{align} \det\begin{pmatrix} A & B\\ -B & A \end{pmatrix} =\det\left( A-iB\right) \cdot\det\left( A+iB\right) . \end{align}
Proof of Theorem 2. It is straightforward to see that the block matrix $\begin{pmatrix} I_{n} & iI_{n}\\ 0_{n\times n} & I_{n} \end{pmatrix}$ (where $0_{n\times n}$ denotes the $n\times n$ zero matrix) is invertible (with inverse $\begin{pmatrix} I_{n} & -iI_{n}\\ 0_{n\times n} & I_{n} \end{pmatrix} $) and satisfies \begin{align} \begin{pmatrix} I_{n} & iI_{n}\\ 0_{n\times n} & I_{n} \end{pmatrix} \begin{pmatrix} A & B \\ -B & A \end{pmatrix} = \begin{pmatrix} A-iB & 0\\ -B & A+iB \end{pmatrix} \begin{pmatrix} I_{n} & iI_{n}\\ 0_{n\times n} & I_{n} \end{pmatrix} . \end{align} Hence, \begin{equation} \begin{pmatrix} A & B\\ -B & A \end{pmatrix} = \begin{pmatrix} I_{n} & iI_{n}\\ 0_{n\times n} & I_{n} \end{pmatrix} ^{-1} \begin{pmatrix} A-iB & 0\\ -B & A+iB \end{pmatrix} \begin{pmatrix} I_{n} & iI_{n}\\ 0_{n\times n} & I_{n} \end{pmatrix} . \end{equation} Thus, the matrices $\begin{pmatrix} A & B\\ -B & A \end{pmatrix}$ and $\begin{pmatrix} A-iB & 0\\ -B & A+iB \end{pmatrix}$ are similar, and therefore have the same determinant. Hence, \begin{align*} \det\begin{pmatrix} A & B\\ -B & A \end{pmatrix} & =\det\begin{pmatrix} A-iB & 0\\ -B & A+iB \end{pmatrix}\\ & =\det\left( A-iB\right) \cdot\det\left( A+iB\right) \end{align*} (because the determinant of any block-triangular matrix whose diagonal blocks are square matrices always equals the product of the determinants of these diagonal blocks). This proves Theorem 2. $\blacksquare$
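Note that Theorem 2 requires no commutativity of $A$ and $B$. A quick check (my own sketch, with sympy's exact imaginary unit and arbitrarily chosen non-commuting matrices):

```python
import sympy as sp

i = sp.I                          # sympy's imaginary unit, i**2 == -1
A = sp.Matrix([[1, 2], [0, 1]])
B = sp.Matrix([[3, 1], [1, 2]])   # need NOT commute with A for Theorem 2

M = sp.Matrix.vstack(sp.Matrix.hstack(A, B), sp.Matrix.hstack(-B, A))
assert M.det() == sp.expand((A - i*B).det() * (A + i*B).det())
```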
Second proof of Theorem 1 (sketched). We can find a commutative ring $\mathbb{L}$ such that $\mathbb{K}$ is a subring of $\mathbb{L}$ and such that there exists some $i\in\mathbb{L}$ satisfying $i^2 =-1$. (For example, if $\mathbb{K}=\mathbb{R}$ or $\mathbb{K}=\mathbb{C}$, then we can take $\mathbb{L}=\mathbb{C}$. In the general case, we can let $\mathbb{L}$ be the quotient ring $\mathbb{K}\left[ x\right] /\left( x^2 +1\right) $, which is a free $\mathbb{K}$-module with basis $\left( \overline{1},\overline{x}\right) $ because $x^2 +1$ is a monic polynomial; then, $i$ should be taken to be the residue class $\overline{x}$ of the indeterminate $x$.)
Anyway, having picked our ring $\mathbb{L}$ and element $i$, let us now regard our matrices as matrices over $\mathbb{L}$. Now, Theorem 2 (applied to $\mathbb{L}$ instead of $\mathbb{K}$) yields \begin{align*} \det\begin{pmatrix} A & B\\ -B & A \end{pmatrix} & =\det\left( A-iB\right) \cdot\det\left( A+iB\right) \\ & =\det\left( \underbrace{\left( A-iB\right) \left( A+iB\right) }_{=AA+iAB-iBA-i^2 BB}\right) \\ & =\det\left( \underbrace{AA}_{=A^2 }+i\underbrace{AB}_{=BA} -iBA-\underbrace{i^2 }_{=-1}\underbrace{BB}_{=B^2 }\right) \\ & =\det\underbrace{\left( A^2 +iBA-iBA-\left( -1\right) B^2 \right) }_{=A^2 +B^2 }=\det\left( A^2 +B^2 \right) . \end{align*} This proves Theorem 1. $\blacksquare$
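The quotient-ring construction $\mathbb{L}=\mathbb{K}[x]/(x^2+1)$ can also be made concrete (my own sketch, with sympy): compute with the indeterminate $x$ playing the role of $i$, reducing every polynomial entry modulo $x^2+1$, and the key simplification $(A-iB)(A+iB)=A^2+B^2$ (valid when $AB=BA$) drops out.

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[1, 2], [3, 4]])
B = A**2 + sp.eye(2)              # a polynomial in A, so AB = BA
assert A*B == B*A

# In L = K[x]/(x^2 + 1), the class of x is an imaginary unit.  Reduce
# every polynomial entry modulo x^2 + 1 to compute inside L:
def reduce_mod(M):
    return M.applyfunc(lambda e: sp.rem(sp.expand(e), x**2 + 1, x))

P = reduce_mod((A - x*B) * (A + x*B))
assert P == A**2 + B**2           # (A - iB)(A + iB) = A^2 + B^2 in L
```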