The matrix $D$, as you note, is of the form
$$
\pmatrix{A&0&B\\0&\sigma_2 &0\\C&0&F}.
$$
With the block permutation matrix
$$
P = \pmatrix{I_2&0&0\\0&0&I_2\\0&I_2&0},
$$
we find that this matrix is similar to the block-diagonal matrix
$$
PDP^T = \pmatrix{A&B&0\\C&F&0\\0&0&\sigma_2}.
$$
So, the eigenvalues of $D$ are those of $\sigma_2$ (namely $\pm 1$) together with those of the matrix $M = \left[\begin{smallmatrix}A&B\\C&F\end{smallmatrix}\right]$. Writing out the blocks, $M$ is the matrix
$$
M = \left[\begin{array}{c|cc|c}
1&0&0&0\\
\hline 0&0&1&0\\
0&1&0&0\\
\hline 0&0&0&-1\end{array}\right].
$$
As the partition lines indicate, this matrix is block-diagonal with diagonal blocks $1$, $\left[\begin{smallmatrix}0&1\\1&0\end{smallmatrix}\right]$, $-1$; the middle block has eigenvalues $\pm 1$.
With that, you can conclude that the eigenvalues of $D$ are $1$ and $-1$, each with multiplicity $3$.
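This eigenvalue count can be sanity-checked numerically. A minimal NumPy sketch (assembling $D$ from the blocks $A, B, C, F$ and $\sigma_2$ read off above; the variable names are just for illustration):

```python
import numpy as np

# Pauli matrix sigma_2 and the blocks A, B, C, F read off from M above
s2 = np.array([[0, -1j], [1j, 0]])
A = np.array([[1, 0], [0, 0]])
B = np.array([[0, 0], [1, 0]])
C = np.array([[0, 1], [0, 0]])
F = np.array([[0, 0], [0, -1]])
Z = np.zeros((2, 2))

# D = [[A, 0, B], [0, sigma_2, 0], [C, 0, F]] as in the question
D = np.block([[A, Z, B], [Z, s2, Z], [C, Z, F]])

# D is Hermitian, so its eigenvalues are real:
# three eigenvalues -1 and three eigenvalues +1
eig = np.sort(np.linalg.eigvals(D).real)
print(eig)
```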
For a more direct approach, the permutation matrix
$$
P = \pmatrix{
1&0&0&0&0&0\\
0&0&0&0&0&1\\
0&0&1&0&0&0\\
0&0&0&1&0&0\\
0&0&0&0&1&0\\
0&1&0&0&0&0}
$$
is such that
$$
PDP^T = \pmatrix{\sigma_3 &0&0\\0&\sigma_2&0\\0&0&\sigma_1}.
$$
To find the eigenvectors of $D$, begin by finding the eigenvectors of $PDP^T$. We can find these eigenvectors using the eigenvectors of the blocks $\sigma_1,\sigma_2,\sigma_3$.
For example, $\sigma_2$ has the eigenvector $v = (1,i)$ associated with eigenvalue $1$. It follows that $PDP^T$ has the "block-vector"
$$
\tilde v = \pmatrix{0_{2 \times 1}\\ v\\ 0_{2 \times 1}}
$$
as an eigenvector associated with the eigenvalue 1.
Once all eigenvectors are obtained in this fashion, note that for any eigenvector $x$ of $PDP^T$, $P^Tx$ will be an eigenvector of $D$ because
$$
D(P^Tx) = (DP^T)x = P^T(PDP^T)x = P^T(\lambda x) = \lambda P^T x.
$$
So, once the eigenvectors $v_1,\dots,v_6$ of $PDP^T$ are obtained, $P^Tv_1,\dots,P^Tv_6$ will be the eigenvectors of $D$.
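The whole procedure (the $6\times 6$ permutation $P$, the block-diagonalization into $\sigma_3, \sigma_2, \sigma_1$, and the transport of eigenvectors by $P^T$) can be checked in a few lines. A NumPy sketch, using the eigenvector $v = (1,i)$ of $\sigma_2$ from the example above:

```python
import numpy as np

# Pauli matrices and the blocks of D from the question
s1 = np.array([[0, 1], [1, 0]])
s2 = np.array([[0, -1j], [1j, 0]])
s3 = np.array([[1, 0], [0, -1]])
Z = np.zeros((2, 2))
A = np.array([[1, 0], [0, 0]]); B = np.array([[0, 0], [1, 0]])
C = np.array([[0, 1], [0, 0]]); F = np.array([[0, 0], [0, -1]])
D = np.block([[A, Z, B], [Z, s2, Z], [C, Z, F]])

# the 6x6 permutation matrix from the answer: rows e1, e6, e3, e4, e5, e2
P = np.eye(6)[[0, 5, 2, 3, 4, 1]]

# P D P^T is block-diagonal: diag(sigma_3, sigma_2, sigma_1)
assert np.allclose(P @ D @ P.T, np.block([[s3, Z, Z], [Z, s2, Z], [Z, Z, s1]]))

# eigenvector v = (1, i) of sigma_2 for eigenvalue 1, padded into a block vector
v_tilde = np.array([0, 0, 1, 1j, 0, 0])
x = P.T @ v_tilde          # transported back: an eigenvector of D
assert np.allclose(D @ x, 1 * x)
```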
Best Answer
First we note that the $(2,1)$ and $(2,2)$ blocks of the characteristic matrix below, namely $I_n$ and $\lambda I_n + B$, commute. Then we consider:
\begin{equation} (\lambda I_{2n}-M)v = 0 \end{equation}
which is the eigenvalue problem of $M$. The equation admits a solution $v \neq 0$ precisely when $|\lambda I_{2n}-M| = 0$, so we compute the determinant of the characteristic matrix:
\begin{equation} \begin{bmatrix} \lambda I_n & -C \\ I_n & \lambda I_n + B \\ \end{bmatrix} \end{equation}
We assume that the $(1,1)$ block $\lambda I_n$ is invertible, i.e., that $\lambda \neq 0$ (equivalently, that $0$ is not an eigenvalue of $M$); this restriction can be removed afterwards by continuity. Then, since $I_n$ commutes with any matrix, we can write:
\begin{equation} |\lambda I_{2n} - M| = |\lambda ( \lambda I_n + B) + C | = |\lambda^2 I_n + \lambda B + C| = 0 \end{equation}
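This determinant identity is easy to sanity-check numerically for the companion-type matrix $M = \left[\begin{smallmatrix}0&C\\-I_n&-B\end{smallmatrix}\right]$ implied by the characteristic matrix above. A small NumPy sketch with random $B, C$ (the specific values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))

# companion-type matrix whose characteristic matrix is [[lam*I, -C], [I, lam*I + B]]
M = np.block([[np.zeros((n, n)), C], [-np.eye(n), -B]])

lam = 0.7  # arbitrary test value
lhs = np.linalg.det(lam * np.eye(2 * n) - M)
rhs = np.linalg.det(lam**2 * np.eye(n) + lam * B + C)
assert np.isclose(lhs, rhs)

# every eigenvalue of M is a root of the quadratic eigenvalue problem
for mu in np.linalg.eigvals(M):
    assert abs(np.linalg.det(mu**2 * np.eye(n) + mu * B + C)) < 1e-6
```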
This is a Quadratic Eigenvalue Problem and has been studied extensively. In particular, in (Tisseur, 2001), properties of the solutions $\lambda$ are given in terms of the coefficient matrices $M, C, K$ of $(\lambda^2 M + \lambda C + K)x = 0$. When the three matrices are Hermitian, the eigenvalues are among the solutions of:
\begin{equation} \lambda = \frac{-c(x) \pm \sqrt{c(x)^2 - 4k(x)m(x)}}{2m(x)} \end{equation}
with $m(x) = x^*Mx$, $c(x) = x^*Cx$, $k(x) = x^*Kx$. In this case it can be seen that a sufficient, but not necessary, condition for $\lambda$ to be non-real is that $c(x)^2 < 4k(x)m(x)$ for all $x \neq 0$, which in our case ($M = I_n$, $C = B$, $K = C$) reduces to
\begin{equation}b(x)^2 < 4c(x) \quad \forall\, \|x\| = 1\end{equation}
with $b(x) = x^* B x$, $c(x) = x^* C x$ (pay attention to the different meaning of $c(x)$ in Tisseur's problem and in this answer's problem).
If $B, C$ are Hermitian (for real matrices, this means symmetric), then for unit vectors $x$ we have $\min(\lambda_C) \leq c(x) \leq \max(\lambda_C)$ and $b(x)^2 \leq \max(|\lambda_{B}|)^2$, so a (very) broad sufficient condition is that:
\begin{equation} \min(\lambda_C) > \bigg(\frac{\max(|\lambda_{B}|)}{2}\bigg)^2 \end{equation}
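As an illustration (not part of the original answer), one can generate symmetric $B, C$ satisfying this condition and check that every eigenvalue of the quadratic problem is indeed non-real, solving it via a standard companion linearization:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
R = rng.standard_normal((n, n))
B = (R + R.T) / 10                   # symmetric, small spectral radius
C = 5 * np.eye(n) + 0.1 * (R @ R.T)  # symmetric, eigenvalues well above (max|lam_B|/2)^2

lam_B = np.linalg.eigvalsh(B)
lam_C = np.linalg.eigvalsh(C)
assert lam_C.min() > (np.max(np.abs(lam_B)) / 2) ** 2  # the sufficient condition holds

# companion linearization of lam^2 I + lam B + C = 0
companion = np.block([[np.zeros((n, n)), np.eye(n)], [-C, -B]])
eigs = np.linalg.eigvals(companion)
assert np.all(np.abs(eigs.imag) > 1e-8)  # all eigenvalues are non-real
```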
If they are not Hermitian, then the same condition can be stated in terms of their symmetric parts, since for real $x$ we have $b(x) = x^TBx = \frac12 x^T(B+B^T)x$, and similarly for $c(x)$.