Here are some ideas on how to decide the conjecture. (EDIT: In fact these ideas lead to a proof of the conjecture, as Terry Tao explained in two comments below.)
As Christian Remling and Will Sawin showed, the conjecture is equivalent to $\det(I+T)\geq 0$ for any $T\in\mathrm{SO}^0(n,n)$.
We can assume that $-1$ is not an eigenvalue of $T$. Up to conjugacy, $T$ is a direct sum of indecomposable blocks as in Theorem 1 of Nishikawa's 1983 paper, and then $\det(I+T)$ is the product of the determinants of the corresponding blocks of $I+T$. Hence, by the idea of jjcale, we can forget about the blocks that are of exponential type. By page 83 of Djoković's 1980 paper, the remaining blocks are of type $\Gamma_m(\lambda,\lambda^{-1})$ with $\lambda<0$ and $\lambda\neq -1$, which in turn are described on page 77 of the same paper. Such a block contributes $(1+\lambda)^{2m+2}/\lambda^{m+1}$ to $\det(I+T)$, hence we can forget about the blocks where $m$ is odd.
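To spell out the sign bookkeeping behind this last step: for $\lambda<0$ and $\lambda\neq -1$ we have $(1+\lambda)^{2m+2}>0$, so
$$\operatorname{sign}\frac{(1+\lambda)^{2m+2}}{\lambda^{m+1}}=\operatorname{sign}\left(\lambda^{m+1}\right)=(-1)^{m+1},$$
i.e. a block $\Gamma_m(\lambda,\lambda^{-1})$ contributes a positive factor when $m$ is odd and a negative factor when $m$ is even.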
To summarize, we can assume that $T$ is composed of $(2m+2)\times(2m+2)$ blocks of type $\Gamma_m(\lambda,\lambda^{-1})$ with $\lambda<0$, $\lambda\neq -1$ and $m$ even. The conjecture is true if and only if the number of such blocks is always even. For this, the explicit description of $\mathrm{SO}^0(n,n)$ on page 64 of Nishikawa's 1983 paper might be useful (see also page 68 for how to use this criterion for $m=1$). Based on this, I verified by hand that one cannot have a single block for $m=2$, which also shows that the smallest possible counterexample to the conjecture is of size $10\times 10$ (i.e. $n\geq 5$).
Added 1. Terry Tao realized, and kindly explained in the comments below, that we are done in the remaining case. To summarize and streamline his ideas: writing $T=\begin{pmatrix} A & 0 \\ 0 & A^{*-1} \end{pmatrix}$ in Djoković's coordinates (see Added 3 below), we have in this case
\begin{align*}\det(I_{2n}+T)
&=\det(I_n+A)\det(I_n+A^{*-1})\\
&=\det(A)\det(I_n+A^{-1})\det(I_n+A^{*-1})\\
&=\det(A+A^{*-1})\frac{\det(I_n+A^{-1})^2}{\det(I_n+A^{-1}A^{*-1})},
\end{align*}
where $(A+A^{*-1})/2$ can be described as the restriction of $T$ to a totally positive subspace followed by the orthogonal projection to this subspace. Now $\det(A+A^{*-1})>0$ since $T\in\mathrm{SO}^0(n,n)$, while the fraction on the right is clearly positive: its numerator is a nonzero square, and its denominator is positive because $A^{-1}A^{*-1}=A^{-1}(A^{-1})^*$ is positive definite. Hence we conclude $\det(I_{2n}+T)>0$.
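Not part of the argument, but the displayed chain of determinant identities is easy to sanity-check numerically for a real invertible $A$; a quick sketch using numpy:

```python
# Numerical sanity check (my own illustration, not part of the proof):
# for T = diag(A, A^{-T}) with A real and invertible, verify
# det(I + T) = det(A + A^{-T}) * det(I + A^{-1})^2 / det(I + A^{-1} A^{-T}).
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonal shift keeps A invertible
Ainv = np.linalg.inv(A)
AinvT = np.linalg.inv(A.T)
Z = np.zeros((n, n))

T = np.block([[A, Z], [Z, AinvT]])
lhs = np.linalg.det(np.eye(2 * n) + T)
rhs = (np.linalg.det(A + AinvT)
       * np.linalg.det(np.eye(n) + Ainv) ** 2
       / np.linalg.det(np.eye(n) + Ainv @ AinvT))
assert abs(lhs - rhs) < 1e-8 * abs(lhs)
```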
Added 2. Terry Tao wrote a great blog entry on this topic.
Added 3. Let me add a variation on Terry's original argument. Djoković defines $\mathrm{SO}(n,n)$ via $J:=\begin{pmatrix} 0 & I_n \\ I_n & 0 \end{pmatrix}$, while Nishikawa defines it via $K:=\begin{pmatrix} I_n & 0 \\ 0 & -I_n\end{pmatrix}$. These two matrices are connected via $J=M^*KM$, where $M:=\frac{1}{\sqrt{2}}\begin{pmatrix} I_n & I_n\\ -I_n & I_n\end{pmatrix}$, hence any matrix $T$ in Djoković's $\mathrm{SO}(n,n)$ corresponds to $MTM^*$ in Nishikawa's $\mathrm{SO}(n,n)$. We need to examine the case of $T = \begin{pmatrix} A & 0 \\ 0 & A^{*-1} \end{pmatrix}$, which corresponds to $MTM^*=\frac{1}{2}\begin{pmatrix} A+A^{*-1} & -A+A^{*-1} \\ -A+A^{*-1} & A+A^{*-1} \end{pmatrix}$. This lies in Nishikawa's $\mathrm{SO}^0(n,n)$, whence $\det(A+A^{*-1})>0$.
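The change-of-coordinates identities above are straightforward to verify numerically as well; a small sketch with numpy, using $J$, $K$, $M$ as defined above and a random invertible real $A$:

```python
# Check J = M^T K M and the block form of M T M^T for T = diag(A, A^{-T}).
import numpy as np

n = 3
I = np.eye(n)
Z = np.zeros((n, n))
J = np.block([[Z, I], [I, Z]])             # Djokovic's form
K = np.block([[I, Z], [Z, -I]])            # Nishikawa's form
M = np.block([[I, I], [-I, I]]) / np.sqrt(2)

assert np.allclose(M.T @ K @ M, J)

rng = np.random.default_rng(1)
A = rng.standard_normal((n, n)) + n * I    # diagonal shift keeps A invertible
AinvT = np.linalg.inv(A.T)
T = np.block([[A, Z], [Z, AinvT]])
S = M @ T @ M.T
expected = 0.5 * np.block([[A + AinvT, -A + AinvT],
                           [-A + AinvT, A + AinvT]])
assert np.allclose(S, expected)
```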
Bound 1 fails already for $n=2$. Take the matrix $A=\begin{pmatrix}e^{ia}&e^{ib}\\e^{ic}&e^{id}\end{pmatrix}$; it satisfies your conditions if $|a+d-b-c|\leqslant \pi/3$. On the other hand, for $X=\begin{pmatrix}\cos a&\cos b\\\cos c&\cos d\end{pmatrix}$ we have
$$\det X=\cos a\cos d-\cos b\cos c=\frac12\bigl(\cos(a+d)+\cos(a-d)-\cos(b+c)-\cos(b-c)\bigr)\\
=\frac12\left(2\sin\frac{b+c-a-d}2\sin\frac{a+d+b+c}2+\cos(a-d)-\cos(b-c)\right),
$$
thus if $a-d=0$, $b-c=\pi$, $a+d+b+c=\pi$ and $b+c-a-d=\pi/3$ (I am too lazy to solve this explicitly), this expression equals $\frac12\left(2\sin\frac\pi6\sin\frac\pi2+\cos 0-\cos\pi\right)=\frac12(1+1+1)=3/2>1$.
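For the record, solving those four linear constraints gives e.g. $a=d=\pi/6$, $b=5\pi/6$, $c=-\pi/6$ (this explicit solution is mine), and the counterexample can be checked numerically:

```python
# Verify the explicit counterexample: the constraint |a+d-b-c| <= pi/3 holds,
# yet det X = 3/2 > 1.
import numpy as np

a = d = np.pi / 6
b = 5 * np.pi / 6
c = -np.pi / 6

assert abs(a + d - b - c) <= np.pi / 3 + 1e-12   # the hypothesis on A
X = np.array([[np.cos(a), np.cos(b)],
              [np.cos(c), np.cos(d)]])
det_X = np.linalg.det(X)
assert abs(det_X - 1.5) < 1e-9                   # det X = 3/2
```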
Best Answer
For the first point, note that by Sylvester's determinant identity ($\det(I+XY)=\det(I+YX)$ for matrices $X$, $Y$ of compatible sizes)
$$\det(I_n + v^TA^{-1}v) = \det(I_n + (v^T)(A^{-1}v)) = \det(I_n+(A^{-1}v)(v^T)) = \det(I_n + A^{-1}vv^T),$$
so
$$\det(A)\det(I_n+v^TA^{-1}v) = \det(A)\det(I_n + A^{-1}vv^T) = \det(A + vv^T).$$
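In the rank-one case ($v$ a vector) this is the matrix determinant lemma, and it is easy to sanity-check numerically; a sketch with numpy (the random instance is my own):

```python
# Check det(A) * (1 + v^T A^{-1} v) = det(A + v v^T) for a random invertible A.
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonal shift keeps A invertible
v = rng.standard_normal(n)

lhs = np.linalg.det(A + np.outer(v, v))
rhs = np.linalg.det(A) * (1 + v @ np.linalg.inv(A) @ v)
assert abs(lhs - rhs) < 1e-8 * max(1.0, abs(lhs))
```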
Given $u \in \mathbb{R}^m$ and $v \in \mathbb{R}^n$, their outer product is the $m\times n$ matrix $uv^T$. If $u$ and $v$ are non-zero, then their outer product has rank one. Conversely, a rank one $m\times n$ matrix can be written as the outer product of some non-zero $u$ and $v$.
Now suppose $C$ is an $n\times n$ matrix (not necessarily invertible) and $A$ is a rank one $n\times n$ matrix. By the above discussion, there are $u, v \in \mathbb{R}^n$ such that $A = uv^T$. So
\begin{align*} \det(I_n + tCA) &= \det(I_n + tCuv^T)\\ &= \det(I_n + (tCu)(v^T))\\ &= \det(I_1 + (v^T)(tCu))\\ &= \det(I_1 + tv^TCu)\\ &= 1 + tv^TCu \end{align*}
where the last equality follows because $I_1 + tv^TCu$ is a $1\times 1$ matrix.
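A quick numerical check that $t \mapsto \det(I_n + tCA)$ really is the affine function $1 + tv^TCu$ (sketch with numpy; the random instance is my own):

```python
# For a rank-one A = u v^T and arbitrary C, det(I + t C A) = 1 + t v^T C u.
import numpy as np

rng = np.random.default_rng(0)
n = 4
C = rng.standard_normal((n, n))   # arbitrary, possibly singular
u = rng.standard_normal(n)
v = rng.standard_normal(n)
A = np.outer(u, v)                # rank-one matrix u v^T

def f(t):
    return np.linalg.det(np.eye(n) + t * C @ A)

slope = v @ C @ u                 # predicted slope v^T C u
for t in (-2.0, 0.5, 3.0):
    assert abs(f(t) - (1 + t * slope)) < 1e-9
```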
A function of the form $f(t) = at + b$ is called affine. I'm guessing this is what is meant by affine-linear.
Added Later: Let me add a theorem which further demonstrates the relationship between the rank of a matrix and outer products. In the following statement, I am taking the definition of the rank of a matrix to be the dimension of its column space.

Theorem. A non-zero $m\times n$ matrix $A$ has rank $k$ if and only if $k$ is the smallest number of outer products that sum to $A$, i.e. the smallest $k$ such that $A = u_1v_1^T + \dots + u_kv_k^T$ for some $u_1, \dots, u_k \in \mathbb{R}^m$ and $v_1, \dots, v_k \in \mathbb{R}^n$.
Proving this is a really nice exercise in elementary linear algebra. I would hate to rob you of the experience by posting the solution here.