Self-adjoint linear map has determinant $< 0$.

adjoint-operators, linear-algebra, linear-transformations

I would like to know whether the following is correct and if so, how to generalize it:

Claim: Let $V$ be an $\mathbb R$-vector space of dimension $2$, and let $\langle \cdot , \cdot \rangle : V \times V \to \mathbb R$ be a scalar product on $V$. Let $F: V \to V$ be a self-adjoint linear map such that $v, F(v)$ is an orthogonal basis of $V$ for some $v \in V$.
Then $\det F < 0$.

Proof: We can calculate the transformation matrix $A$ of $F$ with respect to the basis $v, F(v)$:

Since $$ v \overset F \mapsto F(v) , \\ F(v) \overset F \mapsto F^2(v) = av + b F(v) $$ for some $a, b \in \mathbb R$, we know that $A= \begin{pmatrix} 0 & a \\ 1 & b \end{pmatrix} $, hence $\det F = \det A = -a$. It now suffices to show that $a > 0$. For $x \in V$ we write $\lVert x \rVert := \langle x , x \rangle$ (note that this is the square of the usual norm, which does not affect the sign arguments below).
We have $$\begin{align} a &= \frac{a}{\lVert v \rVert}\langle v, v \rangle = \frac{1}{\lVert v \rVert}\left( a \langle v , v \rangle + b \underbrace{\langle F(v), v \rangle}_{=0} \right) \\
&= \frac{1}{\lVert v \rVert}\langle av + b F(v) , v \rangle = \frac{1}{\lVert v \rVert} \langle F^2 (v), v \rangle \\
&= \frac{1}{\lVert v \rVert} \langle F(v), F(v) \rangle = \frac{\lVert F(v) \rVert}{\lVert v \rVert} > 0, \end{align}$$
where the second-to-last equality uses that $F$ is self-adjoint, and the final inequality holds because $F(v)$ is a basis vector, hence nonzero. $\square$
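A quick numerical sanity check of the $n = 2$ claim (the matrix and vector below are my own choices, with $\mathbb R^2$ carrying the standard dot product):

```python
import numpy as np

# F is symmetric, hence self-adjoint w.r.t. the standard dot product.
F = np.array([[1.0, 0.0],
              [0.0, -1.0]])
v = np.array([1.0, 1.0])
Fv = F @ v  # F(v) = (1, -1)

assert np.allclose(F, F.T)                                   # self-adjoint
assert np.isclose(v @ Fv, 0.0)                               # v orthogonal to F(v)
assert abs(np.linalg.det(np.column_stack([v, Fv]))) > 1e-12  # v, F(v) is a basis

print(np.linalg.det(F))  # -1.0, negative as the claim predicts
```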

Question: Is there a similar result for $n$-dimensional $\mathbb R$-vector spaces, where $n \in \mathbb N$? The naive approach, namely requiring $v, F(v), F^2(v),\dots, F^{n-1}(v)$ to be an orthogonal basis, is destined to fail: since $F$ is self-adjoint we have $\langle F^2(v) , v \rangle = \langle F(v), F(v) \rangle$, so $\langle F^2(v) , v \rangle =0 \iff \lVert F(v) \rVert = 0 \iff F(v) = 0$.

Best Answer

The generalization is false if $n \geq 3$ is odd: if $F_n$ satisfies the hypotheses, then so does $-F_n$, but $\det(-F_n) = (-1)^n \det(F_n) = -\det(F_n)$, so the determinant cannot have a fixed sign. It is also false if $n \geq 4$ is even, because we can define $F_n = I_{n-3} \oplus F_3$, where $F_3$ is a counterexample from the odd case; that is, $F_n$ acts on the first $n-3$ basis vectors by doing nothing and on the last three by treating their span like a three-dimensional vector space. Then $F_n$ is self-adjoint, and its matrix is a block matrix with $F_3$ in the bottom-right corner and $1$'s on the remaining $n-3$ diagonal entries.
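Concrete instances of both constructions (my own choices, using the standard dot product and reading the hypothesis as "$\langle v, F(v)\rangle = 0$ with $v, F(v)$ linearly independent"):

```python
import numpy as np

# Odd case, n = 3: determinant is positive despite v being orthogonal to F(v).
F3 = np.diag([1.0, -1.0, -1.0])
v3 = np.array([1.0, 1.0, 0.0])
assert np.isclose(v3 @ (F3 @ v3), 0.0)  # v3 orthogonal to F3(v3) = (1, -1, 0)
print(np.linalg.det(F3))  # 1.0 > 0

# Even case, n = 4: F4 = I_1 (+) F3, the odd example padded with the identity.
F4 = np.block([[np.eye(1), np.zeros((1, 3))],
               [np.zeros((3, 1)), F3]])
v4 = np.array([0.0, 1.0, 1.0, 0.0])
assert np.isclose(v4 @ (F4 @ v4), 0.0)  # v4 orthogonal to F4(v4)
print(np.linalg.det(F4))  # 1.0 > 0
```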

I believe that your proof works. Here is another: for $n = 2$, recall that being self-adjoint means that in some orthonormal basis $F$ looks like $\begin{pmatrix}\lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}$. If the determinant is positive, then the $\lambda_i$ are nonzero and have the same sign, so we can multiply by $-1$ if necessary to make both positive; this preserves the hypotheses and, since $n = 2$, also the determinant. We can then scale so that the matrix is $\begin{pmatrix} a & 0 \\ 0 & 1 \end{pmatrix}$ with $a > 0$. Now suppose that $v = (r, s)$ is as in the hypothesis. Then $\langle F(v), v\rangle = ar^2 + s^2$, which for $v \neq 0$ can only be zero if $a < 0$, or if $a = 0$ and $s = 0$; both contradict $a > 0$. The latter case also shows $\det F = 0$ is impossible: it gives $F = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$ and $v = (r, 0)$, so $F(v) = 0$ and $v, F(v)$ is not a basis.
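A randomized check of the key step in this argument, under the assumption that $\langle \cdot , \cdot \rangle$ is the standard dot product: for a symmetric $2 \times 2$ matrix with positive determinant, $\langle F(v), v \rangle$ never vanishes on nonzero $v$.

```python
import numpy as np

rng = np.random.default_rng(0)
checked = 0
while checked < 1000:
    M = rng.normal(size=(2, 2))
    F = (M + M.T) / 2          # random symmetric (= self-adjoint) matrix
    if np.linalg.det(F) <= 0:  # keep only the positive-determinant case
        continue
    v = rng.normal(size=2)
    v /= np.linalg.norm(v)     # random unit vector
    # With det F > 0 the eigenvalues share a sign, so |v^T F v| >= min |lambda_i| > 0.
    assert abs(v @ F @ v) >= np.abs(np.linalg.eigvalsh(F)).min() - 1e-9
    checked += 1
print("no unit v with <F(v), v> = 0 among 1000 positive-determinant samples")
```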