Eigendecomposition of Self-Adjoint Operator with Non-Positive Inner Product

inner-products, linear-algebra, self-adjoint-operators

The spectral theorem (well, some version thereof) says that if $A$ is a self-adjoint linear operator on a (finite-dimensional) Hilbert space $H$, there exists a basis of $H$ consisting of eigenvectors of $A$. My question is this: say I have an operator $A$ which is self-adjoint with respect to some inner product $\langle \cdot | \cdot \rangle$ on a (finite-dimensional) vector space $V$, except that this inner product is not positive definite (which I mean in the strict sense that there exist vectors $|v\rangle$, $|u\rangle$ such that $\langle v | v \rangle > 0$ and $\langle u | u \rangle < 0$). Is it known whether the spectral theorem generalizes to this case? I.e. can I conclude that there exists a basis of $V$ consisting of eigenvectors of $A$?

I thought perhaps there might be a way of defining a new positive-definite inner product $\langle \cdot | \cdot \rangle_+$ from the old one in such a way that $A$ is still self-adjoint with respect to $\langle \cdot | \cdot \rangle_+$, and then the usual spectral theorem can be applied, but I'm hoping that someone already knows the answer.

EDIT: The answer to my question as stated is no, as Robert's counterexample shows. Comparing with @Joppy's comment, it seems the issue is that with an indefinite inner product, the orthogonal complement of a null vector (i.e. a nonzero vector with vanishing norm) contains that same vector. Thus if $A$ has a null eigenvector, the usual proof of the spectral theorem doesn't go through. With this observation, let me modify my question: is this the only obstacle to there existing a basis of eigenvectors of $A$? In other words, if $A$ is self-adjoint and none of its eigenvectors are null, does there exist a basis of eigenvectors of $A$?
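To make the obstruction concrete, here is a small numerical sketch (not part of the original question) using the indefinite form $\langle u, v \rangle = u_1 v_1 - u_2 v_2$, encoded by the Gram matrix $J = \mathrm{diag}(1, -1)$. It checks that $v = (1, 1)$ is null, and hence is $J$-orthogonal to itself:

```python
import numpy as np

# Gram matrix of the indefinite form <u, v> = u1 v1 - u2 v2
J = np.diag([1.0, -1.0])

# v = (1, 1) is a null vector: <v, v> = 1 - 1 = 0
v = np.array([1.0, 1.0])
assert np.isclose(v @ J @ v, 0.0)

# Since <v, v> = 0, v is J-orthogonal to itself, so v lies inside its own
# orthogonal complement v^perp.  In the positive-definite case v^perp would
# be a genuine complementary subspace, which is what the usual induction in
# the proof of the spectral theorem relies on.
```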

Best Answer

EDIT: Oops, it's not true. In dimension $2$, consider the indefinite inner product $$ \langle u, v \rangle = u_1 v_1 - u_2 v_2$$ The matrix $$A = \pmatrix{1 & -1\cr 1 & -1\cr}$$ is "self-adjoint" with respect to this, i.e. $$ \langle u, A v \rangle = \langle A u, v \rangle = (u_1 - u_2)(v_1 - v_2)$$ but it is not diagonalizable: its eigenvalue $0$ has algebraic multiplicity $2$ but geometric multiplicity $1$, its only eigenvectors being scalar multiples of $\pmatrix{1\cr 1\cr}$.
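The counterexample is easy to verify numerically. With the Gram matrix $J = \mathrm{diag}(1, -1)$, self-adjointness with respect to the form is the matrix identity $JA = A^{T}J$; non-diagonalizability follows because $A^2 = 0$ while $A \neq 0$. A quick numpy check (a sketch, not from the original answer):

```python
import numpy as np

# Gram matrix J of the indefinite form <u, v> = u1 v1 - u2 v2
J = np.diag([1.0, -1.0])
A = np.array([[1.0, -1.0],
              [1.0, -1.0]])

# Self-adjointness w.r.t. the form: <u, A v> = <A u, v> for all u, v,
# which is equivalent to the matrix identity J A = A^T J.
assert np.allclose(J @ A, A.T @ J)

# Not diagonalizable: A^2 = 0, so the only eigenvalue is 0 with algebraic
# multiplicity 2, but ker(A) is one-dimensional (rank A = 1).
assert np.allclose(A @ A, np.zeros((2, 2)))
assert np.linalg.matrix_rank(A) == 1

# The unique eigenvector direction (1, 1) is a null vector of the form,
# matching the obstruction identified in the question's edit.
v = np.array([1.0, 1.0])
assert np.allclose(A @ v, 0.0)
assert np.isclose(v @ J @ v, 0.0)
```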
