Does every eigenspace of the exterior power $\bigwedge^k A$ correspond to an invariant subspace?

Tags: eigenvalues-eigenvectors, exterior-algebra, invariant-subspace, linear-algebra, linear-transformations

Let $V$ be an $n$-dimensional real vector space, and let $1<k<n$ be fixed. Given an automorphism $A \in \text{GL}(V)$, consider its $k$-th exterior power $\bigwedge^k A \in \text{GL}(\bigwedge^k V)$.

Suppose $\bigwedge^k A$ admits an eigenvector (equivalently, a real eigenvalue, which is automatically non-zero since $\bigwedge^k A$ is invertible).

Does $A$ have a $k$-dimensional invariant subspace?

If the eigenvector $v$ of $\bigwedge^k A$ is decomposable, then the answer is positive:

Write $v=v_1 \wedge v_2 \wedge \dots \wedge v_k$; then
$$\bigwedge^k A\,(v_1 \wedge v_2 \wedge \dots \wedge v_k)=Av_1 \wedge Av_2 \wedge \dots \wedge Av_k=\lambda\, v_1 \wedge v_2 \wedge \dots \wedge v_k \Rightarrow \text{span}(Av_1,\dots,Av_k)=\text{span}(v_1,\dots,v_k),$$

so $\text{span}(v_1,\dots,v_k)$ is $A$-invariant.
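As a numerical sanity check of the decomposable case (a sketch using NumPy, not part of the original argument; `compound` realizes $\bigwedge^k A$ as the matrix of $k\times k$ minors of $A$ in the lexicographically ordered multi-index basis):

```python
import itertools
import numpy as np

def compound(A, k):
    """Matrix of Lambda^k(A): entries are k x k minors of A,
    rows/columns indexed by lex-ordered k-element multi-indices."""
    idx = list(itertools.combinations(range(A.shape[0]), k))
    return np.array([[np.linalg.det(A[np.ix_(r, c)]) for c in idx]
                     for r in idx])

# A hypothetical example: A preserves span(e1, e2) (a rotation there)
# and acts on span(e3, e4) by a scaled rotation.
c, s = np.cos(0.7), np.sin(0.7)
A = np.zeros((4, 4))
A[:2, :2] = [[c, -s], [s, c]]
A[2:, 2:] = [[2 * c, -2 * s], [2 * s, 2 * c]]

L2 = compound(A, 2)
idx = list(itertools.combinations(range(4), 2))
v = np.zeros(len(idx))
v[idx.index((0, 1))] = 1.0          # v = e1 ^ e2, a decomposable vector

# v is an eigenvector of Lambda^2(A); its eigenvalue is det of the
# restriction of A to span(e1, e2), here 1
assert np.allclose(L2 @ v, v)
# and span(e1, e2) is indeed A-invariant
assert np.allclose(A[2:, :2], 0)
```

Here the eigenvector is decomposable by construction, and its span recovers the invariant plane, matching the argument above.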

However, I am not sure that every eigenvector of $\bigwedge^k A$ has to be decomposable.

This question looks somewhat related to this nice question, which is still not fully answered.

Best Answer

Nice question. First, note that over $\mathbb{C}$, any operator can be represented with respect to an appropriate basis by an upper triangular matrix. This implies that any operator $A$ has invariant subspaces of all possible dimensions (the spans of the first $j$ basis vectors), so the question is not interesting over $\mathbb{C}$.

To construct a counterexample over $\mathbb{R}$, I will use the following observations:

  1. If $n$ is even and the characteristic polynomial of $A$ has no real roots, then $A$ has no odd-dimensional invariant subspaces. Indeed, restricting $A$ to an odd-dimensional invariant subspace gives an operator whose characteristic polynomial has odd degree and therefore a real root, so the restriction has an eigenvector with a real eigenvalue, contradicting the fact that no root of the characteristic polynomial of $A$ is real.
  2. If the (possibly complex) roots of the characteristic polynomial of $A$ are $(\lambda_i)_{i=1}^n$ (with multiplicity) then the roots of the characteristic polynomial of $\Lambda^k(A)$ are $(\lambda_{\alpha})$ where $\alpha = (i_1 < \dots < i_k)$ runs over all possible multi-indices and $\lambda_{\alpha} := \lambda_{i_1} \dots \lambda_{i_k}$. To see this, assume first that $A$ is a complex operator and choose an ordered basis $(e_i)_{i=1}^n$ with respect to which $A$ is represented by an upper triangular matrix with $$Ae_i = \lambda_i e_i \mod \operatorname{span} \{ e_j \}_{j < i}. $$ Then $\Lambda^k(A)$ is represented with respect to the induced ordered basis $(e_{\alpha})$ (where the order on the multi-indices is the lexicographical one) by an upper triangular matrix with $$ \Lambda^k(A)(e_\alpha) = \lambda_{\alpha} e_{\alpha} \mod \operatorname{span} \{ e_{\beta} \}_{\beta < \alpha}. $$ The result for real operators follows by complexification using the fact that exterior power and complexification commute.
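Observation 2 can be checked numerically on a random matrix (a sketch with NumPy, under the standard identification of $\Lambda^k(A)$ with the $k$-th compound matrix of minors):

```python
import itertools
import numpy as np

def compound(A, k):
    """Matrix of Lambda^k(A) in the lex-ordered basis (e_alpha):
    entries are the k x k minors of A."""
    idx = list(itertools.combinations(range(A.shape[0]), k))
    return np.array([[np.linalg.det(A[np.ix_(r, c)]) for c in idx]
                     for r in idx])

rng = np.random.default_rng(0)
n, k = 5, 2
A = rng.standard_normal((n, n))

lam = np.linalg.eigvals(A)              # roots of char(A), with multiplicity
mu = np.linalg.eigvals(compound(A, k))  # roots of char(Lambda^k A)

# there are C(n, k) of them ...
assert len(mu) == 10
# ... and every product lambda_{i1} * ... * lambda_{ik} with i1 < ... < ik
# occurs among the eigenvalues of Lambda^k(A)
for c in itertools.combinations(range(n), k):
    p = np.prod(lam[list(c)])
    assert np.min(np.abs(mu - p)) < 1e-8
```

Matching each product to its nearest eigenvalue avoids having to sort complex spectra consistently.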

Now, let $\theta = \frac{2\pi}{3}$ and set $\alpha = e^{i\theta}$. Consider the operator $A \colon \mathbb{R}^6 \rightarrow \mathbb{R}^6$ which is represented with respect to the standard basis by the block diagonal matrix $$ \begin{pmatrix} \cos \theta & -\sin \theta & 0 & 0 & 0 & 0 \\ \sin \theta & \cos \theta & 0 & 0 & 0 & 0 \\ 0 & 0 & \cos \theta & -\sin \theta & 0 & 0 \\ 0 & 0 & \sin \theta & \cos \theta & 0 & 0 \\ 0 & 0 & 0 & 0 & \cos \theta & -\sin \theta \\ 0 & 0 & 0 & 0 & \sin \theta & \cos \theta \end{pmatrix}. $$

The characteristic polynomial of $A$ is $$ (z - \alpha)^3(z - \overline{\alpha})^3 = (z^2 - (2 \Re{\alpha})z + |\alpha|^2)^3 = (z^2 + z + 1)^3 $$ with roots $$ \alpha, \overline{\alpha}, \alpha, \overline{\alpha}, \alpha, \overline{\alpha}. $$ The roots aren't real, so $A$ doesn't have a three-dimensional invariant subspace. However, $\alpha^3 = \overline{\alpha}^3 = 1$ is a real root of the characteristic polynomial of $\Lambda^3(A)$ of multiplicity two, and since $A$ is orthogonal, $\Lambda^3(A)$ is orthogonal and hence diagonalizable, so $\Lambda^3(A)$ has two linearly independent eigenvectors with eigenvalue $1$, which are necessarily indecomposable.
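The counterexample is easy to verify numerically (a sketch with NumPy; `compound` builds $\Lambda^3(A)$ as the $20\times 20$ matrix of $3\times 3$ minors):

```python
import itertools
import numpy as np

def compound(A, k):
    """Matrix of Lambda^k(A): the k x k minors of A, lex-ordered indices."""
    idx = list(itertools.combinations(range(A.shape[0]), k))
    return np.array([[np.linalg.det(A[np.ix_(r, c)]) for c in idx]
                     for r in idx])

theta = 2 * np.pi / 3
c, s = np.cos(theta), np.sin(theta)
A = np.kron(np.eye(3), [[c, -s], [s, c]])   # three rotation blocks by theta

# all eigenvalues of A are alpha or its conjugate, with imaginary
# part +/- sin(theta) != 0, so A has no odd-dimensional invariant subspace
assert np.all(np.abs(np.linalg.eigvals(A).imag) > 0.5)

mu = np.linalg.eigvals(compound(A, 3))      # spectrum of Lambda^3(A)
# the real root 1 = alpha^3 appears with multiplicity two
assert np.sum(np.abs(mu - 1) < 1e-8) == 2
```

The remaining eighteen eigenvalues of $\Lambda^3(A)$ are $\alpha$ and $\overline{\alpha}$, nine times each.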


Remark: One can show using primary decomposition that if a real operator has a real eigenvalue, then it has invariant subspaces of all possible dimensions. Hence, counterexamples are possible only in even dimensions. It is a nice exercise to see why there is no counterexample in dimension four, so the example above is minimal in terms of dimension.