Is it possible to have n matrices that generate linearly independent vectors for the same input

linear-algebra, matrices

The question I'm trying to solve is finding when there are $n$ $n \times n$ matrices $A_1, \dots, A_n$ such that for all $v \neq 0$, $A_1v, \dots, A_nv$ are linearly independent.

I think it's possible for $n = 1$ and all even $n$, but I'm not sure how to find an explicit construction. So far, I've only shown it's impossible for odd $n \ge 3$: WLOG we can assume $A_1 = I$ (otherwise replace each $A_i$ by $A_1^{-1}A_i$; note $A_1$ must be invertible), and then some other matrix has a real eigenvector because $n$ is odd, which means plugging that eigenvector into all the matrices will not produce linearly independent vectors. I think this also shows that none of the $A_i$ with $i \ge 2$ can have a real eigenvector, but I'm not 100% sure.
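To spell out the key step: if $A_2 v = \lambda v$ for some real $\lambda$ and $v \neq 0$, then $$A_2 v - \lambda A_1 v = \lambda v - \lambda v = 0,$$ a nontrivial linear dependence among $A_1 v, \dots, A_n v$; and a real eigenvalue must exist because the characteristic polynomial of $A_2$ has odd degree $n$.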

Context: My friend asked me this question a couple of days ago. They're only in a linear algebra class, but neither I nor any of my friends have been able to solve this with what we know from analysis/algebra, so I thought to ask here.

Thank you in advance!

Best Answer

Given such $A_i$, we can define a "multiplication" $\times$ on $\Bbb R^n$ by setting $v\times w:=(u_1,\ldots, u_n)^T$ where $u_i=v^TA_iw$. This multiplication is bilinear, as you can easily check: $$( v+v')\times w = v\times w+v'\times w$$ $$v\times (w+w')=v\times w+v\times w' $$ $$(\alpha v)\times w=v\times(\alpha w)=\alpha(v\times w)$$ Careful: In general $\times$ is not commutative, i.e., $$v\times w\stackrel ?=w\times v$$ is not guaranteed to hold for all $v,w$.
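If you like to experiment, here is a minimal NumPy sketch of this product (the function name `mult` and the sample vectors are mine, not part of the construction), using the $n=2$ matrices that appear at the end of this answer:

```python
import numpy as np

def mult(v, w, As):
    """v × w, where the i-th coordinate is u_i = v^T A_i w."""
    return np.array([v @ A @ w for A in As])

A1 = np.array([[1.0, 0.0], [0.0, 1.0]])   # multiplication by 1 on C ≅ R^2
A2 = np.array([[0.0, -1.0], [1.0, 0.0]])  # multiplication by i

v, w = np.array([1.0, 2.0]), np.array([3.0, 4.0])
print(mult(v, w, [A1, A2]))       # the product v × w
print(mult(2 * v, w, [A1, A2]))   # twice the above, by bilinearity
```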

Super-Careful: In general $\times$ is not even associative, i.e., $$u\times(v\times w)\stackrel ?=(u\times v)\times w$$ is not guaranteed to hold for all $u,v,w$.

However, our multiplication can be inverted by a "division", that is: for $v\ne 0$, every equation of the form $$ x\times v=w$$ has a unique solution $x$ (because the $A_iv$ form a basis), and also every equation $$ v\times x=w$$ has a unique solution $x$ (to see this, you may want to show that the $n$-tuple $(A_1^{-1},\ldots, A_n^{-1})$ also has the desired property; note that each $A_i$ is invertible, since $A_iv=0$ for some $v\ne 0$ would already violate the independence of the $A_iv$).
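Concretely, both divisions are plain linear solves: stacking the rows $(A_iv)^T$ (for $x\times v=w$) or $v^TA_i$ (for $v\times x=w$) gives an invertible matrix. A small sketch continuing the NumPy setup above (function names are mine):

```python
import numpy as np

def solve_left(v, w, As):
    """Solve x × v = w; the i-th equation reads (A_i v)^T x = w_i."""
    M = np.stack([A @ v for A in As])   # rows A_i v: invertible whenever v != 0
    return np.linalg.solve(M, w)

def solve_right(v, w, As):
    """Solve v × x = w; the i-th equation reads (v^T A_i) x = w_i."""
    M = np.stack([v @ A for A in As])   # rows v^T A_i
    return np.linalg.solve(M, w)

# Round-trip check with the n = 2 matrices above:
As = [np.eye(2), np.array([[0.0, -1.0], [1.0, 0.0]])]
v, w = np.array([1.0, 2.0]), np.array([5.0, -1.0])
x = solve_left(v, w, As)
print(np.array([x @ A @ v for A in As]))  # recovers w
```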

At any rate, such a structure is called a division algebra (over the reals). You may start reading here to learn more about these. In particular, such a beast has been proven to exist only for $n=1$, $2$, $4$, or $8$, and can be constructed from the reals $\Bbb R$ themselves, from the complex numbers $\Bbb C$, from the quaternions $\Bbb H$, and from the octonions $\Bbb O$, respectively. You of course know the first two well, and may perhaps have heard about quaternions playing a role in 3d computer graphics, for example. But I guess this theorem about division algebras is the only time you'll ever hear about octonions. :)

Specifically, for $n=2$, we identify $\Bbb R^2$ with $\Bbb C$ in the obvious way and define $A_1$, $A_2$ as multiplication with $1$ and $i$, respectively: $$A_1=\begin{pmatrix}1&0\\0&1\end{pmatrix}, \quad A_2=\begin{pmatrix}0&-1\\1&0\end{pmatrix}.$$ You may want to try to do the same for $\Bbb H$.
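For $\Bbb H$, one workable choice (my sketch, spelling out that exercise) is to take $A_1,\dots,A_4$ to be the matrices of left multiplication by $1,i,j,k$ on $\Bbb R^4$ in the basis $(1,i,j,k)$; the snippet below verifies numerically that random nonzero $v$ give linearly independent $A_iv$:

```python
import numpy as np

# Left multiplication by 1, i, j, k on H, written in the basis (1, i, j, k).
I4 = np.eye(4)
Li = np.array([[0, -1, 0, 0], [1, 0, 0, 0], [0, 0, 0, -1], [0, 0, 1, 0]], dtype=float)
Lj = np.array([[0, 0, -1, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, -1, 0, 0]], dtype=float)
Lk = np.array([[0, 0, 0, -1], [0, 0, -1, 0], [0, 1, 0, 0], [1, 0, 0, 0]], dtype=float)
As = [I4, Li, Lj, Lk]

rng = np.random.default_rng(0)
for _ in range(1000):
    v = rng.standard_normal(4)
    M = np.column_stack([A @ v for A in As])  # columns A_1 v, ..., A_4 v
    assert abs(np.linalg.det(M)) > 1e-9      # nonzero: one can check det M = |v|^4
print("all sampled v gave linearly independent A_i v")
```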


Remark: Note that if you take $\Bbb C$ instead of $\Bbb R$ as base field, the scenario is much simpler: your eigenvector-based argument for odd dimensions $\ge 3$ now rules out all $n\ge 2$, because complex matrices always have eigenvalues (hence eigenvectors).
