[Math] Linearly independent commuting $2\times 2$ complex matrices (Hoffman & Kunze, Linear Algebra, 6.5.2)

linear-algebra, matrices

The actual question is:

Let $\mathcal{F}$ be a commuting family of $3\times 3$ complex matrices. How many linearly independent matrices can $\mathcal{F}$ contain? What about the $n\times n$ case? (Hoffman & Kunze, Linear Algebra, 6.5.2)

I have no idea how to approach the $n\times n$ case directly, so I thought I would try $2\times 2$ and $3\times 3$ first and then generalize.

For $2\times 2$ I know that a basis of $\mathcal{M}_2=\{2\times 2 \text{ complex matrices}\}$ is
$$\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix},\quad \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix},\quad \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix},\quad \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.$$

So, what I was thinking is: if I check which of these basis matrices commute with each other, the ones that do should give a linearly independent commuting set of $2\times 2$ matrices (this is my guess, I'm not very sure).

We have
$$\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}=\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad\text{but}\quad \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}=\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},$$
so these two do not commute.

I don't want to write out all the other combinations, so I will write only the pair that does commute:

$$\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}=\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \quad\text{and}\quad \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}=\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}.$$

So I think these two matrices form a linearly independent commuting set of $2\times 2$ matrices.

I was expecting to do the same for general $n\times n$, but that would be cumbersome.
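
(If it helps, these pairwise checks can be automated rather than written out by hand. Below is a minimal Python/NumPy sketch of my own, with names of my own choosing, that lists which pairs of standard basis matrices $E_{ij}$ commute for any $n$; for $n=2$ it reports exactly the pair found above. Note that looking only at the standard basis will not find the largest possible commuting family, since for example the identity commutes with everything.)

```python
import numpy as np
from itertools import combinations

def matrix_units(n):
    """Standard basis of n x n matrices: E[(i, j)] has a 1 in entry (i, j)."""
    units = {}
    for i in range(n):
        for j in range(n):
            E = np.zeros((n, n))
            E[i, j] = 1.0
            units[(i, j)] = E
    return units

n = 2  # try n = 3 for the 3 x 3 case
units = matrix_units(n)

# Print every pair of distinct basis matrices that commutes.
for (p, A), (q, B) in combinations(units.items(), 2):
    if np.array_equal(A @ B, B @ A):
        print(f"E{p} and E{q} commute")
```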

So I would be thankful if someone could help me solve this in detail, at least for $2\times 2$.

Thank You.

Best Answer

From what I can gather, this seems like a hard problem to pin down an exact bound for; however, I suspect the book just wants you to notice that all of these matrices are simultaneously triangulable. Linear independence does not change under conjugation by an invertible matrix (this is what I alluded to in my comment above), so the dimension of this space is $\leq$ the dimension of the space of all upper triangular matrices, which is ...
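
(Just to make the two facts being used here concrete, the following is a small numerical sketch of my own, not part of the answer. It checks that conjugating a family of matrices by an invertible $P$ does not change how many of them are linearly independent, and it counts the free entries of an upper triangular matrix, which spoils the "..." left to the reader above.)

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 3, 4  # matrix size and number of matrices in the family

A = [rng.standard_normal((n, n)) for _ in range(k)]  # an arbitrary family
P = rng.standard_normal((n, n))                      # generically invertible
Pinv = np.linalg.inv(P)

def rank_of_family(mats):
    """Rank of the family viewed as vectors of length n^2."""
    return np.linalg.matrix_rank(np.stack([M.reshape(-1) for M in mats]))

# Linear independence is unchanged by conjugation: both ranks agree.
print(rank_of_family(A), rank_of_family([Pinv @ M @ P for M in A]))

# Dimension of the space of upper triangular n x n matrices
# (the entries on or above the diagonal).
print(n * (n + 1) // 2)
```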

[Also see large sets of commuting linearly independent matrices, and the link therein to a theorem of Schur. Very interesting indeed.]
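
(For what it is worth, my reading of the Schur theorem mentioned in that link is that the maximum number of linearly independent pairwise commuting $n\times n$ complex matrices is $\lfloor n^2/4\rfloor + 1$. Assuming that is the intended statement, here is a sketch of one family that reaches that count: the identity together with the matrix units $E_{ij}$ whose row index lies among the first $\lceil n/2\rceil$ indices and whose column index lies among the rest.)

```python
import numpy as np
from itertools import combinations

n = 4
p = (n + 1) // 2  # split indices into {0, ..., p-1} and {p, ..., n-1}

# The identity, plus every E_ij with row in the first block and column in the
# second block.  Any two such E_ij's multiply to zero, so the family commutes.
family = [np.eye(n)]
for i in range(p):
    for j in range(p, n):
        E = np.zeros((n, n))
        E[i, j] = 1.0
        family.append(E)

# Pairwise commutation.
assert all(np.array_equal(A @ B, B @ A) for A, B in combinations(family, 2))

# Linear independence: the vectorized family has full rank.
rank = np.linalg.matrix_rank(np.stack([M.reshape(-1) for M in family]))
print(len(family), rank, n * n // 4 + 1)  # all three equal floor(n^2/4) + 1
```

For $n = 4$ this already gives $5$ commuting, linearly independent matrices, more than the $4$ diagonal matrix units.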
