Linear Algebra – Understanding Commuting Operators

Tags: eigenvalues-eigenvectors, linear-algebra

Consider a family of linear operators on a finite-dimensional complex vector space that pairwise commute (the family may be infinite). How can one prove that they all have a common eigenvector?

The finite case can be done by induction:
1) $n=2$, $AB=BA$: let $\alpha$ be an eigenvalue of $A$ (one exists, since we are working over $\mathbb{C}$) and let $E_\alpha=\ker(A-\alpha I)$ be the corresponding eigenspace. For any $x\in E_\alpha$ we have $A(B(x))=B(A(x))=B(\alpha x)=\alpha B(x)$, so $B(x)\in E_\alpha$; that is, $E_\alpha$ is invariant under $B$. The restriction $B|_{E_\alpha}$ is an operator on a nonzero complex vector space, so it has an eigenvector, and that vector is a common eigenvector of $A$ and $B$ (see the numerical sketch below).
The case $n>2$ follows analogously by induction.
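For concreteness, here is a small numerical sanity check of the $n=2$ step using NumPy (the matrices, dimensions, and tolerances are my own illustrative choices, not part of the argument): it builds two commuting matrices, verifies that $B$ maps the eigenspace $\ker(A-\alpha I)$ into itself, and extracts a common eigenvector from the restriction of $B$ to that eigenspace.

```python
import numpy as np

# Illustrative check of the n = 2 argument: two commuting matrices built
# by simultaneous diagonalization (a hypothetical example, chosen so that
# A has a repeated eigenvalue and the eigenspace is 2-dimensional).
rng = np.random.default_rng(0)
P = rng.standard_normal((3, 3))                        # random change of basis
A = P @ np.diag([2.0, 2.0, 5.0]) @ np.linalg.inv(P)    # eigenvalue 2 repeated
B = P @ np.diag([1.0, 4.0, 7.0]) @ np.linalg.inv(P)    # commutes with A

assert np.allclose(A @ B, B @ A)                       # the matrices commute

alpha = 2.0
# Orthonormal basis of E_alpha = ker(A - alpha*I), via the SVD null space.
_, s, Vh = np.linalg.svd(A - alpha * np.eye(3))
E = Vh[s < 1e-8].conj().T                              # columns span E_alpha

# B-invariance of E_alpha: B @ E must lie in span(E).
B_restricted = np.linalg.lstsq(E, B @ E, rcond=None)[0]
assert np.allclose(B @ E, E @ B_restricted)

# Take an eigenvector of the restriction B|_{E_alpha} ...
w, vecs = np.linalg.eig(B_restricted)
x = E @ vecs[:, 0]                                     # ... in the ambient space

# x is a common eigenvector of A and B, as the argument predicts.
assert np.allclose(A @ x, alpha * x)
assert np.allclose(B @ x, w[0] * x)
print("common eigenvector found:", x)
```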

But what can I do when working with an infinite family of operators? Induction does not apply there.

Any help would be appreciated.

Best Answer

Let's say your vector space is $\mathbb{C}^n$. Then the linear operators on $\mathbb{C}^n$ are exactly the $n \times n$ complex matrices, i.e. the elements of $M_n(\mathbb{C})$.

Now, $M_n(\mathbb{C})$ is finite-dimensional (of dimension $n^2$). So even if you have infinitely many operators, say $A_1, A_2, \dots$, there exist indices $i_1, \dots, i_k$ such that $A_{i_1}, \dots, A_{i_k}$ span the same subspace as the whole family; every other $A_j$ is a linear combination of them. A common eigenvector $x$ of $A_{i_1}, \dots, A_{i_k}$, which exists by the finite case, is then automatically an eigenvector of every $A_j$: if $A_j = \sum_m c_m A_{i_m}$ and $A_{i_m} x = \lambda_m x$, then $A_j x = \left( \sum_m c_m \lambda_m \right) x$. So the finite case settles the infinite one.
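Here is an illustrative sketch of that reduction (my own hypothetical example, not from the answer): the powers $A, A^2, A^3, \dots$ form an infinite pairwise-commuting family, yet they all lie in the span of finitely many matrices, and an eigenvector of $A$ is automatically a common eigenvector of the whole family and of any linear combination drawn from it.

```python
import numpy as np

# Hypothetical example: the infinite commuting family {A, A^2, A^3, ...}
# lies in a finite-dimensional span (by Cayley-Hamilton, already in
# span{I, A, ..., A^{n-1}}), so a common eigenvector of a finite spanning
# subset serves the entire family.
A = np.array([[1.0, 1.0],
              [0.0, 3.0]])
w, V = np.linalg.eig(A)
x = V[:, 0]                          # an eigenvector of A, eigenvalue w[0]

# x is an eigenvector of every power of A, i.e. of the whole infinite family.
for k in range(1, 10):
    Ak = np.linalg.matrix_power(A, k)
    assert np.allclose(Ak @ x, (w[0] ** k) * x)

# Eigen-relations pass through linear combinations: if A_j = sum_m c_m A^m
# and A^m x = w[0]^m x, then A_j x = (sum_m c_m w[0]^m) x.
c = [2.0, -1.0, 0.5]
A_j = sum(c[m] * np.linalg.matrix_power(A, m + 1) for m in range(3))
lam = sum(c[m] * w[0] ** (m + 1) for m in range(3))
assert np.allclose(A_j @ x, lam * x)
print("x is a common eigenvector of the entire family")
```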
