Let $W_1,W_2,W_3$ be three distinct subspaces of $\mathbb{R}^{10}$, each of dimension $9$. What are the upper and lower bounds on the dimension of $W=\displaystyle\bigcap_{i=1}^3W_i$? Can we say that $\dim (W)=7$? I know that the standard inclusion-exclusion formula for sets does not apply here, because the subspaces need not be independent. Any ideas? Thanks in advance.
[Math] Bounds on dimension of intersection of three subspaces
linear algebra
Related Solutions
This is too long to be a comment, so I make it an answer instead. Not a solution, just remarks.
I think that you are on the right track in the sense that this approach will lead to a proof. The observation that the other matrices $B$ need to keep the subspaces $W_i,i=1,2,3$ invariant is the key, and you can do the rest case by case. However, if I were grading your exam I would want you to justify the following claim that you made:
Now the dimension of the family of commuting operators on a 2-dimensional vector space over $\mathbb{C}$ is at most 2. Indeed each of the commuting operators on $W_2$ will have a repeated eigenvalue.
OK. Probably this was meant to be $W_3$, i.e. the 2-dimensional characteristic space of $A$. Your first claim is correct in that a commuting family of operators on $W_3$ has dimension at most $2$, but you haven't proven it yet! It may be a lemma or an example in your textbook, in which case you are ok. But to be on the safe side, I want you to consider the case that $A$ acts as the identity operator (or zero, any scalar will do!) on the subspace $W_3$. Then $A$ will commute with all the operators on $W_3$, and those form a 4-dimensional space. Granted, they don't commute with each other, and in the end the maximum dimension will be 2, but you haven't shown it yet.
My suspicion was raised by your second claim that each operator on $W_2$ will have a repeated eigenvalue. This is most certainly false! For example, the space $F$ could consist of the diagonal matrices, but you just happened to pick as $A$ the diagonal matrix with entries $1,2,3,3$. Then $W_3$ consists of the span of the last two basis vectors, but the diagonal matrix $B$ with entries $2,3,5,8$ does not have a repeated characteristic value on $W_3$.
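For the record, the counterexample above is easy to verify numerically; a minimal numpy sketch, using exactly the matrices named in the preceding paragraph:

```python
import numpy as np

# A has the repeated eigenvalue 3; W3 = span(e3, e4) is its eigenspace.
A = np.diag([1., 2., 3., 3.])
B = np.diag([2., 3., 5., 8.])

# All diagonal matrices commute, so B belongs to the commuting family F:
print(np.array_equal(A @ B, B @ A))  # True

# Yet B restricted to W3 acts as diag(5, 8): two distinct eigenvalues,
# so B has no repeated characteristic value on W3.
print(np.diag(B)[2:])  # [5. 8.]
```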
Now to a hint: There is no way to guarantee that you could pick $A$ in the right way, so another matrix $B$ may have several eigenvalues on a characteristic space of $A$. But then the relation $AB=BA$ means that $A$ has to keep those eigenspaces of $B$ in $W_3$ invariant! In other words $A$ has to be a scalar on $W_2$, whenever such a $B$ exists in $F$. But we can use $B$ to split the space $W_3$ further into a direct sum of smaller components. By the preceding arguments, all the matrices in $F$ then need to respect this finer direct sum decomposition, and you are back to the case of a simultaneously diagonalizable set.
[Edit: Clarifying the hint. So I would begin by showing that the 4-d space can be split into a direct sum of subspaces $\bigoplus_i W_i$ in such a way that every element of $F$ keeps each summand invariant, and has only a single characteristic value on any component.]
But another trick is needed when $A$ is not diagonalizable and (consequently) no $B\in F$ has several characteristic values in a characteristic space of $A$. Hint: Any $B\in F$ must map the null space of $(A-\lambda I)^k$ to itself. [Edit: In view of the general result cited by Pierre-Yves this requires careful work in the case that the multiplicity of the characteristic value is three, as my hint only shows that all the matrices of $F$ are upper triangular w.r.t. the basis that puts $A$ into its Jordan form. The set of upper triangular $3\times3$ matrices with identical entries along the diagonal forms a 4-d space. But it is not commutative!]
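To illustrate that last remark: the upper triangular $3\times3$ matrices with identical diagonal entries form the $4$-dimensional space spanned by $I, E_{12}, E_{13}, E_{23}$, yet two of those spanning matrices already fail to commute. A small numpy check (the names $E_{12}, E_{23}$ are the usual elementary matrices, chosen here for illustration):

```python
import numpy as np

# Elementary matrices E12 and E23, members of span{I, E12, E13, E23}.
E12 = np.zeros((3, 3)); E12[0, 1] = 1.0
E23 = np.zeros((3, 3)); E23[1, 2] = 1.0

# E12 @ E23 equals E13, but E23 @ E12 is the zero matrix,
# so the 4-d space is not commutative:
print(np.array_equal(E12 @ E23, E23 @ E12))  # False
```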
Since $\,\dim (W_1+W_2)=\dim W_1+\dim W_2-\dim(W_1\cap W_2)\,$, it is probably wiser to first find the dimension of the intersection:
$$A=\begin{pmatrix}\alpha&\beta\\\gamma&\delta\end{pmatrix}\in W_1\cap W_2\Longleftrightarrow 0=\alpha=\delta\,\,,\,\,\beta=-\gamma$$
and from the above clearly the dimension of the intersection is $\,1\,$ . Now complete the answer.
Added: $\,W_1\,$ is closed under multiplication by scalar:
$$k\begin{pmatrix}a&b\\c&a\end{pmatrix}:=\begin{pmatrix} ka&kb\\kc&ka\end{pmatrix}$$
and we can easily see that the matrix on the right-hand side belongs to $\,W_1\,$, as its main diagonal entries are equal, as required of any element of $\,W_1\,$.
Best Answer
Note that if $W_i \neq W_j$, then $W_i + W_j = \mathbb{R}^{10}$: the sum properly contains the $9$-dimensional subspace $W_i$, so its dimension must be $10$. Now use the fact that $\dim (U+V) = \dim U + \dim V - \dim (U \cap V)$ to find $\dim (W_i \cap W_j) = 9 + 9 - 10 = 8$. Finally, if $W_i \cap W_j \not \subset W_k$ for $k \neq i,j$, then $W_k + (W_i \cap W_j) = \mathbb{R}^{10}$, so apply the dimension formula again to get $\dim W = 9 + 8 - 10 = 7$; if instead $W_i \cap W_j \subset W_k$, then $W = W_i \cap W_j$ has dimension $8$. Hence $7 \le \dim W \le 8$, and both bounds are attained.
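Both bounds can be confirmed numerically. Below is a sketch in numpy (`null_space` and `intersect` are helper functions written here for illustration, not library calls): three random hyperplanes of $\mathbb{R}^{10}$ intersect in dimension $7$, while three distinct hyperplanes containing a common $8$-dimensional subspace intersect in dimension $8$:

```python
import numpy as np

rng = np.random.default_rng(0)

def null_space(A, tol=1e-10):
    # Orthonormal basis (as columns) for the null space of A, via SVD.
    _, s, vt = np.linalg.svd(A)
    rank = int((s > tol).sum())
    return vt[rank:].T

def intersect(U, V):
    # Basis for span(U) ∩ span(V): solve U a = V b, i.e. [U | -V] [a; b] = 0.
    ab = null_space(np.hstack([U, -V]))
    X = U @ ab[:U.shape[1]]          # vectors spanning the intersection
    q, r = np.linalg.qr(X)           # re-orthonormalize, drop dependent columns
    keep = np.abs(np.diag(r)) > 1e-10
    return q[:, keep]

# Generic case: three random 9-dimensional subspaces (hyperplanes) of R^10.
Ws = [null_space(rng.standard_normal((1, 10))) for _ in range(3)]
W_gen = intersect(intersect(Ws[0], Ws[1]), Ws[2])
print(W_gen.shape[1])  # 7 -- the lower bound, attained generically

# Degenerate case: three distinct hyperplanes sharing a common 8-dim subspace.
E8 = np.eye(10)[:, :8]
Ws = [np.hstack([E8, rng.standard_normal((10, 1))]) for _ in range(3)]
W_deg = intersect(intersect(Ws[0], Ws[1]), Ws[2])
print(W_deg.shape[1])  # 8 -- the upper bound for distinct subspaces
```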