Dimension of kernel subspace of trace transformation

linear-algebra, linear-transformations, vector-spaces

From S.L. Linear Algebra:

Let $V$ be the vector space of real $n \times n$ symmetric matrices.
What is $\textrm{dim} \, V$? What is the dimension of the subspace $W$ consisting
of those matrices $A$ such that $tr(A) = 0$?

where $tr$ denotes the trace, the sum of the diagonal elements of a matrix.
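
For instance (a small example of my own, just to fix notation), with $n = 2$:

$$tr\begin{pmatrix} 3 & 5 \\ 5 & -3 \end{pmatrix} = 3 + (-3) = 0,$$

so this symmetric matrix would belong to the subspace $W$.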


My Solution:

What is $\textrm{dim} \, V$?

Considering that $V$ is a vector space whose elements are matrices of size $n \times n$, it must be generated by a basis of cardinality $n \times n$, hence $\textrm{dim} \, V = n \times n = n^2$.

What is the dimension of the subspace $W$ consisting
of those matrices $A$ such that $tr(A) = 0$?

From my understanding, $tr$ in this case is a linear transformation such that:

$$tr: \mathbb{R}^{n \times n} \rightarrow \mathbb{R}$$

where for any arbitrary symmetric matrix $A = (a_{ij}) \in \mathbb{R}^{n \times n}$, $tr(A)$ is defined by:

$$tr(A) = \sum_{k=1}^{n}{a_{kk}}=a_{11} + a_{22} + \cdots + a_{nn}$$

Therefore, $\textrm{dim} \, \textrm{Im}(tr) = 0$, and the Rank-Nullity theorem asserts that:

$$\textrm{dim} \, V = \textrm{dim} \, \textrm{Im}(tr) + \textrm{dim} \, \textrm{Ker}(tr)$$

From this we can derive the dimension of the kernel:

$$\textrm{dim} \, \textrm{Ker}(tr) = \textrm{dim} \, V - \textrm{dim} \, \textrm{Im}(tr)$$

$$\textrm{dim} \, \textrm{Ker}(tr) = n^2 - 0$$

It seems to me that $W=\textrm{Ker}(tr)$ (since the kernel is a subspace), and therefore $\textrm{dim} \, W = n \times n = n^2$.

Question:

Are the arguments for my answers completely valid? Are there any fundamental mistakes? If so, what's the valid argument?

Thank you!

Best Answer

The dimensions you are obtaining for $V$, the $\mathbb R$-vector space of symmetric matrices, and $W$, the subspace of traceless symmetric matrices, are not correct. So your arguments cannot be completely valid, sorry.
And yes, there are fundamental mistakes.

It appears (to me) as if you are still struggling with the notion of a basis. Several (equivalent) definitions are available, e.g., the following one which suits the present context:
A system of vectors $b_1, b_2,\dots , b_d\in X$ is called a basis for the vector space $X$ if any vector $x\in X$ admits a unique representation as a linear combination $$\alpha_1b_1+\alpha_2b_2+\ldots +\alpha_db_d\,=\,x\,.$$

That a basis always exists, and that different bases of the same vector space have equal cardinality, are two (proven) fundamental consequences. On the one hand they allow for attributing the dimension $d$ directly to the vector space, without referring to a chosen basis. On the other hand, any suitable basis can be considered to read off the dimension of the vector space.
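
To illustrate the uniqueness requirement (a small example of my own, not strictly needed for the argument): in $\mathbb R^2$ the vectors $b_1=(1,0)$ and $b_2=(1,1)$ form a basis, since

$$(x,y)\,=\,(x-y)\,b_1 + y\,b_2$$

and the coefficients $x-y$ and $y$ are forced, so the representation is unique; hence $\dim\mathbb R^2=2$.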

If $M_n(\mathbb R)$ denotes the $\mathbb R$-vector space of real $n\times n$ matrices, let $E_{ij}\in M_n(\mathbb R)$ be the matrix equal to $1$ at position $i,j$ and zero elsewhere. The $E_{ij}$ with $i,j=1,\dots,n$ constitute a basis of $M_n(\mathbb R)$; the uniqueness required by the above definition is fairly obvious. And $\dim M_n(\mathbb R)=n^2$.
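
Concretely (again just an illustrative example), for $n=2$ the four matrices are

$$E_{11}=\begin{pmatrix}1&0\\0&0\end{pmatrix},\quad E_{12}=\begin{pmatrix}0&1\\0&0\end{pmatrix},\quad E_{21}=\begin{pmatrix}0&0\\1&0\end{pmatrix},\quad E_{22}=\begin{pmatrix}0&0\\0&1\end{pmatrix},$$

and every $A=(a_{ij})\in M_2(\mathbb R)$ is written uniquely as $A=a_{11}E_{11}+a_{12}E_{12}+a_{21}E_{21}+a_{22}E_{22}$, confirming $\dim M_2(\mathbb R)=4=2^2$.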

Symmetry of a matrix $A$ means $a_{ji}=a_{ij}$ for all its entries (for the diagonal elements this condition is void), which is a restriction compared to $M_n(\mathbb R)$: the upper triangular entries of $A$ fix the entries of the strictly lower triangular part. This leads to considering the system $E_{ij}+E_{ji}$ where $1\leqslant i<j\leqslant n$, together with the $E_{ii}, 1\leqslant i\leqslant n$, as a candidate for a basis of $V$. Again, checking the required uniqueness is straightforward, so this is indeed a basis of the symmetric matrices. As a result $$\dim V \:=\:\frac{(n-1)\,n}{2}+n \:=\: \frac{n\,(n+1)}{2}$$

Imposing $\operatorname{tr}A=0$ is an additional constraint on elements of $V$, hence $\dim W<\dim V$ is expected. Note that $\operatorname{tr}: V\to\mathbb R$ is surjective, so $\dim\operatorname{Im}(\operatorname{tr})=1$, not $0$. Invoking Rank-Nullity as in the OP then yields $$\dim W \:=\:\dim V -\dim\operatorname{Im}(\operatorname{tr}: V\to\mathbb R) \:=\:\frac{n\,(n+1)}{2}-1 \:=\:\frac{(n-1)(n+2)}{2}$$

Alternatively, you may take $V$'s basis as above and subtract from each element its trace times $E_{nn}$. This leaves the $E_{ij}+E_{ji}$ unchanged, and gives $F_{ii}:=E_{ii}-E_{nn}$ where $1\leqslant i\leqslant n-1$, hence at this stage the dimension goes down by one. (Generating sets may contain the null vector $F_{nn}=0$, whereas a basis cannot, never ever.)
Checking uniqueness and the "final count down" is left to you.
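
As a sanity check (my own small case, not part of the argument above), take $n=2$: the formulas give

$$\dim V = \frac{2\cdot 3}{2} = 3, \qquad \dim W = \frac{1\cdot 4}{2} = 2,$$

which matches the explicit bases $\{E_{11},\,E_{22},\,E_{12}+E_{21}\}$ of $V$ and $\{F_{11}=E_{11}-E_{22},\,E_{12}+E_{21}\}$ of $W$: a traceless symmetric $2\times 2$ matrix is exactly one of the form $\begin{pmatrix} a & b\\ b & -a\end{pmatrix} = a\,F_{11} + b\,(E_{12}+E_{21})$.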

Hope this reply is helpful to you. When searching for the terms "dim symmetric trace" I found site-internal references so closely related that your post will probably get marked as a duplicate.