As Paul already noted, you are on the right path (although you have to assume $n\geq 2$: for $n=1$ the zero matrix is the only symmetric matrix with $a_{11}=0$, so the map cannot be surjective).
Now, how to show that this map is surjective if $n\geq 2$: Let $(r,s)\in \mathbb{R}^2$. Then define the symmetric matrix $A$ by $a_{11}=r$ and $a_{nn}=s-r$ and all other entries equal to zero. Then $T(A)=(r,s)$, so $T$ is surjective. Now you can apply rank-nullity and get the result you stated.
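If you want to sanity-check this construction numerically, here is a small NumPy sketch. It assumes, as in the question, that $T$ is the map $A\mapsto(a_{11},\operatorname{trace}A)$; the helper `preimage` is my own name for the matrix built above:

```python
import numpy as np

def preimage(r, s, n):
    """Symmetric matrix A with a_11 = r, a_nn = s - r, zeros elsewhere,
    so that T(A) = (a_11, trace A) = (r, s).  Requires n >= 2."""
    A = np.zeros((n, n))
    A[0, 0] = r
    A[n - 1, n - 1] = s - r
    return A

A = preimage(2.0, 5.0, n=4)
assert np.allclose(A, A.T)            # A is symmetric
assert A[0, 0] == 2.0                 # first component of T(A) is r
assert np.isclose(np.trace(A), 5.0)   # second component of T(A) is s
```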
To comment on the solution that Daniel Rust gave in the comments: To define a symmetric matrix you have to choose a value for each diagonal entry and for each entry above the diagonal (the values below the diagonal are then determined by $a_{ij}=a_{ji}$). There are $n$ entries on the diagonal. But $a_{11}=0$ by assumption, so there is no choice there; likewise there is no choice for, say, $a_{nn}$, as $0=\operatorname{trace}(A)=\sum a_{ii}$ forces $a_{nn}=-\sum_{i\neq n}a_{ii}$. So you are left with $n-2$ free choices on the diagonal.
For the part above the diagonal, note that a matrix has $n^2$ entries in total. All but $n$ of them lie off the diagonal, that is $n^2-n$ entries. Exactly half of those lie above the diagonal, giving $\frac{1}{2}(n^2-n)$. None of them is constrained by your conditions.
Adding all choices up, you have $(n-2)+\frac{1}{2}(n^2-n)=\frac{1}{2}n(n+1)-2$.
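The same count can be verified numerically: encode the two conditions $a_{11}=0$ and $\operatorname{trace}(A)=0$ as linear constraints on the $\frac{1}{2}n(n+1)$ free entries of a symmetric matrix, and apply rank–nullity to the constraint map. A small NumPy sketch (the coordinate ordering, diagonal entries first, is my own choice):

```python
import numpy as np

n = 5
d = n * (n + 1) // 2          # free entries of a symmetric matrix
# Coordinates: the n diagonal entries first, then the strictly upper part.
C = np.zeros((2, d))
C[0, 0] = 1.0                 # constraint a_11 = 0
C[1, :n] = 1.0                # constraint trace = sum of diagonal = 0
rank = np.linalg.matrix_rank(C)
dim = d - rank                # rank-nullity for the constraint map
assert dim == n * (n + 1) // 2 - 2
```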
The dimensions you are obtaining for $V$, the $\mathbb R$-vector space of symmetric matrices, and $W$, the subspace of traceless symmetric matrices, are not correct. So your arguments cannot be completely valid, sorry.
And yes, there are fundamental mistakes.
It appears (to me) as if you are still struggling with the notion of a basis.
Several (equivalent) definitions are available, e.g., the following one which suits the present context:
A system of vectors $b_1, b_2,\dots , b_d\in X$ is called a basis for the vector space $X$ if any vector $x\in X$ admits a unique representation as a linear combination
$$\alpha_1b_1+\alpha_2b_2+\ldots +\alpha_db_d\,=\,x\,.$$
That a basis always exists, and that different bases of the same vector space have the same cardinality, are two fundamental (proven) facts. On the one hand, they allow the dimension $d$ to be attributed directly to the vector space, without reference to a chosen basis. On the other hand, any convenient basis can be used to read off the dimension of the vector space.
If $M_n(\mathbb R)$ denotes the $\mathbb R$-vector space of real $n\times n$ matrices, let $E_{ij}\in M_n(\mathbb R)$ be the matrix with a $1$ at position $(i,j)$ and zeros elsewhere. The $E_{ij}$ with $i,j=1,\dots,n$ constitute a basis of $M_n(\mathbb R)$; the uniqueness required by the above definition is fairly obvious. And $\,\dim M_n(\mathbb R)=n^2$.
Symmetry of a matrix $A$ means $a_{ji}=a_{ij}$ for all its entries (for the diagonal elements this condition is vacuous). This is a restriction compared to $M_n(\mathbb R)$, as the upper triangular entries of $A$ fix the entries in the strictly lower triangular part. This leads to considering the system
$E_{ij}+E_{ji}$ where $1\leqslant i<j\leqslant n$, together with $E_{ii}$, $1\leqslant i\leqslant n$, as a candidate for a basis of $V$. Again, checking the required uniqueness is straightforward, so this is indeed a basis of the symmetric matrices. As a result
$$\dim V \:=\:\frac{(n-1)\,n}{2}+n \:=\: \frac{n\,(n+1)}{2}$$
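This basis, and hence the dimension, can be checked numerically: flatten each candidate basis matrix into a row vector and compute the rank of the resulting matrix. A NumPy sketch (the helper `E(i, j)` is my zero-indexed version of $E_{ij}$):

```python
import numpy as np

n = 4
def E(i, j):
    """Matrix unit: 1 at position (i, j), zeros elsewhere (0-indexed)."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

# Candidate basis of V: the E_ii and the symmetrized off-diagonal units.
basis = [E(i, i) for i in range(n)] + \
        [E(i, j) + E(j, i) for i in range(n) for j in range(i + 1, n)]
# Flatten each basis matrix into a row vector; full rank = linear independence.
B = np.array([M.ravel() for M in basis])
assert np.linalg.matrix_rank(B) == len(basis) == n * (n + 1) // 2
```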
Imposing $\,\operatorname{tr}A=0$ is an additional constraint on elements of $V$, hence $\dim W<\dim V$ is expected: Invoking Rank-Nullity as in the OP yields
$$\dim W \:=\:\dim V -\dim\operatorname{Im}(\operatorname{tr}: V\to\mathbb R)
\:=\:\frac{n\,(n+1)}{2}-1 \:=\:\frac{(n-1)(n+2)}{2}$$
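This rank–nullity step can also be checked numerically: write the trace, as a linear map $V\to\mathbb R$, in coordinates with respect to the basis of $V$ above, and subtract its rank. A NumPy sketch (again with my zero-indexed helper `E(i, j)`):

```python
import numpy as np

n = 4
def E(i, j):
    """Matrix unit: 1 at position (i, j), zeros elsewhere (0-indexed)."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

V_basis = [E(i, i) for i in range(n)] + \
          [E(i, j) + E(j, i) for i in range(n) for j in range(i + 1, n)]
# The trace as a linear map V -> R, in coordinates w.r.t. this basis:
t = np.array([[np.trace(M) for M in V_basis]])    # 1 x dim(V) matrix
dim_V = len(V_basis)
dim_W = dim_V - np.linalg.matrix_rank(t)          # rank-nullity
assert dim_W == (n - 1) * (n + 2) // 2
```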
Alternatively, you may take $V$'s basis from above and subtract from each element its trace times $E_{nn}$. This leaves the $E_{ij}+E_{ji}$ unchanged and gives $F_{ii}:=E_{ii}-E_{nn}$ for $1\leqslant i\leqslant n-1$; hence at this stage the dimension goes down by one. (Generating sets may contain the null vector $F_{nn}=0$, whereas a basis cannot, never ever.)
Checking uniqueness and the "final count down" is left to you.
Hope this reply is helpful to you.
Best Answer
We claim that $\{AB - BA : A,B \in M_n(\mathbb{C})\} = \{A \in M_n(\mathbb{C}) : \operatorname{Tr} A = 0\} = \ker \operatorname{Tr}$.
We already know that $\{AB - BA : A,B \in M_n(\mathbb{C})\} \subseteq \ker \operatorname{Tr}$.
Let $E_{ij}$ denote the matrix with $1$ at the position $(i,j)$ and $0$ elsewhere.
Check that $B = \{E_{ij} : 1 \le i, j \le n, i\ne j\} \cup \{E_{ii} - E_{nn} : 1 \le i \le n-1 \}$ is a basis for $\ker \operatorname{Tr}$.
For $1 \le i, j \le n, i\ne j$ we have
$$E_{ij} = E_{ik}E_{kj} - E_{kj}E_{ik}$$
where $k$ is some index $\ne i,j$ (such a $k$ exists whenever $n \ge 3$; for $n = 2$ note directly that $E_{12} = E_{11}E_{12} - E_{12}E_{11}$ and $E_{21} = E_{22}E_{21} - E_{21}E_{22}$). To see this, let $\{e_1, \ldots, e_n\}$ be the standard basis for $\mathbb{C}^n$ and note that $E_{ij}e_j = e_i$ and $E_{ij}e_r = 0$ for $r \ne j$. Now verify that
$$(E_{ik}E_{kj} - E_{kj}E_{ik})e_r = \begin{cases} 0, &\text{if } r \ne j,k\\ E_{ik}E_{kj}e_j = E_{ik}e_k = e_i, &\text{if }r = j\\ -E_{kj}E_{ik}e_k = -E_{kj}e_i = 0, &\text{if }r = k\\ \end{cases}$$
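This identity is easy to confirm numerically for all off-diagonal pairs at once, assuming $n \ge 3$ so that a third index $k$ is available (the helper `E(i, j)` is my zero-indexed matrix unit):

```python
import numpy as np

n = 4
def E(i, j):
    """Matrix unit: 1 at position (i, j), zeros elsewhere (0-indexed)."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

# Check E_ij = E_ik E_kj - E_kj E_ik for every off-diagonal pair (i, j),
# with k any index different from both i and j (needs n >= 3).
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        k = next(k for k in range(n) if k not in (i, j))
        assert np.array_equal(E(i, j), E(i, k) @ E(k, j) - E(k, j) @ E(i, k))
```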
For $1 \le i \le n-1$ we have
$$E_{ii} - E_{nn} = E_{in}E_{ni} - E_{ni}E_{in}$$
Indeed
$$(E_{in}E_{ni} - E_{ni}E_{in})e_r = \begin{cases} 0, &\text{if } r \ne i,n\\ E_{in}E_{ni}e_i = E_{in}e_n = e_i, &\text{if }r = i\\ - E_{ni}E_{in}e_n = -E_{ni}e_i = -e_n, &\text{if }r = n\\ \end{cases}$$
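The diagonal identity can be confirmed the same way (zero-indexed, so $E_{nn}$ becomes `E(n - 1, n - 1)`):

```python
import numpy as np

n = 4
def E(i, j):
    """Matrix unit: 1 at position (i, j), zeros elsewhere (0-indexed)."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

# Check E_ii - E_nn = E_in E_ni - E_ni E_in for i = 1, ..., n-1.
for i in range(n - 1):
    lhs = E(i, i) - E(n - 1, n - 1)
    rhs = E(i, n - 1) @ E(n - 1, i) - E(n - 1, i) @ E(i, n - 1)
    assert np.array_equal(lhs, rhs)
```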
Therefore $B \subseteq \{AB - BA : A,B \in M_n(\mathbb{C})\}$ so we conclude $\ker \operatorname{Tr} \subseteq \{AB - BA : A,B \in M_n(\mathbb{C})\}$.