You asked for pointers to inconsistencies or inaccuracies, so I'll do that. The most serious ones are in part (a).
$(T_1+T_2)A=T_1A+T_2A$ is true, but it is not the matrix distributive property that makes it true: matrix multiplication is not the only kind of linear transformation, and it doesn't even make sense for $V\neq F^n$ (with $F$ being $\Bbb R$ or $\Bbb C$ or a generic field, depending on your definition).
Similarly your other property is true, but it has nothing to do with matrices.
These two properties do not show directly that $V^*$ is a vector space, but rather that it is a subspace of a larger space $W$. Certainly if it is a subspace of $W$ then it is a vector space, but you never mention which larger space $W$ you are working in.
Actually, that's not strictly true; you imply that $W=V$ in your last line, which is false. For instance, if $V=\Bbb R$ and $f$ is defined by $f(x)=x$, then $f$ is not a real number, so it is not even true that $V\subseteq V^*$.
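For concreteness (this identification of the ambient space is my suggestion, not something stated in your proof), the natural choice of larger space is the set of *all* functions from $V$ to $F$, made into a vector space with pointwise operations:

```latex
% W = F^V, the space of ALL functions V -> F, with pointwise operations:
(f+g)(v) := f(v) + g(v), \qquad (af)(v) := a\,f(v).
% V^* is the subset of W consisting of the linear functions; your two
% closure properties then show exactly that V^* is a subspace of W.
```

With this $W$ in hand, the subspace argument goes through as you intended.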
Your proof of linear independence is great. Rough around the edges, but logically sound.
Span has some issues. First, it is not true that $0$ cannot be written as $a_1f_1+\cdots+a_nf_n$; set $a_1=\cdots=a_n=0$.
Even if you add the condition that "not all $a_i$ are zero", you still must prove that $0$ is the only linear transformation that cannot be written this way. A priori, this is far from obvious.
As a more helpful comment: the usual method for proving that $\operatorname{span} B = U$ is to say "Suppose $g\in U$ is an arbitrary vector" and then explicitly construct some $a_i$ such that $\sum a_i b_i=g$ (where $b_i\in B$ are the basis elements).
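Here is a sketch of that method in your setting (assuming, as is standard, that $B=\{f_1,\dots,f_n\}$ is the dual basis satisfying $f_i(e_j)=\delta_{ij}$ for some basis $\{e_1,\dots,e_n\}$ of $V$; adjust to whatever $B$ you actually defined):

```latex
% Given an arbitrary g in V^*, define a_i := g(e_i). Then for each basis vector e_j:
\Bigl(\sum_{i=1}^n a_i f_i\Bigr)(e_j) = \sum_{i=1}^n g(e_i)\,f_i(e_j) = g(e_j).
% So sum a_i f_i and g agree on a basis of V; since both are linear, they agree
% on all of V, hence g = a_1 f_1 + ... + a_n f_n and span B = V^*.
```

The key point is that two linear maps agreeing on a basis must be equal, which is where linearity does the real work.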
The third part is perfect.
Best Answer
A longer hint:
Pick a basis $\{e_i\}$ of $V$ and a basis $\{f_j\}$ of $W$. Consider the set of linear transformations $S=\{T_{mn}\}$, where we define $T_{mn}$ by $$T_{mn}\Bigl(\sum_i a_ie_i\Bigr) = a_mf_n.$$ In other words, $T_{mn}$ maps $e_m$ to $f_n$ and maps the other basis vectors of $V$ to $0$ (in $W$), and we then extend this over all of $V$ while keeping $T_{mn}$ linear.
Can you show that the transformations in $S$ are independent, i.e. that you cannot make one of them by adding together multiples of the others?
Can you see how to add together multiples of transformations in $S$ to create a transformation that maps $e_m$ in $V$ to any given vector $w$ in $W$?
Now note that a linear transformation in $T$ is determined by its action on each of the basis vectors of $V$. Can you see how to add together multiples of transformations in $S$ to create any given linear transformation in $T$?
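If you get stuck, here is a sketch of the combination (the coefficients $c_{mn}$ are my notation): writing $T(e_m)=\sum_n c_{mn} f_n$, we get

```latex
% Evaluate the candidate combination on an arbitrary basis vector e_k:
\Bigl(\sum_{m,n} c_{mn} T_{mn}\Bigr)(e_k) = \sum_{n} c_{kn} f_n = T(e_k),
% since T_{mn}(e_k) = 0 unless m = k, in which case it equals f_n.
% Agreement on a basis plus linearity gives sum_{m,n} c_{mn} T_{mn} = T.
```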
Once you have shown that the transformations in $S$ are independent and that they span all of $T$, you know that $S$ is a basis of $T$. Count the size of $S$ to find the dimension of $T$.
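As a concrete sanity check (a numerical illustration, not part of the proof): taking $V=\Bbb R^3$, $W=\Bbb R^2$ with standard bases, each $T_{mn}$ is just a "matrix unit" with a single $1$ entry, and flattening them shows you get $\dim V \cdot \dim W = 6$ independent vectors:

```python
import numpy as np

dim_V, dim_W = 3, 2  # example dimensions, chosen only for illustration

# T_{mn} sends e_m in V to f_n in W; as a dim_W x dim_V matrix it has a
# single 1 in row n, column m (a matrix unit), and 0 everywhere else.
basis = []
for m in range(dim_V):
    for n in range(dim_W):
        T = np.zeros((dim_W, dim_V))
        T[n, m] = 1.0
        basis.append(T.flatten())

M = np.array(basis)  # each row is one flattened T_{mn}
rank = np.linalg.matrix_rank(M)
print(rank)  # full rank = dim_V * dim_W: the T_{mn} are independent and span
```

Since the flattened matrix units are distinct standard basis vectors of $\Bbb R^6$, the rank is $6$, matching the dimension formula $\dim T = \dim V \cdot \dim W$.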