It should be pretty clear that there are infinitely many solutions $\mathbf a$, which means that we can't "solve for $\mathbf a$".
Let's look at an example in the $2$-D case: take $\mathbf b = (2,3)$, and consider
$$
\mathbf a \cdot \mathbf b = 2 \implies 2a_1 + 3a_2 = 2
$$
Notice that this equation is solvable for any value of $a_2$, since we have
$$
2a_1 = 2 - 3a_2 \implies a_1 = 1 - \frac 32 a_2
$$
That is, we can treat $a_2$ as a free variable. So, for example, $\mathbf a = (1,0),(- \frac 12, 1),(-2,2),(-5,4)$ are all solutions to this equation.
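To make this concrete, here is a quick numeric sketch (in Python, with a small helper `dot` of my own) checking that the one-parameter family $\mathbf a = (1 - \frac 32 t,\ t)$ really does solve the equation:

```python
# Quick sketch: every a = (1 - 1.5*t, t) satisfies a . b = 2 for b = (2, 3).
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

b = (2, 3)
for t in (0, 1, 2, 4):          # t plays the role of the free variable a_2
    a = (1 - 1.5 * t, t)        # gives (1,0), (-0.5,1), (-2,2), (-5,4)
    assert dot(a, b) == 2
```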
There are a couple of ways to view a dot product as a linear map, if you shift your perspective slightly.
The map $\langle \cdot,\cdot \rangle : V\times V\to F$ is not linear; it is what we call bilinear, which means that it is linear in each variable. That is, a map $B : V\times V' \to W$ is bilinear if for every fixed $v\in V$ and every fixed $v'\in V'$, the maps $u'\mapsto B(v, u')$ and $u\mapsto B(u, v')$ are linear. (For a dot product, these two conditions coincide because it is symmetric, meaning $\langle v, w\rangle= \langle w, v\rangle$, so you only need to check one of them.)
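As a sanity check, here is a small Python sketch verifying linearity in the first slot of the standard dot product on $\Bbb R^2$ (symmetry then gives the second slot for free); the test vectors are arbitrary choices:

```python
# Sketch: the dot product on R^2 is linear in its first argument,
# and symmetric, so linearity in the second argument follows.
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

u, v, w = (1.0, 2.0), (3.0, -1.0), (0.5, 4.0)
c = 2.5

u_plus_cv = (u[0] + c * v[0], u[1] + c * v[1])
assert dot(u_plus_cv, w) == dot(u, w) + c * dot(v, w)   # linear in first slot
assert dot(u, v) == dot(v, u)                           # symmetric
```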
The other view is perhaps more faithful to the idea of viewing the dot product as a linear map, though essentially equivalent. It is also arguably a little more abstract (such judgments are of course subjective).
The idea is that we can take a bilinear map $B: V\times V' \to W$ and turn it into a linear map $\tilde{B}: V\to \newcommand\Hom{\operatorname{Hom}}\Hom_F(V',W)$, where $\Hom_F(V',W)$ denotes the vector space of $F$-linear maps from $V'$ to $W$. We define $\tilde{B}(v) = v'\mapsto B(v,v')$. One can then use the defining property of bilinear maps given above to show that $\tilde{B}$ is linear, and that for any $v\in V$, $\tilde{B}(v)$ is a linear map from $V'$ to $W$. This process is called currying. Then $\tilde{B}$ carries the same information as $B$, since we can recover $B$ from $\tilde{B}$ via $B(v,v')=(\tilde{B}v)v'$ (apologies for switching to parenthesis-free function application; I just find it much more readable here).
Thus one can curry the dot product to get a linear map, call it $D$, from $V$ to $\Hom_F(V,F)$. In general, $\Hom_F(V,F)$ is a vector space called the dual of $V$, often written $V^*$, so we can say $D$ is a linear map from $V$ to $V^*$. That is, we can view the dot product as being equivalent to a particularly nice linear map from $V$ to $V^*$.
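The currying construction is easy to mimic in code with closures. Here is a sketch (in Python; the names `curry` and `D` are mine) of the curried dot product, together with a pointwise check that $D$ itself is linear:

```python
# Sketch of currying: B(v, v') becomes v -> (v' -> B(v, v')).
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def curry(B):
    return lambda v: (lambda w: B(v, w))

D = curry(dot)                 # D(v) is the functional <v, -> in V*

assert D((2, 3))((1, 0)) == 2  # recovers B(v, v') = (D v) v'

# D is linear: D(u + c v) and D(u) + c D(v) agree on a test vector w.
u, v, w, c = (1, 2), (3, -1), (5, 7), 4
u_plus_cv = (u[0] + c * v[0], u[1] + c * v[1])
assert D(u_plus_cv)(w) == D(u)(w) + c * D(v)(w)
```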
Best Answer
Really, there isn't a notation that is more correct; it is just a matter of convention. All of them mean the operation $\sum_{i = 1}^n a_ib_i$. The important thing is that you understand what you must do. As you said yourself, in $\mathbf{A \cdot B^T}$, we see $\mathbf{A}$ and $\mathbf{B}$ as row vectors. The $\mathbf{^T}$ serves just to remind you that you can see the dot product as a matrix multiplication: after all, we will have a $1 \times n$ matrix times an $n \times 1$ matrix, which is well defined and gives as a result a $1 \times 1$ matrix, i.e., a number.
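To illustrate the matrix view, a short sketch using NumPy (assuming it is available):

```python
# Sketch: the dot product as a (1 x n) @ (n x 1) matrix product in NumPy.
import numpy as np

A = np.array([[1, 2, 3]])    # A as a 1 x n row vector
B = np.array([[4, 5, 6]])    # B as a 1 x n row vector

prod = A @ B.T               # (1 x n) @ (n x 1) -> a 1 x 1 matrix
assert prod.shape == (1, 1)
assert prod[0, 0] == np.dot(A[0], B[0])   # the same number, 4 + 10 + 18 = 32
```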
The notation $\mathbf{A \cdot B}$ doesn't suggest any of these things, and you can think directly of the termwise multiplication followed by the sum.
In Linear Algebra, we often talk about inner products in arbitrary vector spaces, a sort of generalization of the dot product. Given vectors $\mathbf{A}$ and $\mathbf{B}$, a widely used notation is $\langle \mathbf{A}, \mathbf{B} \rangle$. An inner product (on a real vector space) is, put simply, a symmetric bilinear form ("form" meaning that the result is a number) which is positive definite. That means:
i) $\langle \mathbf{A}, \mathbf{B} \rangle=\langle \mathbf{B}, \mathbf{A} \rangle $;
ii) $\langle \mathbf{A} + \lambda \mathbf{B}, \mathbf{C} \rangle = \langle \mathbf{A}, \mathbf{C} \rangle + \lambda \langle \mathbf{B}, \mathbf{C} \rangle$ ;
iii) $\langle \mathbf{A}, \mathbf{A} \rangle > 0 $ if $\mathbf{A} \neq \mathbf{0}$.
Personally, I don't like the notation $\mathbf{A \cdot B^T}$, because when working in spaces more general than $\Bbb R^n$, the dimension isn't always finite, so matrices don't work so well. I have never seen a notation different from the three I mentioned. But I reiterate what I said at the beginning: there isn't a single correct notation, and you should get used to all of them as far as possible.