Vector dot product in machine learning

inner-products vectors

In machine learning, I often see equations like $y = w^{T} x+b$ and $y = w^{T} \cdot x+b$, but which one is correct?

According to this answer on the dot product, which says "Let $v, w \in \mathbb{R}^{n}$. Then $v^{T}w = v \cdot w$", I think $y = w^{T} \cdot x + b$ is the incorrect one: $y = w^{T} x + b$ is already equivalent to $y = w \cdot x + b$, and since $w$ and $x$ originally have the same dimension, transposing $w$ leaves $w^{T}$ and $x$ with different shapes, so the dot product between them can't be done.

Am I missing something here?

Best Answer

There are competing conventions at play here. In the mathematical community, it is primarily as you describe it: the "dot-product" is an operation between two vectors of the same shape. This convention is demonstrated, for instance, on the relevant Wikipedia page.
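Concretely, for $v, w \in \mathbb{R}^{3}$ the two sides of the quoted identity compute the same scalar, but only $v^{T}w$ is a matrix product (a small worked illustration, not part of the original answer):

$$v \cdot w = \sum_{i=1}^{3} v_{i} w_{i}, \qquad v^{T} w = \begin{pmatrix} v_{1} & v_{2} & v_{3} \end{pmatrix} \begin{pmatrix} w_{1} \\ w_{2} \\ w_{3} \end{pmatrix} = v_{1} w_{1} + v_{2} w_{2} + v_{3} w_{3}.$$

Under this convention, writing $w^{T} \cdot x$ is ill-formed: $w^{T}$ is a row vector, $x$ is a column vector, and the dot itself is only defined between two vectors of the same shape.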

On the other hand, the computer science community uses the term "dot-product" to refer to the usual product of two matrices. This convention is demonstrated, for instance, in the documentation for the numpy dot function (`numpy.dot`).
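A minimal numpy sketch of this second convention (the array values here are made up for illustration): for two 1-D arrays, `np.dot` computes the mathematical dot product directly, while for explicit column vectors it is matrix multiplication, so the transpose is needed to make the shapes conform.

```python
import numpy as np

w = np.array([1.0, 2.0, 3.0])  # 1-D array, shape (3,)
x = np.array([0.5, 0.5, 1.0])  # same shape as w
b = 0.1

# For two 1-D arrays, np.dot is the mathematical dot product: a scalar.
y = np.dot(w, x) + b

# For 2-D arrays (explicit column vectors), np.dot is matrix multiplication,
# so w must be transposed to make the shapes (1, 3) @ (3, 1) line up.
w_col = w.reshape(-1, 1)            # shape (3, 1)
x_col = x.reshape(-1, 1)            # shape (3, 1)
y_mat = np.dot(w_col.T, x_col) + b  # shape (1, 1)

assert np.isclose(y, y_mat.item())  # both conventions give the same number
```

This is why both $y = w^{T}x + b$ and $y = w \cdot x + b$ appear in machine-learning texts: they describe the same scalar, just written under the two different conventions.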