In my experience, "dot product" refers to the product $\sum a_ib_i$ of two vectors $a,b\in \Bbb R^n$, while "inner product" refers to a more general class of things. (I should also note that the real dot product extends to a complex dot product using the complex conjugate: $\sum a_i\overline{b}_i$.)
The definition of "inner product" that I'm used to is a certain kind of biadditive map $V\times V\to F$, where $V$ is an $F$ vector space.
In the context of $\Bbb R$ vector spaces, the biadditive form is usually taken to be symmetric and $\Bbb R$ linear in both coordinates, and in the context of $\Bbb C$ vector spaces, it is taken to be Hermitian-symmetric (that is, reversing the order of the product results in the complex conjugate) and $\Bbb C$ linear in the first coordinate.
Inner products can be defined even on infinite-dimensional vector spaces; the integral example is a good illustration of that.
The real dot product is just a special case of an inner product. In fact it's even positive definite, but general inner products need not be so. The modified dot product for complex spaces is also positive definite, and has the Hermitian symmetry I mentioned above.
Inner products are generalized by bilinear (and sesquilinear) forms. I think I've seen some authors use "inner product" for these as well, but many authors stick to $\Bbb R$ and $\Bbb C$ and require positive definiteness as an axiom. General bilinear forms allow for indefinite forms and even degenerate vectors (ones with "length zero"). The naive version of the dot product, $\sum a_ib_i$, still works over any field $\Bbb F$. Another thing to keep in mind is that in many fields the notion of "positive definite" doesn't make any sense, so that axiom may disappear.
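To make the last point concrete, here is a small sketch (my own illustration, not part of the answer above) of how the naive dot product behaves over a finite field, where "positive definite" has no meaning and a nonzero vector can have "length zero":

```python
# Naive dot product sum(a_i * b_i) with arithmetic done modulo a prime p.
# Over F_2 = {0, 1}, the nonzero vector (1, 1) pairs to zero with itself:
# a degenerate vector.

P = 2  # the field F_2

def dot_mod_p(a, b, p=P):
    """Naive dot product, computed in the field F_p."""
    return sum(x * y for x, y in zip(a, b)) % p

v = [1, 1]              # nonzero vector in F_2^2
print(dot_mod_p(v, v))  # 1*1 + 1*1 = 2 ≡ 0 (mod 2)
```

So the formula itself survives the change of field, but the geometry ("length", "positivity") does not.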
The dot product is the special case of a more general concept, the inner product. If you have a vector space $ V $ over the reals or the complex numbers, then an inner product is a map $ f : V \times V \to \mathbb{C} $ or $ f : V \times V \to \mathbb{R} $ which is conjugate symmetric, positive definite, and linear in its first argument. We usually write $ f(u, v) = \langle u, v \rangle $, in which case these properties can be summed up as follows:
- Conjugate symmetry: $ \overline{\langle u, v \rangle} = \langle v, u \rangle $, where $ \bar{z} $ denotes complex conjugation. Note that this implies $ \langle u, u \rangle $ is always real for any vector $ u $.
- Positive definiteness: $ \langle v, v \rangle \geq 0 $ for any $ v \in V $, with equality holding iff $ v = 0 $.
- Linearity in the first argument: $ \langle \alpha u + \beta v, w \rangle = \alpha \langle u, w \rangle + \beta \langle v, w \rangle $ where $ u, v, w \in V $ and $ \alpha, \beta $ are in the field of scalars.
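As a concrete sanity check (my own sketch, not part of the answer), the three axioms can be verified numerically for the standard complex inner product $\langle u, v \rangle = \sum_i u_i \overline{v_i}$, which is linear in the first argument, assuming NumPy is available:

```python
import numpy as np

def inner(u, v):
    """Standard complex inner product, linear in the first argument."""
    return np.sum(u * np.conj(v))

rng = np.random.default_rng(0)
u = rng.normal(size=3) + 1j * rng.normal(size=3)
v = rng.normal(size=3) + 1j * rng.normal(size=3)
w = rng.normal(size=3) + 1j * rng.normal(size=3)
a, b = 2 - 1j, 0.5 + 3j

# Conjugate symmetry: conj(<u, v>) == <v, u>
assert np.isclose(np.conj(inner(u, v)), inner(v, u))
# Positive definiteness: <v, v> is real and positive for v != 0
assert np.isclose(inner(v, v).imag, 0) and inner(v, v).real > 0
# Linearity in the first argument
assert np.isclose(inner(a * u + b * v, w), a * inner(u, w) + b * inner(v, w))
```

Note that NumPy's own `np.vdot` conjugates its *first* argument, the opposite convention; the helper above matches the convention used here.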
If $ V = \mathbb{R}^n $, then we can fix a basis $ B = \{ b_i \in \mathbb{R}^n, 1 \leq i \leq n \} $ and define $ \langle b_i, b_i \rangle = 1 $ and $ \langle b_i, b_j \rangle = 0 $ for $ i \neq j $. Extending this to all of $ \mathbb{R}^n $ by bilinearity gives us
$$ \left \langle \sum_{k=1}^{n} c_k b_k, \sum_{j=1}^{n} d_j b_j \right \rangle = \sum_{1 \leq k, j \leq n} c_k d_j \langle b_k, b_j \rangle = \sum_{i=1}^{n} c_i d_i $$
where positive definiteness is readily verified. You will recognize this expression as the definition of the dot product. Indeed, if we take our basis $ B $ to be the standard basis of $ \mathbb{R}^n $, then this inner product is the dot product.
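A quick numerical illustration (my own sketch, assuming NumPy) that the formula derived above agrees with the usual dot product in the standard basis:

```python
import numpy as np

c = np.array([1.0, 2.0, 3.0])   # coefficients of u in the standard basis
d = np.array([4.0, -1.0, 0.5])  # coefficients of v in the standard basis

# The derived formula: sum over i of c_i * d_i
inner_from_basis = sum(c[i] * d[i] for i in range(3))

# ...matches NumPy's built-in dot product
assert np.isclose(inner_from_basis, np.dot(c, d))
```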
Why is this formalism more powerful? A key result about inner products is the Cauchy-Schwarz inequality, which says that $ |\langle u, v \rangle| \leq |u| |v| $, where $ |u| = \sqrt{\langle u, u \rangle} $. This tells us that
$$ -1 \leq \frac{\langle u, v \rangle}{|u| |v|} \leq 1 $$
assuming that our field of scalars is $ \mathbb{R} $. We then see that the arccosine of this expression is well-defined, so we can define the angle between nonzero vectors $ u $ and $ v $ as
$$ \theta = \arccos \left( \frac{\langle u, v \rangle}{|u| |v|} \right) $$
The properties we expect of an angle are then easily verified. This notion extends to infinite-dimensional vector spaces over $ \mathbb{R} $, where defining angle is not at all obvious. It is then trivially true that $ \langle u, v \rangle = |u| |v| \cos(\theta) $, since that is how $ \theta $ was defined.
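The angle definition above can be sketched in code (my own example; the `clip` guards against floating-point rounding pushing the ratio infinitesimally outside $[-1, 1]$):

```python
import numpy as np

def angle(u, v):
    """Angle between nonzero real vectors via the inner-product formula."""
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Cauchy-Schwarz guarantees cos_theta lies in [-1, 1] (up to rounding),
    # so arccos is well-defined
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])
print(np.degrees(angle(u, v)))  # ≈ 45, as expected geometrically
```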
The cross product is an entirely separate concept which allows us to find a vector orthogonal to two given vectors in $ \mathbb{R}^3 $. In addition, its magnitude also gives the area of the parallelogram spanned by the vectors. These properties can be taken as the definition of the cross product (with appropriate care for orientation), or they can be derived as theorems starting from the algebraic definition.
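Both defining properties of the cross product can be checked numerically (my own sketch, using NumPy's built-in `cross`):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
c = np.cross(a, b)

# c is orthogonal to both a and b
assert np.isclose(np.dot(a, c), 0) and np.isclose(np.dot(b, c), 0)

# |c| equals the area of the parallelogram spanned by a and b,
# i.e. |a| |b| sin(theta)
theta = np.arccos(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
area = np.linalg.norm(a) * np.linalg.norm(b) * np.sin(theta)
assert np.isclose(np.linalg.norm(c), area)
```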
The notation you use for the inner product (dot product) and outer product of two vectors is completely up to you. Whether you decide to use row vectors, $a,b\in\mathbb{R}^{1\times n}$, or column vectors, $a,b\in\mathbb{R}^{n\times 1}$, the notation \begin{equation*} a\cdot b = \sum_{i=1}^n a_ib_i \end{equation*} is commonly used. If you decide to use row vectors, then the dot product can be written in terms of matrix multiplication as $ab^\top$, and the outer product can be written as \begin{equation*} a^\top b = \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix}\begin{bmatrix}b_1 & b_2 & \cdots & b_n \end{bmatrix} = \begin{bmatrix}a_1b_1 & a_1b_2 & \cdots & a_1b_n \\ a_2b_1 & a_2b_2 & \cdots & a_2 b_n \\ \vdots & \vdots & \ddots & \vdots \\ a_nb_1 & a_nb_2 & \cdots & a_nb_n\end{bmatrix}. \end{equation*} If you instead use column vectors $a,b\in\mathbb{R}^{n\times 1}$, the notation reverses, i.e. the dot product is $a^\top b$ and the outer product is $ab^\top$.
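As an illustration of the column-vector convention described at the end (my own sketch, assuming NumPy), where vectors have shape $(n, 1)$:

```python
import numpy as np

# Column vectors, shape (n, 1)
a = np.array([[1.0], [2.0], [3.0]])
b = np.array([[4.0], [5.0], [6.0]])

dot = (a.T @ b).item()  # a^T b: a 1x1 matrix; extract the scalar
outer = a @ b.T         # a b^T: a 3x3 matrix

# Agrees with NumPy's flat-vector dot and outer products
assert np.isclose(dot, np.dot(a.ravel(), b.ravel()))
assert np.allclose(outer, np.outer(a, b))
```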