The dot product is a special case of a more general concept, the inner product. If you have a vector space $ V $ over the real or complex numbers, then an inner product is a map $ f : V \times V \to \mathbb{R} $ or $ f : V \times V \to \mathbb{C} $, respectively, which is conjugate symmetric, positive definite, and linear in its first argument. We usually write $ f(u, v) = \langle u, v \rangle $, in which case these properties can be summed up as follows:
- Conjugate symmetry: $ \overline{\langle u, v \rangle} = \langle v, u \rangle $, where $ \bar{z} $ denotes complex conjugation. Note that this implies $ \langle u, u \rangle $ is always real for any vector $ u $.
- Positive definiteness: $ \langle v, v \rangle \geq 0 $ for any $ v \in V $, with equality holding iff $ v = 0 $.
- Linearity in the first argument: $ \langle \alpha u + \beta v, w \rangle = \alpha \langle u, w \rangle + \beta \langle v, w \rangle $ where $ u, v, w \in V $ and $ \alpha, \beta $ are in the field of scalars.
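The three axioms are easy to check numerically. Below is a minimal sketch (not part of the text above) using the standard inner product $\langle u, v \rangle = \sum_k u_k \overline{v_k}$ on $\mathbb{C}^3$; with this convention the map is linear in the first argument and conjugate-linear in the second, matching the definition.

```python
import numpy as np

# Standard inner product on C^n: <u, v> = sum_k u_k * conj(v_k).
def inner(u, v):
    return np.sum(u * np.conj(v))

rng = np.random.default_rng(0)
u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)
alpha, beta = 2 - 1j, 0.5 + 3j

# Conjugate symmetry: conj(<u, v>) == <v, u>
assert np.isclose(np.conj(inner(u, v)), inner(v, u))
# Positive definiteness: <v, v> is real and positive for v != 0
assert np.isclose(inner(v, v).imag, 0) and inner(v, v).real > 0
# Linearity in the first argument
assert np.isclose(inner(alpha * u + beta * v, w),
                  alpha * inner(u, w) + beta * inner(v, w))
```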
If $ V = \mathbb{R}^n $, then we can fix a basis $ B = \{ b_i \in \mathbb{R}^n : 1 \leq i \leq n \} $ and define $ \langle b_i, b_i \rangle = 1 $ and $ \langle b_i, b_j \rangle = 0 $ for $ i \neq j $. Extending this to all of $ \mathbb{R}^n $ by linearity in each argument gives us
$$ \left \langle \sum_{k=1}^{n} c_k b_k, \sum_{j=1}^{n} d_j b_j \right \rangle = \sum_{1 \leq k, j \leq n} c_k d_j \langle b_k, b_j \rangle = \sum_{i=1}^{n} c_i d_i $$
where positive definiteness is readily verified. You will recognize this expression as the definition of the dot product. Indeed, if we take our basis $ B $ to be the standard basis of $ \mathbb{R}^n $, then this inner product is the dot product.
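As a quick sketch, the coordinate formula above agrees with the library dot product (the vectors here are arbitrary examples):

```python
import numpy as np

# With the standard basis of R^n, the inner product defined above
# reduces to the familiar coordinate formula sum_i c_i d_i.
c = np.array([1.0, -2.0, 3.0])
d = np.array([4.0, 0.5, -1.0])

coordinate_form = np.sum(c * d)          # sum_i c_i d_i
assert np.isclose(coordinate_form, np.dot(c, d))
print(coordinate_form)                   # → 0.0 (these two happen to be orthogonal)
```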
Why is this formalism more powerful? A fundamental result about inner products is the Cauchy-Schwarz inequality, which says that $ |\langle u, v \rangle| \leq |u| |v| $, where $ |u| = \sqrt{\langle u, u \rangle} $. This tells us that
$$ -1 \leq \frac{\langle u, v \rangle}{|u| |v|} \leq 1 $$
assuming that our field of scalars is $ \mathbb{R} $. We then see that the arccosine of this expression is well-defined, so we can define the angle between nonzero vectors $ u $ and $ v $ as
$$ \theta = \arccos \left( \frac{\langle u, v \rangle}{|u| |v|} \right) $$
The properties we expect an angle to satisfy are then easily verified. This notion extends to infinite-dimensional vector spaces over $ \mathbb{R} $, where defining an angle is not at all obvious. It is then trivially true that $ \langle u, v \rangle = |u| |v| \cos(\theta) $, since that is how $ \theta $ was defined.
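The definition above translates directly into code. A minimal sketch for $\mathbb{R}^3$; the `clip` guards only against floating-point round-off, since Cauchy-Schwarz already guarantees the ratio lies in $[-1, 1]$:

```python
import numpy as np

# Angle between nonzero vectors, as defined via arccos above.
def angle(u, v):
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0])
theta = angle(u, v)                      # pi/4 for these vectors

# <u, v> = |u| |v| cos(theta), by construction
assert np.isclose(np.dot(u, v),
                  np.linalg.norm(u) * np.linalg.norm(v) * np.cos(theta))
```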
The cross product is an entirely separate concept which allows us to find a vector orthogonal to two given vectors in $ \mathbb{R}^3 $. In addition, its magnitude also gives the area of the parallelogram spanned by the vectors. These properties can be taken as the definition of the cross product (with appropriate care for orientation), or they can be derived as theorems starting from the algebraic definition.
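Both defining properties are easy to verify numerically; the vectors below are arbitrary examples:

```python
import numpy as np

# The cross product is orthogonal to both inputs, and its norm is the
# area of the parallelogram the inputs span.
a = np.array([1.0, 2.0, 0.0])
b = np.array([0.0, 3.0, 1.0])
n = np.cross(a, b)                       # → [2., -1., 3.]

assert np.isclose(np.dot(n, a), 0.0)     # orthogonal to a
assert np.isclose(np.dot(n, b), 0.0)     # orthogonal to b
area = np.linalg.norm(n)                 # parallelogram area, sqrt(14)
```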
3D:
Let's say you have two vectors, $\vec{a}$ and $\vec{b}$. Both are perpendicular to their cross product $\vec{n}$,
$$\vec{n} = \vec{a} \times \vec{b} \tag{1}\label{NA1}$$
If we measure the rotation around the vector $\vec{n}$, then the angle $\varphi$ between the two vectors satisfies
$$\cos\varphi = \frac{\vec{a} \cdot \vec{b}}{\left\lVert\vec{a}\right\rVert \, \left\lVert\vec{b}\right\rVert} \tag{2}\label{NA2}$$
In general, we can also use $$\sin\varphi = \pm \frac{\left\lVert \vec{a} \times \vec{b} \right\rVert}{\left\lVert\vec{a}\right\rVert\,\left\lVert\vec{b}\right\rVert} \tag{3}\label{NA3}$$
where the sign depends on the orientation (clockwise or counterclockwise) in which we measure the rotation (around the plane normal $\vec{n}$) from $\vec{a}$ to $\vec{b}$.
If you have a specific unit direction vector $\hat{d}$, either parallel or opposite to $\vec{n}$,
$$\hat{d} = \pm \frac{\vec{n}}{\left\lVert\vec{n}\right\rVert} = \pm \frac{\vec{a}\times\vec{b}}{\left\lVert\vec{a}\times\vec{b}\right\rVert}$$
then you can calculate the angle counterclockwise around $\hat{d}$ using $\eqref{NA2}$ and
$$\sin\varphi = \frac{\hat{d}\cdot(\vec{a}\times\vec{b})}{\left\lVert\vec{a}\right\rVert\,\left\lVert\vec{b}\right\rVert}$$
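Combining the cosine formula (2) with the sine formula above, `atan2` recovers the signed angle in one step; the norms cancel, so they never need to be computed. A sketch (the reference direction here is the $+z$ axis, an arbitrary example):

```python
import numpy as np

# Signed angle from a to b, measured counterclockwise around d_hat.
# atan2(|a||b| sin(phi), |a||b| cos(phi)) = phi, since the norms are positive.
def signed_angle(a, b, d_hat):
    return np.arctan2(np.dot(d_hat, np.cross(a, b)), np.dot(a, b))

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
d_hat = np.array([0.0, 0.0, 1.0])        # +z: counterclockwise in the xy-plane

assert np.isclose(signed_angle(a, b, d_hat), np.pi / 2)
# Flipping the reference direction flips the sign of the angle.
assert np.isclose(signed_angle(a, b, -d_hat), -np.pi / 2)
```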
2D:
In this case, there is only one "side": the 2D coordinate plane itself. Here, we can use the 2D analog of the vector cross product, which is a scalar:
$$\begin{array}{l}
\vec{a} = ( x_a , y_a ) \\
\vec{b} = ( x_b , y_b ) \\
\vec{a} \cdot \vec{b} = x_a x_b + y_a y_b \\
\vec{a} \times \vec{b} = x_a y_b - x_b y_a \end{array}$$
and
$$\begin{cases}
\cos\varphi = \frac{\vec{a} \cdot \vec{b}}{\left\lVert\vec{a}\right\rVert \, \left\lVert\vec{b} \right\rVert} = \frac{ x_a x_b + y_a y_b}{\sqrt{(x_a^2 + y_a^2)(x_b^2 + y_b^2)}} \\
\sin\varphi = \frac{\vec{a} \times \vec{b}}{\left\lVert\vec{a}\right\rVert \, \left\lVert\vec{b} \right\rVert} = \frac{ x_a y_b - x_b y_a}{\sqrt{(x_a^2 + y_a^2)(x_b^2 + y_b^2)}} \end{cases} \tag{4}\label{NA4}$$
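The pair of formulas in (4) is most robust when fed to `atan2`, which avoids the square roots entirely and returns the signed angle from $\vec{a}$ to $\vec{b}$ in $(-\pi, \pi]$. A minimal sketch:

```python
import math

# Signed counterclockwise angle from a to b in the plane, via the
# 2D cross product (numerator of sin) and dot product (numerator of cos).
def signed_angle_2d(ax, ay, bx, by):
    return math.atan2(ax * by - bx * ay,   # 2D "cross product"
                      ax * bx + ay * by)   # dot product

assert math.isclose(signed_angle_2d(1, 0, 0, 1), math.pi / 2)    # CCW quarter turn
assert math.isclose(signed_angle_2d(1, 0, 1, -1), -math.pi / 4)  # CW eighth turn
```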
Let $\vec{\mathbf a} = (x_1, y_1) = (a \cos \alpha, a \sin \alpha)$
Let $\vec{\mathbf b} = (x_2, y_2) = (b \cos \beta, b \sin \beta)$
Then $\theta = |\beta - \alpha|$ is the angle between them.
By definition,
\begin{align} \vec{\mathbf a} \cdot \vec{\mathbf b} &= x_1x_2 + y_1y_2 \\ &= ab(\cos \alpha \cos \beta + \sin \alpha \sin \beta) \\ &= ab \cos(\beta - \alpha)\\ &= ab \cos \theta \end{align}
(Note that $\cos(\theta) = \cos(-\theta)$, so the sign of $\beta - \alpha$ does not matter.)
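The derivation checks out numerically; the magnitudes and angles below ($a=2$ at $30^\circ$, $b=3$ at $75^\circ$) are arbitrary examples:

```python
import math

# Verify x1*x2 + y1*y2 == a*b*cos(theta) for polar-form vectors.
a_mag, alpha = 2.0, math.radians(30)
b_mag, beta = 3.0, math.radians(75)

x1, y1 = a_mag * math.cos(alpha), a_mag * math.sin(alpha)
x2, y2 = b_mag * math.cos(beta), b_mag * math.sin(beta)

dot = x1 * x2 + y1 * y2
theta = abs(beta - alpha)                # 45 degrees here
assert math.isclose(dot, a_mag * b_mag * math.cos(theta))
```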