The dot product is a special case of a more general concept, the inner product. If you have a vector space $ V $ over the reals or the complex numbers, then an inner product is a map $ f : V \times V \to \mathbb{R} $ or $ f : V \times V \to \mathbb{C} $, respectively, which is conjugate symmetric, positive definite, and linear in its first argument. We usually write $ f(u, v) = \langle u, v \rangle $, in which case these properties can be summed up as follows:
- Conjugate symmetry: $ \overline{\langle u, v \rangle} = \langle v, u \rangle $, where $ \bar{z} $ denotes complex conjugation. Note that this implies $ \langle u, u \rangle $ is always real for any vector $ u $.
- Positive definiteness: $ \langle v, v \rangle \geq 0 $ for any $ v \in V $, with equality holding iff $ v = 0 $.
- Linearity in the first argument: $ \langle \alpha u + \beta v, w \rangle = \alpha \langle u, w \rangle + \beta \langle v, w \rangle $ where $ u, v, w \in V $ and $ \alpha, \beta $ are in the field of scalars.
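For concreteness, here is a quick numerical sanity check (a NumPy sketch, not part of the formal development) that the familiar map $ \langle u, v \rangle = \sum_i u_i \overline{v_i} $ on $ \mathbb{C}^n $ satisfies all three axioms:

```python
import numpy as np

def inner(u, v):
    """The map sum_i u_i * conj(v_i) on C^n, linear in its first argument."""
    return np.sum(u * np.conj(v))

rng = np.random.default_rng(0)
u = rng.normal(size=3) + 1j * rng.normal(size=3)
v = rng.normal(size=3) + 1j * rng.normal(size=3)
w = rng.normal(size=3) + 1j * rng.normal(size=3)
alpha, beta = 2 - 1j, 0.5 + 3j

# Conjugate symmetry: conj(<u, v>) == <v, u>
assert np.isclose(np.conj(inner(u, v)), inner(v, u))

# Positive definiteness: <v, v> is real, nonnegative, and zero only at v = 0
assert abs(inner(v, v).imag) < 1e-12 and inner(v, v).real > 0
assert np.isclose(inner(np.zeros(3), np.zeros(3)), 0)

# Linearity in the first argument
assert np.isclose(inner(alpha * u + beta * v, w),
                  alpha * inner(u, w) + beta * inner(v, w))
```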
If $ V = \mathbb{R}^n $, then we can fix a basis $ B = \{ b_i \in \mathbb{R}^n : 1 \leq i \leq n \} $ and define $ \langle b_i, b_i \rangle = 1 $ and $ \langle b_i, b_j \rangle = 0 $ for $ i \neq j $. Extending this to all of $ \mathbb{R}^n $ by linearity in each argument gives us
$$ \left \langle \sum_{k=1}^{n} c_k b_k, \sum_{j=1}^{n} d_j b_j \right \rangle = \sum_{1 \leq k, j \leq n} c_k d_j \langle b_k, b_j \rangle = \sum_{k=1}^{n} c_k d_k $$
where positive definiteness is readily verified. You will recognize this expression as the definition of the dot product. Indeed, if we take our basis $ B $ to be the standard basis of $ \mathbb{R}^n $, then this inner product is the dot product.
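If you want to see that the choice of basis genuinely matters, here is a small NumPy sketch (the basis matrix `B` below is just an arbitrary example): declaring the columns of `B` orthonormal and extending by linearity recovers the usual dot product when `B` is the standard basis, and gives a different, but still valid, inner product otherwise.

```python
import numpy as np

def basis_inner(x, y, B):
    """Inner product induced by declaring the columns of B orthonormal:
    recover coordinates in that basis, then take their dot product."""
    c = np.linalg.solve(B, x)
    d = np.linalg.solve(B, y)
    return np.dot(c, d)

x = np.array([1.0, 2.0, 3.0])
y = np.array([-1.0, 0.5, 4.0])

# With the standard basis this is exactly the dot product ...
assert np.isclose(basis_inner(x, y, np.eye(3)), np.dot(x, y))

# ... while a different basis induces a different inner product.
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
print(basis_inner(x, y, B), np.dot(x, y))  # generally not equal
```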
Why is this formalism more powerful? A result about the inner product is the Cauchy-Schwarz inequality, which says that $ |\langle u, v \rangle| \leq |u| |v| $ where $ |u| = \sqrt{\langle u, u \rangle} $. This tells us that
$$ -1 \leq \frac{\langle u, v \rangle}{|u| |v|} \leq 1 $$
assuming that our field of scalars is $ \mathbb{R} $. We then see that the arccosine of this expression is well-defined, so we can define the angle between nonzero vectors $ u $ and $ v $ as
$$ \theta = \arccos \left( \frac{\langle u, v \rangle}{|u| |v|} \right) $$
The properties we expect of an angle are then easily verified. This notion extends to infinite-dimensional vector spaces over $ \mathbb{R} $, where defining an angle is not at all obvious. It is then trivially true that $ \langle u, v \rangle = |u| |v| \cos(\theta) $, since that is how $ \theta $ was defined.
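As a quick illustration (a NumPy sketch with random data), the quotient really does land in $[-1, 1]$ and the angle is well-defined even in high dimension, where two independent random directions come out nearly perpendicular:

```python
import numpy as np

def angle(u, v):
    """Angle between nonzero real vectors via the inner-product formula."""
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Cauchy-Schwarz guarantees |cos_theta| <= 1; clip only guards rounding error.
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

rng = np.random.default_rng(1)
u, v = rng.normal(size=1000), rng.normal(size=1000)

# Cauchy-Schwarz holds in any dimension, here n = 1000.
assert abs(np.dot(u, v)) <= np.linalg.norm(u) * np.linalg.norm(v)
print(np.degrees(angle(u, v)))  # close to 90 degrees
```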
The cross product is an entirely separate concept which allows us to find a vector orthogonal to two given vectors in $ \mathbb{R}^3 $. In addition, its magnitude also gives the area of the parallelogram spanned by the vectors. These properties can be taken as the definition of the cross product (with appropriate care for orientation), or they can be derived as theorems starting from the algebraic definition.
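Both defining properties are easy to check numerically; here is a short NumPy sketch with two arbitrary vectors:

```python
import numpy as np

a = np.array([1.0, 2.0, 0.0])
b = np.array([0.0, 1.0, 3.0])
c = np.cross(a, b)

# Orthogonal to both inputs
assert np.isclose(np.dot(c, a), 0) and np.isclose(np.dot(c, b), 0)

# |a x b| equals the parallelogram area |a| |b| sin(theta)
theta = np.arccos(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
area = np.linalg.norm(a) * np.linalg.norm(b) * np.sin(theta)
assert np.isclose(np.linalg.norm(c), area)
```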
The error is that $[c\ \ a\times b\ \ a]$ does not equal $c\times((a\times b)\cdot a)$. Inside this last expression, $(a\times b)\cdot a$ is a scalar, and one cannot take the vector product of a vector with a scalar. In fact
$$[c\ \ a\times b\ \ a]=c\cdot((a\times b)\times a).$$
Now one can use the vector triple product formula to simplify this
$$(a\times b)\times a=(a\cdot a)b-(b\cdot a)a$$
etc.
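If you want to convince yourself of both identities before grinding through the algebra, here is a quick NumPy check with random vectors (the scalar triple product $[x\ \ y\ \ z]$ is computed as the determinant of the matrix with those columns):

```python
import numpy as np

rng = np.random.default_rng(2)
a, b, c = rng.normal(size=3), rng.normal(size=3), rng.normal(size=3)

# [c, a x b, a] as a determinant equals c . ((a x b) x a)
triple = np.linalg.det(np.column_stack([c, np.cross(a, b), a]))
assert np.isclose(triple, np.dot(c, np.cross(np.cross(a, b), a)))

# Vector triple product formula: (a x b) x a = (a . a) b - (b . a) a
lhs = np.cross(np.cross(a, b), a)
rhs = np.dot(a, a) * b - np.dot(b, a) * a
assert np.allclose(lhs, rhs)
```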
But a simpler way to approach this problem is to note that $a\times b-a\times c=a\times (b-c)$. Since $a$ and $b-c$ are orthogonal, $|a\times (b-c)|=|a||b-c|$, etc.
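Again, a small NumPy sketch confirms the two facts this shortcut relies on (the orthogonality of $a$ and $b-c$ comes from the original problem, so the data below is constructed to satisfy it):

```python
import numpy as np

rng = np.random.default_rng(3)
a = rng.normal(size=3)
# Construct d = b - c orthogonal to a, mirroring the problem's hypothesis.
d = np.cross(a, rng.normal(size=3))
c = rng.normal(size=3)
b = c + d

# Distributivity: a x b - a x c = a x (b - c)
assert np.allclose(np.cross(a, b) - np.cross(a, c), np.cross(a, d))

# For orthogonal factors the magnitude of the cross product is just |a| |b - c|
assert np.isclose(np.linalg.norm(np.cross(a, d)),
                  np.linalg.norm(a) * np.linalg.norm(d))
```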
It's a vector in the plane spanned by $a$ and $b$ that is also perpendicular to $c$. I don't think the magnitude maps to anything especially interesting for vectors. The place I usually see this sort of construct, by the way, is when some of the "vectors" are gradient operators $\nabla$. In that case, it's a differential operator rather than a vector operation but, with care (and taking more of a physicist's view than a mathematician's), you can use many of the same identities. In the differential-operator case, you can extract some geometric meaning about changes in vector fields in different directions, e.g. "curls of curls," which arise in some calculations with electromagnetic fields.
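Assuming the expression in question is $(a\times b)\times c$ (the original question is not repeated here), both geometric claims are easy to verify numerically; this NumPy sketch checks that the result is perpendicular to $c$ and is a linear combination of $a$ and $b$:

```python
import numpy as np

rng = np.random.default_rng(4)
a, b, c = rng.normal(size=3), rng.normal(size=3), rng.normal(size=3)
v = np.cross(np.cross(a, b), c)

# Perpendicular to c
assert np.isclose(np.dot(v, c), 0)

# Lies in the plane spanned by a and b: v = x a + y b for some scalars x, y
x, y = np.linalg.lstsq(np.column_stack([a, b]), v, rcond=None)[0]
assert np.allclose(x * a + y * b, v)
```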