I'm having a hard time proving this vector identity:
$$(A \cdot (B \times C))D = (C \cdot D)(A \times B) + (A \cdot D)(B \times C) + (B \cdot D)(C \times A)$$
Please note: I was hoping to prove this using index notation, not any other method; that is, using the Kronecker delta and the Levi-Civita epsilon symbols. Could someone help me with this? I have no idea how to proceed.
Best Answer
Linearly Independent Case
$1)$ If $A$, $B$, and $C$ are three linearly independent vectors, then $A \times B$, $B \times C$, and $C \times A$ are also linearly independent, so they form a basis for $\mathbb{R}^3$ and any vector can be written as a linear combination of them.
$2)$ According to step $(1)$ we have
$$(A \cdot (B \times C))D = \alpha (A \times B) + \beta (B \times C) + \gamma (C \times A)$$
where $\alpha$, $\beta$, and $\gamma$ are unknown coefficients. Taking the dot product of both sides with $A$, $B$, and $C$, respectively (each cross product is orthogonal to both of its factors, so two of the three terms on the right vanish each time), we get
$$\begin{aligned} \left( A \cdot (B \times C) \right)\left( A \cdot D \right) &= \beta \, A \cdot (B \times C) \\ \left( A \cdot (B \times C) \right)\left( B \cdot D \right) &= \gamma \, B \cdot (C \times A) \\ \left( A \cdot (B \times C) \right)\left( C \cdot D \right) &= \alpha \, C \cdot (A \times B) \end{aligned}$$
$3)$ We note that the scalar triple product is invariant under cyclic permutations:
$$A \cdot (B \times C) = B \cdot (C \times A) = C \cdot (A \times B)$$
$4)$ We finally conclude that $$\begin{aligned} \alpha &= C \cdot D \\ \beta &= A \cdot D \\ \gamma &= B \cdot D \end{aligned}$$
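As a quick numerical sanity check (not a proof), the basis claim in step $(1)$ and the identity with the coefficients from step $(4)$ can be verified for random vectors. A minimal sketch using NumPy, with the same vector names as above:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C, D = (rng.standard_normal(3) for _ in range(4))

triple = np.dot(A, np.cross(B, C))  # scalar triple product A . (B x C)

# Step (1): the matrix with rows A x B, B x C, C x A has determinant
# (A . (B x C))^2, which is nonzero for independent A, B, C, so the
# three cross products form a basis.
M = np.array([np.cross(A, B), np.cross(B, C), np.cross(C, A)])
assert np.isclose(np.linalg.det(M), triple**2)

# Steps (2)-(4): the identity with alpha = C.D, beta = A.D, gamma = B.D.
lhs = triple * D
rhs = (np.dot(C, D) * np.cross(A, B)
       + np.dot(A, D) * np.cross(B, C)
       + np.dot(B, D) * np.cross(C, A))
print(np.allclose(lhs, rhs))  # True (up to floating-point error)
```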
Linearly Dependent Case
Consider the case where the vectors $A$, $B$, and $C$ are linearly dependent, so they cannot form a basis for $\mathbb{R}^3$. Then one of them is a linear combination of the other two; say $A = aB + bC$ for some real numbers $a$ and $b$ (the other cases are analogous). This gives $A \cdot (B \times C) = 0$, so the left-hand side vanishes. The right-hand side vanishes as well: since $A \times B = -b(B \times C)$ and $C \times A = -a(B \times C)$, substitution yields $$\left( -b(C \cdot D) + a(B \cdot D) + b(C \cdot D) - a(B \cdot D) \right)(B \times C) = 0,$$ and our equation reduces to the trivial identity $0 = 0$.
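The dependent case can be illustrated numerically in the same way. A sketch, again using NumPy, with arbitrary coefficients $a$ and $b$ chosen for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
B, C, D = (rng.standard_normal(3) for _ in range(3))
a, b = 2.0, -3.0    # arbitrary example coefficients
A = a * B + b * C   # A, B, C are now linearly dependent

# Left-hand side: the scalar triple product A . (B x C) is zero.
lhs = np.dot(A, np.cross(B, C)) * D

# Right-hand side: the three terms cancel, as shown by substitution above.
rhs = (np.dot(C, D) * np.cross(A, B)
       + np.dot(A, D) * np.cross(B, C)
       + np.dot(B, D) * np.cross(C, A))

print(np.allclose(lhs, 0), np.allclose(rhs, 0))  # True True
```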