I apologize if this question is too basic, but I am wondering whether identities for commutators such as $[AB,C]=A[B,C]+[A,C]B$ also hold for dot and cross products within the commutator (i.e., $[\vec{A}\times \vec{B},C]=\vec{A}\times[\vec{B},C]+[\vec{A},C]\times \vec{B}$, and likewise for the dot product). I've tried writing it out in component form and it seems to check out, but I would really like to confirm that I'm not making a silly mistake and thinking wishfully, as I frequently lose myself in long chains of algebra.
[Math] Commutators of dot and cross products
quantum-mechanics, vectors
Related Solutions
The method at that site ignores the sign of the determinant, and is phrased in a way that makes the generalization to higher dimensions a bit less clear than I'd like. I'll try here to provide an altered version of the approach up to four dimensions. I'll use bold for vectors, subscripts for components, and double vertical lines for length, so that $\left\Vert \mathbf{a}\right\Vert =\sqrt{a_{1}^{2}+a_{2}^{2}+\cdots}$.
As LutzL mentioned in a comment, this method is very closely connected to using the $QR$ decomposition to find the absolute value of the determinant, as described on Wikipedia.
1D:
Let's calculate $\det\left(\mathbf{a}\right)$ where $\mathbf{a}$ has one nonzero component. It's $\left\Vert \mathbf{a}\right\Vert$ if $\mathbf{a}$ has a positive component and $-\left\Vert \mathbf{a}\right\Vert$ if $\mathbf{a}$ has a negative component.
2D:
Let's calculate $\det\left(\mathbf{a},\mathbf{b}\right)$ where $\mathbf{a}$ and $\mathbf{b}$ are not collinear.
Let's ignore $\mathbf{a}$ for now. The first step is to find a vector $\mathbf{n}$ that's orthogonal to $\mathbf{b}$. We set $\mathbf{n}\bullet\mathbf{b}$ equal to $0$. That's two unknowns and only one equation. In a typical case, the component $n_{1}$ of $\mathbf{n}$ is not forced to be $0$, so it can be whatever we want that's nonzero (e.g. $1$). (In a special case, $n_{1}$ might be forced to be $0$, but then $n_{2}$ can be chosen freely.)
Now scale $\mathbf{n}$ to get a new vector $\mathbf{n}'$ so that $\left\Vert \mathbf{n}'\right\Vert =\left\Vert \mathbf{b}\right\Vert$ . By some geometry, the area of the parallelogram formed by $\mathbf{a}$ and $\mathbf{b}$ is then $\left|\mathbf{n}'\bullet\mathbf{a}\right|$. The determinant is $\pm\mathbf{n}'\bullet\mathbf{a}$ where the $\pm$ sign here (which may not be the sign of the determinant) is positive exactly when the rotation to get from $\mathbf{a}$ to $\mathbf{b}$ is in the same direction (clockwise or counterclockwise) as the rotation to get from $\mathbf{b}$ to $\mathbf{n}'$. Unfortunately, that can't be determined by a dot-product calculation.
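The 2D procedure can be sketched in a few lines of NumPy (the function name and example vectors are mine; the special case $b_2=0$ is handled as described in the text):

```python
import numpy as np

def abs_det_2d(a, b):
    """|det(a, b)| by the dot-product method: build n orthogonal to b, scale to ‖b‖."""
    b1, b2 = b
    if b2 != 0:
        # Typical case: fix n1 = 1 and solve n·b = 0 for n2.
        n = np.array([1.0, -b1 / b2])
    else:
        # Special case: n1 is forced to be 0, so choose n2 freely.
        n = np.array([0.0, 1.0])
    n_prime = n * np.linalg.norm(b) / np.linalg.norm(n)  # ‖n'‖ = ‖b‖
    return abs(np.dot(n_prime, a))                       # parallelogram area

a, b = np.array([3.0, 1.0]), np.array([1.0, 2.0])
print(abs_det_2d(a, b))                              # → 5.0
print(abs(np.linalg.det(np.column_stack([a, b]))))   # same, up to rounding
```

As the text warns, this recovers only the absolute value; the sign would need a separate orientation check.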
3D:
Let's calculate $\det\left(\mathbf{a},\mathbf{b},\mathbf{c}\right)$ where $\mathbf{a},\mathbf{b},\mathbf{c}$ are not coplanar.
Let's ignore $\mathbf{a}$ for now. The first step is to find a vector $\mathbf{n}$ that's orthogonal to both $\mathbf{b}$ and $\mathbf{c}$. We set $\mathbf{n}\bullet\mathbf{b}$ and $\mathbf{n}\bullet\mathbf{c}$ equal to $0$. That's three unknowns and only two equations. In a typical case, the component $n_{1}$ of $\mathbf{n}$ is not forced to be $0$, so it can be whatever we want that's nonzero (e.g. $1$). (In a special case, $n_{1}$ might be forced to be $0$, but then $n_{2}$ or $n_{3}$ can be chosen freely.)
The second step is to find a vector $\mathbf{o}$ that's orthogonal to $\mathbf{c}$ (this choice differs from the original author), but lies in the same plane as $\mathbf{b}$ and $\mathbf{c}$. To keep it in that plane, we need $\mathbf{o}$ to be orthogonal to $\mathbf{n}$. So we have $\mathbf{o}\bullet\mathbf{n}=0$ as well as $\mathbf{o}\bullet\mathbf{c}=0$. Again that's three unknowns and two equations, so we have a degree of freedom and could choose a particular value for some component.
Now scale $\mathbf{o}$ to get a new vector $\mathbf{o}'$ so that $\left\Vert \mathbf{o}'\right\Vert =\left\Vert \mathbf{c}\right\Vert$ . By some geometry, the area of the parallelogram formed by $\mathbf{b}$ and $\mathbf{c}$ is then $\left|\mathbf{o}'\bullet\mathbf{b}\right|$. Now scale $\mathbf{n}$ to get a new vector $\mathbf{n}'$ so that $\left\Vert \mathbf{n}'\right\Vert =\left|\mathbf{o}'\bullet\mathbf{b}\right|$. By some geometry, the volume of the parallelepiped formed by $\mathbf{a}, \mathbf{b}, \mathbf{c}$ is then $\left|\mathbf{n}'\bullet\mathbf{a}\right|$. The determinant is then $\pm\mathbf{n}'\bullet\mathbf{a}$ where I'm pretty sure the $\pm$ sign is positive when the handedness (right or left) of $\mathbf{a},\mathbf{b},\mathbf{c}$ is the same as that of $\mathbf{c},\mathbf{n}',\mathbf{o}'$.
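In the typical case (no component forced to be $0$), the 3D recipe translates directly into code. A minimal NumPy sketch under that assumption (function names are mine, and the special cases are not handled):

```python
import numpy as np

def orth_with_free_first(rows):
    """Solve v·r = 0 for every row r by fixing v[0] = 1 (typical case only)."""
    M = np.array(rows, dtype=float)              # (d-1) equations in d unknowns
    v = np.empty(M.shape[1])
    v[0] = 1.0
    v[1:] = np.linalg.solve(M[:, 1:], -M[:, 0])  # move the v[0] column to the RHS
    return v

def abs_det_3d(a, b, c):
    n = orth_with_free_first([b, c])   # n ⟂ b and n ⟂ c
    o = orth_with_free_first([n, c])   # o ⟂ n and o ⟂ c, so o lies in span(b, c)
    o *= np.linalg.norm(c) / np.linalg.norm(o)   # ‖o'‖ = ‖c‖
    area = abs(np.dot(o, b))           # area of the parallelogram formed by b, c
    n *= area / np.linalg.norm(n)      # ‖n'‖ = that area
    return abs(np.dot(n, a))           # volume of the parallelepiped
```

For example, $\mathbf{a}=(1,2,3)$, $\mathbf{b}=(2,0,1)$, $\mathbf{c}=(1,1,1)$ gives $3$, matching $\left|\det\right|$.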
4D:
Let's calculate $\det\left(\mathbf{a},\mathbf{b},\mathbf{c},\mathbf{d}\right)$ where $\mathbf{a},\mathbf{b},\mathbf{c},\mathbf{d}$ are not in the same $3$-dimensional hyperplane.
Let's ignore $\mathbf{a}$ for now. The first step is to find a vector $\mathbf{n}$ that's orthogonal to all three of $\mathbf{b},\mathbf{c},\mathbf{d}$. We set $\mathbf{n}\bullet\mathbf{b}$, $\mathbf{n}\bullet\mathbf{c}$, and $\mathbf{n}\bullet\mathbf{d}$ all to $0$. That's four unknowns and only three equations. In a typical case the component $n_{1}$ of $\mathbf{n}$ is not forced to be $0$, so it can be whatever we want that's nonzero (e.g. $1$). (In a special case, $n_{1}$ might be forced to be $0$, but there will be at least one component we can choose freely.)
The second step is to find a vector $\mathbf{o}$ that's orthogonal to $\mathbf{c}$ and $\mathbf{d}$, but lies in the same $3$-dimensional hyperplane as $\mathbf{b},\mathbf{c},\mathbf{d}$. To keep it in that hyperplane, we need $\mathbf{o}$ to be orthogonal to $\mathbf{n}$. So we have $\mathbf{o}\bullet\mathbf{n}=0$ as well as $\mathbf{o}\bullet\mathbf{c}=\mathbf{o}\bullet\mathbf{d}=0$. Again that's four unknowns and three equations, so we have a degree of freedom and could choose a particular value for some component.
The third step is to find a vector $\mathbf{p}$ that's orthogonal to $\mathbf{d}$, but lies in the same $2$-dimensional plane as $\mathbf{c},\mathbf{d}$. To keep it in that plane, it should be orthogonal to $\mathbf{n}$ as well as $\mathbf{o}$. So we have $\mathbf{p}\bullet\mathbf{n}=\mathbf{p}\bullet\mathbf{o}=0$ as well as $\mathbf{p}\bullet\mathbf{d}=0$. Again that's four unknowns and three equations, so we have a degree of freedom and could choose a particular value for some component.
Now scale $\mathbf{p}$ to get a new vector $\mathbf{p}'$ so that $\left\Vert \mathbf{p}'\right\Vert =\left\Vert \mathbf{d}\right\Vert$ . By some geometry, the area of the parallelogram formed by $\mathbf{c}$ and $\mathbf{d}$ is then $\left|\mathbf{p}'\bullet\mathbf{c}\right|$. Now scale $\mathbf{o}$ to get a new vector $\mathbf{o}'$ so that $\left\Vert \mathbf{o}'\right\Vert =\left|\mathbf{p}'\bullet\mathbf{c}\right|$. By some geometry, the volume of the parallelepiped formed by $\mathbf{b}, \mathbf{c}, \mathbf{d}$ is then $\left|\mathbf{o}'\bullet\mathbf{b}\right|$. Now scale $\mathbf{n}$ to get a new vector $\mathbf{n}'$ so that $\left\Vert \mathbf{n}'\right\Vert =\left|\mathbf{o}'\bullet\mathbf{b}\right|$. By some geometry, the hypervolume of the hyperparallelepiped formed by $\mathbf{a},\mathbf{b}, \mathbf{c}, \mathbf{d}$ is then $\left|\mathbf{n}'\bullet\mathbf{a}\right|$. The determinant is then $\pm\mathbf{n}'\bullet\mathbf{a}$ where I think the $\pm$ sign is positive when the orientation of $\mathbf{a},\mathbf{b},\mathbf{c},\mathbf{d}$ is the same as that of $\mathbf{d},\mathbf{n}',\mathbf{o}',\mathbf{p}'$.
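The same nested scheme works uniformly in any dimension. Here is a sketch of my own generalization, using an SVD null-space step in place of the "fix one component" trick so the special cases take care of themselves:

```python
import numpy as np

def abs_det(vectors):
    """|det| of d vectors in R^d via nested orthogonalization and dot products."""
    vs = [np.asarray(v, dtype=float) for v in vectors]
    d = len(vs)
    ws = []
    for i in range(d - 1):
        # w_i ⟂ all previous w's and ⟂ vs[i+1:] (d-1 equations in d unknowns);
        # the last right-singular vector spans the null space of that system.
        M = np.array(ws + vs[i + 1:])
        ws.append(np.linalg.svd(M)[2][-1])
    measure = np.linalg.norm(vs[-1])        # length of the last vector
    for i in range(d - 2, -1, -1):
        w = ws[i] * (measure / np.linalg.norm(ws[i]))  # rescale as in the text
        measure = abs(np.dot(w, vs[i]))     # next higher-dimensional volume
    return measure
```

In 4D this reproduces $|\det(\mathbf{a},\mathbf{b},\mathbf{c},\mathbf{d})|$; the sign still has to be supplied by a separate orientation argument, as discussed above.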
HINT: The right way (one of them, at least) is to use the square of the scalar triple product, which gives a $3\times 3$ determinant containing only magnitudes and dot products.
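Concretely, the hint is the identity $[\mathbf{a},\mathbf{b},\mathbf{c}]^2=\det G$, where $G_{ij}=\mathbf{v}_i\bullet\mathbf{v}_j$ is the Gram matrix of the three vectors. A quick numerical check with made-up vectors:

```python
import numpy as np

a, b, c = np.array([1., 2, 3]), np.array([0., 1, 1]), np.array([2., 0, 1])

triple = a @ np.cross(b, c)                  # scalar triple product
G = np.array([[x @ y for y in (a, b, c)]     # Gram matrix: only dot products
              for x in (a, b, c)])           # (magnitudes are the diagonal)
print(triple**2, np.linalg.det(G))           # both ≈ 1 for these vectors
```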
Your second approach is wrong because you confused the vector triple product with the scalar triple product.
Your first approach is wrong because $c$ need not subtend an angle of $\frac{\pi}{3}$ with the plane of $a$ and $b$.
Best Answer
It looks like the identities you want to prove are (for each component $i$)$$\left[\sum_iA_iB_i,\,C\right]=\sum_iA_i[B_i,\,C]+\sum_i[A_i,\,C]B_i,\,\\\left[\sum_{jk}\epsilon_{ijk}A_jB_k,\,C\right]=\sum_{jk}\epsilon_{ijk}A_j[B_k,\,C]+\sum_{jk}\epsilon_{ijk}[A_j,\,C]B_k.$$These just follow from the usual identity, with one of the operators $\sum_i$ or $\sum_{jk}\epsilon_{ijk}$ applied to it. For example,$$\left[\sum_iA_iB_i,\,C\right]=\sum_i\left[A_iB_i,\,C\right]=\sum_i\left(A_i\left[B_i,\,C\right]+\left[A_i,\,C\right]B_i\right).$$
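As a numerical sanity check of the cross-product case, one can substitute random matrices for the non-commuting operators (a sketch; all names here are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2, 2))   # a "vector" whose components are operators
B = rng.standard_normal((3, 2, 2))
C = rng.standard_normal((2, 2))

comm = lambda X, Y: X @ Y - Y @ X    # matrix commutator [X, Y]

# Levi-Civita symbol eps[i, j, k]
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[i, k, j] = 1.0, -1.0

for i in range(3):                   # check each component of [A×B, C]
    lhs = comm(sum(eps[i, j, k] * A[j] @ B[k]
                   for j in range(3) for k in range(3)), C)
    rhs = sum(eps[i, j, k] * (A[j] @ comm(B[k], C) + comm(A[j], C) @ B[k])
              for j in range(3) for k in range(3))
    assert np.allclose(lhs, rhs)
```

The dot-product identity can be checked the same way by replacing $\epsilon_{ijk}$ with the single sum over $i$.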