[Math] How to solve for an unknown vector given an otherwise known cross product

Tags: cross product, linear algebra, matrices

Given known vectors $\vec a$ and $\vec b$, is it possible to solve for $\vec u$, given the following equation?

$\vec a \times \vec u = \vec b$

So far I have found that the following must be true

$\vec u = \vec c + \lambda \hat a$

where

  • $\vec c = \frac{b}{a} (\hat b \times \hat a) = \frac{\vec b \times \vec a}{a^2}$, writing $a = \|\vec a\|$ and $b = \|\vec b\|$
  • $\lambda \in \Bbb R $

but I'm unsure whether it is possible to determine $\vec u$ any further, since any vector $\vec u$ satisfying

$(\vec u \cdot \hat c)\, \hat c = \vec c$

should do the trick, which means that $\lambda$ can take any value.
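A quick numeric check of this formula (a sketch with arbitrarily chosen $\vec a$ and $\vec b \perp \vec a$) confirms that any $\lambda$ works:

```python
# Quick numeric check of u = c + lambda * a_hat with c = (b x a) / |a|^2.
# The vectors below are arbitrary choices with b perpendicular to a.

def cross(p, q):
    """Cross product of two 3-vectors given as lists."""
    return [p[1] * q[2] - p[2] * q[1],
            p[2] * q[0] - p[0] * q[2],
            p[0] * q[1] - p[1] * q[0]]

a = [1.0, 2.0, 2.0]   # |a|^2 = 9, so a_hat = a / 3
b = [2.0, -1.0, 0.0]  # a . b = 0, so the equation a x u = b is consistent

c = [x / 9.0 for x in cross(b, a)]  # candidate particular solution

for lam in (-1.0, 0.0, 2.5):  # any lambda gives a valid solution
    u = [ci + lam * ai / 3.0 for ci, ai in zip(c, a)]
    assert all(abs(x - y) < 1e-12 for x, y in zip(cross(a, u), b))
```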

However, when expanding the cross product $\vec a \times \vec u = \vec b$ by hand, one can arrive at a matrix equation of the form

$ A \cdot (\vec x)^t = (\vec b)^t $

where $A$ is a matrix whose coefficients come from $\vec a$, and which should be solvable by at least one method, for example

$ (\vec x)^t = A^{-1} \cdot (\vec b)^t $

Am I doing something wrong? Or is this equation truly not solvable using vector algebra?

Best Answer

The operator $L_{\vec{a}} \colon \mathbb{R}^3 \rightarrow \mathbb{R}^3$ given by

$$ L_{\vec{a}}(\vec{u}) = \vec{a} \times \vec{u} $$

is a linear operator. There are two cases:

  1. If $\vec{a} = 0$ then $L_{\vec{a}}$ is the zero operator, so your equation is solvable if and only if $\vec{b} = 0$, in which case any vector $\vec{u} \in \mathbb{R}^3$ is a solution.
  2. If $\vec{a} \neq 0$ then the operator $L_{\vec{a}}$ has a one-dimensional kernel (spanned by $\vec{a}$) and a two-dimensional image (namely $\operatorname{span} \{ \vec{a} \}^{\perp}$). In this case, your equation is solvable if and only if $\vec{b} \perp \vec{a}$, and then the solution is not unique: there is a one-dimensional family of solutions $\vec{u} = \vec{u}_0 + t\vec{a}$, where $\vec{u}_0$ is any particular solution. Indeed, we have $$ \vec{a} \times (\vec{b} \times \vec{a}) = \vec{a} \times \big((b_2 a_3 - b_3 a_2) \vec{e}_1 + (b_3 a_1 - b_1 a_3) \vec{e}_2 + (b_1 a_2 - b_2 a_1) \vec{e}_3\big) = \\ (a_2 (b_1 a_2 - b_2 a_1) - a_3 (b_3 a_1 - b_1 a_3)) \vec{e}_1 + \\ (a_3 (b_2 a_3 - b_3 a_2) - a_1(b_1 a_2 - b_2 a_1)) \vec{e}_2 + \\ (a_1(b_3 a_1 - b_1 a_3) - a_2 (b_2 a_3 - b_3 a_2)) \vec{e}_3 = \\ (b_1(a_2^2 + a_3^2) - a_1(a_2 b_2 + a_3 b_3))\vec{e}_1 + \\ (b_2(a_1^2 + a_3^2) - a_2(a_1 b_1 + a_3 b_3))\vec{e}_2 + \\ (b_3(a_1^2 + a_2^2) - a_3(a_1 b_1 + a_2 b_2))\vec{e}_3 = \\ \vec{b} \|\vec{a}\|^2 - (\vec{a} \cdot \vec{b}) \vec{a} $$ so if $\vec{b} \perp \vec{a}$ then $\vec{u}_0 := \frac{\vec{b} \times \vec{a}}{\| \vec{a}\|^2}$ solves the equation, and all the solutions have the form $\frac{\vec{b} \times \vec{a}}{\| \vec{a}\|^2} + t\vec{a}$ for $t \in \mathbb{R}$.
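The identity $\vec{a} \times (\vec{b} \times \vec{a}) = \|\vec{a}\|^2 \vec{b} - (\vec{a} \cdot \vec{b}) \vec{a}$ is also easy to check numerically; here is a minimal sketch (the test vectors are arbitrary and need not be perpendicular, since the identity holds in general):

```python
# Numeric check of a x (b x a) = |a|^2 b - (a . b) a for arbitrary
# vectors a, b (perpendicularity is NOT needed for the identity itself).

def cross(p, q):
    return [p[1] * q[2] - p[2] * q[1],
            p[2] * q[0] - p[0] * q[2],
            p[0] * q[1] - p[1] * q[0]]

def dot(p, q):
    return sum(x * y for x, y in zip(p, q))

a = [3.0, -1.0, 2.0]   # arbitrary test vectors
b = [1.0, 4.0, -2.0]

lhs = cross(a, cross(b, a))
rhs = [dot(a, a) * bi - dot(a, b) * ai for ai, bi in zip(a, b)]
assert lhs == rhs  # both sides equal [29.0, 51.0, -18.0]
```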

In any case, the operator $L_{\vec{a}}$ is not invertible, so if you represent the equation $L_{\vec{a}}(\vec{u}) = \vec{b}$ as a matrix equation $A\vec{u} = \vec{b}$, the matrix $A$ won't have full rank and you can't invert it.
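To see this concretely: in the standard basis, $L_{\vec{a}}$ is represented by the skew-symmetric cross-product matrix $[\vec{a}]_\times$, whose determinant vanishes for every $\vec{a}$. A small sketch (using a hand-rolled $3 \times 3$ determinant to stay self-contained):

```python
# L_a in the standard basis is the skew-symmetric cross-product matrix
#           [  0   -a3   a2 ]
# [a]_x  =  [  a3   0   -a1 ],   since [a]_x u = a x u.
#           [ -a2   a1   0  ]
# Being 3x3 and skew-symmetric, its determinant is always 0.

def skew(a):
    return [[0.0, -a[2], a[1]],
            [a[2], 0.0, -a[0]],
            [-a[1], a[0], 0.0]]

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

a = [1.0, 2.0, 2.0]  # arbitrary nonzero vector
A = skew(a)
assert det3(A) == 0.0  # A is singular, hence not invertible

# The kernel is spanned by a itself: A a = a x a = 0.
assert all(sum(A[i][j] * a[j] for j in range(3)) == 0.0 for i in range(3))
```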