Given known vectors $\vec a$ and $\vec b$, is it possible to solve for $\vec u$, given the following equation?
$\vec a \times \vec u = \vec b$
So far I have found that the following must be true
$\vec u = \vec c + \lambda \hat a$
where
- $\vec c = \frac{b}{a} (\hat b \times \hat a) = \frac{\vec b \times \vec a}{a^2}$ (writing $a = |\vec a|$ and $b = |\vec b|$)
- $\lambda \in \Bbb R $
but I'm unsure whether it is possible to determine $\vec u$ further, since any vector $\vec u$ whose component perpendicular to $\hat a$ equals $\vec c$, i.e. any $\vec u$ with
$\vec u - (\vec u \cdot \hat a)\, \hat a = \vec c$,
does the trick, which means that $\lambda$ can take any value.
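The claimed solution family can be checked numerically. A minimal sketch (assuming NumPy; `solve_family` is a name I'm introducing here) that verifies $\vec a \times (\vec c + \lambda \hat a) = \vec b$ for several values of $\lambda$:

```python
import numpy as np

def solve_family(a, b, lam):
    """Return u = c + lam * a_hat, with c = (b x a) / |a|^2.

    Assumes a != 0 and b perpendicular to a (otherwise no solution exists).
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    c = np.cross(b, a) / np.dot(a, a)        # the particular solution c
    return c + lam * a / np.linalg.norm(a)   # plus any multiple of a_hat

a = np.array([1.0, 2.0, 3.0])
# Construct b as a cross product with a, so b is perpendicular to a:
b = np.cross(a, np.array([0.5, -1.0, 2.0]))

for lam in (-2.0, 0.0, 3.7):
    u = solve_family(a, b, lam)
    assert np.allclose(np.cross(a, u), b)    # every lambda gives a valid solution
```

Every choice of `lam` passes the check, which matches the observation that $\lambda$ is unconstrained.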
However, when expanding the cross product $\vec a \times \vec u = \vec b$ by hand, one can reach a matrix equation of the form
$ A \vec u = \vec b $
where $A$ is a matrix built from the components of $\vec a$, and such an equation should be solvable by at least one method, for example
$ \vec u = A^{-1} \vec b $
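For reference, expanding $\vec a \times \vec u$ componentwise with $\vec a = (a_1, a_2, a_3)$ shows that the matrix in question is the skew-symmetric cross-product matrix:

$$ A = \begin{pmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{pmatrix} $$

so that $A \vec u = \vec a \times \vec u$ for every $\vec u$.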
Am I doing something wrong? Or is this equation truly not solvable using vector algebra?
Best Answer
The operator $L_{\vec{a}} \colon \mathbb{R}^3 \rightarrow \mathbb{R}^3$ given by
$$ L_{\vec{a}}(\vec{u}) = \vec{a} \times \vec{u} $$
is a linear operator. There are two cases:
- If $\vec{a} = \vec{0}$, then $L_{\vec{a}}$ is the zero operator, so the equation has solutions only when $\vec{b} = \vec{0}$, and then every $\vec{u}$ solves it.
- If $\vec{a} \neq \vec{0}$, then the kernel of $L_{\vec{a}}$ is $\operatorname{span}\{\vec{a}\}$ and the image is the plane orthogonal to $\vec{a}$, so solutions exist only when $\vec{b} \perp \vec{a}$, and then they form the line $\vec{u} = \vec{c} + \lambda \hat{a}$.
In either case, the operator $L_{\vec{a}}$ is not invertible. If you represent the equation $L_{\vec{a}}(\vec{u}) = \vec{b}$ as a matrix equation $A\vec{u} = \vec{b}$, the matrix $A$ is not of full rank, so it cannot be inverted.
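The rank argument can be verified numerically. A minimal sketch (assuming NumPy; `cross_matrix` is a name I'm introducing here) that builds the matrix of $L_{\vec a}$ and confirms it is singular:

```python
import numpy as np

def cross_matrix(a):
    """Return the matrix A with A @ u == a x u (the matrix of L_a)."""
    a1, a2, a3 = a
    return np.array([[0.0, -a3,  a2],
                     [ a3, 0.0, -a1],
                     [-a2,  a1, 0.0]])

a = np.array([1.0, -2.0, 0.5])
A = cross_matrix(a)

u = np.array([3.0, 1.0, 4.0])
assert np.allclose(A @ u, np.cross(a, u))   # A really represents L_a

assert np.isclose(np.linalg.det(A), 0.0)    # singular: A^{-1} does not exist
assert np.linalg.matrix_rank(A) == 2        # rank 2 for a != 0; kernel is span{a}
```

The determinant of any odd-dimensional skew-symmetric matrix is zero, which is why the attempted inversion $\vec u = A^{-1}\vec b$ cannot work.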