Finding the direction vector magnitude from position and angle

trigonometry, vectors

I'm writing a piece of code for an astronomical problem: basically, I want a fast way to check whether the fields of view of two meteor cameras overlap, so that I can then feed the data into a meteor trajectory solver.

I was able to simplify my problem to the little diagram below: [figure: problem geometry]

In summary: I have a vector $\vec{b}$ (representing the centre of one camera's field of view at a height of 100 km) and a unit direction vector $\hat{c}$ pointing to the same point in another camera's field of view. I would like to find a vector $\vec{a} = \vec{b} + k \hat{c}$ such that the angle between $\vec{a}$ and $\vec{b}$ is $\theta$. Finding either $\vec{a}$ or the scalar $k$ would do the trick.

I'm ashamed that my knowledge of trigonometry is so rusty that I can't solve what seems to be a simple problem! I tried attacking it using the law of cosines, but to no avail.

All help would be much appreciated!

Cheers

Best Answer

The dot product $$ \frac{\hat{c} \cdot \vec{b}}{\left| \hat{c} \right|\left| \vec{b} \right|} = \cos \left( \pi - \alpha \right) $$ gives you the angle $\alpha$: the interior angle opposite $\vec{a}$ (i.e. the angle between $-\vec{b}$ and $\hat{c}$).
Then, knowing two angles of the triangle, you know all three; the remaining one is $\pi - \theta - \alpha$.
After that, apply the law of sines to find $k$, the length of the side $k\hat{c}$:
$$ \frac{k}{\sin \theta} = \frac{\left| \vec{b} \right|}{\sin \left( \pi - \theta - \alpha \right)} \quad\Longrightarrow\quad k = \frac{\left| \vec{b} \right| \sin \theta}{\sin \left( \theta + \alpha \right)}, $$
and then $\vec{a} = \vec{b} + k \hat{c}$.
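For completeness, here is a minimal NumPy sketch of the recipe above. The values of `b`, `c_hat` and `theta` are made-up placeholders, not numbers from the original problem; the sketch assumes $\theta$ is small enough that the triangle closes ($\theta + \alpha < \pi$).

```python
import numpy as np

# Placeholder inputs (not from the original post):
# b     -- vector to the centre of camera 1's field of view at 100 km height
# c_hat -- unit direction vector from that point towards the other camera's FOV centre
# theta -- desired angle between a and b, in radians
b = np.array([50.0, 30.0, 100.0])
c_hat = np.array([0.3, -0.2, 0.9])
c_hat /= np.linalg.norm(c_hat)          # ensure it really is a unit vector
theta = np.radians(5.0)

# Interior angle alpha opposite side a:  cos(pi - alpha) = (c_hat . b) / |b|
alpha = np.pi - np.arccos(np.dot(c_hat, b) / np.linalg.norm(b))

# Law of sines:  k / sin(theta) = |b| / sin(pi - theta - alpha)
k = np.linalg.norm(b) * np.sin(theta) / np.sin(np.pi - theta - alpha)

# The sought vector
a = b + k * c_hat

# Check: the angle between a and b should come back as theta
angle = np.arccos(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
print(np.degrees(angle))                # ~5.0 degrees
```

The final `print` is just a self-check that the recovered $\vec{a}$ makes the requested angle with $\vec{b}$.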
