[Math] How to convert spherical coordinate system to Euler angles

Tags: angle, linear algebra, spherical coordinates

I have a point at the origin of a $3D$ environment and a second point which is free to move along the surface of a sphere. Obviously, the best way to represent the direction of the vector created by those two points (with the origin being the tail) would be to use an azimuth angle and an altitude angle, as shown in the following picture:

My question is, how do I convert that to Euler angles?

Best Answer

Usually in spherical coordinates there are two angles, $\theta$ and $\phi$. Start with a point on the $z$-axis. Rotate about the current $z$-axis by $\phi$. Then, rotate about the new $y$-axis by $\theta$. This intrinsic Z-Y' sequence is the composite $R_z(\phi)\,R_y(\theta)$, and it should be directly convertible to some convention of Euler angles.
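As a quick numerical sanity check (a NumPy sketch; the helper names `Rz` and `Ry` are mine, not from the answer), applying $R_z(\phi)\,R_y(\theta)$ to a point on the $z$-axis should land exactly on the usual spherical-coordinate direction $(\sin\theta\cos\phi,\ \sin\theta\sin\phi,\ \cos\theta)$:

```python
import numpy as np

def Rz(a):
    """Rotation matrix about the z-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def Ry(a):
    """Rotation matrix about the y-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

theta, phi = np.pi / 3, np.pi / 4       # arbitrary test angles
start = np.array([0.0, 0.0, 1.0])       # point on the z-axis (unit sphere)

# Intrinsic "z, then new y" sequence == extrinsic product Rz(phi) @ Ry(theta)
rotated = Rz(phi) @ Ry(theta) @ start

# Standard spherical-coordinate direction for polar angle theta, azimuth phi
spherical = np.array([np.sin(theta) * np.cos(phi),
                      np.sin(theta) * np.sin(phi),
                      np.cos(theta)])

print(np.allclose(rotated, spherical))  # True
```

The initial $R_z(\phi)$ leaves the starting point fixed (it lies on the $z$-axis); what it does is carry the $y$-axis along, so the subsequent $\theta$ rotation tips the point toward the correct azimuth.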

If you're working with a Z-X'-Z'' convention, then the only subtlety involved is to line up the $y$-axis with the first rotation instead. This should correspond to a rotation about the $z$-axis by $\phi - \pi/2$ as your first rotation.
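This can also be checked numerically. The sketch below (again with hypothetical `Rz`/`Rx` helpers) assumes the rotation about the new $x$-axis is taken as $-\theta$; sign conventions for Euler angles vary between sources, so treat the signs as an assumption rather than the definitive convention:

```python
import numpy as np

def Rz(a):
    """Rotation matrix about the z-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def Rx(a):
    """Rotation matrix about the x-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

theta, phi = np.pi / 3, np.pi / 4
start = np.array([0.0, 0.0, 1.0])

# Z-X' part of a Z-X'-Z'' sequence: rotate about z by phi - pi/2, then about
# the new x-axis by -theta (sign is my assumed convention). The final Z''
# rotation leaves the point fixed, since the point lies on that axis.
zxz = Rz(phi - np.pi / 2) @ Rx(-theta) @ start

spherical = np.array([np.sin(theta) * np.cos(phi),
                      np.sin(theta) * np.sin(phi),
                      np.cos(theta)])

print(np.allclose(zxz, spherical))  # True
```

The $-\pi/2$ offset is exactly the "line up the $y$-axis" step: after $R_z(\phi - \pi/2)$, the new $x$-axis sits where the second rotation needs it to be.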
