[Math] How to move a camera in “Flight Simulator” style (Roll, Pitch, Yaw) using a joystick

geometry, spherical-coordinates, trigonometry

I am working on a video game where the camera movement (what the viewer sees) is controlled by a joystick. I want the camera to behave like a flight simulator, meaning the following:

When the user tilts the joystick down (toward the screen) the camera points down ("pitch").

When the user moves the joystick sideways, the camera simply "rolls" around the Z-axis.

Additionally, when the user twists the joystick, the camera "yaws".

How can I calculate the distance the camera would move in each direction, X, Y, and Z, based on those rotations?

[Image: My axis]
[Image: Joystick example]

For example, if the user rotated along the X axis 90 degrees, future movement would be 100% in the Y direction.

If the user "rolls" around the Z axis, the movement would be 100% in the Z direction; but
as soon as there is rotation along the X axis as well as the Z axis, there would be movement in all three directions.
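Concretely, under the conventions in these examples (forward starts along +Z, pitch is a rotation about X, roll is a rotation about Z), the resulting direction vector can be sketched as below; the function name and the order of composition (roll applied after pitch) are assumptions:

```python
import math

def forward_from_pitch_roll(pitch_deg, roll_deg):
    """Unit 'forward' vector after pitching about X, then rolling about Z.

    Conventions taken from the question's examples:
      - forward starts along +Z,
      - pitching 90 degrees points it fully along +Y,
      - rolling alone leaves it on +Z.
    """
    p = math.radians(pitch_deg)
    r = math.radians(roll_deg)
    # Rz(roll) @ Rx(pitch) applied to the initial forward vector (0, 0, 1)
    return (-math.sin(r) * math.sin(p),
             math.cos(r) * math.sin(p),
             math.cos(p))
```

With both angles nonzero, all three components are nonzero, matching the "movement in three directions" observation; with pitch = 90° and roll = 0° the result is purely along Y.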

This question comes very close to answering it, but is basically asking the inverse. I just want to know: given roll and pitch, how do I calculate yaw? Or rather, how do I figure out the distance in X, Y, and Z based on those rotations? Thanks.

Best Answer

When I've done this, I've preferred using a 2-vector system to maintain direction: a "forward" vector and an "up" vector, which express forward and up in local coordinates.

Roll, then, is rotation about the forward vector; yaw is rotation about the up vector; and pitch is rotation about the vector produced by the cross product UP × FORWARD.

The advantage of this representation is that it's easily convertible to matrix form, and it's easy to keep normalized. As you compose successive rotations, any finite-precision representation will gradually drift away from unit length and introduce errors into the projection results. Normalizing after each incremental transform prevents that.
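A minimal sketch of that two-vector scheme, using Rodrigues' rotation formula and renormalizing after every update. The class and method names are illustrative, and the sign conventions (right-handed rotations, forward starting along +Z as in the question's axes) are assumptions:

```python
import math

def rotate(v, axis, angle):
    """Rodrigues' rotation of vector v about a unit axis by angle (radians)."""
    c, s = math.cos(angle), math.sin(angle)
    dot = sum(a * b for a, b in zip(axis, v))
    cross = (axis[1] * v[2] - axis[2] * v[1],
             axis[2] * v[0] - axis[0] * v[2],
             axis[0] * v[1] - axis[1] * v[0])
    return tuple(v[i] * c + cross[i] * s + axis[i] * dot * (1 - c)
                 for i in range(3))

def normalize(v):
    n = math.sqrt(sum(a * a for a in v))
    return tuple(a / n for a in v)

class Camera:
    def __init__(self):
        self.forward = (0.0, 0.0, 1.0)   # matches the question's axes
        self.up = (0.0, 1.0, 0.0)

    def _right(self):
        # pitch axis = UP x FORWARD
        u, f = self.up, self.forward
        return (u[1] * f[2] - u[2] * f[1],
                u[2] * f[0] - u[0] * f[2],
                u[0] * f[1] - u[1] * f[0])

    def roll(self, a):    # rotation about the forward vector
        self.up = normalize(rotate(self.up, self.forward, a))

    def yaw(self, a):     # rotation about the up vector
        self.forward = normalize(rotate(self.forward, self.up, a))

    def pitch(self, a):   # rotation about up x forward; both vectors move
        axis = normalize(self._right())
        self.forward = normalize(rotate(self.forward, axis, a))
        self.up = normalize(rotate(self.up, axis, a))
```

The camera's velocity is then simply speed times `forward`, which answers the X/Y/Z question directly; the `normalize` call after each rotation is the drift correction described above.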

You can normalize a matrix, but it's more complex and less intuitive.
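For comparison, re-orthonormalizing a 3×3 rotation matrix can be sketched as one Gram–Schmidt pass; the row layout (right/up/forward) and helper names are assumptions:

```python
import math

def renormalize(m):
    """One Gram-Schmidt pass over a 3x3 rotation matrix.

    Rows are assumed to be the right, up, and forward basis vectors.
    Returns a matrix with orthonormal rows.
    """
    def norm(v):
        n = math.sqrt(sum(a * a for a in v))
        return [a / n for a in v]

    def remove_component(v, onto):
        d = sum(a * b for a, b in zip(v, onto))
        return [a - d * b for a, b in zip(v, onto)]

    r0 = norm(m[0])                          # keep the first row's direction
    r1 = norm(remove_component(m[1], r0))    # make the second row orthogonal
    r2 = [r0[1] * r1[2] - r0[2] * r1[1],     # rebuild the third as r0 x r1
          r0[2] * r1[0] - r0[0] * r1[2],
          r0[0] * r1[1] - r0[1] * r1[0]]
    return [r0, r1, r2]
```

This works, but deciding which rows to trust and in what order is exactly the kind of bookkeeping that makes the two-vector form more intuitive.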
