[Math] How to get the minimum angle between two crossing lines

computational mathematics, trigonometry

I'm not a student, just a programmer trying to solve a problem … I need a practical way to calculate the smallest angle between two intersecting lines. The value, of course, must always be less than or equal to 90°. For simplicity, imagine the hands of a clock as line segments starting from a common vertex at its center. At 12 o'clock we have 0° (and 360°), at 3 o'clock 90°, at 6 o'clock 180°, at 9 o'clock 270°; that is, the angles always range from 0° to 360°, measured clockwise. This is my reference.

Each hour mark is 30 degrees (360/12), so suppose one hand is at the 1 o'clock position and the other is at the 11 o'clock position. Using my reference, which always starts at zero at the 0-hour (or 12 o'clock, whichever you prefer), we have:

The 1 o'clock position is equivalent to 30°.
The 11 o'clock position is equivalent to 330°.

Looking at the clock, I know the result I expect is 60°. However, I need a mathematical relationship where I supply two angles measured from the same origin and get back the smallest angle formed by the two intersecting lines.

Note: I really do not need explanations about lines in the Cartesian plane or slopes (angular coefficients); I do not have that information. All I have are two angles, measured from the same reference, that define imaginary lines, and I only need to calculate the smallest angle between those lines. Thanks for understanding!

Best Answer

Given two angles $\alpha$ and $\beta$ in degrees, the required angle is: $$\gamma=\min\left(|\alpha - \beta|,\ 360-|\alpha-\beta|\right)$$
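As a check, plugging in the clock example from the question ($\alpha = 30°$, $\beta = 330°$): $$\gamma=\min\left(|30-330|,\ 360-300\right)=\min(300,\ 60)=60$$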

Edit: It seems there is a nicer formula: $$\gamma=180 - \bigl||\alpha-\beta|-180\bigr|$$
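For completeness, here is a minimal sketch implementing both formulas in Python (chosen only for illustration; the function names are made up, and the `% 360` is an added assumption so inputs outside $[0, 360)$ are also tolerated):

    def min_angle(alpha, beta):
        """Smallest angle, in degrees, between two clock-hand directions."""
        d = abs(alpha - beta) % 360      # raw separation, folded into [0, 360)
        return min(d, 360 - d)           # take the shorter way around the circle

    def min_angle_alt(alpha, beta):
        """Same result, using the second formula from the answer."""
        return 180 - abs(abs(alpha - beta) % 360 - 180)

    # Clock example from the question: 1 o'clock (30 deg) vs 11 o'clock (330 deg)
    print(min_angle(30, 330))      # 60
    print(min_angle_alt(30, 330))  # 60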
