[Math] How to calculate the angle between a vector and a horizontally oriented vector

algebra-precalculus, trigonometry

I need to calculate, in my Java application, the angle between my line and a horizontal line that has the same starting point. My line is described by its equation:

$$f(x) = ax + b.$$

I would like to know the angles alpha and beta in degrees. From this equation I can compute a few points that lie on both lines and use them for the computation. The angle between the horizontal line and the other two lines is at most 90 degrees.
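As a side note, for a line written in this slope-intercept form, the angle it makes with the horizontal follows directly from the slope, since $\tan(\alpha) = a$ and therefore $\alpha = \arctan(a)$. A minimal Java sketch, using a purely illustrative slope value:

double a = 1.0;                                       // hypothetical slope
double alphaRadians = Math.atan(a);                   // angle between the line and the horizontal
double alphaDegrees = Math.toDegrees(alphaRadians);   // 45.0 for a slope of 1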

Edit: this is the stage my problem is at now.

I have three points and I need to calculate the angle between them:

start = {x, y}
end1 = {x, y}
end2 = {x, y}

I create the two vectors that these three points define and move them to the origin of the coordinate system:

vector1 = [{end1.x - start.x}, {end1.y - start.y}]
vector2 = [{end2.x - start.x}, {end2.y - start.y}]
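In Java, this step might look like the following minimal sketch (plain doubles instead of a point class, using the example coordinates given further down):

double startX = 0, startY = 0;                        // start = {0, 0}
double end1X = 1, end1Y = 0;                          // end1  = {1, 0}
double end2X = 1, end2Y = 1;                          // end2  = {1, 1}
double v1x = end1X - startX, v1y = end1Y - startY;    // vector1 = (1, 0)
double v2x = end2X - startX, v2y = end2Y - startY;    // vector2 = (1, 1)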

Now I compute the angle between these two vectors (in radians):

radians = ({vector1.x * vector2.x} + {vector1.y * vector2.y}) /
(vector1Length * vector2Length);

Then I convert the radians into degrees (in Java).
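For reference, that conversion is a one-liner with Java's standard library:

double degrees = Math.toDegrees(radians);   // same as radians * 180.0 / Math.PI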

Question:

When I apply the above-mentioned technique to these three points:

start = {0, 0}
end1 = {1, 0}
end2 = {1, 1}

it calculates 1.41 radians, which is 81 degrees, but in my opinion it should be 45 degrees.

What am I doing wrong?

Best Answer

Pay attention to the brackets and the arccosine function; you missed them in your code.

radians = arccos(( {vector1.x * vector2.x} + {vector1.y * vector2.y} ) / ( vector1Length * vector2Length ));
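Translated into Java, a corrected version of the calculation might look like this minimal sketch (using the example points from the question; note both the Math.acos call and the brackets around the product of the lengths):

double v1x = 1, v1y = 0;                           // vector1 = end1 - start
double v2x = 1, v2y = 1;                           // vector2 = end2 - start
double dot = v1x * v2x + v1y * v2y;                // 1.0
double len1 = Math.hypot(v1x, v1y);                // 1.0
double len2 = Math.hypot(v2x, v2y);                // sqrt(2) ≈ 1.414
double radians = Math.acos(dot / (len1 * len2));   // acos(0.7071...) ≈ 0.785, i.e. pi/4
double degrees = Math.toDegrees(radians);          // 45.0, as expected

As a design note, if you later need a signed angle over the full range, Math.atan2 applied to the cross and dot products of the two vectors avoids the precision issues Math.acos has near 0 and 180 degrees.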