I'm working with geographic coordinates and true-north bearings, and I want to find the intersection point of two points with different coordinates based on their azimuths (true-north bearings); see the screenshot below.
While searching for a solution on the internet I came across a method with the formulas given. I implemented it in Python, but after testing, my code does not give the correct result.
For those interested, here is where I found the method: https://www.movable-type.co.uk/scripts/latlong.html
Below is a screenshot of the formulas used:
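Since the screenshot does not reproduce here, these are the formulas as I transcribed them from that page (points 1 and 2 are the starting points, point 3 the intersection; they are the ones the code below follows):

```latex
\delta_{12} = 2\arcsin\sqrt{\sin^2(\Delta\varphi/2) + \cos\varphi_1\cos\varphi_2\sin^2(\Delta\lambda/2)}
\theta_a = \arccos\frac{\sin\varphi_2 - \sin\varphi_1\cos\delta_{12}}{\sin\delta_{12}\cos\varphi_1},\qquad
\theta_b = \arccos\frac{\sin\varphi_1 - \sin\varphi_2\cos\delta_{12}}{\sin\delta_{12}\cos\varphi_2}
\text{if } \sin(\lambda_2-\lambda_1) > 0:\ \theta_{12} = \theta_a,\ \theta_{21} = 2\pi - \theta_b;\quad
\text{else: } \theta_{12} = 2\pi - \theta_a,\ \theta_{21} = \theta_b
\alpha_1 = \theta_{13} - \theta_{12},\qquad \alpha_2 = \theta_{21} - \theta_{23}
\alpha_3 = \arccos(-\cos\alpha_1\cos\alpha_2 + \sin\alpha_1\sin\alpha_2\cos\delta_{12})
\delta_{13} = \operatorname{atan2}(\sin\delta_{12}\sin\alpha_1\sin\alpha_2,\ \cos\alpha_2 + \cos\alpha_1\cos\alpha_3)
\varphi_3 = \arcsin(\sin\varphi_1\cos\delta_{13} + \cos\varphi_1\sin\delta_{13}\cos\theta_{13})
\Delta\lambda_{13} = \operatorname{atan2}(\sin\theta_{13}\sin\delta_{13}\cos\varphi_1,\ \cos\delta_{13} - \sin\varphi_1\sin\varphi_3),\qquad
\lambda_3 = \lambda_1 + \Delta\lambda_{13}
```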
My Python implementation:
from math import pi, radians, cos, sin, sqrt, degrees, asin, atan2, acos
# GPS Coordinates and bearing of point 1 and point 2
# Point 1
x1 = 3.252780556
y1 = 36.793608333
b1 = 0
# Point 2
x2 = 3.238638889
y2 = 36.798261111
b2 = 100
# Convert to radian
lon1 = radians(float(x1))
lat1 = radians(float(y1))
b1 = radians(float(b1))
lon2 = radians(float(x2))
lat2 = radians(float(y2))
b2 = radians(float(b2))
dlon = lon2 - lon1 # Difference in longitude
dlat = lat2 - lat1 # Difference in latitude
# Great-circle distance between point 1 and point 2
haversine = sin(dlat / 2)**2 + cos(lat1) * cos(lat2) * sin(dlon / 2)**2
# angular distance between point 1 and point 2
ang_dist_1_2 = 2 * asin(sqrt(haversine))
# Initial and final bearings between point 1 and point 2
initial_bearing = acos((sin(lat2) - sin(lat1) * cos(ang_dist_1_2)) / (sin(ang_dist_1_2) * cos(lat1)))
final_bearing = acos((sin(lat1) - sin(lat2) * cos(ang_dist_1_2)) / (sin(ang_dist_1_2) * cos(lat2)))
# Adjust the bearings on the trigonometric circle
if sin(lon2 - lon1) > 0:  # compare the radian longitudes, not the raw degrees
    bearing_1_2 = initial_bearing
    bearing_2_1 = (2 * pi) - final_bearing
else:
    bearing_1_2 = (2 * pi) - initial_bearing
    bearing_2_1 = final_bearing
# Angles between different points
ang_1 = b1 - bearing_1_2 # angle p2<--p1-->p3
ang_2 = bearing_2_1 - b2 # angle p1<--p2-->p3
ang_3 = acos(-cos(ang_1) * cos(ang_2) + sin(ang_1) * sin(ang_2) * cos(ang_dist_1_2)) # angle p1<--p3-->p2
# angular distance between point 1 and intersection point (point 3)
ang_dist_1_3 = atan2(sin(ang_dist_1_2) * sin(ang_1) * sin(ang_2), cos(ang_2) + cos(ang_1) * cos(ang_3))
# Latitude of point 3
lat3 = asin(sin(lat1) * cos(ang_dist_1_3) + cos(lat1) * sin(ang_dist_1_3) * cos(b1))
# Longitude of point 3
delta_long_1_3 = atan2(sin(b1) * sin(ang_dist_1_3) * cos(lat1), cos(ang_dist_1_3) - sin(lat1) * sin(lat3))
lon3 = lon1 + delta_long_1_3
# Print results
print("Lat3 : ", lat3)
print("Lon3 : ", lon3)
The results obtained:
I don't know where my code is wrong. I'd be glad if anyone can help with another method, or suggest a library that can do the trick.
Best Answer
I think you've not converted the result from radians back to degrees. If I use the numbers from the screenshot of the converter tool:
and convert to degrees:
your code gives me:
The degrees-minutes-seconds values given as answers in the screengrab can be converted to decimal degrees and agree well.
I'd expect some difference due to tolerances in the numeric libraries, and from the fact that I can't see the exact decimals in the input boxes!
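For reference, here is a minimal self-contained sketch of the same computation with the conversion applied at the end. It is my reorganization of the formulas into a function, not your original code: it uses the atan2 form of the initial-bearing formula, so the quadrant adjustment with sin(λ2 − λ1) is not needed, and it assumes a spherical Earth.

```python
from math import radians, degrees, sin, cos, asin, acos, atan2, sqrt, pi

def intersection(lat1, lon1, brng1, lat2, lon2, brng2):
    """Intersection of two paths defined by a start point and a true-north
    bearing each. Inputs and outputs in degrees; spherical Earth model."""
    phi1, lam1, theta13 = radians(lat1), radians(lon1), radians(brng1)
    phi2, lam2, theta23 = radians(lat2), radians(lon2), radians(brng2)
    dphi, dlam = phi2 - phi1, lam2 - lam1

    # Angular distance point 1 -> point 2 (haversine formula)
    delta12 = 2 * asin(sqrt(sin(dphi / 2) ** 2
                            + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2))

    # Initial bearings between the two points; the atan2 form already
    # resolves the quadrant, so no separate sign adjustment is required
    theta12 = atan2(sin(dlam) * cos(phi2),
                    cos(phi1) * sin(phi2)
                    - sin(phi1) * cos(phi2) * cos(dlam)) % (2 * pi)
    theta21 = atan2(sin(-dlam) * cos(phi1),
                    cos(phi2) * sin(phi1)
                    - sin(phi2) * cos(phi1) * cos(dlam)) % (2 * pi)

    alpha1 = theta13 - theta12  # angle at point 1
    alpha2 = theta21 - theta23  # angle at point 2
    alpha3 = acos(-cos(alpha1) * cos(alpha2)
                  + sin(alpha1) * sin(alpha2) * cos(delta12))  # angle at point 3

    # Angular distance point 1 -> intersection
    delta13 = atan2(sin(delta12) * sin(alpha1) * sin(alpha2),
                    cos(alpha2) + cos(alpha1) * cos(alpha3))

    phi3 = asin(sin(phi1) * cos(delta13)
                + cos(phi1) * sin(delta13) * cos(theta13))
    dlam13 = atan2(sin(theta13) * sin(delta13) * cos(phi1),
                   cos(delta13) - sin(phi1) * sin(phi3))
    # Convert back to degrees before returning -- the missing step
    return degrees(phi3), degrees(lam1 + dlam13)

lat3, lon3 = intersection(36.793608333, 3.252780556, 0,
                          36.798261111, 3.238638889, 100)
print(lat3, lon3)  # roughly (36.7963, 3.2528) for these inputs
```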
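On the question of another method: as a cross-check, the intersection can also be computed with vectors, by representing each path as the normal of its great circle and crossing the two normals. Below is a sketch assuming a spherical Earth (function names are my own; requires numpy). Note that it intersects the full great circles and simply picks the candidate nearest point 1, so it does not check that the crossing lies ahead of each point along its bearing; it is also undefined if a start point sits exactly at a pole.

```python
import numpy as np

def to_vec(lat, lon):
    """Unit vector on the sphere for a lat/lon in degrees."""
    la, lo = np.radians(lat), np.radians(lon)
    return np.array([np.cos(la) * np.cos(lo),
                     np.cos(la) * np.sin(lo),
                     np.sin(la)])

def great_circle_normal(lat, lon, bearing):
    """Normal of the great circle through (lat, lon) with the given bearing."""
    p = to_vec(lat, lon)
    z = np.array([0.0, 0.0, 1.0])
    east = np.cross(z, p)
    east /= np.linalg.norm(east)          # local east direction
    north = np.cross(p, east)             # local north direction
    b = np.radians(bearing)
    d = np.cos(b) * north + np.sin(b) * east  # direction of travel
    return np.cross(p, d)

def intersect(lat1, lon1, b1, lat2, lon2, b2):
    """Intersection of the two great circles, as (lat, lon) in degrees."""
    n1 = great_circle_normal(lat1, lon1, b1)
    n2 = great_circle_normal(lat2, lon2, b2)
    c = np.cross(n1, n2)
    c /= np.linalg.norm(c)
    if np.dot(c, to_vec(lat1, lon1)) < 0:  # pick the candidate near point 1,
        c = -c                             # not its antipode
    return (float(np.degrees(np.arcsin(c[2]))),
            float(np.degrees(np.arctan2(c[1], c[0]))))

lat3, lon3 = intersect(36.793608333, 3.252780556, 0,
                       36.798261111, 3.238638889, 100)
print(lat3, lon3)
```

This agrees with the formula-based result above to within floating-point tolerance, which is a useful sanity check on either implementation.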