I want to use OGR in Python to write a simple point-in-polygon test:
import osgeo.ogr
import osgeo.osr

def point_in_polygon(point, polygon):
    """
    point : [longitude, latitude]
    polygon : [(lon1, lat1), (lon2, lat2), ..., (lonn, latn)]
    """
    # Create the spatial reference (WGS84 geographic coordinates)
    spatialReference = osgeo.osr.SpatialReference()
    spatialReference.SetWellKnownGeogCS("WGS84")
    # Create the ring and add the polygon vertices to it
    ring = osgeo.ogr.Geometry(osgeo.ogr.wkbLinearRing)
    for lon, lat in polygon:
        ring.AddPoint(lon, lat)
    # Create the polygon from the ring
    poly = osgeo.ogr.Geometry(osgeo.ogr.wkbPolygon)
    poly.AssignSpatialReference(spatialReference)
    poly.AddGeometry(ring)
    # Create the point to test
    pt = osgeo.ogr.Geometry(osgeo.ogr.wkbPoint)
    pt.AssignSpatialReference(spatialReference)
    pt.SetPoint(0, point[0], point[1])
    # Run the containment test
    return pt.Within(poly)
However, it doesn't seem to work properly:
In [22]: polygon = [(-10., -10.), (-10., 10.), (10., 10.), (10., -10.), (-10., -10.)]
In [23]: point_in_polygon([0., 0.], polygon)
Out[23]: True
In [24]: point_in_polygon([359., 0.], polygon)
Out[24]: False
Any ideas what I'm missing?
Best Answer
I suspect it's because the underlying GEOS library works in Cartesian space rather than on the sphere, so longitudes 359 and -1 are distinct coordinates to it. You'll have to subtract 360 from any longitude greater than 180, which makes 359 == -1. Of course, you'll still have problems with features crossing the anti-meridian (i.e. +-180 degrees longitude), but you can easily detect that case and skip the 360 subtraction step.
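A minimal sketch of that normalization step, using a hypothetical `normalize_lon` helper (not part of OGR) that wraps any longitude into the [-180, 180) range:

```python
def normalize_lon(lon):
    """Wrap a longitude into [-180, 180) (hypothetical helper).

    E.g. 359 maps to -1, so it falls inside a polygon spanning
    [-10, 10] degrees longitude.
    """
    return ((lon + 180.0) % 360.0) - 180.0
```

You would then call `point_in_polygon([normalize_lon(lon), lat], polygon)` instead of passing the raw longitude. Note that the modulo form maps +180 to -180, which is usually what you want for a half-open longitude range.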