[GIS] OGR GetX() always returns zero

gdal, ogr

I am trying to get the x-coordinate of the first point of a polygon, but I only get zero in return. What could cause this?

import os, sys, gdal, ogr
from gdalconst import *

driver = ogr.GetDriverByName('ESRI Shapefile')
testarea = driver.Open('testarea.shp', 0)
testarealyr = testarea.GetLayer()
testareafeature = testarealyr.GetNextFeature()

# get the x,y coordinates for the point
testareageom = testareafeature.GetGeometryRef()
print testareageom

x = testareageom.GetX()
print x


>>> 
POLYGON ((-124.1963006330602 43.006410659375554,-124.1861022086067 43.006647759060762,-124.1858958821004 43.00274627515271,-124.19612378176909 43.002422936639086,-124.19612378176909 43.002422936639086,-124.1963006330602 43.006410659375554))
0.0
>>> 

Best Answer

GetX() only returns a meaningful value for point geometries; called on a polygon it simply returns 0.0. A polygon is built from one or more linear rings, so you first have to drill down to a ring and then to one of its vertices: testareageom.GetGeometryRef(0).GetPoint(0)[0] will get you the X coordinate of the first point of the exterior ring.
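
Broken out step by step (a minimal sketch reusing testareageom from the question; GetPoint() returns an (x, y, z) tuple):

ring = testareageom.GetGeometryRef(0)  # exterior ring of the polygon
x, y, z = ring.GetPoint(0)             # first vertex of that ring
print x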

For completeness' sake, if you were after the centroid of the polygon, or the coordinates of all the points that make up the polygon, you could use something like this:

# centroid of the polygon
centroid = testareageom.Centroid()
x = centroid.GetX()
print x

# every vertex of every ring in the polygon
for ring in testareageom:
    for point in xrange(ring.GetPointCount()):
        x, y, z = ring.GetPoint(point)
        print x
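
Note that the outer loop visits the exterior ring first and then any interior rings (holes); if you only care about the exterior ring, stick with testareageom.GetGeometryRef(0) as shown above.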