[GIS] Get pixel (row, column) from lat/long for the Moon

Tags: gdal, geotiff-tiff, moon, ogr

How do I obtain the pixel (row, column) value from a given latitude and longitude, for the Moon?

I have a GeoTIFF file that contains craters, and I need to crop a specific crater whose center is given to me as a latitude and longitude.

I know that I have to use GDAL to get this done, but my code is giving wrong results.

The code is in Python:

from osgeo import osr, gdal

# get the existing coordinate system
# SPHEROID["MOON_localRadius",1737400,0]
ds = gdal.Open(r"C:\Users\ZARL\Desktop\PRL_Project\small.tif")
old_cs = osr.SpatialReference()
old_cs.ImportFromWkt(ds.GetProjectionRef())

# create the new coordinate system (taken from the gdalinfo output below)
wgs84_wkt = """PROJCS["unnamed",
    GEOGCS["GCS_MOON",
        DATUM["D_MOON",
            SPHEROID["MOON_localRadius",1737400,0]],
        PRIMEM["Reference_Meridian",0],
        UNIT["degree",0.0174532925199433]],
    PROJECTION["Equirectangular"],
    PARAMETER["latitude_of_origin",0],
    PARAMETER["central_meridian",22.46049118041992],
    PARAMETER["standard_parallel_1",0],
    PARAMETER["false_easting",0],
    PARAMETER["false_northing",0],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]]]"""
new_cs = osr.SpatialReference()
new_cs.ImportFromWkt(wgs84_wkt)

# create a transform object to convert between coordinate systems
transform = osr.CoordinateTransformation(old_cs, new_cs)

# get the raster size and the extent from the geotransform
width = ds.RasterXSize
height = ds.RasterYSize
gt = ds.GetGeoTransform()
minx = gt[0]
miny = gt[3] + width * gt[4] + height * gt[5]

# get the coordinates in lat/long
#latlong = transform.TransformPoint(minx, miny)

xOrigin = gt[0]
yOrigin = gt[3]
pixelWidth = gt[1]
pixelHeight = -gt[5]

latlong = transform.TransformPoint(xOrigin, yOrigin)
#data = band.ReadAsArray(0, 0, width, height)

points_list = [(22.4655, -71.7136)]  # list of (lat, long)

for point in points_list:
    col = int((point[0] - latlong[1]) / pixelWidth)
    row = int((latlong[0] - point[1]) / pixelHeight)
    print(row, col)

I am getting wrong values.

The TIFF file is: https://www.dropbox.com/s/twov5jpe40hcf4q/small.tif?dl=0

This is a TIFF file in which the lat/long and other info are tagged.
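For reference, the same report can also be produced from the Python bindings with gdal.Info (available in GDAL 2.1+); a minimal sketch:

from osgeo import gdal

# prints the same report as the gdalinfo command-line tool
ds = gdal.Open("small.tif")
print(gdal.Info(ds))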

When I run the gdalinfo command on the image, I get the following:

Driver: GTiff/GeoTIFF
Files: small.tif
       small.tif.aux.xml
Size is 40, 33
Coordinate System is:
PROJCS["unnamed",
    GEOGCS["GCS_MOON",
        DATUM["D_MOON",
            SPHEROID["MOON_localRadius",1737400,0]],
        PRIMEM["Reference_Meridian",0],
        UNIT["degree",0.0174532925199433]],
    PROJECTION["Equirectangular"],
    PARAMETER["latitude_of_origin",0],
    PARAMETER["central_meridian",22.46049118041992],
    PARAMETER["standard_parallel_1",0],
    PARAMETER["false_easting",0],
    PARAMETER["false_northing",0],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]]]
Origin = (135.724395353784870,-2174581.951404873300000)
Pixel Size = (0.812720930262190,-0.812720930262190)
Metadata:
  AREA_OR_POINT=Area
  TIFFTAG_XRESOLUTION=1
  TIFFTAG_YRESOLUTION=1
Image Structure Metadata:
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  (     135.724,-2174581.951) ( 22d27'53.88"E, 71d42'47.22"S)
Lower Left  (     135.724,-2174608.771) ( 22d27'53.88"E, 71d42'50.40"S)
Upper Right (     168.233,-2174581.951) ( 22d27'57.74"E, 71d42'47.22"S)
Lower Right (     168.233,-2174608.771) ( 22d27'57.74"E, 71d42'50.40"S)
Center      (     151.979,-2174595.361) ( 22d27'55.81"E, 71d42'48.81"S)
Band 1 Block=40x1 Type=Byte, ColorInterp=Gray
  Min=1.000 Max=254.000
  Minimum=1.000, Maximum=254.000, Mean=122.977, StdDev=48.776
  Metadata:
    STATISTICS_MAXIMUM=254
    STATISTICS_MEAN=122.97651515152
    STATISTICS_MINIMUM=1
    STATISTICS_STDDEV=48.776222721807
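As a quick sanity check on the inverse-geotransform arithmetic, plugging the reported Center coordinate into the usual formulas should land on the middle pixel of this 40 x 33 raster; a short sketch built only from the Origin and Pixel Size above:

# geotransform assembled from the gdalinfo Origin and Pixel Size above
gt = (135.724395353784870, 0.812720930262190, 0.0,
      -2174581.951404873300000, 0.0, -0.812720930262190)

x, y = 151.979, -2174595.361   # projected Center coordinate from gdalinfo
col = (x - gt[0]) / gt[1]      # ~20.0, i.e. half of the 40 columns
row = (y - gt[3]) / gt[5]      # ~16.5, i.e. half of the 33 rows
print(col, row)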

Any other approach is also fine.

Best Answer

I worked out a solution and it works in almost every case. I am putting it here; it is done in Python.

Code:

from osgeo import osr, gdal

infile = r"C:\Users\ZARL\Desktop\PRL_Project\gale463m.tif"  # file path goes here
# lat/long value of the point of interest
lon = 137.855486
lat = -5.378912

indataset = gdal.Open(infile)

srs = osr.SpatialReference()
srs.ImportFromWkt(indataset.GetProjection())

# transform the lat/long point into the raster's projected coordinate system
srsLatLong = srs.CloneGeogCS()
ct = osr.CoordinateTransformation(srsLatLong, srs)
(X, Y, height) = ct.TransformPoint(lon, lat)

# report the results: X and Y are the coordinate values in metres
print('longitude: %f\t\tlatitude: %f' % (lon, lat))
print('X: %f\t\tY: %f' % (X, Y))

band = indataset.GetRasterBand(1)

cols = indataset.RasterXSize
rows = indataset.RasterYSize

# invert the geotransform to go from projected coordinates to pixel/line
transform = indataset.GetGeoTransform()

xOrigin = transform[0]
yOrigin = transform[3]
pixelWidth = transform[1]
pixelHeight = -transform[5]
data = band.ReadAsArray(0, 0, cols, rows)

points_list = [(X, Y)]  # list of (X, Y) coordinates

for point in points_list:
    col = int((point[0] - xOrigin) / pixelWidth)
    row = int((yOrigin - point[1]) / pixelHeight)
    # row and column value
    print(row, col)
    # data value at that row/column
    value = data[row][col]
    print(value)

This gives the data value at the pixel whose lat/long was provided.
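One caveat: this answer predates GDAL 3. From GDAL 3 onwards, osr honours the authority axis order, so TransformPoint(lon, lat) can silently swap its inputs. A minimal sketch of the usual adjustment, assuming the same file path as above:

from osgeo import gdal, osr

ds = gdal.Open(r"C:\Users\ZARL\Desktop\PRL_Project\gale463m.tif")
srs = osr.SpatialReference()
srs.ImportFromWkt(ds.GetProjection())
srs_ll = srs.CloneGeogCS()

# GDAL 3+ honours the authority axis order (often lat,long); force the
# traditional x=longitude, y=latitude order so TransformPoint(lon, lat) works.
if hasattr(osr, "OAMS_TRADITIONAL_GIS_ORDER"):
    srs.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)
    srs_ll.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)

ct = osr.CoordinateTransformation(srs_ll, srs)
X, Y, _ = ct.TransformPoint(137.855486, -5.378912)  # long, lat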

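To complete the original task of cropping the crater once the centre row/column is known, gdal.Translate with a srcWin pixel window (available in the GDAL 2.1+ Python bindings) is one way; a minimal sketch in which the half-window size of 50 pixels, the placeholder row/column values, and the output name crater.tif are all hypothetical:

from osgeo import gdal

ds = gdal.Open(r"C:\Users\ZARL\Desktop\PRL_Project\gale463m.tif")

# row/col of the crater centre, computed as in the answer above
row, col = 340, 120   # placeholder values; use your computed ones
half = 50             # hypothetical half-window size in pixels

# srcWin is [xoff, yoff, xsize, ysize] in pixel/line coordinates;
# this assumes the window fits inside the raster.
gdal.Translate("crater.tif", ds,
               srcWin=[col - half, row - half, 2 * half, 2 * half])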