Resolution of Python GDAL-generated GeoTiff different from plotted image

Tags: gdalwarp, geotiff-tiff, qgis, resampling

The Situation

I use the GDAL library to warp geotiffs (subsequently referred to as "rasters") of different resolutions (30m and 500m) to a 100m (subsequently referred to as "default") resolution. The function below (WarpRasterToDefaultGridcellFormat) is meant to handle both up- and downsampling tasks and to support different aggregation algorithms:

import os
import time

import numpy as np
from osgeo import gdal, gdalconst, osr

# NOTE: GetTileOutputFolderPath is a project helper defined elsewhere.

def WarpRasterToDefaultGridcellFormat(in_rds_FilePath, defaultRaster_FilePath, polygonUniqueID, AggregationType,
                                      FileName, timer_dict=None):
    ################
    ### Link for inspiration : http://jgomezdans.github.io/gdal_notes/reprojection.html
    ### OBJECTIVE: REPROJECT A RASTER FILE SO THAT IT ALIGNS WITH (I.E. SHARES THE SAME META-CHARACTERISTICS (CRS, GEOTRANSFORM, RESOLUTION, ...) AS) A DEFAULT RASTER
    ### INPUTS: in_rds_FilePath -> PATH TO RASTER FILE TO BE REPROJECTED/up- or downsampled. TYPE: STRING
    ###         defaultRaster_FilePath -> PATH TO DEFAULT RASTER TO USE AS "TEMPLATE". TYPE: STRING
    ###         polygonUniqueID -> TILE UNIQUE ID (TILE_ID). TYPE: STRING
    ###         AggregationType -> DEFINE THE AGGREGATION TYPE. POSSIBILITIES HERE ARE sum, average, NearestNeighbour, mode. TYPE: STRING
    ###         FileName -> NAME TO BE GIVEN TO THE OUTPUT DATASET. TYPE: STRING
    ### OUTPUTS: FILEPATH TO THE CREATED RASTER DATASET.
    ################
    tic = time.perf_counter()
    in_rds = gdal.Open(in_rds_FilePath, gdalconst.GA_ReadOnly)  # GA_ReadOnly for it not to be overwritten
    in_rds_proj = osr.SpatialReference(wkt=in_rds.GetProjection())
    in_rds_geoTransform = in_rds.GetGeoTransform()

    defaultRaster_rds = gdal.Open(defaultRaster_FilePath,
                                  gdalconst.GA_ReadOnly)  # from function DefineProxyRasterDatasetBasedOnShapefileExtent
    defaultRaster_rds_proj = osr.SpatialReference(wkt=defaultRaster_rds.GetProjection())
    def_rds_geoT = defaultRaster_rds.GetGeoTransform()

    tx = osr.CoordinateTransformation(defaultRaster_rds_proj, in_rds_proj)  #

    x_size = defaultRaster_rds.RasterXSize  # number of pixels from east to west
    y_size = defaultRaster_rds.RasterYSize  #  number of pixels from north to south

    (ulx, uly, ulz) = tx.TransformPoint(def_rds_geoT[0], def_rds_geoT[3])
    # (lrx, lry, lrz) = tx.TransformPoint(def_rds_geoT[0] + def_rds_geoT[1]*x_size, def_rds_geoT[3] + def_rds_geoT[5]*y_size)   # ulx = upper left corner x, ... Function to find bottom right corner of. NOT USED?

    TileOutputFolderPath = GetTileOutputFolderPath(polygonUniqueID)
    out_rds_FilePath = os.path.join(TileOutputFolderPath, polygonUniqueID + '_' + FileName + '.tif')

    # CREATE THE OUTPUT RASTERFILE
    driver = gdal.GetDriverByName('GTiff')
    out_rds = driver.Create(out_rds_FilePath, defaultRaster_rds.RasterXSize, defaultRaster_rds.RasterYSize,
                            in_rds.RasterCount, in_rds.GetRasterBand(1).DataType, options=['COMPRESS=LZW'])

    # SET THE GEOTRANSFORM AND THE CRS OF THE OUTPUT RASTER FILE (i.e. the resampled one)
    # out_rds_geoT=(ulx, def_rds_geoT[1], def_rds_geoT[2], uly, def_rds_geoT[4], def_rds_geoT[5])
    # out_rds.SetGeoTransform(out_rds_geoT)
    out_rds.SetGeoTransform(def_rds_geoT)
    out_rds.SetProjection(defaultRaster_rds_proj.ExportToWkt())

    # REPROJECT THE INPUT RASTER USING THE APPROPRIATE AGGREGATION ALGORITHM
    if AggregationType == 'sum':
        gdal.Warp(out_rds, in_rds, resampleAlg=gdal.GRA_Sum, dstNodata=np.nan)

    elif AggregationType == 'average':
        gdal.Warp(out_rds, in_rds, resampleAlg=gdal.GRA_Average, dstNodata=np.nan)

    elif AggregationType == 'NearestNeighbour':
        gdal.Warp(out_rds, in_rds, resampleAlg=gdal.GRA_NearestNeighbour, dstNodata=np.nan)

    elif AggregationType == 'mode':
        gdal.Warp(out_rds, in_rds, resampleAlg=gdal.GRA_Mode, dstNodata=np.nan)

    else:
        raise ValueError('Unknown AggregationType: ' + AggregationType)

    # DEFINE THE APPROPRIATE RASTER DESCRIPTION
    for band_nbr in range(1, in_rds.RasterCount + 1):

        RasterBand_in = in_rds.GetRasterBand(band_nbr)
        RasterBand_in_descr = RasterBand_in.GetDescription()

        RasterBand_out = out_rds.GetRasterBand(band_nbr)
        RasterBand_out.SetDescription(str(RasterBand_in_descr) + '_Reprojected')

    in_rds = defaultRaster_rds = out_rds = None

    toc = time.perf_counter()
    runtime = round((toc - tic) / 60, 2)
    print('Creation of ' + FileName + ' with the ' + AggregationType + ' algorithm took ' + str(runtime) + ' minutes.')

    if timer_dict is not None:
        timer_dict[FileName] = runtime

    return out_rds_FilePath

The problem

This approach works well for the 30m -> 100m downsampling. However, when upsampling from 500m -> 100m, the displayed resolution remains 500m even though the metadata indicates 100m. This happens regardless of the aggregation algorithm employed (e.g., NearestNeighbour, Mode, ...).

What I tried

  • Import the warped rasters into QGIS and check the metadata: the resolution displayed in the main window differs between the 500 -> 100m upsampled raster and the 100x100m default raster (Figure 1), although the metadata indicates that the two rasters (Figure 2 and Figure 3) should be equal.

Figure 1: Visual of the 100x100m raster, which I want to sample to (greens) and the raster that should have been upsampled to 100x100m (grayscale)


Figure 2: Metadata of 100x100m default raster as a reference


Figure 3: Metadata of the upsampled 500 -> 100m raster, which should be equal to the metadata of the 100x100m default raster (except the statistics)


  • Plot the generated rasters (the 500 -> 100m upsampled one and the 100x100m default one) and display the metadata. Same issue: although the metadata shows a 100x100m resolution, the 500 -> 100m upsampled raster appears coarser.

  • Generate the files again in Python with QGIS closed, since changes made in QGIS are sometimes written back to the files.

Ideas what the problem could be

I am pretty lost here, since the aggregation works well from 30m -> 100m and from 100m -> 10000m. The issue only seems to occur when upsampling.

MWE

Edit: Since the question is resolved, I removed the MWE from the repository.

Best Answer

As @user30184 pointed out in the comments, upsampling does not create new information. The resolution appears coarser because each 5x5 block of adjacent output pixels holds the same value (500m / 100m = 5 in each dimension).

Neither QGIS nor Python draws gridlines between individual pixels by default (at least in my case), so these identical blocks visually merge into single large pixels.
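The "coarse" look can be illustrated with plain NumPy: nearest-neighbour upsampling by a factor of 5 simply repeats each source pixel into a 5x5 block (a sketch of the effect, not of GDAL's internal implementation):

```python
import numpy as np

src = np.arange(9).reshape(3, 3)   # a tiny 3x3 "500 m" raster
factor = 5                         # 500 m / 100 m

# Nearest-neighbour upsampling: repeat each pixel along both axes
up = np.repeat(np.repeat(src, factor, axis=0), factor, axis=1)

print(up.shape)                    # (15, 15): the finer 100 m grid
# Every 5x5 block holds a single value, so the image still *looks* 500 m
print(np.unique(up[0:factor, 0:factor]))  # [0]
```

Without gridlines separating the pixels, such a block is indistinguishable from one large 500 m pixel.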
