show(stack[0])
show(stack[1])
will work fine.
If you pass a tuple, you have to choose which band to output. Multi-band output is possible only when you pass the opened raster dataset itself.
The same is true for the creation of raster files.
dst.write(stack, 1)
becomes
dst.write(stack[0], 1)
dst.write(stack[1], 2)
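As a quick pure-NumPy sketch of the band-by-band pattern (the dataset is mocked with a plain function, since no file is opened here):

```python
import numpy as np

# A stack read with src.read() has shape (bands, height, width)
stack = np.arange(24, dtype=np.float32).reshape(2, 3, 4)

written = {}             # stand-in for an open rasterio dataset
def write(band, index):  # mimics dst.write(band, index)
    written[index] = band

# rasterio band indexes are 1-based
for i, band in enumerate(stack, start=1):
    write(band, i)

print(sorted(written))  # [1, 2]
```

Note that rasterio can also write all bands in one call, `dst.write(stack)`, when the array is 3-D.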
UPDATE
The cause of the error is that the width and height of the image are incorrectly specified.
stack.shape
returns (bands, height, width),
so it should be:
height=stack.shape[1],
width=stack.shape[2]
You can also use the dataset's own dimensions directly:
height=img.height,
width=img.width
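A minimal sketch of the axis order, using a made-up 3-band array (no file I/O involved):

```python
import numpy as np

# Simulate src.read() output for a 3-band, 4-row, 5-column raster
stack = np.zeros((3, 4, 5), dtype=np.float32)

bands, height, width = stack.shape
print(bands, height, width)  # 3 4 5

# So the output profile would use:
#   height=stack.shape[1], width=stack.shape[2]
```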
PyQGIS performs this task significantly faster than numpy or rasterstats. Here is the link to the cookbook: https://docs.qgis.org/3.16/en/docs/pyqgis_developer_cookbook/index.html.
To complete the task:
... a EPSG:4326 raster file with pixel size of 15 arcseconds, and a list of longitudes and latitudes. For each pair of coordinate, ... create a buffered circle of X kilometers surrounding them, and aggregate over the pixel values within each circle.
You can use the template below:
from qgis.core import *
from qgis.PyQt.QtCore import QVariant
from qgis.analysis import QgsZonalStatistics
import pandas as pd
# Inits app; path_to_qgis is your QGIS install prefix
QgsApplication.setPrefixPath(path_to_qgis, True)
qgs = QgsApplication([], False)
qgs.initQgis()
# Sets up original CRS
epsg4326 = QgsCoordinateReferenceSystem("EPSG:4326")
transformContext = QgsProject.instance().transformContext()
# Creates shapefile to write buffer into
filename = "buffers.shp"
fields = QgsFields()
lon_col_name, lat_col_name = "lon", "lat"  # column names in the CSV
fields.append(QgsField(lon_col_name, QVariant.Double))
fields.append(QgsField(lat_col_name, QVariant.Double))
save_options = QgsVectorFileWriter.SaveVectorOptions()
save_options.driverName = "ESRI Shapefile"
save_options.fileEncoding = "UTF-8"
writer = QgsVectorFileWriter.create(
    filename,
    fields,
    QgsWkbTypes.Polygon,
    epsg4326,
    transformContext,
    save_options
)
# Loads .csv file as vector layer
data_url = "link/to/csv/file/contains/lat/lon"
vlayer = QgsVectorLayer(data_url, "coordinates", "ogr")
features = vlayer.getFeatures()
field_names = [ field.name() for field in vlayer.fields() ]
lat_col_idx = field_names.index("lat")
lon_col_idx = field_names.index("lon")
for f in features:  # each feature is a point
    attributes = f.attributes()
    lat = float(attributes[lat_col_idx])
    lon = float(attributes[lon_col_idx])
    # Sets up a local azimuthal equidistant CRS centred on the point, so distances are in meters
    local_azimuthal = QgsCoordinateReferenceSystem()
    proj4_str = "+proj=aeqd +R=6371000 +units=m +lat_0={} +lon_0={}".format(lat, lon)
    local_azimuthal.createFromProj(proj4_str)
    wgs84_to_azimuthal = QgsCoordinateTransform(epsg4326, local_azimuthal, transformContext)
    azimuthal_to_wgs84 = QgsCoordinateTransform(local_azimuthal, epsg4326, transformContext)
    # Projects the point, creates the buffer, and reprojects back to the original CRS
    projected_point = wgs84_to_azimuthal.transform(QgsPointXY(lon, lat))
    point_geom = QgsGeometry.fromPointXY(projected_point)
    radius = 5000  # in meters
    segments = 18  # number of segments used to approximate the circle
    buffer = point_geom.buffer(radius, segments)
    buffer.transform(azimuthal_to_wgs84)
    # Writes buffer to file
    fet = QgsFeature()
    fet.setGeometry(buffer)
    fet.setAttributes([lon, lat])
    writer.addFeature(fet)
# Deleting the writer flushes buffered features and closes the file
del writer
# Reads in the buffer shape files
filename = "buffers.shp"
vlayer = QgsVectorLayer(filename, "buffers", "ogr")
# Calculates zonal mean-s for each raster
raster_url = "link/to/raster"
raster = QgsRasterLayer(raster_url)
zoneStats = QgsZonalStatistics(vlayer, raster, stats=QgsZonalStatistics.Statistics(QgsZonalStatistics.Mean))
zoneStats.calculateStatistics(None)
# Writes zonal mean-s to csv
destUrl = "link/to/destination/csv"
options = QgsVectorFileWriter.SaveVectorOptions()
options.driverName = "CSV"
QgsVectorFileWriter.writeAsVectorFormatV2(vlayer, destUrl, transformContext, options)
del vlayer
Thank you @StefanBrand for suggesting QGIS.
Best Answer
The Python API method behind the rio sample command is documented here: https://rasterio.readthedocs.io/en/latest/api/rasterio._io.html#rasterio._io.DatasetReaderBase.sample
src.sample() takes an iterator over (x, y) tuples, so do:
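A sketch of what the call likely looks like, assuming the coordinates live in two lists named lons and lats (names and values are illustrative); the actual src.sample() call is shown as a comment, since no dataset is opened here:

```python
# Hypothetical coordinate lists
lons = [10.0, 10.5, 11.0]
lats = [45.0, 45.5, 46.0]

# sample() expects an iterable of (x, y) pairs -- x (longitude) first
coords = list(zip(lons, lats))
print(coords[0])  # (10.0, 45.0)

# With a dataset opened as src, the per-point first-band values would be:
#   values = [v[0] for v in src.sample(coords)]
```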