[GIS] Using GDAL’s Rasterize (Vector to Raster) utility

gdal-rasterize, python, qgis

I am trying to convert a point layer to a raster file using the QGIS Python console. I want the output raster to match the extent of an existing raster file. The new raster should take the point values from a specified field of the input vector and have a value of 0 everywhere else.

I am using GDAL's Rasterize (Vector to Raster) utility (https://docs.qgis.org/2.8/en/docs/user_manual/processing_algs/gdalogr/gdal_conversion/rasterize.html).
This documentation is for QGIS 2.8, while I am running my script in QGIS 3.6.3 and 3.4.8.
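Because the linked documentation is for 2.8, the parameter names it lists may not match what QGIS 3 expects. The parameters the installed version actually accepts can be printed from the Python console (gdal:rasterize is the algorithm id used in the script below):

import processing
from qgis.core import QgsApplication

# Show the description, parameter names and accepted values (enums, defaults)
# of the rasterize algorithm as registered in this QGIS installation.
processing.algorithmHelp('gdal:rasterize')

# If the id is unclear, list every available algorithm id.
for alg in QgsApplication.processingRegistry().algorithms():
    print(alg.id(), '->', alg.displayName())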

I am getting the following error for the last line of the code, i.e. the processing.run call that writes the output:

'Incorrect parameter value for UNITS'

I have seen several existing posts on this but still couldn't solve the issue. For instance, when I use the script given in Rasterizing shapefiles with GDAL and Python?, I get an output raster with a value of 0 for all pixels.

I also tried giving the pixel size instead of the extent, following the post extent parameter pyqgis gdal processing rasterize, but I still get the same 'Incorrect parameter value for UNITS' error.

Using the code given in another post, Python, creating layers and rasterizing polygons in GDAL, I get the error:

AttributeError: 'NoneType' object has no attribute 'SetGeoTransform'

I checked the documentation for SetGeoTransform but could not figure out what was missing. Each variant of the code gives me a different error.
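For reference, the pure-GDAL approach from those posts boils down to roughly the sketch below (the paths and the Rstrz field are from my script further down). I have added None checks, since GDAL returns None rather than raising when a file cannot be opened or created, and the ATTRIBUTE option, which seems to be needed for field values (rather than a constant) to be burned:

from osgeo import gdal, ogr

ref = gdal.Open(r'C:\Trials\Extent.tif')      # reference raster for extent and size
src = ogr.Open(r'C:\Trials\points.shp')       # point layer to burn
if ref is None or src is None:
    # GDAL/OGR return None instead of raising, which is where
    # "'NoneType' object has no attribute ..." errors come from.
    raise RuntimeError('Could not open the reference raster or the point layer')
src_layer = src.GetLayer()

driver = gdal.GetDriverByName('GTiff')
out = driver.Create(r'C:\Trials\points_rst.tif',
                    ref.RasterXSize, ref.RasterYSize, 1, gdal.GDT_Float32)
if out is None:
    raise RuntimeError('Could not create the output raster (check the path)')

out.SetGeoTransform(ref.GetGeoTransform())    # same pixel grid as the reference
out.SetProjection(ref.GetProjection())        # same CRS as the reference
out.GetRasterBand(1).Fill(0)                  # background value of 0

# Burn the values of the Rstrz field; without ATTRIBUTE only a constant
# burn value is used, which can easily leave every pixel at 0.
gdal.RasterizeLayer(out, [1], src_layer, options=['ATTRIBUTE=Rstrz'])
out.FlushCache()
out = None                                    # close the dataset to flush to disk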

I am attaching the code that I have used.

I just do not understand why the UNITS error is occurring. Both input datasets have the same projection.

import os 
from osgeo import gdal, ogr
import processing
#input = iface.activeLayer()
input = r'C:\Trials\points.shp'
#layer = QgsVectorLayer(input.source(),"point","ogr")
layer = QgsVectorLayer(input,"point", "ogr")

#Raster extent for the new layer
St_Area_Rst = r'C:\Trials\Extent.tif'
#collecting dimensions of the above raster
dim = gdal.Open(St_Area_Rst)
xoff,a,b,yoff,c,d = dim.GetGeoTransform()
band1 = dim.GetRasterBand(1)
rows = dim.RasterYSize
cols = dim.RasterXSize
Total_pixel = (rows*cols)
#extent of the input point layer (not used below)
extent = layer.extent()
#target extent derived from the reference raster's geotransform
xmin = xoff
xmax = xoff + (a*cols)
ymin = yoff + (d*rows)
ymax = yoff
output = r'C:\Trials\points_rst.tif'
processing.run("gdal:rasterize",
               {"INPUT":layer,
               "FIELD":"Rstrz",
               "DIMENSIONS":0,
               "WIDTH":cols,
               "HEIGHT":rows,
               "RAST_EXT":"%f,%f,%f,%f"% (xmin, xmax, ymin, ymax),
               "TFW":1,
               "RTYPE":4,
               "NO_DATA":0,
               "COMPRESS":0,
               "JPEGCOMPRESSION":1,
               "ZLEVEL":1,
               "PREDICTOR":1,
               "TILED":False,
               "BIGTIFF":2,
               "EXTRA": '',
               "OUTPUT":output})

Best Answer

I was trying gdal:rasterize yesterday as well but eventually gave up; my data was a bit too fragmented for GDAL to accept, and there were some other issues I couldn't solve. However, I had success using GRASS v.to.rast in QGIS 3.8. It felt like a simpler tool for what I was doing, as long as you pay attention to the settings in the "Advanced" section; in my case that meant the cell size.

This is not a solution to your problem, but may be an option to get the result you are after.
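If you want to script it rather than use the toolbox dialog, a rough sketch from the Python console is below. I have not verified the exact parameter keys (use, attribute_column, GRASS_REGION_PARAMETER, GRASS_REGION_CELLSIZE_PARAMETER), so treat them as assumptions and check them with processing.algorithmHelp('grass7:v.to.rast') first; the paths and field name are taken from your script:

import processing
from osgeo import gdal

# Ask QGIS which parameter names this GRASS wrapper actually uses.
processing.algorithmHelp('grass7:v.to.rast')

# Region and cell size taken from the reference raster, as in your script.
ref = gdal.Open(r'C:\Trials\Extent.tif')
xoff, px_w, _, yoff, _, px_h = ref.GetGeoTransform()
xmin, xmax = xoff, xoff + px_w * ref.RasterXSize
ymin, ymax = yoff + px_h * ref.RasterYSize, yoff

# Parameter keys below are assumptions; adjust them to match algorithmHelp.
processing.run('grass7:v.to.rast',
               {'input': r'C:\Trials\points.shp',
                'use': 0,                              # assumed: 0 = attr (burn an attribute column)
                'attribute_column': 'Rstrz',
                'GRASS_REGION_PARAMETER': '%f,%f,%f,%f' % (xmin, xmax, ymin, ymax),
                'GRASS_REGION_CELLSIZE_PARAMETER': abs(px_w),  # the "Advanced" cell size setting
                'output': r'C:\Trials\points_rst_grass.tif'})  # hypothetical output path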
