[GIS] How to determine the optimal raster resolution when interpolating a surface from points

Tags: interpolation, raster

I'm looking for guidance on how to determine an appropriate output resolution when interpolating a surface from irregularly spaced point samples.

I have a series of boreholes taken across a city, and the density of the samples varies considerably – sometimes boreholes are located within ~5m of each other, while in other locations they are ~1km apart.


What techniques should I use to determine an appropriate cell size when interpolating a surface from these points? Does the optimal cell size depend on the interpolation method?

I'd like the highest resolution that the dataset can actually support. For example, I assume a 1m grid is no more accurate than a ~10m grid here, but how do I determine that threshold?

Best Answer

You are interpolating an unknown surface z = f(x, y) from scattered data. Interpolation (surface reconstruction) on highly irregular point clouds of moderate size is often best done with globally (non-compactly) supported radial basis functions (RBFs), such as the thin plate spline or multiquadric. Implementations are available in SciPy, MATLAB, and standalone C++ thin plate spline libraries.
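For instance, here is a minimal sketch using SciPy's `scipy.interpolate.RBFInterpolator` (available since SciPy 1.7); the borehole coordinates and z values below are made-up placeholders standing in for real data:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical scattered borehole samples: (x, y) locations in metres and measured z values
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1000, size=(50, 2))
z = np.sin(xy[:, 0] / 200) + 0.001 * xy[:, 1]

# Globally supported thin plate spline interpolant; 'multiquadric' is another kernel option
rbf = RBFInterpolator(xy, z, kernel='thin_plate_spline')
```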

For a moderate dataset you can then rasterize simply by evaluating the fitted RBF interpolant at each "pixel" (cell centre). This also makes multiscale analysis straightforward: evaluate the same interpolant on grids of different resolutions.
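A sketch of that rasterization step, using the same kind of fitted interpolant as above; the extent, cell sizes, and sample data are placeholders chosen for illustration:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical scattered samples, fitted as in the previous sketch
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1000, size=(50, 2))
z = np.sin(xy[:, 0] / 200) + 0.001 * xy[:, 1]
rbf = RBFInterpolator(xy, z, kernel='thin_plate_spline')

def rasterize(interp, xmin, xmax, ymin, ymax, cell_size):
    """Evaluate the interpolant at every cell centre of a regular grid."""
    xs = np.arange(xmin + cell_size / 2, xmax, cell_size)
    ys = np.arange(ymin + cell_size / 2, ymax, cell_size)
    gx, gy = np.meshgrid(xs, ys)
    return interp(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gy.shape)

# Multiscale analysis: the same interpolant evaluated at 10 m and 100 m cell sizes
grid_10m = rasterize(rbf, 0, 1000, 0, 1000, 10)
grid_100m = rasterize(rbf, 0, 1000, 0, 1000, 100)
```

Because the interpolant is a continuous function, the choice of cell size only affects how finely you sample it, so you can compare several resolutions cheaply once the fit is done.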
