[GIS] Filtering above ground LiDAR points using a catalog in lidR

Tags: filter, in-memory, lidar, lidr, r

I am analyzing LiDAR data collected in two separate years, 2007 and 2015. My analysis is centered on changes in vegetation during this period. The ultimate goal is to thin the 2015 data set so it is consistent with the 2007 set, normalize the data, then calculate a few metrics for each hectare (average height, crown size, etc.).

The 2007 data has a significantly lower point density than the 2015 data, so my question is about thinning the 2015 data to be more consistent with the 2007 point density.

For each area, the 2007 data has been split into two files, one for ground points and one for non-ground points. The 2015 data is all contained in one file covering the same area.

I am planning to normalize the data using a DTM created from the higher density 2015 data (assuming that the terrain has not changed dramatically in the 8 years between data collection events), so I would like to save all of the 2015 non-ground points and then thin the point cloud.

I have been using the lidR package in R for my analysis with a catalog of the LAS files.

My idea was this:

library(lidR)

#Get the list of LAS files to work with
files <- list.files(path = "/...2015TestGroup", full.names = TRUE, recursive = FALSE)

#Set up the catalog properties
ctg <- catalog(files)
opt_chunk_size(ctg) <- 500
opt_chunk_buffer(ctg) <- 20
opt_output_files(ctg) <- ".../Outputs/2015nonground/{XLEFT}_{YBOTTOM}_2015"

#Select only those points not classified as "ground"
aboveground <- lasfilter(ctg, Classification != 2L)

#Get the list of 2015 "non-ground" points
files <- list.files(path = ".../2015nonground", full.names = TRUE, recursive = FALSE)
#Set up the catalog properties
ctg <- catalog(files)
opt_output_files(ctg) <- ".../2016thinned/{XLEFT}_{YBOTTOM}_2015"

#Thin the points so they are consistent with the 2007 data
thinned <- lasfilterdecimate(ctg, homogenize(1, 1))

However, after receiving an error message, I realized that the lasfilter function doesn't work on a catalog.

Is there a way to filter to keep everything but the ground points? Or is there a different approach I should take that gets me to normalized data sets of similar point densities?

Best Answer

In your case the filter you are using is a simple one: Classification != 2, and you don't need the ground points at all. You are better off using a streaming filter and streamed processing.

ctg <- catalog("/...2015TestGroup")
opt_chunk_size(ctg) <- 0
opt_chunk_buffer(ctg) <- 0
opt_output_files(ctg) <- ".../Outputs/2015nonground/{XLEFT}_{YBOTTOM}_2015"
opt_filter(ctg) <- "-drop_class 2"
new_ctg <- catalog_retile(ctg)

catalog_retile is a streamed function: not a single point is loaded at the R level. Each point is read, filtered and written immediately, without calling any R code and without loading the point cloud into memory. It is thus much faster than using the R function lasfilter.

That being said, you would be better off using las2las from LAStools for this kind of job. It does exactly the same thing, but it is more efficient and simpler than calling more or less equivalent code from R.
