Decimation algorithm similar to voxelize_points #406
You can do it on the fly at read time. There are many things you can do at read time, so I do not necessarily provide an equivalent for an already loaded point cloud: `readLAS("file", filter = "-thin_with_voxel 1")`. I will consider the request anyway, but I can't swear I'll do it.
Thank you! This was perfect for what I needed. I still believe the feature addition would be nice anyway, but I can definitely see why it becomes a lower priority.
This should do the job. I will see whether I include it in lidR as is or not.

```r
keep_n_per_voxel = function(n = 1, res = 1) {
  f = function(las) {
    # Assign each point to a 3D voxel of size res, then keep the first
    # n row indices (.I) per voxel; voxels with fewer than n points keep all
    by <- lidR:::group_grid_3d(las$X, las$Y, las$Z, res)
    return(las@data[, if (n <= .N) .I[1:n] else .I[1:.N], by = by]$V1)
  }
  class(f) <- lidR:::LIDRALGORITHMDEC
  return(f)
}
```

Example:

```r
LASfile <- system.file("extdata", "Megaplot.laz", package = "lidR")
las  <- readLAS(LASfile)
vlas <- decimate_points(las, keep_n_per_voxel(res = 2))
plot(vlas)
```
Thank you very much for this helpful function.
Indeed, two numbers are expected: you can have non-cubic voxels. You can use something like this so that you are free to set a voxel height different from the voxel width:

```r
keep_n_per_voxel = function(n = 1, res = 1) {
  # res may be c(xy, z); a single number means cubic voxels
  if (length(res) == 1) res <- c(res, res)
  f = function(las) {
    by <- lidR:::group_grid_3d(las$X, las$Y, las$Z, res)
    return(las@data[, if (n <= .N) .I[1:n] else .I[1:.N], by = by]$V1)
  }
  class(f) <- lidR:::LIDRALGORITHMDEC
  return(f)
}
```
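A usage sketch of the non-cubic variant, assuming the same `Megaplot.laz` example file shipped with lidR; the specific `n` and `res` values here are illustrative, not from the thread:

```r
library(lidR)

# Hypothetical parameters: 2 m x 2 m voxels in XY, 0.5 m tall in Z,
# keeping up to 3 points per voxel
LASfile <- system.file("extdata", "Megaplot.laz", package = "lidR")
las  <- readLAS(LASfile)
vlas <- decimate_points(las, keep_n_per_voxel(n = 3, res = c(2, 0.5)))
plot(vlas)
```

A shorter Z resolution preserves vertical structure (e.g. building facades) at the cost of keeping more points than a cubic voxel of the same footprint would.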
Added in 3.2.0
I would like to propose the addition of a decimation algorithm that keeps one point (if any) per cubic region of a LAS. This would be somewhat similar to the voxelize_points function, but without introducing substantial rounding errors, and while keeping important information such as the classification data in the final result.
I believe this would be particularly useful for homogenizing point clouds in which large localized variations in the vertical dimension play an important role, such as cities with tall buildings, where that information could be lost using the existing homogenize algorithm.
Even in applications where this is not the case, if the performance overhead is not too great, it could still be a better method in many cases. It is more likely to maintain a homogeneous point density on the scanned surface rather than on the underlying grid. Thus, it does not lose as much detail on feature-rich tiles of the grid, which are often the most critical ones.