Hello,

I'm trying to create a large conversion, about 400 billion points, read from 9,500 1 km tiles, and the program keeps crashing at 100% indexing.

I saw that previous problems like this were due to duplicate points, so I ran lasduplicate on the whole dataset, and I am using the latest PotreeConverter version, which drops duplicates itself. I also made sure to run lasinfo -repair_bb -repair_counters on all tiles beforehand.

The job is running on a networked NVMe drive that doesn't usually give us any problems, and I am using the --encoding BROTLI flag.

I have run similar jobs successfully in the past. Here's the log (52 MB); I can't find any errors in it, though I'm not skilled at parsing these:
https://drive.google.com/file/d/1_rSSYBEuMuc9GmFYONWx6eaM9EsEoRaO/view?usp=sharing

Does anyone have any advice on troubleshooting? Alternatively, is there a way to take the intermediate files left by PotreeConverter 2 and finish packaging them into the hierarchy.bin, octree.bin, and metadata.json triplet without reprocessing everything? (Each attempt takes about three days.)
I found that there was an issue with the data: the tiles were non-contiguous. There was one large block of tiles, then a blank space of a few hundred kilometers, then another large block of tiles.

I reran these blocks independently without issue. There does seem to be some tolerance for this, as I've made similar mistakes with other datasets in the 100-200 billion point range, with blocks separated from each other in the same cloud, but I can't say at what point it becomes a problem.
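For anyone hitting the same crash, the disjoint-blocks situation described above can be detected before starting a multi-day conversion. This is a hypothetical pre-flight check (not part of PotreeConverter): it groups tile bounding boxes into spatially connected blocks using a union-find, so each block can be converted separately. The tile extents would normally be pulled from lasinfo output; here they are passed in directly as `(min_x, min_y, max_x, max_y)` tuples.

```python
# Hypothetical helper: group tile bounding boxes into connected blocks,
# so spatially disjoint regions can be converted as separate clouds.

def connected_blocks(tiles, gap=0.0):
    """Return lists of tile indices whose bounding boxes form connected
    groups. Two tiles are connected if their boxes touch or overlap,
    allowing a tolerance of `gap` map units between edges."""
    parent = list(range(len(tiles)))

    def find(i):
        # Find the root of tile i, with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def touches(a, b):
        # Axis-aligned boxes touch/overlap if they intersect on both axes.
        return (a[0] <= b[2] + gap and b[0] <= a[2] + gap and
                a[1] <= b[3] + gap and b[1] <= a[3] + gap)

    # O(n^2) pairwise check; fine for ~10,000 tiles.
    for i in range(len(tiles)):
        for j in range(i + 1, len(tiles)):
            if touches(tiles[i], tiles[j]):
                parent[find(i)] = find(j)

    blocks = {}
    for i in range(len(tiles)):
        blocks.setdefault(find(i), []).append(i)
    return list(blocks.values())


# Two adjacent 1 km tiles, plus one tile 300 km away:
tiles = [(0, 0, 1000, 1000), (1000, 0, 2000, 1000),
         (300000, 0, 301000, 1000)]
print(connected_blocks(tiles))  # two blocks: [0, 1] together, [2] alone
```

If this reports more than one block, running the converter once per block (as the comment above did) avoids the octree spanning a mostly empty bounding box.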
Wow, that's a massive dataset. It looks like a country-wide laser scan from a national institution's open data program? I'd love to know the Potree conversion processing time and the resulting file size!