Crash on various operations with very large clouds

If you are allergic to bug trackers, you can post here any remarks, issues and potential bugs you encounter
Dimitri
Posts: 156
Joined: Mon Oct 18, 2010 9:01 am
Location: Rennes (France)
Contact:

Crash on various operations with very large clouds

Post by Dimitri »

Hi Daniel,

I've been trying to use CC to subsample (with a space constraint) large point clouds (55 million points, or even 22 million points), but I systematically get a crash (after the octree is built and before the subsampling phase itself).
However, it's easy to randomly subsample the cloud down to ~10 million points, and then the spatial resampling works.

Cheers

Dimitri

PS: CloudCompare now has a few users in New Zealand! It's the beginning of world domination...
daniel
Site Admin
Posts: 7391
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Crash on various operations with very large clouds

Post by daniel »

Ok, I'll investigate this asap!
Daniel, CloudCompare admin
daniel
Site Admin
Posts: 7391
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Crash on various operations with very large clouds

Post by daniel »

OK Dimitri, here is some news:

Indeed, there was a general limitation due to the extensive use of a standard structure (std::vector) that seems to be limited (at least on Windows) to around 20 million values. As we were using it very often, this explains why a lot of algorithms couldn't handle many more points.

So I replaced this structure as much as possible (with the special CC "chunked" array, which is a little slower than a standard array but can fill almost all available memory). I have also updated the spatial sampling algorithm to be a little more efficient (though of course, with 90 million points, it won't make a lot of difference... it will still be very slow!). I've also tried to make CC more robust to "out of memory" errors. A lot of things have changed in the code during these few days, so for once I'll try to run intensive tests before releasing this new version of CC. I'll keep you informed (maybe you have already signed up to the CC update mailing list?).

Anyway, as a general consideration, the current version of CC is best suited for clouds of up to 30 million points (depending on their spatial distribution). This is mainly due to the current octree implementation (which is limited to 10 levels of subdivision, and which doesn't like big density variations). I'll try to test new implementations of the octree, or even a k-d tree structure, to see if we can go further. For the moment, as you suggested a while ago, I'll try to make the resampling algorithms available from the command line.
Daniel, CloudCompare admin
daniel
Site Admin
Posts: 7391
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Crash on various operations with very large clouds

Post by daniel »

I've released a new version today (11 Dec. 2012) that should fix this issue (and various others!). It is also possible to apply subsampling algorithms in command-line mode (wiki: http://www.cloudcompare.org/doc/wiki/in ... ommandLine).
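For reference, a spatial subsampling run from the command line looks roughly like this (the filename is a placeholder, and the exact flags may differ between versions; the wiki page above is the authoritative reference):

```shell
# Open a cloud and spatially subsample it with a minimum distance of
# 0.1 between points (-SS with the SPATIAL method, as documented on
# the CommandLine wiki page; adjust to your CloudCompare version).
CloudCompare -O big_cloud.bin -SS SPATIAL 0.1
```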
Daniel, CloudCompare admin