
Working with large data sets

Posted: Tue Oct 18, 2016 1:49 am
by Coldpoints
Hello,
I am working with two LAS files, 17.3 and 13.4 GB. My computer has 128 GB of RAM, two Xeon CPUs, and four graphics cards.
First, why does CC not load the two files entirely into memory instead of reading from the hard drive?
Second, I find that CC works very hard when I just orient the model in different directions - the spinner in the upper left is constantly going?!
The two files are SfM products, captured about eight hours apart, and I am not sure why there is a difference in file size for what is supposed to be the same area with minimal change.

Re: Working with large data sets

Posted: Tue Oct 18, 2016 2:11 pm
by daniel
Well, the file size is not always exactly proportional to the number of points (it depends on the format, etc.). In your case you should compare the actual number of points (you can see it in the cloud 'Properties' once the cloud is highlighted in CloudCompare). I bet they are quite different, aren't they?
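If you want to compare the counts outside CloudCompare, here is a minimal sketch that reads them straight from the file header. It assumes the files use the classic LAS public header layout, where the legacy point-record count is a little-endian uint32 at byte offset 107; LAS 1.4 files with more than 2^32 points store the real count elsewhere, which this sketch does not handle:

```python
import struct

def las_legacy_point_count(path):
    """Read the legacy 'number of point records' field of a LAS header.

    Per the ASPRS LAS spec (versions 1.0-1.3), this field is a
    little-endian uint32 at byte offset 107 of the public header.
    """
    with open(path, "rb") as f:
        header = f.read(111)
    if header[:4] != b"LASF":
        raise ValueError("not a LAS file: %s" % path)
    return struct.unpack_from("<I", header, 107)[0]

# Compare the two clouds (paths are placeholders):
# print(las_legacy_point_count("cloud_a.las"),
#       las_legacy_point_count("cloud_b.las"))
```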

And CC does load all the points in memory (RAM), but to speed up the display they also have to be loaded into GPU memory (which might not be big enough). You can see this in the Console (something about the cloud being loaded in 'VBOs', I believe). If the clouds are completely loaded on the GPU, then you can deactivate the 'LOD' mechanism, which may be counterproductive if you have a very powerful GPU (use the Display Options - the little wrench icon - and uncheck the 'Decimate clouds over...' option).
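As a back-of-the-envelope check of whether a cloud fits on the GPU, you can estimate the VBO footprint per point. This sketch assumes a plain layout of 3 float32 coordinates plus 4 bytes of RGBA per point; CloudCompare's actual VBO layout may differ (normals, scalar fields, etc. add more):

```python
# Assumed per-point layout: 3 x float32 xyz + 4 x uint8 RGBA = 16 bytes.
BYTES_PER_POINT = 3 * 4 + 4

def vbo_gib(num_points):
    """Rough GPU memory needed to keep a cloud's VBOs resident, in GiB."""
    return num_points * BYTES_PER_POINT / 2**30

# A 17 GB SfM cloud can easily hold several hundred million points;
# e.g. 500 M points would need on the order of:
print(round(vbo_gib(500_000_000), 2), "GiB")  # → 7.45 GiB
```

So a cloud of a few hundred million points can exceed the memory of many graphics cards, which is when the LOD/decimation mechanism kicks in.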

And last but not least, CC only uses one card by default (I don't know if the GPU driver is able to dispatch anything across the 4 cards; I have some doubts about this ;).

Generally it's hard to work with clouds bigger than 300 or 400 million points. The LOD helps a little, but there is definitely some room for improvement ;)