Hello guys,
I have some files that exceed 8 GB and I can't open them. Is there a size limit for .bin files?
Thanks
Maximum file size
Re: Maximum file size
There's no limit on the file size (at least as far as I'm aware), but there is one on the maximum number of points (I think it's 2 billion points per cloud, but you'll generally hit the machine's memory limit before that).
You can increase the virtual memory limit, but in this case you'll get horrible performance (which will already be poor anyway with so many points). At least it will let you apply some decimation in command-line mode, for instance...
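For reference, a command-line decimation run might look like the sketch below (the file name and the 0.05 spatial spacing are placeholders to adapt to your data; on Windows the binary is typically CloudCompare.exe):

```shell
# Guard so the snippet degrades gracefully on machines without CloudCompare
if command -v CloudCompare >/dev/null 2>&1; then
  # -SS SPATIAL 0.05 keeps at most one point per 0.05 units,
  # then the decimated cloud is saved back in BIN format
  CloudCompare -SILENT \
    -O huge_cloud.bin \
    -SS SPATIAL 0.05 \
    -C_EXPORT_FMT BIN \
    -SAVE_CLOUDS
else
  echo "CloudCompare not on PATH; command shown above for reference"
fi
```

-SS also accepts RANDOM (a target point count) or OCTREE (a subdivision level) if a spatial spacing isn't convenient.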
Daniel, CloudCompare admin
Re: Maximum file size
Is the 2-billion limit still valid for 2.9beta? Is there a way to overcome it? I tried to decimate the cloud using the command line, but the program still crashes. :/
Re: Maximum file size
Yep, no change on this side.
If the file is a LAS file, then you can split it before loading it with the LAS 'split' tab in the dedicated loading dialog.
If it's ASCII, you can also try to split the file into multiple clouds with the ASCII/text loading dialog (at the bottom). But in this case you'll need a lot of memory, as CC will still try to load all the points in memory (just spread over several clouds).
Daniel, CloudCompare admin
Re: Maximum file size
Thanks for the answer. In the end, it was just that my files were corrupted.
But this raises a new problem: my files are unusually large. For 11,000 points I have a .bin file of 1.8 GB. If I convert it to .e57 and resave it as .bin, the file is only a few MB.
Do you have any idea why?
Sorry for my bad English, I'm French...
Re: Maximum file size
Where do these files come from? (which version of CC?)
Daniel, CloudCompare admin
Re: Maximum file size
daniel wrote: Where do these files come from? (which version of CC?)
These files were created from .fls files and saved with CC 2.8.1. The original file, with no modifications and around 6 million points, is around 1.7 GB. After a few operations (section, cut, clean...) and of course after deleting all the temporary files, my final file is bigger than the source...
After some experimenting, I found that resampling reduces the file size, but it deletes the SF. So it's not a good option for me.
Re: Maximum file size
The original FLS file is 1.7 GB with only 6 M points? Or is that the first BIN file?
For BIN files at least, the problem is that the scan grid (the gridded structure) is kept in memory with the point cloud. And each time you split the cloud, the scan grid is duplicated. If this grid is huge then the memory might explode quickly.
I believe the latest versions (2.8.1 or 2.9.beta) let you delete this scan grid (it's actually only useful for computing normals). The idea is to remove it right from the start to avoid duplication (and make the BIN file much smaller).
And as far as I know, resampling should not remove the scalar fields (maybe they're just deactivated?).
Daniel, CloudCompare admin
Re: Maximum file size
daniel wrote: The original FLS file is 1.7 GB with 6 M points only? Or is it the first BIN file?
It's my first .bin, created from multiple .fls files.
daniel wrote: I believe the latest versions (2.8.1 or 2.9.beta) let you delete this scan grid (it's only useful to compute normals actually). The idea is to remove it right from the start to avoid duplication (and make the BIN file much smaller).
How can I delete it?