Calculating mean value and standard deviation from Command Line Mode

Hi :-)
I'm trying to write a script that can calculate the mean value and standard deviation of a scalar field in a cloud. Afterwards, using -FILTER_SF, all points whose scalar field values lie more than 3 standard deviations from the mean (on either side) would be extracted to one cloud, and the remaining points to another.
Is this possible in command line mode? Or is it only possible if you use the C++ library?
Best regards, Michael
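[For illustration, here is a minimal sketch of the intended computation done outside CloudCompare, assuming the cloud has been exported as an ASCII file with an x y z sf column layout; the file names, column index and output names are illustrative only. The computed bounds could then be passed to the -FILTER_SF command-line option mentioned above.]

Code:

import numpy as np

def split_by_sigma(in_path, sf_column=3, n_sigma=3.0):
    # Load the exported cloud: one point per row, columns x y z sf (assumed layout).
    data = np.loadtxt(in_path)
    sf = data[:, sf_column]

    mean = sf.mean()
    std = sf.std()
    lo, hi = mean - n_sigma * std, mean + n_sigma * std

    # Points whose scalar field value lies within mean +/- n_sigma * std ...
    inside = (sf >= lo) & (sf <= hi)

    # ... go to one cloud, the rest to another (output names are illustrative).
    np.savetxt("cloud_inside_3sigma.txt", data[inside], fmt="%.6f")
    np.savetxt("cloud_outside_3sigma.txt", data[~inside], fmt="%.6f")

    # The same bounds could also be fed to CloudCompare's -FILTER_SF {min} {max}.
    print(f"mean = {mean:.6f}  std = {std:.6f}")
    print(f"-FILTER_SF {lo:.6f} {hi:.6f}")

if __name__ == "__main__":
    split_by_sigma("cloud_with_sf.txt")  # hypothetical input file name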
Re: Calculating mean value and standard deviation from Command Line Mode
I don't think this is possible with the command line mode indeed... And to improve the code, one would need to add a specific option to the 'Filter by SF' command to filter with +/- N sigmas, I guess.
Daniel, CloudCompare admin