Questions about partition on a large colored point cloud #15
Right, an error happens if the size of the file is a multiple of the batch size. Good catch! It is fixed in the latest commit, so please pull the newest version. I also see that there was a problem displaying the correct pruning %; it is now fixed as well.

Now to your point clouds. This has nothing to do with color, as far as I can tell. Your first file (691892 points) is already subsampled with at least a 5 cm grid (actually about 12 cm), so the pruning does nothing. The second one (635137 points) is very small, about 50 cm in length, with huge precision. Hence the pruning decimates the cloud almost completely.
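The decimation of the second cloud follows from simple voxel arithmetic: pruning keeps at most one point per occupied voxel, so a ~50 cm cloud on a 5 cm grid can retain at most about a thousand points, however dense it was originally. A rough bound, assuming a 0.05 m voxel width:

```python
# Rough upper bound on how many points survive voxel pruning:
# at most one point per occupied voxel.
extent = 0.5         # metres: approximate size of the second cloud (~50 cm)
voxel_width = 0.05   # metres: 5 cm pruning grid
voxels_per_axis = round(extent / voxel_width)
max_survivors = voxels_per_axis ** 3
print(max_survivors)  # 1000; the reported 289 surviving points fits this bound
```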
Thanks for your answer! Now I can run your partition code successfully, but there are still a couple of small problems:
1. The pruning % is still not correct.
2. I still get a
It is usually beneficial, both in processing speed and precision, to subsample the input point cloud with
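A minimal sketch of what such voxel-grid subsampling does (an illustration only, not the library's actual pruning code; the function name `voxel_prune` is hypothetical):

```python
import numpy as np

def voxel_prune(xyz, voxel_width):
    """Keep at most one point per voxel of side voxel_width
    (illustrative sketch, not the actual provider.py implementation)."""
    # map each point to the integer coordinates of its voxel
    voxel = np.floor(xyz / voxel_width).astype(np.int64)
    # keep the first point encountered in each occupied voxel
    _, keep = np.unique(voxel, axis=0, return_index=True)
    return xyz[np.sort(keep)]

# A cloud already coarser than the grid is left untouched;
# a dense one collapses to roughly one point per voxel.
dense = np.random.rand(1000, 3) * 0.5        # 1000 points in a 50 cm cube
print(voxel_prune(dense, 0.05).shape[0])     # at most 1000, usually far fewer
```

This also explains both logs above: a cloud whose points are already ~12 cm apart loses nothing on a 5 cm grid, while a tiny, very dense cloud is decimated.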
Hi loicland, I have a similar issue when running partition.py on the semantic3d test_full set:
Hi, can you print the size of
Also print the batch number:
With
Hi loicland, here are the prints before the error in provider.py:
Hi, can you add the following at line 238 of
and report the log after:
I am trying to reproduce your bug, but I need this information. Did you use the default values for
Hi Loic,
When I tried to run partition on a large-scale colored point cloud (containing 5000000 points), I got the following error:

```
/home/ubuntu/capstone/superpoint_graph/partition/provider.py:357: UserWarning: genfromtxt: Empty input file: "/home/ubuntu/capstone/semantic3d/data/test_full/colored.txt"
  , skip_header=i_rows)
Traceback (most recent call last):
  File "partition/partition_Semantic3D.py", line 93, in <module>
    xyz, rgb = prune(data_file, args.ver_batch, args.voxel_width)
  File "/home/ubuntu/capstone/superpoint_graph/partition/provider.py", line 361, in prune
    xyz_full = np.array(vertices[:, 0:3], dtype='float32')
IndexError: too many indices for array
```
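As the maintainer's reply confirms, this IndexError appears when the number of lines in the file is an exact multiple of the batch size: the loop then issues one read past the end of the file, genfromtxt returns an empty array, and indexing it with `[:, 0:3]` fails. A minimal sketch of a batched reader with the empty-chunk guard (hypothetical code, not the exact provider.py fix):

```python
import io
import warnings
import numpy as np

def prune_batches(text, ver_batch):
    """Read an xyz text file in chunks of ver_batch lines (illustrative
    sketch of a batched reader; not the actual provider.py code)."""
    i_rows = 0
    chunks = []
    while True:
        with warnings.catch_warnings():
            warnings.simplefilter("ignore")  # silence "Empty input file"
            chunk = np.genfromtxt(io.StringIO(text),
                                  skip_header=i_rows, max_rows=ver_batch)
        if chunk.size == 0:   # read past end of file: stop cleanly
            break             # (without this, chunk[:, 0:3] raises IndexError)
        if chunk.ndim == 1:   # a single remaining row comes back 1-D
            chunk = chunk.reshape(1, -1)
        chunks.append(chunk[:, 0:3].astype('float32'))
        i_rows += ver_batch
    return np.vstack(chunks)
```

With four input lines and `ver_batch = 2` (an exact multiple), the final read returns an empty array and the guard exits the loop instead of crashing.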
The command I run here is:
```
partition/partition_Semantic3D.py --SEMA3D_PATH $SEMA3D_DIR
```
You can find my file at:

I also tried to reduce the file size: I created another txt file, copied the first 691892 lines of my original colored point cloud into it (you can find it at:), and re-ran partition with the same command. This time the error disappeared, but the number of points didn't reduce at all; the log is

```
Reduced from 691892 to 691892 points (37.92%)
```

I remember that when I ran partition on a point cloud without color (you can find it at ), the number of points was drastically reduced; the log is

```
Reduced from 635137 to 289 points (0.04%)
```
Could you please tell me the reason for this?

Many thanks,
Yongchi