Hi! I'm trying to reproduce this work. As stated in the paper, you used a Quadro RTX 4000 laptop GPU, which has 8 GB of memory, with batch size 8 for inference and batch size 32 for training on the DSEC-Det dataset. However, when I ran the code, inference with batch size 1 already used about 13-20 GB of GPU memory. Is there a parameter I may have set incorrectly?
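For reference, this is roughly how I checked the peak usage. It is only a minimal sketch assuming a standard PyTorch pipeline; `model` and `val_loader` are placeholders for whatever this repo's config actually builds, not names from the codebase:

```python
import torch

def measure_inference_memory(model, val_loader, device="cuda"):
    """Report peak GPU memory for a single forward pass (hypothetical helper)."""
    model.eval().to(device)
    torch.cuda.reset_peak_memory_stats(device)
    with torch.no_grad():                     # avoid autograd buffers during inference
        batch = next(iter(val_loader))        # one batch at the configured batch size
        batch = batch.to(device)
        _ = model(batch)
    peak_gb = torch.cuda.max_memory_allocated(device) / 1024 ** 3
    print(f"peak allocated: {peak_gb:.2f} GB")
```

Even with `torch.no_grad()` enabled as above, I still see the 13-20 GB figure, so I suspect a configuration difference rather than missing inference mode.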