Optimize Lazy Camera Loading for Efficient Compression of Large Files with Numerous Images and Corresponding Cameras (#8) #9

Open
wants to merge 1 commit into master
Conversation

EtaCassiopeia

  • Implemented LazyCameraLoader class in camera_utils.py to enable on-demand loading and unloading of cameras (see the sketch after this description).
  • Refactored calc_importance function to use LazyCameraLoader for efficient memory management.
  • Modified finetune function to leverage LazyCameraLoader and ensure memory is freed after processing each camera.
  • Improved memory usage and performance during sensitivity calculation and finetuning processes.
  • This solution is particularly beneficial for handling large files and datasets, though it may not be as efficient for smaller datasets.

This commit addresses issue #8.
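
For context, here is a minimal sketch of the idea behind such a loader. The names and signature are illustrative only; the actual LazyCameraLoader in camera_utils.py may differ.

class LazyCameraLoader:
    """Illustrative sketch: load cameras on demand and let callers free them afterwards."""

    def __init__(self, cam_infos, load_fn):
        # cam_infos: lightweight per-camera metadata; load_fn: builds the full camera (image included)
        self.cam_infos = cam_infos
        self.load_fn = load_fn
        self._cache = {}

    def __len__(self):
        return len(self.cam_infos)

    def get(self, idx):
        # Load the camera only when it is first requested.
        if idx not in self._cache:
            self._cache[idx] = self.load_fn(self.cam_infos[idx])
        return self._cache[idx]

    def unload(self, idx):
        # Drop the cached camera so its memory can be reclaimed.
        self._cache.pop(idx, None)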


@arcman7 left a comment


LGTM

@bzy-080408

I tried, but it still runs out of memory. The Gaussian splatting model used about 4000 4K photos. My device has 660 GB of RAM and 3× Tesla L20 GPUs with 48 GB of VRAM each; the data is put into CPU memory.

@arcman7

arcman7 commented Oct 22, 2024

Each image was 4MB and you had 4000 of them? How many splats were in your scene when it crashed?

@bzy-080408

Each image was 4MB and you had 4000 of them? How many splats were in your scene when it crashed?

[image attachment]

@bzy-080408

4,396,589; this is the point cloud file before compression.

@bzy-080408

I tried with a 1.1 TB RAM device; still out of memory.

@arcman7

arcman7 commented Nov 12, 2024

I need to dig up the code again and take a look. I know we had this issue fixed. Is this a time-sensitive issue for you?

@bzy-080408

bzy-080408 commented Nov 14, 2024 via email

@bzy-080408

I need to dig up the code again and take a look. I know we had this issue fixed. Is this a time-sensitive issue for you?

So is there any way to fix it?

@arcman7

arcman7 commented Jan 22, 2025

Use something like this:

def lazy_call(f, *args, **kwargs):
    # Wrap f so the actual call is deferred until the returned closure is invoked.
    return lambda: f(*args, **kwargs)

def cameraList_from_camInfos(cam_infos, resolution_scale, args, lazy_load):
    camera_list = []

    for id, c in enumerate(cam_infos):
        if lazy_load:
            # Append a closure that loads the camera on demand instead of loading it now.
            camera_list.append(lazy_call(loadCam, args, id, c, resolution_scale))
        else:
            # Eager path: load the camera immediately (the original behavior).
            camera_list.append(loadCam(args, id, c, resolution_scale))

    return camera_list
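
With the lazy path enabled, entries in camera_list are zero-argument closures rather than loaded cameras, so call sites have to invoke them when the camera is actually needed. A hedged usage sketch (the loop body is hypothetical, not the repository's code):

for entry in camera_list:
    cam = entry() if lazy_load else entry  # materialize the camera only when it is needed
    # ... run the per-camera work (rendering, sensitivity computation, ...) on cam ...
    del cam  # drop the reference so the image tensor can be freed before the next camera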
