
deep image prior #192

Open · wants to merge 31 commits into base: test

Conversation

@dmarx (Member) commented May 31, 2022

Still to do:

  • Add option to use the MADGRAD optimizer (should it be the default for DIP?)
  • Add EMA
  • Add noise annealing?
  • May need to add grad scaling (other AMP features?); see the sketch after this list
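Rough sketch of how MADGRAD and AMP grad scaling could be wired together in a DIP-style inner loop; this is not the PR's implementation, and `net`, `net_input`, `loss_fn`, `iterations`, and the learning rate are placeholders (uses the facebookresearch madgrad package):

        import torch
        from madgrad import MADGRAD  # pip install madgrad

        opt = MADGRAD(net.parameters(), lr=1e-3)  # hypothetical lr
        scaler = torch.cuda.amp.GradScaler()

        for itt in range(iterations):
            opt.zero_grad(set_to_none=True)
            with torch.cuda.amp.autocast():
                out = net(net_input)
                loss = loss_fn(out)
            scaler.scale(loss).backward()  # scale loss so fp16 grads don't underflow
            scaler.step(opt)               # unscales grads, skips the step on inf/nan
            scaler.update()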

@dmarx (Member, Author) commented May 31, 2022

re: noise annealing (via https://github.com/LAION-AI/notebooks/blob/main/DALLE2-Prior%2BDeep-Image-Prior.ipynb):

        import math
        import torch

        # anneal the injected noise toward zero over the course of optimization
        noise_ramp = 1 - min(1, itt / iterations)
        net_input_noised = net_input

        if input_noise_strength:
            # blend the clean input and fresh noise along a quarter circle
            phi = min(1, noise_ramp * input_noise_strength) * math.pi / 2
            noise = torch.randn_like(net_input)
            net_input_noised = net_input * math.cos(phi) + noise * math.sin(phi)

        with torch.cuda.amp.autocast():
            out = net(net_input_noised * input_scale).float()
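(If I'm reading it right, the cos/sin weighting interpolates between the clean input and pure noise along a quarter circle, so the overall variance of the noised input stays roughly constant while the ramp decays.)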

@dmarx (Member, Author) commented Jun 2, 2022

Current blocker: a lot of dependent code assumes the latent is a single tensor. Downstream operations on the image representation try to call methods on it, like clone, that don't have good analogs for a module dict/list.

I think the solution, rather than adding special cases, is to generalize the other image representations to expect containers holding arbitrarily many image-representation components of heterogeneous sizes.

Updating EMAImage to operate on parameter dicts/lists could potentially solve a good chunk of this; a rough sketch of the idea is below.
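A minimal sketch of what an EMA over a parameter dict could look like; the class name, decay value, and dict-of-tensors assumption are illustrative, not the repo's actual EMAImage API:

        import torch

        class ParamDictEMA:
            """Tracks an exponential moving average of each tensor in a parameter dict."""

            def __init__(self, params, decay=0.99):
                self.decay = decay
                # detached shadow copy of every component, keyed the same way
                self.shadow = {k: p.detach().clone() for k, p in params.items()}

            @torch.no_grad()
            def update(self, params):
                for k, p in params.items():
                    self.shadow[k].mul_(self.decay).add_(p.detach(), alpha=1 - self.decay)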

@dmarx (Member, Author) commented Jun 23, 2022

migrate to branch dip_ema_simple
