deep image prior #192
base: test
Conversation
re: noise annealing (via https://github.com/LAION-AI/notebooks/blob/main/DALLE2-Prior%2BDeep-Image-Prior.ipynb):
current blocker: a lot of dependent code assumes the latent is a single tensor. Downstream operations on the image representation attempt to call methods on it like clone() that don't have good analogs for a module dict/list. Rather than adding special cases, I think the solution is to generalize the other image representations to expect containers holding arbitrarily many image-representation components of heterogeneous sizes. Updating EMAImage to operate on parameter dicts/lists could potentially solve a good chunk of this.
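A minimal sketch of what that generalization might look like, assuming PyTorch tensors (the class name and API below are hypothetical, not the existing EMAImage interface): the EMA bookkeeping iterates over a dict of named tensors instead of cloning a single latent.

```python
from typing import Dict
import torch

class EMAParameterDict:
    """Hypothetical EMA tracker over a dict of tensors rather than one latent tensor."""

    def __init__(self, params: Dict[str, torch.Tensor], decay: float = 0.99):
        self.decay = decay
        # detach + clone each component so the shadow copy is independent of autograd
        self.shadow = {name: p.detach().clone() for name, p in params.items()}

    @torch.no_grad()
    def update(self, params: Dict[str, torch.Tensor]) -> None:
        # shadow <- decay * shadow + (1 - decay) * current, per component
        for name, p in params.items():
            self.shadow[name].mul_(self.decay).add_(p.detach(), alpha=1 - self.decay)

    @torch.no_grad()
    def copy_to(self, params: Dict[str, torch.Tensor]) -> None:
        # write the averaged values back into the live parameters
        for name, p in params.items():
            p.copy_(self.shadow[name])
```

e.g. `ema = EMAParameterDict(dict(dip_net.named_parameters()))`, then `ema.update(...)` each step; a single-tensor latent becomes just the one-entry case of the same container.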
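Separately, on the noise-annealing mention above: as an illustrative sketch only (not the linked notebook's implementation), noise annealing in a Deep-Image-Prior-style loop usually means perturbing the fixed network input with Gaussian noise whose magnitude decays toward zero over the optimization run.

```python
import torch

def perturb_input(net_input: torch.Tensor, step: int, total_steps: int,
                  start_std: float = 0.05) -> torch.Tensor:
    # linearly anneal the noise magnitude from start_std down to zero
    std = start_std * max(0.0, 1.0 - step / max(1, total_steps))
    return net_input + torch.randn_like(net_input) * std
```

The linear schedule and start_std value here are placeholders.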
Replaced subprocess.run with Python's native os.remove()
Replaced the hardcoded path with os.path.join(backup_path) for automatic backup removal (see the sketch below)
Fixed backup remover
Fix backup xplatform
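A rough sketch of the cross-platform cleanup described in those commits (the directory and filename below are made up for illustration): construct the backup path with os.path.join rather than hardcoded separators, and delete it with os.remove instead of shelling out through subprocess.run.

```python
import os

# hypothetical location of the backup file produced earlier in the run
backup_path = os.path.join("backups", "settings_backup.yaml")
if os.path.exists(backup_path):
    os.remove(backup_path)  # native, cross-platform deletion (no `rm` subprocess)
```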
…this is unnecessarily complicated.
…w image models and losses work.
…mplify how losses and image_models work first, then come back to this afterwards.
migrate to branch
still to do: